Purification through Emotions: The Role of Shame in Plato's "Sophist" 230B4-E5
ERIC Educational Resources Information Center
Candiotto, Laura
2018-01-01
This article proposes an analysis of Plato's "Sophist" (230b4--e5) that underlines the bond between the logical and the emotional components of the Socratic "elenchus", with the aim of depicting the social valence of this philosophical practice. The use of emotions characterizing the 'elenctic' method described by Plato is…
Shepherd, Marilyn Murphy; Wipke-Tevis, Deidre D.; Alexander, Gregory L.
2015-01-01
Purpose The purpose of this study was to compare pressure ulcer prevention programs in 2 long-term care (LTC) facilities with diverse Information Technology Sophistication (ITS), one with high sophistication and one with low sophistication, and to identify implications for the Wound Ostomy Continence Nurse (WOC Nurse). Design Secondary analysis of narrative data obtained from a mixed methods study. Subjects and Setting The study setting was 2 LTC facilities in the Midwestern United States. The sample comprised 39 staff from the 2 facilities, including 26 from the high ITS facility and 13 from the low ITS facility. Respondents included Certified Nurse Assistants, Certified Medical Technicians, Restorative Medical Technicians, Social Workers, Registered Nurses, Licensed Practical Nurses, Information Technology staff, Administrators, and Directors. Methods This study is a secondary analysis of interviews regarding communication and education strategies in two long-term care agencies. This analysis focused on focus group interviews, which included both direct and non-direct care providers. Results Eight themes (codes) were identified in the analysis. Three themes are presented individually with exemplars of communication and education strategies. The analysis revealed specific differences between the high ITS and low ITS facilities with regard to education and communication involving pressure ulcer prevention. These differences have direct implications for WOC nurses consulting in the LTC setting. Conclusions Findings from this study suggest that effective strategies for staff education and communication regarding pressure ulcer (PU) prevention differ based on the level of ITS within a given facility. Specific strategies for education and communication are suggested for agencies with high ITS and agencies with low ITS. PMID:25945822
USDA-ARS?s Scientific Manuscript database
For several decades, optimization and sensitivity/uncertainty analysis of environmental models has been the subject of extensive research. Although much progress has been made and sophisticated methods developed, the growing complexity of environmental models to represent real-world systems makes it...
Turning up the heat on aircraft structures. [design and analysis for high-temperature conditions]
NASA Technical Reports Server (NTRS)
Dobyns, Alan; Saff, Charles; Johns, Robert
1992-01-01
An overview is presented of the current effort in design and development of aircraft structures to achieve the lowest cost for best performance. Enhancements in this area are focused on integrated design, improved design analysis tools, low-cost fabrication techniques, and more sophisticated test methods. 3D CAD/CAM data are becoming the method through which design, manufacturing, and engineering communicate.
USDA-ARS?s Scientific Manuscript database
Increasing availability of genomic data and sophistication of analytical methodology in fungi has elevated the need for functional genomics tools in these organisms. Gene deletion is a critical tool for functional analysis. The targeted deletion of genes requires both a suitable method for the trans...
ERIC Educational Resources Information Center
Braun, W. John
2012-01-01
The Analysis of Variance is often taught in introductory statistics courses, but it is not clear that students really understand the method. This is because the derivation of the test statistic and p-value requires a relatively sophisticated mathematical background which may not be well-remembered or understood. Thus, the essential concept behind…
Overview of artificial neural networks.
Zou, Jinming; Han, Yi; So, Sung-Sau
2008-01-01
The artificial neural network (ANN), or simply neural network, is a machine learning method evolved from the idea of simulating the human brain. The data explosion in modern drug discovery research requires sophisticated analysis methods to uncover the hidden causal relationships between single or multiple responses and a large set of properties. The ANN is one of many versatile tools to meet the demand in drug discovery modeling. Compared to a traditional regression approach, the ANN is capable of modeling complex nonlinear relationships. The ANN also has excellent fault tolerance and is fast and highly scalable with parallel processing. This chapter introduces the background of ANN development and outlines the basic concepts crucially important for understanding more sophisticated ANNs. Several commonly used learning methods and network setups are discussed briefly at the end of the chapter.
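As a minimal illustration of the basic concept described above (not code from the chapter, which is not reproduced here), the sketch below fits a one-hidden-layer network to a nonlinear target by gradient descent in NumPy:

```python
# Minimal sketch (illustrative only): a one-hidden-layer network fit by
# gradient descent to a nonlinear target, y = sin(x).
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(x)

n_hidden = 20
W1 = rng.normal(0, 0.5, (1, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.5, (n_hidden, 1)); b2 = np.zeros(1)

lr = 0.01
for step in range(5000):
    h = np.tanh(x @ W1 + b1)            # hidden-layer activations
    y_hat = h @ W2 + b2                 # network output
    err = y_hat - y
    # Backpropagation: gradients of the mean squared error
    dW2 = h.T @ err / len(x); db2 = err.mean(0)
    dh = err @ W2.T * (1 - h**2)
    dW1 = x.T @ dh / len(x); db1 = dh.mean(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final MSE:", float((err**2).mean()))
```

A plain linear regression cannot fit this sine relationship; the hidden nonlinearity is what gives the network its capacity to model complex nonlinear relationships.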
Machine cost analysis using the traditional machine-rate method and ChargeOut!
E. M. (Ted) Bilek
2009-01-01
Forestry operations require ever more use of expensive capital equipment. Mechanization is frequently necessary to perform cost-effective and safe operations. Increased capital should mean more sophisticated capital costing methodologies. However, the machine rate method, which is the costing methodology most frequently used, dates back to 1942. ChargeOut!, a recently...
Shan Gao; Xiping Wang; Michael C. Wiemann; Brian K. Brashaw; Robert J. Ross; Lihai Wang
2017-01-01
Key message Field methods for rapid determination of wood density in trees have evolved from increment borer, torsiometer, Pilodyn, and nail withdrawal into sophisticated electronic tools of resistance drilling measurement. A partial resistance drilling approach coupled with knowledge of internal tree density distribution may...
Development of BEM for ceramic composites
NASA Technical Reports Server (NTRS)
Henry, D. P.; Banerjee, P. K.; Dargush, G. F.
1991-01-01
It is evident that for proper micromechanical analysis of ceramic composites, one needs to use a numerical method that is capable of idealizing the individual fibers or individual bundles of fibers embedded within a three-dimensional ceramic matrix. The analysis must be able to account for high stress or temperature gradients from diffusion of stress or temperature from the fiber to the ceramic matrix and allow for interaction between the fibers through the ceramic matrix. The analysis must be sophisticated enough to deal with the failure of fibers described by a series of increasingly sophisticated constitutive models. Finally, the analysis must deal with micromechanical modeling of the composite under nonlinear thermal and dynamic loading. This report details progress made towards the development of a boundary element code designed for the micromechanical studies of an advanced ceramic composite. Additional effort has been made in generalizing the implementation to allow the program to be applicable to real problems in the aerospace industry.
Discrete choice experiments of pharmacy services: a systematic review.
Vass, Caroline; Gray, Ewan; Payne, Katherine
2016-06-01
Background Two previous systematic reviews have summarised the application of discrete choice experiments to value preferences for pharmacy services. These reviews identified a total of twelve studies and described how discrete choice experiments have been used to value pharmacy services but did not describe or discuss the application of methods used in the design or analysis. Aims (1) To update the most recent systematic review and critically appraise current discrete choice experiments of pharmacy services in line with published reporting criteria; and (2) To provide an overview of key methodological developments in the design and analysis of discrete choice experiments. Methods The review used a comprehensive strategy to identify eligible studies (published between 1990 and 2015) by searching electronic databases for key terms related to discrete choice and best-worst scaling (BWS) experiments. All healthcare choice experiments were then hand-searched for key terms relating to pharmacy. Data were extracted using a published checklist. Results A total of 17 discrete choice experiments eliciting preferences for pharmacy services were identified for inclusion in the review. No BWS studies were identified. The studies elicited preferences from a variety of populations (pharmacists, patients, students) for a range of pharmacy services. Most studies were from a United Kingdom setting, although examples from Europe, Australia and North America were also identified. Discrete choice experiments for pharmacy services tended to include more attributes than non-pharmacy choice experiments. Few studies reported the use of qualitative research methods in the design and interpretation of the experiments (n = 9) or use of new methods of analysis to identify and quantify preference and scale heterogeneity (n = 4). No studies reported the use of Bayesian methods in their experimental design. Conclusion Incorporating more sophisticated methods in the design of pharmacy-related discrete choice experiments could help researchers produce more efficient experiments which are better suited to valuing complex pharmacy services. Pharmacy-related discrete choice experiments could also benefit from more sophisticated analytical techniques such as investigations into scale and preference heterogeneity. Employing these sophisticated methods for both design and analysis could extend the usefulness of discrete choice experiments to inform health and pharmacy policy.
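For orientation, discrete choice experiments are usually analysed under a random-utility framework; the formulation below is standard textbook notation, not taken from the review itself:

```latex
\begin{align*}
U_{nj} &= \mathbf{x}_{nj}'\boldsymbol{\beta} + \varepsilon_{nj}, \qquad \varepsilon_{nj}\ \text{i.i.d. Gumbel},\\
P_{nj} &= \frac{\exp(\mathbf{x}_{nj}'\boldsymbol{\beta})}{\sum_{k \in C_n}\exp(\mathbf{x}_{nk}'\boldsymbol{\beta})} \quad \text{(conditional logit)},\\
\boldsymbol{\beta}_n &\sim N(\mathbf{b}, \mathbf{W}) \quad \text{(mixed logit, capturing preference heterogeneity across respondents).}
\end{align*}
```

Here U_nj is the utility respondent n attaches to alternative j with attribute vector x_nj; the "more sophisticated analytical techniques" the review calls for correspond to the mixed-logit line, where coefficients vary across respondents.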
The Sophistical Attitude and the Invention of Rhetoric
ERIC Educational Resources Information Center
Crick, Nathan
2010-01-01
Traditionally, the Older Sophists were conceived as philosophical skeptics who rejected speculative inquiry to focus on rhetorical methods of being successful in practical life. More recently, this view has been complicated by studies revealing the Sophists to be a diverse group of intellectuals who practiced their art prior to the categorization…
The Impact of Financial Sophistication on Adjustable Rate Mortgage Ownership
ERIC Educational Resources Information Center
Smith, Hyrum; Finke, Michael S.; Huston, Sandra J.
2011-01-01
The influence of a financial sophistication scale on adjustable-rate mortgage (ARM) borrowing is explored. Descriptive statistics and regression analysis using recent data from the Survey of Consumer Finances reveal that ARM borrowing is driven by both the least and most financially sophisticated households but for different reasons. Less…
Automatically Assessing Lexical Sophistication: Indices, Tools, Findings, and Application
ERIC Educational Resources Information Center
Kyle, Kristopher; Crossley, Scott A.
2015-01-01
This study explores the construct of lexical sophistication and its applications for measuring second language lexical and speaking proficiency. In doing so, the study introduces the Tool for the Automatic Analysis of LExical Sophistication (TAALES), which calculates text scores for 135 classic and newly developed lexical indices related to word…
ERIC Educational Resources Information Center
Kim, Minkyung; Crossley, Scott A.; Kyle, Kristopher
2018-01-01
This study conceptualizes lexical sophistication as a multidimensional phenomenon by reducing numerous lexical features of lexical sophistication into 12 aggregated components (i.e., dimensions) via a principal component analysis approach. These components were then used to predict second language (L2) writing proficiency levels, holistic lexical…
Naive vs. Sophisticated Methods of Forecasting Public Library Circulations.
ERIC Educational Resources Information Center
Brooks, Terrence A.
1984-01-01
Two sophisticated--autoregressive integrated moving average (ARIMA), straight-line regression--and two naive--simple average, monthly average--forecasting techniques were used to forecast monthly circulation totals of 34 public libraries. Comparisons of forecasts and actual totals revealed that ARIMA and monthly average methods had smallest mean…
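A minimal sketch of the kind of comparison the study describes, using synthetic data rather than the libraries' circulation records (the series, the ARIMA order, and the error metric below are illustrative assumptions):

```python
# Illustrative comparison of a naive monthly-average forecast with an ARIMA forecast.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
idx = pd.date_range("2015-01-01", periods=72, freq="MS")
circ = pd.Series(5000 + 300 * np.sin(2 * np.pi * idx.month / 12)
                 + rng.normal(0, 100, 72), index=idx)   # synthetic monthly circulation

train, test = circ[:-12], circ[-12:]

# Naive method: the average for each calendar month in the training data
monthly_avg = train.groupby(train.index.month).mean()
naive_fc = test.index.month.map(monthly_avg)

# Sophisticated method: an ARIMA model (order chosen only for illustration)
arima_fc = ARIMA(train, order=(1, 1, 1)).fit().forecast(steps=12)

mae = lambda f: np.mean(np.abs(np.asarray(f) - test.values))
print("monthly-average MAE:", round(mae(naive_fc), 1))
print("ARIMA(1,1,1) MAE:   ", round(mae(arima_fc), 1))
```

On a strongly seasonal series like this one, the naive monthly average can be competitive with ARIMA, which is consistent with the study's finding that simple methods were among the best performers.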
Grimaldi, Loredana
2012-01-01
Recently, there has been a concentrated effort by companies to better understand the needs and desires of their consumers. Such efforts usually employ different and sophisticated analysis techniques for monitoring the consumers' preferences and how such consumers perceive the advertising communication campaign from a specific company.
How Commercial Banks Use the World Wide Web: A Content Analysis.
ERIC Educational Resources Information Center
Leovic, Lydia K.
New telecommunications vehicles expand the possible ways that business is conducted. The hypermedia portion of the Internet, the World Wide Web, is such a telecommunications device. The Web is presently one of the most flexible and dynamic methods for electronic information dissemination. The level of technological sophistication necessary to…
Does a better model yield a better argument? An info-gap analysis
NASA Astrophysics Data System (ADS)
Ben-Haim, Yakov
2017-04-01
Theories, models and computations underlie reasoned argumentation in many areas. The possibility of error in these arguments, though of low probability, may be highly significant when the argument is used in predicting the probability of rare high-consequence events. This implies that the choice of a theory, model or computational method for predicting rare high-consequence events must account for the probability of error in these components. However, error may result from lack of knowledge or surprises of various sorts, and predicting the probability of error is highly uncertain. We show that the putatively best, most innovative and sophisticated argument may not actually have the lowest probability of error. Innovative arguments may entail greater uncertainty than more standard but less sophisticated methods, creating an innovation dilemma in formulating the argument. We employ info-gap decision theory to characterize and support the resolution of this problem and present several examples.
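As background, the robustness function at the core of info-gap decision theory can be written as follows (generic textbook notation, which may differ in detail from the article's own formulation):

```latex
\hat{h}(q, r_c) \;=\; \max\Bigl\{\, h \ge 0 \;:\; \min_{u \in \mathcal{U}(h,\tilde{u})} R(q, u) \;\ge\; r_c \,\Bigr\},
```

where q is the chosen argument or model, \tilde{u} the best-estimate description of the unknowns, \mathcal{U}(h,\tilde{u}) the set of possibilities within horizon of uncertainty h, R the performance, and r_c the critical requirement. The innovation dilemma arises when the putatively better argument has a lower robustness \hat{h} than the standard one.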
Profile Of 'Original Articles' Published In 2016 By The Journal Of Ayub Medical College, Pakistan.
Shaikh, Masood Ali
2018-01-01
Journal of Ayub Medical College (JAMC) is the only Medline-indexed biomedical journal of Pakistan that is edited and published by a medical college. Assessing the trends of study designs employed, statistical methods used, and statistical analysis software used in the articles of medical journals helps in understanding the sophistication of the research published. The objectives of this descriptive study were to assess all original articles published by JAMC in the year 2016. JAMC published 147 original articles in the year 2016. The most commonly used study design was cross-sectional studies, with 64 (43.5%) articles reporting its use. Statistical tests involving bivariate analysis were most common and were reported by 73 (49.6%) articles. Use of SPSS software was reported by 109 (74.1%) of the articles. Most of the original articles published, 138 (93.9%), were based on studies conducted in Pakistan. The number and sophistication of analysis reported in JAMC increased from year 2014 to 2016.
Missing data imputation: focusing on single imputation.
Zhang, Zhongheng
2016-01-01
Complete case analysis is widely used for handling missing data, and it is the default method in many statistical packages. However, this method may introduce bias and some useful information will be omitted from analysis. Therefore, many imputation methods have been developed to fill this gap. The present article focuses on single imputation. Imputations with mean, median and mode are simple but, like complete case analysis, can introduce bias in the mean and deviation. Furthermore, they ignore relationships with other variables. Regression imputation can preserve relationships between missing values and other variables. Many sophisticated methods exist to handle missing values in longitudinal data. This article focuses primarily on how to implement R code to perform single imputation, while avoiding complex mathematical calculations.
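A short Python/pandas sketch of the two imputation strategies discussed above (the article itself gives R code; this is only an illustrative analogue with made-up data, not the authors' implementation):

```python
# Illustrative single imputation in Python (the article works in R).
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
df = pd.DataFrame({"age": rng.normal(60, 10, 200)})
df["sbp"] = 90 + 0.8 * df["age"] + rng.normal(0, 8, 200)
df.loc[rng.choice(200, 40, replace=False), "sbp"] = np.nan   # introduce missingness

# 1) Mean imputation: simple, but shrinks the spread and ignores other variables
df["sbp_mean_imp"] = df["sbp"].fillna(df["sbp"].mean())

# 2) Regression imputation: predict the missing values from an observed covariate
obs = df["sbp"].notna()
model = LinearRegression().fit(df.loc[obs, ["age"]], df.loc[obs, "sbp"])
df["sbp_reg_imp"] = df["sbp"]
df.loc[~obs, "sbp_reg_imp"] = model.predict(df.loc[~obs, ["age"]])

print(df[["sbp", "sbp_mean_imp", "sbp_reg_imp"]].std().round(2))
```

The printed standard deviations show the point made in the abstract: mean imputation compresses the distribution, while regression imputation preserves more of the relationship with the covariate.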
Acoustic emission monitoring of polymer composite materials
NASA Technical Reports Server (NTRS)
Bardenheier, R.
1981-01-01
The technique of acoustic emission monitoring of polymer composite materials is described. It is a highly sensitive, quasi-nondestructive testing method that indicates the origin and behavior of flaws in such materials when submitted to different load exposures. With the use of sophisticated signal analysis methods it is possible to distinguish between different types of failure mechanisms, such as fiber fracture, delamination, or fiber pull-out. Imperfections can be detected while monitoring complex composite structures by acoustic emission measurements.
ERIC Educational Resources Information Center
Gold, Stephanie
2005-01-01
The concept of data-driven professional development is both straight-forward and sensible. Implementing this approach is another story, which is why many administrators are turning to sophisticated tools to help manage data collection and analysis. These tools allow educators to assess and correlate student outcomes, instructional methods, and…
NASA Astrophysics Data System (ADS)
Mohammadian, E.; Hamidi, H.; Azdarpour, A.
2018-05-01
CO2 sequestration is considered as one of the most anticipated methods to mitigate CO2 concentration in the atmosphere. The solubility mechanism is one of the most important and sophisticated mechanisms by which CO2 is rendered immobile while it is being injected into aquifers. A semi-empirical, easy to use model was developed to calculate the solubility of CO2 in NaCl brines with thermodynamic conditions (pressure, temperature) and salinity gradients representative of CO2 sequestration in the Malay Basin. The model was compared to the previous, more sophisticated models and good consistency was found among the data obtained using the two models. A sensitivity analysis was also conducted on the model to test its performance beyond its limits.
Lee, Juneyoung; Kim, Kyung Won; Choi, Sang Hyun; Huh, Jimi
2015-01-01
Meta-analysis of diagnostic test accuracy studies differs from the usual meta-analysis of therapeutic/interventional studies in that it requires the simultaneous analysis of a pair of outcome measures, such as sensitivity and specificity, instead of a single outcome. Since sensitivity and specificity are generally inversely correlated and could be affected by a threshold effect, more sophisticated statistical methods are required for the meta-analysis of diagnostic test accuracy. Hierarchical models, including the bivariate model and the hierarchical summary receiver operating characteristic model, are increasingly being accepted as standard methods for meta-analysis of diagnostic test accuracy studies. We provide a conceptual review of statistical methods currently used and recommended for meta-analysis of diagnostic test accuracy studies. This article could serve as a methodological reference for those who perform systematic review and meta-analysis of diagnostic test accuracy studies. PMID:26576107
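For readers unfamiliar with the bivariate model mentioned above, its standard formulation (textbook notation, not specific to this review) models the study-level true positives and true negatives as binomial and the logit-transformed sensitivity and specificity jointly:

```latex
\begin{align*}
y_{Ai} \sim \mathrm{Bin}(n_{Ai}, \mathrm{Se}_i), &\qquad y_{Bi} \sim \mathrm{Bin}(n_{Bi}, \mathrm{Sp}_i),\\
\begin{pmatrix} \mathrm{logit}\,\mathrm{Se}_i \\ \mathrm{logit}\,\mathrm{Sp}_i \end{pmatrix}
&\sim N\!\left( \begin{pmatrix} \mu_A \\ \mu_B \end{pmatrix},
\begin{pmatrix} \sigma_A^2 & \rho\,\sigma_A \sigma_B \\ \rho\,\sigma_A \sigma_B & \sigma_B^2 \end{pmatrix} \right),
\end{align*}
```

where the between-study correlation \rho (typically negative) captures the threshold-driven trade-off between sensitivity and specificity that a univariate meta-analysis would miss.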
Experimental design and quantitative analysis of microbial community multiomics.
Mallick, Himel; Ma, Siyuan; Franzosa, Eric A; Vatanen, Tommi; Morgan, Xochitl C; Huttenhower, Curtis
2017-11-30
Studies of the microbiome have become increasingly sophisticated, and multiple sequence-based, molecular methods as well as culture-based methods exist for population-scale microbiome profiles. To link the resulting host and microbial data types to human health, several experimental design considerations, data analysis challenges, and statistical epidemiological approaches must be addressed. Here, we survey current best practices for experimental design in microbiome molecular epidemiology, including technologies for generating, analyzing, and integrating microbiome multiomics data. We highlight studies that have identified molecular bioactives that influence human health, and we suggest steps for scaling translational microbiome research to high-throughput target discovery across large populations.
Missing data exploration: highlighting graphical presentation of missing pattern.
Zhang, Zhongheng
2015-12-01
Functions shipped with R base can fulfill many tasks of missing data handling. However, because the data volume of electronic medical record (EMR) systems is always very large, more sophisticated methods may be helpful in data management. The article focuses on missing data handling by using advanced techniques. There are three types of missing data, that is, missing completely at random (MCAR), missing at random (MAR) and not missing at random (NMAR). This classification system depends on how missing values are generated. Two packages, Multivariate Imputation by Chained Equations (MICE) and Visualization and Imputation of Missing Values (VIM), provide sophisticated functions to explore missing data patterns. In particular, the VIM package is especially helpful in visual inspection of missing data. Finally, correlation analysis provides information on the dependence of missing data on other variables. Such information is useful in subsequent imputations.
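A small Python sketch of the same workflow (the article itself demonstrates the R packages MICE and VIM; the data and variables below are invented for illustration): tabulate missing-data patterns and check whether missingness depends on an observed variable, which points to MAR rather than MCAR.

```python
# Illustrative exploration of missing-data patterns in Python
# (the article itself uses the R packages MICE and VIM).
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({"age": rng.normal(60, 10, n),
                   "lactate": rng.normal(2.0, 0.5, n),
                   "creatinine": rng.normal(1.0, 0.3, n)})
# Make lactate MAR: more likely to be missing in older patients
df.loc[rng.random(n) < (df["age"] - 40) / 60, "lactate"] = np.nan

miss = df.isna()
print(miss.mean().round(3))      # proportion missing per variable
print(miss.value_counts())       # missing-data patterns across rows
# Dependence of missingness on an observed variable (suggests MAR rather than MCAR)
print(df.groupby(miss["lactate"])["age"].mean().round(1))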
Mechanical break junctions: enormous information in a nanoscale package.
Natelson, Douglas
2012-04-24
Mechanical break junctions, particularly those in which a metal tip is repeatedly moved in and out of contact with a metal film, have provided many insights into electronic conduction at the atomic and molecular scale, most often by averaging over many possible junction configurations. This averaging throws away a great deal of information, and Makk et al. in this issue of ACS Nano demonstrate that, with both simulated and real experimental data, more sophisticated two-dimensional analysis methods can reveal information otherwise obscured in simple histograms. As additional measured quantities come into play in break junction experiments, including thermopower, noise, and optical response, these more sophisticated analytic approaches are likely to become even more powerful. While break junctions are not directly practical for useful electronic devices, they are incredibly valuable tools for unraveling the electronic transport physics relevant for ultrascaled nanoelectronics.
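To make the two-dimensional analysis concrete, a toy sketch with synthetic traces (not the authors' data or alignment procedure): instead of collapsing each conductance trace into a one-dimensional histogram, conductance is binned jointly against electrode displacement.

```python
# Toy 2D conductance-displacement histogram for break-junction traces (synthetic data).
import numpy as np

rng = np.random.default_rng(0)
n_traces, n_points = 500, 400
displacement = np.linspace(0.0, 2.0, n_points)            # nm, per trace

traces = []
for _ in range(n_traces):
    plateau_len = rng.uniform(0.3, 0.8)                   # molecular plateau length (nm)
    logG = np.where(displacement < plateau_len,
                    -3.5 + rng.normal(0, 0.15),            # plateau near 10^-3.5 G0
                    -3.5 - 6.0 * (displacement - plateau_len))  # decay after rupture
    traces.append(logG + rng.normal(0, 0.1, n_points))    # measurement noise

x = np.tile(displacement, n_traces)                        # all displacement samples
y = np.concatenate(traces)                                 # all log-conductance samples
H, xedges, yedges = np.histogram2d(x, y, bins=[100, 120], range=[[0, 2], [-8, 0]])
print("2D histogram shape:", H.shape)                      # counts vs (displacement, log G/G0)
```

A 1D histogram would only show how often each conductance value occurs; the 2D map additionally preserves how long plateaus persist with stretching, which is the kind of information the simple histograms obscure.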
Viscous-Inviscid Methods in Unsteady Aerodynamic Analysis of Bio-Inspired Morphing Wings
NASA Astrophysics Data System (ADS)
Dhruv, Akash V.
Flight has been one of the greatest realizations of human imagination, revolutionizing communication and transportation over the years. This has greatly influenced the growth of technology itself, enabling researchers to communicate and share their ideas more effectively, extending the human potential to create more sophisticated systems. While the end product of a sophisticated technology makes our lives easier, its development process presents an array of challenges in itself. In the last decade, scientists and engineers have turned towards bio-inspiration to design more efficient and robust aerodynamic systems to enhance the ability of Unmanned Aerial Vehicles (UAVs) to be operated in cluttered environments, where tight maneuverability and controllability are necessary. Effective use of UAVs in domestic airspace will mark the beginning of a new age in communication and transportation. The design of such complex systems necessitates faster and more effective tools to perform preliminary investigations in design, thereby streamlining the design process. This thesis explores the implementation of numerical panel methods for aerodynamic analysis of bio-inspired morphing wings. Numerical panel methods were among the earliest computational methods developed for aerodynamic analysis. Although the early editions of this method performed only inviscid analysis, the algorithm has matured over the years as a result of contributions made by prominent aerodynamicists. The method discussed in this thesis is influenced by recent advancements in panel methods and incorporates both viscous and inviscid analysis of multi-flap wings. The surface calculation of aerodynamic coefficients makes this method less computationally expensive than the traditional Computational Fluid Dynamics (CFD) solvers available, and thus it is effective when both speed and accuracy are desired. The morphing wing design, which consists of sequential feather-like flaps installed over the upper and lower surfaces of a standard airfoil, proves to be an effective alternative to standard control surfaces by increasing the flight capability of bird-scale UAVs. The results obtained for this wing design under various flight and flap configurations provide insight into its aerodynamic behavior, which enhances maneuverability and controllability. The overall method acts as an important tool to create an aerodynamic database to develop a distributed control system for autonomous operation of the multi-flap morphing wing, supporting the use of viscous-inviscid methods as a tool in rapid aerodynamic analysis.
Changing Epistemological Beliefs: The Unexpected Impact of a Short-Term Intervention
ERIC Educational Resources Information Center
Kienhues, Dorothe; Bromme, Rainer; Stahl, Elmar
2008-01-01
Background: Previous research has shown that sophisticated epistemological beliefs exert a positive influence on students' learning strategies and learning outcomes. This gives a clear educational relevance to studies on the development of methods for promoting a change in epistemological beliefs and making them more sophisticated. Aims: To…
ERIC Educational Resources Information Center
Lee, Hyeon Woo
2011-01-01
As the technology-enriched learning environments and theoretical constructs involved in instructional design become more sophisticated and complex, a need arises for equally sophisticated analytic methods to research these environments, theories, and models. Thus, this paper illustrates a comprehensive approach for analyzing data arising from…
Analysis of Vertiport Studies Funded by the Airport Improvement Program (AIP)
1994-05-01
the general population and travel behavior factors from surveys and other sources. FEASIBILITY The vertiport studies recognize the need to address the ... behavior factors obtained from surveys and other sources. All of the methods were dependent upon various secondary data and/or information sources that ... economic responses and of travel behavior. The five types, in order of increasing analytical sophistication, are briefly identified as follows.
A review of machine learning in obesity.
DeGregory, K W; Kuiper, P; DeSilvio, T; Pleuss, J D; Miller, R; Roginski, J W; Fisher, C B; Harness, D; Viswanath, S; Heymsfield, S B; Dungan, I; Thomas, D M
2018-05-01
Rich sources of obesity-related data arising from sensors, smartphone apps, electronic medical health records and insurance data can bring new insights for understanding, preventing and treating obesity. For such large datasets, machine learning provides sophisticated and elegant tools to describe, classify and predict obesity-related risks and outcomes. Here, we review machine learning methods that predict and/or classify, such as linear and logistic regression, artificial neural networks, deep learning and decision tree analysis. We also review methods that describe and characterize data, such as cluster analysis, principal component analysis, network science and topological data analysis. We introduce each method with a high-level overview followed by examples of successful applications. The algorithms were then applied to the National Health and Nutrition Examination Survey to demonstrate methodology, utility and outcomes. The strengths and limitations of each method were also evaluated. This summary of machine learning algorithms provides a unique overview of the state of data analysis applied specifically to obesity. © 2018 World Obesity Federation.
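A schematic of the two method families the review distinguishes, run on synthetic data standing in for NHANES (the features, labels and model choices below are assumptions made only for illustration):

```python
# Schematic workflow: one predictive method and one descriptive method
# from the families reviewed, applied to synthetic survey-like data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=(n, 5))                                   # e.g. diet, activity, demographics
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, n) > 0).astype(int)   # obesity indicator

# Predict/classify: cross-validated logistic regression
auc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5, scoring="roc_auc")
print("logistic regression AUC:", round(auc.mean(), 3))

# Describe/characterize: k-means clustering and principal component analysis
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
pca = PCA(n_components=2).fit(X)
print("cluster sizes:", np.bincount(labels))
print("variance explained by first two PCs:", pca.explained_variance_ratio_.round(2))
```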
SimHap GUI: an intuitive graphical user interface for genetic association analysis.
Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J
2008-12-25
Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools, such as the SimHap package for the R statistics language, provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that allows anyone but a professional statistician to effectively utilise the tool. We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimations of haplotype simulation progress. SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis.
Predicting Second Language Writing Proficiency: The Roles of Cohesion and Linguistic Sophistication
ERIC Educational Resources Information Center
Crossley, Scott A.; McNamara, Danielle S.
2012-01-01
This study addresses research gaps in predicting second language (L2) writing proficiency using linguistic features. Key to this analysis is the inclusion of linguistic measures at the surface, textbase and situation model level that assess text cohesion and linguistic sophistication. The results of this study demonstrate that five variables…
ERIC Educational Resources Information Center
Woodfield, Brian F.; Andrus, Merritt B.; Waddoups, Gregory L.; Moore, Melissa S.; Swan, Richard; Allen, Rob; Bodily, Greg; Andersen, Tricia; Miller, Jordan; Simmons, Bryon; Stanger, Richard
2005-01-01
A set of sophisticated and realistic laboratory simulations is created for use in freshman- and sophomore-level chemistry classes and laboratories called 'Virtual ChemLab'. A detailed assessment of student responses is provided and the simulation's pedagogical utility is described using the organic simulation.
Political Trust and Sophistication: Taking Measurement Seriously.
Turper, Sedef; Aarts, Kees
2017-01-01
Political trust is an important indicator of political legitimacy. Hence, seemingly decreasing levels of political trust in Western democracies have stimulated a growing body of research on the causes and consequences of political trust. However, the neglect of potential measurement problems of political trust raises doubts about the findings of earlier studies. The current study revisits the measurement of political trust and re-examines the relationship between political trust and sophistication in the Netherlands by utilizing European Social Survey (ESS) data across five time points and four-wave panel data from the Panel Component of ESS. Our findings illustrate that high and low political sophistication groups display different levels of political trust even when measurement characteristics of political trust are taken into consideration. However, the relationship between political sophistication and political trust is weaker than is often suggested by earlier research. Our findings also provide partial support for the argument that the gap between sophistication groups is widening over time. Furthermore, we demonstrate that, although the between-method differences between the latent means and the composite score means of political trust for high- and low-sophistication groups are relatively minor, it is important to analyze the measurement characteristics of the political trust construct.
Fast automated analysis of strong gravitational lenses with convolutional neural networks.
Hezaveh, Yashar D; Levasseur, Laurence Perreault; Marshall, Philip J
2017-08-30
Quantifying image distortions caused by strong gravitational lensing-the formation of multiple images of distant sources due to the deflection of their light by the gravity of intervening structures-and estimating the corresponding matter distribution of these structures (the 'gravitational lens') has primarily been performed using maximum likelihood modelling of observations. This procedure is typically time- and resource-consuming, requiring sophisticated lensing codes, several data preparation steps, and finding the maximum likelihood model parameters in a computationally expensive process with downhill optimizers. Accurate analysis of a single gravitational lens can take up to a few weeks and requires expert knowledge of the physical processes and methods involved. Tens of thousands of new lenses are expected to be discovered with the upcoming generation of ground and space surveys. Here we report the use of deep convolutional neural networks to estimate lensing parameters in an extremely fast and automated way, circumventing the difficulties that are faced by maximum likelihood methods. We also show that the removal of lens light can be made fast and automated using independent component analysis of multi-filter imaging data. Our networks can recover the parameters of the 'singular isothermal ellipsoid' density profile, which is commonly used to model strong lensing systems, with an accuracy comparable to the uncertainties of sophisticated models but about ten million times faster: 100 systems in approximately one second on a single graphics processing unit. These networks can provide a way for non-experts to obtain estimates of lensing parameters for large samples of data.
NASA Astrophysics Data System (ADS)
Gu, Jiangyue
Epistemic beliefs are individuals' beliefs about the nature of knowledge, how knowledge is constructed, and how knowledge can be justified. This study employed a mixed-methods approach to examine: (a) middle and high school students' self-reported epistemic beliefs (quantitative) and epistemic beliefs revealed from practice (qualitative) during a problem-based, scientific inquiry unit, (b) how middle and high school students' epistemic beliefs contribute to the construction of students' problem solving processes, and (c) how and why students' epistemic beliefs change by engaging in PBL. Twenty-one middle and high school students participated in a summer science class to investigate local water quality in a 2-week long problem-based learning (PBL) unit. The students worked in small groups to conduct water quality tests in their local watershed and visited several stakeholders for their investigation. Pretest and posttest versions of the Epistemological Beliefs Questionnaire were conducted to assess students' self-reported epistemic beliefs before and after the unit. I videotaped and interviewed three groups of students during the unit and conducted discourse analysis to examine their epistemic beliefs revealed from scientific inquiry activities and triangulate with their self-reported data. There are three main findings from this study. First, students in this study self-reported relatively sophisticated epistemic beliefs on the pretest. However, the comparison between their self-reported beliefs and beliefs revealed from practice indicated that some students were able to apply sophisticated beliefs during the unit while others failed to do so. The inconsistency between these two types of epistemic beliefs may be due to students' inadequate cognitive ability, the low validity of the self-report measure, and the influence of contextual factors. Second, qualitative analysis indicated that students' epistemic beliefs about the nature of knowing influenced their problem solving processes and construction of arguments during their inquiry activities. Students with more sophisticated epistemic beliefs acquired knowledge, presented solid evidence, and used it to support their claims more effectively than their peers. Third, students' self-reported epistemic beliefs became significantly more sophisticated by engaging in PBL. Findings from this study can potentially help researchers to better understand the relation between students' epistemic beliefs and their scientific inquiry practice,
An Excel‐based implementation of the spectral method of action potential alternans analysis
Pearman, Charles M.
2014-01-01
Abstract Action potential (AP) alternans has been well established as a mechanism of arrhythmogenesis and sudden cardiac death. Proper interpretation of AP alternans requires a robust method of alternans quantification. Traditional methods of alternans analysis neglect higher order periodicities that may have greater pro‐arrhythmic potential than classical 2:1 alternans. The spectral method of alternans analysis, already widely used in the related study of microvolt T‐wave alternans, has also been used to study AP alternans. Software to meet the specific needs of AP alternans analysis is not currently available in the public domain. An AP analysis tool is implemented here, written in Visual Basic for Applications and using Microsoft Excel as a shell. This performs a sophisticated analysis of alternans behavior allowing reliable distinction of alternans from random fluctuations, quantification of alternans magnitude, and identification of which phases of the AP are most affected. In addition, the spectral method has been adapted to allow detection and quantification of higher order regular oscillations. Analysis of action potential morphology is also performed. A simple user interface enables easy import, analysis, and export of collated results. PMID:25501439
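A rough outline of what the spectral method computes (simplified; the noise band and the k-score form below follow common T-wave alternans practice and are assumptions, and the Excel tool's exact choices may differ): the beat-to-beat series of an AP feature is Fourier transformed and the power at 0.5 cycles/beat is compared with the spectral noise floor.

```python
# Simplified sketch of the spectral method for alternans detection.
# Noise band (0.40-0.46 cycles/beat) and k-score form are assumptions here.
import numpy as np

rng = np.random.default_rng(0)
n_beats = 128
apd = 200 + 2.0 * (-1) ** np.arange(n_beats) + rng.normal(0, 1.0, n_beats)  # ms, 2:1 alternans

x = apd - apd.mean()
power = np.abs(np.fft.rfft(x)) ** 2 / n_beats
freqs = np.fft.rfftfreq(n_beats, d=1.0)                    # cycles per beat

alt_power = power[np.argmin(np.abs(freqs - 0.5))]          # power at 0.5 cycles/beat
noise_band = power[(freqs >= 0.40) & (freqs <= 0.46)]
k_score = (alt_power - noise_band.mean()) / noise_band.std()
alt_magnitude = np.sqrt(max(alt_power - noise_band.mean(), 0.0) / n_beats)

print(f"k-score = {k_score:.1f}, alternans magnitude = {alt_magnitude:.2f} ms")
```

Higher-order periodicities, which the article emphasizes, would show up as additional peaks at other discrete frequencies (for example 1/3 or 1/4 cycles/beat) in the same spectrum.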
Statistical Approaches Used to Assess the Equity of Access to Food Outlets: A Systematic Review
Lamb, Karen E.; Thornton, Lukar E.; Cerin, Ester; Ball, Kylie
2015-01-01
Background Inequalities in eating behaviours are often linked to the types of food retailers accessible in neighbourhood environments. Numerous studies have aimed to identify if access to healthy and unhealthy food retailers is socioeconomically patterned across neighbourhoods, and thus a potential risk factor for dietary inequalities. Existing reviews have examined differences between methodologies, particularly focussing on neighbourhood and food outlet access measure definitions. However, no review has informatively discussed the suitability of the statistical methodologies employed; a key issue determining the validity of study findings. Our aim was to examine the suitability of statistical approaches adopted in these analyses. Methods Searches were conducted for articles published from 2000–2014. Eligible studies included objective measures of the neighbourhood food environment and neighbourhood-level socio-economic status, with a statistical analysis of the association between food outlet access and socio-economic status. Results Fifty-four papers were included. Outlet accessibility was typically defined as the distance to the nearest outlet from the neighbourhood centroid, or as the number of food outlets within a neighbourhood (or buffer). To assess if these measures were linked to neighbourhood disadvantage, common statistical methods included ANOVA, correlation, and Poisson or negative binomial regression. Although all studies involved spatial data, few considered spatial analysis techniques or spatial autocorrelation. Conclusions With advances in GIS software, sophisticated measures of neighbourhood outlet accessibility can be considered. However, approaches to statistical analysis often appear less sophisticated. Care should be taken to consider assumptions underlying the analysis and the possibility of spatially correlated residuals which could affect the results. PMID:29546115
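For concreteness, a toy version of the most common analysis the review identifies (Poisson regression of neighbourhood outlet counts on area disadvantage), using synthetic data; the spatial autocorrelation the review flags as usually ignored would require additional spatial terms or residual tests not shown here.

```python
# Toy Poisson regression of food-outlet counts on neighbourhood disadvantage (synthetic data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_areas = 300
disadvantage = rng.normal(0, 1, n_areas)          # standardized area-level disadvantage score
expected = np.exp(1.0 + 0.3 * disadvantage)       # more outlets in more disadvantaged areas
outlets = rng.poisson(expected)

X = sm.add_constant(disadvantage)
model = sm.GLM(outlets, X, family=sm.families.Poisson()).fit()
print(model.summary().tables[1])                  # coefficient on disadvantage, roughly 0.3
```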
Science as Storytelling for Teaching the Nature of Science and the Science-Religion Interface
ERIC Educational Resources Information Center
Bickmore, Barry R.; Thompson, Kirsten R.; Grandy, David A.; Tomlin, Teagan
2009-01-01
Here we describe a method for teaching the NOS called "Science as Storytelling," which was designed to directly confront naïve realist preconceptions about the NOS and replace them with more sophisticated ideas, while retaining a moderate realist perspective. It was also designed to foster a more sophisticated understanding of the…
Cardiovascular imaging environment: will the future be cloud-based?
Kawel-Boehm, Nadine; Bluemke, David A
2017-07-01
In cardiovascular CT and MR imaging, large datasets have to be stored, post-processed, analyzed and distributed. Besides basic assessment of volume and function in cardiac magnetic resonance imaging, for example, more sophisticated quantitative analysis is requested, requiring specific software. Several institutions cannot afford various types of software or provide the expertise to perform sophisticated analysis. Areas covered: Various cloud services exist related to data storage and analysis specifically for cardiovascular CT and MR imaging. Instead of on-site data storage, cloud providers offer flexible storage services on a pay-per-use basis. To avoid purchase and maintenance of specialized software for cardiovascular image analysis, e.g. to assess myocardial iron overload, MR 4D flow and fractional flow reserve, evaluation can be performed with cloud based software by the consumer or complete analysis is performed by the cloud provider. However, challenges to widespread implementation of cloud services include regulatory issues regarding patient privacy and data security. Expert commentary: If patient privacy and data security are guaranteed, cloud imaging is a valuable option to cope with storage of large image datasets and offer sophisticated cardiovascular image analysis for institutions of all sizes.
Preliminary design methods for fiber reinforced composite structures employing a personal computer
NASA Technical Reports Server (NTRS)
Eastlake, C. N.
1986-01-01
The objective of this project was to develop a user-friendly interactive computer program to be used as an analytical tool by structural designers. Its intent was to do preliminary, approximate stress analysis to help select or verify sizing choices for composite structural members. The approach to the project was to provide a subroutine which uses classical lamination theory to predict an effective elastic modulus for a laminate of arbitrary material and ply orientation. This effective elastic modulus can then be used in a family of other subroutines which employ the familiar basic structural analysis methods for isotropic materials. This method is simple and convenient to use but only approximate, as is appropriate for a preliminary design tool which will be subsequently verified by more sophisticated analysis. Additional subroutines have been provided to calculate laminate coefficient of thermal expansion and to calculate ply-by-ply strains within a laminate.
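A condensed sketch of the kind of calculation such a subroutine performs (textbook classical lamination theory; the material values and layup below are arbitrary examples, not the program's defaults): build the in-plane stiffness matrix A from transformed ply stiffnesses and extract an effective modulus.

```python
# Classical lamination theory: effective in-plane modulus of a symmetric laminate.
import numpy as np

def q_bar(E1, E2, G12, nu12, theta_deg):
    """Transformed reduced stiffness matrix of a ply rotated by theta."""
    nu21 = nu12 * E2 / E1
    d = 1 - nu12 * nu21
    Q11, Q22, Q12, Q66 = E1 / d, E2 / d, nu12 * E2 / d, G12
    c, s = np.cos(np.radians(theta_deg)), np.sin(np.radians(theta_deg))
    Qb11 = Q11*c**4 + 2*(Q12 + 2*Q66)*c**2*s**2 + Q22*s**4
    Qb22 = Q11*s**4 + 2*(Q12 + 2*Q66)*c**2*s**2 + Q22*c**4
    Qb12 = (Q11 + Q22 - 4*Q66)*c**2*s**2 + Q12*(c**4 + s**4)
    Qb66 = (Q11 + Q22 - 2*Q12 - 2*Q66)*c**2*s**2 + Q66*(c**4 + s**4)
    Qb16 = (Q11 - Q12 - 2*Q66)*c**3*s + (Q12 - Q22 + 2*Q66)*c*s**3
    Qb26 = (Q11 - Q12 - 2*Q66)*c*s**3 + (Q12 - Q22 + 2*Q66)*c**3*s
    return np.array([[Qb11, Qb12, Qb16],
                     [Qb12, Qb22, Qb26],
                     [Qb16, Qb26, Qb66]])

# Example: carbon/epoxy-like ply, [0/45/-45/90]s layup, 0.125 mm plies (GPa, mm)
E1, E2, G12, nu12, t = 140.0, 10.0, 5.0, 0.3, 0.125
angles = [0, 45, -45, 90, 90, -45, 45, 0]

A = sum(q_bar(E1, E2, G12, nu12, a) * t for a in angles)   # in-plane stiffness matrix
h = t * len(angles)
a_inv = np.linalg.inv(A)
Ex_eff = 1.0 / (h * a_inv[0, 0])                           # effective laminate modulus in x
print(f"Effective Ex = {Ex_eff:.1f} GPa")
```

The effective modulus obtained this way can then be fed into ordinary isotropic structural formulas, which is exactly the approximate-but-convenient strategy the abstract describes.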
ERIC Educational Resources Information Center
Regenwetter, Michel; Ho, Moon-Ho R.; Tsetlin, Ilia
2007-01-01
This project reconciles historically distinct paradigms at the interface between individual and social choice theory, as well as between rational and behavioral decision theory. The authors combine a utility-maximizing prescriptive rule for sophisticated approval voting with the ignorance prior heuristic from behavioral decision research and two…
ERIC Educational Resources Information Center
Wang, Chia-Yu; Barrow, Lloyd H.
2011-01-01
This study employed a case-study approach to reveal how an ability to think with mental models contributes to differences in students' understanding of molecular geometry and polarity. We were interested in characterizing features and levels of sophistication regarding first-year university chemistry learners' mental modeling behaviors while the…
Determining cantilever stiffness from thermal noise.
Lübbe, Jannis; Temmen, Matthias; Rahe, Philipp; Kühnle, Angelika; Reichling, Michael
2013-01-01
We critically discuss the extraction of intrinsic cantilever properties, namely eigenfrequency f_n, quality factor Q_n and specifically the stiffness k_n of the nth cantilever oscillation mode, from thermal noise by an analysis of the power spectral density of displacement fluctuations of the cantilever in contact with a thermal bath. The practical applicability of this approach is demonstrated for several cantilevers with eigenfrequencies ranging from 50 kHz to 2 MHz. As such an analysis requires a sophisticated spectral analysis, we introduce a new method to determine k_n from a spectral analysis of the demodulated oscillation signal of the excited cantilever that can be performed in the frequency range of 10 Hz to 1 kHz regardless of the eigenfrequency of the cantilever. We demonstrate that the latter method is in particular useful for noncontact atomic force microscopy (NC-AFM) where the required simple instrumentation for spectral analysis is available in most experimental systems.
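The relation underlying such thermal-noise calibrations is the equipartition theorem, stated here in its textbook form rather than the paper's full treatment:

```latex
\tfrac{1}{2}\, k_n \langle z_n^2 \rangle = \tfrac{1}{2}\, k_B T
\;\;\Longrightarrow\;\;
k_n = \frac{k_B T}{\langle z_n^2 \rangle},
\qquad
\langle z_n^2 \rangle = \int_{\text{peak } n} S_z(f)\, \mathrm{d}f,
```

where S_z(f) is the power spectral density of the cantilever displacement, the integral runs over the nth resonance peak, and T is the temperature of the thermal bath.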
Methanococcus jannaschii genome: revisited
NASA Technical Reports Server (NTRS)
Kyrpides, N. C.; Olsen, G. J.; Klenk, H. P.; White, O.; Woese, C. R.
1996-01-01
Analysis of genomic sequences is necessarily an ongoing process. Initial gene assignments tend (wisely) to be on the conservative side (Venter, 1996). The analysis of the genome then grows in an iterative fashion as additional data and more sophisticated algorithms are brought to bear on the data. The present report is an emendation of the original gene list of Methanococcus jannaschii (Bult et al., 1996). By using a somewhat more updated database and more relaxed (and operator-intensive) pattern matching methods, we were able to add significantly to, and in a few cases amend, the gene identification table originally published by Bult et al. (1996).
Beyond mind-reading: multi-voxel pattern analysis of fMRI data.
Norman, Kenneth A; Polyn, Sean M; Detre, Greg J; Haxby, James V
2006-09-01
A key challenge for cognitive neuroscience is determining how mental representations map onto patterns of neural activity. Recently, researchers have started to address this question by applying sophisticated pattern-classification algorithms to distributed (multi-voxel) patterns of functional MRI data, with the goal of decoding the information that is represented in the subject's brain at a particular point in time. This multi-voxel pattern analysis (MVPA) approach has led to several impressive feats of mind reading. More importantly, MVPA methods constitute a useful new tool for advancing our understanding of neural information processing. We review how researchers are using MVPA methods to characterize neural coding and information processing in domains ranging from visual perception to memory search.
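In practice the core of MVPA is a cross-validated classifier applied to trial-by-voxel matrices; a minimal sketch with synthetic data (fMRI preprocessing and pattern estimation are assumed to have been done already) looks like this:

```python
# Minimal MVPA-style decoding sketch: classify the stimulus category presented on each
# trial from its multi-voxel activity pattern, with cross-validation (synthetic data).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n_trials, n_voxels = 200, 500
labels = rng.integers(0, 2, n_trials)               # e.g. faces (0) vs. houses (1)
patterns = rng.normal(0, 1, (n_trials, n_voxels))   # trial-by-voxel activity estimates
patterns[:, :20] += 0.5 * labels[:, None]           # weak category signal in a subset of voxels

clf = LinearSVC(max_iter=10000)
acc = cross_val_score(clf, patterns, labels, cv=5)  # chance level is 0.5
print("decoding accuracy per fold:", acc.round(2), "| mean:", round(acc.mean(), 2))
```

Above-chance cross-validated accuracy indicates that the distributed pattern carries information about the stimulus category, which is the basic inference MVPA supports.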
Structural analysis of jewelry from the Moche tomb of the `lady of Cao' by X-ray digital radiography
NASA Astrophysics Data System (ADS)
Azeredo, S. R.; Cesareo, R.; Franco, R.; Fernandez, A.; Bustamante, A.; Lopes, R. T.
2018-04-01
Nose ornaments from the tomb of the `Lady of Cao', a mummified woman representative of the Moche culture and dated to the third-or-fourth century AD, were analyzed by X-ray digital radiography. These spectacular gold and silver jewels are some of the most sophisticated metalworking ever produced in ancient America. The Moche civilization flourished along the north coast of present-day Peru, between the Andes and the Pacific Ocean, approximately between 100 and 600 AD. The Moche were very sophisticated artisans and metalsmiths, being considered the finest producers of jewels and artifacts of the region. A portable X-ray digital radiography (XDR) system consisting of a flat panel detector with high resolution image and a mini X-ray tube was used for the structural analysis of the Moche jewels aiming at inferring different joining methods of the silver-gold sheets. The radiographic analysis showed some differences in the joint of the silver-and-gold sheets. Presence of filler material and adhesive for joining the silver-and-gold sheets was visible as well as silver-gold junctions without filler material (or with a material invisible in radiography). Furthermore, the technique demonstrated the advantage of using a portable XDR micro system when the sample cannot be brought to the laboratory.
Rogers, R; Sewell, K W; Morey, L C; Ustad, K L
1996-12-01
Psychological assessment with multiscale inventories is largely dependent on the honesty and forthrightness of those persons evaluated. We investigated the effectiveness of the Personality Assessment Inventory (PAI) in detecting participants feigning three specific disorders: schizophrenia, major depression, and generalized anxiety disorder. With a simulation design, we tested the PAI validity scales on 166 naive (undergraduates with minimal preparation) and 80 sophisticated (doctoral psychology students with 1 week preparation) participants. We compared their results to persons with the designated disorders: schizophrenia (n = 45), major depression (n = 136), and generalized anxiety disorder (n = 40). Although moderately effective with naive simulators, the validity scales evidenced only modest positive predictive power with their sophisticated counterparts. Therefore, we performed a two-stage discriminant analysis that yielded a moderately high hit rate (> 80%) that was maintained in the cross-validation sample, irrespective of the feigned disorder or the sophistication of the simulators.
Analysing attitude data through ridit schemes.
El-rouby, M G
1994-12-02
The attitudes of individuals and populations on various issues are usually assessed through sample surveys. Responses to survey questions are then scaled and combined into a meaningful whole which defines the measured attitude. The applied scales may be of nominal, ordinal, interval, or ratio nature depending upon the degree of sophistication the researcher wants to introduce into the measurement. This paper discusses methods of analysis for categorical variables of the type used in attitude and human behavior research, and recommends adoption of ridit analysis, a technique which has been successfully applied to epidemiological, clinical investigation, laboratory, and microbiological data. The ridit methodology is described after reviewing some general attitude scaling methods and problems of analysis related to them. The ridit method is then applied to a recent study conducted to assess health care service quality in North Carolina. This technique is conceptually and computationally simpler than other conventional statistical methods, and is also distribution-free. Basic requirements and limitations on its use are indicated.
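The ridit computation itself is short; a sketch under the usual definition (generic illustration with invented counts, not the paper's data): each ordinal category receives a ridit from a reference distribution, and a comparison group is summarized by its mean ridit.

```python
# Ridit analysis sketch: ridits from a reference group, mean ridit for a comparison group.
# A mean ridit of 0.5 means the comparison group is distributed like the reference group.
import numpy as np

# Counts over ordered response categories, e.g. "very poor" ... "excellent"
reference  = np.array([10, 25, 40, 60, 65])     # reference group (invented counts)
comparison = np.array([ 5, 15, 30, 70, 80])     # comparison group (invented counts)

p_ref = reference / reference.sum()
# Ridit of category j = proportion below j plus half the proportion in j (reference group)
ridits = np.cumsum(p_ref) - 0.5 * p_ref

p_cmp = comparison / comparison.sum()
mean_ridit = np.sum(p_cmp * ridits)
print("category ridits:", ridits.round(3))
print("mean ridit of comparison group:", round(mean_ridit, 3))   # > 0.5: shifted upward
```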
BOOK REVIEW: Vortex Methods: Theory and Practice
NASA Astrophysics Data System (ADS)
Cottet, G.-H.; Koumoutsakos, P. D.
2001-03-01
The book Vortex Methods: Theory and Practice presents a comprehensive account of the numerical technique for solving fluid flow problems. It provides a very nice balance between the theoretical development and analysis of the various techniques and their practical implementation. In fact, the presentation of the rigorous mathematical analysis of these methods instills confidence in their implementation. The book goes into some detail on the more recent developments that attempt to account for viscous effects, in particular the presence of viscous boundary layers in some flows of interest. The presentation is very readable, with most points illustrated with well-chosen examples, some quite sophisticated. It is a very worthy reference book that should appeal to a large body of readers, from those interested in the mathematical analysis of the methods to practitioners of computational fluid dynamics. The use of the book as a text is compromised by its lack of exercises for students, but it could form the basis of a graduate special topics course. Juan Lopez
Peter, Johannes; Rosman, Tom; Mayer, Anne-Kathrin; Leichner, Nikolas; Krampen, Günter
2016-06-01
Particularly in higher education, not only a view of science as a means of finding absolute truths (absolutism), but also a view of science as generally tentative (multiplicism) can be unsophisticated and obstructive for learning. Most quantitative epistemic belief inventories neglect this and understand epistemic sophistication as disagreement with absolute statements. This article suggests considering absolutism and multiplicism as separate dimensions. Following our understanding of epistemic sophistication as a cautious and reluctant endorsement of both positions, we assume evaluativism (a contextually adaptive view of knowledge as personally constructed and evidence-based) to be reflected by low agreement with both generalized absolute and generalized multiplicistic statements. Three studies with a total sample size of N = 416 psychology students were conducted. A domain-specific inventory containing both absolute and multiplicistic statements was developed. Expectations were tested by exploratory factor analysis, confirmatory factor analysis, and correlational analyses. Results revealed a two-factor solution with an absolute and a multiplicistic factor. Criterion validity of both factors was confirmed. Cross-sectional analyses revealed that agreement to generalized multiplicistic statements decreases with study progress. Moreover, consistent with our understanding of epistemic sophistication as a reluctant attitude towards generalized epistemic statements, evidence for a negative relationship between epistemic sophistication and need for cognitive closure was found. We recommend including multiplicistic statements into epistemic belief questionnaires and considering them as a separate dimension, especially when investigating individuals in later stages of epistemic development (i.e., in higher education). © 2015 The British Psychological Society.
Current management of overactive bladder.
Cartwright, Rufus; Renganathan, Arasee; Cardozo, Linda
2008-10-01
The concept of overactive bladder has helped us address the problem of urgency and urge incontinence from a symptomatic perspective. In this review, we provide a critical summary of clinically relevant recent publications, focusing in particular on advances in our understanding of assessment methods and therapeutic interventions for overactive bladder in women. According to current definitions, the prevalence of overactive bladder in western nations is now estimated as 13.0%. Although the prevalence increases with age, the symptoms of overactive bladder may follow a relapsing and remitting course. There has been a proliferation of validated symptom and quality of life measures and increasing sophistication in the analysis of bladder diaries. The role of urodynamics in the evaluation of urgency remains uncertain, with many trials showing limited benefit as a preoperative investigation. Fluid restriction and bladder retraining remain important first-line interventions. Many new anticholinergic medications have been licensed, with limited benefits compared with existing preparations. Intravesical botulinum toxin has become a popular alternative for patients who fail oral therapies. Although there have been few important therapeutic innovations, recent publications have led to greater sophistication in assessment methods and a clearer understanding of the role of existing interventions.
1981-07-01
Safety Program; Erosion; Embankments; Watchung Lake Dam, N.J.; Visual Inspection; Structural Analysis; Spillways. ...determined by a qualified professional consultant engaged by the owner using more sophisticated methods, procedures and studies within six months ...be overtopped. (The SDF, in this instance, is one half of the Probable Maximum Flood.) The decision to consider the spillway "inadequate" instead of
Carling, Christopher; Bloomfield, Jonathan; Nelsen, Lee; Reilly, Thomas
2008-01-01
The optimal physical preparation of elite soccer (association football) players has become an indispensable part of the professional game, especially due to the increased physical demands of match-play. The monitoring of players' work rate profiles during competition is now feasible through computer-aided motion analysis. Traditional methods of motion analysis were extremely labour intensive and were largely restricted to university-based research projects. Recent technological developments have meant that sophisticated systems, capable of quickly recording and processing the data of all players' physical contributions throughout an entire match, are now being used in elite club environments. In recognition of the important role that motion analysis now plays as a tool for measuring the physical performance of soccer players, this review critically appraises various motion analysis methods currently employed in elite soccer and explores research conducted using these methods. This review therefore aims to increase the awareness of both practitioners and researchers of the various motion analysis systems available, and identify practical implications of the established body of knowledge, while highlighting areas that require further exploration.
ERIC Educational Resources Information Center
Greene, Jeffrey Alan; Azevedo, Roger
2009-01-01
In this study, we used think-aloud verbal protocols to examine how various macro-level processes of self-regulated learning (SRL; e.g., planning, monitoring, strategy use, handling of task difficulty and demands) were associated with the acquisition of a sophisticated mental model of a complex biological system. Numerous studies examine how…
Fast automated analysis of strong gravitational lenses with convolutional neural networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hezaveh, Yashar D.; Levasseur, Laurence Perreault; Marshall, Philip J.
Quantifying image distortions caused by strong gravitational lensing—the formation of multiple images of distant sources due to the deflection of their light by the gravity of intervening structures—and estimating the corresponding matter distribution of these structures (the ‘gravitational lens’) has primarily been performed using maximum likelihood modelling of observations. This procedure is typically time- and resource-consuming, requiring sophisticated lensing codes, several data preparation steps, and finding the maximum likelihood model parameters in a computationally expensive process with downhill optimizers. Accurate analysis of a single gravitational lens can take up to a few weeks and requires expert knowledge of the physical processes and methods involved. Tens of thousands of new lenses are expected to be discovered with the upcoming generation of ground and space surveys. We report the use of deep convolutional neural networks to estimate lensing parameters in an extremely fast and automated way, circumventing the difficulties that are faced by maximum likelihood methods. We also show that the removal of lens light can be made fast and automated using independent component analysis of multi-filter imaging data. Our networks can recover the parameters of the ‘singular isothermal ellipsoid’ density profile, which is commonly used to model strong lensing systems, with an accuracy comparable to the uncertainties of sophisticated models but about ten million times faster: 100 systems in approximately one second on a single graphics processing unit. These networks can provide a way for non-experts to obtain estimates of lensing parameters for large samples of data.
Fast automated analysis of strong gravitational lenses with convolutional neural networks
Hezaveh, Yashar D.; Levasseur, Laurence Perreault; Marshall, Philip J.
2017-08-30
Quantifying image distortions caused by strong gravitational lensing—the formation of multiple images of distant sources due to the deflection of their light by the gravity of intervening structures—and estimating the corresponding matter distribution of these structures (the ‘gravitational lens’) has primarily been performed using maximum likelihood modelling of observations. This procedure is typically time- and resource-consuming, requiring sophisticated lensing codes, several data preparation steps, and finding the maximum likelihood model parameters in a computationally expensive process with downhill optimizers. Accurate analysis of a single gravitational lens can take up to a few weeks and requires expert knowledge of the physical processes and methods involved. Tens of thousands of new lenses are expected to be discovered with the upcoming generation of ground and space surveys. We report the use of deep convolutional neural networks to estimate lensing parameters in an extremely fast and automated way, circumventing the difficulties that are faced by maximum likelihood methods. We also show that the removal of lens light can be made fast and automated using independent component analysis of multi-filter imaging data. Our networks can recover the parameters of the ‘singular isothermal ellipsoid’ density profile, which is commonly used to model strong lensing systems, with an accuracy comparable to the uncertainties of sophisticated models but about ten million times faster: 100 systems in approximately one second on a single graphics processing unit. These networks can provide a way for non-experts to obtain estimates of lensing parameters for large samples of data.
Fast automated analysis of strong gravitational lenses with convolutional neural networks
NASA Astrophysics Data System (ADS)
Hezaveh, Yashar D.; Levasseur, Laurence Perreault; Marshall, Philip J.
2017-08-01
Quantifying image distortions caused by strong gravitational lensing—the formation of multiple images of distant sources due to the deflection of their light by the gravity of intervening structures—and estimating the corresponding matter distribution of these structures (the ‘gravitational lens’) has primarily been performed using maximum likelihood modelling of observations. This procedure is typically time- and resource-consuming, requiring sophisticated lensing codes, several data preparation steps, and finding the maximum likelihood model parameters in a computationally expensive process with downhill optimizers. Accurate analysis of a single gravitational lens can take up to a few weeks and requires expert knowledge of the physical processes and methods involved. Tens of thousands of new lenses are expected to be discovered with the upcoming generation of ground and space surveys. Here we report the use of deep convolutional neural networks to estimate lensing parameters in an extremely fast and automated way, circumventing the difficulties that are faced by maximum likelihood methods. We also show that the removal of lens light can be made fast and automated using independent component analysis of multi-filter imaging data. Our networks can recover the parameters of the ‘singular isothermal ellipsoid’ density profile, which is commonly used to model strong lensing systems, with an accuracy comparable to the uncertainties of sophisticated models but about ten million times faster: 100 systems in approximately one second on a single graphics processing unit. These networks can provide a way for non-experts to obtain estimates of lensing parameters for large samples of data.
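As an illustration of the general approach (not the published network), a minimal convolutional regressor for the five SIE parameters might look like the following PyTorch sketch; the architecture, image size, and parameter ordering are assumptions made for the example.

```python
import torch
import torch.nn as nn

class LensParamNet(nn.Module):
    """Toy CNN that regresses 5 SIE parameters (Einstein radius, two
    ellipticity components, centroid x/y) from a single-band image cutout.
    Architecture and image size are illustrative, not the published model."""
    def __init__(self, n_params: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, n_params)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

# Training would minimize a regression loss against simulated lenses;
# here we only show a forward pass on a batch of hypothetical 96x96 cutouts.
model = LensParamNet()
images = torch.randn(8, 1, 96, 96)
print(model(images).shape)  # torch.Size([8, 5])
```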
NASA Astrophysics Data System (ADS)
Tsalamengas, John L.
2018-07-01
We study plane-wave electromagnetic scattering by radially and strongly inhomogeneous dielectric cylinders at oblique incidence. The method of analysis relies on an exact reformulation of the underlying field equations as a first-order 4 × 4 system of differential equations and on the ability to restate the associated initial-value problem in the form of a system of coupled linear Volterra integral equations of the second kind. The integral equations so derived are discretized via a sophisticated variant of the Nyström method. The proposed method yields results accurate up to machine precision without relying on approximations. Numerical results and case studies ably demonstrate the efficiency and high accuracy of the algorithms.
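The Nyström idea itself can be sketched on a scalar Volterra equation of the second kind; the following Python example uses a plain trapezoidal rule and an illustrative kernel, not the paper's 4 × 4 electromagnetic system or its more sophisticated quadrature variant.

```python
import numpy as np

def volterra_nystrom(f, K, a, b, n):
    """Trapezoidal Nystrom scheme for u(x) = f(x) + int_a^x K(x,t) u(t) dt.
    Marches forward in x, solving for u at each node."""
    x = np.linspace(a, b, n + 1)
    h = (b - a) / n
    u = np.empty(n + 1)
    u[0] = f(x[0])
    for i in range(1, n + 1):
        w = np.full(i + 1, h)            # trapezoid weights for nodes t_0..t_i
        w[0] = w[-1] = h / 2
        rhs = f(x[i]) + np.dot(w[:-1], K(x[i], x[:i]) * u[:i])
        u[i] = rhs / (1.0 - w[-1] * K(x[i], x[i]))
    return x, u

# Illustrative check: u(x) = exp(x) solves u(x) = 1 + int_0^x u(t) dt
x, u = volterra_nystrom(lambda s: 1.0,
                        lambda s, t: np.ones_like(t),
                        0.0, 1.0, 200)
print(np.max(np.abs(u - np.exp(x))))     # small discretization error
```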
Joint Research on Scatterometry and AFM Wafer Metrology
NASA Astrophysics Data System (ADS)
Bodermann, Bernd; Buhr, Egbert; Danzebrink, Hans-Ulrich; Bär, Markus; Scholze, Frank; Krumrey, Michael; Wurm, Matthias; Klapetek, Petr; Hansen, Poul-Erik; Korpelainen, Virpi; van Veghel, Marijn; Yacoot, Andrew; Siitonen, Samuli; El Gawhary, Omar; Burger, Sven; Saastamoinen, Toni
2011-11-01
Supported by the European Commission and EURAMET, a consortium of 10 participants from national metrology institutes, universities and companies has started a joint research project with the aim of overcoming current challenges in optical scatterometry for traceable linewidth metrology. Both experimental and modelling methods will be enhanced and different methods will be compared with each other and with specially adapted atomic force microscopy (AFM) and scanning electron microscopy (SEM) measurement systems in measurement comparisons. Additionally, novel methods for sophisticated data analysis will be developed and investigated to reach significant reductions of the measurement uncertainties in critical dimension (CD) metrology. One final goal will be the realisation of a wafer-based reference standard material for calibration of scatterometers.
An Excel-based implementation of the spectral method of action potential alternans analysis.
Pearman, Charles M
2014-12-01
Action potential (AP) alternans has been well established as a mechanism of arrhythmogenesis and sudden cardiac death. Proper interpretation of AP alternans requires a robust method of alternans quantification. Traditional methods of alternans analysis neglect higher order periodicities that may have greater pro-arrhythmic potential than classical 2:1 alternans. The spectral method of alternans analysis, already widely used in the related study of microvolt T-wave alternans, has also been used to study AP alternans. Software to meet the specific needs of AP alternans analysis is not currently available in the public domain. An AP analysis tool is implemented here, written in Visual Basic for Applications and using Microsoft Excel as a shell. This performs a sophisticated analysis of alternans behavior allowing reliable distinction of alternans from random fluctuations, quantification of alternans magnitude, and identification of which phases of the AP are most affected. In addition, the spectral method has been adapted to allow detection and quantification of higher order regular oscillations. Analysis of action potential morphology is also performed. A simple user interface enables easy import, analysis, and export of collated results. © 2014 The Author. Physiological Reports published by Wiley Periodicals, Inc. on behalf of the American Physiological Society and The Physiological Society.
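The spectral calculation at the heart of such a tool can be sketched compactly; the following Python fragment (not the author's Excel/VBA implementation) computes the power at 0.5 cycles/beat for a beat-indexed series and a k-score against an assumed noise band, with a synthetic alternating series as input.

```python
import numpy as np

def spectral_alternans(beat_series):
    """Spectral alternans analysis of a beat-indexed measurement
    (e.g. AP amplitude at a fixed phase, one value per beat).
    Assumes an even number of beats so that the last spectral bin
    lies exactly at 0.5 cycles/beat. Returns (alternans power, k-score)."""
    x = np.asarray(beat_series, dtype=float)
    x = x - x.mean()
    spec = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0)          # cycles per beat
    alt_power = spec[-1]                            # bin at 0.5 cycles/beat
    noise = spec[(freqs >= 0.33) & (freqs < 0.48)]  # assumed noise band
    k_score = (alt_power - noise.mean()) / noise.std()
    return alt_power, k_score

# Hypothetical series: 128 beats with a small 2:1 alternation plus noise
rng = np.random.default_rng(0)
beats = 100 + 0.5 * (-1) ** np.arange(128) + rng.normal(0, 0.2, 128)
print(spectral_alternans(beats))  # large k-score indicates significant alternans
```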
The Matrix Element Method: Past, Present, and Future
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gainer, James S.; Lykken, Joseph; Matchev, Konstantin T.
2013-07-12
The increasing use of multivariate methods, and in particular the Matrix Element Method (MEM), represents a revolution in experimental particle physics. With continued exponential growth in computing capabilities, the use of sophisticated multivariate methods, already common, will soon become ubiquitous and ultimately almost compulsory. While the existence of sophisticated algorithms for disentangling signal and background might naively suggest a diminished role for theorists, the use of the MEM, with its inherent connection to the calculation of differential cross sections, will benefit from collaboration between theorists and experimentalists. In this white paper, we will briefly describe the MEM and some of its recent uses, note some current issues and potential resolutions, and speculate about exciting future opportunities.
Supercomputer use in orthopaedic biomechanics research: focus on functional adaptation of bone.
Hart, R T; Thongpreda, N; Van Buskirk, W C
1988-01-01
The authors describe two biomechanical analyses carried out using numerical methods. One is an analysis of the stress and strain in a human mandible, and the other analysis involves modeling the adaptive response of a sheep bone to mechanical loading. The computing environment required for the two types of analyses is discussed. It is shown that a simple stress analysis of a geometrically complex mandible can be accomplished using a minicomputer. However, more sophisticated analyses of the same model with dynamic loading or nonlinear materials would require supercomputer capabilities. A supercomputer is also required for modeling the adaptive response of living bone, even when simple geometric and material models are used.
NASA Astrophysics Data System (ADS)
Zaleta, Kristy L.
The purpose of this study was to investigate the impact of gender and type of inquiry curriculum (open or structured) on science process skills and epistemological beliefs in science of sixth grade students. The current study took place in an urban northeastern middle school. The researcher utilized a sample of convenience comprised of 303 sixth grade students taught by four science teachers on separate teams. The study employed mixed methods with a quasi-experimental design, pretest-posttest comparison group with 17 intact classrooms of students. Students' science process skills and epistemological beliefs in science (source, certainty, development, and justification) were measured before and after the intervention, which exposed different groups of students to different types of inquiry (structured or open). Differences between comparison and treatment groups and between male and female students were analyzed after the intervention, on science process skills, using a two-way analysis of covariance (ANCOVA), and, on epistemological beliefs in science, using a two-way multivariate analysis of covariance (MANCOVA). Responses from two focus groups of open inquiry students were cycle coded and examined for themes and patterns. Quantitative measurements indicated that girls scored significantly higher on science process skills than boys, regardless of type of inquiry instruction. Neither gender nor type of inquiry instruction predicted students' epistemological beliefs in science after accounting for students' pretest scores. The dimension Development accounted for 10.6% of the variance in students' science process skills. Qualitative results indicated that students with sophisticated epistemological beliefs expressed engagement with the open-inquiry curriculum. Students in both the sophisticated and naive beliefs groups identified challenges with the curriculum and improvement in learning as major themes. The types of challenges identified differed between the groups: sophisticated beliefs group students focused on their insecurity of not knowing how to complete the activities correctly, and naive beliefs group students focused on the amount of work and how long it took them to complete it. The description of the improvement in learning was at a basic level for the naive beliefs group and at a more complex level for the sophisticated beliefs group. Implications for researchers and educators are discussed.
Keeping the History in Historical Seismology: The 1872 Owens Valley, California Earthquake
NASA Astrophysics Data System (ADS)
Hough, Susan E.
2008-07-01
The importance of historical earthquakes is being increasingly recognized. Careful investigations of key pre-instrumental earthquakes can provide critical information and insights not only for seismic hazard assessment but also for earthquake science. In recent years, with the explosive growth in computational sophistication in Earth sciences, researchers have developed increasingly sophisticated methods to analyze macroseismic data quantitatively. These methodological developments can be extremely useful to exploit fully the temporally and spatially rich information source that seismic intensities often represent. For example, the exhaustive and painstaking investigations done by Ambraseys and his colleagues of early Himalayan earthquakes provide information that can be used to map out site response in the Ganges basin. In any investigation of macroseismic data, however, one must stay mindful that intensity values are not data but rather interpretations. The results of any subsequent analysis, regardless of the degree of sophistication of the methodology, will be only as reliable as the interpretations of available accounts—and only as complete as the research done to ferret out, and in many cases translate, these accounts. When intensities are assigned without an appreciation of historical setting and context, seemingly careful subsequent analysis can yield grossly inaccurate results. As a case study, I report here on the results of a recent investigation of the 1872 Owens Valley, California earthquake. Careful consideration of macroseismic observations reveals that this event was probably larger than the great San Francisco earthquake of 1906, and possibly the largest historical earthquake in California. The results suggest that some large earthquakes in California will generate significantly larger ground motions than San Andreas fault events of comparable magnitude.
Keeping the History in Historical Seismology: The 1872 Owens Valley, California Earthquake
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hough, Susan E.
2008-07-08
The importance of historical earthquakes is being increasingly recognized. Careful investigations of key pre-instrumental earthquakes can provide critical information and insights not only for seismic hazard assessment but also for earthquake science. In recent years, with the explosive growth in computational sophistication in Earth sciences, researchers have developed increasingly sophisticated methods to analyze macroseismic data quantitatively. These methodological developments can be extremely useful to exploit fully the temporally and spatially rich information source that seismic intensities often represent. For example, the exhaustive and painstaking investigations done by Ambraseys and his colleagues of early Himalayan earthquakes provide information that can be used to map out site response in the Ganges basin. In any investigation of macroseismic data, however, one must stay mindful that intensity values are not data but rather interpretations. The results of any subsequent analysis, regardless of the degree of sophistication of the methodology, will be only as reliable as the interpretations of available accounts - and only as complete as the research done to ferret out, and in many cases translate, these accounts. When intensities are assigned without an appreciation of historical setting and context, seemingly careful subsequent analysis can yield grossly inaccurate results. As a case study, I report here on the results of a recent investigation of the 1872 Owens Valley, California earthquake. Careful consideration of macroseismic observations reveals that this event was probably larger than the great San Francisco earthquake of 1906, and possibly the largest historical earthquake in California. The results suggest that some large earthquakes in California will generate significantly larger ground motions than San Andreas fault events of comparable magnitude.
Hybrid Intrusion Forecasting Framework for Early Warning System
NASA Astrophysics Data System (ADS)
Kim, Sehun; Shin, Seong-Jun; Kim, Hyunwoo; Kwon, Ki Hoon; Han, Younggoo
Recently, cyber attacks have become a serious hindrance to the stability of the Internet. These attacks exploit the interconnectivity of networks, propagate in an instant, and have become more sophisticated and evolutionary. Traditional Internet security systems such as firewalls, IDS and IPS are limited in terms of detecting recent cyber attacks in advance, as these systems respond to Internet attacks only after the attacks inflict serious damage. In this paper, we propose a hybrid intrusion forecasting system framework for an early warning system. The proposed system utilizes three types of forecasting methods: time-series analysis, probabilistic modeling, and data mining method. By combining these methods, it is possible to take advantage of the forecasting technique of each while overcoming their drawbacks. Experimental results show that the hybrid intrusion forecasting method outperforms each of the three forecasting methods.
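As a rough illustration of the combining step only (the components, weights, and thresholds below are placeholders, not the framework proposed in the paper), three simple forecasters can be fused into a single alert score:

```python
import numpy as np
from math import exp, factorial

def exp_smoothing_forecast(counts, alpha=0.3):
    """One-step-ahead exponential smoothing of daily attack counts (time-series component)."""
    level = counts[0]
    for c in counts[1:]:
        level = alpha * c + (1 - alpha) * level
    return level

def poisson_exceedance(counts, threshold):
    """P(next count > threshold) under a Poisson rate fitted to history (probabilistic component)."""
    lam = float(np.mean(counts))
    cdf = sum(exp(-lam) * lam**k / factorial(k) for k in range(threshold + 1))
    return 1.0 - cdf

def anomaly_score(counts):
    """Z-score of the latest observation (stand-in for a data-mining detector)."""
    hist = np.asarray(counts[:-1], dtype=float)
    return (counts[-1] - hist.mean()) / (hist.std() + 1e-9)

def hybrid_alert(counts, threshold=30, weights=(0.4, 0.4, 0.2)):
    """Weighted fusion of the three components into a single alert score in [0, 1]."""
    f = min(exp_smoothing_forecast(counts) / threshold, 1.0)
    p = poisson_exceedance(counts, threshold)
    a = min(max(anomaly_score(counts) / 3.0, 0.0), 1.0)
    return weights[0] * f + weights[1] * p + weights[2] * a

history = [12, 15, 11, 18, 22, 19, 25, 31, 28, 35]   # hypothetical daily attack counts
print(hybrid_alert(history))   # closer to 1.0 => raise an early warning
```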
Astronautic Structures Manual, Volume 3
NASA Technical Reports Server (NTRS)
1975-01-01
This document (Volumes I, II, and III) presents a compilation of industry-wide methods in aerospace strength analysis that can be carried out by hand, that are general enough in scope to cover most structures encountered, and that are sophisticated enough to give accurate estimates of the actual strength expected. It provides analysis techniques for the elastic and inelastic stress ranges. It serves not only as a catalog of methods not usually available, but also as a reference source for the background of the methods themselves. An overview of the manual is as follows: Section A is a general introduction of methods used and includes sections on loads, combined stresses, and interaction curves; Section B is devoted to methods of strength analysis; Section C is devoted to the topic of structural stability; Section D is on thermal stresses; Section E is on fatigue and fracture mechanics; Section F is on composites; Section G is on rotating machinery; and Section H is on statistics. These three volumes supersede Volumes I and II, NASA TM X-60041 and NASA TM X-60042, respectively.
Laser Powered Launch Vehicle Performance Analyses
NASA Technical Reports Server (NTRS)
Chen, Yen-Sen; Liu, Jiwen; Wang, Ten-See (Technical Monitor)
2001-01-01
The purpose of this study is to establish the technical ground for modeling the physics of the laser powered pulse detonation phenomenon. Laser powered propulsion systems involve complex fluid dynamics, thermodynamics and radiative transfer processes. Successful predictions of the performance of laser powered launch vehicle concepts depend on sophisticated models that reflect the underlying flow physics, including laser ray tracing and focusing, inverse Bremsstrahlung (IB) effects, finite-rate air chemistry, thermal non-equilibrium, plasma radiation and detonation wave propagation, etc. The proposed work will extend the base-line numerical model to an efficient design analysis tool. The proposed model is suitable for 3-D analysis using parallel computing methods.
Quantum algorithms for topological and geometric analysis of data
Lloyd, Seth; Garnerone, Silvano; Zanardi, Paolo
2016-01-01
Extracting useful information from large data sets can be a daunting task. Topological methods for analysing data sets provide a powerful technique for extracting such information. Persistent homology is a sophisticated tool for identifying topological features and for determining how such features persist as the data is viewed at different scales. Here we present quantum machine learning algorithms for calculating Betti numbers—the numbers of connected components, holes and voids—in persistent homology, and for finding eigenvectors and eigenvalues of the combinatorial Laplacian. The algorithms provide an exponential speed-up over the best currently known classical algorithms for topological data analysis. PMID:26806491
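For readers unfamiliar with the quantities involved, the simplest of them, the number of connected components (Betti number b0) at a fixed scale, can be computed classically in a few lines; this sketch only illustrates what the quantum algorithm accelerates, not how the quantum algorithm works.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components
from scipy.spatial.distance import pdist, squareform

def betti0_at_scale(points, epsilon):
    """Number of connected components (Betti-0) of the Vietoris-Rips
    graph in which points closer than epsilon are joined by an edge."""
    d = squareform(pdist(points))
    adjacency = csr_matrix((d < epsilon) & (d > 0))
    n_components, _ = connected_components(adjacency, directed=False)
    return n_components

# Hypothetical data: two well-separated noisy clusters in the plane
rng = np.random.default_rng(1)
cloud = np.vstack([rng.normal(0, 0.1, (50, 2)),
                   rng.normal(3, 0.1, (50, 2))])
for eps in (0.05, 0.5, 5.0):
    print(eps, betti0_at_scale(cloud, eps))  # components merge as the scale grows
```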
Kühnemund, Malte; Hernández-Neuta, Iván; Sharif, Mohd Istiaq; Cornaglia, Matteo; Gijs, Martin A.M.
2017-01-01
Single molecule quantification assays provide the ultimate sensitivity and precision for molecular analysis. However, most digital analysis techniques, e.g. droplet PCR, require sophisticated and expensive instrumentation for molecule compartmentalization, amplification and analysis. Rolling circle amplification (RCA) provides a simpler means for digital analysis. Nevertheless, the sensitivity of RCA assays has until now been limited by inefficient detection methods. We have developed a simple microfluidic strategy for enrichment of RCA products into a single field of view of a low magnification fluorescent sensor, enabling ultra-sensitive digital quantification of nucleic acids over a dynamic range from 1.2 aM to 190 fM. We prove the broad applicability of our analysis platform by demonstrating 5-plex detection of as little as ∼1 pg (∼300 genome copies) of pathogenic DNA with simultaneous antibiotic resistance marker detection, and the analysis of rare oncogene mutations. Our method is simpler, more cost-effective and faster than other digital analysis techniques and provides the means to implement digital analysis in any laboratory equipped with a standard fluorescent microscope. PMID:28077562
Analysis of Classes of Superlinear Semipositone Problems with Nonlinear Boundary Conditions
NASA Astrophysics Data System (ADS)
Morris, Quinn A.
We study positive radial solutions for classes of steady state reaction diffusion problems on the exterior of a ball with both Dirichlet and nonlinear boundary conditions. We consider p-Laplacian problems (p > 1) with reaction terms which are superlinear at infinity and semipositone. In the case p = 2, using variational methods, we establish the existence of a solution, and via detailed analysis of the Green's function, we prove the positivity of the solution. In the case p ≠ 2, we again use variational methods to establish the existence of a solution, but the positivity of the solution is achieved via sophisticated a priori estimates. In the case p ≠ 2, the Green's function analysis is no longer available. Our results significantly enhance the literature on superlinear semipositone problems. Finally, we provide algorithms for the numerical generation of exact bifurcation curves for one-dimensional problems. In the autonomous case, we extend and analyze a quadrature method, and using nonlinear solvers in Mathematica, generate bifurcation curves. In the nonautonomous case, we employ shooting methods in Mathematica to generate bifurcation curves.
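The shooting idea can be sketched on a simpler one-dimensional model problem; the example below uses scipy rather than Mathematica and an illustrative Bratu-type nonlinearity u'' + lambda*e^u = 0 with u(0) = u(1) = 0, not the p-Laplacian semipositone problem of the dissertation.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

def shoot(lam, s):
    """Integrate u'' + lam*exp(u) = 0 on [0,1] with u(0)=0, u'(0)=s; return (u(1), solution)."""
    rhs = lambda t, y: [y[1], -lam * np.exp(y[0])]
    sol = solve_ivp(rhs, (0.0, 1.0), [0.0, s],
                    rtol=1e-8, atol=1e-10, dense_output=True)
    return sol.y[0, -1], sol

def bifurcation_point(lam, bracket=(0.0, 2.0)):
    """Find the shooting slope giving u(1)=0, then report ||u||_inf for this lam."""
    s_star = brentq(lambda s: shoot(lam, s)[0], *bracket)
    _, sol = shoot(lam, s_star)
    xs = np.linspace(0.0, 1.0, 201)
    return s_star, np.max(sol.sol(xs)[0])

# Trace a few (lam, ||u||_inf) points on the lower solution branch
for lam in (0.5, 1.0, 2.0):
    s_star, umax = bifurcation_point(lam)
    print(lam, round(s_star, 4), round(umax, 4))
```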
[Laboratory diagnosis of lipid imbalance].
Siemianowicz, K
1996-01-01
Accurate diagnosis of hyperlipidaemia is necessary for effective treatment. Measurements in serum or plasma obtained after an overnight fast of over 16 hours should include total cholesterol, triglycerides and HDL-cholesterol concentrations; LDL-cholesterol can be calculated using the Friedewald formula. Lipoprotein electrophoresis is used to define different phenotypes of hyperlipoproteinaemia according to Fredrickson's classification. More sophisticated tests include apolipoprotein analysis, determination of Lp(a) concentration, activities of enzymes involved in lipid metabolism and genetic studies. Secondary causes of hyperlipidaemia, including liver, kidney and endocrine disorders, should be excluded using laboratory methods.
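The Friedewald calculation itself is simple arithmetic: LDL-C = total cholesterol - HDL-C - triglycerides/5, with all concentrations in mg/dL, and it is not considered valid when triglycerides exceed roughly 400 mg/dL. A minimal sketch:

```python
def ldl_friedewald(total_chol, hdl, triglycerides):
    """Estimate LDL-cholesterol (mg/dL) with the Friedewald formula:
    LDL-C = total cholesterol - HDL-C - triglycerides / 5.
    Not valid when triglycerides exceed ~400 mg/dL (chylomicrons present)."""
    if triglycerides > 400:
        raise ValueError("Friedewald estimate invalid for TG > 400 mg/dL")
    return total_chol - hdl - triglycerides / 5.0

print(ldl_friedewald(total_chol=220, hdl=45, triglycerides=150))  # 145.0 mg/dL
```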
The cancer transcriptome is shaped by genetic changes, variation in gene transcription, mRNA processing, editing and stability, and the cancer microbiome. Deciphering this variation and understanding its implications on tumorigenesis requires sophisticated computational analyses. Most RNA-Seq analyses rely on methods that first map short reads to a reference genome, and then compare them to annotated transcripts or assemble them. However, this strategy can be limited when the cancer genome is substantially different than the reference or for detecting sequences from the cancer microbiome.
Shibata, Hiroshi
2013-09-01
Since Hepatitis B virus was detected as the cause of hepatitis, many highly sensitive measurement methods have been developed. Over the course of this development, many problems have arisen in accuracy, sensitivity and health insurance regulations among different types of kits based on different measurement principles. Advanced medical treatments cause problems of gene mutation or reactivation of HBV, leading to the necessity for highly sensitive and sophisticated determination. The history of clinical analysis for the detection of HBV is reviewed from the viewpoint of our experience.
WebArray: an online platform for microarray data analysis
Xia, Xiaoqin; McClelland, Michael; Wang, Yipeng
2005-01-01
Background Many cutting-edge microarray analysis tools and algorithms, including the commonly used limma and affy packages in Bioconductor, need sophisticated knowledge of mathematics, statistics and computer skills for implementation. Commercially available software can provide a user-friendly interface at considerable cost. To facilitate the use of these tools for microarray data analysis on an open platform, we developed an online microarray data analysis platform, WebArray, for bench biologists to utilize these tools to explore data from single/dual color microarray experiments. Results The currently implemented functions were based on the limma and affy packages from Bioconductor, the spacings LOESS histogram (SPLOSH) method, PCA-assisted normalization method and genome mapping method. WebArray incorporates these packages and provides a user-friendly interface for accessing a wide range of key functions of limma and others, such as spot quality weight, background correction, graphical plotting, normalization, linear modeling, empirical Bayes statistical analysis, false discovery rate (FDR) estimation, and chromosomal mapping for genome comparison. Conclusion WebArray offers a convenient platform for bench biologists to access several cutting-edge microarray data analysis tools. The website is freely available at . It runs on a Linux server with Apache and MySQL.
MHOST: An efficient finite element program for inelastic analysis of solids and structures
NASA Technical Reports Server (NTRS)
Nakazawa, S.
1988-01-01
An efficient finite element program for 3-D inelastic analysis of gas turbine hot section components was constructed and validated. A novel mixed iterative solution strategy is derived from the augmented Hu-Washizu variational principle in order to nodally interpolate coordinates, displacements, deformation, strains, stresses and material properties. A series of increasingly sophisticated material models incorporated in MHOST includes elasticity, secant plasticity, infinitesimal and finite deformation plasticity, creep, and the unified viscoplastic constitutive model proposed by Walker. A library of high performance elements is built into this computer program utilizing the concepts of selective reduced integration and independent strain interpolations. A family of efficient solution algorithms is implemented in MHOST for linear and nonlinear equation solution, including the classical Newton-Raphson, modified, quasi and secant Newton methods with optional line search and the conjugate gradient method.
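Of the solution algorithms listed, the classical Newton-Raphson iteration with a backtracking line search is easy to illustrate on a small nonlinear system; the sketch below is generic Python, not MHOST's implementation.

```python
import numpy as np

def newton_raphson(residual, jacobian, x0, tol=1e-10, max_iter=50):
    """Newton-Raphson with a backtracking line search on ||R(x)||."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residual(x)
        if np.linalg.norm(r) < tol:
            break
        dx = np.linalg.solve(jacobian(x), -r)
        step = 1.0
        # backtrack until the residual norm decreases (simple line search)
        while np.linalg.norm(residual(x + step * dx)) > np.linalg.norm(r) and step > 1e-4:
            step *= 0.5
        x = x + step * dx
    return x

# Illustrative 2x2 nonlinear system: x0^2 + x1^2 = 4, x0*x1 = 1
R = lambda x: np.array([x[0]**2 + x[1]**2 - 4.0, x[0]*x[1] - 1.0])
J = lambda x: np.array([[2*x[0], 2*x[1]], [x[1], x[0]]])
print(newton_raphson(R, J, x0=[2.0, 0.1]))
```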
CrowdMapping: A Crowdsourcing-Based Terminology Mapping Method for Medical Data Standardization.
Mao, Huajian; Chi, Chenyang; Huang, Boyu; Meng, Haibin; Yu, Jinghui; Zhao, Dongsheng
2017-01-01
Standardized terminology is a prerequisite of data exchange in the analysis of clinical processes. However, data from different electronic health record systems are based on idiosyncratic terminology systems, especially when the data come from different hospitals and healthcare organizations. Terminology standardization is necessary for medical data analysis. We propose a crowdsourcing-based terminology mapping method, CrowdMapping, to standardize the terminology in medical data. CrowdMapping uses a confidential model to determine how terminologies are mapped to a standard system, like ICD-10. The model uses mappings from different health care organizations and evaluates the diversity of the mapping to determine a more sophisticated mapping rule. Further, the CrowdMapping model enables users to rate the mapping result and interact with the model evaluation. CrowdMapping is a work-in-progress system; we present initial results of mapping terminologies.
Arvanitoyannis, Ioannis S; Vlachos, Antonios
2007-01-01
The authenticity of products labeled as olive oils, and in particular as virgin olive oils, is a very important issue in terms of both its health and commercial aspects. In view of the continuously increasing interest in the therapeutic properties of virgin olive oil, the traditional methods of characterization and physical and sensory analysis were further enriched with more advanced and sophisticated methods such as HPLC-MS, HPLC-GC/C/IRMS, RPLC-GC, DEPT, and CSIA among others. The results of both traditional and "novel" methods were treated both by means of classical multivariate analysis (cluster, principal component, correspondence, canonical, and discriminant) and artificial intelligence methods, showing that nowadays the adulteration of virgin olive oil with seed oil is detectable at very low percentages, sometimes even at less than 1%. Furthermore, the detection of the geographical origin of olive oil is equally feasible and much more accurate in countries like Italy and Spain where databases of physical/chemical properties exist. However, this geographical origin classification can also be accomplished in the absence of such databases provided that an adequate number of oil samples are used and the parameters studied have "discriminating power."
Medical history and epidemiology: their contribution to the development of public health nursing.
Earl, Catherine E
2009-01-01
The nursing profession historically has been involved in data collection in research efforts, notably from the time of the Framingham Tuberculosis Project (1914-1923). Over the past century, nurses have become more sophisticated in their abilities to design and conduct studies and to analyze data. This article discusses the contributions of medicine and epidemiology to the development of public health nursing and the use of statistical methods by nurses in the United States in the 19th and 20th centuries. Knowledge acquired from this article will inform educators and researchers about the importance of using quantitative analysis, evidence-based knowledge, and statistical methods when teaching students in all health professions.
[Assessment of pragmatics from verbal spoken data].
Gallardo-Paúls, B
2009-02-27
Pragmatic assessment is usually complex, long and sophisticated, especially for professionals who lack specific linguistic education and interact with impaired speakers. To design a quick assessment method that provides a general evaluation of the pragmatic effectiveness of neurologically affected speakers. This first filter will allow us to decide whether a detailed analysis of the altered categories should follow. Our starting point was the PerLA (perception, language and aphasia) profile of pragmatic assessment designed for the comprehensive analysis of conversational data in clinical linguistics; this was then converted into a quick questionnaire. A quick protocol of pragmatic assessment is proposed, and the results found in a group of children with attention deficit hyperactivity disorder are discussed.
Astrophysics and Big Data: Challenges, Methods, and Tools
NASA Astrophysics Data System (ADS)
Garofalo, Mauro; Botta, Alessio; Ventre, Giorgio
2017-06-01
Nowadays there is no research field that is not flooded with data. Among the sciences, astrophysics has always been driven by the analysis of massive amounts of data. The development of new and more sophisticated observation facilities, both ground-based and spaceborne, has made data more and more complex (Variety) and has driven exponential growth in both data Volume (i.e., of the order of petabytes) and Velocity of production and transmission. Therefore, new and advanced processing solutions will be needed to process this huge amount of data. We investigate some of these solutions, based on machine learning models as well as tools and architectures for Big Data analysis that can be exploited in the astrophysical context.
Eliciting Taiwanese high school students' scientific ontological and epistemic beliefs
NASA Astrophysics Data System (ADS)
Lin, Tzung-Jin; Tsai, Chin-Chung
2017-11-01
This study employed the interview method to clarify the underlying dimensions of and relationships between students' scientific ontological and epistemic beliefs. Forty Taiwanese high school students were invited to participate in this study. Through content analysis of the participants' interview responses, two ontological dimensions, 'status of nature' and 'structure of nature', were identified and found to be associated with each other. The two epistemic dimensions, 'knowledge' and 'knowing', which align with past literature, were also categorised. In addition, five pattern variations in terms of the aforementioned four dimensions were recognised based on the students' philosophical stances on their scientific ontological and epistemic beliefs. According to the Chi-square test results, both dimensions of scientific ontological beliefs were significantly related to the two dimensions of scientific epistemic beliefs, respectively. In general, the students who endorsed a more sophisticated ontological stance regarding the status and structure of nature tended to express a more mature epistemic stance toward scientific knowledge and ways of knowing. The results suggest that the maturation of students' scientific epistemic beliefs may serve as a precursor and the fundamental step in promoting the sophistication of students' scientific ontological beliefs.
Improved Design of Tunnel Supports : Executive Summary
DOT National Transportation Integrated Search
1979-12-01
This report focuses on improvement of design methodologies related to the ground-structure interaction in tunneling. The design methods range from simple analytical and empirical methods to sophisticated finite element techniques as well as an evalua...
Distance-based microfluidic quantitative detection methods for point-of-care testing.
Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James
2016-04-07
Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed.
Flow-gated radial phase-contrast imaging in the presence of weak flow.
Peng, Hsu-Hsia; Huang, Teng-Yi; Wang, Fu-Nien; Chung, Hsiao-Wen
2013-01-01
To implement a flow-gating method to acquire phase-contrast (PC) images of carotid arteries without the use of an electrocardiography (ECG) signal to synchronize the acquisition of imaging data with pulsatile arterial flow. The flow-gating method was realized through radial scanning and sophisticated post-processing methods including downsampling, complex difference, and correlation analysis to improve the evaluation of flow-gating times in radial phase-contrast scans. Flow-related parameters, including mean velocity, mean flow rate, and flow volume, were quantitatively comparable with conventional ECG-gated imaging (R = 0.92-0.96, n = 9), demonstrating that the proposed method is highly feasible. The radial flow-gating PC imaging method is applicable in carotid arteries. The proposed flow-gating method can potentially avoid setting up ECG-related equipment for brain imaging. This technique has potential use in patients with arrhythmia or weak ECG signals.
ERIC Educational Resources Information Center
Hester, Yvette
Least squares methods are sophisticated mathematical curve fitting procedures used in all classical parametric methods. The linear least squares approximation is most often associated with finding the "line of best fit" or the regression line. Since all statistical analyses are correlational and all classical parametric methods are least…
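The "line of best fit" follows directly from the least squares normal equations; a minimal numpy sketch with made-up paired observations:

```python
import numpy as np

# Hypothetical paired observations
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Solve the least squares problem for intercept b0 and slope b1 of y = b0 + b1*x
X = np.column_stack([np.ones_like(x), x])
b0, b1 = np.linalg.lstsq(X, y, rcond=None)[0]
print(b0, b1)  # slope close to 2, intercept close to 0
```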
NASA Astrophysics Data System (ADS)
Izmaylov, R.; Lebedev, A.
2015-08-01
Centrifugal compressors are complex energy equipment. Automatic control and protection systems should meet the requirements of operational reliability and durability. In turbocompressors there are at least two dangerous regimes: surge and rotating stall. Antisurge protection systems usually use parametric or feature methods. As a rule, industrial systems are parametric. The main disadvantage of anti-surge parametric systems is the difficulty of mass flow measurement in natural gas pipeline compressors. The principal idea of the feature method is based on the experimental fact that, as a rule, just before the onset of surge a rotating (precursor) stall is established in the compressor. In this case the problem consists in detecting the characteristic signals of unsteady pressure or velocity fluctuations. Wavelet analysis is the best method for detecting the onset of rotating stall in spite of a high level of spurious signals (rotating wakes, turbulence, etc.). This method is compatible with state-of-the-art DSP systems for industrial control. Examples of wavelet analysis applied to detecting the onset of rotating stall in typical stages of a centrifugal compressor are presented. Experimental investigations include unsteady pressure measurements and a sophisticated data acquisition system. The wavelet transforms used biorthogonal wavelets in Matlab.
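The detection idea, watching wavelet energy appear in a band below the rotor-related frequencies, can be sketched in Python with PyWavelets instead of the authors' Matlab toolchain; the sampling rate, frequencies, and synthetic pressure signal below are illustrative assumptions, not measured data.

```python
import numpy as np
import pywt

fs = 5000.0                        # sampling rate, Hz (assumed)
t = np.arange(0, 2.0, 1.0 / fs)
rotor_f = 200.0                    # rotor-related component, Hz (assumed)
stall_f = 70.0                     # rotating-stall precursor, sub-rotor frequency (assumed)

# Synthetic unsteady pressure: the precursor appears only in the second half
pressure = np.sin(2 * np.pi * rotor_f * t) + 0.3 * np.random.randn(t.size)
pressure[t > 1.0] += 0.8 * np.sin(2 * np.pi * stall_f * t[t > 1.0])

# Continuous wavelet transform with a Morlet wavelet over 30-150 Hz
target_freqs = np.linspace(30.0, 150.0, 60)
scales = pywt.central_frequency('morl') * fs / target_freqs
coef, freqs = pywt.cwt(pressure, scales, 'morl', sampling_period=1.0 / fs)

# Energy in the precursor band over time; a sustained rise flags stall onset
band = (freqs > 50.0) & (freqs < 90.0)
band_energy = np.mean(np.abs(coef[band]) ** 2, axis=0)
print(band_energy[t < 1.0].mean(), band_energy[t > 1.0].mean())
```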
NASA Astrophysics Data System (ADS)
Li, Leihong
A modular structural design methodology for composite blades is developed. This design method can be used to design composite rotor blades with sophisticated geometric cross-sections. This design method hierarchically decomposes the highly coupled interdisciplinary rotor analysis into global and local levels. At the global level, aeroelastic response analysis and rotor trim are conducted based on multi-body dynamic models. At the local level, variational asymptotic beam sectional analysis methods are used for the equivalent one-dimensional beam properties. Compared with traditional design methodology, the proposed method is more efficient and accurate. Then, the proposed method is used to study three different design problems that have not been investigated before. The first is to add manufacturing constraints into design optimization. The introduction of manufacturing constraints complicates the optimization process. However, the design with manufacturing constraints benefits the manufacturing process and reduces the risk of violating major performance constraints. Next, a new design procedure for structural design against fatigue failure is proposed. This procedure combines the fatigue analysis with the optimization process. The durability or fatigue analysis employs a strength-based model. The design is subject to stiffness, frequency, and durability constraints. Finally, the impacts of manufacturing uncertainty on rotor blade aeroelastic behavior are investigated, and a probabilistic design method is proposed to control the impacts of uncertainty on blade structural performance. The uncertainty factors include dimensions, shapes, material properties, and service loads.
Shiraishi, Y; Yambe, T; Saijo, Y; Sato, F; Tanaka, A; Yoshizawa, M; Sugai, T K; Sakata, R; Luo, Y; Park, Y; Uematsu, M; Umezu, M; Fujimoto, T; Masumoto, N; Liu, H; Baba, A; Konno, S; Nitta, S; Imachi, K; Tabayashi, K; Sasada, H; Homma, D
2008-01-01
The authors have been developing an artificial myocardium, which is capable of supporting natural contractile function from the outside of the ventricle. The system was originally designed by using sophisticated covalent shape memory alloy fibres, and the surface did not require blood compatibility. The purpose of our study on the development of the artificial myocardium was to achieve the assistance of myocardial functional reproduction by integrative small mechanical elements without sensors, so that effective circulatory support could be accomplished. In this study, the authors fabricated the prototype artificial myocardial assist unit composed of the sophisticated shape memory alloy fibre (Biometal), the diameter of which was 100 microns, and examined the mechanical response by using the pulse width modulation (PWM) control method in each unit. Prior to the evaluation of dynamic characteristics, the relationship between strain and electric resistance and also the initial response of each unit were obtained. The PWM control component, which consisted of an originally designed RISC microcomputer taking displacement as its input, was designed to regulate the myocardial contractile function, and its output signal was controlled by the pulse width modulation method. As a result, the optimal PWM parameters were confirmed and the fibrous displacement was successfully regulated under different heat transfer conditions simulating internal body temperature as well as bias tensile loading. It was thus indicated that this control approach might be applied to more sophisticated passive or active ventricular restraint by the artificial myocardium on physiological demand.
Applying phylogenetic analysis to viral livestock diseases: moving beyond molecular typing.
Olvera, Alex; Busquets, Núria; Cortey, Marti; de Deus, Nilsa; Ganges, Llilianne; Núñez, José Ignacio; Peralta, Bibiana; Toskano, Jennifer; Dolz, Roser
2010-05-01
Changes in livestock production systems in recent years have altered the presentation of many diseases resulting in the need for more sophisticated control measures. At the same time, new molecular assays have been developed to support the diagnosis of animal viral disease. Nucleotide sequences generated by these diagnostic techniques can be used in phylogenetic analysis to infer phenotypes by sequence homology and to perform molecular epidemiology studies. In this review, some key elements of phylogenetic analysis are highlighted, such as the selection of the appropriate neutral phylogenetic marker, the proper phylogenetic method and different techniques to test the reliability of the resulting tree. Examples are given of current and future applications of phylogenetic reconstructions in viral livestock diseases. Copyright 2009 Elsevier Ltd. All rights reserved.
Kühnemund, Malte; Hernández-Neuta, Iván; Sharif, Mohd Istiaq; Cornaglia, Matteo; Gijs, Martin A M; Nilsson, Mats
2017-05-05
Single molecule quantification assays provide the ultimate sensitivity and precision for molecular analysis. However, most digital analysis techniques, e.g. droplet PCR, require sophisticated and expensive instrumentation for molecule compartmentalization, amplification and analysis. Rolling circle amplification (RCA) provides a simpler means for digital analysis. Nevertheless, the sensitivity of RCA assays has until now been limited by inefficient detection methods. We have developed a simple microfluidic strategy for enrichment of RCA products into a single field of view of a low magnification fluorescent sensor, enabling ultra-sensitive digital quantification of nucleic acids over a dynamic range from 1.2 aM to 190 fM. We prove the broad applicability of our analysis platform by demonstrating 5-plex detection of as little as ∼1 pg (∼300 genome copies) of pathogenic DNA with simultaneous antibiotic resistance marker detection, and the analysis of rare oncogene mutations. Our method is simpler, more cost-effective and faster than other digital analysis techniques and provides the means to implement digital analysis in any laboratory equipped with a standard fluorescent microscope. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
Recent advances in methods for the analysis of protein o-glycosylation at proteome level.
You, Xin; Qin, Hongqiang; Ye, Mingliang
2018-01-01
O-Glycosylation, which refers to the glycosylation of the hydroxyl group of side chains of Serine/Threonine/Tyrosine residues, is one of the most common post-translational modifications. Compared with N-linked glycosylation, O-glycosylation is less explored because of its complex structure and relatively low abundance. Recently, O-glycosylation has drawn more and more attention for its various functions in many sophisticated biological processes. To obtain a deep understanding of O-glycosylation, many efforts have been devoted to develop effective strategies to analyze the two most abundant types of O-glycosylation, i.e. O-N-acetylgalactosamine and O-N-acetylglucosamine glycosylation. In this review, we summarize the proteomics workflows to analyze these two types of O-glycosylation. For the large-scale analysis of mucin-type glycosylation, the glycan simplification strategies including the "SimpleCell" technology were introduced. A variety of enrichment methods including lectin affinity chromatography, hydrophilic interaction chromatography, hydrazide chemistry, and chemoenzymatic method were introduced for the proteomics analysis of O-N-acetylgalactosamine and O-N-acetylglucosamine glycosylation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Horne, R W; Wildy, P
1979-09-01
A brief historical account of the development and applications of the negative staining techniques to the study of the structure of viruses and their components as observed in the electron microscope is presented. Although the basic method of surrounding or embedding specimens in opaque dyes was used in light microscopy dating from about 1884, the equivalent preparative techniques applied to electron microscopy were comparatively recent. The combination of experiments on a sophisticated bacterial virus and the installation of a high resolution electron microscope in the Cavendish Laboratory, Cambridge, during 1954, subsequently led to the analysis of several important morphological features of animal, plant and bacterial viruses. The implications of the results from these early experiments on viruses and recent developments in negative staining methods for high resolution image analysis of electron micrographs are also discussed.
NASA Astrophysics Data System (ADS)
Ding, Zhe; Li, Li; Hu, Yujin
2018-01-01
Sophisticated engineering systems are usually assembled by subcomponents with significantly different levels of energy dissipation. Therefore, these damping systems often contain multiple damping models and lead to great difficulties in analyzing. This paper aims at developing a time integration method for structural systems with multiple damping models. The dynamical system is first represented by a generally damped model. Based on this, a new extended state-space method for the damped system is derived. A modified precise integration method with Gauss-Legendre quadrature is then proposed. The numerical stability and accuracy of the proposed integration method are discussed in detail. It is verified that the method is conditionally stable and has inherent algorithmic damping, period error and amplitude decay. Numerical examples are provided to assess the performance of the proposed method compared with other methods. It is demonstrated that the method is more accurate than other methods with rather good efficiency and the stable condition is easy to be satisfied in practice.
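The building block being modified here, stepping a linear state-space system with the matrix exponential while evaluating the forcing integral by Gauss-Legendre quadrature, can be sketched as follows; this is an illustration of that idea only, not the authors' modified precise integration method.

```python
import numpy as np
from scipy.linalg import expm

def step_expm_gauss(A, f, x0, t0, h, n_gauss=4):
    """One step of x' = A x + f(t):
    x(t0+h) = e^{Ah} x0 + int_0^h e^{A(h-s)} f(t0+s) ds,
    with the forcing integral evaluated by Gauss-Legendre quadrature."""
    nodes, weights = np.polynomial.legendre.leggauss(n_gauss)
    s = 0.5 * h * (nodes + 1.0)          # map [-1, 1] -> [0, h]
    w = 0.5 * h * weights
    x = expm(A * h) @ x0
    for si, wi in zip(s, w):
        x = x + wi * (expm(A * (h - si)) @ f(t0 + si))
    return x

# Illustrative damped oscillator with sinusoidal forcing (placeholder data)
A = np.array([[0.0, 1.0], [-4.0, -0.3]])
f = lambda t: np.array([0.0, np.sin(2.0 * t)])
x = np.array([1.0, 0.0])
for k in range(100):
    x = step_expm_gauss(A, f, x, k * 0.05, 0.05)
print(x)  # state after t = 5 with step h = 0.05
```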
Reinventing the Ames test as a quantitative lab that connects classical and molecular genetics.
Goodson-Gregg, Nathan; De Stasio, Elizabeth A
2009-01-01
While many institutions use a version of the Ames test in the undergraduate genetics laboratory, students typically are not exposed to techniques or procedures beyond qualitative analysis of phenotypic reversion, thereby seriously limiting the scope of learning. We have extended the Ames test to include both quantitative analysis of reversion frequency and molecular analysis of revertant gene sequences. By giving students a role in designing their quantitative methods and analyses, students practice and apply quantitative skills. To help students connect classical and molecular genetic concepts and techniques, we report here procedures for characterizing the molecular lesions that confer a revertant phenotype. We suggest undertaking reversion of both missense and frameshift mutants to allow a more sophisticated molecular genetic analysis. These modifications and additions broaden the educational content of the traditional Ames test teaching laboratory, while simultaneously enhancing students' skills in experimental design, quantitative analysis, and data interpretation.
Assessing semantic similarity of texts - Methods and algorithms
NASA Astrophysics Data System (ADS)
Rozeva, Anna; Zerkova, Silvia
2017-12-01
Assessing the semantic similarity of texts is an important part of different text-related applications like educational systems, information retrieval, text summarization, etc. This task is performed by sophisticated analysis, which implements text-mining techniques. Text mining involves several pre-processing steps, which provide for obtaining structured representative model of the documents in a corpus by means of extracting and selecting the features, characterizing their content. Generally the model is vector-based and enables further analysis with knowledge discovery approaches. Algorithms and measures are used for assessing texts at syntactical and semantic level. An important text-mining method and similarity measure is latent semantic analysis (LSA). It provides for reducing the dimensionality of the document vector space and better capturing the text semantics. The mathematical background of LSA for deriving the meaning of the words in a given text by exploring their co-occurrence is examined. The algorithm for obtaining the vector representation of words and their corresponding latent concepts in a reduced multidimensional space as well as similarity calculation are presented.
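A minimal LSA pipeline of the kind described, a TF-IDF term-document matrix reduced by truncated SVD and compared with cosine similarity, can be sketched with scikit-learn; the toy corpus is made up.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "the cat sat on the mat",
    "a cat rested on a rug",
    "stock markets fell sharply today",
    "shares dropped on the stock exchange",
]

# Term-document model, then projection onto 2 latent semantic dimensions
tfidf = TfidfVectorizer().fit_transform(docs)
lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

# Pairwise semantic similarity of documents in the reduced space
print(cosine_similarity(lsa).round(2))
# documents 0/1 and 2/3 should pair up with higher similarity
```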
NASA Astrophysics Data System (ADS)
Boix Mansilla, Veronica Maria
The study presented examined 16 award-winning high school students' beliefs about the criteria by which scientific theories and historical narratives are deemed trustworthy. It sought to (a) describe such beliefs as students reasoned within each discipline; (b) examine the degree to which such beliefs were organized as coherent systems of thought; and (c) explore the relationship between students' beliefs and their prior disciplinary research experience. Students were multiple-year award-winners at the Massachusetts Science Fair and the National History Day---two pre-collegiate State-level competitions. Two consecutive semi-structured interviews invited students to assess and enhance the trustworthiness of competing accounts of genetic inheritance and the Holocaust in science and history respectively. A combined qualitative and quantitative data analysis yielded the following results: (a) Students valued three standards of acceptability that were common across disciplines: i.e., empirical strength, explanatory power, and formal and presentational strength. However, when reasoning within each discipline they tended to define each standard in disciplinary-specific ways. Students also valued standards of acceptability that were not shared across disciplines: i.e., external validity in science and human understanding in history. (b) In science, three distinct epistemological orientations were identified---i.e., "faith in method," "trusting the scientific community" and "working against error." In history, students held two distinct epistemologies---i.e., "reproducing the past" and "organizing the past". Students' epistemological orientations tended to operate as collections of mutually supporting ideas about what renders a theory or a narrative acceptable. (c) Contrary to the standard position to date in the literature on epistemological beliefs, results revealed that students' research training in a particular discipline (e.g., science or history) was strongly related to the ways in which they interpreted problems, methods, and satisfactory solutions in each domain. Students trained in science favored a sophisticated "working against error" epistemology of science and a naive "reproducing the past" epistemology of history. Students trained in history revealed a sophisticated "organizing the past" epistemology in that discipline and a naive "faith in method" epistemology in science. Students trained in both domains revealed sophisticated epistemologies in both disciplines.
Henry, David; Dymnicki, Allison B.; Mohatt, Nathaniel; Allen, James; Kelly, James G.
2016-01-01
Qualitative methods potentially add depth to prevention research, but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data, but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-Means clustering, and latent class analysis produced similar levels of accuracy with binary data, and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a “real-world” example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities. PMID:25946969
Henry, David; Dymnicki, Allison B; Mohatt, Nathaniel; Allen, James; Kelly, James G
2015-10-01
Qualitative methods potentially add depth to prevention research but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed-methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed-methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-means clustering, and latent class analysis produced similar levels of accuracy with binary data and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a "real-world" example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities.
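As a rough illustration of the clustering step both versions of this abstract describe, the sketch below applies hierarchical clustering with a Jaccard distance to a small binary participant-by-code matrix. The data, distance metric, and linkage choice are assumptions made for demonstration; the study itself also compared K-means and latent class analysis.

```python
# Illustrative sketch: hierarchical clustering of binary "code profiles"
# (participants x qualitative codes), one of the approaches compared in the
# study. The profiles below are invented.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

# Rows = participants, columns = presence/absence of a qualitative code.
profiles = np.array([
    [1, 1, 0, 0, 1],
    [1, 1, 0, 0, 0],
    [0, 0, 1, 1, 0],
    [0, 0, 1, 1, 1],
    [1, 0, 0, 0, 1],
])

# Jaccard distance is a common choice for binary data; average linkage
# then builds the hierarchy.
dist = pdist(profiles, metric="jaccard")
tree = linkage(dist, method="average")

# Cut the tree into two clusters of participants with similar code profiles.
labels = fcluster(tree, t=2, criterion="maxclust")
print(labels)   # e.g. [1 1 2 2 1]
```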
LANDSAT-D investigations in snow hydrology
NASA Technical Reports Server (NTRS)
Dozier, J.
1983-01-01
The atmospheric radiative transfer calculation program (ATRAD) and its supporting programs (setting up atmospheric profiles, making Mie tables and an exponential-sum-fitting table) were completed. More sophisticated treatment of aerosol scattering (including the angular phase function or asymmetry factor) and multichannel analysis of results from ATRAD are being developed. Some progress was made on a Monte Carlo program for examining two-dimensional effects, specifically a surface boundary condition that varies across a scene. The MONTE program combines ATRAD and the Monte Carlo method to produce an atmospheric point spread function. Currently the procedure passes monochromatic tests and the results are reasonable.
Coupling mRNA processing with transcription in time and space
Bentley, David L.
2015-01-01
Maturation of mRNA precursors often occurs simultaneously with their synthesis by RNA polymerase II (Pol II). The co-transcriptional nature of mRNA processing has permitted the evolution of coupling mechanisms that coordinate transcription with mRNA capping, splicing, editing and 3′ end formation. Recent experiments using sophisticated new methods for analysis of nascent RNA have provided important insights into the relative amount of co-transcriptional and post-transcriptional processing, the relationship between mRNA elongation and processing, and the role of the Pol II carboxy-terminal domain (CTD) in regulating these processes. PMID:24514444
Efficiency Analysis of Integrated Public Hospital Networks in Outpatient Internal Medicine.
Ortíz-Barrios, Miguel Angel; Escorcia-Caballero, Juan P; Sánchez-Sánchez, Fabián; De Felice, Fabio; Petrillo, Antonella
2017-09-07
Healthcare systems are evolving towards a complex network of interconnected services owing to increasing costs and rising expectations for high service levels. The literature highlights the importance of implementing management techniques and sophisticated methods to improve the efficiency of healthcare systems, especially in emerging economies. This paper proposes an integrated collaboration model between two public hospitals to reduce the weighted average lead time in the outpatient internal medicine department. A strategic framework based on value stream mapping and collaborative practices has been developed in a real case study set in Colombia.
An expert system for municipal solid waste management simulation analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hsieh, M.C.; Chang, N.B.
1996-12-31
Optimization techniques have usually been used to model the complicated metropolitan solid waste management system and to search for the best dynamic combination of waste recycling, facility siting, and system operation, where sophisticated and well-defined interrelationships are required in the modeling process. This paper instead applies Concurrent Object-Oriented Simulation (COOS), a new simulation software construction method, to bridge the gap between the physical system and its computer representation. A case study of the Kaohsiung solid waste management system in Taiwan illustrates the analytical methodology of COOS and its implementation in the creation of an expert system.
Some case studies of geophysical exploration of archaeological sites in Yugoslavia
NASA Astrophysics Data System (ADS)
Komatina, Snezana; Timotijevic, Zoran
1999-03-01
One of the youngest branches of environmental geophysics is its application to the preservation of national heritage. Numerous digital techniques developed for exploration directed to urban planning can also be applied to investigations of historic buildings. In identifying near-surface layers containing objects of previous civilizations, various sophisticated geophysical methods are used. The paper discusses the application of geophysics in quantifying the problems that must be addressed in order to produce an archaeological map of a locality (Komatina, S., 1996. Sophisticated geophysical methods in the preservation of national heritage. Proc. of Int. Conf. Architecture and Urbanism at the Turn of the Millennium, Beograd, pp. 39-44). Finally, several examples of archaeogeophysical exploration at the Divostin, Bedem and Kalenic monastery localities (Serbia, Yugoslavia) are presented.
A technique for lowering risks during contract negotiations
NASA Technical Reports Server (NTRS)
Lehman, D. H.
1986-01-01
In this day and age of sophisticated weapon and space system procurements, negotiation of the statement of work, technical requirements, and schedule may be as protracted as the negotiation of costs and profit. A major problem facing the program manager during contract negotiations is what effect changes in schedule, technical requirements, and contract terms have on the price. In many instances, after the statement of work has been redrafted, the contractor will be obliged to reprice the work, either on the spot or after a short recess in the negotiations. In this paper, a method for organizing the negotiation process is presented to reduce the risks incurred by the seller. The method is built upon regression analysis, and two illustrative examples of its use are provided.
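The paper's actual regression model is not reproduced here; the following is a hypothetical sketch of the general idea, showing how a simple price model fitted to prior proposals could be used to estimate the price impact of a negotiated change. All variables and figures are invented for illustration.

```python
# Hypothetical sketch (not the paper's model): fit a simple linear price
# model to prior proposal data, then estimate how a scope or schedule
# change during negotiation shifts the price.
import numpy as np

# Columns: labor hours, schedule duration (months); rows: past proposals.
X = np.array([
    [1200, 10],
    [1800, 14],
    [2500, 18],
    [3100, 24],
], dtype=float)
price = np.array([0.9, 1.4, 1.9, 2.5])   # $M, invented figures

# Ordinary least squares with an intercept term.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, price, rcond=None)

def estimate(hours, months):
    return coef @ np.array([1.0, hours, months])

# On-the-spot repricing after a negotiated scope change:
print(estimate(2000, 16) - estimate(1800, 14))  # estimated price delta, $M
```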
Textbook of respiratory medicine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murray, J.F.; Nadel, J.
1987-01-01
This book presents a clinical reference of respiratory medicine. It details the basic science aspects of pulmonary physiology and describes recently developed, sophisticated diagnostic tools and therapeutic methods. It also covers anatomy, physiology, pharmacology, and pathology, as well as microbiologic, radiologic, nuclear medicine, and biopsy methods for diagnosis.
Orthopedic workforce planning in Germany – an analysis of orthopedic accessibility
Müller, Peter; Maier, Werner; Groneberg, David A.
2017-01-01
In Germany, orthopedic workforce planning relies on population-to-provider-ratios represented by the ‘official degree of care provision’. However, with geographic information systems (GIS), more sophisticated measurements are available. By utilizing GIS-based technologies we analyzed the current state of demand and supply of the orthopedic workforce in Germany (orthopedic accessibility) with the integrated Floating Catchment Area method. The analysis of n = 153,352,220 distances revealed significant geographical variations on national scale: 5,617,595 people (6.9% of total population) lived in an area with significant low orthopedic accessibility (average z-score = -4.0), whereas 31,748,161 people (39.0% of total population) lived in an area with significant high orthopedic accessibility (average z-score = 8.0). Accessibility was positively correlated with the degree of urbanization (r = 0.49; p<0.001) and the official degree of care provision (r = 0.33; p<0.001) and negatively correlated with regional social deprivation (r = -0.47; p<0.001). Despite advantages of simpler measures regarding implementation and acceptance in health policy, more sophisticated measures of accessibility have the potential to reduce costs as well as improve health care. With this study, significant geographical variations were revealed that show the need to reduce oversupply in less deprived urban areas in order to enable adequate care in more deprived rural areas. PMID:28178335
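A generic two-step floating catchment area (2SFCA) calculation, a simpler relative of the integrated FCA method used in the study, can be sketched as follows. The distances, supplies, populations and threshold below are invented for demonstration only.

```python
# Generic two-step floating catchment area (2SFCA) sketch.
import numpy as np

dist = np.array([          # travel distance (km): rows = population sites,
    [5, 20, 40],           # columns = orthopedic practices
    [15, 10, 35],
    [45, 30, 8],
], dtype=float)
supply = np.array([2.0, 1.0, 3.0])            # physicians per practice
population = np.array([10_000, 6_000, 8_000])
d0 = 25                                        # catchment threshold (km)

within = (dist <= d0).astype(float)            # who can reach whom

# Step 1: provider-to-population ratio inside each practice's catchment.
demand = within.T @ population                 # people reaching each practice
ratio = np.where(demand > 0, supply / demand, 0.0)

# Step 2: accessibility at each population site = sum of reachable ratios.
accessibility = within @ ratio
print(accessibility)                           # physicians per capita, roughly
```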
Ladstätter, Felix; Garrosa, Eva; Moreno-Jiménez, Bernardo; Ponsoda, Vicente; Reales Aviles, José Manuel; Dai, Junming
2016-01-01
Artificial neural networks are sophisticated modelling and prediction tools capable of extracting complex, non-linear relationships between predictor (input) and predicted (output) variables. This study explores this capacity by modelling non-linearities in the hardiness-modulated burnout process with a neural network. Specifically, two multi-layer feed-forward artificial neural networks are concatenated in an attempt to model the composite non-linear burnout process. Sensitivity analysis, a Monte Carlo-based global simulation technique, is then utilised to examine the first-order effects of the predictor variables on the burnout sub-dimensions and consequences. Results show that (1) this concatenated artificial neural network approach is feasible for modelling the burnout process, (2) sensitivity analysis is a fruitful method for studying the relative importance of predictor variables and (3) the relationships among the variables involved in the development of burnout and its consequences are non-linear to different degrees. Many relationships among variables (e.g., stressors and strains) are not linear, yet researchers use linear methods such as Pearson correlation or linear regression to analyse these relationships. Artificial neural network analysis is an innovative method for analysing non-linear relationships and, in combination with sensitivity analysis, is superior to linear methods.
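The combination of a feed-forward network with a Monte Carlo style sensitivity scan can be sketched as below. This mirrors only the general idea of the study (non-linear model plus global sensitivity analysis); the network size, the random weights standing in for a trained model, and the input ranges are assumptions for illustration.

```python
# Illustrative sketch: a tiny feed-forward network plus a crude Monte Carlo
# sensitivity scan of its inputs (variance of conditional means).
import numpy as np

rng = np.random.default_rng(0)

# A small 3-input, 1-output network with one hidden layer; random weights
# stand in for a trained model.
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
W2, b2 = rng.normal(size=(1, 4)), rng.normal(size=1)

def net(x):
    h = np.tanh(W1 @ x + b1)
    return float(W2 @ h + b2)

# First-order sensitivity: sweep each input across its range while the other
# inputs are sampled at random, and record the variance of the mean output.
def sensitivity(n_samples=2000, n_grid=20):
    scores = []
    for i in range(3):
        grid = np.linspace(-1, 1, n_grid)
        means = []
        for g in grid:
            xs = rng.uniform(-1, 1, size=(n_samples // n_grid, 3))
            xs[:, i] = g
            means.append(np.mean([net(x) for x in xs]))
        scores.append(np.var(means))      # how much input i moves the mean
    return np.array(scores) / np.sum(scores)

print(sensitivity())   # relative importance of the three inputs
```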
Caprihan, A; Pearlson, G D; Calhoun, V D
2008-08-15
Principal component analysis (PCA) is often used to reduce the dimension of data before applying more sophisticated data analysis methods such as non-linear classification algorithms or independent component analysis. This practice is based on selecting components corresponding to the largest eigenvalues. If the ultimate goal is separation of the data into two groups, then this set of components need not have the most discriminatory power. We measured the distance between two such populations using the Mahalanobis distance and chose the eigenvectors to maximize it, a modified PCA method which we call discriminant PCA (DPCA). DPCA was applied to diffusion tensor-based fractional anisotropy images to distinguish age-matched schizophrenia subjects from healthy controls. The performance of the proposed method was evaluated by the leave-one-out method. We show that for this fractional anisotropy data set, the classification error with 60 components was close to the minimum error and that the Mahalanobis distance was twice as large with DPCA as with PCA. Finally, by masking the discriminant function with the white matter tracts of the Johns Hopkins University atlas, we identified the left superior longitudinal fasciculus as the tract that gave the smallest classification error. In addition, with six optimally chosen tracts the classification error was zero.
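The core idea, ranking principal components by how well they separate two groups rather than by their eigenvalues, can be sketched as follows. This is a rough illustration on toy data, not the authors' implementation, and the separation score used here is a simple Mahalanobis-style statistic along each component.

```python
# Rough sketch of the "discriminant PCA" idea on toy data.
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(0.0, 1.0, size=(40, 10))          # group 1 (e.g. controls)
B = rng.normal(0.3, 1.0, size=(40, 10))          # group 2 (e.g. patients)
X = np.vstack([A, B])

# Standard PCA on the pooled, mean-centred data.
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigval, eigvec = np.linalg.eigh(cov)             # ascending eigenvalue order

# Project each group on each component and score the separation:
# squared mean difference divided by pooled variance along that component.
scores = []
for k in range(eigvec.shape[1]):
    a = A @ eigvec[:, k]
    b = B @ eigvec[:, k]
    pooled_var = 0.5 * (a.var(ddof=1) + b.var(ddof=1))
    scores.append((a.mean() - b.mean()) ** 2 / pooled_var)

# PCA would keep the largest-eigenvalue components; DPCA keeps the most
# discriminative ones instead.
by_eigenvalue = np.argsort(eigval)[::-1][:3]
by_separation = np.argsort(scores)[::-1][:3]
print(by_eigenvalue, by_separation)
```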
Structural neuroimaging in neuropsychology: History and contemporary applications.
Bigler, Erin D
2017-11-01
Neuropsychology's origins began long before there were any in vivo methods to image the brain. That changed with the advent of computed tomography in the 1970s and magnetic resonance imaging in the early 1980s. Now computed tomography and magnetic resonance imaging are routinely a part of neuropsychological investigations, with an increasing number of sophisticated methods for image analysis. This review examines the history of neuroimaging utilization in neuropsychological investigations, highlighting the basic methods that go into image quantification and the various metrics that can be derived. Neuroimaging methods, and their limitations, for identifying what constitutes a lesion are discussed. Likewise, demographic and developmental factors that influence quantification of brain structure are reviewed. Neuroimaging is an integral part of 21st-century neuropsychology, and its importance to advancing neuropsychology is emphasized.
Combined qualitative and quantitative research designs.
Seymour, Jane
2012-12-01
Mixed methods research designs have been recognized as important in addressing complexity and are recommended particularly in the development and evaluation of complex interventions. This article reports a review of studies in palliative care published between 2010 and March 2012 that combine qualitative and quantitative approaches. A synthesis of approaches to mixed methods research taken in 28 examples of published research studies of relevance to palliative and supportive care is provided, using a typology based on a classic categorization put forward in 1992. Mixed-method studies are becoming more frequently employed in palliative care research and resonate with the complexity of the palliative care endeavour. Undertaking mixed methods research requires a sophisticated understanding of the research process and recognition of some of the underlying complexities encountered when working with different traditions and perspectives on issues of: sampling, validity, reliability and rigour, different sources of data and different data collection and analysis techniques.
A Computer Analysis of Library Postcards. (CALP)
ERIC Educational Resources Information Center
Stevens, Norman D.
1974-01-01
A description of a sophisticated application of computer techniques to the analysis of a collection of picture postcards of library buildings in an attempt to establish the minimum architectural requirements needed to distinguish one style of library building from another. (Author)
Digital Movement Analysis in Physical Education
ERIC Educational Resources Information Center
Trout, Josh
2013-01-01
Mobile devices such as smartphones and tablets offer applications (apps) that make digital movement analysis simple and efficient in physical education. Highly sophisticated movement analysis software has been available for many years but has mainly appealed to coaches of elite athletes and biomechanists. Apps on mobile devices are less expensive…
Research for Environmental Stewardship and Conservation at the APTRU
USDA-ARS?s Scientific Manuscript database
Research methods for mitigation of off-target spray drift, remote sensing for precision crop management, and irrigation and tillage methods are presented. Research for mitigation of off target spray drift includes development of sophisticated weather apparatus to determine weather conditions unfavor...
Two implementations of the Expert System for the Flight Analysis System (ESFAS) project
NASA Technical Reports Server (NTRS)
Wang, Lui
1988-01-01
A comparison is made between the two most sophisticated expert system building tools, the Automated Reasoning Tool (ART) and the Knowledge Engineering Environment (KEE). The same problem domain (ESFAS) was used in making the comparison. The Expert System for the Flight Analysis System (ESFAS) acts as an intelligent front end for the Flight Analysis System (FAS). FAS is a complex, configuration-controlled set of interrelated processors (FORTRAN routines) which will be used by the Mission Planning and Analysis Div. (MPAD) to design and analyze Shuttle and potential Space Station missions. Implementations of ESFAS are described. The two versions represent very different programming paradigms: ART uses rules and KEE uses objects. Owing to the tools' philosophical differences, KEE is implemented using a depth-first traversal algorithm, whereas ART uses a user-directed traversal method. Either tool could be used to solve this particular problem.
D'Onza, Giuseppe; Greco, Giulio; Allegrini, Marco
2016-02-01
Recycling implies additional costs for separated municipal solid waste (MSW) collection. The aim of the present study is to propose and implement a management tool - the full cost accounting (FCA) method - to calculate the full collection costs of different types of waste. Our analysis aims for a better understanding of the difficulties of putting FCA into practice in the MSW sector. We propose an FCA methodology that uses standard costs and actual quantities to calculate the collection costs of separate and undifferentiated waste. The methodology allows cost-efficiency analysis and benchmarking, overcoming problems related to firm-specific accounting choices, earnings management policies and purchase policies. It supports variance analysis that can be used to identify the causes of off-standard performance and guide managers to deploy resources more efficiently, and it can be implemented by companies lacking a sophisticated management accounting system.
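The "standard cost times actual quantity" logic at the heart of the proposed methodology can be sketched in a few lines. The waste streams, unit costs and spending figures below are invented; a real implementation would draw them from the municipality's accounting records.

```python
# Hedged sketch of the core idea: collection costs from standard unit costs
# and actual collected quantities, with a simple variance check against
# actual spending. All figures are invented.
standard_cost = {          # euro per tonne collected, by waste stream
    "paper": 95.0,
    "glass": 60.0,
    "organic": 110.0,
    "residual": 80.0,
}
actual_tonnes = {"paper": 420, "glass": 310, "organic": 560, "residual": 900}
actual_spend = {"paper": 43_000, "glass": 17_500, "organic": 60_000, "residual": 74_000}

for stream in standard_cost:
    budget = standard_cost[stream] * actual_tonnes[stream]   # standard x actual
    variance = actual_spend[stream] - budget                 # off-standard performance
    print(f"{stream:9s} standard cost {budget:9.0f}  variance {variance:+8.0f}")
```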
Neo-Sophistic Rhetorical Theory: Sophistic Precedents for Contemporary Epistemic Rhetoric.
ERIC Educational Resources Information Center
McComiskey, Bruce
Interest in the sophists has recently intensified among rhetorical theorists, culminating in the notion that rhetoric is epistemic. Epistemic rhetoric has its first and deepest roots in sophistic epistemological and rhetorical traditions, so that the view of rhetoric as epistemic is now being dubbed "neo-sophistic." In epistemic…
NASA Technical Reports Server (NTRS)
Myint, Soe W.; Mesev, Victor; Quattrochi, Dale; Wentz, Elizabeth A.
2013-01-01
Remote sensing methods used to generate base maps to analyze the urban environment rely predominantly on digital sensor data from space-borne platforms. This is due in part to new sources of high spatial resolution data covering the globe, a variety of multispectral and multitemporal sources, sophisticated statistical and geospatial methods, and compatibility with GIS data sources and methods. The goal of this chapter is to review the four groups of classification methods for digital sensor data from space-borne platforms: per-pixel, sub-pixel, object-based (spatial-based), and geospatial methods. Per-pixel methods are widely used methods that classify pixels into distinct categories based solely on the spectral and ancillary information within that pixel. They are used for everything from simple calculations of environmental indices (e.g., NDVI) to sophisticated expert systems that assign urban land covers. Researchers recognize, however, that even with the smallest pixel size the spectral information within a pixel is really a combination of multiple urban surfaces. Sub-pixel classification methods therefore aim to statistically quantify the mixture of surfaces to improve overall classification accuracy. While within-pixel variations exist, there is also significant evidence that groups of nearby pixels have similar spectral information and therefore belong to the same classification category. Object-oriented methods have emerged that group pixels prior to classification based on spectral similarity and spatial proximity. Classification accuracy using object-based methods shows significant success and promise for numerous urban applications. Like the object-oriented methods that recognize the importance of spatial proximity, geospatial methods for urban mapping also utilize neighboring pixels in the classification process. The primary difference, though, is that geostatistical methods (e.g., spatial autocorrelation methods) are utilized during both the pre- and post-classification steps. Within this chapter, each of the four approaches is described in terms of scale and accuracy in classifying urban land use and urban land cover, and for its range of urban applications. We present an overview of the four main classification groups in Figure 1, while Table 1 details the approaches with respect to classification requirements and procedures (e.g., reflectance conversion, steps before training sample selection, training samples, spatial approaches commonly used, classifiers, primary inputs for classification, output structures, number of output layers, and accuracy assessment). The chapter concludes with a brief summary of the methods reviewed and the challenges that remain in developing new classification methods for improving the efficiency and accuracy of mapping urban areas.
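A minimal example of the per-pixel approach mentioned above is computing NDVI from red and near-infrared reflectance and thresholding it into crude land-cover classes; each pixel is labelled independently of its neighbours. The band arrays and thresholds are illustrative only.

```python
# Simple per-pixel example: NDVI from red and near-infrared bands, then a
# crude threshold classification. Values and thresholds are illustrative.
import numpy as np

red = np.array([[0.12, 0.30], [0.08, 0.25]])   # surface reflectance
nir = np.array([[0.45, 0.32], [0.50, 0.26]])

ndvi = (nir - red) / (nir + red + 1e-9)

# Per-pixel rule: each pixel is labelled independently of its neighbours.
classes = np.full(ndvi.shape, "built-up", dtype=object)
classes[ndvi > 0.2] = "sparse vegetation"
classes[ndvi > 0.5] = "dense vegetation"
print(ndvi.round(2))
print(classes)
```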
Analysis of Time Filters in Multistep Methods
NASA Astrophysics Data System (ADS)
Hurl, Nicholas
Geophysical flow simulations have evolved sophisticated implicit-explicit time stepping methods (based on fast-slow wave splittings) followed by time filters to control any unstable modes that result. Time filters are modular and parallel. Their effect on stability of the overall process has been tested in numerous simulations, but never analyzed. Stability is proven herein for the Crank-Nicolson Leapfrog (CNLF) method with the Robert-Asselin (RA) time filter and for the Crank-Nicolson Leapfrog method with the Robert-Asselin-Williams (RAW) time filter for systems by energy methods. We derive an equivalent multistep method for CNLF+RA and CNLF+RAW, and stability regions are obtained. The time step restriction for energy stability of CNLF+RA is smaller than that of CNLF, and the CNLF+RAW time step restriction is even smaller. Numerical tests find that RA and RAW add numerical dissipation. This thesis also shows that all modes of the Crank-Nicolson Leapfrog (CNLF) method are asymptotically stable under the standard timestep condition.
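A minimal sketch of the Robert-Asselin filter applied after a plain leapfrog step on the oscillation test equation y' = i*omega*y is given below. This is a simplified illustration of the filter's role, not the full CNLF+RA scheme analysed in the thesis; the frequency, time step and filter parameter are chosen arbitrarily.

```python
# Plain leapfrog + Robert-Asselin time filter on y' = i*omega*y.
import numpy as np

omega, dt, nu = 1.0, 0.1, 0.05     # frequency, time step, filter parameter
n_steps = 200

y_prev = 1.0 + 0.0j                 # y at t = 0
y_curr = np.exp(1j * omega * dt)    # exact value at t = dt to start leapfrog

for _ in range(n_steps):
    # Leapfrog step: y^{n+1} = y^{n-1} + 2*dt*f(y^n).
    y_next = y_prev + 2.0 * dt * (1j * omega * y_curr)
    # Robert-Asselin filter: nudge y^n toward the average of its time
    # neighbours, damping the spurious computational mode.
    y_filt = y_curr + nu * (y_next - 2.0 * y_curr + y_prev)
    y_prev, y_curr = y_filt, y_next

print(abs(y_curr))   # slightly below 1: the filter adds numerical dissipation
```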
Stacul, Stefano; Squeglia, Nunziante
2018-02-15
A Boundary Element Method (BEM) approach was developed for the analysis of pile groups. The proposed method includes: the non-linear behavior of the soil by a hyperbolic modulus reduction curve; the non-linear response of reinforced concrete pile sections, also taking into account the influence of tension stiffening; the influence of suction by increasing the stiffness of shallow portions of soil and modeled using the Modified Kovacs model; pile group shadowing effect, modeled using an approach similar to that proposed in the Strain Wedge Model for pile groups analyses. The proposed BEM method saves computational effort compared to more sophisticated codes such as VERSAT-P3D, PLAXIS 3D and FLAC-3D, and provides reliable results using input data from a standard site investigation. The reliability of this method was verified by comparing results from data from full scale and centrifuge tests on single piles and pile groups. A comparison is presented between measured and computed data on a laterally loaded fixed-head pile group composed by reinforced concrete bored piles. The results of the proposed method are shown to be in good agreement with those obtained in situ.
2018-01-01
A Boundary Element Method (BEM) approach was developed for the analysis of pile groups. The proposed method includes: the non-linear behavior of the soil by a hyperbolic modulus reduction curve; the non-linear response of reinforced concrete pile sections, also taking into account the influence of tension stiffening; the influence of suction by increasing the stiffness of shallow portions of soil and modeled using the Modified Kovacs model; pile group shadowing effect, modeled using an approach similar to that proposed in the Strain Wedge Model for pile groups analyses. The proposed BEM method saves computational effort compared to more sophisticated codes such as VERSAT-P3D, PLAXIS 3D and FLAC-3D, and provides reliable results using input data from a standard site investigation. The reliability of this method was verified by comparing results from data from full scale and centrifuge tests on single piles and pile groups. A comparison is presented between measured and computed data on a laterally loaded fixed-head pile group composed by reinforced concrete bored piles. The results of the proposed method are shown to be in good agreement with those obtained in situ. PMID:29462857
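One ingredient both versions of this abstract mention, the hyperbolic modulus reduction curve for the soil, can be sketched as a secant modulus that degrades with mobilised shear strain. The functional form shown here is the classic hyperbolic rule; the parameter values are invented, and the exact curve used in the paper may differ.

```python
# Illustrative hyperbolic modulus reduction: G/G0 = 1 / (1 + gamma/gamma_ref).
import numpy as np

G0 = 50.0e6          # small-strain shear modulus (Pa), assumed
gamma_ref = 1.0e-3   # reference shear strain, assumed

def secant_modulus(gamma):
    """Secant shear modulus degrading hyperbolically with strain."""
    return G0 / (1.0 + np.abs(gamma) / gamma_ref)

for gamma in (1e-5, 1e-4, 1e-3, 1e-2):
    print(f"strain {gamma:.0e}  G/G0 = {secant_modulus(gamma) / G0:.2f}")
```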
An Investigation of Secondary Students' Mental Models of Climate Change and the Greenhouse Effect
NASA Astrophysics Data System (ADS)
Varela, Begoña; Sesto, Vanessa; García-Rodeja, Isabel
2018-03-01
There are several studies dealing with students' conceptions on climate change, but most of them refer to understanding before instruction. In contrast, this study investigates students' conceptions and describes the levels of sophistication of their mental models on climate change and the greenhouse effect. The participants were 40 secondary students (grade 7) in Spain. As a method of data collection, a questionnaire was designed with open-ended questions focusing on the mechanism, causes, and actions that could be useful in reducing climate change. Students completed the same questionnaire before and after instruction. The students' conceptions and mental models were identified by an inductive and iterative analysis of the participants' explanations. With regard to the students' conceptions, the results show that they usually link climate change to an increase in temperature, and they tend to mention, even after instruction, generic actions to mitigate climate change, such as not polluting. With regard to the students' mental models, the results show an evolution of models with little consistency and coherence, such as the models on level 1, towards higher levels of sophistication. The paper concludes with educational implications proposed for solving learning difficulties regarding the greenhouse effect and climate change.
New analysis methods to push the boundaries of diagnostic techniques in the environmental sciences
NASA Astrophysics Data System (ADS)
Lungaroni, M.; Murari, A.; Peluso, E.; Gelfusa, M.; Malizia, A.; Vega, J.; Talebzadeh, S.; Gaudio, P.
2016-04-01
In recent years, new and more sophisticated measurements have been at the basis of major progress in various disciplines related to the environment, such as remote sensing and thermonuclear fusion. To maximize the effectiveness of the measurements, new data analysis techniques are required. Initial data processing tasks, such as filtering and fitting, are of primary importance, since they can have a strong influence on the rest of the analysis. Although Support Vector Regression was devised and refined at the end of the 1990s, a systematic comparison with more traditional non-parametric regression methods has never been reported. In this paper, a series of systematic tests is described, which indicates that SVR is a very competitive method of non-parametric regression that can usefully complement and often outperform more consolidated approaches. The performance of Support Vector Regression as a method of filtering is investigated first, comparing it with the most popular alternative techniques. Support Vector Regression is then applied to the problem of non-parametric regression in analysing Lidar surveys for the environmental measurement of particulate matter due to wildfires. The proposed approach has given very positive results and provides new perspectives on the interpretation of the data.
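The use of Support Vector Regression as a non-parametric smoother, the first task studied in the paper, can be sketched on a synthetic signal as below. The kernel and hyperparameters are illustrative choices, not the values used by the authors.

```python
# Minimal SVR filtering sketch on a noisy 1-D signal.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200)[:, None]
y = np.sin(x).ravel() + rng.normal(scale=0.2, size=x.shape[0])   # noisy signal

# RBF-kernel SVR: epsilon sets the insensitive tube, C the regularisation.
model = SVR(kernel="rbf", C=10.0, epsilon=0.1, gamma=0.5)
model.fit(x, y)
y_filtered = model.predict(x)

print(np.mean((y_filtered - np.sin(x).ravel()) ** 2))   # residual vs. truth
```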
In My End Is My Beginning: eLearning at the Crossroads
ERIC Educational Resources Information Center
Blackburn, Greg
2016-01-01
The increasing popularity of eLearning does not reflect a specific educational method of instruction or method of delivery. The design can have different meanings depending on the sophistication of the educational method employed, the resources made available, and the educator's skills. Unfortunately the application of technology in education…
Using Excel's Solver Function to Facilitate Reciprocal Service Department Cost Allocations
ERIC Educational Resources Information Center
Leese, Wallace R.
2013-01-01
The reciprocal method of service department cost allocation requires linear equations to be solved simultaneously. These computations are often so complex as to cause the abandonment of the reciprocal method in favor of the less sophisticated and theoretically incorrect direct or step-down methods. This article illustrates how Excel's Solver…
Using Excel's Matrix Operations to Facilitate Reciprocal Cost Allocations
ERIC Educational Resources Information Center
Leese, Wallace R.; Kizirian, Tim
2009-01-01
The reciprocal method of service department cost allocation requires linear equations to be solved simultaneously. These computations are often so complex as to cause the abandonment of the reciprocal method in favor of the less sophisticated direct or step-down methods. Here is a short example demonstrating how Excel's sometimes unknown matrix…
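The simultaneous equations that both of these articles describe can also be written as one small linear system and solved with matrix algebra (Excel's Solver or MINVERSE in the articles; NumPy in this hedged sketch). The departments, direct costs and usage fractions below are invented for illustration.

```python
# Reciprocal-method sketch: full service-department costs solve (I - A) T = d.
import numpy as np

# Two service departments S1, S2 with direct costs:
direct = np.array([100_000.0, 60_000.0])

# Fraction of each service department's output consumed by the *other*
# service department (rows = receiver, columns = provider). Invented data.
A = np.array([
    [0.00, 0.20],   # S1 uses 20% of S2's services
    [0.10, 0.00],   # S2 uses 10% of S1's services
])

# Full (reciprocal) costs T satisfy T = direct + A @ T  =>  (I - A) T = direct.
T = np.linalg.solve(np.eye(2) - A, direct)
print(T)   # reciprocal costs to allocate on to the producing departments
```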
Elliptic Length Scales in Laminar, Two-Dimensional Supersonic Flows
2015-06-01
sophisticated computational fluid dynamics (CFD) methods. Additionally, for 3D interactions, the length scales would require determination in the spanwise as well... Manna, M., "Experimental, Analytical, and Computational Methods Applied to Hypersonic Compression Ramp Flows," AIAA Journal, Vol. 32, No. 2, Feb. 1994.
PFAAT version 2.0: a tool for editing, annotating, and analyzing multiple sequence alignments.
Caffrey, Daniel R; Dana, Paul H; Mathur, Vidhya; Ocano, Marco; Hong, Eun-Jong; Wang, Yaoyu E; Somaroo, Shyamal; Caffrey, Brian E; Potluri, Shobha; Huang, Enoch S
2007-10-11
By virtue of their shared ancestry, homologous sequences are similar in their structure and function. Consequently, multiple sequence alignments are routinely used to identify trends that relate to function. This type of analysis is particularly productive when it is combined with structural and phylogenetic analysis. Here we describe the release of PFAAT version 2.0, a tool for editing, analyzing, and annotating multiple sequence alignments. Support for multiple annotations is a key component of this release, as it provides a framework for most of the new functionalities. The sequence annotations are accessible from the alignment and tree, where they are typically used to label sequences or hyperlink them to related databases. Sequence annotations can be created manually or extracted automatically from UniProt entries. Once a multiple sequence alignment is populated with sequence annotations, sequences can be easily selected and sorted through a sophisticated search dialog. The selected sequences can be further analyzed using statistical methods that explicitly model relationships between the sequence annotations and residue properties. Residue annotations are accessible from the alignment viewer and are typically used to designate binding sites or properties for a particular residue. Residue annotations are also searchable, and allow one to quickly select alignment columns for further sequence analysis, e.g., computing percent identities. Other features include: novel algorithms to compute sequence conservation, mapping conservation scores to a 3D structure in Jmol, displaying secondary structure elements, and sorting sequences by residue composition. PFAAT provides a framework whereby end-users can specify knowledge for a protein family in the form of annotation. The annotations can be combined with sophisticated analysis to test hypotheses that relate to sequence, structure and function.
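One of the analyses listed above, a per-column conservation score for a multiple sequence alignment, can be sketched as normalised Shannon entropy. The toy alignment and the particular scoring choice are assumptions for illustration; PFAAT's own conservation algorithms are not reproduced here.

```python
# Per-column conservation score for a toy multiple sequence alignment.
import math
from collections import Counter

alignment = [
    "MKV-LSA",
    "MKV-LTA",
    "MRVALSA",
    "MKVALSG",
]

def column_conservation(column):
    """1.0 = perfectly conserved column, 0.0 = maximally variable."""
    residues = [c for c in column if c != "-"]          # ignore gaps
    counts = Counter(residues)
    n = len(residues)
    entropy = -sum((k / n) * math.log2(k / n) for k in counts.values())
    max_entropy = math.log2(20)                          # 20 amino acids
    return 1.0 - entropy / max_entropy

for i, col in enumerate(zip(*alignment)):
    print(i, "".join(col), round(column_conservation(col), 2))
```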
Nawaz, Tabassam; Mehmood, Zahid; Rashid, Muhammad; Habib, Hafiz Adnan
2018-01-01
Recent research on speech segregation and music fingerprinting has led to improvements in speech segregation and music identification algorithms. Speech and music segregation generally involves the identification of music followed by speech segregation. However, music segregation becomes a challenging task in the presence of noise. This paper proposes a novel method of speech segregation for unlabelled stationary noisy audio signals using the deep belief network (DBN) model. The proposed method successfully segregates a music signal from noisy audio streams. A recurrent neural network (RNN)-based hidden layer segregation model is applied to remove stationary noise. Dictionary-based Fisher algorithms are employed for speech classification. The proposed method is tested on three datasets (TIMIT, MIR-1K, and MusicBrainz), and the results indicate the robustness of the proposed method for speech segregation. The qualitative and quantitative analyses carried out on the three datasets demonstrate the efficiency of the proposed method compared to state-of-the-art speech segregation and classification-based methods. PMID:29558485
3D Texture Features Mining for MRI Brain Tumor Identification
NASA Astrophysics Data System (ADS)
Rahim, Mohd Shafry Mohd; Saba, Tanzila; Nayer, Fatima; Syed, Afraz Zahra
2014-03-01
Medical image segmentation is a process to extract regions of interest and to divide an image into its individual meaningful, homogeneous components. These components have a strong relationship with the objects of interest in the image. Medical image segmentation is a mandatory initial step for computer-aided diagnosis and therapy. It is a sophisticated and challenging task because of the complex nature of medical images, and successful medical image analysis depends heavily on segmentation accuracy. Texture is one of the major features used to identify regions of interest in an image or to classify an object, and 2D texture features yield poor classification results. Hence, this paper presents 3D feature extraction using texture analysis, with an SVM as the segmentation technique in the testing methodology.
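The kind of texture feature the paper builds on can be illustrated with a grey-level co-occurrence matrix (GLCM) and two statistics derived from it. The patch, grey-level count and single horizontal offset below are illustrative; a 3D pipeline of the sort the paper describes would repeat this over volumetric offsets.

```python
# Toy GLCM for one offset, plus contrast and energy statistics.
import numpy as np

patch = np.array([
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 2, 2, 2],
    [2, 2, 3, 3],
])
levels = 4
glcm = np.zeros((levels, levels))

# Count horizontally adjacent grey-level pairs (offset = one pixel right).
for i in range(patch.shape[0]):
    for j in range(patch.shape[1] - 1):
        glcm[patch[i, j], patch[i, j + 1]] += 1
glcm /= glcm.sum()                        # normalise to joint probabilities

ii, jj = np.indices(glcm.shape)
contrast = np.sum(glcm * (ii - jj) ** 2)  # high for rapidly varying texture
energy = np.sum(glcm ** 2)                # high for uniform texture
print(round(contrast, 3), round(energy, 3))
```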
ToF-SIMS PCA analysis of Myrtus communis L.
NASA Astrophysics Data System (ADS)
Piras, F. M.; Dettori, M. F.; Magnani, A.
2009-06-01
Nowadays there is a growing interest among researchers in applying sophisticated analytical techniques, in conjunction with statistical data analysis methods, to the characterization of natural products in order to assure their authenticity and quality, and in the possibility of direct analysis of food to obtain maximum information. In this work, time-of-flight secondary ion mass spectrometry (ToF-SIMS) in conjunction with principal components analysis (PCA) is applied to study the chemical composition and variability of Sardinian myrtle (Myrtus communis L.) through the analysis of both berry alcoholic extracts and the berry epicarp. ToF-SIMS spectra of the berry epicarp show that the epicuticular waxes consist mainly of carboxylic acids with chain lengths ranging from C20 to C30, or identical species formed from fragmentation of long-chain esters. PCA of ToF-SIMS data from the myrtle berry epicarp distinguishes two groups characterized by a different surface concentration of triacontanoic acid. Variability in anthocyanin, flavonol, α-tocopherol, and myrtucommulone contents is shown by ToF-SIMS PCA analysis of the myrtle berry alcoholic extracts.
NASA Technical Reports Server (NTRS)
Wolf, M.
1982-01-01
It was found that the Solarex metallization design and process selection should be modified to yield substantially higher output of the 10 cm x 10 cm cells, while the Westinghouse design is extremely close to the optimum. In addition, further attention to the Solarex pn junction and base high/low junction formation processes could be beneficial. For the future efficiency improvement, it was found that refinement of the various minority carrier lifetime measurement methods is needed, as well as considerably increased sophistication in the interpretation of the results of these methods. In addition, it was determined that further experimental investigation of the Auger lifetime is needed, to conclusively determine the Auger coefficients for the direct Auger recombination at high majority carrier concentrations.
Internet of Things technology-based management methods for environmental specimen banks.
Peng, Lihong; Wang, Qian; Yu, Ang
2015-02-01
The establishment and management of environmental specimen banks (ESBs) has long been a problem worldwide. The complexity of the specimen environment has made ESB management likewise complex. Through an analysis of the development and management of ESBs worldwide, and in light of sophisticated Internet of Things (IOT) technology, this paper presents IOT technology-based ESB management methods. An IOT technology-based ESB management system can significantly facilitate ESB ingress and egress management as well as long-term storage management under quality control. This paper elaborates on the design of IOT technology-based modules, which can be used in ESB management to achieve standardized, smart, information-based ESB management. ESB management has far-reaching implications for environmental management and for research in environmental science.
Denture-induced fibrous inflammatory hyperplasia (epulis fissuratum): research aspects.
Thomas, G A
1993-01-01
Denture-induced fibrous inflammatory hyperplasia (FIH) is a common lesion of the oral mucosa which can be treated by surgical excision, conservative methods, or both combined. Clinical aspects are briefly reviewed and a newer conservative approach to treatment is suggested. This is based on the observation that light pressure using soft lining materials may facilitate shrinkage of the fibrous mass. The histopathogenesis is discussed from the viewpoint of the modern technologies of immunocytochemistry and digital image analysis. The recent development of a microwave instrument with sophisticated control of power and temperature is discussed and its use in the field of histotechnology outlined.
Tailored vectorial light fields: flower, spider web and hybrid structures
NASA Astrophysics Data System (ADS)
Otte, Eileen; Alpmann, Christina; Denz, Cornelia
2017-04-01
We present the realization and analysis of tailored vector fields including polarization singularities. The fields are generated by a holographic method based on an advanced system including a spatial light modulator. We demonstrate our system's capabilities by realizing specifically customized vector fields that include stationary points of defined polarization in the transverse plane. Subsequently, vectorial flowers and spider webs as well as unique hybrid structures of these are introduced, and embedded singular points are characterized. These sophisticated light fields reveal attractive properties that pave the way to advanced applications in, e.g., optical micromanipulation. Beyond particle manipulation, they contribute essentially to current questions in singular optics.
Frontier production function estimates for steam electric generation: a comparative analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kopp, R.J.; Smith, V.K.
1980-04-01
The performance of three frontier steam electric generation estimators is compared in terms of the consideration given to new production technologies and their technical efficiency. The Cobb-Douglas, constant elasticity of substitution, and translog production functions are examined, using the Aigner-Chu linear programming, the sophisticated Aigner-Lovell-Schmidt stochastic frontier, and the direct method of adjusted ordinary least squares frontier estimators. The use of the Cobb-Douglas specification is judged to have narrowed the perceived difference between competing estimators. The choice of frontier estimator is concluded to have a greater effect on measured plant efficiency than the functional form. 19 references. (DCK)
An approach to quality and performance control in a computer-assisted clinical chemistry laboratory.
Undrill, P E; Frazer, S C
1979-01-01
A locally developed, computer-based clinical chemistry laboratory system has been in operation since 1970. This utilises a Digital Equipment Co Ltd PDP 12 and an interconnected PDP 8/F computer. Details are presented of the performance and quality control techniques incorporated into the system. Laboratory performance is assessed through analysis of results from fixed-level control sera as well as from cumulative sum methods. At a simple level the presentation may be considered purely indicative, while at a more sophisticated level statistical concepts have been introduced to aid the laboratory controller in decision-making processes. PMID:438340
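The cumulative sum (CUSUM) technique mentioned above can be sketched with a tabular CUSUM that flags a persistent shift in control-serum results. The target value, allowance, decision limit and daily results below are invented; they are common textbook choices, not the laboratory's actual settings.

```python
# Tabular CUSUM check on daily control-serum results (invented data).
target, sd = 100.0, 2.0          # assigned control value and typical SD
k, h = 0.5 * sd, 4.0 * sd         # allowance and decision limit (common choices)

results = [100.3, 99.5, 101.2, 102.8, 103.1, 102.5, 103.9]   # daily controls

cusum_hi = cusum_lo = 0.0
for day, x in enumerate(results, start=1):
    cusum_hi = max(0.0, cusum_hi + (x - target) - k)   # accumulates upward drift
    cusum_lo = max(0.0, cusum_lo - (x - target) - k)   # accumulates downward drift
    flag = "OUT OF CONTROL" if max(cusum_hi, cusum_lo) > h else ""
    print(f"day {day}: hi={cusum_hi:5.2f} lo={cusum_lo:5.2f} {flag}")
```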
Biological therapies in moderate and severe psoriasis: perspectives and certainties
Constantin, MM; Poenaru, E; Constantin, T; Poenaru, C; Purcarea, VL; Mateescu, BR
2014-01-01
An inflammatory, proliferative condition with chronic evolution and systemic response, psoriasis, is positioned today among the most common inflammatory skin diseases affecting the Caucasian population worldwide. With a significant incidence, psoriasis has been increasingly defined as a disease with a major impact on the patient's life and the society to which he/she belongs. This paper conducts an analysis of the currently available therapies for the treatment of moderate and severe psoriasis, therapies with biological agents obtained through sophisticated genetic engineering technologies. Recent research and the increasing interest in therapeutic methods as complete and efficient as possible make us optimistic and confident in the future. PMID:25870666
Western, Max J.; Peacock, Oliver J.; Stathi, Afroditi; Thompson, Dylan
2015-01-01
Background Innovative physical activity monitoring technology can be used to depict rich visual feedback that encompasses the various aspects of physical activity known to be important for health. However, it is unknown whether patients who are at risk of chronic disease would understand such sophisticated personalised feedback or whether they would find it useful and motivating. The purpose of the present study was to determine whether technology-enabled multidimensional physical activity graphics and visualisations are comprehensible and usable for patients at risk of chronic disease. Method We developed several iterations of graphics depicting minute-by-minute activity patterns and integrated physical activity health targets. Subsequently, patients at moderate/high risk of chronic disease (n=29) and healthcare practitioners (n=15) from South West England underwent full 7-day activity monitoring followed by individual semi-structured interviews in which they were asked to comment on their own personalised visual feedback. Framework analysis was used to gauge their interpretation of the personalised feedback, graphics and visualisations. Results We identified two main components focussing on (a) the interpretation of feedback designs and data and (b) the impact of personalised visual physical activity feedback on facilitation of health behaviour change. Participants demonstrated a clear ability to understand the sophisticated personal information, along with enhanced knowledge of physical activity. They reported that receiving multidimensional feedback was motivating and could be usefully applied to facilitate their efforts in becoming more physically active. Conclusion Multidimensional physical activity feedback can be made comprehensible, informative and motivational by using appropriate graphics and visualisations. There is an opportunity to exploit the full potential created by technological innovation and provide sophisticated personalised physical activity feedback as an adjunct to support behaviour change. PMID:25938455
Changing epistemological beliefs: the unexpected impact of a short-term intervention.
Kienhues, Dorothe; Bromme, Rainer; Stahl, Elmar
2008-12-01
Previous research has shown that sophisticated epistemological beliefs exert a positive influence on students' learning strategies and learning outcomes. This gives a clear educational relevance to studies on the development of methods for promoting a change in epistemological beliefs and making them more sophisticated. The aim was to investigate the potential for influencing domain-specific epistemological beliefs through a short instructional intervention. On the basis of their performance on a preliminary survey of epistemological beliefs, 58 students at a German university (87.7% females) with a mean age of 21.86 years (SD=2.88) were selected. Half of them had more naive beliefs and the other half had more sophisticated ones. Participants were randomly assigned to one of two groups: one whose epistemological beliefs were challenged through refutational epistemological instruction and another receiving non-challenging informational instruction. The treatment effect was assessed by comparing pre- and post-instructional scores on two instruments tapping different layers of epistemological beliefs (DEBQ and CAEB). Data were subjected to factor analyses and analyses of variance. According to the CAEB, the naive group receiving the refutational epistemological instruction changed towards a more sophisticated view, whereas the sophisticated students receiving the informational instruction changed towards an unintended, more naive standpoint. According to the DEBQ, all research groups except the naive refutational group revealed changes towards a more naive view. This study indicates the possibility of changing domain-specific epistemological beliefs through a short-term intervention. However, it questions the stability and elaborateness of domain-specific epistemological beliefs, particularly when domain knowledge is shallow.
Harb, Omar S; Roos, David S
2015-01-01
Over the past 20 years, advances in high-throughput biological techniques and the availability of computational resources, including fast Internet access, have resulted in an explosion of large genome-scale data sets ("big data"). While such data are readily available for download, personal use, and analysis from a variety of repositories, such analysis often requires computational skills that are not widely available. As a result, a number of databases have emerged to provide scientists with online tools enabling the interrogation of data without the need for sophisticated computational skills beyond basic knowledge of Internet browser utility. This chapter focuses on the Eukaryotic Pathogen Databases (EuPathDB: http://eupathdb.org) Bioinformatic Resource Center (BRC) and illustrates some of the available tools and methods.
Data Analysis and Data Mining: Current Issues in Biomedical Informatics
Bellazzi, Riccardo; Diomidous, Marianna; Sarkar, Indra Neil; Takabayashi, Katsuhiko; Ziegler, Andreas; McCray, Alexa T.
2011-01-01
Background Medicine and biomedical sciences have become data-intensive fields, which, at the same time, enable the application of data-driven approaches and require sophisticated data analysis and data mining methods. Biomedical informatics provides a proper interdisciplinary context to integrate data and knowledge when processing available information, with the aim of giving effective decision-making support in clinics and translational research. Objectives To reflect on different perspectives related to the role of data analysis and data mining in biomedical informatics. Methods On the occasion of the 50th year of Methods of Information in Medicine a symposium was organized, that reflected on opportunities, challenges and priorities of organizing, representing and analysing data, information and knowledge in biomedicine and health care. The contributions of experts with a variety of backgrounds in the area of biomedical data analysis have been collected as one outcome of this symposium, in order to provide a broad, though coherent, overview of some of the most interesting aspects of the field. Results The paper presents sections on data accumulation and data-driven approaches in medical informatics, data and knowledge integration, statistical issues for the evaluation of data mining models, translational bioinformatics and bioinformatics aspects of genetic epidemiology. Conclusions Biomedical informatics represents a natural framework to properly and effectively apply data analysis and data mining methods in a decision-making context. In the future, it will be necessary to preserve the inclusive nature of the field and to foster an increasing sharing of data and methods between researchers. PMID:22146916
Handbook of capture-recapture analysis
Amstrup, Steven C.; McDonald, Trent L.; Manly, Bryan F.J.
2005-01-01
Every day, biologists in parkas, raincoats, and rubber boots go into the field to capture and mark a variety of animal species. Back in the office, statisticians create analytical models for the field biologists' data. But many times, representatives of the two professions do not fully understand one another's roles. This book bridges this gap by helping biologists understand state-of-the-art statistical methods for analyzing capture-recapture data. In so doing, statisticians will also become more familiar with the design of field studies and with the real-life issues facing biologists.Reliable outcomes of capture-recapture studies are vital to answering key ecological questions. Is the population increasing or decreasing? Do more or fewer animals have a particular characteristic? In answering these questions, biologists cannot hope to capture and mark entire populations. And frequently, the populations change unpredictably during a study. Thus, increasingly sophisticated models have been employed to convert data into answers to ecological questions. This book, by experts in capture-recapture analysis, introduces the most up-to-date methods for data analysis while explaining the theory behind those methods. Thorough, concise, and portable, it will be immensely useful to biologists, biometricians, and statisticians, students in both fields, and anyone else engaged in the capture-recapture process.
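The simplest capture-recapture model, far simpler than those covered in the handbook, already shows the core logic the book builds on: estimate population size from the overlap between two capture occasions. The sketch below uses the Lincoln-Petersen estimator with Chapman's bias correction on invented counts.

```python
# Lincoln-Petersen / Chapman estimator on invented capture counts.
n1 = 120   # animals captured and marked in the first session
n2 = 95    # animals captured in the second session
m2 = 18    # marked animals recaptured in the second session

# Chapman's bias-corrected estimator of population size and its variance.
N_hat = (n1 + 1) * (n2 + 1) / (m2 + 1) - 1
var_hat = ((n1 + 1) * (n2 + 1) * (n1 - m2) * (n2 - m2)) / ((m2 + 1) ** 2 * (m2 + 2))

print(f"estimated population size: {N_hat:.0f} (SE ~ {var_hat ** 0.5:.0f})")
```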
Synchronisation and coupling analysis: applied cardiovascular physics in sleep medicine.
Wessel, Niels; Riedl, Maik; Kramer, Jan; Muller, Andreas; Penzel, Thomas; Kurths, Jurgen
2013-01-01
Sleep is a physiological process with an internal program of well-defined sleep stages and intermediate wakefulness periods. The sleep stages modulate the autonomic nervous system, and the stages are therefore accompanied by different regulation regimes for the cardiovascular and respiratory systems. The differences in regulation can be distinguished by new techniques of cardiovascular physics. The number of patients suffering from sleep disorders increases disproportionately with population growth and aging, leading to very high expenses in the public health system. Therefore, the challenge of cardiovascular physics is to develop highly sophisticated methods that can, on the one hand, supplement and replace expensive medical devices and, on the other hand, improve medical diagnostics while decreasing the patient's risk. Methods of cardiovascular physics are used to analyze heart rate, blood pressure and respiration to detect changes of the autonomic nervous system in different diseases. Data-driven modelling, synchronization and coupling analysis, and their applications to biosignals in healthy subjects and patients with different sleep disorders are presented. Newly derived methods of cardiovascular physics can help to find indicators for these health risks.
ERIC Educational Resources Information Center
Davis, Gary Alan; Kovacs, Paul J.; Scarpino, John; Turchek, John C.
2010-01-01
The emergence of increasingly sophisticated communication technologies and the media-rich extensions of the World Wide Web have prompted universities to use alternatives to the traditional classroom teaching and learning methods. This demand for alternative delivery methods has led to the development of a wide range of eLearning techniques.…
The Story of Fluoridation: It started as an observation, that ... this time using photospectrographic analysis, a more sophisticated technology than that used by McKay. Churchill asked an ...
The Delphi Method in Rehabilitation Counseling Research
ERIC Educational Resources Information Center
Vazquez-Ramos, Robinson; Leahy, Michael; Estrada Hernandez, Noel
2007-01-01
Rehabilitation researchers have found in the application of the Delphi method a more sophisticated way of obtaining consensus from experts in the field on certain matters. The application of this research methodology has affected and certainly advanced the body of knowledge of the rehabilitation counseling practice. However, the rehabilitation…
Teaching Basic Quantum Mechanics in Secondary School Using Concepts of Feynman Path Integrals Method
ERIC Educational Resources Information Center
Fanaro, Maria de los Angeles; Otero, Maria Rita; Arlego, Marcelo
2012-01-01
This paper discusses the teaching of basic quantum mechanics in high school. Rather than following the usual formalism, our approach is based on Feynman's path integral method. Our presentation makes use of simulation software and avoids sophisticated mathematical formalism. (Contains 3 figures.)
Biomarkers: unrealized potential in sports doping analysis.
Teale, Phil; Barton, Chris; Driver, Philip M; Kay, Richard G
2009-09-01
The fight against doping in sport using analytical chemistry is a mature area with a history of approximately 100 years in horse racing and at least 40 years in human sport. Over that period, the techniques used and the breadth of coverage have developed significantly. These improvements in the testing methods have been matched by the increased sophistication of the methods, drugs and therapies available to the cheat and, as a result, testing has been a reactive process constantly adapting to meet new threats. Following the inception of the World Anti-Doping Agency, research into the methods and technologies available for human doping control have received coordinated funding on an international basis. The area of biomarker research has been a major beneficiary of this funding. The aim of this article is to review recent developments in the application of biomarkers to doping control and to assess the impact this could make in the future.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lafleur, Jarret Marshall; Purvis, Liston Keith; Roesler, Alexander William
2014-04-01
Of the many facets of the criminal world, few have captured society's fascination as has that of high-stakes robbery. The combination of meticulousness, cunning, and audacity required to execute a real-life Ocean's Eleven may be uncommon among criminals, but fortunately it is common enough to extract a wealth of lessons for the protection of high-value assets. To assist in informing the analyses and decisions of security professionals, this paper surveys 23 sophisticated and high-value heists that have occurred or been attempted around the world, particularly over the past three decades. The results, compiled in a Heist Methods and Characteristics Database, have been analyzed qualitatively and quantitatively, with the goals of both identifying common characteristics and characterizing the range and diversity of criminal methods used. The analysis is focused in six areas: (1) Defeated Security Measures and Devices, (2) Deception Methods, (3) Timing, (4) Weapons, (5) Resources, and (6) Insiders.
A Review of Issues Related to Data Acquisition and Analysis in EEG/MEG Studies.
Puce, Aina; Hämäläinen, Matti S
2017-05-31
Electroencephalography (EEG) and magnetoencephalography (MEG) are non-invasive electrophysiological methods, which record electric potentials and magnetic fields due to electric currents in synchronously-active neurons. With MEG being more sensitive to neural activity from tangential currents and EEG being able to detect both radial and tangential sources, the two methods are complementary. Over the years, neurophysiological studies have changed considerably: high-density recordings are becoming de rigueur; there is interest in both spontaneous and evoked activity; and sophisticated artifact detection and removal methods are available. Improved head models for source estimation have also increased the precision of the current estimates, particularly for EEG and combined EEG/MEG. Because of their complementarity, more investigators are beginning to perform simultaneous EEG/MEG studies to gain more complete information about neural activity. Given the increase in methodological complexity in EEG/MEG, it is important to gather data that are of high quality and that are as artifact free as possible. Here, we discuss some issues in data acquisition and analysis of EEG and MEG data. Practical considerations for different types of EEG and MEG studies are also discussed.
NASA Technical Reports Server (NTRS)
Arnold, Steven M.; Trowbridge, D.
2001-01-01
A critical issue in the micromechanics-based analysis of composite structures is the availability of a computationally efficient homogenization technique: one that is (1) capable of handling the sophisticated, physically based, viscoelastoplastic constitutive and life models for each constituent; (2) able to generate accurate displacement and stress fields at both the macro and the micro levels; and (3) compatible with the finite element method. The Generalized Method of Cells (GMC) developed by Paley and Aboudi is one such micromechanical model that has been shown to predict accurately the overall macro behavior of various types of composites given the required constituent properties. Specifically, the method provides "closed-form" expressions for the macroscopic composite response in terms of the properties, size, shape, distribution, and response of the individual constituents or phases that make up the material. Furthermore, expressions relating the internal stress and strain fields in the individual constituents to the macroscopically applied stresses and strains are available through strain or stress concentration matrices. These expressions make possible the investigation of failure processes at the microscopic level at each step of an applied load history.
Differential Variance Analysis: a direct method to quantify and visualize dynamic heterogeneities
NASA Astrophysics Data System (ADS)
Pastore, Raffaele; Pesce, Giuseppe; Caggioni, Marco
2017-03-01
Many amorphous materials show spatially heterogenous dynamics, as different regions of the same system relax at different rates. Such a signature, known as Dynamic Heterogeneity, has been crucial to understand the nature of the jamming transition in simple model systems and is currently considered very promising to characterize more complex fluids of industrial and biological relevance. Unfortunately, measurements of dynamic heterogeneities typically require sophisticated experimental set-ups and are performed by few specialized groups. It is now possible to quantitatively characterize the relaxation process and the emergence of dynamic heterogeneities using a straightforward method, here validated on video microscopy data of hard-sphere colloidal glasses. We call this method Differential Variance Analysis (DVA), since it focuses on the variance of the differential frames, obtained subtracting images at different time-lags. Moreover, direct visualization of dynamic heterogeneities naturally appears in the differential frames, when the time-lag is set to the one corresponding to the maximum dynamic susceptibility. This approach opens the way to effectively characterize and tailor a wide variety of soft materials, from complex formulated products to biological tissues.
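The differential-frame idea described above is straightforward to prototype; the sketch below is a minimal illustration on a synthetic image stack (the stack, the normalisation, and the lag grid are assumptions, not the authors' implementation).

    import numpy as np

    def differential_variance(frames, lag):
        """Mean spatial variance of difference images at a given frame lag.

        frames: array of shape (n_frames, height, width).
        The value grows as the sample decorrelates over the lag.
        """
        diff = frames[lag:] - frames[:-lag]          # all differential frames at this lag
        return diff.var(axis=(1, 2)).mean()          # spatial variance, averaged over time

    # Synthetic stack whose pixel values evolve slowly, so decorrelation grows with lag
    rng = np.random.default_rng(0)
    frames = rng.normal(size=(100, 64, 64)).cumsum(axis=0) * 0.05

    for lag in (1, 2, 5, 10, 20, 40):
        print(f"lag {lag:3d}: differential variance {differential_variance(frames, lag):.3f}")
    # The lag at which this curve saturates plays the role of the structural relaxation time.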
Setting the standards for signal transduction research.
Saez-Rodriguez, Julio; Alexopoulos, Leonidas G; Stolovitzky, Gustavo
2011-02-15
Major advances in high-throughput technology platforms, coupled with increasingly sophisticated computational methods for systematic data analysis, have provided scientists with tools to better understand the complexity of signaling networks. In this era of massive and diverse data collection, standardization efforts that streamline data gathering, analysis, storage, and sharing are becoming a necessity. Here, we give an overview of current technologies to study signal transduction. We argue that along with the opportunities the new technologies open, their heterogeneous nature poses critical challenges for data handling that are further increased when data are to be integrated in mathematical models. Efficient standardization through markup languages and data annotation is a sine qua non condition for a systems-level analysis of signaling processes. It remains to be seen to what extent, and how quickly, the emerging standardization efforts will be embraced by the signaling community.
Harnessing psychoanalytical methods for a phenomenological neuroscience
Cusumano, Emma P.; Raz, Amir
2014-01-01
Psychoanalysis proffers a wealth of phenomenological tools to advance the study of consciousness. Techniques for elucidating the structures of subjective life are sorely lacking in the cognitive sciences; as such, experiential reporting techniques must rise to meet both complex theories of brain function and increasingly sophisticated neuroimaging technologies. Analysis may offer valuable methods for bridging the gap between first-person and third-person accounts of the mind. Using both systematic observational approaches alongside unstructured narrative interactions, psychoanalysts help patients articulate their experience and bring unconscious mental contents into awareness. Similar to seasoned meditators or phenomenologists, individuals who have undergone analysis are experts in discerning and describing their subjective experience, thus making them ideal candidates for neurophenomenology. Moreover, analytic techniques may provide a means of guiding untrained experimental participants to greater awareness of their mental continuum, as well as gathering subjective reports about fundamental yet elusive aspects of experience including selfhood, temporality, and inter-subjectivity. Mining psychoanalysis for its methodological innovations provides a fresh turn for the neuropsychoanalysis movement and cognitive science as a whole – showcasing the integrity of analysis alongside the irreducibility of human experience. PMID:24808869
Sensorimotor Learning of Acupuncture Needle Manipulation Using Visual Feedback
Jung, Won-Mo; Lim, Jinwoong; Lee, In-Seon; Park, Hi-Joon; Wallraven, Christian; Chae, Younbyoung
2015-01-01
Objective Humans can acquire a wide variety of motor skills using sensory feedback pertaining to discrepancies between intended and actual movements. Acupuncture needle manipulation involves sophisticated hand movements and represents a fundamental skill for acupuncturists. We investigated whether untrained students could improve their motor performance during acupuncture needle manipulation using visual feedback (VF). Methods Twenty-one untrained medical students were included, randomly divided into concurrent (n = 10) and post-trial (n = 11) VF groups. Both groups were trained in simple lift/thrusting techniques during session 1, and in complicated lift/thrusting techniques in session 2 (eight training trials per session). We compared the motion patterns and error magnitudes of pre- and post-training tests. Results During motion pattern analysis, both the concurrent and post-trial VF groups exhibited greater improvements in motion patterns during the complicated lifting/thrusting session. In the magnitude error analysis, both groups also exhibited reduced error magnitudes during the simple lifting/thrusting session. For the training period, the concurrent VF group exhibited reduced error magnitudes across all training trials, whereas the post-trial VF group was characterized by greater error magnitudes during initial trials, which gradually reduced during later trials. Conclusions Our findings suggest that novices can improve the sophisticated hand movements required for acupuncture needle manipulation using sensorimotor learning with VF. Use of two types of VF can be beneficial for untrained students in terms of learning how to manipulate acupuncture needles, using either automatic or cognitive processes. PMID:26406248
Multiscale methods for gore curvature calculations from FSI modeling of spacecraft parachutes
NASA Astrophysics Data System (ADS)
Takizawa, Kenji; Tezduyar, Tayfun E.; Kolesar, Ryan; Boswell, Cody; Kanai, Taro; Montel, Kenneth
2014-12-01
There are now some sophisticated and powerful methods for computer modeling of parachutes. These methods are capable of addressing some of the most formidable computational challenges encountered in parachute modeling, including fluid-structure interaction (FSI) between the parachute and air flow, design complexities such as those seen in spacecraft parachutes, and operational complexities such as use in clusters and disreefing. One should be able to extract from a reliable full-scale parachute modeling any data or analysis needed. In some cases, however, the parachute engineers may want to perform quickly an extended or repetitive analysis with methods based on simplified models. Some of the data needed by a simplified model can very effectively be extracted from a full-scale computer modeling that serves as a pilot. A good example of such data is the circumferential curvature of a parachute gore, where a gore is the slice of the parachute canopy between two radial reinforcement cables running from the parachute vent to the skirt. We present the multiscale methods we devised for gore curvature calculation from FSI modeling of spacecraft parachutes. The methods include those based on the multiscale sequentially-coupled FSI technique and using NURBS meshes. We show how the methods work for the fully-open and two reefed stages of the Orion spacecraft main and drogue parachutes.
Microcomputers and Stimulus Control: From the Laboratory to the Classroom.
ERIC Educational Resources Information Center
LeBlanc, Judith M.; And Others
1985-01-01
The need for developing a technology of teaching that equals current sophistication of microcomputer technology is addressed. The importance of principles of learning and behavior analysis is emphasized. Potential roles of stimulus control and precise error analysis in educational program development and in prescription of specific learning…
The Impending Revolution in School Business Management.
ERIC Educational Resources Information Center
James, H. Thomas
The development of logically sophisticated analytical models in a growing number of fields has placed new emphasis on efficiency in school management. Recent systems models guiding the long-run analysis of school management in terms of efficiency--through cost-benefit studies, systems analysis, and program planning and budgeting systems--are in…
ERIC Educational Resources Information Center
Pierce, Robyn; Stacey, Kaye; Wander, Roger; Ball, Lynda
2011-01-01
Current technologies incorporating sophisticated mathematical analysis software (calculation, graphing, dynamic geometry, tables, and more) provide easy access to multiple representations of mathematical problems. Realising the affordances of such technology for students' learning requires carefully designed lessons. This paper reports on design…
Improving Instructions Using a Data Analysis Collaborative Model
ERIC Educational Resources Information Center
Good, Rebecca B.; Jackson, Sherion H.
2007-01-01
As student data analysis reports become more sophisticated, these reports reveal greater details on low performance skills. Availability of models and programs depicting detailed instructions or guidance for utilizing data to impact classroom instruction, in an effort to increase student achievement, has been lacking. This study examines the…
Interaction Analysis: Theory, Research and Application.
ERIC Educational Resources Information Center
Amidon, Edmund J., Ed.; Hough, John J., Ed.
This volume of selected readings developed for students and practitioners at various levels of sophistication is intended to be representative of work done to date on interaction analysis. The contents include journal articles, papers read at professional meetings, abstracts of doctoral dissertations, and selections from larger monographs, plus 12…
ERIC Educational Resources Information Center
Haddad, Paul; And Others
1983-01-01
Background information, procedures, and results are provided for an experiment demonstrating techniques of solvent selection, gradient elution, pH control, and ion-pairing in the analysis of an analgesic mixture using reversed-phase liquid chromatography on an octadecylsilane column. Although developed using sophisticated/expensive equipment, less…
NLSE: Parameter-Based Inversion Algorithm
NASA Astrophysics Data System (ADS)
Sabbagh, Harold A.; Murphy, R. Kim; Sabbagh, Elias H.; Aldrin, John C.; Knopp, Jeremy S.
Chapter 11 introduced us to the notion of an inverse problem and gave us some examples of the value of this idea to the solution of realistic industrial problems. The basic inversion algorithm described in Chap. 11 was based upon the Gauss-Newton theory of nonlinear least-squares estimation and is called NLSE in this book. In this chapter we will develop the mathematical background of this theory more fully, because this algorithm will be the foundation of inverse methods and their applications during the remainder of this book. We hope, thereby, to introduce the reader to the application of sophisticated mathematical concepts to engineering practice without introducing excessive mathematical sophistication.
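NLSE itself is not reproduced in this chapter summary; purely as a hedged illustration of the Gauss-Newton idea it builds on, the sketch below fits a two-parameter exponential model to noisy data by repeatedly linearising the residual (the model, data, and fixed iteration count are illustrative assumptions).

    import numpy as np

    def gauss_newton(residual, jacobian, p0, n_iter=20):
        """Minimise sum(residual(p)**2) by Gauss-Newton iterations."""
        p = np.asarray(p0, dtype=float)
        for _ in range(n_iter):
            r = residual(p)
            J = jacobian(p)
            # Solve the normal equations J^T J dp = -J^T r for the update step
            dp = np.linalg.solve(J.T @ J, -J.T @ r)
            p = p + dp
        return p

    # Toy inverse problem: recover (a, b) in y = a * exp(-b * x) from noisy samples
    x = np.linspace(0, 4, 50)
    a_true, b_true = 2.0, 1.3
    rng = np.random.default_rng(1)
    y = a_true * np.exp(-b_true * x) + 0.01 * rng.standard_normal(x.size)

    residual = lambda p: p[0] * np.exp(-p[1] * x) - y
    jacobian = lambda p: np.column_stack([np.exp(-p[1] * x),
                                          -p[0] * x * np.exp(-p[1] * x)])

    print("estimated parameters:", gauss_newton(residual, jacobian, p0=[1.0, 1.0]))

A production inversion code adds step control and convergence tests, but the linearise-and-solve loop above is the core of the approach.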
Lloyd, G C
1996-01-01
Contends that as techniques to motivate, empower and reward staff become ever more sophisticated and expensive, one of the most obvious, though overlooked, ways of tapping the creativity of employees is the suggestion scheme. A staff suggestion scheme may well be dismissed as a simplistic and outdated vehicle by proponents of modern management methods, but to its owners it can be like a classic model--needing just a little care and attention in order for it to run smoothly and at a very low cost. Proposes that readers should spare some time to consider introducing a suggestion scheme as an entry level initiative and a precursor to more sophisticated, elaborate and costly change management mechanisms.
Platonic Dialogue, Maieutic Method and Critical Thinking
ERIC Educational Resources Information Center
Leigh, Fiona
2007-01-01
In this paper I offer a reading of one of Plato's later works, the "Sophist", that reveals it to be informed by principles comparable on the face of it with those that have emerged recently in the field of critical thinking. As a development of the famous Socratic method of his teacher, I argue, Plato deployed his own pedagogical method, a…
ERIC Educational Resources Information Center
Kies, Cosette
1975-01-01
A discussion of the way marketing expertise has employed sophisticated and psychological methods in packaging a variety of products, including those items stocked by libraries and media centers: books, records, periodicals, and audio-visual materials. (Author)
Effective Materials Property Information Management for the 21st Century
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Weiju; Cebon, David; Arnold, Steve
2010-01-01
This paper discusses key principles for the development of materials property information management software systems. There are growing needs for automated materials information management in industry, research organizations and government agencies. In part these are fuelled by the demands for higher efficiency in material testing, product design and development and engineering analysis. But equally important, organizations are being driven to employ sophisticated methods and software tools for managing their mission-critical materials information by the needs for consistency, quality and traceability of data, as well as control of access to proprietary or sensitive information. Furthermore the use of increasingly sophisticated nonlinear, anisotropic and multi-scale engineering analysis approaches, particularly for composite materials, requires both processing of much larger volumes of test data for development of constitutive models and much more complex materials data input requirements for Computer-Aided Engineering (CAE) software. And finally, the globalization of engineering processes and outsourcing of design and development activities generates much greater needs for sharing a single gold source of materials information between members of global engineering teams in extended supply-chains. Fortunately material property management systems have kept pace with the growing user demands. They have evolved from hard copy archives, through simple electronic databases, to versatile data management systems that can be customized to specific user needs. The more sophisticated of these provide facilities for: (i) data management functions such as access control, version control, and quality control; (ii) a wide range of data import, export and analysis capabilities; (iii) mechanisms for ensuring that all data is traceable to its pedigree sources: details of testing programs, published sources, etc; (iv) tools for searching, reporting and viewing the data; and (v) access to the information via a wide range of interfaces, including web browsers, rich clients, programmatic access and clients embedded in third-party applications, such as CAE systems. This paper discusses the important requirements for advanced material data management systems as well as the future challenges and opportunities such as automated error checking, automated data quality assessment and characterization, identification of gaps in data, as well as functionalities and business models to keep users returning to the source: to generate user demand to fuel database growth and maintenance.
NASA Astrophysics Data System (ADS)
Wang, Qianren; Chen, Xing; Yin, Yuehong; Lu, Jian
2017-08-01
With the increasing complexity of mechatronic products, traditional empirical or step-by-step design methods are facing great challenges, with various factors and different stages having become inevitably coupled during the design process. Management of massive information or big data, as well as the efficient operation of information flow, is deeply involved in the process of coupled design. Designers have to address increasingly sophisticated situations when coupled optimisation is also engaged. Aiming at overcoming these difficulties in the design of the spindle box system of an ultra-precision optical grinding machine, this paper proposes a coupled optimisation design method based on state-space analysis, with the design knowledge represented by ontologies and their semantic networks. An electromechanical coupled model integrating the mechanical structure, control system and motor driving system is established, mainly concerning the stiffness matrices of the hydrostatic bearings, ball screw nut and rolling guide sliders. The effectiveness and precision of the method are validated by the simulation results of the natural frequency and deformation of the spindle box when an impact force is applied to the grinding wheel.
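The paper's coupled electromechanical model is not available from the abstract; as a hedged sketch of how natural frequencies fall out of a lumped mass-stiffness description, the snippet below solves the generalised eigenvalue problem K·φ = ω²·M·φ for a made-up three-degree-of-freedom system (the matrices are placeholders, not the spindle-box model).

    import numpy as np
    from scipy.linalg import eigh

    # Hypothetical 3-DOF lumped model: masses in kg, stiffnesses in N/m (placeholders)
    M = np.diag([120.0, 80.0, 40.0])
    K = np.array([[ 5.0e7, -2.0e7,  0.0  ],
                  [-2.0e7,  4.5e7, -1.5e7],
                  [ 0.0,   -1.5e7,  1.5e7]])

    # Generalised symmetric eigenvalue problem: K phi = w^2 M phi
    w2, modes = eigh(K, M)
    freqs_hz = np.sqrt(w2) / (2 * np.pi)
    print("natural frequencies [Hz]:", np.round(freqs_hz, 1))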
Comprehension-Driven Program Analysis (CPA) for Malware Detection in Android Phones
2015-07-01
Iowa State University, July 2015, final report. ... machine analysis system to detect novel, sophisticated Android malware. (c) An innovative library summarization technique and its incorporation in…
Statistical Tests of Reliability of NDE
NASA Technical Reports Server (NTRS)
Baaklini, George Y.; Klima, Stanley J.; Roth, Don J.; Kiser, James D.
1987-01-01
Capabilities of advanced material-testing techniques analyzed. Collection of four reports illustrates statistical method for characterizing flaw-detecting capabilities of sophisticated nondestructive evaluation (NDE). Method used to determine reliability of several state-of-the-art NDE techniques for detecting failure-causing flaws in advanced ceramic materials considered for use in automobiles, airplanes, and space vehicles.
Hospital site selection using fuzzy AHP and its derivatives.
Vahidnia, Mohammad H; Alesheikh, Ali A; Alimohammadi, Abbas
2009-07-01
Environmental managers are commonly faced with sophisticated decisions, such as choosing the location of a new facility subject to multiple conflicting criteria. This paper considers the specific problem of creating a well-distributed network of hospitals that delivers its services to the target population with minimal time, pollution and cost. We develop a Multi-Criteria Decision Analysis process that combines Geographical Information System (GIS) analysis with the Fuzzy Analytical Hierarchy Process (FAHP), and use this process to determine the optimum site for a new hospital in the Tehran urban area. The GIS was used to calculate and classify governing criteria, while FAHP was used to evaluate the decision factors and their impacts on alternative sites. Three methods were used to estimate the total weights and priorities of the candidate sites: fuzzy extent analysis, center-of-area defuzzification, and the alpha-cut method. The three methods yield identical priorities for the five alternatives considered. Fuzzy extent analysis provides less discriminating power, but is simpler to implement and compute than the other two methods. The alpha-cut method is more complicated, but integrates the uncertainty and overall attitude of the decision-maker. The usefulness of the new hospital site is evaluated by computing an accessibility index for each pixel in the GIS, defined as the ratio of population density to travel time. With the addition of a new hospital at the optimum site, this index improved over about 6.5 percent of the geographical area.
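The accessibility index described (population density divided by travel time) is easy to reproduce; the sketch below evaluates it per raster cell with NumPy, where both input grids are stand-ins for the GIS layers used in the study.

    import numpy as np

    def accessibility_index(pop_density, travel_time_min, eps=1e-6):
        """Ratio of population density to travel time, evaluated per raster cell."""
        return pop_density / np.maximum(travel_time_min, eps)

    # Stand-in rasters: persons per km^2 and minutes of travel to the nearest hospital
    rng = np.random.default_rng(42)
    pop_density = rng.uniform(500, 15000, size=(100, 100))
    travel_before = rng.uniform(5, 60, size=(100, 100))
    travel_after = travel_before * rng.uniform(0.7, 1.0, size=(100, 100))  # new site shortens some trips

    before = accessibility_index(pop_density, travel_before)
    after = accessibility_index(pop_density, travel_after)
    print(f"share of cells with improved accessibility: {np.mean(after > before):.1%}")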
Sophistry, the Sophists and modern medical education.
Macsuibhne, S P
2010-01-01
The term 'sophist' has become a term of intellectual abuse in both general discourse and that of educational theory. However the actual thought of the fifth century BC Athenian-based philosophers who were the original Sophists was very different from the caricature. In this essay, I draw parallels between trends in modern medical educational practice and the thought of the Sophists. Specific areas discussed are the professionalisation of medical education, the teaching of higher-order characterological attributes such as personal development skills, and evidence-based medical education. Using the specific example of the Sophist Protagoras, it is argued that the Sophists were precursors of philosophical approaches and practices of enquiry underlying modern medical education.
U.S. Naval War College Strategic Plan 2014-2018
2014-06-01
Contents include a Strengths, Weaknesses, Opportunities, and Threats (SWOT) analysis and a SWOT net assessment. ... enterprise underpins excellence in other mission endeavors. Sophisticated war gaming, research, and analysis will continue to support the Navy's…
NASA Technical Reports Server (NTRS)
Young, William D.
1992-01-01
The application of formal methods to the analysis of computing systems promises to provide higher and higher levels of assurance as the sophistication of our tools and techniques increases. Improvements in tools and techniques come about as we pit the current state of the art against new and challenging problems. A promising area for the application of formal methods is in real-time and distributed computing. Some of the algorithms in this area are both subtle and important. In response to this challenge and as part of an ongoing attempt to verify an implementation of the Interactive Convergence Clock Synchronization Algorithm (ICCSA), we decided to undertake a proof of the correctness of the algorithm using the Boyer-Moore theorem prover. This paper describes our approach to proving the ICCSA using the Boyer-Moore prover.
Composite Load Spectra for Select Space Propulsion Structural Components
NASA Technical Reports Server (NTRS)
Ho, Hing W.; Newell, James F.
1994-01-01
Generic load models are described with multiple levels of progressive sophistication to simulate the composite (combined) load spectra (CLS) that are induced in space propulsion system components, representative of Space Shuttle Main Engines (SSME), such as transfer ducts, turbine blades and liquid oxygen (LOX) posts. These generic (coupled) models combine the deterministic models for composite load dynamic, acoustic, high-pressure and high rotational speed, etc., load simulation using statistically varying coefficients. These coefficients are then determined using advanced probabilistic simulation methods with and without strategically selected experimental data. The entire simulation process is included in a CLS computer code. Applications of the computer code to various components in conjunction with the PSAM (Probabilistic Structural Analysis Method) to perform probabilistic load evaluation and life prediction evaluations are also described to illustrate the effectiveness of the coupled model approach.
Marinelli, A; Dunning, M; Weathersby, S; Hemsing, E; Xiang, D; Andonian, G; O'Shea, F; Miao, Jianwei; Hast, C; Rosenzweig, J B
2013-03-01
With the advent of coherent x rays provided by the x-ray free-electron laser (FEL), strong interest has been kindled in sophisticated diffraction imaging techniques. In this Letter, we exploit such techniques for the diagnosis of the density distribution of the intense electron beams typically utilized in an x-ray FEL itself. We have implemented this method by analyzing the far-field coherent transition radiation emitted by an inverse-FEL microbunched electron beam. This analysis utilizes an oversampling phase retrieval method on the transition radiation angular spectrum to reconstruct the transverse spatial distribution of the electron beam. This application of diffraction imaging represents a significant advance in electron beam physics, having critical applications to the diagnosis of high-brightness beams, as well as the collective microbunching instabilities afflicting these systems.
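The Letter's oversampling phase retrieval procedure is not detailed in the abstract; as a generic, hedged illustration of the algorithm family it belongs to, the sketch below runs a basic error-reduction loop (alternating between measured Fourier magnitudes and a real-space support constraint) on a synthetic diffraction pattern.

    import numpy as np

    def error_reduction(fourier_mag, support, n_iter=200, seed=0):
        """Basic error-reduction phase retrieval for a real, non-negative object."""
        rng = np.random.default_rng(seed)
        phases = np.exp(2j * np.pi * rng.random(fourier_mag.shape))
        g = np.fft.ifft2(fourier_mag * phases).real          # random starting guess
        for _ in range(n_iter):
            G = np.fft.fft2(g)
            G = fourier_mag * np.exp(1j * np.angle(G))       # impose measured magnitudes
            g = np.fft.ifft2(G).real
            g = np.where(support & (g > 0), g, 0.0)          # impose support and positivity
        return g

    # Toy object: a small rectangle inside a zero-padded (oversampled) field
    obj = np.zeros((64, 64)); obj[28:36, 24:40] = 1.0
    support = np.zeros_like(obj, dtype=bool); support[20:44, 16:48] = True
    mag = np.abs(np.fft.fft2(obj))                           # "measured" magnitudes

    recon = error_reduction(mag, support)
    err = np.linalg.norm(np.abs(np.fft.fft2(recon)) - mag) / np.linalg.norm(mag)
    print(f"relative Fourier-magnitude error after retrieval: {err:.3f}")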
Biological Parametric Mapping: A Statistical Toolbox for Multi-Modality Brain Image Analysis
Casanova, Ramon; Ryali, Srikanth; Baer, Aaron; Laurienti, Paul J.; Burdette, Jonathan H.; Hayasaka, Satoru; Flowers, Lynn; Wood, Frank; Maldjian, Joseph A.
2006-01-01
In recent years multiple brain MR imaging modalities have emerged; however, analysis methodologies have mainly remained modality specific. In addition, when comparing across imaging modalities, most researchers have been forced to rely on simple region-of-interest type analyses, which do not allow the voxel-by-voxel comparisons necessary to answer more sophisticated neuroscience questions. To overcome these limitations, we developed a toolbox for multimodal image analysis called biological parametric mapping (BPM), based on a voxel-wise use of the general linear model. The BPM toolbox incorporates information obtained from other modalities as regressors in a voxel-wise analysis, thereby permitting investigation of more sophisticated hypotheses. The BPM toolbox has been developed in MATLAB with a user friendly interface for performing analyses, including voxel-wise multimodal correlation, ANCOVA, and multiple regression. It has a high degree of integration with the SPM (statistical parametric mapping) software relying on it for visualization and statistical inference. Furthermore, statistical inference for a correlation field, rather than a widely-used T-field, has been implemented in the correlation analysis for more accurate results. An example with in-vivo data is presented demonstrating the potential of the BPM methodology as a tool for multimodal image analysis. PMID:17070709
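The toolbox itself is MATLAB/SPM-based; purely to illustrate the voxel-wise use of one modality as a regressor for another, the hedged Python sketch below runs an independent simple regression at every voxel of a synthetic data set (array shapes, effect size, and thresholds are assumptions).

    import numpy as np
    from scipy import stats

    def voxelwise_regression(y_imgs, x_imgs):
        """Per-voxel simple linear regression of modality Y on modality X.

        y_imgs, x_imgs: arrays of shape (n_subjects, n_voxels).
        Returns the per-voxel slope and two-sided p-value.
        """
        n = y_imgs.shape[0]
        xc = x_imgs - x_imgs.mean(axis=0)
        yc = y_imgs - y_imgs.mean(axis=0)
        slope = (xc * yc).sum(axis=0) / (xc ** 2).sum(axis=0)
        resid = yc - slope * xc
        se = np.sqrt((resid ** 2).sum(axis=0) / (n - 2) / (xc ** 2).sum(axis=0))
        p = 2 * stats.t.sf(np.abs(slope / se), df=n - 2)
        return slope, p

    # Synthetic data: 30 subjects, 5000 voxels; modality Y depends on X in the first 500 voxels
    rng = np.random.default_rng(3)
    x = rng.standard_normal((30, 5000))
    y = rng.standard_normal((30, 5000))
    y[:, :500] += 0.8 * x[:, :500]

    slope, p = voxelwise_regression(y, x)
    print("voxels significant at p < 0.001:", int((p < 0.001).sum()))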
Kessels-Habraken, Marieke; De Jonge, Jan; Van der Schaaf, Tjerk; Rutte, Christel
2010-05-01
Hospitals can apply prospective and retrospective methods to reduce the large number of medical errors. Retrospective methods are used to identify errors after they occur and to facilitate learning. Prospective methods aim to determine, assess and minimise risks before incidents happen. This paper questions whether the order of implementation of those two methods influences the resultant impact on incident reporting behaviour. From November 2007 until June 2008, twelve wards of two Dutch general hospitals participated in a quasi-experimental reversed-treatment non-equivalent control group design. The six units of Hospital 1 first conducted a prospective analysis, after which a sophisticated incident reporting and analysis system was implemented. On the six units of Hospital 2 the two methods were implemented in reverse order. Data from the incident reporting and analysis system and from a questionnaire were used to assess between-hospital differences regarding the number of reported incidents, the spectrum of reported incident types, and the profession of reporters. The results show that carrying out a prospective analysis first can improve incident reporting behaviour in terms of a wider spectrum of reported incident types and a larger proportion of incidents reported by doctors. However, the proposed order does not necessarily yield a larger number of reported incidents. This study fills an important gap in safety management research regarding the order of the implementation of prospective and retrospective methods, and contributes to literature on incident reporting. This research also builds on the network theory of social contagion. The results might indicate that health care employees can disseminate their risk perceptions through communication with their direct colleagues. Copyright 2010 Elsevier Ltd. All rights reserved.
Studying DNA in the Classroom.
ERIC Educational Resources Information Center
Zarins, Silja
1993-01-01
Outlines a workshop for teachers that illustrates a method of extracting DNA and provides instructions on how to do some simple work with DNA without sophisticated and expensive equipment. Provides details on viscosity studies and breaking DNA molecules. (DDR)
Cognitive Pathways: Analysis of Students' Written Texts for Science Understanding
ERIC Educational Resources Information Center
Grimberg, Bruna Irene; Hand, Brian
2009-01-01
The purpose of this study was to reconstruct writers' reasoning process as reflected in their written texts. The codes resulting from the text analysis were related to cognitive operations, ranging from simple to more sophisticated ones. The sequence of the cognitive operations as the text unfolded represents the writer's cognitive pathway at the…
ERIC Educational Resources Information Center
Braverman, Marc T.
2016-01-01
Extension program evaluations often present opportunities to analyze data in multiple ways. This article suggests that program evaluations can involve more sophisticated data analysis approaches than are often used. On the basis of a hypothetical program scenario and corresponding data set, two approaches to testing for evidence of program impact…
USDA-ARS?s Scientific Manuscript database
With more sophisticated data compilation and analytical capabilities, the evolution of “big data” analysis has occurred rapidly. We examine the meta-analysis of “big data” representing phosphorus (P) flows and stocks in global agriculture and address the need to consider local nuances of farm operat...
A Sociolinguistic Analysis of English Borrowings in Japanese Advertising Texts.
ERIC Educational Resources Information Center
Takashi, Kyoko
1990-01-01
Sociolinguistic analysis of English borrowings in Japanese television and print advertising supported hypotheses that the primary reason for loanword use was to make the product seem more modern and sophisticated and that there was a relationship between loan functions and such audience characteristics as gender, age, occupation, and background.…
Factor Analysis of Drawings: Application to college student models of the greenhouse effect
NASA Astrophysics Data System (ADS)
Libarkin, Julie C.; Thomas, Stephen R.; Ording, Gabriel
2015-09-01
Exploratory factor analysis was used to identify models underlying drawings of the greenhouse effect made by over 200 entering university freshmen. Initial content analysis allowed deconstruction of drawings into salient features, with grouping of these features via factor analysis. A resulting 4-factor solution explains 62% of the data variance, suggesting that 4 archetype models of the greenhouse effect dominate thinking within this population. Factor scores, indicating the extent to which each student's drawing aligned with representative models, were compared to performance on conceptual understanding and attitudes measures, demographics, and non-cognitive features of drawings. Student drawings were also compared to drawings made by scientists to ascertain the extent to which models reflect more sophisticated and accurate models. Results indicate that student and scientist drawings share some similarities, most notably the presence of some features of the most sophisticated non-scientific model held among the study population. Prior knowledge, prior attitudes, gender, and non-cognitive components are also predictive of an individual student's model. This work presents a new technique for analyzing drawings, with general implications for the use of drawings in investigating student conceptions.
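The coding-then-factoring workflow described above can be sketched compactly; in the hedged example below, binary feature codes from hypothetical drawings are factor-analysed with scikit-learn and each drawing receives scores on the extracted factors (the feature set, the four-factor choice, and the treatment of binary codes as continuous are simplifying assumptions).

    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    # Hypothetical coding: 200 drawings x 12 salient features (1 = feature present)
    rng = np.random.default_rng(7)
    latent = rng.standard_normal((200, 4))                 # 4 underlying "archetype models"
    loadings = rng.uniform(-1, 1, size=(4, 12))
    features = (latent @ loadings + 0.5 * rng.standard_normal((200, 12)) > 0).astype(float)

    fa = FactorAnalysis(n_components=4, random_state=0)
    scores = fa.fit_transform(features)                    # each drawing's alignment with the factors
    print("loadings matrix (factors x features):", fa.components_.shape)
    print("factor scores of the first drawing:", np.round(scores[0], 2))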
NASA Technical Reports Server (NTRS)
Parrish, R. V.; Steinmetz, G. G.
1972-01-01
A method of parameter extraction for stability and control derivatives of aircraft from flight test data, implementing maximum likelihood estimation, has been developed and successfully applied to actual lateral flight test data from a modern sophisticated jet fighter. This application demonstrates the important role played by the analyst in combining engineering judgment and estimator statistics to yield meaningful results. During the analysis, the problems of uniqueness of the extracted set of parameters and of longitudinal coupling effects were encountered and resolved. The results for all flight runs are presented in tabular form and as time history comparisons between the estimated states and the actual flight test data.
The steady-state visual evoked potential in vision research: A review
Norcia, Anthony M.; Appelbaum, L. Gregory; Ales, Justin M.; Cottereau, Benoit R.; Rossion, Bruno
2015-01-01
Periodic visual stimulation and analysis of the resulting steady-state visual evoked potentials were first introduced over 80 years ago as a means to study visual sensation and perception. From the first single-channel recording of responses to modulated light to the present use of sophisticated digital displays composed of complex visual stimuli and high-density recording arrays, steady-state methods have been applied in a broad range of scientific and applied settings. The purpose of this article is to describe the fundamental stimulation paradigms for steady-state visual evoked potentials and to illustrate these principles through research findings across a range of applications in vision science. PMID:26024451
Leung, Janet T Y; Shek, Daniel T L
2011-01-01
This paper examines the use of quantitative and qualitative approaches to study the impact of economic disadvantage on family processes and adolescent development. Quantitative research has the merits of objectivity, good predictive and explanatory power, parsimony, precision and sophistication of analysis. Qualitative research, in contrast, provides a detailed, holistic, in-depth understanding of social reality and allows illumination of new insights. With the pragmatic considerations of methodological appropriateness, design flexibility, and situational responsiveness in responding to the research inquiry, a mixed methods approach could be a possibility of integrating quantitative and qualitative approaches and offers an alternative strategy to study the impact of economic disadvantage on family processes and adolescent development.
Statistical learning and selective inference.
Taylor, Jonathan; Tibshirani, Robert J
2015-06-23
We describe the problem of "selective inference." This addresses the following challenge: Having mined a set of data to find potential associations, how do we properly assess the strength of these associations? The fact that we have "cherry-picked"--searched for the strongest associations--means that we must set a higher bar for declaring significant the associations that we see. This challenge becomes more important in the era of big data and complex statistical modeling. The cherry tree (dataset) can be very large and the tools for cherry picking (statistical learning methods) are now very sophisticated. We describe some recent new developments in selective inference and illustrate their use in forward stepwise regression, the lasso, and principal components analysis.
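The need for a "higher bar" can be made concrete with a tiny simulation: when the strongest of many null associations is cherry-picked, its naive p-value is far too optimistic. The sketch below only demonstrates that selection effect; it is not the selective-inference machinery the authors develop.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n, n_predictors, n_sims = 50, 100, 500
    selected_p = []

    for _ in range(n_sims):
        X = rng.standard_normal((n, n_predictors))
        y = rng.standard_normal(n)                          # pure noise: no true associations
        r = X.T @ (y - y.mean()) / (n * X.std(axis=0) * y.std())   # correlation of each predictor with y
        best = np.argmax(np.abs(r))                         # cherry-pick the strongest association
        t = r[best] * np.sqrt((n - 2) / (1 - r[best] ** 2))
        selected_p.append(2 * stats.t.sf(abs(t), df=n - 2))

    frac = np.mean(np.array(selected_p) < 0.05)
    print(f"cherry-picked 'discoveries' with naive p < 0.05: {frac:.0%}")
    # Without selection this rate would be ~5%; selection inflates it dramatically.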
Overview of Non-Volatile Testing and Screening Methods
NASA Technical Reports Server (NTRS)
Irom, Farokh
2001-01-01
Testing methods for memories and non-volatile memories have become increasingly sophisticated as they become denser and more complex. High frequency and faster rewrite times as well as smaller feature sizes have led to many testing challenges. This paper outlines several testing issues posed by novel memories and approaches to testing for radiation and reliability effects. We discuss methods for measurements of Total Ionizing Dose (TID).
ERIC Educational Resources Information Center
Charconnet, Marie-George
This study describes various patterns of peer tutoring and is based on the use of cultural traditions and endogenous methods, on techniques and equipment acquired from other cultures, on problems presented by the adoption of educational technologies, and on methods needing little sophisticated equipment. A dozen peer tutoring systems are…
Reproducible research in vadose zone sciences
USDA-ARS?s Scientific Manuscript database
A significant portion of present-day soil and Earth science research is computational, involving complex data analysis pipelines, advanced mathematical and statistical models, and sophisticated computer codes. Opportunities for scientific progress are greatly diminished if reproducing and building o...
NASA Astrophysics Data System (ADS)
Kang, Mijeong; Yoo, Seung Min; Gwak, Raekeun; Eom, Gayoung; Kim, Jihwan; Lee, Sang Yup; Kim, Bongsoo
2015-12-01
A sophisticated set of an Au nanowire (NW) stimulator-Au NW detector system is developed for electrical cell stimulation and electrochemical analysis of subsequent exocytosis with very high spatial resolution. Dopamine release from a rat pheochromocytoma cell is more stimulated by a more negative voltage pulse. This system could help to improve the therapeutic efficacy of electrotherapies by providing valuable information on their healing mechanism.
The use of isotope ratios (13C/12C) for vegetable oils authentication
NASA Astrophysics Data System (ADS)
Cristea, G.; Magdas, D. A.; Mirel, V.
2012-02-01
Stable isotopes are now increasingly used for the control of the geographical origin or authenticity of food products. Falsification may be more or less sophisticated, and its sophistication, as well as its cost, increases with the improvement of analytical methods. In this study 22 vegetable oils (olive, sunflower, palm, maize) commercialized on the Romanian market were investigated by means of δ13C in bulk oil, and the obtained results were compared with those reported in the literature in order to check the labeling of these natural products. The obtained results were in the range of the mean values found in the literature for these types of oils, thus confirming their accurate labeling.
Control technology for future aircraft propulsion systems
NASA Technical Reports Server (NTRS)
Zeller, J. R.; Szuch, J. R.; Merrill, W. C.; Lehtinen, B.; Soeder, J. F.
1984-01-01
The need for a more sophisticated engine control system is discussed. Improvements in thrust-to-weight ratios demand the manipulation of more control inputs, and new technological solutions to the engine control problem are being applied. The digital electronic engine control (DEEC) system is a step in the evolution to digital electronic engine control. Technology issues are addressed to ensure a growth in confidence in sophisticated electronic controls for aircraft turbine engines. The need for a control system architecture that permits propulsion controls to be functionally integrated with other aircraft systems is established. Areas of technology studied include: (1) control design methodology; (2) improved modeling and simulation methods; and (3) implementation technologies. Objectives, results and future thrusts are summarized.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayer, B. P.; Valdez, C. A.; DeHope, A. J.
Critical to many modern forensic investigations is the chemical attribution of the origin of an illegal drug. This process greatly relies on identification of compounds indicative of its clandestine or commercial production. The results of these studies can yield detailed information on method of manufacture, sophistication of the synthesis operation, starting material source, and final product. In the present work, chemical attribution signatures (CAS) associated with the synthesis of the analgesic 3-methylfentanyl, N-(3-methyl-1-phenethylpiperidin-4-yl)-N-phenylpropanamide, were investigated. Six synthesis methods were studied in an effort to identify and classify route-specific signatures. These methods were chosen to minimize the use of scheduled precursors, complicated laboratory equipment, number of overall steps, and demanding reaction conditions. Using gas and liquid chromatographies combined with mass spectrometric methods (GC-QTOF and LC-QTOF) in conjunction with inductively coupled plasma mass spectrometry (ICP-MS), over 240 distinct compounds and elements were monitored. As seen in our previous work with CAS of fentanyl synthesis, the complexity of the resultant data matrix necessitated the use of multivariate statistical analysis. Using partial least squares discriminant analysis (PLS-DA), 62 statistically significant, route-specific CAS were identified. Statistical classification models using a variety of machine learning techniques were then developed with the ability to predict the method of 3-methylfentanyl synthesis from three blind crude samples generated by synthetic chemists without prior experience with these methods.
Metsalu, Tauno; Vilo, Jaak
2015-01-01
The Principal Component Analysis (PCA) is a widely used method of reducing the dimensionality of high-dimensional data, often followed by visualizing two of the components on the scatterplot. Although widely used, the method is lacking an easy-to-use web interface that scientists with little programming skills could use to make plots of their own data. The same applies to creating heatmaps: it is possible to add conditional formatting for Excel cells to show colored heatmaps, but for more advanced features such as clustering and experimental annotations, more sophisticated analysis tools have to be used. We present a web tool called ClustVis that aims to have an intuitive user interface. Users can upload data from a simple delimited text file that can be created in a spreadsheet program. It is possible to modify data processing methods and the final appearance of the PCA and heatmap plots by using drop-down menus, text boxes, sliders etc. Appropriate defaults are given to reduce the time needed by the user to specify input parameters. As an output, users can download PCA plot and heatmap in one of the preferred file formats. This web server is freely available at http://biit.cs.ut.ee/clustvis/. PMID:25969447
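ClustVis is a web tool with an R back end; a rough, hedged equivalent of its two outputs in Python (scikit-learn for the PCA, matplotlib for the plots, synthetic data standing in for an uploaded table) looks like this.

    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # Stand-in data matrix: 40 samples x 200 features (e.g., expression values), two groups
    rng = np.random.default_rng(5)
    X = np.vstack([rng.normal(0, 1, (20, 200)), rng.normal(1.5, 1, (20, 200))])

    Xs = StandardScaler().fit_transform(X)
    pca = PCA(n_components=2).fit(Xs)
    scores = pca.transform(Xs)

    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
    ax1.scatter(scores[:, 0], scores[:, 1], c=["tab:blue"] * 20 + ["tab:orange"] * 20)
    ax1.set_xlabel(f"PC1 ({pca.explained_variance_ratio_[0]:.0%})")
    ax1.set_ylabel(f"PC2 ({pca.explained_variance_ratio_[1]:.0%})")
    ax2.imshow(Xs[:, :50], aspect="auto", cmap="RdBu_r")    # simple heatmap of the first 50 features
    ax2.set_xlabel("features"); ax2.set_ylabel("samples")
    plt.tight_layout(); plt.savefig("pca_and_heatmap.png")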
Pesticide analysis using nanoceria-coated paper-based devices as a detection platform.
Nouanthavong, Souksanh; Nacapricha, Duangjai; Henry, Charles S; Sameenoi, Yupaporn
2016-03-07
We report the first use of a paper-based device coated with nanoceria as a simple, low-cost and rapid detection platform for the analysis of organophosphate (OP) pesticides using an enzyme inhibition assay with acetylcholinesterase (AChE) and choline oxidase (ChOX). In the presence of acetylcholine, AChE and ChOX catalyze the formation of H2O2, which is detected colorimetrically by a nanoceria-coated device resulting in the formation of a yellow color. After incubation with OP pesticides, the AChE activity was inhibited, producing less H2O2, and a reduction in the yellow intensity. The assay is able to analyze OP pesticides without the use of sophisticated instruments and gives detection limits of 18 ng mL(-1) and 5.3 ng mL(-1) for methyl-paraoxon and chlorpyrifos-oxon, respectively. The developed method was successfully applied to detect methyl-paraoxon in spiked vegetables (cabbage) and a dried seafood product (dried green mussel), obtaining ∼95% recovery values for both sample types. The spiked samples were also analyzed using LC-MS/MS as a comparison to the developed method and similar values were obtained, indicating that the developed method gives accurate results and is suitable for OP analysis in real samples.
NASA Technical Reports Server (NTRS)
1986-01-01
The University of Georgia used NASTRAN, a COSMIC program that predicts how a design will stand up under stress, to develop a model for monitoring the transient cooling of vegetables. The winter use of passive solar heating for poultry houses is also under investigation by the Agricultural Engineering Dept. Another study involved thermal analysis of black and green nursery containers. The use of NASTRAN has encouraged student appreciation of sophisticated computer analysis.
Factorization method of quadratic template
NASA Astrophysics Data System (ADS)
Kotyrba, Martin
2017-07-01
Multiplication of two numbers is a one-way function in mathematics. Any attempt to decompose the product back into its factors is called factorization. There are many methods, such as Fermat's factorization, Dixon's method, the quadratic sieve and GNFS, which use sophisticated techniques for fast factorization. All the above methods use the same basic formula, differing only in its use. This article discusses a newly designed factorization method. An efficient software implementation of this method is not the aim here; the article only presents and clearly defines its properties.
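The newly designed formula itself is not given in the abstract; for comparison with the classical methods it cites, the sketch below implements Fermat's factorization, which searches for a representation n = a^2 - b^2 = (a - b)(a + b).

    import math

    def fermat_factor(n):
        """Factor an odd composite n by finding a, b with n = a^2 - b^2."""
        a = math.isqrt(n)
        if a * a < n:
            a += 1
        while True:
            b2 = a * a - n
            b = math.isqrt(b2)
            if b * b == b2:            # a^2 - n is a perfect square, so n = (a-b)(a+b)
                return a - b, a + b
            a += 1

    print(fermat_factor(5959))      # 59 * 101
    print(fermat_factor(999919))    # 991 * 1009; factors close to sqrt(n) terminate quickly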
Object Markers at Narrow Bridges on Low Volume Rural Roadways
DOT National Transportation Integrated Search
1998-09-01
Bridges are a necessary part of any roadway system. Their construction requires a more sophisticated engineering design analysis and a higher construction cost than that for the roadways connecting them. Until relatively recently, bridge width on low...
Technical Note: Detection of gas bubble leakage via correlation of water column multibeam images
NASA Astrophysics Data System (ADS)
Schneider von Deimling, J.; Papenberg, C.
2011-07-01
Hydroacoustic detection of natural gas release from the seafloor has been conducted in the past by using singlebeam echosounders. In contrast, modern multibeam swath mapping systems allow much wider coverage, higher resolution, and offer 3-D spatial correlation. However, up to the present, the extremely high data rate has hampered water column backscatter investigations. More sophisticated visualization and processing techniques for water column backscatter analysis are still under development. We here present such water column backscattering data gathered with a 50 kHz prototype multibeam system. Water column backscattering data are presented in video frames grabbed over 75 s and in a "re-sorted" singlebeam presentation. Thus, individual gas bubbles rising from the 24 m deep seafloor clearly emerge in the acoustic images and rise velocities can be determined. A sophisticated processing scheme is introduced to identify those rising gas bubbles in the hydroacoustic data. It applies a cross-correlation technique similar to that used in Particle Imaging Velocimetry (PIV) to the acoustic backscatter images. Tempo-spatial drift patterns of the bubbles are assessed and match measured and theoretical rise patterns very well. The application of this processing scheme to our field data gives impressive results with respect to unambiguous bubble detection and remote bubble rise velocimetry. The method can identify and exclude the main driver of misinterpretations, i.e. fish-mediated echoes. Even though image-based cross-correlation techniques are well known in the field of fluid mechanics for high-resolution and non-invasive current flow field analysis, this technique has never been applied in the proposed sense as an acoustic bubble detector.
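The PIV-style step described above can be illustrated with a minimal frame-to-frame displacement estimate; the hedged sketch below uses FFT-based cross-correlation on two synthetic backscatter images in which a bright blob rises by three pixels (image sizes, noise level, and the single-blob scene are assumptions).

    import numpy as np

    def displacement_by_xcorr(frame_a, frame_b):
        """Integer-pixel shift d such that frame_b(y) ~ frame_a(y - d),
        estimated from the peak of the FFT-based cross-correlation."""
        fa = np.fft.fft2(frame_a - frame_a.mean())
        fb = np.fft.fft2(frame_b - frame_b.mean())
        xcorr = np.fft.ifft2(np.conj(fa) * fb).real
        peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
        # Map the peak position to a signed shift (account for FFT wrap-around)
        return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, xcorr.shape))

    # Synthetic "bubble": a bright blob that rises by 3 pixels between two pings
    y, x = np.mgrid[0:128, 0:128]
    blob = lambda cy, cx: np.exp(-((y - cy) ** 2 + (x - cx) ** 2) / 20.0)
    rng = np.random.default_rng(0)
    frame1 = blob(90, 64) + 0.05 * rng.standard_normal((128, 128))
    frame2 = blob(87, 64) + 0.05 * rng.standard_normal((128, 128))

    print("estimated (row, col) shift:", displacement_by_xcorr(frame1, frame2))  # about (-3, 0)

Dividing the estimated vertical shift by the ping interval gives a remote estimate of the bubble rise velocity.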
Learning cardiopulmonary resuscitation skills: does the type of mannequin make a difference?
Noordergraaf, G J; Van Gelder, J M; Van Kesteren, R G; Diets, R F; Savelkoul, T J
1997-12-01
Resuscitation (CPR) courses stress acquisition of psychomotor skills. The number of mannequins may limit the 'hands-on' time available for each trainee to practise CPR and impede acquisition of skill. This may occur because expensive, sophisticated mannequins are favoured over individual, simple mannequins. In a blind, prospective, controlled study we compared one-rescuer CPR skills of 165 trainees in two cohorts using their own individual light-weight torso mannequins (Actar 911 and Laerdal Little Anne) and a control cohort with four to five trainees sharing a sophisticated mannequin (Laerdal Recording Resusci Anne). No major significant differences (p = 0.18) were found when using the 'Berden scoring system'. Both the Actar 911 and the Little Anne were compatible with the Recording Resusci Anne. Trainees preferred the individual mannequins. We conclude that the results indicate that the use of individual mannequins in conjunction with a sophisticated mannequin neither results in trainees learning incorrect skills nor in significant improvement. Further analysis of the actual training in lay person CPR training courses and evaluation of course didactics to optimize training time appear indicated.
Chotimah, Chusnul; Sudjadi; Riyanto, Sugeng; Rohman, Abdul
2015-01-01
Purpose: Analysis of drugs in multicomponent systems is officially carried out using chromatographic techniques; however, these techniques are laborious and involve sophisticated instruments. Therefore, UV-VIS spectrophotometry coupled with multivariate calibration by partial least squares (PLS) for quantitative analysis of metamizole, thiamin and pyridoxin was developed in the presence of cyanocobalamine without any separation step. Methods: Calibration and validation samples were prepared. The calibration model was built from a series of sample mixtures containing these drugs in defined proportions. Cross validation of the calibration samples using the leave-one-out technique was used to identify the smaller set of components that provides the greatest predictive ability. The evaluation of the calibration model was based on the coefficient of determination (R2) and the root mean square error of calibration (RMSEC). Results: The results showed that the coefficient of determination (R2) for the relationship between actual and predicted values for all studied drugs was higher than 0.99, indicating good accuracy. The RMSEC values obtained were relatively low, indicating good precision. The accuracy and precision of the developed method showed no significant difference compared with those obtained by the official HPLC method. Conclusion: The developed method (UV-VIS spectrophotometry in combination with PLS) was successfully used for analysis of metamizole, thiamin and pyridoxin in tablet dosage form. PMID:26819934
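The PLS calibration pattern described above, mapping full spectra to analyte concentrations, can be sketched with scikit-learn; the wavelength grid, band shapes, noise level, and the leave-one-out prediction below are illustrative assumptions, not the study's actual data.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import LeaveOneOut, cross_val_predict

    # Synthetic calibration set: 30 mixtures x 200 wavelengths, 3 analytes
    rng = np.random.default_rng(11)
    wavelengths = np.linspace(220, 400, 200)
    pure = np.stack([np.exp(-(wavelengths - c) ** 2 / (2 * 15 ** 2)) for c in (260, 300, 340)])
    conc = rng.uniform(0.1, 1.0, size=(30, 3))                       # relative concentrations
    spectra = conc @ pure + 0.01 * rng.standard_normal((30, 200))    # Beer-Lambert-like mixing

    pls = PLSRegression(n_components=3)
    pred = cross_val_predict(pls, spectra, conc, cv=LeaveOneOut())   # leave-one-out predictions
    rmse = np.sqrt(np.mean((pred - conc) ** 2, axis=0))
    print("leave-one-out RMSE per analyte:", np.round(rmse, 3))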
NASA Astrophysics Data System (ADS)
Kalchenko, Vyacheslav; Molodij, Guillaume; Kuznetsov, Yuri; Smolyakov, Yuri; Israeli, David; Meglinski, Igor; Harmelin, Alon
2016-03-01
The use of fluorescence imaging of vascular permeability has become a gold standard for assessing the inflammation process during experimental immune responses in vivo. Optical fluorescence imaging provides a very useful and simple tool for this purpose. The motivation comes from the necessity of a robust and simple quantification and data presentation of inflammation based on vascular permeability. The change in fluorescence intensity as a function of time is a widely accepted measure of vascular permeability during inflammation related to the immune response. In the present study we propose to bring a new dimension by applying a more sophisticated approach to the analysis of the vascular reaction, using a quantitative analysis based on methods derived from astronomical observations, in particular a space-time Fourier filtering analysis followed by a polynomial orthogonal mode decomposition. We demonstrate that the temporal evolution of the fluorescence intensity observed at certain pixels correlates quantitatively with blood flow circulation under normal conditions. The approach allows us to determine the regions of permeability and to monitor both the fast kinetics related to the distribution of the contrast material in the circulatory system and the slow kinetics associated with extravasation of the contrast material. Thus, we introduce a simple and convenient method for fast quantitative visualization of the leakage related to the inflammatory (immune) reaction in vivo.
Sampling and sample processing in pesticide residue analysis.
Lehotay, Steven J; Cook, Jo Marie
2015-05-13
Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.
Interrupted time-series analysis: studying trends in neurosurgery.
Wong, Ricky H; Smieliauskas, Fabrice; Pan, I-Wen; Lam, Sandi K
2015-12-01
OBJECT Neurosurgery studies traditionally have evaluated the effects of interventions on health care outcomes by studying overall changes in measured outcomes over time. Yet, this type of linear analysis is limited due to lack of consideration of the trend's effects both pre- and postintervention and the potential for confounding influences. The aim of this study was to illustrate interrupted time-series analysis (ITSA) as applied to an example in the neurosurgical literature and highlight ITSA's potential for future applications. METHODS The methods used in previous neurosurgical studies were analyzed and then compared with the methodology of ITSA. RESULTS The ITSA method was identified in the neurosurgical literature as an important technique for isolating the effect of an intervention (such as a policy change or a quality and safety initiative) on a health outcome independent of other factors driving trends in the outcome. The authors determined that ITSA allows for analysis of the intervention's immediate impact on outcome level and on subsequent trends and enables a more careful measure of the causal effects of interventions on health care outcomes. CONCLUSIONS ITSA represents a significant improvement over traditional observational study designs in quantifying the impact of an intervention. ITSA is a useful statistical procedure to understand, consider, and implement as the field of neurosurgery evolves in sophistication in big-data analytics, economics, and health services research.
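A minimal sketch of the core ITSA model, a segmented regression with terms for the pre-intervention trend, the immediate level change and the post-intervention change in trend, is shown below using statsmodels. The monthly series and intervention point are invented; a real analysis would also handle autocorrelation (for example with Newey-West errors or ARIMA terms).

```python
# Minimal interrupted time-series (segmented regression) sketch on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
months = np.arange(48)
intervention_month = 24
outcome = 50 + 0.2 * months - 5 * (months >= intervention_month) + rng.normal(0, 2, 48)

df = pd.DataFrame({
    "time": months,                                               # secular trend
    "post": (months >= intervention_month).astype(int),           # level change at intervention
    "time_since": np.clip(months - intervention_month, 0, None),  # slope change after it
    "outcome": outcome,
})
model = smf.ols("outcome ~ time + post + time_since", data=df).fit()
print(model.summary())   # 'post' = immediate level shift, 'time_since' = trend change
```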
DARKDROID: Exposing the Dark Side of Android Marketplaces
2016-06-01
Moreover, our approaches can detect apps containing both intentional and unintentional vulnerabilities, such as unsafe code loading mechanisms. Keywords: Security, Static Analysis, Dynamic Analysis, Malware Detection, Vulnerability Scanning. Report sections include analysis of applications in a DoD context and the development of sophisticated whole-system static analyses to detect malicious Android applications.
The Recoverability of P-Technique Factor Analysis
ERIC Educational Resources Information Center
Molenaar, Peter C. M.; Nesselroade, John R.
2009-01-01
It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…
ERIC Educational Resources Information Center
Schagen, Ian; Schagen, Sandie
2005-01-01
The advent of large-scale matched data sets, linking pupils' attainment across key stages, gives new opportunities to explore the effects of school organisational factors on pupil performance. Combined with currently available sophisticated and efficient software for multilevel analysis, it offers educational researchers the chance to develop…
ERIC Educational Resources Information Center
Ammentorp, William
There is much to be gained by using systems analysis in educational administration. Most administrators, presently relying on classical statistical techniques restricted to problems having few variables, should be trained to use more sophisticated tools such as systems analysis. The systems analyst, interested in the basic processes of a group or…
EXPOSURE ASSESSMENT IN THE NATIONAL CHILDREN'S STUDY-INTRODUCTION
The science of exposure assessment is relatively new and evolving rapidly with the advancement of sophisticated methods for specific measurements at the picogram per gram level or lower in a variety of environmental and biologic matrices. Without this measurement capability, envi...
Handbook of automated data collection methods for the National Transit Database
DOT National Transportation Integrated Search
2003-10-01
In recent years, with the increasing sophistication and capabilities of information processing technologies, there has been a renewed interest on the part of transit systems to tap the rich information potential of the National Transit Database (NTD)...
ERIC Educational Resources Information Center
Moraes, Edgar P.; da Silva, Nilbert S. A.; de Morais, Camilo de L. M.; das Neves, Luiz S.; de Lima, Kassio M. G.
2014-01-01
The flame test is a classical analytical method that is often used to teach students how to identify specific metals. However, some universities in developing countries have difficulties acquiring the sophisticated instrumentation needed to demonstrate how to identify and quantify metals. In this context, a method was developed based on the flame…
Theory of Mind: Did Evolution Fool Us?
Devaine, Marie; Hollard, Guillaume; Daunizeau, Jean
2014-01-01
Theory of Mind (ToM) is the ability to attribute mental states (e.g., beliefs and desires) to other people in order to understand and predict their behaviour. If others are rewarded to compete or cooperate with you, then what they will do depends upon what they believe about you. This is why social interaction induces recursive ToM, of the sort “I think that you think that I think, etc.”. Critically, recursion is the common notion behind the definitions of sophistication in human language, strategic thinking in games, and, arguably, ToM. Although sophisticated ToM is believed to have high adaptive fitness, broad experimental evidence from behavioural economics, experimental psychology and linguistics points towards limited recursivity in representing others' beliefs. In this work, we test whether this apparent limitation may in fact be adaptive, i.e. optimal in an evolutionary sense. First, we propose a meta-Bayesian approach that can predict the behaviour of ToM sophistication phenotypes who engage in social interactions. Second, we measure their adaptive fitness using evolutionary game theory. Our main contribution is to show that one does not have to appeal to biological costs to explain our limited ToM sophistication. In fact, the evolutionary cost/benefit ratio of ToM sophistication is nontrivial. This is partly because an informational cost prevents highly sophisticated ToM phenotypes from fully exploiting less sophisticated ones (in a competitive context). In addition, cooperation surprisingly favours lower levels of ToM sophistication. Taken together, these quantitative corollaries of the “social Bayesian brain” hypothesis provide an evolutionary account of both the limitation of ToM sophistication in humans and the persistence of low ToM sophistication levels. PMID:24505296
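To illustrate the evolutionary game theory step in spirit, the sketch below iterates replicator dynamics over three hypothetical ToM sophistication levels. The payoff matrix is invented for illustration only; in the paper, payoffs come from meta-Bayesian simulations of interacting phenotypes.

```python
# Hedged illustration: replicator dynamics over ToM sophistication levels.
import numpy as np

payoff = np.array([              # payoff[i, j] = invented fitness of level-i against level-j
    [2.0, 1.5, 1.4],
    [2.2, 1.8, 1.6],
    [2.1, 1.9, 1.7],
])

x = np.array([1/3, 1/3, 1/3])    # initial frequencies of ToM levels 0, 1, 2
dt = 0.01
for _ in range(20000):
    fitness = payoff @ x
    mean_fitness = x @ fitness
    x = x + dt * x * (fitness - mean_fitness)   # replicator equation
    x = np.clip(x, 0, None); x = x / x.sum()
print("equilibrium frequencies:", np.round(x, 3))
```

Whether a low-sophistication level persists at equilibrium then depends entirely on the payoff structure, which is the quantity the paper derives rather than assumes.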
Chen, Hongyu; Martin, Bronwen; Daimon, Caitlin M; Maudsley, Stuart
2013-01-01
Text mining is rapidly becoming an essential technique for the annotation and analysis of large biological data sets. Biomedical literature currently increases at a rate of several thousand papers per week, making automated information retrieval methods the only feasible method of managing this expanding corpus. With the increasing prevalence of open-access journals and constant growth of publicly-available repositories of biomedical literature, literature mining has become much more effective with respect to the extraction of biomedically-relevant data. In recent years, text mining of popular databases such as MEDLINE has evolved from basic term-searches to more sophisticated natural language processing techniques, indexing and retrieval methods, structural analysis and integration of literature with associated metadata. In this review, we will focus on Latent Semantic Indexing (LSI), a computational linguistics technique increasingly used for a variety of biological purposes. It is noted for its ability to consistently outperform benchmark Boolean text searches and co-occurrence models at information retrieval and its power to extract indirect relationships within a data set. LSI has been used successfully to formulate new hypotheses, generate novel connections from existing data, and validate empirical data.
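A small sketch of the LSI idea, TF-IDF weighting followed by truncated SVD and cosine similarity in the reduced space, is shown below with scikit-learn. The toy corpus and number of latent dimensions are arbitrary choices for illustration.

```python
# Minimal Latent Semantic Indexing sketch: TF-IDF, truncated SVD, cosine similarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "gene expression profiling of tumor samples",
    "protein interaction networks and gene regulation",
    "clinical outcomes of chemotherapy in sarcoma patients",
    "text mining of biomedical literature abstracts",
]
tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(docs)                 # document-term matrix
lsi = TruncatedSVD(n_components=2, random_state=0)
Z = lsi.fit_transform(X)                      # documents projected into latent topics
print(cosine_similarity(Z))                   # similarities can capture indirect term overlap
```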
A Review of Issues Related to Data Acquisition and Analysis in EEG/MEG Studies
Puce, Aina; Hämäläinen, Matti S.
2017-01-01
Electroencephalography (EEG) and magnetoencephalography (MEG) are non-invasive electrophysiological methods, which record electric potentials and magnetic fields due to electric currents in synchronously-active neurons. With MEG being more sensitive to neural activity from tangential currents and EEG being able to detect both radial and tangential sources, the two methods are complementary. Over the years, neurophysiological studies have changed considerably: high-density recordings are becoming de rigueur; there is interest in both spontaneous and evoked activity; and sophisticated artifact detection and removal methods are available. Improved head models for source estimation have also increased the precision of the current estimates, particularly for EEG and combined EEG/MEG. Because of their complementarity, more investigators are beginning to perform simultaneous EEG/MEG studies to gain more complete information about neural activity. Given the increase in methodological complexity in EEG/MEG, it is important to gather data that are of high quality and that are as artifact free as possible. Here, we discuss some issues in data acquisition and analysis of EEG and MEG data. Practical considerations for different types of EEG and MEG studies are also discussed. PMID:28561761
Experiences with Text Mining Large Collections of Unstructured Systems Development Artifacts at JPL
NASA Technical Reports Server (NTRS)
Port, Dan; Nikora, Allen; Hihn, Jairus; Huang, LiGuo
2011-01-01
Often repositories of systems engineering artifacts at NASA's Jet Propulsion Laboratory (JPL) are so large and poorly structured that they have outgrown our capability to process their contents manually and extract useful information. Sophisticated text mining methods and tools seem a quick, low-effort approach to automating our limited manual efforts. Our experience with exploring such methods, mainly in three areas (historical risk analysis, defect identification based on requirements analysis, and over-time analysis of system anomalies at JPL), has shown that obtaining useful results requires substantial unanticipated effort, from preprocessing the data to transforming the output for practical applications. We have not observed any quick 'wins' or realized benefit from short-term effort avoidance through automation in this area. Surprisingly, we have realized a number of unexpected long-term benefits from the process of applying text mining to our repositories. This paper elaborates some of these benefits and the important lessons we learned while preparing and applying text mining to large unstructured system artifacts at JPL, aiming to benefit future text mining applications in similar problem domains and, we hope, in broader areas of application.
Earth Science Informatics Comes of Age
NASA Technical Reports Server (NTRS)
Jodha, Siri; Khalsa, S.; Ramachandran, Rahul
2014-01-01
The volume and complexity of Earth science data have steadily increased, placing ever-greater demands on researchers, software developers and data managers tasked with handling such data. Additional demands arise from requirements being levied by funding agencies and governments to better manage, preserve and provide open access to data. Fortunately, over the past 10-15 years significant advances in information technology, such as increased processing power, advanced programming languages, more sophisticated and practical standards, and near-ubiquitous internet access have made the jobs of those acquiring, processing, distributing and archiving data easier. These advances have also led to an increasing number of individuals entering the field of informatics as it applies to Geoscience and Remote Sensing. Informatics is the science and technology of applying computers and computational methods to the systematic analysis, management, interchange, and representation of data, information, and knowledge. Informatics also encompasses the use of computers and computational methods to support decisionmaking and other applications for societal benefits.
Heyll, Uwe
2012-06-01
The method of electro-hyperthermia is based on the production of alternating currents from capacitively coupled electrodes. Because of the associated heating of body tissues, electro-hyperthermia is promoted as an alternative to the more sophisticated methods of scientific hyperthermia, which are used in oncologic diseases. Analysis of the technical data, however, reveals that electro-hyperthermia is not capable of focused, effective and therapeutically useful heating of circumscribed target areas. Data from clinical studies demonstrating efficacy for defined indications are not available. The application of electro-hyperthermia is excluded from the German system of public health insurance. As proof of medical necessity cannot be provided, there is also no claim for reimbursement from private health insurance. According to legal regulations in Germany, billing it as a hyperthermia treatment is usually not possible; rather, an item from the electrotherapy section of the official schedule of medical fees (GOA) has to be chosen.
ERIC Educational Resources Information Center
Gibson, Walker
1993-01-01
Discusses the thinking of the Greek Sophist philosophers, particularly Gorgias and Protagoras, and their importance and relevance for contemporary English instructors. Considers the problem of language as signs of reality in the context of Sophist philosophy. (HB)
Flow cytometry: basic principles and applications.
Adan, Aysun; Alizada, Günel; Kiraz, Yağmur; Baran, Yusuf; Nalbant, Ayten
2017-03-01
Flow cytometry is a sophisticated technique in which an instrument measures multiple physical characteristics of a single cell, such as size and granularity, simultaneously as the cell flows in suspension through a measuring device. Its operation depends on the light-scattering features of the cells under investigation, which may be derived from dyes or monoclonal antibodies targeting either extracellular molecules located on the surface or intracellular molecules inside the cell. This approach makes flow cytometry a powerful tool for detailed analysis of complex populations in a short period of time. This review covers the general principles and selected applications of flow cytometry, such as immunophenotyping of peripheral blood cells, analysis of apoptosis and detection of cytokines. Additionally, this report provides a basic understanding of flow cytometry technology essential for all users, as well as the methods used to analyze and interpret the data. Moreover, recent progress in flow cytometry is discussed in order to give an opinion about the future importance of this technology.
Improving Efficiency in Multi-Strange Baryon Reconstruction in d-Au at STAR
NASA Astrophysics Data System (ADS)
Leight, William
2003-10-01
We report preliminary multi-strange baryon measurements for d-Au collisions recorded at RHIC by the STAR experiment. After using classical topological analysis, in which cuts for each discriminating variable are adjusted by hand, we investigate improvements in signal-to-noise optimization using Linear Discriminant Analysis (LDA). LDA is an algorithm for finding, in the n-dimensional space of the n discriminating variables, the axis on which the signal and noise distributions are most separated. LDA is the first step in moving towards more sophisticated techniques for signal-to-noise optimization, such as Artificial Neural Nets. Due to the relatively low background and sufficiently high yields of d-Au collisions, they form an ideal system to study these possibilities for improving reconstruction methods. Such improvements will be extremely important for forthcoming Au-Au runs in which the size of the combinatoric background is a major problem in reconstruction efforts.
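The sketch below illustrates the LDA step on invented topological variables: it fits a linear discriminant to labeled signal and background samples and projects candidates onto the most discriminating axis. Variable names, distributions and sample sizes are placeholders, not the STAR analysis.

```python
# Illustrative sketch: LDA to find the axis that best separates signal from
# combinatoric background in the space of topological cut variables.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)
# columns: e.g. decay length, DCA to primary vertex, pointing angle (arbitrary units)
signal = rng.normal([5.0, 0.5, 0.02], [1.0, 0.2, 0.01], size=(1000, 3))
background = rng.normal([2.0, 1.5, 0.10], [1.0, 0.5, 0.05], size=(5000, 3))

X = np.vstack([signal, background])
y = np.concatenate([np.ones(len(signal)), np.zeros(len(background))])

lda = LinearDiscriminantAnalysis().fit(X, y)
scores = X @ lda.coef_.ravel()   # projection onto the most discriminating axis
# a single cut on `scores` then replaces hand-tuned cuts on each variable separately
```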
Reyhani, Mitra; Kazemi, Ashraf; Keshvari, Mahrokh
2018-02-02
The present study was conducted to determine middle-aged women's perceptions of reproductive changes. It was a qualitative study with a content analysis approach. The participants were 30 middle-aged women whose perceptions of reproductive changes were collected through in-depth semi-structured interviews. The data were analyzed using Graneheim and Lundman's inductive content analysis method. The main themes extracted from the data were a sense of "fall" and "the beginning of a new life cycle." The feeling of fall was formed from the subthemes "deterioration of youth," "the dusk of femininity," and "fade-out of the gender roles." The theme "beginning of a new life cycle" was formed from the subthemes "acceptance," "sophistication," and "maturity." Middle-aged women experienced a wide range of emotions in response to the reproductive changes, ranging from a feeling of decline to one of excellence and rise.
Of bugs and birds: Markov Chain Monte Carlo for hierarchical modeling in wildlife research
Link, W.A.; Cam, E.; Nichols, J.D.; Cooch, E.G.
2002-01-01
Markov chain Monte Carlo (MCMC) is a statistical innovation that allows researchers to fit far more complex models to data than is feasible using conventional methods. Despite its widespread use in a variety of scientific fields, MCMC appears to be underutilized in wildlife applications. This may be due to a misconception that MCMC requires the adoption of a subjective Bayesian analysis, or perhaps simply to its lack of familiarity among wildlife researchers. We introduce the basic ideas of MCMC and software BUGS (Bayesian inference using Gibbs sampling), stressing that a simple and satisfactory intuition for MCMC does not require extraordinary mathematical sophistication. We illustrate the use of MCMC with an analysis of the association between latent factors governing individual heterogeneity in breeding and survival rates of kittiwakes (Rissa tridactyla). We conclude with a discussion of the importance of individual heterogeneity for understanding population dynamics and designing management plans.
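To make the "simple intuition" point concrete, the sketch below implements a bare-bones random-walk Metropolis sampler for a toy binomial model of breeding success. The data and flat prior are invented; a real analysis in BUGS would typically be hierarchical.

```python
# Minimal Metropolis sampler for a binomial success probability with a flat prior.
import numpy as np

rng = np.random.default_rng(3)
successes, trials = 34, 60                      # invented breeding-success data

def log_posterior(p):
    if not 0.0 < p < 1.0:
        return -np.inf
    return successes * np.log(p) + (trials - successes) * np.log(1.0 - p)  # + flat prior

samples, p = [], 0.5
for _ in range(20000):
    proposal = p + rng.normal(0, 0.05)          # random-walk proposal
    if np.log(rng.random()) < log_posterior(proposal) - log_posterior(p):
        p = proposal                            # accept; otherwise keep current value
    samples.append(p)

posterior = np.array(samples[5000:])            # discard burn-in
print("posterior mean:", posterior.mean())
print("95% interval:", np.percentile(posterior, [2.5, 97.5]))
```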
Martins, Marina C M; Caldana, Camila; Wolf, Lucia Daniela; de Abreu, Luis Guilherme Furlan
2018-01-01
The output of metabolomics relies to a great extent upon the methods and instrumentation to identify, quantify, and access spatial information on as many metabolites as possible. However, the most modern machines and sophisticated tools for data analysis cannot compensate for inappropriate harvesting and/or sample preparation procedures that modify metabolic composition and can lead to erroneous interpretation of results. In addition, plant metabolism has a remarkable degree of complexity, and the number of identified compounds easily surpasses the number of samples in metabolomics analyses, increasing false discovery risk. These aspects pose a large challenge when carrying out plant metabolomics experiments. In this chapter, we address the importance of a proper experimental design taking into consideration preventable complications and unavoidable factors to achieve success in metabolomics analysis. We also focus on quality control and standardized procedures during the metabolomics workflow.
Web-accessible cervigram automatic segmentation tool
NASA Astrophysics Data System (ADS)
Xue, Zhiyun; Antani, Sameer; Long, L. Rodney; Thoma, George R.
2010-03-01
Uterine cervix image analysis is of great importance to the study of uterine cervix cancer, which is among the leading cancers affecting women worldwide. In this paper, we describe our proof-of-concept, Web-accessible system for automated segmentation of significant tissue regions in uterine cervix images, which also demonstrates our research efforts toward promoting collaboration between engineers and physicians for medical image analysis projects. Our design and implementation unifies the merits of two commonly used languages, MATLAB and Java. It circumvents the heavy workload of recoding the sophisticated segmentation algorithms originally developed in MATLAB into Java while allowing remote users who are not experienced programmers and algorithms developers to apply those processing methods to their own cervicographic images and evaluate the algorithms. Several other practical issues of the systems are also discussed, such as the compression of images and the format of the segmentation results.
NASA Astrophysics Data System (ADS)
Gloster, Jonathan; Diep, Michael; Dredden, David; Mix, Matthew; Olsen, Mark; Price, Brian; Steil, Betty
2014-06-01
Small-to-medium sized businesses lack resources to deploy and manage high-end advanced solutions to deter sophisticated threats from well-funded adversaries, but evidence shows that these types of businesses are becoming key targets. As malicious code and network attacks become more sophisticated, classic signature-based virus and malware detection methods are less effective. To augment current malware detection methods, we developed a proactive approach to detect emerging malware threats using open source tools and intelligence to discover patterns and behaviors of malicious attacks and adversaries. Technical and analytical skills are combined to track adversarial behavior, methods and techniques. We established a controlled (separated domain) network to identify, monitor, and track malware behavior to increase understanding of the methods and techniques used by cyber adversaries. We created a suite of tools that observe network and system performance, looking for anomalies that may be caused by malware. The toolset collects information from open-source tools and provides meaningful indicators that the system is under or has been attacked. When malware was discovered, we analyzed and reverse engineered it to determine how it could be detected and prevented. Results have shown that, with minimal resources, cost-effective capabilities can be developed to detect abnormal behavior that may indicate malicious software.
High deductible health plans: does cost sharing stimulate increased consumer sophistication?
Gupta, Neal; Polsky, Daniel
2015-06-01
To determine whether increased cost sharing in health insurance plans induces higher levels of consumer sophistication in a non-elderly population. This analysis is based on survey and demographic data collected from enrollees in the RAND health insurance experiment (HIE). During the RAND HIE, enrollees were randomly assigned to different levels of cost sharing (0, 25, 50 and 95%). The study population comprises about 2000 people enrolled in the RAND HIE between the years 1974 and 1982. Effects on health-care decision making were measured using the results of a standardized questionnaire administered at the beginning and end of the experiment. Points of enquiry included whether or not enrollees (i) recognized the need for second opinions, (ii) questioned the effectiveness of certain therapies and (iii) researched the background/skill of their medical providers. Consumer sophistication was also measured for regular health-care consumers, as indicated by the presence of a chronic disease. We found no statistically significant changes (P < 0.05) in health-care decision-making strategies between individuals randomized to high cost sharing plans and low cost sharing plans. Furthermore, we did not find a stronger effect for patients with a chronic disease. The evidence from the RAND HIE does not support the hypothesis that a higher level of cost sharing incentivizes the development of consumer sophistication. As a result, cost sharing alone will not prompt individuals to become more selective in their health-care decision-making. © 2012 Blackwell Publishing Ltd.
Science Language Accommodation in Elementary School Read-Alouds
NASA Astrophysics Data System (ADS)
Glass, Rory; Oliveira, Alandeom W.
2014-03-01
This study examines the pedagogical functions of accommodation (i.e. provision of simplified science speech) in science read-aloud sessions facilitated by five elementary teachers. We conceive of read-alouds as communicative events wherein teachers, faced with the task of orally delivering a science text of relatively high linguistic complexity, open up an alternate channel of communication, namely oral discussion. By doing so, teachers grant students access to a simplified linguistic input, a strategy designed to promote student comprehension of the textual contents of children's science books. It was found that nearly half (46%) of the read-aloud time was allotted to discussions with an increased percentage of less sophisticated words and reduced use of more sophisticated vocabulary than found in the books through communicative strategies such as simplified rewording, simplified definition, and simplified questioning. Further, aloud reading of more linguistically complex books required longer periods of discussion and an increased degree of teacher oral input and accommodation. We also found evidence of reversed simplification (i.e. sophistication), leading to student uptake of scientific language. The main significance of this study is that it reveals that teacher talk serves two often competing pedagogical functions (accessible communication of scientific information to students and promotion of student acquisition of the specialized language of science). It also underscores the importance of giving analytical consideration to the simplification-sophistication dimension of science classroom discourse as well as the potential of computer-based analysis of classroom discourse to inform science teaching.
Badiee, Parisa; Nejabat, Mahmood; Alborzi, Abdolvahab; Keshavarz, Fatemeh; Shakiba, Elaheh
2010-01-01
This study seeks to evaluate the efficacy and practicality of the molecular method, compared to the standard microbiological techniques for diagnosing fungal keratitis (FK). Patients with eye findings suspected of FK were enrolled for cornea sampling. Scrapings from the affected areas of the infected corneas were obtained and were divided into two parts: one for smears and cultures, and the other for nested PCR analysis. Of the 38 eyes, 28 were judged to have fungal infections based on clinical and positive findings in the culture, smear and responses to antifungal treatment. Potassium hydroxide, Gram staining, culture and nested PCR results (either positive or negative) matched in 76.3, 42.1, 68.4 and 81.6%, respectively. PCR is a sensitive method but due to the lack of sophisticated facilities in routine laboratory procedures, it can serve only complementarily and cannot replace conventional methods. Copyright © 2010 S. Karger AG, Basel.
Creation of smart composites using an embroidery machine
NASA Astrophysics Data System (ADS)
Torii, Nobuhiro; Oka, Kosuke; Ikeda, Tadashige
2016-04-01
A smart composite with functional fibers and reinforcement fibers optimally placed with an embroidery machine was created. Fiber orientation significantly affects the mechanical properties of composite laminates. Accordingly, if the fibers can be placed along a desired curved path, fiber reinforced plastic (FRP) structures can be designed to be lighter and more sophisticated. To this end, a tailored fiber placement method using the embroidery machine has been studied. To add functions to the FRP structures, shape memory alloy (SMA) wires were placed as functional fibers. First, for a given purpose, the paths of the reinforcement fibers and the SMA wires were simultaneously optimized in analysis. Next, the reinforcement fibers and tubes containing the SMA wires were placed on fabrics using the embroidery machine, and this fabric was impregnated with resin using the vacuum assisted resin transfer molding method. The smart composite was activated by applying voltage to the SMA wires. Fundamental properties of the smart composite were examined and the feasibility of the proposed fabrication method was shown.
Pérez Aparicio, Jesús; Toledano Medina, M Angeles; Lafuente Rosales, Victoria
2007-07-09
Free-choice profile (FCP), developed in the 1980s, is a sensory analysis method that can be carried out by untrained panels. The participants need only be able to use a scale and be consumers of the product under evaluation. The data are analysed by sophisticated statistical methodologies like Generalized Procrustes Analysis (GPA) or STATIS. To facilitate wider use of the free-choice profiling procedure, different authors have advocated simpler methods based on principal components analysis (PCA) of merged data sets. The purpose of this work was to apply another easy procedure to this type of data by means of a robust PCA. The most important characteristic of the proposed method is that managers responsible for quality could use this methodology without any scale evaluation. Only the free terms generated by the assessors are necessary to apply the script, thus avoiding the error associated with scale utilization by inexpert assessors. Also, it is possible to use the application with missing data and with differences in the assessors' attendance at sessions. An example was performed to generate the descriptors from different orange juice types. The results were compared with the STATIS method and with PCA on the merged data sets. The samples evaluated were fresh orange juices with differences in storage days, and pasteurized, concentrated and orange nectar drinks from different brands. Eighteen assessors with a low-level training program were used in a six-session free-choice profile framework. The results proved that this script could be of use in marketing decisions and product quality program development.
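A hedged sketch of the simpler PCA-based route is given below: principal components of a merged sample-by-free-term matrix yield a sample map without any scale data. The juice samples, terms and counts are invented for illustration and do not reproduce the robust PCA script described in the paper.

```python
# Sketch: PCA of a merged assessor-term frequency matrix from free-choice profiling.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# rows: juice samples; columns: how many assessors used each free term for that sample
data = pd.DataFrame(
    {"sweet": [12, 4, 9, 2], "bitter": [1, 8, 3, 10],
     "fresh": [14, 2, 6, 1], "cooked": [0, 9, 4, 12]},
    index=["fresh_juice", "pasteurized", "concentrate", "nectar"],
)
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(data.values))
for name, (pc1, pc2) in zip(data.index, scores):
    print(f"{name}: PC1={pc1:.2f}, PC2={pc2:.2f}")   # sample map built from free terms only
```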
Jarboe, G R; Gates, R H; McDaniel, C D
1990-01-01
Healthcare providers of multiple option plans may be confronted with special market segmentation problems. This study demonstrates how cluster analysis may be used for discovering distinct patterns of preference for multiple option plans. The availability of metric, as opposed to categorical or ordinal, data provides the ability to use sophisticated analysis techniques which may be superior to frequency distributions and cross-tabulations in revealing preference patterns.
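As a sketch of the kind of analysis described, the example below clusters invented metric preference ratings with k-means to recover two plan-preference segments. The features, ratings and number of clusters are assumptions for illustration.

```python
# Illustrative sketch: k-means clustering of metric preference ratings into segments.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
# columns: rated importance of premium cost, provider choice, coverage breadth (1-10)
ratings = np.vstack([
    rng.normal([8, 3, 5], 1.0, size=(100, 3)),    # price-sensitive respondents
    rng.normal([3, 8, 7], 1.0, size=(100, 3)),    # choice-oriented respondents
])
X = StandardScaler().fit_transform(ratings)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print("segment sizes:", np.bincount(km.labels_))
print("segment centroids (standardized):", km.cluster_centers_.round(2))
```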
Designs and methods used in published Australian health promotion evaluations 1992-2011.
Chambers, Alana Hulme; Murphy, Kylie; Kolbe, Anthony
2015-06-01
To describe the designs and methods used in published Australian health promotion evaluation articles between 1992 and 2011. Using a content analysis approach, we reviewed 157 articles to analyse patterns and trends in designs and methods in Australian health promotion evaluation articles. The purpose was to provide empirical evidence about the types of designs and methods used. The most common type of evaluation conducted was impact evaluation. Quantitative designs were used exclusively in more than half of the articles analysed. Almost half the evaluations utilised only one data collection method. Surveys were the most common data collection method used. Few articles referred explicitly to an intended evaluation outcome or benefit and references to published evaluation models or frameworks were rare. This is the first time Australian-published health promotion evaluation articles have been empirically investigated in relation to designs and methods. There appears to be little change in the purposes, overall designs and methods of published evaluations since 1992. More methodologically transparent and sophisticated published evaluation articles might be instructional, and even motivational, for improving evaluation practice and result in better public health interventions and outcomes. © 2015 Public Health Association of Australia.
Four Educators in Plato's "Theaetetus"
ERIC Educational Resources Information Center
Mintz, Avi I.
2011-01-01
Scholars who have taken interest in "Theaetetus'" educational theme argue that Plato contrasts an inferior, even dangerous, sophistic education to a superior, philosophical, Socratic education. I explore the contrasting exhortations, methods, ideals and epistemological foundations of Socratic and Protagorean education and suggest that Socrates'…
Conceptualizing Effectiveness in Disability Research
ERIC Educational Resources Information Center
de Bruin, Catriona L.
2017-01-01
Policies promoting evidence-based practice in education typically endorse evaluations of the effectiveness of teaching strategies through specific experimental research designs and methods. A number of researchers have critiqued this approach to evaluation as narrow and called for greater methodological sophistication. This paper discusses the…
Lizier, Joseph T; Heinzle, Jakob; Horstmann, Annette; Haynes, John-Dylan; Prokopenko, Mikhail
2011-02-01
The human brain undertakes highly sophisticated information processing facilitated by the interaction between its sub-regions. We present a novel method for interregional connectivity analysis, using multivariate extensions to the mutual information and transfer entropy. The method allows us to identify the underlying directed information structure between brain regions, and how that structure changes according to behavioral conditions. This method is distinguished in using asymmetric, multivariate, information-theoretical analysis, which captures not only directional and non-linear relationships, but also collective interactions. Importantly, the method is able to estimate multivariate information measures with only relatively little data. We demonstrate the method to analyze functional magnetic resonance imaging time series to establish the directed information structure between brain regions involved in a visuo-motor tracking task. Importantly, this results in a tiered structure, with known movement planning regions driving visual and motor control regions. Also, we examine the changes in this structure as the difficulty of the tracking task is increased. We find that task difficulty modulates the coupling strength between regions of a cortical network involved in movement planning and between motor cortex and the cerebellum which is involved in the fine-tuning of motor control. It is likely these methods will find utility in identifying interregional structure (and experimentally induced changes in this structure) in other cognitive tasks and data modalities.
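For intuition about the quantity being estimated, the sketch below computes a simple plug-in estimate of pairwise transfer entropy on discretized signals with history length 1. The paper's multivariate, collective-interaction extensions and its estimators suited to limited data are not reproduced here.

```python
# Simple plug-in estimator of bivariate transfer entropy (history length 1) in bits.
import numpy as np

def transfer_entropy(source, target, bins=4):
    """Estimate TE(source -> target) on binned signals."""
    s = np.digitize(source, np.histogram_bin_edges(source, bins)[1:-1])
    t = np.digitize(target, np.histogram_bin_edges(target, bins)[1:-1])
    y_next, y_past, x_past = t[1:], t[:-1], s[:-1]

    def joint_prob(*arrays):
        keys = np.stack(arrays, axis=1)
        _, inverse, counts = np.unique(keys, axis=0, return_inverse=True, return_counts=True)
        return counts[inverse] / len(keys)        # empirical p(...) at each sample

    p_xyz = joint_prob(y_next, y_past, x_past)
    p_yz = joint_prob(y_past, x_past)
    p_xy = joint_prob(y_next, y_past)
    p_y = joint_prob(y_past)
    # average of log p(y+|y,x) - log p(y+|y) over observed samples
    return np.mean(np.log2(p_xyz * p_y / (p_yz * p_xy)))

rng = np.random.default_rng(5)
x = rng.normal(size=2000)
y = np.roll(x, 1) + 0.5 * rng.normal(size=2000)   # y is driven by x with a one-step lag
print("TE(x->y):", transfer_entropy(x, y), "TE(y->x):", transfer_entropy(y, x))
```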
NASA Astrophysics Data System (ADS)
White, Joshua S.; Matthews, Jeanna N.; Stacy, John L.
2012-06-01
Phishing website analysis is largely still a time-consuming manual process of discovering potential phishing sites, verifying whether suspicious sites truly are malicious spoofs and, if so, distributing their URLs to the appropriate blacklisting services. Attackers increasingly use sophisticated systems for bringing phishing sites up and down rapidly at new locations, making automated response essential. In this paper, we present a method for rapid, automated detection and analysis of phishing websites. Our method relies on near real-time gathering and analysis of URLs posted on social media sites. We fetch the pages pointed to by each URL and characterize each page with a set of easily computed values such as number of images and links. We also capture a screenshot of the rendered page image, compute a hash of the image and use the Hamming distance between these image hashes as a form of visual comparison. We provide initial results that demonstrate the feasibility of our techniques by comparing legitimate sites to known fraudulent versions from Phishtank.com, by actively introducing a series of minor changes to a phishing toolkit captured in a local honeypot and by performing some initial analysis on a set of over 2.8 million URLs posted to Twitter over 4 days in August 2011. We discuss the issues encountered during our testing, such as resolvability and legitimacy of URLs posted on Twitter, the data sets used, the characteristics of the phishing sites we discovered, and our plans for future work.
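The visual-comparison step can be sketched with a simple average hash of page screenshots and the Hamming distance between hashes, as below. The hash size, threshold and file names are illustrative assumptions; the paper's exact hashing scheme may differ.

```python
# Hedged sketch: average-hash of rendered page screenshots plus Hamming distance.
import numpy as np
from PIL import Image

def average_hash(path, hash_size=8):
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = np.asarray(img, dtype=np.float32)
    return (pixels > pixels.mean()).flatten()      # 64-bit boolean fingerprint

def hamming(h1, h2):
    return int(np.count_nonzero(h1 != h2))

# hypothetical file names for a legitimate login page and a suspected spoof
h_legit = average_hash("legit_login.png")
h_suspect = average_hash("suspect_login.png")
if hamming(h_legit, h_suspect) <= 5:               # small distance: visually near-identical
    print("candidate phishing page: visually matches the legitimate site")
```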
Liu, Cong; Kolarik, Barbara; Gunnarsen, Lars; Zhang, Yinping
2015-10-20
Polychlorinated biphenyls (PCBs) have been found to be persistent in the environment and possibly harmful. Many buildings are characterized with high PCB concentrations. Knowledge about partitioning between primary sources and building materials is critical for exposure assessment and practical remediation of PCB contamination. This study develops a C-depth method to determine diffusion coefficient (D) and partition coefficient (K), two key parameters governing the partitioning process. For concrete, a primary material studied here, relative standard deviations of results among five data sets are 5%-22% for K and 42-66% for D. Compared with existing methods, C-depth method overcomes the inability to obtain unique estimation for nonlinear regression and does not require assumed correlations for D and K among congeners. Comparison with a more sophisticated two-term approach implies significant uncertainty for D, and smaller uncertainty for K. However, considering uncertainties associated with sampling and chemical analysis, and impact of environmental factors, the results are acceptable for engineering applications. This was supported by good agreement between model prediction and measurement. Sensitivity analysis indicated that effective diffusion distance, contacting time of materials with primary sources, and depth of measured concentrations are critical for determining D, and PCB concentration in primary sources is critical for K.
Solid oxide fuel cell simulation and design optimization with numerical adjoint techniques
NASA Astrophysics Data System (ADS)
Elliott, Louie C.
This dissertation reports on the application of numerical optimization techniques as applied to fuel cell simulation and design. Due to the "multi-physics" inherent in a fuel cell, which results in a highly coupled and non-linear behavior, an experimental program to analyze and improve the performance of fuel cells is extremely difficult. This program applies new optimization techniques with computational methods from the field of aerospace engineering to the fuel cell design problem. After an overview of fuel cell history, importance, and classification, a mathematical model of solid oxide fuel cells (SOFC) is presented. The governing equations are discretized and solved with computational fluid dynamics (CFD) techniques including unstructured meshes, non-linear solution methods, numerical derivatives with complex variables, and sensitivity analysis with adjoint methods. Following the validation of the fuel cell model in 2-D and 3-D, the results of the sensitivity analysis are presented. The sensitivity derivative for a cost function with respect to a design variable is found with three increasingly sophisticated techniques: finite difference, direct differentiation, and adjoint. A design cycle is performed using a simple optimization method to improve the value of the implemented cost function. The results from this program could improve fuel cell performance and lessen the world's dependence on fossil fuels.
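The complex-step derivative mentioned above can be demonstrated on a stand-in scalar function, as in the sketch below; the SOFC cost function itself is not reproduced. With a complex perturbation the step can be made arbitrarily small without subtractive cancellation, unlike the finite difference.

```python
# Complex-step derivative versus forward finite difference on a placeholder function.
import numpy as np

def f(x):                      # stand-in for a cost function J(design_variable)
    return np.exp(x) / np.sqrt(np.sin(x) ** 3 + np.cos(x) ** 3)

x0 = 1.5
h = 1e-20                                           # step can be tiny: no subtraction error
complex_step = np.imag(f(x0 + 1j * h)) / h          # dJ/dx via complex perturbation
finite_diff = (f(x0 + 1e-6) - f(x0)) / 1e-6         # classic forward difference
print(complex_step, finite_diff)
```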
Mechanics of Composite Materials: Past, Present and Future
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
1984-01-01
Composite mechanics disciplines are presented and described at their various levels of sophistication and attendant scales of application. Correlation with experimental data is used as the prime discriminator between alternative methods and level of sophistication. Major emphasis is placed on: (1) where composite mechanics has been; (2) what it has accomplished; (3) where it is headed, based on present research activities; and (4) at the risk of being presumptuous, where it should be headed. The discussion is developed using selected, but typical examples of each composite mechanics discipline identifying degree of success, with respect to correlation with experimental data, and problems remaining. The discussion is centered about fiber/resin composites drawn mainly from the author's research activities/experience spanning two decades at Lewis.
Mechanics of composite materials - Past, present and future
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
1989-01-01
Composite mechanics disciplines are presented and described at their various levels of sophistication and attendant scales of application. Correlation with experimental data is used as the prime discriminator between alternative methods and level of sophistication. Major emphasis is placed on: (1) where composite mechanics has been; (2) what it has accomplished; (3) where it is headed, based on present research activities; and (4) at the risk of being presumptuous, where it should be headed. The discussion is developed using selected, but typical examples of each composite mechanics discipline identifying degree of success, with respect to correlation with experimental data, and problems remaining. The discussion is centered about fiber/resin composites drawn mainly from the author's research activities/experience spanning two decades at Lewis.
Social Insects: A Model System for Network Dynamics
NASA Astrophysics Data System (ADS)
Charbonneau, Daniel; Blonder, Benjamin; Dornhaus, Anna
Social insect colonies (ants, bees, wasps, and termites) show sophisticated collective problem-solving in the face of variable constraints. Individuals exchange information and materials such as food. The resulting network structure and dynamics can inform us about the mechanisms by which the insects achieve particular collective behaviors, and these can be transposed to man-made and social networks. We discuss how network analysis can answer important questions about social insects, such as how effective task allocation or information flow is realized. We put forward the idea that network analysis methods are under-utilized in social insect research, and that they can provide novel ways to view the complexity of collective behavior, particularly if network dynamics are taken into account. To illustrate this, we present an example network of tasks performed by ant workers, linked by instances of workers switching from one task to another. We show how temporal network analysis can propose and test new hypotheses on mechanisms of task allocation, and how adding temporal elements to static networks can drastically change results. We discuss the benefits of using social insects as models for complex systems in general. Emerging technologies and analysis methods offer multiple opportunities to facilitate research on social insect networks. The potential for interdisciplinary work could significantly advance diverse fields such as behavioral ecology, computer science, and engineering.
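A hedged sketch of building such a task-switching network from (invented) worker observations with networkx is shown below; time-sliced versions of the same construction would support the temporal analysis discussed.

```python
# Illustrative sketch: a directed task-switching network from hypothetical worker records.
import networkx as nx

# (worker, time, task) observations: invented data, not the authors' ant colonies
records = [
    ("w1", 0, "forage"), ("w1", 1, "forage"), ("w1", 2, "brood_care"),
    ("w2", 0, "nest_repair"), ("w2", 1, "forage"), ("w2", 2, "forage"),
    ("w3", 0, "brood_care"), ("w3", 1, "nest_repair"), ("w3", 2, "forage"),
]

def switching_network(obs):
    g = nx.DiGraph()
    last_task = {}
    for worker, t, task in sorted(obs, key=lambda r: (r[0], r[1])):
        prev = last_task.get(worker)
        if prev is not None and prev != task:
            w = g.get_edge_data(prev, task, {"weight": 0})["weight"]
            g.add_edge(prev, task, weight=w + 1)   # count each observed switch
        last_task[worker] = task
    return g

g_all = switching_network(records)
print(list(g_all.edges(data=True)))
# per-time-window versions of this network would expose dynamics the aggregate hides
```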
Single-Molecule Electronics: Chemical and Analytical Perspectives.
Nichols, Richard J; Higgins, Simon J
2015-01-01
It is now possible to measure the electrical properties of single molecules using a variety of techniques including scanning probe microcopies and mechanically controlled break junctions. Such measurements can be made across a wide range of environments including ambient conditions, organic liquids, ionic liquids, aqueous solutions, electrolytes, and ultra high vacuum. This has given new insights into charge transport across molecule electrical junctions, and these experimental methods have been complemented with increasingly sophisticated theory. This article reviews progress in single-molecule electronics from a chemical perspective and discusses topics such as the molecule-surface coupling in electrical junctions, chemical control, and supramolecular interactions in junctions and gating charge transport. The article concludes with an outlook regarding chemical analysis based on single-molecule conductance.
High-level user interfaces for transfer function design with semantics.
Salama, Christof Rezk; Keller, Maik; Kohlmann, Peter
2006-01-01
Many sophisticated techniques for the visualization of volumetric data such as medical data have been published. While existing techniques are mature from a technical point of view, managing the complexity of visual parameters is still difficult for non-expert users. To this end, this paper presents new ideas to facilitate the specification of optical properties for direct volume rendering. We introduce an additional level of abstraction for parametric models of transfer functions. The proposed framework allows visualization experts to design high-level transfer function models which can intuitively be used by non-expert users. The results are user interfaces which provide semantic information for specialized visualization problems. The proposed method is based on principal component analysis as well as on concepts borrowed from computer animation.
NASA Astrophysics Data System (ADS)
Dunstan, Jocelyn; Fallah-Fini, Saeideh; Nau, Claudia; Glass, Thomas; Global Obesity Prevention Center Team
The application of sophisticated mathematical and numerical tools in public health has been demonstrated to be useful for predicting the outcomes of public interventions as well as for studying, for example, the main causes of obesity without experimenting on the population. In this project we aim to understand which kinds of food consumed in different countries over time best predict the rate of obesity in those countries. Machine learning is particularly useful here because we do not need to formulate a hypothesis and test it against the data; instead we learn from the data to find the groups of foods that best describe the prevalence of obesity.
NASA Technical Reports Server (NTRS)
1988-01-01
Viking landers touched down on Mars equipped with a variety of systems to conduct automated research, each carrying a compact but highly sophisticated instrument for analyzing Martian soil and atmosphere. The instrument, a gas chromatograph/mass spectrometer (GC/MS), had to be small, lightweight, shock resistant, highly automated and extremely sensitive, yet require minimal electrical power. Viking Instruments Corporation commercialized this technology, targeting environmental monitoring as its primary market, especially monitoring of toxic and hazardous waste sites. Waste sites often contain chemicals in complex mixtures, and the conventional method of site characterization, taking samples on-site and sending them to a laboratory for analysis, is time consuming and expensive. Other terrestrial applications include explosives detection in airports, drug detection, industrial air monitoring, medical metabolic monitoring and, for the military, detection of chemical warfare agents.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Virnstein, R.; Tepera, M.; Beazley, L.
1997-06-01
A pilot study is very briefly summarized in the article. The study tested the potential of multi-spectral digital imagery for discrimination of seagrass densities and species, algae, and bottom types. Imagery was obtained with the Compact Airborne Spectral Imager (casi), with two flight lines flown in hyper-spectral mode. The photogrammetric method used allowed interpretation of the highest quality product, eliminating limitations caused by outdated or poor quality base maps and the errors associated with transfer of polygons. Initial image analysis indicates that the multi-spectral imagery has several advantages, including sophisticated spectral signature recognition and classification, ease of geo-referencing, and rapid mosaicking.
Comparison of historical documents for writership
NASA Astrophysics Data System (ADS)
Ball, Gregory R.; Pu, Danjun; Stritmatter, Roger; Srihari, Sargur N.
2010-01-01
Over the last century forensic document science has developed progressively more sophisticated pattern recognition methodologies for ascertaining the authorship of disputed documents. These include advances not only in computer assisted stylometrics, but forensic handwriting analysis. We present a writer verification method and an evaluation of an actual historical document written by an unknown writer. The questioned document is compared against two known handwriting samples of Herman Melville, a 19th century American author who has been hypothesized to be the writer of this document. The comparison led to a high confidence result that the questioned document was written by the same writer as the known documents. Such methodology can be applied to many such questioned documents in historical writing, both in literary and legal fields.
ERIC Educational Resources Information Center
Ling, Chris D.; Bridgeman, Adam J.
2011-01-01
Titration experiments are ideal for generating large data sets for use in quantitative-analysis activities that are meaningful and transparent to general chemistry students. We report the successful implementation of a sophisticated quantitative exercise in which the students identify a series of unknown acids by determining their molar masses…
Development of a Web-Enabled Informatics Platform for Manipulation of Gene Expression Data
2004-12-01
...genomic platforms such as metabolomics and proteomics, and to federated databases for knowledge management. A successful SBIR Phase I completed... measurements that require sophisticated bioinformatic platforms for data archival, management, integration, and analysis if researchers are to derive... web-enabled bioinformatic platform consisting of a Laboratory Information Management System (LIMS), an Analysis Information Management System (AIMS)...
Development of a Searchable Metabolite Database and Simulator of Xenobiotic Metabolism
A computational tool (MetaPath) has been developed for storage and analysis of metabolic pathways and associated metadata. The system is capable of sophisticated text and chemical structure/substructure searching as well as rapid comparison of metabolites formed across chemicals,...
A Magnetic Circuit Demonstration.
ERIC Educational Resources Information Center
Vanderkooy, John; Lowe, June
1995-01-01
Presents a demonstration designed to illustrate Faraday's, Ampere's, and Lenz's laws and to reinforce the concepts through the analysis of a two-loop magnetic circuit. Can be made dramatic and challenging for sophisticated students but is suitable for an introductory course in electricity and magnetism. (JRH)
On the substance of a sophisticated epistemology
NASA Astrophysics Data System (ADS)
Elby, Andrew; Hammer, David
2001-09-01
Among researchers who study students' epistemologies, a consensus has emerged about what constitutes a sophisticated stance toward scientific knowledge. According to this community consensus, students should understand scientific knowledge as tentative and evolving, rather than certain and unchanging; subjectively tied to scientists' perspectives, rather than objectively inherent in nature; and individually or socially constructed, rather than discovered. Surveys, interview protocols, and other methods used to probe students' beliefs about scientific knowledge broadly reflect this outlook. This article questions the community consensus about epistemological sophistication. We do not suggest that scientific knowledge is objective and fixed; if forced to choose whether knowledge is certain or tentative, with no opportunity to elaborate, we would choose tentative. Instead, our critique consists of two lines of argument. First, the literature fails to distinguish between the correctness and productivity of an epistemological belief. For instance, elementary school students who believe that science is about discovering objective truths to questions, such as whether the earth is round or flat, or whether an asteroid led to the extinction of the dinosaurs, may be more likely to succeed in science than students who believe science is about telling stories that vary with one's perspective. Naïve realism, although incorrect (according to a broad consensus of philosophers and social scientists), may nonetheless be productive for helping those students learn. Second, according to the consensus view as reflected in commonly used surveys, epistemological sophistication consists of believing certain blanket generalizations about the nature of knowledge and learning, generalizations that do not attend to context. These generalizations are neither correct nor productive. For example, it would be unsophisticated for students to view as tentative the idea that the earth is round rather than flat. By contrast, they should take a more tentative stance toward theories of mass extinction. Nonetheless, many surveys and interview protocols tally students as sophisticated not for attending to these contextual nuances, but for subscribing broadly to the view that knowledge is tentative.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Petiteau, Antoine; Auger, Gerard; Halloin, Hubert
A new LISA simulator (LISACode) is presented. Its ambition is to achieve a new degree of sophistication, allowing it to map, as closely as possible, the impact of the different subsystems on the measurements. LISACode is not a detailed simulator at the engineering level but rather a tool whose purpose is to bridge the gap between the basic principles of LISA and a future, sophisticated end-to-end simulator. This is achieved by introducing, in a realistic manner, most of the ingredients that will influence LISA's sensitivity, as well as the application of TDI combinations. Many user-defined parameters allow the code to study different configurations of LISA, thus helping to finalize the definition of the detector. Another important use of LISACode is in generating time series for data analysis developments.
Instrumental Surveillance of Water Quality.
ERIC Educational Resources Information Center
Miller, J. A.; And Others
The role analytical instrumentation performs in the surveillance and control of the quality of water resources is reviewed. Commonly performed analyses may range from simple tests for physical parameters to more highly sophisticated radiological or spectrophotometric methods. This publication explores many of these types of water quality analyses…
Censorship: Tactics for Defense.
ERIC Educational Resources Information Center
Lowery, Skip
1998-01-01
Book banners are generally successful because they have a wide network of support, including national coalitions with sophisticated organizational methods--such as electing certain people to school boards. School officials should get organized and devise defensive strategies, such as inviting critics to class, asking what they would like to…
A sophisticated simulation for the fracture behavior of concrete material using XFEM
NASA Astrophysics Data System (ADS)
Zhai, Changhai; Wang, Xiaomin; Kong, Jingchang; Li, Shuang; Xie, Lili
2017-10-01
The development of a powerful numerical model to simulate the fracture behavior of concrete material has long been one of the dominant research areas in earthquake engineering. A reliable model should be able to adequately represent the discontinuous characteristics of cracks and simulate various failure behaviors under complicated loading conditions. In this paper, a numerical formulation, which incorporates a sophisticated rigid-plastic interface constitutive model coupling cohesion softening, contact, friction and shear dilatation into the XFEM, is proposed to describe various crack behaviors of concrete material. An effective numerical integration scheme for accurately assembling the contribution to the weak form on both sides of the discontinuity is introduced. The effectiveness of the proposed method has been assessed by simulating several well-known experimental tests. It is concluded that the numerical method can successfully capture the crack paths and accurately predict the fracture behavior of concrete structures. The influence of mode-II parameters on the mixed-mode fracture behavior is further investigated to better determine these parameters.
Epistemic Beliefs and Conceptual Understanding in Biotechnology: A Case Study
NASA Astrophysics Data System (ADS)
Rebello, Carina M.; Siegel, Marcelle A.; Witzig, Stephen B.; Freyermuth, Sharyn K.; McClure, Bruce A.
2012-04-01
The purpose of this investigation was to explore students' epistemic beliefs and conceptual understanding of biotechnology. Epistemic beliefs can influence reasoning, how individuals evaluate information, and informed decision-making abilities. These skills are important for an informed citizenry that will participate in debates regarding areas in science such as biotechnology. We report on an in-depth case study analysis of three undergraduate, non-science majors in a biotechnology course designed for non-biochemistry majors. We selected participants who performed above average and below average on the first in-class exam. Data from multiple sources—interviews, exams, and a concept instrument—were used to construct (a) individual profiles and (b) a cross-case analysis of our participants' conceptual development and epistemic beliefs from two different theoretical perspectives—Women's Ways of Knowing and the Reflective Judgment Model. Two independent trained researchers coded all case records independently for both theoretical perspectives, with resultant initial Cohen's kappa values above .715 (substantial agreement), and then reached consensus on the codes. Results indicate that a student with a more sophisticated epistemology demonstrated greater conceptual understandings at the end of the course than a student with a less sophisticated epistemology, even though the latter performed higher initially. Additionally, a student with a less sophisticated epistemology and low initial conceptual performance did not demonstrate gains in overall conceptual understanding. Results suggest the need for instructional interventions fostering epistemological development of learners in order to facilitate their conceptual growth.
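To illustrate the inter-rater agreement check described above, the following minimal Python sketch computes Cohen's kappa with scikit-learn; the rater codes shown are hypothetical and do not reproduce the study's actual coding scheme or data.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical codes from two independent raters (labels loosely follow broad
# Reflective Judgment levels; they are not the study's actual data)
rater_a = ["pre-reflective", "quasi-reflective", "reflective", "quasi-reflective", "pre-reflective"]
rater_b = ["pre-reflective", "quasi-reflective", "quasi-reflective", "quasi-reflective", "pre-reflective"]

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa = {kappa:.3f}")  # ~0.61 and above is often read as substantial agreement
```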
Takahashi, Hiro; Nemoto, Takeshi; Yoshida, Teruhiko; Honda, Hiroyuki; Hasegawa, Tadashi
2006-01-01
Background Recent advances in genome technologies have provided an excellent opportunity to determine the complete biological characteristics of neoplastic tissues, resulting in improved diagnosis and selection of treatment. To accomplish this objective, it is important to establish a sophisticated algorithm that can deal with large quantities of data such as gene expression profiles obtained by DNA microarray analysis. Results Previously, we developed the projective adaptive resonance theory (PART) filtering method as a gene filtering method. This is one of the clustering methods that can select specific genes for each subtype. In this study, we applied the PART filtering method to analyze microarray data that were obtained from soft tissue sarcoma (STS) patients for the extraction of subtype-specific genes. The performance of the filtering method was evaluated by comparison with other widely used methods, such as signal-to-noise, significance analysis of microarrays, and nearest shrunken centroids. In addition, various combinations of filtering and modeling methods were used to extract essential subtype-specific genes. The combination of the PART filtering method and boosting – the PART-BFCS method – showed the highest accuracy. Seven genes among the 15 genes that are frequently selected by this method – MIF, CYFIP2, HSPCB, TIMP3, LDHA, ABR, and RGS3 – are known prognostic marker genes for other tumors. These genes are candidate marker genes for the diagnosis of STS. Correlation analysis was performed to extract marker genes that were not selected by PART-BFCS. Sixteen genes among those extracted are also known prognostic marker genes for other tumors, and they could be candidate marker genes for the diagnosis of STS. Conclusion A two-step procedure, consisting of the PART-BFCS method followed by correlation analysis, was proposed. The results suggest that novel diagnostic and therapeutic targets for STS can be extracted by a procedure that includes the PART filtering method. PMID:16948864
A Novel Field Deployable Point-of-Care Diagnostic Test for Cutaneous Leishmaniasis
2015-10-01
include localized cutaneous leishmaniasis (LCL), and destructive nasal and oropharyngeal lesions of mucosal leishmaniasis (ML). LCL in the New World...the high costs, personnel training and need of sophisticated equipment. Therefore, novel methods to detect leishmaniasis at the POC are urgently needed...To date, there is no field-standardized molecular method based on DNA amplification coupled with Lateral Flow reading to detect leishmaniasis
de Andrade, Jucimara Kulek; de Andrade, Camila Kulek; Komatsu, Emy; Perreault, Hélène; Torres, Yohandra Reyes; da Rosa, Marcos Roberto; Felsner, Maria Lurdes
2017-08-01
Corn syrups, important ingredients used in the food and beverage industries, often contain high levels of 5-hydroxymethyl-2-furfural (HMF), a toxic contaminant. In this work, an in-house validation of a difference spectrophotometric method for HMF analysis in corn syrups was developed, using sophisticated statistical tools for the first time. The methodology showed excellent analytical performance with good selectivity, linearity (R² = 99.9%, r > 0.99), accuracy, and low limits (LOD = 0.10 mg L⁻¹ and LOQ = 0.34 mg L⁻¹). Excellent precision was confirmed by repeatability (RSD = 0.30%) and intermediate precision (RSD = 0.36%) estimates and by the HorRat value (0.07). A detailed study of method precision using a nested design demonstrated that variation sources such as instruments, operators, and time did not interfere with the variability of results within the laboratory and, consequently, with its intermediate precision. The developed method is environmentally friendly, fast, cheap, and easy to implement, making it an attractive alternative for corn syrup quality control in industries and official laboratories. Copyright © 2017 Elsevier Ltd. All rights reserved.
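As a rough illustration of the validation figures reported above, the sketch below computes ICH-style detection and quantification limits from a calibration line and a Horwitz ratio from a repeatability RSD. The calibration points, the concentration used for the Horwitz calculation, and the exact formulas are assumptions for illustration; the paper's own validation protocol may differ.

```python
import numpy as np

# Hypothetical calibration data: HMF concentration (mg/L) vs. difference absorbance
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
signal = np.array([0.051, 0.100, 0.202, 0.399, 0.801])

slope, intercept = np.polyfit(conc, signal, 1)
residuals = signal - (slope * conc + intercept)
s_res = residuals.std(ddof=2)        # residual standard deviation of the fit

lod = 3.3 * s_res / slope            # ICH-style limit of detection
loq = 10.0 * s_res / slope           # ICH-style limit of quantification

# HorRat: observed repeatability RSD divided by the Horwitz-predicted RSD
rsd_observed = 0.30                  # percent, value quoted in the abstract
c_mass_fraction = 1e-5               # assumed test level of ~10 mg/L expressed as a mass fraction
prsd_horwitz = 2.0 * c_mass_fraction ** (-0.1505)
horrat = rsd_observed / prsd_horwitz
print(f"LOD = {lod:.3f} mg/L, LOQ = {loq:.3f} mg/L, HorRat = {horrat:.2f}")
```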
DOT National Transportation Integrated Search
2013-01-01
The simulator was once a very expensive, large-scale mechanical device for training military pilots or astronauts. Modern computers, linking sophisticated software and large-screen displays, have yielded simulators for the desktop or configured as sm...
Innovative eLearning: Technology Shaping Contemporary Problem Based Learning: A Cross-Case Analysis
ERIC Educational Resources Information Center
Blackburn, Greg
2015-01-01
Preparing students to be critical thinkers and effective communicators is essential in today's multinational and technologically sophisticated environment. New electronic technologies provide opportunities for creating learning environments that extend the possibilities of "old" but still essential technologies: books, blackboards, and…
High-Quality Collaboration Benefits Teachers and Students. Lessons from Research
ERIC Educational Resources Information Center
Killion, Joellen
2015-01-01
In this article, Joellen Killion highlights the methodology, analysis, findings, and limitations of Ronfeldt, M., Farmer, S., McQueen, K., & Grissom, J. (2015), "Teacher collaboration in instructional teams and student achievement," "American Educational Research Journal," 52(3), 475-514. Using sophisticated statistical…
NASA Technical Reports Server (NTRS)
Hall, David G.; Bridges, James
1992-01-01
A sophisticated, multi-channel computerized data acquisition and processing system was developed at the NASA LeRC for use in noise experiments. This technology, which is available for transfer to industry, provides a convenient, cost-effective alternative to analog tape recording for high frequency acoustic measurements. This system provides 32-channel acquisition of microphone signals with an analysis bandwidth up to 100 kHz per channel. Cost was minimized through the use of off-the-shelf components. Requirements to allow for future expansion were met by choosing equipment which adheres to established industry standards for hardware and software. Data processing capabilities include narrow band and 1/3 octave spectral analysis, compensation for microphone frequency response/directivity, and correction of acoustic data to standard day conditions. The system was used successfully in a major wind tunnel test program at NASA LeRC to acquire and analyze jet noise data in support of the High Speed Civil Transport (HSCT) program.
An approach for quantitative image quality analysis for CT
NASA Astrophysics Data System (ADS)
Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe
2016-03-01
An objective and standardized approach to assess image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end, we have designed, developed, and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that generates PCA components with sparse loadings, used in conjunction with the Hotelling T² statistic to compare, qualify, and detect faults in the tested systems.
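The paragraph above couples a (sparse) principal component analysis with a Hotelling T² statistic to flag anomalous systems. A minimal sketch of that idea, using ordinary PCA from scikit-learn rather than the modified SPCA described in the abstract, and a synthetic metrics matrix in place of real phantom measurements, might look like this:

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical matrix: rows = repeated scans of a phantom, columns = image quality metrics
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 88))

pca = PCA(n_components=5).fit(X)
scores = pca.transform(X)

# Hotelling T^2 per scan: sum of squared, variance-normalized component scores
t2 = np.sum(scores ** 2 / pca.explained_variance_, axis=1)
suspect = np.where(t2 > np.percentile(t2, 95))[0]   # simple empirical control limit
print("scans flagged for review:", suspect)
```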
Novel Substrates as Sources of Ancient DNA: Prospects and Hurdles
Green, Eleanor Joan
2017-01-01
Following the discovery in the late 1980s that hard tissues such as bones and teeth preserve genetic information, the field of ancient DNA analysis has typically concentrated upon these substrates. The onset of high-throughput sequencing, combined with optimized DNA recovery methods, has enabled the analysis of a myriad of ancient species and specimens worldwide, dating back to the Middle Pleistocene. Despite the growing sophistication of analytical techniques, the genetic analysis of substrates other than bone and dentine remains comparatively “novel”. Here, we review analyses of other biological substrates which offer great potential for elucidating phylogenetic relationships, paleoenvironments, and microbial ecosystems, including (1) archaeological artifacts and ecofacts; (2) calcified and/or mineralized biological deposits; and (3) biological and cultural archives. We conclude that there is a pressing need for more refined models of DNA preservation and bespoke tools for DNA extraction and analysis to authenticate and maximize the utility of the data obtained. With such tools in place, the potential for neglected or underexploited substrates to provide a unique insight into phylogenetics, microbial evolution and evolutionary processes will be realized. PMID:28703741
Martinez-Pinna, Roxana; Gonzalez de Peredo, Anne; Monsarrat, Bernard; Burlet-Schiltz, Odile; Martin-Ventura, Jose Luis
2014-08-01
To find potential biomarkers of abdominal aortic aneurysms (AAA), we performed a differential proteomic study based on human plasma-derived microvesicles. Exosomes and microparticles isolated from plasma of AAA patients and control subjects (n = 10 each group) were analyzed by a label-free quantitative MS-based strategy. Homemade and publicly available software packages have been used for MS data analysis. The application of two kinds of bioinformatic tools allowed us to find differential protein profiles from AAA patients. Some of these proteins found by the two analysis methods belong to main pathological mechanisms of AAA such as oxidative stress, immune-inflammation, and thrombosis. Data analysis from label-free MS-based experiments requires the use of sophisticated bioinformatic approaches to perform quantitative studies from complex protein mixtures. The application of two of these bioinformatic tools provided us with a preliminary list of differential proteins found in plasma-derived microvesicles not previously associated with AAA, which could help us to understand the pathological mechanisms related to this disease. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
The Use of Electronic Data Capture Tools in Clinical Trials: Web-Survey of 259 Canadian Trials
Jonker, Elizabeth; Sampson, Margaret; Krleža-Jerić, Karmela; Neisa, Angelica
2009-01-01
Background Electronic data capture (EDC) tools provide automated support for data collection, reporting, query resolution, randomization, and validation, among other features, for clinical trials. There is a trend toward greater adoption of EDC tools in clinical trials, but there is also uncertainty about how many trials are actually using this technology in practice. A systematic review of EDC adoption surveys conducted up to 2007 concluded that only 20% of trials are using EDC systems, but previous surveys had weaknesses. Objectives Our primary objective was to estimate the proportion of phase II/III/IV Canadian clinical trials that used an EDC system in 2006 and 2007. The secondary objectives were to investigate the factors that can have an impact on adoption and to develop a scale to assess the extent of sophistication of EDC systems. Methods We conducted a Web survey to estimate the proportion of trials that were using an EDC system. The survey was sent to the Canadian site coordinators for 331 trials. We also developed and validated a scale using Guttman scaling to assess the extent of sophistication of EDC systems. Trials using EDC were compared by the level of sophistication of their systems. Results We had a 78.2% response rate (259/331) for the survey. It is estimated that 41% (95% CI 37.5%-44%) of clinical trials were using an EDC system. Trials funded by academic institutions, government, and foundations were less likely to use an EDC system compared to those sponsored by industry. Also, larger trials tended to be more likely to adopt EDC. The EDC sophistication scale had six levels and a coefficient of reproducibility of 0.901 (P< .001) and a coefficient of scalability of 0.79. There was no difference in sophistication based on the funding source, but pediatric trials were likely to use a more sophisticated EDC system. Conclusion The adoption of EDC systems in clinical trials in Canada is higher than the literature indicated: a large proportion of clinical trials in Canada use some form of automated data capture system. To inform future adoption, research should gather stronger evidence on the costs and benefits of using different EDC systems. PMID:19275984
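The coefficient of reproducibility mentioned above comes from Guttman scalogram analysis. The following sketch shows one common way to compute it for a hypothetical EDC-feature scalogram; the actual scale items and scoring rules used in the survey are not reproduced here.

```python
import numpy as np

# Hypothetical scalogram: rows = trials, columns = EDC features ordered from
# most basic to most sophisticated (1 = feature present, 0 = absent)
responses = np.array([
    [1, 1, 1, 1, 0, 0],
    [1, 1, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],   # deviates from the ideal cumulative pattern
    [1, 1, 1, 1, 1, 1],
    [1, 0, 0, 0, 0, 0],
])

n_trials, n_items = responses.shape
errors = 0
for row in responses:
    k = row.sum()                                  # total score determines the ideal pattern
    ideal = np.array([1] * k + [0] * (n_items - k))
    errors += np.sum(row != ideal)                 # count deviations from the ideal pattern

cr = 1 - errors / (n_trials * n_items)
print(f"coefficient of reproducibility = {cr:.3f}")  # values > 0.90 are usually taken as scalable
```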
Medical subject heading (MeSH) annotations illuminate maize genetics and evolution
USDA-ARS?s Scientific Manuscript database
In the modern era, high-density marker panels and/or whole-genome sequencing,coupled with advanced phenotyping pipelines and sophisticated statistical methods, have dramatically increased our ability to generate lists of candidate genes or regions that are putatively associated with phenotypes or pr...
A Critical Review of Some Qualitative Research Methods Used to Explore Rater Cognition
ERIC Educational Resources Information Center
Suto, Irenka
2012-01-01
Internationally, many assessment systems rely predominantly on human raters to score examinations. Arguably, this facilitates the assessment of multiple sophisticated educational constructs, strengthening assessment validity. It can introduce subjectivity into the scoring process, however, engendering threats to accuracy. The present objectives…
Educational Specifications for Secondary Schools.
ERIC Educational Resources Information Center
Flanigan, Virginia; And Others
The report can be used as a guide in the preparation of educational specifications for secondary schools. New curricula, methods of instruction, and teaching aids add to the sophistication of education. Programs encompass many areas of education, each requiring professional decisions. These decisions must be organized into written specifications…
Environmental Scanning Practices in Junior, Technical, and Community Colleges.
ERIC Educational Resources Information Center
Friedel, Janice N.; Rosenberg, Dana
1993-01-01
Reports results of a 1991 national survey of environmental scanning practices at two-year institutions. Examines sophistication of scanning efforts; personnel involved; and methods of collecting, compiling, interpreting, communicating, and using scan information. Finds scanning practices in use at 41% of the 601 responding institutions. (PAA)
NASA Technical Reports Server (NTRS)
Evers, Ken H.; Bachert, Robert F.
1987-01-01
The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.
New technologies for advanced three-dimensional optimum shape design in aeronautics
NASA Astrophysics Data System (ADS)
Dervieux, Alain; Lanteri, Stéphane; Malé, Jean-Michel; Marco, Nathalie; Rostaing-Schmidt, Nicole; Stoufflet, Bruno
1999-05-01
The analysis of complex flows around realistic aircraft geometries is becoming more and more predictive. In order to obtain this result, the complexity of flow analysis codes has been constantly increasing, involving more refined fluid models and sophisticated numerical methods. These codes can only run on top computers, exhausting their memory and CPU capabilities. It is, therefore, difficult to introduce the best analysis codes in a shape optimization loop: most previous works in the optimum shape design field used only simplified analysis codes. Moreover, as the most popular optimization methods are the gradient-based ones, the more complex the flow solver, the more difficult it is to compute the sensitivity code. However, emerging technologies are making such an ambitious project, of including a state-of-the-art flow analysis code in an optimization loop, feasible. Among those technologies, there are three important issues that this paper wishes to address: shape parametrization, automated differentiation, and parallel computing. Shape parametrization allows faster optimization by reducing the number of design variables; in this work, it relies on a hierarchical multilevel approach. The sensitivity code can be obtained using automated differentiation. The automated approach is based on software manipulation tools, which allow the differentiation to be quick and the resulting differentiated code to be rather fast and reliable. In addition, the parallel algorithms implemented in this work allow the resulting optimization software to run on increasingly larger geometries.
Diving deeper into Zebrafish development of social behavior: analyzing high resolution data.
Buske, Christine; Gerlai, Robert
2014-08-30
Vertebrate model organisms have been utilized in high throughput screening but only with substantial cost and human capital investment. The zebrafish is a vertebrate model species that is a promising and cost effective candidate for efficient high throughput screening. Larval zebrafish have already been successfully employed in this regard (Lessman, 2011), but adult zebrafish also show great promise. High throughput screening requires the use of a large number of subjects and collection of a substantial amount of data. Collection of data is only one of the demanding aspects of screening. However, in most screening approaches that involve behavioral data, the main bottleneck that slows throughput is the time-consuming analysis of the collected data. Some automated analytical tools do exist, but often they only work for one subject at a time, eliminating the possibility of fully utilizing zebrafish as a screening tool. This is a particularly important limitation for such complex phenotypes as social behavior. Testing multiple fish at a time can reveal complex social interactions, but it may also allow the identification of outliers from a group of mutagenized or pharmacologically treated fish. Here, we describe a novel method using a custom software tool developed within our laboratory, which enables tracking multiple fish, in combination with a sophisticated analytical approach for summarizing and analyzing high resolution behavioral data. This paper focuses on the latter, the analytic tool, which we have developed using the R programming language and environment for statistical computing. We argue that combining sophisticated data collection methods with appropriate analytical tools will propel zebrafish into the future of neurobehavioral genetic research. Copyright © 2014. Published by Elsevier B.V.
Elokely, Khaled M; Eldawy, Mohamed A; Elkersh, Mohamed A; El-Moselhy, Tarek F
2011-01-01
A simple spectrofluorometric method has been developed, adapted, and validated for the quantitative estimation of drugs containing α-methylene sulfone/sulfonamide functional groups using N(1)-methylnicotinamide chloride (NMNCl) as fluorogenic agent. The proposed method has been applied successfully to the determination of methyl sulfonyl methane (MSM) (1), tinidazole (2), rofecoxib (3), and nimesulide (4) in pure forms, laboratory-prepared mixtures, pharmaceutical dosage forms, spiked human plasma samples, and in volunteer's blood. The method showed linearity over concentration ranging from 1 to 150 μg/mL, 10 to 1000 ng/mL, 1 to 1800 ng/mL, and 30 to 2100 ng/mL for standard solutions of 1, 2, 3, and 4, respectively, and over concentration ranging from 5 to 150 μg/mL, 10 to 1000 ng/mL, 10 to 1700 ng/mL, and 30 to 2350 ng/mL in spiked human plasma samples of 1, 2, 3, and 4, respectively. The method showed good accuracy, specificity, and precision in both laboratory-prepared mixtures and in spiked human plasma samples. The proposed method is simple, does not need sophisticated instruments, and is suitable for quality control application, bioavailability, and bioequivalency studies. Besides, its detection limits are comparable to other sophisticated chromatographic methods.
Alexander, Gregory L; Pasupathy, Kalyan S; Steege, Linsey M; Strecker, E Bradley; Carley, Kathleen M
2014-08-01
The role of nursing home (NH) information technology (IT) in quality improvement has not been clearly established, and its impacts on communication between caregivers and patient outcomes in these settings deserve further attention. In this research, we describe a mixed method approach to explore communication strategies used by healthcare providers for resident skin risk in NHs with high IT sophistication (ITS). The sample included NHs participating in the statewide survey of ITS. We incorporated rigorous observation of 8- and 12-h shifts, and focus groups to identify how NH IT and a range of synchronous and asynchronous tools are used. Social network analysis tools and qualitative analysis were used to analyze data and identify relationships between ITS dimensions and communication interactions between care providers. Two of the nine ITS dimensions (resident care-technological and administrative activities-technological) and total ITS were significantly negatively correlated with the number of unique interactions. The more processes in resident care and administrative activities are supported by technology, the lower the number of observed unique interactions. Additionally, four thematic areas emerged from staff focus groups that demonstrate how important IT is to resident care in these facilities, including providing resident-centered care, teamwork and collaboration, maintaining safety and quality, and using standardized information resources. Our findings in this study confirm prior research that as technology support (resident care and administrative activities) and overall ITS increase, observed interactions between staff members decrease. Conversations during staff interviews focused on how technology facilitated resident-centered care through enhanced information sharing, greater virtual collaboration between team members, and improved care delivery. These results provide evidence for improving the design and implementation of IT in long term care systems to support communication and associated resident outcomes. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
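A minimal sketch of the kind of correlation reported above (between overall ITS and the number of unique interactions) is shown below, using hypothetical facility-level values and scipy's Spearman rank correlation; the real analysis also involved social network analysis of observed staff interactions.

```python
from scipy.stats import spearmanr

# Hypothetical facility-level data: total ITS score and the number of unique
# communication interactions observed per shift
its_total = [42, 55, 61, 38, 70, 47, 66, 52, 59]
unique_interactions = [31, 26, 22, 34, 18, 29, 20, 27, 24]

rho, p = spearmanr(its_total, unique_interactions)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")   # a negative rho mirrors the reported pattern
```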
Liebherr, Magnus; Haas, Christian T.
2014-01-01
Variability indicates motor control disturbances and is suitable to identify gait pathologies. It can be quantified by linear parameters (amplitude estimators) and more sophisticated nonlinear methods (structural information). Detrended Fluctuation Analysis (DFA) is one method to measure structural information, e.g., from stride time series. Recently, an improved method, Adaptive Fractal Analysis (AFA), has been proposed. This method has not been applied to gait data before. Fractal scaling methods (FS) require long stride-to-stride data to obtain valid results. However, in clinical studies, it is not usual to measure a large number of strides (e.g., strides). Amongst others, clinical gait analysis is limited due to short walkways, thus, FS seem to be inapplicable. The purpose of the present study was to evaluate FS under clinical conditions. Stride time data of five self-paced walking trials ( strides each) of subjects with PD and a healthy control group (CG) was measured. To generate longer time series, stride time sequences were stitched together. The coefficient of variation (CV), fractal scaling exponents (DFA) and (AFA) were calculated. Two surrogate tests were performed: A) the whole time series was randomly shuffled; B) the single trials were randomly shuffled separately and afterwards stitched together. CV did not discriminate between PD and CG. However, significant differences between PD and CG were found concerning and . Surrogate version B yielded a higher mean squared error and empirical quantiles than version A. Hence, we conclude that the stitching procedure creates an artificial structure resulting in an overestimation of true . The method of stitching together sections of gait seems to be appropriate in order to distinguish between PD and CG with FS. It provides an approach to integrate FS as standard in clinical gait analysis and to overcome limitations such as short walkways. PMID:24465708
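The fractal scaling analysis described above can be illustrated with a minimal detrended fluctuation analysis (DFA) sketch. The stride-time series, window sizes, and linear detrending order below are assumptions for illustration and do not reproduce the study's stitched clinical data or the AFA variant.

```python
import numpy as np

def dfa_alpha(x, scales):
    """Minimal detrended fluctuation analysis returning the scaling exponent alpha."""
    y = np.cumsum(x - np.mean(x))                  # integrated profile
    flucts = []
    for s in scales:
        n_seg = len(y) // s
        f2 = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # linear detrending per window
            f2.append(np.mean((seg - trend) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

# Hypothetical stride-time series (seconds); real data would come from the stitched gait trials
rng = np.random.default_rng(1)
stride_times = 1.1 + 0.02 * rng.standard_normal(500)
print("alpha =", dfa_alpha(stride_times, scales=[4, 8, 16, 32, 64]))
```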
Exploring product supply across age classes and forest types
Robert C. Abt; Karen J. Lee; Gerardo Pacheco
1995-01-01
Timber supply modeling has evolved from examining inventory sustainability based on growth/drain relationships to sophisticated inventory and supply models. These analyses have consistently recognized regional, ownership (public/private), and species group (hardwood/softwood) differences. Recognition of product differences is fundamental to market analysis which...
Was Euclid an Unnecessarily Sophisticated Psychologist?
ERIC Educational Resources Information Center
Arabie, Phipps
1991-01-01
The current state of multidimensional scaling using the city-block metric is reviewed, with attention to (1) substantive and theoretical issues; (2) recent algorithmic developments and their implications for analysis; (3) isometries with other metrics; (4) links to graph-theoretic models; and (5) prospects for future development. (SLD)
Paralogic Hermeneutics and the Possibilities of Rhetoric.
ERIC Educational Resources Information Center
Kent, Thomas
1989-01-01
Explains how the Sophistic tradition, an alternative to the Platonic-Aristotelian rhetorical tradition, provides the historical foundation for a paralogic rhetoric that treats discourse production and analysis as open-ended dialogic activities and not as a codifiable system. Argues that teachers must examine the powerful paralogic/hermeneutic…
Multiple Hypnotizabilities: Differentiating the Building Blocks of Hypnotic Response
ERIC Educational Resources Information Center
Woody, Erik Z.; Barnier, Amanda J.; McConkey, Kevin M.
2005-01-01
Although hypnotizability can be conceptualized as involving component subskills, standard measures do not differentiate them from a more general unitary trait, partly because the measures include limited sets of dichotomous items. To overcome this, the authors applied full-information factor analysis, a sophisticated analytic approach for…
Probing the neurochemical correlates of motivation and decision making.
Wassum, Kate M; Phillips, Paul E M
2015-01-21
Online electrochemical detection techniques are the state-of-the-art for evaluating chemical communication in the brain underlying motivated behavior and decision making. In this Viewpoint, we discuss avenues for future technological development, as well as the requirement for increasingly sophisticated and interdisciplinary behavioral analysis.
An Interpersonal Approach to Writing Negative Messages.
ERIC Educational Resources Information Center
Salerno, Douglas
1988-01-01
Asserts that textbook advice regarding buffers and negative messages is simplistic and frequently wrong, and analyses 22 job-refusal letters and their effectiveness. Claims that recent research on cognitive complexity and social perspective-taking suggests the need for more sophisticated audience analysis protocols for dealing with the negative…
Comparison of Traditional and Trial-Based Methodologies for Conducting Functional Analyses
ERIC Educational Resources Information Center
LaRue, Robert H.; Lenard, Karen; Weiss, Mary Jane; Bamond, Meredith; Palmieri, Mark; Kelley, Michael E.
2010-01-01
Functional analysis represents a sophisticated and empirically supported functional assessment procedure. While these procedures have garnered considerable empirical support, they are often underused in clinical practice. Safety risks resulting from the evocation of maladaptive behavior and the length of time required to conduct functional…
The First Sophists and the Uses of History.
ERIC Educational Resources Information Center
Jarratt, Susan C.
1987-01-01
Reviews the history of intellectual views on the Greek sophists in three phases: (1) their disparagement by Plato and Aristotle as the morally disgraceful "other"; (2) nineteenth century British positivists' reappraisal of these relativists as ethically and scientifically superior; and (3) twentieth century versions of the sophists as…
DOT National Transportation Integrated Search
1977-02-01
The limitations of currently used estimation procedures in socio-economic modeling have been highlighted in the ongoing work of Senge, in which it is shown where more sophisticated estimation procedures may become necessary. One such advanced method ...
How To Teach "Dirty" Books in High School.
ERIC Educational Resources Information Center
O'Malley, William J.
1967-01-01
Today's self-centered, utopian attitudes toward sexual experience compel teachers to avoid both overcaution and over-indulgence in selecting controversial books for classroom use. One method of selection is to rank books in a gradual progression from those requiring little literary and sexual sophistication in the reader to those requiring much…
ERIC Educational Resources Information Center
Begeny, John C.; Krouse, Hailey E.; Brown, Kristina G.; Mann, Courtney M.
2011-01-01
Teacher judgments about students' academic abilities are important for instructional decision making and potential special education entitlement decisions. However, the small number of studies evaluating teachers' judgments are limited methodologically (e.g., sample size, procedural sophistication) and have yet to answer important questions…
Isolation by ion-exchange methods. In Sarker S.D. (ed) Natural Products Isolation, 3rd edition
USDA-ARS?s Scientific Manuscript database
The primary goal of many natural products chemists is to extract, isolate, and characterize specific analytes from complex plant, animal, microbial, and food matrices. To achieve this goal, they rely considerably on highly sophisticated and highly hyphenated modern instrumentation. Yet, the vast maj...
USDA-ARS?s Scientific Manuscript database
As global trade increases, invasive insects inflict increasing economic damage to agriculture and urban landscapes in the United States yearly, despite a sophisticated array of interception methods and quarantine programs designed to exclude their entry. Insects that are hidden inside soil, wood, or...
Detecting Satisficing in Online Surveys
ERIC Educational Resources Information Center
Salifu, Shani
2012-01-01
The proliferation of computers and high-speed internet services is making online activities an integral part of people's lives as they connect with friends, shop, and exchange data. The increasing ability of the internet to handle sophisticated data exchanges is endearing it to researchers interested in gathering all kinds of data. This method has the…
Teaching Economic Growth Theory with Data
ERIC Educational Resources Information Center
Elmslie, Bruce T.; Tebaldi, Edinaldo
2010-01-01
Many instructors in subjects such as economics are frequently concerned with how to teach technical material to undergraduate students with limited mathematical backgrounds. One method that has proven successful for the authors is to connect theoretically sophisticated material with actual data. This enables students to see how the theory relates…
Socially Responsible Knowledge and Behaviors: Comparing Upper vs. Lower Classmen
ERIC Educational Resources Information Center
Kozar, Joy M.; Connell, Kim Y. Hiller
2010-01-01
Utilizing a sample of undergraduate students and survey research methods, this study examined knowledge on issues of social responsibility within the apparel and textiles industry, comparing the sophistication among upper- versus lower-classmen. The study also investigated the differences between students in their socially responsible apparel…
Seeking Relevance: American Political Science and America
ERIC Educational Resources Information Center
Maranto, Robert; Woessner, Matthew C.
2012-01-01
In this article, the authors talk about the relevance of American political science and America. Political science has enormous strengths in its highly talented practitioners and sophisticated methods. However, its disconnection from its host society, while not so severe as for fields like English and sociology, nonetheless poses an existential…
NASA Technical Reports Server (NTRS)
Buntine, Wray
1994-01-01
IND computer program introduces Bayesian and Markov/maximum-likelihood (MML) methods and more-sophisticated methods of searching in growing trees. Produces more-accurate class-probability estimates important in applications like diagnosis. Provides range of features and styles with convenience for casual user, fine-tuning for advanced user or for those interested in research. Consists of four basic kinds of routines: data-manipulation, tree-generation, tree-testing, and tree-display. Written in C language.
Recent advances in modeling languages for pathway maps and computable biological networks.
Slater, Ted
2014-02-01
As our theories of systems biology grow more sophisticated, the models we use to represent them become larger and more complex. Modeling languages must have the expressivity and flexibility required to represent these models in ways that support high-resolution annotation, and must provide for simulation and analysis sophisticated enough to allow researchers to master their data in the proper context. These languages also need to facilitate model sharing and collaboration, which is currently best done by using uniform data structures (such as graphs) and language standards. In this brief review, we discuss three of the most recent systems biology modeling languages to appear: BEL, PySB, and BCML, and examine how they meet these needs. Copyright © 2014 Elsevier Ltd. All rights reserved.
Progress in Computational Electron-Molecule Collisions
NASA Astrophysics Data System (ADS)
Rescigno, Tn
1997-10-01
The past few years have witnessed tremendous progress in the development of sophisticated ab initio methods for treating collisions of slow electrons with isolated small molecules. Researchers in this area have benefited greatly from advances in computer technology; indeed, the advent of parallel computers has made it possible to carry out calculations at a level of sophistication inconceivable a decade ago. But bigger and faster computers are only part of the picture. Even with today's computers, the practical need to study electron collisions with the kinds of complex molecules and fragments encountered in real-world plasma processing environments is taxing present methods beyond their current capabilities. Since extrapolation of existing methods to handle increasingly larger targets will ultimately fail as it would require computational resources beyond any imagined, continued progress must also be linked to new theoretical developments. Some of the techniques recently introduced to address these problems will be discussed and illustrated with examples of electron-molecule collision calculations we have carried out on some fairly complex target gases encountered in processing plasmas. Electron-molecule scattering continues to pose many formidable theoretical and computational challenges. I will touch on some of the outstanding open questions.
2011-01-01
Background Tetrachloroethylene (PCE) is an important occupational chemical used in metal degreasing and drycleaning and a prevalent drinking water contaminant. Exposure often occurs with other chemicals but it occurred alone in a pattern that reduced the likelihood of confounding in a unique scenario on Cape Cod, Massachusetts. We previously found a small to moderate increased risk of breast cancer among women with the highest exposures using a simple exposure model. We have taken advantage of technical improvements in publically available software to incorporate a more sophisticated determination of water flow and direction to see if previous results were robust to more accurate exposure assessment. Methods The current analysis used PCE exposure estimates generated with the addition of water distribution modeling software (EPANET 2.0) to test model assumptions, compare exposure distributions to prior methods, and re-examine the risk of breast cancer. In addition, we applied data smoothing to examine nonlinear relationships between breast cancer and exposure. We also compared a set of measured PCE concentrations in water samples collected in 1980 to modeled estimates. Results Thirty-nine percent of individuals considered unexposed in prior epidemiological analyses were considered exposed using the current method, but mostly at low exposure levels. As a result, the exposure distribution was shifted downward resulting in a lower value for the 90th percentile, the definition of "high exposure" in prior analyses. The current analyses confirmed a modest increase in the risk of breast cancer for women with high PCE exposure levels defined by either the 90th percentile (adjusted ORs 1.0-1.5 for 0-19 year latency assumptions) or smoothing analysis cut point (adjusted ORs 1.3-2.0 for 0-15 year latency assumptions). Current exposure estimates had a higher correlation with PCE concentrations in water samples (Spearman correlation coefficient = 0.65, p < 0.0001) than estimates generated using the prior method (0.54, p < 0.0001). Conclusions The incorporation of sophisticated flow estimates in the exposure assessment method shifted the PCE exposure distribution downward, but did not meaningfully affect the exposure ranking of subjects or the strength of the association with the risk of breast cancer found in earlier analyses. Thus, the current analyses show a slightly elevated breast cancer risk for highly exposed women, with strengthened exposure assessment and minimization of misclassification by using the latest technology. PMID:21600013
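The adjusted odds ratios reported above come from regression models relating exposure categories to case status. A minimal, hypothetical sketch of that kind of estimate, using logistic regression in statsmodels with a single binary exposure indicator and one covariate, is shown below; the actual study adjusted for more covariates and used latency-lagged exposure estimates, neither of which is reproduced here.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical case-control data; "high_pce" flags exposure above the 90th percentile
rng = np.random.default_rng(2)
df = pd.DataFrame({
    "case": rng.integers(0, 2, 500),
    "high_pce": rng.integers(0, 2, 500),
    "age": rng.normal(60, 8, 500),
})

X = sm.add_constant(df[["high_pce", "age"]])
fit = sm.Logit(df["case"], X).fit(disp=False)

or_ci = np.exp(fit.conf_int().loc["high_pce"])
print(f"adjusted OR = {np.exp(fit.params['high_pce']):.2f} "
      f"(95% CI {or_ci[0]:.2f}-{or_ci[1]:.2f})")
```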
Aristotle and Social-Epistemic Rhetoric: The Systematizing of the Sophistic Legacy.
ERIC Educational Resources Information Center
Allen, James E.
While Aristotle's philosophical views are more foundational than those of many of the Older Sophists, Aristotle's rhetorical theories inherit and incorporate many of the central tenets ascribed to Sophistic rhetoric, albeit in a more systematic fashion, as represented in the "Rhetoric." However, Aristotle was more than just a rhetorical…
An Innovative Learning Model for Computation in First Year Mathematics
ERIC Educational Resources Information Center
Tonkes, E. J.; Loch, B. I.; Stace, A. W.
2005-01-01
MATLAB is a sophisticated software tool for numerical analysis and visualization. The University of Queensland has adopted MATLAB as its official teaching package across large first-year mathematics courses. In the past, the package has met severe resistance from students who have not appreciated their computational experience. Several main…
Demonstrating Success: Web Analytics and Continuous Improvement
ERIC Educational Resources Information Center
Loftus, Wayne
2012-01-01
As free and low-cost Web analytics tools become more sophisticated, libraries' approach to user analysis can become more nuanced and precise. Tracking appropriate metrics with a well-formulated analytics program can inform design decisions, demonstrate the degree to which those decisions have succeeded, and thereby inform the next iteration in the…
Emerging Uses of Computer Technology in Qualitative Research.
ERIC Educational Resources Information Center
Parker, D. Randall
The application of computer technology in qualitative research and evaluation ranges from simple word processing to doing sophisticated data sorting and retrieval. How computer software can be used for qualitative research is discussed. Researchers should consider the use of computers in data analysis in light of their own familiarity and comfort…
A Large-Scale Analysis of Variance in Written Language
ERIC Educational Resources Information Center
Johns, Brendan T.; Jamieson, Randall K.
2018-01-01
The collection of very large text sources has revolutionized the study of natural language, leading to the development of several models of language learning and distributional semantics that extract sophisticated semantic representations of words based on the statistical redundancies contained within natural language (e.g., Griffiths, Steyvers,…
Textbook Pathos: Tracing a Through-Line of Emotion in Composition Textbooks
ERIC Educational Resources Information Center
Jensen, Tim
2016-01-01
Gretchen Flesher Moon's 2003 analysis of emotion's treatment in composition textbooks revealed that pathos "gets very short shrift" or none at all. Since then, however, conversations regarding affect and emotion have advanced in both scope and sophistication. This proliferation of scholarly activity has brought the passions of persuasion…
Analysis of an Anti-Phishing Lab Activity
ERIC Educational Resources Information Center
Werner, Laurie A.; Courte, Jill
2010-01-01
Despite advances in spam detection software, anti-spam laws, and increasingly sophisticated users, the number of successful phishing scams continues to grow. In addition to monetary losses attributable to phishing, there is also a loss of confidence that stifles use of online services. Using in-class activities in an introductory computer course…
Instructional Design Considerations in Converting Non-CBT Materials into CBT Courses.
ERIC Educational Resources Information Center
Ng, Raymond
Instructional designers who are asked to convert existing training materials into computer-based training (CBT) must take special precautions to avoid making the product into a sophisticated page turner. Although conversion may save considerable time on subject research and analysis, courses to be delivered through microcomputers may require…
The microcomputer scientific software series 2: general linear model--regression.
Harold M. Rauscher
1983-01-01
The general linear model regression (GLMR) program provides the microcomputer user with a sophisticated regression analysis capability. The output provides a regression ANOVA table, estimators of the regression model coefficients, their confidence intervals, confidence intervals around the predicted Y-values, residuals for plotting, a check for multicollinearity, a...
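For readers without access to the GLMR program, the outputs it lists (regression ANOVA, coefficient estimates and confidence intervals, intervals around predicted Y-values, residuals, and a multicollinearity check) can be reproduced with standard tools. The sketch below uses statsmodels on a hypothetical forestry-style data set; it is not the GLMR code itself.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Hypothetical data set with two predictors
rng = np.random.default_rng(3)
df = pd.DataFrame({"dbh": rng.uniform(10, 60, 80), "height": rng.uniform(5, 30, 80)})
df["volume"] = 0.02 * df["dbh"] ** 2 + 0.3 * df["height"] + rng.normal(0, 2, 80)

X = sm.add_constant(df[["dbh", "height"]])
fit = sm.OLS(df["volume"], X).fit()

print(fit.summary())                      # overall regression ANOVA and coefficient table
print(fit.conf_int(alpha=0.05))           # confidence intervals around the coefficients
pred_ci = fit.get_prediction(X).conf_int()  # intervals around the predicted Y-values
residuals = fit.resid                     # residuals for plotting
vif = [variance_inflation_factor(X.values, i) for i in range(1, X.shape[1])]
print("VIF (multicollinearity check):", vif)
```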
USDA-ARS?s Scientific Manuscript database
Accurate prediction of pesticide volatilization is important for the protection of human and environmental health. Due to the complexity of the volatilization process, sophisticated predictive models are needed, especially for dry soil conditions. A mathematical model was developed to allow simulati...
ERIC Educational Resources Information Center
Pape, Stephen J.
2004-01-01
Many children read mathematics word problems and directly translate them to arithmetic operations. More sophisticated problem solvers transform word problems into object-based or mental models. Subsequent solutions are often qualitatively different because these models differentially support cognitive processing. Based on a conception of problem…
Modeling conflict : research methods, quantitative modeling, and lessons learned.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rexroth, Paul E.; Malczynski, Leonard A.; Hendrickson, Gerald A.
2004-09-01
This study investigates the factors that lead countries into conflict. Specifically, political, social and economic factors may offer insight as to how prone a country (or set of countries) may be for inter-country or intra-country conflict. Largely methodological in scope, this study examines the literature for quantitative models that address or attempt to model conflict both in the past, and for future insight. The analysis concentrates specifically on the system dynamics paradigm, not the political science mainstream approaches of econometrics and game theory. The application of this paradigm builds upon the most sophisticated attempt at modeling conflict as a result of system level interactions. This study presents the modeling efforts built on limited data and working literature paradigms, and recommendations for future attempts at modeling conflict.
NASA Technical Reports Server (NTRS)
1994-01-01
Lewis Research Center (LEW) has assisted The Cleveland Museum of Art (CMA) in analyzing the museum's paintings. Because of the many layers of paint that are often involved, this is a complex process. The cross-section of a paint chip must be scanned with a microscope to determine whether a paint layer is original or a restoration. The paint samples, however, are rarely flat enough for high magnification viewing and are frequently scratched. LEW devised an automated method that produces intact, flat, polished paint cross-sections. A sophisticated microprocessor-controlled grinding and polishing machine, previously employed manually in the preparation of exotic samples for aerospace research, proved to be a readily adaptable technique. It produced perfectly flat samples with clearly defined layers. The process has been used successfully on a number of paintings, and LEW and CMA are considering additional applications.
Ethics and Epistemology in Big Data Research.
Lipworth, Wendy; Mason, Paul H; Kerridge, Ian; Ioannidis, John P A
2017-12-01
Biomedical innovation and translation are increasingly emphasizing research using "big data." The hope is that big data methods will both speed up research and make its results more applicable to "real-world" patients and health services. While big data research has been embraced by scientists, politicians, industry, and the public, numerous ethical, organizational, and technical/methodological concerns have also been raised. With respect to technical and methodological concerns, there is a view that these will be resolved through sophisticated information technologies, predictive algorithms, and data analysis techniques. While such advances will likely go some way towards resolving technical and methodological issues, we believe that the epistemological issues raised by big data research have important ethical implications and raise questions about the very possibility of big data research achieving its goals.
Analytical methods development for supramolecular design in solar hydrogen production
NASA Astrophysics Data System (ADS)
Brown, J. R.; Elvington, M.; Mongelli, M. T.; Zigler, D. F.; Brewer, K. J.
2006-08-01
In the investigation of alternative energy sources, specifically, solar hydrogen production from water, the ability to perform experiments with a consistent and reproducible light source is key to meaningful photochemistry. The design, construction, and evaluation of a series of LED array photolysis systems for high throughput photochemistry have been performed. Three array systems of increasing sophistication are evaluated using calorimetric measurements and potassium tris(oxalato)ferrate(II) chemical actinometry and compared with a traditional 1000 W Xe arc lamp source. The results are analyzed using descriptive statistics and analysis of variance (ANOVA). The third generation array is modular, and controllable in design. Furthermore, the third generation array system is shown to be comparable in both precision and photonic output to a 1000 W Xe arc lamp.
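The comparison of light sources by analysis of variance mentioned above can be sketched as follows; the photon-flux replicates are hypothetical stand-ins for the actinometry measurements actually reported.

```python
from scipy import stats

# Hypothetical actinometry results (relative photon flux) for three LED array
# generations and a Xe arc lamp; the real study used ferrioxalate actinometry replicates
gen1 = [0.92, 0.95, 0.91, 0.94]
gen2 = [0.97, 0.99, 0.98, 0.96]
gen3 = [1.01, 1.02, 1.00, 1.03]
xe_lamp = [1.00, 0.99, 1.02, 1.01]

f_stat, p_value = stats.f_oneway(gen1, gen2, gen3, xe_lamp)
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")
```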
NASA Astrophysics Data System (ADS)
Li, Tianfang; Wang, Jing; Wen, Junhai; Li, Xiang; Lu, Hongbing; Hsieh, Jiang; Liang, Zhengrong
2004-05-01
To treat the noise in low-dose x-ray CT projection data more accurately, analysis of the noise properties of the data and development of a corresponding efficient noise treatment method are two major problems to be addressed. In order to obtain an accurate and realistic model of the x-ray CT system, we acquired thousands of repeated measurements on different phantoms at several fixed scan angles using a GE high-speed multi-slice spiral CT scanner. The collected data were calibrated and log-transformed by the sophisticated system software, which converts the detected photon energy into sinogram data that satisfies the Radon transform. From the analysis of these experimental data, a nonlinear relation between mean and variance for each datum of the sinogram was obtained. In this paper, we integrated this nonlinear relation into a penalized likelihood statistical framework for SNR (signal-to-noise ratio) adaptive smoothing of noise in the sinogram. After the proposed preprocessing, the sinograms were reconstructed with the unapodized FBP (filtered backprojection) method. The resulting images were evaluated quantitatively, in terms of noise uniformity and noise-resolution tradeoff, in comparison with other noise smoothing methods such as the Hanning and Butterworth filters at different cutoff frequencies. Significant improvement in the noise-resolution tradeoff and noise properties was demonstrated.
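A minimal sketch of variance-weighted sinogram smoothing is given below, using a penalized weighted least-squares formulation rather than the authors' penalized likelihood implementation. The specific mean-variance function, the penalty weight, and the synthetic sinogram row are assumptions for illustration and do not reproduce the relation fitted from the GE scanner data.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def pwls_smooth(y, var, beta=50.0):
    """Penalized weighted least-squares smoothing of one sinogram row.

    Data weights are the inverse of the estimated noise variance, so low-SNR
    bins are smoothed more aggressively than high-SNR bins.
    """
    n = len(y)
    w = 1.0 / np.maximum(var, 1e-12)
    W = sparse.diags(w)
    D = sparse.diags([-1.0, 1.0], [0, 1], shape=(n - 1, n))   # first-difference penalty
    A = W + beta * D.T @ D
    return spsolve(A.tocsc(), w * y)

# Hypothetical noisy sinogram row whose variance grows nonlinearly with the mean
rng = np.random.default_rng(4)
mean_profile = 2.0 + np.sin(np.linspace(0, np.pi, 256))
var_profile = 0.01 * np.exp(mean_profile / 2.5)   # assumed nonlinear mean-variance relation
noisy = mean_profile + rng.normal(0.0, np.sqrt(var_profile))
smoothed = pwls_smooth(noisy, var_profile)
```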
Phase transition studies in bismuth ferrite thin films synthesized via spray pyrolysis technique
NASA Astrophysics Data System (ADS)
Goyal, Ankit; Lakhotia, Harish
2013-06-01
Multiferroics are materials that combine two or more "ferroic" properties: ferromagnetism, ferroelectricity, or ferroelasticity. BiFeO3 is the only single-phase material that is multiferroic at room temperature, possessing a high Curie temperature (TC ˜ 1103 K) and a high Neel temperature (TN ˜ 643 K). Sophisticated methods are normally used to deposit such thin films, but here we have tried a different, low-cost spray pyrolysis method to deposit BiFeO3 thin films on glass substrates with a rhombohedral crystal structure and R3c space group. Bismuth ferrite thin films were synthesized using bismuth nitrate and iron nitrate as precursor solutions. X-ray diffraction (XRD) and scanning electron microscopy (SEM) were used for structural analysis of the prepared thin films. The XRD pattern shows phase formation of BiFeO3, and SEM analysis shows the formation of nanocrystals of about 200 nm. High-temperature resistivity measurements were performed using a Keithley electrometer (two-probe system). Abrupt behavior in the temperature range 313 K - 400 K was observed in the resistance studies, which suggests that in this transition the structure is tetragonal rather than rhombohedral. BiFeO3 is a potential active material for the next generation of ferroelectric memory devices.
Lopatka, Martin; Barcaru, Andrei; Sjerps, Marjan J; Vivó-Truyols, Gabriel
2016-01-29
Accurate analysis of chromatographic data often requires the removal of baseline drift. A frequently employed strategy strives to determine asymmetric weights in order to fit a baseline model by regression. Unfortunately, chromatograms characterized by a very high peak saturation pose a significant challenge to such algorithms. In addition, a low signal-to-noise ratio (i.e. s/n<40) also adversely affects accurate baseline correction by asymmetrically weighted regression. We present a baseline estimation method that leverages a probabilistic peak detection algorithm. A posterior probability of being affected by a peak is computed for each point in the chromatogram, leading to a set of weights that allow non-iterative calculation of a baseline estimate. For extremely saturated chromatograms, the peak weighted (PW) method demonstrates notable improvement compared to the other methods examined. However, in chromatograms characterized by low-noise and well-resolved peaks, the asymmetric least squares (ALS) and the more sophisticated Mixture Model (MM) approaches achieve superior results in significantly less time. We evaluate the performance of these three baseline correction methods over a range of chromatographic conditions to demonstrate the cases in which each method is most appropriate. Copyright © 2016 Elsevier B.V. All rights reserved.
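The asymmetric least squares (ALS) approach referenced above can be sketched in a few lines. The smoothing and asymmetry parameters and the synthetic chromatogram below are assumptions for illustration, following the widely used Eilers-Boelens formulation rather than the paper's peak-weighted or Mixture Model methods.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def als_baseline(y, lam=1e5, p=0.01, n_iter=10):
    """Asymmetric least squares baseline estimate (Eilers-Boelens style sketch).

    Points above the current baseline (likely peaks) get weight p, points below
    get weight 1 - p, so the fit follows the drifting baseline rather than the peaks.
    """
    n = len(y)
    D = sparse.diags([1.0, -2.0, 1.0], [0, 1, 2], shape=(n - 2, n))  # second differences
    w = np.ones(n)
    z = y.copy()
    for _ in range(n_iter):
        W = sparse.diags(w)
        z = spsolve((W + lam * D.T @ D).tocsc(), w * y)
        w = np.where(y > z, p, 1 - p)
    return z

# Hypothetical chromatogram: two Gaussian peaks on a slow drift plus noise
x = np.linspace(0, 1, 1000)
drift = 0.5 * x + 0.2 * np.sin(3 * x)
peaks = np.exp(-((x - 0.3) / 0.01) ** 2) + 0.7 * np.exp(-((x - 0.7) / 0.015) ** 2)
signal = drift + peaks + 0.005 * np.random.default_rng(5).standard_normal(x.size)
corrected = signal - als_baseline(signal)
```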
Szatkiewicz, Jin P; Wang, WeiBo; Sullivan, Patrick F; Wang, Wei; Sun, Wei
2013-02-01
Structural variation is an important class of genetic variation in mammals. High-throughput sequencing (HTS) technologies promise to revolutionize copy-number variation (CNV) detection but present substantial analytic challenges. Converging evidence suggests that multiple types of CNV-informative data (e.g. read-depth, read-pair, split-read) need be considered, and that sophisticated methods are needed for more accurate CNV detection. We observed that various sources of experimental biases in HTS confound read-depth estimation, and note that bias correction has not been adequately addressed by existing methods. We present a novel read-depth-based method, GENSENG, which uses a hidden Markov model and negative binomial regression framework to identify regions of discrete copy-number changes while simultaneously accounting for the effects of multiple confounders. Based on extensive calibration using multiple HTS data sets, we conclude that our method outperforms existing read-depth-based CNV detection algorithms. The concept of simultaneous bias correction and CNV detection can serve as a basis for combining read-depth with other types of information such as read-pair or split-read in a single analysis. A user-friendly and computationally efficient implementation of our method is freely available.
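A minimal sketch of the read-depth bias-correction idea is shown below, using a negative binomial regression in statsmodels with hypothetical GC-content and mappability covariates; GENSENG itself additionally couples this regression with a hidden Markov model over copy-number states, which is not reproduced here.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical per-window read counts with GC content and mappability as confounders
rng = np.random.default_rng(6)
df = pd.DataFrame({
    "gc": rng.uniform(0.3, 0.6, 2000),
    "mappability": rng.uniform(0.7, 1.0, 2000),
})
mu = np.exp(3.0 + 1.5 * df["gc"] + 0.8 * df["mappability"])
df["reads"] = rng.poisson(mu)   # stand-in counts; real sequencing data are overdispersed

X = sm.add_constant(df[["gc", "mappability"]])
nb_fit = sm.GLM(df["reads"], X, family=sm.families.NegativeBinomial(alpha=0.1)).fit()

# Bias-corrected depth signal: observed over expected coverage per window,
# which a downstream HMM could segment into copy-number states
depth_ratio = df["reads"] / nb_fit.fittedvalues
```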
The tool for the automatic analysis of lexical sophistication (TAALES): version 2.0.
Kyle, Kristopher; Crossley, Scott; Berger, Cynthia
2017-07-11
This study introduces the second release of the Tool for the Automatic Analysis of Lexical Sophistication (TAALES 2.0), a freely available and easy-to-use text analysis tool. TAALES 2.0 is housed on a user's hard drive (allowing for secure data processing) and is available on most operating systems (Windows, Mac, and Linux). TAALES 2.0 adds 316 indices to the original tool. These indices are related to word frequency, word range, n-gram frequency, n-gram range, n-gram strength of association, contextual distinctiveness, word recognition norms, semantic network, and word neighbors. In this study, we validated TAALES 2.0 by investigating whether its indices could be used to model both holistic scores of lexical proficiency in free writes and word choice scores in narrative essays. The results indicated that the TAALES 2.0 indices could be used to explain 58% of the variance in lexical proficiency scores and 32% of the variance in word-choice scores. Newly added TAALES 2.0 indices, including those related to n-gram association strength, word neighborhood, and word recognition norms, featured heavily in these predictor models, suggesting that TAALES 2.0 represents a substantial upgrade.
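As a toy illustration of one class of index described above, the sketch below computes a mean log word frequency for a text against a small, hypothetical table of frequency norms; TAALES 2.0 draws on much larger reference corpora and computes hundreds of such indices.

```python
import re
from math import log10
from statistics import mean

# Hypothetical frequency norms (counts per million); purely illustrative values
freq_norms = {"the": 60000.0, "analysis": 120.0, "sophisticated": 15.0, "lexical": 4.0}

def mean_log_frequency(text, norms):
    """Average log10 frequency of the words found in the norms (lower = rarer vocabulary)."""
    words = re.findall(r"[a-z']+", text.lower())
    known = [log10(norms[w]) for w in words if w in norms]
    return mean(known) if known else float("nan")

print(mean_log_frequency("The sophisticated lexical analysis", freq_norms))
```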
Priye, Aashish; Wong, Season; Bi, Yuanpeng; Carpio, Miguel; Chang, Jamison; Coen, Mauricio; Cope, Danielle; Harris, Jacob; Johnson, James; Keller, Alexandra; Lim, Richard; Lu, Stanley; Millard, Alex; Pangelinan, Adriano; Patel, Neal; Smith, Luke; Chan, Kamfai; Ugaz, Victor M
2016-05-03
We introduce a portable biochemical analysis platform for rapid field deployment of nucleic acid-based diagnostics using consumer-class quadcopter drones. This approach exploits the ability to isothermally perform the polymerase chain reaction (PCR) with a single heater, enabling the system to be operated using standard 5 V USB sources that power mobile devices (via battery, solar, or hand crank action). Time-resolved fluorescence detection and quantification is achieved using a smartphone camera and integrated image analysis app. Standard sample preparation is enabled by leveraging the drone's motors as centrifuges via 3D printed snap-on attachments. These advancements make it possible to build a complete DNA/RNA analysis system at a cost of ∼$50 ($US). Our instrument is rugged and versatile, enabling pinpoint deployment of sophisticated diagnostics to distributed field sites. This capability is demonstrated by successful in-flight replication of Staphylococcus aureus and λ-phage DNA targets in under 20 min. The ability to perform rapid in-flight assays with smartphone connectivity eliminates delays between sample collection and analysis so that test results can be delivered in minutes, suggesting new possibilities for drone-based systems to function in broader and more sophisticated roles beyond cargo transport and imaging.
TOF-SIMS imaging technique with information entropy
NASA Astrophysics Data System (ADS)
Aoyagi, Satoka; Kawashima, Y.; Kudo, Masahiro
2005-05-01
Time-of-flight secondary ion mass spectrometry (TOF-SIMS) is, in principle, capable of chemical imaging of proteins on insulated samples. However, selecting the specific peaks related to a particular protein, which are necessary for chemical imaging, out of numerous candidates has been difficult without an appropriate spectrum analysis technique. Therefore, multivariate analysis techniques, such as principal component analysis (PCA), and analysis with mutual information as defined by information theory, have been applied to interpret SIMS spectra of protein samples. In this study, mutual information was used to select specific peaks related to proteins in order to obtain chemical images. Proteins on insulated materials were measured with TOF-SIMS, and the SIMS spectra were then analyzed by means of a comparison based on mutual information. Chemical maps of each protein were obtained using the specific peaks selected for that protein on the basis of their mutual information values. The resulting TOF-SIMS images of proteins on the materials provide useful information on the properties of protein adsorption, the optimality of immobilization processes, and reactions between proteins. Thus, chemical imaging of proteins by TOF-SIMS helps in understanding the interactions between material surfaces and proteins and in developing sophisticated biomaterials.
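The peak-selection step can be illustrated by ranking candidate peaks by the mutual information between their intensities and the sample identity; the sketch below uses scikit-learn's estimator on simulated data and is not the authors' implementation.

    import numpy as np
    from sklearn.feature_selection import mutual_info_classif

    # Rows: measured spectra (pixels or samples); columns: candidate peak intensities.
    rng = np.random.default_rng(1)
    intensities = rng.random((200, 50))
    labels = rng.integers(0, 2, 200)            # e.g. protein A vs. protein B regions
    intensities[labels == 1, 7] += 0.5          # peak 7 is made protein-specific

    mi = mutual_info_classif(intensities, labels, random_state=0)
    top_peaks = np.argsort(mi)[::-1][:5]        # most protein-informative peaks
    print(top_peaks)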
Isocratean Discourse Theory and Neo-Sophistic Pedagogy: Implications for the Composition Classroom.
ERIC Educational Resources Information Center
Blair, Kristine L.
With the recent interest in the fifth century B.C. theories of Protagoras and Gorgias come assumptions about the philosophical affinity of the Greek educator Isocrates to this pair of older sophists. Isocratean education in discourse, with its emphasis on collaborative political discourse, falls within recent definitions of a sophist curriculum.…
From Poetry to Prose: Sophistic Rhetoric and the Epistemic Music of Language.
ERIC Educational Resources Information Center
Katz, Steven B.
Much revisionist scholarship has focused on sophistic epistemology and its relationship to the current revival of epistemic rhetoric in the academy. However, few scholars have recognized the sensuous substance of words as sounds, and the role it played in sophistic philosophy and rhetoric. Before the invention of the Greek alphabet, poetry was…
Kardum-Skelin, Ika
2011-09-01
Clinical cytology is an interdisciplinary medical diagnostic profession that integrates clinical, laboratory and analytical fields along with the cytologist's final expert opinion. Cytology involves nonaggressive, minimally invasive and easy-to-use procedures that are fully acceptable to the patient. Cytology offers rapid orientation, while in combination with additional technologies on cytologic smear analysis (cytochemistry, immunocytochemistry for cell marker analysis, computer image analysis) or sophisticated methods on cytologic samples (flow cytometry, molecular and cytogenetic analysis) it plays a major role in the diagnosis, subtyping and prognosis of malignant tumors. Ten rules for successful performance in cytology are as follows: 1) due knowledge of overall cytology (general cytologist); 2) inclusion in all stages of cytologic sample manipulation from sampling through reporting; 3) due knowledge of additional technologies to provide appropriate interpretation and/or rational advice in dubious cases; 4) to preserve the dignity of the profession because every profession has its advantages, shortcomings and limitations; 5) to insist on quality control of the performance, individual cytologists and the cytology team; 6) knowledge transfer to young professionals; 7) assisting fellow professionals in dubious cases irrespective of the time needed and fee, because it implies helping the patient and the profession itself; 8) experience exchange with other related professionals to upgrade mutual understanding; 9) to prefer the interest of the profession over one's own interest; and 10) to love cytology.
PHELPS, CHARLES; RAPPUOLI, RINO; LEVIN, SCOTT; SHORTLIFFE, EDWARD; COLWELL, RITA
2016-01-01
Policy Points: Scarce resources, especially in population health and public health practice, underlie the importance of strategic planning. Public health agencies' current planning and priority setting efforts are often narrow, at times opaque, and focused on single metrics such as cost-effectiveness. As demonstrated by SMART Vaccines, a decision support software system developed by the Institute of Medicine and the National Academy of Engineering, new approaches to strategic planning allow the formal incorporation of multiple stakeholder views and multicriteria decision making that surpass even those sophisticated cost-effectiveness analyses widely recommended and used for public health planning. Institutions of higher education can and should respond by building on modern strategic planning tools as they teach their students how to improve population health and public health practice. Context: Strategic planning in population health and public health practice often uses single indicators of success or, when using multiple indicators, provides no mechanism for coherently combining the assessments. Cost-effectiveness analysis, the most complex strategic planning tool commonly applied in public health, uses only a single metric to evaluate programmatic choices, even though other factors often influence actual decisions. Methods: Our work employed a multicriteria systems analysis approach, specifically multiattribute utility theory, to assist in strategic planning and priority setting in a particular area of health care (vaccines), thereby moving beyond the traditional cost-effectiveness analysis approach. Findings: (1) Multicriteria systems analysis provides more flexibility, transparency, and clarity in decision support for public health issues compared with cost-effectiveness analysis. (2) More sophisticated systems-level analyses will become increasingly important to public health as disease burdens increase and the resources to deal with them become scarcer. Conclusions: The teaching of strategic planning in public health must be expanded in order to fill a void in the profession's planning capabilities. Public health training should actively incorporate model building, promote the interactive use of software tools, and explore planning approaches that transcend restrictive assumptions of cost-effectiveness analysis. The Strategic Multi-Attribute Ranking Tool for Vaccines (SMART Vaccines), which was recently developed by the Institute of Medicine and the National Academy of Engineering to help prioritize new vaccine development, is a working example of systems analysis as a basis for decision support. PMID:26994711
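A linear-additive multiattribute utility score of the kind such tools compute can be sketched as follows; the attributes, 0-100 scales, and stakeholder weights are purely illustrative.

    # Two hypothetical vaccine programs scored on several criteria (0-100, higher is better),
    # combined with stakeholder-chosen weights that sum to one.
    weights = {"deaths_averted": 0.4, "cost": 0.2, "feasibility": 0.2, "equity": 0.2}
    candidates = {
        "vaccine_A": {"deaths_averted": 80, "cost": 40, "feasibility": 70, "equity": 60},
        "vaccine_B": {"deaths_averted": 55, "cost": 85, "feasibility": 90, "equity": 50},
    }

    def utility(scores, weights):
        # Linear-additive multiattribute utility.
        return sum(weights[a] * scores[a] for a in weights)

    ranking = sorted(candidates, key=lambda c: utility(candidates[c], weights), reverse=True)
    print(ranking)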
Otwombe, Kennedy N.; Petzold, Max; Martinson, Neil; Chirwa, Tobias
2014-01-01
Background Research in the predictors of all-cause mortality in HIV-infected people has widely been reported in literature. Making an informed decision requires understanding the methods used. Objectives We present a review on study designs, statistical methods and their appropriateness in original articles reporting on predictors of all-cause mortality in HIV-infected people between January 2002 and December 2011. Statistical methods were compared between 2002–2006 and 2007–2011. Time-to-event analysis techniques were considered appropriate. Data Sources Pubmed/Medline. Study Eligibility Criteria Original English-language articles were abstracted. Letters to the editor, editorials, reviews, systematic reviews, meta-analysis, case reports and any other ineligible articles were excluded. Results A total of 189 studies were identified (n = 91 in 2002–2006 and n = 98 in 2007–2011) out of which 130 (69%) were prospective and 56 (30%) were retrospective. One hundred and eighty-two (96%) studies described their sample using descriptive statistics while 32 (17%) made comparisons using t-tests. Kaplan-Meier methods for time-to-event analysis were commonly used in the earlier period (n = 69, 76% vs. n = 53, 54%, p = 0.002). Predictors of mortality in the two periods were commonly determined using Cox regression analysis (n = 67, 75% vs. n = 63, 64%, p = 0.12). Only 7 (4%) used advanced survival analysis methods of Cox regression analysis with frailty in which 6 (3%) were used in the later period. Thirty-two (17%) used logistic regression while 8 (4%) used other methods. There were significantly more articles from the first period using appropriate methods compared to the second (n = 80, 88% vs. n = 69, 70%, p-value = 0.003). Conclusion Descriptive statistics and survival analysis techniques remain the most common methods of analysis in publications on predictors of all-cause mortality in HIV-infected cohorts while prospective research designs are favoured. Sophisticated techniques of time-dependent Cox regression and Cox regression with frailty are scarce. This motivates for more training in the use of advanced time-to-event methods. PMID:24498313
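For readers unfamiliar with the contrast drawn in this review, a minimal Kaplan-Meier and Cox proportional hazards analysis might look like the sketch below (Python with pandas and the lifelines package, assumed installed); the cohort, follow-up times, and covariates are invented for illustration.

    import pandas as pd
    from lifelines import CoxPHFitter, KaplanMeierFitter

    # Illustrative cohort: follow-up time (months), death indicator, covariates.
    df = pd.DataFrame({
        "time": [12, 30, 24, 6, 48, 36, 18, 60],
        "died": [1, 0, 1, 1, 0, 0, 1, 0],
        "baseline_cd4": [150, 420, 90, 60, 510, 300, 200, 450],
        "age": [34, 41, 29, 50, 38, 45, 31, 55],
    })

    KaplanMeierFitter().fit(df["time"], df["died"])            # crude survival curve
    cph = CoxPHFitter().fit(df, duration_col="time", event_col="died")
    cph.print_summary()                                         # hazard ratios for covariates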
NASA Astrophysics Data System (ADS)
Kuehnel, C.; Hennemuth, A.; Oeltze, S.; Boskamp, T.; Peitgen, H.-O.
2008-03-01
The diagnosis support in the field of coronary artery disease (CAD) is very complex due to the numerous symptoms and performed studies leading to the final diagnosis. CTA and MRI are on their way to replace invasive catheter angiography. Thus, there is a need for sophisticated software tools that present the different analysis results, and correlate the anatomical and dynamic image information. We introduce a new software assistant for the combined result visualization of CTA and MR images, in which a dedicated concept for the structured presentation of original data, segmentation results, and individual findings is realized. Therefore, we define a comprehensive class hierarchy and assign suitable interaction functions. User guidance is coupled as closely as possible with available data, supporting a straightforward workflow design. The analysis results are extracted from two previously developed software assistants, providing coronary artery analysis and measurements, function analysis as well as late enhancement data investigation. As an extension we introduce a finding concept directly relating suspicious positions to the underlying data. An affine registration of CT and MR data in combination with the AHA 17-segment model enables the coupling of local findings to positions in all data sets. Furthermore, sophisticated visualization in 2D and 3D and interactive bull's eye plots facilitate a correlation of coronary stenoses and physiology. The software has been evaluated on 20 patient data sets.
Gertsch, Jana C; Noblitt, Scott D; Cropek, Donald M; Henry, Charles S
2010-05-01
A microchip capillary electrophoresis (MCE) system has been developed for the determination of perchlorate in drinking water. The United States Environmental Protection Agency (USEPA) recently proposed a health advisory limit for perchlorate in drinking water of 15 parts per billion (ppb), a level requiring large, sophisticated instrumentation, such as ion chromatography coupled with mass spectrometry (IC-MS), for detection. An inexpensive, portable system is desired for routine online monitoring applications of perchlorate in drinking water. Here, we present an MCE method using contact conductivity detection for perchlorate determination. The method has several advantages, including reduced analysis times relative to IC, inherent portability, high selectivity, and minimal sample pretreatment. Resolution of perchlorate from more abundant ions was achieved using zwitterionic, sulfobetaine surfactants, N-hexadecyl-N,N-dimethyl-3-ammonio-1-propane sulfonate (HDAPS) and N-tetradecyl-N,N-dimethyl-3-ammonio-1-propane sulfonate (TDAPS). The system performance and the optimization of the separation chemistry, including the use of these surfactants to resolve perchlorate from other anions, are discussed in this work. The system is capable of detection limits of 3.4 +/- 1.8 ppb (n = 6) in standards and 5.6 +/- 1.7 ppb (n = 6) in drinking water.
Social media for intelligence: research, concepts, and results
NASA Astrophysics Data System (ADS)
Franke, Ulrik; Rosell, Magnus
2016-05-01
When sampling part of the enormous amounts of social media data, it is important to consider whether the sample is representative. Any method of studying the sampled data is also prone to bias. Sampling and bias aside, the data may be generated with malicious intent, such as deception. Deception is a complicated (broad, situational, vague) concept. It seems improbable that an automated computer system would be able to find deception as such. Instead, we argue that the role of a system would be to aid the human analyst by detecting indicators, or clues, of (potential) deception. Indicators could take many forms and are typically neither necessary nor sufficient for there to be an actual deception. However, by using one or combining several of them, a human may reach conclusions. Indicators are not necessarily dependent and will be added to or removed from the analysis depending on the circumstances. This modularity can help in counteracting/alleviating attacks on the system by an adversary. If we become aware that an indicator is compromised, we can remove it from the analysis and/or replace it with a more sophisticated method that gives us a similar indication.
Lünse, Christina E.; Corbino, Keith A.; Ames, Tyler D.; Nelson, James W.; Roth, Adam; Perkins, Kevin R.; Sherlock, Madeline E.
2017-01-01
The discovery of structured non-coding RNAs (ncRNAs) in bacteria can reveal new facets of biology and biochemistry. Comparative genomics analyses executed by powerful computer algorithms have successfully been used to uncover many novel bacterial ncRNA classes in recent years. However, this general search strategy favors the discovery of more common ncRNA classes, whereas progressively rarer classes are correspondingly more difficult to identify. In the current study, we confront this problem by devising several methods to select subsets of intergenic regions that can concentrate these rare RNA classes, thereby increasing the probability that comparative sequence analysis approaches will reveal their existence. By implementing these methods, we discovered 224 novel ncRNA classes, which include ROOL RNA, an RNA class averaging 581 nt and present in multiple phyla, several highly conserved and widespread ncRNA classes with properties that suggest sophisticated biochemical functions and a multitude of putative cis-regulatory RNA classes involved in a variety of biological processes. We expect that further research on these newly found RNA classes will reveal additional aspects of novel biology, and allow for greater insights into the biochemistry performed by ncRNAs. PMID:28977401
Poka Yoke system based on image analysis and object recognition
NASA Astrophysics Data System (ADS)
Belu, N.; Ionescu, L. M.; Misztal, A.; Mazăre, A.
2015-11-01
Poka Yoke is a quality management method aimed at preventing faults from arising during production processes; it deals with “fail-safing” or “mistake-proofing”. The Poka Yoke concept was originated and developed by Shigeo Shingo for the Toyota Production System, and it is used in many fields, especially in monitoring production processes. In many cases, identifying faults in a production process involves a higher cost than the cost of disposal. Poka Yoke solutions are usually based on multiple sensors that identify nonconformities, which means additional mechanical and electronic equipment on the production line. Because the method itself is invasive and affects the production process, the cost of diagnostics rises, and the bulky machines through which a Poka Yoke system is implemented keep becoming more sophisticated. In this paper we propose a Poka Yoke solution based on image analysis and fault identification. The solution consists of a module for image acquisition, mid-level processing, and an object recognition module using an associative memory (Hopfield network type). All are integrated into an embedded system with an AD (analog-to-digital) converter and a Zynq 7000 (22 nm technology).
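The associative-memory component can be illustrated with a minimal binary Hopfield network; this is a generic sketch of the network type named in the abstract, not the embedded implementation.

    import numpy as np

    class HopfieldMemory:
        # Minimal binary (+1/-1) Hopfield associative memory with Hebbian learning.
        def __init__(self, n_units):
            self.W = np.zeros((n_units, n_units))

        def store(self, patterns):
            for p in patterns:                      # reference part images, flattened to +1/-1
                self.W += np.outer(p, p)
            np.fill_diagonal(self.W, 0)

        def recall(self, pattern, steps=10):
            s = pattern.copy()
            for _ in range(steps):                  # synchronous updates
                s = np.sign(self.W @ s)
                s[s == 0] = 1
            return s

    # Toy usage: store two 16-pixel reference patterns, recall from a corrupted probe.
    rng = np.random.default_rng(2)
    patterns = [np.sign(rng.standard_normal(16)) for _ in range(2)]
    memory = HopfieldMemory(16)
    memory.store(patterns)
    probe = patterns[0].copy()
    probe[:3] *= -1                                 # flip a few pixels
    restored = memory.recall(probe)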
Transcriptional profiling: a potential anti-doping strategy.
Rupert, J L
2009-12-01
Evolving challenges require evolving responses. The use of illicit performance enhancing drugs by athletes permeates the reality and the perception of elite sports. New drugs with ergogenic or masking potential are quickly adopted, driven by a desire to win and the necessity of avoiding detection. To counter this trend, anti-doping authorities are continually refining existing assays and developing new testing strategies. In the post-genome era, genetic- and molecular-based tests are being evaluated as potential approaches to detect new and sophisticated forms of doping. Transcriptome analysis, in which a tissue's complement of mRNA transcripts is characterized, is one such method. The quantity and composition of a tissue's transcriptome is highly reflective of milieu and metabolic activity. There is much interest in transcriptional profiling in medical diagnostics and, as transcriptional information can be obtained from a variety of easily accessed tissues, similar approaches could be used in doping control. This article briefly reviews current understanding of the transcriptome, common methods of global analysis of gene expression and non-invasive sample sources. While the focus of this article is on anti-doping, the principles and methodology described could be applied to any research in which non-invasive, yet biologically informative sampling is desired.
Improved microarray methods for profiling the yeast knockout strain collection
Yuan, Daniel S.; Pan, Xuewen; Ooi, Siew Loon; Peyser, Brian D.; Spencer, Forrest A.; Irizarry, Rafael A.; Boeke, Jef D.
2005-01-01
A remarkable feature of the Yeast Knockout strain collection is the presence of two unique 20mer TAG sequences in almost every strain. In principle, the relative abundances of strains in a complex mixture can be profiled swiftly and quantitatively by amplifying these sequences and hybridizing them to microarrays, but TAG microarrays have not been widely used. Here, we introduce a TAG microarray design with sophisticated controls and describe a robust method for hybridizing high concentrations of dye-labeled TAGs in single-stranded form. We also highlight the importance of avoiding PCR contamination and provide procedures for detection and eradication. Validation experiments using these methods yielded false positive (FP) and false negative (FN) rates for individual TAG detection of 3–6% and 15–18%, respectively. Analysis demonstrated that cross-hybridization was the chief source of FPs, while TAG amplification defects were the main cause of FNs. The materials, protocols, data and associated software described here comprise a suite of experimental resources that should facilitate the use of TAG microarrays for a wide variety of genetic screens. PMID:15994458
Diaz, Maureen H; Winchell, Jonas M
2016-01-01
Over the past decade there have been significant advancements in the methods used for detecting and characterizing Mycoplasma pneumoniae, a common cause of respiratory illness and community-acquired pneumonia worldwide. The repertoire of available molecular diagnostics has greatly expanded from nucleic acid amplification techniques (NAATs) that encompass a variety of chemistries used for detection, to more sophisticated characterizing methods such as multi-locus variable-number tandem-repeat analysis (MLVA), Multi-locus sequence typing (MLST), matrix-assisted laser desorption ionization-time-of-flight mass spectrometry (MALDI-TOF MS), single nucleotide polymorphism typing, and numerous macrolide susceptibility profiling methods, among others. These many molecular-based approaches have been developed and employed to continually increase the level of discrimination and characterization in order to better understand the epidemiology and biology of M. pneumoniae. This review will summarize recent molecular techniques and procedures and lend perspective to how each has enhanced the current understanding of this organism and will emphasize how Next Generation Sequencing may serve as a resource for researchers to gain a more comprehensive understanding of the genomic complexities of this insidious pathogen.
Exploring Remote Sensing Through The Use Of Readily-Available Classroom Technologies
NASA Astrophysics Data System (ADS)
Rogers, M. A.
2013-12-01
Frontier geoscience research using remotely sensed satellite observations routinely requires sophisticated and novel remote sensing techniques to succeed. Describing these techniques in an educational format presents significant challenges to the science educator, especially in the professional development setting, where a small but competent audience has limited instructor contact time to develop the necessary understanding. In this presentation, we describe the use of simple, inexpensive technologies, including ultrasonic transducers, FLIR detectors, and even simple web cameras, to provide a tangible analogue to sophisticated remote sensing platforms. We also describe methods of curriculum development that leverage these simple devices to teach the fundamentals of remote sensing, resulting in a deeper and more intuitive understanding of the techniques used in modern remote sensing research. Sample workshop itineraries using these techniques are provided as well.
Polydiacetylene-Based Liposomes: An "Optical Tongue" for Bacteria Detection and Identification
ERIC Educational Resources Information Center
West, Matthew R.; Hanks, Timothy W.; Watson, Rhett T.
2009-01-01
Food- and water-borne bacteria are a major health concern worldwide. Current detection methods are time-consuming and require sophisticated equipment that is not always readily available. However, new techniques based on nanotechnology are under development that will result in a new generation of sensors. In this experiment, liposomes are…
Cancer Imaging Phenomics Toolkit (CaPTK) | Informatics Technology for Cancer Research (ITCR)
CaPTk is a tool that facilitates translation of highly sophisticated methods that help us gain a comprehensive understanding of the underlying mechanisms of cancer from medical imaging research to the clinic. It replicates basic interactive functionalities of radiological workstations and is distributed under a BSD-style license.
Test Design with Cognition in Mind
ERIC Educational Resources Information Center
Gorin, Joanna S.
2006-01-01
One of the primary themes of the National Research Council's 2001 book "Knowing What Students Know" was the importance of cognition as a component of assessment design and measurement theory (NRC, 2001). One reaction to the book has been an increased use of sophisticated statistical methods to model cognitive information available in test data.…
ERIC Educational Resources Information Center
Longberg, Pauline Oliphant
2012-01-01
As computer assisted instruction (CAI) becomes increasingly sophisticated, its appeal as a viable method of literacy intervention with young children continues despite limited evidence of effectiveness. The present study sought to assess the impact of one such CAI program, "Imagine Learning English" (ILE), on both the receptive…
Have the Focus and Sophistication of Research in Health Education Changed?
ERIC Educational Resources Information Center
Merrill, Ray M.; Lindsay, Christopher A.; Shields, Eric C.; Stoddard, Julianne
2007-01-01
This study assessed the types of research and the statistical methods used in three representative health education journals from 1994 through 2003. Editorials, commentaries, program/practice notes, and perspectives represent 17.6% of the journals' content. The most common types of articles are cross-sectional studies (27.5%), reviews (23.2%), and…
ERIC Educational Resources Information Center
Toussaint, Karen A.; Tiger, Jeffrey H.
2012-01-01
Covert self-injurious behavior (i.e., behavior that occurs in the absence of other people) can be difficult to treat. Traditional treatments typically have involved sophisticated methods of observation and often have employed positive punishment procedures. The current study evaluated the effectiveness of a variable momentary differential…
Physics in one dimension: theoretical concepts for quantum many-body systems.
Schönhammer, K
2013-01-09
Various sophisticated approximation methods exist for the description of quantum many-body systems. It was realized early on that the theoretical description can simplify considerably in one-dimensional systems and various exact solutions exist. The focus in this introductory paper is on fermionic systems and the emergence of the Luttinger liquid concept.
DOT National Transportation Integrated Search
2009-03-21
This study investigates all of the generated soils data in an attempt to use the more 'routine' laboratory tests to determine geotechnical design parameters (such as phi angle, cohesion, wet unit weight, unconfined compression, consolidation character...
Microcomputer Based Computer-Assisted Learning System: CASTLE.
ERIC Educational Resources Information Center
Garraway, R. W. T.
The purpose of this study was to investigate the extent to which a sophisticated computer assisted instruction (CAI) system could be implemented on the type of microcomputer system currently found in the schools. A method was devised for comparing CAI languages and was used to rank five common CAI languages. The highest ranked language, NATAL,…
Practice-based evidence study design for comparative effectiveness research.
Horn, Susan D; Gassaway, Julie
2007-10-01
To describe a new, rigorous, comprehensive practice-based evidence for clinical practice improvement (PBE-CPI) study methodology, and compare its features, advantages, and disadvantages to those of randomized controlled trials and sophisticated statistical methods for comparative effectiveness research. PBE-CPI incorporates natural variation within data from routine clinical practice to determine what works, for whom, when, and at what cost. It uses the knowledge of front-line caregivers, who develop study questions and define variables as part of a transdisciplinary team. Its comprehensive measurement framework provides a basis for analyses of significant bivariate and multivariate associations between treatments and outcomes, controlling for patient differences, such as severity of illness. PBE-CPI studies can uncover better practices more quickly than randomized controlled trials or sophisticated statistical methods, while achieving many of the same advantages. We present examples of actionable findings from PBE-CPI studies in postacute care settings related to comparative effectiveness of medications, nutritional support approaches, incontinence products, physical therapy activities, and other services. Outcomes improved when practices associated with better outcomes in PBE-CPI analyses were adopted in practice.
Big Data Analysis Framework for Healthcare and Social Sectors in Korea
Song, Tae-Min
2015-01-01
Objectives We reviewed applications of big data analysis of healthcare and social services in developed countries, and subsequently devised a framework for such an analysis in Korea. Methods We reviewed the status of implementing big data analysis of health care and social services in developed countries, and strategies used by the Ministry of Health and Welfare of Korea (Government 3.0). We formulated a conceptual framework of big data in the healthcare and social service sectors at the national level. As a specific case, we designed a process and method of social big data analysis on suicide buzz. Results Developed countries (e.g., the United States, the UK, Singapore, Australia, and even OECD and EU) are emphasizing the potential of big data, and using it as a tool to solve their long-standing problems. Big data strategies for the healthcare and social service sectors were formulated based on an ICT-based policy of current government and the strategic goals of the Ministry of Health and Welfare. We suggest a framework of big data analysis in the healthcare and welfare service sectors separately and assigned them tentative names: 'health risk analysis center' and 'integrated social welfare service network'. A framework of social big data analysis is presented by applying it to the prevention and proactive detection of suicide in Korea. Conclusions There are some concerns with the utilization of big data in the healthcare and social welfare sectors. Thus, research on these issues must be conducted so that sophisticated and practical solutions can be reached. PMID:25705552
Cheques and challenges: business performance in New Zealand general practice.
Greatbanks, Richard; Doolan-Noble, Fiona; McKenna, Alex
2017-09-01
INTRODUCTION New Zealand general practice mainly functions as small businesses, usually owned by a single or small group of doctors. Consequently, owners often have to balance the provision of patient care with varying funding priorities, changing patient needs and the pressures of running a sustainable business. Such balancing inevitably leads to tensions developing between these factors. AIM To explore and understand these tensions and responses to them, by examining the business performance measurements used by general practice. METHODS For this study, the unit of analysis and focus were individual practices, but qualitative semi-structured interviews with general practitioners (GPs) and practice managers were used to gather the data. RESULTS All participating practices had some form of governance or board review, where high-level aggregated business performance data were presented. More sophisticated business performance measures were evident in the larger, more developed practices and in practices functioning as community trusts. Examples of such measures included doctor utilisation and efficiency, appraisal of risk, patient satisfaction with services and responses to changes in patient demand. DISCUSSION As the number of general practices based on the traditional model decrease, a corresponding increase is likely in the establishment and development of 'super practices' based on a corporatized, multi-service, single-location model. Consequently, service delivery will become increasingly complex and will drive a need for increased sophistication in how general practice measures its business performance, thus ensuring a balance between high-quality, safe patient care and the maintenance of a sustainable business.
Unusual way of suicide by carbon monoxide. Case Report.
Zelený, Michal; Pivnička, Jan; Šindler, Martin; Kukleta, Pavel
2015-01-01
Authors discuss the case of a suicide of a 29-year-old man caused by carbon monoxide (CO) intoxication. What the authors found interesting was the unusual way of committing suicide that required good technical skills and expert knowledge. The level of carboxyhemoglobin (COHb) in the blood of the deceased man was routinely determined by the modified method by Blackmoore (1970), using gas chromatography/thermal conductivity detection. The level of saturation of the hemoglobin by CO in the collected blood sample is determined relatively to the same sample saturated to 100%. In the blood sample of the deceased man the lethal concentration of COHb of 76.5% was determined. Within the following examinations the blood alcohol concentration of 0.05 g.kg(-1) was determined. Further analysis revealed traces of sertraline, its metabolite N-desmethylsertraline, omeprazole and caffeine in the liver tissue, traces of N-desmethylsertraline, ibuprofen and caffeine in urine sample, and only traces of caffeine in the stomach content and blood samples were proved. To commit suicide the man used a sophisticated double container-system equipped with a timer for controlled generation of CO based on the chemical reaction of concentrated sulphuric acid and formic acid. The used timer was set by an electromechanical timer switch that triggered the fatal reaction of the acids while the man was sleeping. The authors discuss an unusual case of suicide by CO intoxication rarely seen in the area of forensic medicine and toxicology that is specific due to its sophisticated way of execution.
Analytical surveillance of emerging drugs of abuse and drug formulations
Thomas, Brian F.; Pollard, Gerald T.; Grabenauer, Megan
2012-01-01
Uncontrolled recreational drugs are proliferating in number and variety. Effects of long-term use are unknown, and regulation is problematic, as efforts to control one chemical often lead to several other structural analogs. Advanced analytical instrumentation and methods are continuing to be developed to identify drugs, chemical constituents of products, and drug substances and metabolites in biological fluids. Several mass spectrometry based approaches appear promising, particularly those that involve high resolution chromatographic and mass spectrometric methods that allow unbiased data acquisition and sophisticated data interrogation. Several of these techniques are shown to facilitate both targeted and broad spectrum analysis, which is often of particular benefit when dealing with misleadingly labeled products or assessing a biological matrix for illicit drugs and metabolites. The development and application of novel analytical approaches such as these will help to assess the nature and degree of exposure and risk and, where necessary, inform forensics and facilitate implementation of specific regulation and control measures. PMID:23154240
David, R.; Stoessel, A.; Berthoz, A.; Spoor, F.; Bennequin, D.
2016-01-01
The semicircular duct system is part of the sensory organ of balance and essential for navigation and spatial awareness in vertebrates. Its function in detecting head rotations has been modelled with increasing sophistication, but the biomechanics of actual semicircular duct systems has rarely been analyzed, foremost because the fragile membranous structures in the inner ear are hard to visualize undistorted and in full. Here we present a new, easy-to-apply and non-invasive method for three-dimensional in-situ visualization and quantification of the semicircular duct system, using X-ray micro tomography and tissue staining with phosphotungstic acid. Moreover, we introduce Ariadne, a software toolbox which provides comprehensive and improved morphological and functional analysis of any visualized duct system. We demonstrate the potential of these methods by presenting results for the duct system of humans, the squirrel monkey and the rhesus macaque, making comparisons with past results from neurophysiological, oculometric and biomechanical studies. Ariadne is freely available at http://www.earbank.org. PMID:27604473
Evaluating an education/training module to foster knowledge of cockpit weather technology.
Cobbett, Erin A; Blickensderfer, Elizabeth L; Lanicci, John
2014-10-01
Previous research has indicated that general aviation (GA) pilots may use the sophisticated meteorological information available to them via a variety of Next-Generation Weather Radar (NEXRAD) based weather products in a manner that actually decreases flight safety. The current study examined an education/training method designed to enable GA pilots to use NEXRAD-based products effectively in convective weather situations. The training method was lecture combined with paper-based scenario exercises. A multivariate analysis of variance revealed that subjects in the training condition performed significantly better than did subjects in the control condition on several knowledge and attitude measures. Subjects in the training condition improved from a mean score of 66% to 80% on the radar-knowledge test and from 62% to 75% on the scenario-knowledge test. Although additional research is needed, these results demonstrated that pilots can benefit from a well-designed education/training program involving specific areas of aviation weather-related knowledge.
The HMMER Web Server for Protein Sequence Similarity Search.
Prakash, Ananth; Jeffryes, Matt; Bateman, Alex; Finn, Robert D
2017-12-08
Protein sequence similarity search is one of the most commonly used bioinformatics methods for identifying evolutionarily related proteins. In general, sequences that are evolutionarily related share some degree of similarity, and sequence-search algorithms use this principle to identify homologs. The requirement for a fast and sensitive sequence search method led to the development of the HMMER software, which in the latest version (v3.1) uses a combination of sophisticated acceleration heuristics and mathematical and computational optimizations to enable the use of profile hidden Markov models (HMMs) for sequence analysis. The HMMER Web server provides a common platform by linking the HMMER algorithms to databases, thereby enabling the search for homologs, as well as providing sequence and functional annotation by linking external databases. This unit describes three basic protocols and two alternate protocols that explain how to use the HMMER Web server using various input formats and user defined parameters. © 2017 by John Wiley & Sons, Inc. Copyright © 2017 John Wiley & Sons, Inc.
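A local command-line search equivalent to a web-server phmmer query might look like the sketch below; the FASTA file names are placeholders and HMMER v3.x is assumed to be installed on the system.

    import subprocess

    # Search a query protein against a target database and write tabular output.
    subprocess.run(
        ["phmmer", "--tblout", "hits.tbl", "-E", "1e-5",
         "query.fasta", "target_proteins.fasta"],
        check=True,
    )

    # Print target name, query name, and full-sequence E-value for each hit.
    with open("hits.tbl") as handle:
        for line in handle:
            if line.startswith("#"):
                continue
            fields = line.split()
            print(fields[0], fields[2], fields[4])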
Advances in borehole geophysics for hydrology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, P.H.
1982-01-01
Borehole geophysical methods provide vital subsurface information on rock properties, fluid movement, and the condition of engineered borehole structures. Within the first category, salient advances include the continuing improvement of the borehole televiewer, refinement of the electrical conductivity dipmeter for fracture characterization, and the development of a gigahertz-frequency electromagnetic propagation tool for water saturation measurements. The exploration of the rock mass between boreholes remains a challenging problem with high potential; promising methods are now incorporating high-density spatial sampling and sophisticated data processing. Flow-rate measurement methods appear adequate for all but low-flow situations. At low rates the tagging method seems the most attractive. The current exploitation of neutron-activation techniques for tagging means that the wellbore fluid itself is tagged, thereby eliminating the mixing of an alien fluid into the wellbore. Another method uses the acoustic noise generated by flow through constrictions and in and behind casing to detect and locate flaws in the production system. With the advent of field-recorded digital data, the interpretation of logs from sedimentary sequences is now reaching a sophisticated level with the aid of computer processing and the application of statistical methods. Lagging behind are interpretive schemes for the low-porosity, fracture-controlled igneous and metamorphic rocks encountered in the geothermal reservoirs and in potential waste-storage sites. Progress is being made on the general problem of fracture detection by use of electrical and acoustical techniques, but the reliable definition of permeability continues to be an elusive goal.
Kan, Hirohito; Arai, Nobuyuki; Takizawa, Masahiro; Omori, Kazuyoshi; Kasai, Harumasa; Kunitomo, Hiroshi; Hirose, Yasujiro; Shibamoto, Yuta
2018-06-11
We developed a non-regularized, variable kernel, sophisticated harmonic artifact reduction for phase data (NR-VSHARP) method to accurately estimate local tissue fields without regularization for quantitative susceptibility mapping (QSM). We then used a digital brain phantom to evaluate the accuracy of the NR-VSHARP method, and compared it with the VSHARP and iterative spherical mean value (iSMV) methods through in vivo human brain experiments. Our proposed NR-VSHARP method, which uses variable spherical mean value (SMV) kernels, minimizes L2 norms only within the volume of interest to reduce phase errors and save cortical information without regularization. In a numerical phantom study, relative local field and susceptibility map errors were determined using NR-VSHARP, VSHARP, and iSMV. Additionally, various background field elimination methods were used to image the human brain. In a numerical phantom study, the use of NR-VSHARP considerably reduced the relative local field and susceptibility map errors throughout a digital whole brain phantom, compared with VSHARP and iSMV. In the in vivo experiment, the NR-VSHARP-estimated local field could sufficiently achieve minimal boundary losses and phase error suppression throughout the brain. Moreover, the susceptibility map generated using NR-VSHARP minimized the occurrence of streaking artifacts caused by insufficient background field removal. Our proposed NR-VSHARP method yields minimal boundary losses and highly precise phase data. Our results suggest that this technique may facilitate high-quality QSM. Copyright © 2017. Published by Elsevier Inc.
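The spherical-mean-value idea behind SHARP-family methods can be illustrated with a single fixed-radius kernel: a harmonic background field equals its spherical mean at every interior voxel, so subtracting that mean leaves an approximation of the local tissue field. The sketch below (NumPy/SciPy) is this simplified single-kernel version, not the variable-kernel NR-VSHARP algorithm, and it ignores brain-mask boundary handling.

    import numpy as np
    from scipy.ndimage import convolve

    def smv_local_field(total_field, radius_vox=5):
        # Build a normalized spherical kernel of the given radius (in voxels).
        r = radius_vox
        zz, yy, xx = np.mgrid[-r:r + 1, -r:r + 1, -r:r + 1]
        kernel = ((xx ** 2 + yy ** 2 + zz ** 2) <= r ** 2).astype(float)
        kernel /= kernel.sum()
        # Local field ~ total field minus its spherical mean (removes the harmonic part).
        spherical_mean = convolve(total_field, kernel, mode="nearest")
        return total_field - spherical_mean

    field_map = np.random.randn(32, 32, 32)        # placeholder unwrapped field map
    local_field = smv_local_field(field_map, radius_vox=4)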
Classical and all-floating FETI methods for the simulation of arterial tissues
Augustin, Christoph M.; Holzapfel, Gerhard A.; Steinbach, Olaf
2015-01-01
High-resolution and anatomically realistic computer models of biological soft tissues play a significant role in the understanding of the function of cardiovascular components in health and disease. However, the computational effort to handle fine grids to resolve the geometries as well as sophisticated tissue models is very challenging. One possibility to derive a strongly scalable parallel solution algorithm is to consider finite element tearing and interconnecting (FETI) methods. In this study we propose and investigate the application of FETI methods to simulate the elastic behavior of biological soft tissues. As one particular example we choose the artery which is – as most other biological tissues – characterized by anisotropic and nonlinear material properties. We compare two specific approaches of FETI methods, classical and all-floating, and investigate the numerical behavior of different preconditioning techniques. In comparison to classical FETI, the all-floating approach has not only advantages concerning the implementation but in many cases also concerning the convergence of the global iterative solution method. This behavior is illustrated with numerical examples. We present results of linear elastic simulations to show convergence rates, as expected from the theory, and results from the more sophisticated nonlinear case where we apply a well-known anisotropic model to the realistic geometry of an artery. Although the FETI methods have a great applicability on artery simulations we will also discuss some limitations concerning the dependence on material parameters. PMID:26751957
Machine learning of frustrated classical spin models. I. Principal component analysis
NASA Astrophysics Data System (ADS)
Wang, Ce; Zhai, Hui
2017-10-01
This work aims at determining whether artificial intelligence can recognize a phase transition without prior human knowledge. If this were successful, it could be applied to, for instance, analyzing data from the quantum simulation of unsolved physical models. Toward this goal, we first need to apply the machine learning algorithm to well-understood models and see whether the outputs are consistent with our prior knowledge, which serves as the benchmark for this approach. In this work, we feed the computer data generated by the classical Monte Carlo simulation for the XY model in frustrated triangular and union jack lattices, which has two order parameters and exhibits two phase transitions. We show that the outputs of the principal component analysis agree very well with our understanding of different orders in different phases, and the temperature dependences of the major components detect the nature and the locations of the phase transitions. Our work offers promise for using machine learning techniques to study sophisticated statistical models, and our results can be further improved by using principal component analysis with kernel tricks and the neural network method.
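A minimal version of this workflow, run on a toy ordered-versus-disordered spin dataset rather than the frustrated XY model, might look like the following (scikit-learn PCA).

    import numpy as np
    from sklearn.decomposition import PCA

    # Toy Monte Carlo snapshots: fully ordered at low T, random at high T.
    rng = np.random.default_rng(3)
    n_sites, n_samples = 64, 200
    low_T = np.sign(rng.standard_normal((n_samples, 1))) * np.ones((n_samples, n_sites))
    high_T = np.sign(rng.standard_normal((n_samples, n_sites)))
    configs = np.vstack([low_T, high_T])

    pca = PCA(n_components=2)
    components = pca.fit_transform(configs)
    # The leading component separates ordered from disordered snapshots,
    # acting as an order parameter learned without supervision.
    print(pca.explained_variance_ratio_)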
Blackboard architecture for medical image interpretation
NASA Astrophysics Data System (ADS)
Davis, Darryl N.; Taylor, Christopher J.
1991-06-01
There is a growing interest in using sophisticated knowledge-based systems for biomedical image interpretation. We present a principled attempt to use artificial intelligence methodologies in interpreting lateral skull x-ray images. Such radiographs are routinely used in cephalometric analysis to provide quantitative measurements useful to clinical orthodontists. Manual and interactive methods of analysis are known to be error prone and previous attempts to automate this analysis typically fail to capture the expertise and adaptability required to cope with the variability in biological structure and image quality. An integrated model-based system has been developed which makes use of a blackboard architecture and multiple knowledge sources. A model definition interface allows quantitative models, of feature appearance and location, to be built from examples as well as more qualitative modelling constructs. Visual task definition and blackboard control modules allow task-specific knowledge sources to act on information available to the blackboard in a hypothesise and test reasoning cycle. Further knowledge-based modules include object selection, location hypothesis, intelligent segmentation, and constraint propagation systems. Alternative solutions to given tasks are permitted.
MEGA7: Molecular Evolutionary Genetics Analysis Version 7.0 for Bigger Datasets.
Kumar, Sudhir; Stecher, Glen; Tamura, Koichiro
2016-07-01
We present the latest version of the Molecular Evolutionary Genetics Analysis (Mega) software, which contains many sophisticated methods and tools for phylogenomics and phylomedicine. In this major upgrade, Mega has been optimized for use on 64-bit computing systems for analyzing larger datasets. Researchers can now explore and analyze tens of thousands of sequences in Mega. The new version also provides an advanced wizard for building timetrees and includes a new functionality to automatically predict gene duplication events in gene family trees. The 64-bit Mega is made available in two interfaces: graphical and command line. The graphical user interface (GUI) is a native Microsoft Windows application that can also be used on Mac OS X. The command line Mega is available as native applications for Windows, Linux, and Mac OS X. They are intended for use in high-throughput and scripted analysis. Both versions are available from www.megasoftware.net free of charge. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Havugimana, Pierre C; Hu, Pingzhao; Emili, Andrew
2017-10-01
Elucidation of the networks of physical (functional) interactions present in cells and tissues is fundamental for understanding the molecular organization of biological systems, the mechanistic basis of essential and disease-related processes, and for functional annotation of previously uncharacterized proteins (via guilt-by-association or -correlation). After a decade in the field, we felt it timely to document our own experiences in the systematic analysis of protein interaction networks. Areas covered: Researchers worldwide have contributed innovative experimental and computational approaches that have driven the rapidly evolving field of 'functional proteomics'. These include mass spectrometry-based methods to characterize macromolecular complexes on a global-scale and sophisticated data analysis tools - most notably machine learning - that allow for the generation of high-quality protein association maps. Expert commentary: Here, we recount some key lessons learned, with an emphasis on successful workflows, and challenges, arising from our own and other groups' ongoing efforts to generate, interpret and report proteome-scale interaction networks in increasingly diverse biological contexts.
NASA Astrophysics Data System (ADS)
Yepes-Calderon, Fernando; Brun, Caroline; Sant, Nishita; Thompson, Paul; Lepore, Natasha
2015-01-01
Tensor-Based Morphometry (TBM) is an increasingly popular method for group analysis of brain MRI data. The main steps in the analysis consist of a nonlinear registration to align each individual scan to a common space, and a subsequent statistical analysis to determine morphometric differences, or differences in fiber structure, between groups. Recently, we implemented the Statistically-Assisted Fluid Registration Algorithm, or SAFIRA [1], which is designed for tracking morphometric differences among populations. To this end, SAFIRA allows the inclusion of statistical priors extracted from the populations being studied as regularizers in the registration. This flexibility and degree of sophistication limit the tool to expert use, even more so considering that SAFIRA was initially implemented in command line mode. Here, we introduce a new, intuitive, easy-to-use, Matlab-based graphical user interface for SAFIRA's multivariate TBM. The interface also generates different choices for the TBM statistics, including both the traditional univariate statistics on the Jacobian matrix and comparison of the full deformation tensors [2]. This software will be freely disseminated to the neuroimaging research community.
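The traditional univariate TBM statistic referred to here is typically based on the log determinant of the deformation Jacobian; a NumPy sketch of that computation from a displacement field follows (illustrative only, not SAFIRA's code).

    import numpy as np

    def log_jacobian_determinant(disp):
        # disp: displacement field of shape (3, X, Y, Z) in voxel units,
        # describing the mapping x -> x + disp(x).
        grads = np.stack([np.stack(np.gradient(disp[i]), axis=0) for i in range(3)])
        # grads[i, j] = d(disp_i)/d(x_j); the Jacobian is J = I + grad(disp).
        J = grads.transpose(2, 3, 4, 0, 1) + np.eye(3)
        det = np.linalg.det(J)
        return np.log(np.clip(det, 1e-6, None))    # >0 expansion, <0 contraction

    displacement = 0.1 * np.random.randn(3, 16, 16, 16)   # placeholder deformation
    log_jacobian_map = log_jacobian_determinant(displacement)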
Alport Syndrome: De Novo Mutation in the COL4A5 Gene Converting Glycine 1205 to Valine.
Antón-Martín, Pilar; Aparicio López, Cristina; Ramiro-León, Soraya; Santillán Garzón, Sonia; Santos-Simarro, Fernando; Gil-Fournier, Belén
2012-01-01
Alport syndrome is a primary basement membrane disorder arising from mutations in genes encoding the type IV collagen protein family. It is a genetically heterogeneous disease with different mutations and forms of inheritance that presents with renal involvement, hearing loss and eye defects. Several new mutations related to X-linked forms have been previously determined. We report the case of a 12-year-old male and his family diagnosed with Alport syndrome after genetic analysis was performed. A new mutation determining a nucleotide change c.3614G > T (p.Gly1205Val) in hemizygosis in the COL4A5 gene was found. This molecular defect has not been previously described. Molecular biology has helped us to comprehend the mechanisms of pathophysiology in Alport syndrome. Genetic analysis provides the only conclusive diagnosis of the disorder at the moment. Our contribution of a new mutation further supports the need for more sophisticated molecular methods to increase mutation detection rates at lower cost and in less time.
Jabłońska-Czapla, Magdalena
2015-01-01
Chemical speciation is a very important subject in the environmental protection, toxicology, and chemical analytics due to the fact that toxicity, availability, and reactivity of trace elements depend on the chemical forms in which these elements occur. Research on low analyte levels, particularly in complex matrix samples, requires more and more advanced and sophisticated analytical methods and techniques. The latest trends in this field concern the so-called hyphenated techniques. Arsenic, antimony, chromium, and (underestimated) thallium attract the closest attention of toxicologists and analysts. The properties of those elements depend on the oxidation state in which they occur. The aim of the following paper is to answer the question why the speciation analytics is so important. The paper also provides numerous examples of the hyphenated technique usage (e.g., the LC-ICP-MS application in the speciation analysis of chromium, antimony, arsenic, or thallium in water and bottom sediment samples). An important issue addressed is the preparation of environmental samples for speciation analysis. PMID:25873962
Applications of DNA-Based Liquid Biopsy for Central Nervous System Neoplasms.
Wang, Joanna; Bettegowda, Chetan
2017-01-01
The management of central nervous system malignancies remains reliant on histopathological analysis and neuroimaging, despite their complex genetic profile. The intratumoral heterogeneity displayed by these tumors necessitates a more sophisticated method of tumor analysis and monitoring, with the ability to assess tumors over space and time. Circulating biomarkers, including circulating tumor cells, circulating tumor DNA, and extracellular vesicles, hold promise as a type of real-time liquid biopsy able to provide dynamic information not only regarding tumor burden to monitor disease progression and treatment response, but also regarding genetic profile to enable changes in management to match a constantly evolving tumor. In numerous cancer types, including glioma, they have demonstrated their clinical utility as a minimally invasive means for diagnosis, prognostication, and prediction. In addition, they can be used in the laboratory to probe mechanisms of acquired drug resistance and tumor invasion and dissemination. Copyright © 2017 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.
Adaptive bill morphology for enhanced tool manipulation in New Caledonian crows
Matsui, Hiroshi; Hunt, Gavin R.; Oberhofer, Katja; Ogihara, Naomichi; McGowan, Kevin J.; Mithraratne, Kumar; Yamasaki, Takeshi; Gray, Russell D.; Izawa, Ei-Ichi
2016-01-01
Early increased sophistication of human tools is thought to be underpinned by adaptive morphology for efficient tool manipulation. Such adaptive specialisation is unknown in nonhuman primates but may have evolved in the New Caledonian crow, which has sophisticated tool manufacture. The straightness of its bill, for example, may be adaptive for enhanced visually-directed use of tools. Here, we examine in detail the shape and internal structure of the New Caledonian crow’s bill using Principal Components Analysis and Computed Tomography within a comparative framework. We found that the bill has a combination of interrelated shape and structural features unique within Corvus, and possibly birds generally. The upper mandible is relatively deep and short with a straight cutting edge, and the lower mandible is strengthened and upturned. These novel combined attributes would be functional for (i) counteracting the unique loading patterns acting on the bill when manipulating tools, (ii) a strong precision grip to hold tools securely, and (iii) enhanced visually-guided tool use. Our findings indicate that the New Caledonian crow’s innovative bill has been adapted for tool manipulation to at least some degree. Early increased sophistication of tools may require the co-evolution of morphology that provides improved manipulatory skills. PMID:26955788
NASA Astrophysics Data System (ADS)
Zielinski, Jerzy S.
The dramatic increase in the number and volume of digital images produced in medical diagnostics, and the escalating demand for rapid access to these relevant medical data, along with the need for interpretation and retrieval, have become of paramount importance to a modern healthcare system. There is therefore an ever-growing need for processed, interpreted and saved images of various types. Due to the high cost and unreliability of human-dependent image analysis, it is necessary to develop an automated method for feature extraction, using sophisticated mathematical algorithms and reasoning. This work is focused on digital image signal processing of biological and biomedical data in one-, two- and three-dimensional space. The methods and algorithms presented in this work were used to acquire data from genomic sequences, breast cancer images, and biofilm images. One-dimensional analysis was applied to DNA sequences, which were represented as non-stationary sequences and modeled by a time-dependent autoregressive moving average (TD-ARMA) model. Two-dimensional analysis used a 2D-ARMA model and applied it to detect breast cancer from X-ray mammograms or ultrasound images. Three-dimensional detection and classification techniques were applied to biofilm images acquired using confocal laser scanning microscopy. Modern medical images are geometrically arranged arrays of data. The broadening scope of imaging as a way to organize our observations of the biophysical world has led to a dramatic increase in our ability to apply new processing techniques and to combine multiple channels of data into sophisticated and complex mathematical models of physiological function and dysfunction. With the explosion in the amount of data produced in the field of biomedicine, it is crucial to be able to construct accurate mathematical models of the data at hand. The two main purposes of signal modeling are data size conservation and parameter extraction. Specifically, in biomedical imaging there are four key problems that were addressed in this work: (i) registration, i.e. automated methods of data acquisition and the ability to align multiple data sets with each other; (ii) visualization and reconstruction, i.e. the environment in which registered data sets can be displayed on a plane or in multidimensional space; (iii) segmentation, i.e. automated and semi-automated methods to create models of relevant anatomy from images; and (iv) simulation and prediction, i.e. techniques that can be used to simulate the growth and evolution of the phenomenon under study. Mathematical models can be used not only to verify experimental findings, but also to make qualitative and quantitative predictions that might serve as guidelines for the future development of technology and/or treatment.
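The abstract above describes modeling numerically encoded DNA sequences with a time-dependent ARMA process. As a rough illustration of the underlying idea only (not the author's TD-ARMA implementation), a stationary ARMA fit to a numerically mapped sequence can be sketched with statsmodels; the base-to-number mapping and the model order below are illustrative assumptions.

    # Minimal sketch: fit a stationary ARMA(2,1) model to a numerically encoded
    # DNA fragment. The paper's time-dependent (TD-ARMA) formulation is more elaborate.
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    sequence = "ATGCGTACGTTAGCCGTACGATCGATCGTTAACGGCTAGGCTAACGTTAGC"  # hypothetical fragment
    mapping = {"A": 1.0, "T": -1.0, "G": 0.5, "C": -0.5}              # assumed numeric encoding
    signal = np.array([mapping[b] for b in sequence])

    model = ARIMA(signal, order=(2, 0, 1))   # ARMA(2,1) expressed as ARIMA with d=0
    result = model.fit()
    print(result.params)                     # AR, MA and variance parameters
    print(result.aic)                        # model-selection criterion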
Special environmental control and life support equipment test analyses and hardware
NASA Technical Reports Server (NTRS)
Callahan, David M.
1995-01-01
This final report summarizes events under contract NAS8-38250, 'Special Environmental Control and Life Support Systems Test Analysis and Hardware'. The report covers both technical and programmatic developments. Key to the success of this contract was the evaluation of Environmental Control and Life Support Systems (ECLSS) test results using sophisticated laboratory analysis capabilities. The history of the contract, including all subcontracts, is presented, followed by a description of the support and development work performed under each task.
Analysis of a Distributed Pulse Power System Using a Circuit Analysis Code
1979-06-01
A sophisticated computer code (SCEPTRE), used to analyze electronic circuits, was used to evaluate the performance of a large flash X-ray machine. The dose rate was then integrated to give a number that could be compared with measurements made using thermoluminescent dosimeters (TLDs).
The Role of Self-Injury in the Organisation of Behaviour
ERIC Educational Resources Information Center
Sandman, C. A.; Kemp, A. S.; Mabini, C.; Pincus, D.; Magnusson, M.
2012-01-01
Background: Self-injuring acts are among the most dramatic behaviours exhibited by human beings. There is no known single cause and there is no universally agreed upon treatment. Sophisticated sequential and temporal analysis of behaviour has provided alternative descriptions of self-injury that provide new insights into its initiation and…
Benchmarking and beyond. Information trends in home care.
Twiss, Amanda; Rooney, Heather; Lang, Christine
2002-11-01
With today's benchmarking concepts and tools, agencies have the unprecedented opportunity to use information as a strategic advantage. Because agencies are demanding more and better information, benchmark functionality has grown increasingly sophisticated. Agencies now require a new type of analysis, focused on high-level executive summaries while reducing the current "data overload."
Cyberbullying Bystanders and Moral Engagement: A Psychosocial Analysis for Pastoral Care
ERIC Educational Resources Information Center
Kyriacou, Chris; Zuin, Antônio
2018-01-01
One of the new challenges facing pastoral care in schools is dealing with the rapid growth of cyberbullying by school-aged children. Within digital cyberspace, cyberbullies are finding more opportunities to express their aggression towards others as social networks become technologically more sophisticated. An important feature of cyberbullying is…
USDA-ARS?s Scientific Manuscript database
The temptation to include more model parameters and high-resolution input data, together with the availability of powerful optimization and uncertainty analysis algorithms, has significantly increased the complexity of hydrologic and water quality modeling. However, the ability to take advantage of sophist...
Measuring Lexical Diversity in Narrative Discourse of People with Aphasia
ERIC Educational Resources Information Center
Fergadiotis, Gerasimos; Wright, Heather H.; West, Thomas M.
2013-01-01
Purpose: A microlinguistic content analysis for assessing lexical semantics in people with aphasia (PWA) is lexical diversity (LD). Sophisticated techniques have been developed to measure LD. However, validity evidence for these methodologies when applied to the discourse of PWA is lacking. The purpose of this study was to evaluate four measures…
The Construction of a Second Language Acquisition Index of Development.
ERIC Educational Resources Information Center
Larsen-Freeman, Diane; Strom, Virginia
Compositions written by 48 university students of English as a Second Language (ESL) were examined as a step in the development of an index for proficiency in a second language. A feature analysis of the compositions revealed the following tendencies: (1) syntactic sophistication and tense usage improved as proficiency increased, (2) errors in…
ERIC Educational Resources Information Center
Flannery, Louise P.; Bers, Marina Umaschi
2013-01-01
Young learners today generate, express, and interact with sophisticated ideas using a range of digital tools to explore interactive stories, animations, computer games, and robotics. In recent years, new developmentally appropriate robotics kits have been entering early childhood classrooms. This paper presents a retrospective analysis of one…
Behavioral Determinants of Drug Action: The Contributions of Peter B. Dews
ERIC Educational Resources Information Center
Barrett, James E.
2006-01-01
Peter B. Dews played a significant role in shaping the distinctive characteristics and defining the underlying principles of the discipline of behavioral pharmacology. His early and sophisticated use of schedules of reinforcement in the 1950s, incorporated from research in the experimental analysis of behavior and integrated into the discipline of…
Identifying the Pains and Problems of Shyness: A Content Analysis.
ERIC Educational Resources Information Center
Carducci, Bernardo J.; Ragains, Kristen D.; Kee, Kathy L.; Johnson, Michael R.; Duncan, Heather R.
Contemporary literature on shyness has made steady progress over the last 20 years in theoretical, methodological, and clinical sophistication; however, little research has investigated how shyness is experienced by shy individuals. The purpose of the present study was to gain a more in-depth understanding of shyness as experienced by shy…
Orientation Systems: First Things First, Professional Paper 3-70.
ERIC Educational Resources Information Center
Wright, Robert H.
The geographic orientation requirement for the Army's lighter aircraft, and for Army aviation as a system, is a system-analysis and system-design problem that appears to have defied solution. The factors considered in this paper indicate that the requirement is not filled by a simple "more sophisticated machine" systems approach. Instead, the man…
Early Home Activities and Oral Language Skills in Middle Childhood: A Quantile Analysis
ERIC Educational Resources Information Center
Law, James; Rush, Robert; King, Tom; Westrupp, Elizabeth; Reilly, Sheena
2018-01-01
Oral language development is a key outcome of elementary school, and it is important to identify factors that predict it most effectively. Commonly researchers use ordinary least squares regression with conclusions restricted to average performance conditional on relevant covariates. Quantile regression offers a more sophisticated alternative.…
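Quantile regression, named in the abstract above as the more sophisticated alternative to ordinary least squares, can be illustrated with statsmodels' QuantReg; the variable names and the single simulated predictor below are hypothetical and not drawn from the study.

    # Minimal sketch: compare OLS (conditional mean) with quantile regression
    # at the 25th, 50th and 75th percentiles, using simulated data.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    home_activities = rng.normal(size=500)                      # hypothetical predictor
    language_score = 2.0 * home_activities + rng.normal(
        scale=1 + 0.5 * np.abs(home_activities), size=500)      # heteroscedastic outcome

    X = sm.add_constant(home_activities)
    print(sm.OLS(language_score, X).fit().params)               # average effect only

    for q in (0.25, 0.50, 0.75):
        fit = sm.QuantReg(language_score, X).fit(q=q)
        print(q, fit.params)                                    # effect at each quantile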
ERIC Educational Resources Information Center
Mackey, Hollie J.
2015-01-01
Leadership preparation programs are in transition as scholars seek to determine more sophisticated approaches in developing leaders for the increasing demands of accountability policy. This critical conceptual analysis focuses on leadership preparation for the socialization of school leaders. It is intended to reframe current perspectives about…
Using Novel Word Context Measures to Predict Human Ratings of Lexical Proficiency
ERIC Educational Resources Information Center
Berger, Cynthia M.; Crossley, Scott A.; Kyle, Kristopher
2017-01-01
This study introduces a model of lexical proficiency based on novel computational indices related to word context. The indices come from an updated version of the Tool for the Automatic Analysis of Lexical Sophistication (TAALES) and include associative, lexical, and semantic measures of word context. Human ratings of holistic lexical proficiency…
Doing More with Less: A Preliminary Study of the School District Investment.
ERIC Educational Resources Information Center
MacPhail-Wilcox, Bettye
1983-01-01
Changes in cash management practices from 1978 to 1981 were investigated in a random sample of 145 North Carolina school districts, stratified by attendance size. Analysis using chi-square tests indicated the level of investment sophistication (as measured by the proportion of cash invested) has increased, especially for large districts. (RW)
NASA Astrophysics Data System (ADS)
Konik, Arda; Madsen, Mark T.; Sunderland, John J.
2012-10-01
In human emission tomography, combined PET/CT and SPECT/CT cameras provide accurate attenuation maps for sophisticated scatter and attenuation corrections. Having proven their potential, these scanners are being adapted for small animal imaging using similar correction approaches. However, attenuation and scatter effects in small animal imaging are substantially less than in human imaging. Hence, the value of sophisticated corrections is not obvious for small animal imaging considering the additional cost and complexity of these methods. In this study, using the GATE Monte Carlo package, we simulated the Inveon small animal SPECT (single pinhole collimator) scanner to find the scatter fractions of various sizes of the NEMA-mouse (diameter: 2-5.5 cm, length: 7 cm), NEMA-rat (diameter: 3-5.5 cm, length: 15 cm) and MOBY (diameter: 2.1-5.5 cm, length: 3.5-9.1 cm) phantoms. The simulations were performed for three radionuclides commonly used in small animal SPECT studies: 99mTc (140 keV), 111In (171 keV 90% and 245 keV 94%) and 125I (effective 27.5 keV). For the MOBY phantoms, the total Compton scatter fractions ranged (over the range of phantom sizes) from 4-10% for 99mTc (126-154 keV), 7-16% for 111In (154-188 keV), 3-7% for 111In (220-270 keV) and 17-30% for 125I (15-45 keV), including the scatter contributions from the tungsten collimator, lead shield and air (inside and outside the camera heads). For the NEMA-rat phantoms, the scatter fractions ranged from 10-15% (99mTc), 17-23% (111In: 154-188 keV), 8-12% (111In: 220-270 keV) and 32-40% (125I). Our results suggest that energy window methods based solely on emission data are sufficient for all mouse and most rat studies for 99mTc and 111In. However, more sophisticated methods may be needed for 125I.
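The scatter fractions reported above follow the usual definition: the ratio of scattered counts to total detected counts within the energy window. A minimal illustration with made-up counts (not values from the study):

    # Scatter fraction from simulated counts (illustrative numbers only).
    n_scatter = 1.2e5     # counts classified as scattered within the energy window
    n_primary = 1.0e6     # unscattered (primary) counts in the same window
    scatter_fraction = n_scatter / (n_scatter + n_primary)
    print(f"Scatter fraction: {scatter_fraction:.1%}")   # about 10.7% for these numbers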
Wendel, Jochen; Buttenfield, Barbara P.; Stanislawski, Larry V.
2016-01-01
Knowledge of landscape type can inform cartographic generalization of hydrographic features, because landscape characteristics provide an important geographic context that affects variation in channel geometry, flow pattern, and network configuration. Landscape types are characterized by expansive spatial gradients, lacking abrupt changes between adjacent classes, and by a limited number of outliers that might confound classification. The US Geological Survey (USGS) is exploring methods to automate generalization of features in the National Hydrography Dataset (NHD), to associate specific sequences of processing operations and parameters with specific landscape characteristics, thus obviating manual selection of a unique processing strategy for every NHD watershed unit. A chronology of methods to delineate physiographic regions for the United States is described, including a recent maximum likelihood classification based on seven input variables. This research compares unsupervised and supervised algorithms applied to these seven input variables, to evaluate and possibly refine the recent classification. Evaluation metrics for unsupervised methods include the Davies–Bouldin index, the Silhouette index, and the Dunn index, as well as quantization and topographic error metrics. Cross-validation and misclassification rate analysis are used to evaluate supervised classification methods. The paper reports the comparative analysis and its impact on the selection of landscape regions. The compared solutions show problems in areas of high landscape diversity. There is some indication that additional input variables, additional classes, or more sophisticated methods can refine the existing classification.
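Two of the unsupervised evaluation metrics mentioned above (the Silhouette and Davies-Bouldin indices) are available in scikit-learn. The sketch below scores a k-means partition of hypothetical standardized landscape variables; it is an illustration of the metrics only, not the USGS workflow.

    # Minimal sketch: cluster seven standardized input variables and score the
    # partition with the Silhouette and Davies-Bouldin indices (simulated data).
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler
    from sklearn.metrics import silhouette_score, davies_bouldin_score

    rng = np.random.default_rng(1)
    X = rng.normal(size=(1000, 7))              # stand-in for the seven terrain variables
    X = StandardScaler().fit_transform(X)

    for k in (3, 4, 5):
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
        print(k, silhouette_score(X, labels), davies_bouldin_score(X, labels))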
Hu, Yong; Wu, Hai-Long; Yin, Xiao-Li; Gu, Hui-Wen; Xiao, Rong; Xie, Li-Xia; Liu, Zhi; Fang, Huan; Wang, Li; Yu, Ru-Qin
2018-04-01
The aim of the present work was to develop a rapid and interference-free method based on liquid chromatography-mass spectrometry (LC-MS) for the simultaneous determination of nine B-group vitamins in various energy drinks. A smart and green strategy that modeled the three-way LC-MS data array with second-order calibration methods based on the alternating trilinear decomposition (ATLD) and alternating penalty trilinear decomposition (APTLD) algorithms was developed. By virtue of "mathematical separation" and the "second-order advantage", the proposed strategy successfully resolved the co-eluted peaks and unknown interferents in LC-MS analysis, with an elution time of less than 4.5 min and simple sample preparation. Satisfactory quantitative results were obtained by the ATLD-LC-MS and APTLD-LC-MS methods for the spiked recovery assays, with average spiked recoveries of 87.2-113.9% and 92.0-111.7%, respectively. These results were confirmed by the LC-MS/MS method, showing good consistency between the methods. All these results demonstrated that the developed chemometrics-assisted LC-MS strategy is rapid, green, accurate and low-cost, and that it could be an attractive alternative for the determination of multiple vitamins in complex food matrices, requiring no laborious sample preparation, tedious condition optimization or more sophisticated instrumentation. Copyright © 2017 Elsevier B.V. All rights reserved.
Webly-Supervised Fine-Grained Visual Categorization via Deep Domain Adaptation.
Xu, Zhe; Huang, Shaoli; Zhang, Ya; Tao, Dacheng
2018-05-01
Learning visual representations from web data has recently attracted attention for object recognition. Previous studies have mainly focused on overcoming label noise and data bias and have shown promising results by learning directly from web data. However, we argue that it might be better to transfer knowledge from existing human labeling resources to improve performance at nearly no additional cost. In this paper, we propose a new semi-supervised method for learning via web data. Our method has the unique design of exploiting strong supervision, i.e., in addition to standard image-level labels, our method also utilizes detailed annotations including object bounding boxes and part landmarks. By transferring as much knowledge as possible from existing strongly supervised datasets to weakly supervised web images, our method can benefit from sophisticated object recognition algorithms and overcome several typical problems found in webly-supervised learning. We consider the problem of fine-grained visual categorization, in which existing training resources are scarce, as our main research objective. Comprehensive experimentation and extensive analysis demonstrate encouraging performance of the proposed approach, which, at the same time, delivers a new pipeline for fine-grained visual categorization that is likely to be highly effective for real-world applications.
NASA Astrophysics Data System (ADS)
Lohweg, Volker; Schaede, Johannes; Türke, Thomas
2006-02-01
The authenticity checking and inspection of banknotes is a highly labour-intensive process in which, traditionally, every note on every sheet is inspected manually. However, with the advent of increasingly sophisticated security features, both visible and invisible, and the requirement for cost reduction in the printing process, it is clear that automation is required. As more print techniques and new security features are established, total quality in security, authenticity and banknote printing must be assured, which calls for a broader sensorial concept. We propose a concept for both authenticity checking and inspection methods for pattern recognition and classification for securities and banknotes, based on sensor fusion and fuzzy interpretation of data measures. In this approach, different methods of authenticity analysis and print flaw detection are combined, which can be used for vending or sorting machines as well as for printing machines. Usually only the existence or appearance of colours and their textures are checked by cameras. Our method combines visible camera images with IR-spectral sensitive sensors, acoustical measurements and other measurements such as the temperature and pressure of printing machines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Denholm, P.; Margolis, R.; Palmintier, B.
This report outlines the methods, data, and tools that could be used at different levels of sophistication and effort to estimate the benefits and costs of DGPV. In so doing, we identify the gaps in current benefit-cost-analysis methods, which we hope will inform the ongoing research agenda in this area. The focus of this report is primarily on benefits and costs from the utility or electricity generation system perspective. It is intended to provide useful background information to utility and regulatory decision makers and their staff, who are often being asked to use or evaluate estimates of the benefits and cost of DGPV in regulatory proceedings. Understanding the technical rigor of the range of methods and how they might need to evolve as DGPV becomes a more significant contributor of energy to the electricity system will help them be better consumers of this type of information. This report is also intended to provide information to utilities, policy makers, PV technology developers, and other stakeholders, which might help them maximize the benefits and minimize the costs of integrating DGPV into a changing electricity system.
Survival analysis and classification methods for forest fire size
2018-01-01
Factors affecting wildland-fire size distribution include weather, fuels, and fire suppression activities. We present a novel application of survival analysis to quantify the effects of these factors on a sample of sizes of lightning-caused fires from Alberta, Canada. Two events were observed for each fire: the size at initial assessment (by the first fire fighters to arrive at the scene) and the size at “being held” (a state when no further increase in size is expected). We developed a statistical classifier to try to predict cases where there will be a growth in fire size (i.e., the size at “being held” exceeds the size at initial assessment). Logistic regression was preferred over two alternative classifiers, with covariates consistent with similar past analyses. We conducted survival analysis on the group of fires exhibiting a size increase. A screening process selected three covariates: an index of fire weather at the day the fire started, the fuel type burning at initial assessment, and a factor for the type and capabilities of the method of initial attack. The Cox proportional hazards model performed better than three accelerated failure time alternatives. Both fire weather and fuel type were highly significant, with effects consistent with known fire behaviour. The effects of initial attack method were not statistically significant, but did suggest a reverse causality that could arise if fire management agencies were to dispatch resources based on a-priori assessment of fire growth potentials. We discuss how a more sophisticated analysis of larger data sets could produce unbiased estimates of fire suppression effect under such circumstances. PMID:29320497
Survival analysis and classification methods for forest fire size.
Tremblay, Pier-Olivier; Duchesne, Thierry; Cumming, Steven G
2018-01-01
Factors affecting wildland-fire size distribution include weather, fuels, and fire suppression activities. We present a novel application of survival analysis to quantify the effects of these factors on a sample of sizes of lightning-caused fires from Alberta, Canada. Two events were observed for each fire: the size at initial assessment (by the first fire fighters to arrive at the scene) and the size at "being held" (a state when no further increase in size is expected). We developed a statistical classifier to try to predict cases where there will be a growth in fire size (i.e., the size at "being held" exceeds the size at initial assessment). Logistic regression was preferred over two alternative classifiers, with covariates consistent with similar past analyses. We conducted survival analysis on the group of fires exhibiting a size increase. A screening process selected three covariates: an index of fire weather at the day the fire started, the fuel type burning at initial assessment, and a factor for the type and capabilities of the method of initial attack. The Cox proportional hazards model performed better than three accelerated failure time alternatives. Both fire weather and fuel type were highly significant, with effects consistent with known fire behaviour. The effects of initial attack method were not statistically significant, but did suggest a reverse causality that could arise if fire management agencies were to dispatch resources based on a-priori assessment of fire growth potentials. We discuss how a more sophisticated analysis of larger data sets could produce unbiased estimates of fire suppression effect under such circumstances.
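The two stages described in the abstracts above, a logistic classifier for whether a fire grows and a Cox proportional hazards model for the fires that do, can be sketched with scikit-learn and the lifelines package. The column names, thresholds, and simulated data below are placeholders, not the study's variables.

    # Minimal sketch: logistic classifier for fire growth plus a Cox proportional
    # hazards fit on the subset of fires that grew (simulated data).
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)
    n = 300
    df = pd.DataFrame({
        "fire_weather_index": rng.gamma(2.0, 5.0, n),   # hypothetical covariate
        "fuel_type": rng.integers(0, 3, n),             # coded fuel class
        "size_growth": rng.exponential(10.0, n),        # duration-like outcome
        "held": rng.integers(0, 2, n),                  # event indicator ("being held")
    })

    # Classifier: did the fire grow beyond its initial-assessment size?
    grew = (df["size_growth"] > 5).astype(int)
    clf = LogisticRegression().fit(df[["fire_weather_index", "fuel_type"]], grew)

    # Survival model on the subset that exhibited growth.
    cph = CoxPHFitter()
    cph.fit(df[grew == 1], duration_col="size_growth", event_col="held")
    cph.print_summary()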
Krause, David A; Boyd, Michael S; Hager, Allison N; Smoyer, Eric C; Thompson, Anthony T; Hollman, John H
2015-02-01
The squat is a fundamental movement of many athletic and daily activities. Methods to clinically assess the squat maneuver range from simple observation to the use of sophisticated equipment. The purpose of this study was to examine the reliability of Coach's Eye (TechSmith Corp), a 2-dimensional (2D) motion analysis mobile device application (app), for assessing maximal sagittal plane hip, knee, and ankle motion during a functional movement screen deep squat, and to compare range of motion values generated by it to those from a Vicon (Vicon Motion Systems Ltd) 3-dimensional (3D) motion analysis system. Twenty-six healthy subjects performed three functional movement screen deep squats recorded simultaneously by both the app (on an iPad [Apple Inc]) and the 3D motion analysis system. Joint angle data were calculated with Vicon Nexus software (Vicon Motion Systems Ltd). The app video was analyzed frame by frame to determine, and freeze on the screen, the deepest position of the squat. With a capacitive stylus, reference lines were then drawn on the iPad screen to determine joint angles. Procedures were repeated with approximately 48 hours between sessions. Test-retest intrarater reliability (ICC3,1) for the app at the hip, knee, and ankle was 0.98, 0.98, and 0.79, respectively. Minimum detectable change was hip 6°, knee 6°, and ankle 7°. Hip joint angles measured with the 2D app exceeded measurements obtained with the 3D motion analysis system by approximately 40°. Differences at the knee and ankle were of lower magnitude, with mean differences of 5° and 3°, respectively. Bland-Altman analysis demonstrated a systematic bias in the hip range-of-motion measurement. No such bias was demonstrated at the knee or ankle. The 2D app demonstrated excellent reliability and appeared to be a responsive means to assess for clinical change, with minimum detectable change values ranging from 6° to 7°. These results also suggest that the 2D app may be used as an alternative to a sophisticated 3D motion analysis system for assessing sagittal plane knee and ankle motion; however, it does not appear to be a comparable alternative for assessing hip motion. Level of evidence: 3.
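The Bland-Altman comparison mentioned above reduces to the mean difference (bias) and the 95% limits of agreement between the two measurement methods. A minimal sketch with made-up hip-angle data (not the study's measurements):

    # Minimal sketch: Bland-Altman bias and 95% limits of agreement between a
    # 2D app measurement and a 3D motion-capture measurement (simulated angles).
    import numpy as np

    rng = np.random.default_rng(3)
    angle_3d = rng.normal(120, 10, 26)                  # hypothetical hip angles (degrees)
    angle_2d = angle_3d + 40 + rng.normal(0, 5, 26)     # app overestimating by ~40 degrees

    diff = angle_2d - angle_3d
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)
    print(f"bias = {bias:.1f} deg, limits of agreement = [{bias - loa:.1f}, {bias + loa:.1f}] deg")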
Interactive Visualization and Analysis of Geospatial Data Sets - TrikeND-iGlobe
NASA Astrophysics Data System (ADS)
Rosebrock, Uwe; Hogan, Patrick; Chandola, Varun
2013-04-01
The visualization of scientific datasets is becoming an ever-increasing challenge as advances in computing technologies have enabled scientists to build high-resolution climate models that have produced petabytes of climate data. To interrogate and analyze these large datasets in real time is a task that pushes the boundaries of computing hardware and software. However, integration of climate datasets with geospatial data requires a considerable amount of effort and close familiarity with various data formats and projection systems, which has prevented widespread utilization outside the climate community. TrikeND-iGlobe is a sophisticated software tool that bridges this gap, allows easy integration of climate datasets with geospatial datasets, and provides sophisticated visualization and analysis capabilities. The objective for TrikeND-iGlobe is the continued building of an open-source 4D virtual globe application using NASA World Wind technology that integrates analysis of climate model outputs with remote sensing observations as well as demographic and environmental data sets. This will facilitate a better understanding of global and regional phenomena, and the impact analysis of climate extreme events. The critical aim is real-time interactive interrogation. At the data-centric level the primary aim is to enable the user to interact with the data in real time for the purpose of analysis - locally or remotely. TrikeND-iGlobe provides the basis for the incorporation of modular tools that provide extended interactions with the data, including sub-setting, aggregation, re-shaping, time series analysis methods and animation to produce publication-quality imagery. TrikeND-iGlobe may be run locally or can be accessed via a web interface supported by high-performance visualization compute nodes placed close to the data. It supports visualizing heterogeneous data formats: traditional geospatial datasets along with scientific data sets with geographic coordinates (NetCDF, HDF, etc.). It also supports multiple data access mechanisms, including HTTP, FTP, WMS, WCS, and the Thredds Data Server (for NetCDF data). For scientific data, TrikeND-iGlobe supports various visualization capabilities, including animations, vector field visualization, etc. TrikeND-iGlobe is a collaborative open-source project; contributors include NASA (ARC-PX), ORNL (Oak Ridge National Laboratory), Unidata, Kansas University, CSIRO CMAR Australia and Geoscience Australia.
Roman sophisticated surface modification methods to manufacture silver counterfeited coins
NASA Astrophysics Data System (ADS)
Ingo, G. M.; Riccucci, C.; Faraldi, F.; Pascucci, M.; Messina, E.; Fierro, G.; Di Carlo, G.
2017-11-01
By means of the combined use of X-ray photoelectron spectroscopy (XPS), optical microscopy (OM) and scanning electron microscopy (SEM) coupled with energy dispersive X-ray spectroscopy (EDS), the surface and subsurface chemical and metallurgical features of silver counterfeited Roman Republican coins are investigated to decipher some aspects of the manufacturing methods and to evaluate the technological ability of the Roman metallurgists to produce thin silver coatings. The results demonstrate that over 2000 years ago important advances in the technology of thin-layer deposition on metal substrates were attained by the Romans. The ancient metallurgists produced counterfeited coins by combining sophisticated micro-plating methods and tailored surface chemical modification based on the mercury-silvering process. The results reveal that the Romans were able to systematically manipulate alloys chemically and metallurgically at the micro scale to produce adherent precious-metal layers with a uniform thickness of up to a few micrometers. The results converge to reveal that the production of forgeries was aimed primarily at saving as much of the expensive metals as possible, allowing profitable large-scale production at lower cost. The driving forces could have been a lack of precious metals, an unexpected need to circulate coins for trade, and/or a combination of social, political and economic factors that required a change in the money supply. Finally, some useful information on corrosion products has been obtained, helpful for selecting materials and methods for the conservation of these important witnesses of technology and economy.
NASA Astrophysics Data System (ADS)
Bode, Felix; Ferré, Ty; Zigelli, Niklas; Emmert, Martin; Nowak, Wolfgang
2018-03-01
Collaboration between academics and practitioners promotes knowledge transfer between research and industry, with both sides benefiting greatly. However, academic approaches are often not feasible given real-world limits on time, cost and data availability, especially for risk and uncertainty analyses. Although the need for uncertainty quantification and risk assessment are clear, there are few published studies examining how scientific methods can be used in practice. In this work, we introduce possible strategies for transferring and communicating academic approaches to real-world applications, countering the current disconnect between increasingly sophisticated academic methods and methods that work and are accepted in practice. We analyze a collaboration between academics and water suppliers in Germany who wanted to design optimal groundwater monitoring networks for drinking-water well catchments. Our key conclusions are: to prefer multiobjective over single-objective optimization; to replace Monte-Carlo analyses by scenario methods; and to replace data-hungry quantitative risk assessment by easy-to-communicate qualitative methods. For improved communication, it is critical to set up common glossaries of terms to avoid misunderstandings, use striking visualization to communicate key concepts, and jointly and continually revisit the project objectives. Ultimately, these approaches and recommendations are simple and utilitarian enough to be transferred directly to other practical water resource related problems.
Incorporating advanced language models into the P300 speller using particle filtering
NASA Astrophysics Data System (ADS)
Speier, W.; Arnold, C. W.; Deshpande, A.; Knall, J.; Pouratian, N.
2015-08-01
Objective. The P300 speller is a common brain-computer interface (BCI) application designed to communicate language by detecting event related potentials in a subject’s electroencephalogram signal. Information about the structure of natural language can be valuable for BCI communication, but attempts to use this information have thus far been limited to rudimentary n-gram models. While more sophisticated language models are prevalent in natural language processing literature, current BCI analysis methods based on dynamic programming cannot handle their complexity. Approach. Sampling methods can overcome this complexity by estimating the posterior distribution without searching the entire state space of the model. In this study, we implement sequential importance resampling, a commonly used particle filtering (PF) algorithm, to integrate a probabilistic automaton language model. Main result. This method was first evaluated offline on a dataset of 15 healthy subjects, which showed significant increases in speed and accuracy when compared to standard classification methods as well as a recently published approach using a hidden Markov model (HMM). An online pilot study verified these results as the average speed and accuracy achieved using the PF method was significantly higher than that using the HMM method. Significance. These findings strongly support the integration of domain-specific knowledge into BCI classification to improve system performance.
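Sequential importance resampling, the particle-filtering algorithm named above, can be sketched generically as a propagate-weight-resample loop. The toy one-dimensional state-space model below is purely illustrative and is unrelated to the authors' P300 language model.

    # Minimal sketch of sequential importance resampling (SIR) for a 1-D
    # random-walk state observed with Gaussian noise.
    import numpy as np

    rng = np.random.default_rng(4)
    T, n_particles = 50, 1000
    true_x = np.cumsum(rng.normal(0, 1.0, T))          # hidden state trajectory
    obs = true_x + rng.normal(0, 2.0, T)               # noisy observations

    particles = np.zeros(n_particles)
    estimates = []
    for t in range(T):
        particles += rng.normal(0, 1.0, n_particles)                 # propagate through dynamics
        weights = np.exp(-0.5 * ((obs[t] - particles) / 2.0) ** 2)   # observation likelihood
        weights /= weights.sum()
        estimates.append(np.sum(weights * particles))                # posterior-mean estimate
        idx = rng.choice(n_particles, n_particles, p=weights)        # resample
        particles = particles[idx]

    print(np.mean((np.array(estimates) - true_x) ** 2))  # mean squared tracking error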
Bridges, Sharon
2014-07-01
Collaboration in the healthcare setting is a multifaceted process that calls for deliberate knowledge sharing and mutual accountability for patient care. The purpose of this analysis is to offer an increased understanding of the concept of collaboration within the context of nurse practitioner (NP)-physician (MD) collaborative practice. The evolutionary method of concept analysis was utilized to explore the concept of collaboration. The process of literature retrieval and data collection was discussed. The search of several nursing and medicine databases resulted in 31 articles, including 17 qualitative and quantitative studies, which met criteria for inclusion in the concept analysis. Collaboration is a complex, sophisticated process that requires commitment of all parties involved. The data analysis identified the surrogate and related terms, antecedents, attributes, and consequences of collaboration within the selected context, which were recognized by major themes presented in the literature and these were discussed. An operational definition was proposed. Increasing collaborative efforts among NPs and MDs may reduce hospital length of stays and healthcare costs, while enhancing professional relationships. Further research is needed to evaluate collaboration and collaborative efforts within the context of NP-MD collaborative practice. ©2013 American Association of Nurse Practitioners.
An insight into morphometric descriptors of cell shape that pertain to regenerative medicine.
Lobo, Joana; See, Eugene Yong-Shun; Biggs, Manus; Pandit, Abhay
2016-07-01
Cellular morphology has recently been recognized as a powerful indicator of cellular function. The analysis of cell shape has evolved from rudimentary forms of microscopic visual inspection to more advanced methodologies that utilize high-resolution microscopy coupled with sophisticated computer hardware and software for data analysis. Despite this progress, there is still a lack of standardization in the quantification of morphometric parameters. In addition, uncertainty remains as to which methodologies and parameters of cell morphology will yield meaningful data, which methods should be utilized to categorize cell shape, and the extent of reliability of the measurements and the interpretation of the resulting analysis. A large range of descriptors has been employed to objectively assess cellular morphology in two-dimensional and three-dimensional domains. Intuitively, simple and applicable morphometric descriptors are preferable, and standardized protocols for cell shape analysis can be achieved with the help of computerized tools. In this review, cellular morphology is discussed as a descriptor of cellular function, and the current morphometric parameters that are used quantitatively in two- and three-dimensional environments are described. Furthermore, the current problems associated with these morphometric measurements are addressed. Copyright © 2015 John Wiley & Sons, Ltd.
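Common two-dimensional shape descriptors of the kind surveyed above (area, perimeter, eccentricity, circularity) can be computed from a segmented binary image with scikit-image; the synthetic elliptical mask below is only an illustration, not one of the review's protocols.

    # Minimal sketch: a few standard morphometric descriptors for a segmented "cell".
    import numpy as np
    from skimage.measure import label, regionprops

    # Synthetic binary mask: an ellipse standing in for a segmented cell.
    yy, xx = np.mgrid[0:200, 0:200]
    mask = ((xx - 100) / 60.0) ** 2 + ((yy - 100) / 30.0) ** 2 <= 1

    props = regionprops(label(mask.astype(int)))[0]
    circularity = 4 * np.pi * props.area / props.perimeter ** 2   # 1.0 for a perfect circle
    print(props.area, props.perimeter, props.eccentricity, circularity)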
A Meta-Analytic Study of the Neural Systems for Auditory Processing of Lexical Tones.
Kwok, Veronica P Y; Dan, Guo; Yakpo, Kofi; Matthews, Stephen; Fox, Peter T; Li, Ping; Tan, Li-Hai
2017-01-01
The neural systems of lexical tone processing have been studied for many years. However, previous findings have been mixed with regard to the hemispheric specialization for the perception of linguistic pitch patterns in native speakers of tonal languages. In this study, we performed two activation likelihood estimation (ALE) meta-analyses, one on neuroimaging studies of auditory processing of lexical tones in tonal languages (17 studies), and the other on auditory processing of lexical information in non-tonal languages as a control analysis for comparison (15 studies). The lexical tone ALE analysis showed significant brain activations in bilateral inferior prefrontal regions, bilateral superior temporal regions and the right caudate, while the control ALE analysis showed significant cortical activity in the left inferior frontal gyrus and left temporo-parietal regions. However, we failed to obtain significant differences from the contrast analysis between the two auditory conditions, which might be caused by the limited number of studies available for comparison. Although the current study lacks evidence to argue for a lexical tone-specific activation pattern, our results provide clues and directions for future investigations on this topic; more sophisticated methods are needed to explore this question in more depth as well.
A Meta-Analytic Study of the Neural Systems for Auditory Processing of Lexical Tones
Kwok, Veronica P. Y.; Dan, Guo; Yakpo, Kofi; Matthews, Stephen; Fox, Peter T.; Li, Ping; Tan, Li-Hai
2017-01-01
The neural systems of lexical tone processing have been studied for many years. However, previous findings have been mixed with regard to the hemispheric specialization for the perception of linguistic pitch patterns in native speakers of tonal languages. In this study, we performed two activation likelihood estimation (ALE) meta-analyses, one on neuroimaging studies of auditory processing of lexical tones in tonal languages (17 studies), and the other on auditory processing of lexical information in non-tonal languages as a control analysis for comparison (15 studies). The lexical tone ALE analysis showed significant brain activations in bilateral inferior prefrontal regions, bilateral superior temporal regions and the right caudate, while the control ALE analysis showed significant cortical activity in the left inferior frontal gyrus and left temporo-parietal regions. However, we failed to obtain significant differences from the contrast analysis between the two auditory conditions, which might be caused by the limited number of studies available for comparison. Although the current study lacks evidence to argue for a lexical tone-specific activation pattern, our results provide clues and directions for future investigations on this topic; more sophisticated methods are needed to explore this question in more depth as well. PMID:28798670
Applications of an OO Methodology and CASE to a DAQ System
NASA Astrophysics Data System (ADS)
Bee, C. P.; Eshghi, S.; Jones, R.; Kolos, S.; Magherini, C.; Maidantchik, C.; Mapelli, L.; Mornacchi, G.; Niculescu, M.; Patel, A.; Prigent, D.; Spiwoks, R.; Soloviev, I.; Caprini, M.; Duval, P. Y.; Etienne, F.; Ferrato, D.; Le van Suu, A.; Qian, Z.; Gaponenko, I.; Merzliakov, Y.; Ambrosini, G.; Ferrari, R.; Fumagalli, G.; Polesello, G.
The RD13 project has evaluated the use of the Object Oriented Information Engineering (OOIE) method during the development of several software components connected to the DAQ system. The method is supported by a sophisticated commercial CASE tool (Object Management Workbench) and programming environment (Kappa) which covers the full life-cycle of the software, including model simulation, code generation and application deployment. This paper gives an overview of the method, the CASE tool and the DAQ components that have been developed, and relates our experiences with the method and tool, its integration into our development environment and the spiral life-cycle it supports.
Experimental research of flow servo-valve
NASA Astrophysics Data System (ADS)
Takosoglu, Jakub
Positional control of pneumatic drives is particularly important in pneumatic systems. Several methods of positioning pneumatic cylinders for changeover and tracking control are known; the choking method is the most development-oriented and has the greatest potential. An optimal and effective method, particularly as applied to pneumatic drives, has long been sought, and sophisticated control systems with algorithms utilizing artificial intelligence methods are designed for this purpose. In order to design such control algorithms, knowledge of the real parameters of the servo-valves used in the control systems of electro-pneumatic servo-drives is required. This paper presents experimental research on a flow servo-valve.
Data mining: sophisticated forms of managed care modeling through artificial intelligence.
Borok, L S
1997-01-01
Data mining is a recent development in computer science that combines artificial intelligence algorithms and relational databases to discover patterns automatically, without the use of traditional statistical methods. Work with data mining tools in health care is in a developmental stage that holds great promise, given the combination of demographic and diagnostic information.
ERIC Educational Resources Information Center
Johnson, Mark M.
2009-01-01
Clay is one of the oldest materials known to humanity and has been used for utilitarian purposes and creative expression since prehistoric times. As civilizations evolved, ceramic materials, techniques, purposes and design all became more sophisticated and expressive. With the addition of different minerals and firing methods, clay was used to…
Perceptions of Biometric Experts on Whether or Not Biometric Modalities Will Combat Identity Fraud
ERIC Educational Resources Information Center
Edo, Galaxy Samson
2012-01-01
Electronic-authentication methods, no matter how sophisticated they are in preventing fraud, must be able to identify people to a reasonable degree of certainty before any credentials are assured (Personix, 2006). User authentication is different from identity verification, and both are separate but vital steps in the process of securing…
ERIC Educational Resources Information Center
Best, Linda M.; Shelley, Daniel J.
2018-01-01
This article examines the effects of the social media applications Facebook, Twitter, Snap Chat/Instagram, Texting and various smartphone applications on academic dishonesty in higher education. The study employed a mixed-methods approach conducted through an emailed question-pro student survey consisting of 20 questions. The results of the study…
[Development of operation patient security detection system].
Geng, Shu-Qin; Tao, Ren-Hai; Zhao, Chao; Wei, Qun
2008-11-01
This paper describes a patient security detection system developed with two-dimensional bar codes, wireless communication and removable storage techniques. Based on the system, nurses and related personnel check the codes of patients awaiting operations to prevent errors. Tests show that the system is effective, and that its objectivity and timeliness are more scientific and sophisticated than the traditional methods currently used in domestic hospitals.
Moral foundations and political attitudes: The moderating role of political sophistication.
Milesi, Patrizia
2016-08-01
Political attitudes can be associated with moral concerns. This research investigated whether people's level of political sophistication moderates this association. Based on the Moral Foundations Theory, this article examined whether political sophistication moderates the extent to which reliance on moral foundations, as categories of moral concerns, predicts judgements about policy positions. With this aim, two studies examined four policy positions shown by previous research to be best predicted by the endorsement of Sanctity, that is, the category of moral concerns focused on the preservation of physical and spiritual purity. The results showed that reliance on Sanctity predicted political sophisticates' judgements, as opposed to those of unsophisticates, on policy positions dealing with equal rights for same-sex and unmarried couples and with euthanasia. Political sophistication also interacted with Fairness endorsement, which includes moral concerns for equal treatment of everybody and reciprocity, in predicting judgements about equal rights for unmarried couples, and interacted with reliance on Authority, which includes moral concerns for obedience and respect for traditional authorities, in predicting opposition to stem cell research. Those findings suggest that, at least for these particular issues, endorsement of moral foundations can be associated with political attitudes more strongly among sophisticates than unsophisticates. © 2015 International Union of Psychological Science.
Müllensiefen, Daniel; Gingras, Bruno; Musil, Jason; Stewart, Lauren
2014-01-01
Musical skills and expertise vary greatly in Western societies. Individuals can differ in their repertoire of musical behaviours as well as in the level of skill they display for any single musical behaviour. The types of musical behaviours we refer to here are broad, ranging from performance on an instrument and listening expertise, to the ability to employ music in functional settings or to communicate about music. In this paper, we first describe the concept of 'musical sophistication' which can be used to describe the multi-faceted nature of musical expertise. Next, we develop a novel measurement instrument, the Goldsmiths Musical Sophistication Index (Gold-MSI) to assess self-reported musical skills and behaviours on multiple dimensions in the general population using a large Internet sample (n = 147,636). Thirdly, we report results from several lab studies, demonstrating that the Gold-MSI possesses good psychometric properties, and that self-reported musical sophistication is associated with performance on two listening tasks. Finally, we identify occupation, occupational status, age, gender, and wealth as the main socio-demographic factors associated with musical sophistication. Results are discussed in terms of theoretical accounts of implicit and statistical music learning and with regard to social conditions of sophisticated musical engagement.
The Social Bayesian Brain: Does Mentalizing Make a Difference When We Learn?
Devaine, Marie; Hollard, Guillaume; Daunizeau, Jean
2014-01-01
When it comes to interpreting others' behaviour, we almost irrepressibly engage in the attribution of mental states (beliefs, emotions…). Such "mentalizing" can become very sophisticated, eventually endowing us with highly adaptive skills such as convincing, teaching or deceiving. Here, sophistication can be captured in terms of the depth of our recursive beliefs, as in "I think that you think that I think…" In this work, we test whether such sophisticated recursive beliefs subtend learning in the context of social interaction. We asked participants to play repeated games against artificial (Bayesian) mentalizing agents, which differ in their sophistication. Critically, we made people believe either that they were playing against each other, or that they were gambling like in a casino. Although both framings are similarly deceiving, participants win against the artificial (sophisticated) mentalizing agents in the social framing of the task, and lose in the non-social framing. Moreover, we find that participants' choice sequences are best explained by sophisticated mentalizing Bayesian learning models only in the social framing. This study is the first demonstration of the added-value of mentalizing on learning in the context of repeated social interactions. Importantly, our results show that we would not be able to decipher intentional behaviour without a priori attributing mental states to others. PMID:25474637
User-customized brain computer interfaces using Bayesian optimization
NASA Astrophysics Data System (ADS)
Bashashati, Hossein; Ward, Rabab K.; Bashashati, Ali
2016-04-01
Objective. The brain characteristics of different people are not the same. Brain computer interfaces (BCIs) should thus be customized for each individual person. In motor-imagery-based synchronous BCIs, a number of parameters (referred to as hyper-parameters) including the EEG frequency bands, the channels and the time intervals from which the features are extracted should be pre-determined based on each subject's brain characteristics. Approach. To determine the hyper-parameter values, previous work has relied on manual or semi-automatic methods that are not applicable to high-dimensional search spaces. In this paper, we propose a fully automatic, scalable and computationally inexpensive algorithm that uses Bayesian optimization to tune these hyper-parameters. We then build different classifiers trained on the sets of hyper-parameter values proposed by the Bayesian optimization. A final classifier aggregates the results of the different classifiers. Main Results. We have applied our method to 21 subjects from three BCI competition datasets. We have conducted rigorous statistical tests, and have shown the positive impact of hyper-parameter optimization in improving the accuracy of BCIs. Furthermore, we have compared our results to those reported in the literature. Significance. Unlike the best reported results in the literature, which are based on more sophisticated feature extraction and classification methods, and rely on prestudies to determine the hyper-parameter values, our method has the advantage of being fully automated, uses less sophisticated feature extraction and classification methods, and yields similar or superior results compared to the best performing designs in the literature.
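Bayesian optimization of hyper-parameters, as described above, can be sketched with scikit-optimize's gp_minimize. The objective below is a stand-in for cross-validated BCI classification error, and the three hyper-parameters and their search bounds are assumptions for illustration, not the authors' configuration.

    # Minimal sketch: Gaussian-process Bayesian optimization over three
    # hypothetical hyper-parameters (frequency-band edges and window length).
    from skopt import gp_minimize

    def objective(params):
        low_hz, high_hz, window_s = params
        # Placeholder for the cross-validated classification error of a BCI
        # pipeline trained with these hyper-parameters; here a smooth toy function.
        return (low_hz - 8) ** 2 + (high_hz - 30) ** 2 / 100 + (window_s - 2.0) ** 2

    result = gp_minimize(
        objective,
        dimensions=[(4.0, 14.0), (20.0, 40.0), (0.5, 4.0)],   # assumed search bounds
        n_calls=30,
        random_state=0,
    )
    print(result.x, result.fun)   # best hyper-parameters and objective value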
Sousa, Marcelo R; Jones, Jon P; Frind, Emil O; Rudolph, David L
2013-01-01
In contaminant travel from ground surface to groundwater receptors, the time taken in travelling through the unsaturated zone is known as the unsaturated zone time lag. Depending on the situation, this time lag may or may not be significant within the context of the overall problem. A method is presented for assessing the importance of the unsaturated zone in the travel time from source to receptor in terms of estimates of both the absolute and the relative advective times. A choice of different techniques for both unsaturated and saturated travel time estimation is provided. This method may be useful for practitioners to decide whether to incorporate unsaturated processes in conceptual and numerical models and can also be used to roughly estimate the total travel time between points near ground surface and a groundwater receptor. This method was applied to a field site located in a glacial aquifer system in Ontario, Canada. Advective travel times were estimated using techniques with different levels of sophistication. The application of the proposed method indicates that the time lag in the unsaturated zone is significant at this field site and should be taken into account. For this case, sophisticated and simplified techniques lead to similar assessments when the same knowledge of the hydraulic conductivity field is assumed. When there is significant uncertainty regarding the hydraulic conductivity, simplified calculations did not lead to a conclusive decision. Copyright © 2012 Elsevier B.V. All rights reserved.
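The simplest of the travel-time techniques alluded to above is a piston-flow estimate: unsaturated-zone lag as thickness times moisture content divided by recharge, and saturated travel time as path length divided by seepage velocity. A back-of-the-envelope sketch with assumed illustrative values (not the Ontario field-site parameters):

    # Minimal sketch: piston-flow advective travel times through the unsaturated
    # and saturated zones. All parameter values are illustrative assumptions.
    unsat_thickness_m = 10.0        # depth to the water table
    moisture_content = 0.20         # volumetric water content (-)
    recharge_m_per_yr = 0.25        # net recharge (Darcy flux)
    t_unsat_yr = unsat_thickness_m * moisture_content / recharge_m_per_yr

    sat_path_length_m = 500.0       # distance to the receptor along the flow path
    hydraulic_conductivity_m_per_d = 5.0
    gradient = 0.005
    porosity = 0.30
    seepage_velocity_m_per_d = hydraulic_conductivity_m_per_d * gradient / porosity
    t_sat_yr = sat_path_length_m / seepage_velocity_m_per_d / 365.0

    print(f"unsaturated lag: {t_unsat_yr:.1f} yr, saturated travel: {t_sat_yr:.1f} yr")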
NASA Technical Reports Server (NTRS)
Karpoukhin, Mikhii G.; Kogan, Boris Y.; Karplus, Walter J.
1995-01-01
The simulation of heart arrhythmia and fibrillation is a very important and challenging task. The solution of these problems using sophisticated mathematical models is beyond the capabilities of modern supercomputers. To overcome these difficulties it is proposed to break the whole simulation problem into two tightly coupled stages: generation of the action potential using sophisticated models, and propagation of the action potential using simplified models. The well-known simplified models are compared and modified to bring the rate of depolarization and the action potential duration restitution closer to reality. The modified method of lines is used to parallelize the computational process. The conditions for the appearance of 2D spiral waves after the application of a premature beat, and the subsequent traveling of the spiral wave inside the simulated tissue, are studied.
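The method-of-lines approach named above (finite differences in space, an ODE solver in time) is commonly illustrated with a simplified excitable-medium model. The one-dimensional FitzHugh-Nagumo sketch below is such an illustration, not the authors' cardiac model or parallel implementation; all parameter values are assumptions.

    # Minimal sketch: 1-D FitzHugh-Nagumo cable discretized by the method of lines.
    import numpy as np
    from scipy.integrate import solve_ivp

    nx, dx, D = 200, 0.5, 1.0                 # grid size, spacing, diffusion coefficient
    a, b, eps = 0.1, 0.5, 0.01                # illustrative FitzHugh-Nagumo parameters

    def rhs(t, y):
        v, w = y[:nx], y[nx:]
        lap = np.zeros(nx)
        lap[1:-1] = (v[2:] - 2 * v[1:-1] + v[:-2]) / dx**2   # interior second difference;
        dv = D * lap + v * (1 - v) * (v - a) - w             # end nodes evolve by reaction only
        dw = eps * (v - b * w)
        return np.concatenate([dv, dw])

    v0 = np.zeros(nx); v0[:10] = 1.0          # stimulate one end to launch a wave
    y0 = np.concatenate([v0, np.zeros(nx)])
    sol = solve_ivp(rhs, (0, 200), y0, method="BDF", max_step=1.0)
    print(sol.y[:nx, -1].max())               # wave amplitude at the final time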
Pala, Eva M; Dey, Sudip
2016-02-01
Conventional and highly sophisticated analytical methods (Cyria et al., 1989; Massar et al., 2012a) were used to analyze micro-structural and micro-analytical aspects of the blood of the snakehead fish, Channa gachua, exposed to municipal wastes and city garbage. Red (RBC) and white blood cell (WBC) counts and hemoglobin content were found to be higher in pollution-affected fish as compared with controls. Scanning electron microscopy revealed the occurrence of abnormal erythrocytes such as crenated cells, echinocytes, lobopodial projections, membrane internalization, spherocytes, ruptured cells, contracted cells, depressions, and uneven elongation of erythrocyte membranes in fish inhabiting the polluted sites. Energy-dispersive X-ray spectroscopy (EDS) revealed the presence of silicon and lead in the RBCs of pollution-affected fish. The significance of the study includes its highly sophisticated analytical approach, which revealed the aforementioned micro-structural abnormalities.
Influence of cross section variations on the structural behaviour of composite rotor blades
NASA Astrophysics Data System (ADS)
Rapp, Helmut; Woerndle, Rudolf
1991-09-01
A highly sophisticated structural analysis is required for helicopter rotor blades with nonhomogeneous cross sections made from nonisotropic material. Combinations of suitable analytical techniques with FEM-based techniques permit a cost effective and sufficiently accurate analysis of these complicated structures. It is determined that in general the 1D engineering theory of bending combined with 2D theories for determining the cross section properties is sufficient to describe the structural blade behavior.
Dual RNA regulatory control of a Staphylococcus aureus virulence factor.
Chabelskaya, Svetlana; Bordeau, Valérie; Felden, Brice
2014-04-01
In pathogens, the accurate programming of virulence gene expression is essential for infection. It is achieved by sophisticated arrays of regulatory proteins and ribonucleic acids (sRNAs), but in many cases their contributions and connections are not yet known. Based on genetic, biochemical and structural evidence, we report that the expression pattern of a Staphylococcus aureus host immune evasion protein is enabled by the collaborative actions of RNAIII and small pathogenicity island RNA D (SprD). Their combined expression profiles during bacterial growth permit early and transient synthesis of Sbi to avoid host immune responses. Together, these two sRNAs use antisense mechanisms to monitor Sbi expression at the translational level. Deletion analysis combined with structural analysis of RNAIII in complex with its novel messenger RNA (mRNA) target indicate that three distant RNAIII domains interact with distinct sites of the sbi mRNA and that two locations are deep in the sbi coding region. Through distinct domains, RNAIII lowers production of two proteins required for avoiding innate host immunity, staphylococcal protein A and Sbi. Toeprints and in vivo mutational analysis reveal a novel regulatory module within RNAIII essential for attenuation of Sbi translation. The sophisticated translational control of mRNA by two differentially expressed sRNAs ensures supervision of host immune escape by a major pathogen.
Dynamic malware analysis using IntroVirt: a modified hypervisor-based system
NASA Astrophysics Data System (ADS)
White, Joshua S.; Pape, Stephen R.; Meily, Adam T.; Gloo, Richard M.
2013-05-01
In this paper, we present a system for Dynamic Malware Analysis which incorporates the use of IntroVirt™. IntroVirt is an introspective hypervisor architecture and infrastructure that supports advanced analysis techniques for stealth malware analysis. This system allows for complete guest monitoring and interaction, including the manipulation and blocking of system calls. IntroVirt is capable of bypassing the virtual machine detection capabilities of even the most sophisticated malware by spoofing the returns of system call responses. Additional fuzzing capabilities can be employed to detect both malware vulnerabilities and polymorphism.
Static analysis of a sonar dome rubber window
NASA Technical Reports Server (NTRS)
Lai, J. L.
1978-01-01
The application of NASTRAN (level 16.0.1) to the static analysis of a sonar dome rubber window (SDRW) was demonstrated. The assessment of the conventional model (neglecting the enclosed fluid) for the stress analysis of the SDRW was made by comparing its results to those based on a sophisticated model (including the enclosed fluid). The fluid was modeled with isoparametric linear hexahedron elements with approximate material properties whose shear modulus was much smaller than its bulk modulus. The effect of the chosen material property for the fluid is discussed.
NECAP: NASA's Energy-Cost Analysis Program. Part 1: User's manual
NASA Technical Reports Server (NTRS)
Henninger, R. H. (Editor)
1975-01-01
The NECAP is a sophisticated building design and energy analysis tool which has embodied within it all of the latest ASHRAE state-of-the-art techniques for performing thermal load calculation and energy usage predictions. It is a set of six individual computer programs which include: response factor program, data verification program, thermal load analysis program, variable temperature program, system and equipment simulation program, and owning and operating cost program. Each segment of NECAP is described, and instructions are set forth for preparing the required input data and for interpreting the resulting reports.
NASA Technical Reports Server (NTRS)
Vlahopoulos, Nickolas; Lyle, Karen H.; Burley, Casey L.
1998-01-01
An algorithm for generating appropriate velocity boundary conditions for an acoustic boundary element analysis from the kinematics of an operating propeller is presented. It constitutes the initial phase of integrating sophisticated rotorcraft models into a conventional boundary element analysis. Currently, the pressure field is computed by a linear approximation. An initial validation of the developed process was performed by comparing numerical results to test data for the external acoustic pressure on the surface of a tilt-rotor aircraft for one flight condition.
ERIC Educational Resources Information Center
Gottesman, Isaac
2013-01-01
Upon its publication in 1976, Samuel Bowles and Herbert Gintis' "Schooling in Capitalist America" was the most sophisticated and nuanced Marxian social and political analysis of schooling in the United States. Thirty-five years after its publication, "Schooling" continues to have a strong impact on thinking about education. Despite its…
Hidden Communicative Competence: Case Study Evidence Using Eye-Tracking and Video Analysis
ERIC Educational Resources Information Center
Grayson, Andrew; Emerson, Anne; Howard-Jones, Patricia; O'Neil, Lynne
2012-01-01
A facilitated communication (FC) user with an autism spectrum disorder produced sophisticated texts by pointing, with physical support, to letters on a letterboard while their eyes were tracked and while their pointing movements were video recorded. This FC user has virtually no independent means of expression, and is held to have no literacy…
Arsenic speciation in solids using X-ray absorption spectroscopy
Foster, Andrea L.; Kim, Chris S.
2014-01-01
One of the most important aims of this review is to clarify the different types of analysis that are performed on As-XAS spectra, and to describe the benefits, drawbacks, and limitations of each. Arsenic XAS spectra are analyzed to obtain one or more of the following types of information (in increasing order of sophistication):
Enhancing critical thinking with case studies and nursing process.
Neill, K M; Lachat, M F; Taylor-Panek, S
1997-01-01
Challenged to enhance critical thinking concepts in a sophomore nursing process course, faculty expanded the lecture format to include group explorations of patient case studies. The group format facilitated a higher level of analysis of patient cases and more sophisticated applications of nursing process. This teaching strategy was a positive learning experience for students and faculty.
ERIC Educational Resources Information Center
Parker, Edwin B.
SPIRES (Stanford Public Information Retrieval System) is a computerized information storage and retrieval system intended for use by students and faculty members who have little knowledge of computers but who need rapid and sophisticated retrieval and analysis. The functions and capabilities of the system from the user's point of view are…
ERIC Educational Resources Information Center
Ding, Lin; Zhang, Ping
2016-01-01
Previous literature on learners' epistemological beliefs about physics has almost exclusively focused on analysis of university classroom instruction and its effects on students' views. However, little is known about other populations or factors other than classroom instruction on learners' epistemologies. In this study, we used a cross-sequential…
Changes in Pricing Behavior during the 1980s: An Analysis of Selected Case Studies.
ERIC Educational Resources Information Center
St. John, Edward P.
1992-01-01
Reports on changes in pricing decisions at public and private colleges in a low-cost and a high-cost state in the 1980s. Five liberal arts colleges studied used several pricing strategies: "elite" pricing strategy; "prestige" pricing strategy; and price reduction strategy. Study found multiple causes for price increases, more sophisticated pricing…
A Primer for Accounting Certification: Complete Analysis of the Process with Listing of Sources
ERIC Educational Resources Information Center
Boyd, David T.; Boyd, Sanithia C.; Berry, Priscilla
2009-01-01
As a result of globalization and the growth and complexity of both domestic and international bodies requiring accountants, highly sophisticated training and specific certification are mandatory. Students seeking career positions in the field of accounting are amazingly left without the easy access to certification that one might think…
Application of Learning Analytics Using Clustering Data Mining for Students' Disposition Analysis
ERIC Educational Resources Information Center
Bharara, Sanyam; Sabitha, Sai; Bansal, Abhay
2018-01-01
Learning Analytics (LA) is an emerging field in which sophisticated analytic tools are used to improve learning and education. It draws from, and is closely tied to, a series of other fields of study like business intelligence, web analytics, academic analytics, educational data mining, and action analytics. The main objective of this research…
Economic analysis of crystal growth in space
NASA Technical Reports Server (NTRS)
Ulrich, D. R.; Chung, A. M.; Yan, C. S.; Mccreight, L. R.
1972-01-01
Many advanced electronic technologies and devices for the 1980's are based on sophisticated compound single crystals, i.e., ceramic oxides and compound semiconductors. Space processing of these electronic crystals with maximum perfection, purity, and size is suggested. No economic or technical justification was found for the growth of silicon single crystals for solid state electronic devices in space.
ERIC Educational Resources Information Center
Voutsina, Chronoula
2016-01-01
Empirical research has documented how children's early counting develops into an increasingly abstract process, and initial counting procedures are reified as children develop and use more sophisticated counting. In this development, the learning of different oral counting sequences that allow children to count in steps bigger than one is seen as…
ERIC Educational Resources Information Center
Schmitt, Catherine A.
2012-01-01
This dissertation examines the history of workforce education, corporate university development models in both literature and practice, and the evolution of the next generation of corporate universities. It traces workforce education from indentured servants in Europe during the Middle Ages, to the sophisticated corporate universities that…
Goals and Objectives for Computing in the Associated Colleges of the St. Lawrence Valley.
ERIC Educational Resources Information Center
Grupe, Fritz H.
A forecast of the computing requirements of the Associated Colleges of the St. Lawrence Valley, an analysis of their needs, and specifications for a joint computer system are presented. Problems encountered included the lack of resources and computer sophistication at the member schools and a dearth of experience with long-term computer consortium…
Experience with a sophisticated computer based authoring system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gardner, P.R.
1984-04-01
In the November 1982 issue of the ADCIS SIG CBT Newsletter the editor arrives at two conclusions regarding Computer Based Authoring Systems (CBAS): (1) CBAS drastically reduces programming time and the need for expert programmers, and (2) CBAS appears to have minimal impact on initial lesson design. Both of these comments have significant impact on any Cost-Benefit analysis for Computer-Based Training. The first tends to improve cost-effectiveness but only toward the limits imposed by the second. Westinghouse Hanford Company (WHC) recently purchased a sophisticated CBAS, the WISE/SMART system from Wicat (Orem, UT), for use in the Nuclear Power Industry. This report details our experience with this system relative to Items (1) and (2) above; lesson design time will be compared with lesson input time. Also provided will be the WHC experience in the use of subject matter experts (though computer neophytes) for the design and inputting of CBT materials.
Sophisticated Calculation of the 1oo4-architecture for Safety-related Systems Conforming to IEC61508
NASA Astrophysics Data System (ADS)
Hayek, A.; Bokhaiti, M. Al; Schwarz, M. H.; Boercsoek, J.
2012-05-01
With the publication and enforcement of the standard IEC 61508 for safety-related systems, recent system architectures have been presented and evaluated. Among a number of techniques and measures for evaluating the safety integrity level (SIL) of safety-related systems, measures such as reliability block diagrams and Markov models are used to analyze the probability of failure on demand (PFD) and the mean time to failure (MTTF) in conformance with IEC 61508. The current paper deals with the quantitative analysis of the novel 1oo4 (one-out-of-four) architecture presented in recent work, and sophisticated calculations for the required parameters are therefore introduced. The 1oo4 architecture is an advanced safety architecture based on on-chip redundancy and is 3-failure safe, meaning that at least one of the four channels has to work correctly in order to trigger the safety function.
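For orientation, a simplified, hedged version of the kind of quantity involved: the average probability of failure on demand of a 1ooN group of identical channels with dangerous undetected failure rate λ_DU and proof-test interval T_1, ignoring common-cause failures, diagnostics, and repair (the paper's full IEC 61508 treatment includes additional terms).

```latex
% Hedged sketch: simplified PFD_avg for a 1ooN architecture of identical channels,
% neglecting common-cause failures, diagnostic coverage and repair.
\[
  \mathrm{PFD}_{\mathrm{avg}} \;\approx\; \frac{1}{T_1}\int_0^{T_1}\bigl(\lambda_{DU}\,t\bigr)^{N}\,dt
  \;=\; \frac{(\lambda_{DU}\,T_1)^{N}}{N+1},
  \qquad
  \text{so for a 1oo4 system } (N=4):\quad
  \mathrm{PFD}_{\mathrm{avg}} \approx \frac{(\lambda_{DU}\,T_1)^{4}}{5}.
\]
```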
Financial Literacy and Financial Sophistication in the Older Population
Lusardi, Annamaria; Mitchell, Olivia S.; Curto, Vilsa
2017-01-01
Using a special-purpose module implemented in the Health and Retirement Study, we evaluate financial sophistication in the American population over the age of 50. We combine several financial literacy questions into an overall index to highlight which questions best capture financial sophistication and examine the sensitivity of financial literacy responses to framing effects. Results show that many older respondents are not financially sophisticated: they fail to grasp essential aspects of risk diversification, asset valuation, portfolio choice, and investment fees. Subgroups with notable deficits include women, the least educated, non-Whites, and those over age 75. In view of the fact that retirees increasingly must take on responsibility for their own retirement security, such meager levels of knowledge have potentially serious and negative implications. PMID:28553191
Counterfeit drugs: analytical techniques for their identification.
Martino, R; Malet-Martino, M; Gilard, V; Balayssac, S
2010-09-01
In recent years, the number of counterfeit drugs has increased dramatically, including not only "lifestyle" products but also vital medicines. Besides the threat to public health, the financial and reputational damage to pharmaceutical companies is substantial. The lack of robust information on the prevalence of fake drugs is an obstacle in the fight against drug counterfeiting. It is generally accepted that approximately 10% of drugs worldwide could be counterfeit, but it is also well known that this number covers very different situations depending on the country, the places where the drugs are purchased, and the definition of what constitutes a counterfeit drug. The chemical analysis of drugs suspected to be fake is a crucial step as counterfeiters are becoming increasingly sophisticated, rendering visual inspection insufficient to distinguish the genuine products from the counterfeit ones. This article critically reviews the recent analytical methods employed to control the quality of drug formulations, using as an example artemisinin derivatives, medicines particularly targeted by counterfeiters. Indeed, a broad panel of techniques have been reported for their analysis, ranging from simple and cheap in-field ones (colorimetry and thin-layer chromatography) to more advanced laboratory methods (mass spectrometry, nuclear magnetic resonance, and vibrational spectroscopies) through chromatographic methods, which remain the most widely used. The conclusion section of the article highlights the questions to be posed before selecting the most appropriate analytical approach.
Autoreject: Automated artifact rejection for MEG and EEG data.
Jas, Mainak; Engemann, Denis A; Bekhti, Yousra; Raimondo, Federico; Gramfort, Alexandre
2017-10-01
We present an automated algorithm for unified rejection and repair of bad trials in magnetoencephalography (MEG) and electroencephalography (EEG) signals. Our method capitalizes on cross-validation in conjunction with a robust evaluation metric to estimate the optimal peak-to-peak threshold - a quantity commonly used for identifying bad trials in M/EEG. This approach is then extended to a more sophisticated algorithm which estimates this threshold for each sensor yielding trial-wise bad sensors. Depending on the number of bad sensors, the trial is then repaired by interpolation or by excluding it from subsequent analysis. All steps of the algorithm are fully automated thus lending itself to the name Autoreject. In order to assess the practical significance of the algorithm, we conducted extensive validation and comparisons with state-of-the-art methods on four public datasets containing MEG and EEG recordings from more than 200 subjects. The comparisons include purely qualitative efforts as well as quantitatively benchmarking against human supervised and semi-automated preprocessing pipelines. The algorithm allowed us to automate the preprocessing of MEG data from the Human Connectome Project (HCP) going up to the computation of the evoked responses. The automated nature of our method minimizes the burden of human inspection, hence supporting scalability and reliability demanded by data analysis in modern neuroscience. Copyright © 2017 Elsevier Inc. All rights reserved.
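A minimal, self-contained sketch of the kind of per-sensor rejection described above, using the open-source autoreject Python package together with MNE-Python on synthetic data (the channel layout, trial counts, and parameter grids are illustrative assumptions, not the settings used in the paper):

```python
import numpy as np
import mne
from autoreject import AutoReject

# Build a small synthetic EEG dataset so the example runs on its own.
rng = np.random.RandomState(0)
ch_names = ["Fp1", "Fp2", "F3", "F4", "C3", "C4", "P3", "P4",
            "O1", "O2", "F7", "F8", "T7", "T8", "P7", "P8"]
info = mne.create_info(ch_names=ch_names, sfreq=100.0, ch_types="eeg")
data = rng.randn(40, 16, 100) * 1e-5        # 40 trials, 16 channels, 1 s each
data[5, 3] += 5e-4                          # inject a large artifact in one trial/channel
epochs = mne.EpochsArray(data, info, verbose=False)
epochs.set_montage("standard_1020")         # sensor positions are needed for interpolation

# Cross-validated, per-sensor peak-to-peak thresholds; bad sensors are interpolated
# per trial, and trials with too many bad sensors are dropped.
ar = AutoReject(n_interpolate=[1, 2, 4], consensus=[0.2, 0.5, 0.8],
                random_state=42, n_jobs=1)
epochs_clean, reject_log = ar.fit_transform(epochs, return_log=True)
print("trials dropped:", int(reject_log.bad_epochs.sum()))
```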
Rajasingh, Sheeja; Isai, Dona Greta; Samanta, Saheli; Zhou, Zhi-Gang; Dawn, Buddhadeb; Kinsey, William H; Czirok, Andras; Rajasingh, Johnson
2018-04-05
Induced pluripotent stem cell (iPSC)-based cardiac regenerative medicine requires the efficient generation, structural soundness and proper functioning of mature cardiomyocytes derived from the patient's somatic cells. The most important functional property of cardiomyocytes is the ability to contract. Currently available methods routinely used to test and quantify cardiomyocyte function involve techniques that are labor-intensive, invasive, require sophisticated instruments or can adversely affect cell vitality. We recently developed an optical flow imaging method that analyzes and quantifies cardiomyocyte contractile kinetics from video microscopic recordings without compromising cell quality. Specifically, our automated particle image velocimetry (PIV) analysis of phase-contrast video images captured at a high frame rate yields statistical measures characterizing the beating frequency, amplitude, average waveform and beat-to-beat variations. Thus, it can be a powerful assessment tool to monitor cardiomyocyte quality and maturity. Here we demonstrate the ability of our analysis to characterize the chronotropic responses of human iPSC-derived cardiomyocytes to a panel of ion channel modulators and also to doxorubicin, a chemotherapy agent with known cardiotoxic side effects. We conclude that the PIV-derived beat patterns can identify the elongation or shortening of specific phases in the contractility cycle, and that the obtained chronotropic responses are in accord with known clinical outcomes. Hence, this system can serve as a powerful tool to screen new and currently available pharmacological compounds for cardiotoxic effects.
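A rough sketch of how a beating signal can be pulled from phase-contrast video using generic dense optical flow (OpenCV's Farneback estimator standing in for the authors' PIV pipeline; the file name and frame rate below are placeholders, and the beat estimate is deliberately crude):

```python
import cv2
import numpy as np

# Hedged sketch: derive a beating signal from phase-contrast video with dense optical
# flow.  This is a stand-in for the authors' PIV analysis, not their implementation.
cap = cv2.VideoCapture("cardiomyocytes.avi")   # placeholder path
ok, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

motion = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(prev, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    motion.append(np.linalg.norm(flow, axis=2).mean())   # mean pixel motion per frame
    prev = gray
cap.release()

motion = np.asarray(motion)
fps = 30.0                                     # assumed frame rate
# Each contraction-relaxation cycle produces two motion peaks, so count rising
# threshold crossings and halve them for a crude beat-frequency estimate.
above = motion > motion.mean() + motion.std()
crossings = np.sum(np.diff(above.astype(int)) == 1)
print(f"approximate beating frequency: {crossings / 2.0 / (len(motion) / fps):.2f} Hz")
```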
Automated, on-board terrain analysis for precision landings
NASA Technical Reports Server (NTRS)
Rahman, Zia-ur; Jobson, Daniel J.; Woodell, Glenn A.; Hines, Glenn D.
2006-01-01
Advances in space robotics technology hinge to a large extent upon the development and deployment of sophisticated new vision-based methods for automated in-space mission operations and scientific survey. To this end, we have developed a new concept for automated terrain analysis that is based upon a generic image enhancement platform: multi-scale retinex (MSR) and visual servo (VS) processing. This pre-conditioning with the MSR and the VS produces a "canonical" visual representation that is largely independent of lighting variations and exposure errors. Enhanced imagery is then processed with a biologically inspired two-channel edge detection process, followed by a smoothness-based criterion for image segmentation. Landing sites can be automatically determined by examining the results of the smoothness-based segmentation, which shows those areas in the image that surpass a minimum degree of smoothness. Though the MSR has proven to be a very strong enhancement engine, the other elements of the approach (the VS, terrain map generation, and smoothness-based segmentation) are in early stages of development. Experimental results on data from the Mars Global Surveyor show that the imagery can be processed to automatically obtain smooth landing sites. In this paper, we describe the method used to obtain these landing sites, and also examine the smoothness criteria in terms of the imager and scene characteristics. Several examples of applying this method to simulated and real imagery are shown.
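A compact sketch of the multi-scale retinex step in its standard formulation (the scales, weights, and test image are illustrative choices; the paper's specific parameters and the VS stage are not reproduced here):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def multi_scale_retinex(image, sigmas=(15, 80, 250), weights=None):
    """Standard MSR: weighted sum of log(image) - log(Gaussian-blurred image).

    `image` is a 2-D float array; sigmas and weights are illustrative defaults.
    """
    img = image.astype(np.float64) + 1.0          # avoid log(0)
    weights = weights or [1.0 / len(sigmas)] * len(sigmas)
    out = np.zeros_like(img)
    for sigma, w in zip(sigmas, weights):
        out += w * (np.log(img) - np.log(gaussian_filter(img, sigma)))
    # stretch the result back to [0, 1] for display or downstream segmentation
    return (out - out.min()) / (out.max() - out.min() + 1e-12)

if __name__ == "__main__":
    # synthetic terrain-like test image with a strong illumination gradient
    rng = np.random.default_rng(0)
    y, x = np.mgrid[0:256, 0:256]
    scene = rng.random((256, 256)) * 0.2 + (x / 256.0)   # texture plus lighting ramp
    enhanced = multi_scale_retinex(scene)
    print("output range:", enhanced.min(), enhanced.max())
```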
Fourment, Mathieu; Holmes, Edward C
2014-07-24
Early methods for estimating divergence times from gene sequence data relied on the assumption of a molecular clock. More sophisticated methods were created to model rate variation and used auto-correlation of rates, local clocks, or the so called "uncorrelated relaxed clock" where substitution rates are assumed to be drawn from a parametric distribution. In the case of Bayesian inference methods the impact of the prior on branching times is not clearly understood, and if the amount of data is limited the posterior could be strongly influenced by the prior. We develop a maximum likelihood method--Physher--that uses local or discrete clocks to estimate evolutionary rates and divergence times from heterochronous sequence data. Using two empirical data sets we show that our discrete clock estimates are similar to those obtained by other methods, and that Physher outperformed some methods in the estimation of the root age of an influenza virus data set. A simulation analysis suggests that Physher can outperform a Bayesian method when the real topology contains two long branches below the root node, even when evolution is strongly clock-like. These results suggest it is advisable to use a variety of methods to estimate evolutionary rates and divergence times from heterochronous sequence data. Physher and the associated data sets used here are available online at http://code.google.com/p/physher/.
2016 KIVA-hpFE Development: A Robust and Accurate Engine Modeling Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carrington, David Bradley; Waters, Jiajia
Los Alamos National Laboratory and its collaborators are facilitating engine modeling by improving accuracy and robustness of the modeling, and improving the robustness of software. We also continue to improve the physical modeling methods. We are developing and implementing new mathematical algorithms, those that represent the physics within an engine. We provide software that others may use directly or that they may alter with various models e.g., sophisticated chemical kinetics, different turbulent closure methods or other fuel injection and spray systems.
Electromagnetic Imaging Methods for Nondestructive Evaluation Applications
Deng, Yiming; Liu, Xin
2011-01-01
Electromagnetic nondestructive tests are important and widely used within the field of nondestructive evaluation (NDE). Recent advances in sensing technology, in hardware and software development dedicated to imaging and image processing, and in material sciences have greatly expanded the application fields, made systems design more sophisticated, and made the potential of electromagnetic NDE imaging seemingly unlimited. This review provides a comprehensive summary of research works on electromagnetic imaging methods for NDE applications, followed by a summary and discussion of future directions. PMID:22247693
NASA Technical Reports Server (NTRS)
Johnson, Paul E.; Smith, Milton O.; Adams, John B.
1992-01-01
Algorithms were developed, based on Hapke's (1981) equations, for remote determinations of mineral abundances and particle sizes from reflectance spectra. In this method, spectra are modeled as a function of end-member abundances and illumination/viewing geometry. The method was tested on a laboratory data set. It is emphasized that, although more sophisticated models exist, the present algorithms are particularly suited for remotely sensed data, where little opportunity exists to independently measure reflectance versus particle size and phase function.
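As a much simplified stand-in for the abundance-estimation step, the sketch below unmixes a synthetic spectrum into non-negative end-member fractions by least squares. This is ordinary linear unmixing on invented data; the paper's approach operates on quantities derived from Hapke's nonlinear equations, which is not reproduced here.

```python
import numpy as np
from scipy.optimize import nnls

# Hedged sketch: non-negative least-squares unmixing of a mixed reflectance spectrum.
wavelengths = np.linspace(0.4, 2.5, 50)                       # micrometres (illustrative)
endmembers = np.vstack([
    0.3 + 0.2 * np.exp(-((wavelengths - 1.0) / 0.2) ** 2),    # synthetic mineral A
    0.6 - 0.1 * wavelengths,                                   # synthetic mineral B
    0.2 + 0.05 * wavelengths ** 2,                             # synthetic mineral C
]).T                                                           # shape (bands, endmembers)

true_abundances = np.array([0.5, 0.3, 0.2])
noise = 0.002 * np.random.default_rng(1).normal(size=wavelengths.size)
mixed = endmembers @ true_abundances + noise

est, _ = nnls(endmembers, mixed)        # non-negative abundance estimates
est /= est.sum()                        # normalise so the fractions sum to one
print("estimated abundances:", np.round(est, 3))
```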
Numerical realization of the variational method for generating self-trapped beams
NASA Astrophysics Data System (ADS)
Duque, Erick I.; Lopez-Aguayo, Servando; Malomed, Boris A.
2018-03-01
We introduce a numerical variational method based on the Rayleigh-Ritz optimization principle for predicting two-dimensional self-trapped beams in nonlinear media. This technique overcomes the limitation of the traditional variational approximation in performing analytical Lagrangian integration and differentiation. Approximate soliton solutions of a generalized nonlinear Schrödinger equation are obtained, demonstrating robustness of the beams of various types (fundamental, vortices, multipoles, azimuthons) in the course of their propagation. The algorithm offers possibilities to produce more sophisticated soliton profiles in general nonlinear models.
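A toy numerical Rayleigh-Ritz step in the same spirit: the beam energy is evaluated by quadrature rather than analytical integration and minimised over the width of a Gaussian trial profile. For brevity this uses the 1D cubic nonlinear Schrödinger equation at fixed power, not the paper's 2D generalised model or its vortex/azimuthon ansätze.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hedged sketch: numerical Rayleigh-Ritz minimisation of the 1D NLS energy
#   H[u] = ∫ (|u_x|^2 - |u|^4 / 2) dx   at fixed power P = ∫ |u|^2 dx,
# over a Gaussian trial profile.  All integrals are done by quadrature on a grid.
x = np.linspace(-40, 40, 4001)
P = 2.0                                      # prescribed beam power (illustrative)

def energy(width):
    amp2 = P / (np.sqrt(np.pi) * width)      # |A|^2 fixed by the power constraint
    u = np.sqrt(amp2) * np.exp(-x**2 / (2 * width**2))
    ux = np.gradient(u, x)
    return np.trapz(ux**2 - 0.5 * u**4, x)

res = minimize_scalar(energy, bounds=(0.1, 20.0), method="bounded")
analytic = 2 * np.sqrt(2 * np.pi) / P        # closed-form optimum for this ansatz
print(f"optimal trial width: {res.x:.3f}  (analytic value {analytic:.3f})")
```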
Greenfeld, Max; van de Meent, Jan-Willem; Pavlichin, Dmitri S; Mabuchi, Hideo; Wiggins, Chris H; Gonzalez, Ruben L; Herschlag, Daniel
2015-01-16
Single-molecule techniques have emerged as incisive approaches for addressing a wide range of questions arising in contemporary biological research [Trends Biochem Sci 38:30-37, 2013; Nat Rev Genet 14:9-22, 2013; Curr Opin Struct Biol 2014, 28C:112-121; Annu Rev Biophys 43:19-39, 2014]. The analysis and interpretation of raw single-molecule data benefits greatly from the ongoing development of sophisticated statistical analysis tools that enable accurate inference at the low signal-to-noise ratios frequently associated with these measurements. While a number of groups have released analysis toolkits as open source software [J Phys Chem B 114:5386-5403, 2010; Biophys J 79:1915-1927, 2000; Biophys J 91:1941-1951, 2006; Biophys J 79:1928-1944, 2000; Biophys J 86:4015-4029, 2004; Biophys J 97:3196-3205, 2009; PLoS One 7:e30024, 2012; BMC Bioinformatics 288 11(8):S2, 2010; Biophys J 106:1327-1337, 2014; Proc Int Conf Mach Learn 28:361-369, 2013], it remains difficult to compare analysis for experiments performed in different labs due to a lack of standardization. Here we propose a standardized single-molecule dataset (SMD) file format. SMD is designed to accommodate a wide variety of computer programming languages, single-molecule techniques, and analysis strategies. To facilitate adoption of this format we have made two existing data analysis packages that are used for single-molecule analysis compatible with this format. Adoption of a common, standard data file format for sharing raw single-molecule data and analysis outcomes is a critical step for the emerging and powerful single-molecule field, which will benefit both sophisticated users and non-specialists by allowing standardized, transparent, and reproducible analysis practices.
NASA Astrophysics Data System (ADS)
Schepen, Andrew; Zhao, Tongtiegang; Wang, Quan J.; Robertson, David E.
2018-03-01
Rainfall forecasts are an integral part of hydrological forecasting systems at sub-seasonal to seasonal timescales. In seasonal forecasting, global climate models (GCMs) are now the go-to source for rainfall forecasts. For hydrological applications however, GCM forecasts are often biased and unreliable in uncertainty spread, and calibration is therefore required before use. There are sophisticated statistical techniques for calibrating monthly and seasonal aggregations of the forecasts. However, calibration of seasonal forecasts at the daily time step typically uses very simple statistical methods or climate analogue methods. These methods generally lack the sophistication to achieve unbiased, reliable and coherent forecasts of daily amounts and seasonal accumulated totals. In this study, we propose and evaluate a Rainfall Post-Processing method for Seasonal forecasts (RPP-S), which is based on the Bayesian joint probability modelling approach for calibrating daily forecasts and the Schaake Shuffle for connecting the daily ensemble members of different lead times. We apply the method to post-process ACCESS-S forecasts for 12 perennial and ephemeral catchments across Australia and for 12 initialisation dates. RPP-S significantly reduces bias in raw forecasts and improves both skill and reliability. RPP-S forecasts are also more skilful and reliable than forecasts derived from ACCESS-S forecasts that have been post-processed using quantile mapping, especially for monthly and seasonal accumulations. Several opportunities to improve the robustness and skill of RPP-S are identified. The new RPP-S post-processed forecasts will be used in ensemble sub-seasonal to seasonal streamflow applications.
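A toy sketch of the Schaake Shuffle step used to link calibrated daily ensemble members across lead times (standard formulation; the array shapes and the historical template below are illustrative, and the Bayesian joint probability calibration step is not shown):

```python
import numpy as np

def schaake_shuffle(ensemble, template):
    """Reorder ensemble members at each lead time so their rank structure matches a
    historical template, restoring realistic temporal correlation across lead times.

    ensemble, template: arrays of shape (n_members, n_lead_times).
    """
    shuffled = np.empty_like(ensemble)
    for t in range(ensemble.shape[1]):
        ranks = np.argsort(np.argsort(template[:, t]))   # rank of each template member
        shuffled[:, t] = np.sort(ensemble[:, t])[ranks]  # give member i the value of equal rank
    return shuffled

# Illustrative use: 5 ensemble members over 4 daily lead times.
rng = np.random.default_rng(0)
raw = rng.gamma(shape=2.0, scale=3.0, size=(5, 4))          # calibrated daily samples
hist = np.cumsum(rng.gamma(2.0, 3.0, size=(5, 4)), axis=1)  # historical template with persistence
print(schaake_shuffle(raw, hist))
```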
iTTVis: Interactive Visualization of Table Tennis Data.
Wu, Yingcai; Lan, Ji; Shu, Xinhuan; Ji, Chenyang; Zhao, Kejian; Wang, Jiachen; Zhang, Hui
2018-01-01
The rapid development of information technology paved the way for the recording of fine-grained data, such as stroke techniques and stroke placements, during a table tennis match. This data recording creates opportunities to analyze and evaluate matches from new perspectives. Nevertheless, the increasingly complex data poses a significant challenge to make sense of and gain insights into. Analysts usually employ tedious and cumbersome methods which are limited to watching videos and reading statistical tables. However, existing sports visualization methods cannot be applied to visualizing table tennis competitions due to different competition rules and particular data attributes. In this work, we collaborate with data analysts to understand and characterize the sophisticated domain problem of analysis of table tennis data. We propose iTTVis, a novel interactive table tennis visualization system, which to our knowledge, is the first visual analysis system for analyzing and exploring table tennis data. iTTVis provides a holistic visualization of an entire match from three main perspectives, namely, time-oriented, statistical, and tactical analyses. The proposed system with several well-coordinated views not only supports correlation identification through statistics and pattern detection of tactics with a score timeline but also allows cross analysis to gain insights. Data analysts have obtained several new insights by using iTTVis. The effectiveness and usability of the proposed system are demonstrated with four case studies.
A rapid method for the sampling of atmospheric water vapour for isotopic analysis.
Peters, Leon I; Yakir, Dan
2010-01-01
Analysis of the stable isotopic composition of atmospheric moisture is widely applied in the environmental sciences. Traditional methods for obtaining isotopic compositional data from ambient moisture have required complicated sampling procedures, expensive and sophisticated distillation lines, hazardous consumables, and lengthy treatments prior to analysis. Newer laser-based techniques are expensive and usually not suitable for large-scale field campaigns, especially in cases where access to mains power is not feasible or high spatial coverage is required. Here we outline the construction and usage of a novel vapour-sampling system based on a battery-operated Stirling cycle cooler, which is simple to operate, does not require any consumables or post-collection distillation, and is light-weight and highly portable. We demonstrate the ability of this system to reproduce δ¹⁸O isotopic compositions of ambient water vapour, with samples taken simultaneously by a traditional cryogenic collection technique. Samples were collected over 1 h directly into autosampler vials and were analysed by mass spectrometry after pyrolysis of 1 µL aliquots to CO. This yielded an average error of < ±0.5‰, approximately equal to the signal-to-noise ratio of traditional approaches. This new system provides a rapid and reliable alternative to conventional cryogenic techniques, particularly in cases requiring high sample throughput or where access to distillation lines, slurry maintenance or mains power is not feasible. Copyright 2009 John Wiley & Sons, Ltd.
Rosen, M. A.; Sampson, J. B.; Jackson, E. V.; Koka, R.; Chima, A. M.; Ogbuagu, O. U.; Marx, M. K.; Koroma, M.; Lee, B. H.
2014-01-01
Background Anaesthesia care in developed countries involves sophisticated technology and experienced providers. However, advanced machines may be inoperable or fail frequently when placed into the austere medical environment of a developing country. Failure mode and effects analysis (FMEA) is a method for engaging local staff in identifying real or potential breakdowns in processes or work systems and to develop strategies to mitigate risks. Methods Nurse anaesthetists from the two tertiary care hospitals in Freetown, Sierra Leone, participated in three sessions moderated by a human factors specialist and an anaesthesiologist. Sessions were audio recorded, and group discussion graphically mapped by the session facilitator for analysis and commentary. These sessions sought to identify potential barriers to implementing an anaesthesia machine designed for austere medical environments—the universal anaesthesia machine (UAM)—and also engaging local nurse anaesthetists in identifying potential solutions to these barriers. Results Participating Sierra Leonean clinicians identified five main categories of failure modes (resource availability, environmental issues, staff knowledge and attitudes, and workload and staffing issues) and four categories of mitigation strategies (resource management plans, engaging and educating stakeholders, peer support for new machine use, and collectively advocating for needed resources). Conclusions We identified factors that may limit the impact of a UAM and devised likely effective strategies for mitigating those risks. PMID:24833727
Cell-fusion method to visualize interphase nuclear pore formation.
Maeshima, Kazuhiro; Funakoshi, Tomoko; Imamoto, Naoko
2014-01-01
In eukaryotic cells, the nucleus is a complex and sophisticated organelle that organizes genomic DNA to support essential cellular functions. The nuclear surface contains many nuclear pore complexes (NPCs), channels for macromolecular transport between the cytoplasm and nucleus. It is well known that the number of NPCs almost doubles during interphase in cycling cells. However, the mechanism of NPC formation is poorly understood, presumably because a practical system for analysis does not exist. The most difficult obstacle in the visualization of interphase NPC formation is that NPCs already exist after nuclear envelope formation, and these existing NPCs interfere with the observation of nascent NPCs. To overcome this obstacle, we developed a novel system using the cell-fusion technique (heterokaryon method), previously also used to analyze the shuttling of macromolecules between the cytoplasm and the nucleus, to visualize the newly synthesized interphase NPCs. In addition, we used a photobleaching approach that validated the cell-fusion method. We recently used these methods to demonstrate the role of cyclin-dependent protein kinases and of Pom121 in interphase NPC formation in cycling human cells. Here, we describe the details of the cell-fusion approach and compare the system with other NPC formation visualization methods. Copyright © 2014 Elsevier Inc. All rights reserved.
Kivlehan, Francine; Mavré, François; Talini, Luc; Limoges, Benoît; Marchal, Damien
2011-09-21
We described an electrochemical method to monitor in real-time the isothermal helicase-dependent amplification of nucleic acids. The principle of detection is simple and well-adapted to the development of portable, easy-to-use and inexpensive nucleic acids detection technologies. It consists of monitoring a decrease in the electrochemical current response of a reporter DNA intercalating redox probe during the isothermal DNA amplification. The method offers the possibility to quantitatively analyze target nucleic acids in less than one hour at a single constant temperature, and to perform at the end of the isothermal amplification a DNA melt curve analysis for differentiating between specific and non-specific amplifications. To illustrate the potentialities of this approach for the development of a simple, robust and low-cost instrument with high throughput capability, the method was validated with an electrochemical system capable of monitoring up to 48 real-time isothermal HDA reactions simultaneously in a disposable microplate consisting of 48-electrochemical microwells. Results obtained with this approach are comparable to that obtained with a well-established but more sophisticated and expensive fluorescence-based method. This makes for a promising alternative detection method not only for real-time isothermal helicase-dependent amplification of nucleic acid, but also for other isothermal DNA amplification strategies.
Zou, Ling; Zhao, Haihua; Zhang, Hongbin
2016-03-09
This work represents a first-of-its-kind successful application of advanced numerical methods to solving realistic two-phase flow problems with the two-fluid six-equation two-phase flow model. These advanced numerical methods include a high-resolution spatial discretization scheme with staggered grids, high-order fully implicit time integration schemes, and the Jacobian-free Newton–Krylov (JFNK) method as the nonlinear solver. The computer code developed in this work has been extensively validated against existing experimental flow boiling data in vertical pipes and rod bundles, which cover wide ranges of experimental conditions, such as pressure, inlet mass flux, wall heat flux and exit void fraction. An additional code-to-code benchmark with the RELAP5-3D code further verifies the correct code implementation. The combined methods employed in this work exhibit strong robustness in solving two-phase flow problems even when phase appearance (boiling) and realistic discrete flow regimes are considered. Transitional flow regimes used in existing system analysis codes, normally introduced to overcome numerical difficulty, were completely removed in this work. As a result, this provides the possibility of utilizing more sophisticated flow regime maps in the future to further improve simulation accuracy.
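As a generic illustration of a Jacobian-free Newton-Krylov solve, the sketch below uses SciPy's built-in implementation on a small nonlinear diffusion problem; it is a toy stand-in, not the authors' two-fluid six-equation code.

```python
import numpy as np
from scipy.optimize import newton_krylov

# Hedged sketch: JFNK on d/dx[(1 + u^2) du/dx] + 1 = 0 with u(0) = u(1) = 0,
# discretised by finite differences.  The Krylov solver only needs residual
# evaluations, so no Jacobian matrix is ever formed -- the defining feature of JFNK.
n = 101
h = 1.0 / (n - 1)

def residual(u):
    r = np.zeros_like(u)
    k = 1.0 + u**2                               # solution-dependent conductivity
    k_face = 0.5 * (k[1:] + k[:-1])              # conductivity averaged to cell faces
    flux = k_face * (u[1:] - u[:-1]) / h         # fluxes at cell faces
    r[1:-1] = (flux[1:] - flux[:-1]) / h + 1.0   # interior residual
    r[0], r[-1] = u[0], u[-1]                    # Dirichlet boundary conditions
    return r

sol = newton_krylov(residual, np.zeros(n), f_tol=1e-10)
print(f"peak value: {sol.max():.4f}")            # slightly below 0.125, the linear answer
```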
Function allocation for humans and automation in the context of team dynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeffrey C. Joe; John O'Hara; Jacques Hugo
Within Human Factors Engineering, a decision-making process called function allocation (FA) is used during the design life cycle of complex systems to distribute the system functions, often identified through a functional requirements analysis, to all human and automated machine agents (or teammates) involved in controlling the system. Most FA methods make allocation decisions primarily by comparing the capabilities of humans and automation, but then also by considering secondary factors such as cost, regulations, and the health and safety of workers. The primary analysis of the strengths and weaknesses of humans and machines, however, is almost always considered in terms of individual human or machine capabilities. Yet FA is fundamentally about teamwork, in that the goal of the FA decision-making process is to determine the optimal allocations of functions among agents. Given this framing of FA, and the increasing use and sophistication of automation, there are two related social psychological issues that current FA methods need to address more thoroughly. First, many principles for effective human teamwork are not considered as central decision points or in the iterative hypothesis and testing phase in most FA methods, even though it is clear that social factors have numerous positive and negative effects on individual and team capabilities. Second, social psychological factors affecting team performance can be difficult to translate to automated agents, and most FA methods currently do not account for this effect. The implications of these issues are discussed.
Synthesis in land change science: methodological patterns, challenges, and guidelines.
Magliocca, Nicholas R; Rudel, Thomas K; Verburg, Peter H; McConnell, William J; Mertz, Ole; Gerstner, Katharina; Heinimann, Andreas; Ellis, Erle C
Global and regional economic and environmental changes are increasingly influencing local land use, livelihoods, and ecosystems. At the same time, cumulative local land changes are driving global and regional changes in biodiversity and the environment. To understand the causes and consequences of these changes, land change science (LCS) draws on a wide array of synthetic and meta-study techniques to generate global and regional knowledge from local case studies of land change. Here, we review the characteristics and applications of synthesis methods in LCS and assess the current state of synthetic research based on a meta-analysis of synthesis studies from 1995 to 2012. Publication of synthesis research is accelerating, with a clear trend toward increasingly sophisticated and quantitative methods, including meta-analysis. Detailed trends in synthesis objectives, methods, and the land change phenomena and world regions most commonly studied are presented. Significant challenges to successful synthesis research in LCS are also identified, including issues of interpretability and comparability across case studies and the limits of and biases in the geographic coverage of case studies. Nevertheless, synthesis methods based on local case studies will remain essential for generating systematic global and regional understanding of local land change for the foreseeable future, and multiple opportunities exist to accelerate and enhance the reliability of synthetic LCS research in the future.
The reliability of an instrumented start block analysis system.
Tor, Elaine; Pease, David L; Ball, Kevin A
2015-02-01
The swimming start is highly influential to overall competition performance. Therefore, it is paramount to develop reliable methods to perform accurate biomechanical analysis of start performance for training and research. The Wetplate Analysis System is a custom-made force plate system developed by the Australian Institute of Sport--Aquatic Testing, Training and Research Unit (AIS ATTRU). This sophisticated system combines both force data and 2D digitization to measure a number of kinetic and kinematic parameter values in an attempt to evaluate start performance. Fourteen elite swimmers performed two maximal effort dives (performance was defined as time from start signal to 15 m) over two separate testing sessions. Intraclass correlation coefficients (ICC) were used to determine each parameter's reliability. The kinetic parameters all had ICC greater than 0.9 except the time of peak vertical force (0.742). This may have been due to variations in movement initiation after the starting signal between trials. The kinematic and time parameters also had ICC greater than 0.9 apart from for the time of maximum depth (0.719). This parameter was lower due to the swimmers varying their depth between trials. Based on the high ICC scores for all parameters, the Wetplate Analysis System is suitable for biomechanical analysis of swimming starts.
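A small sketch of the test-retest reliability calculation described above, using the pingouin package's intraclass correlation on made-up two-session data (the column names, parameter, values, and choice of ICC form are assumptions, not the study's data or exact model):

```python
import pandas as pd
import pingouin as pg

# Hedged sketch: ICC for one start parameter measured in two sessions for 7 swimmers.
data = pd.DataFrame({
    "swimmer": list(range(1, 8)) * 2,
    "session": [1] * 7 + [2] * 7,
    "time_to_15m": [6.61, 6.85, 6.52, 7.02, 6.70, 6.95, 6.58,
                    6.64, 6.82, 6.55, 7.05, 6.72, 6.91, 6.60],
})
icc = pg.intraclass_corr(data=data, targets="swimmer", raters="session",
                         ratings="time_to_15m")
print(icc[["Type", "ICC", "CI95%"]])
```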
Shiokawa, Yuka; Date, Yasuhiro; Kikuchi, Jun
2018-02-21
Computer-based technological innovation provides advancements in sophisticated and diverse analytical instruments, enabling massive amounts of data collection with relative ease. This is accompanied by a fast-growing demand for technological progress in data mining methods for analysis of big data derived from chemical and biological systems. From this perspective, use of a general "linear" multivariate analysis alone limits interpretations due to "non-linear" variations in metabolic data from living organisms. Here we describe a kernel principal component analysis (KPCA)-incorporated analytical approach for extracting useful information from metabolic profiling data. To overcome the limitation of important variable (metabolite) determinations, we incorporated a random forest conditional variable importance measure into our KPCA-based analytical approach to demonstrate the relative importance of metabolites. Using a market basket analysis, hippurate, the most important variable detected in the importance measure, was associated with high levels of some vitamins and minerals present in foods eaten the previous day, suggesting a relationship between increased hippurate and intake of a wide variety of vegetables and fruits. Therefore, the KPCA-incorporated analytical approach described herein enabled us to capture input-output responses, and should be useful not only for metabolic profiling but also for profiling in other areas of biological and environmental systems.
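A hedged sketch of the two ingredients named above, using scikit-learn: a kernel PCA projection to capture non-linear structure, followed by a random-forest importance measure to rank the input variables. The data, kernel, and parameters are illustrative, and plain impurity-based importance stands in for the conditional variable importance used in the paper.

```python
import numpy as np
from sklearn.datasets import make_moons
from sklearn.decomposition import KernelPCA
from sklearn.ensemble import RandomForestRegressor

# Toy non-linear data plus three pure-noise variables standing in for metabolites.
X, _ = make_moons(n_samples=300, noise=0.05, random_state=0)
X = np.hstack([X, np.random.default_rng(0).normal(size=(300, 3))])

kpca = KernelPCA(n_components=2, kernel="rbf", gamma=2.0)
scores = kpca.fit_transform(X)                      # non-linear component scores

# Rank the original variables by how well they predict the first KPCA component.
rf = RandomForestRegressor(n_estimators=300, random_state=0)
rf.fit(X, scores[:, 0])
for i, imp in enumerate(rf.feature_importances_):
    print(f"variable {i}: importance {imp:.3f}")
```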
de Heer, Brooke
2016-02-01
Prior research on rapes reported to law enforcement has identified criminal sophistication and the use of force against the victim as possible unique identifiers of serial rape versus one-time rape. This study sought to contribute to the current literature on reported serial rape by investigating how the rapist's level of criminal sophistication and use of force were associated with two important outcomes of rape: victim injury and overall severity of the assault. In addition, it was evaluated whether rapist and victim ethnicity affected these relationships. A nation-wide sample of serial rape cases reported to law enforcement collected by the Federal Bureau of Investigation (FBI) was analyzed (108 rapists, 543 victims). Results indicated that serial rapists typically used a limited amount of force against the victim and displayed a high degree of criminal sophistication. In addition, the more criminally sophisticated the perpetrator was, the more sexual acts he performed on his victim. Finally, rapes between a White rapist and White victim were found to exhibit higher levels of criminal sophistication and were more severe in terms of the number and types of sexual acts committed. These findings provide a more in-depth understanding of serial rape that can inform both academics and practitioners in the field about contributors to victim injury and severity of the assault. © The Author(s) 2014.
Simultaneous master-slave Omega pairs. [navigation system featuring low cost receiver
NASA Technical Reports Server (NTRS)
Burhans, R. W.
1974-01-01
Master-slave sequence ordering of the Omega system is suggested as a method of improving the pair geometry for low-cost receiver user benefit. The sequence change will not affect present sophisticated processor users other than require new labels for some pair combinations, but may require worldwide transmitter operators to slightly alter their long-range synchronizing techniques.
Ralph J. Alig
2004-01-01
Over the past 25 years, renewable resource assessments have addressed demand, supply, and inventory of various renewable resources in increasingly sophisticated fashion, including simulation and optimization analyses of area changes in land uses (e.g., urbanization) and land covers (e.g., plantations vs. naturally regenerated forests). This synthesis reviews related...
ERIC Educational Resources Information Center
Sadd, James; Morello-Frosch, Rachel; Pastor, Manuel; Matsuoka, Martha; Prichard, Michele; Carter, Vanessa
2014-01-01
Environmental justice advocates often argue that environmental hazards and their health effects vary by neighborhood, income, and race. To assess these patterns and advance preventive policy, their colleagues in the research world often use complex and methodologically sophisticated statistical and geospatial techniques. One way to bridge the gap…
Finding patterns in biomolecular data, particularly in DNA and RNA, is at the center of modern biological research. These data are complex and growing rapidly, so the search for patterns requires increasingly sophisticated computer methods. This book provides a summary of principal techniques. Each chapter describes techniques that are drawn from many fields, including graph
ERIC Educational Resources Information Center
Pizauro, Joao M., Jr.; Ferro, Jesus A.; de Lima, Andrea C. F.; Routman, Karina S.; Portella, Maria Celia
2004-01-01
The present research describes an efficient procedure to obtain high levels of trypsinogen and chymotrypsinogen by using a simple, rapid, and easily reproducible method. The extraction process and the time-course of activation of zymogens can be carried out in a single laboratory period, without sophisticated equipment. The main objective was to…
ERIC Educational Resources Information Center
Erickson, Frederick
The limits and boundaries of anthropology are briefly discussed, along with a general description of lay attitudes towards the field. A research case is given to illustrate the way in which anthropological study methods can contribute to educational research. Noted among these contributions is an informed distrust that anthropologists exhibit…
Structural Uncertainties in Numerical Induction Models
2006-07-01
"divide and conquer" modelling approach. Analytical inputs are then assessments, quantitative or qualitative, of the value, performance, or some... said to be naive because it relies heavily on the inductive method itself. Sophisticated Induction (Logical Positivism): this form of induction... falters. Popper's Falsification: Karl Popper, around 1959, introduced a variant to the above Logical Positivism, known as the inductive-hypothetico...
The State of Nursing Home Information Technology Sophistication in Rural and Nonrural US Markets.
Alexander, Gregory L; Madsen, Richard W; Miller, Erin L; Wakefield, Douglas S; Wise, Keely K; Alexander, Rachel L
2017-06-01
To test for significant differences in information technology sophistication (ITS) in US nursing homes (NHs) based on location. We administered a primary survey from January 2014 to July 2015 to NHs in each US state. The survey was cross-sectional and examined 3 dimensions (IT capabilities, extent of IT use, degree of IT integration) across 3 domains (resident care, clinical support, administrative activities) of ITS. ITS was broken down by NH location. Mean responses were compared across 4 NH categories (Metropolitan, Micropolitan, Small Town, and Rural) for all 9 ITS dimension-domain combinations. Least squares means and Tukey's method were used for multiple comparisons. These methods yielded 815/1,799 surveys (45% response rate). In every health care domain (resident care, clinical support, and administrative activities), statistically significant differences in facility ITS occurred between more populated (metropolitan or micropolitan) and less populated (small town or rural) areas. This study represents the most current national assessment of NH IT since 2004. Historically, NH IT has been used solely for administrative activities and much less for resident care and clinical support. However, the results are encouraging, as ITS in other domains appears to be greater than previously imagined. © 2016 National Rural Health Association.
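A small sketch of the kind of comparison described above: mean scores compared across the four location categories with Tukey's HSD, via statsmodels. The scores below are simulated with assumed group means; they are not the survey data.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)
groups = ["Metropolitan", "Micropolitan", "Small Town", "Rural"]
frames = []
for g, mu in zip(groups, [6.1, 5.8, 5.2, 4.9]):          # assumed group means
    frames.append(pd.DataFrame({"location": g, "its_score": rng.normal(mu, 1.0, 60)}))
df = pd.concat(frames, ignore_index=True)

# Tukey's method for all pairwise comparisons of location categories.
result = pairwise_tukeyhsd(endog=df["its_score"], groups=df["location"], alpha=0.05)
print(result.summary())
```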
Speed genome editing by transient CRISPR/Cas9 targeting and large DNA fragment deletion.
Luo, Jing; Lu, Liaoxun; Gu, Yanrong; Huang, Rong; Gui, Lin; Li, Saichao; Qi, Xinhui; Zheng, Wenping; Chao, Tianzhu; Zheng, Qianqian; Liang, Yinming; Zhang, Lichen
2018-06-07
Genetic engineering of cell lines and model organisms has been facilitated enormously by the CRISPR/Cas9 system. However, in cell lines it remains labor intensive and time consuming to obtain desirable mutant clones due to the difficulties in isolating the mutated clones and sophisticated genotyping. In this study, we have validated fluorescent protein reporter aided cell sorting which enables the isolation of maximal diversity in mutant cells. We further applied two spectrally distinct fluorescent proteins DsRed2 and ECFP as reporters for independent CRISPR/Cas9 mediated targeting, which allows for one-cell-one-well sorting of the mutant cells. Because of ultra-high efficiency of the CRISPR/Cas9 system with dual reporters and large DNA fragment deletion resulting from independent loci cleavage, monoclonal mutant cells could be easily identified by conventional PCR. In the speed genome editing method presented here, sophisticated genotyping methods are not necessary to identify loss of function mutations after CRISPR/Cas9 genome editing, and desirable loss of function mutant clones could be obtained in less than one month following transfection. Copyright © 2018 Elsevier B.V. All rights reserved.
Preliminary design optimization of joined-wing aircraft
NASA Technical Reports Server (NTRS)
Gallman, John W.; Kroo, Ilan M.; Smith, Stephen C.
1990-01-01
The joined wing is an innovative aircraft configuration that has its tail connected to the wing, forming a diamond shape in both the plan view and the front view. This geometric arrangement uses the tail both for pitch control and as a structural support for the wing. Several researchers have studied this configuration and predicted significant reductions in trimmed drag or structural weight when compared with a conventional T-tail configuration. Kroo et al. compared the cruise drag of joined wings with conventional designs of the same lifting-surface area and structural weight. This study showed an 11 percent reduction in cruise drag for the lifting system of a joined wing. Although this reduction in cruise drag is significant, a complete design study is needed before any economic savings can be claimed for a joined-wing transport. Mission constraints, such as runway length, could increase the wing area and eliminate potential drag savings. Since other design codes do not accurately represent the interaction between structures and aerodynamics for joined wings, we developed a new design code for this study. The aerodynamic and structural analyses in this study are significantly more sophisticated than those used in most conventional design codes. This sophistication was needed to predict the aerodynamic interference between the wing and tail and the stresses in the truss-like structure. This paper describes these analysis methods, discusses some problems encountered when applying the numerical optimizer NPSOL, and compares optimum joined wings with conventional aircraft on the basis of cruise drag, lifting surface weight, and direct operating cost (DOC).
Changes Observed in Views of Nature of Science During a Historically Based Unit
NASA Astrophysics Data System (ADS)
Rudge, David Wÿss; Cassidy, David Paul; Fulford, Janice Marie; Howe, Eric Michael
2014-09-01
Numerous empirical studies have provided evidence of the effectiveness of an explicit and reflective approach to the learning of issues associated with the nature of science (NOS) (c.f. Abd-El-Khalick and Lederman in J Res Sci Teach 37(10):1057-1095, 2000). This essay reports the results of a mixed-methods association study involving 130 preservice teachers during the course of a three class unit based upon the history of science using such an approach. Within the unit the phenomenon of industrial melanism was presented as a puzzle for students to solve. Students were explicitly asked to reflect upon several NOS issues as they developed and tested their own explanations for the "mystery phenomenon". NOS views of all participants were characterized by means of surveys and follow-up interviews with a subsample of 17 participants, using a modified version of the VNOS protocol (c.f. Lederman et al. in J Res Sci Teach 39(6):497-521, 2002). An analysis of the survey results informed by the interview data suggests NOS views became more sophisticated for some issues, e.g., whether scientific knowledge requires experimentation; but not others, e.g., why scientists experiment. An examination of the interview data informed by our experiences with the unit provides insight into why the unit may have been more effective with regard to some issues than others. This includes evidence that greater sophistication of some NOS issues was fostered by the use of multiple, contextualized examples. The essay concludes with a discussion of limitations, pedagogical implications, and avenues for further research.
Technical Note: Detection of gas bubble leakage via correlation of water column multibeam images
NASA Astrophysics Data System (ADS)
Schneider von Deimling, J.; Papenberg, C.
2012-03-01
Hydroacoustic detection of natural gas release from the seafloor has in the past been conducted using singlebeam echosounders. In contrast, modern multibeam swath mapping systems allow much wider coverage and higher resolution, and offer 3-D spatial correlation. Up to the present, the extremely high data rate has hampered water column backscatter investigations, and more sophisticated visualization and processing techniques are needed. Here, we present water column backscatter data acquired with a 50 kHz prototype multibeam system over a period of 75 seconds. The data are displayed both as swath images and as a "re-sorted" singlebeam presentation. Individual gas bubbles and groups of bubbles rising from the 24 m deep seafloor clearly emerge in the acoustic images, making it possible to estimate rise velocities. A sophisticated processing scheme is introduced to identify these rising gas bubbles in the hydroacoustic data. We apply a cross-correlation technique adapted from particle imaging velocimetry (PIV) to the acoustic backscatter images. Temporal and spatial drift patterns of the bubbles are assessed and are shown to match measured and theoretical rise patterns very well. The application of this processing to our field data gives clear results with respect to unambiguous bubble detection and remote bubble rise velocimetry. The method can identify and exclude the main source of misinterpretation, i.e. fish-mediated echoes. Although image-based cross-correlation techniques are well known in the field of fluid mechanics for high-resolution, non-invasive flow field analysis, we present the first application of this technique as an acoustic bubble detector.
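A hedged sketch of the PIV-style cross-correlation step: the frame-to-frame displacement of a bubble echo is recovered from the peak of the cross-correlation of two backscatter images, computed here via the FFT. The images are synthetic; real water-column data would be beam-formed amplitudes gridded into swath images.

```python
import numpy as np

rng = np.random.default_rng(0)
frame1 = rng.random((64, 64)) * 0.1
frame1[30:34, 20:24] += 1.0                     # bright "bubble" echo
true_shift = (-3, 1)                            # rises 3 pixels, drifts right 1 pixel
frame2 = np.roll(frame1, true_shift, axis=(0, 1))

# Cross-correlation through the FFT; the peak location gives the displacement.
corr = np.fft.ifft2(np.fft.fft2(frame2) * np.conj(np.fft.fft2(frame1))).real
shape = np.array(corr.shape)
peak = np.array(np.unravel_index(np.argmax(corr), corr.shape))
peak = np.where(peak > shape // 2, peak - shape, peak)   # wrap to signed offsets
print("estimated displacement (rows, cols):", tuple(int(p) for p in peak))  # recovers (-3, 1)
# Dividing by the ping interval and scaling by the pixel size yields the apparent rise velocity.
```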
Attitudes about high school physics in relationship to gender and ethnicity: A mixed method analysis
NASA Astrophysics Data System (ADS)
Hafza, Rabieh Jamal
There is an achievement gap and a lack of participation in science, technology, engineering, and math (STEM) by minority females. The number of minority females majoring in STEM-related fields and earning advanced degrees in these fields has not significantly increased over the past 40 years. Previous research has evaluated the relationship between self-identity concept and factors that promote the academic achievement as well as the motivation of students to study different subject areas. This study examined the interaction between gender and ethnicity in terms of physics attitudes in the context of real world connections, personal interest, sense making/effort, problem solving confidence, and problem solving sophistication. The Colorado Learning Attitudes about Science Survey (CLASS) was given to 131 students enrolled in physics classes. There was a statistically significant Gender*Ethnicity interaction for attitude in the context of Real World Connections, Personal Interest, Sense Making/Effort, Problem Solving Confidence, and Problem Solving Sophistication as a whole. There was also a statistically significant Gender*Ethnicity interaction for attitude in the context of Real World Connections, Personal Interest, and Sense Making/Effort individually. Five Black females were interviewed to triangulate the quantitative results and to describe the experiences of minority females taking physics classes. Four themes emerged from the interviews and supported the findings from the quantitative results. The data supported previous research on attitudes about STEM. The results suggest that Real World Connections and Personal Interest may be factors that explain the lack of participation and the achievement gaps that exist among minority females.
75 FR 63067 - Interpretation of “Children's Product”
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-14
... a level of sophistication required to operate the locomotives. Additionally, the commenters note... railroad hobbyists, the costs involved, and the level of sophistication required to operate them. Model...
The conceptualization and measurement of cognitive health sophistication.
Bodie, Graham D; Collins, William B; Jensen, Jakob D; Davis, Lashara A; Guntzviller, Lisa M; King, Andy J
2013-01-01
This article develops a conceptualization and measure of cognitive health sophistication--the complexity of an individual's conceptual knowledge about health. Study 1 provides initial validity evidence for the measure--the Healthy-Unhealthy Other Instrument--by showing its association with other cognitive health constructs indicative of higher health sophistication. Study 2 presents data from a sample of low-income adults to provide evidence that the measure does not depend heavily on health-related vocabulary or ethnicity. Results from both studies suggest that the Healthy-Unhealthy Other Instrument can be used to capture variability in the sophistication or complexity of an individual's health-related schematic structures on the basis of responses to two simple open-ended questions. Methodological advantages of the Healthy-Unhealthy Other Instrument and suggestions for future research are highlighted in the discussion.
Artificial Intelligence-Assisted Online Social Therapy for Youth Mental Health
D'Alfonso, Simon; Santesteban-Echarri, Olga; Rice, Simon; Wadley, Greg; Lederman, Reeva; Miles, Christopher; Gleeson, John; Alvarez-Jimenez, Mario
2017-01-01
Introduction: Benefits from mental health early interventions may not be sustained over time, and longer-term intervention programs may be required to maintain early clinical gains. However, due to the high intensity of face-to-face early intervention treatments, this may not be feasible. Adjunctive internet-based interventions specifically designed for youth may provide a cost-effective and engaging alternative to prevent loss of intervention benefits. However, until now online interventions have relied on human moderators to deliver therapeutic content. More sophisticated models responsive to user data are critical to inform tailored online therapy. Thus, integration of user experience with a sophisticated and cutting-edge technology to deliver content is necessary to redefine online interventions in youth mental health. This paper discusses the development of the moderated online social therapy (MOST) web application, which provides an interactive social media-based platform for recovery in mental health. We provide an overview of the system's main features and discuss our current work regarding the incorporation of advanced computational and artificial intelligence methods to enhance user engagement and improve the discovery and delivery of therapy content. Methods: Our case study is the ongoing Horyzons site (5-year randomized controlled trial for youth recovering from early psychosis), which is powered by MOST. We outline the motivation underlying the project and the web application's foundational features and interface. We discuss system innovations, including the incorporation of pertinent usage patterns as well as identifying certain limitations of the system. This leads to our current motivations and focus on using computational and artificial intelligence methods to enhance user engagement, and to further improve the system with novel mechanisms for the delivery of therapy content to users. In particular, we cover our usage of natural language analysis and chatbot technologies as strategies to tailor interventions and scale up the system. Conclusions: To date, the innovative MOST system has demonstrated viability in a series of clinical research trials. Given the data-driven opportunities afforded by the software system, observed usage patterns, and the aim to deploy it on a greater scale, an important next step in its evolution is the incorporation of advanced and automated content delivery mechanisms. PMID:28626431
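The abstract names natural language analysis as one strategy for tailoring therapy content but does not detail the implementation. The sketch below is purely illustrative and is not the MOST system's actual pipeline; the therapy "cards", example post, and similarity-based matching rule are invented placeholders showing one simple way a post could be matched to content.

```python
# Illustrative only -- not the MOST system's implementation. Matches a user's
# post to the most textually similar therapy "card" using TF-IDF vectors.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

therapy_cards = {
    "sleep_hygiene": "trouble sleeping at night, racing thoughts, late screen use",
    "social_anxiety": "feeling anxious around people, avoiding social events",
    "behavioral_activation": "low energy, no motivation, staying in bed all day",
}

def recommend(user_post: str) -> str:
    """Return the key of the therapy card most similar to the post."""
    docs = list(therapy_cards.values()) + [user_post]
    tfidf = TfidfVectorizer().fit_transform(docs)
    sims = cosine_similarity(tfidf[-1], tfidf[:-1]).ravel()
    return list(therapy_cards)[sims.argmax()]

print(recommend("I can't get out of bed and nothing feels worth doing"))
# expected: behavioral_activation
```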
2013-01-01
Background Molecular imaging using magnetic nanoparticles (MNPs)—magnetic particle imaging (MPI)—has attracted interest for the early diagnosis of cancer and cardiovascular disease. However, because a steep local magnetic field distribution is required to obtain a well-defined image, sophisticated hardware is needed. Therefore, it is desirable to realize excellent image quality even with low-performance hardware. In this study, the spatial resolution of MPI was evaluated using an image reconstruction method based on the time-domain correlation of the magnetization signal, applied to MNP samples made from biocompatible ferucarbotran with adjusted particle diameters. Methods The magnetization characteristics and particle diameters of four types of MNP samples made from ferucarbotran were evaluated. A numerical analysis was performed using our proposed method, which calculates the image intensity from the correlation between the magnetization signal generated by the MNPs and the system function; the resulting image quality was compared with that of the prototype in terms of image resolution and image artifacts. Results MNP samples obtained by adjusting ferucarbotran showed superior properties to conventional ferucarbotran samples, and numerical analysis showed that the same image quality could be obtained using a gradient magnetic field generator with 0.6 times the performance. However, because the proposed method theoretically introduces some image blurring, an additional algorithm will be required to improve performance. Conclusions MNP samples obtained by adjusting ferucarbotran showed magnetizing properties superior to conventional ferucarbotran samples, and by using such samples, comparable image quality (spatial resolution) could be obtained with a lower gradient magnetic field intensity. PMID:23734917
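As a rough illustration of the correlation-based reconstruction idea described above (a sketch under assumptions, not the authors' implementation), the image intensity at each pixel can be taken as the normalized correlation between the measured time-domain signal and a precomputed per-pixel system function. Array sizes and data below are placeholders.

```python
# Sketch: reconstruct MPI image intensity from the correlation between the
# measured magnetization signal s(t) and a per-pixel system function g_r(t).
import numpy as np

def reconstruct(signal, system_function):
    """signal: (T,) measured time signal; system_function: (P, T) per-pixel responses.
    Returns a (P,) image of non-negative normalized correlation coefficients."""
    s = signal - signal.mean()
    g = system_function - system_function.mean(axis=1, keepdims=True)
    corr = g @ s                                       # inner products over time
    norm = np.linalg.norm(g, axis=1) * np.linalg.norm(s) + 1e-12
    return np.clip(corr / norm, 0.0, None)

# Toy usage with random placeholders: 256 time samples, a 32x32 pixel grid
rng = np.random.default_rng(0)
sf = rng.standard_normal((32 * 32, 256))
measured = sf[100] + 0.1 * rng.standard_normal(256)    # signal from "pixel 100" plus noise
image = reconstruct(measured, sf).reshape(32, 32)
print(image.argmax())  # expected to be near pixel index 100
```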
NASA Astrophysics Data System (ADS)
Offret, J.-P.; Lebedinsky, J.; Navello, L.; Pina, V.; Serio, B.; Bailly, Y.; Hervé, P.
2015-05-01
Temperature data play an important role in the combustion chamber since they determine both the efficiency and the pollutant emission rate of engines. The air pollution problem concerns the emission of gases such as CO, CO2, NO, NO2 and SO2, as well as aerosols, soot and volatile organic compounds. Flame combustion occurs in hostile environments where temperature and concentration profiles are often not easy to measure. In this study, an optical method for measuring temperature and CO2 concentration profiles, suitable for combustion analysis, is presented and discussed. The proposed optical metrology offers numerous advantages over intrusive methods. The experimental setup combines a passive radiative emission measurement with an active laser measurement. The passive method is based on gas emission spectroscopy; the experimental spectrometer is coupled with an active method used to investigate and correct complex flame profiles. This active method, similar to a LIDAR (Light Detection And Ranging) device, is based on the measurement of Rayleigh scattering of a short laser pulse recorded using a high-speed streak camera. The whole experimental system of this new method is presented. Results obtained on a small-scale turbojet are shown and discussed in order to illustrate the potential delivered by this sophisticated method. Both temperature and concentration profiles of the gas jet are presented and discussed.
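For orientation only, the sketch below uses the standard constant-pressure Rayleigh thermometry relation (signal proportional to gas number density, hence T ≈ T_ref · S_ref / S when the effective scattering cross-section is treated as constant). This is a textbook relation used to illustrate the principle, not necessarily the exact processing applied in this work; all numbers are invented.

```python
# Hedged illustration of Rayleigh-scattering thermometry at constant pressure.
import numpy as np

def rayleigh_temperature(signal, signal_ref, T_ref=293.0):
    """Convert a Rayleigh scattering signal profile to temperature (K),
    assuming constant pressure and a constant scattering cross-section."""
    signal = np.asarray(signal, dtype=float)
    return T_ref * signal_ref / signal

# Example: a signal dropping to 20% of the reference implies roughly 1465 K
print(rayleigh_temperature([1.0, 0.5, 0.2], signal_ref=1.0))  # [ 293.  586. 1465.]
```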
ERIC Educational Resources Information Center
Bagley, Katherine G.
2012-01-01
Technological devices are ubiquitous in nearly every facet of society. There are substantial investments made in organizations on a daily basis to improve information technology. From a military perspective, the ultimate goal of these highly sophisticated devices is to assist soldiers in achieving mission success across dynamic and often chaotic…
Performance Analysis of the Mobile IP Protocol (RFC 3344 and Related RFCS)
2006-12-01
List of abbreviations excerpt: HMAC Keyed-Hash Message Authentication Code; ICMP Internet Control Message Protocol; IEEE Institute of Electrical and Electronics Engineers; IETF Internet Engineering Task Force; IOS Internetwork Operating System; IP Internet Protocol; ITU International Telecommunication Union; LAN Local Area Network. Most organizations today have sophisticated networks that are connected to the Internet. The major benefit reaped from such a...
Box compression analysis of world-wide data spanning 46 years
Thomas J. Urbanik; Benjamin Frank
2006-01-01
The state of the art among most industry citations of box compression estimation is the equation by McKee developed in 1963. Because of limitations in computing tools at the time the McKee equation was developed, the equation is a simplification, with many constraints, of a more general relationship. By applying the results of sophisticated finite element modeling, in...
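For context, the simplified McKee relationship referred to above is commonly quoted in the literature as BCT ≈ 5.87 × ECT × √(caliper × perimeter) in consistent inch-pound units. The sketch below is an illustration of that commonly cited form, not code or constants taken from the cited analysis.

```python
# Illustrative sketch of the simplified McKee box-compression estimate.
import math

def mckee_bct(ect_lb_per_in: float, caliper_in: float, perimeter_in: float) -> float:
    """Estimated box compression strength (lbf) from the edge crush test value,
    combined-board caliper, and box perimeter, all in inch-pound units."""
    return 5.87 * ect_lb_per_in * math.sqrt(caliper_in * perimeter_in)

# Example: ECT = 32 lb/in, caliper = 0.16 in, box perimeter = 80 in
print(round(mckee_bct(32, 0.16, 80)))  # roughly 672 lbf
```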
ERIC Educational Resources Information Center
Cottrell, William B.; And Others
The Nuclear Safety Information Center (NSIC) is a highly sophisticated scientific information center operated at Oak Ridge National Laboratory (ORNL) for the U.S. Atomic Energy Commission. Its information file, which consists of both data and bibliographic information, is computer stored and numerous programs have been developed to facilitate the…
ERIC Educational Resources Information Center
Enkelaar, Lotte; Smulders, Ellen; Lantman-de Valk, Henny van Schrojenstein; Weerdesteyn, Vivian; Geurts, Alexander C. H.
2013-01-01
Mobility limitations are common in persons with Intellectual Disabilities (ID). Differences in balance and gait capacities between persons with ID and controls have mainly been demonstrated by instrumented assessments (e.g. posturography and gait analysis), which require sophisticated and expensive equipment such as force plates or a 3D motion…
A Modular Simulation Framework for Assessing Swarm Search Models
2014-09-01
Wanier, Blake M.
Numerical studies demonstrate the ability to leverage the developed simulation and analysis framework to investigate three canonical swarm search models ... as benchmarks for future exploration of more sophisticated swarm search scenarios. Subject terms: swarm search, search theory, modeling framework.
The Case for District-Based Reform: Leading, Building, and Sustaining School Improvement
ERIC Educational Resources Information Center
Supovitz, Jonathan A.
2006-01-01
In 1999, the Duval County (Fla.) school system set out to improve every school in the district. Over the next five years, the district achieved stunning results that have drawn nationwide attention. Jonathan A. Supovitz uses the unfolding story of Duval County to develop a sophisticated and thoughtful analysis of the role of the school district in…
Single image super-resolution reconstruction algorithm based on edge selection
NASA Astrophysics Data System (ADS)
Zhang, Yaolan; Liu, Yijun
2017-05-01
Super-resolution (SR) has become increasingly important because it can generate high-quality high-resolution (HR) images from low-resolution (LR) input images. At present, much work concentrates on developing sophisticated image priors to improve image quality, while paying much less attention to estimating and incorporating the blur model, which can also affect the reconstruction results. We present a new reconstruction method based on edge selection. This method takes full account of the factors that affect blur kernel estimation and accurately estimates the blur process. Compared with state-of-the-art methods, our method achieves comparable performance.
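For readers unfamiliar with the blur model mentioned above, the standard single-image SR degradation model is a blur followed by decimation and additive noise. The sketch below simulates that generic model with an assumed Gaussian kernel; it is not the paper's edge-selection algorithm, and all parameters are placeholders.

```python
# Generic SR degradation model: LR = downsample(HR convolved with blur kernel) + noise.
import numpy as np
from scipy.ndimage import gaussian_filter

def degrade(hr_image, blur_sigma=1.5, scale=2, noise_std=0.01, rng=None):
    """Simulate a low-resolution observation from a high-resolution image."""
    rng = rng or np.random.default_rng(0)
    blurred = gaussian_filter(hr_image, sigma=blur_sigma)    # blur kernel k
    lr = blurred[::scale, ::scale]                           # decimation by the scale factor
    return lr + noise_std * rng.standard_normal(lr.shape)    # additive noise

lr = degrade(np.random.default_rng(1).random((64, 64)))
print(lr.shape)  # (32, 32)
```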
[Quality assessment in anesthesia].
Kupperwasser, B
1996-01-01
Quality assessment (assurance/improvement) is the set of methods used to measure and improve the delivered care and the department's performance against pre-established criteria or standards. The four stages of the self-maintained quality assessment cycle are: problem identification, problem analysis, problem correction and evaluation of corrective actions. Quality assessment is a measurable entity for which it is necessary to define and calibrate measurement parameters (indicators) from data gathered in the hospital anaesthesia environment. Problem identification comes from the accumulation of indicators. There are four types of quality indicators: structure, process, outcome and sentinel indicators. The latter signal a quality defect, are independent of outcomes, are easier to analyse by statistical methods and are closely related to processes and the main targets of quality improvement. The three types of methods used to analyse the problems (indicators) are: peer review, quantitative methods and risk management techniques. Peer review is performed by qualified anaesthesiologists. To improve its validity, the review process should be made explicit and its conclusions based on standards of practice and literature references. The quantitative methods are statistical analyses applied to the collected data and presented in a graphic format (histogram, Pareto diagram, control charts). The risk management techniques include: a) critical incident analysis, which establishes an objective relationship between a 'critical' event and the associated human behaviours; b) system accident analysis, which, based on the fact that accidents continue to occur despite safety systems and sophisticated technologies, examines all the process components leading to the unpredictable outcome and not just the human factors; c) cause-effect diagrams, which facilitate problem analysis by reducing its causes to four fundamental components (persons, regulations, equipment, process). Definition and implementation of corrective measures, based on the findings of the two previous stages, are the third step of the evaluation cycle. The Hawthorne effect is an improvement in outcomes that occurs before the implementation of any corrective actions. Verification of the implemented actions is the final and mandatory step closing the evaluation cycle.
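To make the "control charts" item concrete, here is a minimal, hypothetical p-chart for a monthly anaesthesia quality indicator; the counts are invented and the three-sigma limits follow the usual binomial approximation rather than any method prescribed in the article.

```python
# Hypothetical p-control chart for a monthly quality indicator.
import numpy as np

incidents = np.array([4, 6, 3, 8, 5, 2, 7, 14, 4, 5])      # e.g. unplanned ICU admissions
cases     = np.array([210, 190, 205, 220, 200, 215, 195, 205, 210, 200])

p = incidents / cases
p_bar = incidents.sum() / cases.sum()                        # centre line
sigma = np.sqrt(p_bar * (1 - p_bar) / cases)                 # per-month standard error
ucl = p_bar + 3 * sigma                                      # upper control limit (lower limit clipped at 0)

for month, (rate, hi) in enumerate(zip(p, ucl), start=1):
    flag = "OUT OF CONTROL" if rate > hi else ""
    print(f"month {month:2d}: rate={rate:.3f} UCL={hi:.3f} {flag}")
```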
SIGKit: Software for Introductory Geophysics Toolkit
NASA Astrophysics Data System (ADS)
Kruse, S.; Bank, C. G.; Esmaeili, S.; Jazayeri, S.; Liu, S.; Stoikopoulos, N.
2017-12-01
The Software for Introductory Geophysics Toolkit (SIGKit) affords students the opportunity to create model data and perform simple processing of field data for various geophysical methods. SIGkit provides a graphical user interface built with the MATLAB programming language, but can run even without a MATLAB installation. At this time SIGkit allows students to pick first arrivals and match a two-layer model to seismic refraction data; grid total-field magnetic data, extract a profile, and compare this to a synthetic profile; and perform simple processing steps (subtraction of a mean trace, hyperbola fit) to ground-penetrating radar data. We also have preliminary tools for gravity, resistivity, and EM data representation and analysis. SIGkit is being built by students for students, and the intent of the toolkit is to provide an intuitive interface for simple data analysis and understanding of the methods, and act as an entrance to more sophisticated software. The toolkit has been used in introductory courses as well as field courses. First reactions from students are positive. Think-aloud observations of students using the toolkit have helped identify problems and helped shape it. We are planning to compare the learning outcomes of students who have used the toolkit in a field course to students in a previous course to test its effectiveness.
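As an illustration of the two-layer refraction matching exercise mentioned above (not SIGkit code; the velocities and layer thickness are arbitrary examples), first-arrival times combine the direct wave x/v1 and the head wave x/v2 + 2h·sqrt(v2² − v1²)/(v1·v2):

```python
# Two-layer seismic refraction first-arrival model a student might fit to picks.
import numpy as np

def first_arrivals(x, v1, v2, h):
    """First-arrival times (s) at offsets x (m) for a layer of thickness h (m)
    and velocity v1 (m/s) over a half-space of velocity v2 (m/s), with v2 > v1."""
    x = np.asarray(x, dtype=float)
    t_direct = x / v1
    t_refract = x / v2 + 2 * h * np.sqrt(v2**2 - v1**2) / (v1 * v2)
    return np.minimum(t_direct, t_refract)

offsets = np.arange(0, 101, 10)
print(first_arrivals(offsets, v1=500.0, v2=2000.0, h=5.0).round(4))
```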
NASA Astrophysics Data System (ADS)
Labaria, George R.; Warrick, Abbie L.; Celliers, Peter M.; Kalantar, Daniel H.
2015-02-01
The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a 192-beam pulsed laser system for high energy density physics experiments. Sophisticated diagnostics have been designed around key performance metrics to achieve ignition. The Velocity Interferometer System for Any Reflector (VISAR) is the primary diagnostic for measuring the timing of shocks induced into an ignition capsule. The VISAR system utilizes three streak cameras; these streak cameras are inherently nonlinear and require warp corrections to remove these nonlinear effects. A detailed calibration procedure has been developed with National Security Technologies (NSTec) and applied to the camera correction analysis in production. However, the camera nonlinearities drift over time, affecting the performance of this method. An in-situ fiber array is used to inject a comb of pulses to generate a calibration correction in order to meet the timing accuracy requirements of VISAR. We develop a robust algorithm for the analysis of the comb calibration images to generate the warp correction that is then applied to the data images. Our algorithm utilizes the method of thin-plate splines (TPS) to model the complex nonlinear distortions in the streak camera data. In this paper, we focus on the theory and implementation of the TPS warp-correction algorithm for use in a production environment.
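A minimal sketch of a thin-plate-spline warp fit follows, assuming SciPy's RBFInterpolator with the 'thin_plate_spline' kernel; the fiducial coordinates are invented placeholders and this is not the NIF production algorithm described in the paper.

```python
# Sketch of a TPS warp correction: fit a mapping from where comb fiducials appear
# in the distorted streak image to where they should appear, then apply it per pixel.
import numpy as np
from scipy.interpolate import RBFInterpolator

# (x, y) positions of comb fiducials as measured in the distorted image ...
measured = np.array([[10.2, 5.1], [50.7, 4.8], [90.9, 5.4],
                     [10.5, 60.3], [49.8, 59.7], [89.6, 60.9]])
# ... and their ideal, undistorted positions
ideal = np.array([[10.0, 5.0], [50.0, 5.0], [90.0, 5.0],
                  [10.0, 60.0], [50.0, 60.0], [90.0, 60.0]])

# Thin-plate-spline mapping from distorted to ideal coordinates
warp = RBFInterpolator(measured, ideal, kernel="thin_plate_spline")

# Correct an arbitrary set of pixel coordinates from a data image
pixels = np.array([[30.0, 30.0], [70.0, 10.0]])
print(warp(pixels))   # estimated undistorted positions
```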