Haughton, Dominique; Phong, Nguyen
This case study covers several exploratory data analysis ideas, the histogram and boxplot, kernel density estimates, the recently introduced bagplot--a two-dimensional extension of the boxplot--as well as the violin plot, which combines a boxplot with a density shape plot. We apply these ideas and demonstrate how to interpret the output from these…
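The quantities behind these displays are simple to compute directly. A minimal numpy sketch on synthetic data (the data and bandwidth are invented for illustration; rendering the histogram, boxplot, bagplot, or violin plot would be left to a plotting library such as matplotlib or seaborn):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=10.0, scale=2.0, size=500)  # synthetic sample

# Five-number summary underlying the boxplot
q1, med, q3 = np.percentile(x, [25, 50, 75])
iqr = q3 - q1
lo_whisker = x[x >= q1 - 1.5 * iqr].min()  # lowest point inside the lower fence
hi_whisker = x[x <= q3 + 1.5 * iqr].max()  # highest point inside the upper fence

# Gaussian kernel density estimate: the smooth density shape that a
# violin plot combines with the boxplot
def kde(grid, data, bandwidth):
    # average of Gaussian bumps centered at each observation
    z = (grid[:, None] - data[None, :]) / bandwidth
    return np.exp(-0.5 * z**2).sum(axis=1) / (len(data) * bandwidth * np.sqrt(2 * np.pi))

grid = np.linspace(x.min(), x.max(), 200)
density = kde(grid, x, bandwidth=0.5)
```

For this unimodal sample the KDE peaks near the median, and the whiskers stay inside the 1.5-IQR fences, which is exactly what the corresponding plots would show.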
Al-Saggaf, Yeslam; Burmeister, Oliver K.
This exploratory study compares and contrasts two types of critical thinking techniques: one a philosophical and the other an applied ethical analysis technique. The two techniques are used to analyse an ethically challenging situation involving ICT, raised by a recent media article, to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. In particular, the skill development focuses on: recognising ethical challenges and formulating coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating the ethical decisions made, to name a few.
Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.; Piatek, Rémi
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates from a high dimensional set of psychological measurements. PMID:25431517
Factor analysis has been frequently exploited in applied research to provide evidence about the underlying factors in various measurement instruments. A close inspection of a large number of studies published in leading applied linguistic journals shows that there is a misconception among applied linguists as to the relative merits of exploratory…
Ferner, U; Matejcek, M; Neff, G
Besides hypothesis testing, which should be done as sparingly as possible, the measured or observed data should be described as extensively as possible. The traditional reliance on profiles of the mean responses may neglect useful information, and such profiles may also be misleading. With the aid of exploratory data analysis, different aspects of the structure of a data set can be considered. 'Data snooping' may reveal relationships, non-trivial structures, and peculiarities that lead to new hypotheses or to new mathematical-statistical models. In our opinion, it is necessary to consider exploratory and confirmatory data analyses in conjunction. This is illustrated by examples taken from pharmaco-EEG studies.
Jennrich, Robert I.; Bentler, Peter M.
Bi-factor analysis is a form of confirmatory factor analysis originally introduced by Holzinger. The bi-factor model has a general factor and a number of group factors. The purpose of this article is to introduce an exploratory form of bi-factor analysis. An advantage of using exploratory bi-factor analysis is that one need not provide a specific…
Gibson, David; de Freitas, Sara
This article summarizes the methods, observations, challenges and implications for exploratory analysis drawn from two learning analytics research projects. The cases include an analysis of a games-based virtual performance assessment and an analysis of data from 52,000 students over a 5-year period at a large Australian university. The complex…
Froes, Roberta Eliane Santos; Neto, Waldomiro Borges; Silva, Nilton Oliveira Couto e.; Naveira, Rita Lopes Pereira; Nascentes, Clésia Cristina; da Silva, José Bento Borba
A method for the direct determination (without sample pre-digestion) of microelements in fruit juice by inductively coupled plasma optical emission spectrometry has been developed. The method was optimized by a 2³ factorial design, which evaluated the plasma conditions (nebulization gas flow rate, applied power, and sample flow rate). A 1:1 diluted juice sample with 2% HNO₃ (Tetra Packed, peach flavor), spiked with 0.5 mg L⁻¹ of Al, Ba, Cd, Co, Cr, Cu, Fe, Mn, Ni, Pb, Sb, Sn, and Zn, was employed in the optimization. The results of the factorial design were evaluated by exploratory analysis (Hierarchical Cluster Analysis, HCA, and Principal Component Analysis, PCA) to determine the optimum analytical conditions for all elements. Both methods, Principal Component Analysis and Hierarchical Cluster Analysis, differentiated the central point condition (0.75 L min⁻¹, 1.3 kW, and 1.25 mL min⁻¹), which gave higher analytical signal values, suggesting that these are the optimal analytical conditions. F and Student's t tests were used to compare the slopes of the calibration curves for aqueous and matrix-matched standards. No significant differences were observed at the 95% confidence level. The correlation coefficient was higher than 0.99 for all the elements evaluated. The limits of quantification were: Al 253, Cu 3.6, Fe 84, Mn 0.4, Zn 71, Ni 67, Cd 69, Pb 129, Sn 206, Cr 79, Co 24, and Ba 2.1 µg L⁻¹. The spiking experiments with fruit juice samples resulted in recoveries between 80 and 120%, except for Co and Sn. Al, Cd, Pb, Sn and Cr could not be quantified in any of the samples investigated. The method was applied to the determination of several elements in fruit juice samples commercialized in Brazil.
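The combination of a 2³ factorial design with PCA can be sketched in a few lines of numpy. The factor names follow the abstract, but the coded design matrix, the responses, and the effect sizes below are invented for illustration, not the paper's data:

```python
import numpy as np
from itertools import product

# 2^3 full factorial design in coded units (-1/+1) for three plasma
# settings: nebulization gas flow, applied power, sample flow rate
design = np.array(list(product([-1, 1], repeat=3)), dtype=float)

# Hypothetical analytical signals: a base signal driven by the design
# (main effects 5, 3, 1 in coded units) plus noise, replicated as five
# correlated element columns
rng = np.random.default_rng(1)
base = 100 + design @ np.array([[5.0], [3.0], [1.0]]) + rng.normal(0, 0.5, (8, 1))
responses = np.hstack([base + rng.normal(0, 0.5, (8, 1)) for _ in range(5)])

# PCA via SVD on the centered response matrix: runs scoring high on PC1
# share high overall signal, mirroring how the paper uses PCA/HCA to
# pick the condition with the best signal across all elements
centered = responses - responses.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
scores = U * S                      # PC scores for the 8 runs
explained = S**2 / (S**2).sum()     # proportion of variance per PC
```

Because all element columns respond to the same plasma settings, PC1 dominates, and ranking runs by their PC1 score singles out the high-signal condition.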
Zhang, Guangjian; Preacher, Kristopher J.; Jennrich, Robert I.
The infinitesimal jackknife, a nonparametric method for estimating standard errors, has been used to obtain standard error estimates in covariance structure analysis. In this article, we adapt it for obtaining standard errors for rotated factor loadings and factor correlations in exploratory factor analysis with sample correlation matrices. Both…
Osborne, Jason W.
Exploratory factor analysis (EFA) is one of the most commonly reported quantitative methodologies in the social sciences, yet much of the detail regarding what happens during an EFA remains unclear. The goal of this brief technical note is to explore what "rotation" is, what exactly is rotating, and why we use rotation when performing…
de Winter, J. C. F.; Dodou, D.; Wieringa, P. A.
Exploratory factor analysis (EFA) is generally regarded as a technique for large sample sizes ("N"), with N = 50 as a reasonable absolute minimum. This study offers a comprehensive overview of the conditions in which EFA can yield good quality results for "N" below 50. Simulations were carried out to estimate the minimum required "N" for different…
Intro to exploratory data analysis. Overview of variable distributions, scatter plots, correlation analysis, GIS datasets. Use of conditional probability to examine stressor levels and impairment. Exploring correlations among multiple stressors.
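The conditional-probability idea, comparing the impairment rate given elevated stressor levels against the overall rate, can be sketched on hypothetical data (the stressor distribution, threshold, and impairment relationship below are all invented):

```python
import numpy as np

rng = np.random.default_rng(2)
stressor = rng.gamma(shape=2.0, scale=1.0, size=1000)  # hypothetical stressor levels
# Toy relationship: impairment becomes more likely as the stressor rises
impaired = rng.random(1000) < np.clip(0.1 + 0.2 * stressor, 0.0, 1.0)

threshold = 2.0
high = stressor > threshold

# P(impairment | stressor > threshold) vs. the unconditional rate
p_cond = impaired[high].mean()
p_marg = impaired.mean()
```

A conditional rate well above the marginal rate is the kind of evidence such an analysis uses to flag a stressor as associated with impairment.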
CEFA 3.02 (Browne, Cudeck, Tateneni, & Mels, 2008) is a factor analysis computer program designed to perform exploratory factor analysis. It provides the main properties that are needed for exploratory factor analysis, namely a variety of factoring methods employing eight different discrepancy functions to be minimized to yield initial…
Gorsuch, Richard L.
In exploratory common factor analysis, extension analysis refers to computing the relationship of the common factors to variables that were not included in the factor analysis. A new extension procedure is presented that gives correlations without using estimated factor scores. Advantages of the new method are illustrated. (SLD)
Richard C. Logan
The primary objective of this Fire Hazard Analysis (FHA) is to confirm the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: The occurrence of a fire or related event; A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment; Vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards; Property losses from a fire and related events exceeding limits established by DOE; and Critical process controls and safety class systems being damaged as a result of a fire and related events.
J. L. Kubicek
The primary objective of this Fire Hazard Analysis (FHA) is to confirm the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: (1) The occurrence of a fire or related event. (2) A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment. (3) Vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards. (4) Property losses from a fire and related events exceeding limits established by DOE. (5) Critical process controls and safety class systems being damaged as a result of a fire and related events.
Mead, Tim P.; Legg, David L.
Twenty-one variables believed to be important indicators of health related physical fitness were measured on male and female college students between 1991 and 1993 (n=433). Exploratory and confirmatory factor analytic techniques were used in an attempt to derive important components of physical fitness. The exploratory factor analysis identified…
Hess, Andrew R.
This dissertation involves progress toward phosphazene-based ion conducting materials with a focus on structure-property relationships to improve these materials. This dissertation also includes some more fundamental exploratory syntheses to probe the limits of phosphazene chemistry and discover structure-property relationships that may be useful in designing compounds to fulfill important technical requirements. Chapter 1 provides a brief introduction to polymers and polyphosphazenes as well as ion-conducting materials and the contribution of polyphosphazene chemistry to that field. Chapter 1 also provides a brief introduction to some analytical techniques. Chapter 2 begins with the use of organophosphates as stand-alone non-volatile and fire-retardant liquid electrolyte media for dye sensitized solar cells (DSSCs) as well as their use as plasticizer in polymer gel electrolytes intended for application in lithium batteries. These organophosphates are the smallest phosphorus containing model molecules investigated in this dissertation. A homologous series of oligoalkyleneoxy substituted phosphates was synthesized and the effect of the substituent chain length on viscosity and conductivity was investigated. Small, test-scale DSSCs were constructed and showed promising results with overall cell efficiencies of up to 3.6% under un-optimized conditions. Conductivity measurements were performed on polymer gel-electrolytes based on poly[bis(2-(2-methoxyethoxy)ethoxy)phosphazene] (MEEP) plasticized with the phosphate with the best combination of properties, using a system loaded with lithium trifluoromethanesulfonate as the charge carrier. In chapter 3 the effect of the cation of the charge carrier species on the anionic conductivity of DSSC type electrolytes is evaluated using hexakis(2-(2-methoxyethoxy)ethoxy)cyclotriphosphazene (MEE-trimer) as a small molecule model for MEEP. The iodides of lithium, sodium, and ammonium as well as the ionic liquid, 1-propyl-3
Gorsuch, R L
The special characteristics of items (low reliability, confounds by minor, unwanted covariance, and the likelihood of a general factor) and a better understanding of factor analysis mean that the default procedure of many statistical packages (Little Jiffy) is no longer adequate for exploratory item factor analysis. It produces too many factors and precludes a general factor, even when that means the factors extracted are nonreplicable. More appropriate procedures that reduce these problems are presented, along with how to select the sample, the sample size required, and how to select items for scales. Proposed scales can be evaluated by their correlations with the factors; a new procedure for doing so eliminates the biased values produced by correlating them with either total or factor scores. The role of exploratory factor analysis relative to cluster analysis and confirmatory factor analysis is noted.
Pistilli, Matthew D.; Taub, Deborah J.; Bennett, Deborah E.
Created and tested the Senior Concerns Survey. An exploratory factor analysis revealed four areas of concern for college seniors: career related concerns, change and loss related concerns, graduate/professional school related concerns, and support related concerns. (EV)
Sinacore, James M.; And Others
It is argued that there is a benefit to applying techniques of exploratory data analysis (EDA) to program evaluation. The evaluation of a rehabilitation program for people with rheumatoid arthritis (20 subjects and 21 comparisons) through EDA supports the argument, indicating outcomes more precisely than conventional analysis of variance. (SLD)
Mulaik, Stanley A.
Exploratory factor analysis derives its key ideas from many sources, including Aristotle, Francis Bacon, Descartes, Pearson and Yule, and Kant. The conclusions of exploratory factor analysis are never complete without subsequent confirmatory factor analysis. (Author/GDC)
Jennrich, Robert I.; Bentler, Peter M.
Bi-factor analysis is a form of confirmatory factor analysis originally introduced by Holzinger and Swineford ("Psychometrika" 2:41-54, 1937). The bi-factor model has a general factor, a number of group factors, and an explicit bi-factor structure. Jennrich and Bentler ("Psychometrika" 76:537-549, 2011) introduced an exploratory form of bi-factor…
Teaching strategies that work for typically developing children often do not work for those diagnosed with an autism spectrum disorder. However, teaching strategies that work for children with autism do work for typically developing children. In this article, the author explains how the principles and concepts of Applied Behavior Analysis can be…
Techniques of exploratory data analysis (EDA) were used to decompose data tables portraying performance of ethnic groups on the Scholastic Aptitude Test. These analyses indicate the size and structure of differences in performance among groups studied, nature of changes across time, and interactions between group membership and time. (Author/DWH)
Kawahara, Nancy E.; Ethington, Corinna
Median polishing, an exploratory data statistical analysis technique, was used to study achievement patterns for men and women on the Pharmacy College Admission Test over a six-year period. In general, a declining trend in scores was found, and males performed better than females, with the largest differences found in chemistry and biology.…
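Median polishing itself is compact enough to sketch. The implementation below follows Tukey's alternating row/column median sweeps; the two-way table is invented for illustration, not PCAT data:

```python
import numpy as np

def median_polish(table, n_iter=10):
    """Tukey's median polish: decompose a two-way table into
    overall + row effects + column effects + residuals."""
    resid = np.asarray(table, dtype=float).copy()
    overall = 0.0
    row = np.zeros(resid.shape[0])
    col = np.zeros(resid.shape[1])
    for _ in range(n_iter):
        # sweep row medians out of the residuals into the row effects
        rmed = np.median(resid, axis=1)
        row += rmed
        resid -= rmed[:, None]
        # move the median of the row effects into the overall term
        m = np.median(row); overall += m; row -= m
        # sweep column medians into the column effects
        cmed = np.median(resid, axis=0)
        col += cmed
        resid -= cmed[None, :]
        m = np.median(col); overall += m; col -= m
    return overall, row, col, resid

# Toy table: e.g. rows = years, columns = test sections
table = np.array([[ 1.0,  2.0,  3.0],
                  [ 4.0,  6.0,  8.0],
                  [10.0, 12.0, 15.0]])
overall, row, col, resid = median_polish(table)
```

Every sweep conserves the decomposition, so overall + row + col + resid reconstructs the table exactly; the residuals are what an analyst would inspect for interactions or unusual cells.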
Craig, Carlton D.; Sprang, Ginny
Objective: The present study provides psychometric data for the Trauma Practices Questionnaire (TPQ). Method: A nationally randomized sample of 2,400 surveys was sent to self-identified trauma treatment specialists, and 711 (29.6%) were returned. Results: An exploratory factor analysis (N = 319) conducted on a randomly split sample (RSS) revealed…
Librarians and archivists can gain insight into the disciplinary culture of historians, and history doctoral students in particular, by examining the acknowledgment sections of these students' doctoral dissertations. This paper is an exploratory analysis of the 219 history dissertations written at the University of Oklahoma between 1930 and 2005.…
Kamakura, Wagner A.; Wedel, Michel
Proposes a class of multivariate Tobit models with a factor structure on the covariance matrix. Such models are useful in the exploratory analysis of multivariate censored data and the identification of latent variables from behavioral data. The factor structure provides a parsimonious representation of the censored data. Models are estimated with…
Oort, Frans J.
In exploratory or unrestricted factor analysis, all factor loadings are free to be estimated. In oblique solutions, the correlations between common factors are free to be estimated as well. The purpose of this article is to show how likelihood-based confidence intervals can be obtained for rotated factor loadings and factor correlations, by…
Sass, Daniel A.
Exploratory factor analysis (EFA) is commonly employed to evaluate the factor structure of measures with dichotomously scored items. Generally, only the estimated factor loadings are provided with no reference to significance tests, confidence intervals, and/or estimated factor loading standard errors. This simulation study assessed factor loading…
Describes a method for choosing rejection probabilities for the tests of independence that are used in constraint-based algorithms of exploratory path analysis. The method consists of generating a Markov or semi-Markov model from the equivalence class represented by a partial ancestral graph and then testing the d-separation implications. (SLD)
Hogg, Nanette; Lomicky, Carol S.
This study explores 465 postsecondary students' experiences in online classes through the lens of connectivism. Downes' 4 properties of connectivism (diversity, autonomy, interactivity, and openness) were used as the study design. An exploratory factor analysis was performed. This study found a 4-factor solution. Subjects indicated that autonomy…
Microblogging is a relatively new phenomenon in online social networking that has become increasingly prevalent in the last few years. This study explores the use of Twitter in public and academic libraries to understand microblogging patterns. Analysis of the tweets was conducted in two phases: (1) statistical descriptive analysis and (2) content…
Coupet, Jason; Barnum, Darold
Discussions of efficiency among Historically Black Colleges and Universities (HBCUs) are often missing in academic conversations. This article seeks to assess efficiency of individual HBCUs using Data Envelopment Analysis (DEA), a non-parametric technique that can synthesize multiple inputs and outputs to determine a single efficiency score for…
Strohacker, Kelley; Zakrajsek, Rebecca A
Assessment of "exercise readiness" is a central component to the flexible non-linear periodization (FNLP) method of organizing training workloads, but the underlying factor structure of this construct has not been empirically determined. The purpose of this study was to assess construct dimensionality of exercise readiness using exploratory factor analysis. The result of which serve as initial steps of developing a brief measure of exercise readiness. Participants consisted of students recruited from undergraduate Kinesiology courses at a racially diverse, southern University. Independent, anonymous online survey data were collected across three stages: 1) generation of item pool (n = 290), 2) assessment of face validity and refinement of item pool (n = 168), and 3) exploratory factor analysis (n = 684). A principal axis factor analysis was conducted with 41 items using oblique rotation (promax). Four statistically significant factors, as determined through parallel analysis, explained 61.5% of the variance in exercise readiness. Factor 1 contained items that represented vitality (e.g., lively, revived). Factor 2 items related to physical fatigue (e.g. tired, drained). Factors 3 and 4 were descriptive of, discomfort (e.g. pain, sick) and health (i.e. healthy, fit), respectively. This inductive approach indicates that exercise readiness is comprised of four dimensions: vitality, physical fatigue, discomfort, and health. This finding supports readiness assessment techniques currently recommended for practitioners according to the FNLP model. These results serve as a theoretical foundation upon which to further develop and refine a brief survey instrument to measure exercise readiness. Key pointsAssessment of exercise readiness is a key component in implementing an exercise program based on flexible nonlinear periodization, but the dimensionality of this concept has not been empirically determined.Based on a series of surveys and a robust exploratory factor analysis
Davis, Paul K.
Working paper: "Causal Models and Exploratory Analysis in Heterogeneous Information Fusion for Detecting Potential Terrorists," produced within RAND's National Security Research Division (NSRD); NSRD conducts research and analysis on defense and national security topics for the U.S. and allied defense… The surviving fragment of the abstract lists design principles, including "(5) Design for routine exploratory analysis under deep uncertainty," and notes that results depend on human imagination and judgment.
Steed, Chad A.; Ricciuto, Daniel M.; Shipman, Galen M.; Smith, Brian E.; Thornton, Peter E.; Wang, Dali; Shi, Xiaoying; Williams, Dean N.
Rapid increases in high performance computing are feeding the development of larger and more complex data sets in climate research, which sets the stage for so-called big data analysis challenges. However, conventional climate analysis techniques are inadequate in dealing with the complexities of today's data. In this paper, we describe and demonstrate a visual analytics system, called the Exploratory Data analysis ENvironment (EDEN), with specific application to the analysis of complex earth system simulation data sets. EDEN represents the type of interactive visual analysis tools that are necessary to transform data into insight, thereby improving critical comprehension of earth system processes. In addition to providing an overview of EDEN, we describe real-world studies using both point ensembles and global Community Land Model Version 4 (CLM4) simulations.
McCracken, James M.
Many scientific disciplines rely on observational data of systems for which it is difficult (or impossible) to implement controlled experiments, and data analysis techniques are required for identifying causal information and relationships directly from observational data. This need has led to the development of many different time series causality approaches and tools, including transfer entropy, convergent cross-mapping (CCM), and Granger causality statistics. In this thesis, the existing time series causality method of CCM is extended by introducing a new method called pairwise asymmetric inference (PAI). It is found that CCM may provide counter-intuitive causal inferences for simple dynamics with strong intuitive notions of causality, and the CCM causal inference can be a function of physical parameters that are seemingly unrelated to the existence of a driving relationship in the system. For example, a CCM causal inference might alternate between "voltage drives current" and "current drives voltage" as the frequency of the voltage signal is changed in a series circuit with a single resistor and inductor. PAI is introduced to address both of these limitations. Many of the current approaches in the time series causality literature are not computationally straightforward to apply, do not follow directly from assumptions of probabilistic causality, depend on assumed models for the time series generating process, or rely on embedding procedures. A new approach, called causal leaning, is introduced in this work to avoid these issues. The leaning is found to provide causal inferences that agree with intuition for both simple systems and more complicated empirical examples, including space weather data sets. The leaning may provide a clearer interpretation of the results than those from existing time series causality tools. A practicing analyst can explore the literature to find many proposals for identifying drivers and causal connections in time series data
Osborne, Jason W.; Fitzpatrick, David C.
Exploratory Factor Analysis (EFA) is a powerful and commonly-used tool for investigating the underlying variable structure of a psychometric instrument. However, there is much controversy in the social sciences with regard to the techniques used in EFA (Ford, MacCallum, & Tait, 1986; Henson & Roberts, 2006) and the reliability of the outcome.…
Researchers in the field of surface analysis attended the Workshop; the original record lists the participants' names and affiliations, including Case Western Reserve University (Cleveland, OH 44106) and the University of Dayton (300 College Park, Dayton, OH 45469).
Çokluk, Ömay; Koçak, Duygu
In this study, the number of factors obtained from parallel analysis, a method used for determining the number of factors in exploratory factor analysis, was compared to that of the factors obtained from eigenvalue and scree plot--two traditional methods for determining the number of factors--in terms of consistency. Parallel analysis is based on…
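Parallel analysis can be sketched directly: compare the observed correlation-matrix eigenvalues against eigenvalues of random normal data of the same dimensions, and retain factors that beat the random benchmark. The data below are simulated with a single common factor (an invented example), so the expected answer is one factor:

```python
import numpy as np

def parallel_analysis(data, n_sims=200, seed=0):
    """Horn's parallel analysis: keep factors whose observed correlation
    eigenvalues exceed the 95th percentile of eigenvalues from random
    normal data of the same shape."""
    n, p = data.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    rng = np.random.default_rng(seed)
    rand = np.empty((n_sims, p))
    for s in range(n_sims):
        r = rng.standard_normal((n, p))
        rand[s] = np.sort(np.linalg.eigvalsh(np.corrcoef(r, rowvar=False)))[::-1]
    thresh = np.percentile(rand, 95, axis=0)   # random benchmark per eigenvalue
    return int(np.sum(obs > thresh)), obs, thresh

# Toy data: one common factor driving six indicators
rng = np.random.default_rng(3)
f = rng.standard_normal((300, 1))
data = 0.8 * f + 0.6 * rng.standard_normal((300, 6))
k, obs, thresh = parallel_analysis(data)   # k should come out as 1
```

Unlike the eigenvalue-greater-than-one rule, the benchmark here adapts to sample size and the number of variables, which is the consistency advantage the study examines.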
Criminal justice education is a relatively new program in higher education in many countries, and its curriculum and parameters remain unsettled. An exploratory study investigated whether threshold concepts theory provided a useful lens by which to explore student understandings of this multidisciplinary field. Eight high-performing final-year…
Park, Hee Sun; Dailey, Rene; Lemus, Daisy
Discusses the distinct purposes of principal components analysis (PCA) and exploratory factor analysis (EFA), using two data sets as examples. Reviews the use of each technique in three major communication journals: "Communication Monographs,""Human Communication Research," and "Communication Research." Finds that the…
Chebana, Fateh; Dabo-Niang, Sophie; Ouarda, Taha B. M. J.
The prevention of flood risks and the effective planning and management of water resources require river flows to be continuously measured and analyzed at a number of stations. For a given station, a hydrograph can be obtained as a graphical representation of the temporal variation of flow over a period of time. The information provided by the hydrograph is essential to determine the severity of extreme events and their frequencies. A flood hydrograph is commonly characterized by its peak, volume, and duration. Traditional hydrological frequency analysis (FA) approaches focused separately on each of these features in a univariate context. Recent multivariate approaches considered these features jointly in order to take into account their dependence structure. However, all these approaches are based on the analysis of a number of characteristics and do not make use of the full information content of the hydrograph. The objective of the present work is to propose a new framework for FA using the hydrographs as curves: functional data. In this context, the whole hydrograph is considered as one infinite-dimensional observation. This context allows us to provide more effective and efficient estimates of the risk associated with extreme events. The proposed approach contributes to addressing the problem of lack of data commonly encountered in hydrology by fully employing all the information contained in the hydrographs. A number of functional data analysis tools are introduced and adapted to flood FA with a focus on exploratory analysis as a first stage toward a complete functional flood FA. These methods, including data visualization, location and scale measures, principal component analysis, and outlier detection, are illustrated in a real-world flood analysis case study from the province of Quebec, Canada.
Background: Associations between ozone (O3) and fine particulate matter (PM2.5) concentrations and birth outcomes have been previously demonstrated. We perform an exploratory analysis of O3 and PM2.5 concentrations during early pregnancy and multiple types of birth defects. Methods: Data on births were obtained from the Texas Birth Defects Registry and the National Birth Defects Prevention Study (NBDPS) in Texas. Air pollution concentrations were determined using a Bayesian hierarchical model that combined modeled air pollution concentrations with air monitoring data to create bias-corrected concentrations, which were matched to residential address at birth. Average air pollution concentrations during the first trimester were calculated. Results: The analysis generated hypotheses for future, confirmatory studies, although many of the observed associations between the air pollutants and birth defects were null. The hypotheses are provided by an observed association between O3 and craniosynostosis [adjusted OR 1.28 (95% CI 1.04, 1.58) per 13.3 ppb increase] and observed inverse associations between PM2.5 concentrations and septal heart defects and obstructive heart defects [adjusted ORs 0.79 (95% CI 0.75, 0.82) and 0.88 (95% CI 0.79, 0.97) per 5.0 µg/m3 increase, respectively] in the Texas Birth Defects Registry study. Septal heart defects and ventricular outflow tract obstructions were also examined using the NBDPS but the associations with PM2.5 were null [adj
John, M T; Reissmann, D R; Feuerstahler, L; Waller, N; Baba, K; Larsson, P; Celebić, A; Szabo, G; Rener-Sitar, K
Although oral health-related quality of life (OHRQoL) as measured by the Oral Health Impact Profile (OHIP) is thought to be multidimensional, the nature of these dimensions is not known. The aim of this report was to explore the dimensionality of the OHIP using the Dimensions of OHRQoL (DOQ) Project, an international study of general population subjects and prosthodontic patients. Using the project's Learning Sample (n = 5173), we conducted an exploratory factor analysis on the 46 OHIP items not specifically referring to dentures for 5146 subjects with sufficiently complete data. The first eigenvalue (27·0) of the polychoric correlation matrix was more than ten times larger than the second eigenvalue (2·6), suggesting the presence of a dominant, higher-order general factor. Follow-up analyses with Horn's parallel analysis revealed a viable second-order, four-factor solution. An oblique rotation of this solution revealed four highly correlated factors that we named Oral Function, Oro-facial Pain, Oro-facial Appearance and Psychosocial Impact. These four dimensions and the strong general factor are two viable hypotheses for the factor structure of the OHIP.
Lev, Elise L; Eller, Lucille Sanzero; Kolassa, John; Gejerman, Glen; Colella, Joan; Lane, Patricia; Scrofine, Suzanne; Esposito, Michael; Lanteri, Vincent; Scheuch, John; Munver, Ravi; Galli, Bernadette; Watson, Richard A; Sawczuk, Ihor
Strategies used by patients to promote health (SUPPH) was used to measure self-care self-efficacy in patients with cancer. The objectives of this study were (1) to determine the extent to which self-efficacy theory explained the factor structure of the SUPPH and (2) to determine the relationship of demographic data with factors of the SUPPH. Subjects were diagnosed with prostate cancer (PCa) and treated with either: (a) radical prostatectomy, (b) intensity modulated radiation therapy (IMRT) + radioactive seed implantation, or (c) IMRT + high dose rate. Subjects completed a demographic questionnaire and the SUPPH. Exploratory factor analysis of the SUPPH was performed using a varimax rotation. Subjects (n = 265) were predominately white and averaged 68 years of age. The model explained 81.3% of the total sum of eigenvalues. Two factors of the SUPPH were identified: physiological efficacy information and performance efficacy information. Younger subjects who were fully employed and earning more money had significantly higher performance self-efficacy than older subjects who were working part time and earning less money. Results are congruent with Bandura's (1997) description of self-efficacy. Use of the SUPPH may facilitate research validating Bandura's (1997) assertion that an individual's self-efficacy is related to quality of life (QOL) during chronic illness. Additional research focusing on self-efficacy and PCa patients' QOL may lead to efficacy enhancing interventions that will improve QOL of patients with PCa.
Burn, Christopher R.; Fox, Michael F.
Exploratory data analysis (EDA) gives students a feel for the data being considered. Four applications of EDA are discussed: the use of displays, resistant statistics, transformations, and smoothing. (RM)
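Three of the four EDA applications mentioned (resistant statistics, transformations, and smoothing) can be sketched on synthetic skewed data; the dataset below is made up, not a classroom example from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# Right-skewed data with one gross outlier, a common motivation for EDA.
data = np.append(rng.lognormal(mean=0, sigma=1, size=99), 1000.0)

# Resistant statistics: the median and IQR barely move under the outlier,
# unlike the mean and standard deviation.
median, q1, q3 = np.percentile(data, [50, 25, 75])
iqr = q3 - q1

# Transformation: logs symmetrise right-skewed data.
logged = np.log(data)

# Smoothing: a running median damps isolated spikes in a sequence.
def running_median(x, k=3):
    half = k // 2
    return np.array([np.median(x[max(0, i - half):i + half + 1])
                     for i in range(len(x))])

smooth = running_median(data)
```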
McSweeney, Frances K.; Donahoe, Patricia; Swindell, Samantha
The status of women in applied behavior analysis was examined by comparing the participation of women in the Journal of Applied Behavior Analysis (JABA) to their participation in three similar journals. For all journals, the percentage of articles with at least one female author, the percentage of authors who are female, and the percentage of articles with a female first author increased from 1978 to 1997. Participation by women in JABA was equal to or greater than participation by women in the comparison journals. However, women appeared as authors on papers in special sections of Behavior Modification substantially more often when the editor was female than when the editor was male. In addition, female membership on the editorial boards of JABA, Behavior Modification, and Behaviour Research and Therapy failed to increase from 1978 to 1997. We conclude that a “glass ceiling” reduces the participation of women at the highest levels of applied behavior analysis and related fields. PMID:22478351
Eggerth, Donald E; Flynn, Michael A
Blustein mapped career decision making onto Maslow's model of motivation and personality and concluded that most models of career development assume opportunities and decision-making latitude that do not exist for many individuals from low income or otherwise disadvantaged backgrounds. Consequently, Blustein argued that these models may be of limited utility for such individuals. Blustein challenged researchers to reevaluate current career development approaches, particularly those assuming a static world of work, from a perspective allowing for changing circumstances and recognizing that career choice can be limited by access to opportunities, personal obligations, and social barriers. This article represents an exploratory effort to determine if the theory of work adjustment (TWA) might meaningfully be used to describe the work experiences of Latino immigrant workers, a group living with severe constraints and having very limited employment opportunities. It is argued that there is significant conceptual convergence between Maslow's hierarchy of needs and the work reinforcers of TWA. The results of an exploratory, qualitative study with a sample of 10 Latino immigrants are also presented. These immigrants participated in key informant interviews concerning their work experiences both in the United States and in their home countries. The findings support Blustein's contention that such workers will be most focused on basic survival needs and suggest that TWA reinforcers are descriptive of important aspects of how Latino immigrant workers conceptualize their jobs. PMID:26345693
Lenzen, Benoit; Theunissen, Catherine; Cloes, Marc
This exploratory study aimed to investigate elements involved in decision making in team handball live situations and to provide coaches and educators with teaching recommendations. The study was positioned within the framework of the situated-action paradigm of which two aspects were of particular interest for this project: (a) the relationship…
Dyckman, Zachary; Harper, Michael; McMenamin, Peter
This article provides comments of three conference panel members on the analyses of the productivity adjustment used in the Medicare Economic Index (MEI), and on exploratory estimates of physician-specific productivity measures. Each has a different background and perspective. PMID:18435222
Lyons, Paul R.; DeCarlo, James F.
An exploratory study examined the job and life satisfaction of a sample of 32 female entrepreneurs residing in the tri-state area of Maryland, Pennsylvania, and West Virginia. To compare the entrepreneurs' concepts of life and job satisfaction to those of women in more traditional occupations, researchers also studied a sample of 32 female nursing…
Craig, Paul; Roa-Seïler, Néna
This paper describes a novel information visualization technique that combines multidimensional scaling and hierarchical clustering to support the exploratory analysis of multidimensional data. The technique displays the results of multidimensional scaling using a scatter plot where the proximity of any two items' representations approximates their similarity according to a Euclidean distance metric. The results of hierarchical clustering are overlaid onto this view by drawing smoothed outlines around each nested cluster. The difference in similarity between successive cluster combinations is used to colour code clusters and make stronger natural clusters more prominent in the display. When a cluster or group of items is selected, multidimensional scaling and hierarchical clustering are re-applied to a filtered subset of the data, and animation is used to smooth the transition between successive filtered views. As a case study we demonstrate the technique being used to analyse survey data relating to the appropriateness of different phrases to different emotionally charged situations.
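The pairing described (a 2-D scaling of pairwise distances with cluster structure overlaid) can be sketched as follows. This is a generic classical-MDS-plus-average-linkage illustration on made-up points, not the authors' implementation.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(1)
# Two well-separated synthetic groups of items in 5-D feature space.
data = np.vstack([rng.normal(0.0, 0.3, (10, 5)),
                  rng.normal(3.0, 0.3, (10, 5))])

# Classical MDS: double-centre the squared distance matrix and take the
# top two eigenvectors as 2-D scatter-plot coordinates.
d2 = squareform(pdist(data)) ** 2
n = len(d2)
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ d2 @ J
w, v = np.linalg.eigh(B)
coords = v[:, -2:] * np.sqrt(np.maximum(w[-2:], 0.0))

# Hierarchical clustering supplies the nested groups to outline.
Z = linkage(data, method="average")
labels = fcluster(Z, t=2, criterion="maxclust")
```

In the paper's technique the `labels` would drive the smoothed cluster outlines drawn around points positioned at `coords`.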
Karlin, S; Williams, P T
A collection of functions that contrast familial trait values between and across generations is proposed for studying transmission effects and other collateral influences in nuclear families. Two classes of structured exploratory data analysis (SEDA) statistics are derived from ratios of these functions. SEDA-functionals are the empirical cumulative distributions of the ratio of the two contrasts computed within each family. SEDA-indices are formed by first averaging the numerator and denominator contrasts separately over the population and then forming their ratio. The significance of SEDA results is determined by a spectrum of permutation techniques that selectively shuffle the trait values across families. The process systematically alters certain family structure relationships while keeping other familial relationships intact. The methodology is applied to five data examples of plasma total cholesterol concentrations, reported height values, dermatoglyphic pattern intensity index scores, measurements of dopamine-beta-hydroxylase activity, and psychometric cognitive test results. PMID:6475959
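The permutation logic behind the SEDA significance assessment can be sketched with a toy parent-offspring example. The ratio statistic below is an illustrative stand-in, not the exact SEDA functional or index from the paper, and the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical nuclear-family data: one parent value and one offspring
# value per family, with a transmission effect built in.
n_fam = 200
parent = rng.normal(size=n_fam)
child = 0.5 * parent + rng.normal(scale=0.8, size=n_fam)

def seda_like_index(parent, child):
    # Average cross-generation contrast over within-generation spread.
    return np.mean(np.abs(parent - child)) / np.std(child)

observed = seda_like_index(parent, child)

# Permutation null: shuffle offspring values across families, breaking
# the parent-offspring pairing while keeping marginals intact.
perm = np.array([seda_like_index(parent, rng.permutation(child))
                 for _ in range(999)])
p_value = (1 + np.sum(perm <= observed)) / (1 + len(perm))
```

A real transmission effect makes parent-child contrasts smaller than they are after shuffling, so the observed index falls in the lower tail of the permutation distribution.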
Leveson, Nancy; Daouk, Mirna; Dulac, Nicolas; Marais, Karen
Accident models play a critical role in accident investigation and analysis. Most traditional models are based on an underlying chain of events. These models, however, have serious limitations when used for complex, socio-technical systems. Previously, Leveson proposed a new accident model (STAMP) based on system theory. In STAMP, the basic concept is not an event but a constraint. This paper shows how STAMP can be applied to accident analysis using three different views or models of the accident process and proposes a notation for describing this process.
This dissertation presents a new tool for exploratory text analysis that attempts to improve the experience of navigating and exploring text and its metadata. The design of the tool was motivated by the unmet need for text analysis tools in the humanities and social sciences. In these fields, it is common for scholars to have hundreds or thousands…
Reviews the role of exploratory data analysis (EDA) for spatial data mining and presents a case study addressing environmental risk assessments in New York State to illustrate the feasibility and usability of augmenting seriation for spatial data analysis. Describes augmentation with multimedia tools to understand relationships among spatial,…
Ramaswami, Soundaram; Babo, Gerard
This paper presents and discusses the statistical analysis of the responses from an online survey administered to a sample of US superintendents (n=225) in an attempt to explore and authenticate the construct validity of the ISLLC 2008 Standards through exploratory factor analysis. Using a Principal Axis Factor method, 6 factors were extracted…
Kim, Hyunho; Ku, Boncho; Kim, Jong Yeol; Park, Young-Jae
Background. The phlegm pattern questionnaire (PPQ) was developed to evaluate and diagnose phlegm pattern in Korean Medicine and Traditional Chinese Medicine, but it was based on a dataset from patients who visited the hospital to consult with a clinician regarding their health, without any strict exclusion or inclusion criteria. In this study, we reinvestigated the construct validity of the PPQ with a new dataset and confirmed the feasibility of applying it to a healthy population. Methods. A total of 286 healthy subjects were included and their responses to the PPQ were acquired. Confirmatory factor analysis (CFA) was conducted and the model fit was discussed. We extracted a new factor structure by exploratory factor analysis (EFA) and compared the two factor structures. Results. In the CFA results, the model fit indices are acceptable (RMSEA = 0.074) or slightly less than the good-fit values (CFI = 0.839, TLI = 0.860). Many average variances extracted were smaller than the correlation coefficients of the factors, which shows somewhat insufficient discriminant validity. Conclusions. Through the results from CFA and EFA, this study shows clinically acceptable model fits and suggests the feasibility of applying the PPQ to a healthy population with relatively good construct validity and internal consistency. PMID:27051447
Lucht, Martina; Heidig, Steffi
This article describes HOPSCOTCH, a design concept for an "exer-learning game" to engage elementary school children in learning. Exer-learning is a new genre of digital learning games that combines playing and learning with physical activity (exercise). HOPSCOTCH is a first design concept for exer-learning games that can be applied to…
Dombrowski, Stefan C; McGill, Ryan J; Canivez, Gary L
Exploratory and confirmatory factor analytic studies were not reported in the Technical Manual for the Woodcock-Johnson, 4th ed. Cognitive (WJ IV Cognitive; Schrank, McGrew, & Mather, 2014b). Instead, the internal structure of the WJ IV Cognitive was extrapolated from analyses based on the full WJ IV test battery (Schrank, McGrew, & Mather, 2014b). Even if the veracity of extrapolating from the WJ IV full battery were accepted, there were shortcomings in the choices of analyses used and only limited information regarding those analyses was presented in the WJ IV Technical Manual (McGrew, LaForte, & Schrank, 2014). The present study examined the structure of the WJ IV Cognitive using exploratory factor analysis procedures (principal axis factoring with oblique [promax] rotation followed by application of the Schmid-Leiman, 1957, procedure) applied to standardization sample correlation matrices for 2 school age groups (ages 9-13; 14-19). Four factors emerged for both the 9-13 and 14-19 age groups in contrast to the publisher's proposed 7 factors. Results of these analyses indicated a robust manifestation of general intelligence (g) that exceeded the variance attributed to the lower-order factors. Model-based reliability estimates supported interpretation of the higher-order factor (i.e., g). Additional analyses were conducted by forcing extraction of the 7 theoretically posited factors; however, the resulting solution was only partially aligned (i.e., Gs, Gwm) with the theoretical structure promoted in the Technical Manual and suggested the preeminence of the higher-order factor. Results challenge the hypothesized structure of the WJ IV Cognitive and raise concerns about its alignment with Cattell-Horn-Carroll theory.
Champendal, Alexandre; Kanevski, Mikhail; Huguenot, Pierre-Emmanuel; Golay, Jean
Air pollution in the city is an important problem influencing environment, well-being of society, economy, management of urban zones, etc. The problem is extremely difficult due to a very complex distribution of the pollution sources, morphology of the city and dispersion processes leading to multivariate nature of the phenomena and high local spatial-temporal variability. The task of understanding, modelling and prediction of spatial-temporal patterns of air pollution in urban zones is an interesting and challenging topic having many research axes from science-based modelling to geostatistics and data mining. The present research mainly deals with a comprehensive exploratory analysis of spatial-temporal air pollution data using statistical, geostatistical and machine learning tools. This analysis helps to 1) understand and model spatial-temporal correlations using variography, 2) explore the temporal evolution of spatial correlation matrix; 3) analyse and visualize an interconnection between measurement stations using network science tools; 4) quantify the availability and predictability of structured patterns. The real data case study deals with spatial-temporal air pollution data of canton Geneva (2002-2011). Nitrogen dioxide (NO2) has caught our attention. It has effects on health: nitrogen dioxide can irritate the lungs. It also affects plants: NO2 contributes to the phenomenon of acid rain, and nitrogen dioxides reduce growth, production and pesticide resistance. Finally, it affects materials: nitrogen dioxides increase corrosion. Well-defined patterns of spatial-temporal correlations were detected. The analysis and visualization of spatial correlation matrix for 91 stations were carried out using the network science tools and high levels of clustering were revealed. Moving Window Correlation Matrix and Spatio-temporal variography methods were applied to define and explore the dynamic of our data. More than just
Utley, Cheryl A.
An exploratory factorial analysis of the Multicultural and Special Education Survey (MSES) evaluated the professional development training needs of general and special educators in a midwestern state. Survey items were selected from the culturally and linguistically diverse multicultural, bilingual and special education literature bases (CLD). The…
Exploratory factor analysis (EFA) methods are used extensively in the field of assessment and evaluation. Due to EFA's widespread use, common methods and practices have come under close scrutiny. A substantial body of literature has been compiled highlighting problems with many of the methods and practices used in EFA, and, in response, many…
Curtis, Deborah A.; Araki, Cheri J.
The purpose of this research was to analyze recent statistics textbooks in the behavioral sciences in terms of their coverage of exploratory data analysis (EDA) philosophy and techniques. Twenty popular texts were analyzed. EDA philosophy was not addressed in the vast majority of texts. Only three texts had an entire chapter on EDA. None of the…
Wang, Haonan; Iyer, Hari
In this paper we discuss the use of a recent dimension reduction technique called Locally Linear Embedding, introduced by Roweis and Saul, for performing an exploratory latent structure analysis. The coordinate variables from the locally linear embedding describing the manifold on which the data reside serve as the latent variable scores. We…
Lee, Chun-Ting; Zhang, Guangjian; Edwards, Michael C.
Exploratory factor analysis (EFA) is often conducted with ordinal data (e.g., items with 5-point responses) in the social and behavioral sciences. These ordinal variables are often treated as if they were continuous in practice. An alternative strategy is to assume that a normally distributed continuous variable underlies each ordinal variable.…
Carey, John; Brigman, Greg; Webb, Linda; Villares, Elizabeth; Harrington, Karen
This article describes the development of the Student Engagement in School Success Skills instrument including item development and exploratory factor analysis. The instrument was developed to measure student use of the skills and strategies identified as most critical for long-term school success that are typically taught by school counselors.
Young, Anita; Bryan, Julia
This study examined the factor structure of the School Counselor Leadership Survey (SCLS). Survey development was a threefold process that resulted in a 39-item survey of 801 school counselors and school counselor supervisors. The exploratory factor analysis indicated a five-factor structure that revealed five key dimensions of school counselor…
Buley, Jerry L.
States that attacks by communication scholars have cast doubt on the validity of exploratory factor analysis (EFA). Tests EFA's ability to produce results that replicate known dimensions in a data set. Concludes that EFA should be viewed with cautious optimism and be evaluated according to the findings of this and similar studies. (PA)
Norris, Megan; Lecavalier, Luc
Exploratory factor analysis (EFA) is a widely used but poorly understood statistical procedure. This paper described EFA and its methodological variations. Then, key methodological variations were used to evaluate EFA usage over a 10-year period in five leading developmental disabilities journals. Sixty-six studies were located and evaluated on…
Adams, Daniel E.; Crumbly, Christopher M.; Delp, Steve E.; Guidry, Michelle A.; Lisano, Michael E.; Packard, James D.; Striepe, Scott A.
This report presents the unmanned Multiple Exploratory Probe Systems (MEPS), a space vehicle designed to observe the planet Mars in preparation for manned missions. The options considered for each major element are presented as a trade analysis, and the final vehicle design is defined.
Lennon, Patricia A.
This researcher examined the relationship of bureaucratic structure to school climate by means of an exploratory factor analysis of a measure of bureaucracy developed by Hoy and Sweetland (2000) and the four dimensional measure of climate developed by Hoy, Smith, and Sweetland (2002). Since there had been no other empirical studies whose authors…
Clemens, Elysia V.; Carey, John C.; Harrington, Karen M.
This article details the initial development of the School Counseling Program Implementation Survey and psychometric results including reliability and factor structure. An exploratory factor analysis revealed a three-factor model that accounted for 54% of the variance of the intercorrelation matrix and a two-factor model that accounted for 47% of…
A Metropolis-Hastings Robbins-Monro (MH-RM) algorithm for high-dimensional maximum marginal likelihood exploratory item factor analysis is proposed. The sequence of estimates from the MH-RM algorithm converges with probability one to the maximum likelihood solution. Details on the computer implementation of this algorithm are provided. The…
Myers, Nicholas D.; Ahn, Soyeon; Jin, Ying
The purpose of this study was to explore the influence of the number of targets specified on the quality of exploratory factor analysis solutions with a complex underlying structure and incomplete substantive measurement theory. Three Monte Carlo studies were performed based on the ratio of the number of observed variables to the number of…
Ang, Rebecca P.; Chong, Wan Har; Huan, Vivien S.; Yeo, Lay See
This article reports the development and initial validation of scores obtained from the Adolescent Concerns Measure (ACM), a scale which assesses concerns of Asian adolescent students. In Study 1, findings from exploratory factor analysis using 619 adolescents suggested a 24-item scale with four correlated factors--Family Concerns (9 items), Peer…
Whitcomb, Sara A.; Woodland, Rebecca H.; Barry, Shannon K.
An exploratory case study is presented in which social network analysis (SNA) was used to explore how school teaming structures influence the implementation of School-Wide Positive Behavioral Interventions and Supports (PBIS). The authors theorized that PBIS leadership teams that include members with connections to all other information-sharing…
Zhang, Guangjian; Preacher, Kristopher J.; Luo, Shanhong
This article is concerned with using the bootstrap to assign confidence intervals for rotated factor loadings and factor correlations in ordinary least squares exploratory factor analysis. Coverage performances of "SE"-based intervals, percentile intervals, bias-corrected percentile intervals, bias-corrected accelerated percentile…
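The percentile bootstrap mentioned in the abstract can be sketched for a simple correlation coefficient standing in for a rotated loading; the data, sample size, and replication count below are made up.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical sample: two correlated measures.
n = 300
x = rng.normal(size=n)
y = 0.6 * x + rng.normal(scale=0.8, size=n)
data = np.column_stack([x, y])

def corr(sample):
    return np.corrcoef(sample, rowvar=False)[0, 1]

# Resample rows with replacement and recompute the statistic each time.
boots = np.array([corr(data[rng.integers(0, n, n)])
                  for _ in range(2000)])

# Percentile interval: the 2.5th and 97.5th percentiles of the
# bootstrap distribution of the statistic.
ci_lo, ci_hi = np.percentile(boots, [2.5, 97.5])
```

Bias-corrected and accelerated variants adjust these percentile cut points for bias and skew in the bootstrap distribution, which is the comparison the article takes up.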
Dückers, Michel L A; Olff, Miranda
Recent research suggests that greater country vulnerability is associated with a decreased, rather than increased, risk of mental health problems. Because societal parameters may have gender-specific implications, our objective was to explore whether the "vulnerability paradox" equally applies to women and men. Lifetime posttraumatic stress disorder (PTSD) prevalence data for women and men were retrieved from 11 population studies (N = 57,031) conducted in Australia, Brazil, Canada, France, Lebanon, Mexico, the Netherlands, Portugal, Sweden, Switzerland, and the United States. We tested statistical models with vulnerability, gender, and their interaction as predictors. The average lifetime PTSD prevalence in women was at least twice as high as it was in men, and the vulnerability paradox existed in the prevalence data for women and men (R² = .70). We could not confirm the possibility that gender effects are modified by socioeconomic and cultural country characteristics. Issues of methodology, language, and cultural validity complicate international comparisons. Nevertheless, this international sample points at a parallel paradox: the vulnerability paradox was confirmed for both women and men. The absence of a significant interaction between gender and country vulnerability implies that possible explanations for the paradox at the country level do not necessarily require gender-driven distinction.
Dombrowski, Stefan C.
Two exploratory bifactor methods (e.g., Schmid-Leiman [SL] and exploratory bifactor analysis [EBFA]) were used to investigate the structure of the Woodcock-Johnson III (WJ-III) Cognitive in early school age (age 6-8). The SL procedure is recognized by factor analysts as a preferred method for EBFA. Jennrich and Bentler recently developed an…
Earth system scientists are being inundated by an explosion of data generated by ever-increasing resolution in both global models and remote sensors. Advanced tools for accessing, analyzing, and visualizing very large and complex climate data are required to maintain rapid progress in Earth system research. To meet this need, NASA, in collaboration with the Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) consortium, is developing exploratory climate data analysis and visualization tools which provide data analysis capabilities for the Earth System Grid (ESG). This paper describes DV3D, a UV-CDAT package that enables exploratory analysis of climate simulation and observation datasets. DV3D provides user-friendly interfaces for visualization and analysis of climate data at a level appropriate for scientists. It features workflow interfaces, interactive 4D data exploration, hyperwall and stereo visualization, automated provenance generation, and parallel task execution. DV3D's integration with CDAT's climate data management system (CDMS) and other climate data analysis tools provides a wide range of high-performance climate data analysis operations. DV3D expands the scientists' toolbox by incorporating a suite of rich new exploratory visualization and analysis methods for addressing the complexity of climate datasets.
Moskalev, Alexey; Zhikrivetskaya, Svetlana; Shaposhnikov, Mikhail; Dobrovolskaya, Evgenia; Gurinovich, Roman; Kuryan, Oleg; Pashuk, Aleksandr; Jellen, Leslie C.; Aliper, Alex; Peregudov, Alex; Zhavoronkov, Alex
Aging research is a multi-disciplinary field encompassing knowledge from many areas of basic, applied and clinical research. Age-related processes occur on molecular, cellular, tissue, organ, system, organismal and even psychological levels, trigger the onset of multiple debilitating diseases and lead to a loss of function, and there is a need for a unified knowledge repository designed to track, analyze and visualize the cause and effect relationships and interactions between the many elements and processes on all levels. Aging Chart (http://agingchart.org/) is a new, community-curated collection of aging pathways and knowledge that provides a platform for rapid exploratory analysis. Building on an initial content base constructed by a team of experts from peer-reviewed literature, users can integrate new data into biological pathway diagrams for a visible, intuitive, top-down framework of aging processes that fosters knowledge-building and collaboration. As the body of knowledge in aging research is rapidly increasing, an open visual encyclopedia of aging processes will be useful to both the new entrants and experts in the field. PMID:26602690
Mirel, Barbara; Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan
Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students' visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students' successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules. PMID:26877625
Lobos, Gustavo A.; Poblete-Echeverría, Carlos
This article describes public, free software that provides efficient exploratory analysis of high-resolution spectral reflectance data. Spectral reflectance data can suffer from problems such as poor signal-to-noise ratios in various wavebands or invalid measurements due to changes in incoming solar radiation or operator fatigue leading to poor orientation of sensors. Thus, exploratory data analysis is essential to identify appropriate data for further analyses. This software overcomes the problem that analysis tools such as Excel are cumbersome to use for the high number of wavelengths and samples typically acquired in these studies. The software, Spectral Knowledge (SK-UTALCA), was initially developed for plant breeding, but it is also suitable for other studies such as precision agriculture, crop protection, ecophysiology, plant nutrition, and soil fertility. Various spectral reflectance indices (SRIs) are often used to relate crop characteristics to spectral data, and the software is loaded with 255 SRIs which can be applied quickly to the data. This article describes the architecture and functions of SK-UTALCA and the features of the data that led to the development of each of its modules. PMID:28119705
Esti, Marco; González Airola, Ricardo L; Moneta, Elisabetta; Paperaio, Marina; Sinesio, Fiorella
Grechetto is a traditional white-grape vine, widespread in the Umbria and Lazio regions of central Italy. Despite the wine's commercial diffusion, little literature on its sensory characteristics is available. The present study is exploratory research conducted with the aim of identifying the sensory markers of Grechetto wine and of evaluating the effect of clone, geographical area, vintage and producer on sensory attributes. A qualitative sensory study was conducted on 16 wines, differing in vintage, Typical Geographic Indication, and clone, collected from 7 wineries, using a trained panel working in isolation with reference to a glossary of 133 white wine descriptors. Sixty-five attributes identified by a minimum of 50% of the respondents were submitted to a correspondence analysis to link wine samples to the sensory attributes. Seventeen terms identified as common to all samples are considered characteristic of Grechetto wine, 10 of which were olfactory: fruity, apple, acacia flower, pineapple, banana, floral, herbaceous, honey, apricot and peach. In order to interpret the relationship between design variables and sensory attribute data for the 2005 and 2006 wines, the 28 most discriminating descriptors were projected in a principal component analysis. The first principal component was best described by olfactory terms and the second by gustative attributes. Good reproducibility of results was obtained for the two vintages. For one winery, the vintage effect (2002-2006) was described in a new principal component analysis model applied to the 39 most discriminating descriptors, which globally explained about 84% of the variance. In the young wines the notes of sulphur, yeast, dried fruit and butter, combined with fresh herbaceous and tropical fruity notes (melon, grapefruit), were dominant. During wine aging, sweeter notes, like honey, caramel and jam, became more dominant, as did some mineral notes, such as tuff and flint.
Beavers, Amy S.; Lounsbury, John W.; Richards, Jennifer K.; Huck, Schuyler W.; Skolits, Gary J.; Esquivel, Shelley L.
The uses and methodology of factor analysis are widely debated and discussed, especially the issues of rotational use, methods of confirmatory factor analysis, and adequate sample size. The variety of perspectives and often conflicting opinions can lead to confusion among researchers about best practices for using factor analysis. The focus of the…
Manukyan, Narine; Eppstein, Margaret J; Horbar, Jeffrey D; Leahy, Kathleen A; Kenny, Michael J; Mukherjee, Shreya; Rizzo, Donna M
We introduce a new method for exploratory analysis of large data sets with time-varying features, where the aim is to automatically discover novel relationships between features (over some time period) that are predictive of any of a number of time-varying outcomes (over some other time period). Using a genetic algorithm, we co-evolve (i) a subset of predictive features, (ii) which attribute will be predicted, (iii) the time period over which to assess the predictive features, and (iv) the time period over which to assess the predicted attribute. After validating the method on 15 synthetic test problems, we used the approach for exploratory analysis of a large healthcare network data set. We discovered a strong association, with 100% sensitivity, between hospital participation in multi-institutional quality improvement collaboratives during or before 2002, and changes in the risk-adjusted rates of mortality and morbidity observed after a 1-2 year lag. The proposed approach is a potentially powerful and general tool for exploratory analysis of a wide range of time-series data sets.
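A heavily simplified sketch of the core idea — evolving a subset of predictive features with a genetic algorithm — might look like the following. The co-evolution of time windows described in the abstract is omitted, and the data, fitness function, and GA settings are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)
n, p = 300, 12
X = rng.normal(size=(n, p))
y = X[:, 2] + X[:, 5] + 0.3 * rng.normal(size=n)   # features 2 and 5 are predictive

def fitness(mask):
    # R^2 of a least-squares fit on the selected features, minus a size penalty
    if not mask.any():
        return 0.0
    A = np.column_stack([np.ones(n), X[:, mask]])
    resid = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
    return 1 - resid.var() / y.var() - 0.05 * mask.sum()

pop = rng.random((30, p)) < 0.3                     # population of feature bitmasks
for _ in range(40):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-10:]]         # truncation selection
    children = []
    for _ in range(len(pop)):
        a, b = parents[rng.integers(10, size=2)]
        child = np.where(rng.random(p) < 0.5, a, b) # uniform crossover
        child ^= rng.random(p) < 0.02               # bit-flip mutation
        children.append(child)
    pop = np.array(children)

best = pop[np.argmax([fitness(m) for m in pop])]
print(np.flatnonzero(best))  # should recover features {2, 5}
```

The size penalty in the fitness function plays the role of parsimony pressure; without it the GA tends to accumulate irrelevant features that add negligible R^2.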
Kilian, D; Lemmer, H J R; Gerber, M; du Preez, J L; du Plessis, J
Molecular weight and log P remain the most frequently used physicochemical properties in models that predict skin permeability. However, several reports over the past two decades have suggested that predictions made by these models may not be sufficiently accurate. In this study, exploratory data analysis of the probabilistic dependencies between molecular weight, log P and log Kp was performed on a dataset constructed from the combination of several popular datasets. The results suggest that, in general, molecular weight and log P are poorly correlated to log Kp. However, after employing several exploratory data analysis techniques, regions within the dataset of statistically significant dependence were identified. As an example of the applicability of the information extracted from the exploratory data analyses, a multiple linear regression model was constructed, bounded by the ranges of dependence. This model gave reasonable approximations to log Kp values obtained from skin permeability studies of selected non-steroidal anti-inflammatory drugs (NSAIDs) administered from a buffer solution and a lipid-based drug delivery system. A method of testing whether a given drug falls within the regions of statistical dependence was also presented. Knowing the ranges within which molecular weight and log P are statistically related to log Kp can supplement existing methods of screening, risk analysis or early drug development decision making to add confidence to predictions made regarding skin permeability.
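A bounded multiple linear regression of the kind described — log Kp modelled from log P and molecular weight, fit only within a region of dependence — can be sketched as below. The coefficients, ranges, and data are entirely synthetic (the general form echoes classical permeability models, not this paper's fitted values):

```python
import numpy as np

rng = np.random.default_rng(0)
mw = rng.uniform(150, 500, 200)        # molecular weight (Da), simulated
logp = rng.uniform(-1, 4, 200)         # octanol-water partition coefficient, simulated
log_kp = -2.7 + 0.71 * logp - 0.0061 * mw + rng.normal(0, 0.1, 200)

# restrict fitting to a hypothetical "region of statistical dependence"
mask = (mw > 200) & (mw < 450) & (logp > 0)
A = np.column_stack([np.ones(mask.sum()), logp[mask], mw[mask]])
beta, *_ = np.linalg.lstsq(A, log_kp[mask], rcond=None)
print(beta)  # approximately [-2.7, 0.71, -0.0061]
```

Checking whether a candidate drug's (mw, logp) pair falls inside `mask`'s bounds before applying the model is the analogue of the screening test the abstract mentions.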
Scott, Terry F.; Schumayer, Daniel; Gray, Andrew R.
We perform a factor analysis on a "Force Concept Inventory" (FCI) data set collected from 2109 respondents. We address two questions: the appearance of conceptual coherence in student responses to the FCI and some consequences of this factor analysis on the teaching of Newtonian mechanics. We will highlight the apparent conflation of Newton's…
Barcikowski, Robert S.; Elliott, Ronald S.
The contribution of individual variables to overall multivariate significance in a multivariate analysis of variance (MANOVA) is investigated using a combination of canonical discriminant analysis and Roy-Bose simultaneous confidence intervals. Difficulties with this procedure are discussed, and its advantages are illustrated using examples based…
Beare, Richard; Chen, Jian; Phan, Thanh G.
The summed Alberta Stroke Program Early CT Score (ASPECTS) is useful for predicting stroke outcome. The anatomical information in the CT template is rarely used for this purpose because traditional regression methods are not adept at handling collinearity (relatedness) among brain regions. While penalized logistic regression (PLR) can handle collinearity, it does not provide an intuitive understanding of the interaction among network structures in the way that an eigenvector method such as PageRank (used in the Google search engine) can. In this exploratory analysis we applied graph theoretical analysis to explore the relationship among ASPECTS regions with respect to disability outcome. The Virtual International Stroke Trials Archive (VISTA) was searched for patients who had infarct in at least one ASPECTS region (ASPECTS ≤9; ASPECTS=10 were excluded) and recorded disability (modified Rankin score/mRS). A directed graph was created from a cross-correlation matrix (thresholded at a false discovery rate of 0.01) of the ASPECTS regions, demographic variables and disability (mRS>2). We estimated the network-based importance of each ASPECTS region by comparing PageRank and node strength measures. These results were compared with those from PLR. There were 185 subjects, average age 67.5 ± 12.8 years (55% males). Model 1: with demographic variables having no direct connection with disability, the highest PageRank was M2 (0.225, bootstrap 95% CI 0.215-0.347). Model 2: with demographic variables having a direct connection with disability, the highest PageRanks were M2 (0.205, bootstrap 95% CI 0.194-0.367) and M5 (0.125, bootstrap 95% CI 0.096-0.204). Both models illustrate the importance of the M2 region to disability. The PageRank method reveals complex interaction among ASPECTS regions with respect to disability. This approach may help to understand the infarcted brain network involved in stroke disability. PMID:25961856
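The eigenvector-centrality idea behind PageRank can be sketched with a few lines of power iteration. The small directed graph below is invented for illustration; it stands in for the thresholded cross-correlation graph of ASPECTS regions:

```python
import numpy as np

def pagerank(adj, damping=0.85, tol=1e-10):
    # power iteration on the damped, column-stochastic transition matrix
    n = adj.shape[0]
    out = adj.sum(axis=1, keepdims=True)
    out[out == 0] = 1                      # guard against dangling nodes
    M = (adj / out).T
    r = np.full(n, 1.0 / n)
    while True:
        r_new = (1 - damping) / n + damping * M @ r
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new

adj = np.array([[0, 1, 1, 0],
                [0, 0, 1, 0],
                [1, 0, 0, 0],
                [0, 0, 1, 0]], dtype=float)
scores = pagerank(adj)
print(scores.argmax())  # node 2, which receives the most link mass
```

Node strength, the other measure the abstract compares, is simply `adj.sum(axis=0)` (weighted in-degree) on the same matrix.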
Krekmanova, Larisa; Hakeberg, Magnus; Robertson, Agneta; Klingberg, Gunilla
The aim of the study was to reduce everyday and dental treatment pain items included in the extended Children's Pain Inventory (CPI), used in a prior study on Swedish children and adolescents. Another aim was to, by means of exploratory factor analysis (EFA), expose hitherto undiscovered dimensions of the CPI pain variables and thus to improve the psychometric properties of the CPI. As some pain items are relevant merely to some individuals, a new and more useful questionnaire construction would enhance the internal validity of the instrument in observational surveys. EFA was applied to the extended CPI instrument. 368 children, 8-19 years old, had answered a questionnaire comprising 10 dental and 28 everyday pain variables. These pain items were analysed using a series of sequentially implemented EFA. Interpretations and decisions on the final number of extracted factors were based on accepted principles: Kaiser's eigenvalue >1 criterion, inspection of the scree plot and the interpretability of the item loadings. The factors were orthogonally rotated using the Varimax method to maximize the amount of variance explained. Of all EFA models tested in the analysis, two-, three-, four-, and five-factor models surfaced. The interpretability of the factors and their item loadings were examined stepwise; the items were modulated and the factors re-evaluated. A four-factor pain model emerged as the most interpretable, explaining 79% of the total variance with eigenvalues > 1.014. The factors were named to indicate the profile of their content: Factor I cutting trauma to skin/mucosal pain, Factor II head/neck pain, Factor III tenderness/blunt trauma pain, Factor IV oral/dental treatment pain.
Mukherjee, Shashi Bajaj; Sen, Pradip Kumar
Studying periodic patterns is a standard line of attack for recognizing DNA sequences in gene identification and similar problems, yet surprisingly little significant work has been done in this direction. This paper studies statistical properties of DNA sequences of complete genomes using a new technique. A DNA sequence is converted to a numeric sequence using various types of mappings, and the standard Fourier technique is applied to study the periodicity. Distinct statistical behaviour of the periodicity parameters is found in coding and non-coding sequences, which can be used to distinguish between these parts. Here DNA sequences of Drosophila melanogaster were analyzed with significant accuracy.
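One common numeric mapping is the set of four binary indicator sequences, one per base, whose combined Fourier power spectrum shows a peak at period 3 in coding regions (the classic codon-position bias). A minimal sketch, with purely synthetic sequences:

```python
import numpy as np

def period3_power(seq):
    # sum the FFT power spectra of the four base-indicator sequences,
    # then return the period-3 peak relative to the spectral background
    n = len(seq)
    spectrum = np.zeros(n)
    for base in "ACGT":
        u = np.array([1.0 if b == base else 0.0 for b in seq])
        spectrum += np.abs(np.fft.fft(u)) ** 2
    return spectrum[n // 3] / spectrum[1:n // 2].mean()

rng = np.random.default_rng(1)
codons = ["ATG", "GCA", "ATC"]                       # strong codon-position bias
coding = "".join(rng.choice(codons) for _ in range(100))
random_seq = "".join(rng.choice(list("ACGT")) for _ in range(300))
print(period3_power(coding) > period3_power(random_seq))  # True
```

Real coding regions show a weaker but still detectable peak than this caricature, which is why the paper studies the statistical distribution of the periodicity parameters rather than a fixed threshold.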
Steed, Chad Allen
EDENx is a multivariate data visualization tool that allows interactive, user-driven analysis of large-scale data sets with high dimensionality. EDENx builds on our earlier system, EDEN, to enable analysis of more dimensions and larger-scale data sets. EDENx provides an initial overview of summary statistics for each variable in the data set under investigation. EDENx allows the user to interact with graphical summary plots of the data to investigate subsets and their statistical associations. These plots include histograms, binned scatterplots, binned parallel coordinate plots, timeline plots, and graphical correlation indicators. From the EDENx interface, a user can select a subsample of interest and launch a more detailed data visualization via the EDEN system. EDENx is best suited for high-level, aggregate analysis tasks, while EDEN is more appropriate for detailed data investigations.
Yemini, Miri; Sagie, Netta
Research on internationalisation in higher education has dramatically expanded over the last several decades. This study aims to provide an overview of the research developments undertaken between 1980 and 2014, on internationalisation in higher education. Explorative, systematic literature screening and analysis were undertaken, encompassing over…
Young, Arthur P.
"Library Quarterly's" seventy-fifth anniversary invites an analysis of the journal's bibliometric dimension, including contributor attributes, various author rankings, and citation impact. Eugene Garfield's HistCite software, linked to Thomson Scientific's Web of Science, as made available by Garfield, for the period 1956-2004, was used as the…
O'Brien, Rebecca; Pan, Xingyu; Courville, Troy; Bray, Melissa A.; Breaux, Kristina; Avitia, Maria; Choi, Dowon
Norm-referenced error analysis is useful for understanding individual differences in students' academic skill development and for identifying areas of skill strength and weakness. The purpose of the present study was to identify underlying connections between error categories across five language and math subtests of the Kaufman Test of…
Reddy, Lauren M.
In order to meet the increasing demand for manned and unmanned flight, the air transportation system must constantly evolve. As new technologies or operational procedures are conceived, we must determine their effect on humans in the system. In this research, we introduce a strategy to assess how individuals or organizations would respond to a novel aerospace system. We employ the most appropriate and sophisticated exploratory analysis techniques on the survey data to generate insight and identify significant variables. We employ three different methods for eliciting views from individuals or organizations who are affected by a system: an opinion survey, a stated preference survey, and structured interviews. We conduct an opinion survey of both the general public and stakeholders in the unmanned aircraft industry to assess their knowledge, attitude, and practices regarding unmanned aircraft. We complete a statistical analysis of the multiple-choice questions using multinomial logit and multivariate probit models and conduct qualitative analysis on free-text questions. We next present a stated preference survey of the general public on the use of an unmanned aircraft package delivery service. We complete a statistical analysis of the questions using multinomial logit, ordered probit, linear regression, and negative binomial models. Finally, we discuss structured interviews conducted on stakeholders from ANSPs and airlines operating in the North Atlantic. We describe how these groups may choose to adopt a new technology (space-based ADS-B) or operational procedure (in-trail procedures). We discuss similarities and differences between the stakeholders groups, the benefits and costs of in-trail procedures and space-based ADS-B as reported by the stakeholders, and interdependencies between the groups interviewed. To demonstrate the value of the data we generated, we explore how the findings from the surveys can be used to better characterize uncertainty in the cost
Thompson, David R.; Castano, Rebecca; Gilmore, Martha S.
Fast automated analysis of hyperspectral imagery can inform observation planning and tactical decisions during planetary exploration. Products such as mineralogical maps can focus analysts' attention on areas of interest and assist data mining in large hyperspectral catalogs. In this work, sparse spectral unmixing drafts mineral abundance maps with Compact Reconnaissance Imaging Spectrometer (CRISM) images from the Mars Reconnaissance Orbiter. We demonstrate a novel "superpixel" segmentation strategy enabling efficient unmixing in an interactive session. Tests correlate automatic unmixing results based on redundant spectral libraries against hand-tuned summary products currently in use by CRISM researchers.
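The linear-unmixing step at the heart of such a pipeline can be sketched with non-negative least squares: solve min ||Ea - x|| with a >= 0, where the columns of E are library endmember spectra and x is a pixel spectrum. The endmembers and abundances below are synthetic; the paper's actual method adds sparsity constraints and superpixel averaging on CRISM libraries:

```python
import numpy as np
from scipy.optimize import nnls

wavelengths = np.linspace(1.0, 2.6, 50)
# three invented "mineral" endmember spectra with distinct absorption bands
E = np.column_stack([
    np.exp(-((wavelengths - 1.4) ** 2) / 0.01),   # mineral A
    np.exp(-((wavelengths - 1.9) ** 2) / 0.02),   # mineral B
    np.exp(-((wavelengths - 2.3) ** 2) / 0.005),  # mineral C
])
true_abundance = np.array([0.7, 0.0, 0.3])
pixel = E @ true_abundance + 0.001 * np.random.default_rng(2).normal(size=50)

abundance, residual = nnls(E, pixel)
print(np.round(abundance, 2))  # close to [0.7, 0.0, 0.3]
```

Running this per superpixel rather than per pixel is what makes the interactive-session speeds described in the abstract plausible: segmentation reduces the number of unmixing problems by orders of magnitude.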
Trenchard, M. H. (Principal Investigator)
Procedures and techniques for analyzing meteorological conditions at segments during the growing season were developed for the U.S./Canada Wheat and Barley Exploratory Experiment. The main product and analysis tool is the segment-level climagraph, which depicts meteorological variables over time for the current year compared with climatological normals. The variable values for the segment are estimates derived through objective analysis of values obtained at first-order stations in the region. The procedures and products documented represent a baseline for future Foreign Commodity Production Forecasting experiments.
Karlin, S; Williams, P T; Carmelli, D; Cameron, E
Three structured exploratory data analysis functionals are applied to plasma total cholesterol concentrations measured for 2,480 young men and women aged 17-18 years and living in Jerusalem, and for their parents. These triad families are divided into five groups according to whether both parents were born in Asia, North Africa, Europe-America, or Israel or whether they were of mixed "origins." The significances of the functionals were determined by a spectrum of permutation techniques that selectively shuffled the trait values across families in order to systematically alter certain family structure relationships while keeping other familial relationships intact. These analyses suggest that generational differences and various distributional effects influence patterns of spouse and parent-offspring interactions within these families and that the nature and forms of these effects and interactions may differ according to the origin of the parents. Results are discussed in relation to historical and cultural differences among the groups.
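The shuffling idea can be illustrated with a basic permutation test of parent-offspring resemblance: compare the observed correlation with the distribution of correlations obtained after permuting offspring values across families. The data and effect size below are simulated, and this omits the structured functionals and selective-shuffling schemes of the actual study:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 400
parent = rng.normal(200, 30, n)                     # e.g. cholesterol, mg/dl (simulated)
offspring = 0.4 * parent + rng.normal(120, 25, n)   # planted familial effect

obs = np.corrcoef(parent, offspring)[0, 1]
perm = np.array([
    np.corrcoef(parent, rng.permutation(offspring))[0, 1]
    for _ in range(2000)
])
p_value = (np.sum(perm >= obs) + 1) / (len(perm) + 1)
print(p_value)  # very small: the resemblance is unlikely under shuffling
```

Selectively shuffling only within origin groups, rather than across all families, is what lets the paper's technique hold some familial relationships fixed while breaking others.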
Vaidyanathan, Seetharaman; Fletcher, John S.; Henderson, Alex; Lockyer, Nicholas P.; Vickerman, John C.
The application of multivariate analytical tools enables simplification of TOF-SIMS datasets so that useful information can be extracted from complex spectra and images, especially those that do not give readily interpretable results. There is however a challenge in understanding the outputs from such analyses. The problem is complicated when analysing images, given the additional dimensions in the dataset. Here we demonstrate how the application of simple pre-processing routines can enable the interpretation of TOF-SIMS spectra and images. For the spectral data, TOF-SIMS spectra used to discriminate bacterial isolates associated with urinary tract infection were studied. Using different criteria for picking peaks before carrying out PC-DFA enabled identification of the discriminatory information with greater certainty. For the image data, an air-dried salt stressed bacterial sample, discussed in another paper by us in this issue, was studied. Exploration of the image datasets with and without normalisation prior to multivariate analysis by PCA or MAF resulted in different regions of the image being highlighted by the techniques.
Opaas, Marianne; Hartmann, Ellen
Fifty-one multitraumatized mental health patients with refugee backgrounds completed the Rorschach (Meyer & Viglione, 2008), Harvard Trauma Questionnaire, and Hopkins Symptom Checklist-25 (Mollica, McDonald, Massagli, & Silove, 2004), and the World Health Organization Quality of Life-BREF questionnaire (WHOQOL Group, 1998) before the start of treatment. The purpose was to gain more in-depth knowledge of an understudied patient group and to provide a prospective basis for later analyses of treatment outcome. Factor analysis of trauma-related Rorschach variables gave 2 components explaining 60% of the variance; the first was interpreted as trauma-related flooding versus constriction and the second as adequate versus impaired reality testing. Component 1 correlated positively with self-reported reexperiencing symptoms of posttraumatic stress (r = .32, p < .05). Component 2 correlated positively with self-reported quality of life in the physical, psychological, and social relationships domains (r = .34, .32, and .35, p < .05), and negatively with anxiety (r = -.33, p < .05). Each component also correlated significantly with resources like work experience, education, and language skills.
Epling, W. Frank; Pierce, W. David
We argue that applied behavior analysis is relevant to basic research. Modification studies, and a broad range of investigations that focus on the precipitating and maintaining conditions of socially significant human behavior, have basic importance. Applied behavior analysis may aid basic researchers in the design of externally valid experiments and thereby enhance the theoretical significance of basic research for understanding human behavior. Applied research with humans, directed at culturally-important problems, will help to propagate the science of human behavior. Such a science will also be furthered by analogue experiments that model socially important behavior. Analytical-applied studies and analogue experiments are forms of applied behavior analysis that could suggest new environment-behavior relationships. These relationships could lead to basic research and principles that further the prediction, control, and understanding of behavior. PMID:22478650
Hopkins, B. L.
Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…
…often referred to as behavior modification, which promotes improvements in human learning through an analysis of the contingencies surrounding a… Company, in press. Bandura, A. Principles of Behavior Modification. New York: Holt, Rinehart & Winston, 1969. Bostow, D. E., & Bailey, J. S. Modification of… tutors for kindergarten children. Journal of Applied Behavior Analysis, 1974, 7, 223-232. Kazdin, A. E. Behavior Modification in Applied Settings.
Smith, Richard J; Lehning, Amanda J; Dunkle, Ruth E
Accurate conceptualization and measurement of age-friendly community characteristics would help to reduce barriers to documenting the effects on elders of interventions to create such communities. This article contributes to the measurement of age-friendly communities through an exploratory factor analysis of items reflecting an existing US Environmental Protection Agency policy framework. From a sample of urban elders (n = 1,376), we identified 6 factors associated with demographic and health characteristics: access to business and leisure, social interaction, access to health care, neighborhood problems, social support, and community engagement. Future research should explore the effects of these factors across contexts and populations.
Hauben, Manfred; Hung, Eric; Hsieh, Wen-Yaw
Background: Severe cutaneous adverse reactions (SCARs) are prominent in pharmacovigilance (PhV). They have some commonalities such as nonimmediate nature and T-cell mediation, and rare overlap syndromes have been documented, most commonly involving acute generalized exanthematous pustulosis (AGEP) and drug rash with eosinophilia and systemic symptoms (DRESS), and DRESS and toxic epidermal necrolysis (TEN). However, they display diverse clinical phenotypes and variations in specific T-cell immune response profiles, plus some specific genotype–phenotype associations. A question is whether causation of a given SCAR by a given drug supports causality of the same drug for other SCARs. If so, we might expect significant intercorrelations between SCARs with respect to overall drug-reporting patterns. SCARs with significant intercorrelations may reflect a unified underlying concept. Methods: We used exploratory factor analysis (EFA) on data from the United States Food and Drug Administration Adverse Event Reporting System (FAERS) to assess reporting intercorrelations between six SCARs [AGEP, DRESS, erythema multiforme (EM), Stevens–Johnson syndrome (SJS), TEN, exfoliative dermatitis (ExfolDerm)]. We screened the data using visual inspection of scatterplot matrices for problematic data patterns. We assessed factorability via Bartlett's test of sphericity, the Kaiser-Meyer-Olkin (KMO) statistic, initial estimates of communality and the anti-image correlation matrix. We extracted factors via principal axis factoring (PAF). The number of factors was determined by scree plot/Kaiser's rule. We also examined solutions with an additional factor. We applied various oblique rotations. We assessed the strength of the solution by percentage of variance explained, minimum number of factors loading per major factor, the magnitude of the communalities, loadings and crossloadings, and reproduced- and residual correlations. Results: The data were generally adequate for factor analysis
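The KMO factorability statistic mentioned above compares squared correlations with squared partial correlations taken from the inverse correlation matrix. A minimal sketch on simulated one-factor data (the dataset and loadings are invented, not FAERS data):

```python
import numpy as np

def kmo(X):
    # Kaiser-Meyer-Olkin measure of sampling adequacy
    R = np.corrcoef(X, rowvar=False)
    inv = np.linalg.inv(R)
    d = np.sqrt(np.outer(np.diag(inv), np.diag(inv)))
    partial = -inv / d                       # partial correlation matrix
    off = ~np.eye(R.shape[0], dtype=bool)    # off-diagonal mask
    r2, p2 = (R[off] ** 2).sum(), (partial[off] ** 2).sum()
    return r2 / (r2 + p2)

rng = np.random.default_rng(4)
factor = rng.normal(size=(500, 1))
X = factor @ np.ones((1, 6)) + 0.5 * rng.normal(size=(500, 6))  # one common factor
print(round(kmo(X), 2))  # well above the conventional 0.5 "acceptable" floor
```

When the variables share common factors, partial correlations shrink relative to raw correlations and KMO approaches 1; purely noise data pushes it toward 0.5 and below.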
Hone, D. W. E.; Mahony, S. H.; Sparks, R. S. J.; Martin, K. T.
Cladistics is a systematic method of classification that groups entities on the basis of sharing similar characteristics in the most parsimonious manner. Here cladistics is applied to the classification of volcanoes using a dataset of 59 Quaternary volcanoes and 129 volcanic edifices of the Tohoku region, Northeast Japan. Volcano and edifice characteristics recorded in the database include attributes of volcano size, chemical composition, dominant eruptive products, volcano morphology, dominant landforms, volcano age and eruptive history. Without characteristics related to time the volcanic edifices divide into two groups, with characters related to volcano size, dominant composition and edifice morphology being the most diagnostic. Analysis including time based characteristics yields four groups with a good correlation between these groups and the two groups from the analysis without time for 108 out of 129 volcanic edifices. Thus when characters are slightly changed the volcanoes still form similar groupings. Analysis of the volcanoes both with and without time yields three groups based on compositional, eruptive products and morphological characters. Spatial clusters of volcanic centres have been recognised in the Tohoku region by Tamura et al. (Earth Planet Sci Lett 197:105-106, 2002). The groups identified by cladistic analysis are distributed unevenly between the clusters, indicating a tendency for individual clusters to form similar kinds of volcanoes with distinctive but coherent styles of volcanism. Uneven distribution of volcano types between clusters can be explained by variations in dominant magma compositions through time, which are reflected in eruption products and volcanic landforms. Cladistic analysis can be a useful tool for elucidating dynamic igneous processes that could be applied to other regions and globally. Our exploratory study indicates that cladistics has promise as a method for classifying volcanoes and potentially elucidating dynamic
Reeve, Kenneth F; Reeve, Sharon A
Since 2004, faculty members at Caldwell University have developed three successful graduate programs in Applied Behavior Analysis (i.e., PhD, MA, non-degree programs), increased program faculty from two to six members, developed and operated an on-campus autism center, and begun a stand-alone Applied Behavior Analysis Department. This paper outlines a number of strategies used to advance these initiatives, including those associated with an extensive public relations campaign. We also outline challenges that have limited our programs' growth. These strategies, along with a consideration of potential challenges, might prove useful in guiding academicians who are interested in starting their own programs in behavior analysis.
Navidpour, Fariba; Dolatian, Mahrokh; Shishehgar, Sara; Yaghmaei, Farideh; Majd, Hamid Alavi; Hashemi, Seyed Saeed
Introduction: Biological, environmental, and inter- and intrapersonal changes during the antenatal period can result in anxiety and stress in pregnant women. It is pivotal to identify potential stressors and prevent their foetal and maternal consequences. The present study was conducted to validate and examine the factor structure of the Farsi version of the Pregnancy Worries and Stress Questionnaire (PWSQ). Methods: In 2015, 502 Iranian healthy pregnant women, referred to selected hospitals in Tehran for prenatal care at 8–39 weeks of pregnancy, were recruited through randomized cluster sampling. The PWSQ was translated into Farsi, and its validity and reliability were examined using exploratory factor analysis in SPSS version 21. Results: The content validity of items on the PWSQ was between 0.63–1. The content validity indices for relevance, clarity and simplicity were 0.92, 0.98, and 0.98, respectively, with a mean of 0.94. The Kaiser–Meyer–Olkin measure of sampling adequacy was 0.863. Test–retest reliability showed high internal consistency (α=0.89; p<0.0001). Conclusion: The psychometric evaluation and exploratory factor analysis showed that the translated questionnaire is a valid and reliable tool to identify stress in Iranian pregnant women. Application of the questionnaire can facilitate the diagnosis of stress in pregnant women and assist health care providers in providing timely support and minimizing negative outcomes of stress and anxiety in pregnant women and their infants. PMID:27957315
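The internal-consistency figure reported above (α=0.89) is Cronbach's alpha, which can be computed directly from an item-response matrix. The item responses below are simulated, not PWSQ data:

```python
import numpy as np

def cronbach_alpha(items):
    # rows = respondents, cols = items
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(5)
latent = rng.normal(size=(300, 1))
responses = latent + 0.8 * rng.normal(size=(300, 10))  # 10 correlated items
print(round(cronbach_alpha(responses), 2))
```

Alpha rises as items share more variance with the common latent trait; with weakly correlated items the ratio of summed item variances to total-score variance approaches 1 and alpha collapses toward 0.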
Pratama, Ahmad R.
This paper explores the relationship between world ICT and education indicators using the latest available data from the World Bank and UNESCO, covering 2011-2014, with the help of several exploratory methods: principal component analysis (PCA), factor analysis (FA), cluster analysis, and ordinary least squares (OLS) regression. After dealing with all missing values, 119 countries were included in the final dataset. The findings show that most ICT and education indicators are highly associated with the income of the respective country and therefore confirm the existence of a digital divide in ICT utilization and a participation gap in education between rich and poor countries. They also indicate that the digital divide and the participation gap are highly associated with each other. Finally, the findings confirm reverse causality between ICT and education: a higher participation rate in education increases technology utilization, which in turn helps promote better educational outcomes.
Deitz, Samuel M.
This article examines two criteria for a definition of applied behavior analysis. The criteria are derived from a 19th century attempt to establish medicine as a scientific field. The first criterion, experimental determinism, specifies the methodological boundaries of an experimental science. The second criterion, philosophic doubt, clarifies the tentative nature of facts and theories derived from those facts. Practices which will advance the science of behavior are commented upon within each criterion. To conclude, the problems of a 19th century form of empiricism in medicine are related to current practices in applied behavior analysis. PMID:22478557
Ruscio, John; Roche, Brendan
Exploratory factor analysis (EFA) is used routinely in the development and validation of assessment instruments. One of the most significant challenges when one is performing EFA is determining how many factors to retain. Parallel analysis (PA) is an effective stopping rule that compares the eigenvalues of randomly generated data with those for…
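The rule PA applies can be sketched in a few lines: keep leading factors only while the observed correlation-matrix eigenvalue exceeds the mean eigenvalue from random data of the same shape. A minimal sketch of Horn's procedure on simulated one-factor data:

```python
import numpy as np

def parallel_analysis(X: np.ndarray, n_iter: int = 100, seed: int = 0) -> int:
    """Retain factors while the observed eigenvalue exceeds the mean
    eigenvalue of same-shaped random normal data (Horn's criterion)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
    rand = np.zeros(p)
    for _ in range(n_iter):
        R = rng.normal(size=(n, p))
        rand += np.sort(np.linalg.eigvalsh(np.corrcoef(R, rowvar=False)))[::-1]
    rand /= n_iter
    for i in range(p):               # stop at the first eigenvalue that falls
        if obs[i] <= rand[i]:        # below its random-data counterpart
            return i
    return p

# Six items all driven by one latent factor: PA should retain exactly one.
rng = np.random.default_rng(2)
f = rng.normal(size=(300, 1))
X = np.hstack([f + rng.normal(size=(300, 1)) for _ in range(6)])
print(parallel_analysis(X))
```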
Yu, Taeho; Richardson, Jennifer C.
The purpose of this study was to develop an effective instrument to measure student readiness in online learning with reliable predictors of online learning success factors such as learning outcomes and learner satisfaction. The validity and reliability of the Student Online Learning Readiness (SOLR) instrument were tested using exploratory factor…
Haned, H; Slooten, K; Gill, P
The interpretation of DNA mixtures has proven to be a complex problem in forensic genetics. In particular, low template DNA samples, where alleles can be missing (allele drop-out) or where alleles unrelated to the crime-sample are amplified (allele drop-in), cannot be analysed with classical approaches such as random man not excluded or random match probability. Drop-out, drop-in, stutters and other PCR-related stochastic effects create uncertainty about the composition of the crime-sample, making it difficult to attach a weight of evidence when (a) reference sample(s) is (are) compared to the crime-sample. In this paper, we use a probabilistic model to calculate likelihood ratios when there is uncertainty about the composition of the crime-sample. This model is essentially exploratory in the sense that it allows the exploration of LRs when two key parameters, drop-out and drop-in, are varied within their plausible ranges of variation. We build on the work of Curran et al. and improve their probabilistic model to allow more flexibility in the way the model parameters are applied. Two main modifications are made to their model: (i) different drop-out probabilities can be applied to different contributors, and (ii) different parameters can be used under the prosecution and the defence hypotheses. We illustrate how the LRs can be explored when the drop-out and drop-in parameters are varied, and suggest the use of Monte Carlo simulations to derive plausible ranges for the probability of drop-out. Although the model is suited to both high and low template samples, we illustrate the advantages of the exploratory approach through two DNA mixtures (involving two and at least three individuals) with low template components.
Ang, Rebecca P; Chong, Wan Har; Huan, Vivien S; Yeo, Lay See
This article reports the development and initial validation of scores obtained from the Adolescent Concerns Measure (ACM), a scale which assesses concerns of Asian adolescent students. In Study 1, findings from exploratory factor analysis using 619 adolescents suggested a 24-item scale with four correlated factors--Family Concerns (9 items), Peer Concerns (5 items), Personal Concerns (6 items), and School Concerns (4 items). Initial estimates of convergent validity for ACM scores were also reported. The four-factor structure of ACM scores derived from Study 1 was confirmed via confirmatory factor analysis in Study 2 using a two-fold cross-validation procedure with a separate sample of 811 adolescents. Support was found for both the multidimensional and hierarchical models of adolescent concerns using the ACM. Internal consistency and test-retest reliability estimates were adequate for research purposes. ACM scores show promise as a reliable and potentially valid measure of Asian adolescents' concerns.
Angulo, Diego A.; Schneider, Cyril; Oliver, James H.; Charpak, Nathalie; Hernandez, Jose T.
Brain research typically requires large amounts of data from different sources, often of different natures. The use of different software tools adapted to each data source can make research work cumbersome and time consuming. It follows that data are often not used to their fullest potential, limiting exploratory analysis. This paper presents an ancillary software tool called BRAVIZ that integrates interactive visualization with real-time statistical analyses, facilitating access to multi-faceted neuroscience data and automating many cumbersome and error-prone tasks required to explore such data. Rather than relying on abstract numerical indicators, BRAVIZ emphasizes brain images as the main object of the analysis process for individuals or groups. BRAVIZ facilitates exploration of trends and relationships to provide an integrated view of the phenomena studied, thus motivating discovery of new hypotheses. A case study is presented that incorporates brain structure and function outcomes together with different types of clinical data. PMID:27601990
Steege, Mark W.; Mace, F. Charles; Perry, Lora; Longenecker, Harold
We discuss the problem of autism-specific special education programs representing themselves as Applied Behavior Analysis (ABA) programs when the only ABA intervention employed is Discrete Trial Teaching (DTT), and often for limited portions of the school day. Although DTT has many advantages to recommend its use, it is not well suited to teach…
Iwata, Brian A.
To the extent that applied behavior analysis represents a scientific and practical approach to the study of behavior, its technological character is essential. The most serious problem evident in the field is not that the research being done is too technical but that more good research of all types is needed. (JDD)
Young, Jacy L; Green, Christopher D
In this article, we present the results of an exploratory digital analysis of the contents of the two journals founded in the late 19th century by American psychologist G. Stanley Hall. Using the methods of the increasingly popular digital humanities, some key attributes of the American Journal of Psychology (AJP) and the Pedagogical Seminary (PS) are identified. Our analysis reaffirms some of Hall's explicit aims for the two periodicals, while also revealing a number of other features of the journals, as well as of the people who published within their pages, the methodologies they employed, and the institutions at which they worked. Notably, despite Hall's intent that his psychological journal be strictly an outlet for scientific research, the journal, like its sister pedagogically focused publication, included an array of methodologically diverse research. The multiplicity of research styles that characterize the content of Hall's journals in their initial years is, in part, a consequence of individual researchers at times crossing methodological lines and producing a diverse body of research. Along with such variety within each periodical, it is evident that the line between content appropriate to one periodical rather than the other was fluid rather than absolute. The full results of this digitally informed analysis of Hall's two journals suggest a number of novel avenues for future research and demonstrate the utility of digital methods as applied to the history of psychology.
Lininger, Monica R; Smith, Craig A; Chimera, Nicole J; Hoog, Philipp; Warren, Meghan
Lininger, MR, Smith, CA, Chimera, NJ, Hoog, P, and Warren, M. Tuck Jump Assessment: An exploratory factor analysis in a college age population. J Strength Cond Res 31(3): 653-659, 2017-Due to the high rate of noncontact lower extremity injuries that occur in the collegiate setting, medical personnel are implementing screening mechanisms to identify athletes who may be at risk for certain injuries before starting a sports season. The tuck jump assessment (TJA) was created as a "clinician friendly" tool to identify lower extremity landing technique flaws during a plyometric activity. The TJA rates 10 technique flaws, each assessed as present or absent; the flaws are then summed for an overall score. Through expert consensus, these 10 technique flaws have been grouped into 5 modifiable risk factors: ligament dominance, quadriceps dominance, leg dominance or residual injury deficits, trunk dominance ("core" dysfunction), and technique perfection. Research has not investigated the psychometric properties of the TJA technique flaws or the modifiable risk factors. The present study is a psychometric analysis of the TJA technique flaws, measuring their internal structure with an exploratory factor analysis (EFA) using data from collegiate athletes (n = 90) and a general college cohort (n = 99). The EFA suggested a 3-factor model accounting for 46% of the variance. The 3 factors were defined as fatigue, distal landing pattern, and proximal control. These results differ from the 5 modifiable risk categories previously suggested and may call into question the use of a single, unidimensional TJA score for injury screening.
Muñoz, E; Argüelles, D; Areste, L; Miguel, L San; Prades, M
The medical records of 468 horses that underwent 490 exploratory laparotomies for the correction of gastrointestinal diseases were reviewed to search for differences between Andalusian horses and other breeds. The seasonal distribution of surgical colics and their outcome and complications were also investigated. Bivariate analysis was used to compare the horses' age, gender and breed with the type of surgery, the bowel affected and the type of colic, and all these variables were compared in relation to euthanasia during surgery, complications, short-term survival and seasonal distribution. A total of 405 horses survived the surgery and 329 were discharged from the hospital. Horses less than one year old had better short-term survival than older horses. Andalusian horses suffered more inguinal hernias than the other breeds and were more prone to suffer laminitis as a complication. Colic surgery and inguinal hernias were also more common in the summer.
Dembo, Richard; Wareham, Jennifer; Schmeidler, James; Winters, Ken C.
Research on samples of truant adolescents is limited, with little known about mental health problems among truant youths. This study provided an exploratory, multilevel examination of mental health problems for a sample of 300 truant adolescents. Confirmatory factor analysis indicated a single factor of multiple mental health problems at the…
Rupp, Andre A.
This article presents a novel exploratory multigroup approach that quantifies relative group differences within an item response theory framework using tools from functional data analysis. Specifically, examinee groups are formed using different clustering methodologies based on background and attitudinal variable profiles. Item parameters for the…
Dombrowski, Stefan C.
The Woodcock-Johnson-III cognitive in the adult time period (age 20 to 90 plus) was analyzed using exploratory bifactor analysis via the Schmid-Leiman orthogonalization procedure. The results of this study suggested possible overfactoring, a different factor structure from that posited in the Technical Manual and a lack of invariance across both…
Lorenzo-Seva, Urbano; Ferrando, Pere J.
FACTOR 9.2 was developed for three reasons. First, exploratory factor analysis (FA) is still an active field of research, although most recent developments have not been incorporated into available programs. Second, there is now renewed interest in semiconfirmatory (SC) solutions as suitable approaches to the complex structures that are commonly found…
Morey, R C; Fine, D J; Loree, S W; Retzlaff-Roberts, D L; Tsubakitani, S
The debate concerning quality of care in hospitals, its "value" and affordability, is increasingly of concern to providers, consumers, and purchasers in the United States and elsewhere. We undertook an exploratory study to estimate the impact on hospital-wide costs if quality-of-care levels were varied. To do so, we obtained costs and service output data regarding 300 U.S. hospitals, representing approximately a 5% cross section of all hospitals operating in 1983; both inpatient and outpatient services were included. The quality-of-care measure used for the exploratory analysis was the ratio of actual deaths in the hospital for the year in question to the forecasted number of deaths for the hospital; the hospital mortality forecaster had earlier (and elsewhere) been built from analyses of 6 million discharge abstracts, and took into account each hospital's actual individual admissions, including key patient descriptors for each admission. Such adjusted death rates have increasingly been used as potential indicators of quality, with recent research lending support for the viability of that linkage. The authors then utilized the economic construct of allocative efficiency relying on "best practices" concepts and peer groupings, built using the "envelopment" philosophy of Data Envelopment Analysis and Pareto efficiency. These analytical techniques estimated the efficiently delivered costs required to meet prespecified levels of quality of care. The marginal additional cost per each death deferred in 1983 was estimated to be approximately $29,000 (in 1990 dollars) for the average efficient hospital. Also, over a feasible range, a 1% increase in the level of quality of care delivered was estimated to increase hospital cost by an average of 1.34%. This estimated elasticity of quality on cost also increased with the number of beds in the hospital.
Wallington, Sherrie Flynt; Blake, Kelly D; Taylor-Clark, Kalahn; Viswanath, K
News coverage of health topics influences knowledge, attitudes, and behaviors at the individual level, and agendas and actions at the institutional and policy levels. Because disparities in health often are the result of social inequalities that require community-level or policy-level solutions, news stories employing a health disparities news frame may contribute to agenda-setting among opinion leaders and policymakers and lead to policy efforts aimed at reducing health disparities. The objective of this study was to conduct an exploratory analysis qualitatively describing the barriers that health journalists face when covering health disparities in local media. Between June and October 2007, 18 journalists from television, print, and radio in Boston, Lawrence, and Worcester, Massachusetts, were recruited using a purposive sampling technique. In-depth, semi-structured interviews were conducted by telephone, and the crystallization/immersion method was used to conduct a qualitative analysis of interview transcripts. Our results revealed that journalists consider several angles when developing health stories, including public impact and personal behavior change. Challenges to employing a health disparities frame included an inability to translate how research findings may affect different socioeconomic groups, and difficulty understanding how findings may translate across racial/ethnic groups. Several journalists reported that disparities-focused stories are "less palatable" for some audiences. This exploratory study offers insights into the challenges that local news media face in using health disparities news frames in their routine coverage of health news. Public health practitioners may use these findings to inform communication efforts with local media in order to advance the public dialogue about health disparities.
Maznah, Zainol; Halimah, Muhamad; Shitan, Mahendran; Kumar Karmokar, Provash; Najwa, Sulaiman
Ganoderma boninense is a fungus that can affect oil palm trees and cause a serious disease called basal stem rot (BSR). This disease causes the death of more than 80% of oil palm trees midway through their economic life, and hexaconazole is one fungicide that can control the fungus. Hexaconazole can be applied by the soil drenching method, and it is of interest to know the concentration of the residue in the soil after treatment with respect to time. Hence, a field study was conducted to determine the actual concentration of hexaconazole in soil. In the present paper, a new approach that can be used to predict the concentration of pesticides in the soil is proposed. The statistical analysis revealed that Exploratory Data Analysis (EDA) techniques were appropriate for this study. The EDA techniques were used to fit a robust resistant model and predict the concentration of the residue in the topmost layer of the soil. PMID:28060816
This thesis represents a step forward to bring geometry parameterization and control on par with the disciplinary analyses involved in shape optimization, particularly high-fidelity aerodynamic shape optimization. Central to the proposed methodology is the non-uniform rational B-spline, used here to develop a new geometry generator and geometry control system applicable to the aerodynamic design of both conventional and unconventional aircraft. The geometry generator adopts a component-based approach, where any number of predefined but modifiable (parametric) wing, fuselage, junction, etc., components can be arbitrarily assembled to generate the outer mold line of aircraft geometry. A unique Python-based user interface incorporating an interactive OpenGL windowing system is proposed. Together, these tools allow for the generation of high-quality, C2 continuous (or higher), and customized aircraft geometry with fast turnaround. The geometry control system tightly integrates shape parameterization with volume mesh movement using a two-level free-form deformation approach. The framework is augmented with axial curves, which are shown to be flexible and efficient at parameterizing wing systems of arbitrary topology. A key aspect of this methodology is that very large shape deformations can be achieved with only a few, intuitive control parameters. Shape deformation consumes a few tenths of a second on a single processor and surface sensitivities are machine accurate. The geometry control system is implemented within an existing aerodynamic optimizer comprising a flow solver for the Euler equations and a sequential quadratic programming optimizer. Gradients are evaluated exactly with discrete-adjoint variables. The algorithm is first validated by recovering an elliptical lift distribution on a rectangular wing, and then demonstrated through the exploratory shape optimization of a three-pronged feathered winglet leading to a span efficiency of 1.22 under a height
McGill, Ryan J.
The present study examined the structure of the Comprehensive Test of Nonverbal Intelligence-Second Edition (CTONI-2) normative sample using exploratory factor analysis, multiple factor extraction criteria, and higher-order exploratory factor analytic techniques that were not reported in the CTONI-2 "Examiner's Manual". Results…
Casarrubea, Maurizio; Sorbera, Filippina; Crescimanno, Giuseppe
The aim of the present paper is to use multivariate analysis to study the modifications induced by an environmental acoustic cue on the structure of rat exploratory behavior. Adult male Wistar rats were observed during the exploration of a soundproof observation box. Each rat was acoustically stimulated 150 s after the beginning of the 300 s experimental session and recorded with a digital video camera. A frame-by-frame analysis was then carried out using a professional video-recording system. Thirteen behavioral patterns were selected: immobility, immobile-sniffing, walking, rearing, climbing, chewing, paw-licking, face-grooming, body-grooming, head-turning, tuning, oriented-sniffing, focusing. Both descriptive and multivariate analyses (cluster, stochastic, adjusted residuals) were carried out. Through descriptive statistical analysis, latencies and percentage distribution of each pattern were studied. A multivariate cluster analysis revealed the presence of three main behavioral clusters, with an additional one identified following acoustic stimulation. Multivariate stochastic analysis showed that all the patterns converged on immobile-sniffing, which could represent a key component in behavioral switching processes related to environmental exploration. Moreover, adjusted residuals showed the degree of relationship among different patterns according to the Z distribution. Our data assign new ethological meanings to different behavioral patterns. Notably, head-turning is suggested to represent a generic directional search, and tuning a subtle activity of stimulus localization.
Huang, Ji-Xia; Wang, Jin-Feng; Li, Zhong-Jie; Wang, Yan; Lai, Sheng-Jie; Yang, Wei-Zhong
Objectives In epidemiological research, major studies have focused on theoretical models; however, few methods of visual analysis have been used to display the patterns of disease distribution. Design For this study, a method combining the space-time cube (STC) with space-time scan statistics (STSS) was used to analyze the pattern of incidence of hand-foot-mouth disease (HFMD) in Guangdong Province from May 2008 to March 2009. In this research, STC was used to display the spatiotemporal pattern of incidence of HFMD, and STSS were used to detect the local aggregations of the disease. Setting The hand-foot-mouth disease data were obtained from Guangdong Province from May 2008 to March 2009, with a total of 68,130 cases. Results The STC analysis revealed a differential pattern of HFMD incidence among different months and cities and also showed that the population density and average precipitation are correlated with the incidence of HFMD. The STSS analysis revealed that the most likely aggregation includes the Shenzhen, Foshan and Dongguan populations, which are the most developed regions in Guangdong Province. Conclusion Both STC and STSS are efficient tools for the exploratory data analysis of disease transmission. STC clearly displays the spatiotemporal patterns of disease. Using the maximum likelihood ratio, the STSS model precisely locates the most likely aggregation. PMID:26605919
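The window score behind the STSS "most likely aggregation" is, for the Poisson model, a closed-form log-likelihood ratio comparing observed with expected counts inside a space-time cylinder. A sketch with illustrative counts (not the Guangdong figures):

```python
import math

def poisson_llr(c: int, E: float, C: int) -> float:
    """Kulldorff-style Poisson log-likelihood ratio for a scan window:
    c observed and E expected cases inside the window, C total cases."""
    if c <= E:                     # only excess-risk windows score
        return 0.0
    return c * math.log(c / E) + (C - c) * math.log((C - c) / (C - E))

# A window holding twice its expected caseload scores strongly; a window
# exactly at its expected caseload scores zero.
print(round(poisson_llr(300, 150.0, 68130), 1))
print(poisson_llr(150, 150.0, 68130))
```

The scan maximizes this statistic over all candidate cylinders; significance is then usually assessed by Monte Carlo replication.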
Nettles, Jenesta Rae
The Electronic Quality of Inquiry Protocol (EQUIP) is an instrument to measure the amount and quality of inquiry instruction in science and mathematics classrooms. The EQUIP rubric includes detailed descriptions of the inquiry practices that it measures. This study analyzed a large dataset of Texas science teacher observations using discriminant factor analysis and exploratory factor analysis. The discriminant factor analysis found the inquiry instruction practices of primary (1st-5th grade) and secondary (6th-12th grade) science teachers to differ greatly, with primary science teachers using inquiry practices more often than secondary science teachers. Based on this information, the primary and secondary datasets were analyzed separately with the exploratory factor analysis. The resulting factors indicate a variety of differences in the observed practices. The results also suggest that discourse practices play a divergent but influential role in both datasets. Potential implications for professional development and future research are discussed.
Johnston, J.M; Foxx, Richard M; Jacobson, John W; Green, Gina; Mulick, James A
This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We also consider the features of PBS that have facilitated its broad dissemination and how ABA might benefit from emulating certain practices of the PBS movement. PMID:22478452
Langer, William D.; Wilson, Robert W.; Anderson, Charles H.
The structure of infrared cirrus clouds is analyzed with Laplacian pyramid transforms, a form of non-orthogonal wavelets. Pyramid and wavelet transforms provide a means to decompose images into their spatial frequency components such that all spatial scales are treated in an equivalent manner. The multiscale transform analysis is applied to IRAS 100 micrometer maps of cirrus emission in the north Galactic pole region to extract features on different scales. In the maps we identify filaments, fragments and clumps by separating all connected regions. These structures are analyzed with respect to their Hausdorff dimension for evidence of the scaling relationships in the cirrus clouds.
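A Laplacian pyramid keeps, at each scale, the difference between the image and an upsampled blur of itself, so every spatial-frequency band is isolated. A crude NumPy sketch (2x2 block averaging and nearest-neighbour expansion stand in for the Gaussian filtering used in practice):

```python
import numpy as np

def reduce_step(img: np.ndarray) -> np.ndarray:
    """Blur-and-downsample via 2x2 block averaging (crude REDUCE)."""
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    return img[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def laplacian_pyramid(img: np.ndarray, levels: int = 3) -> list:
    """Band-pass levels (image minus expanded blur) plus a coarse residual."""
    pyramid = []
    for _ in range(levels):
        small = reduce_step(img)
        up = np.kron(small, np.ones((2, 2)))   # nearest-neighbour EXPAND
        pyramid.append(img[:up.shape[0], :up.shape[1]] - up)
        img = small
    pyramid.append(img)                        # low-pass residual
    return pyramid

img = np.random.default_rng(3).normal(size=(64, 64))
pyr = laplacian_pyramid(img)
print([level.shape for level in pyr])
```

Because each level stores exactly what the expansion of the next level misses, summing the expanded levels reconstructs the image, the property that makes the decomposition suitable for treating all spatial scales equivalently.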
Barrick, Thomas R; Mackay, Clare E; Prima, Sylvain; Maes, Frederik; Vandermeulen, Dirk; Crow, Timothy J; Roberts, Neil
Leftward occipital and rightward frontal lobe asymmetry (brain torque) and leftward planum temporale asymmetry have been consistently reported in postmortem and in vivo neuroimaging studies of the human brain. Here automatic image analysis techniques are applied to quantify global and local asymmetries, and investigate the relationship between brain torque and planum temporale asymmetries on T1-weighted magnetic resonance (MR) images of 30 right-handed young healthy subjects (15 male, 15 female). Previously described automatic cerebral hemisphere extraction and 3D interhemispheric reflection-based methods for studying brain asymmetry are applied with a new technique, LowD (Low Dimension), which enables automatic quantification of brain torque. LowD integrates extracted left and right cerebral hemispheres in columns orthogonal to the midsagittal plane (2D column maps), and subsequently integrates slices along the brain's anterior-posterior axis (1D slice profiles). A torque index defined as the magnitude of occipital and frontal lobe asymmetry is computed allowing exploratory investigation of relationships between this global asymmetry and local asymmetries found in the planum temporale. LowD detected significant torque in the 30 subjects with occipital and frontal components found to be highly correlated (P<0.02). Significant leftward planum temporale asymmetry was detected (P<0.05), and the torque index correlated with planum temporale asymmetry (P<0.001). However, torque and total brain volume were not correlated. Therefore, although components of cerebral asymmetry may be related, their magnitude is not influenced by total hemisphere volume. LowD provides increased sensitivity for detection and quantification of brain torque on an individual subject basis, and future studies will apply these techniques to investigate the relationship between cerebral asymmetry and functional laterality.
Traditional safety analyses, such as HAZOP, FMEA, FTA, and MORT, are less than effective at identifying hazards resulting from incorrect 'flow', whether of information, actions, electric current, or even the literal flow of process fluids. Sneak Analysis (SA) has existed since the mid-1970s as a means of identifying such conditions in electric circuits, where it is usually known as Sneak Circuit Analysis (SCA). This paper extends the ideas of Sneak Circuit Analysis to a general method of Sneak Analysis applied to process plant. The methods of SA attempt to capitalize on previous work in the electrical field by first producing a pseudo-electrical analog of the process and then analyzing the analog with the existing techniques of SCA, supplemented by some additional rules and clues specific to processes. The SA method is not intended to replace any existing method of safety analysis; instead, it is intended to supplement techniques such as HAZOP and FMEA by providing systematic procedures for the identification of a class of potential problems that is not well covered by any other method.
Revicki, Dennis A.; Cook, Karon F.; Amtmann, Dagmar; Harnam, Neesha; Chen, Wen-Hung; Keefe, Francis J.
Purpose The assessment of pain sensation and quality is a key component in understanding the experience of individuals with chronic pain. This study evaluated the factor structure of the Patient-Reported Outcomes Measurement Information System (PROMIS) pain quality item bank. Methods As part of the PROMIS project, we developed a pool of 37 pain quality items, based on a review of existing pain questionnaires and development of new items. A Web-based survey was designed and completed by 845 members of the general population and 967 individuals with different types of chronic pain. Exploratory factor analysis (EFA) was conducted on a random split-half sample of the data to examine the factor structure of the 37 PROMIS pain quality items in the general population and in a chronic pain sample. A confirmatory factor analysis was conducted in the holdout sample. Results The EFA of the pain quality items resulted in comparable six-factor solutions for the general and chronic pain samples: (1) pulling/tugging pain; (2) tingling/numbness pain; (3) sharp/stabbing pain; (4) dull/aching pain; (5) pounding/pulsing pain; and (6) affective pain. The confirmatory factor analysis in the holdout sample supported this factor structure. Conclusions Further research is needed to evaluate the psychometric characteristics of the derived scales based on their factor scores. PMID:23836435
Schaefer, C.; Young, M.; Mason, S.; Coble, C.; Wear, M. L.; Sargsyan, A.; Garcia, K.; Law, J.; Alexander, D.; Ryder, V. Myers; Van Baalen, M.
Carbon dioxide (CO2) levels on ISS have typically averaged 2.3 to 5.3 mm Hg, with large fluctuations occurring over periods of hours and days. CO2 affects cerebral vascular tone, resulting in vasodilation and alteration of cerebral blood flow (CBF). Increased CBF leads to elevated intracranial pressure (ICP), which is a factor leading to visual disturbance, headaches, and other central nervous system symptoms. Ultrasound of the optic nerve provides a surrogate measurement of ICP. Inflight ultrasounds were implemented as an enhanced screening tool for the Visual Impairment/Intracranial Pressure (VIIP) Syndrome. This analysis examines the relationships between ambient CO2 levels on ISS and ultrasound measures of the eye in an effort to understand how CO2 may be associated with VIIP and to inform future analysis of inflight VIIP data. Results: as shown in Figure 2, there was a large timeframe, June 2011 to January 2012, where CO2 readings were removed due to sensor fault errors (see Limitations). After extensive cleaning of the CO2 data, metrics for all of the data were calculated (Table 2). Preliminary analyses showed possible associations between variability measures of CO2 and AP diameter (Figure 3), and between average CO2 exposure and ONSD (Figure 4). Adjustments for multiple comparisons were not made due to the exploratory nature of the analysis.
Trumbo, Toni A; Schultz, Emeric; Borland, Michael G; Pugh, Michael Eugene
Spectrophotometric analysis is essential for determining the biomolecule concentration of a solution and is employed ubiquitously in biochemistry and molecular biology. The Beer-Lambert-Bouguer Law is routinely applied to determine the concentration of DNA, RNA or protein. There is, however, a significant difference between determining the concentration of a given species (RNA, DNA, protein) in isolation (a contrived circumstance) and determining that concentration in the presence of other species (a more realistic situation). To present the student with a more realistic laboratory experience, and to fill a gap that we believe exists in student experience prior to reaching a biochemistry course, we have devised a three-week laboratory experience designed so that students learn to: connect laboratory practice with theory; apply the Beer-Lambert-Bouguer Law to biochemical analyses; demonstrate the utility and limitations of example quantitative colorimetric assays; demonstrate the utility and limitations of UV analyses for biomolecules; develop strategies for analysis of a solution of unknown biomolecular composition; use digital micropipettors to make accurate and precise measurements; and apply graphing software.
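The mixture situation the laboratory targets reduces to solving simultaneous Beer-Lambert equations, A(λ) = Σ ε_i(λ) l c_i, at as many wavelengths as species. A sketch with hypothetical molar absorptivities (the numbers below are illustrative, not values for any real biomolecule):

```python
import numpy as np

# Hypothetical molar absorptivities (M^-1 cm^-1): rows are two wavelengths,
# columns are the two species; path length l = 1 cm.
eps = np.array([[8000.0, 3000.0],
                [4000.0, 5500.0]])
A = np.array([0.85, 0.62])     # measured absorbances at the two wavelengths

# Beer-Lambert in matrix form, A = eps @ c, solved for the concentrations.
c = np.linalg.solve(eps, A)
print(np.round(c * 1e6, 1))    # micromolar
```

Measuring at one wavelength per species and inverting the absorptivity matrix is exactly the strategy students can develop for an unknown two-component solution.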
Sen Sarma, Moushumi; Arcoleo, David; Khetani, Radhika S; Chee, Brant; Ling, Xu; He, Xin; Jiang, Jing; Mei, Qiaozhu; Zhai, ChengXiang; Schatz, Bruce
With the rapid decrease in cost of genome sequencing, the classification of gene function is becoming a primary problem. Such classification has been performed by human curators who read biological literature to extract evidence. BeeSpace Navigator is a prototype software for exploratory analysis of gene function using biological literature. The software supports an automatic analogue of the curator process to extract functions, with a simple interface intended for all biologists. Since extraction is done on selected collections that are semantically indexed into conceptual spaces, the curation can be task specific. Biological literature containing references to gene lists from expression experiments can be analyzed to extract concepts that are computational equivalents of a classification such as Gene Ontology, yielding discriminating concepts that differentiate gene mentions from other mentions. The functions of individual genes can be summarized from sentences in biological literature, to produce results resembling a model organism database entry that is automatically computed. Statistical frequency analysis based on literature phrase extraction generates offline semantic indexes to support these gene function services. The website with BeeSpace Navigator is free and open to all; there is no login requirement at www.beespace.illinois.edu for version 4. Materials from the 2010 BeeSpace Software Training Workshop are available at www.beespace.illinois.edu/bstwmaterials.php.
Zandieh, Ali; Kahaki, Zahra Zeynali; Sadeghian, Homa; Pourashraf, Maryam; Parviz, Sara; Ghaffarpour, Majid; Ghabaee, Mojdeh
The underlying structure of the National Institutes of Health Stroke Scale (NIHSS), the most widely used scale in clinical trials, has received little attention. The aim of the current study was to elucidate the clustering pattern of NIHSS items in ischemic stroke patients. A series of 152 consecutive patients with first-ever ischemic strokes admitted to a university-affiliated hospital were enrolled. NIHSS score was estimated on admission and correlation coefficients between its items were calculated. Further, exploratory factor analysis was used to study the clustering pattern of NIHSS items. Extinction/neglect, visual field, and facial palsy were weakly associated with other NIHSS items. Factor analysis led to a four-factor structure. Factors 1 and 3 were determined by left brain function, as the items of right arm and leg motor, language, and dysarthria loaded on both of them. By contrast, factor 2 reflected right brain involvement. Since visual field and ataxia loaded on factor 4, this factor was primarily associated with posterior strokes. Our study shows that a four-factor structure model is plausible for NIHSS. Further, for the first time, a single distinct factor is identified for posterior strokes.
Farrokhi, Farahman; Mahdavi, Ali; Moradi, Samad
Objective The present study aimed at validating the structure of Career Decision-making Difficulties Questionnaire (CDDQ). Methods Five hundred and eleven undergraduate students took part in this research; from these participants, 63 males and 200 females took part in the first study, and 63 males and 185 females completed the survey for the second study. Results The results of exploratory factor analysis (EFA) indicated strong support for the three-factor structure, consisting of lack of information about the self, inconsistent information, lack of information and lack of readiness factors. A confirmatory factor analysis was run with the second sample using structural equation modeling. As expected, the three-factor solution provided a better fit to the data than the alternative models. Conclusion CDDQ was recommended to be used for college students in this study due to the fact that this instrument measures all three aspects of the model. Future research is needed to learn whether this model would fit other different samples. PMID:22952549
Contextual influence on health outcomes is increasingly becoming an important area of research. Analytical techniques such as spatial analysis help explain the variations and dynamics in health inequalities across different contexts and among different population groups. This paper explores spatial clustering in body mass index among Ghanaian women by analysing data from the 2008 Ghana Demographic and Health Survey using exploratory spatial data analysis techniques. Overweight was a more common occurrence in urban areas than in rural areas. Close to a quarter of the clusters in Ghana, mostly those in the southern sector, contained women who were overweight. Women who lived in clusters where the women were overweight were more likely to live around other clusters where the women were also overweight. The results suggest that the urban environment could be a potential contributing factor to the high levels of obesity in urban areas of Ghana. There is a need for researchers to include a spatial dimension in obesity research in Ghana, paying particular attention to the urban environment.
Driban, Jeffrey B.; Lo, Grace H.; Eaton, Charles B.; Lapane, Kate L.; Nevitt, Michael; Harvey, William F.; McCulloch, Charles E.; McAlindon, Timothy E.
Background: We conducted an exploratory analysis of osteoarthritis progression among medication users in the Osteoarthritis Initiative to identify interventions or pathways that may be associated with disease modification and therefore of interest for future clinical trials. Methods: We used participants from the Osteoarthritis Initiative with annual medication inventory data between the baseline and 36-month follow-up visit (n = 2938). Consistent medication users were defined for each medication classification as a participant reporting at all four annual visits that they were regularly using an oral prescription medication at the time of the visit. The exploratory analysis focused on medication classes with 40 or more users. The primary outcome measures were medial tibiofemoral joint space width change and the Western Ontario and McMaster Universities Arthritis Index (WOMAC) knee pain score change (12–36-month visits). Within each knee, we explored eight comparisons between users and matched or unmatched nonusers (defined two ways). An effect size of each comparison was calculated. Medication classes had potential signals if (a) both knees had less progression among users compared with nonusers, or (b) there was less progression based on structure and symptoms in one knee. Results: We screened 28 medication classes. Six medication classes had signals for fewer structural changes and better knee pain changes: alpha-adrenergic blockers, antilipemic (excluding statins and fibric acid), anticoagulants, selective serotonin reuptake inhibitors, antihistamines, and antineoplastic agents. Four medication classes had signals for structural changes alone: anti-estrogen (median effect size = 0.28; range = −0.41–0.64), angiotensin-converting enzyme inhibitors (median effect size = 0.13; range = −0.08–0.28), beta-adrenergic blockers (median effect size = 0.09; range = 0.01–0.30), and thyroid agents (median effect size = 0.04; range = −0.05–0.14). Thiazide
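The per-comparison effect sizes reported above (e.g., median effect size = 0.28 for anti-estrogens) are standardized mean differences between users and nonusers. A minimal sketch of one common such statistic, Cohen's d with a pooled standard deviation, follows; the numbers are invented for illustration and are not data from the Osteoarthritis Initiative.

```python
import numpy as np

def cohens_d(group_a, group_b):
    """Standardized mean difference between two groups, using the pooled SD."""
    n1, n2 = len(group_a), len(group_b)
    pooled_sd = np.sqrt(((n1 - 1) * np.var(group_a, ddof=1) +
                         (n2 - 1) * np.var(group_b, ddof=1)) / (n1 + n2 - 2))
    return (np.mean(group_a) - np.mean(group_b)) / pooled_sd

# Hypothetical 24-month changes in joint space width (mm); less negative = less progression.
users = np.array([-0.10, -0.05, -0.20, 0.00, -0.15])
nonusers = np.array([-0.30, -0.25, -0.40, -0.20, -0.35])
d = cohens_d(users, nonusers)  # positive d here means less narrowing among users
```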
Mello, Rafael Barrios; Silva, Maria Regina Regis; Alves, Maria Teresa Seixas; Evison, Martin Paul; Guimarães, Marco Aurelio; Francisco, Rafaella Arrabaca; Astolphi, Rafael Dias; Iwamura, Edna Sadayo Miazato
Taphonomic processes affecting bone post mortem are important in forensic, archaeological and palaeontological investigations. In this study, the application of tissue microarray (TMA) analysis to a sample of femoral bone specimens from 20 exhumed individuals of known period of burial and age at death is described. TMA allows multiplexing of subsamples, permitting standardized comparative analysis of adjacent sections in 3-D and of representative cross-sections of a large number of specimens. Standard hematoxylin and eosin, periodic acid-Schiff and silver methenamine, and picrosirius red staining, and CD31 and CD34 immunohistochemistry were applied to TMA sections. Osteocyte and osteocyte lacuna counts, percent bone matrix loss, and fungal spheroid element counts could be measured and collagen fibre bundles observed in all specimens. Decalcification with 7% nitric acid proceeded more rapidly than with 0.5 M EDTA and may offer better preservation of histological and cellular structure. No endothelial cells could be detected using CD31 and CD34 immunohistochemistry. Correlation between osteocytes per lacuna and age at death may reflect reported age-related responses to microdamage. Methodological limitations and caveats, and results of the TMA analysis of post mortem diagenesis in bone are discussed, and implications for DNA survival and recovery considered. PMID:28051148
Background Given the serious threats posed to terrestrial ecosystems by industrial contamination, environmental monitoring is a standard procedure used for assessing the current status of an environment or trends in environmental parameters. Measurement of metal concentrations at different trophic levels, followed by their statistical analysis using exploratory multivariate methods, can provide meaningful information on the status of environmental quality. In this context, the present paper proposes a novel chemometric approach to standard statistical methods by combining block clustering with partial least squares (PLS) analysis to investigate the accumulation patterns of metals in anthropized terrestrial ecosystems. The present study focused on copper, zinc, manganese, iron, cobalt, cadmium, nickel, and lead transfer along a soil-plant-snail food chain, and the hepatopancreas of the Roman snail (Helix pomatia) was used as a biological end-point of metal accumulation. Results Block clustering delineated the areas exposed to industrial and vehicular contamination. The toxic metals have similar distributions in the nettle leaves and snail hepatopancreas. PLS analysis showed that (1) zinc and copper concentrations at the lower trophic levels are the most important latent factors that contribute to metal accumulation in land snails; (2) cadmium and lead are the main determinants of the pollution pattern in areas exposed to industrial contamination; (3) at sites located near roads, lead is the most threatening metal for terrestrial ecosystems. Conclusion There were three major benefits to applying block clustering with PLS for processing the obtained data: firstly, it helped in grouping sites depending on the type of contamination. Secondly, it was valuable for identifying the latent factors that contribute the most to metal accumulation in land snails. Finally, it optimized the number and type of data that are best for monitoring the status of metallic
Fawcett, Stephen B.
Distinct types of activity in the field of applied behavior analysis are noted and discussed. Four metaphorical types of activity are considered: prospecting, farming, building, and guiding. Prospecting consists of time-limited exploration of a variety of behaviors, populations, or settings. Farming consists of producing new behaviors in the same setting using independent variables provided by the researchers or normally available in the setting. Building consists of combining procedural elements to create new programs or systems or to rehabilitate aspects of existing programs. Guiding involves pointing out connections between the principles of human behavior and the problems, populations, settings, and procedures with which researchers are (or could be) working. Advantages of each sphere are noted, and benefits of this division of labor to the field as a whole are discussed. PMID:22478631
Holzinger, M.; Scheeres, D.
Several existing and emerging applications of Space Situational Awareness (SSA) relate directly to spacecraft Rendezvous, Proximity Operations, and Docking (RPOD) and Formation/Cluster Flight (FCF). When multiple Resident Space Objects (RSOs) are in the vicinity of one another with appreciable periods between observations, correlating new RSO tracks to previously known objects becomes a non-trivial problem. A particularly difficult sub-problem arises when long breaks in observations are coupled with continuous, low-thrust maneuvers. Reachability theory, directly related to optimal control theory, can compute contiguous reachability sets for known or estimated control authority and can support such RSO search and correlation efforts in both ground and on-board settings. Reachability analysis can also directly estimate the minimum control authority of a given RSO. For RPOD and FCF applications, emerging mission concepts such as fractionation drastically increase the system complexity of on-board autonomous fault management systems. Reachability theory, as applied to SSA in RPOD and FCF applications, can involve correlation of nearby RSO observations, control authority estimation, and sensor track re-acquisition. Additional uses of reachability analysis are formation reconfiguration, worst-case passive safety, and propulsion failure modes such as a "stuck" thruster. Existing reachability theory is applied to RPOD and FCF regimes. An optimal control policy is developed to maximize the reachability set, and optimal control law discontinuities (switching) are examined. The Clohessy-Wiltshire linearized equations of motion are normalized to accentuate relative control authority for spacecraft propulsion systems at both Low Earth Orbit (LEO) and Geostationary Earth Orbit (GEO). Several examples with traditional and low-thrust propulsion systems in LEO and GEO are explored to illustrate the effects of relative control authority on the time-varying reachability set surface. Both
The Ohio Department of Transportation has more than 60 facilities without sewer access generating approximately 19 million gallons of winter maintenance wash water. Off-site disposal is costly, creating the need for sustainable management strategies. The objective of this study was to conduct an exploratory feasibility analysis to assess wash water disposal and potential reuse as brine. Based on a comprehensive literature review and relevant environmental chemistry, a sampling protocol consisting of 31 water quality constituents was utilized for monthly sampling at three geographically distinct Ohio Department of Transportation garages during the winter of 2012. Results were compared to local disposal and reuse guidance limits. Three constituents, including a maximum copper concentration of 858 ppb, exceeded disposal limits, and many constituents also failed to meet reuse limits. Some concentrations were orders of magnitude higher than reuse limits and suggest pre-treatment would be necessary if wash water were reused as brine. These water quality results, in conjunction with copper chemical equilibrium modeling, show pH and dissolved carbon both significantly impact the total dissolved copper concentration and should be measured to assess reuse potential. The sampling protocol and specific obstacles highlighted in this paper aid in the future development of sustainable wash water management strategies. PMID:26908148
McCorkle, Douglas S.; Bryden, Kenneth M.
Several recent reports and workshops have identified integrated computational engineering as an emerging technology with the potential to transform engineering design. The goal is to integrate geometric models, analyses, simulations, optimization and decision-making tools, and all other aspects of the engineering process into a shared, interactive computer-generated environment that facilitates multidisciplinary and collaborative engineering. While integrated computational engineering environments can be constructed from scratch with high-level programming languages, the complexity of these proposed environments makes this type of approach prohibitively slow and expensive. Rather, a high-level software framework is needed to provide the user with the capability to construct an application in an intuitive manner using existing models and engineering tools with minimal programming. In this paper, we present an exploratory open source software framework that can be used to integrate the geometric models, computational fluid dynamics (CFD), and optimization tools needed for shape optimization of complex systems. This framework is demonstrated using the multiphase flow analysis of a complete coal transport system for an 800 MW pulverized coal power station. The framework uses engineering objects and three-dimensional visualization to enable the user to interactively design and optimize the performance of the coal transport system.
Lu, Weidong; Hu, David; Dean-Clower, Elizabeth; Doherty-Gilman, Anne; Legedza, Anna T R; Lee, Hang; Matulonis, Ursula; Rosenthal, David S
Chemotherapy-induced leukopenia and neutropenia are common side effects during cancer treatment. Acupuncture has been reported as an adjunct therapy for this complication. The current study reviewed published randomized controlled trials of acupuncture's effect and explored the acupuncture parameters used in these trials. We searched biomedical databases in English and Chinese from 1979 to 2004. The study populations were cancer patients who were undergoing or had just completed chemotherapy or chemoradiotherapy, randomized to either acupuncture therapy or usual care. The methodologic quality of trials was assessed. From 33 reviewed articles, 682 patients from 11 eligible trials were included in analyses. All trials were published in non-PubMed journals from China. The methodologic quality of these trials was considerably poor. The median sample size of each comparison group was 45, and the median trial duration was 21 days. The frequency of acupuncture treatment was once a day, with a median of 16 sessions in each trial. In the seven trials in which white blood cell (WBC) counts were available, acupuncture use was associated with an increase in leukocytes in patients during chemotherapy or chemoradiotherapy, with a weighted mean difference of 1,221 WBC/μL on average (95% confidence interval 636-1,807; p < .0001). Acupuncture for chemotherapy-induced leukopenia is an intriguing clinical question. However, the inferior quality and publication bias present in these studies may lead to a false-positive estimation. Meta-analysis based on these published trials should be treated as exploratory in nature only.
Zhou, Ajian; Liu, Yue; Zhao, Ying; Zhang, Li; Sun, Leilei; Du, Shiyu; Yang, Qiang; Song, Xin; Liang, Chaoyang
Background. Traditional Chinese medicine (TCM) has long been used to treat chronic atrophic gastritis (CAG). The aim of the present study was to evaluate the TCM syndrome characteristics of CAG and its core pathogenesis so as to promote optimization of treatment strategies. Methods. This study was based on a participant survey conducted in 4 hospitals in China. Patients diagnosed with CAG were recruited by simple random sampling. Exploratory factor analysis (EFA) was conducted on syndrome extraction. Results. Common factors extracted were assigned to six syndrome patterns: qi deficiency, qi stagnation, blood stasis, phlegm turbidity, heat, and yang deficiency. Distribution frequency of all syndrome patterns showed that qi deficiency, qi stagnation, blood stasis, phlegm turbidity, and heat excess were higher (76.7%–84.2%) compared with yang deficiency (42.5%). Distribution of main syndrome patterns showed that frequencies of qi deficiency, qi stagnation, phlegm turbidity, heat, and yang deficiency were higher (15.8%–20.8%) compared with blood stasis (8.3%). Conclusions. The core pathogenesis of CAG is combination of qi deficiency, qi stagnation, blood stasis, phlegm turbidity, heat, and yang deficiency. Therefore, treatment strategy of herbal prescriptions for CAG should include herbs that regulate qi, activate blood, resolve turbidity, clear heat, remove toxin, and warm yang. PMID:28077948
Chagas, Mauro H.; Magalhães, Fabrício A.; Peixoto, Gustavo H. C.; Pereira, Beatriz M.; Andrade, André G. P.; Menzel, Hans-Joachim K.
ABSTRACT Background Stretching exercises are able to promote adaptations in the muscle-tendon unit (MTU), which can be tested through physiological and biomechanical variables. Identifying the key variables in MTU adaptations is crucial to improvements in training. Objective To perform an exploratory factor analysis (EFA) involving the variables often used to evaluate the response of the MTU to stretching exercises. Method Maximum joint range of motion (ROMMAX), ROM at first sensation of stretching (FSTROM), peak torque (torqueMAX), passive stiffness, normalized stiffness, passive energy, and normalized energy were investigated in 36 participants during passive knee extension on an isokinetic dynamometer. Stiffness and energy values were normalized by the muscle cross-sectional area and their passive mode assured by monitoring the EMG activity. Results EFA revealed two major factors that explained 89.68% of the total variance: 53.13% was explained by the variables torqueMAX, passive stiffness, normalized stiffness, passive energy, and normalized energy, whereas the remaining 36.55% was explained by the variables ROMMAX and FSTROM. Conclusion This result supports the literature wherein two main hypotheses (mechanical and sensory theories) have been suggested to describe the adaptations of the MTU to stretching exercises. Contrary to some studies, in the present investigation torqueMAX was significantly correlated with the variables of the mechanical theory rather than those of the sensory theory. Therefore, a new approach was proposed to explain the behavior of the torqueMAX during stretching exercises. PMID:27437715
Gazis, P. R.; Levit, C.; Way, M. J.
Scientific data sets continue to increase in both size and complexity. In the past, dedicated graphics systems at supercomputing centers were required to visualize large data sets, but as the price of commodity graphics hardware has dropped and its capability has increased, it is now possible, in principle, to view large complex data sets on a single workstation. To do this in practice, an investigator will need software that is written to take advantage of the relevant graphics hardware. The Viewpoints visualization package described herein is an example of such software. Viewpoints is an interactive tool for exploratory visual analysis of large high-dimensional (multivariate) data. It leverages the capabilities of modern graphics boards (GPUs) to run on a single workstation or laptop. Viewpoints is minimalist: it attempts to do a small set of useful things very well (or at least very quickly) in comparison with similar packages today. Its basic feature set includes linked scatter plots with brushing, dynamic histograms, normalization, and outlier detection/removal. Viewpoints was originally designed for astrophysicists, but it has since been used in a variety of fields ranging from astronomy, quantum chemistry, fluid dynamics, machine learning, bioinformatics, and finance to information technology server log mining. In this article, we describe the Viewpoints package and show examples of its usage.
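Two of the basic operations named above, normalization and outlier detection/removal, can be sketched in a few lines of NumPy. This is not Viewpoints' implementation, just a generic z-score version on synthetic data; the 4-sigma threshold is an arbitrary illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical multivariate data set: rows are points, columns are variables.
data = rng.normal(size=(1000, 5))
data[::200] += 8.0  # inject a handful of gross outliers

# Normalization: rescale each variable to zero mean and unit variance.
z = (data - data.mean(axis=0)) / data.std(axis=0)

# Outlier removal: drop any point more than 4 sigma out in any variable.
mask = (np.abs(z) < 4.0).all(axis=1)
cleaned = data[mask]
```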
Norris, Emma; Myers, Lynn
Despite the evident protective value of motorcycle personal protective equipment (PPE), no research has assessed the considerations behind its uptake in UK riders. A cross-sectional online questionnaire design was employed, with riders (n=268) recruited from online motorcycle forums. Principal component analysis found four PPE behavioural outcomes. Theoretical factors of intentions, attitudes, injunctive and descriptive subjective norms, risk perceptions, anticipated regret, benefits and habit were also identified for further analysis. High motorcycle jacket, trousers and boots wear, middling high-visibility wear and low non-PPE wear were found. Greater intentions, anticipated regret and perceived benefits were significantly associated with increased motorcycle jacket, trousers and boots wear, with habit presence and scooter use significantly associated with increased high-visibility wear. Lower intentions, anticipated regret and risk perceptions, being female, not holding a car licence and urban riding were significantly associated with increased non-PPE wear. A need for freedom of choice and mixed attitudes towards PPE use were evident in additional comments. PPE determinants in this sample provide a preliminary basis for future uptake interventions. Larger-scale and qualitative research is needed to further investigate the relevant constructs.
Di Giuseppe, M. G.; Troiano, A.; Troise, C.; De Natale, G.
Multi-parameter acquisition is common geophysical field practice nowadays. Seismic velocity and attenuation, gravity, and electromagnetic datasets are regularly acquired in a given area to obtain a complete characterization of some feature of the subsoil under investigation. Such richness of information is often underexploited, although integrating the analyses could notably improve the imaging of the investigated structures, mostly because the joint handling and inversion of distinct parameters still presents several severe problems. Post-inversion statistical techniques represent a promising approach to these questions, providing a quick, simple and elegant way to obtain this advantageous but complex integration. We present an approach based on partitioning the analyzed multi-parameter dataset into a number of distinct classes, identified as localized regions of high correlation. These classes, or 'clusters', are structured such that the observations belonging to a given group are more similar to each other than to observations belonging to a different one, according to an optimal logical criterion. Regions of the subsoil sharing the same physical characteristics are thereby identified, without any a priori or empirical relationship linking the distinct measured parameters. The retrieved imaging is highly reliable in a statistical sense, specifically because it dispenses with the external hypotheses that are indispensable in a full joint inversion, where they in fact act as a real constraint on the inversion process and are not seldom of questionable consistency. We apply our procedure to a number of experimental datasets related to several structures, at very different scales, present in the Campanian district (southern Italy). These structures range from the shallow evidence of the active fault zone that originated the M 7.9 Irpinia earthquake to the main features characterizing the Campi Flegrei caldera and the Mt
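The partitioning idea described above, grouping co-located model cells by the similarity of several geophysical parameters at once, can be sketched with scikit-learn. The parameters, the synthetic data, and the use of plain k-means are illustrative assumptions; the authors' actual clustering criterion is not specified here.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
# Hypothetical co-located subsoil parameters, one row per model cell:
# seismic velocity, attenuation, density contrast (gravity), resistivity.
params = rng.normal(size=(500, 4))
params[:250] += [2.0, -1.0, 1.5, 0.5]  # cells sharing a common lithology

# Standardize so that no single parameter dominates, then partition into classes.
X = StandardScaler().fit_transform(params)
classes = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
```

Cells assigned to the same class share a similar multi-parameter signature, without any prior physical relationship linking the parameters.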
Belzer, David B.
The Department of Energy's (DOE) Building Technologies Program (BTP) has had an active research program supporting the development of electrochromic (EC) windows. Electrochromic glazings used in these windows have the capability of varying the transmittance of light and heat in response to an applied voltage. This dynamic property allows these windows to reduce lighting, cooling, and heating energy in buildings where they are employed. The exploratory analysis described in this report examined three different variants of EC glazings, characterized by the amount of visible light and solar heat gain (as measured by the solar heat gain coefficients [SHGC] in their "clear" or transparent states). For these EC glazings, the dynamic ranges of the SHGCs between their "dark" (or tinted) state and the clear state were: 0.22-0.70 (termed "high" SHGC); 0.16-0.39 (termed "low" SHGC); and 0.13-0.19 (termed "very low" SHGC). These glazings are compared to conventional (static) glazing that meets the ASHRAE Standard 90.1-2004 energy standard for five different locations in the U.S. All analysis used the EnergyPlus building energy simulation program for modeling EC windows and alternative control strategies. The simulations were conducted for a small and a medium office building, with engineering specifications taken from the set of Commercial Building Benchmark building models developed by BTP. On the basis of these simulations, total source-level savings in these buildings were estimated to range between 2 and 7%, depending on the amount of window area and building location.
Sakaluk, John K; Short, Stephen D
Sexuality researchers frequently use exploratory factor analysis (EFA) to illuminate the distinguishable theoretical constructs assessed by a set of variables. EFA entails a substantive number of analytic decisions to be made with respect to sample size determination, and how factors are extracted, rotated, and retained. The available analytic options, however, are not all equally empirically rigorous. We discuss the commonly available options for conducting EFA and which options constitute best practices for EFA. We also present the results of a methodological review of the analytic options for EFA used by sexuality researchers in more than 200 EFAs, published in more than 160 articles and chapters from 1974 to 2014, in a sample of sexuality research journals. Our review reveals that best practices for EFA are actually those least frequently used by sexuality researchers. We introduce freely available analytic resources to help make it easier for sexuality researchers to adhere to best practices when conducting EFAs in their own research.
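The analytic decisions the review above enumerates (extraction, rotation, retention) can be made concrete with a minimal scikit-learn sketch. This is one illustrative pipeline on synthetic item responses, not a statement of the authors' recommended best practices; scikit-learn's `FactorAnalysis` with varimax rotation (available since version 0.24) is used here, and formal retention procedures such as parallel analysis are omitted.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(3)
# Hypothetical item responses: two latent constructs, three items each.
latent = rng.normal(size=(300, 2))
loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.7, 0.0],
                     [0.0, 0.9], [0.1, 0.8], [0.0, 0.7]])
items = latent @ loadings.T + rng.normal(scale=0.3, size=(300, 6))

# Extraction of two factors with an orthogonal varimax rotation.
fa = FactorAnalysis(n_components=2, rotation="varimax").fit(items)
est_loadings = fa.components_.T  # rows = items, columns = factors
```

Inspecting `est_loadings` shows which items load on which factor; retention decisions (how many factors to keep) would be made before this step, ideally with parallel analysis rather than ad hoc rules.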
Anderson, D E; Cornwell, D; St-Jean, G; Desrochers, A; Anderson, L S
The effect of right paralumbar fossa exploratory celiotomy and omentopexy on peritoneal fluid constituents was studied in 22 adult dairy cows. Six cows were eliminated on the basis of physical examination findings (n = 2), surgical findings (n = 2), or inability to obtain a sufficient volume of peritoneal fluid (n = 2). Sixteen cattle had normal results of CBC and serum biochemical analysis, and a minimum of 1 ml of peritoneal fluid was obtained by abdominocentesis. Abdominocentesis was repeated on days 1, 2, and 6 after surgery. Statistical analysis for repeated measures was performed, using a significance level of P < 0.05. Stage of gestation was evaluated for interaction with time. Mean total nucleated cell count was 3,200 cells/microliters before surgery, was significantly increased 2 days after surgery (16,336 cells/microliters), and continued to increase through day 6 (20,542 cells/microliters). Mean polymorphonuclear cell count was 1,312 cells/microliters before surgery and was significantly higher at 2 (11,043 cells/microliters) and 6 (10,619 cells/microliters) days after surgery. Mean lymphocyte count was 254 cells/microliters before surgery and was significantly increased 2 days (1,911 cells/microliters) after surgery. By day 6, lymphocyte numbers were similar to preoperative values. Mean mononuclear cell count was 770 cells/microliters before surgery and was significantly increased on days 1 (3,084 cells/microliters), 2 (3,285 cells/microliters), and 6 (2,349 cells/microliters) after surgery. Mean eosinophil numbers were 1,388 cells/microliters before surgery and were significantly increased on day 6 (6,347 cells/microliters) only. Interaction between time and stage of gestation was found only for specific gravity and total protein concentration.(ABSTRACT TRUNCATED AT 250 WORDS)
Habash Krause, L.
As the amount of observational Earth and Space Science data grows, so does the need for learning and employing data analysis techniques that can extract meaningful information from those data. Space-based and ground-based data sources from all over the world are used to inform Earth and Space environment models. With such a large amount of data, however, comes a need to organize those data so that trends within them are easily discernible. This can be tricky because interacting physical processes lead to partial correlation of variables or multiple interacting sources of causality. With the suite of Exploratory Data Analysis (EDA) data mining codes available at MSFC, we have the capability to analyze large, complex data sets and quantitatively identify fundamentally independent effects from consequential or derived effects. We have used these techniques to examine the accuracy of ionospheric climate models with respect to trends in ionospheric parameters and space weather effects. In particular, these codes have been used to 1) provide summary "at-a-glance" surveys of large data sets, through categorization and/or evolution over time, to identify trends, distribution shapes, and outliers; 2) discern the underlying "latent" variables which share common sources of causality; and 3) establish a new set of basis vectors by computing Empirical Orthogonal Functions (EOFs), which represent the maximum amount of variance for each principal component. Some of these techniques are easily implemented in the classroom using standard MATLAB functions; some of the more advanced applications require the statistical toolbox; and applications to unique situations require more sophisticated levels of programming. This paper will present an overview of the range of tools available and how they might be used for a variety of time series Earth and Space Science data sets. Examples of feature recognition from both 1D and 2D (e.g., imagery) time series data will be presented.
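The EOF computation described in item 3) above reduces to an SVD of the anomaly matrix. A sketch in Python (the synthetic "field" is illustrative, not MSFC data):

```python
import numpy as np

def eofs(field, n_modes=2):
    """Empirical Orthogonal Functions of a (time x space) field via SVD of
    the anomaly matrix. Returns spatial modes, principal-component time
    series, and the fraction of variance explained by each mode."""
    anomalies = field - field.mean(axis=0)          # remove the time mean
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    variance_frac = s**2 / np.sum(s**2)
    pcs = u[:, :n_modes] * s[:n_modes]              # PC time series
    modes = vt[:n_modes]                            # spatial patterns
    return modes, pcs, variance_frac[:n_modes]

# Synthetic field: one dominant space-time oscillation plus noise
rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 200)
space = np.linspace(0, 1, 50)
signal = np.outer(np.sin(t), np.cos(2 * np.pi * space))
field = signal + 0.1 * rng.standard_normal((200, 50))
modes, pcs, var = eofs(field)
print(var[0])   # first mode dominates the variance
```

Because the first EOF aligns with the single embedded oscillation, it captures nearly all of the variance here; real ionospheric data would spread variance over several modes.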
Mazzola, Viridiana; Marano, Giuseppe; Biganzoli, Elia M.; Boracchi, Patrizia; Lanciano, Tiziana; Arciero, Giampiero; Bondolfi, Guido
The issue of individual differences has always been an important area of research in psychology and, more recently, neuroimaging. A major source of interindividual variability stems from differences in basic affective dispositions. In order to make a contribution to this field of research, we have developed a new type of assessment – the In-Out dispositional affective style questionnaire (IN-OUT DASQ) – to measure the proneness between two different ways of feeling situated: a predominantly body-bound one in the case of the inward tendency and an externally anchored one in the case of the outward tendency (Arciero and Bondolfi, 2009). The IN-OUT DASQ contains two scales of seven items each, Self-centric engagement (SCE) and Other-centric engagement (OCE), as a disposition index for inwardness and outwardness respectively. The exploratory factor analysis in sample 1 (n = 292) confirmed a two-factor solution. Confirmatory factor analysis in sample 2 (n = 300) showed the good fit of this two-factor model. Next, we examined construct validity also investigating the correlations between the IN-OUT DASQ, the Big Five Questionnaire and the Positive and Negative Affect Schedule in sample 3 (n = 153). The SCE and OCE scales had robust internal consistency and reliability, though the capacity to discriminate higher inward and outward participants was stronger in SCE. Although further validation research is required, the present study suggests the IN-OUT DASQ has the potential to be a measurement tool for detecting individual differences in social behavior and social affective neuroscience. PMID:25309478
Schmidgall, E. R.; Moskowitz, B. M.; Dahlberg, E. D.; Kuhlman, K. R.
Desert varnish is a black or reddish coating commonly found on rock samples from arid regions. Typically, the coating is very thin, less than half a millimeter thick. Previous research has shown that the primary components of desert varnish are silicon oxide clay minerals (60%), manganese and iron oxides (20-30%), and trace amounts of other compounds. Desert varnish is thought to originate when windborne particles containing iron and manganese oxides are deposited onto rock surfaces, where manganese-oxidizing bacteria concentrate the manganese and form the varnish [4,5]. If desert varnish is indeed biogenic, then the presence of desert varnish on rock surfaces could serve as a biomarker, indicating the presence of microorganisms. This idea has considerable appeal, especially for Martian exploration. Magnetic analysis techniques have not been extensively applied to desert varnish. The only previous magnetic study reported that, based on room temperature demagnetization experiments, there were noticeable differences in magnetic properties between a sample of desert varnish and the substrate sandstone. Based upon the results of the demagnetization experiments, the authors concluded that the primary magnetic component of desert varnish was either magnetite (Fe3O4) or maghemite (γ-Fe2O3).
Virues-Ortega, Javier; Hurtado-Parrado, Camilo; Cox, Alison D; Pear, Joseph J
To study the influences between basic and applied research in behavior analysis, we analyzed the coauthorship interactions of authors who published in JABA and JEAB from 1980 to 2010. We paid particular attention to authors who published in both JABA and JEAB (dual authors) as potential agents of cross-field interactions. We present a comprehensive analysis of dual authors' coauthorship interactions using social networks methodology and key word analysis. The number of dual authors more than doubled (26 to 67) and their productivity tripled (7% to 26% of JABA and JEAB articles) between 1980 and 2010. Dual authors stood out in terms of number of collaborators, number of publications, and ability to interact with multiple groups within the field. The steady increase in JEAB and JABA interactions through coauthors and the increasing range of topics covered by dual authors provide a basis for optimism regarding the progressive integration of basic and applied behavior analysis.
Bakchine, E; Pham-Delegue, M H; Kaiser, L; Masson, C
A method for quantifying the exploratory behavior of small animals stimulated by an odorant in a four-choice olfactometer, taking into account the interindividual variability of responses, was developed: individual tracks were time-sampled according to the animal's walking speed, and its positions were recorded as X-Y coordinates on a grid set underneath the device, the mesh of the grid suiting the animal's body size. Software written in BASIC APPLESOFT on an APPLE IIe computer allowed us to analyze the coordinates of either a single individual or an experimental sample, leading to: a) quantification of the insect's distribution over the experimental chamber, expressed in a table numbered according to the grid, where the percentage of positions per square, for either a given time fraction or the total observation period, was reported; b) a graphic representation of the data in several levels of grey, expressing the occupancy of each square for a given duration of observation; analysis per time fraction allowed the chronological sequence of events to be appreciated; c) the collection of the positions within each flow field of the olfactometer, for each individual of the experimental sample and a given duration, translated as the percentage of time spent in each flow field. Data files gathered these percentages for further statistical treatment. This computer method, which requires little equipment and appears to be easily adaptable to the study of biological models of various sizes and speeds, such as honeybees, trichogramma wasps and varroa mites, is a powerful tool for behavioral studies of small organisms tested in restricted areas.
Francis, H; Wade, D; Turner-Stokes, L; Kingswell, R; Dott, C; Coxon, E
Background: Spasticity and loss of function in an affected arm are common after stroke. Although botulinum toxin is used to reduce spasticity, its functional benefits are less easily demonstrated. This paper reports an exploratory meta-analysis to investigate the relationship between reduced arm spasticity and improved arm function. Method: Individual data from stroke patients in two randomised controlled trials of intra-muscular botulinum toxin were pooled. The Modified Ashworth Scale (elbow, wrist, fingers) was used to calculate a "Composite Spasticity Index". Data from the arm section of the Barthel Activities of Daily Living Index (dressing, grooming, and feeding) and three subjective measures (putting arm through sleeve, cleaning palm, cutting fingernails) were summed to give a "Composite Functional Index". Change scores and the time of maximum change were also calculated. Results: Maximum changes in both composite measures occurred concurrently in 47 patients. In 26 patients the improvement in spasticity preceded the improvement in function with 18 showing the reverse. There was a definite relationship between the maximum change in spasticity and the maximum change in arm function, independent of treatment (ρ = –0.2822, p = 0.0008, n = 137). There was a clear relationship between the changes in spasticity and in arm function in patients treated with botulinum toxin (Dysport) at 500 or 1000 units (ρ = –0.5679, p = 0.0090, n = 22; ρ = –0.4430, p = 0.0018, n = 47), but not in those treated with placebo or 1500 units. Conclusions: Using a targeted meta-analytic approach, it is possible to demonstrate that reducing spasticity in the arm is associated with a significant improvement in arm function. PMID:15489384
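The pooled correlation step reported above (Spearman's ρ between maximum change in spasticity and maximum change in function) can be illustrated on synthetic data; the simulated changes below are not the trial data, and scipy's `spearmanr` stands in for whatever software the authors used:

```python
import numpy as np
from scipy.stats import spearmanr

# Simulated per-patient maximum changes: spasticity reductions (negative
# values) tend to accompany functional improvements (positive values),
# which yields a negative rank correlation as in the paper.
rng = np.random.default_rng(7)
spasticity_change = -rng.uniform(0, 4, 60)                    # reductions
function_change = -0.8 * spasticity_change + rng.normal(0, 0.5, 60)
rho, p = spearmanr(spasticity_change, function_change)
print(round(rho, 2), p < 0.05)
```

The sign convention matches the abstract: larger spasticity reductions pair with larger functional gains, so ρ is negative and significant.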
Maire, Eric; Lelievre-Berna, Eddy; Fafeur, Veronique; Vandenbunder, Bernard
We have developed a novel approach to study luminescent light emission during the migration of living cells by low-light imaging techniques. The equipment consists of an anti-vibration table with a hole for direct output under the frame of an inverted microscope. The image is captured directly by an ultra-low-light-level photon-counting camera equipped with an image intensifier coupled by an optical fiber to a CCD sensor. This installation is dedicated to measuring, in a dynamic manner, the effect of SF/HGF (Scatter Factor/Hepatocyte Growth Factor) both on the activation of gene promoter elements and on cell motility. Epithelial cells were stably transfected with promoter elements containing Ets transcription factor-binding sites driving a luciferase reporter gene. Luminescent light emitted by individual cells was measured by image analysis. Images of luminescent spots were acquired with a high-aperture objective and exposure times of 10-30 min in photon-counting mode. The sensitivity of the camera was adjusted to a high value, which required a segmentation algorithm dedicated to eliminating the background noise. Hence, image segmentation and treatment by mathematical morphology were particularly well suited to these experimental conditions. In order to estimate the orientation of cells during their migration, we used a dedicated skeleton algorithm applied to the oblong spots of variable intensity emitted by the cells. Kinetic changes of luminescent sources, and the distance and speed of migration, were recorded and then correlated with cellular morphological changes for each spot. Our results highlight the usefulness of mathematical morphology for quantifying kinetic changes in luminescence microscopy.
Walsh, Kerryann; Rassafiani, Mehdi; Mathews, Ben; Farrell, Ann; Butler, Des
This paper presents an evaluation of an instrument to measure teachers' attitudes toward reporting child sexual abuse and discusses the instrument's merit for research into reporting practice. Based on responses from 444 Australian teachers, the Teachers' Reporting Attitude Scale for Child Sexual Abuse was evaluated using exploratory factor…
Heller, Donald E.
This study examines the racial and ethnic distribution of the costs and benefits of higher education in California. This exploratory work documents the racial and ethnic distribution of these benefits, in the form of enrollments in different sectors and different types of institutions, as well as on the costs, in the form of the share borne by…
Because students learn from each other as well as lecturers, it is important to create opportunities for collaboration in writing classes. Teachers now benefit from access to plagiarism detectors that can also provide feedback. This exploratory study considers the role of four review types, open and anonymous, involving the students themselves,…
A controversy has emerged about the relationship between positive behavior support and applied behavior analysis. Some behavior analysts suggest that positive behavior support and applied behavior analysis are the same (e.g., Carr & Sidener, 2002). Others argue that positive behavior support is harmful to applied behavior analysis (e.g., Johnston,…
Dan Maljovec; Bei Wang; Valerio Pascucci; Peer-Timo Bremer; Diego Mandelli; Michael Pernice; Robert Nourgaliev
and 2) topology-based methodologies to interactively visualize multidimensional data and extract risk-informed insights. Regarding item 1), we employ learning algorithms that aim to infer/predict the simulation outcome and decide the coordinates in the input space of the next sample that maximize the amount of information that can be gained from it. Such methodologies can be used both to explore and to exploit the input space. The latter is especially used in safety analysis to focus samples along the limit surface, i.e., the boundaries in the input space between system failure and system success. Regarding item 2), we present a software tool designed to analyze multi-dimensional data. We model a large-scale nuclear simulation dataset as a high-dimensional scalar function defined over a discrete sample of the domain. First, we provide structural analysis of such a function at multiple scales and provide insight into the relationship between the input parameters and the output. Second, we enable exploratory analysis for users, where we help users differentiate features from noise through multi-scale analysis on an interactive platform, based on domain knowledge and data characterization. Our analysis is performed by exploiting the topological and geometric properties of the domain, building statistical models based on its topological segmentations, and providing interactive visual interfaces to facilitate such explorations.
Santos, Silvia J.; Hurtado-Ortiz, Maria T.; Sneed, Carl D.
This study examined the validity of the Klonoff and Landrine Illness-Belief Scale when applied to Latino college students (n = 156; 34% male, 66% female) at high risk for future diabetes onset. Principal factor analysis yielded four significant factors--emotional, folk beliefs, punitive, gene/hereditary--which accounted for 64.5% of variance and…
Acar, Serap; Savci, Sema; Keskinoğlu, Pembe; Akdeniz, Bahri; Özpelit, Ebru; Özcan Kahraman, Buse; Karadibak, Didem; Sevinc, Can
Purpose Individuals with cardiac problems avoid physical activity and exercise because they expect to feel shortness of breath, dizziness, or chest pain. Assessing kinesiophobia related to heart problems is important in terms of cardiac rehabilitation. The Tampa Scale of Kinesiophobia Swedish Version for the Heart (TSK-SV Heart) is reliable and has been validated for cardiac diseases in the Swedish population. The aim of this study was to investigate the reliability, parallel-form validity, and exploratory factor analysis of the TSK for the Heart Turkish Version (TSK Heart Turkish Version) for evaluating kinesiophobia in patients with heart failure and pulmonary arterial hypertension. Methods This cross-sectional study involved translation, back translation, and cross-cultural adaptation (localization). Forty-three pulmonary arterial hypertension and 32 heart failure patients were evaluated using the TSK Heart Turkish Version. The 17-item scale, originally composed for the Swedish population, has four factors: perceived danger for heart problem, avoidance of exercise, fear of injury, and dysfunctional self. Cronbach’s alpha (internal consistency) and exploratory factor analysis were used to assess the questionnaire’s reliability. Results of the patients in the 6-minute walk test, International Physical Activity Questionnaire, and Nottingham Health Profile were analyzed by Pearson’s correlation analysis with the TSK Heart Turkish Version to indicate the convergent validity. Results Cronbach’s alpha for the TSK Heart Turkish Version was 0.75, indicating acceptable internal consistency. Although exploratory factor analysis showed a different subgroup distribution than the original questionnaire, the model was acceptable for the four-factor model hypothesis. Therefore, the questionnaire was rated as reliable. Conclusion These results supported the reliability of the TSK Heart Turkish Version. Since the acceptable four-factor model fits the subgroups and
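The internal-consistency statistic used above, Cronbach's alpha, is straightforward to compute directly from an item-score matrix; the simulated 17-item questionnaire below is illustrative and unrelated to the TSK data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Simulated 17-item questionnaire: each item = shared trait + item noise
rng = np.random.default_rng(3)
trait = rng.standard_normal((120, 1))
scores = trait + rng.standard_normal((120, 17))
alpha = cronbach_alpha(scores)
print(round(alpha, 2))
```

Because every simulated item shares the same latent trait, alpha comes out well above the 0.75 reported for the TSK Heart Turkish Version; weaker inter-item correlations would lower it.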
Swimming teachers and coaches can improve their feedback to swimmers, when correcting or refining swim movements, by applying some basic biomechanical concepts relevant to swimming. This article focuses on the biomechanical considerations used in analyzing swimming performance. Techniques for spotting and correcting problems that impede…
Sert, Olcay; Seedhouse, Paul
This short, introductory paper presents an up-to-date account of works within the field of Applied Linguistics which have been influenced by a Conversation Analytic paradigm. The article reviews recent studies in classroom interaction, materials development, proficiency assessment and language teacher education. We believe that the publication of…
Windeln, Johannes; Bram, Christian; Eckes, Heinz-Ludwig; Hammel, Dirk; Huth, Johanna; Marien, Jan; Röhl, Holger; Schug, Christoph; Wahl, Michael; Wienss, Andreas
This paper gives a synopsis of today's challenges and requirements for a surface analysis and materials science laboratory with a special focus on magnetic recording technology. The critical magnetic recording components, i.e. the protective carbon overcoat (COC), the disk layer structure, the read/write head including the giant-magnetoresistive (GMR) sensor, are described and options for their characterization with specific surface and structure analysis techniques are given. For COC investigations, applications of Raman spectroscopy to the structural analysis and determination of thickness, hydrogen and nitrogen content are discussed. Hardness measurements by atomic force microscopy (AFM) scratching techniques are presented. Surface adsorption phenomena on disk substrates or finished disks are characterized by contact angle analysis or so-called piezo-electric mass adsorption systems (PEMAS), also known as quartz crystal microbalance (QCM). A quickly growing field of applications is listed for various X-ray analysis techniques, such as disk magnetic layer texture analysis for X-ray diffraction, compositional characterization via X-ray fluorescence, compositional analysis with high lateral resolution via electron microprobe analysis. X-ray reflectometry (XRR) has become a standard method for the absolute measurement of individual layer thicknesses contained in multi-layer stacks and thus, is the successor of ellipsometry for this application. Due to the ongoing reduction of critical feature sizes, the analytical challenges in terms of lateral resolution, sensitivity limits and dedicated nano-preparation have been consistently growing and can only be met by state-of-the-art Auger electron spectrometers (AES), transmission electron microscopy (TEM) analysis, time-of-flight-secondary ion mass spectroscopy (ToF-SIMS) characterization, focused ion beam (FIB) sectioning and TEM lamella preparation via FIB. The depth profiling of GMR sensor full stacks was significantly
Friedlander, Alan L.; Harry, David P., III
An exploratory analysis of vehicle guidance during the approach to a target planet is presented. The objective of the guidance maneuver is to guide the vehicle to a specific perigee distance with a high degree of accuracy and minimum corrective velocity expenditure. The guidance maneuver is simulated by considering the random sampling of real measurements with significant error and reducing this information to prescribe appropriate corrective action. The instrumentation system assumed includes optical and/or infrared devices to indicate range and a reference angle in the trajectory plane. Statistical results are obtained by Monte-Carlo techniques and are shown as the expectation of guidance accuracy and velocity-increment requirements. Results are nondimensional and applicable to any planet within limits of two-body assumptions. The problem of determining how many corrections to make and when to make them is a consequence of the conflicting requirements of accurate trajectory determination and propulsion. Optimum values were found for a vehicle approaching a planet along a parabolic trajectory with an initial perigee distance of 5 radii and a target perigee of 1.02 radii. In this example measurement errors were less than 1 minute of arc. Results indicate that four corrections applied in the vicinity of 50, 16, 15, and 1.5 radii, respectively, yield minimum velocity-increment requirements. Thrust devices capable of producing a large variation of velocity-increment size are required. For a vehicle approaching the earth, miss distances within 32 miles are obtained with 90-percent probability. Total velocity increments used in guidance are less than 3300 feet per second with 90-percent probability. It is noted that the above representative results are valid only for the particular guidance scheme hypothesized in this analysis. A parametric study is presented which indicates the effects of measurement error size, initial perigee, and initial energy on the guidance
This overview examines research synthesis in applied ecology and conservation. Vote counting and pooling unweighted averages are widespread despite the superiority of syntheses based on weighted combination of effects. Such analyses allow exploration of methodological uncertainty in addition to consistency of effects across species, space and time, but exploring heterogeneity remains controversial. Meta-analyses are required to generalize in ecology, and to inform evidence-based decision-making, but the more sophisticated statistical techniques and registers of research used in other disciplines must be employed in ecology to fully realize their benefits.
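The weighted combination of effects recommended here is, in its simplest fixed-effect form, inverse-variance pooling; the three "studies" below are hypothetical:

```python
import numpy as np

def fixed_effect_meta(effects, variances):
    """Inverse-variance weighted (fixed-effect) pooling of study effect
    sizes, plus Cochran's Q as a simple heterogeneity check."""
    effects = np.asarray(effects, float)
    w = 1.0 / np.asarray(variances, float)
    pooled = np.sum(w * effects) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    q = np.sum(w * (effects - pooled) ** 2)   # heterogeneity statistic
    return pooled, se, q

# Three hypothetical studies: effect sizes with their sampling variances
pooled, se, q = fixed_effect_meta([0.30, 0.45, 0.20], [0.04, 0.09, 0.02])
print(pooled, se, q)
```

This contrasts with the vote counting and unweighted averaging criticized in the overview: the most precise study (variance 0.02) dominates the pooled estimate, and Q quantifies how consistent the effects are across studies.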
Trumbo, Toni A.; Schultz, Emeric; Borland, Michael G.; Pugh, Michael Eugene
Spectrophotometric analysis is essential for determining the biomolecule concentration of a solution and is employed ubiquitously in biochemistry and molecular biology. The Beer-Lambert-Bouguer Law is routinely used to determine the concentration of DNA, RNA or protein. There is however a significant difference in determining the…
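The Beer-Lambert-Bouguer calculation itself is a one-liner, c = A / (ε·l). The sketch below uses the textbook molar absorptivity of NADH at 340 nm and the common A260 convention for dsDNA; the input values are illustrative, not from the article:

```python
# Beer-Lambert-Bouguer law: A = epsilon * l * c, so c = A / (epsilon * l)
def concentration(absorbance, epsilon, path_cm=1.0):
    """Concentration (M) from absorbance, molar absorptivity (M^-1 cm^-1),
    and cuvette path length (cm)."""
    return absorbance / (epsilon * path_cm)

# dsDNA convention: A260 of 1.0 corresponds to ~50 ug/mL for a 1 cm path
def dsdna_ug_per_ml(a260, dilution_factor=1.0):
    return a260 * 50.0 * dilution_factor

print(concentration(0.75, 6220))   # NADH at 340 nm (epsilon = 6220)
print(dsdna_ug_per_ml(0.5, 10))    # diluted 1:10 before reading
```

The two functions reflect the difference the abstract alludes to: protein and cofactor concentrations come from a true molar absorptivity, while nucleic acid quantification usually relies on an empirical mass-per-absorbance convention.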
Leone, James E; Mullin, Elizabeth M; Maurer-Starks, Suanne S; Rovito, Michael J
The purpose of this study was to determine whether there is evidence of reliability and validity for the Adolescent Body Image Satisfaction Scale (ABISS), an instrument previously developed to measure adolescent body image. A sample (N = 330) of adolescent males, aged 14-19 years, completed the ABISS to determine current body image satisfaction. Data were analyzed for measures of instrument composite reliability and initial content and construct validity. Exploratory factor analysis supported a 3-factor solution (16 total items), which explained 42.7% of variance in the model. Composite reliability for the subscales, body competence, body inadequacy, and internal conflict ranged from 0.64 to 0.82. Exploratory factor analysis of the ABISS provides initial psychometric support for a valid and reliable measure for assessing adolescent male body image, which also can be used as a needs assessment tool. Strength and conditioning professionals should be aware of their athlete and client psychological attributes, many of whom are adolescents. Understanding how adolescents view their bodies and their body image will assist professionals in designing appropriate, health-promotive strength programs, while at the same time monitoring for signs of body image dissatisfaction. Assessing body image can help heighten awareness and possibly encourage preventative programming to help avert negative health practices (e.g., performance-enhancing drug use, exercise addictions, disordered eating). The ABISS seems to have preliminary psychometric support to be a valid and reliable instrument that helps gauge at-risk populations.
Matsuda, Andrea Harumi; Machado, Luci Brocardo; del Mastro, Nélida Lucia
Propolis is a resinous hive product collected by bees. Raw propolis requires a decontamination procedure, and irradiation appears to be a promising technique for this purpose. The valuable properties of propolis for the food and pharmaceutical industries have led to increasing interest in its technological behavior. Thermal analysis is a chemical analysis technique that gives information about changes on heating, which is of great importance for technological applications. Ground propolis samples were 60Co gamma-irradiated with doses of 0 and 10 kGy. Thermogravimetry curves showed a similar multi-stage decomposition pattern for both irradiated and unirradiated samples up to 600°C. Similarly, differential scanning calorimetry showed coincident melting points for irradiated and unirradiated samples. The results suggest that the irradiation process does not interfere with the thermal properties of propolis when irradiated at up to 10 kGy.
Zheng, N; Barrentine, S W
The development of motion analysis and the application of biomechanical analysis techniques to sports has paralleled the exponential growth of computational and videographic technology. Technological developments have provided for advances in the investigation of the human body and the action of the human body during sports believed to be unobtainable a few years ago. Technological advancements have brought biomechanical applications into a wide range of fields from orthopedics to entertainment. An area that has made tremendous gains using biomechanics is sports science. Coaches, therapists, and physicians are using biomechanics to improve performance, rehabilitation, and the prevention of sports related injuries. Functional analyses of athletic movements that were impossible a few years ago are available and used today. With new advancements, the possibilities for investigating the way a human interacts and reacts to environmental conditions are ever expanding.
Gay, Leslie; Karfilis, Kate V; Miller, Michael R; Doe, Chris Q; Stankunas, Kryn
Transcriptional profiling is a powerful approach for studying mouse development, physiology and disease models. Here we describe a protocol for mouse thiouracil tagging (TU tagging), a transcriptome analysis technology that includes in vivo covalent labeling, purification and analysis of cell type-specific RNA. TU tagging enables the isolation of RNA from a given cell population of a complex tissue, avoiding transcriptional changes induced by cell isolation trauma, as well as the identification of actively transcribed RNAs and not preexisting transcripts. Therefore, in contrast to other cell-specific transcriptional profiling methods based on the purification of tagged ribosomes or nuclei, TU tagging provides a direct examination of transcriptional regulation. We describe how to (i) deliver 4-thiouracil to transgenic mice to thio-label cell lineage-specific transcripts, (ii) purify TU-tagged RNA and prepare libraries for Illumina sequencing and (iii) follow a straightforward bioinformatics workflow to identify cell type-enriched or differentially expressed genes. Tissue containing TU-tagged RNA can be obtained in 1 d, RNA-seq libraries can be generated within 2 d and, after sequencing, an initial bioinformatics analysis can be completed in 1 additional day.
Diamond, P.; Payne, A. O.
The application of reliability theory to predict, from structural fatigue test data, the risk of failure of a structure under service conditions because its load-carrying capability is progressively reduced by the extension of a fatigue crack, is considered. The procedure is applicable to both safe-life and fail-safe structures and, for a prescribed safety level, it will enable an inspection procedure to be planned or, if inspection is not feasible, it will evaluate the life to replacement. The theory has been further developed to cope with the case of structures with initial cracks, such as can occur in modern high-strength materials which are susceptible to the formation of small flaws during the production process. The method has been applied to a structure of high-strength steel and the results are compared with those obtained by the current life estimation procedures. This has shown that the conventional methods can be unconservative in certain cases, depending on the characteristics of the structure and the design operating conditions. The suitability of the probabilistic approach to the interpretation of the results from full-scale fatigue testing of aircraft structures is discussed and the assumptions involved are examined.
Giraldo Osorio, J. D.; García Galiano, S. G.
Traditional techniques for gauging hydrological events often fail for extreme events; a particular case is the spatial detection of floods. In this work, remote sensing techniques and Geographic Information Systems (GIS) have been merged to develop a key tool for the monitoring of floods. The low density of gauging-station networks in developing countries makes remote sensing the most suitable and economical way to delimit the flooded area and compute the cost of damages. Common classification techniques for satellite images use "hard methods," in the sense that a pixel is assigned to a unique land cover class. At coarse resolution, pixels will inevitably be mixed, so "soft methods" can be used to assign several land cover classes according to the surface fraction covered by each one. The main objective of this work is the dynamic monitoring of floods over large areas, based on satellite images (with moderate spatial resolution but high temporal resolution) and a Digital Elevation Model (DEM). Classified maps with finer spatial resolution can be built through the subpixel analysis methodology developed here. The procedure rests on both the Linear Mixture Model (LMM) and the Spatial Coherence Analysis (SCA) hypothesis. The LMM builds the land cover fraction maps through an optimization procedure using Lagrange multipliers, while the SCA defines the most likely place for the land cover fractions within the coarse pixel using linear programming. A subsequent procedure improves the identification of the flooded area using the drainage direction and flow accumulation raster maps derived from the DEM of the study zone. The subpixel analysis technique was validated using historical flood data obtained from satellite images. The procedure improves the spatial resolution of maps classified from coarse-resolution satellite images, while the "hard methods" keep the spatial resolution of the input coarse satellite image.
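A fully constrained linear unmixing step in the spirit of the LMM can be sketched with non-negative least squares. Note the paper solves this with Lagrange multipliers; the NNLS trick below (a heavily weighted sum-to-one row appended to the system) is a simpler stand-in, and the endmember spectra are invented:

```python
import numpy as np
from scipy.optimize import nnls

def unmix(pixel, endmembers, weight=100.0):
    """Solve pixel ~= E @ f with f >= 0 (via NNLS) and sum(f) = 1 enforced
    softly by an appended, heavily weighted row of ones."""
    e = np.vstack([endmembers, weight * np.ones(endmembers.shape[1])])
    b = np.append(pixel, weight)
    fractions, _ = nnls(e, b)
    return fractions

# Three endmember spectra (columns): water, soil, vegetation (made up)
E = np.array([[0.05, 0.30, 0.10],
              [0.04, 0.35, 0.45],
              [0.03, 0.40, 0.30],
              [0.02, 0.45, 0.60]])
true_f = np.array([0.6, 0.3, 0.1])   # a mostly flooded pixel
pixel = E @ true_f
f = unmix(pixel, E)
print(np.round(f, 3))
```

On this noise-free pixel the abundance fractions are recovered exactly; real imagery adds sensor noise and endmember variability, which is where the paper's spatial coherence step comes in.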
Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.
Bush, Peter J.; Bush, Mary A.
The 2009 National Academy of Sciences report on forensics focused criticism on pattern evidence subdisciplines in which statements of unique identity are utilized. One principle of bitemark analysis is that the human dentition is unique to the extent that a perpetrator may be identified based on dental traits in a bitemark. Optical and electron scanning methods were used to measure dental minutiae and to investigate replication of detail in human skin. Results indicated that, being a viscoelastic substrate, skin effectively reduces the resolution with which dental detail can be measured. The conclusions counsel caution in individualization statements.
Tang, Rong; Sae-Lim, Watinee
In this study, an exploratory content analysis of 30 randomly selected Data Science (DS) programs from eight disciplines revealed significant gaps in current DS education in the United States. The analysis centers on linguistic patterns of program descriptions, curriculum requirements, and DS course focus as pertaining to key skills and domain…
Balasch, S; Nuez, F; Palomares, G; Cuartero, J
Twenty characters were measured on 60 tomato varieties cultivated in the open air and in a polyethylene plastic house. Data were analyzed by means of principal components, factorial discriminant methods, Mahalanobis D(2) distances and principal coordinate techniques. The factorial discriminant and Mahalanobis D(2) distance methods, both of which require collecting data plant by plant, lead to conclusions similar to those of the principal components method, which only requires taking data by plots. The characters that make up the principal components are the same in both environments studied, although the relative importance of each one varies within the principal components. By combining the information supplied by multivariate analysis with the mode of inheritance of the characters, crosses among cultivars can be designed that will produce heterotic hybrids showing characters within previously established limits.
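The Mahalanobis D(2) statistic used here has a compact closed form. A minimal sketch for two characters, with made-up cultivar means and a made-up pooled covariance matrix (not data from the study):

```python
def mahalanobis_d2(mean1, mean2, cov):
    """Mahalanobis D^2 between two group means for two characters,
    given a pooled 2x2 covariance matrix: D^2 = dx' * inv(cov) * dx."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]     # 2x2 inverse
    dx = [m1 - m2 for m1, m2 in zip(mean1, mean2)]
    return sum(dx[i] * inv[i][j] * dx[j] for i in range(2) for j in range(2))

# Hypothetical means (fruit weight in g, plant height in m) for two cultivars
print(mahalanobis_d2([80.0, 1.5], [70.0, 1.2], [[25.0, 0.0], [0.0, 0.04]]))
# → 6.25
```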
Brady, J V
This article deals with systems at multiple levels, at least from cell to organization. It also deals with learning, decision making, and other behavior at multiple levels. Technological development of a human behavioral ecosystem appropriate to space environments requires an analytic and synthetic orientation, explicitly experimental in nature, dictated by scientific and pragmatic considerations, and closely approximating procedures of established effectiveness in other areas of natural science. The conceptual basis of such an approach has its roots in environmentalism which has two main features: (1) knowledge comes from experience rather than from innate ideas, divine revelation, or other obscure sources; and (2) action is governed by consequences rather than by instinct, reason, will, beliefs, attitudes or even the currently fashionable cognitions. Without an experimentally derived data base founded upon such a functional analysis of human behavior, the overgenerality of "ecological systems" approaches render them incapable of ensuring the successful establishment of enduring space habitats. Without an experimentally derived function account of individual behavioral variability, a natural science of behavior cannot exist. And without a natural science of behavior, the social sciences will necessarily remain in their current status as disciplines of less than optimal precision or utility. Such a functional analysis of human performance should provide an operational account of behavior change in a manner similar to the way in which Darwin's approach to natural selection accounted for the evolution of phylogenetic lines (i.e., in descriptive, nonteleological terms). Similarly, as Darwin's account has subsequently been shown to be consonant with information obtained at the cellular level, so too should behavior principles ultimately prove to be in accord with an account of ontogenetic adaptation at a biochemical level. It would thus seem obvious that the most
Chiaramello, E; Knaflitz, M; Agostini, V
Static stabilometry is a technique aimed at quantifying postural sway during quiet standing in the upright position. Many different models and techniques to analyze the trajectories of the Centre of Pressure (CoP) have been proposed. Most of the parameters calculated according to these different approaches are affected by relevant intra- and inter-subject variability or lack a clear physiological interpretation. In this study we hypothesize that CoP trajectories have rotational characteristics, and we therefore decompose them into clockwise and counter-clockwise components using rotary spectral analysis. Rotary spectra obtained from a population of healthy subjects are described through the group average of spectral parameters, i.e., 95% spectral bandwidth, mean frequency, median frequency, and skewness. Results are reported for the clockwise and counter-clockwise components and refer to the upright position maintained with eyes open or closed. This study demonstrates that the approach is feasible and that some of the spectral parameters differ statistically between the open- and closed-eyes conditions. More research is needed to demonstrate the clinical applicability of this approach, but the results obtained so far are promising.
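The rotary decomposition rests on a standard trick: treat the CoP trajectory as a complex signal z = x + iy, whose positive-frequency DFT coefficients give the counter-clockwise component and whose negative-frequency coefficients give the clockwise one. A minimal sketch with a synthetic circular trajectory (not stabilometric data):

```python
import cmath, math

def rotary_power(xs, ys):
    """Total power in the counter-clockwise (positive-frequency) and
    clockwise (negative-frequency) components of a 2-D trajectory,
    via the DFT of the complex signal z = x + i*y."""
    z = [complex(x, y) for x, y in zip(xs, ys)]
    n = len(z)
    ccw = cw = 0.0
    for k in range(1, n):                          # skip the k=0 mean term
        coeff = sum(z[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n)) / n
        freq = k if k <= n // 2 else k - n         # map to signed frequency
        if freq > 0:
            ccw += abs(coeff) ** 2
        else:
            cw += abs(coeff) ** 2
    return ccw, cw

# A purely counter-clockwise unit circle puts all power in the CCW part
ts = [2 * math.pi * t / 64 for t in range(64)]
ccw, cw = rotary_power([math.cos(t) for t in ts], [math.sin(t) for t in ts])
print(round(ccw, 6), round(cw, 6))  # → 1.0 0.0
```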
Ramesh, K.; Hariprasad, M. P.; Bhuvanewari, S.
The development of improved implant system designs in dentistry has necessitated the study of stress fields in the implant regions of the mandible/maxilla for a better understanding of the biomechanics involved. Photoelasticity has been used for various studies related to dental implants in view of its whole-field visualization of maximum shear stress in the form of isochromatic contours. The potential of digital photoelasticity has not been fully exploited in the field of implant dentistry. In this paper, the fringe field in the vicinity of connected implants (the All-On-Four® concept) is analyzed using recent advances in digital photoelasticity. Initially, a novel 3-D photoelastic model-making procedure is proposed to closely mimic all the anatomical features of the human mandible. By choosing an appropriate orientation of the model with respect to the light path, the essential regions of interest were analysed while keeping the model under live loading conditions. The need for a sophisticated software module to carefully identify the model domain is brought out. For data extraction, the five-step method is used and isochromatics are evaluated by twelve-fringe photoelasticity. In addition to the isochromatic fringe field, whole-field isoclinic data is also obtained for the first time in implant dentistry, which could yield important information for improving the structural stability of implant systems. Analysis is carried out for the implant in the molar as well as the incisor region. In addition, the interaction effects of a loaded molar implant on the incisor area are also studied.
Motley, Albert E., III
One of the key elements to successful project management is the establishment of the "right set of requirements": requirements that reflect the true customer needs and are consistent with the strategic goals and objectives of the participating organizations. A viable set of requirements implies that each individual requirement is a necessary element in satisfying the stated goals and that the entire set of requirements, taken as a whole, is sufficient to satisfy the stated goals. Unfortunately, it is the author's experience that during project formulation phases, many of the Systems Engineering customers do not conduct a rigorous analysis of the goals and objectives that drive the system requirements. As a result, the Systems Engineer is often provided with requirements that are vague, incomplete, and internally inconsistent. To complicate matters, most systems development methodologies assume that the customer provides unambiguous, comprehensive and concise requirements. This paper describes the specific steps of a Goals Analysis process applied by Systems Engineers at the NASA Langley Research Center during the formulation of requirements for research projects. The objective of Goals Analysis is to identify and explore all of the influencing factors that ultimately drive the system's requirements.
Cobo R., J.M.; Bermejo M., F.J.
Agricultural development in the Mexicali Valley and the high cost of electric power required to operate the irrigation wells in the Valley prompted the Mexican government to investigate the possibility of taking advantage of thermal manifestations in the area located 28 km southeast of the city of Mexicali to generate electric power and thereby partially reduce the outflow of foreign exchange. In 1958, a geologic study of the southern and southeastern zone of Mexicali was conducted to identify the possibilities of tapping geothermal resources. The purpose of this study was to gain knowledge of the geologic conditions in this area and, if possible, to establish the location of exploratory and production wells and, on the basis of the results of the former, examine the geologic history in order to gain knowledge and understanding of the structural control of the steam. On the basis of this study, it was recommended that 3 exploratory wells be drilled in order to locate weak zones that would easily allow for steam flow.
Coutinho, Carolina F S; Souza-Santos, Reinaldo; Lima, Marli M
The use of geo-spatial analysis to anticipate transmission risk for Chagas disease was tested in a rural area of northeast Brazil in an approach that combined geo-referencing and exploratory study of triatomine infestation, including related elements such as the environment and hosts. A total of 617 triatomine specimens, mainly Triatoma brasiliensis, were captured, exhibiting an overall T. cruzi positivity of 44.4%. Layer analysis indicated that the greatest transmission risk to man was associated with woodpiles. The buffer area generated contained uninhabited dwellings teeming with bats and positive bugs. Other locations outside the buffer, near uninhabited dwellings housing cattle, contained colonies of triatomines harboring T. cruzi. The results indicate that local residents' activities themselves favor the development of risk areas for Chagas disease.
Raykos, Bronwyn C; Byrne, Susan M; Watson, Hunna
A confirmatory factor analysis of the factor structure of the Distress Tolerance Scale (DTS) created by Corstorphine et al. [Corstorphine, E., Mountford, V., Tomlinson, S., Waller, G., & Meyer, C. (2007). Distress tolerance in the eating disorders. Eating Behaviors, 8, 91-97.] was conducted to assess whether the scale's purported three factors emerged in a clinical sample of patients with a DSM-IV diagnosed eating disorder. The original three-factor model was generally considered to be a poor fit for the data. Subsequent exploratory factor analysis indicated that a better fit emerged using a four-factor structure. Significant associations were observed between behavioral avoidance of positive affect and eating disorder psychopathology. Implications for use of the DTS with eating disorder patients are discussed.
Gresham, Frank M.; And Others
A review of 158 applied behavior analysis studies with children as subjects, published in the "Journal of Applied Behavior Analysis" between 1980 and 1990, found that (1) 16% measured the accuracy of independent variable implementation, and (2) two-thirds did not operationally define components of the independent variable. Specific recommendations…
Dembo, Richard; Briones-Robinson, Rhissa; Barrett, Kimberly; Winters, Ken C.; Ungaro, Rocio; Karas, Lora; Wareham, Jennifer; Belenko, Steven
Truant youth represent a critical group needing problem-oriented research and involvement in effective services. The limited number of studies on the psychosocial functioning of truant youths have focused on one or a few problem areas, rather than examining co-morbid problem behaviors. The present study addresses the need to examine the interrelationships of multiple domains of psychosocial functioning, including substance involvement, mental health, and delinquency, among truant youth. Exploratory structural equation modeling on baseline data collected on 219 truant youths identified two major factors reflecting psychosocial functioning, and found the factor structure was similar across major sociodemographic subgroups. Further analyses supported the validity of the factor structure. The research and service delivery implications of the findings are discussed. PMID:23243383
Ma, C.W.; Miller, D.D.; Jardine, L.J.
This study assesses which structures, systems, and components of the exploratory shaft facility (ESF) are important to safety when the ESF is converted to become part of the operating waste repository. The assessment follows the methodology required by DOE Procedure AP-6.10Q. Failures of the converted ESF during the preclosure period have been evaluated, along with other underground accidents, to determine the potential offsite radiation doses and associated probabilities. The assessment indicates that failures of the ESF will not result in radiation doses greater than 0.5 rem at the nearest unrestricted area boundary. Furthermore, credible accidents in other underground facilities will not result in radiation doses larger than 0.5 rem, even if any structure, system, or component of the converted ESF fails at the same time. Therefore, no structure, system, or component of the converted ESF is important to safety.
Duran, Erika Christiane Marocco; Toledo, Vanessa Pellegrino
This descriptive exploratory study aims to analyze the production of knowledge on the nursing process, based on Master's theses and doctoral dissertations presented in Brazilian graduate programs in Nursing, using the reports of the Nursing Study and Research Center (CEPEn) from 1972 to 2007, and to identify which were published in indexed databases. We found 122 Master's theses, 42 of which were published, and 26 doctoral dissertations, with 15 publications. From the year 2000 on, more publications were found, with a prevalence of qualitative research. The prevalent thematic trend was nursing assistance, with surveys and validation of nursing diagnoses, as well as the other phases of the process, as the most addressed topics. Publications on the theme show gaps, especially in surveying knowledge production. Research that studies this interface may help qualify the practice of nursing.
Bishop-Fitzpatrick, Lauren; Hong, Jinkuk; Smith, Leann E; Makuch, Renee A; Greenberg, Jan S; Mailick, Marsha R
This study aims to extend the definition of quality of life (QoL) for adults with autism spectrum disorder (ASD, n = 180, ages 23-60) by: (1) characterizing the heterogeneity of normative outcomes (employment, independent living, social engagement) and objective QoL (physical health, neighborhood quality, family contact, mental health issues); and (2) identifying predictors of positive normative outcomes and good objective QoL. Findings of an exploratory latent class analysis identified three groups of adults with ASD: Greater Dependence, Good Physical and Mental Health, and Greater Independence. Findings indicate that better daily living skills, better executive function, and more maternal warmth are associated with assignment to better outcome groups. Findings have implications for interventions designed to enhance achievement of normative outcomes and objective QoL.
Background: Associations between ozone (O3) and fine particulate matter (PM2.5) concentrations and birth outcomes have been previously demonstrated. We perform an exploratory analysis of O3 and PM2.5 concentrations during early pregnancy and multiple types of birth defects. Met...
Milman, Natalie B.; Hillarious, Marilyn; Walker, Bryce
This qualitative study is an exploratory, retrospective content analysis (Schwandt, 2007) of 81 debrief statements collected over 3 years and written by graduate students in an educational technology graduate program's educational leadership course taught 100% online. Researchers analyzed students' debrief statements of lessons learned and task…
Lu, Wenhua; Miao, Jingang; McKyer, E. Lisako J.
Objectives: To demonstrate how the bootstrap method could be conducted in exploratory factor analysis (EFA) with a syntax written in SPSS. Methods: The data obtained from the Texas Childhood Obesity Prevention Policy Evaluation project (T-COPPE project) were used for illustration. A 5-step procedure to conduct bootstrap factor analysis (BFA) was…
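The core of any bootstrap factor-analytic procedure is the resampling loop: draw cases with replacement, refit, and read confidence intervals off the empirical distribution of the refitted statistic. The cited work does this in SPSS syntax around an EFA; the sketch below shows the same loop in Python, using a simple correlation in place of factor loadings and made-up data:

```python
import random

def pearson(xs, ys):
    """Pearson correlation of two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs)
           * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

def bootstrap_ci(xs, ys, stat=pearson, n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap CI: resample cases with replacement and
    refit `stat` each time.  In bootstrap factor analysis the refitted
    statistic would be each factor loading rather than a correlation."""
    random.seed(seed)
    n = len(xs)
    values = []
    for _ in range(n_boot):
        idx = [random.randrange(n) for _ in range(n)]
        values.append(stat([xs[i] for i in idx], [ys[i] for i in idx]))
    values.sort()
    return (values[int(alpha / 2 * n_boot)],
            values[int((1 - alpha / 2) * n_boot) - 1])

# Made-up, strongly related measurements
xs = list(range(30))
ys = [x + (-1) ** i for i, x in enumerate(xs)]
lo, hi = bootstrap_ci(xs, ys)
print(round(lo, 2), round(hi, 2))
```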
Handley, Thomas H., Jr.; Rubin, Mark R.
The DataHub addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within the DataHub is the integration of three technologies, viz. knowledge-based expert systems, science visualization, and science data management. This integration is based on a concept called the DataHub. With the DataHub concept, science investigators are able to apply a more complete solution to all nodes of a distributed system. Both computational nodes and interactive nodes are able to effectively and efficiently use the data services (access, retrieval, update, etc.) in a distributed, interdisciplinary information system in a uniform and standard way. This allows the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis is on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to information. The DataHub includes all the required end-to-end components and interfaces to demonstrate the complete concept.
Huang, X.; Yung, Y. L.; Lambrigtsen, B. H.
We apply spatial-spectral EOF analysis to 14 days of AIRS (Atmospheric Infrared Sounder) calibrated radiance data collected over the tropics and subtropics (32°S to 32°N) from July 1 to July 14, 2003. We limit our analysis to the nadir-view (scan angle less than 5°) spectra only. After the quality control procedure, we have an average of 1400 spectra over this period for each 4° by 5° grid box. We obtain a 14-day averaged spectrum for each grid box and apply EOF analysis to these averaged spectra to obtain principal components in spectrally resolved radiance and associated spatial patterns. The first principal component (PC1) can explain more than 90% of the total variance. Together with the second principal component (PC2), these two leading principal components can explain more than 99% of the total variance. The PC1 spectral features are consistent with spectral features due to the change of surface (or cloud deck) emission temperature. A couple of features can be clearly seen in the PC1 spatial map: the ITCZ, due to the low emission temperature of optically thick high cloud, and the Sahara, due to the high surface emission temperature and the clear sky. The spatial map of PC1 closely resembles that of the NCEP/NCAR reanalysis of outgoing longwave radiation over the same period. It is also highly correlated with the map of high cloud amount. Both the spectral features and the spatial map indicate that PC1 is mainly due to the spatial variation of cloud emission temperature (for grid boxes with optically thick clouds) and surface temperature. The PC2 shows spectral features similar to those due to the change of the optical depth of low clouds. Moreover, the PC2 spatial map shows maxima near the coasts of Peru, Namibia and California, as well as over the southern ocean west of Australia. All these regions are known for a high frequency of marine stratus. Unlike the traditional approach of observing low clouds from the visible reflectance, these results indicate that we can actually see
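The pipeline described (average, form anomalies, extract the leading EOF, report explained variance) can be sketched with a toy dataset. The "spectra" below are synthetic, with a single planted pattern standing in for the cloud/surface-temperature signal; power iteration on the channel covariance matrix recovers the first EOF:

```python
import random

def eof_first(spectra, iters=200):
    """Leading EOF of a set of spectra via power iteration on the
    channel-by-channel covariance matrix; returns the EOF, the
    principal-component amplitudes, and the explained-variance fraction."""
    m, p = len(spectra), len(spectra[0])
    mean = [sum(s[j] for s in spectra) / m for j in range(p)]
    anom = [[s[j] - mean[j] for j in range(p)] for s in spectra]
    cov = [[sum(a[i] * a[j] for a in anom) / m for j in range(p)]
           for i in range(p)]
    v = [1.0] * p
    for _ in range(iters):                         # power iteration
        w = [sum(cov[i][j] * v[j] for j in range(p)) for i in range(p)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    lam = sum(v[i] * cov[i][j] * v[j] for i in range(p) for j in range(p))
    total = sum(cov[i][i] for i in range(p))       # trace = total variance
    pcs = [sum(a[j] * v[j] for j in range(p)) for a in anom]
    return v, pcs, lam / total

# Synthetic "spectra": one planted pattern plus small noise
random.seed(0)
pattern = [0.5, 1.0, -0.5, 0.25]
spectra = []
for _ in range(50):
    amp = random.uniform(-1, 1)                    # shared amplitude per box
    spectra.append([amp * pj + random.uniform(-0.01, 0.01) for pj in pattern])

eof, pcs, frac = eof_first(spectra)
print(round(frac, 3))  # the planted pattern explains nearly all variance
```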
Iwata, B A
Although the effects of negative reinforcement on human behavior have been studied for a number of years, a comprehensive body of applied research does not exist at this time. This article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. A consideration of research currently being done in these areas suggests the emergence of an applied technology on negative reinforcement.
Iwata, Brian A.
The article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. Current research suggests the emergence of an applied technology on negative reinforcement.…
Edwards, Timothy L.; Poling, Alan
This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the "Journal of Applied Behavior Analysis" and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance…
Morris, Edward K.; Smith, Nathaniel G.; Altus, Deborah E.
Our paper reviews and analyzes B. F. Skinner's contributions to applied behavior analysis in order to assess his role as the field's originator and founder. We found, first, that his contributions fall into five categories: the style and content of his science, his interpretations of typical and atypical human behavior, the implications he drew from his science for application, his descriptions of possible applications, and his own applications to nonhuman and human behavior. Second, we found that he explicitly or implicitly addressed all seven dimensions of applied behavior analysis. These contributions and dimensions notwithstanding, he neither incorporated the field's scientific (e.g., analytic) and social (e.g., applied) dimensions into any program of published research such that he was its originator, nor did he systematically integrate, advance, and promote the dimensions so as to have been its founder. As the founder of behavior analysis, however, he was the father of applied behavior analysis. PMID:22478444
Idaho State Dept. of Education, Boise.
This Idaho Social Science Exploratory course of study applies standards-based content knowledge and skills to an enhanced investigation of geography, history, entrepreneurism, and civic engagement at an eighth-grade level. The exploratory course draws upon the disciplines to emphasize concepts and generalizations from the social sciences, promotes…
Muftić, Lisa R; Baumann, Miranda L
Femicide, the murder of females (most often at the hands of males), is an understudied area in homicide research. Furthermore, femicide perpetrated by females has been all but ignored. One reason may be the rarity of homicide victimization perpetrated by females: most homicide incidents consist of a male offender and a male victim, and when a homicide does involve a female, either as victim or offender, the other party implicated is generally a male. The primary goal of the present study is to provide an in-depth, albeit exploratory, examination of female-perpetrated femicide. Using homicide data taken from the Dallas Homicide Unit, 403 cases of femicide were analyzed, with special attention devoted to comparing female-perpetrated femicide incidents (n = 39) against male-perpetrated femicide incidents (n = 364). Specifically, the study explores the similarities and differences in sociodemographic characteristics of victims and suspects, offense characteristics, and offense circumstances. Contrary to what was expected, the results, at first glance, seem to suggest an overwhelming similarity between femicide suspects and victims, irrespective of gender. However, when the relationship between victim and suspect is considered, distinct differences appear. Implications of these findings, as well as limitations and suggestions for further research, are discussed.
Garcia, Roberto; Griffin, Lisa; Williams, Robert
TD64, the Applied Fluid Dynamics Analysis Group, is one of several groups with high-fidelity fluids design and analysis expertise in the Space Transportation Directorate at Marshall Space Flight Center (MSFC). TD64 assists personnel working on other programs. The group participates in projects in the following areas: turbomachinery activities, nozzle activities, combustion devices, and the Columbia accident investigation.
Sommers, Lawrence M.
Regional analysis is central to applied geographic research, contributing to better planning and policy development for a variety of societal problems facing the United States. The development of energy policy serves as an illustration of the capabilities of this type of analysis. The United States has had little success in formulating a national…
Holt, Margaret E.; And Others
Certain systems analysis techniques can be applied to examinations of program failure in continuing education to locate weaknesses in planning and implementing stages. Questions to guide an analysis and various procedures are recommended. Twelve issues that contribute to failures or discontinuations are identified. (Author/MLW)
Belenko, Steven; Dugosh, Karen L; Lynch, Kevin; Mericle, Amy A; Pich, Michele; Forman, Robert F
Given the uncertain effects of antidrug media campaigns, and the ease of finding online illegal drug information, research is needed on the Internet's role in disseminating drug information to youths. This exploratory study analyzes National Survey of Parents and Youth (NSPY) data on drug website viewing among 12-18 year olds (N = 7,145). Approximately 10.4% reported drug-related website exposure: 5.4% viewed only websites that communicated how to avoid drugs or bad things about drugs (antidrug websites); 1.7% only viewed websites that communicated how to use drugs and good things about drugs (prodrug websites); and 3.2% viewed both types of websites. The low rates of viewing antidrug websites occurred despite efforts in the National Youth Antidrug Media Campaign (NYAMC) to encourage youths to visit such websites. Prodrug website viewers had used inhalants and been offered marijuana, perceived little risk in trying marijuana, intended to use marijuana, had close friends who used drugs, reported low parental monitoring, and had been exposed to antidrug media messages. Viewing antidrug websites was related to gender, income, likelihood of using marijuana in the next 12 months, having close friends who use drugs and talking to friends about avoiding drugs, parental monitoring, and drug prevention exposure. Prior prevention exposure increased drug website viewing overall, perhaps by increasing general curiosity about drugs. Because adolescents increasingly seek health information online, research is needed on how they use the Internet as a drug information source, the temporal relationships of prevention exposure and drug website viewing, and the effects of viewing prodrug websites on drug risk.
Bui, Eric; Mauro, Christine; Robinaugh, Donald J.; Skritskaya, Natalia A.; Wang, Yuanjia; Gribbin, Colleen; Ghesquiere, Angela; Horenstein, Arielle; Duan, Naihua; Reynolds, Charles; Zisook, Sidney; Simon, Naomi M.; Shear, M. Katherine
Background Complicated grief (CG) has been recently included in the DSM-5, under the term “Persistent Complex Bereavement Disorder”, as a condition requiring further study. To our knowledge, no psychometric data on any structured clinical interview for CG are available to date. In this manuscript, we introduce the Structured Clinical Interview for CG (SCI-CG), a 31-item “SCID-like” clinician-administered instrument to assess the presence of CG symptoms. Methods Participants were 281 treatment-seeking adults with CG (77.9% (n=219) women, mean age = 52.4, SD = 17.8) who were assessed with the SCI-CG and measures of depression, posttraumatic stress, anxiety, and functional impairment. Results The SCI-CG exhibited satisfactory internal consistency (α = .78), good test-retest reliability (intraclass correlation [ICC] = 0.68, 95% CI [0.60, 0.75]), and excellent inter-rater reliability (ICC = 0.95, 95% CI [0.89, 0.98]). Exploratory factor analyses revealed that a five-factor structure, explaining 50.3% of the total variance, was the best fit for the data. Conclusions The clinician-rated SCI-CG demonstrates good internal consistency, reliability, and convergent validity in treatment-seeking individuals with CG and therefore can be a useful tool to assess CG. Although diagnostic criteria for CG have yet to be adequately validated, the SCI-CG may facilitate this process. The SCI-CG can now be used as a validated instrument in research and clinical practice. PMID:26061724
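The internal-consistency figure reported above (α = .78) is a Cronbach's alpha, whose computation is simple enough to sketch. The item scores below are made up for illustration, not SCI-CG data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha: `items` is a list of per-item score lists over
    the same respondents.  alpha = k/(k-1) * (1 - sum(item variances) /
    variance of total scores), using sample (n-1) variances."""
    k, n = len(items), len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[r] for item in items) for r in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Three hypothetical items rated by five respondents
items = [[3, 4, 2, 5, 4], [2, 4, 2, 5, 3], [3, 5, 1, 4, 4]]
print(round(cronbach_alpha(items), 2))  # → 0.92
```

As a sanity check, perfectly redundant items yield alpha = 1.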
Niemann, J; Tietze, E; Ruddat, I; Fruth, A; Prager, R; Rabsch, W; Blaha, T; Münchhausen, C; Merle, R; Kreienbrock, L
An exploratory study in five conventional pig production clusters was carried out to investigate the dynamics and diversity of Salmonella spp. within different production stages and sample site categories (pooled feces, direct and non-direct environment). Observing two production cycles per production cluster, a total of 1276 samples were collected along the pig production chain. Following microbiological examination via culture, 2246 subcultures were generated from 285 Salmonella-positive samples and analysed by pheno- and genotyping methods. Based on a combination of serotyping, MLVA (multiple-locus variable-number tandem repeat (VNTR) analysis), PFGE (pulsed-field gel electrophoresis) and MLST (multilocus sequence typing), 22.3% of the Salmonella-positive samples were characterized into clonal lineages and their variants. Within each production cluster, one main clonal lineage could be identified that persisted over both production cycles with a large diversity of variants and a wide distribution across sample site categories and production stages. The results underline the importance of biosecurity, with emphasis on the environment, to prevent persistence and circulation of Salmonella within herds. Furthermore, the combined implementation of MLVA, PFGE and MLST with conventional culture techniques for isolate classification can be applied as an effective and valuable tool for identifying similar patterns of Salmonella occurrence within pig production clusters.
Zhou, Hao; Zhang, Lili; Luo, Xuerong; Wu, Lijie; Zou, Xiaobing; Xia, Kun; Wang, Yimin; Xu, Xiu; Ge, Xiaoling; Jiang, Yong-Hui; Fombonne, Eric; Yan, Weili; Wang, Yi
The purpose of this study was to explore the psychometric properties of the Chinese version of the Autism Spectrum Rating Scale (ASRS). We recruited 1,625 community-based children and 211 autism spectrum disorder (ASD) cases from 4 sites, and the parents of all participants completed the Chinese version of the ASRS. A robust weighted least squares means- and variance-adjusted estimator was used for exploratory factor analysis. The 3-factor structure included 59 items suitable for the current sample. The item reliability for the modified Chinese version of the ASRS (MC-ASRS) was excellent. Moreover, with 60 as the cut-off point, receiver operating characteristic analysis showed that the MC-ASRS had excellent discriminant validity, comparable to that of the unmodified Chinese version (UC-ASRS), with area under the curve values of 0.952 (95% CI: 0.936-0.967) and 0.948 (95% CI: 0.930-0.965), respectively. Meanwhile, confirmatory factor analysis in another sample of children revealed that the MC-ASRS had better construct validity than the UC-ASRS under the above factor solution. In conclusion, the MC-ASRS shows better efficacy in epidemiological screening for ASD in Chinese children.
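The AUC values behind the discriminant-validity claim have a simple rank interpretation: the probability that a randomly chosen case scores above a randomly chosen control, with ties counting half. A self-contained Python sketch (illustrative scores, not the ASRS data):

```python
# AUC via the pairwise-comparison (Mann-Whitney) interpretation.
def roc_auc(scores_pos, scores_neg):
    """Fraction of (case, control) pairs where the case scores higher; ties count 0.5."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

print(roc_auc([65, 72, 80], [40, 55, 65]))  # cases mostly outscore controls
```

Read this way, an AUC of 0.952 means a randomly selected ASD case outscored a randomly selected community child about 95% of the time.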
Aalboe, Joanna A; Schumacher, Mitzi M
The aim of this study was to explore the internal structure of an instrument assessing dental students' confidence in their ability to communicate with patients in six specific circumstances (anxious, in pain, etc.) using exploratory factor analysis. In a Communication in the Dental Health Care Setting course at a U.S. dental school, second-year dental students in two years (2013 and 2014) responded to the six items on a survey instrument. Of the total 123 students, 122 fully completed the instrument, for a response rate of 99%. Analysis of the results identified a unidimensional scale with regard to patient-specific communication self-efficacy that explained 74% of the total variance. The scale had good internal consistency, reflected by a high Cronbach's alpha (α=0.929, 95% CI [0.907, 0.947]). These findings suggest the instrument may be a useful tool in assessing the development of patient communication skills in second-year dental students following a course in communication. Further exploration utilizing confirmatory analysis, determining predictive validity, and assessing convergent and discriminant evidence is warranted.
Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan
The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999–2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604
Sánchez-Carracedo, David; Barrada, Juan Ramón; López-Guimerà, Gemma; Fauquet, Jordi; Almenara, Carlos A; Trepat, Esther
The aims of the present study were: (1) to assess the factor structure of the SATAQ-3 in Spanish secondary-school students by means of exploratory factor analysis (EFA), confirmatory factor analysis (CFA) and exploratory structural equation modeling (ESEM) models; and (2) to study its invariance by sex and school grade. ESEM is a technique that has been proposed for the analysis of internal structure that overcomes some of the limitations of EFA and CFA. Participants were 1559 boys and girls in grades seventh to tenth. The results support the four-factor solution of the original version, and reveal that the best fit was obtained with ESEM, excluding Item 20 and with correlated uniqueness between reverse-keyed items. Our version shows invariance by sex and grade. The differences between scores of different groups are in the expected direction, and support the validity of the questionnaire. We recommend a version excluding Item 20 and without reverse-keyed items.
Cawthon, Stephanie W; Wurtz, Keith A
The purpose of this paper is to present findings on alternate assessments for students who are deaf or hard of hearing (SDHH). Drawn from the results of the "Second National Survey of Assessments and Accommodations for Students Who Are Deaf or Hard of Hearing," this study investigated three alternate assessment formats: portfolio, checklists, and out-of-level testing. Analysis includes descriptive data of alternate assessment use across all three formats, qualitative analyses of teacher perspectives, and an exploratory logistic regression analysis on predictors of alternate assessment use. This exploratory analysis looks at predictors such as state policy, educational setting, grades served, language of instruction, and participant perspectives. Results indicate that predictors at the student, teacher, and system level may influence alternate assessment use for SDHH.
Páez, Antonio; Ruiz, Manuel; López, Fernando; Logan, John
The study of population patterns has animated a large body of urban social research over the years. An important part of this literature is concerned with the identification and measurement of segregation patterns. Recently, emphatic calls have been made to develop measures that are better able to capture the geography of population patterns. The objective of this paper is to demonstrate the application of the Q statistic, developed for the analysis of spatial association of qualitative variables, to the detection of ethnic clustering and exposure patterns. The application is to historical data from 1880 Newark in the United States, with individuals classified by ethnicity and geo-coded by place of residence. Three ethnic groups, termed Irish, Germans, and Yankees are considered. Exploratory analysis with the Q statistic identifies significant differences in the tendency of individuals and building occupancy to cluster by ethnicity. In particular, there is evidence of a strong affinity within ethnic clusters, and some intermingling between Yankee and Irish residents. In contrast, the exposure of Germans to individuals of other groups is found to be more limited. PMID:24855322
Ostlund, Brendan D.; Conradt, Elisabeth; Crowell, Sheila E.; Tyrka, Audrey R.; Marsit, Carmen J.; Lester, Barry M.
Exposure to stress in utero is a risk factor for the development of problem behavior in the offspring, though precise pathways are unknown. We examined whether DNA methylation of the glucocorticoid receptor gene, NR3C1, was associated with experiences of stress by an expectant mother and fearfulness in her infant. Mothers reported on prenatal stress and infant temperament when infants were 5 months old (n = 68). Buccal cells for methylation analysis were collected from each infant. Prenatal stress was not related to infant fearfulness or NR3C1 methylation in the sample as a whole. Exploratory sex-specific analysis revealed a trend-level association between prenatal stress and increased methylation of NR3C1 exon 1F for female, but not male, infants. In addition, increased methylation was significantly associated with greater fearfulness for females. Results suggest an experience-dependent pathway to fearfulness for female infants via epigenetic modification of the glucocorticoid receptor gene. Future studies should examine prenatal stress in a comprehensive fashion while considering sex differences in epigenetic processes underlying infant temperament. PMID:27462209
Delorme, Richard; Bille, Arnaud; Betancur, Catalina; Mathieu, Flavie; Chabane, Nadia; Mouren-Simeoni, Marie Christine; Leboyer, Marion
Background: Recent statistical approaches based on factor analysis of obsessive-compulsive (OC) symptoms in adult patients have identified dimensions that seem more effective in symptom-based taxonomies and appear to be more stable over time. Although a phenotypic continuum from childhood to adulthood has been hypothesized, no factor analytic studies have been performed in juvenile patients, and the stability of OC dimensions in children and adolescents has not been assessed. Methods: This study was designed to perform an exploratory factor analysis of OC symptoms in a sample of children and adolescents with OC disorder (OCD) and to investigate the course of the factors over time (mean follow-up period: four years). Results: We report for the first time that four symptom dimensions, remarkably similar to those previously described in adults, underlie the heterogeneity of OC symptoms in children and adolescents. Moreover, at follow-up, the symptom dimensions identified remained essentially unmodified. The changes observed concerned the intensity of dimensions rather than shifts from one dimension to another. Conclusion: These findings reinforce the hypothesis of a phenotypic continuum of OC symptoms from childhood to adulthood. They also strengthen the case for investigating the clinical, neurobiological and genetic heterogeneity of OCD using a dimension-based approach. PMID:16396684
Byrd, Gary D.; Shedlock, James
This paper presents an exploratory trend analysis of the statistics published over the past twenty-four editions of the Annual Statistics of Medical School Libraries in the United States and Canada. The analysis focuses on the small subset of nineteen consistently collected data variables (out of 656 variables collected during the history of the survey) to provide a general picture of the growth and changing dimensions of services and resources provided by academic health sciences libraries over those two and one-half decades. The paper also analyzes survey response patterns for U.S. and Canadian medical school libraries, as well as osteopathic medical school libraries surveyed since 1987. The trends show steady, but not dramatic, increases in annual means for total volumes collected, expenditures for staff, collections and other operating costs, personnel numbers and salaries, interlibrary lending and borrowing, reference questions, and service hours. However, when controlled for inflation, most categories of expenditure have just managed to stay level. The exceptions have been expenditures for staff development and travel and for collections, which have both outpaced inflation. The fill rate for interlibrary lending requests has remained steady at about 75%, but the mean ratio of items lent to items borrowed has decreased by nearly 50%. PMID:12883578
The partitioning of nonpolar organic contaminants to marine sediments is considered to be controlled by the amount of organic carbon present. However, several studies propose that other characteristics of sediments may affect the partitioning of contaminants. For this exploratory...
McGill, Ryan J; Spurgin, Angelia R
Higher order factor structure of the Luria interpretive scheme on the Kaufman Assessment Battery for Children-Second Edition (KABC-II) for the 7- to 12-year and the 13- to 18-year age groups in the KABC-II normative sample (N = 2,025) is reported. Using exploratory factor analysis, multiple factor extraction criteria, and hierarchical exploratory factor analysis not included in the KABC-II manual, two-, three-, and four-factor extractions were analyzed to assess the hierarchical factor structure by sequentially partitioning variance appropriately to higher order and lower order dimensions as recommended by Carroll. No evidence for a four-factor solution was found. Results showed that the largest portions of total and common variance were accounted for by the second-order general factor and that interpretation should focus primarily, if not exclusively, at that level of measurement.
Epling, W. Frank; Pierce, W. David
Applied behavior analysis began when laboratory-based principles were extended to humans in order to change socially significant behavior. Recent laboratory findings may have applied relevance; however, the majority of basic researchers have not clearly communicated the practical implications of their work. The present paper samples some of the new findings and attempts to demonstrate their applied importance. Schedule-induced behavior, which occurs as a by-product of contingencies of reinforcement, is discussed. Possible difficulties in treatment and management of induced behaviors are considered. Next, the correlation-based law of effect and the implications of relative reinforcement are explored in terms of applied examples. Relative rate of reinforcement is then extended to the literature dealing with concurrent operants. Concurrent operant models may describe human behavior of applied importance, and several techniques for modification of problem behavior are suggested. As a final concern, the paper discusses several new paradigms. While the practical importance of these models is not clear at the moment, it may be that new practical advantages will soon arise. Thus, it is argued that basic research continues to be of theoretical and practical importance to applied behavior analysis. PMID:22478574
Veronese, Nicola; Luchini, Claudio; Solmi, Marco; Sergi, Giuseppe; Manzato, Enzo; Stubbs, Brendon
Monoclonal gammopathy of undetermined significance (MGUS) is a common condition in the elderly. A number of studies have investigated the relationship between MGUS and bone health outcomes including bone mineral density (BMD), osteoporosis and fractures, but no meta-analysis exists. We conducted a systematic review and exploratory meta-analysis comparing bone health outcomes in patients with MGUS. Two independent authors searched PubMed and Scopus from inception until 19 October 2016. A meta-analysis of cross-sectional and longitudinal studies investigating fractures and BMD was conducted. Standardised mean differences (SMD) ± 95% confidence intervals (CIs) were calculated for BMD, and risk ratios (RRs) were calculated for prevalent and incident fractures. Of 174 initial hits, 10 studies of moderate methodological quality were eligible, including 8711 individuals with MGUS vs. 52,865 controls. Compared to controls, subjects with MGUS showed significantly lower values for radial cortical volumetric BMD (1 study; SMD = -5.45, 95% CI: -7.24 to -3.66), but not at the lumbar spine, femoral neck or hip. The incidence of fractures was higher in people with MGUS (n = 7466) vs. controls (n = 52,304) (RR = 1.36, 95% CI 1.28-1.44, I² = 0%) over a median follow-up of 12.5 years. The incidence of vertebral fractures was particularly elevated (RR = 2.50, 95% CI 1.53-4.06), although limited to two studies. In conclusion, although with limitations, our preliminary meta-analysis suggests that patients with MGUS are at higher risk of fractures despite evidence for differences in BMD being equivocal. Future longitudinal research is required to confirm our findings and determine if fracture prevention interventions are warranted in people with MGUS.
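Pooled RRs in such a meta-analysis are typically obtained by inverse-variance weighting on the log scale. A fixed-effect sketch in Python, recovering each study's standard error from its reported 95% CI (illustrative inputs; the paper's actual weighting scheme may differ):

```python
import math

def pooled_rr(studies, z=1.96):
    """Fixed-effect inverse-variance pooling of (rr, ci_low, ci_high) tuples."""
    num = den = 0.0
    for rr, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * z)  # SE from the CI width on the log scale
        w = 1.0 / se ** 2                             # inverse-variance weight
        num += w * math.log(rr)
        den += w
    log_rr, se_pool = num / den, math.sqrt(1.0 / den)
    return (math.exp(log_rr),
            math.exp(log_rr - z * se_pool),
            math.exp(log_rr + z * se_pool))

# Two identical hypothetical studies: same point estimate, narrower pooled CI.
print(pooled_rr([(2.0, 1.0, 4.0), (2.0, 1.0, 4.0)]))
```

With heterogeneity of 0% (the reported I²), fixed-effect and random-effects pooling coincide, which is why this simple version is a reasonable illustration here.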
Garcia, Roberto; Griffin, Lisa; Williams, Robert
This viewgraph report presents an overview of activities and accomplishments of NASA's Marshall Space Flight Center's Applied Fluid Dynamics Analysis Group. Expertise in this group focuses on high-fidelity fluids design and analysis with application to space shuttle propulsion and next generation launch technologies. Topics covered include: computational fluid dynamics research and goals, turbomachinery research and activities, nozzle research and activities, combustion devices, engine systems, MDA development and CFD process improvements.
Garcia, Roberto; Wang, Tee-See; Griffin, Lisa; Turner, James E. (Technical Monitor)
This document is a presentation graphic which reviews the activities of the Applied Fluid Dynamics Analysis Group at Marshall Space Flight Center (i.e., Code TD64). The work of this group focused on supporting the space transportation programs. The work of the group is in Computational Fluid Dynamic tool development. This development is driven by hardware design needs. The major applications for the design and analysis tools are: turbines, pumps, propulsion-to-airframe integration, and combustion devices.
Olohunfunmi, Ismail Abdul Fatai
The main aim of the present study is to present a clear framework for how to practically apply the concept of "Zuhd" to individual Muslim life. It is an empirical study of the Islamic concept of "Zuhd." The method employed in the study is a qualitative approach, whereby interviews were conducted, recorded and transcribed.…
Pastura, Giuseppe; Doering, Thomas; Gasparetto, Emerson Leandro; Mattos, Paulo; Araújo, Alexandra Prüfer
Abnormalities in the white matter microstructure of the attentional system have been implicated in the aetiology of attention deficit hyperactivity disorder (ADHD). Diffusion tensor imaging (DTI) is a promising magnetic resonance imaging (MRI) technology that has increasingly been used in studies of white matter microstructure in the brain. The main objective of this work was to perform an exploratory analysis of white matter tracts in a sample of children with ADHD versus typically developing children (TDC). For this purpose, 13 drug-naive children with ADHD of both genders underwent MRI using DTI acquisition methodology and tract-based spatial statistics. The results were compared to those of a sample of 14 age- and gender-matched TDC. Lower fractional anisotropy was observed in the splenium of the corpus callosum, right superior longitudinal fasciculus, bilateral retrolenticular part of the internal capsule, bilateral inferior fronto-occipital fasciculus, left external capsule and posterior thalamic radiation (including right optic radiation). We conclude that white matter tracts in attentional and motor control systems exhibited signs of abnormal microstructure in this sample of drug-naive children with ADHD.
Phillips, Selene G; Della, Lindsay J; Sohn, Steve H
In an exploratory analysis of several highly circulated consumer cancer magazines, the authors evaluated congruency between visual images of cancer patients and target audience risk profile. The authors assessed 413 images of cancer patients/potential patients for demographic variables such as age, gender, and ethnicity/race. They compared this profile with actual risk statistics. The images in the magazines are considerably younger, more female, and more White than what is indicated by U.S. cancer risk statistics. The authors also assessed images for visual signs of cancer testing/diagnosis and treatment. Few individuals show obvious signs of cancer treatment (e.g., head scarves, skin/nail abnormalities, thin body types). Most images feature healthier looking people, some actively engaged in construction work, bicycling, and yoga. In contrast, a scan of the editorial content showed that nearly two thirds of the articles focus on treatment issues. To explicate the implications of this imagery-text discontinuity on readers' attention and cognitive processing, the authors used constructs from information processing and social identity theories. On the basis of these models/theories, the authors provide recommendations for consumer cancer magazines, suggesting that the imagery be adjusted to reflect cancer diagnosis realities for enhanced message attention and comprehension.
Edwards, Timothy L; Poling, Alan
This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the Journal of Applied Behavior Analysis and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance (say-do correspondence and fluency), 3 described interventions that changed animals' behavior (self-injury by a baboon, feces throwing and spitting by a chimpanzee, and unsafe trailer entry by horses) in ways that benefited the animals and the people in charge of them, and 1 described the use of trained rats that performed a service to humans (land-mine detection). We suggest that each of these general research areas merits further attention and that the Journal of Applied Behavior Analysis is an appropriate outlet for some of these publications.
Morris, Edward K.; Smith, Nathaniel G.; Altus, Deborah E.
Our paper reviews and analyzes B. F. Skinner's contributions to applied behavior analysis in order to assess his role as the field's originator and founder. We found, first, that his contributions fall into five categories: the style and content of his science, his interpretations of typical and atypical human behavior, the implications he drew…
Dunlap, Glen; Carr, Edward G.; Horner, Robert H.; Zarcone, Jennifer R.; Schwartz, Ilene
Positive behavior support (PBS) emerged in the mid-1980s as an approach for understanding and addressing problem behaviors. PBS was derived primarily from applied behavior analysis (ABA). Over time, however, PBS research and practice has incorporated evaluative methods, assessment and intervention procedures, and conceptual perspectives associated…
Establishing appropriate relations between the basic and applied areas of behavior analysis has been of long and persistent interest to the author. In this article, the author illustrates that there is a direct relation between how hard an organism will work for access to an object or activity, as indexed by the largest ratio completed under a…
Garcia, Roberto; Griffin, Lisa; Williams, Robert
This paper presents viewgraphs on NASA Marshall Space Flight Center's Applied Fluid Dynamics Analysis Group Activities. The topics include: 1) Status of programs at MSFC; 2) Fluid Mechanics at MSFC; 3) Relevant Fluid Dynamics Activities at MSFC; and 4) Shuttle Return to Flight.
Leaf, Justin B.; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K.; Smith, Tristram; Weiss, Mary Jane
Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful…
This paper presents a statistical descriptive analysis and a thorough content analysis of descriptors and journal titles extracted from the Library and Information Science Abstracts (LISA) database, focusing on the subject of Web 2.0 and its main applications: blog, wiki, social network and tags. The primary research questions include: whether the…
Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.
Lattal, Kennon A.; Neef, Nancy A.
Reinforcement schedules are considered in relation to applied behavior analysis by examining several recent laboratory experiments with humans and other animals. The experiments are drawn from three areas of contemporary schedule research: behavioral history effects on schedule performance, the role of instructions in schedule performance of humans, and dynamic schedules of reinforcement. All of the experiments are discussed in relation to the role of behavioral history in current schedule performance. The paper concludes by extracting from the experiments some more general issues concerning reinforcement schedules in applied research and practice. PMID:16795888
The preliminary results from the innovative subsurface microbiology research program indicate that new data on the nature of the link between the geosphere and biosphere have been generated. The diversity of scientific disciplines represented in the subsurface microbiology program reflects the complexity of the system under study. The research carried out by national laboratory and university research scientists is addressing fundamental questions about the abundance of microorganisms and factors controlling microbial activity in the complex subsurface hydrologic and geochemical environment. Long-term implications of this research for mitigating contamination are clear and researchers share the broader objective of linking the basic science with applied work.
Peres, Maria Fernanda Tourinho; de Almeida, Juliana Feliciano; Vicentin, Diego; Cerda, Magdalena; Cardia, Nancy; Adorno, Sérgio
Throughout the first decade of the 2000s the homicide mortality rate (HMR) showed a significant reduction in the state and the city of São Paulo (MSP). The aim of this study is to describe the trend of HMR, socio-demographic indicators, and the investment in social and public security, and to analyze the correlation between HMR and independent variables in the MSP between 1996 and 2008. An exploratory time series ecological study was conducted. The following variables were included: HMR per 100,000 inhabitants, socio-demographic indicators, and investments in social and public security. The moving-averages for all variables were calculated and trends were analyzed through Simple Linear Regression models. Annual percentage changes, the average annual change and periodic percentage changes were calculated for all variables, and the associations between annual percentage changes were tested by Spearman’s correlation analysis. Correlations were found for the proportion of youth in the population (r = 0.69), unemployment rate (r = 0.60), State budget for education and culture (r = 0.87) and health and sanitation (r = 0.56), municipal (r = 0.68) and State (r = 0.53) budget for Public Security, firearms seized (r = 0.69) and the incarceration rate (r = 0.71). The results allow us to support the hypothesis that demographic changes, acceleration of the economy, in particular the fall in unemployment, investment in social policies and changes in public security policies act synergistically to reduce HMR in São Paulo. Complex models of analysis, incorporating the joint action of different potential explanatory variables, should be developed. PMID:22218669
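Spearman's correlation, used above to relate annual percentage changes, requires only ranking the two series. A self-contained Python sketch (illustrative series with no ties, not the São Paulo data):

```python
def spearman_rho(x, y):
    """Spearman rank correlation via the classical 1 - 6*sum(d^2)/(n(n^2-1)) formula (no ties)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Any monotone relationship, linear or not, yields rho = 1.
print(spearman_rho([1, 2, 3, 4], [1, 4, 9, 16]))
```

Because it works on ranks, the coefficient captures monotone association between, say, unemployment changes and homicide-rate changes without assuming linearity.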
Miketinas, Derek; Cater, Melissa; Bailey, Ariana; Craft, Brittany; Tuuri, Georgianna
Increasing adolescents' motivation and competence to cook may improve diet quality and reduce the risk for obesity and chronic diseases. The objective of this study was to develop an instrument to measure adolescents' intrinsic motivation to prepare healthy foods and the four psychological needs that facilitate motivation identified by the Self Determination Theory (SDT). Five hundred ninety-three high school students (62.7% female) were recruited to complete the survey. Participants indicated to what extent they agreed or disagreed with 25 statements pertaining to intrinsic motivation and perceived competence to cook, and their perceived autonomy support, autonomy, and relatedness to teachers and classmates. Data were analyzed using exploratory factor analysis (EFA), confirmatory factor analysis (CFA), and internal consistency reliability. EFA returned a five-factor structure explaining 65.3% of the variance; and CFA revealed that the best model fit was a five-factor structure (χ2 = 524.97 (265); Comparative Fit Index = 0.93; RMSEA = 0.056; and SRMR = 0.04). The sub-scales showed good internal consistency (Intrinsic Motivation: α = 0.94; Perceived Competence: α = 0.92; Autonomy Support: α = 0.94; Relatedness: α = 0.90; and Autonomy: α = 0.85). These results support the application of the Adolescent Motivation to Cook Questionnaire to measure adolescents' motivation and perceived competence to cook, autonomy support by their instructor, autonomy in the classroom, and relatedness to peers. Further studies are needed to investigate whether this instrument can measure change in cooking intervention programs.
Peres, Maria Fernanda Tourinho; de Almeida, Juliana Feliciano; Vicentin, Diego; Cerda, Magdalena; Cardia, Nancy; Adorno, Sérgio
Throughout the first decade of the 2000s the homicide mortality rate (HMR) showed a significant reduction in the state and the city of São Paulo (MSP). The aim of this study is to describe the trend of HMR, socio-demographic indicators, and the investment in social and public security, and to analyze the correlation between HMR and independent variables in the MSP between 1996 and 2008. An exploratory time series ecological study was conducted. The following variables were included: HMR per 100,000 inhabitants, socio-demographic indicators, and investments in social and public security. The moving-averages for all variables were calculated and trends were analyzed through Simple Linear Regression models. Annual percentage changes, the average annual change and periodic percentage changes were calculated for all variables, and the associations between annual percentage changes were tested by Spearman's correlation analysis. Correlations were found for the proportion of youth in the population (r = 0.69), unemployment rate (r = 0.60), State budget for education and culture (r = 0.87) and health and sanitation (r = 0.56), municipal (r = 0.68) and State (r = 0.53) budget for Public Security, firearms seized (r = 0.69) and the incarceration rate (r = 0.71). The results allow us to support the hypothesis that demographic changes, acceleration of the economy, in particular the fall in unemployment, investment in social policies and changes in public security policies act synergistically to reduce HMR in São Paulo. Complex models of analysis, incorporating the joint action of different potential explanatory variables, should be developed.
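The correlations reported in this study are Spearman rank correlations between annual percentage changes of the series. A minimal pure-Python sketch of that calculation; the two series below are hypothetical, not the study's data:

```python
def annual_pct_change(series):
    """Year-over-year percentage change of a time series."""
    return [100.0 * (b - a) / a for a, b in zip(series, series[1:])]

def rank(xs):
    """Average ranks, 1-based (ties share the mean rank)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical HMR and incarceration-rate series (not the study's data)
hmr = [40.0, 42.0, 38.0, 30.0, 24.0, 20.0]
incarceration = [100.0, 104.0, 112.0, 125.0, 140.0, 160.0]
rho = spearman(annual_pct_change(hmr), annual_pct_change(incarceration))
# rho = -0.6 for this toy data
```

Correlating the percentage changes rather than the raw levels, as the study does, reduces the spurious correlation that two trending series would otherwise show.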
Batson, Sarah; Sutton, Alex; Abrams, Keith
Background Patients with atrial fibrillation are at a greater risk of stroke and therefore the main goal for treatment of patients with atrial fibrillation is to prevent stroke from occurring. There are a number of different stroke prevention treatments available, including warfarin and novel oral anticoagulants. Previous network meta-analyses of novel oral anticoagulants for stroke prevention in atrial fibrillation acknowledge the limitation of heterogeneity across the included trials but have not explored the impact of potentially important treatment modifying covariates. Objectives To explore potentially important treatment modifying covariates using network meta-regression analyses for stroke prevention in atrial fibrillation. Methods We performed a network meta-analysis for the outcome of ischaemic stroke and conducted an exploratory regression analysis considering potentially important treatment modifying covariates. These covariates included the proportion of patients with a previous stroke, the proportion of males, mean age, the duration of study follow-up and the patients' underlying risk of ischaemic stroke. Results None of the covariates explored impacted treatment effects relative to placebo. Notably, the exploration of ‘study follow-up’ as a covariate supported the assumption that differences in trial durations are unimportant in this indication despite the variation across trials in the network. Conclusion This study is limited by the quantity of data available. Further investigation is warranted, and, as justifying further trials may be difficult, it would be desirable to obtain individual patient level data (IPD) to facilitate an effort to relate treatment effects to IPD covariates in order to investigate heterogeneity. Observational data could also be examined to establish if there are potential trends elsewhere. The approach and methods presented have potentially wide applications within any indication to highlight the potential benefit…
Mitra, S.; Wilson, N.K.
Principal component analysis (PCA) was used to study polynuclear aromatic hydrocarbon (PAH) profiles in indoor air. Fifteen PAHs were measured in ten different homes in Columbus (Ohio) which had different indoor emission characteristics such as gas utilities, wood-burning fireplaces, and cigarette smokers. Different PAH concentration patterns emerged depending upon the emission sources present in the different homes. Of these, cigarette smoking appeared to have the greatest impact on the indoor PAH concentrations. The PCA allowed convenient displays of the multidimensional data set from which the PAH concentration characteristics could be elucidated. The interrelationship between the different PAHs was also studied by correlation analysis.
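PCA of this kind extracts the directions of largest variance from the profile data. A minimal sketch of the leading principal component via power iteration on the sample covariance matrix; the toy "profiles" below are invented, not the PAH measurements:

```python
def covariance_matrix(data):
    """Sample covariance of the column-variables; data is rows of observations."""
    n, p = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(p)]
    cov = [[0.0] * p for _ in range(p)]
    for j in range(p):
        for k in range(p):
            cov[j][k] = sum((row[j] - means[j]) * (row[k] - means[k])
                            for row in data) / (n - 1)
    return cov

def first_principal_component(data, iters=200):
    """Leading eigenvector of the covariance matrix by power iteration."""
    cov = covariance_matrix(data)
    p = len(cov)
    v = [1.0] * p
    for _ in range(iters):
        w = [sum(cov[j][k] * v[k] for k in range(p)) for j in range(p)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Toy "profiles": two strongly correlated variables and one noisy one;
# the first component should load mainly on the correlated pair.
data = [
    [1.0, 2.1, 0.3],
    [2.0, 3.9, 0.1],
    [3.0, 6.2, 0.4],
    [4.0, 7.8, 0.2],
    [5.0, 10.1, 0.3],
]
pc1 = first_principal_component(data)
```

With fifteen PAHs the same idea applies in fifteen dimensions; the loadings of the first few components are what allow the "convenient displays" of source patterns described above.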
Zheng, Robert; Spears, Jeffrey; Luptak, Marilyn; Wilby, Frances
The current study examined factors related to older adults' perceptions of Internet use. Three hundred ninety five older adults participated in the study. The factor analysis revealed four factors perceived by older adults as critical to their Internet use: social connection, self-efficacy, the need to seek financial information, and the need to…
…model to a weaker one which does not capture exponent addition or group multiplication. This would seem to be a problematic model: it seems to deny the… protocol analysis for Diffie-Hellman. CoRR, abs/1202.2168, 2012. Catherine Meadows and Paliath Narendran. A unification algorithm for the group…
Foxman, Ellen; Easterling, Debbie
Content analysis of portrayals of organizations and individuals in 32 marketing textbooks showed that in many respects their depiction of the actual U.S. workplace was not accurate. Women and people with disabilities were underrepresented; results for ethnic minorities were unclear because of difficulties of identification in print. (SK)
Byrd, Warren C; Schwartz-Baxter, Sarah; Carlson, Jim; Barros, Silvana; Offenbacher, Steven; Bencharit, Sompop
Denture stomatitis, inflammation and redness beneath a denture, affects nearly half of all denture wearers. Candidal organisms, the presence of a denture, saliva, and host immunity are the key etiological factors for the condition. The role of salivary proteins in denture stomatitis is not clear. In this study 30 edentulous subjects wearing a maxillary complete denture were recruited. Unstimulated whole saliva from each subject was collected and pooled into two groups (n = 15 each), healthy and stomatitis (Newton classification II and III). Label-free multidimensional liquid chromatography/tandem mass spectrometry (2D-LC-MS/MS) proteomics on two mass spectrometry platforms were used to determine peptide mass differences between control and stomatitis groups. Cluster analysis and principal component analysis were used to determine the differential expression among the groups. The two proteomic platforms identified 97 and 176 proteins (ANOVA; p < 0.01) differentially expressed among the healthy, type 2 and 3 stomatitis groups. Three proteins including carbonic anhydrase 6, cystatin C, and cystatin SN were found to be the same as in a previous study. Salivary proteomic profiles of patients with denture stomatitis were found to be uniquely different from controls. Analysis of protein components suggests that certain salivary proteins may predispose some patients to denture stomatitis while others are believed to be involved in the reaction to fungal infection. Analysis of candidal proteins suggests that multiple species of candidal organisms play a role in denture stomatitis.
Chung, Gregory K. W. K.; Dionne, Gary B.; Kaiser, William J.
Our research question was whether we could develop a feasible technique, using Bayesian networks, to diagnose gaps in student knowledge. Thirty-four college-age participants completed tasks designed to measure conceptual knowledge, procedural knowledge, and problem-solving skills related to circuit analysis. A Bayesian network was used to model…
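For a single skill node with conditionally independent items, Bayesian-network diagnosis of a knowledge gap reduces to repeated Bayesian updating with slip and guess probabilities. A hedged sketch of that special case; the prior, slip, and guess values are invented, and the study's actual network over conceptual, procedural, and problem-solving knowledge is richer than one node:

```python
def posterior_knowledge(prior, slip, guess, responses):
    """Posterior P(skill mastered | observed responses) for one skill node.

    Each response is True (correct) or False. Items are conditionally
    independent given mastery, with P(correct | mastered) = 1 - slip and
    P(correct | not mastered) = guess.
    """
    p = prior
    for correct in responses:
        like_mastered = (1 - slip) if correct else slip
        like_not = guess if correct else (1 - guess)
        # Bayes' rule, one item at a time
        p = p * like_mastered / (p * like_mastered + (1 - p) * like_not)
    return p

# Hypothetical circuit-analysis skill: 0.5 prior, 10% slip, 20% guess
p = posterior_knowledge(0.5, 0.10, 0.20, [True, True, False, True])
```

After three correct answers and one error the posterior rises to about 0.92, so the model would not flag this skill as a gap; a run of incorrect answers drives it toward zero instead.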
Sherman, Charles R.
This is one of five studies performed in 1976 to examine the characteristics of U.S. medical schools and the interrelationship among variables that describe them. A principal components analysis was performed and interpreted exploring the interrelationships of 33 selected variables that describe the faculty, student, curriculum, and other…
Saenz, Victor B.; Hatch, Deryl; Bukoski, Beth E.; Kim, Suyun; Lee, Kye-hyoung; Valdez, Patrick
This study employs survey data from the Center for Community College Student Engagement to examine the similarities and differences that exist across student-level domains in terms of student engagement in community colleges. In total, the sample used in the analysis pools data from 663 community colleges and includes more than 320,000 students…
Smits, Iris A. M.; Timmerman, Marieke E.; Meijer, Rob R.
The assessment of the number of dimensions and the dimensionality structure of questionnaire data is important in scale evaluation. In this study, the authors evaluate two dimensionality assessment procedures in the context of Mokken scale analysis (MSA), using a so-called fixed lowerbound. The comparative simulation study, covering various…
Dembo, Richard; Briones-Robinson, Rhissa; Ungaro, Rocio Aracelis; Gulledge, Laura M.; Karas, Lora M.; Winters, Ken C.; Belenko, Steven; Greenbaum, Paul E.
Intervention Project. Results identified two classes of youths: Class 1(n=9) - youths with low levels of delinquency, mental health and substance abuse issues; and Class 2(n=37) - youths with high levels of these problems. Comparison of these two classes on their urine analysis test results and parent/guardian reports of traumatic events found…
…analysis suggests that it could be a manifestation of it or another condition associated with anxiety such as fibromyalgia. Factor 3… military population. However, general muscle pain can accompany many other illnesses, such as infectious diseases, autoimmune disorders, fibromyalgia… Rosenberg R, Bach FW, Jensen TS. Depression, anxiety, health-related quality of life and pain in patients with chronic fibromyalgia and neuropathic pain…
Perry, John L.; Nicholls, Adam R.; Clough, Peter J.; Crust, Lee
Despite the limitations of overgeneralizing cutoff values for confirmatory factor analysis (CFA; e.g., Marsh, Hau, & Wen, 2004), they are still often employed as golden rules for assessing factorial validity in sport and exercise psychology. The purpose of this study was to investigate the appropriateness of using the CFA approach with these…
This study profiles students in an introductory MIS course according to a variety of variables associated with choice of academic major. The data were collected through a survey administered to 12 sections of the course. A two-step cluster analysis was performed with gender as a categorical variable and students' perceptions of task value…
Lee, Violet; Liu, Ang; Groeber, Elizabeth; Moghaddam, Mehran; Schiller, James; Tweed, Joseph A; Walker, Gregory S
Boston Society's 11th Annual Applied Pharmaceutical Analysis conference, Hyatt Regency Hotel, Cambridge, MA, USA, 14-16 September 2015 The Boston Society's 11th Annual Applied Pharmaceutical Analysis (APA) conference took place at the Hyatt Regency hotel in Cambridge, MA, on 14-16 September 2015. The 3-day conference affords pharmaceutical professionals, academic researchers and industry regulators the opportunity to collectively participate in meaningful and relevant discussions impacting the areas of pharmaceutical drug development. The APA conference was organized in three workshops encompassing the disciplines of regulated bioanalysis, discovery bioanalysis (encompassing new and emerging technologies) and biotransformation. The conference included a short course titled 'Bioanalytical considerations for the clinical development of antibody-drug conjugates (ADCs)', an engaging poster session, several panel and round table discussions and over 50 diverse talks from leading industry and academic scientists.
…the model incorporates (in the personnel calculations) econometric effects to Losses by Expiration of Active Obligated Service, Attrition, and Length… APPENDIX D. SUMMARIES OF REMAINING META-MODELS: Retirement Losses, Prior Service Gains… ANALYSIS OF ECONOMIC FACTORS IN THE NAVY TOTAL FORCE STRENGTH MODEL (NTFSM), by William P. DeSousa, December 2015. Thesis Advisor: Thomas W…
Yilmaz, Ali; Nyberg, Nils T; Jaroszewski, Jerzy W
A museum collection of Cinchonae cortex samples (n = 117), from the period 1850-1950, was extracted with a mixture of chloroform-d1, methanol-d4, water-d2, and perchloric acid in the ratios 5 : 5 : 1 : 1. The extracts were directly analyzed using 1H NMR spectroscopy (600 MHz) and the spectra evaluated using principal component analysis (PCA) and total statistical correlation spectroscopy (STOCSY). A new method called STOCSY-CA, where CA stands for component analysis, is described, and an analysis using this method is presented. It was found that the samples had a rather homogenous content of the well-known cinchona alkaloids quinine, cinchonine, and cinchonidine without any apparent clustering. Signals from analogues were detected but not in substantial amounts. The main variation was related to the absolute amounts of extracted alkaloids, which was attributed to the evolution of the Cinchona tree cultivation during the period in which the samples were collected.
Lowenkron, Barry; Mitchell, Lynda
At many universities, faculty interested in behavior analysis are spread across disciplines. This makes difficult the development of behavior-analytically oriented programs, and impedes regular contact among colleagues who share common interests. However, this separation by disciplines can be a source of strength if it is used to develop interdisciplinary programs. In this article we describe how a bottom-up strategy was used to develop two complementary interdisciplinary MS programs in applied behavior analysis, and conclude with a description of the benefits—some obvious, some surprising—that can emerge from the development of such programs. PMID:22478230
La Peyre, M.K.; Mendelssohn, I.A.; Reams, M.A.; Templet, P.H.; Grace, J.B.
Integrated management and policy models suggest that solutions to environmental issues may be linked to the socioeconomic and political characteristics of a nation. In this study, we empirically explore these suggestions by applying them to the wetland management activities of nations. Structural equation modeling was used to evaluate a model of national wetland management effort and one of national wetland protection. Using five predictor variables of social capital, economic capital, environmental and political characteristics, and land-use pressure, the multivariate models were able to explain 60% of the variation in nations' wetland protection efforts based on data from 90 nations, as defined by level of participation, in the international wetland convention. Social capital had the largest direct effect on wetland protection efforts, suggesting that increased social development may eventually lead to better wetland protection. In contrast, increasing economic development had a negative linear relationship with wetland protection efforts, suggesting the need for explicit wetland protection programs as nations continue to focus on economic development. Government, environmental characteristics, and land-use pressure also had a positive direct effect on wetland protection, and mediated the effect of social capital on wetland protection. Explicit wetland protection policies, combined with a focus on social development, would lead to better wetland protection at the national level.
van Manen, S. M.; Chen, S.
Here we present an App designed to visualize and identify patterns in volcanic activity during the last ten years. It visualizes VEI (volcanic explosivity index) levels, population size, frequency of activity, and geographic region, and is designed to address the issue of oversampling of data. Oftentimes, it is difficult to access a large set of data that can be scattered at first glance and hard to digest without visual aid. This App serves as a model that solves this issue and can be applied to other data. To enable users to quickly assess the large data set it breaks down the apparently chaotic abundance of information into categories and graphic indicators: color is used to indicate the VEI level, size for population size within 5 km of a volcano, line thickness for frequency of activity, and a grid to pinpoint a volcano's latitude. The categories and layers within them can be turned on and off by the user, enabling them to scroll through and compare different layers of data. By visualizing the data this way, patterns began to emerge. For example, certain geographic regions had more explosive eruptions than others. Another good example was that low-frequency, larger-impact volcanic eruptions occurred more irregularly than smaller-impact volcanic eruptions, which had more stable frequencies. Although these findings are not unexpected, the easy-to-navigate App does showcase the potential of data visualization for the rapid appraisal of complex and abundant multi-dimensional geoscience data.
Akpa, Onoja Matthew; Bamgboye, Elijah Afolabi; Baiyewu, Olusegun
Most of the existing measures of psychosocial functioning among adolescents are developed outside Lower-middle-income countries (LMIC). Measures relevant to the LMIC setting will provide an opportunity to assess the functioning of adolescents in these settings based on their background or context. The Adolescents’ Psychosocial Functioning Inventory (APFI), which addresses relevant challenges and expectations of adolescents in the LMIC settings, was developed to bridge this gap in knowledge. A total of 753 adolescents from purposively selected secondary schools participated in this study. Preliminary analyses were performed using descriptive statistics. The underlying factor structure of the APFI was explored using exploratory and confirmatory factor analysis. Chi-square Goodness of Fit (CGF) and other fit indices were used to assess model fit. Cronbach's alpha was used to assess the reliability of the items and subscales of the APFI. The final model derived from the factor analyses yielded a 23-item three-factor model that provided the best fit to the data. The estimate of overall reliability of the APFI scale was α = 0.83, while all three factors/subscales: Optimism and Coping Strategy (OCS), Behaviour and Relationship Problems (BRP), and General Psychosocial Dysfunctions (GPD) had moderate to high reliability (α = 0.59 for OCS, α = 0.57 for BRP and α = 0.90 for GPD). The CGF yielded χ2/df ≤ 1.58 while all other fit indices were in the acceptable range. The three-factor APFI is a reliable measure for assessing psychosocial functioning among adolescents in the LMIC. PMID:25893221
Genaidy, Ash; Sequeira, Reynold; Rinder, Magda; A-Rehim, Amal
The emerging US carbon nano-manufacturing sector accounts for 40% of the nanotechnology product marketplace; thus, there is a significant potential for increased risks arising from workers' exposure to carbon nanofibers (CNF). This research aims at developing a low-cost, evidence-based tool, thereby increasing the sustainability of CNF manufacturing firms. The following specific aims achieve the study objective: Aim 1 - To present a technical discussion of the proposed concept for risk analysis and protection measures; Aim 2 - To describe the manufacturing process utilized for the CNF production; Aim 3 - To describe the hazards typically encountered in a CNF manufacturing facility; and, Aim 4 - To document the application of the proposed tool for risk analysis and intervention strategy development. In this study, a four-step methodology was developed to protect worker health in the nano-manufacturing enterprise through the generation of improvement actions (i.e., suggested changes in the hazard/work environment characteristics and individual capabilities without specifying how changes are made) followed by interventions (i.e., workplace solutions which specify how changes are being implemented). The methodology was implemented in a CNF manufacturing enterprise in the Midwest of the US. The data collected were based on detailed observations and interviews with worker and management personnel. A detailed flow process analysis was conducted for the nano-manufacturing operation. Eleven hazards were identified at the facility. Analysis indicated that the computed risk scores ranged from moderate (i.e., requiring one to start with incremental changes, then, explore substantial changes, if needed) to very high (i.e., substantial changes should be planned in the short term, followed by incremental changes). A detailed intervention plan was presented for the identified hazards on the basis of criteria of applicability, cost, benefit and feasibility. Management personnel were in…
Ngo, Duy; Sun, Ying; Genton, Marc G; Wu, Jennifer; Srinivasan, Ramesh; Cramer, Steven C; Ombao, Hernando
Many model-based methods have been developed over the last several decades for analysis of electroencephalograms (EEGs) in order to understand electrical neural data. In this work, we propose to use the functional boxplot (FBP) to analyze log periodograms of EEG time series data in the spectral domain. The functional boxplot approach produces a median curve, which is not equivalent to connecting medians obtained from frequency-specific boxplots. In addition, this approach identifies a functional median, summarizes variability, and detects potential outliers. By extending the FBP analysis from one-dimensional curves to surfaces, surface boxplots are also used to explore the variation of the spectral power for the alpha (8-12 Hz) and beta (16-32 Hz) frequency bands across the brain cortical surface. By using rank-based nonparametric tests, we also investigate the stationarity of EEG traces across an exam acquired during resting state by comparing the spectrum during the early vs. late phases of a single resting-state EEG exam.
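The functional median in a functional boxplot is conventionally the curve with the greatest band depth. A minimal sketch of modified band depth (MBD) for curves sampled on a common grid; the curves below are toy data, not EEG log periodograms, and production implementations differ in details such as how pairs containing the curve itself are counted:

```python
def modified_band_depth(curves):
    """Modified band depth (MBD) for curves on a common grid.

    For every pair of curves, each curve gains depth in proportion to the
    fraction of grid points where it lies inside the pair's envelope.
    The deepest curve is taken as the functional median.
    """
    n, T = len(curves), len(curves[0])
    depths = [0.0] * n
    pairs = 0
    for i in range(n):
        for j in range(i + 1, n):
            pairs += 1
            for c in range(n):
                inside = sum(
                    1 for t in range(T)
                    if min(curves[i][t], curves[j][t]) <= curves[c][t]
                       <= max(curves[i][t], curves[j][t])
                )
                depths[c] += inside / T
    return [d / pairs for d in depths]

# Five toy "curves": the middle one should be deepest, the last an outlier
curves = [
    [0.0, 0.1, 0.0, 0.1, 0.0],
    [1.0, 1.1, 1.0, 1.1, 1.0],
    [2.0, 2.0, 2.0, 2.0, 2.0],
    [3.0, 2.9, 3.0, 2.9, 3.0],
    [9.0, 9.5, 9.0, 9.5, 9.0],  # a potential outlier
]
depths = modified_band_depth(curves)
median_index = max(range(len(depths)), key=depths.__getitem__)
```

Note that the deepest curve here (index 2) is a whole observed curve, which is exactly why the functional median is not the same as connecting frequency-specific medians.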
Morita, T; Tsunoda, J; Inoue, S; Chihara, S
To determine an underlying factorial structure of existential distress in Japanese terminally ill cancer patients, a principal components analysis was performed on 162 Japanese hospice inpatients. Existential distress commonly identified was dependency (39%), meaninglessness in present life (37%), hopelessness (37%), burden on others (34%), loss of social role functioning (29%), and feeling emotionally irrelevant (28%). By a factor analysis, three primary components accounted for 66% of the variance. 'Dependency' and 'loss of social role functioning' loaded highly on the first factor, which was interpreted as 'loss of autonomy'. 'Burden on others' and 'feeling emotionally irrelevant' loaded highly on the second component interpreted as 'lowered self-esteem', while 'hopelessness' loaded highly on the third factor. On the other hand, 'meaninglessness in present life' loaded equally on all three components, and was significantly associated with other distress. In conclusion, existential suffering of Japanese terminally ill cancer patients has three principal components: loss of autonomy, lowered self-esteem, and hopelessness. It is also suggested that meaninglessness in present life would be an underlying theme in patients' spirituality.
Nieves, L.A.; Hemphill, R.C.; Clark, D.E.
Recent assessments of socioeconomic impacts resulting from the location of potentially hazardous facilities have concentrated on the issue of negative public perceptions and their resulting economic consequences. This report presents an analysis designed to answer the question: Can economic impacts resulting from negative perceptions of "noxious facilities" be identified and measured? To identify the impacts of negative perceptions, data on noxious facilities sited throughout the United States were compiled, and secondary economic and demographic data sufficient to analyze the economic impacts on the surrounding study areas were assembled. This study uses wage rate and property value differentials to measure impacts on social welfare so that the extent to which noxious facilities and their associated activities have affected surrounding areas can be determined.
Ferrari, Joseph R; Stevens, Edward B; Jason, Leonard A
Studies of self-regulation suggested that self-control requires finite resources which, in turn, may present a significant challenge for those trying to recover from or control addictive behaviors. The present study examined the relationships between self-regulation and abstinence maintenance among adults in recovery (n = 606: 407 men, 199 women; M age = 38.5 years) residing in self-governed, communal living, abstinent homes across the United States. Self-regulation scores (controlling for sex and age) were positively related to length of abstinence. In addition, a factor analysis of self-regulation scores resulted in some differentiation between general self-discipline and impulsivity in self-control related to addiction. The relationship between impulsivity and length of abstinence was stronger than the relationship derived between general self-regulation and length of abstinence.
Dembo, Richard; Wareham, Jennifer; Krupa, Julie; Winters, Ken C.
Little is known of sexual risk behaviors among truant youths across gender. This study utilized latent class analysis to examine heterogeneity of sexual risk behaviors across gender among a sample of 300 truant adolescents. Results revealed two latent subgroups within each gender: low vs. high sexual risk behaviors. There were gender differences in baseline covariates of sexual risk behaviors, with male truants in the higher-risk group experiencing ADHD (attention deficit hyperactivity disorder) problems and female truants in the higher-risk group experiencing marijuana use and depression symptoms. African-American race was a significant covariate for high sexual risk behaviors for both genders. Service and practice implications of sexual risk issues of truant youth are discussed. PMID:27066517
Chan, Christine M S; Kitzmann, Katherine M
The aim of this study was to explore health perceptions of preschool teachers, with a view to inform early childhood practices and teacher education. Pre-service student-teachers and in-service teachers (n = 200) who were voluntarily recruited completed a 24-item health attitude questionnaire. Factor analysis identified four dimensions of health attitudes, reflecting physical, psychosocial, mental and emotional domains. Inter-correlations among the factors suggested that early childhood educators in Hong Kong embrace a holistic view of health, although they consider physical and emotional health as more salient than the psychosocial and mental health dimensions. In comparisons of the perceptions of in-service teachers and student-teachers, students placed less emphasis on psychosocial health, but teachers placed more emphasis on physical health. The findings are discussed in terms of their implications for designing health education programmes for preschool teacher education.
Blay, Sergio Luís; Aguiar, João Vicente Augusto; Passos, Ives Cavalcante
Background The association between depression, anxiety, and polycystic ovary syndrome (PCOS) is still unclear. Therefore, a systematic review and meta-analysis was conducted to assess the rates of comorbid psychiatric disorders among women with PCOS compared to women without it. Methods PubMed/MEDLINE, Embase, PsycINFO, and Web of Science databases were searched from inception to November 27, 2015. Studies were eligible for inclusion if they were original reports in which the rates of mood (bipolar disorder, dysthymia, or major depressive disorder), obsessive–compulsive spectrum disorders, trauma- and stressor-related disorders, anxiety disorders or psychotic disorders, somatic symptom and related disorders, or eating disorders had been investigated among women with an established diagnosis of PCOS and compared with women without PCOS. Psychiatric diagnosis should have been established by means of a structured diagnostic interview or through a validated screening tool. Data were extracted and pooled using random effects models. Results Six studies were included in the meta-analysis; of these, five reported the rates of anxiety and six provided data on the rates of depression. The rate of subjects with anxiety symptoms was higher in patients with PCOS compared to women without PCOS (odds ratio (OR) =2.76; 95% confidence interval (CI) 1.26 to 6.02; Log OR =1.013; P=0.011). The rate of subjects with depressive symptoms was higher in patients with PCOS compared to women without PCOS (OR =3.51; 95% CI 1.97 to 6.24; Log OR =1.255; P<0.001). Conclusion Anxiety and depression symptoms are more prevalent in patients with PCOS. PMID:27877043
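The pooled odds ratios above come from random-effects models. A minimal DerSimonian-Laird sketch that pools study-level odds ratios, recovering each study's variance from its 95% CI; the three input studies are hypothetical, not those included in the review:

```python
import math

def pooled_or_random_effects(odds_ratios, ci_lowers, ci_uppers):
    """DerSimonian-Laird random-effects pooling of odds ratios.

    Each study contributes log(OR) with a variance recovered from its
    95% CI: se = (log(upper) - log(lower)) / (2 * 1.96).
    """
    y = [math.log(o) for o in odds_ratios]
    v = [((math.log(u) - math.log(l)) / (2 * 1.96)) ** 2
         for l, u in zip(ci_lowers, ci_uppers)]
    w = [1 / vi for vi in v]
    # Fixed-effect estimate and Cochran's Q heterogeneity statistic
    yf = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - yf) ** 2 for wi, yi in zip(w, y))
    k = len(y)
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)  # between-study variance
    # Re-weight with the between-study variance added to each study
    w_re = [1 / (vi + tau2) for vi in v]
    log_or = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    return math.exp(log_or)

# Three hypothetical studies (not the ones pooled in the review)
pooled = pooled_or_random_effects(
    odds_ratios=[2.5, 3.8, 3.0],
    ci_lowers=[1.2, 1.9, 1.5],
    ci_uppers=[5.2, 7.6, 6.0],
)
```

The pooled estimate always lies within the range of the study estimates; when heterogeneity (Q) is low, τ² shrinks to zero and the result coincides with the fixed-effect estimate.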
Ranalli, M Giovanna; Rocco, Giorgia; Jona Lasinio, Giovanna; Moroni, Beatrice; Castellini, Silvia; Crocchianti, Stefano; Cappelletti, David
In this work we propose the use of functional data analysis (FDA) to deal with a very large dataset of atmospheric aerosol size distribution resolved in both space and time. Data come from a mobile measurement platform in the town of Perugia (Central Italy). An OPC (Optical Particle Counter) is integrated on a cabin of the Minimetrò, an urban transportation system, which moves along a monorail on a line transect of the town. The OPC takes a sample of air every six seconds and counts the number of particles of urban aerosols with a diameter between 0.28 μm and 10 μm and classifies such particles into 21 size bins according to their diameter. Here, we adopt a 2D functional data representation for each of the 21 spatiotemporal series. In fact, space is unidimensional since it is measured as the distance on the monorail from the base station of the Minimetrò. FDA allows for a reduction of the dimensionality of each dataset and accounts for the high space-time resolution of the data. Functional cluster analysis is then performed to search for similarities among the 21 size channels in terms of their spatiotemporal pattern. Results provide a good classification of the 21 size bins into a relatively small number of groups (between three and four) according to the season of the year. Groups including coarser particles have more similar patterns, while those including finer particles show a more different behavior according to the period of the year. Such features are consistent with the physics of atmospheric aerosol and the highlighted patterns provide a very useful ground for prospective model-based studies.
Hojnicki, Jeffrey S.; Rusick, Jeffrey J.
Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all of the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability, because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
Sohlberg, McKay Moore; Todis, Bonnie; Fickas, Stephen; Ehlhardt, Laurie
The goal of this exploratory study was to investigate electronic communication as a potential method to enhance social communication in a range of students with disabilities. This study investigated the usability of an adapted e-mail interface, TeenMail, for 11 middle school students with significant learning and communication impairments who…
Blome, Wendy Whiting; Shields, Joseph; Verdieck, Mary Jeanne
The child welfare and substance abuse systems are integrally linked through the children and families they both serve. There is a dearth of knowledge, however, on how children who have experienced foster care fare when they are treated for substance abuse issues as adults. This article presents an exploratory study using the Alcohol and Drug…
Smith, Andrew; Thomas, Nigel
The XVII 2002 Commonwealth Games held in Manchester, England, was the first major international multi-sport event to include elite athletes with disabilities (EADs) in its main sports programme and medal table. In this exploratory article we seek to examine some of the complex issues surrounding the inclusion of EADs in the Manchester Games by…
Anderson-Hanley, Cay; Arciero, Paul J.; Westen, Sarah C.; Nimon, Joseph; Zimmerman, Earl
Objective This quasi-experimental exploratory study investigated neuropsychological effects of exercise among older adults with diabetes mellitus (DM) compared with adults without diabetes (non-DM), and it examined the feasibility of using a stationary bike exergame as a form of exercise for older adults with and without diabetes. It is a secondary analysis that uses a small dataset from a larger randomized clinical trial (RCT) called the Cybercycle Study, which compared cognitive and physiological effects of traditional stationary cycling versus cybercycling. Methods In the RCT and the secondary analysis, older adults living in eight independent living retirement facilities in the state of New York were enrolled in the study and assigned to exercise five times per week for 45 min per session (two times per week was considered acceptable for retention in the study) by using a stationary bicycle over the course of 3 months. They were randomly assigned to use either a standard stationary bicycle or a “cybercycle” with a video screen that displayed virtual terrains, virtual tours, and racing games with virtual competitors. For this secondary analysis, participants in the RCT who had type 2 DM (n = 10) were compared with age-matched non-DM exercisers (n = 10). The relationship between exercise and executive function (i.e., Color Trials 2, Digit Span Backwards, and Stroop C tests) was examined for DM and non-DM patients. Results Older adults with and without diabetes were able to use cybercycles successfully and complete the study, so the feasibility of this form of exercise for this population was supported. However, in contrast with the larger RCT, this small subset did not demonstrate statistically significant differences in executive function between the participants who used cybercycles and those who used stationary bikes with no games or virtual content on a video screen. Therefore, the study combined the two groups and called them “exercisers” and
Peterson, David; Truong, Pauline T.; Parpia, Sameer; Olivotto, Ivo A.; Berrang, Tanya; Kim, Do-Hoon; Kong, Iwa; Germain, Isabelle; Nichol, Alan; Akra, Mohamed; Roy, Isabelle; Reed, Melanie; Fyles, Anthony; Trotter, Theresa; Perera, Francisco; Balkwill, Susan; Lavertu, Sophie; Elliott, Elizabeth; and others
Purpose: To evaluate factors associated with adverse cosmesis outcome in breast cancer patients randomized to accelerated partial breast irradiation (APBI) using 3-dimensional conformal radiation therapy or whole-breast irradiation in the RAPID (Randomized Trial of Accelerated Partial Breast Irradiation) trial. Methods and Materials: Subjects were trial participants with nurse-assessed global cosmetic scores at baseline and at 3 years. Adverse cosmesis was defined as a score of fair or poor. Cosmetic deterioration was defined as any adverse change in score from baseline to 3 years. The analysis is based on data from the previously reported interim analysis. Logistic regression models were used to assess the association of risk factors for these outcomes among all patients and those treated with APBI only. Results: Clinicopathologic characteristics were similar between subjects randomized to APBI (n=569) or whole-breast irradiation (n=539). For all subjects, factors associated with adverse cosmesis at 3 years were older age, central/inner tumor location, breast infection, smoking, seroma volume, breast volume, and use of APBI; factors associated with cosmetic deterioration were smoking, seroma volume, and use of APBI (P<.05). For APBI subjects, tumor location, smoking, age, and seroma volume were associated with adverse cosmesis (P<.05), and smoking was associated with cosmetic deterioration (P=.02). An independent association between the V95/whole-breast volume ratio and adverse cosmesis (P=.28) or cosmetic deterioration (P=.07) was not detected. On further exploration, a V95/whole-breast volume ratio <0.15 was associated with a lower risk of cosmetic deterioration (P=.04), but this accounted for only 11% of patients. Conclusion: In the RAPID trial, a number of patient, tumor, and treatment-related factors, including the use of APBI, were associated with adverse cosmesis and cosmetic deterioration. For patients treated with APBI alone, the high-dose treatment
Eusebi, Paolo; Kreiner, Svend
Analysis within a Rasch measurement framework aims at the development of valid and objective test scores. One requirement of both validity and objectivity is that items show no evidence of differential item functioning (DIF). A number of procedures exist for the assessment of DIF, including those based on analysis of contingency tables by Mantel-Haenszel tests and partial gamma coefficients. The aim of this paper is to illustrate Multiple Comparison Procedures (MCP) for analysis of DIF relative to a variable defining a very large number of groups, with an unclear ordering with respect to the DIF effect. We propose a single-step procedure controlling the false discovery rate for DIF detection. The procedure applies to both dichotomous and polytomous items. In addition to providing evidence against a hypothesis of no DIF, the procedure also provides information on subsets of groups that are homogeneous with respect to the DIF effect. A stepwise MCP procedure for this purpose is also introduced.
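The proposed single-step procedure controls the false discovery rate; the classic step-up rule in that family is Benjamini–Hochberg. A generic sketch for intuition (not the authors' exact MCP, which additionally handles grouped DIF effects):

```python
def benjamini_hochberg(p_values, q=0.05):
    """Return the indices of hypotheses rejected at false discovery rate q."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    k = 0                                    # largest rank passing the BH line
    for rank, idx in enumerate(order, start=1):
        if p_values[idx] <= rank / m * q:
            k = rank
    return sorted(order[:k])                 # reject the k smallest p-values

print(benjamini_hochberg([0.01, 0.50, 0.03, 0.02], q=0.05))   # → [0, 2, 3]
```

Unlike Bonferroni, the threshold rises with the rank of the sorted p-value, so power is retained when many items genuinely show DIF.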
Erlyana, Erlyana; Acosta-Deprez, Veronica; O'Lawrence, Henry; Sinay, Tony; Ramirez, Jeremy; Jacot, Emmanuel C; Shim, Kyuyoung
The purpose of this study was to explore the characteristics of Internet users who seek health insurance information online, as well as the factors affecting their information-seeking behavior. Secondary data analysis was conducted using data from the 2012 Pew Internet Health Tracking Survey. Of 2,305 adult Internet users, only 29% sought health insurance information online. Bivariate analyses were conducted to test for differences in characteristics between those who seek health insurance information online and those who do not. A logistic regression model was used to identify significant predictors of online health insurance information-seeking behavior. Being a single parent, having a high school education or less, and being uninsured were significant predictors: such individuals were less likely to seek health insurance information online. Family caregivers of adults, and those who bought private health insurance or were entitled to Medicare, were more likely to seek health insurance information online than non-caregivers and the uninsured. The findings suggest that providing quality health insurance information online is critical for both the insured and the uninsured.
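A logistic regression coefficient converts to an odds ratio by exponentiation. The sketch below fits a one-predictor logistic model by plain gradient descent on toy data; the data, learning rate, and variable names are illustrative and not from the survey.

```python
import math

def fit_logistic(xs, ys, lr=0.5, epochs=2000):
    """Fit a one-predictor logistic regression by batch gradient descent."""
    b0, b1 = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))   # predicted probability
            g0 += (p - y) / n                             # gradient of log-loss
            g1 += (p - y) * x / n
        b0 -= lr * g0
        b1 -= lr * g1
    return b0, b1

# Toy binary predictor: 1/3 of the x=0 group vs 2/3 of the x=1 group say "yes"
xs = [0, 0, 0, 1, 1, 1]
ys = [0, 0, 1, 0, 1, 1]
b0, b1 = fit_logistic(xs, ys)
print(math.exp(b1))   # odds ratio for a one-unit increase in the predictor
```

Here the empirical odds go from 1:2 to 2:1, so the fitted odds ratio converges to 4.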
Carlesso, Lisa C; Cairney, John; Dolovich, Lisa; Hoogenes, Jennifer
Rare, serious, and common, benign adverse events (AE) are associated with manual therapy (MT) techniques. A proposed standard for defining AE in MT practice has been published, but it did not include the patient perspective. Research comparing clinician and patient reporting of AE demonstrates that several differences exist, for example in the reporting of objective versus subjective events. The objective of this study was to describe how patients define AE associated with MT techniques. A descriptive qualitative design was employed. Semi-structured interviews were conducted with a purposive sample of patients (n = 13) receiving MT from physiotherapy, chiropractic, and osteopathic practices in Ontario, Canada. The interview guide was informed by existing evidence and consultation with content and methodological experts. Interviews were audiotaped and transcribed verbatim. Data were analysed by two independent team members using thematic content analysis. A key finding was that patients defined mild, moderate, and major AE by pain/symptom severity, functional impact, duration, and by ruling out alternative causes. An overarching theme identified multiple factors that influence how an AE is perceived. These concepts differ from the previously proposed framework for defining AE, which did not include the patient perspective. Future processes to create standard definitions or measures should include the patient viewpoint to provide a broader, client-centred foundation.
Moore, Tyler M.; Reise, Steven P.; Depaoli, Sarah; Haviland, Mark G.
We describe and evaluate a factor rotation algorithm, iterated target rotation (ITR). Whereas target rotation (Browne, 2001) requires a user to specify a target matrix a priori based on theory or prior research, ITR begins with a standard analytic factor rotation (i.e., an empirically informed target) followed by an iterative search procedure to update the target matrix. In Study 1, Monte Carlo simulations were conducted to evaluate the performance of ITR relative to analytic rotations from the Crawford-Ferguson family with population factor structures varying in complexity. Simulation results: (a) suggested that ITR analyses will be particularly useful when evaluating data with complex structures (i.e., multiple cross-loadings) and (b) showed that the rotation method used to define an initial target matrix did not materially affect the accuracy of the various ITRs. In Study 2, we: (a) demonstrated the application of ITR as a way to determine empirically informed priors in a Bayesian confirmatory factor analysis (BCFA; Muthén & Asparouhov, 2012) of a rater-report alexithymia measure (Haviland, Warren, & Riggs, 2000) and (b) highlighted some of the challenges when specifying empirically based priors and assessing item and overall model fit. PMID:26609875
McManus, Kaitlyn; Cummings, Madeline; Visker, Joseph; Cox, Carol
Lead is a potent poison, toxic to many vital organs and body systems, especially the central nervous system of children, who are more vulnerable to lead poisoning than adults. The purpose of the study described in this article was to examine the relationship between elevated blood lead level (BLL) cases in children in the state of Missouri and pre-1980 home construction, lead mine proximity, and median household income, and to identify counties and areas for statewide prevention education. Results of the regression analysis indicated that these combined variables were significant predictors (F[3,111] = 19.106, p < .05, R2 = .341), accounting for 34.1% of the variance in the number of elevated BLL cases. The number of houses built prior to 1980 (β = .606, p < .05) and median household income (β = -0.186, p < .05) were specifically revealed to be significant predictors of elevated blood lead cases. In addition to screening in identified counties, Missouri's statewide plan should expand to include prevention education in all low-income counties.
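An R² of .341 means the predictors jointly account for 34.1% of the variance in elevated-BLL counts. For a single predictor the quantity can be computed directly from the least-squares fit; this is a generic sketch, not the study's three-predictor model:

```python
def r_squared(xs, ys):
    """R^2 of a simple least-squares line: share of variance explained."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b1 = sxy / sxx                          # slope
    b0 = my - b1 * mx                       # intercept
    ss_res = sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1.0 - ss_res / ss_tot

print(r_squared([1, 2, 3, 4], [2, 4, 6, 8]))   # perfect line → 1.0
```

With several predictors the same 1 − SS_res/SS_tot formula applies, with the fitted values coming from the multiple regression.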
Falcone, D; Bolda, E; Leak, S C
This article examines the causes of delayed hospital discharge for 3,111 patients waiting for alternative placement in 80 North Carolina acute care general hospitals during May 1989. Almost all of the patients were elderly: their average age was 77. Delay is defined as the period between the day a patient was judged medically ready for discharge by a discharge planner and the day the patient was discharged (or May 31 if unplaced). The average delay was 16.7 days. The policy-relevant patient characteristics associated with delay are requirement for heavy care, race, source of reimbursement, and whether or not there was a financial problem in arranging discharge. The patient's age and whether or not a problem with behavior or family cooperativeness was noted also were predictors. Along with patient characteristics, hospital features such as bed size, occupancy rate, and total revenues were correlated with delay. Local nursing and rest home (domiciliary) bed supply were insignificant predictors, possibly because of their limited variance: the number of nursing home beds in all North Carolina counties is below the national mean; the number of rest home beds exceeds it. The conclusion reached is that the delay problem warrants more intensive analysis, particularly regarding financial problems encountered at discharge, and race. Guidelines for such an endeavor are provided. Further, there is a need to recognize the increasing preponderance of a new type of heavy care patient via more appropriate reimbursement levels and "transitional care" services. PMID:1869444
Background The reconstruction of gene regulatory networks from high-throughput "omics" data has become a major goal in the modelling of living systems. Numerous approaches have been proposed, most of which attempt only "one-shot" reconstruction of the whole network with no intervention from the user, or offer only simple correlation analysis to infer gene dependencies. Results We have developed MINER (Microarray Interactive Network Exploration and Representation), an application that combines multivariate non-linear tree learning of individual gene regulatory dependencies, visualisation of these dependencies as both trees and networks, and representation of known biological relationships based on common Gene Ontology annotations. MINER allows biologists to explore the dependencies influencing the expression of individual genes in a gene expression data set in the form of decision, model or regression trees, using their domain knowledge to guide the exploration and formulate hypotheses. Multiple trees can then be summarised in the form of a gene network diagram. MINER is being adopted by several of our collaborators and has already led to the discovery of a new significant regulatory relationship with subsequent experimental validation. Conclusion Unlike most gene regulatory network inference methods, MINER allows the user to start from genes of interest and build the network gene-by-gene, incorporating domain expertise in the process. This approach has been used successfully with RNA microarray data but is applicable to other quantitative data produced by high-throughput technologies such as proteomics and "next generation" DNA sequencing. PMID:19958480
Aoki, Kotonari; Tomizawa, Shiho; Sone, Masayoshi; Tanaka, Riwa; Kuriki, Hiroshi; Takahashi, Yoichiro
Background Although several reports have suggested that patient-generated data from Internet sources could be used to improve drug safety and pharmacovigilance, few studies have identified such data sources in Japan. We introduce a unique Japanese data source: tōbyōki, which translates literally as “an account of a struggle with disease.” Objective The objective of this study was to evaluate the basic characteristics of the TOBYO database, a collection of tōbyōki blogs on the Internet, and discuss potential applications for pharmacovigilance. Methods We analyzed the overall gender and age distribution of the patient-generated TOBYO database and compared this with other external databases generated by health care professionals. For detailed analysis, we prepared separate datasets for blogs written by patients with depression and blogs written by patients with rheumatoid arthritis (RA), because these conditions were expected to entail subjective patient symptoms such as discomfort, insomnia, and pain. Frequently appearing medical terms were counted, and their variations were compared with those in an external adverse drug reaction (ADR) reporting database. Frequently appearing words regarding patients with depression and patients with RA were visualized using word clouds and word cooccurrence networks. Results As of June 4, 2016, the TOBYO database comprised 54,010 blogs representing 1405 disorders. Overall, more entries were written by female bloggers (68.8%) than by male bloggers (30.8%). The most frequently observed disorders were breast cancer (4983 blogs), depression (3556), infertility (2430), RA (1118), and panic disorder (1090). Comparison of medical terms observed in tōbyōki blogs with those in an external ADR reporting database showed that subjective and symptomatic events and general terms tended to be frequently observed in tōbyōki blogs (eg, anxiety, headache, and pain), whereas events using more technical medical terms (eg, syndrome and
Feldman, Charles; Murray, Douglas; Chavarria, Stephanie; Zhao, Hang
The increase in the weight of American adults and children has been positively associated with the prevalence of consumption of food away from home. The objective was to assess the accuracy of claimed nutritional information for foods purchased from contracted foodservices located on the campus of an institution of higher education. Fifty popular food items were randomly collected from five main dining outlets located on a selected campus in the northeastern United States. The sampling was repeated three times on separate occasions for an aggregate total of 150 food samples. The samples were then weighed and assessed for nutrient composition (protein, cholesterol, fiber, carbohydrates, total fat, calories, sugar, and sodium) using nutrient analysis software. Results were compared with the foodservices' published nutrition information. Comparisons between claimed and measured values were performed using the paired-sample t-test, along with descriptive statistics. Among the nine measured values, six (total fat, sodium, protein, fiber, cholesterol, and portion weight) had average positive discrepancies of more than 10% between measured and claimed values. Statistical significance of the variance was obtained in four of the eight categories of nutrient content: total fat, sodium, protein, and cholesterol (P < .05). Significance was also reached in the variance of actual portion weight compared to the published claims (P < .001). Significant differences in portion size (weight), total fat, sodium, protein, and cholesterol were found between the sampled values and the foodservices' published claims. The findings from this study raise the concern that if actual nutritional information does not accurately reflect the values declared on menus, conclusions, decisions, and actions based on posted information may not be valid.
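The paired-sample t-test used here treats each item's claimed and measured values as matched pairs, testing whether the mean difference is zero. A minimal sketch with hypothetical sodium values (the numbers are invented for illustration; the statistic is compared to a t distribution with n − 1 degrees of freedom):

```python
import math

def paired_t(before, after):
    """Paired-sample t statistic for matched claimed vs. measured values."""
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)   # sample variance
    return mean / math.sqrt(var / n)

# Hypothetical claimed vs. measured sodium values (mg), for illustration only
claimed  = [400, 350, 500, 450]
measured = [401, 352, 503, 454]
print(paired_t(claimed, measured))
```

Pairing removes the item-to-item variability, so only the within-item discrepancy drives the test.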
Ruiz, Agustín; Hernández, Isabel; Ronsende-Roca, Maiteé; González-Pérez, Antonio; Rodriguez-Noriega, Emma; Ramírez-Lorca, Reposo; Mauleón, Ana; Moreno-Rey, Concha; Boswell, Lucie; Tune, Larry; Valero, Sergi; Alegret, Montserrat; Gayán, Javier; Becker, James T.; Real, Luis Miguel; Tárraga, Lluís; Ballard, Clive; Terrin, Michael; Sherman, Stephanie; Payami, Haydeh; López, Oscar L.; Mintzer, Jacobo E.; Boada, Mercè
The relationships between GWAS-identified and replicated genetic variants associated with Alzheimer's disease (AD) risk and disease progression or therapeutic response in AD patients are almost unexplored. 701 AD patients with at least three different cognitive evaluations and genotypic information for APOE and six GWAS-significant SNPs were selected for this study. Mean differences in GDS and MMSE were evaluated using non-parametric tests, GLM, and mixed models for repeated measurements. Each chart was also reviewed for evidence of treatment with any cholinesterase inhibitor (AChEI), memantine, or both. Relationships between therapeutic protocols, genetic markers, and progression were explored using stratified analysis, looking for specific effects on progression in each therapeutic category separately. Neither calculation rendered a Bonferroni-corrected statistically significant difference for any genetic marker. Mixed model results suggested differences in the average MMSE score for patients carrying the PICALM GA or AA genotype compared to GG carriers at the end of the follow-up (MMSE mean difference = −0.57, C.I.95% [−1.145, −0.009], p=0.047). These observations remained unaltered after covariate adjustment, although they did not achieve the predefined multiple-testing significance threshold. The PICALM SNP also displayed a significant effect protecting against rapid progression in pharmacogenetic analyses, although the observed effect displayed heterogeneity among AD therapeutic protocols (p=0.039). None of the studied genetic markers was convincingly linked to AD progression or drug response. However, across different statistical approaches, the PICALM rs3851179 marker displayed consistent but weak effects on disease progression phenotypes. PMID:23036585
Selling, Rebecca E.; Hutchison, Kent E.
Rationale Cannabis dependence is a growing problem among individuals who use marijuana frequently, and genetic differences make some users more liable to progress to dependence. The identification of intermediate phenotypes of cannabis dependence may aid candidate genetic analysis. Promising intermediate phenotypes include craving for marijuana, withdrawal symptoms after abstinence, and sensitivity to its acute effects. A single nucleotide polymorphism (SNP) in the gene encoding for fatty acid amide hydrolase (FAAH) has demonstrated association with substance use disorder diagnoses, but has not been studied with respect to these narrower phenotypes. FAAH is an enzyme that inactivates anandamide, an endogenous agonist for CB1 receptors (to which Δ9-tetrahydrocannabinol binds). CB1 binding modulates mesocorticolimbic dopamine release, which underlies many facets of addiction. Objectives The SNP, FAAH C385A (rs324420), was examined to determine whether its variance was associated with changes in craving and withdrawal after marijuana abstinence, craving after cue exposure, or sensitivity to the acute effects of marijuana. Materials and methods Forty daily marijuana users abstained for 24 h, were presented with a cue-elicited craving paradigm and smoked a marijuana cigarette in the laboratory. Results C385A variance was significantly associated with changes in withdrawal after abstinence, and happiness after smoking marijuana in the predicted directions, was associated with changes in heart rate after smoking in the opposite of the predicted direction, and was not associated with changes in craving or other acute effects. Conclusions These data lend support to some previous association studies of C385A, but suggest that further refinement of these intermediate phenotypes is necessary. PMID:19002671
Menon, Vikas; Kattimani, Shivanand; Sarkar, Siddharth; Mathan, Kaliaperumal
Background: Evidence indicates that repeat suicide attempters, as a group, may differ from first-time attempters. The identification of repeat attempters is a powerful but underutilized clinical variable. Aims: In this research, we aimed to compare individuals with lifetime histories of multiple attempts with first-time attempters to identify factors predictive of repeat attempts. Setting and Design: This was a retrospective, record-based study carried out at a teaching-cum-tertiary care hospital in South India. Methods: Relevant data were extracted from the clinical records of first-time attempters (n = 362) and repeat attempters (n = 61) presenting to a single tertiary care center over a 4½-year period. They were compared on various sociodemographic and clinical parameters. The clinical measures included the Presumptive Stressful Life Events Scale, Beck Hopelessness Scale, Coping Strategies Inventory – Short Form, and the Global Assessment of Functioning Scale. Statistical Analysis Used: First-time attempters and repeaters were compared using appropriate inferential statistics. Logistic regression was used to identify independent predictors of repeat attempts. Results: The two groups did not differ significantly on sociodemographic characteristics. Repeat attempters were more likely to have given prior hints about their act (χ2 = 4.500, P = 0.034). In the final regression model, the Beck hopelessness score emerged as a significant predictor of repeat suicide attempts (odds ratio = 1.064, P = 0.020). Conclusion: Among suicide attempters presenting to the hospital, the presence of hopelessness is a predictor of repeat suicide attempts, independent of clinical depression. This highlights the importance of considering hopelessness in the assessment of suicidality, with a view to minimizing the risk of future attempts. PMID:26933353
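The reported odds ratio of 1.064 applies per one-point increase in the hopelessness score; because logistic effects multiply on the odds scale, modest per-point odds ratios compound across a score range. A quick illustration (the 10-point span is an arbitrary choice, not from the study):

```python
import math

# Reported: OR = 1.064 per one-point increase in the Beck hopelessness score.
# On the log-odds scale the effect is additive, so a k-point increase
# multiplies the odds by OR**k (assuming the linear logit holds over the range).
or_per_point = 1.064
beta = math.log(or_per_point)          # the logistic regression coefficient
or_10_points = or_per_point ** 10      # odds multiplier over a 10-point span
print(round(beta, 3), round(or_10_points, 2))   # → 0.062 1.86
```

So a patient scoring 10 points higher on the scale has odds of a repeat attempt nearly doubled under the fitted model.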
Sacco, Isabel C. N.; Suda, Eneida Yuri; Vigneron, Vincent; Sartor, Cristina Dallemole
Aims/Hypothesis Early diagnosis of diabetic polyneuropathy (DPN) is critical for a good prognosis. We aimed to identify different groups of patients, based on the various common clinical signs and symptoms of DPN, that represent a progressive worsening of the disease before the onset of plantar ulceration or amputation. We also sought to identify the most important DPN-related variables that can discriminate between groups, thus representing the most informative variables for early detection. Methods In 193 diabetic patients, we assessed 16 DPN-related signs, symptoms, and foot characteristics, based on the literature and the International Consensus on the Diabetic Foot. We used multiple correspondence analysis and the Kohonen algorithm to group the variables into micro and macro-classes and to identify clusters of patients that represent different DPN conditions. Results Four distinct groups were observed. One group showed no indication of DPN. The remaining groups were characterized by a progressive loss of the vibration perception, without a worsening of symptoms or tactile perception. The 2 intermediate groups presented different aspects of DPN: one showed mostly DPN symptoms and the other showed the incipient vibration impairment, callus and crack formation, and foot arch alteration. The fourth group showed more severe foot and DPN conditions, including ulceration and amputation, absence of vibration and tactile perception (irrespective of how many compromised foot areas), and worse foot deformities and callus and crack formation. Conclusion Vibration perception was more informative than tactile sensitivity in discriminating early DPN onset because its impairment was evident in more groups. Symptoms and callus and cracks did not discriminate the severity status and should be interpreted in association with other clinical variables. Reconsideration of the current screening techniques is needed to clinically determine the early onset of neuropathy using tactile
Pierce, W D; Epling, W F; Dews, P B; Estes, W K; Morse, W H; Van Orman, W; Herrnstein, R J
The relationship between basic research with nonhumans and applied behavior analysis is illustrated by our work on activity anorexia. When rats are fed one meal a day and allowed to run on an activity wheel, they run excessively, stop eating, and die of starvation. Convergent evidence, from several different research areas, indicates that the behavior of these animals and humans who self-starve is functionally similar. A biobehavioral theory of activity anorexia is presented that details the cultural contingencies, behavioral processes, and physiology of anorexia. Diagnostic criteria and a three-stage treatment program for activity-based anorexia are outlined. The animal model permits basic research on anorexia that for practical and ethical reasons cannot be conducted with humans. Thus, basic research can have applied importance.
Conaway, Cody R.
From 2001 to 2011, the General Aviation (GA) fatal accident rate remained unchanged (Duquette & Dorr, 2014), with an overall stagnant accident rate between 2004 and 2013. The leading cause was loss of control in flight (NTSB, 2015b, 2015c), due to pilots' inability to recognize approach-to-stall/spin conditions (NTSB, 2015b, 2016b). In 2013, there were 1,224 GA accidents in the U.S., accounting for 94% of all U.S. aviation accidents and 90% of all U.S. aviation fatalities that year (NTSB, 2015c). Aviation entails multiple challenges for pilots related to task management, procedural errors, perceptual distortions, and cognitive discrepancies. While machine errors in airplanes have continued to decrease over the years, human error has not (NTSB, 2013). A preliminary analysis of a PC-based Garmin G1000 flight deck was conducted with three professional pilots. Analyses revealed increased task load, opportunities for distraction, confusing perceptual cues, and hindered cognitive performance. Complex usage problems were deeply ingrained in the functionality of the system, forcing pilots to use fallible workarounds, add unnecessary steps, and memorize knob turns or button pushes. Modern computing now has the potential to free GA cockpit designs from knobs, soft keys, or limited display options. Dynamic digital displays might include changes in instrumentation or menu structuring depending on the phase of flight. Airspeed indicators could increase in size to become more salient during landing, simultaneously highlighting pitch angle on attitude indicators and automatically decluttering unnecessary information for landing. Likewise, angle-of-attack indicators offer a great safety and performance advantage for pilots (Duquette & Dorr, 2014; NTSB, 2015b, 2016b); such instruments are typically found in military platforms and now in the Icon A5 light-sport aircraft (Icon, 2016). How does the design of pilots' environment---the cockpit---further influence their efficiency and
Mendes, S.; Bretherton, C.
upper troposphere (above 6 km), a good linear fit was obtained with L around 1.5 km, but in the mid-troposphere (3-5 km altitude). Binning methods to understand the relative contributions of unsaturated air, saturated updrafts, and saturated downdrafts, and principal component analysis of the CMT profiles over the full 120-day record, are in progress and will also be reported.
Gossard, Marcia Hill
Electricity is one of the most serious issues of the 21st century. Modern human societies have become completely dependent upon energy to power modern life---resulting in unwanted environmental effects. Although electricity itself is invisible, many of the most conspicuous household items consume the most electricity. The 2001 energy crisis in California provides a unique opportunity to study how people negotiated their lives during a time of perceived resource scarcity, increased electricity prices, and threats of blackouts. Combining cultural and environmental literatures, I argued that changes in resource availability (perceived or real) led to unsettled lives in which beliefs, rituals and ways of behaving began to be questioned---resulting in new patterns of action organized around lifestyle. As a conceptual framework, lifestyle can be useful for understanding the patterns of people's everyday lives, the objects they consume, and the degrees to which those lifestyles affect the environment. Using data from the California Residential Electricity Conservation Study (CRECS), this research explores the ways households navigated and used different conservation strategies during the summers of 2001 and 2002. Analysis of Behavioral Conservation Strategies (BCS) that require ongoing effort and attention by household residents in order to achieve successful conservation outcomes (e.g., turning off lights or regulating indoor temperature), and Consumer Investment Strategies (CIS) that are one-time purchases improving efficiency (e.g., purchase of an appliance or fixture) reveal different strategies of action over the two years. Wealth indicators and time constraints were less important for predicting conservation, while cultural differences and household composition were better predictors of conservation efforts. In addition, despite assumptions that people are unwilling to change their lifestyle in order to conserve electricity, households employed more strategies that
Plonsky, Luke; Gonulal, Talip
Research synthesis and meta-analysis provide a pathway to bring together findings in a given domain with greater systematicity, objectivity, and transparency than traditional reviews. The same techniques and corresponding benefits can be and have been applied to examine methodological practices in second language (L2) research (e.g., Plonsky,…
In response to Osborne (1999), the aims and practices of person-centered planning (PCP) are compared to the basic principles of applied behavior analysis set forth by Baer, Wolf, and Risley (1968, 1987). The principal goal of PCP is social integration of people with disabilities; it qualifies as a socially important behavior, and its problems have been displayed sufficiently. However, social integration is a complex social problem whose solution requires access to system contingencies that influence lifestyles. Nearly all of the component goals of PCP proposed by O'Brien (1987b) have been reliably quantified, although concurrent measurement of outcomes such as friendship, autonomy, and respect presents a formidable challenge. Behavioral principles such as contingency and contextual control are operative within PCP, but problems in achieving reliable implementation appear to impede an experimental analysis.
Patino, Gustavo; Gioria, Rafael; Meneghini, Julio
The sensitivity of an eigenvalue to base-flow modifications induced by an external force is applied to the global unstable modes associated with the onset of vortex shedding in the wake of a stalled airfoil. In this work, the flow regime is close to the first instability of the system, and its associated eigenvalue/eigenmode is determined. The sensitivity analysis with respect to a general pointwise external force establishes the regions where control devices must be placed in order to stabilize the global modes. Different types of steady control devices, passive and active, are placed in the regions predicted by the sensitivity analysis to verify vortex shedding suppression, i.e., that the primary instability bifurcation is delayed. The new eigenvalue, modified by the action of the device, is also calculated. Finally, the spectral finite element method is employed to determine flow characteristics before and after the bifurcation in order to cross-check the results.
Vasylchenkova, Anastasiia; Mraz, Miha; Zimic, Nikolaj; Moskon, Miha
Biological oscillators present a fundamental part of several regulatory mechanisms that control the response of various biological systems. Several analytical approaches for their analysis have been reported recently. They are, however, limited to specific oscillator topologies and/or to giving only qualitative answers, i.e., whether the dynamics of an oscillator over a given parameter space is oscillatory or not. Here we present a general analytical approach that can be applied to the analysis of biological oscillators. It relies on the projection of biological systems onto classical mechanics systems. The approach provides relatively accurate results regarding the type of behaviour the system exhibits (i.e., oscillatory or not) and the periods of potential oscillations, without the need to conduct expensive numerical simulations. We demonstrate and verify the proposed approach on three different implementations of the amplified negative feedback oscillator.
Zhang, S.; Huang, M.; Wada, R.; Liu, X.; Lin, W.; Wang, J.
A new method is proposed to perform shape analyses and to evaluate their validity in heavy ion collisions near the Fermi energy. In order to avoid erroneous values of shape parameters in the calculation, a test particle method is utilized in which each nucleon is represented by n test particles, similar to that used in the Boltzmann–Uehling–Uhlenbeck (BUU) calculations. The method is applied to the events simulated by an antisymmetrized molecular dynamics model. The geometrical shape of fragments is reasonably extracted when n = 100 is used. A significant deformation is observed for all fragments created in the multifragmentation process. The method is also applied to the shape of the momentum distribution for event classification. In the momentum case, the errors in the eigenvalue calculation become much smaller than those of the geometrical shape analysis and the results become similar between those with and without the test particle method, indicating that in intermediate heavy ion collisions the shape analysis of momentum distribution can be used for the event classification without the test particle method.
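The test-particle smearing step described above can be sketched in a few lines (an illustrative toy, not the authors' AMD/BUU code; the Gaussian smearing width, the 12-nucleon prolate "fragment", and the choice n = 100 are all assumptions made for the example):

```python
import numpy as np

rng = np.random.default_rng(0)

def shape_eigenvalues(positions, n_test=100, width=0.5):
    """Diagonalise the shape (covariance) tensor built from test particles.

    Each nucleon at position r is represented by `n_test` Gaussian-smeared
    test particles of width `width`, in the spirit of the test-particle
    method used in BUU calculations.  Returns eigenvalues in descending
    order; their ratios characterise the deformation.
    """
    pts = np.concatenate([
        rng.normal(loc=r, scale=width, size=(n_test, 3)) for r in positions
    ])
    pts -= pts.mean(axis=0)              # move to the centre-of-mass frame
    tensor = pts.T @ pts / len(pts)      # 3x3 shape tensor
    return np.sort(np.linalg.eigvalsh(tensor))[::-1]

# A deliberately prolate "fragment": 12 nucleons stretched along z
nucleons = np.array([[0.0, 0.0, z] for z in np.linspace(-3.0, 3.0, 12)])
lam = shape_eigenvalues(nucleons, n_test=100)
print(lam)   # the z-axis eigenvalue dominates for a prolate shape
```

Smearing each nucleon over many test particles is what keeps the tensor eigenvalues well conditioned for small fragments, which is the erroneous-shape-parameter problem the abstract's method addresses.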
Pinelli, Thomas E.; Glassman, Myron; Barclay, Rebecca O.; Oliu, Walter E.
Data collected from an exploratory study concerned with the technical communications practices of aerospace engineers and scientists were analyzed to test the primary assumption that aerospace managers and nonmanagers have different technical communications practices. Five assumptions were established for the analysis. Aerospace managers and nonmanagers were found to have different technical communications practices for three of the five assumptions tested. Although aerospace managers and nonmanagers were found to have different technical communications practices, the evidence was neither conclusive nor compelling that the presumption of difference in practices could be attributed to the duties performed by aerospace managers and nonmanagers.
[NASA/DOD Aerospace Knowledge Diffusion Research Project. Report 3:] Technical communications in aeronautics: Results of an exploratory study. An analysis of profit managers' and nonprofit managers' responses
Pinelli, Thomas E.; Glassman, Myron; Barclay, Rebecca O.; Oliu, Walter E.
Data collected from an exploratory study concerned with the technical communications practices of aerospace engineers and scientists were analyzed to test the primary assumption that profit and nonprofit managers in the aerospace community have different technical communications practices. Five assumptions were established for the analysis. Profit and nonprofit managers in the aerospace community were found to have different technical communications practices for one of the five assumptions tested. It was, therefore, concluded that profit and nonprofit managers in the aerospace community do not have different technical communications practices.
Bowman, L. E.; Spilde, M. N.; Papike, James J.
Analysis of volume proportions of minerals, or modal analysis, is routinely accomplished by point counting on an optical microscope, but the process, particularly on brecciated samples such as the diogenite meteorites, is tedious and prone to error by misidentification of very small fragments, which may make up a significant volume of the sample. Precise volume percentage data can be gathered on a scanning electron microscope (SEM) utilizing digital imaging and an energy dispersive spectrometer (EDS). This form of automated phase analysis reduces error, and at the same time provides more information than could be gathered using simple point counting alone, such as particle morphology statistics and chemical analyses. We have previously studied major, minor, and trace-element chemistry of orthopyroxene from a suite of diogenites. This abstract describes the method applied to determine the modes on this same suite of meteorites and the results of that research. The modal abundances thus determined add additional information on the petrogenesis of the diogenites. In addition, low-abundance phases such as spinels were located for further analysis by this method.
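Once each pixel of the SEM/EDS map has been assigned to a phase, the modal computation itself reduces to label counting; a minimal sketch (the phase names and the tiny 4x4 map are invented for illustration):

```python
import numpy as np

# Hypothetical classified phase map: each pixel already labelled by the
# automated EDS phase identification (0, 1, 2 index into phase_names).
phase_names = ["orthopyroxene", "plagioclase", "spinel"]
phase_map = np.array([
    [0, 0, 0, 1],
    [0, 0, 1, 1],
    [0, 0, 0, 2],
    [0, 0, 0, 0],
])

# Modal (volume-percent) analysis: count labels and normalise.
counts = np.bincount(phase_map.ravel(), minlength=len(phase_names))
modal_percent = 100.0 * counts / counts.sum()
for name, pct in zip(phase_names, modal_percent):
    print(f"{name}: {pct:.1f}%")
```

The same labelled map also supports the extras the abstract mentions, e.g. particle morphology statistics via connected-component analysis of each phase mask.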
Leaf, Justin B; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K; Smith, Tristram; Weiss, Mary Jane
Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful progress for individuals diagnosed with autism spectrum disorder (ASD). We describe this approach as progressive. In a progressive approach to ABA, the therapist employs a structured yet flexible process, which is contingent upon and responsive to child progress. We will describe progressive ABA, contrast it to reductionist ABA, and provide rationales for both the substance and intent of ABA as a progressive scientific method for improving conditions of social relevance for individuals with ASD.
Grey, Ian M; Honan, Rita; McClean, Brian; Daly, Michael
Interventions for children with autism based upon Applied Behaviour Analysis (ABA) have been repeatedly shown to be related both to educational gains and to reductions in challenging behaviours. To date, however, comprehensive training in ABA for teachers and others has been limited. Over 7 months, 11 teachers undertook 90 hours of classroom instruction and supervision in ABA. Each teacher conducted a comprehensive functional assessment and designed a behaviour support plan targeting one behaviour for one child with an autistic disorder. Target behaviours included aggression, non-compliance, and specific educational skills. Teachers recorded observational data on the target behaviour for both baseline and intervention sessions. Support plans produced an average 80 percent change in the frequency of occurrence of target behaviours. Questionnaires completed by parents and teachers at the end of the course indicated a beneficial effect for the children and the educational environment. The potential benefits of teacher-implemented behavioural intervention are discussed.
Ma, Ying; Thakor, Nitish V; Jia, Xiaofeng
Motor evoked potentials (MEPs) convey information regarding the functional integrity of the descending motor pathways. Absence of the MEP has been used as a neurophysiological marker to suggest cortico-spinal abnormalities in the operating room. Due to their high variability and sensitivity, detailed quantitative studies of MEPs are lacking. This paper applies a statistical method to characterize MEPs by estimating the number of motor units and single motor unit potential amplitudes. A clearly increasing trend of single motor unit potential amplitudes in the MEPs after each pulse of the stimulation pulse train is revealed by this method. The statistical method eliminates the effects of anesthesia and provides an objective assessment of MEPs. Consequently, it has high potential to be useful in future quantitative MEP analysis.
We are developing a machine learning system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information (which we call a "domain theory"), our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, the KBANN algorithm maps inference rules, such as consensus sequences, into a neural (connectionist) network. Neural network training techniques then use the training examples to refine these inference rules. We have been applying this approach to several problems in DNA sequence analysis and have also been extending the capabilities of our learning system along several dimensions.
Shavlik, J.W. (Dept. of Computer Sciences); Noordewier, M.O. (Dept. of Computer Science)
We are primarily developing a machine learning (ML) system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information, our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, our KBANN algorithm maps inference rules about a given recognition task into a neural network. Neural network training techniques then use the training examples to refine these inference rules. We call these rules a domain theory, following the convention in the machine learning community. We have been applying this approach to several problems in DNA sequence analysis. In addition, we have been extending the capabilities of our learning system along several dimensions. We have also been investigating parallel algorithms that perform sequence alignments in the presence of frameshift errors.
Portilla, F.; Valencia, J. L.; Tarquis, A. M.; Saa-Requejo, A.
Fractal and multifractal are concepts that have grown increasingly popular in soil analysis in recent years, along with the development of fractal models. A common step is to calculate the slope of a linear fit, usually by the least-squares method. This should not pose a special problem; however, with experimental data the researcher often has to select the range of scales over which to work, neglecting the remaining points, in order to achieve the linearity that this type of analysis requires. Robust regression is a form of regression analysis designed to circumvent some limitations of traditional parametric and non-parametric methods: an outlier need not be assumed to be simply an extreme observation drawn from the tail of a normal distribution that does not compromise the validity of the regression results. In this work we have evaluated the capacity of robust regression to select the experimental data points to be used, trying to avoid subjective choices. Based on this analysis we have developed a new work methodology with two basic steps: • Evaluation of the improvement in the linear fit, based on the R p-value, when consecutive points are eliminated; in this way we consider the implications of reducing the number of points. • Evaluation of the significance of the difference between the slope fitted with the two extreme points included and the slope fitted with the remaining points. We compare the results of applying this methodology with those of the commonly used least-squares method. The data selected for these comparisons come from experimental soil roughness transects and from simulations based on the midpoint displacement method with added trends and noise. The results are discussed, indicating the advantages and disadvantages of each methodology. Acknowledgements: Funding provided by CEIGRAM (Research Centre for the Management of Agricultural and Environmental Risks) and by the Spanish Ministerio de Ciencia e Innovación (MICINN) through project no
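The contrast between an ordinary least-squares slope and a robust alternative can be illustrated as follows. The abstract does not name the robust estimator it uses, so a Theil-Sen fit (median of pairwise slopes) stands in here, and the scaling data are synthetic:

```python
import numpy as np

def ols_slope(x, y):
    """Ordinary least-squares slope (the conventional choice)."""
    return np.polyfit(np.asarray(x), np.asarray(y), 1)[0]

def theil_sen_slope(x, y):
    """A robust slope: the median of all pairwise slopes (Theil-Sen)."""
    x, y = np.asarray(x), np.asarray(y)
    i, j = np.triu_indices(len(x), k=1)
    return np.median((y[j] - y[i]) / (x[j] - x[i]))

# Synthetic log-log scaling data with true exponent 0.8, plus one
# outlying point at the largest scale -- the kind of point a researcher
# would otherwise remove by hand when selecting the scaling range.
log_r = np.linspace(0.0, 3.0, 10)
log_N = 0.8 * log_r + 0.1
log_N[-1] += 1.5

print(ols_slope(log_r, log_N))        # pulled well above 0.8 by the outlier
print(theil_sen_slope(log_r, log_N))  # stays at 0.8
```

The robust fit effectively down-weights the aberrant scale automatically, which is the "avoid subjective choices" goal of the methodology above.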
Pallin, Simon B; Kehrer, Manfred
A residential building constructed with an unvented attic is a common roof assembly in the United States. The expected hygrothermal performance and service life of the roof are difficult to estimate due to a number of varying parameters. Typical varying parameters are the climate and the direction and slope of the roof, as well as the radiation properties of the surface material. Further influential parameters are indoor moisture excess, air leakage through the attic floor, and leakage from the air-handling unit and ventilation ducts. In addition, the choice of building materials, such as the insulation material and closed- or open-cell spray polyurethane foam, will influence the future performance of the roof. The development of a simulation model of the roof assembly enables a risk and sensitivity analysis in which the most important varying parameters for hygrothermal performance can be determined. The model is designed to perform probabilistic simulations using mathematical and hygrothermal calculation tools. The varying input parameters can be chosen from existing measurements, simulations, or standards. An analysis is applied to determine the risk of consequences such as mold growth, rot, or increased energy demand of the HVAC unit. Furthermore, the future performance of the roof can be simulated in different climates to facilitate the design of an efficient and reliable roof construction with the most suitable technical solution and to determine the most appropriate building materials for a given climate.
Honisch, Christoph; Friedrich, Rudolf; Hörner, Florian; Denz, Cornelia
The Kramers-Moyal analysis is a well-established approach to analyzing stochastic time series from complex systems. If the sampling interval of a measured time series is too coarse, systematic errors occur in the analysis results. These errors are labeled finite time effects in the literature. In the present article, we present some new insights about these effects and discuss the limitations of a previously published method for estimating Kramers-Moyal coefficients in the presence of finite time effects. To increase the reliability of this method and to avoid misinterpretations, we extend it by computing error estimates for estimated parameters using a Monte Carlo error propagation technique. Finally, the extended method is applied to a data set from an optical trapping experiment, yielding estimates of the forces acting on a Brownian particle trapped by optical tweezers. We find an increased Markov-Einstein time scale of the order of the relaxation time of the process, which can be traced back to memory effects caused by the interaction of the particle and the fluid. Above the Markov-Einstein time scale, the process can be very well described by the classical overdamped Markov model for Brownian motion.
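A minimal sketch of first Kramers-Moyal coefficient (drift) estimation on a simulated Ornstein-Uhlenbeck process, using the standard conditional-moment estimator; all parameter values are illustrative, and the `stride` argument is how one would emulate the coarse sampling interval behind the finite time effects discussed above:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate an overdamped Ornstein-Uhlenbeck process
#   dx = -gamma * x * dt + sqrt(2 * D) * dW
gamma, D, dt, n = 1.0, 0.5, 1e-3, 200_000
x = np.empty(n)
x[0] = 0.0
kicks = rng.normal(0.0, np.sqrt(2.0 * D * dt), n - 1)
for i in range(n - 1):
    x[i + 1] = x[i] - gamma * x[i] * dt + kicks[i]

def drift_estimate(x, dt, stride=1, bins=20):
    """First Kramers-Moyal coefficient D1(x) ~ <x(t+tau) - x(t)> / tau,
    estimated by binning the state space.  tau = stride * dt, so a
    large stride emulates a coarse sampling interval."""
    tau = stride * dt
    x0, dx = x[:-stride], x[stride:] - x[:-stride]
    edges = np.linspace(-1.5, 1.5, bins + 1)
    idx = np.digitize(x0, edges) - 1
    centers = 0.5 * (edges[:-1] + edges[1:])
    d1 = np.array([dx[idx == b].mean() / tau if np.any(idx == b) else np.nan
                   for b in range(bins)])
    return centers, d1

centers, d1 = drift_estimate(x, dt, stride=1)
valid = ~np.isnan(d1)
slope = np.polyfit(centers[valid], d1[valid], 1)[0]
print(slope)   # the OU drift is D1(x) = -gamma * x, so slope ~ -gamma
```

Rerunning with a larger `stride` biases the recovered slope, which is the finite time effect the article quantifies.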
Walsh, J; Desbonnet, L; Clarke, N; Waddington, J L; O'Tuathaigh, C M P
Disrupted-in-schizophrenia-1 (DISC1) is a gene that has been functionally linked with neurodevelopmental processes and structural plasticity in the brain. Clinical genetic investigations have implicated DISC1 as a genetic risk factor for schizophrenia and related psychoses. Studies using mutant mouse models of DISC1 gene function have demonstrated schizophrenia-related anatomical and behavioral endophenotypes. In the present study, ethologically based assessment of exploratory and habituation behavior in the open field was conducted in DISC1 (L100P), wild-type (WT), heterozygous (HET), and homozygous (HOM) mutant mice of both sexes. Ethological assessment was conducted in an open-field environment to explore specific topographies of murine exploratory behavior across the extended course of interaction from initial exploration through subsequent habituation (the ethogram). During initial exploration, HET and HOM DISC1 mutants evidenced increased levels of locomotion and rearing to wall compared with WT. A HOM-specific increase in total rearing and a HET-specific increase in sifting behavior and reduction in rearing seated were also observed. Over subsequent habituation, locomotion, sniffing, total rearing, rearing to wall, rearing free, and rearing seated were increased in HET and HOM mutants vs. WT. Overall, grooming was increased in HOM relative to other genotypes. HET mice displayed a selective decrease in habituation of sifting behavior. These data demonstrate impairment in both initial exploratory and habituation of exploration in a novel environment in mice with mutation of DISC1. This is discussed in the context of the functional role of the gene vis à vis a schizophrenia phenotype as well as the value of ethologically based approaches to behavioral phenotyping.
Komm, R. W.; Gu, Y.; Hill, F.; Stark, P. B.; Fodor, I. K.
Estimates of solar normal mode frequencies from helioseismic observations can be improved by using Multitaper Spectral Analysis (MTSA) to estimate spectra from the time series, then using wavelet denoising of the log spectra. MTSA leads to a power spectrum estimate with reduced variance and better leakage properties than the conventional periodogram. Under the assumption of stationarity and mild regularity conditions, the log multitaper spectrum has a statistical distribution that is approximately Gaussian, so wavelet denoising is asymptotically an optimal method to reduce the noise in the estimated spectra. We find that a single m-upsilon spectrum benefits greatly from MTSA followed by wavelet denoising, and that wavelet denoising by itself can be used to improve m-averaged spectra. We compare estimates using two different 5-taper estimates (Slepian and sine tapers) and the periodogram estimate, for GONG time series at selected angular degrees l. We compare those three spectra with and without wavelet denoising, both visually and in terms of the mode parameters estimated from the pre-processed spectra using the GONG peak-fitting algorithm. The two multitaper estimates give equivalent results. The number of modes fitted well by the GONG algorithm is 20% to 60% larger (depending on l and the temporal frequency) when applied to the multitaper estimates than when applied to the periodogram. The estimated mode parameters (frequency, amplitude, and width) are comparable for the three power spectrum estimates, except for modes with very small mode widths (a few frequency bins), where the multitaper spectra broaden the modes compared with the periodogram. We tested the influence of the number of tapers used and found that narrow modes at low n values are broadened to the extent that they can no longer be fit if the number of tapers is too large. For helioseismic time series of this length and temporal resolution, the optimal number of tapers is less than 10.
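The sine-taper branch of MTSA is simple enough to sketch in NumPy (a toy illustration, not the GONG pipeline; the test signal, frequency, and 5-taper count are assumptions made for the example):

```python
import numpy as np

def sine_taper_spectrum(x, k_tapers=5):
    """Average of k_tapers eigenspectra computed with orthogonal sine
    tapers (one of the two taper families compared in the abstract).
    Averaging reduces the variance of the spectrum estimate relative
    to a single periodogram."""
    n = len(x)
    t = np.arange(1, n + 1)
    eigenspectra = []
    for k in range(1, k_tapers + 1):
        # k-th sine taper, normalised so sum(taper**2) ~ 1
        taper = np.sqrt(2.0 / (n + 1)) * np.sin(np.pi * k * t / (n + 1))
        eigenspectra.append(np.abs(np.fft.rfft(x * taper)) ** 2)
    return np.mean(eigenspectra, axis=0)

rng = np.random.default_rng(2)
n = 4096
t = np.arange(n)
# A single "mode" at normalized frequency 0.1 buried in white noise
x = np.sin(2 * np.pi * 0.1 * t) + rng.normal(0.0, 1.0, n)

mtsa = sine_taper_spectrum(x, k_tapers=5)
freqs = np.fft.rfftfreq(n)
peak = freqs[np.argmax(mtsa)]
print(peak)   # the mode is recovered near 0.1
```

The variance reduction from averaging the eigenspectra is what makes the subsequent wavelet denoising of the log spectrum well behaved; the trade-off noted above is that each added taper broadens narrow modes.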
Slocum, Timothy A; Detrich, Ronnie; Wilczynski, Susan M; Spencer, Trina D; Lewis, Teri; Wolfe, Katie
Evidence-based practice (EBP) is a model of professional decision-making in which practitioners integrate the best available evidence with client values/context and clinical expertise in order to provide services for their clients. This framework provides behavior analysts with a structure for pervasive use of the best available evidence in the complex settings in which they work. This structure recognizes the need for clear and explicit understanding of the strength of evidence supporting intervention options, the important contextual factors including client values that contribute to decision making, and the key role of clinical expertise in the conceptualization, intervention, and evaluation of cases. Opening the discussion of EBP in this journal, Smith (The Behavior Analyst, 36, 7-33, 2013) raised several key issues related to EBP and applied behavior analysis (ABA). The purpose of this paper is to respond to Smith's arguments and extend the discussion of the relevant issues. Although we support many of Smith's (The Behavior Analyst, 36, 7-33, 2013) points, we contend that Smith's definition of EBP is significantly narrower than definitions that are used in professions with long histories of EBP and that this narrowness conflicts with the principles that drive applied behavior analytic practice. We offer a definition and framework for EBP that aligns with the foundations of ABA and is consistent with well-established definitions of EBP in medicine, psychology, and other professions. In addition to supporting the systematic use of research evidence in behavior analytic decision making, this definition can promote clear communication about treatment decisions across disciplines and with important outside institutions such as insurance companies and granting agencies.
Li, X.; Roman, D. R.
For gravimetric observation systems on mobile platforms (land/sea/airborne), the low signal-to-noise ratio (SNR) is the main barrier to achieving an accurate, high-resolution gravity signal. Normally, low-pass filters (Childers et al. 1999, Forsberg et al. 2000, Kwon and Jekeli 2000, Hwang et al. 2006) are applied to smooth or remove the high-frequency "noise," even though some of the high-frequency component is not necessarily noise. This is especially true for aerogravity surveys such as those from the Gravity for the Redefinition of the American Vertical Datum (GRAV-D) project. These gravity survey flights have a spatial resolution of 10 km between tracks but higher resolution along track. The along-track resolution is improved due to the lower flight height (6.1 km), equipment sensitivity, and improved modeling of potential errors. Additionally, these surveys suffer from a loss of signal power due to the increased flight elevation. Hence, application of a low-pass filter removes possible signal sensed in the along-track direction that might otherwise prove useful for various geophysical and geodetic applications. Some cutting-edge developments in wavelets and artificial neural networks have been successfully applied to obtain improved results (Li 2008 and 2011, Liang and Liu 2013). However, a clearer, fundamental understanding of the error characteristics will further improve the quality of the gravity estimates from these gravimetric systems. Here, instead of using any predefined basis function or a priori model, the idea of factor analysis is first employed to extract the underlying factors of the noise in these systems. Real data sets collected by both land vehicle and aircraft are processed as examples.
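The idea of letting factor analysis pull a shared noise factor out of multi-channel records can be sketched as follows; the three-channel "platform" data, the loadings, and the use of scikit-learn's `FactorAnalysis` are all illustrative assumptions, not the authors' processing chain:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(3)
n = 5000

# Hypothetical 3-channel record from a moving platform: one shared
# disturbance (the common factor, e.g. platform motion) seen by every
# channel with a different loading, plus independent sensor noise.
common = rng.normal(0.0, 1.0, n)
loadings = np.array([1.0, 0.8, 0.6])
X = np.outer(common, loadings) + rng.normal(0.0, 0.3, (n, 3))

fa = FactorAnalysis(n_components=1, random_state=0)
scores = fa.fit_transform(X)[:, 0]

# Up to sign, the recovered factor tracks the true common disturbance
corr = abs(np.corrcoef(scores, common)[0, 1])
print(corr)   # close to 1
```

Unlike a low-pass filter, nothing here presumes the disturbance is confined to high frequencies; the factor is identified purely from its covariance across channels.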
Ng, Vincent; Cao, Mengyang; Marsh, Herbert W; Tay, Louis; Seligman, Martin E P
The factor structure of the Values in Action Inventory of Strengths (VIA-IS; Peterson & Seligman, 2004) has not been well established as a result of methodological challenges primarily attributable to a global positivity factor, item cross-loading across character strengths, and questions concerning the unidimensionality of the scales assessing character strengths. We sought to overcome these methodological challenges by applying exploratory structural equation modeling (ESEM) at the item level using a bifactor analytic approach to a large sample of 447,573 participants who completed the VIA-IS with all 240 character strengths items and a reduced set of 107 unidimensional character strength items. It was found that a 6-factor bifactor structure generally held for the reduced set of unidimensional character strength items; these dimensions were justice, temperance, courage, wisdom, transcendence, humanity, and an overarching general factor that is best described as dispositional positivity.
von Hippel, Ted; von Hippel, Courtney
We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January 2009 and November 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write, although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants, and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads.
Chen, Rick C S; Yang, Stephen J H
From ancient times to the present day, social networks have played an important role in the formation of various organizations for a range of social behaviors. As such, social networks inherently describe the complicated relationships between elements around the world. Based on mathematical graph theory, social network analysis (SNA) has been developed and applied in various fields, such as Web 2.0 applications and product development in industry. However, some problems in SNA, such as finding a clique, N-clique, N-clan, N-club, or K-plex, are NP-complete and are not easily solved on traditional computer architectures. These challenges have restricted the uses of SNA. This paper provides DNA-computing-based approaches with inherently high information density and massive parallelism. Using these approaches, we aim to solve three primary problems of social networks: N-clique, N-clan, and N-club. The accuracy and feasible time complexities discussed in the paper demonstrate that DNA computing can be used to facilitate the development of SNA.
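For scale, the conventional brute-force baseline that the DNA-computing approach is set against can be sketched directly (plain clique search only; N-clique/N-clan/N-club generalise adjacency to bounded geodesic distance; the toy graph is invented):

```python
from itertools import combinations

def find_cliques(adj, k):
    """Brute-force search for cliques of size k.  `adj` maps each node
    to its set of neighbours (assumed symmetric).  The running time is
    exponential in the graph size, which is exactly the obstacle that
    motivates the massively parallel DNA-computing formulation."""
    nodes = sorted(adj)
    return [set(c) for c in combinations(nodes, k)
            if all(v in adj[u] for u, v in combinations(c, 2))]

# A small invented social network: a triangle {A, B, C} plus D,
# who knows only B.
adj = {
    "A": {"B", "C"},
    "B": {"A", "C", "D"},
    "C": {"A", "B"},
    "D": {"B"},
}
print([sorted(c) for c in find_cliques(adj, 3)])   # [['A', 'B', 'C']]
```

On a graph of any realistic size the `combinations(nodes, k)` enumeration explodes, which is the restriction on SNA uses that the abstract refers to.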
Iborra, Francisco; Cook, Peter R; Jackson, Dean A
One of the ultimate goals of biological research is to understand mechanisms of cell function within living organisms. With this in mind, many sophisticated technologies that allow us to inspect macromolecular structure in exquisite detail have been developed. Although knowledge of structure derived from techniques such as X-ray crystallography and nuclear magnetic resonance is of vital importance, these approaches cannot reveal the remarkable complexity of molecular interactions that exists in vivo. With this in mind, this review focuses on the use of microscopy techniques to analyze cell structure and function. We describe the different basic microscopic methodologies and how the routine techniques are best applied to particular biological problems. We also emphasize the specific capabilities and uses of light and electron microscopy and highlight their individual advantages and disadvantages. For completion, we also comment on the alternative possibilities provided by a variety of advanced imaging technologies. We hope that this brief analysis of the undoubted power of microscopy techniques will be enough to stimulate a wider participation in this rapidly developing area of biological discovery.
Bowleg, Lisa; Brooks, Kelly; Ritz, Susan Faye
Although the workplace stress that Black women and lesbian, gay, bisexual and transgender (LGBT) people experience due to prejudice and discrimination has been well-documented in the social science literature, much of this literature focuses on Black women or LGBTs as if these groups were distinct and mutually exclusive. Consequently, there is a void of theory and research on the workplace stress that Black lesbians experience. This qualitative study involved exploratory analyses of workplace stress due to race, sex/gender, and sexual orientation, and coping strategies among a predominantly middle-class, highly educated sample of 19 Black lesbians between the ages of 26 and 68. Four workplace stressors emerged, those relevant to: heterosexism/sexual identity; racism/race; sexism/sex/gender; and intersections of race, sex/gender, and sexual orientation. Three primary coping strategies emerged: being out and managing being out, covering their sexual orientation, and confronting or educating coworkers about prejudice and discrimination.
Falk, Daniel E.; Castle, I-Jen P.; Ryan, Megan; Fertig, Joanne; Litten, Raye Z.
Objectives To explore if varenicline (Chantix®) showed more efficacy in treating certain subgroups of patients. In a recent multi-site trial, varenicline was shown to be effective in reducing drinking in alcohol dependent patients, both smokers and nonsmokers. Given the heterogeneity among alcohol dependent patients, secondary analyses were conducted to determine if certain subgroups responded more favorably than others to treatment with varenicline. Methods Data were drawn from a Phase 2 randomized, double-blind, placebo-controlled multi-site 13-week trial of varenicline in alcohol dependent patients (Litten et al., 2013). Seventeen moderator variables were selected for exploratory testing on the basis of theoretical and scientific interest. Results Of the 17 moderator variables assessed, four were statistically significant, including cigarettes per day reduction, treatment drinking goal, years drinking regularly, and age of patient. Two other variables—the type of adverse events experienced by patients and the severity of alcohol-related consequences—appeared to moderate the varenicline treatment effect at borderline statistical significance. Individuals who reduced the number of cigarettes per day experienced a significant effect from varenicline in reducing drinking, whereas those who did not change or who increased their number of cigarettes observed no beneficial effect. Reviewing the moderators related to severity, varenicline appeared to have greater efficacy than placebo among less severely-dependent patients. Conclusions Varenicline appears to be more efficacious in certain subgroups, particularly in those who reduced their smoking and in the “less severe” patient. Additional studies are warranted to confirm the results of these exploratory analyses. PMID:26083958
Cincotti, S.; Sornette, D.; Treleaven, P.; Battiston, S.; Caldarelli, G.; Hommes, C.; Kirman, A.
This paper describes the vision of a European Exploratory for economics and finance using an interdisciplinary consortium of economists, natural scientists, computer scientists and engineers, who will combine their expertise to address the enormous challenges of the 21st century. This Academic Public facility is intended for economic modelling, investigating all aspects of risk and stability, improving financial technology, and evaluating proposed regulatory and taxation changes. The European Exploratory for economics and finance will be constituted as a network of infrastructure, observatories, data repositories, services and facilities and will foster the creation of a new cross-disciplinary research community of social scientists, complexity scientists and computing (ICT) scientists to collaborate in investigating major issues in economics and finance. It is also considered a cradle for training and collaboration with the private sector to spur spin-offs and job creation in Europe in the finance and economic sectors. The Exploratory will allow social scientists and regulators, as well as policy makers and the private sector, to conduct realistic investigations with real economic, financial and social data. The Exploratory will (i) continuously monitor and evaluate the status of the economies of countries in their various components, (ii) use, extend and develop a large variety of methods including data mining, process mining, computational and artificial intelligence, and other computer-science and complexity-science techniques coupled with economic theory and econometrics, and (iii) provide the framework and infrastructure to perform what-if analysis, scenario evaluations and computational, laboratory, field and web experiments to inform decision makers and help develop innovative policy, market and regulation designs.
Woo, Alex; Chancellor, Marisa K. (Technical Monitor)
Presents statistical factor analysis of computer simulations and measurement data for the VFY-218 configuration. Factor analysis attempts to quantify the statistical grouping of measurements and simulations.
Karlin, S; Chakraborty, R; Williams, P T; Mathew, S
Fourteen dermatoglyphic traits measured on 125 Velanadu Brahmin families were analyzed for mode of inheritance using three Structured Exploratory Data Analysis (SEDA) statistics: the major gene index, the offspring between parents function, and the traditional midparental correlation coefficient. Since the traits are integer valued with restricted ranges of variation, we simulated various transmission models with discrete expression to better understand the nature of the SEDA statistics for such variables. In addition, permutation procedures were employed to aid the interpretation of the SEDA results. These analyses suggest that corresponding homologous fingers on the left and right hands exhibit similar transmission characteristics. The relationship of the parent and child total ridge-counts of the two hands separately, as well as their combined total, virtually simulate complete Galtonian blending inheritance. Results for the individual digital ridge-counts as well as the pattern-intensity-index variable also suggest a multifactorial mode of transmission or possibly one involving several genes.
Priesmeyer, H R; Sharp, L F
This article applies the new science of nonlinearity to administrative issues and accounts receivable management in health care, and it provides a new perspective on common operating and quality control measures.
Morimoto, Hidetsugu; Hoshino, Satoshi; Kuki, Yasuaki
This paper evaluated material recycling, global-warming prevention, and economic efficiency across 195 existing and planned Biomass Towns by applying DEA (Data Envelopment Analysis), which can evaluate the operational efficiency of entities such as private companies or projects. The results show that although the Biomass Towns recycle material efficiently, global-warming prevention and business profitability have been largely neglected in Biomass Town Design. Moreover, from the viewpoint of operational efficiency, we suggested adjusting the scale of the Biomass Towns to enhance efficiency, again applying DEA. We found that DEA captures more potential improvements and indicators than cost-benefit analysis or cost-effectiveness analysis.
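In the special case of a single input and a single output, CCR-style DEA efficiency reduces to each unit's output/input ratio normalized by the best ratio on the frontier. A minimal sketch with hypothetical town figures (the paper's multi-factor model is richer than this):

```python
# Single-input, single-output DEA (CCR special case), illustrative only:
# efficiency of each decision-making unit = its output/input ratio,
# scaled so the best-performing unit scores exactly 1.0.

def dea_single(inputs, outputs):
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Three hypothetical biomass towns: tonnes of waste in, tonnes recycled out.
eff = dea_single([100, 80, 50], [60, 56, 20])
print([round(e, 2) for e in eff])  # → [0.86, 1.0, 0.57]
```

Town 2 defines the efficiency frontier; the scores for the others indicate how much more output (or less input) would be needed to match it.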
The Lawrence Berkeley Laboratory Exploratory R&D Fund FY 1990 report is compiled from annual reports submitted by principal investigators following the close of the fiscal year. This report describes the projects supported and summarizes their accomplishments. It constitutes a part of an Exploratory R&D Fund (ERF) planning and documentation process that includes an annual planning cycle, project selection, implementation, and review. The research areas covered in this report are: accelerator and fusion research; applied science; cell and molecular biology; chemical biodynamics; chemical sciences; earth sciences; engineering; information and computing sciences; materials sciences; nuclear science; physics; and research medicine and radiation biophysics.
D'Archivio, Angelo Antonio; Maggi, Maria Anna
We attempted geographical classification of saffron using UV-visible spectroscopy, conventionally adopted for quality grading according to the ISO Normative 3632. We investigated 81 saffron samples produced in L'Aquila, Città della Pieve, Cascia, and Sardinia (Italy) and commercial products purchased in various supermarkets. Exploratory principal component analysis applied to the UV-vis spectra of saffron aqueous extracts revealed a clear differentiation of the samples belonging to different quality categories, but a poor separation according to the geographical origin of the spices. On the other hand, linear discriminant analysis based on 8 selected absorbance values, concentrated near 279, 305 and 328 nm, allowed a good distinction of the spices coming from different sites. Under severe validation conditions (30% and 50% of saffron samples in the evaluation set), correct predictions were 85% and 83%, respectively.
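The exploratory PCA step can be sketched in a few lines (an illustrative sketch with random data standing in for absorbance spectra; not the authors' code):

```python
# Exploratory PCA via SVD: project each spectrum onto the directions of
# greatest variance, then plot/cluster the low-dimensional scores.
import numpy as np

def pca_scores(X, n_components=2):
    Xc = X - X.mean(axis=0)                      # centre each wavelength channel
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T              # sample scores

rng = np.random.default_rng(0)
spectra = rng.normal(size=(10, 50))              # 10 samples x 50 wavelengths
scores = pca_scores(spectra)
print(scores.shape)  # → (10, 2)
```

In practice the two-dimensional score plot is what reveals (or fails to reveal) grouping by quality category or geographical origin.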
The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Here, valid data are defined as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
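A basic uncertainty budget combines independent standard uncertainties in quadrature and reports an expanded uncertainty at a coverage factor, typically k = 2. A minimal sketch with hypothetical component values (not from the presentation):

```python
# Combine independent standard uncertainties in quadrature, then expand
# with coverage factor k = 2, as in a simple PV measurement budget.
import math

def combined_uncertainty(components):
    """Root-sum-of-squares of independent standard uncertainties."""
    return math.sqrt(sum(u ** 2 for u in components))

# Hypothetical relative uncertainties (%): irradiance, temperature, DAQ.
u_c = combined_uncertainty([1.5, 0.5, 0.3])
print(round(u_c, 2))        # → 1.61 (combined standard uncertainty, %)
print(round(2 * u_c, 2))    # → 3.22 (expanded uncertainty, k = 2, %)
```

The expanded interval is what one would quote as "the true value is believed to lie within ±3.2% of the measured result" for these assumed components.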
Enders, Craig K.
Walking readers step by step through complex concepts, this book translates missing data techniques into something that applied researchers and graduate students can understand and utilize in their own research. Enders explains the rationale and procedural details for maximum likelihood estimation, Bayesian estimation, multiple imputation, and…
Stemp, W James; Lerner, Harry J; Kristant, Elaine H
Although previous use-wear studies involving quartz and quartzite have been undertaken by archaeologists, these are comparatively few in number. Moreover, there has been relatively little effort to quantify use-wear on stone tools made from quartzite. The purpose of this article is to determine the effectiveness of a measurement system, laser scanning confocal microscopy (LSCM), to document the surface roughness or texture of experimental Mistassini quartzite scrapers used on two different contact materials (fresh and dry deer hide). As in previous studies using LSCM on chert, flint, and obsidian, this exploratory study incorporates a mathematical algorithm that permits the discrimination of surface roughness based on comparisons at multiple scales. Specifically, we employ measures of relative area (RelA) coupled with the F-test to discriminate used from unused stone tool surfaces, as well as surfaces of quartzite scrapers used on dry and fresh deer hide. Our results further demonstrate the effect of raw material variation on use-wear formation and its documentation using LSCM and RelA.
O'Donohue, William; Fryling, Mitch
Applied behavior analysis and behavior therapy are now nearly a half century old. It is interesting to ask if and how these disciplines have changed over time, particularly regarding some of their key internal controversies (e.g., role of cognitions). We examined the first five years and the 2000-2004 five year period of the "Journal of Applied…
Smith, Trevor I.; Wittmann, Michael C.; Carter, Tom
Previously, we analyzed the Force and Motion Conceptual Evaluation in terms of a resources-based model that allows for clustering of questions so as to provide useful information on how students correctly or incorrectly reason about physics. In this paper, we apply model analysis to show that the associated model plots provide more information…
Shortle, John F.; Allocco, Michael
This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.
Linder, N. F.
Neutron activation analysis is used for quantitative identification of trace metals in copper. Establishing a unique fingerprint of impurities in Michigan copper would enable identification of artifacts made from this copper.
Tanaka, M.; Matsugi, E.; Miyasaki, K.; Yamagata, T.; Inoue, M.; Ogata, H.; Shimoura, S.
PIXE measurement was applied for trace elemental analyses of 40 autoptic human kidneys. To investigate the reproducibility of the PIXE data, 9 targets obtained from one human liver were examined. The targets were prepared by wet-digestion using nitric and sulfuric acid. Yttrium was used as an internal standard. The extracted elemental concentrations for K, Fe, Cu, Zn, and Cd were in reasonable agreement with those obtained by atomic absorption spectrometry (AAS) and flame photometry (FP). Various correlations among the elements K, Ca, Cr, Mn, Fe, Ni, Cu, Zn, Rb, and Cd were examined individually for the renal cortex and renal medulla.
Okun, Michele L.; Luther, James F.; Wisniewski, Stephen R.; Wisner, Katherine L.
Objective Disturbed sleep and depression are potential risk factors for pregnancy complications. Both conditions are noted to dysregulate biological pathways responsible for maintaining homeostatic balance and pregnancy health. Depression during pregnancy is associated with poor sleep. Thus, we explored whether disturbed sleep was associated with inflammatory cytokines and risk for adverse pregnancy outcomes, as well as whether depression augmented the sleep-cytokine relationship, thereby additively contributing to risk for adverse outcomes. Methods Interview-assessed sleep and plasma cytokine concentrations were evaluated in a cohort of depressed and non-depressed pregnant women (N = 168) at 20 and 30 weeks gestation. Outcomes evaluated included preterm birth, birth weight, and peripartum events. Results Among depressed women, short sleep duration (< 7 hours) was associated with higher IL-8 across time (β = .506, p = .001), poor sleep efficiency (< 85%) was associated with higher IL-6 (β = .205, p = .006), and daytime naps were associated with higher TNF-α (β = .105, p = .024). Aspects of poor sleep were associated with having a lower weight baby (ps < .053). Among depressed women, IFN-γ increased risk for preterm birth (OR = 1.175, p = .032). Trends for IL-6 and higher birth weight (β = 105.2, p = .085); IFN-γ and lower birth weight (β = −19.92, p < .069); and increased IL-8 and babies weighing < 4000 g (OR = .72, p < .083) were observed. Conclusions Although speculative, disturbed sleep may disrupt normal immune processes and contribute to adverse pregnancy outcomes. Exploratory analyses indicate depression modifies these relationships. PMID:23864582
Chiprés, J.A.; Castro-Larragoitia, J.; Monroy, M.G.
The threshold between geochemical background and anomalies can be influenced by the methodology selected for its estimation. Environmental evaluations, particularly those conducted in mineralized areas, must consider this when trying to determine the natural geochemical status of a study area, quantifying human impacts, or establishing soil restoration values for contaminated sites. Some methods in environmental geochemistry incorporate the premise that anomalies (natural or anthropogenic) and background data are characterized by their own probabilistic distributions. One of these methods uses exploratory data analysis (EDA) on regional geochemical data sets coupled with a geographic information system (GIS) to spatially understand the processes that influence the geochemical landscape, in a technique that can be called spatial data analysis (SDA). This EDA-SDA methodology was used to establish the regional background range for the area of Catorce-Matehuala in north-central Mexico. Probability plots of the data, particularly for those areas affected by human activities, show that the regional geochemical background population is composed of smaller subpopulations associated with factors such as soil type and parent material. This paper demonstrates that the EDA-SDA method offers more certainty in defining thresholds between geochemical background and anomaly than a numeric technique, making it a useful tool for regional geochemical landscape analysis and environmental geochemistry studies.
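One common EDA rule for a background/anomaly threshold is the Tukey upper fence, Q3 + 1.5·IQR, applied per subpopulation. A minimal sketch with made-up concentration values (the paper's EDA-SDA procedure is considerably richer than this):

```python
# Tukey upper-fence threshold: values above Q3 + 1.5 * IQR are flagged
# as candidate anomalies relative to the background distribution.
import statistics

def upper_fence(values):
    q = statistics.quantiles(values, n=4)    # [Q1, Q2, Q3]
    iqr = q[2] - q[0]
    return q[2] + 1.5 * iqr

background = [5, 7, 8, 8, 9, 10, 11, 12, 13, 14]   # hypothetical ppm values
threshold = upper_fence(background)
anomalies = [v for v in background + [40] if v > threshold]
print(anomalies)  # → [40]
```

In the EDA-SDA setting, a fence like this would be computed separately for each subpopulation (e.g. by soil type or parent material) rather than for the pooled regional data.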
Petrie, E. J.; Clarke, P. J.; King, M. A.; Williams, S. D. P.
GPS data are used to provide estimates of vertical land motion caused by, e.g., glacial isostatic adjustment (GIA) and hydrologic loading. The vertical velocities estimated from the GPS data are often assimilated into GIA models or used for comparison purposes. GIA models are very important as they provide time-variable gravity corrections needed to estimate ice mass change over Greenland and Antarctica. While state-of-the-art global GPS analysis has previously been performed for many Antarctic sites, formal errors in the resulting site velocities are typically obtained from noise analysis of each individual time series, without consideration of processing or metadata issues. Here we present analysis of the results from two full global runs, including a variety of parameter and reference-frame alignment choices, and compare the results to previous work with a view to assessing whether the size of the formal errors from the standard method is truly representative.
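The "standard method" questioned above, a per-series velocity with a formal error, can be sketched as an ordinary least-squares trend fit under a white-noise assumption (illustrative synthetic data, not the authors' processing):

```python
# Fit a linear trend (velocity) to a synthetic position time series and
# take the "formal error" from the parameter covariance, assuming the
# residuals are white noise -- the assumption the abstract scrutinizes.
import numpy as np

t = np.arange(10, dtype=float)               # epochs, in years
y = 3.0 * t + np.array([0.1, -0.2, 0.0, 0.3, -0.1,
                        0.2, -0.3, 0.1, 0.0, -0.1])  # mm, with noise

A = np.vstack([t, np.ones_like(t)]).T        # design matrix [rate, offset]
coef, res, *_ = np.linalg.lstsq(A, y, rcond=None)
sigma2 = res[0] / (len(t) - 2)               # residual variance
cov = sigma2 * np.linalg.inv(A.T @ A)        # parameter covariance
rate, rate_err = coef[0], np.sqrt(cov[0, 0])
print(round(rate, 1))  # → 3.0  (mm/yr)
```

When the noise is actually time-correlated (flicker or random-walk), this white-noise formal error understates the true velocity uncertainty, which is one reason per-series formal errors may not be representative.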
Ryan, R.; Verderaime, V.
As emphasis shifts from optimum-performance aerospace systems to least life-cycle costs, systems designs must seek, adapt, and innovate cost improvement techniques in design through operations. The systems design process of concept, definition, and design was assessed for the types and flow of total quality management techniques that may be applicable in a launch vehicle systems design analysis. Techniques discussed are task ordering, quality leverage, concurrent engineering, Pareto's principle, robustness, quality function deployment, criteria, and others. These cost-oriented techniques are as applicable to aerospace systems design analysis as to any large commercial system.
Stevenson, Frederick W.
This book attempts to introduce students to the creative aspects of mathematics through exploratory problems. The introduction presents the criteria for the selection of the problems in the book. Criteria indicate that problems should: be immediately attractive, require data to be generated or gathered, appeal to students from junior high school…
Wahler, Robert G.; Fox, James J.
The contributions of applied behavior analysis as a natural science approach to the study of human behavior are acknowledged. However, it is also argued that applied behavior analysis has provided limited access to the full range of environmental events that influence socially significant behavior. Recent changes in applied behavior analysis to include analysis of side effects and social validation represent ways in which the traditional applied behavior analysis conceptual and methodological model has been profitably expanded. A third area of expansion, the analysis of setting events, is proposed by the authors. The historical development of setting events as a behavior influence concept is traced. Modifications of the basic applied behavior analysis methodology and conceptual systems that seem necessary to setting event analysis are discussed and examples of descriptive and experimental setting event analyses are presented. PMID:16795646
Genres are staged, structured, communicative events, motivated by various communicative purposes, and performed by members of specific discourse communities (Swales 1990; Bhatia 1993, 2004; Berkenkotter & Huckin 1995). Since its inception, with the two seminal works on the topic by Swales (1990) and Bhatia (1993), genre analysis has taken pride of…
Choi, Hyungshin; Kang, Myunghee
This study determines whether an activity system provides a systematic framework to analyse collaborative group work. Using an activity system as a unit of analysis, the research examined learner behaviours, conflicting factors and facilitating factors while students engaged in collaborative work via asynchronous computer-mediated communication.…
Baskas, Richard S.
The purpose of this study is to analyze the behavior of a character, Celie, in a movie, "The Color Purple," through the lens of two adult learning theorists to determine the relationships the character has with each theory. The development and portrayal of characters in movies can be explained and understood by the analysis of adult learning…
Dixon, Mark; Baker, Jonathan C.; Sadowski, Katherine Ann
Skinner's 1957 analysis of verbal behavior has demonstrated a fair amount of utility to teach language to children with autism and other various disorders. However, the learning of language can be forgotten, as is the case for many elderly suffering from dementia or other degenerative diseases. It appears possible that Skinner's operants may…
Byo, James L.
The discoveries of score analysis (e.g., minor seventh chord, ostinato, phrase elision, melodic fragment, half cadence) are more than just compositional techniques or music vocabulary. They are sounds--fascinating, storytelling, dynamic modes of expression--that when approached as such enrich the rehearsal experience. This article presents a…
Krajnak, Mike; Jesse, Lisa; Mucks, John
The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and predicts future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTM's) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTM's using a graphical specification language. The ability to manipulate TTM's in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. The prototype system defines
Wishner, Amy; Aronson, Jerold; Kohrt, Alan; Norton, Gary
The authors applied business management tools to analyze and promote excellence and to evaluate differences between average and above-average immunization performers in private practices. The authors conducted a pilot study of 10 private practices in Pennsylvania, using tools common in management to assess the practices' organizational climate and managerial style. Authoritative and coaching styles of physician leaders were common to both groups. Managerial styles emphasizing higher levels of clarity and responsibility were evident in the large practices, and rewards and flexibility styles were higher in the small above-average practices. The findings of this pilot study match results seen in high performers in other industries. The study concludes that the authoritative style appears to have the most impact on performance, with interesting implications for training and behavior change to improve immunization rates, alongside traditional medical interventions.
Beauchamp, G R; Bane, M C; Stager, D R; Berry, P M; Wright, W W
PURPOSE: To assess the value of amblyopia-related services by utilizing a health value model (HVM). Cost and quality criteria are evaluated in accordance with the interests of patients, physicians, and purchasers. METHODS: We applied an HVM to a hypothetical statistical ("median") child with amblyopia whose visual acuity is 20/80 and to a group of children with amblyopia who are managed by our practice. We applied the model to calculate the value of these services by evaluating the responses of patients and physicians and relating these responses to clinical outcomes. RESULTS: The consensus value of care for the hypothetical median child was calculated to be 0.406 (of 1.000). For those children managed in our practice, the calculated value is 0.682. Clinically, 79% achieved 20/40 or better visual acuity, and the mean final visual acuity was 0.2 logMAR (20/32). Value appraisals revealed significant concerns about the financial aspects of amblyopia-related services, particularly among physicians. Patients rated services more positively than did physicians. CONCLUSIONS: Amblyopia care is difficult, sustained, and important work that requires substantial sensitivity to and support of children and families. Compliance and early detection are essential to success. The value of amblyopia services is rated significantly higher by patients than by physicians. Relative to the measured value, amblyopia care is undercompensated. The HVM is useful to appraise clinical service delivery and its variation. The costs of failure and the benefits of success are high; high-value amblyopia care yields substantial dividends and should be commensurately compensated in the marketplace. PMID:10703133
Ronald L. Boring
Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.
AN INVERSE KINEMATIC APPROACH USING GROEBNER BASIS THEORY APPLIED TO GAIT CYCLE ANALYSIS. Thesis, Anum Barki, BS, AFIT-ENP-13-M-02, Department of the Air... Approved: Dr. Ronald F. Tuttle (Chairman), Dr. Kimberly Kendricks.
Figure 4-27 shows the task steps involved in achieving Goal 7. The GUI WG brainstormed the order of columns in the Dynamic Target List/Dynamic Target Queue (DTL/DTQ) Table, a critical component of the TCTF CUI, with successful results. Acronyms (partial): Cognitive Work Analysis; DTD, Display Task Description; DTL/DTQ, Dynamic Target List/Dynamic Target Queue; FDO, Fighter Duty Officer; FEBA, Forward Edge...
Gay, Leslie; Karfilis, Kate V.; Miller, Michael R.; Doe, Chris Q.; Stankunas, Kryn
Transcriptional profiling is a powerful approach to study mouse development, physiology, and disease models. Here, we describe a protocol for mouse thiouracil-tagging (TU-tagging), a transcriptome analysis technology that includes in vivo covalent labeling, purification, and analysis of cell type-specific RNA. TU-tagging enables 1) the isolation of RNA from a given cell population of a complex tissue, avoiding transcriptional changes induced by cell isolation trauma, and 2) the identification of actively transcribed RNAs rather than pre-existing transcripts. Therefore, in contrast to other cell-specific transcriptional profiling methods based on purification of tagged ribosomes or nuclei, TU-tagging provides a direct examination of transcriptional regulation. We describe how to: 1) deliver 4-thiouracil to transgenic mice to thio-label cell lineage-specific transcripts, 2) purify TU-tagged RNA and prepare libraries for Illumina sequencing, and 3) follow a straightforward bioinformatics workflow to identify cell type-enriched or differentially expressed genes. Tissue containing TU-tagged RNA can be obtained in one day, RNA-Seq libraries generated within two days, and, following sequencing, an initial bioinformatics analysis completed in one additional day. PMID:24457332
Williams, Andrew James; Wyatt, Katrina M.; Williams, Craig A.; Logan, Stuart; Henley, William E.
Schools are common sites for obesity prevention interventions. Although many theories suggest that the school context influences weight status, there has been little empirical research. The objective of this study was to explore whether features of the school context were consistently and meaningfully associated with pupil weight status (overweight or obese). Exploratory factor analysis of routinely collected data on 319 primary schools in Devon, England, was used to identify possible school-based contextual factors. Repeated cross-sectional multilevel analysis of five years (2006/07-2010/11) of data from the National Child Measurement Programme was then used to test for consistent and meaningful associations. Four school-based contextual factors were derived, ranking schools according to deprivation, location, resource and prioritisation of physical activity. None was meaningfully and consistently associated with pupil weight status across the five years. The lack of consistent associations between the factors and pupil weight status suggests that the school context is not inherently obesogenic. In contrast, incorporating findings from education research indicates that schools may be equalising weight status, and obesity prevention research, policy and practice might need to address what is happening outside schools, particularly during the school holidays. PMID:26700027
Khan, Adil Mehmood; Siddiqi, Muhammad Hameed; Lee, Seok-Won
Smartphone-based activity recognition (SP-AR) recognizes users' activities using the embedded accelerometer sensor. Only a small number of previous works can be classified as online systems, i.e., the whole process (pre-processing, feature extraction, and classification) is performed on the device. Most of these online systems use either a high sampling rate (SR) or a long data window (DW) to achieve high accuracy, resulting in short battery life or delayed system response, respectively. This paper introduces a real-time/online SP-AR system that solves this problem. Exploratory data analysis was performed on acceleration signals of 6 activities, collected from 30 subjects, to show that these signals are generated by an autoregressive (AR) process, and that an accurate AR model in this case can be built using a low SR (20 Hz) and a small DW (3 s). The high within-class variance resulting from placing the phone at different positions was reduced using kernel discriminant analysis to achieve position-independent recognition. Neural networks were used as classifiers. Unlike previous works, true subject-independent evaluation was performed, where 10 new subjects evaluated the system at their homes for 1 week. The results show that our features outperformed three commonly used features by 40% in terms of accuracy for the given SR and DW. PMID:24084108
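The AR-feature idea described above can be sketched in a few lines of numpy. This is an illustrative reconstruction, not the authors' code: the least-squares estimator, the model order, and the synthetic test signal are all assumptions here; only the 20 Hz sampling rate and 3 s window come from the abstract.

```python
import numpy as np

def ar_features(window, order=6):
    """Least-squares AR(p) coefficients of one accelerometer window.

    x[n] is predicted from its p previous samples; the fitted
    coefficients then serve as the classifier's feature vector.
    (Sketch only: estimator and order are assumptions.)
    """
    x = np.asarray(window, dtype=float)
    # rows of A: [x[n-1], ..., x[n-p]] predicting x[n]
    A = np.column_stack([x[order - 1 - j:len(x) - 1 - j] for j in range(order)])
    b = x[order:]
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coeffs

# a 3 s window at 20 Hz sampling = 60 samples, as in the paper
t = np.arange(60) / 20.0
window = np.sin(2 * np.pi * 2.0 * t)   # synthetic 2 Hz "activity" signal
features = ar_features(window)
```

For a pure sinusoid an AR(2) model is exact (x[n] = 2cos(ωΔt)·x[n-1] - x[n-2]), which gives a quick sanity check on the estimator.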
Phan, Loan T.; Rivera, Edil Torres; Volker, Martin A.; Garrett, Michael T.
This article reports on the development of a scale used to assess and measure group dynamics during group supervision counselling courses (practicum and internship). A 20-item Likert-type scale was administered to 200 master's-level counsellors-in-training. Reliability and validity data are described. An exploratory factor analysis yielded…
The following work outlines an analysis of education initiatives aimed at the elderly. It examines the characteristics of the older adult learner, his or her "educability", and the foundations for an educational approach for this age group. These theoretical assumptions form the basis of this research: an exploratory study into various…
Konarski, Edward A.; Johnson, Moses R.; Crowell, Charles R.; Whitman, Thomas L.
First-grade children engaged in seatwork behaviors under reinforcement schedules established according to the Premack Principle and the Response Deprivation Hypothesis. Across two experiments, schedules were presented to the children in a counter-balanced fashion which fulfilled the conditions of one, both, or neither of the hypotheses. Duration of on-task math and coloring in Experiment 1 and on-task math and reading in Experiment 2 were the dependent variables. A modified ABA-type withdrawal design, including a condition to control for the noncontingent effects of a schedule, indicated an increase of on-task instrumental responding only in those schedules where the condition of response deprivation was present but not where it was absent, regardless of the probability differential between the instrumental and contingent responses. These results were consistent with laboratory findings supporting the necessity of response deprivation for producing the reinforcement effect in single response, instrumental schedules. However, the results of the control procedure were equivocal so the contribution of the contingent relationship between the responses to the increases in instrumental behavior could not be determined. Nevertheless, these results provided tentative support for the Response Deprivation Hypothesis as a new approach to establishing reinforcement schedules while indicating the need for further research in this area. The possible advantages of this technique for applied use were identified and discussed. PMID:16795635
Bu, Lingguo; Li, Jianfeng; Li, Fangyi; Liu, Heshan; Li, Zengyong
Objective The objective of this study was to assess the effects of long-term offshore work on cerebral oxygenation oscillations in sailors based on the wavelet phase coherence (WPCO) of near-infrared spectroscopy (NIRS) signals. Methods The fatigue severity scale (FSS) was first applied to assess the fatigue level of sailors and age-matched controls. Continuous recordings of NIRS signals were then obtained from the prefrontal lobes in 30 healthy sailors and 30 age-matched controls during the resting state. WPCO between the left and right prefrontal oscillations was analysed and Pearson correlation analysis was used to study the relationship between the FSS and the wavelet amplitude (WA), and between the FSS and the WPCO level. Results The periodic oscillations of Δ[HbO2] signals were identified at six frequency intervals: I (0.6–2 Hz); II (0.145–0.6 Hz); III (0.052–0.145 Hz); IV (0.021–0.052 Hz); V (0.0095–0.021 Hz); and VI (0.005–0.0095 Hz). The WA in intervals I (F=8.823, p=0.004) and III (F=4.729, p=0.034) was significantly lower in sailors than in the controls. The WPCO values of the sailor group were significantly lower in intervals III (F=4.686, p=0.039), IV (F=4.864, p=0.036) and V (F=5.195, p=0.03) than those of the control group. In the sailor group, the WA in interval I (r=−0.799, p<0.01) and in interval III (r=−0.721, p<0.01) exhibited a negative correlation with the FSS. Also, the WPCO exhibited a negative correlation with the FSS in intervals III (r=−0.839, p<0.01), IV (r=−0.765, p<0.01) and V (r=−0.775, p<0.01) in the sailor group. Conclusions The negative correlation between WA and FSS indicates that the lower oscillatory activities might contribute to the development of fatigue. The low WPCO in intervals III, IV and V represents a reduced phase synchronisation of myogenic, neurogenic and endothelial metabolic activities, respectively, and this may suggest a decline in cognitive function. PMID:27810980
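The phase-coherence quantity at the heart of the study, |⟨e^{iΔφ(t)}⟩|, can be illustrated with a simplified stand-in: a band-pass filter plus the Hilbert analytic phase instead of a full wavelet decomposition. This is not the authors' pipeline; the sampling rate and test signals are assumptions, and only the interval III band (0.052–0.145 Hz) is taken from the abstract.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def phase_coherence(x, y, fs, band):
    """|<exp(i*dphi)>| in one frequency band: 1 for phase-locked signals,
    near 0 for unrelated ones. Band-pass + Hilbert phase is used here as
    a simplified stand-in for wavelet phase coherence (WPCO)."""
    b, a = butter(2, [band[0] / (fs / 2), band[1] / (fs / 2)], btype='band')
    px = np.angle(hilbert(filtfilt(b, a, x)))
    py = np.angle(hilbert(filtfilt(b, a, y)))
    return np.abs(np.mean(np.exp(1j * (px - py))))

fs = 1.0                                  # assumed NIRS sampling rate, Hz
t = np.arange(2000) / fs
band = (0.052, 0.145)                     # interval III from the study
left = np.sin(2 * np.pi * 0.1 * t)        # "left prefrontal" oscillation
right = np.sin(2 * np.pi * 0.1 * t + 1.0) # same rhythm, constant phase lag
rng = np.random.default_rng(0)
noise_a, noise_b = rng.standard_normal((2, len(t)))
wpco_locked = phase_coherence(left, right, fs, band)
wpco_noise = phase_coherence(noise_a, noise_b, fs, band)
```

A constant phase lag still gives coherence near 1; only a drifting phase relation lowers it, which is why the measure indexes synchronisation rather than similarity of amplitude.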
Zhang, Xin; Feng, Ling; Zhu, Rongqian; Stanley, H. Eugene
Using complex network theory to study the investment relationships of venture capital firms has produced a number of significant results. However, previous studies have often neglected the temporal properties of those relationships, which in real-world scenarios play a pivotal role. Here we examine the time-evolving dynamics of venture capital investment in China by constructing temporal networks to represent (i) investment relationships between venture capital firms and portfolio companies and (ii) the syndication ties between venture capital investors. The evolution of the networks exhibits rich variations in centrality, connectivity and local topology. We demonstrate that a temporal network approach provides a dynamic and comprehensive analysis of real-world networks.
Lax, P.; Berger, M.
This is the final report for the Courant Mathematics and Computing Laboratory (CMCL) research program for the years 1991--1993. Our research efforts encompass the formulation of physical problems in terms of mathematical models (both old and new), the mathematical analysis of such models, and their numerical resolution. This last step involves the development and implementation of efficient methods for large-scale computation. Our analytic and numerical work often goes hand in hand; new theoretical approaches often have numerical counterparts, while numerical experimentation often suggests avenues for analytical investigation.
Pascual-Granado, J.; Garrido, R.; Suárez, J. C.
Harmonic analysis is the fundamental mathematical method used for the identification of pulsation frequencies in asteroseismology and other fields of physics. Here we introduce a test to evaluate the validity of the hypothesis on which the Fourier theorem is based: the convergence of the expansion series. The huge number of difficulties found in the interpretation of the periodograms of pulsating stars observed by the CoRoT and Kepler satellites led us to test whether the function underlying these time series is analytic or not. Surprisingly, the main result is that they originate from non-analytic functions; therefore, the condition for Parseval's theorem is not guaranteed.
Chomette, B.; Le Carrou, J.-L.
Operational modal analysis (OMA) methods are useful to extract modal parameters of operating systems. These methods seem particularly interesting for investigating the modal basis of string instruments during operation, avoiding certain disadvantages of conventional methods. However, the excitation in the case of string instruments is not optimal for OMA due to the presence of damped harmonic components and low noise in the disturbance signal. Therefore, the present study investigates the least-square complex exponential (LSCE) and the modified least-square complex exponential methods in the case of a string instrument to identify modal parameters of the instrument when it is played. The efficiency of the approach is experimentally demonstrated on a concert harp excited by some of its strings, and the two methods are compared to a conventional modal analysis. The results show that OMA allows us to identify modes particularly present in the instrument's response, with good estimates with the modified LSCE method, especially when they are close to the excitation frequency.
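The linear-prediction (Prony) step at the core of the LSCE method can be sketched on a synthetic single-mode impulse response. This is an illustrative reduction, not the paper's implementation: the sampling rate, modal frequency and damping below are assumed values, and a real LSCE run would use many response channels and higher model orders.

```python
import numpy as np

# synthetic single-mode "impulse response": one damped cosine
fs = 1000.0
t = np.arange(0, 1, 1 / fs)
f0, zeta = 100.0, 0.01                       # assumed modal frequency / damping
wd = 2 * np.pi * f0 * np.sqrt(1 - zeta**2)   # damped natural frequency
h = np.exp(-zeta * 2 * np.pi * f0 * t) * np.cos(wd * t)

# LSCE linear-prediction core: h[n] = -sum_j a_j h[n-j]
p = 2                                        # 2 complex exponentials = 1 real mode
A = np.column_stack([h[p - 1 - j:len(h) - 1 - j] for j in range(p)])
a, *_ = np.linalg.lstsq(A, -h[p:], rcond=None)
z = np.roots(np.concatenate(([1.0], a)))     # discrete-time poles
freqs = np.abs(np.angle(z)) * fs / (2 * np.pi)
dampings = -np.log(np.abs(z)) * fs / (2 * np.pi * freqs)
```

The roots of the prediction polynomial are the discrete-time poles; their angle gives the damped frequency and their magnitude the decay rate, which is exactly how LSCE turns a least-squares fit into modal parameters.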
Gasimova, Fidan; Robitzsch, Alexander; Wilhelm, Oliver; Boker, Steven M; Hu, Yueqin; Hülür, Gizem
In the present paper we investigate weekly fluctuations in working memory capacity (WMC) assessed over a period of 2 years. We use dynamical systems analysis, specifically a second-order linear differential equation, to model weekly variability in WMC in a sample of 112 9th graders. In our longitudinal data we use a B-spline imputation method to deal with missing data. The results show a significant negative frequency parameter in the data, indicating a cyclical pattern in weekly memory updating performance across time. We use a multilevel modeling approach to capture individual differences in model parameters and find that a higher initial performance level and a slower improvement at the memory updating (MU) task are associated with a slower frequency of oscillation. Additionally, we conduct a simulation study examining the analysis procedure's performance using different numbers of B-spline knots and values of time delay embedding dimensions. Results show that the number of knots in the B-spline imputation influences accuracy more than the number of embedding dimensions.
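The role of the "negative frequency parameter" can be made concrete with a toy version of the model: simulate x'' = ηx + ζx' (η < 0 produces cycles) and recover the parameters by regressing finite-difference derivative estimates. This is a crude stand-in for the latent differential equation machinery in the paper; all numbers below are illustrative assumptions, not the study's estimates.

```python
import numpy as np

# simulate a damped oscillator x'' = eta*x + zeta*x'
eta_true, zeta_true = -(2 * np.pi / 7.0) ** 2, -0.05   # ~7-unit cycle, light damping
dt, n = 0.1, 2000
x = np.empty(n); v = np.empty(n)
x[0], v[0] = 1.0, 0.0
for i in range(n - 1):                       # semi-implicit Euler integration
    v[i + 1] = v[i] + (eta_true * x[i] + zeta_true * v[i]) * dt
    x[i + 1] = x[i] + v[i + 1] * dt

# estimate x' and x'' by central differences, then regress x'' on (x, x')
xd = (x[2:] - x[:-2]) / (2 * dt)
xdd = (x[2:] - 2 * x[1:-1] + x[:-2]) / dt**2
D = np.column_stack([x[1:-1], xd])
eta_hat, zeta_hat = np.linalg.lstsq(D, xdd, rcond=None)[0]
```

A significantly negative η̂ is what licenses the interpretation of a cyclical pattern: with η ≥ 0 the second-order model has no oscillatory solutions.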
Hancu, Gabriel; Simon, Brigitta; Rusu, Aura; Mircia, Eleonora; Gyéresi, Arpád
Since its introduction, capillary electrophoresis has shown great potential in areas where electrophoretic techniques were rarely used before, including the analysis of pharmaceutical substances. The large majority of pharmaceutical substances are neutral from an electrophoretic point of view; consequently, separations by classic capillary zone electrophoresis, where separation is based on differences between the analytes' own electrophoretic mobilities, are hard to achieve. Micellar electrokinetic capillary chromatography, a hybrid method that combines chromatographic and electrophoretic separation principles, extends the applicability of capillary electrophoretic methods to neutral analytes. In micellar electrokinetic capillary chromatography, surfactants are added to the buffer solution at concentrations above their critical micellar concentration; consequently, micelles are formed, which undergo electrophoretic migration like any other charged particle. The separation is based on the differential partitioning of an analyte between a two-phase system: the mobile aqueous phase and the micellar pseudostationary phase. The present paper aims to summarize the basic aspects of the separation principles and practical applications of micellar electrokinetic capillary chromatography, with particular attention to those relevant to pharmaceutical analysis.
Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the non-invasive determination of glucose levels in diabetics is an ultimate goal of this research. 13 refs., 4 figs.
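The PCR half of the factor-analysis picture can be sketched in numpy: decompose the centred spectra, keep the leading factors, and regress the property of interest on the factor scores. The synthetic two-component "spectra" below are an assumption for illustration only; they stand in for the glass-film or blood spectra discussed in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
S = rng.random((2, 100))                    # two pure-component "spectra"
C = rng.random((40, 2))                     # component concentrations
X = C @ S + 1e-3 * rng.standard_normal((40, 100))   # measured spectra + noise
y = C[:, 0]                                 # analyte of interest

# PCR: regress the property on scores of the leading principal components
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2                                       # number of retained factors
T = U[:, :k] * s[:k]                        # PC scores of each sample
b = np.linalg.lstsq(T, y - y.mean(), rcond=None)[0]
y_hat = T @ b + y.mean()
rmse = np.sqrt(np.mean((y_hat - y) ** 2))
```

Truncating to k factors is what gives factor methods their noise rejection: directions of the spectra that carry little variance (mostly noise) never enter the regression. PLS differs only in choosing factors that also covary with y.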
Wessel, Niels; Riedl, Maik; Kramer, Jan; Muller, Andreas; Penzel, Thomas; Kurths, Jurgen
Sleep is a physiological process with an internal program of a number of well-defined sleep stages and intermediate wakefulness periods. The sleep stages modulate the autonomous nervous system, and thereby the sleep stages are accompanied by different regulation regimes for the cardiovascular and respiratory system. The differences in regulation can be distinguished by new techniques of cardiovascular physics. The number of patients suffering from sleep disorders increases disproportionately with the growth and aging of the human population, leading to very high expenses in the public health system. Therefore, the challenge of cardiovascular physics is to develop highly sophisticated methods which are able, on the one hand, to supplement and replace expensive medical devices and, on the other hand, to improve medical diagnostics while decreasing the patient's risk. Methods of cardiovascular physics are used to analyze heart rate, blood pressure and respiration to detect changes of the autonomous nervous system in different diseases. Data-driven modeling analysis, synchronization and coupling analysis and their applications to biosignals in healthy subjects and patients with different sleep disorders are presented. Newly derived methods of cardiovascular physics can help to find indicators for these health risks.
Rodríguez-Sotelo, J L; Peluffo-Ordoñez, D; Cuesta-Frau, D; Castellanos-Domínguez, G
The computer-assisted analysis of biomedical records has become an essential tool in clinical settings. However, current devices provide a growing amount of data that often exceeds the processing capacity of normal computers. As this amount of information rises, new demands for more efficient data extraction methods appear. This paper addresses the task of data mining in physiological records using a feature selection scheme. An unsupervised method based on relevance analysis is described. This scheme uses a least-squares optimization of the input feature matrix in a single iteration. The output of the algorithm is a feature weighting vector. The performance of the method was assessed using a heartbeat clustering test on real ECG records. The quantitative cluster validity measures yielded a correctly classified heartbeat rate of 98.69% (specificity), 85.88% (sensitivity) and 95.04% (general clustering performance), which is even higher than the performance achieved by other similar ECG clustering studies. The number of features was reduced on average from 100 to 18, and the temporal cost was 43% lower than in previous ECG clustering schemes.
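The idea of turning an unsupervised, single-pass matrix computation into a feature weighting vector can be illustrated with a simple variant: weight each feature by its absolute loading on the dominant principal direction of the centred data. This is only loosely analogous to the paper's relevance analysis, whose exact objective is not reproduced here; the data below are synthetic.

```python
import numpy as np

def relevance_weights(X):
    """Illustrative unsupervised relevance weighting: score each feature
    by its absolute loading on the dominant principal direction of the
    centred data, then normalise into a weighting vector."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    w = np.abs(Vt[0])
    return w / w.sum()

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
X[:, 0] *= 10.0                       # feature 0 dominates the variance
w = relevance_weights(X)
```

Features with near-zero weight can then be dropped before clustering, which is how a 100-feature representation shrinks to the ~18 relevant features reported in the abstract.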
Warsow, Gregor; Struckmann, Stephan; Kerkhoff, Claus; Reimer, Toralf; Engel, Nadja; Fuellen, Georg
In silico approaches are increasingly considered to improve breast cancer treatment. One of these treatments, neoadjuvant TFAC chemotherapy, is used in cases where application of preoperative systemic therapy is indicated. Estimating response to treatment allows or improves clinical decision-making and this, in turn, may be based on a good understanding of the underlying molecular mechanisms. Ever increasing amounts of high throughput data become available for integration into functional networks. In this study, we applied our software tool ExprEssence to identify specific mechanisms relevant for TFAC therapy response, from a gene/protein interaction network. We contrasted the resulting active subnetwork to the subnetworks of two other such methods, OptDis and KeyPathwayMiner. We could show that the ExprEssence subnetwork is more related to the mechanistic functional principles of TFAC therapy than the subnetworks of the other two methods despite the simplicity of ExprEssence. We were able to validate our method by recovering known mechanisms and as an application example of our method, we identified a mechanism that may further explain the synergism between paclitaxel and doxorubicin in TFAC treatment: Paclitaxel may attenuate MELK gene expression, resulting in lower levels of its target MYBL2, already associated with doxorubicin synergism in hepatocellular carcinoma cell lines. We tested our hypothesis in three breast cancer cell lines, confirming it in part. In particular, the predicted effect on MYBL2 could be validated, and a synergistic effect of paclitaxel and doxorubicin could be demonstrated in the breast cancer cell lines SKBR3 and MCF-7. PMID:24349128
Taplidou, Styliani A; Hadjileontiadis, Leontios J; Kitsas, Ilias K; Panoulas, Konstantinos I; Penzel, Thomas; Gross, Volker; Panas, Stavros M
The identification of continuous abnormal lung sounds, like wheezes, in the total breathing cycle is of great importance in the diagnosis of obstructive airways pathologies. To this end, the current work introduces an efficient method for the detection of wheezes, based on the time-scale representation of breath sound recordings. The employed Continuous Wavelet Transform proves to be a valuable tool in this direction when combined with scale-dependent thresholding. Analysis of lung sound recordings from 'wheezing' patients shows promising performance in the detection and extraction of wheezes from the background noise and reveals its potential for data-volume reduction in long-term wheezing screening, such as in sleep laboratories.
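The CWT-plus-scale-dependent-threshold scheme can be sketched on a synthetic recording: a short tonal burst (a stand-in "wheeze") buried in noise. The Morlet wavelet, the frequency grid, the threshold rule (mean + 3·std per scale) and the signal parameters are all assumptions for illustration; they are not the paper's settings.

```python
import numpy as np

def cwt_morlet(x, fs, freqs, w=6.0):
    """Continuous wavelet transform with a complex Morlet wavelet."""
    out = np.empty((len(freqs), len(x)), dtype=complex)
    for i, f in enumerate(freqs):
        s = w * fs / (2 * np.pi * f)            # scale in samples
        n = int(5 * s) + 1
        u = np.arange(-n, n + 1) / s
        wavelet = np.exp(1j * w * u - u**2 / 2) / np.sqrt(s)
        out[i] = np.convolve(x, wavelet, mode='same')
    return out

fs = 2000
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(1)
sound = 0.2 * rng.standard_normal(len(t))       # breath-sound background
burst = (t > 0.95) & (t < 1.05)
sound[burst] += np.sin(2 * np.pi * 400 * t[burst])  # synthetic 400 Hz "wheeze"

freqs = np.array([100.0, 200.0, 400.0, 800.0])
mag = np.abs(cwt_morlet(sound, fs, freqs))
# scale-dependent thresholding: flag samples exceeding mean + 3*std per scale
thr = mag.mean(axis=1, keepdims=True) + 3 * mag.std(axis=1, keepdims=True)
detected = mag > thr
```

Because wheezes are narrowband, they light up a single scale of the time-scale plane far above that scale's noise floor, so a per-scale threshold separates them cleanly; it is also the basis for data-volume reduction, since only flagged segments need be stored.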
GlueX is a high energy physics experiment with the goal of collecting data necessary for understanding confinement in quantum chromodynamics. Beginning in 2015, GlueX will collect huge amounts of data describing billions of particle collisions. In preparation for data collection, efforts are underway to develop a methodology for analyzing these large data sets. One of the primary challenges in GlueX data analysis is isolating events of interest from a proportionally large background. GlueX has recently begun approaching this selection problem using machine learning algorithms, specifically boosted decision trees. Preliminary studies indicate that these algorithms have the potential to offer vast improvements in both signal selection efficiency and purity over more traditional techniques.
Hedge Funds are considered one of the fastest-growing portfolio management sectors of the past decade. Optimal Hedge Fund management requires appropriate risk metrics. The classic CAPM theory and its Sharpe Ratio fail to capture some crucial aspects due to the strongly non-Gaussian character of Hedge Fund statistics. A possible way out of this problem, while keeping the CAPM's simplicity, is the so-called Downside Risk analysis. One important benefit lies in distinguishing between good and bad returns, that is, returns greater or lower than the investor's goal. We revisit the most popular Downside Risk indicators and provide new analytical results on them. We compute these measures by taking the Credit Suisse/Tremont Investable Hedge Fund Index data, with the Gaussian case as a benchmark. In this way, an unusual transversal reading of the existing Downside Risk measures is provided.
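Two of the standard Downside Risk indicators alluded to above are easy to state in code: downside deviation (the RMS of shortfalls below the investor's target) and the Sortino ratio, a downside analogue of the Sharpe Ratio. The sample returns are invented for illustration; the abstract's index data are not reproduced here.

```python
import numpy as np

def downside_risk(returns, target=0.0):
    """Downside deviation: root-mean-square of shortfalls below the target.
    Returns above the target contribute zero, which is exactly how the
    measure distinguishes 'good' from 'bad' returns."""
    shortfall = np.minimum(returns - target, 0.0)
    return np.sqrt(np.mean(shortfall ** 2))

def sortino_ratio(returns, target=0.0):
    """Mean excess return over the investor's goal per unit downside risk."""
    return (returns.mean() - target) / downside_risk(returns, target)

monthly = np.array([0.10, -0.05, 0.02, -0.01])   # hypothetical returns
dr = downside_risk(monthly)
```

Unlike the standard deviation in the Sharpe Ratio, this penalty is asymmetric, so a fund with occasional large gains and small losses is not treated as "risky" merely because it is volatile.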
Dixon, Mark; Baker, Jonathan C; Sadowski, Katherine Ann
Skinner's 1957 analysis of verbal behavior has demonstrated a fair amount of utility in teaching language to children with autism and various other disorders. However, learned language can be forgotten, as is the case for many elderly individuals suffering from dementia or other degenerative diseases. It appears possible that Skinner's operants may facilitate not only the acquisition of language but also the ability to recall items or objects that appeared to be "forgotten." The present study examined the utility of having a series of adults in long-term care emit tacts, echoics, or intraverbals upon presentation of various visual stimuli. Compared to a no-verbal-response condition, it appears that the incorporation of Skinner's verbal operants can in fact improve recall for this population. Implications for the retraining of lost language are presented.
Dowdall, Mark; Lind, Bjørn; Gerland, Sebastian; Rudjord, Anne Liv
This article details the results of an investigation into the application of geostatistical data analysis to two environmental radiometric time series. The data series employed consist of 99Tc values for seaweed (Fucus vesiculosus) and seawater samples taken as part of a marine monitoring program conducted on the coast of northern Norway by the Norwegian Radiation Protection Authority. Geostatistical methods were selected in order to provide information on values of the variables at unsampled times and to investigate the temporal correlation exhibited by the data sets. This information is of use in the optimisation of future sampling schemes and for providing information on the temporal behaviour of the variables in question that may not be obtained during a cursory analysis. The results indicate a high degree of temporal correlation within the data sets, the correlation for the seawater and seaweed data being modelled with an exponential and linear function, respectively. The semi-variogram for the seawater data indicates a temporal range of correlation of approximately 395 days with no apparent random component to the overall variance structure and was described best by an exponential function. The temporal structure of the seaweed data was best modelled by a linear function with a small nugget component. Evidence of drift was present in both semi-variograms. Interpolation of the data sets using the fitted models and a simple kriging procedure were compared, using a cross-validation procedure, with simple linear interpolation. Results of this exercise indicate that, for the seawater data, the kriging procedure outperformed the simple interpolation with respect to error distribution and correlation of estimates with actual values. Using the unbounded linear model with the seaweed data produced estimates that were only marginally better than those produced by the simple interpolation.
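The empirical semivariogram underlying the analysis above has a compact definition for a regularly sampled series: γ(h) = ½·mean((z(t+h) − z(t))²). The sketch below is generic, not the study's geostatistical software; the pure-drift test series is an assumption, chosen because a linear trend gives the unbounded, drift-dominated semivariogram shape the seaweed data exhibited.

```python
import numpy as np

def semivariogram(z, max_lag):
    """Empirical temporal semivariogram of a regularly sampled series:
    gamma(h) = 0.5 * mean((z[t+h] - z[t])^2) for lags h = 1..max_lag."""
    lags = np.arange(1, max_lag + 1)
    gamma = np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2) for h in lags])
    return lags, gamma

trend = np.arange(100.0)            # pure drift: z(t) = t
lags, gamma = semivariogram(trend, 5)
```

For z(t) = t the increments at lag h all equal h, so γ(h) = h²/2 exactly; a levelling-off (a sill), as in the seawater data's exponential model with its ~395-day range, would instead indicate that correlation dies out beyond that lag.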
Liou, Der-Ming; Chang, Wei-Pin
Data mining, also known as Knowledge Discovery in Databases (KDD), is the process of automatically searching large volumes of data for patterns. For instance, a clinical pattern might indicate that females with diabetes or hypertension are more likely to suffer a stroke within the next 5 years, and a physician can thus learn valuable knowledge from the data mining process. Here, we present a study focused on the investigation of the application of artificial intelligence and data mining techniques to prediction models of breast cancer. The artificial neural network, decision tree, logistic regression, and genetic algorithm were used for the comparative studies, and the accuracy and positive predictive value of each algorithm were used as the evaluation indicators. 699 records acquired from breast cancer patients at the University of Wisconsin, nine predictor variables, and one outcome variable were incorporated for the data analysis, followed by tenfold cross-validation. The results revealed that the accuracy of the logistic regression model was 0.9434 (sensitivity 0.9716 and specificity 0.9482), the decision tree model 0.9434 (sensitivity 0.9615, specificity 0.9105), the neural network model 0.9502 (sensitivity 0.9628, specificity 0.9273), and the genetic algorithm model 0.9878 (sensitivity 1, specificity 0.9802). The accuracy of the genetic algorithm was significantly higher than the average predicted accuracy of 0.9612. The predicted outcome of the logistic regression model was higher than that of the neural network model, but no significant difference was observed. The average predicted accuracy of the decision tree model was 0.9435, which was the lowest of all four predictive models. The standard deviation of the tenfold cross-validation was rather unreliable. This study indicated that the genetic algorithm model yielded better results than the other data mining models for the analysis of the data of breast cancer patients in terms of the overall accuracy of
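The tenfold cross-validation protocol used to compare the four models can be sketched generically. The classifier here is a deliberately trivial nearest-centroid rule on synthetic two-class data with nine predictors (mirroring the study's nine predictor variables); none of the study's actual models or data are reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic two-class data: 9 predictors, well-separated class means
X = np.vstack([rng.normal(0, 1, (50, 9)), rng.normal(3, 1, (50, 9))])
y = np.repeat([0, 1], 50)

def nearest_centroid_cv(X, y, k=10):
    """k-fold cross-validated accuracy of a nearest-centroid classifier:
    each fold is held out once while the centroids are fit on the rest."""
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, k)
    accs = []
    for f in folds:
        train = np.setdiff1d(idx, f)
        c0 = X[train][y[train] == 0].mean(axis=0)
        c1 = X[train][y[train] == 1].mean(axis=0)
        d0 = np.linalg.norm(X[f] - c0, axis=1)
        d1 = np.linalg.norm(X[f] - c1, axis=1)
        pred = (d1 < d0).astype(int)
        accs.append(np.mean(pred == y[f]))
    return float(np.mean(accs))

cv_accuracy = nearest_centroid_cv(X, y)
```

Averaging the ten held-out accuracies, as the study does for each algorithm, gives an estimate that does not reuse training data for evaluation; its fold-to-fold standard deviation is what the abstract reports as unreliable.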
Girvetz, Evan H.; Zganjar, Chris; Raber, George T.; Maurer, Edwin P.; Kareiva, Peter; Lawler, Joshua J.
Background Although the message of "global climate change" is catalyzing international action, it is local and regional changes that directly affect people and ecosystems and are of immediate concern to scientists, managers, and policy makers. A major barrier preventing informed climate-change adaptation planning is the difficulty of accessing, analyzing, and interpreting climate-change information. To address this problem, we developed a powerful, yet easy to use, web-based tool called Climate Wizard (http://ClimateWizard.org) that provides non-climate specialists with simple analyses and innovative graphical depictions for conveying how climate has changed and is projected to change within specific geographic areas throughout the world. Methodology/Principal Findings To demonstrate the Climate Wizard, we explored historic trends and future departures (anomalies) in temperature and precipitation globally, and within specific latitudinal zones and countries. We found that the greatest temperature increases during 1951–2002 occurred in northern hemisphere countries (especially during January–April), but the latitude of greatest temperature change varied throughout the year, sinusoidally ranging from approximately 50°N during February–March to 10°N during August–September. Precipitation decreases occurred most commonly in countries between 0–20°N, and increases mostly occurred outside of this latitudinal region. Similarly, a quantile ensemble analysis based on projections from 16 General Circulation Models (GCMs) for 2070–2099 identified the median projected change within countries, which showed both latitudinal and regional patterns in projected temperature and precipitation change. Conclusions/Significance The results of these analyses are consistent with those reported by the Intergovernmental Panel on Climate Change, but at the same time, they provide examples of how Climate Wizard can be used to explore regionally- and temporally-specific analyses of climate
This paper explores Applied Drama as a teaching approach in Higher Education learning spaces. The exploration takes a reflective analysis approach by first examining the impact that Applied Drama has had on my career as a Lecturer/Educator/Teacher working in Higher Education environments. My engagement with Applied Drama practice and theory is…
Cutroneo, M.; Macková, A.; Havranek, V.; Malinsky, P.; Torrisi, L.; Kormunda, M.; Barchuk, M.; Ullschmied, J.; Dudzak, R.
This paper presents the research activity on Ion Beam Analysis methods performed at the Tandetron Laboratory (LT) of the Institute of Nuclear Physics AS CR, Rez, Czech Republic. Recently, many groups have been paying attention to implantation by laser-generated plasma. This process allows the insertion of a controllable amount of energetic ions into the surface layers of different materials, modifying the physical and chemical properties of the surface material. Different substrates are implanted with ions accelerated from plasma by a terawatt iodine laser, at a nominal intensity of 10¹⁵ W/cm², at the PALS Research Infrastructure AS CR, in the Czech Republic. This regime of laser-matter interaction generates multi-MeV proton beams and multi-charged ions that are tightly confined in time (hundreds of ps) and space (source radius of a few microns). These ion beams have a much lower transverse temperature, a much shorter duration and a much higher current than those obtainable from conventional accelerators. The implementation of proton and ion acceleration driven by ultra-short high-intensity lasers is demonstrated by adopting suitable irradiation conditions as well as tailored targets. An overview of implanted targets and their morphological and structural characterizations is presented and discussed.
Irion, Jeff; Saito, Naoki
In recent years, the advent of new sensor technologies and social network infrastructure has provided huge opportunities and challenges for analyzing data recorded on such networks. In the case of data on regular lattices, computational harmonic analysis tools such as the Fourier and wavelet transforms have well-developed theories and proven track records of success. It is therefore quite important to extend such tools from the classical setting of regular lattices to the more general setting of graphs and networks. In this article, we first review basics of graph Laplacian matrices, whose eigenpairs are often interpreted as the frequencies and the Fourier basis vectors on a given graph. We point out, however, that such an interpretation is misleading unless the underlying graph is either an unweighted path or cycle. We then discuss our recent effort of constructing multiscale basis dictionaries on a graph, including the Hierarchical Graph Laplacian Eigenbasis Dictionary and the Generalized Haar-Walsh Wavelet Packet Dictionary, which are viewed as generalizations of the classical hierarchical block DCTs and the Haar-Walsh wavelet packets, respectively, to the graph setting. Finally, we demonstrate the usefulness of our dictionaries by using them to simultaneously segment and denoise 1-D noisy signals sampled on regular lattices, a problem where classical tools have difficulty.
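A minimal sketch of the Laplacian construction discussed above, for the one case where the frequency interpretation is safe (an unweighted path graph): L = D − A, whose eigenvalues have the closed form 2 − 2 cos(πk/n) with DCT-like eigenvectors.

```python
import numpy as np

n = 6
A = np.zeros((n, n))
for i in range(n - 1):              # path graph: vertex i -- vertex i+1
    A[i, i + 1] = A[i + 1, i] = 1.0
D = np.diag(A.sum(axis=1))          # degree matrix
L = D - A                           # combinatorial graph Laplacian

evals, evecs = np.linalg.eigh(L)    # eigh returns eigenvalues in ascending order
# Closed form for an unweighted path: 2 - 2*cos(pi*k/n), k = 0..n-1.
expected = 2 - 2 * np.cos(np.pi * np.arange(n) / n)
print(np.allclose(evals, expected))   # the "frequency" reading holds here
```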
Ruiz-Villanueva, V.; Ballesteros, J. A.; Bodoque, J. M.; Stoffel, M.; Bollschweiler, M.; Díez-Herrero, A.
Flash floods are one of the natural hazards that cause major damage worldwide. Especially in Mediterranean areas they provoke high economic losses every year. In mountain areas with high stream gradients, flood events are characterized by extremely high flow and debris transport rates. Flash flood analysis in mountain areas presents specific scientific challenges. On one hand, there is a lack of information on precipitation and discharge due to a lack of spatially well distributed gauge stations with long records. On the other hand, gauge stations may not record correctly during extreme events when they are damaged or the discharge exceeds the recordable level. In such cases, no systematic data are available to improve the understanding of the spatial and temporal occurrence of the process. Since historic documentation is normally scarce or even completely missing in mountain areas, tree-ring analysis can provide an alternative approach. Flash floods may influence trees in different ways: (1) tilting of the stem through the unilateral pressure of the flowing mass or individual boulders; (2) root exposure through erosion of the banks; (3) injuries and scars caused by boulders and wood transported in the flow; (4) decapitation of the stem and resulting candelabra growth through the severe impact of boulders; (5) stem burial through deposition of material. Trees react to these disturbances with specific growth changes such as an abrupt change in yearly increment and anatomical changes like reaction wood or callus tissue. In this study, we sampled 90 cross sections and 265 increment cores of trees heavily affected by past flash floods in order to date past events and to reconstruct recurrence intervals in two torrent channels located in the Spanish Central System. The first study site is located along the Pelayo River, a torrent in natural conditions. Based on the external disturbances of trees and their geomorphological position, 114 Pinus pinaster (Ait
Wiesner, Margit; Schanding, G Thomas
Several psychological assessment instruments are based on the assumption of a general construct that is composed of multiple interrelated domains. Standard confirmatory factor analysis is often not well suited for examining the factor structure of such scales. This study used data from 1885 elementary school students (mean age=8.77 years, SD=1.47 years) to examine the factor structure of the Behavioral Assessment System for Children, Second Edition (BASC-2) Behavioral and Emotional Screening System (BESS) Teacher Form that was designed to assess general risk for emotional/behavioral difficulty among children. The modeling sequence included the relatively new exploratory structural equation modeling (ESEM) approach and bifactor models in addition to more standard techniques. Findings revealed that the factor structure of the BASC-2 BESS Teacher Form is multidimensional. Both ESEM and bifactor models showed good fit to the data. Bifactor models were preferred on conceptual grounds. Findings illuminate the hypothesis-generating power of ESEM and suggest that it might not be optimal for instruments designed to assess a predominant general factor underlying the data.
Copenhaver, Michael; Shrestha, Roman; Wickersham, Jeffrey A; Weikum, Damian; Altice, Frederick L
The present study examines the factor structure of the existing Neuropsychological Impairment Scale (NIS) through the use of exploratory factor analysis (EFA). The NIS is a brief, self-report measure originally designed to assess neurocognitive impairment (NCI) by having patients rate a range of items that may influence cognitive functioning. Stabilized patients on methadone maintenance therapy (MMT; N=339) in New Haven, CT who reported drug- or sex-related HIV risk behaviors in the past 6 months were administered the full 95-item NIS. An EFA was then conducted using principal axis factoring and orthogonal varimax rotation. The EFA resulted in retaining 57 items, with a 9-factor solution that explained 54.8% of the overall variance. The revised 9-factor measure--now referred to as the Brief Inventory of Neuro-cognitive Impairment (BINI)--showed a diverse set of factors with excellent to good reliability (i.e., F1 α=0.97 to F9 α=0.73). This EFA suggests the potential utility of using the BINI in the context of addiction treatment. Further research should examine the utility of this tool within other clinical care settings.
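As a rough sketch of the extraction-plus-rotation pipeline named above, the snippet below rotates an initial loading matrix with Kaiser's varimax criterion. For brevity the initial loadings come from an eigendecomposition of the correlation matrix (a principal-component stand-in for principal axis factoring), and the data are synthetic, not NIS responses.

```python
import numpy as np

def varimax(loadings, tol=1e-8, max_iter=200):
    """Kaiser's varimax criterion: an orthogonal rotation of the loadings."""
    n, k = loadings.shape
    R = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        LR = loadings @ R
        u, s, vt = np.linalg.svd(
            loadings.T @ (LR**3 - LR @ np.diag((LR**2).sum(axis=0)) / n))
        R = u @ vt
        if s.sum() < d * (1 + tol):    # converged
            break
        d = s.sum()
    return loadings @ R

# synthetic data: items 0-2 share one factor, items 3-5 share another
rng = np.random.default_rng(0)
X = rng.standard_normal((300, 6))
X[:, 1] += X[:, 0]
X[:, 2] += X[:, 0]
X[:, 4] += X[:, 3]
X[:, 5] += X[:, 3]

corr = np.corrcoef(X, rowvar=False)
evals, evecs = np.linalg.eigh(corr)
L0 = evecs[:, -2:] * np.sqrt(evals[-2:])   # two-factor initial loadings
L = varimax(L0)                            # rotated loadings: simpler structure
print(np.round(L, 2))
```

Because the rotation is orthogonal, each item's communality (row sum of squared loadings) is unchanged; only the factor axes move.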
Shelby, Rebecca A.; Golden-Kreutz, Deanna M.; Andersen, Barbara L.
The Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV; American Psychiatric Association, 1994a) conceptualization of posttraumatic stress disorder (PTSD) includes three symptom clusters: reexperiencing, avoidance/numbing, and arousal. The PTSD Checklist-Civilian Version (PCL-C) corresponds to the DSM-IV PTSD symptoms. In the current study, we conducted exploratory factor analysis (EFA) of the PCL-C with two aims: (a) to examine whether the PCL-C evidenced the three-factor solution implied by the DSM-IV symptom clusters, and (b) to identify a factor solution for the PCL-C in a cancer sample. Women (N = 148) with Stage II or III breast cancer completed the PCL-C after completion of cancer treatment. We extracted two-, three-, four-, and five-factor solutions using EFA. Our data did not support the DSM-IV PTSD symptom clusters. Instead, EFA identified a four-factor solution including reexperiencing, avoidance, numbing, and arousal factors. Four symptom items, which may be confounded with illness and cancer treatment-related symptoms, exhibited poor factor loadings. Using these symptom items in cancer samples may lead to overdiagnosis of PTSD and inflated rates of PTSD symptoms. PMID:16281232
Kostakoglu, Lale; Goy, Andre; Martinelli, Giovanni; Caballero, Dolores; Crump, Michael; Gaidano, Gianluca; Baetz, Tara; Buckstein, Rena; Fine, Gregg; Fingerle-Rowson, Guenter; Berge, Claude; Sahin, Deniz; Press, Oliver; Sehn, Laurie
An exploratory analysis of 75 follicular lymphoma patients treated with obinutuzumab or rituximab induction therapy (IT) for 4 weeks in the phase II GAUSS study aimed to determine whether positron emission tomography (PET) results could predict progression-free survival (PFS) and tumor response. The proportion of patients with a PFS event (progression or death) was higher in those who were PET-positive after IT (assessed using Deauville five-point scale criteria; 35/52, 67%) than PET-negative (5/20, 25%); the hazard ratio for progression or death was 0.25 (95% CI: 0.01-0.64; p = 0.0018). A significant association was also found when PET results were assessed using International Harmonization Project and European Organisation for Research and Treatment of Cancer criteria. Changes between baseline and end of IT in standardized uptake values and other PET parameters were associated with PFS and response. Validation of these results in prospective studies of larger cohorts is warranted.
Lam, Pui Mei; Lai, Claudia Kam Yuk
The aim of this work was to study the psychometric properties of the Chinese version of the Swallow Quality-of-Life Questionnaire (CSWAL-QOL) validated in the Hong Kong Chinese-speaking population. With convenience sampling, a cross-sectional survey was launched to evaluate the validity and reliability of the CSWAL-QOL. One hundred subjects with swallowing problems were recruited to evaluate the construct validity and internal consistency, and 20 subjects were recruited for the test-retest reliability. Construct validity was validated through factor analysis (both exploratory and confirmatory) and a correlation study between the CSWAL-QOL and the World Health Organization Quality-of-Life Questionnaire-abbreviated version [WHOQOL-BREF (HK)]. Reliability was estimated using tests of internal consistency and test-retest reliability. The psychometric properties of the CSWAL-QOL were found to be largely similar to those of the SWAL-QOL, except the three-item eating desire scale of the CSWAL-QOL, which showed insignificant results in the validity and reliability tests. The CSWAL-QOL is the first validated Chinese version of the SWAL-QOL in Hong Kong. It is a clinically valid and reliable tool for assessing the quality of life in dysphagic Chinese patients in Hong Kong, regardless of the causes of dysphagia.
Rozental, Alexander; Kottorp, Anders; Boettcher, Johanna; Andersson, Gerhard; Carlbring, Per
Research conducted during the last decades has provided increasing evidence for the use of psychological treatments for a number of psychiatric disorders and somatic complaints. However, by focusing only on the positive outcomes, less attention has been given to the potential of negative effects. Despite indications of deterioration and other adverse and unwanted events during treatment, little is known about their occurrence and characteristics. Hence, in order to facilitate research of negative effects, a new instrument for monitoring and reporting their incidence and impact was developed using a consensus among researchers, self-reports by patients, and a literature review: the Negative Effects Questionnaire. Participants were recruited via a smartphone-delivered self-help treatment for social anxiety disorder and through the media (N = 653). An exploratory factor analysis was performed, resulting in a six-factor solution with 32 items, accounting for 57.64% of the variance. The derived factors were: symptoms, quality, dependency, stigma, hopelessness, and failure. Items related to unpleasant memories, stress, and anxiety were experienced by more than one-third of the participants. Further, increased or novel symptoms, as well as lack of quality in the treatment and therapeutic relationship rendered the highest self-reported negative impact. In addition, the findings were discussed in relation to prior research and other similar instruments of adverse and unwanted events, giving credence to the items that are included. The instrument is presently available in eleven different languages and can be freely downloaded and used from www.neqscale.com. PMID:27331907
Tosevska, Anela; Franzke, Bernhard; Hofmann, Marlene; Vierheilig, Immina; Schober-Halper, Barbara; Oesen, Stefan; Neubauer, Oliver; Wessner, Barbara; Wagner, Karl-Heinz
Telomere length (TL) in blood cells is widely used in human studies as a molecular marker of ageing. Circulating cell-free DNA (cfDNA) as well as unconjugated bilirubin (UCB) are dynamic blood constituents whose involvement in age-associated diseases is largely unexplored. To our knowledge, there are no published studies integrating all three parameters, especially in individuals of advanced age. Here we present a secondary analysis from the Vienna Active Aging Study (VAAS), a randomized controlled intervention trial in institutionalized elderly individuals (n = 101). Using an exploratory approach we combine three blood-based molecular markers (TL, UCB and cfDNA) with a range of primary and secondary outcomes from the intervention. We further look at the changes occurring in these parameters after 6-month resistance exercise training with or without supplementation. A correlation between UCB and TL was evident at baseline (p < 0.05), and both were associated with increased chromosomal anomalies such as nucleoplasmatic bridges and nuclear buds (p < 0.05). Of the three main markers explored in this paper, only cfDNA decreased significantly (p < 0.05) after 6-month training and dietary intervention. No clear relationship could be established between cfDNA and either UCB or TL. The trial was registered at ClinicalTrials.gov (NCT01775111). PMID:27905522
Boutot, E. Amanda; Hume, Kara
Recent mandates related to the implementation of evidence-based practices for individuals with autism spectrum disorder (ASD) require that autism professionals both understand and are able to implement practices based on the science of applied behavior analysis (ABA). The use of the term "applied behavior analysis" and its related concepts…
experiences using a non-traditional Hadoop distributed computing setup on top of an HPC computing cluster. … In application areas involving large-scale distributed sensor networks, prior to deploying algorithms over high… executed on a Hadoop cluster. This provides the flexible rapid development and iterative analysis capabilities required for our analysis as well as the
Korneva, N. N.; Mogilevskii, M. M.; Nazarov, V. N.
Traditional methods of time series analysis of satellite ionospheric measurements have some limitations and disadvantages that are mainly associated with the complex nonstationary signal structure. In this paper, the possibility of identifying and studying the temporal characteristics of signals via visual analysis is considered. The proposed approach is illustrated by the example of the visual analysis of wave measurements on the DEMETER microsatellite during its passage over the HAARP facility.
Westendorp, Willeke F.; Vermeij, Jan-Dirk; Brouwer, Matthijs C.; Roos, Y.B.W.E.M.; Nederkoorn, Paul J.; van de Beek, Diederik
Background Stroke-associated infections occur frequently and are associated with unfavorable outcome. Previous cohort studies suggest a protective effect of beta-blockers (BBs) against infections. A sympathetic drive may increase immune suppression and infections. Aim This study aimed to investigate the association between BB treatment at baseline and post-stroke infection in the Preventive Antibiotics in Stroke Study (PASS), a prospective clinical trial. Methods We performed an exploratory analysis in PASS, in which 2,538 patients in the acute phase of stroke (24 h after onset) were randomized to ceftriaxone (intravenous, 2 g per day for 4 days) in addition to stroke unit care, or to standard stroke unit care without preventive antibiotic treatment. All clinical data, including use of BBs, were prospectively collected. Infection was diagnosed by the treating physician, and independently by an expert panel blinded to all other data. Multivariable analysis was performed to investigate the relation between BB treatment and infection rate. Results Infection, as defined by the physician, occurred in 348 of 2,538 patients (14%). Multivariable analysis showed that the use of BBs at baseline was associated with the development of infection during the clinical course (adjusted OR (aOR) 1.61, 95% CI 1.19-2.18; p < 0.01). BB use at baseline was also associated with the development of pneumonia (aOR 1.56, 95% CI 1.05-2.30; p = 0.03). Baseline BB use was not associated with mortality (aOR 1.14, 95% CI 0.84-1.53; p = 0.41) or unfavorable outcome at 3 months (aOR 1.10, 95% CI 0.89-1.35; p = 0.39). Conclusions Patients treated with BBs prior to stroke have a higher rate of infection and pneumonia. PMID:27701170
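The odds-ratio arithmetic behind figures such as "aOR 1.61" can be illustrated with a toy 2×2 table (the counts below are invented, not PASS data); in a logistic model the odds ratio for a binary covariate is exp(β).

```python
import math

# invented 2x2 table: infection (yes/no) by beta-blocker use
bb_inf, bb_noinf = 60, 240        # BB users with / without infection
ctl_inf, ctl_noinf = 288, 1950    # non-users with / without infection

# unadjusted odds ratio: odds of infection among users / odds among non-users
or_unadj = (bb_inf / bb_noinf) / (ctl_inf / ctl_noinf)
print(f"unadjusted OR = {or_unadj:.2f}")   # > 1: more infections among BB users

# equivalently, in logistic regression the covariate's coefficient is log(OR)
beta = math.log(or_unadj)
print(f"beta = {beta:.3f}, exp(beta) = {math.exp(beta):.2f}")
```

An *adjusted* OR, as reported in the study, comes from a multivariable logistic fit that includes the other covariates; the exp(β) interpretation is the same.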
Mosimann, Garrett; Wagner, Rachel; Schirber, Tom
The key objective of this exploratory study was to investigate the feasibility of developing or adopting technologies that would enable a large percentage of existing homes in cold climates to apply a combination of an 'excavationless' soil removal process with appropriate insulation and water management on the exterior of existing foundations at a low cost. Our approach was to explore existing excavation and material technologies and systems to discover whether potentially successful combinations existed.
This paper examines ecosystem restoration practices that focus on water temperature reductions in the upper mainstem Willamette River, Oregon, for the benefit of endangered salmonids and other native cold-water species. The analysis integrates hydrologic, natural science and eco...
Fernandes, T. L.; Donatelli, G. D.; Baldo, C. R.
In recent years, computed tomography (CT) has been applied as an industrial metrology tool for the dimensional evaluation of visible and even hidden features of production parts in a non-destructive manner. Considering the experimental findings of a recent work of the authors, this paper deals with the effect of voxel size on measuring distances between feature-of-size centers, which would be less sensitive to edge offset errors. Particular attention is given to the design of experiment and to the measurement uncertainty sources. The most significant experimental findings are outlined and discussed in this paper.
Palace-Berl, Fanny; Jorge, Salomão Dória; Pasqualoto, Kerly Fernanda Mesquita; Ferreira, Adilson Kleber; Maria, Durvanei Augusto; Zorzi, Rodrigo Rocha; de Sá Bortolozzo, Leandro; Lindoso, José Ângelo Lauletta; Tavares, Leoberto Costa
The anti-Trypanosoma cruzi activity of 5-nitro-2-furfuriliden derivatives as well as the cytotoxicity of these compounds on J774 macrophages cell line and FN1 human fibroblast cells were investigated in this study. The most active compounds of series I and II were 4-butyl-[N'-(5-nitrofuran-2-yl) methylene] benzidrazide (3g; IC50=1.05μM±0.07) and 3-acetyl-5-(4-butylphenyl)-2-(5-nitrofuran-2-yl)-2,3-dihydro,1,3,4-oxadiazole (4g; IC50=8.27μM±0.42), respectively. Also, compound 3g was more active than the standard drugs, benznidazole (IC50=22.69μM±1.96) and nifurtimox (IC50=3.78μM±0.10). Regarding the cytotoxicity assay, the 3g compound presented IC50 value of 28.05μM (SI=26.71) against J774 cells. For the FN1 fibroblast assay, 3g showed IC50 value of 98μM (SI=93.33). On the other hand, compound 4g presented a cytotoxicity value on J774 cells higher than 400μM (SI >48), and for the FN1 cells its IC50 value was 186μM (SI=22.49). Moreover, an exploratory data analysis, which comprises hierarchical cluster (HCA) and principal component analysis (PCA), was carried out and the findings were complementary. The molecular properties that most influenced the compounds' grouping were ClogP and total dipole moment, pointing out the need of a lipophilic/hydrophilic balance in the designing of novel potential anti-T. cruzi molecules.
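A hedged sketch of the PCA step in such an exploratory analysis: standardize the molecular descriptors, then project onto the leading principal components via SVD. The descriptor values below (columns standing in for ClogP, dipole moment, and molecular weight) are invented, not the paper's data.

```python
import numpy as np

# invented descriptor matrix: rows = compounds; columns = ClogP-like,
# dipole-moment-like, and molecular-weight-like descriptors
X = np.array([[3.1, 4.2, 310.0],
              [3.5, 4.0, 324.0],
              [1.2, 6.1, 298.0],
              [1.0, 6.5, 305.0],
              [2.2, 5.0, 315.0]])

Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)   # standardize descriptors
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt.T                                  # compound coordinates (PC scores)
explained = s**2 / (s**2).sum()                    # fraction of variance per PC
print(f"PC1 explains {explained[0]:.0%} of the variance")
```

Plotting the first two columns of `scores` (and clustering them, for the HCA step) is what reveals the compound groupings the abstract describes.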
Ramezankhani, Ali; Soori, Hamid; Alhani, Fatemeh; Goudarzi, Ali Moazemi
Introduction Despite the importance of perceived barriers against self-care in diabetic patients, the role of this factor is rarely addressed in the improvement of self-care behaviors of Iranian patients. The lack of appropriate instruments that fit the demographic properties of Iranian society is one reason. The aim of this study was to develop and validate a scale of perceived barriers to self-care in patients with type 2 diabetes mellitus. Methods This cross-sectional study was conducted on 400 patients with type 2 diabetes who were covered by the health centers in Isfahan (Iran) in 2015. A 22-item, researcher-made instrument was designed; the face and content validities of the instrument were examined by obtaining the opinions of an expert panel before administering the instrument in the study. Exploratory factor analysis was also used to investigate the instrument’s validity, and Cronbach’s alpha was employed to measure its internal consistency (reliability). To examine the validity of the final scale, the mean scores of perceived barriers in patients with appropriate and inappropriate self-care behaviors were compared. Results The research sample comprised 240 women (60%) and 160 men (40%). The mean value of the content validity index was 0.84. The results of factor analysis confirmed the validity of the 11 items and 3 factors of the developed scale. The factor loadings ranged from 0.46 to 0.78. These three factors together explained 40.28% of the total variance. The overall reliability coefficient of the instrument was 0.79, ranging from 0.82 to 0.93 for the three factors. Conclusion According to the results, the developed scale is a valid and reliable instrument for examining the barriers perceived by patients. The findings of this research can help health policy makers in planning to facilitate self-care behaviors as the most vital factor in diabetes control. PMID:26767102
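Cronbach's alpha, the internal-consistency statistic reported above, can be computed directly from an item-score matrix: alpha = k/(k−1) · (1 − Σ item variances / variance of total score). The scores below are synthetic, not the study's data.

```python
import numpy as np

# synthetic responses: 200 respondents, 11 items driven by one shared trait
rng = np.random.default_rng(1)
latent = rng.standard_normal(200)
items = np.column_stack([latent + 0.8 * rng.standard_normal(200)
                         for _ in range(11)])

k = items.shape[1]
item_vars = items.var(axis=0, ddof=1)        # variance of each item
total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed score
alpha = k / (k - 1) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")
```

The more the items covary (i.e., the larger the total-score variance relative to the item variances), the closer alpha gets to 1.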
Tuuri, Georgianna; Cater, Melissa; Craft, Brittany; Bailey, Ariana; Miketinas, Derek
Whole grains are recommended by dietary guidelines because of their health-promoting properties, yet attitudes toward consuming these foods have not been examined. This study developed and validated a questionnaire to estimate willingness to consume whole grain foods. Focus group interviews with high school students and input from nutrition educators produced a list of 10 whole grain items that were included in the "Willingness to Eat Whole Grains Questionnaire". Young adult university students 18-29 years of age indicated their willingness to consume each of the whole grain foods using a 4-point, Likert-type scale with responses ranging from "always unwilling" to "always willing" and a fifth option of "never eaten". Participants' age, race/ethnicity, and gender were collected. Data were examined using exploratory factor analysis (EFA), confirmatory factor analysis (CFA), and test-retest reliability. The EFA (n = 266; 65% female; 69% white) using principal axis factoring returned a single factor that included all survey items and explained 58.3% of the variance. The CFA (n = 252; 62% female, 74% white) supported a single-factor solution: χ²(35) = 80.57; RMSEA = 0.07; Comparative Fit Index = 0.92; Tucker-Lewis Index = 0.90; and SRMR = 0.05. The questionnaire, administered on two occasions separated by two weeks to 36 university students, demonstrated good test-retest reliability (r = 0.87, p < 0.0001). The "Willingness to Eat Whole Grains Questionnaire" had good face validity when used with a young adult population and will be a useful tool to help nutrition educators examine attitudes toward consuming nutrient-rich whole grain foods.
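Test-retest reliability as reported above is a Pearson correlation between two administrations of the questionnaire; a minimal sketch with invented total scores for eight respondents:

```python
import numpy as np

# invented questionnaire totals from two administrations two weeks apart
time1 = np.array([32, 28, 35, 30, 38, 25, 33, 29])
time2 = np.array([31, 29, 36, 28, 37, 26, 34, 30])

r = np.corrcoef(time1, time2)[0, 1]   # Pearson test-retest correlation
print(f"test-retest r = {r:.2f}")
```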
Henry, Teague; Campbell, Ashley
Objective. To examine factors that determine the interindividual variability of learning within a team-based learning environment. Methods. Students in a pharmacokinetics course were given 4 interim, low-stakes cumulative assessments throughout the semester and a cumulative final examination. Students’ Myers-Briggs personality type was assessed, as well as their study skills, motivations, and attitudes towards team-learning. A latent curve model (LCM) was applied and various covariates were assessed to improve the regression model. Results. A quadratic LCM was applied for the first 4 assessments to predict final examination performance. None of the covariates examined significantly impacted the regression model fit except metacognitive self-regulation, which explained some of the variability in the rate of learning. There were some correlations between personality type and attitudes towards team learning, with introverts having a lower opinion of team-learning than extroverts. Conclusion. The LCM could readily describe the learning curve. Extroverted and introverted personality types had the same learning performance even though preference for team-learning was lower in introverts. Other personality traits, study skills, or practice did not significantly contribute to the learning variability in this course. PMID:25861101
McCormick, Jennifer A.
The purpose of this study was to examine inclusive elementary teacher knowledge and attitude toward Autism Spectrum Disorder (ASD) and applied behavior analysis (ABA) and their use of ABA. Furthermore, this study examined if knowledge and attitude predicted use of ABA. A survey was developed and administered through a web-based program. Of the…
Schad, Mareike; John, Jürgen
Over the last decades, methods for the economic evaluation of health care technologies have increasingly been used to inform reimbursement decisions. Recently, the German Statutory Health Insurance has begun to make use of these methods to support reimbursement decisions on patented drugs. In this context, the discounting procedure emerges as a critical component of these methods, as discount rates can strongly affect the resulting incremental cost-effectiveness ratios. The aim of this paper is to identify the appropriate value of a social discount rate to be used by the German Statutory Health Insurance for the economic evaluation of health technologies. On theoretical grounds, we build on the widespread view of contemporary economists that the social rate of time preference (SRTP) is the adequate social discount rate. For quantifying the SRTP, we first apply the market behaviour approach, which assumes that the SRTP is reflected in observable market interest rates. As a second approach, we derive the SRTP from optimal growth theory by using the Ramsey equation. A major part of the paper is devoted to specifying the parameters of this equation. Depending on various assumptions, our empirical findings fall in the range of 1.75-4.2% for the SRTP. A reasonable base case discount rate for Germany, thus, would be about 3%. Furthermore, we deal with the much debated question of whether a common discount rate for costs and health benefits or a lower rate for health should be applied in health economic evaluations. In the German social health insurance system, no exogenously fixed budget constraint exists. When evaluating a new health technology, the health care decision maker is obliged to conduct an economic evaluation in order to examine whether there is an economically appropriate relation between the value of the health gains and the additional costs, which are given by the value of the consumption losses due to the additional health care expenditures. Therefore, a discount
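The Ramsey equation referenced above gives the SRTP as ρ + θ·g: the pure rate of time preference plus the (absolute) elasticity of the marginal utility of consumption times per-capita consumption growth. The parameter values below are illustrative, not the paper's estimates, but they reproduce a rate in line with the ~3% base case.

```python
# Ramsey equation: SRTP = rho + theta * g
rho = 0.01    # pure rate of time preference (illustrative: 1%)
theta = 1.0   # elasticity of marginal utility of consumption (illustrative)
g = 0.02      # per-capita consumption growth rate (illustrative: 2%)

srtp = rho + theta * g
print(f"SRTP = {srtp:.1%}")
```

Varying ρ, θ, and g over plausible ranges is what produces an interval like the paper's 1.75-4.2%.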
Mathis, Janelle B.
International children's literature has the potential to create global experiences and cultural insights for young people confronted with limited and biased images of the world offered by media. The current inquiry was designed to explore, through a critical content analysis approach, international children's literature in which characters…
Dembo, Richard; Briones, Rhissa; Gulledge, Laura; Karas, Lora; Winters, Ken C.; Belenko, Steven; Greenbaum, Paul E.
Reflective of interest in mental health and substance abuse issues among youths involved with the justice system, we performed a latent class analysis on baseline information collected on 100 youths involved in two diversion programs. Results identified two groups of youths: Group 1: a majority of the youths, who had high levels of delinquency,…
McClintock, Edwin; O'Brien, George; Jiang, Zhonghong
Analyses of the impact of reform-based teaching practices in Florida International University's program have been previously reported. However, the impact of the field experiences "per se" has not been assessed. Using a cross-case analysis approach, the authors assess the impact of voluntary field experiences with teachers who practice…
Schaefer, C.; Coble, C.; Mason, S.; Young, M.; Wear, M. L.; Sargsyan, A.; Garcia, K.; Patel, N.; Gibson, C.; Alexander, D.; Van Baalen, M.
Carbon dioxide (CO2) levels on board the International Space Station (ISS) have typically averaged 2.3 to 5.3 mmHg, with large fluctuations occurring over periods of hours and days. CO2 has effects on cerebral vascular tone, resulting in vasodilation and alteration of cerebral blood flow (CBF). Increased CBF leads to elevated intracranial pressure (ICP), a factor leading to visual disturbances, headaches, and other central nervous system symptoms. Ultrasound of the optic nerve and optical coherence tomography (OCT) provide surrogate measurements of ICP; in-flight measurements of both were implemented as enhanced screening tools for the Visual Impairment/Intracranial Pressure (VIIP) syndrome. This analysis examines the relationships between ambient CO2 levels on ISS, ultrasound and OCT measures of the eye in an effort to understand how CO2 may possibly be associated with VIIP and to inform future analysis of in-flight VIIP data.
Loyek, Christian; Bunkowski, Alexander; Vautz, Wolfgang; Nattkemper, Tim W
In today's life science projects, sharing data and data interpretation is becoming increasingly important. This calls for novel information technology approaches that enable the integration of expert knowledge from different disciplines in combination with advanced data analysis facilities in a collaborative manner. Since the recent development of web technologies offers scientific communities new ways of cooperation and communication, we propose a fully web-based software approach for the collaborative analysis of bioimage data and demonstrate the applicability of Web 2.0 techniques to ion mobility spectrometry image data. Our approach allows collaborating experts to easily share, explore, and discuss complex image data without installing any software packages. Scientists only need a username and a password to get access to our system and can directly start exploring and analyzing their data.
Plock, Nele; Bax, Leon; Lee, Douglas; DeManno, Deborah; Lahu, Gezim; Pfister, Marc
The presented analysis was performed to characterize the relationship between treatment-related early (week 4) and longer term (3-6 months) weight loss to understand the potential utility of 4-week proof-of-mechanism studies in the early decision-making process during clinical development of new antiobesity compounds. A regression-based meta-analysis was performed leveraging publicly available clinical outcomes data to (1) characterize the within-trial relationship between treatment-related early and longer term body weight loss and (2) identify and quantify key covariate effects on this relationship. Data from 89 randomized clinical trials with 209 treatment arms, representing observations from 54,461 patients and 9 treatments, were available for the meta-analysis. Results indicated that (1) there is a strong correlation between treatment-related early and longer term body weight loss (r > 0.9) and (2) baseline body weight influences the relationship between early and longer term weight loss, whereas comorbidities such as type 2 diabetes mellitus and drug class, including GLP-1 analogues and the antiobesity compounds lorcaserin and phentermine/topiramate, showed no significant effects on this relationship. The model was externally evaluated with data from the investigational compound beloranib, for which longer term weight loss could be successfully predicted based on early response data. Based on these results, the identified strong relationship between treatment-related early and longer term weight loss appears to be independent of mechanism of action. Thus, findings from this analysis can optimize the design of clinical studies and facilitate the development of new antiobesity compounds.
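The within-trial relationship described above can be sketched as a weighted regression of longer-term on early weight loss across trial arms, with arm sizes as weights. The numbers below are invented for illustration, not data from the 89 trials.

```python
import numpy as np

# Hypothetical trial arms: placebo-adjusted weight loss (kg) at week 4
# and month 6, with arm sizes used as regression weights.
early = np.array([1.0, 1.8, 2.5, 3.1, 4.0])      # week-4 loss
longterm = np.array([2.4, 4.1, 5.9, 7.2, 9.3])   # month-6 loss
n = np.array([120, 250, 90, 400, 180])           # patients per arm

# Weighted least squares: longterm ~ a + b * early
X = np.column_stack([np.ones_like(early), early])
W = np.diag(n.astype(float))
a, b = np.linalg.solve(X.T @ W @ X, X.T @ W @ longterm)

# Weighted correlation between early and longer-term loss
def wcorr(x, y, w):
    mx, my = np.average(x, weights=w), np.average(y, weights=w)
    cov = np.average((x - mx) * (y - my), weights=w)
    return cov / np.sqrt(np.average((x - mx)**2, weights=w)
                         * np.average((y - my)**2, weights=w))

r = wcorr(early, longterm, n)
```

With near-linear toy data like this, the weighted correlation comes out well above 0.9, mirroring the pattern the meta-analysis reports.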
Objective: To teach pharmacy students how to apply transactional analysis and personality assessment to patient counseling to improve communication. Design: A lecture series for a required pharmacy communications class was developed to teach pharmacy students how to apply transactional analysis and personality assessment to patient counseling. Students were asked to apply these techniques and to report their experiences. A personality self-assessment was also conducted. Assessment: After attending the lecture series, students were able to apply the techniques and demonstrated an understanding of the psychological factors that may affect patient communication, an appreciation for the diversity created by different personality types, the ability to engage patients based on adult-to-adult interaction cues, and the ability to adapt the interactive patient counseling model to different personality traits. Conclusion: Students gained a greater awareness of transactional analysis and personality assessment by applying these concepts. This understanding will help students communicate more effectively with patients. PMID:17786269
Namba, Alexa; Leonberg, Beth L.; Wootan, Margo G.
Introduction: Since 2008, several states and municipalities have implemented regulations requiring provision of nutrition information at chain restaurants to address obesity. Although early research into the effect of such labels on consumer decisions has shown mixed results, little information exists on the restaurant industry’s response to labeling. The objective of this exploratory study was to evaluate the effect of menu labeling on fast-food menu offerings over 7 years, from 2005 through 2011. Methods: Menus from 5 fast-food chains that had outlets in jurisdictions subject to menu-labeling laws (cases) were compared with menus from 4 fast-food chains operating in jurisdictions not requiring labeling (controls). A trend analysis assessed whether case restaurants improved the healthfulness of their menus relative to the control restaurants. Results: Although the overall prevalence of “healthier” food options remained low, a noteworthy increase was seen after 2008 in locations with menu-labeling laws relative to those without such laws. Healthier food options increased from 13% to 20% at case locations while remaining static at 8% at control locations (test for difference in the trend, P = .02). Since 2005, the average calories for an à la carte entrée remained moderately high (approximately 450 kilocalories), with less than 25% of all entrées and sides qualifying as healthier and no clear systematic differences in the trend between chain restaurants in case versus control areas (P ≥ .50). Conclusion: These findings suggest that menu labeling has thus far not affected the average nutritional content of fast-food menu items, but it may motivate restaurants to increase the availability of healthier options. PMID:23786908
Aina, Yusuf A.; van der Merwe, Johannes H.; Alshuwaikhat, Habib M.
The effects of concentrations of fine particulate matter on urban populations have been gaining attention because fine particulate matter exposes the urban populace to health risks such as respiratory and cardiovascular diseases. Satellite-derived data, based on aerosol optical depth (AOD), have been adopted to improve the monitoring of fine particulate matter. One such data source is the global multi-year PM2.5 data set (2001–2010) released by the Center for International Earth Science Information Network (CIESIN). This paper explores the satellite-derived PM2.5 data for Saudi Arabia to highlight the trend in PM2.5 concentrations. It also examines the changes in PM2.5 concentrations in some urbanized areas of Saudi Arabia. Concentrations in major cities such as Riyadh, Dammam, Jeddah, Makkah and Madinah and the industrial cities of Yanbu and Jubail are analyzed using cluster analysis. The health risks due to exposure of the populace are highlighted using the World Health Organization (WHO) guideline and interim targets. The results show a trend of increasing PM2.5 concentrations in urban areas. Significant clusters of high values are found in the eastern and south-western parts of the country. There is a need to explore this topic using images with higher spatial resolution and to validate the data with ground observations to improve the analysis. PMID:25350009
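As a rough illustration of the WHO benchmarks invoked above, a small helper can band an annual-mean PM2.5 value against the 2005 WHO air quality guideline (10 µg/m³) and interim targets IT-3/IT-2/IT-1 (15/25/35 µg/m³). The city values below are hypothetical, not the CIESIN-derived values from the paper.

```python
# 2005 WHO annual-mean PM2.5 benchmarks (ug/m3): AQG = 10,
# interim targets IT-3 = 15, IT-2 = 25, IT-1 = 35.
def who_band(annual_mean_pm25):
    """Return the strictest WHO annual-mean PM2.5 benchmark met."""
    for limit, label in [(10, "meets AQG"), (15, "meets IT-3"),
                         (25, "meets IT-2"), (35, "meets IT-1")]:
        if annual_mean_pm25 <= limit:
            return label
    return "exceeds IT-1"

# Hypothetical city annual means (ug/m3)
cities = {"City A": 8.0, "City B": 22.0, "City C": 48.0}
bands = {name: who_band(v) for name, v in cities.items()}
```

Banding against the guideline and interim targets in this way is one simple route to the exposure-risk summary the study describes.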
Chen, Jinyao; Song, Yang; Zhang, Lishi
Lycopene is a potentially useful compound for preventing and treating cardiovascular diseases and cancers. Studies on the effects of lycopene on oxidative stress offer insights into its mechanism of action and provide evidence-based rationale for its supplementation. In this analysis, randomized controlled trials of the effects of oral lycopene supplementation on any valid outcomes of oxidative stress were identified and pooled through a search of international journal databases and reference lists of relevant publications. Two reviewers extracted data from each of the identified studies. Only studies of sufficient quality were included. Twelve parallel trials and one crossover trial were included in the systematic review, and six trials provided data for quantitative meta-analysis. Our results indicate that lycopene supplementation significantly decreases the DNA tail length, as determined using comet assays, with a mean difference (MD) of -6.27 [95% confidence interval (CI) -10.74, -1.90] (P=.006) between the lycopene intervention groups and the control groups. Lycopene supplementation does not significantly prolong the lag time of low-density lipoprotein (MD 3.76 [95% CI -2.48, 10.01]; P=.24). Lycopene possibly alleviates oxidative stress; however, biomarker research for oxidative stress needs to be more consistent with the outcomes in lycopene intervention trials for disease prevention.
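A pooled mean difference like the one reported can be computed with standard inverse-variance (fixed-effect) weighting. The per-trial numbers below are placeholders, not the six trials in the review.

```python
import numpy as np

# Hypothetical per-trial mean differences in comet-assay DNA tail length
# (lycopene minus control) and their standard errors.
md = np.array([-8.0, -5.5, -4.2, -7.1, -6.0, -3.8])
se = np.array([2.0, 1.5, 2.5, 3.0, 1.8, 2.2])

w = 1.0 / se**2                       # inverse-variance weights
pooled = np.sum(w * md) / np.sum(w)   # fixed-effect pooled MD
pooled_se = np.sqrt(1.0 / np.sum(w))
ci_low = pooled - 1.96 * pooled_se
ci_high = pooled + 1.96 * pooled_se
```

Each trial is weighted by the precision of its estimate, so small, noisy trials pull the pooled value less than large, precise ones; a random-effects model would widen the interval when trials disagree.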
Hafen, Ryan P.; Vishwanathan, Vilanyur V.; Subbarao, Krishnappa; Kintner-Meyer, Michael CW
Battery testing procedures are important for understanding battery performance, including degradation over the life of the battery. Standards are important to provide clear rules and uniformity to an industry. The work described in this report addresses the need for standard battery testing procedures that reflect real-world applications of energy storage systems to provide regulation services to grid operators. This work was motivated by the need to develop Vehicle-to-Grid (V2G) testing procedures, or V2G drive cycles. Likewise, the stationary energy storage community is equally interested in standardized testing protocols that reflect real-world grid applications for providing regulation services. As the first of several steps toward standardizing battery testing cycles, this work focused on a statistical analysis of frequency regulation signals from the Pennsylvania-New Jersey-Maryland Interconnect with the goal to identify patterns in the regulation signal that would be representative of the entire signal as a typical regulation data set. Results from an extensive time-series analysis are discussed, and the results are explained from both the statistical and the battery-testing perspectives. The results then are interpreted in the context of defining a small set of V2G drive cycles for standardization, offering some recommendations for the next steps toward standardizing testing protocols.
Research conducted at the Institute in Science and Engineering in applied mathematics, numerical analysis, and computer science is summarized. The Institute conducts unclassified basic research in applied mathematics in order to extend and improve problem solving capabilities in science and engineering, particularly in aeronautics and space.
Luyben, Paul D
Applied behavior analysis, a psychological discipline, has been characterized as the science of behavior change (Chance, 2006). Research in applied behavior analysis has been published for approximately 40 years since the initial publication of the Journal of Applied Behavior Analysis in 1968. The field now encompasses a wide range of human behavior. Although much of the published research centers on problem behaviors that occur in schools and among people with disabilities, a substantial body of knowledge has emerged in community settings. This article provides a review of the behavioral community research published in the Journal of Applied Behavior Analysis as representative of this work, including research in the areas of home and family, health, safety, community involvement and the environment, recreation and sports, crime and delinquency, and organizations. In the interest of space, research in schools and with people with disabilities has been excluded from this review.
Artzrouni, M A; Easterlin, R A
A post-World War 2 swing in fertility occurred in many industrialized countries. Research focusing chiefly on the US has suggested that a country's prior birth history has, through its effects on age structure, been an important cause of this fertility swing. The reasoning is that the pre-World War 2 depression in fertility and the post-World War 2 baby boom produced after 1945 first a scarcity and then an abundance of those in family-forming ages relative to older adults. The relative scarcity of young adults, in turn, created favorable economic and psychological conditions among those in childbearing ages and promoted marriage and childbearing; the relative abundance had the opposite effect. This paper examines the relation between birth history and fertility from 1951-76 in England and Wales, France, the Netherlands, Sweden, Finland, Denmark, Switzerland, Spain, Italy, and the US and explores the implications of the analysis for experience in the remainder of this century. The analysis builds on the well-known proposition that age structure is primarily determined by a country's birth history. Birth data can be thought of as yielding an imputed age ratio, that which would prevail in the absence of mortality and migration. Analysis of the data indicates that the pattern of change in the imputed ratio usually approximates fairly closely that in the actual ratio. A ratio of old to young can be thought of as consisting of an upper age limit, a lower age limit, and an intermediate age that divides the population into young and old. With all 3 of these ages free to vary, a computer program then determines, within certain constraints, which of all possible imputed ratios of old to young has the highest (positive or negative) correlation with the total fertility rate from 1951-76. In all countries except Italy the results support the hypothesis that a scarcity of adults in the younger adult ages relative to those in older ages leads to a relatively high total fertility rate; a relative abundance has the opposite effect.
Kalagher, Hilary; Jones, Susan S.
Adults vary their haptic exploratory behavior reliably with variation both in the sensory input and in the task goals. Little is known about the development of these connections between perceptual goals and exploratory behaviors. A total of 36 children ages 3, 4, and 5 years and 20 adults completed a haptic intramodal match-to-sample task.…
Dembo, Richard; Briones-Robinson, Rhissa; Ungaro, Rocio; Karas, Lora; Gulledge, Laura; Greenbaum, Paul E.; Schmeidler, James; Winters, Ken C.; Belenko, Steven
Baseline data collected in two brief intervention projects (BI-Court and Truancy Project) were used to assess similarities and differences in subgroups of at-risk youth. Classifications of these subgroups were based on their psychosocial characteristics (e.g., substance use). Multi-group latent class analysis (LCA) identified two BI-Court subgroups of youth, and three Truant subgroups. These classes can be viewed as differing along two dimensions, substance use involvement and emotional/behavioral issues. Equality tests of means across the latent classes for BI-Court and Truancy Project youths found significant differences that were consistent with their problem group classification. These findings highlight the importance of quality assessments and allocating appropriate services based on problem profiles of at-risk youth. PMID:21966055
Alvi, Shahid; Zaidi, Arshia; Ammar, Nawal; Culbert, Lisa
The purpose of this study was to explore the influence of macro-level factors on immigrant and non-immigrant women's mental health status in a Canadian context. This study was part of a larger study examining women's quality of life in southeastern Ontario. Using survey research methods, data were collected through face-to-face interviews with 91 women, of whom 66 identified their country of origin as "other" than Canada. Descriptive, bivariate and regression analyses of these data revealed that the macro-level predictors of mental health status differ for immigrant and non-immigrant women. Overall, for immigrant women, perceptions of neighbourhood social cohesion were the stronger predictor of mental health status, while for non-immigrant women, social support was more influential. Research with larger, representative samples should explore these findings to ascertain generalizability.
Newall, Anthony T; Dehollain, Juan Pablo
It is important to consider the value for money offered by existing elderly influenza vaccination programs, particularly as doubts persist about the magnitude of the effectiveness of such programs. An informative approach to explore the value of vaccination is to consider what vaccine efficacy would be required for a program to be considered cost-effective. To estimate the cost-effectiveness of the current elderly (65+ years) influenza vaccination program in Australia, we modelled how the hypothetical removal of vaccination would increase current disease burden estimates depending on alternative vaccine efficacy assumptions. The base-case results of the analysis found that the existing elderly vaccination program is likely to be cost-effective (under A$50,000 per quality-adjusted life year gained) if the vaccine efficacy is above ∼30%. This study offers reassurance that the influenza vaccination of elderly Australians is likely to offer value for money.
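The threshold logic described above, finding the lowest vaccine efficacy at which the incremental cost-effectiveness ratio (ICER) falls under A$50,000 per QALY, can be sketched as below. All input numbers are invented and were tuned only so the break-even point lands near the ~30% figure quoted; they are not the study's inputs.

```python
# Hypothetical inputs, chosen only so the break-even efficacy is ~30%.
PROGRAM_COST = 60_000_000.0   # annual vaccination program cost (A$)
PREVENTABLE_QALYS = 4_000.0   # QALYs a 100%-effective vaccine would save
SAVINGS_PER_QALY = 1_500.0    # averted treatment cost per QALY saved

def icer(efficacy):
    """Incremental cost per QALY gained at a given vaccine efficacy."""
    qalys_gained = efficacy * PREVENTABLE_QALYS
    net_cost = PROGRAM_COST - qalys_gained * SAVINGS_PER_QALY
    return net_cost / qalys_gained

# Scan efficacies in 1% steps for the first one under A$50,000/QALY
threshold = next(e / 100 for e in range(1, 101) if icer(e / 100) < 50_000)
```

Because the fixed program cost is spread over efficacy-proportional QALY gains, the ICER falls as efficacy rises, which is why a single break-even efficacy exists.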
MacCluer, J W; Kammerer, C M
Several SEDA statistics are described and three of them (the Midparent-Child Correlation Coefficient, the Major Gene Index, and the Offspring Between Parents Function) are evaluated with respect to their ability to distinguish monogenic, polygenic and sporadic effects on quantitative traits. The MPCC, MGI, and OBP, used in combination, are sensitive but not specific in identifying monogenic inheritance. In our tests, traits determined by a simple type of polygenic inheritance were almost invariably classified as monogenic. Sporadic traits, on the other hand, were correctly identified in 84 percent of cases. The concurrent use of these SEDA statistics and two sibship variance tests yielded some improvement whenever the two procedures agreed (which happened about 50 percent of the time). These results suggest that under some circumstances, SEDA can be a valuable adjunct to other methods of genetic analysis. SEDA methodology also appears promising as an aid in understanding the contribution of various nongenetic factors to quantitative traits.
Rechten, Frances; Tweed, Alison E.
Every day nearly 900 children are excluded from UK schools for disruptive behaviour, and almost one-third of this population has a diagnosed mental health disorder. Exclusion from school is the endpoint of most schools' sanction-based behaviour management policies. This exploratory study investigated staff opinions on using a communication and…
Kolotilina, L.; Nikishin, A.; Yeremin, A.
The solution of large systems of linear equations is a crucial bottleneck when performing 3D finite element analysis of structures. Also, in many cases the reliability and robustness of iterative solution strategies, and their efficiency when exploiting hardware resources, fully determine the scope of industrial applications that can be solved on a particular computer platform. This is especially true for modern vector/parallel supercomputers with large vector lengths and for modern massively parallel supercomputers. Preconditioned iterative methods have been successfully applied to industrial-class finite element analysis of structures. The construction and application of high-quality preconditioners constitutes a high percentage of the total solution time. Parallel implementation of high-quality preconditioners on such architectures is a formidable challenge. Two common types of existing preconditioners are implicit and explicit. Implicit preconditioners (e.g., incomplete factorizations of several types) are generally of high quality but require the solution of lower and upper triangular systems of equations per iteration, which is difficult to parallelize without deteriorating the convergence rate. Explicit preconditioners (e.g., polynomial or Jacobi-like preconditioners) require sparse matrix-vector multiplications and can be parallelized, but their preconditioning quality is less than desirable. The authors present results of numerical experiments with Factorized Sparse Approximate Inverses (FSAI) for symmetric positive definite linear systems. These are high-quality preconditioners that possess a large resource of parallelism by construction without increasing the serial complexity.
Burlet-Vienney, Damien; Chinniah, Yuvin; Bahloul, Ali; Roberge, Brigitte
Investigation reports of fatal confined space accidents nearly always point to a problem of identifying or underestimating risks. This paper compares 4 different risk analysis tools developed for confined spaces by applying them to 3 hazardous scenarios. The tools were namely 1. a checklist without risk estimation (Tool A), 2. a checklist with a risk scale (Tool B), 3. a risk calculation without a formal hazard identification stage (Tool C), and 4. a questionnaire followed by a risk matrix (Tool D). Each tool's structure and practical application were studied. Tools A and B gave crude results comparable to those of more analytic tools in less time. Their main limitations were lack of contextual information for the identified hazards and greater dependency on the user's expertise and ability to tackle hazards of different nature. Tools C and D utilized more systematic approaches than tools A and B by supporting risk reduction based on the description of the risk factors. Tool D is distinctive because of 1. its comprehensive structure with respect to the steps suggested in risk management, 2. its dynamic approach to hazard identification, and 3. its use of data resulting from the risk analysis.
Modlinska, Klaudia; Pisula, Wojciech
Prenatal sex steroids play a vital role in the development of the whole organism, and therefore also the brain. Exposure of the fetus to testosterone seems to be of special importance both for typical development and pathology. The key factor impacting offspring development (including prenatal androgen levels) appears to be diet, both in terms of shortage and excessive intake of certain food products. Prenatal steroid levels are measured using the ratio of the lengths of the second and fourth fingers (2D:4D). So far, the digit ratio (2D:4D) has been shown to correlate negatively with prenatal testosterone and positively with prenatal estrogen. Numerous correlational studies found relationships between the 2D:4D phenotype and differences in magnitude of many psychological traits. Certain social and demographic variables also correlate with the digit ratio. The present paper offers a preliminary analysis of correlations between diet, prenatal hormone levels (established based on the digit ratio), and selected social variables. One of the findings is that countries with high meat consumption present the so-called masculine digit ratio, while countries with plant-based diets present a feminine digit ratio. PMID:27833908
Schaefer, C.; Young, M.; Mason, S.; Coble, C.; Wear, M. L.; Sargsyan, A.; Garcia, K.; Patel, N.; Gibson, C.; Alexander, D.; Van Baalen, M.
Enhanced screening for the Visual Impairment/Intracranial Pressure (VIIP) syndrome has been implemented to better characterize the ocular and vision changes observed in some long-duration crewmembers. This includes implementation of in-flight ultrasound in 2010 and optical coherence tomography (OCT) in 2013. Potential risk factors for VIIP include cardiovascular health, diet, anatomical and genetic factors, and environmental conditions. Carbon dioxide (CO2), a potent vasodilator, is chronically elevated on the International Space Station (ISS) relative to ambient levels on Earth, and is a plausible risk factor for VIIP. In an effort to understand the possible associations between CO2 and VIIP, this study explores the relationship between ambient CO2 levels on the ISS and in-flight ultrasound and OCT measures of the eye obtained from ISS crewmembers. CO2 measurements were aggregated from the Operational Data Reduction Complex and Node 3 major constituent analyzers (MCAs) on ISS, or from sensors located in the European Columbus module, as available. CO2 levels in the periods between each ultrasound and OCT session are summarized using time-series metrics, including time-weighted means and variances. Partial least squares regression analyses are used to quantify the complex relationship between specific ultrasound and OCT measures and the CO2 metrics simultaneously. These analyses will enhance our understanding of the possible associations between CO2 levels and structural changes to the eye, which will in turn inform future analysis of in-flight VIIP data.
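Partial least squares regression is used in studies like this because the CO2 summary metrics are strongly collinear, which defeats ordinary least squares. A minimal PLS1 (NIPALS) sketch on synthetic data, standing in for "several CO2 metrics predicting one eye measure", might look like this; the data and component count are invented.

```python
import numpy as np

def pls1(X, y, n_components=3):
    """Minimal PLS1 (NIPALS): coefficients mapping centered X to centered y.
    Useful when predictors (e.g. several CO2 summary metrics) are collinear."""
    X = X - X.mean(axis=0)
    y = y - y.mean()
    W, P, Q = [], [], []
    Xk, yk = X.copy(), y.copy()
    for _ in range(n_components):
        w = Xk.T @ yk                       # weight vector for this component
        w = w / np.linalg.norm(w)
        t = Xk @ w                          # scores
        tt = t @ t
        p = Xk.T @ t / tt                   # X loadings
        q = (yk @ t) / tt                   # y loading
        Xk = Xk - np.outer(t, p)            # deflate
        yk = yk - q * t
        W.append(w); P.append(p); Q.append(q)
    W, P = np.array(W).T, np.array(P).T
    return W @ np.linalg.solve(P.T @ W, np.array(Q))

# Synthetic stand-in: six collinear "CO2 metrics" predicting one eye measure
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 6))
X[:, 3] = X[:, 0] + 0.01 * rng.normal(size=40)   # induced collinearity
y = 2.0 * X[:, 0] - X[:, 1] + 0.1 * rng.normal(size=40)
beta = pls1(X, y)
resid = (y - y.mean()) - (X - X.mean(axis=0)) @ beta
```

Unlike OLS, each PLS component is built from the covariance between the (deflated) predictors and the response, so the nearly duplicated columns do not destabilize the fit.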
Giannoccaro, Nicola Ivan; Spedicato, Luigi
In this paper, the authors have developed a new method for reconstructing the boundary walls of a room environment using a mechatronic device consisting of four ultrasonic sensors rotated by a servo modular actuator. This scanning system measures the times of flight at each motor position so as to explore the surrounding space, detecting reflections from the boundary walls and from other static obstacles. In addition to undesired reflections due to non-target obstacles interposed between the sensors and the target surfaces, several spurious times are observed at the corners because of multiple reflections. The Fuzzy C-Means (FCM) algorithm is used to partition the obtained dataset into five clusters, and considerations of the output signal energy make it possible to select the two subsets concerned with multipath echoes. Each remaining cluster is associated with a set of three-dimensional points by considering the directivity of the propagated wide beam. In order to discard the observations that are numerically distant from the confidence data, the three sets are filtered by means of an ellipsoid defined by Principal Component Analysis (PCA). The best-fit planes are obtained by testing the eigenvalues and related eigenvectors of the covariance matrix of each filtered set. Several tests aimed at making a robot aware of its environment are shown and discussed to demonstrate the effectiveness of the described approach.
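The Fuzzy C-Means step can be sketched in a few lines of numpy. This is a generic FCM implementation on two synthetic 2-D point clouds standing in for echo clusters, not the authors' five-cluster time-of-flight dataset.

```python
import numpy as np

def fcm(X, c, m=2.0, iters=100, seed=0):
    """Minimal Fuzzy C-Means: returns cluster centers and the fuzzy
    membership matrix U (each row sums to 1)."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        Um = U ** m
        # Centers are membership-weighted means of the points
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # Distances of every point to every center
        d = np.linalg.norm(X[:, None, :] - centers[None], axis=2) + 1e-12
        # Standard FCM membership update
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U

# Two well-separated synthetic point clouds
rng = np.random.default_rng(1)
X = np.vstack([rng.normal([0, 0], 0.3, size=(30, 2)),
               rng.normal([5, 5], 0.3, size=(30, 2))])
centers, U = fcm(X, c=2)
labels = U.argmax(axis=1)
```

The soft memberships in `U` are what let downstream criteria (such as the signal-energy test in the paper) weigh borderline echoes instead of committing them to a single cluster.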
Burkhardt, Käthe; Loxton, Helene; Kagee, Ashraf; Ollendick, Thomas H
The Fear Survey Schedule for Children-Revised (Ollendick, 1983) is an 80-item self-report instrument that has been used internationally to assess the number of fears and general level of fearfulness among children. Despite its widespread use, this instrument has not been adapted to the South African context. The present study addressed this gap by means of a 2-phase investigation aimed at developing a South African version of the instrument. In Phase 1, semistructured interviews were conducted with 40 children (7 to 13 years of age). Qualitative data obtained from these interviews were used to construct additional items for inclusion in the South African Fear Survey Schedule for Children-Revised. The modified scale, consisting of 97 items, was then administered to a sample of 646 children between the ages of 7 and 13 years. Further psychometric considerations resulted in the final version of the scale consisting of 74 items with high internal consistency (α=.97). The factor structure was explored by means of principal component analysis with varimax rotation and a 5-factor solution was found to provide the best conceptual fit. The factors identified were as follows: Fear of Death and Danger; Fear of the Unknown; Fear of Small Animals and Minor Threats to Self; Large Animal Fears; and Situational Fears. Differences between the South African version and the original Fear Survey Schedule for Children-Revised are noted and implications for the study of fear in South Africa and other countries are discussed.
Young, Larry A.; Yetter, Jeffrey A.; Guynn, Mark D.
Maturation of intelligent systems technologies and their incorporation into aerial platforms are dictating the development of new analysis tools and incorporation of such tools into existing system analysis methodologies in order to fully capture the trade-offs of autonomy on vehicle and mission success. A first-order "system analysis of autonomy" methodology is outlined in this paper. Further, this analysis methodology is subsequently applied to notional high-altitude long-endurance (HALE) aerial vehicle missions.
Babuška, I.; Szymczak, W. G.
Technical Note BN-962: An Error Analysis for the Finite Element Method Applied to the Convection Diffusion Problem. University of Maryland, College Park, Institute for Physical Science and Technology, March 1981.
Žibert, Janez; Cedilnik, Jure; Pražnikar, Jure
In the last decade the spatial density of monitoring stations has increased, and air pollution modeling has also made considerable progress. Exploiting such diverse, large data sets can lead to better knowledge about air pollution at the continental scale. The focus of the presented study is a data-driven approach using non-negative matrix factorization to provide new insights and to study the characteristic space-time particulate-matter patterns across Europe. We analyzed the PM10 concentrations obtained from 1097 monitoring stations (AirBase data) and the Monitoring Atmospheric Composition and Climate (MACC) modeled fields for a period of 3 years. We distinguished five characteristic patterns obtained from the AirBase data and five patterns from the MACC data. A comparison between the AirBase and MACC data shows a good spatial overlap for the east Europe, central Europe and Mediterranean patterns. However, it should be noted that an analysis of the MACC data revealed two additional marine patterns: the Celtic and the North Seas. The Po Valley and Balkan patterns were very clearly identified when analyzing the AirBase data. In order to better understand the influence of the synoptic situation on the particulate-matter concentrations, the synoptic meteorological situations were additionally analyzed. In the cold season, low-wind and very stable conditions, which can last for several days, are the situation most commonly linked to high concentrations of anthropogenic particulate-matter air pollution. In contrast, for the Mediterranean pattern the most common situation (high factor loadings) is observed during the summer period. This pattern also exhibits a clearer annual cycle. A closer look at the sea-salt patterns (Celtic and North Seas) shows low time-series correlations between these two factors. Nevertheless, the physical mechanism is the same: a steep gradient between a cyclone and an anti-cyclone that causes high winds and, consequently, higher sea-salt production.
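The factorization behind this kind of pattern extraction can be sketched with the classic Lee-Seung multiplicative updates, here on a synthetic "stations x days" matrix built from two nonnegative source patterns rather than the AirBase/MACC fields.

```python
import numpy as np

def nmf(V, k, iters=500, seed=0):
    """Lee-Seung multiplicative updates for V ~= W @ H (Frobenius loss).
    W holds spatial patterns (stations x k), H their time series (k x days)."""
    rng = np.random.default_rng(seed)
    W = rng.random((V.shape[0], k)) + 0.1
    H = rng.random((k, V.shape[1])) + 0.1
    eps = 1e-9
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # update time factors
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # update spatial factors
    return W, H

# Synthetic "stations x days" matrix with exactly two source patterns
rng = np.random.default_rng(2)
V = rng.random((50, 2)) @ rng.random((2, 100))
W, H = nmf(V, k=2)
rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

Because both factors stay nonnegative, each column of `W` reads as an additive spatial pattern and each row of `H` as its loading over time, which is why the Europe-wide patterns in the study are directly interpretable.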
King, Doug; Hume, Patria; Gissane, Conor; Clark, Trevor
OBJECTIVE The aim of this study was to investigate the frequency, magnitude, and distribution of head impacts sustained by players in a junior rugby league over a season of matches. METHODS The authors performed a prospective cohort analysis of impact magnitude, frequency, and distribution on data collected with instrumented XPatches worn behind the ear of players in an "under-11" junior rugby league team (players under 11 years old). RESULTS A total of 1977 impacts were recorded. Over the course of the study, players sustained an average of 116 impacts (average of 13 impacts per player per match). The measured linear acceleration ranged from 10g to 123g (mean 22g, median 16g, and 95th percentile 57g). The rotational acceleration ranged from 89 rad/sec² to 22,928 rad/sec² (mean 4041 rad/sec², median 2773 rad/sec², and 95th percentile 11,384 rad/sec²). CONCLUSIONS The level of impact severity based on the magnitude of impacts for linear and rotational accelerations recorded was similar to the impacts reported in studies of American junior and high school football, collegiate football, and youth ice hockey players, but the players in the rugby league cohort were younger, had less body mass, and played at a slower speed than the American players. Junior rugby league players are required to tackle the player to the ground and use a different tackle technique than that used in American football, likely increasing the rotational accelerations recorded at the head.
Arbuckle, T E; Lin, Z; Mery, L S
The toxicity of pesticides on human reproduction is largely unknown--particularly how mixtures of pesticide products might affect fetal toxicity. The Ontario Farm Family Health Study collected data by questionnaire on the identity and timing of pesticide use on the farm, lifestyle factors, and a complete reproductive history from the farm operator and eligible couples living on the farm. A total of 2,110 women provided information on 3,936 pregnancies, including 395 spontaneous abortions. To explore critical windows of exposure and target sites for toxicity, we examined exposures separately for preconception (3 months before and up to month of conception) and postconception (first trimester) windows and for early (< 12 weeks) and late (12-19 weeks) spontaneous abortions. We observed moderate increases in risk of early abortions for preconception exposures to phenoxy acetic acid herbicides [odds ratio (OR) = 1.5; 95% confidence interval (CI), 1.1-2.1], triazines (OR = 1.4; 95% CI, 1.0-2.0), and any herbicide (OR = 1.4; 95% CI, 1.1-1.9). For late abortions, preconception exposure to glyphosate (OR = 1.7; 95% CI, 1.0-2.9), thiocarbamates (OR = 1.8; 95% CI, 1.1-3.0), and the miscellaneous class of pesticides (OR = 1.5; 95% CI, 1.0-2.4) was associated with elevated risks. Postconception exposures were generally associated with late spontaneous abortions. Older maternal age (> 34 years of age) was the strongest risk factor for spontaneous abortions, and we observed several interactions between pesticides in the older age group using Classification and Regression Tree analysis. This study shows that timing of exposure and restricting analyses to more homogeneous endpoints are important in characterizing the reproductive toxicity of pesticides. PMID:11564623
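The odds ratios and 95% confidence intervals reported above follow the standard 2x2-table calculation with a Woolf (log-normal) interval. The counts below are invented, loosely shaped like an exposure-by-abortion table, not the Ontario Farm Family Health Study data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Invented counts for illustration
or_, lo, hi = odds_ratio_ci(a=30, b=70, c=20, d=80)
```

The interval is symmetric on the log scale, which is why published ORs like 1.5 (95% CI 1.1-2.1) have a longer upper arm than lower arm.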
This paper explores the degree of variability in the structure of research article introductions within a single discipline. It is an exploratory study based on the analysis of 20 research articles. The study investigates the differences between two subdisciplines of applied linguistics, namely second language acquisition and second language…
Matson, Johnny L.; Coe, David A.
This article reviews applications of the applied behavior analysis ideas of B. F. Skinner and others to persons with both mental retardation and emotional disturbance. The review examines implications of behavior analysis for operant conditioning and radical behaviorism, schedules of reinforcement, and emotion and mental illness. (DB)
Wolfe, Pamela S.; Condo, Bethany; Hardaway, Emily
Applied behavior analysis (ABA) has emerged as one of the most effective empirically based strategies for instructing individuals with autism spectrum disorders (ASD). Four ABA-based strategies that have been found effective are video modeling, visual strategies, social script fading, and task analysis. Individuals with ASD often struggle with…
This evaluative review describes the history of applied behavior analysis in the area of developmental disability and its strengths and weaknesses. Emphasis is placed on the fact that behavior analysis can continue to provide valuable insights into the education and treatment of people with mental retardation. (Author/CR)
Culig, Kathryn M.; Dickinson, Alyce M.; McGee, Heather M.; Austin, John
This paper presents an objective review, analysis, and comparison of empirical studies targeting the behavior of adults published in Journal of Applied Behavior Analysis (JABA) and Journal of Organizational Behavior Management (JOBM) between 1997 and 2001. The purpose of the comparisons was to identify similarities and differences with respect to…
ARI Research Note 2008-01, "A Cost-Benefit Analysis Applied to Example Proposals for Army Training and Education Research," by John E. Morrison, J. Dexter... (2008; contracts DASWO1-04-C-0003 and W74V8H-05-C...). The elements of the current analysis were 21 proposed R&D efforts derived from concepts discussed in the workshop. Total costs were calculated in two ways: (1
Wingeier, B. M.; Nunez, P. L.; Silberstein, R. B.
We demonstrate an application of spherical harmonic decomposition to the analysis of the human electroencephalogram (EEG). We implement two methods and discuss issues specific to the analysis of hemispherical, irregularly sampled data. Spatial sampling requirements and performance of the methods are quantified using simulated data. The analysis is applied to experimental EEG data, confirming earlier reports of an approximate frequency-wave-number relationship in some bands.
Marsh, Herbert W.; Muthen, Bengt; Asparouhov, Tihomir; Ludtke, Oliver; Robitzsch, Alexander; Morin, Alexandre J. S.; Trautwein, Ulrich
This study is a methodological-substantive synergy, demonstrating the power and flexibility of exploratory structural equation modeling (ESEM) methods that integrate confirmatory and exploratory factor analyses (CFA and EFA), as applied to substantively important questions based on multidimensional students' evaluations of university teaching…
Sokolowsky, Martina; Fischer, Ulrich
Bitterness in wine, especially in white wine, is a complex and sensitive topic, as it is a persistent sensation with negative connotations for consumers. However, the molecular basis of bitter taste in white wines is still largely unknown. At the same time, studies dealing with bitterness have to cope with the temporal dynamics of bitter perception. The most common method to describe bitter taste is static measurement among other attributes during a descriptive analysis. A less frequently applied method, time-intensity analysis, evaluates the temporal gustatory changes, focusing on bitterness alone. The most recently developed multidimensional approach, the temporal dominance of sensations method, reveals the temporal dominance of bitter taste in relation to other attributes. In order to compare the results obtained with these different sensory methodologies, 13 commercial white wines were evaluated by the same panel. To facilitate a statistical comparison, parameters were extracted from the bitterness curves obtained from time-intensity and temporal dominance of sensations analysis and were compared to bitter intensity as well as bitter persistency based on descriptive analysis. Analysis of variance significantly differentiated the wines with regard to all measured bitterness parameters obtained from the three sensory techniques. Comparing the information from all sensory parameters by multiple factor analysis and correlation, each technique provided additional valuable information regarding the complex bitter perception in white wine.
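The abstract does not list which parameters were extracted from the bitterness curves, but time-intensity studies commonly reduce a curve to quantities such as maximum intensity, time to maximum, and area under the curve. A minimal sketch under that assumption:

```python
def ti_parameters(times, intensities):
    """Extract common summary parameters from a time-intensity curve.

    times       -- sampling times (e.g. seconds), strictly increasing
    intensities -- rated intensity at each sampling time
    """
    i_max = max(intensities)
    t_max = times[intensities.index(i_max)]
    # Area under the curve by the trapezoidal rule
    auc = sum((t1 - t0) * (y0 + y1) / 2
              for t0, t1, y0, y1 in zip(times, times[1:],
                                        intensities, intensities[1:]))
    return {"Imax": i_max, "Tmax": t_max, "AUC": auc}

# Hypothetical curve: bitterness rises then decays
params = ti_parameters([0, 1, 2], [0, 4, 0])
```

Scalar parameters of this kind are what make an analysis of variance across wines possible, since each panelist-by-wine curve collapses to a few comparable numbers.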
Kim, Wankyung; Soh, Wooyoung
Network event analysis gives useful information about network status that helps protect against attacks. It involves finding sets of frequently occurring packet attributes, such as IP addresses, and by its nature requires real-time processing. This paper applies association rules to network event analysis. Association rules, originally used in data mining, can be applied to find frequent item sets; if frequent item sets occur on a network, the information system can infer that a threat may be present. However, existing association-rule algorithms such as Apriori are not suitable for analyzing network events in real time because of their high CPU and memory usage and consequently low processing speed. This paper develops a network event audit module by applying association rules to network events using a new algorithm in place of Apriori. Test results show that the new algorithm requires drastically less CPU and memory for network event analysis than the existing Apriori algorithm.
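The abstract does not specify the replacement algorithm, so as a generic illustration only: the core task is support counting over event attributes. Restricting item sets to size one and two allows a single pass over the events, avoiding Apriori's repeated candidate-generation scans. A minimal sketch with hypothetical attribute names:

```python
from collections import Counter
from itertools import combinations

def frequent_itemsets(events, min_support):
    """Single-pass support counting for 1- and 2-item sets.

    events      -- iterable of sets of attribute strings,
                   e.g. {"src:10.0.0.1", "dport:80"} (hypothetical encoding)
    min_support -- minimum fraction of events an item set must appear in
    """
    events = list(events)
    counts = Counter()
    for event in events:
        items = sorted(event)  # canonical order so pairs count consistently
        for size in (1, 2):
            for itemset in combinations(items, size):
                counts[itemset] += 1
    n = len(events)
    return {itemset: c / n for itemset, c in counts.items()
            if c / n >= min_support}

# A source/port pair recurring across many events would exceed the
# support threshold and surface as a candidate threat indicator.
```

This trades Apriori's generality (arbitrary item-set sizes) for bounded memory and one pass, which matches the real-time constraint the paper describes.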
Tennyson, George P. Jr.; Finger, John T.; Eichelberger, John C.; Hickox, Charles E.
This session at the Geothermal Energy Program Review X: Geothermal Energy and the Utility Market consisted of four presentations: ''Long Valley Exploratory Well - Summary'' by George P. Tennyson, Jr.; ''The Long Valley Well - Phase II Operations'' by John T. Finger; ''Geologic results from the Long Valley Exploratory Well'' by John C. Eichelberger; and ''A Model for Large-Scale Thermal Convection in the Long Valley Geothermal Region'' by Charles E. Hickox.
Martin, Neil T; Nosik, Melissa R; Carr, James E
Dymond, Clarke, Dunlap, and Steiner's (2000) analysis of international publication trends in the Journal of Applied Behavior Analysis (JABA) from 1970 to 1999 revealed low numbers of publications from outside North America, leading the authors to express concern about the lack of international involvement in applied behavior analysis. They suggested that a future review would be necessary to evaluate any changes in international authorship in the journal. As a follow-up, we analyzed non-U.S. publication trends in the most recent 15 years of JABA and found similar results. We discuss potential reasons for the relative paucity of international authors and suggest potential strategies for increasing non-U.S. contributions to the advancement of behavior analysis.
Wolery, Mark; Barton, Erin E.; Hine, Jeffrey F.
Two issues of each volume of the Journal of Applied Behavior Analysis were reviewed to identify research reports focusing on individuals with autism. The identified articles were analyzed to describe the ages of individuals with autism, the settings in which the research occurred, the nature of the behaviors targeted for intervention, and the…
Foran, Denise; Hoerger, Marguerite; Philpott, Hannah; Jones, Elin Walker; Hughes, J. Carl; Morgan, Jonathan
This article describes how applied behaviour analysis can be implemented effectively and affordably in a maintained special needs school in the UK. Behaviour analysts collaborate with classroom teachers to provide early intensive behaviour education for young children with autism spectrum disorders (ASD), and function based behavioural…
Scotti, Joseph R.; And Others
Clinical research in the area of severe psychiatric disorders constituted the major focus for the discipline of applied behavior analysis during the early 1960s. Recently, however, there appears to be a notable lack of a behavioral focus within many inpatient psychiatric settings and a relative dearth of published behavioral treatment studies with…
Matson, Johnny L.; Turygin, Nicole C.; Beighley, Jennifer; Rieske, Robert; Tureck, Kimberly; Matson, Michael L.
Autism has become one of the most heavily researched topics in the field of mental health and education. While genetics has been the most studied of all topics, applied behavior analysis (ABA) has also received a great deal of attention, and has arguably yielded the most promising results of any research area to date. The current paper provides a…
Cebula, Katie R.
Psychosocial adjustment in siblings of children with autism whose families were using a home-based, applied behavior analysis (ABA) program was compared to that of siblings in families who were not using any intensive autism intervention. Data gathered from parents, siblings and teachers indicated that siblings in ABA families experienced neither…
Maguire, Heather M.
Parents often respond to challenging behavior exhibited by their children in such a way that unintentionally strengthens it. Applied behavior analysis (ABA) is a research-based science that has been proven effective in remediating challenging behavior in children. Although many parents could benefit from using strategies from the field of ABA with…
Griffith, G. M.; Fletcher, R.; Hastings, R. P.
Over more than a decade, specialist Applied Behavior Analysis (ABA) schools or classes for children with autism have developed in the UK and Ireland. However, very little is known internationally about how ABA is defined in practice in school settings, the characteristics of children supported in ABA school settings, and the staffing structures…
Schwichtenberg, A.; Poehlmann, J.
Background: Interventions based on applied behaviour analysis (ABA) are commonly recommended for children with an autism spectrum disorder (ASD); however, few studies address how this intervention model impacts families. The intense requirements that ABA programmes place on children and families are often cited as a critique of the programme,…
Barnes, Scott; Armstrong, Elizabeth
Despite the well documented pragmatic deficits that can arise subsequent to Right Hemisphere Brain Damage (RHBD), few researchers have directly studied everyday conversations involving people with RHBD. In recent years, researchers have begun applying Conversation Analysis (CA) to the everyday talk of people with aphasia. This research programme…
Morris, Edward K
I know that most men, including those at ease with problems of the greatest complexity, can seldom accept the simplest and most obvious truth if it be such as would oblige them to admit the falsity of conclusions which they have proudly taught to others, and which they have woven, thread by thread, into the fabrics of their life. (Tolstoy, 1894) This article presents a case study in the misrepresentation of applied behavior analysis for autism based on Morton Ann Gernsbacher's presentation of a lecture titled “The Science of Autism: Beyond the Myths and Misconceptions.” Her misrepresentations involve the characterization of applied behavior analysis, descriptions of practice guidelines, reviews of the treatment literature, presentations of the clinical trials research, and conclusions about those trials (e.g., children's improvements are due to development, not applied behavior analysis). The article also reviews applied behavior analysis' professional endorsements and research support, and addresses issues in professional conduct. It ends by noting the deleterious effects that misrepresenting any research on autism (e.g., biological, developmental, behavioral) has on our understanding and treatment of it in a transdisciplinary context. PMID:22478522
What Works Clearinghouse, 2010
The "Lovaas Model of Applied Behavior Analysis" is a type of behavioral therapy that initially focuses on discrete trials: brief periods of one-on-one instruction, during which a teacher cues a behavior, prompts the appropriate response, and provides reinforcement to the child. Children in the program receive an average of 35 to 40 hours…
For some time the author looked for a tool to let students apply what they are learning about critical analysis in the science classroom to a relevant life experience. The opportunity occurred when a proposal to use environmentally friendly cleaning products in town buildings appeared on the local town meeting agenda. Using a copy of the proposal…
Mandersloot, Wim G. B.
Argues that technical communication editing is most effective if it deals with structure first, and that structure deficiencies can be detected by applying a range of logical analysis criteria to each text part. Concludes that lists, headings, classifications, and organograms must comply with the laws of categorization and relevant logical…
The Exploratory Studies Group is dedicated to advanced investigation of accelerators and radiation, primarily in the area of charged-particle beams and photon beams. Its primary mission is to explore the next steps in the development of particle accelerators and storage rings, which are important both for high-energy physics and for the wide range of disciplines now turning to synchrotron-radiation sources and free-electron lasers. Our research is therefore deeply committed to LBL's institutional goal of becoming a center for the generation and use of coherent and incoherent electromagnetic radiation of exceptional brightness, as well as for generic research on the future development of accelerators. A significant fraction of our effort is dedicated to general accelerator-physics research for facilities on the immediate horizon, but a vital part of our activities comprises research into exotic possibilities for charged-particle production, accumulation, acceleration, and storage. During this report period, we were principally involved in four general areas of study: Accelerator-physics research for the Advanced Light Source, the 1-2 GeV synchrotron radiation source now under construction at LBL. In collaboration with the Stanford Linear Accelerator Center, both the conceptual and the detailed design of PEP-II, an energy-asymmetric electron-positron collider, based on the PEP ring at SLAC and designed to serve as a B-meson factory. Studies of ultraviolet and infrared free-electron lasers based on linear accelerators and storage rings, in particular the conceptual design of an infrared free-electron laser for the proposed Chemical Dynamics Research Laboratory at LBL. Generic high-energy accelerator-physics and photon-beam research directed far into the future to envision facilities that would employ new techniques of particle-beam acceleration and storage and photon-beam generation.
Kolski, Jeffrey S.; Macek, Robert J.; McCrady, Rodney C.; Pang, Xiaoying
Independent component analysis (ICA) is a powerful blind source separation (BSS) method. Compared to the typical BSS method, principal component analysis (PCA), which is the BSS foundation of the well known model independent analysis (MIA), ICA is more robust to noise, coupling, and nonlinearity. ICA of turn-by-turn beam position data has been used to measure the transverse betatron phase and amplitude functions, dispersion function, linear coupling, sextupole strength, and nonlinear beam dynamics. We apply ICA in a new way to slices along the bunch and discuss the source signals identified as betatron motion and longitudinal beam structure.
Zhang, Y.; Oladyshkin, S.; Liu, Y.; Pau, G. S. H.
This study focuses on the comparison of four reduced order models (ROMs) applied to global sensitivity analysis (GSA). ROMs are one way to improve computational efficiency in many-query applications such as optimization, uncertainty quantification, sensitivity analysis, and inverse modeling, where the computational demand can become large. The four ROM methods are: arbitrary Polynomial Chaos (aPC), Gaussian process regression (GPR), cut high dimensional model representation (HDMR), and random sample HDMR. The discussion is mainly based on a global sensitivity analysis performed for a hypothetical large-scale CO2 storage project. Pros and cons of each method are discussed, and suggestions are made on how each method should be applied individually or in combination.
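The quantity a variance-based GSA typically targets is the first-order Sobol index, estimated by Monte Carlo over many model evaluations; ROMs matter precisely because each evaluation of the full model is expensive. A minimal "pick-freeze" sketch, using a cheap analytic test function in place of a ROM (the study's actual models and estimators may differ):

```python
import random

def sobol_first_order(f, dim, n=20000, seed=0):
    """Monte Carlo (pick-freeze) estimate of first-order Sobol indices
    for f defined on the unit hypercube [0, 1]^dim."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    fA = [f(x) for x in A]
    f0 = sum(fA) / n
    var = sum(y * y for y in fA) / n - f0 * f0
    indices = []
    for i in range(dim):
        # Re-evaluate with coordinate i "frozen" to the A sample
        fABi = [f(b[:i] + [a[i]] + b[i + 1:]) for a, b in zip(A, B)]
        cov = sum(ya * yb for ya, yb in zip(fA, fABi)) / n - f0 * f0
        indices.append(cov / var)
    return indices

# Additive test model 3*x1 + x2: analytic indices are 0.9 and 0.1
s1, s2 = sobol_first_order(lambda x: 3 * x[0] + x[1], dim=2)
```

With 2n(dim + 1)-style evaluation counts per analysis, substituting a ROM for the full simulator is what makes such estimates affordable.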
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.
Fulmer, Gavin W.; Liang, Ling L.; Liu, Xiufeng
This exploratory study applied a proposed force and motion learning progression (LP) to high-school and university students and to content involving both one- and two-dimensional force and motion situations. The Force Concept Inventory (FCI) was adapted, based on a previous content analysis and coding of the questions in the FCI in terms of the…
Lespinats, Sylvain; Pinker-Domenig, Katja; Meyer-Bäse, Uwe; Meyer-Bäse, Anke
Chromosome 19 is known to be linked to neurodegeneration and many cancers. Glioma-derived cancer stem cells (GSCs) are tumor-initiating cells and may be refractory to radiation and chemotherapy and thus have important implications for tumor biology and therapeutics. The analysis and interpretation of large proteomic data sets requires the development of new data mining and visualization approaches. Traditional techniques are insufficient to interpret and visualize these resulting experimental data. The emphasis of this paper lies in the presentation of novel approaches for the visualization, clustering and projection representation to unveil hidden data structures relevant for the accurate interpretation of biological experiments. These qualitative and quantitative methods are applied to the proteomic analysis of data sets derived from the GSCs. The achieved clustering and visualization results provide a more detailed insight into the expression patterns for chromosome 19 proteins.
Morris, Edward K; Altus, Deborah E; Smith, Nathaniel G
This article reports a study of the founding of applied behavior analysis through its publications. Our methods included (a) hand searches of sources (e.g., journals, reference lists), (b) search terms (i.e., early, applied, behavioral, research, literature), (c) inclusion criteria (e.g., the field's applied dimension), and (d) challenges to their face and content validity. Our results were 36 articles published between 1959 and 1967 that we organized into 4 groups: 12 in 3 programs of research and 24 others. Our discussion addresses (a) limitations in our method (e.g., the completeness of our search), (b) challenges to the validity of our methods and results (e.g., convergent validity), and (c) priority claims about the field's founding. We conclude that the claims are irresolvable because identification of the founding publications depends significantly on methods and because the field's founding was an evolutionary process. We close with suggestions for future research. PMID:25729133
Miltenberger, R G; Fuqua, R W; Woods, D W
This article provides a review and analysis of habit reversal, a multicomponent procedure developed by Azrin and Nunn (1973, 1974) for the treatment of nervous habits, tics, and stuttering. The article starts with a discussion of the behaviors treated with habit reversal, behavioral covariation among habits, and functional analysis and assessment of habits. Research on habit reversal and simplified versions of the procedure is then described. Next the article discusses the limitations of habit reversal and the evidence for its generality. The article concludes with an analysis of the behavioral processes involved in habit reversal and suggestions for future research. PMID:9757583