A ‘reader’ unit of the chemical computer
Smelov, Pavel S.
2018-01-01
We suggest the main principles and functional units of a parallel chemical computer, namely, (i) a generator of oscillatory dynamic modes (a network of coupled oscillators), (ii) a unit able to recognize these modes (a ‘reader’) and (iii) a decision-making unit, which analyses the current mode, compares it with the external signal and sends a command to the mode generator to switch it to another dynamical regime. Three methods for the functioning of the reader unit are suggested and tested computationally: (a) the polychronization method, which exploits the differences between the phases of the generator oscillators; (b) the amplitude method, which detects clusters of the generator; and (c) the resonance method, which is based on resonances between the frequencies of the generator modes and the internal frequencies of the damped oscillations of the reader cells. The pros and cons of these methods are analysed. PMID:29410852
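As an illustration of the polychronization idea, a dynamical mode can be summarized by the vector of pairwise phase lags between generator cells and recognized by nearest-match lookup. The following is a minimal sketch of that scheme, not the authors' implementation; all signals and stored signatures are synthetic assumptions.

```python
import numpy as np
from scipy.signal import hilbert

def phase_signature(signals):
    """Pairwise mean phase lags (rad) for oscillator traces.

    signals: (n_oscillators, n_samples) array; instantaneous phases are
    extracted from the analytic signal (Hilbert transform).
    """
    phases = np.unwrap(np.angle(hilbert(signals, axis=1)), axis=1)
    n = len(signals)
    return np.array([np.angle(np.mean(np.exp(1j * (phases[i] - phases[j]))))
                     for i in range(n) for j in range(i + 1, n)])

def read_mode(signature, stored_modes):
    """The 'reader' step: classify the current mode as the stored
    signature with the smallest circular distance."""
    dist = lambda s: np.linalg.norm(np.angle(np.exp(1j * (signature - s))))
    return min(stored_modes, key=lambda name: dist(stored_modes[name]))
```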
ERIC Educational Resources Information Center
Davis, Charles H.
Intended for teaching applications programing for libraries and information centers, this volume is a graded workbook or text supplement containing typical practice problems, suggested solutions, and brief analyses which emphasize programing efficiency. The computer language used is Programing Language/One (PL/1) because it adapts readily to…
Eom, Hyun-Jeong; Liu, Yuedan; Kwak, Gyu-Suk; Heo, Muyoung; Song, Kyung Seuk; Chung, Yun Doo; Chon, Tae-Soo; Choi, Jinhee
2017-01-01
We conducted an inhalation toxicity test on the alternative animal model, Drosophila melanogaster, to investigate potential hazards of indoor air pollution. The inhalation toxicity of toluene and formaldehyde was investigated using comprehensive transcriptomics and computational behavior analyses. The ingenuity pathway analysis (IPA) based on microarray data suggests the involvement of pathways related to immune response, stress response, and metabolism in formaldehyde and toluene exposure based on hub molecules. We conducted a toxicity test using mutants of the representative genes in these pathways to explore the toxicological consequences of alterations of these pathways. Furthermore, extensive computational behavior analysis showed that exposure to either toluene or formaldehyde reduced most of the behavioral parameters of both wild-type and mutants. Interestingly, behavioral alteration caused by toluene or formaldehyde exposure was most severe in the p38b mutant, suggesting that the defects in the p38 pathway underlie behavioral alteration. Overall, the results indicate that exposure to toluene and formaldehyde via inhalation causes severe toxicity in Drosophila, by inducing significant alterations in gene expression and behavior, suggesting that Drosophila can be used as a potential alternative model in inhalation toxicity screening. PMID:28621308
Perri, Romina; Huta, Veronika; Pinchuk, Leonard; Pinchuk, Cindy; Ostry, David J; Lund, James P
2008-09-01
To determine if temporomandibular joint disorders (TMDs) are associated with extended computer use. People with chronic pain and extensive computer use were recruited by means of a newspaper advertisement. Those who responded to the ad were asked to complete an online survey, which included questions on computer use, medical history, pain symptoms, lifestyle and mood. Ninety-two people completed the online survey, but none of them responded to all questions in the survey. Of the 88 respondents who reported their sex, 49 (56%) were female. Most of the respondents had used computers for more than 5 hours per day for more than 5 years, and most believed that their pain was linked to computer use. The great majority had pain in the neck (73/89 [82%]) or shoulder (67/89 [75%]), but many (40/91 [44%]) also had symptoms of TMD. About half of the participants reported poor sleep and fatigue, and many linked their pain to negative effects on lifestyle and poor quality of life. Two multiple regressions, with duration of pain as the dependent variable, were carried out, one using the entire sample of respondents who had completed the necessary sections of the survey (n = 91) and the other using the subset of people with symptoms suggestive of TMD (n = 40). Duration of computer use was associated with duration of pain in both analyses, but 6 other independent variables (injury or arthritis, hours of daily computer use, stress, position of computer screen relative to the eyes, sex, and age) were without effect. In these regression analyses, the intercept was close to 0 years, which suggests that the pain began at about the same time as computer use. This web-based survey provides the first evidence that chronic pain in jaw muscles and other symptoms of TMD are associated with long-term, heavy use of computers. However, the great majority of people with these symptoms probably also suffer from pain in the shoulder and neck.
NASA Astrophysics Data System (ADS)
Bergey, Bradley W.; Ketelhut, Diane Jass; Liang, Senfeng; Natarajan, Uma; Karakus, Melissa
2015-10-01
The primary aim of the study was to examine whether performance on a science assessment in an immersive virtual environment was associated with changes in scientific inquiry self-efficacy. A secondary aim of the study was to examine whether performance on the science assessment was equitable for students with different levels of computer game self-efficacy, including whether gender differences were observed. We examined 407 middle school students' scientific inquiry self-efficacy and computer game self-efficacy before and after completing a computer game-like assessment about a science mystery. Results from path analyses indicated that prior scientific inquiry self-efficacy predicted achievement on end-of-module questions, which in turn predicted change in scientific inquiry self-efficacy. By contrast, computer game self-efficacy was neither predictive of nor predicted by performance on the science assessment. While boys had higher computer game self-efficacy compared to girls, multi-group analyses suggested only minor gender differences in how efficacy beliefs related to performance. Implications for assessments with virtual environments and future design and research are discussed.
Talarczyk-Desole, Joanna; Berger, Anna; Taszarek-Hauke, Grażyna; Hauke, Jan; Pawelczyk, Leszek; Jedrzejczak, Piotr
2017-01-01
The aim of the study was to assess the quality of a computer-assisted sperm analysis (CASA) system in comparison with the reference manual method, as well as the standardization of computer-assisted semen assessment. The study was conducted between January and June 2015 at the Andrology Laboratory of the Division of Infertility and Reproductive Endocrinology, Poznań University of Medical Sciences, Poland. The study group consisted of 230 men who gave sperm samples for the first time in our center as part of an infertility investigation. The samples underwent manual and computer-assisted assessment of concentration, motility and morphology. A total of 184 samples were examined twice: manually, according to the 2010 WHO recommendations, and with CASA, using the program settings provided by the manufacturer. Additionally, 46 samples underwent two manual analyses and two computer-assisted analyses. P-values < 0.05 were considered statistically significant. Statistically significant differences were found between all of the investigated sperm parameters measured with CASA and manually, except for non-progressive motility. In the group of patients where all analyses with each method were performed twice on the same sample, we found no significant differences between the two assessments of the same probe, neither in the samples analyzed manually nor in those analyzed with CASA, although the standard deviation was higher in the CASA group. Our results suggest that computer-assisted sperm analysis requires further improvement for wider application in clinical practice.
Computational study of Drucker-Prager plasticity of rock using microtomography
NASA Astrophysics Data System (ADS)
Liu, J.; Sarout, J.; Zhang, M.; Dautriat, J.; Veveakis, M.; Regenauer-Lieb, K.
2016-12-01
Understanding the physics of rocks is essential for the mining and petroleum industries. Microtomography provides a new way to quantify the relationship between microstructure and mechanical and transport properties. Transport and elastic properties have been studied widely, while plastic properties are still poorly understood. In this study, we analyse a synthetic sandstone sample for its up-scaled plastic properties from the micro-scale. The computations are based on the representative volume element (RVE). The mechanical RVE was determined by upper- and lower-bound finite element computations of elasticity. By comparison with experimental curves, the parameters of the matrix (solid part), which consists of calcite-cemented quartz grains, were investigated and quite accurate values obtained. The analyses yielded the bulk yield stress, cohesion and angle of friction of the porous rock. Computations on a series of models with volume sizes from 240-cube to 400-cube showed almost overlapping stress-strain curves, suggesting that the mechanical RVE determined by elastic computations is valid for plastic yielding. Furthermore, a series of derivative models was created with similar structure but different porosity values. The analyses of these models showed that yield stress, cohesion and the angle of friction decrease linearly with increasing porosity over the range from 8% to 28%. The angle of friction decreases the fastest, while cohesion is the most stable with respect to porosity.
Comparison of bias analysis strategies applied to a large data set.
Lash, Timothy L; Abrams, Barbara; Bodnar, Lisa M
2014-07-01
Epidemiologic data sets continue to grow larger. Probabilistic-bias analyses, which simulate hundreds of thousands of replications of the original data set, may challenge desktop computational resources. We implemented a probabilistic-bias analysis to evaluate the direction, magnitude, and uncertainty of the bias arising from misclassification of prepregnancy body mass index when studying its association with early preterm birth in a cohort of 773,625 singleton births. We compared 3 bias analysis strategies: (1) using the full cohort, (2) using a case-cohort design, and (3) weighting records by their frequency in the full cohort. Underweight and overweight mothers were more likely to deliver early preterm. A validation substudy demonstrated misclassification of prepregnancy body mass index derived from birth certificates. Probabilistic-bias analyses suggested that the association between underweight and early preterm birth was overestimated by the conventional approach, whereas the associations between overweight categories and early preterm birth were underestimated. The 3 bias analyses yielded equivalent results and challenged our typical desktop computing environment. Analyses applied to the full cohort, case cohort, and weighted full cohort required 7.75 days and 4 terabytes, 15.8 hours and 287 gigabytes, and 8.5 hours and 202 gigabytes, respectively. Large epidemiologic data sets often include variables that are imperfectly measured, often because data were collected for other purposes. Probabilistic-bias analysis allows quantification of errors but may be difficult in a desktop computing environment. Solutions that allow these analyses in this environment can be achieved without new hardware and within reasonable computational time frames.
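The record-weighting strategy (option 3) amounts to collapsing the cohort to its unique covariate patterns and carrying a frequency weight through each replication. A minimal sketch of that idea, with hypothetical column names, using statsmodels frequency weights:

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical full cohort: one row per singleton birth.
cohort = pd.read_csv("births.csv")   # columns: bmi_cat, early_preterm

# Collapse identical (exposure, outcome) patterns to one record + weight,
# so each of the hundreds of thousands of replications touches few rows.
weighted = (cohort.groupby(["bmi_cat", "early_preterm"])
                  .size().reset_index(name="n"))

X = sm.add_constant(
        pd.get_dummies(weighted["bmi_cat"], drop_first=True).astype(float))
fit = sm.GLM(weighted["early_preterm"], X,
             family=sm.families.Binomial(),
             freq_weights=weighted["n"]).fit()
print(fit.summary())   # same estimates as the row-per-birth analysis
```

In a probabilistic-bias replication, the collapsed exposure categories would first be reclassified using parameters drawn from the validation substudy, and the weighted model refit.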
Regenbogen, Christina; Herrmann, Manfred; Fehr, Thorsten
2010-01-01
Studies investigating the effects of violent computer and video game playing have produced heterogeneous outcomes. It has been assumed that people who play these games intensively show a decreased ability to differentiate between virtuality and reality. FMRI data from a group of young males with (gamers) and without (controls) a history of long-term violent computer game playing experience were obtained during the presentation of computer game and realistic video sequences. In gamers, the processing of real violence in contrast to nonviolence produced activation clusters in right inferior frontal, left lingual and superior temporal brain regions. Virtual violence activated a network comprising bilateral inferior frontal, occipital, postcentral, right middle temporal, and left fusiform regions. Control participants showed extended left frontal, insula and superior frontal activations during the processing of real, and posterior activations during the processing of virtual, violent scenarios. The data suggest that the ability to differentiate automatically between real and virtual violence has not been diminished by a long-term history of violent video game play, nor have gamers' neural responses to real violence in particular been subject to desensitization processes. However, analyses of individual data indicated that group-level analyses reflect only a small part of the actual individual differences in neural network involvement, suggesting that consideration of individual learning history is necessary for the present discussion.
A VLBI variance-covariance analysis interactive computer program. M.S. Thesis
NASA Technical Reports Server (NTRS)
Bock, Y.
1980-01-01
An interactive computer program (in FORTRAN) for the variance-covariance analysis of VLBI experiments is presented for use in experiment planning, simulation studies and optimal design problems. The interactive mode is especially suited to these types of analyses, providing ease of operation as well as savings in time and cost. The geodetic parameters include baseline vector parameters and variations in polar motion and Earth rotation. A discussion of the theory on which the program is based provides an overview of the VLBI process, emphasizing the areas of interest to geodesy. Special emphasis is placed on the problem of determining correlations between simultaneous observations from a network of stations. A model suitable for covariance analyses is presented. Suggestions towards developing optimal observation schedules are included.
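The core computation in such a variance-covariance analysis is propagating observation noise through the linearized model: for observations y = Ax + e with weight matrix W, the formal parameter covariance is Σ = (AᵀWA)⁻¹. A minimal sketch (the design-matrix contents are hypothetical):

```python
import numpy as np

def parameter_covariance(A, sigma_obs):
    """Formal covariance of least-squares parameter estimates.

    A: (n_obs, n_params) partial derivatives of the VLBI delay observables
    w.r.t. baseline components, polar motion and Earth rotation terms.
    sigma_obs: per-observation standard deviations; correlated simultaneous
    observations would require a full (non-diagonal) weight matrix.
    """
    W = np.diag(1.0 / np.asarray(sigma_obs) ** 2)
    return np.linalg.inv(A.T @ W @ A)

# Schedule design: compare candidate observation schedules through the
# predicted parameter standard deviations, np.sqrt(np.diag(cov)).
```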
Control and Stabilization: Making Millikan's Oil Drop Experiment Work
ERIC Educational Resources Information Center
Muller-Hill, Christoph; Heering, Peter
2011-01-01
Educational versions of Millikan's oil-drop experiment have frequently been criticized; suggestions for improvement either focus on technical innovations of the setup or on replacing the experiment by other approaches of familiarization, such as computer simulations. In our approach, we have analysed experimental procedures. In doing so, we were…
An empirical generative framework for computational modeling of language acquisition.
Waterfall, Heidi R; Sandbank, Ben; Onnis, Luca; Edelman, Shimon
2010-06-01
This paper reports progress in developing a computer model of language acquisition in the form of (1) a generative grammar that is (2) algorithmically learnable from realistic corpus data, (3) viable in its large-scale quantitative performance and (4) psychologically real. First, we describe new algorithmic methods for unsupervised learning of generative grammars from raw CHILDES data and give an account of the generative performance of the acquired grammars. Next, we summarize findings from recent longitudinal and experimental work that suggests how certain statistically prominent structural properties of child-directed speech may facilitate language acquisition. We then present a series of new analyses of CHILDES data indicating that the desired properties are indeed present in realistic child-directed speech corpora. Finally, we suggest how our computational results, behavioral findings, and corpus-based insights can be integrated into a next-generation model aimed at meeting the four requirements of our modeling framework.
Multiple independent introductions of Plasmodium falciparum in South America
Yalcindag, Erhan; Elguero, Eric; Arnathau, Céline; Durand, Patrick; Akiana, Jean; Anderson, Timothy J.; Aubouy, Agnes; Balloux, François; Besnard, Patrick; Bogreau, Hervé; Carnevale, Pierre; D'Alessandro, Umberto; Fontenille, Didier; Gamboa, Dionicia; Jombart, Thibaut; Le Mire, Jacques; Leroy, Eric; Maestre, Amanda; Mayxay, Mayfong; Ménard, Didier; Musset, Lise; Newton, Paul N.; Nkoghé, Dieudonné; Noya, Oscar; Ollomo, Benjamin; Rogier, Christophe; Veron, Vincent; Wide, Albina; Zakeri, Sedigheh; Carme, Bernard; Legrand, Eric; Chevillon, Christine; Ayala, Francisco J.; Renaud, François; Prugnolle, Franck
2012-01-01
The origin of Plasmodium falciparum in South America is controversial. Some studies suggest a recent introduction during the European colonizations and the transatlantic slave trade. Other evidence—archeological and genetic—suggests a much older origin. We collected and analyzed P. falciparum isolates from different regions of the world, encompassing the distribution range of the parasite, including populations from sub-Saharan Africa, the Middle East, Southeast Asia, and South America. Analyses of microsatellite and SNP polymorphisms show that the populations of P. falciparum in South America are subdivided in two main genetic clusters (northern and southern). Phylogenetic analyses, as well as Approximate Bayesian Computation methods suggest independent introductions of the two clusters from African sources. Our estimates of divergence time between the South American populations and their likely sources favor a likely introduction from Africa during the transatlantic slave trade. PMID:22203975
ERIC Educational Resources Information Center
Burger, H. Robert
1983-01-01
Part 1 (SE 533 635) presented programs for use in mineralogy, petrology, and geochemistry. This part presents an annotated list of 64 additional programs, focusing on introductory geology, mapping, and statistical packages for geological analyses. A brief description, source, suggested use(s), programing language, and other information are…
NASA Technical Reports Server (NTRS)
Hawke, Veronica; Gage, Peter; Manning, Ted
2007-01-01
ComGeom2, a tool developed to generate Common Geometry representation for multidisciplinary analysis, has been used to create a large set of geometries for use in a design study requiring analysis by two computational codes. This paper describes the process used to generate the large number of configurations and suggests ways to further automate the process and make it more efficient for future studies. The design geometry for this study is the launch abort system of the NASA Crew Launch Vehicle.
Computational analyses in cognitive neuroscience: in defense of biological implausibility.
Dror, I E; Gallogly, D P
1999-06-01
Because cognitive neuroscience researchers attempt to understand the human mind by bridging behavior and brain, they expect computational analyses to be biologically plausible. In this paper, biologically implausible computational analyses are shown to have critical and essential roles in the various stages and domains of cognitive neuroscience research. Specifically, biologically implausible computational analyses can contribute to (1) understanding and characterizing the problem that is being studied, (2) examining the availability of information and its representation, and (3) evaluating and understanding the neuronal solution. In the context of the distinct types of contributions made by certain computational analyses, the biological plausibility of those analyses is altogether irrelevant. These biologically implausible models are nevertheless relevant and important for biologically driven research.
The interactome of CCT complex - A computational analysis.
Narayanan, Aswathy; Pullepu, Dileep; Kabir, M Anaul
2016-10-01
The eukaryotic chaperonin CCT (Chaperonin Containing TCP1, or TRiC: TCP-1 Ring Complex) has been subjected to physical and genetic analyses in S. cerevisiae, which can be extrapolated to human CCT (hCCT) owing to its structural and functional similarities with yeast CCT (yCCT). Studies on hCCT and its interactome acquire an additional dimension, as it has been implicated in several disease conditions such as neurodegeneration and cancer. We attempt to study its general stress-response role, as reflected in aspects of human disease and yeast physiology, through computational analysis of the interactome. Towards consolidating and analysing the interactome data, we prepared and compared the unique CCT-interacting protein lists for S. cerevisiae and H. sapiens and performed GO term classification and enrichment studies, which provide information on the diversity of the CCT interactome in terms of the protein classes in the data set. Enrichment with disease-associated proteins and pathways highlights the medical importance of CCT. Different analyses converge in suggesting the significance of WD-repeat proteins, protein kinases and cytoskeletal proteins in the interactome. The prevalence of proteasomal subunits and ribosomal proteins suggests a possible cross-talk between the protein-synthesis, folding and degradation machinery. A network of chaperones and chaperonins that function in combination can also be envisaged from the CCT interactome-Hsp70 interactome analysis.
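The enrichment step in such an interactome analysis is essentially a hypergeometric test: how surprising is the count of term-annotated proteins among the interactors? A minimal sketch (all counts hypothetical):

```python
from scipy.stats import hypergeom

def enrichment_p(k_in_list, list_size, k_in_genome, genome_size):
    """P(X >= k): probability of at least k annotated proteins in the
    CCT-interactor list if interactors were drawn at random from the genome."""
    return hypergeom.sf(k_in_list - 1, genome_size, k_in_genome, list_size)

# e.g. 40 WD-repeat proteins among 300 interactors, when a genome of
# ~6000 genes contains 120 such proteins (illustrative numbers only)
print(enrichment_p(40, 300, 120, 6000))
```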
Sakao, Yukinori; Kuroda, Hiroaki; Mun, Mingyon; Uehara, Hirofumi; Motoi, Noriko; Ishikawa, Yuichi; Nakagawa, Ken; Okumura, Sakae
2014-01-01
Background: We aimed to clarify whether the size of lung adenocarcinoma evaluated in the mediastinal window on computed tomography is an important and useful measure for predicting invasiveness, lymph node metastasis and prognosis in small adenocarcinoma. Methods: We evaluated 176 patients with small lung adenocarcinomas (diameter, 1-3 cm) who underwent standard surgical resection. Tumours were examined using thin-section computed tomography (1.25 mm thick on high-resolution computed tomography), with tumour dimensions evaluated under two settings: lung window and mediastinal window. We also recorded patient age, gender, preoperative nodal status, tumour size, tumour disappearance ratio, preoperative serum carcinoembryonic antigen levels and pathological status (lymphatic vessel, vascular vessel or pleural invasion). Recurrence-free survival was used for prognosis. Results: Lung window, mediastinal window, tumour disappearance ratio and preoperative nodal status were significant predictive factors for recurrence-free survival in univariate analyses. Areas under the receiver operating characteristic curves for recurrence were 0.76, 0.73 and 0.65 for mediastinal window, tumour disappearance ratio and lung window, respectively. Lung window, mediastinal window, tumour disappearance ratio, preoperative serum carcinoembryonic antigen levels and preoperative nodal status were significant predictive factors for lymph node metastasis in univariate analyses; areas under the curves were 0.61, 0.76, 0.72 and 0.66 for lung window, mediastinal window, tumour disappearance ratio and preoperative serum carcinoembryonic antigen levels, respectively. The same five factors were significant for lymphatic vessel, vascular vessel or pleural invasion in univariate analyses; areas under the curves were 0.60, 0.81, 0.81 and 0.65 for lung window, mediastinal window, tumour disappearance ratio and preoperative serum carcinoembryonic antigen levels, respectively. Conclusions: Based on logistic regression and receiver operating characteristic analyses performed for variables with p-values < 0.05 in univariate analyses, our results suggest that measuring tumour size in the mediastinal window on high-resolution computed tomography is a simple and useful preoperative prognostic measure in small adenocarcinoma. PMID:25365326
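The reported discrimination measures are areas under receiver operating characteristic curves for each candidate predictor; computing them is a one-liner per predictor. A sketch with a hypothetical data export (file and column names assumed):

```python
import pandas as pd
from sklearn.metrics import roc_auc_score

df = pd.read_csv("adenocarcinoma_cohort.csv")  # hypothetical file

for predictor in ("lung_window_size", "mediastinal_window_size",
                  "tumour_disappearance_ratio", "cea_level"):
    auc = roc_auc_score(df["recurrence"], df[predictor])
    # orientation: a predictor that decreases with risk yields AUC < 0.5,
    # so report the larger of auc and 1 - auc before comparing predictors
    print(predictor, round(max(auc, 1.0 - auc), 2))
```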
Bringing computational science to the public.
McDonagh, James L; Barker, Daniel; Alderson, Rosanna G
2016-01-01
The increasing use of computers in science allows for the scientific analyses of large datasets at an increasing pace. We provided examples and interactive demonstrations at Dundee Science Centre as part of the 2015 Women in Science festival, to present aspects of computational science to the general public. We used low-cost Raspberry Pi computers to provide hands on experience in computer programming and demonstrated the application of computers to biology. Computer games were used as a means to introduce computers to younger visitors. The success of the event was evaluated by voluntary feedback forms completed by visitors, in conjunction with our own self-evaluation. This work builds on the original work of the 4273π bioinformatics education program of Barker et al. (2013, BMC Bioinform. 14:243). 4273π provides open source education materials in bioinformatics. This work looks at the potential to adapt similar materials for public engagement events. It appears, at least in our small sample of visitors (n = 13), that basic computational science can be conveyed to people of all ages by means of interactive demonstrations. Children as young as five were able to successfully edit simple computer programs with supervision. This was, in many cases, their first experience of computer programming. The feedback is predominantly positive, showing strong support for improving computational science education, but also included suggestions for improvement. Our conclusions are necessarily preliminary. However, feedback forms suggest methods were generally well received among the participants; "Easy to follow. Clear explanation" and "Very easy. Demonstrators were very informative." Our event, held at a local Science Centre in Dundee, demonstrates that computer games and programming activities suitable for young children can be performed alongside a more specialised and applied introduction to computational science for older visitors.
ERIC Educational Resources Information Center
Nakata, Tatsuya
2011-01-01
The present study aims to conduct a comprehensive investigation of flashcard software for learning vocabulary in a second language. Nine flashcard programs were analysed using 17 criteria derived from previous studies on flashcard learning as well as paired-associate learning. Results suggest that in general, most programs have been developed in a…
Choosing a Transformation in Analyses of Insect Counts from Contagious Distributions with Low Means
W.D. Pepper; S.J. Zarnoch; G.L. DeBarr; P. de Groot; C.D. Tangren
1997-01-01
Guidelines based on computer simulation are suggested for choosing a transformation of insect counts from negative binomial distributions with low mean counts and high levels of contagion. Typical values and ranges of negative binomial model parameters were determined by fitting the model to data from 19 entomological field studies. Random sampling of negative binomial...
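The simulation-based approach to choosing a transformation can be sketched directly: generate negative binomial counts at the low means and high contagion seen in field data, and check which transformation best stabilizes the variance. A minimal sketch (parameter values are illustrative, not those fitted to the 19 studies):

```python
import numpy as np

rng = np.random.default_rng(7)

def nb_counts(mean, k, size=20_000):
    """Negative binomial with mean m and dispersion k (variance = m + m**2/k);
    small k corresponds to high contagion."""
    return rng.negative_binomial(k, k / (k + mean), size)

transforms = {"sqrt(x + 0.375)": lambda x: np.sqrt(x + 0.375),
              "log(x + 1)":      lambda x: np.log(x + 1.0)}

for name, f in transforms.items():
    # a good choice keeps variance nearly constant across the mean range
    v = [round(f(nb_counts(m, k=0.5)).var(), 3) for m in (0.5, 1, 2, 4)]
    print(name, v)
```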
Environmental Studies: Mathematical, Computational and Statistical Analyses
1993-03-03
mathematical analysis addresses the seasonally and longitudinally averaged circulation which is under the influence of a steady forcing located asymmetrically...employed, as has been suggested for some situations. A general discussion of how interfacial phenomena influence both the original contamination process...describing the large-scale advective and dispersive behaviour of contaminants transported by groundwater and the uncertainty associated with field-scale
ERIC Educational Resources Information Center
Klein, Davina C. D.; O'Neil, Harold F., Jr.; Dennis, Robert A.; Baker, Eva L.
A cognitive demands analysis of a learning technology, a term that includes the hardware and the computer software products that form learning environments, attempts to describe the types of cognitive learning expected of the individual by the technology. This paper explores the context of cognitive learning, suggesting five families of cognitive…
Quantitative, steady-state properties of Catania's computational model of the operant reserve.
Berg, John P; McDowell, J J
2011-05-01
Catania (2005) found that a computational model of the operant reserve (Skinner, 1938) produced realistic behavior in initial, exploratory analyses. Although Catania's operant reserve computational model demonstrated potential to simulate varied behavioral phenomena, the model was not systematically tested. The current project replicated and extended the Catania model, clarified its capabilities through systematic testing, and determined the extent to which it produces behavior corresponding to matching theory. Significant departures from both classic and modern matching theory were found in behavior generated by the model across all conditions. The results suggest that a simple, dynamic operant model of the reflex reserve does not simulate realistic steady state behavior.
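Correspondence with matching theory is conventionally tested by fitting the generalized matching law, log(B1/B2) = a·log(R1/R2) + log b, to steady-state output; strict matching predicts a = b = 1. A sketch with synthetic response and reinforcer counts (not output of Catania's model):

```python
import numpy as np

# responses (B) and obtained reinforcers (R) on two alternatives,
# one pair per concurrent-schedule condition (synthetic data)
B1 = np.array([820., 600., 410., 230., 120.])
B2 = np.array([130., 240., 400., 610., 800.])
R1 = np.array([54., 40., 27., 13., 6.])
R2 = np.array([6., 14., 26., 41., 53.])

a, log_b = np.polyfit(np.log10(R1 / R2), np.log10(B1 / B2), 1)
print(f"sensitivity a = {a:.2f}, bias b = {10 ** log_b:.2f}")
# a < 1 indicates undermatching; modern matching theory fits this
# generalized equation rather than the classic strict form
```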
Coupled Thermo-Mechanical Analyses of Dynamically Loaded Rubber Cylinders
NASA Technical Reports Server (NTRS)
Johnson, Arthur R.; Chen, Tzi-Kang
2000-01-01
A procedure that models coupled thermo-mechanical deformations of viscoelastic rubber cylinders by employing the ABAQUS finite element code is described. Computational simulations of hysteretic heating are presented for several tall and short rubber cylinders both with and without a steel disk at their centers. The cylinders are compressed axially and are then cyclically loaded about the compressed state. The non-uniform hysteretic heating of the rubber cylinders containing a steel disk is presented. The analyses performed suggest that the coupling procedure should be considered for further development as a design tool for rubber degradation studies.
Ekins, Sean; Olechno, Joe; Williams, Antony J.
2013-01-01
Dispensing and dilution processes may profoundly influence estimates of biological activity of compounds. Published data show Ephrin type-B receptor 4 IC50 values obtained via tip-based serial dilution and dispensing versus acoustic dispensing with direct dilution differ by orders of magnitude with no correlation or ranking of datasets. We generated computational 3D pharmacophores based on data derived by both acoustic and tip-based transfer. The computed pharmacophores differ significantly depending upon dispensing and dilution methods. The acoustic dispensing-derived pharmacophore correctly identified active compounds in a subsequent test set where the tip-based method failed. Data from acoustic dispensing generates a pharmacophore containing two hydrophobic features, one hydrogen bond donor and one hydrogen bond acceptor. This is consistent with X-ray crystallography studies of ligand-protein interactions and automatically generated pharmacophores derived from this structural data. In contrast, the tip-based data suggest a pharmacophore with two hydrogen bond acceptors, one hydrogen bond donor and no hydrophobic features. This pharmacophore is inconsistent with the X-ray crystallographic studies and automatically generated pharmacophores. In short, traditional dispensing processes are another important source of error in high-throughput screening that impacts computational and statistical analyses. These findings have far-reaching implications in biological research. PMID:23658723
Air slab-correction for Γ-ray attenuation measurements
NASA Astrophysics Data System (ADS)
Mann, Kulwinder Singh
2017-12-01
Gamma (γ)-ray shielding behaviour (GSB) of a material can be ascertained from its linear attenuation coefficient (μ, cm-1). Narrow-beam transmission geometry is required for μ-measurement. In such measurements, a thin slab of the material is inserted between the point-isotropic γ-ray source and the detector assembly. Accurate measurement requires that the sample's optical thickness (OT) remain below 0.5 mean free path (mfp). Sometimes it is very difficult to produce a thin slab of a sample (absorber); on the other hand, for a thick absorber, i.e. OT > 0.5 mfp, the influence of the air displaced by it cannot be ignored during μ-measurements. Thus, for a thick sample, a correction factor has been suggested which compensates for the air present in the transmission geometry. The correction factor has been named the air slab-correction (ASC). Six samples of low-Z engineering materials (cement-black, clay, red-mud, lime-stone, cement-white and plaster-of-paris) were selected for investigating the effect of ASC on μ-measurements at three γ-ray energies (661.66, 1173.24, 1332.50 keV). The measurements were made using point-isotropic γ-ray sources (Cs-137 and Co-60), a NaI(Tl) detector and a multi-channel analyser coupled with a personal computer. Theoretical values of μ were computed using the GRIC2 toolkit (a standardized computer programme). Elemental compositions of the samples were measured with a Wavelength Dispersive X-ray Fluorescence (WDXRF) analyser. Inter-comparison of measured and computed μ-values suggested that the application of ASC helps in precise μ-measurement for thick samples of low-Z materials. Thus, this hitherto widely ignored ASC factor is recommended for use in similar γ-ray measurements.
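The underlying measurement is Beer-Lambert attenuation, μ = ln(I0/I)/t. If the reference intensity I0 is recorded with air occupying the sample position, the raw ratio yields μ_sample − μ_air, so one plausible reading of the ASC is simply adding back the attenuation of the displaced air. A sketch under that assumption (the μ_air value is an approximate literature figure, not from this paper):

```python
import numpy as np

def mu_apparent(I0, I, t_cm):
    """Apparent linear attenuation coefficient from narrow-beam counts."""
    return np.log(I0 / I) / t_cm

def mu_air_slab_corrected(I0, I, t_cm, mu_air=9.3e-5):
    """Compensate for the air displaced by a thick slab.

    mu_air ~ 9.3e-5 cm^-1 for 661.66 keV photons in dry sea-level air
    (approximate). For thin slabs (OT < 0.5 mfp) the shift is negligible;
    for thick low-Z slabs it is small but systematic.
    """
    return mu_apparent(I0, I, t_cm) + mu_air

print(mu_air_slab_corrected(I0=152_000, I=61_000, t_cm=6.0))
```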
Mouritsen, H; Larsen, O N
2001-11-01
This paper investigates how young pied flycatchers, Ficedula hypoleuca, and blackcaps, Sylvia atricapilla, interpret and use celestial cues. In order to record these data, we developed a computer-controlled version of the Emlen funnel, which enabled us to make detailed temporal analyses. First, we showed that the birds use a star compass. Then, we tested the birds under a stationary planetarium sky, which simulated the star pattern of the local sky at 02:35 h for 11 consecutive hours of the night, and compared the birds' directional choices as a function of time with the predictions from five alternative stellar orientation hypotheses. The results supported the hypothesis suggesting that birds use a time-independent star compass based on learned geometrical star configurations to pinpoint the rotational point of the starry sky (north). In contrast, neither hypotheses suggesting that birds use the stars for establishing their global position and then perform true star navigation nor those suggesting the use of a time-compensated star compass were supported.
Ferraro, Mauro; Auricchio, Ferdinando; Boatti, Elisa; Scalet, Giulia; Conti, Michele; Morganti, Simone; Reali, Alessandro
2015-01-01
Computer-based simulations are nowadays widely exploited for predicting the mechanical behavior of different biomedical devices. In this respect, structural finite element analyses (FEA) are currently the preferred computational tool to evaluate the stent response under bending. This work aims at developing a computational framework based on linear and higher-order FEA to evaluate the flexibility of self-expandable carotid artery stents. In particular, numerical simulations involving large deformations and inelastic shape memory alloy constitutive modeling are performed, and the results suggest that higher-order FEA allows an accurate representation of the computational domain and a better approximation of the solution, with a greatly reduced number of degrees of freedom with respect to linear FEA. Moreover, when buckling occurs, higher-order FEA is superior at reproducing the nonlinear local effects related to buckling phenomena. PMID:26184329
Some Observations on the Current Status of Performing Finite Element Analyses
NASA Technical Reports Server (NTRS)
Raju, Ivatury S.; Knight, Norman F., Jr; Shivakumar, Kunigal N.
2015-01-01
Aerospace structures are complex high-performance structures. Advances in reliable and efficient computing and modeling tools are enabling analysts to consider complex configurations, build complex finite element models, and perform analyses rapidly. Many of today's early-career engineers are very proficient in the use of modern computers, computing engines, complex software systems, and visualization tools. These young engineers are becoming increasingly efficient in building complex 3D models of complicated aerospace components. However, current trends demonstrate blind acceptance of finite element analysis results. This paper is aimed at raising awareness of this situation. Examples of common encounters are presented. To counter the current trends, some guidelines and suggestions for analysts, senior engineers, and educators are offered.
Data association approaches in bearings-only multi-target tracking
NASA Astrophysics Data System (ADS)
Xu, Benlian; Wang, Zhiquan
2008-03-01
To meet the requirements on computational complexity and correctness of data association in multi-target tracking, two algorithms are suggested in this paper. The proposed Algorithm 1 is developed from a modified version of the dual simplex method, and it has the advantage of a direct and explicit form of the optimal solution. Algorithm 2 is based on the idea of Algorithm 1 and a rotational sort method; it not only retains the advantages of Algorithm 1 but also reduces the computational burden, its complexity being only 1/N times that of Algorithm 1. Finally, numerical analyses are carried out to evaluate the performance of the two data association algorithms.
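The association step itself is a linear assignment problem: pair measurements with tracks so that the total (gated) bearing residual is minimal. The sketch below uses SciPy's Hungarian-algorithm solver rather than the paper's dual simplex variant, and the cost model is an assumption:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def associate(predicted, measured, gate_rad=0.1):
    """Assign bearing measurements (rad) to tracks by minimal total residual."""
    cost = np.abs(np.subtract.outer(predicted, measured))
    cost = np.minimum(cost, 2.0 * np.pi - cost)   # wrap angle differences
    rows, cols = linear_sum_assignment(cost)
    # discard pairings beyond the gate (likely clutter or a new target)
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] < gate_rad]

print(associate(np.array([0.52, 1.31, 2.90]),
                np.array([2.93, 0.50, 1.35])))   # [(0, 1), (1, 2), (2, 0)]
```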
Bhatt, Ishan S; Guthrie, O'neil
2017-06-01
A bilateral audiometric notch (BN) at 4000-6000 Hz was identified as a noise-induced hearing loss (NIHL) phenotype for genetic association analysis in college-aged musicians. This study analysed BN in a sample of US youth. The prevalence of BN within the study sample was determined, and logistic-regression analyses were performed to identify audiologic and other demographic factors associated with BN. Computer-simulated "flat" audiograms were used to estimate the potential influence of false-positive rates on estimates of BN prevalence. 2348 participants (12-19 years) meeting the inclusion criteria were selected from the National Health and Nutrition Examination Survey data (2005-2010). The prevalence of BN was 16.6%, and 55.6% of participants showed a notch in at least one ear. Noise exposure, gender, ethnicity and age showed significant relationships with BN. Computer simulation revealed that 5.5% of simulated participants with "flat" audiograms showed BN. The association of noise exposure with BN suggests that it is a useful NIHL phenotype for genetic association analyses. However, further research is necessary to reduce false-positive rates in notch identification.
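Notch identification from an audiogram reduces to a threshold-pattern rule. The sketch below implements a Niskar-style criterion often used with NHANES audiometry; the exact cut-offs in this study may differ, so treat them as illustrative assumptions:

```python
def has_notch(t):
    """t: dict of pure-tone thresholds (dB HL) by frequency (Hz), one ear.

    Rule sketched: acceptable low-frequency thresholds, a >= 15 dB
    worsening in the 4000-6000 Hz region, and >= 10 dB recovery at 8000 Hz.
    """
    low = max(t[500], t[1000])
    notch = max(t[4000], t[6000])
    return low <= 15 and notch - low >= 15 and notch - t[8000] >= 10

left = {500: 5, 1000: 5, 4000: 30, 6000: 25, 8000: 10}
right = {500: 10, 1000: 5, 4000: 35, 6000: 30, 8000: 15}
print(has_notch(left) and has_notch(right))   # bilateral notch (BN)
```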
Multivoxel neurofeedback selectively modulates confidence without changing perceptual performance
Cortese, Aurelio; Amano, Kaoru; Koizumi, Ai; Kawato, Mitsuo; Lau, Hakwan
2016-01-01
A central controversy in metacognition studies concerns whether subjective confidence directly reflects the reliability of perceptual or cognitive processes, as suggested by normative models based on the assumption that neural computations are generally optimal. This view enjoys popularity in the computational and animal literatures, but it has also been suggested that confidence may depend on a late-stage estimation dissociable from perceptual processes. Yet, at least in humans, experimental tools have lacked the power to resolve these issues convincingly. Here, we overcome this difficulty by using the recently developed method of decoded neurofeedback (DecNef) to systematically manipulate multivoxel correlates of confidence in a frontoparietal network. Here we report that bi-directional changes in confidence do not affect perceptual accuracy. Further psychophysical analyses rule out accounts based on simple shifts in reporting strategy. Our results provide clear neuroscientific evidence for the systematic dissociation between confidence and perceptual performance, and thereby challenge current theoretical thinking. PMID:27976739
ter Haar, E; Day, B W; Rosenkranz, H S
1996-03-09
The computational analysis data presented indicate a significant mechanistic association between the ability of a chemical to cause tubulin polymerization perturbation (TPP), via direct interaction with the protein, and the in vivo induction of micronuclei (MN). Since it is known that TPP is not a genotoxic event, the analyses suggest that the induction of MN by a non-genotoxic mechanism is a significant alternate pathway.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reaugh, J. E.
HE ignition caused by shear localization is the principal concern for safety analyses of postulated mechanical insults to explosive assemblies. Although prompt detonation from shock is certainly a concern, insults that lead to prompt detonation are associated with high velocity, and correspondingly rare. For high-density HMX assemblies, an impact speed (by a steel object) of 400 m/s is needed to develop a detonation in a run distance less than 30 mm. To achieve a steady plane shock, which results in the shortest run distance to detonation for a given peak pressure, the impactor diameter must exceed 60 mm, and thicknessmore » approach 20 mm. Thinner plates and/or smaller diameter ones require even higher impact velocity. Ignitions from shear localization, however, have been observed from impacts less than 50 m/s in Steven tests, less than 30 m/s from spigot impact tests, and less than 10 m/s from various drop tests. This lower velocity range is much frequent in postulated mechanical insults. Preliminary computer simulations and analyses of a variety of such tests have suggested that although each is accompanied by shear localization, there are differing detailed mechanisms at work that cause the ignitions. We identify those mechanisms that may be at work in a variety of such tests, and suggest how models of shear ignition, such as HERMES, may be revised and calibrated to conform to experiment. We suggest combining additional experiments with computer simulations and model development to begin confirm or uncover mechanisms that may be at work in a specific postulated event.« less
Developing an item bank and short forms that assess the impact of asthma on quality of life.
Stucky, Brian D; Edelen, Maria Orlando; Sherbourne, Cathy D; Eberhart, Nicole K; Lara, Marielena
2014-02-01
The present work describes the process of developing an item bank and short forms that measure the impact of asthma on quality of life (QoL) that avoids confounding QoL with asthma symptomatology and functional impairment. Using a diverse national sample of adults with asthma (N = 2032) we conducted exploratory and confirmatory factor analyses, and item response theory and differential item functioning analyses to develop a 65-item unidimensional item bank and separate short form assessments. A psychometric evaluation of the RAND Impact of Asthma on QoL item bank (RAND-IAQL) suggests that though the concept of asthma impact on QoL is multi-faceted, it may be measured as a single underlying construct. The performance of the bank was then evaluated with a real-data simulated computer adaptive test. From the RAND-IAQL item bank we then developed two short forms consisting of 4 and 12 items (reliability = 0.86 and 0.93, respectively). A real-data simulated computer adaptive test suggests that as few as 4-5 items from the bank are needed to obtain highly precise scores. Preliminary validity results indicate that the RAND-IAQL measures distinguish between levels of asthma control. To measure the impact of asthma on QoL, users of these items may choose from two highly reliable short forms, computer adaptive test administration, or content-specific subsets of items from the bank tailored to their specific needs.
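Computer adaptive administration from such a bank repeatedly gives the item carrying the most Fisher information at the respondent's current score, which is why a handful of items can approach the precision of the full bank. A simplified sketch using 2PL items (the RAND-IAQL items are polytomous, and all parameter values here are hypothetical):

```python
import numpy as np

def item_information(theta, a, b):
    """Fisher information of a 2PL item at impact level theta."""
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    return a ** 2 * p * (1.0 - p)

def next_item(theta, bank, administered):
    """Pick the most informative not-yet-administered item."""
    remaining = [i for i in range(len(bank)) if i not in administered]
    return max(remaining, key=lambda i: item_information(theta, *bank[i]))

bank = [(1.8, -0.5), (2.4, 0.0), (1.2, 0.7), (2.9, 0.3)]   # (a, b) pairs
print(next_item(theta=0.2, bank=bank, administered={1}))    # -> 3
```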
Tiwari, Sameeksha; Awasthi, Manika; Singh, Swati; Pandey, Veda P; Dwivedi, Upendra N
2017-10-23
Protein-protein interactions (PPI) are an emerging class of novel therapeutic targets. In order to probe these interactions, computational tools provide a convenient and quick route towards the development of therapeutics. Keeping this in view, the present study was initiated to analyse the interaction of the tumour suppressor protein p53 (TP53) with the breast cancer associated protein (BRCA1) as a promising target against breast cancer. Using computational approaches such as protein-protein docking, hot spot analyses, molecular docking and molecular dynamics simulation (MDS), stepwise analyses of the interactions of wild-type and mutant TP53 with wild-type BRCA1, and their modulation by alkaloids, were performed. Protein-protein docking was used to generate both wild-type and mutant TP53-BRCA1 complexes. Subsequently, the complexes were docked with sixteen different alkaloids fulfilling ADMET and Lipinski's rule-of-five criteria, and the results were compared with those for a well-known inhibitor of PPI, namely nutlin. The alkaloid dicentrine was found to dock best among all the alkaloids tested, as well as better than nutlin. Furthermore, MDS analyses of both wild-type and mutant complexes with the best-docked alkaloid, dicentrine, revealed higher stability of the mutant complex than of the wild-type one in terms of average RMSD, RMSF and binding free energy, corroborating the docking results. The results suggested a more pronounced interaction of BRCA1 with mutant TP53, leading to increased expression of mutated TP53, thus showing a dominant-negative gain of function and hampering wild-type TP53 function, leading to tumour progression.
Numerical Taxonomy of Some Bacteria Isolated from Antarctic and Tropical Seawaters
Pfister, Robert M.; Burkholder, Paul R.
1965-01-01
Pfister, Robert M. (Lamont Geological Observatory, Palisades, N.Y.), and Paul R. Burkholder. Numerical taxonomy of some bacteria isolated from Antarctic and tropical seawaters. J. Bacteriol. 90:863–872. 1965.—Microorganisms from Antarctic seas and from tropical waters near Puerto Rico were examined with a series of morphological, physiological, and biochemical tests. The results of these analyses were coded on punch cards, and similarity matrices were computed with a program for an IBM 1620 computer. When the matrix was reordered by use of the single-linkage technique, and the results were plotted with four symbols for different per cent similarity ranges, nine groups of microorganisms were revealed. The data suggest that organisms occurring in different areas of the open ocean may be profitably studied with standardized computer techniques. PMID:5847807
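The pipeline described (coded characters, then a similarity matrix, then single-linkage reordering) maps directly onto modern tools. A minimal sketch using the simple matching coefficient over binary test results (data hypothetical):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# rows = isolates, columns = coded morphological/biochemical tests (1 = positive)
tests = np.array([[1, 1, 0, 1, 0],
                  [1, 1, 0, 1, 1],
                  [0, 0, 1, 0, 1],
                  [0, 1, 1, 0, 1]])

n = len(tests)
sim = np.array([[np.mean(tests[i] == tests[j]) for j in range(n)]
                for i in range(n)])          # simple matching coefficient

# single-linkage clustering on the condensed distance vector 1 - sim
condensed = [1.0 - sim[i, j] for i in range(n) for j in range(i + 1, n)]
groups = fcluster(linkage(condensed, method="single"),
                  t=0.4, criterion="distance")
print(np.round(100 * sim).astype(int))       # per cent similarity matrix
print(groups)                                # cluster label per isolate
```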
Discrete-Roughness-Element-Enhanced Swept-Wing Natural Laminar Flow at High Reynolds Numbers
NASA Technical Reports Server (NTRS)
Malik, Mujeeb; Liao, Wei; Li, Fei; Choudhari, Meelan
2015-01-01
Nonlinear parabolized stability equations and secondary-instability analyses are used to provide a computational assessment of the potential use of the discrete-roughness-element technology for extending swept-wing natural laminar flow at chord Reynolds numbers relevant to transport aircraft. Computations performed for the boundary layer on a natural-laminar-flow airfoil with a leading-edge sweep angle of 34.6 deg, freestream Mach number of 0.75, and chord Reynolds numbers of 17 × 10^6, 24 × 10^6, and 30 × 10^6 suggest that discrete roughness elements could delay laminar-turbulent transition by about 20% when transition is caused by stationary crossflow disturbances. Computations show that the introduction of small-wavelength stationary crossflow disturbances (i.e., discrete roughness elements) also suppresses the growth of the most amplified traveling crossflow disturbances.
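Stability computations of this kind feed transition prediction through the e^N method: the amplification factor N(x) = ∫ σ dx of the dominant disturbance is integrated along the chord, and transition is placed where N reaches a calibrated critical value. A toy sketch (the growth-rate curve and N_crit = 9 are synthetic placeholders, not this paper's results):

```python
import numpy as np

def transition_location(x, sigma, n_crit=9.0):
    """x: chordwise stations (x/c); sigma: local spatial growth rate of the
    dominant crossflow mode. Returns the station where N first reaches n_crit."""
    N = np.concatenate(([0.0],
        np.cumsum(0.5 * (sigma[1:] + sigma[:-1]) * np.diff(x))))
    idx = np.searchsorted(N, n_crit)
    return x[min(idx, len(x) - 1)]

x = np.linspace(0.0, 1.0, 400)
sigma = 28.0 * np.exp(-((x - 0.35) / 0.25) ** 2)   # synthetic growth rates
print(transition_location(x, sigma))
# DRE works by suppressing the dominant mode's growth (smaller sigma),
# which moves the N = n_crit crossing -- i.e., transition -- aft.
```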
DRE-Enhanced Swept-Wing Natural Laminar Flow at High Reynolds Numbers
NASA Technical Reports Server (NTRS)
Malik, Mujeeb; Liao, Wei; Li, Fe; Choudhari, Meelan
2013-01-01
Nonlinear parabolized stability equations and secondary instability analyses are used to provide a computational assessment of the potential use of the discrete roughness elements (DRE) technology for extending swept-wing natural laminar flow at chord Reynolds numbers relevant to transport aircraft. Computations performed for the boundary layer on a natural laminar flow airfoil with a leading-edge sweep angle of 34.6 deg, free-stream Mach number of 0.75 and chord Reynolds numbers of 17 × 10^6, 24 × 10^6 and 30 × 10^6 suggest that DRE could delay laminar-turbulent transition by about 20% when transition is caused by stationary crossflow disturbances. Computations show that the introduction of small-wavelength stationary crossflow disturbances (i.e., DRE) also suppresses the growth of the most amplified traveling crossflow disturbances.
Hofstadter-Duke, Kristi L; Daly, Edward J
2015-03-01
This study investigated a method for conducting experimental analyses of academic responding. In the experimental analyses, academic responding (math computation), rather than problem behavior, was reinforced across conditions. Two separate experimental analyses (one with fluent math computation problems and one with non-fluent math computation problems) were conducted with three elementary school children using identical contingencies while math computation rate was measured. Results indicate that the experimental analysis with non-fluent problems produced undifferentiated responding across participants; however, differentiated responding was achieved for all participants in the experimental analysis with fluent problems. A subsequent comparison of the single-most effective condition from the experimental analyses replicated the findings with novel computation problems. Results are discussed in terms of the critical role of stimulus control in identifying controlling consequences for academic deficits, and recommendations for future research refining and extending experimental analysis to academic responding are made.
From systems biology to dynamical neuropharmacology: proposal for a new methodology.
Erdi, P; Kiss, T; Tóth, J; Ujfalussy, B; Zalányi, L
2006-07-01
The concepts and methods of systems biology are extended to neuropharmacology in order to test and design drugs for the treatment of neurological and psychiatric disorders. Computational modelling that integrates compartmental neural modelling techniques with detailed kinetic descriptions of the pharmacological modulation of transmitter-receptor interaction is offered as a method to test the electrophysiological and behavioural effects of putative drugs. Moreover, an inverse method is suggested for controlling a neural system to realise a prescribed temporal pattern. In particular, as an application of the proposed new methodology, a computational platform is offered to analyse the generation and pharmacological modulation of the theta rhythm related to anxiety.
The Galaxy platform for accessible, reproducible and collaborative biomedical analyses: 2016 update
Afgan, Enis; Baker, Dannon; van den Beek, Marius; Blankenberg, Daniel; Bouvier, Dave; Čech, Martin; Chilton, John; Clements, Dave; Coraor, Nate; Eberhard, Carl; Grüning, Björn; Guerler, Aysam; Hillman-Jackson, Jennifer; Von Kuster, Greg; Rasche, Eric; Soranzo, Nicola; Turaga, Nitesh; Taylor, James; Nekrutenko, Anton; Goecks, Jeremy
2016-01-01
High-throughput data production technologies, particularly ‘next-generation’ DNA sequencing, have ushered in widespread and disruptive changes to biomedical research. Making sense of the large datasets produced by these technologies requires sophisticated statistical and computational methods, as well as substantial computational power. This has led to an acute crisis in the life sciences, as researchers without informatics training attempt to perform computation-dependent analyses. Since 2005, the Galaxy project has worked to address this problem by providing a framework that makes advanced computational tools usable by non-experts. Galaxy seeks to make data-intensive research more accessible, transparent and reproducible by providing a Web-based environment in which users can perform computational analyses and have all of the details automatically tracked for later inspection, publication, or reuse. In this report we highlight recently added features enabling biomedical analyses on a large scale. PMID:27137889
Tracking brain states under general anesthesia by using global coherence analysis.
Cimenser, Aylin; Purdon, Patrick L; Pierce, Eric T; Walsh, John L; Salazar-Gomez, Andres F; Harrell, Priscilla G; Tavares-Stoeckel, Casie; Habeeb, Kathleen; Brown, Emery N
2011-05-24
Time and frequency domain analyses of scalp EEG recordings are widely used to track changes in brain states under general anesthesia. Although these analyses have suggested that different spatial patterns are associated with changes in the state of general anesthesia, the extent to which these patterns are spatially coordinated has not been systematically characterized. Global coherence, the ratio of the largest eigenvalue to the sum of the eigenvalues of the cross-spectral matrix at a given frequency and time, has been used to analyze the spatiotemporal dynamics of multivariate time series. Using 64-lead EEG recorded from human subjects receiving computer-controlled infusions of the anesthetic propofol, we used surface Laplacian referencing combined with spectral and global coherence analyses to track the spatiotemporal dynamics of the brain's anesthetic state. During unconsciousness, the spectrograms in the frontal leads showed increasing α (8-12 Hz) and δ (0-4 Hz) power, and in the occipital leads δ power greater than α power. The global coherence detected strong coordinated α activity in the occipital leads in the awake state that shifted to the frontal leads during unconsciousness. It revealed a lack of coordinated δ activity during both the awake and unconscious states. Although strong frontal power during general anesthesia-induced unconsciousness (termed anteriorization) is well known, its possible association with strong α-range global coherence suggests highly coordinated spatial activity. Our findings suggest that combined spectral and global coherence analyses may offer a new approach to tracking brain states under general anesthesia.
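Global coherence as defined here is straightforward to compute: estimate the cross-spectral matrix across channels at the frequency of interest, then take the largest-eigenvalue fraction. A minimal sketch (windowing choices are assumptions):

```python
import numpy as np
from scipy.signal import csd

def global_coherence(eeg, fs, f0, nperseg=1024):
    """Largest eigenvalue / sum of eigenvalues of the cross-spectral matrix
    at frequency f0. eeg: (n_channels, n_samples); slide this over
    successive segments to track the statistic through time."""
    n = eeg.shape[0]
    S = np.empty((n, n), dtype=complex)
    for i in range(n):
        for j in range(n):
            f, Pij = csd(eeg[i], eeg[j], fs=fs, nperseg=nperseg)
            S[i, j] = Pij[np.argmin(np.abs(f - f0))]
    ev = np.linalg.eigvalsh(0.5 * (S + S.conj().T))   # Hermitian part
    return float(ev[-1] / ev.sum())

# values near 1 indicate that one spatial pattern dominates activity at f0
# (e.g., coordinated frontal alpha during propofol unconsciousness)
```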
NASA Astrophysics Data System (ADS)
Saighi, Ouafa; Salah Zerouala, Mohamed
2017-12-01
This paper deals particularly with the way computer tools are used by students in their design studio projects. Four institutions of architecture education in Algeria are considered as a case study to evaluate the impact of such tools on the students' design process, with the aim of inspecting such use in depth and sorting out its advantages and shortcomings in order to suggest some solutions. A field survey was undertaken on a sample of students and their teachers at the same institutions. The results mainly show that computer tools are heavily focused on improving the quality of drawing representation and images, seeking observers' satisfaction and hence influencing their decisions. Some teachers are not very keen on overuse of the computer during the design phase; they prefer the “traditional” approach. This is the situation currently facing Algerian universities, and it leads to conflict and disagreement between students and teachers. Meanwhile, there is no doubt that computer tools have effectively contributed to improving the level of competition among students.
Chronic Exposure to Methamphetamine Disrupts Reinforcement-Based Decision Making in Rats.
Groman, Stephanie M; Rich, Katherine M; Smith, Nathaniel J; Lee, Daeyeol; Taylor, Jane R
2018-03-01
The persistent use of psychostimulant drugs, despite the detrimental outcomes associated with continued drug use, may be because of disruptions in reinforcement-learning processes that enable behavior to remain flexible and goal directed in dynamic environments. To identify the reinforcement-learning processes that are affected by chronic exposure to the psychostimulant methamphetamine (MA), the current study sought to use computational and biochemical analyses to characterize decision-making processes, assessed by probabilistic reversal learning, in rats before and after they were exposed to an escalating dose regimen of MA (or saline control). The ability of rats to use flexible and adaptive decision-making strategies following changes in stimulus-reward contingencies was significantly impaired following exposure to MA. Computational analyses of parameters that track choice and outcome behavior indicated that exposure to MA significantly impaired the ability of rats to use negative outcomes effectively. These MA-induced changes in decision making were similar to those observed in rats following administration of a dopamine D2/3 receptor antagonist. These data use computational models to provide insight into drug-induced maladaptive decision making that may ultimately identify novel targets for the treatment of psychostimulant addiction. We suggest that the disruption in utilization of negative outcomes to adaptively guide dynamic decision making is a new behavioral mechanism by which MA rigidly biases choice behavior.
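The abstract does not give the model equations; a common computational approach to probabilistic reversal learning, consistent with the description above, is a Q-learning model with separate learning rates for positive and negative prediction errors, in which a blunted negative learning rate reproduces impaired use of negative outcomes. A minimal sketch, with all parameter values and the task structure invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(q, beta):
    p = np.exp(beta * (q - q.max()))
    return p / p.sum()

def run_bandit(alpha_pos, alpha_neg, beta=3.0, n_trials=200):
    """Two-armed probabilistic reversal task with asymmetric learning
    rates; a small alpha_neg mimics impaired use of negative outcomes."""
    q = np.zeros(2)
    p_reward = np.array([0.8, 0.2])           # reversed halfway through
    correct = 0
    for t in range(n_trials):
        if t == n_trials // 2:
            p_reward = p_reward[::-1]          # contingency reversal
        choice = rng.choice(2, p=softmax(q, beta))
        reward = float(rng.random() < p_reward[choice])
        delta = reward - q[choice]             # prediction error
        q[choice] += (alpha_pos if delta >= 0 else alpha_neg) * delta
        correct += choice == np.argmax(p_reward)
    return correct / n_trials

print(run_bandit(alpha_pos=0.3, alpha_neg=0.3))   # intact learner
print(run_bandit(alpha_pos=0.3, alpha_neg=0.02))  # blunted negative updates
```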
Combaz, Adrien; Van Hulle, Marc M
2015-01-01
We study the feasibility of a hybrid Brain-Computer Interface (BCI) combining simultaneous visual oddball and Steady-State Visually Evoked Potential (SSVEP) paradigms, where both types of stimuli are superimposed on a computer screen. Potentially, such a combination could result in a system able to operate faster than a purely P300-based BCI and to encode more targets than a purely SSVEP-based BCI. We analyse the interactions between the brain responses of the two paradigms, and assess the possibility of detecting simultaneously the brain activity evoked by both paradigms, in a series of three experiments where EEG data are analysed offline. Despite differences in the shape of the P300 response between the pure oddball and hybrid conditions, we observe that the classification accuracy of this P300 response is not affected by the SSVEP stimulation. Nor do we observe any effect of the oddball stimulation on the power of the SSVEP response at the stimulation frequency. Finally, results from the last experiment show the possibility of detecting both types of brain responses simultaneously, and suggest not only the feasibility of such a hybrid BCI but also a gain over pure oddball- and pure SSVEP-based BCIs in terms of communication rate.
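As one concrete illustration of the SSVEP side of such an analysis, the power of the EEG at the stimulation frequency can be estimated with a Welch periodogram. This is a generic sketch, not the authors' pipeline; the sampling rate, stimulation frequency, and simulated signal are assumptions:

```python
import numpy as np
from scipy.signal import welch

fs, f_stim = 256, 15.0                       # illustrative values
t = np.arange(0, 8, 1 / fs)
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * f_stim * t) + rng.standard_normal(t.size)  # SSVEP + noise

f, pxx = welch(x, fs=fs, nperseg=1024)
print(pxx[np.argmin(np.abs(f - f_stim))])    # power at the stimulation frequency
```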
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Sanetrik, Mark D.; Chwalowski, Pawel; Connolly, Joseph; Kopasakis, George
2016-01-01
An overview of recent applications of the FUN3D CFD code to computational aeroelastic, sonic boom, and aeropropulsoservoelasticity (APSE) analyses of a low-boom supersonic configuration is presented. The overview includes details of the computational models developed, including multiple unstructured CFD grids suitable for aeroelastic and sonic boom analyses. In addition, aeroelastic Reduced-Order Models (ROMs) are generated and used to rapidly compute the aeroelastic response and flutter boundaries at multiple flight conditions.
Nair, Pradeep S; John, Eugene B
2007-01-01
Aligning specific sequences against a very large number of other sequences is a central aspect of bioinformatics. With the widespread availability of personal computers in biology laboratories, sequence alignment is now often performed locally. This makes it necessary to analyse the performance of personal computers on sequence-alignment bioinformatics benchmarks. In this paper, we analyse the performance of a personal computer for the popular BLAST and FASTA sequence alignment suites. Results indicate that these benchmarks have a large number of recurring operations and use memory operations extensively. The results suggest that performance could be improved with a larger L1 cache.
Effect of Stitching on Debonding in Composite Structural Elements
NASA Technical Reports Server (NTRS)
Raju, I. S.; Glaessgen, E. H.
2001-01-01
Stitched multiaxial warp knit materials have been suggested as viable alternatives to laminated prepreg materials for large aircraft structures such as wing skins. Analyses have been developed to quantify the effectiveness of stitching for reducing strain energy release rates in skin-stiffener debond, lap joint and sandwich debond configurations. Strain energy release rates were computed using the virtual crack closure technique. In all configurations, the stitches were shown to significantly reduce the strain energy release rate.
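For reference, in its simplest two-dimensional form the virtual crack closure technique estimates the mode I and mode II strain energy release rates from the nodal forces at the crack tip and the relative displacements of the node pair just behind it (the textbook Rybicki-Kanninen form, per unit width, with crack-tip element length Δa; the report's actual stitched, three-dimensional formulation is more involved):

```latex
G_I = \frac{Y\,\Delta v}{2\,\Delta a}, \qquad G_{II} = \frac{X\,\Delta u}{2\,\Delta a}
```

where $Y$ and $X$ are the opening and shear nodal forces at the crack tip, and $\Delta v$ and $\Delta u$ are the corresponding relative opening and sliding displacements of the node pair behind the tip.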
The Structure of Medical Informatics Journal Literature
Morris, Theodore A.; McCain, Katherine W.
1998-01-01
Objective: Medical informatics is an emergent interdisciplinary field described as drawing upon and contributing to both the health sciences and information sciences. The authors elucidate the disciplinary nature and internal structure of the field. Design: To better understand the field's disciplinary nature, the authors examined the intercitation relationships of its journal literature. To determine its internal structure, they examined its journal cocitation patterns. Measurements: The authors used data from the Science Citation Index (SCI) and Social Science Citation Index (SSCI) to perform intercitation studies among productive journal titles, and software routines from SPSS to perform multivariate data analyses on cocitation data for proposed core journals. Results: Intercitation network analysis suggests that a core literature exists, one mark of a separate discipline. Multivariate analyses of cocitation data suggest that major focus areas within the field include biomedical engineering, biomedical computing, decision support, and education. The interpretable dimensions of multidimensional scaling maps differed for the SCI and SSCI data sets. Strong links to information science literature were not found. Conclusion: The authors saw indications of a core literature and of several major research fronts. The field appears to be viewed differently by authors writing in journals indexed by SCI from those writing in journals indexed by SSCI, with more emphasis placed on computers and engineering versus decision making by the former and more emphasis on theory versus application (clinical practice) by the latter. PMID:9760393
Zhao, Min; Wang, Qingguo; Wang, Quan; Jia, Peilin; Zhao, Zhongming
2013-01-01
Copy number variation (CNV) is a prevalent form of critical genetic variation that leads to an abnormal number of copies of large genomic regions in a cell. Microarray-based comparative genome hybridization (arrayCGH) and genotyping arrays were the standard technologies for detecting large regions subject to copy number changes in genomes until, recently, high-resolution sequence data could be analyzed by next-generation sequencing (NGS). During the last several years, NGS-based analysis has been widely applied to identify CNVs in both healthy and diseased individuals. Correspondingly, the strong demand for NGS-based CNV analyses has fuelled development of numerous computational methods and tools for CNV detection. In this article, we review the recent advances in computational methods pertaining to CNV detection using whole genome and whole exome sequencing data. Additionally, we discuss their strengths and weaknesses and suggest directions for future development.
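Many of the NGS-based CNV callers in this literature are built on a read-depth signal: per-base coverage is binned into windows and compared, on a log2 scale, with a genome-wide baseline. A minimal sketch of that core signal only (window size and simulated coverage are illustrative; real callers add GC correction, segmentation, and statistical calling):

```python
import numpy as np

def depth_log2_ratio(depth, window=1000):
    """Window the per-base read depth and compute the log2 ratio against
    the genome-wide median, the core signal of read-depth CNV callers.
    A +1 pseudocount guards against empty windows."""
    n = len(depth) // window * window
    win = depth[:n].reshape(-1, window).mean(axis=1)
    base = np.median(win)
    return np.log2((win + 1.0) / (base + 1.0))

depth = np.random.default_rng(0).poisson(30, 1_000_000).astype(float)
depth[400_000:450_000] *= 2                       # simulated duplication
print(depth_log2_ratio(depth).round(2)[395:455])  # ~0 outside, ~1 inside the event
```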
Multi-Scale Modeling to Improve Single-Molecule, Single-Cell Experiments
NASA Astrophysics Data System (ADS)
Munsky, Brian; Shepherd, Douglas
2014-03-01
Single-cell, single-molecule experiments are producing an unprecedented amount of data to capture the dynamics of biological systems. When integrated with computational models, observations of spatial, temporal and stochastic fluctuations can yield powerful quantitative insight. We concentrate on experiments that localize and count individual molecules of mRNA. These high-precision experiments carry large imaging and computational processing costs, and we explore how improved computational analyses can dramatically reduce overall data requirements. In particular, we show how analyses of spatial, temporal and stochastic fluctuations can significantly enhance parameter estimation results for small, noisy data sets. We also show how full probability distribution analyses can constrain parameters with far less data than bulk analyses or statistical moment closures. Finally, we discuss how a systematic modeling progression from simple to more complex analyses can reduce total computational costs by orders of magnitude. We illustrate our approach using single-molecule, spatial mRNA measurements of Interleukin 1-alpha mRNA induction in human THP1 cells following stimulation. Our approach could improve the effectiveness of single-molecule gene regulation analyses for many other processes.
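To make the contrast with bulk analyses concrete, consider the simplest case the full-distribution approach applies to: for a constitutively expressed gene with transcription rate k and degradation rate γ, the steady-state mRNA copy-number distribution is Poisson with mean k/γ, so only that ratio is identifiable from snapshot counts. A hedged sketch of a full-likelihood fit to simulated single-cell counts (the authors' models are considerably richer):

```python
import numpy as np
from scipy.stats import poisson
from scipy.optimize import minimize

rng = np.random.default_rng(0)
counts = rng.poisson(8.0, size=500)   # stand-in smFISH mRNA counts per cell

def nll(log_mu):
    """Negative log-likelihood of the full count distribution; the
    steady state of a constitutive birth-death gene is Poisson(k/gamma)."""
    return -poisson.logpmf(counts, mu=np.exp(log_mu)).sum()

fit = minimize(nll, x0=[np.log(counts.mean() + 1)])
print(np.exp(fit.x))   # estimated mean burden k/gamma, near 8
```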
Liu, Danping; Yeung, Edwina H; McLain, Alexander C; Xie, Yunlong; Buck Louis, Germaine M; Sundaram, Rajeshwari
2017-09-01
Imperfect follow-up in longitudinal studies commonly leads to missing outcome data that can potentially bias the inference when the missingness is nonignorable; that is, the propensity of missingness depends on missing values in the data. In the Upstate KIDS Study, we seek to determine if the missingness of child development outcomes is nonignorable, and how a simple model assuming ignorable missingness would compare with more complicated models for a nonignorable mechanism. To correct for nonignorable missingness, the shared random effects model (SREM) jointly models the outcome and the missing mechanism. However, the computational complexity and lack of software packages has limited its practical applications. This paper proposes a novel two-step approach to handle nonignorable missing outcomes in generalized linear mixed models. We first analyse the missing mechanism with a generalized linear mixed model and predict values of the random effects; then, the outcome model is fitted adjusting for the predicted random effects to account for heterogeneity in the missingness propensity. Extensive simulation studies suggest that the proposed method is a reliable approximation to SREM, with a much faster computation. The nonignorability of missing data in the Upstate KIDS Study is estimated to be mild to moderate, and the analyses using the two-step approach or SREM are similar to the model assuming ignorable missingness. The two-step approach is a computationally straightforward method that can be conducted as sensitivity analyses in longitudinal studies to examine violations to the ignorable missingness assumption and the implications relative to health outcomes. © 2017 John Wiley & Sons Ltd.
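The paper's own implementation is not reproduced here; the following is a rough, self-contained sketch of the two-step idea using statsmodels, with simulated data and a linear-probability mixed model standing in for the missingness GLMM (statsmodels lacks a simple frequentist GLMM), so all names, formulas, and values are illustrative:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n, t = 200, 4
df = pd.DataFrame({"id": np.repeat(np.arange(n), t),
                   "visit": np.tile(np.arange(t), n)})
u = rng.normal(size=n)                                   # shared random effect
df["y"] = 1.0 + 0.5 * df["visit"] + u[df["id"]] + rng.normal(size=len(df))
p_miss = 1 / (1 + np.exp(1.5 - u[df["id"]]))             # propensity depends on u
df.loc[rng.random(len(df)) < p_miss, "y"] = np.nan
df["missing"] = df["y"].isna().astype(float)

# Step 1: mixed model for the missingness indicator (linear-probability
# approximation); extract each subject's predicted random intercept.
m1 = smf.mixedlm("missing ~ visit", df, groups="id").fit()
re = pd.Series({g: v.iloc[0] for g, v in m1.random_effects.items()}, name="re_miss")
df = df.join(re, on="id")

# Step 2: outcome model adjusted for the predicted random effect, which
# proxies each subject's missingness propensity.
m2 = smf.mixedlm("y ~ visit + re_miss", df.dropna(subset=["y"]), groups="id").fit()
print(m2.params)
```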
Méjean, Caroline; Andreeva, Valentina A; Kesse-Guyot, Emmanuelle; Fassier, Philippine; Galan, Pilar; Hercberg, Serge; Touvier, Mathilde
2015-01-01
Background In spite of the growing literature in the field of e-epidemiology, clear evidence about computer literacy or attitudes toward respondent burden among e-cohort participants is largely lacking. Objective We assessed the computer and Internet skills of participants in the NutriNet-Santé Web-based cohort. We then explored attitudes toward the study demands/respondent burden according to levels of computer literacy and sociodemographic status. Methods Self-reported data from 43,028 e-cohort participants were collected in 2013 via a Web-based questionnaire. We employed unconditional logistic and linear regression analyses. Results Approximately one-quarter of participants (23.79%, 10,235/43,028) reported being inexperienced in terms of computer use. Regarding attitudes toward participant burden, women tended to be more favorable (eg, “The overall website use is easy”) than were men (OR 0.65, 95% CI 0.59-0.71, P<.001), whereas better educated participants (>12 years of schooling) were less likely to accept the demands associated with participation (eg, “I receive questionnaires too often”) compared to their less educated counterparts (OR 1.62, 95% CI 1.48-1.76, P<.001). Conclusions A substantial proportion of participants had low computer/Internet skills yet still took part, suggesting that limited skills do not represent a barrier to participation in Web-based cohorts. Our study also suggests that several subgroups of participants with lower computer skills (eg, women or those with lower educational level) might more readily accept the demands associated with participation in the Web cohort. These findings can help guide future Web-based research strategies. PMID:25648178
NASA Astrophysics Data System (ADS)
Jonker, C. M.; Snoep, J. L.; Treur, J.; Westerhoff, H. V.; Wijngaards, W. C. A.
Within the areas of Computational Organisation Theory and Artificial Intelligence, techniques have been developed to simulate and analyse dynamics within organisations in society. Usually these modelling techniques are applied to factories and to the internal organisation of their process flows, thus obtaining models of complex organisations at various levels of aggregation. The dynamics in living cells are often interpreted in terms of well-organised processes, a bacterium being considered a (micro)factory. This suggests that organisation modelling techniques may also benefit their analysis. Using the example of Escherichia coli, it is shown how agent-based organisational modelling techniques can indeed be used to simulate and analyse E. coli's intracellular dynamics. Exploiting the abstraction levels entailed by this perspective, a concise model is obtained that is readily simulated and analysed at the various levels of aggregation, yet shows the cell's essential dynamic patterns.
ResidPlots-2: Computer Software for IRT Graphical Residual Analyses
ERIC Educational Resources Information Center
Liang, Tie; Han, Kyung T.; Hambleton, Ronald K.
2009-01-01
This article discusses ResidPlots-2, computer software that provides a powerful tool for IRT graphical residual analyses. ResidPlots-2 consists of two components: a component for computing residual statistics and another component for communicating with users and for plotting the residual graphs. The features of the ResidPlots-2 software are…
Blomberg, S
2000-11-01
Currently available programs for the comparative analysis of phylogenetic data do not perform optimally when the phylogeny is not completely specified (i.e. the phylogeny contains polytomies). Recent literature suggests that a better way to analyse the data would be to create random trees from the known phylogeny that are fully resolved but consistent with the known tree. A computer program, Fels-Rand, is presented that performs such analyses. A randomisation procedure is used to generate trees that are fully resolved but whose structure is consistent with the original tree. Statistics are then calculated on a large number of these randomly generated trees. Fels-Rand uses the object-oriented features of Xlisp-Stat to manipulate internal tree representations. Xlisp-Stat's dynamic graphing features are used to provide heuristic tools to aid in analysis, particularly outlier analysis. The usefulness of Xlisp-Stat as a system for phylogenetic computation is discussed. Available from the author or at http://www.uq.edu.au/~ansblomb/Fels-Rand.sit.hqx. Xlisp-Stat is available from http://stat.umn.edu/~luke/xls/xlsinfo/xlsinfo.html. s.blomberg@abdn.ac.uk
Isoflurane increases cardiorespiratory coordination in rats
NASA Astrophysics Data System (ADS)
Kabir, Muammar M.; Beig, Mirza I.; Nalivaiko, Eugene; Abbott, Derek; Baumert, Mathias
2008-12-01
Anesthetics such as isoflurane adversely affect heart rate. In this study we analysed the interaction between heart rhythm and respiration at different concentrations of isoflurane and different ventilation rates. In two rats, the electrocardiogram (ECG) and respiratory signals were recorded under the influence of isoflurane. For the assessment of cardiorespiratory coordination, we analysed the phase locking between heart rate, computed from the R-R intervals of the body surface ECG, and respiratory rate, computed from impedance changes, using the Hilbert transform. The changes in heart rate, percentage of synchronization and duration of synchronized epochs at different isoflurane concentrations and ventilation rates were assessed using a linear regression model. From this study it appears that the amount of phase locking between cardiac and respiratory rates increases with the concentration of isoflurane. Heart rate and duration of synchronized epochs increased significantly with the level of isoflurane concentration, while respiratory rate was not significantly affected. Cardiorespiratory coordination also showed a considerable increase at ventilation rates of 50-55 cpm in both rats, suggesting that the phase locking between the cardiac and respiratory oscillators can be increased by breathing at a particular respiratory frequency.
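Hilbert-transform phase locking of the kind described above is commonly quantified with an n:m phase-locking index. The sketch below is generic, not the authors' code, and the two sinusoids are invented stand-ins for the cardiac and respiratory signals:

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_index(x, y, n=1, m=1):
    """n:m phase-locking index between two signals via the Hilbert
    transform; returns a value in [0, 1], where 1 is perfect locking."""
    phi_x = np.angle(hilbert(x - x.mean()))
    phi_y = np.angle(hilbert(y - y.mean()))
    return np.abs(np.mean(np.exp(1j * (n * phi_x - m * phi_y))))

t = np.linspace(0, 10, 2000)
cardiac = np.sin(2 * np.pi * 5 * t)            # 5 Hz stand-in for heart rhythm
resp = np.sin(2 * np.pi * 1 * t + 0.3)         # 1 Hz stand-in for respiration
print(phase_locking_index(cardiac, resp, n=1, m=5))  # 5:1 locking, near 1
```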
Representational constraints on children's suggestibility.
Ceci, Stephen J; Papierno, Paul B; Kulkofsky, Sarah
2007-06-01
In a multistage experiment, twelve 4- and 9-year-old children participated in a triad rating task. Their ratings were mapped with multidimensional scaling, from which Euclidean distances were computed to operationalize the semantic distance between items in target pairs. These children and age-mates then participated in an experiment that employed these target pairs in a story, which was followed by a misinformation manipulation. Analyses linked individual and developmental differences in suggestibility to children's representations of the target items. Semantic proximity was a strong predictor of differences in suggestibility: the closer a suggested distractor was to the original item's representation, the greater was the distractor's suggestive influence. The triad participants' semantic proximity subsequently served as the basis for correctly predicting memory performance in the larger group. Semantic proximity enabled a priori counterintuitive predictions of reverse age-related trends to be confirmed whenever the distance between representations of items in a target pair was greater for younger than for older children.
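The representational measure described here can be reproduced in outline: triad ratings yield a dissimilarity matrix, multidimensional scaling embeds the items, and Euclidean distances between embedded items operationalize semantic distance. A generic sketch with made-up dissimilarities (scikit-learn's MDS stands in for whatever scaling software the authors used):

```python
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
d = rng.random((8, 8))                        # 8 items, made-up triad ratings
dissim = (d + d.T) / 2                        # symmetrize
np.fill_diagonal(dissim, 0)

mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissim)

# Semantic distance between the two items of a target pair, e.g. items 0 and 3.
print(np.linalg.norm(coords[0] - coords[3]))
```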
A computer graphics program for general finite element analyses
NASA Technical Reports Server (NTRS)
Thornton, E. A.; Sawyer, L. M.
1978-01-01
Documentation for a computer graphics program for displays from general finite element analyses is presented. A general description of display options and detailed user instructions are given. Several plots made in structural, thermal and fluid finite element analyses are included to illustrate program options. Sample data files are given to illustrate use of the program.
A Review of Recent Aeroelastic Analysis Methods for Propulsion at NASA Lewis Research Center
NASA Technical Reports Server (NTRS)
Reddy, T. S. R.; Bakhle, Milind A.; Srivastava, R.; Mehmed, Oral; Stefko, George L.
1993-01-01
This report reviews aeroelastic analyses for propulsion components (propfans, compressors and turbines) being developed and used at NASA LeRC. These aeroelastic analyses include both structural and aerodynamic models. The structural models include a typical section, a beam (with and without disk flexibility), and a finite-element blade model (with plate bending elements). The aerodynamic models are based on the solution of equations ranging from the two-dimensional linear potential equation to the three-dimensional Euler equations for multibladed configurations. Typical calculated results are presented for each aeroelastic model. Suggestions for further research are made. Many of the currently available aeroelastic models and analysis methods are being incorporated in a unified computer program, APPLE (Aeroelasticity Program for Propulsion at LEwis).
Wei, Ran
2007-06-01
This study examines the effects of exposure to online videogame violence on Chinese adolescents' attitudes toward violence, empathy, and aggressive behavior. Results of bivariate analyses show that playing violent videogames on the Internet was associated with greater tolerance of violence, a lower empathic attitude, and more aggressive behavior. Results of hierarchical regression analyses showed sustained relationships between exposure and pro-violent attitudes and empathy when exposure was examined simultaneously with gender, computer use, and Internet use. However, the linkage between exposure and aggression became non-significant, suggesting that the effects of playing violent videogames were greater on attitudinal outcomes than on overt behavior. Gender differences in playing videogames and in effects were also found.
Thermal Performance Analysis of a Geologic Borehole Repository
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reagin, Lauren
2016-08-16
The Brazilian Nuclear Research Institute (IPEN) proposed a design for the disposal of Disused Sealed Radioactive Sources (DSRS) based on the IAEA Borehole Disposal of Sealed Radioactive Sources (BOSS) design that would allow the entirety of Brazil's inventory of DSRS to be disposed in a single borehole. The proposed IPEN design allows for 170 waste packages (WPs) containing DSRS (such as Co-60 and Cs-137) to be stacked on top of each other inside the borehole. The primary objective of this work was to evaluate the thermal performance of a conservative approach to the IPEN proposal with the equivalent of two WPs and two different inside configurations using Co-60 as the radioactive heat source. The current WP configuration (heterogeneous) for the IPEN proposal has 60% of the WP volume occupied by a nuclear radioactive heat source and the remaining 40% vacant. The second configuration (homogeneous) considered for this project has 100% of the WP volume occupied by a nuclear radioactive heat source. The computational models for the thermal analyses of the WP configurations with the Co-60 heat source considered three different cooling mechanisms (conduction, radiation, and convection) and the effect of mesh size on the results of the thermal analysis. The results of the analyses yielded maximum temperatures inside the WPs for both WP configurations and various mesh sizes. The heterogeneous WP considered the cooling mechanisms of conduction, convection, and radiation. The temperature results from the heterogeneous WP analysis suggest that the model is cooled predominantly by conduction, with the effects of radiation and natural convection on cooling being negligible. The thermal analysis comparing the two WP configurations suggests that either configuration could be used for the design. The mesh sensitivity results verify the meshes used; the results obtained from the thermal analyses were close to being independent of mesh size. The results from the computational and analytically calculated cases for the homogeneous WP benchmark were almost identical, which indicates that the computational approach used here was successfully verified against the analytical solution.
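The analytical benchmark used for the homogeneous WP is not reproduced in the abstract; a textbook check of the same kind is the steady-state centreline temperature rise of a uniformly heated, conduction-cooled cylinder, ΔT = q'''R²/4k. A small sketch with invented numbers (not the IPEN waste-package parameters):

```python
def centerline_temp_rise(q_vol, radius, k):
    """Steady-state centreline temperature rise (K) of a long cylinder
    with uniform volumetric heating q_vol (W/m^3), radius (m), and
    thermal conductivity k (W/(m*K)): dT = q''' * R**2 / (4 * k)."""
    return q_vol * radius**2 / (4.0 * k)

# Illustrative values only.
print(centerline_temp_rise(q_vol=5.0e4, radius=0.05, k=15.0))  # ~2.1 K
```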
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Perry, Boyd, III; Florance, James R.; Sanetrik, Mark D.; Wieseman, Carol D.; Stevens, William L.; Funk, Christie J.; Hur, Jiyoung; Christhilf, David M.; Coulson, David A.
2011-01-01
A summary of computational and experimental aeroelastic and aeroservoelastic (ASE) results for the Semi-Span Super-Sonic Transport (S4T) wind-tunnel model is presented. A broad range of analyses and multiple ASE wind-tunnel tests of the S4T have been performed in support of the ASE element in the Supersonics Program, part of NASA's Fundamental Aeronautics Program. The computational results to be presented include linear aeroelastic and ASE analyses, nonlinear aeroelastic analyses using an aeroelastic CFD code, and rapid aeroelastic analyses using CFD-based reduced-order models (ROMs). Experimental results from two closed-loop wind-tunnel tests performed at NASA Langley's Transonic Dynamics Tunnel (TDT) will be presented as well.
ERIC Educational Resources Information Center
Vangsnes, Vigdis; Gram Okland, Nils Tore; Krumsvik, Rune
2012-01-01
This article focuses on the didactical implications when commercial educational computer games are used in Norwegian kindergartens by analysing the dramaturgy and the didactics of one particular game and the game in use in a pedagogical context. Our justification for analysing the game by using dramaturgic theory is that we consider the game to be…
Computational Aeroelastic Analyses of a Low-Boom Supersonic Configuration
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Sanetrik, Mark D.; Chwalowski, Pawel; Connolly, Joseph
2015-01-01
An overview of NASA's Commercial Supersonic Technology (CST) Aeroservoelasticity (ASE) element is provided with a focus on recent computational aeroelastic analyses of a low-boom supersonic configuration developed by Lockheed-Martin and referred to as the N+2 configuration. The overview includes details of the computational models developed to date including a linear finite element model (FEM), linear unsteady aerodynamic models, unstructured CFD grids, and CFD-based aeroelastic analyses. In addition, a summary of the work involving the development of aeroelastic reduced-order models (ROMs) and the development of an aero-propulso-servo-elastic (APSE) model is provided.
Analyses of ACPL thermal/fluid conditioning system
NASA Technical Reports Server (NTRS)
Stephen, L. A.; Usher, L. H.
1976-01-01
Results of engineering analyses are reported. Initial computations were made using a modified control transfer function where the systems performance was characterized parametrically using an analytical model. The analytical model was revised to represent the latest expansion chamber fluid manifold design, and systems performance predictions were made. Parameters which were independently varied in these computations are listed. Systems predictions which were used to characterize performance are primarily transient computer plots comparing the deviation between average chamber temperature and the chamber temperature requirement. Additional computer plots were prepared. Results of parametric computations with the latest fluid manifold design are included.
[Results of the marketing research study "Acceptance of physician's office computer systems"].
Steinhausen, D; Brinkmann, F; Engelhard, A
1998-01-01
We report on a market research study of the acceptance of computer systems in physicians' offices. 11,000 returned questionnaires from physicians, both users and nonusers, were analysed. We found that most of the physicians used their computers in a limited way, i.e. as a device for accounting. Concerning the level of utilisation, there are differences between men and women, west and east, and young and old. In this study we also analysed the computer-use behaviour of gynaecologists. Two-thirds of all nonusers do not intend to use a computer in the future.
NASA Astrophysics Data System (ADS)
Titus, Benjamin M.; Daly, Marymegan
2017-03-01
Specialist and generalist life histories are expected to result in contrasting levels of genetic diversity at the population level, and symbioses are expected to lead to patterns that reflect a shared biogeographic history and co-diversification. We test these assumptions using mtDNA sequencing and a comparative phylogeographic approach for six co-occurring crustacean species that are symbiotic with sea anemones on western Atlantic coral reefs, yet vary in their host specificities: four are host specialists and two are host generalists. We first conducted species discovery analyses to delimit cryptic lineages, followed by classic population genetic diversity analyses for each delimited taxon, and then reconstructed the demographic history for each taxon using traditional summary statistics, Bayesian skyline plots, and approximate Bayesian computation to test for signatures of recent and concerted population expansion. The genetic diversity values recovered here contravene the expectations of the specialist-generalist variation hypothesis and classic population genetics theory; all specialist lineages had greater genetic diversity than generalists. Demography suggests recent population expansions in all taxa, although Bayesian skyline plots and approximate Bayesian computation suggest the timing and magnitude of these events were idiosyncratic. These results do not meet the a priori expectation of concordance among symbiotic taxa and suggest that intrinsic aspects of species biology may contribute more to phylogeographic history than extrinsic forces that shape whole communities. The recovery of two cryptic specialist lineages adds an additional layer of biodiversity to this symbiosis and contributes to an emerging pattern of cryptic speciation in the specialist taxa. Our results underscore the differences in the evolutionary processes acting on marine systems from the terrestrial processes that often drive theory. Finally, we continue to highlight the Florida Reef Tract as an important biodiversity hotspot.
Lloyd, Jan; Moni, Karen B; Jobling, Anne
2006-06-01
There has been huge growth in the use of information technology (IT) in classrooms for learners of all ages. It has been suggested that computers in the classroom encourage independent and self-paced learning, provide immediate feedback, and improve self-motivation and self-confidence. Concurrently, there is increasing interest in the role of technology in educational programs for individuals with intellectual disabilities. However, although many claims are made about the benefits of computers and software packages, there is limited evidence-based information to support these claims. Researchers are now starting to look at the specific instructional design features that are hypothesised to facilitate educational outcomes, rather than over-emphasising graphics and sounds. Research undertaken as part of a post-school program (Latch-On: Literacy and Technology - Hands On) at the University of Queensland investigated the use of computers by young adults with intellectual disabilities. The aims of the research reported in this paper were to address the challenges identified in the 'hype' surrounding different pieces of educational software and to develop a means of systematically analysing software for use in teaching programs.
Speech perception at the interface of neurobiology and linguistics.
Poeppel, David; Idsardi, William J; van Wassenhove, Virginie
2008-03-12
Speech perception consists of a set of computations that take continuously varying acoustic waveforms as input and generate discrete representations that make contact with the lexical representations stored in long-term memory as output. Because the perceptual objects recognized by speech perception enter into subsequent linguistic computation, the format used for lexical representation and processing fundamentally constrains the speech perceptual processes. Consequently, theories of speech perception must, at some level, be tightly linked to theories of lexical representation. Minimally, speech perception must yield representations that smoothly and rapidly interface with stored lexical items. Adopting the perspective of Marr, we argue and provide neurobiological and psychophysical evidence for the following research programme. First, at the implementational level, speech perception is a multi-time resolution process, with perceptual analyses occurring concurrently on at least two time scales (approx. 20-80 ms, approx. 150-300 ms), commensurate with (sub)segmental and syllabic analyses, respectively. Second, at the algorithmic level, we suggest that perception proceeds on the basis of internal forward models, or uses an 'analysis-by-synthesis' approach. Third, at the computational level (in the sense of Marr), the theory of lexical representation that we adopt is principally informed by phonological research and assumes that words are represented in the mental lexicon in terms of sequences of discrete segments composed of distinctive features. One important goal of the research programme is to develop linking hypotheses between putative neurobiological primitives (e.g. temporal primitives) and those derived from linguistic inquiry, to arrive ultimately at a biologically sensible and theoretically satisfying model of representation and computation in speech.
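The multi-time-resolution claim at the implementational level can be illustrated with two concurrent short-time analyses of the same waveform, one windowed on the (sub)segmental scale and one on the syllabic scale. The sketch below uses white noise as a stand-in for speech, with window lengths chosen inside the ranges quoted above:

```python
import numpy as np
from scipy.signal import stft

fs = 16_000
x = np.random.default_rng(0).standard_normal(fs)  # 1 s stand-in for speech

# Concurrent analyses on two time scales: ~25 ms ((sub)segmental) and
# ~200 ms (syllabic), per the 20-80 ms and 150-300 ms ranges above.
_, _, Z_seg = stft(x, fs=fs, nperseg=int(0.025 * fs))
_, _, Z_syl = stft(x, fs=fs, nperseg=int(0.200 * fs))
print(Z_seg.shape, Z_syl.shape)  # finer time grid vs finer frequency grid
```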
Price, Rebecca B.; Wallace, Meredith; Kuckertz, Jennie M.; Amir, Nader; Graur, Simona; Cummings, Logan; Popa, Paul; Carlbring, Per; Bar-Haim, Yair
2016-01-01
Computer-based approaches, such as Attention Bias Modification (ABM), could help improve access to care for anxiety. Study-level meta-analyses of ABM have produced conflicting findings and leave critical questions unresolved regarding ABM's mechanisms of action and clinical potential. We pooled patient-level datasets from randomized controlled trials of children and adults with high anxiety. Attentional bias (AB) towards threat, the target mechanism of ABM, was tested as an outcome and as a mechanistic mediator and moderator of anxiety reduction. Diagnostic remission and the Liebowitz Social Anxiety Scale (LSAS) were clinical outcomes available in enough studies to enable pooling. Per-patient data were obtained on at least one outcome from 13/16 eligible studies [86% of eligible participants; n=778]. Significant main effects of ABM on diagnostic remission (ABM: 22.6%, control: 10.8%; OR=2.57; p=.006) and AB (β*(95%CI)=−.63(−.83, −.42); p<.00005) were observed. There was no main effect of ABM on LSAS. However, moderator analyses suggested ABM was effective for patients who were younger (≤37y), trained in the lab, and/or assessed by clinicians. Under the same conditions where ABM was effective, mechanistic links between AB and anxiety reduction were supported. Under these specific circumstances, ABM reduces anxiety and acts through its target mechanism, supporting ABM's theoretical basis while simultaneously suggesting clinical indications and refinements to improve its currently limited clinical potential. PMID:27693664
Validation of a motivation-based typology of angry aggression among antisocial youths in Norway.
Bjørnebekk, Gunnar; Howard, Rick
2012-01-01
This article describes the validation of the Angry Aggression Scales (AAS), the Behavior Inhibition System and the Behavior Activation System (BIS/BAS) scales, the reactive aggression and proactive power scales in relation to a Norwegian sample of 101 antisocial youths with conduct problems (64 boys, 37 girls, mean age 15 ± 1.3 years) and 101 prosocial controls matched on age, gender, education, ethnicity, and school district. Maximum likelihood exploratory factor analyses with oblique rotation were performed on AAS, BIS/BAS, reactive aggression and proactive power scales as well as computation of Cronbach's alpha and McDonald's omega. Tests for normality and homogeneity of variance were acceptable. Factor analyses of AAS and the proactive/reactive aggression scales suggested a hierarchical structure comprising a single higher-order angry aggression (AA) factor and four and two lower-order factors, respectively. Moreover, results suggested one BIS factor and a single higher-order BAS factor with three lower-order factors related to drive, fun-seeking and reward responsiveness. To compare scores of antisocial youths with controls, t-tests on the mean scale scores were computed. Results confirmed that antisocial youths were different from controls on the above-mentioned scales. Consistent with the idea that anger is associated with approach motivation, AAS scores correlated with behavioral activation, but only explosive/reactive and vengeful/ruminative AA correlated with behavioral inhibition. Results generally validated the quadruple typology of aggression and violence proposed by Howard (2009). Copyright © 2012 John Wiley & Sons, Ltd.
Cadeddu, Andrea; Wylie, Elizabeth K; Jurczak, Janusz; Wampler-Doty, Matthew; Grzybowski, Bartosz A
2014-07-28
Methods of computational linguistics are used to demonstrate that a natural language such as English and organic chemistry have the same structure in terms of the frequency of, respectively, text fragments and molecular fragments. This quantitative correspondence suggests that it is possible to extend the methods of computational corpus linguistics to the analysis of organic molecules. It is shown that, within organic molecules, the bonds with the highest information content are the ones that 1) define repeat/symmetry subunits and 2) in asymmetric molecules, define the loci of potential retrosynthetic disconnections. Linguistics-based analysis appears well suited to the analysis of complex structural and reactivity patterns within organic molecules. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Preliminary Evaluation of MapReduce for High-Performance Climate Data Analysis
NASA Technical Reports Server (NTRS)
Duffy, Daniel Q.; Schnase, John L.; Thompson, John H.; Freeman, Shawn M.; Clune, Thomas L.
2012-01-01
MapReduce is an approach to high-performance analytics that may be useful to data intensive problems in climate research. It offers an analysis paradigm that uses clusters of computers and combines distributed storage of large data sets with parallel computation. We are particularly interested in the potential of MapReduce to speed up basic operations common to a wide range of analyses. In order to evaluate this potential, we are prototyping a series of canonical MapReduce operations over a test suite of observational and climate simulation datasets. Our initial focus has been on averaging operations over arbitrary spatial and temporal extents within Modern-Era Retrospective Analysis for Research and Applications (MERRA) data. Preliminary results suggest this approach can improve efficiencies within data intensive analytic workflows.
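The averaging operations described above map naturally onto the MapReduce pattern: mappers emit partial (sum, count) pairs keyed by a spatial or temporal bucket, and reducers combine them into means. A toy, in-process sketch of that dataflow (the keys, field names, and records are invented, not the MERRA schema):

```python
from collections import defaultdict

def mapper(record):
    # Emit (bucket, (sum, count)) for the quantity being averaged.
    yield record["region"], (record["t2m"], 1)

def reducer(pairs):
    acc = defaultdict(lambda: [0.0, 0])
    for key, (s, c) in pairs:
        acc[key][0] += s
        acc[key][1] += c
    return {k: sc[0] / sc[1] for k, sc in acc.items()}

records = [{"region": "arctic", "t2m": 250.1},
           {"region": "arctic", "t2m": 251.3},
           {"region": "tropics", "t2m": 299.0}]
print(reducer(kv for r in records for kv in mapper(r)))  # per-region means
```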
Bayet, Laurie; Pascalis, Olivier; Quinn, Paul C.; Lee, Kang; Gentaz, Édouard; Tanaka, James W.
2015-01-01
Angry faces are perceived as more masculine by adults. However, the developmental course and underlying mechanism (bottom-up and stimulus driven, or top-down and belief driven) associated with the angry-male bias remain unclear. Here we report that anger biases face gender categorization toward “male” responding in children as young as 5–6 years. The bias is observed for both own- and other-race faces, and is remarkably unchanged across development (into adulthood) as revealed by signal detection analyses (Experiments 1–2). The developmental course of the angry-male bias and its extension to other-race faces combine to suggest that it is not rooted in extensive experience, e.g., observing males engaging in aggressive acts during the school years. Based on several computational simulations of gender categorization (Experiment 3), we further conclude that (1) the angry-male bias results, at least partially, from a strategy of attending to facial features or their second-order relations when categorizing face gender, and (2) any single choice of computational representation (e.g., Principal Component Analysis) is insufficient to assess resemblances between face categories, as different representations of the very same faces suggest different bases for the angry-male bias. Our findings are thus consistent with stimulus-driven and stereotyped-belief-driven accounts of the angry-male bias. Taken together, the evidence suggests considerable stability in the interaction between some facial dimensions in social categorization that is present prior to the onset of formal schooling. PMID:25859238
Martin, David M; Murphy, Eoin A; Boyle, Fergal J
2014-08-01
In many computational fluid dynamics (CFD) studies of stented vessel haemodynamics, the geometry of the stented vessel is described using non-deformed (NDF) geometrical models. These NDF models neglect complex physical features, such as stent and vessel deformation, which may have a major impact on the haemodynamic environment in stented coronary arteries. In this study, CFD analyses were carried out to simulate pulsatile flow conditions in both NDF and realistically-deformed (RDF) models of three stented coronary arteries. While the NDF models were completely idealised, the RDF models were obtained from nonlinear structural analyses and accounted for both stent and vessel deformation. Following the completion of the CFD analyses, major differences were observed in the time-averaged wall shear stress (TAWSS), time-averaged wall shear stress gradient (TAWSSG) and oscillatory shear index (OSI) distributions predicted on the luminal surface of the artery for the NDF and RDF models. Specifically, the inclusion of stent and vessel deformation in the CFD analyses resulted in a 32%, 30% and 31% increase in the area-weighted mean TAWSS, a 3%, 7% and 16% increase in the area-weighted mean TAWSSG and a 21%, 13% and 21% decrease in the area-weighted mean OSI for Stents A, B and C, respectively. These results suggest that stent and vessel deformation are likely to have a major impact on the haemodynamic environment in stented coronary arteries. In light of this observation, it is recommended that these features are considered in future CFD studies of stented vessel haemodynamics. Copyright © 2014 IPEM. Published by Elsevier Ltd. All rights reserved.
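The three haemodynamic metrics compared above have standard definitions: TAWSS is the time average of the wall-shear-stress magnitude over the cardiac cycle, and OSI = 0.5(1 − |∫τ dt| / ∫|τ| dt) measures how strongly the shear direction oscillates. A generic per-point sketch using simple Riemann sums (the vector series is invented, not from the study):

```python
import numpy as np

def tawss_and_osi(tau, dt):
    """TAWSS and OSI from wall-shear-stress vectors tau, shape (nt, 3),
    sampled every dt seconds at one luminal point (Riemann-sum integrals)."""
    mag_int = np.linalg.norm(tau, axis=1).sum() * dt   # integral of |tau|
    vec_int = np.linalg.norm(tau.sum(axis=0) * dt)     # |integral of tau|
    period = tau.shape[0] * dt
    return mag_int / period, 0.5 * (1.0 - vec_int / mag_int)

t = np.linspace(0.0, 1.0, 100, endpoint=False)
tau = np.stack([np.sin(2 * np.pi * t), 0.2 * np.ones_like(t), 0.0 * t], axis=1)
print(tawss_and_osi(tau, dt=t[1] - t[0]))  # the oscillatory x-component raises OSI
```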
Materials constitutive models for nonlinear analysis of thermally cycled structures
NASA Technical Reports Server (NTRS)
Kaufman, A.; Hunt, L. E.
1982-01-01
Effects of inelastic materials models on computed stress-strain solutions for thermally loaded structures were studied by performing nonlinear (elastoplastic creep) and elastic structural analyses on a prismatic, double edge wedge specimen of IN 100 alloy that was subjected to thermal cycling in fluidized beds. Four incremental plasticity creep models (isotropic, kinematic, combined isotropic kinematic, and combined plus transient creep) were exercised for the problem by using the MARC nonlinear, finite element computer program. Maximum total strain ranges computed from the elastic and nonlinear analyses agreed within 5 percent. Mean cyclic stresses, inelastic strain ranges, and inelastic work were significantly affected by the choice of inelastic constitutive model. The computing time per cycle for the nonlinear analyses was more than five times that required for the elastic analysis.
A combined computational-experimental analyses of selected metabolic enzymes in Pseudomonas species.
Perumal, Deepak; Lim, Chu Sing; Chow, Vincent T K; Sakharkar, Kishore R; Sakharkar, Meena K
2008-09-10
Comparative genomic analysis has revolutionized our ability to predict the metabolic subsystems that occur in newly sequenced genomes, and to explore the functional roles of the set of genes within each subsystem. These computational predictions can considerably reduce the volume of experimental studies required to assess basic metabolic properties of multiple bacterial species. However, experimental validations are still required to resolve the apparent inconsistencies in the predictions by multiple resources. Here, we present combined computational-experimental analyses on eight completely sequenced Pseudomonas species. Comparative pathway analyses reveal that several pathways within the Pseudomonas species show high plasticity and versatility. Potential bypasses in 11 metabolic pathways were identified. We further confirmed the presence of the enzyme O-acetyl homoserine (thiol) lyase (EC: 2.5.1.49) in P. syringae pv. tomato that revealed inconsistent annotations in KEGG and in the recently published SYSTOMONAS database. These analyses connect and integrate systematic data generation, computational data interpretation, and experimental validation and represent a synergistic and powerful means for conducting biological research.
Heads in the Cloud: A Primer on Neuroimaging Applications of High Performance Computing.
Shatil, Anwar S; Younas, Sohail; Pourreza, Hossein; Figley, Chase R
2015-01-01
With larger data sets and more sophisticated analyses, it is becoming increasingly common for neuroimaging researchers to push (or exceed) the limitations of standalone computer workstations. Nonetheless, although high-performance computing platforms such as clusters, grids and clouds are already in routine use by a small handful of neuroimaging researchers to increase their storage and/or computational power, the adoption of such resources by the broader neuroimaging community remains relatively uncommon. Therefore, the goal of the current manuscript is to: 1) inform prospective users about the similarities and differences between computing clusters, grids and clouds; 2) highlight their main advantages; 3) discuss when it may (and may not) be advisable to use them; 4) review some of their potential problems and barriers to access; and finally 5) give a few practical suggestions for how interested new users can start analyzing their neuroimaging data using cloud resources. Although the aim of cloud computing is to hide most of the complexity of the infrastructure management from end-users, we recognize that this can still be an intimidating area for cognitive neuroscientists, psychologists, neurologists, radiologists, and other neuroimaging researchers lacking a strong computational background. Therefore, with this in mind, we have aimed to provide a basic introduction to cloud computing in general (including some of the basic terminology, computer architectures, infrastructure and service models, etc.), a practical overview of the benefits and drawbacks, and a specific focus on how cloud resources can be used for various neuroimaging applications.
A Status Review of the Commercial Supersonic Technology (CST) Aeroservoelasticity (ASE) Project
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Sanetrik, Mark D.; Chwalowski, Pawel; Funk, Christy; Keller, Donald F.; Ringertz, Ulf
2016-01-01
An overview of recent progress regarding the computational aeroelastic and aeroservoelastic (ASE) analyses of a low-boom supersonic configuration is presented. The overview includes details of the computational models developed to date with a focus on unstructured CFD grids, computational aeroelastic analyses, sonic boom propagation studies that include static aeroelastic effects, and gust loads analyses. In addition, flutter boundaries using aeroelastic Reduced-Order Models (ROMs) are presented at various Mach numbers of interest. Details regarding a collaboration with the Royal Institute of Technology (KTH, Stockholm, Sweden) to design, fabricate, and test a full-span aeroelastic wind-tunnel model are also presented.
Energy and time determine scaling in biological and computer designs.
Moses, Melanie; Bezerra, George; Edwards, Benjamin; Brown, James; Forrest, Stephanie
2016-08-19
Metabolic rate in animals and power consumption in computers are analogous quantities that scale similarly with size. We analyse vascular systems of mammals and on-chip networks of microprocessors, where natural selection and human engineering, respectively, have produced systems that minimize both energy dissipation and delivery times. Using a simple network model that simultaneously minimizes energy and time, our analysis explains empirically observed trends in the scaling of metabolic rate in mammals and power consumption and performance in microprocessors across several orders of magnitude in size. Just as the evolutionary transitions from unicellular to multicellular animals in biology are associated with shifts in metabolic scaling, our model suggests that the scaling of power and performance will change as computer designs transition to decentralized multi-core and distributed cyber-physical systems. More generally, a single energy-time minimization principle may govern the design of many complex systems that process energy, materials and information. This article is part of the themed issue 'The major synthetic evolutionary transitions'. © 2016 The Author(s).
Geary, D C; Frensch, P A; Wiley, J G
1993-06-01
Thirty-six younger adults (10 male, 26 female; ages 18 to 38 years) and 36 older adults (14 male, 22 female; ages 61 to 80 years) completed simple and complex paper-and-pencil subtraction tests and solved a series of simple and complex computer-presented subtraction problems. For the computer task, strategies and solution times were recorded on a trial-by-trial basis. Older Ss used a developmentally more mature mix of problem-solving strategies to solve both simple and complex subtraction problems. Analyses of component scores derived from the solution times suggest that the older Ss are slower at number encoding and number production but faster at executing the borrow procedure. In contrast, groups did not appear to differ in the speed of subtraction fact retrieval. Results from a computational simulation are consistent with the interpretation that older adults' advantage for strategy choices and for the speed of executing the borrow procedure might result from more practice solving subtraction problems.
Designing for deeper learning in a blended computer science course for middle school students
NASA Astrophysics Data System (ADS)
Grover, Shuchi; Pea, Roy; Cooper, Stephen
2015-04-01
The focus of this research was to create and test an introductory computer science course for middle school. Titled "Foundations for Advancing Computational Thinking" (FACT), the course aims to prepare and motivate middle school learners for future engagement with algorithmic problem solving. FACT was also piloted as a seven-week course on Stanford's OpenEdX MOOC platform for blended in-class learning. Unique aspects of FACT include balanced pedagogical designs that address the cognitive, interpersonal, and intrapersonal aspects of "deeper learning"; a focus on pedagogical strategies for mediating and assessing for transfer from block-based to text-based programming; curricular materials for remedying misperceptions of computing; and "systems of assessments" (including formative and summative quizzes and tests, directed as well as open-ended programming assignments, and a transfer test) to get a comprehensive picture of students' deeper computational learning. Empirical investigations, accomplished over two iterations of a design-based research effort with students (aged 11-14 years) in a public school, sought to examine student understanding of algorithmic constructs, and how well students transferred this learning from Scratch to text-based languages. Changes in student perceptions of computing as a discipline were measured. Results and mixed-method analyses revealed that students in both studies (1) achieved substantial learning gains in algorithmic thinking skills, (2) were able to transfer their learning from Scratch to a text-based programming context, and (3) achieved significant growth toward a more mature understanding of computing as a discipline. Factor analyses of prior computing experience, multivariate regression analyses, and qualitative analyses of student projects and artifact-based interviews were conducted to better understand the factors affecting learning outcomes. Prior computing experiences (as measured by a pretest) and math ability were found to be strong predictors of learning outcomes.
SSME main combustion chamber and nozzle flowfield analysis
NASA Technical Reports Server (NTRS)
Farmer, R. C.; Wang, T. S.; Smith, S. D.; Prozan, R. J.
1986-01-01
An investigation is presented of the computational fluid dynamics (CFD) tools which would accurately analyze main combustion chamber and nozzle flow. The importance of combustion phenomena and local variations in mixture ratio is fully appreciated; however, the computational aspects of the gas dynamics involved were the sole issues addressed. The CFD analyses made are first compared with conventional nozzle analyses to determine the accuracy for steady flows, and then transient analyses are discussed.
Potential evapotranspiration and continental drying
Milly, Paul C.D.; Dunne, Krista A.
2016-01-01
By various measures (drought area and intensity, climatic aridity index, and climatic water deficits), some observational analyses have suggested that much of the Earth’s land has been drying during recent decades, but such drying seems inconsistent with observations of dryland greening and decreasing pan evaporation. ‘Offline’ analyses of climate-model outputs from anthropogenic climate change (ACC) experiments portend continuation of putative drying through the twenty-first century, despite an expected increase in global land precipitation. A ubiquitous increase in estimates of potential evapotranspiration (PET), driven by atmospheric warming, underlies the drying trends, but may be a methodological artefact. Here we show that the PET estimator commonly used (the Penman–Monteith PET for either an open-water surface or a reference crop) severely overpredicts the changes in non-water-stressed evapotranspiration computed in the climate models themselves in ACC experiments. This overprediction is partially due to neglect of stomatal conductance reductions commonly induced by increasing atmospheric CO2 concentrations in climate models. Our findings imply that historical and future tendencies towards continental drying, as characterized by offline-computed runoff, as well as other PET-dependent metrics, may be considerably weaker and less extensive than previously thought.
Scafuri, Bernardina; Marabotti, Anna; Carbone, Virginia; Minasi, Paola; Dotolo, Serena; Facchiano, Angelo
2016-01-01
We investigated the potential role of apple phenolic compounds in human pathologies by integrating chemical characterization of phenolic compounds in three apple varieties, computational approaches to identify potential protein targets of the compounds, bioinformatics analyses on data from public archives of gene expression data, and functional analyses to hypothesize the effects of the selected compounds in molecular pathways. Starting from the analytical characterization of phenolic compounds in three apple varieties, i.e. Annurca, Red Delicious, and Golden Delicious, we used computational approaches to verify by reverse docking the potential protein targets of the identified compounds. Direct docking validation of the potential protein-ligand interactions generated a short list of human proteins potentially bound by the apple phenolic compounds. By considering the known chemo-preventive role of apple antioxidant extracts against some human pathologies, we performed a functional analysis by comparison with experimental gene expression data and interaction networks obtained from public repositories. The results suggest the hypothesis that the chemo-preventive effect of apple extracts in human pathologies, in particular colorectal cancer, may involve interference with the activity of nucleotide metabolism and methylation enzymes, similarly to some classes of anticancer drugs. PMID:27587238
Computational Analyses of Offset Stream Nozzles for Noise Reduction
NASA Technical Reports Server (NTRS)
Dippold, Vance, III; Foster, Lancert; Wiese, Michael
2007-01-01
The Wind computational fluid dynamics code was used to perform a series of simulations on two offset stream nozzle concepts for jet noise reduction. The first concept used an S-duct to direct the secondary stream to the lower side of the nozzle. The second concept used vanes to turn the secondary flow downward. The analyses were completed in preparation for tests conducted in the NASA Glenn Research Center Aeroacoustic Propulsion Laboratory. The offset stream nozzles demonstrated good performance and reduced the amount of turbulence on the lower side of the jet plume. The computer analyses proved instrumental in guiding the development of the final test configurations and giving insight into the flow mechanics of offset stream nozzles. The computational predictions were compared with flowfield results from the jet rig testing and showed excellent agreement.
The effect of brain lesions on sound localization in complex acoustic environments.
Zündorf, Ida C; Karnath, Hans-Otto; Lewald, Jörg
2014-05-01
Localizing sound sources of interest in cluttered acoustic environments--as in the 'cocktail-party' situation--is one of the most demanding challenges to the human auditory system in everyday life. In this study, stroke patients' ability to localize acoustic targets in a single-source and in a multi-source setup in the free sound field was directly compared. Subsequent voxel-based lesion-behaviour mapping analyses were computed to uncover the brain areas associated with a deficit in localization in the presence of multiple distracter sound sources rather than localization of individually presented sound sources. Analyses revealed a fundamental role of the right planum temporale in this task. The results from the left hemisphere were less straightforward, but suggested an involvement of inferior frontal and pre- and postcentral areas. These areas appear to be particularly involved in the spectrotemporal analyses crucial for effective segregation of multiple sound streams from various locations, beyond the currently known network for localization of isolated sound sources in otherwise silent surroundings.
Borowiec, Marek L; Lee, Ernest K; Chiu, Joanna C; Plachetzki, David C
2015-11-23
Understanding the phylogenetic relationships among major lineages of multicellular animals (the Metazoa) is a prerequisite for studying the evolution of complex traits such as nervous systems, muscle tissue, or sensory organs. Transcriptome-based phylogenies have dramatically improved our understanding of metazoan relationships in recent years, although several important questions remain. The branching order near the base of the tree, in particular the placement of the poriferan (sponges, phylum Porifera) and ctenophore (comb jellies, phylum Ctenophora) lineages is one outstanding issue. Recent analyses have suggested that the comb jellies are sister to all remaining metazoan phyla including sponges. This finding is surprising because it suggests that neurons and other complex traits, present in ctenophores and eumetazoans but absent in sponges or placozoans, either evolved twice in Metazoa or were independently, secondarily lost in the lineages leading to sponges and placozoans. To address the question of basal metazoan relationships we assembled a novel dataset comprised of 1080 orthologous loci derived from 36 publicly available genomes representing major lineages of animals. From this large dataset we procured an optimized set of partitions with high phylogenetic signal for resolving metazoan relationships. This optimized data set is amenable to the most appropriate and computationally intensive analyses using site-heterogeneous models of sequence evolution. We also employed several strategies to examine the potential for long-branch attraction to bias our inferences. Our analyses strongly support the Ctenophora as the sister lineage to other Metazoa. We find no support for the traditional view uniting the ctenophores and Cnidaria. Our findings are supported by Bayesian comparisons of topological hypotheses and we find no evidence that they are biased by long-branch attraction. Our study further clarifies relationships among early branching metazoan lineages. Our phylogeny supports the still-controversial position of ctenophores as sister group to all other metazoans. This study also provides a workflow and computational tools for minimizing systematic bias in genome-based phylogenetic analyses. Future studies of metazoan phylogeny will benefit from ongoing efforts to sequence the genomes of additional invertebrate taxa that will continue to inform our view of the relationships among the major lineages of animals.
Adolescent computer use and alcohol use: what are the roles of quantity and content of computer use?
Epstein, Jennifer A
2011-05-01
The purpose of this study was to examine the relationship between computer use and alcohol use among adolescents. In particular, the goal of the research was to determine the role of lifetime drinking and past month drinking on quantity as measured by amount of time on the computer (for school work and excluding school work) and on content as measured by the frequency of a variety of activities on the internet (e.g., e-mail, searching for information, social networking, listening to/downloading music). Participants (aged 13-17 years and residing in the United States) were recruited via the internet to complete an anonymous survey online using a popular survey tool (N=270). Their average age was 16 and the sample was predominantly female (63% girls). A series of analyses was conducted with the computer use measures as dependent variables (hours on the computer per week for school work and excluding school work; various internet activities including e-mail, searching for information, social networking, listening to/downloading music) controlling for gender, age, academic performance and age of first computer use. Based on the results, past month drinkers used the computer more hours per week excluding school work than those who did not. As expected, there were no differences in hours based on alcohol use for computer use for school work. Drinking also had relationships with more frequent social networking and listening to/downloading music. These findings suggest that both quantity and content of computer use were related to adolescent drinking. Copyright © 2010 Elsevier Ltd. All rights reserved.
Besnier, Francois; Glover, Kevin A.
2013-01-01
This software package provides an R-based framework to make use of multi-core computers when running analyses in the population genetics program STRUCTURE. It is aimed especially at users of STRUCTURE who deal with numerous and repeated data analyses and who could take advantage of an efficient script to automatically distribute STRUCTURE jobs among multiple processors. It also includes additional functions to divide analyses among combinations of populations within a single data set without the need to manually produce multiple projects, as is currently the case in STRUCTURE. The package consists of two main functions: MPI_structure() and parallel_structure() as well as an example data file. We compared the performance in computing time for this example data on two computer architectures and showed that the use of the present functions can result in several-fold improvements in terms of computation time. ParallelStructure is freely available at https://r-forge.r-project.org/projects/parallstructure/. PMID:23923012
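The package itself is R-based and exposes the functions named above; purely to illustrate the underlying idea of distributing independent STRUCTURE jobs across processors, here is a hedged Python sketch. The job list and command-line flags are assumptions for illustration, not the package's API.

```python
from multiprocessing import Pool
import subprocess

# Hypothetical job list: one STRUCTURE run per assumed K value.
jobs = [["structure", "-m", "mainparams", "-K", str(k), "-o", f"out_K{k}"]
        for k in range(1, 6)]

def run_job(cmd):
    # Launch one STRUCTURE process and report its exit status.
    return subprocess.run(cmd, capture_output=True).returncode

if __name__ == "__main__":
    with Pool(processes=4) as pool:        # four worker processes
        print(pool.map(run_job, jobs))     # distribute jobs among cores
```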
NASA Technical Reports Server (NTRS)
Estes, Samantha; Parker, Nelson C. (Technical Monitor)
2001-01-01
Virtual reality and simulation applications are becoming widespread in human task analysis. These programs have many benefits for the Human Factors Engineering field. Not only do creating and using virtual environments for human engineering analyses save money and time, this approach also promotes user experimentation and provides increased quality of analyses. This paper explains the human engineering task analysis performed on the Environmental Control and Life Support System (ECLSS) space station rack and its Distillation Assembly (DA) subsystem using EAI's human modeling simulation software, Jack. When installed on the International Space Station (ISS), ECLSS will provide the life and environment support needed to adequately sustain crew life. The DA is an Orbital Replaceable Unit (ORU) that provides means of wastewater (primarily urine from flight crew and experimental animals) reclamation. Jack was used to create a model of the weightless environment of the ISS Node 3, where the ECLSS is housed. Computer aided drawings of the ECLSS rack and DA system were also brought into the environment. Anthropometric models of a 95th percentile male and 5th percentile female were used to examine the human interfaces encountered during various ECLSS and DA tasks. The results of the task analyses were used in suggesting modifications to hardware and crew task procedures to improve accessibility, conserve crew time, and add convenience for the crew. This paper will address some of those suggested modifications and the method of presenting final analyses for requirements verification.
NASA Technical Reports Server (NTRS)
Blakely, R. L.
1973-01-01
A G189A simulation of the shuttle orbiter EC/LSS was prepared and used to study payload support capabilities. Two master program libraries of the G189A computer program were prepared for the NASA/JSC computer system. Several new component subroutines were added to the G189A program library and many existing subroutines were revised to improve their capabilities. A number of special analyses were performed in support of a NASA/JSC shuttle orbiter EC/LSS payload support capability study.
Kaga, Akimune; Murotsuki, Jun; Kamimura, Miki; Kimura, Masato; Saito-Hakoda, Akiko; Kanno, Junko; Hoshi, Kazuhiko; Kure, Shigeo; Fujiwara, Ikuma
2015-05-01
Achondroplasia and Down syndrome are relatively common conditions individually, but co-occurrence of the two conditions in the same patient is rare, and there have been no reports of fetal analysis of this condition by prenatal sonographic and three-dimensional (3-D) helical computed tomography (CT). Prenatal sonographic findings seen in persons with Down syndrome, such as a thickened nuchal fold, cardiac defects, and echogenic bowel, were not found in the patient. A prenatal 3-D helical CT revealed a large head with frontal bossing, metaphyseal flaring of the long bones, and small iliac wings, which suggested achondroplasia. In a case with a combination of achondroplasia and Down syndrome, it may be difficult to diagnose the co-occurrence prenatally without typical markers of Down syndrome. © 2014 Japanese Teratology Society.
Principles of Experimental Design for Big Data Analysis.
Drovandi, Christopher C; Holmes, Christopher; McGree, James M; Mengersen, Kerrie; Richardson, Sylvia; Ryan, Elizabeth G
2017-08-01
Big Datasets are endemic, but are often notoriously difficult to analyse because of their size, heterogeneity and quality. The purpose of this paper is to open a discourse on the potential for modern decision theoretic optimal experimental design methods, which by their very nature have traditionally been applied prospectively, to improve the analysis of Big Data through retrospective designed sampling in order to answer particular questions of interest. By appealing to a range of examples, it is suggested that this perspective on Big Data modelling and analysis has the potential for wide generality and advantageous inferential and computational properties. We highlight current hurdles and open research questions surrounding efficient computational optimisation in using retrospective designs, and in part this paper is a call to the optimisation and experimental design communities to work together in the field of Big Data analysis.
NASA Technical Reports Server (NTRS)
Ramsey, J. W., Jr.; Taylor, J. T.; Wilson, J. F.; Gray, C. E., Jr.; Leatherman, A. D.; Rooker, J. R.; Allred, J. W.
1976-01-01
The results of extensive computer (finite element, finite difference and numerical integration), thermal, fatigue, and special analyses of critical portions of a large pressurized, cryogenic wind tunnel (National Transonic Facility) are presented. The computer models, loading and boundary conditions are described. Graphic capability was used to display model geometry, section properties, and stress results. A stress criterion is presented for evaluation of the results of the analyses. Thermal analyses were performed for major critical and typical areas. Fatigue analyses of the entire tunnel circuit are presented.
Comparability of a Paper-Based Language Test and a Computer-Based Language Test.
ERIC Educational Resources Information Center
Choi, Inn-Chull; Kim, Kyoung Sung; Boo, Jaeyool
2003-01-01
Utilizing the Test of English Proficiency developed by Seoul National University (TEPS), this study examined comparability between the paper-based language test and the computer-based language test based on content and construct validation, employing content analyses based on corpus linguistic techniques in addition to such statistical analyses as…
NASA Technical Reports Server (NTRS)
Givi, Peyman; Madnia, Cyrus K.; Steinberger, C. J.; Frankel, S. H.
1992-01-01
The principal objective is to extend the boundaries within which large eddy simulations (LES) and direct numerical simulations (DNS) can be applied in computational analyses of high speed reacting flows. A summary of work accomplished during the last six months is presented.
Cloud computing: a new business paradigm for biomedical information sharing.
Rosenthal, Arnon; Mork, Peter; Li, Maya Hao; Stanford, Jean; Koester, David; Reynolds, Patti
2010-04-01
We examine how the biomedical informatics (BMI) community, especially consortia that share data and applications, can take advantage of a new resource called "cloud computing". Clouds generally offer resources on demand. In most clouds, charges are pay per use, based on large farms of inexpensive, dedicated servers, sometimes supporting parallel computing. Substantial economies of scale potentially yield costs much lower than dedicated laboratory systems or even institutional data centers. Overall, even with conservative assumptions, for applications that are not I/O intensive and do not demand a fully mature environment, the numbers suggested that clouds can sometimes provide major improvements, and should be seriously considered for BMI. Methodologically, it was very advantageous to formulate analyses in terms of component technologies; focusing on these specifics enabled us to bypass the cacophony of alternative definitions (e.g., exactly what does a cloud include) and to analyze alternatives that employ some of the component technologies (e.g., an institution's data center). Relative analyses were another great simplifier. Rather than listing the absolute strengths and weaknesses of cloud-based systems (e.g., for security or data preservation), we focus on the changes from a particular starting point, e.g., individual lab systems. We often find a rough parity (in principle), but one needs to examine individual acquisitions--is a loosely managed lab moving to a well managed cloud, or a tightly managed hospital data center moving to a poorly safeguarded cloud? 2009 Elsevier Inc. All rights reserved.
Cognitive context detection in UAS operators using eye-gaze patterns on computer screens
NASA Astrophysics Data System (ADS)
Mannaru, Pujitha; Balasingam, Balakumar; Pattipati, Krishna; Sibley, Ciara; Coyne, Joseph
2016-05-01
In this paper, we demonstrate the use of eye-gaze metrics of unmanned aerial systems (UAS) operators as effective indices of their cognitive workload. Our analyses are based on an experiment where twenty participants performed pre-scripted UAS missions of three different difficulty levels by interacting with two custom designed graphical user interfaces (GUIs) that are displayed side by side. First, we compute several eye-gaze metrics, traditional eye movement metrics as well as newly proposed ones, and analyze their effectiveness as cognitive classifiers. Most of the eye-gaze metrics are computed by dividing the computer screen into "cells". Then, we perform several analyses in order to select metrics for effective cognitive context classification related to our specific application; the objectives of these analyses are to (i) identify appropriate ways to divide the screen into cells; (ii) select appropriate metrics for training and classification of cognitive features; and (iii) identify a suitable classification method.
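A minimal sketch of the cell-based bookkeeping described here, assuming hypothetical gaze coordinates and an arbitrary 4x4 grid:

```python
import numpy as np

def cell_dwell_counts(gaze_xy, screen_w, screen_h, n_cols, n_rows):
    """Count gaze samples falling in each cell of an n_rows x n_cols grid."""
    x, y = gaze_xy[:, 0], gaze_xy[:, 1]
    col = np.clip((x / screen_w * n_cols).astype(int), 0, n_cols - 1)
    row = np.clip((y / screen_h * n_rows).astype(int), 0, n_rows - 1)
    counts = np.zeros((n_rows, n_cols), int)
    np.add.at(counts, (row, col), 1)   # accumulate one count per sample
    return counts

# Hypothetical gaze samples on a 1920x1080 screen, 4x4 grid.
rng = np.random.default_rng(0)
gaze = rng.uniform([0, 0], [1920, 1080], size=(1000, 2))
print(cell_dwell_counts(gaze, 1920, 1080, 4, 4))
```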
Davies, E Bethan; Morriss, Richard; Glazebrook, Cris
2014-05-16
Depression and anxiety are common mental health difficulties experienced by university students and can impair academic and social functioning. Students are limited in seeking help from professionals. As university students are highly connected to digital technologies, Web-based and computer-delivered interventions could be used to improve students' mental health. The effectiveness of these intervention types requires investigation to identify whether these are viable prevention strategies for university students. The intent of the study was to systematically review and analyze trials of Web-based and computer-delivered interventions to improve depression, anxiety, psychological distress, and stress in university students. Several databases were searched using keywords relating to higher education students, mental health, and eHealth interventions. The eligibility criteria for studies included in the review were: (1) the study aimed to improve symptoms relating to depression, anxiety, psychological distress, and stress, (2) the study involved computer-delivered or Web-based interventions accessed via computer, laptop, or tablet, (3) the study was a randomized controlled trial, and (4) the study was trialed on higher education students. Trials were reviewed and outcome data analyzed through random effects meta-analyses for each outcome and each type of trial arm comparison. The Cochrane Collaboration risk of bias tool was used to assess study quality. A total of 17 trials were identified, of which seven were trials of the same three interventions on separate samples; 14 reported sufficient information for meta-analysis. The majority (n=13) were website-delivered and nine interventions were based on cognitive behavioral therapy (CBT). A total of 1795 participants were randomized and 1480 analyzed. Risk of bias was considered moderate, as many publications did not sufficiently report their methods and seven explicitly conducted completers' analyses. In comparison to the inactive control, sensitivity meta-analyses supported intervention in improving anxiety (pooled standardized mean difference [SMD] -0.56; 95% CI -0.77 to -0.35, P<.001), depression (pooled SMD -0.43; 95% CI -0.63 to -0.22, P<.001), and stress (pooled SMD -0.73; 95% CI -1.27 to -0.19, P=.008). In comparison to active controls, sensitivity analyses did not support either condition for anxiety (pooled SMD -0.18; 95% CI -0.98 to 0.62, P=.66) or depression (pooled SMD -0.28; 95% CI -0.75 to 0.20, P=.25). In contrast to a comparison intervention, neither condition was supported in sensitivity analyses for anxiety (pooled SMD -0.10; 95% CI -0.39 to 0.18, P=.48) or depression (pooled SMD 0.33; 95% CI -0.43 to 1.09, P=.40). The findings suggest Web-based and computer-delivered interventions can be effective in improving students' depression, anxiety, and stress outcomes when compared to inactive controls, but some caution is needed when compared to other trial arms, and methodological issues were noticeable. Interventions need to be trialed on more heterogeneous student samples and would benefit from user evaluation. Future trials should address methodological considerations to improve reporting of trial quality and address post-intervention skewed data.
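For readers unfamiliar with how such pooled SMDs arise, below is a minimal sketch of DerSimonian-Laird random-effects pooling with invented per-trial effect sizes and variances; the review's actual software and per-trial data are not specified here.

```python
import numpy as np

def pooled_smd_random_effects(d, v):
    """DerSimonian-Laird random-effects pooling of standardized mean differences."""
    d, v = np.asarray(d, float), np.asarray(v, float)
    w = 1.0 / v                                  # fixed-effect weights
    d_fixed = np.sum(w * d) / np.sum(w)
    q = np.sum(w * (d - d_fixed) ** 2)           # heterogeneity statistic Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(d) - 1)) / c)      # between-study variance
    w_star = 1.0 / (v + tau2)                    # random-effects weights
    pooled = np.sum(w_star * d) / np.sum(w_star)
    se = 1.0 / np.sqrt(np.sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical per-trial SMDs and variances, for illustration only.
print(pooled_smd_random_effects([-0.6, -0.4, -0.7], [0.04, 0.05, 0.09]))
```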
Varn, Frederick S.; Tafe, Laura J.; Amos, Christopher I.; Cheng, Chao
2018-01-01
Non-small cell lung cancer is one of the leading causes of cancer-related death in the world. Lung adenocarcinoma, the most common type of non-small cell lung cancer, has been well characterized as having a dense lymphocytic infiltrate, suggesting that the immune system plays an active role in shaping this cancer's growth and development. Despite these findings, our understanding of how this infiltrate affects patient prognosis and its association with lung adenocarcinoma-specific clinical factors remains limited. To address these questions, we inferred the infiltration level of six distinct immune cell types from a series of four lung adenocarcinoma gene expression datasets. We found that naive B cell, CD8+ T cell, and myeloid cell-derived expression signals of immune infiltration were significantly predictive of patient survival in multiple independent datasets, with B cell and CD8+ T cell infiltration associated with prolonged prognosis and myeloid cell infiltration associated with shorter survival. These associations remained significant even after accounting for additional clinical variables. Patients stratified by smoking status exhibited decreased CD8+ T cell infiltration and altered prognostic associations, suggesting potential immunosuppressive mechanisms in smokers. Survival analyses accounting for immune checkpoint gene expression and cellular immune infiltrate indicated checkpoint protein-specific modulatory effects on CD8+ T cell and B cell function that may be associated with patient sensitivity to immunotherapy. Together, these analyses identified reproducible associations that can be used to better characterize the role of immune infiltration in lung adenocarcinoma and demonstrate the utility in using computational approaches to systematically characterize tissue-specific tumor-immune interactions. PMID:29872556
A computer program for multiple decrement life table analyses.
Poole, W K; Cooley, P C
1977-06-01
Life table analysis has traditionally been the tool of choice for analyzing the distribution of "survival" times when a parametric form for the survival curve cannot reasonably be assumed. Chiang, in two papers [1,2], formalized the theory of life table analyses in a Markov chain framework and derived maximum likelihood estimates of the relevant parameters for the analyses. He also discussed how the techniques could be generalized to consider competing risks and follow-up studies. Although various computer programs exist for doing different types of life table analysis [3], to date there has not been a generally available, well documented computer program to carry out multiple decrement analyses, either by Chiang's or any other method. This paper describes such a program developed by Research Triangle Institute. A user's manual, available at printing cost, supplements the contents of this paper with a discussion of the formulae used in the program listing.
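As a rough illustration of the multiple decrement bookkeeping such a program performs, here is a simplified deterministic sketch with invented inputs; it is not Chiang's maximum likelihood treatment.

```python
import numpy as np

# Hypothetical inputs: all-cause death probability per age interval, and
# cause-specific death counts per interval (columns = causes of decrement).
q = np.array([0.02, 0.05, 0.10, 0.20])
deaths_by_cause = np.array([[10, 30],
                            [25, 50],
                            [40, 80],
                            [60, 90]], float)

l = 100000.0                                   # radix: survivors entering interval 0
expected = np.zeros(deaths_by_cause.shape[1])  # expected decrements by cause
for i in range(len(q)):
    frac = deaths_by_cause[i] / deaths_by_cause[i].sum()  # share of deaths by cause
    expected += l * q[i] * frac                # attribute this interval's decrement
    l *= 1.0 - q[i]                            # survivors into the next interval
print(expected, l)
```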
NASA Astrophysics Data System (ADS)
Al-Qawasmeh, Ahmad; Holzwarth, N. A. W.
2017-10-01
Two lithium oxonitridophosphate materials, Li14P2O3N6 and Li7PN4, are computationally examined and found to be promising solid electrolytes for possible use in all-solid-state batteries having metallic Li anodes. The first principles simulations are in good agreement with the structural analyses reported in the literature for these materials, and the computed total energies indicate that both materials are stable with respect to decomposition into binary and ternary products. The computational results suggest that both materials are likely to form metastable interfaces with Li metal. The simulations also find both materials to have Li ion migration activation energies comparable to or smaller than those of related Li ion electrolyte materials. Specifically, for Li7PN4, the experimentally measured activation energy can be explained by the migration of a Li ion vacancy stabilized by a small number of O2- ions substituting for N3- ions. For Li14P2O3N6, the activation energy for Li ion migration has not yet been experimentally measured, but simulations predict it to be smaller than that measured for Li7PN4.
Classical boson sampling algorithms with superior performance to near-term experiments
NASA Astrophysics Data System (ADS)
Neville, Alex; Sparrow, Chris; Clifford, Raphaël; Johnston, Eric; Birchall, Patrick M.; Montanaro, Ashley; Laing, Anthony
2017-12-01
It is predicted that quantum computers will dramatically outperform their conventional counterparts. However, large-scale universal quantum computers are yet to be built. Boson sampling is a rudimentary quantum algorithm tailored to the platform of linear optics, which has sparked interest as a rapid way to demonstrate such quantum supremacy. Photon statistics are governed by intractable matrix functions, which suggests that sampling from the distribution obtained by injecting photons into a linear optical network could be solved more quickly by a photonic experiment than by a classical computer. The apparently low resource requirements for large boson sampling experiments have raised expectations of a near-term demonstration of quantum supremacy by boson sampling. Here we present classical boson sampling algorithms and theoretical analyses of prospects for scaling boson sampling experiments, showing that near-term quantum supremacy via boson sampling is unlikely. Our classical algorithm, based on Metropolised independence sampling, allowed the boson sampling problem to be solved for 30 photons with standard computing hardware. Compared to current experiments, a demonstration of quantum supremacy over a successful implementation of these classical methods on a supercomputer would require the number of photons and experimental components to increase by orders of magnitude, while tackling exponentially scaling photon loss.
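To convey the flavour of Metropolised independence sampling in this setting, the toy sketch below samples collision-free output-mode subsets of a random unitary, with the permanent computed by Ryser's formula. It is a simplified illustration at tiny scale, not the authors' optimized 30-photon algorithm.

```python
import numpy as np

def permanent(a):
    """Permanent via Ryser's formula, O(2^n * n^2); fine for small n."""
    n = a.shape[0]
    total = 0.0
    for s in range(1, 1 << n):
        cols = [j for j in range(n) if s >> j & 1]
        rowsums = a[:, cols].sum(axis=1)
        total += (-1) ** bin(s).count("1") * np.prod(rowsums)
    return (-1) ** n * total

rng = np.random.default_rng(1)
n, m = 3, 7                                  # photons and modes, toy scale
u, _ = np.linalg.qr(rng.normal(size=(m, m)) + 1j * rng.normal(size=(m, m)))

def target(outs):
    # Unnormalised collision-free probability: |Perm(U[inputs, outputs])|^2.
    return abs(permanent(u[np.ix_(range(n), outs)])) ** 2

outs = tuple(range(n))                       # initial output-mode configuration
samples = []
for _ in range(2000):
    prop = tuple(sorted(rng.choice(m, n, replace=False)))  # uniform proposal
    if rng.random() < target(prop) / target(outs):         # independence MH accept
        outs = prop
    samples.append(outs)
```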
Grammatical Analysis as a Distributed Neurobiological Function
Bozic, Mirjana; Fonteneau, Elisabeth; Su, Li; Marslen-Wilson, William D
2015-01-01
Language processing engages large-scale functional networks in both hemispheres. Although it is widely accepted that left perisylvian regions have a key role in supporting complex grammatical computations, patient data suggest that some aspects of grammatical processing could be supported bilaterally. We investigated the distribution and the nature of grammatical computations across language processing networks by comparing two types of combinatorial grammatical sequences—inflectionally complex words and minimal phrases—and contrasting them with grammatically simple words. Novel multivariate analyses revealed that they engage a coalition of separable subsystems: inflected forms triggered left-lateralized activation, dissociable into dorsal processes supporting morphophonological parsing and ventral, lexically driven morphosyntactic processes. In contrast, simple phrases activated a consistently bilateral pattern of temporal regions, overlapping with inflectional activations in the left middle temporal gyrus. These data confirm the role of the left-lateralized frontotemporal network in supporting complex grammatical computations. Critically, they also point to the capacity of bilateral temporal regions to support simple, linear grammatical computations. This is consistent with a dual neurobiological framework where phylogenetically older bihemispheric systems form part of the network that supports language function in the modern human, and where significant capacities for language comprehension remain intact even following severe left hemisphere damage. PMID:25421880
Angiuoli, Samuel V.; White, James R.; Matalka, Malcolm; White, Owen; Fricke, W. Florian
2011-01-01
Background The widespread popularity of genomic applications is threatened by the “bioinformatics bottleneck” resulting from uncertainty about the cost and infrastructure needed to meet increasing demands for next-generation sequence analysis. Cloud computing services have been discussed as potential new bioinformatics support systems but have not been evaluated thoroughly. Results We present benchmark costs and runtimes for common microbial genomics applications, including 16S rRNA analysis, microbial whole-genome shotgun (WGS) sequence assembly and annotation, WGS metagenomics and large-scale BLAST. Sequence dataset types and sizes were selected to correspond to outputs typically generated by small- to midsize facilities equipped with 454 and Illumina platforms, except for WGS metagenomics where sampling of Illumina data was used. Automated analysis pipelines, as implemented in the CloVR virtual machine, were used in order to guarantee transparency, reproducibility and portability across different operating systems, including the commercial Amazon Elastic Compute Cloud (EC2), which was used to attach real dollar costs to each analysis type. We found considerable differences in computational requirements, runtimes and costs associated with different microbial genomics applications. While all 16S analyses completed on a single-CPU desktop in under three hours, microbial genome and metagenome analyses utilized multi-CPU support of up to 120 CPUs on Amazon EC2, where each analysis completed in under 24 hours for less than $60. Representative datasets were used to estimate maximum data throughput on different cluster sizes and to compare costs between EC2 and comparable local grid servers. Conclusions Although bioinformatics requirements for microbial genomics depend on dataset characteristics and the analysis protocols applied, our results suggest that smaller sequencing facilities (up to three Roche/454 or one Illumina GAIIx sequencer) invested in 16S rRNA amplicon sequencing, microbial single-genome and metagenomics WGS projects can achieve cost-efficient bioinformatics support using CloVR in combination with Amazon EC2 as an alternative to local computing centers. PMID:22028928
Computational biology for ageing
Wieser, Daniela; Papatheodorou, Irene; Ziehm, Matthias; Thornton, Janet M.
2011-01-01
High-throughput genomic and proteomic technologies have generated a wealth of publicly available data on ageing. Easy access to these data, and their computational analysis, is of great importance in order to pinpoint the causes and effects of ageing. Here, we provide a description of the existing databases and computational tools on ageing that are available for researchers. We also describe the computational approaches to data interpretation in the field of ageing including gene expression, comparative and pathway analyses, and highlight the challenges for future developments. We review recent biological insights gained from applying bioinformatics methods to analyse and interpret ageing data in different organisms, tissues and conditions. PMID:21115530
The influence of computational assumptions on analysing abdominal aortic aneurysm haemodynamics.
Ene, Florentina; Delassus, Patrick; Morris, Liam
2014-08-01
The variation in computational assumptions for analysing abdominal aortic aneurysm haemodynamics can influence the desired output results and computational cost. Such assumptions for abdominal aortic aneurysm modelling include static/transient pressures, steady/transient flows and rigid/compliant walls. Six computational methods and these various assumptions were simulated and compared within a realistic abdominal aortic aneurysm model with and without intraluminal thrombus. A full transient fluid-structure interaction was required to analyse the flow patterns within the compliant abdominal aortic aneurysms models. Rigid wall computational fluid dynamics overestimates the velocity magnitude by as much as 40%-65% and the wall shear stress by 30%-50%. These differences were attributed to the deforming walls which reduced the outlet volumetric flow rate for the transient fluid-structure interaction during the majority of the systolic phase. Static finite element analysis accurately approximates the deformations and von Mises stresses when compared with transient fluid-structure interaction. Simplifying the modelling complexity reduces the computational cost significantly. In conclusion, the deformation and von Mises stress can be approximately found by static finite element analysis, while for compliant models a full transient fluid-structure interaction analysis is required for acquiring the fluid flow phenomenon. © IMechE 2014.
General review of the MOSTAS computer code for wind turbines
NASA Technical Reports Server (NTRS)
Dungundji, J.; Wendell, J. H.
1981-01-01
The MOSTAS computer code for wind turbine analysis is reviewed, and techniques and methods used in its analyses are described. Impressions of its strengths and weaknesses are given, and recommendations for its application, modification, and further development are made. Basic techniques used in wind turbine stability and response analyses for systems with constant and periodic coefficients are reviewed.
Adolescent Sedentary Behaviors: Correlates Differ for Television Viewing and Computer Use
Babey, Susan H.; Hastert, Theresa A.; Wolstein, Joelle
2013-01-01
Purpose Sedentary behavior is associated with obesity in youth. Understanding correlates of specific sedentary behaviors can inform the development of interventions to reduce sedentary time. The current research examines correlates of leisure computer use and television viewing among California adolescents. Methods Using data from the 2005 California Health Interview Survey (CHIS), we examined individual, family and environmental correlates of two sedentary behaviors among 4,029 adolescents: leisure computer use and television watching. Results Linear regression analyses adjusting for a range of factors indicated several differences in the correlates of television watching and computer use. Correlates of additional time spent watching television included male sex, American Indian and African American race, lower household income, lower levels of physical activity, lower parent educational attainment, and additional hours worked by parents. Correlates of a greater amount of time spent using the computer for fun included older age, Asian race, higher household income, lower levels of physical activity, less parental knowledge of free time activities, and living in neighborhoods with higher proportions of non-white residents and higher proportions of low-income residents. Only physical activity was associated similarly with both watching television and computer use. Conclusions These results suggest that correlates of time spent on television watching and leisure computer use are different. Reducing screen time is a potentially successful strategy in combating childhood obesity, and understanding differences in the correlates of different screen time behaviors can inform the development of more effective interventions to reduce sedentary time. PMID:23260837
Feedback Inhibition Shapes Emergent Computational Properties of Cortical Microcircuit Motifs.
Jonke, Zeno; Legenstein, Robert; Habenschuss, Stefan; Maass, Wolfgang
2017-08-30
Cortical microcircuits are very complex networks, but they are composed of a relatively small number of stereotypical motifs. Hence, one strategy for throwing light on the computational function of cortical microcircuits is to analyze emergent computational properties of these stereotypical microcircuit motifs. We address here the question of how spike timing-dependent plasticity shapes the computational properties of one motif that has frequently been studied experimentally: interconnected populations of pyramidal cells and parvalbumin-positive inhibitory cells in layer 2/3. Experimental studies suggest that these inhibitory neurons exert some form of divisive inhibition on the pyramidal cells. We show that this data-based form of feedback inhibition, which is softer than that of winner-take-all models that are commonly considered in theoretical analyses, contributes to the emergence of an important computational function through spike timing-dependent plasticity: the capability to disentangle superimposed firing patterns in upstream networks, and to represent their information content through a sparse assembly code. SIGNIFICANCE STATEMENT We analyze emergent computational properties of a ubiquitous cortical microcircuit motif: populations of pyramidal cells that are densely interconnected with inhibitory neurons. Simulations of this model predict that sparse assembly codes emerge in this microcircuit motif under spike timing-dependent plasticity. Furthermore, we show that different assemblies will represent different hidden sources of upstream firing activity. Hence, we propose that spike timing-dependent plasticity enables this microcircuit motif to perform a fundamental computational operation on neural activity patterns. Copyright © 2017 the authors 0270-6474/17/378511-13$15.00/0.
NASA Astrophysics Data System (ADS)
Yozgatligil, Ceylan; Aslan, Sipan; Iyigun, Cem; Batmaz, Inci
2013-04-01
This study aims to compare several imputation methods to complete the missing values of spatio-temporal meteorological time series. To this end, six imputation methods are assessed with respect to various criteria including accuracy, robustness, precision, and efficiency for artificially created missing data in monthly total precipitation and mean temperature series obtained from the Turkish State Meteorological Service. Of these methods, simple arithmetic average, normal ratio (NR), and NR weighted with correlations comprise the simple ones, whereas multilayer perceptron type neural network and the multiple imputation strategy adopted by Monte Carlo Markov Chain based on expectation-maximization (EM-MCMC) are the computationally intensive ones. In addition, we propose a modification of the EM-MCMC method. Besides using a conventional accuracy measure based on squared errors, we also suggest the correlation dimension (CD) technique of nonlinear dynamic time series analysis, which takes spatio-temporal dependencies into account, for evaluating imputation performance. Based on the detailed graphical and quantitative analyses, it can be said that although computational methods, particularly the EM-MCMC method, are computationally inefficient, they seem favorable for imputation of meteorological time series with respect to different missingness periods considering both measures and both series studied. To conclude, using the EM-MCMC algorithm for imputing missing values before conducting any statistical analyses of meteorological data will definitely decrease the amount of uncertainty and give more robust results. Moreover, the CD measure can be suggested for the performance evaluation of missing data imputation, particularly with computational methods, since it gives more precise results in meteorological time series.
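As an example of the simpler end of this methodological spectrum, here is a hedged sketch of normal-ratio imputation weighted by correlations, with invented station records; the study's exact weighting scheme may differ.

```python
import numpy as np

def nr_weighted_impute(target_hist, neighbor_hists, neighbor_now):
    """Impute one missing value at a target station from neighbouring stations,
    weighting each neighbour's normal-ratio estimate by its squared correlation."""
    est, wsum = 0.0, 0.0
    for hist, now in zip(neighbor_hists, neighbor_now):
        mask = ~np.isnan(target_hist) & ~np.isnan(hist)   # jointly observed months
        r = np.corrcoef(target_hist[mask], hist[mask])[0, 1]
        ratio = target_hist[mask].mean() / hist[mask].mean()   # normal ratio
        w = r * r
        est += w * ratio * now
        wsum += w
    return est / wsum

# Hypothetical monthly precipitation histories (NaN marks the missing month).
t = np.array([30., 42., np.nan, 55., 38.])
n1 = np.array([28., 40., 50., 52., 35.])
n2 = np.array([33., 45., 58., 60., 41.])
print(nr_weighted_impute(t, [n1, n2], [50., 58.]))
```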
Preventing smoking relapse via Web-based computer-tailored feedback: a randomized controlled trial.
Elfeddali, Iman; Bolman, Catherine; Candel, Math J J M; Wiers, Reinout W; de Vries, Hein
2012-08-20
Web-based computer-tailored approaches have the potential to be successful in supporting smoking cessation. However, the potential effects of such approaches for relapse prevention and the value of incorporating action planning strategies to effectively prevent smoking relapse have not been fully explored. The Stay Quit for You (SQ4U) study compared two Web-based computer-tailored smoking relapse prevention programs with different types of planning strategies versus a control group. To assess the efficacy of two Web-based computer-tailored programs in preventing smoking relapse compared with a control group. The action planning (AP) program provided tailored feedback at baseline and invited respondents to do 6 preparatory and coping planning assignments (the first 3 assignments prior to quit date and the final 3 assignments after quit date). The action planning plus (AP+) program was an extended version of the AP program that also provided tailored feedback at 11 time points after the quit attempt. Respondents in the control group only filled out questionnaires. The study also assessed possible dose-response relationships between abstinence and adherence to the programs. The study was a randomized controlled trial with three conditions: the control group, the AP program, and the AP+ program. Respondents were daily smokers (N = 2031), aged 18 to 65 years, who were motivated and willing to quit smoking within 1 month. The primary outcome was self-reported continued abstinence 12 months after baseline. Logistic regression analyses were conducted using three samples: (1) all respondents as randomly assigned, (2) a modified sample that excluded respondents who did not make a quit attempt in conformance with the program protocol, and (3) a minimum dose sample that also excluded respondents who did not adhere to at least one of the intervention elements. Observed case analyses and conservative analyses were conducted. In the observed case analysis of the randomized sample, abstinence rates were 22% (45/202) in the control group versus 33% (63/190) in the AP program and 31% (53/174) in the AP+ program. The AP program (odds ratio 1.95, P = .005) and the AP+ program (odds ratio 1.61, P = .049) were significantly more effective than the control condition. Abstinence rates and effects differed per sample. Finally, the results suggest a dose-response relationship between abstinence and the number of program elements completed by the respondents. Despite the differences in results caused by the variation in our analysis approaches, we can conclude that Web-based computer-tailored programs combined with planning strategy assignments and feedback after the quit attempt can be effective in preventing relapse 12 months after baseline. However, adherence to the intervention seems critical for effectiveness. Finally, our results also suggest that more research is needed to assess the optimum intervention dose. Dutch Trial Register: NTR1892; http://www.trialregister.nl/trialreg/admin/rctview.asp?TC=1892 (Archived by WebCite at http://www.webcitation.org/693S6uuPM).
2011-01-01
Background The aryl hydrocarbon receptor (AhR) is a ligand-activated transcription factor (TF) that mediates responses to 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD). Integration of TCDD-induced genome-wide AhR enrichment, differential gene expression and computational dioxin response element (DRE) analyses further elucidate the hepatic AhR regulatory network. Results Global ChIP-chip and gene expression analyses were performed on hepatic tissue from immature ovariectomized mice orally gavaged with 30 μg/kg TCDD. ChIP-chip analysis identified 14,446 and 974 AhR enriched regions (1% false discovery rate) at 2 and 24 hrs, respectively. Enrichment density was greatest in the proximal promoter, and more specifically, within ± 1.5 kb of a transcriptional start site (TSS). AhR enrichment also occurred distal to a TSS (e.g. intergenic DNA and 3' UTR), extending the potential gene expression regulatory roles of the AhR. Although TF binding site analyses identified over-represented DRE sequences within enriched regions, approximately 50% of all AhR enriched regions lacked a DRE core (5'-GCGTG-3'). Microarray analysis identified 1,896 TCDD-responsive genes (|fold change| ≥ 1.5, P1(t) > 0.999). Integrating this gene expression data with our ChIP-chip and DRE analyses identified only 625 differentially expressed genes that involved an AhR interaction at a DRE. Functional annotation analysis of differentially regulated genes associated with AhR enrichment identified overrepresented processes related to fatty acid and lipid metabolism and transport, and xenobiotic metabolism, which are consistent with TCDD-elicited steatosis in the mouse liver. Conclusions Details of the AhR regulatory network have been expanded to include AhR-DNA interactions within intragenic and intergenic genomic regions. Moreover, the AhR can interact with DNA independent of a DRE core, suggesting there are alternative mechanisms of AhR-mediated gene regulation. PMID:21762485
Linguistic analysis of project ownership for undergraduate research experiences.
Hanauer, D I; Frederick, J; Fotinakes, B; Strobel, S A
2012-01-01
We used computational linguistic and content analyses to explore the concept of project ownership for undergraduate research. We used linguistic analysis of student interview data to develop a quantitative methodology for assessing project ownership and applied this method to measure degrees of project ownership expressed by students in relation to different types of educational research experiences. The results of the study suggest that the design of a research experience significantly influences the degree of project ownership expressed by students when they describe those experiences. The analysis identified both positive and negative aspects of project ownership and provided a working definition for how a student experiences his or her research opportunity. These elements suggest several features that could be incorporated into an undergraduate research experience to foster a student's sense of project ownership.
NASA Technical Reports Server (NTRS)
Suarez, Max J. (Editor); Chang, Alfred T. C.; Chiu, Long S.
1997-01-01
Seventeen months of rainfall data (August 1987-December 1988) from nine satellite rainfall algorithms (Adler, Chang, Kummerow, Prabhakara, Huffman, Spencer, Susskind, and Wu) were analyzed to examine the uncertainty of satellite-derived rainfall estimates. The variability among algorithms, measured as the standard deviation computed from the ensemble of algorithms, shows that regions of high algorithm variability tend to coincide with regions of high rain rates. Histograms of pattern correlation (PC) between algorithms suggest a bimodal distribution, with separation at a PC-value of about 0.85. Applying this threshold as a criterion for similarity, our analyses show that algorithms using the same sensor or satellite input tend to be similar, suggesting the dominance of sampling errors in these satellite estimates.
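A compact sketch of the two diagnostics used here, inter-algorithm standard deviation and pairwise pattern correlation with the 0.85 similarity threshold, using synthetic rain maps in place of the nine satellite products:

```python
import numpy as np

# Synthetic monthly rain-rate maps from three "algorithms", flattened to 1-D.
rng = np.random.default_rng(2)
base = rng.gamma(2.0, 2.0, size=500)
maps = np.stack([base + rng.normal(0, s, 500) for s in (0.5, 0.6, 2.0)])

spread = maps.std(axis=0)       # inter-algorithm variability per grid box
pc = np.corrcoef(maps)          # pairwise pattern correlations
similar = pc > 0.85             # the similarity criterion from the analysis
print(pc.round(2))
print(similar)
```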
Debates—Hypothesis testing in hydrology: Introduction
NASA Astrophysics Data System (ADS)
Blöschl, Günter
2017-03-01
This paper introduces the papers in the "Debates—Hypothesis testing in hydrology" series. The four articles in the series discuss whether and how the process of testing hypotheses leads to progress in hydrology. Repeated experiments with controlled boundary conditions are rarely feasible in hydrology. Research is therefore not easily aligned with the classical scientific method of testing hypotheses. Hypotheses in hydrology are often enshrined in computer models which are tested against observed data. Testability may be limited due to model complexity and data uncertainty. All four articles suggest that hypothesis testing has contributed to progress in hydrology and is needed in the future. However, the procedure is usually not as systematic as the philosophy of science suggests. A greater emphasis on a creative reasoning process on the basis of clues and explorative analyses is therefore needed.
Steady-state and transient operation of a heat-pipe radiator system
NASA Technical Reports Server (NTRS)
Sellers, J. P.
1974-01-01
Data obtained on a VCHP heat-pipe radiator system tested in a vacuum environment were studied. Analyses and interpretation of the steady-state results are presented along with an initial analysis of some of the transient data. Particular emphasis was placed on quantitative comparisons of the experimental data with computer model simulations. The results of the study provide a better understanding of the system but do not provide a complete explanation for the observed low VCHP performance and the relatively flat radiator panel temperature distribution. The results of the study also suggest hardware, software, and testing improvements.
Space shuttle propulsion parameter estimation using optimal estimation techniques
NASA Technical Reports Server (NTRS)
1983-01-01
Regression analyses were performed on tabular aerodynamic data to provide a representative aerodynamic model for coefficient estimation. This also reduced the storage requirements for the "normal" model used to check out the estimation algorithms. The results of the regression analyses are presented. The computer routines for the filter portion of the estimation algorithm were developed, and the SRB predictive program was brought up on the computer. For the filter program, approximately 54 routines were developed. The routines were highly subsegmented to facilitate overlaying program segments within the partitioned storage space on the computer.
NASA Technical Reports Server (NTRS)
Vallee, J.; Gibbs, B.
1976-01-01
Between August 1975 and March 1976, two NASA projects with geographically separated participants used a computer-conferencing system developed by the Institute for the Future for portions of their work. Monthly usage statistics for the system were collected in order to examine the group and individual participation figures for all conferences. The conference transcripts were analysed to derive observations about the use of the medium. In addition to the results of these analyses, the attitudes of users and the major components of the costs of computer conferencing are discussed.
Schulte, Friederike A; Lambers, Floor M; Mueller, Thomas L; Stauber, Martin; Müller, Ralph
2014-04-01
Time-lapsed in vivo micro-computed tomography is a powerful tool to analyse longitudinal changes in the bone micro-architecture. Registration can overcome problems associated with spatial misalignment between scans; however, it requires image interpolation, which might affect the outcome of a subsequent bone morphometric analysis. The impact of the interpolation error itself, though, has not been quantified to date. Therefore, the purpose of this ex vivo study was to evaluate the effect of different interpolator schemes [nearest neighbour, tri-linear and B-spline (BSP)] on bone morphometric indices. None of the interpolator schemes led to significant differences between interpolated and non-interpolated images, with the lowest interpolation error found for BSPs (1.4%). Furthermore, depending on the interpolator, the processing order of registration, Gaussian filtration and binarisation played a role. Independent of the interpolator, the present findings suggest that the evaluation of bone morphometry should be done with images registered using greyscale information.
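The interpolation error described can be probed with a small experiment: shift an image by a sub-voxel offset and back with each interpolator and measure the residual. The sketch below uses scipy's map_coordinates on a synthetic 2-D slice (order 1 is bilinear, the 2-D analogue of tri-linear); the data and offsets are illustrative, not the study's.

```python
import numpy as np
from scipy.ndimage import map_coordinates

# Synthetic grey-scale slice standing in for a micro-CT image.
rng = np.random.default_rng(3)
img = rng.random((64, 64))
yy, xx = np.mgrid[0:64, 0:64].astype(float)

for order, name in [(0, "nearest"), (1, "bilinear"), (3, "B-spline")]:
    shifted = map_coordinates(img, [yy + 0.3, xx + 0.3], order=order, mode="nearest")
    back = map_coordinates(shifted, [yy - 0.3, xx - 0.3], order=order, mode="nearest")
    err = np.abs(back - img).mean() / img.mean() * 100  # round-trip residual
    print(f"{name}: mean interpolation error {err:.2f}%")
```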
Rough set classification based on quantum logic
NASA Astrophysics Data System (ADS)
Hassan, Yasser F.
2017-11-01
By combining the advantages of quantum computing and soft computing, the paper shows that rough sets can be used with quantum logic for classification and recognition systems. We suggest a new definition of rough set theory as quantum logic theory. Rough approximations are essential elements in rough set theory; the quantum rough set model for set-valued data directly constructs set approximations based on a kind of quantum similarity relation, which is presented here. Theoretical analyses demonstrate that the new model for quantum rough sets has a new type of decision rule with less redundancy, which can be used to give accurate classification using principles of quantum superposition and non-linear quantum relations. To our knowledge, this is the first attempt to define rough sets in a quantum representation rather than in logic or set theory. The experiments on data-sets have demonstrated that the proposed model is more accurate than traditional rough sets in terms of finding optimal classifications.
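For readers unfamiliar with the classical machinery being generalized here, the following is a minimal sketch of ordinary (non-quantum) rough set lower and upper approximations on a toy decision table; the paper's quantum similarity relation is not reproduced.

```python
from collections import defaultdict

# Toy decision table: objects described by condition attributes.
objects = {
    "x1": (("colour", "red"),  ("size", "small")),
    "x2": (("colour", "red"),  ("size", "small")),
    "x3": (("colour", "blue"), ("size", "large")),
    "x4": (("colour", "red"),  ("size", "large")),
}
target = {"x1", "x3"}   # the concept to approximate

# Indiscernibility classes: objects with identical attribute vectors.
classes = defaultdict(set)
for obj, attrs in objects.items():
    classes[attrs].add(obj)

# Lower approximation: classes fully inside the concept (certain members).
lower = {o for c in classes.values() if c <= target for o in c}
# Upper approximation: classes that intersect the concept (possible members).
upper = {o for c in classes.values() if c & target for o in c}

print("lower:", lower)   # {'x3'}
print("upper:", upper)   # {'x1', 'x2', 'x3'}
```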
Sato, Y; Wadamoto, M; Tsuga, K; Teixeira, E R
1999-04-01
Improving the validity of finite element analysis in implant biomechanics requires element downsizing. However, excessive downsizing demands more computer memory and calculation time. To investigate the effectiveness of element downsizing in the construction of a three-dimensional finite element bone trabeculae model, models with different element sizes (600, 300, 150 and 75 µm) were constructed and the stress induced by vertical 10 N loading was analysed. The difference in von Mises stress values between the models with 600 and 300 µm element sizes was larger than that between the 300 and 150 µm models. On the other hand, no clear difference in stress values was detected among the models with 300, 150 and 75 µm element sizes. Downsizing of elements from 600 to 300 µm is therefore suggested to be effective in the construction of a three-dimensional finite element bone trabeculae model, with possible savings of computer memory and calculation time in the laboratory.
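A minimal sketch of the mesh-convergence reasoning, with hypothetical peak von Mises stresses per element size standing in for the study's results:

```python
import numpy as np

# Illustrative convergence check across element sizes, in the spirit of the
# study: peak von Mises stress values per mesh (numbers are made up).
element_um = np.array([600, 300, 150, 75])
von_mises = np.array([8.4, 10.1, 10.4, 10.5])   # MPa, hypothetical

# Relative change between successive refinements; a small change suggests
# the coarser of the two meshes is already adequate.
rel_change = np.abs(np.diff(von_mises)) / von_mises[1:] * 100
for (coarse, fine), d in zip(zip(element_um[:-1], element_um[1:]), rel_change):
    print(f"{coarse:4d} um -> {fine:3d} um: {d:.1f} % change in peak stress")
```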
ERIC Educational Resources Information Center
Barthur, Ashrith
2016-01-01
There are two essential goals of this research. The first goal is to design and construct a computational environment that is used for studying large and complex datasets in the cybersecurity domain. The second goal is to analyse the Spamhaus blacklist query dataset which includes uncovering the properties of blacklisted hosts and understanding…
Space coding for sensorimotor transformations can emerge through unsupervised learning.
De Filippo De Grazia, Michele; Cutini, Simone; Lisi, Matteo; Zorzi, Marco
2012-08-01
The posterior parietal cortex (PPC) is fundamental for sensorimotor transformations because it combines multiple sensory inputs and posture signals into different spatial reference frames that drive motor programming. Here, we present a computational model mimicking the sensorimotor transformations occurring in the PPC. A recurrent neural network with one layer of hidden neurons (restricted Boltzmann machine) learned a stochastic generative model of the sensory data without supervision. After the unsupervised learning phase, the activity of the hidden neurons was used to compute a motor program (a population code on a bidimensional map) through a simple linear projection and delta rule learning. The average motor error, calculated as the difference between the expected and the computed output, was less than 3°. Importantly, analyses of the hidden neurons revealed gain-modulated visual receptive fields, thereby showing that space coding for sensorimotor transformations similar to that observed in the PPC can emerge through unsupervised learning. These results suggest that gain modulation is an efficient coding strategy to integrate visual and postural information toward the generation of motor commands.
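A minimal sketch of the readout stage described above, assuming the unsupervised phase has already produced hidden-layer activations (random codes stand in for RBM hidden activity here); the delta-rule update is the standard error-driven rule, not the authors' exact code.

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_hidden, n_out = 500, 50, 2

# Stand-in for RBM hidden-layer activity after unsupervised learning;
# here just random codes with a linear structure to be decoded.
H = rng.random((n_samples, n_hidden))
W_true = rng.normal(size=(n_hidden, n_out))
targets = H @ W_true                      # desired motor program (2-D map coords)

# Simple linear readout trained with the delta rule (online gradient descent).
W = np.zeros((n_hidden, n_out))
lr = 0.01
for epoch in range(200):
    for h, t in zip(H, targets):
        y = h @ W
        W += lr * np.outer(h, t - y)      # delta rule: error-driven update

err = np.abs(H @ W - targets).mean()
print(f"mean absolute motor error: {err:.4f}")
```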
Vision 20/20: Automation and advanced computing in clinical radiation oncology.
Moore, Kevin L; Kagadis, George C; McNutt, Todd R; Moiseenko, Vitali; Mutic, Sasa
2014-01-01
This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.
NASA Technical Reports Server (NTRS)
Stricker, L. T.
1975-01-01
The LOVES computer program was employed to analyze the geosynchronous portion of NASA's 1973 automated satellite mission model from 1980 to 1990. The objectives of the analyses were: (1) to demonstrate the capability of the LOVES code to provide the depth and accuracy of data required to support the analyses; and (2) to trade off the concept of space servicing automated satellites composed of replaceable modules against the concept of replacing expendable satellites upon failure. The computer code proved to be an invaluable tool in analyzing the logistic requirements of the various test cases required in the tradeoff. It is indicated that the concept of space servicing offers the potential for substantial savings in the cost of operating automated satellite systems.
A computer program for geochemical analysis of acid-rain and other low-ionic-strength, acidic waters
Johnsson, P.A.; Lord, D.G.
1987-01-01
ARCHEM, a computer program written in FORTRAN 77, is designed primarily for use in the routine geochemical interpretation of low-ionic-strength, acidic waters. On the basis of chemical analyses of the water, and either laboratory or field determinations of pH, temperature, and dissolved oxygen, the program calculates the equilibrium distribution of major inorganic aqueous species and of inorganic aluminum complexes. The concentration of the organic anion is estimated from the dissolved organic carbon concentration. Ionic ferrous iron is calculated from the dissolved oxygen concentration. Ionic balances and comparisons of computed with measured specific conductances are performed as checks on the analytical accuracy of the chemical analyses. ARCHEM may be tailored easily to fit different sampling protocols, and may be run on multiple sample analyses.
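ARCHEM itself is FORTRAN 77; the following Python sketch illustrates only the ionic-balance check the abstract mentions, with made-up concentrations and a simplified species list.

```python
# Charge-balance check in the spirit of ARCHEM: compare summed cation and
# anion charge (in microequivalents per litre) and report percent imbalance.
# Concentrations and species below are illustrative, not ARCHEM's input format.
UEQ_PER_MG = {  # charge / molar mass (per mg), times 1000 gives ueq/L per mg/L
    "Ca": 2 / 40.08, "Mg": 2 / 24.31, "Na": 1 / 22.99, "K": 1 / 39.10,
    "Cl": 1 / 35.45, "SO4": 2 / 96.06, "NO3": 1 / 62.00,
}
sample = {"Ca": 1.2, "Mg": 0.4, "Na": 1.1, "K": 0.3,   # mg/L
          "Cl": 1.9, "SO4": 3.1, "NO3": 1.6}

cations = sum(sample[s] * UEQ_PER_MG[s] * 1000 for s in ("Ca", "Mg", "Na", "K"))
anions = sum(sample[s] * UEQ_PER_MG[s] * 1000 for s in ("Cl", "SO4", "NO3"))

imbalance = 100 * (cations - anions) / (cations + anions)
print(f"cations {cations:.1f} ueq/L, anions {anions:.1f} ueq/L, "
      f"imbalance {imbalance:+.1f} %")
```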
Global Dynamics of Proteins: Bridging Between Structure and Function
Bahar, Ivet; Lezon, Timothy R.; Yang, Lee-Wei; Eyal, Eran
2010-01-01
Biomolecular systems possess unique, structure-encoded dynamic properties that underlie their biological functions. Recent studies indicate that these dynamic properties are determined to a large extent by the topology of native contacts. In recent years, elastic network models used in conjunction with normal mode analyses have proven to be useful for elucidating the collective dynamics intrinsically accessible under native state conditions, including in particular the global modes of motions that are robustly defined by the overall architecture. With increasing availability of structural data for well-studied proteins in different forms (liganded, complexed, or free), there is increasing evidence in support of the correspondence between functional changes in structures observed in experiments and the global motions predicted by these coarse-grained analyses. These observed correlations suggest that computational methods may be advantageously employed for assessing functional changes in structure and allosteric mechanisms intrinsically favored by the native fold. PMID:20192781
Hallmann, Kirstin; Breuer, Christoph
2014-01-01
This article analyses sport participation using a demographic-economic model which was extended by the construct 'social recognition'. Social recognition was integrated into the model on the understanding that it is the purpose of each individual to maximise his or her utility. A computer-assisted telephone interview survey was conducted in the city of Rheinberg, Germany, producing an overall sample of n=1934. Regression analyses were performed to estimate the impact of socio-demographic and economic determinants and of social recognition on sport participation. The results suggest that various socio-economic factors and social recognition are important determinants of sport participation on the one hand, and of sport frequency on the other. Social recognition plays a significant yet different role for both sport participation and sport frequency. While friends' involvement with sport influences one's sport participation, parents' involvement with sport influences one's sport frequency.
A new tool called DISSECT for analysing large genomic data sets using a Big Data approach
Canela-Xandri, Oriol; Law, Andy; Gray, Alan; Woolliams, John A.; Tenesa, Albert
2015-01-01
Large-scale genetic and genomic data are increasingly available and the major bottleneck in their analysis is a lack of sufficiently scalable computational tools. To address this problem in the context of complex traits analysis, we present DISSECT. DISSECT is a new and freely available software that is able to exploit the distributed-memory parallel computational architectures of compute clusters, to perform a wide range of genomic and epidemiologic analyses, which currently can only be carried out on reduced sample sizes or under restricted conditions. We demonstrate the usefulness of our new tool by addressing the challenge of predicting phenotypes from genotype data in human populations using mixed-linear model analysis. We analyse simulated traits from 470,000 individuals genotyped for 590,004 SNPs in ∼4 h using the combined computational power of 8,400 processor cores. We find that prediction accuracies in excess of 80% of the theoretical maximum could be achieved with large sample sizes. PMID:26657010
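DISSECT is a distributed-memory MPI application; the single-machine sketch below illustrates only the underlying statistical idea, approximating mixed-linear-model prediction by ridge regression (SNP-BLUP) on simulated genotypes. Sample sizes and the variance-ratio choice are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)
n_ind, n_snp, n_causal = 1000, 2000, 100

# Simulate standardized genotypes and a polygenic trait (~50% heritability).
X = rng.binomial(2, 0.5, size=(n_ind, n_snp)).astype(float)
X = (X - X.mean(0)) / X.std(0)
beta = np.zeros(n_snp)
beta[rng.choice(n_snp, n_causal, replace=False)] = rng.normal(size=n_causal)
g = X @ beta
y = g + rng.normal(scale=g.std(), size=n_ind)

# Ridge regression, equivalent to SNP-BLUP with a fixed variance ratio:
# lambda = m * (1 - h2) / h2, here with h2 = 0.5 so lambda = m.
train, test = slice(0, 800), slice(800, None)
lam = n_snp
A = X[train].T @ X[train] + lam * np.eye(n_snp)
b_hat = np.linalg.solve(A, X[train].T @ y[train])

r = np.corrcoef(X[test] @ b_hat, y[test])[0, 1]
print(f"prediction accuracy (r): {r:.2f}")
```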
Modeling Human-Computer Decision Making with Covariance Structure Analysis.
ERIC Educational Resources Information Center
Coovert, Michael D.; And Others
Arguing that sufficient theory exists about the interplay between human information processing, computer systems, and the demands of various tasks to construct useful theories of human-computer interaction, this study presents a structural model of human-computer interaction and reports the results of various statistical analyses of this model.…
Computer Instructional Aids for Undergraduate Control Education.
ERIC Educational Resources Information Center
Volz, Richard A.; And Others
Engineering is coming to rely more and more heavily upon the computer for computations, analyses, and graphic displays which aid the design process. A general purpose simulation system, the Time-shared Automatic Control Laboratory (TACL), and a set of computer-aided design programs, Control Oriented Interactive Graphic Analysis and Design…
Biophysical and computational characterization of vandetanib-lysozyme interaction
NASA Astrophysics Data System (ADS)
Kabir, Md. Zahirul; Hamzah, Nur Aziean Binti; Ghani, Hamidah; Mohamad, Saharuddin B.; Alias, Zazali; Tayyab, Saad
2018-01-01
Interaction of an anticancer drug, vandetanib (VDB), with a ligand transporter, lysozyme (LYZ), was explored using multispectroscopic techniques, such as fluorescence, absorption and circular dichroism, along with computational analysis. Fluorescence data and absorption results confirmed VDB-LYZ complexation. VDB-induced quenching was characterized as static quenching based on the inverse correlation of KSV with temperature as well as kq values. The complex was characterized by a weak binding constant (Ka = 4.96–3.14 × 10³ M⁻¹). Thermodynamic data (ΔS = +12.82 J mol⁻¹ K⁻¹; ΔH = −16.73 kJ mol⁻¹) of the VDB-LYZ interaction revealed participation of hydrophobic and van der Waals forces along with hydrogen bonds in VDB-LYZ complexation. Microenvironmental perturbations around tryptophan and tyrosine residues as well as secondary and tertiary structural alterations in LYZ upon addition of VDB were evident from the 3-D fluorescence and far- and near-UV CD spectral analyses, respectively. Interestingly, addition of VDB to LYZ significantly increased the protein's thermostability. Molecular docking results suggested the location of the VDB binding site near the LYZ active site, while molecular dynamics simulation results suggested stability of the VDB-LYZ complex. Presence of Mg²⁺, Ba²⁺ and Zn²⁺ was found to interfere with the VDB-LYZ interaction.
McNally, Richard J.; Heeren, Alexandre; Robinaugh, Donald J.
2017-01-01
Background: The network approach to mental disorders offers a novel framework for conceptualizing posttraumatic stress disorder (PTSD) as a causal system of interacting symptoms. Objective: In this study, we extended this work by estimating the structure of relations among PTSD symptoms in adults reporting personal histories of childhood sexual abuse (CSA; N = 179). Method: We employed two complementary methods. First, using the graphical LASSO, we computed a sparse, regularized partial correlation network revealing associations (edges) between pairs of PTSD symptoms (nodes). Next, using a Bayesian approach, we computed a directed acyclic graph (DAG) to estimate a directed, potentially causal model of the relations among symptoms. Results: For the first network, we found that physiological reactivity to reminders of trauma, dreams about the trauma, and loss of interest in previously enjoyed activities were highly central nodes. However, stability analyses suggest that these findings were unstable across subsets of our sample. The DAG suggests that becoming physiologically reactive and upset in response to reminders of the trauma may be key drivers of other symptoms in adult survivors of CSA. Conclusions: Our study illustrates the strengths and limitations of these network analytic approaches to PTSD. PMID:29038690
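A minimal sketch of the first analysis step (the regularized partial-correlation network), using scikit-learn's graphical lasso on stand-in symptom data; the DAG estimation step is not reproduced, and the injected dependencies are purely illustrative.

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV

rng = np.random.default_rng(0)
n_subj, n_sympt = 179, 17                 # e.g. 17 PTSD symptom scores

# Stand-in symptom data with a few genuine dependencies mixed in.
data = rng.normal(size=(n_subj, n_sympt))
data[:, 1] += 0.6 * data[:, 0]            # dreams ~ reactivity (illustrative)
data[:, 2] += 0.5 * data[:, 0]

model = GraphicalLassoCV().fit(data)

# Partial correlations from the estimated precision matrix:
# rho_ij = -p_ij / sqrt(p_ii * p_jj).
prec = model.precision_
partial = -prec / np.sqrt(np.outer(np.diag(prec), np.diag(prec)))
np.fill_diagonal(partial, 0)

# Surviving edges of the regularized network, and a simple strength centrality.
edges = np.argwhere(np.triu(np.abs(partial) > 1e-6, k=1))
strength = np.abs(partial).sum(0)
print(f"{len(edges)} edges retained; most central node: {int(strength.argmax())}")
```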
The effect of country wealth on incidence of breast cancer.
Coccia, Mario
2013-09-01
The aim of this study is to analyze the relationship between the incidence of breast cancer and income per capita across countries. Data on breast cancer incidence in 52 countries were obtained from GLOBOCAN, along with economic indicators of gross domestic product per capita from the World Bank. The numbers of computed tomography scanners and magnetic resonance imaging units (from the World Health Organization) were used as a surrogate for technology and access to screening for cancer diagnosis. Statistical analyses for correlation and regression were performed, along with an analysis of variance (ANOVA). A strong positive association between breast cancer incidence and gross domestic product per capita (Pearson's r = 65.4%), controlling for latitude and for the density of computed tomography scanners and magnetic resonance imaging units, was found in countries of temperate zones. The estimated relationship suggests that a 1% higher gross domestic product per capita, within the temperate zones (latitudes), increases the expected age-standardized breast cancer incidence by about 35.6% (p < 0.001). ANOVA confirms these results. While some have argued that latitude and seasonality may affect breast cancer incidence, these findings suggest that wealthier nations may have a higher incidence of breast cancer independent of geographic location and screening technology.
Bobo-Pinilla, Javier; Barrios de León, Sara B; Seguí Colomar, Jaume; Fenu, Giuseppe; Bacchetta, Gianluigi; Peñas de Giles, Julio; Martínez-Ortega, María Montserrat
2016-01-01
Although it has been traditionally accepted that Arenaria balearica (Caryophyllaceae) could be a relict Tertiary plant species, this has never been experimentally tested. Nor have the palaeohistorical reasons underlying the highly fragmented distribution of the species in the Western Mediterranean region been investigated. We have analysed AFLP data (213) and plastid DNA sequences (226) from a total of 250 plants from 29 populations sampled throughout the entire distribution range of the species in Majorca, Corsica, Sardinia, and the Tuscan Archipelago. The AFLP data analyses indicate very low geographic structure and population differentiation. Based on plastid DNA data, six alternative phylogeographic hypotheses were tested using Approximate Bayesian Computation (ABC). These analyses revealed ancient area fragmentation as the most probable scenario, which is in accordance with the star-like topology of the parsimony network that suggests a pattern of long term survival and subsequent in situ differentiation. Overall low levels of genetic diversity and plastid DNA variation were found, reflecting evolutionary stasis of a species preserved in locally long-term stable habitats.
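A minimal sketch of rejection-ABC model choice in the spirit of the analysis above, with two toy scenario simulators and a single made-up summary statistic in place of the study's coalescent simulations:

```python
import numpy as np

rng = np.random.default_rng(7)
obs = 0.004                       # observed summary statistic (illustrative)

# Two competing historical scenarios, each a toy simulator of the statistic.
def ancient_fragmentation():
    return rng.normal(0.004, 0.001)

def recent_expansion():
    return rng.normal(0.007, 0.001)

models = {"ancient fragmentation": ancient_fragmentation,
          "recent expansion": recent_expansion}

# Rejection ABC for model choice: simulate under each scenario with equal
# prior weight and compare acceptance counts near the observation.
eps, n_sims = 0.0005, 50_000
counts = {name: sum(abs(f() - obs) < eps for _ in range(n_sims))
          for name, f in models.items()}
total = sum(counts.values())
for name, c in counts.items():
    print(f"P({name} | data) ~ {c / total:.2f}")
```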
NASA Astrophysics Data System (ADS)
Kumar, K. Ravi; Cheepu, Muralimohan; Srinivas, B.; Venkateswarlu, D.; Pramod Kumar, G.; Shiva, Apireddi
2018-03-01
In solar air heaters, artificial roughness on the absorber plate has become a prominent technique for improving the heat transfer rate in the air flow passage by breaking up the laminar sublayer. The selection of rib geometry plays an important role in the friction characteristics and the heat transfer rate. Many researchers have studied roughness shapes over the years to investigate the effect of geometry on the friction factor and heat transfer performance of solar air heaters. The present study attempts to model the different rib shapes used to create artificial roughness and to compare them in order to identify the higher-performing geometry. The use of computational fluid dynamics software yielded correlations for the friction factor and heat transfer rate. The simulations were performed on a 2D computational fluid dynamics model and analysed to identify the most effective parameters of relative roughness height, width and pitch with respect to the major considerations of friction factor and heat transfer. In the current study the Reynolds number was varied from 3000 to 20,000, and heat transfer and turbulence phenomena were modelled as functions of Reynolds number. The modelling results showed that the formation of a strong vortex in the main stream flow due to the right-angle triangle roughness, compared with the square, rectangle, improved rectangle and equilateral triangle geometries, enhanced the heat transfer in the solar air heater. The simulated turbulence kinetic energy suggests that the local turbulence kinetic energy is strongly influenced by the alignment of the right-angle triangle.
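The smooth-duct baselines against which rib-roughened results are typically normalized can be computed directly; the sketch below uses the standard Blasius and Dittus-Boelter correlations over the study's Reynolds-number range (a baseline sketch, not the paper's CFD-derived correlations).

```python
import numpy as np

# Smooth-duct baselines commonly used to normalize rib-roughened results:
# Blasius (Fanning) friction factor and Dittus-Boelter Nusselt number.
Re = np.linspace(3000, 20000, 5)
Pr = 0.71                                  # air

f_smooth = 0.079 * Re ** -0.25             # Blasius, Fanning form
Nu_smooth = 0.023 * Re ** 0.8 * Pr ** 0.4  # Dittus-Boelter, heating

for re, f, nu in zip(Re, f_smooth, Nu_smooth):
    print(f"Re={re:7.0f}  f={f:.5f}  Nu={nu:6.1f}")

# A roughened duct's thermo-hydraulic performance is often summarized as
# (Nu/Nu_smooth) / (f/f_smooth)**(1/3), comparing geometries at equal
# pumping power.
```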
Onboard Navigation Systems Characteristics
NASA Technical Reports Server (NTRS)
1979-01-01
The space shuttle onboard navigation systems characteristics are described. A standard source of equations and numerical data for use in error analyses and mission simulations related to space shuttle development is reported. The sensor characteristics described are used for shuttle onboard navigation performance assessment. The use of complete models in the studies depends on the analyses to be performed, the capabilities of the computer programs, and the availability of computer resources.
Elastic-plastic finite-element analyses of thermally cycled double-edge wedge specimens
NASA Technical Reports Server (NTRS)
Kaufman, A.; Hunt, L. E.
1982-01-01
Elastic-plastic stress-strain analyses were performed for double-edge wedge specimens subjected to thermal cycling in fluidized beds at 316 and 1088 C. Four cases involving different nickel-base alloys (IN 100, Mar M-200, NASA TAZ-8A, and Rene 80) were analyzed by using the MARC nonlinear, finite element computer program. Elastic solutions from MARC showed good agreement with previously reported solutions obtained by using the NASTRAN and ISO3DQ computer programs. Equivalent total strain ranges at the critical locations calculated by elastic analyses agreed within 3 percent with those calculated from elastic-plastic analyses. The elastic analyses always resulted in compressive mean stresses at the critical locations. However, elastic-plastic analyses showed tensile mean stresses for two of the four alloys and an increase in the compressive mean stress for the highest plastic strain case.
Computer aided system engineering for space construction
NASA Technical Reports Server (NTRS)
Racheli, Ugo
1989-01-01
This viewgraph presentation covers the following topics. Construction activities envisioned for the assembly of large platforms in space (as well as interplanetary spacecraft and bases on extraterrestrial surfaces) require computational tools that exceed the capability of conventional construction management programs. The Center for Space Construction is investigating the requirements for new computational tools and, at the same time, suggesting the expansion of graduate and undergraduate curricula to include proficiency in Computer Aided Engineering (CAE) through design courses and individual or team projects in advanced space systems design. In the center's research, special emphasis is placed on problems of constructability and of the interruptability of planned activity sequences to be carried out by crews operating under hostile environmental conditions. The departure point for the planned work is the acquisition of the MCAE I-DEAS software, developed by the Structural Dynamics Research Corporation (SDRC), and its expansion to the level of capability denoted by the acronym IDEAS**2 currently used for configuration maintenance on Space Station Freedom. In addition to improving proficiency in the use of I-DEAS and IDEAS**2, it is contemplated that new software modules will be developed to expand the architecture of IDEAS**2. Such modules will deal with those analyses that require the integration of a space platform's configuration with a breakdown of planned construction activities and with a failure modes analysis to support computer aided system engineering (CASE) applied to space construction.
TV Time but Not Computer Time Is Associated with Cardiometabolic Risk in Dutch Young Adults
Altenburg, Teatske M.; de Kroon, Marlou L. A.; Renders, Carry M.; HiraSing, Remy; Chinapaw, Mai J. M.
2013-01-01
Background: TV time and total sedentary time have been positively related to biomarkers of cardiometabolic risk in adults. We aim to examine the associations of TV time and computer time separately with cardiometabolic biomarkers in young adults, as well as the mediating role of waist circumference (WC). Methods and Findings: Data from 634 Dutch young adults (18–28 years; 39% male) were used. Cardiometabolic biomarkers included indicators of overweight, blood pressure, blood levels of fasting plasma insulin, cholesterol, glucose and triglycerides, and a clustered cardiometabolic risk score. Linear regression analyses were used to assess the cross-sectional association of self-reported TV and computer time with cardiometabolic biomarkers, adjusting for demographic and lifestyle factors. Mediation by WC was checked using the product-of-coefficient method. TV time was significantly associated with triglycerides (B = 0.004; CI = [0.001;0.05]) and insulin (B = 0.10; CI = [0.01;0.20]). Computer time was not significantly associated with any of the cardiometabolic biomarkers. We found no evidence for WC mediating the association of TV time or computer time with cardiometabolic biomarkers. Conclusions: We found a significantly positive association of TV time with cardiometabolic biomarkers, and no evidence for WC as a mediator of this association. Our findings suggest a need to distinguish between TV time and computer time within future guidelines for screen time. PMID:23460900
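A minimal sketch of the product-of-coefficient mediation test named above, on simulated data with hypothetical effect sizes (the Sobel standard error is one common way to test the indirect effect; the paper's exact variant is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 634

# Illustrative data: TV time (h/day), waist circumference (cm), insulin.
tv = rng.gamma(2.0, 1.0, n)
wc = 80 + 1.5 * tv + rng.normal(0, 8, n)                    # path a (simulated)
insulin = 5 + 0.10 * tv + 0.05 * wc + rng.normal(0, 2, n)   # paths c' and b

def ols(y, *xs):
    """OLS coefficients and standard errors via the normal equations."""
    X = np.column_stack([np.ones(len(y)), *xs])
    coef, res, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma2 = res[0] / (len(y) - X.shape[1])
    se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))
    return coef, se

a_coef, a_se = ols(wc, tv)            # exposure -> mediator
b_coef, b_se = ols(insulin, tv, wc)   # mediator -> outcome, adjusting exposure

a, sa = a_coef[1], a_se[1]
b, sb = b_coef[2], b_se[2]
ab = a * b                            # product-of-coefficients indirect effect
sobel_se = np.sqrt(a**2 * sb**2 + b**2 * sa**2)
print(f"indirect effect a*b = {ab:.3f}  (Sobel z = {ab / sobel_se:.2f})")
```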
Morriss, Richard; Glazebrook, Cris
2014-01-01
Background: Depression and anxiety are common mental health difficulties experienced by university students and can impair academic and social functioning. Students are limited in seeking help from professionals. As university students are highly connected to digital technologies, Web-based and computer-delivered interventions could be used to improve students' mental health. The effectiveness of these intervention types requires investigation to identify whether these are viable prevention strategies for university students. Objective: The intent of the study was to systematically review and analyze trials of Web-based and computer-delivered interventions to improve depression, anxiety, psychological distress, and stress in university students. Methods: Several databases were searched using keywords relating to higher education students, mental health, and eHealth interventions. The eligibility criteria for studies included in the review were: (1) the study aimed to improve symptoms relating to depression, anxiety, psychological distress, and stress, (2) the study involved computer-delivered or Web-based interventions accessed via computer, laptop, or tablet, (3) the study was a randomized controlled trial, and (4) the study was trialed on higher education students. Trials were reviewed and outcome data analyzed through random effects meta-analyses for each outcome and each type of trial arm comparison. The Cochrane Collaboration risk of bias tool was used to assess study quality. Results: A total of 17 trials were identified, seven of which evaluated the same three interventions on separate samples; 14 reported sufficient information for meta-analysis. The majority (n=13) were website-delivered and nine interventions were based on cognitive behavioral therapy (CBT). A total of 1795 participants were randomized and 1480 analyzed. Risk of bias was considered moderate, as many publications did not sufficiently report their methods and seven explicitly conducted completers' analyses. In comparison to the inactive control, sensitivity meta-analyses supported intervention in improving anxiety (pooled standardized mean difference [SMD] −0.56; 95% CI −0.77 to −0.35, P<.001), depression (pooled SMD −0.43; 95% CI −0.63 to −0.22, P<.001), and stress (pooled SMD −0.73; 95% CI −1.27 to −0.19, P=.008). In comparison to active controls, sensitivity analyses did not support either condition for anxiety (pooled SMD −0.18; 95% CI −0.98 to 0.62, P=.66) or depression (pooled SMD −0.28; 95% CI −0.75 to −0.20, P=.25). In contrast to a comparison intervention, neither condition was supported in sensitivity analyses for anxiety (pooled SMD −0.10; 95% CI −0.39 to 0.18, P=.48) or depression (pooled SMD −0.33; 95% CI −0.43 to 1.09, P=.40). Conclusions: The findings suggest Web-based and computer-delivered interventions can be effective in improving students' depression, anxiety, and stress outcomes when compared to inactive controls, but some caution is needed when compared to other trial arms, and methodological issues were noticeable. Interventions need to be trialed on more heterogeneous student samples and would benefit from user evaluation. Future trials should address methodological considerations to improve reporting of trial quality and address post-intervention skewed data. PMID:24836465
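A minimal sketch of the random-effects pooling underlying such meta-analyses, using the DerSimonian-Laird estimator on made-up study effects (the review's software and exact computational method are not stated here):

```python
import numpy as np

# Illustrative per-study standardized mean differences and their variances
# (made-up numbers, not the review's data).
smd = np.array([-0.61, -0.45, -0.72, -0.30, -0.55])
var = np.array([0.040, 0.055, 0.030, 0.060, 0.045])

# DerSimonian-Laird random-effects pooling.
w_fixed = 1 / var
mu_fixed = np.sum(w_fixed * smd) / np.sum(w_fixed)
Q = np.sum(w_fixed * (smd - mu_fixed) ** 2)          # heterogeneity statistic
k = len(smd)
tau2 = max(0.0, (Q - (k - 1)) /
           (w_fixed.sum() - (w_fixed ** 2).sum() / w_fixed.sum()))

w_re = 1 / (var + tau2)                              # random-effects weights
mu_re = np.sum(w_re * smd) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))
lo, hi = mu_re - 1.96 * se_re, mu_re + 1.96 * se_re
print(f"pooled SMD = {mu_re:.2f}, 95% CI [{lo:.2f}, {hi:.2f}], tau^2 = {tau2:.3f}")
```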
Roze, S; Liens, D; Palmer, A; Berger, W; Tucker, D; Renaudin, C
2006-12-01
The aim of this study was to describe a health economic model developed to project lifetime clinical and cost outcomes of lipid-modifying interventions in patients not reaching target lipid levels and to assess the validity of the model. The internet-based, computer simulation model is made up of two decision analytic sub-models, the first utilizing Monte Carlo simulation, and the second applying Markov modeling techniques. Monte Carlo simulation generates a baseline cohort for long-term simulation by assigning an individual lipid profile to each patient, and applying the treatment effects of interventions under investigation. The Markov model then estimates the long-term clinical (coronary heart disease events, life expectancy, and quality-adjusted life expectancy) and cost outcomes up to a lifetime horizon, based on risk equations from the Framingham study. Internal and external validation analyses were performed. The results of the model validation analyses, plotted against corresponding real-life values from Framingham, 4S, AFCAPS/TexCAPS, and a meta-analysis by Gordon et al., showed that the majority of values were close to the y = x line, which indicates a perfect fit. The R2 value was 0.9575 and the gradient of the regression line was 0.9329, both very close to the perfect fit (= 1). Validation analyses of the computer simulation model suggest the model is able to recreate the outcomes from published clinical studies and would be a valuable tool for the evaluation of new and existing therapy options for patients with persistent dyslipidemia.
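A minimal sketch of the Markov sub-model idea, assuming an illustrative three-state structure with made-up transition probabilities, costs and utilities; the actual model uses Framingham risk equations, which are not reproduced here:

```python
import numpy as np

# Three-state Markov cohort model (annual cycles): well -> post-CHD -> dead.
# Transition probabilities, costs, and utilities are illustrative only.
P = np.array([[0.94, 0.04, 0.02],    # from "well"
              [0.00, 0.85, 0.15],    # from "post-CHD"
              [0.00, 0.00, 1.00]])   # "dead" is absorbing
cost = np.array([500.0, 4000.0, 0.0])      # annual cost per state
utility = np.array([0.95, 0.70, 0.0])      # QALY weight per state
discount = 0.03

state = np.array([1.0, 0.0, 0.0])          # whole cohort starts "well"
total_cost = total_qaly = 0.0
for year in range(40):                     # approximate lifetime horizon
    d = (1 + discount) ** -year
    total_cost += d * state @ cost
    total_qaly += d * state @ utility
    state = state @ P                      # advance the cohort one cycle

print(f"discounted lifetime cost: {total_cost:,.0f}")
print(f"discounted QALYs: {total_qaly:.2f}")
```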
Ai, Hiroyuki; Kai, Kazuki; Kumaraswamy, Ajayrama; Ikeno, Hidetoshi; Wachtler, Thomas
2017-11-01
Female honeybees use the "waggle dance" to communicate the location of nectar sources to their hive mates. Distance information is encoded in the duration of the waggle phase (von Frisch, 1967). During the waggle phase, the dancer produces trains of vibration pulses, which are detected by the follower bees via Johnston's organ located on the antennae. To uncover the neural mechanisms underlying the encoding of distance information in the waggle dance follower, we investigated morphology, physiology, and immunohistochemistry of interneurons arborizing in the primary auditory center of the honeybee (Apis mellifera). We identified major interneuron types, named DL-Int-1, DL-Int-2, and bilateral DL-dSEG-LP, that responded with different spiking patterns to vibration pulses applied to the antennae. Experimental and computational analyses suggest that inhibitory connections play a role in encoding and processing the duration of vibration pulse trains in the primary auditory center of the honeybee. SIGNIFICANCE STATEMENT: The waggle dance represents a form of symbolic communication used by honeybees to convey the location of food sources via species-specific sound. The brain mechanisms used to decipher this symbolic information are unknown. We examined interneurons in the honeybee primary auditory center and identified different neuron types with specific properties. The results of our computational analyses suggest that inhibitory connections play a role in encoding waggle dance signals. Our results are critical for understanding how the honeybee deciphers information from the sound produced by the waggle dance and provide new insights regarding how common neural mechanisms are used by different species to achieve communication.
Neural Spike-Train Analyses of the Speech-Based Envelope Power Spectrum Model
Rallapalli, Varsha H.
2016-01-01
Diagnosing and treating hearing impairment is challenging because people with similar degrees of sensorineural hearing loss (SNHL) often have different speech-recognition abilities. The speech-based envelope power spectrum model (sEPSM) has demonstrated that the signal-to-noise ratio (SNR_ENV) from a modulation filter bank provides a robust speech-intelligibility measure across a wider range of degraded conditions than many long-standing models. In the sEPSM, noise (N) is assumed to: (a) reduce S + N envelope power by filling in dips within clean speech (S) and (b) introduce an envelope noise floor from intrinsic fluctuations in the noise itself. While the promise of SNR_ENV has been demonstrated for normal-hearing listeners, it has not been thoroughly extended to hearing-impaired listeners because of limited physiological knowledge of how SNHL affects speech-in-noise envelope coding relative to noise alone. Here, envelope coding to speech-in-noise stimuli was quantified from auditory-nerve model spike trains using shuffled correlograms, which were analyzed in the modulation-frequency domain to compute modulation-band estimates of neural SNR_ENV. Preliminary spike-train analyses show strong similarities to the sEPSM, demonstrating feasibility of neural SNR_ENV computations. Results suggest that individual differences can occur based on differential degrees of outer- and inner-hair-cell dysfunction in listeners currently diagnosed into the single audiological SNHL category. The predicted acoustic-SNR dependence in individual differences suggests that the SNR-dependent rate of susceptibility could be an important metric in diagnosing individual differences. Future measurements of the neural SNR_ENV in animal studies with various forms of SNHL will provide valuable insight for understanding individual differences in speech-in-noise intelligibility.
Cornejo-Romero, Amelia; Aguilar-Martínez, Gustavo F.; Medina-Sánchez, Javier; Rendón-Aguilar, Beatriz; Valverde, Pedro Luis; Zavala-Hurtado, Jose Alejandro; Serrato, Alejandra; Rivas-Arancibia, Sombra; Pérez-Hernández, Marco Aurelio; López-Ortega, Gerardo; Jiménez-Sierra, Cecilia
2017-01-01
Historic demography changes of plant species adapted to New World arid environments could be consistent with either the Glacial Refugium Hypothesis (GRH), which posits that populations contracted to refuges during the cold-dry glacial and expanded in warm-humid interglacial periods, or with the Interglacial Refugium Hypothesis (IRH), which suggests that populations contracted during interglacials and expanded in glacial times. These contrasting hypotheses are developed in the present study for the giant columnar cactus Cephalocereus columna-trajani in the intertropical Mexican drylands where the effects of Late Quaternary climatic changes on phylogeography of cacti remain largely unknown. In order to determine if the historic demography and phylogeographic structure of the species are consistent with either hypothesis, sequences of the chloroplast regions psbA-trnH and trnT-trnL from 110 individuals from 10 populations comprising the full distribution range of this species were analysed. Standard estimators of genetic diversity and structure were calculated. The historic demography was analysed using a Bayesian approach and the palaeodistribution was derived from ecological niche modelling to determine if, in the arid environments of south-central Mexico, glacial-interglacial cycles drove the genetic divergence and diversification of this species. Results reveal low but statistically significant population differentiation (FST = 0.124, P < 0.001), although very clear geographic clusters are not formed. Genetic diversity, haplotype network and Approximate Bayesian Computation (ABC) demographic analyses suggest a population expansion estimated to have taken place in the Last Interglacial (123.04 kya, 95% CI 115.3–130.03). The species palaeodistribution is consistent with the ABC analyses and indicates that the potential area of palaeodistribution and climatic suitability were larger during the Last Interglacial and Holocene than in the Last Glacial Maximum. Overall, these results suggest that C. columna-trajani experienced an expansion following the warm conditions of interglacials, in accordance with the GRH. PMID:28426818
The History of the AutoChemist®: From Vision to Reality.
Peterson, H E; Jungner, I
2014-05-22
This paper discusses the early history and development of a clinical analyser system in Sweden (AutoChemist, 1965). It highlights the importance of such a high-capacity system both for clinical use and for health care screening. The device was developed to assure the quality of results and to automatically handle orders, store the results in digital form for later statistical analyses, and distribute the results to the patients' physicians using the analyser's integrated computer. The most important result of constructing an analyser able to produce analytical results on a mass scale was the development of a mechanical multi-channel analyser for clinical laboratories that handled discrete sample technology, prevented carry-over into subsequent test samples, and incorporated computer technology to improve the quality of test results. The AutoChemist could handle 135 samples per hour in an 8-hour shift and up to 24 analysis channels, yielding 3,200 results per hour; later versions would double this capacity. Some customers used the equipment 24 hours per day. With a capacity of 3,000 to 6,000 analyses per hour, pneumatically driven pipettes, special units for corrosive liquids or special activities, and an integrated computer, the AutoChemist system was unique and the largest of its kind for many years. Its successor, the AutoChemist PRISMA (PRogrammable Individually Selective Modular Analyzer), was smaller in size but had a higher capacity. Both analysers established new standards of operation for clinical laboratories and encouraged others to use new technologies for building new analysers.
Ogawa, Akitoshi; Bordier, Cecile; Macaluso, Emiliano
2013-01-01
The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard "condition-based" designs, as well as "computational" methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and the complexity of auditory multi-sources signals. The blocked analyses associated 3D viewing with the activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed the effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli.
NASA Astrophysics Data System (ADS)
Henderson, Jean Foster
The purpose of this study was to assess the effect of classroom restructuring involving computer laboratories on student achievement and student attitudes toward computers and computer courses. The effects of the targeted student attributes of gender, previous programming experience, math background, and learning style were also examined. The open lab-based class structure consisted of a traditional lecture class with a separate, unscheduled lab component in which lab assignments were completed outside of class; the closed lab-based class structure integrated a lab component within the lecture class so that half the class was reserved for lecture and half the class was reserved for students to complete lab assignments by working cooperatively with each other and under the supervision and guidance of the instructor. The sample consisted of 71 students enrolled in four intact classes of Computer Science I during the fall and spring semesters of the 2006-2007 school year at two southern universities: two classes were held in the fall (one at each university) and two classes were held in the spring (one at each university). A counterbalanced repeated measures design was used in which all students experienced both class structures for half of each semester. The order of control and treatment was rotated among the four classes. All students received the same amount of class and instructor time. A multivariate analysis of variance (MANOVA) via a multiple regression strategy was used to test the study's hypotheses. Although the overall MANOVA model was statistically significant, independent follow-up univariate analyses relative to each dependent measure found that the only significant research factor was math background: Students whose mathematics background was at the level of Calculus I or higher had significantly higher student achievement than students whose mathematics background was less than Calculus I. The results suggest that classroom structures that incorporate an open laboratory setting are just as effective on student achievement and attitudes as classroom structures that incorporate a closed laboratory setting. The results also suggest that math background is a strong predictor of student achievement in CS 1.
Radiological interpretation of images displayed on tablet computers: a systematic review.
Caffery, L J; Armfield, N R; Smith, A C
2015-06-01
To review the published evidence and to determine whether radiological diagnostic accuracy is compromised when images are displayed on a tablet computer, and thereby inform practice on using tablet computers for radiological interpretation by on-call radiologists. We searched the PubMed and EMBASE databases for studies on the diagnostic accuracy or diagnostic reliability of images interpreted on tablet computers. Studies were screened for inclusion based on pre-determined inclusion and exclusion criteria. Studies were assessed for quality and risk of bias using the Quality Appraisal of Diagnostic Reliability Studies or the revised Quality Assessment of Diagnostic Accuracy Studies tool. Treatment of studies was reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA). Eleven studies met the inclusion criteria; ten of these tested the Apple iPad® (Apple, Cupertino, CA). The included studies reported high sensitivity (84-98%), specificity (74-100%) and accuracy rates (98-100%) for radiological diagnosis. There was no statistically significant difference in accuracy between a tablet computer and a Digital Imaging and Communications in Medicine (DICOM)-calibrated control display. There was a near complete consensus from authors on the non-inferiority of diagnostic accuracy of images displayed on a tablet computer. All of the included studies were judged to be at risk of bias. Our findings suggest that the diagnostic accuracy of radiological interpretation is not compromised by using a tablet computer. This result is only relevant to the Apple iPad and to the modalities of CT, MRI and plain radiography. The iPad may be appropriate for an on-call radiologist to use for radiological interpretation.
Inquiry-Based Learning Case Studies for Computing and Computing Forensic Students
ERIC Educational Resources Information Center
Campbell, Jackie
2012-01-01
Purpose: The purpose of this paper is to describe and discuss the use of specifically-developed, inquiry-based learning materials for Computing and Forensic Computing students. Small applications have been developed which require investigation in order to de-bug code, analyse data issues and discover "illegal" behaviour. The applications…
Sequence-structure mapping errors in the PDB: OB-fold domains
Venclovas, Česlovas; Ginalski, Krzysztof; Kang, Chulhee
2004-01-01
The Protein Data Bank (PDB) is the single most important repository of structural data for proteins and other biologically relevant molecules. Therefore, it is critically important to keep the PDB data, as much as possible, error-free. In this study, we have analyzed PDB crystal structures possessing oligonucleotide/oligosaccharide binding (OB)-fold, one of the highly populated folds, for the presence of sequence-structure mapping errors. Using energy-based structure quality assessment coupled with sequence analyses, we have found that there are at least five OB-structures in the PDB that have regions where sequences have been incorrectly mapped onto the structure. We have demonstrated that the combination of these computation techniques is effective not only in detecting sequence-structure mapping errors, but also in providing guidance to correct them. Namely, we have used results of computational analysis to direct a revision of X-ray data for one of the PDB entries containing a fairly inconspicuous sequence-structure mapping error. The revised structure has been deposited with the PDB. We suggest use of computational energy assessment and sequence analysis techniques to facilitate structure determination when homologs having known structure are available to use as a reference. Such computational analysis may be useful in either guiding the sequence-structure assignment process or verifying the sequence mapping within poorly defined regions. PMID:15133161
Grammatical analysis as a distributed neurobiological function.
Bozic, Mirjana; Fonteneau, Elisabeth; Su, Li; Marslen-Wilson, William D
2015-03-01
Language processing engages large-scale functional networks in both hemispheres. Although it is widely accepted that left perisylvian regions have a key role in supporting complex grammatical computations, patient data suggest that some aspects of grammatical processing could be supported bilaterally. We investigated the distribution and the nature of grammatical computations across language processing networks by comparing two types of combinatorial grammatical sequences--inflectionally complex words and minimal phrases--and contrasting them with grammatically simple words. Novel multivariate analyses revealed that they engage a coalition of separable subsystems: inflected forms triggered left-lateralized activation, dissociable into dorsal processes supporting morphophonological parsing and ventral, lexically driven morphosyntactic processes. In contrast, simple phrases activated a consistently bilateral pattern of temporal regions, overlapping with inflectional activations in L middle temporal gyrus. These data confirm the role of the left-lateralized frontotemporal network in supporting complex grammatical computations. Critically, they also point to the capacity of bilateral temporal regions to support simple, linear grammatical computations. This is consistent with a dual neurobiological framework where phylogenetically older bihemispheric systems form part of the network that supports language function in the modern human, and where significant capacities for language comprehension remain intact even following severe left hemisphere damage.
Modality-independent coding of spatial layout in the human brain
Wolbers, Thomas; Klatzky, Roberta L.; Loomis, Jack M.; Wutte, Magdalena G.; Giudice, Nicholas A.
2011-01-01
In many non-human species, neural computations of navigational information such as position and orientation are not tied to a specific sensory modality [1, 2]. Rather, spatial signals are integrated from multiple input sources, likely leading to abstract representations of space. In contrast, the potential for abstract spatial representations in humans is not known, as most neuroscientific experiments on human navigation have focused exclusively on visual cues. Here, we tested the modality independence hypothesis with two fMRI experiments that characterized computations in regions implicated in processing spatial layout [3]. According to the hypothesis, such regions should be recruited for spatial computation of 3-D geometric configuration, independent of a specific sensory modality. In support of this view, sighted participants showed strong activation of the parahippocampal place area (PPA) and the retrosplenial cortex (RSC) for visual and haptic exploration of information-matched scenes but not objects. Functional connectivity analyses suggested that these effects were not related to visual recoding, which was further supported by a similar preference for haptic scenes found with blind participants. Taken together, these findings establish the PPA/RSC network as critical in modality-independent spatial computations and provide important evidence for a theory of high-level abstract spatial information processing in the human brain. PMID:21620708
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tournier, J.; El-Genk, M.S.; Huang, L.
1999-01-01
The Institute of Space and Nuclear Power Studies at the University of New Mexico has developed a computer simulation of cylindrical geometry alkali metal thermal-to-electric converter cells using a standard Fortran 77 computer code. The objective and use of this code was to compare the experimental measurements with computer simulations, upgrade the model as appropriate, and conduct investigations of various methods to improve the design and performance of the devices for improved efficiency, durability, and longer operational lifetime. The Institute of Space and Nuclear Power Studies participated in vacuum testing of PX series alkali metal thermal-to-electric converter cells and developed the alkali metal thermal-to-electric converter Performance Evaluation and Analysis Model. This computer model consisted of a sodium pressure loss model, a cell electrochemical and electric model, and a radiation/conduction heat transfer model. The code closely predicted the operation and performance of a wide variety of PX series cells, which led to suggestions for improvements to both lifetime and performance. The code provides valuable insight into the operation of the cell, predicts parameters of components within the cell, and is a useful tool for predicting both the transient and steady state performance of systems of cells.
Murphy, Suzanne M; Faulkner, Dorothy M; Reynolds, Laura R
2014-11-01
An intervention aiming to support children with social communication difficulties was tested using a randomised controlled design. Children aged 5-6 years old (n=32) were tested and selected for participation on the basis of their scores on the Test of Pragmatic Skills (TPS) and were then randomly assigned to the intervention arm or to the delayed intervention control group. Following previous research which suggested that computer technology may be particularly useful for this group of children, the intervention included a collaborative computer game which the children played with an adult. Subsequently, children's performance as they played the game with a classmate was observed. Micro-analytic observational methods were used to analyse the audio-recorded interaction of the children as they played. Pre- and post-intervention measures comprised the Test of Pragmatic Skills, children's performance on the computer game and verbal communication measures that the children used during the game. This evaluation of the intervention shows promise. At post-test, the children who had received the intervention, by comparison to the control group who had not, showed significant gains in their scores on the Test of Pragmatic Skills (p=.009, effect size r=-.42), a significant improvement in their performance on the computer game (p=.03, r=-.32) and significantly greater use of high-quality questioning during collaboration (p<.001, r=-.60). Furthermore, the children who received the intervention made significantly more positive statements about the game and about their partners (p=.02, r=-.34) suggesting that the intervention increased their confidence and enjoyment. Copyright © 2014 Elsevier Ltd. All rights reserved.
Elastic-plastic finite-element analyses of thermally cycled single-edge wedge specimens
NASA Technical Reports Server (NTRS)
Kaufman, A.
1982-01-01
Elastic-plastic stress-strain analyses were performed for single-edge wedge alloys subjected to thermal cycling in fluidized beds. Three cases (NASA TAZ-8A alloy under one cycling condition and 316 stainless steel alloy under two cycling conditions) were analyzed by using the MARC nonlinear, finite-element computer program. Elastic solutions from MARC showed good agreement with previously reported solutions that used the NASTRAN and ISO3DQ computer programs. The NASA TAZ-8A case exhibited no plastic strains, and the elastic and elastic-plastic analyses gave identical results. Elastic-plastic analyses of the 316 stainless steel alloy showed plastic strain reversal with a shift of the mean stresses in the compressive direction. The maximum equivalent total strain ranges for these cases were 13 to 22 percent greater than that calculated from elastic analyses.
NASA Technical Reports Server (NTRS)
1980-01-01
Burns & McDonnell Engineering's environmental control studies are assisted by programs in environmental analysis from NASA's Computer Software Management and Information Center (COSMIC). The company is engaged primarily in the design of facilities such as electrical utilities, industrial plants, wastewater treatment systems, dams and reservoirs, and aviation installations. It also conducts environmental engineering analyses and advises clients on the environmental considerations of a particular construction project. The company makes use of many COSMIC computer programs, which have allowed substantial savings.
Evaluating data mining algorithms using molecular dynamics trajectories.
Tatsis, Vasileios A; Tjortjis, Christos; Tzirakis, Panagiotis
2013-01-01
Molecular dynamics simulations provide a sample of a molecule's conformational space. Experiments on the μs time scale, resulting in large amounts of data, are nowadays routine. Data mining techniques such as classification provide a way to analyse such data. In this work, we evaluate and compare several classification algorithms using three data sets which resulted from computer simulations of a potential enzyme-mimetic biomolecule. We evaluated 65 classifiers available in the well-known data mining toolkit Weka, using classification errors to assess algorithmic performance. Results suggest that: (i) 'meta' classifiers perform better than the other groups when applied to molecular dynamics data sets; (ii) Random Forest and Rotation Forest are the best classifiers for all three data sets; and (iii) classification via clustering yields the highest classification error. Our findings are consistent with bibliographic evidence, suggesting a 'roadmap' for dealing with such data.
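The comparison workflow above is straightforward to reproduce in miniature. The study ran 65 classifiers in Weka (a Java toolkit); the sketch below, in Python with scikit-learn as a stand-in, cross-validates a few classifier families on a synthetic frames-by-features matrix and reports classification error. All data and model choices here are illustrative assumptions, not the paper's data sets.

```python
# Minimal sketch, assuming a (frames x features) matrix distilled from an MD
# trajectory with conformational class labels; synthetic data stand in here.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, n_informative=8,
                           n_classes=3, random_state=0)

classifiers = {
    "random_forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "logistic_regression": LogisticRegression(max_iter=1000),
    "decision_tree": DecisionTreeClassifier(random_state=0),
}

for name, clf in classifiers.items():
    acc = cross_val_score(clf, X, y, cv=10).mean()  # 10-fold CV accuracy
    print(f"{name:20s} classification error = {1 - acc:.3f}")
```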
Technology in practice – GP computer use by age.
Henderson, Joan; Pollack, Allan; Gordon, Julie; Miller, Graeme
2014-12-01
Since 2005, more than 95% of general practitioners (GPs) have had access to computers in their clinical work. We have analysed the most recent 2 years of BEACH data (April 2012-March 2014) to determine whether GP age affects clinical computer use.
An autonomous molecular computer for logical control of gene expression.
Benenson, Yaakov; Gil, Binyamin; Ben-Dor, Uri; Adar, Rivka; Shapiro, Ehud
2004-05-27
Early biomolecular computer research focused on laboratory-scale, human-operated computers for complex computational problems. Recently, simple molecular-scale autonomous programmable computers were demonstrated allowing both input and output information to be in molecular form. Such computers, using biological molecules as input data and biologically active molecules as outputs, could produce a system for 'logical' control of biological processes. Here we describe an autonomous biomolecular computer that, at least in vitro, logically analyses the levels of messenger RNA species, and in response produces a molecule capable of affecting levels of gene expression. The computer operates at a concentration of close to a trillion computers per microlitre and consists of three programmable modules: a computation module, that is, a stochastic molecular automaton; an input module, by which specific mRNA levels or point mutations regulate software molecule concentrations, and hence automaton transition probabilities; and an output module, capable of controlled release of a short single-stranded DNA molecule. This approach might be applied in vivo to biochemical sensing, genetic engineering and even medical diagnosis and treatment. As a proof of principle we programmed the computer to identify and analyse mRNA of disease-related genes associated with models of small-cell lung cancer and prostate cancer, and to produce a single-stranded DNA molecule modelled after an anticancer drug.
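To make the three-module logic concrete, here is a hedged software analogue (not the molecular implementation): a stochastic automaton that remains in a 'yes' state only while each marker check succeeds, with marker-derived probabilities playing the input module's role and a final 'yes' standing in for release of the output strand. Marker names and probabilities are invented for illustration.

```python
import random

random.seed(0)

def diagnose(success_probs, trials=50_000):
    """Fraction of runs ending in 'yes' (i.e. releasing the output strand)."""
    releases = 0
    for _ in range(trials):
        state = "yes"
        for p in success_probs:       # one stochastic transition per marker
            if random.random() > p:
                state = "no"          # absorbing negative diagnosis
                break
        releases += (state == "yes")
    return releases / trials

# Marker-derived transition probabilities (invented values): high levels of
# both markers make release of the drug-like output strand likely.
print(diagnose([0.90, 0.85]))   # disease-like profile -> mostly 'yes'
print(diagnose([0.20, 0.85]))   # marker 1 low -> mostly 'no'
```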
Examining Computer Gaming Addiction in Terms of Different Variables
ERIC Educational Resources Information Center
Kurt, Adile Askim; Dogan, Ezgi; Erdogmus, Yasemin Kahyaoglu; Emiroglu, Bulent Gursel
2018-01-01
Computer gaming addiction is one of the newer problems that young generations face and can be defined as the excessive and problematic use of computer games leading to social and/or emotional problems. The purpose of this study is to analyse the computer gaming addiction levels of secondary school students in terms of several variables. The research was…
Understanding the Critics of Educational Technology: Gender Inequities and Computers 1983-1993.
ERIC Educational Resources Information Center
Mangione, Melissa
Although many view computers purely as technological tools to be utilized in the classroom and workplace, attention has been drawn to the social differences computers perpetuate, including those of race, class, and gender. This paper focuses on gender and computing by examining recent analyses with regard to content, form, and usage concerns. The…
Computer Access and Computer Use for Science Performance of Racial and Linguistic Minority Students
ERIC Educational Resources Information Center
Chang, Mido; Kim, Sunha
2009-01-01
This study examined the effects of computer access and computer use on the science achievement of elementary school students, with focused attention on the effects for racial and linguistic minority students. The study used the Early Childhood Longitudinal Study (ECLS-K) database and conducted statistical analyses with proper weights and…
Summary of Computer Usage and Inventory of Computer Utilization in Curriculum. FY 1987-88.
ERIC Educational Resources Information Center
Tennessee Univ., Chattanooga. Center of Excellence for Computer Applications.
This report presents the results of a computer usage survey/inventory, the ninth in a series conducted at the University of Tennessee at Chattanooga to obtain information on the changing status of computer usage in the curricula. Data analyses are reported in 11 tables, which include comparisons between annual inventories and demonstrate growth…
The Impact of Secondary History Teachers' Teaching Conceptions on the Classroom Use of Computers
ERIC Educational Resources Information Center
Arancibia Herrera, Marcelo; Badia Garganté, Antoni; Soto Caro, Carmen Paz; Sigerson, Andrew Lee
2018-01-01
During the past 15 years, various studies have described factors affecting the use of computers in the classroom. In analysing factors of influence, many studies have focused on technology-related variables such as computer experience or attitudes toward computers, and others have considered teachers' beliefs as well; most of them have studied…
Investigation of Navier-Stokes Code Verification and Design Optimization
NASA Technical Reports Server (NTRS)
Vaidyanathan, Rajkumar
2004-01-01
With rapid progress made in employing computational techniques for various complex Navier-Stokes fluid flow problems, design optimization problems traditionally based on empirical formulations and experiments are now being addressed with the aid of computational fluid dynamics (CFD). To be able to carry out an effective CFD-based optimization study, it is essential that the uncertainty and appropriate confidence limits of the CFD solutions be quantified over the chosen design space. The present dissertation investigates the issues related to code verification, surrogate model-based optimization and sensitivity evaluation. For Navier-Stokes (NS) CFD code verification a least square extrapolation (LSE) method is assessed. This method projects numerically computed NS solutions from multiple, coarser base grids onto a finer grid and improves solution accuracy by minimizing the residual of the discretized NS equations over the projected grid. In this dissertation, the finite volume (FV) formulation is focused on. The interplay between these concepts and the outcome of LSE, and the effects of solution gradients and singularities, nonlinear physics, and coupling of flow variables on the effectiveness of LSE are investigated. A CFD-based design optimization of a single-element liquid rocket injector is conducted with surrogate models developed using response surface methodology (RSM) based on CFD solutions. The computational model consists of the NS equations, finite rate chemistry, and the k-ε turbulence closure. With the aid of these surrogate models, sensitivity and trade-off analyses are carried out for the injector design whose geometry (hydrogen flow angle, hydrogen and oxygen flow areas and oxygen post tip thickness) is optimized to attain desirable goals in performance (combustion length) and life/survivability (the maximum temperatures on the oxidizer post tip and injector face and a combustion chamber wall temperature). A preliminary multi-objective optimization study is carried out using a geometric mean approach. Following this, sensitivity analyses with the aid of a variance-based non-parametric approach and partial correlation coefficients are conducted using data available from surrogate models of the objectives and the multi-objective optima to identify the contribution of the design variables to the objective variability and to analyze the variability of the design variables and the objectives. In summary, the present dissertation offers insight into an improved coarse-to-fine grid extrapolation technique for Navier-Stokes computations and also suggests tools for a designer to conduct design optimization studies and related sensitivity analyses for a given design problem.
NASA Astrophysics Data System (ADS)
Grujicic, M.; Arakere, G.; Hariharan, A.; Pandurangan, B.
2012-06-01
The introduction of newer joining technologies such as so-called friction-stir welding (FSW) into automotive engineering requires knowledge of the joint-material microstructure and properties. Since the development of vehicles (including military vehicles capable of surviving blast and ballistic impacts) nowadays involves extensive use of computational engineering analyses (CEA), robust high-fidelity material models are needed for FSW joints. A two-level material-homogenization procedure is proposed and utilized in this study to help manage the computational cost and computer storage requirements of such CEAs. The method utilizes experimental (microstructure, microhardness, tensile testing, and x-ray diffraction) data to construct: (a) the material model for each weld zone and (b) the material model for the entire weld. The procedure is validated by comparing its predictions with the predictions of more detailed but more costly computational analyses.
Computational analysis of conserved RNA secondary structure in transcriptomes and genomes.
Eddy, Sean R
2014-01-01
Transcriptomics experiments and computational predictions both enable systematic discovery of new functional RNAs. However, many putative noncoding transcripts arise instead from artifacts and biological noise, and current computational prediction methods have high false positive rates. I discuss prospects for improving computational methods for analyzing and identifying functional RNAs, with a focus on detecting signatures of conserved RNA secondary structure. An interesting new front is the application of chemical and enzymatic experiments that probe RNA structure on a transcriptome-wide scale. I review several proposed approaches for incorporating structure probing data into the computational prediction of RNA secondary structure. Using probabilistic inference formalisms, I show how all these approaches can be unified in a well-principled framework, which in turn allows RNA probing data to be easily integrated into a wide range of analyses that depend on RNA secondary structure inference. Such analyses include homology search and genome-wide detection of new structural RNAs.
Optimal temperature ladders in replica exchange simulations
NASA Astrophysics Data System (ADS)
Denschlag, Robert; Lingenheil, Martin; Tavan, Paul
2009-04-01
In replica exchange simulations, a temperature ladder with N rungs spans a given temperature interval. Considering systems with heat capacities independent of the temperature, here we address the question of how large N should be chosen for an optimally fast diffusion of the replicas through the temperature space. Using a simple example we show that choosing average acceptance probabilities of about 45% and computing N accordingly maximizes the round trip rates r across the given temperature range. This result differs from previous analyses which suggested smaller average acceptance probabilities of about 23%. We show that the latter choice maximizes the ratio r/N instead of r.
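A minimal numerical sketch of the ladder construction, under the abstract's assumption of a temperature-independent heat capacity C (so the potential energy at temperature T is roughly Gaussian with mean ~C·T and variance C·T², taking k_B = 1): build a geometric ladder and grow N until the Monte Carlo estimated neighbour acceptance reaches the ~45% target. C and the temperature range below are illustrative values.

```python
import numpy as np

rng = np.random.default_rng(0)
C, T_min, T_max, target = 50.0, 300.0, 600.0, 0.45

def mean_acceptance(T1, T2, n=50_000):
    # Average Metropolis exchange acceptance min(1, exp(dbeta * dE)) between
    # Gaussian potential-energy distributions at T1 < T2.
    E1 = rng.normal(C * T1, np.sqrt(C) * T1, n)
    E2 = rng.normal(C * T2, np.sqrt(C) * T2, n)
    return np.exp(np.minimum(0.0, (1 / T1 - 1 / T2) * (E1 - E2))).mean()

for N in range(2, 50):
    ladder = T_min * (T_max / T_min) ** (np.arange(N) / (N - 1))
    acc = [mean_acceptance(a, b) for a, b in zip(ladder, ladder[1:])]
    if min(acc) >= target:  # geometric spacing keeps acceptance nearly flat
        print(f"N = {N}, acceptance per rung ≈ {np.round(acc, 2)}")
        break
```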
Conductance of single microRNAs chains related to the autism spectrum disorder
NASA Astrophysics Data System (ADS)
Oliveira, J. I. N.; Albuquerque, E. L.; Fulco, U. L.; Mauriz, P. W.; Sarmento, R. G.; Caetano, E. W. S.; Freire, V. N.
2014-09-01
The charge transport properties of single-stranded microRNA (miRNA) chains associated with autism spectrum disorders were investigated. The computations were performed within a tight-binding model, together with a transfer matrix technique, with ionization energies and hopping parameters obtained by quantum chemistry methods. Current-voltage (I × V) curves of twelve miRNA chains related to the autism spectrum disorders were calculated and analysed. We obtained both semiconductor and insulator behavior, and a relationship between the current intensity and the autism-related miRNA base sequences, suggesting that a kind of electronic biosensor could be developed to distinguish different profiles of autism disorders.
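For readers unfamiliar with the transfer-matrix technique, the sketch below computes the Landauer transmission through a generic one-dimensional tight-binding chain with uniform hopping; integrating T(E) over the bias window would yield an I × V curve. Site energies here are illustrative placeholders, not the quantum-chemistry-derived ionization energies of the miRNA bases.

```python
import numpy as np

def transmission(E, eps, t=1.0):
    """Transmission through a 1-D tight-binding chain with site energies eps
    and uniform hopping t, attached to ideal leads (site energy 0, hopping t)."""
    if abs(E) >= 2 * t:
        return 0.0                             # outside the lead band
    k = np.arccos(E / (2 * t))                 # lead dispersion E = 2t cos k
    P = np.eye(2)
    for e in eps:                              # total transfer matrix
        P = np.array([[(E - e) / t, -1.0], [1.0, 0.0]]) @ P
    N = len(eps)
    # Match psi_n = e^{ikn} + r e^{-ikn} (left lead) to tau e^{ikn} (right).
    A = np.array([[-(P[0, 0] * np.exp(-1j * k) + P[0, 1]), np.exp(1j * k * (N + 1))],
                  [-(P[1, 0] * np.exp(-1j * k) + P[1, 1]), np.exp(1j * k * N)]])
    b = np.array([P[0, 0] * np.exp(1j * k) + P[0, 1],
                  P[1, 0] * np.exp(1j * k) + P[1, 1]])
    r, tau = np.linalg.solve(A, b)
    return abs(tau) ** 2

eps = [0.3, -0.2, 0.5, 0.1, -0.4]   # illustrative site energies (units of t)
print(transmission(0.1, eps))
```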
Configurational entropy and ρ and ϕ mesons production in QCD
NASA Astrophysics Data System (ADS)
Karapetyan, G.
2018-06-01
In the present work, the electroproduction of diffractive ρ and ϕ mesons is studied by considering the AdS/QCD correspondence and the Color Glass Condensate (CGC) approximation, with respect to the associated dipole cross section, whose parameters are analysed in the framework of the configurational entropy. Our results suggest different quantum states of the nuclear matter, showing that the extremal points of the nuclear configurational entropy are able to reflect a true description of ρ and ϕ meson production, using current data on the light quark masses. The parameters obtained in the fitting procedure coincide with the experimental values to within ∼0.1%.
Lindemann, J P; Kern, R; Michaelis, C; Meyer, P; van Hateren, J H; Egelhaaf, M
2003-03-01
A high-speed panoramic visual stimulation device is introduced which is suitable for analysing visual interneurons during stimulation with rapid image displacements as experienced by fast-moving animals. The responses of an identified motion-sensitive neuron in the visual system of the blowfly to behaviourally generated image sequences are very complex and hard to predict from the established input circuitry of the neuron. This finding suggests that the computational significance of visual interneurons can only be assessed if they are characterised not only by conventional stimuli, as are often used for systems analysis, but also by behaviourally relevant input.
Møllersen, Kajsa; Zortea, Maciel; Schopf, Thomas R; Kirchesch, Herbert; Godtliebsen, Fred
2017-01-01
Melanoma is the deadliest form of skin cancer, and early detection is crucial for patient survival. Computer systems can assist in melanoma detection, but are not widespread in clinical practice. In 2016, an open challenge in classification of dermoscopic images of skin lesions was announced. A training set of 900 images with corresponding class labels and semi-automatic/manual segmentation masks was released for the challenge. An independent test set of 379 images, of which 75 were of melanomas, was used to rank the participants. This article demonstrates the impact of ranking criteria, segmentation method and classifier, and highlights the clinical perspective. We compare five different measures for diagnostic accuracy by analysing the resulting ranking of the computer systems in the challenge. Choice of performance measure had great impact on the ranking. Systems that were ranked among the top three for one measure dropped to the bottom half when changing performance measure. Nevus Doctor, a computer system previously developed by the authors, was used to participate in the challenge and to investigate the impact of segmentation and classifier. The diagnostic accuracy when using an automatic versus the semi-automatic/manual segmentation is investigated. The unexpectedly small impact of segmentation method suggests that improvements of the automatic segmentation method w.r.t. resemblance to semi-automatic/manual segmentation will not improve diagnostic accuracy substantially. A small set of similar classification algorithms is used to investigate the impact of classifier on the diagnostic accuracy. The variability in diagnostic accuracy for different classifier algorithms was larger than the variability for segmentation methods, and suggests a focus for future investigations. From a clinical perspective, the misclassification of a melanoma as benign has far greater cost than the misclassification of a benign lesion. For computer systems to have clinical impact, their performance should be ranked by a high-sensitivity measure.
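The ranking analysis above turns on how diagnostic accuracy is scored. The sketch below computes several common measures from a confusion matrix; a system that ranks highly by accuracy need not rank highly by sensitivity. The data are fabricated toy values, not the challenge's results.

```python
import numpy as np

def measures(y_true, y_prob, thr=0.5):
    yhat = (y_prob >= thr).astype(int)
    tp = np.sum((yhat == 1) & (y_true == 1))
    tn = np.sum((yhat == 0) & (y_true == 0))
    fp = np.sum((yhat == 1) & (y_true == 0))
    fn = np.sum((yhat == 0) & (y_true == 1))
    return {
        "sensitivity": tp / (tp + fn),   # fraction of melanomas detected
        "specificity": tn / (tn + fp),   # fraction of benign lesions cleared
        "accuracy": (tp + tn) / len(y_true),
    }

rng = np.random.default_rng(1)
y = rng.integers(0, 2, 379)              # 1 = melanoma, 0 = benign (toy labels)
for name, bias in [("system_A", 0.15), ("system_B", -0.15)]:
    # system_A over-calls melanoma (sensitive); system_B under-calls (specific)
    p = np.clip(0.5 * y + 0.25 + bias + rng.normal(0, 0.2, y.size), 0, 1)
    print(name, {k: round(v, 2) for k, v in measures(y, p).items()})
```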
Numerical Analyses for Low Reynolds Flow in a Ventricular Assist Device.
Lopes, Guilherme; Bock, Eduardo; Gómez, Luben
2017-06-01
Scientific and technological advances in blood pump development have been driven by their importance in treating cardiac patients and improving the quality of life of assisted people. To improve and optimize design and development, numerical tools have been incorporated into the analysis of these devices and have become indispensable to their advancement. This study analyzes flow behavior at low impeller Reynolds numbers, for which there is no consensus on the full development of turbulence in ventricular assist devices (VADs). For the supporting analyses, computational simulations were carried out in different scenarios with the same rotation speed. Two modeling approaches were applied: laminar flow, and turbulent flow with the standard, RNG, and realizable κ-ε models, the standard and SST κ-ω models, and the Spalart-Allmaras model. The results agree with the literature for VADs and with the range reported for transitional flows in stirred tanks, with an impeller Reynolds number around 2800 for the tested scenarios. The turbulence models were compared, and based on the expected physical behavior, the RNG κ-ε, standard and SST κ-ω, and Spalart-Allmaras models are suggested for numerical analyses at low impeller Reynolds numbers in the tested flow scenarios. © 2016 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.
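As a worked example of the quantity at issue, the impeller Reynolds number commonly used for stirred tanks and rotary blood pumps is Re = ρnD²/μ, with n in revolutions per second. The fluid properties and impeller size below are illustrative assumptions, not the studied device's dimensions; they merely land in the same regime as the ~2800 value cited.

```python
rho = 1050.0     # blood density, kg/m^3 (assumed)
mu = 3.5e-3      # blood dynamic viscosity, Pa*s (assumed)
D = 0.02         # impeller diameter, m (assumed)
n = 1500 / 60.0  # rotation speed, rev/s (assumed 1500 rpm)
Re = rho * n * D**2 / mu
print(f"impeller Reynolds number Re = {Re:.0f}")  # -> 3000
```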
Integrated Nondestructive Evaluation and Finite Element Analysis Predicts Crack Location and Shape
NASA Technical Reports Server (NTRS)
Abdul-Azia, Ali; Baaklini, George Y.; Trudell, Jeffrey J.
2002-01-01
This study describes the finite-element analyses and the NDE modality undertaken on two flywheel rotors that were spun to burst speed. Computed tomography and dimensional measurements were used to nondestructively evaluate the rotors before and/or after they were spun to the first crack detection. Computed tomography data findings of two- and three-dimensional crack formation were used to conduct finite-element (FEA) and fracture mechanics analyses. A procedure to extend these analyses to estimate the life of these components is also outlined. NDE-FEA results for one of the rotors are presented in the figures. The stress results, which represent the radial stresses in the rim, clearly indicate that the maximum stress region is within the section defined by the computed tomography scan. Furthermore, the NDE data correlate well with the FEA results. In addition, the measurements reported show that the NDE and FEA data are in parallel.
A versatile software package for inter-subject correlation based analyses of fMRI
Kauppi, Jukka-Pekka; Pajula, Juha; Tohka, Jussi
2014-01-01
In the inter-subject correlation (ISC) based analysis of functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects in the corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modeling the stimulus, and thus ISC is a potential method for analyzing fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC-based approach for analyzing complex fMRI data, no generic software tools have been made available for this purpose, limiting widespread use of ISC-based analysis techniques in the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in Matlab for computing various ISC-based analyses. Many advanced computations such as comparison of ISCs between different stimuli, time-window ISC, and inter-subject phase synchronization are supported by the toolbox. The analyses are coupled with re-sampling based statistical inference. ISC-based analyses are data and computation intensive, and the ISC Toolbox is equipped with mechanisms to execute the parallel computations in a cluster environment automatically, with automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine, or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox and its implementation, and demonstrate possible uses of the toolbox by summarizing selected example applications. We also report computation time experiments using both a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/ PMID:24550818
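The core computation is compact. Below is a minimal Python sketch of one common ISC variant (leave-one-out: each subject's voxel time series is correlated with the average of the remaining subjects'); the toolbox itself is Matlab and also offers pairwise ISC, time-window ISC, and phase synchronization. The array shapes and data here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.standard_normal((10, 200, 500))  # subjects x timepoints x voxels

def isc_loo(data):
    n_sub, _, n_vox = data.shape
    isc = np.empty((n_sub, n_vox))
    grand_mean = data.mean(axis=0)
    for s in range(n_sub):
        # Mean time series of all other subjects, per voxel.
        others = (grand_mean * n_sub - data[s]) / (n_sub - 1)
        for v in range(n_vox):
            isc[s, v] = np.corrcoef(data[s, :, v], others[:, v])[0, 1]
    return isc.mean(axis=0)  # average leave-one-out ISC per voxel

print(isc_loo(data)[:5])
```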
Tomiello, Sara; Schöbi, Dario; Weber, Lilian; Haker, Helene; Iglesias, Sandra; Stephan, Klaas Enno
2018-01-01
Background: Action optimisation relies on learning about past decisions and on accumulated knowledge about the stability of the environment. In Bayesian models of learning, belief updating is informed by multiple, hierarchically related, precision-weighted prediction errors (pwPEs). Recent work suggests that hierarchically different pwPEs may be encoded by specific neurotransmitters such as dopamine (DA) and acetylcholine (ACh). Abnormal dopaminergic and cholinergic modulation of N-methyl-D-aspartate (NMDA) receptors plays a central role in the dysconnection hypothesis, which considers impaired synaptic plasticity a central mechanism in the pathophysiology of schizophrenia. Methods: To probe the dichotomy between DA and ACh and to investigate timing parameters of pwPEs, we tested 74 healthy male volunteers performing a probabilistic reward associative learning task in which the contingency between cues and rewards changed over 160 trials between 0.8 and 0.2. Furthermore, the current study employed pharmacological interventions (amisulpride/biperiden/placebo) and genetic analyses (COMT and ChAT) to probe DA and ACh modulation of these computational quantities. The study was double-blind and between-subject. We inferred, from subject-specific behavioural data, a low-level choice PE about the reward outcome and a high-level PE about the probability of the outcome, as well as the respective precision weights (uncertainties), and used them, in a trial-by-trial analysis, to explain electroencephalogram (EEG) signals (64 channels). Behavioural data were modelled implementing three versions of the Hierarchical Gaussian Filter (HGF), a Rescorla-Wagner model, and a Sutton model with a dynamic learning rate. The computational trajectories of the winning model were used as regressors in single-subject trial-by-trial GLM analyses at the sensor level. The resulting parameter estimates were entered into second-level ANOVAs. The reported results were family-wise error corrected at the peak level (p<0.05) across the whole brain and time window (outcome phase: 0-500 ms). Results: A three-level HGF best explained the data and was used to compute the computational regressors for EEG analyses. We found a significant interaction between pharmacology and COMT for the high-level precision weight (uncertainty). Specifically: at 276 ms after outcome presentation, the difference between Met/Met and Val/Met was more positive for amisulpride than for biperiden over occipital electrodes; at 274 ms and 278 ms after outcome presentation, the difference between Met/Met and Val/Met was more negative over fronto-temporal electrodes for amisulpride than for placebo, and for amisulpride than for biperiden, respectively. No significant results were detected for the other computational quantities or for the ChAT gene. Discussion: The differential effects of pharmacology on the processing of the high-level precision weight (uncertainty) were modulated by the DA-related gene COMT. Previous results linked high-level PEs to the cholinergic basal forebrain. One possible explanation for the current results is that high-level computational quantities are represented in cholinergic regions, which in turn are influenced by dopaminergic projections. In order to disentangle dopaminergic and cholinergic effects on synaptic plasticity, further analyses will concentrate on biophysical models (e.g., DCM). This may prove useful in detecting pathophysiological subgroups and might therefore be of high relevance in a clinical setting.
Tom, Jennifer A.; Sinsheimer, Janet S.; Suchard, Marc A.
2015-01-01
Massive datasets in the gigabyte and terabyte range combined with the availability of increasingly sophisticated statistical tools yield analyses at the boundary of what is computationally feasible. Compromising in the face of this computational burden by partitioning the dataset into more tractable sizes results in stratified analyses, removed from the context that justified the initial data collection. In a Bayesian framework, these stratified analyses generate intermediate realizations, often compared using point estimates that fail to account for the variability within and correlation between the distributions these realizations approximate. However, although the initial concession to stratify generally precludes the more sensible analysis using a single joint hierarchical model, we can circumvent this outcome and capitalize on the intermediate realizations by extending the dynamic iterative reweighting MCMC algorithm. In doing so, we reuse the available realizations by reweighting them with importance weights, recycling them into a now tractable joint hierarchical model. We apply this technique to intermediate realizations generated from stratified analyses of 687 influenza A genomes spanning 13 years allowing us to revisit hypotheses regarding the evolutionary history of influenza within a hierarchical statistical framework. PMID:26681992
NASA Astrophysics Data System (ADS)
Anderson, Delia Marie Castro
Computer literacy and use have become commonplace in our colleges and universities. In an environment that demands the use of technology, educators should be knowledgeable of the components that make up the overall computer attitude of students and be willing to investigate the processes and techniques of effective teaching and learning that can take place with computer technology. The purpose of this study is twofold. First, it investigates the relationship between computer attitudes and gender, ethnicity, and computer experience. Second, it addresses the question of whether, and to what extent, students' attitudes toward computers change over a 16-week period in an undergraduate microbiology course that supplements the traditional lecture with computer-driven assignments. Multiple regression analyses, using data from the Computer Attitudes Scale (Loyd & Loyd, 1985), showed that, in the experimental group, no significant relationships were found between computer anxiety and gender or ethnicity or between computer confidence and gender or ethnicity. However, students who used computers the longest (p = .001) and who were self-taught (p = .046) had the lowest computer anxiety levels. Likewise, students who used computers the longest (p = .001) and who were self-taught (p = .041) had the highest confidence levels. No significant relationships between computer liking, usefulness, or the use of Internet resources and gender, ethnicity, or computer experience were found. Dependent t-tests were performed to determine whether computer attitude scores (pretest and posttest) increased over a 16-week period for students who had been exposed to computer-driven assignments and other Internet resources. Results showed that students in the experimental group were less anxious about working with computers and considered computers to be more useful. In the control group, no significant changes in computer anxiety, confidence, liking, or usefulness were noted. Overall, students in the experimental group who responded to the Use of Internet Resources survey were positive (mean of 3.4 on the 4-point scale) toward their use of Internet resources, which included the online courseware developed by the researcher. Findings from this study suggest that (1) the digital divide with respect to gender and ethnicity may be narrowing, and (2) students who are exposed to a course that augments computer-driven courseware with traditional teaching methods appear to have less anxiety, have a clearer perception of computer usefulness, and feel that online resources enhance their learning.
An Assessment of the State-of-the-art in Multidisciplinary Aeromechanical Analyses
NASA Technical Reports Server (NTRS)
Datta, Anubhav; Johnson, Wayne
2008-01-01
This paper presents a survey of the current state-of-the-art in multidisciplinary aeromechanical analyses which integrate advanced Computational Structural Dynamics (CSD) and Computational Fluid Dynamics (CFD) methods. The application areas to be surveyed include fixed wing aircraft, turbomachinery, and rotary wing aircraft. The objective of the authors in the present paper, together with a companion paper on requirements, is to lay out a path for a High Performance Computing (HPC) based next generation comprehensive rotorcraft analysis. From this survey of the key technologies in other application areas it is possible to identify the critical technology gaps that stem from unique rotorcraft requirements.
Method of performing computational aeroelastic analyses
NASA Technical Reports Server (NTRS)
Silva, Walter A. (Inventor)
2011-01-01
Computational aeroelastic analyses typically use a mathematical model for the structural modes of a flexible structure and a nonlinear aerodynamic model that can generate a plurality of unsteady aerodynamic responses based on the structural modes for conditions defining an aerodynamic condition of the flexible structure. In the present invention, a linear state-space model is generated using a single execution of the nonlinear aerodynamic model for all of the structural modes where a family of orthogonal functions is used as the inputs. Then, static and dynamic aeroelastic solutions are generated using computational interaction between the mathematical model and the linear state-space model for a plurality of periodic points in time.
Assessment of Computer Literacy of Nurses in Lesotho.
Mugomeri, Eltony; Chatanga, Peter; Maibvise, Charles; Masitha, Matseliso
2016-11-01
Health systems worldwide are moving toward use of information technology to improve healthcare delivery. However, this requires basic computer skills. This study assessed the computer literacy of nurses in Lesotho using a cross-sectional quantitative approach. A structured questionnaire with 32 standardized computer skills was distributed to 290 randomly selected nurses in Maseru District. Univariate and multivariate logistic regression analyses in Stata 13 were performed to identify factors associated with having inadequate computer skills. Overall, 177 (61%) nurses scored below 16 of the 32 skills assessed. Finding hyperlinks on Web pages (63%), use of advanced search parameters (60.2%), and downloading new software (60.1%) proved to be challenging to the highest proportions of nurses. Age, sex, year of obtaining latest qualification, computer experience, and work experience were significantly (P < .05) associated with inadequate computer skills in univariate analysis. However, in multivariate analyses, sex (P = .001), year of obtaining latest qualification (P = .011), and computer experience (P < .001) emerged as significant factors. The majority of nurses in Lesotho have inadequate computer skills, and this is significantly associated with having many years since obtaining their latest qualification, being female, and lack of exposure to computers. These factors should be considered during planning of training curriculum for nurses in Lesotho.
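The analysis pipeline described (univariate screening followed by a multivariate model) is easy to sketch. The study used Stata 13; the illustration below uses Python's statsmodels on synthetic data, with variable names that are assumptions, not the questionnaire's actual coding.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 290
df = pd.DataFrame({
    "inadequate": rng.integers(0, 2, n),          # outcome: <16 of 32 skills
    "female": rng.integers(0, 2, n),
    "years_since_qualification": rng.integers(0, 30, n),
    "computer_experience": rng.integers(0, 2, n),
})

# Univariate screening: one predictor at a time.
for var in ["female", "years_since_qualification", "computer_experience"]:
    m = smf.logit(f"inadequate ~ {var}", data=df).fit(disp=0)
    print(var, "p =", round(m.pvalues[var], 3))

# Multivariate model with all predictors entered together.
full = smf.logit("inadequate ~ female + years_since_qualification"
                 " + computer_experience", data=df).fit(disp=0)
print(full.summary2().tables[1][["Coef.", "P>|z|"]])
```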
NASA Technical Reports Server (NTRS)
Vetter, A. A.; Maxwell, C. D.; Swean, T. F., Jr.; Demetriades, S. T.; Oliver, D. A.; Bangerter, C. D.
1981-01-01
Data from sufficiently well-instrumented, short-duration experiments at AEDC/HPDE, Reynolds Metal Co., and Hercules, Inc., are compared to analyses with multidimensional and time-dependent simulations with the STD/MHD computer codes. These analyses reveal detailed features of major transient events, severe loss mechanisms, and anomalous MHD behavior. In particular, these analyses predicted higher-than-design voltage drops, Hall voltage overshoots, and asymmetric voltage drops before the experimental data were available. The predictions obtained with these analyses are in excellent agreement with the experimental data and the failure predictions are consistent with the experiments. The design of large, high-interaction or advanced MHD experiments will require application of sophisticated, detailed and comprehensive computational procedures in order to account for the critical mechanisms which led to the observed behavior in these experiments.
Comparing Computer-Adaptive and Curriculum-Based Measurement Methods of Assessment
ERIC Educational Resources Information Center
Shapiro, Edward S.; Gebhardt, Sarah N.
2012-01-01
This article reported the concurrent, predictive, and diagnostic accuracy of a computer-adaptive test (CAT) and curriculum-based measurements (CBM; both computation and concepts/application measures) for universal screening in mathematics among students in first through fourth grade. Correlational analyses indicated moderate to strong…
Aviation Technician Training I and Task Analyses: Semester II. Field Review Copy.
ERIC Educational Resources Information Center
Upchurch, Richard
This guide for aviation technician training begins with a course description, resource information, and a course outline. Tasks/competencies are categorized into 16 concept/duty areas: understanding technical symbols and abbreviations; understanding mathematical terms, symbols, and formulas; computing decimals; computing fractions; computing ratio…
48 CFR 252.204-7012 - Safeguarding of unclassified controlled technical information.
Code of Federal Regulations, 2014 CFR
2014-10-01
.... Cyber incident means actions taken through the use of computer networks that result in an actual or... printed within an information system. Technical information means technical data or computer software, as..., catalog-item identifications, data sets, studies and analyses and related information, and computer...
48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.
Code of Federal Regulations, 2011 CFR
2011-10-01
... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...
48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.
Code of Federal Regulations, 2012 CFR
2012-10-01
... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...
48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.
Code of Federal Regulations, 2014 CFR
2014-10-01
... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...
48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.
Code of Federal Regulations, 2010 CFR
2010-10-01
... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar data produced for...
Ferreira, Nicola; Owen, Adrian; Mohan, Anita; Corbett, Anne; Ballard, Clive
2015-04-01
Emerging literature suggests that lifestyle factors may play an important role in reducing age-related cognitive decline. There have, however, been few studies investigating the role of cognitively stimulating leisure activities in maintaining cognitive health. This study sought to identify changes in cognitive performance with age and to investigate associations of cognitive performance with several key cognitively stimulating leisure activities. Over 65,000 participants provided demographic and lifestyle information and completed tests of grammatical reasoning, spatial working memory, verbal working memory and episodic memory. Regression analyses suggested that frequency of engaging in Sudoku or similar puzzles was significantly positively associated with grammatical reasoning, spatial working memory and episodic memory scores. Furthermore, for participants aged under 65 years, frequency of playing non-cognitive training computer games was also positively associated with performance in the same cognitive domains. The results also suggest that grammatical reasoning and episodic memory are particularly vulnerable to age-related decline. Further investigation to determine the potential benefits of participating in Sudoku puzzles and non-cognitive computer games is indicated, particularly as they are associated with grammatical reasoning and episodic memory, cognitive domains found to be strongly associated with age-related cognitive decline. Results of this study have implications for developing improved guidance for the public regarding the potential value of cognitively stimulating leisure activities. The results also suggest that grammatical reasoning and episodic memory should be targeted in developing appropriate outcome measures to assess efficacy of future interventions, and in developing cognitive training programmes to prevent or delay cognitive decline. Copyright © 2014 John Wiley & Sons, Ltd.
[Diagnostic possibilities of digital volume tomography].
Lemkamp, Michael; Filippi, Andreas; Berndt, Dorothea; Lambrecht, J Thomas
2006-01-01
Cone beam computed tomography provides high-quality 3D images of craniofacial structures. Although detail resolution is increased, x-ray exposure is reduced compared with classic computed tomography. The volume is analysed in three orthogonal planes, which can be rotated independently without quality loss. Cone beam computed tomography appears to be a less expensive alternative to classic computed tomography, with lower x-ray exposure.
Development of Reduced-Order Models for Aeroelastic and Flutter Prediction Using the CFL3Dv6.0 Code
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Bartels, Robert E.
2002-01-01
A reduced-order model (ROM) is developed for aeroelastic analysis using the CFL3D version 6.0 computational fluid dynamics (CFD) code, recently developed at the NASA Langley Research Center. This latest version of the flow solver includes a deforming mesh capability, a modal structural definition for nonlinear aeroelastic analyses, and a parallelization capability that provides a significant increase in computational efficiency. Flutter results for the AGARD 445.6 Wing computed using CFL3D v6.0 are presented, including discussion of associated computational costs. Modal impulse responses of the unsteady aerodynamic system are then computed using the CFL3Dv6 code and transformed into state-space form. Important numerical issues associated with the computation of the impulse responses are presented. The unsteady aerodynamic state-space ROM is then combined with a state-space model of the structure to create an aeroelastic simulation using the MATLAB/SIMULINK environment. The MATLAB/SIMULINK ROM is used to rapidly compute aeroelastic transients including flutter. The ROM shows excellent agreement with the aeroelastic analyses computed using the CFL3Dv6.0 code directly.
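One standard route from sampled impulse responses to a discrete-time state-space ROM is the Eigensystem Realization Algorithm (ERA); the abstract does not state which realization method was used, so the SISO sketch below is a generic illustration on a synthetic lightly damped impulse response, not the CFL3Dv6.0 workflow.

```python
import numpy as np

def era(markov, r):
    """ERA realization; markov[k-1] = impulse response sample y_k, r = order."""
    m = len(markov) // 2
    H0 = np.array([[markov[i + j] for j in range(m)] for i in range(m)])
    H1 = np.array([[markov[i + j + 1] for j in range(m)] for i in range(m)])
    U, s, Vt = np.linalg.svd(H0)
    U, s, Vt = U[:, :r], s[:r], Vt[:r]
    S_inv_half = np.diag(1.0 / np.sqrt(s))
    A = S_inv_half @ U.T @ H1 @ Vt.T @ S_inv_half    # state matrix
    B = (np.diag(np.sqrt(s)) @ Vt)[:, :1]            # input matrix (SISO)
    C = (U @ np.diag(np.sqrt(s)))[:1, :]             # output matrix (SISO)
    return A, B, C

k = np.arange(1, 41)
y = (0.97 ** k) * np.sin(0.3 * k)      # synthetic lightly damped mode
A, B, C = era(y, r=2)
print(np.abs(np.linalg.eigvals(A)))    # recovered pole magnitudes ~ 0.97
```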
Features of Computer-Based Decision Aids: Systematic Review, Thematic Synthesis, and Meta-Analyses.
Syrowatka, Ania; Krömker, Dörthe; Meguerditchian, Ari N; Tamblyn, Robyn
2016-01-26
Patient information and education, such as decision aids, are gradually moving toward online, computer-based environments. Considerable research has been conducted to guide content and presentation of decision aids. However, given the relatively new shift to computer-based support, little attention has been given to how multimedia and interactivity can improve upon paper-based decision aids. The first objective of this review was to summarize published literature into a proposed classification of features that have been integrated into computer-based decision aids. Building on this classification, the second objective was to assess whether integration of specific features was associated with higher-quality decision making. Relevant studies were located by searching MEDLINE, Embase, CINAHL, and CENTRAL databases. The review identified studies that evaluated computer-based decision aids for adults faced with preference-sensitive medical decisions and reported quality of decision-making outcomes. A thematic synthesis was conducted to develop the classification of features. Subsequently, meta-analyses were conducted based on standardized mean differences (SMD) from randomized controlled trials (RCTs) that reported knowledge or decisional conflict. Further subgroup analyses compared pooled SMDs for decision aids that incorporated a specific feature to other computer-based decision aids that did not incorporate the feature, to assess whether specific features improved quality of decision making. Of 3541 unique publications, 58 studies met the target criteria and were included in the thematic synthesis. The synthesis identified six features: content control, tailoring, patient narratives, explicit values clarification, feedback, and social support. A subset of 26 RCTs from the thematic synthesis was used to conduct the meta-analyses. As expected, computer-based decision aids performed better than usual care or alternative aids; however, some features performed better than others. Integration of content control improved quality of decision making (SMD 0.59 vs 0.23 for knowledge; SMD 0.39 vs 0.29 for decisional conflict). In contrast, tailoring reduced quality of decision making (SMD 0.40 vs 0.71 for knowledge; SMD 0.25 vs 0.52 for decisional conflict). Similarly, patient narratives also reduced quality of decision making (SMD 0.43 vs 0.65 for knowledge; SMD 0.17 vs 0.46 for decisional conflict). Results were varied for different types of explicit values clarification, feedback, and social support. Integration of media rich or interactive features into computer-based decision aids can improve quality of preference-sensitive decision making. However, this is an emerging field with limited evidence to guide use. The systematic review and thematic synthesis identified features that have been integrated into available computer-based decision aids, in an effort to facilitate reporting of these features and to promote integration of such features into decision aids. The meta-analyses and associated subgroup analyses provide preliminary evidence to support integration of specific features into future decision aids. Further research can focus on clarifying independent contributions of specific features through experimental designs and refining the designs of features to improve effectiveness.
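For readers unfamiliar with the pooling behind numbers such as "SMD 0.59 vs 0.23", here is a minimal sketch of a DerSimonian-Laird random-effects pool over per-trial standardized mean differences; the SMDs and variances are fabricated for illustration, not the review's data.

```python
import numpy as np

def pool_smd(smd, var):
    w = 1.0 / var                                # fixed-effect weights
    y_fe = np.sum(w * smd) / np.sum(w)
    q = np.sum(w * (smd - y_fe) ** 2)            # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(smd) - 1)) / c)    # DerSimonian-Laird tau^2
    w_re = 1.0 / (var + tau2)                    # random-effects weights
    pooled = np.sum(w_re * smd) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

smd = np.array([0.45, 0.62, 0.30, 0.71])   # per-trial knowledge SMDs (toy)
var = np.array([0.04, 0.06, 0.05, 0.08])   # their sampling variances (toy)
print(pool_smd(smd, var))
```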
Park, HaJeung; González, Àlex L.; Yildirim, Ilyas; Tran, Tuan; Lohman, Jeremy R.; Fang, Pengfei; Guo, Min; Disney, Matthew D.
2016-01-01
Spinocerebellar ataxia type 10 (SCA10) is caused by a pentanucleotide repeat expansion of r(AUUCU) within intron 9 of the ATXN10 pre-mRNA. The RNA causes disease by a gain-of-function mechanism in which it inactivates proteins involved in RNA biogenesis. Spectroscopic studies showed that r(AUUCU) repeats form a hairpin structure; however, there were no high-resolution structural models prior to this work. Herein, we report the first crystal structure of model r(AUUCU) repeats refined to 2.8 Å and analysis of the structure via molecular dynamics simulations. The r(AUUCU) tracts adopt an overall A-form geometry in which 3 × 3 nucleotide 5′UCU3′/3′UCU5′ internal loops are closed by AU pairs. Helical parameters of the refined structure as well as the corresponding electron density map on the crystallographic model reflect dynamic features of the internal loop. The computational analyses captured dynamic motion of the loop closing pairs, which can form single-stranded conformations with relatively low energies. Overall, the results presented here suggest the possibility for r(AUUCU) repeats to form metastable A-form structures, which can rearrange into single-stranded conformations and attract proteins such as heterogeneous nuclear ribonucleoprotein K (hnRNP K). The information presented here may aid in the rational design of therapeutics targeting this RNA. PMID:26039897
Chronotype Is Independently Associated With Glycemic Control in Type 2 Diabetes
Reutrakul, Sirimon; Hood, Megan M.; Crowley, Stephanie J.; Morgan, Mary K.; Teodori, Marsha; Knutson, Kristen L.; Van Cauter, Eve
2013-01-01
OBJECTIVE To examine whether chronotype and daily caloric distribution are associated with glycemic control in patients with type 2 diabetes independently of sleep disturbances. RESEARCH DESIGN AND METHODS Patients with type 2 diabetes had a structured interview and completed questionnaires to collect information on diabetes history and habitual sleep duration, quality, and timing. Shift workers were excluded. A recently validated construct derived from mid-sleep time on weekends was used as an indicator of chronotype. One-day food recall was used to compute the temporal distribution of caloric intake. Hierarchical linear regression analyses controlling for demographic and sleep variables were computed to determine whether chronotype was associated with HbA1c values and whether this association was mediated by a higher proportion of caloric intake at dinner. RESULTS We analyzed 194 completed questionnaires. Multiple regression analyses adjusting for age, sex, race, BMI, insulin use, depressed mood, diabetes complications, and perceived sleep debt found that chronotype was significantly associated with glycemic control (P = 0.001). This association was partially mediated by a greater percentage of total daily calories consumed at dinner. CONCLUSIONS Later chronotype and larger dinner were associated with poorer glycemic control in patients with type 2 diabetes independently of sleep disturbances. These results suggest that chronotype may be predictive of disease outcomes and lend further support to the role of the circadian system in metabolic regulation. PMID:23637357
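A hedged sketch of the hierarchical regression step described above: covariates enter in a first block, the chronotype indicator (mid-sleep time) in a second, and the increment in explained variance and the chronotype coefficient are inspected. Data and variable names are synthetic assumptions, not the study's records.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 194
df = pd.DataFrame({
    "age": rng.normal(55, 10, n),
    "bmi": rng.normal(32, 6, n),
    "midsleep": rng.normal(3.5, 1.0, n),   # later values = later chronotype
})
# Synthetic outcome with a built-in chronotype effect for illustration.
df["hba1c"] = 7 + 0.02 * df["bmi"] + 0.3 * df["midsleep"] + rng.normal(0, 1, n)

block1 = smf.ols("hba1c ~ age + bmi", data=df).fit()             # covariates only
block2 = smf.ols("hba1c ~ age + bmi + midsleep", data=df).fit()  # add chronotype
print(f"R^2 change = {block2.rsquared - block1.rsquared:.3f}, "
      f"p(midsleep) = {block2.pvalues['midsleep']:.4f}")
```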
Yu, Yue; Katiyar, Shashank P; Sundar, Durai; Kaul, Zeenia; Miyako, Eijiro; Zhang, Zhenya; Kaul, Sunil C; Reddel, Roger R; Wadhwa, Renu
2017-01-01
Maintenance of telomere length is the most consistent attribute of cancer cells. Tightly connected to their capacity to overcome replicative mortality, it is achieved either by activation of telomerase or an Alternative mechanism of Lengthening of Telomeres (ALT). Disruption of either of these mechanisms has been shown to induce DNA damage signalling leading to senescence or apoptosis. Telomerase inhibitors are considered as potential anticancer drugs but are ineffective for ALT cancers (~15% of all cancers). Withaferin-A (Wi-A), a major constituent of the medicinal plant, Withania somnifera (Ashwagandha), has been shown to exert anti-tumour activity. However, its effect on either telomerase or ALT mechanisms has not been investigated. Here, by using isogenic cancer cells with/without telomerase, we found that Wi-A caused stronger cytotoxicity to ALT cells. It was associated with inhibition of ALT-associated promyelocytic leukemia nuclear bodies, an established marker of ALT. Comparative analyses of telomerase positive and ALT cells revealed that Wi-A caused stronger telomere dysfunction and upregulation of DNA damage response in ALT cells. Molecular computational and experimental analyses revealed that Wi-A led to Myc-Mad mediated transcriptional suppression of NBS-1, an MRN complex protein that is an essential component of the ALT mechanism. The results suggest that Wi-A could be a new candidate drug for ALT cancers. PMID:28425984
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dearing, J.F.
The Subchannel Analysis of Blockages in Reactor Elements (SABRE) computer code, developed by the United Kingdom Atomic Energy Authority, is currently the only practical tool available for performing detailed analyses of velocity and temperature fields in the recirculating flow regions downstream of blockages in liquid-metal fast breeder reactor (LMFBR) pin bundles. SABRE is a subchannel analysis code; that is, it accurately represents the complex geometry of nuclear fuel pins arranged on a triangular lattice. The results of SABRE computational models are compared here with temperature data from two out-of-pile 19-pin test bundles from the Thermal-Hydraulic Out-of-Reactor Safety (THORS) Facility at Oak Ridge National Laboratory. One of these bundles has a small central flow blockage (bundle 3A), while the other has a large edge blockage (bundle 5A). Values that give best agreement with experiment for the empirical thermal mixing correlation factor, FMIX, in SABRE are suggested. These values of FMIX are Reynolds-number dependent, however, indicating that the coded turbulent mixing correlation is not appropriate for wire-wrap pin bundles.
Memória, Cláudia M; Yassuda, Mônica S; Nakano, Eduardo Y; Forlenza, Orestes V
2014-05-07
Background: The Computer-Administered Neuropsychological Screen for Mild Cognitive Impairment (CANS-MCI) is a computer-based cognitive screening instrument that involves automated administration and scoring and immediate analyses of test sessions. The objective of this study was to translate and culturally adapt the Brazilian Portuguese version of the CANS-MCI (CANS-MCI-BR) and to evaluate its reliability and validity for the diagnostic screening of MCI and dementia due to Alzheimer's disease. Methods: The test was administered to 97 older adults (mean age 73.41 ± 5.27 years) with at least four years of formal education (mean education 12.23 ± 4.48 years). Participants were classified into three diagnostic groups according to global cognitive status (normal controls, n = 41; MCI, n = 35; AD, n = 21) based on clinical data and formal neuropsychological assessments. Results: The results indicated high internal consistency (Cronbach's α = 0.77) in the total sample. Three-month test-retest reliability correlations were significant and robust (0.875; p < 0.001). A moderate level of concurrent validity was attained relative to the screening test for MCI (MoCA test, r = 0.76, p < 0.001). Confirmatory factor analysis supported the three-factor model of the original test, i.e., memory, language/spatial fluency, and executive function/mental control. Goodness-of-fit indicators were strong (Bentler Comparative Fit Index = 0.96, Root Mean Square Error of Approximation = 0.09). Receiver operating characteristic curve analyses suggested high sensitivity and specificity (81% and 73%, respectively) for screening possible MCI cases. Conclusions: The CANS-MCI-BR maintains adequate psychometric characteristics that render it suitable for identifying elderly adults with probable cognitive impairment for whom a more extensive evaluation with formal neuropsychological tests may be required.
Paulsen, Niels Herluf; Carlsen, Bjarke Bønløkke; Dahl, Jordi Sanchez; Carter-Storch, Rasmus; Christensen, Nicolaj Lyhne; Khurrami, Lida; Møller, Jacob Eifer; Lindholt, Jes Sandal; Diederichsen, Axel Cosmus Pyndt
2016-01-01
Aortic valve calcification (AVC) measured on non-contrast computed tomography (CT) has been shown to correlate with the severity of aortic valve stenosis (AS) and with mortality in patients with known AS. The aim of this study was to determine the association between CT-verified AVC and subclinical AS in a general population undergoing CT. CT scans from 566 randomly selected male participants (age 65-74) in the Danish cardiovascular screening study (DANCAVAS) were analyzed for AVC. All participants with a moderately or severely increased AVC score (≥300 arbitrary units (AU)) and a matched control group were invited for a supplementary echocardiography. AS was graded by indexed aortic valve area (AVAi) on echocardiography as moderate (0.6-0.85 cm²/m²) or severe (<0.6 cm²/m²). ROC and regression analyses were performed. Due to prior valve surgery and artifacts from ICD leads, 16 individuals were excluded from the AVC scoring. Moderately or severely increased AVC was observed in 10.7% (95% CI: 8.4-13.7). Echocardiography was performed in 101 individuals; 32.7% (95% CI: 21.8 to 46.0) of those with a moderate or high AVC score had moderate or severe AS, while none of those with no or low AVC did. A ROC analysis defined an AVC score ≥588 AU as suggestive of moderate or severe AS (AUC 0.89 ± 0.04, sensitivity 83% and specificity 87%). In the univariate analyses, AVC was the only variable significantly associated with AS. This study indicates an association between CT-verified AVC and subclinical AS. Copyright © 2016 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.
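The ≥588 AU cut-point above comes from a standard ROC procedure: sweep candidate thresholds and pick the one that best separates AS from non-AS participants. A minimal sketch of that procedure, on synthetic data and using Youden's J to choose the threshold (an assumption; the abstract does not state the selection criterion):

```python
# Illustrative sketch of deriving an AVC-score cut-point from ROC analysis,
# as in the study above (which reported >=588 AU); data here are synthetic.
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(1)
# Synthetic AVC scores: higher in subjects with moderate/severe AS.
avc_no_as = rng.lognormal(mean=5.0, sigma=1.0, size=80)
avc_as = rng.lognormal(mean=6.8, sigma=0.8, size=21)
scores = np.concatenate([avc_no_as, avc_as])
labels = np.concatenate([np.zeros(80), np.ones(21)])

fpr, tpr, thresholds = roc_curve(labels, scores)
youden = tpr - fpr                      # Youden's J statistic
best = np.argmax(youden)
print(f"AUC={auc(fpr, tpr):.2f}, cut-point={thresholds[best]:.0f} AU, "
      f"sensitivity={tpr[best]:.2f}, specificity={1 - fpr[best]:.2f}")
```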
A computational perspective on autism
Rosenberg, Ari; Patterson, Jaclyn Sky; Angelaki, Dora E.
2015-01-01
Autism is a neurodevelopmental disorder that manifests as a heterogeneous set of social, cognitive, motor, and perceptual symptoms. This system-wide pervasiveness suggests that, rather than narrowly impacting individual systems such as affect or vision, autism may broadly alter neural computation. Here, we propose that alterations in nonlinear, canonical computations occurring throughout the brain may underlie the behavioral characteristics of autism. One such computation, called divisive normalization, balances a neuron’s net excitation with inhibition reflecting the overall activity of the neuronal population. Through neural network simulations, we investigate how alterations in divisive normalization may give rise to autism symptomatology. Our findings show that a reduction in the amount of inhibition that occurs through divisive normalization can account for perceptual consequences of autism, consistent with the hypothesis of an increased ratio of neural excitation to inhibition (E/I) in the disorder. These results thus establish a bridge between an E/I imbalance and behavioral data on autism that is currently absent. Interestingly, our findings implicate the context-dependent, neuronal milieu as a key factor in autism symptomatology, with autism reflecting a less “social” neuronal population. Through a broader discussion of perceptual data, we further examine how altered divisive normalization may contribute to a wide array of the disorder’s behavioral consequences. These analyses show how a computational framework can provide insights into the neural basis of autism and facilitate the generation of falsifiable hypotheses. A computational perspective on autism may help resolve debates within the field and aid in identifying physiological pathways to target in the treatment of the disorder. PMID:26170299
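Divisive normalization has a standard form: a neuron's response is its driving input raised to a power, divided by a constant plus the pooled activity of the population, R_i = d_i^n / (σ^n + w·⟨d^n⟩). A minimal sketch of how weakening the normalization pool (the paper's reduced-inhibition manipulation, parameterized here by an assumed weight w) changes the response curve:

```python
# Minimal sketch of divisive normalization and the E/I manipulation discussed
# above: each neuron's drive is divided by pooled population activity. Reducing
# the normalization strength (parameter `w`) mimics weaker inhibition.
# Parameter values are illustrative, not those of the paper's simulations.
import numpy as np

def normalize(drive, w=1.0, sigma=1.0, n=2.0):
    """Divisively normalized responses: R_i = d_i^n / (sigma^n + w * mean_j d_j^n)."""
    d = drive ** n
    return d / (sigma ** n + w * d.mean())

drive = np.linspace(0, 10, 6)           # feedforward excitatory drive
typical = normalize(drive, w=1.0)       # full normalization pool
reduced = normalize(drive, w=0.2)       # weakened inhibition (higher E/I)
print(np.round(typical, 3))
print(np.round(reduced, 3))             # responses saturate less, grow steeper
```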
Computer-Based Interaction Analysis with DEGREE Revisited
ERIC Educational Resources Information Center
Barros, B.; Verdejo, M. F.
2016-01-01
We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…
Computational toxicology combines data from high-throughput test methods, chemical structure analyses and other biological domains (e.g., genes, proteins, cells, tissues) with the goals of predicting and understanding the underlying mechanistic causes of chemical toxicity and for...
The Contribution of Visualization to Learning Computer Architecture
ERIC Educational Resources Information Center
Yehezkel, Cecile; Ben-Ari, Mordechai; Dreyfus, Tommy
2007-01-01
This paper describes a visualization environment and associated learning activities designed to improve learning of computer architecture. The environment, EasyCPU, displays a model of the components of a computer and the dynamic processes involved in program execution. We present the results of a research program that analysed the contribution of…
48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.
Code of Federal Regulations, 2013 CFR
2013-10-01
... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... funds; (ii) Studies, analyses, test data, or similar data produced for this contract, when the study...
Students' Activity in Computer-Supported Collaborative Problem Solving in Mathematics
ERIC Educational Resources Information Center
Hurme, Tarja-riitta; Jarvela, Sanna
2005-01-01
The purpose of this study was to analyse secondary school students' (N = 16) computer-supported collaborative mathematical problem solving. The problem addressed in the study was: What kinds of metacognitive processes appear during computer-supported collaborative learning in mathematics? Another aim of the study was to consider the applicability…
GPU accelerated dynamic functional connectivity analysis for functional MRI data.
Akgün, Devrim; Sakoğlu, Ünal; Esquivel, Johnny; Adinoff, Bryon; Mete, Mutlu
2015-07-01
Recent advances in multi-core processors and graphics-card-based computational technologies have paved the way for an improved and dynamic utilization of parallel computing techniques. Numerous applications have been implemented for the acceleration of computationally-intensive problems in various computational science fields including bioinformatics, in which big-data problems are prevalent. In neuroimaging, dynamic functional connectivity (DFC) analysis is a computationally demanding method used to investigate dynamic functional interactions among different brain regions or networks identified with functional magnetic resonance imaging (fMRI) data. In this study, we implemented and analyzed a parallel DFC algorithm based on thread-based and block-based approaches. The thread-based approach was designed to parallelize DFC computations and was implemented in both Open Multi-Processing (OpenMP) and Compute Unified Device Architecture (CUDA) programming platforms. Another approach developed in this study to better utilize the CUDA architecture is the block-based approach, where parallelization involves smaller parts of fMRI time-courses obtained by sliding windows. Experimental results showed that the proposed parallel design solutions enabled by the GPUs significantly reduce the computation time for DFC analysis. The multicore OpenMP implementation on an 8-core processor provides up to a 7.7× speed-up. The GPU implementation using CUDA yielded substantial accelerations, ranging from 18.5× to 157× speed-up, once thread-based and block-based approaches were combined in the analysis. The proposed parallel programming solutions show that multi-core processor and CUDA-supported GPU implementations accelerate DFC analyses significantly, making them more practical for multi-subject studies and for more dynamic analyses. Copyright © 2015 Elsevier Ltd. All rights reserved.
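The core of sliding-window DFC is simple to state: for each window position, compute a region-by-region correlation matrix. The OpenMP/CUDA kernels above parallelize exactly this loop. A serial NumPy reference implementation, with illustrative window and step sizes:

```python
# A serial NumPy reference for the sliding-window correlation that the paper's
# OpenMP/CUDA kernels parallelize; window length and step are illustrative.
import numpy as np

def dynamic_connectivity(ts, window=30, step=1):
    """ts: (timepoints, regions) fMRI time-courses.
    Returns one region-by-region correlation matrix per window position."""
    t, r = ts.shape
    starts = range(0, t - window + 1, step)
    return np.stack([np.corrcoef(ts[s:s + window].T) for s in starts])

rng = np.random.default_rng(42)
ts = rng.standard_normal((300, 10))     # synthetic data: 300 TRs, 10 regions
dfc = dynamic_connectivity(ts)
print(dfc.shape)                        # (271, 10, 10)
```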
An autonomous molecular computer for logical control of gene expression
Benenson, Yaakov; Gil, Binyamin; Ben-Dor, Uri; Adar, Rivka; Shapiro, Ehud
2013-01-01
Early biomolecular computer research focused on laboratory-scale, human-operated computers for complex computational problems [1–7]. Recently, simple molecular-scale autonomous programmable computers were demonstrated [8–15], allowing both input and output information to be in molecular form. Such computers, using biological molecules as input data and biologically active molecules as outputs, could produce a system for ‘logical’ control of biological processes. Here we describe an autonomous biomolecular computer that, at least in vitro, logically analyses the levels of messenger RNA species, and in response produces a molecule capable of affecting levels of gene expression. The computer operates at a concentration of close to a trillion computers per microlitre and consists of three programmable modules: a computation module, that is, a stochastic molecular automaton [12–17]; an input module, by which specific mRNA levels or point mutations regulate software molecule concentrations, and hence automaton transition probabilities; and an output module, capable of controlled release of a short single-stranded DNA molecule. This approach might be applied in vivo to biochemical sensing, genetic engineering and even medical diagnosis and treatment. As a proof of principle we programmed the computer to identify and analyse mRNA of disease-related genes [18–22] associated with models of small-cell lung cancer and prostate cancer, and to produce a single-stranded DNA molecule modelled after an anticancer drug. PMID:15116117
Radiological interpretation of images displayed on tablet computers: a systematic review
Armfield, N R; Smith, A C
2015-01-01
Objective: To review the published evidence and to determine if radiological diagnostic accuracy is compromised when images are displayed on a tablet computer and thereby inform practice on using tablet computers for radiological interpretation by on-call radiologists. Methods: We searched the PubMed and EMBASE databases for studies on the diagnostic accuracy or diagnostic reliability of images interpreted on tablet computers. Studies were screened for inclusion based on pre-determined inclusion and exclusion criteria. Studies were assessed for quality and risk of bias using Quality Appraisal of Diagnostic Reliability Studies or the revised Quality Assessment of Diagnostic Accuracy Studies tool. Treatment of studies was reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA). Results: 11 studies met the inclusion criteria. 10 of these studies tested the Apple iPad® (Apple, Cupertino, CA). The included studies reported high sensitivity (84–98%), specificity (74–100%) and accuracy rates (98–100%) for radiological diagnosis. There was no statistically significant difference in accuracy between a tablet computer and a Digital Imaging and Communications in Medicine (DICOM)-calibrated control display. There was a near complete consensus from authors on the non-inferiority of the diagnostic accuracy of images displayed on a tablet computer. All of the included studies were judged to be at risk of bias. Conclusion: Our findings suggest that the diagnostic accuracy of radiological interpretation is not compromised by using a tablet computer. This result is only relevant to the Apple iPad and to the modalities of CT, MRI and plain radiography. Advances in knowledge: The iPad may be appropriate for an on-call radiologist to use for radiological interpretation. PMID:25882691
Qayumi, A K; Kurihara, Y; Imai, M; Pachev, G; Seo, H; Hoshino, Y; Cheifetz, R; Matsuura, K; Momoi, M; Saleem, M; Lara-Guerra, H; Miki, Y; Kariya, Y
2004-10-01
This study aimed to compare the effects of computer-assisted, text-based and computer-and-text learning conditions on the performances of 3 groups of medical students in the pre-clinical years of their programme, taking into account their academic achievement to date. A fourth group of students served as a control (no-study) group. Participants were recruited from the pre-clinical years of the training programmes in 2 medical schools in Japan, Jichi Medical School near Tokyo and Kochi Medical School near Osaka. Participants were randomly assigned to 4 learning conditions and tested before and after the study on their knowledge of and skill in performing an abdominal examination, in a multiple-choice test and an objective structured clinical examination (OSCE), respectively. Information about performance in the programme was collected from school records and students were classified as average, good or excellent. Student and faculty evaluations of their experience in the study were explored by means of a short evaluation survey. Compared to the control group, all 3 study groups exhibited significant gains in performance on knowledge and performance measures. For the knowledge measure, the gains of the computer-assisted and computer-assisted plus text-based learning groups were significantly greater than the gains of the text-based learning group. The performances of the 3 groups did not differ on the OSCE measure. Analyses of gains by performance level revealed that high achieving students' learning was independent of study method. Lower achieving students performed better after using computer-based learning methods. The results suggest that computer-assisted learning methods will be of greater help to students who do not find the traditional methods effective. Explorations of the factors behind this are a matter for future research.
Nicolakakis, Nektaria; Stock, Susan R; Abrahamowicz, Michal; Kline, Rex; Messing, Karen
2017-11-01
Computer work has been identified as a risk factor for upper extremity musculoskeletal problems (UEMSP). But few studies have investigated how psychosocial and organizational work factors affect this relation. Nor have gender differences in the relation between UEMSP and these work factors been studied. We sought to estimate: (1) the association between UEMSP and a range of physical, psychosocial and organizational work exposures, including the duration of computer work, and (2) the moderating effect of psychosocial work exposures on the relation between computer work and UEMSP. Using 2007-2008 Québec survey data on 2478 workers, we carried out gender-stratified multivariable logistic regression modeling and two-way interaction analyses. In both genders, odds of UEMSP were higher with exposure to high physical work demands and emotionally demanding work. Additionally among women, UEMSP were associated with duration of occupational computer exposure, sexual harassment, tense situations when dealing with clients, high quantitative demands and lack of prospects for promotion, and among men, with low coworker support, episodes of unemployment, low job security and contradictory work demands. Among women, the effect of computer work on UEMSP was considerably increased in the presence of emotionally demanding work, and may also be moderated by low recognition at work, contradictory work demands, and low supervisor support. These results suggest that the relations between UEMSP and computer work are moderated by psychosocial work exposures and that the relations between working conditions and UEMSP are somewhat different for each gender, highlighting the complexity of these relations and the importance of considering gender.
A general concept for consistent documentation of computational analyses
Müller, Fabian; Nordström, Karl; Lengauer, Thomas; Schulz, Marcel H.
2015-01-01
The ever-growing amount of data in the field of life sciences demands standardized ways of high-throughput computational analysis. This standardization requires a thorough documentation of each step in the computational analysis to enable researchers to understand and reproduce the results. However, due to the heterogeneity in software setups and the high rate of change during tool development, reproducibility is hard to achieve. One reason is that there is no common agreement in the research community on how to document computational studies. In many cases, simple flat files or other unstructured text documents are provided by researchers as documentation, which are often missing software dependencies, versions and sufficient documentation to understand the workflow and parameter settings. As a solution we suggest a simple and modest approach for documenting and verifying computational analysis pipelines. We propose a two-part scheme that defines a computational analysis using a Process and an Analysis metadata document, which jointly describe all necessary details to reproduce the results. In this design we separate the metadata specifying the process from the metadata describing an actual analysis run, thereby reducing the effort of manual documentation to an absolute minimum. Our approach is independent of a specific software environment, results in human readable XML documents that can easily be shared with other researchers and allows an automated validation to ensure consistency of the metadata. Because our approach has been designed with little to no assumptions concerning the workflow of an analysis, we expect it to be applicable in a wide range of computational research fields. Database URL: http://deep.mpi-inf.mpg.de/DAC/cmds/pub/pyvalid.zip PMID:26055099
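A toy illustration of the two-part scheme: one document describes the reusable process, the other records a concrete run that references it. All element and attribute names below are assumptions for illustration; the authors' actual XML layout is defined in the package at the Database URL above.

```python
# Hypothetical sketch of the two-part documentation scheme: a Process document
# describing the reusable pipeline, and an Analysis document recording one run.
# Element and attribute names are assumptions for illustration; the paper's
# actual XML schema may differ (see the Database URL above).
import xml.etree.ElementTree as ET

process = ET.Element("process", name="read_alignment", version="1.0")
step = ET.SubElement(process, "step", tool="bwa", toolVersion="0.7.17")
ET.SubElement(step, "parameter", name="algorithm").text = "mem"

analysis = ET.Element("analysis", process="read_alignment", date="2015-06-01")
ET.SubElement(analysis, "input", path="sample1.fastq.gz")
ET.SubElement(analysis, "output", path="sample1.bam")

for name, root in [("process.xml", process), ("analysis.xml", analysis)]:
    ET.ElementTree(root).write(name, encoding="utf-8", xml_declaration=True)
```

Separating the two documents means only the small Analysis record has to be written per run, which is what keeps the manual documentation effort minimal.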
A computational study of liposome logic: towards cellular computing from the bottom up
Smaldon, James; Romero-Campero, Francisco J.; Fernández Trillo, Francisco; Gheorghe, Marian; Alexander, Cameron
2010-01-01
In this paper we propose a new bottom-up approach to cellular computing, in which computational chemical processes are encapsulated within liposomes. This “liposome logic” approach (also called vesicle computing) makes use of supra-molecular chemistry constructs, e.g. protocells, chells, etc., as minimal cellular platforms to which logical functionality can be added. Modeling and simulations feature prominently in “top-down” synthetic biology, particularly in the specification, design and implementation of logic circuits through bacterial genome reengineering. The second contribution of this paper is the demonstration of a novel set of tools for the specification, modelling and analysis of “bottom-up” liposome logic. In particular, simulation and modelling techniques are used to analyse some example liposome logic designs, ranging from relatively simple NOT gates and NAND gates to SR latches and D flip-flops, all the way to 3-bit ripple counters. The approach we propose consists of specifying, by means of P systems, gene regulatory network-like systems operating inside proto-membranes. This P systems specification can be automatically translated and executed through a multiscale pipeline composed of a dissipative particle dynamics (DPD) simulator and Gillespie’s stochastic simulation algorithm (SSA). Finally, model selection and analysis can be performed through a model checking phase. This is the first paper we are aware of that brings formal specifications, DPD, SSA and model checking to bear on the problem of modeling target computational functionality in protocells. Potential chemical routes for the laboratory implementation of these simulations are also discussed, thus for the first time suggesting a potentially realistic physicochemical implementation for membrane computing from the bottom up. PMID:21886681
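The stochastic layer of such a pipeline is Gillespie's SSA: draw an exponential waiting time from the total propensity, then pick which reaction fires in proportion to its rate. A bare-bones sketch for a single-gene expression module (illustrative rate constants, not a liposome logic design from the paper):

```python
# A bare-bones Gillespie SSA of the kind used in the paper's multiscale
# pipeline, here for a single-gene expression model (transcription,
# translation, degradation). Rate constants are illustrative only.
import numpy as np

rng = np.random.default_rng(7)
k_tx, k_tl, d_m, d_p = 0.5, 2.0, 0.2, 0.05   # assumed rates (1/s)
m, p, t, t_end = 0, 0, 0.0, 200.0

while t < t_end:
    rates = np.array([k_tx, k_tl * m, d_m * m, d_p * p])
    total = rates.sum()
    if total == 0:
        break
    t += rng.exponential(1.0 / total)        # time to next reaction
    r = rng.choice(4, p=rates / total)       # which reaction fires
    if r == 0: m += 1      # transcription
    elif r == 1: p += 1    # translation
    elif r == 2: m -= 1    # mRNA decay
    else: p -= 1           # protein decay

print(f"final mRNA={m}, protein={p} at t={t:.1f}s")
```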
ERIC Educational Resources Information Center
Shade, Daniel D.
1994-01-01
Provides advice and suggestions for educators or parents who are trying to decide what type of computer to buy to run the latest computer software for children. Suggests that purchasers should buy a computer with as large a hard drive as possible, at least 10 megabytes of RAM, and a CD-ROM drive. (MDM)
Data Processing: Fifteen Suggestions for Computer Training in Your Business Education Classes.
ERIC Educational Resources Information Center
Barr, Lowell L.
1980-01-01
Presents 15 suggestions for training business education students in the use of computers. Suggestions involve computer language, method of presentation, laboratory time, programing assignments, instructions and handouts, problem solving, deadlines, reviews, programming concepts, programming logic, documentation, and defensive programming. (CT)
Nee, Derek Evan; Brown, Joshua W.
2013-01-01
Recent theories propose that the prefrontal cortex (PFC) is organized in a hierarchical fashion with more abstract, higher level information represented in anterior regions and more concrete, lower level information represented in posterior regions. This hierarchical organization affords flexible adjustments of action plans based on the context. Computational models suggest that such hierarchical organization in the PFC is achieved through interactions with the basal ganglia (BG) wherein the BG gate relevant contexts into the PFC. Here, we tested this proposal using functional magnetic resonance imaging (fMRI). Participants were scanned while updating working memory (WM) with 2 levels of hierarchical contexts. Consistent with PFC abstraction proposals, higher level context updates involved anterior portions of the PFC (BA 46), whereas lower level context updates involved posterior portions of the PFC (BA 6). Computational models were only partially supported as the BG were sensitive to higher, but not lower level context updates. The posterior parietal cortex (PPC) showed the opposite pattern. Analyses examining changes in functional connectivity confirmed dissociable roles of the anterior PFC–BG during higher level context updates and posterior PFC–PPC during lower level context updates. These results suggest that hierarchical contexts are organized by distinct frontal–striatal and frontal–parietal networks. PMID:22798339
Kuiper, L M; Thijs, A; Smulders, Y M
2012-01-01
The advent of beamer projection of radiological images raises the issue of whether such projection compromises diagnostic accuracy. The purpose of this study was to evaluate whether beamer projection of chest X-rays is inferior to monitor display. We selected 53 chest X-rays with subtle abnormalities and 15 normal X-rays. The images were independently judged by a senior radiologist and a senior pulmonologist on a state-of-the-art computer monitor. We used their unanimous or consensus judgment as the reference test. Subsequently, four observers (one senior pulmonologist, one senior radiologist and one resident from each speciality) judged these X-rays on a standard clinical computer monitor and with beamer projection. We compared the number of correct results for each method. Overall, the sensitivity and specificity did not differ between monitor and beamer projection. Separate analyses of senior and junior examiners suggested that senior examiners had a moderate loss of diagnostic accuracy (8% lower sensitivity, p < 0.05, and 6% lower specificity, p = ns) associated with the use of beamer projection, whereas juniors showed similar performance on both imaging modalities. These initial data suggest that beamer projection may be associated with a small loss of diagnostic accuracy in specific subgroups of physicians. This finding illustrates the need for more extensive studies.
Hierarchical prediction errors in midbrain and septum during social learning.
Diaconescu, Andreea O; Mathys, Christoph; Weber, Lilian A E; Kasper, Lars; Mauer, Jan; Stephan, Klaas E
2017-04-01
Social learning is fundamental to human interactions, yet its computational and physiological mechanisms are not well understood. One prominent open question concerns the role of neuromodulatory transmitters. We combined fMRI, computational modelling and genetics to address this question in two separate samples (N = 35, N = 47). Participants played a game requiring inference on an adviser's intentions whose motivation to help or mislead changed over time. Our analyses suggest that hierarchically structured belief updates about current advice validity and the adviser's trustworthiness, respectively, depend on different neuromodulatory systems. Low-level prediction errors (PEs) about advice accuracy not only activated regions known to support 'theory of mind', but also the dopaminergic midbrain. Furthermore, PE responses in ventral striatum were influenced by the Met/Val polymorphism of the Catechol-O-Methyltransferase (COMT) gene. By contrast, high-level PEs ('expected uncertainty') about the adviser's fidelity activated the cholinergic septum. These findings, replicated in both samples, have important implications: They suggest that social learning rests on hierarchically related PEs encoded by midbrain and septum activity, respectively, in the same manner as other forms of learning under volatility. Furthermore, these hierarchical PEs may be broadcast by dopaminergic and cholinergic projections to induce plasticity specifically in cortical areas known to represent beliefs about others. © The Author (2017). Published by Oxford University Press.
Lucas, Margery; Koff, Elissa; Grossmith, Samantha; Migliorini, Robyn
2011-06-01
This study assessed the effects of short- and long-term mating contexts on preferences for body characteristics of potential relationship partners in lesbians and heterosexual women. Lesbians (n = 41) rated figure drawings and computer-generated images of women that varied in body fat, waist-to-hip ratio, and breast size; heterosexual women (n = 95) rated computer-generated images of men that varied in muscularity and body fat. Both lesbians and heterosexual women showed a shift in preferences toward more physically attractive partners for short-term relationships. All body aspects were affected, except that heterosexual women did not show a preference shift for male body fat. The results were interpreted in terms of a mating trade-off strategy in which mate preferences are the consequence of cost/benefit analyses, and suggest that preferences for physical attributes of sexual partners may be shared by members of the same sex regardless of sexual orientation.
Fatigue in isometric contraction in a single muscle fibre: a compartmental calcium ion flow model.
Kothiyal, K P; Ibramsha, M
1986-01-01
Fatigue in muscle is a complex biological phenomenon which has so far eluded a definite explanation. Many biochemical and physiological models have been suggested in the literature to account for the decrement in the ability of muscle to sustain a given level of force for a long time. Some of these models are critically analysed in this paper and are shown to be unable to explain all the experimental observations. A new compartmental model based on intracellular calcium ion movement in muscle is proposed to study the mechanical responses of a muscle fibre. Computer simulation is performed to obtain model responses in isometric contraction to an impulse and a train of stimuli of long duration. The simulated curves have been compared with experimentally observed mechanical responses of the semitendinosus muscle fibre of Rana pipiens. The comparison of computed and observed responses indicates that the proposed calcium ion model indeed accounts very well for muscle fatigue.
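To make the compartmental idea concrete, the sketch below integrates a toy calcium-flow model: stimuli release Ca²⁺ from a store into the myoplasm, reuptake returns it, and a slowly equilibrating bound pool gradually sequesters it, so simulated tension rises and then declines. The structure and all parameters are illustrative assumptions, not the model of the paper.

```python
# Illustrative two-compartment-plus-buffer sketch of the calcium-flow idea
# described above; stimuli release Ca2+ from the sarcoplasmic reticulum (SR)
# into the myoplasm, where it drives force. Slow sequestration produces a
# fatigue-like decline. Structure and parameters are assumptions.
import numpy as np

dt, t_end = 1e-3, 2.0                       # time step and duration (s)
times = np.arange(0.0, t_end, dt)
sr, myo, bound = 1.0, 0.0, 0.0              # normalized Ca2+ in each pool
k_rel, k_up = 50.0, 5.0                     # SR release / reuptake rates (1/s)
k_bind, k_unbind = 2.0, 0.1                 # slow sequestration (fatigue term)
stim = (times % 0.05) < dt                  # ~20 Hz stimulus train
force = np.zeros_like(times)

for i, s in enumerate(stim):
    release = k_rel * sr * dt if s else 0.0
    uptake = k_up * myo * dt
    binding = k_bind * myo * dt
    unbinding = k_unbind * bound * dt
    sr += uptake - release
    myo += release - uptake - binding + unbinding
    bound += binding - unbinding
    force[i] = myo**2 / (myo**2 + 0.1**2)   # Hill-type activation of tension

print(f"peak tension {force.max():.2f}, tension at t_end {force[-1]:.2f}")
```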
Xiong, Zhiqiang; Liu, Wei; Zhou, Lei; Zou, Liqiang; Chen, Jun
2016-07-15
It has been revealed that some polyphenols can prevent enzymatic browning caused by polyphenoloxidase (PPO). Apigenin, widely distributed in many fruits and vegetables, is an important bioactive flavonoid compound. In this study, apigenin exhibited a strong inhibitory activity against PPO, and some reagents had a synergistic effect with apigenin on inhibiting PPO. Apigenin inhibited PPO activity reversibly in a mixed-type manner. The finding that the inactivation rate constant (k) of PPO increased while the activation energy (Ea) and thermodynamic parameters (ΔG, ΔH and ΔS) decreased indicates that the thermosensitivity and stability of PPO decreased. The conformational changes of PPO were revealed by fluorescence emission spectra and circular dichroism. Atomic force microscopy observation suggested that the dimension of PPO molecules was larger after interacting with apigenin. Moreover, computational docking simulation indicated that apigenin bound to PPO and inserted into the hydrophobic cavity of PPO to interact with some amino acid residues. Copyright © 2016 Elsevier Ltd. All rights reserved.
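The kinetic and thermodynamic quantities named above follow from textbook relations: Ea from an Arrhenius plot of inactivation rate constants at two temperatures, and ΔH, ΔG, ΔS from Eyring's equation. A worked sketch with assumed numbers (not the paper's data):

```python
# Hypothetical numbers illustrating how the kinetic/thermodynamic parameters
# above are obtained: Ea from an Arrhenius plot of inactivation rate constants,
# then Eyring estimates of dH, dG and dS at a given temperature.
import numpy as np

R = 8.314                             # gas constant, J/(mol K)
kB, h = 1.381e-23, 6.626e-34          # Boltzmann and Planck constants
T = np.array([323.0, 333.0])          # K (50 and 60 deg C, assumed)
k = np.array([0.012, 0.045])          # inactivation rates in 1/s (assumed)

Ea = R * np.log(k[1] / k[0]) / (1/T[0] - 1/T[1])   # Arrhenius
dH = Ea - R * T[0]                                  # Eyring: dH = Ea - RT
dG = R * T[0] * np.log(kB * T[0] / (h * k[0]))      # from k = (kB*T/h) e^(-dG/RT)
dS = (dH - dG) / T[0]
print(f"Ea={Ea/1e3:.0f} kJ/mol, dH={dH/1e3:.0f} kJ/mol, "
      f"dG={dG/1e3:.0f} kJ/mol, dS={dS:.0f} J/(mol K)")
```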
patterns of doctor–patient interaction in online environment.
Zummo, Marianna Lya
2015-01-01
This paper questions the nature of the communicative event that takes place in online contexts between doctors and web-users, showing computer-mediated linguistic norms and discussing the nature of the participants’ roles. Based on an analysis of 1005 posts occurring between doctors and the users of health service websites, I analyse how doctor–patient communication is affected by the medium and how health professionals overcome issues concerning the virtual medical visit. Results suggest that (a) online medical answers offer a different service from that expected by users, as doctors cannot always fulfill patient requests, and (b) net consultations use aspects of traditional doctor–patient exchange and yet present a language and a style that are affected by the computer-mediated environment. Additionally, it seems that this new form leads to a different model of doctor–patient relationship. The findings are intended to provide new insights into web-based discourse in doctor–patient communication and to demonstrate the emergence of a new style in medical communication.
Selective updating of working memory content modulates meso-cortico-striatal activity.
Murty, Vishnu P; Sambataro, Fabio; Radulescu, Eugenia; Altamura, Mario; Iudicello, Jennifer; Zoltick, Bradley; Weinberger, Daniel R; Goldberg, Terry E; Mattay, Venkata S
2011-08-01
Accumulating evidence from non-human primates and computational modeling suggests that dopaminergic signals arising from the midbrain (substantia nigra/ventral tegmental area) mediate striatal gating of the prefrontal cortex during the selective updating of working memory. Using event-related functional magnetic resonance imaging, we explored the neural mechanisms underlying the selective updating of information stored in working memory. Participants were scanned during a novel working memory task that parses the neurophysiology underlying working memory maintenance, overwriting, and selective updating. Analyses revealed a functionally coupled network consisting of a midbrain region encompassing the substantia nigra/ventral tegmental area, caudate, and dorsolateral prefrontal cortex that was selectively engaged during working memory updating compared to the overwriting and maintenance of working memory content. Further analysis revealed differential midbrain-dorsolateral prefrontal interactions during selective updating between low-performing and high-performing individuals. These findings highlight the role of this meso-cortico-striatal circuitry during the selective updating of working memory in humans, which complements previous research in behavioral neuroscience and computational modeling. Published by Elsevier Inc.
An unusual exostotic lesion of the maxillary sinus from Roman Lincoln.
Kendall, Ross; Kendall, Ellen J; Macleod, Iain; Gowland, Rebecca; Beaumont, Julia
2015-12-01
This report provides a differential diagnosis of an exostotic bony lesion within the left maxillary sinus of a Romano-British (3rd to 4th century AD) adult male from Newport, Lincoln. Macroscopic, radiographic, and cone beam computed tomography (CBCT) analyses suggest that the lesion is likely of odontogenic origin. The overall size of the lesion and areas of sclerosis and radiolucency, together with its hypothesised odontogenic origin, suggest that the lesion represents a chronic exostotic osteomyelitic reaction to the presence of odontogenic bacteria. While modern case studies of odontogenic maxillary sinus osteomyelitis are noteworthy, published cases of this condition are extremely rare in an archaeological context and may be underreported due to the enclosed nature of the sinuses. Such infections may have serious implications for individual and population health, and non-destructive investigation should be considered in cases where significant maxillary caries are present. Copyright © 2015 Elsevier Inc. All rights reserved.
Chen, C N; Su, Y; Baybayan, P; Siruno, A; Nagaraja, R; Mazzarella, R; Schlessinger, D; Chen, E
1996-01-01
Ordered shotgun sequencing (OSS) has been successfully carried out with an Xq25 YAC substrate. yWXD703 DNA was subcloned into lambda phage and sequences of insert ends of the lambda subclones were used to generate a map to select a minimum tiling path of clones to be completely sequenced. The sequence of 135 038 nt contains the entire ANT2 cDNA as well as four other candidates suggested by computer-assisted analyses. One of the putative genes is homologous to a gene implicated in Graves' disease and it, ANT2 and two others are confirmed by EST matches. The results suggest that OSS can be applied to YACs in accord with earlier simulations and further indicate that the sequence of the YAC accurately reflects the sequence of uncloned human DNA. PMID:8918809
Computer assisted holographic moire contouring
NASA Astrophysics Data System (ADS)
Sciammarella, Cesar A.
2000-01-01
Theoretical analyses and experimental results on holographic moire contouring on diffusely reflecting objects are presented. The sensitivity and limitations of the method are discussed. Particular emphasis is put on computer-assisted data retrieval, processing, and recording.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spycher, Nicolas; Peiffer, Loic; Finsterle, Stefan
GeoT implements the multicomponent geothermometry method developed by Reed and Spycher (1984, Geochim. Cosmochim. Acta 46, 513–528) into a stand-alone computer program, to ease the application of this method and to improve the prediction of geothermal reservoir temperatures using full and integrated chemical analyses of geothermal fluids. Reservoir temperatures are estimated from statistical analyses of mineral saturation indices computed as a function of temperature. The reconstruction of the deep geothermal fluid compositions, and the geothermometry computations, are all implemented in the same computer program, allowing unknown or poorly constrained input parameters to be estimated by numerical optimization using existing parameter estimation software, such as iTOUGH2, PEST, or UCODE. This integrated geothermometry approach presents advantages over classical geothermometers for fluids that have not fully equilibrated with reservoir minerals and/or that have been subject to processes such as dilution and gas loss.
A Bayesian model averaging approach with non-informative priors for cost-effectiveness analyses.
Conigliani, Caterina
2010-07-20
We consider the problem of assessing new and existing technologies for their cost-effectiveness in the case where data on both costs and effects are available from a clinical trial, and we address it by means of the cost-effectiveness acceptability curve. The main difficulty in these analyses is that cost data usually exhibit highly skew and heavy-tailed distributions, so that it can be extremely difficult to produce realistic probabilistic models for the underlying population distribution. Here, in order to integrate the uncertainty about the model into the analysis of cost data and into cost-effectiveness analyses, we consider an approach based on Bayesian model averaging (BMA) in the particular case of weak prior information about the unknown parameters of the different models involved in the procedure. The main consequence of this assumption is that the marginal densities required by BMA are undetermined. However, in accordance with the theory of partial Bayes factors and in particular of fractional Bayes factors, we suggest replacing each marginal density with a ratio of integrals that can be efficiently computed via path sampling. Copyright (c) 2010 John Wiley & Sons, Ltd.
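Once posterior draws of incremental cost and effect are available (in the paper's setting, pooled over models with fractional-Bayes-factor weights), the acceptability curve is just the posterior probability of a positive incremental net benefit at each willingness-to-pay value. A sketch with a single synthetic posterior standing in for the BMA mixture:

```python
# Sketch of a cost-effectiveness acceptability curve (CEAC) from posterior
# draws of incremental cost and effect. In a BMA analysis the draws would be
# pooled across models with fractional-Bayes-factor weights; here one
# synthetic posterior stands in for that mixture.
import numpy as np

rng = np.random.default_rng(3)
# Skewed incremental costs (lognormal) and modest incremental effects (QALYs).
d_cost = rng.lognormal(mean=7.0, sigma=0.5, size=10_000) - 900
d_effect = rng.normal(0.05, 0.03, size=10_000)

lambdas = np.linspace(0, 50_000, 101)          # willingness-to-pay per QALY
ceac = [(lam * d_effect - d_cost > 0).mean() for lam in lambdas]

for lam, p in zip(lambdas[::25], ceac[::25]):
    print(f"WTP {lam:>8,.0f}: P(cost-effective) = {p:.2f}")
```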
Clustering of dietary intake and sedentary behavior in 2-year-old children.
Gubbels, Jessica S; Kremers, Stef P J; Stafleu, Annette; Dagnelie, Pieter C; de Vries, Sanne I; de Vries, Nanne K; Thijs, Carel
2009-08-01
To examine clustering of energy balance-related behaviors (EBRBs) in young children. This is crucial because lifestyle habits are formed at an early age and track in later life. This study is the first to examine EBRB clustering in children as young as 2 years. Cross-sectional data originated from the Child, Parent and Health: Lifestyle and Genetic Constitution (KOALA) Birth Cohort Study. Parents of 2578 2-year-old children completed a questionnaire. Correlation analyses, principal component analyses, and linear regression analyses were performed to examine clustering of EBRBs. We found modest but consistent correlations in EBRBs. Two clusters emerged: a "sedentary-snacking cluster" and a "fiber cluster." Television viewing clustered with computer use and unhealthy dietary behaviors. Children who frequently consumed vegetables also consumed fruit and brown bread more often and white bread less often. Lower maternal education and maternal obesity were associated with high scores on the sedentary-snacking cluster, whereas higher educational level was associated with high fiber cluster scores. Obesity-prone behavioral clusters are already visible in 2-year-old children and are related to maternal characteristics. The findings suggest that obesity prevention should apply an integrated approach to physical activity and dietary intake in early childhood.
Su, Andreas A. H.; Tripp, Vanessa; Randau, Lennart
2013-01-01
The methanogenic archaeon Methanopyrus kandleri grows near the upper temperature limit for life. Genome analyses revealed strategies to adapt to these harsh conditions and elucidated a unique transfer RNA (tRNA) C-to-U editing mechanism at base 8 for 30 different tRNA species. Here, RNA-Seq deep sequencing methodology was combined with computational analyses to characterize the small RNome of this hyperthermophilic organism and to obtain insights into the RNA metabolism at extreme temperatures. A large number of 132 small RNAs were identified that guide RNA modifications, which are expected to stabilize structured RNA molecules. The C/D box guide RNAs were shown to exist as circular RNA molecules. In addition, clustered regularly interspaced short palindromic repeats RNA processing and potential regulatory RNAs were identified. Finally, the identification of tRNA precursors before and after the unique C8-to-U8 editing activity enabled the determination of the order of tRNA processing events with termini truncation preceding intron removal. This order of tRNA maturation follows the compartmentalized tRNA processing order found in Eukaryotes and suggests its conservation during evolution. PMID:23620296
The application of computer-aided technologies in automotive styling design
NASA Astrophysics Data System (ADS)
Zheng, Ze-feng; Zhang, Ji; Zheng, Ying
2012-04-01
In the automotive industry, styling design is its lifeblood and creative design its soul. Computer-aided technology has been widely used in the automotive industry and has received growing attention. This paper introduces the application of computer-aided technologies including CAD, CAM and CAE, analyses the process of automotive structural design, and describes development trends in computer-aided design.
Travers, Timothy; Wang, Katherine J.; Lopez, Cesar A.; ...
2018-02-09
Gram-negative multidrug resistance currently presents a serious threat to public health with infections effectively rendered untreatable. Multiple molecular mechanisms exist that cause antibiotic resistance and in addition, the last three decades have seen slowing rates of new drug development. In this paper, we summarize the use of various computational techniques for investigating the mechanisms of multidrug resistance mediated by Gram-negative tripartite efflux pumps and membranes. Recent work in our lab combines data-driven sequence and structure analyses to study the interactions and dynamics of these bacterial components. Computational studies can complement experimental methodologies for gaining crucial insights into combatting multidrug resistance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hans D. Gougar
The Idaho National Laboratory’s deterministic neutronics analysis codes and methods were applied to the computation of the core multiplication factor of the HTR-Proteus pebble bed reactor critical facility. A combination of unit cell calculations (COMBINE-PEBDAN), 1-D discrete ordinates transport (SCAMP), and nodal diffusion calculations (PEBBED) were employed to yield keff and flux profiles. Preliminary results indicate that these tools, as currently configured and used, do not yield satisfactory estimates of keff. If control rods are not modeled, these methods can deliver much better agreement with experimental core eigenvalues, which suggests that development efforts should focus on modeling control rod and other absorber regions. Under some assumptions and in 1D subcore analyses, diffusion theory agrees well with transport. This suggests that developments in specific areas can produce a viable core simulation approach. Some corrections have been identified and can be further developed, specifically: treatment of the upper void region, treatment of inter-pebble streaming, and explicit (multiscale) transport modeling of TRISO fuel particles as a first step in cross section generation. Until corrections are made that yield better agreement with experiment, conclusions from core design and burnup analyses should be regarded as qualitative and not benchmark quality.
What is needed to eliminate new pediatric HIV infections: The contribution of model-based analyses
Doherty, Katie; Ciaranello, Andrea
2013-01-01
Purpose of Review Computer simulation models can identify key clinical, operational, and economic interventions that will be needed to achieve the elimination of new pediatric HIV infections. In this review, we summarize recent findings from model-based analyses of strategies for prevention of mother-to-child HIV transmission (MTCT). Recent Findings In order to achieve elimination of MTCT (eMTCT), model-based studies suggest that scale-up of services will be needed in several domains: uptake of services and retention in care (the PMTCT “cascade”), interventions to prevent HIV infections in women and reduce unintended pregnancies (the “four-pronged approach”), efforts to support medication adherence through long periods of pregnancy and breastfeeding, and strategies to make breastfeeding safer and/or shorter. Models also project the economic resources that will be needed to achieve these goals in the most efficient ways to allocate limited resources for eMTCT. Results suggest that currently recommended PMTCT regimens (WHO Option A, Option B, and Option B+) will be cost-effective in most settings. Summary Model-based results can guide future implementation science, by highlighting areas in which additional data are needed to make informed decisions and by outlining critical interventions that will be necessary in order to eliminate new pediatric HIV infections. PMID:23743788
Efficient generation of low-energy folded states of a model protein
NASA Astrophysics Data System (ADS)
Gordon, Heather L.; Kwan, Wai Kei; Gong, Chunhang; Larrass, Stefan; Rothstein, Stuart M.
2003-01-01
A number of short simulated annealing runs are performed on a highly-frustrated 46-"residue" off-lattice model protein. We perform, in an iterative fashion, a principal component analysis of the 946 nonbonded interbead distances, followed by two varieties of cluster analyses: hierarchical and k-means clustering. We identify several distinct sets of conformations with reasonably consistent cluster membership. Nonbonded distance constraints are derived for each cluster and are employed within a distance geometry approach to generate many new conformations, previously unidentified by the simulated annealing experiments. Subsequent analyses suggest that these new conformations are members of the parent clusters from which they were generated. Furthermore, several novel, previously unobserved structures with low energy were uncovered, augmenting the ensemble of simulated annealing results, and providing a complete distribution of low-energy states. The computational cost of this approach to generating low-energy conformations is small when compared to the expense of further Monte Carlo simulated annealing runs.
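A schematic of the iterative PCA-plus-clustering step on assumed synthetic conformations: compute the 946 nonbonded interbead distances (pairs with |i−j| ≥ 3 in a 46-bead chain), reduce them with PCA, and cluster in the reduced space. Distance bounds per cluster would then seed the distance-geometry generation the paper describes.

```python
# Schematic of the iterative analysis described above: PCA on the 946
# nonbonded interbead distances of sampled conformations, followed by k-means
# in the reduced space. Synthetic conformations stand in for annealing output.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
n_conf, n_beads = 200, 46
coords = 2.0 * rng.standard_normal((n_conf, n_beads, 3))
iu = np.triu_indices(n_beads, k=3)          # skip bonded/angle pairs -> 946 distances
dists = np.array([np.linalg.norm(c[iu[0]] - c[iu[1]], axis=1) for c in coords])

scores = PCA(n_components=10).fit_transform(dists)      # (200, 10)
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(scores)
print(np.bincount(labels))                  # cluster sizes; per-cluster distance
                                            # bounds would seed distance geometry
```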
On the use of log-transformation vs. nonlinear regression for analyzing biological power laws.
Xiao, Xiao; White, Ethan P; Hooten, Mevin B; Durham, Susan L
2011-10-01
Power-law relationships are among the most well-studied functional relationships in biology. Recently the common practice of fitting power laws using linear regression (LR) on log-transformed data has been criticized, calling into question the conclusions of hundreds of studies. It has been suggested that nonlinear regression (NLR) is preferable, but no rigorous comparison of these two methods has been conducted. Using Monte Carlo simulations, we demonstrate that the error distribution determines which method performs better, with NLR better characterizing data with additive, homoscedastic, normal error and LR better characterizing data with multiplicative, heteroscedastic, lognormal error. Analysis of 471 biological power laws shows that both forms of error occur in nature. While previous analyses based on log-transformation appear to be generally valid, future analyses should choose methods based on a combination of biological plausibility and analysis of the error distribution. We provide detailed guidelines and associated computer code for doing so, including a model averaging approach for cases where the error structure is uncertain.
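The comparison is easy to reproduce: fit y = a·x^b once by linear regression on log-transformed data and once by nonlinear least squares on the original scale. On data with multiplicative lognormal error, as below, the log-log fit recovers the exponent more reliably; with additive normal error the ranking reverses. Noise levels and the true exponent are illustrative.

```python
# The two fitting strategies compared above, on data with multiplicative
# lognormal error (where log-log linear regression is the better choice).
# True exponent b = 0.75; noise level is illustrative.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import linregress

rng = np.random.default_rng(11)
x = rng.uniform(1, 100, 200)
y = 2.0 * x**0.75 * rng.lognormal(0.0, 0.3, 200)   # multiplicative error

# Linear regression (LR) on log-transformed data
lr = linregress(np.log(x), np.log(y))
# Nonlinear regression (NLR) on the original scale
(a_nlr, b_nlr), _ = curve_fit(lambda x, a, b: a * x**b, x, y, p0=(1.0, 1.0))

print(f"LR : a={np.exp(lr.intercept):.2f}, b={lr.slope:.3f}")
print(f"NLR: a={a_nlr:.2f}, b={b_nlr:.3f}")
```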
Analytic and heuristic processing influences on adolescent reasoning and decision-making.
Klaczynski, P A
2001-01-01
The normative/descriptive gap is the discrepancy between actual reasoning and traditional standards for reasoning. The relationship between age and the normative/descriptive gap was examined by presenting adolescents with a battery of reasoning and decision-making tasks. Middle adolescents (N = 76) performed closer to normative ideals than early adolescents (N = 66), although the normative/descriptive gap was large for both groups. Correlational analyses revealed that (1) normative responses correlated positively with each other, (2) nonnormative responses were positively interrelated, and (3) normative and nonnormative responses were largely independent. Factor analyses suggested that performance was based on two processing systems. The "analytic" system operates on "decontextualized" task representations and underlies conscious, computational reasoning. The "heuristic" system operates on "contextualized," content-laden representations and produces "cognitively cheap" responses that sometimes conflict with traditional norms. Analytic processing was more clearly linked to age and to intelligence than heuristic processing. Implications for cognitive development, the competence/performance issue, and rationality are discussed.
Alshaarawy, Omayma; Anthony, James C.
2016-01-01
Background In preclinical animal studies, evidence links cannabis smoking (CS) with hyperphagia, obesity, and insulin resistance. Nonetheless, in humans, CS might protect against type 2 diabetes mellitus (DM). Here, we offer epidemiological estimates from eight independent replications from (1) the National Health and Nutrition Examination Surveys, and (2) the National Surveys on Drug Use and Health (2005-12). Methods For each national survey participant, computer-assisted self-interviews assess CS and physician-diagnosed DM; NHANES provides additional biomarker values and a composite DM diagnosis. Regression analyses produce estimates of CS-DM associations. Meta-analyses summarize the replication estimates. Results Recently active CS and DM are inversely associated. The meta-analytic summary odds ratio is 0.7 (95% CI = 0.6, 0.8). Conclusions Current evidence is too weak for causal inference, but there now is a more stable evidence base for new lines of clinical translational research on a possibly protective (or spurious) CS-DM association suggested in prior research. PMID:25978795
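A meta-analytic summary odds ratio of this kind is typically obtained by inverse-variance pooling on the log-odds scale. The sketch below uses placeholder study estimates, not the eight replications' actual values.

```python
# Fixed-effect inverse-variance pooling of study-level odds ratios, the kind
# of meta-analytic summary reported above. The eight ORs and standard errors
# here are placeholders, not the paper's estimates.
import numpy as np

or_i = np.array([0.66, 0.72, 0.81, 0.64, 0.75, 0.70, 0.68, 0.74])
se_log = np.array([0.15, 0.18, 0.20, 0.16, 0.14, 0.17, 0.19, 0.15])

w = 1.0 / se_log**2                       # inverse-variance weights
pooled_log = np.sum(w * np.log(or_i)) / w.sum()
se_pooled = np.sqrt(1.0 / w.sum())
lo = np.exp(pooled_log - 1.96 * se_pooled)
hi = np.exp(pooled_log + 1.96 * se_pooled)
print(f"summary OR = {np.exp(pooled_log):.2f} (95% CI {lo:.2f}, {hi:.2f})")
```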
Iron-absorption band analysis for the discrimination of iron-rich zones
NASA Technical Reports Server (NTRS)
Rowan, L. C. (Principal Investigator)
1973-01-01
The author has identified the following significant results. Study has concentrated on the two primary aspects of the project: structural analysis through evaluation of lineaments and circular features, and spectral analyses through digital computer-processing techniques. Several previously unrecognized lineaments are mapped which may be the surface manifestations of major fault or fracture zones. Two of these, the Walker Lane and the Midas Trench lineament system, transect the predominantly NNE-NNW-trending mountain ranges for more than 500 km. Correlation of major lineaments with productive mining districts implies a genetic relationship; the 50 circular or elliptical features delineated suggest a related role for Tertiary volcanism. Color-ratio composites have been used to identify limonitic zones and to discriminate mafic and felsic rock by combining diazo color transparencies of three different ratios. The EROS Data Center scene identification number for the color composite in this report is ER 1 CC 500. Refinement of enhancement procedures for the ratio images is progressing. Fieldwork in coordination with both spectral and structural analyses is underway.
Dimsdale-Zucker, Halle R; Ritchey, Maureen; Ekstrom, Arne D; Yonelinas, Andrew P; Ranganath, Charan
2018-01-18
The hippocampus plays a critical role in spatial and episodic memory. Mechanistic models predict that hippocampal subfields have computational specializations that differentially support memory. However, there is little empirical evidence suggesting differences between the subfields, particularly in humans. To clarify how hippocampal subfields support human spatial and episodic memory, we developed a virtual reality paradigm where participants passively navigated through houses (spatial contexts) across a series of videos (episodic contexts). We then used multivariate analyses of high-resolution fMRI data to identify neural representations of contextual information during recollection. Multi-voxel pattern similarity analyses revealed that CA1 represented objects that shared an episodic context as more similar than those from different episodic contexts. CA23DG showed the opposite pattern, differentiating between objects encountered in the same episodic context. The complementary characteristics of these subfields explain how we can parse our experiences into cohesive episodes while retaining the specific details that support vivid recollection.
A marked correlation function for constraining modified gravity models
NASA Astrophysics Data System (ADS)
White, Martin
2016-11-01
Future large scale structure surveys will provide increasingly tight constraints on our cosmological model. These surveys will report results on the distance scale and growth rate of perturbations through measurements of Baryon Acoustic Oscillations and Redshift-Space Distortions. It is interesting to ask: what further analyses should become routine, so as to test as-yet-unknown models of cosmic acceleration? Models which aim to explain the accelerated expansion rate of the Universe by modifications to General Relativity often invoke screening mechanisms which can imprint a non-standard density dependence on their predictions. This suggests density-dependent clustering as a 'generic' constraint. This paper argues that a density-marked correlation function provides a density-dependent statistic which is easy to compute and report and requires minimal additional infrastructure beyond what is routinely available to such survey analyses. We give one realization of this idea and study it using low order perturbation theory. We encourage groups developing modified gravity theories to see whether such statistics provide discriminatory power for their models.
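The following sketch illustrates one way a density-marked correlation function of this kind can be computed. The down-weighting mark and its parameters are illustrative assumptions rather than the paper's exact definition, and the brute-force pair count stands in for the tree-based counters a survey pipeline would use.

    # M(r) = <m_i m_j>_pairs(r) / mbar^2 for points `pos` (N,3) with
    # local overdensities `delta` (N,). The mark suppresses objects in
    # dense regions; rho_s and p are illustrative parameters.
    import numpy as np

    def marked_correlation(pos, delta, r_edges, rho_s=4.0, p=1.0):
        marks = ((rho_s + 1.0) / (rho_s + 1.0 + delta)) ** p
        mbar = marks.mean()
        d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
        iu = np.triu_indices(len(pos), k=1)          # unique pairs only
        r, mm = d[iu], (marks[:, None] * marks[None, :])[iu]
        M = np.empty(len(r_edges) - 1)
        for k in range(len(M)):
            sel = (r >= r_edges[k]) & (r < r_edges[k + 1])
            M[k] = mm[sel].mean() / mbar**2 if sel.any() else np.nan
        return M

A departure of M(r) from unity on some scale then signals density-dependent clustering of the kind screening mechanisms would produce.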
The November 1, 2017 issue of Cancer Research is dedicated to a collection of computational resource papers in genomics, proteomics, animal models, imaging, and clinical subjects for non-bioinformaticists looking to incorporate computing tools into their work. Scientists at Pacific Northwest National Laboratory have developed P-MartCancer, an open, web-based interactive software tool that enables statistical analyses of peptide or protein data generated from mass-spectrometry (MS)-based global proteomics experiments.
Space-Plane Spreadsheet Program
NASA Technical Reports Server (NTRS)
Mackall, Dale
1993-01-01
Basic Hypersonic Data and Equations (HYPERDATA) spreadsheet computer program provides data gained from three analyses of performance of space plane. Equations used to perform analyses derived from Newton's second law of physics, derivation included. First analysis is parametric study of some basic factors affecting ability of space plane to reach orbit. Second includes calculation of thickness of spherical fuel tank. Third produces ratio between volume of fuel and total mass for each of various aircraft. HYPERDATA intended for use on Macintosh(R) series computers running Microsoft Excel 3.0.
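The tank-thickness step lends itself to a worked relation. A thin-wall spherical pressure-vessel formula is the standard approach to such a calculation; whether HYPERDATA uses exactly this form is an assumption, since the abstract does not give its equations:

    % Thin-wall spherical pressure vessel (illustrative):
    t = \frac{p\,r}{2\,\sigma_{\mathrm{allow}}}, \qquad
    m_{\mathrm{tank}} = 4\pi r^{2} t\,\rho

where p is the tank pressure, r the sphere radius, sigma_allow the allowable wall stress and rho the wall material density; the tank mass would feed directly into the fuel-volume-to-total-mass ratio of the third analysis.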
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lizcano, D., E-mail: david.lizcano@udima.es; Martínez, A. María, E-mail: mariaaurora.martinez@udima.es
Edward Fredkin was an enthusiastic advocate of information-based theoretical physics, who, in the early 1980s, proposed a new theory of physics based on the idea that the universe is ultimately composed of software. According to Fredkin, reality should be considered as being composed not of particles, matter and forces or energy but of bits of data or information modified according to computational rules. Fredkin went on to demonstrate that, while energy is necessary for storing and retrieving information, it can be arbitrarily reduced in order to carry out any particular instance of information processing, and this operation does not have a lower bound. This implies that it is information rather than matter or energy that should be considered as the ultimate fundamental constituent of reality. This possibility had already been suggested by other scientists. Norbert Wiener heralded a fundamental shift from energy to information and suggested that the universe was founded essentially on the transformation of information, not energy. However, Konrad Zuse was the first, back in 1967, to defend the idea that a digital computer is computing the universe. Richard P. Feynman showed this possibility in a similar light in his reflections on how information related to matter and energy. Other pioneering research on the theory of digital physics was published by Kantor in 1977 and more recently by Stephen Wolfram in 2002, who thereby joined the host of voices upholding that it is patterns of information, not matter and energy, that constitute the cornerstones of reality. In this paper, we introduce the use of knowledge management tools for the purpose of analysing this topic.
Genomicus 2018: karyotype evolutionary trees and on-the-fly synteny computing
Nguyen, Nga Thi Thuy; Vincens, Pierre
2018-01-01
Abstract Since 2010, the Genomicus web server is available online at http://genomicus.biologie.ens.fr/genomicus. This graphical browser provides access to comparative genomic analyses in four different phyla (Vertebrates, Plants, Fungi, and non-vertebrate Metazoans). Users can analyse genomic information from extant species, as well as ancestral gene content and gene order for vertebrates and flowering plants, in an integrated evolutionary context. New analyses and visualization tools have recently been implemented in Genomicus Vertebrate. Karyotype structures from several genomes can now be compared along an evolutionary pathway (Multi-KaryotypeView), and synteny blocks can be computed and visualized between any two genomes (PhylDiagView). PMID:29087490
Requirements for Next Generation Comprehensive Analysis of Rotorcraft
NASA Technical Reports Server (NTRS)
Johnson, Wayne; Datta, Anubhav
2008-01-01
The unique demands of rotorcraft aeromechanics analysis have led to the development of software tools that are described as comprehensive analyses. The next generation of rotorcraft comprehensive analyses will be driven and enabled by the tremendous capabilities of high performance computing, particularly modular and scalable software executed on multiple cores. Development of a comprehensive analysis based on high performance computing both demands and permits a new analysis architecture. This paper describes a vision of the requirements for this next generation of comprehensive analyses of rotorcraft. The requirements are described and substantiated for what must be included, and justification is provided for what should be excluded. With this guide, a path to the next-generation code can be found.
Voting with Their Seats: Computer Laboratory Design and the Casual User
ERIC Educational Resources Information Center
Spennemann, Dirk H. R.; Atkinson, John; Cornforth, David
2007-01-01
Student computer laboratories are provided by most teaching institutions around the world; however, what is the most effective layout for such facilities? The log-in data files from computer laboratories at a regional university in Australia were analysed to determine whether there was a pattern in student seating. In particular, it was…
The Dynamic Geometrisation of Computer Programming
ERIC Educational Resources Information Center
Sinclair, Nathalie; Patterson, Margaret
2018-01-01
The goal of this paper is to explore dynamic geometry environments (DGE) as a type of computer programming language. Using projects created by secondary students in one particular DGE, we analyse the extent to which the various aspects of computational thinking--including both ways of doing things and particular concepts--were evident in their…
The effect of opponent type on human performance in a three-alternative choice task.
Lie, Celia; Baxter, Jennifer; Alsop, Brent
2013-10-01
Adult participants played computerised games of "Paper Scissors Rock". Participants in one group were told that they were playing against the computer, and those in the other group were told that they were playing against another participant in the adjacent room. The participant who won the most games would receive a $50 prize. For both groups, however, the opponent's responses (paper, scissors, or rock) were generated by the computer, and the distribution of these responses was varied across four blocks of 126 trials. Results were analysed using the generalised matching law for the three possible pairs of alternatives (paper vs. scissors, paper vs. rock, and scissors vs. rock) across all participants in each group. Overall, significantly higher estimates of sensitivity to the distribution of the opponent's responses were obtained from participants who were told their opponent was a computer compared to participants who were told their opponent was another participant. While adding to the existing literature showing that the generalised matching law is an adequate descriptor of human three-alternative choice behaviour, these findings show that external factors such as perceived opponent type can affect the efficacy of reinforcer contingencies on human behaviour. This suggests that generalising the results from tasks performed against a computer to real-life human-to-human interactions warrants some caution.
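A hedged sketch of the generalised-matching analysis described above: for one response pair, log response ratios are regressed on log ratios of the opponent's response distribution, and the slope estimates sensitivity. The block counts and probabilities are invented for illustration.

    # Generalised matching law for one pair of alternatives:
    #   log(B1/B2) = a * log(R1/R2) + log c
    # with a = sensitivity and log c = bias.
    import numpy as np

    B1 = np.array([70, 55, 40, 30])        # 'paper' responses per block
    B2 = np.array([30, 45, 60, 80])        # 'scissors' responses per block
    R1 = np.array([0.60, 0.50, 0.35, 0.20])  # opponent's p(paper)
    R2 = np.array([0.20, 0.30, 0.45, 0.60])  # opponent's p(scissors)

    x = np.log10(R1 / R2)
    y = np.log10(B1 / B2)
    a, log_c = np.polyfit(x, y, 1)   # slope = sensitivity, intercept = bias
    print(f"sensitivity a = {a:.2f}, bias log c = {log_c:.2f}")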
Ausems, Marlein; Mesters, Ilse; van Breukelen, Gerard; De Vries, Hein
2002-06-01
Smoking prevention programs usually run during school hours. In our study, an out-of-school program was developed consisting of a computer-tailored intervention aimed at the age group before school transition (11- to 12-year-old elementary schoolchildren). The aim of this study is to evaluate the additional effect of out-of-school smoking prevention. One hundred fifty-six participating schools were randomly allocated to one of four research conditions: (a) the in-school condition, an existing seven-lesson program; (b) the out-of-school condition, three computer-tailored letters sent to the students' homes; (c) the in-school and out-of-school condition, a combined approach; (d) the control condition. Pretest and 6 months follow-up data on smoking initiation and continuation, and data on psychosocial variables were collected from 3,349 students. Control and out-of-school conditions differed regarding posttest smoking initiation (18.1 and 10.4%) and regarding posttest smoking continuation (23.5 and 13.1%). Multilevel logistic regression analyses showed positive effects regarding the out-of-school program. Significant effects were not found regarding the in-school program, nor did the combined approach show stronger effects than the single-method approaches. The findings of this study suggest that smoking prevention trials for elementary schoolchildren can be effective when using out-of-school computer-tailored interventions.
Petascale supercomputing to accelerate the design of high-temperature alloys
Shin, Dongwon; Lee, Sangkeun; Shyam, Amit; ...
2017-10-25
Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ′-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. As a result, the approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.
Petascale supercomputing to accelerate the design of high-temperature alloys
NASA Astrophysics Data System (ADS)
Shin, Dongwon; Lee, Sangkeun; Shyam, Amit; Haynes, J. Allen
2017-12-01
Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ′-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. The approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.
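A minimal sketch of the final step described above: learning a map from elemental descriptors to DFT segregation energies so that new solutes can be screened without further supercell calculations. The descriptor names, the regressor choice and the random data are placeholders, not the paper's actual features, model or values.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(34, 4))   # e.g. atomic radius, electronegativity, ...
    y = rng.normal(size=34)        # segregation energy (eV) per solute

    model = RandomForestRegressor(n_estimators=200, random_state=0)
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print("cross-validated R^2:", scores.mean())
    model.fit(X, y)                # model.predict() then screens new solutes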
Kautiainen, S; Koivusilta, L; Lintonen, T; Virtanen, S M; Rimpelä, A
2005-08-01
The prevalence of overweight and obesity has increased among children and adolescents, as well as among adults, and television viewing has been suggested as one cause. Playing digital games (video, computer and console games) or using a computer may be other sedentary behaviors related to the development of overweight and obesity. To study the relationships of time spent viewing television, playing digital games and using a computer to overweight among Finnish adolescents. Mailed cross-sectional survey. Nationally representative samples of 14-, 16-, and 18-year-olds (N=6515, response rate 70%) in 2001. Overweight and obesity were assessed by body mass index (BMI). The respondents reported time spent daily on viewing television, playing digital games (video, computer and console games) and using a computer (for e-mail, writing and surfing). Data on timing of biological maturation, intensity of weekly physical activity and family socioeconomic status were taken into account in the statistical analyses. Increased time spent viewing television and using a computer was associated with an increased prevalence of overweight (obesity inclusive) among girls: compared to girls viewing television <1 h daily, the adjusted odds ratio (OR) for being overweight was 1.4 when spending 1-3 h, and 2.0 when spending ≥4 h daily on viewing television. In girls using a computer ≥1 h daily, the OR for being overweight was 1.5 compared to girls using a computer <1 h daily. The results were similar in boys, although not statistically significant. Time spent playing digital games was not associated with overweight. Overweight was associated with using information and communication technology (ICT), but only with certain forms of ICT. Increased use of ICT may be one factor explaining the increased prevalence of overweight and obesity at the population level, at least in girls. Playing digital games was not related to overweight, perhaps by virtue of game playing being less sedentary or related to a different lifestyle than viewing television and using a computer.
An Interactive Version of MULR04 With Enhanced Graphic Capability
ERIC Educational Resources Information Center
Burkholder, Joel H.
1978-01-01
An existing computer program for computing multiple regression analyses is made interactive in order to alleviate core storage requirements. Also, some improvements in the graphics aspects of the program are included. (JKS)
Aerodynamic Analyses Requiring Advanced Computers, part 2
NASA Technical Reports Server (NTRS)
1975-01-01
Papers given at the conference present the results of theoretical research on aerodynamic flow problems requiring the use of advanced computers. Topics discussed include two-dimensional configurations, three-dimensional configurations, transonic aircraft, and the space shuttle.
Assessment of technical condition of concrete pavement by the example of district road
NASA Astrophysics Data System (ADS)
Linek, M.; Nita, P.; Żebrowski, W.; Wolka, P.
2018-05-01
The article presents a comprehensive assessment of concrete pavement condition. The analyses covered a district road located in the swietokrzyskie province, in service for 11 years. Comparative analyses were conducted twice: the first after 9 years of pavement operation, in 2015, and a repeat in 2017 in order to assess the extent of pavement degradation. Within the scope of field research, the traffic intensity on the analysed road section was determined. Visual assessment of pavement condition was conducted according to the guidelines included in SOSN-B. Visual assessment can be extended by ground-penetrating radar measurements, which provide a comprehensive assessment of structural changes across the pavement's entire thickness and length. The assessment also included performance parameters, i.e. pavement regularity, surface roughness and texture. Extending the test results with observations of the internal structure of the hardened concrete composite by means of a Scanning Electron Microscope allows the parameters of that internal structure to be assessed, and supplementing these observations with computed tomography scanning provides comprehensive information on possible discontinuities in the composite structure. Based on the analysis of the obtained results, conclusions concerning the condition of the analysed pavement were reached. The pavement exhibits high performance parameters, its condition is good and it does not require any repairs. Maintenance treatment was suggested in order to extend the period of proper operation of the analysed pavement.
Classical vs. non-classical pathways of mineral formation (Invited)
NASA Astrophysics Data System (ADS)
De Yoreo, J. J.
2013-12-01
Recent chemical analyses, microscopy studies and computer simulations suggest many minerals nucleate through aggregation of pre-nucleation clusters and grow by particle-mediated processes that involve amorphous or disordered precursors. Still other analyses, both experimental and computational, conclude that even simple mineral systems like calcium carbonate form via a barrier-free process of liquid-liquid separation, which is followed by dehydration of the ion-rich phase to form the solid products. However, careful measurements of calcite nucleation rates on a variety of ionized surfaces give results that are in complete agreement with the expectations of classical nucleation theory, in which clusters growing through ion-by-ion addition overcome a free energy barrier through the natural microscopic density fluctuations of the system. Here the challenge of integrating these seemingly disparate observations and analyses into a coherent picture of mineral formation is addressed by considering the energy barriers to calcite formation predicted by the classical theory and the changes in those barriers brought about by the introduction of interfaces and clusters, both stable and metastable. Results from a suite of in situ TEM, AFM, and optical experiments combined with simulations are used to illustrate the conclusions. The analyses show that the expected barrier to homogeneous calcite nucleation is prohibitive even at concentrations exceeding the solubility limit of amorphous calcium carbonate. However, as demonstrated by experiments on self-assembled monolayers, the introduction of surfaces that moderately decrease the interfacial energy associated with the forming nucleus can reduce the magnitude of the barrier to a level that is easily surmounted under typical laboratory conditions. In the absence of such surfaces, experiments that proceed by continually increasing supersaturation with time can easily by-pass direct nucleation of calcite and open up pathways through all other solid phases, as well as dense liquid phases associated with a spinodal. Simulations predict that this phase boundary lies within the region of the calcium carbonate - water phase diagram accessible at room temperature. AFM and TEM analyses of other mineral systems, particularly calcium phosphate, suggest cluster aggregation can play important roles both in modifying barriers and in biasing pathways towards or away from amorphous phases. Most importantly, analysis of the energetic changes shows that barriers are only reduced if the clusters are metastable relative to the free ions and that the reduction is naturally accompanied by a bias towards formation of amorphous precursors. Finally, results from in situ TEM observations of nanoparticle interactions are used to understand the mechanisms controlling particle-mediated growth following formation of primary nuclei of either crystalline phases or disordered precursors. Measurements of the particle speeds and accelerations are used to estimate the magnitude of the attractive potential that drives particle-particle aggregation.
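For reference, the classical barrier invoked above has the standard textbook form (a sketch of the usual relations, not equations taken from this abstract):

    \Delta G^{*} = \frac{16\pi\,\gamma^{3}\,\Omega^{2}}{3\,\left(k_{B}T\ln S\right)^{2}},
    \qquad
    J = A\,\exp\!\left(-\frac{\Delta G^{*}}{k_{B}T}\right)

where gamma is the interfacial free energy, Omega the molecular volume, S the supersaturation and J the nucleation rate. Because the barrier scales as gamma cubed, a surface that even moderately lowers gamma reduces the barrier dramatically, consistent with the self-assembled-monolayer observations described above.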
Computer calculated dose in paediatric prescribing.
Kirk, Richard C; Li-Meng Goh, Denise; Packia, Jeya; Min Kam, Huey; Ong, Benjamin K C
2005-01-01
Medication errors are an important cause of hospital-based morbidity and mortality. However, only a few medication error studies have been conducted in children. These have mainly quantified errors in the inpatient setting; there is very little data available on paediatric outpatient and emergency department medication errors and none on discharge medication. This deficiency is of concern because medication errors are more common in children and it has been suggested that the risk of an adverse drug event as a consequence of a medication error is higher in children than in adults. The aims of this study were to assess the rate of medication errors in predominantly ambulatory paediatric patients and the effect of computer calculated doses on medication error rates of two commonly prescribed drugs. This was a prospective cohort study performed in a paediatric unit in a university teaching hospital between March 2003 and August 2003. The hospital's existing computer clinical decision support system was modified so that doctors could choose the traditional prescription method or the enhanced method of computer calculated dose when prescribing paracetamol (acetaminophen) or promethazine. All prescriptions issued to children (<16 years of age) at the outpatient clinic, emergency department and at discharge from the inpatient service were analysed. A medication error was defined as to have occurred if there was an underdose (below the agreed value), an overdose (above the agreed value), no frequency of administration specified, no dose given or excessive total daily dose. The medication error rates and the factors influencing medication error rates were determined using SPSS version 12. From March to August 2003, 4281 prescriptions were issued. Seven prescriptions (0.16%) were excluded, hence 4274 prescriptions were analysed. Most prescriptions were issued by paediatricians (including neonatologists and paediatric surgeons) and/or junior doctors. The error rate in the children's emergency department was 15.7%, for outpatients was 21.5% and for discharge medication was 23.6%. Most errors were the result of an underdose (64%; 536/833). The computer calculated dose error rate was 12.6% compared with the traditional prescription error rate of 28.2%. Logistical regression analysis showed that computer calculated dose was an important and independent variable influencing the error rate (adjusted relative risk = 0.436, 95% CI 0.336, 0.520, p < 0.001). Other important independent variables were seniority and paediatric training of the person prescribing and the type of drug prescribed. Medication error, especially underdose, is common in outpatient, emergency department and discharge prescriptions. Computer calculated doses can significantly reduce errors, but other risk factors have to be concurrently addressed to achieve maximum benefit.
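An illustrative weight-based dose check in the spirit of the decision support described above. The 15 mg/kg reference dose and the 20% tolerance are common textbook-style values used here only as an example; they are assumptions, not figures taken from the study.

    def paracetamol_dose_mg(weight_kg, mg_per_kg=15.0, max_single_mg=1000.0):
        """Return a computer-calculated single dose, capped at the adult dose."""
        return min(weight_kg * mg_per_kg, max_single_mg)

    def classify(prescribed_mg, weight_kg, tolerance=0.2):
        """Flag under-/overdose relative to the calculated reference dose."""
        ref = paracetamol_dose_mg(weight_kg)
        if prescribed_mg < (1 - tolerance) * ref:
            return "underdose"
        if prescribed_mg > (1 + tolerance) * ref:
            return "overdose"
        return "acceptable"

    print(classify(prescribed_mg=120, weight_kg=18))   # -> "underdose"

Presenting the calculated dose at the moment of prescribing is exactly where the study found errors, especially underdoses, being cut roughly in half.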
Evidential reasoning research on intrusion detection
NASA Astrophysics Data System (ADS)
Wang, Xianpei; Xu, Hua; Zheng, Sheng; Cheng, Anyu
2003-09-01
This paper addresses two fields: the Dempster-Shafer (D-S) theory of evidence and network intrusion detection. It discusses how to apply this form of probabilistic reasoning, as an AI technique, to intrusion detection systems (IDS). The paper establishes the application model, describes the new mechanism of reasoning and decision-making, and analyses how to implement the model, using the detection of SYN-scan activity on the network as an example. The results suggest that, provided reasonable probability values are assigned at the outset, the engine can apply the rules of evidence combination and hierarchical reasoning to compute belief values and finally inform administrators of the nature of the traced activities: intrusions, normal activities or abnormal activities.
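A minimal sketch of the evidence-combination step such an engine relies on, using Dempster's rule over the frame of discernment named in the abstract. The mass assignments are invented for illustration; frozenset keys denote subsets of the frame (the full frame meaning "don't know").

    from itertools import product

    def combine(m1, m2):
        """Dempster's rule: intersect focal elements, renormalise conflict."""
        fused, conflict = {}, 0.0
        for (a, wa), (b, wb) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                fused[inter] = fused.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
        return {s: w / (1.0 - conflict) for s, w in fused.items()}

    FRAME = frozenset({"intrusion", "normal", "abnormal"})
    m_sensor1 = {frozenset({"intrusion"}): 0.6, FRAME: 0.4}
    m_sensor2 = {frozenset({"intrusion", "abnormal"}): 0.5,
                 frozenset({"normal"}): 0.2, FRAME: 0.3}
    print(combine(m_sensor1, m_sensor2))

Hierarchical reasoning then amounts to combining masses from successive evidence sources and reporting the subset with the highest belief.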
Birth order, family configuration, and verbal achievement.
Breland, H M
1974-12-01
Two samples of National Merit Scholarship participants tested in 1962 and the entire population of almost 800,000 participants tested in 1965 were examined. Consistent effects in all 3 groups were observed with respect to both birth order and family size (1st-borns and those from smaller families scored higher). Control of both socioeconomic variables and maternal age, by analysis of variance as well as by analysis of covariance, failed to alter the relationships. Stepdown analyses suggested that the effects were due to a verbal component and that no differences were attributable to nonverbal factors. Mean test scores were computed for detailed sibship configurations based on birth order, family size, sibling spacing, and sibling sex.
A graph algebra for scalable visual analytics.
Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V
2012-01-01
Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing requires redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
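A sketch of the two atomic operators named above on a simple attribute graph. The data model (a dict of node attributes plus an edge list) is my own illustration, not the paper's implementation.

    def select(nodes, edges, pred):
        """Selection: keep nodes satisfying `pred` and their induced edges."""
        keep = {n for n, attrs in nodes.items() if pred(attrs)}
        return ({n: nodes[n] for n in keep},
                [(u, v) for u, v in edges if u in keep and v in keep])

    def aggregate(nodes, edges, key):
        """Aggregation: collapse nodes sharing attribute `key` into super-nodes."""
        group = {n: attrs[key] for n, attrs in nodes.items()}
        super_nodes = {g: {key: g} for g in set(group.values())}
        super_edges = {(group[u], group[v]) for u, v in edges
                       if group[u] != group[v]}
        return super_nodes, sorted(super_edges)

    nodes = {1: {"type": "A"}, 2: {"type": "B"}, 3: {"type": "A"}}
    edges = [(1, 2), (2, 3), (1, 3)]
    print(select(nodes, edges, lambda a: a["type"] == "A"))
    print(aggregate(nodes, edges, "type"))

Composing such operators is what lets an analyst drill down (selection) and zoom out (aggregation) over a graph too large to render whole.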
Game-based biofeedback for paediatric anxiety and depression
2011-01-01
Twenty-four children and adolescents aged 9–17 who were referred for treatment for anxiety were assigned to either a game-based biofeedback group or a waiting list comparison group. The eight-session biofeedback intervention included psychoeducation, identification of triggers and signs of anxiety, and in vivo practice. The intervention used computer-based gaming technology to teach and practise relaxation. Analyses using ANCOVA revealed significant differences in post-test scores of anxiety and depression measures between the two groups. The intervention group reduced anxiety and depression scores on standardised tests. Findings suggest that biofeedback-assisted relaxation training can be useful in decreasing anxiety and depressive symptoms in anxious youths. PMID:22942901
Qualitative data analysis: conceptual and practical considerations.
Liamputtong, Pranee
2009-08-01
Qualitative inquiry requires that collected data is organised in a meaningful way, and this is referred to as data analysis. Through analytic processes, researchers turn what can be voluminous data into understandable and insightful analysis. This paper sets out the different approaches that qualitative researchers can use to make sense of their data including thematic analysis, narrative analysis, discourse analysis and semiotic analysis and discusses the ways that qualitative researchers can analyse their data. I first discuss salient issues in performing qualitative data analysis, and then proceed to provide some suggestions on different methods of data analysis in qualitative research. Finally, I provide some discussion on the use of computer-assisted data analysis.
Walther, Birte; Morgenstern, Matthis; Hanewinkel, Reiner
2012-01-01
To investigate co-occurrence and shared personality characteristics of problematic computer gaming, problematic gambling and substance use. Cross-sectional survey data were collected from 2,553 German students aged 12-25 years. Self-report measures of substance use (alcohol, tobacco and cannabis), problematic gambling (South Oaks Gambling Screen - Revised for Adolescents, SOGS-RA), problematic computer gaming (Video Game Dependency Scale, KFN-CSAS-II), and of twelve different personality characteristics were obtained. Analyses revealed positive correlations between tobacco, alcohol and cannabis use and a smaller positive correlation between problematic gambling and problematic computer gaming. Problematic computer gaming co-occurred only with cannabis use, whereas problematic gambling was associated with all three types of substance use. Multivariate multilevel analyses showed differential patterns of personality characteristics. High impulsivity was the only personality characteristic associated with all five addictive behaviours. Depression and extraversion were specific to substance users. Four personality characteristics were specifically associated with problematic computer gaming: irritability/aggression, social anxiety, ADHD, and low self-esteem. Problematic gamblers seem to be more similar to substance users than problematic computer gamers. From a personality perspective, results correspond to the inclusion of gambling in the same DSM-V category as substance use and question a one-to-one proceeding for computer gaming.
NASA Technical Reports Server (NTRS)
Meyers, A. P.; Davidson, W. A.; Gortowski, R. C.
1973-01-01
Detailed drawings of the five year tape transport are presented. Analytical tools used in the various analyses are described. These analyses include: tape guidance, tape stress over crowned rollers, tape pack stress program, response (computer) program, and control system electronics description.
DMINDA: an integrated web server for DNA motif identification and analyses
Ma, Qin; Zhang, Hanyuan; Mao, Xizeng; Zhou, Chuan; Liu, Bingqiang; Chen, Xin; Xu, Ying
2014-01-01
DMINDA (DNA motif identification and analyses) is an integrated web server for DNA motif identification and analyses, which is accessible at http://csbl.bmb.uga.edu/DMINDA/. This web site is freely available to all users and there is no login requirement. This server provides a suite of cis-regulatory motif analysis functions on DNA sequences, which are important to elucidation of the mechanisms of transcriptional regulation: (i) de novo motif finding for a given set of promoter sequences along with statistical scores for the predicted motifs derived based on information extracted from a control set, (ii) scanning motif instances of a query motif in provided genomic sequences, (iii) motif comparison and clustering of identified motifs, and (iv) co-occurrence analyses of query motifs in given promoter sequences. The server is powered by a backend computer cluster with over 150 computing nodes, and is particularly useful for motif prediction and analyses in prokaryotic genomes. We believe that DMINDA, as a new and comprehensive web server for cis-regulatory motif finding and analyses, will benefit the genomic research community in general and prokaryotic genome researchers in particular. PMID:24753419
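As a hedged illustration of motif scanning (function (ii) above), the sketch below builds a log-odds position weight matrix and scans a sequence for high-scoring instances. The counts, threshold and scoring details are toy assumptions, not DMINDA's actual implementation.

    import math

    BASES = "ACGT"

    def log_odds_pwm(counts, background=0.25, pseudo=0.5):
        """Per-position log2-odds scores from raw base counts."""
        n = [sum(col.values()) + 4 * pseudo for col in counts]
        return [{b: math.log2((col.get(b, 0) + pseudo) / n[i] / background)
                 for b in BASES} for i, col in enumerate(counts)]

    def scan(seq, pwm, threshold=2.0):
        """Yield (position, site, score) for windows scoring above threshold."""
        w = len(pwm)
        for i in range(len(seq) - w + 1):
            s = sum(pwm[j][seq[i + j]] for j in range(w))
            if s >= threshold:
                yield i, seq[i:i + w], round(s, 2)

    counts = [{"T": 8, "A": 1}, {"A": 7, "T": 2}, {"T": 9}, {"A": 9}]
    print(list(scan("GGTATAACCTTAA", log_odds_pwm(counts))))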
NASA Astrophysics Data System (ADS)
Herrick, Gregory Paul
The quest to accurately capture flow phenomena with length-scales both short and long and to accurately represent complex flow phenomena within disparately sized geometry inspires a need for an efficient, high-fidelity, multi-block structured computational fluid dynamics (CFD) parallel computational scheme. This research presents and demonstrates a more efficient computational method by which to perform multi-block structured CFD parallel computational simulations, thus facilitating higher-fidelity solutions of complicated geometries (due to the inclusion of grids for "small" flow areas which are often merely modeled) and their associated flows. This computational framework offers greater flexibility and user-control in allocating the resource balance between process count and wall-clock computation time. The principal modifications implemented in this revision consist of a "multiple grid blocks per processing core" software infrastructure and an analytic computation of viscous flux Jacobians. The development of this scheme is largely motivated by the desire to simulate axial compressor stall inception with more complete gridding of the flow passages (including rotor tip clearance regions) than has been previously done while maintaining high computational efficiency (i.e., minimal consumption of computational resources), and thus this paradigm shall be demonstrated with an examination of instability in a transonic axial compressor. However, the paradigm presented herein facilitates CFD simulation of myriad previously impractical geometries and flows and is not limited to detailed analyses of axial compressor flows. While the simulations presented herein were technically possible under the previous structure of the subject software, they were much less computationally efficient and thus not pragmatically feasible; the previous research using this software to perform three-dimensional, full-annulus, time-accurate, unsteady, full-stage (with sliding-interface) simulations of rotating stall inception in axial compressors utilized tip clearance periodic models, while the scheme here is demonstrated by a simulation of axial compressor stall inception utilizing gridded rotor tip clearance regions. As will be discussed, much previous research (experimental, theoretical, and computational) has suggested that understanding clearance flow behavior is critical to understanding stall inception, and previous computational research efforts which have used tip clearance models have begged the question, "What about the clearance flows?". This research begins to address that question.
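A sketch of the scheduling idea behind a "multiple grid blocks per processing core" infrastructure: assign blocks to cores so that per-core cell counts (a proxy for work) stay balanced. The greedy longest-processing-time heuristic and the block names and sizes are illustrative assumptions, not the dissertation's algorithm.

    import heapq

    def assign_blocks(block_cells, n_cores):
        """Greedily place the largest remaining block on the least-loaded core."""
        heap = [(0, core, []) for core in range(n_cores)]
        heapq.heapify(heap)
        for block, cells in sorted(block_cells.items(),
                                   key=lambda kv: -kv[1]):
            load, core, blocks = heapq.heappop(heap)
            blocks.append(block)
            heapq.heappush(heap, (load + cells, core, blocks))
        return {core: (load, blocks) for load, core, blocks in heap}

    blocks = {"passage_1": 2.1e6, "passage_2": 2.0e6, "tip_gap_1": 3.0e5,
              "tip_gap_2": 2.8e5, "inlet": 9.0e5, "outlet": 8.5e5}
    for core, (load, names) in sorted(assign_blocks(blocks, 3).items()):
        print(core, int(load), names)

Packing the many small tip-clearance grids alongside large passage grids is what keeps process counts, and hence resource consumption, from growing with block count.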
Davatzikos, Christos
2016-10-01
The past 20 years have seen a mushrooming growth of the field of computational neuroanatomy. Much of this work has been enabled by the development and refinement of powerful, high-dimensional image warping methods, which have enabled detailed brain parcellation, voxel-based morphometric analyses, and multivariate pattern analyses using machine learning approaches. The evolution of these 3 types of analyses over the years has overcome many challenges. We present the evolution of our work in these 3 directions, which largely follows the evolution of this field. We discuss the progression from single-atlas, single-registration brain parcellation work to current ensemble-based parcellation; from relatively basic mass-univariate t-tests to optimized regional pattern analyses combining deformations and residuals; and from basic application of support vector machines to generative-discriminative formulations of multivariate pattern analyses, and to methods dealing with heterogeneity of neuroanatomical patterns. We conclude with discussion of some of the future directions and challenges.
Davatzikos, Christos
2017-01-01
The past 20 years have seen a mushrooming growth of the field of computational neuroanatomy. Much of this work has been enabled by the development and refinement of powerful, high-dimensional image warping methods, which have enabled detailed brain parcellation, voxel-based morphometric analyses, and multivariate pattern analyses using machine learning approaches. The evolution of these 3 types of analyses over the years has overcome many challenges. We present the evolution of our work in these 3 directions, which largely follows the evolution of this field. We discuss the progression from single-atlas, single-registration brain parcellation work to current ensemble-based parcellation; from relatively basic mass-univariate t-tests to optimized regional pattern analyses combining deformations and residuals; and from basic application of support vector machines to generative-discriminative formulations of multivariate pattern analyses, and to methods dealing with heterogeneity of neuroanatomical patterns. We conclude with discussion of some of the future directions and challenges. PMID:27514582
Ogawa, Akitoshi; Bordier, Cecile; Macaluso, Emiliano
2013-01-01
The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard “condition-based” designs, as well as “computational” methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and the complexity of auditory multi-sources signals. The blocked analyses associated 3D viewing with the activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed the effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli. PMID:24194828
Features of Computer-Based Decision Aids: Systematic Review, Thematic Synthesis, and Meta-Analyses
Krömker, Dörthe; Meguerditchian, Ari N; Tamblyn, Robyn
2016-01-01
Background Patient information and education, such as decision aids, are gradually moving toward online, computer-based environments. Considerable research has been conducted to guide content and presentation of decision aids. However, given the relatively new shift to computer-based support, little attention has been given to how multimedia and interactivity can improve upon paper-based decision aids. Objective The first objective of this review was to summarize published literature into a proposed classification of features that have been integrated into computer-based decision aids. Building on this classification, the second objective was to assess whether integration of specific features was associated with higher-quality decision making. Methods Relevant studies were located by searching MEDLINE, Embase, CINAHL, and CENTRAL databases. The review identified studies that evaluated computer-based decision aids for adults faced with preference-sensitive medical decisions and reported quality of decision-making outcomes. A thematic synthesis was conducted to develop the classification of features. Subsequently, meta-analyses were conducted based on standardized mean differences (SMD) from randomized controlled trials (RCTs) that reported knowledge or decisional conflict. Further subgroup analyses compared pooled SMDs for decision aids that incorporated a specific feature to other computer-based decision aids that did not incorporate the feature, to assess whether specific features improved quality of decision making. Results Of 3541 unique publications, 58 studies met the target criteria and were included in the thematic synthesis. The synthesis identified six features: content control, tailoring, patient narratives, explicit values clarification, feedback, and social support. A subset of 26 RCTs from the thematic synthesis was used to conduct the meta-analyses. As expected, computer-based decision aids performed better than usual care or alternative aids; however, some features performed better than others. Integration of content control improved quality of decision making (SMD 0.59 vs 0.23 for knowledge; SMD 0.39 vs 0.29 for decisional conflict). In contrast, tailoring reduced quality of decision making (SMD 0.40 vs 0.71 for knowledge; SMD 0.25 vs 0.52 for decisional conflict). Similarly, patient narratives also reduced quality of decision making (SMD 0.43 vs 0.65 for knowledge; SMD 0.17 vs 0.46 for decisional conflict). Results were varied for different types of explicit values clarification, feedback, and social support. Conclusions Integration of media rich or interactive features into computer-based decision aids can improve quality of preference-sensitive decision making. However, this is an emerging field with limited evidence to guide use. The systematic review and thematic synthesis identified features that have been integrated into available computer-based decision aids, in an effort to facilitate reporting of these features and to promote integration of such features into decision aids. The meta-analyses and associated subgroup analyses provide preliminary evidence to support integration of specific features into future decision aids. Further research can focus on clarifying independent contributions of specific features through experimental designs and refining the designs of features to improve effectiveness. PMID:26813512
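A hedged sketch of the subgroup logic described above: standardized mean differences are pooled (inverse-variance, fixed effect) separately for trials whose decision aid includes a given feature and for those whose aid does not, and the pooled SMDs are then compared. The SMDs and standard errors below are invented placeholders, not the review's data.

    def pooled_smd(studies):
        """studies: list of (smd, se) pairs; inverse-variance weighting."""
        ws = [1.0 / se**2 for _, se in studies]
        return sum(w * d for w, (d, _) in zip(ws, studies)) / sum(ws)

    with_feature = [(0.55, 0.20), (0.70, 0.25), (0.50, 0.15)]
    without_feature = [(0.20, 0.18), (0.30, 0.22)]
    print(f"SMD with feature:    {pooled_smd(with_feature):.2f}")
    print(f"SMD without feature: {pooled_smd(without_feature):.2f}")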
Aerodynamic Analyses Requiring Advanced Computers, Part 1
NASA Technical Reports Server (NTRS)
1975-01-01
Papers are presented which deal with results of theoretical research on aerodynamic flow problems requiring the use of advanced computers. Topics discussed include: viscous flows, boundary layer equations, turbulence modeling and Navier-Stokes equations, and internal flows.
Fluid dynamics computer programs for NERVA turbopump
NASA Technical Reports Server (NTRS)
Brunner, J. J.
1972-01-01
During the design of the NERVA turbopump, numerous computer programs were developed for the analyses of fluid dynamic problems within the machine. Program descriptions, example cases, users instructions, and listings for the majority of these programs are presented.
Program For Analysis Of Metal-Matrix Composites
NASA Technical Reports Server (NTRS)
Murthy, P. L. N.; Mital, S. K.
1994-01-01
METCAN (METal matrix Composite ANalyzer) is computer program used to simulate computationally nonlinear behavior of high-temperature metal-matrix composite structural components in specific applications, providing comprehensive analyses of thermal and mechanical performances. Written in FORTRAN 77.
Analysis hierarchical model for discrete event systems
NASA Astrophysics Data System (ADS)
Ciortea, E. M.
2015-11-01
This paper presents a hierarchical model based on discrete event networks for robotic systems. Following the hierarchical approach, the Petri net is analysed from the highest conceptual level down to the lowest level of local control, and extended Petri nets are used for the modelling and control of complex robotic systems. Such a system is structured, controlled and analysed here using the Visual Object Net++ package, which is relatively simple and easy to use and presents results in representations that are easy to interpret. The hierarchical structure of the robotic system is implemented on computers and analysed using specialized programs. Implementation of the hierarchical discrete event model as a real-time operating system on a computer network connected via a serial bus is possible, with each computer dedicated to the local Petri model of one subsystem of the global robotic system. Because Petri models are straightforward to run on general-purpose computers, the analysis, modelling and control of complex manufacturing systems can be achieved with Petri nets; discrete event systems are a pragmatic tool for modelling industrial systems. To capture timing, a timed Petri model of the transport stream is divided into hierarchical levels whose sections are analysed successively. The proposed robotic system simulation using timed Petri nets offers the opportunity to view the timing of robotic and transmission activities: from measured spot times, graphs are obtained showing the average time per transport activity for individual sets of finished products.
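A minimal place/transition Petri net sketch of the kind of discrete event model described above. The place and transition names (a robot moving parts between two stations) are illustrative, not taken from the paper.

    def enabled(marking, pre):
        """A transition is enabled if every input place holds enough tokens."""
        return all(marking[p] >= n for p, n in pre.items())

    def fire(marking, pre, post):
        """Consume tokens from the pre-set, produce tokens in the post-set."""
        assert enabled(marking, pre), "transition not enabled"
        m = dict(marking)
        for p, n in pre.items():
            m[p] -= n
        for p, n in post.items():
            m[p] = m.get(p, 0) + n
        return m

    marking = {"station_A": 2, "robot_free": 1, "station_B": 0}
    t_move = ({"station_A": 1, "robot_free": 1},    # pre-set
              {"station_B": 1, "robot_free": 1})    # post-set
    marking = fire(marking, *t_move)
    print(marking)  # {'station_A': 1, 'robot_free': 1, 'station_B': 1}

A timed variant attaches a delay to each transition, which is how the transport-stream timings above would be accumulated.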
TSUNAMI Primer: A Primer for Sensitivity/Uncertainty Calculations with SCALE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T; Mueller, Don; Bowman, Stephen M
2009-01-01
This primer presents examples in the application of the SCALE/TSUNAMI tools to generate k_eff sensitivity data for one- and three-dimensional models using TSUNAMI-1D and -3D and to examine uncertainties in the computed k_eff values due to uncertainties in the cross-section data used in their calculation. The proper use of unit cell data and the need for confirming the appropriate selection of input parameters through direct perturbations are described. The uses of sensitivity and uncertainty data to identify and rank potential sources of computational bias in an application system, and TSUNAMI tools for assessment of system similarity using sensitivity and uncertainty criteria, are demonstrated. Uses of these criteria in trending analyses to assess computational biases, bias uncertainties, and gap analyses are also described. Additionally, an application of the data adjustment tool TSURFER is provided, including identification of specific details of sources of computational bias.
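A sketch of the direct-perturbation check recommended above: the sensitivity of k_eff to a cross section sigma is estimated from perturbed calculations as S = (dk/k)/(dsigma/sigma). The `run_keff` callable stands in for a transport calculation and is a placeholder, not a SCALE API.

    def sensitivity(run_keff, sigma0, rel_step=0.01):
        """Central-difference estimate of the relative sensitivity coefficient."""
        k_plus = run_keff(sigma0 * (1 + rel_step))
        k_minus = run_keff(sigma0 * (1 - rel_step))
        k0 = run_keff(sigma0)
        return ((k_plus - k_minus) / k0) / (2 * rel_step)

    # Toy linear model in place of a real transport solve:
    toy = lambda sigma: 1.0 + 0.05 * (sigma - 1.0)
    print(round(sensitivity(toy, sigma0=1.0), 4))   # ~0.05

Agreement between such direct perturbations and the adjoint-based TSUNAMI coefficients is what confirms the input parameters were selected appropriately.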
Application of CFD to a generic hypersonic flight research study
NASA Technical Reports Server (NTRS)
Green, Michael J.; Lawrence, Scott L.; Dilley, Arthur D.; Hawkins, Richard W.; Walker, Mary M.; Oberkampf, William L.
1993-01-01
Computational analyses have been performed for the initial assessment of flight research vehicle concepts that satisfy requirements for potential hypersonic experiments. Results were obtained from independent analyses at NASA Ames, NASA Langley, and Sandia National Labs, using sophisticated time-dependent Navier-Stokes and parabolized Navier-Stokes methods. Careful study of a common problem consisting of hypersonic flow past a slightly blunted conical forebody was undertaken to estimate the level of uncertainty in the computed results, and to assess the capabilities of current computational methods for predicting boundary-layer transition onset. Results of this study in terms of surface pressure and heat transfer comparisons, as well as comparisons of boundary-layer edge quantities and flow-field profiles are presented here. Sensitivities to grid and gas model are discussed. Finally, representative results are presented relating to the use of Computational Fluid Dynamics in the vehicle design and the integration/support of potential experiments.
Consulting room computers and their effect on general practitioner-patient communication.
Noordman, Janneke; Verhaak, Peter; van Beljouw, Ilse; van Dulmen, Sandra
2010-12-01
In the western medical world, computers form part of the standard equipment in the consulting rooms of most GPs. As the use of a computer requires time and attention from GPs, this may well interfere with the communication process. Yet, the information accessed on the computer may also enhance communication. The present study affords insight into the relationship between computer use and GP-patient communication recorded by the same GPs over two periods. Videotaped GP consultations collected in 2001 and 2008 were used to observe computer use and GP-patient communication. In addition, patient questionnaires about their experiences with communication by the GP were analysed using multilevel models with patients (Level 1) nested within GPs (Level 2). Both in 2008 and in 2001, GPs used their computer in almost every consultation. Still, our study showed a change in computer use by the GPs over time. In addition, the results indicate that computer use is negatively related to some communication aspects: the patient-directed gaze of the GP and the amount of information given by GPs. There is also a negative association between computer use and the body posture of the GP. Computer use by GPs is not associated with other (analysed) non-verbal and verbal behaviour of GPs and patients. Moreover, computer use is scarcely related to patients' experiences with the communication behaviour of the GP. GPs show greater reluctance to use computers in 2008 compared to 2001. Computer use can indeed affect the communication between GPs and patients. Therefore, GPs ought to remain aware of their computer use during consultations and at the same time keep the interaction with the patient alive.
Bunburra Rockhole: Exploring the geology of a new differentiated asteroid
NASA Astrophysics Data System (ADS)
Benedix, G. K.; Bland, P. A.; Friedrich, J. M.; Mittlefehldt, D. W.; Sanborn, M. E.; Yin, Q.-Z.; Greenwood, R. C.; Franchi, I. A.; Bevan, A. W. R.; Towner, M. C.; Perrotta, G. C.; Mertzman, S. A.
2017-07-01
Bunburra Rockhole is the first recovered meteorite of the Desert Fireball Network. We expanded a bulk chemical study of the Bunburra Rockhole meteorite to include major, minor and trace element analyses, as well as oxygen and chromium isotopes, in several different pieces of the meteorite. This was to determine the extent of chemical heterogeneity and constrain the origin of the meteorite. Minor and trace element analyses in all pieces are exactly on the basaltic eucrite trend. Major element analyses show a slight deviation from basaltic eucrite compositions, but not in any systematic pattern. New oxygen isotope analyses on 23 pieces of Bunburra Rockhole show large variation in both δ17O and δ18O, and both lie well outside the HED parent-body fractionation line. We present the first Cr isotope results for this rock, which are also distinct from HEDs. Detailed computed tomographic scanning and back-scattered electron mapping do not indicate the presence of any other meteoritic contaminant (contamination is also unlikely based on trace element chemistry). We therefore conclude that Bunburra Rockhole represents a sample of a new differentiated asteroid, one that may have more variable oxygen isotopic compositions than 4 Vesta. The fact that Bunburra Rockhole chemistry falls on the eucrite trend perhaps suggests that multiple objects with basaltic crusts accreted in a similar region of the Solar System.
ERIC Educational Resources Information Center
Enriquez, Judith Guevarra
2010-01-01
In this article, centrality is explored as a measure of computer-mediated communication (CMC) in networked learning. Centrality measure is quite common in performing social network analysis (SNA) and in analysing social cohesion, strength of ties and influence in CMC, and computer-supported collaborative learning research. It argues that measuring…
ERIC Educational Resources Information Center
Zigic, Sasha; Lemckert, Charles J.
2007-01-01
The following paper presents a computer-based learning strategy to assist in introducing and teaching water quality modelling to undergraduate civil engineering students. As part of the learning strategy, an interactive computer-based instructional (CBI) aid was specifically developed to assist students to set up, run and analyse the output from a…
A Developmental Scale of Mental Computation with Part-Whole Numbers
ERIC Educational Resources Information Center
Callingham, Rosemary; Watson, Jane
2004-01-01
In this article, data from a study of the mental computation competence of students in grades 3 to 10 are presented. Students responded to mental computation items, presented orally, that included operations applied to fractions, decimals and percents. The data were analysed using Rasch modelling techniques, and a six-level hierarchy of part-whole…
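The Rasch model mentioned above relates the probability of a correct mental-computation response to the gap between person ability (theta) and item difficulty (b). A one-line sketch, with illustrative parameter values not drawn from the study:

    import math

    def rasch_p(theta, b):
        """P(correct) under the dichotomous Rasch model."""
        return 1.0 / (1.0 + math.exp(-(theta - b)))

    # A student of moderate ability (theta = 0.5) on an easy item
    # (b = -1.0) versus a hard percent item (b = 2.0):
    print(round(rasch_p(0.5, -1.0), 2), round(rasch_p(0.5, 2.0), 2))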
Analysing Test-Takers' Views on a Computer-Based Speaking Test
ERIC Educational Resources Information Center
Amengual-Pizarro, Marian; García-Laborda, Jesús
2017-01-01
This study examines test-takers' views on a computer-delivered speaking test in order to investigate the aspects they consider most relevant in technology-based oral assessment, and to explore the main advantages and disadvantages computer-based tests may offer as compared to face-to-face speaking tests. A small-scale open questionnaire was…
Numerical Optimization Using Desktop Computers
1980-09-11
...concentrating compound parabolic trough solar collector. Thermophysical, geophysical, optical and economic analyses were used to compute a life-cycle... third computer program, NISCO, was developed to model a nonimaging concentrating compound parabolic trough solar collector using thermophysical... concentrating compound parabolic trough solar collector. The objective of this thesis was to develop a system of interactive programs for the Hewlett
ERIC Educational Resources Information Center
Tas, Yasemin; Balgalmis, Esra
2016-01-01
The goal of this study was to describe Turkish mathematics and science teachers' use of computer in their classroom instruction by utilizing TIMSS 2011 data. Analyses results revealed that teachers most frequently used computers for preparation purpose and least frequently used computers for administration. There was no difference in teachers'…
Quinzio, Lorenzo; Blazek, Michael; Hartmann, Bernd; Röhrig, Rainer; Wille, Burkhard; Junger, Axel; Hempelmann, Gunter
2005-01-01
Computers are becoming increasingly visible in operating rooms (OR) and intensive care units (ICU) for use in bedside documentation. Recently, they have been suspected as possibly acting as reservoirs for microorganisms and vehicles for the transfer of pathogens to patients, causing nosocomial infections. The purpose of this study was to examine the microbiological (bacteriological and mycological) contamination of the central unit of computers used in an OR, a surgical and a pediatric ICU of a tertiary teaching hospital. Sterile swab samples were taken from five sites in each of 13 computers stationed at the two ICUs and 12 computers at the OR. Sample sites within the chassis housing of the central processing unit (CPU) included the CPU fan, ventilator, and metal casing. External sites were the ventilator and the bottom of the computer tower. Quantitative and qualitative microbiological analyses were performed according to commonly used methods. One hundred and ninety sites were cultured for bacteria and fungi. Analyses of swabs taken at five equivalent sites inside and outside the computer chassis did not find any significant number of potentially pathogenic bacteria or fungi. This can probably be attributed to either the absence or the low number of pathogens detected on the surfaces. Microbial contamination in the CPU of OR and ICU computers is too low for designating them as a reservoir for microorganisms.
Web-Based Integrated Research Environment for Aerodynamic Analyses and Design
NASA Astrophysics Data System (ADS)
Ahn, Jae Wan; Kim, Jin-Ho; Kim, Chongam; Cho, Jung-Hyun; Hur, Cinyoung; Kim, Yoonhee; Kang, Sang-Hyun; Kim, Byungsoo; Moon, Jong Bae; Cho, Kum Won
e-AIRS[1,2], an abbreviation of ‘e-Science Aerospace Integrated Research System’, is a virtual organization designed to support aerodynamic flow analyses in aerospace engineering using the e-Science environment. As the first step toward a virtual aerospace engineering organization, e-AIRS intends to give full support to the aerodynamic research process. Currently, e-AIRS can handle both computational and experimental aerodynamic research on the e-Science infrastructure. In detail, users can conduct a full CFD (Computational Fluid Dynamics) research process, request a wind tunnel experiment, perform comparative analysis between computational prediction and experimental measurement, and finally, collaborate with other researchers using the web portal. The present paper describes those services and the internal architecture of the e-AIRS system.
Butler, Stephen F; Villapiano, Albert; Malinow, Andrew
2009-12-01
People tend to disclose more personal information when communication is mediated through the use of a computer. This study was conducted to examine the impact of this phenomenon on the way respondents answer questions during computer-mediated, self-administration of the Addiction Severity Index (ASI) called the Addiction Severity Index-Multimedia Version® (ASI-MV®). A sample of 142 clients in substance abuse treatment was administered the ASI via an interviewer and the computerized ASI-MV®, three to five days apart in a counterbalanced order. Seven composite scores were compared between the two test administrations using paired t-tests. Post hoc analyses examined interviewer effects. Comparisons of composite scores for each of the domains between the face-to-face administered and computer-mediated, self-administered ASI revealed that significantly greater problem severity was reported by clients in five of the seven domains during administration of the computer-mediated, self-administered version compared to the trained interviewer version. Item analyses identified certain items as responsible for significant differences, especially those asking clients to rate need for treatment. All items that were significantly different between the two modes of administration revealed greater problem severity reported on the ASI-MV® as compared to the interview administered assessment. Post hoc analyses yielded significant interviewer effects on four of the five domains where differences were observed. These data support a growing literature documenting a tendency for respondents to be more self-disclosing in a computer-mediated format over a face-to-face interview. Differences in interviewer skill in establishing rapport may account for these observations.
Stacey, J.S.; Hope, J.
1975-01-01
A system is described which uses a minicomputer to control a surface ionization mass spectrometer in the peak-switching mode, with the object of computing isotopic abundance ratios of elements of geologic interest. The program uses the BASIC language and is sufficiently flexible to be used for multiblock analyses of any spectrum containing from two to five peaks. In the case of strontium analyses, ratios are corrected for rubidium content and normalized for mass spectrometer fractionation. Although almost any minicomputer would be suitable, the model used was the Data General Nova 1210 with 8K memory. An assembly language driver program and interface hardware descriptions for the Nova 1210 are included.
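For context, the fractionation normalization step mentioned above can be sketched in a few lines. The sketch below uses the standard exponential mass-fractionation law, normalizing to the conventional 86Sr/88Sr = 0.1194; it is written in Python rather than the BASIC of the original program, the function name is invented, and the sample input values are illustrative only.

    import math

    def exp_law_correct(r_meas_8786, r_meas_8688, r_ref_8688=0.1194,
                        m86=85.9093, m87=86.9089, m88=87.9056):
        """Correct a measured 87Sr/86Sr ratio for mass-spectrometer
        fractionation with the exponential law, normalizing to the
        conventional 86Sr/88Sr value. A generic sketch of this kind of
        normalization, not the original program's code."""
        # Fractionation exponent inferred from the normalizing ratio pair
        beta = math.log(r_ref_8688 / r_meas_8688) / math.log(m86 / m88)
        # Apply the same exponent to the ratio of interest
        return r_meas_8786 * (m87 / m86) ** beta

    # Illustrative values: a heavy-enriched run is corrected slightly downward
    print(f"{exp_law_correct(0.71043, 0.1189):.5f}")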
Genomicus 2018: karyotype evolutionary trees and on-the-fly synteny computing.
Nguyen, Nga Thi Thuy; Vincens, Pierre; Roest Crollius, Hugues; Louis, Alexandra
2018-01-04
Since 2010, the Genomicus web server has been available online at http://genomicus.biologie.ens.fr/genomicus. This graphical browser provides access to comparative genomic analyses in four different phyla (Vertebrate, Plants, Fungi, and non-vertebrate Metazoans). Users can analyse genomic information from extant species, as well as ancestral gene content and gene order for vertebrates and flowering plants, in an integrated evolutionary context. New analyses and visualization tools have recently been implemented in Genomicus Vertebrate. Karyotype structures from several genomes can now be compared along an evolutionary pathway (Multi-KaryotypeView), and synteny blocks can be computed and visualized between any two genomes (PhylDiagView). © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
DOT National Transportation Integrated Search
2017-06-30
The ever-increasing processing speed and computational power of computers and simulation systems has led to correspondingly larger, more sophisticated representations of evacuation traffic processes. Today, micro-level analyses can be conducted for m...
NASA Technical Reports Server (NTRS)
Viswanathan, A. V.; Tamekuni, M.
1974-01-01
General-purpose program performs exact instability analyses for structures such as unidirectionally-stiffened, rectangular composite panels. Program was written in FORTRAN IV and COMPASS for CDC-series computers.
Moles, Juan; Wägele, Heike; Ballesteros, Manuel; Pujals, Álvaro; Uhl, Gabriele; Avila, Conxita
2016-01-01
Although several studies are devoted to determining the diversity of Antarctic heterobranch sea slugs, new species are still being discovered. Among nudibranchs, Doto antarctica Eliot, 1907 is the single species of this genus described from Antarctica hitherto, the type locality being the Ross Sea. Doto antarctica was described mainly using external features. During our Antarctic research on marine benthic invertebrates, we found D. antarctica in the Weddell Sea and Bouvet Island, suggesting a circumpolar distribution. Species affiliation is herein supported by molecular analyses using cytochrome c oxidase subunit I, 16S rRNA, and histone H3 markers. We redescribe D. antarctica using histology, micro-computed tomography (micro-CT), and 3D-reconstruction of the internal organs. Moreover, we describe a new, sympatric species, namely D. carinova Moles, Avila & Wägele n. sp., and provide an anatomical comparison between the two Antarctic Doto species. Egg masses in both species are also described here for the first time. We demonstrate that micro-CT is a useful tool for non-destructive anatomical description of valuable specimens. Furthermore, our high resolution micro-CT data reveal that the central nervous system of both Doto species possesses numerous accessory giant cells, suggested to be neurons herein. In addition, the phylogenetic tree of all Doto species sequenced to date suggests a scenario for the evolution of the reproductive system in this genus: bursa copulatrix seems to have been reduced and the acquisition of a distal connection of the oviduct to the nidamental glands is a synapomorphy of the Antarctic Doto species. Overall, the combination of thorough morphological and anatomical description and molecular analyses provides a comprehensive means to characterize and delineate species, thus suggesting evolutionary scenarios.
Cue reactivity and its inhibition in pathological computer game players.
Lorenz, Robert C; Krüger, Jenny-Kathinka; Neumann, Britta; Schott, Björn H; Kaufmann, Christian; Heinz, Andreas; Wüstenberg, Torsten
2013-01-01
Despite a rising social relevance of pathological computer game playing, it remains unclear whether the neurobiological basis of this addiction-like behavioral disorder and substance-related addiction are comparable. In substance-related addiction, attentional bias and cue reactivity are often observed. We conducted a functional magnetic resonance imaging study using a dot probe paradigm with short-presentation (attentional bias) and long-presentation (cue reactivity) trials in eight male pathological computer game players (PCGPs) and nine healthy controls (HCs). Computer game-related and neutral computer-generated pictures, as well as pictures from the International Affective Picture System with positive and neutral valence, served as stimuli. PCGPs showed an attentional bias toward both game-related and affective stimuli with positive valence. In contrast, HCs showed no attentional bias effect at all. PCGPs showed stronger brain responses in short-presentation trials compared with HCs in medial prefrontal cortex (MPFC) and anterior cingulate gyrus, and in long-presentation trials in lingual gyrus. In exploratory post hoc functional connectivity analyses, for long-presentation trials, connectivity strength was higher between right inferior frontal gyrus, which was associated with inhibition processing in previous studies, and cue reactivity-related regions (left orbitofrontal cortex and ventral striatum) in PCGPs. We observed behavioral and neural effects in PCGPs which are comparable with those found in substance-related addiction. However, cue-related brain responses depended on the duration of cue presentation. Together with the connectivity result, these findings suggest that top-down inhibitory processes might suppress the cue reactivity-related neural activity in long-presentation trials. © 2012 The Authors, Addiction Biology © 2012 Society for the Study of Addiction.
Dolezal, Curtis; Marhefka, Stephanie L; Santamaria, E Karina; Leu, Cheng-Shiun; Brackis-Cott, Elizabeth; Mellins, Claude Ann
2012-04-01
Computer-assisted interview methods are increasingly popular in the assessment of sensitive behaviors (e.g., substance abuse and sexual behaviors). It has been suggested that the effect of social desirability is diminished when answering via computer, as compared to an interviewer-administered face-to-face (FTF) interview, although studies exploring this hypothesis among adolescents are rare and yield inconsistent findings. This study compared two interview modes among a sample of urban, ethnic-minority, perinatally HIV-exposed U.S. youth (baseline = 148 HIV+, 126 HIV-, ages 9-16 years; follow-up = 120 HIV+, 110 HIV-, ages 10-19 years). Participants were randomly assigned to receive a sexual behavior interview via either Audio Computer-Assisted Self-Interview (ACASI) or FTF interview. The prevalence of several sexual behaviors and participants' reactions to the interviews were compared. Although higher rates of sexual behaviors were typically reported in the ACASI condition, the differences rarely reached statistical significance, even when limited to demographic subgroups--except for gender. Boys were significantly more likely to report several sexual behaviors in the ACASI condition compared to FTF, whereas among girls no significant differences were found between the two conditions. ACASI-assigned youth rated the interview process as easier and more enjoyable than did FTF-assigned youth, and this was fairly consistent across subgroup analyses as well. We conclude that these more positive reactions to the ACASI interview give that methodology a slight advantage, and boys may disclose more sexual behavior when using computer-assisted interviews.
Mishra, Arjun K; Agnihotri, Pragati; Srivastava, Vijay Kumar; Pratap, J Venkatesh
2015-01-09
The polyamine biosynthesis pathway has long been considered an essential drug target for trypanosomatids, including Leishmania. S-adenosylmethionine decarboxylase (AdoMetDc) and spermidine synthase (SpdSyn) are enzymes of this pathway that catalyze successive steps, with the product of the former, decarboxylated S-adenosylmethionine (dcSAM), acting as an aminopropyl donor for the latter enzyme. Here we have explored and identified the protein-protein interaction between SpdSyn and AdoMetDc. The interaction was identified using a GST pull-down assay. Isothermal titration calorimetry reveals that the interaction is thermodynamically favorable. Fluorescence spectroscopy studies also confirm the interaction, with SpdSyn exhibiting a change in tertiary structure with increasing concentrations of AdoMetDc. Size exclusion chromatography suggests the presence of the complex as a hetero-oligomer. Taken together, these results suggest that the enzymes indeed form a heteromer. Computational analyses suggest that this complex differs significantly from the corresponding human complex, implying that this complex could be a better therapeutic target than the individual enzymes. Copyright © 2014 Elsevier Inc. All rights reserved.
A randomized study of internet parent training accessed from community technology centers.
Irvine, A Blair; Gelatt, Vicky A; Hammond, Michael; Seeley, John R
2015-05-01
Behavioral parent training (BPT) has been shown to be efficacious to improve parenting skills for problematic interactions with adolescents displaying oppositional and antisocial behaviors. Some research suggests that support group curricula might be transferred to the Internet, and some studies suggest that other curriculum designs might also be effective. In this research, a BPT program for parents of at-risk adolescents was tested on the Internet in a randomized trial (N = 307) from computer labs at six community technology centers in or near large metropolitan areas. The instructional design was based on asynchronous scenario-based e-learning, rather than a traditional parent training model where presentation of course material builds content sequentially over multiple class sessions. Pretest to 30-day follow-up analyses indicated significant treatment effects on parent-reported discipline style (Parenting Scale, Adolescent version), child behavior (Eyberg Child Behavior Inventory), and on social cognitive theory constructs of intentions and self-efficacy. The effect sizes were small to medium. These findings suggest the potential to provide effective parent training programs on the Internet.
Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine
NASA Astrophysics Data System (ADS)
Sharma, Gulshan B.; Robertson, Douglas D.
2013-07-01
Shoulder arthroplasty success has been attributed to many factors, including bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape and determines whether the design should withstand a lifetime of use. Finite element (FE) analyses have been extensively used to study stresses and strains produced in implants and bone. However, these static analyses only measure a moment in time and not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three-dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally, simulating bone remodeling using an intact human scapula, initially resetting the scapular bone material properties to be uniform, numerically simulating sequential loading, and comparing the bone remodeling simulation results to the actual scapula's material properties. A three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint load and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element's remodeling stimulus was compared to its corresponding reference stimulus and its material properties modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent density were plotted and compared. The location of high and low predicted bone density was comparable to that of the actual specimen. High predicted bone density was greater than that of the actual specimen; low predicted bone density was lower. Differences were probably due to applied muscle and joint reaction loads, boundary conditions, and values of constants used. Work is underway to study this. Nonetheless, the results demonstrate three-dimensional bone remodeling simulation validity and potential. Such adaptive predictions take physiological bone remodeling simulations one step closer to reality. Computational analyses are needed that integrate biological remodeling rules and predict how bone will respond over time. We expect the combination of computational static stress analyses together with adaptive bone remodeling simulations to become effective tools for regenerative medicine research.
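The iterative update described above can be illustrated with a strain-energy-density remodeling rule. The sketch below is schematic, not the authors' implementation: the dead-zone rule, the density-to-stiffness power law, and all constants are assumptions drawn from the general bone-remodeling literature, and the per-element strain energy values stand in for FE output.

    import numpy as np

    def remodel_step(rho, sed, s_ref, rate=0.25, dead_zone=0.1,
                     rho_min=0.01, rho_max=1.8):
        """One iteration of a strain-energy-density remodeling rule.
        Each element's stimulus (strain energy per unit mass, U/rho) is
        compared with a reference stimulus; density is driven up or down
        outside a 'lazy zone' around the reference. Constants are
        illustrative, not the paper's values."""
        stimulus = sed / rho
        error = (stimulus - s_ref) / s_ref
        drho = np.where(np.abs(error) > dead_zone, rate * error, 0.0)
        return np.clip(rho + drho, rho_min, rho_max)

    def youngs_modulus(rho, c=3790.0, gamma=3.0):
        """Density-to-stiffness power law E = c * rho**gamma (MPa for rho
        in g/cm^3); constants follow commonly cited bone literature."""
        return c * rho**gamma

    # Toy loop: uniform starting density, fixed per-element SED field
    rho = np.full(5, 0.8)
    sed = np.array([0.02, 0.05, 0.08, 0.05, 0.01])   # hypothetical FE output
    for _ in range(30):
        rho = remodel_step(rho, sed, s_ref=0.06)
    print(np.round(rho, 3), np.round(youngs_modulus(rho), 1))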
3D DEM analyses of the 1963 Vajont rock slide
NASA Astrophysics Data System (ADS)
Boon, Chia Weng; Houlsby, Guy; Utili, Stefano
2013-04-01
The 1963 Vajont rock slide has been modelled using the distinct element method (DEM). The open-source DEM code, YADE (Kozicki & Donzé, 2008), was used together with the contact detection algorithm proposed by Boon et al. (2012). The critical sliding friction angle at the slide surface was sought using a strength reduction approach. A shear-softening contact model was used to model the shear resistance of the clayey layer at the slide surface. The results suggest that the critical sliding friction angle can be conservative if stability analyses are calculated based on the peak friction angles. The water table was assumed to be horizontal and the pore pressure at the clay layer was assumed to be hydrostatic. The influence of reservoir filling was marginal, increasing the sliding friction angle by only 1.6°. The results of the DEM calculations were found to be sensitive to the orientations of the bedding planes and cross-joints. Finally, the failure mechanism was investigated and arching was found to be present at the bend of the chair-shaped slope. References Boon C.W., Houlsby G.T., Utili S. (2012). A new algorithm for contact detection between convex polygonal and polyhedral particles in the discrete element method. Computers and Geotechnics, vol 44, 73-82, doi.org/10.1016/j.compgeo.2012.03.012. Kozicki, J., & Donzé, F. V. (2008). A new open-source software developed for numerical simulations using discrete modeling methods. Computer Methods in Applied Mechanics and Engineering, 197(49-50), 4429-4443.
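The strength-reduction search for the critical sliding friction angle can be pictured as a bisection over repeated DEM runs. The sketch below assumes a hypothetical is_stable callback standing in for a full YADE simulation at a given reduced friction angle; it is not the authors' code, and the toy threshold is invented.

    def critical_friction_angle(phi_peak_deg, is_stable, tol=0.05):
        """Bisection search for the smallest friction angle at which the
        slope remains stable. `is_stable(phi_deg)` is a placeholder for a
        complete DEM run reporting stability (hypothetical callback)."""
        lo, hi = 0.0, phi_peak_deg   # unstable at 0, assumed stable at peak
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if is_stable(mid):
                hi = mid             # still stable: critical angle is lower
            else:
                lo = mid             # fails: critical angle is higher
        return 0.5 * (lo + hi)

    # Toy stand-in: pretend the slope fails below 18.3 degrees
    phi_c = critical_friction_angle(35.0, is_stable=lambda p: p >= 18.3)
    print(f"critical friction angle ~ {phi_c:.2f} deg")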
Nowak, Donald E
2018-06-01
The problem of gambling addiction is especially noteworthy among college students, many of whom have the resources, proximity, free time, and desire to become involved in the myriad options of gambling now available. Although limited attention has been paid specifically to college student gambling in the body of literature, there have been three published meta-analyses estimating the prevalence of probable pathological gambling among college students. The research presented here is the largest and most comprehensive to date, providing an up-to-date estimate of the proportion of students worldwide exhibiting gambling pathology as assessed by the South Oaks Gambling Screen, and is the first to include estimates of sub-clinical problem gambling. A thorough literature review and coding procedure resulted in 124 independent data estimates retrieved from 72 studies conducted between 1987 and the present, surveying 41,989 university students worldwide. The estimated proportion of probable pathological gamblers among students was computed at 6.13%, while the rate of problem gambling was computed at 10.23%. The percentage of non-white students had a statistically significant influence on pathological gambling rates. The implications of this and other moderator analyses, such as the age and year of studies, as well as recommendations for future practice in dealing with college students and gambling disorder on campus, are outlined and described in detail. Suggestions and rationales for future avenues of research in the area are also described.
Next generation of network medicine: interdisciplinary signaling approaches.
Korcsmaros, Tamas; Schneider, Maria Victoria; Superti-Furga, Giulio
2017-02-20
In the last decade, network approaches have transformed our understanding of biological systems. Network analyses and visualizations have allowed us to identify essential molecules and modules in biological systems, and improved our understanding of how changes in cellular processes can lead to complex diseases, such as cancer, infectious and neurodegenerative diseases. "Network medicine" involves unbiased large-scale network-based analyses of diverse data describing interactions between genes, diseases, phenotypes, drug targets, drug transport, drug side-effects, disease trajectories and more. In terms of drug discovery, network medicine exploits our understanding of the network connectivity and signaling system dynamics to help identify optimal, often novel, drug targets. Contrary to initial expectations, however, network approaches have not yet delivered a revolution in molecular medicine. In this review, we propose that a key reason for the limited impact, so far, of network medicine is a lack of quantitative multi-disciplinary studies involving scientists from different backgrounds. To support this argument, we present existing approaches from structural biology, 'omics' technologies (e.g., genomics, proteomics, lipidomics) and computational modeling that point towards how multi-disciplinary efforts allow for important new insights. We also highlight some breakthrough studies as examples of the potential of these approaches, and suggest ways to make greater use of the power of interdisciplinarity. This review reflects discussions held at an interdisciplinary signaling workshop which facilitated knowledge exchange from experts from several different fields, including in silico modelers, computational biologists, biochemists, geneticists, molecular and cell biologists as well as cancer biologists and pharmacologists.
How to design a single-cell RNA-sequencing experiment: pitfalls, challenges and perspectives.
Dal Molin, Alessandra; Di Camillo, Barbara
2018-01-31
The sequencing of the transcriptome of single cells, or single-cell RNA-sequencing, has now become the dominant technology for the identification of novel cell types in heterogeneous cell populations or for the study of stochastic gene expression. In recent years, various experimental methods and computational tools for analysing single-cell RNA-sequencing data have been proposed. However, most of them are tailored to different experimental designs or biological questions, and in many cases, their performance has not been benchmarked yet, thus increasing the difficulty for a researcher to choose the optimal single-cell transcriptome sequencing (scRNA-seq) experiment and analysis workflow. In this review, we aim to provide an overview of the current available experimental and computational methods developed to handle single-cell RNA-sequencing data and, based on their peculiarities, we suggest possible analysis frameworks depending on specific experimental designs. Together, we propose an evaluation of challenges and open questions and future perspectives in the field. In particular, we go through the different steps of scRNA-seq experimental protocols such as cell isolation, messenger RNA capture, reverse transcription, amplification and use of quantitative standards such as spike-ins and Unique Molecular Identifiers (UMIs). We then analyse the current methodological challenges related to preprocessing, alignment, quantification, normalization, batch effect correction and methods to control for confounding effects. © The Author(s) 2018. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Yao, Chenmin; Zhou, Liqun; Yang, Hongye; Wang, Yake; Sun, Hualing; Guo, Jingmei; Huang, Cui
2017-04-01
The aim of this study was to investigate the effect of silane pretreatment on the universal adhesive bonding between lithium disilicate glass ceramic and composite resin. IPS e.max ceramic blocks etched with hydrofluoric acid were randomly assigned to one of eight groups treated with one of four universal adhesives (two silane-free adhesives and two silane-containing adhesives), each with or without silane pretreatment. Bonded specimens were stored in water for 24 h. The shear bond strength (SBS) of the ceramic-resin interface was measured to evaluate bond strength, and the debonded interface after the SBS test was analysed using field-emission scanning electron microscopy to determine failure mode. Light microscopy was performed to analyse microleakage and marginal sealing ability. Silane pretreatment significantly and positively influenced SBS and marginal sealing ability. For all the universal adhesive groups, SBS increased and the percentage of microleakage decreased after the pretreatment. Without the pretreatment, SBS and the percentage of microleakage were not significantly different between the silane-containing universal adhesive groups and the silane-free groups. Cohesive failure was the main fracture pattern. The results suggest that additional silane pretreatment can effectively improve the bonding strength and marginal sealing of adhesives to lithium disilicate glass ceramics. The bonding performance of silane-containing universal adhesives without pretreatment is similar to that of silane-free adhesives. © 2017 Eur J Oral Sci.
NASA Astrophysics Data System (ADS)
Elez, Javier; Silva, Pablo G.; Huerta, Pedro; Perucha, M. Ángeles; Civis, Jorge; Roquero, Elvira; Rodríguez-Pascua, Miguel A.; Bardají, Teresa; Giner-Robles, Jorge L.; Martínez-Graña, Antonio
2016-12-01
The Malaga basin contains an important geological record documenting the complex paleogeographic evolution of the Gibraltar Arc before, during and after the closure and desiccation of the Mediterranean Sea triggered by the "Messinian Salinity crisis" (MSC). Proxy paleo-elevation data, estimated from the stratigraphic and geomorphological records, allow the building of quantitative paleogeoid, paleotopographic and paleogeographic models for the three main paleogeographic stages: pre-MSC (Tortonian-early Messinian), syn-MSC (late Messinian) and post-MSC (early Pliocene). The methodological workflow combines classical contouring procedures used in geology and isobase map models from geomorphometric analyses and proxy data overprinted on present Digital Terrain Models. The resulting terrain quantitative models have been arranged, managed and computed in a GIS environment. The computed terrain models enable the exploration of past landscapes usually beyond the reach of classical geomorphological analyses and strongly improve the paleogeographic and paleotopographic knowledge of the study area. The resulting models suggest the occurrence of a set of uplifted littoral erosive and paleokarstic landforms that evolved during pre-MSC times. These uplifted landform assemblages can explain the origin of key elements of the present landscape, such as the Torcal de Antequera and the large amount of mogote-like relict hills present in the zone, in terms of ancient uplifted tropical islands. The most prominent landform is the extensive erosional platform dominating the Betic frontal zone that represents the relic Atlantic wave cut platform elaborated during late-Tortonian to early Messinian times. The amount of uplift derived from paleogeoid models suggests that the area rose by about 340 m during the MSC. This points to isostatic uplift triggered by differential erosional unloading (towards the Mediterranean) as the main factor controlling landscape evolution in the area during and after the MSC. Former littoral landscapes in the old emergent axis of the Gibraltar Arc were uplifted to form the main water-divide of the present Betic Cordillera in the zone.
ERIC Educational Resources Information Center
Mavrou, Katerina
2012-01-01
This paper discusses the results of peer acceptance in a study investigating the interactions of pairs of disabled and non-disabled pupils working together on computer-based tasks in mainstream primary schools in Cyprus. Twenty dyads of pupils were observed and videotaped while working together at the computer. Data analyses were based on the…
ERIC Educational Resources Information Center
Nikolaidou, Georgia N.
2012-01-01
This exploratory work describes and analyses the collaborative interactions that emerge during computer-based music composition in the primary school. The study draws on socio-cultural theories of learning, originated within Vygotsky's theoretical context, and proposes a new model, namely Computer-mediated Praxis and Logos under Synergy (ComPLuS).…
ERIC Educational Resources Information Center
Chou, Huey-Wen; Wang, Yu-Fang
1999-01-01
Compares the effects of two training methods on computer attitude and performance in a World Wide Web page design program in a field experiment with high school students in Taiwan. Discusses individual differences, Kolb's Experiential Learning Theory and Learning Style Inventory, Computer Attitude Scale, and results of statistical analyses.…
ERIC Educational Resources Information Center
Kozbelt, Aaron; Dexter, Scott; Dolese, Melissa; Meredith, Daniel; Ostrofsky, Justin
2015-01-01
We applied computer-based text analyses of regressive imagery to verbal protocols of individuals engaged in creative problem-solving in two domains: visual art (23 experts, 23 novices) and computer programming (14 experts, 14 novices). Percentages of words involving primary process and secondary process thought, plus emotion-related words, were…
An efficient dynamic load balancing algorithm
NASA Astrophysics Data System (ADS)
Lagaros, Nikos D.
2014-01-01
In engineering problems, randomness and uncertainties are inherent. Robust design procedures, formulated in the framework of multi-objective optimization, have been proposed in order to take into account sources of randomness and uncertainty. These design procedures require orders of magnitude more computational effort than conventional analysis or optimum design processes, since a very large number of finite element analyses must be performed. There is therefore an imperative need to exploit the capabilities of computing resources in order to deal with this kind of problem. In particular, parallel computing can be implemented at the level of metaheuristic optimization, by exploiting the physical parallelization feature of the nondominated sorting evolution strategies method, as well as at the level of the repeated structural analyses required for assessing the behavioural constraints and for calculating the objective functions. In this study an efficient dynamic load balancing algorithm for optimum exploitation of available computing resources is proposed and, without loss of generality, is applied to computing the desired Pareto front. In such problems the computation of the complete Pareto front with feasible designs only constitutes a very challenging task. The proposed algorithm achieves almost linear speedup, approaching 100% parallel efficiency relative to the sequential procedure.
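At the level of the repeated structural analyses, dynamic load balancing amounts to handing each idle worker the next pending analysis rather than pre-assigning fixed chunks. A minimal Python illustration of that scheduling idea follows; it is not the proposed algorithm itself, and the mock analysis function and its runtimes are placeholders.

    from multiprocessing import Pool
    import time, random

    def structural_analysis(design):
        """Stand-in for one finite element analysis; runtimes vary per
        design, which is what makes static task splitting inefficient."""
        time.sleep(random.uniform(0.01, 0.05))
        return design, sum(x * x for x in design)   # mock objective value

    if __name__ == "__main__":
        designs = [[random.random() for _ in range(4)] for _ in range(100)]
        # imap_unordered hands each worker a new task as soon as it goes
        # idle: a simple dynamic load balance, in contrast to chunking the
        # task list statically across processes.
        with Pool(processes=4) as pool:
            results = list(pool.imap_unordered(structural_analysis, designs))
        print(len(results), "analyses completed")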
Effect size calculation in meta-analyses of psychotherapy outcome research.
Hoyt, William T; Del Re, A C
2018-05-01
Meta-analysis of psychotherapy intervention research normally examines differences between treatment groups and some form of comparison group (e.g., wait list control; alternative treatment group). The effect of treatment is normally quantified as a standardized mean difference (SMD). We describe procedures for computing unbiased estimates of the population SMD from sample data (e.g., group Ms and SDs), and provide guidance about a number of complications that may arise related to effect size computation. These complications include (a) incomplete data in research reports; (b) use of baseline data in computing SMDs and estimating the population standard deviation (σ); (c) combining effect size data from studies using different research designs; and (d) appropriate techniques for analysis of data from studies providing multiple estimates of the effect of interest (i.e., dependent effect sizes). Clinical or Methodological Significance of this article: Meta-analysis is a set of techniques for producing valid summaries of existing research. The initial computational step for meta-analyses of research on intervention outcomes involves computing an effect size quantifying the change attributable to the intervention. We discuss common issues in the computation of effect sizes and provide recommended procedures to address them.
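The core computation the article describes, an unbiased SMD (Hedges' g) from group means and SDs, can be sketched as below. The small-sample correction and the variance approximation follow standard meta-analytic formulas rather than the article's exact derivations, and the input numbers are invented.

    import math

    def hedges_g(m1, sd1, n1, m2, sd2, n2):
        """Unbiased standardized mean difference (Hedges' g) between two
        groups: Cohen's d from group means and SDs, then the small-sample
        correction factor J. A minimal sketch of the standard computation."""
        # Pooled standard deviation across the two groups
        sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2)
                       / (n1 + n2 - 2))
        d = (m1 - m2) / sp                    # Cohen's d (biased for small n)
        j = 1 - 3 / (4 * (n1 + n2 - 2) - 1)   # Hedges' correction factor
        g = j * d
        # Approximate sampling variance of g (inverse-variance weighting)
        var_g = j**2 * ((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2 - 2)))
        return g, var_g

    # Example: treatment group vs. wait-list control (made-up summary data)
    g, v = hedges_g(m1=12.4, sd1=5.1, n1=30, m2=9.8, sd2=4.7, n2=28)
    print(f"g = {g:.3f}, SE = {v**0.5:.3f}")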
Computer program (POWREQ) for power requirements of mass transit vehicles
DOT National Transportation Integrated Search
1977-08-01
This project was performed to develop a computer program suitable for use in systematic analyses requiring estimates of the energy requirements of mass transit vehicles as a function of driving schedules and vehicle size, shape, and gross weight. The...
Good coupling for the multiscale patch scheme on systems with microscale heterogeneity
NASA Astrophysics Data System (ADS)
Bunder, J. E.; Roberts, A. J.; Kevrekidis, I. G.
2017-05-01
Computational simulation of microscale detailed systems is frequently only feasible over spatial domains much smaller than the macroscale of interest. The 'equation-free' methodology couples many small patches of microscale computations across space to empower efficient computational simulation over macroscale domains of interest. Motivated by molecular or agent simulations, we analyse the performance of various coupling schemes for patches when the microscale is inherently 'rough'. As a canonical problem in this universality class, we systematically analyse the case of heterogeneous diffusion on a lattice. Computer algebra explores how the dynamics of coupled patches predict the large scale emergent macroscale dynamics of the computational scheme. We determine good design for the coupling of patches by comparing the macroscale predictions from patch dynamics with the emergent macroscale on the entire domain, thus minimising the computational error of the multiscale modelling. The minimal error on the macroscale is obtained when the coupling utilises averaging regions which are between a third and a half of the patch. Moreover, when the symmetry of the inter-patch coupling matches that of the underlying microscale structure, patch dynamics predicts the desired macroscale dynamics to any specified order of error. The results confirm that the patch scheme is useful for macroscale computational simulation of a range of systems with microscale heterogeneity.
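The canonical problem is easy to reproduce at full resolution. The sketch below simulates heterogeneous diffusion on a periodic lattice and recovers the emergent macroscale diffusivity (the harmonic mean of the bond conductivities), which is the quantity a well-coupled patch scheme must predict. It implements the full-domain microscale benchmark, not the patch coupling itself; the lattice size, time step, and conductivity range are arbitrary choices.

    import numpy as np

    # Heterogeneous diffusion on a periodic lattice:
    #   du_i/dt = k_{i+1/2}(u_{i+1} - u_i) - k_{i-1/2}(u_i - u_{i-1}),
    # with 'rough' (random) conductivities k.
    rng = np.random.default_rng(0)
    n, dt, steps = 200, 0.1, 2000
    k = rng.uniform(0.2, 2.0, n)              # k[i] sits between sites i, i+1
    u = np.sin(2 * np.pi * np.arange(n) / n)  # long-wave initial condition

    for _ in range(steps):
        flux = k * (np.roll(u, -1) - u)       # flux across each right bond
        u = u + dt * (flux - np.roll(flux, 1))

    # Decay of the long-wave Fourier mode gives the emergent diffusivity
    amp = np.abs(np.fft.fft(u)[1]) / (n / 2)
    D_emergent = -np.log(amp) / (steps * dt) / (2 * np.pi / n) ** 2
    print(f"emergent D = {D_emergent:.3f}, harmonic mean of k = "
          f"{1 / np.mean(1 / k):.3f}")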
On the transonic aerodynamics of a compressor blade row
NASA Technical Reports Server (NTRS)
Erickson, J. C., Jr.; Lordi, J. A.; Rae, W. J.
1971-01-01
Linearized analyses have been carried out for the induced velocity and pressure fields within a compressor blade row operating in an infinite annulus at transonic Mach numbers of the flow relative to the blades. In addition, the relationship between the induced velocity and the shape of the mean blade surface has been determined. A computational scheme has been developed for evaluating the blade mean surface ordinates and surface pressure distributions. The separation of the effects of a specified blade thickness distribution from the effects of a specified distribution of the blade lift has been established. In this way, blade mean surface shapes that are necessary for the blades to be locally nonlifting have been computed and are presented for two examples of blades with biconvex parabolic arc sections of radially tapering thickness. Blade shapes that are required to achieve a zero thickness, uniform chordwise loading, constant work spanwise loading are also presented for two examples. In addition, corresponding surface pressure distributions are given. The flow relative to the blade tips has a high subsonic Mach number in the examples that have been computed. The results suggest that at near-sonic relative tip speeds the effective blade shape is dominated by the thickness distribution, with the lift distribution playing only a minor role.
Zheng, Meixun; Bender, Daniel
2018-03-13
Computer-based testing (CBT) has made progress in health sciences education. In 2015, the authors led implementation of a CBT system (ExamSoft) at a dental school in the U.S. Guided by the Technology Acceptance Model (TAM), the purposes of this study were to (a) examine dental students' acceptance of ExamSoft; (b) understand factors impacting acceptance; and (c) evaluate the impact of ExamSoft on students' learning and exam performance. Survey and focus group data revealed that ExamSoft was well accepted by students as a testing tool and acknowledged by most for its potential to support learning. Regression analyses showed that perceived ease of use and perceived usefulness of ExamSoft significantly predicted student acceptance. Prior CBT experience and computer skills did not significantly predict acceptance of ExamSoft. Students reported that ExamSoft promoted learning in the first program year, primarily through timely and rich feedback on examination performance. t-Tests yielded mixed results on whether students performed better on computerized or paper examinations. The study contributes to the literature on CBT and the application of the TAM model in health sciences education. Findings also suggest ways in which health sciences institutions can implement CBT to maximize its potential as an assessment and learning tool.
Staff, Michael; Roberts, Christopher; March, Lyn
2016-10-01
To describe the completeness of routinely collected primary care data that could be used by computer models to predict clinical outcomes among patients with Type 2 Diabetes (T2D). Data on blood pressure, weight, total cholesterol, HDL-cholesterol and glycated haemoglobin levels for regular patients were electronically extracted from the medical record software of 12 primary care practices in Australia for the period 2000-2012. The data were analysed for temporal trends and for associations between patient characteristics and completeness. General practitioners were surveyed to identify barriers to recording data and strategies to improve its completeness. Over the study period data completeness improved to around 80%, although the recording of weight remained poorer at 55%. T2D patients with Ischaemic Heart Disease were more likely to have their blood pressure recorded (OR 1.6, p=0.02). Practitioners reported not experiencing any major barriers to using their computer medical record system but did agree with some suggested strategies to improve record completeness. The completeness of routinely collected data suitable for input into computerised predictive models is improving, although other dimensions of data quality need to be addressed. Copyright © 2016 Primary Care Diabetes Europe. Published by Elsevier Ltd. All rights reserved.
Hasbrouck, W.P.
1983-01-01
Processing of data taken with the U.S. Geological Survey's coal-seismic system is done with a desktop, stand-alone computer. Programs for this computer are written in the extended BASIC language used by the Tektronix 4051 Graphic System. This report presents computer programs to perform X-square/T-square analyses and to plot normal moveout lines on a seismogram overlay.
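The X-square/T-square analysis rests on the hyperbolic moveout relation T² = T0² + X²/V², so a straight-line fit of T² against X² recovers the velocity (slope = 1/V²) and zero-offset time (intercept = T0²). A Python sketch of that fit follows; the original programs were in Tektronix BASIC, and the offsets and picked times below are synthetic.

    import numpy as np

    x = np.array([30., 60., 90., 120., 150.])          # offsets, m
    t = np.array([0.054, 0.064, 0.078, 0.094, 0.112])  # picked times, s

    # Linear fit in the (X^2, T^2) plane
    slope, intercept = np.polyfit(x**2, t**2, 1)
    v = 1.0 / np.sqrt(slope)          # rms/stacking velocity
    t0 = np.sqrt(intercept)           # zero-offset two-way time
    print(f"V ~ {v:.0f} m/s, T0 ~ {t0 * 1000:.1f} ms, "
          f"depth ~ {v * t0 / 2:.1f} m")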
DAKOTA Design Analysis Kit for Optimization and Terascale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Dalbey, Keith R.; Eldred, Michael S.
2010-02-24
The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes (computational models) and iterative analysis methods. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and analysis of computational models on high performance computers. A user provides a set of DAKOTA commands in an input file and launches DAKOTA. DAKOTA invokes instances of the computational models, collects their results, and performs systems analyses. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, polynomial chaos, stochastic collocation, and epistemic methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as hybrid optimization, surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. Services for parallel computing, simulation interfacing, approximation modeling, fault tolerance, restart, and graphics are also included.
Adaptation of a program for nonlinear finite element analysis to the CDC STAR 100 computer
NASA Technical Reports Server (NTRS)
Pifko, A. B.; Ogilvie, P. L.
1978-01-01
The conversion of a nonlinear finite element program to the CDC STAR 100 pipeline computer is discussed. The program, called DYCAST, was developed for the crash simulation of structures. Initial results with the STAR 100 computer indicated that significant gains in computation time are possible for operations on global arrays. However, for element-level computations that do not lend themselves easily to long vector processing, the STAR 100 was slower than comparable scalar computers. On this basis it is concluded that in order for pipeline computers to impact the economic feasibility of large nonlinear analyses, it is absolutely essential that algorithms be devised to improve the efficiency of element-level computations.
Missense dopamine transporter mutations associate with adult parkinsonism and ADHD
Hansen, Freja H.; Skjørringe, Tina; Yasmeen, Saiqa; Arends, Natascha V.; Sahai, Michelle A.; Erreger, Kevin; Andreassen, Thorvald F.; Holy, Marion; Hamilton, Peter J.; Neergheen, Viruna; Karlsborg, Merete; Newman, Amy H.; Pope, Simon; Heales, Simon J.R.; Friberg, Lars; Law, Ian; Pinborg, Lars H.; Sitte, Harald H.; Loland, Claus; Shi, Lei; Weinstein, Harel; Galli, Aurelio; Hjermind, Lena E.; Møller, Lisbeth B.; Gether, Ulrik
2014-01-01
Parkinsonism and attention deficit hyperactivity disorder (ADHD) are widespread brain disorders that involve disturbances of dopaminergic signaling. The sodium-coupled dopamine transporter (DAT) controls dopamine homeostasis, but its contribution to disease remains poorly understood. Here, we analyzed a cohort of patients with atypical movement disorder and identified 2 DAT coding variants, DAT-Ile312Phe and a presumed de novo mutant DAT-Asp421Asn, in an adult male with early-onset parkinsonism and ADHD. According to DAT single-photon emission computed tomography (DAT-SPECT) scans and a fluoro-deoxy-glucose-PET/MRI (FDG-PET/MRI) scan, the patient suffered from progressive dopaminergic neurodegeneration. In heterologous cells, both DAT variants exhibited markedly reduced dopamine uptake capacity but preserved membrane targeting, consistent with impaired catalytic activity. Computational simulations and uptake experiments suggested that the disrupted function of the DAT-Asp421Asn mutant is the result of compromised sodium binding, in agreement with Asp421 coordinating sodium at the second sodium site. For DAT-Asp421Asn, substrate efflux experiments revealed a constitutive, anomalous efflux of dopamine, and electrophysiological analyses identified a large cation leak that might further perturb dopaminergic neurotransmission. Our results link specific DAT missense mutations to neurodegenerative early-onset parkinsonism. Moreover, the neuropsychiatric comorbidity provides additional support for the idea that DAT missense mutations are an ADHD risk factor and suggests that complex DAT genotype and phenotype correlations contribute to different dopaminergic pathologies. PMID:24911152
Eisenberg, Sarita; Guo, Ling-Yu
2016-05-01
This article reviews the existing literature on the diagnostic accuracy of two grammatical accuracy measures for differentiating children with and without language impairment (LI) at preschool and early school age based on language samples. The first measure, the finite verb morphology composite (FVMC), is a narrow grammatical measure that computes children's overall accuracy of four verb tense morphemes. The second measure, percent grammatical utterances (PGU), is a broader grammatical measure that computes children's accuracy in producing grammatical utterances. The extant studies show that FVMC demonstrates acceptable (i.e., 80 to 89% accurate) to good (i.e., 90% accurate or higher) diagnostic accuracy for children between 4;0 (years;months) and 6;11 in conversational or narrative samples. In contrast, PGU yields acceptable to good diagnostic accuracy for children between 3;0 and 8;11 regardless of sample types. Given the diagnostic accuracy shown in the literature, we suggest that FVMC and PGU can be used as one piece of evidence for identifying children with LI in assessment when appropriate. However, FVMC or PGU should not be used as therapy goals directly. Instead, when children are low in FVMC or PGU, we suggest that follow-up analyses should be conducted to determine the verb tense morphemes or grammatical structures that children have difficulty with. Thieme Medical Publishers 333 Seventh Avenue, New York, NY 10001, USA.
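Both measures reduce to simple proportions over a scored language sample. The sketch below assumes FVMC is the overall percent correct across four verb tense morphemes in obligatory contexts and PGU the percentage of grammatical utterances among scorable ones; the specific morpheme labels and all counts are hypothetical illustrations, not data from the reviewed studies.

    def fvmc(obligatory, correct):
        """Finite verb morphology composite: overall percent correct across
        the verb tense morphemes, from counts of obligatory contexts and
        correct productions (a simple sketch of the composite idea)."""
        return 100.0 * sum(correct.values()) / sum(obligatory.values())

    def pgu(n_grammatical, n_scorable):
        """Percent grammatical utterances in a language sample."""
        return 100.0 * n_grammatical / n_scorable

    # Hypothetical counts from one child's narrative sample
    obligatory = {"past -ed": 12, "3rd person -s": 9,
                  "copula BE": 15, "auxiliary BE/DO": 10}
    correct = {"past -ed": 7, "3rd person -s": 5,
               "copula BE": 13, "auxiliary BE/DO": 8}
    print(f"FVMC = {fvmc(obligatory, correct):.1f}%, "
          f"PGU = {pgu(34, 50):.1f}%")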
Rajagopal, Raghu Raman; Rajarao, Ravindra; Sahajwalla, Veena
2016-11-01
This paper investigates the high temperature transformation, specifically the kinetic behaviour, of waste printed circuit boards (WPCB) derived from computer monitors (single-sided/SSWPCB) and computer processing boards - CPU (multi-layered/MLWPCB), using Thermo-Gravimetric Analyser (TGA) and Vertical Thermo-Gravimetric Analyser (VTGA) techniques under a nitrogen atmosphere. Furthermore, the resulting WPCB residues were subjected to characterisation using X-ray Fluorescence spectrometry (XRF), a Carbon Analyser, an X-ray Photoelectron Spectrometer (XPS) and Scanning Electron Microscopy (SEM). In order to analyse the material degradation of WPCB, TGA runs from 40°C to 700°C at the rates of 10°C, 20°C and 30°C, and VTGA runs at 700°C, 900°C and 1100°C, were performed. The data obtained were analysed on the basis of first-order reaction kinetics. The experiments show a substantial difference between SSWPCB and MLWPCB in their decomposition levels, kinetic behaviour and structural properties. The calculated activation energy (E A) of SSWPCB is lower than that of MLWPCB. Elemental analysis shows that SSWPCB has a high carbon content, in contrast to the ceramic-rich MLWPCB, and these differences in material properties have a significant influence on the kinetics and physicochemical behaviour. These high temperature transformation studies and associated analytical investigations provide a fundamental understanding of different WPCBs and their major variations. Copyright © 2015 Elsevier Ltd. All rights reserved.
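Under the first-order assumption, the activation energy can be extracted from TGA conversion data with a Coats-Redfern style linearization, ln[-ln(1-a)/T²] = const - E A/(RT). The sketch below shows that fit on synthetic temperature/conversion pairs; it illustrates this kind of kinetic analysis, not the authors' procedure or data.

    import numpy as np

    R = 8.314                                  # gas constant, J/(mol K)
    T = np.array([500., 550., 600., 650., 700.])        # temperature, K
    alpha = np.array([0.05, 0.15, 0.35, 0.60, 0.82])    # mass-loss conversion

    # Coats-Redfern linearization for a first-order mechanism:
    # slope of y against 1/T equals -Ea/R
    y = np.log(-np.log(1 - alpha) / T**2)
    slope, _ = np.polyfit(1.0 / T, y, 1)
    print(f"Ea ~ {-slope * R / 1000:.1f} kJ/mol")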
Application of wall-models to discontinuous Galerkin LES
NASA Astrophysics Data System (ADS)
Frère, Ariane; Carton de Wiart, Corentin; Hillewaert, Koen; Chatelain, Philippe; Winckelmans, Grégoire
2017-08-01
Wall-resolved Large-Eddy Simulations (LES) are still limited to moderate Reynolds number flows due to the high computational cost required to capture the inner part of the boundary layer. Wall-modeled LES (WMLES) provide more affordable LES by modeling the near-wall layer. Wall function-based WMLES solve LES equations up to the wall, where the coarse mesh resolution essentially renders the calculation under-resolved. This makes the accuracy of WMLES very sensitive to the behavior of the numerical method. Therefore, best practice rules regarding the use and implementation of WMLES cannot be directly transferred from one methodology to another regardless of the type of discretization approach. Whilst numerous studies present guidelines on the use of WMLES, there is a lack of knowledge for discontinuous finite-element-like high-order methods. Incidentally, these methods are increasingly used on account of their high accuracy on unstructured meshes and their strong computational efficiency. The present paper proposes best practice guidelines for the use of WMLES in these methods. The study is based on sensitivity analyses of turbulent channel flow simulations by means of a Discontinuous Galerkin approach. It appears that good results can be obtained without the use of spatial or temporal averaging. The study confirms the importance of the wall function input data location and suggests taking it at the bottom of the second off-wall element. As these data are available through the ghost element, the suggested method prevents the loss of computational scalability experienced in unstructured WMLES. The study also highlights the influence of the polynomial degree used in the wall-adjacent element. It should preferably be of even degree, as using polynomials of degree two in the first off-wall element provides, surprisingly, better results than using polynomials of degree three.
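The kernel of a log-law wall function is solving u/u_tau = (1/kappa) ln(y u_tau / nu) + B for the friction velocity, given LES velocity data sampled at height y. A generic Newton-iteration sketch follows, with standard log-law constants; it is not the specific wall model of the paper, and the input values are illustrative channel-flow numbers.

    import math

    def friction_velocity(u, y, nu, kappa=0.41, B=5.2, n_iter=50):
        """Solve the log law u/u_tau = (1/kappa) ln(y u_tau / nu) + B for
        u_tau by Newton iteration: a generic wall-function kernel of the
        kind a WMLES feeds with LES data taken at height y."""
        ut = 0.05 * u                                 # initial guess
        for _ in range(n_iter):
            f = u / ut - (math.log(y * ut / nu) / kappa + B)
            dfdut = -u / ut**2 - 1.0 / (kappa * ut)   # df/du_tau
            ut -= f / dfdut
        return ut

    # Input data taken at the bottom of the second off-wall element, as the
    # study recommends; the numbers themselves are made up.
    u_tau = friction_velocity(u=15.0, y=0.05, nu=1e-5)
    print(f"u_tau ~ {u_tau:.4f}, y+ ~ {0.05 * u_tau / 1e-5:.0f}")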
Learning Computer Science: Perceptions, Actions and Roles
ERIC Educational Resources Information Center
Berglund, Anders; Eckerdal, Anna; Pears, Arnold; East, Philip; Kinnunen, Paivi; Malmi, Lauri; McCartney, Robert; Mostrom, Jan-Erik; Murphy, Laurie; Ratcliffe, Mark; Schulte, Carsten; Simon, Beth; Stamouli, Ioanna; Thomas, Lynda
2009-01-01
This phenomenographic study opens the classroom door to investigate teachers' experiences of students learning difficult computing topics. Three distinct themes are identified and analysed. "Why" do students succeed or fail to learn these concepts? "What" actions do teachers perceive will ameliorate the difficulties facing…
VIEWIT: computation of seen areas, slope, and aspect for land-use planning
Michael R. Travis; Gary H. Elsner; Wayne D. Iverson; Christine G. Johnson
1975-01-01
This user's guide provides instructions for using VIEWIT--a computerized technique for delineating the terrain visible from a single point or from multiple observer points, and for doing slope and aspect analyses. Results are in tabular or in overlay map form. VIEWIT can do individual view-area, slope, or aspect analyses or combined analyses, and can produce...
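The slope/aspect part of such an analysis is straightforward to reproduce. The sketch below computes slope and aspect grids from a synthetic digital elevation model with simple finite differences; VIEWIT's own viewshed algorithm is not shown, and the aspect convention here is just one common choice.

    import numpy as np

    cell = 30.0                                  # grid spacing, metres
    yy, xx = np.mgrid[0:50, 0:50]
    dem = 100.0 + 5.0 * np.sin(xx / 8.0) + 3.0 * np.cos(yy / 11.0)  # toy DEM

    dz_dy, dz_dx = np.gradient(dem, cell)        # per-axis elevation derivatives
    slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    aspect_deg = np.degrees(np.arctan2(-dz_dx, dz_dy)) % 360.0  # 0 deg = north
    print(f"mean slope {slope_deg.mean():.2f} deg")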
DMINDA: an integrated web server for DNA motif identification and analyses.
Ma, Qin; Zhang, Hanyuan; Mao, Xizeng; Zhou, Chuan; Liu, Bingqiang; Chen, Xin; Xu, Ying
2014-07-01
DMINDA (DNA motif identification and analyses) is an integrated web server for DNA motif identification and analyses, which is accessible at http://csbl.bmb.uga.edu/DMINDA/. This web site is freely available to all users and there is no login requirement. This server provides a suite of cis-regulatory motif analysis functions on DNA sequences, which are important to elucidation of the mechanisms of transcriptional regulation: (i) de novo motif finding for a given set of promoter sequences along with statistical scores for the predicted motifs derived based on information extracted from a control set, (ii) scanning motif instances of a query motif in provided genomic sequences, (iii) motif comparison and clustering of identified motifs, and (iv) co-occurrence analyses of query motifs in given promoter sequences. The server is powered by a backend computer cluster with over 150 computing nodes, and is particularly useful for motif prediction and analyses in prokaryotic genomes. We believe that DMINDA, as a new and comprehensive web server for cis-regulatory motif finding and analyses, will benefit the genomic research community in general and prokaryotic genome researchers in particular. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
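Function (ii), scanning motif instances, can be illustrated with a minimal position-weight-matrix scan; the matrix, sequence and score cutoff below are toy examples, not DMINDA's actual models.

    import numpy as np

    pwm = np.log2(np.array([        # toy 4-position motif, rows = A, C, G, T
        [0.7, 0.1, 0.1, 0.7],
        [0.1, 0.1, 0.1, 0.1],
        [0.1, 0.7, 0.1, 0.1],
        [0.1, 0.1, 0.7, 0.1],
    ]) / 0.25)                      # log-odds against a uniform background
    idx = {"A": 0, "C": 1, "G": 2, "T": 3}
    seq = "TTACGAACGTACGTTT"
    width = pwm.shape[1]

    for i in range(len(seq) - width + 1):
        window = seq[i:i + width]
        score = sum(pwm[idx[b], j] for j, b in enumerate(window))
        if score > 4.0:             # arbitrary reporting threshold
            print(i, window, round(score, 2))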
A PCA-Based method for determining craniofacial relationship and sexual dimorphism of facial shapes.
Shui, Wuyang; Zhou, Mingquan; Maddock, Steve; He, Taiping; Wang, Xingce; Deng, Qingqiong
2017-11-01
Previous studies have used principal component analysis (PCA) to investigate the craniofacial relationship, as well as sex determination using facial factors. However, few studies have investigated the extent to which the choice of principal components (PCs) affects the analysis of craniofacial relationship and sexual dimorphism. In this paper, we propose a PCA-based method for visual and quantitative analysis, using 140 samples of 3D heads (70 male and 70 female), produced from computed tomography (CT) images. There are two parts to the method. First, skull and facial landmarks are manually marked to guide the model's registration so that dense corresponding vertices occupy the same relative position in every sample. Statistical shape spaces of the skull and face in dense corresponding vertices are constructed using PCA. Variations in these vertices, captured in every principal component (PC), are visualized to observe shape variability. The correlations of skull- and face-based PC scores are analysed, and linear regression is used to fit the craniofacial relationship. We compute the PC coefficients of a face based on this craniofacial relationship and the PC scores of a skull, and apply the coefficients to estimate a 3D face for the skull. To evaluate the accuracy of the computed craniofacial relationship, the mean and standard deviation of every vertex between the two models are computed, where these models are reconstructed using real PC scores and coefficients. Second, each PC in facial space is analysed for sex determination, for which support vector machines (SVMs) are used. We examined the correlation between PCs and sex, and explored the extent to which the choice of PCs affects the expression of sexual dimorphism. Our results suggest that skull- and face-based PCs can be used to describe the craniofacial relationship and that the accuracy of the method can be improved by using an increased number of face-based PCs. The results show that the accuracy of the sex classification is related to the choice of PCs. The highest sex classification rate is 91.43% using our method. Copyright © 2017 Elsevier Ltd. All rights reserved.
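A minimal sketch of the second part of the method, under synthetic stand-in data: project flattened facial vertex coordinates onto principal components, then train a support vector machine on the PC scores for sex classification. The sample dimensions, injected dimorphism and number of retained PCs are illustrative, not the study's values.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    n, n_vertices = 140, 500
    X = rng.normal(size=(n, n_vertices * 3))   # stand-in face shape vectors
    y = np.repeat([0, 1], n // 2)              # 70 female, 70 male
    X[y == 1] += 0.05                          # small injected dimorphism

    scores = PCA(n_components=20).fit_transform(X)   # face-space PC scores
    acc = cross_val_score(SVC(kernel="linear"), scores, y, cv=5).mean()
    print(f"cross-validated sex classification accuracy: {acc:.2f}")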
SeedVicious: Analysis of microRNA target and near-target sites.
Marco, Antonio
2018-01-01
Here I describe seedVicious, a versatile microRNA target site prediction software that can be easily fitted into annotation pipelines and run over custom datasets. SeedVicious finds microRNA canonical sites plus other, less efficient, target sites. Among other novel features, seedVicious can compute evolutionary gains/losses of target sites using maximum parsimony, and also detect near-target sites, which have one nucleotide different from a canonical site. Near-target sites are important to study population variation in microRNA regulation. Some analyses suggest that near-target sites may also be functional sites, although there is no conclusive evidence for that, and they may actually be target alleles segregating in a population. SeedVicious does not aim to outperform but to complement existing microRNA prediction tools. For instance, the precision of TargetScan is almost doubled (from 11% to ~20%) when we filter predictions by the distance between target sites using this program. Interestingly, two adjacent canonical target sites are more likely to be present in bona fide target transcripts than pairs of target sites at slightly longer distances. The software is written in Perl and runs on 64-bit Unix computers (Linux and MacOS X). Users with no computing experience can also run the program in a dedicated web-server by uploading custom data, or browse pre-computed predictions. SeedVicious and its associated web-server and database (SeedBank) are distributed under the GPL/GNU license.
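The core matching idea can be sketched as follows: a canonical site is the reverse complement of miRNA seed positions 2-8, and a near-target site differs from it at exactly one position. This toy scanner is far simpler than seedVicious, which distinguishes several canonical site classes.

    def revcomp(s):
        return s.translate(str.maketrans("ACGU", "UGCA"))[::-1]

    def scan(transcript, mirna):
        site = revcomp(mirna[1:8])        # 7mer match to seed positions 2-8
        hits = []
        for i in range(len(transcript) - 6):
            window = transcript[i:i + 7]
            mism = sum(a != b for a, b in zip(window, site))
            if mism == 0:
                hits.append((i, window, "canonical"))
            elif mism == 1:
                hits.append((i, window, "near-target"))
        return hits

    mirna = "UGAGGUAGUAGGUUGUAUAGUU"      # let-7a, for illustration
    print(scan("AAACUACCUCAAACUACCUCUAACUAUCUCA", mirna))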
Chen, Po-Chia; Hologne, Maggy; Walker, Olivier
2017-03-02
Rotational diffusion (D_rot) is a fundamental property of biomolecules that contains information about molecular dimensions and solute-solvent interactions. While ab initio D_rot prediction can be achieved by explicit all-atom molecular dynamics simulations, this is hindered by both computational expense and limitations in water models. We propose coarse-grained force fields as a complementary solution, and show that the MARTINI force field with elastic networks is sufficient to compute D_rot in >10 proteins spanning 5-157 kDa. We also adopt a quaternion-based approach that computes D_rot orientation directly from autocorrelations of best-fit rotations as used in, e.g., RMSD algorithms. Over 2 μs trajectories, isotropic MARTINI+EN tumbling replicates experimental values to within 10-20%, with convergence analyses suggesting a minimum sampling of >50 × τ_theor to achieve sufficient precision. Transient fluctuations in anisotropic tumbling cause decreased precision in predictions of axisymmetric anisotropy and rhombicity, the latter of which cannot be precisely evaluated within 2000 × τ_theor for GB3. Thus, we encourage reporting of axial decompositions D_x, D_y, D_z to ease comparability between experiment and simulation. Where protein disorder is absent, we observe close replication of MARTINI+EN D_rot orientations versus CHARMM22*/TIP3P and experimental data. This work anticipates the ab initio prediction of NMR relaxation by combining coarse-grained global motions with all-atom local motions.
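As a much-simplified stand-in for the quaternion machinery, assuming isotropic tumbling of a single body-fixed unit vector (for which the rank-2 orientational autocorrelation decays as C2(t) = exp(-6 D_rot t)), the sketch below recovers D_rot from a synthetic Brownian rotation trajectory by a log-linear fit.

    import numpy as np

    rng = np.random.default_rng(1)
    dt, n_steps, d_true = 1e-3, 100_000, 2.0  # step (ns), steps, true Drot (1/ns)
    u = np.zeros((n_steps, 3)); u[0] = (0.0, 0.0, 1.0)
    for i in range(1, n_steps):               # small random rigid rotations
        dw = rng.normal(scale=np.sqrt(2.0 * d_true * dt), size=3)
        v = u[i - 1] + np.cross(dw, u[i - 1])
        u[i] = v / np.linalg.norm(v)

    lags = np.arange(1, 200, 5)
    c2 = [1.5 * np.mean(np.sum(u[:-k] * u[k:], axis=1) ** 2) - 0.5 for k in lags]
    slope = np.polyfit(lags * dt, np.log(c2), 1)[0]
    print(f"estimated D_rot = {-slope / 6.0:.2f} 1/ns (true {d_true})")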
Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach
Cheung, Mike W.-L.; Jak, Suzanne
2016-01-01
Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists—and probably the most crucial one—is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study. PMID:27242639
Savant Genome Browser 2: visualization and analysis for population-scale genomics.
Fiume, Marc; Smith, Eric J M; Brook, Andrew; Strbenac, Dario; Turner, Brian; Mezlini, Aziz M; Robinson, Mark D; Wodak, Shoshana J; Brudno, Michael
2012-07-01
High-throughput sequencing (HTS) technologies are providing an unprecedented capacity for data generation, and there is a corresponding need for efficient data exploration and analysis capabilities. Although most existing tools for HTS data analysis are developed for either automated (e.g. genotyping) or visualization (e.g. genome browsing) purposes, such tools are most powerful when combined. For example, integration of visualization and computation allows users to iteratively refine their analyses by updating computational parameters within the visual framework in real-time. Here we introduce the second version of the Savant Genome Browser, a standalone program for visual and computational analysis of HTS data. Savant substantially improves upon its predecessor and existing tools by introducing innovative visualization modes and navigation interfaces for several genomic datatypes, and synergizing visual and automated analyses in a way that is powerful yet easy even for non-expert users. We also present a number of plugins that were developed by the Savant Community, which demonstrate the power of integrating visual and automated analyses using Savant. The Savant Genome Browser is freely available (open source) at www.savantbrowser.com.
NASA Technical Reports Server (NTRS)
Diehl, Roger E. (Editor); Schinnerer, Ralph G. (Editor); Williamson, Walton E. (Editor); Boden, Daryl G. (Editor)
1992-01-01
The present conference discusses topics in orbit determination, tethered satellite systems, celestial mechanics, guidance optimization, flexible body dynamics and control, attitude dynamics and control, Mars mission analyses, earth-orbiting mission analysis/debris, space probe mission analyses, and orbital computation numerical analyses. Attention is given to electrodynamic forces for control of tethered satellite systems, orbiting debris threats to asteroid flyby missions, launch velocity requirements for interceptors of short range ballistic missiles, transfers between libration-point orbits in the elliptic restricted problem, minimum fuel spacecraft reorientation, orbital guidance for hitting a fixed point at maximum speed, efficient computation of satellite visibility periods, orbit decay and reentry prediction for space debris, and the determination of satellite close approaches.
The use of self-organising maps for anomalous behaviour detection in a digital investigation.
Fei, B K L; Eloff, J H P; Olivier, M S; Venter, H S
2006-10-16
The dramatic increase in crime relating to the Internet and computers has caused a growing need for digital forensics. Digital forensic tools have been developed to assist investigators in conducting a proper investigation into digital crimes. In general, the bulk of the digital forensic tools available on the market permit investigators to analyse data that has been gathered from a computer system. However, current state-of-the-art digital forensic tools simply cannot handle large volumes of data in an efficient manner. With the advent of the Internet, many employees have been given access to new and more interesting possibilities via their desktop. Consequently, excessive Internet usage for non-job purposes and even blatant misuse of the Internet have become a problem in many organisations. Since storage media are steadily growing in size, the process of analysing multiple computer systems during a digital investigation can easily consume an enormous amount of time. Identifying a single suspicious computer from a set of candidates can therefore reduce human processing time and monetary costs involved in gathering evidence. The focus of this paper is to demonstrate how, in a digital investigation, digital forensic tools and the self-organising map (SOM)--an unsupervised neural network model--can aid investigators to determine anomalous behaviours (or activities) among employees (or computer systems) in a far more efficient manner. By analysing the different SOMs (one for each computer system), anomalous behaviours are identified and investigators are assisted to conduct the analysis more efficiently. The paper will demonstrate how the easy visualisation of the SOM enhances the ability of the investigators to interpret and explore the data generated by digital forensic tools so as to determine anomalous behaviours.
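A minimal sketch of the SOM step, assuming per-machine activity feature vectors have already been extracted from forensic tool output, and using the third-party minisom package rather than the authors' implementation:

    import numpy as np
    from minisom import MiniSom

    rng = np.random.default_rng(2)
    features = rng.normal(size=(200, 8))   # stand-in per-machine usage features
    features[:5] += 4.0                    # a few anomalous machines

    som = MiniSom(10, 10, input_len=8, sigma=1.5, learning_rate=0.5,
                  random_seed=2)
    som.train_random(features, num_iteration=5000)

    u_matrix = som.distance_map()          # inter-neuron distances in [0, 1]
    for i, x in enumerate(features):
        bx, by = som.winner(x)             # best-matching unit for machine i
        if u_matrix[bx, by] > 0.8:         # unusually isolated map region
            print(f"machine {i} maps to an outlying SOM node ({bx}, {by})")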
Tai, Meng Wei; Chong, Zhen Feng; Asif, Muhammad Khan; Rahmat, Rabiah A; Nambiar, Phrabhakaran
2016-09-01
This study compared the suitability and precision of xerographic and computer-assisted methods for bite mark investigations. Eleven subjects were asked to bite on their forearm and the bite marks were photographically recorded. Alginate impressions of the subjects' dentition were taken and their casts were made using dental stone. The overlays generated by the xerographic method were obtained by photocopying the subjects' casts and transferring the incisal edge outlines onto a transparent sheet. The bite mark images were imported into Adobe Photoshop® software and printed to life size. The bite mark analyses using xerographically generated overlays were done by manually comparing an overlay to the corresponding printed bite mark images. In the computer-assisted method, the subjects' casts were scanned into Adobe Photoshop®, and the analyses using computer-assisted overlay generation were done by digitally matching an overlay to the corresponding bite mark images in Adobe Photoshop®. A further comparison method superimposed the cast images on the corresponding bite mark images employing Adobe Photoshop® CS6 and GIF-Animator©. During analysis a score with a range of 0-3 was given to each precision-determining criterion, with higher scores indicating better matching. The Kruskal-Wallis H test showed a significant difference between the three sets of data (H = 18.761, p < 0.05). In conclusion, bite mark analysis using the computer-assisted animated-superimposition method was the most accurate, followed by computer-assisted overlay generation and lastly the xerographic method. The superior precision contributed by the digital method is discernible despite the human skin being a poor recording medium for bite marks. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
A multi-GPU real-time dose simulation software framework for lung radiotherapy.
Santhanam, A P; Min, Y; Neelakkantan, H; Papp, N; Meeks, S L; Kupelian, P A
2012-09-01
Medical simulation frameworks facilitate both the preoperative and postoperative analysis of the patient's pathophysiological condition. Of particular importance is the simulation of radiation dose delivery for real-time radiotherapy monitoring and retrospective analyses of the patient's treatment. In this paper, a software framework tailored for the development of simulation-based real-time radiation dose monitoring medical applications is discussed. A multi-GPU-based computational framework coupled with inter-process communication methods is introduced for simulating radiation dose delivery on a deformable 3D volumetric lung model and visualizing it in real time. The model deformation and the corresponding dose calculation are allocated among the GPUs in a task-specific manner and performed in a pipelined fashion. Radiation dose calculations are computed on two different GPU hardware architectures. The integration of this computational framework with a front-end software layer and a back-end patient database repository is also discussed. Real-time simulation of the dose delivered is achieved once every 120 ms using the proposed framework. With a linear increase in the number of GPU cores, the computational time of the simulation decreased linearly. The inter-process communication time also improved with an increase in hardware memory. Variations in the delivered dose and computational speedup for variations in the data dimensions are investigated using D70, D90 and gEUD as metrics for a set of 14 patients. Computational speed-up increased with the beam dimensions when compared with CPU-based commercial software, while the error in the dose calculation was <1%. Our analyses show that the framework applied to deformable lung model-based radiotherapy is an effective tool for performing both real-time and retrospective analyses.
NASA Astrophysics Data System (ADS)
Burgess, A. B. H.; Erler, A. R.; Shepherd, T. G.
2012-04-01
We present spectra, nonlinear interaction terms, and fluxes computed for horizontal wind fields from high-resolution meteorological analyses made available by ECMWF for the International Polar Year. Total kinetic energy spectra clearly show two spectral regimes: a steep spectrum at large scales and a shallow spectrum in the mesoscale. The spectral shallowing appears at ~200 hPa, and is due to decreasing rotational power with height, which results in the shallower divergent spectrum dominating in the mesoscale. The spectra we find are steeper than those observed in aircraft data and GCM simulations. Though the analyses resolve total spherical harmonic wavenumbers up to n = 721, effects of dissipation on the fluxes and spectra are visible starting at about n = 200. We find a weak forward energy cascade and a downscale enstrophy cascade in the mesoscale. Eddy-eddy nonlinear kinetic energy transfers reach maximum amplitudes at the tropopause, and decrease with height thereafter; zonal mean-eddy transfers dominate in the stratosphere. In addition, zonal anisotropy reaches a minimum at the tropopause. Combined with strong eddy-eddy interactions, this suggests flow in the tropopause region is very active and bears the greatest resemblance to isotropic turbulence. We find constant enstrophy flux over a broad range of wavenumbers around the tropopause and in the upper stratosphere. A relatively constant spectral enstrophy flux at the tropopause suggests a turbulent inertial range, and that the enstrophy flux is resolved. A main result of our work is its implications for explaining the shallow mesoscale spectrum observed in aircraft wind measurements, GCM studies, and now meteorological analyses. The strong divergent component in the shallow mesoscale spectrum indicates unbalanced flow, and nonlinear transfers decreasing quickly with height are characteristic of waves, not turbulence. Together with the downscale flux of energy through the shallow spectral range, these findings add further evidence that the shallow mesoscale spectrum is not generated by balanced two-dimensional turbulence.
An Analysis of the Use of Cloud Computing among University Lecturers: A Case Study in Zimbabwe
ERIC Educational Resources Information Center
Musungwini, Samuel; Mugoniwa, Beauty; Furusa, Samuel Simbarashe; Rebanowako, Taurai George
2016-01-01
Cloud computing is a novel model of computing that may bring extensive benefits to users, institutions, businesses and academics, while at the same time also giving rise to new risks and challenges. This study looked at the benefits of using Google docs by researchers and academics and analysing the factors affecting the adoption and use of the…
Computer Literacy and the Construct Validity of a High-Stakes Computer-Based Writing Assessment
ERIC Educational Resources Information Center
Jin, Yan; Yan, Ming
2017-01-01
One major threat to validity in high-stakes testing is construct-irrelevant variance. In this study we explored whether the transition from a paper-and-pencil to a computer-based test mode in a high-stakes test in China, the College English Test, has brought about variance irrelevant to the construct being assessed in this test. Analyses of the…
Hiasa, Miki; Isoda, Yumiko; Kishimoto, Yasushi; Saitoh, Kenta; Kimura, Yasuaki; Kanai, Motomu; Shibasaki, Masakatsu; Hatakeyama, Dai; Kirino, Yutaka; Kuzuhara, Takashi
2013-05-01
Oseltamivir is the most widely prescribed anti-influenza medication. However, in rare instances, it has been reported to stimulate behavioural activities in adolescents. The goal of this study was to determine the molecular mechanism responsible for these behavioural activities. We performed an in vitro assay of MAO-A, the enzyme responsible for neurotransmitter degradation, using either the active form - oseltamivir carboxylate (OC) or the inactive prodrug - oseltamivir ethyl ester (OEE). We also analysed the docking of MAO-A with OEE or OC in silico. Mouse behaviours after OEE or OC administration were monitored using automated video and computer analysis. OEE, but not OC, competitively and selectively inhibited human MAO-A. The estimated Ki value was comparable with the Km values of native substrates of MAO-A. Docking simulations in silico based on the tertiary structure of MAO-A suggested that OEE could fit into the inner pocket of the enzyme. Behavioural monitoring using automated video analysis further revealed that OEE, not OC, significantly enhanced spontaneous behavioural activities in mice, such as jumping, rearing, sniffing, turning and walking. Our multilevel analyses suggested OEE to be the cause of the side effects associated with oseltamivir and revealed the molecular mechanism underlying the stimulated behaviours induced by oseltamivir in some circumstances. © 2013 The Authors. British Journal of Pharmacology © 2013 The British Pharmacological Society.
Zhang, Lei; Li, Wenfu; Wei, Dongtao; Yang, Wenjing; Yang, Ning; Qiao, Lei; Qiu, Jiang; Zuo, Xi-Nian; Zhang, Qinglin
2016-06-01
Mind pops or involuntary semantic memories refer to words, phrases, images, or melodies that suddenly pop into one's mind without any deliberate attempt to recall them. Despite their prevalence in everyday life, research on mind pops has started only recently. Notably, mind pops are very similar to clinical involuntary phenomena such as hallucinations in schizophrenia, suggesting their potential role in pathology. The present study aimed to investigate the relationship between mind pops and the brain morphometry measured in 302 healthy young adults; after exclusions, 256 participants were included in our analyses. Specifically, the Mind Popping Questionnaire (MPQ) was employed to measure the degree of individual mind pops, whereas the Voxel-Based Morphometry (VBM) was used to compute the volumes of both gray and white matter tissues. Multiple regression analyses on MPQ and VBM metrics indicated that high-frequency mind pops were significantly associated with smaller gray matter volume in the left middle temporal gyrus as well as with larger gray and white matter volume in the right medial prefrontal cortex. This increase in mind pops is also linked to higher creativity and the personality trait of 'openness'. These data not only suggest a key role of the two regions in generating self-related thoughts, but also open a possible link between brain and creativity or personality.
Adrenal incidentaloma in adults - management recommendations by the Polish Society of Endocrinology.
Bednarczuk, Tomasz; Bolanowski, Marek; Sworczak, Krzysztof; Górnicka, Barbara; Cieszanowski, Andrzej; Otto, Maciej; Ambroziak, Urszula; Pachucki, Janusz; Kubicka, Eliza; Babińska, Anna; Koperski, Łukasz; Januszewicz, Andrzej; Prejbisz, Aleksander; Górska, Maria; Jarząb, Barbara; Hubalewska-Dydejczyk, Alicja; Glinicki, Piotr; Ruchała, Marek; Kasperlik-Załuska, Anna
2016-01-01
The wide use of imaging techniques has resulted in more frequent diagnosis of adrenal incidentaloma. The aim of this work was to analyse the current state of knowledge on adrenal incidentaloma in adults in order to prepare practical management recommendations. Following a discussion, the Polish Society of Endocrinology expert working group analysed the available data and summarised the analysis results in the form of recommendations. Unenhanced adrenal computed tomography (CT) may be recommended as an initial assessment examination helpful in the differentiation between adenomas and "non-adenomatous" lesions. In the case of density > 10 Hounsfield units, CT with contrast-medium washout assessment or MRI is recommended. In all patients with adrenal incidentaloma, hormonal assessment is recommended in order to exclude pheochromocytoma and hypercortisolism, notwithstanding the clinical picture or concomitant diseases. In addition, examination to exclude primary hyperaldosteronism is suggested in patients with diagnosed hypertension or hypokalaemia. Surgical treatment should be recommended in patients with adrenal incidentaloma where imaging examinations suggest a malignant lesion (oncological indication) or where hormonal activity is confirmed (endocrinological indication). The basis of the surgical treatment is laparoscopic adrenalectomy. Patients with suspected pheochromocytoma must be pharmacologically prepared prior to surgery. In patients not qualified for surgery, control examinations (imaging and laboratory tests) should be established individually, taking into consideration features such as the size, image, and growth dynamics of the tumour, clinical symptoms, hormonal test results, and concomitant diseases.
Validity and reliability of the G-Cog device for kinematic measurements.
Chiementin, X; Crequy, S; Bertucci, W
2013-11-01
The aim of this study was to test the validity and reliability of the G-Cog, a new BMX powermeter allowing measurement of acceleration along the X, Y and Z axes (250 Hz) at the BMX rear wheel. These measurements allow computation of lateral, angular and linear acceleration, angular and linear velocity, and distance. Mechanical measurements at submaximal intensities in standardised laboratory conditions and during maximal exercises in field conditions were performed to analyse the reliability of the G-Cog accelerometers. The performances were evaluated in comparison with an industrial accelerometer and with two powermeters, the SRM and the PowerTap. Our results in laboratory conditions show that the G-Cog measurements have a low coefficient of variation (CV = 2.35%), suggesting that the G-Cog accelerometer measurements are reproducible. The ratio limits of agreement of the rear hub angular velocity differences between the SRM and the G-Cog were 1.010 ×/÷ 1.024 (95% CI = 0.986-1.034), and between the PowerTap and the G-Cog were 0.993 ×/÷ 1.019 (95% CI = 0.974-1.012). In conclusion, our results suggest that the G-Cog angular velocity measurements are valid and reliable compared with the SRM and PowerTap, and could be used to analyse kinematics during actual BMX conditions. © Georg Thieme Verlag KG Stuttgart · New York.
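The "×/÷" figures above are ratio limits of agreement. A sketch of how such values can be computed from paired device readings (synthetic stand-in data; log-transform the per-trial ratio, then back-transform mean ± 1.96 SD):

    import numpy as np

    rng = np.random.default_rng(3)
    srm = rng.normal(40.0, 5.0, 30)               # reference device (rad/s)
    gcog = srm * rng.normal(1.01, 0.012, 30)      # second device, slight bias

    log_ratio = np.log(gcog / srm)
    bias = np.exp(log_ratio.mean())               # geometric mean ratio
    factor = np.exp(1.96 * log_ratio.std(ddof=1)) # multiply/divide factor
    print(f"ratio limits of agreement: {bias:.3f} ×/÷ {factor:.3f}")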
Tulongicin, an Antibacterial Tri-Indole Alkaloid from a Deep-Water Topsentia sp. Sponge.
Liu, Hong-Bing; Lauro, Gianluigi; O'Connor, Robert D; Lohith, Katheryn; Kelly, Michelle; Colin, Patrick; Bifulco, Giuseppe; Bewley, Carole A
2017-09-22
Antibacterial-guided fractionation of an extract of a deep-water Topsentia sp. marine sponge led to the isolation of two new indole alkaloids, tulongicin A (1) and dihydrospongotine C (2), along with two known analogues, spongotine C (3) and dibromodeoxytopsentin (4). Their planar structures were determined by NMR spectroscopy. Their absolute configurations were determined through a combination of experimental and computational analyses. Tulongicin (1) is the first natural product to contain a di(6-Br-1H-indol-3-yl)methyl group linked to an imidazole core. The coexistence of tri-indole 1 and bis-indole alcohol 2 suggests a possible route to 1. All of the compounds showed strong antimicrobial activity against Staphylococcus aureus.
Effect of subliminal visual material on an auditory signal detection task.
Moroney, E; Bross, M
1984-02-01
An experiment assessed the effect of subliminally embedded, visual material on an auditory detection task. 22 women and 19 men were presented tachistoscopically with words designated as "emotional" or "neutral" on the basis of prior GSRs and a Word Rating List under four conditions: (a) Unembedded Neutral, (b) Embedded Neutral, (c) Unembedded Emotional, and (d) Embedded Emotional. On each trial subjects made forced choices concerning the presence or absence of an auditory tone (1000 Hz) at threshold level; hits and false alarm rates were used to compute non-parametric indices for sensitivity (A') and response bias (B"). While over-all analyses of variance yielded no significant differences, further examination of the data suggests the presence of subliminally "receptive" and "non-receptive" subpopulations.
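The non-parametric indices mentioned above are commonly computed with Grier's formulas; a sketch, valid for the case hit rate >= false alarm rate, with illustrative rates:

    def a_prime(h, f):
        # non-parametric sensitivity index A'
        return 0.5 + ((h - f) * (1 + h - f)) / (4 * h * (1 - f))

    def b_double_prime(h, f):
        # non-parametric response bias index B''
        return (h * (1 - h) - f * (1 - f)) / (h * (1 - h) + f * (1 - f))

    h, f = 0.70, 0.30   # illustrative hit and false alarm rates
    print(f"A' = {a_prime(h, f):.3f}, B'' = {b_double_prime(h, f):.3f}")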
Miksztai-Réthey, Brigitta; Faragó, Kinga Bettina
2015-01-01
We studied an artificial-intelligence-assisted interaction between a computer and a human with severe speech and physical impairments (SSPI). In order to speed up augmentative and alternative communication (AAC), we extended a former study of typing performance optimization using a framework that included head-movement-controlled assistive technology and an onscreen writing device. Quantitative and qualitative data were collected and analysed with mathematical methods, manual interpretation and semi-supervised machine video annotation. As the result of our research, in contrast to the former experiment's conclusions, we found that our participant had at least two different typing strategies. To maximize his communication efficiency, a more complex assistive tool is suggested, one which takes the different methods into consideration.
Evolution of epigenetic regulation in vertebrate genomes
Lowdon, Rebecca F.; Jang, Hyo Sik; Wang, Ting
2016-01-01
Empirical models of sequence evolution have spurred progress in the field of evolutionary genetics for decades. We are now realizing the importance and complexity of the eukaryotic epigenome. While epigenome analysis has been applied to genomes from single cell eukaryotes to human, comparative analyses are still relatively few, and computational algorithms to quantify epigenome evolution remain scarce. Accordingly, a quantitative model of epigenome evolution remains to be established. Here we review the comparative epigenomics literature and synthesize its overarching themes. We also suggest one mechanism, transcription factor binding site turnover, which relates sequence evolution to epigenetic conservation or divergence. Lastly, we propose a framework for how the field can move forward to build a coherent quantitative model of epigenome evolution. PMID:27080453
Exploiting the flexibility of a family of models for taxation and redistribution
NASA Astrophysics Data System (ADS)
Bertotti, M. L.; Modanese, G.
2012-08-01
We discuss a family of models expressed by nonlinear differential equation systems describing closed market societies in the presence of taxation and redistribution. We focus in particular on three example models obtained for different parameter choices. We analyse the influence of the various choices on the long-time shape of the income distribution. Several simulations suggest that behavioral heterogeneity among the individuals plays a definite role in the formation of fat tails of the asymptotic stationary distributions. This is in agreement with results found with different approaches and techniques. We also show that an excellent fit for the computational outputs of our models is provided by the κ-generalized distribution introduced by Kaniadakis in [Physica A 296, 405 (2001)].
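For reference, the κ-generalized distribution can be sketched in the complementary-CDF form commonly used for income modelling; the parameter values below are illustrative, not fitted to the models above.

    import numpy as np

    def exp_kappa(u, kappa):
        # Kaniadakis kappa-exponential: exp_k(u) = (sqrt(1+k^2 u^2) + k u)^(1/k)
        return (np.sqrt(1.0 + kappa**2 * u**2) + kappa * u) ** (1.0 / kappa)

    def ccdf_kappa(x, alpha, beta, kappa):
        # complementary CDF P(X > x) = exp_k(-beta * x**alpha)
        return exp_kappa(-beta * x**alpha, kappa)

    x = np.linspace(0.1, 10.0, 5)
    print(ccdf_kappa(x, alpha=2.0, beta=0.2, kappa=0.7))
    # For large x the tail decays as a power law ~ x**(-alpha/kappa),
    # reproducing the "fat tails" discussed in the abstract.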
Mechanism of axial strain effects on friction in carbon nanotube rotating bearings.
Huang, Jianzhang; Han, Qiang
2018-08-10
A systematic study of axial strain effects on friction in carbon nanotube bearings is conducted in this paper. The relationships between friction and axial strains are determined by implementing molecular dynamics simulations. It is found that the dependence of friction on velocity and temperature is altered by axial strains. The mechanism of strain effects is revealed through numerical and theoretical analyses. Based on phonon computations, axial strain effects tune friction by adjusting the distribution of the phonon frequency density, which affects the transfer efficiency of orderly kinetic energy into disorderly thermal energy. The findings in this work advance the understanding of friction in carbon nanotubes and suggest the great potential of axial strain effects on tuning friction in nanodevice applications.
Computing Surface Coordinates Of Face-Milled Spiral-Bevel Gear Teeth
NASA Technical Reports Server (NTRS)
Handschuh, Robert F.; Litvin, Faydor L.
1995-01-01
Surface coordinates of face-milled spiral-bevel gear teeth computed by method involving numerical solution of governing equations. Needed to generate mathematical models of tooth surfaces for use in finite-element analyses of stresses, strains, and vibrations in meshing spiral-bevel gears.
Computer program analyzes Buckling Of Shells Of Revolution with various wall construction, BOSOR
NASA Technical Reports Server (NTRS)
Almroth, B. O.; Bushnell, D.; Sobel, L. H.
1968-01-01
Computer program performs stability analyses for a wide class of shells without unduly restrictive approximations. The program uses numerical integration and finite difference or finite element techniques to solve with reasonable accuracy almost any buckling problem for shells exhibiting orthotropic behavior.
Analytic variance estimates of Swank and Fano factors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gutierrez, Benjamin; Badano, Aldo; Samuelson, Frank, E-mail: frank.samuelson@fda.hhs.gov
Purpose: Variance estimates for detector energy resolution metrics can be used as stopping criteria in Monte Carlo simulations for the purpose of ensuring a small uncertainty of those metrics and for the design of variance reduction techniques. Methods: The authors derive an estimate for the variance of two energy resolution metrics, the Swank factor and the Fano factor, in terms of statistical moments that can be accumulated without significant computational overhead. The authors examine the accuracy of these two estimators and demonstrate how the estimates of the coefficient of variation of the Swank and Fano factors behave with data from a Monte Carlo simulation of an indirect x-ray imaging detector. Results: The authors' analyses suggest that the accuracy of their variance estimators is appropriate for estimating the actual variances of the Swank and Fano factors for a variety of distributions of detector outputs. Conclusions: The variance estimators derived in this work provide a computationally convenient way to estimate the error or coefficient of variation of the Swank and Fano factors during Monte Carlo simulations of radiation imaging systems.
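With a normalized pulse-height distribution, the Swank factor reduces to I = mean² / second moment and the Fano factor to F = variance / mean. The sketch below computes both, with a naive bootstrap standing in for the authors' analytic variance estimators and a gamma sample standing in for simulated detector outputs.

    import numpy as np

    rng = np.random.default_rng(4)
    pulses = rng.gamma(shape=50.0, scale=20.0, size=20_000)  # stand-in outputs

    def swank(x):
        return np.mean(x) ** 2 / np.mean(x ** 2)   # I = m1^2 / (m0 m2), m0 = 1

    def fano(x):
        return np.var(x) / np.mean(x)              # F = variance / mean

    boot = [swank(rng.choice(pulses, pulses.size)) for _ in range(200)]
    print(f"I = {swank(pulses):.4f}, F = {fano(pulses):.2f}, "
          f"CV(I) = {np.std(boot) / np.mean(boot):.2e}")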
Papaleo, Elena; Tiberti, Matteo; Invernizzi, Gaetano; Pasi, Marco; Ranzani, Valeria
2011-11-01
The identification of the molecular mechanisms underlying enzyme cold adaptation is a hot topic both for fundamental research and industrial applications. In the present contribution, we review the last decades of structural computational investigations on cold-adapted enzymes in comparison to their warm-adapted counterparts. Comparative sequence and structural studies allow the definition of a multitude of adaptation strategies. Different enzymes have evolved diverse mechanisms to adapt to low temperatures, so that a general theory for enzyme cold adaptation cannot be formulated. However, some common features can be traced in the dynamics and flexibility properties of these enzymes, as well as in their intra- and inter-molecular interaction networks. Interestingly, the current data suggest that a family-centred point of view is necessary in comparative analyses of cold- and warm-adapted enzymes. In fact, enzymes belonging to the same family or superfamily, thus sharing at least the three-dimensional fold and common features of the functional sites, have evolved similar structural and dynamic patterns to overcome the detrimental effects of low temperatures.
Ratnam, Kalai Anand; Dominic, P D D; Ramayah, T
2014-08-01
Investments in and costs of infrastructure, communication, medical equipment, and software within the global healthcare ecosystem have increased significantly, and this proliferation is expected to continue. As a result, information exchange and cross-system communication have become challenging due to detached, independent systems and subsystems that are not connected. The overall model fit, based on a sample size of 320, was tested with structural equation modelling (SEM) using AMOS 20.0 as the modelling tool; SPSS 20.0 was used to analyse the descriptive statistics and dimension reliability. Results of the study show that the system utilisation and system impact dimensions influence the overall level of services of the healthcare providers. In addition, the findings suggest that systems integration and security play a pivotal role for IT resources in healthcare organisations. Through this study, a basis has been established for investigating the need to improve the Malaysian healthcare ecosystem and to introduce a cloud computing platform to host the national healthcare information exchange.
Tomesko, Jennifer; Touger-Decker, Riva; Dreker, Margaret; Zelig, Rena; Parrott, James Scott
2017-01-01
To explore knowledge and skill acquisition outcomes related to learning physical examination (PE) through computer-assisted instruction (CAI) compared with a face-to-face (F2F) approach, a systematic review and meta-analysis of studies published between January 2001 and December 2016 was conducted. Databases searched included Medline, Cochrane, CINAHL, ERIC, Ebsco, Scopus, and Web of Science. Studies were synthesized by study design, intervention, and outcomes. Statistical analyses included a DerSimonian-Laird random-effects model. In total, 7 studies were included in the review, and 5 in the meta-analysis. There were no statistically significant differences for knowledge (mean difference [MD] = 5.39, 95% confidence interval [CI]: -2.05 to 12.84) or skill acquisition (MD = 0.35, 95% CI: -5.30 to 6.01). The evidence does not suggest a strong, consistent preference for either CAI or F2F instruction for teaching students/trainees PE. Further research is needed to identify conditions under which knowledge and skill acquisition outcomes favor one mode of instruction over the other.
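A sketch of the DerSimonian-Laird random-effects pooling named above, on invented study-level mean differences and variances (not the review's data): a method-of-moments between-study variance is derived from Cochran's Q, then added to the within-study variances for inverse-variance pooling.

    import numpy as np

    y = np.array([4.2, 7.1, -1.0, 6.0, 9.3])   # study mean differences
    v = np.array([6.0, 9.5, 4.8, 12.0, 7.5])   # within-study variances

    w = 1.0 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)          # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)     # DL between-study variance

    w_star = 1.0 / (v + tau2)
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    print(f"pooled MD = {pooled:.2f}, 95% CI = "
          f"[{pooled - 1.96 * se:.2f}, {pooled + 1.96 * se:.2f}]")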
Ding, Feng; Sharma, Shantanu; Chalasani, Poornima; Demidov, Vadim V.; Broude, Natalia E.; Dokholyan, Nikolay V.
2008-01-01
RNA molecules with novel functions have revived interest in the accurate prediction of RNA three-dimensional (3D) structure and folding dynamics. However, existing methods are inefficient in automated 3D structure prediction. Here, we report a robust computational approach for rapid folding of RNA molecules. We develop a simplified RNA model for discrete molecular dynamics (DMD) simulations, incorporating base-pairing and base-stacking interactions. We demonstrate correct folding of 150 structurally diverse RNA sequences. The majority of DMD-predicted 3D structures have <4 Å deviations from experimental structures. The secondary structures corresponding to the predicted 3D structures consist of 94% native base-pair interactions. Folding thermodynamics and kinetics of tRNAPhe, pseudoknots, and mRNA fragments in DMD simulations are in agreement with previous experimental findings. Folding of RNA molecules features transient, non-native conformations, suggesting non-hierarchical RNA folding. Our method allows rapid conformational sampling of RNA folding, with computational time increasing linearly with RNA length. We envision this approach as a promising tool for RNA structural and functional analyses. PMID:18456842
The benefits of virtual reality simulator training for laparoscopic surgery.
Hart, Roger; Karthigasu, Krishnan
2007-08-01
Virtual reality is a computer-generated system that provides a representation of an environment. This review analyses the literature regarding any benefit to be derived from training with virtual reality equipment and describes the equipment currently available. Virtual reality systems are not yet realistic representations of the live operating environment because they lack tactile sensation and do not represent a complete operation. The literature suggests that virtual reality training is a valuable learning tool for gynaecologists in training, particularly those in the early stages of their careers. Furthermore, it may be of benefit for the ongoing audit of surgical skills and for the early identification of a surgeon's deficiencies before operative incidents occur. It is only a matter of time before realistic virtual reality models of most complete gynaecological operations are available, with improved haptics as a result of improved computer technology. It is inevitable that in the modern climate of litigation virtual reality training will become an essential part of clinical training, as evidence for its effectiveness as a training tool exists, and in many countries training by operating on live animals is not possible.
Bothe, Melanie K.; Maathuis, Annet J. H.; Bellmann, Susann; van der Vossen, Jos M. B. M.; Berressem, Dirk; Koehler, Annalena; Schwejda-Guettes, Susann; Gaigg, Barbara; Kuchinka-Koch, Angelika; Stover, John F.
2017-01-01
Lactulose, a disaccharide of galactose and fructose, used as a laxative or ammonia-lowering drug and as a functional food ingredient, enhances growth of Bifidobacterium and Lactobacillus at clinically relevant dosages. The prebiotic effect of subclinical dosages of Lactulose, however, remains to be elucidated. This study analyses changes in the microbiota and their metabolites after a 5 days Lactulose treatment using the TIM-2 system, a computer-controlled model of the proximal large intestine representing a complex, high density, metabolically active, anaerobic microbiota of human origin. Subclinical dosages of 2–5 g Lactulose were used. While 2 g Lactulose already increased the short-chain fatty acid levels of the intestinal content, 5 g Lactulose were required daily for 5 days in this study to exert the full beneficial prebiotic effect consisting of higher bacterial counts of Bifidobacterium, Lactobacillus, and Anaerostipes, a rise in acetate, butyrate and lactate, as well as a decrease in branched-chain fatty acids, pH (suggested by an increase in NaOH usage), and ammonia. PMID:28718839
Niekerk, Sjan-Mari van; Louw, Quinette Abigail; Grimmer-Sommers, Karen
2014-01-01
Dynamic movement whilst sitting is advocated as a way to reduce musculoskeletal symptoms from seated activities. Conventionally, ergonomics research captures only a 'snapshot' of static sitting posture, which does not provide information on the number or type of movements over a period of time. A novel approach to analysing the number of postural changes whilst sitting was employed in order to describe the sitting behaviour of adolescents undertaking computing activities. A repeated-measures observational study was conducted. A total of 12 high school students were randomly selected from a conveniently selected school. Fifteen minutes of 3D posture measurements were recorded to determine the number of postural changes whilst using computers. Data from 11 students could be analysed. Large intra-subject variation of the median and IQR was observed, indicating frequent postural changes whilst sitting. Better understanding of usual dynamic postural movements whilst sitting will provide new insights into the causes of musculoskeletal symptoms experienced by computer users.
Evaluation of a continuous-rotation, high-speed scanning protocol for micro-computed tomography.
Kerl, Hans Ulrich; Isaza, Cristina T; Boll, Hanne; Schambach, Sebastian J; Nolte, Ingo S; Groden, Christoph; Brockmann, Marc A
2011-01-01
Micro-computed tomography is used frequently in preclinical in vivo research. Limiting factors are radiation dose and long scan times. The purpose of the study was to compare a standard step-and-shoot to a continuous-rotation, high-speed scanning protocol. Micro-computed tomography of a lead grid phantom and a rat femur was performed using a step-and-shoot and a continuous-rotation protocol. Detail discriminability and image quality were assessed by 3 radiologists. The signal-to-noise ratio and the modulation transfer function were calculated, and volumetric analyses of the femur were performed. The radiation dose of the scan protocols was measured using thermoluminescence dosimeters. The 40-second continuous-rotation protocol allowed a detail discriminability comparable to the step-and-shoot protocol at significantly lower radiation doses. No marked differences in volumetric or qualitative analyses were observed. Continuous-rotation micro-computed tomography significantly reduces scanning time and radiation dose without relevantly reducing image quality compared with a normal step-and-shoot protocol.
Programs To Optimize Spacecraft And Aircraft Trajectories
NASA Technical Reports Server (NTRS)
Brauer, G. L.; Petersen, F. M.; Cornick, D.E.; Stevenson, R.; Olson, D. W.
1994-01-01
POST/6D POST is set of two computer programs providing ability to target and optimize trajectories of powered or unpowered spacecraft or aircraft operating at or near rotating planet. POST treats point-mass, three-degree-of-freedom case. 6D POST treats more-general rigid-body, six-degree-of-freedom (with point masses) case. Used to solve variety of performance, guidance, and flight-control problems for atmospheric and orbital vehicles. Applications include computation of performance or capability of vehicle in ascent, or orbit, and during entry into atmosphere, simulation and analysis of guidance and flight-control systems, dispersion-type analyses and analyses of loads, general-purpose six-degree-of-freedom simulation of controlled and uncontrolled vehicles, and validation of performance in six degrees of freedom. Written in FORTRAN 77 and C language. Two machine versions available: one for SUN-series computers running SunOS(TM) (LAR-14871) and one for Silicon Graphics IRIS computers running IRIX(TM) operating system (LAR-14869).
HYDES: A generalized hybrid computer program for studying turbojet or turbofan engine dynamics
NASA Technical Reports Server (NTRS)
Szuch, J. R.
1974-01-01
This report describes HYDES, a hybrid computer program capable of simulating one-spool turbojet, two-spool turbojet, or two-spool turbofan engine dynamics. HYDES is also capable of simulating two- or three-stream turbofans with or without mixing of the exhaust streams. The program is intended to reduce the time required for implementing dynamic engine simulations. HYDES was developed for running on the Lewis Research Center's Electronic Associates (EAI) 690 Hybrid Computing System and satisfies the 16384-word core-size and hybrid-interface limits of that machine. The program could be modified for running on other computing systems. The use of HYDES to simulate a single-spool turbojet and a two-spool, two-stream turbofan engine is demonstrated. The form of the required input data is shown and samples of output listings (teletype) and transient plots (x-y plotter) are provided. HYDES is shown to be capable of performing both steady-state design and off-design analyses and transient analyses.
ParallABEL: an R library for generalized parallelization of genome-wide association studies.
Sangket, Unitsa; Mahasirimongkol, Surakameth; Chantratita, Wasun; Tandayya, Pichaya; Aulchenko, Yurii S
2010-04-29
Genome-Wide Association (GWA) analysis is a powerful method for identifying loci associated with complex traits and drug response. Parts of GWA analyses, especially those involving thousands of individuals and consuming hours to months, will benefit from parallel computation. Acquiring the necessary programming skills to correctly partition and distribute data, control and monitor tasks on clustered computers, and merge output files is arduous. Most components of GWA analysis can be divided into four groups based on the types of input data and statistical outputs. The first group contains statistics computed for a particular Single Nucleotide Polymorphism (SNP) or trait, such as SNP characterization statistics or association test statistics; the input data of this group are the SNPs/traits. The second group concerns statistics characterizing an individual in a study, for example the summary statistics of genotype quality for each sample; the input data of this group are the individuals. The third group consists of pair-wise statistics derived from analyses between each pair of individuals in the study, for example genome-wide identity-by-state or genomic kinship analyses; the input data of this group are pairs of individuals. The final group concerns pair-wise statistics derived for pairs of SNPs, such as linkage disequilibrium characterisation; the input data of this group are pairs of SNPs. We developed the ParallABEL library, which utilizes the Rmpi library, to parallelize these four types of computations. The ParallABEL library is not only aimed at GenABEL, but may also be employed to parallelize various GWA packages in R. The data set from the North American Rheumatoid Arthritis Consortium (NARAC), which includes 2,062 individuals genotyped at 545,080 SNPs, was used to measure ParallABEL's performance. Almost perfect speed-up was achieved for many types of analyses. For example, the computing time for the identity-by-state matrix was reduced linearly from approximately eight hours to one hour when ParallABEL employed eight processors. Executing genome-wide association analysis using the ParallABEL library on a computer cluster is an effective way to boost performance and simplify the parallelization of GWA studies. ParallABEL is a user-friendly parallelization of GenABEL.
PVT: An Efficient Computational Procedure to Speed up Next-generation Sequence Analysis
2014-01-01
Background: High-throughput Next-Generation Sequencing (NGS) techniques are advancing genomics and molecular biology research. This technology generates substantially large data, which poses a major challenge for scientists seeking an efficient, cost- and time-effective way to analyse it. Further, the different types of NGS data share certain common, challenging analysis steps. Spliced alignment is one such fundamental step in NGS data analysis that is extremely computationally intensive as well as time consuming. Serious problems persist even in the most widely used spliced alignment tools. TopHat is one such widely used spliced alignment tool which, although it supports multithreading, does not efficiently utilize computational resources in terms of CPU utilization and memory. Here we introduce PVT (Pipelined Version of TopHat), in which we take a modular approach by breaking TopHat's serial execution into a pipeline of multiple stages, thereby increasing the degree of parallelization and computational resource utilization. We thus address the discrepancies in TopHat so as to analyse large NGS data efficiently. Results: We analysed the SRA datasets SRX026839 and SRX026838, consisting of single-end reads, and the SRA dataset SRR1027730, consisting of paired-end reads. We used TopHat v2.0.8 to analyse these datasets and noted the CPU usage, memory footprint and execution time during spliced alignment. With this baseline information, we designed PVT, a pipelined version of TopHat that removes the redundant computational steps during spliced alignment and breaks the job into a pipeline of multiple stages (each comprising different steps) to improve its resource utilization, thus reducing the execution time. Conclusions: PVT provides an improvement over TopHat for spliced alignment of NGS data. PVT thus reduced the execution time to ~23% for the single-end read dataset. Further, PVT designed for paired-end reads showed an improved performance of ~41% over TopHat (for the chosen data) with respect to execution time. Moreover, we propose PVT-Cloud, which implements the PVT pipeline in a cloud computing system. PMID:24894600
PVT: an efficient computational procedure to speed up next-generation sequence analysis.
Maji, Ranjan Kumar; Sarkar, Arijita; Khatua, Sunirmal; Dasgupta, Subhasis; Ghosh, Zhumur
2014-06-04
High-throughput Next-Generation Sequencing (NGS) techniques are advancing genomics and molecular biology research. This technology generates substantially large data, which poses a major challenge for scientists seeking an efficient, cost- and time-effective way to analyse it. Further, the different types of NGS data share certain common, challenging analysis steps. Spliced alignment is one such fundamental step in NGS data analysis that is extremely computationally intensive as well as time consuming. Serious problems persist even in the most widely used spliced alignment tools. TopHat is one such widely used spliced alignment tool which, although it supports multithreading, does not efficiently utilize computational resources in terms of CPU utilization and memory. Here we introduce PVT (Pipelined Version of TopHat), in which we take a modular approach by breaking TopHat's serial execution into a pipeline of multiple stages, thereby increasing the degree of parallelization and computational resource utilization. We thus address the discrepancies in TopHat so as to analyse large NGS data efficiently. We analysed the SRA datasets SRX026839 and SRX026838, consisting of single-end reads, and the SRA dataset SRR1027730, consisting of paired-end reads. We used TopHat v2.0.8 to analyse these datasets and noted the CPU usage, memory footprint and execution time during spliced alignment. With this baseline information, we designed PVT, a pipelined version of TopHat that removes the redundant computational steps during spliced alignment and breaks the job into a pipeline of multiple stages (each comprising different steps) to improve its resource utilization, thus reducing the execution time. PVT provides an improvement over TopHat for spliced alignment of NGS data. PVT thus reduced the execution time to ~23% for the single-end read dataset. Further, PVT designed for paired-end reads showed an improved performance of ~41% over TopHat (for the chosen data) with respect to execution time. Moreover, we propose PVT-Cloud, which implements the PVT pipeline in a cloud computing system.
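To make the pipelining idea concrete, the sketch below wires two toy stages together with processes and queues so that one chunk of reads can be filtered while the previous chunk is being aligned. This is an illustrative Python analogue under assumed stage names and data shapes, not PVT's or TopHat's actual code.

```python
# Minimal two-stage pipeline: each stage runs in its own process and passes
# chunks downstream through a queue, so CPU and I/O of different stages overlap.
from multiprocessing import Process, Queue

SENTINEL = None  # end-of-stream marker forwarded through every stage

def stage(func, inbox, outbox):
    while True:
        item = inbox.get()
        if item is SENTINEL:
            outbox.put(SENTINEL)
            break
        outbox.put(func(item))

def prefilter(chunk):  # toy QC: drop short reads
    return [r for r in chunk if len(r) > 20]

def align(chunk):      # toy "alignment": read -> fake locus
    return [(r, hash(r) % 1000) for r in chunk]

if __name__ == "__main__":
    q0, q1, q2 = Queue(), Queue(), Queue()
    workers = [Process(target=stage, args=(prefilter, q0, q1)),
               Process(target=stage, args=(align, q1, q2))]
    for w in workers:
        w.start()
    for chunk in (["ACGT" * 10] * 4, ["AC"] * 4):  # chunks of reads
        q0.put(chunk)
    q0.put(SENTINEL)
    while (out := q2.get()) is not SENTINEL:
        print(len(out), "reads aligned")
    for w in workers:
        w.join()
```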
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eperin, A.P.; Zakharzhevsky, Yu.O.; Arzhaev, A.I.
A two-year Finnish-Russian cooperation program was initiated in 1995 to demonstrate the applicability of the leak-before-break (LBB) concept to the primary circuit piping of the Leningrad NPP. The program includes J-R curve testing of authentic pipe materials at full operating temperature, screening and computational LBB analyses complying with the USNRC Standard Review Plan 3.6.3, and exchange of LBB-related information with emphasis on NDE. Domestic computer codes are mainly used, and all tests and analyses are independently carried out by each party. The results are believed to apply generally to RBMK type plants of the first generation.
Sensory System for Implementing a Human—Computer Interface Based on Electrooculography
Barea, Rafael; Boquete, Luciano; Rodriguez-Ascariz, Jose Manuel; Ortega, Sergio; López, Elena
2011-01-01
This paper describes a sensory system for implementing a human–computer interface based on electrooculography. An acquisition system captures electrooculograms and transmits them via the ZigBee protocol. The data acquired are analysed in real time using a microcontroller-based platform running the Linux operating system. The continuous wavelet transform and neural network are used to process and analyse the signals to obtain highly reliable results in real time. To enhance system usability, the graphical interface is projected onto special eyewear, which is also used to position the signal-capturing electrodes. PMID:22346579
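As a rough illustration of the signal-processing front end described above, the following sketch applies a continuous wavelet transform to a synthetic EOG-like step (a saccade) using the PyWavelets package. The Morlet wavelet, scale range and 250 Hz sampling rate are assumptions; the paper's actual wavelet and neural-network classifier are not reproduced here.

```python
# Hedged sketch: CWT of a toy EOG trace. A step edge (saccade) produces a
# ridge of large wavelet coefficients across scales at the event time,
# which a downstream classifier could pick up.
import numpy as np
import pywt

fs = 250.0                                    # assumed sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)
eog = np.zeros_like(t)
eog[t > 1.0] = 1.0                            # idealized saccade at t = 1 s
eog += 0.05 * np.random.default_rng(1).standard_normal(t.size)

scales = np.arange(1, 64)
coeffs, freqs = pywt.cwt(eog, scales, "morl", sampling_period=1.0 / fs)

peak_idx = np.argmax(np.abs(coeffs).sum(axis=0))   # where energy concentrates
print(f"detected event near t = {t[peak_idx]:.2f} s")
```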
Cowell, Robert G
2018-05-04
Current models for single-source and mixture samples, and the probabilistic genotyping software based on them for analysing STR electropherogram data, assume simple probability distributions, such as the gamma distribution, to model allelic peak-height variability given the initial amount of DNA prior to PCR amplification. Here we illustrate how amplicon number distributions, for a model of the process of sample DNA collection and PCR amplification, may be efficiently computed by evaluating probability generating functions using discrete Fourier transforms. Copyright © 2018 Elsevier B.V. All rights reserved.
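The numerical core of this approach is the standard inversion identity: if G(s) = Σ_k p_k s^k and the support is bounded by N-1, then evaluating G at the N-th roots of unity and applying a discrete Fourier transform recovers the masses p_k. A minimal sketch, using a binomial PGF as a stand-in for the paper's amplicon-number model:

```python
# Recover a probability mass function from its generating function:
# p_k = (1/N) * sum_j G(w^j) * w^(-jk), w = exp(2*pi*i/N), which is exactly
# a forward DFT of the PGF values at the roots of unity.
import numpy as np
from math import comb

def pmf_from_pgf(G, N):
    w = np.exp(2j * np.pi * np.arange(N) / N)   # N-th roots of unity
    return np.real(np.fft.fft(G(w))) / N        # FFT implements the sum over j

n, p = 30, 0.4
binom_pgf = lambda s: (1 - p + p * s) ** n      # PGF of Binomial(n, p)
pmf = pmf_from_pgf(binom_pgf, N=64)             # N must exceed the max value n

# check one mass against the closed-form binomial probability
assert abs(pmf[3] - comb(n, 3) * p**3 * (1 - p) ** (n - 3)) < 1e-12
print(f"P(X = 3) = {pmf[3]:.6f}")
```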
NASA Technical Reports Server (NTRS)
Sun, D. C.; Yuan, Qin
1994-01-01
The first phase of the study of the performance of a wormgear transmission is reported. In this phase the work included the selection of a double-enveloping wormgear type, and its dimensions, suitable for use in helicopter transmissions; the 3-D graphics representation of the selected wormgear using the I-DEAS software; the analysis of the kinematics of meshing; the analysis of load sharing among the meshing teeth; and the implementation of the analyses in a computer program. The report describes the analyses, their results, and the use of the computer programs.
Faries, Douglas E; Nyhuis, Allen W; Ascher-Svanum, Haya
2009-05-27
Schizophrenia is a severe, chronic, and costly illness that adversely impacts patients' lives and health care payer budgets. Cost comparisons of treatment regimens are, therefore, important to health care payers and researchers. Pre-Post analyses ("mirror-image"), where outcomes prior to a medication switch are compared to outcomes post-switch, are commonly used in such research. However, medication changes often occur during a costly crisis event. Patients may relapse, be hospitalized, have a medication change, and then spend a period of time with intense use of costly resources (post-medication switch). While many advantages and disadvantages of Pre-Post methodology have been discussed, issues regarding the attributability of costs incurred around the time of medication switching have not been fully investigated. Medical resource use data, including medications and acute-care services (hospitalizations, partial hospitalizations, emergency department) were collected for patients with schizophrenia who switched antipsychotics (n = 105) during a 1-year randomized, naturalistic, antipsychotic cost-effectiveness schizophrenia trial. Within-patient changes in total costs per day were computed during the pre- and post-medication change periods. In addition to the standard Pre-Post analysis comparing costs pre- and post-medication change, we investigated the sensitivity of results to varying assumptions regarding the attributability of acute care service costs occurring just after a medication switch that were likely due to initial medication failure. Fifty-six percent of all costs incurred during the first week on the newly initiated antipsychotic were likely due to treatment failure with the previous antipsychotic. Standard analyses suggested an average increase in cost-per-day for each patient of $2.40 after switching medications. However, sensitivity analyses removing costs incurred post-switch that were potentially due to the failure of the initial medication suggested decreases in costs in the range of $4.77 to $9.69 per day post-switch. Pre-Post cost analyses are sensitive to the approach used to handle acute-service costs occurring just after a medication change. Given the importance of quality economic research on the cost of switching treatments, thorough sensitivity analyses should be performed to identify the impact of crisis events around the time of medication change.
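A toy numeric example makes the sensitivity analysis concrete: recompute the pre-post difference in cost per day after excluding an attribution window just after the switch. The day counts, dollar figures and 7-day window below are invented for illustration.

```python
# Illustrative sketch: costs in a window just after a medication switch may
# belong to the *failing* prior drug, so the pre-post comparison is
# recomputed with that window reattributed/excluded.
import numpy as np

days_pre, days_post = 90, 90
pre_costs = np.full(days_pre, 20.0)                # $/day before switch
post_costs = np.full(days_post, 18.0)
post_costs[:7] += 400.0                            # crisis costs right after switch

standard = post_costs.mean() - pre_costs.mean()
sensitivity = post_costs[7:].mean() - pre_costs.mean()  # drop first post week
print(f"standard pre-post change:        ${standard:+.2f}/day")
print(f"excluding switch-week costs:     ${sensitivity:+.2f}/day")
```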
Trujillo-Arias, Natalia; Dantas, Gisele P M; Arbeláez-Cortés, Enrique; Naoki, Kazuya; Gómez, Maria I; Santos, Fabricio R; Miyaki, Cristina Y; Aleixo, Alexandre; Tubaro, Pablo L; Cabanne, Gustavo S
2017-07-01
The Atlantic Forest is separated from the Andean tropical forest by dry and open vegetation biomes (Chaco and Cerrado). Despite this isolation, both rainforests share closely related lineages, which suggests a past connection. This connection could have been important for forest taxa evolution. In this study, we used the Saffron-billed Sparrow (Arremon flavirostris) as a model to evaluate whether the Andean and the Atlantic forests act as a refugia system, as well as to test for a history of biogeographic connection between them. In addition, we evaluated the molecular systematics of intraspecific lineages of the studied species. We modeled the current and past distribution of A. flavirostris, performed phylogeographic analyses based on mitochondrial and nuclear genes, and used Approximate Bayesian Computation (ABC) analyses to test biogeographic scenarios. The major phylogeographic disjunction within A. flavirostris was found between the Andean and the Atlantic forests, with a divergence that occurred during the Mid-Pleistocene. Our paleodistribution models indicated a connection between these forest domains in different periods and through both the Chaco and Cerrado. Additionally, the phylogeographic and ABC analyses supported the Cerrado as the main route of connection between these rainforests, but without giving decisive evidence against a Chaco connection. Our study of A. flavirostris suggests that the biodiversity of the Andean and of the Atlantic forests could have been impacted (and perhaps enriched?) by cycles of connections through the Cerrado and Chaco. This recurrent cycle of connection between the Andean and the Atlantic Forest could have been important for the evolution of Neotropical forest taxa. In addition, we discuss the taxonomic implications of the results and propose to split the studied taxon into two full species. Copyright © 2017 Elsevier Inc. All rights reserved.
Neuroimaging studies of GABA in schizophrenia: a systematic review with meta-analysis.
Egerton, A; Modinos, G; Ferrera, D; McGuire, P
2017-06-06
Data from animal models and from postmortem studies suggest that schizophrenia is associated with brain GABAergic dysfunction. The extent to which this is reflected in data from in vivo studies of GABA function in schizophrenia is unclear. The Medline database was searched to identify articles published until 21 October 2016. The search terms included GABA, proton magnetic resonance spectroscopy (1H-MRS), positron emission tomography (PET), single photon emission computed tomography (SPECT), schizophrenia and psychosis. Sixteen GABA 1H-MRS studies (538 controls, 526 patients) and seven PET/SPECT studies of GABAA/benzodiazepine receptor (GABAA/BZR) availability (118 controls, 113 patients) were identified. Meta-analyses of 1H-MRS GABA in the medial prefrontal cortex (mPFC), parietal/occipital cortex (POC) and striatum did not show significant group differences (mPFC: g=-0.3, 409 patients, 495 controls, 95% confidence interval (CI): -0.6 to 0.1; POC: g=-0.3, 139 patients, 111 controls, 95% CI: -0.9 to 0.3; striatum: g=-0.004, 123 patients, 95 controls, 95% CI: -0.7 to 0.7). Heterogeneity across studies was high (I2>50%), and this was not explained by subsequent moderator or meta-regression analyses. There were insufficient PET/SPECT receptor availability studies for meta-analyses, but a systematic review did not suggest replicable group differences in regional GABAA/BZR availability. The current literature does not reveal consistent alterations in in vivo GABA neuroimaging measures in schizophrenia, as might be hypothesized from animal models and postmortem data. The analysis highlights the need for further GABA neuroimaging studies with improved methodology and addressing potential sources of heterogeneity.
Neuroimaging studies of GABA in schizophrenia: a systematic review with meta-analysis
Egerton, A; Modinos, G; Ferrera, D; McGuire, P
2017-01-01
Data from animal models and from postmortem studies suggest that schizophrenia is associated with brain GABAergic dysfunction. The extent to which this is reflected in data from in vivo studies of GABA function in schizophrenia is unclear. The Medline database was searched to identify articles published until 21 October 2016. The search terms included GABA, proton magnetic resonance spectroscopy (1H-MRS), positron emission tomography (PET), single photon emission computed tomography (SPECT), schizophrenia and psychosis. Sixteen GABA 1H-MRS studies (538 controls, 526 patients) and seven PET/SPECT studies of GABAA/benzodiazepine receptor (GABAA/BZR) availability (118 controls, 113 patients) were identified. Meta-analyses of 1H-MRS GABA in the medial prefrontal cortex (mPFC), parietal/occipital cortex (POC) and striatum did not show significant group differences (mPFC: g=−0.3, 409 patients, 495 controls, 95% confidence interval (CI): −0.6 to 0.1; POC: g=−0.3, 139 patients, 111 controls, 95% CI: −0.9 to 0.3; striatum: g=−0.004, 123 patients, 95 controls, 95% CI: −0.7 to 0.7). Heterogeneity across studies was high (I2>50%), and this was not explained by subsequent moderator or meta-regression analyses. There were insufficient PET/SPECT receptor availability studies for meta-analyses, but a systematic review did not suggest replicable group differences in regional GABAA/BZR availability. The current literature does not reveal consistent alterations in in vivo GABA neuroimaging measures in schizophrenia, as might be hypothesized from animal models and postmortem data. The analysis highlights the need for further GABA neuroimaging studies with improved methodology and addressing potential sources of heterogeneity. PMID:28585933
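For readers unfamiliar with the pooling behind figures like g=-0.3 (95% CI -0.6 to 0.1) and I²>50%, the sketch below implements a standard DerSimonian-Laird random-effects combination of per-study Hedges' g values in Python. The effect sizes and variances are fabricated placeholders, not the studies from this review.

```python
# DerSimonian-Laird random-effects meta-analysis of standardized mean
# differences, with Cochran's Q and the I^2 heterogeneity statistic.
import numpy as np

g = np.array([-0.5, -0.1, 0.2, -0.4, -0.3])     # per-study Hedges' g (fabricated)
v = np.array([0.04, 0.06, 0.05, 0.08, 0.03])    # per-study sampling variances

w = 1.0 / v                                      # fixed-effect weights
g_fe = np.sum(w * g) / np.sum(w)
Q = np.sum(w * (g - g_fe) ** 2)                  # Cochran's Q
k = len(g)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
I2 = max(0.0, (Q - (k - 1)) / Q) * 100           # % variance from heterogeneity

w_re = 1.0 / (v + tau2)                          # random-effects weights
g_re = np.sum(w_re * g) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
print(f"pooled g = {g_re:.2f}, "
      f"95% CI [{g_re - 1.96*se:.2f}, {g_re + 1.96*se:.2f}], I2 = {I2:.0f}%")
```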
Detailed T1-Weighted Profiles from the Human Cortex Measured in Vivo at 3 Tesla MRI.
Ferguson, Bart; Petridou, Natalia; Fracasso, Alessio; van den Heuvel, Martijn P; Brouwer, Rachel M; Hulshoff Pol, Hilleke E; Kahn, René S; Mandl, René C W
2018-04-01
Studies into cortical thickness in psychiatric diseases based on T1-weighted MRI frequently report on aberrations in the cerebral cortex. Due to limitations in image resolution for studies conducted at conventional MRI field strengths (e.g. 3 Tesla (T)), this information cannot be used to establish which of the cortical layers may be implicated. Here we propose a new analysis method that computes one high-resolution average cortical profile per brain region, extracting myeloarchitectural information from T1-weighted MRI scans that are routinely acquired at a conventional field strength. To assess this new method, we acquired standard T1-weighted scans at 3 T and compared them with state-of-the-art ultra-high resolution T1-weighted scans optimised for intracortical myelin contrast acquired at 7 T. Average cortical profiles were computed for seven different brain regions. Besides a qualitative comparison between the 3 T scans, 7 T scans, and results from literature, we tested whether the results from dynamic time warping-based clustering are similar for the cortical profiles computed from 7 T and 3 T data. In addition, we quantitatively compared cortical profiles computed for V1, V2 and V7 for both 7 T and 3 T data using a priori information on their relative myelin concentration. Although qualitative comparisons show that at an individual level average profiles computed for 7 T have more pronounced features than 3 T profiles, the results from the quantitative analyses suggest that average cortical profiles computed from T1-weighted scans acquired at 3 T indeed contain myeloarchitectural information similar to profiles computed from the scans acquired at 7 T. The proposed method therefore provides a step forward to studying cortical myeloarchitecture in vivo at conventional magnetic field strength, both in health and disease.
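The dissimilarity underlying the clustering step mentioned above is dynamic time warping, which aligns two depth profiles before measuring their distance. A minimal pure-numpy sketch, with synthetic profiles standing in for real 3 T and 7 T data (the paper's exact DTW variant and clustering settings are not specified here):

```python
# Classic dynamic programming DTW distance between two 1-D profiles.
import numpy as np

def dtw(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

depth = np.linspace(0, 1, 50)                        # normalized cortical depth
profile_7t = np.exp(-((depth - 0.40) / 0.15) ** 2)   # sharper myelin peak
profile_3t = np.exp(-((depth - 0.45) / 0.18) ** 2)   # similar but shifted/blurred
print(f"DTW distance 3T vs 7T: {dtw(profile_3t, profile_7t):.3f}")
```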
The iPlant Collaborative: Cyberinfrastructure for Enabling Data to Discovery for the Life Sciences.
Merchant, Nirav; Lyons, Eric; Goff, Stephen; Vaughn, Matthew; Ware, Doreen; Micklos, David; Antin, Parker
2016-01-01
The iPlant Collaborative provides life science research communities access to comprehensive, scalable, and cohesive computational infrastructure for data management; identity management; collaboration tools; and cloud, high-performance, high-throughput computing. iPlant provides training, learning material, and best practice resources to help all researchers make the best use of their data, expand their computational skill set, and effectively manage their data and computation when working as distributed teams. iPlant's platform permits researchers to easily deposit and share their data and deploy new computational tools and analysis workflows, allowing the broader community to easily use and reuse those data and computational analyses.
The Immuno-Dynamics of Conflict Intervention in Social Systems
Krakauer, David C.; Page, Karen; Flack, Jessica
2011-01-01
We present statistical evidence and dynamical models for the management of conflict and a division of labor (task specialization) in a primate society. Two broad intervention strategy classes are observed: a dyadic strategy (pacifying interventions) and a triadic strategy (policing interventions). These strategies, their respective degrees of specialization, and their consequences for conflict dynamics can be captured through empirically grounded mathematical models inspired by immuno-dynamics. The spread of aggression, analogous to the proliferation of pathogens, is an epidemiological problem. We show analytically and computationally that policing is an efficient strategy as it requires only a small proportion of a population to police to reduce conflict contagion. Policing, but not pacifying, is capable of effectively eliminating conflict. These results suggest that despite implementation differences there might be universal features of conflict management mechanisms for reducing contagion-like dynamics that apply across biological and social levels. Our analyses further suggest that it can be profitable to conceive of conflict management strategies at the behavioral level as mechanisms of social immunity. PMID:21887221
The immuno-dynamics of conflict intervention in social systems.
Krakauer, David C; Page, Karen; Flack, Jessica
2011-01-01
We present statistical evidence and dynamical models for the management of conflict and a division of labor (task specialization) in a primate society. Two broad intervention strategy classes are observed: a dyadic strategy (pacifying interventions) and a triadic strategy (policing interventions). These strategies, their respective degrees of specialization, and their consequences for conflict dynamics can be captured through empirically grounded mathematical models inspired by immuno-dynamics. The spread of aggression, analogous to the proliferation of pathogens, is an epidemiological problem. We show analytically and computationally that policing is an efficient strategy as it requires only a small proportion of a population to police to reduce conflict contagion. Policing, but not pacifying, is capable of effectively eliminating conflict. These results suggest that despite implementation differences there might be universal features of conflict management mechanisms for reducing contagion-like dynamics that apply across biological and social levels. Our analyses further suggest that it can be profitable to conceive of conflict management strategies at the behavioral level as mechanisms of social immunity.
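Purely as an illustration of the contagion analogy, and not the authors' actual equations, a minimal compartment-style model can show how a small policing fraction changes the long-run level of conflict; the functional form and all parameter values below are assumptions.

```python
# Illustrative caricature of conflict contagion: aggression spreads at rate
# beta, resolves naturally at rate delta, and a policing fraction rho removes
# active conflicts at rate gamma per policer.
import numpy as np
from scipy.integrate import odeint

beta, gamma, delta = 0.8, 15.0, 0.1   # spread, policing efficacy, natural resolution

def conflict(y, t, rho):
    i = y[0]                          # fraction of individuals in active conflict
    di = beta * i * (1 - i - rho) - gamma * rho * i - delta * i
    return [di]

t = np.linspace(0, 100, 1000)
for rho in (0.0, 0.02, 0.06):         # fraction of the group that polices
    i_end = odeint(conflict, [0.1], t, args=(rho,))[-1, 0]
    print(f"policers {rho:.0%}: long-run conflict fraction = {i_end:.3f}")
```

With these made-up rates, 6% policers already drive conflict to extinction while 0% leaves it endemic, echoing the qualitative claim that a small policing proportion suffices.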
A Balanced Mixture of Antagonistic Pressures Promotes the Evolution of Parallel Movement
NASA Astrophysics Data System (ADS)
Demšar, Jure; Štrumbelj, Erik; Lebar Bajec, Iztok
2016-12-01
A common hypothesis about the origins of collective behaviour suggests that animals might live and move in groups to increase their chances of surviving predator attacks. This hypothesis is supported by several studies that use computational models to simulate natural evolution. These studies, however, either tune an ad-hoc model to ‘reproduce’ collective behaviour, or concentrate on a single type of predation pressure, or infer the emergence of collective behaviour from an increase in prey density. In nature, prey are often targeted by multiple predator species simultaneously and this might have played a pivotal role in the evolution of collective behaviour. We expand on previous research by using an evolutionary rule-based system to simulate the evolution of prey behaviour when prey are subject to multiple simultaneous predation pressures. We analyse the evolved behaviour via prey density, polarization, and angular momentum. Our results suggest that a mixture of antagonistic external pressures that simultaneously steer prey towards grouping and dispersing might be required for prey individuals to evolve dynamic parallel movement.
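Polarization and angular momentum have standard definitions in the collective-motion literature: the norm of the mean unit heading, and the norm of the mean cross product of normalized position (relative to the group centre) with velocity. A short numpy sketch on placeholder flock data:

```python
# Order parameters for a 2-D flock: polarization near 1 means parallel
# movement; angular momentum near 1 means milling/rotation about the centre.
import numpy as np

rng = np.random.default_rng(2)
pos = rng.uniform(-1, 1, size=(100, 2))            # 100 prey, 2-D positions
angles = rng.normal(0.3, 0.2, size=100)            # mostly aligned headings
vel = np.c_[np.cos(angles), np.sin(angles)]        # unit velocity vectors

polarization = np.linalg.norm(vel.mean(axis=0))

r = pos - pos.mean(axis=0)                         # positions relative to centroid
r_hat = r / np.linalg.norm(r, axis=1, keepdims=True)
cross = r_hat[:, 0] * vel[:, 1] - r_hat[:, 1] * vel[:, 0]   # scalar 2-D cross product
ang_momentum = abs(cross.mean())

print(f"polarization = {polarization:.2f}, angular momentum = {ang_momentum:.2f}")
```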
Neurotoxic effects of ecstasy on the thalamus.
de Win, Maartje M L; Jager, Gerry; Booij, Jan; Reneman, Liesbeth; Schilt, Thelma; Lavini, Cristina; Olabarriaga, Sílvia D; Ramsey, Nick F; Heeten, Gerard J den; van den Brink, Wim
2008-10-01
Neurotoxic effects of ecstasy have been reported, although it remains unclear whether effects can be attributed to ecstasy, other recreational drugs or a combination of these. To assess specific/independent neurotoxic effects of heavy ecstasy use and contributions of amphetamine, cocaine and cannabis as part of The Netherlands XTC Toxicity (NeXT) study. Effects of ecstasy and other substances were assessed with (1)H-magnetic resonance spectroscopy, diffusion tensor imaging, perfusion weighted imaging and [(123)I]2beta-carbomethoxy-3beta-(4-iodophenyl)-tropane ([(123)I]beta-CIT) single photon emission computed tomography (serotonin transporters) in a sample (n=71) with broad variation in drug use, using multiple regression analyses. Ecstasy showed specific effects in the thalamus with decreased [(123)I]beta-CIT binding, suggesting serotonergic axonal damage; decreased fractional anisotropy, suggesting axonal loss; and increased cerebral blood volume probably caused by serotonin depletion. Ecstasy had no effect on brain metabolites and apparent diffusion coefficients. Converging evidence was found for a specific toxic effect of ecstasy on serotonergic axons in the thalamus.
Faridi, Mohd Hafeez; Maiguel, Dony; Brown, Brock T.; Suyama, Eigo; Barth, Constantinos J.; Hedrick, Michael; Vasile, Stefan; Sergienko, Eduard; Schürer, Stephan; Gupta, Vineet
2010-01-01
Binding of leukocyte specific integrin CD11b/CD18 to its physiologic ligands is important for the development of normal immune response in vivo. Integrin CD11b/CD18 is also a key cellular effector of various inflammatory and autoimmune diseases. However, small molecules selectively inhibiting the function of integrin CD11b/CD18 are currently lacking. We used a newly described cell-based high throughput screening assay to identify a number of highly potent antagonists of integrin CD11b/CD18 from chemical libraries containing >100,000 unique compounds. Computational analyses suggest that the identified compounds cluster into several different chemical classes. A number of the newly identified compounds blocked adhesion of wild-type mouse neutrophils to CD11b/CD18 ligand fibrinogen. Mapping the most active compounds against chemical fingerprints of known antagonists of related integrin CD11a/CD18 shows little structural similarity, suggesting that the newly identified compounds are novel and unique. PMID:20188705
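Clustering hit compounds and comparing them against known antagonists typically relies on fingerprint similarity. The sketch below shows the Tanimoto (Jaccard) measure on hand-made feature-bit sets; real work would derive fingerprints with a cheminformatics toolkit such as RDKit, and the bit sets and compound names here are invented.

```python
# Tanimoto similarity of structural-feature bit sets: |A & B| / |A | B|.
# Low similarity to known CD11a/CD18 antagonists would support the paper's
# conclusion that the new hits are structurally novel.
def tanimoto(fp_a, fp_b):
    return len(fp_a & fp_b) / len(fp_a | fp_b)

compounds = {
    "hit_1": {1, 4, 9, 12, 30},
    "hit_2": {1, 4, 9, 12, 31},                 # close analogue of hit_1
    "known_CD11a_antagonist": {2, 7, 18, 44, 51},
}

for name, fp in compounds.items():
    if name != "hit_1":
        print(f"hit_1 vs {name}: {tanimoto(compounds['hit_1'], fp):.2f}")
```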
Chicago's water market: Dynamics of demand, prices and scarcity rents
Ipe, V.C.; Bhagwat, S.B.
2002-01-01
Chicago and its suburbs are experiencing an increasing demand for water from a growing population and economy and may experience water scarcity in the near future. The Chicago metropolitan area has nearly depleted its groundwater resources to a point where interstate conflicts with Wisconsin could accompany an increased reliance on those sources. Further, withdrawals from Lake Michigan are limited by Supreme Court decree. The growing demand and indications of possible scarcity suggest a need to reexamine pricing policies and the dynamics of demand. The study analyses the demand for water and develops estimates of scarcity rents for water in Chicago. The price and income elasticities computed at the means are -0.002 and 0.0002, respectively. The estimated scarcity rents range from $0.98 to $1.17 per thousand gallons. The results indicate that current prices do not fully account for the scarcity rents and suggest a rate within the range of $1.53 to $1.72 per thousand gallons.
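Elasticities like those quoted above are conventionally estimated as slopes of a log-log demand regression; whether this study used exactly that specification is not stated, so the following OLS sketch on fabricated data is only illustrative (and uses larger elasticities than the tiny values reported, purely so the recovery is visible).

```python
# Log-log demand regression: the coefficient on log(price) is the price
# elasticity, the coefficient on log(income) the income elasticity.
import numpy as np

rng = np.random.default_rng(3)
n = 200
log_p = np.log(rng.uniform(1.0, 3.0, n))         # log price ($/thousand gallons)
log_m = np.log(rng.uniform(30, 90, n))           # log income ($1000s)
log_q = 5.0 - 0.5 * log_p + 0.3 * log_m + 0.05 * rng.standard_normal(n)

X = np.column_stack([np.ones(n), log_p, log_m])
beta, *_ = np.linalg.lstsq(X, log_q, rcond=None)
print(f"price elasticity = {beta[1]:.2f}, income elasticity = {beta[2]:.2f}")
```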
A discovery of novel microRNAs in the silkworm (Bombyx mori) genome.
Yu, Xiaomin; Zhou, Qing; Cai, Yimei; Luo, Qibin; Lin, Hongbin; Hu, Songnian; Yu, Jun
2009-12-01
MicroRNAs (miRNAs) are pivotal regulators involved in various physiological and pathological processes via their post-transcriptional regulation of gene expression. We sequenced 14 libraries of small RNAs constructed from samples spanning the life cycle of silkworms, and discovered 50 novel miRNAs not previously known in animals, verifying 43 of them using stem-loop RT-PCR. Our genome-wide analyses of 27 species-specific miRNAs suggest they arise from transposable elements, protein-coding gene duplication/transposition and random foldback sequences, which is consistent with the idea that novel animal miRNAs may evolve from incomplete self-complementary transcripts and become fixed in the process of co-adaptation with their targets. Computational prediction suggests that the silkworm-specific miRNAs may have a preference for regulating genes that are related to life-cycle-associated traits, and these genes can serve as potential targets for subsequent studies of the modulating networks in the development of Bombyx mori.
NASA Technical Reports Server (NTRS)
Miller, R. D.; Rogers, J. T.
1975-01-01
General requirements for dynamic loads analyses are described. The indicial lift growth function unsteady subsonic aerodynamic representation is reviewed, and the FLEXSTAB CPS is evaluated with respect to these general requirements. The effects of residual flexibility techniques on dynamic loads analyses are also evaluated using a simple dynamic model.
Rapid solution of large-scale systems of equations
NASA Technical Reports Server (NTRS)
Storaasli, Olaf O.
1994-01-01
The analysis and design of complex aerospace structures requires the rapid solution of large systems of linear and nonlinear equations, eigenvalue extraction for buckling, vibration and flutter modes, structural optimization and design sensitivity calculation. Computers with multiple processors and vector capabilities can offer substantial computational advantages over traditional scalar computers for these analyses. These computers fall into two categories: shared-memory computers and distributed-memory computers. This presentation covers general-purpose, highly efficient algorithms for generation/assembly of element matrices, solution of systems of linear and nonlinear equations, eigenvalue and design sensitivity analysis, and optimization. All algorithms are coded in FORTRAN for shared-memory computers and many are adapted to distributed-memory computers. The capability and numerical performance of these algorithms are addressed.
Grammar Is a System That Characterizes Talk in Interaction
Ginzburg, Jonathan; Poesio, Massimo
2016-01-01
Much of contemporary mainstream formal grammar theory is unable to provide analyses for language as it occurs in actual spoken interaction. Its analyses are developed for a cleaned-up version of language which omits the disfluencies, non-sentential utterances, gestures, and many other phenomena that are ubiquitous in spoken language. Using evidence from linguistics, conversation analysis, multimodal communication, psychology, language acquisition, and neuroscience, we show that these aspects of language use are rule-governed in much the same way as phenomena captured by conventional grammars. Furthermore, we argue that over the past few years some of the tools required to provide a precise characterization of such phenomena have begun to emerge in theoretical and computational linguistics; hence, there is no reason for treating them as “second class citizens” other than pre-theoretical assumptions about what should fall under the purview of grammar. Finally, we suggest that grammar formalisms covering such phenomena would provide a better foundation not just for linguistic analysis of face-to-face interaction, but also for sister disciplines, such as research on spoken dialogue systems and/or psychological work on language acquisition. PMID:28066279
NASA Technical Reports Server (NTRS)
Browder, J. A. (Principal Investigator); Rosenthal, A.; May, L. N., Jr.; Bauman, R. H.; Gosselink, J. G.
1985-01-01
The purpose of the project is to refine and validate a probabilistic spatial computer model through the analyses of thematic mapper imagery. The model is designed to determine how the interface between marshland and water changes as marshland is converted to water in a disintegrating marsh. Coastal marshland in Louisiana is disintegrating at the rate of approximately 40 sq mi a year, and an evaluation of the potential impact of this loss on the landings of estuarine-dependent fisheries is needed by fisheries managers. Understanding how marshland-water interface changes as coastal marshland is lost is essential to the process of evaluating fisheries effects, because several studies suggest that the production of estuarine-dependent fish and shellfish may be more closely related to the interface between marshland and water than to acreage of marshland. The need to address this practical problem has provided an opportunity to apply some scientifically interesting new techniques to the analyses of satellite imagery. Progress with the development of these techniques is the subject of this report.
Crosby, Richard A.; Mena, Leandro; Ricks, JaNelle
2018-01-01
This study applied an 8-item index of recent sexual risk behaviors to young Black men who have sex with men (YBMSM) and evaluated the distribution for normality. The distribution was tested for associations with possible antecedents of sexual risk. YBMSM (N=600), ages 16–29 years, were recruited from an STI clinic located in the Southern United States. Men completed an extensive audio computer-assisted self-interview. Thirteen possible antecedents of sexual risk, as assessed by the index, were selected for analyses. The 8-item index formed a normal distribution with a mean of 4.77 (sd=1.77). In adjusted analyses, not having completed education beyond high school was associated with less risk, as was having sex with females. Conversely, meeting sex partners online was associated with greater risk, as was reporting that sex partners were drunk during sex. The obtained normal distribution of sexual risk behaviors suggests a corresponding need to “target and tailor” clinic-based counseling and prevention services for YBMSM. Avoiding sex when partners are intoxicated may be an especially valuable goal of counseling sessions. PMID:27875903
Crosby, Richard A; Mena, Leandro; Ricks, JaNelle M
2017-06-01
This study applied an 8-item index of recent sexual-risk behaviors to young Black men who have sex with men (YBMSM) and evaluated the distribution for normality. The distribution was tested for associations with possible antecedents of sexual risk. YBMSM (N = 600), aged 16-29 years, were recruited from a sexually transmitted infection clinic, located in the southern US. Men completed an extensive audio computer-assisted self-interview. Thirteen possible antecedents of sexual risk, as assessed by the index, were selected for analyses. The 8-item index formed a normal distribution with a mean of 4.77 (SD = 1.77). In adjusted analyses, not having completed education beyond high school was associated with less risk, as was having sex with females. Conversely, meeting sex partners online was associated with greater risk, as was reporting that sex partners were drunk during sex. The obtained normal distribution of sexual-risk behaviors suggests a corresponding need to "target and tailor" clinic-based counseling and prevention services for YBMSM. Avoiding sex when partners are intoxicated may be an especially valuable goal of counseling sessions.
On the use of log-transformation vs. nonlinear regression for analyzing biological power laws
Xiao, X.; White, E.P.; Hooten, M.B.; Durham, S.L.
2011-01-01
Power-law relationships are among the most well-studied functional relationships in biology. Recently the common practice of fitting power laws using linear regression (LR) on log-transformed data has been criticized, calling into question the conclusions of hundreds of studies. It has been suggested that nonlinear regression (NLR) is preferable, but no rigorous comparison of these two methods has been conducted. Using Monte Carlo simulations, we demonstrate that the error distribution determines which method performs better, with NLR better characterizing data with additive, homoscedastic, normal error and LR better characterizing data with multiplicative, heteroscedastic, lognormal error. Analysis of 471 biological power laws shows that both forms of error occur in nature. While previous analyses based on log-transformation appear to be generally valid, future analyses should choose methods based on a combination of biological plausibility and analysis of the error distribution. We provide detailed guidelines and associated computer code for doing so, including a model averaging approach for cases where the error structure is uncertain. © 2011 by the Ecological Society of America.
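The comparison at issue is easy to reproduce in miniature: fit y = a·x^b both by linear regression on log-log data and by nonlinear least squares on the raw scale, then see which recovers the exponent under a given error structure. A sketch with simulated multiplicative lognormal error (so LR should win here), not the authors' published code:

```python
# LR on logs assumes multiplicative lognormal error; NLR on the raw scale
# assumes additive normal error. Simulate multiplicative error and compare.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(4)
x = np.linspace(1, 100, 200)
y = 2.0 * x ** 0.75 * np.exp(0.3 * rng.standard_normal(x.size))  # multiplicative

# (i) linear regression on log-transformed data
b_lr, log_a = np.polyfit(np.log(x), np.log(y), 1)

# (ii) nonlinear least squares on the raw scale
(a_nlr, b_nlr), _ = curve_fit(lambda x, a, b: a * x ** b, x, y, p0=(1.0, 1.0))

print(f"true b = 0.75, LR estimate = {b_lr:.3f}, NLR estimate = {b_nlr:.3f}")
```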
Proteomic and Biochemical Analyses of the Cotyledon and Root of Flooding-Stressed Soybean Plants
Komatsu, Setsuko; Makino, Takahiro; Yasue, Hiroshi
2013-01-01
Background: Flooding significantly reduces the growth and grain yield of soybean plants. Proteomic and biochemical techniques were used to determine whether the function of cotyledon and root is altered in soybean under flooding stress. Results: Two-day-old soybean plants were flooded for 2 days, after which the proteins from root and cotyledon were extracted for proteomic analysis. In response to flooding stress, the abundance of 73 and 28 proteins was significantly altered in the root and cotyledon, respectively. The accumulation of only one protein, 70 kDa heat shock protein (HSP70) (Glyma17g08020.1), increased in both organs following flooding. The ratio of protein abundance of HSP70 and biophoton emission in the cotyledon was higher than those detected in the root under flooding stress. Computed tomography and elemental analyses revealed that flooding stress decreases the number of calcium oxalate crystals in the cotyledon, indicating that calcium ion levels were elevated in the cotyledon under flooding stress. Conclusion: These results suggest that calcium might play a role, through HSP70, in the cotyledon under flooding stress. PMID:23799004
Defining objective clusters for rabies virus sequences using affinity propagation clustering
Fischer, Susanne; Freuling, Conrad M.; Pfaff, Florian; Bodenhofer, Ulrich; Höper, Dirk; Fischer, Mareike; Marston, Denise A.; Fooks, Anthony R.; Mettenleiter, Thomas C.; Conraths, Franz J.; Homeier-Bachmann, Timo
2018-01-01
Rabies is caused by lyssaviruses, and is one of the oldest known zoonoses. In recent years, more than 21,000 nucleotide sequences of rabies viruses (RABV), from the prototype species rabies lyssavirus, have been deposited in public databases. Subsequent phylogenetic analyses in combination with metadata suggest geographic distributions of RABV. However, these analyses experience technical difficulties in defining verifiable criteria for cluster allocations in phylogenetic trees, inviting a more rational approach. Therefore, we applied a relatively new mathematical clustering algorithm named ‘affinity propagation clustering’ (AP) to propose a standardized sub-species classification utilizing full-genome RABV sequences. AP has the advantage that it is computationally fast and works for any meaningful measure of similarity between data samples; it has previously been applied successfully in bioinformatics for analysis of microarray and gene expression data, although cluster analysis of sequences is still in its infancy. Existing (516) and original (46) full-genome RABV sequences were used to demonstrate the application of AP for RABV clustering. On a global scale, AP proposed four clusters, i.e. New World, Arctic/Arctic-like, Cosmopolitan, and Asian, as previously assigned by phylogenetic studies. By combining AP with established phylogenetic analyses, it is possible to resolve phylogenetic relationships between verifiably determined clusters and sequences. This workflow will be useful in confirming cluster distributions in a uniform, transparent manner, not only for RABV, but also for other comparative sequence analyses. PMID:29357361
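As a hedged sketch of the clustering step, the snippet below runs affinity propagation on a precomputed similarity matrix using scikit-learn rather than the authors' own toolchain; the toy one-dimensional "lineages" stand in for genome-scale sequence similarities. The key property exploited in the paper is that the number of clusters is not specified in advance.

```python
# Affinity propagation on precomputed similarities (higher = more similar).
# The algorithm chooses exemplars and the cluster count automatically.
import numpy as np
from sklearn.cluster import AffinityPropagation

rng = np.random.default_rng(5)
# two artificial "lineages": values near 0 and values near 10
values = np.concatenate([rng.normal(0, 0.5, 10), rng.normal(10, 0.5, 10)])
similarity = -np.abs(values[:, None] - values[None, :])   # negative distance

ap = AffinityPropagation(affinity="precomputed", random_state=0)
labels = ap.fit_predict(similarity)
print("cluster labels:", labels)       # expect two clusters, no k specified
```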
The moderating role of emotional competence in suicidal ideation among Chinese university students.
Kwok, Sylvia Y C L
2014-04-01
To explore the relationship among perceived family functioning, emotional competence and suicidal ideation and to examine the moderating role of emotional competence in suicidal ideation. Previous studies have highlighted that poor family relationships and emotional symptoms are significant predictors of suicidal ideation. However, the roles of perceived family functioning and emotional competence in predicting suicidal ideation have not been given adequate attention. A cross-sectional survey using convenience sampling. A questionnaire was administered to 302 university students from February-April in 2011 in Hong Kong. The means, standard deviations and Cronbach's alphas of the variables were computed. Pearson correlation analyses and hierarchical regression analyses were performed. Hierarchical regression analyses showed that perceived high family functioning and emotional competence were significant negative predictors of suicidal ideation. Further analyses showed that parental concern, parental control and creative use of emotions were significant predictors of suicidal ideation. Emotional competence, specifically creative use of emotions, was found to moderate the relationship between perceived family functioning and suicidal ideation. The findings support the family ecological framework and provide evidence for emotional competence as a resilience factor that buffers low family functioning on suicidal ideation. Suggested measures to decrease suicidal ideation include enhancing parental concern, lessening parental control, developing students' awareness, regulation and management of their own emotions, fostering empathy towards others' emotional expression, enhancing social skills in sharing and influencing others' emotions and increasing the positive use of emotions for the evaluation and generation of new ideas. © 2013 John Wiley & Sons Ltd.
ERIC Educational Resources Information Center
Pederson, Kathleen Marshall
The status of research on computer-assisted language learning (CALL) is explored beginning with a historical perspective of research on the language laboratory, followed by analyses of applied research on CALL. A theoretical base is provided to illustrate the need for more basic research on CALL that considers computer capabilities, learner…
COMPUTER-ASSISTED MOTION ANALYSIS OF SPERM FROM THE COMMON CARP
Computer-assisted semen analysis (CASA) technology was applied to the measurement of sperm motility parameters in the common carp Cyprinus carpio. Activated sperm were videotaped at 200 frames s-1 and analysed with the CellTrak/S CASA research system. The percentage of motile cel...
44 CFR 65.6 - Revision of base flood elevation determinations.
Code of Federal Regulations, 2014 CFR
2014-10-01
... AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program... new discharge estimates. (6) Any computer program used to perform hydrologic or hydraulic analyses in... control and/or the regulation of flood plain lands. For computer programs adopted by non-Federal agencies...
44 CFR 65.6 - Revision of base flood elevation determinations.
Code of Federal Regulations, 2013 CFR
2013-10-01
... AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program... new discharge estimates. (6) Any computer program used to perform hydrologic or hydraulic analyses in... control and/or the regulation of flood plain lands. For computer programs adopted by non-Federal agencies...
44 CFR 65.6 - Revision of base flood elevation determinations.
Code of Federal Regulations, 2012 CFR
2012-10-01
... AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program... new discharge estimates. (6) Any computer program used to perform hydrologic or hydraulic analyses in... control and/or the regulation of flood plain lands. For computer programs adopted by non-Federal agencies...
44 CFR 65.6 - Revision of base flood elevation determinations.
Code of Federal Regulations, 2011 CFR
2011-10-01
... AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program... new discharge estimates. (6) Any computer program used to perform hydrologic or hydraulic analyses in... control and/or the regulation of flood plain lands. For computer programs adopted by non-Federal agencies...
An Analysis of Methods Used to Examine Gender Differences in Computer-Related Behavior.
ERIC Educational Resources Information Center
Kay, Robin
1992-01-01
Review of research investigating gender differences in computer-related behavior examines statistical and methodological flaws. Issues addressed include sample selection, sample size, scale development, scale quality, the use of univariate and multivariate analyses, regressional analysis, construct definition, construct testing, and the…
Assessment of Situated Learning Using Computer Environments.
ERIC Educational Resources Information Center
Young, Michael
1995-01-01
Suggests that, based on a theory of situated learning, assessment must emphasize process as much as product. Several assessment examples are given, including a computer-based planning assistant for a mathematics and science video, suggestions for computer-based portfolio assessment, and speculations about embedded assessment of virtual situations.…
4P: fast computing of population genetics statistics from large DNA polymorphism panels
Benazzo, Andrea; Panziera, Alex; Bertorelle, Giorgio
2015-01-01
Massive DNA sequencing has significantly increased the amount of data available for population genetics and molecular ecology studies. However, the parallel computation of simple statistics within and between populations from large panels of polymorphic sites is not yet available, making the exploratory analyses of a set or subset of data a very laborious task. Here, we present 4P (parallel processing of polymorphism panels), a stand-alone software program for the rapid computation of genetic variation statistics (including the joint frequency spectrum) from millions of DNA variants in multiple individuals and multiple populations. It handles a standard input file format commonly used to store DNA variation from empirical or simulation experiments. The computational performance of 4P was evaluated using large SNP (single nucleotide polymorphism) datasets from human genomes or obtained by simulations. 4P was faster or much faster than other comparable programs, and the impact of parallel computing using multicore computers or servers was evident. 4P is a useful tool for biologists who need a simple and rapid computer program to run exploratory population genetics analyses in large panels of genomic data. It is also particularly suitable to analyze multiple data sets produced in simulation studies. Unix, Windows, and MacOs versions are provided, as well as the source code for easier pipeline implementations. PMID:25628874
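One example of the "simple statistics" such a tool computes is the folded site frequency spectrum, which vectorizes cleanly over millions of sites. A numpy sketch on a placeholder genotype matrix (0/1/2 alternate-allele counts per individual), not 4P's actual implementation:

```python
# Folded allele frequency spectrum and mean minor-allele frequency from a
# diploid genotype matrix, computed with vectorized numpy operations.
import numpy as np

rng = np.random.default_rng(6)
freqs = rng.uniform(0.01, 0.5, 10_000)                  # per-site allele frequencies
genotypes = rng.binomial(2, freqs, size=(100, 10_000))  # individuals x SNPs, 0/1/2

n_chrom = 2 * genotypes.shape[0]
alt_counts = genotypes.sum(axis=0)                      # alt-allele count per SNP
minor_counts = np.minimum(alt_counts, n_chrom - alt_counts)  # fold the spectrum
sfs = np.bincount(minor_counts, minlength=n_chrom // 2 + 1)

maf = minor_counts / n_chrom
print(f"mean minor-allele frequency = {maf.mean():.3f}; singleton sites = {sfs[1]}")
```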
Where next for the reproducibility agenda in computational biology?
Lewis, Joanna; Breeze, Charles E; Charlesworth, Jane; Maclaren, Oliver J; Cooper, Jonathan
2016-07-15
The concept of reproducibility is a foundation of the scientific method. With the arrival of fast and powerful computers over the last few decades, there has been an explosion of results based on complex computational analyses and simulations. The reproducibility of these results has been addressed mainly in terms of exact replicability or numerical equivalence, ignoring the wider issue of the reproducibility of conclusions through equivalent, extended or alternative methods. We use case studies from our own research experience to illustrate how concepts of reproducibility might be applied in computational biology. Several fields have developed 'minimum information' checklists to support the full reporting of computational simulations, analyses and results, and standardised data formats and model description languages can facilitate the use of multiple systems to address the same research question. We note the importance of defining the key features of a result to be reproduced, and the expected agreement between original and subsequent results. Dynamic, updatable tools for publishing methods and results are becoming increasingly common, but sometimes come at the cost of clear communication. In general, the reproducibility of computational research is improving but would benefit from additional resources and incentives. We conclude with a series of linked recommendations for improving reproducibility in computational biology through communication, policy, education and research practice. More reproducible research will lead to higher quality conclusions, deeper understanding and more valuable knowledge.
MultiPhyl: a high-throughput phylogenomics webserver using distributed computing
Keane, Thomas M.; Naughton, Thomas J.; McInerney, James O.
2007-01-01
With the number of fully sequenced genomes increasing steadily, there is greater interest in performing large-scale phylogenomic analyses from large numbers of individual gene families. Maximum likelihood (ML) has been shown repeatedly to be one of the most accurate methods for phylogenetic construction. Recently, there have been a number of algorithmic improvements in maximum-likelihood-based tree search methods. However, it can still take a long time to analyse the evolutionary history of many gene families using a single computer. Distributed computing refers to a method of combining the computing power of multiple computers in order to perform some larger overall calculation. In this article, we present the first high-throughput implementation of a distributed phylogenetics platform, MultiPhyl, capable of using the idle computational resources of many heterogeneous non-dedicated machines to form a phylogenetics supercomputer. MultiPhyl allows a user to upload hundreds or thousands of amino acid or nucleotide alignments simultaneously and perform computationally intensive tasks such as model selection, tree searching and bootstrapping of each of the alignments using many desktop machines. The program implements a set of 88 amino acid models and 56 nucleotide maximum likelihood models and a variety of statistical methods for choosing between alternative models. A MultiPhyl webserver is available for public use at: http://www.cs.nuim.ie/distributed/multiphyl.php. PMID:17553837
Bayomy, Hanaa; El Awadi, Mona; El Araby, Eman; Abed, Hala A
2016-12-01
Computer-assisted medical education has been developed to enhance learning and enable high-quality medical care. This study aimed to assess computer knowledge and attitude toward the inclusion of computers in medical education among second-year medical students in Benha Faculty of Medicine, Egypt, to identify limitations, and obtain suggestions for successful computer-based learning. This was a one-group pre-post-test study, which was carried out on second-year students in Benha Faculty of Medicine. A structured self-administered questionnaire was used to compare students' knowledge, attitude, limitations, and suggestions toward computer usage in medical education before and after the computer course to evaluate the change in students' responses. The majority of students were familiar with use of the mouse and keyboard, basic word processing, internet and web searching, and e-mail both before and after the computer course. The proportion of students who were familiar with software programs other than the word processing and trouble-shoot software/hardware was significantly higher after the course (P<0.001). There was a significant increase in the proportion of students who agreed on owning a computer (P=0.008), the inclusion of computer skills course in medical education, downloading lecture handouts, and computer-based exams (P<0.001) after the course. After the course, there was a significant increase in the proportion of students who agreed that the lack of central computers limited the inclusion of computer in medical education (P<0.001). Although the lack of computer labs, lack of Information Technology staff mentoring, large number of students, unclear course outline, and lack of internet access were more frequently reported before the course (P<0.001), the majority of students suggested the provision of computer labs, inviting Information Technology staff to support computer teaching, and the availability of free Wi-Fi internet access covering several areas in the university campus; all would support computer-assisted medical education. Medical students in Benha University are computer literate, which allows for computer-based medical education. Staff training, provision of computer labs, and internet access are essential requirements for enhancing computer usage in medical education in the university.
Reduced-Order Modeling: Cooperative Research and Development at the NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Beran, Philip S.; Cesnik, Carlos E. S.; Guendel, Randal E.; Kurdila, Andrew; Prazenica, Richard J.; Librescu, Liviu; Marzocca, Piergiovanni; Raveh, Daniella E.
2001-01-01
Cooperative research and development activities at the NASA Langley Research Center (LaRC) involving reduced-order modeling (ROM) techniques are presented. Emphasis is given to reduced-order methods and analyses based on Volterra series representations, although some recent results using Proper Orthogonal Decomposition (POD) are discussed as well. Results are reported for a variety of computational and experimental nonlinear systems to provide clear examples of the use of reduced-order models, particularly within the field of computational aeroelasticity. The need for and the relative performance (speed, accuracy, and robustness) of reduced-order modeling strategies is documented. The development of unsteady aerodynamic state-space models directly from computational fluid dynamics analyses is presented in addition to analytical and experimental identifications of Volterra kernels. Finally, future directions for this research activity are summarized.
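For readers unfamiliar with POD, its computational core is a singular value decomposition of a snapshot matrix, keeping the leading left singular vectors as a reduced basis. A minimal numpy sketch on synthetic snapshot data, not the LaRC codes themselves:

```python
# POD via SVD: columns are solution snapshots; the leading left singular
# vectors span a low-dimensional subspace capturing most of the "energy".
import numpy as np

rng = np.random.default_rng(7)
x = np.linspace(0, 1, 400)
snapshots = np.column_stack(
    [np.sin(2 * np.pi * k * x) + 0.01 * rng.standard_normal(400)
     for k in (1, 1, 2, 2, 3)])            # 400 dofs x 5 snapshots

U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.99) + 1)  # modes capturing 99% of energy
basis = U[:, :r]                            # reduced-order basis
print(f"kept {r} of {snapshots.shape[1]} modes")
```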
NASA Technical Reports Server (NTRS)
Covington, M. A.
2005-01-01
New tests and analyses are reported that were carried out to resolve testing uncertainties in the original development and qualification of a lightweight ablative material used for the Stardust spacecraft forebody heat shield. These additional arcjet tests and analyses confirmed the ablative and thermal performance of the low-density Phenolic Impregnated Carbon Ablator (PICA) material used for the Stardust design. Testing was done under conditions that simulate the peak convective heating conditions (1200 W/cm² and 0.5 atm) expected during Earth entry of the Stardust Sample Return Capsule. Test data and predictions from an ablative material response computer code for the in-depth temperatures were compared to guide iterative adjustment of the material thermophysical properties used in the code so that the measured and predicted temperatures agreed. The PICA recession rates and maximum internal temperatures were satisfactorily predicted by the computer code with the revised properties. Predicted recession rates were also in acceptable agreement with measured rates for heating conditions 37% greater than the nominal peak heating rate of 1200 W/cm². The measured in-depth temperature response data show consistent temperature-rise deviations that may be caused by an undocumented endothermic process within the PICA material that is not accurately modeled by the computer code. Predictions of the Stardust heat shield performance based on the present evaluation provide evidence that the maximum adhesive bondline temperature will be much lower than both the maximum allowable of 250 °C and an earlier design prediction. The re-evaluation also suggests that even with a 25 percent increase in peak heating rates, the total recession of the heat shield would be a small fraction of the as-designed thickness. These results give confidence in the Stardust heat shield design and confirm the potential of PICA material for use in new planetary probe and sample return applications.
Short, C E; James, E L; Rebar, A L; Duncan, M J; Courneya, K S; Plotnikoff, R C; Crutzen, R; Bidargaddi, N; Vandelanotte, C
2017-11-01
Participating in regular physical activity is a recommended cancer recovery strategy for breast cancer survivors. However, tailored support services are not widely available and most survivors are insufficiently active to obtain health benefits. Delivering tailored programs via the Internet offers one promising approach. However, recent evaluations of such programs suggest that major improvements are needed to ensure programs meet the needs of users and are delivered in an engaging way. Understanding participants' experiences with current programs can help to inform the next generation of systems. The purposes of this study are to explore breast cancer survivors' perspectives on and experiences of using a novel computer-tailored intervention, and to describe recommendations for future iterations. Qualitative data from a sub-sample of iMove More for Life study participants were analysed thematically to identify key themes. Participants' long-term goals for participating in the program were explored by analysing open-ended data extracted from action plans completed during the intervention (n = 370). Participants' negative and positive perceptions of the website and recommendations for improvement were explored using data extracted from open-ended survey items collected at the immediate intervention follow-up (n = 156). The majority of participants reported multi-faceted goals, consisting of two or more outcomes they hoped to achieve within a year. While clear themes were identified (e.g. 'being satisfied with body weight'), there was considerable variability in the scope of the goals (e.g. desired weight loss ranged from 2 to 30 kg). Participants' perceptions of the website were mixed, but clear indications were provided of how intervention content and structure could be improved. This study provides insight into how to better accommodate breast cancer survivors in the future and ultimately design more engaging computer-tailored interventions.
De Cocker, Katrien; De Bourdeaudhuij, Ilse; Cardon, Greet; Vandelanotte, Corneel
2017-05-03
Office workers demonstrate high levels of sitting on workdays. As sitting is positively associated with adverse health risks in adults, a theory-driven web-based computer-tailored intervention to influence workplace sitting, named 'Start to Stand,' was developed. The intervention was found to be effective in reducing self-reported workplace sitting among Flemish employees. The aim of this study was to investigate through which mechanisms the web-based computer-tailored intervention influenced self-reported workplace sitting. Employees (n = 155) participated in a clustered randomised controlled trial and reported socio-demographic (age, gender, education), work-related (hours at work, employment duration), health-related (weight and height, workplace sitting and physical activity) and psychosocial (knowledge, attitudes, self-efficacy, social support, intention regarding (changing) sitting behaviours) variables at baseline and 1-month follow-up. The product-of-coefficients test of MacKinnon, based on multiple linear regression analyses, was conducted to examine the mediating role of five psychosocial factors (knowledge, attitudes, self-efficacy, social support, intention). The influence of one self-regulation skill (action planning) in the association between the intervention and self-reported workplace sitting time was investigated via moderation analyses. The intervention had a positive influence on knowledge (p = 0.040), but none of the psychosocial variables mediated the intervention effect on self-reported workplace sitting. Action planning was found to be a significant moderator (p < 0.001), as the decrease in self-reported workplace sitting only occurred in the group completing an action plan. It is suggested that future interventions aimed at reducing employees' workplace sitting focus on self-regulatory skills and promote action planning when using web-based computer-tailored advice. ClinicalTrials.gov NCT02672215 (Archived by WebCite at https://clinicaltrials.gov/ct2/show/NCT02672215 ).
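The product-of-coefficients approach lends itself to a compact illustration. Below is a minimal sketch, on simulated data, of MacKinnon's test for one candidate mediator; the variable names, effect sizes, and data are hypothetical, not taken from the study.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 155
group = rng.integers(0, 2, n).astype(float)        # 0 = control, 1 = intervention
knowledge = 0.4 * group + rng.normal(size=n)       # candidate mediator (path a)
sitting = -0.5 * group + 0.0 * knowledge + rng.normal(size=n)  # outcome; no true mediation

# Path a: intervention -> mediator
fit_a = sm.OLS(knowledge, sm.add_constant(group)).fit()
a, se_a = fit_a.params[1], fit_a.bse[1]

# Path b: mediator -> outcome, controlling for the intervention
X = sm.add_constant(np.column_stack([group, knowledge]))
fit_b = sm.OLS(sitting, X).fit()
b, se_b = fit_b.params[2], fit_b.bse[2]

# Product of coefficients with a first-order (Sobel) standard error
ab = a * b
se_ab = np.sqrt(a**2 * se_b**2 + b**2 * se_a**2)
print(f"indirect effect = {ab:.3f}, z = {ab / se_ab:.2f}")  # |z| < 1.96: no mediation
```

Here the simulated mediator has no effect on the outcome, so the indirect effect a*b is expected to be non-significant, mirroring the study's null mediation finding.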
Sampling factors influencing accuracy of sperm kinematic analysis.
Owen, D H; Katz, D F
1993-01-01
Sampling conditions that influence the accuracy of experimental measurement of sperm head kinematics were studied by computer simulation methods. Several archetypal sperm trajectories were studied. First, mathematical models of typical flagellar beats were input to hydrodynamic equations of sperm motion. The instantaneous swimming velocities of such sperm were computed over sequences of flagellar beat cycles, from which the resulting trajectories were determined. In a second, idealized approach, direct mathematical models of trajectories were utilized, based upon similarities to the previous hydrodynamic constructs. In general, it was found that analyses of sampling factors produced similar results for the hydrodynamic and idealized trajectories. A number of experimental sampling factors were studied, including the number of sperm head positions measured per flagellar beat and the time interval over which these measurements are taken. It was found that when one flagellar beat is sampled, values of amplitude of lateral head displacement (ALH) and linearity (LIN) approached their actual values when five or more sample points per beat were taken. Mean angular displacement (MAD) values, however, remained sensitive to sampling rate even when large sampling rates were used. Values of MAD were also much more sensitive to the initial starting point of the sampling procedure than were ALH or LIN. On the basis of these analyses of measurement accuracy for individual sperm, simulations were then performed of cumulative effects when studying entire populations of motile cells. It was found that substantial (double-digit) errors occurred in the mean values of curvilinear velocity (VCL), LIN, and MAD under the conditions of 30 video frames per second and 0.5 seconds of analysis time. Increasing the analysis interval to 1 second did not appreciably improve the results. However, increasing the analysis rate to 60 frames per second significantly reduced the errors. These findings thus suggest that computer-aided sperm analysis (CASA) application at 60 frames per second will significantly improve the accuracy of kinematic analysis in most applications to human and other mammalian sperm.
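As an illustration of how frame rate distorts kinematic measures, here is a minimal sketch that samples an idealized sinusoidal head trajectory at several frame rates and computes basic CASA-style measures; the amplitude, beat frequency, and progression speed are assumptions for illustration, not values from the study.

```python
import numpy as np

def kinematics(t, x, y):
    """Simplified CASA-style kinematic measures from sampled head positions."""
    path = np.hypot(np.diff(x), np.diff(y)).sum()
    vcl = path / (t[-1] - t[0])                                   # curvilinear velocity
    vsl = np.hypot(x[-1] - x[0], y[-1] - y[0]) / (t[-1] - t[0])   # straight-line velocity
    return vcl, vsl / vcl                                          # (VCL, LIN)

# Idealized trajectory: 10 um lateral amplitude, 10 Hz beat, 100 um/s progression
beat_hz, duration = 10.0, 0.5
for fps in (30, 60, 200):
    t = np.arange(0, duration, 1.0 / fps)
    x = 100.0 * t                                  # steady forward progression
    y = 10.0 * np.sin(2 * np.pi * beat_hz * t)     # lateral head oscillation
    vcl, lin = kinematics(t, x, y)
    print(f"{fps:>3} fps: VCL = {vcl:6.1f} um/s  LIN = {lin:.2f}")
```

With a 10 Hz beat, 30 fps gives only three samples per beat cycle, so the sampled curvilinear path (and hence VCL) is substantially underestimated, while 200 fps approaches the true values.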
Vocal Imitations of Non-Vocal Sounds
Houix, Olivier; Voisin, Frédéric; Misdariis, Nicolas; Susini, Patrick
2016-01-01
Imitative behaviors are widespread in humans, in particular whenever two persons communicate and interact. Several tokens of spoken languages (onomatopoeias, ideophones, and phonesthemes) also display different degrees of iconicity between the sound of a word and what it refers to. Thus, it probably comes as no surprise that human speakers use many imitative vocalizations and gestures when they communicate about sounds, as sounds are notably difficult to describe. What is more surprising is that vocal imitations of non-vocal everyday sounds (e.g. the sound of a car passing by) are in practice very effective: listeners identify sounds better with vocal imitations than with verbal descriptions, despite the fact that vocal imitations are inaccurate reproductions of a sound created by a particular mechanical system (e.g. a car driving by) through a different system (the voice apparatus). The present study investigated the semantic representations evoked by vocal imitations of sounds by experimentally quantifying how well listeners could match sounds to category labels. The experiment used three different types of sounds: recordings of easily identifiable sounds (sounds of human actions and manufactured products), human vocal imitations, and computational “auditory sketches” (created by algorithmic computations). The results show that performance with the best vocal imitations was similar to that with the best auditory sketches for most categories of sounds, and even to that with the referent sounds themselves in some cases. More detailed analyses showed that the acoustic distance between a vocal imitation and a referent sound is not sufficient to account for such performance. The analyses suggested that instead of trying to reproduce the referent sound as accurately as vocally possible, vocal imitations focus on a few important features, which depend on each particular sound category. These results offer perspectives for understanding how human listeners store and access long-term sound representations, and set the stage for the development of human-computer interfaces based on vocalizations. PMID:27992480
Methods for Functional Connectivity Analyses
2012-12-13
...motor, or hand motor function (green, red, or blue shading, respectively). Thus, this work produced the first comprehensive analysis of ECoG-based functional connectivity. (Affiliations: New York State Department of Health, Albany, NY, USA; Department of Electrical and Computer Engineering, University of Texas at El Paso, TX, USA; Department of Neurology, Albany Medical College, Albany, NY, USA.)
Gauss Elimination: Workhorse of Linear Algebra.
1995-08-05
Gauss elimination (GE) is the workhorse of linear algebra computation for solving linear systems, computing determinants, and determining the rank of a matrix. All of these are discussed in varying contexts. These include different arithmetic or algebraic settings, such as integer arithmetic or polynomial rings, as well as conventional real (floating-point) arithmetic. These settings affect both the accuracy and the complexity analyses of the algorithm, and these, too, are covered here. The impact of modern parallel computer architecture on GE is also discussed.
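For concreteness, here is a minimal sketch of GE with partial pivoting in conventional floating-point arithmetic, returning the solution and the determinant (rank could be recovered in a rank-revealing variant by counting nonzero pivots); the example system is illustrative only.

```python
import numpy as np

def gauss_solve(A, b, tol=1e-12):
    """Gaussian elimination with partial pivoting: solve Ax = b and return det(A)."""
    A, b = A.astype(float).copy(), b.astype(float).copy()
    n, det = len(b), 1.0
    for k in range(n):
        p = k + np.argmax(np.abs(A[k:, k]))      # partial pivoting for stability
        if abs(A[p, k]) < tol:
            raise ValueError("matrix is singular to working precision")
        if p != k:
            A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]
            det = -det                           # a row swap flips the determinant sign
        det *= A[k, k]                           # determinant = signed product of pivots
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):               # back substitution
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x, det

A = np.array([[2.0, 1.0, -1.0], [-3.0, -1.0, 2.0], [-2.0, 1.0, 2.0]])
b = np.array([8.0, -11.0, -3.0])
x, det = gauss_solve(A, b)
print(x, det)   # expected: x = [2, 3, -1], det = -1
```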
Boissin, Constance; Blom, Lisa; Wallis, Lee; Laflamme, Lucie
2017-02-01
Mobile health has promising potential in improving healthcare delivery by facilitating access to expert advice. Enabling experts to review images on their smartphone or tablet may save valuable time. This study aims at assessing whether images viewed by medical specialists on handheld devices such as smartphones and tablets are perceived to be of comparable quality as when viewed on a computer screen. This was a prospective study comparing the perceived quality of 18 images on three different display devices (smartphone, tablet and computer) by 27 participants (4 burn surgeons and 23 emergency medicine specialists). The images, presented in random order, covered clinical (dermatological conditions, burns, ECGs and X-rays) and non-clinical subjects and their perceived quality was assessed using a 7-point Likert scale. Differences in devices' quality ratings were analysed using linear regression models for clustered data adjusting for image type and participants' characteristics (age, gender and medical specialty). Overall, the images were rated good or very good in most instances and more so for the smartphone (83.1%, mean score 5.7) and tablet (78.2%, mean 5.5) than for a standard computer (70.6%, mean 5.2). Both handheld devices had significantly higher ratings than the computer screen, even after controlling for image type and participants' characteristics. Nearly all experts expressed that they would be comfortable using smartphones (n=25) or tablets (n=26) for image-based teleconsultation. This study suggests that handheld devices could be a substitute for computer screens for teleconsultation by physicians working in emergency settings. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Development and assessment of a chemistry-based computer video game as a learning tool
NASA Astrophysics Data System (ADS)
Martinez-Hernandez, Kermin Joel
The chemistry-based computer video game is a multidisciplinary collaboration between chemistry and computer graphics and technology fields developed to explore the use of video games as a possible learning tool. This innovative approach aims to integrate elements of commercial video game and authentic chemistry context environments into a learning experience through gameplay. The project consists of three areas: development, assessment, and implementation. However, the foci of this study were the development and assessment of the computer video game including possible learning outcomes and game design elements. A chemistry-based game using a mixed genre of a single player first-person game embedded with action-adventure and puzzle components was developed to determine if students' level of understanding of chemistry concepts change after gameplay intervention. Three phases have been completed to assess students' understanding of chemistry concepts prior and after gameplay intervention. Two main assessment instruments (pre/post open-ended content survey and individual semi-structured interviews) were used to assess student understanding of concepts. In addition, game design elements were evaluated for future development phases. Preliminary analyses of the interview data suggest that students were able to understand most of the chemistry challenges presented in the game and the game served as a review for previously learned concepts as well as a way to apply such previous knowledge. To guarantee a better understanding of the chemistry concepts, additions such as debriefing and feedback about the content presented in the game seem to be needed. The use of visuals in the game to represent chemical processes, game genre, and game idea appear to be the game design elements that students like the most about the current computer video game.
RNA-Seq Analysis to Measure the Expression of SINE Retroelements.
Román, Ángel Carlos; Morales-Hernández, Antonio; Fernández-Salguero, Pedro M
2016-01-01
The intrinsic features of retroelements, like their repetitive nature and disseminated presence in their host genomes, demand the use of advanced methodologies for their bioinformatic and functional study. The short length of SINE (short interspersed elements) retrotransposons makes such analyses even more complex. Next-generation sequencing (NGS) technologies are currently one of the most widely used tools to characterize the whole repertoire of gene expression in a specific tissue. In this chapter, we will review the molecular and computational methods needed to perform NGS analyses on SINE elements. We will also describe new methods of potential interest for researchers studying repetitive elements. We intend to outline the general ideas behind the computational analyses of NGS data obtained from SINE elements, and to stimulate other scientists to expand our current knowledge on SINE biology using RNA-seq and other NGS tools.
NASA Technical Reports Server (NTRS)
Manderscheid, J. M.; Kaufman, A.
1985-01-01
Turbine blades for reusable space propulsion systems are subject to severe thermomechanical loading cycles that result in large inelastic strains and very short lives. These components require the use of anisotropic high-temperature alloys to meet the safety and durability requirements of such systems. To assess the effects on blade life of material anisotropy, cyclic structural analyses are being performed for the first stage high-pressure fuel turbopump blade of the space shuttle main engine. The blade alloy is directionally solidified MAR-M 246 alloy. The analyses are based on a typical test stand engine cycle. Stress-strain histories at the airfoil critical location are computed using the MARC nonlinear finite-element computer code. The MARC solutions are compared to cyclic response predictions from a simplified structural analysis procedure developed at the NASA Lewis Research Center.
Behavior of auxetic structures under compression and impact forces
NASA Astrophysics Data System (ADS)
Yang, Chulho; Vora, Hitesh D.; Chang, Young
2018-02-01
In recent years, various auxetic material structures, which use normal materials that follow Hooke’s law yet exhibit negative Poisson’s ratios (NPR), have been designed and fabricated for diverse applications. One potential application is body protection pads that are comfortable to wear and effective in protecting body parts by reducing impact force and preventing injuries in high-risk individuals such as elderly people, industrial workers, law enforcement and military personnel, and athletes. This paper reports an integrated theoretical, computational, and experimental investigation conducted for typical auxetic materials that exhibit NPR properties. Parametric 3D CAD models of auxetic structures such as re-entrant hexagonal cells and arrowheads were developed. Then, key structural characteristics of protection pads were evaluated through static analyses of FEA models. Finally, impact analyses were conducted through dynamic simulations of FEA models to validate the results obtained from the static analyses. Efforts were also made to relate the individual and/or combined effect of auxetic structures and materials to the overall stiffness and shock-absorption performance of the protection pads. An advanced additive manufacturing (3D printing) technique was used to build prototypes of the auxetic structures. Three different materials typically used for fused deposition modeling technology, namely polylactic acid (PLA) and two thermoplastic polyurethane materials (NinjaFlex® and SemiFlex®), were used for their different stiffness and shock-absorption properties. The 3D printed prototypes were then tested and the results were compared to the computational predictions. The results showed that the auxetic material could be effective in reducing the shock forces. Each structure and material combination demonstrated unique structural properties such as stiffness, Poisson’s ratio, and efficiency in shock absorption. Auxetic structures showed better shock absorption performance than non-auxetic ones. These results suggest a mechanism of input force distribution or shunting that can guide the design of protectors with various shapes, thicknesses, and auxetic materials to reduce the risk of injury.
McGuire, Mary F; Sriram Iyengar, M; Mercer, David W
2012-04-01
Although trauma is the leading cause of death for those below 45 years of age, there is a dearth of information about the temporal behavior of the underlying biological mechanisms in those who survive the initial trauma only to later suffer from syndromes such as multiple organ failure. Levels of serum cytokines potentially affect the clinical outcomes of trauma; understanding how cytokine levels modulate intra-cellular signaling pathways can yield insights into molecular mechanisms of disease progression and help to identify targeted therapies. However, developing such analyses is challenging since it necessitates the integration and interpretation of large amounts of heterogeneous, quantitative and qualitative data. Here we present the Pathway Semantics Algorithm (PSA), an algebraic process of node and edge analyses of evoked biological pathways over time for in silico discovery of biomedical hypotheses, using data from a prospective controlled clinical study of the role of cytokines in multiple organ failure (MOF) at a major US trauma center. A matrix algebra approach was used in both the PSA node and PSA edge analyses, with different matrix configurations and computations based on the biomedical questions to be examined. In the edge analysis, a percentage measure of crosstalk called XTALK was also developed to assess cross-pathway interference. In the node/molecular analysis of the first 24 h from trauma, PSA uncovered seven molecules evoked computationally that differentiated outcomes of MOF or non-MOF (NMOF), of which three molecules had not been previously associated with any shock/trauma syndrome. In the edge/molecular interaction analysis, PSA examined four categories of functional molecular interaction relationships--activation, expression, inhibition, and transcription--and found that the interaction patterns and crosstalk changed over time and outcome. The PSA edge analysis suggests that a diagnosis, prognosis or therapy based on molecular interaction mechanisms may be most effective within a certain time period and for a specific functional relationship. Copyright © 2011 Elsevier Inc. All rights reserved.
Statistical analyses and computational prediction of helical kinks in membrane proteins
NASA Astrophysics Data System (ADS)
Huang, Y.-H.; Chen, C.-M.
2012-10-01
We have carried out statistical analyses and computer simulations of helical kinks for TM helices in the PDBTM database. About 59% of 1,562 TM helices showed a significant kink, and 38% of these kinks are associated with prolines in a range of ±4 residues. Our analyses show that helical kinks are more populated in the central region of helices, particularly in the range of 1-3 residues away from the helix center. Among 1,053 helical kinks analyzed, 88% of kinks are bends (change in helix axis without loss of helical character) and 12% are disruptions (change in helix axis and loss of helical character). It is found that proline residues tend to cause larger kink angles in helical bends, while this effect is not observed in helical disruptions. A further analysis of these kinked helices suggests that a kinked helix usually has 1-2 broken backbone hydrogen bonds with the corresponding N-O distance in the range of 4.2-8.7 Å, whose distribution is sharply peaked at 4.9 Å followed by an exponential decay with increasing distance. The main aims of this study are to understand the formation of helical kinks and to predict their structural features. We therefore performed molecular dynamics (MD) simulations under four simulation scenarios to investigate kink formation in 37 kinked TM helices and 5 unkinked TM helices. The representative models of these kinked helices are predicted by a clustering algorithm, SPICKER, from numerous decoy structures possessing the above generic features of kinked helices. Our results show an accuracy of 95% in predicting the kink position of kinked TM helices and an angle-prediction error of less than 10° for 71.4% of kinked helices. For unkinked helices, based on various structure similarity tests, our predicted models are highly consistent with their crystal structures. These results provide strong support for the validity of our method in predicting the structure of TM helices.
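One common way to quantify a kink, sketched below, is to fit a principal axis to the helix segment on either side of a putative kink position and measure the angle between the two axes. The geometry here is synthetic; the coordinates, segment lengths, per-residue rise, and the 30-degree bend are assumptions for illustration, not data from the study.

```python
import numpy as np

def axis_direction(coords):
    """Principal axis of a helix segment (first right singular vector of centered coords)."""
    centered = coords - coords.mean(axis=0)
    _, _, vt = np.linalg.svd(centered)
    d = vt[0]
    # orient the axis along the chain direction so the angle is unambiguous
    return d if d @ (coords[-1] - coords[0]) > 0 else -d

def kink_angle(coords, kink_idx):
    """Angle (degrees) between the segment axes before and after a putative kink."""
    d1 = axis_direction(coords[:kink_idx])
    d2 = axis_direction(coords[kink_idx:])
    return np.degrees(np.arccos(np.clip(d1 @ d2, -1.0, 1.0)))

# Synthetic test: two straight 12-residue segments meeting at a 30-degree bend
rise = 1.5                                   # assumed rise per residue (angstroms)
seg1 = np.array([[0.0, 0.0, i * rise] for i in range(12)])
theta = np.radians(30)
d = np.array([np.sin(theta), 0.0, np.cos(theta)])
seg2 = seg1[-1] + np.outer(np.arange(1, 13), d * rise)
print(f"estimated kink angle: {kink_angle(np.vstack([seg1, seg2]), 12):.1f} deg")
```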
Identification of Computational and Experimental Reduced-Order Models
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Hong, Moeljo S.; Bartels, Robert E.; Piatak, David J.; Scott, Robert C.
2003-01-01
The identification of computational and experimental reduced-order models (ROMs) for the analysis of unsteady aerodynamic responses and for efficient aeroelastic analyses is presented. For the identification of a computational aeroelastic ROM, the CFL3Dv6.0 computational fluid dynamics (CFD) code is used. Flutter results for the AGARD 445.6 Wing and for a Rigid Semispan Model (RSM) computed using CFL3Dv6.0 are presented, including discussion of associated computational costs. Modal impulse responses of the unsteady aerodynamic system are computed using the CFL3Dv6.0 code and transformed into state-space form. The unsteady aerodynamic state-space ROM is then combined with a state-space model of the structure to create an aeroelastic simulation using the MATLAB/SIMULINK environment. The MATLAB/SIMULINK ROM is then used to rapidly compute aeroelastic transients, including flutter. The ROM shows excellent agreement with the aeroelastic analyses computed using the CFL3Dv6.0 code directly. For the identification of experimental unsteady pressure ROMs, results are presented for two configurations: the RSM and a Benchmark Supercritical Wing (BSCW). Both models were used to acquire unsteady pressure data due to pitching oscillations on the Oscillating Turntable (OTT) system at the Transonic Dynamics Tunnel (TDT). A deconvolution scheme involving a step input in pitch and the resultant step response in pressure, for several pressure transducers, is used to identify the unsteady pressure impulse responses. The identified impulse responses are then used to predict the pressure responses due to pitching oscillations at several frequencies. Comparisons with the experimental data are then presented.
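The step-input deconvolution idea admits a compact illustration: for a sampled linear time-invariant system, the impulse response is approximately the first difference of the step response, and responses to arbitrary inputs then follow by convolution. The sketch below uses a surrogate second-order system in place of real unsteady pressure data; the dynamics, sample rate, and pitch input are assumptions for illustration.

```python
import numpy as np
from scipy.signal import lti

dt, n = 0.01, 400
t = np.arange(n) * dt

# Surrogate "unsteady pressure" dynamics: a damped second-order system (assumed)
wn, zeta = 20.0, 0.15
sys = lti([wn**2], [1.0, 2 * zeta * wn, wn**2])

# Identified impulse response = first difference of the (simulated) step response
_, step_resp = sys.step(T=t)
h = np.diff(step_resp, prepend=0.0)

# Predict the response to a pitching oscillation by convolution, then check it
u = np.sin(2 * np.pi * 2.0 * t)            # 2 Hz pitch input
y_pred = np.convolve(u, h)[:n]
_, y_true, _ = sys.output(u, t)            # direct simulation for comparison
print(f"max prediction error: {np.abs(y_pred - y_true).max():.3e}")
```

The small residual error reflects the sampling approximation; finer time steps tighten the agreement, which is the essence of using an identified ROM in place of repeated direct simulation.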
Protecting genomic data analytics in the cloud: state of the art and opportunities.
Tang, Haixu; Jiang, Xiaoqian; Wang, Xiaofeng; Wang, Shuang; Sofia, Heidi; Fox, Dov; Lauter, Kristin; Malin, Bradley; Telenti, Amalio; Xiong, Li; Ohno-Machado, Lucila
2016-10-13
The outsourcing of genomic data into public cloud computing settings raises concerns over privacy and security. Significant advancements in secure computation methods have emerged over the past several years, but such techniques need to be rigorously evaluated for their ability to support the analysis of human genomic data in an efficient and cost-effective manner. With respect to public cloud environments, there are concerns about the inadvertent exposure of human genomic data to unauthorized users. In analyses involving multiple institutions, there is additional concern about data being used beyond the agreed research scope and being processed in untrusted computational environments, which may not satisfy institutional policies. To systematically investigate these issues, the NIH-funded National Center for Biomedical Computing iDASH (integrating Data for Analysis, 'anonymization' and SHaring) hosted the second Critical Assessment of Data Privacy and Protection competition to assess the capacity of cryptographic technologies for protecting computation over human genomes in the cloud and promoting cross-institutional collaboration. Data scientists were challenged to design and engineer practical algorithms for secure outsourcing of genome computation tasks in working software, whereby analyses are performed only on encrypted data. They were also challenged to develop approaches to enable secure collaboration on data from genomic studies generated by multiple organizations (e.g., medical centers) to jointly compute aggregate statistics without sharing individual-level records. The results of the competition indicated that secure computation techniques can enable comparative analysis of human genomes, but greater efficiency (in terms of compute time and memory utilization) is needed before they are sufficiently practical for real-world environments.
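As a toy illustration of the "analyses performed only on encrypted data" idea, the sketch below uses additively homomorphic Paillier encryption via the third-party python-paillier (`phe`) package, which is assumed to be installed; the counts are invented, and real secure-genomics pipelines use considerably more machinery (e.g., lattice-based homomorphic encryption or secure multiparty computation).

```python
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# Two institutions each encrypt their local allele counts (invented numbers)
site_a_counts = [12, 7, 31]
site_b_counts = [9, 14, 22]
enc_a = [public_key.encrypt(c) for c in site_a_counts]
enc_b = [public_key.encrypt(c) for c in site_b_counts]

# An untrusted aggregator sums ciphertexts without seeing any individual count
enc_totals = [x + y for x, y in zip(enc_a, enc_b)]

# Only the key holder can decrypt the joint aggregate statistics
print([private_key.decrypt(c) for c in enc_totals])   # [21, 21, 53]
```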
Monitoring Collaborative Activities in Computer Supported Collaborative Learning
ERIC Educational Resources Information Center
Persico, Donatella; Pozzi, Francesca; Sarti, Luigi
2010-01-01
Monitoring the learning process in computer supported collaborative learning (CSCL) environments is a key element for supporting the efficacy of tutor actions. This article proposes an approach for analysing learning processes in a CSCL environment to support tutors in their monitoring tasks. The approach entails tracking the interactions within…
Evaluation and Assessment of a Biomechanics Computer-Aided Instruction.
ERIC Educational Resources Information Center
Washington, N.; Parnianpour, M.; Fraser, J. M.
1999-01-01
Describes the Biomechanics Tutorial, a computer-aided instructional tool that was developed at Ohio State University to expedite the transition from lecture to application for undergraduate students. Reports evaluation results that used statistical analyses and student questionnaires to show improved performance on posttests as well as positive…
Quantum computation with cold bosonic atoms in an optical lattice.
García-Ripoll, Juan José; Cirac, Juan Ignacio
2003-07-15
We analyse an implementation of a quantum computer using bosonic atoms in an optical lattice. We show that, even though the number of atoms per site and the tunnelling rate between neighbouring sites are unknown, one may operate a universal set of gates by means of adiabatic passage.
Designing for Learner Engagement with Computer Based Testing
ERIC Educational Resources Information Center
Walker, Richard; Handley, Zoe
2016-01-01
The issues influencing student engagement with high-stakes computer-based exams were investigated, drawing on feedback from two cohorts of international MA Education students encountering this assessment method for the first time. Qualitative data from surveys and focus groups on the students' examination experience were analysed, leading to the…
Effects of Computer-Based Training on Procedural Modifications to Standard Functional Analyses
ERIC Educational Resources Information Center
Schnell, Lauren K.; Sidener, Tina M.; DeBar, Ruth M.; Vladescu, Jason C.; Kahng, SungWoo
2018-01-01
Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to…
CSM parallel structural methods research
NASA Technical Reports Server (NTRS)
Storaasli, Olaf O.
1989-01-01
Parallel structural methods, research team activities, advanced architecture computers for parallel computational structural mechanics (CSM) research, the FLEX/32 multicomputer, a parallel structural analyses testbed, blade-stiffened aluminum panel with a circular cutout and the dynamic characteristics of a 60 meter, 54-bay, 3-longeron deployable truss beam are among the topics discussed.
Conformational dynamics of proanthocyanidins: physical and computational approaches
Fred L. Tobiason; Richard W. Hemingway; T. Hatano
1998-01-01
The interaction of plant polyphenols with proteins accounts for a good part of their commercial (e.g., leather manufacture) and biological (e.g., antimicrobial activity) significance. The interplay between observations of physical data such as crystal structure, NMR analyses, and time-resolved fluorescence with results of computational chemistry approaches has been...
Wan, Shixiang; Zou, Quan
2017-01-01
Multiple sequence alignment (MSA) plays a key role in biological sequence analyses, especially in phylogenetic tree construction. The extreme increase in next-generation sequencing output has led to a shortage of efficient alignment approaches for ultra-large sets (e.g. files larger than 1 GB) of different sequence types. Distributed and parallel computing represents a crucial technique for accelerating such ultra-large sequence analyses. Based on HAlign and the Spark distributed computing system, we implement a highly cost-efficient and time-efficient HAlign-II tool to address ultra-large multiple biological sequence alignment and phylogenetic tree construction. Experiments on large-scale DNA and protein data sets, with files larger than 1 GB, showed that HAlign-II saves both time and space and outperforms current software tools. HAlign-II can efficiently carry out MSA and construct phylogenetic trees with ultra-large numbers of biological sequences; it shows extremely high memory efficiency and scales well with increases in computing resources. HAlign-II provides a user-friendly web server based on our distributed computing infrastructure. HAlign-II with open-source codes and datasets was established at http://lab.malab.cn/soft/halign.
Linear regression metamodeling as a tool to summarize and present simulation model results.
Jalal, Hawre; Dowd, Bryan; Sainfort, François; Kuntz, Karen M
2013-10-01
Modelers lack a tool to systematically and clearly present complex model results, including those from sensitivity analyses. The objective was to propose linear regression metamodeling as a tool to increase transparency of decision analytic models and better communicate their results. We used a simplified cancer cure model to demonstrate our approach. The model computed the lifetime cost and benefit of 3 treatment options for cancer patients. We simulated 10,000 cohorts in a probabilistic sensitivity analysis (PSA) and regressed the model outcomes on the standardized input parameter values in a set of regression analyses. We used the regression coefficients to describe measures of sensitivity analyses, including threshold and parameter sensitivity analyses. We also compared the results of the PSA to deterministic full-factorial and one-factor-at-a-time designs. The regression intercept represented the estimated base-case outcome, and the other coefficients described the relative parameter uncertainty in the model. We defined simple relationships that compute the average and incremental net benefit of each intervention. Metamodeling produced outputs similar to traditional deterministic 1-way or 2-way sensitivity analyses but was more reliable since it used all parameter values. Linear regression metamodeling is a simple, yet powerful, tool that can assist modelers in communicating model characteristics and sensitivity analyses.
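A minimal sketch of the metamodeling recipe on a made-up decision model (the distributions, willingness-to-pay value, and parameter names are illustrative, not the paper's cancer cure model): simulate PSA draws, standardize the inputs, and regress the outcome on them; the intercept then approximates the base-case outcome and the coefficient magnitudes rank parameter influence.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 10_000

# Hypothetical PSA draws for three inputs of a made-up decision model
p_cure = rng.beta(20, 80, n)            # probability of cure
cost_tx = rng.gamma(100, 50, n)         # treatment cost
qaly_gain = rng.normal(2.0, 0.3, n)     # QALYs gained if cured

wtp = 50_000                            # willingness to pay per QALY
net_benefit = wtp * p_cure * qaly_gain - cost_tx   # model outcome per draw

# Standardize the inputs, then regress the outcome on them
X = np.column_stack([p_cure, cost_tx, qaly_gain])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
meta = LinearRegression().fit(Xs, net_benefit)

print(f"intercept (approx. base-case net benefit): {meta.intercept_:,.0f}")
for name, coef in zip(["p_cure", "cost_tx", "qaly_gain"], meta.coef_):
    print(f"{name:>10}: {coef:>10,.0f}")  # larger |coef| = more influential input
```

Because every PSA draw informs each coefficient, this summary is typically more stable than one-factor-at-a-time sensitivity analysis, which is the point the paper makes.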
BMPOS: a Flexible and User-Friendly Tool Sets for Microbiome Studies.
Pylro, Victor S; Morais, Daniel K; de Oliveira, Francislon S; Dos Santos, Fausto G; Lemos, Leandro N; Oliveira, Guilherme; Roesch, Luiz F W
2016-08-01
Recent advances in science and technology are leading to a revision and re-orientation of methodologies, addressing old and current issues under a new perspective. Advances in next-generation sequencing (NGS) are allowing comparative analysis of the abundance and diversity of whole microbial communities, generating a large amount of data and findings at a systems level. The current limitation for biologists has been the increasing demand for computational power and training required for processing of NGS data. Here, we describe the deployment of the Brazilian Microbiome Project Operating System (BMPOS), a flexible and user-friendly Linux distribution dedicated to microbiome studies. The Brazilian Microbiome Project (BMP) has developed data analysis pipelines for metagenomic studies (phylogenetic marker genes), conducted using the two main high-throughput sequencing platforms (Ion Torrent and Illumina MiSeq). The BMPOS is freely available and includes all the bioinformatics packages and databases required to perform the pipelines suggested by the BMP team. The BMPOS may be used as a bootable live USB stick or installed on any computer with at least a 1 GHz CPU and 512 MB RAM, independent of the operating system previously installed. The BMPOS has proved to be effective for sequence processing, sequence clustering, alignment, taxonomic annotation, statistical analysis, and plotting of metagenomic data. The BMPOS has been used during several metagenomic analysis courses, being valuable as a tool for training and an excellent starting point for anyone interested in performing metagenomic studies. The BMPOS and its documentation are available at http://www.brmicrobiome.org .
Contribution of attachment insecurity to health-related quality of life in depressed patients.
Ponizovsky, Alexander M; Drannikov, Angela
2013-06-22
To examine the individual contributions of insecure attachment styles and depression symptom severity to health-related quality of life (HRQoL) in patients diagnosed with adjustment disorder (AJD) with depressed mood. Participants were 67 patients diagnosed with International Classification of Diseases, Tenth edition AJD with depressed mood, who completed standardised self-report questionnaires measuring the study variables. Mean scores and SDs were computed for the outcome and predictor measures, and Pearson correlations among the measures were computed. The study hypotheses were tested using analysis of variance (ANOVA) and multiple regression analyses. All analyses were performed using the SPSS-17 software package (SPSS Inc., Chicago, IL, United States). ANOVA showed a significant main effect of the insecure attachment styles on depression symptom severity and life satisfaction scores. The results suggest that depressive symptoms were more severe (F = 4.13, df = 2,67, P < 0.05) and life satisfaction was poorer (F = 5.69, df = 2,67, P < 0.01) in both anxious-ambivalently and avoidantly attached patients compared with their securely attached counterparts, whereas the two insecure groups did not significantly differ on these variables. The anxious/ambivalent attachment style and depression symptom severity significantly contributed to HRQoL, accounting for 21.4% and 29.7% of the total variance, respectively [R² = 0.79; adjusted R² = 0.77; F(5, 67) = 33.68, P < 0.0001], even after controlling for gender, marital and employment status confounders. The results show that the anxious/ambivalent attachment style together with depression symptom severity substantially and independently predict the HRQoL outcome in AJD with depressed mood.
Lin, Chun; Solera Garcia, Maria Angeles; Timmis, Roger; Jones, Kevin C
2011-03-01
A new type of directional passive air sampler (DPAS) is described for collecting particulate matter (PM) in ambient air. The prototype sampler has a non-rotating circular sampling tray that is divided into covered angular channels, whose ends are open to winds from sectors covering the surrounding 360°. Wind-blown PM from different directions enters relevant wind-facing channels, and is retained there in collecting pools containing various sampling media. Information on source direction and type can be obtained by examining the distribution of PM between channels. Wind tunnel tests show that external wind velocities are at least halved over an extended area of the collecting pools, encouraging PM to settle from the air stream. Internal and external wind velocities are well-correlated over an external velocity range of 2.0-10.0 m s⁻¹, which suggests it may be possible to relate collected amounts of PM simply to ambient concentrations and wind velocities. Measurements of internal wind velocities in different channels show that velocities decrease from the upwind channel round to the downwind channel, so that the sampler effectively resolves wind directions. Computational fluid dynamics (CFD) analyses were performed on a computer-generated model of the sampler for a range of external wind velocities; the results of these analyses were consistent with those from the wind tunnel. Further wind tunnel tests were undertaken using different artificial particulates in order to assess the collection performance of the sampler in practice. These tests confirmed that the sampler can resolve the directions of sources, by collecting particulates preferentially in source-facing channels.
Developing educational resources for population genetics in R: An open and collaborative approach
USDA-ARS?s Scientific Manuscript database
The R computing and statistical language community has developed a myriad of resources for conducting populations genetic analyses. However, resources for learning how to carry out population genetic analyses in R are scattered and often incomplete, which can make acquiring this skill unnecessarily ...
Extreme Scale Computing Studies
2010-12-01
Contributors include William Carlson (Institute for Defense Analyses), William Dally (Stanford University), Monty Denneau (IBM T. J. Watson Research Laboratories), and Paul Franzon (North Carolina State University).
Hypnotic Enhancement of Cognitive-Behavioral Weight Loss Treatments--Another Meta-reanalysis.
ERIC Educational Resources Information Center
Kirsch, Irving
1996-01-01
In a meta-analysis of the effect of adding hypnosis to cognitive-behavioral treatments for weight reduction, additional data were obtained from authors of two previous studies, and computational inaccuracies in the previous meta-analyses were corrected. Discusses findings. Correlational analyses indicated that the benefits of hypnosis increased…
ERIC Educational Resources Information Center
Ellis, Barbara G.; Dick, Steven J.
1996-01-01
Employs the statistics-documentation portion of a word-processing program's grammar-check feature together with qualitative analyses to determine that Henry Watterson, long-time editor of the "Louisville Courier-Journal," was probably the South's famed Civil War correspondent "Shadow." (TB)
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Chwalowski, Pawel; Wieseman, Carol D.; Eller, David; Ringertz, Ulf
2017-01-01
A status report is provided on the collaboration between the Royal Institute of Technology (KTH) in Sweden and the NASA Langley Research Center regarding the aeroelastic analyses of a full-span fighter configuration wind-tunnel model. This wind-tunnel model was tested in the Transonic Dynamics Tunnel (TDT) in the summer of 2016. Large amounts of data were acquired including steady/unsteady pressures, accelerations, strains, and measured dynamic deformations. The aeroelastic analyses presented include linear aeroelastic analyses, CFD steady analyses, and analyses using CFD-based reduced-order models (ROMs).
1981-02-01
Keywords: battlefield automated systems; human-computer interaction; design criteria; system... Report (this report): In-Depth Analyses of Individual Systems. A. Tactical Fire Direction System (TACFIRE) (RP 81-26); B. Tactical Computer Terminal... The aim is to select the design features and operating procedures of the human-computer interface which best match the requirements and capabilities of anticipated users.
GPU-computing in econophysics and statistical physics
NASA Astrophysics Data System (ADS)
Preis, T.
2011-03-01
A recent trend in computer science and related fields is general purpose computing on graphics processing units (GPUs), which can yield impressive performance. With multiple cores connected by high memory bandwidth, today's GPUs offer resources for non-graphics parallel processing. This article provides a brief introduction into the field of GPU computing and includes examples. In particular computationally expensive analyses employed in financial market context are coded on a graphics card architecture which leads to a significant reduction of computing time. In order to demonstrate the wide range of possible applications, a standard model in statistical physics - the Ising model - is ported to a graphics card architecture as well, resulting in large speedup values.
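In that spirit, here is a minimal sketch of porting the Ising model to a GPU: a checkerboard Metropolis sweep written with CuPy, a third-party NumPy-compatible GPU array library assumed to be installed; the lattice size, temperature, and sweep count are arbitrary illustrative choices. Because same-parity sites share no nearest neighbours, each half-sweep can update all of them simultaneously on the GPU.

```python
import cupy as cp

n, beta = 1024, 0.4407   # lattice size; inverse temperature near the critical point
spins = cp.where(cp.random.random((n, n)) < 0.5, 1.0, -1.0)
ii, jj = cp.meshgrid(cp.arange(n), cp.arange(n), indexing="ij")

def sweep(spins):
    for parity in (0, 1):                            # checkerboard half-sweeps
        nb = (cp.roll(spins, 1, 0) + cp.roll(spins, -1, 0) +
              cp.roll(spins, 1, 1) + cp.roll(spins, -1, 1))
        dE = 2.0 * spins * nb                        # energy cost of flipping (J = 1)
        flip = (cp.random.random((n, n)) < cp.exp(-beta * dE)) \
               & ((ii + jj) % 2 == parity)           # Metropolis accept on one sublattice
        spins = cp.where(flip, -spins, spins)
    return spins

for _ in range(100):
    spins = sweep(spins)
print(float(cp.abs(spins.mean())))   # |magnetization| after 100 sweeps
```

The same code runs on the CPU by swapping `cupy` for `numpy`, which makes the GPU speedup easy to measure directly.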
Linear static structural and vibration analysis on high-performance computers
NASA Technical Reports Server (NTRS)
Baddourah, M. A.; Storaasli, O. O.; Bostic, S. W.
1993-01-01
Parallel computers offer the opportunity to significantly reduce the computation time necessary to analyze large-scale aerospace structures. This paper presents algorithms developed for and implemented on massively parallel computers, hereafter referred to as Scalable High-Performance Computers (SHPC), for the most computationally intensive tasks involved in structural analysis, namely, generation and assembly of system matrices, solution of systems of equations, and calculation of the eigenvalues and eigenvectors. Results on SHPC are presented for large-scale structural problems (i.e. models for High-Speed Civil Transport). The goal of this research is to develop a new, efficient technique which extends structural analysis to SHPC and makes large-scale structural analyses tractable.
The iPlant Collaborative: Cyberinfrastructure for Enabling Data to Discovery for the Life Sciences
Merchant, Nirav; Lyons, Eric; Goff, Stephen; Vaughn, Matthew; Ware, Doreen; Micklos, David; Antin, Parker
2016-01-01
The iPlant Collaborative provides life science research communities access to comprehensive, scalable, and cohesive computational infrastructure for data management; identity management; collaboration tools; and cloud, high-performance, high-throughput computing. iPlant provides training, learning material, and best practice resources to help all researchers make the best use of their data, expand their computational skill set, and effectively manage their data and computation when working as distributed teams. iPlant’s platform permits researchers to easily deposit and share their data and deploy new computational tools and analysis workflows, allowing the broader community to easily use and reuse those data and computational analyses. PMID:26752627
Bread, beer and wine: Saccharomyces cerevisiae diversity reflects human history.
Legras, Jean-Luc; Merdinoglu, Didier; Cornuet, Jean-Marie; Karst, Francis
2007-05-01
Fermented beverages and foods have played a significant role in most societies worldwide for millennia. To better understand how the yeast species Saccharomyces cerevisiae, the main fermenting agent, evolved along this historical and expansion process, we analysed the genetic diversity among 651 strains from 56 different geographical origins, worldwide. Their genotyping at 12 microsatellite loci revealed 575 distinct genotypes organized in subgroups of yeast types, i.e. bread, beer, wine, sake. Some of these groups presented unexpected relatedness: Bread strains displayed a combination of alleles intermediate between beer and wine strains, and strains used for rice wine and sake were most closely related to beer and bread strains. However, up to 28% of genetic diversity between these technological groups was associated with geographical differences which suggests local domestications. Focusing on wine yeasts, a group of Lebanese strains were basal in an F(ST) tree, suggesting a Mesopotamia-based origin of most wine strains. In Europe, migration of wine strains occurred through the Danube Valley, and around the Mediterranean Sea. An approximate Bayesian computation approach suggested a postglacial divergence (most probable period 10,000-12,000 bp). As our results suggest intimate association between man and wine yeast across centuries, we hypothesize that yeast followed man and vine migrations as a commensal member of grapevine flora.
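The approximate Bayesian computation step admits a compact illustration. Below is a toy rejection-ABC sketch in which the simulator, prior range, and observed statistic are all invented; real analyses simulate microsatellite evolution under coalescent models rather than a one-line linear map.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy ABC rejection sampler: infer a divergence time from one summary statistic.
observed = 45.0
prior_draws = rng.uniform(0, 20_000, 200_000)                 # candidate times (years bp)
sim_stats = 0.004 * prior_draws + rng.normal(0, 2.0, prior_draws.size)
accepted = prior_draws[np.abs(sim_stats - observed) < 1.0]    # keep draws matching the data

lo, hi = np.percentile(accepted, [2.5, 97.5])
print(f"posterior mean ~ {accepted.mean():.0f} bp ({lo:.0f}-{hi:.0f} bp)")
```

The accepted draws approximate the posterior over divergence time without ever evaluating a likelihood, which is what makes ABC attractive for complex population-genetic models.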
Computer-based desktop system for surgical videotape editing.
Vincent-Hamelin, E; Sarmiento, J M; de la Puente, J M; Vicente, M
1997-05-01
The educational role of surgical video presentations should be optimized by linking surgical images to graphic evaluation of indications, techniques, and results. We describe a PC-based video production system for personal editing of surgical tapes, according to the objectives of each presentation. The hardware requirement is a personal computer (100 MHz processor, 1 GB hard disk, 16 MB RAM) with a PC-to-TV/video transfer card plugged into a slot. Computer-generated numerical data, texts, and graphics are transformed into analog signals displayed on TV/video. A Genlock interface (a special interface card) synchronizes digital and analog signals to overlay surgical images with electronic illustrations. The presentation is stored as digital information or recorded on a tape. The proliferation of multimedia tools is leading us to adapt presentations to the objectives of lectures and to integrate conceptual analyses with dynamic image-based information. We describe a system that handles both digital and analog signals, production being recorded on a tape. Movies may be managed in a digital environment, with either an "on-line" or "off-line" approach. System requirements are high, but handling a single device optimizes editing without incurring such complexity that management becomes impractical for surgeons. Our experience suggests that computerized editing allows linking surgical scientific and didactic messages on a single communication medium, either a videotape or a CD-ROM.
Ruediger, T M; Allison, S C; Moore, J M; Wainner, R S
2014-09-01
The purposes of this descriptive and exploratory study were to examine electrophysiological measures of ulnar sensory nerve function in disease-free adults to determine reliability, determine reference values computed with appropriate statistical methods, and examine the predictive ability of anthropometric variables. Antidromic sensory nerve conduction studies of the ulnar nerve using surface electrodes were performed on 100 volunteers. Reference values were computed from optimally transformed data. Reliability was computed from 30 subjects. Multiple linear regression models were constructed from four predictor variables. Reliability was greater than 0.85 for all paired measures. Responses were elicited in all subjects; reference values for sensory nerve action potential (SNAP) amplitude from above-elbow stimulation are 3.3 μV, with an across-elbow decrement of less than 46%. No single predictor variable accounted for more than 15% of the variance in the response. Electrophysiologic measures of the ulnar sensory nerve are reliable. Absent SNAP responses are inconsistent with disease-free individuals. Reference values recommended in this report are based on appropriate transformations of non-normally distributed data. No strong statistical model of prediction could be derived from the limited set of predictor variables. Reliability analyses combined with a relatively low level of measurement error suggest that ulnar sensory reference values may be used with confidence. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
Mechanisms of Neurofeedback: A Computation-theoretic Approach.
Davelaar, Eddy J
2018-05-15
Neurofeedback training is a form of brain training in which information about a neural measure is fed back to the trainee who is instructed to increase or decrease the value of that particular measure. This paper focuses on electroencephalography (EEG) neurofeedback in which the neural measures of interest are the brain oscillations. To date, the neural mechanisms that underlie successful neurofeedback training are still unexplained. Such an understanding would benefit researchers, funding agencies, clinicians, regulatory bodies, and insurance firms. Based on recent empirical work, an emerging theory couched firmly within computational neuroscience is proposed that advocates a critical role of the striatum in modulating EEG frequencies. The theory is implemented as a computer simulation of peak alpha upregulation, but in principle any frequency band at one or more electrode sites could be addressed. The simulation successfully learns to increase its peak alpha frequency and demonstrates the influence of threshold setting - the threshold that determines whether positive or negative feedback is provided. Analyses of the model suggest that neurofeedback can be likened to a search process that uses importance sampling to estimate the posterior probability distribution over striatal representational space, with each representation being associated with a distribution of values of the target EEG band. The model provides an important proof of concept to address pertinent methodological questions about how to understand and improve EEG neurofeedback success. Copyright © 2017 IBRO. Published by Elsevier Ltd. All rights reserved.
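A toy sketch of the theory's feedback loop (not the paper's striatal model): candidate "strategies" carry distributions of peak alpha values, feedback is determined by a threshold, and feedback-driven re-weighting of strategies acts like importance sampling over the representational space. All numbers below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n_strategies = 50
strategy_alpha = rng.normal(10.0, 0.5, n_strategies)  # mean peak alpha per strategy (Hz)
weights = np.ones(n_strategies) / n_strategies        # sampling distribution over strategies
threshold = 10.2                                      # feedback threshold (Hz)

history = []
for trial in range(2000):
    s = rng.choice(n_strategies, p=weights)           # sample a strategy
    alpha = rng.normal(strategy_alpha[s], 0.3)        # noisy momentary peak alpha
    reward = 1.0 if alpha > threshold else -0.2       # positive vs negative feedback
    weights[s] = max(weights[s] * np.exp(0.05 * reward), 1e-6)
    weights /= weights.sum()                          # importance-style re-weighting
    history.append(alpha)

print(f"mean peak alpha, first vs last 200 trials: "
      f"{np.mean(history[:200]):.2f} -> {np.mean(history[-200:]):.2f} Hz")
```

Raising or lowering `threshold` in this toy changes how often positive feedback occurs, which loosely mirrors the threshold-setting question the model is designed to probe.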
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.3 of the system.
Doyle, Heather; Lohfeld, Stefan; Dürselen, Lutz; McHugh, Peter
2015-04-01
Computational model geometries of tibial defects with two types of implanted tissue engineering scaffolds, β-tricalcium phosphate (β-TCP) and poly-ε-caprolactone (PCL)/β-TCP, are constructed from µ-CT scan images of the real in vivo defects. Simulations of each defect under four-point bending and under simulated in vivo axial compressive loading are performed. The mechanical stability of each defect is analysed using stress distribution analysis. The results of this analysis highlight the influence of callus volume, and of both scaffold volume and stiffness, on the load-bearing abilities of these defects. Clinically used image-based methods to predict the safety of removing external fixation are evaluated for each defect. Comparison of these measures with the results of the computational analyses indicates that care must be taken in their interpretation. Copyright © 2015 Elsevier Ltd. All rights reserved.
Overview of Sensitivity Analysis and Shape Optimization for Complex Aerodynamic Configurations
NASA Technical Reports Server (NTRS)
Newman, Perry A.; Newman, James C., III; Barnwell, Richard W.; Taylor, Arthur C., III; Hou, Gene J.-W.
1998-01-01
This paper presents a brief overview of some of the more recent advances in steady aerodynamic shape-design sensitivity analysis and optimization, based on advanced computational fluid dynamics. The focus here is on those methods particularly well-suited to the study of geometrically complex configurations and their potentially complex associated flow physics. When nonlinear state equations are considered in the optimization process, difficulties are found in the application of sensitivity analysis. Some techniques for circumventing such difficulties are currently being explored and are included here. Attention is directed to methods that utilize automatic differentiation to obtain aerodynamic sensitivity derivatives for both complex configurations and complex flow physics. Various examples of shape-design sensitivity analysis for unstructured-grid computational fluid dynamics algorithms are demonstrated for different formulations of the sensitivity equations. Finally, the use of advanced, unstructured-grid computational fluid dynamics in multidisciplinary analyses and multidisciplinary sensitivity analyses within future optimization processes is recommended and encouraged.
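To make the automatic-differentiation idea concrete, here is a minimal forward-mode sketch using dual numbers; the scalar "response" function is an invented stand-in for a flow solver, and real CFD AD tools operate on full source codes rather than toy classes.

```python
import math

class Dual:
    """Value plus derivative, propagated exactly through arithmetic."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__
    def sin(self):
        return Dual(math.sin(self.val), math.cos(self.val) * self.der)

def response(shape_param):
    # hypothetical aerodynamic response to a single shape parameter
    return 3.0 * shape_param * shape_param + shape_param.sin()

x = Dual(0.7, 1.0)   # seed derivative: d(shape_param)/d(shape_param) = 1
r = response(x)
print(f"value = {r.val:.6f}, sensitivity = {r.der:.6f}")
# analytic check: 6x + cos(x) at x = 0.7 is about 4.964842
```

Unlike finite differences, the derivative here is exact to machine precision, which is one reason AD is attractive for shape-design sensitivities.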
Status and prospects of computational fluid dynamics for unsteady transonic viscous flows
NASA Technical Reports Server (NTRS)
Mccroskey, W. J.; Kutler, P.; Bridgeman, J. O.
1984-01-01
Applications of computational aerodynamics to aeronautical research, design, and analysis have increased rapidly over the past decade, and these applications offer significant benefits to aeroelasticians. The past developments are traced by means of a number of specific examples, and the trends are projected over the next several years. The crucial factors that limit the present capabilities for unsteady analyses are identified; they include computer speed and memory, algorithm and solution methods, grid generation, turbulence modeling, vortex modeling, data processing, and coupling of the aerodynamic and structural dynamic analyses. The prospects for overcoming these limitations are presented, and many improvements appear to be readily attainable. If so, a complete and reliable numerical simulation of the unsteady, transonic viscous flow around a realistic fighter aircraft configuration could become possible within the next decade. The possibilities of using artificial intelligence concepts to hasten the achievement of this goal are also discussed.
Building a Data Science capability for USGS water research and communication
NASA Astrophysics Data System (ADS)
Appling, A.; Read, E. K.
2015-12-01
Interpreting and communicating water issues in an era of exponentially increasing information requires a blend of domain expertise, computational proficiency, and communication skills. The USGS Office of Water Information has established a Data Science team to meet these needs, providing challenging careers for diverse domain scientists and innovators in the fields of information technology and data visualization. Here, we detail the experience of building a Data Science capability as a bridging element between traditional water resources analyses and modern computing tools and data management techniques. This approach includes four major components: 1) building reusable research tools, 2) documenting data-intensive research approaches in peer reviewed journals, 3) communicating complex water resources issues with interactive web visualizations, and 4) offering training programs for our peers in scientific computing. These components collectively improve the efficiency, transparency, and reproducibility of USGS data analyses and scientific workflows.
NASA Technical Reports Server (NTRS)
Ko, William L.; Olona, Timothy
1987-01-01
The effect of element size on the solution accuracy of finite-element heat transfer and thermal stress analyses of the space shuttle orbiter was investigated. Several structural performance and resizing (SPAR) thermal models and NASA structural analysis (NASTRAN) structural models were set up for the orbiter wing midspan bay 3. The thermal model was found to be the one that determines the limit of finite-element fineness, because of the computational core space required for the radiation view factor calculations. The thermal stresses were found to be extremely sensitive to slight variations in structural temperature distributions. The minimum degree of element fineness required for the thermal model to yield reasonably accurate solutions was established. The radiation view factor computation time was found to be insignificant compared with the total computer time required for the SPAR transient heat transfer analysis.
NASA Technical Reports Server (NTRS)
Faust, N.; Jordon, L.
1981-01-01
Since the implementation of the GRID and IMGRID computer programs for multivariate spatial analysis in the early 1970s, geographic data analysis has moved from large computers to minicomputers and now to microcomputers, with a radical reduction in the costs associated with planning analyses. Once NIMGRID (new IMGRID), a raster-oriented geographic information system, was implemented on the microcomputer, programs designed to process LANDSAT data for use as one element in a geographic data base were added, along with programs for training field selection, supervised and unsupervised classification, and image enhancement. Enhancements to the color graphics capabilities of the microsystem allow display of three channels of LANDSAT data in color-infrared format. The basic microcomputer hardware needed to run NIMGRID and most LANDSAT analyses is listed, as is the software available for LANDSAT processing.
A Window into the Intoxicated Mind? Speech as an Index of Psychoactive Drug Effects
Bedi, Gillinder; Cecchi, Guillermo A; Slezak, Diego F; Carrillo, Facundo; Sigman, Mariano; de Wit, Harriet
2014-01-01
Abused drugs can profoundly alter mental states in ways that may motivate drug use. These effects are usually assessed with self-report, an approach that is vulnerable to biases. Analyzing speech during intoxication may present a more direct, objective measure, offering a unique ‘window’ into the mind. Here, we employed computational analyses of speech semantic and topological structure after ±3,4-methylenedioxymethamphetamine (MDMA; ‘ecstasy’) and methamphetamine in 13 ecstasy users. In 4 sessions, participants completed a 10-min speech task after MDMA (0.75 and 1.5 mg/kg), methamphetamine (20 mg), or placebo. Latent Semantic Analyses identified the semantic proximity between speech content and concepts relevant to drug effects. Graph-based analyses identified topological speech characteristics. Group-level drug effects on semantic distances and topology were assessed. Machine-learning analyses (with leave-one-out cross-validation) assessed whether speech characteristics could predict drug condition in the individual subject. Speech after MDMA (1.5 mg/kg) had greater semantic proximity than placebo to the concepts friend, support, intimacy, and rapport. Speech on MDMA (0.75 mg/kg) had greater proximity to empathy than placebo. Conversely, speech on methamphetamine was further from compassion than placebo. Classifiers discriminated between MDMA (1.5 mg/kg) and placebo with 88% accuracy, and MDMA (1.5 mg/kg) and methamphetamine with 84% accuracy. For the two MDMA doses, the classifier performed at chance. These data suggest that automated semantic speech analyses can capture subtle alterations in mental state, accurately discriminating between drugs. The findings also illustrate the potential for automated speech-based approaches to characterize clinically relevant alterations to mental state, including those occurring in psychiatric illness. PMID:24694926
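The leave-one-out classification scheme described above can be sketched in a few lines; the features and labels below are synthetic placeholders standing in for per-session semantic-proximity measures, not the study's data.

```python
# Leave-one-out cross-validated classification of drug condition from
# speech features. All data here are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
# 26 sessions (13 subjects x 2 conditions), 4 hypothetical features:
# semantic proximity to 'friend', 'rapport', 'empathy', 'compassion'.
X = rng.normal(size=(26, 4))
y = np.repeat([0, 1], 13)          # 0 = placebo, 1 = MDMA (1.5 mg/kg)
X[y == 1] += 0.8                   # inject a separable signal for the demo

clf = LogisticRegression(max_iter=1000)
acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
print(f"leave-one-out accuracy: {acc:.2f}")
```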
Tridico, Silvana R; Murray, Dáithí C; Addison, Jayne; Kirkbride, Kenneth P; Bunce, Michael
2014-01-01
Mammalian hairs are one of the most ubiquitous types of trace evidence collected in the course of forensic investigations. However, hairs that are naturally shed or that lack roots are problematic substrates for DNA profiling; these hair types often contain insufficient nuclear DNA to yield short tandem repeat (STR) profiles. Whilst there have been a number of initial investigations evaluating the value of metagenomic analyses for forensic applications (e.g. examination of computer keyboards), there have been no metagenomic evaluations of human hairs, a substrate commonly encountered in forensic practice. The present study attempts to address this forensic capability gap by conducting a qualitative assessment of the applicability of metagenomic analyses of human scalp and pubic hair. Forty-two DNA extracts obtained from human scalp and pubic hairs generated a total of 79,766 reads, yielding 39,814 reads after control and abundance filtering. The results revealed the presence of unique combinations of microbial taxa that can enable discrimination between individuals, as well as signature taxa indigenous to female pubic hairs. Microbial data from a single co-habiting couple added an extra dimension to the study by suggesting that metagenomic analyses might be of evidentiary value in sexual assault cases when other associative evidence is not present. Of all the data generated in this study, the next-generation sequencing (NGS) data generated from pubic hair held the most potential for forensic applications. Metagenomic analyses of human hairs may provide independent data to augment other forensic results and possibly provide an association between victims of sexual assault and offenders when other associative evidence is absent. Based on the results garnered in the present study, we believe that, with further development, bacterial profiling of hair will become a valuable addition to the forensic toolkit.
Demographics and macroeconomic effects in aesthetic surgery in the UK.
Duncan, C O; Ho-Asjoe, M; Hittinger, R; Nishikawa, H; Waterhouse, N; Coghlan, B; Jones, B
2004-09-01
Media interest in aesthetic surgery is substantial, and suggestions of demographic changes, such as reductions in age or an increase in the number of male patients, are common. In spite of this, there is no peer-reviewed literature reporting the demographics of a contemporary large patient cohort or the effect of macroeconomic indicators on aesthetic surgery in the UK. In this study, computer records of 13,006 patients presenting between 1998 and the first quarter of 2003 at a major aesthetic surgery centre were analysed for procedures undergone, patient age and sex. Male-to-female ratios for each procedure were calculated, and a comparison was made between unit activity and macroeconomic indicators. The results showed that there has been no significant demographic change in the procedures studied, with patient age and male-to-female ratio remaining constant throughout the period studied for each procedure. Comparison with macroeconomic indicators suggested increasing demand for aesthetic surgery in spite of a global recession. In conclusion, media reports of large-scale demographic shifts in aesthetic surgery patients are exaggerated. The stability of unit activity in spite of falling national economic indicators suggested that some units in the UK might be relatively immune to economic vagaries. The implications for training are discussed.
ERIC Educational Resources Information Center
LESCARBEAU, ROLAND F.; AND OTHERS
A suggested post-secondary curriculum guide for electro-mechanical technology, oriented specifically to the computer and business machine fields, was developed by a group of cooperating institutions, now incorporated as Technical Education Consortium, Incorporated. Specific needs of the computer and business machine industry were determined from…
ERIC Educational Resources Information Center
Springer, Michael T.
2014-01-01
Several articles suggest how to incorporate computer models into the organic chemistry laboratory, but relatively few papers discuss how to incorporate these models broadly into the organic chemistry lecture. Previous research has suggested that "manipulating" physical or computer models enhances student understanding; this study…
Electronic Circuit Analysis Language (ECAL)
NASA Astrophysics Data System (ADS)
Chenghang, C.
1983-03-01
The computer-aided design technique is an important development in computer applications and an important component of computer science. A special language for electronic circuit analysis is the foundation of computer-aided design or computer-aided circuit analysis (abbreviated CACD and CACA) of simulated circuits. Electronic Circuit Analysis Language (ECAL) is a comparatively simple and easy-to-use special-purpose circuit-analysis language that uses FORTRAN to carry out interpretive execution. It is capable of conducting dc, ac, and transient analyses of a circuit. Furthermore, the results of the dc analysis can be used directly as the initial conditions for the ac and transient analyses.
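The workflow the abstract describes, a dc operating point feeding the transient analysis as its initial condition, can be illustrated with a minimal nodal-analysis sketch in Python; the RC circuit and component values are invented for the example and are not part of ECAL itself.

```python
# dc operating point computed first, then reused as the initial condition
# for a transient run. The circuit (a simple RC divider) is illustrative.
import numpy as np

R1, R2, C = 1e3, 2e3, 1e-6   # ohms, ohms, farads
V_src = 5.0                  # dc source voltage

# --- dc analysis: nodal equation G*v = i at the single internal node ---
G = np.array([[1/R1 + 1/R2]])
i = np.array([V_src / R1])
v_dc = np.linalg.solve(G, i)[0]          # steady-state node voltage
print(f"dc operating point: {v_dc:.4f} V")

# --- transient analysis: source steps to 10 V, backward Euler from v_dc ---
dt, t_end = 1e-5, 5e-3
V_step = 10.0
v = v_dc                                  # dc result seeds the transient
for _ in range(int(t_end / dt)):
    # Backward Euler for C dv/dt = (V_step - v)/R1 - v/R2
    v = (v + dt/C * V_step/R1) / (1 + dt/C * (1/R1 + 1/R2))
print(f"node voltage after {t_end*1e3:.0f} ms: {v:.4f} V")
```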
Advanced Computing for Medicine.
ERIC Educational Resources Information Center
Rennels, Glenn D.; Shortliffe, Edward H.
1987-01-01
Discusses contributions that computers and computer networks are making to the field of medicine. Emphasizes the computer's speed in storing and retrieving data. Suggests that doctors may soon be able to use computers to advise on diagnosis and treatment. (TW)
NASA Astrophysics Data System (ADS)
Adams, M.; Kempka, T.; Chabab, E.; Ziegler, M.
2018-02-01
Estimating the efficiency and sustainability of geological subsurface utilization, e.g., carbon capture and storage (CCS), requires an integrated risk assessment approach that considers the coupled processes involved, among them the potential reactivation of existing faults. In this context, hydraulic and mechanical parameter uncertainties as well as different injection rates have to be considered and quantified to produce reliable environmental impact assessments. Consequently, the required sensitivity analyses consume significant computational time, owing to the high number of realizations that have to be carried out. Because of their high computational cost, two-way coupled simulations of large-scale 3D multiphase fluid flow systems are not practical for uncertainty and risk assessments. Hence, an innovative semi-analytical hydromechanical coupling approach for hydraulic fault reactivation is introduced. This approach determines the void ratio evolution in representative fault elements using one preliminary base simulation, considering one model geometry and one set of hydromechanical parameters. The void ratio development is then approximated and related to a single reference pressure at the base of the fault. The parametrization of the resulting functions is implemented directly in a multiphase fluid flow simulator to carry out the semi-analytical coupling for the simulation of hydromechanical processes. The iterative parameter exchange between the multiphase and mechanical simulators is thereby omitted, since the update of porosity and permeability is controlled by one reference pore pressure at the fault base. The suggested procedure can reduce the computational time required by coupled hydromechanical simulations of a multitude of injection rates by a factor of up to 15.
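A rough sketch of the semi-analytical coupling idea follows: the void ratio response of a fault element, tabulated once from the preliminary coupled base run, is fitted as a function of a single reference pressure at the fault base and then evaluated inside the flow simulator's time loop. The fitted functional form, the cubic porosity-permeability law, and all numbers are assumptions for illustration, not the authors' calibrated relations.

```python
# Semi-analytical update of fault porosity/permeability from one
# reference pressure, avoiding calls to a mechanical solver.
import numpy as np

# Pretend these pairs came from the one-off hydromechanical base simulation.
p_ref = np.array([10.0, 12.0, 14.0, 16.0, 18.0])      # MPa at fault base
void_ratio = np.array([0.30, 0.31, 0.33, 0.36, 0.40])

coeffs = np.polyfit(p_ref, void_ratio, deg=2)          # approximating function

def update_fault_properties(p_fault_base, k0=1e-15, e0=0.30):
    """Map one reference pressure to porosity and permeability."""
    e = np.polyval(coeffs, p_fault_base)
    phi = e / (1.0 + e)                                # porosity from void ratio
    k = k0 * (e / e0) ** 3                             # assumed cubic law
    return phi, k

# Inside the multiphase flow simulator's time loop, no mechanical solver call:
for p in (11.0, 15.0, 17.5):
    phi, k = update_fault_properties(p)
    print(f"p={p:4.1f} MPa -> porosity={phi:.3f}, permeability={k:.2e} m^2")
```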
Identifying Differences between Depressed Adolescent Suicide Ideators and Attempters
Auerbach, Randy P.; Millner, Alexander J.; Stewart, Jeremy G.; Esposito, Erika
2015-01-01
Background: Adolescent depression and suicide are pressing public health concerns, and identifying key differences between suicide ideators and attempters is critical. The goal of the current study is to test whether depressed adolescent suicide attempters report greater anhedonia severity and exhibit aberrant effort-cost computations in the face of uncertainty. Methods: Depressed adolescents (n = 101) ages 13–19 years were administered structured clinical interviews to assess current mental health disorders and a history of suicidality (suicide ideators = 55, suicide attempters = 46). Participants then completed self-report instruments assessing symptoms of suicidal ideation, depression, anhedonia, and anxiety, as well as a computerized effort-cost computation task. Results: Compared with depressed adolescent suicide ideators, attempters report greater anhedonia severity, even after concurrently controlling for symptoms of suicidal ideation, depression, and anxiety. Additionally, when completing the effort-cost computation task, suicide attempters are less likely to pursue the difficult, high-value option when outcomes are uncertain. Follow-up, trial-level analyses of effort-cost computations suggest that receipt of reward does not influence future decision-making among suicide attempters; suicide ideators, by contrast, exhibit a win-stay approach, repeating their previous choice after rewarded trials. Limitations: Findings should be considered in light of limitations, including a modest sample size, which limits generalizability, and the cross-sectional design. Conclusions: Depressed adolescent suicide attempters are characterized by greater anhedonia severity, which may impair the ability to integrate previous rewarding experiences to inform future decisions. Taken together, this may generate a feeling of powerlessness that contributes to increased suicidality and a needless loss of life. PMID:26233323
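The trial-level win-stay measure referred to above can be computed directly from choice and reward sequences. The sketch below uses synthetic trial data and a simple definition (probability of repeating the previous choice given that it was rewarded), which may differ in detail from the authors' analysis.

```python
# Win-stay probability from per-trial choice and reward sequences.
# Trial arrays are synthetic placeholders, not study data.
import numpy as np

rng = np.random.default_rng(1)
chose_hard = rng.integers(0, 2, size=50).astype(bool)   # choice per trial
rewarded = rng.integers(0, 2, size=50).astype(bool)     # reward per trial

# "Win-stay": among trials following a rewarded trial, how often does the
# participant repeat the previous trial's choice?
prev_win = rewarded[:-1]
stayed = chose_hard[1:] == chose_hard[:-1]
win_stay = stayed[prev_win].mean()
print(f"P(stay | previous trial rewarded) = {win_stay:.2f}")
```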
DISTMIX: direct imputation of summary statistics for unmeasured SNPs from mixed ethnicity cohorts.
Lee, Donghyung; Bigdeli, T Bernard; Williamson, Vernell S; Vladimirov, Vladimir I; Riley, Brien P; Fanous, Ayman H; Bacanu, Silviu-Alin
2015-10-01
To increase the signal resolution for large-scale meta-analyses of genome-wide association studies, genotypes at unmeasured single nucleotide polymorphisms (SNPs) are commonly imputed using large multi-ethnic reference panels. However, the ever-increasing size and ethnic diversity of both reference panels and cohorts makes genotype imputation computationally challenging for moderately sized computer clusters. Moreover, genotype imputation requires subject-level genetic data, which, unlike the summary statistics provided by virtually all studies, are not publicly available. While there are far less demanding methods that avoid the genotype imputation step by directly imputing SNP statistics, e.g. Directly Imputing summary STatistics (DIST) proposed by our group, their implicit assumptions make them applicable only to ethnically homogeneous cohorts. To decrease the computational and access requirements for the analysis of cosmopolitan cohorts, we propose DISTMIX, which extends DIST capabilities to the analysis of mixed-ethnicity cohorts. The method uses a relevant reference panel to directly impute unmeasured SNP statistics based only on statistics at measured SNPs and estimated/user-specified ethnic proportions. Simulations show that the proposed method adequately controls the Type I error rates. The 1000 Genomes panel imputation of summary statistics from the ethnically diverse Psychiatric Genetic Consortium Schizophrenia Phase 2 suggests that, when compared to genotype imputation methods, DISTMIX offers comparable imputation accuracy for only a fraction of the computational resources. DISTMIX software, its reference population data, and usage examples are publicly available at http://code.google.com/p/distmix. Contact: dlee4@vcu.edu. Supplementary Data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
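The core of direct summary-statistics imputation can be sketched as a conditional-mean calculation under a multivariate normal model, with the LD matrix formed as an ethnicity-weighted mixture of reference panels. The matrices, weights, and z-scores below are toy values, and the sketch omits the windowing and shrinkage a production tool such as DISTMIX applies.

```python
# Direct imputation of a z-score at an unmeasured SNP from z-scores at
# measured SNPs, using a mixture LD matrix. All values are toy placeholders.
import numpy as np

# LD correlations among 3 measured SNPs and 1 unmeasured SNP,
# for two hypothetical reference populations.
ld_pop1 = np.array([[1.0, 0.6, 0.3, 0.5],
                    [0.6, 1.0, 0.4, 0.7],
                    [0.3, 0.4, 1.0, 0.2],
                    [0.5, 0.7, 0.2, 1.0]])
ld_pop2 = np.array([[1.0, 0.4, 0.2, 0.3],
                    [0.4, 1.0, 0.3, 0.6],
                    [0.2, 0.3, 1.0, 0.1],
                    [0.3, 0.6, 0.1, 1.0]])
w = (0.7, 0.3)                       # estimated ethnic proportions
ld = w[0] * ld_pop1 + w[1] * ld_pop2 # cohort-matched mixture LD

z_typed = np.array([2.1, 3.4, 0.8])  # observed z-scores at measured SNPs
sigma_tt = ld[:3, :3]                # measured-measured block
sigma_ut = ld[3, :3]                 # unmeasured-measured block

# Conditional-mean imputation: z_u = Sigma_ut Sigma_tt^{-1} z_t
z_imputed = sigma_ut @ np.linalg.solve(sigma_tt, z_typed)
# Imputation quality: variance explained by the measured SNPs.
r2 = sigma_ut @ np.linalg.solve(sigma_tt, sigma_ut)
print(f"imputed z = {z_imputed:.2f}, info = {r2:.2f}")
```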
Caetano-Anollés, Gustavo; Caetano-Anollés, Derek
2015-01-01
Accretion occurs pervasively in nature at widely different timeframes. The process also manifests in the evolution of macromolecules. Here we review recent computational and structural biology studies of evolutionary accretion that make use of the ideographic (historical, retrodictive) and nomothetic (universal, predictive) scientific frameworks. Computational studies uncover explicit timelines of accretion of structural parts in molecular repertoires and molecules. Phylogenetic trees of protein structural domains and proteomes and their molecular functions were built from a genomic census of millions of encoded proteins and associated terminal Gene Ontology terms. Trees reveal a ‘metabolic-first’ origin of proteins, the late development of translation, and a patchwork distribution of proteins in biological networks mediated by molecular recruitment. Similarly, the natural history of ancient RNA molecules inferred from trees of molecular substructures built from a census of molecular features shows patchwork-like accretion patterns. Ideographic analyses of ribosomal history uncover the early appearance of structures supporting mRNA decoding and tRNA translocation, the coevolution of ribosomal proteins and RNA, and a first evolutionary transition that brings ribosomal subunits together into a processive protein biosynthetic complex. Nomothetic structural biology studies of tertiary interactions and ancient insertions in rRNA complement these findings, once concentric layering assumptions are removed. Patterns of coaxial helical stacking reveal a frustrated dynamics of outward and inward ribosomal growth possibly mediated by structural grafting. The early rise of the ribosomal ‘turnstile’ suggests an evolutionary transition in natural biological computation. Results make explicit the need to understand processes of molecular growth and information transfer of macromolecules. PMID:27096056
Numerical simulation of synthesis gas incineration
NASA Astrophysics Data System (ADS)
Kazakov, A. V.; Khaustov, S. A.; Tabakaev, R. B.; Belousova, Y. A.
2016-04-01
The authors have analysed the expediency of the suggested method for utilizing low-grade fuels. Thermal processing of solid raw materials into gaseous fuel, called synthesis gas, is investigated. The technical challenges concerning the applicability of existing gas equipment, developed and extensively tested exclusively for natural gas, were considered. For this purpose, computer simulation of three-dimensional syngas-incinerating flame dynamics was performed by means of the ANSYS Multiphysics engineering software. The subjects of study were the three-dimensional aerodynamic flame structure, the heat-release and temperature fields, and a set of combustion properties: the flare range and the concentration distribution of burnout reagents. The results were presented as time-averaged pathlines with color indexing and can be used for qualitative and quantitative evaluation of the singularities of complex multicomponent gas incineration.
Hayes, Timothy; Usami, Satoshi; Jacobucci, Ross; McArdle, John J
2015-12-01
In this article, we describe a recent development in the analysis of attrition: using classification and regression trees (CART) and random forest methods to generate inverse sampling weights. These flexible machine learning techniques have the potential to capture complex nonlinear, interactive selection models, yet to our knowledge, their performance in the missing data analysis context has never been evaluated. To assess the potential benefits of these methods, we compare their performance with commonly employed multiple imputation and complete case techniques in 2 simulations. These initial results suggest that weights computed from pruned CART analyses performed well in terms of both bias and efficiency when compared with other methods. We discuss the implications of these findings for applied researchers. (c) 2015 APA, all rights reserved.
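The weighting approach evaluated here can be sketched in a few lines: fit a pruned classification tree to the response indicator and invert the predicted response probabilities. The data-generating mechanism, pruning settings, and weight truncation below are illustrative assumptions, not the authors' simulation design.

```python
# Inverse sampling weights from a pruned classification tree predicting
# which cases remain observed at follow-up. Data are synthetic.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(2)
n = 1000
X = rng.normal(size=(n, 3))                       # baseline covariates
# Nonlinear, interactive dropout mechanism the tree should be able to learn.
p_observed = 1 / (1 + np.exp(-(0.5 + X[:, 0] * X[:, 1] - 0.8 * (X[:, 2] > 1))))
observed = rng.random(n) < p_observed

# Cost-complexity pruning (ccp_alpha) plays the role of "pruned CART".
tree = DecisionTreeClassifier(min_samples_leaf=50, ccp_alpha=0.005, random_state=0)
tree.fit(X, observed)

p_hat = tree.predict_proba(X)[:, 1].clip(0.05, 1.0)  # avoid extreme weights
weights = np.where(observed, 1.0 / p_hat, 0.0)       # weights for complete cases
print(f"mean weight among observed cases: {weights[observed].mean():.2f}")
```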
Applicability of IHE/Continua components for PHR systems: learning from experiences.
Urbauer, Philipp; Sauermann, Stefan; Frohner, Matthias; Forjan, Mathias; Pohn, Birgit; Mense, Alexander
2015-04-01
Capturing personal health data using smartphones, PCs, or other devices, and reusing those data in personal health records (PHRs), is becoming more and more attractive to modern health-conscious populations. This paper analyses interoperability specifications targeting standards-based communication between computer systems and personal health devices (e.g. blood pressure monitors) in healthcare, from initiatives such as Integrating the Healthcare Enterprise (IHE) and the Continua Health Alliance, driven by industry and healthcare professionals. Furthermore, it identifies certain contradictions and gaps in the specifications and suggests possible solutions. Despite these shortcomings, the specifications allow fully functional implementations of PHR systems. Henceforth, both big business and small and medium-sized enterprises (SMEs) can actively contribute to the widespread use of large-scale interoperable PHR systems. Copyright © 2013 Elsevier Ltd. All rights reserved.
Wachsmuth, Leah M; Johnson, Meredith G; Gavenonis, Jason
2017-06-01
Parasitic diseases caused by kinetoplastid parasites of the genera Trypanosoma and Leishmania are an urgent public health crisis in the developing world. These closely related species possess a number of multimeric enzymes in highly conserved pathways involved in vital functions, such as redox homeostasis and nucleotide synthesis. Computational alanine scanning of these protein-protein interfaces has revealed a host of potentially ligandable sites on several established and emerging anti-parasitic drug targets. Analysis of interfaces with multiple clustered hotspots has suggested several potentially inhibitable protein-protein interactions that may have been overlooked by previous large-scale analyses focusing solely on secondary structure. These protein-protein interactions provide a promising lead for the development of new peptide and macrocycle inhibitors of these enzymes.
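A minimal sketch of the hotspot-clustering idea described above follows: residues whose computed alanine-scanning ddG exceeds a cutoff are flagged as hotspots and grouped by spatial proximity. The ddG cutoff, the 8 Å linkage distance, and all coordinates are illustrative placeholders rather than values from the study.

```python
# Identify interface hotspots from alanine-scanning ddG values and cluster
# them by spatial proximity. All numbers are illustrative placeholders.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

# Hypothetical interface residues: (x, y, z) of C-alpha, plus ddG (kcal/mol).
coords = np.array([[0.0, 0.0, 0.0], [3.5, 1.0, 0.5], [4.0, 0.0, 1.0],
                   [20.0, 5.0, 2.0], [21.5, 4.0, 2.5], [40.0, 0.0, 0.0]])
ddg = np.array([2.3, 1.8, 2.9, 2.1, 2.6, 0.4])

hot = ddg >= 1.5                               # hotspot cutoff (assumed)
hot_coords = coords[hot]

# Single-linkage clustering; hotspots within 8 A of a neighbor share a cluster.
labels = fcluster(linkage(hot_coords, method="single"), t=8.0, criterion="distance")
for cluster_id in np.unique(labels):
    members = np.flatnonzero(hot)[labels == cluster_id]
    print(f"cluster {cluster_id}: residues {members.tolist()}")
```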