ERIC Educational Resources Information Center
Deryakulu, Deniz; Olkun, Sinan
2009-01-01
This study examined Turkish computer teachers' professional memories recounting their experiences with school administrators and supervisors. Seventy-four computer teachers participated in the study. Content analysis of the memories revealed that the most frequently mentioned themes concerning school administrators were "unsupportive…
Errors in finite-difference computations on curvilinear coordinate systems
NASA Technical Reports Server (NTRS)
Mastin, C. W.; Thompson, J. F.
1980-01-01
Curvilinear coordinate systems were used extensively to solve partial differential equations on arbitrary regions. An analysis of truncation error in the computation of derivatives revealed why numerical results may be erroneous. A more accurate method of computing derivatives is presented.
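A minimal numerical sketch of the issue discussed above, assuming a simple one-dimensional stretched grid (the mapping x = ξ² and the test function are invented for illustration and are not taken from the report): the derivative is formed in the uniform computational coordinate and converted with the metric term, and the interior truncation error shrinks as the grid is refined.

```python
import numpy as np

def dfdx_mapped(n):
    xi = np.linspace(0.1, 1.0, n)   # uniform computational coordinate
    x = xi**2                       # stretched physical coordinate (assumed mapping)
    f = np.sin(x)
    dfdxi = np.gradient(f, xi)      # central differences in xi
    dxdxi = np.gradient(x, xi)      # metric term, also by differences
    return x, dfdxi / dxdxi         # chain rule: df/dx = (df/dxi) / (dx/dxi)

for n in (21, 41, 81):
    x, approx = dfdx_mapped(n)
    err = np.max(np.abs(approx - np.cos(x))[1:-1])   # interior points only
    print(f"n={n:3d}  max interior error = {err:.2e}")
```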
Business Demands for Web-Related Skills as Compared to Other Computer Skills.
ERIC Educational Resources Information Center
Groneman, Nancy
2000-01-01
Analysis of 23,704 want ads for computer-related jobs revealed that the most frequently mentioned skills were UNIX, SQL programming, and computer security. Curriculum implications were derived from the most desired and less frequently mentioned skills. (SK)
Linguistic Analysis of Natural Language Communication with Computers.
ERIC Educational Resources Information Center
Thompson, Bozena Henisz
Interaction with computers in natural language requires a language that is flexible and suited to the task. This study of natural dialogue was undertaken to reveal those characteristics which can make computer English more natural. Experiments were made in three modes of communication: face-to-face, terminal-to-terminal, and human-to-computer,…
ERIC Educational Resources Information Center
Wu, Heping; Gao, Junde; Zhang, Weimin
2014-01-01
The present study examines the professional growth of three Chinese English teachers by analyzing their interactional patterns and their social and cognitive presence in an online community. The data from social network analysis (SNA) and content analysis revealed that computer-mediated communication (CMC) created new opportunities for teachers to…
ERIC Educational Resources Information Center
Chen, Hsinchun; Martinez, Joanne; Kirchhoff, Amy; Ng, Tobun D.; Schatz, Bruce R.
1998-01-01
Grounded on object filtering, automatic indexing, and co-occurrence analysis, an experiment was performed using a parallel supercomputer to analyze over 400,000 abstracts in an INSPEC computer engineering collection. A user evaluation revealed that system-generated thesauri were better than the human-generated INSPEC subject thesaurus in concept…
Argumentation in a Multi Party Asynchronous Computer Mediated Conference: A Generic Analysis
ERIC Educational Resources Information Center
Coffin, Caroline; Painter, Clare; Hewings, Ann
2005-01-01
This paper draws on systemic functional linguistic genre analysis to illuminate the way in which post graduate applied linguistics students structure their argumentation within a multi party asynchronous computer mediated conference. Two conference discussions within the same postgraduate course are compared in order to reveal the way in which…
Offodile, Anaeze C; Chatterjee, Abhishek; Vallejo, Sergio; Fisher, Carla S; Tchou, Julia C; Guo, Lifei
2015-04-01
Computed tomographic angiography is a diagnostic tool increasingly used for preoperative vascular mapping in abdomen-based perforator flap breast reconstruction. This study compared the use of computed tomographic angiography and the conventional practice of Doppler ultrasonography only in postmastectomy reconstruction using a cost-utility model. Following a comprehensive literature review, a decision analytic model was created using the three most clinically relevant health outcomes in free autologous breast reconstruction with computed tomographic angiography versus Doppler ultrasonography only. Cost and utility estimates for each health outcome were used to derive the quality-adjusted life-years and incremental cost-utility ratio. One-way sensitivity analysis was performed to scrutinize the robustness of the authors' results. Six studies and 782 patients were identified. Cost-utility analysis revealed a baseline cost savings of $3179 and a gain of 0.25 quality-adjusted life-years. This yielded an incremental cost-utility ratio of -$12,716, implying a dominant choice favoring preoperative computed tomographic angiography. Sensitivity analysis revealed that computed tomographic angiography was costlier when the operative time difference between the two techniques was less than 21.3 minutes. However, when its clinical advantage over Doppler ultrasonography only was taken into account, computed tomographic angiography would still remain the cost-effective option even if it offered no additional operating time advantage. The authors' results show that computed tomographic angiography is a cost-effective technology for identifying lower abdominal perforators for autologous breast reconstruction. Although the perfect study would be a randomized controlled trial of the two approaches with true cost accrual, the authors' results represent the best available evidence.
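A quick arithmetic check of the ratio reported above, using only the figures quoted in the abstract (the sign convention for savings is an assumption):

```python
delta_cost = -3179.0   # CTA saves $3179 relative to Doppler ultrasonography only
delta_qaly = 0.25      # CTA gains 0.25 quality-adjusted life-years
icur = delta_cost / delta_qaly
print(icur)            # -12716.0 dollars per QALY: cheaper and better, i.e. a dominant strategy
```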
ERIC Educational Resources Information Center
Mildenhall, Paula; Hackling, Mark
2012-01-01
This paper reports on the analysis of a study of a professional learning intervention focussing on computational estimation. Using a multiple case study design, it was possible to describe the impact of the intervention on students' beliefs and computational estimation performance. The study revealed some noteworthy impacts on computational…
Technological Developments in Journalism: The Impact of the Computer in the Newsroom.
ERIC Educational Resources Information Center
Garrison, Bruce
A review of the literature for the past 7 years reveals that the computer serves several key functions in the newsroom. Its more dominant role is in word processing, or internal copy processing regardless of the source of the copy. Computers are also useful in reviewing documents for content analysis, for survey research in public opinion polls…
ERIC Educational Resources Information Center
Judd, Terry; Kennedy, Gregor
2011-01-01
Logs of on-campus computer and Internet usage were used to conduct a study of computer-based task switching and multitasking by undergraduate medical students. A detailed analysis of over 6000 individual sessions revealed that while a majority of students engaged in both task switching and multitasking behaviours, they did so less frequently than…
ERIC Educational Resources Information Center
Namlu, Aysen Gurcan; Odabasi, Hatice Ferhan
2007-01-01
This study was carried out in a Turkish university with 216 undergraduate students of computer technology as respondents. The study aimed to develop a scale (UECUBS) to determine unethical computer use behavior. A factor analysis of the related items revealed that the factors could be grouped under five headings: intellectual property,…
Molecular biological analysis in a patient with multiple lung adenocarcinomas.
Wakayama, Tomoshige; Hirata, Hirokuni; Suka, Shunsuke; Sato, Kozo; Tatewaki, Masamitsu; Souma, Ryosuke; Satoh, Hideyuki; Tamura, Motohiko; Matsumura, Yuji; Imada, Hiroki; Sugiyama, Kumiya; Arima, Masafumi; Kurasawa, Kazuhiro; Fukuda, Takeshi; Fukushima, Yasutsugu
2018-05-01
The utility of molecular biological analysis in lung adenocarcinoma has been demonstrated. Herein we report a rare case presenting as multiple lung adenocarcinomas with four different EGFR gene mutations detected in three lung tumors. After opacification was detected by routine chest X-ray, the patient, a 64-year-old woman, underwent chest computed tomography, which revealed a right lung segment S4 ground-glass nodule (GGN). Follow-up computed tomography revealed a 42 mm GGN with a 26 mm nodule (S6) and a 20 mm GGN (S10). Histopathology of resected specimens from the right middle and lower lobes revealed that all three nodules were adenocarcinomas. Four EGFR mutations were detected; none of the three tumors had the same mutations. Molecular biological analysis is a promising tool for the diagnosis of primary tumors in patients with multiple lung carcinomas of the same histotype, enabling appropriate treatment. © 2018 The Authors. Thoracic Cancer published by China Lung Oncology Group and John Wiley & Sons Australia, Ltd.
An Analysis of Mission Critical Computer Software in Naval Aviation
1991-03-01
...software development schedules were sustained without a milestone change being made. Also, software that was released to the fleet had no major...fleet contain any major defects? This research has revealed that only about half of the original software development schedules were sustained without a
The Impact of Learner Characteristics on the Multi-Dimensional Construct of Social Presence
ERIC Educational Resources Information Center
Mykota, David
2017-01-01
This study explored the impact of learner characteristics on the multi-dimensional construct of social presence as measured by the computer-mediated communication questionnaire. Using Multiple Analysis of Variance, findings reveal that the number of online courses taken and computer-mediated communication experience significantly affect the…
Fetterhoff, Dustin; Opris, Ioan; Simpson, Sean L.; Deadwyler, Sam A.; Hampson, Robert E.; Kraft, Robert A.
2014-01-01
Background Multifractal analysis quantifies the time-scale-invariant properties in data by describing the structure of variability over time. By applying this analysis to hippocampal interspike interval sequences recorded during performance of a working memory task, a measure of long-range temporal correlations and multifractal dynamics can reveal single neuron correlates of information processing. New method Wavelet leaders-based multifractal analysis (WLMA) was applied to hippocampal interspike intervals recorded during a working memory task. WLMA can be used to identify neurons likely to exhibit information processing relevant to operation of brain–computer interfaces and nonlinear neuronal models. Results Neurons involved in memory processing (“Functional Cell Types” or FCTs) showed a greater degree of multifractal firing properties than neurons without task-relevant firing characteristics. In addition, previously unidentified FCTs were revealed because multifractal analysis suggested further functional classification. The cannabinoid-type 1 receptor partial agonist, tetrahydrocannabinol (THC), selectively reduced multifractal dynamics in FCT neurons compared to non-FCT neurons. Comparison with existing methods WLMA is an objective tool for quantifying the memory-correlated complexity represented by FCTs that reveals additional information compared to classification of FCTs using traditional z-scores to identify neuronal correlates of behavioral events. Conclusion z-Score-based FCT classification provides limited information about the dynamical range of neuronal activity characterized by WLMA. Increased complexity, as measured with multifractal analysis, may be a marker of functional involvement in memory processing. The level of multifractal attributes can be used to differentially emphasize neural signals to improve computational models and algorithms underlying brain–computer interfaces. PMID:25086297
Study of USGS/NASA land use classification system. [computer analysis from LANDSAT data
NASA Technical Reports Server (NTRS)
Spann, G. W.
1975-01-01
The results of a computer mapping project using LANDSAT data and the USGS/NASA land use classification system are summarized. During the computer mapping portion of the project, accuracies of 67 percent to 79 percent were achieved using Level II of the classification system and a 4,000 acre test site centered on Douglasville, Georgia. Analysis of response to a questionnaire circulated to actual and potential LANDSAT data users reveals several important findings: (1) there is a substantial desire for additional information related to LANDSAT capabilities; (2) a majority of the respondents feel computer mapping from LANDSAT data could aid present or future projects; and (3) the costs of computer mapping are substantially less than those of other methods.
Growth Dynamics of Information Search Services
ERIC Educational Resources Information Center
Lindquist, Mats G.
1978-01-01
An analysis of computer-based search services (ISSs) from a system's viewpoint, using a continuous simulation model to reveal growth and stagnation of a typical system is presented, as well as an analysis of decision making for an ISS. (Author/MBR)
Content Analysis: What Are They Talking About?
ERIC Educational Resources Information Center
Strijbos, Jan-Willem; Martens, Rob L.; Prins, Frans J.; Jochems, Wim M. G.
2006-01-01
Quantitative content analysis is increasingly used to surpass surface level analyses in computer-supported collaborative learning (e.g., counting messages), but critical reflection on accepted practice has generally not been reported. A review of CSCL conference proceedings revealed a general vagueness in definitions of units of analysis. In…
NASA Astrophysics Data System (ADS)
Koseki, Jun; Matsui, Hidetoshi; Konno, Masamitsu; Nishida, Naohiro; Kawamoto, Koichi; Kano, Yoshihiro; Mori, Masaki; Doki, Yuichiro; Ishii, Hideshi
2016-02-01
Bioinformatics and computational modelling are expected to offer innovative approaches in human medical science. In the present study, we performed computational analyses and made predictions using transcriptome and metabolome datasets obtained from fluorescence-based visualisations of chemotherapy-resistant cancer stem cells (CSCs) in the human oesophagus. This approach revealed an uncharacterized role for the ornithine metabolic pathway in the survival of chemotherapy-resistant CSCs. The present study strengthens the rationale for further characterisation that may lead to the discovery of innovative drugs against robust CSCs.
A Case for Ubiquitous, Integrated Computing in Teacher Education
ERIC Educational Resources Information Center
Kay, Robin H.; Knaack, Liesel
2005-01-01
The purpose of this study was to evaluate the effect of an integrated, laptop-based approach on pre-service teachers' computer attitudes, ability and use. Pre-post program analysis revealed significant differences in behavioural attitudes and perceived control (self-efficacy), but not in affective and cognitive attitudes. In addition, there was a…
Barreau, S; Morton, J
1999-11-09
The work reported provides an information processing account of young children's performance on the Smarties task (Perner, J., Leekam, S.R., & Wimmer, H. 1987, Three-year-olds' difficulty with false belief: the case for a conceptual deficit. British Journal of Developmental Psychology, 5, 125-137). In this task, a 3-year-old is shown a Smarties tube and asked about the supposed contents. The true contents, pencils, are then revealed, and the majority of 3-year-olds cannot recall their initial belief that the tube contained Smarties. The theoretical analysis, based on the Headed Records framework (Morton, J., Hammersley, R.J., & Bekerian, D.A. 1985, Headed records: a model for memory and its failures, Cognition, 20, 1-23), focuses on the computational conditions that are required to resolve the Smarties task; on the possible limitations in the developing memory system that may lead to a computational breakdown; and on ways of bypassing such limitations to ensure correct resolution. The design, motivated by this analysis, is a variation on Perner's Smarties task. Instead of revealing the tube's contents immediately after establishing the child's beliefs about it, these contents were then transferred to a bag and a (false) belief about the bag's contents established. Only then were the true contents of the bag revealed. The same procedure (different contents) was carried out a week later. As predicted, children's performance was better (a) in the 'tube' condition; and (b) on the second test. Consistent with the proposed analysis, the data show that when the computational demands imposed by the original task are reduced, young children can and do remember what they had thought about the contents of the tube even after its true contents are revealed.
Mirror neurons and imitation: a computationally guided review.
Oztop, Erhan; Kawato, Mitsuo; Arbib, Michael
2006-04-01
Neurophysiology reveals the properties of individual mirror neurons in the macaque while brain imaging reveals the presence of 'mirror systems' (not individual neurons) in the human. Current conceptual models attribute high level functions such as action understanding, imitation, and language to mirror neurons. However, only the first of these three functions is well-developed in monkeys. We thus distinguish current opinions (conceptual models) on mirror neuron function from more detailed computational models. We assess the strengths and weaknesses of current computational models in addressing the data and speculations on mirror neurons (macaque) and mirror systems (human). In particular, our mirror neuron system (MNS), mental state inference (MSI) and modular selection and identification for control (MOSAIC) models are analyzed in more detail. Conceptual models often overlook the computational requirements for posited functions, while too many computational models adopt the erroneous hypothesis that mirror neurons are interchangeable with imitation ability. Our meta-analysis underlines the gap between conceptual and computational models and points out the research effort required from both sides to reduce this gap.
Scalable Parameter Estimation for Genome-Scale Biochemical Reaction Networks
Kaltenbacher, Barbara; Hasenauer, Jan
2017-01-01
Mechanistic mathematical modeling of biochemical reaction networks using ordinary differential equation (ODE) models has improved our understanding of small- and medium-scale biological processes. While the same should in principle hold for large- and genome-scale processes, the computational methods for the analysis of ODE models which describe hundreds or thousands of biochemical species and reactions are missing so far. While individual simulations are feasible, the inference of the model parameters from experimental data is computationally too intensive. In this manuscript, we evaluate adjoint sensitivity analysis for parameter estimation in large scale biochemical reaction networks. We present the approach for time-discrete measurement and compare it to state-of-the-art methods used in systems and computational biology. Our comparison reveals a significantly improved computational efficiency and a superior scalability of adjoint sensitivity analysis. The computational complexity is effectively independent of the number of parameters, enabling the analysis of large- and genome-scale models. Our study of a comprehensive kinetic model of ErbB signaling shows that parameter estimation using adjoint sensitivity analysis requires a fraction of the computation time of established methods. The proposed method will facilitate mechanistic modeling of genome-scale cellular processes, as required in the age of omics. PMID:28114351
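A minimal discrete-adjoint sketch of the idea, on a toy one-parameter decay model rather than the authors' genome-scale ODE systems or their actual implementation: the gradient of a sum-of-squares objective is obtained from one forward pass plus one backward (adjoint) pass, so its cost does not grow with the number of parameters. The model, data, and step size are invented for illustration.

```python
import numpy as np

def loss_and_grad(theta, x0, y, dt):
    n = len(y)
    x = np.empty(n)
    x[0] = x0
    for k in range(n - 1):                 # forward pass: x_{k+1} = x_k * (1 - theta*dt)
        x[k + 1] = x[k] * (1.0 - theta * dt)
    J = np.sum((x - y) ** 2)

    lam = 2.0 * (x[-1] - y[-1])            # adjoint variable at the final step
    dJ_dtheta = 0.0
    for k in range(n - 2, -1, -1):         # single backward (adjoint) pass
        dJ_dtheta += lam * (-x[k] * dt)    # contribution of d x_{k+1} / d theta
        lam = 2.0 * (x[k] - y[k]) + lam * (1.0 - theta * dt)
    return J, dJ_dtheta

dt, x0 = 0.01, 1.0
t = np.arange(0.0, 1.0, dt)
y = np.exp(-2.0 * t)                       # synthetic "data" generated with theta = 2
J, g = loss_and_grad(1.5, x0, y, dt)
eps = 1e-6                                 # finite-difference check of the adjoint gradient
J_eps, _ = loss_and_grad(1.5 + eps, x0, y, dt)
print("adjoint gradient:", g, " finite difference:", (J_eps - J) / eps)
```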
Increasing Elementary School Teachers' Awareness of Gender Inequity in Student Computer Usage
ERIC Educational Resources Information Center
Luongo, Nicole
2012-01-01
This study was designed to increase gender equity awareness in elementary school teachers with respect to student computer and technology usage. Using professional development methods with a group of teachers, the writer attempted to help them become more aware of gender bias in technology instruction. An analysis of the data revealed that…
ERIC Educational Resources Information Center
Stefanski, Angela J.; Leitze, Amy; Fife-Demski, Veronica M.
2018-01-01
This collective case study used methods of discourse analysis to consider what computer-mediated collaboration might reveal about preservice teachers' sense-making in a field-based practicum as they learn to teach reading to children identified as struggling readers. Researchers agree that field-based experiences coupled with time for reflection…
Amber Vanden Wymelenberg; Patrick Minges; Grzegorz Sabat; Diego Martinez; Andrea Aerts; Asaf Salamov; Igor Grigoriev; Harris Shapiro; Nik Putnam; Paula Belinky; Carlos Dosoretz; Jill Gaskell; Phil Kersten; Dan Cullen
2006-01-01
The white-rot basidiomycete Phanerochaete chrysosporium employs extracellular enzymes to completely degrade the major polymers of wood: cellulose, hemicellulose, and lignin. Analysis of a total of 10,048 v2.1 gene models predicts 769 secreted proteins, a substantial increase over the 268 models identified in the earlier database (v1.0). Within the v2.1 "computational...
Multiscale tomographic analysis of heterogeneous cast Al-Si-X alloys.
Asghar, Z; Requena, G; Sket, F
2015-07-01
The three-dimensional microstructure of cast AlSi12Ni and AlSi10Cu5Ni2 alloys is investigated by laboratory X-ray computed tomography, synchrotron X-ray computed microtomography, light optical tomography and synchrotron X-ray computed microtomography with submicrometre resolution. The results obtained with each technique are correlated with the size of the scanned volumes and resolved microstructural features. Laboratory X-ray computed tomography is sufficient to resolve highly absorbing aluminides but eutectic and primary Si remain unrevealed. Synchrotron X-ray computed microtomography at ID15/ESRF gives better spatial resolution and reveals primary Si in addition to aluminides. Synchrotron X-ray computed microtomography at ID19/ESRF reveals all the phases ≥ ∼1 μm in volumes about 80 times smaller than laboratory X-ray computed tomography. The volumes investigated by light optical tomography and submicrometre synchrotron X-ray computed microtomography are much smaller than laboratory X-ray computed tomography but both techniques provide local chemical information on the types of aluminides. The complementary techniques applied enable a full three-dimensional characterization of the microstructure of the alloys at length scales ranging over six orders of magnitude. © 2015 The Authors Journal of Microscopy © 2015 Royal Microscopical Society.
NASA Technical Reports Server (NTRS)
Parrish, R. S.; Carter, M. C.
1974-01-01
This analysis utilizes computer simulation and statistical estimation. Realizations of stationary Gaussian stochastic processes with selected autocorrelation functions are computer simulated. Analysis of the simulated data revealed that the mean and the variance of a process were functionally dependent upon the autocorrelation parameter and crossing level. Using predicted values for the mean and standard deviation, by the method of moments, the distribution parameters were estimated. Thus, given the autocorrelation parameter, crossing level, mean, and standard deviation of a process, the probability of exceeding the crossing level for a particular length of time was calculated.
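A small illustration in the spirit of the approach described above (this is not the original simulation code; the autocorrelation parameter and crossing level are arbitrary): a stationary Gaussian process with exponential autocorrelation is generated as an AR(1) recursion, and its time above a level and its upcrossings are counted.

```python
import numpy as np

rng = np.random.default_rng(0)
rho, n, level = 0.9, 200_000, 1.5                    # autocorrelation parameter and crossing level
x = np.empty(n)
x[0] = rng.normal()
noise = rng.normal(size=n) * np.sqrt(1.0 - rho**2)   # keeps the marginal variance near 1
for k in range(1, n):
    x[k] = rho * x[k - 1] + noise[k]

above = x > level
upcrossings = np.count_nonzero(~above[:-1] & above[1:])
print("fraction of time above the level:", above.mean())
print("upcrossings per step:", upcrossings / n)
```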
Multiplex Quantitative Histologic Analysis of Human Breast Cancer Cell Signaling and Cell Fate
2010-05-01
Keywords: breast cancer, cell signaling, cell proliferation, histology, image analysis. ...revealed by individual stains in multiplex combinations; and (3) software (FARSIGHT) for automated multispectral image analysis that (i) segments...Task 3. Develop computational algorithms for multispectral immunohistological image analysis. FARSIGHT software was developed to quantify intrinsic
Systems Analysis, Machineable Circulation Data and Library Users and Non-Users.
ERIC Educational Resources Information Center
Lubans, John, Jr.
A study to be made with computer-based circulation data of the non-use and use of a large academic library is discussed. A search of the literature reveals that computer-based circulation systems can be, but have not been, utilized to provide data bases for systematic analyses of library users and resources. The data gathered in the circulation…
ERIC Educational Resources Information Center
Schmid, Richard F.; Bernard, Robert M.; Borokhovski, Eugene; Tamim, Rana; Abrami, Philip C.; Wade, C. Anne; Surkes, Michael A.; Lowerison, Gretchen
2009-01-01
This paper reports the findings of a Stage I meta-analysis exploring the achievement effects of computer-based technology use in higher education classrooms (non-distance education). An extensive literature search revealed more than 6,000 potentially relevant primary empirical studies. Analysis of a representative sample of 231 studies (k = 310)…
MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models
Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines
2016-08-03
Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.
Colonic metastasis from breast carcinoma: a case report.
Tsujimura, Kazuma; Teruya, Tsuyoshi; Kiyuna, Masaya; Higa, Kuniki; Higa, Junko; Iha, Kouji; Chinen, Kiyoshi; Asato, Masaya; Takushi, Yasukatsu; Ota, Morihito; Dakeshita, Eijirou; Nakachi, Atsushi; Gakiya, Akira; Shiroma, Hiroshi
2017-07-05
Colonic metastasis from breast carcinoma is very rare. Here, we report a case of colonic metastasis from breast carcinoma. The patient was a 51-year-old woman. She had upper abdominal pain, vomiting, and diarrhea, repeatedly. We performed abdominal contrast-enhanced computed tomography (CT) to investigate these symptoms. The CT scan revealed a tumor in the ascending colon with contrast enhancement and showed an expanded small intestine. For further investigation of this tumor, we performed whole positron emission tomography-computed tomography (PET-CT). The PET-CT scan revealed fluorodeoxyglucose uptake in the ascending colon, mesentery, left breast, and left axillary region. Analysis of biopsy samples obtained during colonoscopy revealed signet ring cell-like carcinoma. Moreover, biopsy of the breast tumor revealed invasive lobular carcinoma. Therefore, the preoperative diagnosis was colonic metastasis from breast carcinoma. Open ileocecal resection was performed. The final diagnosis was multiple metastatic breast carcinomas, and the TNM classification was T2N1M1 Stage IV. We presented a rare case of colonic metastasis from breast carcinoma. PET-CT may be useful in the diagnosis of metastatic breast cancer. When analysis of biopsy samples obtained during colonoscopy reveals signet ring cell-like carcinoma, the possibility of breast cancer as the primary tumor should be considered.
Yeari, Menahem; van den Broek, Paul
2016-09-01
It is a well-accepted view that the prior semantic (general) knowledge that readers possess plays a central role in reading comprehension. Nevertheless, computational models of reading comprehension have not integrated the simulation of semantic knowledge and online comprehension processes under a unified mathematical algorithm. The present article introduces a computational model that integrates the landscape model of comprehension processes with latent semantic analysis representation of semantic knowledge. In three sets of simulations of previous behavioral findings, the integrated model successfully simulated the activation and attenuation of predictive and bridging inferences during reading, as well as centrality estimations and recall of textual information after reading. Analyses of the computational results revealed new theoretical insights regarding the underlying mechanisms of the various comprehension phenomena.
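A toy sketch of the latent-semantic-analysis component mentioned above (the corpus and the number of retained dimensions are placeholders; this is not the authors' integrated landscape model): semantic relatedness between text units is approximated by cosine similarity in a low-rank space derived from a term-document matrix.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "the knight drew his sword and rode toward the castle",
    "the soldier raised his weapon near the fortress",
    "interest rates were raised by the central bank",
]
tfidf = TfidfVectorizer().fit_transform(corpus)                  # term-document matrix
lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)
print(cosine_similarity(lsa))   # pairwise semantic relatedness in the reduced space
```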
Factors influencing exemplary science teachers' levels of computer use
NASA Astrophysics Data System (ADS)
Hakverdi, Meral
This study examines exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their students' use of computer applications/tools in or for their science class. After a relevant review of the literature, certain variables were selected for analysis. These variables included personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study includes middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching (sponsored by the White House and the National Science Foundation) between the years 1997 and 2003 from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92 science teachers, a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer-related applications/tools. The most commonly used computer applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in their science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools. The results of the multiple regression analysis revealed that personal self-efficacy was related to the exemplary science teachers' level of computer use, suggesting that computer use is dependent on perceived abilities at using computers. The teachers' use of computer-related applications/tools during class, and their personal self-efficacy, age, and gender were highly related to their level of knowledge/skills in using specific computer applications for science instruction. The teachers' level of knowledge/skills in using specific computer applications for science instruction and gender were related to their use of computer-related applications/tools during class and the students' use of computer-related applications/tools in or for their science class. In conclusion, exemplary science teachers need assistance in learning and using computer-related applications/tools in their science class.
Computation material science of structural-phase transformation in casting aluminium alloys
NASA Astrophysics Data System (ADS)
Golod, V. M.; Dobosh, L. Yu
2017-04-01
Successive stages of computer simulation of casting microstructure formation under non-equilibrium crystallization of multicomponent aluminum alloys are presented. On the basis of computational thermodynamics and heat-transfer analysis of the solidification of macroscale shaped castings, the boundary conditions of local heat exchange are specified for mesoscale modelling of non-equilibrium solid-phase formation and of component redistribution between phases during coalescence of secondary dendrite branches. Computer analysis of structural-phase transitions is based on the principle of the additive physico-chemical effect of the alloy components during the diffusional-capillary morphological evolution of the dendrite structure and of local dendrite heterogeneity, whose stochastic nature and extent are revealed by metallographic study and Monte Carlo modelling. The integrated computational materials science tools are focused on analysis of the multiple-factor system of casting processes and prediction of casting microstructure.
Computed tomographic findings of trichuriasis
Tokmak, Naime; Koc, Zafer; Ulusan, Serife; Koltas, Ismail Soner; Bal, Nebil
2006-01-01
In this report, we present computed tomographic findings of colonic trichuriasis. The patient was a 75-year-old man who complained of abdominal pain, and weight loss. Diagnosis was achieved by colonoscopic biopsy. Abdominal computed tomography showed irregular and nodular thickening of the wall of the cecum and ascending colon. Although these findings are nonspecific, they may be one of the findings of trichuriasis. These findings, confirmed by pathologic analysis of the biopsied tissue and Kato-Katz parasitological stool flotation technique, revealed adult Trichuris. To our knowledge, this is the first report of colonic trichuriasis indicated by computed tomography. PMID:16830393
Wheeze sound analysis using computer-based techniques: a systematic review.
Ghulam Nabi, Fizza; Sundaraj, Kenneth; Chee Kiang, Lam; Palaniappan, Rajkumar; Sundaraj, Sebastian
2017-10-31
Wheezes are high-pitched continuous respiratory acoustic sounds which are produced as a result of airway obstruction. Computer-based analyses of wheeze signals have been extensively used for parametric analysis, spectral analysis, identification of airway obstruction, feature extraction and disease or pathology classification. While this area is currently an active field of research, the available literature has not yet been reviewed. This systematic review identified articles describing wheeze analyses using computer-based techniques in the SCOPUS, IEEE Xplore, ACM, PubMed, Springer and Elsevier electronic databases. After a set of selection criteria was applied, 41 articles were selected for detailed analysis. The findings reveal that 1) computerized wheeze analysis can be used for the identification of disease severity level or pathology, 2) further research is required to achieve acceptable rates of identification on the degree of airway obstruction with normal breathing, and 3) analysis using combinations of features and on subgroups of the respiratory cycle has provided a pathway to classify various diseases or pathology that stem from airway obstruction.
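A hedged sketch of one common computer-based step in this literature, spectral analysis, rather than any specific reviewed pipeline: a synthetic sustained tone standing in for a wheeze produces a narrowband ridge in the spectrogram, which a simple peakiness score can flag. The signal, frequency band, and threshold are all invented.

```python
import numpy as np
from scipy import signal

fs = 4000
t = np.arange(0, 3.0, 1.0 / fs)
breath = 0.5 * np.random.default_rng(1).normal(size=t.size)           # breath-like noise
wheeze = 0.8 * np.sin(2 * np.pi * 400 * t) * ((t > 1.0) & (t < 2.0))  # sustained 400 Hz tone
x = breath + wheeze

f, frames, Sxx = signal.spectrogram(x, fs=fs, nperseg=256, noverlap=128)
band = (f > 100) & (f < 1000)
peakiness = Sxx[band].max(axis=0) / Sxx[band].mean(axis=0)   # narrowband ridge score per frame
print("frames flagged as wheeze-like:", int(np.sum(peakiness > 10)))  # threshold is arbitrary
```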
An empirical analysis of the distribution of overshoots in a stationary Gaussian stochastic process
NASA Technical Reports Server (NTRS)
Carter, M. C.; Madison, M. W.
1973-01-01
The frequency distribution of overshoots in a stationary Gaussian stochastic process is analyzed. The primary processes involved in this analysis are computer simulation and statistical estimation. Computer simulation is used to simulate stationary Gaussian stochastic processes that have selected autocorrelation functions. An analysis of the simulation results reveals a frequency distribution for overshoots with a functional dependence on the mean and variance of the process. Statistical estimation is then used to estimate the mean and variance of a process. It is shown that, given an autocorrelation function and the mean and variance for the number of overshoots, a frequency distribution for overshoots can be estimated.
Exploring Classroom Interaction with Dynamic Social Network Analysis
ERIC Educational Resources Information Center
Bokhove, Christian
2018-01-01
This article reports on an exploratory project in which technology and dynamic social network analysis (SNA) are used for modelling classroom interaction. SNA focuses on the links between social actors, draws on graphic imagery to reveal and display the patterning of those links, and develops mathematical and computational models to describe and…
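A minimal static snapshot of the kind of interaction network the article models (the names and links are invented, and the study's dynamic analysis goes further): nodes are class members, edges are observed exchanges, and a centrality measure indicates who anchors the discussion.

```python
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("teacher", "pupil_a"), ("teacher", "pupil_b"),
    ("pupil_a", "pupil_b"), ("pupil_b", "pupil_c"),
])
print(nx.degree_centrality(G))   # who anchors the observed exchanges
```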
Computational Analysis of Stresses Acting on Intermodular Junctions in Thoracic Aortic Endografts
Prasad, Anamika; To, Lillian K.; Gorrepati, Madhu L.; Zarins, Christopher K.; Figueroa, C. Alberto
2011-01-01
Purpose: To evaluate the biomechanical and hemodynamic forces acting on the intermodular junctions of a multi-component thoracic endograft and elucidate their influence on the development of type III endoleak due to disconnection of stent-graft segments. Methods: Three-dimensional computer models of the thoracic aorta and a 4-component thoracic endograft were constructed using postoperative (baseline) and follow-up computed tomography (CT) data from a 69-year-old patient who developed type III endoleak 4 years after stent-graft placement. Computational fluid dynamics (CFD) techniques were used to quantitate the displacement forces acting on the device. The contact stresses between the different modules of the graft were then quantified using computational solid mechanics (CSM) techniques. Lastly, the intermodular junction frictional stability was evaluated using a Coulomb model. Results: The CFD analysis revealed that curvature and length are key determinants of the displacement forces experienced by each endograft and that the first 2 modules were exposed to displacement forces acting in opposite directions in both the lateral and longitudinal axes. The CSM analysis revealed that the highest concentration of stresses occurred at the junction between the first and second modules of the device. Furthermore, the frictional analysis demonstrated that most of the surface area (53%) of this junction had unstable contact. The predicted critical zone of intermodular stress concentration and frictional instability matched the location of the type III endoleak observed in the 4-year follow-up CT image. Conclusion: The region of larger intermodular stresses and highest frictional instability correlated with the zone where a type III endoleak developed 4 years after thoracic stent-graft placement. Computational techniques can be helpful in evaluating the risk of endograft migration and potential for modular disconnection and may be useful in improving device placement strategies and endograft design. PMID:21861748
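A toy Coulomb-friction check in the spirit of the junction stability analysis above; the friction coefficient and contact stresses are invented for illustration and are not taken from the study.

```python
mu = 0.3                      # assumed friction coefficient between graft modules
contact_patches = [           # (normal stress, tangential stress) in kPa, invented values
    (40.0, 8.0),
    (25.0, 12.0),
    (10.0, 9.0),
]
for sigma_n, tau in contact_patches:
    stable = tau <= mu * sigma_n          # Coulomb condition for no slip
    print(f"normal={sigma_n:5.1f} kPa  shear={tau:4.1f} kPa -> {'stable' if stable else 'slipping'}")
```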
Prediction of ball and roller bearing thermal and kinematic performance by computer analysis
NASA Technical Reports Server (NTRS)
Pirvics, J.; Kleckner, R. J.
1983-01-01
Characteristics of good computerized analysis software are suggested. These general remarks and an overview of representative software precede a more detailed discussion of load support system analysis program structure. Particular attention is directed at a recent cylindrical roller bearing analysis as an example of the available design tools. Selected software modules are then examined to reveal the detail inherent in contemporary analysis. This leads to a brief section on current design computation which seeks to suggest when and why computerized analysis is warranted. An example concludes the argument offered for such design methodology. Finally, remarks are made concerning needs for model development to address effects which are now considered to be secondary but are anticipated to emerge to primary status in the near future.
Research Education in Undergraduate Occupational Therapy Programs.
ERIC Educational Resources Information Center
Petersen, Paul; And Others
1992-01-01
Of 63 undergraduate occupational therapy programs surveyed, the 38 responses revealed some common areas covered: elementary descriptive statistics, validity, reliability, and measurement. Areas underrepresented include statistical analysis with or without computers, research design, and advanced statistics. (SK)
A combined computational-experimental analyses of selected metabolic enzymes in Pseudomonas species.
Perumal, Deepak; Lim, Chu Sing; Chow, Vincent T K; Sakharkar, Kishore R; Sakharkar, Meena K
2008-09-10
Comparative genomic analysis has revolutionized our ability to predict the metabolic subsystems that occur in newly sequenced genomes, and to explore the functional roles of the set of genes within each subsystem. These computational predictions can considerably reduce the volume of experimental studies required to assess basic metabolic properties of multiple bacterial species. However, experimental validations are still required to resolve the apparent inconsistencies in the predictions by multiple resources. Here, we present combined computational-experimental analyses on eight completely sequenced Pseudomonas species. Comparative pathway analyses reveal that several pathways within the Pseudomonas species show high plasticity and versatility. Potential bypasses in 11 metabolic pathways were identified. We further confirmed the presence of the enzyme O-acetyl homoserine (thiol) lyase (EC: 2.5.1.49) in P. syringae pv. tomato that revealed inconsistent annotations in KEGG and in the recently published SYSTOMONAS database. These analyses connect and integrate systematic data generation, computational data interpretation, and experimental validation and represent a synergistic and powerful means for conducting biological research.
Ali, H Raza; Dariush, Aliakbar; Provenzano, Elena; Bardwell, Helen; Abraham, Jean E; Iddawela, Mahesh; Vallier, Anne-Laure; Hiller, Louise; Dunn, Janet A; Bowden, Sarah J; Hickish, Tamas; McAdam, Karen; Houston, Stephen; Irwin, Mike J; Pharoah, Paul D P; Brenton, James D; Walton, Nicholas A; Earl, Helena M; Caldas, Carlos
2016-02-16
There is a need to improve prediction of response to chemotherapy in breast cancer in order to improve clinical management and this may be achieved by harnessing computational metrics of tissue pathology. We investigated the association between quantitative image metrics derived from computational analysis of digital pathology slides and response to chemotherapy in women with breast cancer who received neoadjuvant chemotherapy. We digitised tissue sections of both diagnostic and surgical samples of breast tumours from 768 patients enrolled in the Neo-tAnGo randomized controlled trial. We subjected digital images to systematic analysis optimised for detection of single cells. Machine-learning methods were used to classify cells as cancer, stromal or lymphocyte and we computed estimates of absolute numbers, relative fractions and cell densities using these data. Pathological complete response (pCR), a histological indicator of chemotherapy response, was the primary endpoint. Fifteen image metrics were tested for their association with pCR using univariate and multivariate logistic regression. Median lymphocyte density proved most strongly associated with pCR on univariate analysis (OR 4.46, 95 % CI 2.34-8.50, p < 0.0001; observations = 614) and on multivariate analysis (OR 2.42, 95 % CI 1.08-5.40, p = 0.03; observations = 406) after adjustment for clinical factors. Further exploratory analyses revealed that in approximately one quarter of cases there was an increase in lymphocyte density in the tumour removed at surgery compared to diagnostic biopsies. A reduction in lymphocyte density at surgery was strongly associated with pCR (OR 0.28, 95 % CI 0.17-0.47, p < 0.0001; observations = 553). A data-driven analysis of computational pathology reveals lymphocyte density as an independent predictor of pCR. Paradoxically an increase in lymphocyte density, following exposure to chemotherapy, is associated with a lack of pCR. Computational pathology can provide objective, quantitative and reproducible tissue metrics and represents a viable means of outcome prediction in breast cancer. ClinicalTrials.gov NCT00070278 ; 03/10/2003.
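A sketch of the kind of univariate logistic model used above, on synthetic data rather than the Neo-tAnGo cohort: the odds ratio for a predictor is the exponential of its fitted coefficient, with a confidence interval obtained the same way.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 400
lymph_density = rng.normal(size=n)                      # standardised synthetic predictor
p = 1 / (1 + np.exp(-(-1.0 + 1.5 * lymph_density)))     # true log-odds slope of 1.5
pcr = rng.binomial(1, p)                                # synthetic binary response outcome

X = sm.add_constant(pd.DataFrame({"lymph_density": lymph_density}))
fit = sm.Logit(pcr, X).fit(disp=0)
print("odds ratio:", np.exp(fit.params["lymph_density"]))
print("95% CI:", np.exp(fit.conf_int().loc["lymph_density"].values))
```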
On Lying and Being Lied to: A Linguistic Analysis of Deception in Computer-Mediated Communication
ERIC Educational Resources Information Center
Hancock, Jeffrey T.; Curry, Lauren E.; Goorha, Saurabh; Woodworth, Michael
2008-01-01
This study investigated changes in both the liar's and the conversational partner's linguistic style across truthful and deceptive dyadic communication in a synchronous text-based setting. An analysis of 242 transcripts revealed that liars produced more words, more sense-based words (e.g., seeing, touching), and used fewer self-oriented but more…
Users' Perceptions of the Web As Revealed by Transaction Log Analysis.
ERIC Educational Resources Information Center
Moukdad, Haidar; Large, Andrew
2001-01-01
Describes the results of a transaction log analysis of a Web search engine, WebCrawler, to analyze user's queries for information retrieval. Results suggest most users do not employ advanced search features, and the linguistic structure often resembles a human-human communication model that is not always successful in human-computer communication.…
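A minimal transaction-log sketch in the spirit of the study above (the log lines and operator list are invented): the fraction of queries using advanced search syntax is counted directly from the parsed query strings.

```python
log = [
    "1998-03-01 09:12:01 query=titanic pictures",
    "1998-03-01 09:12:44 query=+titanic -movie",
    "1998-03-01 09:13:10 query=weather in montreal",
]
advanced = ("+", "-", '"', " AND ", " OR ", " NOT ")     # illustrative operator list
queries = [line.split("query=", 1)[1] for line in log]
n_adv = sum(any(op in q for op in advanced) for q in queries)
print(f"{n_adv}/{len(queries)} queries used advanced syntax")
```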
A Quantitative Features Analysis of Recommended No- and Low-Cost Preschool E-Books
ERIC Educational Resources Information Center
Parette, Howard P.; Blum, Craig; Luthin, Katie
2015-01-01
In recent years, recommended e-books have drawn increasing attention from early childhood education professionals. This study applied a quantitative descriptive features analysis of cost (n = 70) and no-cost (n = 60) e-books recommended by the Texas Computer Education Association. While t tests revealed no statistically significant differences…
Ravichandran, Srikanth; Michelucci, Alessandro; del Sol, Antonio
2018-01-01
Alzheimer's disease (AD) is a major neurodegenerative disease and is one of the most common cause of dementia in older adults. Among several factors, neuroinflammation is known to play a critical role in the pathogenesis of chronic neurodegenerative diseases. In particular, studies of brains affected by AD show a clear involvement of several inflammatory pathways. Furthermore, depending on the brain regions affected by the disease, the nature and the effect of inflammation can vary. Here, in order to shed more light on distinct and common features of inflammation in different brain regions affected by AD, we employed a computational approach to analyze gene expression data of six site-specific neuronal populations from AD patients. Our network based computational approach is driven by the concept that a sustained inflammatory environment could result in neurotoxicity leading to the disease. Thus, our method aims to infer intracellular signaling pathways/networks that are likely to be constantly activated or inhibited due to persistent inflammatory conditions. The computational analysis identified several inflammatory mediators, such as tumor necrosis factor alpha (TNF-a)-associated pathway, as key upstream receptors/ligands that are likely to transmit sustained inflammatory signals. Further, the analysis revealed that several inflammatory mediators were mainly region specific with few commonalities across different brain regions. Taken together, our results show that our integrative approach aids identification of inflammation-related signaling pathways that could be responsible for the onset or the progression of AD and can be applied to study other neurodegenerative diseases. Furthermore, such computational approaches can enable the translation of clinical omics data toward the development of novel therapeutic strategies for neurodegenerative diseases. PMID:29551980
The Exponential Expansion of Simulation in Research
2012-12-01
exponential growth of computing power. Although other analytic approaches also benefit from this trend, keyword searches of several scholarly search engines reveal that the reliance on simulation is increasing more rapidly. A descriptive analysis paints a compelling picture: simulation is frequently
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crockett, D.P.; Smith, W.K.; Proshansky, E.
1989-10-08
We report on computer-assisted three-dimensional reconstruction of spinal cord activity associated with stimulation of the plantar cushion (PC) as revealed by (14C)-2-deoxy-D-glucose (2-DG) serial autoradiographs. Moderate PC stimulation in cats elicits a reflex phasic plantar flexion of the toes. Four cats were chronically spinalized at about T6 under barbiturate anesthesia. Four to 11 days later, the cats were injected (i.v.) with 2-DG (100 microCi/kg) and the PC was electrically stimulated with needle electrodes at 2-5 times threshold for eliciting a reflex. Following stimulation, the spinal cord was processed for autoradiography. Subsequently, autoradiographs, representing approximately 8-18 mm from spinal segments L6-S1, were digitized for computer analysis and 3-D reconstruction. Several strategies of analysis were employed: (1) Three-dimensional volume images were color-coded to represent different levels of functional activity. (2) On the reconstructed volumes, virtual sections were made in the horizontal, sagittal, and transverse planes to view regions of 2-DG activity. (3) In addition, we were able to sample different regions within the grey and white matter semi-quantitatively (i.e., pixel intensity) from section to section to reveal differences between ipsi- and contralateral activity, as well as possible variation between sections. These analyses revealed 2-DG activity associated with moderate PC stimulation, not only in the ipsilateral dorsal horn as we had previously demonstrated, but also in both the ipsilateral and contralateral ventral horns, as well as in the intermediate grey matter. The use of novel computer analysis techniques--combined with an unanesthetized preparation--enabled us to demonstrate that the increased metabolic activity in the lumbosacral spinal cord associated with PC stimulation was much more extensive than had heretofore been observed.
Bhattacharjee, Kaushik; Banerjee, Subhro; Joshi, Santa Ram
2012-01-01
Isolation and characterization of actinomycetes from soil samples from altitudinal gradient of North-East India were investigated for computational RNomics based phylogeny. A total of 52 diverse isolates of Streptomyces from the soil samples were isolated on four different media and from these 6 isolates were selected on the basis of cultural characteristics, microscopic and biochemical studies. Sequencing of 16S rDNA of the selected isolates identified them to belong to six different species of Streptomyces. The molecular morphometric and physico-kinetic analysis of 16S rRNA sequences were performed to predict the diversity of the genus. The computational RNomics study revealed the significance of the structural RNA based phylogenetic analysis in a relatively diverse group of Streptomyces. PMID:22829729
Challenges in reducing the computational time of QSTS simulations for distribution system analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deboever, Jeremiah; Zhang, Xiaochen; Reno, Matthew J.
The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers a computational time of 10 to 120 hours when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.
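A back-of-the-envelope check of the burden described above, assuming a hypothetical per-solution time (real feeders and solvers vary widely):

```python
steps = 365 * 24 * 3600          # 31,536,000 sequential power flows in a yearlong 1-s run
seconds_per_solve = 0.005        # assumed time per unbalanced power-flow solution
print(steps, "power flows ->", steps * seconds_per_solve / 3600, "hours")   # ~43.8 hours
```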
1989-09-01
technologies during the days of the Industrial Revolution. [8:229] Those words, while almost five years old, still hold true as computer-based...Within the seeds of this revolution in technology lies potential for change as potent and ubiquitous as that brought about by changes in manufacturing...with the largest increases coming from the microcomputer industry. A set of companies recently studied revealed growth rates from 30 to 100 percent
Soriano, Elena; Marco-Contelles, José; Colmena, Inés; Gandía, Luis
2010-05-01
One of the most critical issues in the study of ligand-receptor interactions in drug design is knowledge of the bioactive conformation of the ligand. In this study, we describe a computational approach aimed at estimating the ability of epibatidine analogs to interact with the neuronal nicotinic acetylcholine receptor (nAChR) and at gaining insight into the bioactive conformation. The protocol followed consists of a docking analysis and evaluation of pharmacophore parameters of the docked structures. On the basis of the biological data, the results have revealed that the docking analysis is able to predict active ligands, whereas further efforts are needed to develop a suitable and solid pharmacophore model.
Error Estimates of the Ares I Computed Turbulent Ascent Longitudinal Aerodynamic Analysis
NASA Technical Reports Server (NTRS)
Abdol-Hamid, Khaled S.; Ghaffari, Farhad
2012-01-01
Numerical predictions of the longitudinal aerodynamic characteristics for the Ares I class of vehicles, along with the associated error estimate derived from an iterative convergence grid refinement, are presented. Computational results are based on an unstructured grid, Reynolds-averaged Navier-Stokes analysis. The validity of the approach to compute the associated error estimates, derived from a base grid to an extrapolated infinite-size grid, was first demonstrated on a sub-scaled wind tunnel model at representative ascent flow conditions for which the experimental data existed. Such analysis at the transonic flow conditions revealed a maximum deviation of about 23% between the computed longitudinal aerodynamic coefficients with the base grid and the measured data across the entire range of roll angles. This maximum deviation from the wind tunnel data was associated with the computed normal force coefficient at the transonic flow condition and was reduced to approximately 16% based on the infinite-size grid. However, all the computed aerodynamic coefficients with the base grid at the supersonic flow conditions showed a maximum deviation of only about 8% with that level being improved to approximately 5% for the infinite-size grid. The results and the error estimates based on the established procedure are also presented for the flight flow conditions.
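The grid-refinement error estimate described above is, in essence, a Richardson extrapolation from a sequence of grids to an infinite-resolution limit. The sketch below illustrates the arithmetic with made-up coefficient values and a refinement ratio of 2; it is not the Ares I data or necessarily the exact procedure used in the paper.

```python
import math

# Hypothetical normal-force coefficients from three systematically refined grids
# (coarse, medium, fine). Values are placeholders, not Ares I results.
f_coarse, f_medium, f_fine = 1.30, 1.24, 1.21
r = 2.0                                    # grid refinement ratio

# Observed order of convergence, then extrapolation to an "infinite-size" grid.
p = math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)
f_infinite = f_fine + (f_fine - f_medium) / (r**p - 1.0)

print(f"observed order p = {p:.2f}")
print(f"extrapolated coefficient = {f_infinite:.4f}")
print(f"estimated error of the fine grid = {abs(f_fine - f_infinite):.4f}")
```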
Stylistic Variations in Science Lectures: Teaching Vocabulary.
ERIC Educational Resources Information Center
Jackson, Jane; Bilton, Linda
1994-01-01
Twenty lectures by native speaker geology lecturers to nonnative speaker students were transcribed, and 921 instances of vocabulary elaboration were coded into a computer database according to 20 linguistic features. Analysis revealed noticeable variation among lecturers in language range/technicality, vocabulary elaboration, signalling, and use…
ERIC Educational Resources Information Center
Radok, Uwe
1985-01-01
The International Antarctic Glaciological Project has collected information on the East Antarctic ice sheet since 1969. Analysis of ice cores revealed climatic history, and radar soundings helped map bedrock of the continent. Computer models of the ice sheet and its changes over time will aid in predicting the future. (DH)
Kwon, Andrew T.; Chou, Alice Yi; Arenillas, David J.; Wasserman, Wyeth W.
2011-01-01
We performed a genome-wide scan for muscle-specific cis-regulatory modules (CRMs) using three computational prediction programs. Based on the predictions, 339 candidate CRMs were tested in cell culture with NIH3T3 fibroblasts and C2C12 myoblasts for capacity to direct selective reporter gene expression to differentiated C2C12 myotubes. A subset of 19 CRMs validated as functional in the assay. The rate of predictive success reveals striking limitations of computational regulatory sequence analysis methods for CRM discovery. Motif-based methods performed no better than predictions based only on sequence conservation. Analysis of the properties of the functional sequences relative to inactive sequences identifies nucleotide sequence composition as an important characteristic to incorporate in future methods for improved predictive specificity. Muscle-related TFBSs predicted within the functional sequences display greater sequence conservation than non-TFBS flanking regions. Comparison with recent MyoD and histone modification ChIP-Seq data supports the validity of the functional regions. PMID:22144875
Language-Agnostic Reproducible Data Analysis Using Literate Programming.
Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa
2016-01-01
A modern biomedical research project can easily contain hundreds of analysis steps and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators to the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir.
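The central mechanic of literate programming, extracting the code the computer runs from a document written for human readers, can be sketched in a few lines. The chunk markers below are borrowed from the classic noweb style purely for illustration; they are not Lir's actual syntax.

```python
import re

# Generic illustration of "tangling": pull code chunks out of a literate document
# so the computer can execute them. The <<...>>= / @ markers follow the classic
# noweb convention and are used here only as an example format, not Lir's.
literate_source = """
We first load the measurements and keep samples above the threshold.

<<select>>=
values = [0.2, 1.5, 3.1]
selected = [v for v in values if v > 1.0]
@

The selected samples are then summarized.

<<summarize>>=
print(sum(selected) / len(selected))
@
"""

chunks = re.findall(r"<<[^>]+>>=\n(.*?)\n@", literate_source, flags=re.DOTALL)
program = "\n".join(chunks)   # the code the computer actually runs
exec(program)                 # prints 2.3
```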
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hasenkamp, Daren; Sim, Alexander; Wehner, Michael
Extensive computing power has been used to tackle issues such as climate change, fusion energy, and other pressing scientific challenges. These computations produce a tremendous amount of data; however, many of the data analysis programs currently run on only a single processor. In this work, we explore the possibility of using the emerging cloud computing platform to parallelize such sequential data analysis tasks. As a proof of concept, we wrap a program for analyzing trends of tropical cyclones in a set of virtual machines (VMs). This approach allows the user to keep their familiar data analysis environment in the VMs, while we provide the coordination and data transfer services to ensure the necessary input and output are directed to the desired locations. This work extensively exercises the networking capability of the cloud computing systems and has revealed a number of weaknesses in the current cloud system software. In our tests, we are able to scale the parallel data analysis job to a modest number of VMs and achieve a speedup that is comparable to running the same analysis task using MPI. However, compared to MPI-based parallelization, the cloud-based approach has a number of advantages. The cloud-based approach is more flexible because the VMs can capture arbitrary software dependencies without requiring the user to rewrite their programs. The cloud-based approach is also more resilient to failure; as long as a single VM is running it can make progress, whereas the whole MPI-based analysis job fails as soon as one MPI node fails. In short, this initial work demonstrates that a cloud computing system is a viable platform for distributed scientific data analyses traditionally conducted on dedicated supercomputing systems.
The Exponential Expansion of Simulation: How Simulation has Grown as a Research Tool
2012-09-01
exponential growth of computing power. Although other analytic approaches also benefit from this trend, keyword searches of several scholarly search engines reveal that the reliance on simulation is increasing more rapidly. A descriptive analysis paints a compelling picture: simulation is frequently
NASA Astrophysics Data System (ADS)
Gerjuoy, Edward
2005-06-01
The security of messages encoded via the widely used RSA public key encryption system rests on the enormous computational effort required to find the prime factors of a large number N using classical (conventional) computers. In 1994 Peter Shor showed that for sufficiently large N, a quantum computer could perform the factoring with much less computational effort. This paper endeavors to explain, in a fashion comprehensible to the nonexpert, the RSA encryption protocol; the various quantum computer manipulations constituting the Shor algorithm; how the Shor algorithm performs the factoring; and the precise sense in which a quantum computer employing Shor's algorithm can be said to accomplish the factoring of very large numbers with less computational effort than a classical computer. It is made apparent that factoring N generally requires many successive runs of the algorithm. Our analysis reveals that the probability of achieving a successful factorization on a single run is about twice as large as commonly quoted in the literature.
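The classical scaffolding around Shor's algorithm, turning a measured period into factors of N and repeating on failure, can be sketched as follows. The quantum period-finding step is replaced here by brute force, and the toy modulus is far too small to say anything about real RSA keys; this is an illustration of the post-processing logic, not a quantum implementation.

```python
from math import gcd
from random import randrange

def order(a, n):
    """Brute-force the multiplicative order r of a mod n (the quantum computer's job)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n):
    """Classical pre/post-processing around period finding; may need several runs."""
    while True:
        a = randrange(2, n)
        d = gcd(a, n)
        if d > 1:                       # lucky guess already shares a factor with n
            return d, n // d
        r = order(a, n)                 # the quantum step in the real algorithm
        if r % 2 == 0:
            y = pow(a, r // 2, n)
            if y != n - 1:              # avoid the trivial square root of 1
                p = gcd(y - 1, n)
                if 1 < p < n:
                    return p, n // p
        # otherwise this run failed; repeat with a fresh random a

print(shor_classical(15))  # e.g. (3, 5) -- a toy RSA-style modulus
```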
In vitro protease cleavage and computer simulations reveal the HIV-1 capsid maturation pathway
NASA Astrophysics Data System (ADS)
Ning, Jiying; Erdemci-Tandogan, Gonca; Yufenyuy, Ernest L.; Wagner, Jef; Himes, Benjamin A.; Zhao, Gongpu; Aiken, Christopher; Zandi, Roya; Zhang, Peijun
2016-12-01
HIV-1 virions assemble as immature particles containing Gag polyproteins that are processed by the viral protease into individual components, resulting in the formation of mature infectious particles. There are two competing models for the process of forming the mature HIV-1 core: the disassembly and de novo reassembly model and the non-diffusional displacive model. To study the maturation pathway, we simulate HIV-1 maturation in vitro by digesting immature particles and assembled virus-like particles with recombinant HIV-1 protease and monitor the process with biochemical assays and cryoEM structural analysis in parallel. Processing of Gag in vitro is accurate and efficient and results in both soluble capsid protein and conical or tubular capsid assemblies, seemingly converted from immature Gag particles. Computer simulations further reveal probable assembly pathways of HIV-1 capsid formation. Combining the experimental data and computer simulations, our results suggest a sequential combination of both displacive and disassembly/reassembly processes for HIV-1 maturation.
Revealing the vectors of cellular identity with single-cell genomics
Wagner, Allon; Regev, Aviv; Yosef, Nir
2017-01-01
Single-cell genomics has now made it possible to create a comprehensive atlas of human cells. At the same time, it has reopened definitions of a cell’s identity and type and of the ways in which they are regulated by the cell’s molecular circuitry. Emerging computational analysis methods, especially in single-cell RNA sequencing (scRNA-seq), have already begun to reveal, in a data-driven way, the diverse simultaneous facets of a cell’s identity, from a taxonomy of discrete cell types to continuous dynamic transitions and spatial locations. These developments will eventually allow a cell to be represented as a superposition of ‘basis vectors’, each determining a different (but possibly dependent) aspect of cellular organization and function. However, computational methods must also overcome considerable challenges—from handling technical noise and data scale to forming new abstractions of biology. As the scale of single-cell experiments continues to increase, new computational approaches will be essential for constructing and characterizing a reference map of cell identities. PMID:27824854
Larsen, V H; Waldau, T; Gravesen, H; Siggaard-Andersen, O
1996-01-01
To describe a clinical case where an extremely low erythrocyte 2,3-diphosphoglycerate concentration (2,3-DPG) was discovered by routine blood gas analysis supplemented by computer calculation of derived quantities. The finding of a low 2,3-DPG revealed a severe hypophosphatemia. Open uncontrolled study of a patient case. Intensive care observation during 41 days. A 44-year-old woman with an abdominal abscess. Surgical drainage, antibiotics and parenteral nutrition. Daily routine blood gas analyses with computer calculation of the hemoglobin oxygen affinity and estimation of the 2,3-DPG. An abrupt decline of 2,3-DPG was observed late in the course coincident with a pronounced hypophosphatemia. The fall in 2,3-DPG was verified by enzymatic analysis. 2,3-DPG may be estimated by computer calculation of routine blood gas data. A low 2,3-DPG, which may be associated with hypophosphatemia, causes an unfavorable increase in hemoglobin oxygen affinity, which reduces oxygen release to the tissues.
Toward Theory-Based Instruction in Scientific Problem Solving.
ERIC Educational Resources Information Center
Heller, Joan I.; And Others
Several empirical and theoretical analyses related to scientific problem-solving are reviewed, including: detailed studies of individuals at different levels of expertise, and computer models simulating some aspects of human information processing during problem solving. Analysis of these studies has revealed many facets about the nature of the…
ERIC Educational Resources Information Center
Kou, Xiaojing
2011-01-01
Various formats of online discussion have proven valuable for enhancing learning and collaboration in distance and blended learning contexts. However, despite their capacity to reveal essential processes in collaborative inquiry, current mainstream analytical frameworks, such as the cognitive presence framework (Garrison, Anderson, & Archer,…
USDA-ARS?s Scientific Manuscript database
The non-culturable bacterium ‘Candidatus Liberibacter solanacearum’ (Lso) is the causative agent of zebra chip disease in potato. Computational analysis of the Lso genome revealed a serralysin-like gene based on conserved domains characteristic of genes encoding metalloprotease enzymes similar to se...
Inferential Procedures for Correlation Coefficients Corrected for Attenuation.
ERIC Educational Resources Information Center
Hakstian, A. Ralph; And Others
1988-01-01
A model and computation procedure based on classical test score theory are presented for determination of a correlation coefficient corrected for attenuation due to unreliability. Delta and Monte Carlo method applications are discussed. A power analysis revealed no serious loss in efficiency resulting from correction for attenuation. (TJH)
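The classical correction for attenuation divides the observed correlation by the square root of the product of the two reliabilities; a minimal sketch with made-up numbers is shown below.

```python
def correct_for_attenuation(r_xy, r_xx, r_yy):
    """Classical disattenuation: r_corrected = r_xy / sqrt(r_xx * r_yy)."""
    return r_xy / (r_xx * r_yy) ** 0.5

# Hypothetical values: observed correlation 0.42, reliabilities 0.80 and 0.70.
print(round(correct_for_attenuation(0.42, 0.80, 0.70), 3))  # 0.561
```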
The Dynamics of Information Search Services.
ERIC Educational Resources Information Center
Lindquist, Mats G.
Computer-based information search services (ISSs) of the type that provide online literature searches are analyzed from a systems viewpoint using a continuous simulation model. The methodology applied is "system dynamics," and the system language is DYNAMO. The analysis reveals that the observed growth and stagnation of a typical ISS can…
Computer Review Can Cut HVAC Energy Use
ERIC Educational Resources Information Center
McClure, Charles J. R.
1974-01-01
A computerized review of construction bidding documents, usually done by a consulting engineer, can reveal how much money it will cost to operate various alternative types of HVAC equipment over a school's lifetime. The review should include a computerized load calculation, energy systems flow diagram, control system analysis, and a computerized…
Nadzirin, Nurul; Firdaus-Raih, Mohd
2012-10-08
Proteins of uncharacterized functions form a large part of many of the currently available biological databases and this situation exists even in the Protein Data Bank (PDB). Our analysis of recent PDB data revealed that only 42.53% of PDB entries (1084 coordinate files) that were categorized under "unknown function" are true examples of proteins of unknown function at this point in time. The remaining 1465 entries, also annotated as such, appear to be candidates for re-assessment of their annotations, based on the availability of direct functional characterization experiments for the protein itself, or for homologous sequences or structures, thus enabling computational function inference.
Ramsden, Helen L; Sürmeli, Gülşen; McDonagh, Steven G; Nolan, Matthew F
2015-01-01
Neural circuits in the medial entorhinal cortex (MEC) encode an animal's position and orientation in space. Within the MEC spatial representations, including grid and directional firing fields, have a laminar and dorsoventral organization that corresponds to a similar topography of neuronal connectivity and cellular properties. Yet, in part due to the challenges of integrating anatomical data at the resolution of cortical layers and borders, we know little about the molecular components underlying this organization. To address this we develop a new computational pipeline for high-throughput analysis and comparison of in situ hybridization (ISH) images at laminar resolution. We apply this pipeline to ISH data for over 16,000 genes in the Allen Brain Atlas and validate our analysis with RNA sequencing of MEC tissue from adult mice. We find that differential gene expression delineates the borders of the MEC with neighboring brain structures and reveals its laminar and dorsoventral organization. We propose a new molecular basis for distinguishing the deep layers of the MEC and show that their similarity to corresponding layers of neocortex is greater than that of superficial layers. Our analysis identifies ion channel-, cell adhesion- and synapse-related genes as candidates for functional differentiation of MEC layers and for encoding of spatial information at different scales along the dorsoventral axis of the MEC. We also reveal laminar organization of genes related to disease pathology and suggest that a high metabolic demand predisposes layer II to neurodegenerative pathology. In principle, our computational pipeline can be applied to high-throughput analysis of many forms of neuroanatomical data. Our results support the hypothesis that differences in gene expression contribute to functional specialization of superficial layers of the MEC and dorsoventral organization of the scale of spatial representations.
Dumesic, Phillip A.; Rosenblad, Magnus A.; Samuelsson, Tore; Nguyen, Tiffany; Moresco, James J.; Yates, John R.; Madhani, Hiten D.
2015-01-01
Despite conservation of the signal recognition particle (SRP) from bacteria to man, computational approaches have failed to identify SRP components from genomes of many lower eukaryotes, raising the possibility that they have been lost or altered in those lineages. We report purification and analysis of SRP in the human pathogen Cryptococcus neoformans, providing the first description of SRP in basidiomycetous yeast. The C. neoformans SRP RNA displays a predicted structure in which the universally conserved helix 8 contains an unprecedented stem-loop insertion. Guided by this sequence, we computationally identified 152 SRP RNAs throughout the phylum Basidiomycota. This analysis revealed additional helix 8 alterations including single and double stem-loop insertions as well as loop diminutions affecting RNA structural elements that are otherwise conserved from bacteria to man. Strikingly, these SRP RNA features in Basidiomycota are accompanied by phylum-specific alterations in the RNA-binding domain of Srp54, the SRP protein subunit that directly interacts with helix 8. Our findings reveal unexpected fungal SRP diversity and suggest coevolution of the two most conserved SRP features—SRP RNA helix 8 and Srp54—in basidiomycetes. Because members of this phylum include important human and plant pathogens, these noncanonical features provide new targets for antifungal compound development. PMID:26275773
Identification and addressing reduction-related misconceptions
NASA Astrophysics Data System (ADS)
Gal-Ezer, Judith; Trakhtenbrot, Mark
2016-07-01
Reduction is one of the key techniques used for problem-solving in computer science. In particular, in the theory of computation and complexity (TCC), mapping and polynomial reductions are used for analysis of decidability and computational complexity of problems, including the core concept of NP-completeness. Reduction is a highly abstract technique that involves revealing close non-trivial connections between problems that often seem to have nothing in common. As a result, proper understanding and application of reduction is a serious challenge for students and a source of numerous misconceptions. The main contribution of this paper is detection of such misconceptions, analysis of their roots, and proposing a way to address them in an undergraduate TCC course. Our observations suggest that the main source of the misconceptions is the false intuitive rule "the bigger is a set/problem, the harder it is to solve". Accordingly, we developed a series of exercises for proactive prevention of these misconceptions.
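A toy mapping reduction makes the misconception concrete: membership in the infinite set of even numbers reduces to membership in a one-element set, so the "size" of a problem's language says nothing about its difficulty. The sets and reduction function below are illustrative choices, not examples taken from the paper.

```python
# Mapping reduction A <=m B: a computable f with  x in A  <=>  f(x) in B.
# Here A = {n : n is even} (infinite) and B = {1} (a single element),
# showing that a "bigger" language need not be harder to decide.

def f(n):                 # the reduction function
    return 1 if n % 2 == 0 else 0

def decide_B(m):          # a decider for the "small" problem B = {1}
    return m == 1

def decide_A(n):          # decide A by reducing to B
    return decide_B(f(n))

assert all(decide_A(n) == (n % 2 == 0) for n in range(100))
print("reduction behaves correctly on 0..99")
```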
Tertiary structure-based analysis of microRNA–target interactions
Gan, Hin Hark; Gunsalus, Kristin C.
2013-01-01
Current computational analysis of microRNA interactions is based largely on primary and secondary structure analysis. Computationally efficient tertiary structure-based methods are needed to enable more realistic modeling of the molecular interactions underlying miRNA-mediated translational repression. We incorporate algorithms for predicting duplex RNA structures, ionic strength effects, duplex entropy and free energy, and docking of duplex–Argonaute protein complexes into a pipeline to model and predict miRNA–target duplex binding energies. To ensure modeling accuracy and computational efficiency, we use an all-atom description of RNA and a continuum description of ionic interactions using the Poisson–Boltzmann equation. Our method predicts the conformations of two constructs of Caenorhabditis elegans let-7 miRNA–target duplexes to an accuracy of ∼3.8 Å root mean square distance of their NMR structures. We also show that the computed duplex formation enthalpies, entropies, and free energies for eight miRNA–target duplexes agree with titration calorimetry data. Analysis of duplex–Argonaute docking shows that structural distortions arising from single-base-pair mismatches in the seed region influence the activity of the complex by destabilizing both duplex hybridization and its association with Argonaute. Collectively, these results demonstrate that tertiary structure-based modeling of miRNA interactions can reveal structural mechanisms not accessible with current secondary structure-based methods. PMID:23417009
NASA Technical Reports Server (NTRS)
Farhat, C.; Park, K. C.; Dubois-Pelerin, Y.
1991-01-01
An unconditionally stable second order accurate implicit-implicit staggered procedure for the finite element solution of fully coupled thermoelasticity transient problems is proposed. The procedure is stabilized with a semi-algebraic augmentation technique. A comparative cost analysis reveals the superiority of the proposed computational strategy to other conventional staggered procedures. Numerical examples of one and two-dimensional thermomechanical coupled problems demonstrate the accuracy of the proposed numerical solution algorithm.
Analysis of HRCT-derived xylem network reveals reverse flow in some vessels
USDA-ARS?s Scientific Manuscript database
Flow in xylem vessels is modeled based on constructions of three dimensional xylem networks derived from High Resolution Computed Tomography (HRCT) images of grapevine (Vitis vinifera) stems. Flow in 6-14% of the vessels was found to be oriented in the opposite direction to the bulk flow under norma...
An investigation of condition mapping and plot proportion calculation issues
Demetrios Gatziolis
2007-01-01
A systematic examination of Forest Inventory and Analysis condition data collected under the annual inventory protocol in the Pacific Northwest region between 2000 and 2004 revealed the presence of errors both in condition topology and plot proportion computations. When plots were compiled to generate population estimates, proportion errors were found to cause...
Farmers' Preferences for Methods of Receiving Information on New or Innovative Farming Practices.
ERIC Educational Resources Information Center
Riesenberg, Lou E.; Gor, Christopher Obel
1989-01-01
Survey of 386 Idaho farmers (response rate 58 percent) identified preferred methods of receiving information on new or innovative farming practices. Analysis revealed preference for interpersonal methods (demonstrations, tours, and field trips) over mass media such as computer-assisted instruction (CAI) and home study, although younger farmers,…
Interactions between Prefrontal Cortex and Cerebellum Revealed by Trace Eyelid Conditioning
ERIC Educational Resources Information Center
Kalmbach, Brian E.; Ohyama, Tatsuya; Kreider, Joy C.; Riusech, Frank; Mauk, Michael D.
2009-01-01
Eyelid conditioning has proven useful for analysis of learning and computation in the cerebellum. Two variants, delay and trace conditioning, differ only by the relative timing of the training stimuli. Despite the subtlety of this difference, trace eyelid conditioning is prevented by lesions of the cerebellum, hippocampus, or medial prefrontal…
Atmospheric simulation using a liquid crystal wavefront-controlling device
NASA Astrophysics Data System (ADS)
Brooks, Matthew R.; Goda, Matthew E.
2004-10-01
Test and evaluation of laser warning devices is important due to the increased use of laser devices in aerial applications. This research consists of an atmospheric aberrating system to enable in-lab testing of various detectors and sensors. This system employs laser light at 632.8nm from a Helium-Neon source and a spatial light modulator (SLM) to cause phase changes using a birefringent liquid crystal material. Measuring outgoing radiation from the SLM using a CCD targetboard and Shack-Hartmann wavefront sensor reveals an acceptable resemblance of system output to expected atmospheric theory. Over three turbulence scenarios, an error analysis reveals that turbulence data matches theory. A wave optics computer simulation is created analogous to the lab-bench design. Phase data, intensity data, and a computer simulation affirm lab-bench results so that the aberrating SLM system can be operated confidently.
Baresic, Mario; Salatino, Silvia; Kupr, Barbara
2014-01-01
Skeletal muscle tissue shows an extraordinary cellular plasticity, but the underlying molecular mechanisms are still poorly understood. Here, we use a combination of experimental and computational approaches to unravel the complex transcriptional network of muscle cell plasticity centered on the peroxisome proliferator-activated receptor γ coactivator 1α (PGC-1α), a regulatory nexus in endurance training adaptation. By integrating data on genome-wide binding of PGC-1α and gene expression upon PGC-1α overexpression with comprehensive computational prediction of transcription factor binding sites (TFBSs), we uncover a hitherto-underestimated number of transcription factor partners involved in mediating PGC-1α action. In particular, principal component analysis of TFBSs at PGC-1α binding regions predicts that, besides the well-known role of the estrogen-related receptor α (ERRα), the activator protein 1 complex (AP-1) plays a major role in regulating the PGC-1α-controlled gene program of the hypoxia response. Our findings thus reveal the complex transcriptional network of muscle cell plasticity controlled by PGC-1α. PMID:24912679
Bourke, Jason M; Witmer, Lawrence M
2016-12-01
We tested the aerodynamic function of nasal conchae in birds using CT data from an adult male wild turkey (Meleagris gallopavo) to construct 3D models of its nasal passage. A series of digital "turbinectomies" were performed on these models and computational fluid dynamic analyses were performed to simulate resting inspiration. Models with turbinates removed were compared to the original, unmodified control airway. Results revealed that the four conchae found in turkeys, along with the crista nasalis, alter the flow of inspired air in ways that can be considered baffle-like. However, these baffle-like functions were remarkably limited in their areal extent, indicating that avian conchae are more functionally independent than originally hypothesized. Our analysis revealed that the conchae of birds are efficient baffles that-along with potential heat and moisture transfer-serve to efficiently move air to specific regions of the nasal passage. This alternate function of conchae has implications for their evolution in birds and other amniotes. Copyright © 2016 Elsevier B.V. All rights reserved.
Computational analysis of an aortic valve jet
NASA Astrophysics Data System (ADS)
Shadden, Shawn C.; Astorino, Matteo; Gerbeau, Jean-Frédéric
2009-11-01
In this work we employ a coupled FSI scheme using an immersed boundary method to simulate flow through a realistic deformable, 3D aortic valve model. This data was used to compute Lagrangian coherent structures, which revealed flow separation from the valve leaflets during systole, and correspondingly, the boundary between the jet of ejected fluid and the regions of separated, recirculating flow. Advantages of computing LCS in multi-dimensional FSI models of the aortic valve are twofold. For one, the quality and effectiveness of existing clinical indices used to measure aortic jet size can be tested by taking advantage of the accurate measure of the jet area derived from LCS. Secondly, as an ultimate goal, a reliable computational framework for the assessment of the aortic valve stenosis could be developed.
Active medulloblastoma enhancers reveal subgroup-specific cellular origins
Lin, Charles Y.; Erkek, Serap; Tong, Yiai; Yin, Linlin; Federation, Alexander J.; Zapatka, Marc; Haldipur, Parthiv; Kawauchi, Daisuke; Risch, Thomas; Warnatz, Hans-Jörg; Worst, Barbara C.; Ju, Bensheng; Orr, Brent A.; Zeid, Rhamy; Polaski, Donald R.; Segura-Wang, Maia; Waszak, Sebastian M.; Jones, David T.W.; Kool, Marcel; Hovestadt, Volker; Buchhalter, Ivo; Sieber, Laura; Johann, Pascal; Chavez, Lukas; Gröschel, Stefan; Ryzhova, Marina; Korshunov, Andrey; Chen, Wenbiao; Chizhikov, Victor V.; Millen, Kathleen J.; Amstislavskiy, Vyacheslav; Lehrach, Hans; Yaspo, Marie-Laure; Eils, Roland; Lichter, Peter; Korbel, Jan O.; Pfister, Stefan M.; Bradner, James E.; Northcott, Paul A.
2016-01-01
Medulloblastoma is a highly malignant paediatric brain tumour, often inflicting devastating consequences on the developing child. Genomic studies have revealed four distinct molecular subgroups with divergent biology and clinical behaviour. An understanding of the regulatory circuitry governing the transcriptional landscapes of medulloblastoma subgroups, and how this relates to their respective developmental origins, is lacking. Using H3K27ac and BRD4 ChIP-Seq, coupled with tissue-matched DNA methylation and transcriptome data, we describe the active cis-regulatory landscape across 28 primary medulloblastoma specimens. Analysis of differentially regulated enhancers and super-enhancers reinforced inter-subgroup heterogeneity and revealed novel, clinically relevant insights into medulloblastoma biology. Computational reconstruction of core regulatory circuitry identified a master set of transcription factors, validated by ChIP-Seq, that are responsible for subgroup divergence and implicate candidate cells-of-origin for Group 4. Our integrated analysis of enhancer elements in a large series of primary tumour samples reveals insights into cis-regulatory architecture, unrecognized dependencies, and cellular origins. PMID:26814967
LiPISC: A Lightweight and Flexible Method for Privacy-Aware Intersection Set Computation.
Ren, Wei; Huang, Shiyong; Ren, Yi; Choo, Kim-Kwang Raymond
2016-01-01
Privacy-aware intersection set computation (PISC) can be modeled as secure multi-party computation. The basic idea is to compute the intersection of input sets without leaking privacy. Furthermore, PISC should be sufficiently flexible to recommend approximate intersection items. In this paper, we reveal two previously unpublished attacks against PISC, which can be used to reveal and link one input set to another input set, resulting in privacy leakage. We coin these as Set Linkage Attack and Set Reveal Attack. We then present a lightweight and flexible PISC scheme (LiPISC) and prove its security (including against Set Linkage Attack and Set Reveal Attack).
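For concreteness, the sketch below computes an intersection over salted hashes so that raw values are never exchanged. It only illustrates the task PISC addresses and is not the LiPISC construction; as the attacks discussed above suggest, a scheme this naive still lets a party who can guess candidate items enumerate or link the other party's inputs.

```python
import hashlib

def blind(items, salt):
    """Hash each item with a shared salt so raw values are not exchanged."""
    return {hashlib.sha256((salt + x).encode()).hexdigest() for x in items}

# Hypothetical input sets of two parties and a shared salt (illustration only;
# this offers no real privacy for low-entropy items, since an adversary can
# simply hash candidate values and compare).
salt = "shared-session-salt"
alice = {"alice@example.org", "bob@example.org", "carol@example.org"}
bob = {"bob@example.org", "dave@example.org"}

common_digests = blind(alice, salt) & blind(bob, salt)
# Each party can map the matching digests back to its own plaintext items:
intersection = {x for x in alice
                if hashlib.sha256((salt + x).encode()).hexdigest() in common_digests}
print(intersection)  # {'bob@example.org'}
```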
Bringing the CMS distributed computing system into scalable operations
NASA Astrophysics Data System (ADS)
Belforte, S.; Fanfani, A.; Fisk, I.; Flix, J.; Hernández, J. M.; Kress, T.; Letts, J.; Magini, N.; Miccio, V.; Sciabà, A.
2010-04-01
Establishing efficient and scalable operations of the CMS distributed computing system critically relies on the proper integration, commissioning and scale testing of the data and workload management tools, the various computing workflows and the underlying computing infrastructure, located at more than 50 computing centres worldwide and interconnected by the Worldwide LHC Computing Grid. Computing challenges periodically undertaken by CMS in the past years with increasing scale and complexity have revealed the need for a sustained effort on computing integration and commissioning activities. The Processing and Data Access (PADA) Task Force was established at the beginning of 2008 within the CMS Computing Program with the mandate of validating the infrastructure for organized processing and user analysis including the sites and the workload and data management tools, validating the distributed production system by performing functionality, reliability and scale tests, helping sites to commission, configure and optimize the networking and storage through scale testing data transfers and data processing, and improving the efficiency of accessing data across the CMS computing system from global transfers to local access. This contribution reports on the tools and procedures developed by CMS for computing commissioning and scale testing as well as the improvements accomplished towards efficient, reliable and scalable computing operations. The activities include the development and operation of load generators for job submission and data transfers with the aim of stressing the experiment and Grid data management and workload management systems, site commissioning procedures and tools to monitor and improve site availability and reliability, as well as activities targeted to the commissioning of the distributed production, user analysis and monitoring systems.
The semantic system is involved in mathematical problem solving.
Zhou, Xinlin; Li, Mengyi; Li, Leinian; Zhang, Yiyun; Cui, Jiaxin; Liu, Jie; Chen, Chuansheng
2018-02-01
Numerous studies have shown that the brain regions around bilateral intraparietal cortex are critical for number processing and arithmetical computation. However, the neural circuits for more advanced mathematics such as mathematical problem solving (with little routine arithmetical computation) remain unclear. Using functional magnetic resonance imaging (fMRI), this study (N = 24 undergraduate students) compared neural bases of mathematical problem solving (i.e., number series completion, mathematical word problem solving, and geometric problem solving) and arithmetical computation. Direct subject- and item-wise comparisons revealed that mathematical problem solving typically had greater activation than arithmetical computation in all 7 regions of the semantic system (which was based on a meta-analysis of 120 functional neuroimaging studies on semantic processing). Arithmetical computation typically had greater activation in the supplementary motor area and left precentral gyrus. The results suggest that the semantic system in the brain supports mathematical problem solving. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Titov, A. G.; Gordov, E. P.; Okladnikov, I.; Shulgina, T. M.
2011-12-01
Analysis of recent climatic and environmental changes in Siberia performed on the basis of the CLEARS (CLimate and Environment Analysis and Research System) information-computational system is presented. The system was developed using the specialized software framework for rapid development of thematic information-computational systems based on Web-GIS technologies. It comprises structured environmental datasets, computational kernel, specialized web portal implementing web mapping application logic, and graphical user interface. Functional capabilities of the system include a number of procedures for mathematical and statistical analysis, data processing and visualization. At present a number of georeferenced datasets is available for processing including two editions of NCEP/NCAR Reanalysis, JMA/CRIEPI JRA-25 Reanalysis, ECMWF ERA-40 and ERA Interim Reanalysis, meteorological observation data for the territory of the former USSR, and others. Firstly, using functionality of the computational kernel employing approved statistical methods it was shown that the most reliable spatio-temporal characteristics of surface temperature and precipitation in Siberia in the second half of 20th and beginning of 21st centuries are provided by ERA-40/ERA Interim Reanalysis and APHRODITE JMA Reanalysis, respectively. Namely those Reanalyses are statistically consistent with reliable in situ meteorological observations. Analysis of surface temperature and precipitation dynamics for the territory of Siberia performed on the base of the developed information-computational system reveals fine spatial and temporal details in heterogeneous patterns obtained for the region earlier. Dynamics of bioclimatic indices determining climate change impact on structure and functioning of regional vegetation cover was investigated as well. Analysis shows significant positive trends of growing season length accompanied by statistically significant increase of sum of growing degree days and total annual precipitation over the south of Western Siberia. In particular, we conclude that analysis of trends of growing season length, sum of growing degree-days and total precipitation during the growing season reveals a tendency to an increase of vegetation ecosystems productivity across the south of Western Siberia (55°-60°N, 59°-84°E) in the past several decades. The developed system functionality providing instruments for comparison of modeling and observational data and for reliable climatological analysis allowed us to obtain new results characterizing regional manifestations of global change. It should be added that each analysis performed using the system leads also to generation of the archive of spatio-temporal data fields ready for subsequent usage by other specialists. In particular, the archive of bioclimatic indices obtained will allow performing further detailed studies of interrelations between local climate and vegetation cover changes, including changes of carbon uptake related to variations of types and amount of vegetation and spatial shift of vegetation zones. This work is partially supported by RFBR grants #10-07-00547 and #11-05-01190-a, SB RAS Basic Program Projects 4.31.1.5 and 4.31.2.7.
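One of the bioclimatic indices mentioned above, the sum of growing degree-days, and a simple significance test for its trend can be sketched as follows. The daily temperatures, base temperature and trend magnitude are synthetic placeholders; this is not the CLEARS implementation.

```python
import numpy as np
from scipy import stats

def growing_degree_days(t_max, t_min, t_base=5.0):
    """Annual sum of growing degree-days: sum over days of max(0, daily mean - base)."""
    daily_mean = (np.asarray(t_max) + np.asarray(t_min)) / 2.0
    return np.maximum(daily_mean - t_base, 0.0).sum()

# One synthetic year of daily temperatures, just to exercise the function.
doy = np.linspace(0.0, 2.0 * np.pi, 365)
print(f"example annual GDD: {growing_degree_days(20 + 10 * np.sin(doy), 12 + 10 * np.sin(doy)):.0f}")

# Synthetic series of annual GDD sums for 1979-2008 with an imposed warming trend,
# tested for a statistically significant linear trend.
rng = np.random.default_rng(0)
years = np.arange(1979, 2009)
gdd = 1500.0 + 4.0 * (years - years[0]) + rng.normal(0.0, 40.0, years.size)

slope, intercept, r_value, p_value, std_err = stats.linregress(years, gdd)
print(f"trend = {slope:.1f} degree-days/year, p = {p_value:.3g}")
```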
Quantitative morphometrical characterization of human pronuclear zygotes.
Beuchat, A; Thévenaz, P; Unser, M; Ebner, T; Senn, A; Urner, F; Germond, M; Sorzano, C O S
2008-09-01
Identification of embryos with high implantation potential remains a challenge in in vitro fertilization (IVF). Subjective pronuclear (PN) zygote scoring systems have been developed for that purpose. The aim of this work was to provide a software tool that enables objective measuring of morphological characteristics of the human PN zygote. A computer program was created to analyse zygote images semi-automatically, providing precise morphological measurements. The accuracy of this approach was first validated by comparing zygotes from two different IVF centres with computer-assisted measurements or subjective scoring. Computer-assisted measurement and subjective scoring were then compared for their ability to classify zygotes with high and low implantation probability by using a linear discriminant analysis. Zygote images coming from the two IVF centres were analysed with the software, resulting in a series of precise measurements of 24 variables. Using subjective scoring, the cytoplasmic halo was the only feature which was significantly different between the two IVF centres. Computer-assisted measurements revealed significant differences between centres in PN centring, PN proximity, cytoplasmic halo and features related to nucleolar precursor bodies distribution. The zygote classification error achieved with the computer-assisted measurements (0.363) was slightly inferior to that of the subjective ones (0.393). A precise and objective characterization of the morphology of human PN zygotes can be achieved by the use of an advanced image analysis tool. This computer-assisted analysis allows for a better morphological characterization of human zygotes and can be used for classification.
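The classification comparison described above rests on a linear discriminant analysis over morphological measurements, scored by classification error. A minimal scikit-learn sketch is shown below; the 24-variable feature matrix is random placeholder data, not zygote measurements.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Placeholder data: 200 "zygotes" x 24 morphological measurements with a binary
# implantation label. Real measurements would come from the image-analysis tool.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 24))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=1.0, size=200) > 0).astype(int)

lda = LinearDiscriminantAnalysis()
accuracy = cross_val_score(lda, X, y, cv=5).mean()
print(f"cross-validated classification error: {1 - accuracy:.3f}")
```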
Shukla, Rohit; Shukla, Harish; Tripathi, Timir
2018-01-01
Mycobacterium tuberculosis isocitrate lyase (MtbICL) is a crucial enzyme of the glyoxylate cycle and is a validated anti-tuberculosis drug target. Structurally distant, non-active site mutation (H46A) in MtbICL has been found to cause loss of enzyme activity. The aim of the present work was to explore the structural alterations induced by H46A mutation that caused the loss of enzyme activity. The structural and dynamic consequences of H46A mutation were studied using multiple computational methods such as docking, molecular dynamics simulation and residue interaction network analysis (RIN). Principal component analysis and cross correlation analysis revealed the difference in conformational flexibility and collective modes of motions between the wild-type and mutant enzyme, particularly in the active site region. RIN analysis revealed that the active site geometry was disturbed in the mutant enzyme. Thus, the dynamic perturbation of the active site led to enzyme transition from its active form to inactive form upon mutation. The computational analyses elucidated the mutant-specific conformational alterations, differential dominant motions, and anomalous residue level interactions that contributed to the abrogated function of mutant MtbICL. An understanding of interactions of mutant enzymes may help in modifying the existing drugs and designing improved drugs for successful control of tuberculosis. Copyright © 2017 Elsevier Ltd. All rights reserved.
Cost-effectiveness of breast cancer screening policies using simulation.
Gocgun, Y; Banjevic, D; Taghipour, S; Montgomery, N; Harvey, B J; Jardine, A K S; Miller, A B
2015-08-01
In this paper, we study breast cancer screening policies using computer simulation. We developed a multi-state Markov model for breast cancer progression, considering both the screening and treatment stages of breast cancer. The parameters of our model were estimated through data from the Canadian National Breast Cancer Screening Study as well as data in the relevant literature. Using computer simulation, we evaluated various screening policies to study the impact of mammography screening for age-based subpopulations in Canada. We also performed sensitivity analysis to examine the impact of certain parameters on number of deaths and total costs. The analysis comparing screening policies reveals that a policy in which women belonging to the 40-49 age group are not screened, whereas those belonging to the 50-59 and 60-69 age groups are screened once every 5 years, outperforms others with respect to cost per life saved. Our analysis also indicates that increasing the screening frequencies for the 50-59 and 60-69 age groups decrease mortality, and that the average number of deaths generally decreases with an increase in screening frequency. We found that screening annually for all age groups is associated with the highest costs per life saved. Our analysis thus reveals that cost per life saved increases with an increase in screening frequency. Copyright © 2015 Elsevier Ltd. All rights reserved.
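A minimal sketch of this kind of multi-state Markov cohort simulation is given below. The states, transition probabilities, screening effect and costs are invented placeholders for illustration, not the parameters estimated from the Canadian National Breast Cancer Screening Study.

```python
import numpy as np

# Toy states: 0 healthy, 1 preclinical cancer, 2 clinical cancer, 3 dead.
# Annual transition probabilities are invented for illustration only.
P = np.array([
    [0.990, 0.008, 0.000, 0.002],   # healthy
    [0.000, 0.900, 0.090, 0.010],   # preclinical (screen-detectable)
    [0.000, 0.000, 0.930, 0.070],   # clinical cancer
    [0.000, 0.000, 0.000, 1.000],   # dead (absorbing)
])

def simulate(years=30, screen_every=None, cohort=100_000,
             cost_screen=50.0, cost_treatment=20_000.0, rng=np.random.default_rng(1)):
    state = np.zeros(cohort, dtype=int)
    cost = 0.0
    for year in range(years):
        if screen_every and year % screen_every == 0:
            cost += cost_screen * np.count_nonzero(state < 3)
            detected = state == 1            # screening finds preclinical disease...
            state[detected] = 0              # ...which this toy model treats as cured
            cost += cost_treatment * np.count_nonzero(detected)
        # advance every individual one year according to the transition matrix
        u = rng.random(cohort)
        cum = P[state].cumsum(axis=1)
        state = (u[:, None] > cum).sum(axis=1)
    return np.count_nonzero(state == 3), cost

for freq in (None, 5, 1):
    deaths, cost = simulate(screen_every=freq)
    label = "no screening" if freq is None else f"every {freq} y"
    print(f"{label:>12}: deaths={deaths:,}, total cost={cost:,.0f}")
```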
Climate Analytics as a Service. Chapter 11
NASA Technical Reports Server (NTRS)
Schnase, John L.
2016-01-01
Exascale computing, big data, and cloud computing are driving the evolution of large-scale information systems toward a model of data-proximal analysis. In response, we are developing a concept of climate analytics as a service (CAaaS) that represents a convergence of data analytics and archive management. With this approach, high-performance compute-storage implemented as an analytic system is part of a dynamic archive comprising both static and computationally realized objects. It is a system whose capabilities are framed as behaviors over a static data collection, but where queries cause results to be created, not found and retrieved. Those results can be the product of a complex analysis, but, importantly, they also can be tailored responses to the simplest of requests. NASA's MERRA Analytic Service and associated Climate Data Services API provide a real-world example of climate analytics delivered as a service in this way. Our experiences reveal several advantages to this approach, not the least of which is orders-of-magnitude time reduction in the data assembly task common to many scientific workflows.
Pan, Yijie; Wang, Yongtian; Liu, Juan; Li, Xin; Jia, Jia
2014-03-01
Previous research [Appl. Opt. 52, A290 (2013)] has revealed that Fourier analysis of three-dimensional affine transformation theory can be used to improve the computation speed of the traditional polygon-based method. In this paper, we continue our research and propose an improved full analytical polygon-based method developed upon this theory. Vertex vectors of primitive and arbitrary triangles and the pseudo-inverse matrix were used to obtain an affine transformation matrix representing the spatial relationship between the two triangles. With this relationship and the primitive spectrum, we analytically obtained the spectrum of the arbitrary triangle. This algorithm discards low-level angular dependent computations. In order to add diffusive reflection to each arbitrary surface, we also propose a whole matrix computation approach that takes advantage of the affine transformation matrix and uses matrix multiplication to calculate shifting parameters of similar sub-polygons. The proposed method improves hologram computation speed for the conventional full analytical approach. Optical experimental results are demonstrated which prove that the proposed method can effectively reconstruct three-dimensional scenes.
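The pseudo-inverse step described above, recovering the affine transformation that maps the primitive triangle's vertices onto an arbitrary triangle's vertices, can be sketched in two dimensions as follows; the coordinates are arbitrary example values, whereas the paper works with the 3D vertex vectors of the hologram polygons.

```python
import numpy as np

# Vertices of the primitive triangle (rows) in homogeneous coordinates ...
primitive = np.array([[0.0, 0.0, 1.0],
                      [1.0, 0.0, 1.0],
                      [0.0, 1.0, 1.0]])
# ... and of an arbitrary target triangle (example values).
arbitrary = np.array([[2.0, 1.0, 1.0],
                      [4.0, 1.5, 1.0],
                      [2.5, 3.0, 1.0]])

# Solve primitive @ A ~= arbitrary for the affine matrix A via the pseudo-inverse;
# with more than three vertex correspondences this gives the least-squares fit.
A = np.linalg.pinv(primitive) @ arbitrary
print(np.round(primitive @ A, 3))   # reproduces the arbitrary triangle's vertices
```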
Thali, Michael J; Taubenreuther, Ulrike; Karolczak, Marek; Braun, Marcel; Brueschweiler, Walter; Kalender, Willi A; Dirnhofer, Richard
2003-11-01
When a knife is stabbed in bone, it leaves an impression in the bone. The characteristics (shape, size, etc.) may indicate the type of tool used to produce the patterned injury in bone. Until now it has been impossible in forensic sciences to document such damage precisely and non-destructively. Micro-computed tomography (Micro-CT) offers an opportunity to analyze patterned injuries of tool marks made in bone. Using high-resolution Micro-CT and computer software, detailed analysis of three-dimensional (3D) architecture has recently become feasible and allows microstructural 3D bone information to be collected. With adequate viewing software, data from a 2D slice in an arbitrary plane can be extracted from the 3D datasets. Using such software as a "digital virtual knife," the examiner can interactively section and analyze the 3D sample. Analysis of the bone injury revealed that Micro-CT provides an opportunity to correlate a bone injury to an injury-causing instrument. Even broken knife tips can be graphically and non-destructively assigned to a suspect weapon.
Roles for Agent Assistants in Field Science: Understanding Personal Projects and Collaboration
NASA Technical Reports Server (NTRS)
Clancey, William J.
2003-01-01
A human-centered approach to computer systems design involves reframing analysis in terms of the people interacting with each other. The primary concern is not how people can interact with computers, but how shall we design work systems (facilities, tools, roles, and procedures) to help people pursue their personal projects, as they work independently and collaboratively? Two case studies provide empirical requirements. First, an analysis of astronaut interactions with CapCom on Earth during one traverse of Apollo 17 shows what kind of information was conveyed and what might be automated today. A variety of agent and robotic technologies are proposed that deal with recurrent problems in communication and coordination during the analyzed traverse. Second, an analysis of biologists and a geologist working at Haughton Crater in the High Canadian Arctic reveals how work interactions between people involve independent personal projects, sensitively coordinated for mutual benefit. In both cases, an agent or robotic system's role would be to assist people, rather than collaborating, because today's computer systems lack the identity and purpose that consciousness provides.
Computational and Experimental Analysis of the Secretome of Methylococcus capsulatus (Bath)
Indrelid, Stine; Mathiesen, Geir; Jacobsen, Morten; Lea, Tor; Kleiveland, Charlotte R.
2014-01-01
The Gram-negative methanotroph Methylococcus capsulatus (Bath) was recently demonstrated to abrogate inflammation in a murine model of inflammatory bowel disease, suggesting interactions with cells involved in maintaining mucosal homeostasis and emphasizing the importance of understanding the many properties of M. capsulatus. Secreted proteins determine how bacteria may interact with their environment, and a comprehensive knowledge of such proteins is therefore vital to understand bacterial physiology and behavior. The aim of this study was to systematically analyze protein secretion in M. capsulatus (Bath) by identifying the secretion systems present and the respective secreted substrates. Computational analysis revealed that in addition to previously recognized type II secretion systems and a type VII secretion system, a type Vb (two-partner) secretion system and putative type I secretion systems are present in M. capsulatus (Bath). In silico analysis suggests that the diverse secretion systems in M.capsulatus transport proteins likely to be involved in adhesion, colonization, nutrient acquisition and homeostasis maintenance. Results of the computational analysis was verified and extended by an experimental approach showing that in addition an uncharacterized protein and putative moonlighting proteins are released to the medium during exponential growth of M. capsulatus (Bath). PMID:25479164
Zhang, Jiang; Liu, Qi; Chen, Huafu; Yuan, Zhen; Huang, Jin; Deng, Lihua; Lu, Fengmei; Zhang, Junpeng; Wang, Yuqing; Wang, Mingwen; Chen, Liangyin
2015-01-01
Clustering analysis methods have been widely applied to identifying the functional brain networks of a multitask paradigm. However, the previously used clustering analysis techniques are computationally expensive and thus impractical for clinical applications. In this study, a novel method called SOM-SAPC, which combines self-organizing mapping (SOM) and supervised affinity propagation clustering (SAPC), is proposed and implemented to identify the motor execution (ME) and motor imagery (MI) networks. In SOM-SAPC, SOM is first performed to process the fMRI data, and SAPC is then utilized to cluster the patterns of functional networks. As a result, SOM-SAPC is able to significantly reduce the computational cost for brain network analysis. Simulation and clinical tests involving ME and MI were conducted based on SOM-SAPC, and the analysis results indicated that functional brain networks were clearly identified with different response patterns and reduced computational cost. In particular, three activation clusters were clearly revealed, which include parts of the visual, ME and MI functional networks. These findings validated that SOM-SAPC is an effective and robust method to analyze fMRI data from multitask paradigms.
Convergence of sampling in protein simulations
NASA Astrophysics Data System (ADS)
Hess, Berk
2002-03-01
With molecular dynamics, protein dynamics can be simulated in atomic detail. Current computers are not fast enough to probe all available conformations, but fluctuations around one conformation can be sampled to a reasonable extent. The motions with the largest fluctuations can be filtered out of a simulation using covariance or principal component analysis. A problem with this analysis is that random diffusion can appear as correlated motion. An analysis is presented of how long a simulation should be to obtain relevant results for global motions. The analysis reveals that the cosine content of the principal components is a good indicator of bad sampling.
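A small sketch of the cosine-content diagnostic named in the abstract, assuming the principal-component projection is already available as a one-dimensional array; the two test signals below are synthetic.

```python
import numpy as np

def cosine_content(p, i=1):
    """Cosine content of a (mean-centred) principal-component projection p(t),
    following Hess (2002): c_i = (2/N) * (sum_t cos(i*pi*t/N) p_t)^2 / sum_t p_t^2.
    Values near 1 indicate random-diffusion-like, poorly converged sampling;
    well-sampled fluctuations give values near 0."""
    p = np.asarray(p, dtype=float)
    p = p - p.mean()
    N = len(p)
    cos_t = np.cos(i * np.pi * np.arange(N) / N)
    return (2.0 / N) * np.dot(cos_t, p) ** 2 / np.dot(p, p)

rng = np.random.default_rng(1)
diffusive = np.cumsum(rng.standard_normal(5000))   # random-walk-like projection
converged = rng.standard_normal(5000)              # fast, well-sampled fluctuation

print(f"cosine content, diffusive projection: {cosine_content(diffusive):.2f}")
print(f"cosine content, converged projection: {cosine_content(converged):.4f}")
# The diffusive signal scores far higher, flagging unconverged global motion.
```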
Thirugnanasambantham, Krishnaraj; Saravanan, Subramanian; Karikalan, Kulandaivelu; Bharanidharan, Rajaraman; Lalitha, Perumal; Ilango, S; HairulIslam, Villianur Ibrahim
2015-10-01
Momordica charantia (bitter gourd, bitter melon) is a monoecious Cucurbitaceae with anti-oxidant, anti-microbial, anti-viral and anti-diabetic potential. Molecular studies on this economically valuable plant are essential to understand its phylogeny and evolution. MicroRNAs (miRNAs) are conserved, small, non-coding RNAs that regulate gene expression by binding the 3' UTR region of target mRNAs and that evolve at different rates in different plant species. In this study we used a homology-based computational approach to identify 27 mature miRNAs for the first time from this bio-medically important plant. A phylogenetic tree developed from binary presence/absence data for the identified miRNAs was found to be uncertain and biased. Most of the identified miRNAs were highly conserved among plant species, and sequence-based phylogenetic analysis of the miRNAs resolved the above difficulties in the miRNA-based phylogenetic approach. Predicted gene targets of the identified miRNAs revealed their importance in the regulation of plant developmental processes. The reported miRNAs showed sequence conservation in their mature forms, and detailed phylogenetic analysis of the pre-miRNA sequences revealed genus-specific segregation of clusters. Copyright © 2015 Elsevier Ltd. All rights reserved.
RNA 3D Modules in Genome-Wide Predictions of RNA 2D Structure
Theis, Corinna; Zirbel, Craig L.; zu Siederdissen, Christian Höner; Anthon, Christian; Hofacker, Ivo L.; Nielsen, Henrik; Gorodkin, Jan
2015-01-01
Recent experimental and computational progress has revealed a large potential for RNA structure in the genome. This has been driven by computational strategies that exploit multiple genomes of related organisms to identify common sequences and secondary structures. However, these computational approaches have two main challenges: they are computationally expensive and they have a relatively high false discovery rate (FDR). Simultaneously, RNA 3D structure analysis has revealed modules composed of non-canonical base pairs which occur in non-homologous positions, apparently by independent evolution. These modules can, for example, occur inside structural elements which in RNA 2D predictions appear as internal loops. Hence one question is whether the use of such RNA 3D information can improve the prediction accuracy of RNA secondary structure at a genome-wide level. Here, we use RNAz in combination with 3D module prediction tools and apply them to a 13-way vertebrate sequence-based alignment. We find that RNA 3D modules predicted by metaRNAmodules and JAR3D are significantly enriched in the screened windows compared to their shuffled counterparts. The initially estimated FDR of 47.0% is lowered to below 25% when certain 3D module predictions are present in the window of the 2D prediction. We discuss the implications and prospects for further development of computational strategies for detection of RNA 2D structure in genomic sequence. PMID:26509713
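A tiny illustration of the empirical false-discovery-rate logic discussed above (hits in shuffled alignments divided by hits in native alignments, optionally restricted to windows carrying a 3D-module prediction); all counts are invented for illustration and are not the study's data.

```python
# Empirical FDR sketch: FDR ≈ shuffled-alignment hits / native-alignment hits,
# recomputed for the subset of windows that also carry a 3D-module prediction.
def empirical_fdr(native_hits, shuffled_hits):
    return shuffled_hits / native_hits if native_hits else float("nan")

all_windows    = empirical_fdr(native_hits=1000, shuffled_hits=470)  # ~0.47
with_3d_module = empirical_fdr(native_hits=200, shuffled_hits=48)    # ~0.24
print(f"FDR, all RNAz hits: {all_windows:.2f}; with 3D module support: {with_3d_module:.2f}")
```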
Computed tomography angiography reveals the crime instrument – case report
Banaszek, Anna; Guziński, Maciej; Sąsiadek, Marek
2010-01-01
Summary. Background: The development of multislice CT technology has enabled imaging of post-traumatic brain lesions with isotropic resolution, which led to unexpected results in the presented case. Case Report: An unconscious 49-year-old male with suspected trauma underwent a routine CT examination of the head, which revealed unusual intracerebral bleeding and was therefore followed by CT angiography (CTA). Thorough analysis of the CTA source scans led to detection of the cause of the bleeding. Conclusions: The presented case shows that careful analysis of a CT scan not only allows the extent of pathological lesions in the intracranial space to be defined but also helps to detect the crime instrument, which is of medico-legal significance. PMID:22802784
"Are You a Computer?" Opening Exchanges in Virtual Reference Shape the Potential for Teaching
ERIC Educational Resources Information Center
Dempsey, Paula R.
2016-01-01
Academic reference librarians frequently work with students who are not aware of their professional roles. In online interactions, a student might not even realize that the librarian is a person. The ways students initiate conversations reveal their understanding of the mutual roles involved in reference encounters. Conversation analysis of live…
Parsing Protocols Using Problem Solving Grammars. AI Memo 385.
ERIC Educational Resources Information Center
Miller, Mark L.; Goldstein, Ira P.
A theory of the planning and debugging of computer programs is formalized as a context free grammar, which is used to reveal the constituent structure of problem solving episodes by parsing protocols in which programs are written, tested, and debugged. This is illustrated by the detailed analysis of an actual session with a beginning student…
Good History vs. Bad History: The Changing Art of Book Reviewing.
ERIC Educational Resources Information Center
Bilhartz, Terry David
A content analysis of 560 book reviews published in the "Journal of American History" over the past 30 years reveals changes in the criteria scholars use when evaluating works on history. Data were collected for several categories and then analyzed by computer. The paper begins with a discussion of distinguishing characteristics of reviews for…
First-Order or Second-Order Kinetics? A Monte Carlo Answer
ERIC Educational Resources Information Center
Tellinghuisen, Joel
2005-01-01
Monte Carlo computational experiments reveal that the ability to discriminate between first- and second-order kinetics from least-squares analysis of time-dependent concentration data is better than implied in earlier discussions of the problem. The problem is rendered as simple as possible by assuming that the order must be either 1 or 2 and that…
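A hedged sketch of such a Monte Carlo experiment: noisy first-order concentration data are simulated, both integrated rate laws are fitted by least squares, and the fraction of trials in which the correct order gives the smaller residual sum is reported. The rate constant, noise level and sampling grid are arbitrary assumptions, not the article's settings.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

def first_order(t, c0, k):          # integrated first-order rate law
    return c0 * np.exp(-k * t)

def second_order(t, c0, k):         # integrated second-order rate law
    return c0 / (1.0 + k * c0 * t)

t = np.linspace(0, 10, 25)
c0_true, k_true, noise = 1.0, 0.30, 0.01   # arbitrary "true" first-order system

wins, n_trials = 0, 500
for _ in range(n_trials):
    c = first_order(t, c0_true, k_true) + rng.normal(0, noise, t.size)
    sse = {}
    for name, model in (("1st", first_order), ("2nd", second_order)):
        popt, _ = curve_fit(model, t, c, p0=(1.0, 0.3), maxfev=10000)
        sse[name] = np.sum((c - model(t, *popt)) ** 2)
    wins += sse["1st"] < sse["2nd"]          # correct order gives the smaller SSE

print(f"correct order identified in {wins / n_trials:.1%} of simulated data sets")
```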
Fault tolerance in computational grids: perspectives, challenges, and issues.
Haider, Sajjad; Nazir, Babar
2016-01-01
Computational grids are established with the intention of providing shared access to hardware- and software-based resources, with special reference to increased computational capabilities. Fault tolerance is one of the most important issues faced by computational grids. The main contribution of this survey is the creation of an extended classification of problems that occur in computational grid environments. The proposed classification will help researchers, developers, and maintainers of grids to understand the types of issues to be anticipated. Moreover, different types of problems, such as omission, interaction, and timing-related faults, have been identified that need to be handled on various layers of the computational grid. In this survey, an analysis and examination is also performed pertaining to fault tolerance and fault detection mechanisms. Our conclusion is that a dependable and reliable grid can only be established when more emphasis is placed on fault identification. Moreover, our survey reveals that adaptive and intelligent fault identification and tolerance techniques can improve the dependability of grid working environments.
NASA Astrophysics Data System (ADS)
Hill, M. C.; Jakeman, J.; Razavi, S.; Tolson, B.
2015-12-01
For many environmental systems, model runtimes have remained very long as more capable computers have been used to add more processes and finer time and space discretization. Scientists have also added more parameters and kinds of observations, and many model runs are needed to explore the models. Computational demand equals run time multiplied by the number of model runs divided by parallelization opportunities. Model exploration is conducted using sensitivity analysis, optimization, and uncertainty quantification. Sensitivity analysis is used to reveal the consequences of what may be very complex simulated relations, optimization is used to identify parameter values that fit the data best, or at least better, and uncertainty quantification is used to evaluate the precision of simulated results. The long execution times make such analyses a challenge. Methods for addressing this challenge include computationally frugal analysis of the demanding original model and a number of ingenious surrogate modeling methods. Both commonly use about 50-100 runs of the demanding original model. In this talk we consider the tradeoffs between (1) original model development decisions, (2) computationally frugal analysis of the original model, and (3) using many model runs of the fast surrogate model. Some questions of interest are as follows. If the added processes and discretization invested in (1) are compared with the restrictions and approximations in model analysis produced by long model execution times, is there a net benefit relative to the goals of the model? Are there changes to the numerical methods that could reduce the computational demands while giving up less fidelity than is compromised by using computationally frugal methods or surrogate models for model analysis? Both the computationally frugal methods and surrogate models require that the solution of interest be a smooth function of the parameters of interest. How does the information obtained from the local methods typical of (2) and the globally averaged methods typical of (3) compare for typical systems? The discussion will use examples of the response of the Greenland glacier to global warming and surface water and groundwater modeling.
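The abstract's bookkeeping relation (computational demand equals run time times number of runs divided by parallelization opportunities) written out as a trivial calculation; the numbers are invented.

```python
# Computational demand = (run time per model run) x (number of runs)
#                        / (parallelization opportunities), per the abstract.
# Numbers below are purely illustrative.
run_time_hours = 12.0    # one run of the demanding original model
n_runs = 100             # typical budget for frugal analysis / surrogate training
parallel_workers = 20    # runs that can execute concurrently

wall_clock_hours = run_time_hours * n_runs / parallel_workers
print(f"wall-clock time: {wall_clock_hours:.0f} h (~{wall_clock_hours / 24:.1f} days)")
```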
Computational analysis of human and mouse CREB3L4 Protein
Velpula, Kiran Kumar; Rehman, Azeem Abdul; Chigurupati, Soumya; Sanam, Ramadevi; Inampudi, Krishna Kishore; Akila, Chandra Sekhar
2012-01-01
CREB3L4 is a member of the CREB/ATF transcription factor family, characterized by regulation of gene expression through the cAMP-responsive element. Previous studies identified this protein in mice and humans. Whereas CREB3L4 in mice (referred to as Tisp40) is found in the testes and functions in spermatogenesis, human CREB3L4 is primarily detected in the prostate and has been implicated in cancer. We conducted computational analyses to compare the structural homology between murine Tisp40α and human CREB3L4. Our results reveal that the primary and secondary structures of the two proteins show high similarity. Additionally, the predicted helical transmembrane structure indicates that the proteins likely have similar structure and function. This study offers preliminary findings that support the translation of mouse Tisp40α findings into human models, based on structural homology. PMID:22829733
Vila, Jorge A.; Scheraga, Harold A.
2008-01-01
Interest centers here on the analysis of two different, but related, phenomena that affect side-chain conformations and consequently 13Cα chemical shifts and their applications to determine, refine, and validate protein structures. The first is whether 13Cα chemical shifts, computed at the DFT level of approximation with charged residues, are a better approximation of observed 13Cα chemical shifts than those computed with neutral residues for proteins in solution. Accurate computation of 13Cα chemical shifts requires a proper representation of the charges, which might not take on integral values. For this analysis, the charges for 139 conformations of the protein ubiquitin were determined by explicit consideration of protein binding equilibria, at a given pH, that is, by exploring the 2^ξ possible ionization states of the whole molecule, with ξ being the number of ionizable groups. The results of this analysis, as revealed by the shielding/deshielding of the 13Cα nucleus, indicated that: (i) there is a significant difference in the computed 13Cα chemical shifts, between basic and acidic groups, as a function of the degree of charge of the side chain; (ii) this difference is attributed to the distance between the ionizable groups and the 13Cα nucleus, which is shorter for the acidic Asp and Glu groups than for the basic Lys and Arg groups; and (iii) the use of neutral, rather than charged, basic and acidic groups is a better approximation of the observed 13Cα chemical shifts of a protein in solution. The second is how side-chain flexibility influences computed 13Cα chemical shifts in an additional set of ubiquitin conformations, in which the side chains are generated from an NMR-derived structure with the backbone conformation assumed to be fixed. The 13Cα chemical shift of a given amino acid residue in a protein is determined, mainly, by its own backbone and side-chain torsional angles, independent of the neighboring residues; the conformation of a given residue itself, however, depends on the environment of this residue and, hence, on the whole protein structure. As a consequence, this analysis reveals the role and impact of an accurate side-chain computation in the determination and refinement of protein conformation. The results of this analysis are: (i) a lower error between computed and observed 13Cα chemical shifts (by up to 3.7 ppm) was found for ~68% and ~63% of all ionizable residues and all non-Ala/Pro/Gly residues, respectively, in the additional set of conformations, compared with results for the model from which the set was derived; and (ii) all the additional conformations exhibit a lower root-mean-square deviation (1.97 ppm ≤ rmsd ≤ 2.13 ppm) between computed and observed 13Cα chemical shifts than the rmsd (2.32 ppm) computed for the starting conformation from which this additional set was derived. As a validation test, an analysis of the additional set of ubiquitin conformations, comparing computed and observed values of both 13Cα chemical shifts and χ1 torsional angles (given by the vicinal coupling constants 3JN–Cγ and 3JC′–Cγ), is discussed. PMID:17975838
Computer image analysis of etched tracks from ionizing radiation
NASA Technical Reports Server (NTRS)
Blanford, George E.
1994-01-01
I proposed to continue a cooperative research project with Dr. David S. McKay concerning image analysis of tracks. Last summer we showed that we could measure track densities using the Oxford Instruments eXL computer and software that is attached to an ISI scanning electron microscope (SEM) located in building 31 at JSC. To reduce the dependence on JSC equipment, we proposed to transfer the SEM images to UHCL for analysis. Last summer we developed techniques to use digitized scanning electron micrographs and computer image analysis programs to measure track densities in lunar soil grains. Tracks were formed by highly ionizing solar energetic particles and cosmic rays during near surface exposure on the Moon. The track densities are related to the exposure conditions (depth and time). Distributions of the number of grains as a function of their track densities can reveal the modality of soil maturation. As part of a consortium effort to better understand the maturation of lunar soil and its relation to its infrared reflectance properties, we worked on lunar samples 67701,205 and 61221,134. These samples were etched for a shorter time (6 hours) than last summer's sample and this difference has presented problems for establishing the correct analysis conditions. We used computer counting and measurement of area to obtain preliminary track densities and a track density distribution that we could interpret for sample 67701,205. This sample is a submature soil consisting of approximately 85 percent mature soil mixed with approximately 15 percent immature, but not pristine, soil.
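A minimal sketch of the kind of computer counting described above, assuming a synthetic stand-in for a digitized SEM image: tracks are segmented by a simple intensity threshold, counted as connected components, and converted to a density. The threshold and pixel scale are illustrative assumptions rather than the calibration actually used.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)

# Synthetic stand-in for a digitized SEM image of an etched grain surface:
# dark background with a few hundred bright, roughly square "track" pits.
img = rng.normal(20, 5, size=(512, 512))
for _ in range(300):
    r, c = rng.integers(5, 507, size=2)
    img[r - 2:r + 3, c - 2:c + 3] += 200

binary = img > 100                           # threshold chosen by eye (assumption)
labels, n_tracks = ndimage.label(binary)     # connected-component count

pixel_size_um = 0.05                         # assumed pixel scale
area_cm2 = (img.shape[0] * pixel_size_um * 1e-4) * (img.shape[1] * pixel_size_um * 1e-4)
print(f"{n_tracks} tracks over {area_cm2:.2e} cm^2 -> "
      f"track density ~ {n_tracks / area_cm2:.2e} tracks/cm^2")
```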
Novel Regulatory Small RNAs in Streptococcus pyogenes
Tesorero, Rafael A.; Yu, Ning; Wright, Jordan O.; Svencionis, Juan P.; Cheng, Qiang; Kim, Jeong-Ho; Cho, Kyu Hong
2013-01-01
Streptococcus pyogenes (Group A Streptococcus or GAS) is a Gram-positive bacterial pathogen that has shown complex modes of regulation of its virulence factors to cause diverse diseases. Bacterial small RNAs are regarded as novel widespread regulators of gene expression in response to environmental signals. Recent studies have revealed that several small RNAs (sRNAs) have an important role in S. pyogenes physiology and pathogenesis by regulating gene expression at the translational level. To search for new sRNAs in S. pyogenes, we performed a genomewide analysis through computational prediction followed by experimental verification. To overcome the limitation of low accuracy in computational prediction, we employed a combination of three different computational algorithms (sRNAPredict, eQRNA and RNAz). A total of 45 candidates were chosen based on the computational analysis, and their transcription was analyzed by reverse-transcriptase PCR and Northern blot. Through this process, we discovered 7 putative novel trans-acting sRNAs. Their abundance varied between different growth phases, suggesting that their expression is influenced by environmental or internal signals. Further, to screen target mRNAs of an sRNA, we employed differential RNA sequencing analysis. This study provides a significant resource for future study of small RNAs and their roles in physiology and pathogenesis of S. pyogenes. PMID:23762235
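A small sketch of the consensus step implied above: candidate loci from the three predictors are compared and a locus is kept when at least two tools report an overlapping interval. The intervals are invented and the two-of-three rule is an assumption for illustration, not necessarily the study's exact criterion.

```python
# Combining candidate sRNA loci from three predictors (sRNAPredict, eQRNA, RNAz).
predictions = {
    "sRNAPredict": [(1200, 1330), (56000, 56120), (80250, 80400)],
    "eQRNA":       [(1185, 1340), (42000, 42160), (80230, 80390)],
    "RNAz":        [(1210, 1325), (56010, 56100), (91000, 91150)],
}

def overlaps(a, b):
    """True if two (start, end) intervals share at least one position."""
    return a[0] <= b[1] and b[0] <= a[1]

candidates = sorted({iv for ivs in predictions.values() for iv in ivs})
for iv in candidates:
    support = [tool for tool, ivs in predictions.items()
               if any(overlaps(iv, other) for other in ivs)]
    if len(support) >= 2:   # keep loci called by at least two tools
        print(f"candidate {iv}: supported by {', '.join(sorted(support))}")
```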
Computer-Aided Recognition of Facial Attributes for Fetal Alcohol Spectrum Disorders.
Valentine, Matthew; Bihm, Dustin C J; Wolf, Lior; Hoyme, H Eugene; May, Philip A; Buckley, David; Kalberg, Wendy; Abdul-Rahman, Omar A
2017-12-01
To compare the detection of facial attributes by computer-based facial recognition software of 2-D images against standard, manual examination in fetal alcohol spectrum disorders (FASD). Participants were gathered from the Fetal Alcohol Syndrome Epidemiology Research database. Standard frontal and oblique photographs of children were obtained during a manual, in-person dysmorphology assessment. Images were submitted for facial analysis conducted by the facial dysmorphology novel analysis technology (an automated system), which assesses ratios of measurements between various facial landmarks to determine the presence of dysmorphic features. Manual blinded dysmorphology assessments were compared with those obtained via the computer-aided system. Areas under the curve values for individual receiver-operating characteristic curves revealed the computer-aided system (0.88 ± 0.02) to be comparable to the manual method (0.86 ± 0.03) in detecting patients with FASD. Interestingly, cases of alcohol-related neurodevelopmental disorder (ARND) were identified more efficiently by the computer-aided system (0.84 ± 0.07) in comparison to the manual method (0.74 ± 0.04). A facial gestalt analysis of patients with ARND also identified more generalized facial findings compared to the cardinal facial features seen in more severe forms of FASD. We found there was an increased diagnostic accuracy for ARND via our computer-aided method. As this category has been historically difficult to diagnose, we believe our experiment demonstrates that facial dysmorphology novel analysis technology can potentially improve ARND diagnosis by introducing a standardized metric for recognizing FASD-associated facial anomalies. Earlier recognition of these patients will lead to earlier intervention with improved patient outcomes. Copyright © 2017 by the American Academy of Pediatrics.
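A minimal sketch of the AUC comparison reported above, using scikit-learn's roc_auc_score on invented case/control labels and scores; the numbers have no relation to the study's data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)

# Invented data: 1 = FASD case, 0 = control, with continuous scores from the
# automated system and binary calls from a manual dysmorphology examination.
y_true = rng.integers(0, 2, size=200)
auto_score = y_true * 0.8 + rng.normal(0.3, 0.25, size=200)    # automated system score
manual_call = (y_true + (rng.random(200) < 0.15)) % 2          # manual exam with some error

print(f"AUC, computer-aided system: {roc_auc_score(y_true, auto_score):.2f}")
print(f"AUC, manual assessment:     {roc_auc_score(y_true, manual_call):.2f}")
```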
Compute as Fast as the Engineers Can Think! ULTRAFAST COMPUTING TEAM FINAL REPORT
NASA Technical Reports Server (NTRS)
Biedron, R. T.; Mehrotra, P.; Nelson, M. L.; Preston, M. L.; Rehder, J. J.; Rogers, J. L.; Rudy, D. H.; Sobieski, J.; Storaasli, O. O.
1999-01-01
This report documents findings and recommendations by the Ultrafast Computing Team (UCT). In the period 10-12/98, UCT reviewed design case scenarios for a supersonic transport and a reusable launch vehicle to derive computing requirements necessary for support of a design process with efficiency so radically improved that human thought rather than the computer paces the process. Assessment of the present computing capability against the above requirements indicated a need for further improvement in computing speed by several orders of magnitude to reduce time to solution from tens of hours to seconds in major applications. Evaluation of the trends in computer technology revealed a potential to attain the postulated improvement by further increases of single processor performance combined with massively parallel processing in a heterogeneous environment. However, utilization of massively parallel processing to its full capability will require redevelopment of the engineering analysis and optimization methods, including invention of new paradigms. To that end UCT recommends initiation of a new activity at LaRC called Computational Engineering for development of new methods and tools geared to the new computer architectures in disciplines, their coordination, and validation and benefit demonstration through applications.
Influence of computer work under time pressure on cardiac activity.
Shi, Ping; Hu, Sijung; Yu, Hongliu
2015-03-01
Computer users are often under stress when required to complete computer work within a required time. Work stress has repeatedly been associated with an increased risk for cardiovascular disease. The present study examined the effects of time pressure workload during computer tasks on cardiac activity in 20 healthy subjects. Heart rate, time domain and frequency domain indices of heart rate variability (HRV) and Poincaré plot parameters were compared among five computer tasks and two rest periods. Faster heart rate and decreased standard deviation of R-R interval were noted in response to computer tasks under time pressure. The Poincaré plot parameters showed significant differences between different levels of time pressure workload during computer tasks, and between computer tasks and the rest periods. In contrast, no significant differences were identified for the frequency domain indices of HRV. The results suggest that the quantitative Poincaré plot analysis used in this study was able to reveal the intrinsic nonlinear nature of the autonomically regulated cardiac rhythm. Specifically, heightened vagal tone occurred during the relaxation computer tasks without time pressure. In contrast, the stressful computer tasks with added time pressure stimulated cardiac sympathetic activity. Copyright © 2015 Elsevier Ltd. All rights reserved.
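A short sketch of the standard Poincaré plot descriptors (SD1 for short-term, largely vagally mediated variability; SD2 for longer-term variability) computed from an RR-interval series; the rest and task series below are synthetic and only illustrate the direction of change described in the abstract.

```python
import numpy as np

def poincare_sd1_sd2(rr_ms):
    """Standard Poincaré plot descriptors from successive RR intervals (ms):
    SD1^2 = 0.5 * Var(RR_{n+1} - RR_n),  SD2^2 = 2 * Var(RR) - SD1^2."""
    rr = np.asarray(rr_ms, dtype=float)
    diff = np.diff(rr)
    sd1_sq = 0.5 * np.var(diff, ddof=1)
    sd2_sq = max(2.0 * np.var(rr, ddof=1) - sd1_sq, 0.0)
    return np.sqrt(sd1_sq), np.sqrt(sd2_sq)

# Synthetic RR series: rest vs. a "time-pressure" condition with a faster,
# less variable heart rate (values are illustrative only).
rng = np.random.default_rng(4)
rr_rest = 900 + 40 * rng.standard_normal(300)
rr_task = 750 + 15 * rng.standard_normal(300)

for name, rr in (("rest", rr_rest), ("task", rr_task)):
    sd1, sd2 = poincare_sd1_sd2(rr)
    print(f"{name}: SD1 = {sd1:5.1f} ms, SD2 = {sd2:5.1f} ms")
```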
Computational design of an endo-1,4-β-xylanase ligand binding site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morin, Andrew; Kaufmann, Kristian W.; Fortenberry, Carie
2012-09-05
The field of computational protein design has experienced important recent success. However, the de novo computational design of high-affinity protein-ligand interfaces is still largely an open challenge. Using the Rosetta program, we attempted the in silico design of a high-affinity protein interface to a small peptide ligand. We chose the thermophilic endo-1,4-β-xylanase from Nonomuraea flexuosa as the protein scaffold on which to perform our designs. Over the course of the study, 12 proteins derived from this scaffold were produced and assayed for binding to the target ligand. Unfortunately, none of the designed proteins displayed evidence of high-affinity binding. Structural characterization of four designed proteins revealed that although the predicted structure of the protein model was highly accurate, this structural accuracy did not translate into accurate prediction of binding affinity. Crystallographic analyses indicate that the lack of binding affinity is possibly due to unaccounted-for protein dynamics in the 'thumb' region of our design scaffold intrinsic to the family 11 β-xylanase fold. Further computational analysis revealed two specific, single amino acid substitutions responsible for an observed change in backbone conformation and decreased dynamic stability of the catalytic cleft. These findings offer new insight into the dynamic and structural determinants of the β-xylanase proteins.
Simplified diagnostic coding sheet for computerized data storage and analysis in ophthalmology.
Tauber, J; Lahav, M
1987-11-01
A review of currently available diagnostic coding systems revealed that most are either too abbreviated or too detailed. We have compiled a simplified diagnostic coding sheet based on the International Classification of Diseases (ICD-9), which is both complete and easy to use in a general practice. The information is transferred to a computer, which uses the relevant ICD-9 diagnoses as a database; the data can be retrieved later for display of patients' problems or analysis of clinical data.
Conceptual model of iCAL4LA: Proposing the components using comparative analysis
NASA Astrophysics Data System (ADS)
Ahmad, Siti Zulaiha; Mutalib, Ariffin Abdul
2016-08-01
This paper discusses an on-going study that initiates the process of determining the common components for a conceptual model of interactive computer-assisted learning specifically designed for low-achieving children. This group of children needs specific learning support that can be used as alternative learning material in their learning environment. In order to develop the conceptual model, this study extracts the common components from 15 strongly justified computer-assisted learning studies. A comparative analysis was conducted to determine the most appropriate components, using a set of specific indication classifications to prioritize applicability. The results of the extraction process reveal 17 common components for consideration. Later, based on scientific justifications, 16 of them were selected as the proposed components for the model.
NASA Astrophysics Data System (ADS)
Anzulewicz, Anna; Sobota, Krzysztof; Delafield-Butt, Jonathan T.
2016-08-01
Autism is a developmental disorder evident from infancy. Yet, its clinical identification requires expert diagnostic training. New evidence indicates that disruption to motor timing and integration may underpin the disorder, providing a potential new computational marker for its early identification. In this study, we employed smart tablet computers with touch-sensitive screens and embedded inertial movement sensors to record the movement kinematics and gesture forces made by 37 children 3-6 years old with autism and 45 age- and gender-matched children developing typically. Machine learning analysis of the children's motor patterns identified autism with up to 93% accuracy. Analysis revealed that these patterns consisted of greater forces at contact and a different distribution of forces within a gesture, and that gesture kinematics were faster and larger, with more distal use of space. These data support the notion that disruption to movement is a core feature of autism, and demonstrate that autism can be computationally assessed by fun, smart device gameplay.
NASA Astrophysics Data System (ADS)
Fei, Cheng-Wei; Bai, Guang-Chen
2014-12-01
To improve the computational precision and efficiency of probabilistic design for mechanical dynamic assemblies such as the blade-tip radial running clearance (BTRRC) of a gas turbine, a distribution collaborative probabilistic design method based on support vector machine regression (called DCSRM) is proposed by integrating the distribution collaborative response surface method and the support vector machine regression model. The mathematical model of DCSRM is established and the probabilistic design idea of DCSRM is introduced. The dynamic assembly probabilistic design of an aeroengine high-pressure turbine (HPT) BTRRC is accomplished to verify the proposed DCSRM. The analysis results reveal the optimal static blade-tip clearance of the HPT for BTRRC design, improving the performance and reliability of the aeroengine. The comparison of methods shows that DCSRM has high computational accuracy and high computational efficiency in BTRRC probabilistic analysis. The present research offers an effective way for the reliability design of mechanical dynamic assemblies and enriches mechanical reliability theory and methods.
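A hedged sketch of the surrogate-assisted idea behind such methods: a support vector regression model is trained on a small number of expensive model evaluations and then sampled cheaply by Monte Carlo to estimate a probability. The "expensive model", input distributions and clearance threshold below are invented, and the distributed-collaborative structure of DCSRM is not reproduced.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(5)

# Invented stand-in for an expensive dynamic-assembly model: clearance as a
# smooth function of two random inputs (e.g. rotational speed, temperature).
def expensive_model(x):
    return 1.2 - 0.3 * x[:, 0] ** 2 + 0.1 * np.sin(3 * x[:, 1])

# A small design of experiments (the costly runs)...
X_train = rng.uniform(-1, 1, size=(60, 2))
y_train = expensive_model(X_train)

# ...then an SVR surrogate replaces the model for probabilistic analysis.
surrogate = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X_train, y_train)

# Monte Carlo on the cheap surrogate: probability of clearance below a threshold.
X_mc = rng.normal(0.0, 0.4, size=(100_000, 2))
threshold = 0.95
p_fail = np.mean(surrogate.predict(X_mc) < threshold)
print(f"estimated P(clearance < {threshold}) = {p_fail:.3f}")
```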
Anzulewicz, Anna; Sobota, Krzysztof; Delafield-Butt, Jonathan T
2016-08-24
Autism is a developmental disorder evident from infancy. Yet, its clinical identification requires expert diagnostic training. New evidence indicates that disruption to motor timing and integration may underpin the disorder, providing a potential new computational marker for its early identification. In this study, we employed smart tablet computers with touch-sensitive screens and embedded inertial movement sensors to record the movement kinematics and gesture forces made by 37 children 3-6 years old with autism and 45 age- and gender-matched children developing typically. Machine learning analysis of the children's motor patterns identified autism with up to 93% accuracy. Analysis revealed that these patterns consisted of greater forces at contact and a different distribution of forces within a gesture, and that gesture kinematics were faster and larger, with more distal use of space. These data support the notion that disruption to movement is a core feature of autism, and demonstrate that autism can be computationally assessed by fun, smart device gameplay.
NASA Astrophysics Data System (ADS)
Reeta Felscia, U.; Rajkumar, Beulah J. M.; Sankar, Pranitha; Philip, Reji; Briget Mary, M.
2017-09-01
The interaction of pyrene with silver has been investigated using both experimental and computational methods. Hyperpolarizabilities computed theoretically, together with experimental nonlinear absorption from open-aperture Z-scan measurements, point towards a possible use of pyrene adsorbed on silver in the rational design of NLO devices. The presence of a red shift in both simulated and experimental UV-Vis spectra confirms the adsorption on silver, which is due to the electrostatic interaction between silver and pyrene, inducing variations in the structural parameters of pyrene. Fukui calculations along with the MEP plot predict the electrophilic nature of the silver cluster in the presence of pyrene, with NBO analysis revealing that the adsorption causes charge redistribution from the first three rings of pyrene towards the fourth ring, from where the 2p orbitals of carbon interact with the valence 5s orbitals of the cluster. This is further confirmed by the downshifting of ring breathing modes in both the experimental and theoretical Raman spectra.
Anzulewicz, Anna; Sobota, Krzysztof; Delafield-Butt, Jonathan T.
2016-01-01
Autism is a developmental disorder evident from infancy. Yet, its clinical identification requires expert diagnostic training. New evidence indicates that disruption to motor timing and integration may underpin the disorder, providing a potential new computational marker for its early identification. In this study, we employed smart tablet computers with touch-sensitive screens and embedded inertial movement sensors to record the movement kinematics and gesture forces made by 37 children 3-6 years old with autism and 45 age- and gender-matched children developing typically. Machine learning analysis of the children's motor patterns identified autism with up to 93% accuracy. Analysis revealed that these patterns consisted of greater forces at contact and a different distribution of forces within a gesture, and that gesture kinematics were faster and larger, with more distal use of space. These data support the notion that disruption to movement is a core feature of autism, and demonstrate that autism can be computationally assessed by fun, smart device gameplay. PMID:27553971
Bornik, Alexander; Urschler, Martin; Schmalstieg, Dieter; Bischof, Horst; Krauskopf, Astrid; Schwark, Thorsten; Scheurer, Eva; Yen, Kathrin
2018-06-01
Three-dimensional (3D) crime scene documentation using 3D scanners and medical imaging modalities like computed tomography (CT) and magnetic resonance imaging (MRI) are increasingly applied in forensic casework. Together with digital photography, these modalities enable comprehensive and non-invasive recording of forensically relevant information regarding injuries/pathologies inside the body and on its surface. Furthermore, it is possible to capture traces and items at crime scenes. Such digitally secured evidence has the potential to similarly increase case understanding by forensic experts and non-experts in court. Unlike photographs and 3D surface models, images from CT and MRI are not self-explanatory. Their interpretation and understanding requires radiological knowledge. Findings in tomography data must not only be revealed, but should also be jointly studied with all the 2D and 3D data available in order to clarify spatial interrelations and to optimally exploit the data at hand. This is technically challenging due to the heterogeneous data representations including volumetric data, polygonal 3D models, and images. This paper presents a novel computer-aided forensic toolbox providing tools to support the analysis, documentation, annotation, and illustration of forensic cases using heterogeneous digital data. Conjoint visualization of data from different modalities in their native form and efficient tools to visually extract and emphasize findings help experts to reveal unrecognized correlations and thereby enhance their case understanding. Moreover, the 3D case illustrations created for case analysis represent an efficient means to convey the insights gained from case analysis to forensic non-experts involved in court proceedings like jurists and laymen. The capability of the presented approach in the context of case analysis, its potential to speed up legal procedures and to ultimately enhance legal certainty is demonstrated by introducing a number of representative forensic cases. Copyright © 2018 The Author(s). Published by Elsevier B.V. All rights reserved.
Internet and computer based interventions for cannabis use: a meta-analysis.
Tait, Robert J; Spijkerman, Renske; Riper, Heleen
2013-12-01
Worldwide, cannabis is the most prevalently used illegal drug and creates demand for prevention and treatment services that cannot be fulfilled using conventional approaches. Computer and Internet-based interventions may have the potential to meet this need. Therefore, we systematically reviewed the literature and conducted a meta-analysis on the effectiveness of this approach in reducing the frequency of cannabis use. We systematically searched online databases (Medline, PubMed, PsychINFO, Embase) for eligible studies and conducted a meta-analysis. Studies had to use a randomized design, be delivered either via the Internet or computer and report separate outcomes for cannabis use. The principal outcome measure was the frequency of cannabis use. Data were extracted from 10 studies and the meta-analysis involved 10 comparisons with 4,125 participants. The overall effect size was small but significant, g=0.16 (95% confidence interval (CI) 0.09-0.22, P<0.001) at post-treatment. Subgroup analyses did not reveal significant subgroup differences for key factors including type of analysis (intention-to-treat, completers only), type of control (active, waitlist), age group (11-16, 17+ years), gender composition (female only, mixed), type of intervention (prevention, 'treatment'), guided versus unguided programs, mode of delivery (Internet, computer), individual versus family dyad and venue (home, research setting). Also, no significant moderation effects were found for number of sessions and time to follow-up. Finally, there was no evidence of publication bias. Internet and computer interventions appear to be effective in reducing cannabis use in the short-term albeit based on data from few studies and across diverse samples. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
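A small sketch of fixed-effect inverse-variance pooling of standardized mean differences (Hedges' g), the kind of arithmetic underlying such a meta-analysis; the ten per-study values are invented and are not the review's data.

```python
import numpy as np

# Invented per-study effect sizes (Hedges' g) and group sizes, illustration only.
g  = np.array([0.10, 0.25, 0.05, 0.30, 0.12, 0.18, 0.08, 0.22, 0.15, 0.20])
n1 = np.array([210, 150, 320, 90, 400, 120, 260, 180, 300, 95])
n2 = np.array([205, 160, 310, 85, 410, 115, 250, 175, 290, 100])

var_g = (n1 + n2) / (n1 * n2) + g ** 2 / (2 * (n1 + n2))   # approximate variance of g
w = 1.0 / var_g                                            # inverse-variance weights

pooled = np.sum(w * g) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
print(f"pooled g = {pooled:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```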
On the importance of mathematical methods for analysis of MALDI-imaging mass spectrometry data.
Trede, Dennis; Kobarg, Jan Hendrik; Oetjen, Janina; Thiele, Herbert; Maass, Peter; Alexandrov, Theodore
2012-03-21
In the last decade, matrix-assisted laser desorption/ionization (MALDI) imaging mass spectrometry (IMS), also called MALDI-imaging, has proven its potential in proteomics and has been successfully applied to various types of biomedical problems, in particular to histopathological label-free analysis of tissue sections. In histopathology, MALDI-imaging is used as a general analytic tool revealing the functional proteomic structure of tissue sections, and as a discovery tool for detecting new biomarkers discriminating a region annotated by an experienced histologist, in particular for cancer studies. A typical MALDI-imaging data set contains 10⁸ to 10⁹ intensity values occupying more than 1 GB. Analysis and interpretation of such a huge amount of data is a mathematically, statistically and computationally challenging problem. In this paper we overview some computational methods for analysis of MALDI-imaging data sets. We discuss the importance of data preprocessing, which typically includes normalization, baseline removal and peak picking, and highlight the importance of image denoising when visualizing IMS data.
On the Importance of Mathematical Methods for Analysis of MALDI-Imaging Mass Spectrometry Data.
Trede, Dennis; Kobarg, Jan Hendrik; Oetjen, Janina; Thiele, Herbert; Maass, Peter; Alexandrov, Theodore
2012-03-01
In the last decade, matrix-assisted laser desorption/ionization (MALDI) imaging mass spectrometry (IMS), also called MALDI-imaging, has proven its potential in proteomics and has been successfully applied to various types of biomedical problems, in particular to histopathological label-free analysis of tissue sections. In histopathology, MALDI-imaging is used as a general analytic tool revealing the functional proteomic structure of tissue sections, and as a discovery tool for detecting new biomarkers discriminating a region annotated by an experienced histologist, in particular for cancer studies. A typical MALDI-imaging data set contains 10⁸ to 10⁹ intensity values occupying more than 1 GB. Analysis and interpretation of such a huge amount of data is a mathematically, statistically and computationally challenging problem. In this paper we overview some computational methods for analysis of MALDI-imaging data sets. We discuss the importance of data preprocessing, which typically includes normalization, baseline removal and peak picking, and highlight the importance of image denoising when visualizing IMS data.
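A compact sketch of the three preprocessing steps named above, applied to a synthetic single spectrum: total-ion-current normalization, a crude moving-minimum baseline (a stand-in for the more elaborate baseline estimators used in practice), and peak picking with scipy.signal.find_peaks. All signal parameters are invented.

```python
import numpy as np
from scipy.signal import find_peaks
from scipy.ndimage import minimum_filter1d, uniform_filter1d

rng = np.random.default_rng(6)

# Synthetic stand-in for one MALDI spectrum: a slowly decaying baseline,
# a few Gaussian peaks, and noise.
mz = np.linspace(1000, 5000, 4000)
baseline_true = 50 * np.exp(-(mz - 1000) / 2000)
peaks_true = sum(a * np.exp(-((mz - c) / 8.0) ** 2)
                 for a, c in [(300, 1520), (150, 2330), (220, 3810)])
spectrum = baseline_true + peaks_true + rng.normal(0, 3, mz.size)

# 1) TIC normalization: divide by the total ion current.
spectrum = spectrum / spectrum.sum()

# 2) Baseline removal: a smoothed moving-minimum baseline estimate.
baseline = uniform_filter1d(minimum_filter1d(spectrum, size=301), size=301)
signal = np.clip(spectrum - baseline, 0, None)

# 3) Peak picking: local maxima above a simple prominence threshold.
idx, _ = find_peaks(signal, prominence=3 * signal.std())
print("picked peaks at m/z:", np.round(mz[idx], 1))
```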
Annotation analysis for testing drug safety signals using unstructured clinical notes
2012-01-01
Background: The electronic surveillance for adverse drug events is largely based upon the analysis of coded data from reporting systems. Yet, the vast majority of electronic health data lies embedded within the free text of clinical notes and is not gathered into centralized repositories. With increasing access to large volumes of electronic medical data, in particular the clinical notes, it may be possible to computationally encode and test drug safety signals in an active manner. Results: We describe the application of simple annotation tools on clinical text and the mining of the resulting annotations to compute the risk of myocardial infarction for patients with rheumatoid arthritis who take Vioxx. Our analysis clearly reveals an elevated risk of myocardial infarction in rheumatoid arthritis patients taking Vioxx (odds ratio 2.06) before 2005. Conclusions: Our results show that it is possible to apply annotation analysis methods for testing hypotheses about drug safety using electronic medical records. PMID:22541596
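The odds-ratio arithmetic behind such an annotation analysis, shown on an invented 2x2 table of annotated patients (exposure = Vioxx mention, outcome = myocardial-infarction mention); the counts are chosen only so the calculation lands near the reported value of about 2 and have no relation to the study's data.

```python
import math

# Invented 2x2 counts of annotated rheumatoid-arthritis patients.
mi_exposed, no_mi_exposed = 60, 940          # Vioxx-exposed: MI vs no MI
mi_unexposed, no_mi_unexposed = 150, 4850    # unexposed: MI vs no MI

odds_ratio = (mi_exposed / no_mi_exposed) / (mi_unexposed / no_mi_unexposed)

# Approximate 95% CI on the log-odds scale (Woolf method).
se_log_or = math.sqrt(1 / mi_exposed + 1 / no_mi_exposed
                      + 1 / mi_unexposed + 1 / no_mi_unexposed)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```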
NASA Astrophysics Data System (ADS)
Falkner, Katrina; Vivian, Rebecca
2015-10-01
To support teachers in implementing Computer Science curricula in classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age children with the intention of engaging children and increasing interest, rather than formally teaching concepts and skills. What is the educational quality of existing Computer Science resources, and to what extent are they suitable for classroom learning and teaching? In this paper, an assessment framework is presented to evaluate the quality of online Computer Science resources. Further, a semi-systematic review of available online Computer Science resources was conducted to evaluate resources available for classroom learning and teaching and to identify gaps in resource availability, using the Australian curriculum as a case study. The findings reveal a predominance of quality resources; however, a number of critical gaps were identified. This paper provides recommendations and guidance for the development of new and supplementary resources and future research.
MinePath: Mining for Phenotype Differential Sub-paths in Molecular Pathways
Koumakis, Lefteris; Kartsaki, Evgenia; Chatzimina, Maria; Zervakis, Michalis; Vassou, Despoina; Marias, Kostas; Moustakis, Vassilis; Potamias, George
2016-01-01
Pathway analysis methodologies couple traditional gene expression analysis with knowledge encoded in established molecular pathway networks, offering a promising approach towards the biological interpretation of phenotype-differentiating genes. Early pathway analysis methodologies, known as gene set analysis (GSA), view pathways just as plain lists of genes without taking into account either the underlying pathway network topology or the involved gene regulatory relations. These approaches, even if they achieve computational efficiency and simplicity, consider pathways that involve the same genes as equivalent in terms of their gene enrichment characteristics. More recent pathway analysis approaches take into account the underlying gene regulatory relations by examining their consistency with gene expression profiles and computing a score for each profile. Even with this approach, assessing and scoring single relations limits the ability to reveal key gene regulation mechanisms hidden in longer pathway sub-paths. We introduce MinePath, a pathway analysis methodology that addresses and overcomes the aforementioned problems. MinePath facilitates the decomposition of pathways into their constituent sub-paths. Decomposition leads to the transformation of single relations into complex regulation sub-paths. Regulation sub-paths are then matched with gene expression sample profiles in order to evaluate their functional status and to assess phenotype differential power. Assessment of differential power supports the identification of the most discriminant profiles. In addition, MinePath assesses the significance of the pathways as a whole, ranking them by their p-values. Comparison results with state-of-the-art pathway analysis systems are indicative of the soundness and reliability of the MinePath approach. In contrast with many pathway analysis tools, MinePath is a web-based system (www.minepath.org) offering dynamic and rich pathway visualization functionality, with the unique characteristic of coloring regulatory relations between genes to reveal their phenotype inclination. This unique characteristic makes MinePath a valuable tool for in silico molecular biology experimentation, as it serves biomedical researchers' exploratory needs to reveal and interpret the regulatory mechanisms that underlie and putatively govern the expression of target phenotypes. PMID:27832067
MinePath: Mining for Phenotype Differential Sub-paths in Molecular Pathways.
Koumakis, Lefteris; Kanterakis, Alexandros; Kartsaki, Evgenia; Chatzimina, Maria; Zervakis, Michalis; Tsiknakis, Manolis; Vassou, Despoina; Kafetzopoulos, Dimitris; Marias, Kostas; Moustakis, Vassilis; Potamias, George
2016-11-01
Pathway analysis methodologies couple traditional gene expression analysis with knowledge encoded in established molecular pathway networks, offering a promising approach towards the biological interpretation of phenotype-differentiating genes. Early pathway analysis methodologies, known as gene set analysis (GSA), view pathways just as plain lists of genes without taking into account either the underlying pathway network topology or the involved gene regulatory relations. These approaches, even if they achieve computational efficiency and simplicity, consider pathways that involve the same genes as equivalent in terms of their gene enrichment characteristics. More recent pathway analysis approaches take into account the underlying gene regulatory relations by examining their consistency with gene expression profiles and computing a score for each profile. Even with this approach, assessing and scoring single relations limits the ability to reveal key gene regulation mechanisms hidden in longer pathway sub-paths. We introduce MinePath, a pathway analysis methodology that addresses and overcomes the aforementioned problems. MinePath facilitates the decomposition of pathways into their constituent sub-paths. Decomposition leads to the transformation of single relations into complex regulation sub-paths. Regulation sub-paths are then matched with gene expression sample profiles in order to evaluate their functional status and to assess phenotype differential power. Assessment of differential power supports the identification of the most discriminant profiles. In addition, MinePath assesses the significance of the pathways as a whole, ranking them by their p-values. Comparison results with state-of-the-art pathway analysis systems are indicative of the soundness and reliability of the MinePath approach. In contrast with many pathway analysis tools, MinePath is a web-based system (www.minepath.org) offering dynamic and rich pathway visualization functionality, with the unique characteristic of coloring regulatory relations between genes to reveal their phenotype inclination. This unique characteristic makes MinePath a valuable tool for in silico molecular biology experimentation, as it serves biomedical researchers' exploratory needs to reveal and interpret the regulatory mechanisms that underlie and putatively govern the expression of target phenotypes.
Reflectance measurements for the detection and mapping of soil limitations
NASA Technical Reports Server (NTRS)
Benson, L. A.; Frazee, C. J.
1973-01-01
During 1971 and 1972 research was conducted on two fallow fields in the proposed Oahe Irrigation Project to investigate the relationship between the tonal variations observed on aerial photographs and the principal soil limitations of the area. A grid sampling procedure was used to collect detailed field data during the 1972 growing season. The field data were compared to imagery collected on May 14, 1971 at 3050 meters altitude. The imagery and field data were initially evaluated by a visual analysis. Correlation and regression analysis revealed a highly significant relationship between the digitized color infrared film data and soil properties such as organic matter content, color, depth to carbonates, bulk density and reflectivity. Computer classification of the multiemulsion film data resulted in maps delineating the areas containing claypan and erosion limitations. Reflectance data from the red spectral band provided the best results.
NASA Astrophysics Data System (ADS)
Pathak, Rohit; Joshi, Satyadhar
Within a span of over a decade, India has become one of the most favored destinations across the world for Business Process Outsourcing (BPO) operations. India has rapidly achieved the status of being the most preferred destination for BPO for companies located in the US and Europe. Security and privacy are the two major issues that need to be addressed by the Indian software industry to secure increased and long-term outsourcing contracts from the US. Another important issue is the sharing of employees' information to ensure that the data and vital information of an outsourcing company are secured and protected. To ensure that the confidentiality of a client's information is maintained, BPOs need to implement data security measures. In this paper, we propose a new protocol specifically for BPO secure multi-party computation (SMC). As many computations and surveys involve confidential data from many parties or organizations, and the concerned data are the property of those organizations, preservation and security of the data are of prime importance for such computations. Although the computation requires data from all the parties, none of the associated parties would want to reveal their data to the other parties. We propose a new efficient and scalable protocol to perform computation on encrypted information. The information is encrypted in a manner that does not affect the result of the computation. The protocol uses modifier tokens, which are distributed among virtual parties and finally used in the computation. The computation function uses the acquired data and modifier tokens to compute the right result from the encrypted data. Thus, without revealing the data, the right result can be computed and the privacy of the parties is maintained. We give a probabilistic security analysis of hacking the protocol and show how zero hacking security can be achieved. We also analyze the specific case of Indian BPO.
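The general flavor of computing on masked data can be illustrated with a standard additive-masking secure sum, sketched below; this is not the paper's protocol (which distributes modifier tokens among virtual parties), only a simple construction showing how shares that individually reveal nothing still combine to the correct total.

```python
import secrets

# Additive secret sharing: each party splits its private value into random
# shares modulo a public prime so that no single share reveals the value,
# yet all shares together sum to the correct total.
P = 2**61 - 1                      # public modulus
private_values = [1200, 534, 987]  # one confidential value per party (illustrative)

def make_shares(value, n_shares):
    shares = [secrets.randbelow(P) for _ in range(n_shares - 1)]
    shares.append((value - sum(shares)) % P)   # last share fixes the total
    return shares

n_parties = len(private_values)
all_shares = [make_shares(v, n_parties) for v in private_values]

# Each party locally sums the one share it received from every other party...
partial_sums = [sum(all_shares[p][i] for p in range(n_parties)) % P
                for i in range(n_parties)]
# ...and the published partial sums combine to the true total without anyone
# having seen another party's raw value.
print("secure total:", sum(partial_sums) % P, "| plain total:", sum(private_values))
```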
NASA Astrophysics Data System (ADS)
Sathya, K.; Dhamodharan, P.; Dhandapani, M.
2018-03-01
A new hydrogen-bonded proton transfer complex, 2-methyl imidazolium 3,5-dinitrobenzoate 3,5-dinitrobenzoic acid (MIDB), was synthesized by the reaction between 2-methyl imidazole and 3,5-dinitrobenzoic acid (1:2) in methanol solvent at room temperature. The crystals were subjected to FT-IR spectral analysis to confirm the functional groups of the new compound. Single crystal XRD analysis reveals that MIDB belongs to the monoclinic system with the P21/c space group. The asymmetric unit consists of one 2-methyl imidazolium cation, one 3,5-dinitrobenzoate anion and one uncharged 3,5-dinitrobenzoic acid moiety. Experimental NMR spectroscopic data and theoretically calculated NMR data correlated very well, establishing the exact carbon skeleton and hydrogen environment in the molecular structure of MIDB. The thermal stability of the compound was investigated by thermogravimetry and differential thermal analysis (TG-DTA). Computational studies such as optimization of the molecular geometry, natural bond orbital (NBO) analysis, Mulliken population analysis and HOMO-LUMO analysis were performed using the Gaussian 09 software by the B3LYP method at the 6-31G basis set level. The calculated first-order hyperpolarizability (β) of MIDB from the computational studies is 4.1752 × 10⁻³⁰ esu, which is 32 times greater than that of urea. UV-vis-NIR spectral studies revealed that MIDB has a large optical transparency window. The optical nonlinearities of MIDB were investigated by the Z-scan technique with He-Ne laser radiation at a wavelength of 632.8 nm. Hirshfeld analysis indicates that O⋯H/H⋯O interactions are the dominant interactions, confirming the extensive hydrogen bond network in the molecular structure.
Lin, Chun-Li; Chang, Yen-Hsiang; Hsieh, Shih-Kai; Chang, Wen-Jen
2013-03-01
This study evaluated the risk of failure for an endodontically treated premolar with cracks of different depths shearing toward the pulp chamber, restored using 3 different computer-aided design/computer-aided manufacturing ceramic restoration configurations. Three 3-dimensional finite element models designed with computer-aided design/computer-aided manufacturing ceramic onlay, endocrown, and conventional crown restorations were constructed to perform simulations. The Weibull function was incorporated with finite element analysis to calculate the long-term failure probability relative to different load conditions. The results indicated that the stress values on the enamel, dentin, and luting cement for endocrown restorations were the lowest relative to the other 2 restoration methods. Weibull analysis revealed that the overall failure probabilities in a shallow cracked premolar were 27%, 2%, and 1% for the onlay, endocrown, and conventional crown restorations, respectively, in the normal occlusal condition. The corresponding values were 70%, 10%, and 2% for the deeply cracked premolar. This numeric investigation suggests that the endocrown provides sufficient fracture resistance only in a shallow cracked premolar with endodontic treatment. The conventional crown treatment can immobilize the premolar for different crack depths with lower failure risk. Copyright © 2013 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
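A minimal sketch of the Weibull step in such an analysis: given a characteristic strength and Weibull modulus for the ceramic and a peak stress from the finite element solution, the failure probability is P_f = 1 - exp(-(sigma/sigma0)^m). All parameter values below are invented and are not the article's material data.

```python
import math

def weibull_failure_probability(stress_mpa, sigma0_mpa, m):
    """Two-parameter Weibull failure probability P_f = 1 - exp(-(sigma/sigma0)^m)."""
    return 1.0 - math.exp(-((stress_mpa / sigma0_mpa) ** m))

# Invented ceramic parameters and FEA peak stresses (MPa) for three
# restoration designs under a normal occlusal load, illustrative only.
sigma0, m = 380.0, 12.0
peak_stress = {"onlay": 330.0, "endocrown": 250.0, "crown": 230.0}

for design, s in peak_stress.items():
    print(f"{design:9s}: P_f = {weibull_failure_probability(s, sigma0, m):.1%}")
```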
Singh, Vinod Kumar; Krishnamachari, Annangarachari
2016-09-01
Genome-wide experimental studies in Saccharomyces cerevisiae reveal that an autonomously replicating sequence (ARS) requires an essential consensus sequence (ACS) for replication activity. Computational studies have identified thousands of ACS-like patterns in the genome. However, only a few hundred of these sites act as replication sites, and the rest are considered dormant or evolving sites. In a bid to understand the sequence makeup of replication sites, a content- and context-based analysis was performed on a set of replicating ACS sequences that bind the origin recognition complex (ORC), denoted ORC-ACS, and non-replicating ACS sequences (nrACS) that are not bound by ORC. In this study, DNA properties such as base composition, correlation, sequence-dependent thermodynamic and DNA structural profiles, and their positions have been considered for characterizing ORC-ACS and nrACS. Analysis reveals that ORC-ACS depict marked differences in nucleotide composition and context features in their vicinity compared to nrACS. Interestingly, an A-rich motif was also discovered in ORC-ACS sequences within their nucleosome-free regions. Profound changes in conformational features, such as DNA helical twist, inclination angle and stacking energy, between ORC-ACS and nrACS were observed. The distribution of ACS motifs in non-coding segments shows that ORC-ACS are located farther from the adjacent gene start positions than nrACS, thereby enabling an accessible environment for ORC proteins. Our attempt is novel in considering the contextual view of ACS and its flanking region along with nucleosome positioning in the S. cerevisiae genome, and may be useful for any computational prediction scheme.
Loughman, James; Davison, Peter; Flitcroft, Ian
2007-11-01
Preattentive visual search (PAVS) describes rapid and efficient retinal and neural processing capable of immediate target detection in the visual field. Damage to the nerve fibre layer or visual pathway might reduce the efficiency with which the visual system performs such analysis. The purpose of this study was to test the hypothesis that patients with glaucoma are impaired on parallel search tasks, and that this would serve to distinguish glaucoma in early cases. Three groups of observers (glaucoma patients, suspect and normal individuals) were examined, using computer-generated flicker, orientation, and vertical motion displacement targets to assess PAVS efficiency. The task required rapid and accurate localisation of a singularity embedded in a field of 119 homogeneous distractors on either the left or right-hand side of a computer monitor. All subjects also completed a choice reaction time (CRT) task. Independent sample T tests revealed PAVS efficiency to be significantly impaired in the glaucoma group compared with both normal and suspect individuals. Performance was impaired in all types of glaucoma tested. Analysis between normal and suspect individuals revealed a significant difference only for motion displacement response times. Similar analysis using a PAVS/CRT index confirmed the glaucoma findings but also showed statistically significant differences between suspect and normal individuals across all target types. A test of PAVS efficiency appears capable of differentiating early glaucoma from both normal and suspect cases. Analysis incorporating a PAVS/CRT index enhances the diagnostic capacity to differentiate normal from suspect cases.
A novel quantum scheme for secure two-party distance computation
NASA Astrophysics Data System (ADS)
Peng, Zhen-wan; Shi, Run-hua; Zhong, Hong; Cui, Jie; Zhang, Shun
2017-12-01
Secure multiparty computational geometry is an essential field of secure multiparty computation, which solves computational geometry problems without revealing any private information of each party. Secure two-party distance computation is a primitive of secure multiparty computational geometry, which computes the distance between two points without revealing each point's location information (i.e., its coordinates). Secure two-party distance computation has potential applications with high security requirements in military, business, engineering and other domains. In this paper, we present a quantum solution to secure two-party distance computation by subtly using quantum private query. Compared to the classical related protocols, our quantum protocol can ensure higher security and better privacy protection because of the physical principles of quantum mechanics.
Velázquez, Claudia; Correa-Basurto, José; Garcia-Hernandez, Normand; Barbosa, Elizabeth; Tesoro-Cruz, Emiliano; Calzada, Samuel; Calzada, Fernando
2012-09-28
Chiranthodendron pentadactylon Larreat is frequently used in Mexican as well as Guatemalan traditional medicine for several medicinal purposes, including the control of diarrhea. This work was undertaken to obtain additional information that supports the traditional use of Chiranthodendron pentadactylon Larreat on a pharmacological basis, using the major antisecretory compound isolated from the plant in computational, in vitro and in vivo experiments. (-)-Epicatechin was isolated from the ethyl acetate fraction of the plant crude extract. In vivo toxin (Vibrio cholerae or Escherichia coli)-induced intestinal secretion in rat jejunal loop models and sodium dodecyl sulphate-polyacrylamide gel electrophoresis (SDS-PAGE) analysis of Vibrio cholerae toxin were used in the experimental studies, while the molecular docking technique was used to conduct the computational study. The antisecretory activity of epicatechin was tested against Vibrio cholerae and Escherichia coli toxins at an oral dose of 10 mg/kg in the rat model. It exhibited the most potent activity against Vibrio cholerae toxin (56.9% inhibition). In the case of Escherichia coli toxin its effect was moderate (24.1% inhibition). SDS-PAGE analysis revealed that both (-)-epicatechin and the Chiranthodendron pentadactylon extract interacted with Vibrio cholerae toxin at concentrations of 80 μg/mL and 300 μg/mL, respectively. Computational molecular docking showed that epicatechin interacted with four amino acid residues (Asn 103, Phe 31, Phe 223 and Thr 78) in the catalytic site of Vibrio cholerae toxin, revealing its potential binding mode at the molecular level. The results derived from the computational, in vitro and in vivo experiments on Vibrio cholerae and Escherichia coli toxins confirm the potential of epicatechin as a new antisecretory compound and give additional scientific support to the anecdotal use of Chiranthodendron pentadactylon Larreat in Mexican traditional medicine to treat gastrointestinal disorders such as diarrhea. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Crysalis: an integrated server for computational analysis and design of protein crystallization.
Wang, Huilin; Feng, Liubin; Zhang, Ziding; Webb, Geoffrey I; Lin, Donghai; Song, Jiangning
2016-02-24
The failure of multi-step experimental procedures to yield diffraction-quality crystals is a major bottleneck in protein structure determination. Accordingly, several bioinformatics methods have been successfully developed and employed to select crystallizable proteins. Unfortunately, the majority of existing in silico methods only allow the prediction of crystallization propensity, seldom enabling computational design of protein mutants that can be targeted for enhancing protein crystallizability. Here, we present Crysalis, an integrated crystallization analysis tool that builds on support-vector regression (SVR) models to facilitate computational protein crystallization prediction, analysis, and design. More specifically, the functionality of this new tool includes: (1) rapid selection of target crystallizable proteins at the proteome level, (2) identification of site non-optimality for protein crystallization and systematic analysis of all potential single-point mutations that might enhance protein crystallization propensity, and (3) annotation of target protein based on predicted structural properties. We applied the design mode of Crysalis to identify site non-optimality for protein crystallization on a proteome-scale, focusing on proteins currently classified as non-crystallizable. Our results revealed that site non-optimality is based on biases related to residues, predicted structures, physicochemical properties, and sequence loci, which provides in-depth understanding of the features influencing protein crystallization. Crysalis is freely available at http://nmrcen.xmu.edu.cn/crysalis/.
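A hedged sketch of the general SVR idea behind such crystallization-propensity predictors is shown below; the amino acid composition features, toy sequences, and propensity scores are invented for illustration and are far simpler than Crysalis' actual feature set and models.

```python
# Sketch: an SVR-based crystallization-propensity predictor trained on
# simple sequence features (illustrative assumption, not Crysalis itself).
import numpy as np
from sklearn.svm import SVR

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def composition_features(seq):
    """Amino acid composition as a fixed-length feature vector."""
    seq = seq.upper()
    return np.array([seq.count(a) / len(seq) for a in AMINO_ACIDS])

# Toy training set: (sequence, crystallization propensity score in [0, 1]).
train = [("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ", 0.7),
         ("MSHHWGYGKHNGPEHWHKDFPIAKGERQSPVDI", 0.3)]
X = np.vstack([composition_features(s) for s, _ in train])
y = np.array([score for _, score in train])

model = SVR(kernel="rbf", C=1.0).fit(X, y)
query = composition_features("MADEEKLPPGWEKRMSRSSGRVYYFNHITNASQWERPSG")
print(model.predict(query.reshape(1, -1)))
```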
The computational worm: spatial orientation and its neuronal basis in C. elegans.
Lockery, Shawn R
2011-10-01
Spatial orientation behaviors in animals are fundamental for survival but poorly understood at the neuronal level. The nematode Caenorhabditis elegans orients to a wide range of stimuli and has a numerically small and well-described nervous system making it advantageous for investigating the mechanisms of spatial orientation. Recent work by the C. elegans research community has identified essential computational elements of the neural circuits underlying two orientation strategies that operate in five different sensory modalities. Analysis of these circuits reveals novel motifs including simple circuits for computing temporal derivatives of sensory input and for integrating sensory input with behavioral state to generate adaptive behavior. These motifs constitute hypotheses concerning the identity and functionality of circuits controlling spatial orientation in higher organisms. Copyright © 2011 Elsevier Ltd. All rights reserved.
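One of the circuit motifs mentioned above, computing a temporal derivative of a sensory input, can be sketched as the difference between a fast and a slow running average; the time constants and stimulus below are illustrative assumptions, not parameters from the reviewed work.

```python
import numpy as np

def temporal_derivative(signal, dt, tau_fast=0.5, tau_slow=5.0):
    """Approximate d(signal)/dt as the difference of fast and slow filters."""
    fast = np.zeros_like(signal)
    slow = np.zeros_like(signal)
    for t in range(1, len(signal)):
        fast[t] = fast[t-1] + dt / tau_fast * (signal[t] - fast[t-1])
        slow[t] = slow[t-1] + dt / tau_slow * (signal[t] - slow[t-1])
    return fast - slow   # positive when the stimulus is increasing

t = np.arange(0, 60, 0.1)
concentration = np.clip(t - 20, 0, None) * 0.05      # ramping attractant stimulus
print(temporal_derivative(concentration, dt=0.1)[::100])
```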
Al-Anzi, Bader; Arpp, Patrick; Gerges, Sherif; Ormerod, Christopher; Olsman, Noah; Zinn, Kai
2015-05-01
An approach combining genetic, proteomic, computational, and physiological analysis was used to define a protein network that regulates fat storage in budding yeast (Saccharomyces cerevisiae). A computational analysis of this network shows that it is not scale-free, and is best approximated by the Watts-Strogatz model, which generates "small-world" networks with high clustering and short path lengths. The network is also modular, containing energy level sensing proteins that connect to four output processes: autophagy, fatty acid synthesis, mRNA processing, and MAP kinase signaling. The importance of each protein to network function is dependent on its Katz centrality score, which is related both to the protein's position within a module and to the module's relationship to the network as a whole. The network is also divisible into subnetworks that span modular boundaries and regulate different aspects of fat metabolism. We used a combination of genetics and pharmacology to simultaneously block output from multiple network nodes. The phenotypic results of this blockage define patterns of communication among distant network nodes, and these patterns are consistent with the Watts-Strogatz model.
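The network measures named above (clustering, path length, Katz centrality) can be computed with standard graph tooling, as in this minimal sketch; the node names and edges are placeholders, not the yeast fat-storage network itself.

```python
# Sketch: characterizing a small protein-interaction graph with the metrics
# discussed above. Nodes and edges are invented for illustration.
import networkx as nx

edges = [("EnergySensorA", "Autophagy1"), ("EnergySensorA", "FASynth1"),
         ("EnergySensorB", "FASynth1"), ("FASynth1", "mRNAProc1"),
         ("mRNAProc1", "MAPK1"), ("EnergySensorB", "MAPK1"),
         ("Autophagy1", "FASynth1")]
G = nx.Graph(edges)

print("clustering coefficient:", nx.average_clustering(G))
print("average path length:", nx.average_shortest_path_length(G))
print("Katz centrality:", nx.katz_centrality(G, alpha=0.1))
```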
DOE Office of Scientific and Technical Information (OSTI.GOV)
ElNaggar, Mariam S; Barbier, Charlotte N; Van Berkel, Gary J
A coaxial geometry liquid microjunction surface sampling probe (LMJ-SSP) enables direct extraction of analytes from surfaces for subsequent analysis by techniques like mass spectrometry. Solution dynamics at the probe-to-sample surface interface in the LMJ-SSP has been suspected to influence sampling efficiency and dispersion but has not been rigorously investigated. The effect on flow dynamics and analyte transport to the mass spectrometer caused by coaxial retraction of the inner and outer capillaries from each other and the surface during sampling with a LMJ-SSP was investigated using computational fluid dynamics and experimentation. A transparent LMJ-SSP was constructed to provide the means for visual observation of the dynamics of the surface sampling process. Visual observation, computational fluid dynamics (CFD) analysis, and experimental results revealed that inner capillary axial retraction from the flush position relative to the outer capillary transitioned the probe from a continuous sampling and injection mode through an intermediate regime to a sample plug formation mode caused by eddy currents at the sampling end of the probe. The potential for analytical implementation of these newly discovered probe operational modes is discussed.
Moris, Izabela C M; Monteiro, Silas Borges; Martins, Raíssa; Ribeiro, Ricardo Faria; Gomes, Erica A
2018-01-01
To evaluate the influence of different manufacturing methods of single implant-supported metallic crowns on the internal and external marginal fit through computed microtomography. Forty external hexagon implants were divided into 4 groups (n = 8) according to the manufacturing method: GC, conventional casting; GI, induction casting; GP, plasma casting; and GCAD, CAD/CAM machining. The crowns were attached to the implants with an insertion torque of 30 N·cm. The external (vertical and horizontal) marginal fit and internal fit were assessed through computed microtomography. Internal and external marginal fit data (μm) were submitted to a one-way ANOVA and Tukey's test (α = .05). Qualitative evaluation of the images was conducted by using micro-CT. The statistical analysis revealed no significant difference between the groups for vertical misfit (P = 0.721). There was no significant difference (P > 0.05) for the internal and horizontal marginal misfit among the groups GC, GI, and GP, but a difference was found for the group GCAD (P ≤ 0.05). Qualitative analysis revealed that most of the samples of the cast groups exhibited crown underextension while the group GCAD showed overextension. The manufacturing method of the crowns influenced the accuracy of marginal fit between the prosthesis and implant. The best results were found for the crowns fabricated through CAD/CAM machining.
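The statistical comparison described (one-way ANOVA followed by Tukey's test across four groups of eight) can be sketched as follows; the misfit values are invented placeholders, not the study's measurements.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical marginal misfit values (um) for the four manufacturing groups.
gc   = np.array([41, 38, 45, 39, 42, 40, 44, 43])   # conventional casting
gi   = np.array([40, 37, 43, 41, 39, 42, 44, 38])   # induction casting
gp   = np.array([42, 44, 39, 41, 40, 43, 38, 45])   # plasma casting
gcad = np.array([25, 27, 24, 26, 28, 23, 25, 27])   # CAD/CAM machining

print(stats.f_oneway(gc, gi, gp, gcad))              # one-way ANOVA

values = np.concatenate([gc, gi, gp, gcad])
groups = ["GC"] * 8 + ["GI"] * 8 + ["GP"] * 8 + ["GCAD"] * 8
print(pairwise_tukeyhsd(values, groups, alpha=0.05))  # Tukey's HSD pairwise test
```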
Ben Ayed, Rayda; Ben Hassen, Hanen; Ennouri, Karim; Rebai, Ahmed
2016-12-01
The genetic diversity of 22 olive tree cultivars (Olea europaea L.) sampled from different Mediterranean countries was assessed using 5 SNP markers (FAD2.1, FAD2.3, CALC, SOD and ANTHO3) located in four different genes. The genotyping analysis of the 22 cultivars with the 5 SNP loci revealed 11 alleles (an average of 2.2 per locus). The dendrogram based on cultivar genotypes revealed three clusters consistent with the cultivars' classification. In addition, the results obtained with the five SNPs were compared to those obtained with SSR markers using bioinformatic analyses and by computing a cophenetic correlation coefficient, indicating the usefulness of the UPGMA method for clustering plant genotypes. Based on principal coordinate analysis using a similarity matrix, the first two coordinates explained 54.94% of the total variance. This work provides a more comprehensive explanation of the diversity available in Tunisian olive cultivars and an important contribution to olive breeding and olive oil authenticity.
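A minimal sketch of UPGMA clustering of genotypes and the cophenetic correlation coefficient used to assess the tree is given below; the 0/1/2 genotype matrix (cultivars by loci) is a made-up placeholder, not the olive data.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, cophenet
from scipy.spatial.distance import pdist

genotypes = np.array([[0, 1, 2, 0, 1],
                      [0, 1, 2, 1, 1],
                      [2, 0, 0, 1, 2],
                      [2, 1, 0, 1, 2],
                      [1, 2, 1, 0, 0]])      # hypothetical cultivar x locus calls

dists = pdist(genotypes, metric="hamming")   # pairwise genotype dissimilarities
tree = linkage(dists, method="average")      # UPGMA = average linkage
coph_corr, _ = cophenet(tree, dists)
print("cophenetic correlation:", coph_corr)
```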
Computational analysis of the receptor binding specificity of novel influenza A/H7N9 viruses.
Zhou, Xinrui; Zheng, Jie; Ivan, Fransiskus Xaverius; Yin, Rui; Ranganathan, Shoba; Chow, Vincent T K; Kwoh, Chee-Keong
2018-05-09
Influenza viruses are undergoing continuous and rapid evolution. The fatal influenza A/H7N9 has drawn attention since the first wave of infections in March 2013, and raised more grave concerns with its increased potential to spread among humans. Experimental studies have revealed several host and virulence markers, indicating differential host binding preferences which can help estimate the potential of causing a pandemic. Here we systematically investigate the sequence pattern and structural characteristics of novel influenza A/H7N9 using computational approaches. The sequence analysis highlighted mutations in protein functional domains of influenza viruses. Molecular docking and molecular dynamics simulation revealed that the hemagglutinin (HA) of A/Taiwan/1/2017(H7N9) strain enhanced the binding with both avian and human receptor analogs, compared with the previous A/Shanghai/02/2013(H7N9) strain. The Molecular Mechanics - Poisson Boltzmann Surface Area (MM-PBSA) calculation revealed the change of residue-ligand interaction energy and detected the residues with conspicuous binding preference. The results are novel and specific to the emerging influenza A/Taiwan/1/2017(H7N9) strain compared with A/Shanghai/02/2013(H7N9). Its enhanced ability to bind human receptor analogs, which are abundant in the human upper respiratory tract, may be responsible for the recent outbreak. Residues showing binding preference were detected, which could facilitate monitoring the circulating influenza viruses.
Spectroscopic investigation of some building blocks of organic conductors: A comparative study
NASA Astrophysics Data System (ADS)
Mukherjee, V.; Yadav, T.
2017-04-01
Theoretical molecular structures and IR and Raman spectra of di- and tetramethyl-substituted tetrathiafulvalene and tetraselenafulvalene molecules have been studied. These molecules belong to the organic conductor family and are widely used as building blocks of several organic conducting devices. Hartree-Fock and density functional theory with the B3LYP functional have been employed for the computations. We have also performed normal coordinate analysis to scale the theoretical frequencies and to calculate potential energy distributions for unambiguous assignments. Excitation-frequency- and temperature-dependent Raman spectra are also presented. The optimization results reveal that the sulphur derivatives possess a boat shape while the selenium derivatives possess planar structures. Natural bond orbital analysis has also been performed to study second-order interactions between donors and acceptors and to compute molecular orbital occupancies and energies.
Interval sampling methods and measurement error: a computer simulation.
Wirth, Oliver; Slaven, James; Taylor, Matthew A
2014-01-01
A simulation study was conducted to provide a more thorough account of measurement error associated with interval sampling methods. A computer program simulated the application of momentary time sampling, partial-interval recording, and whole-interval recording methods on target events randomly distributed across an observation period. The simulation yielded measures of error for multiple combinations of observation period, interval duration, event duration, and cumulative event duration. The simulations were conducted up to 100 times to yield measures of error variability. Although the present simulation confirmed some previously reported characteristics of interval sampling methods, it also revealed many new findings that pertain to each method's inherent strengths and weaknesses. The analysis and resulting error tables can help guide the selection of the most appropriate sampling method for observation-based behavioral assessments. © Society for the Experimental Analysis of Behavior.
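The simulation idea can be illustrated with a short sketch that scores a random event stream with the three interval sampling methods and compares each estimate with the true event duration; session length, interval duration, and event probability are arbitrary illustrative choices, not the parameter grid used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)
session_s, interval_s = 600, 10
behavior = rng.random(session_s) < 0.3           # 1-second resolution event stream
true_fraction = behavior.mean()

intervals = behavior.reshape(-1, interval_s)
mts = intervals[:, -1].mean()                    # momentary time sampling: sample at interval end
pir = intervals.any(axis=1).mean()               # partial-interval: any occurrence in interval
wir = intervals.all(axis=1).mean()               # whole-interval: occurrence for entire interval

print(f"true={true_fraction:.2f}  MTS={mts:.2f}  PIR={pir:.2f}  WIR={wir:.2f}")
```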
NASA Astrophysics Data System (ADS)
Zhao, Xuemei; Li, Rui; Chen, Yu; Sia, Sheau Fung; Li, Donghai; Zhang, Yu; Liu, Aihua
2017-04-01
Additional hemodynamic parameters are highly desirable in the clinical management of intracranial aneurysm rupture, as static medical images cannot demonstrate the blood flow within aneurysms. There are two ways of obtaining the hemodynamic information: phase-contrast magnetic resonance imaging (PCMRI) and computational fluid dynamics (CFD). In this paper, we compared PCMRI and CFD in the analysis of a patient-specific stable aneurysm. The results showed that PCMRI and CFD are in good agreement with each other. An additional CFD study of two stable and two ruptured aneurysms revealed that ruptured aneurysms have a higher average blood velocity, wall shear stress, and oscillatory shear index (OSI) within the aneurysm sac compared to stable aneurysms. Furthermore, for ruptured aneurysms, the OSI divides the positive and negative wall shear stress divergence at the aneurysm sac.
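For reference, the oscillatory shear index at a wall point is commonly defined as OSI = 0.5 * (1 - |time-averaged WSS vector| / time-average of |WSS|); the sketch below evaluates it for an invented wall shear stress time series, not data from the aneurysm models.

```python
import numpy as np

# Hypothetical WSS vector samples (Pa) over one cardiac cycle at a single wall point.
wss = np.array([[1.2, 0.1], [0.8, -0.3], [-0.4, 0.2], [1.0, 0.0], [-0.2, -0.1]])

mean_vector_mag = np.linalg.norm(wss.mean(axis=0))     # |mean WSS vector|
mean_magnitude = np.linalg.norm(wss, axis=1).mean()    # mean |WSS|
osi = 0.5 * (1.0 - mean_vector_mag / mean_magnitude)
print(f"OSI = {osi:.3f}")   # 0 for unidirectional WSS, up to 0.5 for fully oscillatory flow
```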
NASA Astrophysics Data System (ADS)
Sathya, K.; Dhamodharan, P.; Dhandapani, M.
2018-05-01
A molecular complex, 1H-benzo[d][1,2,3]triazol-3-ium-3,5-dinitrobenzoate (BTDB), was synthesized, crystallized and characterized by CHN analysis and 1H and 13C NMR spectral studies. The crystal is transparent in the entire visible region, as evidenced by its UV-Vis-NIR spectrum. TG/DTA analysis shows that BTDB is stable up to 150 °C. Single-crystal XRD analysis was carried out to ascertain the molecular structure; BTDB crystallizes in the monoclinic system with space group P21/n. Computational studies that include optimization of the molecular geometry, natural bond orbital (NBO) analysis, Mulliken population analysis and HOMO-LUMO analysis were performed using the Gaussian 09 software with the B3LYP method at the 6-311G(d,p) level. Hirshfeld surfaces and 2D fingerprint plots revealed that O⋯H, H⋯H and O⋯C interactions are the most prevalent. The first-order hyperpolarizability (β) of BTDB is 44 times greater than that of urea. The results show that BTDB may be used for various opto-electronic applications.
Computation-Guided Backbone Grafting of a Discontinuous Motif onto a Protein Scaffold
DOE Office of Scientific and Technical Information (OSTI.GOV)
Azoitei, Mihai L.; Correia, Bruno E.; Ban, Yih-En Andrew
2012-02-07
The manipulation of protein backbone structure to control interaction and function is a challenge for protein engineering. We integrated computational design with experimental selection for grafting the backbone and side chains of a two-segment HIV gp120 epitope, targeted by the cross-neutralizing antibody b12, onto an unrelated scaffold protein. The final scaffolds bound b12 with high specificity and with affinity similar to that of gp120, and crystallographic analysis of a scaffold bound to b12 revealed high structural mimicry of the gp120-b12 complex structure. The method can be generalized to design other functional proteins through backbone grafting.
NASA Technical Reports Server (NTRS)
Denning, Peter J.
1989-01-01
In November 1988 a worm program invaded several thousand UNIX-operated Sun workstations and VAX computers attached to the Research Internet, seriously disrupting service for several days but damaging no files. An analysis of the worm's decompiled code revealed a battery of attacks by a knowledgeable insider, and demonstrated a number of security weaknesses. The attack occurred in an open network, and little can be inferred about the vulnerabilities of closed networks used for critical operations. The attack showed that password protection procedures need review and strengthening. It showed that sets of mutually trusting computers need to be carefully controlled. Sharp public reaction crystallized into a demand for user awareness and accountability in a networked world.
NASA Astrophysics Data System (ADS)
Obbard, R. W.
2015-07-01
This comment addresses a statement made in "A review of air-ice chemical and physical interactions (AICI): liquids, quasi-liquids, and solids in snow" by Bartels-Rausch et al. (Atmos. Chem. Phys., 14, 1587-1633, doi:10.5194/acp-14-1587-2014, 2014). Here we rebut the assertion that X-ray computed microtomography of sea ice fails to reveal liquid brine inclusions by discussing the phases present at the analysis temperature.
Congenital lobar emphysema in a kitten.
Blonk, M; Van de Maele, I; Combes, A; Stablay, B; De Cock, H; Polis, I; Rybachuk, G; de Rooster, H
2017-11-01
A five-month-old ragdoll cat presented with severe respiratory signs, unresponsive to medical therapy. Hyperinflation of the right middle lung lobe was diagnosed with radiography and computed tomography. Lung lobectomy following a median sternotomy led to full recovery. Histopathological analysis revealed lobar emphysema and, based on the animal's age, congenital lobar emphysema was considered the most likely diagnosis. © 2017 British Small Animal Veterinary Association.
ERIC Educational Resources Information Center
Ramirez, Dulce M.
2017-01-01
The increasing use of online pedagogy in higher education has revealed a need to analyze factors contributing to student engagement in online courses. Throughout the past decade, social media has been a growing influence in higher education. This quantitative cross-sectional study examined the attitudes of students and faculty towards computer…
Wee, Leonard; Hackett, Sara Lyons; Jones, Andrew; Lim, Tee Sin; Harper, Christopher Stirling
2013-01-01
This study evaluated the agreement of fiducial marker localization between two modalities — an electronic portal imaging device (EPID) and cone‐beam computed tomography (CBCT) — using a low‐dose, half‐rotation scanning protocol. Twenty‐five prostate cancer patients with implanted fiducial markers were enrolled. Before each daily treatment, EPID and half‐rotation CBCT images were acquired. Translational shifts were computed for each modality and two marker‐matching algorithms, seed‐chamfer and grey‐value, were performed for each set of CBCT images. The localization offsets, and systematic and random errors from both modalities were computed. Localization performances for both modalities were compared using Bland‐Altman limits of agreement (LoA) analysis, Deming regression analysis, and Cohen's kappa inter‐rater analysis. The differences in the systematic and random errors between the modalities were within 0.2 mm in all directions. The LoA analysis revealed a 95% agreement limit of the modalities of 2 to 3.5 mm in any given translational direction. Deming regression analysis demonstrated that constant biases existed in the shifts computed by the modalities in the superior–inferior (SI) direction, but no significant proportional biases were identified in any direction. Cohen's kappa analysis showed good agreement between the modalities in prescribing translational corrections of the couch at 3 and 5 mm action levels. Images obtained from EPID and half‐rotation CBCT showed acceptable agreement for registration of fiducial markers. The seed‐chamfer algorithm for tracking of fiducial markers in CBCT datasets yielded better agreement than the grey‐value matching algorithm with EPID‐based registration. PACS numbers: 87.55.km, 87.55.Qr PMID:23835391
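The Bland-Altman limits-of-agreement calculation used to compare the two modalities can be sketched as follows; the paired shift values are invented placeholders, not the patient data from the study.

```python
import numpy as np

# Hypothetical paired translational shifts (mm) in one direction.
epid = np.array([1.2, -0.5, 0.8, 2.1, -1.0, 0.3, 1.5, -0.2])
cbct = np.array([1.0, -0.9, 1.1, 2.4, -1.3, 0.6, 1.1, 0.1])

diff = epid - cbct
bias = diff.mean()
sd = diff.std(ddof=1)
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd
print(f"bias = {bias:.2f} mm, 95% limits of agreement = [{loa_low:.2f}, {loa_high:.2f}] mm")
```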
BioSig3D: High Content Screening of Three-Dimensional Cell Culture Models
Bilgin, Cemal Cagatay; Fontenay, Gerald; Cheng, Qingsu; Chang, Hang; Han, Ju; Parvin, Bahram
2016-01-01
BioSig3D is a computational platform for high-content screening of three-dimensional (3D) cell culture models that are imaged in full 3D volume. It provides an end-to-end solution for designing high content screening assays, based on colony organization that is derived from segmentation of nuclei in each colony. BioSig3D also enables visualization of raw and processed 3D volumetric data for quality control, and integrates advanced bioinformatics analysis. The system consists of multiple computational and annotation modules that are coupled together with a strong use of controlled vocabularies to reduce ambiguities between different users. It is a web-based system that allows users to: design an experiment by defining experimental variables, upload a large set of volumetric images into the system, analyze and visualize the dataset, and either display computed indices as a heatmap, or phenotypic subtypes for heterogeneity analysis, or download computed indices for statistical analysis or integrative biology. BioSig3D has been used to profile baseline colony formations with two experiments: (i) morphogenesis of a panel of human mammary epithelial cell lines (HMEC), and (ii) heterogeneity in colony formation using an immortalized non-transformed cell line. These experiments reveal intrinsic growth properties of well-characterized cell lines that are routinely used for biological studies. BioSig3D is being released with seed datasets and video-based documentation. PMID:26978075
A computational model for epidural electrical stimulation of spinal sensorimotor circuits.
Capogrosso, Marco; Wenger, Nikolaus; Raspopovic, Stanisa; Musienko, Pavel; Beauparlant, Janine; Bassi Luciani, Lorenzo; Courtine, Grégoire; Micera, Silvestro
2013-12-04
Epidural electrical stimulation (EES) of lumbosacral segments can restore a range of movements after spinal cord injury. However, the mechanisms and neural structures through which EES facilitates movement execution remain unclear. Here, we designed a computational model and performed in vivo experiments to investigate the type of fibers, neurons, and circuits recruited in response to EES. We first developed a realistic finite element computer model of rat lumbosacral segments to identify the currents generated by EES. To evaluate the impact of these currents on sensorimotor circuits, we coupled this model with an anatomically realistic axon-cable model of motoneurons, interneurons, and myelinated afferent fibers for antagonistic ankle muscles. Comparisons between computer simulations and experiments revealed the ability of the model to predict EES-evoked motor responses over multiple intensities and locations. Analysis of the recruited neural structures revealed the lack of direct influence of EES on motoneurons and interneurons. Simulations and pharmacological experiments demonstrated that EES engages spinal circuits trans-synaptically through the recruitment of myelinated afferent fibers. The model also predicted the capacity of spatially distinct EES to modulate side-specific limb movements and, to a lesser extent, extension versus flexion. These predictions were confirmed during standing and walking enabled by EES in spinal rats. These combined results provide a mechanistic framework for the design of spinal neuroprosthetic systems to improve standing and walking after neurological disorders.
Dynamical analysis of the global business-cycle synchronization.
Lopes, António M; Tenreiro Machado, J A; Huffstot, John S; Mata, Maria Eugénia
2018-01-01
This paper reports the dynamical analysis of the business cycles of 12 (developed and developing) countries over the last 56 years by applying computational techniques used for tackling complex systems. They reveal long-term convergence and country-level interconnections because of close contagion effects caused by bilateral networking exposure. Interconnectivity determines the magnitude of cross-border impacts. Local features and shock propagation complexity also may be true engines for local configuration of cycles. The algorithmic modeling proves to represent a solid approach to study the complex dynamics involved in the world economies.
Computational Fluid Dynamics Analysis Success Stories of X-Plane Design to Flight Test
NASA Technical Reports Server (NTRS)
Cosentino, Gary B.
2008-01-01
Examples of the design and flight test of three true X-planes are described, particularly X-plane design techniques that relied heavily on computational fluid dynamics(CFD) analysis. Three examples are presented: the X-36 Tailless Fighter Agility Research Aircraft, the X-45A Unmanned Combat Air Vehicle, and the X-48B Blended Wing Body Demonstrator Aircraft. An overview is presented of the uses of CFD analysis, comparison and contrast with wind tunnel testing, and information derived from CFD analysis that directly related to successful flight test. Lessons learned on the proper and improper application of CFD analysis are presented. Highlights of the flight-test results of the three example X-planes are presented. This report discusses developing an aircraft shape from early concept and three-dimensional modeling through CFD analysis, wind tunnel testing, further refined CFD analysis, and, finally, flight. An overview of the areas in which CFD analysis does and does not perform well during this process is presented. How wind tunnel testing complements, calibrates, and verifies CFD analysis is discussed. Lessons learned revealing circumstances under which CFD analysis results can be misleading are given. Strengths and weaknesses of the various flow solvers, including panel methods, Euler, and Navier-Stokes techniques, are discussed.
MULTIVARIATE ANALYSIS OF DRINKING BEHAVIOUR IN A RURAL POPULATION
Mathrubootham, N.; Bashyam, V.S.P.; Shahjahan
1997-01-01
This study was carried out to find out the drinking pattern in a rural population, using multivariate techniques. 386 current users identified in a community were assessed with regard to their drinking behaviours using a structured interview. For purposes of the study the questions were condensed into 46 meaningful variables. In bivariate analysis, 14 variables, including dependent variables such as dependence, MAST & CAGE (measuring alcoholic status), Q.F. Index and troubled drinking, were found to be significant. Taking these variables, multivariate techniques such as ANOVA, correlation, regression analysis and factor analysis were carried out using both SPSS PC+ and an HCL Magnum mainframe computer with the FOCUS package and UNIX systems. Results revealed that a number of factors such as drinking style, duration of drinking, pattern of abuse, Q.F. Index and various problems influenced drinking, and some of them set up a vicious circle. Factor analysis revealed three main factors: abuse, dependence and social drinking. Dependence could be divided into low/moderate dependence. The implications and practical applications of these tests are also discussed. PMID:21584077
Selective updating of working memory content modulates meso-cortico-striatal activity.
Murty, Vishnu P; Sambataro, Fabio; Radulescu, Eugenia; Altamura, Mario; Iudicello, Jennifer; Zoltick, Bradley; Weinberger, Daniel R; Goldberg, Terry E; Mattay, Venkata S
2011-08-01
Accumulating evidence from non-human primates and computational modeling suggests that dopaminergic signals arising from the midbrain (substantia nigra/ventral tegmental area) mediate striatal gating of the prefrontal cortex during the selective updating of working memory. Using event-related functional magnetic resonance imaging, we explored the neural mechanisms underlying the selective updating of information stored in working memory. Participants were scanned during a novel working memory task that parses the neurophysiology underlying working memory maintenance, overwriting, and selective updating. Analyses revealed a functionally coupled network consisting of a midbrain region encompassing the substantia nigra/ventral tegmental area, caudate, and dorsolateral prefrontal cortex that was selectively engaged during working memory updating compared to the overwriting and maintenance of working memory content. Further analysis revealed differential midbrain-dorsolateral prefrontal interactions during selective updating between low-performing and high-performing individuals. These findings highlight the role of this meso-cortico-striatal circuitry during the selective updating of working memory in humans, which complements previous research in behavioral neuroscience and computational modeling. Published by Elsevier Inc.
You and me and the computer makes three: variations in exam room use of the electronic health record
Saleem, Jason J; Flanagan, Mindy E; Russ, Alissa L; McMullen, Carmit K; Elli, Leora; Russell, Scott A; Bennett, Katelyn J; Matthias, Marianne S; Rehman, Shakaib U; Schwartz, Mark D; Frankel, Richard M
2014-01-01
Challenges persist on how to effectively integrate the electronic health record (EHR) into patient visits and clinical workflow, while maintaining patient-centered care. Our goal was to identify variations in, barriers to, and facilitators of the use of the US Department of Veterans Affairs (VA) EHR in ambulatory care workflow in order better to understand how to integrate the EHR into clinical work. We observed and interviewed 20 ambulatory care providers across three geographically distinct VA medical centers. Analysis revealed several variations in, associated barriers to, and facilitators of EHR use corresponding to different units of analysis: computer interface, team coordination/workflow, and organizational. We discuss our findings in the context of different units of analysis and connect variations in EHR use to various barriers and facilitators. Findings from this study may help inform the design of the next generation of EHRs for the VA and other healthcare systems. PMID:24001517
Determining the accuracy of maximum likelihood parameter estimates with colored residuals
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.; Klein, Vladislav
1994-01-01
An important part of building high fidelity mathematical models based on measured data is calculating the accuracy associated with statistical estimates of the model parameters. Indeed, without some idea of the accuracy of parameter estimates, the estimates themselves have limited value. In this work, an expression based on theoretical analysis was developed to properly compute parameter accuracy measures for maximum likelihood estimates with colored residuals. This result is important because experience from the analysis of measured data reveals that the residuals from maximum likelihood estimation are almost always colored. The calculations involved can be appended to conventional maximum likelihood estimation algorithms. Simulated data runs were used to show that the parameter accuracy measures computed with this technique accurately reflect the quality of the parameter estimates from maximum likelihood estimation without the need for analysis of the output residuals in the frequency domain or heuristically determined multiplication factors. The result is general, although the application studied here is maximum likelihood estimation of aerodynamic model parameters from flight test data.
Winter, Mark R.; Liu, Mo; Monteleone, David; Melunis, Justin; Hershberg, Uri; Goderie, Susan K.; Temple, Sally; Cohen, Andrew R.
2015-01-01
Summary Time-lapse microscopy can capture patterns of development through multiple divisions for an entire clone of proliferating cells. Images are taken every few minutes over many days, generating data too vast to process completely by hand. Computational analysis of this data can benefit from occasional human guidance. Here we combine improved automated algorithms with minimized human validation to produce fully corrected segmentation, tracking, and lineaging results with dramatic reduction in effort. A web-based viewer provides access to data and results. The improved approach allows efficient analysis of large numbers of clones. Using this method, we studied populations of progenitor cells derived from the anterior and posterior embryonic mouse cerebral cortex, each growing in a standardized culture environment. Progenitors from the anterior cortex were smaller, less motile, and produced smaller clones compared to those from the posterior cortex, demonstrating cell-intrinsic differences that may contribute to the areal organization of the cerebral cortex. PMID:26344906
Najbauer, Eszter E.; Bazsó, Gábor; Apóstolo, Rui; Fausto, Rui; Biczysko, Malgorzata; Barone, Vincenzo; Tarczay, György
2018-01-01
The conformers of α-serine were investigated by matrix-isolation IR spectroscopy combined with NIR laser irradiation. This method, aided by 2D correlation analysis, enabled unambiguously grouping the spectral lines to individual conformers. On the basis of comparison of at least nine experimentally observed vibrational transitions of each conformer with empirically scaled (SQM) and anharmonic (GVPT2) computed IR spectra, 6 conformers were identified. In addition, the presence of at least one more conformer in an Ar matrix was proved, and a short-lived conformer with a half-life of (3.7±0.5)×10^3 s in a N2 matrix was generated by NIR irradiation. The analysis of the NIR laser-induced conversions revealed that the excitation of the stretching overtone of both the side-chain and the carboxylic OH groups can effectively promote conformational changes, but remarkably different paths were observed for the two kinds of excitations. PMID:26201050
Ginossar, Tamar
2008-01-01
The Internet provides a new modality for health communication by facilitating the creation of virtual communities. These communities have the potential to influence health behavior beyond traditional FTF support groups. This study utilized content analysis of 1,424 e-mail messages posted to 2 online cancer communities to examine uses of these groups. Findings revealed (a) similarities in the content of communication in the 2 virtual communities, (b) gender differences in participation, and (c) differences in utilization of these online groups between patients and family members. These results are discussed in light of the diverse uses of online cancer communities that they reveal, the role of family members in support seeking and provision, and gender communication styles in health computer-mediated communication.
Legal issues of computer imaging in plastic surgery: a primer.
Chávez, A E; Dagum, P; Koch, R J; Newman, J P
1997-11-01
Although plastic surgeons are increasingly incorporating computer imaging techniques into their practices, many fear the possibility of legally binding themselves to achieve surgical results identical to those reflected in computer images. Computer imaging allows surgeons to manipulate digital photographs of patients to project possible surgical outcomes. Some of the many benefits imaging techniques pose include improving doctor-patient communication, facilitating the education and training of residents, and reducing administrative and storage costs. Despite the many advantages computer imaging systems offer, however, surgeons understandably worry that imaging systems expose them to immense legal liability. The possible exploitation of computer imaging by novice surgeons as a marketing tool, coupled with the lack of consensus regarding the treatment of computer images, adds to the concern of surgeons. A careful analysis of the law, however, reveals that surgeons who use computer imaging carefully and conservatively, and adopt a few simple precautions, substantially reduce their vulnerability to legal claims. In particular, surgeons face possible claims of implied contract, failure to instruct, and malpractice from their use or failure to use computer imaging. Nevertheless, legal and practical obstacles frustrate each of those causes of actions. Moreover, surgeons who incorporate a few simple safeguards into their practice may further reduce their legal susceptibility.
Phlegmonous gastritis associated with group A streptococcal toxic shock syndrome.
Morimoto, Masaya; Tamura, Shinobu; Hayakawa, Takahiro; Yamanishi, Hirofumi; Nakamoto, Chiaki; Nakamoto, Hiromichi; Ikebe, Tadayoshi; Nakano, Yoshio; Fujimoto, Tokuzo
2014-01-01
Phlegmonous gastritis (PG) is a rare, acute, severe infectious disease of the gastric wall that is often fatal due to Streptococcus spp. A 77-year-old man with diabetes and a gastric ulcer was urgently admitted due to prolonged nausea and vomiting. Computed tomography revealed widespread diffuse thickening of the gastric wall, and PG was suspected. The patient expired less than 9 hours after admission despite intensive treatments. Later, an analysis of the blood and gastric juice revealed group A streptococcus (GAS) and virulence factors associated with toxic shock syndrome (TSS). We herein diagnosed a patient with an extremely aggressive course of PG caused by GAS TSS.
Sato, Mitsuo; Okachi, Shotaro; Fukihara, Jun; Shimoyama, Yoshie; Wakahara, Keiko; Sakakibara, Toshihiro; Hase, Tetsunari; Onishi, Yasuharu; Ogura, Yasuhiro; Maeda, Osamu; Hasegawa, Yoshinori
2018-05-15
We herein report a case of lung metastases with unusual radiological appearances that mimicked those of chronic airway infection, causing diagnostic difficulty. A 60-year-old woman who underwent liver transplantation from a living donor was incidentally diagnosed with bile duct adenocarcinoma after a histopathological analysis of her explanted liver. Six months later, chest computed tomography (CT) revealed bilateral bronchogenic dissemination that had gradually worsened, suggesting chronic airway infection. A biopsy with bronchoscopy from a mass lesion beyond a segmental bronchus revealed adenocarcinoma identical to that of her bile duct adenocarcinoma, leading to the diagnosis of multiple lung metastases from bile duct adenocarcinoma.
Adversarial risk analysis with incomplete information: a level-k approach.
Rothschild, Casey; McLay, Laura; Guikema, Seth
2012-07-01
This article proposes, develops, and illustrates the application of level-k game theory to adversarial risk analysis. Level-k reasoning, which assumes that players play strategically but have bounded rationality, is useful for operationalizing a Bayesian approach to adversarial risk analysis. It can be applied in a broad class of settings, including settings with asynchronous play and partial but incomplete revelation of early moves. Its computational and elicitation requirements are modest. We illustrate the approach with an application to a simple defend-attack model in which the defender's countermeasures are revealed with a probability less than one to the attacker before he decides on how or whether to attack. © 2011 Society for Risk Analysis.
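A minimal sketch of level-k reasoning in a toy defend-attack game is given below: a level-0 player acts non-strategically (uniformly at random), and a level-k player best-responds to a level-(k-1) opponent. The payoff matrices are illustrative assumptions, not a model from the article.

```python
import numpy as np

# Rows: defender actions, columns: attacker actions (hypothetical payoffs).
defender_payoff = np.array([[ 0, -4],
                            [-1, -1]])
attacker_payoff = np.array([[ 0,  3],
                            [ 1, -2]])

def level_k_defender(k):
    if k == 0:
        return np.ones(2) / 2                     # non-strategic level-0 play
    attacker = level_k_attacker(k - 1)
    expected = defender_payoff @ attacker         # expected payoff per defender action
    return np.eye(2)[np.argmax(expected)]

def level_k_attacker(k):
    if k == 0:
        return np.ones(2) / 2
    defender = level_k_defender(k - 1)
    expected = defender @ attacker_payoff         # expected payoff per attacker action
    return np.eye(2)[np.argmax(expected)]

for k in range(3):
    print(f"level-{k}: defender={level_k_defender(k)}, attacker={level_k_attacker(k)}")
```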
Fundamental analysis of the failure of polymer-based fiber reinforced composites
NASA Technical Reports Server (NTRS)
Kanninen, M. F.; Rybicki, E. F.; Griffith, W. I.; Broek, D.
1976-01-01
A mathematical model is described which will permit predictions of the strength of fiber reinforced composites containing known flaws to be made from the basic properties of their constituents. The approach was to embed a local heterogeneous region (LHR) surrounding the crack tip into an anisotropic elastic continuum. The model should (1) permit an explicit analysis of the micromechanical processes involved in the fracture process, and (2) remain simple enough to be useful in practical computations. Computations for arbitrary flaw size and orientation under arbitrary applied load combinations were performed for unidirectional composites with linear elastic-brittle constituent behavior. The mechanical properties were nominally those of graphite epoxy. With the rupture properties arbitrarily varied to test the capability of the model to reflect real fracture modes in fiber composites, it was shown that fiber breakage, matrix crazing, crack bridging, matrix-fiber debonding, and axial splitting can all occur during a period of (gradually) increasing load prior to catastrophic fracture. The computations reveal qualitatively the sequential nature of the stable crack process that precedes fracture.
On the elastic–plastic decomposition of crystal deformation at the atomic scale
Stukowski, Alexander; Arsenlis, A.
2012-03-02
Given two snapshots of an atomistic system, taken at different stages of the deformation process, one can compute the incremental deformation gradient field, F, as defined by continuum mechanics theory, from the displacements of atoms. However, such a kinematic analysis of the total deformation does not reveal the respective contributions of elastic and plastic deformation. We develop a practical technique to perform the multiplicative decomposition of the deformation field, F = F^e F^p, into elastic and plastic parts for the case of crystalline materials. The described computational analysis method can be used to quantify plastic deformation in a material due to crystal slip-based mechanisms in molecular dynamics and molecular statics simulations. The knowledge of the plastic deformation field, F^p, and its variation with time can provide insight into the number, motion and localization of relevant crystal defects such as dislocations. As a result, the computed elastic field, F^e, provides information about inhomogeneous lattice strains and lattice rotations induced by the presence of defects.
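The multiplicative split itself is a small linear-algebra operation: given the total deformation gradient F and an elastic part F^e (for example, recovered from the local lattice orientation and strain), the plastic part follows as F^p = (F^e)^-1 F. The matrices in the sketch below are illustrative, not output of the described atomistic method.

```python
import numpy as np

F = np.array([[1.05, 0.02, 0.0],
              [0.00, 0.98, 0.0],
              [0.00, 0.00, 1.0]])          # total deformation gradient (hypothetical)
F_e = np.array([[1.01, 0.00, 0.0],
                [0.00, 0.99, 0.0],
                [0.00, 0.00, 1.0]])        # elastic (lattice) part (hypothetical)

F_p = np.linalg.inv(F_e) @ F               # plastic part from the multiplicative split
print(F_p)
print(np.allclose(F_e @ F_p, F))           # verifies F = F_e F_p
```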
Eyben, Florian; Weninger, Felix; Lehment, Nicolas; Schuller, Björn; Rigoll, Gerhard
2013-01-01
Without doubt general video and sound, as found in large multimedia archives, carry emotional information. Thus, audio and video retrieval by certain emotional categories or dimensions could play a central role for tomorrow's intelligent systems, enabling search for movies with a particular mood, computer aided scene and sound design in order to elicit certain emotions in the audience, etc. Yet, the lion's share of research in affective computing is exclusively focusing on signals conveyed by humans, such as affective speech. Uniting the fields of multimedia retrieval and affective computing is believed to lend to a multiplicity of interesting retrieval applications, and at the same time to benefit affective computing research, by moving its methodology "out of the lab" to real-world, diverse data. In this contribution, we address the problem of finding "disturbing" scenes in movies, a scenario that is highly relevant for computer-aided parental guidance. We apply large-scale segmental feature extraction combined with audio-visual classification to the particular task of detecting violence. Our system performs fully data-driven analysis including automatic segmentation. We evaluate the system in terms of mean average precision (MAP) on the official data set of the MediaEval 2012 evaluation campaign's Affect Task, which consists of 18 original Hollywood movies, achieving up to .398 MAP on unseen test data in full realism. An in-depth analysis of the worth of individual features with respect to the target class and the system errors is carried out and reveals the importance of peak-related audio feature extraction and low-level histogram-based video analysis.
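The mean average precision metric used for evaluation can be sketched as follows: average precision is computed per query from a ranked list of binary relevance labels and then averaged. The relevance labels below are invented, not results from the MediaEval data set.

```python
import numpy as np

def average_precision(ranked_relevance):
    """Average precision for one ranked list of binary relevance labels."""
    rel = np.asarray(ranked_relevance, dtype=float)
    if rel.sum() == 0:
        return 0.0
    precision_at_k = np.cumsum(rel) / (np.arange(len(rel)) + 1)
    return float((precision_at_k * rel).sum() / rel.sum())

queries = [[1, 0, 1, 1, 0, 0],    # hypothetical ranked hits for "violent" segments
           [0, 1, 0, 0, 1, 1]]
print("MAP =", np.mean([average_precision(q) for q in queries]))
```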
Denisova, Galina F; Denisov, Dimitri A; Yeung, Jeffrey; Loeb, Mark B; Diamond, Michael S; Bramson, Jonathan L
2008-11-01
Understanding antibody function is often enhanced by knowledge of the specific binding epitope. Here, we describe a computer algorithm that permits epitope prediction based on a collection of random peptide epitopes (mimotopes) isolated by antibody affinity purification. We applied this methodology to the prediction of epitopes for five monoclonal antibodies against the West Nile virus (WNV) E protein, two of which exhibit therapeutic activity in vivo. This strategy was validated by comparison of our results with existing F(ab)-E protein crystal structures and mutational analysis by yeast surface display. We demonstrate that by combining the results of the mimotope method with our data from mutational analysis, epitopes could be predicted with greater certainty. The two methods displayed great complementarity as the mutational analysis facilitated epitope prediction when the results with the mimotope method were equivocal and the mimotope method revealed a broader number of residues within the epitope than the mutational analysis. Our results demonstrate that the combination of these two prediction strategies provides a robust platform for epitope characterization.
Acute hind limb paralysis secondary to an extradural spinal cord Cryptococcus gattii lesion in a dog
Kurach, Lindsey; Wojnarowicz, Chris; Wilkinson, Tom; Sereda, Colin
2013-01-01
A 2-year-old, spayed female, German short-haired pointer was presented with a 1-day history of non-ambulatory paraplegia with absent deep pain perception. A computed tomography scan revealed an irregular eighth thoracic vertebral body and an extradural compressive lesion. Decompression was performed and abnormal tissues were submitted for analysis. Findings were consistent with a Cryptococcus gattii infection. PMID:24155428
A Study of the Utilization Patterns of an Elementary School-Based Health Clinic over a 5-Year Period
ERIC Educational Resources Information Center
Johnson, Veda; Hutcherson, Valerie
2006-01-01
The purpose of this study was to determine the utilization pattern of an elementary school-based clinic over a 5-year period. It involved a retrospective analysis of computer-based data for all patient visits during this study period. Results revealed high clinic utilization with an average of over 5 encounters for all users each year. The most…
Infrasound Signals from Ground-Motion Sources
2008-09-01
signals as a basis for discriminants between underground nuclear tests (UGT) and earthquakes (EQ). In an earlier program, infrasound signals from... UGTs and EQs were collected at ranges of a few hundred kilometers, in the far-field. Analysis of these data revealed two parameters that had potential...well. To study the near-field signals, we are using computational techniques based on modeled ground motions from UGTs and EQs. One is the closed
Abraham, Jose P; Sajan, D; Joe, I Hubert; Jayakumar, V S
2008-11-15
The infrared absorption, Raman spectra and SERS spectra of p-amino acetanilide have been analyzed with the aid of density functional theory calculations at the B3LYP/6-311G(d,p) level. The electric dipole moment (mu) and the first hyperpolarizability (beta) values of the investigated molecule have been computed using ab initio quantum mechanical calculations. The calculation results also show that the synthesized molecule might have microscopic nonlinear optical (NLO) behavior with non-zero values. Computed geometries reveal that the PAA molecule is planar, while the secondary amide group is found to be twisted with respect to the phenyl ring upon hydrogen bonding. The hyperconjugation of the C=O group with the adjacent C-C bond and the donor-acceptor interaction associated with the secondary amide have been investigated using the computed geometry. The carbonyl stretching band position is found to be influenced by the tendency of the phenyl ring to withdraw the nitrogen lone pair, intermolecular hydrogen bonding, conjugation and hyperconjugation. The existence of intramolecular C=O...H hydrogen bonds has been investigated by means of the natural bonding orbital (NBO) analysis. The influence of the decrease of N-H and C=O bond orders and the increase of C-N bond orders due to donor-acceptor interaction has been identified in the vibrational spectra. The SERS spectral analysis reveals that the large enhancement of in-plane bending, out-of-plane bending and ring breathing modes in the surface-enhanced Raman scattering spectrum indicates that the molecule is adsorbed on the silver surface in an 'at least vertical' configuration, with the ring perpendicular to the silver surface.
Bryan, Rebecca; Nair, Prasanth B; Taylor, Mark
2009-09-18
Interpatient variability is often overlooked in orthopaedic computational studies due to the substantial challenges involved in sourcing and generating large numbers of bone models. A statistical model of the whole femur incorporating both geometric and material property variation was developed as a potential solution to this problem. The statistical model was constructed using principal component analysis, applied to 21 individual computed tomography scans. To test the ability of the statistical model to generate realistic, unique, finite element (FE) femur models, it was used as a source of 1000 femurs to drive a study on femoral neck fracture risk. The study simulated the impact of an oblique fall to the side, a scenario known to account for a large proportion of hip fractures in the elderly and to have a lower fracture load than alternative loading approaches. FE model generation, application of subject-specific loading and boundary conditions, FE processing and post-processing of the solutions were completed automatically. The generated models were within the bounds of the training data used to create the statistical model and had a high mesh quality, able to be used directly by the FE solver without remeshing. The results indicated that 28 of the 1000 femurs were at highest risk of fracture. Closer analysis revealed the percentage of cortical bone in the proximal femur to be a crucial differentiator between the failed and non-failed groups. The likely fracture location was indicated to be intertrochanteric. Comparison to previous computational, clinical and experimental work revealed support for these findings.
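To make the statistical-model idea above concrete, here is a minimal sketch of a PCA shape-and-property model: each training instance is assumed to be a fixed-length vector of corresponding geometric and material values (a simplification of the paper's actual femur parameterisation), and new femurs are generated by perturbing the mean along the principal modes while staying within the bounds of the training data. All data, dimensions and limits are toy values, not the study's.

```python
import numpy as np

# Minimal sketch of a PCA statistical model built from training shape vectors.
# Each row of `training` is one femur: concatenated nodal coordinates and
# material values sampled at corresponding locations (hypothetical layout).
rng = np.random.default_rng(0)
training = rng.normal(size=(21, 300))          # 21 CT-derived femurs (toy data)

mean_shape = training.mean(axis=0)
centered = training - mean_shape

# PCA via SVD of the centred data matrix.
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
eigenvalues = s**2 / (len(training) - 1)        # variance captured by each mode
modes = Vt                                      # principal components (rows)

def sample_femur(n_modes=5, limit=3.0):
    """Generate one new femur by perturbing the mean along the main modes.

    Mode weights are drawn from N(0, eigenvalue) and clipped to +/- `limit`
    standard deviations so generated instances stay near the training bounds.
    """
    sd = np.sqrt(eigenvalues[:n_modes])
    b = rng.normal(0.0, sd)
    b = np.clip(b, -limit * sd, limit * sd)
    return mean_shape + b @ modes[:n_modes]

new_instances = np.array([sample_femur() for _ in range(1000)])
print(new_instances.shape)                      # (1000, 300)
```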
REVEAL: An Extensible Reduced Order Model Builder for Simulation and Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agarwal, Khushbu; Sharma, Poorva; Ma, Jinliang
2013-04-30
Many science domains need to build computationally efficient and accurate representations of high fidelity, computationally expensive simulations. These computationally efficient versions are known as reduced-order models. This paper presents the design and implementation of a novel reduced-order model (ROM) builder, the REVEAL toolset. This toolset generates ROMs based on science- and engineering-domain specific simulations executed on high performance computing (HPC) platforms. The toolset encompasses a range of sampling and regression methods that can be used to generate a ROM, automatically quantifies the ROM accuracy, and provides support for an iterative approach to improve ROM accuracy. REVEAL is designed to be extensible in order to utilize the core functionality with any simulator that has published input and output formats. It also defines programmatic interfaces to include new sampling and regression techniques so that users can 'mix and match' mathematical techniques to best suit the characteristics of their model. In this paper, we describe the architecture of REVEAL and demonstrate its usage with a computational fluid dynamics model used in carbon capture.
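The ROM-building loop that REVEAL automates can be illustrated with a hedged sketch: sample the simulator's input space, run the expensive model at the sampled points, fit a cheap regression surrogate, and quantify its accuracy on held-out points. The `expensive_simulation` function, the quadratic feature set and all numbers below are stand-ins, not REVEAL's actual interfaces or methods.

```python
import numpy as np

def expensive_simulation(x):
    """Stand-in for one high-fidelity simulator run (hypothetical physics)."""
    return np.sin(3.0 * x[0]) + 0.5 * x[1] ** 2 + 0.1 * x[0] * x[1]

# 1. Sample the input space (simple Latin-hypercube-style stratified draws).
rng = np.random.default_rng(1)
n_train, n_dim = 40, 2
samples = (np.argsort(rng.random((n_train, n_dim)), axis=0)
           + rng.random((n_train, n_dim))) / n_train

# 2. Run the expensive model at each sample point.
y = np.array([expensive_simulation(x) for x in samples])

# 3. Fit a cheap regression surrogate (quadratic polynomial via least squares).
def features(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

coef, *_ = np.linalg.lstsq(features(samples), y, rcond=None)

def rom(x):
    """Cheap reduced-order surrogate of the expensive simulation."""
    return features(np.atleast_2d(x)) @ coef

# 4. Quantify ROM accuracy on held-out test points.
test = rng.random((200, n_dim))
err = np.array([expensive_simulation(x) for x in test]) - rom(test)
print("RMS error of surrogate:", np.sqrt(np.mean(err ** 2)))
```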
NASA Astrophysics Data System (ADS)
Shenoy Handiru, Vikram; Vinod, A. P.; Guan, Cuntai
2017-08-01
Objective. In electroencephalography (EEG)-based brain-computer interface (BCI) systems for motor control tasks, the conventional practice is to decode motor intentions using scalp EEG. However, scalp EEG reveals only limited information about the complex tasks of movement with a higher degree of freedom. Therefore, our objective is to investigate the effectiveness of source-space EEG in extracting relevant features that discriminate arm movement in multiple directions. Approach. We have proposed a novel feature extraction algorithm based on supervised factor analysis that models the data from source-space EEG. To this end, we computed the features from the source dipoles confined to the Brodmann areas of interest (BA4a, BA4p and BA6). Further, we embedded class-wise labels of multi-direction (multi-class) source-space EEG into an unsupervised factor analysis to turn it into a supervised learning method. Main results. Our approach provided an average decoding accuracy of 71% for the classification of hand movement in four orthogonal directions, which is significantly higher (>10%) than the classification accuracy obtained using state-of-the-art spatial pattern features in sensor space. Also, the group analysis of the spectral characteristics of source-space EEG indicates that the slow cortical potentials from a set of cortical source dipoles reveal discriminative information regarding the movement parameter, direction. Significance. This study presents evidence that low-frequency components in the source space play an important role in movement kinematics, and thus may lead to new strategies for BCI-based neurorehabilitation.
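As a rough illustration of this kind of decoding pipeline (not the authors' supervised factor-analysis variant), the sketch below compresses toy "source-space" features with an ordinary factor analysis and classifies four movement directions with linear discriminant analysis; the data, dimensions and resulting accuracy are entirely synthetic.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Toy source-space "EEG": 200 trials x 60 dipole features, 4 movement directions.
rng = np.random.default_rng(2)
X = rng.normal(size=(200, 60))
y = rng.integers(0, 4, size=200)
X += 0.8 * np.eye(4)[y] @ rng.normal(size=(4, 60))   # inject class-dependent structure

# Factor analysis compresses the dipole activity to a few latent factors,
# which are then classified.  Note this is an ordinary unsupervised FA,
# not the supervised variant described in the abstract.
clf = make_pipeline(FactorAnalysis(n_components=8, random_state=0),
                    LinearDiscriminantAnalysis())
print("4-class cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```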
Anastasio, Thomas J
2013-01-01
Fear conditioning, in which a cue is conditioned to elicit a fear response, and extinction, in which a previously conditioned cue no longer elicits a fear response, depend on neural plasticity occurring within the amygdala. Projection neurons in the basolateral amygdala (BLA) learn to respond to the cue during fear conditioning, and they mediate fear responding by transferring cue signals to the output stage of the amygdala. Some BLA projection neurons retain their cue responses after extinction. Recent work shows that activation of the endocannabinoid system is necessary for extinction, and it leads to long-term depression (LTD) of the GABAergic synapses that inhibitory interneurons make onto BLA projection neurons. Such GABAergic LTD would enhance the responses of the BLA projection neurons that mediate fear responding, so it would seem to oppose, rather than promote, extinction. To address this paradox, a computational analysis of two well-known conceptual models of amygdaloid plasticity was undertaken. The analysis employed exhaustive state-space search conducted within a declarative programming environment. The analysis reveals that GABAergic LTD actually increases the number of synaptic strength configurations that achieve extinction while preserving the cue responses of some BLA projection neurons in both models. The results suggest that GABAergic LTD helps the amygdala retain cue memory during extinction even as the amygdala learns to suppress the previously conditioned response. The analysis also reveals which features of both models are essential for their ability to achieve extinction with some cue memory preservation, and suggests experimental tests of those features.
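The exhaustive state-space search can be illustrated schematically: discretise a handful of synaptic strengths, enumerate every configuration, and keep those that produce a fear response after conditioning, no fear response after extinction, and a preserved cue response in the projection neuron. The toy circuit, strength levels and thresholds below are illustrative only; they are not the published models nor the authors' declarative-programming implementation.

```python
import itertools

# Schematic exhaustive state-space search over discretised synaptic strengths.
LEVELS = [0.0, 0.5, 1.0]   # allowed synaptic strengths (toy discretisation)

def responses(w_cue_pn, w_pn_out, w_extinction_inh):
    """Cue response of the BLA projection neuron and downstream fear output."""
    pn_cue_response = w_cue_pn
    fear_out = max(0.0, pn_cue_response * w_pn_out - w_extinction_inh)
    return pn_cue_response, fear_out

admissible = []
for w_cue_pn, w_pn_out, w_inh in itertools.product(LEVELS, repeat=3):
    _, fear_conditioned = responses(w_cue_pn, w_pn_out, 0.0)          # before extinction
    pn_cue, fear_extinguished = responses(w_cue_pn, w_pn_out, w_inh)  # after extinction
    # Constraints: fear after conditioning, no fear after extinction,
    # and a preserved cue response in the projection neuron.
    if fear_conditioned > 0.5 and fear_extinguished <= 0.25 and pn_cue > 0.5:
        admissible.append((w_cue_pn, w_pn_out, w_inh))

print(len(admissible), "admissible strength configurations, e.g.", admissible[:3])
```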
Ungprasert, Patompong; Wilton, Katelynn M; Ernste, Floranne C; Kalra, Sanjay; Crowson, Cynthia S; Rajagopalan, Srinivasan; Bartholmai, Brian J
2017-10-01
To evaluate the correlation between measurements from quantitative thoracic high-resolution CT (HRCT) analysis with "Computer-Aided Lung Informatics for Pathology Evaluation and Rating" (CALIPER) software and measurements from pulmonary function tests (PFTs) in patients with idiopathic inflammatory myopathies (IIM)-associated interstitial lung disease (ILD). A cohort of patients with IIM-associated ILD seen at Mayo Clinic was identified from medical record review. Retrospective analysis of HRCT data and PFTs at baseline and 1 year was performed. The abnormalities in HRCT were quantified using CALIPER software. A total of 110 patients were identified. At baseline, total interstitial abnormalities as measured by CALIPER, both by absolute volume and by percentage of total lung volume, had a significant negative correlation with diffusing capacity for carbon monoxide (DLCO), total lung capacity (TLC), and oxygen saturation. Analysis by subtype of interstitial abnormality revealed significant negative correlations between ground glass opacities (GGO) and reticular density (RD) with DLCO and TLC. At one year, changes of total interstitial abnormalities compared with baseline had a significant negative correlation with changes of TLC and oxygen saturation. A negative correlation between changes of total interstitial abnormalities and DLCO was also observed, but it was not statistically significant. Analysis by subtype of interstitial abnormality revealed negative correlations between changes of GGO and RD and changes of DLCO, TLC, and oxygen saturation, but most of the correlations did not achieve statistical significance. CALIPER measurements correlate well with functional measurements in patients with IIM-associated ILD.
Fundamental analysis of the failure of polymer-based fiber reinforced composites
NASA Technical Reports Server (NTRS)
Kanninen, M. F.; Rybicki, E. F.; Griffith, W. I.; Broek, D.
1975-01-01
A mathematical model predicting the strength of unidirectional fiber reinforced composites containing known flaws and with linear elastic-brittle material behavior was developed. The approach was to embed a local heterogeneous region surrounding the crack tip into an anisotropic elastic continuum. This (1) permits an explicit analysis of the micromechanical processes involved in the fracture, and (2) remains simple enough to be useful in practical computations. Computations for arbitrary flaw size and orientation under arbitrary applied loads were performed. The mechanical properties were those of graphite epoxy. With the rupture properties arbitrarily varied to test the capabilities of the model to reflect real fracture modes, it was shown that fiber breakage, matrix crazing, crack bridging, matrix-fiber debonding, and axial splitting can all occur during a period of (gradually) increasing load prior to catastrophic failure. The calculations also reveal the sequential nature of the stable crack growth process preceding fracture.
Extraction and Analysis of Display Data
NASA Technical Reports Server (NTRS)
Land, Chris; Moye, Kathryn
2008-01-01
The Display Audit Suite is an integrated package of software tools that partly automates the detection of Portable Computer System (PCS) Display errors. [PCS is a laptop computer used onboard the International Space Station (ISS).] The need for automation stems from the large quantity of PCS displays (6,000+, with 1,000,000+ lines of command and telemetry data). The Display Audit Suite includes data-extraction tools, automatic error detection tools, and database tools for generating analysis spreadsheets. These spreadsheets allow engineers to more easily identify many different kinds of possible errors. The Suite supports over 40 independent analyses and complements formal testing by being comprehensive (all displays can be checked) and by revealing errors that are difficult to detect via test. In addition, the Suite can be run early in the development cycle to find and correct errors in advance of testing.
Computer analysis of lighting style in fine art: steps towards inter-artist studies
NASA Astrophysics Data System (ADS)
Stork, David G.
2011-03-01
Stylometry in visual art, the mathematical description of artists' styles, has been based on a number of properties of works, such as color, brush stroke shape, visual texture, and measures of contours' curvatures. We introduce the concept of quantitative measures of lighting, such as statistical descriptions of spatial coherence, diffuseness, and so forth, as properties of artistic style. Some artists of the high Renaissance, such as Leonardo, worked from nature and strove to render illumination "faithfully"; photorealists, such as Richard Estes, worked from photographs and duplicated the "physics based" lighting accurately. As such, each had different motivations, methodologies, stagings, and "accuracies" in rendering lighting clues. Perceptual studies show that observers are poor judges of properties of lighting in photographs, such as consistency (and thus, by extension, in paintings as well); computer methods such as rigorous cast-shadow analysis, occluding-contour analysis and spherical-harmonic-based estimation of light fields can be quite accurate. For these reasons, computer lighting analysis can provide new tools for art historical studies. We review lighting analysis in paintings such as Vermeer's Girl with a pearl earring, de la Tour's Christ in the carpenter's studio, Caravaggio's Magdalen with the smoking flame and Calling of St. Matthew, and extend our corpus to works where lighting coherence is of interest to art historians, such as Caravaggio's Adoration of the Shepherds or Nativity (1609) in the Capuchin church of Santa Maria degli Angeli. Our measure of lighting coherence may help reveal the working methods of some artists and aid in diachronic studies of individual artists. We speculate on artists and art historical questions that may ultimately profit from future refinements to these new computational tools.
Testing alternative ground water models using cross-validation and other methods
Foglia, L.; Mehl, S.W.; Hill, M.C.; Perona, P.; Burlando, P.
2007-01-01
Many methods can be used to test alternative ground water models. Of concern in this work are methods able to (1) rank alternative models (also called model discrimination) and (2) identify observations important to parameter estimates and predictions (equivalent to the purpose served by some types of sensitivity analysis). Some of the measures investigated are computationally efficient; others are computationally demanding. The latter are generally needed to account for model nonlinearity. The efficient model discrimination methods investigated include the information criteria: the corrected Akaike information criterion, Bayesian information criterion, and generalized cross-validation. The efficient sensitivity analysis measures used are dimensionless scaled sensitivity (DSS), composite scaled sensitivity, and parameter correlation coefficient (PCC); the other statistics are DFBETAS, Cook's D, and observation-prediction statistic. Acronyms are explained in the introduction. Cross-validation (CV) is a computationally intensive nonlinear method that is used for both model discrimination and sensitivity analysis. The methods are tested using up to five alternative parsimoniously constructed models of the ground water system of the Maggia Valley in southern Switzerland. The alternative models differ in their representation of hydraulic conductivity. A new method for graphically representing CV and sensitivity analysis results for complex models is presented and used to evaluate the utility of the efficient statistics. The results indicate that for model selection, the information criteria produce similar results at much smaller computational cost than CV. For identifying important observations, the only obviously inferior linear measure is DSS; the poor performance was expected because DSS does not include the effects of parameter correlation and PCC reveals large parameter correlations. © 2007 National Ground Water Association.
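For the information criteria mentioned above, a small sketch of how AIC, AICc and BIC might be computed from a least-squares fit (assuming Gaussian errors, the usual convention) and used to rank alternative models. The sum-of-squares values and parameter counts are invented for illustration and are not taken from the Maggia Valley study.

```python
import numpy as np

def information_criteria(sse, n_obs, n_params):
    """AIC, AICc and BIC for a least-squares fit with Gaussian errors.

    sse      : sum of squared (weighted) residuals
    n_obs    : number of observations
    n_params : number of estimated model parameters
    """
    k = n_params + 1                        # +1 for the error variance
    aic = n_obs * np.log(sse / n_obs) + 2 * k
    aicc = aic + 2 * k * (k + 1) / (n_obs - k - 1)
    bic = n_obs * np.log(sse / n_obs) + k * np.log(n_obs)
    return aic, aicc, bic

# Rank three alternative conductivity models of the same 50 observations (toy numbers).
models = {"uniform K": (12.4, 2), "two-zone K": (9.1, 3), "five-zone K": (8.7, 6)}
for name, (sse, n_params) in models.items():
    aic, aicc, bic = information_criteria(sse, n_obs=50, n_params=n_params)
    print(f"{name:12s}  AIC={aic:7.2f}  AICc={aicc:7.2f}  BIC={bic:7.2f}")
```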
How Do Students Misunderstand Number Representations?
ERIC Educational Resources Information Center
Herman, Geoffrey L.; Zilles, Craig; Loui, Michael C.
2011-01-01
We used both student interviews and diagnostic testing to reveal students' misconceptions about number representations in computing systems. This article reveals that students who have passed an undergraduate level computer organization course still possess surprising misconceptions about positional notations, two's complement representation, and…
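As a reminder of the representation at issue, a short illustrative snippet: the same 8-bit pattern reads differently as an unsigned positional value and as a two's complement signed value, which is exactly the kind of distinction the misconceptions above concern. The helper functions are hypothetical teaching aids, not part of the cited study.

```python
# Two's complement of an 8-bit value: the same bit pattern maps to different
# values depending on whether it is read as unsigned or as signed.
def to_twos_complement(value, bits=8):
    """Render `value` as a `bits`-wide two's complement bit string."""
    return format(value & ((1 << bits) - 1), f"0{bits}b")

def from_twos_complement(bit_string):
    """Interpret a bit string as a two's complement signed integer."""
    bits = len(bit_string)
    value = int(bit_string, 2)
    return value - (1 << bits) if bit_string[0] == "1" else value

pattern = to_twos_complement(-42)                      # '11010110'
print(pattern,
      "as signed:", from_twos_complement(pattern),     # -42
      "as unsigned:", int(pattern, 2))                 # 214
```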
Sabti, Ahmed Abdulateef; Chaichan, Rasha Sami
2014-01-01
This study examines the attitudes of Saudi Arabian high school students toward the use of computer technologies in learning English. The study also discusses the possible barriers that affect and limit the actual usage of computers. A quantitative approach is applied in this research, which involved 30 Saudi Arabian students at a high school in Kuala Lumpur, Malaysia. The respondents comprised 15 males and 15 females with ages between 16 and 18 years. Two instruments, namely the Scale of Attitude toward Computer Technologies (SACT) and Barriers affecting Students' Attitudes and Use (BSAU), were used to collect data. The Technology Acceptance Model (TAM) of Davis (1989) was utilized. The analysis of the study revealed gender differences in attitudes toward the use of computer technologies in learning English. Female students showed more positive attitudes towards the use of computer technologies in learning English than males. Both male and female participants demonstrated high and positive perceptions of the perceived Usefulness and perceived Ease of Use of computer technologies in learning English. Three barriers that affected and limited the use of computer technologies in learning English were identified by the participants. These barriers are skill, equipment, and motivation. Among these barriers, skill had the highest effect, whereas motivation showed the least effect.
A review of evaluative studies of computer-based learning in nursing education.
Lewis, M J; Davies, R; Jenkins, D; Tait, M I
2001-01-01
Although there have been numerous attempts to evaluate the learning benefits of computer-based learning (CBL) packages in nursing education, the results obtained have been equivocal. A literature search conducted for this review found 25 reports of the evaluation of nursing CBL packages since 1966. Detailed analysis of the evaluation methods used in these reports revealed that most had significant design flaws, including the use of too small a sample group, the lack of a control group, etc. Because of this, the conclusions reached were not always valid. More effort is required in the design of future evaluation studies of nursing CBL packages. Copyright 2001 Harcourt Publishers Ltd.
Miljković, Filip; Kunimoto, Ryo; Bajorath, Jürgen
2017-08-01
Computational exploration of small-molecule-based relationships between target proteins from different families. Target annotations of drugs and other bioactive compounds were systematically analyzed on the basis of high-confidence activity data. A total of 286 novel chemical links were established between distantly related or unrelated target proteins. These relationships involved a total of 1859 bioactive compounds including 147 drugs and 141 targets. Computational analysis of large amounts of compounds and activity data has revealed unexpected relationships between diverse target proteins on the basis of compounds they share. These relationships are relevant for drug discovery efforts. Target pairs that we have identified and associated compound information are made freely available.
Revision by means of computer-mediated peer discussions
NASA Astrophysics Data System (ADS)
Soong, Benson; Mercer, Neil; Er, Siew Shin
2010-05-01
In this article, we provide a discussion on our revision method (termed prescriptive tutoring) aimed at revealing students' misconceptions and misunderstandings by getting them to solve physics problems with an anonymous partner via the computer. It is currently being implemented and evaluated in a public secondary school in Singapore, and statistical analysis of our initial small-scale study shows that students in the experimental group significantly outperformed students in both the control and alternative intervention groups. In addition, students in the experimental group perceived that they had gained improved understanding of the physics concepts covered during the intervention, and reported that they would like to continue revising physics concepts using the intervention methods.
Conjugate Heat Transfer Analyses on the Manifold for Ramjet Fuel Injectors
NASA Technical Reports Server (NTRS)
Wang, Xiao-Yen J.
2006-01-01
Three-dimensional conjugate heat transfer analyses on the manifold located upstream of the ramjet fuel injector are performed using CFdesign, a finite-element computational fluid dynamics (CFD) package. The flow field of the hot fuel (JP-7) flowing through the manifold is simulated and the wall temperature of the manifold is computed. The three-dimensional numerical results for the fuel temperature are compared with those obtained using a one-dimensional analysis based on empirical equations, and they show good agreement. The numerical results revealed that it takes around 30 to 40 sec to reach equilibrium, where the fuel temperature has dropped about 3 °F from the inlet to the exit of the manifold.
Using Technology to Facilitate Collaboration in Community-Based Participatory Research (CBPR)
Jessell, Lauren; Smith, Vivian; Jemal, Alexis; Windsor, Liliane
2017-01-01
This study explores the use of Computer-Supported Collaborative Work (CSCW) technologies, by way of a computer-based system called iCohere. This system was used to facilitate collaboration in conducting Community-Based Participatory Research (CBPR). Data were gathered from 13 members of a Community Collaborative Board (CCB). Analysis revealed that iCohere served the following functions: facilitating communication, providing a repository for information and resource sharing, and allowing for remote meeting attendance. Results indicated that while iCohere was useful in performing these functions, less expensive technologies had the potential to achieve similar goals if properly implemented. Implications for future research on CSCW systems and CBPR are discussed. PMID:29056871
A rare case of dedifferentiated liposarcoma of the sinonasal cavity: A case report.
Miyazaki, Masaru; Aoki, Mikiko; Oba, Satoru; Sakata, Toshifumi; Nakagawa, Takashi; Nabeshima, Kazuki
2017-10-01
Sarcoma is an uncommon histopathological presentation of sinonasal tumors, comprising ~15% of all cases; liposarcoma is particularly uncommon. An analysis of the available medical literature revealed no prior reports of dedifferentiated liposarcoma (DDLPS) of the sinonasal cavity. This case report presents a rare case of DDLPS of the sinonasal cavity. A 40-year-old woman, six weeks pregnant, was admitted with a left nasal obstruction. Endoscopic evaluation of the left nasal cavity revealed a polypoid lesion. A computed tomography scan indicated a mass invading the left nasal cavity, maxillary sinus and anterior ethmoid sinus with focal destruction of the surrounding bone. A biopsy of the tumor was performed, and hematoxylin and eosin staining of the tissue sections revealed proliferation of atypical and pleomorphic spindle cells with enlarged or elongated hyperchromatic nuclei and occasionally vacuolated cytoplasm, arranged in short interlacing fascicles or storiform structures, accompanied by tumor necrosis. These findings were consistent with undifferentiated pleomorphic sarcoma. Immunohistochemically, the tumor cells were positive for cyclin dependent kinase 4, mouse double minute 2 homolog (MDM2) and adipophilin. Fluorescence in situ hybridization (FISH) analysis revealed amplification of the MDM2 gene. Recently, undifferentiated pleomorphic sarcoma without areas of well-differentiated liposarcoma but with MDM2 amplification has come to be regarded as conventional DDLPS. In the present case, the tumor was diagnosed as a DDLPS on the basis of the histopathological, immunohistochemical and FISH findings.
NASA Technical Reports Server (NTRS)
Bartels, Robert E.
1998-01-01
Flow and turbulence models applied to the problem of shock buffet onset are studied. The accuracy of the interactive boundary layer and the thin-layer Navier-Stokes equations solved with recent upwind techniques using similar transport field equation turbulence models is assessed for standard steady test cases, including conditions having significant shock separation. The two methods are found to compare well in the shock buffet onset region of a supercritical airfoil that involves strong trailing-edge separation. A computational analysis using the interactive-boundary layer has revealed a Reynolds scaling effect in the shock buffet onset of the supercritical airfoil, which compares well with experiment. The methods are next applied to a conventional airfoil. Steady shock-separated computations of the conventional airfoil with the two methods compare well with experiment. Although the interactive boundary layer computations in the shock buffet region compare well with experiment for the conventional airfoil, the thin-layer Navier-Stokes computations do not. These findings are discussed in connection with possible mechanisms important in the onset of shock buffet and the constraints imposed by current numerical modeling techniques.
Jaspard, Emmanuel
2006-01-01
Background: There are three isoforms of glutamate dehydrogenase. The isoform EC 1.4.1.4 (GDH4) catalyses glutamate synthesis from 2-oxoglutarate and ammonium, using NAD(P)H. Ammonium assimilation is critical for plant growth. Although GDH4 from animals and prokaryotes are well characterized, there are few data concerning plant GDH4, even from those whose genomes are well annotated. Results: A large set of the three GDH isoforms was built, resulting in 116 non-redundant full polypeptide sequences. A computational analysis was made to gain more information concerning the structure-function relationship of GDH4 from plants (Eukaryota, Viridiplantae). The tested plant GDH4 sequences were the two known to date, those of Chlorella sorokiniana. This analysis revealed several structural features specific to plant GDH4: (i) the lack of a structure called the "antenna"; (ii) the NAD(P)-binding motif GAGNVA; and (iii) a second putative coenzyme-binding motif GVLTGKG together with four residues involved in the binding of the reduced form of NADP. Conclusion: A number of structural features specific to plant GDH4 have been found. The results reinforce the probable key role of GDH4 in ammonium assimilation by plants. Reviewers: This article was reviewed by Tina Bakolitsa (nominated by Eugene Koonin), Martin Jambon (nominated by Laura Landweber), Sandor Pangor and Franck Eisenhaber. PMID:17173671
Cuijpers, Vincent M J I; Jaroszewicz, Jacub; Anil, Sukumaran; Al Farraj Aldosari, Abdullah; Walboomers, X Frank; Jansen, John A
2014-03-01
The aims of this study were (i) to determine the spatial resolution and sensitivity of micro- versus nano-computed tomography (CT) techniques and (ii) to validate micro- versus nano-CT in a dog dental implant model, compared with histological analysis. To determine spatial resolution and sensitivity, standardized reference samples containing standardized nano- and microspheres were prepared in polymer and ceramic matrices. Thereafter, 10 titanium-coated polymer dental implants (3.2 mm in diameter by 4 mm in length) were placed in the mandible of Beagle dogs. Both micro- and nano-CT, as well as histological analyses, were performed. The reference samples confirmed the high resolution of the nano-CT system, which was capable of revealing sub-micron structures embedded in radiodense matrices. The dog implantation study and subsequent statistical analysis showed equal values for bone area and bone-implant contact measurements between micro-CT and histology. However, because of the limited sample size and field of view, nano-CT did not render reliable data representative of the entire bone-implant specimen. Micro-CT analysis is an efficient tool to quantitate bone healing parameters at the bone-implant interface, especially when using titanium-coated PMMA implants. Nano-CT is not suitable for such quantification, but reveals complementary morphological information rivaling histology, yet with the advantage of a 3D visualization. © 2013 John Wiley & Sons A/S. Published by Blackwell Publishing Ltd.
Roddy, Karen A.; Prendergast, Patrick J.; Murphy, Paula
2011-01-01
Very little is known about the regulation of morphogenesis in synovial joints. Mechanical forces generated from muscle contractions are required for normal development of several aspects of normal skeletogenesis. Here we show that biophysical stimuli generated by muscle contractions impact multiple events during chick knee joint morphogenesis influencing differential growth of the skeletal rudiment epiphyses and patterning of the emerging tissues in the joint interzone. Immobilisation of chick embryos was achieved through treatment with the neuromuscular blocking agent Decamethonium Bromide. The effects on development of the knee joint were examined using a combination of computational modelling to predict alterations in biophysical stimuli, detailed morphometric analysis of 3D digital representations, cell proliferation assays and in situ hybridisation to examine the expression of a selected panel of genes known to regulate joint development. This work revealed the precise changes to shape, particularly in the distal femur, that occur in an altered mechanical environment, corresponding to predicted changes in the spatial and dynamic patterns of mechanical stimuli and region specific changes in cell proliferation rates. In addition, we show altered patterning of the emerging tissues of the joint interzone with the loss of clearly defined and organised cell territories revealed by loss of characteristic interzone gene expression and abnormal expression of cartilage markers. This work shows that local dynamic patterns of biophysical stimuli generated from muscle contractions in the embryo act as a source of positional information guiding patterning and morphogenesis of the developing knee joint. PMID:21386908
Hsiao, Tzu-Hung; Chiu, Yu-Chiao; Hsu, Pei-Yin; Lu, Tzu-Pin; Lai, Liang-Chuan; Tsai, Mong-Hsun; Huang, Tim H.-M.; Chuang, Eric Y.; Chen, Yidong
2016-01-01
Several mutual information (MI)-based algorithms have been developed to identify dynamic gene-gene and function-function interactions governed by key modulators (genes, proteins, etc.). Due to intensive computation, however, these methods rely heavily on prior knowledge and are limited in genome-wide analysis. We present the modulated gene/gene set interaction (MAGIC) analysis to systematically identify genome-wide modulation of interaction networks. Based on a novel statistical test employing conjugate Fisher transformations of correlation coefficients, MAGIC features fast computation and adaption to variations of clinical cohorts. In simulated datasets MAGIC achieved greatly improved computation efficiency and overall superior performance than the MI-based method. We applied MAGIC to construct the estrogen receptor (ER) modulated gene and gene set (representing biological function) interaction networks in breast cancer. Several novel interaction hubs and functional interactions were discovered. ER+ dependent interaction between TGFβ and NFκB was further shown to be associated with patient survival. The findings were verified in independent datasets. Using MAGIC, we also assessed the essential roles of ER modulation in another hormonal cancer, ovarian cancer. Overall, MAGIC is a systematic framework for comprehensively identifying and constructing the modulated interaction networks in a whole-genome landscape. MATLAB implementation of MAGIC is available for academic uses at https://github.com/chiuyc/MAGIC. PMID:26972162
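A simplified stand-in for the modulated-interaction test (not the exact MAGIC statistic): split samples by modulator level, compute the gene-gene correlation in each group, and compare the two coefficients with the standard Fisher z-transformation for independent correlations. All data below are synthetic, and the function name is hypothetical.

```python
import numpy as np
from scipy import stats

def modulated_correlation_test(x, y, modulator):
    """Test whether corr(x, y) differs between modulator-high and -low samples.

    Uses the standard Fisher z-transformation for comparing two independent
    correlation coefficients; a simplified stand-in for the MAGIC statistic.
    """
    high = modulator >= np.median(modulator)
    r1 = np.corrcoef(x[high], y[high])[0, 1]
    r2 = np.corrcoef(x[~high], y[~high])[0, 1]
    n1, n2 = high.sum(), (~high).sum()
    z = (np.arctanh(r1) - np.arctanh(r2)) / np.sqrt(1 / (n1 - 3) + 1 / (n2 - 3))
    p = 2 * stats.norm.sf(abs(z))
    return r1, r2, z, p

rng = np.random.default_rng(3)
mod = rng.normal(size=300)                      # e.g. ER expression (toy)
g1 = rng.normal(size=300)
# g2 correlates with g1 only when the modulator is high.
g2 = np.where(mod >= np.median(mod), 0.8 * g1, 0.0) + 0.6 * rng.normal(size=300)
print(modulated_correlation_test(g1, g2, mod))
```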
An oculomotor and computational study of a patient with diagonistic dyspraxia.
Pouget, Pierre; Pradat-Diehl, Pascale; Rivaud-Péchoux, Sophie; Wattiez, Nicolas; Gaymard, Bertrand
2011-04-01
Diagonistic dyspraxia (DD) is a behavioural disorder encountered in split-brain subjects in which the left arm acts against the subject's will, deliberately counteracting what the right arm does. We report here an oculomotor and computational study of a patient with a long-lasting form of DD. A first series of oculomotor paradigms revealed marked and unprecedented saccade impairments. We used a computational model in order to provide information about the impaired decision-making process: the analysis of saccade latencies revealed that variations of decision times were explained by adjustments of the response criterion. This result and the paradoxical impairments observed in additional oculomotor paradigms allowed us to propose that this adjustment of the criterion level resulted from the co-existence of counteracting oculomotor programs, consistent with the existence of antagonist programs in homotopic cortical areas. In the intact brain, trans-hemispheric inhibition would allow suppression of these counter programs. Depending on the topography of the disconnected areas, various motor and/or behavioural impairments would arise in split-brain subjects. In motor systems, such conflict would result in increased criteria for desired movement execution (oculomotor system) or in simultaneous execution of counteracting movements (skeletal motor system). At higher cognitive levels, it may result in conflict of intentions. Copyright © 2010 Elsevier Srl. All rights reserved.
Pilia, P A; Swain, R P; Williams, A V; Loadholt, C B; Ainsworth, S K
1985-12-01
The cationic ultrastructural tracer polyethyleneimine (PEI: pI approximately equal to 11.0) binds electrostatically to uniformly spaced discrete electron-dense anionic sites present in the laminae rarae of the rat glomerular basement membrane (GBM), mesangial reflections of the GBM, Bowman's capsule, and tubular basement membranes when administered intravenously. Computer-assisted morphometric analysis of glomerular anionic sites reveals that the maximum concentration of stainable lamina rara externa (lre) sites (21/10,000 Å of GBM) occurs 60 minutes after PEI injection, with a site-site interspacing of 460 Å. Lamina rara interna (lri) sites similarly demonstrate a maximum concentration (20/10,000 Å of GBM) at 60 minutes with a periodicity of 497 Å. The concentration and distribution of anionic sites within the lri were irregular in pattern and markedly decreased in number, while the lre possesses an electrical field that is highly regular at all time intervals analyzed (15, 30, 60, 120, 180, 240, and 300 minutes). Immersion and perfusion of renal tissue with PEI reveals additional heavy staining of the epithelial and endothelial cell sialoprotein coatings. PEI appears to bind to glomerular anionic sites reversibly: i.e., between 60 and 180 minutes the concentration of stained sites decreases. At 300 minutes, the interspacing once again approaches the 60-minute concentration. This suggests a dynamic turnover, or dissociation followed by a reassociation, of glomerular negatively charged PEI binding sites. In contrast, morphometric analysis of anionic sites stained with lysozyme and protamine sulfate reveals interspacings of 642 Å and 585 Å, respectively; in addition, these tracers produce major glomerular ultrastructural alterations and induce transient proteinuria. PEI does not induce proteinuria in rats, nor does it produce glomerular morphologic alterations when ten times the tracer dosage is administered intravenously. These findings indicate that the choice of ultrastructural charge tracer, the method of administering the tracer, and the time selected for analysis of tissue after administration of tracer significantly influence results. Morphometric analysis of the distribution of glomerular anionic sites in nonproteinuric rats provides a method of evaluating quantitative alterations of the glomerular charge barrier in renal disease models.
Annotation: a computational solution for streamlining metabolomics analysis
Domingo-Almenara, Xavier; Montenegro-Burke, J. Rafael; Benton, H. Paul; Siuzdak, Gary
2017-01-01
Metabolite identification is still considered an imposing bottleneck in liquid chromatography mass spectrometry (LC/MS) untargeted metabolomics. The identification workflow usually begins with detecting relevant LC/MS peaks via peak-picking algorithms and retrieving putative identities based on accurate mass searching. However, accurate mass search alone provides poor evidence for metabolite identification. For this reason, computational annotation is used to reveal the underlying metabolites' monoisotopic masses, improving putative identification in addition to confirmation with tandem mass spectrometry. This review examines LC/MS data from a computational and analytical perspective, focusing on the occurrence of neutral losses and in-source fragments, to understand the challenges in computational annotation methodologies. Herein, we examine the state-of-the-art strategies for computational annotation, including: (i) peak grouping or full scan (MS1) pseudo-spectra extraction, i.e., clustering all mass spectral signals stemming from each metabolite; (ii) annotation using ion adduction and mass distance among ion peaks; (iii) incorporation of biological knowledge such as biotransformations or pathways; (iv) tandem MS data; and (v) metabolite retention time calibration, usually achieved by prediction from molecular descriptors. Advantages and pitfalls of each of these strategies are discussed, as well as expected future trends in computational annotation. PMID:29039932
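Annotation strategy (ii) can be sketched as follows: within a retention-time cluster, peaks whose m/z values differ by known adduct relationships are assigned to the same neutral metabolite mass. The adduct masses below are the commonly tabulated proton/sodium/ammonium values, and the peak list is hypothetical.

```python
# Sketch of annotation by ion adduction and mass distance among co-eluting peaks.
ADDUCTS = {"[M+H]+": 1.00728, "[M+Na]+": 22.98922, "[M+NH4]+": 18.03383}
TOL = 0.005   # m/z tolerance in Da

# Hypothetical co-eluting MS1 peaks from one retention-time cluster.
peaks = [181.0707, 203.0526, 198.0973, 365.1054]

def annotate(peaks, adducts=ADDUCTS, tol=TOL):
    """Pair peaks whose implied neutral masses agree under two adduct hypotheses."""
    hits = []
    for i, mz_a in enumerate(peaks):
        for j, mz_b in enumerate(peaks):
            if i == j:
                continue
            for name_a, da in adducts.items():
                for name_b, db in adducts.items():
                    # Same neutral mass M implies mz_a - da == mz_b - db.
                    if abs((mz_a - da) - (mz_b - db)) < tol:
                        hits.append((mz_a, name_a, mz_b, name_b, mz_a - da))
    return hits

for mz_a, name_a, mz_b, name_b, neutral in annotate(peaks):
    print(f"{mz_a:.4f} ({name_a}) and {mz_b:.4f} ({name_b}) -> M = {neutral:.4f}")
```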
Study of basic computer competence among public health nurses in Taiwan.
Yang, Kuei-Feng; Yu, Shu; Lin, Ming-Sheng; Hsu, Chia-Ling
2004-03-01
Rapid advances in information technology and media have made distance learning on the Internet possible. This new model of learning allows greater efficiency and flexibility in knowledge acquisition. Since basic computer competence is a prerequisite for this new learning model, this study was conducted to examine the basic computer competence of public health nurses in Taiwan and explore factors influencing computer competence. A national cross-sectional randomized study was conducted with 329 public health nurses. A questionnaire was used to collect data and was delivered by mail. Results indicate that the basic computer competence of public health nurses in Taiwan still needs to be improved (mean = 57.57 ± 2.83; total score range 26-130). Among the five most frequently used software programs, nurses were most knowledgeable about Word and least knowledgeable about PowerPoint. Stepwise multiple regression analysis revealed eight variables (weekly number of hours spent online at home, weekly amount of time spent online at work, weekly frequency of computer use at work, previous computer training, computer at workplace and Internet access, job position, education level, and age) that significantly influenced computer competence, which together accounted for 39.0% of the variance. In conclusion, greater computer competence, broader educational programs regarding computer technology, and a greater emphasis on computers at work are necessary to increase the usefulness of distance learning via the Internet in Taiwan. Building a user-friendly environment is important in developing this new media model of learning for the future.
Optical properties of boron-group (V) hexagonal nanowires: DFT investigation
NASA Astrophysics Data System (ADS)
Santhibhushan, B.; Soni, Mahesh; Srivastava, Anurag
2017-07-01
The paper presents structural, electronic and optical properties of boron-group V hexagonal nanowires (h-NW) within the framework of density functional theory. The h-NWs of boron-group V compounds with an analogous diameter of 12 Å have been designed in the (1 1 1) plane. Stability analysis performed through formation energies reveals that the stability of these structures decreases with increasing atomic number of the group V element. The nature of the bands indicates that these nanowires are good electrical conductors. The optical behaviour of the nanowires has been analysed through the absorption coefficient, reflectivity, refractive index, optical conductivity and electron energy loss spectrum (EELS), which are computed from the frequency-dependent complex dielectric function. The analysis reveals high reactivity of the BP and BAs h-NWs to incident light, especially in the IR and visible ranges, and the optical transparency of the BN h-NW in the visible and UV ranges.
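The post-processing step from the computed dielectric function to the optical quantities listed above follows the standard textbook relations; a minimal sketch (toy values, a single photon energy) is shown below. It is not the authors' code, only the commonly used formulas.

```python
import numpy as np

def optical_constants(eps1, eps2, energy_ev):
    """Refractive index n, extinction coefficient k, normal-incidence
    reflectivity R and absorption coefficient alpha (1/m), from the real and
    imaginary parts of the dielectric function (standard relations)."""
    mod = np.sqrt(eps1**2 + eps2**2)
    n = np.sqrt((mod + eps1) / 2.0)
    k = np.sqrt((mod - eps1) / 2.0)
    R = ((n - 1)**2 + k**2) / ((n + 1)**2 + k**2)
    hbar_c = 1.9732697e-7                 # eV*m
    alpha = 2.0 * energy_ev * k / hbar_c  # alpha = 2*omega*k/c
    return n, k, R, alpha

# Example: one (eps1, eps2) point at a 2 eV photon energy (toy values).
print(optical_constants(np.array([9.0]), np.array([4.0]), np.array([2.0])))
```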
Computational gene expression profiling under salt stress reveals patterns of co-expression
Sanchita; Sharma, Ashok
2016-01-01
Plants respond differently to environmental conditions. Among various abiotic stresses, salt stress is a condition in which excess salt in the soil inhibits plant growth. To understand the response of plants to stress conditions, identification of the responsible genes is required. Clustering is a data mining technique used to group genes with similar expression. The genes of a cluster show similar expression and function. We applied clustering algorithms to gene expression data of Solanum tuberosum showing differential expression in Capsicum annuum under salt stress. The clusters that were common across multiple algorithms were taken forward for analysis. Principal component analysis (PCA) further validated the findings of the other cluster algorithms by visualizing their clusters in three-dimensional space. Functional annotation results revealed that most of the genes were involved in stress-related responses. Our findings suggest that these algorithms may be helpful in the prediction of the function of co-expressed genes. PMID:26981411
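A minimal sketch of this kind of workflow: cluster toy expression profiles with two different algorithms, check how often the algorithms agree on gene pairings, and project the profiles with PCA for visual validation in three dimensions. The data, cluster counts and agreement measure are illustrative choices, not the study's exact procedure.

```python
import numpy as np
from sklearn.cluster import KMeans, AgglomerativeClustering
from sklearn.decomposition import PCA

# Toy expression matrix: 60 genes x 8 conditions (e.g. salt-stress time points).
rng = np.random.default_rng(4)
profiles = np.vstack([rng.normal(loc=m, scale=0.4, size=(20, 8))
                      for m in (np.linspace(0, 2, 8), np.linspace(2, 0, 8), np.zeros(8))])

# Cluster the gene profiles with two different algorithms.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(profiles)
ac = AgglomerativeClustering(n_clusters=3).fit_predict(profiles)

# "Common" clusters: gene pairs that both algorithms group (or separate) alike.
n = len(profiles)
agree = sum((km[i] == km[j]) == (ac[i] == ac[j])
            for i in range(n) for j in range(i + 1, n))
print("pairwise agreement between the two clusterings:", agree / (n * (n - 1) / 2))

# PCA projection for visual validation of the clusters in 3-D space.
pca = PCA(n_components=3).fit(profiles)
coords = pca.transform(profiles)
print("PCA projection shape:", coords.shape,
      "variance explained:", pca.explained_variance_ratio_.sum())
```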
Ab initio genotype–phenotype association reveals intrinsic modularity in genetic networks
Slonim, Noam; Elemento, Olivier; Tavazoie, Saeed
2006-01-01
Microbial species express an astonishing diversity of phenotypic traits, behaviors, and metabolic capacities. However, our molecular understanding of these phenotypes is based almost entirely on studies in a handful of model organisms that together represent only a small fraction of this phenotypic diversity. Furthermore, many microbial species are not amenable to traditional laboratory analysis because of their exotic lifestyles and/or lack of suitable molecular genetic techniques. As an adjunct to experimental analysis, we have developed a computational information-theoretic framework that produces high-confidence gene–phenotype predictions using cross-species distributions of genes and phenotypes across 202 fully sequenced archaea and eubacteria. In addition to identifying the genetic basis of complex traits, our approach reveals the organization of these genes into generic preferentially co-inherited modules, many of which correspond directly to known enzymatic pathways, molecular complexes, signaling pathways, and molecular machines. PMID:16732191
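The core of such an information-theoretic screen is the dependence between a gene's cross-species presence/absence profile and a phenotype indicator; a small illustrative mutual-information computation over synthetic profiles for 202 genomes is sketched below, without the paper's significance testing or module detection.

```python
import numpy as np

def mutual_information(x, y):
    """Mutual information (bits) between two binary vectors, e.g. gene
    presence/absence across genomes and a phenotype indicator."""
    mi = 0.0
    for a in (0, 1):
        for b in (0, 1):
            p_ab = np.mean((x == a) & (y == b))
            if p_ab > 0:
                mi += p_ab * np.log2(p_ab / (np.mean(x == a) * np.mean(y == b)))
    return mi

# Toy phylogenetic profiles across 202 genomes (1 = present).
rng = np.random.default_rng(5)
phenotype = rng.integers(0, 2, 202)
gene_linked = np.where(rng.random(202) < 0.85, phenotype, 1 - phenotype)  # informative
gene_random = rng.integers(0, 2, 202)                                     # uninformative

print("MI(linked gene, phenotype):", round(mutual_information(gene_linked, phenotype), 3))
print("MI(random gene, phenotype):", round(mutual_information(gene_random, phenotype), 3))
```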
NASA Astrophysics Data System (ADS)
Binoy, J.; Prathima, N. B.; Murali Krishna, C.; Santhosh, C.; Hubert Joe, I.; Jayakumar, V. S.
2006-08-01
Acetanilide, a compound of pharmaceutical importance possessing pain-relieving properties due to its blocking the pulse dissipating along the nerve fiber, is subjected to vibrational spectral investigation using NIR FT Raman, FT-IR, and SERS. The geometry, Mulliken charges, and vibrational spectrum of acetanilide have been computed using the Hartree-Fock theory and density functional theory employing the 6-31G (d) basis set. To investigate the influence of intermolecular amide hydrogen bonding, the geometry, charge distribution, and vibrational spectrum of the acetanilide dimer have been computed at the HF/6-31G (d) level. The computed geometries reveal that the acetanilide molecule is planar, while twisting of the secondary amide group with respect to the phenyl ring is found upon hydrogen bonding. The trans isomerism and “amido” form of the secondary amide, hyperconjugation of the C=O group with the adjacent C-C bond, and donor-acceptor interaction have been investigated using computed geometry. The carbonyl stretching band position is found to be influenced by the tendency of the phenyl ring to withdraw nitrogen lone pair, intermolecular hydrogen bonding, conjugation, and hyperconjugation. A decrease in the NH and C=O bond orders and increase in the C-N bond orders due to donor-acceptor interaction can be observed in the vibrational spectra. The SERS spectral analysis reveals that the flat orientation of the molecule on the adsorption plane is preferred.
Marquié, J C; Thon, B; Baracat, B
1994-06-01
The study of Bue and Gollac (1988) provided evidence that a significantly lower proportion of workers aged 45 years and over make use of computer technology compared with younger ones. The aim of the present survey was to explain this fact by a more intensive analysis of the older workers' attitude with respect to the computerization of work situations in relation to other individual and organizational factors. Six hundred and twenty office workers from 18 to 70 years old, either users or non-users of computerized devices, were asked to complete a questionnaire. The questions allowed the assessment of various aspects of the workers' current situation, such as the computer training they had received, the degree of consultation they were subjected to during the computerization process, their representation of the effects of these new technologies on working conditions and employment, the rate of use of new technologies outside the work context, and the perceived usefulness of computers for their own work. The analysis of the questionnaire revealed that as long as the step towards using computer tools, even minimally, has not been taken, then attitudes with respect to computerization are on the whole not very positive and are a source of anxiety for many workers. Age, and even more, seniority in the department, increase such negative representations. The effects of age and seniority were also found among users, as well as the effects of other factors such as qualification, education level, type and rate of computer use, and size of the firm. For the older workers, the expectation of less positive consequences for their career, or even the fear that computerization might be accompanied by threats to their own employment and the less clear knowledge of how computers operate, appeared to account for a significant part of the observed age and seniority differences in attitudes. Although the difference in the amount of computer training between age groups was smaller than expected, the study revealed that one third of the users never received any specific training, and that many of those who benefited from it were trained for only a few days. Consultation of the staff during the computerization process also appeared to be poor, to apply mostly to the best trained and qualified workers, and to be more highly developed in small companies. The results are discussed in the light of more qualitative data recorded during the survey. They suggest the need to increase information, training and involvement of all personnel from the very first stages of computerization (or other technical changes) in order to lessen fears and the feeling of disruption, which are particularly obvious among the oldest workers.
Validation of a computerized algorithm to quantify fetal heart rate deceleration area.
Gyllencreutz, Erika; Lu, Ke; Lindecrantz, Kaj; Lindqvist, Pelle G; Nordstrom, Lennart; Holzmann, Malin; Abtahi, Farhad
2018-05-16
Reliability in visual cardiotocography interpretation is unsatisfactory, which has led to the development of computerized cardiotocography. Computerized analysis is well established for antenatal fetal surveillance, but has not yet performed sufficiently well during labor. We aimed to investigate the capacity of a new computerized algorithm, compared to visual assessment, in identifying the intrapartum fetal heart rate baseline and decelerations. Three hundred and twelve intrapartum cardiotocography tracings with variable decelerations were analysed by the computerized algorithm and visually examined by two observers, blinded to each other and to the computer analysis. The width, depth and area of each deceleration were measured. Four cases (>100 variable decelerations) were subject to in-depth detailed analysis. The outcome measures were bias in seconds (width), beats per minute (depth), and beats (area) between computer and observers, using Bland-Altman analysis. Interobserver reliability was determined by calculating the intraclass correlation and by Spearman rank analysis. The analysis (312 cases) showed excellent intraclass correlation (0.89-0.95) and very strong Spearman correlation (0.82-0.91). The detailed analysis of >100 decelerations in 4 cases revealed low bias between the computer and the two observers: width 1.4 and 1.4 seconds, depth 5.1 and 0.7 beats per minute, and area 0.1 and -1.7 beats. This was comparable to the bias between the two observers: 0.3 seconds (width), 4.4 beats per minute (depth), and 1.7 beats (area). The intraclass correlation was excellent (0.90-0.98). A novel computerized algorithm for intrapartum cardiotocography analysis is as accurate as gold-standard visual assessment, with high correlation and low bias. This article is protected by copyright. All rights reserved.
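The bias figures above are Bland-Altman-style statistics; a minimal sketch of how bias and 95% limits of agreement could be computed for two sets of deceleration measurements is shown below, with invented numbers rather than the study's data.

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement between two measurement methods,
    e.g. computer-derived vs. visually assessed deceleration area."""
    diff = np.asarray(method_a) - np.asarray(method_b)
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)
    return bias, bias - loa, bias + loa

# Toy deceleration areas (in beats) for 12 decelerations, computer vs. observer.
computer = np.array([18.2, 25.1, 30.4, 12.9, 40.3, 22.8,
                     35.0, 15.5, 28.7, 19.9, 33.1, 24.4])
observer = computer + np.random.default_rng(6).normal(0.1, 1.5, size=12)

bias, lower, upper = bland_altman(computer, observer)
print(f"bias = {bias:.2f} beats, 95% limits of agreement = [{lower:.2f}, {upper:.2f}]")
```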
Hedayat, Assem; Nagy, Nicole; Packota, Garnet; Monteith, Judy; Allen, Darcy; Wysokinski, Tomasz; Zhu, Ning
2016-05-01
Dental burs are used extensively in dentistry to mechanically prepare tooth structures for restorations (fillings), yet little has been reported on the bur debris left behind in the teeth, and whether it poses potential health risks to patients. The aim here is to image dental bur debris under dental fillings and to allude to the potential health hazards that can be caused by this debris when left in direct contact with the biological surroundings, specifically when the debris is made of a non-biocompatible material. Non-destructive micro-computed tomography using the BioMedical Imaging & Therapy facility 05ID-2 beamline at the Canadian Light Source was pursued at 50 keV and at a pixel size of 4 µm to image dental bur fragments under a composite resin dental filling. The bur's cutting edges that produced the fragments were also chemically analyzed. The technique revealed dental bur fragments of different sizes in different locations on the floor of the prepared surface of the teeth and under the filling, which places them in direct contact with the dentinal tubules and the dentinal fluid circulating within them. Dispersive X-ray spectroscopy elemental analysis of the dental bur edges revealed that the fragments are made of tungsten carbide-cobalt, which is bio-incompatible.
Tools, techniques, organisation and culture of the CADD group at Sygnature Discovery.
St-Gallay, Steve A; Sambrook-Smith, Colin P
2017-03-01
Computer-aided drug design encompasses a wide variety of tools and techniques, and can be implemented with a range of organisational structures and focus in different organisations. Here we outline the computational chemistry skills within Sygnature Discovery, along with the software and hardware at our disposal, and briefly discuss the methods that are not employed and why. The goal of the group is to provide support for design and analysis in order to improve the quality of compounds synthesised and reduce the timelines of drug discovery projects, and we reveal how this is achieved at Sygnature. Impact on medicinal chemistry is vital to demonstrating the value of computational chemistry, and we discuss the approaches taken to influence the list of compounds for synthesis, and how we recognise success. Finally we touch on some of the areas being developed within the team in order to provide further value to the projects and clients.
How should a speech recognizer work?
Scharenborg, Odette; Norris, Dennis; Bosch, Louis; McQueen, James M
2005-11-12
Although researchers studying human speech recognition (HSR) and automatic speech recognition (ASR) share a common interest in how information processing systems (human or machine) recognize spoken language, there is little communication between the two disciplines. We suggest that this lack of communication follows largely from the fact that research in these related fields has focused on the mechanics of how speech can be recognized. In Marr's (1982) terms, emphasis has been on the algorithmic and implementational levels rather than on the computational level. In this article, we provide a computational-level analysis of the task of speech recognition, which reveals the close parallels between research concerned with HSR and ASR. We illustrate this relation by presenting a new computational model of human spoken-word recognition, built using techniques from the field of ASR that, in contrast to current existing models of HSR, recognizes words from real speech input. 2005 Lawrence Erlbaum Associates, Inc.
Analysis of Material Sample Heated by Impinging Hot Hydrogen Jet in a Non-Nuclear Tester
NASA Technical Reports Server (NTRS)
Wang, Ten-See; Foote, John; Litchford, Ron
2006-01-01
A computational conjugate heat transfer methodology was developed and anchored with data obtained from a hot-hydrogen jet heated, non-nuclear materials tester, as a first step towards developing an efficient and accurate multiphysics, thermo-fluid computational methodology to predict environments for a hypothetical solid-core nuclear thermal engine thrust chamber. The computational methodology is based on a multidimensional, finite-volume, turbulent, chemically reacting, thermally radiating, unstructured-grid, and pressure-based formulation. The multiphysics invoked in this study include hydrogen dissociation kinetics and thermodynamics, turbulent flow, and convective, thermal radiative, and conjugate heat transfer. Predicted hot-hydrogen jet and material surface temperatures were compared with measurements. Predicted solid temperatures were compared with those obtained with a standard heat transfer code. The interrogation of the physics revealed that reactions of hydrogen dissociation and recombination are highly correlated with local temperature and are necessary for accurate prediction of the hot-hydrogen jet temperature.
Children's strategies to solving additive inverse problems: a preliminary analysis
NASA Astrophysics Data System (ADS)
Ding, Meixia; Auxter, Abbey E.
2017-03-01
Prior studies show that elementary school children generally "lack" formal understanding of inverse relations. This study goes beyond lack to explore what children might "have" in their existing conception. A total of 281 students, kindergarten to third grade, were recruited to respond to a questionnaire that involved both contextual and non-contextual tasks on inverse relations, requiring both computational and explanatory skills. Results showed that children demonstrated better performance in computation than explanation. However, many students' explanations indicated that they did not necessarily utilize inverse relations for computation. Rather, they appeared to possess partial understanding, as evidenced by their use of part-whole structure, which is a key to understanding inverse relations. A close inspection of children's solution strategies further revealed that the sophistication of children's conception of part-whole structure varied in representation use and unknown quantity recognition, which suggests rich opportunities to develop students' understanding of inverse relations in lower elementary classrooms.
NASA Astrophysics Data System (ADS)
Perna, Andrea; Jost, Christian; Couturier, Etienne; Valverde, Sergi; Douady, Stéphane; Theraulaz, Guy
2008-09-01
Recent studies have introduced computer tomography (CT) as a tool for the visualisation and characterisation of insect architectures. Here, we use CT to map the three-dimensional networks of galleries inside Cubitermes nests in order to analyse them with tools from graph theory. The structure of these networks indicates that connections inside the nest are rearranged during the whole nest life. The functional analysis reveals that the final network topology represents an excellent compromise between efficient connectivity inside the nest and defence against attacking predators. We further discuss and illustrate the usefulness of CT to disentangle environmental and specific influences on nest architecture.
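A minimal sketch of the kind of graph-theoretic measures used in such analyses, applied to an invented toy gallery network (not data or code from the Cubitermes study): average shortest path length as a connectivity-efficiency proxy, and reachability after a breach near the entrance as a defence proxy.

```python
# Hypothetical illustration of graph measures for a nest gallery network.
# The graph below is invented; it is not data from the Cubitermes study.
import networkx as nx

G = nx.Graph()
# Chambers (nodes) connected by galleries (edges).
G.add_edges_from([
    ("entrance", "c1"), ("c1", "c2"), ("c2", "c3"),
    ("c1", "c4"), ("c4", "c5"), ("c5", "c3"), ("c2", "c5"),
])

# Connectivity efficiency: shorter average path length = faster movement inside the nest.
print("average shortest path length:",
      nx.average_shortest_path_length(G))

# Defence proxy: how much of the nest stays reachable from the entrance
# after the chamber adjacent to the entrance is breached (removed).
H = G.copy()
H.remove_node("c1")
reachable = nx.node_connected_component(H, "entrance")
print("fraction of chambers reachable after breach:",
      len(reachable) / H.number_of_nodes())
```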
The kids got game: Computer/video games, gender and learning outcomes in science classrooms
NASA Astrophysics Data System (ADS)
Anderson, Janice Lyn
In recent years educators have begun to explore how to purposively design computer/video games to support student learning. This interest in video games has arisen in part because educational video games appear to have the potential to improve student motivation and interest in technology, and to engage students in learning through the use of a familiar medium (Squire, 2005; Shaffer, 2006; Gee, 2005). The purpose of this dissertation research is to specifically address the issue of student learning through the use of educational computer/video games. Using the Quest Atlantis computer game, this study involved a mixed-model research strategy that allowed for both broad understandings of classroom practices and specific analysis of outcomes through the themes that emerged from the case studies of the gendered groups using the game. Specifically, this study examined how fifth-grade students' learning about science concepts, such as water quality and ecosystems, unfolds over time as they participate in the Quest Atlantis computer game. Data sources included classroom observations and video, pre- and post-written assessments, pre- and post-student content interviews, student field notebooks, field reports, and the field notes of the researcher. To make sense of how students' learning unfolded, video was analyzed using a framework of interaction analysis and small-group interactions (Jordan & Henderson, 1995; Webb, 1995). These coded units were then examined with respect to student artifacts and assessments, and patterns of learning trajectories were analyzed. The analysis revealed that overall, student learning outcomes improved from pre- to post-assessments for all students. While there were no observable gendered differences with respect to the test scores and content interviews, there were gendered differences with respect to game play. Implications for game design, use of external scaffolds, games as tools for learning, and gendered findings are discussed.
Computational study of noise in a large signal transduction network.
Intosalmi, Jukka; Manninen, Tiina; Ruohonen, Keijo; Linne, Marja-Leena
2011-06-21
Biochemical systems are inherently noisy due to the discrete reaction events that occur in a random manner. Although noise is often perceived as a disturbing factor, the system might actually benefit from it. In order to understand the role of noise better, its quality must be studied in a quantitative manner. Computational analysis and modeling play an essential role in this demanding endeavor. We implemented a large nonlinear signal transduction network combining protein kinase C, mitogen-activated protein kinase, phospholipase A2, and β isoform of phospholipase C networks. We simulated the network in 300 different cellular volumes using the exact Gillespie stochastic simulation algorithm and analyzed the results in both the time and frequency domain. In order to perform simulations in a reasonable time, we used modern parallel computing techniques. The analysis revealed that time and frequency domain characteristics depend on the system volume. The simulation results also indicated that there are several kinds of noise processes in the network, all of them representing different kinds of low-frequency fluctuations. In the simulations, the power of noise decreased on all frequencies when the system volume was increased. We concluded that basic frequency domain techniques can be applied to the analysis of simulation results produced by the Gillespie stochastic simulation algorithm. This approach is suited not only to the study of fluctuations but also to the study of pure noise processes. Noise seems to have an important role in biochemical systems and its properties can be numerically studied by simulating the reacting system in different cellular volumes. Parallel computing techniques make it possible to run massive simulations in hundreds of volumes and, as a result, accurate statistics can be obtained from computational studies. © 2011 Intosalmi et al; licensee BioMed Central Ltd.
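A minimal sketch of the volume-dependence idea, using an exact Gillespie simulation of a toy birth-death motif rather than the large signalling network studied in the paper; the rate constants and volumes are illustrative assumptions only.

```python
# Hedged sketch: exact Gillespie simulation of a toy birth-death process
# (production at rate k1*V, degradation at rate k2 per molecule), run at
# several volumes to show how relative noise shrinks as volume grows.
# Rates and volumes are invented, not parameters from the paper.
import numpy as np

rng = np.random.default_rng(0)
k1, k2 = 10.0, 1.0          # production (per unit volume) and degradation rates

def gillespie(volume, t_end=100.0, t_transient=20.0):
    t, x, samples = 0.0, 0, []
    while t < t_end:
        a_prod, a_deg = k1 * volume, k2 * x
        a_tot = a_prod + a_deg
        t += rng.exponential(1.0 / a_tot)           # time to next reaction event
        x += 1 if rng.random() < a_prod / a_tot else -1
        if t > t_transient:                          # discard the transient
            samples.append(x)
    return np.array(samples)

for volume in (1.0, 10.0, 100.0):
    x = gillespie(volume)
    print(f"V={volume:6.1f}  mean={x.mean():8.1f}  CV={x.std() / x.mean():.3f}")
```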
Using text analysis to quantify the similarity and evolution of scientific disciplines.
Dias, Laércio; Gerlach, Martin; Scharloth, Joachim; Altmann, Eduardo G
2018-01-01
We use an information-theoretic measure of linguistic similarity to investigate the organization and evolution of scientific fields. An analysis of almost 20 M papers from the past three decades reveals that the linguistic similarity is related but different from experts and citation-based classifications, leading to an improved view on the organization of science. A temporal analysis of the similarity of fields shows that some fields (e.g. computer science) are becoming increasingly central, but that on average the similarity between pairs of disciplines has not changed in the last decades. This suggests that tendencies of convergence (e.g. multi-disciplinarity) and divergence (e.g. specialization) of disciplines are in balance.
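As a generic stand-in for the information-theoretic similarity measure (the paper's exact measure is not reproduced here), the sketch below compares invented word-frequency distributions of two fields using the Jensen-Shannon divergence.

```python
# Hedged sketch: Jensen-Shannon divergence between word-frequency
# distributions as a generic information-theoretic similarity measure.
# The toy vocabulary and counts are invented, not from the 20M-paper corpus.
import numpy as np

def js_divergence(p, q):
    p, q = np.asarray(p, float), np.asarray(q, float)
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0
        return np.sum(a[mask] * np.log2(a[mask] / b[mask]))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

vocab = ["algorithm", "protein", "galaxy", "network", "cell"]
field_a = [120, 5, 1, 80, 3]     # e.g. computer-science-like word counts
field_b = [30, 90, 0, 60, 110]   # e.g. biology-like word counts
print("JS divergence (bits):", round(js_divergence(field_a, field_b), 3))
```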
Computer Proficiency Questionnaire: Assessing Low and High Computer Proficient Seniors
Boot, Walter R.; Charness, Neil; Czaja, Sara J.; Sharit, Joseph; Rogers, Wendy A.; Fisk, Arthur D.; Mitzner, Tracy; Lee, Chin Chin; Nair, Sankaran
2015-01-01
Purpose of the Study: Computers and the Internet have the potential to enrich the lives of seniors and aid in the performance of important tasks required for independent living. A prerequisite for reaping these benefits is having the skills needed to use these systems, which is highly dependent on proper training. One prerequisite for efficient and effective training is being able to gauge current levels of proficiency. We developed a new measure (the Computer Proficiency Questionnaire, or CPQ) to measure computer proficiency in the domains of computer basics, printing, communication, Internet, calendaring software, and multimedia use. Our aim was to develop a measure appropriate for individuals with a wide range of proficiencies from noncomputer users to extremely skilled users. Design and Methods: To assess the reliability and validity of the CPQ, a diverse sample of older adults, including 276 older adults with no or minimal computer experience, was recruited and asked to complete the CPQ. Results: The CPQ demonstrated excellent reliability (Cronbach’s α = .98), with subscale reliabilities ranging from .86 to .97. Age, computer use, and general technology use all predicted CPQ scores. Factor analysis revealed three main factors of proficiency related to Internet and e-mail use; communication and calendaring; and computer basics. Based on our findings, we also developed a short-form CPQ (CPQ-12) with similar properties but 21 fewer questions. Implications: The CPQ and CPQ-12 are useful tools to gauge computer proficiency for training and research purposes, even among low computer proficient older adults. PMID:24107443
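A minimal sketch of the reliability statistic reported above (Cronbach's alpha), computed from an invented item-by-respondent score matrix rather than actual CPQ data.

```python
# Hedged sketch: Cronbach's alpha for an item-by-respondent score matrix.
# The toy responses below are invented, not CPQ data.
import numpy as np

def cronbach_alpha(scores):
    """scores: 2-D array, rows = respondents, columns = items."""
    scores = np.asarray(scores, float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()     # sum of per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)       # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

responses = np.array([
    [5, 4, 5, 4],
    [2, 2, 1, 2],
    [4, 5, 4, 4],
    [1, 2, 2, 1],
    [3, 3, 4, 3],
])
print("Cronbach's alpha:", round(cronbach_alpha(responses), 3))
```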
Computational Modeling and Real-Time Control of Patient-Specific Laser Treatment of Cancer
Fuentes, D.; Oden, J. T.; Diller, K. R.; Hazle, J. D.; Elliott, A.; Shetty, A.; Stafford, R. J.
2014-01-01
An adaptive feedback control system is presented which employs a computational model of bioheat transfer in living tissue to guide, in real time, laser treatments of prostate cancer monitored by magnetic resonance thermal imaging (MRTI). The system is built on what can be referred to as cyberinfrastructure - a complex structure of high-speed network, large-scale parallel computing devices, laser optics, imaging, visualizations, inverse-analysis algorithms, mesh generation, and control systems that guide laser therapy to optimally control the ablation of cancerous tissue. The computational system has been successfully tested on in vivo canine prostate. Over the course of an 18-minute laser-induced thermal therapy (LITT) performed at the M.D. Anderson Cancer Center (MDACC) in Houston, Texas, the computational models were calibrated to intra-operative real-time thermal imaging treatment data, and the calibrated models controlled the bioheat transfer to within 5°C of the predetermined treatment plan. The computational arena is in Austin, Texas, and managed at the Institute for Computational Engineering and Sciences (ICES). The system is designed to control the bioheat transfer remotely while simultaneously providing real-time remote visualization of the ongoing treatment. Post-operative histology of the canine prostate revealed that the damage region was within the targeted 1.2 cm diameter treatment objective. PMID:19148754
Funaki, Ayumu; Ohkubo, Masaki; Wada, Shinichi; Murao, Kohei; Matsumoto, Toru; Niizuma, Shinji
2012-07-01
With the wide dissemination of computed tomography (CT) screening for lung cancer, measuring nodule volume accurately with computer-aided volumetry software is increasingly important. Many studies of the accuracy of volumetry software have been performed using a phantom with artificial nodules. These phantom studies are limited, however, in their ability to reproduce nodules both accurately and in the variety of sizes and densities required. Therefore, we propose a new approach using computer-simulated nodules based on the point spread function measured in a CT system. The validity of the proposed method was confirmed by the excellent agreement obtained between computer-simulated nodules and phantom nodules regarding the volume measurements. A practical clinical evaluation of the accuracy of volumetry software was achieved by adding simulated nodules onto clinical lung images, including noise and artifacts. The tested volumetry software was revealed to be accurate to within an error of 20% for nodules >5 mm with a difference between nodule and background (lung) CT values of 400-600 HU. Such a detailed analysis can provide clinically useful information on the use of volumetry software in CT screening for lung cancer. We concluded that the proposed method is effective for evaluating the performance of computer-aided volumetry software.
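A minimal sketch of the simulation idea: blur an ideal digital sphere with an assumed Gaussian point spread function and then measure its volume with a simple threshold. The PSF width, voxel size, densities, and threshold below are illustrative assumptions, not the measured PSF of the study.

```python
# Hedged sketch: simulate a CT nodule by blurring an ideal sphere with an
# assumed Gaussian point spread function, then measure its volume with a
# half-way threshold. All parameters are illustrative assumptions only.
import numpy as np
from scipy.ndimage import gaussian_filter

voxel_mm = 0.5                     # isotropic voxel size
radius_mm = 4.0                    # "true" nodule radius (8 mm diameter)
background_hu, nodule_hu = -850.0, -300.0
psf_sigma_mm = 0.8                 # assumed Gaussian PSF width

n = 64
ax = (np.arange(n) - n / 2) * voxel_mm
x, y, z = np.meshgrid(ax, ax, ax, indexing="ij")
ideal = np.where(x**2 + y**2 + z**2 <= radius_mm**2, nodule_hu, background_hu)

blurred = gaussian_filter(ideal, sigma=psf_sigma_mm / voxel_mm)

# Volumetry with a threshold half-way between nodule and background density.
threshold = 0.5 * (nodule_hu + background_hu)
measured_mm3 = np.count_nonzero(blurred > threshold) * voxel_mm**3
true_mm3 = 4.0 / 3.0 * np.pi * radius_mm**3
print(f"true {true_mm3:.0f} mm^3, measured {measured_mm3:.0f} mm^3, "
      f"error {(measured_mm3 - true_mm3) / true_mm3 * 100:+.1f}%")
```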
Ishihara, D; Horie, T; Niho, T
2014-11-07
The relative importance of the wing's inertial and aerodynamic forces is the key to revealing how the kinematical characteristics of the passive pitching motion of insect flapping wings are generated; this remains unclear despite its importance in the design of insect-like micro air vehicles. Therefore, we investigate three species of flies in order to reveal this, using a novel fluid-structure interaction analysis that consists of a dynamically scaled experiment and a three-dimensional finite element analysis. In the experiment, the dynamic similarity between the lumped torsional flexibility model, taken as a first approximation of the dipteran wing, and the actual insect is measured by the Reynolds number Re, the Strouhal number St, the mass ratio M, and the Cauchy number Ch. In the computation, the three-dimensional treatment is important in order to simulate the stable leading-edge vortex and lift force in the present Re regime above 254. The drawback of the present experiment is the difficulty in satisfying the condition on M due to the limitation of available solid materials. The novelty of the present analysis is to complement this drawback with the computation. We analyze the following two cases: (a) the equilibrium between the wing's elastic and fluid forces is dynamically similar to that of the actual insect, while the wing's inertial force can be ignored; (b) all forces are dynamically similar to those of the actual insect. From the comparison between the results of cases (a) and (b), we evaluate the contributions of the equilibrium between the aerodynamic and the wing's elastic forces and of the wing's inertial force to the passive pitching motion as 80-90% and 10-20%, respectively. It follows from these results that the dipteran passive pitching motion will be based on the equilibrium between the wing's elastic and aerodynamic forces, while it will be enhanced by the wing's inertial force.
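A minimal sketch of the four dimensionless groups named above, computed from invented wing parameters; the definitions follow one common convention for a lumped torsional-flexibility wing model and may differ in detail from those used in the paper.

```python
# Hedged sketch: Re, St, mass ratio M, and Cauchy number Ch from invented
# wing parameters. Definitions follow one common convention and are not
# necessarily those of the cited study.
import math

rho_f = 1.2          # air density [kg/m^3]
mu    = 1.8e-5       # air dynamic viscosity [Pa s]
c     = 3.0e-3       # chord length [m]
span  = 1.0e-2       # wing length [m]
f     = 150.0        # flapping frequency [Hz]
amp   = 2.5e-3       # flapping amplitude at the chord [m]
m_w   = 5.0e-8       # wing mass [kg]
k_t   = 1.0e-6       # torsional stiffness at the wing root [N m / rad]

U  = 2.0 * math.pi * f * amp            # characteristic flapping speed
Re = rho_f * U * c / mu                 # fluid inertia vs. viscosity
St = f * amp / U                        # unsteadiness of the stroke
M  = m_w / (rho_f * c**2 * span)        # wing inertia vs. fluid inertia
Ch = rho_f * U**2 * c**2 * span / k_t   # fluid force vs. elastic restoring torque

print(f"Re={Re:.0f}  St={St:.3f}  M={M:.2f}  Ch={Ch:.3f}")
```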
Thon, Anika; Teichgräber, Ulf; Tennstedt-Schenk, Cornelia; Hadjidemetriou, Stathis; Winzler, Sven; Malich, Ansgar; Papageorgiou, Ismini
2017-01-01
Prostate cancer (PCa) diagnosis by means of multiparametric magnetic resonance imaging (mpMRI) is a current challenge for the development of computer-aided detection (CAD) tools. An innovative CAD software (Watson Elementary™) was proposed to achieve high sensitivity and specificity, as well as to allege a correlate to Gleason grade. To assess the performance of Watson Elementary™ in automated PCa diagnosis in our hospital's database of MRI-guided prostate biopsies, the evaluation was retrospective for 104 lesions (47 PCa, 57 benign) from 79 patients (64.61±6.64 years old) using 3T T2-weighted imaging, Apparent Diffusion Coefficient (ADC) maps, and dynamic contrast enhancement series. Watson Elementary™ utilizes signal intensity, diffusion properties, and kinetic profile to compute a proportional Gleason grade predictor, termed the Malignancy Attention Index (MAI). The analysis focused on (i) the CAD sensitivity and specificity in classifying suspect lesions and (ii) the MAI correlation with the histopathological ground truth. The software revealed a sensitivity of 46.80% for PCa classification. The specificity for PCa was found to be 75.43%, with a positive predictive value of 61.11%, a negative predictive value of 63.23%, and a false discovery rate of 38.89%. CAD classified PCa and benign lesions with equal probability (P = 0.06, χ2 test). Accordingly, receiver operating characteristic analysis suggests a poor predictive value for MAI, with an area under the curve of 0.65 (P = 0.02), which is not superior to the performance of board-certified observers. Moreover, MAI revealed no significant correlation with Gleason grade (P = 0.60, Pearson's correlation). The tested CAD software for mpMRI analysis was a weak PCa biomarker in this dataset. Targeted prostate biopsy and histology remain the gold standard for prostate cancer diagnosis.
Computational analysis of nonlinearities within dynamics of cable-based driving systems
NASA Astrophysics Data System (ADS)
Anghelache, G. D.; Nastac, S.
2017-08-01
This paper deals with the computational nonlinear dynamics of mechanical systems containing flexural parts within the actuating scheme, with particular attention to cable-based driving systems. Both functional nonlinearities and the real characteristic of the power supply were assumed, in order to obtain a realistic computer simulation model able to provide feasible results regarding the system dynamics. Transient and steady regimes during a regular operating cycle were taken into account. The authors present a particular case of a lift system, taken as representative for the objective of this study. The simulations were based on values of the essential parameters acquired from experimental tests and/or regular practice in the field. The analysis of the results and the final discussion reveal correlated dynamic aspects of the mechanical parts, the driving system, and the power supply, all of which are potential sources of particular resonances within some transient phases of the working cycle that can affect structural and functional dynamics. In addition, the influence of the computational hypotheses on both the quantitative and qualitative behaviour of the system is underlined. The most significant consequence of this theoretical and computational research is the development of a unified and feasible model, useful for identifying the nonlinear dynamic effects in systems with a cable-based driving scheme and thereby helping to optimize the operating regime, including dynamics control measures.
2012-01-01
Background: Despite computational challenges, elucidating conformations that a protein system assumes under physiologic conditions for the purpose of biological activity is a central problem in computational structural biology. While these conformations are associated with low energies in the energy surface that underlies the protein conformational space, few existing conformational search algorithms focus on explicitly sampling low-energy local minima in the protein energy surface. Methods: This work proposes a novel probabilistic search framework, PLOW, that explicitly samples low-energy local minima in the protein energy surface. The framework combines algorithmic ingredients from evolutionary computation and computational structural biology to effectively explore the subspace of local minima. A greedy local search maps a conformation sampled in conformational space to a nearby local minimum. A perturbation move jumps out of a local minimum to obtain a new starting conformation for the greedy local search. The process repeats in an iterative fashion, resulting in a trajectory-based exploration of the subspace of local minima. Results and conclusions: The analysis of PLOW's performance shows that, by navigating only the subspace of local minima, PLOW is able to sample conformations near a protein's native structure, either more effectively than or as well as state-of-the-art methods that focus on reproducing the native structure for a protein system. Analysis of the actual subspace of local minima shows that PLOW samples this subspace more effectively than a naive sampling approach. Additional theoretical analysis reveals that the perturbation function employed by PLOW is key to its ability to sample a diverse set of low-energy conformations. This analysis also suggests directions for further research and novel applications for the proposed framework. PMID:22759582
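The iterate-perturb-minimize loop described above is closely related to basin hopping; a minimal sketch on a toy two-dimensional energy surface (not a protein energy function and not the PLOW implementation) follows.

```python
# Hedged sketch of the perturb-then-greedily-minimize loop described above,
# demonstrated on a toy 2-D energy surface. This is not the PLOW code.
import numpy as np

rng = np.random.default_rng(1)

def energy(x):
    # Rugged toy surface: a quadratic bowl plus oscillations -> many local minima.
    return 0.1 * np.sum(x**2) + np.sum(np.sin(3.0 * x) ** 2)

def greedy_local_search(x, step=0.05, iters=500):
    e = energy(x)
    for _ in range(iters):
        trial = x + rng.normal(scale=step, size=x.shape)
        e_trial = energy(trial)
        if e_trial < e:            # accept only downhill moves
            x, e = trial, e_trial
    return x, e

x = rng.uniform(-4, 4, size=2)       # random starting "conformation"
minima = []
for _ in range(20):                   # trajectory over the subspace of minima
    x, e = greedy_local_search(x)
    minima.append((e, x.copy()))
    x = x + rng.normal(scale=1.0, size=x.shape)   # perturbation jump

best_e, best_x = min(minima, key=lambda m: m[0])
print(f"sampled {len(minima)} local minima; lowest energy {best_e:.3f} at {best_x.round(2)}")
```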
Puniya, Bhanwar Lal; Allen, Laura; Hochfelder, Colleen; Majumder, Mahbubul; Helikar, Tomáš
2016-01-01
Dysregulation in signal transduction pathways can lead to a variety of complex disorders, including cancer. Computational approaches such as network analysis are important tools to understand system dynamics as well as to identify critical components that could be further explored as therapeutic targets. Here, we performed perturbation analysis of a large-scale signal transduction model in extracellular environments that stimulate cell death, growth, motility, and quiescence. Each of the model’s components was perturbed under both loss-of-function and gain-of-function mutations. Using 1,300 simulations under both types of perturbations across various extracellular conditions, we identified the most and least influential components based on the magnitude of their influence on the rest of the system. Based on the premise that the most influential components might serve as better drug targets, we characterized them for biological functions, housekeeping genes, essential genes, and druggable proteins. The most influential components under all environmental conditions were enriched with several biological processes. The inositol pathway was found as most influential under inactivating perturbations, whereas the kinase and small lung cancer pathways were identified as the most influential under activating perturbations. The most influential components were enriched with essential genes and druggable proteins. Moreover, known cancer drug targets were also classified in influential components based on the affected components in the network. Additionally, the systemic perturbation analysis of the model revealed a network motif of most influential components which affect each other. Furthermore, our analysis predicted novel combinations of cancer drug targets with various effects on other most influential components. We found that the combinatorial perturbation consisting of PI3K inactivation and overactivation of IP3R1 can lead to increased activity levels of apoptosis-related components and tumor-suppressor genes, suggesting that this combinatorial perturbation may lead to a better target for decreasing cell proliferation and inducing apoptosis. Finally, our approach shows a potential to identify and prioritize therapeutic targets through systemic perturbation analysis of large-scale computational models of signal transduction. Although some components of the presented computational results have been validated against independent gene expression data sets, more laboratory experiments are warranted to more comprehensively validate the presented results. PMID:26904540
fMRI Analysis-by-Synthesis Reveals a Dorsal Hierarchy That Extracts Surface Slant.
Ban, Hiroshi; Welchman, Andrew E
2015-07-08
The brain's skill in estimating the 3-D orientation of viewed surfaces supports a range of behaviors, from placing an object on a nearby table, to planning the best route when hill walking. This ability relies on integrating depth signals across extensive regions of space that exceed the receptive fields of early sensory neurons. Although hierarchical selection and pooling is central to understanding of the ventral visual pathway, the successive operations in the dorsal stream are poorly understood. Here we use computational modeling of human fMRI signals to probe the computations that extract 3-D surface orientation from binocular disparity. To understand how representations evolve across the hierarchy, we developed an inference approach using a series of generative models to explain the empirical fMRI data in different cortical areas. Specifically, we simulated the responses of candidate visual processing algorithms and tested how well they explained fMRI responses. Thereby we demonstrate a hierarchical refinement of visual representations moving from the representation of edges and figure-ground segmentation (V1, V2) to spatially extensive disparity gradients in V3A. We show that responses in V3A are little affected by low-level image covariates, and have a partial tolerance to the overall depth position. Finally, we show that responses in V3A parallel perceptual judgments of slant. This reveals a relatively short computational hierarchy that captures key information about the 3-D structure of nearby surfaces, and more generally demonstrates an analysis approach that may be of merit in a diverse range of brain imaging domains. Copyright © 2015 Ban and Welchman.
Time and learning efficiency in Internet-based learning: a systematic review and meta-analysis.
Cook, David A; Levinson, Anthony J; Garside, Sarah
2010-12-01
Authors have claimed that Internet-based instruction promotes greater learning efficiency than non-computer methods. The purpose of this review was to determine, through a systematic synthesis of evidence in health professions education, how Internet-based instruction compares with non-computer instruction in time spent learning, and what features of Internet-based instruction are associated with improved learning efficiency. We searched databases including MEDLINE, CINAHL, EMBASE, and ERIC from 1990 through November 2008. For study selection and data abstraction, we included all studies quantifying learning time for Internet-based instruction for health professionals, compared with other instruction; reviewers worked independently, in duplicate, to abstract information on interventions, outcomes, and study design. We identified 20 eligible studies. Random effects meta-analysis of 8 studies comparing Internet-based with non-Internet instruction (positive numbers indicating Internet longer) revealed a pooled effect size (ES) for time of -0.10 (p = 0.63). Among comparisons of two Internet-based interventions, providing feedback adds time (ES 0.67, p = 0.003, two studies), and greater interactivity generally takes longer (ES 0.25, p = 0.089, five studies). One study demonstrated that adapting to learner prior knowledge saves time without significantly affecting knowledge scores. Other studies revealed that audio narration, video clips, interactive models, and animations increase learning time but also facilitate higher knowledge and/or satisfaction. Across all studies, time correlated positively with knowledge outcomes (r = 0.53, p = 0.021). On average, Internet-based instruction and non-computer instruction require similar time. Instructional strategies to enhance feedback and interactivity typically prolong learning time, but in many cases also enhance learning outcomes. Isolated examples suggest potential for improving efficiency in Internet-based instruction.
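A minimal sketch of the random-effects pooling behind an effect size such as -0.10, using the DerSimonian-Laird estimator on invented study-level data rather than the studies in the review.

```python
# Hedged sketch: DerSimonian-Laird random-effects pooling of effect sizes.
# The effect sizes and variances below are invented, not the reviewed studies.
import numpy as np

es = np.array([-0.30, 0.10, -0.05, 0.25, -0.40])    # per-study effect sizes
var = np.array([0.04, 0.09, 0.02, 0.06, 0.05])       # per-study variances

w_fixed = 1.0 / var
fixed_mean = np.sum(w_fixed * es) / w_fixed.sum()
q = np.sum(w_fixed * (es - fixed_mean) ** 2)          # heterogeneity statistic Q
df = len(es) - 1
c = w_fixed.sum() - np.sum(w_fixed**2) / w_fixed.sum()
tau2 = max(0.0, (q - df) / c)                         # between-study variance

w_rand = 1.0 / (var + tau2)
pooled = np.sum(w_rand * es) / w_rand.sum()
se = np.sqrt(1.0 / w_rand.sum())
print(f"tau^2={tau2:.3f}  pooled ES={pooled:.3f}  95% CI "
      f"[{pooled - 1.96 * se:.3f}, {pooled + 1.96 * se:.3f}]")
```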
A CS1 pedagogical approach to parallel thinking
NASA Astrophysics Data System (ADS)
Rague, Brian William
Almost all collegiate programs in Computer Science offer an introductory course in programming primarily devoted to communicating the foundational principles of software design and development. The ACM designates this introduction to computer programming course for first-year students as CS1, during which methodologies for solving problems within a discrete computational context are presented. Logical thinking is highlighted, guided primarily by a sequential approach to algorithm development and made manifest by typically using the latest, commercially successful programming language. In response to the most recent developments in accessible multicore computers, instructors of these introductory classes may wish to include training on how to design workable parallel code. Novel issues arise when programming concurrent applications which can make teaching these concepts to beginning programmers a seemingly formidable task. Student comprehension of design strategies related to parallel systems should be monitored to ensure an effective classroom experience. This research investigated the feasibility of integrating parallel computing concepts into the first-year CS classroom. To quantitatively assess student comprehension of parallel computing, an experimental educational study using a two-factor mixed group design was conducted to evaluate two instructional interventions in addition to a control group: (1) topic lecture only, and (2) topic lecture with laboratory work using a software visualization Parallel Analysis Tool (PAT) specifically designed for this project. A new evaluation instrument developed for this study, the Perceptions of Parallelism Survey (PoPS), was used to measure student learning regarding parallel systems. The results from this educational study show a statistically significant main effect among the repeated measures, implying that student comprehension levels of parallel concepts as measured by the PoPS improve immediately after the delivery of any initial three-week CS1 level module when compared with student comprehension levels just prior to starting the course. Survey results measured during the ninth week of the course reveal that performance levels remained high compared to pre-course performance scores. A second result produced by this study reveals no statistically significant interaction effect between the intervention method and student performance as measured by the evaluation instrument over three separate testing periods. However, visual inspection of survey score trends and the low p-value generated by the interaction analysis (0.062) indicate that further studies may verify improved concept retention levels for the lecture w/PAT group.
Computer based imaging and analysis of root gravitropism
NASA Technical Reports Server (NTRS)
Evans, M. L.; Ishikawa, H.
1997-01-01
Two key issues in studies of the nature of the gravitropic response in roots have been the determination of the precise pattern of differential elongation responsible for downward bending and the identification of the cells that show the initial motor response. The main approach for examining patterns of differential growth during root gravitropic curvature has been to apply markers to the root surface and photograph the root at regular intervals during gravitropic curvature. Although these studies have provided valuable information on the characteristics of the gravitropic motor response in roots, their labor intensive nature limits sample size and discourages both high frequency of sampling and depth of analysis of surface expansion data. In this brief review we describe the development of computer-based video analysis systems for automated measurement of root growth and shape change and discuss some key features of the root gravitropic response that have been revealed using this methodology. We summarize the capabilities of several new pieces of software designed to measure growth and shape changes in graviresponding roots and describe recent progress in developing analysis systems for studying the small, but experimentally popular, primary roots of Arabidopsis. A key finding revealed by such studies is that the initial gravitropic response of roots of maize and Arabidopsis occurs in the distal elongation zone (DEZ) near the root apical meristem, not in the main elongation zone. Another finding is that the initiation of rapid elongation in the DEZ following gravistimulation appears to be related to rapid membrane potential changes in this region of the root. These observations have provided the incentive for ongoing studies examining possible links between potential growth modifying factors (auxin, calcium, protons) and gravistimulated changes in membrane potential and growth patterns in the DEZ.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holland, Troy; Bhat, Sham; Marcy, Peter
Oxy-fired coal combustion is a promising potential carbon capture technology. Predictive computational fluid dynamics (CFD) simulations are valuable tools in evaluating and deploying oxyfuel and other carbon capture technologies, either as retrofit technologies or for new construction. However, accurate predictive combustor simulations require physically realistic submodels with low computational requirements. A recent sensitivity analysis of a detailed char conversion model (Char Conversion Kinetics (CCK)) found thermal annealing to be an extremely sensitive submodel. In the present work, further analysis of the previous annealing model revealed significant disagreement with numerous datasets from experiments performed after that annealing model was developed. The annealing model was accordingly extended to reflect the experimentally observed reactivity loss due to thermal annealing for a variety of coals under diverse char preparation conditions. The model extension was informed by a Bayesian calibration analysis. In addition, since oxyfuel conditions include extraordinarily high levels of CO2, the development of a first-ever CO2 reactivity loss model due to annealing is presented.
Quantifying the development of user-generated art during 2001–2010
Yazdani, Mehrdad; Chow, Jay; Manovich, Lev
2017-01-01
One of the main questions in the humanities is how cultures and artistic expressions change over time. While a number of researchers have used quantitative computational methods to study historical changes in literature, music, and cinema, our paper offers the first quantitative analysis of historical changes in visual art created by users of a social online network. We propose a number of computational methods for the analysis of temporal development of art images. We then apply these methods to a sample of 270,000 artworks created between 2001 and 2010 by users of the largest social network for art—DeviantArt (www.deviantart.com). We investigate changes in subjects, techniques, sizes, proportions and also selected visual characteristics of images. Because these artworks are classified by their creators into two general categories—Traditional Art and Digital Art—we are also able to investigate if the use of digital tools has had a significant effect on the content and form of artworks. Our analysis reveals a number of gradual and systematic changes over a ten-year period in artworks belonging to both categories. PMID:28792494
Statistical Analysis of the First Passage Path Ensemble of Jump Processes
NASA Astrophysics Data System (ADS)
von Kleist, Max; Schütte, Christof; Zhang, Wei
2018-02-01
The transition mechanism of jump processes between two different subsets in state space reveals important dynamical information of the processes and therefore has attracted considerable attention in the past years. In this paper, we study the first passage path ensemble of both discrete-time and continuous-time jump processes on a finite state space. The main approach is to divide each first passage path into nonreactive and reactive segments and to study them separately. The analysis can be applied to jump processes which are non-ergodic, as well as continuous-time jump processes where the waiting time distributions are non-exponential. In the particular case that the jump processes are both Markovian and ergodic, our analysis elucidates the relations between the study of the first passage paths and the study of the transition paths in transition path theory. We provide algorithms to numerically compute statistics of the first passage path ensemble. The computational complexity of these algorithms scales with the complexity of solving a linear system, for which efficient methods are available. Several examples demonstrate the wide applicability of the derived results across research areas.
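A minimal sketch of the "linear system" computation mentioned above: the mean first passage time into a target set for a small continuous-time jump process, using an invented generator matrix.

```python
# Hedged sketch: first-passage statistics via a linear solve, here the mean
# first passage time (MFPT) into a target set for a small continuous-time
# Markov jump process. The generator matrix below is invented.
import numpy as np

# Generator (rate) matrix Q of a 4-state jump process: rows sum to zero.
Q = np.array([
    [-1.0,  0.7,  0.3,  0.0],
    [ 0.5, -1.2,  0.5,  0.2],
    [ 0.1,  0.4, -0.9,  0.4],
    [ 0.0,  0.3,  0.6, -0.9],
])
target = [3]                                   # target set B
transient = [i for i in range(Q.shape[0]) if i not in target]

# The MFPTs tau satisfy  Q[transient, transient] @ tau = -1, with tau = 0 on B.
A = Q[np.ix_(transient, transient)]
tau = np.linalg.solve(A, -np.ones(len(transient)))
for state, t in zip(transient, tau):
    print(f"MFPT from state {state} to B: {t:.3f}")
```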
Scalable and cost-effective NGS genotyping in the cloud.
Souilmi, Yassine; Lancaster, Alex K; Jung, Jae-Yoon; Rizzo, Ettore; Hawkins, Jared B; Powles, Ryan; Amzazi, Saaïd; Ghazal, Hassan; Tonellato, Peter J; Wall, Dennis P
2015-10-15
While next-generation sequencing (NGS) costs have plummeted in recent years, the cost and complexity of computation remain substantial barriers to the use of NGS in routine clinical care. The clinical potential of NGS will not be realized until robust and routine whole genome sequencing data can be accurately rendered to medically actionable reports within a time window of hours and at scales of economy in the tens of dollars. We take a step towards addressing this challenge by using COSMOS, a cloud-enabled workflow management system, to develop GenomeKey, an NGS whole genome analysis workflow. COSMOS implements complex workflows making optimal use of high-performance compute clusters. Here we show that the Amazon Web Services (AWS) implementation of GenomeKey via COSMOS provides a fast, scalable, and cost-effective analysis of both public benchmarking and large-scale heterogeneous clinical NGS datasets. Our systematic benchmarking reveals important new insights and considerations for optimizing whole genome analysis and workflow management to achieve clinical turn-around, including strategic batching of individual genomes and efficient cluster resource configuration.
Al-Ruqaie, I.; Al-Khalifah, N.S.; Shanavaskhan, A.E.
2015-01-01
Varietal identification of olives is an intrinsic and empirical exercise owing to the large number of synonyms and homonyms, intensive exchange of genotypes, presence of varietal clones, and lack of proper certification in nurseries. A comparative study of morphological characters of eight olive cultivars grown in Saudi Arabia was carried out; analysis using the NTSYSpc (Numerical Taxonomy System for personal computer) system segregated the smaller-fruited cultivars into one clade and the rest into two clades. Koroneiki, a Greek cultivar with a small fruit, shared an arm with the Spanish variety Arbosana. Morphologic analysis using NTSYSpc revealed that biometrics of leaves, fruits, and seeds are reliable morphologic characters to distinguish between varieties, except for a few morphologically very similar olive cultivars. The proximate analysis showed significant variations in the protein, fiber, crude fat, ash, and moisture content of different cultivars. The study also showed that neither the size of the fruit nor the fruit pulp thickness is a limiting factor determining the crude fat content of olives. PMID:26858547
Optimization of a solid-state electron spin qubit using Gate Set Tomography
Dehollain, Juan P.; Muhonen, Juha T.; Blume-Kohout, Robin J.; ...
2016-10-13
Here, state-of-the-art qubit systems are reaching the gate fidelities required for scalable quantum computation architectures. Further improvements in the fidelity of quantum gates demand characterization and benchmarking protocols that are efficient, reliable, and extremely accurate. Ideally, a benchmarking protocol should also provide information on how to rectify residual errors. Gate Set Tomography (GST) is one such protocol, designed to give detailed characterization of as-built qubits. We implemented GST on a high-fidelity electron-spin qubit confined by a single 31P atom in 28Si. The results reveal systematic errors that a randomized benchmarking analysis could measure but not identify, whereas GST indicated the need for improved calibration of the length of the control pulses. After introducing this modification, we measured a new benchmark average gate fidelity of 99.942(8)%, an improvement on the previous value of 99.90(2)%. Furthermore, GST revealed high levels of non-Markovian noise in the system, which will need to be understood and addressed when the qubit is used within a fault-tolerant quantum computation scheme.
Analysis of film cooling in rocket nozzles
NASA Technical Reports Server (NTRS)
Woodbury, Keith A.; Karr, Gerald R.
1992-01-01
Progress during the reporting period is summarized. Analysis of film cooling in rocket nozzles by computational fluid dynamics (CFD) computer codes is desirable for two reasons. First, it allows prediction of resulting flow fields within the rocket nozzle, in particular the interaction of the coolant boundary layer with the main flow. This facilitates evaluation of potential cooling configurations with regard to total thrust, etc., before construction and testing of any prototype. Secondly, CFD simulation of film cooling allows for assessment of the effectiveness of the proposed cooling in limiting nozzle wall temperature rises. This latter objective is the focus of the current work. The desired objective is to use the Finite Difference Navier Stokes (FDNS) code to predict wall heat fluxes or wall temperatures in rocket nozzles. As prior work has revealed that the FDNS code is deficient in the thermal modeling of boundary conditions, the first step is to correct these deficiencies in the FDNS code. Next, these changes must be tested against available data. Finally, the code will be used to model film cooling of a particular rocket nozzle. The third task of this research, using the modified code to compute the flow of hot gases through a nozzle, is described.
Negative correlates of computer game play in adolescents.
Colwell, J; Payne, J
2000-08-01
There is some concern that playing computer games may be associated with social isolation, lowered self-esteem, and aggression among adolescents. Measures of these variables were included in a questionnaire completed by 204 year eight students at a North London comprehensive school. Principal components analysis of a scale to assess needs fulfilled by game play provided some support for the notion of 'electronic friendship' among boys, but there was no evidence that game play leads to social isolation. Play was not linked to self-esteem in girls, but a negative relationship was obtained between self-esteem and frequency of play in boys. However, self-esteem was not associated with total exposure to game play. Aggression scores were not related to the number of games with aggressive content named among three favourite games, but they were positively correlated with total exposure to game play. A multiple regression analysis revealed that sex and total game play exposure each accounted for a significant but small amount of the variance in aggression scores. The positive correlation between playing computer games and aggression provides some justification for further investigation of the causal hypothesis, and possible methodologies are discussed.
Development of a New Methodology for Computing Surface Sensible Heat Fluxes using Thermal Imagery
NASA Astrophysics Data System (ADS)
Morrison, T. J.; Calaf, M.; Fernando, H. J.; Price, T. A.; Pardyjak, E.
2017-12-01
Current numerical weather prediction models utilize similarity theory to characterize momentum, moisture, and heat fluxes. Such formulations are only valid under the ideal assumptions of spatial homogeneity, statistical stationarity, and zero subsidence. However, recent surface temperature measurements from the Mountain Terrain Atmospheric Modeling and Observations (MATERHORN) Program on the Salt Flats of Utah's West Desert show that, even under the most a priori ideal conditions, heterogeneity of the aforementioned variables exists. We present a new method to extract spatially distributed measurements of surface sensible heat flux from thermal imagery. The approach consists of using a surface energy budget, where the ground heat flux is easily computed from limited measurements using a force-restore-type methodology, the latent heat fluxes are neglected, and the energy storage is computed using a lumped capacitance model. Preliminary validation of the method is presented using experimental data acquired from a nearby sonic anemometer during the MATERHORN campaign. Additional evaluation is required to confirm the method's validity. Further decomposition analysis of on-site instrumentation (thermal camera, cold-hotwire probes, and sonic anemometers) using Proper Orthogonal Decomposition (POD) and wavelet analysis reveals time-scale similarity between the flow and surface fluctuations.
Determining protein function and interaction from genome analysis
Eisenberg, David; Marcotte, Edward M.; Thompson, Michael J.; Pellegrini, Matteo; Yeates, Todd O.
2004-08-03
A computational method, system, and computer program are provided for inferring functional links from genome sequences. One method is based on the observation that some pairs of proteins A' and B' have homologs in another organism fused into a single protein chain AB. A trans-genome comparison of sequences can reveal these AB sequences, which are Rosetta Stone sequences because they decipher an interaction between A' and B'. Another method compares the genomic sequences of two or more organisms to create a phylogenetic profile for each protein, indicating its presence or absence across all the genomes. The profile provides information regarding functional links between different families of proteins. In yet another method, a combination of the above two methods is used to predict functional links.
Assigning protein functions by comparative genome analysis protein phylogenetic profiles
Pellegrini, Matteo; Marcotte, Edward M.; Thompson, Michael J.; Eisenberg, David; Grothe, Robert; Yeates, Todd O.
2003-05-13
A computational method, system, and computer program are provided for inferring functional links from genome sequences. One method is based on the observation that some pairs of proteins A' and B' have homologs in another organism fused into a single protein chain AB. A trans-genome comparison of sequences can reveal these AB sequences, which are Rosetta Stone sequences because they decipher an interaction between A' and B'. Another method compares the genomic sequences of two or more organisms to create a phylogenetic profile for each protein, indicating its presence or absence across all the genomes. The profile provides information regarding functional links between different families of proteins. In yet another method, a combination of the above two methods is used to predict functional links.
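A minimal sketch of the phylogenetic-profile idea common to both patents, using invented presence/absence profiles across a handful of genomes: protein families with matching profiles are predicted to be functionally linked.

```python
# Hedged sketch of phylogenetic profiling: each protein family gets a binary
# presence/absence vector across genomes, and families with matching profiles
# are flagged as candidate functional partners. All profiles are invented.
from itertools import combinations

genomes = ["E.coli", "B.subtilis", "M.jannaschii", "S.cerevisiae", "H.pylori"]
profiles = {                      # 1 = homolog present in that genome
    "protA": (1, 1, 0, 1, 0),
    "protB": (1, 1, 0, 1, 0),     # identical profile to protA -> linked
    "protC": (0, 1, 1, 0, 1),
    "protD": (1, 0, 1, 1, 1),
}

def hamming(p, q):
    return sum(a != b for a, b in zip(p, q))

for (n1, p1), (n2, p2) in combinations(profiles.items(), 2):
    d = hamming(p1, p2)
    tag = "  <- candidate functional link" if d == 0 else ""
    print(f"{n1} vs {n2}: profile distance {d}{tag}")
```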
FACE computer simulation. [Flexible Arm Controls Experiment
NASA Technical Reports Server (NTRS)
Sadeh, Willy Z.; Szmyd, Jeffrey A.
1990-01-01
A computer simulation of the FACE (Flexible Arm Controls Experiment) was conducted to assess its design for use in the Space Shuttle. The FACE is to be a 14-ft-long articulated structure with 4 degrees of freedom, consisting of shoulder pitch and yaw, elbow pitch, and wrist pitch. The kinematics of the FACE was simulated to obtain data on arm operation, function, workspace, and interaction. Payload capture ability was modeled. The simulation indicates the capability for detailed kinematic simulation and payload capture analysis, and the feasibility of real-time simulation was determined. In addition, the potential for interactive real-time training through integration of the simulation with various interface controllers was revealed. At this stage, the flexibility of the arm was not yet considered.
Functional relevance of neurotransmitter receptor heteromers in the central nervous system.
Ferré, Sergi; Ciruela, Francisco; Woods, Amina S; Lluis, Carme; Franco, Rafael
2007-09-01
The existence of neurotransmitter receptor heteromers is becoming broadly accepted and their functional significance is being revealed. Heteromerization of neurotransmitter receptors produces functional entities that possess different biochemical characteristics with respect to the individual components of the heteromer. Neurotransmitter receptor heteromers can function as processors of computations that modulate cell signaling. Thus, the quantitative or qualitative aspects of the signaling generated by stimulation of any of the individual receptor units in the heteromer are different from those obtained during coactivation. Furthermore, recent studies demonstrate that some neurotransmitter receptor heteromers can exert an effect as processors of computations that directly modulate both pre- and postsynaptic neurotransmission. This is illustrated by the analysis of striatal receptor heteromers that control striatal glutamatergic neurotransmission.
Computational pathology: Exploring the spatial dimension of tumor ecology.
Nawaz, Sidra; Yuan, Yinyin
2016-09-28
Tumors are evolving ecosystems where cancer subclones and the microenvironment interact. This is analogous to interaction dynamics between species in their natural habitats, which is a prime area of study in ecology. Spatial statistics are frequently used in ecological studies to infer complex relations including predator-prey, resource dependency and co-evolution. Recently, the emerging field of computational pathology has enabled high-throughput spatial analysis by using image processing to identify different cell types and their locations within histological tumor samples. We discuss how these data may be analyzed with spatial statistics used in ecology to reveal patterns and advance our understanding of ecological interactions occurring among cancer cells and their microenvironment. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd.. All rights reserved.
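A minimal sketch of one such ecological spatial statistic: the mean distance from each tumour cell to its nearest lymphocyte, compared against a random relabelling null model, computed on simulated coordinates rather than real histology data.

```python
# Hedged sketch: a simple cross-type spatial statistic of the kind used in
# ecology, applied to simulated (not real) cell centroids.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(42)
tumour = rng.uniform(0, 1000, size=(300, 2))        # cell centroids in microns
lymphocytes = rng.uniform(0, 1000, size=(150, 2))

# Observed statistic: mean nearest-lymphocyte distance from each tumour cell.
observed = cKDTree(lymphocytes).query(tumour)[0].mean()

# Null model: shuffle the cell-type labels and recompute the statistic.
pooled = np.vstack([tumour, lymphocytes])
null = []
for _ in range(200):
    rng.shuffle(pooled)
    null.append(cKDTree(pooled[300:]).query(pooled[:300])[0].mean())

print(f"observed mean NN distance: {observed:.1f} um")
print(f"null mean +/- sd: {np.mean(null):.1f} +/- {np.std(null):.1f} um")
```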
Privacy-preserving microbiome analysis using secure computation.
Wagner, Justin; Paulson, Joseph N; Wang, Xiao; Bhattacharjee, Bobby; Corrada Bravo, Héctor
2016-06-15
Developing targeted therapeutics and identifying biomarkers relies on large amounts of research participant data. Beyond human DNA, scientists now investigate the DNA of micro-organisms inhabiting the human body. Recent work shows that an individual's collection of microbial DNA consistently identifies that person and could be used to link a real-world identity to a sensitive attribute in a research dataset. Unfortunately, the current suite of DNA-specific privacy-preserving analysis tools does not meet the requirements for microbiome sequencing studies. To address privacy concerns around microbiome sequencing, we implement metagenomic analyses using secure computation. Our implementation allows comparative analysis over combined data without revealing the feature counts for any individual sample. We focus on three analyses and perform an evaluation on datasets currently used by the microbiome research community. We use our implementation to simulate sharing data between four policy-domains. Additionally, we describe an application of our implementation for patients to combine data that allows drug developers to query against and compensate patients for the analysis. The software is freely available for download at: http://cbcb.umd.edu/∼hcorrada/projects/secureseq.html Supplementary data are available at Bioinformatics online. hcorrada@umiacs.umd.edu. © The Author 2016. Published by Oxford University Press.
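A toy illustration of the secure-aggregation idea (additive secret sharing), not the protocol or code used by the paper; the per-site feature counts below are invented.

```python
# Hedged toy illustration: each site splits its private feature count into
# random additive shares modulo a large prime, so a combined total can be
# reconstructed without any single party's count being revealed.
# This is not the protocol or implementation described in the paper.
import secrets

PRIME = 2**61 - 1   # arithmetic is done modulo a large prime

def share(value, n_parties):
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

# Private per-site counts of one microbial feature (invented numbers).
site_counts = [1520, 87, 403]
n = len(site_counts)

# Each site distributes one share to every party (including itself).
all_shares = [share(c, n) for c in site_counts]

# Each party sums the shares it received; only these partial sums are exchanged.
partial_sums = [sum(all_shares[site][p] for site in range(n)) % PRIME
                for p in range(n)]

combined = sum(partial_sums) % PRIME
print("combined feature count:", combined)   # 2010, without revealing any input
```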
Mast, Fred D.; Ratushny, Alexander V.
2014-01-01
Systems cell biology melds high-throughput experimentation with quantitative analysis and modeling to understand many critical processes that contribute to cellular organization and dynamics. Recently, there have been several advances in technology and in the application of modeling approaches that enable the exploration of the dynamic properties of cells. Merging technology and computation offers an opportunity to objectively address unsolved cellular mechanisms, and has revealed emergent properties and helped to gain a more comprehensive and fundamental understanding of cell biology. PMID:25225336
Thruput Analysis of AFLC CYBER 73 Computers.
1981-12-01
Ref 2:14). This decision permitted a fast conversion effort with minimum programmer/analyst experience (Ref 34). Recently, as the conversion effort...converted (Ref 1:2). Moreover, many of the large data-file and machine-time-consuming systems were not included in the earlier...by LMT personnel revealed that during certain periods, i.e., 0000-0800, the machine is normally reserved for the large resource-consuming programs
Cardoso, Flávia G R; Ferreira, Nádia S; Martinho, Frederico C; Nascimento, Gustavo G; Manhães, Luiz R C; Rocco, Marco A; Carvalho, Cláudio A T; Valera, Marcia C
2015-07-01
This clinical study was conducted to correlate the levels of endotoxins and bacterial counts found in primary endodontic infection with the volume of periapical bone destruction determined by cone-beam computed tomography (CBCT) analysis. Moreover, the levels of bacteria and endotoxins were correlated with the development of clinical features. Twenty-four root canals with primary endodontic disease and apical periodontitis were selected. Clinical features such as pain on palpation, pain on percussion, and previous episode of pain were recorded. The volume (cubic millimeters) of periapical bone destruction was determined by CBCT analysis. Endotoxin and bacterial samples were collected using sterile/apyrogenic paper points. Endotoxins were quantified by using the limulus amebocyte lysate assay (KQCL test), and bacterial counts (colony-forming units [CFU]/mL) were determined by using anaerobic culture techniques. Data were analyzed by Pearson correlation and multiple logistic regression (P < .05). Endotoxins and bacteria were detected in 100% of the root canal samples (24 of 24), with median values of 10.92 endotoxin units (EU)/mL (1.75-128 EU/mL) and 7.5 × 10⁵ CFU/mL (3.20 × 10⁵-8.16 × 10⁶ CFU/mL), respectively. The median volume of bone destruction determined by CBCT analysis was 100 mm³ (10-450 mm³). The multiple regression analysis revealed a positive correlation between higher levels of endotoxins present in root canal infection and larger volume of bone destruction (P < .05). Moreover, higher levels of endotoxins were also correlated with the presence of previous pain (P < .05). Our findings revealed that the levels of endotoxins found in root canal infection are related to the volume of periapical bone destruction determined by CBCT analysis. Moreover, the levels of endotoxin are related to the presence of previous pain. Copyright © 2015 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
Efficient Characterization of Protein Cavities within Molecular Simulation Trajectories: trj_cavity.
Paramo, Teresa; East, Alexandra; Garzón, Diana; Ulmschneider, Martin B; Bond, Peter J
2014-05-13
Protein cavities and tunnels are critical in determining phenomena such as ligand binding, molecular transport, and enzyme catalysis. Molecular dynamics (MD) simulations enable the exploration of the flexibility and conformational plasticity of protein cavities, extending the information available from static experimental structures relevant to, for example, drug design. Here, we present a new tool (trj_cavity) implemented within the GROMACS ( www.gromacs.org ) framework for the rapid identification and characterization of cavities detected within MD trajectories. trj_cavity is optimized for usability and computational efficiency and is applicable to the time-dependent analysis of any cavity topology, and optional specialized descriptors can be used to characterize, for example, protein channels. Its novel grid-based algorithm performs an efficient neighbor search whose calculation time is linear with system size, and a comparison of performance with other widely used cavity analysis programs reveals an orders-of-magnitude improvement in the computational cost. To demonstrate its potential for revealing novel mechanistic insights, trj_cavity has been used to analyze long-time scale simulation trajectories for three diverse protein cavity systems. This has helped to reveal, respectively, the lipid binding mechanism in the deep hydrophobic cavity of a soluble mite-allergen protein, Der p 2; a means for shuttling carbohydrates between the surface-exposed substrate-binding and catalytic pockets of a multidomain, membrane-proximal pullulanase, PulA; and the structural basis for selectivity in the transmembrane pore of a voltage-gated sodium channel (NavMs), embedded within a lipid bilayer environment. trj_cavity is available for download under an open-source license ( http://sourceforge.net/projects/trjcavity ). A simplified, GROMACS-independent version may also be compiled.
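The abstract attributes trj_cavity's speed to a grid-based neighbour search that scales linearly with system size. The sketch below shows the general flavour of such an approach, binning atoms into voxels once and then flagging unoccupied voxels as candidate cavity space; the spacing, box size, and coordinates are arbitrary, and this is not the actual trj_cavity algorithm.

```python
import numpy as np

def empty_grid_cells(atom_xyz, box_min, box_max, spacing=1.4):
    """Flag grid voxels not occupied by any atom, a rough stand-in for candidate cavity voxels.

    Atoms are binned into voxels in a single pass (linear in the number of atoms),
    so occupancy lookups are O(1) rather than a scan over all atoms.
    """
    occupied = set()
    for p in atom_xyz:
        occupied.add(tuple(((p - box_min) // spacing).astype(int)))

    shape = np.ceil((box_max - box_min) / spacing).astype(int)
    return [(i, j, k)
            for i in range(shape[0])
            for j in range(shape[1])
            for k in range(shape[2])
            if (i, j, k) not in occupied]

# Hypothetical coordinates, e.g. protein atoms from one MD frame (angstroms).
rng = np.random.default_rng(1)
atoms = rng.uniform(0.0, 20.0, size=(500, 3))
voxels = empty_grid_cells(atoms, box_min=np.zeros(3), box_max=np.full(3, 20.0))
print(f"{len(voxels)} unoccupied candidate cavity voxels")
```

A real cavity detector would additionally require empty voxels to be enclosed by protein and would cluster them into connected pockets tracked over trajectory frames; that bookkeeping is omitted here.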
Food habits and nutrition education--computer aided analysis of data.
Wise, A; Liddell, J A; Lockie, G M
1987-04-01
Nutrition education messages should take into account the food habits of those who are to be educated. These can be revealed by computer analysis of weighed intake data, which has been collected for calculation of nutrient intakes. Seventy-six students and staff at Robert Gordon's Institute of Technology weighed their food for 1 week and the records were used to determine the frequency of consumption of foods and portion sizes, as well as nutrient intakes. There were only very minor relationships between the number of different foods chosen and nutritional variables. Nutrition students had successfully changed the frequency of consumption of certain foods relative to others and as a result consumed diets containing a lower proportion of energy from fat. Messages to non-nutrition students might profitably incorporate those beneficial changes that nutrition students had easily accomplished. This study revealed that certain (otherwise common and nutritionally unsound) food choices were not a major part of the subjects' habits, and could be given low priority in educational messages. It was suggested that foods exhibiting high variability of portion weight might be under greater individual control and hence more amenable to change. A study of the distribution of portion weights reveals information about the number of slices, biscuits, etc., taken in each portion. This varied for different kinds of biscuit. It was concluded that messages should target specific foods rather than stress variation in the diet. It is suggested that educators should consider whether messages would be more effective in terms of frequency of consumption or size of portion for particular groups. The meal distribution pattern also shows which foods are most commonly consumed at home or in the canteen, hence whether education might be best directed to the individual or the caterer, respectively.
Jinadatha, Chetan; Villamaria, Frank C; Coppin, John D; Dale, Charles R; Williams, Marjory D; Whitworth, Ryan; Stibich, Mark
2017-12-28
While research has demonstrated the importance of a clean health care environment, there is a lack of research on the role portable medical equipment (PME) plays in the transmission cycle of healthcare-acquired infections (HAIs). This study investigated the patterns and sequence of contact events among health care workers, patients, surfaces, and medical equipment in a hospital environment. Research staff observed patient care events over six different 24 h periods on six different hospital units. Each encounter was recorded as a sequence of events and analyzed using sequence analysis and visually represented by network plots. In addition, a point prevalence microbial sample was taken from the computer on wheels (COW). The most touched items during patient care were the individual patient (850), bedrail (375), bed surface (302), and bedside table (223). Three of the top ten most common subsequences included touching PME and the patient: computer on wheels ➔ patient (62 of 274 total sequences, 22.6%, contained this sequence), patient ➔ COW (20.4%), and patient ➔ IV pump (16.1%). The network plots revealed large interconnectedness among objects in the room, the patient, PME, and the healthcare worker. Our results demonstrated that PME such as the COW and the IV pump were two of the most highly touched items during patient care. Even with proper hand sanitization and personal protective equipment, this sequence analysis reveals the potential for contamination from the patient and environment, to a vector such as portable medical equipment, and ultimately to another patient in the hospital.
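To illustrate the kind of sequence analysis described above, the sketch below tallies adjacent touch transitions (e.g. COW ➔ patient) across recorded encounters and reports how many encounters contain each transition; the encounters listed are fabricated examples, not the study's observational data.

```python
from collections import Counter

def count_bigrams(sequences):
    """Count adjacent (item -> next item) transitions across observed contact sequences."""
    bigrams = Counter()
    for seq in sequences:
        bigrams.update(zip(seq, seq[1:]))
    return bigrams

# Hypothetical touch sequences recorded during three patient-care encounters.
encounters = [
    ["hand_sanitizer", "COW", "patient", "bedrail", "COW"],
    ["COW", "patient", "IV_pump", "patient", "bed_surface"],
    ["patient", "COW", "patient", "bedside_table"],
]

transitions = count_bigrams(encounters)
n_sequences = len(encounters)
for pair in [("COW", "patient"), ("patient", "COW"), ("patient", "IV_pump")]:
    containing = sum(1 for seq in encounters if any(t == pair for t in zip(seq, seq[1:])))
    print(pair, f"count={transitions[pair]}", f"in {containing}/{n_sequences} encounters")
```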
Atay, Christina; Conway, Erin R.; Angus, Daniel; Wiles, Janet; Baker, Rosemary; Chenery, Helen J.
2015-01-01
The progressive neuropathology involved in dementia frequently causes a gradual decline in communication skills. Communication partners who are unaware of the specific communication problems faced by people with dementia (PWD) can inadvertently challenge their conversation partner, leading to distress and a reduced flow of information between speakers. Previous research has produced an extensive literature base recommending strategies to facilitate conversational engagement in dementia. However, empirical evidence for the beneficial effects of these strategies on conversational dynamics is sparse. This study uses a time-efficient computational discourse analysis tool called Discursis to examine the link between specific communication behaviours and content-based conversational engagement in 20 conversations between PWD living in residential aged-care facilities and care staff members. Conversations analysed here were baseline conversations recorded before staff members underwent communication training. Care staff members spontaneously exhibited a wide range of facilitative and non-facilitative communication behaviours, which were coded for analysis of conversation dynamics within these baseline conversations. A hybrid approach combining manual coding and automated Discursis metric analysis provides two sets of novel insights. Firstly, this study revealed nine communication behaviours that, if used by the care staff member in a given turn, significantly increased the appearance of subsequent content-based engagement in the conversation by PWD. Secondly, the current findings reveal alignment between human- and computer-generated labelling of communication behaviour for 8 out of the total 22 behaviours under investigation. The approach demonstrated in this study provides an empirical procedure for the detailed evaluation of content-based conversational engagement associated with specific communication behaviours. PMID:26658135
Collins, Anne G E; Frank, Michael J
2018-03-06
Learning from rewards and punishments is essential to survival and facilitates flexible human behavior. It is widely appreciated that multiple cognitive and reinforcement learning systems contribute to decision-making, but the nature of their interactions is elusive. Here, we leverage methods for extracting trial-by-trial indices of reinforcement learning (RL) and working memory (WM) in human electro-encephalography to reveal single-trial computations beyond that afforded by behavior alone. Neural dynamics confirmed that increases in neural expectation were predictive of reduced neural surprise in the following feedback period, supporting central tenets of RL models. Within- and cross-trial dynamics revealed a cooperative interplay between systems for learning, in which WM contributes expectations to guide RL, despite competition between systems during choice. Together, these results provide a deeper understanding of how multiple neural systems interact for learning and decision-making and facilitate analysis of their disruption in clinical populations.
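The trial-by-trial RL indices mentioned above are typically derived from a delta-rule learner; the minimal sketch below produces per-trial expectations and signed prediction errors ("surprise") from hypothetical choice and reward sequences. It is a generic Rescorla-Wagner learner, not the authors' combined RL-working-memory model.

```python
import numpy as np

def delta_rule_indices(choices, rewards, n_options=3, alpha=0.3):
    """Per-trial expected value of the chosen option and reward prediction error.

    A plain delta-rule learner; such trial-by-trial indices are commonly
    regressed against neural signals.
    """
    q = np.full(n_options, 0.5)          # initial expectations
    expectation, surprise = [], []
    for choice, reward in zip(choices, rewards):
        expectation.append(q[choice])    # model-derived expectation before feedback
        rpe = reward - q[choice]         # signed prediction error ("surprise")
        surprise.append(rpe)
        q[choice] += alpha * rpe         # delta-rule update
    return np.array(expectation), np.array(surprise)

# Hypothetical behavioural data: which stimulus was chosen and whether it was rewarded.
choices = [0, 0, 1, 0, 2, 0, 1, 1]
rewards = [1, 1, 0, 1, 0, 1, 1, 0]
exp_t, rpe_t = delta_rule_indices(choices, rewards)
print(np.round(exp_t, 3))
print(np.round(rpe_t, 3))
```

Regressing such per-trial expectation and surprise series against feedback-locked neural signals is the usual way these indices are linked to electrophysiological data.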
Mukherjee, Anirban; Bal, Chandrasekhar; Tripathi, Madhavi; Das, Chandan Jyoti; Shamim, Shamim Ahmed
2017-01-01
A 44-year-old female with known primary myelofibrosis presented with shortness of breath. High-resolution computed tomography of the thorax revealed a large heterogeneously enhancing extraparenchymal soft tissue density mass involving bilateral lung fields. F-18-fluorodeoxyglucose (FDG) positron emission tomography/computed tomography revealed a mildly FDG-avid soft tissue density mass with specks of calcification involving bilateral lung fields, liver, and spleen. Subsequent histopathologic evaluation from the right lung mass was suggestive of extramedullary hematopoiesis. PMID:28533647
A study of the accuracy of neutrally buoyant bubbles used as flow tracers in air
NASA Technical Reports Server (NTRS)
Kerho, Michael F.
1993-01-01
Research has been performed to determine the accuracy of neutrally buoyant and near neutrally buoyant bubbles used as flow tracers in air. Theoretical, computational, and experimental results are presented to evaluate the dynamics of bubble trajectories and factors affecting their ability to trace flow-field streamlines. The equation of motion for a single bubble was obtained and evaluated using a computational scheme to determine the factors which affect a bubble's trajectory. A two-dimensional experiment was also conducted to experimentally determine bubble trajectories in the stagnation region of a NACA 0012 airfoil at 0 deg angle of attack using a commercially available helium bubble generation system. Physical properties of the experimental bubble trajectories were estimated using the computational scheme. These properties included the density ratio and diameter of the individual bubbles. The helium bubble system was then used to visualize and document the flow field about a 30 deg swept semispan wing with simulated glaze ice. Results were compared to Navier-Stokes calculations and surface oil flow visualization. The theoretical and computational analyses have shown that neutrally buoyant bubbles will trace even the most complex flow patterns. Experimental analysis revealed that the use of bubbles to trace flow patterns should be limited to qualitative measurements unless care is taken to ensure neutral buoyancy. This is due to the difficulty in the production of neutrally buoyant bubbles.
Aerothermodynamic Analysis of Commercial Experiment Transporter (COMET) Reentry Capsule
NASA Technical Reports Server (NTRS)
Wood, William A.; Gnoffo, Peter A.; Rault, Didier F. G.
1996-01-01
An aerothermodynamic analysis of the Commercial Experiment Transporter (COMET) reentry capsule has been performed using the laminar thin-layer Navier-Stokes solver Langley Aerothermodynamic Upwind Relaxation Algorithm. Flowfield solutions were obtained at Mach numbers 1.5, 2, 5, 10, 15, 20, 25, and 27.5. Axisymmetric and 5, 10, and 20 degree angles of attack were considered across the Mach-number range, with the Mach 25 conditions taken to 90 degrees angle of attack and the Mach 27.5 cases taken to 60 degrees angle of attack. Detailed surface heat-transfer rates were computed at Mach 20 and 25, revealing that heating rates on the heat-shield shoulder can exceed the stagnation-point heating by 230 percent. Finite-rate chemistry solutions were performed above Mach 10; otherwise perfect gas computations were made. Drag, lift, and pitching moment coefficients are computed and details of a wake flow are presented. The effect of including the wake in the solution domain was investigated and base pressure corrections to forebody drag coefficients were numerically determined for the lower Mach numbers. Pitching moment comparisons are made with direct simulation Monte Carlo results in the more rarefied flow at the highest Mach numbers, showing agreement within two percent. Thin-layer Navier-Stokes computations of the axial force are found to be 15 percent higher across the speed range than the empirical/Newtonian based results used during the initial trajectory analyses.
NASA Astrophysics Data System (ADS)
Cheng, Tian-Le; Ma, Fengde D.; Zhou, Jie E.; Jennings, Guy; Ren, Yang; Jin, Yongmei M.; Wang, Yu U.
2012-01-01
Diffuse scattering contains rich information on various structural disorders, thus providing a useful means to study the nanoscale structural deviations from the average crystal structures determined by Bragg peak analysis. Extraction of maximal information from diffuse scattering requires concerted efforts in high-quality three-dimensional (3D) data measurement, quantitative data analysis and visualization, theoretical interpretation, and computer simulations. Such an endeavor is undertaken to study the correlated dynamic atomic position fluctuations caused by thermal vibrations (phonons) in precursor state of shape-memory alloys. High-quality 3D diffuse scattering intensity data around representative Bragg peaks are collected by using in situ high-energy synchrotron x-ray diffraction and two-dimensional digital x-ray detector (image plate). Computational algorithms and codes are developed to construct the 3D reciprocal-space map of diffuse scattering intensity distribution from the measured data, which are further visualized and quantitatively analyzed to reveal in situ physical behaviors. Diffuse scattering intensity distribution is explicitly formulated in terms of atomic position fluctuations to interpret the experimental observations and identify the most relevant physical mechanisms, which help set up reduced structural models with minimal parameters to be efficiently determined by computer simulations. Such combined procedures are demonstrated by a study of phonon softening phenomenon in precursor state and premartensitic transformation of Ni-Mn-Ga shape-memory alloy.
Can natural proteins designed with 'inverted' peptide sequences adopt native-like protein folds?
Sridhar, Settu; Guruprasad, Kunchur
2014-01-01
We have carried out a systematic computational analysis on a representative dataset of proteins of known three-dimensional structure, in order to evaluate whether it would be possible to 'swap' certain short peptide sequences in naturally occurring proteins with their corresponding 'inverted' peptides and generate 'artificial' proteins that are predicted to retain a native-like protein fold. The analysis of 3,967 representative proteins from the Protein Data Bank revealed 102,677 unique identical inverted peptide sequence pairs that vary in sequence length between 5-12 and 18 amino acid residues. Our analysis illustrates with examples that such 'artificial' proteins may be generated by identifying peptides with 'similar structural environment' and by using comparative protein modeling and validation studies. Our analysis suggests that natural proteins may be tolerant to accommodating such peptides.
NASA Astrophysics Data System (ADS)
Jerosch, K.; Lüdtke, A.; Schlüter, M.; Ioannidis, G. T.
2007-02-01
The combination of new underwater technology such as remotely operated vehicles (ROVs), high-resolution video imagery, and software to compute georeferenced mosaics of the seafloor provides new opportunities for marine geological or biological studies and applications in offshore industry. Even during single surveys by ROVs or towed systems, large amounts of images are compiled. While these underwater techniques are now well-engineered, there is still a lack of methods for the automatic analysis of the acquired image data. During ROV dives more than 4200 georeferenced video mosaics were compiled for the Håkon Mosby Mud Volcano (HMMV). Mud volcanoes such as HMMV are considered significant source locations for methane and are characterised by unique chemoautotrophic communities such as Beggiatoa mats. For the detection and quantification of the spatial distribution of Beggiatoa mats an automated image analysis technique was developed, which applies watershed transformation and relaxation-based labelling of pre-segmented regions. Comparison of the data derived by visual inspection of 2840 video images with the automated image analysis revealed similarities with a precision better than 90%. We consider this as a step towards a time-efficient and accurate analysis of seafloor images for computation of geochemical budgets and identification of habitats at the seafloor.
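The automated detection step above combines watershed transformation with relaxation-based labelling; the hedged sketch below shows only the watershed part, applied to a synthetic image with two touching bright regions standing in for bacterial mats. It relies on scikit-image and SciPy and omits the relaxation labelling and georeferencing used in the actual pipeline.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed
from skimage.feature import peak_local_max

# Synthetic grayscale "seafloor" image: two bright, partially overlapping blobs
# standing in for white bacterial mats on a darker background.
yy, xx = np.mgrid[0:120, 0:120]
image = (np.exp(-((yy - 45) ** 2 + (xx - 45) ** 2) / 500.0)
         + np.exp(-((yy - 70) ** 2 + (xx - 80) ** 2) / 500.0))

# Pre-segmentation: a simple global threshold yields candidate mat pixels.
mask = image > 0.35

# Watershed on the negated distance transform splits touching regions.
distance = ndi.distance_transform_edt(mask)
peak_idx = peak_local_max(distance, labels=mask, min_distance=10)
markers = np.zeros_like(distance, dtype=int)
markers[tuple(peak_idx.T)] = np.arange(1, len(peak_idx) + 1)
labels = watershed(-distance, markers, mask=mask)

areas = np.bincount(labels.ravel())[1:]   # pixel area per labelled region
coverage = mask.sum() / mask.size         # fraction of image covered by mats
print(f"regions: {len(areas)}, areas: {areas.tolist()}, coverage: {coverage:.1%}")
```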
Conformational analysis and circular dichroism of bilirubin, the yellow pigment of jaundice
NASA Astrophysics Data System (ADS)
Lightner, David A.; Person, Richard; Peterson, Blake; Puzicha, Gisbert; Pu, Yu-Ming; Bojadziev, Stefan
1991-06-01
Conformational analysis of (4Z,15Z)-bilirubin-IXα by molecular mechanics computations reveals a global energy minimum folded conformation. Powerful added stabilization is achieved through intramolecular hydrogen bonding. Theoretical treatment of bilirubin as a molecular exciton predicts an intense bisignate circular dichroism spectrum for the folded conformation: Δε ≅ 270 L·mol⁻¹·cm⁻¹ for the ~450 nm electronic transition(s). Synthesis of bilirubin analogs with propionic acid groups methylated at the α or β position introduces an allosteric effect that allows for an optical resolution of the pigments, with enantiomers exhibiting the theoretically predicted circular dichroism.
NASA Astrophysics Data System (ADS)
Lespinats, S.; Meyer-Bäse, Anke; He, Huan; Marshall, Alan G.; Conrad, Charles A.; Emmett, Mark R.
2009-05-01
Partial Least Square Regression (PLSR) and Data-Driven High Dimensional Scaling (DD-HDS) are employed for the prediction and the visualization of changes in polar lipid expression induced by different combinations of wild-type (wt) p53 gene therapy and SN38 chemotherapy of U87 MG glioblastoma cells. A very detailed analysis of the gangliosides reveals that certain gangliosides of GM3 or GD1-type have unique properties not shared by the others. In summary, this preliminary work shows that data mining techniques are able to determine the modulation of gangliosides by different treatment combinations.
Geo-Distinctive Comorbidity Networks of Pediatric Asthma.
Shin, Eun Kyong; Shaban-Nejad, Arash
2018-01-01
Most pediatric asthma cases arise within complex interdependencies, manifesting as multiple co-occurring symptoms. Studying asthma comorbidities can help to better understand the etiology pathway of the disease. Although such relations among co-expressed symptoms and their interactions have been highlighted recently, they have not been rigorously investigated empirically for pediatric asthma. In this study, we use computational network modeling and analysis to reveal the links and associations between commonly co-observed diseases/conditions with asthma among children in Memphis, Tennessee. We present a novel method for geo-parsed comorbidity network analysis to show the distinctive patterns of comorbidity networks in urban and suburban areas in Memphis.
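A minimal sketch of the comorbidity-network construction implied above: edges are weighted by how many patients share a pair of conditions, computed separately for each geographic stratum. The records, condition names, and areas are invented; the study's geo-parsing and statistical filtering are not reproduced.

```python
from collections import Counter
from itertools import combinations

# Hypothetical de-identified records: (area, set of diagnoses seen with asthma).
records = [
    ("urban",    {"asthma", "allergic_rhinitis", "eczema"}),
    ("urban",    {"asthma", "obesity", "allergic_rhinitis"}),
    ("urban",    {"asthma", "anxiety", "obesity"}),
    ("suburban", {"asthma", "allergic_rhinitis"}),
    ("suburban", {"asthma", "sinusitis", "allergic_rhinitis"}),
]

def comorbidity_edges(records, area):
    """Edge weights = number of patients in `area` sharing each pair of conditions."""
    edges = Counter()
    for rec_area, dx in records:
        if rec_area == area:
            edges.update(combinations(sorted(dx), 2))
    return edges

for area in ("urban", "suburban"):
    edges = comorbidity_edges(records, area)
    print(area, edges.most_common(3))
```

From these weighted edges a graph library could then compute the network summaries that are compared across areas.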
NASA Technical Reports Server (NTRS)
Hoffer, R. M. (Principal Investigator)
1980-01-01
Several possibilities were considered for defining the data set in which the same test areas could be used for each of the four different spatial resolutions being evaluated. The LARSYS CLUSTER was used to sort the vectors into spectral classes to reduce the within-spectral class variability in an effort to develop training statistics. A data quality test was written to determine the basic signal to noise characteristics within the data set being used. Because preliminary analysis of the LANDSAT MSS data revealed the presence of high cirrus clouds, other data sets are being sought.
NASA Astrophysics Data System (ADS)
Saja, D.; Joe, I. Hubert; Jayakumar, V. S.
2006-01-01
The NIR-FT Raman and FT-IR spectral analysis of the potential NLO material p-aminoacetanilide (PAA) is carried out with the support of density functional computations. The optimized geometry shows that the NH2 and NHCOCH3 groups substituted at the para positions of the phenyl ring are non-planar, which predicts maximum conjugation of the molecule with the donor and acceptor groups. Vibrational analysis reveals simultaneous IR and Raman activation of the phenyl ring modes, providing evidence of charge-transfer interaction between the donor and the acceptor; this interaction can make the molecule highly polarized, and the intramolecular charge transfer is likely responsible for the NLO properties of PAA.
A genomic regulatory network for development
NASA Technical Reports Server (NTRS)
Davidson, Eric H.; Rast, Jonathan P.; Oliveri, Paola; Ransick, Andrew; Calestani, Cristina; Yuh, Chiou-Hwa; Minokawa, Takuya; Amore, Gabriele; Hinman, Veronica; Arenas-Mena, Cesar;
2002-01-01
Development of the body plan is controlled by large networks of regulatory genes. A gene regulatory network that controls the specification of endoderm and mesoderm in the sea urchin embryo is summarized here. The network was derived from large-scale perturbation analyses, in combination with computational methodologies, genomic data, cis-regulatory analysis, and molecular embryology. The network contains over 40 genes at present, and each node can be directly verified at the DNA sequence level by cis-regulatory analysis. Its architecture reveals specific and general aspects of development, such as how given cells generate their ordained fates in the embryo and why the process moves inexorably forward in developmental time.
NASA Technical Reports Server (NTRS)
Housner, J. M.; Anderson, M.; Belvin, W.; Horner, G.
1985-01-01
Dynamic analysis of large space antenna systems must treat the deployment as well as vibration and control of the deployed antenna. Candidate computer programs for deployment dynamics, and issues and needs for future program developments are reviewed. Some results for mast and hoop deployment are also presented. Modeling of complex antenna geometry with conventional finite element methods and with repetitive exact elements is considered. Analytical comparisons with experimental results for a 15 meter hoop/column antenna revealed the importance of accurate structural properties including nonlinear joints. Slackening of cables in this antenna is also a consideration. The technology of designing actively damped structures through analytical optimization is discussed and results are presented.
Claussnitzer, Melina; Dankel, Simon N; Klocke, Bernward; Grallert, Harald; Glunk, Viktoria; Berulava, Tea; Lee, Heekyoung; Oskolkov, Nikolay; Fadista, Joao; Ehlers, Kerstin; Wahl, Simone; Hoffmann, Christoph; Qian, Kun; Rönn, Tina; Riess, Helene; Müller-Nurasyid, Martina; Bretschneider, Nancy; Schroeder, Timm; Skurk, Thomas; Horsthemke, Bernhard; Spieler, Derek; Klingenspor, Martin; Seifert, Martin; Kern, Michael J; Mejhert, Niklas; Dahlman, Ingrid; Hansson, Ola; Hauck, Stefanie M; Blüher, Matthias; Arner, Peter; Groop, Leif; Illig, Thomas; Suhre, Karsten; Hsu, Yi-Hsiang; Mellgren, Gunnar; Hauner, Hans; Laumen, Helmut
2014-01-16
Genome-wide association studies have revealed numerous risk loci associated with diverse diseases. However, identification of disease-causing variants within association loci remains a major challenge. Divergence in gene expression due to cis-regulatory variants in noncoding regions is central to disease susceptibility. We show that integrative computational analysis of phylogenetic conservation with a complexity assessment of co-occurring transcription factor binding sites (TFBS) can identify cis-regulatory variants and elucidate their mechanistic role in disease. Analysis of established type 2 diabetes risk loci revealed a striking clustering of distinct homeobox TFBS. We identified the PRRX1 homeobox factor as a repressor of PPARG2 expression in adipose cells and demonstrate its adverse effect on lipid metabolism and systemic insulin sensitivity, dependent on the rs4684847 risk allele that triggers PRRX1 binding. Thus, cross-species conservation analysis at the level of co-occurring TFBS provides a valuable contribution to the translation of genetic association signals to disease-related molecular mechanisms. Copyright © 2014 Elsevier Inc. All rights reserved.
An analysis of texture, timbre, and rhythm in relation to form in Magnus Lindberg's "Gran Duo"
NASA Astrophysics Data System (ADS)
Wolfe, Brian Thomas
Gran Duo (1999-2000) by Magnus Lindberg (b. 1958) is the result of a commission by Sir Simon Rattle, former conductor of the City of Birmingham (England) Symphony Orchestra, and the Royal Festival Hall to commemorate the third millennium. Composed for twenty-four woodwinds and brass, Lindberg divides the woodwind and brass families into eight characters that serve as participants in an attentive twenty-minute conversation. The document includes biographical information about the composition to further understand Lindberg's writing style. The composer's use of computer-assisted composition techniques inspires an alternative structural analysis of Gran Duo. Spectral graphs provide a supplementary tool for score study, assisting with the verification of formal structural elements. A tempo chart allows the conductor to easily identify form and tempo relationships between each of the nineteen sections throughout the five-movement composition. The analysis of texture, timbre, and rhythm reveals character areas, their relation to the structure of the work, and the formal structure of the composition, which reflects a conversation between the brass and woodwinds in this setting for wind instruments.
Comparison and correlation of Simple Sequence Repeats distribution in genomes of Brucella species
Kiran, Jangampalli Adi Pradeep; Chakravarthi, Veeraraghavulu Praveen; Kumar, Yellapu Nanda; Rekha, Somesula Swapna; Kruti, Srinivasan Shanthi; Bhaskar, Matcha
2011-01-01
Computational genomics is one of the important tools to understand the distribution of closely related genomes, including simple sequence repeats (SSRs), in an organism, which gives valuable information regarding genetic variations. The central objective of the present study was to screen the SSRs distributed in coding and non-coding regions among different human Brucella species which are involved in a range of pathological disorders. Computational analysis of the SSRs in the Brucella genomes indicates few deviations from expected random models. Statistical analysis also reveals that tri-nucleotide SSRs are overrepresented and tetra-nucleotide SSRs underrepresented in Brucella genomes. From the data, it can be suggested that the overrepresented tri-nucleotide SSRs in genomic and coding regions might be responsible for generating functional variation in the expressed proteins, which in turn may lead to differences in the pathogenicity, virulence determinants, stress response genes, transcription regulators and host adaptation proteins of Brucella genomes. Abbreviations: SSRs - Simple Sequence Repeats; ORFs - Open Reading Frames. PMID:21738309
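A simple way to screen a sequence for SSRs of the kind counted above is a regular-expression scan for perfect tandem repeats; the sketch below does this for mono- to hexa-nucleotide motifs. The minimum repeat number and the example fragment are arbitrary choices, not the thresholds used in the study.

```python
import re
from collections import Counter

def find_ssrs(seq, min_unit=1, max_unit=6, min_repeats=3):
    """Return (motif, repeat_count, start) for perfect tandem repeats in `seq`."""
    seq = seq.upper()
    hits = []
    for unit in range(min_unit, max_unit + 1):
        # e.g. for unit=3: a 3-mer followed by at least (min_repeats - 1) copies of itself
        pattern = re.compile(r"([ACGT]{%d})\1{%d,}" % (unit, min_repeats - 1))
        for m in pattern.finditer(seq):
            motif = m.group(1)
            if len(set(motif)) == 1 and unit > 1:
                continue  # skip homopolymer runs already counted as mononucleotide SSRs
            hits.append((motif, len(m.group(0)) // unit, m.start()))
    return hits

# Hypothetical fragment of a coding region.
fragment = "ATGGCGGCGGCGGCGTTACACACACAGGTTTTTTTCGATCGATCGATCGA"
ssrs = find_ssrs(fragment)
by_unit_length = Counter(len(motif) for motif, _, _ in ssrs)
print(ssrs)
print("SSR counts by motif length:", dict(by_unit_length))
```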
Baltzer, Pascal Andreas Thomas; Freiberg, Christian; Beger, Sebastian; Vag, Tibor; Dietzel, Matthias; Herzog, Aimee B; Gajda, Mieczyslaw; Camara, Oumar; Kaiser, Werner A
2009-09-01
Enhancement characteristics after administration of a contrast agent are regarded as a major criterion for differential diagnosis in magnetic resonance mammography (MRM). However, no consensus exists about the best measurement method to assess contrast enhancement kinetics. This systematic investigation was performed to compare visual estimation with manual region of interest (ROI) and computer-aided diagnosis (CAD) analysis for time curve measurements in MRM. A total of 329 patients undergoing surgery after MRM (1.5 T) were analyzed prospectively. Dynamic data were measured using visual estimation as well as ROI and CAD methods, and classified depending on initial signal increase and delayed enhancement. Pathology revealed 469 lesions (279 malignant, 190 benign). Kappa agreement between the methods ranged from 0.78 to 0.81. Diagnostic accuracies of 74.4% (visual), 75.7% (ROI), and 76.6% (CAD) were found, without statistically significant differences. According to our results, curve type measurements are useful as a diagnostic criterion in breast lesions irrespective of the method used.
NASA Astrophysics Data System (ADS)
Wesley, Beth Eddinger; Krockover, Gerald H.; Devito, Alfred
The purpose of this study was to determine the effects of computer-assisted instruction (CAI) versus a text mode of programmed instruction (PI), and the cognitive style of locus of control, on preservice elementary teachers' achievement of the integrated science process skills. Eighty-one preservice elementary teachers in six sections of a science methods class were classified as internally or externally controlled. The sections were randomly assigned to receive instruction in the integrated science process skills via a microcomputer or printed text. The study used a pretest-posttest control group design. Before assessing main and interaction effects, analysis of covariance was used to adjust posttest scores using the pretest scores. Statistical analysis revealed that main effects were not significant. Additionally, no interaction effects between treatments and loci of control were demonstrated. The results suggest that printed PI and tutorial CAI are equally effective modes of instruction for teaching internally and externally oriented preservice elementary teachers the integrated science process skills.
NASA Technical Reports Server (NTRS)
Wang, Ten-See; Foote, John; Litchford, Ron
2006-01-01
The objective of this effort is to perform design analyses for a non-nuclear hot-hydrogen materials tester, as a first step towards developing efficient and accurate multiphysics, thermo-fluid computational methodology to predict environments for hypothetical solid-core, nuclear thermal engine thrust chamber design and analysis. The computational methodology is based on a multidimensional, finite-volume, turbulent, chemically reacting, thermally radiating, unstructured-grid, and pressure-based formulation. The multiphysics invoked in this study include hydrogen dissociation kinetics and thermodynamics, turbulent flow, convective, and thermal radiative heat transfers. The goals of the design analyses are to maintain maximum hot-hydrogen jet impingement energy and to minimize chamber wall heating. The results of analyses on three test fixture configurations and the rationale for final selection are presented. The interrogation of physics revealed that reactions of hydrogen dissociation and recombination are highly correlated with local temperature and are necessary for accurate prediction of the hot-hydrogen jet temperature.
NASA Astrophysics Data System (ADS)
Joung, Tae-Hwan; Sammut, Karl; He, Fangpo; Lee, Seung-Keon
2012-03-01
Autonomous Underwater Vehicles (AUVs) provide a useful means of collecting detailed oceanographic information. The hull resistance of an AUV is an important factor in determining the power requirements and range of the vehicle. This paper describes a procedure using Computational Fluid Dynamics (CFD) for determining the hull resistance of an AUV under development, for a given propeller rotation speed and within a given range of AUV velocities. The CFD analysis results reveal the distribution of the hydrodynamic values (velocity, pressure, etc.) around the AUV hull and its ducted propeller. The paper then proceeds to present a methodology for optimizing the AUV profile in order to reduce the total resistance. This paper demonstrates that shape optimization of conceptual designs is possible using the commercial CFD package contained in Ansys™. The optimum design to minimize the drag force of the AUV was identified for a given objective function and a set of constrained design parameters.
NASA Astrophysics Data System (ADS)
Moaienla, T.; Singh, Th. David; Singh, N. Rajmuhon; Devi, M. Indira
2009-10-01
Studying the absorption difference and comparative absorption spectra of the interaction of Pr(III) and Nd(III) with L-phenylalanine, L-glycine, L-alanine and L-aspartic acid in the presence and absence of Ca2+ in organic solvents, various energy interaction parameters such as the Slater-Condon (Fk), Racah (Ek), Landé factor (ξ4f), nephelauxetic ratio (β), bonding (b1/2) and percentage covalency (δ) parameters have been evaluated by applying partial and multiple regression analysis. The values of the oscillator strength (P) and the Judd-Ofelt electric dipole intensity parameters Tλ (λ = 2, 4, 6) for different 4f-4f transitions have been computed. Analysis of the variation of the various energy interaction parameters, as well as the changes in the oscillator strength (P) and Tλ values, reveals the mode of binding with the different ligands.
NASA Technical Reports Server (NTRS)
Easton, John W.; Struk, Peter M.; Rotella, Anthony
2008-01-01
As a part of efforts to develop an electronics repair capability for long duration space missions, techniques and materials for soldering components on a circuit board in reduced gravity must be developed. This paper presents results from testing solder joint formation in low gravity on a NASA Reduced Gravity Research Aircraft. The results presented include joints formed using eutectic tin-lead solder and one of the following fluxes: (1) a no-clean flux core, (2) a rosin flux core, and (3) a solid solder wire with external liquid no-clean flux. The solder joints are analyzed with a computed tomography (CT) technique which imaged the interior of the entire solder joint. This replaced an earlier technique that required the solder joint to be destructively ground down revealing a single plane which was subsequently analyzed. The CT analysis technique is described and results presented with implications for future testing as well as implications for the overall electronics repair effort discussed.
Decoding the Regulatory Network for Blood Development from Single-Cell Gene Expression Measurements
Haghverdi, Laleh; Lilly, Andrew J.; Tanaka, Yosuke; Wilkinson, Adam C.; Buettner, Florian; Macaulay, Iain C.; Jawaid, Wajid; Diamanti, Evangelia; Nishikawa, Shin-Ichi; Piterman, Nir; Kouskoff, Valerie; Theis, Fabian J.; Fisher, Jasmin; Göttgens, Berthold
2015-01-01
Here we report the use of diffusion maps and network synthesis from state transition graphs to better understand developmental pathways from single cell gene expression profiling. We map the progression of mesoderm towards blood in the mouse by single-cell expression analysis of 3,934 cells, capturing cells with blood-forming potential at four sequential developmental stages. By adapting the diffusion plot methodology for dimensionality reduction to single-cell data, we reconstruct the developmental journey to blood at single-cell resolution. Using transitions between individual cellular states as input, we develop a single-cell network synthesis toolkit to generate a computationally executable transcriptional regulatory network model that recapitulates blood development. Model predictions were validated by showing that Sox7 inhibits primitive erythropoiesis, and that Sox and Hox factors control early expression of Erg. We therefore demonstrate that single-cell analysis of a developing organ coupled with computational approaches can reveal the transcriptional programs that control organogenesis. PMID:25664528
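A bare-bones version of the diffusion-map idea referenced above: build a Gaussian affinity matrix over cells, normalise it into a Markov transition matrix, and use its leading non-trivial eigenvectors as coordinates. The toy expression profiles are synthetic, and the sketch omits the noise handling and single-cell adaptations described in the paper.

```python
import numpy as np

def diffusion_map(X, sigma=1.0, n_components=2):
    """Basic diffusion-map embedding: Gaussian kernel -> Markov matrix -> eigenvectors."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)  # pairwise squared distances
    K = np.exp(-d2 / (2.0 * sigma ** 2))                        # affinity between cells
    P = K / K.sum(axis=1, keepdims=True)                        # row-stochastic transition matrix
    evals, evecs = np.linalg.eig(P)
    order = np.argsort(-evals.real)
    # Skip the trivial constant eigenvector (eigenvalue 1); keep the next components.
    return evecs.real[:, order[1:n_components + 1]]

# Hypothetical expression profiles: two noisy clusters standing in for developmental stages.
rng = np.random.default_rng(0)
cells = np.vstack([rng.normal(0, 0.3, size=(30, 5)),
                   rng.normal(1, 0.3, size=(30, 5))])
embedding = diffusion_map(cells, sigma=0.7)
print(embedding.shape)   # (60, 2) diffusion components per cell
```

Dedicated single-cell packages handle much larger datasets and branch detection; this only shows the core linear-algebra step.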
Computational study of hydroxyapatite structures, properties and defects
NASA Astrophysics Data System (ADS)
Bystrov, V. S.; Coutinho, J.; Bystrova, A. V.; Dekhtyar, Yu D.; Pullar, R. C.; Poronin, A.; Palcevskis, E.; Dindune, A.; Alkan, B.; Durucan, C.; Paramonova, E. V.
2015-03-01
Hydroxyapatite (HAp) was studied from a first-principles approach using the local density approximation (LDA) method in the AIMPRO code, in combination with various quantum mechanical (QM) and molecular mechanical (MM) methods from HyperChem 7.5/8.0. The data obtained were used for studies of HAp structures, the physical properties of HAp (density of electronic states (DOS), bulk modulus, etc.) and defects in HAp. Computed data confirmed that HAp can co-exist in different phases: hexagonal and monoclinic. Ordered monoclinic structures, which could reveal piezoelectric properties, are of special interest. The data obtained allow us to characterize the properties of the following defects in HAp: O, H and OH vacancies; H and OH interstitials; substitutions of Ca by Mg, Sr, Mn or Se, and of P by Si. These properties reveal the appearance of additional energy levels inside the forbidden zone, shifts of the top of the valence band or the bottom of the conduction band, and subsequent changes in the width of the forbidden zone. The computed data are compared with other known data, both calculated and experimental, such as the alteration of the electron work function under the influence of various defects and treatments, obtained by photoelectron emission. Such analysis is urgently needed for understanding the interactions of modified HAp with living cells and tissues, improving implant techniques and developing new nanomedical applications.
NASA Astrophysics Data System (ADS)
Khamees, Hussien Ahmed; Jyothi, Mahima; Khanum, Shaukath Ara; Madegowda, Mahendra
2018-06-01
The compound 1-(3,4-dimethoxyphenyl)-3-(4-fluorophenyl)-propan-1-one (DFPO) was synthesized by a Claisen-Schmidt condensation reaction, and single crystals were obtained by the slow evaporation method. The three-dimensional structure was confirmed by the single-crystal X-ray diffraction method, exhibiting the triclinic crystal system with space group P-1. The crystal structure is stabilized by C-H⋯O intermolecular and weak interactions. The computed molecular geometry was obtained by density functional theory (DFT) and compared with the experimental results. The FT-IR (4000-400 cm-1) and FT-Raman (3500-50 cm-1) spectra of DFPO were recorded experimentally and computed by DFT using the B3LYP/6-311G(d,p) basis set. Intramolecular charge transfer was examined using natural bond orbital (NBO) analysis, which revealed the various contributions of bonding and lone-pair interactions to the stabilization of the molecule. The nonlinear optical (NLO) activity of the title compound was determined by second harmonic generation (SHG) and computed using the DFT method. The hyperpolarizability, HOMO-LUMO energy gap, hardness, softness, electronegativity and other global reactivity descriptors of DFPO were calculated, giving a complete picture of the chemical reactivity of DFPO. Hirshfeld surface analysis was applied to investigate the intermolecular interactions and revealed that more than two-thirds of the intermolecular contacts are associated with O⋯H, C⋯H and H⋯H interactions. Docking studies showed that DFPO inhibits the human vascular endothelial growth factor receptor (VEGFR-2) signalling pathway, indicating anti-angiogenic activity; because angiogenesis plays a pivotal role in cancer, DFPO is suggested for clinical studies to evaluate its potential to treat human cancers.
Parallel solution of the symmetric tridiagonal eigenproblem. Research report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jessup, E.R.
1989-10-01
This thesis discusses methods for computing all eigenvalues and eigenvectors of a symmetric tridiagonal matrix on a distributed-memory Multiple Instruction, Multiple Data multiprocessor. Only those techniques having the potential for both high numerical accuracy and significant large-grained parallelism are investigated. These include the QL method or Cuppen's divide and conquer method based on rank-one updating to compute both eigenvalues and eigenvectors, bisection to determine eigenvalues and inverse iteration to compute eigenvectors. To begin, the methods are compared with respect to computation time, communication time, parallel speed up, and accuracy. Experiments on an IPSC hypercube multiprocessor reveal that Cuppen's method is the most accurate approach, but bisection with inverse iteration is the fastest and most parallel. Because the accuracy of the latter combination is determined by the quality of the computed eigenvectors, the factors influencing the accuracy of inverse iteration are examined. This includes, in part, statistical analysis of the effect of a starting vector with random components. These results are used to develop an implementation of inverse iteration producing eigenvectors with lower residual error and better orthogonality than those generated by the EISPACK routine TINVIT. This thesis concludes with adaptations of methods for the symmetric tridiagonal eigenproblem to the related problem of computing the singular value decomposition (SVD) of a bidiagonal matrix.
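As a present-day counterpart to the accuracy comparison described above, the sketch below solves a random symmetric tridiagonal eigenproblem with SciPy's LAPACK-backed tridiagonal driver, checks it against a dense reference solver, and evaluates residual and orthogonality criteria analogous to those used in the thesis; it says nothing about the distributed-memory implementations compared there.

```python
import numpy as np
from scipy.linalg import eigh_tridiagonal

# Random symmetric tridiagonal matrix: main diagonal d, off-diagonal e.
rng = np.random.default_rng(0)
n = 200
d = rng.normal(size=n)
e = rng.normal(size=n - 1)

# Tridiagonal solver (LAPACK-backed) vs. a dense symmetric solver as a reference.
w_tri, v_tri = eigh_tridiagonal(d, e)
T = np.diag(d) + np.diag(e, 1) + np.diag(e, -1)
w_dense = np.linalg.eigvalsh(T)

# Accuracy criteria analogous to the thesis: residual norm and orthogonality of eigenvectors.
residual = np.linalg.norm(T @ v_tri - v_tri * w_tri)
orthogonality = np.linalg.norm(v_tri.T @ v_tri - np.eye(n))
print(f"max eigenvalue difference: {np.max(np.abs(w_tri - w_dense)):.2e}")
print(f"residual ||T V - V diag(w)||: {residual:.2e}")
print(f"orthogonality ||V^T V - I||:  {orthogonality:.2e}")
```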
Chesser, Amy K; Keene Woods, Nikki; Wipperman, Jennifer; Wilson, Rachel; Dong, Frank
2014-02-01
Low health literacy is associated with poor health outcomes. Research is needed to understand the mechanisms and pathways of its effects. Computer-based assessment tools may improve efficiency and cost-effectiveness of health literacy research. The objective of this preliminary study was to assess if administration of the Short Test of Functional Health Literacy in Adults (STOFHLA) through a computer-based medium was comparable to the paper-based test in terms of accuracy and time to completion. A randomized, crossover design was used to compare computer versus paper format of the STOFHLA at a Midwestern family medicine residency program. Eighty participants were initially randomized to either computer (n = 42) or paper (n = 38) format of the STOFHLA. After a 30-day washout period, participants returned to complete the other version of the STOFHLA. Data analysis revealed no significant difference between paper- and computer-based surveys (p = .9401; N = 57). The majority of participants showed "adequate" health literacy via paper- and computer-based surveys (100% and 97% of participants, respectively). Results of electronic STOFHLA administration were equivalent to those of paper administration for the evaluation of adult health literacy. Future investigations should focus on expanded populations in multiple health care settings and validation of other health literacy screening tools in a clinical setting.
Theoretical calculation of polarizability isotope effects.
Moncada, Félix; Flores-Moreno, Roberto; Reyes, Andrés
2017-03-01
We propose a scheme to estimate hydrogen isotope effects on molecular polarizabilities. This approach combines the any-particle molecular orbital method, in which both electrons and H/D nuclei are described as quantum waves, with auxiliary density perturbation theory to calculate the polarizability tensor analytically. We assess the performance of the method by calculating the polarizability isotope effect for 20 molecules. A good correlation between theoretical and experimental data is found. Further analysis of the results reveals that the change in the polarizability of an X-H bond upon deuteration decreases as the electronegativity of X increases. Our investigation also reveals that the molecular polarizability isotope effect presents an additive character. Therefore, it can be computed by counting the number of deuterated bonds in the molecule.
Analysis of worldwide research in the field of cybernetics during 1997-2011.
Singh, Virender; Perdigones, Alicia; García, José Luis; Cañas-Guerrero, Ignacio; Mazarrón, Fernando R
2014-12-01
The study provides an overview of the research activity carried out in the field of cybernetics. To do so, all research papers from 1997 to 2011 (16,445 research papers) under the category of "Computer Science, Cybernetics" of Web of Science have been processed using our in-house software, which was developed specifically for this purpose. Among its multiple capabilities, this software analyses individual and compound keywords, quantifies productivity taking into account the work distribution, estimates the impact of each article and determines the collaborations established at different scales. Keyword analysis identifies the evolution of the most important research topics in the field of cybernetics and their specificity in biological aspects, as well as the research topics with lesser interest. The analysis of productivity, impact and collaborations provides a framework to assess research activity in a specific and realistic context. The geographical and institutional distribution of publications reveals the leading countries and research centres, analysing their relation to the main research journals. Moreover, collaboration analysis reveals great differences in terms of internationalization and complexity of research networks. The results of this study may be very useful for characterizing research in the field of cybernetics and for informing research decisions in the field.
Computer proficiency questionnaire: assessing low and high computer proficient seniors.
Boot, Walter R; Charness, Neil; Czaja, Sara J; Sharit, Joseph; Rogers, Wendy A; Fisk, Arthur D; Mitzner, Tracy; Lee, Chin Chin; Nair, Sankaran
2015-06-01
Computers and the Internet have the potential to enrich the lives of seniors and aid in the performance of important tasks required for independent living. A prerequisite for reaping these benefits is having the skills needed to use these systems, which is highly dependent on proper training. One prerequisite for efficient and effective training is being able to gauge current levels of proficiency. We developed a new measure (the Computer Proficiency Questionnaire, or CPQ) to measure computer proficiency in the domains of computer basics, printing, communication, Internet, calendaring software, and multimedia use. Our aim was to develop a measure appropriate for individuals with a wide range of proficiencies from noncomputer users to extremely skilled users. To assess the reliability and validity of the CPQ, a diverse sample of older adults, including 276 older adults with no or minimal computer experience, was recruited and asked to complete the CPQ. The CPQ demonstrated excellent reliability (Cronbach's α = .98), with subscale reliabilities ranging from .86 to .97. Age, computer use, and general technology use all predicted CPQ scores. Factor analysis revealed three main factors of proficiency related to Internet and e-mail use; communication and calendaring; and computer basics. Based on our findings, we also developed a short-form CPQ (CPQ-12) with similar properties but 21 fewer questions. The CPQ and CPQ-12 are useful tools to gauge computer proficiency for training and research purposes, even among low computer proficient older adults. © The Author 2013. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
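The reliability figures quoted above are Cronbach's alpha values; the sketch below shows how that statistic is computed from a respondents-by-items score matrix. The response matrix is made up for illustration and is unrelated to the CPQ data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

# Hypothetical responses: 8 respondents rating 5 questionnaire items on a 1-5 scale.
responses = np.array([
    [5, 4, 5, 5, 4],
    [2, 2, 1, 2, 2],
    [4, 4, 4, 5, 4],
    [3, 3, 2, 3, 3],
    [5, 5, 5, 4, 5],
    [1, 2, 1, 1, 2],
    [4, 3, 4, 4, 4],
    [2, 3, 2, 2, 3],
])
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```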
[Upper extremities, neck and back symptoms in office employees working at computer stations].
Zejda, Jan E; Bugajska, Joanna; Kowalska, Małgorzata; Krzych, Lukasz; Mieszkowska, Marzena; Brozek, Grzegorz; Braczkowska, Bogumiła
2009-01-01
To obtain current data on the occurrence of work-related symptoms among office computer users in Poland, we implemented a questionnaire survey. Its goal was to assess the prevalence and intensity of symptoms of the upper extremities, neck and back in office workers who use computers on a regular basis, and to find out if the occurrence of symptoms depends on the duration of computer use and other work-related factors. Office workers in two towns (Warszawa and Katowice), employed in large social services companies, were invited to fill in the Polish version of the Nordic Questionnaire. The questions included work history and history of last-week symptoms of pain of the hand/wrist, elbow, arm, neck and upper and lower back (occurrence and intensity measured by a visual scale). Altogether 477 men and women returned the completed questionnaires. Between-group symptom differences (chi-square test) were verified by multivariate analysis (GLM). The prevalence of symptoms in individual body parts was as follows: neck, 55.6%; arm, 26.9%; elbow, 13.3%; wrist/hand, 29.9%; upper back, 49.6%; and lower back, 50.1%. Multivariate analysis confirmed the effect of gender, age and years of computer use on the occurrence of symptoms. Among other determinants, forearm support explained wrist/hand pain, wrist support explained elbow pain, and chair adjustment explained arm pain. An association was also found between low back pain and chair adjustment and keyboard position. The findings revealed frequent occurrence of symptoms of pain in the upper extremities and neck in office workers who use computers on a regular basis. Seating position could also contribute to the frequent occurrence of back pain in the examined population.
Goldklang, Monica P.; Tekabe, Yared; Zelonina, Tina; Trischler, Jordis; Xiao, Rui; Stearns, Kyle; Romanov, Alexander; Muzio, Valeria; Shiomi, Takayuki; Johnson, Lynne L.
2016-01-01
Evaluation of lung disease is limited by the inability to visualize ongoing pathological processes. Molecular imaging that targets cellular processes related to disease pathogenesis has the potential to assess disease activity over time to allow intervention before lung destruction. Because apoptosis is a critical component of lung damage in emphysema, a functional imaging approach was taken to determine if targeting apoptosis in a smoke exposure model would allow the quantification of early lung damage in vivo. Rabbits were exposed to cigarette smoke for 4 or 16 weeks and underwent single-photon emission computed tomography/computed tomography scanning using technetium-99m–rhAnnexin V-128. Imaging results were correlated with ex vivo tissue analysis to validate the presence of lung destruction and apoptosis. Lung computed tomography scans of long-term smoke–exposed rabbits exhibit anatomical similarities to human emphysema, with increased lung volumes compared with controls. Morphometry on lung tissue confirmed increased mean linear intercept and destructive index at 16 weeks of smoke exposure and compliance measurements documented physiological changes of emphysema. Tissue and lavage analysis displayed the hallmarks of smoke exposure, including increased tissue cellularity and protease activity. Technetium-99m–rhAnnexin V-128 single-photon emission computed tomography signal was increased after smoke exposure at 4 and 16 weeks, with confirmation of increased apoptosis through terminal deoxynucleotidyl transferase dUTP nick end labeling staining and increased tissue neutral sphingomyelinase activity in the tissue. These studies not only describe a novel emphysema model for use with future therapeutic applications, but, most importantly, also characterize a promising imaging modality that identifies ongoing destructive cellular processes within the lung. PMID:27483341
A Privacy-Protecting Authentication Scheme for Roaming Services with Smart Cards
NASA Astrophysics Data System (ADS)
Son, Kyungho; Han, Dong-Guk; Won, Dongho
In this work we propose a novel smart card based privacy-protecting authentication scheme for roaming services. Our proposal achieves so-called Class 2 privacy protection, i.e., no information identifying a roaming user or linking the user's behaviors is revealed in a visited network. It can be used to overcome the inherent structural flaws of recently proposed smart card based anonymous authentication schemes. As shown in our analysis, our scheme is computationally efficient for a mobile user.
Analytical and experimental vibration studies of a 1/8-scale shuttle orbiter
NASA Technical Reports Server (NTRS)
Pinson, L. D.
1975-01-01
Natural frequencies and mode shapes for four symmetric vibration modes and four antisymmetric modes are compared with predictions based on NASTRAN finite-element analyses. Initial predictions gave poor agreement with test data; an extensive investigation revealed that the major factors influencing agreement were out-of-plane imperfections in fuselage panels and a soft fin-fuselage connection. Computations with a more refined analysis indicated satisfactory frequency predictions for all modes studied, within 11 percent of experimental values.
Reliability assessment of multiple quantum well avalanche photodiodes
NASA Technical Reports Server (NTRS)
Yun, Ilgu; Menkara, Hicham M.; Wang, Yang; Oguzman, Isamil H.; Kolnik, Jan; Brennan, Kevin F.; May, Gray S.; Wagner, Brent K.; Summers, Christopher J.
1995-01-01
The reliability of doped-barrier AlGaAs/GaAs multi-quantum well avalanche photodiodes fabricated by molecular beam epitaxy is investigated via accelerated life tests. Dark current and breakdown voltage were the parameters monitored. The activation energy of the degradation mechanism and the median device lifetime were determined. Device failure probability as a function of time was computed using the lognormal model. Analysis using the electron-beam-induced current method revealed the degradation to be caused by ionic impurities or contamination in the passivation layer.
Hagen, R. W.; Ambos, H. D.; Browder, M. W.; Roloff, W. R.; Thomas, L. J.
1979-01-01
The Clinical Physiologic Research System (CPRS) developed from our experience in applying computers to medical instrumentation problems. This experience revealed a set of applications with a commonality in data acquisition, analysis, input/output, and control needs that could be met by a portable system. The CPRS demonstrates a practical methodology for integrating commercial instruments with distributed modular elements of local design in order to make facile responses to changing instrumentation needs in clinical environments.
NASA Astrophysics Data System (ADS)
Passarino, Giampiero
2014-05-01
Higgs Computed Axial Tomography, an excerpt. The Higgs boson lineshape ("and the devil hath power to assume a pleasing shape", Hamlet, Act II, scene 2) is analyzed for the process, with special emphasis on the off-shell tail which shows up for large values of the Higgs virtuality. The effect of including background and interference is also discussed. The main focus of this work is on residual theoretical uncertainties, discussing how a much-improved constraint on the Higgs intrinsic width can be obtained through an improved approach to the analysis.
Computational analysis of aircraft pressure relief doors
NASA Astrophysics Data System (ADS)
Schott, Tyler
Modern trends in commercial aircraft design have sought to improve fuel efficiency while reducing emissions by operating at higher pressures and temperatures than ever before. Consequently, greater demands are placed on the auxiliary bleed air systems used for a multitude of aircraft operations. The increased role of bleed air systems poses significant challenges for the pressure relief system to ensure the safe and reliable operation of the aircraft. The core compartment pressure relief door (PRD) is an essential component of the pressure relief system which functions to relieve internal pressure in the core casing of a high-bypass turbofan engine during a burst duct over-pressurization event. The successful modeling and analysis of a burst duct event are imperative to the design and development of PRDs to ensure that they will meet the increased demands placed on the pressure relief system. Leveraging high-performance computing coupled with advances in computational analysis, this thesis focuses on a comprehensive computational fluid dynamics (CFD) study to characterize turbulent flow dynamics and quantify the performance of a core compartment PRD across a range of operating conditions and geometric configurations. The CFD analysis was based on a compressible, steady-state, three-dimensional, Reynolds-averaged Navier-Stokes approach. Simulations were analyzed, and results show that variations in freestream conditions, plenum environment, and geometric configurations have a non-linear impact on the discharge, moment, thrust, and surface temperature characteristics. The CFD study revealed that the underlying physics for this behavior is explained by the interaction of vortices, jets, and shockwaves. This thesis research is innovative and provides a comprehensive and detailed analysis of existing and novel PRD geometries over a range of realistic operating conditions representative of a burst duct over-pressurization event. Further, the study provides aircraft manufacturers with valuable insight into the impact that operating conditions and geometric configurations have on PRD performance and how the information can be used to assist future research and development of PRD design.
Computer-aided personal interviewing. A new technique for data collection in epidemiologic surveys.
Birkett, N J
1988-03-01
Most epidemiologic studies involve the collection of data directly from selected respondents. Traditionally, interviewers are provided with the interview in booklet form on paper and answers are recorded therein. On receipt at the study office, the interview results are coded, transcribed, and keypunched for analysis. The author's team has developed a method of personal interviewing which uses a structured interview stored on a lap-sized computer. Responses are entered into the computer and are subject to immediate error-checking and correction. All skip-patterns are automatic. Data entry to the final data-base involves no manual data transcription. A pilot evaluation with a preliminary version of the system using tape-recorded interviews in a test/re-test methodology revealed a slightly higher error rate, probably related to weaknesses in the pilot system and the training process. Computer interviews tended to be longer but other features of the interview process were not affected by computer. The author's team has now completed 2,505 interviews using this system in a community-based blood pressure survey. It has been well accepted by both interviewers and respondents. Failure to complete an interview on the computer was uncommon (5 per cent) and well-handled by paper back-up questionnaires. The results show that computer-aided personal interviewing in the home is feasible but that further evaluation is needed to establish the impact of this methodology on overall data quality.
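The automatic skip-patterns and immediate error-checking described above can be illustrated with a toy interview engine; the questions, ranges and branching below are invented for illustration and are not from the author's system.

    # Toy illustration of computer-aided interviewing: each question carries a
    # validation rule and a skip rule, so branching and error-checking happen
    # at entry time instead of during later transcription.
    QUESTIONS = {
        "smoker": {
            "text": "Do you currently smoke? (y/n)",
            "valid": lambda a: a in ("y", "n"),
            "next": lambda a: "cigs_per_day" if a == "y" else "systolic_bp",
        },
        "cigs_per_day": {
            "text": "Cigarettes per day (0-100)?",
            "valid": lambda a: a.isdigit() and 0 <= int(a) <= 100,
            "next": lambda a: "systolic_bp",
        },
        "systolic_bp": {
            "text": "Systolic blood pressure (mmHg, 70-260)?",
            "valid": lambda a: a.isdigit() and 70 <= int(a) <= 260,
            "next": lambda a: None,  # end of interview
        },
    }

    def run_interview():
        answers, current = {}, "smoker"
        while current is not None:
            q = QUESTIONS[current]
            answer = input(q["text"] + " ").strip().lower()
            if not q["valid"](answer):          # immediate error-checking
                print("  Out of range or malformed - please re-enter.")
                continue
            answers[current] = answer
            current = q["next"](answer)         # automatic skip-pattern
        return answers

    if __name__ == "__main__":
        print(run_interview())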
Lünse, Christina E.; Corbino, Keith A.; Ames, Tyler D.; Nelson, James W.; Roth, Adam; Perkins, Kevin R.; Sherlock, Madeline E.
2017-01-01
Abstract The discovery of structured non-coding RNAs (ncRNAs) in bacteria can reveal new facets of biology and biochemistry. Comparative genomics analyses executed by powerful computer algorithms have successfully been used to uncover many novel bacterial ncRNA classes in recent years. However, this general search strategy favors the discovery of more common ncRNA classes, whereas progressively rarer classes are correspondingly more difficult to identify. In the current study, we confront this problem by devising several methods to select subsets of intergenic regions that can concentrate these rare RNA classes, thereby increasing the probability that comparative sequence analysis approaches will reveal their existence. By implementing these methods, we discovered 224 novel ncRNA classes, which include ROOL RNA, an RNA class averaging 581 nt and present in multiple phyla, several highly conserved and widespread ncRNA classes with properties that suggest sophisticated biochemical functions and a multitude of putative cis-regulatory RNA classes involved in a variety of biological processes. We expect that further research on these newly found RNA classes will reveal additional aspects of novel biology, and allow for greater insights into the biochemistry performed by ncRNAs. PMID:28977401
NASA Astrophysics Data System (ADS)
Rauf, Abdur; Shah, Afzal; Khan, Abdul Aziz; Shah, Aamir Hassan; Abbasi, Rashda; Qureshi, Irfan Zia; Ali, Saqib
2017-04-01
A novel Schiff base, 1-((2,4-dimethylphenylimino)methyl)naphthalen-2-ol, abbreviated as HL, and its four metallic complexes were synthesized and confirmed by 1H and 13C NMR, FTIR, TGA and UV-Visible spectroscopy. The Schiff base was also characterized by X-ray analysis. The photometric and electrochemical responses of all the synthesized compounds were investigated over a wide pH range. Structures of the compounds were optimized computationally for the evaluation of different physico-chemical parameters. On the basis of the electrochemical results, redox mechanistic pathways of the compounds were proposed. The cytotoxicity analysis on HeLa cells revealed that HL and its complexes inhibit cell growth, with IC50 values of 106.7 μM (HL), 40.66 μM (L2VO), 5.92 μM (L2Sn), 42.82 μM (L2Zn) and 107.68 μM (L2Co). The compounds were tested for anti-diabetic, triglyceride, cholesterol, anti-microbial, anti-fungal and enzyme inhibition activities. The results revealed that HL and its complexes are promising new therapeutic options, as these compounds exhibit strong activity against cancer cells along with marked anti-diabetic, anti-fungal and anti-microbial inhibition.
Gong, Anmin; Liu, Jianping; Chen, Si; Fu, Yunfa
2018-01-01
To study the physiologic mechanism of the brain during different motor imagery (MI) tasks, the authors employed a method of brain-network modeling based on time-frequency cross mutual information obtained from 4-class (left hand, right hand, feet, and tongue) MI tasks recorded as brain-computer interface (BCI) electroencephalography data. The authors explored the brain network revealed by these MI tasks using statistical analysis and the analysis of topologic characteristics, and observed significant differences in the reaction level, reaction time, and activated target during the 4-class MI tasks. There was a great difference in the reaction level between the execution and resting states during different tasks: the reaction level of the left-hand MI task was the greatest, followed by that of the right-hand, feet, and tongue MI tasks. The reaction time required to perform the tasks also differed: during the left-hand and right-hand MI tasks, the brain networks of subjects reacted promptly and strongly, but there was a delay during the feet and tongue MI tasks. Statistical analysis and the analysis of network topology revealed the target regions of the brain network during different MI processes. In conclusion, our findings suggest a new way to explain the neural mechanism behind MI.
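A rough sketch of how a mutual-information brain network of this kind can be assembled from multichannel EEG: each channel is reduced to a band-limited amplitude envelope and pairwise mutual information fills an adjacency matrix. The band limits, bin count and synthetic data below are placeholder choices, not the authors' pipeline.

    # Build a channel-by-channel network from EEG via mutual information
    # between time-frequency (band-limited) amplitude envelopes.
    import numpy as np
    from scipy.signal import hilbert, butter, filtfilt
    from sklearn.metrics import mutual_info_score

    def band_envelope(x, fs, lo=8.0, hi=30.0):
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        return np.abs(hilbert(filtfilt(b, a, x)))

    def mi(x, y, bins=16):
        c, _, _ = np.histogram2d(x, y, bins=bins)
        return mutual_info_score(None, None, contingency=c)

    def mi_network(eeg, fs):
        """eeg: array of shape (n_channels, n_samples) for one MI trial."""
        env = np.array([band_envelope(ch, fs) for ch in eeg])
        n = env.shape[0]
        adj = np.zeros((n, n))
        for i in range(n):
            for j in range(i + 1, n):
                adj[i, j] = adj[j, i] = mi(env[i], env[j])
        return adj

    # Example with synthetic data: 8 channels, 2 s at 250 Hz
    rng = np.random.default_rng(0)
    adjacency = mi_network(rng.standard_normal((8, 500)), fs=250)
    print(adjacency.round(3))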
Noguchi, Shuji; Kajihara, Ryusuke; Iwao, Yasunori; Fujinami, Yukari; Suzuki, Yoshio; Terada, Yasuko; Uesugi, Kentaro; Miura, Keiko; Itai, Shigeru
2013-03-10
Computed tomography (CT) using synchrotron X-ray radiation was evaluated as a non-destructive structural analysis method for fine granules. Two kinds of granules were investigated: a bromhexine hydrochloride (BHX)-layered Celphere CP-102 granule coated with the pH-sensitive polymer Kollicoat Smartseal 30-D, and a wax-matrix granule constructed from acetaminophen (APAP), dibasic calcium phosphate dihydrate, and aminoalkyl methacrylate copolymer E (AMCE) manufactured by melt granulation. The diameters of both granules were 200-300 μm. CT analysis of the CP-102 granule could visualize the laminar structures of the BHX and Kollicoat layers, and also visualize the high talc-content regions in the Kollicoat layer that could not be detected by scanning electron microscopy. Moreover, CT analysis using X-ray energies above the absorption edge of Br specifically enhanced the contrast in the BHX layer. As for the granules manufactured by melt granulation, CT analysis revealed that they had a small inner void space due to a uniform distribution of APAP and other excipients. The distribution of AMCE revealed by CT analysis was also found to be involved in the differences in drug dissolution from the granules described previously. These observations demonstrate that CT analysis using synchrotron X-ray radiation is a powerful method for the detailed internal structural analysis of fine granules. Copyright © 2013 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Sathya, K.; Dhamodharan, P.; Dhandapani, M.
2018-05-01
A new proton transfer complex was synthesized by the reaction between 2-amino-3-methylpyridine and 3,5-dinitrobenzoic acid in methanol solvent at room temperature. The chemical composition and stoichiometry of the synthesized complex, 2-amino-3-methylpyridinium 3,5-dinitrobenzoate (AMPDB), were verified by CHN analysis. The AMPDB crystals were subjected to FT-IR spectral analysis to confirm the functional groups in the compound. UV-Vis-NIR spectral studies revealed that AMPDB has a large optical transparency window. Single crystal XRD analysis reveals that AMPDB belongs to a monoclinic system with the P21/c space group. NMR spectroscopic data indicate the exact carbon skeleton and hydrogen environment in the molecular structure of AMPDB. The thermal stability of the compound was investigated by thermogravimetry (TG). Computational studies such as optimisation of the molecular geometry, natural bond orbital (NBO) analysis, Mulliken population analysis and HOMO-LUMO analysis were performed using the Gaussian 09 software with the B3LYP method at the 6-311G(d,p) basis set. The first-order hyperpolarizability (β) value is 37 times greater than that of urea. The optical nonlinearities of AMPDB were investigated by the Z-scan technique with He-Ne laser radiation of wavelength 632.8 nm. Hirshfeld analysis indicates that O⋯H/H⋯O interactions are the dominant contacts, confirming an extensive hydrogen-bond network.
Egri-Nagy, Attila; Nehaniv, Chrystopher L
2008-01-01
Beyond complexity measures, sometimes it is worthwhile in addition to investigate how complexity changes structurally, especially in artificial systems where we have complete knowledge about the evolutionary process. Hierarchical decomposition is a useful way of assessing structural complexity changes of organisms modeled as automata, and we show how recently developed computational tools can be used for this purpose, by computing holonomy decompositions and holonomy complexity. To gain insight into the evolution of complexity, we investigate the smoothness of the landscape structure of complexity under minimal transitions. As a proof of concept, we illustrate how the hierarchical complexity analysis reveals symmetries and irreversible structure in biological networks by applying the methods to the lac operon mechanism in the genetic regulatory network of Escherichia coli.
Analysis of vibrational-translational energy transfer using the direct simulation Monte Carlo method
NASA Technical Reports Server (NTRS)
Boyd, Iain D.
1991-01-01
A new model is proposed for energy transfer between the vibrational and translational modes for use in the direct simulation Monte Carlo method (DSMC). The model modifies the Landau-Teller theory for a harmonic oscillator and the rate transition is related to an experimental correlation for the vibrational relaxation time. Assessment of the model is made with respect to three different computations: relaxation in a heat bath, a one-dimensional shock wave, and hypersonic flow over a two-dimensional wedge. These studies verify that the model achieves detailed balance, and excellent agreement with experimental data is obtained in the shock wave calculation. The wedge flow computation reveals that the usual phenomenological method for simulating vibrational nonequilibrium in the DSMC technique predicts much higher vibrational temperatures in the wake region.
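For orientation, the continuum Landau-Teller relaxation that this DSMC model builds on can be sketched as a simple heat-bath integration; the relaxation time and temperature below are placeholders rather than the experimental correlation used in the paper, and a real DSMC implementation would act on individual particle energies rather than this ensemble average.

    # Landau-Teller relaxation of the mean vibrational energy toward its
    # equilibrium value in a constant-temperature heat bath (the first of the
    # three test cases mentioned above), integrated with explicit Euler.
    import numpy as np

    K_B = 1.380649e-23      # J/K
    THETA_V = 2256.0        # approximate vibrational characteristic temperature of O2, K

    def e_vib_eq(T):
        """Equilibrium vibrational energy per molecule (harmonic oscillator)."""
        return K_B * THETA_V / (np.exp(THETA_V / T) - 1.0)

    def relax(T_bath=5000.0, tau=1e-6, t_end=1e-5, dt=1e-8, e0=0.0):
        """Integrate dE_v/dt = (E_v_eq - E_v) / tau."""
        times = np.arange(0.0, t_end, dt)
        e = np.empty_like(times)
        e[0] = e0
        for i in range(1, len(times)):
            e[i] = e[i - 1] + dt * (e_vib_eq(T_bath) - e[i - 1]) / tau
        return times, e

    t, ev = relax()
    print(f"E_v/E_v_eq after {t[-1]:.1e} s: {ev[-1] / e_vib_eq(5000.0):.3f}")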
Wormlike Chain Theory and Bending of Short DNA
NASA Astrophysics Data System (ADS)
Mazur, Alexey K.
2007-05-01
The probability distributions for bending angles in double helical DNA obtained in all-atom molecular dynamics simulations are compared with theoretical predictions. The computed distributions agree remarkably well with the wormlike chain theory and qualitatively differ from predictions of the subelastic chain model. The computed data exhibit only small anomalies in the apparent flexibility of short DNA and cannot account for the recently reported AFM data. It is possible that the current atomistic DNA models miss some essential mechanisms of DNA bending on intermediate length scales. Analysis of bent DNA structures reveals, however, that the bending motion is structurally heterogeneous and directionally anisotropic on the length scales where the experimental anomalies were detected. These effects are essential for the interpretation of the experimental data and they can also be responsible for the apparent discrepancy.
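The wormlike-chain prediction that such simulated distributions are compared against can be written down directly; the sketch below uses the harmonic (small-angle) form of the bending-angle distribution and the exact tangent correlation <cos theta> = exp(-L/lp), with textbook parameter values rather than the paper's fitted ones.

    # Wormlike-chain bending-angle distribution in the harmonic approximation:
    # P(theta) ~ sin(theta) * exp(-lp * theta**2 / (2 L)), checked against the
    # exact WLC tangent correlation <cos theta> = exp(-L / lp).
    import numpy as np

    def wlc_angle_pdf(theta, L, lp):
        p = np.sin(theta) * np.exp(-lp * theta**2 / (2.0 * L))
        return p / np.trapz(p, theta)          # normalize numerically

    lp, L = 50.0, 5.0                          # persistence length and segment, nm
    theta = np.linspace(1e-4, np.pi, 2000)
    pdf = wlc_angle_pdf(theta, L, lp)

    mean_cos = np.trapz(np.cos(theta) * pdf, theta)
    print(f"<cos theta> from harmonic pdf : {mean_cos:.4f}")
    print(f"exact WLC  exp(-L/lp)         : {np.exp(-L / lp):.4f}")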
An Analysis of Navigation Algorithms for Smartphones Using J2ME
NASA Astrophysics Data System (ADS)
Santos, André C.; Tarrataca, Luís; Cardoso, João M. P.
Embedded systems are considered one of the most potential areas for future innovations. Two embedded fields that will most certainly take a primary role in future innovations are mobile robotics and mobile computing. Mobile robots and smartphones are growing in number and functionalities, becoming a presence in our daily life. In this paper, we study the current feasibility of a smartphone to execute navigation algorithms. As a test case, we use a smartphone to control an autonomous mobile robot. We tested three navigation problems: Mapping, Localization and Path Planning. For each of these problems, an algorithm has been chosen, developed in J2ME, and tested on the field. Results show the current mobile Java capacity for executing computationally demanding algorithms and reveal the real possibility of using smartphones for autonomous navigation.
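As an illustration of the path-planning class of algorithm benchmarked on the smartphone, a minimal grid-based A* search is sketched below in Python; this is not the authors' J2ME implementation.

    # Grid-based A* path planning with a Manhattan-distance heuristic.
    import heapq, itertools

    def astar(grid, start, goal):
        """grid: 2D list, 0 = free cell, 1 = obstacle; start/goal: (row, col)."""
        rows, cols = len(grid), len(grid[0])
        h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
        tie = itertools.count()                         # avoids comparing nodes on ties
        frontier = [(h(start), next(tie), 0, start, None)]
        came_from, best_g = {}, {start: 0}
        while frontier:
            _, _, g, node, parent = heapq.heappop(frontier)
            if node in came_from:
                continue                                # already expanded
            came_from[node] = parent
            if node == goal:                            # walk back to the start
                path = []
                while node is not None:
                    path.append(node)
                    node = came_from[node]
                return path[::-1]
            r, c = node
            for nbr in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                nr, nc = nbr
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                    ng = g + 1
                    if ng < best_g.get(nbr, float("inf")):
                        best_g[nbr] = ng
                        heapq.heappush(frontier, (ng + h(nbr), next(tie), ng, nbr, node))
        return None                                     # no path exists

    grid = [[0, 0, 0, 0],
            [1, 1, 0, 1],
            [0, 0, 0, 0]]
    print(astar(grid, (0, 0), (2, 0)))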
NASA Astrophysics Data System (ADS)
Fitrasari, Dian; Purqon, Acep
2017-07-01
Proteins play important roles in body metabolism. However, revealing hydration effects is computationally costly, especially for all-atom calculations. Coarse-grained methods are a potential solution that reduces the computational cost and makes longer timescales tractable. Furthermore, azurin is an interesting protein, potentially applicable to cancer medicine because of its stability. We investigate the effects of hydration on azurin, its conformation and its stability. Furthermore, we analyze the free energy of the conformational system to find the favorable structure using free energy perturbation (FEP) calculations. Our calculations show that the free energy value of azurin is -136.9 kJ/mol, in good agreement with experimental results, with a relative error of 0.07%.
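The free energy perturbation (FEP) estimator referred to above follows the Zwanzig relation dF = -kT ln<exp(-dU/kT)>_0; the sketch below demonstrates it on a toy harmonic system where the exact answer is known, not on the azurin/water system.

    # Zwanzig FEP estimate for perturbing a harmonic well from k0 to k1,
    # compared with the analytic free energy difference.
    import numpy as np

    kT = 2.494          # kJ/mol at ~300 K
    rng = np.random.default_rng(1)

    k0, k1 = 100.0, 120.0                       # kJ/mol/nm^2 (arbitrary)
    x = rng.normal(0.0, np.sqrt(kT / k0), 200000)   # samples from state 0
    dU = 0.5 * (k1 - k0) * x**2                 # U1(x) - U0(x)

    dF_fep = -kT * np.log(np.mean(np.exp(-dU / kT)))
    dF_exact = -0.5 * kT * np.log(k0 / k1)      # analytic result for harmonic wells
    print(f"FEP estimate : {dF_fep:.4f} kJ/mol")
    print(f"Exact        : {dF_exact:.4f} kJ/mol")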
An extraordinary transmission analogue for enhancing microwave antenna performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pushpakaran, Sarin V., E-mail: sarincrema@gmail.com; Purushothaman, Jayakrishnan M.; Chandroth, Aanandan
2015-10-15
The theory of the diffraction limit proposed by H. A. Bethe limits the total power transfer through a subwavelength hole. Researchers all over the world have explored different techniques for boosting the transmission through subwavelength holes, resulting in the Extraordinary Transmission (EOT) behavior. We examine computationally and experimentally the concept of EOT in the microwave range for enhancing the radiation performance of a stacked dipole antenna working in the S band. It is shown that the front-to-back ratio of the antenna is considerably enhanced without affecting the impedance matching performance of the design. The computational analysis, based on the Finite Difference Time Domain (FDTD) method, reveals that the excitation of Fabry-Perot resonant modes on the slots is responsible for the performance enhancement.
Phase-contrast x-ray computed tomography for biological imaging
NASA Astrophysics Data System (ADS)
Momose, Atsushi; Takeda, Tohoru; Itai, Yuji
1997-10-01
We have shown so far that 3D structures in biological soft tissues such as cancer can be revealed by phase-contrast x-ray computed tomography using an x-ray interferometer. As a next step, we aim at applications of this technique to in vivo observation, including radiographic applications. For this purpose, the field of view needs to be larger than a few centimeters. Therefore, a larger x-ray interferometer should be used with x-rays of higher energy. We have evaluated the optimal x-ray energy from the aspect of dose as a function of sample size. Moreover, the spatial resolution required of the image sensor is discussed as a function of x-ray energy and sample size, based on a requirement in the analysis of interference fringes.
Trapnell, Cole; Roberts, Adam; Goff, Loyal; Pertea, Geo; Kim, Daehwan; Kelley, David R; Pimentel, Harold; Salzberg, Steven L; Rinn, John L; Pachter, Lior
2012-01-01
Recent advances in high-throughput cDNA sequencing (RNA-seq) can reveal new genes and splice variants and quantify expression genome-wide in a single assay. The volume and complexity of data from RNA-seq experiments necessitate scalable, fast and mathematically principled analysis software. TopHat and Cufflinks are free, open-source software tools for gene discovery and comprehensive expression analysis of high-throughput mRNA sequencing (RNA-seq) data. Together, they allow biologists to identify new genes and new splice variants of known ones, as well as compare gene and transcript expression under two or more conditions. This protocol describes in detail how to use TopHat and Cufflinks to perform such analyses. It also covers several accessory tools and utilities that aid in managing data, including CummeRbund, a tool for visualizing RNA-seq analysis results. Although the procedure assumes basic informatics skills, these tools assume little to no background with RNA-seq analysis and are meant for novices and experts alike. The protocol begins with raw sequencing reads and produces a transcriptome assembly, lists of differentially expressed and regulated genes and transcripts, and publication-quality visualizations of analysis results. The protocol's execution time depends on the volume of transcriptome sequencing data and available computing resources but takes less than 1 d of computer time for typical experiments and ~1 h of hands-on time. PMID:22383036
Chai, Chunyue; Lin, Yanling; Shen, Danyu; Wu, Yuren; Li, Hongjuan; Dou, Daolong
2013-01-01
Identification of pathogen-inducible promoters largely lags behind cloning of the genes for disease resistance. Here, we cloned the soybean GmaPPO12 gene and found that it was rapidly and strongly induced by Phytophthora sojae infection. Computational analysis revealed that its promoter contained many known cis-elements, including several defense-related transcription factor binding boxes. We showed that the promoter could mediate induction of GUS expression upon infection in both transient expression assays in Nicotiana benthamiana and stable transgenic soybean hairy roots. Importantly, we demonstrated that pathogen-induced expression from the GmaPPO12 promoter was higher than that from the soybean GmaPR1a promoter. A progressive 5' and 3' deletion analysis revealed two fragments that were essential for promoter activity. Thus, the cloned promoter could be used in transgenic plants to enhance resistance to phytophthora pathogens, and the identified fragment could serve as a candidate to produce synthetic pathogen-induced promoters.
NASA Astrophysics Data System (ADS)
Chakraborty, Debdutta; Chattaraj, Pratim Kumar
2017-10-01
The possibility of functionalizing boron nitride flakes (BNFs) with some selected main group metal clusters, viz. OLi4, NLi5, CLi6, BLi7 and Al12Be, has been analyzed with the aid of density functional theory (DFT) based computations. Thermochemical as well as energetic considerations suggest that all the metal clusters interact with the BNF moiety in a favorable fashion. As a result of functionalization, the static (first) hyperpolarizability (β) values of the metal-cluster-supported BNF moieties increase quite significantly as compared to that of pristine BNF. Time-dependent DFT analysis reveals that the metal clusters can lower the transition energies associated with the dominant electronic transitions quite significantly, thereby enabling the metal-cluster-supported BNF moieties to exhibit significant non-linear optical activity. Moreover, the studied systems demonstrate broad-band absorption capability spanning the UV-visible as well as infra-red domains. Energy decomposition analysis reveals that electrostatic interactions principally stabilize the metal-cluster-supported BNF moieties.
Computers for the Faculty: How on a Limited Budget.
ERIC Educational Resources Information Center
Arman, Hal; Kostoff, John
An informal investigation of the use of computers at Delta College (DC) in Michigan revealed reasonable use of computers by faculty in disciplines such as mathematics, business, and technology, but very limited use in the humanities and social sciences. In an effort to increase faculty computer usage, DC decided to make computers available to any…
Computer Anxiety: Relationship to Math Anxiety and Holland Types.
ERIC Educational Resources Information Center
Bellando, Jayne; Winer, Jane L.
Although the number of computers in the school system is increasing, many schools are not using computers to their capacity. One reason for this may be computer anxiety on the part of the teacher. A review of the computer anxiety literature reveals little information on the subject, and findings from previous studies suggest that basic controlled…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heydari, M.H., E-mail: heydari@stu.yazd.ac.ir; The Laboratory of Quantum Information Processing, Yazd University, Yazd; Hooshmandasl, M.R., E-mail: hooshmandasl@yazd.ac.ir
Because of the nonlinearity, closed-form solutions of many important stochastic functional equations are virtually impossible to obtain. Thus, numerical solutions are a viable alternative. In this paper, a new computational method based on the generalized hat basis functions together with their stochastic operational matrix of Itô-integration is proposed for solving nonlinear stochastic Itô integral equations in large intervals. In the proposed method, a new technique for computing nonlinear terms in such problems is presented. The main advantage of the proposed method is that it transforms problems under consideration into nonlinear systems of algebraic equations which can be simply solved. Error analysis of the proposed method is investigated and the efficiency of this method is shown on some concrete examples. The obtained results reveal that the proposed method is very accurate and efficient. As two useful applications, the proposed method is applied to obtain approximate solutions of stochastic population growth models and the stochastic pendulum problem.
Gravitational Waves from Black Hole Mergers
NASA Technical Reports Server (NTRS)
Centrella, Joan
2007-01-01
The final merger of two black holes is expected to be the strongest gravitational wave source for ground-based interferometers such as LIGO, VIRGO, and GEO600, as well as the space-based interferometer LISA. Observing these sources with gravitational wave detectors requires that we know the radiation waveforms they emit. Since these mergers take place in regions of extreme gravity, we need to solve Einstein's equations of general relativity on a computer in order to calculate these waveforms. For more than 30 years, scientists have tried to compute black hole mergers using the methods of numerical relativity. The resulting computer codes have been plagued by instabilities, causing them to crash well before the black holes in the binary could complete even a single orbit. Within the past few years, however, this situation has changed dramatically, with a series of remarkable breakthroughs. This talk will focus on new simulations that are revealing the dynamics and waveforms of binary black hole mergers, and their applications in gravitational wave detection, data analysis, and astrophysics.
NASA Astrophysics Data System (ADS)
Hai, Pham Minh; Bonello, Philip
2008-12-01
The direct study of the vibration of real engine structures with nonlinear bearings, particularly aero-engines, has been severely limited by the fact that current nonlinear computational techniques are not well-suited for complex large-order systems. This paper introduces a novel implicit "impulsive receptance method" (IRM) for the time domain analysis of such structures. The IRM's computational efficiency is largely immune to the number of modes used and dependent only on the number of nonlinear elements. This means that, apart from retaining numerical accuracy, a much more physically accurate solution is achievable within a short timeframe. Simulation tests on a realistically sized representative twin-spool aero-engine showed that the new method was around 40 times faster than a conventional implicit integration scheme. Preliminary results for a given rotor unbalance distribution revealed the varying degree of journal lift, orbit size and shape at the example engine's squeeze-film damper bearings, and the effect of end-sealing at these bearings.
Neural computations underlying inverse reinforcement learning in the human brain
Pauli, Wolfgang M; Bossaerts, Peter; O'Doherty, John
2017-01-01
In inverse reinforcement learning an observer infers the reward distribution available for actions in the environment solely through observing the actions implemented by another agent. To address whether this computational process is implemented in the human brain, participants underwent fMRI while learning about slot machines yielding hidden preferred and non-preferred food outcomes with varying probabilities, through observing the repeated slot choices of agents with similar and dissimilar food preferences. Using formal model comparison, we found that participants implemented inverse RL as opposed to a simple imitation strategy, in which the actions of the other agent are copied instead of inferring the underlying reward structure of the decision problem. Our computational fMRI analysis revealed that anterior dorsomedial prefrontal cortex encoded inferences about action-values within the value space of the agent as opposed to that of the observer, demonstrating that inverse RL is an abstract cognitive process divorceable from the values and concerns of the observer him/herself. PMID:29083301
A comparative DFT study on the antioxidant activity of apigenin and scutellarein flavonoid compounds
NASA Astrophysics Data System (ADS)
Sadasivam, K.; Kumaresan, R.
2011-03-01
The potent antioxidant activity of flavonoids, relevant to their ability to scavenge reactive oxygen species, is the most important function of flavonoids. Density functional theory calculations were employed to investigate the antioxidant activity of the flavonoid compounds apigenin and scutellarein. The biological characteristics are dependent on electronic parameters describing the charge distribution on the rings of the flavonoid molecules. The computation of structural and various molecular descriptors such as polarizability, dipole moment, energy gap, homolytic O-H bond dissociation enthalpies (BDEs), ionization potential (IP), electron affinity, hardness, softness, electronegativity, electrophilicity index and density plots of molecular orbitals for neutral as well as radical species was carried out and studied. The B3LYP/6-311G(d,p) basis set was adopted for all the computations. This computation reveals that scutellarein exhibits a higher degree of antioxidant activity than apigenin. Dipole moment and polarizability analysis shows that both compounds are polar in nature and have the capacity to polarize other atoms.
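The orbital-based descriptors listed above follow standard conceptual-DFT formulas once HOMO and LUMO energies are available (Koopmans-type estimates IP ~ -E_HOMO, EA ~ -E_LUMO); the sketch below uses placeholder orbital energies, not the computed values for apigenin or scutellarein.

    # Global reactivity descriptors from HOMO/LUMO energies (values are hypothetical).
    E_HOMO, E_LUMO = -5.9, -1.7           # eV

    IP = -E_HOMO                          # ionization potential
    EA = -E_LUMO                          # electron affinity
    gap = E_LUMO - E_HOMO                 # HOMO-LUMO energy gap
    eta = (IP - EA) / 2.0                 # chemical hardness
    S = 1.0 / (2.0 * eta)                 # softness
    chi = (IP + EA) / 2.0                 # electronegativity
    omega = chi**2 / (2.0 * eta)          # electrophilicity index

    print(f"IP={IP:.2f} eV  EA={EA:.2f} eV  gap={gap:.2f} eV")
    print(f"hardness={eta:.2f} eV  softness={S:.2f} eV^-1  "
          f"electronegativity={chi:.2f} eV  electrophilicity={omega:.2f} eV")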
Fernandez, Maria E.; LaRue, Denise M.; Bartholomew, L. Kay
2012-01-01
Computer-based multimedia technologies can be used to tailor health messages, but promotoras (Spanish-speaking community health workers) rarely use these tools. Promotoras delivered health messages about colorectal cancer screening to medically underserved Latinos in South Texas using two small media formats: a “low-tech” format (flipchart and video); and a “high-tech” format consisting of a tailored, interactive computer program delivered on a tablet computer. Using qualitative methods, we observed promotora training and intervention delivery, and conducted interviews with five promotoras to compare and contrast program implementation of both formats. We discuss the ways each format aided or challenged promotoras’ intervention delivery. Findings reveal that some aspects of both formats enhanced intervention delivery by tapping into Latino health communication preferences and facilitating interpersonal communication, while other aspects hindered intervention delivery. This study contributes to our understanding of how community health workers use low- and high-tech small media formats when delivering health messages to Latinos. PMID:21986243
Binary Black Holes, Gravitational Waves, and Numerical Relativity
NASA Technical Reports Server (NTRS)
Centrella, John
2007-01-01
The final merger of two black holes is expected to be the strongest gravitational wave source for ground-based interferometers such as LIGO, VIRGO, and GEO600, as well as the space-based interferometer LISA. Observing these sources with gravitational wave detectors requires that we know the radiation waveforms they emit. Since these mergers take place in regions of extreme gravity, we need to solve Einstein's equations of general relativity on a computer in order to calculate these waveforms. For more than 30 years, scientists have tried to compute black hole mergers using the methods of numerical relativity. The resulting computer codes have been plagued by instabilities, causing them to crash well before the black holes in the binary could complete even a single orbit. Within the past few years, however, this situation has changed dramatically, with a series of remarkable breakthroughs. This talk will focus on new simulations that are revealing the dynamics and waveforms of binary black hole mergers, and their applications in gravitational wave detection, data analysis, and astrophysics.
Binary Black Holes: Mergers, Dynamics, and Waveforms
NASA Astrophysics Data System (ADS)
Centrella, Joan
2007-04-01
The final merger of two black holes is expected to be the strongest gravitational wave source for ground-based interferometers such as LIGO, VIRGO, and GEO600, as well as the space-based interferometer LISA. Observing these sources with gravitational wave detectors requires that we know the radiation waveforms they emit. Since these mergers take place in regions of extreme gravity, we need to solve Einstein's equations of general relativity on a computer in order to calculate these waveforms. For more than 30 years, scientists have tried to compute black hole mergers using the methods of numerical relativity. The resulting computer codes have been plagued by instabilities, causing them to crash well before the black holes in the binary could complete even a single orbit. Within the past few years, however, this situation has changed dramatically, with a series of remarkable breakthroughs. This talk will focus on new simulations that are revealing the dynamics and waveforms of binary black hole mergers, and their applications in gravitational wave detection, data analysis, and astrophysics.
Applying Mathematical Optimization Methods to an ACT-R Instance-Based Learning Model.
Said, Nadia; Engelhart, Michael; Kirches, Christian; Körkel, Stefan; Holt, Daniel V
2016-01-01
Computational models of cognition provide an interface to connect advanced mathematical tools and methods to empirically supported theories of behavior in psychology, cognitive science, and neuroscience. In this article, we consider a computational model of instance-based learning, implemented in the ACT-R cognitive architecture. We propose an approach for obtaining mathematical reformulations of such cognitive models that improve their computational tractability. For the well-established Sugar Factory dynamic decision making task, we conduct a simulation study to analyze central model parameters. We show how mathematical optimization techniques can be applied to efficiently identify optimal parameter values with respect to different optimization goals. Beyond these methodological contributions, our analysis reveals the sensitivity of this particular task with respect to initial settings and yields new insights into how average human performance deviates from potential optimal performance. We conclude by discussing possible extensions of our approach as well as future steps towards applying more powerful derivative-based optimization methods.
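The Sugar Factory task mentioned above is commonly formulated as the simple noisy recurrence p_t = 2*w_t - p_{t-1} + noise; the sketch below simulates that common formulation with an analytic control policy as a performance reference. The exact constants and scoring rule used in the paper may differ.

    # Sugar Factory dynamic decision task (common formulation): workforce and
    # production are both on a 1-12 scale; the controller tries to keep
    # production near a target despite the noisy plant dynamics.
    import random

    def run_trial(policy, target=9, n_steps=40, seed=0):
        rng = random.Random(seed)
        p = rng.randint(1, 12)                 # initial production
        on_target = 0
        for _ in range(n_steps):
            w = policy(p, target)              # choose workforce 1..12
            p = 2 * w - p + rng.choice((-1, 0, 1))
            p = max(1, min(12, p))             # production stays in bounds
            on_target += abs(p - target) <= 1  # common scoring criterion
        return on_target

    # A simple analytic policy: invert the noise-free rule, w = (target + p) / 2
    analytic = lambda p, target: max(1, min(12, round((target + p) / 2)))
    print("steps on target:", run_trial(analytic))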
Ong, Eng Teo; Lee, Heow Pueh; Lim, Kian Meng
2004-09-01
This article presents a fast algorithm for the efficient solution of the Helmholtz equation. The method is based on the translation theory of the multipole expansions. Here, the speedup comes from the convolution nature of the translation operators, which can be evaluated rapidly using fast Fourier transform algorithms. Also, the computations of the translation operators are accelerated by using the recursive formulas developed recently by Gumerov and Duraiswami [SIAM J. Sci. Comput. 25, 1344-1381 (2003)]. It is demonstrated that the algorithm can produce good accuracy with a relatively low order of expansion. Efficiency analyses of the algorithm reveal that it has computational complexities of O(N^a), where a ranges from 1.05 to 1.24. However, this method requires substantially more memory to store the translation operators as compared to the fast multipole method. Hence, despite its simplicity in implementation, this memory requirement issue may limit the application of this algorithm to solving very large-scale problems.
Teachers' Organization of Participation Structures for Teaching Science with Computer Technology
NASA Astrophysics Data System (ADS)
Subramaniam, Karthigeyan
2016-08-01
This paper describes a qualitative study that investigated the nature of participation structures and how the participation structures were organized by four science teachers when they constructed and communicated science content in their classrooms with computer technology. Participation structures focus on the activity structures and processes in social settings like classrooms, thereby providing glimpses into the complex dynamics of teacher-student interactions, configurations, and conventions during collective meaning making and knowledge creation. Data included observations, interviews, and focus group interviews. Analysis revealed that the dominant participation structure evident within participants' instruction with computer technology was the (Teacher) initiation-(Student and Teacher) response sequences-(Teacher) evaluate participation structure. Three key events characterized how participants organized this participation structure in their classrooms: setting the stage for interactive instruction, the joint activity, and maintaining accountability. Implications include the following: (1) teacher educators need to tap into the knowledge base that underscores science teachers' learning-to-teach philosophies when computer technology is used in instruction. (2) Teacher educators need to emphasize the essential idea that learning and cognition are not situated within the computer technology but within the pedagogical practices, specifically the participation structures. (3) The pedagogical practices developed with the integration or use of computer technology, underscored by the teachers' own knowledge of classroom contexts and curriculum, need to be the focus for how students learn science content with computer technology, instead of focusing solely on how computer technology supports students' learning of science content.
Kurokawa, Yoshika; Sone, Hideko; Win-Shwe, Tin-Tin; Zeng, Yang; Kimura, Hiroyuki; Koyama, Yosuke; Yagi, Yusuke; Matsui, Yasuto; Yamazaki, Masashi; Hirano, Seishiro
2017-01-01
Dendrimers have been expected to serve as excellent nanodevices for brain medication. An amine-terminated polyamidoamine dendrimer (PD), an unmodified plain type of PD, has the obvious disadvantage of cytotoxicity, but still serves as an attractive molecule because it easily adheres to the cell surface, facilitating easy cellular uptake. Single-photon emission computed tomographic imaging of a mouse following intravenous injection of a radiolabeled PD failed to reveal any signal in the intracranial region. Furthermore, examination of the permeability of PD particles across the blood–brain barrier (BBB) in vitro using a commercially available kit revealed poor permeability of the nanoparticles, which was suppressed by an inhibitor of caveolae-mediated endocytosis, but not by an inhibitor of macropinocytosis. Physicochemical analysis of the PD revealed that cationic PDs are likely to aggregate promptly upon mixing with body fluids and that this prompt aggregation is probably driven by non-Derjaguin–Landau–Verwey–Overbeek attractive forces originating from the surrounding divalent ions. Atomic force microscopy observation of a freshly cleaved mica plate soaked in dendrimer suspension (culture media) confirmed prompt aggregation. Our study revealed poor transfer of intravenously administered cationic PDs into the intracranial nervous tissue, and the results of our analysis suggested that this was largely attributable to the reduced BBB permeability arising from the propensity of the particles to promptly aggregate upon mixing with body fluids. PMID:28579780
Nonstimulated rabbit phonation model: Cricothyroid approximation.
Novaleski, Carolyn K; Kojima, Tsuyoshi; Chang, Siyuan; Luo, Haoxiang; Valenzuela, Carla V; Rousseau, Bernard
2016-07-01
To describe a nonstimulated in vivo rabbit phonation model using an Isshiki type IV thyroplasty and uninterrupted humidified glottal airflow to produce sustained audible phonation. Prospective animal study. Six New Zealand white breeder rabbits underwent a surgical procedure involving an Isshiki type IV thyroplasty and continuous airflow delivered to the glottis. Phonatory parameters were examined using high-speed laryngeal imaging and acoustic and aerodynamic analysis. Following the procedure, airflow was discontinued, and sutures remained in place to maintain the phonatory glottal configuration for microimaging using a 9.4 Tesla imaging system. High-speed laryngeal imaging revealed sustained vocal fold oscillation throughout the experimental procedure. Analysis of acoustic signals revealed a mean vocal intensity of 61 dB and fundamental frequency of 590 Hz. Aerodynamic analysis revealed a mean airflow rate of 85.91 mL/s and subglottal pressure of 9 cm H2O. Following the procedure, microimaging revealed that the in vivo phonatory glottal configuration was maintained, providing consistency between the experimental and postexperimental laryngeal geometry. The latter provides a significant milestone that is necessary for geometric reconstruction and to allow for validation of computational simulations against the in vivo rabbit preparation. We demonstrate a nonstimulated in vivo phonation preparation using an Isshiki type IV thyroplasty and continuous humidified glottal airflow in a rabbit animal model. This preparation elicits sustained vocal fold vibration and phonatory measures that are consistent with our laboratory's prior work using direct neuromuscular stimulation for evoked phonation. Laryngoscope, 126:1589-1594, 2016. © 2016 The American Laryngological, Rhinological and Otological Society, Inc.
NASA Astrophysics Data System (ADS)
Ilayaraja, Renganathan; Rajkumar, Ramalingam; Rajesh, Durairaj; Muralidharan, Arumugam Ramachandran; Padmanabhan, Parasuraman; Archunan, Govindaraju
2014-06-01
Chemosignals play a crucial role in social and sexual communication, both between and within species. Chemical cues are bound to a protein that is present in pheromones irrespective of sex, commonly called pheromone-binding protein (PBP). In rats, the pheromone compounds are bound by the low-molecular-weight lipocalin protein α2u-globulin (α2u). We reported that farnesol is a natural endogenous ligand (compound) present in the rat preputial gland as a bound volatile compound. In the present study, an attempt has been made, through computational methods, to evaluate the binding efficiency of α2u with the natural ligand (farnesol) and the standard fluorescent molecule (2-naphthol). The docking analysis revealed that the binding energies of farnesol and 2-naphthol were almost equal and that the two ligands likely share a common binding pocket of the protein. Further, to extrapolate the results generated through the computational approach, the α2u protein was purified and subjected to fluorescence titration and a binding assay. The results showed that farnesol is displaced by 2-naphthol, with the high hydrophobicity of TYR120 in the binding site of α2u providing an acceptable dissociation constant, indicating the binding efficiency of α2u. The obtained results corroborate the data generated through the computational approach.
Woo, Kevin L; Rieucau, Guillaume
2008-07-01
The increasing use of the video playback technique in behavioural ecology reveals a growing need to ensure better control of the visual stimuli that focal animals experience. Technological advances now allow researchers to develop computer-generated animations instead of using video sequences of live-acting demonstrators. However, care must be taken to match the motion characteristics (speed and velocity) of the animation to the original video source. Here, we present a tool based on an optic flow analysis program to measure how closely the motion characteristics of computer-generated animations resemble those of videos of live-acting animals. We examined three distinct displays (tail-flick (TF), push-up body rock (PUBR), and slow arm wave (SAW)) exhibited by animations of Jacky dragons (Amphibolurus muricatus) that were compared to the original video sequences of live lizards. We found no significant differences between the motion characteristics of videos and animations across all three displays. Our results showed that our animations are similar to the original videos in the speed and velocity features of each display. Researchers need to ensure that animation and video stimuli have similar motion characteristics, and this feature is a critical component in the future success of the video playback technique.
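A rough sketch of this kind of optic-flow comparison, using dense Farneback optical flow (OpenCV) to summarize per-frame motion speed for a video clip and its animation; the file names are placeholders and this is not the authors' analysis program.

    # Mean optical-flow speed per frame pair, so that a live-action video and
    # its computer-generated animation can be compared on the same statistic.
    import cv2
    import numpy as np

    def mean_flow_speed(path):
        cap = cv2.VideoCapture(path)
        ok, prev = cap.read()
        prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
        speeds = []
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            flow = cv2.calcOpticalFlowFarneback(prev, gray, None,
                                                0.5, 3, 15, 3, 5, 1.2, 0)
            speeds.append(np.linalg.norm(flow, axis=2).mean())  # pixels/frame
            prev = gray
        cap.release()
        return np.array(speeds)

    video = mean_flow_speed("lizard_display_video.avi")          # placeholder file
    animation = mean_flow_speed("lizard_display_animation.avi")  # placeholder file
    print("mean speed (video, animation):", video.mean(), animation.mean())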
Neutron observables from inclusive lepton scattering on nuclei
NASA Astrophysics Data System (ADS)
Rinat, A. S.; Taragin, M. F.
2010-07-01
We analyze new data from the Thomas Jefferson National Accelerator Facility (JLab) for inclusive electron scattering on various targets. Computed and measured total inclusive cross sections in the range 0.3≲x≲0.95 show reasonable agreement on a logarithmic scale for all targets. However, closer inspection of the quasielastic components reveals serious discrepancies. European Muon Collaboration (EMC) ratios with conceivably smaller systematic errors fare the same. As a consequence, the new data do not enable the extraction of the magnetic form factor GM^n and the structure function F2^n of the neutron, although the application of exactly the same analysis to older data had been successful. We incorporate in the above analysis older CLAS Collaboration data on F2 of 2H. Removal of some scattered points from those makes it appear possible to obtain the desired neutron information. We compare our results with others from alternative sources. Special attention is paid to the A=3 isodoublet cross sections and EMC ratios. Present data exist only for 3He, but the available input in combination with charge symmetry enables computations for 3H. Their average is the computed isoscalar part and is compared with the empirical modification of 3He EMC ratios toward a fictitious A=3 isosinglet.
Computational-based structural, functional and phylogenetic analysis of Enterobacter phytases.
Pramanik, Krishnendu; Kundu, Shreyasi; Banerjee, Sandipan; Ghosh, Pallab Kumar; Maiti, Tushar Kanti
2018-06-01
Myo-inositol hexakisphosphate phosphohydrolases (i.e., phytases) are known to be very important enzymes responsible for the solubilization of insoluble phosphates. In the present study, Enterobacter phytases have been characterized by different phylogenetic, structural and functional parameters using standard bio-computational tools. Results showed that the majority of the Enterobacter phytases are acidic in nature, as most of the isoelectric points were under 7.0. The aliphatic indices predicted for the selected proteins were below 40, indicating their thermostable nature. The average molecular weight of the proteins was 48 kDa. The low GRAVY values of the said proteins implied that they have good interactions with water. Secondary structure prediction revealed that the alpha-helical content was the highest among the other forms such as sheets, coils, etc. Moreover, the predicted 3D structures of the Enterobacter phytases divulged that the proteins consist of four monomeric polypeptide chains, i.e., they are tetrameric proteins. The predicted tertiary model of E. aerogenes (A0A0M3HCJ2) was deposited in the Protein Model Database (Acc. No.: PM0080561) for further utilization after a thorough quality check with the QMEAN and SAVES servers. Functional analysis supported their classification as histidine acid phosphatases. Besides, multiple sequence alignment revealed that "DG-DP-LG" was the most highly conserved motif within the Enterobacter phytases. Thus, the present study will be useful in selecting a suitable phytase-producing microbe exclusively for use in the animal food industry as a food additive.
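Several of the sequence-level descriptors reported above (molecular weight, pI, GRAVY, secondary-structure fractions, aliphatic index) can be reproduced with Biopython's ProtParam module, as sketched below on a short dummy peptide rather than an Enterobacter phytase; the aliphatic index is computed by hand with Ikai's formula, since ProtParam does not provide it directly.

    # Protein sequence descriptors with Biopython; the sequence is a dummy peptide.
    from Bio.SeqUtils.ProtParam import ProteinAnalysis

    seq = "MKKTAIAIAVALAGFATVAQAAPKDNTWYTGAKLGWSQYHDTGFINNNGPTHENQLGAGAF"
    pa = ProteinAnalysis(seq)

    aa_pct = {k: v * 100 for k, v in pa.get_amino_acids_percent().items()}
    aliphatic_index = (aa_pct["A"] + 2.9 * aa_pct["V"]
                       + 3.9 * (aa_pct["I"] + aa_pct["L"]))   # Ikai (1980) formula

    helix, turn, sheet = pa.secondary_structure_fraction()
    print(f"MW        : {pa.molecular_weight()/1000:.1f} kDa")
    print(f"pI        : {pa.isoelectric_point():.2f}")
    print(f"GRAVY     : {pa.gravy():.3f}")
    print(f"Aliphatic : {aliphatic_index:.1f}")
    print(f"helix/turn/sheet fractions: {helix:.2f}/{turn:.2f}/{sheet:.2f}")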
Identification of miRNA-mRNA regulatory modules by exploring collective group relationships.
Masud Karim, S M; Liu, Lin; Le, Thuc Duy; Li, Jiuyong
2016-01-11
microRNAs (miRNAs) play an essential role in the post-transcriptional gene regulation in plants and animals. They regulate a wide range of biological processes by targeting messenger RNAs (mRNAs). Evidence suggests that miRNAs and mRNAs interact collectively in gene regulatory networks. The collective relationships between groups of miRNAs and groups of mRNAs may be more readily interpreted than those between individual miRNAs and mRNAs, and thus are useful for gaining insight into gene regulation and cell functions. Several computational approaches have been developed to discover miRNA-mRNA regulatory modules (MMRMs) with a common aim to elucidate miRNA-mRNA regulatory relationships. However, most existing methods do not consider the collective relationships between a group of miRNAs and the group of targeted mRNAs in the process of discovering MMRMs. Our aim is to develop a framework to discover MMRMs and reveal miRNA-mRNA regulatory relationships from the heterogeneous expression data based on the collective relationships. We propose DIscovering COllective group RElationships (DICORE), an effective computational framework for revealing miRNA-mRNA regulatory relationships. We utilize the notion of collective group relationships to build the computational framework. The method computes the collaboration scores of the miRNAs and mRNAs on the basis of their interactions with mRNAs and miRNAs, respectively. Then it determines the groups of miRNAs and groups of mRNAs separately based on their respective collaboration scores. Next, it calculates the strength of the collective relationship between each pair of miRNA group and mRNA group using canonical correlation analysis, and the group pairs with significant canonical correlations are considered as the MMRMs. We applied this method to three gene expression datasets, and validated the computational discoveries. Analysis of the results demonstrates that a large portion of the regulatory relationships discovered by DICORE is consistent with the experimentally confirmed databases. Furthermore, it is observed that the top mRNAs that are regulated by the miRNAs in the identified MMRMs are highly relevant to the biological conditions of the given datasets. It is also shown that the MMRMs identified by DICORE are more biologically significant and functionally enriched.
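The final scoring step described above, measuring the strength of a miRNA group/mRNA group pair as a canonical correlation, can be sketched with scikit-learn's CCA on synthetic expression matrices; the group membership, data and sizes here are invented for illustration and do not reproduce the DICORE pipeline.

    # First canonical correlation between a miRNA group and an mRNA group
    # (samples x features matrices), using synthetic data with a shared signal.
    import numpy as np
    from sklearn.cross_decomposition import CCA

    rng = np.random.default_rng(0)
    n_samples = 60
    latent = rng.standard_normal(n_samples)                 # shared regulatory signal

    mirna_group = np.column_stack([latent + 0.5 * rng.standard_normal(n_samples)
                                   for _ in range(4)])      # 4 miRNAs
    mrna_group = np.column_stack([-latent + 0.5 * rng.standard_normal(n_samples)
                                  for _ in range(10)])      # 10 target mRNAs

    cca = CCA(n_components=1)
    u, v = cca.fit_transform(mirna_group, mrna_group)
    canon_corr = np.corrcoef(u[:, 0], v[:, 0])[0, 1]
    print(f"first canonical correlation: {canon_corr:.3f}")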
Metal–Metal Bonding in Uranium–Group 10 Complexes
2016-01-01
Heterobimetallic complexes containing short uranium–group 10 metal bonds have been prepared from monometallic IUIV(OArP-κ2O,P)3 (2) {[ArPO]− = 2-tert-butyl-4-methyl-6-(diphenylphosphino)phenolate}. The U–M bond in IUIV(μ-OArP-1κ1O,2κ1P)3M0, M = Ni (3–Ni), Pd (3–Pd), and Pt (3–Pt), has been investigated by experimental and DFT computational methods. Comparisons of 3–Ni with two further U–Ni complexes XUIV(μ-OArP-1κ1O,2κ1P)3Ni0, X = Me3SiO (4) and F (5), was also possible via iodide substitution. All complexes were characterized by variable-temperature NMR spectroscopy, electrochemistry, and single crystal X-ray diffraction. The U–M bonds are significantly shorter than any other crystallographically characterized d–f-block bimetallic, even though the ligand flexes to allow a variable U–M separation. Excellent agreement is found between the experimental and computed structures for 3–Ni and 3–Pd. Natural population analysis and natural localized molecular orbital (NLMO) compositions indicate that U employs both 5f and 6d orbitals in covalent bonding to a significant extent. Quantum theory of atoms-in-molecules analysis reveals U–M bond critical point properties typical of metallic bonding and a larger delocalization index (bond order) for the less polar U–Ni bond than U–Pd. Electrochemical studies agree with the computational analyses and the X-ray structural data for the U–X adducts 3–Ni, 4, and 5. The data show a trend in uranium–metal bond strength that decreases from 3–Ni down to 3–Pt and suggest that exchanging the iodide for a fluoride strengthens the metal–metal bond. Despite short U–TM (transition metal) distances, four other computational approaches also suggest low U–TM bond orders, reflecting highly transition metal localized valence NLMOs. These are more so for 3–Pd than 3–Ni, consistent with slightly larger U–TM bond orders in the latter. Computational studies of the model systems (PH3)3MU(OH)3I (M = Ni, Pd) reveal longer and weaker unsupported U–TM bonds vs 3. PMID:26942560
Estimation of surface curvature from full-field shape data using principal component analysis
NASA Astrophysics Data System (ADS)
Sharma, Sameer; Vinuchakravarthy, S.; Subramanian, S. J.
2017-01-01
Three-dimensional digital image correlation (3D-DIC) is a popular image-based experimental technique for estimating surface shape, displacements and strains of deforming objects. In this technique, a calibrated stereo rig is used to obtain and stereo-match pairs of images of the object of interest from which the shapes of the imaged surface are then computed using the calibration parameters of the rig. Displacements are obtained by performing an additional temporal correlation of the shapes obtained at various stages of deformation and strains by smoothing and numerically differentiating the displacement data. Since strains are of primary importance in solid mechanics, significant efforts have been put into computation of strains from the measured displacement fields; however, much less attention has been paid to date to computation of curvature from the measured 3D surfaces. In this work, we address this gap by proposing a new method of computing curvature from full-field shape measurements using principal component analysis (PCA) along the lines of a similar work recently proposed to measure strains (Grama and Subramanian 2014 Exp. Mech. 54 913-33). PCA is a multivariate analysis tool that is widely used to reveal relationships between a large number of variables, reduce dimensionality and achieve significant denoising. This technique is applied here to identify dominant principal components in the shape fields measured by 3D-DIC and these principal components are then differentiated systematically to obtain the first and second fundamental forms used in the curvature calculation. The proposed method is first verified using synthetically generated noisy surfaces and then validated experimentally on some real world objects with known ground-truth curvatures.
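A hedged sketch of the overall idea: the measured height field is low-rank filtered (a truncated SVD stands in here for the paper's PCA step), and curvature then follows from the first and second fundamental forms of the Monge patch z = f(x, y). A clean synthetic cylindrical surface is used so the output can be checked against K = 0 and |H| = 1/(2R); real 3D-DIC fields are noisy, which is exactly where the PCA smoothing matters.

    # Curvature of a measured height field via fundamental forms, after a
    # truncated-SVD (PCA-like) low-rank filtering step.
    import numpy as np

    R, n = 10.0, 200
    x = np.linspace(-3.0, 3.0, n)
    y = np.linspace(-3.0, 3.0, n)
    X, Y = np.meshgrid(x, y)
    Z = np.sqrt(R**2 - X**2)              # cylindrical cap, independent of y

    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    k = 5                                 # keep the leading components only
    Z_d = (U[:, :k] * s[:k]) @ Vt[:k, :]

    dy, dx = y[1] - y[0], x[1] - x[0]
    fy, fx = np.gradient(Z_d, dy, dx)     # first derivatives (axis 0 = y, axis 1 = x)
    fxy, fxx = np.gradient(fx, dy, dx)    # second derivatives
    fyy, _ = np.gradient(fy, dy, dx)

    W2 = 1.0 + fx**2 + fy**2
    K = (fxx * fyy - fxy**2) / W2**2                                   # Gaussian curvature
    H = ((1 + fx**2) * fyy - 2 * fx * fy * fxy + (1 + fy**2) * fxx) / (2 * W2**1.5)

    c = slice(10, -10)                    # drop boundary points of np.gradient
    print(f"median K  : {np.median(K[c, c]):+.5f}  (cylinder: 0)")
    print(f"median |H|: {np.median(np.abs(H[c, c])):.5f}  (1/(2R) = {1/(2*R):.5f})")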
Findings from an Organizational Network Analysis to Support Local Public Health Management
Caldwell, Michael; Rockoff, Maxine L.; Gebbie, Kristine; Carley, Kathleen M.; Bakken, Suzanne
2008-01-01
We assessed the feasibility of using organizational network analysis in a local public health organization. The research setting was an urban/suburban county health department with 156 employees. The goal of the research was to study communication and information flow in the department and to assess the technique for public health management. Network data were derived from survey questionnaires. Computational analysis was performed with the Organizational Risk Analyzer. Analysis revealed centralized communication, limited interdependencies, potential knowledge loss through retirement, and possible informational silos. The findings suggested opportunities for more cross-program coordination but also suggested the presence of potentially efficient communication paths and potentially beneficial social connectedness. Managers found the findings useful to support decision making. Public health organizations must be effective in an increasingly complex environment. Network analysis can help build public health capacity for complex system management. PMID:18481183
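The study's computations were done with the Organizational Risk Analyzer; as a rough, generic illustration of the same kind of reasoning, the sketch below builds a hypothetical communication network (the names and edges are invented) and flags highly central receivers and small disconnected groups that could indicate informational silos.

```python
import networkx as nx

# Hypothetical directed "who sends work information to whom" network from survey responses.
G = nx.DiGraph()
G.add_edges_from([
    ("director", "epi_lead"), ("director", "nursing_lead"),
    ("epi_lead", "analyst_1"), ("nursing_lead", "clinic_staff_1"),
    ("clinic_staff_1", "clinic_staff_2"),
    ("lab_tech_1", "lab_tech_2"),                     # a pair with no ties to the rest
])

centrality = nx.in_degree_centrality(G)               # who receives the most information
top_receivers = sorted(centrality, key=centrality.get, reverse=True)[:3]

# Small weakly connected components disconnected from the main department hint at silos.
silos = [c for c in nx.weakly_connected_components(G) if len(c) < 3]
print(top_receivers, silos)
```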
Recent advances in ChIP-seq analysis: from quality management to whole-genome annotation.
Nakato, Ryuichiro; Shirahige, Katsuhiko
2017-03-01
Chromatin immunoprecipitation followed by sequencing (ChIP-seq) analysis can detect protein/DNA-binding and histone-modification sites across an entire genome. Recent advances in sequencing technologies and analyses enable us to compare hundreds of samples simultaneously; such large-scale analysis has potential to reveal the high-dimensional interrelationship level for regulatory elements and annotate novel functional genomic regions de novo. Because many experimental considerations are relevant to the choice of a method in a ChIP-seq analysis, the overall design and quality management of the experiment are of critical importance. This review offers guiding principles of computation and sample preparation for ChIP-seq analyses, highlighting the validity and limitations of the state-of-the-art procedures at each step. We also discuss the latest challenges of single-cell analysis that will encourage a new era in this field. © The Author 2016. Published by Oxford University Press.
Inferring Group Processes from Computer-Mediated Affective Text Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schryver, Jack C; Begoli, Edmon; Jose, Ajith
2011-02-01
Political communications in the form of unstructured text convey rich connotative meaning that can reveal underlying group social processes. Previous research has focused on sentiment analysis at the document level, but we extend this analysis to sub-document levels through a detailed analysis of affective relationships between entities extracted from a document. Instead of pure sentiment analysis, which is just positive or negative, we explore nuances of affective meaning in 22 affect categories. Our affect propagation algorithm automatically calculates and displays extracted affective relationships among entities in graphical form in our prototype (TEAMSTER), starting with seed lists of affect terms. Several useful metrics are defined to infer underlying group processes by aggregating affective relationships discovered in a text. Our approach has been validated with annotated documents from the MPQA corpus, achieving a performance gain of 74% over comparable random guessers.
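The TEAMSTER prototype itself is not described in enough detail here to reproduce; the following simplified stand-in conveys the flavor of the approach: seed lists of affect terms (two invented categories out of the 22) are matched against sentences, and hits are aggregated into scores for each pair of entities co-occurring in the sentence.

```python
from collections import defaultdict

# Illustrative seed lists for two affect categories (the real system uses 22 categories).
SEEDS = {"hostility": {"attack", "condemn", "threaten"},
         "affection": {"praise", "support", "welcome"}}

def affect_relationships(sentences, entities):
    """Aggregate seed-term hits into per-category scores for every entity pair
    that co-occurs in a sentence (a toy stand-in for affect propagation)."""
    scores = defaultdict(lambda: defaultdict(int))
    for sent in sentences:
        lowered = sent.lower()
        words = set(lowered.split())
        present = [e for e in entities if e.lower() in lowered]
        for cat, seeds in SEEDS.items():
            hits = len(words & seeds)
            if hits:
                for i, a in enumerate(present):
                    for b in present[i + 1:]:
                        scores[(a, b)][cat] += hits
    return scores

docs = ["Party A will condemn Party B", "Party A will praise Party C"]
result = affect_relationships(docs, ["Party A", "Party B", "Party C"])
print({pair: dict(cats) for pair, cats in result.items()})
```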
A review of intelligent systems for heart sound signal analysis.
Nabih-Ali, Mohammed; El-Dahshan, El-Sayed A; Yahia, Ashraf S
2017-10-01
Intelligent computer-aided diagnosis (CAD) systems can enhance the diagnostic capabilities of physicians and reduce the time required for accurate diagnosis. CAD systems could provide physicians with a suggestion about the diagnosis of heart diseases. The objective of this paper is to review recently published preprocessing, feature extraction and classification techniques and the state of the art of phonocardiogram (PCG) signal analysis. Published literature reviewed in this paper shows the potential of machine learning techniques as a design tool in PCG CAD systems and reveals that the CAD systems for PCG signal analysis are still an open problem. Related studies are compared with respect to their datasets, feature extraction techniques and the classifiers they used. Current achievements and limitations in developing CAD systems for PCG signal analysis using machine learning techniques are presented and discussed. In the light of this review, a number of future research directions for PCG signal analysis are provided.
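As a hedged, generic illustration of the preprocessing, feature extraction, and classification pipeline such CAD systems follow (not any specific reviewed system), the sketch below band-pass filters a phonocardiogram, extracts a few toy time-domain features, and trains a classifier on synthetic labels; the sampling rate, band edges, and feature set are assumptions.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt
from sklearn.ensemble import RandomForestClassifier

def preprocess(pcg, fs=2000):
    """Band-pass filter a phonocardiogram to the 25-400 Hz band where heart sounds live."""
    sos = butter(4, [25, 400], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, pcg)

def extract_features(pcg):
    """Toy time-domain features; published systems typically add wavelet or MFCC features."""
    return [np.mean(np.abs(pcg)), np.std(pcg), np.max(np.abs(pcg)),
            np.mean(np.abs(np.diff(pcg)))]

# Hypothetical dataset: raw PCG segments with synthetic normal/abnormal labels.
rng = np.random.default_rng(0)
X = np.array([extract_features(preprocess(rng.standard_normal(4000))) for _ in range(40)])
y = rng.integers(0, 2, size=40)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
```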
Analysis of Android Device-Based Solutions for Fall Detection
Casilari, Eduardo; Luque, Rafael; Morón, María-José
2015-01-01
Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation of automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals of the literature taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions. PMID:26213928
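A common baseline among the reviewed smartphone systems is threshold-based detection on the accelerometer magnitude; the sketch below (plain Python rather than Android code, with invented thresholds and window length) flags a free-fall-like dip followed shortly by a large impact.

```python
import math

def detect_fall(samples, impact_g=2.5, free_fall_g=0.5):
    """Naive threshold detector over (ax, ay, az) accelerometer samples in g.
    A free-fall-like dip followed shortly by a large impact is flagged as a fall."""
    mags = [math.sqrt(ax**2 + ay**2 + az**2) for ax, ay, az in samples]
    for i, m in enumerate(mags):
        if m < free_fall_g and any(v > impact_g for v in mags[i:i + 50]):
            return True
    return False

# Simulated trace: steady 1 g, brief free fall, then an impact spike.
trace = [(0, 0, 1.0)] * 100 + [(0, 0, 0.2)] * 10 + [(0, 0, 3.5)] + [(0, 0, 1.0)] * 50
print(detect_fall(trace))  # True
```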
Time Analysis of Building Dynamic Response Under Seismic Action. Part 2: Example of Calculation
NASA Astrophysics Data System (ADS)
Ufimtcev, E. M.
2017-11-01
The second part of the article illustrates the use of the time analysis method (TAM) by the example of the calculation of a 3-storey building, the design dynamic model (DDM) of which is adopted in the form of a flat vertical cantilever rod with 3 horizontal degrees of freedom associated with the floor and roof levels. The parameters of natural oscillations (frequencies and modes) are presented, together with the results of the calculation of the elastic forced oscillations of the building's DDM: oscillograms of the response parameters on the time interval t ∈ [0; 131.25] sec. The obtained results are analyzed on the basis of the computed values of the discrepancy of the DDM motion equation and a comparison with results calculated using the numerical approach (FEM) and the normative method set out in SP 14.13330.2014 "Construction in Seismic Regions". The analysis testifies to the correctness of the computational model as well as the high accuracy of the results obtained. In conclusion, it is noted that the use of the TAM in design will improve the strength of buildings and structures subjected to seismic action.
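For readers unfamiliar with the underlying computation, the natural frequencies and modes of such a 3-degree-of-freedom cantilever model come from the generalized eigenvalue problem K·phi = omega^2·M·phi; the sketch below solves it for an illustrative shear-building model whose mass and stiffness values are invented, not taken from the article.

```python
import numpy as np
from scipy.linalg import eigh

# Hypothetical 3-DOF shear-building model (one lateral DOF per floor); values are illustrative.
m = 3.0e5                                   # floor mass, kg
k = 2.0e8                                   # storey stiffness, N/m
M = np.diag([m, m, m])
K = k * np.array([[ 2, -1,  0],
                  [-1,  2, -1],
                  [ 0, -1,  1]])

# Generalized eigenvalue problem K phi = omega^2 M phi
omega2, modes = eigh(K, M)
frequencies_hz = np.sqrt(omega2) / (2 * np.pi)
print(frequencies_hz)                        # natural frequencies of the three modes
```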
Richardson, Ruth
2016-03-01
The first edition of Anatomy Descriptive and Surgical (1858) was greeted with accolades, but also provoked serious controversy concerning Henry Gray's failure to acknowledge the work of earlier anatomists. A review in the Medical Times (1859) accused Gray of intellectual theft. The journal took the unusual step of substantiating its indictment by publishing twenty parallel texts from Gray and from a pre-existing textbook, Quain's Anatomy. At the recent "Vesalius Continuum" conference in Zakynthos, Greece (2014) Professor Brion Benninger disputed the theft by announcing from the floor the results of a computer analysis of both texts, which he reported exonerated Gray by revealing no evidence of plagiarism. The analysis has not been forthcoming, however, despite requests. Here the historian of Gray's Anatomy supplements the argument set out in the Medical Times 150 years ago with data suggesting unwelcome personality traits in Henry Gray, and demonstrating the utility of others' work to his professional advancement. Fair dealing in the world of anatomy and indeed the genuineness of the lustre of medical fame are important matters, but whether quantitative evidence has anything to add to the discussion concerning Gray's probity can be assessed only if Benninger makes public his computer analysis. © 2015 Wiley Periodicals, Inc.
Matrix Isolation Spectroscopy and Photochemistry of Triplet 1,3-DIMETHYLPROPYNYLIDENE (MeC3Me)
NASA Astrophysics Data System (ADS)
Knezz, Stephanie N.; Waltz, Terese A.; Haenni, Benjamin C.; Burrmann, Nicola J.; McMahon, Robert J.
2015-06-01
Acetylenic carbenes and conjugated carbon chain molecules of the HCnH family are relevant to the study of combustion and chemistry in the interstellar medium (ISM). Propynylidene (HC3H) has been thoroughly studied and its structure and photochemistry determined. Here, we produce triplet diradical 1,3-dimethylpropynylidene (MeC3Me) photochemically from a precursor diazo compound in a cryogenic matrix (N2 or Ar) at 10 K, and spectroscopic analysis is carried out. The infrared, electronic absorption, and electron paramagnetic resonance spectra were examined in light of the parent (HC3H) system to ascertain the effect of alkyl substituents on delocalized carbon chains of this type. Computational analysis, EPR, and infrared analysis indicate a triplet ground state with a quasilinear structure. Infrared experiments reveal photochemical reaction to penten-3-yne upon UV irradiation. Further experimental and computational results pertaining to the structure and photochemistry will be presented. Seburg, R. A.; Patterson, E. V.; McMahon, R. J., Structure of Triplet Propynylidene (HCCCH) as Probed by IR, UV/vis, and EPR Spectroscopy of Isotopomers. Journal of the American Chemical Society 2009, 131 (26), 9442-9455.
Detection of Organophosphorus Pesticides with Colorimetry and Computer Image Analysis.
Li, Yanjie; Hou, Changjun; Lei, Jincan; Deng, Bo; Huang, Jing; Yang, Mei
2016-01-01
Organophosphorus pesticides (OPs) represent a very important class of pesticides that are widely used in agriculture because of their relatively high performance and moderate environmental persistence; hence, the sensitive and specific detection of OPs is highly significant. Based on the inhibitory effect of inhibitors, including OPs and carbamates, on acetylcholinesterase (AChE), a colorimetric analysis was used for detection of OPs with computer image analysis of color density in CMYK (cyan, magenta, yellow and black) color space and non-linear modeling. The results showed that the yellow intensity weakened gradually as the concentration of dichlorvos increased. The quantitative analysis of dichlorvos was achieved by Artificial Neural Network (ANN) modeling, and the results showed that the established model had good predictive ability on both the training and prediction sets. Real cabbage samples containing dichlorvos were analyzed by both colorimetry and gas chromatography (GC). The results showed that there was no significant difference between colorimetry and GC (P > 0.05). Accuracy, precision and repeatability experiments revealed good performance for the detection of OPs. AChE can also be inhibited by carbamates, and therefore this method has potential applications in real samples for OPs and carbamates because of its high selectivity and sensitivity.
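A minimal sketch of the color-density step, assuming the assay photograph is an RGB array scaled to [0, 1]: convert to CMYK and average the yellow channel over the region of interest; an ANN (or any regression model) can then map this density to dichlorvos concentration. The conversion formula and the random stand-in image are generic, not the authors' exact procedure.

```python
import numpy as np

def rgb_to_cmyk(img):
    """Convert an RGB image (floats in [0, 1]) to CMYK channels."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    k = 1.0 - np.max(img, axis=-1)
    denom = np.where(k < 1.0, 1.0 - k, 1.0)   # avoid division by zero on pure black
    c = (1.0 - r - k) / denom
    m = (1.0 - g - k) / denom
    y = (1.0 - b - k) / denom
    return c, m, y, k

def mean_yellow_density(img):
    """Average yellow component over the assay region of interest."""
    _, _, y, _ = rgb_to_cmyk(img)
    return float(y.mean())

img = np.random.rand(100, 100, 3)            # stand-in for a photographed test strip
print(mean_yellow_density(img))
```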
Pai, Pei-Jing; Hu, Yingwei; Lam, Henry
2016-08-31
Intact glycopeptide MS analysis to reveal site-specific protein glycosylation is an important frontier of proteomics. However, computational tools for analyzing MS/MS spectra of intact glycopeptides are still limited and not well-integrated into existing workflows. In this work, a new computational tool which combines the spectral library building/searching tool, SpectraST (Lam et al. Nat. Methods2008, 5, 873-875), and the glycopeptide fragmentation prediction tool, MassAnalyzer (Zhang et al. Anal. Chem.2010, 82, 10194-10202) for intact glycopeptide analysis has been developed. Specifically, this tool enables the determination of the glycan structure directly from low-energy collision-induced dissociation (CID) spectra of intact glycopeptides. Given a list of possible glycopeptide sequences as input, a sample-specific spectral library of MassAnalyzer-predicted spectra is built using SpectraST. Glycan identification from CID spectra is achieved by spectral library searching against this library, in which both m/z and intensity information of the possible fragmentation ions are taken into consideration for improved accuracy. We validated our method using a standard glycoprotein, human transferrin, and evaluated its potential to be used in site-specific glycosylation profiling of glycoprotein datasets from LC-MS/MS. In addition, we further applied our method to reveal, for the first time, the site-specific N-glycosylation profile of recombinant human acetylcholinesterase expressed in HEK293 cells. For maximum usability, SpectraST is developed as part of the Trans-Proteomic Pipeline (TPP), a freely available and open-source software suite for MS data analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
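The exact SpectraST scoring function is not reproduced here; the sketch below illustrates the core idea of spectral library searching that the abstract relies on, binning peaks by m/z and comparing a query spectrum against a predicted library spectrum with a normalized dot product. The bin width, square-root intensity scaling, and example peak lists are assumptions.

```python
import numpy as np

def spectrum_to_vector(peaks, bin_width=1.0, max_mz=2000.0):
    """Bin (m/z, intensity) peaks into a fixed-length vector."""
    vec = np.zeros(int(max_mz / bin_width))
    for mz, intensity in peaks:
        idx = int(mz / bin_width)
        if idx < len(vec):
            vec[idx] += intensity
    return vec

def dot_product_score(query_peaks, library_peaks):
    """Normalized dot product between query and library (predicted) spectra, using
    both m/z position and intensity -- the general idea behind spectral library search."""
    q = np.sqrt(spectrum_to_vector(query_peaks))     # soften intensity dominance
    l = np.sqrt(spectrum_to_vector(library_peaks))
    return float(q @ l / (np.linalg.norm(q) * np.linalg.norm(l) + 1e-12))

query = [(204.087, 100.0), (366.14, 80.0), (528.19, 35.0)]   # oxonium/Y-ion-like peaks
library = [(204.087, 90.0), (366.14, 70.0), (690.25, 20.0)]
print(dot_product_score(query, library))
```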
Experimental and DFT studies on the antioxidant activity of a C-glycoside from Rhynchosia capitata
NASA Astrophysics Data System (ADS)
Praveena, R.; Sadasivam, K.; Kumaresan, R.; Deepha, V.; Sivakumar, Raman
2013-02-01
Rhynchosia capitata (=Glycine capitata) Heyne ex Roth was found to possess polyphenolics, including flavonoids, which act as potential antioxidants. The study of ethanolic extracts of roots and leaves reveals that the leaves possess higher levels of polyphenolics, including flavonoids, than the roots. This was also confirmed by DPPH radical scavenging activity. Leaf powder of the plant was extracted with different solvents in a Soxhlet apparatus in order of increasing polarity. The DPPH scavenging activity of the methanol fraction was found to be high compared to the crude extract and other fractions. Nitric oxide scavenging activity was dominant in the chloroform fraction compared to the methanol fraction. The presence of flavonoids, especially vitexin, a C-glycoside, in the methanol and chloroform fractions was confirmed by high-performance thin layer chromatography (HPTLC) analysis. The structural and molecular characteristics of the naturally occurring flavonoid vitexin were investigated in the gas phase using a density functional theory (DFT) approach at the B3LYP/6-311G(d,p) level of theory. Analysis of bond dissociation enthalpies (BDE) reveals that the OH site that requires minimum energy for dissociation is 4'-OH on the B-ring. To explore the radical scavenging activity of vitexin, the adiabatic ionization potential, electron affinity, hardness, softness, electronegativity and electrophilicity index were computed and interpreted. The nonvalidity of Koopmans' theorem has been verified by computation of the Eo and Ev energy magnitudes. Interestingly, from the BDE calculations it was observed that the BDEs for 4'-OH, 5-OH and 7-OH are comparatively lower for vitexin than for its aglycone apigenin, and this may be due to the presence of the C-8 glucoside in vitexin. To substantiate this, plots of the frontier molecular orbitals and spin density distribution analysis for the neutral and corresponding radical species of vitexin have been presented.
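The global reactivity descriptors mentioned above are conventionally computed from the ionization potential (IP) and electron affinity (EA); one common set of working definitions (conventions vary slightly between authors) is:

```latex
\begin{align*}
  \chi   &= \tfrac{1}{2}\,(\mathrm{IP} + \mathrm{EA})   && \text{electronegativity} \\
  \eta   &= \tfrac{1}{2}\,(\mathrm{IP} - \mathrm{EA})   && \text{chemical hardness} \\
  S      &= \frac{1}{2\eta}                              && \text{softness} \\
  \omega &= \frac{\chi^{2}}{2\eta}                        && \text{electrophilicity index} \\
  \mathrm{BDE} &= H(\mathrm{ArO}^{\bullet}) + H(\mathrm{H}^{\bullet}) - H(\mathrm{ArOH})
                                                          && \text{O--H bond dissociation enthalpy}
\end{align*}
```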
Measuring track densities in lunar grains by image analysis
NASA Technical Reports Server (NTRS)
Blanford, George E.
1993-01-01
We have developed techniques to use digitized scanning electron micrographs and computer image analysis programs to measure track densities in lunar soil grains. Tracks were formed by highly ionizing solar energetic particles and cosmic rays during near surface exposure on the Moon. The track densities are related to the exposure conditions (depth and time). Distributions of the number of grains as a function of their track densities can reveal the modality of soil maturation. We used a sample that had already been etched in 6 N NaOH at 118 C for 15 h to reveal tracks. We determined that back-scattered electron images taken at 50 percent contrast and approximately 49.8 percent brightness produced suitable high contrast images for analysis. We ascertained gray-scale thresholds of interest: 0-230 for tracks, 231 for masked regions, and 232-255 for background. We found no need to set an upper size limit for distinguishing tracks. We did use lower limits to exclude noise: 16 pixels at 15000x, 4 pixels at 10000x, 2 pixels at 6800x, and 0 pixels at 4600x. We used computer counting and measurement of area to obtain track densities. We found an excellent correlation with manual measurements for track densities below 1x10(exp 8) per sq cm. For track densities between 1x10(exp 8) and 1x10(exp 9) per sq cm, we found that a regression formula using the percentage area covered by tracks gave good agreement with manual measurements. Finally, we used these new techniques to obtain a track density distribution that gave more detail and was more rapidly obtained than with the manual techniques used 15 years ago.
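A minimal sketch of the counting step, using the gray-level thresholds quoted above (0-230 counted as track pixels) and a lower size cutoff to reject noise; the pixel area and the random stand-in micrograph are placeholders, and the original work additionally used an area-fraction regression for the densest samples.

```python
import numpy as np
from scipy import ndimage

def track_density(image, threshold=230, min_pixels=4, pixel_area_cm2=1.0e-10):
    """Count etched tracks in a backscattered-electron image by gray-level thresholding
    and connected-component labeling, then divide by the analyzed area."""
    tracks_mask = image <= threshold                      # gray levels 0-230 count as tracks
    labels, n = ndimage.label(tracks_mask)
    sizes = ndimage.sum(tracks_mask, labels, range(1, n + 1))
    n_tracks = int(np.sum(sizes >= min_pixels))           # drop tiny noise blobs
    area_cm2 = image.size * pixel_area_cm2
    return n_tracks / area_cm2

img = np.random.randint(0, 256, (512, 512), dtype=np.uint8)   # stand-in for a BSE micrograph
print(track_density(img))
```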
Dacosta-Aguayo, Rosalia; Graña, Manuel; Fernández-Andújar, Marina; López-Cancio, Elena; Cáceres, Cynthia; Bargalló, Núria; Barrios, Maite; Clemente, Immaculada; Monserrat, Pere Toran; Sas, Maite Alzamora; Dávalos, Antoni; Auer, Tibor; Mataró, Maria
2014-01-01
After stroke, white matter integrity can be affected both locally and distally to the primary lesion location. It has been shown that tract disruption in mirror's regions of the contralateral hemisphere is associated with degree of functional impairment. Fourteen patients suffering right hemispheric focal stroke (S) and eighteen healthy controls (HC) underwent Diffusion Weighted Imaging (DWI) and neuropsychological assessment. The stroke patient group was divided into poor (SP; n = 8) and good (SG; n = 6) cognitive recovery groups according to their cognitive improvement from the acute phase (72 hours after stroke) to the subacute phase (3 months post-stroke). Whole-brain DWI data analysis was performed by computing Diffusion Tensor Imaging (DTI) followed by Tract Based Spatial Statistics (TBSS). Assessment of effects was obtained computing the correlation of the projections on TBSS skeleton of Fractional Anisotropy (FA) and Radial Diffusivity (RD) with cognitive test results. Significant decrease of FA was found only in right brain anatomical areas for the S group when compared to the HC group. Analyzed separately, stroke patients with poor cognitive recovery showed additional significant FA decrease in several left hemisphere regions; whereas SG patients showed significant decrease only in the left genu of corpus callosum when compared to the HC. For the SG group, whole brain analysis revealed significant correlation between the performance in the Semantic Fluency test and the FA in the right hemisphere as well as between the performance in the Grooved Pegboard Test (GPT) and the Trail Making Test-part A and the FA in the left hemisphere. For the SP group, correlation analysis revealed significant correlation between the performance in the GPT and the FA in the right hemisphere.
Information technology infusion model for health sector in a developing country: Nigeria as a case.
Idowu, Bayo; Adagunodo, Rotimi; Adedoyin, Rufus
2006-01-01
To date, information technology (IT) has not been widely adopted in the health sector in developing countries. Information technology may bring improvements to health care delivery systems. It is one of the prime movers of globalization. Information technology infusion is the degree to which different information technology tools are integrated into organizational activities. This study aimed to determine the degree and extent of incorporation of information technology in the Nigerian health sector, to derive IT infusion models for popular IT indicators in use in Nigeria (personal computers, mobile phones, and the Internet), and subsequently to investigate their impact on the health care delivery system in Nigerian teaching hospitals. In this study, data were collected through the use of questionnaires. Also, oral interviews were conducted and, subsequently, the data gathered were analyzed. The results of the analysis revealed that out of the three IT indicators considered, mobile phones are spreading fastest. It also revealed that computers and mobile phones are in use in all the teaching hospitals. Finally, in this research, IT infusion models were developed for the health sector in Nigeria from the data gathered through the questionnaires and oral interviews.
Maximal Neighbor Similarity Reveals Real Communities in Networks
Žalik, Krista Rizman
2015-01-01
An important problem in the analysis of network data is the detection of groups of densely interconnected nodes, also called modules or communities. Community structure reveals functions and organizations of networks. Currently used algorithms for community detection in large-scale real-world networks are computationally expensive, require a priori information such as the number or sizes of communities, or are not able to give the same resulting partition in multiple runs. In this paper we investigate a simple and fast algorithm that uses the network structure alone and requires neither optimization of a pre-defined objective function nor information about the number of communities. We propose a bottom-up community detection algorithm in which, starting from communities consisting of adjacent pairs of nodes and their maximally similar neighbors, we find real communities. We show that the overall advantage of the proposed algorithm compared to other community detection algorithms is its simple nature, low computational cost, and very high accuracy in detecting communities of different sizes, also in networks with blurred modularity structure consisting of poorly separated communities. All communities identified by the proposed method for the Facebook network and the E. coli transcriptional regulatory network have strong structural and functional coherence. PMID:26680448
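The sketch below conveys the bottom-up idea in a few lines: each node is attached to its most similar neighbor, and the connected components of those attachments are taken as communities. The Jaccard similarity of closed neighborhoods used here is one plausible choice, not necessarily the paper's exact similarity measure.

```python
import networkx as nx

def neighbor_similarity_communities(G):
    """Attach every node to its most similar neighbor (Jaccard similarity of closed
    neighborhoods) and return the connected components of those attachments."""
    def sim(u, v):
        nu, nv = set(G[u]) | {u}, set(G[v]) | {v}
        return len(nu & nv) / len(nu | nv)

    best = nx.Graph()
    best.add_nodes_from(G)
    for u in G:
        if G.degree(u) == 0:
            continue
        v = max(G[u], key=lambda w: sim(u, w))   # most similar neighbor of u
        best.add_edge(u, v)
    return list(nx.connected_components(best))

G = nx.karate_club_graph()
print(neighbor_similarity_communities(G))
```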
Liu, Yushu; Ye, Hongqiang; Wang, Yong; Zhao, Yijao; Sun, Yuchun; Zhou, Yongsheng
2018-05-17
To evaluate the internal adaptations of cast crowns made from resin patterns produced using three different computer-aided design/computer-assisted manufacturing technologies. A full-crown abutment made of zirconia was digitized using an intraoral scanner, and the design of the crown was finished on the digital model. Resin patterns were fabricated using a fused deposition modeling (FDM) 3D printer (LT group), a digital light projection (DLP) 3D printer (EV group), or a five-axis milling machine (ZT group). All patterns were cast in cobalt-chromium alloy crowns. Crowns made from traditional handmade wax patterns (HM group) were used as controls. Each group contained 10 samples. The internal gaps of the patterns were analyzed using a 3D replica method and optical digitization. The results were compared using Kruskal-Wallis analysis of variance (ANOVA), a one-sample t test, and signed rank test (α = .05). For the LT group, the marginal and axial gaps were significantly larger than in the other three groups (P < .05), but the occlusal adaptation did not reveal a significant difference (P > .05). In the ZT group, the axial gap was slightly smaller than in the HM group (P < .0083). All the means of gaps in all areas in the four groups were less than 150 μm. Casting crowns using casting patterns made from all three CAD/CAM systems could not produce the prescribed parameters, but the crowns showed clinically acceptable internal adaptations.
Tachiyama, Keisuke; Shiga, Yuji; Shimoe, Yutaka; Mizuta, Ikuko; Mizuno, Toshiki; Kuriyama, Masaru
2018-04-25
A 55-year-old man with no history of stroke or migraine presented to the clinic with cognitive impairment and depression that he had been experiencing for two years. Neurological examination showed bilateral pyramidal signs, and impairments in cognition and attention. Brain MRI revealed multiple lacunar lesions and microbleeds in the deep cerebral white matter, subcortical regions, and brainstem, as well as diffuse white matter hyperintensities without anterior temporal pole involvement. Cerebral single-photon emission computed tomography (SPECT) revealed bilateral hypoperfusion in the basal ganglia. Gene analysis revealed an arginine-to-proline missense mutation in the NOTCH3 gene at codon 75. The patient was administered lomerizine (10 mg/day), but his cognitive impairment and cerebral atrophy continued to worsen. Follow-up testing with MRI three years after his initial diagnosis revealed similar lacunar infarctions, cerebral microbleeds, and diffuse white matter hyperintensities to those observed three years earlier. However, MRI scans revealed signs of increased cerebral blood flow. Together, these findings suggest that the patient's cognitive impairments may have been caused by pathogenesis in the cerebral cortex.
Matsumoto, Yoshiya; Kawaguchi, Tomoya; Yamamoto, Norio; Sawa, Kenji; Yoshimoto, Naoki; Suzumura, Tomohiro; Watanabe, Tetsuya; Mitsuoka, Shigeki; Asai, Kazuhisa; Kimura, Tatsuo; Yoshimura, Naruo; Kuwae, Yuko; Hirata, Kazuto
2017-09-01
A 75-year-old man with stage IV lung adenocarcinoma was treated with osimertinib due to disease progression despite having been administered erlotinib. Both an epidermal growth factor receptor (EGFR) L858R mutation on exon 21 and a T790M mutation on exon 20 were detected in a specimen from a recurrent primary tumor. Five weeks after osimertinib initiation, he developed general fatigue and dyspnea. Chest computed tomography scan revealed diffuse ground glass opacities and consolidation on both lungs. An analysis of the bronchoalveolar lavage fluid revealed marked lymphocytosis, and a transbronchial lung biopsy specimen showed a thickened interstitium with fibrosis and prominent lymphocytic infiltration. We diagnosed the patient to have interstitial lung disease induced by osimertinib.
Improved radial segregation via the destabilizing vertical Bridgman configuration
NASA Astrophysics Data System (ADS)
Sonda, Paul; Yeckel, Andrew; Daoutidis, Prodromos; Derby, Jeffrey J.
2004-01-01
We employ a computational model to revisit the classic crystal growth experiments conducted by Kim et al. (J. Electrochem. Soc. 119 (1972) 1218) and Müller et al. (J. Crystal Growth 70 (1984) 78), which were among the first to clearly document the effects of flow transitions on segregation. Analysis of the growth of tellurium-doped indium antimonide within a destabilizing vertical Bridgman configuration reveals the existence of multiple states, each of which can be reached by feasible paths of process operation. Transient growth simulations conducted on the different solution branches reveal striking differences in hydrodynamic and segregation behavior. We show that crystals grown in the destabilizing configuration exhibit considerably better radial segregation than those grown in the stabilizing configuration, a result which challenges conventional wisdom and practice.
Successful generation of structural information for fragment-based drug discovery.
Öster, Linda; Tapani, Sofia; Xue, Yafeng; Käck, Helena
2015-09-01
Fragment-based drug discovery relies upon structural information for efficient compound progression, yet it is often challenging to generate structures with bound fragments. A summary of recent literature reveals that a wide repertoire of experimental procedures is employed to generate ligand-bound crystal structures successfully. We share in-house experience from setting up and executing fragment crystallography in a project that resulted in 55 complex structures. The ligands span five orders of magnitude in affinity and the resulting structures are made available to be of use, for example, for development of computational methods. Analysis of the results revealed that ligand properties such as potency, ligand efficiency (LE) and, to some degree, clogP influence the success of complex structure generation. Copyright © 2015 Elsevier Ltd. All rights reserved.
Statistical Significance of Optical Map Alignments
Sarkar, Deepayan; Goldstein, Steve; Schwartz, David C.
2012-01-01
The Optical Mapping System constructs ordered restriction maps spanning entire genomes through the assembly and analysis of large datasets comprising individually analyzed genomic DNA molecules. Such restriction maps uniquely reveal mammalian genome structure and variation, but also raise computational and statistical questions beyond those that have been solved in the analysis of smaller, microbial genomes. We address the problem of how to filter maps that align poorly to a reference genome. We obtain map-specific thresholds that control errors and improve iterative assembly. We also show how an optimal self-alignment score provides an accurate approximation to the probability of alignment, which is useful in applications seeking to identify structural genomic abnormalities. PMID:22506568
Seeding Cracks Using a Fatigue Tester for Accelerated Gear Tooth Breaking
NASA Technical Reports Server (NTRS)
Nenadic, Nenad G.; Wodenscheck, Joseph A.; Thurston, Michael G.; Lewicki, David G.
2011-01-01
This report describes fatigue-induced seeded cracks in spur gears and compares them to cracks created using a more traditional seeding method, notching. Finite element analysis (FEA) compares the effective compliance of a cracked tooth to the effective compliance of a notched tooth where the crack and the notch are of the same depth. In this analysis, cracks are propagated to the desired depth using FRANC2D and effective compliances are computed in ANSYS. A compliance-based feature for detecting cracks on the fatigue tester is described. The initiated cracks are examined using both nondestructive and destructive methods. The destructive examination reveals variability in the shape of crack surfaces.
Darwish, Ragaa T; Abdel-Aziz, Manal H; El Nekiedy, Abdel-Aziz M; Sobh, Zahraa K
2017-11-01
In forensic science, determining a person's sex is important during the identification stage. The reliability of sex determination depends on the completeness of the remains and the degree of sexual dimorphism inherent in the population. Computed tomography is the imaging modality of choice for two- and three-dimensional documentation and analysis of many autopsy findings. The aim of the present work was to assess the reliability of three-dimensional multislice computed tomography (3D MSCT) in determining sexual dimorphism from certain chest measurements of the sternum and fourth rib, and to develop equations for sex determination from these bones in a sample of adult Egyptians. The present study was performed on 60 adult Egyptians. Their ages ranged from 21 to 74 years, and they were equally divided between the sexes. Sixty virtual chests (reconstructed MSCT 3D images) were examined to obtain the sternal measurements: manubrium length (ML), sternal body length (BL), manubrium width (MW), sternal body widths (BWa and BWb), sternal area (SA) [(ML + BL) × (MW + BWa + BWb)/3], and fourth rib width (FRW). All the studied measurements were significantly higher in males than in females. Multiple regression analysis was used, and three significant regression equations were developed for predicting sex from the different studied chest measurements: the sternal measurements, the sternal area, and the widths of the right and left fourth ribs, with accuracies of 96.67%, 95.0%, and 72.68%, respectively. The sternum and fourth rib width revealed significant metric sex differences on MSCT 3D images and thus provide a great advantage in the analysis of skeletal remains and badly decomposed bodies. Copyright © 2017 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
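As a hedged illustration of how such measurement-based classification equations are fitted (using logistic regression in place of the paper's multiple regression, and entirely synthetic measurements rather than the study data):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training table: [manubrium length, body length, sternal area, 4th-rib width] in mm,
# label 1 = male, 0 = female; the numbers below are synthetic, not the paper's data.
rng = np.random.default_rng(1)
males = rng.normal([52, 95, 780, 15], [4, 6, 60, 1.5], size=(30, 4))
females = rng.normal([47, 83, 640, 13], [4, 6, 60, 1.5], size=(30, 4))
X = np.vstack([males, females])
y = np.array([1] * 30 + [0] * 30)

model = LogisticRegression(max_iter=1000).fit(X, y)
print(model.score(X, y))                       # in-sample accuracy of the sex classifier
```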
Bethge, Anja; Schumacher, Udo
2017-01-01
Background Tumor vasculature is critical for tumor growth, formation of distant metastases and efficiency of radio- and chemotherapy treatments. However, how the vasculature itself is affected during cancer treatment regarding to the metastatic behavior has not been thoroughly investigated. Therefore, the aim of this study was to analyze the influence of hypofractionated radiotherapy and cisplatin chemotherapy on vessel tree geometry and metastasis formation in a small cell lung cancer xenograft mouse tumor model to investigate the spread of malignant cells during different treatments modalities. Methods The biological data gained during these experiments were fed into our previously developed computer model “Cancer and Treatment Simulation Tool” (CaTSiT) to model the growth of the primary tumor, its metastatic deposit and also the influence on different therapies. Furthermore, we performed quantitative histology analyses to verify our predictions in xenograft mouse tumor model. Results According to the computer simulation the number of cells engrafting must vary considerably to explain the different weights of the primary tumor at the end of the experiment. Once a primary tumor is established, the fractal dimension of its vasculature correlates with the tumor size. Furthermore, the fractal dimension of the tumor vasculature changes during treatment, indicating that the therapy affects the blood vessels’ geometry. We corroborated these findings with a quantitative histological analysis showing that the blood vessel density is depleted during radiotherapy and cisplatin chemotherapy. The CaTSiT computer model reveals that chemotherapy influences the tumor’s therapeutic susceptibility and its metastatic spreading behavior. Conclusion Using a system biological approach in combination with xenograft models and computer simulations revealed that the usage of chemotherapy and radiation therapy determines the spreading behavior by changing the blood vessel geometry of the primary tumor. PMID:29107953
A research program in empirical computer science
NASA Technical Reports Server (NTRS)
Knight, J. C.
1991-01-01
During the grant reporting period our primary activities have been to begin preparation for the establishment of a research program in experimental computer science. The focus of research in this program will be safety-critical systems. Many questions that arise in the effort to improve software dependability can only be addressed empirically. For example, there is no way to predict the performance of the various proposed approaches to building fault-tolerant software. Performance models, though valuable, are parameterized and cannot be used to make quantitative predictions without experimental determination of underlying distributions. In the past, experimentation has been able to shed some light on the practical benefits and limitations of software fault tolerance. It is common, also, for experimentation to reveal new questions or new aspects of problems that were previously unknown. A good example is the Consistent Comparison Problem that was revealed by experimentation and subsequently studied in depth. The result was a clear understanding of a previously unknown problem with software fault tolerance. The purpose of a research program in empirical computer science is to perform controlled experiments in the area of real-time, embedded control systems. The goal of the various experiments will be to determine better approaches to the construction of the software for computing systems that have to be relied upon. As such it will validate research concepts from other sources, provide new research results, and facilitate the transition of research results from concepts to practical procedures that can be applied with low risk to NASA flight projects. The target of experimentation will be the production software development activities undertaken by any organization prepared to contribute to the research program. Experimental goals, procedures, data analysis and result reporting will be performed for the most part by the University of Virginia.
NASA Astrophysics Data System (ADS)
Wu, Chia-Hua; Lee, Suiang-Shyan; Lin, Ja-Chen
2017-06-01
This all-in-one hiding method creates two transparencies that have several decoding options: visual decoding with or without translation-flipping, and computer decoding. In visual decoding, two less-important (or fake) binary secret images S1 and S2 can be revealed. S1 is viewed by directly stacking the two transparencies. S2 is viewed by flipping one transparency and translating the other to a specified coordinate before stacking. Finally, important/true secret files can be decrypted by a computer using the information extracted from the transparencies. The encoding process to hide this information includes translated-flip visual cryptography, block types, polynomial-style sharing, and a linear congruential generator. If a thief obtained both transparencies, which are stored in distinct places, he would still need to find the key values used in computer decoding after viewing S1 and/or S2 by stacking. However, the thief might simply try other kinds of stacking and eventually give up searching for more secrets, because computer decoding is entirely different from stacking-based decoding. Unlike traditional image hiding that uses images as host media, our method hides fine gray-level images in binary transparencies. Thus, our host media are transparencies. Comparisons and analysis are provided.
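The scheme combines several pieces (translated-flip visual cryptography, block types, polynomial-style sharing, a linear congruential generator) that are not fully specified in the abstract; the sketch below shows only the basic (2,2) visual-cryptography stacking principle on a toy 4-pixel secret, using standard 2x2 subpixel patterns.

```python
import random

# 2x2 subpixel patterns with two black and two white subpixels each.
PATTERNS = [[1, 1, 0, 0], [0, 0, 1, 1], [1, 0, 1, 0],
            [0, 1, 0, 1], [1, 0, 0, 1], [0, 1, 1, 0]]

def make_shares(secret):
    """(2,2) visual cryptography: each secret pixel (0=white, 1=black) becomes a 2x2 block.
    Stacking (OR-ing) the shares reveals the secret to the naked eye."""
    share1, share2 = [], []
    for pixel in secret:
        p = random.choice(PATTERNS)
        q = p if pixel == 0 else [1 - b for b in p]   # same block for white, complement for black
        share1.append(p)
        share2.append(q)
    return share1, share2

def stack(share1, share2):
    return [[a | b for a, b in zip(p, q)] for p, q in zip(share1, share2)]

secret = [0, 1, 1, 0]                 # a 4-pixel toy "image"
s1, s2 = make_shares(secret)
print(stack(s1, s2))                  # white pixels -> 2 black subpixels, black pixels -> 4
```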
Segmentation of cortical bone using fast level sets
NASA Astrophysics Data System (ADS)
Chowdhury, Manish; Jörgens, Daniel; Wang, Chunliang; Smedby, Årjan; Moreno, Rodrigo
2017-02-01
Cortical bone plays a major role in the mechanical competence of bone. The analysis of cortical bone requires accurate segmentation methods. Level set methods are among the state of the art for segmenting medical images. However, traditional implementations of this method are computationally expensive. This drawback was recently tackled through the so-called coherent propagation extension of the classical algorithm, which has decreased computation times dramatically. In this study, we assess the potential of this technique for segmenting cortical bone in interactive time in 3D images acquired through High Resolution peripheral Quantitative Computed Tomography (HR-pQCT). The obtained segmentations are used to estimate cortical thickness and cortical porosity of the investigated images. Cortical thickness and cortical porosity are computed using sphere fitting and mathematical morphology operations, respectively. Qualitative comparison between the segmentations of our proposed algorithm and a previously published approach on six image volumes reveals superior smoothness properties of the level set approach. While the proposed method yields similar results to previous approaches in regions where the boundary between trabecular and cortical bone is well defined, it yields more stable segmentations in challenging regions. This results in more stable estimation of the parameters of cortical bone. The proposed technique takes a few seconds to compute, which makes it suitable for clinical settings.
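A rough sketch of the parameter-estimation step under simplifying assumptions: thickness is approximated from the Euclidean distance transform of the binary cortex (a crude stand-in for sphere fitting) and porosity from pores enclosed by a morphological closing; the voxel size and the cubic stand-in mask are placeholders, not HR-pQCT data.

```python
import numpy as np
from scipy import ndimage

def cortical_metrics(cortex_mask, voxel_mm=0.082):
    """Rough estimates from a binary cortical-bone segmentation: thickness from the
    Euclidean distance transform and porosity from pores enclosed by a closing."""
    edt = ndimage.distance_transform_edt(cortex_mask) * voxel_mm
    thickness_mm = 2.0 * edt[cortex_mask].max()            # crude, sphere-fitting-like estimate
    closed = ndimage.binary_closing(cortex_mask, iterations=3)
    pores = closed & ~cortex_mask                           # enclosed voxels not in the cortex
    porosity = pores.sum() / closed.sum()
    return thickness_mm, porosity

mask = np.zeros((60, 60, 60), dtype=bool)
mask[20:40, 20:40, 20:40] = True                            # stand-in for a segmented cortex
print(cortical_metrics(mask))
```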
Methane Adsorption in Zr-Based MOFs: Comparison and Critical Evaluation of Force Fields
2017-01-01
The search for nanoporous materials that are highly performing for gas storage and separation is one of the contemporary challenges in material design. The computational tools to aid these experimental efforts are widely available, and adsorption isotherms are routinely computed for huge sets of (hypothetical) frameworks. Clearly the computational results depend on the interactions between the adsorbed species and the adsorbent, which are commonly described using force fields. In this paper, an extensive comparison and in-depth investigation of several force fields from literature is reported for the case of methane adsorption in the Zr-based Metal–Organic Frameworks UiO-66, UiO-67, DUT-52, NU-1000, and MOF-808. Significant quantitative differences in the computed uptake are observed when comparing different force fields, but most qualitative features are common which suggests some predictive power of the simulations when it comes to these properties. More insight into the host–guest interactions is obtained by benchmarking the force fields with an extensive number of ab initio computed single molecule interaction energies. This analysis at the molecular level reveals that especially ab initio derived force fields perform well in reproducing the ab initio interaction energies. Finally, the high sensitivity of uptake predictions on the underlying potential energy surface is explored. PMID:29170687
Jin, Miaomiao; Cheng, Long; Li, Yi; Hu, Siyu; Lu, Ke; Chen, Jia; Duan, Nian; Wang, Zhuorui; Zhou, Yaxiong; Chang, Ting-Chang; Miao, Xiangshui
2018-06-27
Owing to the capability of integrating information storage and computing in the same physical location, in-memory computing with memristors has become a research hotspot as a promising route for non-von Neumann architectures. However, it is still a challenge to develop high performance devices as well as optimized logic methodologies to realize energy-efficient computing. Herein, a filamentary Cu/GeTe/TiN memristor is reported to show satisfactory properties with nanosecond switching speed (< 60 ns), low voltage operation (< 2 V), high endurance (>10^4 cycles) and good retention (>10^4 s at 85 °C). It is revealed that the charge carrier conduction mechanisms in the high resistance and low resistance states are Schottky emission and hopping transport between the adjacent Cu clusters, respectively, based on the analysis of current-voltage behaviors and resistance-temperature characteristics. An intuitive picture is given to describe the dynamic processes of resistive switching. Moreover, based on the basic material implication (IMP) logic circuit, we proposed a reconfigurable logic method and experimentally implemented IMP, NOT, OR, and COPY logic functions. Design of a one-bit full adder with a reduction in computational sequences and its validation in simulation further demonstrate the potential practical application. The results provide important progress towards understanding the resistive switching mechanism and realizing an energy-efficient in-memory computing architecture. © 2018 IOP Publishing Ltd.
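Functionally, material implication composes into the other gates the authors implement; the sketch below checks those compositions as plain Boolean functions (the paper realizes them with conditional switching of memristor circuits, not software).

```python
def imp(p, q):
    """Material implication, the memristor-native operation: p IMP q == (NOT p) OR q."""
    return (not p) or q

def logic_not(p):
    return imp(p, False)                  # NOT p  = p IMP 0

def logic_or(p, q):
    return imp(logic_not(p), q)           # p OR q = (p IMP 0) IMP q

def logic_copy(p):
    return logic_or(p, False)             # COPY p = p OR 0

# Truth-table check of the composed gates.
for p in (False, True):
    for q in (False, True):
        assert logic_or(p, q) == (p or q)
    assert logic_not(p) == (not p) and logic_copy(p) == p
```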
Wright-Berryman, Jennifer L; Salyers, Michelle P; O'Halloran, James P; Kemp, Aaron S; Mueser, Kim T; Diazoni, Amanda J
2013-12-01
To explore mental health consumer and provider responses to a computerized version of the Illness Management and Recovery (IMR) program. Semistructured interviews were conducted to gather data from 6 providers and 12 consumers who participated in a computerized prototype of the IMR program. An inductive-consensus-based approach was used to analyze the interview responses. Qualitative analysis revealed consumers perceived various personal benefits and ease of use afforded by the new technology platform. Consumers also highly valued provider assistance and offered several suggestions to improve the program. The largest perceived barriers to future implementation were lack of computer skills and access to computers. Similarly, IMR providers commented on its ease and convenience, and the reduction of time intensive material preparation. Providers also expressed that the use of technology creates more options for the consumer to access treatment. The technology was acceptable, easy to use, and well-liked by consumers and providers. Clinician assistance with technology was viewed as helpful to get clients started with the program, as lack of computer skills and access to computers was a concern. Access to materials between sessions appears to be desired; however, given perceived barriers of computer skills and computer access, additional supports may be needed for consumers to achieve full benefits of a computerized version of IMR. PsycINFO Database Record (c) 2013 APA, all rights reserved.
Dynamics of Numerics & Spurious Behaviors in CFD Computations. Revised
NASA Technical Reports Server (NTRS)
Yee, Helen C.; Sweby, Peter K.
1997-01-01
The global nonlinear behavior of finite discretizations for constant time steps and fixed or adaptive grid spacings is studied using tools from dynamical systems theory. Detailed analysis of commonly used temporal and spatial discretizations for simple model problems is presented. The role of dynamics in the understanding of long time behavior of numerical integration and the nonlinear stability, convergence, and reliability of using time-marching approaches for obtaining steady-state numerical solutions in computational fluid dynamics (CFD) is explored. The study is complemented with examples of spurious behavior observed in steady and unsteady CFD computations. The CFD examples were chosen to illustrate non-apparent spurious behavior that was difficult to detect without extensive grid and temporal refinement studies and some knowledge from dynamical systems theory. Studies revealed the various possible dangers of misinterpreting numerical simulation of realistic complex flows that are constrained by available computing power. In large scale computations where the physics of the problem under study is not well understood and numerical simulations are the only viable means of solution, extreme care must be taken in both computation and interpretation of the numerical data. The goal of this paper is to explore the important role that dynamical systems theory can play in the understanding of the global nonlinear behavior of numerical algorithms and to aid the identification of the sources of numerical uncertainties in CFD.
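A classic illustration of the kind of spurious asymptotic behavior discussed here (not an example drawn from this paper's CFD cases) is explicit Euler applied to the logistic ODE du/dt = u(1 - u): below the linear stability limit the iteration converges to the true steady state u = 1, while larger fixed time steps settle onto spurious periodic or chaotic states of the discrete map.

```python
import numpy as np

def explicit_euler_orbit(u0, dt, n_steps=200):
    """Iterate explicit Euler on du/dt = u(1 - u); for large dt the discrete map
    bifurcates and produces spurious periodic/chaotic states absent from the ODE."""
    u = u0
    orbit = []
    for _ in range(n_steps):
        u = u + dt * u * (1.0 - u)
        orbit.append(u)
    return np.array(orbit)

for dt in (0.5, 2.3, 2.7):
    tail = explicit_euler_orbit(0.1, dt)[-4:]
    print(dt, np.round(tail, 3))   # dt=0.5 converges to 1; larger dt settles on spurious states
```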
NASA Astrophysics Data System (ADS)
Pichiorri, F.; De Vico Fallani, F.; Cincotti, F.; Babiloni, F.; Molinari, M.; Kleih, S. C.; Neuper, C.; Kübler, A.; Mattia, D.
2011-04-01
The main purpose of electroencephalography (EEG)-based brain-computer interface (BCI) technology is to provide an alternative channel to support communication and control when motor pathways are interrupted. Despite the considerable amount of research focused on the improvement of EEG signal detection and translation into output commands, little is known about how learning to operate a BCI device may affect brain plasticity. This study investigated if and how sensorimotor rhythm-based BCI training would induce persistent functional changes in motor cortex, as assessed with transcranial magnetic stimulation (TMS) and high-density EEG. Motor imagery (MI)-based BCI training in naïve participants led to a significant increase in motor cortical excitability, as revealed by post-training TMS mapping of the hand muscle's cortical representation; peak amplitude and volume of the motor evoked potentials recorded from the opponens pollicis muscle were significantly higher only in those subjects who develop a MI strategy based on imagination of hand grasping to successfully control a computer cursor. Furthermore, analysis of the functional brain networks constructed using a connectivity matrix between scalp electrodes revealed a significant decrease in the global efficiency index for the higher-beta frequency range (22-29 Hz), indicating that the brain network changes its topology with practice of hand grasping MI. Our findings build the neurophysiological basis for the use of non-invasive BCI technology for monitoring and guidance of motor imagery-dependent brain plasticity and thus may render BCI a viable tool for post-stroke rehabilitation.
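The global efficiency index referred to above is a standard graph metric; the sketch below computes it from a thresholded electrode-by-electrode connectivity matrix. The threshold, the 61-channel random stand-in matrix, and the binarization step are assumptions, not the study's exact pipeline.

```python
import networkx as nx
import numpy as np

def global_efficiency_from_connectivity(C, threshold=0.3):
    """Build a binary graph from an electrode-by-electrode connectivity matrix
    (e.g., beta-band coupling values) and compute its global efficiency."""
    A = (np.abs(C) > threshold).astype(int)
    np.fill_diagonal(A, 0)
    G = nx.from_numpy_array(A)
    return nx.global_efficiency(G)

rng = np.random.default_rng(0)
C = rng.random((61, 61))                  # stand-in for a 61-channel connectivity matrix
C = (C + C.T) / 2                         # make it symmetric
print(global_efficiency_from_connectivity(C))
```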
An insight to the molecular interactions of the FDA approved HIV PR drugs against L38L↑N↑L PR mutant
NASA Astrophysics Data System (ADS)
Sanusi, Zainab K.; Govender, Thavendran; Maguire, Glenn E. M.; Maseko, Sibusiso B.; Lin, Johnson; Kruger, Hendrik G.; Honarparvar, Bahareh
2018-03-01
The aspartate protease of the human immune deficiency type-1 virus (HIV-1) has become a crucial antiviral target in which many useful antiretroviral inhibitors have been developed. However, it seems the emergence of new HIV-1 PR mutations enhances drug resistance, hence, the available FDA approved drugs show less activity towards the protease. A mutation and insertion designated L38L↑N↑L PR was recently reported from subtype of C-SA HIV-1. An integrated two-layered ONIOM (QM:MM) method was employed in this study to examine the binding affinities of the nine HIV PR inhibitors against this mutant. The computed binding free energies as well as experimental data revealed a reduced inhibitory activity towards the L38L↑N↑L PR in comparison with subtype C-SA HIV-1 PR. This observation suggests that the insertion and mutations significantly affect the binding affinities or characteristics of the HIV PIs and/or parent PR. The same trend for the computational binding free energies was observed for eight of the nine inhibitors with respect to the experimental binding free energies. The outcome of this study shows that ONIOM method can be used as a reliable computational approach to rationalize lead compounds against specific targets. The nature of the intermolecular interactions in terms of the host-guest hydrogen bond interactions is discussed using the atoms in molecules (AIM) analysis. Natural bond orbital analysis was also used to determine the extent of charge transfer between the QM region of the L38L↑N↑L PR enzyme and FDA approved drugs. AIM analysis showed that the interaction between the QM region of the L38L↑N↑L PR and FDA approved drugs are electrostatic dominant, the bond stability computed from the NBO analysis supports the results from the AIM application. Future studies will focus on the improvement of the computational model by considering explicit water molecules in the active pocket. We believe that this approach has the potential to provide information that will aid in the design of much improved HIV-1 PR antiviral drugs.
Damaraju, E; Allen, E A; Belger, A; Ford, J M; McEwen, S; Mathalon, D H; Mueller, B A; Pearlson, G D; Potkin, S G; Preda, A; Turner, J A; Vaidya, J G; van Erp, T G; Calhoun, V D
2014-01-01
Schizophrenia is a psychotic disorder characterized by functional dysconnectivity or abnormal integration between distant brain regions. Recent functional imaging studies have implicated large-scale thalamo-cortical connectivity as being disrupted in patients. However, observed connectivity differences in schizophrenia have been inconsistent between studies, with reports of hyperconnectivity and hypoconnectivity between the same brain regions. Using resting state eyes-closed functional imaging and independent component analysis on a multi-site data that included 151 schizophrenia patients and 163 age- and gender matched healthy controls, we decomposed the functional brain data into 100 components and identified 47 as functionally relevant intrinsic connectivity networks. We subsequently evaluated group differences in functional network connectivity, both in a static sense, computed as the pairwise Pearson correlations between the full network time courses (5.4 minutes in length), and a dynamic sense, computed using sliding windows (44 s in length) and k-means clustering to characterize five discrete functional connectivity states. Static connectivity analysis revealed that compared to healthy controls, patients show significantly stronger connectivity, i.e., hyperconnectivity, between the thalamus and sensory networks (auditory, motor and visual), as well as reduced connectivity (hypoconnectivity) between sensory networks from all modalities. Dynamic analysis suggests that (1), on average, schizophrenia patients spend much less time than healthy controls in states typified by strong, large-scale connectivity, and (2), that abnormal connectivity patterns are more pronounced during these connectivity states. In particular, states exhibiting cortical-subcortical antagonism (anti-correlations) and strong positive connectivity between sensory networks are those that show the group differences of thalamic hyperconnectivity and sensory hypoconnectivity. Group differences are weak or absent during other connectivity states. Dynamic analysis also revealed hypoconnectivity between the putamen and sensory networks during the same states of thalamic hyperconnectivity; notably, this finding cannot be observed in the static connectivity analysis. Finally, in post-hoc analyses we observed that the relationships between sub-cortical low frequency power and connectivity with sensory networks is altered in patients, suggesting different functional interactions between sub-cortical nuclei and sensorimotor cortex during specific connectivity states. While important differences between patients with schizophrenia and healthy controls have been identified, one should interpret the results with caution given the history of medication in patients. Taken together, our results support and expand current knowledge regarding dysconnectivity in schizophrenia, and strongly advocate the use of dynamic analyses to better account for and understand functional connectivity differences.
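A compact sketch of the dynamic analysis described above: correlations between component time courses are computed in sliding windows, and the windowed connectivity patterns are clustered into discrete states with k-means. The window length (22 samples, roughly 44 s at TR = 2 s), the five clusters, and the random stand-in time courses mirror the description but are otherwise placeholders.

```python
import numpy as np
from sklearn.cluster import KMeans

def sliding_window_fnc(timecourses, win_len=22, step=1):
    """Windowed functional network connectivity: Pearson correlations between component
    time courses inside each sliding window, vectorized to the upper triangle."""
    n_t, n_c = timecourses.shape
    iu = np.triu_indices(n_c, k=1)
    windows = []
    for start in range(0, n_t - win_len + 1, step):
        corr = np.corrcoef(timecourses[start:start + win_len].T)
        windows.append(corr[iu])
    return np.array(windows)

# Stand-in for 47 component time courses; with TR = 2 s, 22 samples is roughly a 44 s window.
rng = np.random.default_rng(0)
tc = rng.standard_normal((162, 47))
fnc_windows = sliding_window_fnc(tc)
states = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(fnc_windows)
print(np.bincount(states))                 # how many windows fall into each connectivity state
```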
A comparative study of 2 computer-assisted methods of quantifying brightfield microscopy images.
Tse, George H; Marson, Lorna P
2013-10-01
Immunohistochemistry continues to be a powerful tool for the detection of antigens. There are several commercially available software packages that allow image analysis; however, these can be complex, require a relatively high level of computer skills, and can be expensive. We compared 2 commonly available software packages, Adobe Photoshop CS6 and ImageJ, in their ability to quantify percentage positive area after picrosirius red (PSR) staining and 3,3'-diaminobenzidine (DAB) staining. On analysis of DAB-stained B cells in the mouse spleen, with a biotinylated primary rat anti-mouse-B220 antibody, there was no significant difference when converting images from brightfield microscopy to binary images to measure black and white pixels using ImageJ compared with measuring a range of brown pixels with Photoshop (Student t test, P=0.243, correlation r=0.985). When analyzing mouse kidney allografts stained with PSR, Photoshop achieved a greater interquartile range while maintaining a lower 10th percentile value compared with analysis with ImageJ. A lower 10th percentile indicates that Photoshop analysis is better at analyzing tissues with low levels of positive pixels, which is particularly relevant for control tissues or negative controls, whereas ImageJ analysis of the same images would result in spuriously high levels of positivity. Furthermore, comparing the 2 methods by Bland-Altman plot revealed that these 2 methodologies did not agree when measuring images with a higher percentage of positive staining, and the correlation was poor (r=0.804). We conclude that for computer-assisted analysis of images of DAB-stained tissue there is no difference between using Photoshop or ImageJ. However, for analysis of color images where differentiation into a binary pattern is not easy, such as with PSR, Photoshop is superior at identifying higher levels of positivity while maintaining differentiation of low levels of positive staining.
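The binary-threshold measurement that underlies this kind of percent-positive-area comparison can be sketched in a few lines of Python; the synthetic image and the cut-off value below are hypothetical, and a real analysis would first isolate the DAB or PSR colour channel (for example by colour deconvolution) before thresholding.

    import numpy as np

    # Minimal sketch of the binary-threshold (ImageJ-style) percent positive area measurement.
    # 'image' stands in for a single-channel brightfield image.
    rng = np.random.default_rng(1)
    image = rng.integers(0, 256, size=(512, 512))      # placeholder 8-bit image

    threshold = 100                                     # hypothetical cut-off separating stain from background
    positive = image < threshold                        # darker pixels treated as positively stained
    percent_positive_area = 100.0 * positive.sum() / positive.size
    print(f"Percent positive area: {percent_positive_area:.1f}%")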
The soft computing-based approach to investigate allergic diseases: a systematic review.
Tartarisco, Gennaro; Tonacci, Alessandro; Minciullo, Paola Lucia; Billeci, Lucia; Pioggia, Giovanni; Incorvaia, Cristoforo; Gangemi, Sebastiano
2017-01-01
Early recognition of inflammatory markers and their relation to asthma, adverse drug reactions, allergic rhinitis, atopic dermatitis and other allergic diseases is an important goal in allergy. The vast majority of studies in the literature are based on classic statistical methods; however, developments in computational techniques such as soft computing-based approaches hold new promise in this field. The aim of this manuscript is to systematically review the main soft computing-based techniques such as artificial neural networks, support vector machines, Bayesian networks and fuzzy logic to investigate their performances in the field of allergic diseases. The review was conducted following PRISMA guidelines and the protocol was registered within the PROSPERO database (CRD42016038894). The research was performed on PubMed and ScienceDirect, covering the period starting from September 1, 1990 through April 19, 2016. The review included 27 studies related to allergic diseases and soft computing performances. We observed promising results with an overall accuracy of 86.5%, mainly focused on asthmatic disease. The review reveals that soft computing-based approaches are suitable for big data analysis and can be very powerful, especially when dealing with uncertainty and poorly characterized parameters. Furthermore, they can provide valuable support in case of lack of data and entangled cause-effect relationships, which make it difficult to assess the evolution of disease. Although most works deal with asthma, we believe the soft computing approach could be a real breakthrough and foster new insights into other allergic diseases as well.
Huang, Qinlong; Yang, Yixian; Shi, Yuxiang
2018-02-24
With the growing number of vehicles and popularity of various services in vehicular cloud computing (VCC), message exchanging among vehicles under traffic conditions and in emergency situations is one of the most pressing demands, and has attracted significant attention. However, it is an important challenge to authenticate the legitimate sources of broadcast messages and achieve fine-grained message access control. In this work, we propose SmartVeh, a secure and efficient message access control and authentication scheme in VCC. A hierarchical, attribute-based encryption technique is utilized to achieve fine-grained and flexible message sharing, which ensures that vehicles whose persistent or dynamic attributes satisfy the access policies can access the broadcast message with equipped on-board units (OBUs). Message authentication is enforced by integrating an attribute-based signature, which achieves message authentication and maintains the anonymity of the vehicles. In order to reduce the computations of the OBUs in the vehicles, we outsource the heavy computations of encryption, decryption and signing to a cloud server and road-side units. The theoretical analysis and simulation results reveal that our secure and efficient scheme is suitable for VCC.
Computing Critical Properties with Yang-Yang Anomalies
NASA Astrophysics Data System (ADS)
Orkoulas, Gerassimos; Cerdeirina, Claudio; Fisher, Michael
2017-01-01
Computation of the thermodynamics of fluids in the critical region is a challenging task owing to the divergence of the correlation length and the lack of the particle-hole symmetries found in Ising or lattice-gas models. In addition, analysis of experiments and simulations reveals a Yang-Yang (YY) anomaly, which entails sharing of the specific heat singularity between the pressure and the chemical potential. The size of the YY anomaly is measured by the YY ratio R_μ = C_μ/C_V of the amplitudes of C_μ = -T d²μ/dT² and of the total specific heat C_V. A "complete scaling" theory, in which the pressure mixes into the scaling fields, accounts for the YY anomaly. In Phys. Rev. Lett. 116, 040601 (2016), compressible cell gas (CCG) models, which exhibit YY and singular diameter anomalies, have been advanced for near-critical fluids. In such models, the individual cell volumes are allowed to fluctuate. The thermodynamics of CCGs can be computed through mapping onto the Ising model via the seldom-used great grand canonical ensemble. The computations indicate that local free volume fluctuations are the origins of the YY effects. Furthermore, local energy-volume coupling (to model water) is another crucial factor underlying the phenomena.
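A quick numerical illustration of the quantity entering the YY ratio: the Python sketch below differentiates a placeholder chemical potential μ(T) twice by finite differences to obtain C_μ = -T d²μ/dT²; the functional form of μ(T) is invented solely to produce a weak singularity at the critical point and is not a CCG or complete-scaling result.

    import numpy as np

    # Finite-difference sketch of C_mu = -T d^2(mu)/dT^2, the numerator of R_mu = C_mu / C_V.
    T = np.linspace(0.90, 1.10, 2001)
    mu = -1.5 * T - 0.3 * np.abs(1.0 - T) ** 1.9      # placeholder mu(T) with a weak singularity at T = 1

    d2mu_dT2 = np.gradient(np.gradient(mu, T), T)
    C_mu = -T * d2mu_dT2
    print(f"C_mu near the critical point: C_mu(1.00) = {C_mu[1000]:.2f}, C_mu(1.05) = {C_mu[1500]:.2f}")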
Grammatical Analysis as a Distributed Neurobiological Function
Bozic, Mirjana; Fonteneau, Elisabeth; Su, Li; Marslen-Wilson, William D
2015-01-01
Language processing engages large-scale functional networks in both hemispheres. Although it is widely accepted that left perisylvian regions have a key role in supporting complex grammatical computations, patient data suggest that some aspects of grammatical processing could be supported bilaterally. We investigated the distribution and the nature of grammatical computations across language processing networks by comparing two types of combinatorial grammatical sequences—inflectionally complex words and minimal phrases—and contrasting them with grammatically simple words. Novel multivariate analyses revealed that they engage a coalition of separable subsystems: inflected forms triggered left-lateralized activation, dissociable into dorsal processes supporting morphophonological parsing and ventral, lexically driven morphosyntactic processes. In contrast, simple phrases activated a consistently bilateral pattern of temporal regions, overlapping with inflectional activations in L middle temporal gyrus. These data confirm the role of the left-lateralized frontotemporal network in supporting complex grammatical computations. Critically, they also point to the capacity of bilateral temporal regions to support simple, linear grammatical computations. This is consistent with a dual neurobiological framework where phylogenetically older bihemispheric systems form part of the network that supports language function in the modern human, and where significant capacities for language comprehension remain intact even following severe left hemisphere damage. PMID:25421880
Computer-Vision-Assisted Palm Rehabilitation With Supervised Learning.
Vamsikrishna, K M; Dogra, Debi Prosad; Desarkar, Maunendra Sankar
2016-05-01
Physical rehabilitation supported by the computer-assisted-interface is gaining popularity among health-care fraternity. In this paper, we have proposed a computer-vision-assisted contactless methodology to facilitate palm and finger rehabilitation. Leap motion controller has been interfaced with a computing device to record parameters describing 3-D movements of the palm of a user undergoing rehabilitation. We have proposed an interface using Unity3D development platform. Our interface is capable of analyzing intermediate steps of rehabilitation without the help of an expert, and it can provide online feedback to the user. Isolated gestures are classified using linear discriminant analysis (DA) and support vector machines (SVM). Finally, a set of discrete hidden Markov models (HMM) have been used to classify gesture sequence performed during rehabilitation. Experimental validation using a large number of samples collected from healthy volunteers reveals that DA and SVM perform similarly while applied on isolated gesture recognition. We have compared the results of HMM-based sequence classification with CRF-based techniques. Our results confirm that both HMM and CRF perform quite similarly when tested on gesture sequences. The proposed system can be used for home-based palm or finger rehabilitation in the absence of experts.
2017-01-01
Computational modeling has been applied to simulate the heterogeneity of cancer behavior. The development of Cervical Cancer (CC) is a process in which the cell acquires dynamic behavior from non-deleterious and deleterious mutations, exhibiting chromosomal alterations as a manifestation of this dynamic. To further determine the progression of chromosomal alterations in precursor lesions and CC, we introduce a computational model to study the dynamics of deleterious and non-deleterious mutations as an outcome of tumor progression. The analysis of chromosomal alterations mediated by our model reveals that multiple deleterious mutations are more frequent in precursor lesions than in CC. Cells with lethal deleterious mutations would be eliminated, which would mitigate cancer progression; on the other hand, cells with non-deleterious mutations would become dominant, which could predispose them to cancer progression. The study of somatic alterations through computer simulations of cancer progression provides a feasible pathway for insights into the transformation of cell mechanisms in humans. During cancer progression, tumors may acquire new phenotype traits, such as the ability to invade and metastasize or to become clinically important when they develop drug resistance. Non-deleterious chromosomal alterations contribute to this progression. PMID:28723940
Cellular intelligence: Microphenomenology and the realities of being.
Ford, Brian J
2017-12-01
Traditions of Eastern thought conceptualised life in a holistic sense, emphasising the processes of maintaining health and conquering sickness as manifestations of an essentially spiritual principle that was of overriding importance in the conduct of living. Western science, which drove the overriding and partial eclipse of Eastern traditions, became founded on a reductionist quest for ultimate realities which, in the modern scientific world, has embraced the notion that every living process can be successfully modelled by a digital computer system. It is argued here that the essential processes of cognition, response and decision-making inherent in living cells transcend conventional modelling, and microscopic studies of organisms like the shell-building amoebae and the rhodophyte alga Antithamnion reveal a level of cellular intelligence that is unrecognized by science and is not amenable to computer analysis. Copyright © 2017. Published by Elsevier Ltd.
Kaga, Akimune; Murotsuki, Jun; Kamimura, Miki; Kimura, Masato; Saito-Hakoda, Akiko; Kanno, Junko; Hoshi, Kazuhiko; Kure, Shigeo; Fujiwara, Ikuma
2015-05-01
Achondroplasia and Down syndrome are relatively common conditions individually. But co-occurrence of both conditions in the same patient is rare and there have been no reports of fetal analysis of this condition by prenatal sonographic and three-dimensional (3-D) helical computed tomography (CT). Prenatal sonographic findings seen in persons with Down syndrome, such as a thickened nuchal fold, cardiac defects, and echogenic bowel were not found in the patient. A prenatal 3-D helical CT revealed a large head with frontal bossing, metaphyseal flaring of the long bones, and small iliac wings, which suggested achondroplasia. In a case with combination of achondroplasia and Down syndrome, it may be difficult to diagnose the co-occurrence prenatally without typical markers of Down syndrome. © 2014 Japanese Teratology Society.
Coexistence of glandular papilloma and sclerosing pneumocytoma in the bronchiole.
Kitawaki, Yuko; Fujishima, Fumiyoshi; Taniuchi, Shinji; Saito, Ryoko; Nakamura, Yasuhiro; Sato, Ryoko; Aoyama, Yayoi; Onodera, Yoshiaki; Inoshita, Naoko; Matsuda, Yasushi; Watanabe, Mika; Sasano, Hironobu
2018-04-25
Both glandular papilloma (GP) and sclerosing pneumocytoma (SP) are rare tumors in the lung. We herein report an extremely rare case of coexistence of these two uncommon tumors. The patient was a 40-year-old Japanese woman with no chief complaint. A solitary nodule of the lung was detected using chest computed tomography. The transbronchial biopsy revealed that the tumor histologically corresponded to GP. The patient subsequently underwent partial resection of the right upper lobe. Histological examination of the resected specimens further revealed that the mass contained two different and independent elements and displayed the typical histological features of GP and SP. Molecular analysis further revealed the presence of BRAF V600E and AKT1 E17K mutations in GP, whereas only the AKT1 mutation was detected in SP. To our knowledge, this is the first case of coexistence of GP and SP in the bronchiole harboring a common AKT1 mutation but differing BRAF V600E mutational status. © 2018 Japanese Society of Pathology and John Wiley & Sons Australia, Ltd.
Ancient DNA sequence revealed by error-correcting codes.
Brandão, Marcelo M; Spoladore, Larissa; Faria, Luzinete C B; Rocha, Andréa S L; Silva-Filho, Marcio C; Palazzo, Reginaldo
2015-07-10
A previously described DNA sequence generator algorithm (DNA-SGA) using error-correcting codes has been employed as a computational tool to address the evolutionary pathway of the genetic code. The code-generated sequence alignment demonstrated that a residue mutation revealed by the code can be found in the same position in sequences of distantly related taxa. Furthermore, the code-generated sequences do not promote amino acid changes in the deviant genomes through codon reassignment. A Bayesian evolutionary analysis of both code-generated and homologous sequences of the Arabidopsis thaliana malate dehydrogenase gene indicates an approximately 1 MYA divergence time from the MDH code-generated sequence node to its paralogous sequences. The DNA-SGA helps to determine the plesiomorphic state of DNA sequences because a single nucleotide alteration often occurs in distantly related taxa and can be found in the alternative codon patterns of noncanonical genetic codes. As a consequence, the algorithm may reveal an earlier stage of the evolution of the standard code.
Intestinal gastrointestinal stromal tumor in a cat
SUWA, Akihisa; SHIMODA, Tetsuya
2017-01-01
A 12-year-old, 3.6-kg, spayed female domestic shorthaired cat had a 2-month history of anorexia and weight loss. Abdominal ultrasonography and computed tomography revealed an exophytic mass originating from the jejunum with very poor central and poor peripheral contrast enhancement. On day 14, surgical resection of the jejunum and mass with 5-cm margins and an end-to-end anastomosis were performed. Histopathological examination revealed the mass was a transmural, invasive cancer showing exophytic growth and originating from the small intestinal muscle layer. Immunohistochemical analysis of tumor cells revealed diffuse positivity for KIT protein and negativity for desmin and S-100. The mass was diagnosed as a gastrointestinal stromal tumor (GIST). Ultrasonographic findings indicated the tumor probably metastasized to the liver and omentum, as seen in humans and dogs. The owner rejected further treatment at the last visit on day 192. To our knowledge, this is the first report of intestinal tumor and metastasis in feline GIST and its imaging features. PMID:28163271
Automated image analysis reveals the dynamic 3-dimensional organization of multi-ciliary arrays
Galati, Domenico F.; Abuin, David S.; Tauber, Gabriel A.; Pham, Andrew T.; Pearson, Chad G.
2016-01-01
Multi-ciliated cells (MCCs) use polarized fields of undulating cilia (ciliary array) to produce fluid flow that is essential for many biological processes. Cilia are positioned by microtubule scaffolds called basal bodies (BBs) that are arranged within a spatially complex 3-dimensional (3D) geometry. Here, we develop a robust and automated computational image analysis routine to quantify 3D BB organization in the ciliate Tetrahymena thermophila. Using this routine, we generate the first morphologically constrained 3D reconstructions of Tetrahymena cells and elucidate rules that govern the kinetics of MCC organization. We demonstrate the interplay between BB duplication and cell size expansion through the cell cycle. In mutant cells, we identify a potential BB surveillance mechanism that balances large gaps in BB spacing by increasing the frequency of closely spaced BBs in other regions of the cell. Finally, by taking advantage of a mutant predisposed to BB disorganization, we locate the spatial domains that are most prone to disorganization by environmental stimuli. Collectively, our analyses reveal the importance of quantitative image analysis to understand the principles that guide the 3D organization of MCCs. PMID:26700722
Yamashita, Ken-Ichiro; Taniwaki, Yoshihide; Utsunomiya, Hidetsuna; Taniwaki, Takayuki
2014-01-01
Impairment of orientation for time (OT) is a characteristic symptom of Alzheimer disease (AD). However, the brain regions underlying OT remain to be elucidated. Using single photon emission computed tomography (SPECT), we examined the brain regions exhibiting hypoperfusion that were associated with OT. We compared regional cerebral blood flow (rCBF) differences between AD and amnesic mild cognitive impairment (aMCI) or normal subjects using 3-dimensional stereotactic surface projection (3D-SSP) analysis. AD patients were divided into OT good and poor groups according to their mean OT scores, and rCBF was then compared between the groups to elucidate OT-specific brain areas. 3D-SSP analysis showed reduced rCBF in the left superior parietal lobule (SPL) and bilateral inferior parietal lobule (IPL) in AD patients. In the poor OT group, 3D-SSP analysis revealed hypoperfusion in the bilateral SPL, IPL, posterior cingulate cortex (PCC), and precuneus. Among these areas, region of interest analysis revealed a significantly higher number of hypoperfused pixels in the left PCC in the OT poor AD group. Our SPECT study suggested that hypoperfusion in the left SPL and bilateral IPL was AD specific, and reduced rCBF in the left PCC was specifically associated with OT. Copyright © 2014 by the American Society of Neuroimaging.
ERIC Educational Resources Information Center
Adalier, Ahmet
2012-01-01
The aim of this study is to reveal the relation between the Turkish and English language teacher candidates' social demographic characteristics and their perceived computer self-efficacy and attitudes toward computer. The population of the study consists of the teacher candidates in the Turkish and English language departments at the universities…
Reviewing and Critiquing Computer Learning and Usage among Older Adults
ERIC Educational Resources Information Center
Kim, Young Sek
2008-01-01
By searching the keywords of "older adult" and "computer" in ERIC, Academic Search Premier, and PsycINFO, this study reviewed 70 studies published after 1990 that address older adults' computer learning and usage. This study revealed 5 prominent themes among reviewed literature: (a) motivations and barriers of older adults' usage of computers, (b)…
A Method for Aircraft Concept Selection Using Multicriteria Interactive Genetic Algorithms
NASA Technical Reports Server (NTRS)
Buonanno, Michael; Mavris, Dimitri
2005-01-01
The problem of aircraft concept selection has become increasingly difficult in recent years as a result of a change from performance as the primary evaluation criterion for aircraft concepts to the current situation in which environmental effects, economics, and aesthetics must also be evaluated and considered in the earliest stages of the decision-making process. This has prompted a shift from design using historical data regression techniques for metric prediction to the use of physics-based analysis tools that are capable of analyzing designs outside of the historical database. The use of optimization methods with these physics-based tools, however, has proven difficult because of the tendency of optimizers to exploit assumptions present in the models and drive the design towards a solution which, while promising to the computer, may be infeasible due to factors not considered by the computer codes. In addition to this difficulty, the number of discrete options available at this stage may be unmanageable due to the combinatorial nature of the concept selection problem, leading the analyst to arbitrarily choose a sub-optimum baseline vehicle. These concept decisions, such as the type of control surface scheme to use, though extremely important, are frequently made without sufficient understanding of their impact on the important system metrics because of a lack of computational resources or analysis tools. This paper describes a hybrid subjective/quantitative optimization method and its application to the concept selection of a Small Supersonic Transport. The method uses Genetic Algorithms to operate on a population of designs and promote improvement by varying more than sixty parameters governing the vehicle geometry, mission, and requirements. In addition to using computer codes for evaluation of quantitative criteria such as gross weight, expert input is also considered to account for criteria such as aeroelasticity or manufacturability, which may be impossible or too computationally expensive to consider explicitly in the analysis. Results indicate that concepts resulting from the use of this method represent designs which are promising to both the computer and the analyst, and that a mapping between concepts and requirements that would not otherwise be apparent is revealed.
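A minimal genetic-algorithm loop of the kind described above is sketched below in Python; the chromosome length, fitness function (a quantitative term plus a stand-in for subjective expert scoring), and GA settings are all placeholders rather than the values used in the Small Supersonic Transport study.

    import numpy as np

    rng = np.random.default_rng(42)
    n_genes, pop_size, n_gen = 12, 40, 60          # placeholder sizes, not the paper's ~60 design variables

    def fitness(x):
        quantitative = -np.sum((x - 0.3) ** 2)     # stand-in for computed metrics (e.g. gross weight)
        expert_penalty = -0.5 * x[0]               # stand-in for a subjective expert criterion
        return quantitative + expert_penalty

    pop = rng.random((pop_size, n_genes))
    for _ in range(n_gen):
        scores = np.array([fitness(ind) for ind in pop])
        # Tournament selection: keep the fitter of two random individuals.
        parents = pop[[max(rng.integers(0, pop_size, 2), key=lambda i: scores[i])
                       for _ in range(pop_size)]]
        # Uniform crossover with the neighbouring parent, then sparse Gaussian mutation.
        mask = rng.random((pop_size, n_genes)) < 0.5
        children = np.where(mask, parents, np.roll(parents, 1, axis=0))
        children += 0.05 * rng.standard_normal(children.shape) * (rng.random(children.shape) < 0.1)
        pop = np.clip(children, 0.0, 1.0)

    best = pop[np.argmax([fitness(ind) for ind in pop])]
    print("Best concept vector:", np.round(best, 2))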
DOE Office of Scientific and Technical Information (OSTI.GOV)
Okada, H.; Kato, M.; Ishimaru, T.
2014-02-20
Organometallic chemical vapor deposition of silicon nitride films enhanced by atomic nitrogen generated from surface-wave plasma is investigated. Feasibility of precursors of triethylsilane (TES) and bis(dimethylamino)dimethylsilane (BDMADMS) is discussed based on a calculation of bond energies by computer simulation. Refractive indices of 1.81 and 1.71 are obtained for deposited films with TES and BDMADMS, respectively. X-ray photoelectron spectroscopy (XPS) analysis of the deposited film revealed that TES-based film coincides with the stoichiometric thermal silicon nitride.
Computational neuroanatomy of speech production.
Hickok, Gregory
2012-01-05
Speech production has been studied predominantly from within two traditions, psycholinguistics and motor control. These traditions have rarely interacted, and the resulting chasm between these approaches seems to reflect a level of analysis difference: whereas motor control is concerned with lower-level articulatory control, psycholinguistics focuses on higher-level linguistic processing. However, closer examination of both approaches reveals a substantial convergence of ideas. The goal of this article is to integrate psycholinguistic and motor control approaches to speech production. The result of this synthesis is a neuroanatomically grounded, hierarchical state feedback control model of speech production.
Information systems analysis approach in hospitals: a national survey.
Wong, B K; Sellaro, C L; Monaco, J A
1995-03-01
A survey of 216 hospitals reveals that some hospitals do not conduct cost-benefit analyses or analyze possible adverse effects in feasibility studies. In determining and analyzing system requirements, external factors that initiate the transaction are not examined, and computer-aided software engineering (CASE) tools are seldom used. Some hospitals do not investigate the advantages and disadvantages of using in-house-developed software versus purchased software packages in the evaluation of alternatives. The survey finds that, overall, most hospitals follow the traditional systems development life cycle (SDLC) approach in analyzing information systems.
Validation of the Transient Structural Response of a Threaded Assembly: Phase I
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doebling, Scott W.; Hemez, Francois M.; Robertson, Amy N.
2004-04-01
This report explores the application of model validation techniques in structural dynamics. The problem of interest is the propagation of an explosive-driven mechanical shock through a complex threaded joint. The study serves the purpose of assessing whether validating a large-size computational model is feasible, which unit experiments are required, and where the main sources of uncertainty reside. The results documented here are preliminary, and the analyses are exploratory in nature. The results obtained to date reveal several deficiencies of the analysis, to be rectified in future work.
Synthetic aperture radar operator tactical target acquisition research
NASA Technical Reports Server (NTRS)
Hershberger, M. L.; Craig, D. W.
1978-01-01
A radar target acquisition research study was conducted to assess the effects of two levels of 13 radar sensor, display, and mission parameters on operator tactical target acquisition. A saturated fractional-factorial screening design was employed to examine these parameters. Data analysis computed eta-squared values for main and second-order effects for the variables tested. Ranking of the research parameters in terms of importance to system design revealed that four variables (radar coverage, radar resolution/multiple looks, display resolution, and display size) accounted for 50 percent of the target acquisition probability variance.
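The eta-squared measure used to rank parameter importance is simply the ratio of an effect's sum of squares to the total sum of squares; the Python sketch below computes it for a made-up two-level screening design and is not the study's data or its full fractional-factorial analysis.

    import numpy as np

    # eta^2 = SS_effect / SS_total for each two-level factor (placeholder design and responses).
    rng = np.random.default_rng(3)
    levels = rng.integers(0, 2, size=(32, 4))            # 32 runs, 4 two-level factors
    response = levels @ np.array([5.0, 2.0, 0.5, 0.1]) + rng.standard_normal(32)

    grand_mean = response.mean()
    ss_total = ((response - grand_mean) ** 2).sum()
    for j in range(levels.shape[1]):
        ss_effect = sum(len(response[levels[:, j] == lev]) *
                        (response[levels[:, j] == lev].mean() - grand_mean) ** 2
                        for lev in (0, 1))
        print(f"factor {j}: eta^2 = {ss_effect / ss_total:.2f}")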
NASA Astrophysics Data System (ADS)
Tavakoli, Mohammad Hossein; Renani, Elahe Kabiri; Honarmandnia, Mohtaram; Ezheiyan, Mahdi
2018-02-01
In this paper, a set of numerical simulations of fluid flow, temperature gradient, thermal stress and dislocation density for a Czochralski setup used to grow IR optical-grade Ge single crystal have been done for different stages of the growth process. A two-dimensional steady state finite element method has been applied for all calculations. The obtained numerical results reveal that the thermal field, thermal stress and dislocation structure are mainly dependent on the crystal height, heat radiation and gas flow in the growth system.
NASA Astrophysics Data System (ADS)
Goloshumova, V. N.; Kortenko, V. V.; Pokhoriler, V. L.; Kultyshev, A. Yu.; Ivanovskii, A. A.
2008-08-01
We describe the experience that ZAO Ural Turbine Works specialists gained in mastering a series of CAD/CAE/CAM/PDM technologies, modern software tools for computer-aided engineering. We also present the results obtained from mathematical simulation of the process through which high- and intermediate-pressure rotors are heated, carried out to reveal the most thermally stressed zones, as well as the results from mathematical simulation of a new design of turbine cylinder shells aimed at improving the maneuverability of these turbines.
Piloting the membranolytic activities of peptides with a self-organizing map.
Lin, Yen-Chu; Hiss, Jan A; Schneider, Petra; Thelesklaf, Peter; Lim, Yi Fan; Pillong, Max; Koehler, Fabian M; Dittrich, Petra S; Halin, Cornelia; Wessler, Silja; Schneider, Gisbert
2014-10-13
Antimicrobial peptides (AMPs) show remarkable selectivity toward lipid membranes and possess promising antibiotic potential. Their modes of action are diverse and not fully understood, and innovative peptide design strategies are needed to generate AMPs with improved properties. We present a de novo peptide design approach that resulted in new AMPs possessing low-nanomolar membranolytic activities. Thermal analysis revealed an entropy-driven mechanism of action. The study demonstrates sustained potential of advanced computational methods for designing peptides with the desired activity. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Concentrating on beauty: sexual selection and sociospatial memory.
Becker, D Vaughn; Kenrick, Douglas T; Guerin, Stephen; Maner, Jon K
2005-12-01
In three experiments, location memory for faces was examined using a computer version of the matching game Concentration. Findings suggested that physical attractiveness led to more efficient matching for female faces but not for male faces. Study 3 revealed this interaction despite allowing participants to initially see, attend to, and match the attractive male faces in the first few turns. Analysis of matching errors suggested that, compared to other targets, attractive women were less confusable with one another. Results are discussed in terms of the different functions that attractiveness serves for men and women.
Assessment of uncertainties of the models used in thermal-hydraulic computer codes
NASA Astrophysics Data System (ADS)
Gricay, A. S.; Migrov, Yu. A.
2015-09-01
The article deals with matters concerned with the problem of determining the statistical characteristics of variable parameters (the variation range and distribution law) in analyzing the uncertainty and sensitivity of calculation results to uncertainty in input data. A comparative analysis of modern approaches to uncertainty in input data is presented. The need to develop an alternative method for estimating the uncertainty of model parameters used in thermal-hydraulic computer codes, in particular, in the closing correlations of the loop thermal hydraulics block, is shown. Such a method shall feature the minimal degree of subjectivism and must be based on objective quantitative assessment criteria. The method includes three sequential stages: selecting experimental data satisfying the specified criteria, identifying the key closing correlation using a sensitivity analysis, and carrying out case calculations followed by statistical processing of the results. By using the method, one can estimate the uncertainty range of a variable parameter and establish its distribution law in the above-mentioned range provided that the experimental information is sufficiently representative. Practical application of the method is demonstrated taking as an example the problem of estimating the uncertainty of a parameter appearing in the model describing transition to post-burnout heat transfer that is used in the thermal-hydraulic computer code KORSAR. The performed study revealed the need to narrow the previously established uncertainty range of this parameter and to replace the uniform distribution law in the above-mentioned range by the Gaussian distribution law. The proposed method can be applied to different thermal-hydraulic computer codes. In some cases, application of the method can make it possible to achieve a smaller degree of conservatism in the expert estimates of uncertainties pertinent to the model parameters used in computer codes.
CMG-biotools, a free workbench for basic comparative microbial genomics.
Vesth, Tammi; Lagesen, Karin; Acar, Öncel; Ussery, David
2013-01-01
Today, there are more than a hundred times as many sequenced prokaryotic genomes as were present in the year 2000. The economical sequencing of genomic DNA has facilitated a whole new approach to microbial genomics. The real power of genomics is manifested through comparative genomics that can reveal strain-specific characteristics, diversity within species and many other aspects. However, comparative genomics is a field not easily entered into by scientists with few computational skills. The CMG-biotools package is designed for microbiologists with limited knowledge of computational analysis and can be used to perform a number of analyses and comparisons of genomic data. The CMG-biotools system presents a stand-alone interface for comparative microbial genomics. The package is a customized operating system, based on Xubuntu 10.10, available through the open source Ubuntu project. The system can be installed on a virtual computer, allowing the user to run the system alongside any other operating system. Source codes for all programs are provided under the GNU license, which makes it possible to transfer the programs to other systems if so desired. We here demonstrate the package by comparing and analyzing the diversity within the class Negativicutes, represented by 31 genomes including 10 genera. The analyses include 16S rRNA phylogeny, basic DNA and codon statistics, proteome comparisons using BLAST and graphical analyses of DNA structures. This paper shows the strength and diverse use of the CMG-biotools system. The system can be installed on a wide range of host operating systems and utilizes as much of the host computer as desired. It allows the user to compare multiple genomes from various sources, using standardized data formats and intuitive visualizations of results. The examples presented here clearly show that users with limited computational experience can perform complicated analysis without much training.
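As a small illustration of the "basic DNA and codon statistics" such a workbench reports, the Python sketch below computes GC content and codon counts for an invented sequence fragment; it is not part of CMG-biotools and uses no data from the Negativicutes genomes.

    from collections import Counter

    seq = "ATGGCGTTAGGCTGCAAATAA"                      # made-up coding fragment
    gc_content = 100.0 * sum(seq.count(b) for b in "GC") / len(seq)
    codon_usage = Counter(seq[i:i + 3] for i in range(0, len(seq) - 2, 3))
    print(f"GC content: {gc_content:.1f}%")
    print("Codon usage:", dict(codon_usage))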
Role of virtual bronchoscopy in children with a vegetable foreign body in the tracheobronchial tree.
Behera, G; Tripathy, N; Maru, Y K; Mundra, R K; Gupta, Y; Lodha, M
2014-12-01
Multidetector computed tomography virtual bronchoscopy is a non-invasive diagnostic tool which provides a three-dimensional view of the tracheobronchial airway. This study aimed to evaluate the usefulness of virtual bronchoscopy in cases of vegetable foreign body aspiration in children. The medical records of patients with a history of foreign body aspiration from August 2006 to August 2010 were reviewed. Data were collected regarding their clinical presentation and chest X-ray, virtual bronchoscopy and rigid bronchoscopy findings. Cases of metallic and other non-vegetable foreign bodies were excluded from the analysis. Patients with multidetector computed tomography virtual bronchoscopy showing features of a vegetable foreign body were included in the analysis. For each patient, virtual bronchoscopy findings were reviewed and compared with those of rigid bronchoscopy. A total of 60 patients, all children ranging from 1 month to 8 years of age, were included. The mean age at presentation was 2.01 years. Rigid bronchoscopy confirmed the results of multidetector computed tomography virtual bronchoscopy (i.e. presence of foreign body, site of lodgement, and size and shape) in 59 patients. In the remaining case, a vegetable foreign body identified by virtual bronchoscopy was revealed by rigid bronchoscopy to be a thick mucus plug. Thus, the positive predictive value of virtual bronchoscopy was 98.3 per cent. Multidetector computed tomography virtual bronchoscopy is a sensitive and specific diagnostic tool for identifying radiolucent vegetable foreign bodies in the tracheobronchial tree. It can also provide a useful pre-operative road map for rigid bronchoscopy. Patients suspected of having an airway foreign body or chronic unexplained respiratory symptoms should undergo multidetector computed tomography virtual bronchoscopy to rule out a vegetable foreign body in the tracheobronchial tree and avoid general anaesthesia and invasive rigid bronchoscopy.
Dynamical analysis of Parkinsonian state emulated by hybrid Izhikevich neuron models
NASA Astrophysics Data System (ADS)
Liu, Chen; Wang, Jiang; Yu, Haitao; Deng, Bin; Wei, Xile; Li, Huiyan; Loparo, Kenneth A.; Fietkiewicz, Chris
2015-11-01
Computational models play a significant role in exploring novel theories to complement the findings of physiological experiments. Various computational models have been developed to reveal the mechanisms underlying brain functions. Particularly, in the development of therapies to modulate behavioral and pathological abnormalities, computational models provide the basic foundations to exhibit transitions between physiological and pathological conditions. Considering the significant roles of the intrinsic properties of the globus pallidus and the coupling connections between neurons in determining the firing patterns and the dynamical activities of the basal ganglia neuronal network, we propose a hypothesis that pathological behaviors under the Parkinsonian state may originate from combined effects of intrinsic properties of globus pallidus neurons and synaptic conductances in the whole neuronal network. In order to establish a computationally efficient network model, a hybrid Izhikevich neuron model is used owing to its capacity to capture the dynamical characteristics of biological neuronal activities. Detailed analysis of the individual Izhikevich neuron model can assist in understanding the roles of model parameters, which then facilitates the establishment of the basal ganglia-thalamic network model, and contributes to a further exploration of the underlying mechanisms of the Parkinsonian state. Simulation results show that the hybrid Izhikevich neuron model is capable of capturing many of the dynamical properties of the basal ganglia-thalamic neuronal network, such as variations of the firing rates and emergence of synchronous oscillations under the Parkinsonian condition, despite the simplicity of the two-dimensional neuronal model. This suggests that the computationally efficient hybrid Izhikevich neuron model can be used to explore basal ganglia normal and abnormal functions. In particular, it provides an efficient way of emulating large-scale neuronal networks and potentially contributes to the development of improved therapies for neurological disorders such as Parkinson's disease.
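The Izhikevich model at the core of such network simulations reduces each neuron to two coupled equations with a reset rule; the single-neuron Python sketch below uses generic regular-spiking parameters and simple Euler integration, not the tuned basal ganglia-thalamic parameters of the study.

    # Izhikevich (2003) single neuron: dv/dt = 0.04 v^2 + 5 v + 140 - u + I, du/dt = a (b v - u),
    # with reset v <- c, u <- u + d whenever v reaches 30 mV.
    a, b, c, d = 0.02, 0.2, -65.0, 8.0              # generic regular-spiking parameters
    dt, t_end, I = 0.5, 1000.0, 10.0                # time step (ms), duration (ms), input current

    v, u = -65.0, b * -65.0
    spike_times = []
    for step in range(int(t_end / dt)):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:                               # spike: record and reset
            spike_times.append(step * dt)
            v, u = c, u + d
    print(f"{len(spike_times)} spikes; mean rate {1000.0 * len(spike_times) / t_end:.1f} Hz")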
NASA Astrophysics Data System (ADS)
Griffard, Phyllis Baudoin
1999-11-01
The main research question of this study was: What gaps in biochemical understanding are revealed by a range of university introductory biology students as they work through a critically acclaimed multimedia program on photosynthesis, and what are the corresponding implications for elaboration of the Ausubel-Novak-Gowin Learning Theory (ANG, now Human Constructivism)? Twelve students, mixed for ability, gender and ethnicity, were recruited from two sections of "Bio 101." Before and after instruction in photosynthesis, in-depth clinical interviews were conducted during which participants completed a range of cognitive tasks such as sorting, concept mapping, explaining and predicting. Some tasks involved interacting with a computer simulation of photosynthesis. This study primarily employed qualitative case study and verbal analysis methods. Verbal analysis of the clinical interviews revealed numerous gaps that were categorized into typologies. The two major categories were propositional gaps and processing gaps. Propositional gaps were evident in development of participants' concepts, links and constructs. Significant among these were conceptual distance gaps and continuity of matter gaps. Gaps such as convention gaps and relative significance gaps seem to be due to naivete in the discipline. Processing gaps included gaps in graphic decoding skills and relevant cognitive habits such as self-monitoring and consulting prior knowledge. Although the gaps were easier to detect and isolate with the above-average participants, all participants showed evidence of at least some of these gaps. Since some gaps are not unexpected at all but the highest literacy levels, not all the gaps identified are to be considered deficiencies. The gaps identified support the attention given by ANG theorists to the role of prior knowledge and metacognition as well as the value of graphic organizers in knowledge construction. In addition, this study revealed numerous gaps in graphic decoding, indicating that both direct experience and explicit instruction are needed if students are to "learn how to learn with graphics," especially those graphics central to understanding a computer simulation's representations of structures, inputs, processes and outputs. It is hypothesized that gaps similar to those revealed in this study may be at the root of some alternative conceptions documented in the literature.
Reciprocal Questioning and Computer-based Instruction in Introductory Auditing: Student Perceptions.
ERIC Educational Resources Information Center
Watters, Mike
2000-01-01
An auditing course used reciprocal questioning (Socratic method) and computer-based instruction. Separate evaluations by 67 students revealed a strong aversion to the Socratic method; students expected professors to lecture. They showed a strong preference for the computer-based assignment. (SK)
Investigating College and Graduate Students' Multivariable Reasoning in Computational Modeling
ERIC Educational Resources Information Center
Wu, Hsin-Kai; Wu, Pai-Hsing; Zhang, Wen-Xin; Hsu, Ying-Shao
2013-01-01
Drawing upon the literature in computational modeling, multivariable reasoning, and causal attribution, this study aims at characterizing multivariable reasoning practices in computational modeling and revealing the nature of understanding about multivariable causality. We recruited two freshmen, two sophomores, two juniors, two seniors, four…
Fabijanić, Ivana; Matković-Čalogović, Dubravka; Pilepić, Viktor; Sanković, Krešimir
2017-12-01
The crystallization and characterization of a new polymorph of 2-thiouracil by single-crystal X-ray diffraction, Hirshfeld surface analysis and periodic density functional theory (DFT) calculations are described. The previously published polymorph (A) crystallizes in the triclinic space group P-1, while that described herein (B) crystallizes in the monoclinic space group P2₁/c. Periodic DFT calculations showed that the energies of polymorphs A and B, compared to the gas-phase geometry, were -108.8 and -29.4 kJ mol⁻¹, respectively. The two polymorphs have different intermolecular contacts that were analyzed and are discussed in detail. Significant differences in the molecular structure were found only in the bond lengths and angles involving heteroatoms that are involved in hydrogen bonds. Decomposition of the Hirshfeld fingerprint plots revealed that O...H and S...H contacts cover over 50% of the noncovalent contacts in both of the polymorphs; however, they are quite different in strength. Hydrogen bonds of the N-H...O and N-H...S types were found in polymorph A, whereas in polymorph B, only those of the N-H...O type are present, resulting in a different packing in the unit cell. QTAIM (quantum theory of atoms in molecules) computational analysis showed that the interaction energies for these weak-to-medium strength hydrogen bonds with a noncovalent or mixed interaction character were estimated to fall within the ranges 5.4-10.2 and 4.9-9.2 kJ mol⁻¹ for polymorphs A and B, respectively. Also, the NCI (noncovalent interaction) plots revealed weak stacking interactions. The interaction energies for these interactions were in the ranges 3.5-4.1 and 3.1-5.5 kJ mol⁻¹ for polymorphs A and B, respectively, as shown by QTAIM analysis.
Rajan, Vijisha K; Muraleedharan, K
2017-04-01
A computational DFT-B3LYP structural analysis of a polyphenol, gallic acid (GA), has been performed using the 6-311++G(df,p) basis set. GA is a relatively stable molecule with considerable radical scavenging capacity. It is a well-known antioxidant. The NBO analysis shows that the aromatic system is delocalized. The results reveal that the most stable radical is formed at the O3 atom upon scavenging of free radicals. Global descriptive parameters show that GA acts as an acceptor center in charge-transfer complex formation, which is supported by ESP and contour diagrams and also by the Q_max value. GA is a good antioxidant, and its antioxidant action can be better understood through the HAT and TMC mechanisms as it has low BDE, ΔH_acidity and ΔG_acidity values. The ΔBDE and ΔAIP values also confirm that the antioxidant capacity of GA can be explained through HAT rather than the SET-PT mechanism. Copyright © 2016 Elsevier Ltd. All rights reserved.
Sung, Yao-Ting; Yang, Je-Ming; Lee, Han-Yueh
2017-08-01
One of the trends in collaborative learning is using mobile devices for supporting the process and products of collaboration, which has been forming the field of mobile-computer-supported collaborative learning (mCSCL). Although mobile devices have become valuable collaborative learning tools, evaluative evidence for their substantial contributions to collaborative learning is still scarce. The present meta-analysis, which included 48 peer-reviewed journal articles and doctoral dissertations written over a 16-year period (2000-2015) involving 5,294 participants, revealed that mCSCL has produced meaningful improvements for collaborative learning, with an overall mean effect size of 0.516. Moderator variables, such as domain subject, group size, teaching method, intervention duration, and reward method were related to different effect sizes. The results provided implications for future research and practice, such as suggestions on how to appropriately use the functionalities of mobile devices, how to best leverage mCSCL through effective group learning mechanisms, and what outcome variables should be included in future studies to fully elucidate the process and products of mCSCL.
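The overall mean effect size reported in such a meta-analysis is, in its simplest fixed-effect form, an inverse-variance-weighted average of the study effect sizes; the Python sketch below illustrates that computation with invented effects and variances and omits the random-effects and moderator analyses the study describes.

    import numpy as np

    # Placeholder study-level effect sizes and variances (not the 48 studies analyzed above).
    effects = np.array([0.35, 0.60, 0.48, 0.72, 0.41])
    variances = np.array([0.020, 0.035, 0.015, 0.050, 0.025])

    weights = 1.0 / variances                          # inverse-variance weights
    mean_effect = np.sum(weights * effects) / np.sum(weights)
    se = np.sqrt(1.0 / np.sum(weights))
    print(f"weighted mean effect size = {mean_effect:.3f} "
          f"(95% CI {mean_effect - 1.96 * se:.3f} to {mean_effect + 1.96 * se:.3f})")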
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adur, Rohan, E-mail: adur@physics.osu.edu; Du, Chunhui; Manuilov, Sergei A.
2015-05-07
The dipole field from a probe magnet can be used to localize a discrete spectrum of standing spin wave modes in a continuous ferromagnetic thin film without lithographic modification to the film. Obtaining the resonance field for a localized mode is not trivial due to the effect of the confined and inhomogeneous magnetization precession. We compare the results of micromagnetic and analytic methods to find the resonance field of localized modes in a ferromagnetic thin film, and investigate the accuracy of these methods by comparing with a numerical minimization technique that assumes Bessel function modes with pinned boundary conditions. We find that the micromagnetic technique, while computationally more intensive, reveals that the true magnetization profiles of localized modes are similar to Bessel functions with gradually decaying dynamic magnetization at the mode edges. We also find that an analytic solution, which is simple to implement and computationally much faster than other methods, accurately describes the resonance field of localized modes when exchange fields are negligible, demonstrating the accessibility of localized-mode analysis.
Enzymatic Transition States, Transition-State Analogs, Dynamics, Thermodynamics, and Lifetimes
Schramm, Vern L.
2017-01-01
Experimental analysis of enzymatic transition-state structures uses kinetic isotope effects (KIEs) to report on bonding and geometry differences between reactants and the transition state. Computational correlation of experimental values with chemical models permits three-dimensional geometric and electrostatic assignment of transition states formed at enzymatic catalytic sites. The combination of experimental and computational access to transition-state information permits (a) the design of transition-state analogs as powerful enzymatic inhibitors, (b) exploration of protein features linked to transition-state structure, (c) analysis of ensemble atomic motions involved in achieving the transition state, (d) transition-state lifetimes, and (e) separation of ground-state (Michaelis complexes) from transition-state effects. Transition-state analogs with picomolar dissociation constants have been achieved for several enzymatic targets. Transition states of closely related isozymes indicate that the protein’s dynamic architecture is linked to transition-state structure. Fast dynamic motions in catalytic sites are linked to transition-state generation. Enzymatic transition states have lifetimes of femtoseconds, the lifetime of bond vibrations. Binding isotope effects (BIEs) reveal relative reactant and transition-state analog binding distortion for comparison with actual transition states. PMID:21675920
Correlation of ERTS MSS data and earth coordinate systems
NASA Technical Reports Server (NTRS)
Malila, W. A. (Principal Investigator); Hieber, R. H.; Mccleer, A. P.
1973-01-01
The author has identified the following significant results. Experience has revealed a problem in the analysis and interpretation of ERTS-1 multispectral scanner (MSS) data. The problem is one of accurately correlating ERTS-1 MSS pixels with analysis areas specified on aerial photographs or topographic maps for training recognition computers and/or evaluating recognition results. It is difficult for an analyst to accurately identify which ERTS-1 pixels on a digital image display belong to specific areas and test plots, especially when they are small. A computer-aided procedure to correlate coordinates from topographic maps and/or aerial photographs with ERTS-1 data coordinates has been developed. In the procedure, a map transformation from earth coordinates to ERTS-1 scan line and point numbers is calculated using selected ground control points and the method of least squares. The map transformation is then applied to the earth coordinates of selected areas to obtain the corresponding ERTS-1 point and line numbers. An optional provision allows moving the boundaries of the plots inward by variable distances so that the selected pixels will not overlap adjacent features.
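The least-squares map transformation described above can be sketched as an affine fit from ground coordinates to scan line and point numbers; in the Python example below, the ground control points are invented for illustration, and a production procedure might use a higher-order polynomial fit.

    import numpy as np

    # Invented ground control points: (easting, northing) and the corresponding (scan line, point).
    ground = np.array([[500100., 4210500.], [503200., 4212800.], [507900., 4209100.],
                       [501700., 4215300.], [506400., 4214200.]])
    scan = np.array([[1203., 857.], [1245., 912.], [1190., 1001.],
                     [1290., 880.], [1272., 965.]])

    A = np.hstack([ground, np.ones((len(ground), 1))])       # [x, y, 1] design matrix
    coeffs, *_ = np.linalg.lstsq(A, scan, rcond=None)        # 3x2 affine coefficients, least squares

    def to_scan(easting, northing):
        return np.array([easting, northing, 1.0]) @ coeffs   # -> (scan line, point)

    print(np.round(to_scan(504000., 4211000.), 1))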
Xu, Xiejun; Xiao, Xingqing; Wang, Yiming; Xu, Shouhong; Liu, Honglai
2018-06-13
Targeted therapy for cancer requires thermosensitive components in drug carriers for controlled drug release against viral cells. The conformational transition characteristic of leucine zipper-structured lipopeptides is utilized in our lab to modulate the phase transition temperature of liposomes, thus achieving temperature-responsive control. In this study, we computationally examined the conformational transition behaviors of leucine zipper-structured lipopeptides that were modified at the N-terminus by distinct functional groups. The conformational transition temperatures of these lipopeptides were determined by structural analysis of the implicit-solvent replica exchange molecular dynamics simulation trajectories using the dihedral angle principal component analysis and the dictionary of protein secondary structure method. Our calculations revealed that the computed transition temperatures of the lipopeptides are in good agreement with the experimental measurements. The effect of hydrogen bonds on the conformational stability of the lipopeptide dimers was examined in conventional explicit-solvent molecular dynamics simulations. A quantitative correlation of the degree of structural dissociation of the dimers and their binding strength is well described by an exponential fit of the binding free energies to the conformation transition temperatures of the lipopeptides.
Approximation of traveling wave solutions in wall-bounded flows using resolvent modes
NASA Astrophysics Data System (ADS)
McKeon, Beverley; Graham, Michael; Moarref, Rashad; Park, Jae Sung; Sharma, Ati; Willis, Ashley
2014-11-01
Significant recent attention has been devoted to computing and understanding exact traveling wave solutions of the Navier-Stokes equations. These solutions can be interpreted as the state-space skeleton of turbulence and are attractive benchmarks for studying low-order models of wall turbulence. Here, we project such solutions onto the velocity response (or resolvent) modes supplied by the gain-based resolvent analysis outlined by McKeon & Sharma (JFM, 2010). We demonstrate that in both pipe (Pringle et al., Phil. Trans. R. Soc. A, 2009) and channel (Waleffe, JFM, 2001) flows, the solutions can be well-described by a small number of resolvent modes. Analysis of the nonlinear forcing modes sustaining these solutions reveals the importance of small amplitude forcing, consistent with the large amplifications admitted by the resolvent operator. We investigate the use of resolvent modes as computationally cheap ``seeds'' for the identification of further traveling wave solutions. The support of AFOSR under Grants FA9550-09-1-0701, FA9550-12-1-0469, FA9550-11-1-0094 and FA9550-14-1-0042 (program managers Rengasamy Ponnappan, Doug Smith and Gregg Abate) is gratefully acknowledged.
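A minimal sketch of the resolvent-mode computation in the spirit of the gain-based analysis cited above, assuming the linearized operator has already been discretized to a matrix; the operator here is a random stand-in, and the SVD simply exposes forcing modes, response modes, and gains.

```python
import numpy as np

# Toy discretized linear operator A, standing in for the linearized
# Navier-Stokes operator at a fixed wavenumber pair (illustrative only).
rng = np.random.default_rng(1)
n = 50
A = rng.standard_normal((n, n)) - 5.0 * np.eye(n)

omega = 1.0
# Resolvent operator mapping nonlinear forcing to velocity response at omega.
H = np.linalg.inv(1j * omega * np.eye(n) - A)

# SVD: left singular vectors = response (velocity) modes,
# right singular vectors = forcing modes, singular values = gains.
U, sigma, Vh = np.linalg.svd(H)
print(sigma[:3])            # leading amplification gains
response_mode_1 = U[:, 0]   # candidate basis vector for projecting a solution
```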
Correlating CFD Simulation with Wind Tunnel Test for the Full-Scale UH-60A Airloads Rotor
NASA Technical Reports Server (NTRS)
Romander, Ethan; Norman, Thomas R.; Chang, I-Chung
2011-01-01
Data from the recent UH-60A Airloads Test in the National Full-Scale Aerodynamics Complex 40- by 80-Foot Wind Tunnel at NASA Ames Research Center are presented and compared to predictions computed by a loosely coupled Computational Fluid Dynamics (CFD)/comprehensive analysis. Primary calculations model the rotor in free air, but initial calculations are presented that include a model of the tunnel test section. The conditions studied include a speed sweep at constant lift up to an advance ratio of 0.4 and a thrust sweep at constant speed into deep stall. Predictions show reasonable agreement with measurement for integrated performance indicators such as power and propulsive force but occasionally deviate significantly. Detailed analysis of sectional airloads reveals good correlation in overall trends for normal force and pitching moment, although the pitching moment mean often differs. Chord force is frequently plagued by mean shifts and an overprediction of drag on the advancing side. Locations of significant aerodynamic phenomena are predicted accurately, although the magnitude of individual events is often missed.
Sung, Yao-Ting; Yang, Je-Ming; Lee, Han-Yueh
2017-01-01
One of the trends in collaborative learning is using mobile devices for supporting the process and products of collaboration, which has been forming the field of mobile-computer-supported collaborative learning (mCSCL). Although mobile devices have become valuable collaborative learning tools, evaluative evidence for their substantial contributions to collaborative learning is still scarce. The present meta-analysis, which included 48 peer-reviewed journal articles and doctoral dissertations written over a 16-year period (2000–2015) involving 5,294 participants, revealed that mCSCL has produced meaningful improvements for collaborative learning, with an overall mean effect size of 0.516. Moderator variables, such as domain subject, group size, teaching method, intervention duration, and reward method were related to different effect sizes. The results provided implications for future research and practice, such as suggestions on how to appropriately use the functionalities of mobile devices, how to best leverage mCSCL through effective group learning mechanisms, and what outcome variables should be included in future studies to fully elucidate the process and products of mCSCL. PMID:28989193
Foo, Jong Yong Abdiel
2009-01-01
The simplest and most widely used assessment of academic research and researchers is the journal impact factor (JIF). However, the JIF may exhibit patterns that are skewed towards journals that publish a high number of non-research items and short-turnover research. Moreover, there are concerns because the JIF is often used to compare journals from different disciplines. In this study, the JIF computation of eight top-ranked journals from four different subject categories was analyzed. The analysis reveals that most of the published items (>65%) in the science disciplines were non-research items, while fewer such items (<22%) were observed in engineering-based journals. The single regression analysis confirmed that there is a correlation (R² ≥ 0.99) in the number of published items or citations received over the two-year period used in the JIF calculation amongst the eight selected journals. A weighted factor computation is introduced to compensate for smaller journals and journals that publish longer-turnover research. It is hoped that the approach can provide a comprehensive assessment of the quality of a journal regardless of the disciplinary field.
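To make the arithmetic concrete, here is the standard two-year JIF calculation together with a hypothetical variant restricted to research items; the counts are invented for illustration, and the variant is not necessarily the weighted factor proposed in the study.

```python
# Standard two-year JIF arithmetic plus an illustrative research-items-only variant.
citations_to_prev_two_years = 4200      # citations in year Y to items from Y-1 and Y-2
items_published_prev_two_years = 600    # all citable items published in Y-1 and Y-2
research_items_prev_two_years = 210     # research articles only (hypothetical split)

jif = citations_to_prev_two_years / items_published_prev_two_years
jif_research_only = citations_to_prev_two_years / research_items_prev_two_years

print(f"JIF = {jif:.2f}, research-only variant = {jif_research_only:.2f}")
```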
Resolvent analysis of shear flows using One-Way Navier-Stokes equations
NASA Astrophysics Data System (ADS)
Rigas, Georgios; Schmidt, Oliver; Towne, Aaron; Colonius, Tim
2017-11-01
For three-dimensional flows, questions of stability, receptivity, secondary flows, and coherent structures require the solution of large partial-differential eigenvalue problems. Reduced-order approximations are thus required for engineering prediction, since these problems are often computationally intractable or prohibitively expensive. For spatially slowly evolving flows, such as jets and boundary layers, the One-Way Navier-Stokes (OWNS) equations permit a fast spatial marching procedure that results in a huge reduction in computational cost. Here, an adjoint-based optimization framework is proposed and demonstrated for calculating optimal boundary conditions and optimal volumetric forcing. The corresponding optimal response modes are validated against modes obtained from global resolvent analysis. For laminar base flows, the optimal modes reveal modal and non-modal transition mechanisms. For turbulent base flows, they predict the evolution of coherent structures in a statistical sense. Results from the application of the method to three-dimensional laminar wall-bounded flows and turbulent jets will be presented. This research was supported by the Office of Naval Research (N00014-16-1-2445) and Boeing Company (CT-BA-GTA-1).
A pertinent approach to solve nonlinear fuzzy integro-differential equations.
Narayanamoorthy, S; Sathiyapriya, S P
2016-01-01
Fuzzy integro-differential equations are an important part of fuzzy analysis theory, with theoretical as well as practical value in analytical dynamics, so an appropriate computational algorithm to solve them is essential. In this article, we use parametric forms of fuzzy numbers and suggest an applicable approach for solving nonlinear fuzzy integro-differential equations using the homotopy perturbation method. A clear and detailed description of the proposed method is provided. Our main objective is to illustrate that constructing an appropriate convex homotopy in a proper way leads to highly accurate solutions with less computational work. The efficiency of the approximation technique is expressed via stability and convergence analysis so as to guarantee the efficiency and performance of the methodology. Numerical examples are demonstrated to verify the convergence, and they reveal the validity of the presented numerical technique. Numerical results are tabulated and examined by comparing the obtained approximate solutions with the known exact solutions. Graphical representations of the exact and acquired approximate fuzzy solutions clarify the accuracy of the approach.
Thermotaxis is a Robust Mechanism for Thermoregulation in C. elegans Nematodes
Ramot, Daniel; MacInnis, Bronwyn L.; Lee, Hau-Chen; Goodman, Miriam B.
2013-01-01
Many biochemical networks are robust to variations in network or stimulus parameters. Although robustness is considered an important design principle of such networks, it is not known whether this principle also applies to higher-level biological processes such as animal behavior. In thermal gradients, C. elegans uses thermotaxis to bias its movement along the direction of the gradient. Here we develop a detailed, quantitative map of C. elegans thermotaxis and use these data to derive a computational model of thermotaxis in the soil, a natural environment of C. elegans. This computational analysis indicates that thermotaxis enables animals to avoid temperatures at which they cannot reproduce, to limit excursions from their adapted temperature, and to remain relatively close to the surface of the soil, where oxygen is abundant. Furthermore, our analysis reveals that this mechanism is robust to large variations in the parameters governing both worm locomotion and temperature fluctuations in the soil. We suggest that, similar to biochemical networks, animals evolve behavioral strategies that are robust, rather than strategies that rely on fine-tuning of specific behavioral parameters. PMID:19020047
Lin, Chun-Li; Chang, Yen-Hsiang; Pa, Che-An
2009-10-01
This study evaluated the risk of failure for an endodontically treated premolar with mesio-occluso-distal palatal (MODP) preparation and 3 different computer-aided design/computer-aided manufacturing (CAD/CAM) ceramic restoration configurations. Three 3-dimensional finite element (FE) models designed with CAD/CAM ceramic onlay, endocrown, and conventional crown restorations were constructed to perform simulations. The Weibull function was incorporated with FE analysis to calculate the long-term failure probability relative to different load conditions. The results indicated that the stress values on the enamel, dentin, and luting cement for the endocrown restoration were the lowest relative to the other 2 restorations. Weibull analysis revealed that the individual failure probabilities in the endocrown enamel, dentin, and luting cement were markedly lower than those for the onlay and conventional crown restorations. The overall failure probabilities were 27.5%, 1%, and 1% for onlay, endocrown, and conventional crown restorations, respectively, under normal occlusal conditions. This numeric investigation suggests that endocrown and conventional crown restorations for endodontically treated premolars with MODP preparation present similar longevity.
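A minimal sketch of the Weibull step, assuming a two-parameter Weibull model in which a characteristic strength and Weibull modulus convert a computed peak stress into a failure probability; the parameter values and stresses below are illustrative, not the study's fitted data.

```python
import numpy as np

def weibull_failure_probability(stress_mpa, sigma0_mpa, m):
    """Two-parameter Weibull failure probability for a given peak stress."""
    return 1.0 - np.exp(-(stress_mpa / sigma0_mpa) ** m)

# Illustrative numbers only (not the study's ceramic parameters or FE stresses).
for restoration, peak_stress in [("onlay", 160.0), ("endocrown", 95.0)]:
    p_f = weibull_failure_probability(peak_stress, sigma0_mpa=200.0, m=10.0)
    print(restoration, f"{100 * p_f:.1f}% failure probability")
```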
Yao, Shengnan; Zeng, Weiming; Wang, Nizhuan; Chen, Lei
2013-07-01
Independent component analysis (ICA) has been proven effective for functional magnetic resonance imaging (fMRI) data analysis. However, ICA decomposition requires iterative optimization of the unmixing matrix, whose initial values are generated randomly, so the randomness of the initialization leads to different decomposition results. A single decomposition is therefore not usually reliable for fMRI data analysis. Under this circumstance, several methods based on repeated decompositions with ICA (RDICA) were proposed to reveal the stability of ICA decomposition. Although RDICA has achieved satisfying results in validating the performance of ICA decomposition, it costs much computing time. To mitigate this problem, we propose a method, named ATGP-ICA, for fMRI data analysis. This method generates fixed initial values with the automatic target generation process (ATGP) instead of producing them randomly. We performed experimental tests on both hybrid data and fMRI data to show the effectiveness of the new method and compared the performance of traditional one-time decomposition with ICA (ODICA), RDICA, and ATGP-ICA. The proposed method not only eliminates the randomness of ICA decomposition but also saves much computing time compared to RDICA. Furthermore, receiver operating characteristic (ROC) power analysis showed better signal reconstruction performance for ATGP-ICA than for RDICA. Copyright © 2013 Elsevier Inc. All rights reserved.
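A minimal sketch of replacing random initialization with a fixed unmixing-matrix initialization in scikit-learn's FastICA; the data matrix is a random placeholder, and the ATGP step itself is not reproduced here, only the idea of a deterministic starting point.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Hypothetical fMRI-like data matrix: time points x voxels.
rng = np.random.default_rng(0)
X = rng.standard_normal((120, 4000))

n_components = 20
# A fixed (here deterministic) initial unmixing matrix replaces the usual
# random initialization, so repeated runs give identical decompositions.
w_init = np.eye(n_components)

ica = FastICA(n_components=n_components, w_init=w_init, max_iter=500, random_state=0)
sources = ica.fit_transform(X)        # temporal components
spatial_maps = ica.components_        # voxel-wise mixing patterns
```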
Rhizobium etli asparaginase II
Huerta-Saquero, Alejandro; Evangelista-Martínez, Zahaed; Moreno-Enriquez, Angélica; Perez-Rueda, Ernesto
2013-01-01
Bacterial l-asparaginase has been a universal component of therapies for childhood acute lymphoblastic leukemia since the 1970s. Two principal enzymes derived from Escherichia coli and Erwinia chrysanthemi are the only options clinically approved to date. We recently reported a study of recombinant l-asparaginase (AnsA) from Rhizobium etli and described an increasing type of AnsA family members. Sequence analysis revealed four conserved motifs with notable differences with respect to the conserved regions of amino acid sequences of type I and type II l-asparaginases, particularly in comparison with therapeutic enzymes from E. coli and E. chrysanthemi. These differences suggested a distinct immunological specificity. Here, we report an in silico analysis that revealed immunogenic determinants of AnsA. Also, we used an extensive approach to compare the crystal structures of E. coli and E. chrysanthemi asparaginases with a computational model of AnsA and identified immunogenic epitopes. A three-dimensional model of AnsA revealed, as expected based on sequence dissimilarities, completely different folding and different immunogenic epitopes. This approach could be very useful in transcending the problem of immunogenicity in two major ways: by chemical modifications of epitopes to reduce drug immunogenicity, and by site-directed mutagenesis of amino acid residues to diminish immunogenicity without reduction of enzymatic activity. PMID:22895060
NASA Astrophysics Data System (ADS)
Wang, Yongcui; Zhao, Weiling; Zhou, Xiaobo
2016-10-01
Accurate identification of coherent transcriptional modules (subnetworks) in adipose and muscle tissues is important for revealing the related mechanisms and co-regulated pathways involved in the development of aging-related diseases. Here, we proposed a systematic computational approach, called ICEGM, to Identify the Co-Expression Gene Modules through a novel mathematical framework of Higher-Order Generalized Singular Value Decomposition (HO-GSVD). ICEGM was applied to the adipose, heart, and skeletal muscle tissues of old and young female African green vervet monkeys. Genes associated with the development of inflammation, cardiovascular and skeletal disorder diseases, and cancer were revealed by ICEGM. Meanwhile, genes in the ICEGM modules were also enriched in adipocytes, smooth muscle cells, cardiac myocytes, and immune cells. Comprehensive disease annotation and canonical pathway analysis indicated that immune cells, adipocytes, cardiomyocytes, and smooth muscle cells played a synergistic role in cardiac and physical functions in the aged monkeys by regulating the biological processes associated with metabolism, inflammation, and atherosclerosis. In conclusion, ICEGM provides an efficient, systematic framework for decoding the co-expression gene modules in multiple tissues. Analysis of genes in the ICEGM modules yielded important insights into the cooperative role of multiple tissues in the development of diseases.
Rhizobium etli asparaginase II: an alternative for acute lymphoblastic leukemia (ALL) treatment.
Huerta-Saquero, Alejandro; Evangelista-Martínez, Zahaed; Moreno-Enriquez, Angélica; Perez-Rueda, Ernesto
2013-01-01
Bacterial L-asparaginase has been a universal component of therapies for childhood acute lymphoblastic leukemia since the 1970s. Two principal enzymes derived from Escherichia coli and Erwinia chrysanthemi are the only options clinically approved to date. We recently reported a study of recombinant L-asparaginase (AnsA) from Rhizobium etli and described an increasing type of AnsA family members. Sequence analysis revealed four conserved motifs with notable differences with respect to the conserved regions of amino acid sequences of type I and type II L-asparaginases, particularly in comparison with therapeutic enzymes from E. coli and E. chrysanthemi. These differences suggested a distinct immunological specificity. Here, we report an in silico analysis that revealed immunogenic determinants of AnsA. Also, we used an extensive approach to compare the crystal structures of E. coli and E. chrysanthemi asparaginases with a computational model of AnsA and identified immunogenic epitopes. A three-dimensional model of AnsA revealed, as expected based on sequence dissimilarities, completely different folding and different immunogenic epitopes. This approach could be very useful in transcending the problem of immunogenicity in two major ways: by chemical modifications of epitopes to reduce drug immunogenicity, and by site-directed mutagenesis of amino acid residues to diminish immunogenicity without reduction of enzymatic activity.
Vivek-Ananth, R P; Mohanraj, Karthikeyan; Vandanashree, Muralidharan; Jhingran, Anupam; Craig, James P; Samal, Areejit
2018-04-26
Aspergillus fumigatus and multiple other Aspergillus species cause a wide range of lung infections, collectively termed aspergillosis. Aspergilli are ubiquitous in the environment, and healthy immune systems routinely eliminate inhaled conidia; however, Aspergilli can become opportunistic pathogens in immunocompromised patients. The aspergillosis mortality rate and the emergence of drug resistance reveal an urgent need to identify novel targets. Secreted and cell membrane proteins play a critical role in fungal-host interactions and pathogenesis. Using a computational pipeline integrating data from high-throughput experiments and bioinformatic predictions, we have identified secreted and cell membrane proteins in ten Aspergillus species known to cause aspergillosis. Small secreted and effector-like proteins similar to agents of fungal-plant pathogenesis were also identified within each secretome. A comparison with humans revealed that at least 70% of Aspergillus secretomes have no sequence similarity with the human proteome. An analysis of antigenic qualities of Aspergillus proteins revealed that the secretome is significantly more antigenic than cell membrane proteins or the complete proteome. Finally, by overlaying an expression dataset, four A. fumigatus proteins upregulated during infection and with available structures were found to be structurally similar to known drug target proteins in other organisms and were able to dock in silico with the respective drugs.
Ionic liquid thermal stabilities: decomposition mechanisms and analysis tools.
Maton, Cedric; De Vos, Nils; Stevens, Christian V
2013-07-07
The increasing number of papers published on ionic liquids generates an extensive quantity of data. The thermal stability data of divergent ionic liquids are collected in this paper with attention to the experimental set-up. The influence and importance of the latter parameters are broadly addressed. Both ramped-temperature and isothermal thermogravimetric analysis are discussed, along with state-of-the-art methods such as TGA-MS and pyrolysis-GC. The strengths and weaknesses of the different methodologies known to date demonstrate that analysis methods should be in line with the application. The combination of data from advanced analysis methods allows us to obtain in-depth information on the degradation processes. Aided by computational methods, the kinetics and thermodynamics of thermal degradation are revealed piece by piece. A better understanding of the behaviour of ionic liquids at high temperature allows selective and application-driven design, as well as mathematical prediction for engineering purposes.
Determination of Diethyl Phthalate and Polyhexamethylene Guanidine in Surrogate Alcohol from Russia
Monakhova, Yulia B.; Kuballa, Thomas; Leitz, Jenny; Lachenmeier, Dirk W.
2011-01-01
Analytical methods based on spectroscopic techniques were developed and validated for the determination of diethyl phthalate (DEP) and polyhexamethylene guanidine (PHMG), which may occur in unrecorded alcohol. Analysis for PHMG was based on UV-VIS spectrophotometry after derivatization with Eosin Y and 1H NMR spectroscopy of the DMSO extract. Analysis of DEP was performed with direct UV-VIS and 1H NMR methods. Multivariate curve resolution and spectra computation methods were used to confirm the presence of PHMG and DEP in the investigated beverages. Of 22 analysed alcohol samples, two contained DEP or PHMG. 1H NMR analysis also revealed the presence of signals of hawthorn extract in three medicinal alcohols used as surrogate alcohol. The simple and cheap UV-VIS methods can be used for rapid screening of surrogate alcohol samples for impurities, while 1H NMR is recommended for specific confirmatory analysis if required. PMID:21647285
Determination of diethyl phthalate and polyhexamethylene guanidine in surrogate alcohol from Russia.
Monakhova, Yulia B; Kuballa, Thomas; Leitz, Jenny; Lachenmeier, Dirk W
2011-01-01
Analytical methods based on spectroscopic techniques were developed and validated for the determination of diethyl phthalate (DEP) and polyhexamethylene guanidine (PHMG), which may occur in unrecorded alcohol. Analysis for PHMG was based on UV-VIS spectrophotometry after derivatization with Eosin Y and (1)H NMR spectroscopy of the DMSO extract. Analysis of DEP was performed with direct UV-VIS and (1)H NMR methods. Multivariate curve resolution and spectra computation methods were used to confirm the presence of PHMG and DEP in the investigated beverages. Of 22 analysed alcohol samples, two contained DEP or PHMG. (1)H NMR analysis also revealed the presence of signals of hawthorn extract in three medicinal alcohols used as surrogate alcohol. The simple and cheap UV-VIS methods can be used for rapid screening of surrogate alcohol samples for impurities, while (1)H NMR is recommended for specific confirmatory analysis if required.
Prevost, Luanna B.; Smith, Michelle K.; Knight, Jennifer K.
2016-01-01
Previous work has shown that students have persistent difficulties in understanding how central dogma processes can be affected by a stop codon mutation. To explore these difficulties, we modified two multiple-choice questions from the Genetics Concept Assessment into three open-ended questions that asked students to write about how a stop codon mutation potentially impacts replication, transcription, and translation. We then used computer-assisted lexical analysis combined with human scoring to categorize student responses. The lexical analysis models showed high agreement with human scoring, demonstrating that this approach can be successfully used to analyze large numbers of student written responses. The results of this analysis show that students’ ideas about one process in the central dogma can affect their thinking about subsequent and previous processes, leading to mixed models of conceptual understanding. PMID:27909016
RNA-Seq Analysis Reveals MAPKKK Family Members Related to Drought Tolerance in Maize
Ren, Wen; Yang, Fengling; He, Hang; Zhao, Jiuran
2015-01-01
The mitogen-activated protein kinase (MAPK) cascade is an evolutionarily conserved signal transduction pathway that is involved in plant development and stress responses. As the first component of this phosphorelay cascade, mitogen-activated protein kinase kinase kinases (MAPKKKs) act as adaptors linking upstream signaling steps to the core MAPK cascade to promote the appropriate cellular responses; however, the functions of MAPKKKs in maize are unclear. Here, we identified 71 MAPKKK genes, of which 14 were novel, based on a computational analysis of the maize (Zea mays L.) genome. Using an RNA-seq analysis in the leaf, stem and root of maize under well-watered and drought-stress conditions, we identified 5,866 differentially expressed genes (DEGs), including 8 MAPKKK genes responsive to drought stress. Many of the DEGs were enriched in processes such as drought stress, abiotic stimulus, oxidation-reduction, and metabolic processes. Conversely, DEGs involved in processes such as oxidation, photosynthesis, and starch, proline, ethylene, and salicylic acid metabolism were clearly co-expressed with the MAPKKK genes. Furthermore, a quantitative real-time PCR (qRT-PCR) analysis was performed to assess the relative expression levels of MAPKKKs. Correlation analysis revealed a significant correlation between the expression levels of two MAPKKKs and relative biomass response to drought in 8 inbred lines. Our results indicate that MAPKKKs may have important regulatory functions in drought tolerance in maize. PMID:26599013
Computer vs. Workbook Instruction in Second Language Acquisition.
ERIC Educational Resources Information Center
Nagata, Noriko
1996-01-01
Compares the effectiveness of Nihongo-CALI (Japanese Computer Assisted Language Instruction) with non-CALI workbook instruction. Findings reveal that given the same grammar notes and exercises, ongoing intelligent computer feedback is more effective than simple workbook answer sheets for developing learners' grammatical skill in producing Japanese…
VoxelStats: A MATLAB Package for Multi-Modal Voxel-Wise Brain Image Analysis.
Mathotaarachchi, Sulantha; Wang, Seqian; Shin, Monica; Pascoal, Tharick A; Benedet, Andrea L; Kang, Min Su; Beaudry, Thomas; Fonov, Vladimir S; Gauthier, Serge; Labbe, Aurélie; Rosa-Neto, Pedro
2016-01-01
In healthy individuals, behavioral outcomes are highly associated with the variability on brain regional structure or neurochemical phenotypes. Similarly, in the context of neurodegenerative conditions, neuroimaging reveals that cognitive decline is linked to the magnitude of atrophy, neurochemical declines, or concentrations of abnormal protein aggregates across brain regions. However, modeling the effects of multiple regional abnormalities as determinants of cognitive decline at the voxel level remains largely unexplored by multimodal imaging research, given the high computational cost of estimating regression models for every single voxel from various imaging modalities. VoxelStats is a voxel-wise computational framework to overcome these computational limitations and to perform statistical operations on multiple scalar variables and imaging modalities at the voxel level. VoxelStats package has been developed in Matlab(®) and supports imaging formats such as Nifti-1, ANALYZE, and MINC v2. Prebuilt functions in VoxelStats enable the user to perform voxel-wise general and generalized linear models and mixed effect models with multiple volumetric covariates. Importantly, VoxelStats can recognize scalar values or image volumes as response variables and can accommodate volumetric statistical covariates as well as their interaction effects with other variables. Furthermore, this package includes built-in functionality to perform voxel-wise receiver operating characteristic analysis and paired and unpaired group contrast analysis. Validation of VoxelStats was conducted by comparing the linear regression functionality with existing toolboxes such as glim_image and RMINC. The validation results were identical to existing methods and the additional functionality was demonstrated by generating feature case assessments (t-statistics, odds ratio, and true positive rate maps). In summary, VoxelStats expands the current methods for multimodal imaging analysis by allowing the estimation of advanced regional association metrics at the voxel level.
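A minimal sketch of the underlying voxel-wise GLM idea only; VoxelStats itself is a MATLAB package, so this is not its API, just a vectorized fit of one shared design matrix at every voxel with invented data shapes.

```python
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_voxels = 40, 10_000
Y = rng.standard_normal((n_subjects, n_voxels))        # image volumes as responses
age = rng.uniform(55.0, 85.0, n_subjects)
group = rng.integers(0, 2, n_subjects)                  # 0 = control, 1 = patient

X = np.column_stack([np.ones(n_subjects), age, group])  # shared design matrix
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)            # (3, n_voxels) coefficients

resid = Y - X @ beta
dof = n_subjects - X.shape[1]
sigma2 = (resid ** 2).sum(axis=0) / dof
covX = np.linalg.inv(X.T @ X)
t_group = beta[2] / np.sqrt(sigma2 * covX[2, 2])         # voxel-wise t-map for group
```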
Personality Types and Affinity for Computers
1991-03-01
This study examined differences on personality dimensions between the respondents and explored the relationship between these differences and computer affinity. The results revealed no significant differences relating personality type to this measure of computer affinity.
Teaching of Real Numbers by Using the Archimedes-Cantor Approach and Computer Algebra Systems
ERIC Educational Resources Information Center
Vorob'ev, Evgenii M.
2015-01-01
Computer technologies and especially computer algebra systems (CAS) allow students to overcome some of the difficulties they encounter in the study of real numbers. The teaching of calculus can be considerably more effective with the use of CAS provided the didactics of the discipline makes it possible to reveal the full computational potential of…
Information-Theoretical Complexity Analysis of Selected Elementary Chemical Reactions
NASA Astrophysics Data System (ADS)
Molina-Espíritu, M.; Esquivel, R. O.; Dehesa, J. S.
We investigate the complexity of selected elementary chemical reactions (namely, the hydrogenic-abstraction reaction and the identity SN2 exchange reaction) by means of the following single and composite information-theoretic measures: disequilibrium (D), exponential entropy (L), Fisher information (I), power entropy (J), I-D, D-L and I-J planes, and Fisher-Shannon (FS) and Lopez-Mancini-Calbet (LMC) shape complexities. These quantities, which are functionals of the one-particle density, are computed in both position (r) and momentum (p) spaces. The analysis revealed that the chemically significant regions of these reactions can be identified through most of the single information-theoretic measures and the two-component planes, not only the ones commonly revealed by the energy, such as the reactant/product (R/P) and the transition state (TS), but also those that are not present in the energy profile, such as the bond cleavage energy region (BCER), the bond breaking/forming regions (B-B/F) and the charge transfer process (CT). The analysis of the complexities shows that the energy profile of the abstraction reaction bears the same information-theoretical features as the LMC and FS measures, whereas the identity SN2 exchange reaction does not show such simple behavior with respect to the LMC and FS measures. Most of the chemical features of interest (BCER, B-B/F and CT) are only revealed when particular information-theoretic aspects of localizability (L or J), uniformity (D) and disorder (I) are considered.
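For concreteness, here is a minimal sketch of how the disequilibrium, exponential entropy, and LMC shape complexity can be computed for a normalized one-dimensional density on a grid; the Gaussian density is a stand-in for the molecular one-particle densities used in the study.

```python
import numpy as np

# Discretized 1-D density (standard Gaussian) on a grid.
x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]
rho = np.exp(-x**2 / 2.0) / np.sqrt(2.0 * np.pi)

D = np.sum(rho**2) * dx                 # disequilibrium
S = -np.sum(rho * np.log(rho)) * dx     # Shannon entropy
L = np.exp(S)                           # exponential entropy
C_LMC = D * L                           # LMC shape complexity (~1.17 for a Gaussian)

print(D, L, C_LMC)
```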
A detailed spectroscopic study on the interaction of Rhodamine 6G with human hemoglobin.
Mandal, Paulami; Bardhan, Munmun; Ganguly, Tapan
2010-05-03
UV-vis, time-resolved fluorescence and circular dichroism spectroscopic investigations were made to reveal the nature of the interactions between the xanthene dye Rhodamine 6G and the well-known protein hemoglobin. Analysis of the steady-state and time-resolved fluorescence quenching of Rhodamine 6G in aqueous solutions in the presence of hemoglobin revealed that the quenching is static in nature. The primary binding pattern between Rhodamine and hemoglobin has been interpreted as a combined effect of hydrophobic association and electrostatic interaction. The binding constants, number of binding sites and thermodynamic parameters at various pH values of the environment have been computed. The average binding distance between the energy donor Rhodamine and the acceptor hemoglobin has been determined from Förster's theory. Copyright 2010 Elsevier B.V. All rights reserved.
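A minimal sketch of the standard quenching analysis that yields such binding parameters, assuming a Stern-Volmer plot for the quenching constant and a double-logarithmic plot for the binding constant and number of sites; all concentrations and intensities below are invented placeholders.

```python
import numpy as np

# Hypothetical quenching data: quencher concentration [Q] (M) and the
# fluorescence of the dye without (F0) and with quencher (F).
Q  = np.array([1e-6, 2e-6, 4e-6, 6e-6, 8e-6])
F0 = 100.0
F  = np.array([91.0, 83.5, 71.0, 62.0, 55.0])

# Stern-Volmer: F0/F = 1 + Ksv*[Q]
Ksv = np.polyfit(Q, F0 / F - 1.0, 1)[0]

# Double-logarithmic plot: log((F0-F)/F) = log(Kb) + n*log([Q])
slope, intercept = np.polyfit(np.log10(Q), np.log10((F0 - F) / F), 1)
n_sites, Kb = slope, 10.0 ** intercept

print(f"Ksv = {Ksv:.3e} 1/M, n = {n_sites:.2f}, Kb = {Kb:.3e} 1/M")
```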
Dimension-dependent stimulated radiative interaction of a single electron quantum wavepacket
NASA Astrophysics Data System (ADS)
Gover, Avraham; Pan, Yiming
2018-06-01
In the foundations of quantum mechanics, the spatial dimensions of an electron wavepacket are understood only in terms of an expectation value - the probability distribution of the particle location. One can still inquire how the quantum electron wavepacket size affects a physical process. Here we address the fundamental physics problem of particle-wave duality and the measurability of a free-electron quantum wavepacket. Our analysis of the stimulated radiative interaction of an electron wavepacket, accompanied by numerical computations, reveals two limits. In the quantum regime of long wavepacket size relative to the radiation wavelength, one obtains only quantum-recoil multiphoton sidebands in the electron energy spectrum. In the opposite regime, the wavepacket interaction approaches the limit of classical point-particle acceleration. The wavepacket features can be revealed in experiments carried out in the intermediate regime of wavepacket size commensurate with the radiation wavelength.
Birgin, Osman; Catlioğlu, Hakan; Gürbüz, Ramazan; Aydin, Serhat
2010-10-01
This study aimed to investigate the experiences of pre-service mathematics (PSM) teachers with computers and their attitudes toward them. The Computer Attitude Scale, Computer Competency Survey, and Computer Use Information Form were administered to 180 Turkish PSM teachers. Results revealed that most PSM teachers used computers at home and at Internet cafes, and that their competency was generally intermediate and upper level. The study concludes that PSM teachers' attitudes about computers differ according to their years of study, computer ownership, level of computer competency, frequency of computer use, computer experience, and whether they had attended a computer-aided instruction course. However, computer attitudes were not affected by gender.
Computer-Assisted Spanish-Composition Survey--1986.
ERIC Educational Resources Information Center
Harvey, T. Edward
1986-01-01
A survey of high school and higher education teachers' (N=208) attitudes regarding the use of computers for Spanish-composition instruction revealed that: the lack of foreign-character support remains the major frustration; most teachers used Apple or IBM computers; and there was mixed opinion regarding the real versus the expected benefits of…
On the Rhetorical Contract in Human-Computer Interaction.
ERIC Educational Resources Information Center
Wenger, Michael J.
1991-01-01
An exploration of the rhetorical contract--i.e., the expectations for appropriate interaction--as it develops in human-computer interaction revealed that direct manipulation interfaces were more likely to establish social expectations. Study results suggest that the social nature of human-computer interactions can be examined with reference to the…
ERIC Educational Resources Information Center
Dikli, Semire
2006-01-01
The impacts of computers on writing have been widely studied for three decades. Even basic computers functions, i.e. word processing, have been of great assistance to writers in modifying their essays. The research on Automated Essay Scoring (AES) has revealed that computers have the capacity to function as a more effective cognitive tool (Attali,…
Gender Digital Divide and Challenges in Undergraduate Computer Science Programs
ERIC Educational Resources Information Center
Stoilescu, Dorian; McDougall, Douglas
2011-01-01
Previous research revealed a reduced number of female students registered in computer science studies. In addition, the female students feel isolated, have reduced confidence, and underperform. This article explores differences between female and male students in undergraduate computer science programs in a mid-size university in Ontario. Based on…
Using Flash Technology for Motivation and Assessment
ERIC Educational Resources Information Center
Deal, Walter F., III
2004-01-01
A visit to most any technology education laboratory or classroom will reveal that computers, software, and multimedia software are rapidly becoming a mainstay in learning about technology and technological literacy. Almost all technology labs have at least several computers dedicated to specialized software or hardware such as Computer-aided…
How Do Students Experience Testing on the University Computer?
ERIC Educational Resources Information Center
Whittington, Dale; And Others
1995-01-01
Reports a study of the administration mode, scores, and testing experiences of students taking the PreProfessional Skills Test (PPST) under differing conditions (computer based and paper and pencil). PPST scores and surveys of the students revealed varied test-taking strategies and computer-related alterations in test difficulty, construct,…
Anthony Johnson, A M; Borah, B K; Sai Gopal, D V R; Dasgupta, I
2012-12-01
Citrus yellow mosaic badnavirus (CMBV), a member of the family Caulimoviridae, genus Badnavirus, is the causative agent of mosaic disease among Citrus species in southern India. Despite its reported prevalence in several citrus species, complete functional information on full-length genomes of the CMBV isolates infecting these species is not available in publicly accessible databases. CMBV isolates from Rough Lemon and Sweet Orange collected from a nursery were cloned and sequenced. The analysis revealed high sequence homology of the two CMBV isolates with previously reported CMBV sequences, implying that they represent new variants. Based on computational analysis of the predicted secondary structures, the possible functions of some CMBV proteins have been analyzed.
Stadlmann, Johannes; Hoi, David M; Taubenschmid, Jasmin; Mechtler, Karl; Penninger, Josef M
2018-05-18
SugarQb (www.imba.oeaw.ac.at/sugarqb) is a freely available collection of computational tools for the automated identification of intact glycopeptides from high-resolution HCD MS/MS data-sets in the Proteome Discoverer environment. We report the migration of SugarQb to the latest and free version of Proteome Discoverer 2.1, and apply it to the analysis of PNGase F-resistant N-glycopeptides from mouse embryonic stem cells. The analysis of intact glycopeptides highlights unexpected technical limitations to PNGase F-dependent glycoproteomic workflows at the proteome level, and warrants a critical re-interpretation of seminal data-sets in the context of N-glycosylation-site prediction. This article is protected by copyright. All rights reserved.
A Bayesian Multinomial Probit Model for the Analysis of Panel Choice Data.
Fong, Duncan K H; Kim, Sunghoon; Chen, Zhe; DeSarbo, Wayne S
2016-03-01
A new Bayesian multinomial probit model is proposed for the analysis of panel choice data. Using a parameter expansion technique, we are able to devise a Markov Chain Monte Carlo algorithm to compute our Bayesian estimates efficiently. We also show that the proposed procedure enables the estimation of individual level coefficients for the single-period multinomial probit model even when the available prior information is vague. We apply our new procedure to consumer purchase data and reanalyze a well-known scanner panel dataset that reveals new substantive insights. In addition, we delineate a number of advantageous features of our proposed procedure over several benchmark models. Finally, through a simulation analysis employing a fractional factorial design, we demonstrate that the results from our proposed model are quite robust with respect to differing factors across various conditions.
KIM, Jaehwan; EOM, Kidong; YOON, Hakyoung
2017-01-01
A 14-year-old dog weighing 4 kg presented with hypotension only in the right forelimb. Thoracic radiography revealed a round soft tissue opacity near the aortic arch and below the second thoracic vertebra on a lateral view. Three-dimensional computed tomography angiography clearly revealed stenosis and aneurysmal dilation of an aberrant right subclavian artery. Stenosis and aneurysm of an aberrant subclavian artery should be included as a differential diagnosis in dogs showing a round soft tissue opacity near the aortic arch and below the thoracic vertebra on the lateral thoracic radiograph. PMID:28496026
Kim, Jaehwan; Eom, Kidong; Yoon, Hakyoung
2017-06-16
A 14-year-old dog weighing 4 kg presented with hypotension only in the right forelimb. Thoracic radiography revealed a round soft tissue opacity near the aortic arch and below the second thoracic vertebra on a lateral view. Three-dimensional computed tomography angiography clearly revealed stenosis and aneurysmal dilation of an aberrant right subclavian artery. Stenosis and aneurysm of an aberrant subclavian artery should be included as a differential diagnosis in dogs showing a round soft tissue opacity near the aortic arch and below the thoracic vertebra on the lateral thoracic radiograph.
Evolution of structure and reactivity in a series of iconic carbenes.
Zhang, Min; Moss, Robert A; Thompson, Jack; Krogh-Jespersen, Karsten
2012-01-20
We present experimental activation parameters for the reactions of six carbenes (CCl(2), CClF, CF(2), ClCOMe, FCOMe, and (MeO)(2)C) with six alkenes (tetramethylethylene, cyclohexene, 1-hexene, methyl acrylate, acrylonitrile, and α-chloroacrylonitrile). Activation energies range from -1 kcal/mol for the addition of CCl(2) to tetramethylethylene to 11 kcal/mol for the addition of FCOMe to acrylonitrile. A generally satisfactory analysis of major trends in the evolution of carbenic structure and reactivity is afforded by qualitative applications of frontier molecular orbital theory, although the observed entropies of activation appear to fall in a counterintuitive pattern. An analysis of computed cyclopropanation transition state parameters reveals significant nucleophilic selectivity of (MeO)(2)C toward α-chloroacrylonitrile.
Dąbrowska, Monika; Starek, Małgorzata; Komsta, Łukasz; Szafrański, Przemysław; Stasiewicz-Urban, Anna; Opoka, Włodzimierz
2017-04-01
The retention behaviors were investigated for a series of eight cephalosporins in thin-layer chromatography (TLC) using stationary phases of RP-2, RP-8, RP-18, NH2, DIOL, and CN chemically bonded silica gel. Additionally, various binary mobile phases (water/methanol and water/acetone) were used in different volume proportions. The retention behavior of the analyzed molecules was defined by the RM0 constant. In addition, reversed-phase high performance liquid chromatography (RP-HPLC) was performed in lipophilicity studies using an immobilized artificial membrane (IAM) stationary phase. The obtained chromatographic data (RM0 and log k'IAM) were correlated with lipophilicity, expressed as values of the calculated (logPcalc) and experimental (logPexp(shake-flask)) partition coefficients. Principal component analysis (PCA) was applied in order to obtain an overview of similarity or dissimilarity among the analyzed compounds. Hierarchical cluster analysis (HCA) was performed to compare the separation characteristics of the applied stationary phases. This study was undertaken to identify the best chromatographic system and chromatographic data processing method to enable the prediction of logP values. A comprehensive chromatographic investigation into the retention of the analyzed cephalosporins revealed a similar behavior on RP-18, RP-8 and CN stationary phases. The weak correlations obtained between experimental and certain computed lipophilicity indices revealed that RM0 and PC1/RM are relevant lipophilicity parameters and that the RP-8, CN and RP-18 plates are appropriate stationary phases for lipophilicity investigation, whereas computational approaches still cannot fully replace experimentation. Copyright © 2017 Elsevier B.V. All rights reserved.
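A minimal sketch of how RM0 is typically obtained by linear extrapolation of RM values to zero organic modifier; the Rf values and methanol fractions below are invented for illustration.

```python
import numpy as np

# Hypothetical TLC data for one cephalosporin on an RP-18 plate:
# volume fraction of methanol in the mobile phase and the measured Rf values.
methanol_fraction = np.array([0.4, 0.5, 0.6, 0.7, 0.8])
Rf = np.array([0.22, 0.31, 0.42, 0.53, 0.64])

RM = np.log10(1.0 / Rf - 1.0)            # RM = log(1/Rf - 1)

# Linear model RM = RM0 + b * C; the intercept RM0 (extrapolation to pure
# water) serves as the chromatographic lipophilicity descriptor.
b, RM0 = np.polyfit(methanol_fraction, RM, 1)
print(f"RM0 = {RM0:.2f}, slope b = {b:.2f}")
```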
A two-step sensitivity analysis for hydrological signatures in Jinhua River Basin, East China
NASA Astrophysics Data System (ADS)
Pan, S.; Fu, G.; Chiang, Y. M.; Xu, Y. P.
2016-12-01
Owing to model complexity and the large number of parameters, calibration and sensitivity analysis are difficult processes for distributed hydrological models. In this study, a two-step sensitivity analysis approach is proposed for analyzing the hydrological signatures in Jinhua River Basin, East China, using the Distributed Hydrology-Soil-Vegetation Model (DHSVM). A rough sensitivity analysis is first conducted to obtain preliminary influential parameters via analysis of variance, reducing the number of parameters from eighty-three to sixteen. Afterwards, the sixteen parameters are further analyzed with a variance-based global sensitivity analysis, i.e., Sobol's method, to achieve robust sensitivity rankings and parameter contributions. Parallel computing is applied to reduce the computational burden of the variance-based sensitivity analysis. The results reveal that only a few model parameters are significantly sensitive, including the rain LAI multiplier, lateral conductivity, porosity, field capacity, wilting point of clay loam, understory monthly LAI, understory minimum resistance, and root zone depths of croplands. Finally, several hydrological signatures are used to investigate the performance of DHSVM. Results show that high values of the efficiency criteria did not necessarily indicate excellent performance for the hydrological signatures. For most samples from Sobol's sensitivity analysis, water yield was simulated very well, but the lowest and maximum annual daily runoffs were underestimated, and most seven-day minimum runoffs were overestimated. Nevertheless, good performance of these three signatures still occurs in a number of samples. Analysis of peak flow shows that small and medium floods are simulated well, while large floods are slightly underestimated. This work supports further multi-objective calibration of the DHSVM model and indicates where to improve the reliability and credibility of the model simulation.
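A minimal sketch of the variance-based step using the SALib package, with a cheap surrogate function standing in for a DHSVM run; the parameter names, bounds, and surrogate are illustrative assumptions only.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["lateral_conductivity", "porosity", "field_capacity"],
    "bounds": [[1e-5, 1e-2], [0.3, 0.6], [0.05, 0.35]],
}

X = saltelli.sample(problem, 1024)                  # Saltelli sample of parameter space

def surrogate_model(params):
    k, phi, fc = params
    return np.log10(k) + 2.0 * phi - fc             # stand-in for a hydrological signature

Y = np.apply_along_axis(surrogate_model, 1, X)
Si = sobol.analyze(problem, Y)
print(Si["S1"], Si["ST"])                           # first-order and total-order indices
```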
Pan-cancer Alterations of the MYC Oncogene and Its Proximal Network across the Cancer Genome Atlas.
Schaub, Franz X; Dhankani, Varsha; Berger, Ashton C; Trivedi, Mihir; Richardson, Anne B; Shaw, Reid; Zhao, Wei; Zhang, Xiaoyang; Ventura, Andrea; Liu, Yuexin; Ayer, Donald E; Hurlin, Peter J; Cherniack, Andrew D; Eisenman, Robert N; Bernard, Brady; Grandori, Carla
2018-03-28
Although the MYC oncogene has been implicated in cancer, a systematic assessment of alterations of MYC, related transcription factors, and co-regulatory proteins, forming the proximal MYC network (PMN), across human cancers is lacking. Using computational approaches, we define genomic and proteomic features associated with MYC and the PMN across the 33 cancers of The Cancer Genome Atlas. Pan-cancer, 28% of all samples had at least one of the MYC paralogs amplified. In contrast, the MYC antagonists MGA and MNT were the most frequently mutated or deleted members, suggesting a role as tumor suppressors. MYC alterations were mutually exclusive with PIK3CA, PTEN, APC, or BRAF alterations, suggesting that MYC is a distinct oncogenic driver. Expression analysis revealed MYC-associated pathways in tumor subtypes, such as immune response and growth factor signaling; chromatin, translation, and DNA replication/repair were conserved pan-cancer. This analysis reveals insights into MYC biology and is a reference for biomarkers and therapeutics for cancers with alterations of MYC or the PMN. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
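A minimal sketch of one way such mutual exclusivity can be tested, using a 2x2 contingency table and a one-sided Fisher's exact test; the alteration vectors below are simulated placeholders, not TCGA calls.

```python
import numpy as np
from scipy.stats import fisher_exact

rng = np.random.default_rng(0)
myc_amp = rng.random(500) < 0.28        # hypothetical MYC-amplified samples
pik3ca_alt = rng.random(500) < 0.15     # hypothetical PIK3CA-altered samples

table = np.array([
    [np.sum(myc_amp & pik3ca_alt),  np.sum(myc_amp & ~pik3ca_alt)],
    [np.sum(~myc_amp & pik3ca_alt), np.sum(~myc_amp & ~pik3ca_alt)],
])
# Odds ratio < 1 with a small one-sided p-value suggests mutual exclusivity.
odds_ratio, p_value = fisher_exact(table, alternative="less")
print(odds_ratio, p_value)
```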
Spatial Skill Profile of Mathematics Pre-Service Teachers
NASA Astrophysics Data System (ADS)
Putri, R. O. E.
2018-01-01
This study aimed to investigate the spatial intelligence of mathematics pre-service teachers and find the instructional strategy that best facilitates this aspect. Data were collected from 35 mathematics pre-service teachers. The Purdue Spatial Visualization Test (PSVT) was used to identify the spatial skill of mathematics pre-service teachers. Statistical analysis indicates that more than 50% of the participants possessed spatial skill at an intermediate level, whereas the others were at high or low levels of spatial skill. The results also show a positive correlation between spatial skill and mathematics ability, especially in geometrical problem solving. Students with high spatial skill tend to have better mathematical performance compared to those at the other two levels. Furthermore, qualitative analysis reveals that most students have difficulty in manipulating geometrical objects mentally. This problem mostly appears in students with intermediate and low levels of spatial skill. The observations revealed that using 3-D geometrical figures is the best method to overcome the mental manipulation problem and develop spatial visualization. Computer applications can also be used to improve students' spatial skill.
Data integration aids understanding of butterfly-host plant networks
NASA Astrophysics Data System (ADS)
Muto-Fujita, Ai; Takemoto, Kazuhiro; Kanaya, Shigehiko; Nakazato, Takeru; Tokimatsu, Toshiaki; Matsumoto, Natsushi; Kono, Mayo; Chubachi, Yuko; Ozaki, Katsuhisa; Kotera, Masaaki
2017-03-01
Although host-plant selection is a central topic in ecology, its general underpinnings are poorly understood. Here, we performed a case study focusing on the publicly available data on Japanese butterflies. A combined statistical analysis of plant-herbivore relationships and taxonomy revealed that some butterfly subfamilies in different families feed on the same plant families, and that this occurs more often than expected by chance, indicating the independent acquisition of adaptive phenotypes to the same hosts. We consequently integrated plant-herbivore and plant-compound relationship data and conducted a statistical analysis to identify compounds unique to host plants of specific butterfly families. Some of the identified plant compounds are known to attract certain butterfly groups while repelling others. The additional incorporation of insect-compound relationship data revealed potential metabolic processes that are related to host plant selection. Our results demonstrate that data integration enables the computational detection of compounds putatively involved in particular interspecies interactions and that further data enrichment and integration of genomic and transcriptomic data facilitates the unveiling of the molecular mechanisms involved in host plant selection.
Data integration aids understanding of butterfly–host plant networks
Muto-Fujita, Ai; Takemoto, Kazuhiro; Kanaya, Shigehiko; Nakazato, Takeru; Tokimatsu, Toshiaki; Matsumoto, Natsushi; Kono, Mayo; Chubachi, Yuko; Ozaki, Katsuhisa; Kotera, Masaaki
2017-01-01
Although host-plant selection is a central topic in ecology, its general underpinnings are poorly understood. Here, we performed a case study focusing on the publicly available data on Japanese butterflies. A combined statistical analysis of plant–herbivore relationships and taxonomy revealed that some butterfly subfamilies in different families feed on the same plant families, and that this occurs more often than expected by chance, indicating the independent acquisition of adaptive phenotypes to the same hosts. We consequently integrated plant–herbivore and plant–compound relationship data and conducted a statistical analysis to identify compounds unique to host plants of specific butterfly families. Some of the identified plant compounds are known to attract certain butterfly groups while repelling others. The additional incorporation of insect–compound relationship data revealed potential metabolic processes that are related to host plant selection. Our results demonstrate that data integration enables the computational detection of compounds putatively involved in particular interspecies interactions and that further data enrichment and integration of genomic and transcriptomic data facilitates the unveiling of the molecular mechanisms involved in host plant selection. PMID:28262809
Efficient universal blind quantum computation.
Giovannetti, Vittorio; Maccone, Lorenzo; Morimae, Tomoyuki; Rudolph, Terry G
2013-12-06
We give a cheat sensitive protocol for blind universal quantum computation that is efficient in terms of computational and communication resources: it allows one party to perform an arbitrary computation on a second party's quantum computer without revealing either which computation is performed, or its input and output. The first party's computational capabilities can be extremely limited: she must only be able to create and measure single-qubit superposition states. The second party is not required to use measurement-based quantum computation. The protocol requires the (optimal) exchange of O(Jlog2(N)) single-qubit states, where J is the computational depth and N is the number of qubits needed for the computation.
Yun, Jian; Shang, Song-Chao; Wei, Xiao-Dan; Liu, Shuang; Li, Zhi-Jie
2016-01-01
Language is characterized by both ecological and social properties, and competition is the basic form of language evolution. The rise and decline of one language is a result of competition between languages, and this rise and decline directly influences the diversity of human culture. Mathematical and computer modeling of language competition has been a popular topic in linguistics, mathematics, computer science, ecology, and other disciplines. Currently, there are several problems in research on language competition modeling. First, comprehensive mathematical analysis is absent in most studies of language competition models. Second, most language competition models are based on the assumption that one language in the model is stronger than the other; these studies tend to ignore cases where there is a balance of power in the competition. The competition between two well-matched languages is more practical, because it can facilitate the co-development of two languages. Third, many studies yield an evolutionary outcome in which the weaker language inevitably goes extinct. From the integrated point of view of ecology and sociology, this paper improves the Lotka-Volterra model and the basic reaction-diffusion model to propose an "ecology-society" computational model for describing language competition. Furthermore, a rigorous and comprehensive mathematical analysis of the stability of the equilibria was carried out. Two languages in competition may be either well-matched or greatly different in strength, which was reflected in the experimental design. The results revealed that language coexistence, and even co-development, are likely to occur during language competition.
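A minimal sketch of the competitive Lotka-Volterra backbone that such a model builds on (not the paper's improved "ecology-society" model); parameter values are illustrative and chosen so that two well-matched languages coexist.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Two well-matched languages x and y competing for speakers.
r1, r2 = 0.05, 0.05          # growth rates
K1, K2 = 1.0, 1.0            # carrying capacities (fraction of population)
a12, a21 = 0.6, 0.6          # symmetric competition coefficients (< 1 => coexistence)

def rhs(t, z):
    x, y = z
    dx = r1 * x * (1.0 - (x + a12 * y) / K1)
    dy = r2 * y * (1.0 - (y + a21 * x) / K2)
    return [dx, dy]

sol = solve_ivp(rhs, (0.0, 500.0), [0.4, 0.3], dense_output=True)
x_final, y_final = sol.y[:, -1]
print(x_final, y_final)      # both languages settle at a coexistence equilibrium
```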
Decaestecker, C; Salmon, I; Camby, I; Dewitte, O; Pasteels, J L; Brotchi, J; Van Ham, P; Kiss, R
1995-05-01
The present work investigates whether computer-assisted techniques can contribute any significant information to the characterization of astrocytic tumor aggressiveness. Two complementary computer-assisted methods were used. The first made use of digital image analysis of Feulgen-stained nuclei, making it possible to compute 15 morphonuclear and 8 nuclear DNA content-related (ploidy level) parameters. The second, the decision tree technique, which belongs to the family of supervised learning algorithms, enabled the most discriminatory parameters to be determined. These two techniques were applied to a series of 250 supratentorial astrocytic tumors of the adult. This series included 39 low-grade (astrocytomas, AST) and 211 high-grade (47 anaplastic astrocytomas, ANA, and 164 glioblastomas, GBM) astrocytic tumors. The results show that some AST, ANA and GBM did not fit within simple logical rules. These cases were labeled NC-AST, NC-ANA and NC-GBM because they were "non-classical" (NC) with respect to their cytological features. An analysis of survival data revealed that patients with NC-GBM had the same survival period as patients with GBM. In sharp contrast, patients with ANA survived significantly longer than patients with NC-ANA. In fact, patients with ANA had the same survival period as patients who died from AST, while patients with NC-ANA had a survival period similar to those with GBM. All these data show that the computer-assisted techniques used in this study can provide the pathologist with significant information on the characterization of astrocytic tumor aggressiveness.
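A minimal sketch of the supervised-learning step: a decision tree trained on morphonuclear and ploidy parameters to separate tumor grades, with the learned rules exposing the most discriminatory parameters. Feature values and labels below are simulated placeholders, not the study's measurements.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n_cases = 250
features = rng.standard_normal((n_cases, 23))          # 15 morphonuclear + 8 ploidy parameters
labels = rng.choice(["AST", "ANA", "GBM"], size=n_cases)

tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(features, labels)

# The printed rules show which parameters drive the splits.
print(export_text(tree, feature_names=[f"param_{i}" for i in range(23)]))
```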
Can macrocirculation changes predict nonhealing diabetic foot ulcers?
Lee, Ye-Na; Kim, Hyon-Surk; Kang, Jeong-A; Han, Seung-Kyu
2014-01-01
Transcutaneous partial oxygen tension (TcpO2) is considered the gold standard for assessment of tissue oxygenation, which is an essential factor for wound healing. The purpose of this study was to evaluate the association between macrocirculation and TcpO2 in persons with diabetes mellitus. Ninety-eight patients with diabetic foot ulcers participated in the study (61 men and 37 women). The subjects had a mean age of 66.6 years (range, 30-83 years) and were treated at the Diabetic Wound Center of Korea University Guro Hospital, Seoul, Republic of Korea. Macrocirculation was evaluated using 2 techniques: computed tomographic angiography and Doppler ultrasound. Macrocirculation scores were based on the patency of the two tibial arteries in 98 patients. Computed tomographic angiography and Doppler ultrasound scores (0-4 points) were given according to intraluminal filling defects and the arterial pulse waveform of each vessel, respectively. Tissue oxygenation was measured by TcpO2. Macrocirculation scores were statistically analyzed as a function of TcpO2. Statistical analysis revealed no significant linear trend between macrocirculation status and TcpO2. Bivariate analysis using the Fisher exact test, Mantel-Haenszel tests, and McNemar-Bowker tests also found no significant relationship between macrocirculation and TcpO2. Computed tomographic angiography and Doppler ultrasound are not sufficiently reliable substitutes for TcpO2 measurements in regard to determining the optimal treatment for diabetic patients.
Dopamine Receptor-Specific Contributions to the Computation of Value.
Burke, Christopher J; Soutschek, Alexander; Weber, Susanna; Raja Beharelle, Anjali; Fehr, Ernst; Haker, Helene; Tobler, Philippe N
2018-05-01
Dopamine is thought to play a crucial role in value-based decision making. However, the specific contributions of different dopamine receptor subtypes to the computation of subjective value remain unknown. Here we demonstrate how the balance between D1 and D2 dopamine receptor subtypes shapes subjective value computation during risky decision making. We administered the D2 receptor antagonist amisulpride or placebo before participants made choices between risky options. Compared with placebo, D2 receptor blockade resulted in more frequent choice of higher risk and higher expected value options. Using a novel model fitting procedure, we concurrently estimated the three parameters that define individual risk attitude according to an influential theoretical account of risky decision making (prospect theory). This analysis revealed that the observed reduction in risk aversion under amisulpride was driven by increased sensitivity to reward magnitude and decreased distortion of outcome probability, resulting in more linear value coding. Our data suggest that different components that govern individual risk attitude are under dopaminergic control, such that D2 receptor blockade facilitates risk taking and expected value processing.
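A minimal sketch of prospect-theory valuation is given below; the specific parameterization (a power value function, a one-parameter probability-weighting function, and a softmax choice rule) is a common textbook form and is assumed here, since the abstract does not state the exact model.

```python
# Sketch of prospect-theory valuation of a risky option (gains only) and an
# illustrative softmax choice rule; alpha, gamma, and temperature are assumptions.
import numpy as np

def subjective_value(x, p, alpha=0.8, gamma=0.6):
    v = x ** alpha                                              # value function
    w = p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)   # probability weighting
    return w * v

def p_choose_risky(x_risky, p_risky, x_safe, temperature=1.0, **kw):
    dv = subjective_value(x_risky, p_risky, **kw) - subjective_value(x_safe, 1.0, **kw)
    return 1.0 / (1.0 + np.exp(-dv / temperature))

print(p_choose_risky(x_risky=40, p_risky=0.5, x_safe=20))
```

Under this form, a gamma closer to 1 (less probability distortion) and a larger alpha (greater magnitude sensitivity) move choices toward expected-value maximization, consistent with the effect of D2 blockade reported in the abstract.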
Big Data Processing for a Central Texas Groundwater Case Study
NASA Astrophysics Data System (ADS)
Cantu, A.; Rivera, O.; Martínez, A.; Lewis, D. H.; Gentle, J. N., Jr.; Fuentes, G.; Pierce, S. A.
2016-12-01
As computational methods improve, scientists are able to expand the level and scale of simulation and testing carried out for case studies. This study presents a comparative analysis of multiple models of the Barton Springs segment of the Edwards aquifer. Several numerical simulations using state-mandated MODFLOW models, run on Stampede, a high-performance computing system housed at the Texas Advanced Computing Center, were performed for multiple-scenario testing. One goal of this multidisciplinary project is to visualize and compare the output data of the groundwater model using the statistical programming language R in order to find revealing data patterns produced by different pumping scenarios. Presenting the data in an accessible post-processed format is also covered in this paper; visualization of the data and the creation of workflows for managing the data are performed after data extraction. The resulting analyses provide an example of how supercomputing can be used to accelerate the evaluation of scientific uncertainty and geological knowledge in relation to policy and management decisions. Understanding the aquifer's behavior helps policy makers avoid negative impacts on endangered species and environmental services and aids in maximizing the aquifer yield.
Statistical benchmark for BosonSampling
NASA Astrophysics Data System (ADS)
Walschaers, Mattia; Kuipers, Jack; Urbina, Juan-Diego; Mayer, Klaus; Tichy, Malte Christopher; Richter, Klaus; Buchleitner, Andreas
2016-03-01
Boson samplers—set-ups that generate complex many-particle output states through the transmission of elementary many-particle input states across a multitude of mutually coupled modes—promise the efficient quantum simulation of a classically intractable computational task, and challenge the extended Church-Turing thesis, one of the fundamental dogmas of computer science. However, as in all experimental quantum simulations of truly complex systems, one crucial problem remains: how to certify that a given experimental measurement record unambiguously results from enforcing the claimed dynamics, on bosons, fermions or distinguishable particles? Here we offer a statistical solution to the certification problem, identifying an unambiguous statistical signature of many-body quantum interference upon transmission across a multimode, random scattering device. We show that statistical analysis of only partial information on the output state allows the imparted dynamics to be characterised through particle type-specific features of the emerging interference patterns. The relevant statistical quantifiers are classically computable, define a falsifiable benchmark for BosonSampling, and reveal distinctive features of many-particle quantum dynamics that go far beyond mere bunching or anti-bunching effects.
Modeling for IFOG Vibration Error Based on the Strain Distribution of Quadrupolar Fiber Coil
Gao, Zhongxing; Zhang, Yonggang; Zhang, Yunhao
2016-01-01
Improving the performance of the interferometric fiber optic gyroscope (IFOG) in harsh environments, especially vibrational environments, is necessary for its practical applications. This paper presents a mathematical model for the IFOG that theoretically computes the short-term rate errors caused by mechanical vibration. The computational procedure is based mainly on the strain distribution of the quadrupolar fiber coil measured with a stress analyzer. The asymmetry of strain distribution (ASD) is defined in the paper to evaluate the winding quality of the coil. The established model reveals that a high ASD and the variation of the fiber's elastic modulus at large strain are the two dominant causes of the nonreciprocal phase shift in an IFOG under vibration. Furthermore, theoretical analysis and computational results indicate that the vibration errors of both open-loop and closed-loop IFOGs increase with increasing vibrational amplitude, vibrational frequency, and ASD. Finally, vibration-induced IFOG errors in an aircraft are estimated using the proposed model. This work is meaningful for designing IFOG coils with better anti-vibration performance. PMID:27455257
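The abstract does not give the formula for ASD; the sketch below shows one plausible way such an asymmetry metric could be computed from a measured strain profile. The synthetic profile and the mirror-symmetry definition are assumptions for illustration only.

```python
# Illustrative asymmetry-of-strain-distribution (ASD) metric: mismatch of strain
# between fiber segments equidistant from the coil midpoint, normalized by the
# total strain magnitude. Not the paper's definition, which the abstract omits.
import numpy as np

rng = np.random.default_rng(8)
n = 1000
# synthetic strain profile along the fiber (placeholder for the measured one)
strain = 1e-4 * (1 + 0.05 * rng.normal(size=n)) * np.hanning(n)

mid = n // 2
left, right = strain[:mid], strain[mid:][::-1]
asd = np.sum(np.abs(left - right)) / np.sum(np.abs(strain))
print(f"ASD (illustrative) = {asd:.4f}")
```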
Girls and computer science: experiences, perceptions, and career aspirations
NASA Astrophysics Data System (ADS)
Hur, Jung Won; Andrzejewski, Carey E.; Marghitu, Daniela
2017-04-01
The purpose of this mixed methods study was to examine ways to promote computer science (CS) among girls by exploring young women's experiences and perceptions of CS as well as investigating factors affecting their career aspirations. American girls aged 10-16 participated in focus group interviews as well as pre-, post-, and follow-up surveys while attending a CS camp. The analysis of data revealed that although the participants were generally positive about the CS field, they had very limited knowledge of and experience with CS, leading to little aspiration to become computer scientists. The findings also indicated that girls' affinity for and confidence in CS were critical factors affecting their motivation for pursuing a CS-related career. The study demonstrated that participation in the CS camp motivated a small number of participants to be interested in majoring in CS, but the activity time was too short to make a significant impact. Based on the findings, we suggest that providing CS programming experiences in K-12 classrooms is important in order to boost girls' confidence and interest in CS.
Stride, E.; Cheema, U.
2017-01-01
The growth of bubbles within the body is widely believed to be the cause of decompression sickness (DCS). Dive computer algorithms that aim to prevent DCS by mathematically modelling bubble dynamics and tissue gas kinetics are challenging to validate, owing to a lack of understanding of the mechanism(s) leading from bubble formation to DCS. In this work, a biomimetic in vitro tissue phantom and a three-dimensional computational model, comprising a hyperelastic strain-energy density function to model tissue elasticity, were combined to investigate key aspects of bubble dynamics. A sensitivity analysis indicated that the diffusion coefficient was the most influential material parameter. Comparison of computational and experimental data revealed the diffusion coefficient at the bubble surface to be 30 times smaller than that in the bulk tissue and dependent on the bubble's surface area. The initial size, size distribution and proximity of bubbles within the tissue phantom were also shown to influence their subsequent dynamics, highlighting the importance of modelling bubble nucleation and bubble–bubble interactions in order to develop more accurate dive algorithms. PMID:29263127
Extraordinarily Adaptive Properties of the Genetically Encoded Amino Acids
Ilardo, Melissa; Meringer, Markus; Freeland, Stephen; Rasulev, Bakhtiyor; Cleaves II, H. James
2015-01-01
Using novel advances in computational chemistry, we demonstrate that the set of 20 genetically encoded amino acids, used nearly universally to construct all coded terrestrial proteins, has been highly influenced by natural selection. We defined an adaptive set of amino acids as one whose members thoroughly cover relevant physico-chemical properties, or “chemistry space.” Using this metric, we compared the encoded amino acid alphabet to random sets of amino acids. These random sets were drawn from a computationally generated compound library containing 1913 alternative amino acids that lie within the molecular weight range of the encoded amino acids. Sets that cover chemistry space better than the genetically encoded alphabet are extremely rare and energetically costly. Further analysis of more adaptive sets reveals common features and anomalies, and we explore their implications for synthetic biology. We present these computations as evidence that the set of 20 amino acids found within the standard genetic code is the result of considerable natural selection. The amino acids used for constructing coded proteins may represent a largely global optimum, such that any aqueous biochemistry would use a very similar set. PMID:25802223
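The sketch below illustrates the comparison strategy (a reference 20-member set versus random 20-member subsets of a candidate library) using randomly generated placeholder property values and a simple range-product coverage metric; the study's actual properties and coverage definition may differ.

```python
# Monte Carlo comparison of "chemistry space" coverage: reference set vs.
# random 20-member subsets of a 1913-compound library. All property values
# are random placeholders, so only the strategy (not the result) is meaningful.
import numpy as np

rng = np.random.default_rng(1)
library = rng.normal(size=(1913, 3))                     # 1913 candidates x 3 properties
encoded = library[rng.choice(1913, 20, replace=False)]   # stand-in "encoded" set

def coverage(s):
    # breadth of coverage = product of the per-property ranges (one simple metric)
    return np.prod(s.max(axis=0) - s.min(axis=0))

ref = coverage(encoded)
better = sum(coverage(library[rng.choice(1913, 20, replace=False)]) > ref
             for _ in range(10000))
print(f"random sets covering better than the reference: {better}/10000")
```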
Jia, Limin
2017-01-01
To address the complicated problem of characterizing passenger-flow attraction in an urban rail transit network, the concept of the gravity field of passenger flow is proposed in this paper. We establish computational methods for field strength and potential energy to reveal the potential attraction relationships among stations from the perspective of passenger-flow collection and distribution and the topology of the network. For the computation of field strength, an optimum-path concept is proposed to define the betweenness centrality parameter. For the computation of potential energy, the composite (compound) Simpson's rule formula is applied to evaluate the integral. Taking Beijing Subway Line 10 as a practical example, a simulation and verification analysis was conducted, and the results show the following. First, the larger the field strength between two stations, the stronger the passenger-flow attraction and the greater the probability that the largest sectional passenger flow forms between them. Second, the greatest passenger-flow volume and circulation capacity occur between two zones of high potential energy. PMID:28863175
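A minimal sketch of the composite Simpson's rule used for the potential-energy integral is shown below; the integrand is a placeholder decay function, since the abstract does not give the field-strength expression.

```python
# Composite (compound) Simpson's rule; f stands in for the field-strength
# function along a path between stations, which the abstract does not specify.
import numpy as np

def composite_simpson(f, a, b, n=100):
    if n % 2:                  # Simpson's rule needs an even number of intervals
        n += 1
    x = np.linspace(a, b, n + 1)
    y = f(x)
    h = (b - a) / n
    return h / 3 * (y[0] + y[-1] + 4 * y[1:-1:2].sum() + 2 * y[2:-2:2].sum())

field_strength = lambda r: 1.0 / (1.0 + r) ** 2   # illustrative decay with distance
print(composite_simpson(field_strength, 0.0, 10.0))  # ~0.909 for this integrand
```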
Computer-aided sperm analysis: a useful tool to evaluate patient's response to varicocelectomy.
Ariagno, Julia I; Mendeluk, Gabriela R; Furlan, María J; Sardi, M; Chenlo, P; Curi, Susana M; Pugliese, Mercedes N; Repetto, Herberto E; Cohen, Mariano
2017-01-01
Preoperative and postoperative sperm parameter values from infertile men with varicocele were analyzed by computer-aided sperm analysis (CASA) to assess whether sperm characteristics improved after varicocelectomy. Semen samples from men with proven fertility (n = 38) and men with varicocele-related infertility (n = 61) were also analyzed. Conventional semen analysis was performed according to WHO (2010) criteria, and a CASA system was employed to assess kinetic parameters and sperm concentration. Seminal parameter values in the fertile group were far above those of the patients, both before and after surgery. No significant improvement in the percentage of normal sperm morphology (P = 0.10), sperm concentration (P = 0.52), total sperm count (P = 0.76), subjective motility (%) (P = 0.97) or kinematics (P = 0.30) was observed after varicocelectomy when all groups were compared. Nor was significant improvement found in the percentage of normal sperm morphology (P = 0.91), sperm concentration (P = 0.10), total sperm count (P = 0.89) or percentage motility (P = 0.77) after varicocelectomy in paired comparisons of preoperative and postoperative data. Analysis of paired samples did reveal, however, that the total sperm count (P = 0.01) and most sperm kinetic parameters improved after surgery: curvilinear velocity (P = 0.002), straight-line velocity (P = 0.0004), average path velocity (P = 0.0005), linearity (P = 0.02), and wobble (P = 0.006). CASA offers the potential for accurate quantitative assessment of each patient's response to varicocelectomy.
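The abstract does not name the paired test used; as an illustration, a Wilcoxon signed-rank test (a common choice for paired, possibly non-normal data) on invented pre/post curvilinear-velocity values would look like this:

```python
# Paired pre/post comparison of a CASA kinetic parameter (curvilinear velocity, VCL).
# The data are invented; only the testing pattern is illustrated.
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(2)
vcl_pre = rng.normal(45, 10, size=61)            # um/s, preoperative (illustrative)
vcl_post = vcl_pre + rng.normal(5, 8, size=61)   # um/s, postoperative (illustrative)

stat, p = wilcoxon(vcl_pre, vcl_post)            # non-parametric paired test
print(f"Wilcoxon statistic = {stat:.1f}, p = {p:.4f}")
```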
Functional Connectivity Parcellation of the Human Thalamus by Independent Component Analysis.
Zhang, Sheng; Li, Chiang-Shan R
2017-11-01
As a key structure for relaying and integrating information, the thalamus supports multiple cognitive and affective functions through the connectivity between its subnuclei and cortical and subcortical regions. Although extant studies have largely described thalamic regional functions in anatomical terms, evidence is accumulating to suggest a more complex picture of the subareal activities and connectivities of the thalamus. In this study, we aimed to parcellate the thalamus and examine whole-brain connectivity of its functional clusters. With resting-state functional magnetic resonance imaging data from 96 adults, we used independent component analysis (ICA) to parcellate the thalamus into 10 components. On the basis of the independence assumption, ICA helps to identify how subclusters overlap spatially. Whole-brain functional connectivity of each subdivision was computed from the independent component's time course (ICtc), a unique time series representing that IC. For comparison, we computed seed-region-based functional connectivity using the time course averaged across all voxels within a thalamic subdivision. The results showed that, at p < 10^-6, corrected, 49% of voxels on average overlapped among subdivisions. Compared with the seed-region analysis, the ICtc analysis revealed patterns of connectivity that were more clearly distinguished between thalamic clusters. The ICtc analysis demonstrated thalamic connectivity to the primary motor cortex, which eluded the seed-region analysis as well as previous studies based on averaged time series, and clarified thalamic connectivity to the hippocampus, caudate nucleus, and precuneus. These new findings elucidate the functional organization of the thalamus and suggest that ICA clustering in combination with ICtc analysis, rather than seed-region analysis, better distinguishes whole-brain connectivities among the functional clusters of a brain region.
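The sketch below contrasts the two connectivity strategies (ICtc versus seed-averaged time course) on random stand-in data; the component count matches the study, but the clustering threshold and all data are illustrative.

```python
# Conceptual sketch: ICA parcellation, then connectivity from the component time
# course (ICtc) vs. from the seed-averaged time course of its top-loading voxels.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
thalamus_ts = rng.normal(size=(200, 500))   # 200 time points x 500 thalamic voxels
brain_voxel = rng.normal(size=200)          # one target voxel's time series

ica = FastICA(n_components=10, random_state=0)
ictc = ica.fit_transform(thalamus_ts)       # 200 x 10 component time courses

weights = np.abs(ica.mixing_[:, 0])
cluster = weights > np.percentile(weights, 90)        # top-loading voxels (assumed cutoff)
seed_ts = thalamus_ts[:, cluster].mean(axis=1)

r_ictc = np.corrcoef(ictc[:, 0], brain_voxel)[0, 1]
r_seed = np.corrcoef(seed_ts, brain_voxel)[0, 1]
print(f"ICtc r = {r_ictc:.3f}, seed-average r = {r_seed:.3f}")
```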
An investigation of the critical components of a land ethic: An application of Q methodology
NASA Astrophysics Data System (ADS)
Spradling, Suzanne Shaw
Scope and method of study. The purpose of this study was to reveal the underlying structure of the beliefs of a sample of environmental educators regarding the critical components of a land or environmental ethic. Participants in the study were 30 environmental educators from seven states. All had been trained in one or more of the following national environmental education programs: Project WILD, Project WET, Project Learning Tree, Leopold Education Project, or Leave No Trace. The ages of the participants ranged from 18 to 63 years. Q methodology directed the study. Each participant completed a Q-sort of 54 statements related to environmental ethics. The data were analyzed using the computer program PQMethod 2.06, which computed a correlation matrix as input for factor analysis followed by a VARIMAX rotation. Participant demographic data were collected in order to provide a more complete picture of the revealed structure of beliefs. Findings and conclusions. A three-factor solution was revealed from the analysis of the data. These factors represent groupings of participants with similar beliefs regarding the critical components of environmental ethics. Factor one was named Nature's Advocates; these individuals believe in equal rights for all parts of the environment. Factor two was named Nature's Stewards because of the revealed belief that humans were given dominion over the earth by the creator and that natural resources should be used responsibly. Factor three was named Nature's Romantics because of their belief that nature should be preserved for its aesthetic value and because of their naive approach to conservation. The demographic data added detail to the portrait created from the Q-sort data analysis. It is important, then, to take into consideration what environmental educators believe about environmental ethics when designing meaningful curricula that seek to foster the development of those ethics. This study reveals the beliefs of a sample of environmental educators regarding the critical components of environmental ethics.
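As a rough analogue of the PQMethod pipeline (person-by-person correlation, factor extraction, VARIMAX rotation), the sketch below uses scikit-learn's FactorAnalysis on random stand-in Q-sorts; it is not the PQMethod algorithm itself.

```python
# Q-methodology-style factoring of persons: statements are treated as the
# observations and participants as the variables. Random data stand in for the
# 30 Q-sorts of 54 statements; a three-factor solution is requested as in the study.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(4)
qsorts = rng.normal(size=(30, 54))      # 30 participants x 54 statement rankings

fa = FactorAnalysis(n_components=3, rotation="varimax", random_state=0)
fa.fit(qsorts.T)                        # transpose so participants are the variables
loadings = fa.components_.T             # 30 participants x 3 rotated factor loadings
print(np.round(loadings, 2))
```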
Rotenone and paraquat perturb dopamine metabolism: a computational analysis of pesticide toxicity
Qi, Zhen; Miller, Gary W.; Voit, Eberhard O.
2014-01-01
Pesticides, such as rotenone and paraquat, are suspected in the pathogenesis of Parkinson’s disease (PD), whose hallmark is the progressive loss of dopaminergic neurons in the substantia nigra pars compacta. Thus, compounds expected to play a role in the pathogenesis of PD will likely impact the function of dopaminergic neurons. To explore the relationship between pesticide exposure and dopaminergic toxicity, we developed a custom-tailored mathematical model of dopamine metabolism and utilized it to infer potential mechanisms underlying the toxicity of rotenone and paraquat, asking how these pesticides perturb specific processes. We performed two types of analyses, which are conceptually different and complement each other. The first analysis, a purely algebraic reverse engineering approach, analytically and deterministically computes the altered profile of enzyme activities that characterize the effects of a pesticide. The second method consists of large-scale Monte Carlo simulations that statistically reveal possible mechanisms of pesticides. The results from the reverse engineering approach show that rotenone and paraquat exposures lead to distinctly different flux perturbations. Rotenone seems to affect all fluxes associated with dopamine compartmentalization, whereas paraquat exposure perturbs fluxes associated with dopamine and its breakdown metabolites. The statistical results of the Monte-Carlo analysis suggest several specific mechanisms. The findings are interesting, because no a priori assumptions are made regarding specific pesticide actions, and all parameters characterizing the processes in the dopamine model are treated in an unbiased manner. Our results show how approaches from computational systems biology can help identify mechanisms underlying the toxicity of pesticide exposure. PMID:24269752
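A highly simplified sketch of the Monte Carlo screening idea follows: random enzyme-activity perturbations are sampled, and those that reproduce an assumed "observed" dopamine level are retained. The one-pool toy model and all numbers are illustrative and not the paper's model.

```python
# Toy Monte Carlo screen: which combinations of synthesis/degradation scaling
# reproduce a given steady-state dopamine level? Purely illustrative.
import numpy as np

rng = np.random.default_rng(5)

def steady_state_da(synthesis_scale, degradation_scale):
    # steady state of a toy pool: d[DA]/dt = 1.0*s - 0.5*d*[DA]
    return (1.0 * synthesis_scale) / (0.5 * degradation_scale)

target = 1.2          # assumed "observed" dopamine level under pesticide exposure
keepers = []
for _ in range(10000):
    s, d = rng.lognormal(mean=0.0, sigma=0.5, size=2)   # random activity scalings
    if abs(steady_state_da(s, d) - target) < 0.05:
        keepers.append((s, d))

keepers = np.array(keepers)
if len(keepers):
    print(f"{len(keepers)} parameter sets match; mean (synthesis, degradation) "
          f"scales = {keepers.mean(axis=0).round(2)}")
```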
Cataldo, Rosella; Alfinito, Eleonora; Reggiani, Lino
2017-12-01
Aptamers are single-stranded DNA, RNA, or peptide sequences with the ability to bind specific targets (proteins, molecules, and ions). Aptamer production and selection for therapeutic and diagnostic applications is therefore very challenging. They are usually generated in vitro, although computational approaches have recently been developed for in silico production. Despite these efforts, the mechanism of aptamer-ligand formation is not completely clear, and producing high-affinity aptamers is still quite difficult. This paper aims to develop a computational model able to describe aptamer-ligand affinity. Topological tools, such as the conventional degree distribution, the rank-degree distribution (hierarchy), and node assortativity, are employed. To do so, the macromolecules' tertiary structures are mapped onto appropriate graphs. These graphs reproduce the main topological features of the macromolecules by preserving the distances between amino acids (nucleotides). The calculations are applied to the thrombin-binding aptamer (TBA) and the TBA-thrombin complex produced in the presence of Na+ or K+. The topological analysis detects several differences between the complexes obtained in the presence of the two cations, as expected from previous investigations. These results support graph analysis as a novel computational tool for testing affinity. Furthermore, starting from the graphs, an electrical network can be obtained by using the specific electrical properties of amino acids and nucleobases. A further analysis therefore concerns the electrical response, revealing that the resistance is sensitively affected by the presence of sodium or potassium, suggesting resistance as a useful physical parameter for testing binding affinity.
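The named topological measures can be computed with networkx as sketched below; the contact-map construction (random coordinates, an assumed distance cutoff) is only a placeholder for the paper's structure-to-graph mapping.

```python
# Degree distribution, rank-degree sequence, and degree assortativity of a graph
# built from a placeholder contact map (nodes = residues/nucleotides, edges =
# pairs closer than an assumed cutoff). Coordinates and cutoff are illustrative.
import numpy as np
import networkx as nx

rng = np.random.default_rng(6)
coords = rng.normal(size=(15, 3))                   # placeholder 3-D coordinates
dist = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
G = nx.from_numpy_array(((dist < 1.5) & (dist > 0)).astype(int))

degrees = sorted((d for _, d in G.degree()), reverse=True)
print("rank-degree sequence:", degrees)
print("degree assortativity:", nx.degree_assortativity_coefficient(G))
```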
Phytotoxicity, structural and computational analysis of 2-methyl-1,5-diarylpentadienones
NASA Astrophysics Data System (ADS)
Din, Zia Ud; Rodrigues-Filho, Edson; de Cassia Pereira, Viviane; Gualtieri, Sonia Cristina Juliano; Deflon, Victor Marcelo; da Silva Maia, Pedro Ivo; Kuznetsov, Aleksey E.
2017-08-01
In our studies aimed at producing new chemicals for use in weed control, 2-methyl-1,5-diarylpentadienones were synthesized by the reaction of p-methoxybenzaldehyde, p-nitrobenzaldehyde, and p-N,N-dimethylbenzaldehyde, respectively, with 2-butanone, resulting in four model compounds. The phytotoxicity of these compounds against wheat coleoptiles and sesame seedlings was observed at μM concentrations, indicating good potential for their use in weed management in the field. Spectroscopic and computational studies were performed in order to gain an understanding of their mechanisms of action and to clarify some structural complexities due to the existence of conformers and to substituent effects. These compounds probably act as hydroxyphenylpyruvate dioxygenase inhibitors. The tested compounds were characterized by spectroscopic and single-crystal X-ray diffraction analyses. Compound A (2-methyl-1-(p-methoxyphenyl)-5-(phenyl)-diarylpentadienone) crystallizes in the monoclinic space group P21/c with unit cell dimensions a = 14.3366(4) Å, b = 11.3788(4) Å, c = 9.6319(3) Å, β = 96.596°, V = 1560.88(9) Å3 and Z = 4. Compound C (2-methyl-1-(p-methoxyphenyl)-5-(p-nitrophenyl)-diarylpentadienone) crystallizes in the monoclinic space group P21/c with unit cell dimensions a = 17.8276(9) Å, b = 7.3627(4) Å, c = 12.9740(6) Å, β = 107.6230(10)°, V = 1623.04(14) Å3 and Z = 4. LC-UV-MS analysis furnished important data for their characterization. The spectroscopic data and computational (DFT) analysis revealed that each of the compounds A-D occurs in solution as four conformers.
Computational foundations of the visual number sense.
Stoianov, Ivilin Peev; Zorzi, Marco
2017-01-01
We provide an emergentist perspective on the computational mechanism underlying numerosity perception, its development, and the role of inhibition, based on our deep neural network model. We argue that the influence of continuous visual properties does not challenge the notion of number sense, but reveals limit conditions for the computation that yields invariance in numerosity perception. Alternative accounts should be formalized in a computational model.
Performance of the Wavelet Decomposition on Massively Parallel Architectures
NASA Technical Reports Server (NTRS)
El-Ghazawi, Tarek A.; LeMoigne, Jacqueline; Zukor, Dorothy (Technical Monitor)
2001-01-01
Traditionally, Fourier transforms have been used for signal analysis and representation. Although it is straightforward to reconstruct a signal from its Fourier transform, no local description of the signal is included in its Fourier representation. To alleviate this problem, windowed Fourier transforms and then wavelet transforms were introduced, and it has been proven that wavelets give better localization than traditional Fourier transforms, as well as a better division of the time- or space-frequency plane than windowed Fourier transforms. Because of these properties, and following the development of several fast algorithms for computing the wavelet representation of a signal, in particular the Multi-Resolution Analysis (MRA) developed by Mallat, wavelet transforms have increasingly been applied to signal analysis problems, especially real-life problems in which speed is critical. In this paper we present and compare efficient wavelet decomposition algorithms on different parallel architectures. We report and analyze experimental measurements using NASA remotely sensed images. The results show that our algorithms achieve significant performance gains on current high-performance parallel systems and meet the requirements of scientific and multimedia applications. The extensive performance measurements collected over a number of high-performance computer systems have also revealed important architectural characteristics of these systems in relation to the processing demands of the wavelet decomposition of digital images.
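A minimal serial sketch of the Mallat-style multi-resolution decomposition, using PyWavelets on a stand-in image, is given below; the parallel partitioning strategies evaluated in the paper are not reproduced.

```python
# Three-level 2-D multi-resolution (Mallat) wavelet decomposition with PyWavelets.
# A random array stands in for a remotely sensed image band.
import numpy as np
import pywt

image = np.random.default_rng(7).normal(size=(512, 512))
coeffs = pywt.wavedec2(image, wavelet="haar", level=3)

approx = coeffs[0]
print("approximation shape:", approx.shape)        # 64 x 64 at level 3
for lvl, (cH, cV, cD) in enumerate(coeffs[1:], start=1):
    print(f"detail band {lvl} (coarse to fine): {cH.shape}")
```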
Computational Analysis of Gynura bicolor Bioactive Compounds as Dipeptidyl Peptidase-IV Inhibitor
Abdullah Zawawi, Muhammad Redha; Ahmad, Muhamad Aizuddin; Jaganath, Indu Bala
2017-01-01
The inhibition of dipeptidyl peptidase-IV (DPPIV) is a popular route for the treatment of type-2 diabetes. Commercially available gliptin-based drugs such as sitagliptin, anagliptin, linagliptin, saxagliptin, and alogliptin were specifically developed as DPPIV inhibitors for diabetic patients. The use of Gynura bicolor in treating diabetes has been reported in various in vitro experiments. However, an understanding of the inhibitory actions of G. bicolor bioactive compounds on DPPIV is still lacking, and such understanding may provide crucial information for the development of more potent and natural sources of DPPIV inhibitors. G. bicolor bioactive compounds were computationally evaluated as potential DPPIV inhibitors using Lead IT and iGEMDOCK software, and the best free-binding energy scores for G. bicolor bioactive compounds were compared with those of the commercial DPPIV inhibitors sitagliptin, anagliptin, linagliptin, saxagliptin, and alogliptin. Drug-likeness and absorption, distribution, metabolism, and excretion (ADME) analyses were also performed. Based on the molecular docking analysis, four of the identified bioactive compounds in G. bicolor, 3-caffeoylquinic acid, 5-O-caffeoylquinic acid, 3,4-dicaffeoylquinic acid, and trans-5-p-coumaroylquinic acid, resulted in lower free-binding energy scores than two of the commercially available gliptin inhibitors. The results revealed that bioactive compounds in G. bicolor are potential natural inhibitors of DPPIV. PMID:28932239
Jaspard, Emmanuel; Macherel, David; Hunault, Gilles
2012-01-01
Late Embryogenesis Abundant Proteins (LEAPs) are ubiquitous proteins expected to play major roles in desiccation tolerance. Little is known about their structure-function relationships because of the scarcity of 3-D structures for LEAPs. The previous construction of LEAPdb, a database dedicated to LEAPs from plants and other organisms, led to the classification of 710 LEAPs into 12 non-overlapping classes with distinct properties. Using this resource, numerous physico-chemical properties of LEAPs and the amino acid usage of LEAPs were computed and statistically analyzed, revealing distinctive features for each class. This unprecedented analysis allowed a rigorous characterization of the 12 LEAP classes, which also differ in multiple structural and physico-chemical features. Although most LEAPs can be predicted to be intrinsically disordered proteins, the analysis indicates that LEAP class 7 (PF03168) and probably LEAP class 11 (PF04927) are natively folded proteins. This study thus provides a detailed description of the structural properties of this protein family, opening the path toward further LEAP structure-function analysis. Finally, since each LEAP class can be clearly characterized by a unique set of physico-chemical properties, it will be possible to develop software to predict whether a protein is a LEAP. PMID:22615859
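As an illustration of per-sequence physico-chemical profiling, the sketch below uses Biopython's ProtParam on a placeholder sequence; the study's own property set and software may differ.

```python
# Per-sequence physico-chemical descriptors with Biopython's ProtParam.
# The sequence is a short placeholder, not an actual LEAPdb entry.
from Bio.SeqUtils.ProtParam import ProteinAnalysis

seq = "MASGQQERAQKAEEAKQKAGETAQAAKEKASETAQAAKDKTYE"   # placeholder LEA-like sequence
pa = ProteinAnalysis(seq)

print("GRAVY (hydropathy):", round(pa.gravy(), 3))
print("Isoelectric point: ", round(pa.isoelectric_point(), 2))
print("Instability index: ", round(pa.instability_index(), 1))
print("Aromaticity:       ", round(pa.aromaticity(), 3))
```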