Sample records for computationally driven quantitative

  1. Qualitative and Quantitative Pedigree Analysis: Graph Theory, Computer Software, and Case Studies.

    ERIC Educational Resources Information Center

    Jungck, John R.; Soderberg, Patti

    1995-01-01

    Presents a series of elementary mathematical tools for re-representing pedigrees, pedigree generators, pedigree-driven database management systems, and case studies for exploring genetic relationships. (MKR)

  2. All biology is computational biology.

    PubMed

    Markowetz, Florian

    2017-03-01

    Here, I argue that computational thinking and techniques are so central to the quest of understanding life that today all biology is computational biology. Computational biology brings order into our understanding of life, it makes biological concepts rigorous and testable, and it provides a reference map that holds together individual insights. The next modern synthesis in biology will be driven by mathematical, statistical, and computational methods being absorbed into mainstream biological training, turning biology into a quantitative science.

  3. The simultaneous quantitation of ten amino acids in soil extracts by mass fragmentography

    NASA Technical Reports Server (NTRS)

    Pereira, W. E.; Hoyano, Y.; Reynolds, W. E.; Summons, R. E.; Duffield, A. M.

    1972-01-01

     A specific and sensitive method for the identification and simultaneous quantitation by mass fragmentography of ten of the amino acids present in soil was developed. The technique uses a computer-driven quadrupole mass spectrometer, with a commercial preparation of deuterated amino acids serving as internal standards for quantitation. The results obtained are comparable with those from an amino acid analyzer. In the quadrupole mass spectrometer-computer system, up to 25 pre-selected ions may be monitored sequentially, allowing a maximum of 12 different amino acids (one specific ion in each of the undeuterated and deuterated amino acid spectra) to be quantitated. The method is relatively rapid (analysis time of approximately one hour) and is capable of quantitating nanogram quantities of amino acids.
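
     The internal-standard scheme described here reduces to simple ratio arithmetic. Below is a minimal sketch of stable-isotope-dilution quantitation; the function name and all numbers are hypothetical illustrations, not values or code from the study.

     ```python
     # Sketch of quantitation against a deuterated internal standard: the amount
     # of analyte follows from the ratio of its monitored ion abundance to that
     # of the co-eluting deuterated standard, scaled by the known spiked amount.
     # All values below are hypothetical.

     def quantitate(analyte_ion_area, standard_ion_area, standard_amount_ng,
                    response_factor=1.0):
         """Return the inferred analyte amount in ng."""
         return (analyte_ion_area / standard_ion_area) * standard_amount_ng / response_factor

     # Example: monitored glycine ion vs. d5-glycine ion, 50 ng of standard spiked in.
     print(quantitate(analyte_ion_area=1.2e5, standard_ion_area=2.4e5,
                      standard_amount_ng=50.0))  # -> 25.0 ng
     ```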

  4. Knowledge-driven computational modeling in Alzheimer's disease research: Current state and future trends.

    PubMed

    Geerts, Hugo; Hofmann-Apitius, Martin; Anastasio, Thomas J

    2017-11-01

    Neurodegenerative diseases such as Alzheimer's disease (AD) follow a slowly progressing dysfunctional trajectory, with a large presymptomatic component and many comorbidities. Using preclinical models and large-scale omics studies ranging from genetics to imaging, a large number of processes that might be involved in AD pathology at different stages and levels have been identified. The sheer number of putative hypotheses makes it almost impossible to estimate their contribution to the clinical outcome and to develop a comprehensive view on the pathological processes driving the clinical phenotype. Traditionally, bioinformatics approaches have provided correlations and associations between processes and phenotypes. Focusing on causality, a new breed of advanced and more quantitative modeling approaches that use formalized domain expertise offer new opportunities to integrate these different modalities and outline possible paths toward new therapeutic interventions. This article reviews three different computational approaches and their possible complementarities. Process algebras, implemented using declarative programming languages such as Maude, facilitate simulation and analysis of complicated biological processes on a comprehensive but coarse-grained level. A model-driven Integration of Data and Knowledge, based on the OpenBEL platform and using reverse causative reasoning and network jump analysis, can generate mechanistic knowledge and a new, mechanism-based taxonomy of disease. Finally, Quantitative Systems Pharmacology is based on formalized implementation of domain expertise in a more fine-grained, mechanism-driven, quantitative, and predictive humanized computer model. We propose a strategy to combine the strengths of these individual approaches for developing powerful modeling methodologies that can provide actionable knowledge for rational development of preventive and therapeutic interventions. Development of these computational approaches is likely to be required for further progress in understanding and treating AD. Copyright © 2017 the Alzheimer's Association. Published by Elsevier Inc. All rights reserved.

  5. Computer Aided Teaching of Digital Signal Processing.

    ERIC Educational Resources Information Center

    Castro, Ian P.

    1990-01-01

    Describes a microcomputer-based software package developed at the University of Surrey for teaching digital signal processing to undergraduate science and engineering students. Menu-driven software capabilities are explained, including demonstration of qualitative concepts and experimentation with quantitative data, and examples are given of…

  6. Computational modeling of brain tumors: discrete, continuum or hybrid?

    NASA Astrophysics Data System (ADS)

    Wang, Zhihui; Deisboeck, Thomas S.

     In spite of all efforts, patients diagnosed with highly malignant brain tumors (gliomas) continue to face a grim prognosis. Achieving significant therapeutic advances will also require a more detailed quantitative understanding of the dynamic interactions among tumor cells, and between these cells and their biological microenvironment. Data-driven computational brain tumor models have the potential to provide experimental tumor biologists with such quantitative and cost-efficient tools to generate and test hypotheses on tumor progression, and to infer fundamental operating principles governing bidirectional signal propagation in multicellular cancer systems. This review highlights the modeling objectives of and challenges with developing such in silico brain tumor models by outlining two distinct computational approaches: discrete and continuum, each with representative examples. Future directions of this integrative computational neuro-oncology field, such as hybrid multiscale multiresolution modeling, are discussed.

  7. Computational modeling of brain tumors: discrete, continuum or hybrid?

    NASA Astrophysics Data System (ADS)

    Wang, Zhihui; Deisboeck, Thomas S.

    2008-04-01

     In spite of all efforts, patients diagnosed with highly malignant brain tumors (gliomas) continue to face a grim prognosis. Achieving significant therapeutic advances will also require a more detailed quantitative understanding of the dynamic interactions among tumor cells, and between these cells and their biological microenvironment. Data-driven computational brain tumor models have the potential to provide experimental tumor biologists with such quantitative and cost-efficient tools to generate and test hypotheses on tumor progression, and to infer fundamental operating principles governing bidirectional signal propagation in multicellular cancer systems. This review highlights the modeling objectives of and challenges with developing such in silico brain tumor models by outlining two distinct computational approaches: discrete and continuum, each with representative examples. Future directions of this integrative computational neuro-oncology field, such as hybrid multiscale multiresolution modeling, are discussed.

  8. Theoretical Framework for Interaction Game Design

    DTIC Science & Technology

    2016-05-19

     modeling. We take a data-driven quantitative approach to understand conversational behaviors by measuring conversational behaviors using advanced sensing... current state of the art, human computing is considered to be a reasonable approach to break through the current limitation. To solicit high quality and... proper resources in conversation to enable smooth and effective interaction. The last technique is about conversation measurement, analysis, and

  9. Single Cell Genomics: Approaches and Utility in Immunology

    PubMed Central

    Neu, Karlynn E; Tang, Qingming; Wilson, Patrick C; Khan, Aly A

    2017-01-01

    Single cell genomics offers powerful tools for studying lymphocytes, which make it possible to observe rare and intermediate cell states that cannot be resolved at the population-level. Advances in computer science and single cell sequencing technology have created a data-driven revolution in immunology. The challenge for immunologists is to harness computing and turn an avalanche of quantitative data into meaningful discovery of immunological principles, predictive models, and strategies for therapeutics. Here, we review the current literature on computational analysis of single cell RNA-seq data and discuss underlying assumptions, methods, and applications in immunology, and highlight important directions for future research. PMID:28094102

  10. Computing eddy-driven effective diffusivity using Lagrangian particles

    DOE PAGES

    Wolfram, Phillip J.; Ringler, Todd D.

    2017-08-14

     A novel method to derive effective diffusivity from Lagrangian particle trajectory data sets is developed and then analyzed relative to particle-derived meridional diffusivity for eddy-driven mixing in an idealized circumpolar current. Quantitative standard dispersion- and transport-based mixing diagnostics are defined, compared and contrasted to motivate the computation and use of effective diffusivity derived from Lagrangian particles. We compute the effective diffusivity by first performing scalar transport on Lagrangian control areas using stored trajectories computed from online Lagrangian In-situ Global High-performance particle Tracking (LIGHT) using the Model for Prediction Across Scales Ocean (MPAS-O). Furthermore, the Lagrangian scalar transport scheme is compared against an Eulerian scalar transport scheme. Spatially-variable effective diffusivities are computed from resulting time-varying cumulative concentrations that vary as a function of cumulative area. The transport-based Eulerian and Lagrangian effective diffusivity diagnostics are found to be qualitatively consistent with the dispersion-based diffusivity. All diffusivity estimates show a region of increased subsurface diffusivity within the core of an idealized circumpolar current and results are within a factor of two of each other. The Eulerian and Lagrangian effective diffusivities are most similar; smaller and more spatially diffused values are obtained with the dispersion-based diffusivity computed with particle clusters.
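
     As a point of reference for the dispersion-based diagnostic mentioned above, the sketch below estimates a meridional eddy diffusivity from particle trajectories via the growth rate of their meridional variance. It is a generic single-particle-dispersion calculation on synthetic random-walk data, not the LIGHT/MPAS-O effective-diffusivity computation.

     ```python
     import numpy as np

     # Dispersion-based meridional diffusivity estimate from Lagrangian
     # trajectories: kappa(t) ~ 0.5 * d/dt Var[y(t) - y(0)].
     # Generic diagnostic only; not the transport-based effective diffusivity
     # described in the abstract.

     def meridional_diffusivity(y, dt):
         """y: (n_times, n_particles) meridional positions [m]; dt: interval [s].
         Returns an estimate of kappa [m^2/s] at each output time."""
         variance = np.var(y - y[0], axis=1)      # spread about initial positions
         return 0.5 * np.gradient(variance, dt)   # kappa(t) = 0.5 dVar/dt

     # Toy usage: random-walk particles with a known diffusivity of 1000 m^2/s.
     rng = np.random.default_rng(0)
     dt, nsteps, npart = 3600.0, 240, 5000
     steps = rng.normal(0.0, np.sqrt(2 * 1000.0 * dt), size=(nsteps, npart))
     y = np.cumsum(steps, axis=0)
     print(meridional_diffusivity(y, dt)[-1])     # roughly 1e3 m^2/s
     ```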

  11. Computing eddy-driven effective diffusivity using Lagrangian particles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolfram, Phillip J.; Ringler, Todd D.

     A novel method to derive effective diffusivity from Lagrangian particle trajectory data sets is developed and then analyzed relative to particle-derived meridional diffusivity for eddy-driven mixing in an idealized circumpolar current. Quantitative standard dispersion- and transport-based mixing diagnostics are defined, compared and contrasted to motivate the computation and use of effective diffusivity derived from Lagrangian particles. We compute the effective diffusivity by first performing scalar transport on Lagrangian control areas using stored trajectories computed from online Lagrangian In-situ Global High-performance particle Tracking (LIGHT) using the Model for Prediction Across Scales Ocean (MPAS-O). Furthermore, the Lagrangian scalar transport scheme is compared against an Eulerian scalar transport scheme. Spatially-variable effective diffusivities are computed from resulting time-varying cumulative concentrations that vary as a function of cumulative area. The transport-based Eulerian and Lagrangian effective diffusivity diagnostics are found to be qualitatively consistent with the dispersion-based diffusivity. All diffusivity estimates show a region of increased subsurface diffusivity within the core of an idealized circumpolar current and results are within a factor of two of each other. The Eulerian and Lagrangian effective diffusivities are most similar; smaller and more spatially diffused values are obtained with the dispersion-based diffusivity computed with particle clusters.

  12. Single-Cell Genomics: Approaches and Utility in Immunology.

    PubMed

    Neu, Karlynn E; Tang, Qingming; Wilson, Patrick C; Khan, Aly A

    2017-02-01

    Single-cell genomics offers powerful tools for studying immune cells, which make it possible to observe rare and intermediate cell states that cannot be resolved at the population level. Advances in computer science and single-cell sequencing technology have created a data-driven revolution in immunology. The challenge for immunologists is to harness computing and turn an avalanche of quantitative data into meaningful discovery of immunological principles, predictive models, and strategies for therapeutics. Here, we review the current literature on computational analysis of single-cell RNA-sequencing data and discuss underlying assumptions, methods, and applications in immunology, and highlight important directions for future research. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Computational neuroscience approach to biomarkers and treatments for mental disorders.

    PubMed

    Yahata, Noriaki; Kasai, Kiyoto; Kawato, Mitsuo

    2017-04-01

    Psychiatry research has long experienced a stagnation stemming from a lack of understanding of the neurobiological underpinnings of phenomenologically defined mental disorders. Recently, the application of computational neuroscience to psychiatry research has shown great promise in establishing a link between phenomenological and pathophysiological aspects of mental disorders, thereby recasting current nosology in more biologically meaningful dimensions. In this review, we highlight recent investigations into computational neuroscience that have undertaken either theory- or data-driven approaches to quantitatively delineate the mechanisms of mental disorders. The theory-driven approach, including reinforcement learning models, plays an integrative role in this process by enabling correspondence between behavior and disorder-specific alterations at multiple levels of brain organization, ranging from molecules to cells to circuits. Previous studies have explicated a plethora of defining symptoms of mental disorders, including anhedonia, inattention, and poor executive function. The data-driven approach, on the other hand, is an emerging field in computational neuroscience seeking to identify disorder-specific features among high-dimensional big data. Remarkably, various machine-learning techniques have been applied to neuroimaging data, and the extracted disorder-specific features have been used for automatic case-control classification. For many disorders, the reported accuracies have reached 90% or more. However, we note that rigorous tests on independent cohorts are critically required to translate this research into clinical applications. Finally, we discuss the utility of the disorder-specific features found by the data-driven approach to psychiatric therapies, including neurofeedback. Such developments will allow simultaneous diagnosis and treatment of mental disorders using neuroimaging, thereby establishing 'theranostics' for the first time in clinical psychiatry. © 2016 The Authors. Psychiatry and Clinical Neurosciences © 2016 Japanese Society of Psychiatry and Neurology.

  14. Cardiac imaging: working towards fully-automated machine analysis & interpretation.

    PubMed

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-03-01

    Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.

  15. A Fully GPU-Based Ray-Driven Backprojector via a Ray-Culling Scheme with Voxel-Level Parallelization for Cone-Beam CT Reconstruction.

    PubMed

    Park, Hyeong-Gyu; Shin, Yeong-Gil; Lee, Ho

    2015-12-01

     A ray-driven backprojector is based on ray-tracing, which computes the length of the intersection between the ray paths and each voxel to be reconstructed. To reduce the computational burden caused by these exhaustive intersection tests, we propose a fully graphics processing unit (GPU)-based ray-driven backprojector in conjunction with a ray-culling scheme that enables straightforward parallelization without compromising the high computing performance of a GPU. The purpose of the ray-culling scheme is to reduce the number of ray-voxel intersection tests by excluding rays irrelevant to a specific voxel computation. This rejection step is based on an axis-aligned bounding box (AABB) enclosing a region of voxel projection, where eight vertices of each voxel are projected onto the detector plane. The range of the rectangular-shaped AABB is determined by min/max operations on the coordinates in the region. Using the indices of pixels inside the AABB, the rays passing through the voxel can be identified and the voxel is weighted as the length of intersection between the voxel and the ray. This procedure makes it possible to reflect voxel-level parallelization, allowing an independent calculation at each voxel, which is feasible for a GPU implementation. To eliminate redundant calculations during ray-culling, a shared-memory optimization is applied to exploit the GPU memory hierarchy. In experimental results using real measurement data with phantoms, the proposed GPU-based ray-culling scheme reconstructed a volume of resolution 280 × 280 × 176 in 77 seconds from 680 projections of resolution 1024 × 768, which is 26 times and 7.5 times faster than standard CPU-based and GPU-based ray-driven backprojectors, respectively. Qualitative and quantitative analyses showed that the ray-driven backprojector provides high-quality reconstruction images when compared with those generated by the Feldkamp-Davis-Kress algorithm using a pixel-driven backprojector, with an average of 2.5 times higher contrast-to-noise ratio, 1.04 times higher universal quality index, and 1.39 times higher normalized mutual information. © The Author(s) 2014.
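
     The ray-culling test described above amounts to projecting a voxel's eight corners onto the detector and bounding the resulting pixel indices. The sketch below illustrates that geometric step in NumPy; `project_to_detector` is a hypothetical placeholder for the cone-beam geometry, and this is not the authors' CUDA implementation.

     ```python
     import numpy as np

     # AABB-based ray culling for one voxel: project its eight corners onto the
     # detector, take the min/max pixel coordinates, and test only the rays whose
     # detector pixels fall inside that box. `project_to_detector` stands in for
     # the scanner geometry (source position, detector orientation).

     def voxel_aabb_pixels(voxel_corners, project_to_detector, det_shape):
         """voxel_corners: (8, 3) world coordinates of one voxel's vertices.
         Returns inclusive pixel-index ranges (u0, u1, v0, v1) of the AABB."""
         uv = np.array([project_to_detector(c) for c in voxel_corners])  # (8, 2)
         u0, v0 = np.floor(uv.min(axis=0)).astype(int)
         u1, v1 = np.ceil(uv.max(axis=0)).astype(int)
         # Clamp to the detector so out-of-field voxels contribute nothing.
         u0, u1 = max(u0, 0), min(u1, det_shape[0] - 1)
         v0, v1 = max(v0, 0), min(v1, det_shape[1] - 1)
         return u0, u1, v0, v1

     # Backprojection for this voxel would then loop only over rays hitting pixels
     # in [u0..u1] x [v0..v1], weighting each by its intersection length.
     ```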

  16. Self-Assembly of Coordinative Supramolecular Polygons with Open Binding Sites

    PubMed Central

    Zheng, Yao-Rong; Wang, Ming; Kobayashi, Shiho; Stang, Peter J.

    2011-01-01

    The design and synthesis of coordinative supramolecular polygons with open binding sites is described. Coordination-driven self-assembly of 2,6-bis(pyridin-4-ylethynyl)pyridine with 60° and 120° organoplatinum acceptors results in quantitative formation of a supramolecular rhomboid and hexagon, respectively, both bearing open pyridyl binding sites. The structures were determined by multinuclear (31P and 1H) NMR spectroscopy and electrospray ionization (ESI) mass spectrometry, along with a computational study. PMID:21516167

  17. Self-Assembly of Coordinative Supramolecular Polygons with Open Binding Sites.

    PubMed

    Zheng, Yao-Rong; Wang, Ming; Kobayashi, Shiho; Stang, Peter J

    2011-04-27

     The design and synthesis of coordinative supramolecular polygons with open binding sites is described. Coordination-driven self-assembly of 2,6-bis(pyridin-4-ylethynyl)pyridine with 60° and 120° organoplatinum acceptors results in quantitative formation of a supramolecular rhomboid and hexagon, respectively, both bearing open pyridyl binding sites. The structures were determined by multinuclear (³¹P and ¹H) NMR spectroscopy and electrospray ionization (ESI) mass spectrometry, along with a computational study.

  18. Self-consistent gyrokinetic modeling of neoclassical and turbulent impurity transport

    NASA Astrophysics Data System (ADS)

    Estève, D.; Sarazin, Y.; Garbet, X.; Grandgirard, V.; Breton, S.; Donnel, P.; Asahi, Y.; Bourdelle, C.; Dif-Pradalier, G.; Ehrlacher, C.; Emeriau, C.; Ghendrih, Ph.; Gillot, C.; Latu, G.; Passeron, C.

    2018-03-01

    Trace impurity transport is studied with the flux-driven gyrokinetic GYSELA code (Grandgirard et al 2016 Comput. Phys. Commun. 207 35). A reduced and linearized multi-species collision operator has been recently implemented, so that both neoclassical and turbulent transport channels can be treated self-consistently on an equal footing. In the Pfirsch-Schlüter regime that is probably relevant for tungsten, the standard expression for the neoclassical impurity flux is shown to be recovered from gyrokinetics with the employed collision operator. Purely neoclassical simulations of deuterium plasma with trace impurities of helium, carbon and tungsten lead to impurity diffusion coefficients, inward pinch velocities due to density peaking, and thermo-diffusion terms which quantitatively agree with neoclassical predictions and NEO simulations (Belli et al 2012 Plasma Phys. Control. Fusion 54 015015). The thermal screening factor appears to be less than predicted analytically in the Pfirsch-Schlüter regime, which can be detrimental to fusion performance. Finally, self-consistent nonlinear simulations have revealed that the tungsten impurity flux is not the sum of turbulent and neoclassical fluxes computed separately, as is usually assumed. The synergy partly results from the turbulence-driven in-out poloidal asymmetry of tungsten density. This result suggests the need for self-consistent simulations of impurity transport, i.e. including both turbulence and neoclassical physics, in view of quantitative predictions for ITER.

  19. Cardiac imaging: working towards fully-automated machine analysis & interpretation

    PubMed Central

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-01-01

    Introduction Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation. PMID:28277804

  20. A Detailed Data-Driven Network Model of Prefrontal Cortex Reproduces Key Features of In Vivo Activity

    PubMed Central

    Hass, Joachim; Hertäg, Loreen; Durstewitz, Daniel

    2016-01-01

    The prefrontal cortex is centrally involved in a wide range of cognitive functions and their impairment in psychiatric disorders. Yet, the computational principles that govern the dynamics of prefrontal neural networks, and link their physiological, biochemical and anatomical properties to cognitive functions, are not well understood. Computational models can help to bridge the gap between these different levels of description, provided they are sufficiently constrained by experimental data and capable of predicting key properties of the intact cortex. Here, we present a detailed network model of the prefrontal cortex, based on a simple computationally efficient single neuron model (simpAdEx), with all parameters derived from in vitro electrophysiological and anatomical data. Without additional tuning, this model could be shown to quantitatively reproduce a wide range of measures from in vivo electrophysiological recordings, to a degree where simulated and experimentally observed activities were statistically indistinguishable. These measures include spike train statistics, membrane potential fluctuations, local field potentials, and the transmission of transient stimulus information across layers. We further demonstrate that model predictions are robust against moderate changes in key parameters, and that synaptic heterogeneity is a crucial ingredient to the quantitative reproduction of in vivo-like electrophysiological behavior. Thus, we have produced a physiologically highly valid, in a quantitative sense, yet computationally efficient PFC network model, which helped to identify key properties underlying spike time dynamics as observed in vivo, and can be harvested for in-depth investigation of the links between physiology and cognition. PMID:27203563

  1. A color video display technique for flow field surveys

    NASA Technical Reports Server (NTRS)

    Winkelmann, A. E.; Tsao, C. P.

    1982-01-01

     A computer-driven color video display technique has been developed for the presentation of wind tunnel flow field survey data. The results of both qualitative and quantitative flow field surveys can be presented in high-spatial-resolution, color-coded displays. The technique has been used for data obtained with a hot-wire probe, a split-film probe, a Conrad (pitch) probe, and a 5-tube pressure probe in surveys above and behind a wing with partially stalled and fully stalled flow.

  2. Estrogens are essential for male pubertal periosteal bone expansion.

    PubMed

    Bouillon, Roger; Bex, Marie; Vanderschueren, Dirk; Boonen, Steven

    2004-12-01

    The skeletal response to estrogen therapy was studied in a 17-yr-old boy with congenital aromatase deficiency. As expected, estrogen therapy (1 mg estradiol valeriate/d from age 17 until 20 yr) normalized total and free testosterone and reduced the rate of bone remodeling. Dual-energy x-ray absorptiometry-assessed areal bone mineral density (BMD) of the lumbar spine and femoral neck increased significantly (by 23% and 14%, respectively), but peripheral quantitative computed tomography at the ultradistal radius revealed no gain of either trabecular or cortical volumetric BMD. The increase in areal BMD was thus driven by an increase in bone size. Indeed, longitudinal bone growth (height, +8.5%) and especially cross-sectional area of the radius (+46%) and cortical thickness (+12%), as measured by peripheral quantitative computed tomography, increased markedly during estrogen treatment. These findings demonstrate that androgens alone are insufficient, whereas estrogens are essential for the process of pubertal periosteal bone expansion typically associated with the male bone phenotype.

  3. Associative image analysis: a method for automated quantification of 3D multi-parameter images of brain tissue

    PubMed Central

    Bjornsson, Christopher S; Lin, Gang; Al-Kofahi, Yousef; Narayanaswamy, Arunachalam; Smith, Karen L; Shain, William; Roysam, Badrinath

    2009-01-01

    Brain structural complexity has confounded prior efforts to extract quantitative image-based measurements. We present a systematic ‘divide and conquer’ methodology for analyzing three-dimensional (3D) multi-parameter images of brain tissue to delineate and classify key structures, and compute quantitative associations among them. To demonstrate the method, thick (~100 μm) slices of rat brain tissue were labeled using 3 – 5 fluorescent signals, and imaged using spectral confocal microscopy and unmixing algorithms. Automated 3D segmentation and tracing algorithms were used to delineate cell nuclei, vasculature, and cell processes. From these segmentations, a set of 23 intrinsic and 8 associative image-based measurements was computed for each cell. These features were used to classify astrocytes, microglia, neurons, and endothelial cells. Associations among cells and between cells and vasculature were computed and represented as graphical networks to enable further analysis. The automated results were validated using a graphical interface that permits investigator inspection and corrective editing of each cell in 3D. Nuclear counting accuracy was >89%, and cell classification accuracy ranged from 81–92% depending on cell type. We present a software system named FARSIGHT implementing our methodology. Its output is a detailed XML file containing measurements that may be used for diverse quantitative hypothesis-driven and exploratory studies of the central nervous system. PMID:18294697

  4. Quantitative Machine Learning Analysis of Brain MRI Morphology throughout Aging.

    PubMed

    Shamir, Lior; Long, Joe

    2016-01-01

     While cognition is clearly affected by aging, it is unclear whether the process of brain aging is driven solely by accumulation of environmental damage, or involves biological pathways. We applied quantitative image analysis to profile the alteration of brain tissues during aging. A dataset of 463 brain MRI images taken from a cohort of 416 subjects was analyzed using a large set of low-level numerical image content descriptors computed from the entire brain MRI images. The correlation between the numerical image content descriptors and the age was computed, and the alterations of the brain tissues during aging were quantified and profiled using machine learning. The comprehensive set of global image content descriptors provides a high Pearson correlation of ~0.9822 with the chronological age, indicating that the machine learning analysis of global features is sensitive to the age of the subjects. Profiling of the predicted age shows several periods of mild changes, separated by shorter periods of more rapid alterations. The periods with the most rapid changes were around the age of 55 and around the age of 65. The results show that the process of brain aging is not linear and exhibits short periods of rapid aging separated by periods of milder change. These results are in agreement with patterns observed in cognitive decline, mental health status, and general human aging, suggesting that brain aging might not be driven solely by accumulation of environmental damage. Code and data used in the experiments are publicly available.
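
     The univariate step reported above is a feature-by-feature Pearson correlation with chronological age. The sketch below shows that ranking on synthetic data; it does not reproduce the study's descriptor set or its machine-learning stage.

     ```python
     import numpy as np
     from scipy.stats import pearsonr

     # Rank low-level image content descriptors by their Pearson correlation with
     # chronological age. The feature matrix here is synthetic; one descriptor is
     # deliberately made age-sensitive so the ranking has something to find.

     rng = np.random.default_rng(1)
     n_scans, n_features = 463, 200
     age = rng.uniform(18, 90, n_scans)
     features = rng.normal(size=(n_scans, n_features))
     features[:, 0] += 0.05 * age             # plant one age-sensitive descriptor

     corrs = np.array([pearsonr(features[:, j], age)[0] for j in range(n_features)])
     ranked = np.argsort(-np.abs(corrs))      # most age-informative descriptors first
     print(ranked[:5], corrs[ranked[:5]])
     ```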

  5. Anatomy-driven multiple trajectory planning (ADMTP) of intracranial electrodes for epilepsy surgery.

    PubMed

    Sparks, Rachel; Vakharia, Vejay; Rodionov, Roman; Vos, Sjoerd B; Diehl, Beate; Wehner, Tim; Miserocchi, Anna; McEvoy, Andrew W; Duncan, John S; Ourselin, Sebastien

    2017-08-01

     Epilepsy is potentially curable with resective surgery if the epileptogenic zone (EZ) can be identified. If non-invasive imaging is unable to elucidate the EZ, intracranial electrodes may be implanted to identify the EZ as well as map cortical function. In current clinical practice, each electrode trajectory is determined by time-consuming manual inspection of preoperative imaging to find a path that avoids blood vessels while traversing appropriate deep and superficial regions of interest (ROIs). We present anatomy-driven multiple trajectory planning (ADMTP) to find safe trajectories from a list of user-defined ROIs within minutes rather than the hours required for manual planning. Electrode trajectories are automatically computed in three steps: (1) Target Point Selection, to identify appropriate target points within each ROI; (2) Trajectory Risk Scoring, to quantify the cumulative distance to critical structures (blood vessels) along each trajectory, defined as the path from the skull entry point to the target point; and (3) Implantation Plan Computation, to determine a feasible combination of low-risk trajectories for all electrodes. ADMTP was evaluated on 20 patients (190 electrodes). ADMTP lowered the quantitative risk score in 83% of electrodes. Qualitative results show ADMTP found suitable trajectories for 70% of electrodes; a similar portion of manual trajectories were considered suitable. Trajectory suitability for ADMTP was 95% if traversing sulci was not included in the safety criteria. ADMTP is computationally efficient, computing between 7 and 12 trajectories in 54.5 (17.3-191.9) s. ADMTP efficiently computes safe and surgically feasible electrode trajectories.
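
     Step (2) above can be pictured as sampling a candidate trajectory from entry to target and accumulating a penalty wherever the path comes close to segmented vessels. The sketch below follows that reading; the distance map, safety margin, and scoring details are illustrative assumptions, not the published ADMTP implementation.

     ```python
     import numpy as np

     # Cumulative-risk score for one candidate electrode trajectory: sample points
     # from entry to target and penalize samples that come closer to a critical
     # structure (vessel) than a safety margin. `distance_map` is assumed to be a
     # precomputed voxelwise distance-to-vessel image.

     def trajectory_risk(entry, target, distance_map, voxel_size_mm=1.0,
                         n_samples=200, safety_margin_mm=3.0):
         entry, target = np.asarray(entry, float), np.asarray(target, float)
         ts = np.linspace(0.0, 1.0, n_samples)
         points = entry[None, :] + ts[:, None] * (target - entry)[None, :]
         idx = np.round(points).astype(int)                 # nearest-voxel lookup
         d_mm = distance_map[idx[:, 0], idx[:, 1], idx[:, 2]] * voxel_size_mm
         # Penalize only samples inside the safety margin; lower totals are safer.
         penalties = np.clip(safety_margin_mm - d_mm, 0.0, None)
         return penalties.sum() / n_samples

     # An implantation plan would keep, for each ROI, the entry/target pair with
     # the lowest risk that also satisfies angle and depth constraints.
     ```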

  6. Science-Driven Computing: NERSC's Plan for 2006-2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simon, Horst D.; Kramer, William T.C.; Bailey, David H.

     NERSC has developed a five-year strategic plan focusing on three components: Science-Driven Systems, Science-Driven Services, and Science-Driven Analytics. (1) Science-Driven Systems: Balanced introduction of the best new technologies for complete computational systems--computing, storage, networking, visualization and analysis--coupled with the activities necessary to engage vendors in addressing the DOE computational science requirements in their future roadmaps. (2) Science-Driven Services: The entire range of support activities, from high-quality operations and user services to direct scientific support, that enable a broad range of scientists to effectively use NERSC systems in their research. NERSC will concentrate on resources needed to realize the promise of the new highly scalable architectures for scientific discovery in multidisciplinary computational science projects. (3) Science-Driven Analytics: The architectural and systems enhancements and services required to integrate NERSC's powerful computational and storage resources to provide scientists with new tools to effectively manipulate, visualize, and analyze the huge data sets derived from simulations and experiments.

  7. Mobile medical computing driven by the complexity of neurologic diagnosis.

    PubMed

    Segal, Michael M

    2006-07-01

    Medical computing has been split between palm-sized computers optimized for mobility and desktop computers optimized for capability. This split was due to technology too immature to deliver both mobility and capability in the same computer and the lack of medical software that demanded both mobility and capability. Advances in hardware and software are ushering in an era in which fully capable computers will be available ubiquitously. As a result, medical practice, education and publishing will change. Medical practice will be improved by the use of software that not only assists with diagnosis but can do so at the bedside, where the doctor can act immediately upon suggestions such as useful findings to check. Medical education will shift away from a focus on details of unusual diseases and toward a focus on skills of physical examination and using computerized tools. Medical publishing, in contrast, will shift toward greater detail: it will be increasingly important to quantitate the frequency of findings in diseases and their time course since such information can have a major impact clinically when added to decision support software.

  8. Identification and Quantitative Measurements of Chemical Species by Mass Spectrometry

    NASA Technical Reports Server (NTRS)

    Zondlo, Mark A.; Bomse, David S.

    2005-01-01

     The development of a miniature gas chromatograph/mass spectrometer system for the measurement of chemical species of interest to combustion is described. The completed system is a fully-contained, automated instrument consisting of a sampling inlet, a small-scale gas chromatograph, a miniature quadrupole mass spectrometer, vacuum pumps, and software. A pair of computer-driven valves controls the gas sampling and introduction to the chromatographic column. The column has a stainless steel exterior and a silica interior, and contains an adsorbent that is used to separate organic species. The detection system is based on a quadrupole mass spectrometer consisting of a micropole array, electrometer, and a computer interface. The vacuum system has two miniature pumps to maintain the low pressure needed for the mass spectrometer. A laptop computer uses custom software to control the entire system and collect the data. In a laboratory demonstration, the system separated calibration mixtures containing 1000 ppm of alkanes and alkenes.

  9. Automated land-use mapping from spacecraft data. [Oakland County, Michigan

    NASA Technical Reports Server (NTRS)

    Chase, P. E. (Principal Investigator); Rogers, R. H.; Reed, L. E.

    1974-01-01

    The author has identified the following significant results. In response to the need for a faster, more economical means of producing land use maps, this study evaluated the suitability of using ERTS-1 computer compatible tape (CCT) data as a basis for automatic mapping. Significant findings are: (1) automatic classification accuracy greater than 90% is achieved on categories of deep and shallow water, tended grass, rangeland, extractive (bare earth), urban, forest land, and nonforested wet lands; (2) computer-generated printouts by target class provide a quantitative measure of land use; and (3) the generation of map overlays showing land use from ERTS-1 CCTs offers a significant breakthrough in the rate at which land use maps are generated. Rather than uncorrected classified imagery or computer line printer outputs, the processing results in geometrically-corrected computer-driven pen drawing of land categories, drawn on a transparent material at a scale specified by the operator. These map overlays are economically produced and provide an efficient means of rapidly updating maps showing land use.

  10. Calculation of turbulence-driven secondary motion in ducts with arbitrary cross section

    NASA Technical Reports Server (NTRS)

    Demuren, A. O.

    1989-01-01

    Calculation methods for turbulent duct flows are generalized for ducts with arbitrary cross-sections. The irregular physical geometry is transformed into a regular one in computational space, and the flow equations are solved with a finite-volume numerical procedure. The turbulent stresses are calculated with an algebraic stress model derived by simplifying model transport equations for the individual Reynolds stresses. Two variants of such a model are considered. These procedures enable the prediction of both the turbulence-driven secondary flow and the anisotropy of the Reynolds stresses, in contrast to some of the earlier calculation methods. Model predictions are compared to experimental data for developed flow in triangular duct, trapezoidal duct and a rod-bundle geometry. The correct trends are predicted, and the quantitative agreement is mostly fair. The simpler variant of the algebraic stress model procured better agreement with the measured data.

  11. Human body segmentation via data-driven graph cut.

    PubMed

    Li, Shifeng; Lu, Huchuan; Shao, Xingqing

    2014-11-01

     Human body segmentation is a challenging and important problem in computer vision. Existing methods usually entail a time-consuming training phase for prior knowledge learning with complex shape matching for body segmentation. In this paper, we propose a data-driven method that integrates top-down body pose information and bottom-up low-level visual cues for segmenting humans in static images within the graph cut framework. The key idea of our approach is to first exploit human kinematics to search for body part candidates via dynamic programming, providing high-level evidence; body-part classifiers then supply bottom-up cues on the distribution of the human body as low-level evidence. All the evidence collected from the top-down and bottom-up procedures is integrated in a graph cut framework for human body segmentation. Qualitative and quantitative experiment results demonstrate the merits of the proposed method in segmenting human bodies with arbitrary poses from cluttered backgrounds.

  12. Virtual screening of inorganic materials synthesis parameters with deep learning

    NASA Astrophysics Data System (ADS)

    Kim, Edward; Huang, Kevin; Jegelka, Stefanie; Olivetti, Elsa

    2017-12-01

    Virtual materials screening approaches have proliferated in the past decade, driven by rapid advances in first-principles computational techniques, and machine-learning algorithms. By comparison, computationally driven materials synthesis screening is still in its infancy, and is mired by the challenges of data sparsity and data scarcity: Synthesis routes exist in a sparse, high-dimensional parameter space that is difficult to optimize over directly, and, for some materials of interest, only scarce volumes of literature-reported syntheses are available. In this article, we present a framework for suggesting quantitative synthesis parameters and potential driving factors for synthesis outcomes. We use a variational autoencoder to compress sparse synthesis representations into a lower dimensional space, which is found to improve the performance of machine-learning tasks. To realize this screening framework even in cases where there are few literature data, we devise a novel data augmentation methodology that incorporates literature synthesis data from related materials systems. We apply this variational autoencoder framework to generate potential SrTiO3 synthesis parameter sets, propose driving factors for brookite TiO2 formation, and identify correlations between alkali-ion intercalation and MnO2 polymorph selection.

  13. Growth of Walled Cells: From Shells to Vesicles

    NASA Astrophysics Data System (ADS)

    Boudaoud, Arezki

    2003-07-01

     The growth of isolated walled cells is investigated. Examples of such cells range from bacteria to giant algae, and include cochlear hair, plant root hair, fungi, and yeast cells. They are modeled as elastic shells containing a liquid. Cell growth is driven by fluid pressure and is similar to a plastic deformation of the wall. The requirement of mechanical equilibrium leads to two new scaling laws for cell size that are in quantitative agreement with the compiled biological data. Given these results, possible shapes for growing cells are computed by analogy with those of vesicle membranes.

  14. On the growth of walled cells: From shells to vesicles.

    NASA Astrophysics Data System (ADS)

    Boudaoud, Arezki

    2003-03-01

    The growth of isolated walled cells is investigated. Examples of such cells range from bacteria to giant algae, and include cochlear hair, plant root hair, fungi and yeast cells. They are modeled as elastic shells inflated by a liquid. Cell growth is driven by fluid pressure and is similar to a plastic deformation of the wall. The requirement of mechanical equilibrium leads to two new scaling laws for cell size that are in quantitative agreement with the compiled biological data. Given these results, possible shapes for growing cells are computed by analogy with those of vesicle membranes.

  15. The quantum computer game: citizen science

    NASA Astrophysics Data System (ADS)

    Damgaard, Sidse; Mølmer, Klaus; Sherson, Jacob

    2013-05-01

     Progress in the field of quantum computation is hampered by daunting technical challenges. Here we present an alternative approach to solving these by enlisting the aid of computer players around the world. We have previously examined a quantum computation architecture involving ultracold atoms in optical lattices and strongly focused tweezers of light. In The Quantum Computer Game (see http://www.scienceathome.org/), we have encapsulated the time-dependent Schrödinger equation for the problem in a graphical user interface allowing for easy user input. Players can then search the parameter space with real-time graphical feedback in a game context with a global high score that rewards short gate times and robustness to experimental errors. The game, which is still in a demo version, has so far been tried by several hundred players. Extensions of the approach to other models such as Gross-Pitaevskii and Bose-Hubbard are currently under development. The game has also been incorporated into science education at high-school and university level as an alternative method for teaching quantum mechanics. Initial quantitative evaluation results are very positive. AU Ideas Center for Community Driven Research, CODER.

  16. Accelerating simultaneous algebraic reconstruction technique with motion compensation using CUDA-enabled GPU.

    PubMed

    Pang, Wai-Man; Qin, Jing; Lu, Yuqiang; Xie, Yongming; Chui, Chee-Kong; Heng, Pheng-Ann

    2011-03-01

     The aim is to accelerate the simultaneous algebraic reconstruction technique (SART) with motion compensation for speedy, high-quality computed tomography reconstruction by exploiting a CUDA-enabled GPU. Two core techniques are proposed to fit SART into the CUDA architecture: (1) a ray-driven projection along with hardware trilinear interpolation, and (2) a voxel-driven back-projection that can avoid redundant computation by combining CUDA shared memory. We utilize the independence of each ray and voxel on both techniques to design CUDA kernels to represent a ray in the projection and a voxel in the back-projection, respectively. Thus, significant parallelization and performance boost can be achieved. For motion compensation, we rectify each ray's direction during the projection and back-projection stages based on a known motion vector field. Extensive experiments demonstrate the proposed techniques can provide faster reconstruction without compromising image quality. The processing rate is nearly 100 projections s⁻¹, about 150 times faster than a CPU-based SART. The reconstructed image is compared against ground truth visually and quantitatively by peak signal-to-noise ratio (PSNR) and line profiles. We further evaluate the reconstruction quality using quantitative metrics such as signal-to-noise ratio (SNR) and mean-square-error (MSE). All these reveal that satisfactory results are achieved. The effects of major parameters such as ray sampling interval and relaxation parameter are also investigated by a series of experiments. A simulated dataset is used for testing the effectiveness of our motion compensation technique. The results demonstrate our reconstructed volume can eliminate undesirable artifacts like blurring. Our proposed method has potential to realize instantaneous presentation of 3D CT volume to physicians once the projection data are acquired.
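
     For reference, the SART update that the ray-driven and voxel-driven kernels parallelize has a compact algebraic form. The sketch below shows one iteration in dense-matrix NumPy on a toy system, without motion compensation and without the CUDA-specific kernel split described in the abstract.

     ```python
     import numpy as np

     # One SART iteration in dense-matrix form, for reference only. A is the
     # (n_rays, n_voxels) matrix of ray-voxel intersection lengths, p the measured
     # projections, x the current volume estimate. The paper's contribution is
     # mapping the forward-projection and back-projection halves of this update
     # onto CUDA kernels, which is not shown here.

     def sart_iteration(x, A, p, relaxation=0.5):
         row_sums = A.sum(axis=1)              # sum_k a_ik (forward normalization)
         col_sums = A.sum(axis=0)              # sum_i a_ij (back-projection norm.)
         residual = (p - A @ x) / np.maximum(row_sums, 1e-12)
         correction = (A.T @ residual) / np.maximum(col_sums, 1e-12)
         return x + relaxation * correction

     # Toy usage on a small consistent system: the estimate converges to x_true.
     rng = np.random.default_rng(2)
     A = rng.random((64, 32))
     x_true = rng.random(32)
     p = A @ x_true
     x = np.zeros(32)
     for _ in range(200):
         x = sart_iteration(x, A, p)
     print(np.linalg.norm(x - x_true))         # residual shrinks toward zero
     ```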

  17. Idiopathic Pulmonary Fibrosis: Data-driven Textural Analysis of Extent of Fibrosis at Baseline and 15-Month Follow-up.

    PubMed

    Humphries, Stephen M; Yagihashi, Kunihiro; Huckleberry, Jason; Rho, Byung-Hak; Schroeder, Joyce D; Strand, Matthew; Schwarz, Marvin I; Flaherty, Kevin R; Kazerooni, Ella A; van Beek, Edwin J R; Lynch, David A

    2017-10-01

     Purpose To evaluate associations between pulmonary function and both quantitative analysis and visual assessment of thin-section computed tomography (CT) images at baseline and at 15-month follow-up in subjects with idiopathic pulmonary fibrosis (IPF). Materials and Methods This retrospective analysis of preexisting anonymized data, collected prospectively between 2007 and 2013 in a HIPAA-compliant study, was exempt from additional institutional review board approval. The extent of lung fibrosis at baseline inspiratory chest CT in 280 subjects enrolled in the IPF Network was evaluated. Visual analysis was performed by using a semiquantitative scoring system. Computer-based quantitative analysis included CT histogram-based measurements and a data-driven textural analysis (DTA). Follow-up CT images in 72 of these subjects were also analyzed. Univariate comparisons were performed by using Spearman rank correlation. Multivariate and longitudinal analyses were performed by using a linear mixed model approach, in which models were compared by using asymptotic χ² tests. Results At baseline, all CT-derived measures showed moderate significant correlation (P < .001) with pulmonary function. At follow-up CT, changes in DTA scores showed significant correlation with changes in both forced vital capacity percentage predicted (ρ = -0.41, P < .001) and diffusing capacity for carbon monoxide percentage predicted (ρ = -0.40, P < .001). Asymptotic χ² tests showed that inclusion of DTA score significantly improved fit of both baseline and longitudinal linear mixed models in the prediction of pulmonary function (P < .001 for both). Conclusion When compared with semiquantitative visual assessment and CT histogram-based measurements, DTA score provides additional information that can be used to predict diminished function. Automatic quantification of lung fibrosis at CT yields an index of severity that correlates with visual assessment and functional change in subjects with IPF. © RSNA, 2017.

  18. Development of a software for quantitative evaluation radiotherapy target and organ-at-risk segmentation comparison.

    PubMed

    Kalpathy-Cramer, Jayashree; Awan, Musaddiq; Bedrick, Steven; Rasch, Coen R N; Rosenthal, David I; Fuller, Clifton D

    2014-02-01

    Modern radiotherapy requires accurate region of interest (ROI) inputs for plan optimization and delivery. Target delineation, however, remains operator-dependent and potentially serves as a major source of treatment delivery error. In order to optimize this critical, yet observer-driven process, a flexible web-based platform for individual and cooperative target delineation analysis and instruction was developed in order to meet the following unmet needs: (1) an open-source/open-access platform for automated/semiautomated quantitative interobserver and intraobserver ROI analysis and comparison, (2) a real-time interface for radiation oncology trainee online self-education in ROI definition, and (3) a source for pilot data to develop and validate quality metrics for institutional and cooperative group quality assurance efforts. The resultant software, Target Contour Testing/Instructional Computer Software (TaCTICS), developed using Ruby on Rails, has since been implemented and proven flexible, feasible, and useful in several distinct analytical and research applications.

  19. Systems Biology-Driven Hypotheses Tested In Vivo: The Need to Advancing Molecular Imaging Tools.

    PubMed

    Verma, Garima; Palombo, Alessandro; Grigioni, Mauro; La Monaca, Morena; D'Avenio, Giuseppe

    2018-01-01

    Processing and interpretation of biological images may provide invaluable insights on complex, living systems because images capture the overall dynamics as a "whole." Therefore, "extraction" of key, quantitative morphological parameters could be, at least in principle, helpful in building a reliable systems biology approach in understanding living objects. Molecular imaging tools for system biology models have attained widespread usage in modern experimental laboratories. Here, we provide an overview on advances in the computational technology and different instrumentations focused on molecular image processing and analysis. Quantitative data analysis through various open source software and algorithmic protocols will provide a novel approach for modeling the experimental research program. Besides this, we also highlight the predictable future trends regarding methods for automatically analyzing biological data. Such tools will be very useful to understand the detailed biological and mathematical expressions under in-silico system biology processes with modeling properties.

  20. A Computational Framework for Quantitative Evaluation of Movement during Rehabilitation

    NASA Astrophysics Data System (ADS)

    Chen, Yinpeng; Duff, Margaret; Lehrer, Nicole; Sundaram, Hari; He, Jiping; Wolf, Steven L.; Rikakis, Thanassis

    2011-06-01

     This paper presents a novel generalized computational framework for quantitative kinematic evaluation of movement in a rehabilitation clinic setting. The framework integrates clinical knowledge and computational data-driven analysis together in a systematic manner. The framework provides three key benefits to rehabilitation: (a) the resulting continuous normalized measure allows the clinician to monitor movement quality on a fine scale and easily compare impairments across participants, (b) the framework reveals the effect of individual movement components on the composite movement performance, helping the clinician decide the training foci, and (c) the evaluation runs in real-time, which allows the clinician to constantly track a patient's progress and make appropriate adaptations to the therapy protocol. The creation of such an evaluation is difficult because of the sparse amount of recorded clinical observations, the high dimensionality of movement, and high variations in subjects' performance. We address these issues by modeling the evaluation function as a linear combination of multiple normalized kinematic attributes, y = Σᵢ wᵢφᵢ(xᵢ), and estimating each attribute normalization function φᵢ(·) by integrating distributions of idealized movement and deviated movement. The weights wᵢ are derived from a therapist's pairwise comparisons using a modified RankSVM algorithm. We have applied this framework to evaluate upper limb movement for stroke survivors with excellent results: the evaluation results are highly correlated to the therapist's observations.
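
     The composite measure described here is a weighted sum of normalized attribute scores. The sketch below shows that combination with made-up attribute names, normalization curves, and weights; the paper's φᵢ functions come from movement distributions and its weights from a modified RankSVM over pairwise therapist rankings, neither of which is reproduced.

     ```python
     import numpy as np

     # Composite evaluation y = sum_i w_i * phi_i(x_i): each raw kinematic
     # attribute x_i is mapped to (0, 1] by a normalization function phi_i and
     # combined with weights w_i. The attribute names, normalization curve, and
     # weights below are illustrative only.

     def norm_attr(x, ideal, scale):
         """Map deviation from an idealized value onto (0, 1]; 1 means ideal."""
         return float(np.exp(-abs(x - ideal) / scale))

     attributes = {   # raw measurement, idealized value, tolerance scale
         "trajectory_error_cm": (4.2, 0.0, 3.0),
         "peak_speed_ratio":    (0.7, 1.0, 0.25),
         "shoulder_comp_deg":   (12.0, 0.0, 10.0),
     }
     weights = {"trajectory_error_cm": 0.5, "peak_speed_ratio": 0.3,
                "shoulder_comp_deg": 0.2}

     y = sum(weights[k] * norm_attr(x, ideal, s)
             for k, (x, ideal, s) in attributes.items())
     print(round(y, 3))   # composite movement-quality score in (0, 1]
     ```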

  1. Variation of the hydraulic properties within gravity-driven deposits in basinal carbonates

    NASA Astrophysics Data System (ADS)

    Jablonska, D.; Zambrano, M.; Emanuele, T.; Di Celma, C.

    2017-12-01

     Deepwater gravity-driven deposits represent important stratigraphic heterogeneities within basinal sedimentary successions. A poor understanding of their distribution, internal architecture (at meso- and micro-scale), and hydraulic properties (porosity and permeability) may lead to unexpected compartmentalization issues in reservoir analysis. In this study, we examine gravity-driven deposits within the basinal-carbonate Maiolica Formation adjacent to the Apulian Carbonate Platform, southern Italy. The Maiolica Formation consists of horizontal layers of thin-bedded cherty pelagic limestones, often intercalated with mass-transport deposits (slumps, debris-flow deposits) and calcarenites of diverse thickness (0.1 m - 40 m) and lateral extent (100 m - >500 m). Locally, gravity-driven deposits compose up to 60 % of the exposed succession. These deposits display a broad array of internal architectures (from faulted and folded strata to conglomerates) and various textures. In order to further constrain the variation of the internal architectures and fracture distribution within gravity-driven deposits, field sedimentological and structural analyses were performed. To examine the texture and hydraulic properties of various lithofacies, laboratory porosity measurements of suitable rock samples were undertaken. These data were supported by quantitative 3D pore-network analysis of X-ray computed microtomography (MicroCT) images acquired at resolutions of 1.25 and 2.0 microns. This analysis helped to describe the geometrical and morphological properties of pores and grains (such as size, shape, and specific surface area) and the hydraulic properties (porosity and permeability) of various lithofacies. The integration of the analyses allowed us to show how the internal architecture and the hydraulic properties vary in different types of gravity-driven deposits within the basinal carbonate succession.

  2. Quantitative UV spectroscopy of early O stars in the Magellanic Clouds: The determination of the stellar metallicities

    NASA Technical Reports Server (NTRS)

    Haser, Stefan M.; Pauldrach, Adalbert W. A.; Lennon, Danny J.; Kudritzki, Rolf-Peter; Lennon, Marguerite; Puls, Joachim; Voels, Stephen A.

    1997-01-01

    Ultraviolet spectra of four O stars in the Magellanic Clouds obtained with the Faint Object Spectrograph of the Hubble Space Telescope are analyzed with respect to their metallicity. The metal abundances are derived from the stellar parameters and the mass-loss rate with a two-step procedure: hydrodynamic radiation-driven wind models with metallicity as a free parameter are constructed to fit the observed wind momentum rate and thus yield a dynamical metallicity, and synthetic spectra are computed for different metal abundances and compared to the observed spectra in order to obtain a spectroscopic metallicity.

  3. Perturbatively deformed defects in Pöschl-Teller-driven scenarios for quantum mechanics

    NASA Astrophysics Data System (ADS)

    Bernardini, Alex E.; da Rocha, Roldão

    2016-07-01

    Pöschl-Teller-driven solutions for quantum mechanical fluctuations are triggered off by single scalar field theories obtained through a systematic perturbative procedure for generating deformed defects. The analytical properties concerning the quantum fluctuations in one dimension, zero-mode states, first- and second-excited states, and energy density profiles are all obtained from deformed topological and non-topological structures supported by real scalar fields. Results are firstly derived from an integrated λϕ⁴ theory, with corresponding generalizations applied to starting λχ⁴ and sine-Gordon theories. By focusing our calculations on structures supported by the λϕ⁴ theory, the outcome of our study suggests an exact quantitative correspondence to Pöschl-Teller-driven systems. Embedded into the perturbative quantum mechanics framework, such a correspondence turns into a helpful tool for computing excited states and continuous mode solutions, as well as their associated energy spectrum, for quantum fluctuations of perturbatively deformed structures. Perturbative deformations create distinct physical scenarios in the context of exactly solvable quantum systems and may also work as an analytical support for describing novel braneworld universes embedded into a 5-dimensional gravity bulk.

  4. Micromagnetic modal analysis of spin-transfer-driven ferromagnetic resonance of individual nanomagnets

    NASA Astrophysics Data System (ADS)

    Torres, L.; Finocchio, G.; Lopez-Diaz, L.; Martinez, E.; Carpentieri, M.; Consolo, G.; Azzerboni, B.

    2007-05-01

    In a recent investigation Sankey et al. [Phys. Rev. Lett. 96, 227601 (2006)] demonstrated a technique for measuring spin-transfer-driven ferromagnetic resonance in individual ellipsoidal PyCu nanomagnets as small as 30×90×5.5 nm³. In the present work, these experiments are analyzed by means of full micromagnetic modeling, finding quantitative agreement and elucidating the spatial distribution of the normal modes found in the experiment. The magnetic parameter set used in the computations is obtained by fitting static magnetoresistance measurements. The temperature effect is also included together with all the nonuniform contributions to the effective field, such as the magnetostatic coupling and the Ampere field. The polarization function of Slonczewski [J. Magn. Magn. Mater. 159, L1 (1996)] is used, including its spatial and angular dependences. Experimental spin-transfer-driven ferromagnetic resonance spectra are reproduced using the same currents as in the experiment. The use of full micromagnetic modeling allows us to further investigate the spatial dependence of the modes. The dependence of the normal-mode frequency on the dc current and the external field, together with a comparison to the normal modes induced by a microwave current, is also addressed.

  5. Considerations for Explosively Driven Conical Shock Tube Design: Computations and Experiments

    DTIC Science & Technology

    2017-02-16

    ARL-TR-7953, February 2017, US Army Research Laboratory. Considerations for Explosively Driven Conical Shock Tube Design: Computations and Experiments, by Joel B Stewart, Weapons and Materials Research Directorate.

  6. A disassembly-driven mechanism explains F-actin-mediated chromosome transport in starfish oocytes

    PubMed Central

    Bun, Philippe; Dmitrieff, Serge; Belmonte, Julio M

    2018-01-01

    While contraction of sarcomeric actomyosin assemblies is well understood, this is not the case for disordered networks of actin filaments (F-actin) driving diverse essential processes in animal cells. For example, at the onset of meiosis in starfish oocytes a contractile F-actin network forms in the nuclear region transporting embedded chromosomes to the assembling microtubule spindle. Here, we addressed the mechanism driving contraction of this 3D disordered F-actin network by comparing quantitative observations to computational models. We analyzed 3D chromosome trajectories and imaged filament dynamics to monitor network behavior under various physical and chemical perturbations. We found no evidence of myosin activity driving network contractility. Instead, our observations are well explained by models based on a disassembly-driven contractile mechanism. We reconstitute this disassembly-based contractile system in silico revealing a simple architecture that robustly drives chromosome transport to prevent aneuploidy in the large oocyte, a prerequisite for normal embryonic development. PMID:29350616

  7. Multiple-Color Optical Activation, Silencing, and Desynchronization of Neural Activity, with Single-Spike Temporal Resolution

    PubMed Central

    Han, Xue; Boyden, Edward S.

    2007-01-01

    The quest to determine how precise neural activity patterns mediate computation, behavior, and pathology would be greatly aided by a set of tools for reliably activating and inactivating genetically targeted neurons, in a temporally precise and rapidly reversible fashion. Having earlier adapted a light-activated cation channel, channelrhodopsin-2 (ChR2), for allowing neurons to be stimulated by blue light, we searched for a complementary tool that would enable optical neuronal inhibition, driven by light of a second color. Here we report that targeting the codon-optimized form of the light-driven chloride pump halorhodopsin from the archaebacterium Natronomonas pharaonis (hereafter abbreviated Halo) to genetically-specified neurons enables them to be silenced reliably, and reversibly, by millisecond-timescale pulses of yellow light. We show that trains of yellow and blue light pulses can drive high-fidelity sequences of hyperpolarizations and depolarizations in neurons simultaneously expressing yellow light-driven Halo and blue light-driven ChR2, allowing for the first time manipulations of neural synchrony without perturbation of other parameters such as spiking rates. The Halo/ChR2 system thus constitutes a powerful toolbox for multichannel photoinhibition and photostimulation of virally or transgenically targeted neural circuits without need for exogenous chemicals, enabling systematic analysis and engineering of the brain, and quantitative bioengineering of excitable cells. PMID:17375185

  8. Identifying the relationship between feedback provided in computer-assisted instructional modules, science self-efficacy, and academic achievement

    NASA Astrophysics Data System (ADS)

    Mazingo, Diann Etsuko

    Feedback has been identified as a key variable in developing academic self-efficacy. The types of feedback can vary from a traditional, objectivist approach that focuses on minimizing learner errors to a more constructivist approach, focusing on facilitating understanding. The influx of computer-based courses, whether online or through a series of computer-assisted instruction (CAI) modules, requires that the current research on effective feedback techniques in the classroom be extended to computer environments in order to impact their instructional design. In this study, exposure to different types of feedback during a chemistry CAI module was studied in relation to science self-efficacy (SSE) and performance on an objective-driven assessment (ODA) of the chemistry concepts covered in the unit. The quantitative analysis consisted of two separate ANCOVAs on the dependent variables, using pretest as the covariate and group as the fixed factor. No significant differences were found for either variable between the three groups on adjusted posttest means for the ODA and SSE measures (F(2, 106) = 1.311, p = 0.274 and F(2, 106) = 1.080, p = 0.344, respectively). However, a mixed-methods approach yielded valuable qualitative insights into why only one overall quantitative effect was observed. These findings are discussed in relation to the need to further refine the instruments and methods used in order to more fully explore the possibility that type of feedback might play a role in developing SSE and, consequently, improve academic performance in science. Future research building on this study may reveal significance that could impact instructional design practices for developing online and computer-based instruction.
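    The ANCOVA design described here (posttest as the dependent variable, pretest as the covariate, feedback group as the fixed factor) can be expressed in a few lines; the sketch below uses simulated data and hypothetical group labels, not the study's dataset:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(1)
n = 36  # hypothetical participants per feedback group
groups = np.repeat(["objectivist", "constructivist", "mixed"], n)
pretest = rng.normal(60, 8, size=3 * n)
posttest = 0.6 * pretest + rng.normal(12, 6, size=3 * n)  # no built-in group effect

data = pd.DataFrame({"group": groups, "pretest": pretest, "posttest": posttest})

# ANCOVA: posttest adjusted for pretest, with feedback group as the fixed factor.
model = smf.ols("posttest ~ pretest + C(group)", data=data).fit()
print(anova_lm(model, typ=2))  # F and p values for C(group) and the pretest covariate
```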

  9. Directional view interpolation for compensation of sparse angular sampling in cone-beam CT.

    PubMed

    Bertram, Matthias; Wiegert, Jens; Schafer, Dirk; Aach, Til; Rose, Georg

    2009-07-01

    In flat detector cone-beam computed tomography and related applications, sparse angular sampling frequently leads to characteristic streak artifacts. To overcome this problem, it has been suggested to generate additional views by means of interpolation. The practicality of this approach is investigated in combination with a dedicated method for angular interpolation of 3-D sinogram data. For this purpose, a novel dedicated shape-driven directional interpolation algorithm based on a structure tensor approach is developed. Quantitative evaluation shows that this method clearly outperforms conventional scene-based interpolation schemes. Furthermore, the image quality trade-offs associated with the use of interpolated intermediate views are systematically evaluated for simulated and clinical cone-beam computed tomography data sets of the human head. It is found that utilization of directionally interpolated views significantly reduces streak artifacts and noise, at the expense of small introduced image blur.
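    The shape-driven directional interpolation rests on a structure-tensor estimate of local orientation; the generic sketch below shows only that orientation estimate (it is not the authors' full interpolation algorithm, and the smoothing scale is an arbitrary choice):

```python
import numpy as np
from scipy.ndimage import gaussian_filter, sobel

def local_orientation(image, sigma=2.0):
    """Dominant local gradient orientation (radians) from the 2-D structure tensor.

    A directional interpolator would then interpolate along the direction orthogonal
    to this gradient, i.e. along image structures rather than across them.
    """
    img = image.astype(float)
    gx = sobel(img, axis=1)
    gy = sobel(img, axis=0)
    # Structure-tensor components, locally averaged with a Gaussian window.
    jxx = gaussian_filter(gx * gx, sigma)
    jxy = gaussian_filter(gx * gy, sigma)
    jyy = gaussian_filter(gy * gy, sigma)
    # Orientation of the principal eigenvector of [[jxx, jxy], [jxy, jyy]].
    return 0.5 * np.arctan2(2.0 * jxy, jxx - jyy)

angles = local_orientation(np.random.rand(64, 64))
print(angles.shape)
```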

  10. RE-PLAN: An Extensible Software Architecture to Facilitate Disaster Response Planning

    PubMed Central

    O’Neill, Martin; Mikler, Armin R.; Indrakanti, Saratchandra; Tiwari, Chetan; Jimenez, Tamara

    2014-01-01

    Computational tools are needed to make data-driven disaster mitigation planning accessible to planners and policymakers without the need for programming or GIS expertise. To address this problem, we have created modules to facilitate quantitative analyses pertinent to a variety of different disaster scenarios. These modules, which comprise the REsponse PLan ANalyzer (RE-PLAN) framework, may be used to create tools for specific disaster scenarios that allow planners to harness large amounts of disparate data and execute computational models through a point-and-click interface. Bio-E, a user-friendly tool built using this framework, was designed to develop and analyze the feasibility of ad hoc clinics for treating populations following a biological emergency event. In this article, the design and implementation of the RE-PLAN framework are described, and the functionality of the modules used in the Bio-E biological emergency mitigation tool are demonstrated. PMID:25419503

  11. Atomistic simulations of carbon diffusion and segregation in liquid silicon

    NASA Astrophysics Data System (ADS)

    Luo, Jinping; Alateeqi, Abdullah; Liu, Lijun; Sinno, Talid

    2017-12-01

    The diffusivity of carbon atoms in liquid silicon and their equilibrium distribution between the silicon melt and crystal phases are key, but unfortunately not precisely known parameters for the global models of silicon solidification processes. In this study, we apply a suite of molecular simulation tools, driven by multiple empirical potential models, to compute diffusion and segregation coefficients of carbon at the silicon melting temperature. We generally find good consistency across the potential model predictions, although some exceptions are identified and discussed. We also find good agreement with the range of available experimental measurements of segregation coefficients. However, the carbon diffusion coefficients we compute are significantly lower than the values typically assumed in continuum models of impurity distribution. Overall, we show that currently available empirical potential models may be useful, at least semi-quantitatively, for studying carbon (and possibly other impurity) transport in silicon solidification, especially if a multi-model approach is taken.
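    For context, a diffusion coefficient is commonly extracted from such molecular-dynamics trajectories through the Einstein relation; the sketch below uses a toy random walk in place of real trajectory data, and a production estimate would average over multiple time origins and fit only the linear regime of the MSD:

```python
import numpy as np

def diffusion_coefficient(positions, dt):
    """Estimate D from the Einstein relation MSD(t) ≈ 6 D t in three dimensions.

    positions : unwrapped coordinates, shape (n_frames, n_atoms, 3)
    dt        : time step between stored frames
    """
    disp = positions - positions[0]                  # displacement from the first frame
    msd = np.mean(np.sum(disp**2, axis=-1), axis=1)  # mean square displacement over atoms
    t = np.arange(len(msd)) * dt
    slope = np.polyfit(t[1:], msd[1:], 1)[0]         # linear fit, skipping t = 0
    return slope / 6.0

# Toy trajectory: a random walk standing in for impurity atoms in the melt.
rng = np.random.default_rng(0)
steps = rng.normal(0.0, 0.1, size=(1000, 20, 3))
trajectory = np.cumsum(steps, axis=0)
print(diffusion_coefficient(trajectory, dt=1.0))
```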

  12. A perspective on bridging scales and design of models using low-dimensional manifolds and data-driven model inference

    PubMed Central

    Zenil, Hector; Kiani, Narsis A.; Ball, Gordon; Gomez-Cabrero, David

    2016-01-01

    Systems in nature capable of collective behaviour are nonlinear, operating across several scales. Yet our ability to account for their collective dynamics differs in physics, chemistry and biology. Here, we briefly review the similarities and differences between mathematical modelling of adaptive living systems versus physico-chemical systems. We find that physics-based chemistry modelling and computational neuroscience have a shared interest in developing techniques for model reductions aiming at the identification of a reduced subsystem or slow manifold, capturing the effective dynamics. By contrast, as relations and kinetics between biological molecules are less characterized, current quantitative analysis under the umbrella of bioinformatics focuses on signal extraction, correlation, regression and machine-learning analysis. We argue that model reduction analysis and the ensuing identification of manifolds bridges physics and biology. Furthermore, modelling living systems presents deep challenges as how to reconcile rich molecular data with inherent modelling uncertainties (formalism, variables selection and model parameters). We anticipate a new generative data-driven modelling paradigm constrained by identified governing principles extracted from low-dimensional manifold analysis. The rise of a new generation of models will ultimately connect biology to quantitative mechanistic descriptions, thereby setting the stage for investigating the character of the model language and principles driving living systems. This article is part of the themed issue ‘Multiscale modelling at the physics–chemistry–biology interface’. PMID:27698038

  13. Educational Accountability: A Qualitatively Driven Mixed-Methods Approach

    ERIC Educational Resources Information Center

    Hall, Jori N.; Ryan, Katherine E.

    2011-01-01

    This article discusses the importance of mixed-methods research, in particular the value of qualitatively driven mixed-methods research for quantitatively driven domains like educational accountability. The article demonstrates the merits of qualitative thinking by describing a mixed-methods study that focuses on a middle school's system of…

  14. A quantitative approach to evolution of music and philosophy

    NASA Astrophysics Data System (ADS)

    Vieira, Vilson; Fabbri, Renato; Travieso, Gonzalo; Oliveira, Osvaldo N., Jr.; da Fontoura Costa, Luciano

    2012-08-01

    The development of new statistical and computational methods is increasingly making it possible to bridge the gap between hard sciences and humanities. In this study, we propose an approach based on a quantitative evaluation of attributes of objects in fields of humanities, from which concepts such as dialectics and opposition are formally defined mathematically. As case studies, we analyzed the temporal evolution of classical music and philosophy by obtaining data for 8 features characterizing the corresponding fields for 7 well-known composers and philosophers, which were treated with multivariate statistics and pattern recognition methods. A bootstrap method was applied to avoid statistical bias caused by the small sample data set, with which hundreds of artificial composers and philosophers were generated, influenced by the 7 names originally chosen. Upon defining indices for opposition, skewness and counter-dialectics, we confirmed the intuitive analysis of historians in that classical music evolved according to a master-apprentice tradition, while in philosophy changes were driven by opposition. Though these case studies were meant only to show the possibility of treating phenomena in humanities quantitatively, including a quantitative measure of concepts such as dialectics and opposition, the results are encouraging for further application of the approach presented here to many other areas, since it is entirely generic.
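    The bootstrap step can be illustrated with a plain percentile bootstrap; the numbers below are invented feature scores, and the resampling here is deliberately simpler than the paper's procedure of generating artificial composers and philosophers influenced by the original seven:

```python
import numpy as np

def bootstrap_mean_ci(values, n_resamples=10_000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for the mean of a small sample."""
    rng = np.random.default_rng(seed)
    values = np.asarray(values, dtype=float)
    means = [rng.choice(values, size=values.size, replace=True).mean()
             for _ in range(n_resamples)]
    lo, hi = np.percentile(means, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return values.mean(), (lo, hi)

# Hypothetical values of one stylistic feature for seven composers.
scores = [0.62, 0.71, 0.55, 0.68, 0.74, 0.59, 0.66]
print(bootstrap_mean_ci(scores))
```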

  15. Performance limits and trade-offs in entropy-driven biochemical computers.

    PubMed

    Chu, Dominique

    2018-04-14

    It is now widely accepted that biochemical reaction networks can perform computations. Examples are kinetic proofreading, gene regulation, or signalling networks. For many of these systems it was found that their computational performance is limited by a trade-off between the metabolic cost, the speed and the accuracy of the computation. In order to gain insight into the origins of these trade-offs, we consider entropy-driven computers as a model of biochemical computation. Using tools from stochastic thermodynamics, we show that entropy-driven computation is subject to a trade-off between accuracy and metabolic cost, but does not involve time trade-offs. Time trade-offs appear when it is taken into account that the result of the computation needs to be measured in order to be known. We argue that this measurement process, although usually ignored, is a major contributor to the cost of biochemical computation. Copyright © 2018 Elsevier Ltd. All rights reserved.

  16. Validation of finite element computations for the quantitative prediction of underwater noise from impact pile driving.

    PubMed

    Zampolli, Mario; Nijhof, Marten J J; de Jong, Christ A F; Ainslie, Michael A; Jansen, Erwin H W; Quesson, Benoit A J

    2013-01-01

    The acoustic radiation from a pile being driven into the sediment by a sequence of hammer strikes is studied with a linear, axisymmetric, structural acoustic frequency domain finite element model. Each hammer strike results in an impulsive sound that is emitted from the pile and then propagated in the shallow water waveguide. Measurements from accelerometers mounted on the head of a test pile and from hydrophones deployed in the water are used to validate the model results. Transfer functions between the force input at the top of the anvil and field quantities, such as acceleration components in the structure or pressure in the fluid, are computed with the model. These transfer functions are validated using accelerometer or hydrophone measurements to infer the structural forcing. A modeled hammer forcing pulse is used in the successive step to produce quantitative predictions of sound exposure at the hydrophones. The comparison between the model and the measurements shows that, although several simplifying assumptions were made, useful predictions of noise levels based on linear structural acoustic models are possible. In the final part of the paper, the model is used to characterize the pile as an acoustic radiator by analyzing the flow of acoustic energy.

  17. Landform Erosion and Volatile Redistribution on Ganymede and Callisto

    NASA Technical Reports Server (NTRS)

    Moore, Jeffrey Morgan; Howard, Alan D.; McKinnon, William B.; Schenk, Paul M.; Wood, Stephen E.

    2009-01-01

    We have been modeling landscape evolution on the Galilean satellites driven by volatile transport. Our work directly addresses some of the most fundamental issues pertinent to deciphering icy Galilean satellite geologic histories by employing techniques currently at the forefront of terrestrial, martian, and icy satellite landscape evolution studies [e.g., 1-6], including modeling of surface and subsurface energy and volatile exchanges, and computer simulation of long-term landform evolution by a variety of processes. A quantitative understanding of the expression and rates of landform erosion, and of volatile redistribution on landforms, is especially essential in interpreting endogenic landforms that have, in many cases, been significantly modified by erosion [e.g., 7-9].

  18. Improved Quantitative Plant Proteomics via the Combination of Targeted and Untargeted Data Acquisition

    PubMed Central

    Hart-Smith, Gene; Reis, Rodrigo S.; Waterhouse, Peter M.; Wilkins, Marc R.

    2017-01-01

    Quantitative proteomics strategies – which are playing important roles in the expanding field of plant molecular systems biology – are traditionally designated as either hypothesis driven or non-hypothesis driven. Many of these strategies aim to select individual peptide ions for tandem mass spectrometry (MS/MS), and to do this mixed hypothesis driven and non-hypothesis driven approaches are theoretically simple to implement. In-depth investigations into the efficacies of such approaches have, however, yet to be described. In this study, using combined samples of unlabeled and metabolically ¹⁵N-labeled Arabidopsis thaliana proteins, we investigate the mixed use of targeted data acquisition (TDA) and data dependent acquisition (DDA) – referred to as TDA/DDA – to facilitate both hypothesis driven and non-hypothesis driven quantitative data collection in individual LC-MS/MS experiments. To investigate TDA/DDA for hypothesis driven data collection, 7 miRNA target proteins of differing size and abundance were targeted using inclusion lists comprised of 1558 m/z values, using 3 different TDA/DDA experimental designs. In samples in which targeted peptide ions were of particularly low abundance (i.e., predominantly only marginally above mass analyser detection limits), TDA/DDA produced statistically significant increases in the number of targeted peptides identified (230 ± 8 versus 80 ± 3 for DDA; p = 1.1 × 10⁻³) and quantified (35 ± 3 versus 21 ± 2 for DDA; p = 0.038) per experiment relative to the use of DDA only. These expected improvements in hypothesis driven data collection were observed alongside unexpected improvements in non-hypothesis driven data collection. Untargeted peptide ions with m/z values matching those in inclusion lists were repeatedly identified and quantified across technical replicate TDA/DDA experiments, resulting in significant increases in the percentages of proteins repeatedly quantified in TDA/DDA experiments only relative to DDA experiments only (33.0 ± 2.6% versus 8.0 ± 2.7%, respectively; p = 0.011). These results were observed together with uncompromised broad-scale MS/MS data collection in TDA/DDA experiments relative to DDA experiments. Using our observations we provide guidelines for TDA/DDA method design for quantitative plant proteomics studies, and suggest that TDA/DDA is a broadly underutilized proteomics data acquisition strategy. PMID:29021799

  19. Cellular network entropy as the energy potential in Waddington's differentiation landscape

    PubMed Central

    Banerji, Christopher R. S.; Miranda-Saavedra, Diego; Severini, Simone; Widschwendter, Martin; Enver, Tariq; Zhou, Joseph X.; Teschendorff, Andrew E.

    2013-01-01

    Differentiation is a key cellular process in normal tissue development that is significantly altered in cancer. Although molecular signatures characterising pluripotency and multipotency exist, there is, as yet, no single quantitative mark of a cellular sample's position in the global differentiation hierarchy. Here we adopt a systems view and consider the sample's network entropy, a measure of signaling pathway promiscuity, computable from a sample's genome-wide expression profile. We demonstrate that network entropy provides a quantitative, in-silico, readout of the average undifferentiated state of the profiled cells, recapitulating the known hierarchy of pluripotent, multipotent and differentiated cell types. Network entropy further exhibits dynamic changes in time course differentiation data, and in line with a sample's differentiation stage. In disease, network entropy predicts a higher level of cellular plasticity in cancer stem cell populations compared to ordinary cancer cells. Importantly, network entropy also allows identification of key differentiation pathways. Our results are consistent with the view that pluripotency is a statistical property defined at the cellular population level, correlating with intra-sample heterogeneity, and driven by the degree of signaling promiscuity in cells. In summary, network entropy provides a quantitative measure of a cell's undifferentiated state, defining its elevation in Waddington's landscape. PMID:24154593
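    A much simplified numerical sketch of the idea (local Shannon entropy of an expression-weighted interaction network, averaged over nodes) is given below; the adjacency matrix, expression values and normalization are all illustrative and do not reproduce the authors' exact signalling-entropy construction:

```python
import numpy as np

def network_entropy(adjacency, expression, eps=1e-12):
    """Average local Shannon entropy of an expression-weighted interaction network.

    adjacency  : (n, n) binary, symmetric protein-interaction matrix
    expression : length-n vector of expression values for one sample
    """
    w = adjacency * expression[np.newaxis, :]           # weight edges by neighbour expression
    p = w / (w.sum(axis=1, keepdims=True) + eps)        # row-stochastic "signalling" matrix
    local_h = -np.sum(p * np.log(p + eps), axis=1)      # entropy of each node's outgoing weights
    degrees = adjacency.sum(axis=1)
    local_h = local_h / np.log(np.maximum(degrees, 2))  # normalize by the maximum possible entropy
    return float(local_h.mean())

rng = np.random.default_rng(0)
A = (rng.random((50, 50)) < 0.1).astype(float)
A = np.triu(A, 1); A = A + A.T                          # symmetric network, no self-loops
print(network_entropy(A, rng.lognormal(size=50)))
```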

  20. Self-sorting of dynamic metallosupramolecular libraries (DMLs) via metal-driven selection.

    PubMed

    Kocsis, Istvan; Dumitrescu, Dan; Legrand, Yves-Marie; van der Lee, Arie; Grosu, Ion; Barboiu, Mihail

    2014-03-11

    "Metal-driven" selection between finite mononuclear and polymeric metallosupramolecular species can be quantitatively achieved in solution and in a crystalline state via coupled coordination/stacking interactional algorithms within dynamic metallosupramolecular libraries - DMLs.

  1. Consistent data-driven computational mechanics

    NASA Astrophysics Data System (ADS)

    González, D.; Chinesta, F.; Cueto, E.

    2018-05-01

    We present a novel method, within the realm of data-driven computational mechanics, to obtain reliable and thermodynamically sound simulations from experimental data. We thus avoid the need to fit any phenomenological model in the construction of the simulation model. This kind of technique opens unprecedented possibilities in the framework of data-driven application systems and, particularly, in the paradigm of industry 4.0.

  2. Survival Mode: The Stresses and Strains of Computing Curricula Review

    ERIC Educational Resources Information Center

    Tan, Grace; Venables, Anne

    2008-01-01

    In an ideal world, review and changes to computing curricula should be driven solely by academic concerns for the needs of students. The process should be informed by industry accreditation processes and international best practice (Hurst et al., 2001). However, Australian computing curricular review is often driven by the need for financial…

  3. Computer-Based Grammar Instruction in an EFL Context: Improving the Effectiveness of Teaching Adverbial Clauses

    ERIC Educational Resources Information Center

    Kiliçkaya, Ferit

    2015-01-01

    This study aims to find out whether there are any statistically significant differences in participants' achievements on three different types of instruction: computer-based instruction, teacher-driven instruction, and teacher-driven grammar supported by computer-based instruction. Each type of instruction follows the deductive approach. The…

  4. Core-collapse supernovae as supercomputing science: A status report toward six-dimensional simulations with exact Boltzmann neutrino transport in full general relativity

    NASA Astrophysics Data System (ADS)

    Kotake, Kei; Sumiyoshi, Kohsuke; Yamada, Shoichi; Takiwaki, Tomoya; Kuroda, Takami; Suwa, Yudai; Nagakura, Hiroki

    2012-08-01

    This is a status report on our endeavor to reveal the mechanism of core-collapse supernovae (CCSNe) by large-scale numerical simulations. Multi-dimensionality of the supernova engine, general relativistic magnetohydrodynamics, energy and lepton number transport by neutrinos emitted from the forming neutron star, as well as nuclear interactions there, are all believed to play crucial roles in repelling infalling matter and producing energetic explosions. These ingredients are non-linearly coupled with one another in the dynamics of core collapse, bounce, and shock expansion. Serious quantitative studies of CCSNe hence make extensive numerical computations mandatory. Since neutrinos are neither in thermal nor in chemical equilibrium in general, their distributions in the phase space should be computed. This is a six-dimensional (6D) neutrino transport problem and quite a challenge, even for those with access to the most advanced numerical resources such as the "K computer". To tackle this problem, we have embarked on efforts on multiple fronts. In particular, we report in this paper our recent progress in the treatment of multidimensional (multi-D) radiation hydrodynamics. We are currently proceeding on two different paths to the ultimate goal. In one approach, we employ an approximate but highly efficient scheme for neutrino transport and treat 3D hydrodynamics and/or general relativity rigorously; some neutrino-driven explosions will be presented and quantitative comparisons will be made between 2D and 3D models. In the second approach, on the other hand, exact, but so far Newtonian, Boltzmann equations are solved in two and three spatial dimensions; we will show some example test simulations. We will also address the perspectives of exascale computations on the next generation supercomputers.

  5. Visualizing histopathologic deep learning classification and anomaly detection using nonlinear feature space dimensionality reduction.

    PubMed

    Faust, Kevin; Xie, Quin; Han, Dominick; Goyle, Kartikay; Volynskaya, Zoya; Djuric, Ugljesa; Diamandis, Phedias

    2018-05-16

    There is growing interest in utilizing artificial intelligence, and particularly deep learning, for computer vision in histopathology. While accumulating studies highlight expert-level performance of convolutional neural networks (CNNs) on focused classification tasks, most studies rely on probability distribution scores with empirically defined cutoff values based on post-hoc analysis. More generalizable tools that allow humans to visualize histology-based deep learning inferences and decision making are scarce. Here, we leverage t-distributed Stochastic Neighbor Embedding (t-SNE) to reduce dimensionality and depict how CNNs organize histomorphologic information. Unique to our workflow, we develop a quantitative and transparent approach to visualizing classification decisions prior to softmax compression. By discretizing the relationships between classes on the t-SNE plot, we show we can super-impose randomly sampled regions of test images and use their distribution to render statistically-driven classifications. Therefore, in addition to providing intuitive outputs for human review, this visual approach can carry out automated and objective multi-class classifications similar to more traditional and less-transparent categorical probability distribution scores. Importantly, this novel classification approach is driven by a priori statistically defined cutoffs. It therefore serves as a generalizable classification and anomaly detection tool less reliant on post-hoc tuning. Routine incorporation of this convenient approach for quantitative visualization and error reduction in histopathology aims to accelerate early adoption of CNNs into generalized real-world applications where unanticipated and previously untrained classes are often encountered.
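    The core of the visualization step, projecting high-dimensional CNN feature vectors into two dimensions with t-SNE, can be sketched in a few lines; the random features, labels and perplexity below are placeholders, not the histopathology data or network used in the study:

```python
import numpy as np
from sklearn.manifold import TSNE

# Hypothetical stand-in for penultimate-layer CNN features of image patches:
# 500 patches x 512 features, with class labels used only to colour the plot.
rng = np.random.default_rng(0)
features = rng.normal(size=(500, 512))
labels = rng.integers(0, 3, size=500)

# Reduce to 2-D for visual inspection of how the network organizes the patches.
embedding = TSNE(n_components=2, perplexity=30, init="pca",
                 random_state=0).fit_transform(features)
print(embedding.shape)  # (500, 2); scatter-plot coloured by `labels` to inspect class structure
```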

  6. Three-dimensional Cascaded Lattice Boltzmann Model for Thermal Convective Flows

    NASA Astrophysics Data System (ADS)

    Hajabdollahi, Farzaneh; Premnath, Kannan

    2017-11-01

    Fluid motion driven by thermal effects, such as due to buoyancy in differentially heated enclosures, arises in several natural and industrial settings, whose understanding can be achieved via numerical simulations. Lattice Boltzmann (LB) methods are efficient kinetic computational approaches for coupled flow physics problems. In this study, we develop three-dimensional (3D) LB models based on central moments and multiple relaxation times for D3Q7 and D3Q15 lattices to solve the energy transport equations in a double distribution function approach. Their collision operators lead to a cascaded structure involving higher order terms resulting in improved stability. This is coupled to a central moment based LB flow solver with source terms. The new 3D cascaded LB models for the convective flows are first validated for natural convection of air driven thermally on two vertically opposite faces in a cubic cavity at different Rayleigh numbers against prior numerical and experimental data, which show good quantitative agreement. Then, the detailed structure of the 3D flow and thermal fields and the heat transfer rates at different Rayleigh numbers are analyzed and interpreted.

  7. General Relativistic Effects on Neutrino-driven Winds from Young, Hot Neutron Stars and r-Process Nucleosynthesis

    NASA Astrophysics Data System (ADS)

    Otsuki, Kaori; Tagoshi, Hideyuki; Kajino, Toshitaka; Wanajo, Shin-ya

    2000-04-01

    Neutrino-driven winds from young hot neutron stars, which are formed by supernova explosions, are the most promising candidate site for r-process nucleosynthesis. We study general relativistic effects on this wind in Schwarzschild geometry in order to look for suitable conditions for successful r-process nucleosynthesis. It is quantitatively demonstrated that general relativistic effects play a significant role in increasing the entropy and decreasing the dynamic timescale of the neutrino-driven wind. Exploring the wide parameter region that determines the expansion dynamics of the wind, we find interesting physical conditions that lead to successful r-process nucleosynthesis. The conditions that we found are realized in a neutrino-driven wind with a very short dynamic timescale, τ_dyn ~ 6 ms, and a relatively low entropy, S ~ 140. We carry out α-process and r-process nucleosynthesis calculations under these conditions with our single network code, which includes over 3000 isotopes, and confirm quantitatively that the second and third r-process abundance peaks are produced in neutrino-driven winds.

  8. Logic integer programming models for signaling networks.

    PubMed

    Haus, Utz-Uwe; Niermann, Kathrin; Truemper, Klaus; Weismantel, Robert

    2009-05-01

    We propose a static and a dynamic approach to model biological signaling networks, and show how each can be used to answer relevant biological questions. For this, we use the two different mathematical tools of Propositional Logic and Integer Programming. The power of discrete mathematics for handling qualitative as well as quantitative data has so far not been exploited in molecular biology, which is mostly driven by experimental research, relying on first-order or statistical models. The arising logic statements and integer programs are analyzed and can be solved with standard software. For a restricted class of problems the logic models reduce to a polynomial-time solvable satisfiability algorithm. Additionally, a more dynamic model enables enumeration of possible time resolutions in poly-logarithmic time. Computational experiments are included.

  9. Modeling of Nonlinear Beat Signals of TAE's

    NASA Astrophysics Data System (ADS)

    Zhang, Bo; Berk, Herbert; Breizman, Boris; Zheng, Linjin

    2012-03-01

    Experiments on Alcator C-Mod reveal Toroidal Alfven Eigenmodes (TAE) together with signals at various beat frequencies, including those at twice the mode frequency. The beat frequencies are sidebands driven by quadratic nonlinear terms in the MHD equations. These nonlinear sidebands have not yet been quantified by any existing codes. We extend the AEGIS code to capture nonlinear effects by treating the nonlinear terms as a driving source in the linear MHD solver. Our goal is to compute the spatial structure of the sidebands for realistic geometry and q-profile, which can be directly compared with experiment in order to interpret the phase contrast imaging diagnostic measurements and to enable the quantitative determination of the Alfven wave amplitude in the plasma core.

  10. Assistive Technology as an artificial intelligence opportunity: Case study of letter-based, head movement driven communication.

    PubMed

    Miksztai-Réthey, Brigitta; Faragó, Kinga Bettina

    2015-01-01

    We studied an artificial-intelligence-assisted interaction between a computer and a human with severe speech and physical impairments (SSPI). In order to speed up augmentative and alternative communication (AAC), we extended a former study of typing performance optimization using a framework that included head-movement-controlled assistive technology and an onscreen writing device. Quantitative and qualitative data were collected and analysed with mathematical methods, manual interpretation and semi-supervised machine video annotation. As a result of our research, in contrast to the former experiment's conclusions, we found that our participant had at least two different typing strategies. To maximize his communication efficiency, a more complex assistive tool is suggested, which takes the different methods into consideration.

  11. Application of Phase-Field Techniques to Hydraulically- and Deformation-Induced Fracture.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Culp, David; Miller, Nathan; Schweizer, Laura

    Phase-field techniques provide an alternative approach to fracture problems which mitigates some of the computational expense associated with tracking the crack interface and the coalescence of individual fractures. The technique is extended to apply to hydraulically driven fracture such as would occur during fracking or CO2 sequestration. Additionally, the technique is applied to a stainless steel specimen used in the Sandia Fracture Challenge. It was found that the phase-field model performs very well, at least qualitatively, in both deformation-induced fracture and hydraulically-induced fracture, though spurious hourglassing modes were observed during coupled hydraulically-induced fracture. Future work would include performing additional quantitative benchmark tests and updating the model as needed.

  12. Intertwined electron-nuclear motion in frustrated double ionization in driven heteronuclear molecules

    NASA Astrophysics Data System (ADS)

    Vilà, A.; Zhu, J.; Scrinzi, A.; Emmanouilidou, A.

    2018-03-01

    We study frustrated double ionization (FDI) in a strongly-driven heteronuclear molecule HeH+ and compare with H2. We compute the probability distribution of the sum of the final kinetic energies of the nuclei for strongly-driven HeH+. We find that this distribution has more than one peak for strongly-driven HeH+, a feature we do not find to be present for strongly-driven H2. Moreover, we compute the probability distribution of the principal quantum number n of FDI. We find that this distribution has several peaks for strongly-driven HeH+, while the respective distribution has one main peak and a ‘shoulder’ at lower principal quantum numbers n for strongly-driven H2. Surprisingly, we find this feature to be a clear signature of the intertwined electron-nuclear motion.

  13. Clustering and Network Analysis of Reverse Phase Protein Array Data.

    PubMed

    Byron, Adam

    2017-01-01

    Molecular profiling of proteins and phosphoproteins using a reverse phase protein array (RPPA) platform, with a panel of target-specific antibodies, enables the parallel, quantitative proteomic analysis of many biological samples in a microarray format. Hence, RPPA analysis can generate a high volume of multidimensional data that must be effectively interrogated and interpreted. A range of computational techniques for data mining can be applied to detect and explore data structure and to form functional predictions from large datasets. Here, two approaches for the computational analysis of RPPA data are detailed: the identification of similar patterns of protein expression by hierarchical cluster analysis and the modeling of protein interactions and signaling relationships by network analysis. The protocols use freely available, cross-platform software, are easy to implement, and do not require any programming expertise. Serving as data-driven starting points for further in-depth analysis, validation, and biological experimentation, these and related bioinformatic approaches can accelerate the functional interpretation of RPPA data.
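    The first of these two approaches, hierarchical clustering of the RPPA signal matrix, can be run with standard scientific-Python tools; the matrix dimensions, distance metric and linkage choice below are illustrative defaults rather than a prescription from the protocol:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Hypothetical RPPA matrix: 40 samples x 120 antibody targets (log2 signal).
rng = np.random.default_rng(0)
rppa = rng.normal(size=(40, 120))

# Correlation distance with average linkage is a common choice for expression-style data.
dist = pdist(rppa, metric="correlation")
tree = linkage(dist, method="average")
clusters = fcluster(tree, t=4, criterion="maxclust")  # cut the dendrogram into 4 groups
print(np.bincount(clusters))                           # number of samples per cluster
```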

  14. Direct and Quantitative Characterization of Dynamic Ligand Exchange between Coordination-Driven Self-Assembled Supramolecular Polygons

    PubMed Central

    Zheng, Yao-Rong; Stang, Peter J.

    2009-01-01

    The direct observation of dynamic ligand exchange between Pt-N coordination-driven self-assembled supramolecular polygons (triangles and rectangles) has been achieved using stable isotope labeling (1H/2D) of the pyridyl donors and electrospray ionization mass spectrometry (ESI-MS) together with NMR spectroscopy. Both the thermodynamic and kinetic aspects of such exchange processes have been established based on quantitative mass spectral results. Further investigation showed that the exchange is highly dependent on experimental conditions such as temperature, solvent, and the counter anions. PMID:19243144

  15. Direct and quantitative characterization of dynamic ligand exchange between coordination-driven self-assembled supramolecular polygons.

    PubMed

    Zheng, Yao-Rong; Stang, Peter J

    2009-03-18

    The direct observation of dynamic ligand exchange between Pt-N coordination-driven self-assembled supramolecular polygons (triangles and rectangles) has been achieved using stable (1)H/(2)D isotope labeling of the pyridyl donors and electrospray ionization mass spectrometry combined with NMR spectroscopy. Both the thermodynamic and kinetic aspects of such exchange processes have been established on the basis of quantitative mass spectral results. Further investigation has shown that the exchange is highly dependent on experimental conditions such as temperature, solvent, and the counteranions.

  16. Non-invasive methods for the determination of body and carcass composition in livestock: dual-energy X-ray absorptiometry, computed tomography, magnetic resonance imaging and ultrasound: invited review.

    PubMed

    Scholz, A M; Bünger, L; Kongsro, J; Baulain, U; Mitchell, A D

    2015-07-01

    The ability to accurately measure body or carcass composition is important for performance testing, grading and finally selection or payment of meat-producing animals. Advances especially in non-invasive techniques are mainly based on the development of electronic and computer-driven methods in order to provide objective phenotypic data. The preference for a specific technique depends on the target animal species or carcass, combined with technical and practical aspects such as accuracy, reliability, cost, portability, speed, ease of use, safety and for in vivo measurements the need for fixation or sedation. The techniques rely on specific device-driven signals, which interact with tissues in the body or carcass at the atomic or molecular level, resulting in secondary or attenuated signals detected by the instruments and analyzed quantitatively. The electromagnetic signal produced by the instrument may originate from mechanical energy such as sound waves (ultrasound - US), 'photon' radiation (X-ray-computed tomography - CT, dual-energy X-ray absorptiometry - DXA) or radio frequency waves (magnetic resonance imaging - MRI). The signals detected by the corresponding instruments are processed to measure, for example, tissue depths, areas, volumes or distributions of fat, muscle (water, protein) and partly bone or bone mineral. Among the above techniques, CT is the most accurate one followed by MRI and DXA, whereas US can be used for all sizes of farm animal species even under field conditions. CT, MRI and US can provide volume data, whereas only DXA delivers immediate whole-body composition results without (2D) image manipulation. A combination of simple US and more expensive CT, MRI or DXA might be applied for farm animal selection programs in a stepwise approach.

  17. Heat release and flame structure measurements of self-excited acoustically-driven premixed methane flames

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kopp-Vaughan, Kristin M.; Tuttle, Steven G.; Renfro, Michael W.

    An open-open organ pipe burner (Rijke tube) with a bluff-body ring was used to create a self-excited, acoustically-driven, premixed methane-air conical flame, with equivalence ratios ranging from 0.85 to 1.05. The feed tube velocities corresponded to Re = 1780-4450. Coupled oscillations in pressure, velocity, and heat release from the flame are naturally encouraged at resonant frequencies in the Rijke tube combustor. This coupling creates sustainable self-excited oscillations in flame front area and shape. The period of the oscillations occurs at the resonant frequency of the combustion chamber when the flame is placed ~1/4 of the distance from the bottom of the tube. In this investigation, the shape of these acoustically-driven flames is measured by employing both OH planar laser-induced fluorescence (PLIF) and chemiluminescence imaging, and the images are correlated to simultaneously measured pressure in the combustor. Past research on acoustically perturbed flames has focused on qualitative flame area and heat release relationships under imposed velocity perturbations at imposed frequencies. This study reports quantitative empirical fits with respect to pressure or phase angle in a self-generated pressure oscillation. The OH-PLIF images were single temporal shots and the chemiluminescence images were phase averaged on chip, such that 15 exposures were used to create one image. Thus, both measurements were time resolved during the flame oscillation. Phase-resolved area and heat release variations throughout the pressure oscillation were computed. A relation between flame area and the phase angle before the pressure maximum was derived for all flames in order to quantitatively show that the Rayleigh criterion was satisfied in the combustor. Qualitative trends in oscillating flame area were found with respect to feed tube flow rates. A logarithmic relation was found between the RMS pressure and both the normalized average area and heat release rate for all flames.

  18. Limited angle CT reconstruction by simultaneous spatial and Radon domain regularization based on TV and data-driven tight frame

    NASA Astrophysics Data System (ADS)

    Zhang, Wenkun; Zhang, Hanming; Wang, Linyuan; Cai, Ailong; Li, Lei; Yan, Bin

    2018-02-01

    Limited angle computed tomography (CT) reconstruction is widely performed in medical diagnosis and industrial testing because of the size of objects, engine/armor inspection requirements, and limited scan flexibility. Limited angle reconstruction necessitates the use of optimization-based methods that utilize additional sparse priors. However, most conventional methods exploit sparsity priors only in the spatial domain. When CT projection data suffer from serious deficiency or various noises, obtaining reconstructed images that meet the quality requirement becomes difficult and challenging. To solve this problem, this paper develops an adaptive reconstruction method for the limited angle CT problem. The proposed method simultaneously uses a spatial and Radon domain regularization model based on total variation (TV) and a data-driven tight frame. The data-driven tight frame, derived from wavelet transformation, aims at exploiting sparsity priors of the sinogram in the Radon domain. Unlike existing works that utilize a pre-constructed sparse transformation, the framelets of the data-driven regularization model can be adaptively learned from the latest projection data in the process of iterative reconstruction to provide optimal sparse approximations for a given sinogram. At the same time, an effective alternating direction method is designed to solve the simultaneous spatial and Radon domain regularization model. The experiments for both simulation and real data demonstrate that the proposed algorithm shows better performance in artifact suppression and detail preservation than algorithms solely using a regularization model of the spatial domain. Quantitative evaluations of the results also indicate that the proposed algorithm, which applies a learning strategy, performs better than dual-domain algorithms without a learned regularization model.

  19. Temporal patterns of inputs to cerebellum necessary and sufficient for trace eyelid conditioning.

    PubMed

    Kalmbach, Brian E; Ohyama, Tatsuya; Mauk, Michael D

    2010-08-01

    Trace eyelid conditioning is a form of associative learning that requires several forebrain structures and cerebellum. Previous work suggests that at least two conditioned stimulus (CS)-driven signals are available to the cerebellum via mossy fiber inputs during trace conditioning: one driven by and terminating with the tone and a second driven by medial prefrontal cortex (mPFC) that persists through the stimulus-free trace interval to overlap in time with the unconditioned stimulus (US). We used electric stimulation of mossy fibers to determine whether this pattern of dual inputs is necessary and sufficient for cerebellar learning to express normal trace eyelid responses. We find that presenting the cerebellum with one input that mimics persistent activity observed in mPFC and the lateral pontine nuclei during trace eyelid conditioning and another that mimics tone-elicited mossy fiber activity is sufficient to produce responses whose properties quantitatively match trace eyelid responses using a tone. Probe trials with each input delivered separately provide evidence that the cerebellum learns to respond to the mPFC-like input (that overlaps with the US) and learns to suppress responding to the tone-like input (that does not). This contributes to precisely timed responses and the well-documented influence of tone offset on the timing of trace responses. Computer simulations suggest that the underlying cerebellar mechanisms involve activation of different subsets of granule cells during the tone and during the stimulus-free trace interval. These results indicate that tone-driven and mPFC-like inputs are necessary and sufficient for the cerebellum to learn well-timed trace conditioned responses.

  20. Driven damped harmonic oscillator resonance with an Arduino

    NASA Astrophysics Data System (ADS)

    Goncalves, A. M. B.; Cena, C. R.; Bozano, D. F.

    2017-07-01

    In this paper we propose a simple experimental apparatus that can be used to show quantitative and qualitative results of resonance in a driven damped harmonic oscillator. The driven oscillation is produced by a servo motor, and the oscillation amplitude is measured by an ultrasonic position sensor. Both are controlled by an Arduino board. The measured frequency of free oscillation was compatible with the measured resonance frequency.
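    For reference, the steady-state amplitude that such an apparatus traces out as the driving frequency is swept follows the standard driven damped oscillator expression; a short numerical sketch with purely illustrative parameter values is:

```python
import numpy as np

def amplitude(omega, omega0, gamma, f0=1.0):
    """Steady-state amplitude of x'' + gamma*x' + omega0^2*x = f0*cos(omega*t)."""
    return f0 / np.sqrt((omega0**2 - omega**2)**2 + (gamma * omega)**2)

omega0, gamma = 2 * np.pi * 1.5, 0.8            # natural frequency (rad/s) and damping
omega = np.linspace(0.1, 20.0, 2000)
a = amplitude(omega, omega0, gamma)
omega_res = omega[np.argmax(a)]                  # frequency of the amplitude peak
print(omega_res, np.sqrt(omega0**2 - gamma**2 / 2.0))  # numeric peak vs. analytic value
```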

  1. Quantitative analysis of tympanic membrane perforation: a simple and reliable method.

    PubMed

    Ibekwe, T S; Adeosun, A A; Nwaorgu, O G

    2009-01-01

    Accurate assessment of the features of tympanic membrane perforation, especially size, site, duration and aetiology, is important, as it enables optimum management. To describe a simple, cheap and effective method of quantitatively analysing tympanic membrane perforations. The system described comprises a video-otoscope (capable of generating still and video images of the tympanic membrane), adapted via a universal serial bus box to a computer screen, with images analysed using the Image J geometrical analysis software package. The reproducibility of results and their correlation with conventional otoscopic methods of estimation were tested statistically with the paired t-test and correlational tests, using the Statistical Package for the Social Sciences version 11 software. The following equation was generated: P/T × 100% = percentage perforation, where P is the area (in pixels²) of the tympanic membrane perforation and T is the total area (in pixels²) for the entire tympanic membrane (including the perforation). Illustrations are shown. Comparison of blinded data on tympanic membrane perforation area obtained independently from assessments by two trained otologists, of comparative years of experience, using the video-otoscopy system described, showed similar findings, with strong correlations devoid of inter-observer error (p = 0.000, r = 1). Comparison with conventional otoscopic assessment also indicated significant correlation, comparing results for two trained otologists, but some inter-observer variation was present (p = 0.000, r = 0.896). Correlation between the two methods for each of the otologists was also highly significant (p = 0.000). A computer-adapted video-otoscope, with images analysed by Image J software, represents a cheap, reliable, technology-driven, clinical method of quantitative analysis of tympanic membrane perforations and injuries.
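    The perforation index itself is a one-line computation once the two areas have been traced in Image J; a trivial sketch with made-up pixel counts:

```python
def perforation_percentage(perforation_area_px, total_membrane_area_px):
    """Percentage perforation = P / T x 100, with both areas measured in pixels^2."""
    return 100.0 * perforation_area_px / total_membrane_area_px

# Hypothetical pixel areas traced on a video-otoscopy still image.
print(f"{perforation_percentage(12500, 98000):.1f}% of the tympanic membrane is perforated")
```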

  2. Mouse Driven Window Graphics for Network Teaching.

    ERIC Educational Resources Information Center

    Makinson, G. J.; And Others

    Computer-enhanced teaching of computational mathematics on a network system driving graphics terminals is being redeveloped for a mouse-driven, high-resolution, windowed environment of a UNIX workstation. Preservation of the features of networked access by heterogeneous terminals is provided by the use of the X Window environment. A demonstrator…

  3. Optimal protocols for slowly driven quantum systems.

    PubMed

    Zulkowski, Patrick R; DeWeese, Michael R

    2015-09-01

    The design of efficient quantum information processing will rely on optimal nonequilibrium transitions of driven quantum systems. Building on a recently developed geometric framework for computing optimal protocols for classical systems driven in finite time, we construct a general framework for optimizing the average information entropy for driven quantum systems. Geodesics on the parameter manifold endowed with a positive semidefinite metric correspond to protocols that minimize the average information entropy production in finite time. We use this framework to explicitly compute the optimal entropy production for a simple two-state quantum system coupled to a heat bath of bosonic oscillators, which has applications to quantum annealing.

  4. Purpose-driven biomaterials research in liver-tissue engineering.

    PubMed

    Ananthanarayanan, Abhishek; Narmada, Balakrishnan Chakrapani; Mo, Xuejun; McMillian, Michael; Yu, Hanry

    2011-03-01

    Bottom-up engineering of microscale tissue ("microtissue") constructs to recapitulate partially the complex structure-function relationships of liver parenchyma has been realized through the development of sophisticated biomaterial scaffolds, liver-cell sources, and in vitro culture techniques. With regard to in vivo applications, the long-lived stem/progenitor cell constructs can improve cell engraftment, whereas the short-lived, but highly functional hepatocyte constructs stimulate host liver regeneration. With regard to in vitro applications, microtissue constructs are being adapted or custom-engineered into cell-based assays for testing acute, chronic and idiosyncratic toxicities of drugs or pathogens. Systems-level methods and computational models that represent quantitative relationships between biomaterial scaffolds, cells and microtissue constructs will further enable their rational design for optimal integration into specific biomedical applications. Copyright © 2010 Elsevier Ltd. All rights reserved.

  5. Granular Rayleigh-Taylor instability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vinningland, Jan Ludvig; Johnsen, Oistein; Flekkoey, Eirik G.

    2009-06-18

    A granular instability driven by gravity is studied experimentally and numerically. The instability arises as grains fall in a closed Hele-Shaw cell where a layer of dense granular material is positioned above a layer of air. The initially flat front defined by the grains subsequently develops into a pattern of falling granular fingers separated by rising bubbles of air. A transient coarsening of the front is observed right from the start by a finger merging process. The coarsening is later stabilized by new fingers growing from the center of the rising bubbles. The structures are quantified by means of Fourier analysis and quantitative agreement between experiment and computation is shown. This analysis also reveals scale invariance of the flow structures under overall change of spatial scale.

  6. Non-invasive molecular imaging for preclinical cancer therapeutic development

    PubMed Central

    O'Farrell, AC; Shnyder, SD; Marston, G; Coletta, PL; Gill, JH

    2013-01-01

    Molecular and non-invasive imaging are rapidly emerging fields in preclinical cancer drug discovery. This is driven by the need to develop more efficacious and safer treatments, the advent of molecular-targeted therapeutics, and the requirements to reduce and refine current preclinical in vivo models. Such bioimaging strategies include MRI, PET, single-photon emission computed tomography, ultrasound, and optical approaches such as bioluminescence and fluorescence imaging. These molecular imaging modalities have several advantages over traditional screening methods, not least the ability to quantitatively monitor pharmacodynamic changes at the cellular and molecular level in living animals non-invasively in real time. This review aims to provide an overview of non-invasive molecular imaging techniques, highlighting the strengths, limitations and versatility of these approaches in preclinical cancer drug discovery and development. PMID:23488622

  7. Computational pathology of pre-treatment biopsies identifies lymphocyte density as a predictor of response to neoadjuvant chemotherapy in breast cancer.

    PubMed

    Ali, H Raza; Dariush, Aliakbar; Provenzano, Elena; Bardwell, Helen; Abraham, Jean E; Iddawela, Mahesh; Vallier, Anne-Laure; Hiller, Louise; Dunn, Janet A; Bowden, Sarah J; Hickish, Tamas; McAdam, Karen; Houston, Stephen; Irwin, Mike J; Pharoah, Paul D P; Brenton, James D; Walton, Nicholas A; Earl, Helena M; Caldas, Carlos

    2016-02-16

    There is a need to improve prediction of response to chemotherapy in breast cancer in order to improve clinical management and this may be achieved by harnessing computational metrics of tissue pathology. We investigated the association between quantitative image metrics derived from computational analysis of digital pathology slides and response to chemotherapy in women with breast cancer who received neoadjuvant chemotherapy. We digitised tissue sections of both diagnostic and surgical samples of breast tumours from 768 patients enrolled in the Neo-tAnGo randomized controlled trial. We subjected digital images to systematic analysis optimised for detection of single cells. Machine-learning methods were used to classify cells as cancer, stromal or lymphocyte and we computed estimates of absolute numbers, relative fractions and cell densities using these data. Pathological complete response (pCR), a histological indicator of chemotherapy response, was the primary endpoint. Fifteen image metrics were tested for their association with pCR using univariate and multivariate logistic regression. Median lymphocyte density proved most strongly associated with pCR on univariate analysis (OR 4.46, 95 % CI 2.34-8.50, p < 0.0001; observations = 614) and on multivariate analysis (OR 2.42, 95 % CI 1.08-5.40, p = 0.03; observations = 406) after adjustment for clinical factors. Further exploratory analyses revealed that in approximately one quarter of cases there was an increase in lymphocyte density in the tumour removed at surgery compared to diagnostic biopsies. A reduction in lymphocyte density at surgery was strongly associated with pCR (OR 0.28, 95 % CI 0.17-0.47, p < 0.0001; observations = 553). A data-driven analysis of computational pathology reveals lymphocyte density as an independent predictor of pCR. Paradoxically an increase in lymphocyte density, following exposure to chemotherapy, is associated with a lack of pCR. Computational pathology can provide objective, quantitative and reproducible tissue metrics and represents a viable means of outcome prediction in breast cancer. ClinicalTrials.gov NCT00070278 ; 03/10/2003.
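
    A minimal sketch (on simulated data, not the Neo-tAnGo measurements; variable names are placeholders) of the kind of univariate logistic-regression analysis described above, relating a per-slide lymphocyte-density metric to pCR and reporting an odds ratio with its confidence interval:

    ```python
    # Hypothetical illustration: logistic regression of pCR on lymphocyte density.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 600
    lymph_density = rng.gamma(shape=2.0, scale=0.5, size=n)      # cells per unit area (arbitrary units)
    logit_p = -2.0 + 1.2 * lymph_density                          # assumed true association, for simulation only
    pcr = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))         # 1 = pathological complete response

    X = sm.add_constant(lymph_density)                            # intercept + single predictor
    fit = sm.Logit(pcr, X).fit(disp=False)

    or_est = np.exp(fit.params[1])                                # odds ratio per unit of density
    or_ci = np.exp(fit.conf_int()[1])                             # 95% confidence interval
    print(f"OR = {or_est:.2f}, 95% CI = ({or_ci[0]:.2f}, {or_ci[1]:.2f}), p = {fit.pvalues[1]:.3g}")
    ```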

  8. Integrated computational model of the bioenergetics of isolated lung mitochondria

    PubMed Central

    Zhang, Xiao; Jacobs, Elizabeth R.; Camara, Amadou K. S.; Clough, Anne V.

    2018-01-01

    Integrated computational modeling provides a mechanistic and quantitative framework for describing lung mitochondrial bioenergetics. Thus, the objective of this study was to develop and validate a thermodynamically-constrained integrated computational model of the bioenergetics of isolated lung mitochondria. The model incorporates the major biochemical reactions and transport processes in lung mitochondria. A general framework was developed to model those biochemical reactions and transport processes. Intrinsic model parameters such as binding constants were estimated using previously published isolated enzymes and transporters kinetic data. Extrinsic model parameters such as maximal reaction and transport velocities were estimated by fitting the integrated bioenergetics model to published and new tricarboxylic acid cycle and respirometry data measured in isolated rat lung mitochondria. The integrated model was then validated by assessing its ability to predict experimental data not used for the estimation of the extrinsic model parameters. For example, the model was able to predict reasonably well the substrate and temperature dependency of mitochondrial oxygen consumption, kinetics of NADH redox status, and the kinetics of mitochondrial accumulation of the cationic dye rhodamine 123, driven by mitochondrial membrane potential, under different respiratory states. The latter required the coupling of the integrated bioenergetics model to a pharmacokinetic model for the mitochondrial uptake of rhodamine 123 from buffer. The integrated bioenergetics model provides a mechanistic and quantitative framework for 1) integrating experimental data from isolated lung mitochondria under diverse experimental conditions, and 2) assessing the impact of a change in one or more mitochondrial processes on overall lung mitochondrial bioenergetics. In addition, the model provides important insights into the bioenergetics and respiration of lung mitochondria and how they differ from those of mitochondria from other organs. To the best of our knowledge, this model is the first for the bioenergetics of isolated lung mitochondria. PMID:29889855

  9. Integrated computational model of the bioenergetics of isolated lung mitochondria.

    PubMed

    Zhang, Xiao; Dash, Ranjan K; Jacobs, Elizabeth R; Camara, Amadou K S; Clough, Anne V; Audi, Said H

    2018-01-01

    Integrated computational modeling provides a mechanistic and quantitative framework for describing lung mitochondrial bioenergetics. Thus, the objective of this study was to develop and validate a thermodynamically-constrained integrated computational model of the bioenergetics of isolated lung mitochondria. The model incorporates the major biochemical reactions and transport processes in lung mitochondria. A general framework was developed to model those biochemical reactions and transport processes. Intrinsic model parameters such as binding constants were estimated using previously published isolated enzymes and transporters kinetic data. Extrinsic model parameters such as maximal reaction and transport velocities were estimated by fitting the integrated bioenergetics model to published and new tricarboxylic acid cycle and respirometry data measured in isolated rat lung mitochondria. The integrated model was then validated by assessing its ability to predict experimental data not used for the estimation of the extrinsic model parameters. For example, the model was able to predict reasonably well the substrate and temperature dependency of mitochondrial oxygen consumption, kinetics of NADH redox status, and the kinetics of mitochondrial accumulation of the cationic dye rhodamine 123, driven by mitochondrial membrane potential, under different respiratory states. The latter required the coupling of the integrated bioenergetics model to a pharmacokinetic model for the mitochondrial uptake of rhodamine 123 from buffer. The integrated bioenergetics model provides a mechanistic and quantitative framework for 1) integrating experimental data from isolated lung mitochondria under diverse experimental conditions, and 2) assessing the impact of a change in one or more mitochondrial processes on overall lung mitochondrial bioenergetics. In addition, the model provides important insights into the bioenergetics and respiration of lung mitochondria and how they differ from those of mitochondria from other organs. To the best of our knowledge, this model is the first for the bioenergetics of isolated lung mitochondria.

  10. Challenges in Scale-Resolving Simulations of turbulent wake flows with coherent structures

    NASA Astrophysics Data System (ADS)

    Pereira, Filipe S.; Eça, Luís; Vaz, Guilherme; Girimaji, Sharath S.

    2018-06-01

    The objective of this work is to investigate the challenges encountered in Scale-Resolving Simulations (SRS) of turbulent wake flows driven by spatially-developing coherent structures. SRS of practical interest are expressly intended for efficiently computing such flows by resolving only the most important features of the coherent structures and modelling the remainder as a stochastic field. The success of SRS methods depends upon three important factors: i) identifying the key flow mechanisms responsible for the generation of coherent structures; ii) determining the optimum range of resolution required to adequately capture key elements of coherent structures; and iii) ensuring that the modelled part is composed nearly exclusively of fully-developed stochastic turbulence. This study considers the canonical case of the flow around a circular cylinder to address the aforementioned three key issues. It is first demonstrated using experimental evidence that the vortex-shedding instability and flow-structure development involve four important stages. A series of SRS computations of progressively increasing resolution (decreasing cut-off length) are performed. An a priori basis for locating the origin of coherent-structure development is proposed and examined. The criterion is based on the fact that the coherent structures are generated by the Kelvin-Helmholtz (KH) instability. The most important finding is that the key aspects of coherent structures can be resolved only if the effective computational Reynolds number (based on total viscosity) exceeds the critical value of the KH instability in laminar flows. Finally, a quantitative criterion assessing the nature of the unresolved field based on the strain-rate ratio of mean and unresolved fields is examined. The two proposed conditions and rationale offer a quantitative basis for developing "good practice" guidelines for SRS of complex turbulent wake flows with coherent structures.
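
    One compact way to state the resolution criterion above (our paraphrase; the symbols are assumptions rather than the authors' notation) is through an effective Reynolds number built on the total viscosity, i.e. molecular plus modelled eddy viscosity:

    \[
    Re_{\mathrm{eff}} \;=\; \frac{U_{\infty} D}{\nu + \nu_t} \;\gtrsim\; Re_{c,\mathrm{KH}} ,
    \]

    i.e. the coherent structures can emerge in the resolved field only if the eddy viscosity contributed by the unresolved model does not push the effective Reynolds number below the critical value of the Kelvin-Helmholtz instability.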

  11. Spatiotemporal dynamics of the Ebola epidemic in Guinea and implications for vaccination and disease elimination: a computational modeling analysis.

    PubMed

    Ajelli, Marco; Merler, Stefano; Fumanelli, Laura; Pastore Y Piontti, Ana; Dean, Natalie E; Longini, Ira M; Halloran, M Elizabeth; Vespignani, Alessandro

    2016-09-07

    Among the three countries most affected by the Ebola virus disease outbreak in 2014-2015, Guinea presents an unusual spatiotemporal epidemic pattern, with several waves and a long tail in the decay of the epidemic incidence. Here, we develop a stochastic agent-based model at the level of a single household that integrates detailed data on Guinean demography, hospitals, Ebola treatment units, contact tracing, and safe burial interventions. The microsimulation-based model is used to assess the effect of each control strategy and the probability of elimination of the epidemic according to different intervention scenarios, including ring vaccination with the recombinant vesicular stomatitis virus-vectored vaccine. The numerical results indicate that the dynamics of the Ebola epidemic in Guinea can be quantitatively explained by the timeline of the implemented interventions. In particular, the early availability of Ebola treatment units and the associated isolation of cases and safe burials helped to limit the number of Ebola cases experienced by Guinea. We provide quantitative evidence of a strong negative correlation between the time series of cases and the number of traced contacts. This result is confirmed by the computational model that suggests that contact tracing effort is a key determinant in the control and elimination of the disease. In data-driven microsimulations, we find that tracing at least 5-10 contacts per case is crucial in preventing epidemic resurgence during the epidemic elimination phase. The computational model is used to provide an analysis of the ring vaccination trial highlighting its potential effect on disease elimination. We identify contact tracing as one of the key determinants of the epidemic's behavior in Guinea, and we show that the early availability of Ebola treatment unit beds helped to limit the number of Ebola cases in Guinea.

  12. An appraisal of the classic forest succession paradigm with the shade tolerance index.

    PubMed

    Lienard, Jean; Florescu, Ionut; Strigul, Nikolay

    2015-01-01

    In this paper we revisit the classic theory of forest succession that relates shade tolerance and species replacement and assess its validity to understand patch-mosaic patterns of forested ecosystems of the USA. We introduce a macroscopic parameter called the "shade tolerance index" and compare it to the classic continuum index in southern Wisconsin forests. We exemplify shade tolerance driven succession in White Pine-Eastern Hemlock forests using computer simulations and analyzing approximated chronosequence data from the USDA FIA forest inventory. We describe this parameter across the last 50 years in the ecoregions of mainland USA, and demonstrate that it does not correlate with the usual macroscopic characteristics of stand age, biomass, basal area, and biodiversity measures. We characterize the dynamics of shade tolerance index using transition matrices and delimit geographical areas based on the relevance of shade tolerance to explain forest succession. We conclude that shade tolerance driven succession is linked to climatic variables and can be considered as a primary driving factor of forest dynamics mostly in central-north and northeastern areas in the USA. Overall, the shade tolerance index constitutes a new quantitative approach that can be used to understand and predict succession of forested ecosystems and biogeographic patterns.
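
    The abstract does not give the exact formula, so the sketch below is only a plausible minimal reading of a plot-level "shade tolerance index": an abundance-weighted mean of species-level shade-tolerance scores. All species names and the 1-5 tolerance scale are assumptions for illustration.

    ```python
    # Hypothetical sketch: basal-area-weighted mean shade tolerance for one forest plot.
    # Species tolerance scores (1 = very intolerant ... 5 = very tolerant) are illustrative.
    SHADE_TOLERANCE = {"white pine": 3.2, "eastern hemlock": 4.8, "red oak": 2.8, "paper birch": 1.5}

    def shade_tolerance_index(trees):
        """trees: list of (species, basal_area_m2) tuples for a single plot."""
        total_ba = sum(ba for _, ba in trees)
        if total_ba == 0:
            return float("nan")
        return sum(SHADE_TOLERANCE[sp] * ba for sp, ba in trees) / total_ba

    plot = [("white pine", 1.2), ("eastern hemlock", 0.8), ("paper birch", 0.3)]
    print(f"shade tolerance index = {shade_tolerance_index(plot):.2f}")
    ```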

  13. qPortal: A platform for data-driven biomedical research.

    PubMed

    Mohr, Christopher; Friedrich, Andreas; Wojnar, David; Kenar, Erhan; Polatkan, Aydin Can; Codrea, Marius Cosmin; Czemmel, Stefan; Kohlbacher, Oliver; Nahnsen, Sven

    2018-01-01

    Modern biomedical research aims at drawing biological conclusions from large, highly complex biological datasets. It has become common practice to make extensive use of high-throughput technologies that produce large amounts of heterogeneous data. In addition to the ever-improving accuracy, methods are getting faster and cheaper, resulting in a steadily increasing need for scalable data management and easily accessible means of analysis. We present qPortal, a platform providing users with an intuitive way to manage and analyze quantitative biological data. The backend leverages a variety of concepts and technologies, such as relational databases, data stores, data models and means of data transfer, as well as front-end solutions to give users access to data management and easy-to-use analysis options. Users are empowered to conduct their experiments from the experimental design to the visualization of their results through the platform. Here, we illustrate the feature-rich portal by simulating a biomedical study based on publicly available data. We demonstrate the software's strength in supporting the entire project life cycle. The software supports the project design and registration, empowers users to do all-digital project management and finally provides means to perform analysis. We compare our approach to Galaxy, one of the most widely used scientific workflow and analysis platforms in computational biology. Application of both systems to a small case study shows the differences between a data-driven approach (qPortal) and a workflow-driven approach (Galaxy). qPortal, a one-stop-shop solution for biomedical projects, offers up-to-date analysis pipelines, quality control workflows, and visualization tools. Through intensive user interactions, appropriate data models have been developed. These models build the foundation of our biological data management system and provide possibilities to annotate data, query metadata for statistics and future re-analysis on high-performance computing systems via coupling of workflow management systems. Integration of project and data management as well as workflow resources in one place presents clear advantages over existing solutions.

  14. An IRT Model with a Parameter-Driven Process for Change

    ERIC Educational Resources Information Center

    Rijmen, Frank; De Boeck, Paul; van der Maas, Han L. J.

    2005-01-01

    An IRT model with a parameter-driven process for change is proposed. Quantitative differences between persons are taken into account by a continuous latent variable, as in common IRT models. In addition, qualitative inter-individual differences and auto-dependencies are accounted for by assuming within-subject variability with respect to the…

  15. Decrease in Ground-Run Distance of Small Airplanes by Applying Electrically-Driven Wheels

    NASA Astrophysics Data System (ADS)

    Kobayashi, Hiroshi; Nishizawa, Akira

    A new takeoff method for small airplanes was proposed. Ground-roll performance of an airplane driven by electrically-powered wheels was studied experimentally and computationally. The experiments verified that the ground-run distance was decreased by half with a combination of the powered wheels and the propeller, without an increase in energy consumption during the ground roll. The computational analysis showed that the ground-run distance of the wheel-driven aircraft was independent of the motor power once the motor capability exceeded the friction between tires and ground. Furthermore, the distance was minimized when the angle of attack was set to a value at which the wing generated negative lift.
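
    A minimal longitudinal equation of motion for the ground roll (a generic textbook formulation, not the authors' computational model; the symbols are our assumptions) makes both findings plausible:

    \[
    m\frac{dv}{dt} \;=\; T_{\mathrm{prop}}(v) + F_{\mathrm{wheel}} - D(v) - \mu_r\big(W - L(v)\big),
    \qquad
    F_{\mathrm{wheel}} \;\le\; \mu\big(W - L(v)\big),
    \]

    so the wheel force saturates at the tyre-ground friction limit once motor torque is sufficient (hence the independence from motor power), and a slightly negative lift raises that traction limit; the ground run then follows from \( s = \int_0^{V_{\mathrm{LOF}}} v\,dv/a(v) \).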

  16. Understanding shape entropy through local dense packing

    DOE PAGES

    van Anders, Greg; Klotsa, Daphne; Ahmed, N. Khalid; ...

    2014-10-24

    Entropy drives the phase behavior of colloids ranging from dense suspensions of hard spheres or rods to dilute suspensions of hard spheres and depletants. Entropic ordering of anisotropic shapes into complex crystals, liquid crystals, and even quasicrystals was demonstrated recently in computer simulations and experiments. The ordering of shapes appears to arise from the emergence of directional entropic forces (DEFs) that align neighboring particles, but these forces have been neither rigorously defined nor quantified in generic systems. In this paper, we show quantitatively that shape drives the phase behavior of systems of anisotropic particles upon crowding through DEFs. We define DEFs in generic systems and compute them for several hard particle systems. We show they are on the order of a few times the thermal energy (k_B T) at the onset of ordering, placing DEFs on par with traditional depletion, van der Waals, and other intrinsic interactions. In experimental systems with these other interactions, we provide direct quantitative evidence that entropic effects of shape also contribute to self-assembly. We use DEFs to draw a distinction between self-assembly and packing behavior. We show that the mechanism that generates directional entropic forces is the maximization of entropy by optimizing local particle packing. Finally, we show that this mechanism occurs in a wide class of systems and we treat, in a unified way, the entropy-driven phase behavior of arbitrary shapes, incorporating the well-known works of Kirkwood, Onsager, and Asakura and Oosawa.

  17. Design of a miniature implantable left ventricular assist device using CAD/CAM technology.

    PubMed

    Okamoto, Eiji; Hashimoto, Takuya; Mitamura, Yoshinori

    2003-01-01

    In this study, we developed a new miniature motor-driven pulsatile left ventricular assist device (LVAD) for implantation into a Japanese patient of average build by means of computer-aided design and manufacturing (CAD/CAM) technology. A specially designed miniature ball-screw and a high-performance brushless DC motor were used in an artificial heart actuator to allow miniaturization. A blood pump chamber (stroke volume 55 ml) and an inflow and outflow port were designed by computational fluid dynamics (CFD) analysis. The geometry of the blood pump was evaluated using the value of the index of pump geometry (IPG) = (Reynolds shear stress) x (occupied volume) as a quantitative index for optimization. The calculated value of IPG varied from 20.6 Nm to 49.1 Nm, depending on small variations in pump geometry. We determined the optimum pump geometry based on the results of quantitative evaluation using IPG and qualitative evaluation using the flow velocity distribution with blood flow tracking. The geometry of the blood pump that gave lower shear stress had a more optimal spiral flow around the diaphragm-housing (D-H) junction. The volume and weight of the new LVAD, made of epoxy resin, are 309 ml and 378 g, but further miniaturization will be possible by improving the geometry of both the blood pump and the back casing. Our results show that our new design method for an implantable LVAD using CAD/CAM promises to improve blood compatibility with greater miniaturization.
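
    A hedged sketch of how the quantitative index quoted above could be used to rank candidate geometries; the numbers are placeholders rather than the study's CFD results, and we assume here that a lower IPG is preferred.

    ```python
    # Hypothetical ranking of candidate pump geometries by the index of pump geometry,
    # IPG = (Reynolds shear stress) x (occupied volume). Values are placeholders.
    candidates = {
        "geometry_A": {"reynolds_shear_stress": 95.0, "occupied_volume": 3.1e-4},
        "geometry_B": {"reynolds_shear_stress": 70.0, "occupied_volume": 3.4e-4},
        "geometry_C": {"reynolds_shear_stress": 120.0, "occupied_volume": 2.9e-4},
    }

    def ipg(c):
        return c["reynolds_shear_stress"] * c["occupied_volume"]

    for name, c in sorted(candidates.items(), key=lambda kv: ipg(kv[1])):
        print(f"{name}: IPG = {ipg(c):.4f}")
    print("preferred geometry:", min(candidates, key=lambda n: ipg(candidates[n])))
    ```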

  18. Corpus of High School Academic Texts (COHAT): Data-Driven, Computer Assisted Discovery in Learning Academic English

    ERIC Educational Resources Information Center

    Bohát, Róbert; Rödlingová, Beata; Horáková, Nina

    2015-01-01

    Corpus of High School Academic Texts (COHAT), currently of 150,000+ words, aims to make academic language instruction a more data-driven and student-centered discovery learning as a special type of Computer-Assisted Language Learning (CALL), emphasizing students' critical thinking and metacognition. Since 2013, high school English as an additional…

  19. Data-Driven Learning: Taking the Computer out of the Equation

    ERIC Educational Resources Information Center

    Boulton, Alex

    2010-01-01

    Despite considerable research interest, data-driven learning (DDL) has not become part of mainstream teaching practice. It may be that technical aspects are too daunting for teachers and students, but there seems to be no reason why DDL in its early stages should not eliminate the computer from the equation by using prepared materials on…

  20. Health adaptation policy for climate vulnerable groups: a 'critical computational linguistics' analysis.

    PubMed

    Seidel, Bastian M; Bell, Erica

    2014-11-28

    Many countries are developing or reviewing national adaptation policy for climate change but the extent to which these meet the health needs of vulnerable groups has not been assessed. This study examines the adequacy of such policies for nine known climate-vulnerable groups: people with mental health conditions, Aboriginal people, culturally and linguistically diverse groups, aged people, people with disabilities, rural communities, children, women, and socioeconomically disadvantaged people. The study analyses an exhaustive sample of national adaptation policy documents from Annex 1 ('developed') countries of the United Nations Framework Convention on Climate Change: 20 documents from 12 countries. A 'critical computational linguistics' method was used involving novel software-driven quantitative mapping and traditional critical discourse analysis. The study finds that references to vulnerable groups are relatively little present or non-existent, as well as poorly connected to language about practical strategies and socio-economic contexts, both also little present. The conclusions offer strategies for developing policy that is better informed by a 'social determinants of health' definition of climate vulnerability, consistent with best practice in the literature and global policy prescriptions.

  1. Medical Impairment and Computational Reduction to a Single Whole Person Impairment (WPI) Rating Value

    NASA Astrophysics Data System (ADS)

    Artz, Jerry; Alchemy, John; Weilepp, Anne; Bongiovanni, Michael; Siddhartha, Kumar

    2014-03-01

    A medical, biophysics, and engineering collaboration has produced a standardized cloud-based application for creating automated WPI ratings. The project assigns numerical values to injuries/illnesses in accordance with the American Medical Association Guides to the Evaluation of Permanent Impairment, Fifth Edition, AMA Press handbook (with 63 medical contributors and 89 medical reviewers). The AMA Guide serves as the industry standard for assigning impairment values for 32 US states and 190 other countries. Clinical medical data are collected using a menu-driven user interface and computationally combined into a single numeric value. A medical doctor performs a biometric analysis and enters the quantitative data into a mobile device. The data is analyzed using proprietary validation algorithms, and a WPI Impairment rating is created. The findings are embedded into a formalized medicolegal report in a matter of minutes. This particular presentation will concentrate upon the WPI rating of the spine--cervical, thoracic, and lumbar. Both common rating techniques will be presented--i.e., Diagnosis Related Estimates (DRE) and Range of Motion (ROM).
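
    The abstract says that multiple impairment values are computationally combined into a single WPI number. The AMA Guides' Combined Values approach is commonly described by the pairwise rule C = A + B(1 - A) on decimal ratings; the sketch below assumes that rule and is not the application's proprietary validation algorithm.

    ```python
    # Hedged sketch of combining regional impairment ratings into one whole person impairment (WPI).
    # Assumes the commonly cited combined-values rule C = A + B*(1 - A); the application's
    # proprietary algorithms are not reproduced here.
    from functools import reduce

    def combine_pair(a, b):
        """Combine two impairment fractions (0..1) so the total never exceeds 100%."""
        return a + b * (1.0 - a)

    def whole_person_impairment(ratings_percent):
        """Combine a list of percentage ratings, largest first, into a single WPI percentage."""
        fractions = sorted((r / 100.0 for r in ratings_percent), reverse=True)
        return 100.0 * reduce(combine_pair, fractions, 0.0)

    print(f"WPI = {whole_person_impairment([28, 10, 5]):.0f}%")   # e.g. cervical, lumbar, thoracic ratings
    ```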

  2. Event- and Time-Driven Techniques Using Parallel CPU-GPU Co-processing for Spiking Neural Networks

    PubMed Central

    Naveros, Francisco; Garrido, Jesus A.; Carrillo, Richard R.; Ros, Eduardo; Luque, Niceto R.

    2017-01-01

    Modeling and simulating the neural structures that make up our central neural system is instrumental for deciphering the computational neural cues beneath. Higher levels of biological plausibility usually impose higher levels of complexity in mathematical modeling, from neural to behavioral levels. This paper focuses on overcoming the simulation problems (accuracy and performance) derived from using higher levels of mathematical complexity at a neural level. This study proposes different techniques for simulating neural models that hold incremental levels of mathematical complexity: leaky integrate-and-fire (LIF), adaptive exponential integrate-and-fire (AdEx), and Hodgkin-Huxley (HH) neural models (ranging from low to high neural complexity). The studied techniques are classified into two main families depending on how the neural-model dynamic evaluation is computed: the event-driven or the time-driven families. Whilst event-driven techniques pre-compile and store the neural dynamics within look-up tables, time-driven techniques compute the neural dynamics iteratively during the simulation time. We propose two modifications for the event-driven family: a look-up table recombination to better cope with the incremental neural complexity, together with better handling of the synchronous input activity. Regarding the time-driven family, we propose a modification in computing the neural dynamics: the bi-fixed-step integration method. This method automatically adjusts the simulation step size to better cope with the stiffness of the neural model dynamics running in CPU platforms. One version of this method is also implemented for hybrid CPU-GPU platforms. Finally, we analyze how the performance and accuracy of these modifications evolve with increasing levels of neural complexity. We also demonstrate how the proposed modifications, which constitute the main contribution of this study, systematically outperform the traditional event- and time-driven techniques under increasing levels of neural complexity. PMID:28223930
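
    As a hedged, minimal illustration of the two simulation families contrasted above (not the authors' implementation), the snippet below advances the same leaky integrate-and-fire neuron either with a fixed time step or by jumping analytically between input events; all parameters and input times are illustrative.

    ```python
    # Minimal LIF neuron: time-driven (fixed-step Euler) vs. event-driven (exact jump between events).
    import math

    TAU, V_REST, V_TH, V_RESET, W_SYN = 0.020, -0.070, -0.054, -0.070, 0.004   # s, V, V, V, V

    def time_driven(spike_times, t_end, dt=1e-4):
        v, t, out, pending = V_REST, 0.0, [], sorted(spike_times)
        while t < t_end:
            v += dt * (-(v - V_REST) / TAU)             # Euler step of dv/dt = -(v - V_rest)/tau
            while pending and pending[0] <= t:
                v += W_SYN; pending.pop(0)              # instantaneous synaptic kick
            if v >= V_TH:
                out.append(t); v = V_RESET
            t += dt
        return out

    def event_driven(spike_times, t_end):
        # The voltage can only cross threshold at an input event, so it suffices to
        # decay analytically between events and test the threshold after each kick.
        v, t, out = V_REST, 0.0, []
        for ts in sorted(spike_times) + [t_end]:
            v = V_REST + (v - V_REST) * math.exp(-(ts - t) / TAU)   # exact decay between events
            t = ts
            if ts < t_end:
                v += W_SYN
                if v >= V_TH:
                    out.append(t); v = V_RESET
        return out

    inputs = [0.005, 0.006, 0.007, 0.040, 0.041, 0.042, 0.043]
    print("time-driven spikes :", time_driven(inputs, 0.1))
    print("event-driven spikes:", event_driven(inputs, 0.1))
    ```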

  3. Using scattering theory to compute invariant manifolds and numerical results for the laser-driven Hénon-Heiles system.

    PubMed

    Blazevski, Daniel; Franklin, Jennifer

    2012-12-01

    Scattering theory is a convenient way to describe systems that are subject to time-dependent perturbations which are localized in time. Using scattering theory, one can compute time-dependent invariant objects for the perturbed system knowing the invariant objects of the unperturbed system. In this paper, we use scattering theory to give numerical computations of invariant manifolds appearing in laser-driven reactions. In this setting, invariant manifolds separate regions of phase space that lead to different outcomes of the reaction and can be used to compute reaction rates.

  4. Instrumentation: Software-Driven Instrumentation: The New Wave.

    ERIC Educational Resources Information Center

    Salit, M. L.; Parsons, M. L.

    1985-01-01

    Software-driven instrumentation makes measurements that demand a computer as an integral part of either control, data acquisition, or data reduction. The structure of such instrumentation, hardware requirements, and software requirements are discussed. Examples of software-driven instrumentation (such as wavelength-modulated continuum source…

  5. Molecular mechanics and dynamics characterization of an in silico mutated protein: a stand-alone lab module or support activity for in vivo and in vitro analyses of targeted proteins.

    PubMed

    Chiang, Harry; Robinson, Lucy C; Brame, Cynthia J; Messina, Troy C

    2013-01-01

    Over the past 20 years, the biological sciences have increasingly incorporated chemistry, physics, computer science, and mathematics to aid in the development and use of mathematical models. Such combined approaches have been used to address problems from protein structure-function relationships to the workings of complex biological systems. Computer simulations of molecular events can now be accomplished quickly and with standard computer technology. Also, simulation software is freely available for most computing platforms, and online support for the novice user is ample. We have therefore created a molecular dynamics laboratory module to enhance undergraduate student understanding of molecular events underlying organismal phenotype. This module builds on a previously described project in which students use site-directed mutagenesis to investigate functions of conserved sequence features in members of a eukaryotic protein kinase family. In this report, we detail the laboratory activities of an MD module that complements phenotypic outcomes by providing a hypothesis-driven and quantifiable measure of predicted structural changes caused by targeted mutations. We also present examples of analyses students may perform. These laboratory activities can be integrated with genetics or biochemistry experiments as described, but could also be used independently in any course that would benefit from a quantitative approach to protein structure-function relationships. Copyright © 2013 Wiley Periodicals, Inc.

  6. Application of solar max ACRIM data to analyze solar-driven climatic variability on Earth

    NASA Technical Reports Server (NTRS)

    Hoffert, M. I.

    1986-01-01

    Terrestrial climatic effects associated with solar variability have been proposed for at least a century, but could not be assessed quantitatively owing to observational uncertainties in solar flux variations. Measurements from 1980 to 1984 by the Active Cavity Radiometer Irradiance Monitor (ACRIM), capable of resolving fluctuations above the sensible atmosphere less than 0.1% of the solar constant, permit direct albeit preliminary assessments of solar forcing effects on global temperatures during this period. The global temperature response to ACRIM-measured fluctuations was computed from 1980 to 1985 using the NYU transient climate model, including thermal inertia effects of the world ocean, and the results were compared with observations of recent temperature trends. Monthly mean ACRIM-driven global surface temperature fluctuations computed with the climate model are an order of magnitude smaller, of order 0.01 C. In contrast, global mean surface temperature observations indicate an approx. 0.1 C increase during this period. Solar variability is therefore likely to have been a minor factor in global climate change during this period compared with variations in atmospheric albedo, greenhouse gases and internal self-induced oscillations. It was not possible to extend the applicability of the measured flux variations to longer periods since a possible correlation of luminosity with solar annual activity is not supported by statistical analysis. The continuous monitoring of solar flux by satellite-based instruments over timescales of 20 years or more, comparable to timescales for thermal relaxation of the oceans and of the solar cycle itself, is needed to resolve the question of long-term solar variation effects on climate.
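
    A minimal zero-dimensional energy-balance sketch (not the NYU transient climate model used in the study) illustrates why a roughly 0.1% irradiance fluctuation produces only a ~0.01 C-scale transient response once ocean thermal inertia is included; all parameter values below are round-number assumptions.

    ```python
    # Zero-dimensional energy balance: C dT/dt = dF(t) - lambda * T,
    # with dF from a fractional change in total solar irradiance. Illustrative parameters only.
    S0 = 1361.0            # W m^-2, solar constant
    ALBEDO = 0.30
    LAM = 1.2              # W m^-2 K^-1, climate feedback parameter (assumed)
    C = 4.2e8              # J m^-2 K^-1, ~100 m ocean mixed-layer heat capacity (assumed)

    def forcing(frac_change):
        """Radiative forcing from a fractional change in the solar constant."""
        return frac_change * S0 * (1.0 - ALBEDO) / 4.0

    def response(frac_change, years, dt_days=1.0):
        dF, T, dt = forcing(frac_change), 0.0, dt_days * 86400.0
        for _ in range(int(years * 365 / dt_days)):
            T += dt * (dF - LAM * T) / C
        return T

    # A sustained -0.1% dip (roughly the ACRIM-era variability) over one year:
    print(f"forcing  = {forcing(-0.001):+.3f} W m^-2")
    print(f"response = {response(-0.001, years=1.0):+.4f} K after 1 year")
    ```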

  7. Examining Data-Driven Decision Making in Private/Religious Schools

    ERIC Educational Resources Information Center

    Hanks, Jason Edward

    2011-01-01

    The purpose of this study was to investigate non-mandated data-driven decision making in private/religious schools. The school culture support of data use, teacher use of data, leader facilitation of using data, and the availability of data were investigated in three schools. A quantitative survey research design was used to explore the research…

  8. The smiling scan technique: Facially driven guided surgery and prosthetics.

    PubMed

    Pozzi, Alessandro; Arcuri, Lorenzo; Moy, Peter K

    2018-04-11

    To introduce a proof of concept technique and new integrated workflow to optimize the functional and esthetic outcome of the implant-supported restorations by means of a 3-dimensional (3D) facially-driven, digital assisted treatment plan. The Smiling Scan technique permits the creation of a virtual dental patient (VDP) showing a broad smile under static conditions. The patient is exposed to a cone beam computed tomography scan (CBCT), displaying a broad smile for the duration of the examination. Intraoral optical surface scanning (IOS) of the dental and soft tissue anatomy or extraoral optical surface scanning (EOS) of the study casts are achieved. The superimposition of the digital imaging and communications in medicine (DICOM) files with standard tessellation language (STL) files is performed using the virtual planning software program permitting the creation of a VDP. The smiling scan is an effective, easy to use, and low-cost technique to develop a more comprehensive and simplified facially driven computer-assisted treatment plan, allowing a prosthetically driven implant placement and the delivery of an immediate computer aided design (CAD) computer aided manufacturing (CAM) temporary fixed dental prostheses (CAD/CAM technology). Copyright © 2018 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.

  9. Out-of-equilibrium dynamics driven by localized time-dependent perturbations at quantum phase transitions

    NASA Astrophysics Data System (ADS)

    Pelissetto, Andrea; Rossini, Davide; Vicari, Ettore

    2018-03-01

    We investigate the quantum dynamics of many-body systems subject to local (i.e., restricted to a limited space region) time-dependent perturbations. If the system crosses a quantum phase transition, an off-equilibrium behavior is observed, even for a very slow driving. We show that, close to the transition, time-dependent quantities obey scaling laws. In first-order transitions, the scaling behavior is universal, and some scaling functions can be computed exactly. For continuous transitions, the scaling laws are controlled by the standard critical exponents and by the renormalization-group dimension of the perturbation at the transition. Our protocol can be implemented in existing relatively small quantum simulators, paving the way for a quantitative probe of the universal off-equilibrium scaling behavior, without the need to manipulate systems close to the thermodynamic limit.

  10. Invasion emerges from cancer cell adaptation to competitive microenvironments: Quantitative predictions from multiscale mathematical models

    PubMed Central

    Rejniak, Katarzyna A.; Gerlee, Philip

    2013-01-01

    In this review we summarize our recent efforts using mathematical modeling and computation to simulate cancer invasion, with a special emphasis on the tumor microenvironment. We consider cancer progression as a complex multiscale process and approach it with three single-cell based mathematical models that examine the interactions between tumor microenvironment and cancer cells at several scales. The models exploit distinct mathematical and computational techniques, yet they share core elements and can be compared and/or related to each other. The overall aim of using mathematical models is to uncover the fundamental mechanisms that lend cancer progression its direction towards invasion and metastasis. The models effectively simulate various modes of cancer cell adaptation to the microenvironment in a growing tumor. All three point to a general mechanism underlying cancer invasion: competition for adaptation between distinct cancer cell phenotypes, driven by a tumor microenvironment with scarce resources. These theoretical predictions pose an intriguing experimental challenge: test the hypothesis that invasion is an emergent property of cancer cell populations adapting to selective microenvironment pressure, rather than the culmination of cancer progression producing cells with the "invasive phenotype". In broader terms, we propose that fundamental insights into cancer can be achieved by experimentation interacting with theoretical frameworks provided by computational and mathematical modeling. PMID:18524624

  11. Computer-aided design of liposomal drugs: In silico prediction and experimental validation of drug candidates for liposomal remote loading.

    PubMed

    Cern, Ahuva; Barenholz, Yechezkel; Tropsha, Alexander; Goldblum, Amiram

    2014-01-10

    Previously we have developed and statistically validated Quantitative Structure Property Relationship (QSPR) models that correlate drugs' structural, physical and chemical properties as well as experimental conditions with the relative efficiency of remote loading of drugs into liposomes (Cern et al., J. Control. Release 160 (2012) 147-157). Herein, these models have been used to virtually screen a large drug database to identify novel candidate molecules for liposomal drug delivery. Computational hits were considered for experimental validation based on their predicted remote loading efficiency as well as additional considerations such as availability, recommended dose and relevance to the disease. Three compounds were selected for experimental testing which were confirmed to be correctly classified by our previously reported QSPR models developed with Iterative Stochastic Elimination (ISE) and k-Nearest Neighbors (kNN) approaches. In addition, 10 new molecules with known liposome remote loading efficiency that were not used by us in QSPR model development were identified in the published literature and employed as an additional model validation set. The external accuracy of the models was found to be as high as 82% or 92%, depending on the model. This study presents the first successful application of QSPR models for the computer-model-driven design of liposomal drugs. © 2013.
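
    A hedged sketch of the kNN classification step mentioned above; the descriptors, labels, and training data are simulated placeholders, not the authors' curated QSPR dataset.

    ```python
    # Hypothetical kNN classifier for "high" vs. "low" liposomal remote-loading efficiency.
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    X_train = rng.normal(size=(200, 4))       # columns: e.g. logP, pKa, amphiphilic moment, MW (assumed)
    y_train = (X_train[:, 0] + 0.5 * X_train[:, 1] > 0).astype(int)   # 1 = efficient loading (simulated)

    model = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
    model.fit(X_train, y_train)

    X_candidates = rng.normal(size=(3, 4))    # virtual-screening hits to classify
    print("predicted loading class:", model.predict(X_candidates))
    ```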

  12. Computer-aided design of liposomal drugs: in silico prediction and experimental validation of drug candidates for liposomal remote loading

    PubMed Central

    Cern, Ahuva; Barenholz, Yechezkel; Tropsha, Alexander; Goldblum, Amiram

    2014-01-01

    Previously we have developed and statistically validated Quantitative Structure Property Relationship (QSPR) models that correlate drugs’ structural, physical and chemical properties as well as experimental conditions with the relative efficiency of remote loading of drugs into liposomes (Cern et al., Journal of Controlled Release, 160 (2012) 147-157). Herein, these models have been used to virtually screen a large drug database to identify novel candidate molecules for liposomal drug delivery. Computational hits were considered for experimental validation based on their predicted remote loading efficiency as well as additional considerations such as availability, recommended dose and relevance to the disease. Three compounds were selected for experimental testing which were confirmed to be correctly classified by our previously reported QSPR models developed with Iterative Stochastic Elimination (ISE) and k-nearest neighbors (kNN) approaches. In addition, 10 new molecules with known liposome remote loading efficiency that were not used in QSPR model development were identified in the published literature and employed as an additional model validation set. The external accuracy of the models was found to be as high as 82% or 92%, depending on the model. This study presents the first successful application of QSPR models for the computer-model-driven design of liposomal drugs. PMID:24184343

  13. Human systems immunology: hypothesis-based modeling and unbiased data-driven approaches.

    PubMed

    Arazi, Arnon; Pendergraft, William F; Ribeiro, Ruy M; Perelson, Alan S; Hacohen, Nir

    2013-10-31

    Systems immunology is an emerging paradigm that aims at a more systematic and quantitative understanding of the immune system. Two major approaches have been utilized to date in this field: unbiased data-driven modeling to comprehensively identify molecular and cellular components of a system and their interactions; and hypothesis-based quantitative modeling to understand the operating principles of a system by extracting a minimal set of variables and rules underlying them. In this review, we describe applications of the two approaches to the study of viral infections and autoimmune diseases in humans, and discuss possible ways by which these two approaches can synergize when applied to human immunology. Copyright © 2012 Elsevier Ltd. All rights reserved.

  14. Objective, Quantitative, Data-Driven Assessment of Chemical Probes.

    PubMed

    Antolin, Albert A; Tym, Joseph E; Komianou, Angeliki; Collins, Ian; Workman, Paul; Al-Lazikani, Bissan

    2018-02-15

    Chemical probes are essential tools for understanding biological systems and for target validation, yet selecting probes for biomedical research is rarely based on objective assessment of all potential compounds. Here, we describe the Probe Miner: Chemical Probes Objective Assessment resource, capitalizing on the plethora of public medicinal chemistry data to empower quantitative, objective, data-driven evaluation of chemical probes. We assess >1.8 million compounds for their suitability as chemical tools against 2,220 human targets and dissect the biases and limitations encountered. Probe Miner represents a valuable resource to aid the identification of potential chemical probes, particularly when used alongside expert curation. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  15. Scalable metadata environments (MDE): artistically impelled immersive environments for large-scale data exploration

    NASA Astrophysics Data System (ADS)

    West, Ruth G.; Margolis, Todd; Prudhomme, Andrew; Schulze, Jürgen P.; Mostafavi, Iman; Lewis, J. P.; Gossmann, Joachim; Singh, Rajvikram

    2014-02-01

    Scalable Metadata Environments (MDEs) are an artistic approach for designing immersive environments for large-scale data exploration in which users interact with data by forming multiscale patterns that they alternately disrupt and reform. Developed and prototyped as part of an art-science research collaboration, we define an MDE as a 4D virtual environment structured by quantitative and qualitative metadata describing multidimensional data collections. Entire data sets (e.g. tens of millions of records) can be visualized and sonified at multiple scales and at different levels of detail so they can be explored interactively in real-time within MDEs. They are designed to reflect similarities and differences in the underlying data or metadata such that patterns can be visually/aurally sorted in an exploratory fashion by an observer who is not familiar with the details of the mapping from data to visual, auditory or dynamic attributes. While many approaches for visual and auditory data mining exist, MDEs are distinct in that they utilize qualitative and quantitative data and metadata to construct multiple interrelated conceptual coordinate systems. These "regions" function as conceptual lattices for scalable auditory and visual representations within virtual environments computationally driven by multi-GPU CUDA-enabled fluid dynamics systems.

  16. Optimization of Statistical Methods Impact on Quantitative Proteomics Data.

    PubMed

    Pursiheimo, Anna; Vehmas, Anni P; Afzal, Saira; Suomi, Tomi; Chand, Thaman; Strauss, Leena; Poutanen, Matti; Rokka, Anne; Corthals, Garry L; Elo, Laura L

    2015-10-02

    As tools for quantitative label-free mass spectrometry (MS) rapidly develop, a consensus about the best practices is not apparent. In the work described here we compared popular statistical methods for detecting differential protein expression from quantitative MS data, using both controlled experiments with known quantitative differences for specific proteins used as standards and "real" experiments where differences in protein abundance are not known a priori. Our results suggest that data-driven reproducibility-optimization can consistently produce reliable differential expression rankings for label-free proteome tools and is straightforward to apply.

  17. Empirical Performance Model-Driven Data Layout Optimization and Library Call Selection for Tensor Contraction Expressions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Qingda; Gao, Xiaoyang; Krishnamoorthy, Sriram

    Empirical optimizers like ATLAS have been very effective in optimizing computational kernels in libraries. The best choice of parameters such as tile size and degree of loop unrolling is determined by executing different versions of the computation. In contrast, optimizing compilers use a model-driven approach to program transformation. While the model-driven approach of optimizing compilers is generally orders of magnitude faster than ATLAS-like library generators, its effectiveness can be limited by the accuracy of the performance models used. In this paper, we describe an approach where a class of computations is modeled in terms of constituent operations that are empirically measured, thereby allowing modeling of the overall execution time. The performance model with empirically determined cost components is used to perform data layout optimization together with the selection of library calls and layout transformations in the context of the Tensor Contraction Engine, a compiler for a high-level domain-specific language for expressing computational models in quantum chemistry. The effectiveness of the approach is demonstrated through experimental measurements on representative computations from quantum chemistry.
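
    A minimal sketch of the empirically-parameterized cost-model idea described above; the operation names, sizes, counts, and candidate plans are illustrative placeholders and not those of the Tensor Contraction Engine.

    ```python
    # Sketch: predict the total execution time of a composite computation from empirically
    # measured costs of its constituent operations, then pick the cheaper evaluation plan.
    import time
    import numpy as np

    def measure(op, *args, repeats=5):
        """Empirically time one constituent operation (best of several runs, seconds)."""
        best = float("inf")
        for _ in range(repeats):
            t0 = time.perf_counter()
            op(*args)
            best = min(best, time.perf_counter() - t0)
        return best

    a, b = np.random.rand(200, 200), np.random.rand(200, 200)
    cost = {
        "matmul_200": measure(np.matmul, a, b),               # contraction kernel (GEMM call)
        "transpose_200": measure(np.ascontiguousarray, a.T),  # layout transformation
    }

    # Two hypothetical evaluation plans for the same contraction expression:
    plans = {
        "plan_A": {"matmul_200": 4, "transpose_200": 6},      # more layout transformations
        "plan_B": {"matmul_200": 5, "transpose_200": 1},      # one extra GEMM, single transpose
    }
    predicted = {p: sum(cost[op] * n for op, n in ops.items()) for p, ops in plans.items()}
    print("predicted times:", predicted)
    print("chosen plan:", min(predicted, key=predicted.get))
    ```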

  18. Computationally driven drug discovery meeting-3 - Verona (Italy): 4 - 6th of March 2014.

    PubMed

    Costantino, Gabriele

    2014-12-01

    The following article reports on the results and the outcome of a meeting organised at the Aptuit Auditorium in Verona (Italy), which highlighted the current applications of state-of-the-art computational science to drug design in Italy. The meeting, which had > 100 people in attendance, consisted of over 40 presentations and included keynote lectures given by world-renowned speakers. The topics covered included areas related to ligand- and structure-based design as well as library design and screening; the meeting also included discussion of chemometrics. The meeting also stressed the importance of public-private collaboration and reviewed the different approaches to computationally driven drug discovery taken within academia and industry. The meeting helped define the current position of state-of-the-art computational drug discovery in Italy, pointing out criticalities and assets. This kind of focused meeting is important in that it gives a restricted yet representative community of fellow professionals the opportunity to discuss current methodological approaches in depth and to provide future perspectives for computationally driven drug discovery.

  19. Computer system for definition of the quantitative geometry of musculature from CT images.

    PubMed

    Daniel, Matej; Iglic, Ales; Kralj-Iglic, Veronika; Konvicková, Svatava

    2005-02-01

    A computer system for quantitative determination of musculoskeletal geometry from computed tomography (CT) images has been developed. The system processes series of CT images to obtain a three-dimensional (3D) model of bony structures in which the effective muscle fibres can be interactively defined. The presented computer system has a flexible modular structure and is also suitable for educational purposes.

  20. A Projection Quality-Driven Tube Current Modulation Method in Cone-Beam CT for IGRT: Proof of Concept.

    PubMed

    Men, Kuo; Dai, Jianrong

    2017-12-01

    To develop a projection quality-driven tube current modulation method in cone-beam computed tomography for image-guided radiotherapy based on the prior attenuation information obtained by the planning computed tomography and then evaluate its effect on a reduction in the imaging dose. The QCKV-1 phantom with different thicknesses (0-400 mm) of solid water upon it was used to simulate different attenuation (μ). Projections were acquired with a series of tube current-exposure time product (mAs) settings, and a 2-dimensional contrast to noise ratio was analyzed for each projection to create a lookup table of mAs versus 2-dimensional contrast to noise ratio, μ. Before a patient underwent cone-beam computed tomography, the maximum attenuation within the 95% range of each projection angle (θ) was estimated from the planning computed tomography images. Then, a desired 2-dimensional contrast to noise ratio value was selected, and the mAs setting at θ was calculated with the lookup table of mAs versus 2-dimensional contrast to noise ratio and maximum attenuation. Three-dimensional cone-beam computed tomography images were reconstructed using the projections acquired with the selected mAs. The imaging dose was evaluated with a polymethyl methacrylate dosimetry phantom in terms of volume computed tomography dose index. Image quality was analyzed using a Catphan 503 phantom with an oval body annulus and a pelvis phantom. For the Catphan 503 phantom, the cone-beam computed tomography image obtained by the projection quality-driven tube current modulation method had a similar quality to that of conventional cone-beam computed tomography. However, the proposed method could reduce the imaging dose by 16% to 33% to achieve an equivalent contrast to noise ratio value. For the pelvis phantom, the structural similarity index was 0.992 with a dose reduction of 39.7% for the projection quality-driven tube current modulation method. The proposed method could reduce the additional dose to the patient while not degrading the image quality for cone-beam computed tomography. The projection quality-driven tube current modulation method could be especially beneficial to patients who undergo cone-beam computed tomography frequently during a treatment course.
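
    A hedged sketch of the selection step described above: interpolate a precomputed lookup table of mAs versus (attenuation, 2D CNR) at the per-angle maximum attenuation estimated from the planning CT, then pick the smallest mAs that meets a target CNR. Table values, the target CNR, and the attenuation profile are invented for illustration, not measured QCKV-1 data.

    ```python
    # Hypothetical projection-quality-driven mAs selection for CBCT.
    import numpy as np

    mu_grid = np.array([0.5, 1.0, 2.0, 4.0, 8.0])       # path attenuation (arbitrary units)
    mas_grid = np.array([0.2, 0.4, 0.8, 1.6, 3.2])      # available tube mAs settings
    # 2D CNR measured for each (mu, mAs) combination during calibration (placeholder numbers):
    cnr_table = np.array([[4.0, 6.0, 9.0, 13.0, 18.0],
                          [3.0, 5.0, 7.0, 10.0, 14.0],
                          [2.0, 3.0, 5.0, 7.0, 10.0],
                          [1.0, 2.0, 3.0, 5.0, 7.0],
                          [0.5, 1.0, 2.0, 3.0, 5.0]])

    def select_mas(mu_max, target_cnr):
        """Pick the smallest mAs whose interpolated CNR at mu_max meets the target."""
        cnr_at_mu = np.array([np.interp(mu_max, mu_grid, cnr_table[:, j]) for j in range(len(mas_grid))])
        ok = np.where(cnr_at_mu >= target_cnr)[0]
        return mas_grid[ok[0]] if ok.size else mas_grid[-1]

    # Per-angle maximum attenuation estimated from the planning CT (placeholder profile):
    angles = np.linspace(0, 350, 36)
    mu_max_per_angle = 3.0 + 2.0 * np.abs(np.sin(np.deg2rad(angles)))   # lateral views attenuate more
    schedule = {int(th): select_mas(mu, target_cnr=5.0) for th, mu in zip(angles, mu_max_per_angle)}
    print(dict(list(schedule.items())[:6]))
    ```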

  1. Use of Theory-Driven Research in Counseling: Investigating Three Counseling Psychology Journals from 1990 to 1999

    ERIC Educational Resources Information Center

    Karr, Carolyn A.; Larson, Lisa M.

    2005-01-01

    Three major journals in counseling psychology were sampled from 1990 to 1999 to assess the percentage of quantitative, empirical articles that were theory driven. Only 43% of the studies utilized a theory or model, and 57% predicted the relation between the variables, with few studies specifying the strength of the relation. Studies sampled in the…

  2. Near real-time forecasting for cholera decision making in Haiti after Hurricane Matthew

    PubMed Central

    Camacho, Anton; Grandesso, Francesco; Cohuet, Sandra; Lemaitre, Joseph C.; Rinaldo, Andrea

    2018-01-01

    Computational models of cholera transmission can provide objective insights into the course of an ongoing epidemic and aid decision making on allocation of health care resources. However, models are typically designed, calibrated and interpreted post-hoc. Here, we report the efforts of a team from academia, field research and humanitarian organizations to model in near real-time the Haitian cholera outbreak after Hurricane Matthew in October 2016, to assess risk and to quantitatively estimate the efficacy of a then ongoing vaccination campaign. A rainfall-driven, spatially-explicit meta-community model of cholera transmission was coupled to a data assimilation scheme for computing short-term projections of the epidemic in near real-time. The model was used to forecast cholera incidence for the months after the passage of the hurricane (October-December 2016) and to predict the impact of a planned oral cholera vaccination campaign. Our first projection, from October 29 to December 31, predicted the highest incidence in the departments of Grande Anse and Sud, accounting for about 45% of the total cases in Haiti. The projection included a second peak in cholera incidence in early December largely driven by heavy rainfall forecasts, confirming the urgency for rapid intervention. A second projection (from November 12 to December 31) used updated rainfall forecasts to estimate that 835 cases would be averted by vaccinations in Grande Anse (90% Prediction Interval [PI] 476-1284) and 995 in Sud (90% PI 508-2043). The experience gained by this modeling effort shows that state-of-the-art computational modeling and data-assimilation methods can produce informative near real-time projections of cholera incidence. Collaboration among modelers and field epidemiologists is indispensable to gain fast access to field data and to translate model results into operational recommendations for emergency management during an outbreak. Future efforts should thus draw together multi-disciplinary teams to ensure model outputs are appropriately based, interpreted and communicated. PMID:29768401

  3. Near real-time forecasting for cholera decision making in Haiti after Hurricane Matthew.

    PubMed

    Pasetto, Damiano; Finger, Flavio; Camacho, Anton; Grandesso, Francesco; Cohuet, Sandra; Lemaitre, Joseph C; Azman, Andrew S; Luquero, Francisco J; Bertuzzo, Enrico; Rinaldo, Andrea

    2018-05-01

    Computational models of cholera transmission can provide objective insights into the course of an ongoing epidemic and aid decision making on allocation of health care resources. However, models are typically designed, calibrated and interpreted post-hoc. Here, we report the efforts of a team from academia, field research and humanitarian organizations to model in near real-time the Haitian cholera outbreak after Hurricane Matthew in October 2016, to assess risk and to quantitatively estimate the efficacy of a then ongoing vaccination campaign. A rainfall-driven, spatially-explicit meta-community model of cholera transmission was coupled to a data assimilation scheme for computing short-term projections of the epidemic in near real-time. The model was used to forecast cholera incidence for the months after the passage of the hurricane (October-December 2016) and to predict the impact of a planned oral cholera vaccination campaign. Our first projection, from October 29 to December 31, predicted the highest incidence in the departments of Grande Anse and Sud, accounting for about 45% of the total cases in Haiti. The projection included a second peak in cholera incidence in early December largely driven by heavy rainfall forecasts, confirming the urgency for rapid intervention. A second projection (from November 12 to December 31) used updated rainfall forecasts to estimate that 835 cases would be averted by vaccinations in Grande Anse (90% Prediction Interval [PI] 476-1284) and 995 in Sud (90% PI 508-2043). The experience gained by this modeling effort shows that state-of-the-art computational modeling and data-assimilation methods can produce informative near real-time projections of cholera incidence. Collaboration among modelers and field epidemiologists is indispensable to gain fast access to field data and to translate model results into operational recommendations for emergency management during an outbreak. Future efforts should thus draw together multi-disciplinary teams to ensure model outputs are appropriately based, interpreted and communicated.
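
    A minimal single-community sketch of a rainfall-forced SIB (susceptible-infected-bacteria) cholera model, in the spirit of the spatially-explicit meta-community model described above but far simpler; the functional form of the rainfall term, all parameter values, and the rainfall series are assumptions for illustration.

    ```python
    # Minimal rainfall-driven SIB cholera model for one community (illustrative parameters only).
    import numpy as np

    def simulate(days, rain, N=1e5, beta=0.8, K=1e5, gamma=0.2, mu_b=0.25, theta=10.0, lam=1.0):
        S, I, B = N - 10.0, 10.0, 0.0
        infected = []
        for t in range(days):
            force = beta * B / (K + B)                      # dose-response force of infection
            new_inf = force * S
            dS = -new_inf
            dI = new_inf - gamma * I
            dB = theta * I - mu_b * B + lam * rain[t] * B   # rainfall enhances environmental contamination (assumed)
            S, I, B = S + dS, I + dI, B + dB
            infected.append(I)
        return np.array(infected)

    rng = np.random.default_rng(3)
    rain = rng.gamma(shape=0.3, scale=10.0, size=120) / 100.0    # daily rainfall proxy (assumed scaling)
    infected = simulate(120, rain)
    print("peak infected:", int(infected.max()), "on day", int(infected.argmax()))
    ```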

  4. Digital Suicide Prevention: Can Technology Become a Game-changer?

    PubMed

    Vahabzadeh, Arshya; Sahin, Ned; Kalali, Amir

    2016-01-01

    Suicide continues to be a leading cause of death and has been recognized as a significant public health issue. Rapid advances in data science can provide us with useful tools for suicide prevention, and help to dynamically assess suicide risk in quantitative data-driven ways. In this article, the authors highlight the most current international research in digital suicide prevention, including the use of machine learning, smartphone applications, and wearable sensor driven systems. The authors also discuss future opportunities for digital suicide prevention, and propose a novel Sensor-driven Mental State Assessment System.

  5. An Implemented Strategy for Campus Connectivity and Cooperative Computing.

    ERIC Educational Resources Information Center

    Halaris, Antony S.; Sloan, Lynda W.

    1989-01-01

    ConnectPac, a software package developed at Iona College to allow a computer user to access all services from a single personal computer, is described. ConnectPac uses mainframe computing to support a campus computing network, integrating personal and centralized computing into a menu-driven user environment. (Author/MLW)

  6. A Combined MIS/DS Course Uses Lecture Capture Technology to "Level the Playing Field" in Student Numeracy

    ERIC Educational Resources Information Center

    Popovich, Karen

    2012-01-01

    This paper describes the process taken to develop a quantitative-based and Excel™-driven course that combines "BOTH" Management Information Systems (MIS) and Decision Science (DS) modeling outcomes and lays the foundation for upper level quantitative courses such as operations management, finance and strategic management. In addition,…

  7. Teaching Note--"By Any Means Necessary!" Infusing Socioeconomic Justice Content into Quantitative Research Course Work

    ERIC Educational Resources Information Center

    Slayter, Elspeth M.

    2017-01-01

    Existing research suggests a majority of faculty include social justice content in research courses but not through the use of existing quantitative data for in-class activities that foster mastery of data analysis and interpretation and curiosity about social justice-related topics. By modeling data-driven dialogue and the deconstruction of…

  8. Computational Trials: Unraveling Motility Phenotypes, Progression Patterns, and Treatment Options for Glioblastoma Multiforme

    PubMed Central

    Raman, Fabio; Scribner, Elizabeth; Saut, Olivier; Wenger, Cornelia; Colin, Thierry; Fathallah-Shaykh, Hassan M.

    2016-01-01

    Glioblastoma multiforme is a malignant brain tumor with poor prognosis and high morbidity due to its invasiveness. Hypoxia-driven motility and concentration-driven motility are two mechanisms of glioblastoma multiforme invasion in the brain. The use of anti-angiogenic drugs has uncovered new progression patterns of glioblastoma multiforme associated with significant differences in overall survival. Here, we apply a mathematical model of glioblastoma multiforme growth and invasion in humans and design computational trials using agents that target angiogenesis, tumor replication rates, or motility. The findings link highly-dispersive, moderately-dispersive, and hypoxia-driven tumors to the patterns observed in glioblastoma multiforme treated by anti-angiogenesis, consisting of progression by Expanding FLAIR, Expanding FLAIR + Necrosis, and Expanding Necrosis, respectively. Furthermore, replication rate-reducing strategies (e.g. Tumor Treating Fields) appear to be effective in highly-dispersive and moderately-dispersive tumors but not in hypoxia-driven tumors. The latter may respond to motility-reducing agents. In a population computational trial, with all three phenotypes, a correlation was observed between the efficacy of the rate-reducing agent and the prolongation of overall survival times. This research highlights the potential applications of computational trials and supports new hypotheses on glioblastoma multiforme phenotypes and treatment options. PMID:26756205

  9. User-driven sampling strategies in image exploitation

    NASA Astrophysics Data System (ADS)

    Harvey, Neal; Porter, Reid

    2013-12-01

    Visual analytics and interactive machine learning both try to leverage the complementary strengths of humans and machines to solve complex data exploitation tasks. These fields overlap most significantly when training is involved: the visualization or machine learning tool improves over time by exploiting observations of the human-computer interaction. This paper focuses on one aspect of the human-computer interaction that we call user-driven sampling strategies. Unlike relevance feedback and active learning sampling strategies, where the computer selects which data to label at each iteration, we investigate situations where the user selects which data is to be labeled at each iteration. User-driven sampling strategies can emerge in many visual analytics applications but they have not been fully developed in machine learning. User-driven sampling strategies suggest new theoretical and practical research questions for both visualization science and machine learning. In this paper we identify and quantify the potential benefits of these strategies in a practical image analysis application. We find user-driven sampling strategies can sometimes provide significant performance gains by steering tools towards local minima that have lower error than tools trained with all of the data. In preliminary experiments we find these performance gains are particularly pronounced when the user is experienced with the tool and application domain.
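
    A small sketch of the distinction drawn above, under assumed conditions: on synthetic data, the computer-selected strategy labels the points the model is least certain about (classic active learning), while a crude proxy for a user-driven strategy labels points the current tool visibly gets wrong. The data set, model, and selection rules are all illustrative, not those of the paper.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X, y = make_classification(n_samples=3000, n_features=10, flip_y=0.05, random_state=1)
X_pool, X_test, y_pool, y_test = train_test_split(X, y, test_size=0.33, random_state=1)
seed_idx = rng.choice(len(X_pool), size=20, replace=False).tolist()   # small initial labeled set

def fit(idx):
    return LogisticRegression(max_iter=1000).fit(X_pool[idx], y_pool[idx])

def run(select, rounds=15, batch=10):
    """Iteratively label points chosen by `select`, retrain, and report test accuracy."""
    labeled = list(seed_idx)
    for _ in range(rounds):
        model = fit(labeled)
        remaining = np.setdiff1d(np.arange(len(X_pool)), labeled)
        labeled.extend(select(model, remaining, batch))
    return float(fit(labeled).score(X_test, y_test))

def machine_uncertainty(model, remaining, batch):
    # classic active learning: the computer picks the least-certain points
    margin = np.abs(model.predict_proba(X_pool[remaining])[:, 1] - 0.5)
    return remaining[np.argsort(margin)[:batch]].tolist()

def user_driven(model, remaining, batch):
    # crude proxy for an analyst: label points the current tool visibly gets wrong
    wrong = remaining[model.predict(X_pool[remaining]) != y_pool[remaining]]
    picked = wrong if len(wrong) >= batch else remaining
    return picked[:batch].tolist()

print("computer-selected (uncertainty):", run(machine_uncertainty))
print("user-driven (error spotting)   :", run(user_driven))
```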

  10. An Observation-Driven Agent-Based Modeling and Analysis Framework for C. elegans Embryogenesis.

    PubMed

    Wang, Zi; Ramsey, Benjamin J; Wang, Dali; Wong, Kwai; Li, Husheng; Wang, Eric; Bao, Zhirong

    2016-01-01

    With cutting-edge live microscopy and image analysis, biologists can now systematically track individual cells in complex tissues and quantify cellular behavior over extended time windows. Computational approaches that utilize the systematic and quantitative data are needed to understand how cells interact in vivo to give rise to the different cell types and 3D morphology of tissues. An agent-based, minimum descriptive modeling and analysis framework is presented in this paper to study C. elegans embryogenesis. The framework is designed to incorporate the large amounts of experimental observations on cellular behavior and reserve data structures/interfaces that allow regulatory mechanisms to be added as more insights are gained. Observed cellular behaviors are organized into lineage identity, timing and direction of cell division, and path of cell movement. The framework also includes global parameters such as the eggshell and a clock. Division and movement behaviors are driven by statistical models of the observations. Data structures/interfaces are reserved for gene list, cell-cell interaction, cell fate and landscape, and other global parameters until the descriptive model is replaced by a regulatory mechanism. This approach provides a framework to handle the ongoing experiments of single-cell analysis of complex tissues where mechanistic insights lag data collection and need to be validated on complex observations.

  11. An Appraisal of the Classic Forest Succession Paradigm with the Shade Tolerance Index

    PubMed Central

    Lienard, Jean; Florescu, Ionut; Strigul, Nikolay

    2015-01-01

    In this paper we revisit the classic theory of forest succession that relates shade tolerance and species replacement and assess its validity to understand patch-mosaic patterns of forested ecosystems of the USA. We introduce a macroscopic parameter called the “shade tolerance index” and compare it to the classic continuum index in southern Wisconsin forests. We exemplify shade tolerance driven succession in White Pine-Eastern Hemlock forests using computer simulations and analyzing approximated chronosequence data from the USDA FIA forest inventory. We describe this parameter across the last 50 years in the ecoregions of mainland USA, and demonstrate that it does not correlate with the usual macroscopic characteristics of stand age, biomass, basal area, and biodiversity measures. We characterize the dynamics of shade tolerance index using transition matrices and delimit geographical areas based on the relevance of shade tolerance to explain forest succession. We conclude that shade tolerance driven succession is linked to climatic variables and can be considered as a primary driving factor of forest dynamics mostly in central-north and northeastern areas in the USA. Overall, the shade tolerance index constitutes a new quantitative approach that can be used to understand and predict succession of forested ecosystems and biogeographic patterns. PMID:25658092
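
    The transition-matrix analysis mentioned above can be illustrated with a few lines of NumPy: estimate a row-stochastic matrix between discretized shade-tolerance classes from re-surveyed plots and extract its stationary distribution. The class labels and plot data below are made up; only the mechanics of the calculation are intended.

```python
import numpy as np

# hypothetical re-survey data: shade-tolerance class of each plot at two inventory dates
# classes: 0 = intolerant, 1 = mid-tolerant, 2 = tolerant (illustrative discretization)
before = np.array([0, 0, 1, 2, 1, 0, 2, 2, 1, 0, 1, 2, 0, 1, 2, 2])
after  = np.array([1, 0, 2, 2, 1, 1, 2, 1, 2, 0, 2, 2, 1, 1, 2, 2])

n = 3
counts = np.zeros((n, n))
for a, b in zip(before, after):
    counts[a, b] += 1
P = counts / counts.sum(axis=1, keepdims=True)   # row-stochastic transition matrix

# stationary distribution: left eigenvector of P associated with eigenvalue 1
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
pi = pi / pi.sum()

print("transition matrix:\n", np.round(P, 2))
print("long-run class distribution:", np.round(pi, 2))
```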

  12. Magnetically-actuated artificial cilia for microfluidic propulsion.

    PubMed

    Khaderi, S N; Craus, C B; Hussong, J; Schorr, N; Belardi, J; Westerweel, J; Prucker, O; Rühe, J; den Toonder, J M J; Onck, P R

    2011-06-21

    In this paper we quantitatively analyse the performance of magnetically-driven artificial cilia for lab-on-a-chip applications. The artificial cilia are fabricated using thin polymer films with embedded magnetic nano-particles and their deformation is studied under different external magnetic fields and flows. A coupled magneto-mechanical solid-fluid model that accurately captures the interaction between the magnetic field, cilia and fluid is used to simulate the cilia motion. The elastic and magnetic properties of the cilia are obtained by fitting the results of the computational model to the experimental data. The performance of the artificial cilia with a non-uniform cross-section is characterised using the numerical model for two channel configurations that are of practical importance: an open-loop and a closed-loop channel. We predict that the flow and pressure head generated by the artificial cilia can be as high as 18 microlitres per minute and 3 mm of water, respectively. We also study the effect of metachronal waves on the flow generated and show that the fluid propelled increases drastically compared to synchronously beating cilia, and is unidirectional. This increase is significant even when the phase difference between adjacent cilia is small. The obtained results provide guidelines for the optimal design of magnetically-driven artificial cilia for microfluidic propulsion.

  13. Flux-driven turbulence GDB simulations of the IWL Alcator C-Mod L-mode edge compared with experiment

    NASA Astrophysics Data System (ADS)

    Francisquez, Manaure; Zhu, Ben; Rogers, Barrett

    2017-10-01

    Prior to predicting confinement regime transitions in tokamaks, one may need an accurate description of L-mode profiles and turbulence properties. These features determine the heat-flux width upon which wall integrity depends, a topic of major interest for ITER-relevant research. To this end our work uses the GDB model to simulate the Alcator C-Mod edge and contributes support for its use in studying critical edge phenomena in current and future tokamaks. We carried out 3D electromagnetic flux-driven two-fluid turbulence simulations of inner wall limited (IWL) C-Mod shots spanning closed and open flux surfaces. These simulations are compared with gas puff imaging (GPI) and mirror Langmuir probe (MLP) data, examining global features and statistical properties of turbulent dynamics. GDB reproduces important qualitative aspects of the C-Mod edge regarding global density and temperature profiles within reasonable margins, and although the statistics of the simulated turbulence follow similar quantitative trends, questions remain about the code's ability to predict quantities such as the autocorrelation time exactly. A proposed breakpoint in the near-SOL pressure, and the posited separation between drift and ballooning dynamics that it represents, are also examined. This work was supported by DOE-SC-0010508. This research used resources of the National Energy Research Scientific Computing Center (NERSC).
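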

  14. Direct evidence for EMIC wave scattering of relativistic electrons in space: EMIC-Driven Electron Losses in Space

    DOE PAGES

    Zhang, X. -J.; Li, W.; Ma, Q.; ...

    2016-07-01

    Electromagnetic ion cyclotron (EMIC) waves have been proposed to cause efficient losses of highly relativistic (>1 MeV) electrons via gyroresonant interactions. Simultaneous observations of EMIC waves and equatorial electron pitch angle distributions, which can be used to directly quantify the EMIC wave scattering effect, are still very limited, however. In the present study, we evaluate the effect of EMIC waves on pitch angle scattering of ultrarelativistic (>1 MeV) electrons during the main phase of a geomagnetic storm, when intense EMIC wave activity was observed in situ (in the plasma plume region with high plasma density) on both Van Allen Probes. EMIC waves captured by Time History of Events and Macroscale Interactions during Substorms (THEMIS) probes and on the ground across the Canadian Array for Real-time Investigations of Magnetic Activity (CARISMA) are also used to infer their magnetic local time (MLT) coverage. From the observed EMIC wave spectra and local plasma parameters, we compute wave diffusion rates and model the evolution of electron pitch angle distributions. In conclusion, by comparing model results with local observations of pitch angle distributions, we show direct, quantitative evidence of EMIC wave-driven relativistic electron losses in the Earth’s outer radiation belt.

  15. Big-Data-Driven Stem Cell Science and Tissue Engineering: Vision and Unique Opportunities.

    PubMed

    Del Sol, Antonio; Thiesen, Hans J; Imitola, Jaime; Carazo Salas, Rafael E

    2017-02-02

    Achieving the promises of stem cell science to generate precise disease models and designer cell samples for personalized therapeutics will require harnessing pheno-genotypic cell-level data quantitatively and predictively in the lab and clinic. Those requirements could be met by developing a Big-Data-driven stem cell science strategy and community. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Comparison of the performance of tracer kinetic model-driven registration for dynamic contrast enhanced MRI using different models of contrast enhancement.

    PubMed

    Buonaccorsi, Giovanni A; Roberts, Caleb; Cheung, Sue; Watson, Yvonne; O'Connor, James P B; Davies, Karen; Jackson, Alan; Jayson, Gordon C; Parker, Geoff J M

    2006-09-01

    The quantitative analysis of dynamic contrast-enhanced (DCE) magnetic resonance imaging (MRI) data is subject to model fitting errors caused by motion during the time-series data acquisition. However, the time-varying features that occur as a result of contrast enhancement can confound motion correction techniques based on conventional registration similarity measures. We have therefore developed a heuristic, locally controlled tracer kinetic model-driven registration procedure, in which the model accounts for contrast enhancement, and applied it to the registration of abdominal DCE-MRI data at high temporal resolution. Using severely motion-corrupted data sets that had been excluded from analysis in a clinical trial of an antiangiogenic agent, we compared the results obtained when using different models to drive the tracer kinetic model-driven registration with those obtained when using a conventional registration against the time series mean image volume. Using tracer kinetic model-driven registration, it was possible to improve model fitting by reducing the sum of squared errors but the improvement was only realized when using a model that adequately described the features of the time series data. The registration against the time series mean significantly distorted the time series data, as did tracer kinetic model-driven registration using a simpler model of contrast enhancement. When an appropriate model is used, tracer kinetic model-driven registration influences motion-corrupted model fit parameter estimates and provides significant improvements in localization in three-dimensional parameter maps. This has positive implications for the use of quantitative DCE-MRI for example in clinical trials of antiangiogenic or antivascular agents.
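
    To make the idea of a tracer-kinetic cost function concrete, the sketch below fits a standard Tofts enhancement model to a single simulated voxel time course and reports the sum of squared errors, which is the quantity a model-driven registration would try to reduce for motion-corrupted frames. The arterial input function, parameter values, and noise level are assumptions for illustration, not the paper's data.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.integrate import trapezoid

t = np.linspace(0.0, 5.0, 60)          # minutes, high temporal resolution
aif = 6.0 * t * np.exp(-t)             # toy arterial input function Cp(t)

def tofts(t, Ktrans, ve):
    """Standard Tofts model: Ct(t) = Ktrans * int_0^t Cp(u) exp(-Ktrans (t - u) / ve) du."""
    Ct = np.zeros_like(t)
    for i, ti in enumerate(t):
        u = t[: i + 1]
        Ct[i] = Ktrans * trapezoid(aif[: i + 1] * np.exp(-Ktrans * (ti - u) / ve), u)
    return Ct

# a motion-free voxel time course; the fit residual (sum of squared errors) is what a
# model-driven registration would try to reduce when correcting motion-corrupted frames
signal = tofts(t, 0.15, 0.30) + np.random.default_rng(2).normal(0, 0.01, t.size)
(Ktrans, ve), _ = curve_fit(tofts, t, signal, p0=[0.05, 0.2],
                            bounds=([1e-4, 1e-3], [2.0, 1.0]))
sse = float(np.sum((tofts(t, Ktrans, ve) - signal) ** 2))
print(f"Ktrans = {Ktrans:.3f} /min, ve = {ve:.3f}, SSE = {sse:.5f}")
```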

  17. Numerical Simulation and Quantitative Uncertainty Assessment of Microchannel Flow

    NASA Astrophysics Data System (ADS)

    Debusschere, Bert; Najm, Habib; Knio, Omar; Matta, Alain; Ghanem, Roger; Le Maitre, Olivier

    2002-11-01

    This study investigates the effect of uncertainty in physical model parameters on computed electrokinetic flow of proteins in a microchannel with a potassium phosphate buffer. The coupled momentum, species transport, and electrostatic field equations give a detailed representation of electroosmotic and pressure-driven flow, including sample dispersion mechanisms. The chemistry model accounts for pH-dependent protein labeling reactions as well as detailed buffer electrochemistry in a mixed finite-rate/equilibrium formulation. To quantify uncertainty, the governing equations are reformulated using a pseudo-spectral stochastic methodology, which uses polynomial chaos expansions to describe uncertain/stochastic model parameters, boundary conditions, and flow quantities. Integration of the resulting equations for the spectral mode strengths gives the evolution of all stochastic modes for all variables. Results show the spatiotemporal evolution of uncertainties in predicted quantities and highlight the dominant parameters contributing to these uncertainties during various flow phases. This work is supported by DARPA.
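
    The pseudo-spectral idea can be shown in one dimension: project a toy model output onto probabilists' Hermite polynomials of a standard normal uncertain parameter using Gauss-Hermite quadrature, then read the mean and variance off the spectral mode strengths. The toy model and truncation order below are assumptions; the study's coupled flow equations are far richer.

```python
import numpy as np
from math import factorial
from numpy.polynomial import hermite_e as He

def model(z):
    """Toy observable of an uncertain parameter k = k0 * exp(sigma * z), with z ~ N(0, 1)."""
    return 1.0 / (1.0 + np.exp(0.3 * z))

# Gauss-Hermite (probabilists') quadrature: sum w_i f(x_i) ~ E[f(Z)] after normalization
nodes, weights = He.hermegauss(40)
weights = weights / np.sqrt(2.0 * np.pi)

order = 5
coeffs = [np.sum(weights * model(nodes) * He.hermeval(nodes, [0] * n + [1])) / factorial(n)
          for n in range(order + 1)]            # <f, He_n> / E[He_n^2], with E[He_n^2] = n!

mean_pce = coeffs[0]                            # zeroth mode is the mean
var_pce = sum(factorial(n) * coeffs[n] ** 2 for n in range(1, order + 1))

z = np.random.default_rng(3).standard_normal(200_000)   # Monte Carlo check
print(f"PCE: mean = {mean_pce:.4f}, var = {var_pce:.6f}")
print(f"MC : mean = {model(z).mean():.4f}, var = {model(z).var():.6f}")
```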

  18. Investigating Uncertainty and Sensitivity in Integrated, Multimedia Environmental Models: Tools for FRAMES-3MRA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Babendreier, Justin E.; Castleton, Karl J.

    2005-08-01

    Elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-medium constructs driven by a unique set of site-specific data. Quantitative assessment of integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a comparative approach using several techniques, coupled with sufficient computational power. The Framework for Risk Analysis in Multimedia Environmental Systems - Multimedia, Multipathway, and Multireceptor Risk Assessment (FRAMES-3MRA) is an important software model being developed by the United States Environmental Protection Agency for use in risk assessment of hazardous waste management facilities. The 3MRA modeling system includes a set of 17 science modules that collectively simulate release, fate and transport, exposure, and risk associated with hazardous contaminants disposed of in land-based waste management units (WMU).

  19. Prediction of binary nanoparticle superlattices from soft potentials

    DOE PAGES

    Horst, Nathan; Travesset, Alex

    2016-01-07

    Driven by the hypothesis that a sufficiently continuous short-ranged potential is able to account for shell flexibility and phonon modes and therefore provides a more realistic description of nanoparticle interactions than a hard sphere model, we compute the solid phase diagram of particles of different radii interacting with an inverse power law potential. From a pool of 24 candidate lattices, the free energy is optimized with respect to additional internal parameters and the p-exponent, determining the short-range properties of the potential, is varied between p = 12 and p = 6. The phase diagrams contain the phases found in ongoing self-assembly experiments, including DNA programmable self-assembly and nanoparticles with capping ligands assembled by evaporation from an organic solvent. Thus, the resulting phase diagrams can be mapped quantitatively to existing experiments as a function of only two parameters: Nanoparticle radius ratio (γ) and softness asymmetry.

  20. High-Dimensional Mutant and Modular Thermodynamic Cycles, Molecular Switching, and Free Energy Transduction

    PubMed Central

    Carter, Charles W.

    2017-01-01

    Understanding how distinct parts of proteins produce coordinated behavior has driven and continues to drive advances in protein science and enzymology. However, despite consensus about the conceptual basis for allostery, the idiosyncratic nature of allosteric mechanisms resists general approaches. Computational methods can identify conformational transition states from structural changes, revealing common switching mechanisms that impose multistate behavior. Thermodynamic cycles use factorial perturbations to measure coupling energies between side chains in molecular switches that mediate shear during domain motion. Such cycles have now been complemented by modular cycles that measure energetic coupling between separable domains. For one model system, energetic coupling between domains has been shown to be quantitatively equivalent to that between dynamic side chains. Linkages between domain motion, switching residues, and catalysis make nucleoside triphosphate hydrolysis conditional on domain movement, confirming an essential yet neglected aspect of free energy transduction and suggesting the potential generality of these studies. PMID:28375734

  1. Computational methods in the development of a knowledge-based system for the prediction of solid catalyst performance.

    PubMed

    Procelewska, Joanna; Galilea, Javier Llamas; Clerc, Frederic; Farrusseng, David; Schüth, Ferdi

    2007-01-01

    The objective of this work is the construction of a correlation between characteristics of heterogeneous catalysts, encoded in a descriptor vector, and their experimentally measured performances in the propene oxidation reaction. In this paper the key issue in the modeling process, namely the selection of adequate input variables, is explored. Several data-driven feature selection strategies were applied in order to estimate the differences in variance and information content of the various attributes and to compare their relative importance. Quantitative property-activity relationship techniques using probabilistic neural networks were used to create various semi-empirical models. Finally, a robust classification model was obtained that assigns selected attributes of solid compounds, provided as input, to an appropriate performance class in the model reaction. It became evident that mathematical support for the primary attribute set proposed by chemists can be highly desirable.
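
    As a generic illustration of such data-driven feature selection (not the descriptors or models used in the paper), the sketch below ranks a few hypothetical catalyst descriptors by mutual information with a performance class and by random-forest importance.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(4)
n = 300
# hypothetical descriptor vector for each solid catalyst (names and values are made up)
X = pd.DataFrame({
    "valence_electrons": rng.normal(5, 1, n),
    "ionic_radius":      rng.normal(0.8, 0.1, n),
    "electronegativity": rng.normal(2.0, 0.3, n),
    "surface_area":      rng.normal(50, 15, n),
})
# performance class in a model test reaction (toy rule plus noise)
y = ((X["electronegativity"] * X["surface_area"] + rng.normal(0, 8, n)) > 100).astype(int)

mi = mutual_info_classif(X, y, random_state=0)
rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)

for name, a, b in zip(X.columns, mi, rf.feature_importances_):
    print(f"{name:18s}  mutual info = {a:.3f}   RF importance = {b:.3f}")
```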

  2. Weld pool development during GTA and laser beam welding of Type 304 stainless steel; Part II-experimental correlation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zacharia, T.; David, S.A.; Vitek, J.M.

    1989-12-01

    In part I of the paper, the results of the heat flow and the fluid flow analysis were presented. Here, in Part II of the paper, predictions of the computational model are verified by comparing the numerically predicted and experimentally observed fusion zone size and shape. Stationary gas tungsten arc and laser beam welds were made on Type 304 stainless steel for different times to provide a variety of solidification conditions such as cooling rate and temperature gradient. Calculated temperatures and cooling rates are correlated with the experimentally observed fusion zone structure. In addition, the effect of sulfur on GTA weld penetration was quantitatively evaluated by considering two heats of 304 stainless steel containing 90 and 240 ppm sulfur. Sulfur, as expected, increased the depth/width ratio by altering the surface tension gradient driven flow in the weld pool.

  3. Normalization is a general neural mechanism for context-dependent decision making

    PubMed Central

    Louie, Kenway; Khaw, Mel W.; Glimcher, Paul W.

    2013-01-01

    Understanding the neural code is critical to linking brain and behavior. In sensory systems, divisive normalization seems to be a canonical neural computation, observed in areas ranging from retina to cortex and mediating processes including contrast adaptation, surround suppression, visual attention, and multisensory integration. Recent electrophysiological studies have extended these insights beyond the sensory domain, demonstrating an analogous algorithm for the value signals that guide decision making, but the effects of normalization on choice behavior are unknown. Here, we show that choice models using normalization generate significant (and classically irrational) choice phenomena driven by either the value or number of alternative options. In value-guided choice experiments, both monkey and human choosers show novel context-dependent behavior consistent with normalization. These findings suggest that the neural mechanism of value coding critically influences stochastic choice behavior and provide a generalizable quantitative framework for examining context effects in decision making. PMID:23530203
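
    The core computation is easy to state: divide each option's value by a common term that grows with the summed value of all options, then map the normalized values to choice probabilities. The minimal sketch below (illustrative parameters, not fitted to the paper's data) shows how adding a low-value third option changes the relative preference between the first two, i.e., a context effect.

```python
import numpy as np

def choice_probs(values, sigma=1.0, beta=4.0):
    """Divisively normalize option values, then map them to choice probabilities (softmax)."""
    v = np.asarray(values, dtype=float)
    normalized = v / (sigma + v.sum())        # each value divided by the summed context
    e = np.exp(beta * normalized)
    return e / e.sum()

two_options = choice_probs([10.0, 8.0])
three_options = choice_probs([10.0, 8.0, 6.0])   # add a low-value distractor

print("P(A)/P(B) with 2 options  :", round(two_options[0] / two_options[1], 3))
print("P(A)/P(B) with distractor :", round(three_options[0] / three_options[1], 3))
# the ratio changes when the distractor is present -- a context effect that violates
# independence of irrelevant alternatives, as in the normalization account
```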

  4. Prediction of Binary Nanoparticle Superlattices from Soft Potentials

    NASA Astrophysics Data System (ADS)

    Horst, Nathan; Travesset, Alex

    Driven by the hypothesis that a sufficiently continuous short-ranged potential is able to account for shell flexibility and phonon modes and therefore provides a more realistic description of nanoparticle interactions than a hard sphere model, we compute the solid phase diagram of particles of different radii interacting with an inverse power law potential. We explore 24 candidate lattices where the p-exponent, determining the short-range properties of the potential, is varied between p=12 and p=6, and optimize the free energy with respect to additional internal parameters. The phase diagrams contain the phases found in ongoing self-assembly experiments, including DNA programmable self-assembly and nanoparticles with capping ligands assembled by evaporation from an organic solvent. The resulting phase diagrams can be mapped quantitatively to existing experiments as a function of only two parameters: nanoparticle radius ratio (γ) and softness asymmetry (SA). Supported by DOE under Contract Number DE-AC02-07CH11358.

  5. Use of Airport Noise Complaint Files to Improve Understanding of Community Response to Aircraft Noise

    NASA Technical Reports Server (NTRS)

    Fidell, Sanford; Howe, Richard

    1998-01-01

    This study assessed the feasibility of using complaint information archived by modern airport monitoring systems to conduct quantitative analyses of the causes of aircraft noise complaints and their relationship to noise-induced annoyance. It was found that all computer-based airport monitoring systems provide at least rudimentary tools for performing database searches by complainant name, address, date, time of day, and types of aircraft and complaints. Analyses of such information can provide useful information about longstanding concerns, such as the extent to which complaint rates are driven by objectively measurable aspects of aircraft operations; the degree to which changes in complaint rates can be predicted prior to implementation of noise mitigation measures; and the degree to which aircraft complaint information can be used to simplify and otherwise improve prediction of the prevalence of noise-induced annoyance in communities.

  6. Prediction of binary nanoparticle superlattices from soft potentials

    NASA Astrophysics Data System (ADS)

    Horst, Nathan; Travesset, Alex

    2016-01-01

    Driven by the hypothesis that a sufficiently continuous short-ranged potential is able to account for shell flexibility and phonon modes and therefore provides a more realistic description of nanoparticle interactions than a hard sphere model, we compute the solid phase diagram of particles of different radii interacting with an inverse power law potential. From a pool of 24 candidate lattices, the free energy is optimized with respect to additional internal parameters and the p-exponent, determining the short-range properties of the potential, is varied between p = 12 and p = 6. The phase diagrams contain the phases found in ongoing self-assembly experiments, including DNA programmable self-assembly and nanoparticles with capping ligands assembled by evaporation from an organic solvent. The resulting phase diagrams can be mapped quantitatively to existing experiments as a function of only two parameters: Nanoparticle radius ratio (γ) and softness asymmetry.

  7. A review of models of fluctuating protrusion and retraction patterns at the leading edge of motile cells.

    PubMed

    Ryan, Gillian L; Watanabe, Naoki; Vavylonis, Dimitrios

    2012-04-01

    A characteristic feature of motile cells as they undergo a change in motile behavior is the development of fluctuating exploratory motions of the leading edge, driven by actin polymerization. We review quantitative models of these protrusion and retraction phenomena. Theoretical studies have been motivated by advances in experimental and computational methods that allow controlled perturbations, single molecule imaging, and analysis of spatiotemporal correlations in microscopic images. To explain oscillations and waves of the leading edge, most theoretical models propose nonlinear interactions and feedback mechanisms among different components of the actin cytoskeleton system. These mechanisms include curvature-sensing membrane proteins, myosin contraction, and autocatalytic biochemical reaction kinetics. We discuss how the combination of experimental studies with modeling promises to quantify the relative importance of these biochemical and biophysical processes at the leading edge and to evaluate their generality across cell types and extracellular environments. Copyright © 2012 Wiley Periodicals, Inc.

  8. Inactivity periods and postural change speed can explain atypical postural change patterns of Caenorhabditis elegans mutants.

    PubMed

    Fukunaga, Tsukasa; Iwasaki, Wataru

    2017-01-19

    With rapid advances in genome sequencing and editing technologies, systematic and quantitative analysis of animal behavior is expected to be another key to facilitating data-driven behavioral genetics. The nematode Caenorhabditis elegans is a model organism in this field. Several video-tracking systems are available for automatically recording behavioral data for the nematode, but computational methods for analyzing these data are still under development. In this study, we applied the Gaussian mixture model-based binning method to time-series postural data for 322 C. elegans strains. We revealed that the occurrence patterns of the postural states and the transition patterns among these states have a relationship as expected, and such a relationship must be taken into account to identify strains with atypical behaviors that are different from those of wild type. Based on this observation, we identified several strains that exhibit atypical transition patterns that cannot be fully explained by their occurrence patterns of postural states. Surprisingly, we found that two simple factors-overall acceleration of postural movement and elimination of inactivity periods-explained the behavioral characteristics of strains with very atypical transition patterns; therefore, computational analysis of animal behavior must be accompanied by evaluation of the effects of these simple factors. Finally, we found that the npr-1 and npr-3 mutants have similar behavioral patterns that were not predictable by sequence homology, proving that our data-driven approach can reveal the functions of genes that have not yet been characterized. We propose that elimination of inactivity periods and overall acceleration of postural change speed can explain behavioral phenotypes of strains with very atypical postural transition patterns. Our methods and results constitute guidelines for effectively finding strains that show "truly" interesting behaviors and systematically uncovering novel gene functions by bioimage-informatic approaches.
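
    A compact sketch of the binning-plus-transition analysis, on synthetic posture features rather than the authors' data: fit a Gaussian mixture, discretize each frame into a postural state, and compare the occurrence pattern with the state-to-state transition matrix.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)
# synthetic posture features for one animal, drawn from 3 latent postural states
means = np.array([[0.0, 0.0], [2.5, 0.5], [-1.5, 2.0]])
states_true = rng.integers(0, 3, size=2000)
frames = means[states_true] + rng.normal(0, 0.3, size=(2000, 2))

gmm = GaussianMixture(n_components=3, random_state=0).fit(frames)
states = gmm.predict(frames)                          # discretized postural states

# occurrence pattern: how often each state is visited
occupancy = np.bincount(states, minlength=3) / len(states)

# transition pattern: row-normalized counts of state-to-state changes
T = np.zeros((3, 3))
for a, b in zip(states[:-1], states[1:]):
    T[a, b] += 1
T = T / T.sum(axis=1, keepdims=True)

print("occupancy:", np.round(occupancy, 2))
print("transition matrix:\n", np.round(T, 2))
# a mutant can match wild-type occupancy yet differ in T -- the kind of atypical
# transition pattern the study screens for
```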

  9. Dual reporter transgene driven by 2.3Col1a1 promoter is active in differentiated osteoblasts

    NASA Technical Reports Server (NTRS)

    Marijanovic, Inga; Jiang, Xi; Kronenberg, Mark S.; Stover, Mary Louise; Erceg, Ivana; Lichtler, Alexander C.; Rowe, David W.

    2003-01-01

    AIM: As quantitative and spatial analyses of promoter reporter constructs are not easily performed in intact bone, we designed a reporter gene specific to bone, which could be analyzed both visually and quantitatively by using chloramphenicol acetyltransferase (CAT) and a cyan version of green fluorescent protein (GFPcyan), driven by a 2.3-kb fragment of the rat collagen promoter (Col2.3). METHODS: The construct Col2.3CATiresGFPcyan was used for generating transgenic mice. Quantitative measurement of promoter activity was performed by CAT analysis of different tissues derived from transgenic animals; localization was performed by visualized GFP in frozen bone sections. To assess transgene expression during in vitro differentiation, marrow stromal cell and neonatal calvarial osteoblast cultures were analyzed for CAT and GFP activity. RESULTS: In mice, CAT activity was detected in the calvaria, long bone, teeth, and tendon, whereas histology showed that GFP expression was limited to osteoblasts and osteocytes. In cell culture, increased activity of CAT correlated with increased differentiation, and GFP activity was restricted to mineralized nodules. CONCLUSION: The concept of a dual reporter allows a simultaneous visual and quantitative analysis of transgene activity in bone.

  10. Integration of Metabolic and Quorum Sensing Signals Governing the Decision to Cooperate in a Bacterial Social Trait

    PubMed Central

    Boyle, Kerry E.; Monaco, Hilary; van Ditmarsch, Dave; Deforet, Maxime; Xavier, Joao B.

    2015-01-01

    Many unicellular organisms live in multicellular communities that rely on cooperation between cells. However, cooperative traits are vulnerable to exploitation by non-cooperators (cheaters). We expand our understanding of the molecular mechanisms that allow multicellular systems to remain robust in the face of cheating by dissecting the dynamic regulation of cooperative rhamnolipids required for swarming in Pseudomonas aeruginosa. We combine mathematical modeling and experiments to quantitatively characterize the integration of metabolic and population density signals (quorum sensing) governing expression of the rhamnolipid synthesis operon rhlAB. The combined computational/experimental analysis reveals that when nutrients are abundant, rhlAB promoter activity increases gradually in a density dependent way. When growth slows down due to nutrient limitation, rhlAB promoter activity can stop abruptly, decrease gradually or even increase depending on whether the growth-limiting nutrient is the carbon source, nitrogen source or iron. Starvation by specific nutrients drives growth on intracellular nutrient pools as well as the qualitative rhlAB promoter response, which itself is modulated by quorum sensing. Our quantitative analysis suggests a supply-driven activation that integrates metabolic prudence with quorum sensing in a non-digital manner and allows P. aeruginosa cells to invest in cooperation only when the population size is large enough (quorum sensing) and individual cells have enough metabolic resources to do so (metabolic prudence). Thus, the quantitative description of rhlAB regulatory dynamics brings a greater understanding of the regulation required to make swarming cooperation stable. PMID:26102206

  11. Comments on event driven animation

    NASA Technical Reports Server (NTRS)

    Gomez, Julian E.

    1987-01-01

    Event driven animation provides a general method of describing controlling values for various computer animation techniques. A definition and comments are provided on generalizing motion description with events. Additional comments are also provided about the implementation of twixt.

  12. Digital Suicide Prevention: Can Technology Become a Game-changer?

    PubMed Central

    Sahin, Ned; Kalali, Amir

    2016-01-01

    Suicide continues to be a leading cause of death and has been recognized as a significant public health issue. Rapid advances in data science can provide us with useful tools for suicide prevention, and help to dynamically assess suicide risk in quantitative data-driven ways. In this article, the authors highlight the most current international research in digital suicide prevention, including the use of machine learning, smartphone applications, and wearable sensor driven systems. The authors also discuss future opportunities for digital suicide prevention, and propose a novel Sensor-driven Mental State Assessment System. PMID:27800282

  13. Flectofold—a biomimetic compliant shading device for complex free form facades

    NASA Astrophysics Data System (ADS)

    Körner, A.; Born, L.; Mader, A.; Sachse, R.; Saffarian, S.; Westermeier, A. S.; Poppinga, S.; Bischoff, M.; Gresser, G. T.; Milwich, M.; Speck, T.; Knippers, J.

    2018-01-01

    Smart and adaptive outer façade shading systems are of high interest in modern architecture. For long lasting and reliable systems, the abandonment of hinges which often fail due to mechanical wear during repetitive use is of particular importance. Drawing inspiration from the hinge-less motion of the underwater snap-trap of the carnivorous waterwheel plant (Aldrovanda vesiculosa), the compliant façade shading device Flectofold was developed. Based on computational simulations of the biological role-model’s elastic and reversible motion, the actuation principle of the plant can be identified. The enclosed geometric motion principle is abstracted into a simplified curved-line folding geometry with distinct flexible hinge-zones. The kinematic behaviour is translated into a quantitative kinetic model, using finite element simulation which allows the detailed analyses of the influence of geometric parameters such as curved-fold line radius and various pneumatically driven actuation principles on the motion behaviour, stress concentrations within the hinge-zones, and actuation forces. The information regarding geometric relations and material gradients gained from those computational models are then used to develop novel material combinations for glass fibre reinforced plastics which enabled the fabrication of physical prototypes of the compliant façade shading device Flectofold.

  14. Full cell simulation and the evaluation of the buffer system on air-cathode microbial fuel cell

    NASA Astrophysics Data System (ADS)

    Ou, Shiqi; Kashima, Hiroyuki; Aaron, Douglas S.; Regan, John M.; Mench, Matthew M.

    2017-04-01

    This paper presents a computational model of a single chamber, air-cathode MFC. The model considers losses due to mass transport, as well as biological and electrochemical reactions, in both the anode and cathode half-cells. Computational fluid dynamics and Monod-Nernst analysis are incorporated into the reactions for the anode biofilm and cathode Pt catalyst and biofilm. The integrated model provides a macro-perspective of the interrelation between the anode and cathode during power production, while incorporating microscale contributions of mass transport within the anode and cathode layers. Model considerations include the effects of pH (H+/OH- transport) and electric field-driven migration on concentration overpotential, effects of various buffers and various amounts of buffer on the pH in the whole reactor, and overall impacts on the power output of the MFC. The simulation results fit the experimental polarization and power density curves well. Further, this model provides insight regarding mass transport at varying current density regimes and quantitative delineation of overpotentials at the anode and cathode. Overall, this comprehensive simulation is designed to accurately predict MFC performance based on fundamental fluid and kinetic relations and guide optimization of the MFC system.
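
    The anode kinetics combine substrate limitation with a dependence on local electrode potential. As a rough numeric illustration, the sketch below evaluates a generic Nernst-Monod rate expression; this functional form is a common choice in bioanode modeling and the parameter values are assumed, so it should not be read as the exact formulation used in this model.

```python
import numpy as np

F, R, T = 96485.0, 8.314, 303.15          # C/mol, J/(mol K), K

def nernst_monod_rate(S, eta, j_max=10.0, K_S=0.02):
    """Current density limited by substrate (Monod) and anode overpotential (Nernst term)."""
    monod = S / (K_S + S)                              # substrate limitation, S in mol/L
    nernst = 1.0 / (1.0 + np.exp(-F * eta / (R * T)))  # local overpotential eta in volts
    return j_max * monod * nernst                      # A/m^2 (illustrative scale)

for eta in (-0.05, 0.0, 0.05, 0.15):
    print(f"eta = {eta:+.2f} V -> j = {nernst_monod_rate(S=0.01, eta=eta):.2f} A/m^2")
```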

  15. Improving the Energy Saving Process with High-Resolution Data: A Case Study in a University Building.

    PubMed

    Han, Jeongyun; Lee, Eunjung; Cho, Hyunghun; Yoon, Yoonjin; Lee, Hyoseop; Rhee, Wonjong

    2018-05-17

    In this paper, we provide findings from an energy saving experiment in a university building, where an IoT platform with 1 Hz sampling sensors was deployed to collect electric power consumption data. The experiment was a reward setup with daily feedback delivered by an energy delegate for one week, and energy saving of 25.4% was achieved during the experiment. Post-experiment sustainability, defined as 10% or more of energy saving, was also accomplished for 44 days without any further intervention efforts. The saving was possible mainly because of the data-driven intervention designs with high-resolution data in terms of sampling frequency and number of sensors, and the high-resolution data turned out to be pivotal for an effective waste behavior investigation. While the quantitative result was encouraging, we also noticed many uncontrollable factors, such as exams, papers due, office allocation shuffling, graduation, and new-comers, that affected the result in the campus environment. To confirm that the quantitative result was due to behavior changes, rather than uncontrollable factors, we developed several data-driven behavior detection measures. With these measures, it was possible to analyze behavioral changes, as opposed to simply analyzing quantitative fluctuations. Overall, we conclude that the space-time resolution of data can be crucial for energy saving, and potentially for many other data-driven energy applications.

  16. Effects-Driven Participatory Design: Learning from Sampling Interruptions.

    PubMed

    Brandrup, Morten; Østergaard, Kija Lin; Hertzum, Morten; Karasti, Helena; Simonsen, Jesper

    2017-01-01

    Participatory design (PD) can play an important role in obtaining benefits from healthcare information technologies, but we contend that to fulfil this role PD must incorporate feedback from real use of the technologies. In this paper we describe an effects-driven PD approach that revolves around a sustained focus on pursued effects and uses the experience sampling method (ESM) to collect real-use feedback. To illustrate the use of the method we analyze a case that involves the organizational implementation of electronic whiteboards at a Danish hospital to support the clinicians' intra- and interdepartmental coordination. The hospital aimed to reduce the number of phone calls involved in coordinating work because many phone calls were seen as unnecessary interruptions. To learn about the interruptions we introduced an app for capturing quantitative data and qualitative feedback about the phone calls. The investigation showed that the electronic whiteboards had little potential for reducing the number of phone calls at the operating ward. The combination of quantitative data and qualitative feedback worked both as a basis for aligning assumptions to data and showed ESM as an instrument for triggering in-situ reflection. The participant-driven design and redesign of the way data were captured by means of ESM is a central contribution to the understanding of how to conduct effects-driven PD.

  17. Data-driven train set crash dynamics simulation

    NASA Astrophysics Data System (ADS)

    Tang, Zhao; Zhu, Yunrui; Nie, Yinyu; Guo, Shihui; Liu, Fengjia; Chang, Jian; Zhang, Jianjun

    2017-02-01

    Traditional finite element (FE) methods are arguably expensive in computation/simulation of the train crash. High computational cost limits their direct applications in investigating dynamic behaviours of an entire train set for crashworthiness design and structural optimisation. On the contrary, multi-body modelling is widely used because of its low computational cost with the trade-off in accuracy. In this study, a data-driven train crash modelling method is proposed to improve the performance of a multi-body dynamics simulation of train set crash without increasing the computational burden. This is achieved by the parallel random forest algorithm, which is a machine learning approach that extracts useful patterns of force-displacement curves and predicts a force-displacement relation in a given collision condition from a collection of offline FE simulation data on various collision conditions, namely different crash velocities in our analysis. Using the FE simulation results as a benchmark, we compared our method with traditional multi-body modelling methods and the result shows that our data-driven method improves the accuracy over traditional multi-body models in train crash simulation and runs at the same level of efficiency.
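
    A toy version of the surrogate idea, with a synthetic function standing in for the offline FE crash database: train a random-forest regressor on (velocity, displacement) to force samples from several collision conditions, then predict the force-displacement relation at an unseen crash velocity. Function names and numbers are invented.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def fe_force(velocity, displacement):
    """Stand-in for an offline FE result: crush force vs. displacement at a given speed."""
    return (2.0 + 0.1 * velocity) * displacement + 50.0 * np.sin(displacement / 40.0)

disp = np.linspace(0, 400, 80)                           # mm
train_v = [10, 15, 20, 25, 30]                           # km/h collision conditions
X = np.array([[v, d] for v in train_v for d in disp])    # features: (velocity, displacement)
y = np.array([fe_force(v, d) for v in train_v for d in disp])

model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)

# predict the force-displacement relation at an unseen crash velocity (18 km/h)
pred = model.predict(np.column_stack([np.full_like(disp, 18.0), disp]))
true = fe_force(18.0, disp)
print("mean absolute error:", np.round(np.mean(np.abs(pred - true)), 2))
```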

  18. Mission Driven Scene Understanding: Candidate Model Training and Validation

    DTIC Science & Technology

    2016-09-01

    driven scene understanding. One of the candidate engines that we are evaluating is a convolutional neural network (CNN) program installed on a Windows 10...Theano-AlexNet) installed on a Windows 10 notebook computer. To the best of our knowledge, an implementation of the open-source, Python-based...AlexNet CNN on a Windows notebook computer has not been previously reported. In this report, we present progress toward the proof-of-principle testing

  19. On the role of electron-driven processes in planetary atmospheres and comets

    NASA Astrophysics Data System (ADS)

    Campbell, L.; Brunger, M. J.

    2009-11-01

    After the presence of ionized layers in the Earth's atmosphere was inferred, it took 50 years to quantitatively understand them. The electron density could not be accounted for until Sir David Bates first suggested (along with Sir Harrie Massey) that the main electron-loss process was dissociative recombination with molecular ions, and he and colleagues then developed a theory to predict those rates of dissociative recombination. However, electron impact processes, particularly excitation, have been considered insignificant in most situations, in both planetary and cometary atmospheres. Here we describe cases where recent calculations have shown that electron impact excitation of molecules is important, suggesting that, just as in the time of Sir David Bates, electron-driven processes remain fundamental to our quantitative understanding of atmospheric and cometary phenomena.

  20. Quantitative Accelerated Life Testing of MEMS Accelerometers.

    PubMed

    Bâzu, Marius; Gălăţeanu, Lucian; Ilian, Virgil Emil; Loicq, Jerome; Habraken, Serge; Collette, Jean-Paul

    2007-11-20

    Quantitative Accelerated Life Testing (QALT) is a solution for assessing the reliability of Micro Electro Mechanical Systems (MEMS). A procedure for QALT is shown in this paper and an attempt to assess the reliability level for a batch of MEMS accelerometers is reported. The testing plan is application-driven and contains combined tests: thermal (high temperature) and mechanical stress. Two variants of mechanical stress are used: vibration (at a fixed frequency) and tilting. Original equipment for testing at tilting and high temperature is used. Tilting is appropriate as application-driven stress, because the tilt movement is a natural environment for devices used for automotive and aerospace applications. Also, tilting is used by MEMS accelerometers for anti-theft systems. The test results demonstrated the excellent reliability of the studied devices, the failure rate in the "worst case" being smaller than 10⁻⁷ h⁻¹.
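
    Two standard QALT calculations can illustrate how such a result is reached (the formulas are generic reliability practice, not necessarily the authors' exact procedure, and all numbers are assumed): an Arrhenius acceleration factor for the high-temperature leg, and a one-sided chi-squared upper bound on the failure rate when zero failures are observed.

```python
import numpy as np
from scipy.stats import chi2

k_B = 8.617e-5                     # Boltzmann constant, eV/K

def arrhenius_af(Ea, T_use_C, T_stress_C):
    """Arrhenius acceleration factor between use and stress temperatures."""
    Tu, Ts = T_use_C + 273.15, T_stress_C + 273.15
    return np.exp(Ea / k_B * (1.0 / Tu - 1.0 / Ts))

AF = arrhenius_af(Ea=0.7, T_use_C=40, T_stress_C=125)        # assumed activation energy

devices, test_hours, failures = 20, 1000, 0                  # illustrative test plan
equivalent_hours = devices * test_hours * AF                 # device-hours at use conditions

# one-sided 90% upper bound on the failure rate with zero failures (chi-squared method)
lam_upper = chi2.ppf(0.90, 2 * (failures + 1)) / (2.0 * equivalent_hours)
print(f"acceleration factor      = {AF:.0f}")
print(f"failure-rate upper bound = {lam_upper:.2e} per hour")
```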

  1. Floquet spectrum and driven conductance in Dirac materials: Effects of Landau-Zener-Stuckelberg-Majorana interferometry

    NASA Astrophysics Data System (ADS)

    Rodionov, Yaroslav; Kugel, Kliment; Nori, Franco

    Using the Landau-Zener-Stückelberg-Majorana-type (LZSM) semiclassical approach, we study both graphene and a thin film of a Weyl semimetal subjected to a strong ac electromagnetic field. The spectrum of quasienergies in the Weyl semimetal turns out to be similar to that of a graphene sheet. It has been predicted qualitatively that the transport properties of strongly irradiated graphene oscillate as a function of the radiation intensity. Here we obtain rigorous quantitative results for a driven linear conductance of graphene and a thin film of a Weyl semimetal. The exact quantitative structure of oscillations exhibits two contributions. The first one is a manifestation of the Ramsauer-Townsend effect, while the second contribution is a consequence of the LZSM interference defining the spectrum of quasienergies.

  2. Inferring Muscle-Tendon Unit Power from Ankle Joint Power during the Push-Off Phase of Human Walking: Insights from a Multiarticular EMG-Driven Model

    PubMed Central

    2016-01-01

    Introduction: Inverse dynamics joint kinetics are often used to infer contributions from underlying groups of muscle-tendon units (MTUs). However, such interpretations are confounded by multiarticular (multi-joint) musculature, which can cause inverse dynamics to over- or under-estimate net MTU power. Misestimation of MTU power could lead to incorrect scientific conclusions, or to empirical estimates that misguide musculoskeletal simulations, assistive device designs, or clinical interventions. The objective of this study was to investigate the degree to which ankle joint power overestimates net plantarflexor MTU power during the Push-off phase of walking, due to the behavior of the flexor digitorum and hallucis longus (FDHL)–multiarticular MTUs crossing the ankle and metatarsophalangeal (toe) joints. Methods: We performed a gait analysis study on six healthy participants, recording ground reaction forces, kinematics, and electromyography (EMG). Empirical data were input into an EMG-driven musculoskeletal model to estimate ankle power. This model enabled us to parse contributions from mono- and multi-articular MTUs, and required only one scaling and one time delay factor for each subject and speed, which were solved for based on empirical data. Net plantarflexing MTU power was computed by the model and quantitatively compared to inverse dynamics ankle power. Results: The EMG-driven model was able to reproduce inverse dynamics ankle power across a range of gait speeds (R2 ≥ 0.97), while also providing MTU-specific power estimates. We found that FDHL dynamics caused ankle power to slightly overestimate net plantarflexor MTU power, but only by ~2–7%. Conclusions: During Push-off, FDHL MTU dynamics do not substantially confound the inference of net plantarflexor MTU power from inverse dynamics ankle power. However, other methodological limitations may cause inverse dynamics to overestimate net MTU power; for instance, due to rigid-body foot assumptions. Moving forward, the EMG-driven modeling approach presented could be applied to understand other tasks or larger multiarticular MTUs. PMID:27764110

  3. Inferring Muscle-Tendon Unit Power from Ankle Joint Power during the Push-Off Phase of Human Walking: Insights from a Multiarticular EMG-Driven Model.

    PubMed

    Honert, Eric C; Zelik, Karl E

    2016-01-01

    Inverse dynamics joint kinetics are often used to infer contributions from underlying groups of muscle-tendon units (MTUs). However, such interpretations are confounded by multiarticular (multi-joint) musculature, which can cause inverse dynamics to over- or under-estimate net MTU power. Misestimation of MTU power could lead to incorrect scientific conclusions, or to empirical estimates that misguide musculoskeletal simulations, assistive device designs, or clinical interventions. The objective of this study was to investigate the degree to which ankle joint power overestimates net plantarflexor MTU power during the Push-off phase of walking, due to the behavior of the flexor digitorum and hallucis longus (FDHL)-multiarticular MTUs crossing the ankle and metatarsophalangeal (toe) joints. We performed a gait analysis study on six healthy participants, recording ground reaction forces, kinematics, and electromyography (EMG). Empirical data were input into an EMG-driven musculoskeletal model to estimate ankle power. This model enabled us to parse contributions from mono- and multi-articular MTUs, and required only one scaling and one time delay factor for each subject and speed, which were solved for based on empirical data. Net plantarflexing MTU power was computed by the model and quantitatively compared to inverse dynamics ankle power. The EMG-driven model was able to reproduce inverse dynamics ankle power across a range of gait speeds (R2 ≥ 0.97), while also providing MTU-specific power estimates. We found that FDHL dynamics caused ankle power to slightly overestimate net plantarflexor MTU power, but only by ~2-7%. During Push-off, FDHL MTU dynamics do not substantially confound the inference of net plantarflexor MTU power from inverse dynamics ankle power. However, other methodological limitations may cause inverse dynamics to overestimate net MTU power; for instance, due to rigid-body foot assumptions. Moving forward, the EMG-driven modeling approach presented could be applied to understand other tasks or larger multiarticular MTUs.
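
    A back-of-the-envelope illustration of why ankle joint power can overestimate net plantarflexor MTU power: joint power is moment times angular velocity, and a multiarticular MTU such as the FDHL contributes power at both the ankle and the toe joints it crosses, so its small power absorption at the toes is missed if only the ankle term is summed. All numbers below are invented and the FDHL moment share is an assumption, not a result from the paper.

```python
# toy instant during Push-off (all values illustrative)
ankle_moment, ankle_omega = 120.0, 3.0          # N*m, rad/s
mtp_moment,   mtp_omega   = -8.0, 2.0           # the toe joint absorbs a little power

P_ankle_joint = ankle_moment * ankle_omega       # inverse-dynamics ankle power

# suppose the FDHL (multiarticular) carries a small share of the ankle moment and all of
# the toe moment; its net power sums its contribution at *both* joints it crosses
fdhl_share_of_ankle = 0.05
P_fdhl = fdhl_share_of_ankle * ankle_moment * ankle_omega + mtp_moment * mtp_omega
P_mono_plantarflexors = (1 - fdhl_share_of_ankle) * ankle_moment * ankle_omega

P_net_mtu = P_fdhl + P_mono_plantarflexors
print(f"ankle joint power     : {P_ankle_joint:6.1f} W")
print(f"net plantarflexor MTU : {P_net_mtu:6.1f} W")
print(f"overestimate          : {100 * (P_ankle_joint - P_net_mtu) / P_net_mtu:4.1f} %")
```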

  4. A Perspective on Implementing a Quantitative Systems Pharmacology Platform for Drug Discovery and the Advancement of Personalized Medicine

    PubMed Central

    Stern, Andrew M.; Schurdak, Mark E.; Bahar, Ivet; Berg, Jeremy M.; Taylor, D. Lansing

    2016-01-01

    Drug candidates exhibiting well-defined pharmacokinetic and pharmacodynamic profiles that are otherwise safe often fail to demonstrate proof-of-concept in phase II and III trials. Innovation in drug discovery and development has been identified as a critical need for improving the efficiency of drug discovery, especially through collaborations between academia, government agencies, and industry. To address the innovation challenge, we describe a comprehensive, unbiased, integrated, and iterative quantitative systems pharmacology (QSP)–driven drug discovery and development strategy and platform that we have implemented at the University of Pittsburgh Drug Discovery Institute. Intrinsic to QSP is its integrated use of multiscale experimental and computational methods to identify mechanisms of disease progression and to test predicted therapeutic strategies likely to achieve clinical validation for appropriate subpopulations of patients. The QSP platform can address biological heterogeneity and anticipate the evolution of resistance mechanisms, which are major challenges for drug development. The implementation of this platform is dedicated to gaining an understanding of mechanism(s) of disease progression to enable the identification of novel therapeutic strategies as well as repurposing drugs. The QSP platform will help promote the paradigm shift from reactive population-based medicine to proactive personalized medicine by focusing on the patient as the starting and the end point. PMID:26962875

  5. Exploring the Perceptions of College Instructors towards Computer Simulation Software Programs: A Quantitative Study

    ERIC Educational Resources Information Center

    Punch, Raymond J.

    2012-01-01

    The purpose of the quantitative regression study was to explore and to identify relationships between attitudes toward use and perceptions of value of computer-based simulation programs, of college instructors, toward computer based simulation programs. A relationship has been reported between attitudes toward use and perceptions of the value of…

  6. Maximum Langmuir Fields in Planetary Foreshocks Determined from the Electrostatic Decay Threshold

    NASA Technical Reports Server (NTRS)

    Robinson, P. A.; Cairns, Iver H.

    1995-01-01

    Maximum electric fields of Langmuir waves at planetary foreshocks are estimated from the threshold for electrostatic decay, assuming it saturates beam driven growth, and incorporating heliospheric variation of plasma density and temperature. Comparisons with spacecraft observations yield good quantitative agreement. Observations in type 3 radio sources are also in accord with this interpretation. A single mechanism can thus account for the highest fields of beam driven waves in both contexts.

  7. 14 CFR 152.319 - Monitoring and reporting of program performance.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... established for the period, made, if applicable, on a quantitative basis related to cost data for computation... established for the period, made, if applicable, on a quantitative basis related to costs for computation of...

  8. 14 CFR 152.319 - Monitoring and reporting of program performance.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... established for the period, made, if applicable, on a quantitative basis related to cost data for computation... established for the period, made, if applicable, on a quantitative basis related to costs for computation of...

  9. 14 CFR 152.319 - Monitoring and reporting of program performance.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... established for the period, made, if applicable, on a quantitative basis related to cost data for computation... established for the period, made, if applicable, on a quantitative basis related to costs for computation of...

  10. 14 CFR 152.319 - Monitoring and reporting of program performance.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... established for the period, made, if applicable, on a quantitative basis related to cost data for computation... established for the period, made, if applicable, on a quantitative basis related to costs for computation of...

  11. 14 CFR 152.319 - Monitoring and reporting of program performance.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... established for the period, made, if applicable, on a quantitative basis related to cost data for computation... established for the period, made, if applicable, on a quantitative basis related to costs for computation of...

  12. A Menu-Driven Interface to Unix-Based Resources

    PubMed Central

    Evans, Elizabeth A.

    1989-01-01

    Unix has often been overlooked in the past as a viable operating system for anyone other than computer scientists. Its terseness, non-mnemonic nature of the commands, and the lack of user-friendly software to run under it are but a few of the user-related reasons which have been cited. It is, nevertheless, the operating system of choice in many cases. This paper describes a menu-driven interface to Unix which provides user-friendlier access to the software resources available on the computers running under Unix.

  13. Database Driven 6-DOF Trajectory Simulation for Debris Transport Analysis

    NASA Technical Reports Server (NTRS)

    West, Jeff

    2008-01-01

    Debris mitigation and risk assessment have been carried out by NASA and its contractors supporting Space Shuttle Return-To-Flight (RTF). As a part of this assessment, analysis of transport potential for debris that may be liberated from the vehicle or from pad facilities prior to tower clear (Lift-Off Debris) is being performed by MSFC. This class of debris includes plume driven and wind driven sources for which lift as well as drag are critical for the determination of the debris trajectory. As a result, NASA MSFC has a need for a debris transport or trajectory simulation that supports the computation of lift effect in addition to drag without the computational expense of fully coupled CFD with 6-DOF. A database driven 6-DOF simulation that uses aerodynamic force and moment coefficients for the debris shape that are interpolated from a database has been developed to meet this need. The design, implementation, and verification of the database driven six degree of freedom (6-DOF) simulation addition to the Lift-Off Debris Transport Analysis (LODTA) software are discussed in this paper.

  14. System and method for embedding emotion in logic systems

    NASA Technical Reports Server (NTRS)

    Curtis, Steven A. (Inventor)

    2012-01-01

    A system, method, and computer readable-media for creating a stable synthetic neural system. The method includes training an intellectual choice-driven synthetic neural system (SNS), training an emotional rule-driven SNS by generating emotions from rules, incorporating the rule-driven SNS into the choice-driven SNS through an evolvable interface, and balancing the emotional SNS and the intellectual SNS to achieve stability in a nontrivial autonomous environment with a Stability Algorithm for Neural Entities (SANE). Generating emotions from rules can include coding the rules into the rule-driven SNS in a self-consistent way. Training the emotional rule-driven SNS can occur during a training stage in parallel with training the choice-driven SNS. The training stage can include a self assessment loop which measures performance characteristics of the rule-driven SNS against core genetic code. The method uses a stability threshold to measure stability of the incorporated rule-driven SNS and choice-driven SNS using SANE.

  15. Moving Computational Domain Method and Its Application to Flow Around a High-Speed Car Passing Through a Hairpin Curve

    NASA Astrophysics Data System (ADS)

    Watanabe, Koji; Matsuno, Kenichi

    This paper presents a new method for simulating flows driven by a body traveling with neither a restriction on its motion nor a limit on the size of the region. In the present method, named the 'Moving Computational Domain Method', the whole computational domain, including the bodies inside it, moves in physical space without any limit on region size. Since the entire grid of the computational domain moves with the body, the flow solver must be constructed on a moving grid system, and it is important for the solver to satisfy the physical and geometric conservation laws simultaneously on the moving grid. For this reason, the Moving-Grid Finite-Volume Method is employed as the flow solver. The Moving Computational Domain Method thus makes it possible to simulate flows driven by any kind of body motion in a region of any size while satisfying the physical and geometric conservation laws simultaneously. In this paper, the method is applied to the flow around a high-speed car passing through a hairpin curve. The distinctive flow field driven by the car at the hairpin curve is demonstrated in detail. The results show the promising features of the method.

  16. Computational and Theoretical Investigations of Strongly Correlated Fermions in Optical Lattices

    DTIC Science & Technology

    2013-08-29

    and two-particle spectral functions across the disorder-driven superconductor-insulator transition". 22. Invited speaker, "Fermions in Optical...energy gaps across the disorder-driven superconductor-insulator transition", October 7, 2010, Harvard. 27. Seminar on "Probing Quantum Phases of...Perimeter Institute, November 14, 2011. 37. Seminar on "Single and two-particle energy gaps across the disorder-driven superconductor-insulator transition

  17. Pivotal role of computers and software in mass spectrometry - SEQUEST and 20 years of tandem MS database searching.

    PubMed

    Yates, John R

    2015-11-01

    Advances in computer technology and software have driven developments in mass spectrometry over the last 50 years. Computers and software have been impactful in three areas: the automation of difficult calculations to aid interpretation, the collection of data and control of instruments, and data interpretation. As the power of computers has grown, so too has the utility and impact on mass spectrometers and their capabilities. This has been particularly evident in the use of tandem mass spectrometry data to search protein and nucleotide sequence databases to identify peptide and protein sequences. This capability has driven the development of many new approaches to study biological systems, including the use of "bottom-up shotgun proteomics" to directly analyze protein mixtures.

  18. EnviroLand: A Simple Computer Program for Quantitative Stream Assessment.

    ERIC Educational Resources Information Center

    Dunnivant, Frank; Danowski, Dan; Timmens-Haroldson, Alice; Newman, Meredith

    2002-01-01

    Introduces the Enviroland computer program which features lab simulations of theoretical calculations for quantitative analysis and environmental chemistry, and fate and transport models. Uses the program to demonstrate the nature of linear and nonlinear equations. (Author/YDS)

  19. Emerging Patient-Driven Health Care Models: An Examination of Health Social Networks, Consumer Personalized Medicine and Quantified Self-Tracking

    PubMed Central

    Swan, Melanie

    2009-01-01

    A new class of patient-driven health care services is emerging to supplement and extend traditional health care delivery models and empower patient self-care. Patient-driven health care can be characterized as having an increased level of information flow, transparency, customization, collaboration and patient choice and responsibility-taking, as well as quantitative, predictive and preventive aspects. The potential exists to both improve traditional health care systems and expand the concept of health care though new services. This paper examines three categories of novel health services: health social networks, consumer personalized medicine and quantified self-tracking. PMID:19440396

  20. Is There Computer Graphics after Multimedia?

    ERIC Educational Resources Information Center

    Booth, Kellogg S.

    Computer graphics has been driven by the desire to generate real-time imagery subject to constraints imposed by the human visual system. The future of computer graphics, when off-the-shelf systems have full multimedia capability and when standard computing engines render imagery faster than real-time, remains to be seen. A dedicated pipeline for…

  1. Programmable Quantum Photonic Processor Using Silicon Photonics

    DTIC Science & Technology

    2017-04-01

    quantum information processing and quantum sensing, ranging from linear optics quantum computing and quantum simulation to quantum ...transformers have driven experimental and theoretical advances in quantum simulation, cluster-state quantum computing, all-optical quantum repeaters...neuromorphic computing, and other applications. In addition, we developed new schemes for ballistic quantum computation, new methods for…

  2. Round Girls in Square Computers: Feminist Perspectives on the Aesthetics of Computer Hardware.

    ERIC Educational Resources Information Center

    Carr-Chellman, Alison A.; Marra, Rose M.; Roberts, Shari L.

    2002-01-01

    Considers issues related to computer hardware, aesthetics, and gender. Explores how gender has influenced the design of computer hardware and how these gender-driven aesthetics may have worked to maintain, extend, or alter gender distinctions, roles, and stereotypes; discusses masculine media representations; and presents an alternative model.…

  3. A new software-based architecture for quantum computer

    NASA Astrophysics Data System (ADS)

    Wu, Nan; Song, FangMin; Li, Xiangdong

    2010-04-01

    In this paper, we study a reliable architecture for a quantum computer and a new instruction set and machine language for that architecture, which can improve the performance and reduce the cost of quantum computing. We also address in detail some key issues in software-driven universal quantum computers.

  4. Quantitation of zolpidem in biological fluids by electro-driven microextraction combined with HPLC-UV analysis.

    PubMed

    Yaripour, Saeid; Mohammadi, Ali; Esfanjani, Isa; Walker, Roderick B; Nojavan, Saeed

    2018-01-01

    In this study, for the first time, an electro-driven microextraction method named electromembrane extraction, combined with simple high performance liquid chromatography and ultraviolet detection, was developed and validated for the quantitation of zolpidem in biological samples. Parameters influencing electromembrane extraction were evaluated and optimized. The membrane consisted of 2-ethylhexanol immobilized in the pores of a hollow fiber. As a driving force, a 150 V electric field was applied to facilitate analyte migration from the sample matrix to an acceptor solution through a supported liquid membrane. The pHs of the donor and acceptor solutions were optimized to 6.0 and 2.0, respectively. An enrichment factor of >75 was obtained within 15 minutes. The effect of carbon nanotubes (as solid nano-sorbents) on the membrane performance and EME efficiency was evaluated. The method was linear over the range of 10-1000 ng/mL for zolpidem (R² > 0.9991) with repeatability (%RSD) between 0.3% and 7.3% (n = 3). The limits of detection and quantitation were 3 and 10 ng/mL, respectively. The sensitivity of HPLC-UV for the determination of zolpidem was enhanced by electromembrane extraction. Finally, the method was employed for the quantitation of zolpidem in biological samples with relative recoveries in the range of 60-79%.
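
    As a hedged illustration of the figures of merit quoted in this record (enrichment factor >75 and relative recoveries of 60-79%), the short Python sketch below uses the definitions in common use for membrane microextraction; the concentrations and phase volumes are hypothetical and are not taken from the paper.

    def enrichment_factor(c_acceptor_final, c_donor_initial):
        """EF = final analyte concentration in the acceptor phase / initial concentration in the donor."""
        return c_acceptor_final / c_donor_initial

    def extraction_recovery_percent(c_acceptor_final, v_acceptor, c_donor_initial, v_donor):
        """Percent of the analyte mass transferred from the donor to the acceptor phase."""
        return 100.0 * (c_acceptor_final * v_acceptor) / (c_donor_initial * v_donor)

    # Hypothetical numbers: 100 ng/mL donor (4 mL), 7600 ng/mL found in a 20 uL acceptor
    print(enrichment_factor(7600.0, 100.0))                        # -> 76.0
    print(extraction_recovery_percent(7600.0, 0.020, 100.0, 4.0))  # -> 38.0 (percent)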

  5. Quantitation of zolpidem in biological fluids by electro-driven microextraction combined with HPLC-UV analysis

    PubMed Central

    Yaripour, Saeid; Mohammadi, Ali; Esfanjani, Isa; Walker, Roderick B.; Nojavan, Saeed

    2018-01-01

    In this study, for the first time, an electro-driven microextraction method named electromembrane extraction, combined with simple high performance liquid chromatography and ultraviolet detection, was developed and validated for the quantitation of zolpidem in biological samples. Parameters influencing electromembrane extraction were evaluated and optimized. The membrane consisted of 2-ethylhexanol immobilized in the pores of a hollow fiber. As a driving force, a 150 V electric field was applied to facilitate analyte migration from the sample matrix to an acceptor solution through a supported liquid membrane. The pHs of the donor and acceptor solutions were optimized to 6.0 and 2.0, respectively. An enrichment factor of >75 was obtained within 15 minutes. The effect of carbon nanotubes (as solid nano-sorbents) on the membrane performance and EME efficiency was evaluated. The method was linear over the range of 10-1000 ng/mL for zolpidem (R² > 0.9991) with repeatability (%RSD) between 0.3% and 7.3% (n = 3). The limits of detection and quantitation were 3 and 10 ng/mL, respectively. The sensitivity of HPLC-UV for the determination of zolpidem was enhanced by electromembrane extraction. Finally, the method was employed for the quantitation of zolpidem in biological samples with relative recoveries in the range of 60-79%. PMID:29805344

  6. Nicholas Metropolis Award for Outstanding Doctoral Thesis Work in Computational Physics Lecture: The Janus computer, a new window into spin-glass physics

    NASA Astrophysics Data System (ADS)

    Yllanes, David

    2013-03-01

    Spin glasses are a longstanding model for the sluggish dynamics that appears at the glass transition. They enjoy a privileged status in this context, as they provide the simplest model system both for theoretical and experimental studies of glassy dynamics. However, in spite of forty years of intensive investigation, spin glasses still pose a formidable challenge to theoretical, computational and experimental physics. The main difficulty lies in their incredibly slow dynamics. A recent breakthrough has been made possible by our custom-built computer, Janus, designed and built in a collaboration formed by five universities in Spain and Italy. By employing a purpose-driven architecture, capable of fully exploiting the parallelization possibilities intrinsic to these simulations, Janus outperforms conventional computers by several orders of magnitude. After a brief introduction to spin glasses, the talk will focus on the new physics unearthed by Janus. In particular, we recall our numerical study of the nonequilibrium dynamics of the Edwards-Anderson Ising Spin Glass, for a time that spans eleven orders of magnitude, thus approaching the experimentally relevant scale (i.e. seconds). We have also studied the equilibrium properties of the spin-glass phase, with an emphasis on the quantitative matching between non-equilibrium and equilibrium correlation functions, through a time-length dictionary. Last but not least, we have clarified the existence of a glass transition in the presence of a magnetic field for a finite-range spin glass (the so-called de Almeida-Thouless line). We will finally mention some of the currently ongoing work of the collaboration, such as the characterization of the non-equilibrium dynamics in a magnetic field and the existence of a statics-dynamics dictionary in these conditions.

  7. Biomechanical differences in the stem straightening process among Pinus pinaster provenances. A new approach for early selection of stem straightness.

    PubMed

    Sierra-de-Grado, Rosario; Pando, Valentín; Martínez-Zurimendi, Pablo; Peñalvo, Alejandro; Báscones, Esther; Moulia, Bruno

    2008-06-01

    Stem straightness is an important selection trait in Pinus pinaster Ait. breeding programs. Despite the stability of stem straightness rankings in provenance trials, the efficiency of breeding programs based on a quantitative index of stem straightness remains low. An alternative approach is to analyze biomechanical processes that underlie stem form. The rationale for this selection method is that genetic differences in the biomechanical processes that maintain stem straightness in young plants will continue to control stem form throughout the life of the tree. We analyzed the components contributing most to genetic differences among provenances in stem straightening processes by kinetic analysis and with a biomechanical model defining the interactions between the variables involved (Fournier's model). This framework was tested on three P. pinaster provenances differing in adult stem straightness and growth. One-year-old plants were tilted at 45 degrees, and individual stem positions and sizes were recorded weekly for 5 months. We measured the radial extension of reaction wood and the anatomical features of wood cells in serial stem cross sections. The integral effect of reaction wood on stem leaning was computed with Fournier's model. Responses driven by both primary and secondary growth were involved in the stem straightening process, but secondary-growth-driven responses accounted for most differences among provenances. Plants from the straight-stemmed provenance showed a greater capacity for stem straightening than plants from the sinuous provenances mainly because of (1) more efficient reaction wood (higher maturation strains) and (2) more pronounced secondary-growth-driven autotropic decurving. These two process-based traits are thus good candidates for early selection of stem straightness, but additional tests on a greater number of genotypes over a longer period are required.

  8. A biomechanical modeling-guided simultaneous motion estimation and image reconstruction technique (SMEIR-Bio) for 4D-CBCT reconstruction

    NASA Astrophysics Data System (ADS)

    Huang, Xiaokun; Zhang, You; Wang, Jing

    2018-02-01

    Reconstructing four-dimensional cone-beam computed tomography (4D-CBCT) images directly from respiratory phase-sorted traditional 3D-CBCT projections can capture target motion trajectory, reduce motion artifacts, and reduce imaging dose and time. However, the limited numbers of projections in each phase after phase-sorting decreases CBCT image quality under traditional reconstruction techniques. To address this problem, we developed a simultaneous motion estimation and image reconstruction (SMEIR) algorithm, an iterative method that can reconstruct higher quality 4D-CBCT images from limited projections using an inter-phase intensity-driven motion model. However, the accuracy of the intensity-driven motion model is limited in regions with fine details whose quality is degraded due to insufficient projection number, which consequently degrades the reconstructed image quality in corresponding regions. In this study, we developed a new 4D-CBCT reconstruction algorithm by introducing biomechanical modeling into SMEIR (SMEIR-Bio) to boost the accuracy of the motion model in regions with small fine structures. The biomechanical modeling uses tetrahedral meshes to model organs of interest and solves internal organ motion using tissue elasticity parameters and mesh boundary conditions. This physics-driven approach enhances the accuracy of solved motion in the organ’s fine structures regions. This study used 11 lung patient cases to evaluate the performance of SMEIR-Bio, making both qualitative and quantitative comparisons between SMEIR-Bio, SMEIR, and the algebraic reconstruction technique with total variation regularization (ART-TV). The reconstruction results suggest that SMEIR-Bio improves the motion model’s accuracy in regions containing small fine details, which consequently enhances the accuracy and quality of the reconstructed 4D-CBCT images.

  9. How Complex, Probable, and Predictable is Genetically Driven Red Queen Chaos?

    PubMed

    Duarte, Jorge; Rodrigues, Carla; Januário, Cristina; Martins, Nuno; Sardanyés, Josep

    2015-12-01

    Coevolution between two antagonistic species has been widely studied theoretically for both ecologically- and genetically-driven Red Queen dynamics. A typical outcome of these systems is an oscillatory behavior causing an endless series of one species' adaptation and the other's counter-adaptation. More recently, a mathematical model combining a three-species food chain system with an adaptive dynamics approach revealed genetically driven chaotic Red Queen coevolution. In the present article, we analyze this mathematical model, mainly focusing on the impact of the species' rates of evolution (mutation rates) on the dynamics. First, we analytically prove the boundedness of the trajectories of the chaotic attractor. The complexity of the coupling between the dynamical variables is quantified using observability indices. Using symbolic dynamics theory, we quantify the complexity of genetically driven Red Queen chaos by computing the topological entropy of existing one-dimensional iterated maps using Markov partitions. Codimension-two bifurcation diagrams are also built from the period ordering of the orbits of the maps. Then, we study the predictability of the Red Queen chaos, which is found in narrow regions of mutation rates. To extend the previous analyses, we also computed the likelihood of finding chaos in a given region of the parameter space while varying other model parameters simultaneously. These analyses allowed us to compute a mean predictability measure for the system in the explored region of the parameter space. We found that genetically driven Red Queen chaos, although restricted to small regions of the analyzed parameter space, might be highly unpredictable.
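
    The chaos diagnostics mentioned in this abstract (positive Lyapunov exponents, entropy of one-dimensional iterated maps) can be illustrated with a minimal Python sketch; the logistic map below is only a generic stand-in, not the Red Queen model's map, and the parameter values are arbitrary.

    import numpy as np

    def lyapunov_exponent(f, dfdx, x0, n_transient=1000, n_iter=50000):
        """Average log-derivative along an orbit of a one-dimensional map."""
        x = x0
        for _ in range(n_transient):          # discard the transient
            x = f(x)
        total = 0.0
        for _ in range(n_iter):
            total += np.log(abs(dfdx(x)))
            x = f(x)
        return total / n_iter

    r = 4.0                                    # fully chaotic logistic map
    lam = lyapunov_exponent(lambda x: r * x * (1.0 - x), lambda x: r * (1.0 - 2.0 * x), x0=0.3)
    print(f"lambda ~ {lam:.3f}  (positive -> chaos; ln 2 ~ 0.693 expected for r = 4)")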

  10. Towards a cyberinfrastructure for the biological sciences: progress, visions and challenges.

    PubMed

    Stein, Lincoln D

    2008-09-01

    Biology is an information-driven science. Large-scale data sets from genomics, physiology, population genetics and imaging are driving research at a dizzying rate. Simultaneously, interdisciplinary collaborations among experimental biologists, theorists, statisticians and computer scientists have become the key to making effective use of these data sets. However, too many biologists have trouble accessing and using these electronic data sets and tools effectively. A 'cyberinfrastructure' is a combination of databases, network protocols and computational services that brings people, information and computational tools together to perform science in this information-driven world. This article reviews the components of a biological cyberinfrastructure, discusses current and pending implementations, and notes the many challenges that lie ahead.

  11. Big Data: An Opportunity for Collaboration with Computer Scientists on Data-Driven Science

    NASA Astrophysics Data System (ADS)

    Baru, C.

    2014-12-01

    Big data technologies are evolving rapidly, driven by the need to manage ever-increasing amounts of historical data; process relentless streams of human- and machine-generated data; and integrate data of heterogeneous structure from extremely heterogeneous sources of information. Big data is inherently an application-driven problem. Developing the right technologies requires an understanding of the applications domain. An intriguing aspect of this phenomenon, though, is that the availability of the data itself enables new applications not previously conceived of! In this talk, we will discuss how the big data phenomenon creates an imperative for collaboration among domain scientists (in this case, geoscientists) and computer scientists. Domain scientists provide the application requirements as well as insights about the data involved, while computer scientists help assess whether problems can be solved with currently available technologies or require adaptation of existing technologies and/or development of new technologies. The synergy can create vibrant collaborations potentially leading to new science insights as well as the development of new data technologies and systems. The interface between the geosciences and computer science, also referred to as geoinformatics, is, we believe, a fertile area for interdisciplinary research.

  12. Comparison between two types of improved motion-sensitized driven-equilibrium (iMSDE) for intracranial black-blood imaging at 3.0 tesla.

    PubMed

    Obara, Makoto; Kuroda, Kagayaki; Wang, Jinnan; Honda, Masatoshi; Yoneyama, Masami; Imai, Yutaka; Van Cauteren, Marc

    2014-10-01

    To investigate the image quality impact of a new implementation of the improved motion-sensitized driven-equilibrium (iMSDE) pulse scheme in the human brain at 3.0 Tesla. Two iMSDE preparation schemes were compared: (a) iMSDE-1, with two refocusing pulses and two pairs of bipolar gradients, and (b) iMSDE-2, which adds extra bipolar gradients in front of the iMSDE-1 preparation. Computer simulation was used to evaluate the difference in eddy current effects between the two approaches. Five healthy volunteers were then scanned with both sequences in the intracranial region, and the signal changes associated with iMSDE-1 and iMSDE-2 were assessed and compared quantitatively and qualitatively. Simulation results demonstrated that eddy currents are better compensated in iMSDE-2 than in the iMSDE-1 design. In vivo comparison showed that the iMSDE-2 sequence significantly reduced tissue signal loss at all locations compared with iMSDE-1 (5.0% versus 23% on average, P < 0.0002, paired t-test). The signal in iMSDE-1 also showed greater spatial inhomogeneity than that of iMSDE-2. Our results show that iMSDE-2 demonstrated smaller signal loss and less spatial variation than iMSDE-1, which we conjecture is due to the improved eddy current compensation. © 2013 Wiley Periodicals, Inc.

  13. Six sigma tools for a patient safety-oriented, quality-checklist driven radiation medicine department.

    PubMed

    Kapur, Ajay; Potters, Louis

    2012-01-01

    The purpose of this work was to develop and implement six sigma practices toward the enhancement of patient safety in an electronic, quality checklist-driven, multicenter, paperless radiation medicine department. A quality checklist process map (QPM), stratified into consultation through treatment-completion stages was incorporated into an oncology information systems platform. A cross-functional quality management team conducted quality-function-deployment and define-measure-analyze-improve-control (DMAIC) six sigma exercises with a focus on patient safety. QPM procedures were Pareto-sorted in order of decreasing patient safety risk with failure mode and effects analysis (FMEA). Quantitative metrics for a grouped set of highest risk procedures were established. These included procedural delays, associated standard deviations and six sigma Z scores. Baseline performance of the QPM was established over the previous year of usage. Data-driven analysis led to simplification, standardization, and refinement of the QPM with standard deviation, slip-day reduction, and Z-score enhancement goals. A no-fly policy (NFP) for patient safety was introduced at the improve-control DMAIC phase, with a process map interlock imposed on treatment initiation in the event of FMEA-identified high-risk tasks being delayed or not completed. The NFP was introduced in a pilot phase with specific stopping rules and the same metrics used for performance assessments. A custom root-cause analysis database was deployed to monitor patient safety events. Relative to the baseline period, average slip days and standard deviations for the risk-enhanced QPM procedures improved by over threefold factors in the NFP period. The Z scores improved by approximately 20%. A trend for proactive delays instead of reactive hard stops was observed with no adverse effects of the NFP. The number of computed potential no-fly delays per month dropped from 60 to 20 over a total of 520 cases. The fraction of computed potential no-fly cases that were delayed in NFP compliance rose from 28% to 45%. Proactive delays rose to 80% of all delayed cases. For potential no-fly cases, event reporting rose from 18% to 50%, while for actually delayed cases, event reporting rose from 65% to 100%. With complex technologies, resource-compromised staff, and pressures to hasten treatment initiation, the use of the six sigma driven process interlocks may mitigate potential patient safety risks as demonstrated in this study. Copyright © 2012 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
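
    As a hedged illustration of the sigma Z-score metric described in this abstract, the Python sketch below computes a process sigma level from slip-day samples against an upper specification limit; the USL and the slip-day values are invented placeholders, not the department's data.

    import numpy as np

    def process_z(slip_days, usl):
        """Distance of the upper specification limit from the mean, in standard deviation units."""
        slip_days = np.asarray(slip_days, dtype=float)
        return (usl - slip_days.mean()) / slip_days.std(ddof=1)

    baseline_slip_days = [2.0, 5.0, 1.0, 7.0, 3.0, 6.0, 4.0, 8.0]   # hypothetical
    improved_slip_days = [1.0, 2.0, 1.0, 3.0, 2.0, 1.0, 2.0, 2.0]   # hypothetical
    usl = 10.0                                                      # allowed slip before escalation
    print(f"baseline Z = {process_z(baseline_slip_days, usl):.2f}")
    print(f"improved Z = {process_z(improved_slip_days, usl):.2f}")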

  14. Unsteady Fast Random Particle Mesh method for efficient prediction of tonal and broadband noises of a centrifugal fan unit

    NASA Astrophysics Data System (ADS)

    Heo, Seung; Cheong, Cheolung; Kim, Taehoon

    2015-09-01

    In this study, an efficient numerical method is proposed for predicting the tonal and broadband noise of a centrifugal fan unit. The proposed method is based on Hybrid Computational Aero-Acoustic (H-CAA) techniques combined with the Unsteady Fast Random Particle Mesh (U-FRPM) method. The U-FRPM method is developed by extending the FRPM method proposed by Ewert et al. and is utilized to synthesize the turbulent flow field from unsteady RANS solutions. The H-CAA technique combined with the U-FRPM method is applied to predict the broadband as well as tonal noise of a centrifugal fan unit in a household refrigerator. First, the unsteady flow field driven by the rotating fan is computed by solving the RANS equations with Computational Fluid Dynamics (CFD) techniques. The main source regions around the rotating fan are identified by examining the computed flow fields. Then, turbulent flow fields in the main source regions are synthesized by applying the U-FRPM method. The acoustic analogy is applied to model acoustic sources in the main source regions. Finally, the centrifugal fan noise is predicted by feeding the modeled acoustic sources into an acoustic solver based on the Boundary Element Method (BEM). The sound spectral levels predicted with the current numerical method show good agreement with the measured spectra at the Blade Pass Frequencies (BPFs) as well as in the high-frequency range. Moreover, the present method enables a quantitative assessment of the relative contributions of the identified source regions to the sound field by comparing the predicted sound pressure spectra due to the modeled sources.

  15. User-Driven Sampling Strategies in Image Exploitation

    DOE PAGES

    Harvey, Neal R.; Porter, Reid B.

    2013-12-23

    Visual analytics and interactive machine learning both try to leverage the complementary strengths of humans and machines to solve complex data exploitation tasks. These fields overlap most significantly when training is involved: the visualization or machine learning tool improves over time by exploiting observations of the human-computer interaction. This paper focuses on one aspect of the human-computer interaction that we call user-driven sampling strategies. Unlike relevance feedback and active learning sampling strategies, where the computer selects which data to label at each iteration, we investigate situations where the user selects which data are to be labeled at each iteration. User-driven sampling strategies can emerge in many visual analytics applications, but they have not been fully developed in machine learning. We discovered that user-driven sampling strategies suggest new theoretical and practical research questions for both visualization science and machine learning. In this paper we identify and quantify the potential benefits of these strategies in a practical image analysis application. We find that user-driven sampling strategies can sometimes provide significant performance gains by steering tools towards local minima that have lower error than tools trained with all of the data. Furthermore, in preliminary experiments we find these performance gains are particularly pronounced when the user is experienced with the tool and application domain.
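
    The distinction drawn above between computer-selected and user-selected samples can be sketched in a few lines of Python; the loop below is a generic toy (scikit-learn logistic regression on synthetic data), not the authors' image-analysis tool, and the user's choice is stood in for by a random pick.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 2))
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
    labeled = list(rng.choice(len(X), size=10, replace=False))

    def machine_driven(model, pool_idx):
        """Active-learning style: pick the pool sample whose predicted probability is closest to 0.5."""
        proba = model.predict_proba(X[pool_idx])[:, 1]
        return int(pool_idx[np.argmin(np.abs(proba - 0.5))])

    def user_driven(pool_idx):
        """Stand-in for a human choice, e.g. a region of the image the analyst cares about."""
        return int(rng.choice(pool_idx))

    for _ in range(20):
        model = LogisticRegression().fit(X[labeled], y[labeled])
        pool = np.setdiff1d(np.arange(len(X)), labeled)
        labeled.append(machine_driven(model, pool))   # swap in user_driven(pool) to compare
    print("accuracy on all data:", round(model.score(X, y), 3))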

  16. Assessing attitudes toward computers and the use of Internet resources among undergraduate microbiology students

    NASA Astrophysics Data System (ADS)

    Anderson, Delia Marie Castro

    Computer literacy and use have become commonplace in our colleges and universities. In an environment that demands the use of technology, educators should be knowledgeable of the components that make up the overall computer attitude of students and be willing to investigate the processes and techniques of effective teaching and learning that can take place with computer technology. The purpose of this study is twofold. First, it investigates the relationship between computer attitudes and gender, ethnicity, and computer experience. Second, it addresses the question of whether, and to what extent, students' attitudes toward computers change over a 16-week period in an undergraduate microbiology course that supplements the traditional lecture with computer-driven assignments. Multiple regression analyses, using data from the Computer Attitudes Scale (Loyd & Loyd, 1985), showed that, in the experimental group, no significant relationships were found between computer anxiety and gender or ethnicity or between computer confidence and gender or ethnicity. However, students who used computers the longest (p = .001) and who were self-taught (p = .046) had the lowest computer anxiety levels. Likewise, students who used computers the longest (p = .001) and who were self-taught (p = .041) had the highest confidence levels. No significant relationships between computer liking, usefulness, or the use of Internet resources and gender, ethnicity, or computer experience were found. Dependent t-tests were performed to determine whether computer attitude scores (pretest and posttest) increased over a 16-week period for students who had been exposed to computer-driven assignments and other Internet resources. Results showed that students in the experimental group were less anxious about working with computers and considered computers to be more useful. In the control group, no significant changes in computer anxiety, confidence, liking, or usefulness were noted. Overall, students in the experimental group, who responded to the use of Internet Resources Survey, were positive (mean of 3.4 on the 4-point scale) toward their use of Internet resources, which included the online courseware developed by the researcher. Findings from this study suggest that (1) the digital divide with respect to gender and ethnicity may be narrowing, and (2) students who are exposed to a course that augments computer-driven courseware with traditional teaching methods appear to have less anxiety, have a clearer perception of computer usefulness, and feel that online resources enhance their learning.

  17. Telehealth: When Technology Meets Health Care

    MedlinePlus

    ... of digital information and communication technologies, such as computers and mobile devices, to access health care services ... your medical history may not be considered. The computer-driven decision-making model may not be optimal ...

  18. Administrators' Perceptions of Community College Students' Computer Literacy Skills in Beginner Courses

    ERIC Educational Resources Information Center

    Ragin, Tracey B.

    2013-01-01

    Fundamental computer skills are vital in the current technology-driven society. The purpose of this study was to investigate the development needs of students at a rural community college in the Southeast who lacked the computer literacy skills required in a basic computer course. Guided by Greenwood's pragmatic approach as a reformative force in…

  19. Dense, Efficient Chip-to-Chip Communication at the Extremes of Computing

    ERIC Educational Resources Information Center

    Loh, Matthew

    2013-01-01

    The scalability of CMOS technology has driven computation into a diverse range of applications across the power consumption, performance and size spectra. Communication is a necessary adjunct to computation, and whether this is to push data from node-to-node in a high-performance computing cluster or from the receiver of wireless link to a neural…

  20. Systematic development of a text-driven and a video-driven web-based computer-tailored obesity prevention intervention

    PubMed Central

    2013-01-01

    Background This paper describes the systematic development of a text-driven and a video-driven web-based computer-tailored intervention aimed at preventing obesity among normal weight and overweight adults. We hypothesize that the video-driven intervention will be more effective and appealing for individuals with a low level of education. Methods and Design The Intervention Mapping protocol was used to develop the interventions, which have exactly the same educational content but differ in the format in which the information is delivered. One intervention is fully text-based, while in the other intervention, in addition to text-based feedback, the core messages are provided by means of videos. The aim of the interventions is to prevent weight gain or achieve modest weight loss by making small changes in dietary intake or physical activity. The content of the interventions is based on the I-Change Model and self-regulation theories and includes behavior change methods such as consciousness raising, tailored feedback on behavior and cognitions, goal setting, action and coping planning, and evaluation of goal pursuit. The interventions consist of six sessions. In the first two sessions, participants will set weight and behavioral change goals and form plans for specific actions to achieve the desired goals. In the remaining four sessions, participants will evaluate their progress toward achievement of the behavioral and weight goals. They will also receive personalized feedback on how to deal with difficulties they may encounter, including the opportunity to make coping plans and the possibility to learn from the experiences of others. The efficacy and appreciation of the interventions will be examined by means of a three-group randomized controlled trial using a waiting list control group. Measurements will take place at baseline and six and twelve months after baseline. Primary outcome measures are body mass index, physical activity, and dietary intake. Discussion The present paper provides insight into how web-based computer-tailored obesity prevention interventions consisting of self-regulation concepts and text-driven and video-driven messages can be developed systematically. The evaluation of the interventions will provide insight into their efficacy and will result in recommendations for future web-based computer-tailored interventions and the additional value of using video tailoring. Trial registration NTR3501. PMID:24138937

  1. Quantitative theory of driven nonlinear brain dynamics.

    PubMed

    Roberts, J A; Robinson, P A

    2012-09-01

    Strong periodic stimuli such as bright flashing lights evoke nonlinear responses in the brain and interact nonlinearly with ongoing cortical activity, but the underlying mechanisms for these phenomena are poorly understood at present. The dominant features of these experimentally observed dynamics are reproduced by the dynamics of a quantitative neural field model subject to periodic drive. Model power spectra over a range of drive frequencies show agreement with multiple features of experimental measurements, exhibiting nonlinear effects including entrainment over a range of frequencies around the natural alpha frequency f(α), subharmonic entrainment near 2f(α), and harmonic generation. Further analysis of the driven dynamics as a function of the drive parameters reveals rich nonlinear dynamics that is predicted to be observable in future experiments at high drive amplitude, including period doubling, bistable phase-locking, hysteresis, wave mixing, and chaos indicated by positive Lyapunov exponents. Moreover, photosensitive seizures are predicted for physiologically realistic model parameters yielding bistability between healthy and seizure dynamics. These results demonstrate the applicability of neural field models to the new regime of periodically driven nonlinear dynamics, enabling interpretation of experimental data in terms of specific generating mechanisms and providing new tests of the theory. Copyright © 2012 Elsevier Inc. All rights reserved.

  2. Use of mechanistic simulations as a quantitative risk-ranking tool within the quality by design framework.

    PubMed

    Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G

    2014-11-20

    The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on the computer simulation data, the process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow, leading to an objective and quantitative risk assessment. Copyright © 2014. Published by Elsevier B.V.
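
    The coefficient of variation named above as the quantitative reassessment factor is straightforward to compute; the following minimal Python sketch ranks two hypothetical process-parameter settings by the CV of simulated per-tablet coating mass (all numbers are invented placeholders, not results of the paper's pan-coater simulation).

    import numpy as np

    def coating_cv(coating_mass_per_tablet):
        """Coefficient of variation = standard deviation / mean of the per-tablet coating mass."""
        m = np.asarray(coating_mass_per_tablet, dtype=float)
        return m.std(ddof=1) / m.mean()

    settings = {
        "setting A (hypothetical)": np.random.default_rng(1).normal(5.0, 0.20, 500),  # mg
        "setting B (hypothetical)": np.random.default_rng(2).normal(5.0, 0.45, 500),  # mg
    }
    # Higher CV -> poorer coating mass uniformity -> higher risk priority
    for name, masses in sorted(settings.items(), key=lambda kv: coating_cv(kv[1]), reverse=True):
        print(f"{name}: CV = {coating_cv(masses):.3f}")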

  3. Compiler-Driven Performance Optimization and Tuning for Multicore Architectures

    DTIC Science & Technology

    2015-04-10

    develop a powerful system for auto-tuning of library routines and compute-intensive kernels, driven by the Pluto system for multicores that we are developing. The work here is motivated by recent advances in two major areas of...automatic C-to-CUDA code generator using a polyhedral compiler transformation framework. We have used and adapted PLUTO (our state-of-the-art tool…

  4. Synchronization of autonomous objects in discrete event simulation

    NASA Technical Reports Server (NTRS)

    Rogers, Ralph V.

    1990-01-01

    Autonomous objects in event-driven discrete event simulation offer the potential to combine the freedom of unrestricted movement and positional accuracy through Euclidean space of time-driven models with the computational efficiency of event-driven simulation. The principal challenge to autonomous object implementation is object synchronization. The concept of a spatial blackboard is offered as a potential methodology for synchronization. The issues facing implementation of a spatial blackboard are outlined and discussed.

  5. Task-Driven Optimization of Fluence Field and Regularization for Model-Based Iterative Reconstruction in Computed Tomography.

    PubMed

    Gang, Grace J; Siewerdsen, Jeffrey H; Stayman, J Webster

    2017-12-01

    This paper presents a joint optimization of dynamic fluence field modulation (FFM) and regularization in quadratic penalized-likelihood reconstruction that maximizes a task-based imaging performance metric. We adopted a task-driven imaging framework for prospective design of the imaging parameters. A maxi-min objective function was adopted to maximize the minimum detectability index throughout the image. The optimization algorithm alternates between FFM (represented by low-dimensional basis functions) and local regularization (including the regularization strength and directional penalty weights). The task-driven approach was compared with three FFM strategies commonly proposed for FBP reconstruction (as well as a task-driven TCM strategy) for a discrimination task in an abdomen phantom. The task-driven FFM assigned more fluence to less attenuating anteroposterior views and yielded approximately constant fluence behind the object. The optimal regularization was almost uniform throughout the image. Furthermore, the task-driven FFM strategy redistributes fluence across detector elements in order to prescribe more fluence to the more attenuating central region of the phantom. Compared with all other strategies, the task-driven FFM strategy not only improved the minimum detectability index by at least 17.8%, but also yielded higher detectability over a large area inside the object. The optimal FFM was highly dependent on the amount of regularization, indicating the importance of a joint optimization. Sample reconstructions of simulated data generally support the performance estimates based on the computed detectability index. The improvements in detectability show the potential of the task-driven imaging framework to improve imaging performance at a fixed dose or, equivalently, to provide a similar level of performance at reduced dose.
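
    The alternating maxi-min scheme described above can be caricatured in a short Python sketch: alternate between a fluence variable and a regularization strength so as to maximize the minimum of a detectability surrogate over several image locations. The surrogate function, the bounds, and the omission of a dose constraint are all invented simplifications, not the paper's model.

    import numpy as np
    from scipy.optimize import minimize_scalar

    locations = np.linspace(0.0, 1.0, 5)

    def detectability(fluence, beta, loc):
        # Placeholder surrogate: more fluence helps; regularization has a location-dependent optimum.
        return np.sqrt(fluence) * np.exp(-(np.log10(beta) - (1.0 + loc)) ** 2)

    def min_detectability(fluence, beta):
        return min(detectability(fluence, beta, loc) for loc in locations)

    fluence, beta = 1.0, 10.0
    for _ in range(10):   # alternate the two parameter groups (maxi-min via negated minimization)
        beta = minimize_scalar(lambda b: -min_detectability(fluence, b),
                               bounds=(1.0, 1e4), method="bounded").x
        fluence = minimize_scalar(lambda f: -min_detectability(f, beta),
                                  bounds=(0.1, 2.0), method="bounded").x
    print(f"min detectability = {min_detectability(fluence, beta):.3f} "
          f"(fluence = {fluence:.2f}, beta = {beta:.1f})")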

  6. The Increasing Effects of Computers on Education.

    ERIC Educational Resources Information Center

    Gannon, John F.

    Predicting that the teaching-learning process in American higher education is about to change drastically because of continuing innovations in computer-assisted technology, this paper argues that this change will be driven by inexpensive but powerful computer technology, and that it will manifest itself by reducing the traditional timing of…

  7. Quo Vadimus? The 21st Century and Multimedia.

    ERIC Educational Resources Information Center

    Kuhn, Allan D.

    This paper relates the concept of computer-driven multimedia to the National Aeronautics and Space Administration (NASA) Scientific and Technical Information Program (STIP). Multimedia is defined here as computer integration and output of text, animation, audio, video, and graphics. Multimedia is the stage of computer-based information that allows…

  8. "Intelligent" Computer Assisted Instruction (CAI) Applications. Interim Report.

    ERIC Educational Resources Information Center

    Brown, John Seely; And Others

    Interim work is documented describing efforts to modify computer techniques used to recognize and process English language requests to an instructional simulator. The conversion from a hand-coded to a table-driven technique is described in detail. Other modifications to a simulation-based computer-assisted instruction program to allow a gaming…

  9. Quantitative Prediction of Computational Quality (so the S and C Folks will Accept it)

    NASA Technical Reports Server (NTRS)

    Hemsch, Michael J.; Luckring, James M.; Morrison, Joseph H.

    2004-01-01

    Our choice of title may seem strange but we mean each word. In this talk, we are not going to be concerned with computations made "after the fact", i.e. those for which data are available and which are being conducted for explanation and insight. Here we are interested in preventing S&C design problems by finding them through computation before data are available. For such a computation to have any credibility with those who absorb the risk, it is necessary to quantitatively PREDICT the quality of the computational results.

  10. Diagnostic accuracy of semi-automatic quantitative metrics as an alternative to expert reading of CT myocardial perfusion in the CORE320 study.

    PubMed

    Ostovaneh, Mohammad R; Vavere, Andrea L; Mehra, Vishal C; Kofoed, Klaus F; Matheson, Matthew B; Arbab-Zadeh, Armin; Fujisawa, Yasuko; Schuijf, Joanne D; Rochitte, Carlos E; Scholte, Arthur J; Kitagawa, Kakuya; Dewey, Marc; Cox, Christopher; DiCarli, Marcelo F; George, Richard T; Lima, Joao A C

    To determine the diagnostic accuracy of semi-automatic quantitative metrics compared to expert reading for interpretation of computed tomography perfusion (CTP) imaging. The CORE320 multicenter diagnostic accuracy clinical study enrolled patients between 45 and 85 years of age who were clinically referred for invasive coronary angiography (ICA). Computed tomography angiography (CTA), CTP, single photon emission computed tomography (SPECT), and ICA images were interpreted manually in blinded core laboratories by two experienced readers. Additionally, eight quantitative CTP metrics, as continuous values, were computed semi-automatically from myocardial and blood attenuation and were combined using logistic regression to derive a final quantitative CTP metric score. For the reference standard, hemodynamically significant coronary artery disease (CAD) was defined as a quantitative ICA stenosis of 50% or greater and a corresponding perfusion defect by SPECT. Diagnostic accuracy was determined by the area under the receiver operating characteristic curve (AUC). Of the 377 included patients, 66% were male, the median age was 62 (IQR: 56, 68) years, and 27% had prior myocardial infarction. In the patient-based analysis, the AUC (95% CI) for combined CTA-CTP expert reading and combined CTA-CTP semi-automatic quantitative metrics was 0.87 (0.84-0.91) and 0.86 (0.83-0.90), respectively. In the vessel-based analyses, the AUCs were 0.85 (0.82-0.88) and 0.84 (0.81-0.87), respectively. No significant difference in AUC was found between combined CTA-CTP expert reading and CTA-CTP semi-automatic quantitative metrics in patient-based or vessel-based analyses (p > 0.05 for all). Combined CTA-CTP semi-automatic quantitative metrics are as accurate as CTA-CTP expert reading for detecting hemodynamically significant CAD. Copyright © 2018 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.
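
    As a hedged sketch of the semi-automatic scoring step described above (combining several continuous perfusion metrics with logistic regression and reporting an AUC), the Python snippet below uses scikit-learn on simulated data; the eight features, labels, and resulting AUC are placeholders and bear no relation to the CORE320 measurements.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_patients = 377                              # cohort size quoted above; the data are simulated
    X = rng.normal(size=(n_patients, 8))          # eight quantitative CTP metrics (placeholders)
    y = (X @ rng.normal(size=8) + rng.normal(scale=2.0, size=n_patients) > 0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    score = LogisticRegression(max_iter=1000).fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    print(f"AUC = {roc_auc_score(y_te, score):.2f}")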

  11. Economic and environmental impacts of alternative transportation technologies.

    DOT National Transportation Integrated Search

    2013-04-01

    This project has focused on comparing alternative transportation technologies in terms of their environmental and economic impacts. The research is data-driven and quantitative, and examines the dynamics of impact. We have developed new theory an...

  12. Quantifying Ubiquitin Signaling

    PubMed Central

    Ordureau, Alban; Münch, Christian; Harper, J. Wade

    2015-01-01

    Ubiquitin (UB)-driven signaling systems permeate biology, and are often integrated with other types of post-translational modifications (PTMs), most notably phosphorylation. Flux through such pathways is typically dictated by the fractional stoichiometry of distinct regulatory modifications and protein assemblies as well as the spatial organization of pathway components. Yet, we rarely understand the dynamics and stoichiometry of rate-limiting intermediates along a reaction trajectory. Here, we review how quantitative proteomic tools and enrichment strategies are being used to quantify UB-dependent signaling systems, and to integrate UB signaling with regulatory phosphorylation events. A key regulatory feature of ubiquitylation is that the identity of UB chain linkage types can control downstream processes. We also describe how proteomic and enzymological tools can be used to identify and quantify UB chain synthesis and linkage preferences. The emergence of sophisticated quantitative proteomic approaches will set a new standard for elucidating biochemical mechanisms of UB-driven signaling systems. PMID:26000850

  13. Quantitative Accelerated Life Testing of MEMS Accelerometers

    PubMed Central

    Bâzu, Marius; Gălăţeanu, Lucian; Ilian, Virgil Emil; Loicq, Jerome; Habraken, Serge; Collette, Jean-Paul

    2007-01-01

    Quantitative Accelerated Life Testing (QALT) is a solution for assessing the reliability of Micro Electro Mechanical Systems (MEMS). A procedure for QALT is shown in this paper and an attempt to assess the reliability level for a batch of MEMS accelerometers is reported. The testing plan is application-driven and contains combined tests: thermal (high temperature) and mechanical stress. Two variants of mechanical stress are used: vibration (at a fixed frequency) and tilting. Original equipment for testing at tilting and high temperature is used. Tilting is appropriate as application-driven stress, because the tilt movement is a natural environment for devices used for automotive and aerospace applications. Also, tilting is used by MEMS accelerometers for anti-theft systems. The test results demonstrated the excellent reliability of the studied devices, the failure rate in the “worst case” being smaller than 10⁻⁷ h⁻¹. PMID:28903265
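
    A common way to arrive at a "worst case" failure-rate bound of the kind quoted above is the chi-square upper confidence limit on accelerated-test data combined with an Arrhenius acceleration factor; the Python sketch below shows that textbook calculation with entirely hypothetical device counts, test times, temperatures, and activation energy (it is not the authors' procedure or data).

    import numpy as np
    from scipy.stats import chi2

    def arrhenius_af(ea_ev, t_use_c, t_stress_c):
        """Acceleration factor between stress and use temperatures (Arrhenius model)."""
        k = 8.617e-5  # Boltzmann constant, eV/K
        return np.exp(ea_ev / k * (1.0 / (t_use_c + 273.15) - 1.0 / (t_stress_c + 273.15)))

    def failure_rate_upper(n_devices, t_stress_h, failures, af, conf=0.60):
        """Chi-square upper confidence bound on the use-level failure rate (per hour)."""
        equivalent_use_hours = n_devices * t_stress_h * af
        return chi2.ppf(conf, 2 * failures + 2) / (2.0 * equivalent_use_hours)

    af = arrhenius_af(ea_ev=0.7, t_use_c=40.0, t_stress_c=125.0)
    print(f"acceleration factor ~ {af:.0f}")
    print(f"upper-bound failure rate ~ {failure_rate_upper(20, 1000.0, 0, af):.1e} per hour")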

  14. Metabolic network reconstruction of Chlamydomonas offers insight into light-driven algal metabolism

    PubMed Central

    Chang, Roger L; Ghamsari, Lila; Manichaikul, Ani; Hom, Erik F Y; Balaji, Santhanam; Fu, Weiqi; Shen, Yun; Hao, Tong; Palsson, Bernhard Ø; Salehi-Ashtiani, Kourosh; Papin, Jason A

    2011-01-01

    Metabolic network reconstruction encompasses existing knowledge about an organism's metabolism and genome annotation, providing a platform for omics data analysis and phenotype prediction. The model alga Chlamydomonas reinhardtii is employed to study diverse biological processes from photosynthesis to phototaxis. Recent heightened interest in this species results from an international movement to develop algal biofuels. Integrating biological and optical data, we reconstructed a genome-scale metabolic network for this alga and devised a novel light-modeling approach that enables quantitative growth prediction for a given light source, resolving wavelength and photon flux. We experimentally verified transcripts accounted for in the network and physiologically validated model function through simulation and generation of new experimental growth data, providing high confidence in network contents and predictive applications. The network offers insight into algal metabolism and potential for genetic engineering and efficient light source design, a pioneering resource for studying light-driven metabolism and quantitative systems biology. PMID:21811229

  15. Floquet spectrum and driven conductance in Dirac materials: Effects of Landau-Zener-Stückelberg-Majorana interferometry

    NASA Astrophysics Data System (ADS)

    Rodionov, Ya. I.; Kugel, K. I.; Nori, Franco

    2016-11-01

    Using the Landau-Zener-Stückelberg-Majorana-type (LZSM) semiclassical approach, we study both graphene and a thin film of a Weyl semimetal subjected to a strong ac electromagnetic field. The spectrum of quasienergies in the Weyl semimetal turns out to be similar to that of a graphene sheet. It has been predicted qualitatively that the transport properties of strongly irradiated graphene oscillate as a function of the radiation intensity [S. V. Syzranov et al., Phys. Rev. B 88, 241112 (2013)], 10.1103/PhysRevB.88.241112. Here we obtain rigorous quantitative results for a driven linear conductance of graphene and a thin film of a Weyl semimetal. The exact quantitative structure of oscillations exhibits two contributions. The first one is a manifestation of the Ramsauer-Townsend effect, while the second contribution is a consequence of the LZSM interference defining the spectrum of quasienergies.

  16. Atoll island hydrogeology: flow and freshwater occurrence in a tidally dominated system

    NASA Astrophysics Data System (ADS)

    Oberdorfer, June A.; Hogan, Patrick J.; Buddemeier, Robert W.

    1990-12-01

    A layered-aquifer model of groundwater occurrence in an atoll island was tested with a solute-transport numerical model. The computer model used, SUTRA, incorporates density-dependent flow. This can be significant in freshwater-saltwater interactions associated with the freshwater lens of an atoll island. Boundary conditions for the model included ocean and lagoon tidal variations. The model was calibrated to field data from Enjebi Island, Enewetak Atoll, and tested for sensitivity to a variety of parameters. This resulted in a hydraulic conductivity of 10 m day⁻¹ for the surficial aquifer and 1000 m day⁻¹ for the deeper aquifer; this combination of values gave an excellent reproduction of the tidal response data from test wells. The average salinity distribution was closely reproduced using a dispersivity of 0.02 m. The computer simulation quantitatively supports the layered-aquifer model, including under conditions of density-dependent flow, and shows that tidal variations are the predominant driving force for flow beneath the island. The oscillating, vertical flow produced by the tidal variations creates an extensive mixing zone of brackish water. The layered-aquifer model with tidally driven flow is a significant improvement over the Ghyben-Herzberg-Dupuit model as it is conventionally applied to groundwater studies for many Pacific reef islands.

  17. Computational study of the inhibitory mechanism of the kinase CDK5 hyperactivity by peptide p5 and derivation of a pharmacophore

    NASA Astrophysics Data System (ADS)

    Cardone, A.; Brady, M.; Sriram, R.; Pant, H. C.; Hassan, S. A.

    2016-06-01

    The hyperactivity of the cyclin-dependent kinase 5 (CDK5) induced by the activator protein p25 has been linked to a number of pathologies of the brain. The CDK5-p25 complex has thus emerged as a major therapeutic target for Alzheimer's disease (AD) and other neurodegenerative conditions. Experiments have shown that the peptide p5 reduces the CDK5-p25 activity without affecting the endogenous CDK5-p35 activity, whereas the peptide TFP5, obtained from p5, elicits similar inhibition, crosses the blood-brain barrier, and exhibits behavioral rescue of AD mice models with no toxic side effects. The molecular basis of the kinase inhibition is not currently known, and is here investigated by computer simulations. It is shown that p5 binds the kinase at the same CDK5/p25 and CDK5/p35 interfaces, and is thus a non-selective competitor of both activators, in agreement with available experimental data in vitro. Binding of p5 is enthalpically driven with an affinity estimated in the low µM range. A quantitative description of the binding site and pharmacophore is presented, and options are discussed to increase the binding affinity and selectivity in the design of drug-like compounds against AD.
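
    As a small worked example of what an affinity "in the low µM range" implies thermodynamically, the sketch below converts a dissociation constant into a standard binding free energy via ΔG = RT ln(Kd); the Kd values are arbitrary round numbers, and this textbook relation is not part of the simulation protocol above.

      # Back-of-the-envelope conversion from dissociation constant to binding free energy.
      import math

      R = 1.987e-3      # kcal / (mol K)
      T = 298.15        # K
      for Kd in (1e-6, 5e-6, 10e-6):                     # mol/L, illustrative values
          dG = R * T * math.log(Kd)                      # negative for favorable binding
          print(f"Kd = {Kd * 1e6:4.0f} uM  ->  dG_bind = {dG:6.2f} kcal/mol")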

  18. Full cell simulation and the evaluation of the buffer system on air-cathode microbial fuel cell

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ou, Shiqi; Kashima, Hiroyuki; Aaron, Douglas S.

    This paper presents a computational model of a single chamber, air-cathode MFC. The model considers losses due to mass transport, as well as biological and electrochemical reactions, in both the anode and cathode half-cells. Computational fluid dynamics and Monod-Nernst analysis are incorporated into the reactions for the anode biofilm and cathode Pt catalyst and biofilm. The integrated model provides a macro-perspective of the interrelation between the anode and cathode during power production, while incorporating microscale contributions of mass transport within the anode and cathode layers. Model considerations include the effects of pH (H⁺/OH⁻ transport) and electric field-driven migration on concentration overpotential, effects of various buffers and various amounts of buffer on the pH in the whole reactor, and overall impacts on the power output of the MFC. The simulation results fit the experimental polarization and power density curves well. Further, this model provides insight regarding mass transport at varying current density regimes and quantitative delineation of overpotentials at the anode and cathode. Altogether, this comprehensive simulation is designed to accurately predict MFC performance based on fundamental fluid and kinetic relations and guide optimization of the MFC system.
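
    The Monod-Nernst coupling mentioned above is often written as a double-limitation rate law in which a Monod term for the electron donor multiplies a Nernst-type dependence on local electrode potential; the sketch below evaluates that generic form with illustrative parameter values and should not be read as the paper's full CFD model or its fitted constants.

      # Sketch of a Nernst-Monod (Monod-Nernst) rate law for an anode-respiring biofilm.
      import numpy as np

      F = 96485.0        # C/mol
      R = 8.314          # J/(mol K)
      T = 303.0          # K

      def anode_current_density(S, E, j_max=5.0, K_S=0.05, E_KA=-0.45):
          """j in A/m^2; S = electron-donor concentration (illustrative units),
          E = local anode potential (V), E_KA = potential at half-maximum rate (V)."""
          monod = S / (K_S + S)
          nernst = 1.0 / (1.0 + np.exp(-F * (E - E_KA) / (R * T)))
          return j_max * monod * nernst

      for E in (-0.50, -0.45, -0.40, -0.30):
          print(f"E = {E:+.2f} V  ->  j = {anode_current_density(1.0, E):.2f} A/m^2")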

  19. Full cell simulation and the evaluation of the buffer system on air-cathode microbial fuel cell

    DOE PAGES

    Ou, Shiqi; Kashima, Hiroyuki; Aaron, Douglas S.; ...

    2017-02-23

    This paper presents a computational model of a single chamber, air-cathode MFC. The model considers losses due to mass transport, as well as biological and electrochemical reactions, in both the anode and cathode half-cells. Computational fluid dynamics and Monod-Nernst analysis are incorporated into the reactions for the anode biofilm and cathode Pt catalyst and biofilm. The integrated model provides a macro-perspective of the interrelation between the anode and cathode during power production, while incorporating microscale contributions of mass transport within the anode and cathode layers. Model considerations include the effects of pH (H⁺/OH⁻ transport) and electric field-driven migration on concentration overpotential, effects of various buffers and various amounts of buffer on the pH in the whole reactor, and overall impacts on the power output of the MFC. The simulation results fit the experimental polarization and power density curves well. Further, this model provides insight regarding mass transport at varying current density regimes and quantitative delineation of overpotentials at the anode and cathode. Altogether, this comprehensive simulation is designed to accurately predict MFC performance based on fundamental fluid and kinetic relations and guide optimization of the MFC system.

  20. Outer Membrane Protein Folding and Topology from a Computational Transfer Free Energy Scale.

    PubMed

    Lin, Meishan; Gessmann, Dennis; Naveed, Hammad; Liang, Jie

    2016-03-02

    Knowledge of the transfer free energy of amino acids from aqueous solution to a lipid bilayer is essential for understanding membrane protein folding and for predicting membrane protein structure. Here we report a computational approach that can calculate the folding free energy of the transmembrane region of outer membrane β-barrel proteins (OMPs) by combining an empirical energy function with a reduced discrete state space model. We quantitatively analyzed the transfer free energies of 20 amino acid residues at the center of the lipid bilayer of OmpLA. Our results are in excellent agreement with the experimentally derived hydrophobicity scales. We further exhaustively calculated the transfer free energies of 20 amino acids at all positions in the TM region of OmpLA. We found that the asymmetry of the Gram-negative bacterial outer membrane as well as the TM residues of an OMP determine its functional fold in vivo. Our results suggest that the folding process of an OMP is driven by the lipid-facing residues in its hydrophobic core, and its NC-IN topology is determined by the differential stabilities of OMPs in the asymmetrical outer membrane. The folding free energy is further reduced by lipid A and assisted by general depth-dependent cooperativities that exist between polar and ionizable residues. Moreover, the context dependency of transfer free energies at specific positions in OmpLA predicts regions important for protein function as well as structural anomalies. Our computational approach is fast, efficient, and applicable to any OMP.

  1. A unified account of perceptual layering and surface appearance in terms of gamut relativity.

    PubMed

    Vladusich, Tony; McDonnell, Mark D

    2014-01-01

    When we look at the world--or a graphical depiction of the world--we perceive surface materials (e.g. a ceramic black and white checkerboard) independently of variations in illumination (e.g. shading or shadow) and atmospheric media (e.g. clouds or smoke). Such percepts are partly based on the way physical surfaces and media reflect and transmit light and partly on the way the human visual system processes the complex patterns of light reaching the eye. One way to understand how these percepts arise is to assume that the visual system parses patterns of light into layered perceptual representations of surfaces, illumination and atmospheric media, one seen through another. Despite a great deal of previous experimental and modelling work on layered representation, however, a unified computational model of key perceptual demonstrations is still lacking. Here we present the first general computational model of perceptual layering and surface appearance--based on a broader theoretical framework called gamut relativity--that is consistent with these demonstrations. The model (a) qualitatively explains striking effects of perceptual transparency, figure-ground separation and lightness, (b) quantitatively accounts for the role of stimulus- and task-driven constraints on perceptual matching performance, and (c) unifies two prominent theoretical frameworks for understanding surface appearance. The model thereby provides novel insights into the remarkable capacity of the human visual system to represent and identify surface materials, illumination and atmospheric media, which can be exploited in computer graphics applications.

  2. A Unified Account of Perceptual Layering and Surface Appearance in Terms of Gamut Relativity

    PubMed Central

    Vladusich, Tony; McDonnell, Mark D.

    2014-01-01

    When we look at the world—or a graphical depiction of the world—we perceive surface materials (e.g. a ceramic black and white checkerboard) independently of variations in illumination (e.g. shading or shadow) and atmospheric media (e.g. clouds or smoke). Such percepts are partly based on the way physical surfaces and media reflect and transmit light and partly on the way the human visual system processes the complex patterns of light reaching the eye. One way to understand how these percepts arise is to assume that the visual system parses patterns of light into layered perceptual representations of surfaces, illumination and atmospheric media, one seen through another. Despite a great deal of previous experimental and modelling work on layered representation, however, a unified computational model of key perceptual demonstrations is still lacking. Here we present the first general computational model of perceptual layering and surface appearance—based on a broader theoretical framework called gamut relativity—that is consistent with these demonstrations. The model (a) qualitatively explains striking effects of perceptual transparency, figure-ground separation and lightness, (b) quantitatively accounts for the role of stimulus- and task-driven constraints on perceptual matching performance, and (c) unifies two prominent theoretical frameworks for understanding surface appearance. The model thereby provides novel insights into the remarkable capacity of the human visual system to represent and identify surface materials, illumination and atmospheric media, which can be exploited in computer graphics applications. PMID:25402466

  3. DEBRIS: a computer program for analyzing channel cross sections

    Treesearch

    Patrick Deenihan; Thomas E. Lisle

    1988-01-01

    DEBRIS is a menu-driven, interactive computer program written in FORTRAN 77 for recording and plotting survey data and for computing hydraulic variables and depths of scour and fill. It was developed for use with the USDA Forest Service's Data General computer system, with the AOS/VS operating system. By using menus, the operator does not need to know any...

  4. DEBRIS: A computer program for analyzing channel cross sections

    Treesearch

    Patrick Deenihan; Thomas E. Lisle

    1988-01-01

    DEBRIS is a menu-driven, interactive computer program written in FORTRAN 77 for recording and plotting survey data and for computing hydraulic variables and depths of scour and fill. It was developed for use with the USDA Forest Service's Data General computer system, with the AOS/VS operating system. By using menus, the operator does not need to know any...

  5. Time Scale for Adiabaticity Breakdown in Driven Many-Body Systems and Orthogonality Catastrophe

    NASA Astrophysics Data System (ADS)

    Lychkovskiy, Oleg; Gamayun, Oleksandr; Cheianov, Vadim

    2017-11-01

    The adiabatic theorem is a fundamental result in quantum mechanics, which states that a system can be kept arbitrarily close to the instantaneous ground state of its Hamiltonian if the latter varies in time slowly enough. The theorem has an impressive record of applications ranging from foundations of quantum field theory to computational molecular dynamics. In light of this success it is remarkable that a practicable quantitative understanding of what "slowly enough" means is limited to a modest set of systems mostly having a small Hilbert space. Here we show how this gap can be bridged for a broad natural class of physical systems, namely, many-body systems where a small move in the parameter space induces an orthogonality catastrophe. In this class, the conditions for adiabaticity are derived from the scaling properties of the parameter-dependent ground state without a reference to the excitation spectrum. This finding constitutes a major simplification of a complex problem, which otherwise requires solving nonautonomous time evolution in a large Hilbert space.

  6. PatternLab for proteomics 4.0: A one-stop shop for analyzing shotgun proteomic data

    PubMed Central

    Carvalho, Paulo C; Lima, Diogo B; Leprevost, Felipe V; Santos, Marlon D M; Fischer, Juliana S G; Aquino, Priscila F; Moresco, James J; Yates, John R; Barbosa, Valmir C

    2017-01-01

    PatternLab for proteomics is an integrated computational environment that unifies several previously published modules for analyzing shotgun proteomic data. PatternLab contains modules for formatting sequence databases, performing peptide spectrum matching, statistically filtering and organizing shotgun proteomic data, extracting quantitative information from label-free and chemically labeled data, performing statistics for differential proteomics, displaying results in a variety of graphical formats, performing similarity-driven studies with de novo sequencing data, analyzing time-course experiments, and helping with the understanding of the biological significance of data in the light of the Gene Ontology. Here we describe PatternLab for proteomics 4.0, which closely knits together all of these modules in a self-contained environment, covering the principal aspects of proteomic data analysis as a freely available and easily installable software package. All updates to PatternLab, as well as all new features added to it, have been tested over the years on millions of mass spectra. PMID:26658470

  7. A cell-based computational model of early embryogenesis coupling mechanical behaviour and gene regulation

    NASA Astrophysics Data System (ADS)

    Delile, Julien; Herrmann, Matthieu; Peyriéras, Nadine; Doursat, René

    2017-01-01

    The study of multicellular development is grounded in two complementary domains: cell biomechanics, which examines how physical forces shape the embryo, and genetic regulation and molecular signalling, which concern how cells determine their states and behaviours. Integrating both sides into a unified framework is crucial to fully understand the self-organized dynamics of morphogenesis. Here we introduce MecaGen, an integrative modelling platform enabling the hypothesis-driven simulation of these dual processes via the coupling between mechanical and chemical variables. Our approach relies upon a minimal `cell behaviour ontology' comprising mesenchymal and epithelial cells and their associated behaviours. MecaGen enables the specification and control of complex collective movements in 3D space through a biologically relevant gene regulatory network and parameter space exploration. Three case studies investigating pattern formation, epithelial differentiation and tissue tectonics in zebrafish early embryogenesis, the latter with quantitative comparison to live imaging data, demonstrate the validity and usefulness of our framework.

  8. Convection driven zonal flows and vortices in the major planets.

    PubMed

    Busse, F. H.

    1994-06-01

    The dynamical properties of convection in rotating cylindrical annuli and spherical shells are reviewed. Simple theoretical models and experimental simulations of planetary convection through the use of the centrifugal force in the laboratory are emphasized. The model of columnar convection in a cylindrical annulus not only serves as a guide to the dynamical properties of convection in rotating spheres; it is also of interest as a basic physical system that exhibits several dynamical properties in their most simple form. The generation of zonal mean flows is discussed in some detail and examples of recent numerical computations are presented. The exploration of the parameter space for the annulus model is not yet complete and the theoretical exploration of convection in rotating spheres is still in the beginning phase. Quantitative comparisons with the observations of the dynamics of planetary atmospheres will have to await the consideration in the models of the effects of magnetic fields and the deviations from the Boussinesq approximation.

  9. Measurements on a guitar string as an example of a physical nonlinear driven oscillator

    NASA Astrophysics Data System (ADS)

    Carlà, Marcello; Straulino, Samuele

    2017-08-01

    An experimental study is described to characterize the oscillation of a guitar string around resonance. A periodic force was applied to the string, generated by the electromagnetic interaction between an alternating current flowing in the string and a magnetic field. The oscillation was studied by measuring the voltage induced in the string itself, which is proportional to the velocity. Accurate quantitative data were obtained for the velocity, both modulus and phase, with a time resolution of 3 ms, corresponding to the oscillation period. The measuring instrument was a personal computer with its sound card and an electronic amplifier, both used to generate the excitation current and record the velocity signal, while performing the required frequency sweep. The study covered an excitation force range more than two and a half decades wide (51 dB). The experimental results showed very good agreement with the theoretical behavior of a Duffing oscillator with nonlinear damping over about two decades.
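
    To illustrate the kind of model the measurements were compared against, the sketch below integrates a driven Duffing oscillator with an extra nonlinear damping term and records the steady-state amplitude over a small frequency sweep; the equation is the generic form and all coefficients are illustrative, not values fitted to a guitar string.

      # Driven Duffing oscillator with nonlinear damping: frequency-sweep sketch.
      import numpy as np
      from scipy.integrate import solve_ivp

      w0, gamma, delta_nl, beta, F = 2 * np.pi * 100.0, 2.0, 0.5, 1e6, 50.0   # illustrative

      def steady_state_amplitude(w_drive, n_transient=200, n_measure=50):
          T = 2 * np.pi / w_drive
          def rhs(t, y):
              x, v = y
              return [v, F * np.cos(w_drive * t) - 2 * gamma * v
                         - delta_nl * abs(v) * v - w0**2 * x - beta * x**3]
          # integrate through the transient, then take the peak displacement
          sol = solve_ivp(rhs, (0.0, (n_transient + n_measure) * T), [0.0, 0.0],
                          max_step=T / 40)
          t, x = sol.t, sol.y[0]
          return np.max(np.abs(x[t > n_transient * T]))

      for f in np.linspace(95, 105, 11):
          print(f"drive {f:6.2f} Hz  amplitude {steady_state_amplitude(2 * np.pi * f):.3e}")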

  10. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  11. Chinese and American Employers’ Perspectives Regarding Hiring People with Behaviorally Driven Health Conditions: The Role of Stigma

    PubMed Central

    Corrigan, Patrick W.; Tsang, Hector W. H.; Shi, Kan; Lam, Chow S.; Larson, Jon

    2010-01-01

    Work opportunities for people with behaviorally driven health conditions such as HIV/AIDS, drug abuse, alcohol abuse, and psychosis are directly impacted by employer perspectives. To investigate this issue, we report findings from a mixed method design involving qualitative interviews followed by a quantitative survey of employers from Chicago (U.S.), Beijing (China), and Hong Kong (China). Findings from qualitative interviews of 100 employers were used to create 27 items measuring employer perspectives (the Employer Perspective Scale: EPS) about hiring people with health conditions. These perspectives reflect reasons for or against discrimination. In the quantitative phase of the study, representative samples of approximately 300 employers per city were administered the EPS in addition to measures of stigma, including attributions about disease onset and offset. The EPS and stigma scales were completed in the context of one of five randomly assigned health conditions. We weighted data with ratios of key demographics between the sample and the corresponding employer population data. Analyses showed that both onset and offset responsibility varied by behaviorally driven condition. Analyses also showed that employer perspectives were more negative for health conditions that are seen as more behaviorally driven, e.g., drug and alcohol abuse. Chicago employers endorsed onset and offset attributions less strongly compared to those in Hong Kong and Beijing. Chicago employers also recognized more benefits of hiring people with various health conditions. The implications of these findings for better understanding stigma and stigma change among employers are considered. PMID:21036445

  12. Resonance-assisted decay of nondispersive wave packets.

    PubMed

    Wimberger, Sandro; Schlagheck, Peter; Eltschka, Christopher; Buchleitner, Andreas

    2006-07-28

    We present a quantitative semiclassical theory for the decay of nondispersive electronic wave packets in driven, ionizing Rydberg systems. Statistically robust quantities are extracted combining resonance-assisted tunneling with subsequent transport across chaotic phase space and a final ionization step.

  13. Computational challenges in modeling gene regulatory events.

    PubMed

    Pataskar, Abhijeet; Tiwari, Vijay K

    2016-10-19

    Cellular transcriptional programs driven by genetic and epigenetic mechanisms could be better understood by integrating "omics" data and subsequently modeling the gene-regulatory events. Toward this end, computational biology should keep pace with evolving experimental procedures and data availability. This article gives an exemplified account of the current computational challenges in molecular biology.

  14. Evolution of an Intelligent Deductive Logic Tutor Using Data-Driven Elements

    ERIC Educational Resources Information Center

    Mostafavi, Behrooz; Barnes, Tiffany

    2017-01-01

    Deductive logic is essential to a complete understanding of computer science concepts, and is thus fundamental to computer science education. Intelligent tutoring systems with individualized instruction have been shown to increase learning gains. We seek to improve the way deductive logic is taught in computer science by developing an intelligent,…

  15. Generalized likelihood ratios for quantitative diagnostic test scores.

    PubMed

    Tandberg, D; Deely, J J; O'Malley, A J

    1997-11-01

    The reduction of quantitative diagnostic test scores to the dichotomous case is a wasteful and unnecessary simplification in the era of high-speed computing. Physicians could make better use of the information embedded in quantitative test results if modern generalized curve estimation techniques were applied to the likelihood functions of Bayes' theorem. Hand calculations could be completely avoided and computed graphical summaries provided instead. Graphs showing posttest probability of disease as a function of pretest probability with confidence intervals (POD plots) would enhance acceptance of these techniques if they were immediately available at the computer terminal when test results were retrieved. Such constructs would also provide immediate feedback to physicians when a valueless test had been ordered.
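
    The idea described above can be sketched directly: smooth density estimates of the score in diseased and non-diseased groups give a score-specific likelihood ratio, which Bayes' theorem converts to a post-test probability. The data below are simulated purely for illustration and are not from any clinical test.

      # Generalized likelihood ratio for a quantitative test score (simulated data).
      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(0)
      scores_disease = rng.normal(8.0, 2.0, 400)      # hypothetical diseased patients
      scores_healthy = rng.normal(5.0, 2.0, 400)      # hypothetical non-diseased patients

      f_d = gaussian_kde(scores_disease)              # estimate of p(score | disease)
      f_h = gaussian_kde(scores_healthy)              # estimate of p(score | no disease)

      def posttest_probability(score, pretest_prob):
          lr = f_d(score)[0] / f_h(score)[0]          # generalized likelihood ratio
          pretest_odds = pretest_prob / (1.0 - pretest_prob)
          posttest_odds = pretest_odds * lr
          return posttest_odds / (1.0 + posttest_odds)

      for s in (4.0, 6.0, 9.0):
          print(f"score {s:.1f}: post-test P(disease) = "
                f"{posttest_probability(s, pretest_prob=0.20):.2f}")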

  16. The art and science of selecting graduate students in the biomedical sciences: Performance in doctoral study of the foundational sciences.

    PubMed

    Park, Hee-Young; Berkowitz, Oren; Symes, Karen; Dasgupta, Shoumita

    2018-01-01

    The goal of this study was to investigate associations between admissions criteria and performance in Ph.D. programs at Boston University School of Medicine. The initial phase of this project examined student performance in the classroom component of a newly established curriculum named "Foundations in Biomedical Sciences (FiBS)". Quantitative measures including undergraduate grade point average (GPA), graduate record examination (GRE; a standardized, computer-based test) scores for the verbal (assessment of test takers' ability to analyze, evaluate, and synthesize information and concepts provided in writing) and quantitative (assessment of test takers' problem-solving ability) components of the examination, previous research experience, and competitiveness of previous research institution were used in the study. These criteria were compared with competencies in the program defined as students who pass the curriculum as well as students categorized as High Performers. These data indicated that there is a significant positive correlation between FiBS performance and undergraduate GPA, GRE scores, and competitiveness of undergraduate institution. No significant correlations were found between FiBS performance and research background. By taking a data-driven approach to examine admissions and performance, we hope to refine our admissions criteria to facilitate an unbiased approach to recruitment of students in the life sciences and to share our strategy to support similar goals at other institutions.

  17. A Perspective on Implementing a Quantitative Systems Pharmacology Platform for Drug Discovery and the Advancement of Personalized Medicine.

    PubMed

    Stern, Andrew M; Schurdak, Mark E; Bahar, Ivet; Berg, Jeremy M; Taylor, D Lansing

    2016-07-01

    Drug candidates exhibiting well-defined pharmacokinetic and pharmacodynamic profiles that are otherwise safe often fail to demonstrate proof-of-concept in phase II and III trials. Innovation in drug discovery and development has been identified as a critical need for improving the efficiency of drug discovery, especially through collaborations between academia, government agencies, and industry. To address the innovation challenge, we describe a comprehensive, unbiased, integrated, and iterative quantitative systems pharmacology (QSP)-driven drug discovery and development strategy and platform that we have implemented at the University of Pittsburgh Drug Discovery Institute. Intrinsic to QSP is its integrated use of multiscale experimental and computational methods to identify mechanisms of disease progression and to test predicted therapeutic strategies likely to achieve clinical validation for appropriate subpopulations of patients. The QSP platform can address biological heterogeneity and anticipate the evolution of resistance mechanisms, which are major challenges for drug development. The implementation of this platform is dedicated to gaining an understanding of mechanism(s) of disease progression to enable the identification of novel therapeutic strategies as well as repurposing drugs. The QSP platform will help promote the paradigm shift from reactive population-based medicine to proactive personalized medicine by focusing on the patient as the starting and the end point. © 2016 Society for Laboratory Automation and Screening.

  18. Radiation-driven winds of hot stars. V - Wind models for central stars of planetary nebulae

    NASA Technical Reports Server (NTRS)

    Pauldrach, A.; Puls, J.; Kudritzki, R. P.; Mendez, R. H.; Heap, S. R.

    1988-01-01

    Wind models using the recent improvements of radiation driven wind theory by Pauldrach et al. (1986) and Pauldrach (1987) are presented for central stars of planetary nebulae. The models are computed along evolutionary tracks evolving with different stellar mass from the Asymptotic Giant Branch. We show that the calculated terminal wind velocities are in agreement with the observations and allow in principle an independent determination of stellar masses and radii. The computed mass-loss rates are in qualitative agreement with the occurrence of spectroscopic stellar wind features as a function of stellar effective temperature and gravity.

  19. DIGGING DEEPER INTO DEEP DATA: MOLECULAR DOCKING AS A HYPOTHESIS-DRIVEN BIOPHYSICAL INTERROGATION SYSTEM IN COMPUTATIONAL TOXICOLOGY.

    EPA Science Inventory

    Developing and evaluating predictive strategies to elucidate the mode of biological activity of environmental chemicals is a major objective of the concerted efforts of the US-EPA's computational toxicology program.

  20. Light-driven solute transport in Halobacterium halobium

    NASA Technical Reports Server (NTRS)

    Lanyi, J. K.

    1979-01-01

    The cell membrane of Halobacterium halobium exhibits differential regions which contain crystalline arrays of a single kind of protein, termed bacteriorhodopsin. This bacterial retinal-protein complex resembles the visual pigment and, after the absorption of photons, translocates H(+) across the cell membrane, leading to an electrochemical gradient for protons between the inside and the outside of the cell. Thus, light is an alternate source of energy in these bacteria, in addition to terminal oxidation. The paper deals with work on light-driven transport in H. halobium with cell envelope vesicles. The discussion covers light-driven movements of H(+), Na(+), and K(+); light-driven amino acid transport; and apparent allosteric control of amino acid transport. Although the scheme of energy coupling in H. halobium vesicles appears simple, its quantitative details are quite complex and reveal regulatory phenomena. More knowledge is required of the way the coupling components are regulated by the ion gradients present.

  1. Comparison between numerical and analytical results on the required rf current for stabilizing neoclassical tearing modes

    NASA Astrophysics Data System (ADS)

    Wang, Xiaojing; Yu, Qingquan; Zhang, Xiaodong; Zhang, Yang; Zhu, Sizheng; Wang, Xiaoguang; Wu, Bin

    2018-04-01

    Numerical studies on the stabilization of neoclassical tearing modes (NTMs) by electron cyclotron current drive (ECCD) have been carried out based on reduced MHD equations, focusing on the amount of the required driven current for mode stabilization and the comparison with analytical results. The dependence of the minimum driven current required for NTM stabilization on some parameters, including the bootstrap current density, radial width of the driven current, radial deviation of the driven current from the resonant surface, and the island width when applying ECCD, are studied. By fitting the numerical results, simple expressions for these dependences are obtained. Analysis based on the modified Rutherford equation (MRE) has also been carried out, and the corresponding results have the same trend as numerical ones, while a quantitative difference between them exists. This difference becomes smaller when the applied radio frequency (rf) current is smaller.

  2. Modeling Students' Problem Solving Performance in the Computer-Based Mathematics Learning Environment

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2017-01-01

    Purpose: The purpose of this paper is to develop a quantitative model of problem solving performance of students in the computer-based mathematics learning environment. Design/methodology/approach: Regularized logistic regression was used to create a quantitative model of problem solving performance of students that predicts whether students can…
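
    A minimal sketch of the modelling approach named above is shown below: L2-regularized logistic regression predicting whether a student solves a problem, trained on simulated log features (hints used, attempts, time on task); the feature names, coefficients, and data are hypothetical, not the study's dataset.

      # Regularized logistic regression on simulated tutoring-log features.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(1)
      n = 500
      hints = rng.poisson(2, n)
      attempts = rng.poisson(3, n)
      time_on_task = rng.exponential(120, n)
      X = np.column_stack([hints, attempts, time_on_task])
      # simulated "ground truth": more hints/attempts lower the odds of success
      logit = 1.5 - 0.6 * hints - 0.3 * attempts + 0.002 * time_on_task
      y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
      model = LogisticRegression(penalty="l2", C=1.0, max_iter=1000).fit(X_tr, y_tr)
      print("held-out accuracy:", model.score(X_te, y_te))
      print("coefficients:", dict(zip(["hints", "attempts", "time"], model.coef_[0])))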

  3. Easy calculations of lod scores and genetic risks on small computers.

    PubMed Central

    Lathrop, G M; Lalouel, J M

    1984-01-01

    A computer program that calculates lod scores and genetic risks for a wide variety of both qualitative and quantitative genetic traits is discussed. An illustration is given of the joint use of a genetic marker, affection status, and quantitative information in counseling situations regarding Duchenne muscular dystrophy. PMID:6585139
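
    As a small worked example of the kind of calculation such linkage programs perform (not the program itself), the sketch below evaluates a two-point lod score for phase-known meioses: with r recombinants out of n informative meioses, the likelihood is binomial in the recombination fraction theta, and the lod score compares theta against free recombination (theta = 0.5). The counts used are invented.

      # Two-point lod score for phase-known recombinant counts (toy numbers).
      import math

      def lod(theta, n_recomb, n_nonrecomb):
          n = n_recomb + n_nonrecomb
          loglik = n_recomb * math.log10(theta) + n_nonrecomb * math.log10(1.0 - theta)
          loglik_null = n * math.log10(0.5)
          return loglik - loglik_null

      for theta in (0.05, 0.10, 0.20, 0.30, 0.40):
          print(f"theta = {theta:.2f}  lod = {lod(theta, n_recomb=2, n_nonrecomb=18):.2f}")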

  4. A Quantitative Study of the Relationship between Leadership Practice and Strategic Intentions to Use Cloud Computing

    ERIC Educational Resources Information Center

    Castillo, Alan F.

    2014-01-01

    The purpose of this quantitative correlational cross-sectional research study was to examine a theoretical model consisting of leadership practice, attitudes of business process outsourcing, and strategic intentions of leaders to use cloud computing and to examine the relationships between each of the variables respectively. This study…

  5. Integrating Model-Driven and Data-Driven Techniques for Analyzing Learning Behaviors in Open-Ended Learning Environments

    ERIC Educational Resources Information Center

    Kinnebrew, John S.; Segedy, James R.; Biswas, Gautam

    2017-01-01

    Research in computer-based learning environments has long recognized the vital role of adaptivity in promoting effective, individualized learning among students. Adaptive scaffolding capabilities are particularly important in open-ended learning environments, which provide students with opportunities for solving authentic and complex problems, and…

  6. Infusing Technology Driven Design Thinking in Industrial Design Education: A Case Study

    ERIC Educational Resources Information Center

    Mubin, Omar; Novoa, Mauricio; Al Mahmud, Abdullah

    2017-01-01

    Purpose: This paper narrates a case study on design thinking-based education work in an industrial design honours program. Student projects were developed in a multi-disciplinary setting across a Computing and Engineering faculty that allowed promoting technologically and user-driven innovation strategies. Design/methodology/approach: A renewed…

  7. Reduction of Metal Artifact in Single Photon-Counting Computed Tomography by Spectral-Driven Iterative Reconstruction Technique

    PubMed Central

    Nasirudin, Radin A.; Mei, Kai; Panchev, Petar; Fehringer, Andreas; Pfeiffer, Franz; Rummeny, Ernst J.; Fiebich, Martin; Noël, Peter B.

    2015-01-01

    Purpose The exciting prospect of Spectral CT (SCT) using photon-counting detectors (PCD) will lead to new techniques in computed tomography (CT) that take advantage of the additional spectral information provided. We introduce a method to reduce metal artifact in X-ray tomography by incorporating knowledge obtained from SCT into a statistical iterative reconstruction scheme. We call our method Spectral-driven Iterative Reconstruction (SPIR). Method The proposed algorithm consists of two main components: material decomposition and penalized maximum likelihood iterative reconstruction. In this study, the spectral data acquisitions with an energy-resolving PCD were simulated using a Monte-Carlo simulator based on EGSnrc C++ class library. A jaw phantom with a dental implant made of gold was used as an object in this study. A total of three dental implant shapes were simulated separately to test the influence of prior knowledge on the overall performance of the algorithm. The generated projection data was first decomposed into three basis functions: photoelectric absorption, Compton scattering and attenuation of gold. A pseudo-monochromatic sinogram was calculated and used as input in the reconstruction, while the spatial information of the gold implant was used as a prior. The results from the algorithm were assessed and benchmarked with state-of-the-art reconstruction methods. Results Decomposition results illustrate that gold implant of any shape can be distinguished from other components of the phantom. Additionally, the result from the penalized maximum likelihood iterative reconstruction shows that artifacts are significantly reduced in SPIR reconstructed slices in comparison to other known techniques, while at the same time details around the implant are preserved. Quantitatively, the SPIR algorithm best reflects the true attenuation value in comparison to other algorithms. Conclusion It is demonstrated that the combination of the additional information from Spectral CT and statistical reconstruction can significantly improve image quality, especially streaking artifacts caused by the presence of materials with high atomic numbers. PMID:25955019

  8. Model-driven meta-analyses for informing health care: a diabetes meta-analysis as an exemplar.

    PubMed

    Brown, Sharon A; Becker, Betsy Jane; García, Alexandra A; Brown, Adama; Ramírez, Gilbert

    2015-04-01

    A relatively novel type of meta-analysis, a model-driven meta-analysis, involves the quantitative synthesis of descriptive, correlational data and is useful for identifying key predictors of health outcomes and informing clinical guidelines. Few such meta-analyses have been conducted and thus, large bodies of research remain unsynthesized and uninterpreted for application in health care. We describe the unique challenges of conducting a model-driven meta-analysis, focusing primarily on issues related to locating a sample of published and unpublished primary studies, extracting and verifying descriptive and correlational data, and conducting analyses. A current meta-analysis of the research on predictors of key health outcomes in diabetes is used to illustrate our main points. © The Author(s) 2014.

  9. MODEL-DRIVEN META-ANALYSES FOR INFORMING HEALTH CARE: A DIABETES META-ANALYSIS AS AN EXEMPLAR

    PubMed Central

    Brown, Sharon A.; Becker, Betsy Jane; García, Alexandra A.; Brown, Adama; Ramírez, Gilbert

    2015-01-01

    A relatively novel type of meta-analysis, a model-driven meta-analysis, involves the quantitative synthesis of descriptive, correlational data and is useful for identifying key predictors of health outcomes and informing clinical guidelines. Few such meta-analyses have been conducted and thus, large bodies of research remain unsynthesized and uninterpreted for application in health care. We describe the unique challenges of conducting a model-driven meta-analysis, focusing primarily on issues related to locating a sample of published and unpublished primary studies, extracting and verifying descriptive and correlational data, and conducting analyses. A current meta-analysis of the research on predictors of key health outcomes in diabetes is used to illustrate our main points. PMID:25142707

  10. Mathematical modeling and computational prediction of cancer drug resistance.

    PubMed

    Sun, Xiaoqiang; Hu, Bin

    2017-06-23

    Diverse forms of resistance to anticancer drugs can lead to the failure of chemotherapy. Drug resistance is one of the most intractable issues for successfully treating cancer in current clinical practice. Effective clinical approaches that could counter drug resistance by restoring the sensitivity of tumors to the targeted agents are urgently needed. As numerous experimental results on resistance mechanisms have been obtained and a mass of high-throughput data has been accumulated, mathematical modeling and computational predictions using systematic and quantitative approaches have become increasingly important, as they can potentially provide deeper insights into resistance mechanisms, generate novel hypotheses or suggest promising treatment strategies for future testing. In this review, we first briefly summarize the current progress of experimentally revealed resistance mechanisms of targeted therapy, including genetic mechanisms, epigenetic mechanisms, posttranslational mechanisms, cellular mechanisms, microenvironmental mechanisms and pharmacokinetic mechanisms. Subsequently, we list several currently available databases and Web-based tools related to drug sensitivity and resistance. Then, we focus primarily on introducing some state-of-the-art computational methods used in drug resistance studies, including mechanism-based mathematical modeling approaches (e.g. molecular dynamics simulation, kinetic model of molecular networks, ordinary differential equation model of cellular dynamics, stochastic model, partial differential equation model, agent-based model, pharmacokinetic-pharmacodynamic model, etc.) and data-driven prediction methods (e.g. omics data-based conventional screening approach for node biomarkers, static network approach for edge biomarkers and module biomarkers, dynamic network approach for dynamic network biomarkers and dynamic module network biomarkers, etc.). Finally, we discuss several further questions and future directions for the use of computational methods for studying drug resistance, including inferring drug-induced signaling networks, multiscale modeling, drug combinations and precision medicine. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
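
    One of the modelling families listed above (an ordinary differential equation model of cellular dynamics) can be sketched in a few lines: drug-sensitive and drug-resistant tumour subpopulations share a carrying capacity, the drug kills only the sensitive cells, and a small mutation rate feeds the resistant pool. All parameter values are illustrative, not fitted to any dataset.

      # Minimal ODE sketch of sensitive (S) vs resistant (R) tumour cells under therapy.
      import numpy as np
      from scipy.integrate import solve_ivp

      growth_s, growth_r = 0.05, 0.04     # per day
      kill_rate, mutation = 0.08, 1e-4    # drug kill of S; S -> R conversion
      capacity = 1e9                      # carrying capacity (cells)

      def rhs(t, y):
          S, R = y
          crowd = 1.0 - (S + R) / capacity
          dS = growth_s * S * crowd - kill_rate * S - mutation * S
          dR = growth_r * R * crowd + mutation * S
          return [dS, dR]

      sol = solve_ivp(rhs, (0, 365), [1e6, 10.0], t_eval=np.linspace(0, 365, 6))
      for t, S, R in zip(sol.t, sol.y[0], sol.y[1]):
          print(f"day {t:5.0f}: sensitive {S:9.2e}  resistant {R:9.2e}")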

  11. ASC ATDM Level 2 Milestone #5325: Asynchronous Many-Task Runtime System Analysis and Assessment for Next Generation Platforms.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, Gavin Matthew; Bettencourt, Matthew Tyler; Bova, Steven W.

    2015-09-01

    This report provides in-depth information and analysis to help create a technical road map for developing next-generation programming models and runtime systems that support Advanced Simulation and Computing (ASC) workload requirements. The focus herein is on asynchronous many-task (AMT) models and runtime systems, which are of great interest in the context of exascale computing, as they hold the promise to address key issues associated with future extreme-scale computer architectures. This report includes a thorough qualitative and quantitative examination of three best-of-class AMT runtime systems (Charm++, Legion, and Uintah), all of which are in use as part of the Predictive Science Academic Alliance Program II (PSAAP-II) Centers. The studies focus on each of the runtimes' programmability, performance, and mutability. Through the experiments and analysis presented, several overarching findings emerge. From a performance perspective, AMT runtimes show tremendous potential for addressing extreme-scale challenges. Empirical studies show an AMT runtime can mitigate performance heterogeneity inherent to the machine itself, and that Message Passing Interface (MPI) and AMT runtimes perform comparably under balanced conditions. From a programmability and mutability perspective, however, none of the runtimes in this study are currently ready for use in developing production-ready Sandia ASC applications. The report concludes by recommending a co-design path forward, wherein application, programming model, and runtime system developers work together to define requirements and solutions. Such a requirements-driven co-design approach benefits the community as a whole, with widespread community engagement mitigating risk for both application developers and high-performance computing runtime system developers.

  12. Data Driven Performance Evaluation of Wireless Sensor Networks

    PubMed Central

    Frery, Alejandro C.; Ramos, Heitor S.; Alencar-Neto, José; Nakamura, Eduardo; Loureiro, Antonio A. F.

    2010-01-01

    Wireless Sensor Networks are presented as devices for signal sampling and reconstruction. Within this framework, the qualitative and quantitative influence of (i) signal granularity, (ii) spatial distribution of sensors, (iii) sensors clustering, and (iv) signal reconstruction procedure are assessed. This is done by defining an error metric and performing a Monte Carlo experiment. It is shown that all these factors have significant impact on the quality of the reconstructed signal. The extent of such impact is quantitatively assessed. PMID:22294920

  13. Geographic and demographic variabilities of quantitative parameters in stress myocardial computed tomography perfusion.

    PubMed

    Park, Jinoh; Kim, Hyun-Sook; Hwang, Hye Jeon; Yang, Dong Hyun; Koo, Hyun Jung; Kang, Joon-Won; Kim, Young-Hak

    2017-09-01

    To evaluate the geographic and demographic variabilities of the quantitative parameters of computed tomography perfusion (CTP) of the left ventricular (LV) myocardium in patients with normal coronary artery on computed tomography angiography (CTA). From a multicenter CTP registry of stress and static computed tomography, we retrospectively recruited 113 patients (mean age, 60 years; 57 men) without perfusion defect on visual assessment and minimal (< 20% of diameter stenosis) or no coronary artery disease on CTA. Using semiautomatic analysis software, quantitative parameters of the LV myocardium, including the myocardial attenuation in stress and rest phases, transmural perfusion ratio (TPR), and myocardial perfusion reserve index (MPRI), were evaluated in 16 myocardial segments. In the lateral wall of the LV myocardium, all quantitative parameters except for MPRI were significantly higher compared with those in the other walls. The MPRI showed consistent values in all myocardial walls (anterior to lateral wall: range, 25% to 27%; p = 0.401). At the basal level of the myocardium, all quantitative parameters were significantly lower than those at the mid- and apical levels. Compared with men, women had significantly higher values of myocardial attenuation and TPR. Age, body mass index, and Framingham risk score were significantly associated with the difference in myocardial attenuation. Geographic and demographic variabilities of quantitative parameters in stress myocardial CTP exist in healthy subjects without significant coronary artery disease. This information may be helpful when assessing myocardial perfusion defects in CTP.

  14. Parallel and serial computing tools for testing single-locus and epistatic SNP effects of quantitative traits in genome-wide association studies

    PubMed Central

    Ma, Li; Runesha, H Birali; Dvorkin, Daniel; Garbe, John R; Da, Yang

    2008-01-01

    Background Genome-wide association studies (GWAS) using single nucleotide polymorphism (SNP) markers provide opportunities to detect epistatic SNPs associated with quantitative traits and to detect the exact mode of an epistasis effect. Computational difficulty is the main bottleneck for epistasis testing in large scale GWAS. Results The EPISNPmpi and EPISNP computer programs were developed for testing single-locus and epistatic SNP effects on quantitative traits in GWAS, including tests of three single-locus effects for each SNP (SNP genotypic effect, additive and dominance effects) and five epistasis effects for each pair of SNPs (two-locus interaction, additive × additive, additive × dominance, dominance × additive, and dominance × dominance) based on the extended Kempthorne model. EPISNPmpi is the parallel computing program for epistasis testing in large scale GWAS and achieved excellent scalability for large scale analysis and portability for various parallel computing platforms. EPISNP is the serial computing program based on the EPISNPmpi code for epistasis testing in small scale GWAS using commonly available operating systems and computer hardware. Three serial computing utility programs were developed for graphical viewing of test results and epistasis networks, and for estimating CPU time and disk space requirements. Conclusion The EPISNPmpi parallel computing program provides an effective computing tool for epistasis testing in large scale GWAS, and the epiSNP serial computing programs are convenient tools for epistasis analysis in small scale GWAS using commonly available computer hardware. PMID:18644146

  15. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
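
    The pseudo-predator construction described above can be sketched as follows: bootstrap-sample each prey type's signature library, average the samples, and mix the prey means with a known diet. The prey data below are simulated, calibration coefficients are ignored, and this is not the published algorithm or the polar-bear dataset.

      # QFASA-style pseudo-predator signature from bootstrapped prey signatures (toy data).
      import numpy as np

      rng = np.random.default_rng(42)
      n_fatty_acids = 8
      prey_types = ["ringed_seal", "bearded_seal", "walrus"]      # hypothetical prey labels

      # simulated prey libraries: rows are sampled animals, columns fatty-acid proportions
      prey_library = {p: rng.dirichlet(np.ones(n_fatty_acids) * (i + 2), size=60)
                      for i, p in enumerate(prey_types)}
      true_diet = np.array([0.6, 0.3, 0.1])                       # known mixing proportions

      def pseudo_predator(bootstrap_n=30):
          prey_means = []
          for p in prey_types:
              lib = prey_library[p]
              idx = rng.integers(0, lib.shape[0], size=bootstrap_n)   # bootstrap sample
              prey_means.append(lib[idx].mean(axis=0))
          signature = true_diet @ np.vstack(prey_means)               # diet-weighted mixture
          return signature / signature.sum()

      print(np.round(pseudo_predator(), 3))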

  16. A Computer-Aided Analysis Method of SPECT Brain Images for Quantitative Treatment Monitoring: Performance Evaluations and Clinical Applications.

    PubMed

    Zheng, Xiujuan; Wei, Wentao; Huang, Qiu; Song, Shaoli; Wan, Jieqing; Huang, Gang

    2017-01-01

    The objective and quantitative analysis of longitudinal single photon emission computed tomography (SPECT) images are significant for the treatment monitoring of brain disorders. Therefore, a computer aided analysis (CAA) method is introduced to extract a change-rate map (CRM) as a parametric image for quantifying the changes of regional cerebral blood flow (rCBF) in longitudinal SPECT brain images. The performances of the CAA-CRM approach in treatment monitoring are evaluated by the computer simulations and clinical applications. The results of computer simulations show that the derived CRMs have high similarities with their ground truths when the lesion size is larger than system spatial resolution and the change rate is higher than 20%. In clinical applications, the CAA-CRM approach is used to assess the treatment of 50 patients with brain ischemia. The results demonstrate that CAA-CRM approach has a 93.4% accuracy of recovered region's localization. Moreover, the quantitative indexes of recovered regions derived from CRM are all significantly different among the groups and highly correlated with the experienced clinical diagnosis. In conclusion, the proposed CAA-CRM approach provides a convenient solution to generate a parametric image and derive the quantitative indexes from the longitudinal SPECT brain images for treatment monitoring.
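
    The core quantity above, a change-rate map, can be sketched as a voxel-wise relative difference between a follow-up and a baseline volume, assuming the two scans are already co-registered and intensity-normalized; the arrays below are random stand-ins for real SPECT volumes, and this is not the published CAA pipeline.

      # Voxel-wise change-rate map (CRM) between two co-registered volumes (toy arrays).
      import numpy as np

      rng = np.random.default_rng(7)
      baseline = rng.uniform(40, 100, size=(4, 4, 4))        # stand-in baseline volume
      followup = baseline * rng.uniform(0.8, 1.3, size=baseline.shape)

      brain_mask = baseline > 50.0                           # crude mask of "brain" voxels
      crm = np.zeros_like(baseline)
      diff = followup[brain_mask] - baseline[brain_mask]
      crm[brain_mask] = 100.0 * diff / baseline[brain_mask]  # percent change per voxel

      recovered = brain_mask & (crm > 20.0)                  # e.g. flag >20% rCBF increase
      print("voxels flagged as recovered:", int(recovered.sum()))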

  17. Computational challenges in modeling gene regulatory events

    PubMed Central

    Pataskar, Abhijeet; Tiwari, Vijay K.

    2016-01-01

    ABSTRACT Cellular transcriptional programs driven by genetic and epigenetic mechanisms could be better understood by integrating “omics” data and subsequently modeling the gene-regulatory events. Toward this end, computational biology should keep pace with evolving experimental procedures and data availability. This article gives an exemplified account of the current computational challenges in molecular biology. PMID:27390891

  18. A Year in the Life: Two Seventh Grade Teachers Implement One-to-One Computing

    ERIC Educational Resources Information Center

    Garthwait, Abigail; Weller, Herman G.

    2005-01-01

    Maine was the first state to put laptops in the hands of an entire grade of students. This interpretive case study of two middle school science-math teachers was driven by the general question: Given ubiquitous computing, how do teachers use computers in constructing curriculum and delivering instruction? Specifically, the researchers sought to…

  19. Software for computing plant biomass—BIOPAK users guide.

    Treesearch

    Joseph E. Means; Heather A. Hansen; Greg J. Koerper; Paul B Alaback; Mark W. Klopsch

    1994-01-01

    BIOPAK is a menu-driven package of computer programs for IBM-compatible personal computers that calculates the biomass, area, height, length, or volume of plant components (leaves, branches, stem, crown, and roots). The routines were written in FoxPro, Fortran, and C. BIOPAK was created to facilitate linking of a diverse array of vegetation datasets with the...

  20. Clinical application of a light-pen computer system for quantitative angiography

    NASA Technical Reports Server (NTRS)

    Alderman, E. L.

    1975-01-01

    The important features in a clinical system for quantitative angiography were examined. The human interface for data input, whether an electrostatic pen, sonic pen, or light-pen must be engineered to optimize the quality of margin definition. The computer programs which the technician uses for data entry and computation of ventriculographic measurements must be convenient to use on a routine basis in a laboratory performing multiple studies per day. The method used for magnification correction must be continuously monitored.

  1. Laboratory data base for isomer-specific determination of polychlorinated biphenyls

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwartz, T.R.; Campbell, R.D.; Stalling, D.L.

    1984-07-01

    A computer-assisted technique for quantitative determination of polychlorinated biphenyl isomers is described. PCB isomers were identified by use of a retention index system with n-alkyl trichloroacetates as retention index marker compounds. A laboratory data base system was developed to aid in editing and quantitation of data generated from capillary gas chromatographic data. Data base management was provided by computer programs written in DSM-11 (Digital Standard MUMPS) for the PDP-11 family of computers. 13 references, 4 figures, 2 tables.
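
    The general principle behind a retention index system is simple enough to sketch: an analyte's retention time is interpolated between the two bracketing marker compounds of known index. The marker indices and retention times below are invented for illustration and are not the published n-alkyl trichloroacetate calibration.

      # Retention-index calculation by interpolation between marker compounds (toy values).
      import numpy as np

      # index assigned to each marker, and its measured retention time (min)
      marker_index = np.array([1000, 1200, 1400, 1600, 1800])
      marker_rt    = np.array([5.2,  7.9, 10.8, 13.9, 17.1])

      def retention_index(rt):
          # piecewise-linear interpolation between the bracketing markers
          return float(np.interp(rt, marker_rt, marker_index))

      for rt in (6.5, 9.3, 15.0):
          print(f"analyte at {rt:4.1f} min -> RI = {retention_index(rt):6.1f}")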

  2. Clinical and mathematical introduction to computer processing of scintigraphic images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goris, M.L.; Briandet, P.A.

    The authors state in their preface: "...we believe that there is no book yet available in which computing in nuclear medicine has been approached in a reasonable manner. This book is our attempt to correct the situation." The book is divided into four sections: (1) Clinical Applications of Quantitative Scintigraphic Analysis; (2) Mathematical Derivations; (3) Processing Methods of Scintigraphic Images; and (4) The (Computer) System. Section 1 has chapters on quantitative approaches to congenital and acquired heart diseases, nephrology and urology, and pulmonary medicine.

  3. Effects of Computer Programming on Students' Cognitive Performance: A Quantitative Synthesis.

    ERIC Educational Resources Information Center

    Liao, Yuen-Kuang Cliff

    A meta-analysis was performed to synthesize existing data concerning the effects of computer programing on cognitive outcomes of students. Sixty-five studies were located from three sources, and their quantitative data were transformed into a common scale--Effect Size (ES). The analysis showed that 58 (89%) of the study-weighted ESs were positive…

  4. Poem Generator: A Comparative Quantitative Evaluation of a Microworlds-Based Learning Approach for Teaching English

    ERIC Educational Resources Information Center

    Jenkins, Craig

    2015-01-01

    This paper is a comparative quantitative evaluation of an approach to teaching poetry in the subject domain of English that employs a "guided discovery" pedagogy using computer-based microworlds. It uses a quasi-experimental design in order to measure performance gains in computational thinking and poetic thinking following a…

  5. An evaluation of data-driven motion estimation in comparison to the usage of external-surrogates in cardiac SPECT imaging

    PubMed Central

    Mukherjee, Joyeeta Mitra; Hutton, Brian F; Johnson, Karen L; Pretorius, P Hendrik; King, Michael A

    2014-01-01

    Motion estimation methods in single photon emission computed tomography (SPECT) can be classified into methods which depend on just the emission data (data-driven), or those that use some other source of information such as an external surrogate. The surrogate-based methods estimate the motion exhibited externally which may not correlate exactly with the movement of organs inside the body. The accuracy of data-driven strategies on the other hand is affected by the type and timing of motion occurrence during acquisition, the source distribution, and various degrading factors such as attenuation, scatter, and system spatial resolution. The goal of this paper is to investigate the performance of two data-driven motion estimation schemes based on the rigid-body registration of projections of motion-transformed source distributions to the acquired projection data for cardiac SPECT studies. Comparison is also made of six intensity based registration metrics to an external surrogate-based method. In the data-driven schemes, a partially reconstructed heart is used as the initial source distribution. The partially-reconstructed heart has inaccuracies due to limited angle artifacts resulting from using only a part of the SPECT projections acquired while the patient maintained the same pose. The performance of different cost functions in quantifying consistency with the SPECT projection data in the data-driven schemes was compared for clinically realistic patient motion occurring as discrete pose changes, one or two times during acquisition. The six intensity-based metrics studied were mean-squared difference (MSD), mutual information (MI), normalized mutual information (NMI), pattern intensity (PI), normalized cross-correlation (NCC) and entropy of the difference (EDI). Quantitative and qualitative analysis of the performance is reported using Monte-Carlo simulations of a realistic heart phantom including degradation factors such as attenuation, scatter and system spatial resolution. Further the visual appearance of motion-corrected images using data-driven motion estimates was compared to that obtained using the external motion-tracking system in patient studies. Pattern intensity and normalized mutual information cost functions were observed to have the best performance in terms of lowest average position error and stability with degradation of image quality of the partial reconstruction in simulations. In all patients, the visual quality of PI-based estimation was either significantly better or comparable to NMI-based estimation. Best visual quality was obtained with PI-based estimation in 1 of the 5 patient studies, and with external-surrogate based correction in 3 out of 5 patients. In the remaining patient study there was little motion and all methods yielded similar visual image quality. PMID:24107647
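
    Two of the six intensity-based metrics listed above, mean-squared difference (MSD) and normalized cross-correlation (NCC), are straightforward to compute between an acquired projection and a projection of the motion-transformed source estimate; the arrays below are random stand-ins purely to show the computation, not SPECT data.

      # MSD and NCC between an "acquired" and a "predicted" projection (toy arrays).
      import numpy as np

      rng = np.random.default_rng(3)
      acquired  = rng.poisson(50, size=(64, 64)).astype(float)   # stand-in projection
      predicted = acquired + rng.normal(0, 5, size=acquired.shape)

      def msd(a, b):
          return float(np.mean((a - b) ** 2))

      def ncc(a, b):
          a0, b0 = a - a.mean(), b - b.mean()
          return float(np.sum(a0 * b0) / np.sqrt(np.sum(a0**2) * np.sum(b0**2)))

      print(f"MSD = {msd(acquired, predicted):.2f}   NCC = {ncc(acquired, predicted):.4f}")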

  6. Routine Clinical Quantitative Rest Stress Myocardial Perfusion for Managing Coronary Artery Disease: Clinical Relevance of Test-Retest Variability.

    PubMed

    Kitkungvan, Danai; Johnson, Nils P; Roby, Amanda E; Patel, Monika B; Kirkeeide, Richard; Gould, K Lance

    2017-05-01

    Positron emission tomography (PET) quantifies stress myocardial perfusion (in cc/min/g) and coronary flow reserve to guide noninvasively the management of coronary artery disease. This study determined their test-retest precision within minutes and daily biological variability essential for bounding clinical decision-making or risk stratification based on low flow ischemic thresholds or follow-up changes. Randomized trials of fractional flow reserve-guided percutaneous coronary interventions established an objective, quantitative, outcomes-driven standard of physiological stenosis severity. However, pressure-derived fractional flow reserve requires invasive coronary angiogram and was originally validated by comparison to noninvasive PET. The time course and test-retest precision of serial quantitative rest-rest and stress-stress global myocardial perfusion by PET within minutes and days apart in the same patient were compared in 120 volunteers undergoing serial 708 quantitative PET perfusion scans using rubidium 82 (Rb-82) and dipyridamole stress with a 2-dimensional PET-computed tomography scanner (GE DST 16) and University of Texas HeartSee software with our validated perfusion model. Test-retest methodological precision (coefficient of variance) for serial quantitative global myocardial perfusion minutes apart is ±10% (mean ΔSD at rest ±0.09, at stress ±0.23 cc/min/g) and for days apart is ±21% (mean ΔSD at rest ±0.2, at stress ±0.46 cc/min/g) reflecting added biological variability. Global myocardial perfusion at 8 min after 4-min dipyridamole infusion is 10% higher than at standard 4 min after dipyridamole. Test-retest methodological precision of global PET myocardial perfusion by serial rest or stress PET minutes apart is ±10%. Day-to-different-day biological plus methodological variability is ±21%, thereby establishing boundaries of variability on physiological severity to guide or follow coronary artery disease management. Maximum stress increases perfusion and coronary flow reserve, thereby reducing potentially falsely low values mimicking ischemia. Copyright © 2017 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
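    As an illustration of the test-retest statistics quoted above, the sketch below computes a coefficient of variation from paired global perfusion values using the within-subject SD convention (SD of paired differences divided by the square root of 2, normalized by the grand mean); the function name, the example values, and this particular convention are assumptions for illustration and may differ from the paper's exact definition.

      import numpy as np

      def test_retest_cov_percent(first, second):
          # Within-subject SD from paired differences, expressed as a percentage
          # of the grand mean of both measurement sessions.
          first, second = np.asarray(first, float), np.asarray(second, float)
          within_sd = np.std(first - second, ddof=1) / np.sqrt(2.0)
          grand_mean = np.mean(np.concatenate([first, second]))
          return 100.0 * within_sd / grand_mean

      # Hypothetical paired global stress perfusion values (cc/min/g) minutes apart.
      stress_1 = [1.85, 2.10, 1.60, 2.40, 1.95]
      stress_2 = [1.90, 2.02, 1.71, 2.28, 2.05]
      print(f"test-retest CoV: {test_retest_cov_percent(stress_1, stress_2):.1f}%")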

  7. The three-dimensional Event-Driven Graphics Environment (3D-EDGE)

    NASA Technical Reports Server (NTRS)

    Freedman, Jeffrey; Hahn, Roger; Schwartz, David M.

    1993-01-01

    Stanford Telecom developed the Three-Dimensional Event-Driven Graphics Environment (3D-EDGE) for NASA GSFC's (GSFC) Communications Link Analysis and Simulation System (CLASS). 3D-EDGE consists of a library of object-oriented subroutines which allow engineers with little or no computer graphics experience to programmatically manipulate, render, animate, and access complex three-dimensional objects.

  8. The Effects of Data-Driven Learning upon Vocabulary Acquisition for Secondary International School Students in Vietnam

    ERIC Educational Resources Information Center

    Karras, Jacob Nolen

    2016-01-01

    Within the field of computer assisted language learning (CALL), scant literature exists regarding the effectiveness and practicality for secondary students to utilize data-driven learning (DDL) for vocabulary acquisition. In this study, there were 100 participants, who had a mean age of thirteen years, and were attending an international school in…

  9. Computational Fluid Dynamics Simulation of Flows in an Oxidation Ditch Driven by a New Surface Aerator.

    PubMed

    Huang, Weidong; Li, Kun; Wang, Gan; Wang, Yingzhe

    2013-11-01

    In this article, we present a newly designed inverse umbrella surface aerator and test its performance in driving flow in an oxidation ditch. Results show that it drives the oxidation ditch better than the original design, producing a higher average velocity and a more uniform flow field. We also present a computational fluid dynamics model for predicting the flow field in an oxidation ditch driven by a surface aerator. An improved momentum source term approach to simulating the flow field of the oxidation ditch driven by an inverse umbrella surface aerator was developed and validated through experiments. Four turbulence models were investigated with the approach, including the standard k-ε model, the RNG k-ε model, the realizable k-ε model, and the Reynolds stress model, and the predicted data were compared with those calculated with the multiple rotating reference frame (MRF) and sliding mesh (SM) approaches. Results of the momentum source term approach are in good agreement with the experimental data, and its prediction accuracy is better than MRF and close to SM. The momentum source term approach also has lower computational cost, is simpler to preprocess, and is easier to use.

  10. Computational model of collisional-radiative nonequilibrium plasma in an air-driven type laser propulsion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ogino, Yousuke; Ohnishi, Naofumi

    The thrust of a gas-driven laser-propulsion system is obtained through interaction with a propellant gas heated by laser energy. Therefore, understanding the nonequilibrium nature of laser-produced plasma is essential for increasing the available thrust force and for improving the energy conversion efficiency from the laser to the propellant gas. In this work, a time-dependent collisional-radiative model for air plasma has been developed to study the effects of nonequilibrium atomic and molecular processes on population densities for air-driven laser propulsion. Many elementary processes are considered in the number density range of 10^12 cm^-3 ≤ N ≤ 10^19 cm^-3 and the temperature range of 300 K ≤ T ≤ 40,000 K. We then compute the unsteady behavior of pulsively heated air plasma. When the ionization relaxation time is of the same order as the time scale of a heating pulse, the effects of unsteady ionization are important for estimating air plasma states. From parametric computations, we determine the appropriate conditions for the collisional-radiative steady state, local thermodynamic equilibrium, and corona equilibrium models in that density and temperature range.

  11. On edge-aware path-based color spatial sampling for Retinex: from Termite Retinex to Light Energy-driven Termite Retinex

    NASA Astrophysics Data System (ADS)

    Simone, Gabriele; Cordone, Roberto; Serapioni, Raul Paolo; Lecca, Michela

    2017-05-01

    Retinex theory estimates the human color sensation at any observed point by correcting its color based on the spatial arrangement of the colors in proximate regions. We revisit two recent path-based, edge-aware Retinex implementations: Termite Retinex (TR) and Energy-driven Termite Retinex (ETR). Like the original Retinex implementation, TR and ETR scan the neighborhood of any image pixel by paths and rescale its chromatic intensities by intensity levels computed by reworking the colors of the pixels on the paths. Our interest in TR and ETR is due to their unique, content-based scanning scheme, which uses the image edges to define the paths and exploits a swarm intelligence model for guiding the spatial exploration of the image. The exploration scheme of ETR has been shown to be particularly effective: its paths are local minima of an energy functional designed to favor the sampling of image pixels highly relevant to color sensation. Nevertheless, since its computational complexity makes ETR scarcely practicable, here we present a lighter version of it, named Light Energy-driven TR, obtained from ETR by implementing a modified, optimized minimization procedure and by exploiting parallel computing.

  12. The application of data mining and cloud computing techniques in data-driven models for structural health monitoring

    NASA Astrophysics Data System (ADS)

    Khazaeli, S.; Ravandi, A. G.; Banerji, S.; Bagchi, A.

    2016-04-01

    Recently, data-driven models for Structural Health Monitoring (SHM) have been of great interest among many researchers. In data-driven models, the sensed data are processed to determine the structural performance and evaluate the damage of an instrumented structure without requiring a mathematical model of the structure. A framework of data-driven models for online assessment of the condition of a structure has been developed here. The developed framework is intended for automated evaluation of the monitoring data and structural performance using Internet technology and resources. The main challenges in developing such a framework include: (a) utilizing the sensor measurements to estimate and localize the induced damage in a structure by means of signal processing and data mining techniques, and (b) optimizing the computing and storage resources with the aid of cloud services. The main focus in this paper is to demonstrate the efficiency of the proposed framework for real-time damage detection of a multi-story shear-building structure in two damage scenarios (change in mass and stiffness) at various locations. Several features are extracted from the sensed data by signal processing techniques and statistical methods. Machine learning algorithms are deployed to select damage-sensitive features as well as to classify the data to trace anomalies in the response of the structure. Here, the cloud computing resources from Amazon Web Services (AWS) have been used to implement the proposed framework.
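    The feature-extraction and classification steps described above can be sketched as follows; the window length, the specific statistical features, the random-forest classifier, and the simulated "healthy versus damaged" data are all illustrative assumptions rather than the framework's actual pipeline.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split

      def window_features(signal):
          # Simple damage-sensitive statistics for one sensor window.
          return [signal.mean(), signal.std(), signal.max() - signal.min(),
                  float(np.sqrt(np.mean(signal ** 2)))]  # RMS

      # Simulated sensor windows: label 1 ("damaged") windows have higher variance.
      rng = np.random.default_rng(0)
      labels = rng.integers(0, 2, 200)
      X = np.array([window_features(rng.normal(0.0, 1.0 + 0.5 * y, 256)) for y in labels])

      X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)
      clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
      print("anomaly classification accuracy:", clf.score(X_te, y_te))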

  13. Quantitative multi-target RNA profiling in Epstein-Barr virus infected tumor cells.

    PubMed

    Greijer, A E; Ramayanti, O; Verkuijlen, S A W M; Novalić, Z; Juwana, H; Middeldorp, J M

    2017-03-01

    Epstein-Barr virus (EBV) is etiologically linked to multiple acute, chronic and malignant diseases. Detection of EBV-RNA transcripts in tissues or biofluids, besides EBV-DNA, can help in diagnosing EBV-related syndromes. Sensitive EBV transcription profiling yields new insights on its pathogenic role and may be useful for monitoring virus-targeted therapy. Here we describe a multi-gene quantitative RT-PCR profiling method that simultaneously detects a broad spectrum (n=16) of crucial latent and lytic EBV transcripts. These transcripts include (but are not restricted to) EBNA1, EBNA2, LMP1, LMP2, BARTs, EBER1, BARF1 and ZEBRA, Rta, BGLF4 (PK), BXLF1 (TK) and BFRF3 (VCAp18), all of which have been implicated in EBV-driven oncogenesis and viral replication. With this method we determine the number of RNA copies per infected (tumor) cell in bulk populations of various origin. While we confirm the expected RNA profiles within classic EBV latency programs, this sensitive quantitative approach revealed the presence of rare cells undergoing lytic replication. Inducing lytic replication in EBV tumor cells supports apoptosis and is considered a therapeutic approach to treat EBV-driven malignancies. This sensitive multi-primed quantitative RT-PCR approach can provide a broader understanding of transcriptional activity in latent and lytic EBV infection and is suitable for monitoring virus-specific therapy responses in patients with EBV-associated cancers. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  14. Quantitative Story Telling: Initial steps towards bridging perspectives and tools for a robust nexus assessment

    NASA Astrophysics Data System (ADS)

    Cabello, Violeta

    2017-04-01

    This communication will present the development of an innovative analytical framework for the analysis of the Water-Energy-Food-Climate Nexus, termed Quantitative Story Telling (QST). The methodology is currently under development within the H2020 project MAGIC - Moving Towards Adaptive Governance in Complexity: Informing Nexus Security (www.magic-nexus.eu). The key innovation of QST is that it bridges qualitative and quantitative analytical tools into an iterative research process in which each step is built and validated in interaction with stakeholders. The qualitative analysis focusses on the identification of the narratives behind the development of relevant WEFC-Nexus policies and innovations. The quantitative engine is the Multi-Scale Analysis of Societal and Ecosystem Metabolism (MuSIASEM), a resource accounting toolkit capable of integrating multiple analytical dimensions at different scales through relational analysis. Although QST may be labelled a story-driven rather than a data-driven approach, I will argue that improving models per se may not lead to an improved understanding of WEF-Nexus problems unless we are capable of generating more robust narratives to frame them. The communication will cover an introduction to the MAGIC project, the basic concepts of QST and a case study focussed on agricultural production in a semi-arid region in Southern Spain. Data requirements for this case study and the limitations in finding, accessing or estimating them will be presented alongside a reflection on the relation between analytical scales and data availability.

  15. Chinese and American employers' perspectives regarding hiring people with behaviorally driven health conditions: the role of stigma.

    PubMed

    Corrigan, Patrick W; Tsang, Hector W H; Shi, Kan; Lam, Chow S; Larson, Jon

    2010-12-01

    Work opportunities for people with behaviorally driven health conditions such as HIV/AIDS, drug abuse, alcohol abuse, and psychosis are directly impacted by employer perspectives. To investigate this issue, we report findings from a mixed method design involving qualitative interviews followed by a quantitative survey of employers from Chicago (U.S.), Beijing (China), and Hong Kong (China). Findings from qualitative interviews of 100 employers were used to create 27 items measuring employer perspectives (the Employer Perspective Scale: EPS) about hiring people with health conditions. These perspectives reflect reasons for or against discrimination. In the quantitative phase of the study, representative samples of approximately 300 employers per city were administered the EPS in addition to measures of stigma, including attributions about disease onset and offset. The EPS and stigma scales were completed in the context of one of five randomly assigned health conditions. We weighted data with ratios of key demographics between the sample and the corresponding employer population data. Analyses showed that both onset and offset responsibility varied by behaviorally driven condition. Analyses also showed that employer perspectives were more negative for health conditions that are seen as more behaviorally driven, e.g., drug and alcohol abuse. Chicago employers endorsed onset and offset attributions less strongly compared to those in Hong Kong and Beijing. Chicago employers also recognized more benefits of hiring people with various health conditions. The implications of these findings for better understanding stigma and stigma change among employers are considered. Copyright © 2010 Elsevier Ltd. All rights reserved.

  16. Materials discovery guided by data-driven insights

    NASA Astrophysics Data System (ADS)

    Klintenberg, Mattias

    As computational power continues to grow, systematic computational exploration has become an important tool for materials discovery. In this presentation the Electronic Structure Project (ESP/ELSA) will be discussed and a number of examples presented that show some of the capabilities of a data-driven methodology for guiding materials discovery. These examples include topological insulators, detector materials, and 2D materials. ESP/ELSA is an initiative that dates back to 2001 and today contains many tens of thousands of materials that have been investigated using a robust, high-accuracy electronic structure method (all-electron FP-LMTO), thus providing basic first-principles materials data for most inorganic compounds that have been structurally characterized. The website containing the ESP/ELSA data has to date been accessed from more than 4,000 unique computers around the world.

  17. T cell receptor-driven transendothelial migration of human effector memory CD4 T cells involves Vav, Rac and Myosin IIA

    PubMed Central

    Manes, Thomas D.; Pober, Jordan S.

    2013-01-01

    Human effector memory (EM) CD4 T cells may be recruited from the blood into a site of inflammation in response either to inflammatory chemokines displayed on, or to specific antigen presented by, venular endothelial cells (ECs), designated as chemokine-driven or TCR-driven transendothelial migration (TEM), respectively. We have previously described differences in the morphological appearance of transmigrating T cells as well as in the molecules that mediate T cell-EC interactions distinguishing these two pathways. Here we report that TCR-driven TEM requires ZAP-70-dependent activation of a pathway involving Vav, Rac and myosin IIA. Chemokine-driven TEM also utilizes ZAP-70, albeit in a quantitatively and spatially different manner of activation, and is independent of Vav, Rac and myosin IIA, depending instead on an as yet unidentified GTP exchange factor that activates Cdc42. The differential use of small Rho family GTPases to activate the cytoskeleton is consistent with the morphological differences observed in T cells that undergo TEM in response to these distinct recruitment signals. PMID:23420881

  18. A Qualitative Study of Students' Computational Thinking Skills in a Data-Driven Computing Class

    ERIC Educational Resources Information Center

    Yuen, Timothy T.; Robbins, Kay A.

    2014-01-01

    Critical thinking, problem solving, the use of tools, and the ability to consume and analyze information are important skills for the 21st century workforce. This article presents a qualitative case study that follows five undergraduate biology majors in a computer science course (CS0). This CS0 course teaches programming within a data-driven…

  19. Overview of ASC Capability Computing System Governance Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doebling, Scott W.

    This document contains a description of the Advanced Simulation and Computing Program's Capability Computing System Governance Model. Objectives of the Governance Model are to ensure that the capability system resources are allocated on a priority-driven basis according to the Program requirements; and to utilize ASC Capability Systems for the large capability jobs for which they were designed and procured.

  20. Bio and health informatics meets cloud: BioVLab as an example.

    PubMed

    Chae, Heejoon; Jung, Inuk; Lee, Hyungro; Marru, Suresh; Lee, Seong-Whan; Kim, Sun

    2013-01-01

    The exponential increase of genomic data brought by the advent of next- and third-generation sequencing (NGS) technologies, together with the dramatic drop in sequencing cost, has turned the biological and medical sciences into data-driven sciences. This revolutionary paradigm shift comes with challenges in terms of data transfer, storage, computation, and analysis of big bio/medical data. Cloud computing is a service model sharing a pool of configurable resources and is a suitable workbench to address these challenges. From the medical or biological perspective, providing computing power and storage is the most attractive feature of cloud computing in handling the ever-increasing biological data. As data increase in size, many research organizations begin to experience a lack of computing power, which becomes a major hurdle in achieving research goals. In this paper, we review the features of publicly available bio and health cloud systems in terms of graphical user interface, external data integration, security, and extensibility of features. We then discuss issues and limitations of current cloud systems and conclude with the suggestion of a biological cloud environment concept, which can be defined as a total workbench environment assembling computational tools and databases for analyzing bio/medical big data in particular application domains.

  1. Developing the Quantitative Histopathology Image Ontology (QHIO): A case study using the hot spot detection problem.

    PubMed

    Gurcan, Metin N; Tomaszewski, John; Overton, James A; Doyle, Scott; Ruttenberg, Alan; Smith, Barry

    2017-02-01

    Interoperability across data sets is a key challenge for quantitative histopathological imaging. There is a need for an ontology that can support effective merging of pathological image data with associated clinical and demographic data. To foster organized, cross-disciplinary, information-driven collaborations in the pathological imaging field, we propose to develop an ontology to represent imaging data and methods used in pathological imaging and analysis, and call it Quantitative Histopathological Imaging Ontology - QHIO. We apply QHIO to breast cancer hot-spot detection with the goal of enhancing reliability of detection by promoting the sharing of data between image analysts. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Advanced Materials for Quantum Computing

    DTIC Science & Technology

    2010-04-28

    Project Name: Quantum Computing with Magnons. Co-PI: Leszek Malkinski, with postdoc Dr. Seong-Gi Min. 1. Brief Narrative: quanta of spin waves, called magnons, can be used to exchange quantum information between solid-state qubits. The project was driven by the concept of a spin-wave bus…

  3. The power of an ontology-driven developmental toxicity database for data mining and computational modeling

    EPA Science Inventory

    Modeling of developmental toxicology presents a significant challenge to computational toxicology due to endpoint complexity and lack of data coverage. These challenges largely account for the relatively few modeling successes using the structure–activity relationship (SAR) paradigm…

  4. Computer Microtechnology for a Severely Disabled Preschool Child.

    ERIC Educational Resources Information Center

    Douglas, J.; And Others

    1988-01-01

    The case study describes microtechnological aids for a quadriplegic preschool aged boy dependent on a ventilator via a tracheostomy. Provision of a computer, a variety of specially designed switches and software, together with a self-driven powered wheelchair maximized expression of his developmental needs. (DB)

  5. Agro-ecoregionalization of Iowa using multivariate geographical clustering

    Treesearch

    Carol L. Williams; William W. Hargrove; Matt Leibman; David E. James

    2008-01-01

    Agro-ecoregionalization is categorization of landscapes for use in crop suitability analysis, strategic agroeconomic development, risk analysis, and other purposes. Past agro-ecoregionalizations have been subjective, expert opinion driven, crop specific, and unsuitable for statistical extrapolation. Use of quantitative analytical methods provides an opportunity for...

  6. Quantifying ubiquitin signaling.

    PubMed

    Ordureau, Alban; Münch, Christian; Harper, J Wade

    2015-05-21

    Ubiquitin (UB)-driven signaling systems permeate biology, and are often integrated with other types of post-translational modifications (PTMs), including phosphorylation. Flux through such pathways is dictated by the fractional stoichiometry of distinct modifications and protein assemblies as well as the spatial organization of pathway components. Yet, we rarely understand the dynamics and stoichiometry of rate-limiting intermediates along a reaction trajectory. Here, we review how quantitative proteomic tools and enrichment strategies are being used to quantify UB-dependent signaling systems, and to integrate UB signaling with regulatory phosphorylation events, illustrated with the PINK1/PARKIN pathway. A key feature of ubiquitylation is that the identity of UB chain linkage types can control downstream processes. We also describe how proteomic and enzymological tools can be used to identify and quantify UB chain synthesis and linkage preferences. The emergence of sophisticated quantitative proteomic approaches will set a new standard for elucidating biochemical mechanisms of UB-driven signaling systems. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. Big data to smart data in Alzheimer's disease: The brain health modeling initiative to foster actionable knowledge.

    PubMed

    Geerts, Hugo; Dacks, Penny A; Devanarayan, Viswanath; Haas, Magali; Khachaturian, Zaven S; Gordon, Mark Forrest; Maudsley, Stuart; Romero, Klaus; Stephenson, Diane

    2016-09-01

    Massive investment and technological advances in the collection of extensive and longitudinal information on thousands of Alzheimer's patients result in large amounts of data. These "big-data" databases can potentially advance CNS research and drug development. However, although necessary, they are not sufficient, and we posit that they must be matched with analytical methods that go beyond retrospective data-driven associations with various clinical phenotypes. Although these empirically derived associations can generate novel and useful hypotheses, they need to be organically integrated into a quantitative understanding of the pathology that can be actionable for drug discovery and development. We argue that mechanism-based modeling and simulation approaches, where existing domain knowledge is formally integrated using complexity science and quantitative systems pharmacology, can be combined with data-driven analytics to generate predictive actionable knowledge for drug discovery programs, target validation, and optimization of clinical development. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  8. Cerebrospinal fluid bulk flow is driven by the cardiac cycle

    NASA Astrophysics Data System (ADS)

    Tithof, Jeffrey; Mestre, Humberto; Thomas, John; Nedergaard, Maiken; Kelley, Douglas

    2017-11-01

    Recent discoveries have uncovered a cerebrospinal fluid (CSF) transport system in the perivascular spaces (PVS) of the mammalian brain which clears excess extracellular fluid and protein waste products. The oscillatory pattern of CSF flow has long been attributed to arterial pulsations due to cardiac contractility but limitations in imaging techniques have impeded quantitative measurement of flow rates within the PVS. In this talk, we describe quantitative measurements from the first ever direct imaging of CSF flow in the PVS of a mouse brain. We perform particle tracking velocimetry to obtain time-resolved velocity measurements. To identify the cardiac and/or respiratory dependence of the flow, while imaging, we simultaneously record the mouse's electrocardiogram and respiration. Our measurements conclusively indicate that CSF pulsatility in the arterial PVS is directly driven by the cardiac cycle and not by the respiratory cycle or cerebral vasomotion. These results offer a substantial step forward in understanding bulk flow of CSF in the mammalian brain and may have important implications related to neurodegenerative diseases.

  9. Stratification Modelling of Key Bacterial Taxa Driven by Metabolic Dynamics in Meromictic Lakes.

    PubMed

    Zhu, Kaicheng; Lauro, Federico M; Su, Haibin

    2018-06-22

    In meromictic lakes, the water column is stratified into distinguishable steady layers with different physico-chemical properties. The bottom portion, known as the monimolimnion, has been studied for the functional stratification of microbial populations. Recent experiments have reported profiles of bacterial and nutrient spatial distributions, but a quantitative understanding is needed to unravel the underlying mechanism that maintains this discrete spatial organization. Here a reaction-diffusion model is developed to capture the spatial pattern coupled with the light-driven metabolism of bacteria, which is resilient to a wide range of dynamical correlations between bacterial and nutrient species at the molecular level. In particular, exact analytical solutions of the system are presented together with numerical results, in good agreement with measurements in Ace Lake and Rogoznica Lake. Furthermore, a quantitative prediction is reported here on the dynamics of the seasonal stratification patterns in Ace Lake. The active role played by bacterial metabolism at the microscale clearly shapes the biogeochemical landscape of lake-wide ecology at the macroscale.
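    A generic one-dimensional reaction-diffusion step of the kind described above can be sketched as follows; the growth and uptake terms, the exponential light profile, and the periodic boundaries implied by np.roll are simplifying assumptions made for illustration, not the authors' model or its parameters.

      import numpy as np

      def rd_step(b, n, light, dz, dt, Db=1e-2, Dn=1e-1, growth=0.5, uptake=0.3):
          # One explicit finite-difference step: bacteria b(z) grow on nutrient n(z)
          # at a light-limited rate while both fields diffuse along depth z.
          lap = lambda u: (np.roll(u, 1) - 2.0 * u + np.roll(u, -1)) / dz**2
          b_new = b + dt * (Db * lap(b) + growth * light * n * b)
          n_new = n + dt * (Dn * lap(n) - uptake * light * n * b)
          return b_new, n_new

      z = np.linspace(0.0, 20.0, 200)               # depth (m)
      dz, dt = z[1] - z[0], 1e-3
      light = np.exp(-0.3 * z)                      # light attenuates with depth (assumed)
      b, n = np.full_like(z, 0.1), 1.0 - 0.04 * z   # initial bacteria and nutrient profiles
      for _ in range(5000):
          b, n = rd_step(b, n, light, dz, dt)
      print("depth of peak bacterial density:", z[np.argmax(b)])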

  10. FWT2D: A massively parallel program for frequency-domain full-waveform tomography of wide-aperture seismic data—Part 1: Algorithm

    NASA Astrophysics Data System (ADS)

    Sourbier, Florent; Operto, Stéphane; Virieux, Jean; Amestoy, Patrick; L'Excellent, Jean-Yves

    2009-03-01

    This is the first paper in a two-part series that describes a massively parallel code that performs 2D frequency-domain full-waveform inversion of wide-aperture seismic data for imaging complex structures. Full-waveform inversion methods, namely quantitative seismic imaging methods based on the solution of the full wave equation, are computationally expensive. Therefore, designing efficient algorithms which take advantage of parallel computing facilities is critical for the appraisal of these approaches when applied to representative case studies and for further improvements. Full-waveform modelling requires the solution of a large sparse system of linear equations, which is performed with the massively parallel direct solver MUMPS for efficient multiple-shot simulations. Efficiency of the multiple-shot solution phase (forward/backward substitutions) is improved by using the BLAS3 library. The inverse problem relies on a classic local optimization approach implemented with a gradient method. The direct solver returns the multiple-shot wavefield solutions distributed over the processors according to a domain decomposition driven by the distribution of the LU factors. The domain decomposition of the wavefield solutions is used to compute in parallel the gradient of the objective function and the diagonal Hessian, the latter providing a suitable scaling of the gradient. The algorithm allows one to test different strategies for multiscale frequency inversion, ranging from successive mono-frequency inversion to simultaneous multifrequency inversion. These different inversion strategies will be illustrated in the following companion paper. The parallel efficiency and the scalability of the code will also be quantified.

  11. Least squares QR-based decomposition provides an efficient way of computing optimal regularization parameter in photoacoustic tomography.

    PubMed

    Shaw, Calvin B; Prakash, Jaya; Pramanik, Manojit; Yalavarthy, Phaneendra K

    2013-08-01

    A computationally efficient approach that computes the optimal regularization parameter for the Tikhonov-minimization scheme is developed for photoacoustic imaging. This approach is based on the least squares QR (LSQR) decomposition, a well-known dimensionality reduction technique for large systems of equations. It is shown that the proposed framework is effective in terms of quantitative and qualitative reconstructions of the initial pressure distribution, enabled by finding an optimal regularization parameter. The computational efficiency and performance of the proposed method are shown using a test case of a numerical blood vessel phantom, where the initial pressure is exactly known for quantitative comparison.
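    A rough illustration of the damped (Tikhonov) LSQR solve is given below, scanning a few candidate regularization weights and recording the residual and solution norms, e.g., for an L-curve style choice; the paper derives the optimal parameter more directly from the LSQR bidiagonalization, so treat this only as a sketch, with A, b, and the lambda grid as placeholders.

      import numpy as np
      from scipy.sparse.linalg import lsqr

      def scan_regularization(A, b, lambdas):
          # Solve min ||Ax - b||^2 + lam^2 ||x||^2 via damped LSQR for each lam and
          # record (lam, residual norm, solution norm) for an L-curve style choice.
          curve = []
          for lam in lambdas:
              x = lsqr(A, b, damp=lam)[0]
              curve.append((lam, np.linalg.norm(A @ x - b), np.linalg.norm(x)))
          return curve

      # Placeholder ill-conditioned system standing in for the photoacoustic model.
      rng = np.random.default_rng(0)
      A = rng.random((200, 100)) @ np.diag(np.logspace(0, -6, 100))
      x_true = rng.random(100)
      b = A @ x_true + 1e-4 * rng.standard_normal(200)
      for lam, res, sol in scan_regularization(A, b, [1e-6, 1e-4, 1e-2, 1.0]):
          print(f"lambda={lam:.0e}  residual={res:.3e}  ||x||={sol:.3e}")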

  12. C-arm technique using distance driven method for nephrolithiasis and kidney stones detection

    NASA Astrophysics Data System (ADS)

    Malalla, Nuhad; Sun, Pengfei; Chen, Ying; Lipkin, Michael E.; Preminger, Glenn M.; Qin, Jun

    2016-04-01

    The distance-driven approach is a state-of-the-art method used for reconstruction in x-ray techniques. C-arm tomography is an x-ray imaging technique that provides three-dimensional information about the object by moving the C-shaped gantry around the patient. With a limited view angle, the C-arm system was investigated to generate volumetric data of the object with low radiation dosage and examination time. This paper is a new simulation study with two reconstruction methods based on the distance-driven approach: the simultaneous algebraic reconstruction technique (SART) and maximum-likelihood expectation maximization (MLEM). The distance-driven method is efficient, with low computational cost, and is relatively free of artifacts compared with other methods such as ray-driven and pixel-driven methods. Projection images of spherical objects were simulated with a virtual C-arm system with a total view angle of 40 degrees. Results show the ability of the limited-angle C-arm technique to generate three-dimensional images with distance-driven reconstruction.
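    For reference, one SART update has the familiar normalized-backprojection form sketched below; here A is an abstract system matrix standing in for the distance-driven projector, and the relaxation factor and dense-matrix representation are illustrative simplifications rather than the paper's implementation.

      import numpy as np

      def sart_update(x, A, p, relax=0.5):
          # One SART iteration for A @ x ~= p: backproject the row-normalized
          # residual and normalize by the column sums (sensitivity) of the matrix.
          row_sums = A.sum(axis=1) + 1e-12
          col_sums = A.sum(axis=0) + 1e-12
          residual = (p - A @ x) / row_sums
          return x + relax * (A.T @ residual) / col_sums

      # Tiny placeholder problem: approximately recover a 4-pixel "image"
      # from 6 weighted sums.
      rng = np.random.default_rng(0)
      A = rng.random((6, 4))
      x_true = np.array([1.0, 0.5, 0.0, 2.0])
      p = A @ x_true
      x = np.zeros(4)
      for _ in range(200):
          x = sart_update(x, A, p)
      print(np.round(x, 3))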

  13. Computational Toxicology (S)

    EPA Science Inventory

    The emerging field of computational toxicology applies mathematical and computer models and molecular biological and chemical approaches to explore both qualitative and quantitative relationships between sources of environmental pollutant exposure and adverse health outcomes. Th...

  14. Documentation Driven Development for Complex Real-Time Systems

    DTIC Science & Technology

    2004-12-01

    This paper presents a novel approach for the development of complex real-time systems, called the documentation-driven development (DDD) approach. This… real-time systems. DDD will also support automated software generation based on a computational model and some relevant techniques. DDD includes two main… stakeholders to be easily involved in development processes and, therefore, significantly improve the agility of software development for complex real-time systems.

  15. Examining Data Driven Decision Making via Formative Assessment: A Confluence of Technology, Data Interpretation Heuristics and Curricular Policy

    ERIC Educational Resources Information Center

    Swan, Gerry; Mazur, Joan

    2011-01-01

    Although the term data-driven decision making (DDDM) is relatively new (Moss, 2007), the underlying concept of DDDM is not. For example, the practices of formative assessment and computer-managed instruction have historically involved the use of student performance data to guide what happens next in the instructional sequence (Morrison, Kemp, &…

  16. A Monthly Water-Balance Model Driven By a Graphical User Interface

    USGS Publications Warehouse

    McCabe, Gregory J.; Markstrom, Steven L.

    2007-01-01

    This report describes a monthly water-balance model driven by a graphical user interface, referred to as the Thornthwaite monthly water-balance program. Computations of monthly water-balance components of the hydrologic cycle are made for a specified location. The program can be used as a research tool, an assessment tool, and a tool for classroom instruction.
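    As a small illustration of the kind of computation such a water-balance program performs, the sketch below evaluates the classical (uncorrected) Thornthwaite potential-evapotranspiration term from mean monthly temperatures; the actual program also handles day-length correction, snow, soil-moisture storage, and runoff, so this is only the PET component under simplifying assumptions, with an invented temperature climatology.

      def thornthwaite_pet(monthly_temps_c):
          # Uncorrected Thornthwaite potential evapotranspiration (mm/month) from
          # mean monthly air temperature (deg C); months at or below 0 C give 0 PET.
          heat_index = sum((t / 5.0) ** 1.514 for t in monthly_temps_c if t > 0)
          a = (6.75e-7 * heat_index**3 - 7.71e-5 * heat_index**2
               + 1.792e-2 * heat_index + 0.49239)
          return [16.0 * (10.0 * t / heat_index) ** a if t > 0 else 0.0
                  for t in monthly_temps_c]

      # Example: a mid-latitude temperature climatology (deg C), January-December.
      temps = [-2.0, 0.5, 5.0, 10.5, 16.0, 20.5, 23.0, 22.0, 17.5, 11.0, 5.5, 0.0]
      print([round(p, 1) for p in thornthwaite_pet(temps)])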

  17. The NTeQ ISD Model: A Tech-Driven Model for Digital Natives (DNs)

    ERIC Educational Resources Information Center

    Williams, C.; Anekwe, J. U.

    2017-01-01

    The Integrating Technology for Enquiry (NTeQ) instructional development (ISD) model is believed to be a technology-driven model. The authors x-rayed the ten-step model to reaffirm the ICT knowledge demands placed on the learner and the educator; hence, computer-based activities at various stages of the model are core elements. The model also is conscious of…

  18. Data-driven discovery of new Dirac semimetal materials

    NASA Astrophysics Data System (ADS)

    Yan, Qimin; Chen, Ru; Neaton, Jeffrey

    In recent years, a significant amount of materials property data from high-throughput computations based on density functional theory (DFT) and the application of database technologies have enabled the rise of data-driven materials discovery. In this work, we extend the data-driven materials discovery framework to the realm of topological semimetals in order to accelerate the discovery of novel Dirac semimetals. We implement currently available workflows and develop new ones to data-mine the Materials Project database for novel Dirac semimetals with desirable band structures and symmetry-protected topological properties. This data-driven effort relies on the successful development of several automatic data generation and analysis tools, including a workflow for the automatic identification of topological invariants and pattern recognition techniques to find specific features in a massive number of computed band structures. Utilizing this approach, we successfully identified more than 15 novel Dirac point and Dirac nodal line systems that have not been theoretically predicted or experimentally identified. This work is supported by the Materials Project Predictive Modeling Center through the U.S. Department of Energy, Office of Basic Energy Sciences, Materials Sciences and Engineering Division, under Contract No. DE-AC02-05CH11231.

  19. A specialized plug-in software module for computer-aided quantitative measurement of medical images.

    PubMed

    Wang, Q; Zeng, Y J; Huo, P; Hu, J L; Zhang, J H

    2003-12-01

    This paper presents a specialized system for quantitative measurement of medical images. Using Visual C++, we developed a computer-aided software based on Image-Pro Plus (IPP), a software development platform. When transferred to the hard disk of a computer by an MVPCI-V3A frame grabber, medical images can be automatically processed by our own IPP plug-in for immunohistochemical analysis, cytomorphological measurement and blood vessel segmentation. In 34 clinical studies, the system has shown its high stability, reliability and ease of utility.

  20. Two-fluid flowing equilibria of spherical torus sustained by coaxial helicity injection

    NASA Astrophysics Data System (ADS)

    Kanki, Takashi; Steinhauer, Loren; Nagata, Masayoshi

    2007-11-01

    Two-dimensional equilibria in helicity-driven systems using two-fluid model were previously computed, showing the existence of an ultra-low-q spherical torus (ST) configuration with diamagnetism and higher beta. However, this computation assumed purely toroidal ion flow and uniform density. The purpose of the present study is to apply the two-fluid model to the two-dimensional equilibria of helicity-driven ST with non-uniform density and both toroidal and poloidal flows for each species by means of the nearby-fluids procedure, and to explore their properties. We focus our attention on the equilibria relevant to the HIST device, which are characterized by either driven or decaying λ profiles. The equilibrium for the driven λ profile has a diamagnetic toroidal field, high-β (βt = 32%), and centrally broad density. By contrast, the decaying equilibrium has a paramagnetic toroidal field, low-β (βt = 10%), and centrally peaked density with a steep gradient in the outer edge region. In the driven case, the toroidal ion and electron flows are in the same direction, and two-fluid effects are less important since the ExB drift is dominant. In the decaying case, the toroidal ion and electron flows are opposite in the outer edge region, and two-fluid effects are significant locally in the edge due to the ion diamagnetic drift.

  1. Scoring and ranking of metabolic trees to computationally ...

    EPA Pesticide Factsheets

    Increasing awareness about endocrine disrupting chemicals (EDCs) in the environment has driven concern about their potential impact on human health and wildlife. Tens of thousands of natural and synthetic xenobiotics are presently in commerce with little to no toxicity data and therefore uncertainty about their impact on estrogen receptor (ER) signaling pathways and other toxicity endpoints. As such, there is a need for strategies that make use of available data to prioritize chemicals for testing. One of the major achievements within the EPA's Endocrine Disruptor Screening Program (EDSP) was the network model combining 18 ER in vitro assays from ToxCast to predict in vivo estrogenic activity. This model overcomes the limitations of single in vitro assays at different steps of the ER pathway. However, it lacks many relevant features required to estimate safe exposure levels, and the composite assays do not consider the complex metabolic processes that might produce bioactive entities in a living system. This problem is typically addressed using in vivo assays. The aim of this work is to design a computational and in vitro approach to prioritize compounds and perform a quantitative safety assessment. To this end, we pursue a tiered approach taking into account bioactivity and bioavailability of chemicals and their metabolites using a human uterine epithelial cell (Ishikawa)-based assay. This biologically relevant fit-for-purpose assay was designed to quantitati…

  2. A novel 3D micron-scale DPTV (Defocused Particle Tracking Velocimetry) and its applications in microfluidic devices

    NASA Astrophysics Data System (ADS)

    Roberts, John

    2005-11-01

    The rapid advancements in micro/nano biotechnology demand quantitative tools for characterizing microfluidic flows in lab-on-a-chip applications, validation of computational results for fully 3D flows in complex micro-devices, and efficient observation of cellular dynamics in 3D. We present a novel 3D micron-scale DPTV (defocused particle tracking velocimetry) that is capable of mapping out 3D Lagrangian, as well as 3D Eulerian, velocity flow fields at sub-micron resolution with a single camera. The main part of the imaging system is an epi-fluorescent microscope (Olympus IX 51), and the seeding particles are fluorescent particles with diameters ranging from 300 nm to 10 um. A software package has been developed for identifying the (x,y,z,t) coordinates of the particles using the defocused images. Using the imaging system, we successfully mapped the pressure-driven flow fields in microfluidic channels. In particular, we measured the Lagrangian flow fields in a microfluidic channel with a herringbone pattern at the bottom, the latter being used to enhance fluid mixing in lateral directions. The 3D particle tracks revealed flow structure that had previously been seen only in numerical computation. This work is supported by the National Science Foundation (CTS - 0514443), the Nanobiotechnology Center at Cornell, and The New York State Center for Life Science Enterprise.

  3. High-efficiency high-energy Kα source for the critically-required maximum illumination of x-ray optics on Z using Z-petawatt-driven laser-breakout-afterburner accelerated ultrarelativistic electrons LDRD.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sefkow, Adam B.; Bennett, Guy R.

    2010-09-01

    Under the auspices of the Science of Extreme Environments LDRD program, a <2 year theoretical- and computational-physics study was performed (LDRD Project 130805) by Guy R. Bennett (formerly in Center-01600) and Adam B. Sefkow (Center-01600) to investigate novel target designs by which a short-pulse, PW-class beam could create a brighter Kα x-ray source than by simple direct laser irradiation of a flat foil (Direct-Foil-Irradiation, DFI). The computational studies - which are still ongoing at this writing - were performed primarily on the RedStorm supercomputer at Sandia National Laboratories' Albuquerque site. The motivation for a higher-efficiency Kα emitter was very clear: as the backlighter flux for any x-ray imaging technique on the Z accelerator increases, the signal-to-noise and signal-to-background ratios improve. This ultimately allows the imaging system to reach its full quantitative potential as a diagnostic. Depending on the particular application/experiment this would imply, for example, that the system would have reached its full design spatial resolution and thus the capability to see features that might otherwise be indiscernible with a traditional DFI-like x-ray source. This LDRD began in FY09 and ended in FY10.

  4. Topological analysis of group fragmentation in multiagent systems

    NASA Astrophysics Data System (ADS)

    DeLellis, Pietro; Porfiri, Maurizio; Bollt, Erik M.

    2013-02-01

    In social animals, the presence of conflicts of interest or multiple leaders can promote the emergence of two or more subgroups. Such subgroups are easily recognizable by human observers, yet a quantitative and objective measure of group fragmentation is currently lacking. In this paper, we explore the feasibility of detecting group fragmentation by embedding the raw data from the individuals' motions on a low-dimensional manifold and analyzing the topological features of this manifold. To perform the embedding, we employ the isomap algorithm, which is a data-driven machine learning tool extensively used in computer vision. We implement this procedure on a data set generated by a modified à la Vicsek model, where agents are partitioned into two or more subsets and an independent leader is assigned to each subset. The dimensionality of the embedding manifold is shown to be a measure of the number of emerging subgroups in the selected observation window and a cluster analysis is proposed to aid the interpretation of these findings. To explore the feasibility of using this approach to characterize group fragmentation in real time and thus reduce the computational cost in data processing and storage, we propose an interpolation method based on an inverse mapping from the embedding space to the original space. The effectiveness of the interpolation technique is illustrated on a test-bed example with potential impact on the regulation of collective behavior of animal groups using robotic stimuli.
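    The embedding step can be sketched with off-the-shelf tools as below, reporting the Isomap reconstruction error as the target dimension grows, which is one crude way to gauge the dimensionality of the embedding manifold; the stacking of agent coordinates, the neighborhood size, and the toy two-leader data are illustrative assumptions and not the paper's actual pipeline or fragmentation measure.

      import numpy as np
      from sklearn.manifold import Isomap

      def isomap_error_profile(frames, max_dim=5, n_neighbors=10):
          # frames: (n_frames, n_agents * 2) array of stacked (x, y) agent positions
          # for one observation window.  Returns the Isomap reconstruction error for
          # increasing target dimension; where the error levels off is a crude proxy
          # for the dimensionality of the embedding manifold.
          return [Isomap(n_neighbors=n_neighbors, n_components=d).fit(frames).reconstruction_error()
                  for d in range(1, max_dim + 1)]

      # Toy window: 20 agents split between two independently wandering leaders.
      rng = np.random.default_rng(0)
      leader_a = rng.standard_normal((400, 2)).cumsum(axis=0)
      leader_b = rng.standard_normal((400, 2)).cumsum(axis=0)
      frames = np.hstack([np.tile(leader_a, 10), np.tile(leader_b, 10)])
      frames += 0.05 * rng.standard_normal(frames.shape)
      print([round(e, 3) for e in isomap_error_profile(frames)])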

  5. Advances in computational approaches for prioritizing driver mutations and significantly mutated genes in cancer genomes.

    PubMed

    Cheng, Feixiong; Zhao, Junfei; Zhao, Zhongming

    2016-07-01

    Cancer is often driven by the accumulation of genetic alterations, including single nucleotide variants, small insertions or deletions, gene fusions, copy-number variations, and large chromosomal rearrangements. Recent advances in next-generation sequencing technologies have helped investigators generate massive amounts of cancer genomic data and catalog somatic mutations in both common and rare cancer types. So far, the somatic mutation landscapes and signatures of >10 major cancer types have been reported; however, pinpointing driver mutations and cancer genes from millions of available cancer somatic mutations remains a monumental challenge. To tackle this important task, many methods and computational tools have been developed during the past several years and, thus, a review of its advances is urgently needed. Here, we first summarize the main features of these methods and tools for whole-exome, whole-genome and whole-transcriptome sequencing data. Then, we discuss major challenges like tumor intra-heterogeneity, tumor sample saturation and functionality of synonymous mutations in cancer, all of which may result in false-positive discoveries. Finally, we highlight new directions in studying regulatory roles of noncoding somatic mutations and quantitatively measuring circulating tumor DNA in cancer. This review may help investigators find an appropriate tool for detecting potential driver or actionable mutations in rapidly emerging precision cancer medicine. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  6. An Interactive Computer Program for Simulating the Effects of Olivine Fractionation from Basaltic and Ultrabasic Liquids.

    ERIC Educational Resources Information Center

    Pearce, Thomas H.

    1983-01-01

    Describes interactive computer program (listing available from author) which simulates olivine fractionation from basaltic/ultrabasic liquid. The menu-driven nature of the program (for Apple II microcomputer) allows students to select ideal Rayleigh fractionation or equilibrium crystallization. (JN)
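    For context, the two crystallization laws such a program typically lets students compare can be written in a few lines; the concentrations and the bulk partition coefficient below are illustrative values, and this Python sketch is not the Apple II program itself.

      def rayleigh_liquid(c0, melt_fraction, d):
          # Perfect (Rayleigh) fractional crystallization: C_l = C0 * F**(D - 1),
          # with F the melt fraction remaining and D the bulk partition coefficient.
          return c0 * melt_fraction ** (d - 1.0)

      def equilibrium_liquid(c0, melt_fraction, d):
          # Equilibrium (batch) crystallization: C_l = C0 / (D + F * (1 - D)).
          return c0 / (d + melt_fraction * (1.0 - d))

      # Illustrative: a compatible element (D = 10, e.g. Ni in olivine) after up to
      # 20% olivine removal from a liquid that started at 300 ppm.
      for f in (1.0, 0.9, 0.8):
          print(f, round(rayleigh_liquid(300.0, f, 10.0), 1),
                round(equilibrium_liquid(300.0, f, 10.0), 1))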

  7. High-Performance Computing in Neuroscience for Data-Driven Discovery, Integration, and Dissemination

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bouchard, Kristofer E.; Aimone, James B.; Chun, Miyoung

    A lack of coherent plans to analyze, manage, and understand data threatens the various opportunities offered by new neuro-technologies. High-performance computing will allow exploratory analysis of massive datasets stored in standardized formats, hosted in open repositories, and integrated with simulations.

  8. High-Performance Computing in Neuroscience for Data-Driven Discovery, Integration, and Dissemination

    DOE PAGES

    Bouchard, Kristofer E.; Aimone, James B.; Chun, Miyoung; ...

    2016-11-01

    A lack of coherent plans to analyze, manage, and understand data threatens the various opportunities offered by new neuro-technologies. High-performance computing will allow exploratory analysis of massive datasets stored in standardized formats, hosted in open repositories, and integrated with simulations.

  9. Quantitative Adverse Outcome Pathways and Their Application to Predictive Toxicology

    EPA Science Inventory

    A quantitative adverse outcome pathway (qAOP) consists of one or more biologically based, computational models describing key event relationships linking a molecular initiating event (MIE) to an adverse outcome. A qAOP provides quantitative, dose–response, and time-course p...

  10. Prediction of Environmental Impact of High-Energy Materials with Atomistic Computer Simulations

    DTIC Science & Technology

    2010-11-01

    …from a training set of compounds. Other methods include Quantitative Structure-Activity Relationship (QSAR) and Quantitative Structure-Property… the development of QSPR/QSAR models, in contrast to boiling points and critical parameters derived from empirical correlations, to improve…

  11. Anniversary Paper: History and status of CAD and quantitative image analysis: The role of Medical Physics and AAPM

    PubMed Central

    Giger, Maryellen L.; Chan, Heang-Ping; Boone, John

    2008-01-01

    The roles of physicists in medical imaging have expanded over the years, from the study of imaging systems (sources and detectors) and dose to the assessment of image quality and perception, the development of image processing techniques, and the development of image analysis methods to assist in detection and diagnosis. The latter is a natural extension of medical physicists’ goals in developing imaging techniques to help physicians acquire diagnostic information and improve clinical decisions. Studies indicate that radiologists do not detect all abnormalities on images that are visible on retrospective review, and they do not always correctly characterize abnormalities that are found. Since the 1950s, the potential use of computers had been considered for analysis of radiographic abnormalities. In the mid-1980s, however, medical physicists and radiologists began major research efforts for computer-aided detection or computer-aided diagnosis (CAD), that is, using the computer output as an aid to radiologists—as opposed to a completely automatic computer interpretation—focusing initially on methods for the detection of lesions on chest radiographs and mammograms. Since then, extensive investigations of computerized image analysis for detection or diagnosis of abnormalities in a variety of 2D and 3D medical images have been conducted. The growth of CAD over the past 20 years has been tremendous—from the early days of time-consuming film digitization and CPU-intensive computations on a limited number of cases to its current status in which developed CAD approaches are evaluated rigorously on large clinically relevant databases. CAD research by medical physicists includes many aspects—collecting relevant normal and pathological cases; developing computer algorithms appropriate for the medical interpretation task including those for segmentation, feature extraction, and classifier design; developing methodology for assessing CAD performance; validating the algorithms using appropriate cases to measure performance and robustness; conducting observer studies with which to evaluate radiologists in the diagnostic task without and with the use of the computer aid; and ultimately assessing performance with a clinical trial. Medical physicists also have an important role in quantitative imaging, by validating the quantitative integrity of scanners and developing imaging techniques, and image analysis tools that extract quantitative data in a more accurate and automated fashion. As imaging systems become more complex and the need for better quantitative information from images grows, the future includes the combined research efforts from physicists working in CAD with those working on quantitative imaging systems to readily yield information on morphology, function, molecular structure, and more—from animal imaging research to clinical patient care. A historical review of CAD and a discussion of challenges for the future are presented here, along with the extension to quantitative image analysis. PMID:19175137

  12. Anniversary Paper: History and status of CAD and quantitative image analysis: The role of Medical Physics and AAPM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giger, Maryellen L.; Chan, Heang-Ping; Boone, John

    2008-12-15

    The roles of physicists in medical imaging have expanded over the years, from the study of imaging systems (sources and detectors) and dose to the assessment of image quality and perception, the development of image processing techniques, and the development of image analysis methods to assist in detection and diagnosis. The latter is a natural extension of medical physicists' goals in developing imaging techniques to help physicians acquire diagnostic information and improve clinical decisions. Studies indicate that radiologists do not detect all abnormalities on images that are visible on retrospective review, and they do not always correctly characterize abnormalities that are found. Since the 1950s, the potential use of computers had been considered for analysis of radiographic abnormalities. In the mid-1980s, however, medical physicists and radiologists began major research efforts for computer-aided detection or computer-aided diagnosis (CAD), that is, using the computer output as an aid to radiologists--as opposed to a completely automatic computer interpretation--focusing initially on methods for the detection of lesions on chest radiographs and mammograms. Since then, extensive investigations of computerized image analysis for detection or diagnosis of abnormalities in a variety of 2D and 3D medical images have been conducted. The growth of CAD over the past 20 years has been tremendous--from the early days of time-consuming film digitization and CPU-intensive computations on a limited number of cases to its current status in which developed CAD approaches are evaluated rigorously on large clinically relevant databases. CAD research by medical physicists includes many aspects--collecting relevant normal and pathological cases; developing computer algorithms appropriate for the medical interpretation task including those for segmentation, feature extraction, and classifier design; developing methodology for assessing CAD performance; validating the algorithms using appropriate cases to measure performance and robustness; conducting observer studies with which to evaluate radiologists in the diagnostic task without and with the use of the computer aid; and ultimately assessing performance with a clinical trial. Medical physicists also have an important role in quantitative imaging, by validating the quantitative integrity of scanners and developing imaging techniques, and image analysis tools that extract quantitative data in a more accurate and automated fashion. As imaging systems become more complex and the need for better quantitative information from images grows, the future includes the combined research efforts from physicists working in CAD with those working on quantitative imaging systems to readily yield information on morphology, function, molecular structure, and more--from animal imaging research to clinical patient care. A historical review of CAD and a discussion of challenges for the future are presented here, along with the extension to quantitative image analysis.

  13. INVESTIGATING UNCERTAINTY AND SENSITIVITY IN INTEGRATED, MULTIMEDIA ENVIRONMENTAL MODELS: TOOLS FOR FRAMES-3MRA

    EPA Science Inventory

    Elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-medium constructs driven by a unique set of site-specific data. Quantitative assessment of integrated, multimedia models that simulate hundreds of sites...

  14. Modeling noisy resonant system response

    NASA Astrophysics Data System (ADS)

    Weber, Patrick Thomas; Walrath, David Edwin

    2017-02-01

    In this paper, a theory-based model replicating empirical acoustic resonant signals is presented and studied to understand sources of noise present in acoustic signals. Statistical properties of empirical signals are quantified and a noise amplitude parameter, which models frequency and amplitude-based noise, is created, defined, and presented. This theory-driven model isolates each phenomenon and allows for parameters to be independently studied. Using seven independent degrees of freedom, this model will accurately reproduce qualitative and quantitative properties measured from laboratory data. Results are presented and demonstrate success in replicating qualitative and quantitative properties of experimental data.
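
    As an illustration only, the short Python sketch below builds a damped resonant signal whose amplitude and frequency are both perturbed by a single noise-amplitude parameter, in the spirit of the model described above. The function name, parameter values, and the exact form of the noise terms are assumptions for illustration, not the seven-degree-of-freedom model of the paper.

        import numpy as np

        def noisy_resonant_signal(t, amp=1.0, f0=5_000.0, zeta=0.02,
                                  noise_amp=0.05, rng=None):
            """Damped resonant response with amplitude and frequency noise.

            noise_amp scales both the additive amplitude noise and the random
            jitter applied to the resonant frequency (illustrative only).
            """
            rng = np.random.default_rng() if rng is None else rng
            f_jittered = f0 * (1.0 + noise_amp * rng.standard_normal(t.size))
            clean = (amp * np.exp(-zeta * 2 * np.pi * f0 * t)
                     * np.sin(2 * np.pi * f_jittered * t))
            return clean + noise_amp * amp * rng.standard_normal(t.size)

        t = np.linspace(0.0, 0.01, 5000)          # 10 ms observation window
        signal = noisy_resonant_signal(t)
        print(signal.std())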

  15. Clinical application of a light-pen computer system for quantitative angiography

    NASA Technical Reports Server (NTRS)

    Alderman, E. L.

    1975-01-01

    The paper describes an angiographic analysis system which uses a video disk for recording and playback, a light-pen for data input, minicomputer processing, and an electrostatic printer/plotter for hardcopy output. The method is applied to quantitative analysis of ventricular volumes, sequential ventriculography for assessment of physiologic and pharmacologic interventions, analysis of instantaneous time sequence of ventricular systolic and diastolic events, and quantitation of segmental abnormalities. The system is shown to provide the capability for computation of ventricular volumes and other measurements from operator-defined margins by greatly reducing the tedium and errors associated with manual planimetry.
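
    The paper's exact volume formulas are not reproduced above, so the following Python sketch only illustrates the generic workflow: a planimetered area from an operator-traced margin via the shoelace formula, followed by a single-plane area-length volume estimate (V = 8A^2 / 3 pi L). The function names and the toy outline are assumptions for illustration, not the light-pen system's algorithm.

        import numpy as np

        def planimeter_area(points):
            """Enclosed area of a traced ventricular margin (shoelace formula).
            `points` is an (N, 2) array of x, y coordinates in cm."""
            x, y = points[:, 0], points[:, 1]
            return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

        def area_length_volume(area_cm2, long_axis_cm):
            """Single-plane area-length estimate of ventricular volume (mL):
            V = 8 * A^2 / (3 * pi * L)."""
            return 8.0 * area_cm2 ** 2 / (3.0 * np.pi * long_axis_cm)

        margin = np.array([[0, 0], [4, 0], [4, 7], [0, 7]], float)  # toy traced outline
        A = planimeter_area(margin)
        print(round(area_length_volume(A, long_axis_cm=7.0), 1))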

  16. Quantitative relations between fishing mortality, spawning stress mortality and biomass growth rate (computed with numerical model FISHMO)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laevastu, T.

    1983-01-01

    The effects of fishing on a given species biomass have been quantitatively evaluated. A constant recruitment is assumed in this study, but the evaluation can be computed on any known age distribution of exploitable biomass. Fishing mortality is assumed to be constant with age; however, spawning stress mortality increases with age. When fishing (mortality) increases, the spawning stress mortality decreases relative to total and exploitable biomasses. These changes are quantitatively shown for two species from the Bering Sea - walleye pollock, Theragra chalcogramma, and yellowfin sole, Limanda aspera.
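
    As a hedged sketch of the kind of calculation involved (not the FISHMO model itself), the Python snippet below computes equilibrium biomass under constant recruitment, an age-independent fishing mortality F, and a spawning stress mortality that increases linearly with age; all parameter values and the growth curve are illustrative assumptions.

        import numpy as np

        def equilibrium_biomass(F, ages=np.arange(1, 16), M0=0.2,
                                spawn_base=0.05, spawn_slope=0.03,
                                recruits=1.0, w_inf=1.0, k=0.3):
            """Equilibrium exploitable biomass under constant recruitment.

            F            instantaneous fishing mortality (assumed age-independent)
            M0           baseline natural mortality
            spawn_*      spawning stress mortality, rising linearly with age
            Weights-at-age follow a simple von Bertalanffy-like curve (illustrative).
            """
            spawn_M = spawn_base + spawn_slope * (ages - ages[0])
            Z = F + M0 + spawn_M                       # total mortality at age
            survival = np.concatenate(([1.0], np.exp(-np.cumsum(Z[:-1]))))
            numbers = recruits * survival              # equilibrium numbers-at-age
            weight = w_inf * (1.0 - np.exp(-k * ages)) ** 3
            return float(np.sum(numbers * weight))

        for F in (0.0, 0.2, 0.4):                      # increasing fishing pressure
            print(F, round(equilibrium_biomass(F), 3))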

  17. Computing Quantitative Characteristics of Finite-State Real-Time Systems

    DTIC Science & Technology

    1994-05-04

    Current methods for verifying real-time systems are essentially decision procedures that establish whether the system model satisfies a given ... specification. We present a general method for computing quantitative information about finite-state real-time systems. We have developed algorithms that ... our technique can be extended to a more general representation of real-time systems, namely, timed transition graphs. The algorithms presented in this ...
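
    The report's own algorithms are not reproduced above, so the Python sketch below only illustrates one representative quantitative query on a timed transition graph: the minimum accumulated delay from an initial state to a target state, computed with a Dijkstra-style search. The example graph and delay values are assumptions for illustration.

        import heapq

        def min_delay(graph, start, target):
            """Minimum accumulated delay from `start` to `target`.
            `graph` maps state -> list of (next_state, delay) transitions."""
            dist = {start: 0.0}
            heap = [(0.0, start)]
            while heap:
                d, state = heapq.heappop(heap)
                if state == target:
                    return d
                if d > dist.get(state, float("inf")):
                    continue
                for nxt, delay in graph.get(state, []):
                    nd = d + delay
                    if nd < dist.get(nxt, float("inf")):
                        dist[nxt] = nd
                        heapq.heappush(heap, (nd, nxt))
            return float("inf")

        ttg = {"idle": [("req", 1.0)],
               "req": [("grant", 2.5), ("idle", 0.5)],
               "grant": []}
        print(min_delay(ttg, "idle", "grant"))   # 3.5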

  18. Women are underrepresented in computational biology: An analysis of the scholarly literature in biology, computer science and computational biology.

    PubMed

    Bonham, Kevin S; Stefan, Melanie I

    2017-10-01

    While women are generally underrepresented in STEM fields, there are noticeable differences between fields. For instance, the gender ratio in biology is more balanced than in computer science. We were interested in how this difference is reflected in the interdisciplinary field of computational/quantitative biology. To this end, we examined the proportion of female authors in publications from the PubMed and arXiv databases. There are fewer female authors on research papers in computational biology, as compared to biology in general. This is true across authorship position, year, and journal impact factor. A comparison with arXiv shows that quantitative biology papers have a higher ratio of female authors than computer science papers, placing computational biology in between its two parent fields in terms of gender representation. Both in biology and in computational biology, a female last author increases the probability of other authors on the paper being female, pointing to a potential role of female PIs in influencing the gender balance.

  19. Accuracy and Precision of Radioactivity Quantification in Nuclear Medicine Images

    PubMed Central

    Frey, Eric C.; Humm, John L.; Ljungberg, Michael

    2012-01-01

    The ability to reliably quantify activity in nuclear medicine has a number of increasingly important applications. Dosimetry for targeted therapy treatment planning or for approval of new imaging agents requires accurate estimation of the activity in organs, tumors, or voxels at several imaging time points. Another important application is the use of quantitative metrics derived from images, such as the standard uptake value commonly used in positron emission tomography (PET), to diagnose and follow treatment of tumors. These measures require quantification of organ or tumor activities in nuclear medicine images. However, there are a number of physical, patient, and technical factors that limit the quantitative reliability of nuclear medicine images. There have been a large number of improvements in instrumentation, including the development of hybrid single-photon emission computed tomography/computed tomography and PET/computed tomography systems, and reconstruction methods, including the use of statistical iterative reconstruction methods, which have substantially improved the ability to obtain reliable quantitative information from planar, single-photon emission computed tomography, and PET images. PMID:22475429

  20. Toward Computational Cumulative Biology by Combining Models of Biological Datasets

    PubMed Central

    Faisal, Ali; Peltonen, Jaakko; Georgii, Elisabeth; Rung, Johan; Kaski, Samuel

    2014-01-01

    A main challenge of data-driven sciences is how to make maximal use of the progressively expanding databases of experimental datasets in order to keep research cumulative. We introduce the idea of a modeling-based dataset retrieval engine designed for relating a researcher's experimental dataset to earlier work in the field. The search is (i) data-driven to enable new findings, going beyond the state of the art of keyword searches in annotations, (ii) modeling-driven, to include both biological knowledge and insights learned from data, and (iii) scalable, as it is accomplished without building one unified grand model of all data. Assuming each dataset has been modeled beforehand, by the researchers or automatically by database managers, we apply a rapidly computable and optimizable combination model to decompose a new dataset into contributions from earlier relevant models. By using the data-driven decomposition, we identify a network of interrelated datasets from a large annotated human gene expression atlas. While tissue type and disease were major driving forces for determining relevant datasets, the found relationships were richer, and the model-based search was more accurate than the keyword search; moreover, it recovered biologically meaningful relationships that are not straightforwardly visible from annotations—for instance, between cells in different developmental stages such as thymocytes and T-cells. Data-driven links and citations matched to a large extent; the data-driven links even uncovered corrections to the publication data, as two of the most linked datasets were not highly cited and turned out to have wrong publication entries in the database. PMID:25427176

  1. Toward computational cumulative biology by combining models of biological datasets.

    PubMed

    Faisal, Ali; Peltonen, Jaakko; Georgii, Elisabeth; Rung, Johan; Kaski, Samuel

    2014-01-01

    A main challenge of data-driven sciences is how to make maximal use of the progressively expanding databases of experimental datasets in order to keep research cumulative. We introduce the idea of a modeling-based dataset retrieval engine designed for relating a researcher's experimental dataset to earlier work in the field. The search is (i) data-driven to enable new findings, going beyond the state of the art of keyword searches in annotations, (ii) modeling-driven, to include both biological knowledge and insights learned from data, and (iii) scalable, as it is accomplished without building one unified grand model of all data. Assuming each dataset has been modeled beforehand, by the researchers or automatically by database managers, we apply a rapidly computable and optimizable combination model to decompose a new dataset into contributions from earlier relevant models. By using the data-driven decomposition, we identify a network of interrelated datasets from a large annotated human gene expression atlas. While tissue type and disease were major driving forces for determining relevant datasets, the found relationships were richer, and the model-based search was more accurate than the keyword search; moreover, it recovered biologically meaningful relationships that are not straightforwardly visible from annotations-for instance, between cells in different developmental stages such as thymocytes and T-cells. Data-driven links and citations matched to a large extent; the data-driven links even uncovered corrections to the publication data, as two of the most linked datasets were not highly cited and turned out to have wrong publication entries in the database.
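
    The combination model used in the paper is probabilistic; as a much simpler stand-in, the Python sketch below decomposes a new dataset's summary vector into non-negative contributions from previously modeled dataset signatures using non-negative least squares. The profile vectors and dataset names are synthetic assumptions for illustration.

        import numpy as np
        from scipy.optimize import nnls

        def decompose(new_profile, model_profiles):
            """Non-negative weights expressing a new dataset profile as a
            combination of previously modeled dataset profiles.

            new_profile     (d,) summary vector of the new dataset
            model_profiles  dict name -> (d,) signature of an earlier model
            """
            names = list(model_profiles)
            A = np.column_stack([model_profiles[n] for n in names])
            weights, residual = nnls(A, new_profile)
            ranked = sorted(zip(names, weights), key=lambda nw: -nw[1])
            return ranked, residual

        rng = np.random.default_rng(0)
        models = {f"dataset_{i}": rng.random(20) for i in range(5)}
        query = 0.7 * models["dataset_2"] + 0.3 * models["dataset_4"]
        print(decompose(query, models)[0][:2])   # top two contributing datasets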

  2. Computer-based training (CBT) intervention reduces workplace violence and harassment for homecare workers.

    PubMed

    Glass, Nancy; Hanson, Ginger C; Anger, W Kent; Laharnar, Naima; Campbell, Jacquelyn C; Weinstein, Marc; Perrin, Nancy

    2017-07-01

    The study examines the effectiveness of a workplace violence and harassment prevention and response program with female homecare workers in a consumer-driven model of care. Homecare workers were randomized to either computer-based training alone (CBT only) or computer-based training with homecare worker peer facilitation (CBT + peer). Participants completed measures on confidence, incidents of violence and harassment, and health and work outcomes at baseline and at 3 and 6 months post-baseline. Homecare workers reported improved confidence to prevent and respond to workplace violence and harassment and a reduction in incidents of workplace violence and harassment in both groups at 6-month follow-up. A decrease in negative health and work outcomes associated with violence and harassment was not reported in either group. CBT alone or with trained peer facilitation with homecare workers can increase confidence and reduce incidents of workplace violence and harassment in a consumer-driven model of care. © 2017 Wiley Periodicals, Inc.

  3. Strategies for concurrent processing of complex algorithms in data driven architectures

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.; Mielke, Roland R.

    1988-01-01

    The purpose is to document research to develop strategies for concurrent processing of complex algorithms in data driven architectures. The problem domain consists of decision-free algorithms having large-grained, computationally complex primitive operations. Such are often found in signal processing and control applications. The anticipated multiprocessor environment is a data flow architecture containing between two and twenty computing elements. Each computing element is a processor having local program memory, and which communicates with a common global data memory. A new graph theoretic model called ATAMM which establishes rules for relating a decomposed algorithm to its execution in a data flow architecture is presented. The ATAMM model is used to determine strategies to achieve optimum time performance and to develop a system diagnostic software tool. In addition, preliminary work on a new multiprocessor operating system based on the ATAMM specifications is described.
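
    ATAMM's execution rules are not reproduced above; the Python sketch below only illustrates the general idea of a data-driven schedule, firing each large-grained primitive operation on a worker pool as soon as all of its inputs are available. The graph, node functions, and polling loop are assumptions for illustration.

        import time
        from concurrent.futures import ThreadPoolExecutor

        def run_dataflow(nodes, edges, workers=4):
            """Execute a decision-free dataflow graph.

            nodes  dict name -> callable taking the dict of its input values
            edges  dict name -> list of upstream node names (empty for sources)
            A node fires as soon as all of its upstream results are available.
            """
            remaining = {n: set(deps) for n, deps in edges.items()}
            results, futures = {}, {}
            with ThreadPoolExecutor(max_workers=workers) as pool:
                def submit_ready():
                    for n, deps in list(remaining.items()):
                        if not deps and n not in futures:
                            inputs = {d: results[d] for d in edges[n]}
                            futures[n] = pool.submit(nodes[n], inputs)
                submit_ready()
                while len(results) < len(nodes):
                    for n, fut in list(futures.items()):
                        if n not in results and fut.done():
                            results[n] = fut.result()
                            for deps in remaining.values():
                                deps.discard(n)
                    submit_ready()
                    time.sleep(0.001)          # simple polling; fine for a sketch
            return results

        graph_fns = {"a": lambda _: 2, "b": lambda _: 3,
                     "c": lambda inp: inp["a"] * inp["b"]}
        print(run_dataflow(graph_fns, {"a": [], "b": [], "c": ["a", "b"]})["c"])  # 6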

  4. Principle and experimental investigation of current-driven negative-inductance superconducting quantum interference device

    NASA Astrophysics Data System (ADS)

    Li, Hao; Liu, Jianshe; Zhang, Yingshan; Cai, Han; Li, Gang; Liu, Qichun; Han, Siyuan; Chen, Wei

    2017-03-01

    A negative-inductance superconducting quantum interference device (nSQUID) is an adiabatic superconducting logic device with high energy efficiency, and therefore a promising building block for large-scale low-power superconducting computing. However, the principle of the nSQUID is not that straightforward and an nSQUID driven by voltage is vulnerable to common mode noise. We investigate a single nSQUID driven by current instead of voltage, and clarify the principle of the adiabatic transition of the current-driven nSQUID between different states. The basic logic operations of the current-driven nSQUID with proper parameters are simulated by WRspice. The corresponding circuit is fabricated with a 100 A cm-2 Nb-based lift-off process, and the experimental results at low temperature confirm the basic logic operations as a gated buffer.

  5. 1, 2, 3, 4: infusing quantitative literacy into introductory biology.

    PubMed

    Speth, Elena Bray; Momsen, Jennifer L; Moyerbrailean, Gregory A; Ebert-May, Diane; Long, Tammy M; Wyse, Sara; Linton, Debra

    2010-01-01

    Biology of the twenty-first century is an increasingly quantitative science. Undergraduate biology education therefore needs to provide opportunities for students to develop fluency in the tools and language of quantitative disciplines. Quantitative literacy (QL) is important for future scientists as well as for citizens, who need to interpret numeric information and data-based claims regarding nearly every aspect of daily life. To address the need for QL in biology education, we incorporated quantitative concepts throughout a semester-long introductory biology course at a large research university. Early in the course, we assessed the quantitative skills that students bring to the introductory biology classroom and found that students had difficulties in performing simple calculations, representing data graphically, and articulating data-driven arguments. In response to students' learning needs, we infused the course with quantitative concepts aligned with the existing course content and learning objectives. The effectiveness of this approach is demonstrated by significant improvement in the quality of students' graphical representations of biological data. Infusing QL in introductory biology presents challenges. Our study, however, supports the conclusion that it is feasible in the context of an existing course, consistent with the goals of college biology education, and promotes students' development of important quantitative skills.

  6. Teaching Computer Science Courses in Distance Learning

    ERIC Educational Resources Information Center

    Huan, Xiaoli; Shehane, Ronald; Ali, Adel

    2011-01-01

    As the success of distance learning (DL) has driven universities to increase the courses offered online, certain challenges arise when teaching computer science (CS) courses to students who are not physically co-located and have individual learning schedules. Teaching CS courses involves high level demonstrations and interactivity between the…

  7. COMPUTER SIMULATIONS OF LUNG AIRWAY STRUCTURES USING DATA-DRIVEN SURFACE MODELING TECHNIQUES

    EPA Science Inventory

    ABSTRACT

    Knowledge of human lung morphology is a subject critical to many areas of medicine. The visualization of lung structures naturally lends itself to computer graphics modeling due to the large number of airways involved and the complexities of the branching systems...

  8. First Steps in Computational Systems Biology: A Practical Session in Metabolic Modeling and Simulation

    ERIC Educational Resources Information Center

    Reyes-Palomares, Armando; Sanchez-Jimenez, Francisca; Medina, Miguel Angel

    2009-01-01

    A comprehensive understanding of biological functions requires new systemic perspectives, such as those provided by systems biology. Systems biology approaches are hypothesis-driven and involve iterative rounds of model building, prediction, experimentation, model refinement, and development. Developments in computer science are allowing for ever…

  9. SOCR: Statistics Online Computational Resource

    ERIC Educational Resources Information Center

    Dinov, Ivo D.

    2006-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an…

  10. Ontology-Driven Discovery of Scientific Computational Entities

    ERIC Educational Resources Information Center

    Brazier, Pearl W.

    2010-01-01

    Many geoscientists use modern computational resources, such as software applications, Web services, scientific workflows and datasets that are readily available on the Internet, to support their research and many common tasks. These resources are often shared via human contact and sometimes stored in data portals; however, they are not necessarily…

  11. CMEIAS color segmentation: an improved computing technology to process color images for quantitative microbial ecology studies at single-cell resolution.

    PubMed

    Gross, Colin A; Reddy, Chandan K; Dazzo, Frank B

    2010-02-01

    Quantitative microscopy and digital image analysis are underutilized in microbial ecology largely because of the laborious task of segmenting foreground object pixels from background, especially in complex color micrographs of environmental samples. In this paper, we describe an improved computing technology developed to alleviate this limitation. The system's uniqueness is its ability to edit digital images accurately when presented with the difficult yet commonplace challenge of removing background pixels whose three-dimensional color space overlaps the range that defines foreground objects. Image segmentation is accomplished by utilizing algorithms that address color and spatial relationships of user-selected foreground object pixels. Performance of the color segmentation algorithm evaluated on 26 complex micrographs at single pixel resolution had an overall pixel classification accuracy of 99+%. Several applications illustrate how this improved computing technology can successfully resolve numerous challenges of complex color segmentation in order to produce images from which quantitative information can be accurately extracted, thereby gaining new perspectives on the in situ ecology of microorganisms. Examples include improvements in the quantitative analysis of (1) microbial abundance and phylotype diversity of single cells classified by their discriminating color within heterogeneous communities, (2) cell viability, (3) spatial relationships and intensity of bacterial gene expression involved in cellular communication between individual cells within rhizoplane biofilms, and (4) biofilm ecophysiology based on ribotype-differentiated radioactive substrate utilization. The stand-alone executable file plus user manual and tutorial images for this color segmentation computing application are freely available at http://cme.msu.edu/cmeias/. This improved computing technology opens new opportunities for imaging applications where discriminating colors really matter most, thereby strengthening quantitative microscopy-based approaches to advance microbial ecology in situ at individual single-cell resolution.
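
    Only the color part of the segmentation idea is sketched below (the published method also uses spatial relationships): pixels are classified as foreground when their Mahalanobis distance to the statistics of user-selected foreground samples falls under a cutoff. This is an illustrative Python approximation, not the CMEIAS implementation.

        import numpy as np

        def color_segment(image_rgb, fg_samples, threshold=3.0):
            """Foreground mask from user-selected foreground pixel samples.

            image_rgb   (H, W, 3) float array
            fg_samples  (N, 3) RGB values selected by the user as foreground
            threshold   Mahalanobis distance cutoff
            """
            mu = fg_samples.mean(axis=0)
            cov = np.cov(fg_samples, rowvar=False) + 1e-6 * np.eye(3)
            inv_cov = np.linalg.inv(cov)
            diff = image_rgb.reshape(-1, 3) - mu
            d2 = np.einsum("ij,jk,ik->i", diff, inv_cov, diff)
            return (np.sqrt(d2) <= threshold).reshape(image_rgb.shape[:2])

        rng = np.random.default_rng(1)
        img = rng.random((64, 64, 3))
        img[20:40, 20:40] = [0.1, 0.8, 0.1] + 0.02 * rng.standard_normal((20, 20, 3))
        samples = img[25:30, 25:30].reshape(-1, 3)       # "user-clicked" pixels
        print(color_segment(img, samples).sum())         # roughly the 400 green pixels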

  12. AnchorDock: Blind and Flexible Anchor-Driven Peptide Docking.

    PubMed

    Ben-Shimon, Avraham; Niv, Masha Y

    2015-05-05

    The huge conformational space stemming from the inherent flexibility of peptides is among the main obstacles to successful and efficient computational modeling of protein-peptide interactions. Current peptide docking methods typically overcome this challenge using prior knowledge from the structure of the complex. Here we introduce AnchorDock, a peptide docking approach, which automatically targets the docking search to the most relevant parts of the conformational space. This is done by precomputing the free peptide's structure and by computationally identifying anchoring spots on the protein surface. Next, a free peptide conformation undergoes anchor-driven simulated annealing molecular dynamics simulations around the predicted anchoring spots. In the challenging task of a completely blind docking test, AnchorDock produced exceptionally good results (backbone root-mean-square deviation ≤ 2.2Å, rank ≤15) for 10 of 13 unbound cases tested. The impressive performance of AnchorDock supports a molecular recognition pathway that is driven via pre-existing local structural elements. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. A video, text, and speech-driven realistic 3-d virtual head for human-machine interface.

    PubMed

    Yu, Jun; Wang, Zeng-Fu

    2015-05-01

    A multiple inputs-driven realistic facial animation system based on 3-D virtual head for human-machine interface is proposed. The system can be driven independently by video, text, and speech, thus can interact with humans through diverse interfaces. The combination of parameterized model and muscular model is used to obtain a tradeoff between computational efficiency and high realism of 3-D facial animation. The online appearance model is used to track 3-D facial motion from video in the framework of particle filtering, and multiple measurements, i.e., pixel color value of input image and Gabor wavelet coefficient of illumination ratio image, are infused to reduce the influence of lighting and person dependence for the construction of online appearance model. The tri-phone model is used to reduce the computational consumption of visual co-articulation in speech synchronized viseme synthesis without sacrificing any performance. The objective and subjective experiments show that the system is suitable for human-machine interaction.

  14. Functionally dissociable influences on learning rate in a dynamic environment

    PubMed Central

    McGuire, Joseph T.; Nassar, Matthew R.; Gold, Joshua I.; Kable, Joseph W.

    2015-01-01

    Summary Maintaining accurate beliefs in a changing environment requires dynamically adapting the rate at which one learns from new experiences. Beliefs should be stable in the face of noisy data, but malleable in periods of change or uncertainty. Here we used computational modeling, psychophysics and fMRI to show that adaptive learning is not a unitary phenomenon in the brain. Rather, it can be decomposed into three computationally and neuroanatomically distinct factors that were evident in human subjects performing a spatial-prediction task: (1) surprise-driven belief updating, related to BOLD activity in visual cortex; (2) uncertainty-driven belief updating, related to anterior prefrontal and parietal activity; and (3) reward-driven belief updating, a context-inappropriate behavioral tendency related to activity in ventral striatum. These distinct factors converged in a core system governing adaptive learning. This system, which included dorsomedial frontal cortex, responded to all three factors and predicted belief updating both across trials and across individuals. PMID:25459409
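
    As a loose, schematic Python illustration of the idea (not the authors' model), the sketch below runs a delta-rule predictor whose learning rate is pushed upward both by surprise (large prediction errors relative to expected noise) and by current uncertainty; all parameter names and update equations are assumptions.

        import numpy as np

        def adaptive_learning(outcomes, hazard=0.1, noise_sd=10.0):
            """Delta-rule prediction with a learning rate driven by surprise
            and uncertainty (schematic, not the authors' exact model)."""
            belief, uncertainty = outcomes[0], noise_sd ** 2
            beliefs = []
            for y in outcomes:
                err = y - belief
                # surprise: how unlikely the outcome is under the current belief
                surprise = 1.0 - np.exp(-0.5 * err ** 2 / (uncertainty + noise_sd ** 2))
                relative_unc = uncertainty / (uncertainty + noise_sd ** 2)
                lr = hazard * surprise + (1 - hazard * surprise) * relative_unc
                belief += lr * err
                uncertainty = (1 - lr) * (uncertainty + hazard * noise_sd ** 2)
                beliefs.append(belief)
            return np.array(beliefs)

        rng = np.random.default_rng(2)
        truth = np.r_[np.full(50, 100.0), np.full(50, 160.0)]   # one change-point
        obs = truth + rng.normal(0, 10, truth.size)
        print(adaptive_learning(obs)[[45, 55, 99]].round(1))    # before/after the change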

  15. NASA's computer science research program

    NASA Technical Reports Server (NTRS)

    Larsen, R. L.

    1983-01-01

    Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.

  16. An entangled-light-emitting diode.

    PubMed

    Salter, C L; Stevenson, R M; Farrer, I; Nicoll, C A; Ritchie, D A; Shields, A J

    2010-06-03

    An optical quantum computer, powerful enough to solve problems so far intractable using conventional digital logic, requires a large number of entangled photons. At present, entangled-light sources are optically driven with lasers, which are impractical for quantum computing owing to the bulk and complexity of the optics required for large-scale applications. Parametric down-conversion is the most widely used source of entangled light, and has been used to implement non-destructive quantum logic gates. However, these sources are Poissonian and probabilistically emit zero or multiple entangled photon pairs in most cycles, fundamentally limiting the success probability of quantum computational operations. These complications can be overcome by using an electrically driven on-demand source of entangled photon pairs, but so far such a source has not been produced. Here we report the realization of an electrically driven source of entangled photon pairs, consisting of a quantum dot embedded in a semiconductor light-emitting diode (LED) structure. We show that the device emits entangled photon pairs under d.c. and a.c. injection, the latter achieving an entanglement fidelity of up to 0.82. Entangled light with such high fidelity is sufficient for application in quantum relays, in core components of quantum computing such as teleportation, and in entanglement swapping. The a.c. operation of the entangled-light-emitting diode (ELED) indicates its potential function as an on-demand source without the need for a complicated laser driving system; consequently, the ELED is at present the best source on which to base future scalable quantum information applications.

  17. An exposure-response analysis based on rifampin suggests CYP3A4 induction is driven by AUC: an in vitro investigation.

    PubMed

    Chang, Cheng; Yang, Xin; Fahmi, Odette A; Riccardi, Keith A; Di, Li; Obach, R Scott

    2017-08-01

    1. Induction is an important mechanism contributing to drug-drug interactions. It is most commonly evaluated in the human hepatocyte assay over a 48-h or 72-h incubation period. However, whether the overall exposure (i.e. Area Under the Curve (AUC) or Cave) or maximum exposure (i.e. Cmax) of the inducer is responsible for the magnitude of subsequent induction has not been thoroughly investigated. Additionally, in vitro induction assays are typically treated as static systems, which could lead to inaccurate induction potency estimation. Hence, European Medicines Agency (EMA) guidance now specifies quantitation of drug levels in the incubation. 2. This work treated the typical in vitro evaluation of rifampin induction as an in vivo system by generating various target engagement profiles, measuring free rifampin concentration over 3 d of incubation and evaluating the impact of these factors on final induction response. 3. This rifampin-based analysis demonstrates that the induction process is driven by time-averaged target engagement (i.e. AUC-driven). Additionally, depletion of rifampin in the incubation medium over 3 d as well as non-specific/specific binding were observed. 4. These findings should aid the discovery of clinical candidates with minimal induction liability and further expand our knowledge in the quantitative translatability of in vitro induction assays.
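
    For illustration, the Python sketch below computes the AUC and average concentration of the inducer in the incubation by the trapezoid rule and feeds the average exposure into a standard Emax induction model; the concentration-time values, Emax, and EC50 are assumed for the example and are not the study's data.

        import numpy as np

        def auc_trapezoid(times_h, conc_uM):
            """Area under the free-concentration curve (uM*h), trapezoid rule."""
            return float(np.sum(0.5 * (conc_uM[1:] + conc_uM[:-1]) * np.diff(times_h)))

        def fold_induction(c_ave_uM, emax=10.0, ec50_uM=0.5):
            """Emax model for fold induction driven by average exposure
            (assumed parameter values, illustrative only)."""
            return 1.0 + emax * c_ave_uM / (ec50_uM + c_ave_uM)

        t = np.array([0, 4, 8, 24, 48, 72], float)        # sampling times, h
        c = np.array([10.0, 7.5, 5.5, 2.0, 0.8, 0.3])     # measured free inducer, uM
        auc = auc_trapezoid(t, c)
        c_ave = auc / (t[-1] - t[0])
        print(round(auc, 1), round(c_ave, 2), round(fold_induction(c_ave), 2))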

  18. The rheology and microstructure of aging thermoreversible colloidal gels & attractive driven glasses

    NASA Astrophysics Data System (ADS)

    Wagner, Norman; Gordon, Melissa; Kloxin, Christopher

    The properties of colloidal gels and glasses are known to change with age, but the particle-level mechanisms by which aging occurs are not fully understood, which limits our ability to predict macroscopic behavior in these systems. In this work, we quantitatively relate rheological aging to structural aging of a model, homogeneous gel and attractive driven glass by simultaneously measuring the bulk properties and gel microstructure using rheometry and small angle neutron scattering (Rheo-SANS), respectively. Specifically, we develop a quantitative and predictive relationship between the macroscopic properties and the underlying microstructure (i.e., the effective strength of attraction) of an aging colloidal gel and attractive driven glass and study it as a function of the thermal and shear history. Analysis with mode coupling theory is consistent with local particle rearrangements as the mechanism of aging, which lead to monotonically increasing interaction strengths in a continuously evolving material and strongly supports aging as a trajectory in the free energy landscape dominated by local particle relaxations. The analyses and conclusions of this study may be 1) industrially relevant to products that age on commercial timescales, such as paints and pharmaceuticals, 2) applicable to other dynamically arrested systems, such as metallic glasses, and 3) used in the design of new materials. NIST Center for Neutron Research CNS cooperative agreement number #70NANB12H239 and NASA Grant No. NNX15AI19H.

  19. Quantitative single-photon emission computed tomography/computed tomography for technetium pertechnetate thyroid uptake measurement

    PubMed Central

    Lee, Hyunjong; Kim, Ji Hyun; Kang, Yeon-koo; Moon, Jae Hoon; So, Young; Lee, Won Woo

    2016-01-01

    Abstract Objectives: Technetium pertechnetate (99mTcO4) is a radioactive tracer used to assess thyroid function by thyroid uptake system (TUS). However, the TUS often fails to deliver accurate measurements of the percent of thyroid uptake (%thyroid uptake) of 99mTcO4. Here, we investigated the usefulness of quantitative single-photon emission computed tomography/computed tomography (SPECT/CT) after injection of 99mTcO4 in detecting thyroid function abnormalities. Materials and methods: We retrospectively reviewed data from 50 patients (male:female = 15:35; age, 46.2 ± 16.3 years; 17 Graves disease, 13 thyroiditis, and 20 euthyroid). All patients underwent 99mTcO4 quantitative SPECT/CT (185 MBq = 5 mCi), which yielded %thyroid uptake and standardized uptake value (SUV). Twenty-one (10 Graves disease and 11 thyroiditis) of the 50 patients also underwent conventional %thyroid uptake measurements using a TUS. Results: Quantitative SPECT/CT parameters (%thyroid uptake, SUVmean, and SUVmax) were the highest in Graves disease, second highest in euthyroid, and lowest in thyroiditis (P < 0.0001, Kruskal–Wallis test). TUS significantly overestimated the %thyroid uptake compared with SPECT/CT (P < 0.0001, paired t test) because other 99mTcO4 sources in addition to thyroid, such as salivary glands and saliva, contributed to the %thyroid uptake result by TUS, whereas %thyroid uptake, SUVmean and SUVmax from the SPECT/CT were associated with the functional status of thyroid. Conclusions: Quantitative SPECT/CT is more accurate than conventional TUS for measuring 99mTcO4 %thyroid uptake. Quantitative measurements using SPECT/CT may facilitate more accurate assessment of thyroid tracer uptake. PMID:27399139
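
    The quantitative parameters mentioned above can be illustrated with the commonly used definitions below (a Python sketch with made-up numbers; in practice the VOI activities and concentrations come from the reconstructed, calibrated SPECT/CT images).

        def percent_thyroid_uptake(thyroid_activity_MBq, injected_MBq):
            """%thyroid uptake = activity in the thyroid VOI / injected activity."""
            return 100.0 * thyroid_activity_MBq / injected_MBq

        def suv(voi_concentration_Bq_per_mL, injected_MBq, body_weight_kg):
            """Body-weight-normalized standardized uptake value, assuming
            1 g/mL tissue density: SUV = C / (injected activity / body weight)."""
            injected_Bq = injected_MBq * 1e6
            return voi_concentration_Bq_per_mL / (injected_Bq / (body_weight_kg * 1000.0))

        # illustrative numbers only
        print(round(percent_thyroid_uptake(3.7, 185.0), 2))   # 2.0 %
        print(round(suv(20_000.0, 185.0, 70.0), 2))           # about 7.57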

  20. Intention-to-Treat Analysis in Partially Nested Randomized Controlled Trials with Real-World Complexity

    ERIC Educational Resources Information Center

    Schweig, Jonathan David; Pane, John F.

    2016-01-01

    Demands for scientific knowledge of what works in educational policy and practice has driven interest in quantitative investigations of educational outcomes, and randomized controlled trials (RCTs) have proliferated under these conditions. In educational settings, even when individuals are randomized, both experimental and control students are…

  1. Examining Construct Validity of the Quantitative Literacy VALUE Rubric in College-Level STEM Assignments

    ERIC Educational Resources Information Center

    Gray, Julie S.; Brown, Melissa A.; Connolly, John P.

    2017-01-01

    Data-driven decision making is increasingly viewed as essential in a globally competitive society. Initiatives to augment standardized testing with performance-based assessment have increased as educators progressively respond to mandates for authentic measurement of student attainment. To meet this challenge, multidisciplinary rubrics were…

  2. Data Driven Program Planning for GIS Instruction

    ERIC Educational Resources Information Center

    Scarletto, Edith

    2013-01-01

    This study used both focus groups (qualitative) and survey data (quantitative) to develop and expand an instruction program for GIS services. It examined the needs and preferences faculty and graduate students have for learning about GIS applications for teaching and research. While faculty preferred in person workshops and graduate students…

  3. The Radicalism of the Liberal Arts Tradition.

    ERIC Educational Resources Information Center

    Lears, Jackson

    2003-01-01

    Discusses the threat to intellectual freedom in the academy from market-driven managerial influence, the impulse to subject universities to quantitative standards of efficiency and productivity, to turn knowledge into a commodity, and to transform open sites of inquiry into corporate research laboratories and job-training centers. Calls for the…

  4. Integrating watershed- and farm-scale modeling framework for targeting critical source areas while maintaining farm economic viability

    USDA-ARS?s Scientific Manuscript database

    Quantitative risk assessments of pollution and data related to the effectiveness of mitigating best management practices (BMPs) are important aspects of nonpoint source (NPS) pollution control efforts, particularly those driven by specific water quality objectives and by measurable improvement goals...

  5. A vision and strategy for exposure modelling at the U.S. EPA Office of Research and Development

    EPA Science Inventory

    Traditional, hazard-driven, single-chemical risk assessment practices cannot keep pace with the vast and growing numbers of chemicals in commerce. A well-defined, quantitative, and defensible means of identifying those with the greatest risk potential is needed, with exposure con...

  6. Language Teachers' Conceptions of Assessment: An Egyptian Perspective

    ERIC Educational Resources Information Center

    Gebril, Atta

    2017-01-01

    The current study investigates the assessment conceptions of both pre-service and in-service English teachers within a high-stakes, test-driven context in Egypt. For this purpose, 170 Egyptian pre-service and in-service teachers completed an assessment conceptions questionnaire. Quantitative and qualitative data analysis were employed to answer…

  7. Intel ISEF 2008 Student Handbook

    ERIC Educational Resources Information Center

    Science Service, 2008

    2008-01-01

    Research is a process by which people discover or create new knowledge about the world in which they live. The International Science and Engineering Fair (ISEF) and Affiliated Fairs are research (data) driven. Students design research projects that provide quantitative data through experimentation followed by analysis and application of that data.…

  8. Various vibration modes in a silicon ring resonator driven by p–n diode actuators formed in the lateral direction

    NASA Astrophysics Data System (ADS)

    Tsushima, Takafumi; Asahi, Yoichi; Tanigawa, Hiroshi; Furutsuka, Takashi; Suzuki, Kenichiro

    2018-06-01

    In this paper, we describe p–n diode actuators that are formed in the lateral direction on resonators. Because previously reported p–n diode actuators, which were driven by a force parallel to the electrostatic force induced in a p–n diode, were fabricated in the perpendicular direction to the surface, the fabrication process to satisfy the requirement of realizing a p–n junction set in the middle of the plate thickness has been difficult. The resonators in this work are driven by p–n diodes formed in the lateral direction, making the process easy. We have fabricated a silicon ring resonator that has in-plane vibration using p–n–p and n–p–n diode actuators formed in the lateral direction. First, we consider a space charge model that can sufficiently accurately describe the force induced in p–n diode actuators and compare it with the capacitance model used in most computer simulations. Then, we show that multiplying the vibration amplitude calculated by computer simulation by the modification coefficient of 4/3 provides the vibration amplitude in the p–n diode actuators. Good agreement of the theory with experimental results of the in-plane vibration measured for silicon ring resonators is obtained. The computer simulation is very useful for evaluating various vibration modes in resonators driven by the p–n diode actuators. The small amplitude of the p–n diode actuator measured in this work is expected to increase greatly with increased doping of the actuator.

  9. Studying Scientific Discovery by Computer Simulation.

    DTIC Science & Technology

    1983-03-30

    Mendel's laws of inheritance, the law of Gay-Lussac for gaseous reactions, the law of Dulong and Petit, the derivation of atomic weights by Avogadro ... Keywords: scientific discovery, intrinsic properties, physical laws, extensive terms, data-driven heuristics, intensive terms, theory-driven heuristics, conservation laws. Abstract (truncated): Scientific discovery ...

  10. Attitudes towards Computer and Computer Self-Efficacy as Predictors of Preservice Mathematics Teachers' Computer Anxiety

    ERIC Educational Resources Information Center

    Awofala, Adeneye O. A.; Akinoso, Sabainah O.; Fatade, Alfred O.

    2017-01-01

    The study investigated attitudes towards computer and computer self-efficacy as predictors of computer anxiety among 310 preservice mathematics teachers from five higher institutions of learning in Lagos and Ogun States of Nigeria using the quantitative research method within the blueprint of the descriptive survey design. Data collected were…

  11. Numerical simulation of magmatic hydrothermal systems

    USGS Publications Warehouse

    Ingebritsen, S.E.; Geiger, S.; Hurwitz, S.; Driesner, T.

    2010-01-01

    The dynamic behavior of magmatic hydrothermal systems entails coupled and nonlinear multiphase flow, heat and solute transport, and deformation in highly heterogeneous media. Thus, quantitative analysis of these systems depends mainly on numerical solution of coupled partial differential equations and complementary equations of state (EOS). The past 2 decades have seen steady growth of computational power and the development of numerical models that have eliminated or minimized the need for various simplifying assumptions. Considerable heuristic insight has been gained from process-oriented numerical modeling. Recent modeling efforts employing relatively complete EOS and accurate transport calculations have revealed dynamic behavior that was damped by linearized, less accurate models, including fluid property control of hydrothermal plume temperatures and three-dimensional geometries. Other recent modeling results have further elucidated the controlling role of permeability structure and revealed the potential for significant hydrothermally driven deformation. Key areas for future research include incorporation of accurate EOS for the complete H2O-NaCl-CO2 system, more realistic treatment of material heterogeneity in space and time, realistic description of large-scale relative permeability behavior, and intercode benchmarking comparisons. Copyright 2010 by the American Geophysical Union.

  12. Continued Development and Validation of Methods for Spheromak Simulation

    NASA Astrophysics Data System (ADS)

    Benedett, Thomas

    2015-11-01

    The HIT-SI experiment has demonstrated stable sustainment of spheromaks; determining how the underlying physics extrapolate to larger, higher-temperature regimes is of prime importance in determining the viability of the inductively-driven spheromak. It is thus prudent to develop and validate a computational model that can be used to study current results and provide an intermediate step between theory and future experiments. A zero-beta Hall-MHD model has shown good agreement with experimental data at 14.5 kHz injector operation. Experimental observations at higher frequency, where the best performance is achieved, indicate pressure effects are important and likely required to attain quantitative agreement with simulations. Efforts to extend the existing validation to high frequency (~ 36-68 kHz) using an extended MHD model implemented in the PSI-TET arbitrary-geometry 3D MHD code will be presented. Results from verification of the PSI-TET extended MHD model using the GEM magnetic reconnection challenge will also be presented along with investigation of injector configurations for future SIHI experiments using Taylor state equilibrium calculations. Work supported by DoE.

  13. An integrated theory of attention and decision making in visual signal detection.

    PubMed

    Smith, Philip L; Ratcliff, Roger

    2009-04-01

    The simplest attentional task, detecting a cued stimulus in an otherwise empty visual field, produces complex patterns of performance. Attentional cues interact with backward masks and with spatial uncertainty, and there is a dissociation in the effects of these variables on accuracy and on response time. A computational theory of performance in this task is described. The theory links visual encoding, masking, spatial attention, visual short-term memory (VSTM), and perceptual decision making in an integrated dynamic framework. The theory assumes that decisions are made by a diffusion process driven by a neurally plausible, shunting VSTM. The VSTM trace encodes the transient outputs of early visual filters in a durable form that is preserved for the time needed to make a decision. Attention increases the efficiency of VSTM encoding, either by increasing the rate of trace formation or by reducing the delay before trace formation begins. The theory provides a detailed, quantitative account of attentional effects in spatial cuing tasks at the level of response accuracy and the response time distributions. (c) 2009 APA, all rights reserved
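
    A minimal Python sketch of the core mechanism is given below, under simplifying assumptions: a shunting VSTM trace grows toward the stimulus strength at an attention-scaled rate and supplies the time-varying drift of a diffusion process that stops at a decision threshold. Parameter values are illustrative, not fitted, and this is not the authors' full model.

        import numpy as np

        def simulate_trial(stim_strength=0.3, attention_gain=1.0, threshold=1.0,
                           dt=0.001, sigma=0.1, vstm_rate=8.0, max_t=2.0, rng=None):
            """One trial of a diffusion decision driven by a shunting VSTM trace.
            The trace v(t) grows toward stim_strength at an attention-scaled rate;
            the accumulated evidence x(t) drifts at v(t)."""
            rng = np.random.default_rng() if rng is None else rng
            v, x, t = 0.0, 0.0, 0.0
            while abs(x) < threshold and t < max_t:
                v += dt * attention_gain * vstm_rate * (stim_strength - v)  # shunting growth
                x += v * dt + sigma * np.sqrt(dt) * rng.standard_normal()
                t += dt
            return (x >= threshold), t            # (correct boundary reached, decision time)

        rng = np.random.default_rng(3)
        trials = [simulate_trial(rng=rng) for _ in range(200)]
        accuracy = np.mean([hit for hit, _ in trials])
        mean_rt = np.mean([t for _, t in trials])
        print(round(accuracy, 2), round(mean_rt, 3))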

  14. Investigation on magnetoacoustic signal generation with magnetic induction and its application to electrical conductivity reconstruction.

    PubMed

    Ma, Qingyu; He, Bin

    2007-08-21

    A theoretical study on the magnetoacoustic signal generation with magnetic induction and its applications to electrical conductivity reconstruction is conducted. An object with a concentric cylindrical geometry is located in a static magnetic field and a pulsed magnetic field. Driven by Lorentz force generated by the static magnetic field, the magnetically induced eddy current produces acoustic vibration and the propagated sound wave is received by a transducer around the object to reconstruct the corresponding electrical conductivity distribution of the object. A theory on the magnetoacoustic waveform generation for a circular symmetric model is provided as a forward problem. The explicit formulae and quantitative algorithm for the electrical conductivity reconstruction are then presented as an inverse problem. Computer simulations were conducted to test the proposed theory and assess the performance of the inverse algorithms for a multi-layer cylindrical model. The present simulation results confirm the validity of the proposed theory and suggest the feasibility of reconstructing electrical conductivity distribution based on the proposed theory on the magnetoacoustic signal generation with magnetic induction.

  15. Validation and Continued Development of Methods for Spheromak Simulation

    NASA Astrophysics Data System (ADS)

    Benedett, Thomas

    2016-10-01

    The HIT-SI experiment has demonstrated stable sustainment of spheromaks. Determining how the underlying physics extrapolate to larger, higher-temperature regimes is of prime importance in determining the viability of the inductively-driven spheromak. It is thus prudent to develop and validate a computational model that can be used to study current results and study the effect of possible design choices on plasma behavior. A zero-beta Hall-MHD model has shown good agreement with experimental data at 14.5 kHz injector operation. Experimental observations at higher frequency, where the best performance is achieved, indicate pressure effects are important and likely required to attain quantitative agreement with simulations. Efforts to extend the existing validation to high frequency (36-68 kHz) using an extended MHD model implemented in the PSI-TET arbitrary-geometry 3D MHD code will be presented. An implementation of anisotropic viscosity, a feature observed to improve agreement between NIMROD simulations and experiment, will also be presented, along with investigations of flux conserver features and their impact on density control for future SIHI experiments. Work supported by DoE.

  16. Quantitative analysis of terahertz spectra for illicit drugs using adaptive-range micro-genetic algorithm

    NASA Astrophysics Data System (ADS)

    Chen, Yi; Ma, Yong; Lu, Zheng; Peng, Bei; Chen, Qin

    2011-08-01

    In the field of anti-illicit drug applications, many suspicious mixture samples might consist of various drug components—for example, a mixture of methamphetamine, heroin, and amoxicillin—which makes spectral identification very difficult. A terahertz spectroscopic quantitative analysis method using an adaptive range micro-genetic algorithm with a variable internal population (ARVIPɛμGA) has been proposed. Five mixture cases are discussed using ARVIPɛμGA driven quantitative terahertz spectroscopic analysis in this paper. The devised simulation results show agreement with the previous experimental results, which suggested that the proposed technique has potential applications for terahertz spectral identifications of drug mixture components. The results show agreement with the results obtained using other experimental and numerical techniques.

  17. Computational Fluid Dynamics Simulation of Flows in an Oxidation Ditch Driven by a New Surface Aerator

    PubMed Central

    Huang, Weidong; Li, Kun; Wang, Gan; Wang, Yingzhe

    2013-01-01

    Abstract In this article, we present a newly designed inverse umbrella surface aerator, and tested its performance in driving flow of an oxidation ditch. Results show that it has a better performance in driving the oxidation ditch than the original one with higher average velocity and more uniform flow field. We also present a computational fluid dynamics model for predicting the flow field in an oxidation ditch driven by a surface aerator. The improved momentum source term approach to simulate the flow field of the oxidation ditch driven by an inverse umbrella surface aerator was developed and validated through experiments. Four kinds of turbulent models were investigated with the approach, including the standard k−ɛ model, RNG k−ɛ model, realizable k−ɛ model, and Reynolds stress model, and the predicted data were compared with those calculated with the multiple rotating reference frame approach (MRF) and sliding mesh approach (SM). Results of the momentum source term approach are in good agreement with the experimental data, and its prediction accuracy is better than MRF, close to SM. It is also found that the momentum source term approach has lower computational expenses, is simpler to preprocess, and is easier to use. PMID:24302850

  18. Living Emotions, Avoiding Emotions: Behavioral Investigation of the Regulation of Socially Driven Emotions

    PubMed Central

    Grecucci, Alessandro; Giorgetta, Cinzia; Bonini, Nicolao; Sanfey, Alan G.

    2013-01-01

    Emotion regulation is important for psychological well-being. Although it is known that alternative regulation strategies may have different emotional consequences, the effectiveness of such strategies for socially driven emotions remains unclear. In this study we investigated the efficacy of different forms of reappraisal on responses to the selfish and altruistic behavior of others in the Dictator Game. In Experiment 1, subjects mentalized the intentions of the other player in one condition, and took distance from the situation in the other. Emotion ratings were recorded after each offer. Compared with a baseline condition, mentalizing led subjects to experience their emotions more positively when receiving both selfish and altruistic proposals, whereas distancing decreased the valence when receiving altruistic offers, but did not affect the perception of selfish behavior. In Experiment 2, subjects played with both computer and human partners while reappraising the meaning of the player’s intentions (with a human partner) or the meaning of the situation (with a computer partner). Results showed that both contexts were effectively modulated by reappraisal, however a stronger effect was observed when the donor was a human partner, as compared to a computer partner. Taken together, these results demonstrate that socially driven emotions can be successfully modulated by reappraisal strategies that focus on the reinterpretation of others’ intentions. PMID:23349645

  19. HMI Data Driven Magnetohydrodynamic Model Predicted Active Region Photospheric Heating Rates: Their Scale Invariant, Flare Like Power Law Distributions, and Their Possible Association With Flares

    NASA Technical Reports Server (NTRS)

    Goodman, Michael L.; Kwan, Chiman; Ayhan, Bulent; Shang, Eric L.

    2017-01-01

    There are many flare forecasting models. For an excellent review and comparison of some of them see Barnes et al. (2016). All these models are successful to some degree, but there is a need for better models. We claim the most successful models explicitly or implicitly base their forecasts on various estimates of components of the photospheric current density J, based on observations of the photospheric magnetic field B. However, none of the models we are aware of compute the complete J. We seek to develop a better model based on computing the complete photospheric J. Initial results from this model are presented in this talk. We present a data-driven, near-photospheric, 3D, non-force-free magnetohydrodynamic (MHD) model that computes time series of the total J and the associated resistive heating rate in each pixel at the photosphere in the neutral line regions (NLRs) of 14 active regions (ARs). The model is driven by time series of B measured by the Helioseismic & Magnetic Imager (HMI) on the Solar Dynamics Observatory (SDO) satellite. Spurious Doppler periods due to SDO orbital motion are filtered out of the time series of B in every AR pixel. Errors in B due to these periods can be significant.
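
    For orientation only, the Python sketch below computes the vertical current density from the horizontal field components on a pixel grid, Jz = (dBy/dx - dBx/dy)/mu0, together with a resistive heating estimate eta*Jz^2; the toy field, pixel size, and resistivity are assumptions, and this is not the non-force-free model described above (which computes the complete J).

        import numpy as np

        MU0 = 4e-7 * np.pi            # vacuum permeability, T m / A

        def vertical_current_density(bx, by, dx_m):
            """Jz = (dBy/dx - dBx/dy) / mu0 on a uniform pixel grid (A/m^2)."""
            dby_dx = np.gradient(by, dx_m, axis=1)
            dbx_dy = np.gradient(bx, dx_m, axis=0)
            return (dby_dx - dbx_dy) / MU0

        def resistive_heating(jz, eta=1.0e-3):
            """Resistive heating rate per unit volume, q = eta * Jz^2 (W/m^3);
            eta is an assumed effective resistivity, illustrative only."""
            return eta * jz ** 2

        y, x = np.mgrid[0:64, 0:64]
        bx = 1e-2 * np.sin(2 * np.pi * y / 64)            # toy field, tesla
        by = 1e-2 * np.cos(2 * np.pi * x / 64)
        jz = vertical_current_density(bx, by, dx_m=360e3) # assumed ~0.5 arcsec pixel
        print(jz.max(), resistive_heating(jz).max())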

  20. The GLEaMviz computational tool, a publicly available software to explore realistic epidemic spreading scenarios at the global scale

    PubMed Central

    2011-01-01

    Background Computational models play an increasingly important role in the assessment and control of public health crises, as demonstrated during the 2009 H1N1 influenza pandemic. Much research has been done in recent years in the development of sophisticated data-driven models for realistic computer-based simulations of infectious disease spreading. However, only a few computational tools are presently available for assessing scenarios, predicting epidemic evolutions, and managing health emergencies that can benefit a broad audience of users including policy makers and health institutions. Results We present "GLEaMviz", a publicly available software system that simulates the spread of emerging human-to-human infectious diseases across the world. The GLEaMviz tool comprises three components: the client application, the proxy middleware, and the simulation engine. The latter two components constitute the GLEaMviz server. The simulation engine leverages on the Global Epidemic and Mobility (GLEaM) framework, a stochastic computational scheme that integrates worldwide high-resolution demographic and mobility data to simulate disease spread on the global scale. The GLEaMviz design aims at maximizing flexibility in defining the disease compartmental model and configuring the simulation scenario; it allows the user to set a variety of parameters including: compartment-specific features, transition values, and environmental effects. The output is a dynamic map and a corresponding set of charts that quantitatively describe the geo-temporal evolution of the disease. The software is designed as a client-server system. The multi-platform client, which can be installed on the user's local machine, is used to set up simulations that will be executed on the server, thus avoiding specific requirements for large computational capabilities on the user side. Conclusions The user-friendly graphical interface of the GLEaMviz tool, along with its high level of detail and the realism of its embedded modeling approach, opens up the platform to simulate realistic epidemic scenarios. These features make the GLEaMviz computational tool a convenient teaching/training tool as well as a first step toward the development of a computational tool aimed at facilitating the use and exploitation of computational models for the policy making and scenario analysis of infectious disease outbreaks. PMID:21288355
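
    GLEaM couples a worldwide mobility layer with a compartmental epidemic model; the Python sketch below only illustrates the compartmental part, a single-population stochastic (chain-binomial) SEIR step with assumed rates, not the GLEaMviz engine itself.

        import numpy as np

        def seir_step(state, beta=0.5, sigma=1/3, gamma=1/5, dt=1.0, rng=None):
            """One chain-binomial step of a stochastic SEIR model.
            state = (S, E, I, R) counts; beta, sigma, gamma are per-day rates."""
            rng = np.random.default_rng() if rng is None else rng
            S, E, I, R = state
            N = S + E + I + R
            p_inf = 1.0 - np.exp(-beta * I / N * dt)   # S -> E
            p_sym = 1.0 - np.exp(-sigma * dt)          # E -> I
            p_rec = 1.0 - np.exp(-gamma * dt)          # I -> R
            new_E = rng.binomial(S, p_inf)
            new_I = rng.binomial(E, p_sym)
            new_R = rng.binomial(I, p_rec)
            return (S - new_E, E + new_E - new_I, I + new_I - new_R, R + new_R)

        rng = np.random.default_rng(4)
        state = (9990, 0, 10, 0)
        for day in range(120):
            state = seir_step(state, rng=rng)
        print(state)        # final (S, E, I, R) counts for this illustrative run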

  1. A silicon-nanowire memory driven by optical gradient force induced bistability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong, B.; Institute of Microelectronics, A*STAR; Cai, H., E-mail: caih@ime.a-star.edu.sg

    2015-12-28

    In this paper, a bistable optical-driven silicon-nanowire memory is demonstrated, which employs a ring resonator to generate optical gradient force over a doubly clamped silicon-nanowire. Two stable deformation positions of a doubly clamped silicon-nanowire represent two memory states (“0” and “1”) and can be set/reset by modulating the light intensity (<3 mW) based on the optical force induced bistability. The time response of the optical-driven memory is less than 250 ns. It has applications in the fields of all optical communication, quantum computing, and optomechanical circuits.

  2. Autoradiographic method for quantitation of deposition and distribution of radiocalcium in bone

    PubMed Central

    Lawrence Riggs, B; Bassingthwaighte, James B.; Jowsey, Jenifer; Peter Pequegnat, E

    2010-01-01

    A method is described for quantitating autoradiographs of bone-seeking isotopes in microscopic sections of bone. Autoradiographs of bone sections containing 45Ca and internal calibration standards are automatically scanned with a microdensitometer. The digitized optical density output is stored on magnetic tape and is converted by computer to equivalent activity of 45Ca per gram of bone. The computer determines the total 45Ca uptake in the bone section and, on the basis of optical density and anatomic position, quantitatively divides the uptake into 4 components, each representing a separate physiologic process (bone formation, secondary mineralization, diffuse long-term exchange, and surface short-term exchange). The method is also applicable for quantitative analysis of microradiographs of bone sections for mineral content and density. PMID:5416906

  3. Upper Atmosphere Research Report Number 21. Summary of Upper Atmosphere Rocket Research Firings

    DTIC Science & Technology

    1954-02-01

    computer. The sky screens are essentially theodolites which view the rocket through a pair of crossed rods which are driven closed by an electric motor... positions are electrically measured and fed into a computer. The computer continuously predicts the point of impact of the rocket were its thrust... Without such equipment it is necessary to rely on optical ’fixes’, sound ranging, or the Impact Point Computer to provide such information. In the early

  4. High-performance computational fluid dynamics: a custom-code approach

    NASA Astrophysics Data System (ADS)

    Fannon, James; Loiseau, Jean-Christophe; Valluri, Prashant; Bethune, Iain; Náraigh, Lennon Ó.

    2016-07-01

    We introduce a modified and simplified version of the pre-existing fully parallelized three-dimensional Navier-Stokes flow solver known as TPLS. We demonstrate how the simplified version can be used as a pedagogical tool for the study of computational fluid dynamics (CFD) and parallel computing. TPLS is at its heart a two-phase flow solver, and uses calls to a range of external libraries to accelerate its performance. However, in the present context we narrow the focus of the study to basic hydrodynamics and parallel computing techniques, and the code is therefore simplified and modified to simulate pressure-driven single-phase flow in a channel, using only relatively simple Fortran 90 code with MPI parallelization, but no calls to any other external libraries. The modified code is analysed in order to both validate its accuracy and investigate its scalability up to 1000 CPU cores. Simulations are performed for several benchmark cases in pressure-driven channel flow, including a turbulent simulation, wherein the turbulence is incorporated via the large-eddy simulation technique. The work may be of use to advanced undergraduate and graduate students as an introductory study in CFD, while also providing insight for those interested in more general aspects of high-performance computing.
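    To illustrate the kind of pressure-driven channel-flow benchmark the simplified solver targets, here is a stand-alone sketch (not TPLS itself) that marches a 1-D viscous channel flow to steady state and checks it against the analytic Poiseuille profile; all parameters are illustrative.

```python
import numpy as np

# Minimal 1-D analogue of pressure-driven channel flow: march du/dt = G/rho + nu*d2u/dy2
# to steady state with no-slip walls and compare with the analytic Poiseuille profile.
ny, H = 101, 1.0                 # grid points, channel height
nu, rho, G = 1e-2, 1.0, 1.0      # viscosity, density, pressure-gradient magnitude (assumed)
y = np.linspace(0.0, H, ny)
dy = y[1] - y[0]
u = np.zeros(ny)                 # no-slip walls: u[0] = u[-1] = 0
dt = 0.25 * dy**2 / nu           # stable explicit time step

for _ in range(200_000):
    lap = (u[2:] - 2 * u[1:-1] + u[:-2]) / dy**2
    u[1:-1] += dt * (G / rho + nu * lap)

u_exact = G / (2 * rho * nu) * y * (H - y)
print("max error vs Poiseuille profile:", np.abs(u - u_exact).max())
```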

  5. Utility of Quantitative Parameters from Single-Photon Emission Computed Tomography/Computed Tomography in Patients with Destructive Thyroiditis.

    PubMed

    Kim, Ji-Young; Kim, Ji Hyun; Moon, Jae Hoon; Kim, Kyoung Min; Oh, Tae Jung; Lee, Dong-Hwa; So, Young; Lee, Won Woo

    2018-01-01

    Quantitative parameters from Tc-99m pertechnetate single-photon emission computed tomography/computed tomography (SPECT/CT) are emerging as novel diagnostic markers for functional thyroid diseases. We intended to assess the utility of SPECT/CT parameters in patients with destructive thyroiditis. Thirty-five destructive thyroiditis patients (7 males and 28 females; mean age, 47.3 ± 13.0 years) and 20 euthyroid patients (6 males and 14 females; mean age, 45.0 ± 14.8 years) who underwent Tc-99m pertechnetate quantitative SPECT/CT were retrospectively enrolled. Quantitative parameters from the SPECT/CT (%uptake, standardized uptake value [SUV], thyroid volume, and functional thyroid mass [SUVmean × thyroid volume]) and thyroid hormone levels were investigated to assess correlations and predict the prognosis for destructive thyroiditis. The occurrence of hypothyroidism was the outcome for prognosis. All the SPECT/CT quantitative parameters were significantly lower in the 35 destructive thyroiditis patients compared to the 20 euthyroid patients using the same SPECT/CT scanner and protocol ( p < 0.001 for all parameters). T3 and free T4 did not correlate with any SPECT/CT parameters, but thyroid-stimulating hormone (TSH) significantly correlated with %uptake ( p = 0.004), SUVmean ( p < 0.001), SUVmax ( p = 0.002), and functional thyroid mass ( p < 0.001). Of the 35 destructive thyroiditis patients, 16 progressed to hypothyroidism. On univariate and multivariate analyses, only T3 levels were associated with the later occurrence of hypothyroidism ( p = 0.002, exp(β) = 1.022, 95% confidence interval: 1.008 - 1.035). Novel quantitative SPECT/CT parameters could discriminate patients with destructive thyroiditis from euthyroid patients, suggesting the robustness of the quantitative SPECT/CT approach. However, disease progression of destructive thyroiditis could not be predicted using the parameters, as these only correlated with TSH, but not with T3, the sole predictor of the later occurrence of hypothyroidism.
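    As a rough illustration of the quantitative parameters named above (%uptake, SUVmean, and functional thyroid mass = SUVmean × thyroid volume), the following sketch uses invented numbers and a simplified %uptake definition; it is not the study's processing pipeline.

```python
# Illustrative-only computation of quantitative SPECT/CT parameters; all values and
# the simple %uptake definition below are assumptions for demonstration.
injected_activity_mbq = 185.0          # administered Tc-99m pertechnetate
thyroid_activity_mbq = 1.2             # decay-corrected activity in the thyroid VOI
thyroid_volume_ml = 11.5               # CT-derived thyroid volume
body_weight_g = 62000.0

percent_uptake = 100.0 * thyroid_activity_mbq / injected_activity_mbq
suv_mean = (thyroid_activity_mbq / thyroid_volume_ml) / (injected_activity_mbq / body_weight_g)
functional_thyroid_mass = suv_mean * thyroid_volume_ml   # SUVmean x thyroid volume

print(f"%uptake = {percent_uptake:.2f}, SUVmean = {suv_mean:.1f}, "
      f"functional thyroid mass = {functional_thyroid_mass:.1f}")
```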

  6. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to conduct a study that assesses the process improvement, quality management, and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science degree programs (e.g., software engineering, computer science, and information technology) at U.S. colleges and universities, techniques that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and with how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study identifies and discusses in detail the gap-analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparative analysis of the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
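    A toy Monte Carlo baseline of the kind such a dashboard might produce: simulate driver scores, combine them with an assumed linear weighting, and summarize the predicted satisfaction index. The weights and distributions are invented and are not the ACSI model's.

```python
import numpy as np

# Toy Monte Carlo baseline for a customer satisfaction index; driver weights and
# score distributions below are assumptions for demonstration only.
rng = np.random.default_rng(42)
n = 100_000
quality      = rng.normal(82, 5, n)      # perceived quality score
expectations = rng.normal(75, 6, n)      # customer expectations score
value        = rng.normal(78, 4, n)      # perceived value score

# Assumed linear driver model for the index (weights sum to 1).
index = 0.5 * quality + 0.2 * expectations + 0.3 * value

print("mean index:", index.mean().round(1))
print("90% interval:", np.percentile(index, [5, 95]).round(1))
```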

  7. Utility of Quantitative Parameters from Single-Photon Emission Computed Tomography/Computed Tomography in Patients with Destructive Thyroiditis

    PubMed Central

    Kim, Ji-Young; Kim, Ji Hyun; Moon, Jae Hoon; Kim, Kyoung Min; Oh, Tae Jung; Lee, Dong-Hwa; So, Young

    2018-01-01

    Objective Quantitative parameters from Tc-99m pertechnetate single-photon emission computed tomography/computed tomography (SPECT/CT) are emerging as novel diagnostic markers for functional thyroid diseases. We intended to assess the utility of SPECT/CT parameters in patients with destructive thyroiditis. Materials and Methods Thirty-five destructive thyroiditis patients (7 males and 28 females; mean age, 47.3 ± 13.0 years) and 20 euthyroid patients (6 males and 14 females; mean age, 45.0 ± 14.8 years) who underwent Tc-99m pertechnetate quantitative SPECT/CT were retrospectively enrolled. Quantitative parameters from the SPECT/CT (%uptake, standardized uptake value [SUV], thyroid volume, and functional thyroid mass [SUVmean × thyroid volume]) and thyroid hormone levels were investigated to assess correlations and predict the prognosis for destructive thyroiditis. The occurrence of hypothyroidism was the outcome for prognosis. Results All the SPECT/CT quantitative parameters were significantly lower in the 35 destructive thyroiditis patients compared to the 20 euthyroid patients using the same SPECT/CT scanner and protocol (p < 0.001 for all parameters). T3 and free T4 did not correlate with any SPECT/CT parameters, but thyroid-stimulating hormone (TSH) significantly correlated with %uptake (p = 0.004), SUVmean (p < 0.001), SUVmax (p = 0.002), and functional thyroid mass (p < 0.001). Of the 35 destructive thyroiditis patients, 16 progressed to hypothyroidism. On univariate and multivariate analyses, only T3 levels were associated with the later occurrence of hypothyroidism (p = 0.002, exp(β) = 1.022, 95% confidence interval: 1.008 – 1.035). Conclusion Novel quantitative SPECT/CT parameters could discriminate patients with destructive thyroiditis from euthyroid patients, suggesting the robustness of the quantitative SPECT/CT approach. However, disease progression of destructive thyroiditis could not be predicted using the parameters, as these only correlated with TSH, but not with T3, the sole predictor of the later occurrence of hypothyroidism. PMID:29713225

  8. Computation of hypersonic flows with finite rate condensation and evaporation of water

    NASA Technical Reports Server (NTRS)

    Perrell, Eric R.; Candler, Graham V.; Erickson, Wayne D.; Wieting, Alan R.

    1993-01-01

    A computer program for modelling 2D hypersonic flows of gases containing water vapor and liquid water droplets is presented. The effects of interphase mass, momentum and energy transfer are studied. Computations are compared with existing quasi-1D calculations on the nozzle of the NASA Langley Eight Foot High Temperature Tunnel, a hypersonic wind tunnel driven by combustion of natural gas in oxygen enriched air.

  9. Resources and Approaches for Teaching Quantitative and Computational Skills in the Geosciences and Allied Fields

    NASA Astrophysics Data System (ADS)

    Orr, C. H.; Mcfadden, R. R.; Manduca, C. A.; Kempler, L. A.

    2016-12-01

    Teaching with data, simulations, and models in the geosciences can increase many facets of student success in the classroom and in the workforce. Teaching undergraduates about programming and improving students' quantitative and computational skills expands their perception of Geoscience beyond field-based studies. Processing data and developing quantitative models are critically important for Geoscience students. Students need to be able to perform calculations, analyze data, create numerical models and visualizations, and more deeply understand complex systems—all essential aspects of modern science. These skills require students to have comfort and skill with languages and tools such as MATLAB. To achieve comfort and skill, computational and quantitative thinking must build over a 4-year degree program across courses and disciplines. However, in courses focused on Geoscience content it can be challenging to get students comfortable with using computational methods to answer Geoscience questions. To help bridge this gap, we have partnered with MathWorks to develop two workshops focused on collecting and developing strategies and resources to help faculty teach students to incorporate data, simulations, and models into the curriculum at the course and program levels. We brought together faculty members from the sciences, including Geoscience and allied fields, who teach computation and quantitative thinking skills using MATLAB to build a resource collection for teaching. These materials and the outcomes of the workshops are freely available on our website. The workshop outcomes include a collection of teaching activities, essays, and course descriptions that can help faculty incorporate computational skills at the course or program level. The teaching activities include in-class assignments, problem sets, labs, projects, and toolboxes. These activities range from programming assignments to creating and using models. The outcomes also include workshop syntheses that highlight best practices, a set of webpages to support teaching with software such as MATLAB, and an interest group actively discussing these issues in Geoscience and allied fields. Learn more and view the resources at http://serc.carleton.edu/matlab_computation2016/index.html

  10. JobCenter: an open source, cross-platform, and distributed job queue management system optimized for scalability and versatility.

    PubMed

    Jaschob, Daniel; Riffle, Michael

    2012-07-30

    Laboratories engaged in computational biology or bioinformatics frequently need to run lengthy, multistep, and user-driven computational jobs. Each job can tie up a computer for a few minutes to several days, and many laboratories lack the expertise or resources to build and maintain a dedicated computer cluster. JobCenter is a client-server application and framework for job management and distributed job execution. The client and server components are both written in Java and are cross-platform and relatively easy to install. All communication with the server is client-driven, which allows worker nodes to run anywhere (even behind external firewalls or "in the cloud") and provides inherent load balancing. Adding a worker node to the worker pool is as simple as dropping the JobCenter client files onto any computer and performing basic configuration, which provides tremendous ease-of-use, flexibility, and limitless horizontal scalability. Each worker installation may be independently configured, including the types of jobs it is able to run. Executed jobs may be written in any language and may include multistep workflows. JobCenter is a versatile and scalable distributed job management system that allows laboratories to very efficiently distribute all computational work among available resources. JobCenter is freely available at http://code.google.com/p/jobcenter/.

  11. Coil combination for receive array spectroscopy: Are data-driven methods superior to methods using computed field maps?

    PubMed

    Rodgers, Christopher T; Robson, Matthew D

    2016-02-01

    Combining spectra from receive arrays, particularly X-nuclear spectra with low signal-to-noise ratios (SNRs), is challenging. We test whether data-driven combination methods are better than using computed coil sensitivities. Several combination algorithms are recast into the notation of Roemer's classic formula, showing that they differ primarily in their estimation of coil receive sensitivities. This viewpoint reveals two extensions of the whitened singular-value decomposition (WSVD) algorithm, using temporal or temporal + spatial apodization to improve the coil sensitivities, and thus the combined spectral SNR. Radiofrequency fields from an array were simulated and used to make synthetic spectra. These were combined with 10 algorithms. The combined spectra were then assessed in terms of their SNR. Validation used phantoms and cardiac 31P spectra from five subjects at 3T. Combined spectral SNRs from simulations, phantoms, and humans showed the same trends. In phantoms, the combined SNR using computed coil sensitivities was lower than with WSVD combination whenever the WSVD SNR was >14 (or >11 with temporal apodization, or >9 with temporal + spatial apodization). These new apodized WSVD methods gave higher SNRs than other data-driven methods. In the human torso, at frequencies ≥49 MHz, data-driven combination is preferable to using computed coil sensitivities. Magn Reson Med 75:473-487, 2016. © 2015 The Authors. Magnetic Resonance in Medicine published by Wiley Periodicals, Inc. on behalf of International Society for Magnetic Resonance in Medicine. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
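    The following is a hedged sketch of a whitened-SVD-style combination on synthetic array data: whiten by the noise covariance, take the dominant singular vector pair as the sensitivity/signal estimate, and report a crude SNR proxy. It is not the authors' implementation, and all signal parameters are assumptions.

```python
import numpy as np

# Whitened-SVD-style coil combination on synthetic data (illustrative only).
rng = np.random.default_rng(0)
n_coils, n_points = 8, 512
t = np.arange(n_points)
fid = np.exp(-t / 150) * np.exp(2j * np.pi * 0.03 * t)            # one damped resonance
sens = rng.normal(size=n_coils) + 1j * rng.normal(size=n_coils)   # unknown coil sensitivities
noise_cov = 0.05 * np.eye(n_coils)
noise = rng.multivariate_normal(np.zeros(n_coils), noise_cov, size=n_points).T
data = np.outer(sens, fid) + noise                                # coils x time points

white = np.linalg.cholesky(np.linalg.inv(noise_cov))              # noise-whitening matrix
U, s, Vh = np.linalg.svd(white @ data, full_matrices=False)
combined = s[0] * Vh[0]                                           # rank-1 combined signal
snr_proxy = np.abs(combined[:50]).mean() / np.abs(combined[-50:]).std()
print(f"crude SNR proxy of the combined signal: {snr_proxy:.1f}")
```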

  12. The Influence of Reconstruction Kernel on Bone Mineral and Strength Estimates Using Quantitative Computed Tomography and Finite Element Analysis.

    PubMed

    Michalski, Andrew S; Edwards, W Brent; Boyd, Steven K

    2017-10-17

    Quantitative computed tomography has been posed as an alternative imaging modality to investigate osteoporosis. We examined the influence of computed tomography convolution back-projection reconstruction kernels on the analysis of bone quantity and estimated mechanical properties in the proximal femur. Eighteen computed tomography scans of the proximal femur were reconstructed using both a standard smoothing reconstruction kernel and a bone-sharpening reconstruction kernel. Following phantom-based density calibration, we calculated typical bone quantity outcomes of integral volumetric bone mineral density, bone volume, and bone mineral content. Additionally, we performed finite element analysis in a standard sideways fall on the hip loading configuration. Significant differences for all outcome measures, except integral bone volume, were observed between the 2 reconstruction kernels. Volumetric bone mineral density measured using images reconstructed by the standard kernel was significantly lower (6.7%, p < 0.001) when compared with images reconstructed using the bone-sharpening kernel. Furthermore, the whole-bone stiffness and the failure load measured in images reconstructed by the standard kernel were significantly lower (16.5%, p < 0.001, and 18.2%, p < 0.001, respectively) when compared with the image reconstructed by the bone-sharpening kernel. These data suggest that for future quantitative computed tomography studies, a standardized reconstruction kernel will maximize reproducibility, independent of the use of a quantitative calibration phantom. Copyright © 2017 The International Society for Clinical Densitometry. Published by Elsevier Inc. All rights reserved.

  13. Sensory processing during viewing of cinematographic material: Computational modeling and functional neuroimaging

    PubMed Central

    Bordier, Cecile; Puja, Francesco; Macaluso, Emiliano

    2013-01-01

    The investigation of brain activity using naturalistic, ecologically-valid stimuli is becoming an important challenge for neuroscience research. Several approaches have been proposed, primarily relying on data-driven methods (e.g. independent component analysis, ICA). However, data-driven methods often require some post-hoc interpretation of the imaging results to draw inferences about the underlying sensory, motor or cognitive functions. Here, we propose using a biologically-plausible computational model to extract (multi-)sensory stimulus statistics that can be used for standard hypothesis-driven analyses (general linear model, GLM). We ran two separate fMRI experiments, which both involved subjects watching an episode of a TV-series. In Exp 1, we manipulated the presentation by switching on-and-off color, motion and/or sound at variable intervals, whereas in Exp 2, the video was played in the original version, with all the consequent continuous changes of the different sensory features intact. Both for vision and audition, we extracted stimulus statistics corresponding to spatial and temporal discontinuities of low-level features, as well as a combined measure related to the overall stimulus saliency. Results showed that activity in occipital visual cortex and the superior temporal auditory cortex co-varied with changes of low-level features. Visual saliency was found to further boost activity in extra-striate visual cortex plus posterior parietal cortex, while auditory saliency was found to enhance activity in the superior temporal cortex. Data-driven ICA analyses of the same datasets also identified “sensory” networks comprising visual and auditory areas, but without providing specific information about the possible underlying processes, e.g., these processes could relate to modality, stimulus features and/or saliency. We conclude that the combination of computational modeling and GLM enables the tracking of the impact of bottom–up signals on brain activity during viewing of complex and dynamic multisensory stimuli, beyond the capability of purely data-driven approaches. PMID:23202431

  14. Multimodal computational microscopy based on transport of intensity equation

    NASA Astrophysics Data System (ADS)

    Li, Jiaji; Chen, Qian; Sun, Jiasong; Zhang, Jialin; Zuo, Chao

    2016-12-01

    Transport of intensity equation (TIE) is a powerful tool for phase retrieval and quantitative phase imaging, which requires intensity measurements only at axially closely spaced planes, without a separate reference beam. It does not require coherent illumination and works well on conventional bright-field microscopes. The quantitative phase reconstructed by TIE gives valuable information that has been encoded in the complex wave field by passage through a sample of interest. Such information may provide tremendous flexibility to emulate various microscopy modalities computationally without requiring specialized hardware components. We develop the requisite theory to describe such a hybrid computational multimodal imaging system, which yields quantitative phase, Zernike phase contrast, differential interference contrast, and light field moment imaging simultaneously. This makes a variety of observations of biomedical samples straightforward. We then give an experimental demonstration of these ideas by time-lapse imaging of live HeLa cell mitosis. Experimental results verify that a tunable lens-based TIE system, combined with the appropriate postprocessing algorithm, can achieve a variety of promising imaging modalities in parallel with the quantitative phase images for the dynamic study of cellular processes.
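    For reference, the standard form of the transport of intensity equation underlying the approach relates the in-focus intensity I, the phase φ, and the measured axial intensity derivative (k is the wavenumber 2π/λ):

```latex
% Standard transport of intensity equation: the lateral divergence of the
% intensity-weighted phase gradient balances the axial intensity derivative.
\nabla_{\perp}\cdot\left( I\,\nabla_{\perp}\phi \right) = -\,k\,\frac{\partial I}{\partial z}
```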

  15. Simultaneous modeling of visual saliency and value computation improves predictions of economic choice.

    PubMed

    Towal, R Blythe; Mormann, Milica; Koch, Christof

    2013-10-01

    Many decisions we make require visually identifying and evaluating numerous alternatives quickly. These usually vary in reward, or value, and in low-level visual properties, such as saliency. Both saliency and value influence the final decision. In particular, saliency affects fixation locations and durations, which are predictive of choices. However, it is unknown how saliency propagates to the final decision. Moreover, the relative influence of saliency and value is unclear. Here we address these questions with an integrated model that combines a perceptual decision process about where and when to look with an economic decision process about what to choose. The perceptual decision process is modeled as a drift-diffusion model (DDM) process for each alternative. Using psychophysical data from a multiple-alternative, forced-choice task, in which subjects have to pick one food item from a crowded display via eye movements, we test four models where each DDM process is driven by (i) saliency or (ii) value alone or (iii) an additive or (iv) a multiplicative combination of both. We find that models including both saliency and value weighted in a one-third to two-thirds ratio (saliency-to-value) significantly outperform models based on either quantity alone. These eye fixation patterns modulate an economic decision process, also described as a DDM process driven by value. Our combined model quantitatively explains fixation patterns and choices with similar or better accuracy than previous models, suggesting that visual saliency has a smaller, but significant, influence than value and that saliency affects choices indirectly through perceptual decisions that modulate economic decisions.
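    A minimal simulation of the kind of drift-diffusion race described above, with the drift of each alternative formed as an additive saliency/value mixture (echoing the roughly one-third to two-thirds weighting reported); the parameter values and item scores are invented for demonstration.

```python
import numpy as np

# Two-alternative drift-diffusion race with additive saliency/value drift (illustrative).
rng = np.random.default_rng(1)
saliency = np.array([0.8, 0.3])              # hypothetical saliency of each item
value    = np.array([0.4, 0.9])              # hypothetical value of each item
drift = (1 / 3) * saliency + (2 / 3) * value # assumed saliency-to-value weighting
threshold, dt, noise_sd = 1.0, 1e-3, 0.3

def simulate_choice():
    x = np.zeros(2)
    t = 0.0
    while np.all(x < threshold):
        x += drift * dt + noise_sd * np.sqrt(dt) * rng.standard_normal(2)
        t += dt
    return int(np.argmax(x)), t              # chosen item index, decision time

choices = np.array([simulate_choice()[0] for _ in range(500)])
print("P(second, higher-value item chosen):", (choices == 1).mean())
```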

  16. Simultaneous modeling of visual saliency and value computation improves predictions of economic choice

    PubMed Central

    Towal, R. Blythe; Mormann, Milica; Koch, Christof

    2013-01-01

    Many decisions we make require visually identifying and evaluating numerous alternatives quickly. These usually vary in reward, or value, and in low-level visual properties, such as saliency. Both saliency and value influence the final decision. In particular, saliency affects fixation locations and durations, which are predictive of choices. However, it is unknown how saliency propagates to the final decision. Moreover, the relative influence of saliency and value is unclear. Here we address these questions with an integrated model that combines a perceptual decision process about where and when to look with an economic decision process about what to choose. The perceptual decision process is modeled as a drift–diffusion model (DDM) process for each alternative. Using psychophysical data from a multiple-alternative, forced-choice task, in which subjects have to pick one food item from a crowded display via eye movements, we test four models where each DDM process is driven by (i) saliency or (ii) value alone or (iii) an additive or (iv) a multiplicative combination of both. We find that models including both saliency and value weighted in a one-third to two-thirds ratio (saliency-to-value) significantly outperform models based on either quantity alone. These eye fixation patterns modulate an economic decision process, also described as a DDM process driven by value. Our combined model quantitatively explains fixation patterns and choices with similar or better accuracy than previous models, suggesting that visual saliency has a smaller, but significant, influence than value and that saliency affects choices indirectly through perceptual decisions that modulate economic decisions. PMID:24019496

  17. Real-time computing platform for spiking neurons (RT-spike).

    PubMed

    Ros, Eduardo; Ortigosa, Eva M; Agís, Rodrigo; Carrillo, Richard; Arnold, Michael

    2006-07-01

    A computing platform is described for simulating arbitrary networks of spiking neurons in real time. A hybrid computing scheme is adopted that uses both software and hardware components to manage the tradeoff between flexibility and computational power; the neuron model is implemented in hardware and the network model and the learning are implemented in software. The incremental transition of the software components into hardware is supported. We focus on a spike response model (SRM) for a neuron where the synapses are modeled as input-driven conductances. The temporal dynamics of the synaptic integration process are modeled with a synaptic time constant that results in a gradual injection of charge. This type of model is computationally expensive and is not easily amenable to existing software-based event-driven approaches. As an alternative we have designed an efficient time-based computing architecture in hardware, where the different stages of the neuron model are processed in parallel. Further improvements occur by computing multiple neurons in parallel using multiple processing units. This design is tested using reconfigurable hardware and its scalability and performance evaluated. Our overall goal is to investigate biologically realistic models for the real-time control of robots operating within closed action-perception loops, and so we evaluate the performance of the system on simulating a model of the cerebellum where the emulation of the temporal dynamics of the synaptic integration process is important.
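    A clock-driven sketch of the synaptic stage described above: each input spike kicks a conductance that decays with a synaptic time constant, gradually injecting charge into a leaky membrane. The constants are assumptions for illustration, not the platform's hardware parameters.

```python
# Time-stepped (clock-driven) update of an input-driven conductance synapse feeding
# a leaky membrane; all constants below are assumed, illustrative values.
dt, T = 0.1e-3, 0.2                  # 0.1 ms step, 200 ms simulation
steps = int(T / dt)
tau_syn, tau_m = 5e-3, 20e-3         # synaptic and membrane time constants
e_syn, v_rest, c_m = 0.0, -70e-3, 200e-12
g, v = 0.0, v_rest
spike_steps = {int(0.05 / dt), int(0.12 / dt)}   # two presynaptic spikes

for i in range(steps):
    if i in spike_steps:
        g += 5e-9                                # conductance kick per input spike
    g -= dt * g / tau_syn                        # exponential decay of conductance
    i_syn = g * (e_syn - v)                      # gradual charge injection
    v += dt * ((v_rest - v) / tau_m + i_syn / c_m)

print(f"membrane potential after 200 ms: {v * 1e3:.2f} mV")
```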

  18. The Further Development of CSIEC Project Driven by Application and Evaluation in English Education

    ERIC Educational Resources Information Center

    Jia, Jiyou; Chen, Weichao

    2009-01-01

    In this paper, we present the comprehensive version of CSIEC (Computer Simulation in Educational Communication), an interactive web-based human-computer dialogue system with natural language for English instruction, and its tentative application and evaluation in English education. First, we briefly introduce the motivation for this project,…

  19. The Development and Deployment of a Virtual Unit Operations Laboratory

    ERIC Educational Resources Information Center

    Vaidyanath, Sreeram; Williams, Jason; Hilliard, Marcus; Wiesner, Theodore

    2007-01-01

    Computer-simulated experiments offer many benefits to engineering curricula in the areas of safety, cost, and flexibility. We report our experience in developing and deploying a computer-simulated unit operations laboratory, driven by the guiding principle of maximum fidelity to the physical lab. We find that, while the up-front investment in…

  20. An Interactive Computer-Based Conferencing System to Accommodate Students' Learning Process.

    ERIC Educational Resources Information Center

    Saiedian, Hossein

    1993-01-01

    Describes an integrated computer-based conferencing and mail system called ICMS (Integrated Conferencing and Mail System) that was developed to encourage students to participate in class discussions more actively. The menu-driven user interface is explained, and ICMS's role in promoting self-assessment and critical thinking is discussed. (eight…

  1. Reconsidering Simulations in Science Education at a Distance: Features of Effective Use

    ERIC Educational Resources Information Center

    Blake, C.; Scanlon, E.

    2007-01-01

    This paper proposes a reconsideration of use of computer simulations in science education. We discuss three studies of the use of science simulations for undergraduate distance learning students. The first one, "The Driven Pendulum" simulation is a computer-based experiment on the behaviour of a pendulum. The second simulation, "Evolve" is…

  2. Towards a Theory-Based Design Framework for an Effective E-Learning Computer Programming Course

    ERIC Educational Resources Information Center

    McGowan, Ian S.

    2016-01-01

    Built on Dabbagh (2005), this paper presents a four component theory-based design framework for an e-learning session in introductory computer programming. The framework, driven by a body of exemplars component, emphasizes the transformative interaction between the knowledge building community (KBC) pedagogical model, a mixed instructional…

  3. Mind, Brain, and Education in the Digital Era

    ERIC Educational Resources Information Center

    Battro, Antonio M.; Fischer, Kurt W.

    2012-01-01

    Computers are everywhere, and they are transforming the human world. The technology of computers and the Internet is radically changing the ways that people learn and communicate. In the midst of this technology-driven revolution people need to examine the changes to analyze how they are altering interaction and human culture. The changes have…

  4. Learning Computing Topics in Undergraduate Information Systems Courses: Managing Perceived Difficulty

    ERIC Educational Resources Information Center

    Wall, Jeffrey D.; Knapp, Janice

    2014-01-01

    Learning technical computing skills is increasingly important in our technology driven society. However, learning technical skills in information systems (IS) courses can be difficult. More than 20 percent of students in some technical courses may dropout or fail. Unfortunately, little is known about students' perceptions of the difficulty of…

  5. Teaching of Computer Science Topics Using Meta-Programming-Based GLOs and LEGO Robots

    ERIC Educational Resources Information Center

    Štuikys, Vytautas; Burbaite, Renata; Damaševicius, Robertas

    2013-01-01

    The paper's contribution is a methodology that integrates two educational technologies (GLO and LEGO robot) to teach Computer Science (CS) topics at the school level. We present the methodology as a framework of 5 components (pedagogical activities, technology driven processes, tools, knowledge transfer actors, and pedagogical outcomes) and…

  6. Where Computer Science and Cultural Studies Collide

    ERIC Educational Resources Information Center

    Kirschenbaum, Matthew

    2009-01-01

    Most users have no more knowledge of what their computer or code is actually doing than most automobile owners have of their carburetor or catalytic converter. Nor is any such knowledge necessarily needed. But for academics, driven by an increasing emphasis on the materiality of new media--that is, the social, cultural, and economic factors…

  7. Presentation Trainer: What Experts and Computers Can Tell about Your Nonverbal Communication

    ERIC Educational Resources Information Center

    Schneider, J.; Börner, D.; van Rosmalen, P.; Specht, M.

    2017-01-01

    The ability to present effectively is essential for professionals; therefore, oral communication courses have become part of the curricula for higher education studies. However, speaking in public is still a challenge for many graduates. To tackle this problem, driven by the recent advances in computer vision techniques and prosody analysis,…

  8. Speed in Information Processing with a Computer Driven Visual Display in a Real-time Digital Simulation. M.S. Thesis - Virginia Polytechnic Inst.

    NASA Technical Reports Server (NTRS)

    Kyle, R. G.

    1972-01-01

    Information transfer between the operator and computer-generated display systems is an area where the human factors engineer discovers little useful design data relating human performance to system effectiveness. This study utilized a computer-driven, cathode-ray-tube graphic display to quantify human response speed in a sequential information processing task. The performance criterion was response time to sixteen cell elements of a square matrix display. A stimulus signal instruction specified selected cell locations by both row and column identification. A number code from one to four, each equally probable, was assigned at random to the sixteen cells of the matrix and correspondingly required one of four matched keyed-response alternatives. The display format corresponded to a sequence of diagnostic system maintenance events that enabled the operator to verify prime system status, engage backup redundancy for failed subsystem components, and exercise alternate decision-making judgements. The experimental task bypassed the skilled decision-making element and computer processing time in order to determine a lower bound on the basic response speed for the given stimulus/response hardware arrangement.

  9. Chemical reaction path modeling of hydrothermal processes on Mars: Preliminary results

    NASA Technical Reports Server (NTRS)

    Plumlee, Geoffrey S.; Ridley, W. Ian

    1992-01-01

    Hydrothermal processes are thought to have had significant roles in the development of surficial mineralogies and morphological features on Mars. For example, a significant proportion of the Martian soil could consist of the erosional products of hydrothermally altered impact melt sheets. In this model, impact-driven, vapor-dominated hydrothermal systems hydrothermally altered the surrounding rocks and transported volatiles such as S and Cl to the surface. Further support for impact-driven hydrothermal alteration on Mars was provided by studies of the Ries crater, Germany, where suevite deposits were extensively altered to montmorillonite clays by inferred low-temperature (100-130 C) hydrothermal fluids. It was also suggested that surface outflow from both impact-driven and volcano-driven hydrothermal systems could generate the valley networks, thereby eliminating the need for an early warm wet climate. We use computer-driven chemical reaction path calculation to model chemical processes which were likely associated with postulated Martian hydrothermal systems.

  10. Quantitative computed tomography and aerosol morphometry in COPD and alpha1-antitrypsin deficiency.

    PubMed

    Shaker, S B; Maltbaek, N; Brand, P; Haeussermann, S; Dirksen, A

    2005-01-01

    Relative area of emphysema below -910 Hounsfield units (RA-910) and 15th percentile density (PD15) are quantitative computed tomography (CT) parameters used in the diagnosis of emphysema. New concepts for noninvasive diagnosis of emphysema are aerosol-derived airway morphometry, which measures effective airspace dimensions (EAD) and aerosol bolus dispersion (ABD). Quantitative CT, ABD and EAD were compared in 20 smokers with chronic obstructive pulmonary disease (COPD) and 22 patients with alpha1-antitrypsin deficiency (AAD) with a similar degree of airway obstruction and reduced diffusion capacity. In both groups, there was a significant correlation between RA-910 and PD15 and pulmonary function tests (PFTs). A significant correlation was also found between EAD, RA-910 and PD15 in the study population as a whole. Upon separation into two groups, the significance disappeared for the smokers with COPD and strengthened for those with AAD, where EAD correlated significantly with RA-910 and PD15. ABD was similar in the two groups and did not correlate with PFT and quantitative CT in either group. In conclusion, based on quantitative computed tomography and aerosol-derived airway morphometry, emphysema was significantly more severe in patients with alpha1-antitrypsin deficiency compared with patients with usual emphysema, despite similar measures of pulmonary function tests.

  11. Quantitative Study on Computer Self-Efficacy and Computer Anxiety Differences in Academic Major and Residential Status

    ERIC Educational Resources Information Center

    Binkley, Zachary Wayne McClellan

    2017-01-01

    This study investigates computer self-efficacy and computer anxiety within 61 students across two academic majors, Aviation and Sports and Exercise Science, while investigating the impact residential status, age, and gender has on those two psychological constructs. The purpose of the study is to find if computer self-efficacy and computer anxiety…

  12. Slicing for Biology.

    ERIC Educational Resources Information Center

    Ekstrom, James

    2001-01-01

    Advocates using computer imaging technology to assist students in doing projects in which determining density is important. Students can study quantitative comparisons of masses, lengths, and widths using computer software. Includes figures displaying computer images of shells, yeast cultures, and the Aral Sea. (SAH)

  13. Exposure Science and the US EPA National Center for Computational Toxicology

    EPA Science Inventory

    The emerging field of computational toxicology applies mathematical and computer models and molecular biological and chemical approaches to explore both qualitative and quantitative relationships between sources of environmental pollutant exposure and adverse health outcomes. The...

  14. Analysis of the Relationship Between Climate and NDVI Variability at Global Scales

    NASA Technical Reports Server (NTRS)

    Zeng, Fan-Wei; Collatz, G. James; Pinzon, Jorge; Ivanoff, Alvaro

    2011-01-01

    Interannual variability in modeled (CASA) C flux is in part caused by interannual variability in Normalized Difference Vegetation Index (NDVI) Fraction of Photosynthetically Active Radiation (FPAR). This study confirms a mechanism producing variability in modeled NPP: NDVI (FPAR) interannual variability is strongly driven by climate, and the climate-driven variability in NDVI (FPAR) can lead to much larger fluctuations in NPP than the NPP computed from FPAR climatology.

  15. Petri net model for analysis of concurrently processed complex algorithms

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.; Mielke, Roland R.

    1986-01-01

    This paper presents a Petri-net model suitable for analyzing the concurrent processing of computationally complex algorithms. The decomposed operations are to be processed in a multiple processor, data driven architecture. Of particular interest is the application of the model to both the description of the data/control flow of a particular algorithm, and to the general specification of the data driven architecture. A candidate architecture is also presented.
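    A minimal software illustration of the Petri-net idea: a transition fires only when all of its input places hold tokens, which is the data-driven "fire when operands are ready" rule used to describe such architectures. The places and transition below are a made-up example, not the paper's model.

```python
# Tiny Petri-net fragment: places hold tokens; a transition is enabled when every
# input place is marked, and firing moves tokens from inputs to outputs.
marking = {"data_A": 1, "data_B": 1, "result": 0}
transitions = {
    "combine": {"inputs": ["data_A", "data_B"], "outputs": ["result"]},
}

def enabled(name):
    return all(marking[p] > 0 for p in transitions[name]["inputs"])

def fire(name):
    for p in transitions[name]["inputs"]:
        marking[p] -= 1
    for p in transitions[name]["outputs"]:
        marking[p] += 1

if enabled("combine"):
    fire("combine")
print(marking)   # {'data_A': 0, 'data_B': 0, 'result': 1}
```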

  16. Functional recovery in upper limb function in stroke survivors by using brain-computer interface A single case A-B-A-B design.

    PubMed

    Ono, Takashi; Mukaino, Masahiko; Ushiba, Junichi

    2013-01-01

    Recent studies suggest that brain-computer interface (BCI) training for chronic stroke patients is useful for improving the motor function of the paretic hand. However, these studies do not clearly show the extent of the contribution of the BCI, because they combined BCI with other rehabilitation systems, e.g., an orthosis, a robotic intervention, or electrical stimulation. We therefore compared the neurological effects of neuromuscular electrical stimulation (NMES) combined with motor imagery against BCI-driven NMES, employing an A-B-A-B experimental design. In epoch A, the subject received NMES on the paretic extensor digitorum communis (EDC). The subject was asked to attempt finger extension simultaneously. In epoch B, the subject received NMES when the BCI system detected a motor-related electroencephalogram change during attempted motor imagery. Both epochs were carried out for 60 min per day, 5 days per week. As a result, EMG activity of the EDC was enhanced by BCI-driven NMES and significant cortico-muscular coherence was observed at the final evaluation. These results indicate that training with BCI-driven NMES is effective even compared to motor imagery combined with NMES, suggesting the superiority of closed-loop BCI-driven NMES over open-loop NMES for chronic stroke patients.

  17. Women are underrepresented in computational biology: An analysis of the scholarly literature in biology, computer science and computational biology

    PubMed Central

    2017-01-01

    While women are generally underrepresented in STEM fields, there are noticeable differences between fields. For instance, the gender ratio in biology is more balanced than in computer science. We were interested in how this difference is reflected in the interdisciplinary field of computational/quantitative biology. To this end, we examined the proportion of female authors in publications from the PubMed and arXiv databases. There are fewer female authors on research papers in computational biology, as compared to biology in general. This is true across authorship position, year, and journal impact factor. A comparison with arXiv shows that quantitative biology papers have a higher ratio of female authors than computer science papers, placing computational biology in between its two parent fields in terms of gender representation. Both in biology and in computational biology, a female last author increases the probability of other authors on the paper being female, pointing to a potential role of female PIs in influencing the gender balance. PMID:29023441

  18. Biomarkers: Delivering on the expectation of molecularly driven, quantitative health.

    PubMed

    Wilson, Jennifer L; Altman, Russ B

    2018-02-01

    Biomarkers are the pillars of precision medicine and are delivering on expectations of molecular, quantitative health. These features have made clinical decisions more precise and personalized, but require a high bar for validation. Biomarkers have improved health outcomes in a few areas such as cancer, pharmacogenetics, and safety. Burgeoning big data research infrastructure, the internet of things, and increased patient participation will accelerate discovery in the many areas that have not yet realized the full potential of biomarkers for precision health. Here we review themes of biomarker discovery, current implementations of biomarkers for precision health, and future opportunities and challenges for biomarker discovery. Impact statement Precision medicine evolved because of the understanding that human disease is molecularly driven and is highly variable across patients. This understanding has made biomarkers, a diverse class of biological measurements, more relevant for disease diagnosis, monitoring, and selection of treatment strategy. Biomarkers' impact on precision medicine can be seen in cancer, pharmacogenomics, and safety. The successes in these cases suggest many more applications for biomarkers and a greater impact for precision medicine across the spectrum of human disease. The authors assess the status of biomarker-guided medical practice by analyzing themes for biomarker discovery, reviewing the impact of these markers in the clinic, and highlight future and ongoing challenges for biomarker discovery. This work is timely and relevant, as the molecular, quantitative approach of precision medicine is spreading to many disease indications.

  19. Quantitative metrics for evaluating the phased roll-out of clinical information systems.

    PubMed

    Wong, David; Wu, Nicolas; Watkinson, Peter

    2017-09-01

    We introduce a novel quantitative approach for evaluating the order of roll-out during phased introduction of clinical information systems. Such roll-outs are associated with unavoidable risk due to patients transferring between clinical areas using both the old and new systems. We proposed a simple graphical model of patient flow through a hospital. Using a simple instance of the model, we showed how a roll-out order can be generated by minimising the flow of patients from the new system to the old system. The model was applied to admission and discharge data acquired from 37,080 patient journeys at the Churchill Hospital, Oxford between April 2013 and April 2014. The resulting order was evaluated empirically and produced acceptable orders. The development of data-driven approaches to clinical Information system roll-out provides insights that may not necessarily be ascertained through clinical judgment alone. Such methods could make a significant contribution to the smooth running of an organisation during the roll-out of a potentially disruptive technology. Unlike previous approaches, which are based on clinical opinion, the approach described here quantitatively assesses the appropriateness of competing roll-out strategies. The data-driven approach was shown to produce strategies that matched clinical intuition and provides a flexible framework that may be used to plan and monitor Clinical Information System roll-out. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
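    A greedy, purely illustrative version of the ordering idea: at each step, roll out the clinical area whose patient outflow into the areas still on the old system is smallest. The transfer counts below are invented, and the paper's actual graphical model and objective may differ.

```python
# Greedy roll-out ordering sketch: minimize flow from new-system areas into old-system areas.
transfers = {               # transfers[a][b] = patients moving from area a to area b (invented)
    "ED":    {"Ward1": 120, "Ward2": 80, "ICU": 30},
    "Ward1": {"ED": 5, "Ward2": 40, "ICU": 10},
    "Ward2": {"ED": 3, "Ward1": 25, "ICU": 8},
    "ICU":   {"ED": 2, "Ward1": 30, "Ward2": 20},
}

def flow_to_old(area, still_old):
    """Patients who would move from a newly converted area into old-system areas."""
    return sum(n for dest, n in transfers[area].items() if dest in still_old)

remaining, order = set(transfers), []
while remaining:
    nxt = min(remaining, key=lambda a: flow_to_old(a, remaining - {a}))
    order.append(nxt)
    remaining.remove(nxt)
print("suggested roll-out order:", order)
```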

  20. Quantitative Temporal in Vivo Proteomics Deciphers the Transition of Virus-Driven Myeloid Cells into M2 Macrophages

    PubMed Central

    2017-01-01

    Myeloid cells play a central role in the context of viral eradication, yet precisely how these cells differentiate throughout the course of acute infections is poorly understood. In this study, we have developed a novel quantitative temporal in vivo proteomics (QTiPs) platform to capture proteomic signatures of temporally transitioning virus-driven myeloid cells directly in situ, thus taking into consideration host–virus interactions throughout the course of an infection. QTiPs, in combination with phenotypic, functional, and metabolic analyses, elucidated a pivotal role for inflammatory CD11b+, Ly6G–, Ly6Chigh-low cells in antiviral immune response and viral clearance. Most importantly, the time-resolved QTiPs data set showed the transition of CD11b+, Ly6G–, Ly6Chigh-low cells into M2-like macrophages, which displayed increased antigen-presentation capacities and bioenergetic demands late in infection. We elucidated the pivotal role of myeloid cells in virus clearance and show how these cells phenotypically, functionally, and metabolically undergo a timely transition from inflammatory to M2-like macrophages in vivo. With respect to the growing appreciation for in vivo examination of viral–host interactions and for the role of myeloid cells, this study elucidates the use of quantitative proteomics to reveal the role and response of distinct immune cell populations throughout the course of virus infection. PMID:28768414

  1. DOE High Performance Computing Operational Review (HPCOR): Enabling Data-Driven Scientific Discovery at HPC Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Richard; Allcock, William; Beggio, Chris

    2014-10-17

    U.S. Department of Energy (DOE) High Performance Computing (HPC) facilities are on the verge of a paradigm shift in the way they deliver systems and services to science and engineering teams. Research projects are producing a wide variety of data at unprecedented scale and level of complexity, with community-specific services that are part of the data collection and analysis workflow. On June 18-19, 2014 representatives from six DOE HPC centers met in Oakland, CA at the DOE High Performance Operational Review (HPCOR) to discuss how they can best provide facilities and services to enable large-scale data-driven scientific discovery at the DOE national laboratories. The report contains findings from that review.

  2. Strange nonchaotic attractors for computation

    NASA Astrophysics Data System (ADS)

    Sathish Aravindh, M.; Venkatesan, A.; Lakshmanan, M.

    2018-05-01

    We investigate the response of quasiperiodically driven nonlinear systems exhibiting strange nonchaotic attractors (SNAs) to deterministic input signals. We show that if one uses two square waves in an aperiodic manner as input to a quasiperiodically driven double-well Duffing oscillator system, the response of the system can produce logical output controlled by such a forcing. Changing the threshold or biasing of the system changes the output to another logic operation and memory latch. The interplay of nonlinearity and quasiperiodic forcing yields logical behavior, and the emergent outcome of such a system is a logic gate. It is further shown that the logical behaviors persist even for an experimental noise floor. Thus the SNA turns out to be an efficient tool for computation.
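    A rough numerical sketch of the mechanism described: a damped double-well Duffing oscillator with two incommensurate drives plus two logic inputs (held constant within each run here, rather than streamed as aperiodic square waves), with the sign of the time-averaged position read out as the logical response. All coefficients are invented, and whether a clean gate emerges depends on those choices.

```python
import math

# Quasiperiodically driven double-well Duffing oscillator with two logic inputs;
# parameters are illustrative, not the paper's.
def simulate(I1, I2, steps=200_000, dt=1e-3):
    w1, w2 = 1.0, 0.5 * (math.sqrt(5) - 1)    # incommensurate drive frequencies
    x, v, t, acc = -1.0, 0.0, 0.0, 0.0
    for _ in range(steps):
        force = (0.3 * math.cos(w1 * t) + 0.3 * math.cos(w2 * t)
                 + 0.45 * (I1 + I2))          # logic inputs enter as an additive bias
        a = -0.5 * v + x - x**3 + force       # damped double-well dynamics
        v += a * dt
        x += v * dt
        t += dt
        acc += x
    return acc / steps                        # time-averaged position

for I1 in (0, 1):
    for I2 in (0, 1):
        mean_x = simulate(I1, I2)
        print(f"inputs ({I1},{I2}) -> mean x = {mean_x:+.2f} -> logical output {int(mean_x > 0)}")
```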

  3. Cycle-averaged dynamics of a periodically driven, closed-loop circulation model

    NASA Technical Reports Server (NTRS)

    Heldt, T.; Chang, J. L.; Chen, J. J. S.; Verghese, G. C.; Mark, R. G.

    2005-01-01

    Time-varying elastance models have been used extensively in the past to simulate the pulsatile nature of cardiovascular waveforms. Frequently, however, one is interested in dynamics that occur over longer time scales, in which case a detailed simulation of each cardiac contraction becomes computationally burdensome. In this paper, we apply circuit-averaging techniques to a periodically driven, closed-loop, three-compartment recirculation model. The resultant cycle-averaged model is linear and time invariant, and greatly reduces the computational burden. It is also amenable to systematic order reduction methods that lead to further efficiencies. Despite its simplicity, the averaged model captures the dynamics relevant to the representation of a range of cardiovascular reflex mechanisms. © 2004 Elsevier Ltd. All rights reserved.

  4. Single-photon emitting diode in silicon carbide.

    PubMed

    Lohrmann, A; Iwamoto, N; Bodrog, Z; Castelletto, S; Ohshima, T; Karle, T J; Gali, A; Prawer, S; McCallum, J C; Johnson, B C

    2015-07-23

    Electrically driven single-photon emitting devices have immediate applications in quantum cryptography, quantum computation and single-photon metrology. Mature device fabrication protocols and the recent observations of single defect systems with quantum functionalities make silicon carbide an ideal material to build such devices. Here, we demonstrate the fabrication of bright single-photon emitting diodes. The electrically driven emitters display fully polarized output, superior photon statistics (with a count rate of >300 kHz) and stability in both continuous and pulsed modes, all at room temperature. The atomic origin of the single-photon source is proposed. These results provide a foundation for the large scale integration of single-photon sources into a broad range of applications, such as quantum cryptography or linear optics quantum computing.

  5. Data-driven indexing mechanism for the recognition of polyhedral objects

    NASA Astrophysics Data System (ADS)

    McLean, Stewart; Horan, Peter; Caelli, Terry M.

    1992-02-01

    This paper is concerned with the problem of searching large model databases. To date, most object recognition systems have concentrated on the problem of matching using simple searching algorithms. This is quite acceptable when the number of object models is small. However, in the future, general purpose computer vision systems will be required to recognize hundreds or perhaps thousands of objects and, in such circumstances, efficient searching algorithms will be needed. The problem of searching a large model database is one which must be addressed if future computer vision systems are to be at all effective. In this paper we present a method we call data-driven feature-indexed hypothesis generation as one solution to the problem of searching large model databases.

  6. TheCellMap.org: A Web-Accessible Database for Visualizing and Mining the Global Yeast Genetic Interaction Network

    PubMed Central

    Usaj, Matej; Tan, Yizhao; Wang, Wen; VanderSluis, Benjamin; Zou, Albert; Myers, Chad L.; Costanzo, Michael; Andrews, Brenda; Boone, Charles

    2017-01-01

    Providing access to quantitative genomic data is key to ensure large-scale data validation and promote new discoveries. TheCellMap.org serves as a central repository for storing and analyzing quantitative genetic interaction data produced by genome-scale Synthetic Genetic Array (SGA) experiments with the budding yeast Saccharomyces cerevisiae. In particular, TheCellMap.org allows users to easily access, visualize, explore, and functionally annotate genetic interactions, or to extract and reorganize subnetworks, using data-driven network layouts in an intuitive and interactive manner. PMID:28325812

  7. TheCellMap.org: A Web-Accessible Database for Visualizing and Mining the Global Yeast Genetic Interaction Network.

    PubMed

    Usaj, Matej; Tan, Yizhao; Wang, Wen; VanderSluis, Benjamin; Zou, Albert; Myers, Chad L; Costanzo, Michael; Andrews, Brenda; Boone, Charles

    2017-05-05

    Providing access to quantitative genomic data is key to ensure large-scale data validation and promote new discoveries. TheCellMap.org serves as a central repository for storing and analyzing quantitative genetic interaction data produced by genome-scale Synthetic Genetic Array (SGA) experiments with the budding yeast Saccharomyces cerevisiae. In particular, TheCellMap.org allows users to easily access, visualize, explore, and functionally annotate genetic interactions, or to extract and reorganize subnetworks, using data-driven network layouts in an intuitive and interactive manner. Copyright © 2017 Usaj et al.

  8. The Implications of Pervasive Computing on Network Design

    NASA Astrophysics Data System (ADS)

    Briscoe, R.

    Mark Weiser's late-1980s vision of an age of calm technology with pervasive computing disappearing into the fabric of the world [1] has been tempered by an industry-driven vision with more of a feel of conspicuous consumption. In the modified version, everyone carries around consumer electronics to provide natural, seamless interactions both with other people and with the information world, particularly for eCommerce, but still through a pervasive computing fabric.

  9. Development of an Interactive Social Media Tool for Parents with Concerns about Vaccines

    ERIC Educational Resources Information Center

    Shoup, Jo Ann; Wagner, Nicole M.; Kraus, Courtney R.; Narwaney, Komal J.; Goddard, Kristin S.; Glanz, Jason M.

    2015-01-01

    Objective: Describe a process for designing, building, and evaluating a theory-driven social media intervention tool to help reduce parental concerns about vaccination. Method: We developed an interactive web-based tool using quantitative and qualitative methods (e.g., survey, focus groups, individual interviews, and usability testing). Results:…

  10. Dual Credit/Dual Enrollment and Data Driven Policy Implementation

    ERIC Educational Resources Information Center

    Lichtenberger, Eric; Witt, M. Allison; Blankenberger, Bob; Franklin, Doug

    2014-01-01

    The use of dual credit has been expanding rapidly. Dual credit is a college course taken by a high school student for which both college and high school credit is given. Previous studies provided limited quantitative evidence that dual credit/dual enrollment is directly connected to positive student outcomes. In this study, predictive statistics…

  11. Quantitation & Case-Study-Driven Inquiry to Enhance Yeast Fermentation Studies

    ERIC Educational Resources Information Center

    Grammer, Robert T.

    2012-01-01

    We propose a procedure for the assay of fermentation in yeast in microcentrifuge tubes that is simple and rapid, permitting assay replicates, descriptive statistics, and the preparation of line graphs that indicate reproducibility. Using regression and simple derivatives to determine initial velocities, we suggest methods to compare the effects of…

  12. Hydraulic transport across hydrophilic and hydrophobic nanopores: Flow experiments with water and n-hexane.

    PubMed

    Gruener, Simon; Wallacher, Dirk; Greulich, Stefanie; Busch, Mark; Huber, Patrick

    2016-01-01

    We experimentally explore pressure-driven flow of water and n-hexane across nanoporous silica (Vycor glass monoliths with 7- or 10-nm pore diameters, respectively) as a function of temperature and surface functionalization (native and silanized glass surfaces). Hydraulic flow rates are measured by applying hydrostatic pressures via inert gases (argon and helium, pressurized up to 70 bar) on the upstream side in a capacitor-based membrane permeability setup. For the native, hydrophilic silica walls, the measured hydraulic permeabilities can be quantitatively accounted for by bulk fluidity provided we assume a sticking boundary layer, i.e., a negative velocity slip length of molecular dimensions. The thickness of this boundary layer is discussed with regard to previous capillarity-driven flow experiments (spontaneous imbibition) and with regard to velocity slippage at the pore walls resulting from dissolved gas. Water flow across the silanized, hydrophobic nanopores is blocked up to a hydrostatic pressure of at least 70 bar. The absence of a sticking boundary layer quantitatively accounts for an enhanced n-hexane permeability in the hydrophobic compared to the hydrophilic nanopores.
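
    A rough way to see how a sticking boundary layer depresses the permeability is a cylindrical-pore Hagen-Poiseuille estimate in which the immobile layer of thickness delta simply reduces the effective hydraulic radius. The sketch below is only illustrative: the pore radius, layer thickness, pressure, sample length, and viscosity are made-up numbers, not the values measured in the study.

```python
import math

def pore_flow_rate(r_pore, delta, dp, length, viscosity):
    """Hagen-Poiseuille flow through one cylindrical pore, with a sticking
    (immobile) boundary layer of thickness delta shrinking the hydraulic radius."""
    r_eff = max(r_pore - delta, 0.0)   # behaves like a "negative slip length" of -delta
    return math.pi * r_eff**4 * dp / (8.0 * viscosity * length)

# Illustrative numbers only: 3.5 nm pore radius, 70 bar, 5 mm sample, water-like viscosity.
q_bulk     = pore_flow_rate(3.5e-9, 0.0,     70e5, 5e-3, 1.0e-3)
q_sticking = pore_flow_rate(3.5e-9, 0.25e-9, 70e5, 5e-3, 1.0e-3)
print("permeability reduction factor:", round(q_sticking / q_bulk, 2))
```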

  13. Assessing and predicting drug-induced anticholinergic risks: an integrated computational approach.

    PubMed

    Xu, Dong; Anderson, Heather D; Tao, Aoxiang; Hannah, Katia L; Linnebur, Sunny A; Valuck, Robert J; Culbertson, Vaughn L

    2017-11-01

    Anticholinergic (AC) adverse drug events (ADEs) are caused by inhibition of muscarinic receptors as a result of designated or off-target drug-receptor interactions. In practice, AC toxicity is assessed primarily based on clinician experience. The goal of this study was to evaluate a novel concept of integrating big pharmacological and healthcare data to assess clinical AC toxicity risks. AC toxicity scores (ATSs) were computed using drug-receptor inhibitions identified through pharmacological data screening. A longitudinal retrospective cohort study using medical claims data was performed to quantify AC clinical risks. ATS was compared with two previously reported toxicity measures. A quantitative structure-activity relationship (QSAR) model was established for rapid assessment and prediction of AC clinical risks. A total of 25 common medications, and 575,228 exposed and unexposed patients were analyzed. Our data indicated that ATS is more consistent with the trend of AC outcomes than other toxicity methods. Incorporating drug pharmacokinetic parameters to ATS yielded a QSAR model with excellent correlation to AC incident rate (R² = 0.83) and predictive performance (cross-validation Q² = 0.64). Good correlation and predictive performance (R² = 0.68 / Q² = 0.29) were also obtained for an M2 receptor-specific QSAR model and tachycardia, an M2 receptor-specific ADE. Albeit using a small medication sample size, our pilot data demonstrated the potential and feasibility of a new computational AC toxicity scoring approach driven by underlying pharmacology and big data analytics. Follow-up work is under way to further develop the ATS scoring approach and clinical toxicity predictive model using a large number of medications and clinical parameters.
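
    The scoring-plus-regression idea can be sketched with ordinary least squares and leave-one-out cross-validation. The descriptor matrix, incident rates, and linear form below are hypothetical stand-ins, not the ATS definition or the QSAR model actually used in the study.

```python
import numpy as np

# Hypothetical descriptors: each row is a drug; columns could hold an
# anticholinergic toxicity score plus a pharmacokinetic parameter.
X = np.array([[0.12, 1.4], [0.35, 2.1], [0.50, 1.9], [0.72, 3.0], [0.90, 2.6]])
y = np.array([0.8, 2.1, 2.9, 4.2, 5.0])   # illustrative AC incident rates

def fit(X, y):
    A = np.column_stack([np.ones(len(X)), X])     # add an intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def r2(y, yhat):
    return 1 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)

coef = fit(X, y)
yhat = np.column_stack([np.ones(len(X)), X]) @ coef
print("R^2 (fit):", round(r2(y, yhat), 3))

# Leave-one-out cross-validation gives a Q^2-style predictive estimate.
preds = []
for i in range(len(X)):
    mask = np.arange(len(X)) != i
    c = fit(X[mask], y[mask])
    preds.append(np.r_[1.0, X[i]] @ c)
print("Q^2 (LOO):", round(r2(y, np.array(preds)), 3))
```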

  14. Accelerator on a Chip

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    England, Joel

    2014-06-30

    SLAC's Joel England explains how the same fabrication techniques used for silicon computer microchips allowed their team to create the new laser-driven particle accelerator chips. (SLAC Multimedia Communications)

  15. Accelerator on a Chip

    ScienceCinema

    England, Joel

    2018-01-16

    SLAC's Joel England explains how the same fabrication techniques used for silicon computer microchips allowed their team to create the new laser-driven particle accelerator chips. (SLAC Multimedia Communications)

  16. 1, 2, 3, 4: Infusing Quantitative Literacy into Introductory Biology

    PubMed Central

    Momsen, Jennifer L.; Moyerbrailean, Gregory A.; Ebert-May, Diane; Long, Tammy M.; Wyse, Sara; Linton, Debra

    2010-01-01

    Biology of the twenty-first century is an increasingly quantitative science. Undergraduate biology education therefore needs to provide opportunities for students to develop fluency in the tools and language of quantitative disciplines. Quantitative literacy (QL) is important for future scientists as well as for citizens, who need to interpret numeric information and data-based claims regarding nearly every aspect of daily life. To address the need for QL in biology education, we incorporated quantitative concepts throughout a semester-long introductory biology course at a large research university. Early in the course, we assessed the quantitative skills that students bring to the introductory biology classroom and found that students had difficulties in performing simple calculations, representing data graphically, and articulating data-driven arguments. In response to students' learning needs, we infused the course with quantitative concepts aligned with the existing course content and learning objectives. The effectiveness of this approach is demonstrated by significant improvement in the quality of students' graphical representations of biological data. Infusing QL in introductory biology presents challenges. Our study, however, supports the conclusion that it is feasible in the context of an existing course, consistent with the goals of college biology education, and promotes students' development of important quantitative skills. PMID:20810965

  17. Towards Test Driven Development for Computational Science with pFUnit

    NASA Technical Reports Server (NTRS)

    Rilee, Michael L.; Clune, Thomas L.

    2014-01-01

    Developers working in Computational Science & Engineering (CSE)/High Performance Computing (HPC) must contend with constant change due to advances in computing technology and science. Test Driven Development (TDD) is a methodology that mitigates software development risks due to change at the cost of adding comprehensive and continuous testing to the development process. Testing frameworks tailored for CSE/HPC, like pFUnit, can lower the barriers to such testing, yet CSE software faces unique constraints foreign to the broader software engineering community. Effective testing of numerical software requires a comprehensive suite of oracles, i.e., use cases with known answers, as well as robust estimates for the unavoidable numerical errors associated with implementation with finite-precision arithmetic. At first glance these concerns often seem exceedingly challenging or even insurmountable for real-world scientific applications. However, we argue that this common perception is incorrect and driven by (1) a conflation between model validation and software verification and (2) the general tendency in the scientific community to develop relatively coarse-grained, large procedures that compound numerous algorithmic steps. We believe TDD can be applied routinely to numerical software if developers pursue fine-grained implementations that permit testing, neatly side-stepping concerns about needing nontrivial oracles as well as the accumulation of errors. We present an example of a successful, complex legacy CSE/HPC code whose development process shares some aspects with TDD, which we contrast with current and potential capabilities. A mix of our proposed methodology and framework support should enable everyday use of TDD by CSE-expert developers.

  18. Computer simulation of the metastatic progression.

    PubMed

    Wedemann, Gero; Bethge, Anja; Haustein, Volker; Schumacher, Udo

    2014-01-01

    A novel computer model based on a discrete event simulation procedure describes quantitatively the processes underlying the metastatic cascade. Analytical functions describe the size of the primary tumor and the metastases, while a rate function models the intravasation events of the primary tumor and metastases. Events describe the behavior of the malignant cells until the formation of new metastases. The results of the computer simulations are in quantitative agreement with clinical data determined from a patient with hepatocellular carcinoma in the liver. The model provides a more detailed view on the process than a conventional mathematical model. In particular, the implications of interventions on metastasis formation can be calculated.
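
    A minimal sketch of the discrete-event idea (not the authors' model) is given below: each tumour grows along a Gompertz curve, sheds cells into the circulation as a size-dependent Poisson process sampled by thinning, and every shedding event founds a new metastasis that is added to the event queue. All growth and seeding parameters are invented for illustration.

```python
import heapq, math, random

random.seed(7)

# Gompertzian growth: tumour size (cells) as a function of age (days).
def size(age, n0=1.0, k=18.0, a=0.005):
    return n0 * math.exp(k * (1.0 - math.exp(-a * age)))

SEED_RATE = 2e-8                       # intravasation events per cell per day (made up)
RATE_MAX = SEED_RATE * math.exp(18.0)  # upper bound: plateau of the Gompertz curve

def next_shedding(t, birth):
    """Next intravasation time of one tumour (non-homogeneous Poisson, by thinning)."""
    while True:
        t += random.expovariate(RATE_MAX)
        if random.random() < SEED_RATE * size(t - birth) / RATE_MAX:
            return t

def simulate(t_end=730.0):
    births = [0.0]                                    # tumour 0 is the primary
    queue = [(next_shedding(0.0, 0.0), 0)]            # (event time, tumour id)
    while queue and queue[0][0] <= t_end:
        t, tid = heapq.heappop(queue)
        births.append(t)                              # event: a new metastasis is founded
        heapq.heappush(queue, (next_shedding(t, t), len(births) - 1))
        heapq.heappush(queue, (next_shedding(t, births[tid]), tid))
    return len(births) - 1

print("metastases after two years:", simulate())
```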

  19. Combining Ultrasound Pulse-Echo and Transmission Computed Tomography for Quantitative Imaging the Cortical Shell of Long Bone Replicas

    NASA Astrophysics Data System (ADS)

    Shortell, Matthew P.; Althomali, Marwan A. M.; Wille, Marie-Luise; Langton, Christian M.

    2017-11-01

    We demonstrate a simple technique for quantitative ultrasound imaging of the cortical shell of long bone replicas. Traditional ultrasound computed tomography instruments use the transmitted or reflected waves for separate reconstructions but suffer from strong refraction artefacts in highly heterogeneous samples such as bones in soft tissue. The technique described here simplifies the long bone to a two-component composite and uses both the transmitted and reflected waves for reconstructions, allowing the speed of sound and thickness of the cortical shell to be calculated accurately. The technique is simple to implement, computationally inexpensive, and sample positioning errors are minimal.

  20. Qualitative and quantitative interpretation of SEM image using digital image processing.

    PubMed

    Saladra, Dawid; Kopernik, Magdalena

    2016-10-01

    The aim of this study is the improvement of qualitative and quantitative analysis of scanning electron microscope micrographs by development of a computer program, which enables automatic crack analysis of scanning electron microscopy (SEM) micrographs. Micromechanical tests of pneumatic ventricular assist devices result in a large number of micrographs. Therefore, the analysis must be automatic. Tests for athrombogenic titanium nitride/gold coatings deposited on polymeric substrates (Bionate II) are performed. These tests include microshear, microtension and fatigue analysis. Anisotropic surface defects observed in the SEM micrographs require support for qualitative and quantitative interpretation. Improvement of qualitative analysis of scanning electron microscope images was achieved by a set of computational tools that includes binarization, simplified expanding, expanding, simple image statistic thresholding, the filters Laplacian 1 and Laplacian 2, Otsu, and reverse binarization. Several modifications of the known image processing techniques and combinations of the selected image processing techniques were applied. The introduced quantitative analysis of digital scanning electron microscope images enables computation of stereological parameters such as area, crack angle, crack length, and total crack length per unit area. This study also compares the functionality of the developed computer program of digital image processing with existing applications. The described pre- and postprocessing may be helpful in scanning electron microscopy and transmission electron microscopy surface investigations. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
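
    As an illustration of the kind of processing described, the sketch below implements Otsu thresholding in plain NumPy and a very crude crack-length-per-area estimate from the resulting binary image. The pixel size and the one-pixel-wide-crack assumption are placeholders, not the program's actual stereological procedure.

```python
import numpy as np

def otsu_threshold(img):
    """Otsu's method on an 8-bit grayscale image (maximise between-class variance)."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, 0.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_t, best_var = t, var_between
    return best_t

def crack_length_per_area(binary, pixel_size_um=0.1):
    """Very rough stereology: approximate total crack length by the number of
    crack pixels times the pixel size (cracks assumed one pixel wide)."""
    total_length_um = binary.sum() * pixel_size_um
    area_um2 = binary.size * pixel_size_um ** 2
    return total_length_um / area_um2

img = (np.random.rand(256, 256) * 255).astype(np.uint8)   # stand-in for an SEM micrograph
t = otsu_threshold(img)
binary = img < t                                           # dark pixels treated as cracks
print("Otsu threshold:", t, "| length per area (1/um):", round(crack_length_per_area(binary), 3))
```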

  1. [The development of a computer model in the quantitative assessment of thallium-201 myocardial scintigraphy].

    PubMed

    Raineri, M; Traina, M; Rotolo, A; Candela, B; Lombardo, R M; Raineri, A A

    1993-05-01

    Thallium-201 scintigraphy is a widely used noninvasive procedure for the detection and prognostic assessment of patients with suspected or proven coronary artery disease. Thallium uptake can be evaluated by a visual analysis or by a quantitative interpretation. Quantitative scintigraphy enhances disease detection in individual coronary arteries and provides a more precise estimate of the amount of ischemic myocardium, distinguishing scar from hypoperfused tissue. Because of the large amount of data, the analysis, interpretation and comparison of thallium uptake can be very complex. We designed a computer-based system for the interpretation of quantitative thallium-201 scintigraphy uptake data. We used a database (DataEase 4.2-DataEase Italia). Our software has the following functions: data storage; calculation; conversion of numerical data into different definitions classifying myocardial perfusion; uptake data comparison; automatic conclusion; comparison of different scintigrams for the same patient. Our software is made up of 4 sections: numeric analysis, descriptive analysis, automatic conclusion, clinical remarks. We introduced into the computer system appropriate information, "logical paths", that use "IF ... THEN" rules. The software executes these rules in order to analyze the myocardial regions in the 3 phases of scintigraphic analysis (stress, redistribution, re-injection), in the 3 projections (LAO 45 degrees, LAT, ANT), considering our uptake cutoff, obtaining, finally, the automatic conclusions. For these reasons, our computer-based system could be considered a real "expert system".
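
    A toy version of such "IF ... THEN" logic for a single myocardial region is sketched below; the uptake cutoffs and category names are invented for illustration and are not the thresholds used by the described system.

```python
# Hypothetical uptake cutoffs (% of peak uptake); the original system's
# thresholds and region definitions are not given in the abstract.
CUTOFF_NORMAL, CUTOFF_HYPO = 75, 50

def classify_region(stress, redistribution, reinjection):
    """IF ... THEN rules applied to one myocardial region across the three phases."""
    if stress >= CUTOFF_NORMAL:
        return "normal perfusion"
    if redistribution >= CUTOFF_NORMAL or reinjection >= CUTOFF_NORMAL:
        return "reversible defect (ischaemia)"
    if reinjection >= CUTOFF_HYPO:
        return "partially reversible defect"
    return "fixed defect (scar)"

# One region in one projection (e.g. LAO 45 degrees), uptake in % of peak.
print(classify_region(stress=55, redistribution=80, reinjection=85))
```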

  2. A fully coupled method for massively parallel simulation of hydraulically driven fractures in 3-dimensions: FULLY COUPLED PARALLEL SIMULATION OF HYDRAULIC FRACTURES IN 3-D

    DOE PAGES

    Settgast, Randolph R.; Fu, Pengcheng; Walsh, Stuart D. C.; ...

    2016-09-18

    This study describes a fully coupled finite element/finite volume approach for simulating field-scale hydraulically driven fractures in three dimensions, using massively parallel computing platforms. The proposed method is capable of capturing realistic representations of local heterogeneities, layering and natural fracture networks in a reservoir. A detailed description of the numerical implementation is provided, along with numerical studies comparing the model with both analytical solutions and experimental results. The results demonstrate the effectiveness of the proposed method for modeling large-scale problems involving hydraulically driven fractures in three dimensions.

  3. A microscale turbine driven by diffusive mass flux.

    PubMed

    Yang, Mingcheng; Liu, Rui; Ripoll, Marisol; Chen, Ke

    2015-10-07

    An external diffusive mass flux is shown to be able to generate a mechanical torque on a microscale object based on anisotropic diffusiophoresis. In light of this finding, we propose a theoretical prototype micro-turbine driven purely by diffusive mass flux, which is in strong contrast to conventional turbines driven by convective mass flows. The rotational velocity of the proposed turbine is determined by the external concentration gradient, the geometry and the diffusiophoretic properties of the turbine. This scenario is validated by performing computer simulations. Our finding thus provides a new type of chemo-mechanical response which could be used to exploit existing chemical energies at small scales.

  4. A fully coupled method for massively parallel simulation of hydraulically driven fractures in 3-dimensions: FULLY COUPLED PARALLEL SIMULATION OF HYDRAULIC FRACTURES IN 3-D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Settgast, Randolph R.; Fu, Pengcheng; Walsh, Stuart D. C.

    This study describes a fully coupled finite element/finite volume approach for simulating field-scale hydraulically driven fractures in three dimensions, using massively parallel computing platforms. The proposed method is capable of capturing realistic representations of local heterogeneities, layering and natural fracture networks in a reservoir. A detailed description of the numerical implementation is provided, along with numerical studies comparing the model with both analytical solutions and experimental results. The results demonstrate the effectiveness of the proposed method for modeling large-scale problems involving hydraulically driven fractures in three dimensions.

  5. A General and Efficient Method for Incorporating Precise Spike Times in Globally Time-Driven Simulations

    PubMed Central

    Hanuschkin, Alexander; Kunkel, Susanne; Helias, Moritz; Morrison, Abigail; Diesmann, Markus

    2010-01-01

    Traditionally, event-driven simulations have been limited to the very restricted class of neuronal models for which the timing of future spikes can be expressed in closed form. Recently, the class of models that is amenable to event-driven simulation has been extended by the development of techniques to accurately calculate firing times for some integrate-and-fire neuron models that do not enable the prediction of future spikes in closed form. The motivation of this development is the general perception that time-driven simulations are imprecise. Here, we demonstrate that a globally time-driven scheme can calculate firing times that cannot be discriminated from those calculated by an event-driven implementation of the same model; moreover, the time-driven scheme incurs lower computational costs. The key insight is that time-driven methods are based on identifying a threshold crossing in the recent past, which can be implemented by a much simpler algorithm than the techniques for predicting future threshold crossings that are necessary for event-driven approaches. As run time is dominated by the cost of the operations performed at each incoming spike, which includes spike prediction in the case of event-driven simulation and retrospective detection in the case of time-driven simulation, the simple time-driven algorithm outperforms the event-driven approaches. Additionally, our method is generally applicable to all commonly used integrate-and-fire neuronal models; we show that a non-linear model employing a standard adaptive solver can reproduce a reference spike train with a high degree of precision. PMID:21031031
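
    The key insight (locating a threshold crossing in the recent past) can be illustrated with a leaky integrate-and-fire neuron stepped on a fixed grid, where the crossing time within the last step is recovered by linear interpolation. This is only a minimal sketch, not the authors' implementation; the membrane parameters and input are arbitrary.

```python
import numpy as np

def lif_time_driven(I, dt=0.1, tau=10.0, v_th=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron on a fixed time grid. After each step the
    threshold crossing is located retrospectively by linear interpolation, so the
    reported spike time is not tied to the grid resolution."""
    v, t, spikes = 0.0, 0.0, []
    for i_ext in I:
        v_new = v + dt * (-v + i_ext) / tau          # forward-Euler update
        if v_new >= v_th:                            # crossing happened within this step
            frac = (v_th - v) / (v_new - v)          # fraction of the step before crossing
            spikes.append(t + frac * dt)
            v_new = v_reset
        v, t = v_new, t + dt
    return spikes

current = np.full(2000, 1.5)                         # constant suprathreshold drive
print("first spike times (ms):", [round(s, 3) for s in lif_time_driven(current)[:3]])
```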

  6. Identifying and locating surface defects in wood: Part of an automated lumber processing system

    Treesearch

    Richard W. Conners; Charles W. McMillin; Kingyao Lin; Ramon E. Vasquez-Espinosa

    1983-01-01

    Continued increases in the cost of materials and labor make it imperative for furniture manufacturers to control costs by improved yield and increased productivity. This paper describes an Automated Lumber Processing System (ALPS) that employs computer tomography, optical scanning technology, the calculation of an optimum cutting strategy, and 1 computer-driven laser...

  7. Toward a Singleton Undergraduate Computer Graphics Course in Small and Medium-Sized Colleges

    ERIC Educational Resources Information Center

    Shesh, Amit

    2013-01-01

    This article discusses the evolution of a single undergraduate computer graphics course over five semesters, driven by a primary question: if one could offer only one undergraduate course in graphics, what would it include? This constraint is relevant to many small and medium-sized colleges that lack resources, adequate expertise, and enrollment…

  8. A New Computational Tool for Understanding Light-Matter Interactions

    DTIC Science & Technology

    2016-02-11

    Report title: Final Report: A New Computational Tool for Understanding Light-Matter Interactions. Subject terms: plasmonics, light-matter interaction, time-dependent density functional theory, modeling. Abstract (fragment): Plasmonic resonance of a metallic nanostructure results from coherent motion of its conduction electrons driven by...

  9. Digital Youth Divas: Exploring Narrative-Driven Curriculum to Spark Middle School Girls' Interest in Computational Activities

    ERIC Educational Resources Information Center

    Pinkard, Nichole; Erete, Sheena; Martin, Caitlin K.; McKinney de Royston, Maxine

    2017-01-01

    Women use technology to mediate numerous aspects of their professional and personal lives. Yet, few design and create these technologies given that women, especially women of color, are grossly underrepresented in computer science and engineering courses. Decisions about participation in STEM are frequently made prior to high school, and these…

  10. Computer measurement of arterial disease

    NASA Technical Reports Server (NTRS)

    Armstrong, J.; Selzer, R. H.; Barndt, R.; Blankenhorn, D. H.; Brooks, S.

    1980-01-01

    An image processing technique quantifies human atherosclerosis by computer analysis of arterial angiograms. X-ray film images are scanned and digitized, the arterial shadow is tracked, and several quantitative measures of lumen irregularity are computed. In other tests, excellent agreement was found between computer evaluation of femoral angiograms on living subjects and evaluation by teams of trained angiographers.

  11. Computational Algorithmization: Limitations in Problem Solving Skills in Computational Sciences Majors at University of Oriente

    ERIC Educational Resources Information Center

    Castillo, Antonio S.; Berenguer, Isabel A.; Sánchez, Alexander G.; Álvarez, Tomás R. R.

    2017-01-01

    This paper analyzes the results of a diagnostic study carried out with second year students of the computational sciences majors at University of Oriente, Cuba, to determine the limitations that they present in computational algorithmization. An exploratory research was developed using quantitative and qualitative methods. The results allowed…

  12. Dense CO in Mrk 71-A: Superwind Suppressed in a Young Super Star Cluster

    NASA Astrophysics Data System (ADS)

    Oey, M. S.; Herrera, C. N.; Silich, Sergiy; Reiter, Megan; James, Bethan L.; Jaskot, A. E.; Micheva, Genoveva

    2017-11-01

    We report the detection of CO(J=2-1) coincident with the super star cluster (SSC) Mrk 71-A in the nearby Green Pea analog galaxy, NGC 2366. Our observations with the Northern Extended Millimeter Array reveal a compact, ~7 pc, molecular cloud whose mass (10⁵ M⊙) is similar to that of the SSC, consistent with a high star formation efficiency, on the order of 0.5. There are two spatially distinct components separated by 11 km s⁻¹. If expanding, these could be due to momentum-driven stellar wind feedback. Alternatively, we may be seeing remnants of the infalling, colliding clouds responsible for triggering the SSC formation. The kinematics are also consistent with a virialized system. These extreme, high-density, star-forming conditions inhibit energy-driven feedback; the co-spatial existence of a massive, molecular cloud with the SSC supports this scenario, and we quantitatively confirm that any wind-driven feedback in Mrk 71-A is momentum-driven, rather than energy-driven. Since Mrk 71-A is a candidate Lyman continuum emitter, this implies that energy-driven superwinds may not be a necessary condition for the escape of ionizing radiation. In addition, the detection of nebular continuum emission yields an accurate astrometric position for Mrk 71-A. We also detect four other massive molecular clouds in this giant star-forming complex.

  13. Magnetization dynamics driven by spin-polarized current in nanomagnets

    NASA Astrophysics Data System (ADS)

    Carpentieri, M.; Torres, L.; Azzerboni, B.; Finocchio, G.; Consolo, G.; Lopez-Diaz, L.

    2007-09-01

    In this report, micromagnetic simulations of magnetization dynamics driven by spin-polarized currents (SPCs) on magnetic nanopillars of permalloy/Cu/permalloy with different rectangular cross-sections are presented. Complete dynamical stability diagrams from initial parallel and antiparallel states have been computed for 100 ns. The effects of a space-dependent polarization function together with the presence of magnetostatic coupling from the fixed layer and classical Ampere field have been taken into account.

  14. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING

    EPA Science Inventory

    The overall goal of the EPA-ORD NERL research program on Computational Toxicology (CompTox) is to provide the Agency with the tools of modern chemistry, biology, and computing to improve quantitative risk assessments and reduce uncertainties in the source-to-adverse outcome conti...

  15. Master-slave mixed arrays for data-flow computations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, T.L.; Fisher, P.D.

    1983-01-01

    Control cells (masters) and computation cells (slaves) are mixed in regular geometric patterns to form reconfigurable arrays known as master-slave mixed arrays (MSMAs). Interconnections of the corners and edges of the hexagonal control cells and the edges of the hexagonal computation cells are used to construct synchronous and asynchronous communication networks, which support local computation and local communication. Data-driven computations result in self-directed ring pipelines within the MSMA, and composite data-flow computations are executed in a pipelined fashion. By viewing an MSMA as a computing network of tightly-linked ring pipelines, data-flow programs can be uniformly distributed over these pipelines for efficient resource utilisation. 9 references.

  16. Advances in the computation of the Sjöstrand, Rossi, and Feynman distributions

    DOE PAGES

    Talamo, A.; Gohar, Y.; Gabrielli, F.; ...

    2017-02-01

    This study illustrates recent computational advances in the application of the Sjöstrand (area), Rossi, and Feynman methods to estimate the effective multiplication factor of a subcritical system driven by an external neutron source. The methodologies introduced in this study have been validated with the experimental results from the KUCA facility of Japan by Monte Carlo (MCNP6 and MCNPX) and deterministic (ERANOS, VARIANT, and PARTISN) codes. When the assembly is driven by a pulsed neutron source generated by a particle accelerator and delayed neutrons are at equilibrium, the Sjöstrand method becomes extremely fast if the integral of the reaction rate from a single pulse is split into two parts. These two integrals distinguish between the neutron counts during and after the pulse period. To conclude, when the facility is driven by a spontaneous fission neutron source, the timestamps of the detector neutron counts can be obtained up to nanosecond precision using MCNP6, which allows obtaining the Rossi and Feynman distributions.
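
    The split of the single-pulse reaction-rate integral can be sketched numerically as below: the detector history is integrated separately over the pulse period and the decay that follows, and the standard Sjöstrand area-ratio relation (reactivity in dollars equal to minus the prompt-to-delayed area ratio) is applied. The pulse shape, background level, and time window are synthetic, not data from the facility or the cited codes.

```python
import numpy as np

def trapz(y, x):
    """Simple trapezoidal integration."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

# Synthetic detector reaction-rate history from one source pulse (illustrative
# shape only): a fast prompt decay on top of a flat delayed-neutron background.
t = np.linspace(0.0, 20.0, 2001)                  # time after pulse start, ms
pulse_width = 0.1                                 # accelerator pulse length, ms
prompt = 1.0e4 * np.exp(-t / 0.5)                 # prompt-neutron response
delayed = np.full_like(t, 50.0)                   # delayed neutrons at equilibrium
counts = prompt + delayed

during = t <= pulse_width                         # split the single-pulse integral
after = ~during
area_during = trapz(counts[during], t[during])    # part 1: during the pulse period
area_after = trapz(counts[after], t[after])       # part 2: after the pulse period

area_delayed = trapz(delayed, t)                  # delayed (background) area
area_prompt = area_during + area_after - area_delayed
print("reactivity estimate ($):", round(-area_prompt / area_delayed, 2))
```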

  17. Validation of High-Fidelity CFD Simulations for Rocket Injector Design

    NASA Technical Reports Server (NTRS)

    Tucker, P. Kevin; Menon, Suresh; Merkle, Charles L.; Oefelein, Joseph C.; Yang, Vigor

    2008-01-01

    Computational fluid dynamics (CFD) has the potential to improve the historical rocket injector design process by evaluating the sensitivity of performance and injector-driven thermal environments to the details of the injector geometry and key operational parameters. Methodical verification and validation efforts on a range of coaxial injector elements have shown the current production CFD capability must be improved in order to quantitatively impact the injector design process. This paper documents the status of a focused effort to compare and understand the predictive capabilities and computational requirements of a range of CFD methodologies on a set of single element injector model problems. The steady Reynolds-Average Navier-Stokes (RANS), unsteady Reynolds-Average Navier-Stokes (URANS) and three different approaches using the Large Eddy Simulation (LES) technique were used to simulate the initial model problem, a single element coaxial injector using gaseous oxygen and gaseous hydrogen propellants. While one high-fidelity LES result matches the experimental combustion chamber wall heat flux very well, there is no monotonic convergence to the data with increasing computational tool fidelity. Systematic evaluation of key flow field regions such as the flame zone, the head end recirculation zone and the downstream near wall zone has shed significant, though as of yet incomplete, light on the complex, underlying causes for the performance level of each technique.

  18. Analyzing huge pathology images with open source software.

    PubMed

    Deroulers, Christophe; Ameisen, David; Badoual, Mathilde; Gerin, Chloé; Granier, Alexandre; Lartaud, Marc

    2013-06-06

    Digital pathology images are increasingly used both for diagnosis and research, because slide scanners are nowadays broadly available and because the quantitative study of these images yields new insights in systems biology. However, such virtual slides pose a technical challenge since the images often occupy several gigabytes and cannot be fully opened in a computer's memory. Moreover, there is no standard format. Therefore, most common open source tools such as ImageJ fail at treating them, and the others require expensive hardware while still being prohibitively slow. We have developed several cross-platform open source software tools to overcome these limitations. The NDPITools provide a way to transform microscopy images initially in the loosely supported NDPI format into one or several standard TIFF files, and to create mosaics (division of huge images into small ones, with or without overlap) in various TIFF and JPEG formats. They can be driven through ImageJ plugins. The LargeTIFFTools achieve similar functionality for huge TIFF images which do not fit into RAM. We test the performance of these tools on several digital slides and compare them, when applicable, to standard software. A statistical study of the cells in a tissue sample from an oligodendroglioma was performed on an average laptop computer to demonstrate the efficiency of the tools. Our open source software enables dealing with huge images with standard software on average computers. They are cross-platform, independent of proprietary libraries and very modular, allowing them to be used in other open source projects. They have excellent performance in terms of execution speed and RAM requirements. They open promising perspectives both to the clinician who wants to study a single slide and to the research team or data centre who do image analysis of many slides on a computer cluster. The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5955513929846272.
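
    The mosaic idea (dividing a huge image into overlapping tiles that fit in memory) can be sketched in a few lines of NumPy; the tile size, overlap, and in-memory stand-in array below are illustrative, since real virtual slides are read region by region through a format-specific reader rather than loaded whole.

```python
import numpy as np

def tile_image(img, tile=1024, overlap=64):
    """Split a large 2-D array into overlapping tiles so each piece fits in memory;
    yields (row_offset, col_offset, tile_array)."""
    step = tile - overlap
    h, w = img.shape[:2]
    for r in range(0, max(h - overlap, 1), step):
        for c in range(0, max(w - overlap, 1), step):
            yield r, c, img[r:r + tile, c:c + tile]

# Stand-in for one resolution level of a virtual slide.
slide = np.zeros((4096, 4096), dtype=np.uint8)
tiles = list(tile_image(slide))
print(len(tiles), "tiles, first tile shape:", tiles[0][2].shape)
```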

  19. An integral-factorized implementation of the driven similarity renormalization group second-order multireference perturbation theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hannon, Kevin P.; Li, Chenyang; Evangelista, Francesco A., E-mail: francesco.evangelista@emory.edu

    2016-05-28

    We report an efficient implementation of a second-order multireference perturbation theory based on the driven similarity renormalization group (DSRG-MRPT2) [C. Li and F. A. Evangelista, J. Chem. Theory Comput. 11, 2097 (2015)]. Our implementation employs factorized two-electron integrals to avoid storage of large four-index intermediates. It also exploits the block structure of the reference density matrices to reduce the computational cost to that of second-order Møller–Plesset perturbation theory. Our new DSRG-MRPT2 implementation is benchmarked on ten naphthyne isomers using basis sets up to quintuple-ζ quality. We find that the singlet-triplet splittings (Δ_ST) of the naphthyne isomers strongly depend on the equilibrium structures. For a consistent set of geometries, the Δ_ST values predicted by the DSRG-MRPT2 are in good agreement with those computed by the reduced multireference coupled cluster theory with singles, doubles, and perturbative triples.

  20. Cloudweaver: Adaptive and Data-Driven Workload Manager for Generic Clouds

    NASA Astrophysics Data System (ADS)

    Li, Rui; Chen, Lei; Li, Wen-Syan

    Cloud computing denotes the latest trend in application development for parallel computing on massive data volumes. It relies on clouds of servers to handle tasks that used to be managed by an individual server. With cloud computing, software vendors can provide business intelligence and data analytic services for internet-scale data sets. Many open source projects, such as Hadoop, offer various software components that are essential for building a cloud infrastructure. Current Hadoop (and many other frameworks) require users to configure cloud infrastructures via programs and APIs, and such configuration is fixed during runtime. In this chapter, we propose a workload manager (WLM), called CloudWeaver, which provides automated configuration of a cloud infrastructure for runtime execution. The workload management is data-driven and can adapt to the dynamic nature of operator throughput during different execution phases. CloudWeaver works for a single job and for a workload consisting of multiple jobs running concurrently, aiming at maximum throughput using a minimum set of processors.

  1. Hierarchy of Information Processing in the Brain: A Novel 'Intrinsic Ignition' Framework.

    PubMed

    Deco, Gustavo; Kringelbach, Morten L

    2017-06-07

    A general theory of brain function has to be able to explain local and non-local network computations over space and time. We propose a new framework to capture the key principles of how local activity influences global computation, i.e., describing the propagation of information and thus the broadness of communication driven by local activity. More specifically, we consider the diversity in space (nodes or brain regions) over time using the concept of intrinsic ignition, i.e., naturally occurring intrinsic perturbations reflecting the capability of a given brain area to propagate neuronal activity to other regions in a given brain state. Characterizing the profile of intrinsic ignition for a given brain state provides insight into the precise nature of hierarchical information processing. Combining this data-driven method with a causal whole-brain computational model can provide novel insights into the imbalance of brain states found in neuropsychiatric disorders. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. Initial Results from an Energy-Aware Airborne Dynamic, Data-Driven Application System Performing Sampling in Coherent Boundary-Layer Structures

    NASA Astrophysics Data System (ADS)

    Frew, E.; Argrow, B. M.; Houston, A. L.; Weiss, C.

    2014-12-01

    The energy-aware airborne dynamic, data-driven application system (EA-DDDAS) performs persistent sampling in complex atmospheric conditions by exploiting wind energy using the dynamic data-driven application system paradigm. The main challenge for future airborne sampling missions is operation with tight integration of physical and computational resources over wireless communication networks, in complex atmospheric conditions. The physical resources considered here include sensor platforms, particularly mobile Doppler radar and unmanned aircraft, the complex conditions in which they operate, and the region of interest. Autonomous operation requires distributed computational effort connected by layered wireless communication. Onboard decision-making and coordination algorithms can be enhanced by atmospheric models that assimilate input from physics-based models and wind fields derived from multiple sources. These models are generally too complex to be run onboard the aircraft, so they need to be executed in ground vehicles in the field, and connected over broadband or other wireless links back to the field. Finally, the wind field environment drives strong interaction between the computational and physical systems, both as a challenge to autonomous path planning algorithms and as a novel energy source that can be exploited to improve system range and endurance. Implementation details of a complete EA-DDDAS will be provided, along with preliminary flight test results targeting coherent boundary-layer structures.

  3. Does Homework Really Matter for College Students in Quantitatively-Based Courses?

    ERIC Educational Resources Information Center

    Young, Nichole; Dollman, Amanda; Angel, N. Faye

    2016-01-01

    This investigation was initiated by two students in an Advanced Computer Applications course. They sought to examine the influence of graded homework on final grades in quantitatively-based business courses. They were provided with data from three quantitatively-based core business courses over a period of five years for a total of 10 semesters of…

  4. A Text Analysis of the Marine Corps Fitness Report

    DTIC Science & Technology

    2017-06-01

    ...difficulty in quantitatively analyzing textual... The study pulls 835 anonymous and non-attributable surveys between 2005 and 2009 from the Center for... Subject terms: natural language processing, fitness reports, computational linguistics, manpower. Abstract (fragment): ...Corps provide word-picture guidance to distinguish talented Marines and promote conformity in issuing quantitative assessments of performance.

  5. Quantitative ROESY analysis of computational models: structural studies of citalopram and β-cyclodextrin complexes by 1H-NMR and computational methods.

    PubMed

    Ali, Syed Mashhood; Shamim, Shazia

    2015-07-01

    Complexation of racemic citalopram with β-cyclodextrin (β-CD) in aqueous medium was investigated to determine the atom-accurate structure of the inclusion complexes. 1H-NMR chemical shift change data of β-CD cavity protons in the presence of citalopram confirmed the formation of 1 : 1 inclusion complexes. The ROESY spectrum confirmed the presence of an aromatic ring in the β-CD cavity, but it was not clear whether one or both rings were included. Molecular mechanics and molecular dynamics calculations showed the entry of the fluoro-ring from the wider side of the β-CD cavity as the most favored mode of inclusion. Minimum-energy computational models were analyzed for their accuracy in atomic coordinates by comparison of calculated and experimental intermolecular ROESY peak intensities, which were not found in agreement. Several least-energy computational models were refined and analyzed until calculated and experimental intensities were compatible. The results demonstrate that computational models of CD complexes need to be analyzed for atom accuracy, and quantitative ROESY analysis is a promising method. Moreover, the study also validates that the quantitative use of ROESY is feasible even with longer mixing times if peak intensity ratios instead of absolute intensities are used. Copyright © 2015 John Wiley & Sons, Ltd.

  6. Genomic Prediction Accounting for Residual Heteroskedasticity

    PubMed Central

    Ou, Zhining; Tempelman, Robert J.; Steibel, Juan P.; Ernst, Catherine W.; Bates, Ronald O.; Bello, Nora M.

    2015-01-01

    Whole-genome prediction (WGP) models that use single-nucleotide polymorphism marker information to predict genetic merit of animals and plants typically assume homogeneous residual variance. However, variability is often heterogeneous across agricultural production systems and may subsequently bias WGP-based inferences. This study extends classical WGP models based on normality, heavy-tailed specifications and variable selection to explicitly account for environmentally-driven residual heteroskedasticity under a hierarchical Bayesian mixed-models framework. WGP models assuming homogeneous or heterogeneous residual variances were fitted to training data generated under simulation scenarios reflecting a gradient of increasing heteroskedasticity. Model fit was based on pseudo-Bayes factors and also on prediction accuracy of genomic breeding values computed on a validation data subset one generation removed from the simulated training dataset. Homogeneous vs. heterogeneous residual variance WGP models were also fitted to two quantitative traits, namely 45-min postmortem carcass temperature and loin muscle pH, recorded in a swine resource population dataset prescreened for high and mild residual heteroskedasticity, respectively. Fit of competing WGP models was compared using pseudo-Bayes factors. Predictive ability, defined as the correlation between predicted and observed phenotypes in validation sets of a five-fold cross-validation was also computed. Heteroskedastic error WGP models showed improved model fit and enhanced prediction accuracy compared to homoskedastic error WGP models although the magnitude of the improvement was small (less than two percentage points net gain in prediction accuracy). Nevertheless, accounting for residual heteroskedasticity did improve accuracy of selection, especially on individuals of extreme genetic merit. PMID:26564950
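
    The effect of modelling residual heteroskedasticity can be illustrated with a weighted ridge (SNP-BLUP-like) fit in which records from noisier environments are down-weighted by their residual variance. The simulated marker data, variances, and shrinkage parameter below are arbitrary; the hierarchical Bayesian machinery of the study is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n animals, p SNP markers coded 0/1/2, two environments with
# different residual variances (the source of heteroskedasticity).
n, p = 200, 500
Z = rng.integers(0, 3, size=(n, p)).astype(float)
beta = rng.normal(0, 0.05, size=p)
env = rng.integers(0, 2, size=n)                   # environment label per animal
sigma_e = np.where(env == 0, 1.0, 3.0)             # heterogeneous residual SDs
y = Z @ beta + rng.normal(0, sigma_e)

def snp_blup(Z, y, weights, lam=50.0):
    """Weighted ridge solution: records from noisier environments are down-weighted
    instead of assuming a single residual variance."""
    W = np.diag(weights)
    lhs = Z.T @ W @ Z + lam * np.eye(Z.shape[1])
    rhs = Z.T @ W @ y
    return np.linalg.solve(lhs, rhs)

b_homo = snp_blup(Z, y, np.ones(n))                     # homoskedastic assumption
b_hetero = snp_blup(Z, y, 1.0 / sigma_e**2)             # heteroskedastic weights
for name, b in [("homoskedastic", b_homo), ("heteroskedastic", b_hetero)]:
    print(name, "corr with true marker effects:", round(np.corrcoef(b, beta)[0, 1], 3))
```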

  7. Toward a systematic exploration of nano-bio interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bai, Xue; Liu, Fang; Liu, Yin

    Many studies of nanomaterials make non-systematic alterations of nanoparticle physicochemical properties. Given the immense size of the property space for nanomaterials, such approaches are not very useful in elucidating fundamental relationships between inherent physicochemical properties of these materials and their interactions with, and effects on, biological systems. Data-driven artificial intelligence methods such as machine learning algorithms have proven highly effective in generating models with good predictivity and some degree of interpretability. They can provide a viable method of reducing or eliminating animal testing. However, careful experimental design with the modelling of the results in mind is a proven and efficient way of exploring large materials spaces. This approach, coupled with high-speed automated experimental synthesis and characterization technologies now appearing, is the fastest route to developing models that regulatory bodies may find useful. We advocate greatly increased focus on systematic modification of physicochemical properties of nanoparticles combined with comprehensive biological evaluation and computational analysis. This is essential to obtain better mechanistic understanding of nano-bio interactions, and to derive quantitatively predictive and robust models for the properties of nanomaterials that have useful domains of applicability. Highlights: • Nanomaterials studies make non-systematic alterations to nanoparticle properties. • Vast nanomaterials property spaces require systematic studies of nano-bio interactions. • Experimental design and modelling are efficient ways of exploring materials spaces. • We advocate systematic modification and computational analysis to probe nano-bio interactions.

  8. An open-source framework for analyzing N-electron dynamics. II. Hybrid density functional theory/configuration interaction methodology.

    PubMed

    Hermann, Gunter; Pohl, Vincent; Tremblay, Jean Christophe

    2017-10-30

    In this contribution, we extend our framework for analyzing and visualizing correlated many-electron dynamics to a non-variational, highly scalable electronic structure method. Specifically, an explicitly time-dependent electronic wave packet is written as a linear combination of N-electron wave functions at the configuration interaction singles (CIS) level, which are obtained from a reference time-dependent density functional theory (TDDFT) calculation. The procedure is implemented in the open-source Python program detCI@ORBKIT, which extends the capabilities of our recently published post-processing toolbox (Hermann et al., J. Comput. Chem. 2016, 37, 1511). From the output of standard quantum chemistry packages using atom-centered Gaussian-type basis functions, the framework exploits the multideterminantal structure of the hybrid TDDFT/CIS wave packet to compute fundamental one-electron quantities such as difference electronic densities, transient electronic flux densities, and transition dipole moments. The hybrid scheme is benchmarked against wave function data for the laser-driven state-selective excitation in LiH. It is shown that all features of the electron dynamics are in good quantitative agreement with the higher-level method, provided a judicious choice of functional is made. Broadband excitation of a medium-sized organic chromophore further demonstrates the scalability of the method. In addition, the time-dependent flux densities unravel the mechanistic details of the simulated charge migration process at a glance. © 2017 Wiley Periodicals, Inc.

  9. JobCenter: an open source, cross-platform, and distributed job queue management system optimized for scalability and versatility

    PubMed Central

    2012-01-01

    Background Laboratories engaged in computational biology or bioinformatics frequently need to run lengthy, multistep, and user-driven computational jobs. Each job can tie up a computer for a few minutes to several days, and many laboratories lack the expertise or resources to build and maintain a dedicated computer cluster. Results JobCenter is a client–server application and framework for job management and distributed job execution. The client and server components are both written in Java and are cross-platform and relatively easy to install. All communication with the server is client-driven, which allows worker nodes to run anywhere (even behind external firewalls or “in the cloud”) and provides inherent load balancing. Adding a worker node to the worker pool is as simple as dropping the JobCenter client files onto any computer and performing basic configuration, which provides tremendous ease-of-use, flexibility, and limitless horizontal scalability. Each worker installation may be independently configured, including the types of jobs it is able to run. Executed jobs may be written in any language and may include multistep workflows. Conclusions JobCenter is a versatile and scalable distributed job management system that allows laboratories to very efficiently distribute all computational work among available resources. JobCenter is freely available at http://code.google.com/p/jobcenter/. PMID:22846423

  10. Subject-specific computer simulation model for determining elbow loading in one-handed tennis backhand groundstrokes.

    PubMed

    King, Mark A; Glynn, Jonathan A; Mitchell, Sean R

    2011-11-01

    A subject-specific angle-driven computer model of a tennis player, combined with a forward dynamics, equipment-specific computer model of tennis ball-racket impacts, was developed to determine the effect of ball-racket impacts on loading at the elbow for one-handed backhand groundstrokes. Matching subject-specific computer simulations of a typical topspin/slice one-handed backhand groundstroke performed by an elite tennis player were done with root mean square differences between performance and matching simulations of < 0.5 degrees over a 50 ms period starting from ball impact. Simulation results suggest that for similar ball-racket impact conditions, the difference in elbow loading for a topspin and slice one-handed backhand groundstroke is relatively small. In this study, the relatively small differences in elbow loading may be due to comparable angle-time histories at the wrist and elbow joints with the major kinematic differences occurring at the shoulder. Using a subject-specific angle-driven computer model combined with a forward dynamics, equipment-specific computer model of tennis ball-racket impacts allows peak internal loading, net impulse, and shock due to ball-racket impact to be calculated which would not otherwise be possible without impractical invasive techniques. This study provides a basis for further investigation of the factors that may increase elbow loading during tennis strokes.

  11. Efficacy of brain-computer interface-driven neuromuscular electrical stimulation for chronic paresis after stroke.

    PubMed

    Mukaino, Masahiko; Ono, Takashi; Shindo, Keiichiro; Fujiwara, Toshiyuki; Ota, Tetsuo; Kimura, Akio; Liu, Meigen; Ushiba, Junichi

    2014-04-01

    Brain computer interface technology is of great interest to researchers as a potential therapeutic measure for people with severe neurological disorders. The aim of this study was to examine the efficacy of brain computer interface, by comparing conventional neuromuscular electrical stimulation and brain computer interface-driven neuromuscular electrical stimulation, using an A-B-A-B withdrawal single-subject design. A 38-year-old male with severe hemiplegia due to a putaminal haemorrhage participated in this study. The design involved 2 epochs. In epoch A, the patient attempted to open his fingers during the application of neuromuscular electrical stimulation, irrespective of his actual brain activity. In epoch B, neuromuscular electrical stimulation was applied only when a significant motor-related cortical potential was observed in the electroencephalogram. The subject initially showed diffuse functional magnetic resonance imaging activation and small electro-encephalogram responses while attempting finger movement. Epoch A was associated with few neurological or clinical signs of improvement. Epoch B, with a brain computer interface, was associated with marked lateralization of electroencephalogram (EEG) and blood oxygenation level dependent responses. Voluntary electromyogram (EMG) activity, with significant EEG-EMG coherence, was also prompted. Clinical improvement in upper-extremity function and muscle tone was observed. These results indicate that self-directed training with a brain computer interface may induce activity-dependent cortical plasticity and promote functional recovery. This preliminary clinical investigation encourages further research using a controlled design.

  12. Strategic directions of computing at Fermilab

    NASA Astrophysics Data System (ADS)

    Wolbers, Stephen

    1998-05-01

    Fermilab computing has changed a great deal over the years, driven by the demands of the Fermilab experimental community to record and analyze larger and larger datasets, by the desire to take advantage of advances in computing hardware and software, and by the advances coming from the R&D efforts of the Fermilab Computing Division. The strategic directions of Fermilab Computing continue to be driven by the needs of the experimental program. The current fixed-target run will produce over 100 TBytes of raw data and systems must be in place to allow the timely analysis of the data. The collider run II, beginning in 1999, is projected to produce of order 1 PByte of data per year. There will be a major change in methodology and software language as the experiments move away from FORTRAN and into object-oriented languages. Increased use of automation and the reduction of operator-assisted tape mounts will be required to meet the needs of the large experiments and large data sets. Work will continue on higher-rate data acquisition systems for future experiments and projects. R&D projects will be pursued as necessary to provide software, tools, or systems which cannot be purchased or acquired elsewhere. A closer working relation with other high energy laboratories will be pursued to reduce duplication of effort and to allow effective collaboration on many aspects of HEP computing.

  13. A method for evaluating the murine pulmonary vasculature using micro-computed tomography.

    PubMed

    Phillips, Michael R; Moore, Scott M; Shah, Mansi; Lee, Clara; Lee, Yueh Z; Faber, James E; McLean, Sean E

    2017-01-01

    Significant mortality and morbidity are associated with alterations in the pulmonary vasculature. While techniques have been described for quantitative morphometry of whole-lung arterial trees in larger animals, no methods have been described in mice. We report a method for the quantitative assessment of murine pulmonary arterial vasculature using high-resolution computed tomography scanning. Mice were harvested at 2 weeks, 4 weeks, and 3 months of age. The pulmonary artery vascular tree was pressure perfused to maximal dilation with a radio-opaque casting material with viscosity and pressure set to prevent capillary transit and venous filling. The lungs were fixed and scanned on a specimen computed tomography scanner at 8-μm resolution, and the vessels were segmented. Vessels were grouped into categories based on lumen diameter and branch generation. Robust high-resolution segmentation was achieved, permitting detailed quantitation of pulmonary vascular morphometrics. As expected, postnatal lung development was associated with progressive increase in small-vessel number and arterial branching complexity. These methods for quantitative analysis of the pulmonary vasculature in postnatal and adult mice provide a useful tool for the evaluation of mouse models of disease that affect the pulmonary vasculature. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Leaf epidermis images for robust identification of plants

    PubMed Central

    da Silva, Núbia Rosa; Oliveira, Marcos William da Silva; Filho, Humberto Antunes de Almeida; Pinheiro, Luiz Felipe Souza; Rossatto, Davi Rodrigo; Kolb, Rosana Marta; Bruno, Odemir Martinez

    2016-01-01

    This paper proposes a methodology for plant analysis and identification based on extracting texture features from microscopic images of leaf epidermis. All the experiments were carried out using 32 plant species with 309 epidermal samples captured by an optical microscope coupled to a digital camera. The results of the computational methods using texture features were compared to the conventional approach, where quantitative measurements of stomatal traits (density, length and width) were manually obtained. Epidermis image classification using texture has achieved a success rate of over 96%, while success rate was around 60% for quantitative measurements taken manually. Furthermore, we verified the robustness of our method accounting for natural phenotypic plasticity of stomata, analysing samples from the same species grown in different environments. Texture methods were robust even when considering phenotypic plasticity of stomatal traits with a decrease of 20% in the success rate, as quantitative measurements proved to be fully sensitive with a decrease of 77%. Results from the comparison between the computational approach and the conventional quantitative measurements lead us to discover how computational systems are advantageous and promising in terms of solving problems related to Botany, such as species identification. PMID:27217018
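
    A bare-bones version of texture-based identification is sketched below: each micrograph is summarised by histograms of local grey-level differences, and a query image is assigned to the nearest training sample. The synthetic images, feature choice, and 1-nearest-neighbour rule are stand-ins for the texture methods actually evaluated in the paper.

```python
import numpy as np

def texture_features(img, bins=16):
    """Simple texture descriptor: histograms of horizontal and vertical
    grey-level differences (a crude stand-in for richer texture methods)."""
    dx = np.abs(np.diff(img.astype(float), axis=1)).ravel()
    dy = np.abs(np.diff(img.astype(float), axis=0)).ravel()
    hx, _ = np.histogram(dx, bins=bins, range=(0, 255), density=True)
    hy, _ = np.histogram(dy, bins=bins, range=(0, 255), density=True)
    return np.concatenate([hx, hy])

def nearest_neighbour(train_feats, train_labels, query):
    d = np.linalg.norm(train_feats - query, axis=1)
    return train_labels[int(np.argmin(d))]

rng = np.random.default_rng(3)
# Stand-ins for epidermis micrographs of two species with different texture scales.
smooth = [rng.normal(128, 5, (64, 64)).clip(0, 255) for _ in range(10)]
rough = [rng.normal(128, 40, (64, 64)).clip(0, 255) for _ in range(10)]
feats = np.array([texture_features(im) for im in smooth + rough])
labels = np.array(["species A"] * 10 + ["species B"] * 10)

query = texture_features(rng.normal(128, 38, (64, 64)).clip(0, 255))
print("predicted:", nearest_neighbour(feats, labels, query))
```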

  15. Mapping Quantitative Traits in Unselected Families: Algorithms and Examples

    PubMed Central

    Dupuis, Josée; Shi, Jianxin; Manning, Alisa K.; Benjamin, Emelia J.; Meigs, James B.; Cupples, L. Adrienne; Siegmund, David

    2009-01-01

    Linkage analysis has been widely used to identify from family data genetic variants influencing quantitative traits. Common approaches have both strengths and limitations. Likelihood ratio tests typically computed in variance component analysis can accommodate large families but are highly sensitive to departure from normality assumptions. Regression-based approaches are more robust but their use has primarily been restricted to nuclear families. In this paper, we develop methods for mapping quantitative traits in moderately large pedigrees. Our methods are based on the score statistic which in contrast to the likelihood ratio statistic, can use nonparametric estimators of variability to achieve robustness of the false positive rate against departures from the hypothesized phenotypic model. Because the score statistic is easier to calculate than the likelihood ratio statistic, our basic mapping methods utilize relatively simple computer code that performs statistical analysis on output from any program that computes estimates of identity-by-descent. This simplicity also permits development and evaluation of methods to deal with multivariate and ordinal phenotypes, and with gene-gene and gene-environment interaction. We demonstrate our methods on simulated data and on fasting insulin, a quantitative trait measured in the Framingham Heart Study. PMID:19278016
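
    The flavour of such robust, regression-style linkage statistics can be illustrated with the closely related Haseman-Elston approach for sib pairs, where squared trait differences are regressed on estimated IBD sharing and a significantly negative slope suggests linkage. The simulated data and the simple t-statistic below are illustrative, not the score statistic developed in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy sib-pair data: trait differences shrink when pairs share more alleles
# identical-by-descent (IBD) at a locus linked to the trait.
n_pairs = 400
ibd = rng.choice([0.0, 0.5, 1.0], size=n_pairs, p=[0.25, 0.5, 0.25])
genetic_sd = np.sqrt(1.0 - 0.6 * ibd)                  # more sharing -> more similar
trait_diff = rng.normal(0, 1, n_pairs) * genetic_sd
y = trait_diff ** 2                                    # squared pair difference

# Haseman-Elston-style regression of y on IBD sharing.
X = np.column_stack([np.ones(n_pairs), ibd])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ coef
se_slope = np.sqrt(np.sum(resid**2) / (n_pairs - 2) / np.sum((ibd - ibd.mean())**2))
print("slope:", round(coef[1], 3), "| t-statistic:", round(coef[1] / se_slope, 2))
```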

  16. Computational Properties of the Hippocampus Increase the Efficiency of Goal-Directed Foraging through Hierarchical Reinforcement Learning

    PubMed Central

    Chalmers, Eric; Luczak, Artur; Gruber, Aaron J.

    2016-01-01

    The mammalian brain is thought to use a version of Model-based Reinforcement Learning (MBRL) to guide “goal-directed” behavior, wherein animals consider goals and make plans to acquire desired outcomes. However, conventional MBRL algorithms do not fully explain animals' ability to rapidly adapt to environmental changes, or learn multiple complex tasks. They also require extensive computation, suggesting that goal-directed behavior is cognitively expensive. We propose here that key features of processing in the hippocampus support a flexible MBRL mechanism for spatial navigation that is computationally efficient and can adapt quickly to change. We investigate this idea by implementing a computational MBRL framework that incorporates features inspired by computational properties of the hippocampus: a hierarchical representation of space, “forward sweeps” through future spatial trajectories, and context-driven remapping of place cells. We find that a hierarchical abstraction of space greatly reduces the computational load (mental effort) required for adaptation to changing environmental conditions, and allows efficient scaling to large problems. It also allows abstract knowledge gained at high levels to guide adaptation to new obstacles. Moreover, a context-driven remapping mechanism allows learning and memory of multiple tasks. Simulating dorsal or ventral hippocampal lesions in our computational framework qualitatively reproduces behavioral deficits observed in rodents with analogous lesions. The framework may thus embody key features of how the brain organizes model-based RL to efficiently solve navigation and other difficult tasks. PMID:28018203
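    A toy sketch of two of the ingredients described above: a hierarchical abstraction of space (plan over coarse regions first, then refine only within the regions on the coarse route) and context-driven remapping (separate plan/value stores per context). The grid, regions, and goals are assumptions for illustration, and the paper's forward-sweep mechanism is not reproduced.

```python
from collections import defaultdict, deque

# Toy 4x4 grid split into 2x2 "regions" (hierarchical abstraction of space).
def region(cell):
    r, c = cell
    return (r // 2, c // 2)

def neighbors(cell, size=4):
    r, c = cell
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < size and 0 <= nc < size:
            yield (nr, nc)

def bfs_path(start, goal, allowed):
    """Shortest path restricted to 'allowed' cells (breadth-first search)."""
    queue, came = deque([start]), {start: None}
    while queue:
        cur = queue.popleft()
        if cur == goal:
            path = [cur]
            while came[cur] is not None:
                cur = came[cur]
                path.append(cur)
            return path[::-1]
        for nxt in neighbors(cur):
            if nxt in allowed and nxt not in came:
                came[nxt] = cur
                queue.append(nxt)
    return None

def hierarchical_plan(start, goal):
    # 1) Plan cheaply over regions, 2) refine only inside the regions on the
    # coarse route -- the claimed saving in computational load ("mental effort").
    coarse = bfs_path(region(start), region(goal),
                      allowed={(r, c) for r in range(2) for c in range(2)})
    allowed_cells = {(r, c) for r in range(4) for c in range(4)
                     if region((r, c)) in set(coarse)}
    return bfs_path(start, goal, allowed_cells)

# Context-driven "remapping": a separate store per context, so learning a
# new task does not overwrite memory of the old one.
plans_by_context = defaultdict(dict)
plans_by_context["context_A"][("start", "food")] = hierarchical_plan((0, 0), (3, 3))
plans_by_context["context_B"][("start", "water")] = hierarchical_plan((0, 3), (3, 0))
print(plans_by_context["context_A"][("start", "food")])
```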

  17. Computational Skills for Biology Students

    ERIC Educational Resources Information Center

    Gross, Louis J.

    2008-01-01

    This interview with Distinguished Science Award recipient Louis J. Gross highlights essential computational skills for modern biology, including: (1) teaching concepts listed in the Math & Bio 2010 report; (2) illustrating to students that jobs today require quantitative skills; and (3) resources and materials that focus on computational skills.

  18. Quantitative Modeling of Earth Surface Processes

    NASA Astrophysics Data System (ADS)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.

  19. More details...
  20. A Quantitative Geochemical Target for Modeling the Formation of the Earth and Moon

    NASA Technical Reports Server (NTRS)

    Boyce, Jeremy W.; Barnes, Jessica J.; McCubbin, Francis M.

    2017-01-01

    The past decade has been one of geochemical, isotopic, and computational advances that are bringing the laboratory measurement and computational modeling neighborhoods of the Earth-Moon community into ever closer proximity. We are now in a position to become even better neighbors: modelers can generate testable hypotheses for geochemists, and geochemists can provide quantitative targets for modelers. Here we present a robust example of the latter based on Cl isotope measurements of mare basalts.

  21. Novel Application of Quantitative Single-Photon Emission Computed Tomography/Computed Tomography to Predict Early Response to Methimazole in Graves' Disease

    PubMed Central

    Kim, Hyun Joo; Bang, Ji-In; Kim, Ji-Young; Moon, Jae Hoon; So, Young

    2017-01-01

    Objective Since Graves' disease (GD) is resistant to antithyroid drugs (ATDs), an accurate quantitative thyroid function measurement is required for the prediction of early responses to ATD. Quantitative parameters derived from the novel technology, single-photon emission computed tomography/computed tomography (SPECT/CT), were investigated for the prediction of achievement of euthyroidism after methimazole (MMI) treatment in GD. Materials and Methods A total of 36 GD patients (10 males, 26 females; mean age, 45.3 ± 13.8 years) were enrolled for this study, from April 2015 to January 2016. They underwent quantitative thyroid SPECT/CT 20 minutes post-injection of 99mTc-pertechnetate (5 mCi). Association between the time to biochemical euthyroidism after MMI treatment and %uptake, standardized uptake value (SUV), functional thyroid mass (SUVmean × thyroid volume) from the SPECT/CT, and clinical/biochemical variables, were investigated. Results GD patients had a significantly greater %uptake (6.9 ± 6.4%) than historical control euthyroid patients (n = 20, 0.8 ± 0.5%, p < 0.001) from the same quantitative SPECT/CT protocol. Euthyroidism was achieved in 14 patients at 156 ± 62 days post-MMI treatment, but 22 patients had still not achieved euthyroidism by the last follow-up time-point (208 ± 80 days). In the univariate Cox regression analysis, the initial MMI dose (p = 0.014), %uptake (p = 0.015), and functional thyroid mass (p = 0.016) were significant predictors of euthyroidism in response to MMI treatment. However, only %uptake remained significant in a multivariate Cox regression analysis (p = 0.034). A %uptake cutoff of 5.0% dichotomized the faster responding versus the slower responding GD patients (p = 0.006). Conclusion A novel parameter of thyroid %uptake from quantitative SPECT/CT is a predictive indicator of an early response to MMI in GD patients. PMID:28458607
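    The parameters named above reduce to simple arithmetic once the imaging quantities are in hand. The sketch below uses hedged, assumed formulas (%uptake as corrected thyroid counts over injected counts, functional thyroid mass as SUVmean times thyroid volume, and the reported 5.0% cutoff for dichotomization); the exact count-correction chain used in the study is not reproduced, and the abstract does not state which side of the cutoff responds faster.

```python
def percent_uptake(thyroid_counts, injected_counts):
    """Assumed definition: corrected thyroid counts divided by injected
    counts, expressed in percent."""
    return 100.0 * thyroid_counts / injected_counts

def functional_thyroid_mass(suv_mean, thyroid_volume_ml):
    """Functional thyroid mass as defined in the abstract: SUVmean x volume."""
    return suv_mean * thyroid_volume_ml

def below_uptake_cutoff(pct_uptake, cutoff=5.0):
    """Dichotomize patients at the reported 5.0% %uptake cutoff."""
    return pct_uptake < cutoff

# Hypothetical example values, for illustration only.
print(percent_uptake(thyroid_counts=1.2e6, injected_counts=2.0e7))    # 6.0 (%)
print(functional_thyroid_mass(suv_mean=18.0, thyroid_volume_ml=30.0))
print(below_uptake_cutoff(6.0))
```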

  1. UC Merced Center for Computational Biology Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colvin, Michael; Watanabe, Masakatsu

    Final report for the UC Merced Center for Computational Biology. The Center for Computational Biology (CCB) was established to support multidisciplinary scientific research and academic programs in computational biology at the new University of California campus in Merced. In 2003, the growing gap between biology research and education was documented in a report from the National Academy of Sciences, Bio2010: Transforming Undergraduate Education for Future Research Biologists. We believed that a new type of biological sciences undergraduate and graduate program, one that emphasized biological concepts and treated biology as an information science, would have a dramatic impact in enabling the transformation of biology. UC Merced, as the newest UC campus and the first new U.S. research university of the 21st century, was ideally suited to adopt an alternate strategy: to create new Biological Sciences majors and a graduate group that incorporated the strong computational and mathematical vision articulated in the Bio2010 report. CCB aimed to leverage this strong commitment at UC Merced to develop a new educational program based on the principle of biology as a quantitative, model-driven science. We also expected that the center would enable the dissemination of computational biology course materials to other universities and feeder institutions, and foster research projects that exemplify a mathematical and computation-based approach to the life sciences. As this report describes, the CCB has been successful in achieving these goals, and multidisciplinary computational biology is now an integral part of UC Merced undergraduate, graduate, and research programs in the life sciences. The CCB began in fall 2004 with the aid of an award from the U.S. Department of Energy (DOE), under its Genomes to Life program of support for the development of research and educational infrastructure in the modern biological sciences. This report to DOE describes the research and academic programs made possible by the CCB from its inception until August 2010, the end of the final extension. Although DOE support for the center ended in August 2010, the CCB will continue to exist and support its original objectives. The research and academic programs fostered by the CCB have led to additional extramural funding from other agencies, and we anticipate that CCB will continue to support quantitative and computational biology programs at UC Merced for many years to come. Since its inception in fall 2004, CCB research projects have continuously involved multi-institutional collaborations with Lawrence Livermore National Laboratory (LLNL) and the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, as well as individual collaborators at other sites. CCB-affiliated faculty cover a broad range of computational and mathematical research, including molecular modeling, cell biology, applied mathematics, evolutionary biology, and bioinformatics. The CCB sponsored the first distinguished speaker series at UC Merced, which had an important role in spreading the word about the computational biology emphasis at this new campus. One of CCB's original goals was to help train a new generation of biologists who bridge the gap between the computational and life sciences. To achieve this goal, by summer 2006 a summer undergraduate internship program had been established under CCB to train biological sciences researchers in mathematically and computationally intensive approaches. By the end of summer 2010, 44 undergraduate students had gone through this program. Of those participants, 11 students have been admitted to graduate schools and 10 more are interested in pursuing graduate studies in the sciences. The center is also continuing to facilitate the development and dissemination of undergraduate and graduate course materials based on the latest research in computational biology.

  2. Extreme learning machine for reduced order modeling of turbulent geophysical flows.

    PubMed

    San, Omer; Maulik, Romit

    2018-04-01

    We investigate the application of artificial neural networks to stabilize proper orthogonal decomposition-based reduced order models for quasistationary geophysical turbulent flows. An extreme learning machine concept is introduced for computing an eddy-viscosity closure dynamically to incorporate the effects of the truncated modes. We consider a four-gyre wind-driven ocean circulation problem as our prototype setting to assess the performance of the proposed data-driven approach. Our framework provides a significant reduction in computational time and effectively retains the dynamics of the full-order model during the forward simulation period beyond the training data set. Furthermore, we show that the method is robust for larger choices of time steps and can be used as an efficient and reliable tool for long time integration of general circulation models.
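    The extreme learning machine at the core of this closure can be summarized in a few lines: a single hidden layer with random, fixed weights and an output layer fit by linear least squares. The sketch below is a generic ELM regressor in NumPy, not the authors' specific eddy-viscosity model; the inputs (e.g. resolved POD modal coefficients) and targets (closure values) are assumptions.

```python
import numpy as np

class ExtremeLearningMachine:
    """Single-hidden-layer network: random fixed hidden weights,
    output weights obtained by a least-squares solve (no backprop)."""

    def __init__(self, n_hidden=64, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        # Output weights from linear least squares -- the only "training" step.
        self.beta, *_ = np.linalg.lstsq(H, y, rcond=None)
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta

# Assumed usage: X holds resolved modal coefficients, y the closure values.
X = np.random.rand(200, 8)
y = np.sin(X).sum(axis=1)
model = ExtremeLearningMachine().fit(X[:150], y[:150])
print(np.abs(model.predict(X[150:]) - y[150:]).mean())
```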

  3. Extreme learning machine for reduced order modeling of turbulent geophysical flows

    NASA Astrophysics Data System (ADS)

    San, Omer; Maulik, Romit

    2018-04-01

    We investigate the application of artificial neural networks to stabilize proper orthogonal decomposition-based reduced order models for quasistationary geophysical turbulent flows. An extreme learning machine concept is introduced for computing an eddy-viscosity closure dynamically to incorporate the effects of the truncated modes. We consider a four-gyre wind-driven ocean circulation problem as our prototype setting to assess the performance of the proposed data-driven approach. Our framework provides a significant reduction in computational time and effectively retains the dynamics of the full-order model during the forward simulation period beyond the training data set. Furthermore, we show that the method is robust for larger choices of time steps and can be used as an efficient and reliable tool for long time integration of general circulation models.

  4. Asynchronous Data Retrieval from an Object-Oriented Database

    NASA Astrophysics Data System (ADS)

    Gilbert, Jonathan P.; Bic, Lubomir

    We present an object-oriented semantic database model which, similar to other object-oriented systems, combines the virtues of four concepts: the functional data model, a property inheritance hierarchy, abstract data types and message-driven computation. The main emphasis is on the last of these four concepts. We describe generic procedures that permit queries to be processed in a purely message-driven manner. A database is represented as a network of nodes and directed arcs, in which each node is a logical processing element, capable of communicating with other nodes by exchanging messages. This eliminates the need for shared memory and for centralized control during query processing. Hence, the model is suitable for implementation on a multiprocessor computer architecture, consisting of large numbers of loosely coupled processing elements.
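    A minimal sketch of the message-driven idea: the database is a network of nodes, each reacting only to messages delivered along directed arcs, with no shared memory and no central query executor. The node classes and the toy schema below are illustrative assumptions, not the paper's model.

```python
from collections import deque

class Node:
    """A logical processing element: it only reacts to messages."""
    def __init__(self, name):
        self.name = name
        self.arcs = {}          # arc label -> target Node

    def connect(self, label, target):
        self.arcs[label] = target

    def handle(self, message, network):
        # Follow the next requested arc, or report the answer to the query.
        path, reply_to = message["path"], message["reply_to"]
        if not path:
            reply_to.append(self.name)           # query answered at this node
            return
        nxt = self.arcs.get(path[0])
        if nxt is not None:
            network.send(nxt, {"path": path[1:], "reply_to": reply_to})

class Network:
    """Delivers messages between nodes; no centralized query control."""
    def __init__(self):
        self.queue = deque()

    def send(self, node, message):
        self.queue.append((node, message))

    def run(self):
        while self.queue:
            node, message = self.queue.popleft()
            node.handle(message, self)

# Toy schema: employee --worksIn--> department --locatedIn--> city
net = Network()
alice, sales, berlin = Node("alice"), Node("sales"), Node("berlin")
alice.connect("worksIn", sales)
sales.connect("locatedIn", berlin)

answers = []
net.send(alice, {"path": ["worksIn", "locatedIn"], "reply_to": answers})
net.run()
print(answers)   # ['berlin']
```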

  5. Quantitative collision induced mass spectrometry of substituted piperazines - A correlative analysis between theory and experiment

    NASA Astrophysics Data System (ADS)

    Ivanova, Bojidarka; Spiteller, Michael

    2017-12-01

    The present paper deals with the quantitative kinetics and thermodynamics of collision-induced dissociation (CID) reactions of piperazines under different experimental conditions, together with a systematic description of the effect of counter-ions on common MS fragmentation reactions of piperazines and of the intra-molecular effect of quaternary cyclization of substituted piperazines yielding quaternary salts. Quantitative model equations are discussed for the rate constants and Gibbs free energies of a series of m-independent CID fragmentation processes in the gas phase (GP), which have been evidenced experimentally. Both kinetic and thermodynamic parameters are also predicted by computational density functional theory (DFT) and ab initio methods, both static and dynamic. The paper also quantitatively examines the validity of the Maxwell-Boltzmann distribution for non-Boltzmann CID processes. The experiments conducted within the latter framework show excellent correspondence with the theoretical quantum chemical modeling. An important property of the presented model equations of reaction kinetics is their applicability in predicting unknown, and assigning known, mass spectrometric (MS) patterns. The nature of the "GP" continuum in the CID-MS measurement scheme with an electrospray ionization (ESI) source is discussed by performing parallel computations in the gas phase and in a polar continuum at different temperatures and ionic strengths. The effect of pressure is also presented. The study contributes significantly to the methodological and phenomenological development of CID-MS and its analytical implementation for quantitative and structural analyses. It also demonstrates the great promise of the complementary application of experimental CID-MS and computational quantum chemistry for studying chemical reactivity, among other uses. To a considerable extent, this work underlines the place of computational quantum chemistry in the field of experimental analytical chemistry, in particular for structural analysis.
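    The link between activation free energies and rate constants referred to above is commonly expressed through the Eyring equation; the short sketch below evaluates it for an assumed activation free energy, purely as an illustration of how such gas-phase kinetic quantities are computed. It is not the paper's fitted model.

```python
import math

K_B = 1.380649e-23    # Boltzmann constant, J/K
H   = 6.62607015e-34  # Planck constant, J*s
R   = 8.314462618     # gas constant, J/(mol*K)

def eyring_rate_constant(delta_g_activation_kj_mol, temperature_k):
    """First-order rate constant k = (kB*T/h) * exp(-dG_act / (R*T))."""
    dg = delta_g_activation_kj_mol * 1e3   # kJ/mol -> J/mol
    return (K_B * temperature_k / H) * math.exp(-dg / (R * temperature_k))

# Assumed activation free energy of 80 kJ/mol at 298 K (illustrative only).
print(f"k = {eyring_rate_constant(80.0, 298.0):.3e} s^-1")
```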

  6. Learning Systems Biology: Conceptual Considerations toward a Web-Based Learning Platform

    ERIC Educational Resources Information Center

    Emmert-Streib, Frank; Dehmer, Matthias; Lyardet, Fernando

    2013-01-01

    In recent years, there has been an increasing need to train students, from biology and beyond, in quantitative methods relevant to coping with data-driven biology. Systems Biology is such a field, placing a particular focus on the functional aspects of biology and molecular interacting processes. This paper deals with the conceptual design…

  7. What's in a Prerequisite? A Mixed-Methods Approach to Identifying the Impact of a Prerequisite Course

    ERIC Educational Resources Information Center

    Sato, Brian K.; Lee, Amanda K.; Alam, Usman; Dang, Jennifer V.; Dacanay, Samantha J.; Morgado, Pedro; Pirino, Giorgia; Brunner, Jo Ellen; Castillo, Leanne A.; Chan, Valerie W.; Sandholtz, Judith H.

    2017-01-01

    Despite the ubiquity of prerequisites in undergraduate science, technology, engineering, and mathematics curricula, there has been minimal effort to assess their value in a data-driven manner. Using both quantitative and qualitative data, we examined the impact of prerequisites in the context of a microbiology lecture and lab course pairing.…

  8. Perception of Secondary School Teachers on Teaching Reading Skills in Content Areas

    ERIC Educational Resources Information Center

    Faulk, Stephen L.

    2013-01-01

    Reading is an essential skill in education and the current technologically-driven workforce, yet it is a skill that is not mastered by all. Secondary students in a school district in central Alabama have demonstrated a low mastery rate on the reading portion of a standardized test. This quantitative research study used a non-experimental…

  9. A Quantitative and Model-Driven Approach to Assessing Higher Education in the United States of America

    ERIC Educational Resources Information Center

    Huang, Zuqing; Qiu, Robin G.

    2016-01-01

    University ranking or higher education assessment in general has been attracting more and more public attention over the years. However, the subjectivity-based evaluation index and indicator selections and weights that are widely adopted in most existing ranking systems have been called into question. In other words, the objectivity and…

  10. Data-Driven Decisions: Using Equity Theory to Highlight Implications for Underserved Students

    ERIC Educational Resources Information Center

    Fowler, Denver J.; Brown, Kelly

    2018-01-01

    By using equity theory through a social justice lens, the authors intend to highlight how data are currently being used to solve the "what" and not the "why" as it relates to achievement gaps for marginalized students in urban settings. School practitioners have been utilizing quantitative data, such as district and state…

  11. Dynamic Reaction Figures: An Integrative Vehicle for Understanding Chemical Reactions

    ERIC Educational Resources Information Center

    Schultz, Emeric

    2008-01-01

    A highly flexible learning tool, referred to as a dynamic reaction figure, is described. Application of these figures can (i) yield the correct chemical equation by simply following a set of menu driven directions; (ii) present the underlying "mechanism" in chemical reactions; and (iii) help to solve quantitative problems in a number of different…

  12. A Novel Method for Measurement and Characterization of Soil Macroporosity

    Treesearch

    Christopher Barton; Tasos Karathanasis

    2002-01-01

    Quantitative macropore characterizations were performed in large zero-tension soil lysimeters of a Maury silt loam (fine, mixed, mesic Typic Paleudalf) and a Loradale silt loam (fine, silty, mixed, mesic Typic Argiudoll) soil in an effort to assess potential colloid transport. Steel pipe sections (50 cm diameter X 100 cm length) were hydraulically driven into the soil...

  13. Application of Universal Design for Learning in Corporate Technical Training Design: A Quantitative Study

    ERIC Educational Resources Information Center

    Irbe, Aina G.

    2016-01-01

    With the rise of a globalized economy and an overall increase in online learning, corporate organizations have increased training through the online environment at a rapid pace. Providing effective training the employee can immediately apply to the job has driven a need to improve online training programs. Numerous studies have identified that the…

  14. Metaphors and Meaning: Principals' Perceptions of Teacher Evaluation Implementation

    ERIC Educational Resources Information Center

    Derrington, Mary Lynne

    2013-01-01

    This Southeastern state was awarded one of the first two Race to the Top (RTTT) grants funded by the U.S. Department of Education. A key piece of the state's winning application was a legislative mandate to implement an intensive, quantitative, and accountability-driven teacher evaluation system beginning with the 2011-2012 school year. The new law…

  15. Use of quantitative SPECT/CT reconstruction in 99mTc-sestamibi imaging of patients with renal masses.

    PubMed

    Jones, Krystyna M; Solnes, Lilja B; Rowe, Steven P; Gorin, Michael A; Sheikhbahaei, Sara; Fung, George; Frey, Eric C; Allaf, Mohamad E; Du, Yong; Javadi, Mehrbod S

    2018-02-01

    Technetium-99m (99mTc)-sestamibi single-photon emission computed tomography/computed tomography (SPECT/CT) has previously been shown to allow for the accurate differentiation of benign renal oncocytomas and hybrid oncocytic/chromophobe tumors (HOCTs) from other malignant renal tumor histologies, with oncocytomas/HOCTs showing high uptake and renal cell carcinoma (RCC) showing low uptake based on uptake ratios from non-quantitative single-photon emission computed tomography (SPECT) reconstructions. However, in this study, several tumors fell close to the uptake ratio cutoff, likely due to limitations in conventional SPECT/CT reconstruction methods. We hypothesized that application of quantitative SPECT/CT (QSPECT) reconstruction methods developed by our group would provide more robust separation of hot and cold lesions, serving as an imaging framework on which quantitative biomarkers can be validated for the evaluation of renal masses with 99mTc-sestamibi. Single-photon emission computed tomography data were reconstructed using the clinical Flash 3D reconstruction and QSPECT methods. Two blinded readers then characterized each tumor as hot or cold. Semi-quantitative uptake ratios were calculated by dividing lesion activity by background renal activity for both Flash 3D and QSPECT reconstructions. The difference between median (mean) hot and cold tumor uptake ratios measured 0.655 (0.73) with the QSPECT method and 0.624 (0.67) with the conventional method, resulting in increased separation between hot and cold tumors. Sub-analysis of 7 lesions near the separation point showed a higher absolute difference (0.16) between QSPECT and Flash 3D mean uptake ratios compared to the remaining lesions. Our finding of improved separation between uptake ratios of hot and cold lesions using QSPECT reconstruction lays the foundation for additional quantitative SPECT techniques such as SPECT-UV in the setting of renal 99mTc-sestamibi and other SPECT/CT exams. With robust quantitative image reconstruction and biomarker analysis, there may be an expanded role for SPECT/CT imaging in renal masses and other pathologic conditions.
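    The semi-quantitative uptake ratio used for the hot/cold classification is simply the mean reconstructed intensity in the lesion region of interest divided by the mean intensity in background renal parenchyma. The sketch below is a hedged illustration with assumed NumPy inputs; it is not the clinical reconstruction or segmentation pipeline.

```python
import numpy as np

def uptake_ratio(volume, lesion_mask, background_mask):
    """Lesion-to-background uptake ratio from a reconstructed SPECT volume.

    volume          : 3-D array of reconstructed voxel intensities
    lesion_mask     : boolean array marking the tumor ROI
    background_mask : boolean array marking normal renal parenchyma
    """
    return volume[lesion_mask].mean() / volume[background_mask].mean()

# Assumed toy volume: a "hot" lesion embedded in uniform background activity.
vol = np.full((32, 32, 32), 10.0)
lesion = np.zeros_like(vol, dtype=bool)
lesion[14:18, 14:18, 14:18] = True
vol[lesion] = 18.0
print(round(uptake_ratio(vol, lesion, ~lesion), 2))   # 1.8
```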

  16. Synthetic dosage lethality in the human metabolic network is highly predictive of tumor growth and cancer patient survival.

    PubMed

    Megchelenbrink, Wout; Katzir, Rotem; Lu, Xiaowen; Ruppin, Eytan; Notebaart, Richard A

    2015-09-29

    Synthetic dosage lethality (SDL) denotes a genetic interaction between two genes whereby the underexpression of gene A combined with the overexpression of gene B is lethal. SDLs offer a promising way to kill cancer cells by inhibiting the activity of SDL partners of activated oncogenes in tumors, which are often difficult to target directly. As experimental genome-wide SDL screens are still scarce, here we introduce a network-level computational modeling framework that quantitatively predicts human SDLs in metabolism. For each enzyme pair (A, B) we systematically knock out the flux through A combined with a stepwise flux increase through B and search for pairs that reduce cellular growth more than when either enzyme is perturbed individually. The predictive signal of the emerging network of 12,000 SDLs is demonstrated in five different ways. (i) It can be successfully used to predict gene essentiality in shRNA cancer cell line screens. Moving to clinical tumors, we show that (ii) SDLs are significantly underrepresented in tumors. Furthermore, breast cancer tumors with SDLs active (iii) have smaller sizes and (iv) result in increased patient survival, indicating that activation of SDLs increases cancer vulnerability. Finally, (v) patient survival improves when multiple SDLs are present, pointing to a cumulative effect. This study lays the basis for quantitative identification of cancer SDLs in a model-based mechanistic manner. The approach presented can be used to identify SDLs in species and cell types in which "omics" data necessary for data-driven identification are missing.
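    The screening procedure described (knock out the flux through enzyme A while stepping up the flux through enzyme B, and flag pairs whose combined effect on growth far exceeds either single perturbation) can be written as a short loop. In the sketch below, `simulate_growth` is a hypothetical stand-in for a constraint-based (FBA-style) growth simulation; that interface, the step sizes, and the thresholds are assumptions, not the authors' implementation.

```python
import itertools

def find_sdl_pairs(enzymes, simulate_growth,
                   overexpression_steps=(2.0, 4.0, 8.0), lethality_fraction=0.1):
    """Return ordered (A, B, step) triples where knocking out A while forcing
    flux through B reduces growth far more than either perturbation alone.

    simulate_growth(knockouts, forced) -> growth rate, where 'knockouts' is a
    set of enzymes constrained to zero flux and 'forced' maps enzymes to flux
    multipliers (hypothetical interface standing in for an FBA solver).
    """
    wild_type = simulate_growth(set(), {})
    sdl_pairs = []
    for a, b in itertools.permutations(enzymes, 2):
        single_a = simulate_growth({a}, {})
        for step in overexpression_steps:
            single_b = simulate_growth(set(), {b: step})
            combined = simulate_growth({a}, {b: step})
            # SDL: the combination is (near-)lethal while each single
            # perturbation leaves growth largely intact.
            if (combined < lethality_fraction * wild_type
                    and min(single_a, single_b) > 0.5 * wild_type):
                sdl_pairs.append((a, b, step))
                break
    return sdl_pairs
```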

  17. Keeping it Together: Advanced algorithms and software for magma dynamics (and other coupled multi-physics problems)

    NASA Astrophysics Data System (ADS)

    Spiegelman, M.; Wilson, C. R.

    2011-12-01

    A quantitative theory of magma production and transport is essential for understanding the dynamics of magmatic plate boundaries, intra-plate volcanism and the geochemical evolution of the planet. It also provides one of the most challenging computational problems in solid Earth science, as it requires consistent coupling of fluid and solid mechanics together with the thermodynamics of melting and reactive flows. Considerable work on these problems over the past two decades shows that small changes in assumptions of coupling (e.g. the relationship between melt fraction and solid rheology), can have profound changes on the behavior of these systems which in turn affects critical computational choices such as discretizations, solvers and preconditioners. To make progress in exploring and understanding this physically rich system requires a computational framework that allows more flexible, high-level description of multi-physics problems as well as increased flexibility in composing efficient algorithms for solution of the full non-linear coupled system. Fortunately, recent advances in available computational libraries and algorithms provide a platform for implementing such a framework. We present results from a new model building system that leverages functionality from both the FEniCS project (www.fenicsproject.org) and PETSc libraries (www.mcs.anl.gov/petsc) along with a model independent options system and gui, Spud (amcg.ese.ic.ac.uk/Spud). Key features from FEniCS include fully unstructured FEM with a wide range of elements; a high-level language (ufl) and code generation compiler (FFC) for describing the weak forms of residuals and automatic differentiation for calculation of exact and approximate jacobians. The overall strategy is to monitor/calculate residuals and jacobians for the entire non-linear system of equations within a global non-linear solve based on PETSc's SNES routines. PETSc already provides a wide range of solvers and preconditioners, from parallel sparse direct to algebraic multigrid, that can be chosen at runtime. In particular, we make extensive use of PETSc's FieldSplit block preconditioners that allow us to use optimal solvers for subproblems (such as Stokes, or advection/diffusion of temperature) as preconditioners for the full problem. Thus these routines let us reuse effective solving recipes/splittings from previous experience while monitoring the convergence of the global problem. These techniques often yield quadratic (Newton like) convergence for the work of standard Picard schemes. We will illustrate this new framework with examples from the Magma Dynamic Demonstration suite (MADDs) of well understood magma dynamics benchmark problems including stokes flow in ridge geometries, magmatic solitary waves and shear-driven melt bands. While development of this system has been driven by magma dynamics, this framework is much more general and can be used for a wide range of PDE based multi-physics models.

  18. Computational And Experimental Studies Of Three-Dimensional Flame Spread Over Liquid Fuel Pools

    NASA Technical Reports Server (NTRS)

    Ross, Howard D. (Technical Monitor); Cai, Jinsheng; Liu, Feng; Sirignano, William A.; Miller, Fletcher J.

    2003-01-01

    Schiller, Ross, and Sirignano (1996) studied ignition and flame spread above liquid fuels initially below the flashpoint temperature by using a two-dimensional computational fluid dynamics code that solves the coupled equations of both the gas and the liquid phases. Pulsating flame spread was attributed to the establishment of a gas-phase recirculation cell that forms just ahead of the flame leading edge because of the opposing effect of buoyancy-driven flow in the gas phase and the thermocapillary-driven flow in the liquid phase. Schiller and Sirignano (1996) extended the same study to include flame spread with forced opposed flow in the gas phase. A transitional flow velocity was found above which an originally uniform spreading flame pulsates. The same type of gas-phase recirculation cell caused by the combination of forced opposed flow, buoyancy-driven flow, and thermocapillary-driven concurrent flow was responsible for the pulsating flame spread. Ross and Miller (1998) and Miller and Ross (1998) performed experimental work that corroborates the computational findings of Schiller, Ross, and Sirignano (1996) and Schiller and Sirignano (1996). Cai, Liu, and Sirignano (2002) developed a more comprehensive three-dimensional model and computer code for the flame spread problem. Many improvements in modeling and numerical algorithms were incorporated in the three-dimensional model. Pools of finite width and length were studied in air channels of prescribed height and width. Significant three-dimensional effects around and along the pool edge were observed. The same three-dimensional code is used to study the detailed effects of pool depth, pool width, opposed air flow velocity, and different levels of air oxygen concentration (Cai, Liu, and Sirignano, 2003). Significant three-dimensional effects showing an unsteady wavy flame front for cases of wide pool width are found for the first time in computation, after being noted previously by experimental observers (Ross and Miller, 1999). Regions of uniform and pulsating flame spread are mapped for the flow conditions of pool depth, opposed flow velocity, initial pool temperature, and air oxygen concentration under both normal and microgravity conditions. Details can be found in Cai et al. (2002, 2003). Experimental results recently performed at NASA Glenn of flame spread across a wide, shallow pool as a function of liquid temperature are also presented here.

  19. Autonomous Electrothermal Facility for Oil Recovery Intensification Fed by Wind Driven Power Unit

    NASA Astrophysics Data System (ADS)

    Belsky, Aleksey A.; Dobush, Vasiliy S.

    2017-10-01

    This paper describes the structure of an autonomous facility, fed by a wind-driven power unit, for the intensification of viscous and heavy crude oil recovery by means of heat impact on the productive strata. A computer-based simulation of this facility was performed. Operational energy characteristics were obtained for various operating modes of the facility. The optimal resistance of the heating element of the downhole heater was determined for maximum operating efficiency of the wind power unit.

  20. Solid–Liquid Phase Change Driven by Internal Heat Generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John Crepeau; Ali s. Siahpush

    2012-07-01

    This article presents results for solid-liquid phase change, the Stefan problem, where melting is driven by internal heat generation, in a cylindrical geometry. The comparison between a quasi-static analytical solution for Stefan numbers less than one and numerical solutions shows good agreement. The computational results of phase change with internal heat generation show how convection cells form in the liquid region. A scale analysis of the same problem shows four distinct regions of the melting process.

  1. Electric-field-driven electron-transfer in mixed-valence molecules.

    PubMed

    Blair, Enrique P; Corcelli, Steven A; Lent, Craig S

    2016-07-07

    Molecular quantum-dot cellular automata is a computing paradigm in which digital information is encoded by the charge configuration of a mixed-valence molecule. General-purpose computing can be achieved by arranging these compounds on a substrate and exploiting intermolecular Coulombic coupling. The operation of such a device relies on nonequilibrium electron transfer (ET), whereby the time-varying electric field of one molecule induces an ET event in a neighboring molecule. The magnitude of the electric fields can be quite large because of close spatial proximity, and the induced ET rate is a measure of the nonequilibrium response of the molecule. We calculate the electric-field-driven ET rate for a model mixed-valence compound. The mixed-valence molecule is regarded as a two-state electronic system coupled to a molecular vibrational mode, which is, in turn, coupled to a thermal environment. Both the electronic and vibrational degrees-of-freedom are treated quantum mechanically, and the dissipative vibrational-bath interaction is modeled with the Lindblad equation. This approach captures both tunneling and nonadiabatic dynamics. Relationships between microscopic molecular properties and the driven ET rate are explored for two time-dependent applied fields: an abruptly switched field and a linearly ramped field. In both cases, the driven ET rate is only weakly temperature dependent. When the model is applied using parameters appropriate to a specific mixed-valence molecule, diferrocenylacetylene, terahertz-range ET transfer rates are predicted.
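    A stripped-down numerical sketch of the kind of calculation described above is shown below: a two-level (charge-localized) system driven by an abruptly switched bias and damped through a Lindblad dissipator, integrated with fourth-order Runge-Kutta. Unlike the paper's model, the vibrational mode is not treated explicitly here; it is collapsed into a phenomenological relaxation rate, and all parameter values are assumptions.

```python
import numpy as np

# Pauli matrices and a lowering operator (hbar = 1 throughout).
sz = np.array([[1, 0], [0, -1]], dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sm = np.array([[0, 0], [1, 0]], dtype=complex)

def lindblad_rhs(rho, H, c_ops):
    """Right-hand side of the Lindblad master equation."""
    drho = -1j * (H @ rho - rho @ H)
    for L in c_ops:
        LdL = L.conj().T @ L
        drho += L @ rho @ L.conj().T - 0.5 * (LdL @ rho + rho @ LdL)
    return drho

def evolve(rho0, bias, coupling, gamma_relax, t_final, dt=1e-3):
    """RK4 integration of a driven, damped two-level system.

    bias(t)     : time-dependent detuning between the two charge states
    coupling    : electronic tunneling matrix element
    gamma_relax : phenomenological relaxation rate standing in for the bath
    """
    c_ops = [np.sqrt(gamma_relax) * sm]
    rho, t, populations = rho0.copy(), 0.0, []
    while t < t_final:
        H1 = 0.5 * bias(t) * sz + coupling * sx
        H2 = 0.5 * bias(t + 0.5 * dt) * sz + coupling * sx
        H4 = 0.5 * bias(t + dt) * sz + coupling * sx
        k1 = lindblad_rhs(rho, H1, c_ops)
        k2 = lindblad_rhs(rho + 0.5 * dt * k1, H2, c_ops)
        k3 = lindblad_rhs(rho + 0.5 * dt * k2, H2, c_ops)
        k4 = lindblad_rhs(rho + dt * k3, H4, c_ops)
        rho = rho + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += dt
        populations.append((t, rho[0, 0].real))
    return populations

# Abruptly switched driving field: the neighbor's charge state flips at t = 1.
switched_bias = lambda t: -5.0 if t < 1.0 else +5.0
rho0 = np.array([[0.0, 0.0], [0.0, 1.0]], dtype=complex)   # electron on site 2
traj = evolve(rho0, switched_bias, coupling=1.0, gamma_relax=0.5, t_final=5.0)
print(f"final population on site 1: {traj[-1][1]:.3f}")
```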

  2. Data-driven non-linear elasticity: constitutive manifold construction and problem discretization

    NASA Astrophysics Data System (ADS)

    Ibañez, Ruben; Borzacchiello, Domenico; Aguado, Jose Vicente; Abisset-Chavanne, Emmanuelle; Cueto, Elias; Ladeveze, Pierre; Chinesta, Francisco

    2017-11-01

    The use of constitutive equations calibrated from data has been implemented into standard numerical solvers and successfully applied to a variety of problems encountered in simulation-based engineering sciences (SBES). However, complexity continues to increase due to the need for increasingly detailed models as well as the use of engineered materials. Data-driven simulation constitutes a potential change of paradigm in SBES. Standard simulation in computational mechanics is based on the use of two very different types of equations. The first, of axiomatic character, is related to balance laws (momentum, mass, energy, ...), whereas the second consists of models that scientists have extracted from collected data, either natural or synthetic. Data-driven (or data-intensive) simulation consists of directly linking experimental data to computers in order to perform numerical simulations. These simulations will employ laws, universally recognized as epistemic, while minimizing the need for explicit, often phenomenological, models. The main drawback of such an approach is the large amount of required data, some of which are inaccessible with present-day testing facilities. This difficulty can be circumvented in many cases, and in any case alleviated, by considering complex tests, collecting as many data as possible, and then using a data-driven inverse approach to generate the whole constitutive manifold from a few complex experimental tests, as discussed in the present work.
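    One way to picture the "constitutive manifold" is as a cloud of experimentally sampled strain-stress states that a solver interrogates directly instead of evaluating a fitted constitutive law. The sketch below builds such a cloud for a fictitious 1-D nonlinear material and queries it with inverse-distance-weighted nearest neighbours; the data, the weighting, and the 1-D setting are all assumptions for illustration, not the paper's method.

```python
import numpy as np

# Synthetic "experimental" data: strain-stress pairs sampled from a
# fictitious nonlinear 1-D response (stand-in for real test data).
rng = np.random.default_rng(1)
strain_data = np.linspace(0.0, 0.05, 200)
stress_data = 2.0e9 * strain_data - 1.5e10 * strain_data**2
stress_data += rng.normal(scale=2e6, size=strain_data.size)   # measurement noise

def manifold_stress(strain_query, k=5, eps=1e-12):
    """Evaluate stress at a queried strain directly from the data cloud
    (inverse-distance-weighted k-nearest neighbours), with no explicit
    constitutive equation."""
    d = np.abs(strain_data - strain_query)
    idx = np.argsort(d)[:k]
    w = 1.0 / (d[idx] + eps)
    return float(np.sum(w * stress_data[idx]) / np.sum(w))

# A solver would call manifold_stress() wherever it previously evaluated
# a phenomenological stress-strain law.
print(f"{manifold_stress(0.02):.3e} Pa")
```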

  3. Phase field model of fluid-driven fracture in elastic media: Immersed-fracture formulation and validation with analytical solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santillán, David; Juanes, Ruben; Cueto-Felgueroso, Luis

    Propagation of fluid-driven fractures plays an important role in natural and engineering processes, including transport of magma in the lithosphere, geologic sequestration of carbon dioxide, and oil and gas recovery from low-permeability formations, among many others. The simulation of fracture propagation poses a computational challenge as a result of the complex physics of fracture and the need to capture disparate length scales. Phase field models represent fractures as a diffuse interface and enjoy the advantage that fracture nucleation, propagation, branching, or twisting can be simulated without ad hoc computational strategies like remeshing or local enrichment of the solution space. Here we propose a new quasi-static phase field formulation for modeling fluid-driven fracturing in elastic media at small strains. The approach fully couples the fluid flow in the fracture (described via the Reynolds lubrication approximation) and the deformation of the surrounding medium. The flow is solved on a lower dimensionality mesh immersed in the elastic medium. This approach leads to accurate coupling of both physics. We assessed the performance of the model extensively by comparing results for the evolution of fracture length, aperture, and fracture fluid pressure against analytical solutions under different fracture propagation regimes. Thus, the excellent performance of the numerical model in all regimes builds confidence in the applicability of phase field approaches to simulate fluid-driven fracture.

  4. Phase field model of fluid-driven fracture in elastic media: Immersed-fracture formulation and validation with analytical solutions

    DOE PAGES

    Santillán, David; Juanes, Ruben; Cueto-Felgueroso, Luis

    2017-04-20

    Propagation of fluid-driven fractures plays an important role in natural and engineering processes, including transport of magma in the lithosphere, geologic sequestration of carbon dioxide, and oil and gas recovery from low-permeability formations, among many others. The simulation of fracture propagation poses a computational challenge as a result of the complex physics of fracture and the need to capture disparate length scales. Phase field models represent fractures as a diffuse interface and enjoy the advantage that fracture nucleation, propagation, branching, or twisting can be simulated without ad hoc computational strategies like remeshing or local enrichment of the solution space. Here we propose a new quasi-static phase field formulation for modeling fluid-driven fracturing in elastic media at small strains. The approach fully couples the fluid flow in the fracture (described via the Reynolds lubrication approximation) and the deformation of the surrounding medium. The flow is solved on a lower dimensionality mesh immersed in the elastic medium. This approach leads to accurate coupling of both physics. We assessed the performance of the model extensively by comparing results for the evolution of fracture length, aperture, and fracture fluid pressure against analytical solutions under different fracture propagation regimes. Thus, the excellent performance of the numerical model in all regimes builds confidence in the applicability of phase field approaches to simulate fluid-driven fracture.

  5. Advances on the constitutive characterization of composites via multiaxial robotic testing and design optimization

    Treesearch

    John G. Michopoulos; John Hermanson; Athanasios Iliopoulos

    2014-01-01

    The research areas of multiaxial robotic testing and design optimization have recently been utilized for the purpose of data-driven constitutive characterization of anisotropic material systems. This effort has been enabled both by progress in the areas of computers and information in engineering and by progress in computational automation. Although our...

  6. [An interactive three-dimensional model of the human body].

    PubMed

    Liem, S L

    2009-01-01

    Driven by advanced computer technology, it is now possible to display human anatomy on a computer. On the internet, the Visible Body programme makes it possible to navigate in all directions through the anatomical structures of the human body using the mouse and keyboard. Visible Body is a wonderful tool for gaining insight into the structures, functions, and organs of the human body.

  7. A 21st Century Science, Technology, and Innovation Strategy for Americas National Security

    DTIC Science & Technology

    2016-05-01

    areas. Advanced Computing and Communications: The exponential growth of the digital economy, driven by ubiquitous computing and communication... weapons-focused R&D, many of the capabilities being developed have significant dual-use potential. Digital connectivity, for instance, brings... scale than traditional recombinant DNA techniques, and to share these designs digitally. Nanotechnology promises the ability to engineer entirely

  8. An Educational MONTE CARLO Simulation/Animation Program for the Cosmic Rays Muons and a Prototype Computer-Driven Hardware Display.

    ERIC Educational Resources Information Center

    Kalkanis, G.; Sarris, M. M.

    1999-01-01

    Describes an educational software program for the study of and detection methods for the cosmic ray muons passing through several light transparent materials (i.e., water, air, etc.). Simulates muons and Cherenkov photons' paths and interactions and visualizes/animates them on the computer screen using Monte Carlo methods/techniques which employ…

  9. The Museum Wearable: Real-Time Sensor-Driven Understanding of Visitors' Interests for Personalized Visually-Augmented Museum Experiences.

    ERIC Educational Resources Information Center

    Sparacino, Flavia

    This paper describes the museum wearable: a wearable computer that orchestrates an audiovisual narration as a function of the visitors' interests gathered from their physical path in the museum and length of stops. The wearable consists of a lightweight and small computer that people carry inside a shoulder pack. It offers an audiovisual…

  10. Active DNA gels

    NASA Astrophysics Data System (ADS)

    Saleh, Omar A.; Fygenson, Deborah K.; Bertrand, Olivier J. N.; Park, Chang Young

    2013-02-01

    Research into the mechanics and fluctuations of living cells has revealed the key role played by the cytoskeleton, a gel of stiff filaments driven out of equilibrium by force-generating motor proteins. Inspired by the extraordinary mechanical functions that the cytoskeleton imparts to the cell, we sought to create an artificial gel with similar characteristics. We identified DNA, and DNA-based motor proteins, as functional counterparts to the constituents of the cytoskeleton. We used DNA self-assembly to create a gel, and characterized its fluctuations and mechanics both before and after activation by the motor. We found that certain aspects of the DNA gel quantitatively match those of cytoskeletal networks, indicating the universal features of motor-driven, non-equilibrium networks.

  11. Stable solar-driven oxidation of water by semiconducting photoanodes protected by transparent catalytic nickel oxide films.

    PubMed

    Sun, Ke; Saadi, Fadl H; Lichterman, Michael F; Hale, William G; Wang, Hsin-Ping; Zhou, Xinghao; Plymale, Noah T; Omelchenko, Stefan T; He, Jr-Hau; Papadantonakis, Kimberly M; Brunschwig, Bruce S; Lewis, Nathan S

    2015-03-24

    Reactively sputtered nickel oxide (NiOx) films provide transparent, antireflective, electrically conductive, chemically stable coatings that also are highly active electrocatalysts for the oxidation of water to O2(g). These NiOx coatings provide protective layers on a variety of technologically important semiconducting photoanodes, including textured crystalline Si passivated by amorphous silicon, crystalline n-type cadmium telluride, and hydrogenated amorphous silicon. Under anodic operation in 1.0 M aqueous potassium hydroxide (pH 14) in the presence of simulated sunlight, the NiOx films stabilized all of these self-passivating, high-efficiency semiconducting photoelectrodes for >100 h of sustained, quantitative solar-driven oxidation of water to O2(g).

  12. Nuclear electric propulsion operational reliability and crew safety study: NEP systems/modeling report

    NASA Technical Reports Server (NTRS)

    Karns, James

    1993-01-01

    The objective of this study was to establish the initial quantitative reliability bounds for nuclear electric propulsion systems in a manned Mars mission required to ensure crew safety and mission success. Finding the reliability bounds involves balancing top-down (mission driven) requirements and bottom-up (technology driven) capabilities. In seeking this balance we hope to accomplish the following: (1) provide design insights into the achievability of the baseline design in terms of reliability requirements, given the existing technology base; (2) suggest alternative design approaches which might enhance reliability and crew safety; and (3) indicate what technology areas require significant research and development to achieve the reliability objectives.

  13. Characteristics of spondylotic myelopathy on 3D driven-equilibrium fast spin echo and 2D fast spin echo magnetic resonance imaging: a retrospective cross-sectional study.

    PubMed

    Abdulhadi, Mike A; Perno, Joseph R; Melhem, Elias R; Nucifora, Paolo G P

    2014-01-01

    In patients with spinal stenosis, magnetic resonance imaging of the cervical spine can be improved by using 3D driven-equilibrium fast spin echo sequences to provide a high-resolution assessment of osseous and ligamentous structures. However, it is not yet clear whether 3D driven-equilibrium fast spin echo sequences adequately evaluate the spinal cord itself. As a result, they are generally supplemented by additional 2D fast spin echo sequences, adding time to the examination and potential discomfort to the patient. Here we investigate the hypothesis that in patients with spinal stenosis and spondylotic myelopathy, 3D driven-equilibrium fast spin echo sequences can characterize cord lesions equally well as 2D fast spin echo sequences. We performed a retrospective analysis of 30 adult patients with spondylotic myelopathy who had been examined with both 3D driven-equilibrium fast spin echo sequences and 2D fast spin echo sequences at the same scanning session. The two sequences were inspected separately for each patient, and visible cord lesions were manually traced. We found no significant differences between 3D driven-equilibrium fast spin echo and 2D fast spin echo sequences in the mean number, mean area, or mean transverse dimensions of spondylotic cord lesions. Nevertheless, the mean contrast-to-noise ratio of cord lesions was decreased on 3D driven-equilibrium fast spin echo sequences compared to 2D fast spin echo sequences. These findings suggest that 3D driven-equilibrium fast spin echo sequences do not need supplemental 2D fast spin echo sequences for the diagnosis of spondylotic myelopathy, but they may be less well suited for quantitative signal measurements in the spinal cord.

  14. Microscopic origin and macroscopic implications of lane formation in mixtures of oppositely-driven particles

    NASA Astrophysics Data System (ADS)

    Whitelam, Stephen

    Colloidal particles of two types, driven in opposite directions, can segregate into lanes. I will describe some results on this phenomenon obtained by simple physical arguments and computer simulations. Laning results from rectification of diffusion on the scale of a particle diameter: oppositely-driven particles must, in the time taken to encounter each other in the direction of the drive, diffuse in the perpendicular direction by about one particle diameter. This geometric constraint implies that the diffusion constant of a particle, in the presence of those of the opposite type, grows approximately linearly with Peclet number, a prediction confirmed by our numerics. Such environment-dependent diffusion is statistically similar to an effective interparticle attraction; consistent with this observation, we find that oppositely-driven colloids display features characteristic of the simplest model system possessing both interparticle attractions and persistent motion, the driven Ising lattice gas. Work supported by the Office of Science, Office of Basic Energy Sciences, of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231.

  15. Quantitative Predictive Models for Systemic Toxicity (SOT)

    EPA Science Inventory

    Models to identify systemic and specific target organ toxicity were developed to help transition the field of toxicology towards computational models. By leveraging multiple data sources to incorporate read-across and machine learning approaches, a quantitative model of systemic ...

  16. Quantitative Evaluation of a Planetary Renderer for Terrain Relative Navigation

    NASA Astrophysics Data System (ADS)

    Amoroso, E.; Jones, H.; Otten, N.; Wettergreen, D.; Whittaker, W.

    2016-11-01

    A ray-tracing computer renderer tool is presented based on LOLA and LROC elevation models and is quantitatively compared to LRO WAC and NAC images for photometric accuracy. We investigated using rendered images for terrain relative navigation.

  17. Medical privacy protection based on granular computing.

    PubMed

    Wang, Da-Wei; Liau, Churn-Jung; Hsu, Tsan-Sheng

    2004-10-01

    Based on granular computing methodology, we propose two criteria to quantitatively measure privacy invasion. The total cost criterion measures the effort needed for a data recipient to find private information. The average benefit criterion measures the benefit a data recipient obtains upon receiving the released data. These two criteria remedy the inadequacy of the deterministic privacy formulation proposed in Proceedings of Asia Pacific Medical Informatics Conference, 2000; Int J Med Inform 2003;71:17-23. Granular computing methodology provides a unified framework for these quantitative measurements and the previous bin-size and logical approaches. The two new criteria are implemented in a prototype system, Cellsecu 2.0. A preliminary system performance evaluation is conducted and reviewed.

  18. Computation of the three-dimensional medial surface dynamics of the vocal folds.

    PubMed

    Döllinger, Michael; Berry, David A

    2006-01-01

    To increase our understanding of pathological and healthy voice production, quantitative measurement of the medial surface dynamics of the vocal folds is significant, albeit rarely performed because of the inaccessibility of the vocal folds. Using an excised hemilarynx methodology, a new calibration technique, herein referred to as the linear approximate (LA) method, was introduced to compute the three-dimensional coordinates of fleshpoints along the entire medial surface of the vocal fold. The results were compared with results from the direct linear transform. An associated error estimation was presented, demonstrating the improved accuracy of the new method. A test on real data was reported including computation of quantitative measurements of vocal fold dynamics.
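    For orientation, the direct linear transform (DLT) against which the new linear approximate (LA) method is compared can be written compactly: each known 3-D fleshpoint and its observed 2-D image position contribute two linear equations in the 11 DLT parameters, which are solved in a least-squares sense. The sketch below is a textbook DLT calibration, not the paper's LA method; the calibration points are assumed inputs.

```python
import numpy as np

def dlt_calibrate(world_xyz, image_uv):
    """Estimate the 11 DLT parameters from >= 6 control points.

    world_xyz : (N, 3) known 3-D coordinates of calibration points
    image_uv  : (N, 2) their observed 2-D image coordinates
    """
    A, b = [], []
    for (X, Y, Z), (u, v) in zip(world_xyz, image_uv):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z])
        b.extend([u, v])
    L, *_ = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float), rcond=None)
    return L

def dlt_project(L, X, Y, Z):
    """Project a 3-D point to image coordinates using the 11 DLT parameters."""
    den = L[8] * X + L[9] * Y + L[10] * Z + 1.0
    u = (L[0] * X + L[1] * Y + L[2] * Z + L[3]) / den
    v = (L[4] * X + L[5] * Y + L[6] * Z + L[7]) / den
    return u, v
```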

  19. Quantitation of Flavanols, Proanthocyanidins, Isoflavones, Flavanones, Dihydrochalcones, Stilbenes, Benzoic Acid Derivatives Using Ultraviolet Absorbance after Identification by Liquid Chromatography–Mass Spectrometry

    PubMed Central

    Lin, Long-Ze; Harnly, James M.

    2013-01-01

    A general method was developed for the systematic quantitation of flavanols, proanthocyanidins, isoflavones, flavanones, dihydrochalcones, stilbenes, and hydroxybenzoic acid derivatives (mainly hydrolyzable tannins) based on UV band II absorbance arising from the benzoyl structure. The compound structures and the wavelength maximum were well correlated and were divided into four groups: the flavanols and proanthocyanidins at 278 nm, hydrolyzable tannins at 274 nm, flavanones at 288 nm, and isoflavones at 260 nm. Within each group, molar relative response factors (MRRFs) were computed for each compound based on the absorbance ratio of the compound and the group reference standard. Response factors were computed for the compounds as purchased (MRRF), after drying (MRRFD), and as the best predicted value (MRRFP). Concentrations for each compound were computed based on calibration with the group reference standard and the MRRFP. The quantitation of catechins, proanthocyanidins, and gallic acid derivatives in white tea was used as an example. PMID:22577798
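    The quantitation scheme above reduces to calibrating one reference standard per wavelength group and scaling by the compound's molar relative response factor. The sketch below is a hedged arithmetic illustration under assumed values and a simplified single-point calibration; the paper's exact calibration procedure is not reproduced.

```python
def moles_from_peak_area(peak_area, ref_response_per_mol, mrrfp):
    """Assumed relation: a compound's UV band II peak area divided by the
    group reference standard's response per mole, corrected by the
    compound's predicted molar relative response factor (MRRFP)."""
    return peak_area / (ref_response_per_mol * mrrfp)

# Hypothetical single-point calibration of the group reference standard
# (e.g. a flavanol reference at 278 nm): a 1.0e-6 mol injection giving a
# peak area of 5.0e5 units.
ref_response_per_mol = 5.0e5 / 1.0e-6        # area units per mole

# Hypothetical analyte: peak area 2.4e5 units, predicted MRRFP of 0.92.
print(f"{moles_from_peak_area(2.4e5, ref_response_per_mol, 0.92):.2e} mol injected")
```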

  20. Internal and External Crisis Early Warning and Monitoring.

    DTIC Science & Technology

    1980-12-01

    refining EWAMS. Initial EWAMS research revolved around the testing of quantitative political indicators, the development of general scans, and the... Initial Research... Quantitative indicators... General scans... Computer base... generalizations reinforce the desirability of the research from the vantage point of the I&W thrust. One is the proliferation of quantitative and
