ERIC Educational Resources Information Center
Boldt, Milton; Pokorny, Harry
Thirty-three machine shop instructors from 17 states participated in an 8-week seminar to develop the skills and knowledge essential for teaching the operation of numerically controlled machine tools. The seminar was given from June 20 to August 12, 1966, with college credit available through Stout State University. The participants completed an…
Ensembles of NLP Tools for Data Element Extraction from Clinical Notes
Kuo, Tsung-Ting; Rao, Pallavi; Maehara, Cleo; Doan, Son; Chaparro, Juan D.; Day, Michele E.; Farcas, Claudiu; Ohno-Machado, Lucila; Hsu, Chun-Nan
2016-01-01
Natural Language Processing (NLP) is essential for concept extraction from narrative text in electronic health records (EHR). To extract numerous and diverse concepts, such as data elements (i.e., important concepts related to a certain medical condition), a plausible solution is to combine various NLP tools into an ensemble to improve extraction performance. However, it is unclear to what extent ensembles of popular NLP tools improve the extraction of numerous and diverse concepts. Therefore, we built an NLP ensemble pipeline to synergize the strength of popular NLP tools using seven ensemble methods, and to quantify the improvement in performance achieved by ensembles in the extraction of data elements for three very different cohorts. Evaluation results show that the pipeline can improve the performance of NLP tools, but there is high variability depending on the cohort. PMID:28269947
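The combination step described above can be sketched with a simple voting combiner. This is an illustrative assumption, not the paper's actual pipeline (which evaluated seven ensemble methods over real NLP tool outputs); the tool outputs and concept names below are mock data.

```python
from collections import Counter

def ensemble_extract(tool_outputs, threshold=None):
    """Keep a concept if at least `threshold` tools extracted it.

    tool_outputs -- list of sets, one set of extracted concepts per tool.
    threshold defaults to a simple majority of the tools.
    """
    if threshold is None:
        threshold = len(tool_outputs) // 2 + 1
    votes = Counter()
    for concepts in tool_outputs:
        votes.update(concepts)
    return {c for c, v in votes.items() if v >= threshold}

# Three mock extractors disagree on the concepts in a note.
outputs = [
    {"diabetes", "hypertension", "asthma"},
    {"diabetes"},
    {"diabetes", "hypertension"},
]
print(sorted(ensemble_extract(outputs)))   # ['diabetes', 'hypertension']
```

Raising or lowering the vote threshold trades precision against recall, which is one way such an ensemble can outperform any single tool.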
ERIC Educational Resources Information Center
Ayala, Gabriela Cota; Real, Francia Angélica Karlos; Ivan, Ramirez Alvarado Edqar
2016-01-01
The research was conducted to determine whether the study program of the industrial processes career at the Technological University of Chihuahua, one year after it was certified by CACEI, continues to achieve the established indicators and ISO 9001:2008; quality tools are implemented, essential indicators to monitor are determined, flow charts are…
Engineering With Nature Geographic Project Mapping Tool (EWN ProMap)
2015-07-01
EWN ProMap database provides numerous case studies for infrastructure projects such as breakwaters, river engineering dikes, and seawalls that have...the EWN Project Mapping Tool (EWN ProMap) is to assist users in their search for case study information that can be valuable for developing EWN ideas...Essential elements of EWN include: (1) using science and engineering to produce operational efficiencies supporting sustainable delivery of
Number Talks Build Numerical Reasoning
ERIC Educational Resources Information Center
Parrish, Sherry D.
2011-01-01
"Classroom number talks," five- to fifteen-minute conversations around purposefully crafted computation problems, are a productive tool that can be incorporated into classroom instruction to combine the essential processes and habits of mind of doing math. During number talks, students are asked to communicate their thinking when presenting and…
NASA Astrophysics Data System (ADS)
Johnson, Daniel; Huerta, E. A.; Haas, Roland
2018-01-01
Numerical simulations of Einstein’s field equations provide unique insights into the physics of compact objects moving at relativistic speeds, and which are driven by strong gravitational interactions. Numerical relativity has played a key role to firmly establish gravitational wave astrophysics as a new field of research, and it is now paving the way to establish whether gravitational wave radiation emitted from compact binary mergers is accompanied by electromagnetic and astro-particle counterparts. As numerical relativity continues to blend in with routine gravitational wave data analyses to validate the discovery of gravitational wave events, it is essential to develop open source tools to streamline these studies. Motivated by our own experience as users and developers of the open source, community software, the Einstein Toolkit, we present an open source, Python package that is ideally suited to monitor and post-process the data products of numerical relativity simulations, and compute the gravitational wave strain at future null infinity in high performance environments. We showcase the application of this new package to post-process a large numerical relativity catalog and extract higher-order waveform modes from numerical relativity simulations of eccentric binary black hole mergers and neutron star mergers. This new software fills a critical void in the arsenal of tools provided by the Einstein Toolkit consortium to the numerical relativity community.
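One core post-processing step mentioned above, computing the gravitational wave strain from simulation output, is commonly done by double time integration of the Newman-Penrose scalar Psi4 in the frequency domain with a low-frequency cutoff (fixed-frequency integration). The sketch below shows that generic textbook technique under the assumption Psi4 = d^2 h / dt^2; it is not the package's actual API.

```python
import numpy as np

def ffi_strain(psi4, dt, f_cut):
    """Double-integrate a Psi4-like series in the frequency domain."""
    n = len(psi4)
    omega = 2 * np.pi * np.fft.fftfreq(n, d=dt)
    w_cut = 2 * np.pi * f_cut
    # Clamp |omega| below the cutoff so we never divide by ~0 (this is the
    # "fixed frequency" part, suppressing spurious low-frequency drift).
    omega_c = np.where(np.abs(omega) < w_cut, w_cut * np.sign(omega), omega)
    omega_c[omega_c == 0.0] = w_cut
    return np.fft.ifft(np.fft.fft(psi4) / (1j * omega_c) ** 2)

# Since Psi4 = d^2 h / dt^2, integrating a pure sinusoid must return the
# sinusoid scaled by -1/omega^2.
t = np.linspace(0.0, 10.0, 4096, endpoint=False)
w = 2 * np.pi * 2.0                        # a 2 Hz test signal
h = ffi_strain(np.cos(w * t), t[1] - t[0], f_cut=0.5)
```

The cutoff frequency is a user choice: too high distorts the physical signal, too low lets numerical drift through.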
Paranoia.Ada: A diagnostic program to evaluate Ada floating-point arithmetic
NASA Technical Reports Server (NTRS)
Hjermstad, Chris
1986-01-01
Many essential software functions in the mission critical computer resource application domain depend on floating point arithmetic. Numerically intensive functions associated with the Space Station project, such as ephemeris generation or the implementation of Kalman filters, are likely to employ the floating point facilities of Ada. Paranoia.Ada appears to be a valuable program to ensure that Ada environments and their underlying hardware exhibit the precision and correctness required to satisfy mission computational requirements. As a diagnostic tool, Paranoia.Ada reveals many essential characteristics of an Ada floating point implementation. Equipped with such knowledge, programmers need not tremble before the complex task of floating point computation.
Phase separation and the formation of cellular bodies
NASA Astrophysics Data System (ADS)
Xu, Bin; Broedersz, Chase P.; Meir, Yigal; Wingreen, Ned S.
Cellular bodies in eukaryotic cells spontaneously assemble to form cellular compartments. Among other functions, these bodies carry out essential biochemical reactions. Cellular bodies form micron-sized structures, which, unlike canonical cell organelles, are not surrounded by membranes. A recent in vitro experiment has shown that phase separation of polymers in solution can explain the formation of cellular bodies. We constructed a lattice-polymer model to capture the essential mechanism leading to this phase separation. We used both analytical and numerical tools to predict the phase diagram of a system of two interacting polymers, including the concentration of each polymer type in the condensed and dilute phase.
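The demixing mechanism can be illustrated with a toy lattice-gas Metropolis simulation, far simpler than the authors' two-polymer lattice model and purely an assumption for illustration: particles on a periodic square lattice attract nearest neighbours, and at low temperature particle-swap (Kawasaki) dynamics separates them into a dense phase coexisting with a dilute one.

```python
import numpy as np

rng = np.random.default_rng(42)
L, J, T = 24, 1.0, 0.4                           # lattice size, attraction strength, temperature
grid = (rng.random((L, L)) < 0.3).astype(int)    # ~30% occupancy, random initial mixture
n_particles = int(grid.sum())

def site_energy(g, i, j):
    # Attraction of site (i, j) to its four periodic nearest neighbours.
    return -J * g[i, j] * (g[(i + 1) % L, j] + g[(i - 1) % L, j]
                           + g[i, (j + 1) % L] + g[i, (j - 1) % L])

def total_energy(g):
    # Each bond counted once (right and down neighbours only).
    return -J * np.sum(g * (np.roll(g, 1, axis=0) + np.roll(g, 1, axis=1)))

E0 = total_energy(grid)
for _ in range(100_000):                         # Kawasaki (particle-swap) Metropolis moves
    i1, j1, i2, j2 = rng.integers(0, L, 4)
    if grid[i1, j1] == grid[i2, j2]:
        continue                                 # swapping equal sites changes nothing
    # Swapped sites always hold unequal values (0 and 1), so their mutual bond
    # is zero before and after the swap and this local bookkeeping is exact.
    before = site_energy(grid, i1, j1) + site_energy(grid, i2, j2)
    grid[i1, j1], grid[i2, j2] = grid[i2, j2], grid[i1, j1]
    after = site_energy(grid, i1, j1) + site_energy(grid, i2, j2)
    if after - before > 0 and rng.random() > np.exp(-(after - before) / T):
        grid[i1, j1], grid[i2, j2] = grid[i2, j2], grid[i1, j1]   # reject: undo swap
E1 = total_energy(grid)                          # lower energy: particles have clustered
```

Swap moves conserve particle number, mimicking a fixed polymer concentration; the condensed and dilute phases correspond to the two coexisting concentrations in the predicted phase diagram.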
ERIC Educational Resources Information Center
Hilty, Eleanor Blair, Ed.
2011-01-01
Over the past two decades, numerous textbooks have been published on teacher leadership; however, this is the only volume that provides a definitive overview of the scholarship and writing being done in the field of teacher leadership. This book introduces the reader to the scholarship of over 35 authors, and thus, becomes an essential tool needed…
Modeling languages for biochemical network simulation: reaction vs equation based approaches.
Wiechert, Wolfgang; Noack, Stephan; Elsheikh, Atya
2010-01-01
Biochemical network modeling and simulation is an essential task in any systems biology project. The systems biology markup language (SBML) was established as a standardized model exchange language for mechanistic models. A specific strength of SBML is that numerous tools for formulating, processing, simulating and analyzing models are freely available. Interestingly, in the field of multidisciplinary simulation, the problem of model exchange between different simulation tools arose much earlier. Several general modeling languages like Modelica were developed in the 1990s. Modelica enables an equation-based, modular specification of arbitrary hierarchical differential algebraic equation models. Moreover, libraries for special application domains can be rapidly developed. This contribution compares the reaction-based approach of SBML with the equation-based approach of Modelica and explains the specific strengths of both tools. Several biological examples illustrating essential SBML and Modelica concepts are given. The chosen criteria for tool comparison are flexibility for constraint specification, different modeling flavors, and hierarchical, modular and multidisciplinary modeling. Additionally, support for spatially distributed systems, event handling and network analysis features is discussed. As a major result, it is shown that the choice of the modeling tool has a strong impact on the expressivity of the specified models but also strongly depends on the requirements of the application context.
Towards a metadata scheme for the description of materials - the description of microstructures
NASA Astrophysics Data System (ADS)
Schmitz, Georg J.; Böttger, Bernd; Apel, Markus; Eiken, Janin; Laschet, Gottfried; Altenfeld, Ralph; Berger, Ralf; Boussinot, Guillaume; Viardin, Alexandre
2016-01-01
The property of any material is essentially determined by its microstructure. Numerical models are increasingly the focus of modern engineering as helpful tools for tailoring and optimization of custom-designed microstructures by suitable processing and alloy design. A huge variety of software tools is available to predict various microstructural aspects for different materials. In the general frame of an integrated computational materials engineering (ICME) approach, these microstructure models provide the link between models operating at the atomistic or electronic scales, and models operating on the macroscopic scale of the component and its processing. In view of an improved interoperability of all these different tools it is highly desirable to establish a standardized nomenclature and methodology for the exchange of microstructure data. The scope of this article is to provide a comprehensive system of metadata descriptors for the description of a 3D microstructure. The presented descriptors are limited to a mere geometric description of a static microstructure and have to be complemented by further descriptors, e.g. for properties, numerical representations, kinetic data, and others in the future. Further attributes to each descriptor, e.g. on data origin, data uncertainty, and data validity range, are being defined in ongoing work. The proposed descriptors are intended to be independent of any specific numerical representation. The descriptors defined in this article may serve as a first basis for standardization and will simplify the data exchange between different numerical models, as well as promote the integration of experimental data into numerical models of microstructures. An HDF5 template data file for a simple, three-phase Al-Cu microstructure based on the defined descriptors complements this article.
Cell chips as new tools for cell biology--results, perspectives and opportunities.
Primiceri, Elisabetta; Chiriacò, Maria Serena; Rinaldi, Ross; Maruccio, Giuseppe
2013-10-07
Cell culture technologies were initially developed as research tools for studying cell functions, but nowadays they are essential for the biotechnology industry, with rapidly expanding applications requiring more and more advancements with respect to traditional tools. Miniaturization and integration of sensors and microfluidic components with cell culture techniques open the way to the development of cellomics as a new field of research targeting innovative analytic platforms for high-throughput studies. This approach enables advanced cell studies under controllable conditions by providing inexpensive, easy-to-operate devices. Thanks to their numerous advantages, cell chips have become a hotspot in the biosensors and bioelectronics fields and have been applied in very different areas. In this review, exemplary applications are discussed: cell counting and detection, cytotoxicity assays, migration assays and stem cell studies.
From hacking the human genome to editing organs.
Tobita, Takamasa; Guzman-Lepe, Jorge; Collin de l'Hortet, Alexandra
2015-01-01
In recent decades, human genome engineering has been one of the most interesting research subjects, essentially because it raises new possibilities for personalized medicine and biotechnologies. With the development of engineered nucleases such as Zinc Finger Nucleases (ZFNs), Transcription Activator-Like Effector Nucleases (TALENs) and, more recently, Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR), the field of human genome editing has evolved very rapidly. Every new genetic tool is broadening the scope of applications on human tissues, even before we can completely master each of these tools. In this review, we present the recent advances regarding human genome editing tools, discuss the numerous implications they have in research and medicine, and mention the limits and concerns about such technologies.
Numerical study of read scheme in one-selector one-resistor crossbar array
NASA Astrophysics Data System (ADS)
Kim, Sungho; Kim, Hee-Dong; Choi, Sung-Jin
2015-12-01
A comprehensive numerical circuit analysis of read schemes for a one-selector one-resistor (1S1R) crossbar array is carried out. Three schemes (the ground, V/2, and V/3 schemes) are compared with each other in terms of sensing margin and power consumption. Without the aid of a complex analytical approach or SPICE-based simulation, a simple numerical iteration method is developed to simulate the entire current flows and node voltages within a crossbar array. Understanding such phenomena is essential for successfully evaluating the electrical specifications of selectors for suppressing intrinsic drawbacks of crossbar arrays, such as sneak current paths and series line resistance problems. This method provides a quantitative tool for the accurate analysis of crossbar arrays and provides guidelines for developing an optimal read scheme, array configuration, and selector device specifications.
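The flavor of such a node-voltage iteration can be sketched for a toy linear crossbar. This is an assumption-laden simplification, not the paper's method: selector nonlinearity and line resistance are ignored, so each bitline equation actually decouples; with line resistance included, wordline and bitline node voltages couple and the iteration becomes genuinely necessary.

```python
def solve_bitlines(g, v_word, g_sense, n_iter=200):
    """Gauss-Seidel-style iteration for bitline voltages.

    g       -- cell conductances, g[i][j] between wordline i and bitline j
    v_word  -- driven wordline voltages (set per the chosen read scheme)
    g_sense -- sense conductance from each bitline to 0 V
    """
    rows, cols = len(g), len(g[0])
    v_bit = [0.0] * cols
    for _ in range(n_iter):
        for j in range(cols):
            # KCL at bitline j: sum_i g[i][j]*(v_word[i] - v_bit[j]) = g_sense*v_bit[j]
            num = sum(g[i][j] * v_word[i] for i in range(rows))
            den = sum(g[i][j] for i in range(rows)) + g_sense
            v_bit[j] = num / den
    return v_bit

# 2x2 array, V/2 scheme reading cell (0, 0): selected wordline at V, the other at V/2.
V = 1.0
g = [[1e-3, 1e-6],
     [1e-6, 1e-6]]                          # cell (0, 0) is in its low-resistance state
v_word = [V, V / 2]
v_bit = solve_bitlines(g, v_word, g_sense=1e-3)
i_read = (v_word[0] - v_bit[0]) * g[0][0]   # current through the selected cell
```

The sensing margin is then the difference in `i_read` between the low- and high-resistance states of the selected cell, which is what the three bias schemes trade off against power.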
Impact of tool wear on cross wedge rolling process stability and on product quality
NASA Astrophysics Data System (ADS)
Gutierrez, Catalina; Langlois, Laurent; Baudouin, Cyrille; Bigot, Régis; Fremeaux, Eric
2017-10-01
Cross wedge rolling (CWR) is a metal forming process used in the automotive industry. One of its applications is in the manufacturing process of connecting rods. CWR transforms a cylindrical billet into a complex axisymmetrical shape with an accurate distribution of material. This preform is forged into shape in a forging die. In order to improve CWR tool lifecycle and product quality, it is essential to understand tool wear evolution and the physical phenomena of the CWR process that change as the tool geometry evolves under wear. Numerical simulations are necessary to understand CWR tool wear behavior; nevertheless, if the simulations are performed with the CAD geometry of the tool, the results are limited. To address this difficulty, two numerical simulations with FORGE® were performed using the real geometry of the tools (both upper and lower rolls) at two different states: (1) before starting the lifecycle and (2) at the end of the lifecycle. The tools were measured in 3D with a GOM® ATOS Triple Scan system using optical 3D measuring techniques. The result was a high-resolution point cloud of the entire geometry of each tool. Each 3D point cloud was digitized and converted into STL format, which served as input for the 3D simulations. Both simulations were compared, and the product defects obtained in simulation were compared to the main defects of products found industrially. The two main defects are: (a) surface defects on the preform that are not fixed in the die forging operation; and (b) a bent (no longer straight) preform, with two possible impacts: either the robot cannot grab it to take it to the forging stage, or a section remains unfilled in the forging operation.
A new digitized reverse correction method for hypoid gears based on a one-dimensional probe
NASA Astrophysics Data System (ADS)
Li, Tianxing; Li, Jubo; Deng, Xiaozhong; Yang, Jianjun; Li, Genggeng; Ma, Wensuo
2017-12-01
In order to improve the tooth surface geometric accuracy and transmission quality of hypoid gears, a new digitized reverse correction method is proposed based on the measurement data from a one-dimensional probe. The minimization of tooth surface geometrical deviations is realized from the perspective of mathematical analysis and reverse engineering. Combining the analysis of complex tooth surface generation principles and the measurement mechanism of one-dimensional probes, the mathematical relationship between the theoretical designed tooth surface, the actual machined tooth surface and the deviation tooth surface is established, the mapping relation between machine-tool settings and tooth surface deviations is derived, and the essential connection between the accurate calculation of tooth surface deviations and the reverse correction method of machine-tool settings is revealed. Furthermore, a reverse correction model of machine-tool settings is built, a reverse correction strategy is planned, and the minimization of tooth surface deviations is achieved by means of the method of numerical iterative reverse solution. On this basis, a digitized reverse correction system for hypoid gears is developed by the organic combination of numerical control generation, accurate measurement, computer numerical processing, and digitized correction. Finally, the correctness and practicability of the digitized reverse correction method are proved through a reverse correction experiment. The experimental results show that the tooth surface geometric deviations meet the engineering requirements after two trial cuts and one correction.
A Computational Framework for Efficient Low Temperature Plasma Simulations
NASA Astrophysics Data System (ADS)
Verma, Abhishek Kumar; Venkattraman, Ayyaswamy
2016-10-01
Over the past years, scientific computing has emerged as an essential tool for the investigation and prediction of low temperature plasma (LTP) applications, which include electronics, nanomaterial synthesis, metamaterials, etc. To further explore LTP behavior with greater fidelity, we present a computational toolbox developed to perform LTP simulations. This framework allows us to enhance our understanding of multiscale plasma phenomena using high performance computing tools, mainly based on the OpenFOAM FVM distribution. Although aimed at microplasma simulations, the modular framework is able to perform multiscale, multiphysics simulations of physical systems comprising LTPs. Salient introductory features include the capability to perform parallel, 3D simulations of LTP applications on unstructured meshes. The performance of the solver is tested through numerical results assessing the accuracy and efficiency of benchmarks for problems in microdischarge devices. Numerical simulation of a microplasma reactor at atmospheric pressure with hemispherical dielectric-coated electrodes is discussed, providing an overview of the applicability and future scope of this framework.
ExGUtils: A Python Package for Statistical Analysis With the ex-Gaussian Probability Density.
Moret-Tatay, Carmen; Gamermann, Daniel; Navarro-Pardo, Esperanza; Fernández de Córdoba Castellá, Pedro
2018-01-01
The study of reaction times and their underlying cognitive processes is an important field in Psychology. Reaction times are often modeled through the ex-Gaussian distribution, because it provides a good fit to multiple empirical data. The complexity of this distribution makes the use of computational tools an essential element. Therefore, there is a strong need for efficient and versatile computational tools for research in this area. In this manuscript we discuss some mathematical details of the ex-Gaussian distribution and apply the ExGUtils package, a set of functions and numerical tools programmed in Python and developed for the numerical analysis of data involving the ex-Gaussian probability density. In order to validate the package, we present an extensive analysis of fits obtained with it, discuss advantages and differences between the least squares and maximum likelihood methods, and quantitatively evaluate the goodness of the obtained fits (a point usually overlooked in most of the literature in the area). The analysis allows one to identify outliers in the empirical datasets and to determine rigorously whether there is a need for data trimming and, if so, at which points it should be done.
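As a hedged illustration of the distribution discussed above, using standard SciPy rather than the ExGUtils API: the ex-Gaussian is the sum of a Gaussian and an independent exponential, and SciPy implements it as `exponnorm` with shape K = tau/sigma, loc = mu, scale = sigma. The parameter values below are made-up but plausible reaction-time numbers.

```python
import numpy as np
from scipy import stats

# Simulate reaction times (ms): Gaussian component plus exponential tail.
rng = np.random.default_rng(0)
mu, sigma, tau = 400.0, 40.0, 120.0
rts = rng.normal(mu, sigma, 5000) + rng.exponential(tau, 5000)

# Maximum-likelihood fit; exponnorm's shape parameter is K = tau / sigma.
K, loc, scale = stats.exponnorm.fit(rts)
tau_hat = K * scale          # recover the exponential time constant from the fit
```

The recovered `loc`, `scale`, and `tau_hat` should land close to the generating mu, sigma, and tau, which is exactly the kind of parameter-recovery check the paper uses to validate its fitting routines.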
Application of multi-grid method on the simulation of incremental forging processes
NASA Astrophysics Data System (ADS)
Ramadan, Mohamad; Khaled, Mahmoud; Fourment, Lionel
2016-10-01
Numerical simulation has become essential in manufacturing large parts by incremental forging processes. It is a splendid tool for revealing physical phenomena, but behind the scenes an expensive bill must be paid: computational time. That is why many techniques have been developed to decrease the computational time of numerical simulation. The Multi-Grid method is a numerical procedure that reduces the computational time of a numerical calculation by performing the resolution of the system of equations on several meshes of decreasing size, which allows both the low- and high-frequency components of the solution to be smoothed faster. In this paper, a Multi-Grid method is applied to the cogging process in the software Forge 3. The study is carried out with an increasing number of degrees of freedom. The results show that the calculation time is halved for a mesh of 39,000 nodes. The method is promising, especially if coupled with the Multi-Mesh method.
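The multi-grid idea of combining fine-mesh smoothing with coarse-mesh correction can be sketched generically (for a 1D Poisson problem, not Forge 3's forging solver): smooth on the fine grid to damp high-frequency error, restrict the residual to a coarse grid where low-frequency error is cheap to resolve, prolong the correction back, and smooth again.

```python
import numpy as np

def jacobi(u, f, h, sweeps, w=2/3):
    # Weighted-Jacobi smoothing for -u'' = f with zero Dirichlet boundaries.
    for _ in range(sweeps):
        u[1:-1] = (1 - w) * u[1:-1] + w * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1])
    return u

def residual(u, f, h):
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / (h * h)
    return r

def two_grid_cycle(u, f, h):
    u = jacobi(u, f, h, 3)                         # pre-smooth: kills high frequencies
    r = residual(u, f, h)
    nc = (len(u) - 1) // 2 + 1
    rc = np.zeros(nc)                              # restrict residual (full weighting)
    rc[1:-1] = 0.25 * r[1:-2:2] + 0.5 * r[2:-1:2] + 0.25 * r[3::2]
    hc = 2 * h                                     # coarse grid handles low frequencies
    A = (np.diag(2 * np.ones(nc - 2)) - np.diag(np.ones(nc - 3), 1)
         - np.diag(np.ones(nc - 3), -1)) / (hc * hc)
    ec = np.zeros(nc)
    ec[1:-1] = np.linalg.solve(A, rc[1:-1])        # cheap direct solve on the coarse system
    e = np.zeros_like(u)                           # prolong correction (linear interpolation)
    e[::2] = ec
    e[1::2] = 0.5 * (ec[:-1] + ec[1:])
    return jacobi(u + e, f, h, 3)                  # post-smooth

n = 65
x = np.linspace(0.0, 1.0, n)
h = x[1] - x[0]
f = np.pi ** 2 * np.sin(np.pi * x)                 # -u'' = f has solution u = sin(pi x)
u = np.zeros(n)
for _ in range(5):
    u = two_grid_cycle(u, f, h)
err = np.max(np.abs(u - np.sin(np.pi * x)))
```

Recursing on the coarse solve instead of solving it directly turns this two-grid cycle into a full V-cycle; either way, each level only does work proportional to its own mesh size, which is where the reported speed-up comes from.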
Molecular Genetics of Mycobacteriophages
HATFULL, GRAHAM F.
2014-01-01
Mycobacteriophages have provided numerous essential tools for mycobacterial genetics, including delivery systems for transposons, reporter genes, and allelic exchange substrates, and components for plasmid vectors and mutagenesis. Their genetically diverse genomes also reveal insights into the broader nature of the phage population and the evolutionary mechanisms that give rise to it. The substantial advances in our understanding of the biology of mycobacteriophages including a large collection of completely sequenced genomes indicates a rich potential for further contributions in tuberculosis genetics and beyond. PMID:25328854
Quantifying uncertainty and computational complexity for pore-scale simulations
NASA Astrophysics Data System (ADS)
Chen, C.; Yuan, Z.; Wang, P.; Yang, X.; Zhenyan, L.
2016-12-01
Pore-scale simulation is an essential tool for understanding the complex physical processes in many environmental problems, from multi-phase flow in the subsurface to fuel cells. However, in practice, factors such as sample heterogeneity, data sparsity and, in general, our insufficient knowledge of the underlying process render many simulation parameters, and hence the prediction results, uncertain. Meanwhile, most pore-scale simulations (in particular, direct numerical simulation) incur high computational cost due to finely-resolved spatio-temporal scales, which further limits data and sample collection. To address those challenges, we propose a novel framework based on generalized polynomial chaos (gPC) and build a surrogate model representing the essential features of the underlying system. Specifically, we apply the framework to analyze the uncertainties of the system behavior based on a series of pore-scale numerical experiments, such as flow and reactive transport in 2D heterogeneous porous media and 3D packed beds. Compared with recent pore-scale uncertainty quantification studies using Monte Carlo techniques, our new framework requires fewer realizations and hence considerably reduces the overall computational cost, while maintaining the desired accuracy.
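The gPC surrogate idea can be sketched in one dimension, purely as an assumption-laden toy (the paper's models are far more complex): expand the model output y = g(xi), with xi standard normal, in probabilists' Hermite polynomials, fit the coefficients by least squares on a handful of model runs, and read the moments off the coefficients instead of brute-force Monte Carlo.

```python
import math
import numpy as np

def hermite_basis(xi, order):
    """Probabilists' Hermite polynomials He_0..He_order evaluated at xi."""
    H = np.zeros((len(xi), order + 1))
    H[:, 0] = 1.0
    if order >= 1:
        H[:, 1] = xi
    for n in range(1, order):
        H[:, n + 1] = xi * H[:, n] - n * H[:, n - 1]   # He_{n+1} = x He_n - n He_{n-1}
    return H

g = lambda xi: np.exp(0.3 * xi)            # stand-in for an expensive pore-scale model
rng = np.random.default_rng(1)
xi_train = rng.normal(size=50)             # 50 model runs instead of many thousands
c, *_ = np.linalg.lstsq(hermite_basis(xi_train, 6), g(xi_train), rcond=None)

# Orthogonality gives the moments directly: E[y] = c_0, Var[y] = sum_n n! c_n^2.
mean_gpc = c[0]
var_gpc = sum(math.factorial(n) * c[n] ** 2 for n in range(1, 7))
```

Here the analytic mean and variance of exp(0.3 xi) are known (log-normal moments), so the surrogate's accuracy at a fraction of the Monte Carlo cost can be checked directly.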
NASA Technical Reports Server (NTRS)
Paxson, Daniel E.; Fotia, Matthew L.; Hoke, John; Schauer, Fred
2015-01-01
A quasi-two-dimensional, computational fluid dynamic (CFD) simulation of a rotating detonation engine (RDE) is described. The simulation operates in the detonation frame of reference and utilizes a relatively coarse grid such that only the essential primary flow field structure is captured. This construction and other simplifications yield rapidly converging, steady solutions. Viscous effects and heat transfer effects are modeled using source terms. The effects of potential inlet flow reversals are modeled using boundary conditions. Results from the simulation are compared to measured data from an experimental RDE rig with a converging-diverging nozzle added. The comparison is favorable for the two operating points examined. The utility of the code as a performance optimization tool and as a diagnostic tool is discussed.
OVERVIEW OF NEUTRON MEASUREMENTS IN JET FUSION DEVICE.
Batistoni, P; Villari, R; Obryk, B; Packer, L W; Stamatelatos, I E; Popovichev, S; Colangeli, A; Colling, B; Fonnesu, N; Loreti, S; Klix, A; Klosowski, M; Malik, K; Naish, J; Pillon, M; Vasilopoulou, T; De Felice, P; Pimpinella, M; Quintieri, L
2017-10-05
The design and operation of the ITER experimental fusion reactor requires the development of neutron measurement techniques and numerical tools to derive the fusion power and the radiation field in the device and in the surrounding areas. Nuclear analyses provide essential input to the conceptual design, optimisation, engineering and safety case in ITER and power plant studies. The required radiation transport calculations are extremely challenging because of the large physical extent of the reactor plant, the complexity of the geometry, and the combination of deep penetration and streaming paths. This article reports the experimental activities which are carried out at JET to validate the neutronics measurement methods and numerical tools used in ITER and power plant design. A new deuterium-tritium campaign is proposed in 2019 at JET: the unique 14 MeV neutron yields produced will be exploited as much as possible to validate measurement techniques, codes, procedures and data currently used in ITER design, thus reducing the related uncertainties and the associated risks in machine operation.
Simulations of binary black hole mergers
NASA Astrophysics Data System (ADS)
Lovelace, Geoffrey
2017-01-01
Advanced LIGO's observations of merging binary black holes have inaugurated the era of gravitational wave astronomy. Accurate models of binary black holes and the gravitational waves they emit are helping Advanced LIGO to find as many gravitational waves as possible and to learn as much as possible about the waves' sources. These models require numerical-relativity simulations of binary black holes, because near the time when the black holes merge, all analytic approximations break down. Following breakthroughs in 2005, many research groups have built numerical-relativity codes capable of simulating binary black holes. In this talk, I will discuss current challenges in simulating binary black holes for gravitational-wave astronomy, and I will discuss the tremendous progress that has already enabled such simulations to become an essential tool for Advanced LIGO.
Numerical implementation of equations for photon motion in Kerr spacetime
NASA Astrophysics Data System (ADS)
Bursa, Michal
2017-12-01
Raytracing is one of the essential tools for accurate modeling of the spectra and variability of various astrophysical objects. It is of major importance in relativistic environments, where light is subject to a number of relativistic effects. Because the trajectories of light rays in curved spacetimes, and in Kerr spacetime in particular, are highly non-trivial, we summarize the equations governing the motion of a photon (or any other zero-rest-mass particle) and give an analytic solution of the equations that can be further used in practical computer implementations.
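For context, the equations such an implementation integrates are the separated null-geodesic equations in Boyer-Lindquist coordinates; the sketch below uses the standard Carter-constant form (notation assumed here, not necessarily that of the paper):

```latex
% Null geodesics in Kerr spacetime (Boyer-Lindquist coordinates).
% E = conserved energy, L_z = axial angular momentum, Q = Carter constant,
% a = black-hole spin, Delta = r^2 - 2Mr + a^2, Sigma = r^2 + a^2 cos^2(theta).
\Sigma^2 \left(\frac{dr}{d\lambda}\right)^2
  = \bigl[E\,(r^2 + a^2) - a L_z\bigr]^2
    - \Delta \bigl[\mathcal{Q} + (L_z - a E)^2\bigr] \equiv R(r),
\qquad
\Sigma^2 \left(\frac{d\theta}{d\lambda}\right)^2
  = \mathcal{Q} + \cos^2\theta \left(a^2 E^2 - \frac{L_z^2}{\sin^2\theta}\right)
  \equiv \Theta(\theta).
```

Turning points of a ray correspond to zeros of the potentials R(r) and Θ(θ), and the analytic solutions the abstract refers to reduce these quadratures to elliptic integrals.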
Exercise and type 2 diabetes: molecular mechanisms regulating glucose uptake in skeletal muscle
Goodyear, Laurie J.
2014-01-01
Exercise is a well-established tool to prevent and combat type 2 diabetes. Exercise improves whole body metabolic health in people with type 2 diabetes, and adaptations to skeletal muscle are essential for this improvement. An acute bout of exercise increases skeletal muscle glucose uptake, while chronic exercise training improves mitochondrial function, increases mitochondrial biogenesis, and increases the expression of glucose transporter proteins and numerous metabolic genes. This review focuses on the molecular mechanisms that mediate the effects of exercise to increase glucose uptake in skeletal muscle. PMID:25434013
Measures of fine motor skills in people with tremor disorders: appraisal and interpretation.
Norman, Kathleen E; Héroux, Martin E
2013-01-01
People with Parkinson's disease, essential tremor, or other movement disorders involving tremor have changes in fine motor skills that are among the hallmarks of these diseases. Numerous measurement tools have been created and other methods devised to measure such changes in fine motor skills. Measurement tools may focus on specific features - e.g., motor skills or dexterity, slowness in movement execution associated with parkinsonian bradykinesia, or magnitude of tremor. Less obviously, some tools may be better suited than others for specific goals such as detecting subtle dysfunction early in disease, revealing aspects of brain function affected by disease, or tracking changes expected from treatment or disease progression. The purpose of this review is to describe and appraise selected measurement tools of fine motor skills appropriate for people with tremor disorders. In this context, we consider the tools' content - i.e., what movement features they focus on. In addition, we consider how measurement tools of fine motor skills relate to measures of a person's disease state or a person's function. These considerations affect how one should select and interpret the results of these tools in laboratory and clinical contexts.
Microbial Ecology: Where are we now?
Boughner, Lisa A; Singh, Pallavi
2016-11-01
Conventional microbiological methods have been readily taken over by newer molecular techniques due to the ease of use, reproducibility, sensitivity and speed of working with nucleic acids. These tools allow high-throughput analysis of complex and diverse microbial communities, such as those in soil, freshwater, saltwater, or the microbiota living in collaboration with a host organism (plant, mouse, human, etc.). For instance, these methods have been used robustly to characterize the plant (rhizosphere), animal and human microbiomes, specifically the complex intestinal microbiota. The human body has been referred to as a superorganism, since microbial genes are more numerous than human genes and are essential to the health of the host. In this review we provide an overview of the Next Generation tools currently available to study microbial ecology, along with their limitations and advantages.
Generalized Differential Calculus and Applications to Optimization
NASA Astrophysics Data System (ADS)
Rector, Robert Blake Hayden
This thesis contains contributions in three areas: the theory of generalized calculus, numerical algorithms for operations research, and applications of optimization to problems in modern electric power systems. A geometric approach is used to advance the theory and tools used for studying generalized notions of derivatives for nonsmooth functions. These advances specifically pertain to methods for calculating subdifferentials and to expanding our understanding of a certain notion of derivative of set-valued maps, called the coderivative, in infinite dimensions. A strong understanding of the subdifferential is essential for numerical optimization algorithms, which are developed and applied to nonsmooth problems in operations research, including non-convex problems. Finally, an optimization framework is applied to solve a problem in electric power systems involving a smart solar inverter and battery storage system providing energy and ancillary services to the grid.
Hoang, Van T; Buss, Eike C; Wang, Wenwen; Hoffmann, Isabel; Raffel, Simon; Zepeda-Moreno, Abraham; Baran, Natalia; Wuchter, Patrick; Eckstein, Volker; Trumpp, Andreas; Jauch, Anna; Ho, Anthony D; Lutz, Christoph
2015-08-01
To understand the precise disease-driving mechanisms in acute myeloid leukemia (AML), comparison of patient-matched hematopoietic stem cells (HSC) and leukemia stem cells (LSC) is essential. In this analysis, we have examined the value of aldehyde dehydrogenase (ALDH) activity in combination with CD34 expression for the separation of HSC from LSC in 104 patients with de novo AML. The majority of AML patients (80 out of 104) had low percentages of cells with high ALDH activity (ALDH(+) cells; <1.9%; ALDH-rare AML), whereas 24 patients had relatively numerous ALDH(+) cells (≥1.9%; ALDH-numerous AML). In patients with ALDH-rare AML, normal HSC could be separated by their CD34(+) ALDH(+) phenotype, whereas LSC were exclusively detected among CD34(+) ALDH(-) cells. For patients with ALDH-numerous AML, the CD34(+) ALDH(+) subset consisted mainly of LSC and separation from HSC was not feasible. Functional analyses further showed that ALDH(+) cells from ALDH-numerous AML were quiescent, refractory to ARA-C treatment and capable of leukemic engraftment in a xenogeneic mouse transplantation model. Clinically, resistance to chemotherapy and poor long-term outcome were also characteristic of patients with ALDH-numerous AML, providing an additional risk-stratification tool. The difference in spectrum and relevance of ALDH activity in the putative LSC populations demonstrates not only phenotypic and genetic but also functional heterogeneity of leukemic cells, and suggests divergent roles for ALDH activity in normal HSC versus LSC. By acknowledging these differences, our study provides a new and useful tool for prospective identification of AML cases in which separation of HSC from LSC is possible. © 2014 UICC.
Microbial reporters of metal bioavailability
Magrisso, Sagi; Erel, Yigal; Belkin, Shimshon
2008-01-01
When attempting to assess the extent and the implications of environmental pollution, it is often essential to quantify not only the total concentration of the studied contaminant but also its bioavailable fraction: higher bioavailability, often correlated with increased mobility, signifies enhanced risk but may also facilitate bioremediation. Genetically engineered microorganisms, tailored to respond by a quantifiable signal to the presence of the target chemical(s), may serve as powerful tools for bioavailability assessment. This review summarizes the current knowledge on such microbial bioreporters designed to assay metal bioavailability. Numerous bacterial metal‐sensor strains have been developed over the past 15 years, displaying very high detection sensitivities for a broad spectrum of environmentally significant metal targets. These constructs are based on the use of a relatively small number of gene promoters as the sensing elements, and an even smaller selection of molecular reporter systems; they comprise a potentially useful panel of tools for simple and cost‐effective determination of the bioavailability of heavy metals in the environment, and for the quantification of the non‐bioavailable fraction of the pollutant. In spite of their inherent advantages, however, these tools have not yet been put to actual use in the evaluation of metal bioavailability in a real environmental remediation scheme. For this to happen, acceptance by regulatory authorities is essential, as is a standardization of assay conditions. PMID:21261850
Knoll, Michaela; Ciaccia, Ettore; Dekeling, René; Kvadsheim, Petter; Liddell, Kate; Gunnarsson, Stig-Lennart; Ludwig, Stefan; Nissen, Ivor; Lorenzen, Dirk; Kreimeyer, Roman; Pavan, Gianni; Meneghetti, Nello; Nordlund, Nina; Benders, Frank; van der Zwan, Timo; van Zon, Tim; Fraser, Leanne; Johansson, Torbjörn; Garmelius, Martin
2016-01-01
Within the European Defence Agency (EDA) project Protection of Marine Mammals (PoMM), a comprehensive common marine mammal database, essential for risk mitigation tools, was established. The database, built on an extensive dataset collection with a focus on areas of operational interest for European navies, consists of annual and seasonal distribution and density maps, random and systematic sightings, an encyclopedia providing knowledge on the characteristics of 126 marine mammal species, data on marine mammal protection areas, and audio information including numerous examples of various vocalizations. Special investigations of marine mammal acoustics were carried out to improve detection and classification capabilities.
Numerical analysis of thermal drilling technique on titanium sheet metal
NASA Astrophysics Data System (ADS)
Kumar, R.; Hynes, N. Rajesh Jesudoss
2018-05-01
Thermal drilling is a technique used for drilling sheet metal in various applications. It involves rotating a conical tool at high speed to drill the sheet metal, forming a hole with a bushing below the surface of the sheet. This article investigates the finite element analysis of thermal drilling of Ti6Al4V alloy sheet metal. The analysis was carried out by means of DEFORM-3D simulation software to simulate the performance characteristics of the thermal drilling technique. Because high-temperature deformation plays a major role in this technique, output performances that are difficult to measure experimentally can be successfully obtained by the finite element method. Therefore, the modeling and simulation of thermal drilling is an essential tool to predict the strain rate, stress distribution and temperature of the workpiece.
CentiServer: A Comprehensive Resource, Web-Based Application and R Package for Centrality Analysis.
Jalili, Mahdi; Salehzadeh-Yazdi, Ali; Asgari, Yazdan; Arab, Seyed Shahriar; Yaghmaie, Marjan; Ghavamzadeh, Ardeshir; Alimoghaddam, Kamran
2015-01-01
Essentiality is one of the most noteworthy questions and broadly used concepts in biology, and various disciplines are trying to address it. Centrality is a primary index and a promising method for identifying essential nodes, particularly in biological networks. The newly created CentiServer is a comprehensive online resource that provides over 110 definitions of different centrality indices, their computational methods, and algorithms in the form of an encyclopedia. In addition, CentiServer allows users to calculate 55 centralities with the help of an interactive web-based application tool and provides the numerical result as a comma-separated value (CSV) file or a mapped graphical format as a graph modeling language (GML) file. The standalone version of this application has been developed in the form of an R package. The web-based application (CentiServer) and R package (centiserve) are freely available at http://www.centiserver.org/.
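To make the centrality computations concrete, the sketch below implements two of the simplest indices (degree and closeness centrality) in plain Python over an adjacency-list graph. This is an illustrative stand-in, not CentiServer's own code, which covers 55 indices:

```python
from collections import deque

def degree_centrality(adj):
    """Degree centrality: fraction of the other n-1 nodes each node touches."""
    n = len(adj)
    return {v: len(nbrs) / (n - 1) for v, nbrs in adj.items()}

def closeness_centrality(adj):
    """Closeness centrality: (n-1) / sum of shortest-path distances."""
    n = len(adj)
    scores = {}
    for source in adj:
        dist = {source: 0}
        queue = deque([source])
        while queue:                      # unweighted BFS from source
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total = sum(dist.values())
        scores[source] = (n - 1) / total if total else 0.0
    return scores
```

For a three-node path graph a-b-c, the middle node b scores 1.0 on both indices, while the endpoints score 0.5 (degree) and 2/3 (closeness).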
Prompt and Precise Prototyping
NASA Technical Reports Server (NTRS)
2003-01-01
For Sanders Design International, Inc., of Wilton, New Hampshire, every passing second between the concept and realization of a product is essential to succeed in the rapid prototyping industry where amongst heavy competition, faster time-to-market means more business. To separate itself from its rivals, Sanders Design aligned with NASA's Marshall Space Flight Center to develop what it considers to be the most accurate rapid prototyping machine for fabrication of extremely precise tooling prototypes. The company's Rapid ToolMaker System has revolutionized production of high quality, small-to-medium sized prototype patterns and tooling molds with an exactness that surpasses that of computer numerically-controlled (CNC) machining devices. Created with funding and support from Marshall under a Small Business Innovation Research (SBIR) contract, the Rapid ToolMaker is a dual-use technology with applications in both commercial and military aerospace fields. The advanced technology provides cost savings in the design and manufacturing of automotive, electronic, and medical parts, as well as in other areas of consumer interest, such as jewelry and toys. For aerospace applications, the Rapid ToolMaker enables fabrication of high-quality turbine and compressor blades for jet engines on unmanned air vehicles, aircraft, and missiles.
Steps Towards Understanding Large-scale Deformation of Gas Hydrate-bearing Sediments
NASA Astrophysics Data System (ADS)
Gupta, S.; Deusner, C.; Haeckel, M.; Kossel, E.
2016-12-01
Marine sediments bearing gas hydrates are typically characterized by heterogeneity in the gas hydrate distribution and anisotropy in the sediment-gas hydrate fabric properties. Gas hydrates also contribute to the strength and stiffness of the marine sediment, and any disturbance in the thermodynamic stability of the gas hydrates is likely to affect the geomechanical stability of the sediment. Understanding mechanisms and triggers of large-strain deformation and failure of marine gas hydrate-bearing sediments is an area of extensive research, particularly in the context of marine slope-stability and industrial gas production. The ultimate objective is to predict severe deformation events such as regional-scale slope failure or excessive sand production by using numerical simulation tools. The development of such tools essentially requires a careful analysis of thermo-hydro-chemo-mechanical behavior of gas hydrate-bearing sediments at lab-scale, and its stepwise integration into reservoir-scale simulators through definition of effective variables, use of suitable constitutive relations, and application of scaling laws. One of the focus areas of our research is to understand the bulk coupled behavior of marine gas hydrate systems with contributions from micro-scale characteristics, transport-reaction dynamics, and structural heterogeneity through experimental flow-through studies using high-pressure triaxial test systems and advanced tomographical tools (CT, ERT, MRI). We combine these studies to develop mathematical model and numerical simulation tools which could be used to predict the coupled hydro-geomechanical behavior of marine gas hydrate reservoirs in a large-strain framework. Here we will present some of our recent results from closely co-ordinated experimental and numerical simulation studies with an objective to capture the large-deformation behavior relevant to different gas production scenarios. 
We will also report on a variety of mechanically relevant test scenarios focusing on effects of dynamic changes in gas hydrate saturation, highly uneven gas hydrate distributions, focused fluid migration and gas hydrate production through depressurization and CO2 injection.
Tsunami-induced boulder transport - combining physical experiments and numerical modelling
NASA Astrophysics Data System (ADS)
Oetjen, Jan; Engel, Max; May, Simon Matthias; Schüttrumpf, Holger; Brueckner, Helmut; Prasad Pudasaini, Shiva
2016-04-01
Coasts are crucial areas for living, economy, recreation, transportation, and various sectors of industry. Many of them are exposed to high-energy wave events. With regard to the ongoing population growth in low-elevation coastal areas, the urgent need for developing suitable management measures, especially for hazards like tsunamis, becomes obvious. These measures require supporting tools which allow an exact estimation of impact parameters like inundation height, inundation area, and wave energy. Focussing on tsunamis, geological archives can provide essential information on frequency and magnitude on a longer time scale in order to support coastal hazard management. While fine-grained deposits may quickly be altered after deposition, multi-ton coarse clasts (boulders) may represent an information source on past tsunami events with a much higher preservation potential. Applying numerical hydrodynamic coupled boulder transport models (BTM) is a commonly used approach to analyse characteristics (e.g. wave height, flow velocity) of the corresponding tsunami. Correct computations of tsunamis and the induced boulder transport can provide essential event-specific information, including wave heights, runup and direction. Although several valuable numerical models for tsunami-induced boulder transport exist (e. g. Goto et al., 2007; Imamura et al., 2008), some important basic aspects of both tsunami hydrodynamics and corresponding boulder transport have not yet been entirely understood. Therefore, our project aims at these questions in four crucial aspects of boulder transport by a tsunami: (i) influence of sediment load, (ii) influence of complex boulder shapes other than idealized rectangular shapes, (iii) momentum transfers between multiple boulders, and (iv) influence of non-uniform bathymetries and topographies both on tsunami and boulder. 
The investigation of these aspects in physical experiments and the correct implementation of an advanced model is an urgent need, since these aspects have been largely neglected. In order to tackle these gaps, we develop a novel BTM in two steps. First, scaled physical experiments are performed to determine the exact hydrodynamic processes within a tsunami during boulder transport. Furthermore, the experiments are the basis for calibrating the numerical BTM. The BTM is based on the numerical two-phase mass flow model of Pudasaini (2012), which employs an advanced and unified high-resolution computational tool for mixtures consisting of solid and fluid components and their interactions. This allows for the motion of the boulder while interacting with the particle-laden tsunami on the inundated coastal plain as a function of the total fluid and solid stresses. Our approach leads to fundamentally new insights into the essential physical processes in BTM. Goto, K., Chavanich, S. A., Imamura, F., Kunthasap, P., Matsui, T., Minoura, K., Sugawara, D. and Yanagisawa, H.: Distribution, origin and transport process of boulders deposited by the 2004 Indian Ocean tsunami at Pakarang Cape, Thailand. Sediment. Geol., 202, 821-837, 2007. Imamura, F., Goto, K. and Ohkubo, S.: A numerical model of the transport of a boulder by tsunami. J. Geophys. Res. Oceans, 113, C01008, 2008. Pudasaini, S. P.: A general two-phase debris flow model. J. Geophys. Res. Earth Surf., 117, F03010, 2012.
Seita, Matteo; Volpi, Marco; Patala, Srikanth; ...
2016-06-24
Grain boundaries (GBs) govern many properties of polycrystalline materials. However, because of their structural variability, our knowledge of GB constitutive relations is still very limited. We present a novel method to characterise the complete crystallography of individual GBs non-destructively, with high-throughput, and using commercially available tools. This method combines electron diffraction, optical reflectance and numerical image analysis to determine all five crystallographic parameters of numerous GBs in samples with through-thickness grains. We demonstrate the technique by measuring the crystallographic character of about 1,000 individual GBs in aluminum in a single run. Our method enables cost- and time-effective assembly of crystallography–property databases for thousands of individual GBs. Furthermore, such databases are essential for identifying GB constitutive relations and for predicting GB-related behaviours of polycrystalline solids.
Interagency Transition Team Development and Facilitation. Essential Tools.
ERIC Educational Resources Information Center
Stodden, Robert A.; Brown, Steven E.; Galloway, L. M.; Mrazek, Susan; Noy, Liora
2005-01-01
The purpose of this Essential Tool is to assist state-level transition coordinators and others responsible for forming, conducting, and evaluating the performance of interagency transition teams that are focused upon the school and post-school needs of youth with disabilities. This Essential Tool is designed to guide the coordination efforts of…
Nguyen, Huu-Tho; Md Dawal, Siti Zawiah; Nukman, Yusoff; Aoyama, Hideki; Case, Keith
2015-01-01
Globalization of business and competitiveness in manufacturing have forced companies to improve their manufacturing facilities to respond to market requirements. Machine tool evaluation involves an essential decision based on imprecise and vague information, and plays a major role in improving productivity and flexibility in manufacturing. The aim of this study is to present an integrated approach for decision-making in machine tool selection. This paper focuses on the integration of a consistent fuzzy AHP (Analytic Hierarchy Process) and a fuzzy COmplex PRoportional ASsessment (COPRAS) for multi-attribute decision-making in selecting the most suitable machine tool. In this method, the fuzzy linguistic reference relation is integrated into AHP to handle the imprecise and vague information, and to simplify the data collection for the pair-wise comparison matrix of the AHP, which determines the weights of attributes. The output of the fuzzy AHP is imported into the fuzzy COPRAS method for ranking alternatives through the closeness coefficient. The application of the proposed model is demonstrated by a numerical example based on data collected by questionnaire and from the literature. The results highlight the integration of the improved fuzzy AHP and the fuzzy COPRAS as a precise and effective multi-attribute decision-making tool for evaluating machine tools in an uncertain environment.
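As an illustration of the COPRAS ranking stage, the crisp (non-fuzzy) variant can be sketched in a few lines; the decision matrix, weights and benefit/cost flags below are hypothetical, and the fuzzy AHP stage is assumed to have already produced the weights:

```python
def copras_rank(matrix, weights, benefit):
    """Crisp COPRAS sketch: column-normalize the decision matrix, apply
    weights, sum benefit (S+) and cost (S-) components per alternative,
    then compute the relative significance Q (higher Q = better)."""
    m, n = len(matrix), len(matrix[0])
    col_sums = [sum(row[j] for row in matrix) for j in range(n)]
    d = [[weights[j] * matrix[i][j] / col_sums[j] for j in range(n)]
         for i in range(m)]
    s_plus = [sum(d[i][j] for j in range(n) if benefit[j]) for i in range(m)]
    s_minus = [sum(d[i][j] for j in range(n) if not benefit[j]) for i in range(m)]
    if any(s_minus):
        total = sum(s_minus)
        inv = sum(1.0 / s for s in s_minus)
        q = [s_plus[i] + total / (s_minus[i] * inv) for i in range(m)]
    else:
        q = s_plus[:]          # no cost criteria: Q reduces to the benefit sum
    ranking = sorted(range(m), key=lambda i: q[i], reverse=True)
    return q, ranking
```

For two machine tools rated on one benefit criterion (e.g., capability) and one cost criterion (e.g., price) with equal weights, the cheaper alternative can rank first even with the lower capability score, since COPRAS rewards small weighted cost sums.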
NASA Astrophysics Data System (ADS)
Roubinet, D.; Linde, N.; Jougnot, D.; Irving, J.
2016-05-01
Numerous field experiments suggest that the self-potential (SP) geophysical method may allow for the detection of hydraulically active fractures and provide information about fracture properties. However, a lack of suitable numerical tools for modeling streaming potentials in fractured media prevents quantitative interpretation and limits our understanding of how the SP method can be used in this regard. To address this issue, we present a highly efficient two-dimensional discrete-dual-porosity approach for solving the fluid flow and associated self-potential problems in fractured rock. Our approach is specifically designed for complex fracture networks that cannot be investigated using standard numerical methods. We then simulate SP signals associated with pumping conditions for a number of examples to show that (i) accounting for matrix fluid flow is essential for accurate SP modeling and (ii) the sensitivity of SP to hydraulically active fractures is intimately linked with fracture-matrix fluid interactions. This implies that fractures associated with strong SP amplitudes are likely to be hydraulically conductive, attracting fluid flow from the surrounding matrix.
Modeling tidal hydrodynamics of San Diego Bay, California
Wang, P.-F.; Cheng, R.T.; Richter, K.; Gross, E.S.; Sutton, D.; Gartner, J.W.
1998-01-01
In 1983, current data were collected by the National Oceanic and Atmospheric Administration using mechanical current meters. During 1992 through 1996, acoustic Doppler current profilers as well as mechanical current meters and tide gauges were used. These measurements not only document tides and tidal currents in San Diego Bay, but also provide independent data sets for model calibration and verification. A high-resolution (100-m grid), depth-averaged, numerical hydrodynamic model has been implemented for San Diego Bay to describe essential tidal hydrodynamic processes in the bay. The model is calibrated using the 1983 data set and verified using the more recent 1992-1996 data. Discrepancies between model predictions and field data in both model calibration and verification are of the same order of magnitude as the uncertainties in the field data. The calibrated and verified numerical model has been used to quantify residence time and dilution and flushing of contaminant effluent into San Diego Bay. Furthermore, the numerical model has become an important research tool in ongoing hydrodynamic and water quality studies and in guiding future field data collection programs.
New insight in spiral drawing analysis methods - Application to action tremor quantification.
Legrand, André Pierre; Rivals, Isabelle; Richard, Aliénor; Apartis, Emmanuelle; Roze, Emmanuel; Vidailhet, Marie; Meunier, Sabine; Hainque, Elodie
2017-10-01
Spiral drawing is one of the standard tests used to assess tremor severity in the clinical evaluation of medical treatments. Tremor severity is estimated through visual rating of the drawings by movement disorders experts. Different approaches based on mathematical signal analysis of the recorded spiral drawings have been proposed to replace this rater-dependent estimate. The objective of the present study is to propose new numerical methods and to evaluate them in terms of agreement with visual rating and reproducibility. Series of spiral drawings of patients with essential tremor were visually rated by a board of experts. In addition to the usual velocity analysis, three new numerical methods were tested and compared, namely static and dynamic unraveling, and empirical mode decomposition. The reproducibility of both visual and numerical ratings was estimated, and their agreement was evaluated. The statistical analysis demonstrated excellent agreement between visual and numerical ratings, and more reproducible results with numerical methods than with visual ratings. The velocity method and the new numerical methods are in good agreement. Among the latter, static and dynamic unraveling both display a smaller dispersion and are easier to analyse automatically. The reliable scores obtained through the proposed numerical methods suggest that their implementation on a digitized tablet, be it connected to a computer or standalone, provides an efficient automatic tool for tremor severity assessment. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
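The "static unraveling" idea lends itself to a compact numerical sketch: map the drawn spiral to radius versus cumulative turning angle, remove the smooth spiral trend, and score the residual. The NumPy code below is an illustrative reconstruction under that assumption, not the authors' implementation:

```python
import numpy as np

def unravel_spiral(x, y):
    """Static unraveling sketch: express a drawn spiral as radius vs.
    cumulative angle, then subtract the linear (ideal Archimedean)
    trend so that tremor shows up as the residual."""
    theta = np.unwrap(np.arctan2(y, x))       # cumulative turning angle
    r = np.hypot(x, y)
    trend = np.polyval(np.polyfit(theta, r, 1), theta)
    return theta, r - trend

def tremor_score(x, y):
    """RMS of the unraveled residual as a simple severity index."""
    _, residual = unravel_spiral(x, y)
    return float(np.sqrt(np.mean(residual ** 2)))
```

On a synthetic Archimedean spiral the score is essentially zero, while superimposing a small oscillation on the radius raises it roughly in proportion to the tremor amplitude.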
Essential oils: from extraction to encapsulation.
El Asbahani, A; Miladi, K; Badri, W; Sala, M; Aït Addi, E H; Casabianca, H; El Mousadik, A; Hartmann, D; Jilale, A; Renaud, F N R; Elaissari, A
2015-04-10
Essential oils are natural products with many interesting applications. Extraction of essential oils from plants is performed by classical and innovative methods. Numerous encapsulation processes have been developed and reported in the literature to encapsulate biomolecules, active molecules, nanocrystals, oils and also essential oils for various applications such as in vitro diagnosis, therapy, cosmetics, textiles, food, etc. Encapsulation of essential oils has led to numerous new formulations with new applications; it ensures the protection of the fragile oil and controlled release. The most commonly prepared carriers are polymer particles, liposomes and solid lipid nanoparticles. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Yoshihara, H.
1978-01-01
The problems of designing the wing-fuselage configuration of an advanced transonic commercial airliner and of optimizing a supercruiser fighter are sketched, pointing out the essential fluid mechanical phenomena that play an important role. Such problems suggest that, for a numerical method to be useful, it must be able to treat highly three-dimensional turbulent separations, flows with jet engine exhausts, and complex vehicle configurations. Weaknesses of the two principal tools of the aerodynamicist, the wind tunnel and the computer, suggest a complementary, combined use of these tools, which is illustrated by the case of the transonic wing-fuselage design. The anticipated difficulties in developing an adequate turbulent transport model suggest that such an approach may have to suffice for an extended period. Over the longer term, experimentation on turbulent transport in meaningful cases must be intensified to provide a data base for both modeling and theory validation purposes.
Andreeva, Antonina
2016-06-15
The Structural Classification of Proteins (SCOP) database has facilitated the development of many tools and algorithms and has been used successfully in protein structure prediction and large-scale genome annotation. During the development of SCOP, numerous exceptions to topological rules were found, along with complex evolutionary scenarios and peculiarities of proteins, including the ability to fold into alternative structures. This article reviews cases of structural variation observed for individual proteins and among groups of homologues, knowledge of which is essential for protein structure modelling. © 2016 The Author(s). Published by Portland Press Limited on behalf of the Biochemical Society.
NASA Astrophysics Data System (ADS)
Todesco, Micol; Neri, Augusto; Demaria, Cristina; Marmo, Costantino; Macedonio, Giovanni
2006-07-01
Dissemination of scientific results to the general public has become increasingly important in our society. When science deals with natural hazards, public outreach matters even more: on the one hand, it contributes to hazard perception and is a necessary step toward preparedness and risk mitigation; on the other hand, it helps establish a positive link of mutual confidence between the scientific community and the population living at risk. Such a link plays a relevant role in hazard communication, which in turn is essential to mitigating the risk. In this work, we present a tool that we have developed to illustrate our scientific results on pyroclastic flow propagation at Vesuvius. This tool, a CD-ROM developed by joining scientific data with appropriate expertise in communication sciences, is meant as a first prototype for testing the validity of this approach to public outreach. The multimedia guide contains figures, images of real volcanoes, and computer animations obtained through numerical modeling of pyroclastic density currents. Explanatory text, kept as short and simple as possible, illustrates both the process and the methodology applied to study this very dangerous natural phenomenon. In this first version, the CD-ROM will be distributed among selected categories of end users together with a short questionnaire drawn up to test its readability. Future releases will incorporate feedback from the users, further advances in the scientific results, and a higher degree of interactivity.
On contact modelling in isogeometric analysis
NASA Astrophysics Data System (ADS)
Cardoso, R. P. R.; Adetoro, O. B.
2017-11-01
IsoGeometric Analysis (IGA) has proved to be a reliable numerical tool for the simulation of structural behaviour and fluid mechanics. The main reasons for this popularity are essentially: (i) the possibility of using higher order polynomials for the basis functions; (ii) the high convergence rates that can be achieved; (iii) the ability to operate directly on CAD geometry without resorting to a mesh of elements. The major drawback of IGA is the non-interpolatory character of the basis functions, which complicates the handling of essential boundary conditions and makes contact analysis particularly challenging. In this work, IGA is extended with frictionless contact procedures for sheet metal forming analyses. Non-Uniform Rational B-Splines (NURBS) are used to model both the rigid tools and the deformable blank sheet. The contact methods developed are based on a two-step contact search scheme: in the first step, a global search algorithm allocates contact knots to potential contact faces; in the second, local step, point inversion techniques are used to calculate the contact penetration gap. For completeness, elastoplastic procedures are also included for a proper description of the entire IGA of sheet metal forming processes.
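The NURBS machinery behind both the geometry description and the contact search can be illustrated with a minimal, self-contained sketch. This is not the authors' code: it is the textbook Cox-de Boor recursion and rational curve-point evaluation, with illustrative function names.

```python
def bspline_basis(i, p, u, knots):
    """Cox-de Boor recursion: value of the i-th B-spline basis of degree p at u."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = 0.0
    if knots[i + p] != knots[i]:
        left = (u - knots[i]) / (knots[i + p] - knots[i]) * bspline_basis(i, p - 1, u, knots)
    right = 0.0
    if knots[i + p + 1] != knots[i + 1]:
        right = ((knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1])
                 * bspline_basis(i + 1, p - 1, u, knots))
    return left + right

def nurbs_point(ctrl, wts, p, knots, u):
    """Evaluate a NURBS curve point: weighted basis combination divided by
    the rational denominator (works for interior parameters u < last knot)."""
    num = [0.0] * len(ctrl[0])
    den = 0.0
    for i, (pt, w) in enumerate(zip(ctrl, wts)):
        n = bspline_basis(i, p, u, knots) * w
        den += n
        num = [a + n * b for a, b in zip(num, pt)]
    return [a / den for a in num]
```

For a clamped quadratic curve with unit weights, `nurbs_point` reproduces the familiar Bézier behaviour: it interpolates the first control point at u = 0 and passes through the midpoint of the control polygon's "tent" at u = 0.5.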
Galindo, F; de Aluja, A; Cagigas, R; Huerta, L A; Tadich, T A
2018-01-01
Equids are still used for diverse chores in Mexico and are essential for the livelihoods of numerous families. Appropriate health and behavior are prerequisites for performing work without compromising welfare. This study aimed to assess the welfare of working equids in Tuliman, applying the hands-on donkey tool. This tool evaluates five dimensions (behavior, body condition score [BCS], wounds, lameness, and other health issues) and was applied to 438 working equids (horses, mules, and donkeys). The Kruskal-Wallis test was applied to investigate differences between species and sexes. Donkeys were more common; they also presented more positive behaviors and less lameness (p < 0.05). No differences were found in BCS among species on a scale ranging from 1 to 5 (mean BCS for donkeys = 1.9; mules = 2; and horses = 1.8). Mares had significantly lower BCS (mean = 1.5) than stallions (p < 0.05) and geldings (mean = 1.9). Overall, mules had better welfare evaluations. The tool allowed detection of welfare issues in working equids; a practical outcome would be implementing local welfare strategies according to its results.
NASA Astrophysics Data System (ADS)
Konnik, Mikhail V.; Welsh, James
2012-09-01
Numerical simulators for adaptive optics systems have become an essential tool for the research and development of future advanced astronomical instruments. However, the growing code base of a numerical simulator makes the code itself increasingly difficult to maintain. Inadequate documentation of astronomical software for adaptive optics simulators can hinder development, since the documentation must contain up-to-date schemes and the mathematical descriptions implemented in the code. Although most modern programming environments such as MATLAB or Octave have built-in documentation facilities, these are often insufficient for describing a typical adaptive optics simulator code. This paper describes a general cross-platform framework for documenting scientific software using open-source tools such as LaTeX, Mercurial, Doxygen, and Perl. Using a Perl script that translates MATLAB M-file comments into C-like syntax, Doxygen can generate and update the documentation for the scientific source code. The documentation generated by this framework contains the current code description with mathematical formulas, images, and bibliographical references. A detailed description of the framework components is presented, along with guidelines for deploying the framework. Examples of code documentation for the scripts and functions of a MATLAB-based adaptive optics simulator are provided.
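The comment-translation step at the heart of this framework can be illustrated with a minimal filter. The paper uses a Perl script; the sketch below re-implements the same idea in Python (the function name and the `///` Doxygen comment convention are illustrative assumptions, and real filters typically handle docstring headers and directives as well).

```python
def matlab_comments_to_doxygen(source: str) -> str:
    """Convert MATLAB '%'-prefixed comment lines to C-style '///' lines so
    that Doxygen (pointed at this filter via INPUT_FILTER/FILTER_PATTERNS)
    can parse them. Code lines, including trailing inline comments, pass
    through unchanged."""
    out = []
    for line in source.splitlines():
        stripped = line.lstrip()
        if stripped.startswith("%"):
            indent = line[: len(line) - len(stripped)]
            out.append(indent + "///" + stripped[1:])
        else:
            out.append(line)
    return "\n".join(out)
```

A filter like this is registered in the Doxyfile so that Doxygen invokes it on every M-file before parsing; the M-files themselves remain valid MATLAB.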
NASA Astrophysics Data System (ADS)
Jiménez-Forteza, Xisco; Keitel, David; Husa, Sascha; Hannam, Mark; Khan, Sebastian; Pürrer, Michael
2017-03-01
Numerical relativity is an essential tool in studying the coalescence of binary black holes (BBHs). It is still computationally prohibitive to cover the BBH parameter space exhaustively, making phenomenological fitting formulas for BBH waveforms and final-state properties important for practical applications. We describe a general hierarchical bottom-up fitting methodology to design and calibrate fits to numerical relativity simulations for the three-dimensional parameter space of quasicircular nonprecessing merging BBHs, spanned by mass ratio and by the individual spin components orthogonal to the orbital plane. Particular attention is paid to incorporating the extreme-mass-ratio limit and to the subdominant unequal-spin effects. As an illustration of the method, we provide two applications, to the final spin and final mass (or equivalently: radiated energy) of the remnant black hole. Fitting to 427 numerical relativity simulations, we obtain results broadly consistent with previously published fits, but improving in overall accuracy and particularly in the approach to extremal limits and for unequal-spin configurations. We also discuss the importance of data quality studies when combining simulations from diverse sources, how detailed error budgets will be necessary for further improvements of these already highly accurate fits, and how this first detailed study of unequal-spin effects helps in choosing the most informative parameters for future numerical relativity runs.
A Review of Numerical Simulation and Analytical Modeling for Medical Devices Safety in MRI
Kabil, J.; Belguerras, L.; Trattnig, S.; Pasquier, C.; Missoffe, A.
2016-01-01
Objectives: To review past and present challenges and ongoing trends in numerical simulation for MRI (Magnetic Resonance Imaging) safety evaluation of medical devices. Methods: A wide literature review of numerical and analytical simulation of simple and complex medical devices in MRI electromagnetic fields shows the evolution through time and a growing concern for MRI safety over the years. Major issues and achievements are described, as well as current trends and perspectives in this research field. Results: Numerical simulation of medical devices is constantly evolving, supported by now well-established calculation methods. Implants with simple geometry can often be simulated in a computational human model, but one remaining issue is the experimental validation of these human models. A major concern is assessing RF heating of implants too complex to be simulated directly, such as pacemaker leads; ongoing research therefore focuses on alternative hybrid methods, both numerical and experimental, for example the transfer function method. For the static and gradient fields, analytical models can be used for dimensioning simple implant shapes, but they are limited for complex geometries that cannot be studied under simplifying assumptions. Conclusions: Numerical simulation is an essential tool for MRI safety testing of medical devices. The main issues remain the accuracy of simulations compared to real life and the study of complex devices; as the research field is constantly evolving, some promising ideas are now under investigation to take up these challenges. PMID:27830244
Dwyer, Robyn; Fraser, Suzanne
2017-06-01
It is widely accepted that alcohol and other drug (AOD) consumption is profoundly gendered. Just where this gendering occurs, however, remains the subject of debate. We contend that one important and overlooked site where the gendering of substance consumption and addiction takes place is AOD research itself: in particular, the addiction screening and diagnostic tools designed to measure and track substance consumption and problems within populations. These tools establish key criteria and set numerical threshold scores for the identification of problems, and many establish or recommend separate threshold scores for women and men. Drawing on Karen Barad's concept of post-humanist performativity, in this article we examine the ways in which gender itself is materialised by these apparatuses of measurement. We focus primarily on the Drug Use Disorders Identification Test (DUDIT) as an exemplar of gendering processes that operate across addiction tools more broadly. We consider gendering processes operating through the tools' questions themselves, and we also examine the quantification and legitimation processes used in establishing gender difference and their implications for women. We find that tools rely on and reproduce narrow and marginalising assumptions about women as essentially fragile and vulnerable, and simultaneously reinforce normative expectations that women sacrifice pleasure. The seemingly objective and neutral quantification processes operating in these tools naturalise gender even as they enact it. Copyright © 2017 Elsevier B.V. All rights reserved.
Commodities Trading: An Essential Economic Tool.
ERIC Educational Resources Information Center
Welch, Mary A., Ed.
1989-01-01
This issue focuses on commodities trading as an essential economic tool. Activities include critical thinking about marketing decisions and discussion on how futures markets and options are used as important economic tools. Discussion questions and a special student project are included. (EH)
Nguyen, Huu-Tho; Md Dawal, Siti Zawiah; Nukman, Yusoff; Aoyama, Hideki; Case, Keith
2015-01-01
Globalization of business and competitiveness in manufacturing have forced companies to improve their manufacturing facilities to respond to market requirements. Machine tool evaluation involves an essential decision made with imprecise and vague information, and plays a major role in improving productivity and flexibility in manufacturing. The aim of this study is to present an integrated approach to decision-making in machine tool selection. The paper focuses on integrating a consistent fuzzy AHP (Analytic Hierarchy Process) with a fuzzy COmplex PRoportional ASsessment (COPRAS) for multi-attribute decision-making in selecting the most suitable machine tool. In this method, the fuzzy linguistic reference relation is integrated into the AHP to handle the imprecise and vague information and to simplify the data collection for the pair-wise comparison matrix, which determines the weights of the attributes. The output of the fuzzy AHP is imported into the fuzzy COPRAS method for ranking alternatives through the closeness coefficient. The application of the proposed model is demonstrated by a numerical example based on data collected by questionnaire and from the literature. The results highlight the integration of the improved fuzzy AHP and the fuzzy COPRAS as a precise tool providing effective multi-attribute decision-making for evaluating machine tools in an uncertain environment. PMID:26368541
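The crisp COPRAS ranking stage can be sketched in a few lines. The following is a generic, minimal implementation of the standard COPRAS formulas (sum normalisation, weighted sums over benefit and cost criteria, relative significance Q, utility degree), not the paper's fuzzy variant: the fuzzy AHP weighting and linguistic handling are not reproduced, and the sketch assumes at least one cost criterion with nonzero values for every alternative.

```python
def copras_rank(matrix, weights, benefit):
    """Rank alternatives with the classical COPRAS method.
    matrix: rows = alternatives, columns = criteria (raw scores);
    weights: criterion weights; benefit: True for benefit criteria,
    False for cost criteria. Returns (Q, utility%) per alternative."""
    n_alt, n_crit = len(matrix), len(weights)
    col_sums = [sum(row[j] for row in matrix) for j in range(n_crit)]
    # sum-normalised, weighted decision matrix
    d = [[weights[j] * matrix[i][j] / col_sums[j] for j in range(n_crit)]
         for i in range(n_alt)]
    s_plus = [sum(d[i][j] for j in range(n_crit) if benefit[j]) for i in range(n_alt)]
    s_minus = [sum(d[i][j] for j in range(n_crit) if not benefit[j]) for i in range(n_alt)]
    total_minus = sum(s_minus)
    inv_sum = sum(1.0 / s for s in s_minus)
    # relative significance: benefit part plus harmonically rescaled cost part
    q = [s_plus[i] + total_minus / (s_minus[i] * inv_sum) for i in range(n_alt)]
    q_max = max(q)
    utility = [100.0 * qi / q_max for qi in q]  # best alternative scores 100%
    return q, utility
```

A machine with higher scores on benefit criteria and lower scores on cost criteria receives a larger Q and a utility degree closer to 100%.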
Forensic surface metrology: tool mark evidence.
Gambino, Carol; McLaughlin, Patrick; Kuo, Loretta; Kammerman, Frani; Shenkin, Peter; Diaczuk, Peter; Petraco, Nicholas; Hamby, James; Petraco, Nicholas D K
2011-01-01
Over the last several decades, forensic examiners of impression evidence have come under scrutiny in the courtroom due to analysis methods that rely heavily on subjective morphological comparisons. Currently, there is no universally accepted system that generates numerical data to independently corroborate visual comparisons. Our research attempts to develop such a system for tool mark evidence, proposing a methodology that objectively evaluates the association of striated tool marks with the tools that generated them. In our study, 58 primer shear marks on 9 mm cartridge cases, fired from four Glock model 19 pistols, were collected using high-resolution white light confocal microscopy. The resulting three-dimensional surface topographies were filtered to extract all "waviness surfaces"-the essential "line" information that firearm and tool mark examiners view under a microscope. Extracted waviness profiles were processed with principal component analysis (PCA) for dimension reduction. Support vector machines (SVM) were used to make the profile-gun associations, and conformal prediction theory (CPT) was used to establish confidence levels. At the 95% confidence level, CPT coupled with PCA-SVM yielded an empirical error rate of 3.5%. Complementary bootstrap-based computations estimated the error rate at 0%, indicating that the error rate for the algorithmic procedure is likely to remain low on larger data sets. Finally, suggestions are made for practical courtroom application of CPT for assigning levels of confidence to SVM identifications of tool marks recorded with confocal microscopy. Copyright © 2011 Wiley Periodicals, Inc.
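To make the processing chain concrete, the sketch below performs PCA dimension reduction via SVD on synthetic "waviness profiles". The study pairs PCA with support vector machines and conformal prediction; here a nearest-centroid classifier stands in for the SVM purely for illustration, and all data, labels, and function names are hypothetical.

```python
import numpy as np

def pca_fit(X, k):
    """Fit PCA via SVD on mean-centred data; return the mean and the
    top-k principal directions (rows of vt)."""
    X = np.asarray(X, dtype=float)
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:k]

def pca_transform(X, mean, comps):
    """Project (centred) profiles onto the retained principal components."""
    return (np.asarray(X, dtype=float) - mean) @ comps.T

def nearest_centroid(scores, labels, query):
    """Toy stand-in for the study's SVM: assign the query profile to the
    tool whose class centroid is closest in PCA space."""
    labels = np.asarray(labels)
    best, best_d = None, float("inf")
    for g in set(labels.tolist()):
        d = float(np.linalg.norm(query - scores[labels == g].mean(axis=0)))
        if d < best_d:
            best, best_d = g, d
    return best
```

In the real pipeline the reduced scores would feed an SVM, and conformal prediction would wrap the classifier to attach a confidence level to each association.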
da Silva, Francilene Vieira; de Barros Fernandes, Hélio; Oliveira, Irisdalva Sousa; Viana, Ana Flávia Seraine Custódio; da Costa, Douglas Soares; Lopes, Miriam Teresa Paz; de Lira, Kamila Lopes; Quintans-Júnior, Lucindo José; de Sousa, Adriano Antunes; de Cássia Meneses Oliveira, Rita
2016-11-01
(-)-Linalool is a monoterpene constituent of many essential oils with both anti-inflammatory and antimicrobial activity; it has also been shown to be antinociceptive. However, poor chemical stability and a short half-life prevent the clinical application of (-)-linalool and many other essential oils. Important to this study, β-cyclodextrin (β-CD) has been used to increase the solubility, stability, and pharmacological effects of numerous lipophilic compounds in vivo. Here, the gastroprotective activities of (-)-linalool (LIN) and of linalool incorporated into an inclusion complex with β-cyclodextrin (LIN-βCD) were evaluated using models of acute and chronic gastric ulcers in rodents. LIN and LIN-βCD showed strong gastroprotective activity (p < 0.001). The gastroprotective effect of the LIN-βCD complex was significantly greater than that of uncomplexed LIN, suggesting that this improvement is related to increased solubility and stability. Taken together with the potentiation of this monoterpene's antioxidant profile, our results suggest that β-CD may represent an important tool for improving the gastroprotective activity of (-)-linalool and other water-insoluble compounds.
Trypanosome RNA Editing Mediator Complex proteins have distinct functions in gRNA utilization.
Simpson, Rachel M; Bruno, Andrew E; Chen, Runpu; Lott, Kaylen; Tylec, Brianna L; Bard, Jonathan E; Sun, Yijun; Buck, Michael J; Read, Laurie K
2017-07-27
Uridine insertion/deletion RNA editing is an essential process in kinetoplastid parasites whereby mitochondrial mRNAs are modified through the specific insertion and deletion of uridines to generate functional open reading frames, many of which encode components of the mitochondrial respiratory chain. The roles of numerous non-enzymatic editing factors have remained opaque given the limitations of conventional methods for interrogating the order and mechanism by which editing progresses, and thus the roles of individual proteins. Here, we examined whole populations of partially edited sequences using high-throughput sequencing and a novel bioinformatic platform, the Trypanosome RNA Editing Alignment Tool (TREAT), to elucidate the roles of three proteins in the RNA Editing Mediator Complex (REMC). We determined that the factors examined function in the progression of editing through a gRNA; however, they have distinct roles, and REMC is likely heterogeneous in composition. We provide the first evidence that editing can proceed through numerous paths within a single gRNA and that non-linear modifications are essential, generating the commonly observed junction regions. Our data support a model in which RNA editing is executed via multiple paths that necessitate successive re-modification of junction regions facilitated, in part, by the REMC variant containing TbRGG2 and MRB8180. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
Quantitative aspects of inductively coupled plasma mass spectrometry
NASA Astrophysics Data System (ADS)
Bulska, Ewa; Wagner, Barbara
2016-10-01
Accurate determination of elements in various kinds of samples is essential in many areas, including environmental science, medicine, and industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of the different calibration approaches used in quantitative analyses by ICP-MS, and provides examples of such analyses. This article is part of the themed issue 'Quantitative mass spectrometry'.
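External calibration with pure standards, the simplest approach named above, reduces to fitting a straight line through the standards' signals and inverting it for unknowns. A minimal sketch with synthetic numbers; real ICP-MS workflows add internal standardisation, blank subtraction, and drift correction, none of which are shown here.

```python
def fit_calibration(conc, intensity):
    """Ordinary least-squares line intensity = a * conc + b through
    external standards of known concentration."""
    n = len(conc)
    mc = sum(conc) / n
    mi = sum(intensity) / n
    a = (sum((c - mc) * (i - mi) for c, i in zip(conc, intensity))
         / sum((c - mc) ** 2 for c in conc))
    b = mi - a * mc
    return a, b

def quantify(signal, a, b):
    """Invert the calibration line to get the concentration of an unknown."""
    return (signal - b) / a
```

With standards at 0, 10, 20, and 40 units of concentration, an unknown whose signal falls midway between the 20 and 40 standards is reported midway between their concentrations, as expected for a linear response.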
Calculation of electromagnetic force in electromagnetic forming process of metal sheet
NASA Astrophysics Data System (ADS)
Xu, Da; Liu, Xuesong; Fang, Kun; Fang, Hongyuan
2010-06-01
Electromagnetic forming (EMF) is a forming process that relies on the inductive electromagnetic force to deform a metallic workpiece at high speed. Calculation of the electromagnetic force is essential for understanding the EMF process. However, accurate calculation requires a complex numerical solution in which the coupling between the electromagnetic process and the deformation of the workpiece must be considered. In this paper, a formula has been developed to calculate the electromagnetic force in the metal workpiece in the sheet EMF process. The effects of the coil geometry, the material properties, and the discharge-circuit parameters on the electromagnetic force are taken into consideration. Through the formula, the electromagnetic force at different times and at different positions in the workpiece can be predicted. The calculated electromagnetic force and magnetic field are in good agreement with numerical and experimental results. The accurate prediction of the electromagnetic force provides insight into the physical process of EMF and a powerful tool for designing optimum EMF systems.
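The paper's specific formula is not reproduced in this record; the general relation underlying any such calculation is the Lorentz body force density acting on the eddy currents induced in the workpiece:

```latex
\mathbf{f} = \mathbf{J} \times \mathbf{B}
```

where $\mathbf{J}$ is the induced current density in the workpiece and $\mathbf{B}$ is the magnetic flux density produced by the coil's discharge current; the coil geometry, material properties, and discharge-circuit parameters named above enter through their effect on $\mathbf{J}$ and $\mathbf{B}$.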
van Rhee, Henk; Hak, Tony
2017-01-01
We present a new tool for meta‐analysis, Meta‐Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta‐analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta‐Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta‐analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp‐Hartung adjustment of the DerSimonian‐Laird estimator. However, more advanced meta‐analysis methods such as meta‐analytical structural equation modelling and meta‐regression with multiple covariates are not available. In summary, Meta‐Essentials may prove a valuable resource for meta‐analysts, including researchers, teachers, and students. PMID:28801932
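The random-effects machinery named above (DerSimonian-Laird between-study variance with the Knapp-Hartung variance adjustment) can be sketched directly from the standard formulas; this is a generic illustration, not Meta-Essentials' own implementation.

```python
def random_effects_pooled(effects, variances):
    """DerSimonian-Laird tau^2, random-effects pooled estimate, and the
    Knapp-Hartung standard error (to be paired with a t_{k-1} quantile
    when forming the confidence interval)."""
    k = len(effects)
    w = [1.0 / v for v in variances]              # fixed-effect weights
    sw = sum(w)
    fixed = sum(wi * y for wi, y in zip(w, effects)) / sw
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, effects))  # Cochran's Q
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)            # DL between-study variance
    ws = [1.0 / (v + tau2) for v in variances]    # random-effects weights
    sws = sum(ws)
    pooled = sum(wi * y for wi, y in zip(ws, effects)) / sws
    # Knapp-Hartung adjusted variance of the pooled effect
    var_kh = (sum(wi * (y - pooled) ** 2 for wi, y in zip(ws, effects))
              / ((k - 1) * sws))
    return pooled, tau2, var_kh ** 0.5
```

When the studies agree exactly with their sampling variances (Q equal to its degrees of freedom), tau^2 collapses to zero and the pooled estimate reduces to the inverse-variance fixed-effect mean.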
Stachler, Aris-Edda; Marchfelder, Anita
2016-07-15
The clustered regularly interspaced short palindromic repeats (CRISPR)-Cas system is used by bacteria and archaea to fend off foreign genetic elements. Since its discovery it has been developed into numerous applications, such as genome editing and regulation of transcription in eukaryotes and bacteria. For archaea, no tools for transcriptional repression currently exist. Because molecular biology analyses in archaea are becoming more and more widespread, such a tool is vital for investigating the biological function of essential genes in archaea. Here we use the model archaeon Haloferax volcanii to demonstrate that its endogenous CRISPR-Cas system I-B can be harnessed to repress gene expression in archaea. Deletion of the cas3 and cas6b genes results in efficient repression of transcription. crRNAs targeting the promoter region reduced transcript levels down to 8%. crRNAs targeting the reading frame have only a slight impact on transcription: crRNAs that target the coding strand repress expression only down to 88%, whereas crRNAs targeting the template strand repress expression down to 8%. Repression of an essential gene reduces transcript levels down to 22%. Targeting efficiencies can be enhanced by expressing a catalytically inactive Cas3 mutant. Genes can be targeted on plasmids or on the chromosome, and they can be monocistronic or part of a polycistronic operon. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.
An Essential Role for Pediatricians: Becoming Child Poverty Change Agents for a Lifetime.
Plax, Katie; Donnelly, Jeanine; Federico, Steven G; Brock, Leonard; Kaczorowski, Jeffrey M
2016-04-01
Poverty has profound and enduring effects on the health and well-being of children, as well as their subsequent adult health and success. It is essential for pediatricians to work to reduce child poverty and to ameliorate its effects on children. Pediatricians have important and needed tools to do this work: authority/power as physicians, understanding of science and evidence-based approaches, and first-hand, real-life knowledge and love of children and families. These tools need to be applied in partnership with community-based organizations/leaders, educators, human service providers, business leaders, philanthropists, and policymakers. Examples of the effects of pediatricians on the issue of child poverty are seen in Ferguson, Missouri; Denver, Colorado; and Rochester, New York. In addition, national models exist such as the American Academy of Pediatrics Community Pediatrics Training Initiative, which engages numerous pediatric faculty to learn and work together to make changes for children and families who live in poverty and to teach these skills to pediatric trainees. Some key themes/lessons for a pediatrician working to make changes in a community are to bear witness to and recognize injustice for children and families; identify an area of passion; review the evidence and gain expertise on the issue; build relationships and partnerships with community leaders and organizations; and advocate for effective solutions. Copyright © 2016 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.
Nucleation and microstructure development in Cr-Mo-V tool steel during gas atomization
NASA Astrophysics Data System (ADS)
Behúlová, M.; Grgač, P.; Čička, R.
2017-11-01
Nucleation studies of undercooled metallic melts are of essential interest for understanding phase selection, growth kinetics, and microstructure development during rapid non-equilibrium solidification. The paper deals with the modelling of nucleation processes and microstructure development in the hypoeutectic tool steel Ch12MF4, with a chemical composition of 2.37% C, 12.06% Cr, 1.2% Mo, 4.0% V, and balance Fe [wt. %], during nitrogen gas atomization. Based on the classical theory of homogeneous nucleation, the nucleation temperatures of molten, rapidly cooled spherical particles of this alloy with diameters from 40 μm to 600 μm are calculated using various estimates of the parameters that govern nucleation: the Gibbs free energy difference between the solid and liquid phases and the solid/liquid interfacial energy. The results of the numerical calculations are compared with nucleation temperatures measured during levitation experiments and with the microstructures developed in rapidly solidified powder particles of the investigated alloy.
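For reference, the classical homogeneous nucleation quantities that enter such calculations are the critical nucleus radius and the nucleation barrier, which depend on exactly the two estimated parameters named above, the solid/liquid interfacial energy $\sigma$ and the volumetric Gibbs free energy difference $\Delta G_{v}$:

```latex
r^{*} = \frac{2\sigma}{\Delta G_{v}},
\qquad
\Delta G^{*} = \frac{16\pi\sigma^{3}}{3\,\Delta G_{v}^{2}}
```

Since $\Delta G_{v}$ grows with undercooling, both quantities shrink as a particle cools, which is why the predicted nucleation temperature hinges on how $\sigma$ and $\Delta G_{v}$ are estimated.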
Monitoring biological diversity: strategies, tools, limitations, and challenges
Beever, E.A.
2006-01-01
Monitoring is an assessment of the spatial and temporal variability in one or more ecosystem properties, and is an essential component of adaptive management. Monitoring can help determine whether mandated environmental standards are being met and can provide an early-warning system of ecological change. Development of a strategy for monitoring biological diversity will likely be most successful when based upon clearly articulated goals and objectives and may be enhanced by including several key steps in the process. Ideally, monitoring of biological diversity will measure not only composition, but also structure and function at the spatial and temporal scales of interest. Although biodiversity monitoring has several key limitations as well as numerous theoretical and practical challenges, many tools and strategies are available to address or overcome such challenges; I summarize several of these. Due to the diversity of spatio-temporal scales and comprehensiveness encompassed by existing definitions of biological diversity, an effective monitoring design will reflect the desired sampling domain of interest and its key stressors, available funding, legal requirements, and organizational goals.
Virtual tryout planning in automotive industry based on simulation metamodels
NASA Astrophysics Data System (ADS)
Harsch, D.; Heingärtner, J.; Hortig, D.; Hora, P.
2016-11-01
Deep drawn sheet metal parts are increasingly designed to the limit of feasibility, so achieving robust manufacturing is often challenging. Fluctuations in process and material properties frequently lead to robustness problems, and numerical simulations are therefore used to detect the critical regions. To improve agreement with real process conditions, the material data are acquired through a variety of experiments, and the force distribution is taken into account. The simulation metamodel contains the virtual knowledge of a particular forming process, determined from a series of finite element simulations with variable input parameters. Based on the metamodels, virtual process windows can be displayed for different configurations. This helps to improve the operating point and to adjust process settings if the process becomes unstable. Furthermore, the time needed for tool tryout can be shortened by transferring the virtual knowledge contained in the metamodels to the optimisation of the drawbeads. This allows the tool manufacturer to focus on the essentials, to save time, and to recognize complex relationships.
Wellbore inertial directional surveying system
Andreas, R.D.; Heck, G.M.; Kohler, S.M.; Watts, A.C.
1982-09-08
A wellbore inertial directional surveying system for providing a complete directional survey of an oil or gas well borehole to determine the displacement in all three directions of the borehole path relative to the well head at the surface. The information generated by the present invention is especially useful when numerous wells are drilled to different geographical targets from a single offshore platform. Accurate knowledge of the path of the borehole allows proper well spacing and provides assurance that target formations are reached. The tool is lowered down into a borehole on an electrical cable. A computer positioned on the surface communicates with the tool via the cable. The tool contains a sensor block which is supported on a single gimbal, the rotation axis of which is aligned with the cylinder axis of the tool and, correspondingly, the borehole. The gyroscope measurement of the sensor block rotation is used in a null-seeking servo loop which essentially prevents rotation of the sensor block about the gimbal axis. Angular rates of the sensor block about axes which are perpendicular to the gimbal axis are measured by gyroscopes in a manner similar to a strapped-down arrangement. Three accelerometers provide acceleration information as the tool is lowered within the borehole. The uphole computer derives position information based upon acceleration information and angular rate information. Kalman estimation techniques are used to compensate for system errors. 25 figures.
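The core dead-reckoning computation described, deriving position from accelerometer readings, is a double time-integration; the attitude resolution and the Kalman error compensation that the patent relies on are omitted here. A minimal one-axis sketch using the trapezoidal rule, with hypothetical sample data:

```python
def integrate_position(accel, dt):
    """Twice-integrate uniformly sampled acceleration (trapezoidal rule)
    to obtain velocity and displacement histories, starting from rest.
    This is the dead-reckoning core of an inertial survey; in practice
    sensor biases make Kalman-style error compensation essential."""
    vel, pos = [0.0], [0.0]
    for k in range(1, len(accel)):
        vel.append(vel[-1] + 0.5 * (accel[k - 1] + accel[k]) * dt)
        pos.append(pos[-1] + 0.5 * (vel[-2] + vel[-1]) * dt)
    return vel, pos
```

Because the acceleration is integrated twice, even a small constant bias grows quadratically in the position estimate, which is precisely why the uphole computer fuses the accelerometer and gyroscope data with Kalman estimation rather than integrating blindly.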
General Vulnerability and Exposure Profile to Tsunami in Puerto Rico
NASA Astrophysics Data System (ADS)
Ruiz, R.; Huérfano-Moreno, V.
2012-12-01
The Puerto Rico archipelago, located in the seismically active Caribbean region, has been directly affected by tsunamis in the last two centuries. The M 7.3 tsunamigenic earthquake of October 11, 1918, caused $29 million in damage and killed 116 people, with 100 residents reported missing. Presently, deficiencies in urban planning have led to an increase in the number of vulnerable people living inside tsunami flood areas. Tsunami-prone areas have been delimited for Puerto Rico based on numerical tsunami modeling. However, the demographic, social and physical (e.g. critical and essential facilities) characteristics of these areas have not been documented in detail. We are conducting a municipality- and community-level tsunami vulnerability and exposure study using Geographical Information System (GIS) tools. The results of our study are being integrated into the Puerto Rico Disaster Decision Support Tool (DDST). The DDST provides access, at no cost, to a variety of updated geo-referenced information for Puerto Rico. This tool provides internet-based scalable maps that will aid emergency managers and decision-makers in their responsibilities and will improve the resilience of Puerto Rico communities against the tsunami hazard. This project aims to provide an initial estimate of Puerto Rico's vulnerability and exposure to tsunami and brings the community a technological tool that will help increase awareness of this hazard and assist in decision-making.
Wellbore inertial directional surveying system
Andreas, Ronald D.; Heck, G. Michael; Kohler, Stewart M.; Watts, Alfred C.
1991-01-01
A wellbore inertial directional surveying system for providing a complete directional survey of an oil or gas well borehole to determine the displacement in all three directions of the borehole path relative to the well head at the surface. The information generated by the present invention is especially useful when numerous wells are drilled to different geographical targets from a single off-shore platform. Accurate knowledge of the path of the borehole allows proper well spacing and provides assurance that target formations are reached. The tool is lowered into a borehole on an electrical cable. A computer positioned on the surface communicates with the tool via the cable. The tool contains a sensor block which is supported on a single gimbal, the rotation axis of which is aligned with the cylinder axis of the tool and, correspondingly, the borehole. The gyroscope measurement of the sensor block rotation is used in a null-seeking servo loop which essentially prevents rotation of the sensor block about the gimbal axis. Angular rates of the sensor block about axes which are perpendicular to the gimbal axis are measured by gyroscopes in a manner similar to a strapped-down arrangement. Three accelerometers provide acceleration information as the tool is lowered within the borehole. The uphole computer derives position information based upon acceleration information and angular rate information. Kalman estimation techniques are used to compensate for system errors.
Modeling Kelvin Wave Cascades in Superfluid Helium
NASA Astrophysics Data System (ADS)
Boffetta, G.; Celani, A.; Dezzani, D.; Laurie, J.; Nazarenko, S.
2009-09-01
We study two different types of simplified models for Kelvin wave turbulence on quantized vortex lines in superfluids near zero temperature. Our first model is obtained from a truncated expansion of the Local Induction Approximation (Truncated-LIA), and it is shown to possess the same scalings and essential behaviour as the full Biot-Savart model while being much simpler than the latter and, therefore, more amenable to theoretical and numerical investigation. The Truncated-LIA model supports six-wave interactions and dual cascades, which are clearly demonstrated via direct numerical simulation of this model in the present paper. In particular, our simulations confirm the presence of the weak turbulence regime and the theoretically predicted spectra for the direct energy cascade and the inverse wave-action cascade. The second type of model we study, the Differential Approximation Model (DAM), makes a further drastic simplification by assuming locality of interactions in k-space via a differential closure that preserves the main scalings of the Kelvin wave dynamics. DAMs are even more amenable to study, and they form a useful tool by providing simple analytical solutions in cases where extra physical effects are present, e.g. forcing by reconnections, friction dissipation and phonon radiation. We study these models numerically and test their theoretical predictions, in particular the formation of the stationary spectra and the closeness of the numerics for the higher-order DAM to the analytical predictions for the lower-order DAM.
Suurmond, Robert; van Rhee, Henk; Hak, Tony
2017-12-01
We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta-Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta-analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp-Hartung adjustment of the DerSimonian-Laird estimator. However, more advanced meta-analysis methods such as meta-analytical structural equation modelling and meta-regression with multiple covariates are not available. In summary, Meta-Essentials may prove a valuable resource for meta-analysts, including researchers, teachers, and students. © 2017 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
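The random-effects computation the abstract mentions (the DerSimonian-Laird estimator with the Knapp-Hartung adjustment) can be sketched as follows; the effect sizes and variances are illustrative, not taken from Meta-Essentials:

```python
import numpy as np

# Illustrative study effect sizes (e.g., Hedges' g) and within-study variances.
y = np.array([0.10, 0.45, 0.80, 0.25, 0.60])
v = np.array([0.04, 0.06, 0.05, 0.03, 0.08])
k = len(y)

# DerSimonian-Laird estimate of the between-study variance tau^2.
w = 1.0 / v
q = np.sum(w * (y - np.sum(w * y) / np.sum(w)) ** 2)
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (q - (k - 1)) / c)

# Random-effects pooled estimate.
w_star = 1.0 / (v + tau2)
mu = np.sum(w_star * y) / np.sum(w_star)

# Knapp-Hartung adjusted variance; 95% CI uses t with k-1 df (t_{4,0.975}=2.776).
var_kh = np.sum(w_star * (y - mu) ** 2) / ((k - 1) * np.sum(w_star))
t_crit = 2.776
ci = (mu - t_crit * np.sqrt(var_kh), mu + t_crit * np.sqrt(var_kh))
print(f"pooled effect {mu:.3f}, 95% CI ({ci[0]:.3f}, {ci[1]:.3f})")
```

The Knapp-Hartung step replaces the usual normal-theory standard error with one based on the observed dispersion, widening the interval when the studies disagree more than their within-study variances predict.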
Requirements for clinical information modelling tools.
Moreno-Conde, Alberto; Jódar-Sánchez, Francisco; Kalra, Dipak
2015-07-01
This study proposes consensus requirements for clinical information modelling tools that can support modelling tasks in medium/large scale institutions. Rather than identify which functionalities are currently available in existing tools, the study has focused on functionalities that should be covered in order to provide guidance about how to evolve the existing tools. After identifying a set of 56 requirements for clinical information modelling tools based on a literature review and interviews with experts, a classical Delphi study methodology was applied to conduct a two-round survey in order to classify them as essential or recommended. Essential requirements are those that must be met by any tool that claims to be suitable for clinical information modelling; should a certified tools list one day exist, any tool that does not meet the essential criteria would be excluded. Recommended requirements are more advanced requirements that may be met by tools offering a superior product, or that are only needed in certain modelling situations. According to the answers provided by 57 experts from 14 different countries, we found a level of agreement high enough to identify 20 essential and 21 recommended requirements for these tools. It is expected that this list of identified requirements will guide developers on the inclusion of new basic and advanced functionalities that have strong support from end users. This list could also guide regulators in identifying requirements that could be demanded of tools adopted within their institutions. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
The UEA sRNA Workbench (version 4.4): a comprehensive suite of tools for analyzing miRNAs and sRNAs.
Stocks, Matthew B; Mohorianu, Irina; Beckers, Matthew; Paicu, Claudia; Moxon, Simon; Thody, Joshua; Dalmay, Tamas; Moulton, Vincent
2018-05-02
RNA interference, a highly conserved regulatory mechanism, is mediated via small RNAs. Recent technical advances enabled the analysis of larger, complex datasets and the investigation of microRNAs and the less known small interfering RNAs. However, the size and intricacy of current data require a comprehensive set of tools able to discriminate real patterns from low-level, noise-like variation; because the numerous and varied suggestions from the community represent an invaluable source of ideas for future tools, the ability of the community to contribute to this software is essential. We present a new version of the UEA sRNA Workbench, reconfigured to allow easy insertion of new tools/workflows. In its released form, it comprises a suite of tools in a user-friendly environment, with enhanced capabilities for comprehensive processing of sRNA-seq data, e.g. tools for accurate prediction of sRNA loci (CoLIde) and miRNA loci (miRCat2), as well as workflows that guide users through common first steps of sRNA-seq analyses, such as quality checking of the input data, normalization of abundances, and detection of differential expression. The UEA sRNA Workbench is available at: http://srna-workbench.cmp.uea.ac.uk The source code is available at: https://github.com/sRNAworkbenchuea/UEA_sRNA_Workbench. v.moulton@uea.ac.uk.
NaviCell Web Service for network-based data visualization.
Bonnet, Eric; Viara, Eric; Kuperstein, Inna; Calzone, Laurence; Cohen, David P A; Barillot, Emmanuel; Zinovyev, Andrei
2015-07-01
Data visualization is an essential element of biological research, required for obtaining insights and formulating new hypotheses on mechanisms of health and disease. NaviCell Web Service is a tool for network-based visualization of 'omics' data which implements several data visual representation methods and utilities for combining them together. NaviCell Web Service uses Google Maps and semantic zooming to browse large biological network maps, represented in various formats, together with different types of the molecular data mapped on top of them. For achieving this, the tool provides standard heatmaps, barplots and glyphs as well as the novel map staining technique for grasping large-scale trends in numerical values (such as whole transcriptome) projected onto a pathway map. The web service provides a server mode, which allows automating visualization tasks and retrieving data from maps via RESTful (standard HTTP) calls. Bindings to different programming languages are provided (Python and R). We illustrate the purpose of the tool with several case studies using pathway maps created by different research groups, in which data visualization provides new insights into molecular mechanisms involved in systemic diseases such as cancer and neurodegenerative diseases. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
ConfocalCheck - A Software Tool for the Automated Monitoring of Confocal Microscope Performance
Hng, Keng Imm; Dormann, Dirk
2013-01-01
Laser scanning confocal microscopy has become an invaluable tool in biomedical research but regular quality testing is vital to maintain the system’s performance for diagnostic and research purposes. Although many methods have been devised over the years to characterise specific aspects of a confocal microscope like measuring the optical point spread function or the field illumination, only very few analysis tools are available. Our aim was to develop a comprehensive quality assurance framework ranging from image acquisition to automated analysis and documentation. We created standardised test data to assess the performance of the lasers, the objective lenses and other key components required for optimum confocal operation. The ConfocalCheck software presented here analyses the data fully automatically. It creates numerous visual outputs indicating potential issues requiring further investigation. By storing results in a web browser compatible file format the software greatly simplifies record keeping allowing the operator to quickly compare old and new data and to spot developing trends. We demonstrate that the systematic monitoring of confocal performance is essential in a core facility environment and how the quantitative measurements obtained can be used for the detailed characterisation of system components as well as for comparisons across multiple instruments. PMID:24224017
Hierarchical programming for data storage and visualization
Donovan, John M.; Smith, Peter E.
2001-01-01
Graphics software is an essential tool for interpreting, analyzing, and presenting data from multidimensional hydrodynamic models used in estuarine and coastal ocean studies. The post-processing of time-varying three-dimensional model output presents unique requirements for data visualization because of the large volume of data that can be generated and the multitude of time scales that must be examined. Such data can relate to estuarine or coastal ocean environments and come from numerical models or field instruments. One useful software tool for the display, editing, visualization, and printing of graphical data is the Gr application, written by the first author for use in the U.S. Geological Survey San Francisco Bay Program. The Gr application has been made available to the public via the Internet since the year 2000. The Gr application is written in the Java (Sun Microsystems, Nov. 29, 2001) programming language and uses the Extensible Markup Language (XML) standard for hierarchical data storage. Gr presents a hierarchy of objects to the user that can be edited using a common interface. Java's object-oriented capabilities allow Gr to treat data, graphics, and tools equally and to save them all to a single XML file.
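The single-file hierarchical storage described above can be sketched in a few lines. The element names below are invented, not Gr's actual schema, and serialization to a byte string stands in for writing the single XML file:

```python
import xml.etree.ElementTree as ET

# Hypothetical object hierarchy mixing data, graphics and tool settings,
# in the spirit of Gr's single-XML-file storage (element names are invented).
session = ET.Element("session", name="bay-run")
series = ET.SubElement(session, "timeseries", station="A1", units="m")
series.text = "0.12 0.15 0.11"
graph = ET.SubElement(session, "graph", title="Stage at A1")
ET.SubElement(graph, "axis", label="time (days)")

# Serialize the whole hierarchy to one document and read it back.
doc = ET.tostring(session)
root = ET.fromstring(doc)
values = [float(v) for v in root.find("timeseries").text.split()]
print(root.find("graph").get("title"), values)
```

Because the data, the plot description, and the tool settings live in one tree, a single round-trip restores the whole editing session, which mirrors the design choice the abstract describes.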
Who is teaching what, when? An evolving online tool to manage dental curricula.
Walton, Joanne N
2014-03-01
There are numerous issues in the documentation and ongoing development of health professions curricula. It seems that curriculum information falls quickly out of date between accreditation cycles, while students and faculty members struggle in the meantime with the "hidden curriculum" and unintended redundancies and gaps. Beyond knowing what is in the curriculum lies the frustration of timetabling learning in a transparent way while allowing for on-the-fly changes and improvements. The University of British Columbia Faculty of Dentistry set out to develop a curriculum database to answer the simple but challenging question "who is teaching what, when?" That tool, dubbed "OSCAR," has evolved to not only document the dental curriculum, but as a shared instrument that also holds the curricula and scheduling detail of the dental hygiene degree and clinical graduate programs. In addition to providing documentation ranging from reports for accreditation to daily information critical to faculty administrators and staff, OSCAR provides faculty and students with individual timetables and pushes updates via text, email, and calendar changes. It incorporates reminders and session resources for students and can be updated by both faculty members and staff. OSCAR has evolved into an essential tool for tracking, scheduling, and improving the school's curricula.
NASA Astrophysics Data System (ADS)
Jougnot, D.; Roubinet, D.; Linde, N.; Irving, J.
2016-12-01
Quantifying fluid flow in fractured media is a critical challenge in a wide variety of research fields and applications. To this end, geophysics offers a variety of tools that can provide important information on subsurface physical properties in a noninvasive manner. Most geophysical techniques infer fluid flow by data or model differencing in time or space (i.e., they are not directly sensitive to flow occurring at the time of the measurements). An exception is the self-potential (SP) method. When water flows in the subsurface, an excess of charge in the pore water that counterbalances electric charges at the mineral-pore water interface gives rise to a streaming current and an associated streaming potential. The latter can be measured with the SP technique, meaning that the method is directly sensitive to fluid flow. Whereas numerous field experiments suggest that the SP method may allow for the detection of hydraulically active fractures, suitable tools for numerically modeling streaming potentials in fractured media do not exist. Here, we present a highly efficient two-dimensional discrete-dual-porosity approach for solving the fluid-flow and associated self-potential problems in fractured domains. Our approach is specifically designed for complex fracture networks that cannot be investigated using standard numerical methods due to computational limitations. We then simulate SP signals associated with pumping conditions for a number of examples to show that (i) accounting for matrix fluid flow is essential for accurate SP modeling and (ii) the sensitivity of SP to hydraulically active fractures is intimately linked with fracture-matrix fluid interactions. This implies that fractures associated with strong SP amplitudes are likely to be hydraulically conductive, attracting fluid flow from the surrounding matrix.
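The electrokinetic coupling underlying the SP signal can be illustrated with an order-of-magnitude estimate based on the Helmholtz-Smoluchowski coupling coefficient; all property values are assumed, and real modeling requires the paper's discrete-dual-porosity solver:

```python
# Order-of-magnitude sketch of the electrokinetic coupling behind the SP method,
# using the Helmholtz-Smoluchowski coupling coefficient (illustrative values).
EPS0 = 8.854e-12      # vacuum permittivity, F/m
eps_r = 80.0          # relative permittivity of water
zeta = -0.02          # zeta potential, V (assumed)
eta = 1.0e-3          # water viscosity, Pa.s
sigma = 0.01          # bulk electrical conductivity, S/m (assumed)

# Streaming-potential coupling coefficient, V/Pa.
C = eps_r * EPS0 * zeta / (eta * sigma)

# Assumed drawdown of 10 kPa near a pumped, hydraulically active fracture.
dP = 1.0e4
dV = C * dP
print(f"coupling {C * 1e6:.2f} uV/Pa -> SP anomaly {dV * 1e3:.1f} mV")
```

A millivolt-scale anomaly of this kind is measurable with surface electrodes, which is why the abstract argues that fractures carrying strong flow should produce detectable SP amplitudes.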
Lipids of aquatic sediments, recent and ancient
NASA Technical Reports Server (NTRS)
Eglinton, G.; Hajibrahim, S. K.; Maxwell, J. R.; Quirke, J. M. E.; Shaw, G. J.; Volkman, J. K.; Wardroper, A. M. K.
1979-01-01
Computerized gas chromatography-mass spectrometry (GC-MS) is now an essential tool in the analysis of the complex mixtures of lipids (geolipids) encountered in aquatic sediments, both 'recent' (less than 1 million years old) and ancient. The application of MS, and particularly GC-MS, has been instrumental in the rapid development of organic geochemistry and environmental organic chemistry in recent years. The techniques used have resulted in the identification of numerous compounds of a variety of types in sediments. Most attention has been concentrated on molecules of limited size, mainly below 500 in molecular mass, and of limited functionality, for example hydrocarbons, fatty acids and alcohols. Examples from recent studies (at Bristol) of contemporary, 'recent' and ancient sediments are presented and discussed.
Rocky Mountain Center for Conservation Genetics and Systematics
Oyler-McCance, S.J.; Quinn, T.W.
2005-01-01
The use of molecular genetic tools has become increasingly important in addressing conservation issues pertaining to plants and animals. Genetic information can be used to augment studies of population dynamics and population viability, investigate systematics, refine taxonomic definitions, investigate population structure and gene flow, and document genetic diversity in a variety of plant and animal species. Further, genetic techniques are being used to investigate mating systems through paternity analysis, analyze ancient DNA samples from museum specimens, and estimate population size and survival rates using DNA as a unique marker. Such information is essential for the sound management of small, isolated populations of concern and is currently being used by universities, zoos, the U.S. Fish and Wildlife Service, and numerous state fish and wildlife agencies.
Clinical Use of an Enterprise Data Warehouse
Evans, R. Scott; Lloyd, James F.; Pierce, Lee A.
2012-01-01
The enormous amount of data being collected by electronic medical records (EMR) has found additional value when integrated and stored in data warehouses. The enterprise data warehouse (EDW) allows all data from an organization with numerous inpatient and outpatient facilities to be integrated and analyzed. We have found the EDW at Intermountain Healthcare to not only be an essential tool for management and strategic decision making, but also for patient specific clinical decision support. This paper presents the structure and two case studies of a framework that has provided us the ability to create a number of decision support applications that are dependent on the integration of previous enterprise-wide data in addition to a patient’s current information in the EMR. PMID:23304288
Quantitative aspects of inductively coupled plasma mass spectrometry
Wagner, Barbara
2016-01-01
Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644971
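One of the calibration approaches mentioned above, external calibration with pure standards, can be sketched minimally as follows; the intensities and concentrations are invented for illustration:

```python
import numpy as np

# External calibration sketch: ICP-MS intensities (counts/s) measured for a
# series of standards of known concentration. Values are illustrative only.
conc = np.array([0.0, 1.0, 5.0, 10.0, 20.0])          # ng/mL
signal = np.array([150.0, 2100.0, 9800.0, 19500.0, 39200.0])

# Least-squares calibration line: signal = slope * conc + intercept.
slope, intercept = np.polyfit(conc, signal, 1)

# Quantify an unknown sample from its measured intensity.
unknown_signal = 12400.0
unknown_conc = (unknown_signal - intercept) / slope
print(f"estimated concentration: {unknown_conc:.2f} ng/mL")
```

Matrix-matched standards and internal standards follow the same arithmetic; they differ in how the standards are prepared so that the slope reflects the sample matrix rather than a clean solution.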
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rhinefrank, Kenneth E.; Lenee-Bluhm, Pukha; Prudell, Joseph H.
The most prudent path to a full-scale design, build and deployment of a wave energy conversion (WEC) system involves establishing validated numerical models through physical experiments in a methodical scaling program. This project provides essential additional rounds of wave tank testing at 1:33 scale and ocean/bay testing at 1:7 scale, needed to validate the numerical modeling that underpins a utility-scale WEC design and its associated certification.
2017-01-01
Although Arabic numerals (like ‘2016’ and ‘3.14’) are ubiquitous, we show that in interactive computer applications they are often misleading and surprisingly unreliable. We introduce interactive numerals as a new concept and show that, like Roman numerals and Arabic numerals, interactive numerals introduce another way of using and thinking about numbers. Properly understanding interactive numerals is essential for all computer applications that involve numerical data entered by users, including finance, medicine, aviation and science. PMID:28484609
Toolpack mathematical software development environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Osterweil, L.
1982-07-21
The purpose of this research project was to produce a well integrated set of tools for the support of numerical computation. The project entailed the specification, design and implementation of both a diversity of tools and an innovative tool integration mechanism. This large configuration of tightly integrated tools comprises an environment for numerical software development, and has been named Toolpack/IST (Integrated System of Tools). Following the creation of this environment in prototype form, the environment software was readied for widespread distribution by transitioning it to a development organization for systematization, documentation and distribution. It is expected that public release of Toolpack/IST will begin imminently and will provide a basis for evaluation of the innovative software approaches taken as well as a uniform set of development tools for the numerical software community.
NASA Astrophysics Data System (ADS)
Griesel, S.; Mundry, R.; Kakuschke, A.; Fonfara, S.; Siebert, U.; Prange, A.
2006-11-01
Mineral and essential trace elements are involved in numerous physiological processes in mammals, and diseases are often associated with an imbalance of electrolyte homeostasis. In this study, the concentrations of mineral elements (P, S, K, Ca) and essential trace elements (Fe, Cu, Zn, Se, Rb, Sr) in the whole blood of harbor seals (Phoca vitulina) were determined using total-reflection X-ray fluorescence spectrometry (TXRF). Samples from 81 free-ranging harbor seals from the North Sea and two captive seals were collected during 2003-2005. Reference ranges and element correlations for health status determination were derived for P, S, K, Ca, Fe, Cu, and Zn levels in whole blood. When the seals were grouped by age, gender, and sampling location, the element concentration levels were compared. The blood of two captive seals with signs of disease and of four free-ranging seals showed reduced levels of P, S, and Ca, and differences in the element correlations of the electrolytes were ascertained. Thus, simultaneous measurement of several elements in only 500 μL of whole blood provides information on both the electrolyte balance and the hydration status of the seals. The method could therefore serve as an additional biomonitoring tool for health assessment.
NASA Astrophysics Data System (ADS)
Wu, Yanling
2018-05-01
In this paper, extreme waves were generated using the open-source computational fluid dynamics (CFD) tools OpenFOAM and Waves2FOAM with linear and nonlinear NewWave input, and were used to conduct numerical simulations of the wave impact process. Numerical tools based on first-order NewWave (with and without stretching) and second-order NewWave are investigated. Simulations predicting the force loading on an offshore platform under extreme weather conditions are implemented and compared.
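The linear NewWave input mentioned above can be sketched independently of the CFD solver: the expected shape of an extreme crest is the spectrum's normalized autocorrelation, scaled to a target crest amplitude. The peak period, crest height, and spectral parameters below are assumed, not taken from the paper:

```python
import numpy as np

def jonswap(w, wp, gamma=3.3):
    """Unnormalized JONSWAP spectral shape (the alpha*g^2 factor cancels below)."""
    sig = np.where(w <= wp, 0.07, 0.09)
    r = np.exp(-((w - wp) ** 2) / (2.0 * sig ** 2 * wp ** 2))
    return w ** -5.0 * np.exp(-1.25 * (wp / w) ** 4) * gamma ** r

# Linear NewWave: surface elevation is the normalized autocorrelation of the
# wave spectrum, scaled so the crest at the focus instant equals A.
A = 7.0                                   # target crest elevation, m (assumed)
wp = 2.0 * np.pi / 12.0                   # peak frequency for a 12 s peak period
w = np.linspace(0.2, 3.0, 500)            # angular frequencies, rad/s
dw = w[1] - w[0]

S = jonswap(w, wp)
m0 = np.sum(S) * dw                       # zeroth spectral moment
t = np.linspace(-30.0, 30.0, 601)         # time around the focus instant, s
eta = A / m0 * np.sum(S[None, :] * np.cos(np.outer(t, w)), axis=1) * dw

print(f"crest at focus: {eta[len(t) // 2]:.2f} m")  # equals A by construction
```

A time series of this shape, imposed at the inlet boundary, is what "linear NewWave input" refers to; the second-order variant adds bound-wave corrections to each component pair.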
The MATH--Open Source Application for Easier Learning of Numerical Mathematics
ERIC Educational Resources Information Center
Glaser-Opitz, Henrich; Budajová, Kristina
2016-01-01
The article introduces a software application (MATH) supporting an education of Applied Mathematics, with focus on Numerical Mathematics. The MATH is an easy to use tool supporting various numerical methods calculations with graphical user interface and integrated plotting tool for graphical representation written in Qt with extensive use of Qwt…
Numerical simulations of novel high-power high-brightness diode laser structures
NASA Astrophysics Data System (ADS)
Boucke, Konstantin; Rogg, Joseph; Kelemen, Marc T.; Poprawe, Reinhart; Weimann, Guenter
2001-07-01
One of the key topics in today's semiconductor laser development is increasing the brightness of high-power diode lasers. Although structures showing increased brightness have been developed, their specific drawbacks leave a strong demand for the investigation of alternative concepts. Especially for the investigation of fundamentally novel structures, easy-to-use and fast simulation tools are essential to avoid unnecessary, costly and time-consuming experiments. A diode laser simulation tool based on finite difference representations of the Helmholtz equation in 'wide-angle' approximation and the carrier diffusion equation has been developed. An optimized numerical algorithm leads to short execution times of a few seconds per resonator round-trip on a standard PC. After each round-trip, characteristics such as optical output power, beam profile and beam parameters are calculated. A graphical user interface allows online monitoring of the simulation results. The simulation tool is used to investigate a novel high-power, high-brightness diode laser structure, the so-called 'Z-Structure'. In this structure an increased brightness is achieved by reducing the divergence angle of the beam through angular filtering: the round-trip path of the beam is folded twice using total internal reflection at surfaces defined by a small index step in the semiconductor material, forming a stretched 'Z'. The sharp decrease of the reflectivity for angles of incidence above the angle of total reflection narrows the angular spectrum of the beam. The simulations of the 'Z-Structure' indicate an increase in beam quality by a factor of five to ten compared to standard broad-area lasers.
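The angular-filtering principle can be illustrated with a toy spectral-domain calculation (this is not the paper's finite-difference wide-angle solver): removing spatial-frequency components beyond a cutoff angle narrows the far-field divergence. All parameters below are assumed:

```python
import numpy as np

# Toy illustration of angular filtering: spatial-frequency components whose
# propagation angle exceeds a total-reflection cutoff are discarded, which
# narrows the beam's angular spectrum. All parameters are assumed.
lam = 0.97e-6                          # vacuum wavelength, m
k = 2.0 * np.pi * 3.3 / lam            # wavenumber in the semiconductor

x = np.linspace(-200e-6, 200e-6, 4096)
dx = x[1] - x[0]
# Broad near field carrying a high-angle ripple (a crude stand-in for the
# higher-order lateral content of a broad-area laser).
field = np.exp(-(x / 50e-6) ** 2) * (1.0 + 0.5 * np.cos(3.0e5 * x))

kx = 2.0 * np.pi * np.fft.fftfreq(x.size, dx)
spec = np.fft.fft(field)

def rms_angle(p):
    """RMS angular width (rad) of a far-field intensity distribution p."""
    theta = kx / k                     # small-angle approximation
    return np.sqrt(np.sum(theta ** 2 * p) / np.sum(p))

before = rms_angle(np.abs(spec) ** 2)
filtered = np.where(np.abs(kx / k) < 5.0e-3, spec, 0.0)  # 5 mrad cutoff
after = rms_angle(np.abs(filtered) ** 2)
print(f"RMS divergence: {before * 1e3:.2f} mrad -> {after * 1e3:.2f} mrad")
```

In the real device the filtering happens in every round trip, so even partial suppression per pass accumulates into the strong angular narrowing the abstract reports.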
Sadick, Maliha; Dally, Franz Josef; Schönberg, Stefan O; Stroszczynski, Christian; Wohlgemuth, Walter A
2017-10-01
Background Radiology is an interdisciplinary field dedicated to the diagnosis and treatment of numerous diseases and is involved in the development of multimodal treatment concepts. Method Interdisciplinary case management, a broad spectrum of diagnostic imaging facilities and dedicated endovascular radiological treatment options are valuable tools that allow radiology to set up an interdisciplinary center for vascular anomalies. Results Image-based diagnosis combined with endovascular treatment options is an essential tool for the treatment of patients with highly complex vascular diseases. These vascular anomalies can affect numerous parts of the body, so a multidisciplinary treatment approach is required for optimal patient care. Conclusion This paper discusses the possibilities and challenges regarding effective and efficient patient management in connection with the formation of an interdisciplinary center for vascular anomalies and the strengthening of the clinical role of radiologists. Key points · Vascular anomalies, which include vascular tumors and malformations, are complex to diagnose and treat. · There are far more patients with vascular anomalies requiring therapy than there are interdisciplinary centers in Germany that can provide dedicated care for affected patients. · Radiology includes a broad spectrum of diagnostic and minimally invasive therapeutic tools which allow the formation of an interdisciplinary center for vascular anomalies for effective, efficient and comprehensive patient management. Citation Format · Sadick M, Dally FJ, Schönberg SO et al. Strategies in Interventional Radiology: Formation of an Interdisciplinary Center of Vascular Anomalies - Chances and Challenges for Effective and Efficient Patient Management. Fortschr Röntgenstr 2017; 189: 957 - 966. © Georg Thieme Verlag KG Stuttgart · New York.
Multi-omic Mitoprotease Profiling Defines a Role for Oct1p in Coenzyme Q Production.
Veling, Mike T; Reidenbach, Andrew G; Freiberger, Elyse C; Kwiecien, Nicholas W; Hutchins, Paul D; Drahnak, Michael J; Jochem, Adam; Ulbrich, Arne; Rush, Matthew J P; Russell, Jason D; Coon, Joshua J; Pagliarini, David J
2017-12-07
Mitoproteases are becoming recognized as key regulators of diverse mitochondrial functions, although their direct substrates are often difficult to discern. Through multi-omic profiling of diverse Saccharomyces cerevisiae mitoprotease deletion strains, we predicted numerous associations between mitoproteases and distinct mitochondrial processes. These include a strong association between the mitochondrial matrix octapeptidase Oct1p and coenzyme Q (CoQ) biosynthesis-a pathway essential for mitochondrial respiration. Through Edman sequencing and in vitro and in vivo biochemistry, we demonstrated that Oct1p directly processes the N terminus of the CoQ-related methyltransferase, Coq5p, which markedly improves its stability. A single mutation to the Oct1p recognition motif in Coq5p disrupted its processing in vivo, leading to CoQ deficiency and respiratory incompetence. This work defines the Oct1p processing of Coq5p as an essential post-translational event for proper CoQ production. Additionally, our data visualization tool enables efficient exploration of mitoprotease profiles that can serve as the basis for future mechanistic investigations. Copyright © 2017 Elsevier Inc. All rights reserved.
Monitoring Object Library Usage and Changes
NASA Technical Reports Server (NTRS)
Owen, R. K.; Craw, James M. (Technical Monitor)
1995-01-01
The NASA Ames Numerical Aerodynamic Simulation program Aeronautics Consolidated Supercomputing Facility (NAS/ACSF) supercomputing center services over 1600 users and has numerous analysts with root access. Several tools have been developed to monitor object library usage and changes. Some of the tools do "noninvasive" monitoring and other tools implement run-time logging, even for object-only libraries. The run-time logging identifies who, when, and what is being used. The benefits are that real usage can be measured, unused libraries can be discontinued, and training and optimization efforts can be focused on those numerical methods that are actually used. An overview of the tools will be given and the results will be discussed.
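The who/when/what run-time logging described in this abstract can be sketched as a wrapper that instruments each library entry point. This is a minimal illustration only: the NAS tools worked on compiled object libraries, whereas this sketch instruments a Python function, and all names here (`log_usage`, `solve`) are invented for the example.

```python
import functools
import os
import time

def log_usage(log, func):
    """Wrap a library entry point so every call records who, when, and what."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        log.append({
            "user": os.environ.get("USER", "unknown"),  # who is calling
            "when": time.time(),                        # when it was called
            "what": func.__name__,                      # what was used
        })
        return func(*args, **kwargs)
    return wrapper

# Instrument a "library" routine without changing its callers:
usage_log = []

def solve(x):
    return 2 * x

solve = log_usage(usage_log, solve)
result = solve(21)  # logged, then computed
```

Aggregating such records over time is what lets unused routines be identified and optimization effort be focused on the methods actually called.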
Equation-free analysis of agent-based models and systematic parameter determination
NASA Astrophysics Data System (ADS)
Thomas, Spencer A.; Lloyd, David J. B.; Skeldon, Anne C.
2016-12-01
Agent-based models (ABMs) are increasingly used in social science, economics, mathematics, biology and computer science to describe time-dependent systems in circumstances where a description in terms of equations is difficult. Yet few tools are currently available for the systematic analysis of ABM behaviour. Numerical continuation and bifurcation analysis is a well-established tool for the study of deterministic systems. Recently, equation-free (EF) methods have been developed to extend numerical continuation techniques to systems where the dynamics are described at a microscopic scale and continuation of a macroscopic property of the system is considered. To date, the practical use of EF methods has been limited by: (1) the overhead of application-specific implementation; (2) the laborious configuration of problem-specific parameters; and (3) large ensemble sizes (potentially) leading to computationally restrictive run-times. In this paper we address these issues with our tool for the EF continuation of stochastic systems, which includes algorithms to systematically configure problem-specific parameters and to enhance robustness to noise. Our tool is generic, can be applied to any 'black-box' simulator, and determines the essential EF parameters prior to EF analysis. Robustness is significantly improved using our convergence-constraint with corrector-repeat (C3R) method. This algorithm automatically detects outliers based on the dynamics of the underlying system, enabling both an order-of-magnitude reduction in ensemble size and continuation of systems at much higher levels of noise than classical approaches. We demonstrate our method with application to several ABM models, revealing parameter dependence, bifurcation and stability analysis of these complex systems, giving a deep understanding of the dynamical behaviour of the models in a way that is not otherwise easily obtainable. In each case we demonstrate our systematic parameter determination stage for configuring the system-specific EF parameters.
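The ensemble outlier rejection that underpins the reduction in ensemble size can be illustrated with a generic robust filter. Note the hedge: the paper's C3R method detects outliers from the dynamics of the underlying system, which this median-based sketch does not attempt; the threshold and sample values are illustrative.

```python
import statistics

def reject_outliers(samples, k=3.0):
    """Drop ensemble members whose deviation from the median exceeds
    k robust standard deviations (median absolute deviation scaled by
    1.4826). A generic stand-in for the C3R outlier-detection step."""
    med = statistics.median(samples)
    mad = statistics.median([abs(s - med) for s in samples])
    if mad == 0:
        return list(samples)
    return [s for s in samples if abs(s - med) <= k * 1.4826 * mad]

# One rogue realization inflates the ensemble mean; rejection removes it:
ensemble = [1.0, 1.1, 0.9, 1.05, 0.95, 8.0]
cleaned = reject_outliers(ensemble)
```

Removing such rogue realizations is what allows a macroscopic average to converge with far fewer ensemble members.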
NASA Astrophysics Data System (ADS)
Pickett, Leon, Jr.
Past research has conclusively shown that long-fiber structural composites possess superior specific energy absorption characteristics compared to steel and aluminum structures. However, destructive physical testing of composites is very costly and time consuming. As a result, numerical solutions are desirable as an alternative to experimental testing. Up until this point, very little numerical work has been successful in predicting the energy absorption of composite crush structures. This research investigates the ability to use commercially available numerical modeling tools to approximate the energy absorption capability of long-fiber composite crush tubes. This study is significant because it provides a preliminary analysis of the suitability of LS-DYNA to numerically characterize the crushing behavior of a dynamic axial impact crushing event. Composite crushing theory suggests that there are several crushing mechanisms occurring during a composite crush event. This research evaluates the capability and suitability of employing LS-DYNA to simulate the dynamic crush event of an E-glass/epoxy cylindrical tube. The model employed is the composite "progressive failure model", a much more limited failure model when compared to the experimental failure events which naturally occur. This numerical model employs only three failure modes: (1) matrix cracking, (2) compression, and (3) fiber breakage. The motivation for the work comes from the need to reduce the significant cost associated with experimental trials. This research chronicles some preliminary efforts to better understand the mechanics essential to the pursuit of this goal. The immediate goal is to begin to provide a deeper understanding of a composite crush event and ultimately create a viable alternative to destructive testing of composite crush tubes.
NASA Astrophysics Data System (ADS)
Sigurdson, J.; Tagerud, J.
1986-05-01
A UNIDO publication about machine tools with automatic control discusses the following: (1) numerical control (NC) machine tool perspectives, the definition of NC, flexible manufacturing systems, robots and their industrial application, research and development, and sensors; (2) experience in developing a capability in NC machine tools; (3) policy issues; (4) procedures for retrieval of relevant documentation from data bases. Diagrams, statistics, and a bibliography are included.
Interferometric correction system for a numerically controlled machine
Burleson, Robert R.
1978-01-01
An interferometric correction system for a numerically controlled machine is provided to improve the positioning accuracy of a machine tool, for example, for a high-precision numerically controlled machine. A laser interferometer feedback system is used to monitor the positioning of the machine tool which is being moved by command pulses to a positioning system to position the tool. The correction system compares the commanded position as indicated by a command pulse train applied to the positioning system with the actual position of the tool as monitored by the laser interferometer. If the tool position lags the commanded position by a preselected error, additional pulses are added to the pulse train applied to the positioning system to advance the tool closer to the commanded position, thereby reducing the lag error. If the actual tool position is leading in comparison to the commanded position, pulses are deleted from the pulse train where the advance error exceeds the preselected error magnitude to correct the position error of the tool relative to the commanded position.
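The lag/lead correction described in this patent abstract reduces to comparing the commanded pulse count against the interferometer-measured position and adding or deleting pulses when the error exceeds a preselected magnitude. The sketch below is a simplified illustration of that logic; the function name, count units, and tolerance are invented for the example.

```python
def pulse_correction(commanded, measured, max_error):
    """Number of pulses to inject (positive) or delete (negative) so the
    tool tracks the commanded position within max_error counts."""
    error = commanded - measured       # positive: tool lags the command
    if error > max_error:
        return error - max_error       # add pulses to advance the tool
    if error < -max_error:
        return error + max_error       # delete pulses to hold it back
    return 0                           # within tolerance: no correction

# Tool lags a 100-count command by 7 counts with a 3-count tolerance:
adjust = pulse_correction(commanded=100, measured=93, max_error=3)
```

The same comparison runs continuously as the command pulse train streams to the positioning system, so the error never grows beyond the preselected band.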
Davie, Jeremiah J; Faitar, Silviu L
2017-01-01
Currently, time-consuming serial in vitro experimentation involving immunocytochemistry or radiolabeled materials is required to identify which of the numerous Rab-GTPases (Rab) and Rab-GTPase activating proteins (RabGAP) are capable of functional interactions. These interactions are essential for numerous cellular functions, and in silico methods of reducing in vitro trial and error would accelerate the pace of research in cell biology. We have utilized a combination of three-dimensional protein modeling and protein bioinformatics to identify domains present in Rab proteins that are predictive of their functional interaction with a specific RabGAP. The RabF2 and RabSF1 domains appear to play functional roles in mediating the interaction between Rabs and RabGAPs. Moreover, the RabSF1 domain can be used to make in silico predictions of functional Rab/RabGAP pairs. This method is expected to be a broadly applicable tool for predicting protein-protein interactions where existing crystal structures for homologs of the proteins of interest are available.
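The in silico prediction of functional Rab/RabGAP pairs from the RabSF1 domain rests on comparing domain sequences between candidates. The toy sketch below shows only the elementary sequence-comparison building block; the actual method also relies on three-dimensional protein modeling, which is omitted here, and the sequences shown are invented for illustration.

```python
def domain_identity(seq_a, seq_b):
    """Percent identity between two aligned domain sequences of equal
    length -- a toy stand-in for the RabSF1-domain comparison used to
    predict functional Rab/RabGAP pairs."""
    assert len(seq_a) == len(seq_b), "sequences must be aligned"
    matches = sum(a == b for a, b in zip(seq_a, seq_b))
    return 100.0 * matches / len(seq_a)

score = domain_identity("IGVDF", "IGVEF")  # 4 of 5 residues match
```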
A mathematical model for simulating noise suppression of lined ejectors
NASA Technical Reports Server (NTRS)
Watson, Willie R.
1994-01-01
A mathematical model containing the essential features embodied in the noise suppression of lined ejectors is presented. Although some simplification of the physics is necessary to render the model mathematically tractable, the model is the most versatile and technologically advanced available at the present time. A system of linearized equations and the boundary conditions governing the sound field are derived starting from the equations of fluid dynamics. A nonreflecting boundary condition is developed. In view of the complex nature of the equations, a parametric study requires the use of numerical techniques and modern computers. A finite element algorithm that solves the differential equations coupled with the boundary condition is then introduced. The numerical method results in a matrix equation with several hundred thousand degrees of freedom that is solved efficiently on a supercomputer. The model is validated by comparing results either with exact solutions or with approximate solutions from other works. In each case, excellent correlations are obtained. The usefulness of the model as an optimization tool and the importance of variable impedance liners as a mechanism for achieving broadband suppression within a lined ejector are demonstrated.
ERIC Educational Resources Information Center
Bowyer, James
2015-01-01
Four components of the Kodály concept are delineated here: philosophy, objectives, essential tools, and lesson planning process. After outlining the tenets of the Kodály philosophy and objectives, the article presents the Kodály concept's essential tools, including singing, movable "do" solfège, rhythm syllables, hand signs, singing on…
ERIC Educational Resources Information Center
del Monte, Rick
2009-01-01
As students know, the tools in their backpacks can influence success. If they are off to math class, a good calculator is essential. When on their way to English class, a laptop is fundamental. Building facility executives too have tools in their backpacks to assure the successful creation of educational buildings. Only their tools are…
Analysis of the thermo-mechanical deformations in a hot forging tool by numerical simulation
NASA Astrophysics Data System (ADS)
L-Cancelos, R.; Varas, F.; Martín, E.; Viéitez, I.
2016-03-01
Although programs have been developed for the design of tools for hot forging, their design is still largely based on the experience of the toolmaker. This obliges the maker to build test matrices and correct their errors to minimize distortions in the forged piece. This phase prior to mass production consumes time and material resources, which makes the final product more expensive. The forging tools are usually constituted by various parts made of different grades of steel, which in turn have different mechanical properties and therefore suffer different degrees of strain. Furthermore, the tools used in hot forging are exposed to a thermal field that also induces strain or stress depending on the degree of confinement of the piece. Therefore, the mechanical behaviour of the assembly is determined by the contact between the different pieces. Numerical simulation allows different configurations to be analysed and possible defects to be anticipated before toolmaking, thus reducing the costs of this preliminary phase. In order to improve the dimensional quality of the manufactured parts, the work presented here focuses on the application of a numerical model to a hot forging manufacturing process in order to predict the areas of the forging die subjected to large deformations. The thermo-mechanical model, developed and implemented with free software (Code-Aster), includes the strains of thermal origin, strains during forge impact and contact effects. The numerical results are validated with experimental measurements in a tooling set that produces forged crankshafts for the automotive industry. The numerical results show good agreement with the experimental tests. The result is a very useful tool for the design of tooling sets for hot forging.
3D Dynamics of the Near-Surface Layer of the Ocean in the Presence of Freshwater Influx
NASA Astrophysics Data System (ADS)
Dean, C.; Soloviev, A.
2015-12-01
Freshwater inflow due to convective rains or river runoff produces lenses of freshened water in the near surface layer of the ocean. These lenses are localized in space and typically involve both salinity and temperature anomalies. Due to significant density anomalies, strong pressure gradients develop, which result in lateral spreading of freshwater lenses in a form resembling gravity currents. Gravity currents inherently involve three-dimensional dynamics. The gravity current head can include the Kelvin-Helmholtz billows with vertical density inversions. In this work, we have conducted a series of numerical experiments using computational fluid dynamics tools. These numerical simulations were designed to elucidate the relationship between vertical mixing and horizontal advection of salinity under various environmental conditions and potential impact on the pollution transport including oil spills. The near-surface data from the field experiments in the Gulf of Mexico during the SCOPE experiment were available for validation of numerical simulations. In particular, we observed a freshwater layer within a few-meter depth range and, in some cases, a density inversion at the edge of the freshwater lens, which is consistent with the results of numerical simulations. In conclusion, we discuss applicability of these results to the interpretation of Aquarius and SMOS sea surface salinity satellite measurements. The results of this study indicate that 3D dynamics of the near-surface layer of the ocean are essential in the presence of freshwater inflow.
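The lateral spreading of a freshwater lens "in a form resembling gravity currents" admits a classical back-of-the-envelope estimate for the front speed, u = Fr * sqrt(g' h) with reduced gravity g' = g * delta_rho / rho0. The sketch below applies that textbook scaling only as a sanity check; the study itself used full 3D CFD, and the parameter values here are invented for illustration.

```python
import math

def gravity_current_speed(delta_rho, rho0, depth, froude=0.5):
    """Front speed of a spreading freshwater lens from the classical
    gravity-current scaling u = Fr * sqrt(g' h), where the reduced
    gravity is g' = g * delta_rho / rho0."""
    g_reduced = 9.81 * delta_rho / rho0
    return froude * math.sqrt(g_reduced * depth)

# A 2 m deep lens, 2 kg/m^3 lighter than ambient seawater,
# spreads at roughly 10 cm/s under this scaling:
u = gravity_current_speed(delta_rho=2.0, rho0=1025.0, depth=2.0)
```

Such order-of-magnitude estimates help decide whether horizontal advection or vertical mixing dominates before committing to a full CFD run.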
NASA Technical Reports Server (NTRS)
Shu, Chi-Wang
1992-01-01
The nonlinear stability of compact schemes for shock calculations is investigated. In recent years, compact schemes have been used in various numerical simulations, including direct numerical simulation of turbulence. However, to apply them to problems containing shocks, one has to resolve the problem of spurious numerical oscillation and nonlinear instability. A framework to apply nonlinear limiting to a local mean is introduced. The resulting scheme can be proven total variation stable (1D) or maximum norm stable (multi-D) and produces good numerical results in the test cases. The result is summarized in the preprint entitled 'Nonlinearly Stable Compact Schemes for Shock Calculations', which was submitted to the SIAM Journal on Numerical Analysis. Research was continued on issues related to two- and three-dimensional essentially non-oscillatory (ENO) schemes. The main research topics include: parallel implementation of ENO schemes on Connection Machines; boundary conditions; shock interaction with hydrogen bubbles, in preparation for the full combustion simulation; and direct numerical simulation of compressible sheared turbulence.
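The nonlinear limiting that suppresses spurious oscillation near shocks is typically built from slope limiters; the classic minmod limiter below is the simplest member of that family. It is shown only as an illustration of the idea: the report's compact-scheme limiting applies the concept to a local mean and is considerably more elaborate.

```python
def minmod(a, b):
    """Classic minmod limiter: returns the argument of smaller magnitude
    when both candidate slopes share a sign, otherwise zero."""
    if a * b <= 0.0:
        return 0.0
    return a if abs(a) < abs(b) else b

slope_smooth = minmod(0.5, 0.6)     # consistent slopes: keep the smaller
slope_extremum = minmod(0.5, -0.6)  # sign change at an extremum: clip to zero
```

Clipping the reconstructed slope to zero at extrema is exactly what prevents a scheme from creating the new maxima and minima that appear as oscillations around a shock.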
NASA Astrophysics Data System (ADS)
Leclercq, Sylvain; Lidbury, David; Van Dyck, Steven; Moinereau, Dominique; Alamo, Ana; Mazouzi, Abdou Al
2010-11-01
In nuclear power plants, materials may undergo degradation due to severe irradiation conditions that may limit their operational life. Utilities that operate these reactors need to quantify the ageing and potential degradation of some essential structures of the power plant to ensure safe and reliable plant operation. So far, the material databases needed to take account of these degradations in the design and safe operation of installations rely mainly on long-term irradiation programs in test reactors as well as on mechanical or corrosion testing in specialized hot cells. Continuous progress in the physical understanding of the phenomena involved in irradiation damage and in computer sciences has now made possible the development of multi-scale numerical tools able to simulate the effects of irradiation on material microstructure. A first step towards this goal was successfully reached through the development of the RPV-2 and Toughness Module numerical tools by the scientific community created around the FP6 PERFECT project. These tools make it possible to simulate irradiation effects on the constitutive behaviour of the reactor pressure vessel low-alloy steel, and also on its failure properties. Building on the existing PERFECT Roadmap, the four-year Collaborative Project PERFORM 60 has as its main objective the development of multi-scale tools aimed at predicting the combined effects of irradiation and corrosion on internals (austenitic stainless steels), and also the improvement of existing tools for the RPV (bainitic steels). PERFORM 60 is based on two technical sub-projects: (i) RPV and (ii) internals. In addition to these technical sub-projects, the Users' Group and Training sub-project shall allow representatives of constructors, utilities, research organizations… from Europe, the USA and Japan to receive the information and training needed to form their own appraisal of the limits and potential of the developed tools.
An important effort will also be made to train young researchers in the field of materials degradation. PERFORM 60 officially started on March 1st, 2009 with 20 European organizations and universities involved in the nuclear field.
Bishai, David; Sherry, Melissa; Pereira, Claudia C; Chicumbe, Sergio; Mbofana, Francisco; Boore, Amy; Smith, Monica; Nhambi, Leonel; Borse, Nagesh N
2016-01-01
This study describes the development of a self-audit tool for public health and the associated methodology for implementing a district health system self-audit tool that can provide quantitative data on how district governments perceive their performance of the essential public health functions. Development began with a consensus-building process to engage Ministry of Health and provincial health officers in Mozambique and Botswana. We then worked with lists of relevant public health functions as determined by these stakeholders to adapt a self-audit tool describing essential public health functions to each country's health system. We then piloted the tool across districts in both countries and conducted interviews with district health personnel to determine health workers' perception of the usefulness of the approach. Country stakeholders were able to develop consensus around 11 essential public health functions that were relevant in each country. Pilots of the self-audit tool enabled the tool to be effectively shortened. Pilots also disclosed a tendency to upcode during self-audits that was checked by group deliberation. Convening sessions at the district enabled better attendance and representative deliberation. Instant feedback from the audit was a feature that 100% of pilot respondents found most useful. The development of metrics that provide feedback on public health performance can be used as an aid in the self-assessment of health system performance at the district level. Measurements of practice can open the door to future applications for practice improvement and research into the determinants and consequences of better public health practice. The current tool can be assessed for its usefulness to district health managers in improving their public health practice. The tool can also be used by the Ministry of Health or external donors in the African region for monitoring the district-level performance of the essential public health functions.
Bishai, David; Sherry, Melissa; Pereira, Claudia C.; Chicumbe, Sergio; Mbofana, Francisco; Boore, Amy; Smith, Monica; Nhambi, Leonel; Borse, Nagesh N.
2018-01-01
Introduction This study describes the development of a self-audit tool for public health and the associated methodology for implementing a district health system self-audit tool that can provide quantitative data on how district governments perceive their own performance of the essential public health functions. Methods Development began with a consensus-building process to engage Ministry of Health and provincial health officers in Mozambique and Botswana. We then worked with lists of relevant public health functions as determined by these stakeholders to adapt a self-audit tool describing essential public health functions to each country’s health system. We then piloted the tool across districts in both countries and conducted interviews with district health personnel to determine health workers’ perception of the usefulness of the approach. Results Country stakeholders were able to develop consensus around eleven essential public health functions that were relevant in each country. Pilots of the self-audit tool enabled the tool to be effectively shortened. Pilots also disclosed a tendency to upcode during self-audits that was checked by group deliberation. Convening sessions at the district enabled better attendance and representative deliberation. Instant feedback from the audit was a feature that 100% of pilot respondents found most useful. Conclusions The development of metrics that provide feedback on public health performance can be used as an aid in the self-assessment of health system performance at the district level. Measurements of practice can open the door to future applications for practice improvement and research into the determinants and consequences of better public health practice. The current tool can be assessed for its usefulness to district health managers in improving their public health practice. 
The tool can also be used by ministry of health or external donors in the African region for monitoring the district level performance of the essential public health functions. PMID:27682727
Predictive Modeling of Risk Associated with Temperature Extremes over Continental US
NASA Astrophysics Data System (ADS)
Kravtsov, S.; Roebber, P.; Brazauskas, V.
2016-12-01
We build an extremely statistically accurate, essentially bias-free empirical emulator of atmospheric surface temperature and apply it for meteorological risk assessment over the domain of continental US. The resulting prediction scheme achieves an order-of-magnitude or larger gain of numerical efficiency compared with the schemes based on high-resolution dynamical atmospheric models, leading to unprecedented accuracy of the estimated risk distributions. The empirical model construction methodology is based on our earlier work, but is further modified to account for the influence of large-scale, global climate change on regional US weather and climate. The resulting estimates of the time-dependent, spatially extended probability of temperature extremes over the simulation period can be used as a risk management tool by insurance companies and regulatory governmental agencies.
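The risk distributions mentioned in this abstract are built, at their most elementary level, from exceedance probabilities estimated over emulator output. The sketch below shows only that building block; the paper's emulator and risk model are far more elaborate, and the sample temperatures here are invented.

```python
def exceedance_probability(samples, threshold):
    """Empirical probability that temperature exceeds a threshold,
    estimated from a set of emulated values."""
    return sum(1 for t in samples if t > threshold) / len(samples)

# Emulated daily highs (deg C) for one grid cell; 3 of 8 exceed 35 C:
july_highs = [31.2, 33.5, 35.1, 29.8, 36.4, 34.0, 30.5, 37.2]
risk = exceedance_probability(july_highs, 35.0)
```

Mapping such probabilities over space and time is what turns an emulator into the risk-management product described for insurers and regulators.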
Visualization of terahertz surface waves propagation on metal foils
Wang, Xinke; Wang, Sen; Sun, Wenfeng; Feng, Shengfei; Han, Peng; Yan, Haitao; Ye, Jiasheng; Zhang, Yan
2016-01-01
Exploitation of surface plasmonic devices (SPDs) in the terahertz (THz) band is always beneficial for broadening the application potential of THz technologies. To clarify features of SPDs, a practical characterization means is essential for accurately observing the complex field distribution of a THz surface wave (TSW). Here, a THz digital holographic imaging system is employed to coherently exhibit temporal variations and spectral properties of TSWs activated by a rectangular or semicircular slit structure on metal foils. Advantages of the imaging system are comprehensively elucidated, including the exclusive measurement of TSWs and the reduced measurement time. Numerical simulations of experimental procedures further verify the imaging measurement accuracy. It can be anticipated that this imaging system will provide a versatile tool for analyzing the performance and principle of SPDs. PMID:26729652
Fibro/Adipogenic Progenitors (FAPs): Isolation by FACS and Culture.
Low, Marcela; Eisner, Christine; Rossi, Fabio
2017-01-01
Fibro/adipogenic progenitors (FAPs) are tissue-resident mesenchymal stromal cells (MSCs). Current literature supports a role for these cells in the homeostasis and repair of multiple tissues, suggesting that FAPs may have extensive therapeutic potential in the treatment of numerous diseases. In this context, it is crucial to establish efficient and reproducible procedures to purify FAP populations from various tissues. Here, we describe a protocol for the isolation and cell culture of FAPs from murine skeletal muscle using fluorescence-activated cell sorting (FACS), which is particularly useful for experiments where high cell purity is an essential requirement. Identification, isolation, and cell culture of FAPs represent powerful tools that will help us to understand the role of these cells in different conditions and facilitate the development of safe and effective new treatments for diseases.
Temperature and composition profile during double-track laser cladding of H13 tool steel
NASA Astrophysics Data System (ADS)
He, X.; Yu, G.; Mazumder, J.
2010-01-01
Multi-track laser cladding is now applied commercially in a range of industries such as automotive, mining and aerospace due to its diversified potential for material processing. The knowledge of temperature, velocity and composition distribution history is essential for a better understanding of the process and subsequent microstructure evolution and properties. Numerical simulation not only helps to understand the complex physical phenomena and underlying principles involved in this process, but it can also be used in the process prediction and system control. The double-track coaxial laser cladding with H13 tool steel powder injection is simulated using a comprehensive three-dimensional model, based on the mass, momentum, energy conservation and solute transport equation. Some important physical phenomena, such as heat transfer, phase changes, mass addition and fluid flow, are taken into account in the calculation. The physical properties for a mixture of solid and liquid phase are defined by treating it as a continuum media. The velocity of the laser beam during the transition between two tracks is considered. The evolution of temperature and composition of different monitoring locations is simulated.
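The energy-conservation part of the cladding model can be illustrated in its most stripped-down form, an explicit finite-difference step of the 1D heat equation dT/dt = alpha * d2T/dx2. This is a toy version only: the paper's model couples mass, momentum, energy and solute transport in 3D with phase change and fluid flow, and every parameter value below is illustrative.

```python
def step_heat_1d(T, alpha, dx, dt):
    """One explicit finite-difference step of the 1D heat equation
    with fixed end temperatures (Dirichlet boundaries)."""
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit scheme unstable for this step size"
    new = T[:]
    for i in range(1, len(T) - 1):
        new[i] = T[i] + r * (T[i - 1] - 2 * T[i] + T[i + 1])
    return new

# A hot spot in a cold bar diffuses symmetrically outward in one step:
T = step_heat_1d([0.0, 0.0, 100.0, 0.0, 0.0], alpha=1.0, dx=1.0, dt=0.25)
```

The stability check on r = alpha*dt/dx^2 is the same time-step constraint that drives full 3D thermal models toward implicit solvers.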
NASA Astrophysics Data System (ADS)
Follette, K.; McCarthy, D.
2012-08-01
Current trends in the teaching of high school and college science avoid numerical engagement because nearly all students lack basic arithmetic skills and experience anxiety when encountering numbers. Nevertheless, such skills are essential to science and vital to becoming savvy consumers, citizens capable of recognizing pseudoscience, and discerning interpreters of statistics in ever-present polls, studies, and surveys in which our society is awash. Can a general-education collegiate course motivate students to value numeracy and to improve their quantitative skills in what may well be their final opportunity in formal education? We present a tool to assess whether skills in numeracy/quantitative literacy can be fostered and improved in college students through the vehicle of non-major introductory courses in astronomy. Initial classroom applications define the magnitude of this problem and indicate that significant improvements are possible. Based on these initial results we offer this tool online and hope to collaborate with other educators, both formal and informal, to develop effective mechanisms for encouraging all students to value and improve their skills in basic numeracy.
Sherman, Lawrence; Clement, Peter T; Cherian, Meena N; Ndayimirije, Nestor; Noel, Luc; Dahn, Bernice; Gwenigale, Walter T; Kushner, Adam L
2011-01-01
To document infrastructure, personnel, procedures performed, and supplies and equipment available at all county hospitals in Liberia using the World Health Organization Tool for Situational Analysis of Emergency and Essential Surgical Care. Survey of county hospitals using the World Health Organization Tool for Situational Analysis of Emergency and Essential Surgical Care. Sixteen county hospitals in Liberia. Infrastructure, personnel, procedures performed, and supplies and equipment available. Uniformly, gross deficiencies in infrastructure, personnel, and supplies and equipment were identified. The World Health Organization Tool for Situational Analysis of Emergency and Essential Surgical Care was useful in identifying baseline emergency and surgical conditions for evidenced-based planning. To achieve the Poverty Reduction Strategy and delivery of the Basic Package of Health and Social Welfare Services, additional resources and manpower are needed to improve surgical and anesthetic care.
The Web as an educational tool for/in learning/teaching bioinformatics statistics.
Oliver, J; Pisano, M E; Alonso, T; Roca, P
2005-12-01
Statistics provides essential tools in Bioinformatics for interpreting the results of a database search and for managing the enormous amounts of information provided by genomics, proteomics and metabolomics. The goal of this project was the development of a software tool, as simple as possible, to demonstrate the use of statistics in Bioinformatics. Computer Simulation Methods (CSMs) developed using Microsoft Excel were chosen for their broad range of applications, immediate and easy formula calculation, immediate testing, easy graphical representation, and general use and acceptance by the scientific community. The result of these endeavours is a set of utilities which can be accessed from the following URL: http://gmein.uib.es/bioinformatica/statistics. When tested on students with previous coursework in traditional statistical teaching methods, the overall consensus was that Web-based instruction had numerous advantages, but traditional methods with manual calculations were also needed for theory and practice. Once the basic statistical formulas had been mastered, Excel spreadsheets and graphics proved very useful for trying many parameters rapidly without having to perform tedious calculations. CSMs will be of great importance for the training of students and professionals in the field of bioinformatics, and for upcoming applications in self-learning and continuous education.
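The kind of spreadsheet-based Computer Simulation Method the abstract describes, repeated random trials standing in for a closed-form calculation, can be redone in a few lines of Python. The sketch below estimates a Monte Carlo p-value for chance matches in a database search; the scenario and all parameter values are invented for illustration, not taken from the course materials.

```python
import random

def simulated_p_value(observed, n, p, trials=10000, seed=1):
    """Monte Carlo p-value for seeing at least `observed` chance matches
    among n database comparisons, each matching with probability p."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        matches = sum(rng.random() < p for _ in range(n))
        if matches >= observed:
            hits += 1
    return hits / trials

# Observing zero or more matches is certain, so the p-value is 1:
baseline = simulated_p_value(observed=0, n=10, p=0.1)
```

As in the Excel version, changing a parameter and rerunning is instantaneous, which is precisely the pedagogical advantage the students reported.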
Nicholas, Dequina; Proctor, Elizabeth A; Raval, Forum M; Ip, Blanche C; Habib, Chloe; Ritou, Eleni; Grammatopoulos, Tom N; Steenkamp, Devin; Dooms, Hans; Apovian, Caroline M; Lauffenburger, Douglas A; Nikolajczyk, Barbara S
2017-01-01
Numerous studies show that mitochondrial energy generation determines the effectiveness of immune responses. Furthermore, changes in mitochondrial function may regulate lymphocyte function in inflammatory diseases like type 2 diabetes. Analysis of lymphocyte mitochondrial function has been facilitated by introduction of 96-well format extracellular flux (XF96) analyzers, but the technology remains imperfect for analysis of human lymphocytes. Limitations in XF technology include the lack of practical protocols for analysis of archived human cells, and inadequate data analysis tools that require manual quality checks. Current analysis tools for XF outcomes are also unable to automatically assess data quality and delete untenable data from the relatively high number of biological replicates needed to power complex human cell studies. The objectives of work presented herein are to test the impact of common cellular manipulations on XF outcomes, and to develop and validate a new automated tool that objectively analyzes a virtually unlimited number of samples to quantitate mitochondrial function in immune cells. We present significant improvements on previous XF analyses of primary human cells that will be absolutely essential to test the prediction that changes in immune cell mitochondrial function and fuel sources support immune dysfunction in chronic inflammatory diseases like type 2 diabetes.
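The automated quality checking that the abstract says current XF analysis tools lack can be sketched as a rule that discards untenable replicates without manual review. This is a generic illustration only: the thresholds and the coefficient-of-variation criterion are invented for the example and are not the tool's actual algorithm.

```python
import statistics

def qc_replicates(values, max_cv=0.15, min_n=3):
    """Repeatedly drop the replicate furthest from the mean until the
    coefficient of variation falls below max_cv or only min_n remain --
    an automated stand-in for manual quality checks."""
    vals = list(values)
    while len(vals) > min_n:
        mean = statistics.mean(vals)
        if statistics.stdev(vals) / mean <= max_cv:
            break
        vals.remove(max(vals, key=lambda v: abs(v - mean)))
    return vals

# One failed well among four replicates of an oxygen-consumption rate:
kept = qc_replicates([98.0, 102.0, 100.0, 40.0])
```

Applying such a rule uniformly across hundreds of wells is what makes objective analysis of large human-cell studies practical.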
Brown, Raymond J.
1977-01-01
The present invention relates to a tool setting device for use with numerically controlled machine tools, such as lathes and milling machines. A reference position of the machine tool relative to the workpiece along both the X and Y axes is utilized by the control circuit for driving the tool through its program. This reference position is determined for both axes by displacing a single linear variable displacement transducer (LVDT) with the machine tool through a T-shaped pivotal bar. The use of the T-shaped bar allows the cutting tool to be moved sequentially in the X or Y direction for indicating the actual position of the machine tool relative to the predetermined desired position in the numerical control circuit by using a single LVDT.
Learning GIS and exploring geolocated data with the all-in-one Geolokit toolbox for Google Earth
NASA Astrophysics Data System (ADS)
Watlet, A.; Triantafyllou, A.; Bastin, C.
2016-12-01
GIS software packages are today's essential tools for gathering and visualizing geological data, applying spatial and temporal analyses and, finally, creating and sharing interactive maps for further investigation in the geosciences. These skills are especially essential for students who take part in field trips, sample collections or field experiments. However, time is generally lacking to teach in detail all the aspects of visualizing geolocated geoscientific data. For these purposes, we developed Geolokit: a lightweight freeware dedicated to geodata visualization and written in Python, a high-level, cross-platform programming language. Geolokit is accessible through a graphical user interface designed to run in parallel with Google Earth, benefiting from its numerous interactive capabilities. It is designed as a very user-friendly toolbox that allows 'geo-users' to import their raw data (e.g. GPS, sample locations, structural data, field pictures, maps), to use fast data analysis tools and to visualize the results in the Google Earth environment using KML code, without requiring any third-party software except Google Earth itself. Geolokit comes with a large number of geoscience labels, symbols, colours and placemarks and can display several types of geolocated data, including:
- multi-point datasets;
- automatically computed contours of multi-point datasets via several interpolation methods;
- discrete planar and linear structural geology data in 2D or 3D, supporting a wide range of structural input formats;
- clustered stereonets and rose diagrams;
- 2D cross-sections as vertical sections;
- georeferenced maps and grids with user-defined coordinates;
- field pictures, using either the geo-tracking metadata from a camera's built-in GPS module or the same-day track of an external GPS.
In the end, Geolokit is helpful for quickly visualizing and exploring data without losing too much time in the numerous capabilities of full GIS software suites.
We are looking for students and teachers to discover all the functionalities of Geolokit. As the project is under development and planned to be open source, we welcome discussions regarding particular needs or ideas, as well as contributions to the Geolokit project.
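Geolokit's central trick, serializing geolocated records as KML that Google Earth opens directly, can be sketched in a few lines. The function below is an illustrative stand-in, not Geolokit code; `points_to_kml` and the sample coordinates are invented, and a real exporter would add styles, symbols and folders.

```python
def points_to_kml(points):
    """Serialize (name, lon, lat) samples as a minimal KML document.
    KML coordinates are longitude,latitude[,altitude] in that order."""
    placemarks = "\n".join(
        "    <Placemark><name>{}</name>"
        "<Point><coordinates>{},{},0</coordinates></Point></Placemark>"
        .format(name, lon, lat)
        for name, lon, lat in points)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
            '  <Document>\n{}\n  </Document>\n</kml>'.format(placemarks))

# Two hypothetical sample locations exported for Google Earth:
kml_text = points_to_kml([("sample-01", 4.35, 50.84),
                          ("sample-02", 4.40, 50.80)])
```

Saving `kml_text` to a `.kml` file and double-clicking it is all it takes to see the placemarks, which is why KML makes a convenient no-dependency interchange format.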
Use of ePortfolio Presentations in a Baccalaureate Nursing Program
ERIC Educational Resources Information Center
Feather, Rebecca; Ricci, Margaret
2014-01-01
Portfolios are an essential tool for demonstrating professional accomplishments and documenting professional growth in a variety of professions. Because of the competitive job market for new graduate nurses in health care, the development and use of an ePortfolio can be an essential tool for the application and interview process. The purpose of…
Numerical tool for tsunami risk assessment in the southern coast of Dominican Republic
NASA Astrophysics Data System (ADS)
Macias Sanchez, J.; Llorente Isidro, M.; Ortega, S.; Gonzalez Vida, J. M., Sr.; Castro, M. J.
2016-12-01
The southern coast of the Dominican Republic is a very populated region containing several important cities, including Santo Domingo, the capital. Important activities are rooted in the southern coast, including tourism, industry, commercial ports and energy facilities, among others. According to historical reports, the coast has been impacted by large earthquakes accompanied by tsunamis, as in Azua in 1751 and recently in Pedernales in 2010, but their sources are not clearly identified. The aim of the present work is to develop a numerical tool to simulate the impact on the southern coast of the Dominican Republic of tsunamis generated in the Caribbean Sea. This tool, based on the Tsunami-HySEA model from the EDANYA group (University of Malaga, Spain), could be used in the framework of a Tsunami Early Warning System, due to the very short computing times when only propagation is computed, or it could be used to assess inundation impact, computing inundation with an initial 5-meter resolution. Numerical results corresponding to three theoretical sources are used to test the numerical tool.
Validation of Digital Spiral Analysis as Outcome Parameter for Clinical Trials in Essential Tremor
Haubenberger, Dietrich; Kalowitz, Daniel; Nahab, Fatta B.; Toro, Camilo; Ippolito, Dominic; Luckenbaugh, David A.; Wittevrongel, Loretta; Hallett, Mark
2014-01-01
Essential tremor, one of the most prevalent movement disorders, is characterized by kinetic and postural tremor affecting activities of daily living. Spiral drawing is commonly used to visually rate tremor intensity, as part of the routine clinical assessment of tremor and as a tool in clinical trials. We present a strategy to quantify tremor severity from spirals drawn on a digitizing tablet. We validate our method against a well-established visual spiral rating method and compare both methods on their capacity to capture a therapeutic effect, as defined by the change in clinical essential tremor rating scale after an ethanol challenge. Fifty-four Archimedes spirals were drawn using a digitizing tablet by nine ethanol-responsive patients with essential tremor before and at five consecutive time points after the administration of ethanol in a standardized treatment intervention. Quantitative spiral tremor severity was estimated from the velocity tremor peak amplitude after numerical derivation and Fourier transformation of pen-tip positions. In randomly ordered sets, spirals were scored by seven trained raters, using Bain and Findley’s 0 to 10 rating scale. Computerized scores correlated with visual ratings (P < 0.0001). The correlation was significant at each time point before and after ethanol (P < 0.005). Quantitative ratings provided better sensitivity than visual rating to capture the effects of an ethanol challenge (P < 0.05). Using a standardized treatment approach, we were able to demonstrate that spirography time-series analysis is a valid, reliable method to document tremor intensity and a more sensitive measure for small effects than currently available visual spiral rating methods. PMID:21714004
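The quantification step described above, numerical derivation of the pen-tip position followed by a Fourier transform, can be sketched as follows. This is a simplified, single-axis illustration, not the authors' implementation; the function name, sampling rate and 3-12 Hz tremor band are assumptions.

```python
import numpy as np

def tremor_peak(positions, fs, band=(3.0, 12.0)):
    """Estimate tremor severity as the largest spectral amplitude of the
    pen-tip velocity inside a tremor frequency band.
    positions: 1-D coordinate samples; fs: sampling rate in Hz."""
    v = np.gradient(np.asarray(positions, dtype=float)) * fs  # numerical derivative
    spec = 2.0 * np.abs(np.fft.rfft(v)) / len(v)              # one-sided amplitude
    freqs = np.fft.rfftfreq(len(v), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    i = np.argmax(spec[in_band])
    return freqs[in_band][i], spec[in_band][i]

# Synthetic trace: slow drawing drift plus a 5 Hz tremor component
fs = 100.0
t = np.arange(0.0, 10.0, 1.0 / fs)
x = t + 0.1 * np.sin(2 * np.pi * 5.0 * t)
freq, amp = tremor_peak(x, fs)
```

Working on the velocity rather than the raw position suppresses the slowly varying spiral trajectory, so the tremor oscillation dominates the spectrum.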
Pseudo-shock waves and their interactions in high-speed intakes
NASA Astrophysics Data System (ADS)
Gnani, F.; Zare-Behtash, H.; Kontis, K.
2016-04-01
In an air-breathing engine, the flow deceleration from supersonic to subsonic conditions takes place inside the isolator through a gradual compression consisting of a series of shock waves. The wave system, referred to as a pseudo-shock wave or shock train, establishes the combustion chamber entrance conditions and therefore influences the performance of the entire propulsion system. The characteristics of the pseudo-shock depend on a number of variables, which makes this flow phenomenon particularly challenging to analyse. Difficulties in experimentally obtaining accurate flow quantities at high speeds, and discrepancies of numerical approaches with measured data, have been widely reported. Understanding the flow physics in the presence of the interaction of numerous shock waves with the boundary layer in internal flows is essential to developing methods and control strategies. To counteract the negative effects of shock wave/boundary layer interactions, which are responsible for the engine unstart process, multiple flow control methodologies have been proposed. Improved analytical models, advanced experimental methodologies and numerical simulations have allowed a more in-depth analysis of the flow physics. The present paper aims to bring together the main results on the shock train structure and its associated phenomena inside isolators, studied using the aforementioned tools. Several promising flow control techniques that have more recently been applied to manipulate the shock wave/boundary layer interaction are also examined in this review.
GIS-MODFLOW: A small open-source tool for linking GIS data to MODFLOW
NASA Astrophysics Data System (ADS)
Gossel, Wolfgang
2013-06-01
The numerical model MODFLOW (Harbaugh 2005) is an efficient and up-to-date tool for groundwater flow modelling. On the other hand, Geographic Information Systems (GIS) provide useful tools for data preparation and visualization that can also be incorporated into numerical groundwater modelling. An interface between the two would therefore be useful for many hydrogeological investigations. To date, several integrated stand-alone tools have been developed that rely on MODFLOW, MODPATH and transport modelling tools. Simultaneously, several open-source GIS codes were developed to improve functionality and ease of use. These GIS tools can be used as pre- and post-processors of the numerical model MODFLOW via a suitable interface. Here we present GIS-MODFLOW, an open-source tool that provides a new universal interface by using the ESRI ASCII GRID data format, which can be converted into MODFLOW input data. The tool can also treat MODFLOW results. Such a combination of MODFLOW and open-source GIS opens new possibilities to make groundwater flow modelling, and its simulation results, available to wider circles of hydrogeologists.
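The ESRI ASCII GRID format used as the interface is simple enough to parse by hand: a short keyword/value header followed by whitespace-separated cell values. The reader below is a minimal sketch of that idea (the function name and header handling are ours, not GIS-MODFLOW's); a real converter would go on to map the grid onto MODFLOW array inputs.

```python
def read_ascii_grid(text):
    """Parse an ESRI ASCII GRID ('.asc') into a header dict and row list.
    Header lines hold a keyword and a value; everything after is data."""
    lines = text.strip().splitlines()
    header, i = {}, 0
    while i < len(lines):
        parts = lines[i].split()
        first = parts[0].lstrip('-').replace('.', '', 1)
        if len(parts) == 2 and not first.isdigit():
            header[parts[0].lower()] = float(parts[1])
            i += 1
        else:
            break  # first data row reached
    values = [float(v) for line in lines[i:] for v in line.split()]
    ncols, nrows = int(header['ncols']), int(header['nrows'])
    rows = [values[r * ncols:(r + 1) * ncols] for r in range(nrows)]
    return header, rows

demo = """ncols 3
nrows 2
xllcorner 0.0
yllcorner 0.0
cellsize 50.0
NODATA_value -9999
1.0 2.0 3.0
4.0 5.0 -9999"""
header, rows = read_ascii_grid(demo)
```

The same keyword/value header makes the reverse direction (writing model results back out for a GIS) equally straightforward.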
Verifying the error bound of numerical computation implemented in computer systems
Sawada, Jun
2013-03-12
A verification tool receives a finite precision definition for an approximation of an infinite precision numerical function implemented in a processor in the form of a polynomial of bounded functions. The verification tool receives a domain for verifying outputs of segments associated with the infinite precision numerical function. The verification tool splits the domain into at least two segments, wherein each segment is non-overlapping with any other segment and converts, for each segment, a polynomial of bounded functions for the segment to a simplified formula comprising a polynomial, an inequality, and a constant for a selected segment. The verification tool calculates upper bounds of the polynomial for the at least two segments, beginning with the selected segment and reports the segments that violate a bounding condition.
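The split-and-bound strategy can be illustrated numerically. The toy below is not the patented verification tool, which reasons formally about finite-precision hardware; it merely bounds the error of a polynomial approximation on each segment of a domain and reports segments that violate the bound. The sampling density, the Lipschitz slack and the function names are all our assumptions.

```python
import math

def max_error_bound(f, p, a, b, n=64, lipschitz=0.0):
    """Upper bound on max|f - p| over [a, b]: sample n+1 points and add
    slack L*h/2, sound whenever L bounds |(f - p)'| on [a, b]."""
    h = (b - a) / n
    worst = max(abs(f(a + k * h) - p(a + k * h)) for k in range(n + 1))
    return worst + lipschitz * h / 2.0

def verify(f, p, a, b, eps, segments=8, lipschitz=0.0):
    """Split the domain into segments and report those whose error bound
    exceeds eps; an empty list means 'verified' under the assumptions."""
    bad, h = [], (b - a) / segments
    for s in range(segments):
        lo, hi = a + s * h, a + (s + 1) * h
        if max_error_bound(f, p, lo, hi, lipschitz=lipschitz) > eps:
            bad.append((lo, hi))
    return bad

# Degree-3 Taylor approximation of sin; on [0, 0.5] the remainder obeys
# |sin x - p(x)| <= x^5/120 and |(sin - p)'(x)| <= x^4/24 < 0.003
p = lambda x: x - x ** 3 / 6.0
```

With a tolerance of 5e-4 every segment passes; tightening it to 1e-5 makes the segments nearest 0.5, where the Taylor remainder is largest, show up as violations.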
NASA Astrophysics Data System (ADS)
Henderson, Michael
1997-08-01
The Numerical Analysis Objects project (NAO) is a project in the Mathematics Department of IBM's TJ Watson Research Center. While there are plenty of numerical tools available today, it is not an easy task to combine them into a custom application. NAO is directed at the dual problems of building applications from a set of tools, and creating those tools. There are several "reuse" projects, which focus on the problems of identifying and cataloging tools. NAO is directed at the specific context of scientific computing. Because the type of tools is restricted, problems such as tools with incompatible data structures for input and output, and dissimilar interfaces to tools which solve similar problems can be addressed. The approach we've taken is to define interfaces to those objects used in numerical analysis, such as geometries, functions and operators, and to start collecting (and building) a set of tools which use these interfaces. We have written a class library (a set of abstract classes and implementations) in C++ which demonstrates the approach. Besides the classes, the class library includes "stub" routines which allow the library to be used from C or Fortran, and an interface to a Visual Programming Language. The library has been used to build a simulator for petroleum reservoirs, using a set of tools for discretizing nonlinear differential equations that we have written, and includes "wrapped" versions of packages from the Netlib repository. Documentation can be found on the Web at "http://www.research.ibm.com/nao". I will describe the objects and their interfaces, and give examples ranging from mesh generation to solving differential equations.
Essential Oils in Foods: From Ancient Times to the 21st Century.
Sendra, Esther
2016-06-14
Medicinal plants and culinary herbs have been used since ancient times. Essential oils (EO) are a mixture of numerous compounds, mainly terpenes, alcohols, acids, esters, epoxides, aldehydes, ketones, amines and sulfides, that are probably produced by plants as a response to stress [1]. [...].
NUMERICAL STUDY OF ELECTROMAGNETIC WAVES GENERATED BY A PROTOTYPE DIELECTRIC LOGGING TOOL
To understand the electromagnetic waves generated by a prototype dielectric logging tool, a numerical study was conducted using both the finite-difference, time-domain method and a frequency-wavenumber method. When the propagation velocity in the borehole was greater than th...
Computational tool for optimizing the essential oils utilization in inhibiting the bacterial growth
El-Attar, Noha E; Awad, Wael A
2017-01-01
Day after day, the importance of relying on nature in many fields, such as the food, medical, and pharmaceutical industries, is increasing. Essential oils (EOs) are considered one of the most significant natural products for use as antimicrobials, antioxidants, antitumorals, and anti-inflammatories. Optimizing the usage of EOs is a big challenge faced by scientific researchers because of the complexity of the chemical composition of every EO, in addition to the difficulty of determining which is best at inhibiting bacterial activity. The goal of this article is to present a new computational tool based on two methodologies: reduction using rough sets and optimization with particle swarm optimization. The developed tool, dubbed the Essential Oil Reduction and Optimization Tool, is applied to 24 types of EOs that have been tested against 17 different species of bacteria. PMID:28919787
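The optimization half of the approach, particle swarm optimization, can be sketched generically. The code below is a textbook PSO minimizer, not the Essential Oil Reduction and Optimization Tool itself; in the article's setting, the objective `f` would score candidate EO compositions against the inhibition data rather than the convex test function used here.

```python
import random

def pso(f, dim, n_particles=20, iters=100, lo=-5.0, hi=5.0,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimization: minimize f over [lo, hi]^dim."""
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P, pbest = [x[:] for x in X], [f(x) for x in X]   # personal bests
    g = min(range(n_particles), key=pbest.__getitem__)
    G, gbest = P[g][:], pbest[g]                      # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (P[i][d] - X[i][d])
                           + c2 * rng.random() * (G[d] - X[i][d]))
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))
            fx = f(X[i])
            if fx < pbest[i]:
                pbest[i], P[i] = fx, X[i][:]
                if fx < gbest:
                    gbest, G = fx, X[i][:]
    return G, gbest

# Sanity check on a convex test function (the sphere function)
best, score = pso(lambda x: sum(v * v for v in x), dim=2)
```

Each particle is pulled toward its own best position and the swarm's best, with the inertia weight `w` balancing exploration against convergence.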
Weirick, Tyler; John, David; Uchida, Shizuka
2017-03-01
Maintaining the consistency of genomic annotations is an increasingly complex task because of the iterative and dynamic nature of assembly and annotation, the growing numbers of biological databases and insufficient integration of annotations across databases. As information exchange among databases is poor, a 'novel' sequence from one reference annotation could already be annotated in another. Furthermore, relationships to nearby or overlapping annotated transcripts are even more complicated when different genome assemblies are used. To better understand these problems, we surveyed current and previous versions of genomic assemblies and annotations across a number of public databases containing long noncoding RNA. We identified numerous discrepancies among transcripts regarding their genomic locations, transcript lengths and identifiers. Further investigation showed that the positional differences between reference annotations of essentially the same transcript could lead to differences in its measured expression at the RNA level. To aid in resolving these problems, we present the algorithm 'Universal Genomic Accession Hash (UGAHash)' and created an open-source web tool to encourage the usage of the UGAHash algorithm. The UGAHash web tool (http://ugahash.uni-frankfurt.de) can be accessed freely without registration. The web tool allows researchers to generate Universal Genomic Accessions for genomic features or to explore annotations deposited in public databases in past and present versions. We anticipate that the UGAHash web tool will be a valuable tool for checking the existence of transcripts before judging newly discovered transcripts as novel.
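The abstract does not spell out the UGAHash algorithm itself, but the general idea of a coordinate-derived, database-independent identifier can be illustrated as follows. Everything in this sketch (the canonical key layout, the hash choice, the 16-character truncation) is our assumption for illustration only, not the published method.

```python
import hashlib

def genomic_accession(chrom, start, end, strand, assembly):
    """Illustrative coordinate-derived identifier (NOT the published
    UGAHash algorithm): hash a canonical string of the feature's
    location so the same coordinates always yield the same accession,
    regardless of which database the feature came from."""
    key = f"{assembly}:{chrom}:{start}-{end}:{strand}"
    return hashlib.sha256(key.encode("ascii")).hexdigest()[:16]

acc = genomic_accession("chr1", 100, 200, "+", "GRCh38")
```

Because the identifier is a pure function of the coordinates, two databases that annotate the same locus independently will still agree on the accession, which is exactly the cross-database consistency the paper is after.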
Non-invasive imaging of actinic cheilitis and squamous cell carcinoma of the lip.
Lupu, Mihai; Caruntu, Ana; Caruntu, Constantin; Boda, Daniel; Moraru, Liliana; Voiculescu, Vlad; Bastian, Alexandra
2018-05-01
An early diagnosis is of overwhelming importance for the management and prognosis of mucocutaneous cancer. Actinic cheilitis (AC), defined by the clonal expansion of genomically unstable keratinocytes, is the most common potentially malignant lesion affecting the lips. Squamous cell carcinoma (SCC) is the most frequent oral malignancy, and there is strong evidence that the majority of the SCCs of the lip originate from AC. There is considerable difficulty in discerning between dysplasia and invasive carcinomas solely on a clinical basis. Although dermoscopy has become an essential tool for skin tumor evaluation, reflectance confocal microscopy (RCM) is a non-invasive imaging technology that has proved itself extremely useful in the diagnosis and monitoring of several skin diseases, including AC and SCC. The present study aimed to re-emphasize the usefulness of RCM in the early detection of malignant transformation, using AC and SCC of the lips as working examples. Due to the apparent innocuousness of AC for numerous patients, it is not possible to overstress the importance of a correct and early diagnosis, proper treatment and long-term patient follow-up as being essential for preventing the progression to lip SCC, or for its timely diagnosis.
Spike: Artificial intelligence scheduling for Hubble space telescope
NASA Technical Reports Server (NTRS)
Johnston, Mark; Miller, Glenn; Sponsler, Jeff; Vick, Shon; Jackson, Robert
1990-01-01
Efficient utilization of spacecraft resources is essential, but the accompanying scheduling problems are often computationally intractable and are difficult to approximate because of the presence of numerous interacting constraints. Artificial intelligence techniques were applied to the scheduling of the NASA/ESA Hubble Space Telescope (HST). This presents a particularly challenging problem, since a yearlong observing program can contain some tens of thousands of exposures which are subject to a large number of scientific, operational, spacecraft, and environmental constraints. New techniques were developed for machine reasoning about scheduling constraints and goals, especially in cases where uncertainty is an important scheduling consideration and where resolving conflicts among competing preferences is essential. These techniques were utilized in a set of workstation-based scheduling tools (Spike) for HST. Graphical displays of activities, constraints, and schedules are an important feature of the system. High-level scheduling strategies using both rule-based and neural network approaches were developed. While the specific constraints implemented are those most relevant to HST, the framework developed is far more general and could easily handle other kinds of scheduling problems. The concept and implementation of the Spike system are described, along with some experiments in adapting Spike to other spacecraft scheduling domains.
Working with the superabrasives industry to optimize tooling for grinding brittle materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, J.S.; Piscotty, M.A.; Blaedel, K.L.
1996-05-01
The optics manufacturing industry is undertaking a significant modernization, as computer-numeric-controlled (CNC) equipment is joining or replacing open-loop equipment and hand lapping/polishing on the shop floor. Several prototype CNC lens grinding platforms employing ring tools are undergoing development and demonstration at the Center for Optics Manufacturing in Rochester, NY, and several machine tool companies have CNC product lines aimed at the optics industry. Benefits of using CNC ring tool grinding equipment include: essentially unlimited flexibility in selecting radii of curvature without special radiused tooling, the potential for CIM linkages to CAD workstations, and the cultural shift from craftsmen with undocumented procedures to CNC machine operators employing computerized routines for process control. In recent years, these developments have inspired a number of US optics companies to invest in CNC equipment and participate in process development activities involving bound diamond tooling. This modernization process extends beyond large optics companies that have historically embraced advanced equipment, to also include smaller optical shops where a shift to CNC equipment requires a significant company commitment. This paper addresses our efforts to optimize fine grinding wheels to support the new generation of CNC equipment. We begin with a discussion of how fine grinding fits into the optical production process, and then describe an initiative for improving the linkage between the optics industry and the grinding wheel industry. For the purposes of this paper, we define fine wheels to have diamond sizes below 20 micrometers, which includes wheels used for what is sometimes called medium grinding (e.g. 10-20 micrometers diamond) and for fine grinding (e.g. 2-4 micrometers diamond).
1985-10-01
83K0385 Final Report, Vol. 4: Thermal Effects on the Accuracy of Numerically Controlled Machine Tools. Prepared by Raghunath Venugopal and M. M. Barash, October 1985.
Digital Technology Snapshot of the Literacy and Essential Skills Field 2013. Summary Report
ERIC Educational Resources Information Center
Trottier, Vicki
2013-01-01
From January to March 2013, "Canadian Literacy and Learning Network" (CLLN) conducted a snapshot to provide information about how digital technology tools are being used in the Literacy and Essential Skills (L/ES) field. The snapshot focused primarily on digital tools and activities that meet the organizational needs of provincial and…
TRIM—3D: a three-dimensional model for accurate simulation of shallow water flow
Casulli, Vincenzo; Bertolazzi, Enrico; Cheng, Ralph T.
1993-01-01
A semi-implicit finite difference formulation for the numerical solution of three-dimensional tidal circulation is discussed. The governing equations are the three-dimensional Reynolds equations, in which the pressure is assumed to be hydrostatic. A minimal degree of implicitness has been introduced into the finite difference formula so that the resulting algorithm permits the use of large time steps at a minimal computational cost. This formulation includes the simulation of flooding and drying of tidal flats, and is fully vectorizable for efficient implementation on modern vector computers. The high computational efficiency of this method has made it possible to provide the fine details of circulation structure in complex regions that previous studies were unable to obtain. For proper interpretation of the model results, suitable interactive graphics are also an essential tool.
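A one-dimensional, linearized analogue shows why a minimal degree of implicitness permits large time steps. The sketch below is our construction, not TRIM-3D: it treats the free-surface/continuity coupling implicitly, which yields a tridiagonal system for the new surface elevation and removes the surface-gravity-wave CFL restriction. The staggered grid layout and closed-wall boundaries are simplifying assumptions.

```python
import numpy as np

def step(eta, u, H, g, dt, dx):
    """One semi-implicit step of the linearized 1-D shallow-water
    equations on a staggered grid: eta at n cell centres, u at n+1
    faces, with u = 0 enforced at both closed walls."""
    n = eta.size
    lam = g * H * (dt / dx) ** 2
    rhs = eta - H * dt / dx * np.diff(u)        # explicit part of continuity
    # Implicit free-surface system:
    # (1 + 2 lam) eta_j - lam (eta_{j-1} + eta_{j+1}) = rhs_j
    A = np.diag(np.full(n, 1.0 + 2.0 * lam))
    A -= lam * (np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1))
    A[0, 0] = A[-1, -1] = 1.0 + lam             # no-flux (closed wall) ends
    eta_new = np.linalg.solve(A, rhs)
    u_new = u.copy()
    u_new[1:-1] -= g * dt / dx * np.diff(eta_new)  # implicit pressure gradient
    return eta_new, u_new

# A free-surface bump in a closed basin, stepped far beyond the explicit
# CFL limit dt < dx / sqrt(g H) (about 10 s for these parameters):
dx, H, g, dt = 100.0, 10.0, 9.81, 50.0
x = dx * (np.arange(50) + 0.5)
eta = np.exp(-((x - 2500.0) / 400.0) ** 2)
u = np.zeros(51)
mass0 = eta.sum()
for _ in range(20):
    eta, u = step(eta, u, H, g, dt, dx)
```

Because the row sums of the implicit matrix equal one and the wall velocities stay zero, total volume is conserved to round-off even at this five-times-CFL time step; the scheme stays stable, at the cost of some numerical damping of the waves.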
Complexity-entropy causality plane: A useful approach for distinguishing songs
NASA Astrophysics Data System (ADS)
Ribeiro, Haroldo V.; Zunino, Luciano; Mendes, Renio S.; Lenzi, Ervin K.
2012-04-01
Nowadays we are often faced with huge databases resulting from the rapid growth of data storage technologies. This is particularly true when dealing with music databases. In this context, it is essential to have techniques and tools able to discriminate properties within these massive sets. In this work, we report on a statistical analysis of more than ten thousand songs aiming to obtain a complexity hierarchy. Our approach is based on the estimation of the permutation entropy combined with an intensive complexity measure, building up the complexity-entropy causality plane. The results obtained indicate that this representation space is very promising for discriminating songs as well as for allowing a relative quantitative comparison among songs. Additionally, we believe that the method reported here may be applied in practical situations since it is simple, robust and has a fast numerical implementation.
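The first coordinate of the causality plane, the Bandt-Pompe permutation entropy, has a compact implementation. The sketch below is a generic version, not the authors' code: it counts ordinal patterns of successive samples and normalizes the Shannon entropy by log d!. Adding one of the standard statistical complexity measures on top would supply the plane's second coordinate.

```python
import math
from collections import Counter

def permutation_entropy(series, order=3, normalize=True):
    """Bandt-Pompe permutation entropy of a 1-D series: the Shannon
    entropy of the distribution of ordinal patterns of length `order`."""
    patterns = Counter()
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        # Ordinal pattern: argsort of the samples within the window
        patterns[tuple(sorted(range(order), key=lambda k: window[k]))] += 1
    total = sum(patterns.values())
    h = -sum((c / total) * math.log(c / total) for c in patterns.values())
    return h / math.log(math.factorial(order)) if normalize else h
```

A monotone series exhibits a single ordinal pattern (normalized entropy 0), while a well-shuffled one approaches 1; songs fall in between, which is what makes the hierarchy possible.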
Cultural Resources Collection Analysis Albeni Falls Project, Northern Idaho.
1987-01-01
A wide range of tools, including flaked and ground stone, was documented: bifacial tools, drills, gravers, scrapers, numerous pestles and mortars, bolas stones, nephrite adzes, notched pebbles or net weights, an atlatl weight, and several unique incised and carved items, among them a zoomorphic pestle (?) fragment.
Enhance your team-based qualitative research.
Fernald, Douglas H; Duclos, Christine W
2005-01-01
Qualitative research projects often involve the collaborative efforts of a research team. Challenges inherent in teamwork include changes in membership and differences in analytical style, philosophy, training, experience, and skill. This article discusses teamwork issues and the tools and techniques used to improve team-based qualitative research. We drew on our experiences in working on numerous projects of varying size, duration, and purpose. Through trials of different tools and techniques, expert consultation, and review of the literature, we learned to improve how we build teams, manage information, and disseminate results. Attention given to team members and team processes is as important as choosing appropriate analytical tools and techniques. Attentive team leadership, commitment to early and regular team meetings, and discussion of roles, responsibilities, and expectations all help build more effective teams and establish clear norms. As data are collected and analyzed, it is important to anticipate potential problems arising from differing skills and styles, and to consider how information and files are managed. Discuss analytical preferences and biases, and set clear guidelines and practices for how data will be analyzed and handled. As emerging ideas and findings disperse across team members, common tools (such as summary forms and data grids), coding conventions, intermediate goals or products, and regular documentation help capture essential ideas and insights. In a team setting, little should be left to chance. This article identifies ways to improve team-based qualitative research with a more considered and systematic approach. Qualitative researchers will benefit from further examination and discussion of effective, field-tested, team-based strategies.
Visualization of small scale structures on high resolution DEMs
NASA Astrophysics Data System (ADS)
Kokalj, Žiga; Zakšek, Klemen; Pehani, Peter; Čotar, Klemen; Oštir, Krištof
2015-04-01
Knowledge of terrain morphology is very important for the observation of numerous processes and events, and digital elevation models are therefore one of the most important datasets in geographic analyses. Furthermore, the recognition of natural and anthropogenic microrelief structures, which can be observed on detailed terrain models derived from aerial laser scanning (lidar) or structure-from-motion photogrammetry, is of paramount importance in many applications. In this paper we thus examine and evaluate methods of raster lidar data visualization for the recognition of microrelief features, and present a series of strategies to assist in selecting the preferred visualization for structures of various shapes and sizes, set in varied landscapes. Often the answer is not definite, and frequently a combination of techniques has to be used to map a very diverse landscape. Researchers have only very recently been able to benefit from free software for the calculation of advanced visualization techniques. These tools are often difficult to understand, have numerous options that confuse the user, or require and produce non-standard data formats, because they were written for specific purposes. We therefore designed the Relief Visualization Toolbox (RVT) as a free, easy-to-use, standalone application for creating visualizations from high-resolution digital elevation data. It is tailored for beginners in relief interpretation, but it can also be used by more advanced users in data processing and geographic information systems. It offers a range of techniques, such as simple hillshading and its derivatives, slope gradient, trend removal, positive and negative openness, sky-view factor, and anisotropic sky-view factor. All included methods have been proven effective for the detection of small-scale features, and the default settings are optimised to accomplish this task.
However, the usability of the tool goes beyond computation for visualization purposes, as sky-view factor, for example, is an essential variable in many fields, e.g. in meteorology. RVT produces two types of results: 1) the original files have a full range of values and are intended for further analyses in geographic information systems, 2) the simplified versions are histogram stretched for visualization purposes and saved as 8-bit GeoTIFF files. This means that they can be explored in non-GIS software, e.g. with simple picture viewers, which is essential when a larger community of non-specialists needs to be considered, e.g. in public collaborative projects. The tool recognizes all frequently used single band raster formats and supports elevation raster file data conversion.
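The simplest of the listed techniques, analytical hillshading, takes only a few lines given a DEM array. The sketch below is a common textbook formulation, not RVT's code; sign and aspect conventions vary between implementations, so the gradient and aspect handling here are our assumptions.

```python
import numpy as np

def hillshade(dem, cellsize, azimuth=315.0, altitude=45.0):
    """Classic analytical hillshade (values in [0, 1]) from a DEM array.
    azimuth: light direction in degrees clockwise from north;
    altitude: light elevation angle above the horizon in degrees."""
    zen = np.radians(90.0 - altitude)
    az = np.radians(azimuth)
    dzdy, dzdx = np.gradient(dem, cellsize)   # row (N-S) and column (E-W) slopes
    slope = np.arctan(np.hypot(dzdx, dzdy))
    aspect = np.arctan2(dzdx, -dzdy)          # one common aspect convention
    shaded = (np.cos(zen) * np.cos(slope)
              + np.sin(zen) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shaded, 0.0, 1.0)
```

On perfectly flat terrain the result reduces to the cosine of the solar zenith angle everywhere, which is a handy sanity check; the derivative techniques in RVT (openness, sky-view factor) build on the same local-geometry quantities.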
A Figure of Merit: Quantifying the Probability of a Nuclear Reactor Accident.
Wellock, Thomas R
In recent decades, probabilistic risk assessment (PRA) has become an essential tool in risk analysis and management in many industries and government agencies. The origins of PRA date to the 1975 publication of the U.S. Nuclear Regulatory Commission's (NRC) Reactor Safety Study led by MIT professor Norman Rasmussen. The "Rasmussen Report" inspired considerable political and scholarly disputes over the motives behind it and the value of its methods and numerical estimates of risk. The Report's controversies have overshadowed the deeper technical origins of risk assessment. Nuclear experts had long sought to express risk in a "figure of merit" to verify the safety of weapons and, later, civilian reactors. By the 1970s, technical advances in PRA gave the methodology the potential to serve political ends, too. The Report, it was hoped, would prove nuclear power's safety to a growing chorus of critics. Subsequent attacks on the Report's methods and numerical estimates damaged the NRC's credibility. PRA's fortunes revived when the 1979 Three Mile Island accident demonstrated PRA's potential for improving the safety of nuclear power and other technical systems. Nevertheless, the Report's controversies endure in mistrust of PRA and its experts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schanen, Michel; Marin, Oana; Zhang, Hong
Adjoints are an important computational tool for large-scale sensitivity evaluation, uncertainty quantification, and derivative-based optimization. An essential component of their performance is the storage/recomputation balance, in which efficient checkpointing methods play a key role. We introduce a novel asynchronous two-level adjoint checkpointing scheme for multistep numerical time discretizations targeted at large-scale numerical simulations. The checkpointing scheme combines bandwidth-limited disk checkpointing and binomial memory checkpointing. Based on assumptions about the target petascale systems, which we later demonstrate to be realistic on the IBM Blue Gene/Q system Mira, we create a model of the expected performance of our checkpointing approach and validate it using the highly scalable Navier-Stokes spectral-element solver Nek5000 on small to moderate subsystems of the Mira supercomputer. In turn, this allows us to predict optimal algorithmic choices when using all of Mira. We also demonstrate that two-level checkpointing is significantly superior to single-level checkpointing when adjoining a large number of time integration steps. To our knowledge, this is the first time two-level checkpointing has been designed, implemented, tuned, and demonstrated on fluid dynamics codes at a large scale of 50k+ cores.
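The storage/recomputation trade-off at the heart of adjoint checkpointing can be shown with a much simpler scheme than the paper's asynchronous two-level binomial one: store only every k-th forward state, then recompute the intermediate states during the reverse sweep. The toy scalar time-stepper below is invented for illustration:

```python
import math

def step(x):             # one explicit time step of a toy scalar recurrence
    return x + 0.01 * math.sin(x)

def dstep(x):            # derivative of the step map, used in the adjoint sweep
    return 1.0 + 0.01 * math.cos(x)

def adjoint_checkpointed(x0, n_steps, stride):
    """Adjoint d x_N / d x_0 with uniform checkpointing: only every `stride`-th
    state is stored; segment states are recomputed in the reverse sweep.
    A sketch of the trade-off only -- the paper's scheme is binomial, two-level,
    and asynchronous."""
    checkpoints = {}
    x = x0
    for n in range(n_steps):             # forward sweep, sparse storage
        if n % stride == 0:
            checkpoints[n] = x
        x = step(x)
    lam = 1.0                            # seed: d x_N / d x_N
    for n in range(n_steps - 1, -1, -1): # reverse sweep
        xc = checkpoints[n - n % stride]
        for _ in range(n % stride):      # recomputation cost paid here
            xc = step(xc)
        lam *= dstep(xc)                 # chain rule, one step back
    return lam

def adjoint_full(x0, n_steps):
    """Full-storage reference: keep every state, no recomputation."""
    states = [x0]
    for _ in range(n_steps):
        states.append(step(states[-1]))
    lam = 1.0
    for x in reversed(states[:-1]):
        lam *= dstep(x)
    return lam

print(abs(adjoint_checkpointed(0.3, 40, 8) - adjoint_full(0.3, 40)) < 1e-12)  # True
```

With stride 8 the checkpointed version stores 5 states instead of 41, at the price of re-running up to 7 forward steps per reverse step; binomial schemes choose checkpoint positions to minimize exactly this recomputation count.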
Experimental validation of ultrasonic NDE simulation software
NASA Astrophysics Data System (ADS)
Dib, Gerges; Larche, Michael; Diaz, Aaron A.; Crawford, Susan L.; Prowant, Matthew S.; Anderson, Michael T.
2016-02-01
Computer modeling and simulation is becoming an essential tool for transducer design and insight into ultrasonic nondestructive evaluation (UT-NDE). As the popularity of simulation tools for UT-NDE increases, it becomes important to assess their reliability to model acoustic responses from defects in operating components and to provide information that is consistent with in-field inspection data. This includes information about the detectability of different defect types for a given UT probe. Recently, a cooperative program between the Electric Power Research Institute and the U.S. Nuclear Regulatory Commission was established to validate numerical modeling software commonly used for simulating UT-NDE of nuclear power plant components. In the first phase of this cooperative program, extensive experimental UT measurements were conducted on machined notches with varying depth, length, and orientation in stainless steel plates. The notches were then modeled in CIVA, a semi-analytical NDE simulation platform developed by the French Commissariat a l'Energie Atomique, and their responses were compared with the experimental measurements. Discrepancies between experimental and simulation results are due either to improper inputs to the simulation model or to incorrect approximations and assumptions in the numerical models. To address the former, a variation study was conducted on the different parameters that are required as inputs for the model, specifically the specimen and transducer properties. The ability of simulations to give accurate predictions regarding the detectability of the different defects was then demonstrated, including results in terms of variations in defect amplitude indications and the ratios between tip-diffracted and specular signal amplitudes.
A manifold independent approach to understanding transport in stochastic dynamical systems
NASA Astrophysics Data System (ADS)
Bollt, Erik M.; Billings, Lora; Schwartz, Ira B.
2002-12-01
We develop a new collection of tools aimed at studying stochastically perturbed dynamical systems. Specifically, in the setting of bi-stability, that is, a two-attractor system, it has previously been observed numerically that a small noise volume is sufficient to destroy what would, in the zero-noise case, be barriers in the phase space (pseudo-barriers), thus creating a pre-heteroclinic-tangency, chaos-like behavior. The stochastic dynamical system has a corresponding Frobenius-Perron operator with a stochastic kernel, which describes how densities of initial conditions move under the noisy map. Thus, in studying the action of the Frobenius-Perron operator, we learn about the transport of the map; we have employed a Galerkin-Ulam-like method to project the Frobenius-Perron operator onto a discrete basis set of characteristic functions to highlight this action localized in specified regions of the phase space. Graph-theoretic methods allow us to re-order the resulting finite-dimensional Markov operator approximation so as to highlight the regions of the original phase space which are particularly active pseudo-barriers of the stochastic dynamics. Our toolbox allows us to find: (1) regions of high activity of transport, (2) flux across pseudo-barriers, and also (3) expected time of escape from pseudo-basins. Some of these quantities are also obtainable via the manifold-dependent stochastic Melnikov method, but the Melnikov method applies only to a very special class of models for which the unperturbed homoclinic orbit is available. Our methods are unique in that they can essentially be considered a “black box” of tools which can be applied to a wide range of stochastic dynamical systems in the absence of a priori knowledge of manifold structures. We use a model of childhood diseases to showcase our methods.
Our tools will allow us to make specific observations of: (1) loss of reducibility between basins with increasing noise, (2) identification in the phase space of active regions of stochastic transport, (3) stochastic flux which essentially completes the heteroclinic tangle.
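The Galerkin-Ulam projection described in this abstract can be sketched in a few lines: partition the phase space into bins (characteristic functions), sample points in each bin, and count where the noisy map sends them, yielding a row-stochastic Markov matrix approximating the Frobenius-Perron operator. The map, noise level, and bin count below are illustrative choices, not those of the paper:

```python
import random

def ulam_matrix(f, n_bins, samples_per_bin=200, noise=0.0, seed=0):
    """Galerkin-Ulam projection of the (stochastic) Frobenius-Perron operator
    on [0, 1]: P[i][j] estimates the probability that a point in bin i
    is mapped into bin j by f plus Gaussian noise."""
    rng = random.Random(seed)
    P = [[0.0] * n_bins for _ in range(n_bins)]
    for i in range(n_bins):
        for _ in range(samples_per_bin):
            x = (i + rng.random()) / n_bins           # sample inside bin i
            y = f(x) + noise * rng.gauss(0.0, 1.0)    # noisy image
            y = min(max(y, 0.0), 1.0 - 1e-12)         # keep inside the domain
            P[i][int(y * n_bins)] += 1.0 / samples_per_bin
    return P

logistic = lambda x: 4.0 * x * (1.0 - x)              # stand-in test map
P = ulam_matrix(logistic, n_bins=32, noise=0.01)
print(all(abs(sum(row) - 1.0) < 1e-9 for row in P))   # True: row-stochastic
```

Graph-theoretic reordering and escape-time analysis as in the paper would then operate on this finite Markov matrix rather than on the map itself.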
Upper-limb tremor suppression with a 7DOF exoskeleton power-assist robot.
Kiguchi, Kazuo; Hayashi, Yoshiaki
2013-01-01
A tremor, one of the involuntary motions, is a somewhat rhythmic motion that may occur in various body parts. Although there are several kinds of tremor, essential tremor is the most common tremor disorder of the arm. Essential tremor is a disorder of unknown cause, and it is common in the elderly. It interferes with a patient's activities of daily living because it may occur during voluntary motion. If a patient with essential tremor uses an EMG-based controlled power-assist robot, the robot might misinterpret the user's motion intention because of the effect of the tremor. In that case, upper-limb power-assist robots must carry out tremor suppression as well as power assist, since a person performs various precise tasks with tools using the upper limb in daily living. It is therefore important to suppress the tremor at the hand and the grasped tool. However, a control method that suppresses vibration only at the hand and the tip of the tool may induce vibration in other parts, such as the elbow. In this paper, a tremor suppression control method for an upper-limb power-assist robot is proposed. The proposed method suppresses vibration of the elbow in addition to that of the hand and the tip of the tool. The validity of the proposed method was verified by experiments.
2012-01-05
learn about the latest designs, trends in fashion, and scientific breakthroughs in chair ergonomics. Using this tradeshow, the Furnishings Commodity... these tools is essential to designing the optimal contract that reaps the most value from the exchange. Therefore, this market intelligence guide is... portfolio matrix) that are transferrable to the not-for-profit sector are absent.
Numerical modelling of tool wear in turning with cemented carbide cutting tools
NASA Astrophysics Data System (ADS)
Franco, P.; Estrems, M.; Faura, F.
2007-04-01
A numerical model is proposed for analysing the flank and crater wear resulting from the loss of material on the cutting tool surface in turning processes due to the wear mechanisms of adhesion, abrasion and fracture. By means of this model, the material loss along the cutting tool surface can be analysed, and the worn surface shape during workpiece machining can be determined. The proposed model analyses the gradual degradation of the cutting tool during the turning operation, and tool wear can be estimated as a function of cutting time. Wear-land width (VB) and crater depth (KT) can be obtained to describe the material loss on the cutting tool surface, and the effects of the distinct wear mechanisms on surface shape can be studied. The parameters required for the tool wear model are obtained from the literature and from experimental observation for AISI 4340 steel turning with WC-Co cutting tools.
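Estimating wear-land width as a function of cutting time can be illustrated with a Usui-type wear-rate law, one standard choice in the tool-wear literature; the paper's model combines adhesion, abrasion and fracture mechanisms and is more elaborate. All coefficients below are illustrative placeholders, not calibrated values for AISI 4340 / WC-Co:

```python
import math

def flank_wear_usui(cutting_time_s, sigma_n=1.2e9, v_s=2.0, T=1100.0,
                    A=2e-11, B=5.3e3, dt=1.0):
    """Integrate a Usui-type wear-rate law
        dVB/dt = A * sigma_n * v_s * exp(-B / T)
    to estimate flank wear-land width VB (here in mm) over cutting time.

    sigma_n: normal stress on the flank [Pa]; v_s: sliding velocity [m/s];
    T: interface temperature [K]; A, B: empirical constants (invented here).
    Cutting conditions are assumed constant, so the rate is constant too."""
    vb, t = 0.0, 0.0
    rate = A * sigma_n * v_s * math.exp(-B / T)
    while t < cutting_time_s:
        vb += rate * dt
        t += dt
    return vb

vb_5min = flank_wear_usui(300.0)
vb_10min = flank_wear_usui(600.0)
print(vb_10min > vb_5min > 0.0)   # True: wear grows monotonically with time
```

A model like the paper's would instead update the local wear rate along the worn surface each step, so that VB and KT evolve with the changing contact geometry.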
Microcomputer-Based Access to Machine-Readable Numeric Databases.
ERIC Educational Resources Information Center
Wenzel, Patrick
1988-01-01
Describes the use of microcomputers and relational database management systems to improve access to numeric databases by the Data and Program Library Service at the University of Wisconsin. The internal records management system, in-house reference tools, and plans to extend these tools to the entire campus are discussed. (3 references) (CLB)
Efficient hybrid-symbolic methods for quantum mechanical calculations
NASA Astrophysics Data System (ADS)
Scott, T. C.; Zhang, Wenxing
2015-06-01
We present hybrid symbolic-numerical tools to generate optimized numerical code for rapid prototyping and fast numerical computation, starting from a computer algebra system (CAS) and tailored to any given quantum mechanical problem. Although a major focus concerns the quantum chemistry methods of H. Nakatsuji, which have yielded successful and very accurate eigensolutions for small atoms and molecules, the tools are general and may be applied to any basis set calculation with a variational principle applied to its linear and non-linear parameters.
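The CAS-to-numeric workflow can be illustrated with a textbook variational problem: the hydrogen-atom trial wavefunction exp(-alpha*r) gives, in atomic units, the energy E(alpha) = alpha**2/2 - alpha. Here SymPy stands in for the CAS of the paper, deriving the optimality condition symbolically and then emitting fast numerical code:

```python
import sympy as sp

# Variational energy of hydrogen with trial wavefunction exp(-alpha*r),
# atomic units. A minimal stand-in for the paper's hybrid symbolic-numeric
# pipeline: symbolic manipulation first, generated numeric code second.
alpha = sp.symbols('alpha', positive=True)
E = alpha**2 / 2 - alpha

dE = sp.diff(E, alpha)                       # symbolic gradient
alpha_opt = sp.solve(sp.Eq(dE, 0), alpha)[0] # non-linear parameter optimum

E_num = sp.lambdify(alpha, E, 'math')        # generated numerical code
print(alpha_opt, E_num(float(alpha_opt)))    # 1 -0.5
```

The exact ground-state energy of -0.5 hartree is recovered at alpha = 1; for realistic basis sets the same pattern applies, with the symbolic layer handling integrals and the generated code handling the numerical optimization of linear and non-linear parameters.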
NASA Astrophysics Data System (ADS)
Kasprak, A.; Brasington, J.; Hafen, K.; Wheaton, J. M.
2015-12-01
Numerical models that predict channel evolution through time are an essential tool for investigating processes that occur over timescales which render field observation intractable. However, available morphodynamic models generally take one of two approaches to the complex problem of computing morphodynamics, resulting in oversimplification of the relevant physics (e.g. cellular models) or faithful, yet computationally intensive, representations of the hydraulic and sediment transport processes at play. The practical implication of these approaches is that river scientists must often choose between unrealistic results, in the case of the former, or computational demands that render modeling realistic spatiotemporal scales of channel evolution impossible. Here we present a new modeling framework that operates at the timescale of individual competent flows (e.g. floods), and uses a highly-simplified sediment transport routine that moves volumes of material according to morphologically-derived characteristic transport distances, or path lengths. Using this framework, we have constructed an open-source morphodynamic model, termed MoRPHED, which is here applied, and its validity investigated, at timescales ranging from a single event to a decade on two braided rivers in the UK and New Zealand. We do not purport that MoRPHED is the best, nor even an adequate, tool for modeling braided river dynamics at this range of timescales. Rather, our goal in this research is to explore the utility, feasibility, and sensitivity of an event-scale, path-length-based modeling framework for predicting braided river dynamics. To that end, we further explore (a) which processes are naturally emergent and which must be explicitly parameterized in the model, (b) the sensitivity of the model to the choice of particle travel distance, and (c) whether an event-scale model timestep is adequate for producing braided channel dynamics. 
The results of this research may inform techniques for future morphodynamic modeling that seeks to maximize computational resources while modeling fluvial dynamics at the timescales of change.
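The path-length idea behind an event-scale model of this kind can be reduced to a one-dimensional toy: erode a volume from each cell and deposit it a characteristic travel distance downstream. MoRPHED itself operates on 2-D rasters with distributions of travel distance; the function below is an invented illustration of the framework, not the model's code:

```python
def route_event(bed, erosion, path_length):
    """One event-scale morphodynamic step on a 1-D bed elevation profile:
    erode erosion[i] from cell i and deposit it `path_length` cells
    downstream, trapping sediment at the outlet cell."""
    new_bed = bed[:]
    for i, e in enumerate(erosion):
        new_bed[i] -= e
        j = min(i + path_length, len(bed) - 1)   # deposition cell
        new_bed[j] += e
    return new_bed

bed = [10.0, 10.0, 10.0, 10.0, 10.0, 10.0]       # elevations [m]
erosion = [0.2, 0.1, 0.0, 0.3, 0.0, 0.0]         # event-scale scour [m]
after = route_event(bed, erosion, path_length=2)
print(abs(sum(after) - sum(bed)) < 1e-9)         # True: mass conserved
```

Because each event is a single transport step rather than thousands of hydraulic time steps, decadal simulations reduce to a few hundred such updates, which is the computational saving the framework targets.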
Artificial Boundary Conditions for Computation of Oscillating External Flows
NASA Technical Reports Server (NTRS)
Tsynkov, S. V.
1996-01-01
In this paper, we propose a new technique for the numerical treatment of external flow problems with oscillatory behavior of the solution in time. Specifically, we consider the case of unbounded compressible viscous plane flow past a finite body (airfoil). Oscillations of the flow in time may be caused by the time-periodic injection of fluid into the boundary layer, which, in accordance with experimental data, may essentially increase the performance of the airfoil. To conduct the actual computations, we have to somehow restrict the original unbounded domain, that is, to introduce an artificial (external) boundary and to further consider only a finite computational domain. Consequently, we will need to formulate some artificial boundary conditions (ABC's) at the introduced external boundary. The ABC's we are aiming to obtain must meet a fundamental requirement. One should be able to uniquely complement the solution calculated inside the finite computational domain to its infinite exterior so that the original problem is solved within the desired accuracy. Our construction of such ABC's for oscillating flows is based on an essential assumption: the Navier-Stokes equations can be linearized in the far field against the free-stream background. To actually compute the ABC's, we represent the far-field solution as a Fourier series in time and then apply the Difference Potentials Method (DPM) of V. S. Ryaben'kii. This paper contains a general theoretical description of the algorithm for setting the DPM-based ABC's for time-periodic external flows. Based on our experience in implementing analogous ABC's for steady-state problems (a simpler case), we expect that these boundary conditions will become an effective tool for constructing robust numerical methods to calculate oscillatory flows.
Cement bond evaluation method in horizontal wells using segmented bond tool
NASA Astrophysics Data System (ADS)
Song, Ruolong; He, Li
2018-06-01
Most existing cement evaluation technologies suffer from tool eccentralization due to gravity in highly deviated and horizontal wells. This paper proposes a correction method to lessen the effects of tool eccentralization on cement bond evaluation results using a segmented bond tool, which has an omnidirectional sonic transmitter and eight segmented receivers evenly arranged around the tool 2 ft from the transmitter. Using a 3-D finite-difference parallel numerical simulation method, we investigate the logging responses of centred and eccentred segmented bond tools in a variety of bond conditions. From the numerical results, we find that the tool eccentricity and channel azimuth can be estimated from the measured sector amplitudes. The average of the sector amplitudes when the tool is eccentred can be corrected to that when the tool is centred, and the corrected amplitude is then used to calculate the channel size. The proposed method is applied to both synthetic and field data. For synthetic data, the method estimates the tool eccentricity with small error, and the bond map is improved after correction. For field data, the estimated tool eccentricity agrees well with the measured well deviation angle. Though the method still suffers from low accuracy in calculating the channel azimuth, the credibility of the corrected bond map is improved, especially in horizontal wells. This offers a way to evaluate bond conditions in horizontal wells using an existing logging tool. The numerical results in this paper can aid the understanding of segmented-tool measurements in both vertical and horizontal wells.
Epigenetic regulation of gene expression in cancer: techniques, resources and analysis
Kagohara, Luciane T; Stein-O’Brien, Genevieve L; Kelley, Dylan; Flam, Emily; Wick, Heather C; Danilova, Ludmila V; Easwaran, Hariharan; Favorov, Alexander V; Qian, Jiang; Gaykalova, Daria A; Fertig, Elana J
2018-01-01
Abstract Cancer is a complex disease, driven by aberrant activity in numerous signaling pathways in even individual malignant cells. Epigenetic changes are critical mediators of these functional changes that drive and maintain the malignant phenotype. Changes in DNA methylation, histone acetylation and methylation, noncoding RNAs, and posttranslational modifications are all epigenetic drivers in cancer, independent of changes in the DNA sequence. These epigenetic alterations were once thought to be crucial only for maintenance of the malignant phenotype. Now, epigenetic alterations are also recognized as critical for disrupting essential pathways that protect the cells from uncontrolled growth, longer survival and establishment in distant sites from the original tissue. In this review, we focus on DNA methylation and chromatin structure in cancer. The precise functional role of these alterations is an area of active research using emerging high-throughput approaches and bioinformatics analysis tools. Therefore, this review also describes these high-throughput measurement technologies, public domain databases for high-throughput epigenetic data in tumors and model systems, and bioinformatics algorithms for their analysis. Advances in bioinformatics methods that combine these epigenetic data with genomics data are essential to infer the function of specific epigenetic alterations in cancer. These integrative algorithms are also a focus of this review. Future studies using these emerging technologies will elucidate how alterations in the cancer epigenome cooperate with genetic aberrations during tumor initiation and progression. This deeper understanding is essential to future studies with epigenetic biomarkers and precision medicine using emerging epigenetic therapies. PMID:28968850
De la Fuente, Ildefonso M.; Cortes, Jesus M.; Perez-Pinilla, Martin B.; Ruiz-Rodriguez, Vicente; Veguillas, Juan
2011-01-01
Background Experimental observations and numerical studies with dissipative metabolic networks have shown that cellular enzymatic activity self-organizes spontaneously, leading to the emergence of a metabolic core formed by a set of enzymatic reactions which are always active under all environmental conditions, while the rest of the catalytic processes are only intermittently active. The reactions of the metabolic core are essential for biomass formation and to assure optimal metabolic performance. The on-off catalytic reactions and the metabolic core are essential elements of a Systemic Metabolic Structure which seems to be a key feature common to all cellular organisms. Methodology/Principal Findings In order to investigate the functional importance of the metabolic core we have studied different catalytic patterns of a dissipative metabolic network under different external conditions. The emerging biochemical data have been analysed using information-based dynamic tools, such as Pearson's correlation and Transfer Entropy (which measures effective functionality). Our results show that a functional structure of effective connectivity emerges which is dynamical and characterized by significant variations of bio-molecular information flows. Conclusions/Significance We have quantified essential aspects of the metabolic core functionality. The always-active enzymatic reactions form a hub, with a high degree of effective connectivity, exhibiting a wide range of functional information values and able to act either as a source or as a sink of bio-molecular causal interactions. Likewise, we have found that the metabolic core is an essential part of an emergent functional structure characterized by catalytic modules and metabolic switches which allow critical transitions in enzymatic activity.
Both the metabolic core and the catalytic switches, in which intermittently active enzymes are also involved, seem to be fundamental elements in the self-regulation of the Systemic Metabolic Structure. PMID:22125607
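Transfer entropy, the directed effective-connectivity measure this abstract relies on, can be estimated with a small plug-in estimator for discrete series. The estimator below (history length 1, binary states, invented test series) is a minimal illustration; the authors' binning and estimator may differ:

```python
import math
import random
from collections import Counter

def transfer_entropy(src, dst):
    """Plug-in estimate of transfer entropy TE(src -> dst) in bits,
    with history length 1:
        TE = sum p(d', d, s) * log2( p(d'|d, s) / p(d'|d) )
    where d' is the next value of dst, d its current value, s the
    current value of src."""
    triples = list(zip(dst[1:], dst[:-1], src[:-1]))   # (d', d, s)
    n = len(triples)
    c_dds = Counter(triples)
    c_ds = Counter((d, s) for _, d, s in triples)
    c_dd = Counter((dp, d) for dp, d, _ in triples)
    c_d = Counter(d for _, d, _ in triples)
    te = 0.0
    for (dp, d, s), k in c_dds.items():
        te += (k / n) * math.log2((k / c_ds[(d, s)]) / (c_dd[(dp, d)] / c_d[d]))
    return te

rng = random.Random(42)
x = [rng.randint(0, 1) for _ in range(5000)]   # driver series
y = [0] + x[:-1]                               # y copies x with a one-step lag
te_xy = transfer_entropy(x, y)
te_yx = transfer_entropy(y, x)
print(te_xy > te_yx)                           # True: information flows x -> y
```

Because y is fully determined by the previous value of x, TE(x -> y) approaches 1 bit while TE(y -> x) stays near zero, so the asymmetry recovers the causal direction, which is exactly what makes the measure useful for distinguishing sources from sinks of bio-molecular information.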
Remote Numerical Simulations of the Interaction of High Velocity Clouds with Random Magnetic Fields
NASA Astrophysics Data System (ADS)
Santillan, Alfredo; Hernandez--Cervantes, Liliana; Gonzalez--Ponce, Alejandro; Kim, Jongsoo
The numerical simulations associated with the interaction of High Velocity Clouds (HVC) with the Magnetized Galactic Interstellar Medium (ISM) are a powerful tool for describing the evolution of the interaction of these objects in our Galaxy. In this work we present a new project referred to as Theoretical Virtual Observatories. It is oriented toward performing numerical simulations in real time through a Web page. This is a powerful astrophysical computational tool that consists of an intuitive graphical user interface (GUI) and a database produced by numerical calculations. In this Website the user can make use of the existing numerical simulations from the database or run a new simulation, introducing initial conditions such as temperatures, densities, velocities, and magnetic field intensities for both the ISM and the HVC. The prototype is programmed using Linux, Apache, MySQL, and PHP (LAMP), based on the open-source philosophy. All simulations were performed with the MHD code ZEUS-3D, which solves the ideal MHD equations by finite differences on a fixed Eulerian mesh. Finally, we present typical results that can be obtained with this tool.
A Multimedia Child Developmental Screening Checklist: Design and Validation
Cheng, Hsin-Yi Kathy; Chen, Li-Ying; Cheng, Chih-Hsiu; Ju, Yan-Ying; Chen, Chia-Ling
2016-01-01
Background Identifying disability early in life confers long-term benefits for children. The Taipei City Child Development Screening tool, second version (Taipei II) provides checklists for 13 child age groups from 4 months to 6 years. However, the usability of a text-based screening tool largely depends on the literacy level and logical reasoning ability of the caregivers, as well as language barriers caused by increasing numbers of immigrants. Objective The objectives of this study were to (1) design and develop a Web-based multimedia version of the current Taipei II developmental screening tool, and (2) investigate the measurement equivalence of this multimedia version to the original paper-based version. Methods To develop the multimedia version of Taipei II, a team of experts created illustrations, translations, and dubbing of the original checklists. The developmental screening test was administered to a total of 390 primary caregivers of children aged between 4 months and 6 years. Results Psychometric testing revealed excellent agreement between the paper and multimedia versions of Taipei II. Good to excellent reliabilities were demonstrated for all age groups for both the cross-mode similarity (mode intraclass correlation range 0.85-0.96) and the test-retest reliability (r=.93). Regarding the usability, the mean score was 4.80 (SD 0.03), indicating that users were satisfied with their multimedia website experience. Conclusions The multimedia tool produced essentially equivalent results to the paper-based tool. In addition, it had numerous advantages, such as facilitating active participation and promoting early screening of target populations. ClinicalTrial Clinicaltrials.gov NCT02359591; https://clinicaltrials.gov/ct2/show/NCT02359591 (Archived by WebCite at http://www.webcitation.org/6l21mmdNn) PMID:27777218
PC Software for Artificial Intelligence Applications.
Epp, H; Kalin, M; Miller, D
1988-05-06
Our review has emphasized that AI tools are programming languages inspired by some problem-solving paradigm. We want to underscore their status as programming languages; even if an AI tool seems to fit a problem perfectly, its proficient use still requires the training and practice associated with any programming language. The programming manuals for PC-Plus, Smalltalk/V, and Nexpert Object are all tutorial in nature, and the corresponding software packages come with sample applications. We find the manuals to be uniformly good introductions that try to anticipate the problems of a user who is new to the technology. All three vendors offer free technical support by telephone to licensed users. AI tools are sometimes oversold as a way to make programming easy or to avoid it altogether. The truth is that AI tools demand programming, but programming that allows you to concentrate on the essentials of the problem. If we had to implement a diagnostic system, we would look first to a product such as PC-Plus rather than BASIC or C, because PC-Plus is designed specifically for such a problem, whereas these conventional languages are not. If we had to implement a system that required graphical interfaces and could benefit from inheritance, we would look first to an object-oriented system such as Smalltalk/V that provides built-in mechanisms for both. If we had to implement an expert system that called for some mix of AI and conventional techniques, we would look first to a product such as Nexpert Object that integrates various problem-solving technologies. Finally, we might use FORTRAN if we were concerned primarily with programming a well-defined numerical algorithm. AI tools are a valuable complement to traditional languages.
A Multimedia Child Developmental Screening Checklist: Design and Validation.
Cheng, Hsin-Yi Kathy; Chen, Li-Ying; Cheng, Chih-Hsiu; Ju, Yan-Ying; Chen, Chia-Ling; Tseng, Kevin C
2016-10-24
Identifying disability early in life confers long-term benefits for children. The Taipei City Child Development Screening tool, second version (Taipei II) provides checklists for 13 child age groups from 4 months to 6 years. However, the usability of a text-based screening tool largely depends on the literacy level and logical reasoning ability of the caregivers, as well as language barriers caused by increasing numbers of immigrants. The objectives of this study were to (1) design and develop a Web-based multimedia version of the current Taipei II developmental screening tool, and (2) investigate the measurement equivalence of this multimedia version to the original paper-based version. To develop the multimedia version of Taipei II, a team of experts created illustrations, translations, and dubbing of the original checklists. The developmental screening test was administered to a total of 390 primary caregivers of children aged between 4 months and 6 years. Psychometric testing revealed excellent agreement between the paper and multimedia versions of Taipei II. Good to excellent reliabilities were demonstrated for all age groups for both the cross-mode similarity (mode intraclass correlation range 0.85-0.96) and the test-retest reliability (r=.93). Regarding the usability, the mean score was 4.80 (SD 0.03), indicating that users were satisfied with their multimedia website experience. The multimedia tool produced essentially equivalent results to the paper-based tool. In addition, it had numerous advantages, such as facilitating active participation and promoting early screening of target populations. Clinicaltrials.gov NCT02359591; https://clinicaltrials.gov/ct2/show/NCT02359591 (Archived by WebCite at http://www.webcitation.org/6l21mmdNn).
Web-based visual analysis for high-throughput genomics
2013-01-01
Background Visualization plays an essential role in genomics research by making it possible to observe correlations and trends in large datasets as well as communicate findings to others. Visual analysis, which combines visualization with analysis tools to enable seamless use of both approaches for scientific investigation, offers a powerful method for performing complex genomic analyses. However, there are numerous challenges that arise when creating rich, interactive Web-based visualizations/visual analysis applications for high-throughput genomics. These challenges include managing data flow from Web server to Web browser, integrating analysis tools and visualizations, and sharing visualizations with colleagues. Results We have created a platform that simplifies the creation of Web-based visualization/visual analysis applications for high-throughput genomics. This platform provides components that make it simple to efficiently query very large datasets, draw common representations of genomic data, integrate with analysis tools, and share or publish fully interactive visualizations. Using this platform, we have created a Circos-style genome-wide viewer, a generic scatter plot for correlation analysis, an interactive phylogenetic tree, a scalable genome browser for next-generation sequencing data, and an application for systematically exploring tool parameter spaces to find good parameter values. All visualizations are interactive and fully customizable. The platform is integrated with the Galaxy (http://galaxyproject.org) genomics workbench, making it easy to integrate new visual applications into Galaxy. Conclusions Visualization and visual analysis play an important role in high-throughput genomics experiments, and approaches are needed to make it easier to create applications for these activities. Our framework provides a foundation for creating Web-based visualizations and integrating them into Galaxy.
Finally, the visualizations we have created using the framework are useful tools for high-throughput genomics experiments. PMID:23758618
Computational ecology as an emerging science
Petrovskii, Sergei; Petrovskaya, Natalia
2012-01-01
It has long been recognized that numerical modelling and computer simulations can be used as a powerful research tool to understand, and sometimes to predict, the tendencies and peculiarities in the dynamics of populations and ecosystems. It has been, however, much less appreciated that the context of modelling and simulations in ecology is essentially different from that which normally exists in other natural sciences. In our paper, we review the computational challenges arising in modern ecology in the spirit of computational mathematics, i.e. with our main focus on the choice and use of adequate numerical methods. Somewhat paradoxically, the complexity of ecological problems does not always require the use of complex computational methods. This paradox, however, can be easily resolved if we recall that application of sophisticated computational methods usually requires a clear and unambiguous mathematical problem statement as well as clearly defined benchmark information for model validation. At the same time, many ecological problems still do not have a mathematically accurate and unambiguous description, and available field data are often very noisy, and hence it can be hard to understand how the results of computations should be interpreted from the ecological viewpoint. In this scientific context, computational ecology has to deal with a new paradigm: conventional issues of numerical modelling such as convergence and stability become less important than the qualitative analysis that can be provided with the help of computational techniques. We discuss this paradigm by considering computational challenges arising in several specific ecological applications. PMID:23565336
A Numerical and Theoretical Study of Seismic Wave Diffraction in Complex Geologic Structure
1989-04-14
element methods for analyzing linear and nonlinear seismic effects in the surficial geologies relevant to several Air Force missions. The second... exact solution evaluated here indicates that edge-diffracted seismic wave fields calculated by discrete numerical methods probably exhibit significant... study is to demonstrate and validate some discrete numerical methods essential for analyzing linear and nonlinear seismic effects in the surficial
Enhanced CARES Software Enables Improved Ceramic Life Prediction
NASA Technical Reports Server (NTRS)
Janosik, Lesley A.
1997-01-01
The NASA Lewis Research Center has developed award-winning software that enables American industry to establish the reliability and life of brittle material (e.g., ceramic, intermetallic, graphite) structures in a wide variety of 21st century applications. The CARES (Ceramics Analysis and Reliability Evaluation of Structures) series of software is successfully used by numerous engineers in industrial, academic, and government organizations as an essential element of the structural design and material selection processes. The latest version of this software, CARES/Life, provides a general-purpose design tool that predicts the probability of failure of a ceramic component as a function of its time in service. CARES/Life was recently enhanced by adding new modules designed to improve functionality and user-friendliness. In addition, a beta version of the newly developed CARES/Creep program (for determining the creep life of monolithic ceramic components) has just been released to selected organizations.
13C-based metabolic flux analysis: fundamentals and practice.
Yang, Tae Hoon
2013-01-01
Isotope-based metabolic flux analysis is one of the emerging technologies applied to system level metabolic phenotype characterization in metabolic engineering. Among the developed approaches, 13C-based metabolic flux analysis has been established as a standard tool and has been widely applied to quantitative pathway characterization of diverse biological systems. To implement 13C-based metabolic flux analysis in practice, comprehending the underlying mathematical and computational modeling fundamentals is of importance along with carefully conducted experiments and analytical measurements. Such knowledge is also crucial when designing 13C-labeling experiments and properly acquiring key data sets essential for in vivo flux analysis implementation. In this regard, the modeling fundamentals of 13C-labeling systems and analytical data processing are the main topics we will deal with in this chapter. Along with this, the relevant numerical optimization techniques are addressed to help implementation of the entire computational procedures aiming at 13C-based metabolic flux analysis in vivo.
Alternative cytoskeletal landscapes: cytoskeletal novelty and evolution in basal excavate protists
Dawson, Scott C.; Paredez, Alexander R.
2016-01-01
Microbial eukaryotes encompass the majority of eukaryotic evolutionary and cytoskeletal diversity. The cytoskeletal complexity observed in multicellular organisms appears to be an expansion of components present in genomes of diverse microbial eukaryotes such as the basal lineage of flagellates, the Excavata. Excavate protists have complex and diverse cytoskeletal architectures and life cycles – essentially alternative cytoskeletal “landscapes” – yet still possess conserved microtubule- and actin-associated proteins. Comparative genomic analyses have revealed that a subset of excavates, however, lack many canonical actin-binding proteins central to actin cytoskeleton function in other eukaryotes. Overall, excavates possess numerous uncharacterized and “hypothetical” genes, and may represent an undiscovered reservoir of novel cytoskeletal genes and cytoskeletal mechanisms. The continued development of molecular genetic tools in these complex microbial eukaryotes will undoubtedly contribute to our overall understanding of cytoskeletal diversity and evolution. PMID:23312067
Flow measurement around a model ship with propeller and rudder
NASA Astrophysics Data System (ADS)
van, S. H.; Kim, W. J.; Yoon, H. S.; Lee, Y. Y.; Park, I. R.
2006-04-01
For the design of hull forms with better resistance and propulsive performance, it is essential to understand flow characteristics, such as wave and wake development, around a ship. Experimental data detailing the local flow characteristics are invaluable for the validation of the physical and numerical modeling of computational fluid dynamics (CFD) codes, which are recently gaining attention as efficient tools for hull form evaluation. This paper describes velocity and wave profiles measured in the towing tank for the KRISO 138,000 m3 LNG carrier model with propeller and rudder. The effects of propeller and rudder on the wake and wave profiles in the stern region are clearly identified. The results contained in this paper can provide an opportunity to explore integrated flow phenomena around a model ship in the self-propelled condition, and can be added to the International Towing Tank Conference benchmark data for CFD validation as the previous KCS and KVLCC cases.
Organic Chemistry and Biology: Chemical Biology Through the Eyes of Collaboration
Hruby, Victor J.
2011-01-01
From a scientific perspective, efforts to understand biology, including what constitutes health and disease, have become a chemical problem. However, chemists and biologists "see" the problems of understanding biology from different perspectives, and this has retarded progress in solving the problems, especially as they relate to health and disease. This suggests that close collaboration between chemists and biologists is not only necessary but essential for progress in both the biology and chemistry that will provide solutions to the global questions of biology. This perspective has directed my scientific efforts for the past 45 years, and in this overview I describe how the applications of synthetic chemistry, structural design, and numerous other chemical principles have intersected in my collaborations with biologists to provide new tools, new science, and new insights that were only made possible and fruitful by these collaborations. PMID:20000552
Association of Internet addiction and alexithymia - A scoping review.
Mahapatra, Ananya; Sharma, Pawan
2018-06-01
It has been hypothesized that individuals with alexithymia who have difficulty in identifying, expressing, and communicating emotions may overuse Internet as a tool of social interaction to better regulate their emotions and to fulfill their unmet social needs. Similarly, an increasing body of evidence suggests that alexithymia may also play an essential role in the etiopathogenesis of addictive disorders. We conducted a scoping review of questionnaire-based studies of problematic Internet use/Internet addiction and alexithymia. From initial 51 studies, all of the final 12 included studies demonstrated a significant positive association between scores of alexithymia and severity of Internet addiction. However, the causal direction of the association is not clear because the interplay of numerous other variables that could affect the relation has not been studied. There are limitations in the methodology of the studies conducted. Hence, we emphasise the need for longitudinal studies with stronger methodologies. Copyright © 2018 Elsevier Ltd. All rights reserved.
Lipids: From Chemical Structures, Biosynthesis, and Analyses to Industrial Applications.
Li-Beisson, Yonghua; Nakamura, Yuki; Harwood, John
2016-01-01
Lipids are one of the major subcellular components and perform numerous essential functions. As well as their physiological roles, oils stored in biomass are useful commodities for a variety of biotechnological applications including food, chemical feedstocks, and fuel. Due to their agronomic as well as economic and societal importance, lipids have historically been subjected to intensive study. Major current efforts are to increase the energy density of cell biomass and/or create designer oils suitable for specific applications. This chapter covers some basic aspects of what one needs to know about lipids: definition, structure, function, and metabolism; focus is also given to the development of modern lipid analytical tools and major current engineering approaches for biotechnological applications. This introductory chapter is intended to serve as a primer for all subsequent chapters in this book outlining current developments in specific areas of lipids and their metabolism.
OSCAR a Matlab based optical FFT code
NASA Astrophysics Data System (ADS)
Degallaix, Jérôme
2010-05-01
Optical simulation software is an essential tool for designing and commissioning laser interferometers. This article aims to introduce OSCAR, a Matlab-based FFT code, to the experimentalist community. OSCAR (Optical Simulation Containing Ansys Results) is used to simulate the steady state electric fields in optical cavities with realistic mirrors. The main advantage of OSCAR over other similar packages is the simplicity of its code, which requires only a short time to master. As a result, even for a beginner, it is relatively easy to modify OSCAR to suit other specific purposes. OSCAR includes an extensive manual and numerous detailed examples such as simulating thermal aberration, calculating cavity eigenmodes and diffraction loss, and simulating flat beam cavities and three mirror ring cavities. An example is also provided about how to run OSCAR on the GPU of modern graphics cards instead of the CPU, making the simulation up to 20 times faster.
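OSCAR itself is Matlab code; as a self-contained illustration of the FFT field-propagation step at the heart of such simulations, here is a minimal Python/NumPy sketch of paraxial angular-spectrum propagation. The grid sizes, beam waist, and function names are our own illustrative choices, not OSCAR's API.

```python
import numpy as np

def propagate_fft(field, dx, wavelength, distance):
    """Propagate a 2-D complex field over `distance` in free space
    using the angular-spectrum (FFT) method, paraxial approximation."""
    n = field.shape[0]
    k = 2 * np.pi / wavelength
    fx = np.fft.fftfreq(n, d=dx)               # spatial frequencies
    FX, FY = np.meshgrid(fx, fx)
    # unit-magnitude transfer function: a pure phase in frequency space
    H = np.exp(-1j * np.pi * wavelength * distance * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(field) * H) * np.exp(1j * k * distance)

# Gaussian beam sampled on a 256x256 grid with 0.1 mm spacing
n, dx = 256, 1e-4
x = (np.arange(n) - n // 2) * dx
X, Y = np.meshgrid(x, x)
beam = np.exp(-(X**2 + Y**2) / (5e-3)**2)      # 5 mm waist
out = propagate_fft(beam, dx, 1064e-9, 10.0)   # 10 m at 1064 nm

# |H| = 1 everywhere, so free-space propagation conserves power
print(np.allclose(np.sum(np.abs(beam)**2), np.sum(np.abs(out)**2)))
```

Because the transfer function has unit magnitude, Parseval's theorem guarantees the total power is unchanged, which makes power conservation a convenient sanity check for any FFT propagation code.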
A DIY Ultrasonic Signal Generator for Sound Experiments
NASA Astrophysics Data System (ADS)
Riad, Ihab F.
2018-02-01
Many physics departments around the world have electronic and mechanical workshops attached to them that can help build experimental setups and instruments for research and the training of undergraduate students. The workshops are usually run by experienced technicians and equipped with expensive lathing, computer numerical control (CNC) machines, electric measuring instruments, and several other essential tools. However, in developing countries such as Sudan, the lack of qualified technicians and adequately equipped workshops hampers efforts by these departments to supplement their laboratories with the equipment they need. The only other option is to buy the needed equipment from specialized manufacturers. The latter option is not feasible for the departments in developing countries where funding for education and research is scarce and very limited and as equipment from these manufacturers is typically too expensive. These departments struggle significantly in equipping undergraduate teaching laboratories, and here we propose one way to address this.
Contact Modelling in Isogeometric Analysis: Application to Sheet Metal Forming Processes
NASA Astrophysics Data System (ADS)
Cardoso, Rui P. R.; Adetoro, O. B.; Adan, D.
2016-08-01
Isogeometric Analysis (IGA) has been growing in popularity in the past few years, essentially due to the extra flexibility it introduces with the use of higher-degree basis functions, leading to higher convergence rates. IGA also offers the capability of easily reproducing discontinuous displacement and/or strain fields simply by manipulating the multiplicity of the knot parametric coordinates. Another advantage of IGA is that it uses Non-Uniform Rational B-Spline (NURBS) basis functions, which are very common in CAD solid modelling, and consequently it eases the transition from CAD models to numerical analysis. This work explores contact analysis in IGA for both implicit and explicit time integration schemes. Special focus is given to contact search and contact detection techniques on NURBS patches for both the rigid tools and the deformed sheet blank.
Utilization of sounding rockets and balloons in the German Space Programme
NASA Astrophysics Data System (ADS)
Preu, Peter; Friker, Achim; Frings, Wolfgang; Püttmann, Norbert
2005-08-01
Sounding rockets and balloons are important tools of Germany's Space Programme. DLR manages these activities and promotes scientific experiments and validation programmes within (1) Space Science, (2) Earth Observation, (3) Microgravity Research and (4) Re-entry Technologies (SHEFEX). In Space Science the present focus is on atmospheric research. Concerning Earth Observation, balloon-borne measurements play a key role in the validation of atmospheric satellite sounders (ENVISAT). TEXUS and MAXUS sounding rockets are successfully used for short duration microgravity experiments. The Sharp Edge Flight Experiment SHEFEX will deliver data from a hypersonic flight for the validation of a new Thermal Protection System (TPS), wind tunnel testing and numerical analysis of aerothermodynamics. By signing the Revised Esrange and Andøya Special Project (EASP) Agreement 2006-2010 in June 2004, Germany has made an essential contribution to the long-term availability of the Scandinavian ranges for the European science community.
Li, Qi-Gang; He, Yong-Han; Wu, Huan; Yang, Cui-Ping; Pu, Shao-Yan; Fan, Song-Qing; Jiang, Li-Ping; Shen, Qiu-Shuo; Wang, Xiao-Xiong; Chen, Xiao-Qiong; Yu, Qin; Li, Ying; Sun, Chang; Wang, Xiangting; Zhou, Jumin; Li, Hai-Peng; Chen, Yong-Bin; Kong, Qing-Peng
2017-01-01
Heterogeneity in transcriptional data hampers the identification of differentially expressed genes (DEGs) and understanding of cancer, essentially because current methods rely on cross-sample normalization and/or distribution assumption—both sensitive to heterogeneous values. Here, we developed a new method, Cross-Value Association Analysis (CVAA), which overcomes the limitation and is more robust to heterogeneous data than the other methods. Applying CVAA to a more complex pan-cancer dataset containing 5,540 transcriptomes discovered numerous new DEGs and many previously rarely explored pathways/processes; some of them were validated, both in vitro and in vivo, to be crucial in tumorigenesis, e.g., alcohol metabolism (ADH1B), chromosome remodeling (NCAPH) and complement system (Adipsin). Together, we present a sharper tool to navigate large-scale expression data and gain new mechanistic insights into tumorigenesis.
Corte, Rosa María Muñoz; Estepa, Raúl García; Ramos, Bernardo Santos; Paloma, Francisco Javier Bautista
2009-01-01
To evaluate the quality of the pharmacotherapeutic recommendations included in the Integrated Care Procedures (PAIs, from their initials in Spanish) of the Andalusian Ministry of Health published up to March 2008, through the design and validation of a tool. The assessment tool was designed based on similar instruments, specifically the AGREE instrument; other criteria included were taken from various literature sources or were devised by ourselves. The tool was validated prior to being used. After applying it to all the PAIs, we examined the degree of compliance with these pharmacotherapeutic criteria, both as a whole and by PAI subgroups. The developed tool is a questionnaire of 20 items divided into 4 sections. The first section consists of the essential criteria; the rest refer to more specific, non-essential criteria: definition of the level of evidence, thoroughness of information, and definition of indicators. It was found that 4 of the 60 PAIs do not contain any type of therapeutic recommendation. No PAI fulfils all the items listed in the tool; however, 70% of them fulfil the essential quality criteria established. There is great variability in the content of the pharmacotherapeutic recommendations of each PAI. Once the validity of the tool has been proved, it could be used to assess the quality of therapeutic recommendations in clinical practice guidelines.
Software Aids for radiologists: Part 1, Useful Photoshop skills.
Gross, Joel A; Thapa, Mahesh M
2012-12-01
The purpose of this review is to describe the use of several essential techniques and tools in Adobe Photoshop image-editing software. The techniques shown expand on those previously described in the radiologic literature. Radiologists, especially those with minimal experience with image-editing software, can quickly apply a few essential Photoshop tools to minimize the frustration that can result from attempting to navigate a complex user interface.
A review and evaluation of numerical tools for fractional calculus and fractional order controls
NASA Astrophysics Data System (ADS)
Li, Zhuo; Liu, Lu; Dehghan, Sina; Chen, YangQuan; Xue, Dingyü
2017-06-01
In recent years, as fractional calculus has become more and more broadly used in research across different academic disciplines, there are increasing demands for numerical tools for the computation of fractional integration/differentiation and the simulation of fractional order systems. Having been asked time and again which tool is suitable for a specific application, the authors decided to carry out this survey to present a recapitulation of the tools available in the literature, in the hope of benefiting researchers with different academic backgrounds. With this motivation, the present article collects the scattered tools into a dashboard view, briefly introduces their usage and algorithms, evaluates their accuracy, compares their performance, and provides informative comments for selection.
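To give a flavor of the kind of computation such tools perform, here is a minimal Python sketch of the Grünwald–Letnikov approximation to a fractional derivative, a standard algorithm in this field; it is a generic illustration, not the implementation of any specific package surveyed.

```python
import math

def gl_fractional_derivative(f, t, alpha, h=1e-3):
    """Grünwald-Letnikov approximation of the order-`alpha` fractional
    derivative of f at t (left-sided, lower terminal 0, step h)."""
    n = int(t / h)
    total, c = 0.0, 1.0                 # c_j = (-1)^j * binom(alpha, j)
    for j in range(n + 1):
        total += c * f(t - j * h)
        c *= (j - alpha) / (j + 1)      # recurrence for the next coefficient
    return total / h**alpha

# Known closed form: the half-derivative of f(t) = t is 2*sqrt(t/pi)
approx = gl_fractional_derivative(lambda t: t, 1.0, 0.5)
exact = 2 * math.sqrt(1.0 / math.pi)
print(abs(approx - exact) < 1e-2)
```

The coefficient recurrence avoids evaluating binomial coefficients of non-integer order directly, and the scheme is first-order accurate in the step size h, which is why serious tools add higher-order corrections.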
ERIC Educational Resources Information Center
Stanton, Michael; And Others
1985-01-01
Three reports on the effects of high technology on the nature of work include (1) Stanton on applications and implications of computer-aided design for engineers, drafters, and architects; (2) Nardone on the outlook and training of numerical-control machine tool operators; and (3) Austin and Drake on the future of clerical occupations in automated…
KNOW ESSENTIALS: a tool for informed decisions in the absence of formal HTA systems.
Mathew, Joseph L
2011-04-01
Most developing countries and resource-limited settings lack robust health technology assessment (HTA) systems. Because the development of locally relevant HTA is not immediately viable, and the extrapolation of external HTA is inappropriate, a new model for evaluating health technologies is required. The aim of this study was to describe the development and application of KNOW ESSENTIALS, a tool facilitating evidence-based decisions on health technologies by stakeholders in settings lacking formal HTA systems. Current HTA methodology was examined through literature search. Additional issues relevant to resource-limited settings, but not adequately addressed in current methodology, were identified through further literature search, appraisal of contextually relevant issues, discussion with healthcare professionals familiar with the local context, and personal experience. A set of thirteen elements important for evidence-based decisions was identified, selected and combined into a tool with the mnemonic KNOW ESSENTIALS. Detailed definitions for each element, coding for the elements, and a system to evaluate a given health technology using the tool were developed. Developing countries and resource-limited settings face several challenges to informed decision making. Models that are relevant and applicable in high-income countries are unlikely in such settings. KNOW ESSENTIALS is an alternative that facilitates evidence-based decision making by stakeholders without formal expertise in HTA. The tool could be particularly useful, as an interim measure, in healthcare systems that are developing HTA capacity. It could also be useful anywhere when rapid evidence-based decisions on health technologies are required.
The eBioKit, a stand-alone educational platform for bioinformatics.
Hernández-de-Diego, Rafael; de Villiers, Etienne P; Klingström, Tomas; Gourlé, Hadrien; Conesa, Ana; Bongcam-Rudloff, Erik
2017-09-01
Bioinformatics skills have become essential for many research areas; however, the availability of qualified researchers is usually lower than the demand and training to increase the number of able bioinformaticians is an important task for the bioinformatics community. When conducting training or hands-on tutorials, the lack of control over the analysis tools and repositories often results in undesirable situations during training, as unavailable online tools or version conflicts may delay, complicate, or even prevent the successful completion of a training event. The eBioKit is a stand-alone educational platform that hosts numerous tools and databases for bioinformatics research and allows training to take place in a controlled environment. A key advantage of the eBioKit over other existing teaching solutions is that all the required software and databases are locally installed on the system, significantly reducing the dependence on the internet. Furthermore, the architecture of the eBioKit has demonstrated itself to be an excellent balance between portability and performance, not only making the eBioKit an exceptional educational tool but also providing small research groups with a platform to incorporate bioinformatics analysis in their research. As a result, the eBioKit has formed an integral part of training and research performed by a wide variety of universities and organizations such as the Pan African Bioinformatics Network (H3ABioNet) as part of the initiative Human Heredity and Health in Africa (H3Africa), the Southern Africa Network for Biosciences (SAnBio) initiative, the Biosciences eastern and central Africa (BecA) hub, and the International Glossina Genome Initiative.
The eBioKit, a stand-alone educational platform for bioinformatics
Conesa, Ana; Bongcam-Rudloff, Erik
2017-01-01
Bioinformatics skills have become essential for many research areas; however, the availability of qualified researchers is usually lower than the demand and training to increase the number of able bioinformaticians is an important task for the bioinformatics community. When conducting training or hands-on tutorials, the lack of control over the analysis tools and repositories often results in undesirable situations during training, as unavailable online tools or version conflicts may delay, complicate, or even prevent the successful completion of a training event. The eBioKit is a stand-alone educational platform that hosts numerous tools and databases for bioinformatics research and allows training to take place in a controlled environment. A key advantage of the eBioKit over other existing teaching solutions is that all the required software and databases are locally installed on the system, significantly reducing the dependence on the internet. Furthermore, the architecture of the eBioKit has demonstrated itself to be an excellent balance between portability and performance, not only making the eBioKit an exceptional educational tool but also providing small research groups with a platform to incorporate bioinformatics analysis in their research. As a result, the eBioKit has formed an integral part of training and research performed by a wide variety of universities and organizations such as the Pan African Bioinformatics Network (H3ABioNet) as part of the initiative Human Heredity and Health in Africa (H3Africa), the Southern Africa Network for Biosciences (SAnBio) initiative, the Biosciences eastern and central Africa (BecA) hub, and the International Glossina Genome Initiative. PMID:28910280
Nondimensional parameter for conformal grinding: combining machine and process parameters
NASA Astrophysics Data System (ADS)
Funkenbusch, Paul D.; Takahashi, Toshio; Gracewski, Sheryl M.; Ruckman, Jeffrey L.
1999-11-01
Conformal grinding of optical materials with CNC (Computer Numerical Control) machining equipment can be used to achieve precise control over complex part configurations. However, complications can arise from the need to fabricate complex geometrical shapes at reasonable production rates. For example, high machine stiffness is essential, but the need to grind 'inside' small or highly concave surfaces may require tooling with less than ideal stiffness characteristics. If grinding generates loads sufficient for significant tool deflection, the programmed removal depth will not be achieved. Moreover, since the grinding load is a function of the volumetric removal rate, the amount of load deflection can vary with location on the part, potentially producing complex figure errors. In addition to machine/tool stiffness and removal rate, load generation is a function of the process parameters. For example, by reducing the feed rate of the tool into the part, both the load and the resultant deflection/removal error can be decreased. However, this must be balanced against the need for part throughput. In this paper, a simple model that combines machine stiffness and process parameters into a single nondimensional parameter is adapted to a conformal grinding geometry. Errors in removal can be minimized by maintaining this parameter above a critical value. Moreover, since the value of this parameter depends on the local part geometry, it can be used to optimize process settings during grinding; for example, it may guide adjustment of the feed rate as a function of location on the part to eliminate figure errors while minimizing the total grinding time required.
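The abstract does not give the parameter's actual form, so as a purely hypothetical illustration of the idea, assume the grinding load is proportional to the volumetric removal rate and the tool deflects linearly with load; then the fraction of the programmed depth lost to deflection is the reciprocal of a single dimensionless group. All symbols and numbers below are our own illustrative choices, not the paper's.

```python
def removal_error_fraction(stiffness, depth, force_coeff, removal_rate):
    """Hypothetical model: load F = force_coeff * removal_rate,
    tool deflection = F / stiffness, and the removal error is the
    deflection expressed as a fraction of the programmed depth."""
    force = force_coeff * removal_rate
    deflection = force / stiffness
    return deflection / depth

def nondimensional_parameter(stiffness, depth, force_coeff, removal_rate):
    """Pi = k * d / (C * Q); the error fraction above is exactly 1 / Pi,
    so keeping Pi above a critical value bounds the figure error."""
    return stiffness * depth / (force_coeff * removal_rate)

k, d, C = 50e6, 10e-6, 2.0     # N/m, m (10 um depth), N per mm^3/s
for Q in (0.5, 1.0, 2.0):      # volumetric removal rate, mm^3/s
    Pi = nondimensional_parameter(k, d, C, Q)
    err = removal_error_fraction(k, d, C, Q)
    print(f"Q={Q}: Pi={Pi:.0f}, error fraction={err:.4f}")
```

The point of the nondimensional form is that one threshold on Pi covers every combination of stiffness, depth, and removal rate, which is what makes it usable for location-dependent feed-rate adjustment.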
Arkansas' Curriculum Guide. Competency Based Typewriting.
ERIC Educational Resources Information Center
Arkansas State Dept. of Education, Little Rock. Div. of Vocational, Technical and Adult Education.
This guide contains the essential parts of a total curriculum for a one-year typewriting course at the secondary school level. Addressed in the individual units of the guide are the following topics: alphabetic keyboarding, numeric keyboarding, basic symbol keyboarding, skill development, problem typewriting, ten-key numeric pads, production…
NASA Technical Reports Server (NTRS)
Davis, Brian; Turner, Travis L.; Seelecke, Stefan
2008-01-01
An experimental and numerical investigation into the static and dynamic responses of shape memory alloy hybrid composite (SMAHC) beams is performed to provide quantitative validation of a recently commercialized numerical analysis/design tool for SMAHC structures. The SMAHC beam specimens consist of a composite matrix with embedded pre-strained SMA actuators, which act against the mechanical boundaries of the structure when thermally activated to adaptively stiffen the structure. Numerical results are produced from the numerical model as implemented into the commercial finite element code ABAQUS. A rigorous experimental investigation is undertaken to acquire high fidelity measurements including infrared thermography and projection moire interferometry for full-field temperature and displacement measurements, respectively. High fidelity numerical results are also obtained from the numerical model and include measured parameters, such as geometric imperfection and thermal load. Excellent agreement is achieved between the predicted and measured results of the static and dynamic thermomechanical response, thereby providing quantitative validation of the numerical tool.
Preconditioning for the Navier-Stokes equations with finite-rate chemistry
NASA Technical Reports Server (NTRS)
Godfrey, Andrew G.
1993-01-01
The extension of Van Leer's preconditioning procedure to generalized finite-rate chemistry is discussed. Application to viscous flow is begun with the proper preconditioning matrix for the one-dimensional Navier-Stokes equations. Eigenvalue stiffness is resolved and convergence-rate acceleration is demonstrated over the entire Mach-number range from nearly stagnant flow to hypersonic. Specific benefits are realized at the low and transonic flow speeds typical of complete propulsion-system simulations. The extended preconditioning matrix necessarily accounts for both thermal and chemical nonequilibrium. Numerical analysis reveals the possible theoretical improvements from using a preconditioner for all Mach number regimes. Numerical results confirm the expectations from the numerical analysis. Representative test cases include flows with previously troublesome embedded high-condition-number areas. Van Leer, Lee, and Roe recently developed an optimal, analytic preconditioning technique to reduce eigenvalue stiffness over the full Mach-number range. By multiplying the flux-balance residual with the preconditioning matrix, the acoustic wave speeds are scaled so that all waves propagate at the same rate, an essential property to eliminate inherent eigenvalue stiffness. This session discusses a synthesis of the thermochemical nonequilibrium flux-splitting developed by Grossman and Cinnella and the characteristic wave preconditioning of Van Leer into a powerful tool for implicitly solving two and three-dimensional flows with generalized finite-rate chemistry. For finite-rate chemistry, the state vector of unknowns is variable in length. Therefore, the preconditioning matrix extended to generalized finite-rate chemistry must accommodate a flexible system of moving waves. Fortunately, no new kind of wave appears in the system. 
The only existing waves are entropy and vorticity waves, which move with the fluid, and acoustic waves, which propagate in Mach number dependent directions. The nonequilibrium vibrational energies and species densities in the unknown state vector act strictly as convective waves. The essential concept for extending the preconditioning to generalized chemistry models is determining the differential variables which symmetrize the flux Jacobians. The extension is then straight-forward. This algorithm research effort will be released in a future version of the production level computational code coined the General Aerodynamic Simulation Program (GASP), developed by Walters, Slack, and McGrory.
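The eigenvalue stiffness that motivates this preconditioning can be seen in a few lines: for the 1-D Euler equations the characteristic speeds are u and u ± c, so the ratio of fastest to slowest wave grows roughly like 1/M at low Mach number. This sketch only illustrates the stiffness; it does not reproduce the Van Leer, Lee, and Roe preconditioner itself.

```python
def stiffness_ratio(mach):
    """Ratio of fastest to slowest characteristic speed for the 1-D Euler
    equations (u, u + c, u - c), with the sound speed c normalized to 1.
    This is the eigenvalue stiffness that preconditioning removes by
    rescaling the acoustic speeds to match the convective one."""
    u, c = mach, 1.0
    speeds = [abs(u), abs(u + c), abs(u - c)]
    return max(speeds) / min(speeds)

for M in (0.8, 0.1, 0.01, 0.001):
    print(f"M={M}: condition number = {stiffness_ratio(M):.0f}")
```

The blow-up at low Mach number is exactly the "nearly stagnant flow" regime the abstract mentions; after preconditioning all waves propagate at comparable rates, so the condition number stays O(1) across the Mach range.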
Preserving Simplecticity in the Numerical Integration of Linear Beam Optics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allen, Christopher K.
2017-07-01
Presented are mathematical tools and methods for the development of numerical integration techniques that preserve the symplectic condition inherent to mechanics. The intended audience is beam physicists with backgrounds in numerical modeling and simulation, with particular attention to beam optics applications. The paper focuses on Lie methods, which are inherently symplectic regardless of the integration accuracy order. Section 2 provides the mathematical tools used in the sequel and necessary for the reader to extend the covered techniques. Section 3 places those tools in the context of charged-particle beam optics; in particular, linear beam optics is presented in terms of a Lie algebraic matrix representation. Section 4 presents numerical stepping techniques with particular emphasis on a third-order leapfrog method. Section 5 discusses the modeling of field imperfections with particular attention to the fringe fields of quadrupole focusing magnets. The direct computation of a third-order transfer matrix for a fringe field is shown.
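As a minimal illustration of the symplectic condition at stake here (not the paper's own third-order method), a kick-drift-kick leapfrog step for the harmonic oscillator is a linear map M that satisfies M^T J M = J exactly for any step size, unlike generic Runge-Kutta steps:

```python
import numpy as np

def leapfrog_step(q, p, dt, omega=1.0):
    """One kick-drift-kick leapfrog step for the harmonic oscillator
    H = p**2 / 2 + omega**2 * q**2 / 2."""
    p = p - 0.5 * dt * omega**2 * q   # half kick
    q = q + dt * p                    # full drift
    p = p - 0.5 * dt * omega**2 * q   # half kick
    return q, p

# The step is linear in (q, p), so build its matrix from basis vectors
dt = 0.1
M = np.column_stack([leapfrog_step(1.0, 0.0, dt),
                     leapfrog_step(0.0, 1.0, dt)])
J = np.array([[0.0, 1.0], [-1.0, 0.0]])

# Symplectic condition: M^T J M = J, i.e. phase-space area is preserved
print(np.allclose(M.T @ J @ M, J))
```

Each kick and drift factor is a unit-determinant shear, so their product is symplectic by construction; the integration error appears only in the phase and amplitude of the motion, never in the phase-space area.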
USDA-ARS?s Scientific Manuscript database
Field trapping studies conducted in north-central Florida for the redbay ambrosia beetle (Xyleborus glabratus) captured numerous non-target ambrosia beetles, providing information on species diversity and relative abundance. Traps (Lindgren and sticky) baited with essential oil lures (manuka and p...
Spectral-element Seismic Wave Propagation on CUDA/OpenCL Hardware Accelerators
NASA Astrophysics Data System (ADS)
Peter, D. B.; Videau, B.; Pouget, K.; Komatitsch, D.
2015-12-01
Seismic wave propagation codes are essential tools to investigate a variety of wave phenomena in the Earth. Furthermore, they can now be used for seismic full-waveform inversions in regional- and global-scale adjoint tomography. Although these seismic wave propagation solvers are crucial ingredients for improving the resolution of tomographic images to answer important questions about the nature of Earth's internal processes and subsurface structure, their practical application is often limited by high computational costs. They thus need high-performance computing (HPC) facilities to improve the current state of knowledge. At present, numerous large HPC systems embed many-core architectures such as graphics processing units (GPUs) to enhance numerical performance. Such hardware accelerators can be programmed using either the CUDA programming environment or the OpenCL language standard. CUDA software development targets NVIDIA graphics cards, while OpenCL has been adopted by additional hardware accelerators, such as AMD graphics cards, ARM-based processors, and Intel Xeon Phi coprocessors. For seismic wave propagation simulations using the open-source spectral-element code package SPECFEM3D_GLOBE, we incorporated an automatic source-to-source code generation tool (BOAST) that allows us to use meta-programming of all computational kernels for forward and adjoint runs. Using our BOAST kernels, we generate optimized source code for both the CUDA and OpenCL languages within the source code package. Seismic wave simulations are thus now able to fully utilize CUDA and OpenCL hardware accelerators. We show benchmarks of forward seismic wave propagation simulations using SPECFEM3D_GLOBE on CUDA/OpenCL GPUs, validating results and comparing performance for different simulations and hardware usages.
A new tool to assess groundwater resources in the Mississippi embayment
Clark, Brian R.; Freiwald, David A.
2011-01-01
What is the Mississippi Embayment? The Mississippi embayment study area encompasses approximately 78,000 square miles in eight States and includes large parts of Arkansas, Louisiana, Mississippi, and Tennessee, and smaller areas of Alabama, Illinois, Kentucky, and Missouri (fig. 1). The Mississippi embayment is essentially a basin that slopes toward the Gulf of Mexico and is filled with sediments of alternating sand, silt, and clay layers. There are two principal aquifers in the embayment: the Mississippi River Valley alluvial aquifer (alluvial aquifer) and the middle Claiborne aquifer (fig. 1). The shallow alluvial aquifer is the primary source of groundwater for irrigation in the largely agricultural region, while the deeper middle Claiborne aquifer is a primary source of drinking water for many of the 5.2 million people living in the embayment. The U.S. Geological Survey (USGS) is conducting large-scale multidisciplinary regional studies of groundwater availability for the Nation. The studies comprise individual assessments of regional groundwater-flow systems that encompass varied terrains and document a comprehensive regional and national perspective of groundwater resources. Collectively, these studies are the foundation for the national assessment of groundwater availability and are conducted in cooperation with other Federal, State, and local governments as well as the private sector. Numerical groundwater-flow models are used in these studies to document effects of human activities and climate variability on groundwater levels, changes in aquifer storage, and flow between groundwater and surface-water bodies. As part of the Mississippi Embayment Regional Aquifer Study (MERAS), a numerical model of 13 layers covering 78,000 square miles was constructed, representing multiple aquifers and confining units for the period 1870 to 2007. The model is a tool that was used to assess and better understand groundwater resources.
Numerical Simulation of Selecting Model Scale of Cable in Wind Tunnel Test
NASA Astrophysics Data System (ADS)
Huang, Yifeng; Yang, Jixin
The numerical simulation method based on Computational Fluid Dynamics (CFD) provides a possible alternative to physical wind tunnel tests. Firstly, the correctness of the numerical simulation method is validated against a benchmark example. In order to select the minimum cable length for a given diameter in numerical wind tunnel tests, CFD-based numerical wind tunnel tests are carried out on cables with several different length-to-diameter ratios (L/D). The results show that when L/D reaches 18, the drag coefficient is essentially stable.
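The convergence criterion described above amounts to finding the smallest L/D beyond which the drag coefficient stops changing appreciably. A minimal sketch of that selection step follows; the Cd values are invented placeholders standing in for CFD results, not the paper's data.

```python
# Pick the minimum cable length-to-diameter ratio (L/D) at which the
# computed drag coefficient has stabilized. The Cd values below are
# invented placeholders, not results from the study.

def minimum_stable_ratio(cd_by_ratio, tol=0.01):
    """Return the first L/D whose Cd differs from the next by < tol (relative)."""
    ratios = sorted(cd_by_ratio)
    for lo, hi in zip(ratios, ratios[1:]):
        if abs(cd_by_ratio[hi] - cd_by_ratio[lo]) / abs(cd_by_ratio[lo]) < tol:
            return lo
    return ratios[-1]

# Hypothetical drag coefficients from a series of numerical wind tunnel runs.
cd = {6: 1.32, 10: 1.24, 14: 1.20, 18: 1.181, 22: 1.180}

print(minimum_stable_ratio(cd))  # with these placeholder values: 18
```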
Numerical experiments on the accuracy of ENO and modified ENO schemes
NASA Technical Reports Server (NTRS)
Shu, Chi-Wang
1990-01-01
Further numerical experiments are made assessing an accuracy degeneracy phenomenon. A modified essentially non-oscillatory (ENO) scheme is proposed, which recovers the correct order of accuracy for all the test problems with smooth initial conditions and gives results comparable to the original ENO schemes for discontinuous problems.
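The accuracy assessment underlying such experiments is the standard observed-order-of-accuracy check: run the scheme at two grid spacings and estimate the order from the error ratio. The error values below are illustrative, not taken from the paper.

```python
import math

# Observed order of accuracy from a grid refinement study:
# p = log(e_coarse / e_fine) / log(h_coarse / h_fine).
# A scheme suffering accuracy degeneracy shows p below its design order.

def observed_order(e_coarse, e_fine, refinement=2.0):
    return math.log(e_coarse / e_fine) / math.log(refinement)

# A healthy 3rd-order scheme halving h should cut the error by ~8x:
p = observed_order(8.0e-4, 1.0e-4)
print(round(p, 2))  # 3.0
```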
Assessment of the Flood Problems of the Taunton River Basin Massachusetts.
1978-12-01
essential for fish and provides a habitat for numerous varieties of aquatic-oriented wildlife species. Of the combined forested wetland and open forest...Detailed flood elevation data essential for operation of regulations. Flood velocities, flood duration, wave action, erosion problems and other...along with the preservation of as many trees and shrubs as possible is essential. Where possible, fast-growing annual grass seed should be used, intermixed with
Optics simulations: a Python workshop
NASA Astrophysics Data System (ADS)
Ghalila, H.; Ammar, A.; Varadharajan, S.; Majdi, Y.; Zghal, M.; Lahmar, S.; Lakshminarayanan, V.
2017-08-01
Numerical simulations allow teachers and students to indirectly perform sophisticated experiments that cannot be realized otherwise due to cost and other constraints. During the past few decades there has been an explosion in the development of numerical tools, concurrently with open source environments such as the Python software ecosystem. This availability of open source software offers an incredible opportunity for advancing teaching methodologies as well as research. More specifically, it is possible to correlate theoretical knowledge with experimental measurements using "virtual" experiments. We have been working on the development of numerical simulation tools using the Python program package, concentrating on geometric and physical optics simulations. The advantage of doing hands-on numerical experiments is that the student learner becomes an active participant in the pedagogical/learning process rather than playing a passive role, as in the traditional lecture format. Even in laboratory classes, because of constraints of space, lack of equipment, and often large enrollments, many students play a passive role, since they work in groups of three or more. Furthermore, these new tools help students get a handle on numerical methods as well as simulations and impart a "feel" for the physics under investigation.
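As a flavor of the geometric-optics side of such a workshop, here is a minimal pure-Python paraxial ray-tracing sketch using 2x2 ABCD matrices. It is our own illustrative example, not material from the workshop itself.

```python
# Paraxial (geometric) optics with ABCD ray transfer matrices.
# For a thin lens of focal length f, an object at distance d_o images at
# d_i satisfying 1/d_o + 1/d_i = 1/f; at an image plane the system's
# B element (row 0, column 1) vanishes.

def matmul(a, b):
    return [[a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]],
            [a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]]]

def propagate(d):  # free-space propagation over distance d
    return [[1.0, d], [0.0, 1.0]]

def thin_lens(f):  # thin lens of focal length f
    return [[1.0, 0.0], [-1.0/f, 1.0]]

# Object 300 mm in front of a 100 mm lens: 1/300 + 1/d_i = 1/100 -> d_i = 150 mm.
f, d_o, d_i = 100.0, 300.0, 150.0
system = matmul(propagate(d_i), matmul(thin_lens(f), propagate(d_o)))
print(abs(system[0][1]) < 1e-9)  # B = 0 at an image plane -> True
```

The same three building blocks compose into arbitrary paraxial systems, which is exactly the kind of "virtual experiment" students can modify and re-run interactively.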
Computation of fluid flow and pore-space properties estimation on micro-CT images of rock samples
NASA Astrophysics Data System (ADS)
Starnoni, M.; Pokrajac, D.; Neilson, J. E.
2017-09-01
Accurate determination of the petrophysical properties of rocks, namely REV, mean pore and grain size and absolute permeability, is essential for a broad range of engineering applications. Here, the petrophysical properties of rocks are calculated using an integrated approach comprising image processing, statistical correlation and numerical simulations. The Stokes equations of creeping flow for incompressible fluids are solved using the Finite-Volume SIMPLE algorithm. Simulations are then carried out on three-dimensional digital images obtained from micro-CT scanning of two rock formations: one sandstone and one carbonate. Permeability is predicted from the computed flow field using Darcy's law. It is shown that REV, REA and mean pore and grain size are effectively estimated using the two-point spatial correlation function. Homogeneity and anisotropy are also evaluated using the same statistical tools. A comparison of different absolute permeability estimates is also presented, revealing a good agreement between the numerical value and the experimentally determined one for the carbonate sample, but a large discrepancy for the sandstone. Finally, a new convergence criterion for the SIMPLE algorithm, and more generally for the family of pressure-correction methods, is presented. This criterion is based on satisfaction of bulk momentum balance, which makes it particularly useful for pore-scale modelling of reservoir rocks.
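The final step described above, obtaining absolute permeability from the computed flow field via Darcy's law, can be sketched as follows. The sample numbers are illustrative, not values from the study.

```python
# Darcy's law applied to a computed pore-scale flow field:
# k = Q * mu * L / (A * dP), in SI units (k in m^2).
# Q is the volumetric flux from the Stokes solution, mu the viscosity,
# L the sample length, A its cross-section, dP the applied pressure drop.

def permeability(Q, mu, L, A, dP):
    return Q * mu * L / (A * dP)

# Water (mu = 1e-3 Pa s) through a 1 mm cube of rock under a 1 kPa drop:
k = permeability(Q=1e-12, mu=1e-3, L=1e-3, A=1e-6, dP=1e3)
print(f"{k:.1e}")  # 1.0e-15 m^2, roughly one millidarcy
```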
NASA Astrophysics Data System (ADS)
Moylan, Andrew; Scott, Susan M.; Searle, Anthony C.
2006-02-01
The software tool GRworkbench is an ongoing project in visual, numerical General Relativity at The Australian National University. Recently, GRworkbench has been significantly extended to facilitate numerical experimentation in analytically-defined space-times. The numerical differential geometric engine has been rewritten using functional programming techniques, enabling objects which are normally defined as functions in the formalism of differential geometry and General Relativity to be directly represented as function variables in the C++ code of GRworkbench. The new functional differential geometric engine allows for more accurate and efficient visualisation of objects in space-times and makes new, efficient computational techniques available. Motivated by the desire to investigate a recent scientific claim using GRworkbench, new tools for numerical experimentation have been implemented, allowing for the simulation of complex physical situations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Powell, Danny H; Elwood Jr, Robert H
2011-01-01
An effective risk assessment system is needed to address the threat posed by an active or passive insider who, acting alone or in collusion, could attempt diversion or theft of nuclear material. The material control and accountability (MC&A) system effectiveness tool (MSET) is a self-assessment or inspection tool utilizing probabilistic risk assessment (PRA) methodology to calculate the system effectiveness of a nuclear facility's material protection, control, and accountability (MPC&A) system. The MSET process is divided into four distinct and separate parts: (1) Completion of the questionnaire that assembles information about the operations of every aspect of the MPC&A system; (2) Conversion of questionnaire data into numeric values associated with risk; (3) Analysis of the numeric data utilizing the MPC&A fault tree and the SAPHIRE computer software; and (4) Self-assessment using the MSET reports to perform the effectiveness evaluation of the facility's MPC&A system. The process should lead to confirmation that mitigating features of the system effectively minimize the threat, or it could lead to the conclusion that system improvements or upgrades are necessary to achieve acceptable protection against the threat. If the need for system improvements or upgrades is indicated when the system is analyzed, MSET provides the capability to evaluate potential or actual system improvements or upgrades. A facility's MC&A system can be evaluated at a point in time. The system can be reevaluated after upgrades are implemented or after other system changes occur. The total system or specific subareas within the system can be evaluated. Areas of potential system improvement can be assessed to determine where the most beneficial and cost-effective improvements should be made. Analyses of risk importance factors show that sustainability is essential for optimal performance and reveal where performance degradation has the greatest impact on total system risk.
The risk importance factors show the amount of risk reduction achievable with potential upgrades and the amount of risk reduction achieved after upgrades are completed. Applying the risk assessment tool gives support to budget prioritization by showing where budget support levels must be sustained for MC&A functions most important to risk. Results of the risk assessment are also useful in supporting funding justifications for system improvements that significantly reduce system risk. The functional model, the system risk assessment tool, and the facility evaluation questionnaire are valuable educational tools for MPC&A personnel. These educational tools provide a framework for ongoing dialogue between organizations regarding the design, development, implementation, operation, assessment, and sustainability of MPC&A systems. An organization considering the use of MSET as an analytical tool for evaluating the effectiveness of its MPC&A system will benefit from conducting a complete MSET exercise at an existing nuclear facility.
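The fault-tree quantification that SAPHIRE performs in step (3) can be illustrated in miniature: basic-event probabilities are combined through AND/OR gates up to a top event. The tree shape and probabilities below are invented for the sketch, not MSET data.

```python
# Toy PRA-style fault-tree quantification with independent basic events.

def gate_and(*p):  # all events must occur
    out = 1.0
    for x in p:
        out *= x
    return out

def gate_or(*p):   # at least one event occurs
    out = 1.0
    for x in p:
        out *= (1.0 - x)
    return 1.0 - out

# Hypothetical top event: (defeat access control AND evade accounting)
#                      OR (falsify records AND delayed inventory).
top = gate_or(gate_and(0.05, 0.10), gate_and(0.02, 0.20))
print(round(top, 6))  # 0.00898
```

Risk importance factors are then obtained by perturbing one basic-event probability at a time and observing the change in the top-event probability.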
Assessment Tools' Indicators for Sustainability in Universities: An Analytical Overview
ERIC Educational Resources Information Center
Alghamdi, Naif; den Heijer, Alexandra; de Jonge, Hans
2017-01-01
Purpose: The purpose of this paper is to analyse 12 assessment tools of sustainability in universities and develop the structure and the contents of these tools to be more intelligible. The configuration of the tools reviewed highlights indicators that clearly communicate only the essential information. This paper explores how the theoretical…
Nabbe, P; Le Reste, J Y; Guillou-Landreat, M; Munoz Perez, M A; Argyriadou, S; Claveria, A; Fernández San Martín, M I; Czachowski, S; Lingner, H; Lygidakis, C; Sowinska, A; Chiron, B; Derriennic, J; Le Prielec, A; Le Floch, B; Montier, T; Van Marwijk, H; Van Royen, P
2017-01-01
Depression occurs frequently in primary care. Its broad clinical variability makes it difficult to diagnose. This makes it essential that family practitioner (FP) researchers have validated tools to minimize bias in studies of everyday practice. Which tools, validated against psychiatric examination according to the major depression criteria of DSM-IV or DSM-5, can be used for research purposes? An international FP team conducted a systematic review using the following databases: PubMed, Cochrane, and Embase, from 2000/01/01 to 2015/10/01. The search of the three databases identified 770 abstracts: 546 abstracts were analyzed after 224 duplicates had been removed; 50 of the validity studies were eligible and 4 studies were included. In these 4 studies, the following tools were found: GDS-5, GDS-15, GDS-30, CESD-R, HADS, PSC-51, and HSCL-25. Sensitivity, specificity, positive predictive value, and negative predictive value were collected, and the Youden index was calculated. Using efficiency data alone to compare these studies could be misleading; additional reliability, reproducibility, and ergonomic data will be essential for making comparisons. This study selected seven tools, usable in primary care research, for the diagnosis of depression. In order to define the best tools in terms of efficiency, reproducibility, reliability, and ergonomics for research in primary care, and for care itself, further research will be essential. Copyright © 2016. Published by Elsevier Masson SAS.
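The Youden index mentioned above is simply J = sensitivity + specificity − 1, ranging from 0 (no better than chance) to 1 (perfect discrimination). A one-line sketch, with illustrative example values:

```python
# Youden index for a diagnostic tool: J = sensitivity + specificity - 1.
# J = 1 is a perfect test; J = 0 is no better than chance.

def youden_index(sensitivity, specificity):
    return sensitivity + specificity - 1.0

print(round(youden_index(0.85, 0.90), 2))  # 0.75
```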
Numerical modeling process of embolization arteriovenous malformation
NASA Astrophysics Data System (ADS)
Cherevko, A. A.; Gologush, T. S.; Petrenko, I. A.; Ostapenko, V. V.
2017-10-01
Cerebral arteriovenous malformation is a complex, dangerous, and frequently encountered failure of vascular development. It consists of vessels of very small diameter that shunt blood from the arteries to the veins, and in this regard it can be adequately modeled as a porous medium. Endovascular embolization is an effective treatment of such pathologies; however, the danger of intraoperative rupture during embolization still exists. The purpose of this work is to model the process and to build an optimization algorithm for arteriovenous malformation embolization. To study the different embolization variants, the initial-boundary value problems describing the process of embolization were solved numerically using a new modification of the CABARET scheme. The essential stages of the embolization process were modeled in our numerical experiments. This approach reproduces well the essential features of the discontinuous two-phase flows arising in embolization problems, and it can be used for further study of the embolization process.
Numerical Simulation of Fluid Flow in a Simple Rotor/Stator Pair
1991-06-01
describes a series of numerical experiments dealing with rotor/stator interactions in hydroturbines. The means of analysis was a nonconforming sliding...science and industry is the improvement of the efficiency of the hydroturbine. Numerical flow analysis is essential in order to properly conduct this...evaluation. The hydroturbine is typically modeled as an infinite series of rotor/stator pairs. Figure 1 is an illustration of an axial-flow machine with
Huet, C; Lemosquet, A; Clairand, I; Rioual, J B; Franck, D; de Carlan, L; Aubineau-Lanièce, I; Bottollier-Depois, J F
2009-01-01
Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. This dose distribution can be assessed by physical dosimetric reconstruction methods, using experimental or numerical techniques. This article presents the laboratory-developed SESAME (Simulation of External Source Accident with MEdical images) tool, which is specific to the dosimetric reconstruction of radiological accidents through numerical simulations combining voxel geometry with the MCNP(X) Monte Carlo radiation-matter interaction code. The experimental validation of the tool using a photon field and its application to a radiological accident in Chile in December 2005 are also described.
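As a much-simplified stand-in for the Monte Carlo transport inside such a tool, the primary photon fluence in a narrow-beam geometry follows exponential attenuation. The attenuation coefficient below is a textbook order of magnitude for roughly 1 MeV photons in water/soft tissue, not a value from the paper.

```python
import math

# Narrow-beam photon attenuation: I(x) = I0 * exp(-mu * x), ignoring
# scatter buildup. Illustrative of depth dependence only; real dose
# reconstruction requires full Monte Carlo transport as in MCNP(X).

def primary_fraction(mu_cm, depth_cm):
    """Fraction of primary photons surviving to a given depth."""
    return math.exp(-mu_cm * depth_cm)

print(round(primary_fraction(0.07, 10.0), 3))  # 0.497 at 10 cm depth
```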
Essentials of Career Interest Assessment. Essentials of Psychological Assessment Series.
ERIC Educational Resources Information Center
Prince, Jeffrey P.; Heiser, Lisa J.
This book is a quick reference source to guide the career professional through the essentials of using the most popular career interest tools. It summarizes important technical aspects of each inventory, and offers step-by-step guidance in the interpretation and use of the various inventories. The chapters are: (1) "Overview"; (2)…
Beltrán-Navarro, Beatriz; Abreu-Mendoza, Roberto A; Matute, Esmeralda; Rosselli, Monica
2018-01-01
This article presents a tool for assessing the early numerical abilities of Spanish-speaking Mexican preschoolers. The Numerical Abilities Test, from the Evaluación Neuropsicológica Infantil-Preescolar (ENI-P), evaluates four core abilities of number development: magnitude comparison, counting, subitizing, and basic calculation. We evaluated 307 Spanish-speaking Mexican children aged 2 years 6 months to 4 years 11 months. Appropriate internal consistency and test-retest reliability were demonstrated. We also investigated the effect of age, children's school attendance, maternal education, and sex on children's numerical scores. The results showed that the four subtests captured development across ages. Critically, maternal education had an impact on children's performance in three out of the four subtests, but there was no effect associated with children's school attendance or sex. These results suggest that the Numerical Abilities Test is a reliable instrument for Spanish-speaking preschoolers. We discuss the implications of our outcomes for numerical development.
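The internal-consistency statistic behind such reliability claims is typically Cronbach's alpha. A sketch follows, with an invented toy score matrix (the study's data are not reproduced here).

```python
# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances) / var(total)),
# where k is the number of items (here, subtests) and "total" is each
# child's summed score. The 5-children x 4-subtests matrix is invented.

def variance(xs):  # population variance
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(rows):
    k = len(rows[0])                 # number of subtests
    items = list(zip(*rows))         # per-subtest score lists
    totals = [sum(r) for r in rows]  # per-child total score
    ratio = sum(variance(i) for i in items) / variance(totals)
    return k * (1 - ratio) / (k - 1)

scores = [[1, 2, 1, 2],
          [2, 3, 2, 3],
          [3, 4, 3, 4],
          [4, 5, 4, 5],
          [5, 6, 5, 6]]
print(cronbach_alpha(scores))  # perfectly parallel subtests -> 1.0
```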
Ou, Ming-Chiu; Hsu, Tsung-Fu; Lai, Andrew C; Lin, Yu-Ting; Lin, Chia-Ching
2012-05-01
This study assessed the effectiveness of blended essential oils on menstrual cramps for outpatients with primary dysmenorrhea and explored the analgesic ingredients in the essential oils. A randomized, double-blind clinical trial was conducted. Forty-eight outpatients were diagnosed with primary dysmenorrhea by a gynecologist and scored higher than 5 on a 10-point numeric rating scale. The patients were randomly assigned to an essential oil group (n = 24) and a synthetic fragrance group (n = 24). For the essential oil group, essential oils of lavender (Lavandula officinalis), clary sage (Salvia sclarea), and marjoram (Origanum majorana) were blended in a 2:1:1 ratio and diluted in unscented cream at 3% concentration. All outpatients used the cream daily to massage their lower abdomen from the end of the last menstruation to the beginning of the next menstruation. Both the numeric rating scale and the verbal rating scale scores decreased significantly (P < 0.001) after one menstrual cycle of intervention in the two groups. The duration of pain was significantly reduced from 2.4 to 1.8 days after the aromatherapy intervention in the essential oil group. Aromatic oil massage provided relief for outpatients with primary dysmenorrhea and reduced the duration of menstrual pain in the essential oil group. The blended essential oils contain four key analgesic components that together amount to as much as 79.29%; these analgesic constituents are linalyl acetate, linalool, eucalyptol, and β-caryophyllene. This study suggests that this blended formula can serve as a reference for alternative and complementary medicine for primary dysmenorrhea. © 2012 The Authors. Journal of Obstetrics and Gynaecology Research © 2012 Japan Society of Obstetrics and Gynecology.
Analysis of key technologies for virtual instruments metrology
NASA Astrophysics Data System (ADS)
Liu, Guixiong; Xu, Qingui; Gao, Furong; Guan, Qiuju; Fang, Qiang
2008-12-01
Virtual instruments (VIs) require metrological verification when applied as measuring instruments. Owing to their software-centered architecture, metrological evaluation of VIs covers two aspects: measurement functions and software characteristics. The complexity of the software makes metrological testing of VIs difficult. Key approaches and technologies for the metrological evaluation of virtual instruments are investigated and analyzed in this paper. The principal issue is the evaluation of measurement uncertainty. The nature and regularity of measurement uncertainty caused by software and algorithms can be evaluated by modeling, simulation, analysis, testing, and statistics, supported by the powerful computing capability of the PC. Another concern is the evaluation of software features such as the correctness, reliability, stability, security, and real-time behavior of VIs. Technologies from the software engineering, software testing, and computer security domains can be used for these purposes. For example, a variety of black-box testing, white-box testing, and modeling approaches can be used to evaluate the reliability of modules, components, applications, and the whole VI software. The security of a VI can be assessed by methods such as vulnerability scanning and penetration analysis. To enable metrology institutions to perform metrological verification of VIs efficiently, an automated tool for the above validation is essential. Based on technologies of numerical simulation, software testing, and system benchmarking, a framework for such an automated tool is proposed in this paper. Investigation of the implementation of existing automatic tools that perform calculation of measurement uncertainty, software testing, and security assessment demonstrates the feasibility of the proposed framework.
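One of the approaches named above, evaluating measurement uncertainty by simulation and statistics, can be sketched as a Monte Carlo propagation in the style of GUM Supplement 1. The measurement model and input uncertainties below are invented placeholders, not from the paper.

```python
import random
import statistics

# Monte Carlo propagation of input uncertainties through a measurement
# model, as a VI's software might do. Model and numbers are illustrative:
# a "virtual ohmmeter" computing R = V / I from noisy voltage and current.

random.seed(0)  # reproducible run

def measurement_model(v, i):
    return v / i

# Assumed inputs: V = 10.0 +/- 0.05 V, I = 2.0 +/- 0.02 A (normal, 1-sigma).
samples = [measurement_model(random.gauss(10.0, 0.05), random.gauss(2.0, 0.02))
           for _ in range(100_000)]

print(round(statistics.mean(samples), 2))   # ~5.0 ohm
print(round(statistics.stdev(samples), 3))  # combined standard uncertainty ~0.056
```

The analytic check: relative uncertainty is sqrt((0.05/10)^2 + (0.02/2)^2) ≈ 1.1%, i.e. about 0.056 ohm, which the simulation reproduces.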
NASA Astrophysics Data System (ADS)
Equeter, Lucas; Ducobu, François; Rivière-Lorphèvre, Edouard; Abouridouane, Mustapha; Klocke, Fritz; Dehombreux, Pierre
2018-05-01
Industrial concerns arise regarding the significant cost of cutting tools in machining processes. In particular, an improper replacement policy can lead either to scrap parts or to early tool replacements, which waste still-serviceable tools. ISO 3685 provides the flank wear end-of-life criterion. Flank wear is also the nominal type of wear for the longest tool lifetimes in optimal cutting conditions. Its consequences include poor surface roughness and dimensional discrepancies. In order to aid the replacement decision process, several tool condition monitoring techniques have been suggested. Force signals were shown in the literature to be strongly linked with tool flank wear. It can therefore be assumed that force signals are highly relevant for monitoring the condition of cutting tools and providing decision-aid information in the framework of their maintenance and replacement. The objective of this work is to correlate tool flank wear with numerically computed force signals. The present work uses a Finite Element Model with a Coupled Eulerian-Lagrangian approach. The geometry of the tool is changed for different runs of the model, in order to obtain results that are specific to a certain level of wear. The model is assessed by comparison with experimental data gathered earlier on fresh tools. Using the model at constant cutting parameters, force signals are computed for each studied tool geometry, i.e., for each tool wear state. These signals are qualitatively compared with relevant data from the literature. At this point, no quantitative comparison could be performed on worn tools, because the reviewed literature failed to provide similar studies in this material, either numerical or experimental. Therefore, further development of this work should include experimental campaigns aimed at collecting cutting force signals and assessing the numerical results achieved through this work.
Pediatric intensive care unit admission tool: a colorful approach.
Biddle, Amy
2007-12-01
This article discusses the development, implementation, and utilization of our institution's Pediatric Intensive Care Unit (PICU) Color-Coded Admission Status Tool. Rather than the historical method of identifying a maximum number of staffed beds, a tool was developed to color code the PICU's admission status. Previous methods had been ineffective and led to confusion between the PICU leadership team and the administration. The tool includes the previously missing components of staffing and acuity, which are essential in determining admission capability. The PICU tool has three colored levels: green indicates open for admissions; yellow indicates an admission alert, triggered when few beds are available or when staffing does not match the projected patient numbers or required acuity; and red indicates admissions on hold, because only one trauma or arrest bed is available or staffing does not match the projected acuity. Yellow and red designations require specific actions and the medical director's approval. The tool has been highly successful and has significantly impacted nursing by including nurse staffing, an essential component in determining bed availability.
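The decision rule can be sketched as a small function. The numeric thresholds and parameter names are illustrative, since the article defines the levels qualitatively (and yellow/red designations additionally require the medical director's approval).

```python
# Illustrative encoding of a three-level color-coded admission status.
# Thresholds are invented; the real tool also folds in patient acuity
# and requires medical-director approval for yellow/red.

def admission_status(open_beds, nurses_available, nurses_needed):
    if open_beds <= 1 or nurses_available < nurses_needed - 1:
        return "red"     # admissions on hold
    if open_beds <= 3 or nurses_available < nurses_needed:
        return "yellow"  # admission alert; specific actions required
    return "green"       # open for admissions

print(admission_status(6, 10, 9))  # green
print(admission_status(2, 8, 9))   # yellow
print(admission_status(1, 8, 9))   # red
```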
NASA Astrophysics Data System (ADS)
Abdolkader, Tarek M.; Shaker, Ahmed; Alahmadi, A. N. M.
2018-07-01
With the continuous miniaturization of electronic devices, quantum-mechanical effects such as tunneling become more significant in many device applications. In this paper, a numerical simulation tool is developed under a MATLAB environment to calculate the tunneling probability and current through an arbitrary potential barrier, comparing three different numerical techniques: the finite difference method, transfer matrix method, and transmission line method. For benchmarking, the tool is applied to several case studies such as the rectangular single barrier, rectangular double barrier, and continuous bell-shaped potential barrier, each compared to analytical solutions, giving the dependence of the error on the number of mesh points. In addition, a thorough study of the J-V characteristics of MIM and MIIM diodes, used as rectifiers for rectenna solar cells, is presented, and simulations are compared to experimental results showing satisfactory agreement. At the undergraduate level, the tool provides deeper insight for students to compare numerical techniques used to solve various tunneling problems and helps students choose a suitable technique for a certain application.
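For the rectangular single-barrier benchmark, a closed-form transmission probability exists, which is presumably the analytical solution the numerical techniques are validated against. A hedged sketch in units where ħ²/2m = 1, with illustrative parameters:

```python
import math

# Analytic transmission through a rectangular barrier of height V0 and
# width a, for particle energy E < V0, in units where hbar^2/(2m) = 1:
#   T = 1 / (1 + V0^2 sinh^2(kappa a) / (4 E (V0 - E))),  kappa = sqrt(V0 - E)
# Parameters below are illustrative.

def rectangular_barrier_T(E, V0, a):
    kappa = math.sqrt(V0 - E)
    return 1.0 / (1.0 + V0**2 * math.sinh(kappa * a)**2 / (4.0 * E * (V0 - E)))

print(round(rectangular_barrier_T(E=0.5, V0=1.0, a=1.0), 3))  # 0.629
```

A numerical scheme (finite difference, transfer matrix, or transmission line) applied to the same barrier should converge to this value as the mesh is refined, which is exactly the error-versus-mesh-points study the abstract describes.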
Numerical simulation of the geodynamo reaches Earth's core dynamical regime
NASA Astrophysics Data System (ADS)
Aubert, J.; Gastine, T.; Fournier, A.
2016-12-01
Numerical simulations of the geodynamo have been successful at reproducing a number of static (field morphology) and kinematic (secular variation patterns, core surface flows and westward drift) features of Earth's magnetic field, making them a tool of choice for the analysis and retrieval of geophysical information on Earth's core. However, classical numerical models have been run in a parameter regime far from that of the real system, prompting the question of whether we get "the right answers for the wrong reasons", i.e. whether the agreement between models and nature simply occurs by chance and without physical relevance in the dynamics. In this presentation, we show that classical models succeed in describing the geodynamo because their large-scale spatial structure is essentially invariant as one progresses along a well-chosen path in parameter space to Earth's core conditions. This path is constrained by the need to enforce the relevant force balance (MAC, or Magneto-Archimedes-Coriolis) and preserve the ratio of the convective overturn and magnetic diffusion times. Numerical simulations performed along this path are shown to be spatially invariant at scales larger than that at which the magnetic energy is ohmically dissipated. This property enables the definition of large-eddy simulations that show good agreement with direct numerical simulations in the range where both are feasible, and that can be computed at unprecedented values of the control parameters, such as an Ekman number E = 10^-8. Combining direct and large-eddy simulations, large-scale invariance is observed over half the logarithmic distance in parameter space between classical models and Earth. The conditions reached at this mid-point of the path are furthermore shown to be representative of the rapidly-rotating, asymptotic dynamical regime in which Earth's core resides, with a MAC force balance undisturbed by viscosity or inertia, the enforcement of a Taylor state and strong-field dynamo action.
We conclude that numerical modelling has advanced to a stage where it is possible to use models correctly representing the statics, kinematics and now the dynamics of the geodynamo. This opens the way to a better analysis of the geomagnetic field in the time and space domains.
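The gap between simulated and planetary control parameters quoted above can be made concrete from the definition E = ν/(ΩD²). The core property values below are standard order-of-magnitude estimates, not figures from the abstract.

```python
# Ekman number of Earth's outer core, E = nu / (Omega * D^2).
# Inputs are standard order-of-magnitude estimates (assumptions):

nu = 1.0e-6      # kinematic viscosity of liquid iron, m^2/s
omega = 7.29e-5  # Earth's rotation rate, rad/s
D = 2.26e6       # outer-core shell thickness, m

E = nu / (omega * D**2)
print(f"{E:.1e}")  # ~2.7e-15, vs E = 1e-8 reached by the large-eddy simulations
```

This seven-orders-of-magnitude remainder is precisely why the path-invariance argument, rather than brute-force computation, is needed to connect simulations to Earth's core.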
What are the benefits of using a modeling platform? The VSoil example.
NASA Astrophysics Data System (ADS)
Lafolie, François; Cousin, Isabelle; Mollier, Alain; Pot, Valérie; Maron, Pierre-Alain; Moitrier, Nicolas; Nouguier, Cedric; Moitrier, Nathalie; Beudez, Nicolas
2015-04-01
In the environmental community, the need to couple models and the associated knowledge has emerged recently. The development of a coupling tool or of a modeling platform is mainly driven by the necessity to create models accounting for multiple processes and to take into account the feedbacks between these processes. Models focusing on a restricted number of processes already exist, and the coupling of these numerical tools thus appeared to be an efficient and rapid means of filling the identified gaps. Several tools have been proposed: OMS3 (David et al. 2013); the CSDMS framework (Peckham et al. 2013); and the OpenMI project developed within the framework of the European Community (OpenMI, 2011). However, what we should expect from a modeling platform could be more ambitious than merely coupling existing numerical codes. We believe that we need to easily share not only our numerical representations but also the attached knowledge. We need to develop complex models rapidly and easily, so as to have tools that can address current issues of soil functioning and soil evolution within the frame of global change. We also need to share, in a common frame, our visions of soil functioning at various scales: on the one hand to strengthen our collaborations, and on the other hand to make them visible to the other communities working on environmental issues. The presentation will briefly introduce the VSoil platform. The platform is able to manipulate process concepts and their numerical representations. The tool helps in assembling modules to create a model and automatically generates an executable code and a GUI. The potential of the tool will be illustrated with a few selected cases.
Li, Daojin; Yin, Danyang; Chen, Yang; Liu, Zhen
2017-05-19
Protein phosphorylation is a major post-translational modification, which plays a vital role in the cellular signaling of numerous biological processes. Mass spectrometry (MS) has been an essential tool for the analysis of protein phosphorylation, for which a key step is to selectively enrich phosphopeptides from complex biological samples. In this study, a metal-organic framework (MOF)-based monolithic capillary has been successfully prepared as an effective sorbent for the selective enrichment of phosphopeptides and has been coupled off-line with matrix-assisted laser desorption ionization-time-of-flight mass spectrometry (MALDI-TOF MS) for efficient analysis of phosphopeptides. Using β-casein as a representative phosphoprotein, efficient phosphorylation analysis by this off-line platform was verified. Phosphorylation analysis of a nonfat milk sample was also demonstrated. By introducing the large surface areas and highly ordered pores of MOFs into a monolithic column, the MOF-based monolithic capillary exhibited several significant advantages, such as excellent selectivity toward phosphopeptides, superb tolerance to interference, and a simple operation procedure. Because of these highly desirable properties, the MOF-based monolithic capillary could be a useful tool for protein phosphorylation analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
Multivariable harmonic balance analysis of the neuronal oscillator for leech swimming.
Chen, Zhiyong; Zheng, Min; Friesen, W Otto; Iwasaki, Tetsuya
2008-12-01
Biological systems, and particularly neuronal circuits, embody a very high level of complexity. Mathematical modeling is therefore essential for understanding how large sets of neurons with complex multiple interconnections work as a functional system. With the increase in computing power, it is now possible to numerically integrate a model with many variables to simulate behavior. However, such analysis can be time-consuming and may not reveal the mechanisms underlying the observed phenomena. An alternative, complementary approach is mathematical analysis, which can demonstrate direct and explicit relationships between a property of interest and system parameters. This paper introduces a mathematical tool for analyzing neuronal oscillator circuits based on multivariable harmonic balance (MHB). The tool is applied to a model of the central pattern generator (CPG) for leech swimming, which comprises a chain of weakly coupled segmental oscillators. The results demonstrate the effectiveness of the MHB method and provide analytical explanations for some CPG properties. In particular, the intersegmental phase lag is estimated to be the sum of a nominal value and a perturbation, where the former depends on the structure and span of the neuronal connections and the latter is roughly proportional to the period gradient, communication delay, and the reciprocal of the intersegmental coupling strength.
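The closing claim above, that the intersegmental phase lag is a nominal value plus a perturbation roughly inversely proportional to coupling strength, can be illustrated on a far simpler surrogate than the full CPG: a chain of phase oscillators with nearest-neighbor sinusoidal coupling and a small frequency gradient. The chain model and every parameter value below are illustrative assumptions for the demonstration, not the leech CPG model or the MHB method of the paper.

```python
import numpy as np

def chain_phase_lags(n=10, k=1.0, grad=0.02, dt=0.01, steps=60000):
    """Integrate a chain of phase oscillators with nearest-neighbor
    sinusoidal coupling and a linear frequency gradient; return the
    mean steady-state absolute phase lag between adjacent segments."""
    omega = 1.0 + grad * np.arange(n)     # linear frequency (period) gradient
    theta = np.zeros(n)
    for _ in range(steps):
        coupling = np.zeros(n)
        coupling[:-1] += k * np.sin(theta[1:] - theta[:-1])   # from right neighbor
        coupling[1:] += k * np.sin(theta[:-1] - theta[1:])    # from left neighbor
        theta = theta + dt * (omega + coupling)               # forward Euler step
    # wrap adjacent phase differences into (-pi, pi]
    dphi = np.angle(np.exp(1j * np.diff(theta)))
    return np.mean(np.abs(dphi))

weak = chain_phase_lags(k=0.5)    # weak intersegmental coupling
strong = chain_phase_lags(k=2.0)  # strong intersegmental coupling
print(weak, strong)
```

Consistent with the MHB estimate, the lag perturbation shrinks as the intersegmental coupling strength grows: `weak` comes out larger than `strong`.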
Editing of EIA coded, numerically controlled, machine tool tapes
NASA Technical Reports Server (NTRS)
Weiner, J. M.
1975-01-01
Editing of numerically controlled (N/C) machine tool tapes (8-level paper tape) using an interactive graphic display processor is described. A rapid technique required for correcting production errors in N/C tapes was developed using the interactive text editor on the IMLAC PDS-ID graphic display system and two special programs resident on disk. The correction technique and special programs for processing N/C tapes coded to EIA specifications are discussed.
Metric Use in the Tool Industry. A Status Report and a Test of Assessment Methodology.
1982-04-20
Weights and Measures); CIM - Computer-Integrated Manufacturing; CNC - Computer Numerical Control; DOD - Department of Defense; DODISS - DOD Index of... numerically-controlled (CNC) machines that have an inch-millimeter selection switch and a corresponding dual readout scale. The use of both metric... satisfactorily met the demands of both domestic and foreign customers for metric machine tools by providing either metric-capable machines or NC and CNC
USDA-ARS?s Scientific Manuscript database
Aromatic plants produce organic compounds that may be involved in the defense of plants against phytopathogenic insects, bacteria, fungi, and viruses. One of these compounds called carvacrol that is found in high concentrations in essential oils such as oregano has been reported to exhibit numerous...
Paul G. Schaberg; Donald H. DeHayes; Gary J. Hawley; Samuel E. Nijensohn
2008-01-01
Healthy forests provide many of the essential ecosystem services upon which all life depends. Genetic diversity is an essential component of long-term forest health because it provides a basis for adaptation and resilience to environmental stress and change. In addition to natural processes, numerous anthropogenic factors deplete forest genetic resources. Genetic...
Numerical Propulsion System Simulation: A Common Tool for Aerospace Propulsion Being Developed
NASA Technical Reports Server (NTRS)
Follen, Gregory J.; Naiman, Cynthia G.
2001-01-01
The NASA Glenn Research Center is developing an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). This simulation is initially being used to support aeropropulsion in the analysis and design of aircraft engines. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the Aviation Safety Program and Advanced Space Transportation. NPSS focuses on the integration of multiple disciplines such as aerodynamics, structure, and heat transfer with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes using the Common Object Request Broker Architecture (CORBA) in the NPSS Developer's Kit to facilitate collaborative engineering. The NPSS Developer's Kit will provide the tools to develop custom components and to use the CORBA capability for zooming to higher fidelity codes, coupling to multidiscipline codes, transmitting secure data, and distributing simulations across different platforms. These powerful capabilities will extend NPSS from a zero-dimensional simulation tool to a multifidelity, multidiscipline system-level simulation tool for the full life cycle of an engine.
NASA Astrophysics Data System (ADS)
Hunt, M. J.; Nuttle, W. K.; Cosby, B. J.; Marshall, F. E.
2005-05-01
Establishing minimum flow requirements in aquatic ecosystems is one way to stipulate controls on water withdrawals in a watershed. The basis of the determination is to identify the amount of flow needed to sustain a threshold ecological function. To develop minimum flow criteria, an understanding of ecological response in relation to flow is essential. Several steps are needed, including: (1) identification of important resources and ecological functions, (2) compilation of available information, (3) determination of historical conditions, (4) establishment of technical relationships between inflow and resources, and (5) identification of numeric criteria that reflect the threshold at which resources are harmed. The process is interdisciplinary, requiring the integration of hydrologic and ecologic principles with quantitative assessments. The tools used to quantify ecological response, and key questions about how the quantity of flow influences the ecosystem, are examined by comparing minimum flow determinations in two different aquatic systems in South Florida. Each system is characterized by substantial hydrologic alteration. The first, the Caloosahatchee River, is a riverine system located on the southwest coast of Florida. The second, the Everglades-Florida Bay ecotone, is a wetland mangrove ecosystem located on the southern tip of the Florida peninsula. In both cases, freshwater submerged aquatic vegetation (Vallisneria americana or Ruppia maritima), located in areas of the saltwater-freshwater interface, has been identified as a basis for minimum flow criteria. The integration of field studies, laboratory studies, and literature review was required. From this information we developed ecological modeling tools to quantify and predict plant growth in response to varying environmental variables. Coupled with hydrologic modeling tools, questions relating to the quantity and timing of flow, and the ecological consequences in relation to normal variability, are addressed.
Virtual reality based surgical assistance and training system for long duration space missions.
Montgomery, K; Thonier, G; Stephanides, M; Schendel, S
2001-01-01
Access to medical care during long duration space missions is extremely important. Numerous unanticipated medical problems will need to be addressed promptly and efficiently. Although telemedicine provides a convenient tool for remote diagnosis and treatment, it is impractical due to the long delay in data transmission to and from Earth. While a well-trained surgeon-internist-astronaut would be an essential addition to the crew, the vast number of potential medical problems necessitates instant access to computerized, skill-enhancing and diagnostic tools. A functional prototype of a virtual reality based surgical training and assistance tool was created at our center, using low-power, small, lightweight components that would be easy to transport on a space mission. The system consists of a tracked, head-mounted display, a computer system, and a number of tracked surgical instruments. The software provides a real-time surgical simulation system with integrated monitoring and information retrieval and a voice input/output subsystem. Initial medical content for the system has been created, comprising craniofacial, hand, inner ear, and general anatomy, as well as information on a number of surgical procedures and techniques. One surgical specialty in particular, microsurgery, was provided as a full simulation due to its long training requirements, the significant impact of experience on results, and the likelihood of need. However, the system is easily adapted to realistically simulate a large number of other surgical procedures. By providing a general system for surgical simulation and assistance, the astronaut-surgeon can maintain their skills, acquire new specialty skills, and use tools for computer-based surgical planning and assistance to minimize overall crew and mission risk.
Cloud-Based Orchestration of a Model-Based Power and Data Analysis Toolchain
NASA Technical Reports Server (NTRS)
Post, Ethan; Cole, Bjorn; Dinkel, Kevin; Kim, Hongman; Lee, Erich; Nairouz, Bassem
2016-01-01
The proposed Europa Mission concept contains many engineering and scientific instruments that consume varying amounts of power and produce varying amounts of data throughout the mission. System-level power and data usage must be well understood and analyzed to verify design requirements. Numerous cross-disciplinary tools and analysis models are used to simulate the system-level spacecraft power and data behavior. This paper addresses the problem of orchestrating a consistent set of models, tools, and data in a unified analysis toolchain when ownership is distributed among numerous domain experts. An analysis and simulation environment was developed as a way to manage the complexity of the power and data analysis toolchain and to reduce the simulation turnaround time. A system model data repository is used as the trusted store of high-level inputs and results, while other remote servers are used for archival of larger data sets and for analysis tool execution. Simulation data passes through numerous domain-specific analysis tools, and end-to-end simulation execution is enabled through a web-based tool. The use of a cloud-based service facilitates coordination among distributed developers, enables scalable computation and storage, and ensures a consistent execution environment. Configuration management is emphasized to maintain traceability between current and historical simulation runs and their corresponding versions of models, tools, and data.
20 CFR 416.1220 - Property essential to self-support; general.
Code of Federal Regulations, 2011 CFR
2011-04-01
... and supplies, motor vehicles, and tools, etc.) used in a trade or business (as defined in § 404.1066... activities. Liquid resources other than those used as part of a trade or business are not property essential...
Topham, Debra; Drew, Debra
2017-12-01
CAPA is a multifaceted pain assessment tool that was adopted at a large tertiary Midwest hospital to replace the numeric scale for adult patients who could self-report their pain experience. This article describes the process of implementation and the effect on patient satisfaction scores. Use of the tool is supported by the premise that pain assessment entails more than just pain intensity and that assessment is an exchange of meaning between patients and clinicians dependent on internal and external factors. Implementation of the tool was a transformative process resulting in modest increases in patient satisfaction scores with pain management. Patient reports that "staff did everything to manage pain" had the biggest gains and were sustained for more than 2 years. The CAPA tool meets regulatory requirements for pain assessment. Copyright © 2017 American Society for Pain Management Nursing. Published by Elsevier Inc. All rights reserved.
A survey of parallel programming tools
NASA Technical Reports Server (NTRS)
Cheng, Doreen Y.
1991-01-01
This survey examines 39 parallel programming tools. Focus is placed on those tool capabilities needed for parallel scientific programming rather than for general computer science. The tools are classified with current and future needs of the Numerical Aerodynamic Simulator (NAS) in mind: existing and anticipated NAS supercomputers and workstations; operating systems; programming languages; and applications. They are divided into four categories: suggested acquisitions; tools already brought in; tools worth tracking; and tools eliminated from further consideration at this time.
NASA Technical Reports Server (NTRS)
Litvin, Faydor L.; Kuan, Chihping; Zhang, Yi
1991-01-01
A numerical method is developed for the minimization of deviations of real tooth surfaces from the theoretical ones. The deviations are caused by errors of manufacturing, errors in the installation of machine-tool settings, and distortion of surfaces by heat treatment. The deviations are determined by coordinate measurements of gear tooth surfaces. The minimization of deviations is based on the proper correction of the initially applied machine-tool settings. The contents of the accomplished research project cover the following topics: (1) description of the principle of coordinate measurements of gear tooth surfaces; (2) derivation of theoretical tooth surfaces (with examples of surfaces of hypoid gears and references for spiral bevel gears); (3) determination of the reference point and the grid; (4) determination of the deviations of real tooth surfaces at the points of the grid; and (5) determination of the required corrections of machine-tool settings for minimization of deviations. The procedure for minimization of deviations is based on the numerical solution of an overdetermined system of n linear equations in m unknowns (m ≪ n), where n is the number of points of measurements and m is the number of parameters of applied machine-tool settings to be corrected. The developed approach is illustrated with numerical examples.
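The last step described above, least-squares solution of an overdetermined system of n deviation equations in m setting corrections (m ≪ n), can be sketched generically. The sensitivity matrix, the "true" setting errors, and the noise level below are synthetic stand-ins for the demonstration, not the gear-geometry model of the report.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m = 45, 3                      # n measured grid points, m machine-tool settings
# hypothetical sensitivity matrix: d(deviation at point i)/d(setting j)
A = rng.normal(size=(n, m))
true_correction = np.array([0.08, -0.03, 0.05])
# measured deviations = response to the setting errors + measurement noise
b = A @ true_correction + 0.001 * rng.normal(size=n)

# least-squares solution of the overdetermined system A x ~= b
x, _, _, _ = np.linalg.lstsq(A, b, rcond=None)

before = np.linalg.norm(b)          # deviation norm with no correction
after = np.linalg.norm(b - A @ x)   # deviation norm after correcting settings
print(x, before, after)
```

With many more measurement points than settings, the recovered corrections match the synthetic setting errors closely, and the residual deviations drop to the noise floor.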
USDA-ARS?s Scientific Manuscript database
The common bed bug (Cimex lectularius L.) resurged in the U.S. and many other countries over the past decade. The need for safe and effective bed bug control products propelled the development of numerous “green pesticides”, mostly with essential oils listed as active ingredients. Various inorganic ...
Numerical Simulation of Ground Coupling of Low Yield Nuclear Detonation
2010-06-01
Without nuclear testing, advanced simulation and experimental facilities, such as the National Ignition Facility (NIF), are essential to assuring... in planning future experimental work at NIF. Subject terms: National Ignition Facility, GEODYN, Ground Coupling... simulation and experimental facilities, such as the National Ignition Facility (NIF), are essential to assuring safety, reliability, and effectiveness
Copper Trafficking in Plants and Its Implication on Cell Wall Dynamics
Printz, Bruno; Lutts, Stanley; Hausman, Jean-Francois; Sergeant, Kjell
2016-01-01
In plants, copper (Cu) acts as an essential cofactor of numerous proteins. While the definitive number of these so-called cuproproteins is unknown, they perform central functions in plant cells. As a micronutrient, a minimal amount of Cu is needed to ensure cellular functions. However, Cu excess may, in contrast, exert detrimental effects on plant primary production and even survival. It is therefore essential for a plant to have a strictly controlled Cu homeostasis, an equilibrium that is influenced by both tissue type and developmental stage. In the current review, an overview is presented of the different stages of Cu transport from the soil into the plant and throughout the different plant tissues. Special emphasis is placed on the Cu-dependent responses mediated by the SPL7 transcription factor, and on the crosstalk between this transcriptional regulation and the microRNA-mediated suppression of translation of seemingly non-essential cuproproteins. Since Cu is an essential player in electron transport, we also review the recent insights into the molecular mechanisms controlling chloroplastic and mitochondrial Cu transport and homeostasis. We finally highlight the involvement of numerous Cu-proteins and Cu-dependent activities in the properties of one of the major Cu-accumulation sites in plants: the cell wall. PMID:27200069
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen Qinghu; Department of Physics, Zhejiang University, Hangzhou 310027; Yang Yuan
2010-11-15
Entanglement evolution of two independent Jaynes-Cummings atoms without the rotating-wave approximation (RWA) is studied by a numerically exact approach. Previous results based on the RWA are essentially modified in the strong-coupling regime (g ≥ 0.1), which has been reached in recent experiments on the flux qubit coupled to the LC resonator. For the initial Bell state with anticorrelated spins, entanglement sudden death (ESD) is absent in the RWA but does appear in the present numerical calculation without the RWA. Aperiodic entanglement evolution in the strong-coupling regime is observed. The strong atom-cavity coupling facilitates the ESD. The sign of the detuning plays an essential role in the entanglement evolution for strong coupling, which is irrelevant in the RWA. Analytical results based on a unitary transformation are also given, which could not modify the RWA picture essentially. It is suggested that the activation of the photons may be the origin of ESD in this system.
Numerical model updating technique for structures using firefly algorithm
NASA Astrophysics Data System (ADS)
Sai Kubair, K.; Mohan, S. C.
2018-03-01
Numerical model updating is a technique used for updating existing experimental models for structures in civil, mechanical, automotive, marine, aerospace engineering, etc. The basic concept behind this technique is updating the numerical models to closely match the experimental data obtained from real or prototype test structures. The present work involves the development of a numerical model using MATLAB as a computational tool, with mathematical equations that define the experimental model. The firefly algorithm is used as the optimization tool in this study. In this updating process, a response parameter of the structure has to be chosen, which helps to correlate the numerical model developed with the experimental results obtained. The variables for the updating can be either material or geometrical properties of the model, or both. In this study, to verify the proposed technique, a cantilever beam is analyzed for its tip deflection and a space frame is analyzed for its natural frequencies. Both models are updated with their respective response values obtained from experimental results. The numerical results after updating show that a close relationship can be brought between the experimental and numerical models.
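The updating loop described above can be sketched with a minimal firefly algorithm recovering a single material parameter of a cantilever from its tip deflection. The beam formula δ = PL³/(3EI) is standard, but all parameter values, the population size, and the firefly tuning constants below are invented for the illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "experimental" response: tip deflection of a cantilever,
# delta = P*L^3 / (3*E*I); we pretend E is unknown and recover it.
P, L, I = 1000.0, 2.0, 8e-6          # load [N], length [m], area moment [m^4]
E_true = 70e9                        # Young's modulus to be recovered [Pa]
delta_exp = P * L**3 / (3 * E_true * I)

def cost(E):
    """Squared mismatch between model response and experimental response."""
    return (P * L**3 / (3 * E * I) - delta_exp) ** 2

# Minimal firefly algorithm searching E in [10, 300] GPa.
pop, iters = 20, 80
beta0, gamma, alpha = 1.0, 1e-22, 5e9   # attractiveness, absorption, step size
E = rng.uniform(10e9, 300e9, size=pop)
for t in range(iters):
    f = np.array([cost(e) for e in E])
    for i in range(pop):
        for j in range(pop):
            if f[j] < f[i]:             # firefly j is brighter: i moves toward j
                beta = beta0 * np.exp(-gamma * (E[i] - E[j]) ** 2)
                E[i] += beta * (E[j] - E[i]) + alpha * (0.95 ** t) * rng.normal()
    E = np.clip(E, 10e9, 300e9)

E_best = E[np.argmin([cost(e) for e in E])]
print(E_best / 1e9)   # recovered modulus in GPa
```

The brightest firefly never moves in this simple variant, so the best estimate found so far is preserved while the rest of the swarm samples around it with a decaying random step.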
NASA Astrophysics Data System (ADS)
Cazzani, Antonio; Malagù, Marcello; Turco, Emilio
2016-03-01
We illustrate a numerical tool for analyzing plane arches such as those frequently used in historical masonry heritage. It is based on a refined elastic mechanical model derived from the isogeometric approach. In particular, geometry and displacements are modeled by means of non-uniform rational B-splines. After a brief introduction, outlining the basic assumptions of this approach and the corresponding modeling choices, several numerical applications to arches, which are typical of masonry structures, show the performance of this novel technique. These are discussed in detail to emphasize the advantage and potential developments of isogeometric analysis in the field of structural analysis of historical masonry buildings with complex geometries.
Designing tools for oil exploration using nuclear modeling
NASA Astrophysics Data System (ADS)
Mauborgne, Marie-Laure; Allioli, Françoise; Manclossi, Mauro; Nicoletti, Luisa; Stoller, Chris; Evans, Mike
2017-09-01
When designing nuclear tools for oil exploration, one of the first steps is typically nuclear modeling for concept evaluation and initial characterization. Having an accurate model, including the availability of accurate cross sections, is essential to reduce or avoid time-consuming and costly design iterations. During tool response characterization, modeling is benchmarked against experimental data and then used to complement and expand the database, making it more detailed and inclusive of measurement environments that are difficult or impossible to reproduce in the laboratory. We present comparisons of our modeling results obtained using the ENDF/B-VI and ENDF/B-VII cross section databases, focusing on the response to a few elements found in the tool, borehole, and subsurface formation. For neutron-induced inelastic and capture gamma ray spectroscopy, major obstacles may be caused by missing or inaccurate cross sections for essential materials. We show examples of the benchmarking of modeling results against experimental data obtained during tool characterization and discuss observed discrepancies.
The efficiency of geophysical adjoint codes generated by automatic differentiation tools
NASA Astrophysics Data System (ADS)
Vlasenko, A. V.; Köhl, A.; Stammer, D.
2016-02-01
The accuracy of numerical models that describe complex physical or chemical processes depends on the choice of model parameters. Estimating an optimal set of parameters by optimization algorithms requires knowledge of the sensitivity of the process of interest to model parameters. Typically the sensitivity computation involves differentiation of the model, which can be performed by applying algorithmic differentiation (AD) tools to the underlying numerical code. However, existing AD tools differ substantially in design, legibility and computational efficiency. In this study we show that, for geophysical data assimilation problems of varying complexity, the performance of adjoint codes generated by the existing AD tools (i) Open_AD, (ii) Tapenade, (iii) NAGWare and (iv) Transformation of Algorithms in Fortran (TAF) can be vastly different. Based on simple test problems, we evaluate the efficiency of each AD tool with respect to computational speed, accuracy of the adjoint, efficiency of memory usage, and the capability of each AD tool to handle modern FORTRAN 90-95 elements such as structures and pointers, which combine groups of variables or provide aliases to memory addresses, respectively. We show that, while operator overloading tools are the only ones suitable for modern codes written in object-oriented programming languages, their computational efficiency lags behind source transformation by orders of magnitude, rendering the application of these modern tools to practical assimilation problems prohibitive. In contrast, the application of source transformation tools appears to be the most efficient choice, allowing even large geophysical data assimilation problems to be handled. However, they can only be applied to numerical models written in earlier generations of programming languages.
Our study indicates that applying existing AD tools to realistic geophysical problems faces limitations that urgently need to be solved to allow the continuous use of AD tools for solving geophysical problems on modern computer architectures.
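The operator-overloading approach the study evaluates can be made concrete with a toy example: forward-mode differentiation by overloading arithmetic on dual numbers. This is a from-scratch illustration of the principle only; it is not the adjoint (reverse) mode generated by the AD tools above, and none of the code comes from those tools.

```python
import math

class Dual:
    """Dual number a + b*eps with eps**2 == 0: carries a value and its derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)  # product rule
    __rmul__ = __mul__
    def sin(self):
        return Dual(math.sin(self.val), math.cos(self.val) * self.dot)  # chain rule

def f(x):
    """f(x) = x^2 * sin(x), written only with overloaded operations."""
    return (x * x) * x.sin()

x0 = 1.3
out = f(Dual(x0, 1.0))   # seed tangent dx/dx = 1; out.dot is f'(x0)
exact = 2 * x0 * math.sin(x0) + x0**2 * math.cos(x0)
print(out.dot, exact)
```

Each arithmetic operation propagates the derivative alongside the value, which is why overloading requires no source transformation but pays a runtime cost on every operation.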
ASSET. Assessment Simplification System for Elementary Teachers.
ERIC Educational Resources Information Center
Kentucky State Dept. of Education, Frankfort.
This document is designed to show the connections between assessment tools available for primary and intermediate grades in the Kentucky public schools. Sections of the document outline the essential assessment tools and give information about how they support and mirror each other. These tools can be used to bridge the knowledge of primary and…
A Standards-Based Grading and Reporting Tool for Faculty: Design and Implications
ERIC Educational Resources Information Center
Sadik, Alaa M.
2011-01-01
The use of standards-based assessment, grading, and reporting tools is essential to ensure that assessment meets acceptable levels of quality and standardization. This study reports the design, development, and evaluation of a standards-based assessment tool for instructors at Sultan Qaboos University, Sultanate of Oman. The Rapid Applications…
Making Culturally Responsive Mathematics Teaching Explicit: A Lesson Analysis Tool
ERIC Educational Resources Information Center
Aguirre, Julia M.; Zavala, Maria del Rosario
2013-01-01
In the United States, there is a need for pedagogical tools that help teachers develop essential pedagogical content knowledge and practices to meet the mathematical education needs of a growing culturally and linguistically diverse student population. In this article, we introduce an innovative lesson analysis tool that focuses on integrating…
Support System to Improve Reading Activity in Parkinson’s Disease and Essential Tremor Patients
Parrales Bravo, Franklin; Del Barrio García, Alberto A.; Gallego de la Sacristana, Mercedes; López Manzanares, Lydia; Vivancos, José; Ayala Rodrigo, José Luis
2017-01-01
The use of information and communication technologies (ICTs) to improve the quality of life of people with chronic and degenerative diseases is a topic receiving much attention nowadays. We can observe that new technologies have driven numerous scientific projects in e-Health, encompassing Smart and Mobile Health, in order to address all the matters related to data processing and health. Our work focuses on helping to improve the quality of life of people with Parkinson’s Disease (PD) and Essential Tremor (ET) by means of a low-cost platform that enables them to read books in an easy manner. Our system is composed of two robotic arms and a graphical interface developed for Android platforms. After several tests, our proposal has achieved a 96.5% accuracy for A4 80 gr non-glossy paper. Moreover, our system has outperformed the state-of-the-art platforms considering different types of paper and inclined surfaces. The feedback from ET and PD patients was collected at “La Princesa” University Hospital in Madrid and was used to study the user experience. Several features such as ease of use, speed, correct behavior or confidence were measured via patient feedback, and a high level of satisfaction was awarded to most of them. According to the patients, our system is a promising tool for facilitating the activity of reading. PMID:28467366
The use of microtechnology and nanotechnology in fabricating vascularized tissues.
Obregón, Raquel; Ramón-Azcón, Javier; Ahadian, Samad; Shiku, Hitoshi; Bae, Hojae; Ramalingam, Murugan; Matsue, Tomokazu
2014-01-01
Tissue engineering (TE) is a multidisciplinary research area that combines medicine, biology, and material science. In recent decades, microtechnology and nanotechnology have also been gradually integrated into this field and have become essential components of TE research. Tissues and complex organs in the body depend on a branched blood vessel system. One of the main objectives for TE researchers is to replicate this vessel system and obtain functional vascularized structures within engineered tissues or organs. With the help of new nanotechnology and microtechnology, significant progress has been made. Achievements include the design of nanoscale-level scaffolds with new functionalities, the development of integrated and rapid nanotechnology methods for biofabrication of vascular tissues, and the discovery of new composite materials to direct differentiation of stem and induced pluripotent stem cells into the vascular phenotype. Although numerous challenges to replicating vascularized tissue for clinical uses remain, the combination of these new advances has yielded new tools for producing functional vascular tissues in the near future.
Metacognitive gimmicks and their use by upper level physics students
NASA Astrophysics Data System (ADS)
White, Gary; Sikorski, Tiffany-Rose; Landay, Justin
2017-01-01
We report on the initial phases of a study of three particular metacognitive gimmicks that upper-level physics students can use as tools in their problem-solving kit, namely: checking units for consistency, discerning whether limiting cases match physical intuition, and computing numerical values for reasonableness. Students in a one-semester Griffiths electromagnetism course at a small private urban university campus are asked to respond to explicit prompts that encourage adopting these three methods for checking answers to physics problems, especially those problems for which an algebraic expression is part of the final answer. We explore how, and to what extent, these students adopt these gimmicks, as well as the time development of their use. While the term "gimmick" carries some pejorative baggage, we feel it describes the essential nature of the pedagogical idea adequately in that it gets attention, is easy for the students to remember, and represents, albeit perhaps in a surface way, some key ideas about which professional physicists care.
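The first gimmick, checking units for consistency, is mechanical enough to automate, which is one way to show students what the check actually does. A minimal sketch follows; the dimension-tuple encoding and the class name are invented for illustration.

```python
# A quantity carries exponents of (length, mass, time); multiplication
# adds exponents, and addition requires identical exponents.

class Q:
    def __init__(self, value, dims):
        self.value, self.dims = value, tuple(dims)   # dims = (L, M, T)
    def __mul__(self, other):
        return Q(self.value * other.value,
                 tuple(a + b for a, b in zip(self.dims, other.dims)))
    def __add__(self, other):
        if self.dims != other.dims:
            raise ValueError(f"unit mismatch: {self.dims} vs {other.dims}")
        return Q(self.value + other.value, self.dims)

m = Q(2.0, (0, 1, 0))      # mass: kg
v = Q(3.0, (1, 0, -1))     # velocity: m/s
F = Q(4.0, (1, 1, -2))     # force: N = kg*m/s^2
d = Q(5.0, (1, 0, 0))      # distance: m

kinetic = Q(0.5, (0, 0, 0)) * m * v * v   # (1/2) m v^2
work = F * d                              # F * d
total = kinetic + work                    # both are energies: addition allowed
print(total.value, total.dims)            # 29.0 (2, 1, -2), i.e. kg*m^2/s^2

try:
    kinetic + v                           # energy + velocity: caught immediately
except ValueError as err:
    print("caught:", err)
```

A symbolic answer whose terms fail this addition check cannot be right, which is exactly the argument the prompt asks students to make.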
Synthesis of spherical calcium phosphate particles for dental and orthopedic applications
Bohner, Marc; Tadier, Solène; van Garderen, Noémie; de Gasparo, Alex; Döbelin, Nicola; Baroud, Gamal
2013-01-01
Calcium phosphate materials have been used increasingly in the past 40 years as bone graft substitutes in the dental and orthopedic fields. Accordingly, numerous fabrication methods have been proposed and used. However, the controlled production of spherical calcium phosphate particles remains a challenge. Since such particles are essential for the synthesis of pastes and cements delivered into the host bone by minimally invasive approaches, the aim of the present document is to review their synthesis and applications. For that purpose, production methods were classified according to the reagents used (solutions, slurries, pastes, powders), dispersion media (gas, liquid, solid), dispersion tools (nozzle, propeller, sieve, mold), particle diameters of the end product (from 10 nm to 10 mm), and calcium phosphate phases. Low-temperature calcium phosphates such as monetite, brushite or octacalcium phosphate, as well as high-temperature calcium phosphates, such as hydroxyapatite, β-tricalcium phosphate or tetracalcium phosphate, were considered. More than a dozen production methods and over a hundred scientific publications were discussed. PMID:23719177
Corrections to Newton’s law of gravitation - application to hybrid Bloch brane
NASA Astrophysics Data System (ADS)
Almeida, C. A. S.; Veras, D. F. S.; Dantas, D. M.
2018-02-01
We present in this work the calculations of corrections to Newton's law of gravitation due to Kaluza-Klein gravitons in five-dimensional warped thick braneworld scenarios. We consider here a recently proposed model, namely, the hybrid Bloch brane. This model couples two scalar fields to gravity and is engendered from a domain-wall-like defect. Two other models, the so-called asymmetric hybrid brane and the compact brane, are also considered. These models are deformations of the ϕ⁴ and sine-Gordon topological defects, respectively. We therefore consider the branes engendered by such defects and compute the corrections in their cases as well. In order to obtain the mass spectrum and its corresponding eigenfunctions, which are the essential quantities for computing the correction to the Newtonian potential, we develop a suitable numerical technique. The calculation of slight deviations in the gravitational potential may be used as a selection tool for braneworld scenarios, matching with future experimental measurements in high-energy collisions.
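In thick braneworld models of this type, the corrected potential is typically written as the 4D zero-mode term plus a sum over the Kaluza-Klein tower, weighted by the value of each massive eigenfunction at the brane location. The generic form below is a standard hedged sketch; the precise spectrum m_n and wavefunctions ψ_n are model-dependent and are exactly what the numerical technique in the abstract must supply.

```latex
V(r) \simeq -\,G\,\frac{m_1 m_2}{r}
\left( 1 + \sum_{n} \left|\psi_n(z_b)\right|^2 e^{-m_n r} \right)
```

Each massive mode thus contributes a Yukawa-type term, exponentially suppressed for r ≫ 1/m_n, which is why short-distance deviations from the 1/r law can discriminate between brane models.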
Neilson, Matthew P; Mackenzie, John A; Webb, Steven D; Insall, Robert H
2010-11-01
In this paper we present a computational tool that enables the simulation of mathematical models of cell migration and chemotaxis on an evolving cell membrane. Recent models require the numerical solution of systems of reaction-diffusion equations on the evolving cell membrane and then the solution state is used to drive the evolution of the cell edge. Previous work involved moving the cell edge using a level set method (LSM). However, the LSM is computationally very expensive, which severely limits the practical usefulness of the algorithm. To address this issue, we have employed the parameterised finite element method (PFEM) as an alternative method for evolving a cell boundary. We show that the PFEM is far more efficient and robust than the LSM. We therefore suggest that the PFEM potentially has an essential role to play in computational modelling efforts towards the understanding of many of the complex issues related to chemotaxis.
X-ray optics simulation and beamline design for the APS upgrade
NASA Astrophysics Data System (ADS)
Shi, Xianbo; Reininger, Ruben; Harder, Ross; Haeffner, Dean
2017-08-01
The upgrade of the Advanced Photon Source (APS) to a Multi-Bend Achromat (MBA) will increase the brightness of the APS by between two and three orders of magnitude. The APS upgrade (APS-U) project includes a list of feature beamlines that will take full advantage of the new machine. Many of the existing beamlines will also be upgraded to profit from this significant machine enhancement. Optics simulations are essential in the design and optimization of these new and existing beamlines. In this contribution, the simulation tools used and developed at APS, ranging from analytical to numerical methods, are summarized. Three general optical layouts are compared in terms of their coherence control and focusing capabilities. The concept of zoom optics, where two sets of focusing elements (e.g., CRLs and KB mirrors) are used to provide variable beam sizes at a fixed focal plane, is optimized analytically. The effects of figure errors on the vertical spot size and on the local coherence along the vertical direction of the optimized design are investigated.
Manipulation of positron orbits in a dipole magnetic field with fluctuating electric fields
NASA Astrophysics Data System (ADS)
Saitoh, H.; Horn-Stanja, J.; Nißl, S.; Stenson, E. V.; Hergenhahn, U.; Pedersen, T. Sunn; Singer, M.; Dickmann, M.; Hugenschmidt, C.; Stoneking, M. R.; Danielson, J. R.; Surko, C. M.
2018-01-01
We report the manipulation of positron orbits in a toroidal dipole magnetic field configuration realized with electric fields generated by segmented electrodes. When the toroidal circulation motion of positrons in the dipole field is coupled with time-varying electric fields generated by azimuthally segmented outer electrodes, positrons undergo oscillations of their radial positions. This enables quick manipulation of the spatial profiles of positrons in a dipole field trap by choosing the appropriate frequency, amplitude, phase, and gating time of the electric fields. Guided by numerical orbit analysis, we applied these electric fields to positrons injected from the NEPOMUC slow positron facility into a prototype dipole field trap experiment with a permanent magnet. Measurements with annihilation γ-rays clearly demonstrated the efficient compression of positrons into the strong magnetic field region of the dipole field configuration. This positron manipulation technique can serve as an essential tool for future experiments on the formation of electron-positron plasmas.
Convection and chemistry effects in CVD: A 3-D analysis for silicon deposition
NASA Technical Reports Server (NTRS)
Gokoglu, S. A.; Kuczmarski, M. A.; Tsui, P.; Chait, A.
1989-01-01
The computational fluid dynamics code FLUENT has been adopted to simulate the entire rectangular-channel-like (3-D) geometry of an experimental CVD reactor designed for Si deposition. The code incorporated the effects of both homogeneous (gas-phase) and heterogeneous (surface) chemistry, with finite reaction rates for the important species in silane dissociation. The experiments were designed to elucidate the effects of gravitationally-induced buoyancy-driven convection flows on the quality of the grown Si films. This goal is accomplished by contrasting the results obtained from a carrier gas mixture of H2/Ar with those obtained from the same molar mixture ratio of H2/He, without any accompanying change in the chemistry. Computationally, these cases are simulated in the terrestrial gravitational field and in the absence of gravity. The numerical results compare favorably with experiments. Powerful computational tools provide invaluable insights into the complex physicochemical phenomena taking place in CVD reactors. Such information is essential for the improved design and optimization of future CVD reactors.
On the Spectrum of the Plenoptic Function.
Gilliam, Christopher; Dragotti, Pier-Luigi; Brookes, Mike
2014-02-01
The plenoptic function is a powerful tool to analyze the properties of multi-view image data sets. In particular, the understanding of the spectral properties of the plenoptic function is essential in many computer vision applications, including image-based rendering. In this paper, we derive for the first time an exact closed-form expression of the plenoptic spectrum of a slanted plane with finite width and use this expression as the elementary building block to derive the plenoptic spectrum of more sophisticated scenes. This is achieved by approximating the geometry of the scene with a set of slanted planes and evaluating the closed-form expression for each plane in the set. We then use this closed-form expression to revisit uniform plenoptic sampling. In this context, we derive a new Nyquist rate for the plenoptic sampling of a slanted plane and a new reconstruction filter. Through numerical simulations, on both real and synthetic scenes, we show that the new filter outperforms alternative existing filters.
Flame extinction limit and particulates formation in fuel blends
NASA Astrophysics Data System (ADS)
Subramanya, Mahesh
Many fuels used in material processing and power generation applications are generally blends of various hydrocarbons. Although the combustion and aerosol formation dynamics of individual fuels are well understood, the flame dynamics of fuel blends are yet to be characterized. This research uses a twin-flame counterflow burner to measure flame velocity, flame extinction, particulate formation and particulate morphology of hydrogen fuel blend flames at different H2 concentrations, oscillation frequencies and stretch conditions. Phase-resolved spectroscopic measurements (emission spectra) of OH, H, O and CH radical/atom concentrations are used to characterize the heat release processes of the flame. In addition, flame-generated particulates are collected using a thermophoretic sampling technique and are qualitatively analyzed using Raman spectroscopy and SEM. Such measurements are essential for the development of advanced computational tools capable of predicting fuel blend flame characteristics at realistic combustor conditions. The measurements of this research provide representative yet accurate data, with unique, well-defined boundary conditions that can be reproduced in numerical computations for kinetic code validation.
Label-free cell separation and sorting in microfluidic systems
Gossett, Daniel R.; Weaver, Westbrook M.; Mach, Albert J.; Hur, Soojung Claire; Tse, Henry Tat Kwong; Lee, Wonhee; Amini, Hamed
2010-01-01
Cell separation and sorting are essential steps in cell biology research and in many diagnostic and therapeutic methods. Recently, there has been interest in methods which avoid the use of biochemical labels; numerous intrinsic biomarkers have been explored to identify cells including size, electrical polarizability, and hydrodynamic properties. This review highlights microfluidic techniques used for label-free discrimination and fractionation of cell populations. Microfluidic systems have been adopted to precisely handle single cells and interface with other tools for biochemical analysis. We analyzed many of these techniques, detailing their mode of separation, while concentrating on recent developments and evaluating their prospects for application. Furthermore, this was done from a perspective where inertial effects are considered important and general performance metrics were proposed which would ease comparison of reported technologies. Lastly, we assess the current state of these technologies and suggest directions which may make them more accessible. Figure: A wide range of microfluidic technologies have been developed to separate and sort cells by taking advantage of differences in their intrinsic biophysical properties. PMID:20419490
NASA Astrophysics Data System (ADS)
Lisimenka, Aliaksandr; Kubicki, Adam
2017-02-01
A new spectral analysis technique is proposed for rhythmic bedform quantification, based on the 2D Fourier transform involving the calculation of a set of low-order spectral moments. The approach provides a tool for efficient quantification of bedform length and height as well as spatial crest-line alignment. Contrary to the conventional method, it not only describes the most energetic component of an undulating seabed surface but also retrieves information on its secondary structure without application of any band-pass filter of which the upper and lower cut-off frequencies are a priori unknown. Validation is based on bathymetric data collected in the main Vistula River mouth area (Przekop Wisły), Poland. This revealed two generations (distinct groups) of dunes which are migrating seawards along distinct paths, probably related to the hydrological regime of the river. The data enable the identification of dune divergence and convergence zones. The approach proved successful in the parameterisation of topographic roughness, an essential aspect in numerical modelling studies.
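The dominant-peak core of such a 2D spectral estimate can be sketched in a few lines. The grid size, dune wavelength, and amplitude below are illustrative placeholders, not values from the paper, and the paper's low-order spectral-moment machinery (which also recovers secondary bedform structure) is not reproduced here:

```python
import numpy as np

# Synthetic seabed: sinusoidal dunes of wavelength 25.6 m and amplitude 0.5 m
# (illustrative values, not from the paper).
nx, ny, dx = 256, 256, 1.0               # grid points and spacing in metres
x = np.arange(nx) * dx
y = np.arange(ny) * dx
X, Y = np.meshgrid(x, y)
L_true, A_true = 25.6, 0.5
z = A_true * np.sin(2 * np.pi * X / L_true)

# 2D Fourier transform of the bathymetry
F = np.fft.fft2(z)
P = np.abs(F)
P[0, 0] = 0.0                            # drop the mean (DC) component

# Locate the dominant spectral peak
iy, ix = np.unravel_index(np.argmax(P), P.shape)
kx = np.fft.fftfreq(nx, d=dx)[ix]        # cycles per metre
ky = np.fft.fftfreq(ny, d=dx)[iy]
wavelength = 1.0 / np.hypot(kx, ky)      # dominant bedform length
amplitude = 2.0 * P[iy, ix] / (nx * ny)  # sine amplitude (energy split over +/-k)

print(wavelength, 2 * amplitude)         # bedform length and crest-to-trough height
```

For real bathymetry the peak leaks across neighbouring frequency bins, which is one motivation for the moment-based estimates used in the paper.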
NASA Astrophysics Data System (ADS)
Chen, Miawjane; Yan, Shangyao; Wang, Sin-Siang; Liu, Chiu-Lan
2015-02-01
An effective project schedule is essential for enterprises to increase their efficiency of project execution, to maximize profit, and to minimize wastage of resources. Heuristic algorithms have been developed to efficiently solve the complicated multi-mode resource-constrained project scheduling problem with discounted cash flows (MRCPSPDCF) that characterizes real problems. However, the solutions obtained in past studies have been approximate and are difficult to evaluate in terms of optimality. In this study, a generalized network flow model, embedded in a time-precedence network, is proposed to formulate the MRCPSPDCF with payments at activity completion times. Mathematically, the model is formulated as an integer network flow problem with side constraints, which can be efficiently solved to optimality using existing mathematical programming software. To evaluate the model performance, numerical tests are performed. The test results indicate that the model could be a useful planning tool for project scheduling in the real world.
Brubacher, John L.; Vieira, Ana P.; Newmark, Phillip A.
2014-01-01
The flatworm Schmidtea mediterranea is an emerging model species in such fields as stem-cell biology, regeneration, and evolutionary biology. Excellent molecular tools have been developed for S. mediterranea, but ultrastructural techniques have received far less attention. Processing specimens for histology and transmission electron microscopy is notoriously idiosyncratic for particular species or specimen types. Unfortunately, most methods for S. mediterranea described in the literature lack numerous essential details, and those few that do provide them rely on specialized equipment that may not be readily available. Here we present an optimized protocol for ultrastructural preparation of S. mediterranea. The protocol can be completed in six days, much of which is “hands-off” time. To aid with troubleshooting, we also illustrate the significant effects of seemingly minor variations in fixative, buffer concentration, and dehydration steps. This procedure will be useful for all planarian researchers, particularly those with relatively little experience in tissue processing. PMID:24556788
Ornaments of the earliest Upper Paleolithic: New insights from the Levant
Kuhn, Steven L.; Stiner, Mary C.; Reese, David S.; Güleç, Erksin
2001-01-01
Two sites located on the northern Levantine coast, Üçağızlı Cave (Turkey) and Ksar 'Akil (Lebanon) have yielded numerous marine shell beads in association with early Upper Paleolithic stone tools. Accelerator mass spectrometry (AMS) radiocarbon dates indicate ages between 39,000 and 41,000 radiocarbon years (roughly 41,000–43,000 calendar years) for the oldest ornament-bearing levels in Üçağızlı Cave. Based on stratigraphic evidence, the earliest shell beads from Ksar 'Akil may be even older. These artifacts provide some of the earliest evidence for traditions of personal ornament manufacture by Upper Paleolithic humans in western Asia, comparable in age to similar objects from Eastern Europe and Africa. The new data show that the initial appearance of Upper Paleolithic ornament technologies was essentially simultaneous on three continents. The early appearance and proliferation of ornament technologies appears to have been contingent on variable demographic or social conditions. PMID:11390976
Multidisciplinary Multiobjective Optimal Design for Turbomachinery Using Evolutionary Algorithm
NASA Technical Reports Server (NTRS)
2005-01-01
This report summarizes Dr. Lian's efforts toward developing a robust and efficient tool for multidisciplinary and multi-objective optimal design for turbomachinery using evolutionary algorithms. This work consisted of two stages. In the first stage (July 2003 to June 2004), Dr. Lian focused on building the essential capabilities required for the project; more specifically, he worked on two subjects: an enhanced genetic algorithm (GA) and an integrated optimization system combining a GA with a surrogate model. In the second stage (July 2004 to February 2005), Dr. Lian formulated aerodynamic optimization and structural optimization as a multi-objective optimization problem and performed multidisciplinary, multi-objective optimizations of a transonic compressor blade based on the proposed model. Dr. Lian's numerical results showed that the proposed approach can effectively reduce the blade weight and increase the stage pressure ratio in an efficient manner. In addition, the new design was structurally safer than the original design. Five conference papers and three journal papers were published on this topic by Dr. Lian.
Numerical simulations of strongly correlated electron and spin systems
NASA Astrophysics Data System (ADS)
Changlani, Hitesh Jaiprakash
Developing analytical and numerical tools for strongly correlated systems is a central challenge for the condensed matter physics community. In the absence of exact solutions and controlled analytical approximations, numerical techniques have often contributed to our understanding of these systems. Exact Diagonalization (ED) requires the storage of at least two vectors the size of the Hilbert space under consideration (which grows exponentially with system size), which makes it affordable only for small systems. The Density Matrix Renormalization Group (DMRG) uses an intelligent Hilbert space truncation procedure to significantly reduce this cost, but in its present formulation is limited to quasi-1D systems. Quantum Monte Carlo (QMC) maps the Schrodinger equation to the diffusion equation (in imaginary time) and only samples the eigenvector over time, thereby avoiding the memory limitation. However, the stochasticity involved in the method gives rise to the "sign problem" characteristic of fermion and frustrated spin systems. The first part of this thesis is an effort to make progress in the development of a numerical technique which overcomes the above-mentioned problems. We consider novel variational wavefunctions, christened "Correlator Product States" (CPS), which have a general functional form that hopes to capture essential correlations in the ground states of spin and fermion systems in any dimension. We also consider a recent proposal to modify projector (Green's Function) Quantum Monte Carlo to ameliorate the sign problem for realistic and model Hamiltonians (such as the Hubbard model). This exploration led to our own set of improvements, primarily a semistochastic formulation of projector Quantum Monte Carlo. Despite their limitations, existing numerical techniques can yield physical insights into a wide variety of problems.
The second part of this thesis considers one such numerical technique - DMRG - and adapts it to study the Heisenberg antiferromagnet on a generic tree graph. Our attention turns to a systematic numerical and semi-analytical study of the effect of local even/odd sublattice imbalance on the low energy spectrum of antiferromagnets on regular Cayley trees. Finally, motivated by previous experiments and theories of randomly diluted antiferromagnets (where an even/odd sublattice imbalance naturally occurs), we present our study of the Heisenberg antiferromagnet on the Cayley tree at the percolation threshold. Our work shows how to detect "emergent" low energy degrees of freedom and compute the effective interactions between them by using data from DMRG calculations.
ERIC Educational Resources Information Center
Spüler, Martin; Walter, Carina; Rosenstiel, Wolfgang; Gerjets, Peter; Moeller, Korbinian; Klein, Elise
2016-01-01
Numeracy is a key competency for living in our modern knowledge society. Therefore, it is essential to support numerical learning from basic to more advanced competency levels. From educational psychology it is known that learning is most effective when the respective content is neither too easy nor too demanding in relation to learners'…
Zhang, Yong-Tao; Shi, Jing; Shu, Chi-Wang; Zhou, Ye
2003-10-01
A quantitative study is carried out in this paper to investigate the size of numerical viscosities and the resolution power of high-order weighted essentially nonoscillatory (WENO) schemes for solving one- and two-dimensional Navier-Stokes equations for compressible gas dynamics with high Reynolds numbers. A one-dimensional shock tube problem, a one-dimensional example with parameters motivated by supernova and laser experiments, and a two-dimensional Rayleigh-Taylor instability problem are used as numerical test problems. For the two-dimensional Rayleigh-Taylor instability problem, or similar problems with small-scale structures, the details of the small structures are determined by the physical viscosity (therefore, the Reynolds number) in the Navier-Stokes equations. Thus, to obtain faithful resolution to these small-scale structures, the numerical viscosity inherent in the scheme must be small enough so that the physical viscosity dominates. A careful mesh refinement study is performed to capture the threshold mesh for full resolution, for specific Reynolds numbers, when WENO schemes of different orders of accuracy are used. It is demonstrated that high-order WENO schemes are more CPU time efficient to reach the same resolution, both for the one-dimensional and two-dimensional test problems.
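For reference, the standard fifth-order WENO reconstruction of Jiang and Shu, which underlies the schemes studied here, can be sketched as follows. This is the generic textbook formulation for a single left-biased interface value, not the authors' specific implementation:

```python
import numpy as np

def weno5_reconstruct(f, eps=1e-6):
    """Fifth-order WENO (Jiang-Shu) reconstruction of the interface value
    f_{i+1/2} from the five cell averages f[i-2..i+2].
    `f` is a length-5 array; returns a scalar."""
    fm2, fm1, f0, fp1, fp2 = f
    # Candidate third-order reconstructions on the three sub-stencils
    q0 = (2*fm2 - 7*fm1 + 11*f0) / 6.0
    q1 = (-fm1 + 5*f0 + 2*fp1) / 6.0
    q2 = (2*f0 + 5*fp1 - fp2) / 6.0
    # Smoothness indicators: large on stencils containing a discontinuity
    b0 = 13/12*(fm2 - 2*fm1 + f0)**2 + 0.25*(fm2 - 4*fm1 + 3*f0)**2
    b1 = 13/12*(fm1 - 2*f0 + fp1)**2 + 0.25*(fm1 - fp1)**2
    b2 = 13/12*(f0 - 2*fp1 + fp2)**2 + 0.25*(3*f0 - 4*fp1 + fp2)**2
    # Nonlinear weights biased away from non-smooth stencils
    g = np.array([0.1, 0.6, 0.3])
    a = g / (eps + np.array([b0, b1, b2]))**2
    w = a / a.sum()
    return w[0]*q0 + w[1]*q1 + w[2]*q2

# On smooth (here linear) data all three stencils agree and the
# reconstruction is exact at the interface:
print(weno5_reconstruct(np.array([0.0, 1.0, 2.0, 3.0, 4.0])))  # -> 2.5
```

In smooth regions the weights approach the optimal values (0.1, 0.6, 0.3) and the scheme is fifth-order accurate; near shocks the smoothness indicators suppress stencils crossing the discontinuity, which is the source of the small numerical viscosity quantified in the study.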
The Basics in Pottery: Clay and Tools.
ERIC Educational Resources Information Center
Larson, Joan
1985-01-01
Art teachers at the middle school or junior high school level usually find themselves in a program teaching ceramics. The most essential tools needed for a ceramics class are discussed. Different kinds of clay are also discussed. (RM)
Numerical modeling tools for chemical vapor deposition
NASA Technical Reports Server (NTRS)
Jasinski, Thomas J.; Childs, Edward P.
1992-01-01
Development of general numerical simulation tools for chemical vapor deposition (CVD) was the objective of this study. Physical models of important CVD phenomena were developed and implemented into the commercial computational fluid dynamics software FLUENT. The resulting software can address general geometries as well as the most important phenomena occurring in CVD reactors: fluid flow patterns, temperature and chemical species distribution, and gas-phase and surface deposition chemistry. The available physical models are documented, and examples of CVD simulation capabilities are provided.
O'Bryan, Corliss A; Pendleton, Sean J; Crandall, Philip G; Ricke, Steven C
2015-01-01
The antimicrobial activity of essential oils and their components has been recognized for several years. Essential oils are produced as secondary metabolites by many plants and can be distilled from all different portions of plants. The recent emergence of bacteria resistant to multiple antibiotics has spurred research into the use of essential oils as alternatives. Recent research has demonstrated that many of these essential oils have beneficial effects for livestock, including reduction of foodborne pathogens in these animals. Numerous studies have been made into the mode of action of essential oils, and the resulting elucidation of bacterial cell targets has contributed to new perspectives on countering antimicrobial resistance and pathogenicity of these bacteria. In this review, an overview of the current knowledge about the antibacterial mode of action of essential oils and their constituents is provided.
O’Bryan, Corliss A.; Pendleton, Sean J.; Crandall, Philip G.; Ricke, Steven C.
2015-01-01
The antimicrobial activity of essential oils and their components has been recognized for several years. Essential oils are produced as secondary metabolites by many plants and can be distilled from all different portions of plants. The recent emergence of bacteria resistant to multiple antibiotics has spurred research into the use of essential oils as alternatives. Recent research has demonstrated that many of these essential oils have beneficial effects for livestock, including reduction of foodborne pathogens in these animals. Numerous studies have been made into the mode of action of essential oils, and the resulting elucidation of bacterial cell targets has contributed to new perspectives on countering antimicrobial resistance and pathogenicity of these bacteria. In this review, an overview of the current knowledge about the antibacterial mode of action of essential oils and their constituents is provided. PMID:26664964
NASA Astrophysics Data System (ADS)
Greene, Patrick T.; Eldredge, Jeff D.; Zhong, Xiaolin; Kim, John
2016-07-01
In this paper, we present a method for performing uniformly high-order direct numerical simulations of high-speed flows over arbitrary geometries. The method was developed with the goal of simulating and studying the effects of complex isolated roughness elements on the stability of hypersonic boundary layers. The simulations are carried out on Cartesian grids with the geometries imposed by a third-order cut-stencil method. A fifth-order hybrid weighted essentially non-oscillatory scheme was implemented to capture any steep gradients in the flow created by the geometries and a third-order Runge-Kutta method is used for time advancement. A multi-zone refinement method was also utilized to provide extra resolution at locations with expected complex physics. The combination results in a globally fourth-order scheme in space and third order in time. Results confirming the method's high order of convergence are shown. Two-dimensional and three-dimensional test cases are presented and show good agreement with previous results. A simulation of Mach 3 flow over the logo of the Ubuntu Linux distribution is shown to demonstrate the method's capabilities for handling complex geometries. Results for Mach 6 wall-bounded flow over a three-dimensional cylindrical roughness element are also presented. The results demonstrate that the method is a promising tool for the study of hypersonic roughness-induced transition.
A Study to Investigate the Sleeping Comfort of Mattress using Finite Element Method
NASA Astrophysics Data System (ADS)
Yoshida, Hiroaki; Kamijo, Masayoshi; Shimizu, Yoshio
Sleep is an essential physiological activity for human beings, and many studies have investigated the sleeping comfort of mattresses. Appropriate measurement of the stress distribution within the human body would provide valuable information. Numerical analysis is considered one of the most desirable techniques for estimating this internal stress distribution, and the Finite Element Method (FEM), widely accepted as a useful numerical technique, was utilized in this study. Since human body dimensions differ between individuals, however, it is presumed that the internal stress distribution also changes with these differences and that mattress preference varies among different body forms. Thus, we developed three human FEM models reproducing the body forms of three types of male subjects, and investigated the sleeping comfort of mattresses based on the relationship between FEM analysis findings and sensory testing results. Comparing the results of FEM analysis and sensory testing in the neck region, we found that the sensory testing results corresponded to the FEM analysis findings, and that it was possible to estimate subjects' mattress preferences and comfort in the neck region using the FEM analysis. We believe the FEM analysis managed to quantify the subjects' preferences for mattresses, proving itself a valuable tool for examining the sleeping comfort of mattresses.
Quantifying discrimination of Framingham risk functions with different survival C statistics.
Pencina, Michael J; D'Agostino, Ralph B; Song, Linye
2012-07-10
Cardiovascular risk prediction functions offer an important diagnostic tool for clinicians and patients themselves. They are usually constructed with the use of parametric or semi-parametric survival regression models. It is essential to be able to evaluate the performance of these models, preferably with summaries that offer natural and intuitive interpretations. The concept of discrimination, popular in the logistic regression context, has been extended to survival analysis. However, the extension is not unique. In this paper, we define discrimination in survival analysis as the model's ability to separate those with longer event-free survival from those with shorter event-free survival within some time horizon of interest. This definition remains consistent with that used in logistic regression, in the sense that it assesses how well the model-based predictions match the observed data. Practical and conceptual examples and numerical simulations are employed to examine four C statistics proposed in the literature to evaluate the performance of survival models. We observe that they differ in the numerical values and aspects of discrimination that they capture. We conclude that the index proposed by Harrell is the most appropriate to capture discrimination described by the above definition. We suggest researchers report which C statistic they are using, provide a rationale for their selection, and be aware that comparing different indices across studies may not be meaningful. Copyright © 2012 John Wiley & Sons, Ltd.
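A minimal sketch of Harrell's C, the index the authors recommend, illustrates the definition of discrimination used above. The data are made up for the example, and ties and computational efficiency are handled only crudely:

```python
def harrell_c(time, event, risk):
    """Harrell's concordance index: among usable pairs (the subject with the
    shorter observed time had an event), the fraction in which the model
    assigns the higher predicted risk to the subject who failed earlier.
    Ties in predicted risk count as 1/2. Illustrative sketch only."""
    concordant, usable = 0.0, 0
    n = len(time)
    for i in range(n):
        for j in range(n):
            # Pair (i, j) is usable if subject i had an event strictly
            # before subject j's observed (possibly censored) time.
            if event[i] == 1 and time[i] < time[j]:
                usable += 1
                if risk[i] > risk[j]:
                    concordant += 1.0
                elif risk[i] == risk[j]:
                    concordant += 0.5
    return concordant / usable

# Perfectly concordant predictions give C = 1
t = [2, 4, 6, 8]
e = [1, 1, 0, 1]            # third subject is censored
r = [0.9, 0.7, 0.5, 0.3]    # higher predicted risk for earlier failure
print(harrell_c(t, e, r))   # -> 1.0
```

Note how censoring enters only through which pairs are usable, which is one of the aspects in which the competing C statistics discussed in the paper differ.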
NASA Astrophysics Data System (ADS)
Shan, Feng; Guo, Xiasheng; Tu, Juan; Cheng, Jianchun; Zhang, Dong
High-intensity focused ultrasound (HIFU) has become an attractive therapeutic tool for noninvasive tumor treatment. The ultrasonic transducer is the key component that generates the HIFU energy, and the dimensions of the focal region it produces are closely relevant to the safety of HIFU treatment. It is therefore essential to investigate the focal region of the transducer numerically. Although conventional acoustic wave equations have been used successfully to describe the acoustic field, they still have some inherent drawbacks. In this work, we present an axisymmetric isothermal multi-relaxation-time lattice Boltzmann method (MRT-LBM) model with the Bouzidi-Firdaouss-Lallemand (BFL) boundary condition in a cylindrical coordinate system. With this model, some preliminary simulations were first conducted to determine a reasonable value of the relaxation parameter. The validity of the model was then examined by comparing the LBM results with those of the Khokhlov-Zabolotskaya-Kuznetsov (KZK) equation and the spheroidal beam equation (SBE) for focused transducers with different aperture angles. In addition, the influence of the aperture angle on the focal region was investigated. The proposed model will provide significant references for the parameter optimization of focused transducers in HIFU treatment and other applications, and new insights into conventional acoustic numerical simulations.
Investigation of HIV-1 infected and uninfected cells using the optical trapping technique
NASA Astrophysics Data System (ADS)
Ombinda-Lemboumba, S.; Malabi, R.; Lugongolo, M. Y.; Thobakgale, S. L.; Manoto, S.; Mthunzi-Kufa, P.
2017-02-01
Optical trapping has emerged as an essential tool for manipulating single biological materials and performing sophisticated spectroscopic analysis on individual cells. The technique uses a tightly focused laser beam, delivered through a high-numerical-aperture objective lens, to grab and immobilize cells. Coupling optical trapping with other technologies is possible and allows stable sample trapping while also facilitating molecular, chemical and spectroscopic analysis. For this reason, we are exploring laser trapping combined with laser spectroscopy as a potential non-invasive method of interrogating individual cells with a high degree of specificity in terms of the information generated. Thus, to obtain as much pathological information as possible, we use a home-built optical trapping and spectroscopy system for real-time probing of human immunodeficiency virus (HIV-1) infected and uninfected single cells. Briefly, our experimental rig comprises an infrared continuous-wave laser at 1064 nm with a power output of 1.5 W, a 100X high-numerical-aperture oil-immersion microscope objective used to capture and immobilise individual cell samples, and an excitation source. Spectral patterns obtained under 1064 nm excitation provide information on HIV-1 infected and uninfected cells. We present these preliminary findings, which may be valuable for the development of an HIV-1 point-of-care detection system.
Numerical tension adjustment of x-ray membrane to represent goat skin kompang
NASA Astrophysics Data System (ADS)
Siswanto, Waluyo Adi; Abdullah, Muhammad Syiddiq Bin
2017-04-01
This paper presents a numerical membrane model of the traditional musical instrument kompang, used to find the membrane tension at which an x-ray film membrane reproduces the classical goat-skin membrane of the kompang. In this study, an experiment on the kompang is first conducted in an acoustical anechoic enclosure; in parallel, a mathematical model of the kompang membrane is developed to simulate its vibration in polar coordinates by implementing the Fourier-Bessel wave function. The wave equation in the polar direction in mode (0,1) provides the corresponding natural frequencies of the circular membrane. The initial and boundary conditions in the function are determined from the experiment to allow correct development of the numerical equation. The numerical mathematical model is coded in SMath, which serves as both the numerical analysis and plotting tool. Two kompang membrane cases with different membrane materials, i.e. goat-skin and x-ray film membranes with a fixed radius of 0.1 m, are used in the experiment. An x-ray film membrane with the appropriate tension setting can be used as an alternative to reproduce the sound of the traditional goat-skin kompang; the tension setting that resembles the goat skin is 24 N. An effective numerical tool has been developed to help kompang makers set the tension of the x-ray membrane. In future applications, a traditional kompang of any size can have its membrane replaced by another material if the tension is set to the correct value. The developed numerical tool is useful and handy for calculating the tension of the alternative membrane material.
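For an ideal circular membrane, the mode (0,1) relation between tension and natural frequency that underlies such a model can be sketched as follows. Only the 0.1 m radius comes from the abstract; the areal density and target frequency below are hypothetical placeholders, and the paper's full Fourier-Bessel treatment is not reproduced:

```python
import math

ALPHA_01 = 2.404825557695773   # first zero of the Bessel function J0

def mode01_frequency(T, sigma, a):
    """Fundamental (0,1) frequency of an ideal circular membrane:
    f = (alpha_01 / (2*pi*a)) * sqrt(T / sigma),
    with tension T in N/m, areal density sigma in kg/m^2, radius a in m."""
    return ALPHA_01 / (2 * math.pi * a) * math.sqrt(T / sigma)

def tension_from_frequency(f, sigma, a):
    """Invert the relation above to recover the membrane tension that
    reproduces a measured (0,1) frequency."""
    return sigma * (2 * math.pi * a * f / ALPHA_01) ** 2

# Hypothetical x-ray-film parameters (not from the paper) with the
# paper's 0.1 m radius: areal density 0.2 kg/m^2, target frequency 150 Hz.
a, sigma, f_target = 0.1, 0.2, 150.0
T = tension_from_frequency(f_target, sigma, a)
# Round-trip check: the recovered tension reproduces the target frequency.
print(T, mode01_frequency(T, sigma, a))
```

Matching the x-ray membrane's tension to the goat-skin fundamental in this way is the basic idea behind the tension-adjustment tool the paper describes.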
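The kompang abstract above fixes the mode-(0,1) tension setting at 24 N for a 0.1 m radius membrane. As a minimal sketch of the underlying physics, the mode-(0,1) natural frequency of an ideal circular membrane is f = (α01 / 2πa)·√(T/σ), with α01 the first zero of the Bessel function J0. Note the assumptions here: the abstract's "24 N" is interpreted as a tension per unit length (N/m), and the areal density value is purely illustrative, not a measured property of x-ray film or goat skin.

```python
import math

# First zero of the Bessel function J0 (mode (0,1) of a circular membrane).
ALPHA_01 = 2.404825557695773

def fundamental_frequency(radius_m, tension_n_per_m, areal_density_kg_per_m2):
    """Mode-(0,1) natural frequency of an ideal circular membrane:
    f = (alpha_01 / (2*pi*a)) * sqrt(T / sigma),
    with T the tension per unit length and sigma the areal density."""
    wave_speed = math.sqrt(tension_n_per_m / areal_density_kg_per_m2)
    return ALPHA_01 * wave_speed / (2.0 * math.pi * radius_m)

# Radius 0.1 m as in the abstract; the tension (24 N/m) and areal density
# (0.5 kg/m^2) are illustrative assumptions, not values from the paper.
f01 = fundamental_frequency(0.1, 24.0, 0.5)
```

Matching the goat-skin sound then amounts to adjusting T until f01 (and higher modes, whose frequencies scale with the other Bessel zeros) coincide for both materials.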
LeBrun, Drake G; Chackungal, Smita; Chao, Tiffany E; Knowlton, Lisa M; Linden, Allison F; Notrica, Michelle R; Solis, Carolina V; McQueen, K A Kelly
2014-03-01
Surgery has been neglected in low- and middle-income countries for decades. It is vital that the Post-2015 Development Agenda reflect that surgery is an important part of a comprehensive global health care delivery model. We compare the operative capacities of multiple low- and middle-income countries and identify critical gaps in surgical infrastructure. The Harvard Humanitarian Initiative survey tool was used to assess the operative capacities of 78 government district hospitals in Bangladesh (n = 7), Bolivia (n = 11), Ethiopia (n = 6), Liberia (n = 11), Nicaragua (n = 10), Rwanda (n = 21), and Uganda (n = 12) from 2011 to 2012. Key outcome measures included infrastructure, equipment availability, physician and nonphysician surgical providers, operative volume, and pharmaceutical capacity. Seventy of 78 district hospitals performed operations. There was fewer than one surgeon or anesthesiologist per 100,000 catchment population in all countries except Bolivia. There were no physician anesthesiologists in any surveyed hospitals in Rwanda, Liberia, Uganda, or in the majority of hospitals in Ethiopia. Mean annual operations per hospital ranged from 374 in Nicaragua to 3,215 in Bangladesh. Emergency operations and obstetric operations constituted 57.5% and 40% of all operations performed, respectively. Availability of pulse oximetry, essential medicines, and key infrastructure (water, electricity, oxygen) varied widely between and within countries. The need for operative procedures is not being met by the limited operative capacity in numerous low- and middle-income countries. It is of paramount importance that this gap be addressed by prioritizing essential surgery and safe anesthesia in the Post-2015 Development Agenda. Copyright © 2014 Mosby, Inc. All rights reserved.
A cross-sectional survey of essential surgical capacity in Somalia
Elkheir, Natalie; Sharma, Akshay; Cherian, Meena; Saleh, Omar Abdelrahman; Everard, Marthe; Popal, Ghulam Rabani; Ibrahim, Abdi Awad
2014-01-01
Objective: To assess life-saving and disability-preventing surgical services (including emergency, trauma, obstetrics, anaesthesia) of health facilities in Somalia and to assist in the planning of strategies for strengthening surgical care systems. Design: Cross-sectional survey. Setting: Health facilities in all 3 administrative zones of Somalia: northwest Somalia (NWS), known as Somaliland; northeast Somalia (NES), known as Puntland; and south/central Somalia (SCS). Participants: 14 health facilities. Measures: The WHO Tool for Situational Analysis to Assess Emergency and Essential Surgical Care was employed to capture a health facility's capacity to deliver surgical and anaesthesia services by investigating four categories of data: infrastructure, human resources, interventions available and equipment. Results: The 14 facilities surveyed in Somalia represent 10 of the 18 districts throughout the country. The facilities serve an average patient population of 331 250 people, and 12 of the 14 identify as hospitals. While major surgical procedures were provided at many facilities (caesarean section, laparotomy, appendicectomy, etc), only 22% had fully available oxygen access, 50% fully available electricity and less than 30% had any management guidelines for emergency and surgical care. Furthermore, only 36% were able to provide general anaesthesia inhalation due to lack of skills, supplies and equipment. Basic supplies for airway management and the prevention of infection transmission were severely lacking in most facilities. Conclusions: According to the results of the WHO Tool for Situational Analysis to Assess Emergency and Essential Surgical Care survey, there exist significant gaps in the capacity of emergency and essential surgical services in Somalia, including inadequacies in essential equipment, service provision and infrastructure.
The information provided by the WHO tool can serve as a basis for evidence-based decisions on country-level policy regarding the allocation of resources and provision of emergency and essential surgical services. PMID:24812189
Technical Report on Occupations in Numerically Controlled Metal-Cutting Machining.
ERIC Educational Resources Information Center
Manpower Administration (DOL), Washington, DC. U.S. Employment Service.
At present, only 5 percent of the short-run metal-cutting machining in the United States is done by numerically controlled machine tools, but within the next decade this share is expected to increase by 50 percent. Numerically controlled machines use taped data which is converted into instructions that direct the machine to perform certain steps…
1982-07-21
aerodynamic tool for design of elastic aircraft. Several numerical examples are given and some dynamical problems of elastic aircraft are also discussed... Qiangang, Wu Changlin, Jian Zheng, Northwestern Polytechnical University. Abstract: A numerical method is presented for predicting the aerodynamic characteristics... Numerical calculation methods are an important means of current research on the aerodynamic characteristics of elastic aircraft. Because this
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maes, G.J.
1993-10-01
This document contains the proceedings of the 62nd Interagency Manufacturing Operations Group (IMOG) Numerical Systems Group. Included are the minutes of the 61st meeting and the agenda for the 62nd meeting. Presentations at the meeting are provided in the appendices to this document. Presentations were: 1992 NSG Annual Report to IMOG Steering Committee; Charter for the IMOG Numerical Systems Group; Y-12 Coordinate Measuring Machine Training Project; IBH NC Controller; Automatically Programmed Metrology Update; Certification of Anvil-5000 for Production Use at the Y-12 Plant; Accord Project; Sandia National Laboratories "Accord"; Demo/Anvil Tool Path Generation 5-Axis; Demo/Video Machine/Robot Animation Dynamics; Demo/Certification of Anvil Tool Path Generation; Tour of the M-60 Inspection Machine; Distributed Numerical Control Certification; Spline Usage Method; Y-12 NC Engineering Status; and Y-12 Manufacturing CAD Systems.
Numerical Flight Mechanics Analysis Of The SHEFEX I Ascent And Re-Entry Phases
NASA Astrophysics Data System (ADS)
Bartolome Calvo, Javier; Eggers, Thino
2011-08-01
The SHarp Edge Flight EXperiment (SHEFEX) I provides a large amount of scientific data for validating numerical tools in hypersonic flows. These data allow the direct comparison of flight measurements with the current numerical tools available at DLR. Therefore, this paper applies a recently developed direct coupling between aerodynamics and flight dynamics to the SHEFEX I flight. In a first step, mission analyses are carried out using the trajectory optimization program REENT 6D coupled to Missile DATCOM. In a second step, the direct coupling between the trajectory program and the DLR TAU code, in which the unsteady Euler equations including rigid body motion are solved, is applied to analyze some interesting parts of the ascent and re-entry phases of the flight experiment. The agreement of the numerical predictions with the obtained flight data is satisfactory assuming a variable fin deflection angle.
A study on directional resistivity logging-while-drilling based on self-adaptive hp-FEM
NASA Astrophysics Data System (ADS)
Liu, Dejun; Li, Hui; Zhang, Yingying; Zhu, Gengxue; Ai, Qinghui
2014-12-01
Numerical simulation of resistivity logging-while-drilling (LWD) tool response provides guidance for designing novel logging instruments and interpreting real-time logging data. In this paper, based on a self-adaptive hp-finite element method (hp-FEM) algorithm, we analyze the LWD tool response against model parameters and briefly illustrate the geosteering capabilities of directional resistivity LWD. Numerical simulation results indicate that the source spacing has an obvious influence on the investigation depth and detection precision of the resistivity LWD tool, and that changing the frequency can improve the resolution of low-resistivity and high-resistivity formations. The simulation results also indicate that the self-adaptive hp-FEM algorithm has good convergence speed and calculation accuracy to guide geosteering drilling, and that it is suitable for simulating the response of resistivity LWD tools.
NASA Astrophysics Data System (ADS)
Huang, Chien-Jung; White, Susan; Huang, Shao-Ching; Mallya, Sanjay; Eldredge, Jeff
2016-11-01
Obstructive sleep apnea (OSA) is a medical condition characterized by repetitive partial or complete occlusion of the airway during sleep. The soft tissues in the upper airway of OSA patients are prone to collapse under the low pressure loads incurred during breathing. The ultimate goal of this research is the development of a versatile numerical tool for simulation of air-tissue interactions in patient-specific upper airway geometries. This tool is expected to capture several phenomena, including flow-induced vibration (snoring) and the large deformations of the complex airway geometry during airway collapse under respiratory flow conditions. Here, we present our ongoing progress toward this goal. To avoid mesh regeneration, the flow model uses a sharp-interface embedded boundary method on Cartesian grids to resolve the fluid-structure interface, while the structural model uses a cut-cell finite element method. To properly resolve large displacements, a non-linear elasticity model is used. The fluid and structure solvers are connected with a strongly coupled iterative algorithm, and parallel computation is achieved with the numerical library PETSc. Some two- and three-dimensional preliminary results are shown to demonstrate the capability of this tool.
OPTIMIZING BMP PLACEMENT AT WATERSHED-SCALE USING SUSTAIN
Watershed and stormwater managers need modeling tools to evaluate alternative plans for environmental quality restoration and protection needs in urban and developing areas. A watershed-scale decision-support system, based on cost optimization, provides an essential tool to suppo...
U.S. EPA's Watershed Management Research Activities
Watershed and stormwater managers need modeling tools to evaluate alternative plans for environmental quality restoration and protection needs in urban and developing areas. A watershed-scale decision-support system, based on cost optimization, provides an essential tool to suppo...
Primer on Condition Curves for Water Mains
The development of economical tools to prioritize pipe renewal based upon structural condition and remaining asset life is essential to effectively manage water infrastructure assets for both large and small diameter pipes. One tool that may facilitate asset management...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chauvin, J.P.; Blaise, P.; Lyoussi, A.
2015-07-01
The French Alternative Energies and Atomic Energy Commission (CEA) is strongly involved in research and development programs concerning the use of nuclear energy as a clean and reliable source of energy, and consequently is working on present and future generations of reactors on various topics such as ageing plant management, optimization of the plutonium stockpile, waste management and innovative systems exploration. Core physics studies are an essential part of this comprehensive R and D effort. In particular, the Zero Power Reactors (ZPR) of CEA, EOLE, MINERVE and MASURCA, play an important role in the validation of neutron (as well as photon) physics calculation tools (codes and nuclear data). The experimental programs defined in the CEA's ZPR facilities aim at improving the calculation routes by reducing the uncertainties of the experimental databases. They also provide accurate data on innovative systems in terms of new materials (moderating and decoupling materials) and new concepts (ADS, ABWR, new MTR (e.g. JHR), GEN IV) involving new fuels, absorbers and coolant materials. Conducting such experimental R and D programs rests on determining and measuring the main parameters of the phenomena of interest in order to qualify calculation tools and nuclear data libraries. Determining these parameters relies on the use of numerous and different experimental techniques with specific and appropriate instrumentation and detection tools. The main ZPR experimental programs at CEA, their objectives and challenges are presented and discussed. Future developments and perspectives regarding ZPR reactors and associated programs are also presented. (authors)
Modified dwell time optimization model and its applications in subaperture polishing.
Dong, Zhichao; Cheng, Haobo; Tam, Hon-Yuen
2014-05-20
The optimization of dwell time is an important procedure in deterministic subaperture polishing. We present a modified optimization model of dwell time using an iterative numerical method, assisted by extended surface forms and tool paths for suppressing the edge effect. Compared with discrete convolution and linear equation models, the proposed model has essential compatibility with arbitrary tool paths, multiple tool influence functions (TIFs) in one optimization, and asymmetric TIFs. Simulated fabrication of a Φ200 mm workpiece with the proposed model yields a smooth, continuous, and non-negative dwell time map with a root-mean-square (RMS) convergence rate of 99.6%, and the optimization takes much less time. Using the proposed model, the influences of TIF size and path interval on convergence rate and polishing time are optimized for typical low and middle spatial-frequency errors. Results show that (1) the TIF size is nonlinearly inversely proportional to convergence rate and polishing time, and a TIF size of ~1/7 of the workpiece size is preferred; and (2) the polishing time is less sensitive to path interval, but increasing the interval markedly reduces the convergence rate, so a path interval of ~1/8-1/10 of the TIF size is deemed appropriate. The proposed model is deployed on JR-1800 and MRF-180 machines. Figuring of a Φ920 mm Zerodur paraboloid and a Φ100 mm Zerodur plane on these machines yields RMS errors of 0.016λ and 0.013λ (λ=632.8 nm), respectively, thereby validating the feasibility of the proposed dwell time model for subaperture polishing.
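The abstract above frames dwell-time optimization as solving removal = TIF ⊛ dwell under a non-negativity constraint. As a generic 1D sketch of that idea, not the authors' model, a multiplicative iterative solver (Gold's ratio method) keeps the dwell map non-negative by construction; the Gaussian TIF and cosine removal target below are illustrative:

```python
import numpy as np

def dwell_time_iterative(target_removal, tif, iterations=200):
    """Multiplicative iterative solve of removal = tif (*) dwell.
    Each step rescales the dwell map by the ratio of desired to
    predicted removal, so non-negativity is preserved automatically."""
    t = np.full_like(target_removal, target_removal.mean() / tif.sum())
    for _ in range(iterations):
        predicted = np.convolve(t, tif, mode="same")
        ratio = target_removal / np.maximum(predicted, 1e-12)
        t = t * ratio
    return t

x = np.linspace(-1.0, 1.0, 201)
tif = np.exp(-(x * 40) ** 2)            # narrow Gaussian tool influence function
target = 1.0 + 0.3 * np.cos(np.pi * x)  # smooth, strictly positive removal map
t = dwell_time_iterative(target, tif)
residual = target - np.convolve(t, tif, mode="same")
```

The ratio update also compensates the truncated TIF footprint near the workpiece edge by raising dwell times there, a crude analogue of the edge-effect handling the paper discusses.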
Utilization of FEM model for steel microstructure determination
NASA Astrophysics Data System (ADS)
Kešner, A.; Chotěborský, R.; Linda, M.; Hromasová, M.
2018-02-01
Agricultural tools used in soil processing are worn by an abrasive wear mechanism caused by hard mineral particles in the soil. The wear rate is influenced by the mechanical characteristics of the tool material as well as by the mineral particle content of the soil. The mechanical properties of steel can be controlled by heat-treatment technology, which leads to different microstructures. Determining this experimentally is very expensive; thanks to numerical methods such as FEM, the microstructure can be predicted at low cost, but every numerical model must be verified. The aim of this work is to show a procedure for predicting the microstructure of steel for agricultural tools. Material characterizations of 51CrV4 grade steel, such as the TTT diagram, heat capacity, heat conduction and other physical properties, were used for the numerical simulation. The relationship between the microstructure predicted by FEM and the real microstructure after heat treatment shows a good correlation.
NASA Astrophysics Data System (ADS)
Islam, Amina; Chevalier, Sylvie; Sassi, Mohamed
2018-04-01
With advances in imaging techniques and computational power, Digital Rock Physics (DRP) is becoming an increasingly popular tool to characterize reservoir samples and determine their internal structure and flow properties. In this work, we present the details of imaging, segmentation, and numerical simulation of single-phase flow through a standard homogeneous Silurian dolomite core plug sample as well as a heterogeneous sample from a carbonate reservoir. We develop a procedure that integrates experimental results into the segmentation step to calibrate the porosity. We also look into using two different numerical tools for the simulation, namely Avizo Fire Xlab Hydro, which solves the Stokes equations via the finite volume method, and Palabos, which solves the same equations using the Lattice Boltzmann Method. Representative Elementary Volume (REV) and isotropy studies are conducted on the two samples, and we show how DRP can be a useful tool to characterize rock properties that are time-consuming and costly to obtain experimentally.
NASA Astrophysics Data System (ADS)
Rajabi, Mohammad Mahdi; Ataie-Ashtiani, Behzad
2016-05-01
Bayesian inference has traditionally been conceived as the proper framework for the formal incorporation of expert knowledge in parameter estimation of groundwater models. However, conventional Bayesian inference is incapable of taking into account the imprecision essentially embedded in expert-provided information. In order to solve this problem, a number of extensions to conventional Bayesian inference have been introduced in recent years. One of these extensions is 'fuzzy Bayesian inference', the result of integrating fuzzy techniques into Bayesian statistics. Fuzzy Bayesian inference has a number of desirable features which make it an attractive approach for incorporating expert knowledge in the parameter estimation process of groundwater models: (1) it is well adapted to the nature of expert-provided information, (2) it allows uncertainty and imprecision to be modeled distinguishably, and (3) it presents a framework for fusing expert-provided information regarding the various inputs of the Bayesian inference algorithm. An important obstacle to employing fuzzy Bayesian inference in groundwater numerical modeling applications, however, is the computational burden, as the required number of numerical model simulations often becomes extremely exhaustive and computationally infeasible. In this paper, a novel approach to accelerating the fuzzy Bayesian inference algorithm is proposed, based on using approximate posterior distributions derived from surrogate modeling as a screening tool in the computations. The proposed approach is first applied to a synthetic test case of seawater intrusion (SWI) in a coastal aquifer. It is shown that for this synthetic test case, the proposed approach decreases the number of required numerical simulations by an order of magnitude. The proposed approach is then applied to a real-world test case involving three-dimensional numerical modeling of SWI in Kish Island, located in the Persian Gulf.
An expert elicitation methodology is developed and applied to the real-world test case in order to provide a road map for the use of fuzzy Bayesian inference in groundwater modeling applications.
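The screening idea in the abstract above, using a cheap approximate posterior to avoid expensive model runs, can be illustrated in a much-simplified form by delayed-acceptance Metropolis sampling. This is not the paper's fuzzy Bayesian algorithm; the "expensive" posterior and the biased surrogate below are toy one-dimensional Gaussians standing in for a groundwater model:

```python
import math
import random

def delayed_acceptance_mh(log_post, log_surrogate, x0, n_steps, step=1.0, seed=0):
    """Two-stage (delayed-acceptance) Metropolis: proposals are first
    screened with a cheap surrogate; the expensive posterior is evaluated
    only for proposals that survive stage one.  Stage two corrects the
    surrogate error, so the chain still targets log_post exactly."""
    rng = random.Random(seed)
    x, lp, ls = x0, log_post(x0), log_surrogate(x0)
    samples, expensive_calls = [], 0
    for _ in range(n_steps):
        y = x + rng.gauss(0.0, step)
        ls_y = log_surrogate(y)
        # Stage 1: accept/reject using the surrogate only.
        if rng.random() < math.exp(min(0.0, ls_y - ls)):
            expensive_calls += 1
            lp_y = log_post(y)
            # Stage 2: correct for the surrogate mismatch.
            if rng.random() < math.exp(min(0.0, (lp_y - lp) - (ls_y - ls))):
                x, lp, ls = y, lp_y, ls_y
        samples.append(x)
    return samples, expensive_calls

# Toy "expensive" posterior N(2, 1) and a slightly biased surrogate N(1.8, 1.2^2).
log_post = lambda x: -0.5 * (x - 2.0) ** 2
log_surr = lambda x: -0.5 * ((x - 1.8) / 1.2) ** 2
samples, calls = delayed_acceptance_mh(log_post, log_surr, 0.0, 20000)
```

The better the surrogate approximates the posterior, the fewer expensive evaluations survive stage one unnecessarily, which is the same economy the order-of-magnitude reduction in the abstract exploits.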
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghosh, Debojyoti; Baeder, James D.
2014-01-21
A new class of compact-reconstruction weighted essentially non-oscillatory (CRWENO) schemes was introduced (Ghosh and Baeder in SIAM J Sci Comput 34(3): A1678–A1706, 2012) with high spectral resolution and essentially non-oscillatory behavior across discontinuities. The CRWENO schemes use solution-dependent weights to combine lower-order compact interpolation schemes and yield a high-order compact scheme for smooth solutions and a non-oscillatory compact scheme near discontinuities. The new schemes result in lower absolute errors, and improved resolution of discontinuities and smaller length scales, compared to the weighted essentially non-oscillatory (WENO) scheme of the same order of convergence. Several improvements to the smoothness-dependent weights, proposed in the literature in the context of the WENO schemes, address the drawbacks of the original formulation. This paper explores these improvements in the context of the CRWENO schemes and compares the different formulations of the non-linear weights for flow problems with small length scales as well as discontinuities. Simplified one- and two-dimensional inviscid flow problems are solved to demonstrate the numerical properties of the CRWENO schemes and their different formulations. Canonical turbulent flow problems (the decay of isotropic turbulence and the shock-turbulence interaction) are solved to assess the performance of the schemes for the direct numerical simulation of compressible, turbulent flows.
Performance evaluation of Bragg coherent diffraction imaging
NASA Astrophysics Data System (ADS)
Öztürk, H.; Huang, X.; Yan, H.; Robinson, I. K.; Noyan, I. C.; Chu, Y. S.
2017-10-01
In this study, we present a numerical framework for modeling three-dimensional (3D) diffraction data in Bragg coherent diffraction imaging (Bragg CDI) experiments and evaluating the quality of obtained 3D complex-valued real-space images recovered by reconstruction algorithms under controlled conditions. The approach is used to systematically explore the performance and the detection limit of this phase-retrieval-based microscopy tool. The numerical investigation suggests that the superb performance of Bragg CDI is achieved with an oversampling ratio above 30 and a detection dynamic range above 6 orders. The observed performance degradation subject to the data binning processes is also studied. This numerical tool can be used to optimize experimental parameters and has the potential to significantly improve the throughput of Bragg CDI method.
First approximations in avalanche model validations using seismic information
NASA Astrophysics Data System (ADS)
Roig Lafon, Pere; Suriñach, Emma; Bartelt, Perry; Pérez-Guillén, Cristina; Tapia, Mar; Sovilla, Betty
2017-04-01
Avalanche dynamics modelling is an essential tool for snow hazard management. Scenario-based numerical modelling provides quantitative arguments for decision-making. The software tool RAMMS (WSL Institute for Snow and Avalanche Research SLF) is one such tool, often used by government authorities and geotechnical offices. As avalanche models improve, the quality of the numerical results will depend increasingly on user experience in the specification of input (e.g. release and entrainment volumes, secondary releases, snow temperature and quality). New model developments must continue to be validated using data from real phenomena, to improve performance and reliability. The avalanche group of the University of Barcelona (RISKNAT - UB) has studied the seismic signals generated by avalanches since 1994. Presently, the group manages the seismic installation at SLF's Vallée de la Sionne experimental site (VDLS). At VDLS the recorded seismic signals can be correlated with other avalanche measurement techniques, including both advanced remote sensing methods (radars, videogrammetry) and obstacle-based sensors (pressure, capacitance, optical sender-reflector barriers). This comparison between different measurement techniques allows the group to address the question of whether seismic analysis can be used alone, on additional avalanche tracks, to gain insight into and validate numerical avalanche dynamics models in different terrain conditions. In this study, we aim to add the seismic data as an external record of the phenomena, able to validate RAMMS models. The seismic sensors are considerably easier and cheaper to install than other physical measuring tools, and are able to record data in all atmospheric conditions (e.g. bad weather, low light, or freezing conditions that make photography and other kinds of sensors unusable). With seismic signals, we record the temporal evolution of the inner and denser parts of the avalanche.
We are able to recognize the approximate position of the flow on the slope, and to make observations of the internal flow dynamics, especially flow regime transitions, which depend on the slope-perpendicular energy fluxes induced by collisions at the basal boundary. The data recorded over several experimental seasons provide a catalogue of seismic data from different types and sizes of avalanches triggered at the VDLS experimental site. These avalanches are also recorded by the SLF instrumentation (FMCW radars, photography, photogrammetry, video, videogrammetry, pressure sensors). We select the best-quality avalanche data to model and establish comparisons. All this information allows us to calibrate the parameters governing the internal energy fluxes, especially those governing the interaction of the avalanche with the incumbent snow cover. For the comparison between the seismic signal and the RAMMS models, we focus on the temporal evolution of the flow, trying to match the arrival times of the front at the seismic sensor location in the avalanche path. We make direct quantitative comparisons between measurements and model outputs, using modelled flow height, normal stress, velocity and pressure values compared with the seismic signal, its envelope and its running spectrogram. In all cases, the first comparisons between the seismic signal and RAMMS outputs are very promising.
Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation.
Bergeron, Dominic; Tremblay, A-M S
2016-08-01
Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ^{2} with respect to α, and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as an open source, user-friendly software.
Numerical Simulation of Multicomponent Chromatography Using Spreadsheets.
ERIC Educational Resources Information Center
Frey, Douglas D.
1990-01-01
Illustrated is the use of spreadsheet programs for implementing finite difference numerical simulations of chromatography as an instructional tool in a separations course. Discussed are differential equations, discretization and integration, spreadsheet development, computer requirements, and typical simulation results. (CW)
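A spreadsheet-style finite-difference chromatography simulation of the kind the article describes can be sketched as follows. The column parameters and the explicit upwind scheme with a linear retardation factor are illustrative assumptions, not the article's exact discretization.

```python
import numpy as np

# Minimal sketch (assumed parameters, not the article's exact scheme):
# explicit upwind finite differences for advection of a solute pulse through
# a chromatography column with linear adsorption (retardation factor R).
# Each loop iteration corresponds to one spreadsheet row of the simulation.
nx, nt = 100, 400          # grid cells, time steps
dx, dt = 1.0, 0.4
v, R = 1.0, 2.0            # pore velocity, retardation factor
lam = v * dt / (R * dx)    # Courant number; must be <= 1 for stability
assert lam <= 1.0

c = np.zeros(nx)
c[0] = 1.0                 # short inlet pulse
for _ in range(nt):
    c[1:] = c[1:] - lam * (c[1:] - c[:-1])   # upwind update
    c[0] = 0.0             # injection ends after the first step

peak = int(np.argmax(c))   # retarded peak sits near v*t/R = 80 cells
```

The same update rule maps one-to-one onto a spreadsheet formula copied across a row, which is what makes the approach attractive as an instructional tool.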
Lourenço, Adriano M; Haddi, Khalid; Ribeiro, Bergman M; Corrêia, Roberto F T; Tomé, Hudson V V; Santos-Amaya, Oscar; Pereira, Eliseu J G; Guedes, Raul N C; Santos, Gil R; Oliveira, Eugênio E; Aguiar, Raimundo W S
2018-05-08
Although the cultivation of transgenic plants expressing toxins of Bacillus thuringiensis (Bt) represents a successful pest management strategy, the rapid evolution of resistance to Bt plants in several lepidopteran pests has threatened the sustainability of this practice. By exhibiting a favorable safety profile and allowing integration with pest management initiatives, plant essential oils have become relevant pest control alternatives. Here, we assessed the potential of essential oils extracted from a Neotropical plant, Siparuna guianensis Aublet, for improving the control and resistance management of key lepidopteran pests (i.e., Spodoptera frugiperda and Anticarsia gemmatalis). The essential oil exhibited high toxicity against both lepidopteran pest species (including an S. frugiperda strain resistant to Cry1A.105 and Cry2Ab Bt toxins). This high insecticidal activity was associated with necrotic and apoptotic effects revealed by in vitro assays with lepidopteran (but not human) cell lines. Furthermore, deficits in reproduction (e.g., egg-laying deterrence and decreased egg viability), larval development (e.g., feeding inhibition) and locomotion (e.g., individual and grouped larvae walking activities) were recorded for lepidopterans sublethally exposed to the essential oil. Thus, by similarly and efficiently controlling lepidopteran strains susceptible and resistant to Bt toxins, the S. guianensis essential oil represents a promising management tool against key lepidopteran pests.
A Comparison of Parameter Study Creation and Job Submission Tools
NASA Technical Reports Server (NTRS)
DeVivo, Adrian; Yarrow, Maurice; McCann, Karen M.; Biegel, Bryan (Technical Monitor)
2001-01-01
We consider the differences between the available general-purpose parameter study and job submission tools. These tools necessarily share many features, but frequently differ in the way they are designed and implemented. For this class of features, we will only briefly outline the essential differences. However, we will focus on the unique features that distinguish the ILab parameter study and job submission tool from other packages, and that make the ILab tool easier and more suitable for use in our research and engineering environment.
Dhifi, Wissal; Bellili, Sana; Jazi, Sabrine; Bahloul, Nada; Mnif, Wissem
2016-01-01
This review covers literature data summarizing, on one hand, the chemistry of essential oils and, on the other hand, their most important activities. Essential oils, which are complex mixtures of volatile compounds particularly abundant in aromatic plants, are mainly composed of terpenes biogenerated by the mevalonate pathway. These volatile molecules include monoterpenes (hydrocarbon and oxygenated monoterpenes), and also sesquiterpenes (hydrocarbon and oxygenated sesquiterpenes). Furthermore, they contain phenolic compounds, which are derived via the shikimate pathway. Thanks to their chemical composition, essential oils possess numerous biological activities (antioxidant, anti-inflammatory, antimicrobial, etc.) of great interest in the food and cosmetic industries, as well as in the human health field. PMID:28930135
Benchmarking a Visual-Basic based multi-component one-dimensional reactive transport modeling tool
NASA Astrophysics Data System (ADS)
Torlapati, Jagadish; Prabhakar Clement, T.
2013-01-01
We present the details of a comprehensive numerical modeling tool, RT1D, which can be used for simulating biochemical and geochemical reactive transport problems. The code can be run within the standard Microsoft EXCEL Visual Basic platform, and it does not require any additional software tools. The code can be easily adapted by others for simulating different types of laboratory-scale reactive transport experiments. We illustrate the capabilities of the tool by solving five benchmark problems with varying levels of reaction complexity. These literature-derived benchmarks are used to highlight the versatility of the code for solving a variety of practical reactive transport problems. The benchmarks are described in detail to provide a comprehensive database, which can be used by model developers to test other numerical codes. The VBA code presented in the study is a practical tool that can be used by laboratory researchers for analyzing both batch and column datasets within an EXCEL platform.
Energy Systems Integration News | Energy Systems Integration Facility
Technology Integration in Science Classrooms: Framework, Principles, and Examples
ERIC Educational Resources Information Center
Kim, Minchi C.; Freemyer, Sarah
2011-01-01
A great number of technologies and tools have been developed to support science learning and teaching. However, science teachers and researchers point out numerous challenges to implementing such tools in science classrooms. For instance, guidelines, lesson plans, Web links, and tools teachers can easily find through Web-based search engines often…
Prostate cancer: predicting high-risk prostate cancer-a novel stratification tool.
Buck, Jessica; Chughtai, Bilal
2014-05-01
Currently, numerous systems exist for the identification of high-risk prostate cancer, but few of these systems can guide treatment strategies. A new stratification tool that uses common diagnostic factors can help to predict outcomes after radical prostatectomy. The tool aids physicians in the identification of appropriate candidates for aggressive, local treatment.
Data-Driven Modeling and Rendering of Force Responses from Elastic Tool Deformation
Rakhmatov, Ruslan; Ogay, Tatyana; Jeon, Seokhee
2018-01-01
This article presents a new data-driven model design for rendering force responses from elastic tool deformation. The new design incorporates a six-dimensional input describing the initial position of the contact, as well as the state of the tool deformation. The input-output relationship of the model was represented by a radial basis function network, which was optimized based on training data collected from real tool-surface contact. Since the input space of the model is represented in the local coordinate system of a tool, the model is independent of recording and rendering devices and can be easily deployed to an existing simulator. The model also supports complex interactions, such as self and multi-contact collisions. In order to assess the proposed data-driven model, we built a custom data acquisition setup and developed a proof-of-concept rendering simulator. The simulator was evaluated through numerical and psychophysical experiments with four different real tools. The numerical evaluation demonstrated the perceptual soundness of the proposed model, while the user study revealed the force feedback of the proposed simulator to be realistic. PMID:29342964
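The core interpolation step of a radial basis function network of the kind described can be sketched as follows. The 6-D inputs and the stand-in force response here are synthetic placeholders, not the article's recorded tool-surface data, and a Gaussian basis is assumed.

```python
import numpy as np

# Hypothetical sketch of the core idea: a Gaussian radial basis function
# (RBF) network interpolating a force response from a 6-D contact state.
# The inputs and the stand-in response are synthetic, not recorded data.
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, (200, 6))       # contact-state training samples
f = np.sin(X @ np.ones(6))                 # stand-in scalar force response

def phi(r, eps=1.0):
    return np.exp(-(eps * r) ** 2)         # Gaussian radial basis

D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
w = np.linalg.solve(phi(D) + 1e-8 * np.eye(len(X)), f)  # network weights

def predict(x):
    r = np.linalg.norm(X - x, axis=1)
    return float(phi(r) @ w)

err = abs(predict(X[0]) - f[0])            # near-interpolation at a node
```

Because the fitted weights depend only on the training inputs and responses, such a model can be trained offline and queried at haptic rendering rates.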
Numerical Simulations of Acoustically Driven, Burning Droplets
NASA Technical Reports Server (NTRS)
Kim, H.-C.; Karagozian, A. R.; Smith, O. I.; Urban, Dave (Technical Monitor)
1999-01-01
This computational study focuses on understanding and quantifying the effects of external acoustical perturbations on droplet combustion. A one-dimensional, axisymmetric representation of the essential diffusion and reaction processes occurring in the vicinity of the droplet stagnation point is used here in order to isolate the effects of the imposed acoustic disturbance. The simulation is performed using a third order accurate, essentially non-oscillatory (ENO) numerical scheme with a full methanol-air reaction mechanism. Consistent with recent microgravity and normal gravity combustion experiments, focus is placed on conditions where the droplet is situated at a velocity antinode in order for the droplet to experience the greatest effects of fluid mechanical straining of flame structures. The effects of imposed sound pressure level and frequency are explored here, and conditions leading to maximum burning rates are identified.
Essential Features of Responsible Governance of Agricultural Biotechnology
Hartley, Sarah; Wickson, Fern
2016-01-01
Agricultural biotechnology continues to generate considerable controversy. We argue that to address this controversy, serious changes to governance are needed. The new wave of genomic tools and products (e.g., CRISPR, gene drives, RNAi, synthetic biology, and genetically modified [GM] insects and fish), provide a particularly useful opportunity to reflect on and revise agricultural biotechnology governance. In response, we present five essential features to advance more socially responsible forms of governance. In presenting these, we hope to stimulate further debate and action towards improved forms of governance, particularly as these new genomic tools and products continue to emerge. PMID:27144921
Essential Features of Responsible Governance of Agricultural Biotechnology.
Hartley, Sarah; Gillund, Frøydis; van Hove, Lilian; Wickson, Fern
2016-05-01
Agricultural biotechnology continues to generate considerable controversy. We argue that to address this controversy, serious changes to governance are needed. The new wave of genomic tools and products (e.g., CRISPR, gene drives, RNAi, synthetic biology, and genetically modified [GM] insects and fish), provide a particularly useful opportunity to reflect on and revise agricultural biotechnology governance. In response, we present five essential features to advance more socially responsible forms of governance. In presenting these, we hope to stimulate further debate and action towards improved forms of governance, particularly as these new genomic tools and products continue to emerge.
Advanced Numerical and Theoretical Methods for Photonic Crystals and Metamaterials
NASA Astrophysics Data System (ADS)
Felbacq, Didier
2016-11-01
This book provides a set of theoretical and numerical tools useful for the study of wave propagation in metamaterials and photonic crystals. While concentrating on electromagnetic waves, most of the material can be used for acoustic (or quantum) waves. For each numerical method presented, code written in MATLAB® is provided. The codes are limited to 2D problems and can be easily translated into Python or Scilab, and used directly with Octave as well.
Modeling Operations Other Than War: Non-Combatants in Combat Modeling
1994-09-01
supposition that non-combatants are an essential feature in OOTW. The model proposal includes a methodology for civilian unit decision making. The model also includes... A numerical example demonstrated that the model appeared to perform in an acceptable manner, in that it produced output within a reasonable range.
A numerical tool for reproducing driver behaviour: experiments and predictive simulations.
Casucci, M; Marchitto, M; Cacciabue, P C
2010-03-01
This paper presents the simulation tool called SDDRIVE (Simple Simulation of Driver performance), which is the numerical computerised implementation of the theoretical architecture describing Driver-Vehicle-Environment (DVE) interactions, contained in Cacciabue and Carsten [Cacciabue, P.C., Carsten, O. A simple model of driver behaviour to sustain design and safety assessment of automated systems in automotive environments, 2010]. Following a brief description of the basic algorithms that simulate the performance of drivers, the paper presents and discusses a set of experiments carried out in a Virtual Reality full-scale simulator for validating the simulation. Then the predictive potential of the tool is shown by discussing two case studies of DVE interactions, performed in the presence of different driver attitudes in similar traffic conditions.
Who Are the Top Contributors in a MOOC? Relating Participants' Performance and Contributions
ERIC Educational Resources Information Center
Alario-Hoyos, C.; Muñoz-Merino, P. J.; Pérez-Sanagustín, M.; Delgado Kloos, C.; Parada Gelvez, H. A.
2016-01-01
The role of social tools in massive open online courses (MOOCs) is essential as they connect participants. Of all the participants in a MOOC, top contributors are the ones who most actively contribute via social tools. This article analyses and reports empirical data from five different social tools pertaining to an actual MOOC to characterize…
Villoutreix, Bruno O; Kuenemann, Melaine A; Poyet, Jean-Luc; Bruzzoni-Giovanelli, Heriberto; Labbé, Céline; Lagorce, David; Sperandio, Olivier; Miteva, Maria A
2014-01-01
Fundamental processes in living cells are largely controlled by macromolecular interactions, and among them, protein–protein interactions (PPIs) have a critical role, while their dysregulation can contribute to the pathogenesis of numerous diseases. Although PPIs were considered attractive pharmaceutical targets already some years ago, they have thus far been largely unexploited for therapeutic interventions with low molecular weight compounds. Several limiting factors, from technological hurdles to conceptual barriers, are known, which, taken together, explain why research in this area has been relatively slow. However, over the last decade, the scientific community has challenged the dogma and become more enthusiastic about the modulation of PPIs with small drug-like molecules. In fact, several success stories were reported, both at the preclinical and clinical stages. In this review article, written for the 2014 International Summer School in Chemoinformatics (Strasbourg, France), we discuss in silico tools (essentially post 2012) and databases that can assist the design of low molecular weight PPI modulators (these tools can be found at www.vls3d.com). We first introduce the field of protein–protein interaction research, discuss key challenges, and comment on recently reported in silico packages, protocols, and databases dedicated to PPIs. Then, we illustrate how in silico methods can be used and combined with experimental work to identify PPI modulators. PMID:25254076
Investigating water transport through the xylem network in vascular plants.
Kim, Hae Koo; Park, Joonghyuk; Hwang, Ildoo
2014-04-01
Our understanding of physical and physiological mechanisms depends on the development of advanced technologies and tools to prove or re-evaluate established theories, and test new hypotheses. Water flow in land plants is a fascinating phenomenon, a vital component of the water cycle, and essential for life on Earth. The cohesion-tension theory (CTT), formulated more than a century ago and based on the physical properties of water, laid the foundation for our understanding of water transport in vascular plants. Numerous experimental tools have since been developed to evaluate various aspects of the CTT, such as the existence of negative hydrostatic pressure. This review focuses on the evolution of the experimental methods used to study water transport in plants, and summarizes the different ways to investigate the diversity of the xylem network structure and sap flow dynamics in various species. As water transport is documented at different scales, from the level of single conduits to entire plants, it is critical that new results be subjected to systematic cross-validation and that findings based on different organs be integrated at the whole-plant level. We also discuss the functional trade-offs between optimizing hydraulic efficiency and maintaining the safety of the entire transport system. Furthermore, we evaluate future directions in sap flow research and highlight the importance of integrating the combined effects of various levels of hydraulic regulation.
Quality By Design: Concept To Applications.
Swain, Suryakanta; Padhy, Rabinarayan; Jena, Bikash Ranjan; Babu, Sitty Manohar
2018-03-08
Quality by Design (QbD) is a modern, systematic, scientific, and novel approach concerned with pre-defined objectives that not only focus on product and process understanding but also lead to process control. It predominantly signifies design and product improvement and control of the manufacturing process in order to fulfill the predefined quality characteristics of the final product. It is essential to identify the desired product performance profile, such as the Target Product Profile (TPP), Quality Target Product Profile (QTPP), and Critical Quality Attributes (CQAs). This review highlights the concept of the QbD design space for the critical material attributes (CMAs) and critical process parameters that can affect the CQAs, within which the process shall remain robust and consistently manufacture the required product. Risk assessment tools and design of experiments are its prime components. This paper outlines the basic knowledge of QbD; the key elements, steps, and various tools for QbD implementation in the pharmaceutics field are presented briefly. In addition, a number of applications of QbD in various pharmaceutical unit operations are discussed and summarized. This article provides complete data as well as a road map for the universal implementation and application of QbD for pharmaceutical products.
NASA Astrophysics Data System (ADS)
Duives, Dorine C.; Daamen, Winnie; Hoogendoorn, Serge P.
2016-04-01
In recent years numerous pedestrian simulation tools have been developed that can support crowd managers and government officials in their tasks. New technologies to monitor pedestrian flows are in dire need of models that allow for rapid state-estimation. Many contemporary pedestrian simulation tools model the movements of pedestrians at a microscopic level, which does not provide an exact solution. Macroscopic models capture the fundamental characteristics of the traffic state at a more aggregate level, and generally have a closed form solution which is necessary for rapid state estimation for traffic management purposes. This contribution presents a next step in the calibration and validation of the macroscopic continuum model detailed in Hoogendoorn et al. (2014). The influence of global and local route choice on the development of crowd movement phenomena, such as dissipation, lane-formation and stripe-formation, is studied. This study shows that most self-organization phenomena and behavioural trends only develop under very specific conditions, and as such can only be simulated using specific parameter sets. Moreover, all crowd movement phenomena can be reproduced by means of the continuum model using one parameter set. This study concludes that the incorporation of local route choice behaviour and the balancing of the aptitude of pedestrians with respect to their own class and other classes are both essential in the correct prediction of crowd movement dynamics.
Modelling the Krebs cycle and oxidative phosphorylation.
Korla, Kalyani; Mitra, Chanchal K
2014-01-01
The Krebs cycle and oxidative phosphorylation are the two most important sets of reactions in a eukaryotic cell that meet the major part of the total energy demands of a cell. In this paper, we present a computer simulation of the coupled reactions using open source tools for simulation. We also show that it is possible to model the Krebs cycle with a simple black box with a few inputs and outputs. However, the kinetics of the internal processes has been modelled using numerical tools. We also show that the Krebs cycle and oxidative phosphorylation together can be combined in a similar fashion - a black box with a few inputs and outputs. The Octave script is flexible and customisable for any chosen set-up for this model. In several cases, we had no explicit idea of the underlying reaction mechanism and the rate determining steps involved, and we have used the stoichiometric equations that can be easily changed as and when more detailed information is obtained. The script includes the feedback regulation of the various enzymes of the Krebs cycle. For the electron transport chain, the pH gradient across the membrane is an essential regulator of the kinetics and this has been modelled empirically but fully consistent with experimental results. The initial conditions can be very easily changed and the simulation is potentially very useful in a number of cases of clinical importance.
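The "black box with a few inputs and outputs" idea can be sketched minimally. The lumped rate law and constants below are illustrative simplifications, not the authors' Octave script; only the 3-NADH-per-turn stoichiometry reflects the standard Krebs cycle.

```python
# Toy "black box" sketch of one lumped Krebs-cycle reaction (illustrative
# rate law and constants, not the authors' Octave script): acetyl-CoA in,
# reduced carriers (NADH) out, integrated with a plain Euler step.
def step(state, dt, k=1.0, Km=0.5):
    acetyl_coa, nadh = state
    rate = k * acetyl_coa / (Km + acetyl_coa)  # Michaelis-Menten style intake
    return (acetyl_coa - dt * rate,            # substrate consumed
            nadh + dt * 3.0 * rate)            # 3 NADH produced per turn

state = (1.0, 0.0)
for _ in range(1000):
    state = step(state, 0.01)
# stoichiometry is conserved: nadh == 3 * (consumed acetyl-CoA)
```

Swapping in different stoichiometric coefficients or rate laws, as the abstract describes, only requires editing `step`, which is the flexibility the black-box formulation buys.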
The most common technologies and tools for functional genome analysis.
Gasperskaja, Evelina; Kučinskas, Vaidutis
2017-01-01
Since the sequence of the human genome is complete, the main issue is how to understand the information written in the DNA sequence. Despite numerous genome-wide studies that have already been performed, the challenge of determining the function of genes, gene products, and also their interactions is still open. As changes in the human genome are highly likely to cause pathological conditions, functional analysis is vitally important for human health. For many years there have been a variety of technologies and tools used in functional genome analysis. However, only in the past decade has there been rapid, revolutionizing progress and improvement in high-throughput methods, which range from traditional real-time polymerase chain reaction to more complex systems, such as next-generation sequencing or mass spectrometry. Furthermore, not only laboratory investigation but also accurate bioinformatic analysis is required for reliable scientific results. These methods give an opportunity for accurate and comprehensive functional analysis that involves various fields of study: genomics, epigenomics, proteomics, and interactomics. This is essential for filling the gaps in the knowledge about dynamic biological processes at both the cellular and organismal level. However, each method has both advantages and limitations that should be taken into account before choosing the right method for particular research in order to ensure a successful study. For this reason, the present review paper aims to describe the most frequent and widely-used methods for comprehensive functional analysis.
Tool Removes Coil-Spring Thread Inserts
NASA Technical Reports Server (NTRS)
Collins, Gerald J., Jr.; Swenson, Gary J.; Mcclellan, J. Scott
1991-01-01
Tool removes coil-spring thread inserts from threaded holes. Threads into hole, pries insert loose, grips insert, then pulls insert to thread it out of hole. Effects essentially reverse of insertion process to ease removal and avoid further damage to threaded inner surface of hole.
Modeling Biodegradation and Reactive Transport: Analytical and Numerical Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Y; Glascoe, L
The computational modeling of the biodegradation of contaminated groundwater systems accounting for biochemical reactions coupled to contaminant transport is a valuable tool for both the field engineer/planner with limited computational resources and the expert computational researcher less constrained by time and computer power. There exist several analytical and numerical computer models that have been and are being developed to cover the practical needs put forth by users to fulfill this spectrum of computational demands. Generally, analytical models provide rapid and convenient screening tools running on very limited computational power, while numerical models can provide more detailed information with consequent requirements of greater computational time and effort. While these analytical and numerical computer models can provide accurate and adequate information to produce defensible remediation strategies, decisions based on inadequate modeling output or on over-analysis can have costly and risky consequences. In this chapter we consider both analytical and numerical modeling approaches to biodegradation and reactive transport. Both approaches are discussed and analyzed in terms of achieving bioremediation goals, recognizing that there is always a tradeoff between computational cost and the resolution of simulated systems.
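The screening-tool versus detailed-model tradeoff can be made concrete with the simplest biodegradation model. This sketch is illustrative (not from the chapter): first-order decay has a closed-form analytical solution, while a numerical integration of the same equation trades extra computation for controllable accuracy.

```python
import math

# Illustrative sketch (not from the chapter): first-order biodegradation
# dC/dt = -k*C has the analytical solution C(t) = C0*exp(-k*t); an explicit
# Euler integration of the same ODE converges to it as the step count grows,
# at the cost of more computation -- the screening vs. detail tradeoff.
k, C0, T = 0.5, 10.0, 4.0
analytical = C0 * math.exp(-k * T)

def euler(n_steps):
    dt, C = T / n_steps, C0
    for _ in range(n_steps):
        C -= dt * k * C
    return C

coarse = abs(euler(10) - analytical)      # cheap, larger error
fine = abs(euler(10000) - analytical)     # costly, much smaller error
```

For problems with nonlinear, coupled reactions no closed form exists, which is where the numerical route becomes the only option despite its cost.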
NASA Astrophysics Data System (ADS)
Sateesh Kumar, Ch; Patel, Saroj Kumar; Das, Anshuman
2018-03-01
Temperature generation in cutting tools is one of the major causes of tool failure, especially during hard machining, where machining forces are quite high, resulting in elevated temperatures. Thus, the present work investigates temperature generation during hard machining of AISI 52100 steel (62 HRC hardness) with uncoated and PVD AlTiN coated Al2O3/TiCN mixed ceramic cutting tools. The experiments were performed on a heavy-duty lathe with both coated and uncoated cutting tools under a dry cutting environment. The temperature of the cutting zone was measured using an infrared thermometer, and a finite element model was adopted to predict the temperature distribution in the cutting tools during machining for comparative assessment against the measured temperature. The experimental and numerical results revealed a significant reduction of cutting zone temperature during machining with PVD AlTiN coated cutting tools when compared to uncoated cutting tools in each experimental run. The main reason for the decrease in temperature with AlTiN coated tools is the lower coefficient of friction offered by the coating material, which allows the free flow of chips on the rake surface when compared with uncoated cutting tools. Further, the superior wear behaviour of the AlTiN coating resulted in a reduction of cutting temperature.
FOCUS: Essential Elements of Quality for State-Funded Preschool Programs
ERIC Educational Resources Information Center
New Mexico Public Education Department, 2016
2016-01-01
The "FOCUS: Essential Elements of Quality, New Mexico's Tiered Quality Rating and Improvement System (TQRIS)," provides early childhood program personnel with the criteria, tools, and resources they need to improve the quality of their program. These quality improvements focus on children's growth, development, and learning--so that each…
Minimum Essential Requirements and Standards in Medical Education.
ERIC Educational Resources Information Center
Wojtczak, Andrzej; Schwarz, M. Roy
2000-01-01
Reviews the definition of standards in general, and proposes a definition of standards and global minimum essential requirements for use in medical education. Aims to serve as a tool for the improvement of quality and international comparisons of basic medical programs. Explains the IIME (Institute for International Medical Education) project…
Essentials for the Teacher's Toolbox
ERIC Educational Resources Information Center
Uhler, Jennifer
2012-01-01
Every profession has a set of essential tools for carrying out its work. Airplane mechanics cannot repair engines without sophisticated diagnostics, wrenches, and pliers. Surgeons cannot operate without scalpels and clamps. In contrast, teaching has often been perceived as a profession requiring only students, chalk, and a blackboard in order for…
USDA-ARS?s Scientific Manuscript database
A steam distillation extraction kinetics experiment was conducted to estimate essential oil yield, composition, antimalarial, and antioxidant capacity of cumin (Cuminum cyminum L.) seed (fruits). Furthermore, regression models were developed to predict essential oil yield and composition for a given...
Help Seeking: Agentic Learners Initiating Feedback
ERIC Educational Resources Information Center
Fletcher, Anna Katarina
2018-01-01
Effective feedback is an essential tool for making learning explicit and an essential feature of classroom practice that promotes learner autonomy. Yet, it remains a pressing challenge for teachers to scaffold the active involvement of students as critical, reflective and autonomous learners who use feedback constructively. This paper seeks to…
P-TRAP: a Panicle TRAit Phenotyping tool.
A L-Tam, Faroq; Adam, Helene; Anjos, António dos; Lorieux, Mathias; Larmande, Pierre; Ghesquière, Alain; Jouannic, Stefan; Shahbazkia, Hamid Reza
2013-08-29
In crops, inflorescence complexity and the shape and size of the seed are among the most important characters that influence yield. For example, rice panicles vary considerably in the number and order of branches, elongation of the axis, and the shape and size of the seed. Manual low-throughput phenotyping methods are time consuming, and the results are unreliable. However, high-throughput image analysis of the qualitative and quantitative traits of rice panicles is essential for understanding the diversity of the panicle as well as for breeding programs. This paper presents P-TRAP software (Panicle TRAit Phenotyping), a free open source application for high-throughput measurements of panicle architecture and seed-related traits. The software is written in Java and can be used with different platforms (the user-friendly Graphical User Interface (GUI) uses Netbeans Platform 7.3). The application offers three main tools: a tool for the analysis of panicle structure, a spikelet/grain counting tool, and a tool for the analysis of seed shape. The three tools can be used independently or simultaneously for analysis of the same image. Results are then reported in the Extensible Markup Language (XML) and Comma Separated Values (CSV) file formats. Images of rice panicles were used to evaluate the efficiency and robustness of the software. Compared to data obtained by manual processing, P-TRAP produced reliable results in a much shorter time. In addition, manual processing is not repeatable because dry panicles are vulnerable to damage. The software is very useful, practical and collects much more data than human operators. P-TRAP is a new open source software that automatically recognizes the structure of a panicle and the seeds on the panicle in numeric images. The software processes and quantifies several traits related to panicle structure, detects and counts the grains, and measures their shape parameters. 
In short, P-TRAP offers both efficient results and a user-friendly environment for experiments. The experimental results showed very good accuracy compared to field operator, expert verification and well-known academic methods.
P-TRAP: a Panicle Trait Phenotyping tool
2013-01-01
Background In crops, inflorescence complexity and the shape and size of the seed are among the most important characters that influence yield. For example, rice panicles vary considerably in the number and order of branches, elongation of the axis, and the shape and size of the seed. Manual low-throughput phenotyping methods are time consuming, and the results are unreliable. However, high-throughput image analysis of the qualitative and quantitative traits of rice panicles is essential for understanding the diversity of the panicle as well as for breeding programs. Results This paper presents P-TRAP software (Panicle TRAit Phenotyping), a free open source application for high-throughput measurements of panicle architecture and seed-related traits. The software is written in Java and can be used on different platforms (the user-friendly Graphical User Interface (GUI) uses the Netbeans Platform 7.3). The application offers three main tools: a tool for the analysis of panicle structure, a spikelet/grain counting tool, and a tool for the analysis of seed shape. The three tools can be used independently or simultaneously for analysis of the same image. Results are then reported in the Extensible Markup Language (XML) and Comma Separated Values (CSV) file formats. Images of rice panicles were used to evaluate the efficiency and robustness of the software. Compared to data obtained by manual processing, P-TRAP produced reliable results in a much shorter time. In addition, manual processing is not repeatable because dry panicles are vulnerable to damage. The software is practical and collects much more data than human operators can. Conclusions P-TRAP is a new open source software that automatically recognizes the structure of a panicle and the seeds on the panicle in digital images. The software processes and quantifies several traits related to panicle structure, detects and counts the grains, and measures their shape parameters. 
In short, P-TRAP offers both reliable results and a user-friendly environment for experiments. The experimental results showed very good accuracy compared to field operators, expert verification, and well-known academic methods. PMID:23987653
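The grain-counting step lends itself to a compact illustration. The sketch below counts 4-connected foreground regions in a binary mask with a breadth-first flood fill; this is a generic connected-components approach shown for illustration only, not P-TRAP's actual algorithm, and `count_grains` is a hypothetical name.

```python
from collections import deque

def count_grains(mask):
    """Count 4-connected foreground regions in a binary image,
    given as a list of rows of 0/1 values (BFS flood fill)."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    grains = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                grains += 1                      # new region found
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return grains

# Two separate "grains" in a tiny thresholded image
demo = [[1, 1, 0],
        [0, 0, 0],
        [0, 1, 1]]
```

In practice the binary mask would come from thresholding a panicle photograph, and touching grains would additionally need splitting (e.g., by a watershed step).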
Moussaoui, Ahmed; Bouziane, Touria
2016-01-01
The local radial point interpolation method (LRPIM) is a meshless method that allows simple implementation of the essential boundary conditions and is less costly than moving least squares (MLS) methods. This method is proposed to overcome the singularity associated with a polynomial basis by using radial basis functions. In this paper, we present a study of a 2D problem of an elastic homogeneous rectangular plate using the LRPIM. Our numerical investigation concerns the influence of different shape parameters on the domain of convergence and accuracy, using the thin plate spline (TPS) radial basis function. It also presents a comparison between numerical results for different materials and the convergence domain, specifying maximum and minimum values as a function of the number of distributed nodes. The analytical solution of the deflection confirms the numerical results. The essential points of the method are:
•The LRPIM is derived from the local weak form of the equilibrium equations for solving a thin elastic plate.
•The convergence of the LRPIM method depends on a number of parameters derived from the local weak form and the sub-domains.
•The effect of the number of distributed nodes is studied by varying the nature of the material and the radial basis function (TPS).
Efficient simulation of press hardening process through integrated structural and CFD analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palaniswamy, Hariharasudhan; Mondalek, Pamela; Wronski, Maciek
Press hardened steel parts are being increasingly used in automotive structures for their higher strength to meet safety standards while reducing vehicle weight to improve fuel consumption. However, manufacturing of sheet metal parts by the press hardening process to achieve desired properties is extremely challenging, as it involves complex interaction of plastic deformation, metallurgical change, thermal distribution, and fluid flow. Numerical simulation is critical for successful design of the process and for understanding the interaction among the numerous process parameters that control the press hardening process, in order to consistently achieve desired part properties. Until now there has been no integrated commercial software solution that can efficiently model the complete process: forming of the blank, heat transfer between the blank and tool, microstructure evolution in the blank, and heat loss from the tool to the fluid that flows through water channels in the tools. In this study, a numerical solution based on the Altair HyperWorks® product suite, involving RADIOSS®, a non-linear finite element structural analysis solver, and AcuSolve®, an incompressible fluid flow solver based on the Galerkin Least Squares finite element method, has been developed for complete press hardening process design and analysis. RADIOSS is used to handle the plastic deformation, heat transfer between the blank and tool, and microstructure evolution in the blank during cooling, while AcuSolve is used to efficiently model heat loss from the tool to the fluid that flows through the water channels in the tools. The approach is demonstrated through several case studies.
Volcanology: Volcanic bipolar disorder explained
NASA Astrophysics Data System (ADS)
Jellinek, Mark
2014-02-01
Eruptions come in a range of magnitudes. Numerical simulations and laboratory experiments show that rare, giant super-eruptions and smaller, more frequent events reflect a transition in the essential driving forces for volcanism.
NASA Technical Reports Server (NTRS)
Tamma, Kumar K.; Namburu, Raju R.
1989-01-01
Numerical simulations are presented for hyperbolic heat-conduction problems that involve non-Fourier effects, using explicit, Lax-Wendroff/Taylor-Galerkin FEM formulations as the principal computational tool. Also employed are smoothing techniques which stabilize the numerical noise and accurately predict the propagating thermal disturbances. The accurate capture of propagating thermal disturbances at characteristic time-step values is achieved; numerical test cases are presented which validate the proposed hyperbolic heat-conduction problem concepts.
Realistic wave-optics simulation of X-ray phase-contrast imaging at a human scale
Sung, Yongjin; Segars, W. Paul; Pan, Adam; Ando, Masami; Sheppard, Colin J. R.; Gupta, Rajiv
2015-01-01
X-ray phase-contrast imaging (XPCI) can dramatically improve soft tissue contrast in X-ray medical imaging. Despite worldwide efforts to develop novel XPCI systems, a numerical framework to rigorously predict the performance of a clinical XPCI system at a human scale is not yet available. We have developed such a tool by combining a numerical anthropomorphic phantom defined with non-uniform rational B-splines (NURBS) and a wave optics-based simulator that can accurately capture the phase-contrast signal from a human-scaled numerical phantom. Using a synchrotron-based, high-performance XPCI system, we provide qualitative comparison between simulated and experimental images. Our tool can be used to simulate the performance of XPCI on various disease entities and compare proposed XPCI systems in an unbiased manner. PMID:26169570
Realistic wave-optics simulation of X-ray phase-contrast imaging at a human scale
NASA Astrophysics Data System (ADS)
Sung, Yongjin; Segars, W. Paul; Pan, Adam; Ando, Masami; Sheppard, Colin J. R.; Gupta, Rajiv
2015-07-01
X-ray phase-contrast imaging (XPCI) can dramatically improve soft tissue contrast in X-ray medical imaging. Despite worldwide efforts to develop novel XPCI systems, a numerical framework to rigorously predict the performance of a clinical XPCI system at a human scale is not yet available. We have developed such a tool by combining a numerical anthropomorphic phantom defined with non-uniform rational B-splines (NURBS) and a wave optics-based simulator that can accurately capture the phase-contrast signal from a human-scaled numerical phantom. Using a synchrotron-based, high-performance XPCI system, we provide qualitative comparison between simulated and experimental images. Our tool can be used to simulate the performance of XPCI on various disease entities and compare proposed XPCI systems in an unbiased manner.
CIM at GE's factory of the future
NASA Astrophysics Data System (ADS)
Waldman, H.
Functional features of a highly automated aircraft component batch processing factory are described. The system has processing, working, and methodology components. A rotating-parts operation installed 20 years ago features a high density of numerically controlled machines and is connected to a hierarchical data communications network and apparatus for moving the engines' rotating parts and tools. Designs produced at one location in the country are sent by telephone link to other sites for development of manufacturing plans, tooling, numerical control programs, and process instructions for the rotating parts. Direct numerical control is implemented at the work stations, which have instructions stored on tape for back-up in case the host computer goes down. Each machine is automatically monitored at 48 points, and notice of failure can originate from any point in the system.
Restricted numerical range: A versatile tool in the theory of quantum information
NASA Astrophysics Data System (ADS)
Gawron, Piotr; Puchała, Zbigniew; Miszczak, Jarosław Adam; Skowronek, Łukasz; Życzkowski, Karol
2010-10-01
Numerical range of a Hermitian operator X is defined as the set of all possible expectation values of this observable over normalized quantum states. We analyze a modification of this definition in which the expectation value is taken over a certain subset of the set of all quantum states. One considers, for instance, the set of real states, the set of product states, separable states, or the set of maximally entangled states. We show exemplary applications of these algebraic tools in the theory of quantum information: analysis of k-positive maps and entanglement witnesses, as well as study of the minimal output entropy of a quantum channel. Product numerical range of a unitary operator is used to solve the problem of local distinguishability of a family of two unitary gates.
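The unrestricted numerical range of a Hermitian operator can be explored directly: for Hermitian X it is the real interval between the extreme eigenvalues, so every sampled expectation value must land inside it. The sketch below is a minimal illustration in the spirit of the abstract (the function names are ours, not the paper's), checked on a 2x2 example.

```python
import math
import random

def expectation(X, psi):
    """<psi|X|psi> for a 2x2 Hermitian X and a normalized state psi."""
    (a, b), (_, d) = X
    c0, c1 = psi
    return (a.real * abs(c0) ** 2 + d.real * abs(c1) ** 2
            + 2 * (b * c0.conjugate() * c1).real)

def random_state():
    """Random normalized 2-component state (Gaussian components)."""
    v = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(2)]
    norm = math.sqrt(sum(abs(c) ** 2 for c in v))
    return [c / norm for c in v]

def eig_bounds(X):
    """Closed-form extreme eigenvalues of a 2x2 Hermitian matrix."""
    (a, b), (_, d) = X
    mid = (a.real + d.real) / 2
    rad = math.sqrt(((a.real - d.real) / 2) ** 2 + abs(b) ** 2)
    return mid - rad, mid + rad

X = [[1 + 0j, 1 - 1j],
     [1 + 1j, -1 + 0j]]
lo, hi = eig_bounds(X)            # (-sqrt(3), +sqrt(3))
samples = [expectation(X, random_state()) for _ in range(2000)]
```

Restricting the sampled states to a subset (e.g., real states only) shrinks the attainable interval, which is exactly the "restricted numerical range" idea the paper studies.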
Essentially nonoscillatory postprocessing filtering methods
NASA Technical Reports Server (NTRS)
Lafon, F.; Osher, S.
1992-01-01
High order accurate centered flux approximations used in the computation of numerical solutions to nonlinear partial differential equations produce large oscillations in regions of sharp transitions. Here, we present a new class of filtering methods denoted by Essentially Nonoscillatory Least Squares (ENOLS), which constructs an upgraded filtered solution that is close to the physically correct weak solution of the original evolution equation. Our method relies on the evaluation of a least squares polynomial approximation to oscillatory data using a set of points which is determined via the ENO network. Numerical results are given in one and two space dimensions for both scalar and systems of hyperbolic conservation laws. Computational running time, efficiency, and robustness of the method are illustrated in various examples such as Riemann initial data for both Burgers' and Euler's equations of gas dynamics. In all standard cases, the filtered solution appears to converge numerically to the correct solution of the original problem. Some interesting results based on nonstandard central difference schemes, which exactly preserve entropy and have recently been shown generally not to be weakly convergent to a solution of the conservation law, are also obtained using our filters.
Utilizing Technology to Enhance Learning Environments: The Net Gen Student
ERIC Educational Resources Information Center
Muhammad, Amanda J.; Mitova, Mariana A.; Wooldridge, Deborah G.
2016-01-01
It is essential for instructors to understand the importance of classroom technology so they can prepare to use it to personalize students' learning. Strategies for choosing effective electronic tools are presented, followed by specific suggestions for designing enhanced personalized learning using electronic tools.
New Texts, New Tools: An Argument for Media Literacy.
ERIC Educational Resources Information Center
McBrien, J. Lynn
1999-01-01
Adults cannot adequately prevent their children from observing media messages. Students are actually safer if they are educated about analyzing and assessing unsavory messages for themselves. Appropriate media-literacy pedagogy involves five essential elements: background, tools, deconstruction of media techniques, product evaluation, and original…
FPI: FM Success through Analytics
ERIC Educational Resources Information Center
Hickling, Duane
2013-01-01
The APPA Facilities Performance Indicators (FPI) is perhaps one of the most powerful analytical tools that institutional facilities professionals have at their disposal. It is a diagnostic facilities performance management tool that addresses the essential questions that facilities executives must answer to effectively perform their roles. It…
Classification of time series patterns from complex dynamic systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schryver, J.C.; Rao, N.
1998-07-01
An increasing availability of high-performance computing and data storage media at decreasing cost is making possible the proliferation of large-scale numerical databases and data warehouses. Numeric warehousing enterprises on the order of hundreds of gigabytes to terabytes are a reality in many fields such as finance, retail sales, process systems monitoring, biomedical monitoring, surveillance, and transportation. Large-scale databases are becoming more accessible to larger user communities through the internet, web-based applications, and database connectivity. Consequently, most researchers now have access to a variety of massive datasets. This trend will probably only continue to grow over the next several years. Unfortunately, the availability of integrated tools to explore, analyze, and understand the data warehoused in these archives is lagging far behind the ability to gain access to the same data. In particular, locating and identifying patterns of interest in numerical time series data is an increasingly important problem for which there are few available techniques. Temporal pattern recognition poses many interesting problems in classification, segmentation, prediction, diagnosis, and anomaly detection. This research focuses on the problem of classification or characterization of numerical time series data. Highway vehicles and their drivers are examples of complex dynamic systems (CDS) which are being used by transportation agencies for field testing to generate large-scale time series datasets. Tools for effective analysis of numerical time series in databases generated by highway vehicle systems are not yet available, or have not been adapted to the target problem domain. However, analysis tools from similar domains may be adapted to the problem of classification of numerical time series data.
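As a minimal illustration of the kind of time-series characterization described above (not the authors' method), each series can be reduced to a few summary features and classified with a nearest-centroid rule; all names here are ours.

```python
import math

def features(series):
    """Summary features of a numerical time series: mean, standard
    deviation, and mean absolute first difference (roughness)."""
    n = len(series)
    mean = sum(series) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in series) / n)
    rough = sum(abs(b - a) for a, b in zip(series, series[1:])) / (n - 1)
    return (mean, std, rough)

def nearest_centroid(train, feat):
    """Assign feat to the class whose feature centroid is closest."""
    def centroid(vectors):
        return tuple(sum(col) / len(col) for col in zip(*vectors))
    centroids = {label: centroid([features(s) for s in seqs])
                 for label, seqs in train.items()}
    return min(centroids, key=lambda lbl: math.dist(centroids[lbl], feat))

# Toy training data: smooth ramps vs. oscillating signals
train = {
    "smooth": [[0.0, 0.1, 0.2, 0.3, 0.4], [0.1, 0.2, 0.3, 0.4, 0.5]],
    "rough":  [[0.0, 1.0, -1.0, 1.0, -1.0], [0.5, -0.5, 0.5, -0.5, 0.5]],
}
label = nearest_centroid(train, features([0.2, 0.3, 0.4, 0.5, 0.6]))
```

Real vehicle telemetry would need richer features (spectral content, trend, segment statistics), but the pipeline shape, extract features then classify, is the same.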
Numerical study of the small scale structures in Boussinesq convection
NASA Technical Reports Server (NTRS)
Weinan, E.; Shu, Chi-Wang
1992-01-01
Two-dimensional Boussinesq convection is studied numerically using two different methods: a filtered pseudospectral method and a high order accurate Essentially Nonoscillatory (ENO) scheme. The issue of whether a finite-time singularity occurs for initially smooth flows is investigated. The numerical results suggest that the collapse of the bubble cap is unlikely to occur in resolved calculations. The strain rate corresponding to the intensification of the density gradient across the front saturates at the bubble cap. We also found that the cascade of energy to small scales is dominated by the formation of thin and sharp fronts across which the density jumps.
NASA Technical Reports Server (NTRS)
Kung, Ernest C.
1994-01-01
The contract research has been conducted in the following three major areas: analysis of numerical simulations and parallel observations of atmospheric blocking, diagnosis of the lower boundary heating and the response of the atmospheric circulation, and comprehensive assessment of long-range forecasting with numerical and regression methods. The essential scientific and developmental purpose of this contract research is to extend our capability of numerical weather forecasting by the comprehensive general circulation model. The systematic work as listed above is thus geared to developing a technological basis for future NASA long-range forecasting.
ERIC Educational Resources Information Center
Terkla, Dawn Geronimo; Sharkness, Jessica; Cohen, Margaret; Roscoe, Heather S.; Wiseman, Marjorie
2012-01-01
In an age in which information and data are more readily available than ever, it is critical for higher education institutions to develop tools that can communicate essential information to those who make decisions in an easy-to-understand format. One of the tools available for this purpose is a dashboard, a one- to two-page document that presents…
NASA Astrophysics Data System (ADS)
Courageot, Estelle; Sayah, Rima; Huet, Christelle
2010-05-01
Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. When the dose distribution is evaluated with a numerical anthropomorphic model, the posture and morphology of the victim have to be reproduced as realistically as possible. Several years ago, IRSN developed a specific software application, called the simulation of external source accident with medical images (SESAME), for the dosimetric reconstruction of radiological accidents by numerical simulation. This tool combines voxel geometry and the MCNP(X) Monte Carlo computer code for radiation-material interaction. This note presents a new functionality in this software that enables the modelling of a victim's posture and morphology based on non-uniform rational B-spline (NURBS) surfaces. The procedure for constructing the modified voxel phantoms is described, along with a numerical validation of this new functionality using a voxel phantom of the RANDO tissue-equivalent physical model.
Courageot, Estelle; Sayah, Rima; Huet, Christelle
2010-05-07
Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. When the dose distribution is evaluated with a numerical anthropomorphic model, the posture and morphology of the victim have to be reproduced as realistically as possible. Several years ago, IRSN developed a specific software application, called the simulation of external source accident with medical images (SESAME), for the dosimetric reconstruction of radiological accidents by numerical simulation. This tool combines voxel geometry and the MCNP(X) Monte Carlo computer code for radiation-material interaction. This note presents a new functionality in this software that enables the modelling of a victim's posture and morphology based on non-uniform rational B-spline (NURBS) surfaces. The procedure for constructing the modified voxel phantoms is described, along with a numerical validation of this new functionality using a voxel phantom of the RANDO tissue-equivalent physical model.
NASA Technical Reports Server (NTRS)
Shih, Hsin-Yi; Tien, James S.; Ferkul, Paul (Technical Monitor)
2001-01-01
The recently developed numerical model of concurrent-flow flame spread over thin solids has been used as a simulation tool to support the design of a space experiment. The two-dimensional and three-dimensional, steady forms of the compressible Navier-Stokes equations with chemical reactions are solved. With the coupled multi-dimensional solver of radiative heat transfer, the model is capable of answering a number of questions regarding the experiment concept and the hardware design. In this paper, the capabilities of the numerical model are demonstrated by providing guidance on several experimental design issues. The test matrix and operating conditions of the experiment are estimated through the modeling results. Three-dimensional calculations are made to simulate the flame-spreading experiment with a realistic hardware configuration. The computed detailed flame structures provide insight for data collection. In addition, the heating load and the requirements for product exhaust cleanup in the flow tunnel are estimated with the model. We anticipate that using this simulation tool will enable a more efficient and successful space experiment to be conducted.
Data access and decision tools for coastal water resources management
US EPA has supported the development of numerous models and tools to support implementation of environmental regulations. However, transfer of knowledge and methods from detailed technical models to support practical problem solving by local communities and watershed or coastal ...
Airplane numerical simulation for the rapid prototyping process
NASA Astrophysics Data System (ADS)
Roysdon, Paul F.
Airplane Numerical Simulation for the Rapid Prototyping Process is a comprehensive research investigation into the most up-to-date methods for airplane development and design. Uses of modern engineering software tools, like MatLab and Excel, are presented with examples of batch and optimization algorithms which combine the computing power of MatLab with robust aerodynamic tools like XFOIL and AVL. The resulting data is demonstrated in the development and use of a full non-linear six-degrees-of-freedom simulator. The applications for this numerical tool-box vary from unmanned aerial vehicles to first-order analysis of manned aircraft. A Blended-Wing-Body airplane is used for the analysis to demonstrate the flexibility of the code, from classic wing-and-tail configurations to less common configurations like the blended-wing-body. This configuration has been shown to have superior aerodynamic performance -- in contrast to its classic wing-and-tube fuselage counterparts -- and to have reduced sensitivity to aerodynamic flutter as well as potential for increased engine noise abatement. Of course, without a classic tail elevator to damp the nose-up pitching moment, and the vertical tail rudder to damp the yaw and possible rolling aerodynamics, the challenges in lateral roll and yaw stability, as well as in pitching moment, are not insignificant. This thesis work applies the tools necessary to perform airplane development and optimization on a rapid basis, demonstrating the strength of this tool through examples and comparison of the results to similar airplane performance characteristics published in the literature.
Creating a Minnesota Statewide SNAP-Ed Program Evaluation
ERIC Educational Resources Information Center
Gold, Abby; Barno, Trina Adler; Sherman, Shelley; Lovett, Kathleen; Hurtado, G. Ali
2013-01-01
Systematic evaluation is an essential tool for understanding program effectiveness. This article describes the pilot test of a statewide evaluation tool for the Supplemental Nutrition Assistance Program-Education (SNAP-Ed). A computer algorithm helped Community Nutrition Educators (CNEs) build surveys specific to their varied educational settings…
A Resource Guide Identifying Technology Tools for Schools. Appendix
ERIC Educational Resources Information Center
Fox, Christine; Jones, Rachel
2009-01-01
SETDA and NASTID's "Technology Tools for Schools Resource Guide" provides definitions of key technology components and relevant examples, where appropriate as a glossary for educators. The guide also presents essential implementation and infrastructure considerations that decision makers should think about when implementing technology in schools.…
ERIC Educational Resources Information Center
Medlin, E. Lander; Judd, R. Holly
2013-01-01
APPA's Facilities Management Evaluation Program (FMEP) provides an integrated system to optimize organizational performance. The criteria for evaluation not only provide a tool for organizational continuous improvement, they serve as a compelling leadership development tool essential for today's facilities management professional. The senior…
A Comparison of Systematic Screening Tools for Emotional and Behavioral Disorders
ERIC Educational Resources Information Center
Lane, Kathleen Lynne; Little, M. Annette; Casey, Amy M.; Lambert, Warren; Wehby, Joseph; Weisenbach, Jessica L.; Phillips, Andrea
2009-01-01
Early identification of students who might develop emotional and behavioral disorders (EBD) is essential in preventing negative outcomes. Systematic screening tools are available for identifying elementary-age students with EBD, including the "Systematic Screening for Behavior Disorders" (SSBD) and the "Student Risk Screening…
Establishing Time for Professional Learning
ERIC Educational Resources Information Center
Journal of Staff Development, 2013
2013-01-01
Time for collaborative learning is an essential resource for educators working to implement college- and career-ready standards. The pages in this article include tools from the workbook "Establishing Time for Professional Learning." The tools support a complete process to help educators effectively find and use time. The following…
ERIC Educational Resources Information Center
Schonborn, Konrad J.; Anderson, Trevor R.
2010-01-01
External representations (ERs), such as diagrams, animations, and dynamic models are vital tools for communicating and constructing knowledge in biochemistry. To build a meaningful understanding of structure, function, and process, it is essential that students become visually literate by mastering key cognitive skills that are essential for…
Lebrun, Drake G; Dhar, Debashish; Sarkar, Md Imran H; Imran, T M Tanzil A; Kazi, Sayadat N; McQueen, K A Kelly
2013-01-01
Surgically treatable diseases weigh heavily on the lives of people in resource-poor countries. Though global surgical disparities are increasingly recognized as a public health priority, the extent of these disparities is unknown because of a lack of data. The present study sought to measure surgical and anesthesia infrastructure in Bangladesh as part of an international study assessing surgical and anesthesia capacity in low income nations. A comprehensive survey tool was administered via convenience sampling at one public district hospital and one public tertiary care hospital in each of the seven administrative divisions of Bangladesh. There are an estimated 1,200 obstetricians, 2,615 general and subspecialist surgeons, and 850 anesthesiologists in Bangladesh. These numbers correspond to 0.24 surgical providers per 10,000 people and 0.05 anesthesiologists per 10,000 people. Surveyed hospitals performed a large number of operations annually despite having minimal clinical human resources and inadequate physical infrastructure. Shortages in equipment and/or essential medicines were reported at all hospitals and these shortages were particularly severe at the district hospital level. In order to meet the immense demand for surgical care in Bangladesh, public hospitals must address critical shortages in skilled human resources, inadequate physical infrastructure, and low availability of equipment and essential medications. This study identified numerous areas in which the international community can play a vital role in increasing surgical and anesthesia capacity in Bangladesh and ensuring safe surgery for all in the country.
NASA Astrophysics Data System (ADS)
Zhao, Fei; Zhang, Chi; Yang, Guilin; Chen, Chinyin
2016-12-01
This paper presents an online method for estimating cutting error by analyzing internal sensor readings. The internal sensors of a numerical control (NC) machine tool are selected to avoid installation problems. A mathematical model of cutting error estimation was proposed to compute the relative position of the cutting point and the tool center point (TCP) from internal sensor readings, based on the cutting theory of gears. To verify the effectiveness of the proposed model, it was simulated and tested experimentally in a gear generating grinding process. The cutting error of the gear was estimated, and the factors that induce cutting error were analyzed. The simulation and experiments verify that the proposed approach is an efficient way to estimate the cutting error of the work-piece during the machining process.
Benchmark Problems of the Geothermal Technologies Office Code Comparison Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Mark D.; Podgorney, Robert; Kelkar, Sharad M.
A diverse suite of numerical simulators is currently being applied to predict or understand the performance of enhanced geothermal systems (EGS). To build confidence and identify critical development needs for these analytical tools, the United States Department of Energy, Geothermal Technologies Office has sponsored a Code Comparison Study (GTO-CCS), with participants from universities, industry, and national laboratories. A principal objective for the study was to create a community forum for improvement and verification of numerical simulators for EGS modeling. Teams participating in the study were those representing U.S. national laboratories, universities, and industries, and each team brought unique numerical simulation capabilities to bear on the problems. Two classes of problems were developed during the study: benchmark problems and challenge problems. The benchmark problems were structured to test the ability of the collection of numerical simulators to solve various combinations of coupled thermal, hydrologic, geomechanical, and geochemical processes. This class of problems was strictly defined in terms of properties, driving forces, initial conditions, and boundary conditions. Study participants submitted solutions to problems for which their simulation tools were deemed capable or nearly capable. Some participating codes were originally developed for EGS applications whereas some others were designed for different applications but can simulate processes similar to those in EGS. Solution submissions from both were encouraged. In some cases, participants made small incremental changes to their numerical simulation codes to address specific elements of the problem, and in other cases participants submitted solutions with existing simulation tools, acknowledging the limitations of the code. The challenge problems were based on the enhanced geothermal systems research conducted at Fenton Hill, near Los Alamos, New Mexico, between 1974 and 1995. 
The problems involved two phases of research, stimulation, development, and circulation in two separate reservoirs. The challenge problems had specific questions to be answered via numerical simulation in three topical areas: 1) reservoir creation/stimulation, 2) reactive and passive transport, and 3) thermal recovery. Whereas the benchmark class of problems was designed to test capabilities for modeling coupled processes under strictly specified conditions, the stated objective for the challenge class of problems was to demonstrate what new understanding of the Fenton Hill experiments could be realized via the application of modern numerical simulation tools by recognized expert practitioners.
Increasing use of high-speed digital imagery as a measurement tool on test and evaluation ranges
NASA Astrophysics Data System (ADS)
Haddleton, Graham P.
2001-04-01
In military research and development or testing there are various fast and dangerous events that need to be recorded and analysed. High-speed cameras allow the capture of movement too fast to be recognised by the human eye, and provide data that is essential for the analysis and evaluation of such events. High-speed photography is often the only type of instrumentation that can be used to record the parameters demanded by our customers. I will show examples where this applied cinematography is used not only to provide a visual record of events, but also as an essential measurement tool.
Numerical models of laser fusion of intestinal tissues.
Pearce, John A
2009-01-01
Numerical models of continuous wave Tm:YAG thermal fusion in rat intestinal tissues were compared to experiment. Optical and thermal FDM models that included tissue damage based on Arrhenius kinetics were used to predict birefringence loss in collagen as the standard of comparison. The models also predicted collagen shrinkage, jellification and water loss. The inclusion of variable optical and thermal properties is essential to achieve favorable agreement between predicted and measured damage boundaries.
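The Arrhenius damage kinetics mentioned above integrate a temperature-dependent rate over the thermal history. The sketch below uses the classic Henriques skin-burn parameters purely for illustration; the paper's fitted constants for birefringence loss in intestinal collagen would differ, and all names are ours.

```python
import math

# Illustrative Arrhenius parameters (classic Henriques skin-burn
# values), NOT the paper's fitted constants for intestinal tissue.
A_FREQ = 3.1e98      # frequency factor, 1/s
E_ACT = 6.27e5       # activation energy, J/mol
R_GAS = 8.314        # universal gas constant, J/(mol*K)

def arrhenius_damage(temps_kelvin, dt):
    """Accumulate the damage integral Omega = sum A*exp(-Ea/(R*T))*dt
    over a sampled temperature history; Omega >= 1 is the usual
    threshold for irreversible damage (e.g. birefringence loss)."""
    return sum(A_FREQ * math.exp(-E_ACT / (R_GAS * t)) * dt
               for t in temps_kelvin)

# Ten seconds at 60 C crosses the threshold; a minute at body
# temperature (37 C) accumulates negligible damage.
hot = arrhenius_damage([333.15] * 10, 1.0)
body = arrhenius_damage([310.15] * 60, 1.0)
```

The steep exponential explains why the models need variable optical and thermal properties: small errors in the predicted temperature field produce large errors in the predicted damage boundary.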
2012-02-28
Coupling in Detonation Waves: 1D Dynamics", Paper 89, 23rd International Colloquium on the Dynamics of Explosions and Reactive ... and temperature, and can be modeled as a constant volume reaction, which is more efficient than a constant pressure reaction. After the detonation ... kinetics, and flow processes using high-order numerical methods. A fifth-order WENO (weighted essentially non-oscillatory [12,13]) scheme was used
ERIC Educational Resources Information Center
Carey, Cayelan C.; Gougis, Rebekka Darner
2017-01-01
Ecosystem modeling is a critically important tool for environmental scientists, yet is rarely taught in undergraduate and graduate classrooms. To address this gap, we developed a teaching module that exposes students to a suite of modeling skills and tools (including computer programming, numerical simulation modeling, and distributed computing)…
The Right Tools for the Job--Technology Options for Adult Online Learning and Collaboration
ERIC Educational Resources Information Center
Regional Educational Laboratory, 2014
2014-01-01
Many options exist for using technology as a tool for adult learning, and each day, it becomes easier to share information online than it ever has been. Online learning technology has grown from one-sided communications to numerous options for audience engagement and interactivity. This guide introduces a variety of tools, online platforms, and…
Automated Testcase Generation for Numerical Support Functions in Embedded Systems
NASA Technical Reports Server (NTRS)
Schumann, Johann; Schnieder, Stefan-Alexander
2014-01-01
We present a tool for the automatic generation of test stimuli for small numerical support functions, e.g., code for trigonometric functions, quaternions, filters, or table lookup. Our tool is based on KLEE to produce a set of test stimuli for full path coverage. We use a method of iterative deepening over abstractions to deal with floating-point values. During actual testing the stimuli exercise the code against a reference implementation. We illustrate our approach with results of experiments with low-level trigonometric functions, interpolation routines, and mathematical support functions from an open source UAS autopilot.
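The back-to-back step the abstract describes, exercising generated stimuli against a reference implementation, can be sketched independently of KLEE. The routine under test, the tolerance, and the stimuli below are all invented for illustration.

```python
import math
import random

def fast_sin(x):
    """Hypothetical numerical support routine under test:
    a 7th-order Taylor polynomial approximation of sin(x)."""
    return x - x**3 / 6 + x**5 / 120 - x**7 / 5040

# Stimuli stand in for the tool's generated test inputs.
random.seed(0)
stimuli = [random.uniform(-0.5, 0.5) for _ in range(1000)]

# Exercise the code against the reference implementation (math.sin).
worst = max(abs(fast_sin(x) - math.sin(x)) for x in stimuli)
print(worst)
```

On |x| ≤ 0.5 the Taylor remainder is bounded by 0.5⁹/9! ≈ 5.4e-9, so a tolerance of 1e-8 would pass here; a real harness would also target boundary and special-value inputs, which is where path-coverage generation earns its keep.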
Shrinkage Prediction for the Investment Casting of Stainless Steels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sabau, Adrian S
2007-01-01
In this study, the alloy shrinkage factors were obtained for the investment casting of 17-4PH stainless steel parts. For the investment casting process, unfilled wax and fused silica with a zircon prime coat were used for patterns and shell molds, respectively. Dimensions of the die tooling, wax pattern, and casting were measured using a Coordinate Measurement Machine in order to obtain the actual tooling allowances. The alloy dimensions were obtained from numerical simulation results of solidification, heat transfer, and deformation phenomena. The numerical simulation results for the shrinkage factors were compared with experimental results.
Numerical modelling of orthogonal cutting: application to woodworking with a bench plane.
Nairn, John A
2016-06-06
A numerical model for orthogonal cutting using the material point method was applied to woodcutting using a bench plane. The cutting process was modelled by accounting for surface energy associated with wood fracture toughness for crack growth parallel to the grain. By using damping to deal with dynamic crack propagation and modelling all contact between wood and the plane, simulations could initiate chip formation and proceed into steady-state chip propagation including chip curling. Once steady-state conditions were achieved, the cutting forces became constant and could be determined as a function of various simulation variables. The modelling details included a cutting tool, the tool's rake and grinding angles, a chip breaker, a base plate and a mouth opening between the base plate and the tool. The wood was modelled as an anisotropic elastic-plastic material. The simulations were verified by comparison to an analytical model and then used to conduct virtual experiments on wood planing. The virtual experiments showed interactions between depth of cut, chip breaker location and mouth opening. Additional simulations investigated the role of tool grinding angle, tool sharpness and friction.
Eren, Metin I.; Chao, Anne; Hwang, Wen-Han; Colwell, Robert K.
2012-01-01
Background Estimating assemblage species or class richness from samples remains a challenging, but essential, goal. Though a variety of statistical tools for estimating species or class richness have been developed, they are all singly-bounded: assuming only a lower bound of species or classes. Nevertheless, there are numerous situations, particularly in the cultural realm, where the maximum number of classes is fixed. For this reason, a new method is needed to estimate richness when both upper and lower bounds are known. Methodology/Principal Findings Here, we introduce a new method for estimating class richness: doubly-bounded confidence intervals (both lower and upper bounds are known). We specifically illustrate our new method using the Chao1 estimator, rarefaction, and extrapolation, although any estimator of asymptotic richness can be used in our method. Using a case study of Clovis stone tools from the North American Lower Great Lakes region, we demonstrate that singly-bounded richness estimators can yield confidence intervals with upper bound estimates larger than the possible maximum number of classes, while our new method provides estimates that make empirical sense. Conclusions/Significance Application of the new method for constructing doubly-bounded richness estimates of Clovis stone tools permitted conclusions to be drawn that were not otherwise possible with singly-bounded richness estimates, namely, that Lower Great Lakes Clovis Paleoindians utilized a settlement pattern that was probably more logistical in nature than residential. However, our new method is not limited to archaeological applications. It can be applied to any set of data for which there is a fixed maximum number of classes, whether that be site occupancy models, commercial products (e.g. athletic shoes), or census information (e.g. nationality, religion, age, race). PMID:22666316
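For concreteness, the Chao1 point estimate referenced above is S_obs + f1²/(2·f2), where f1 and f2 are the singleton and doubleton counts. The sketch below shows that estimator with the doubly-bounded idea reduced to its simplest form, truncating the point estimate at a known maximum number of classes; the paper's actual method constructs doubly-bounded confidence intervals, and the sample data are invented.

```python
from collections import Counter

def chao1(abundances, max_classes=None):
    """Chao1 asymptotic richness estimate, optionally capped at a known
    upper bound on the number of classes (a simplification of the
    paper's doubly-bounded interval method)."""
    abundances = list(abundances)
    s_obs = sum(1 for a in abundances if a > 0)
    f1 = sum(1 for a in abundances if a == 1)   # singletons
    f2 = sum(1 for a in abundances if a == 2)   # doubletons
    if f2 > 0:
        est = s_obs + f1 * f1 / (2.0 * f2)
    else:  # bias-corrected form when no doubletons are observed
        est = s_obs + f1 * (f1 - 1) / 2.0
    if max_classes is not None:
        est = min(est, max_classes)
    return est

counts = Counter("aabbbcdde")   # 5 observed classes; 'c' and 'e' are singletons
print(chao1(counts.values()))   # 6.0
```

A singly-bounded estimator can exceed a fixed class maximum (e.g. a known inventory of tool types); the cap is what restores empirical sense in that situation.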
Fate and Transport of Nanoparticles in Porous Media: A Numerical Study
NASA Astrophysics Data System (ADS)
Taghavy, Amir
Understanding the transport characteristics of NPs in natural soil systems is essential to revealing their potential impact on the food chain and groundwater. In addition, many nanotechnology-based remedial measures require effective transport of NPs through soil, which necessitates accurate understanding of their transport and retention behavior. Based upon the conceptual knowledge of environmental behavior of NPs, mathematical models can be developed to represent the coupling of processes that govern the fate of NPs in the subsurface, serving as effective tools for risk assessment and/or design of remedial strategies. This work presents an innovative hybrid Eulerian-Lagrangian modeling technique for simulating the simultaneous reactive transport of nanoparticles (NPs) and dissolved constituents in porous media. Governing mechanisms considered in the conceptual model include particle-soil grain, particle-particle, particle-dissolved constituent, and particle-oil/water interface interactions. The main advantage of this technique, compared to conventional Eulerian models, lies in its ability to address non-uniformity in physicochemical particle characteristics. The developed numerical simulator was applied to investigate the fate and transport of NPs in a number of practical problems relevant to the subsurface environment. These problems included: (1) reductive dechlorination of chlorinated solvents by zero-valent iron nanoparticles (nZVI) in dense non-aqueous phase liquid (DNAPL) source zones; (2) reactive transport of dissolving silver nanoparticles (nAg) and the dissolved silver ions; (3) particle-particle interactions and their effects on the particle-soil grain interactions; and (4) influence of particle-oil/water interface interactions on NP transport in porous media.
NASA Technical Reports Server (NTRS)
Stoms, R. M.
1984-01-01
A numerically controlled 5-axis machine tool uses a transformer and meter to determine and indicate whether the tool is in its home position, but lacks a built-in test mode to check them. The tester makes possible testing and repair of components at the machine, rather than replacing them when operation seems suspect.
NASA Astrophysics Data System (ADS)
D'Ambrosio, Raffaele; Moccaldi, Martina; Paternoster, Beatrice
2018-05-01
In this paper, an adapted numerical scheme for reaction-diffusion problems generating periodic wavefronts is introduced. Adapted numerical methods for such evolutionary problems are specially tuned to follow prescribed qualitative behaviors of the solutions, making the numerical scheme more accurate and efficient as compared with traditional schemes already known in the literature. Adaptation through the so-called exponential fitting technique leads to methods whose coefficients depend on unknown parameters related to the dynamics and aimed to be numerically computed. Here we propose a strategy for a cheap and accurate estimation of such parameters, which consists essentially in minimizing the leading term of the local truncation error whose expression is provided in a rigorous accuracy analysis. In particular, the presented estimation technique has been applied to a numerical scheme based on combining an adapted finite difference discretization in space with an implicit-explicit time discretization. Numerical experiments confirming the effectiveness of the approach are also provided.
A numerical scheme for nonlinear Helmholtz equations with strong nonlinear optical effects.
Xu, Zhengfu; Bao, Gang
2010-11-01
A numerical scheme is presented to solve the nonlinear Helmholtz (NLH) equation modeling second-harmonic generation (SHG) in photonic bandgap material doped with a nonlinear χ(2) effect and the NLH equation modeling wave propagation in Kerr-type gratings with a nonlinear χ(3) effect in the one-dimensional case. Both of these nonlinear phenomena arise as a result of the combination of high electromagnetic mode density and nonlinear reaction from the medium. When the mode intensity of the incident wave is significantly strong, which makes the nonlinear effect non-negligible, numerical methods based on the linearization of the essentially nonlinear problem become inadequate. In this work, a robust, stable numerical scheme is designed to simulate the NLH equations with strong nonlinearity.
An accurate boundary element method for the exterior elastic scattering problem in two dimensions
NASA Astrophysics Data System (ADS)
Bao, Gang; Xu, Liwei; Yin, Tao
2017-11-01
This paper is concerned with a Galerkin boundary element method solving the two dimensional exterior elastic wave scattering problem. The original problem is first reduced to the so-called Burton-Miller [1] boundary integral formulation, and essential mathematical features of its variational form are discussed. In numerical implementations, a newly-derived and analytically accurate regularization formula [2] is employed for the numerical evaluation of hyper-singular boundary integral operator. A new computational approach is employed based on the series expansions of Hankel functions for the computation of weakly-singular boundary integral operators during the reduction of corresponding Galerkin equations into a discrete linear system. The effectiveness of proposed numerical methods is demonstrated using several numerical examples.
Fourier/Chebyshev methods for the incompressible Navier-Stokes equations in finite domains
NASA Technical Reports Server (NTRS)
Corral, Roque; Jimenez, Javier
1992-01-01
A fully spectral numerical scheme is presented for the incompressible Navier-Stokes equations in domains which are infinite or semi-infinite in one dimension. The domain is not mapped, and standard Fourier or Chebyshev expansions can be used. The handling of the infinite domain does not introduce any significant overhead. The scheme assumes that the vorticity in the flow is essentially concentrated in a finite region, which is represented numerically by standard spectral collocation methods. To accommodate the slow exponential decay of the velocities at infinity, extra expansion functions are introduced, which are handled analytically. A detailed error analysis is presented, and two applications to Direct Numerical Simulation of turbulent flows are discussed in relation to the numerical performance of the scheme.
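As an aside on the standard Chebyshev machinery such schemes build on, the first-derivative collocation matrix on Chebyshev points can be built in a few lines (Trefethen's formulation). This is a generic pure-Python sketch, not the authors' code.

```python
import math

def cheb(N):
    """(N+1)x(N+1) Chebyshev spectral differentiation matrix and the
    Chebyshev points x_j = cos(pi*j/N) on [-1, 1]."""
    x = [math.cos(math.pi * j / N) for j in range(N + 1)]
    c = [2.0 if j in (0, N) else 1.0 for j in range(N + 1)]
    D = [[0.0] * (N + 1) for _ in range(N + 1)]
    for i in range(N + 1):
        for j in range(N + 1):
            if i != j:
                D[i][j] = (c[i] / c[j]) * (-1) ** (i + j) / (x[i] - x[j])
    for i in range(N + 1):   # "negative sum trick" for the diagonal entries
        D[i][i] = -sum(D[i][j] for j in range(N + 1) if j != i)
    return D, x

D, x = cheb(8)
f = [xi ** 3 for xi in x]                  # differentiate f(x) = x^3
df = [sum(D[i][j] * f[j] for j in range(9)) for i in range(9)]
err = max(abs(df[i] - 3 * x[i] ** 2) for i in range(9))
print(err)                                 # near machine precision
```

Spectral differentiation is exact (up to rounding) for polynomials of degree ≤ N, which is the property that makes the error analysis in such schemes tractable.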
A Web-Based Course Assessment Tool with Direct Mapping to Student Outcomes
ERIC Educational Resources Information Center
Ibrahim, Walid; Atif, Yacine; Shuaib, Khaled; Sampson, Demetrios
2015-01-01
The assessment of curriculum outcomes is an essential element for continuous academic improvement. However, the collection, aggregation and analysis of assessment data are notoriously complex and time-consuming processes. At the same time, only a few developments of supporting electronic processes and tools for continuous academic program assessment…
Physical Models of Schooling, the 'Ought' Question and Educational Change.
ERIC Educational Resources Information Center
Bauer, Norman J.
This paper examines the methods used in designing school and classroom environments. The tools are labeled: (1) discipline-centered schooling; (2) empirical-naturalistic schooling; and (3) great works schooling. First, the outline endeavors to reveal the essential elements of the three tools that represent images, structures, or "maps" of…
Collection Development in Public Health: A Guide to Selection Tools
ERIC Educational Resources Information Center
Wallis, Lisa C.
2004-01-01
Public health librarians face many challenges in collection development because the field is multidisciplinary, the collection's users have varied needs, and many of the essential resources are grey literature materials. Further, little has been published about public health selection tools. However, librarians responsible for these areas have a…
Using the Internet As an Instructional Tool.
ERIC Educational Resources Information Center
Hudson River Center for Program Development, Glenmont, NY.
This manual is designed to introduce adult educators to the Internet and examine ways that it can enhance instruction. An overview of the Internet covers its evolution. Three sections focus on the three areas of the Internet essential to instructional application: communication, information access, and search tools. The section on…
Development and Classroom Implementation of an Environmental Data Creation and Sharing Tool
ERIC Educational Resources Information Center
Brogan, Daniel S.; McDonald, Walter M.; Lohani, Vinod K.; Dymond, Randel L.; Bradner, Aaron J.
2016-01-01
Education is essential for solving the complex water-related challenges facing society. The Learning Enhanced Watershed Assessment System (LEWAS) and the Online Watershed Learning System (OWLS) provide data creation and data sharing infrastructures, respectively, that combine to form an environmental learning tool. This system collects, integrates…
Physical and numerical studies of a fracture system model
NASA Astrophysics Data System (ADS)
Piggott, Andrew R.; Elsworth, Derek
1989-03-01
Physical and numerical studies of transient flow in a model of discretely fractured rock are presented. The physical model is a thermal analogue to fractured media flow consisting of idealized disc-shaped fractures. The numerical model is used to predict the behavior of the physical model. The use of different insulating materials to encase the physical model allows the effects of differing leakage magnitudes to be examined. A procedure for determining appropriate leakage parameters is documented. These parameters are used in forward analysis to predict the thermal response of the physical model. Knowledge of the leakage parameters and of the temporal variation of boundary conditions is shown to be essential to an accurate prediction. Favorable agreement is illustrated between numerical and physical results. The physical model provides a data source for the benchmarking of alternative numerical algorithms.
Numerical model for thermodynamical behaviors of unsaturated soil
NASA Astrophysics Data System (ADS)
Miyamoto, Yuji; Yamada, Mitsuhide; Sako, Kazunari; Araki, Kohei; Kitamura, Ryosuke
Kitamura et al. have proposed numerical models to establish unsaturated soil mechanics aided by probability theory and statistics, and to apply unsaturated soil mechanics to the geo-simulator, where a numerical model for the thermodynamical behaviors of unsaturated soil is essential. In this paper, thermodynamics is introduced to investigate the heat transfer through unsaturated soil and the evaporation of pore water in soil based on the first and second laws of thermodynamics, i.e., the conservation of energy and increasing entropy. In addition, lysimeter equipment is used to obtain data for the evaporation of pore water during fine days and the seepage of rain water during rainy days. A numerical simulation is carried out using the proposed numerical model, and the results are compared with those obtained from the lysimeter test.
Itoh, Satoru; Hattori, Chiharu; Nagata, Mayumi; Sanbuissho, Atsushi
2012-08-30
The liver micronucleus test is an important method to detect pro-mutagens such as active metabolites that do not reach bone marrow due to their short lifespan. We have already reported that dosing of the test compound after partial hepatectomy (PH) is essential to detect the genotoxicity of numerical chromosome aberration inducers in mice [Mutat. Res. 632 (2007) 89-98]. In naive animals, the proportion of binucleated cells in rats is less than half of that in mice, which suggests a species difference in the response to chromosome aberration inducers. In the present study, we investigated the responses to structural and numerical chromosome aberration inducers in the rat liver micronucleus test. Two structural chromosome aberration inducers (diethylnitrosamine and 1,2-dimethylhydrazine) and two numerical chromosome aberration inducers (colchicine and carbendazim) were used. PH was performed a day before or after dosing of the test compound in 8-week-old male F344 rats, and hepatocytes were isolated 4 days after the PH. Diethylnitrosamine and 1,2-dimethylhydrazine, the structural chromosome aberration inducers, exhibited a significant increase in the incidence of micronucleated hepatocytes (MNH) when given either before or after PH. Colchicine and carbendazim, the numerical chromosome aberration inducers, did not produce any toxicologically significant increase in MNH frequency when given before PH, but did induce MNH when given after PH. It is confirmed that dosing after PH is essential in order to detect the genotoxicity of numerical chromosome aberration inducers in rats as well as in mice. Regarding the species difference, a different temporal response to colchicine was identified: colchicine increased the incidence of MNH 4 days after PH in rats, whereas such induction in mice was observed 8-10 days after PH. Copyright © 2012 Elsevier B.V. All rights reserved.
The Personal Digital Library (PDL)-based e-learning: Using the PDL as an e-learning support tool
NASA Astrophysics Data System (ADS)
Deng, Xiaozhao; Ruan, Jianhai
The paper describes a support tool for learners engaged in e-learning, the Personal Digital Library (PDL). The characteristics and functionality of the PDL are presented. Suggested steps for constructing and managing a PDL are outlined and discussed briefly. The authors believe that the PDL as a support tool of e-learning will be important and essential in the future.
Web-based automation of green building rating index and life cycle cost analysis
NASA Astrophysics Data System (ADS)
Shahzaib Khan, Jam; Zakaria, Rozana; Aminuddin, Eeydzah; IzieAdiana Abidin, Nur; Sahamir, Shaza Rina; Ahmad, Rosli; Nafis Abas, Darul
2018-04-01
A sudden decline in financial markets and economic downturn have slowed adoption of, and lowered investor interest in, green certified buildings due to their higher initial costs. It is therefore essential to attract investors' attention toward further development of green buildings through automated tools for construction projects. However, there is a historical dearth of work on the automation of green building rating tools, an essential gap that motivates the development of an automated analog computerized programming tool. This paper presents proposed research aiming to develop an integrated web-based automated analog computerized program that applies a green building rating assessment tool, green technology, and life cycle cost (LCC) analysis. It also identifies variables of MyCrest and LCC to be integrated and developed in a framework, then transformed into an automated program. A mixed methodology of qualitative and quantitative surveys, together with development work, is planned to carry the MyCrest-LCC integration to an automated level. In this study, a preliminary literature review supports a better understanding of the integration of Green Building Rating Tools (GBRT) with LCC. The outcome of this research paves the way for future researchers to integrate other efficient tools and parameters that contribute toward green buildings and future agendas.
Comparison of ORSAT and SCARAB Reentry Analysis Tools for a Generic Satellite Test Case
NASA Technical Reports Server (NTRS)
Kelley, Robert L.; Hill, Nicole M.; Rochelle, W. C.; Johnson, Nicholas L.; Lips, T.
2010-01-01
Reentry analysis is essential to understanding the consequences of the full life cycle of a spacecraft. Since reentry is a key factor in spacecraft development, NASA and ESA have separately developed tools to assess the survivability of objects during reentry. Criteria such as debris casualty area and impact energy are particularly important to understanding the risks posed to people on Earth. Therefore, NASA and ESA have undertaken a series of comparison studies of their respective reentry codes for verification and improvements in accuracy. The NASA Object Reentry Survival Analysis Tool (ORSAT) and the ESA Spacecraft Atmospheric Reentry and Aerothermal Breakup (SCARAB) reentry analysis tools serve as standard codes for reentry survivability assessment of satellites. These programs predict whether an object will demise during reentry and calculate the debris casualty area of objects determined to survive, establishing the reentry risk posed to the Earth's population by surviving debris. A series of test cases has been studied for comparison, and the most recent uses "Testsat," a conceptual satellite composed of generic parts, defined to use numerous simple shapes and various materials for a better comparison of the predictions of these two codes. This study is an improvement on the others in this series because of increased consistency in modeling techniques and variables. The overall comparison demonstrated that the two codes arrive at similar results: most objects modeled produced close agreement between the two codes, and where the difference was significant, the variance could be explained as a matter of semantics in the model definitions. This paper presents the main results of ORSAT and SCARAB for the Testsat case and discusses the sources of any discovered differences. Results of previous comparisons are also discussed, summarizing differences between the codes and lessons learned from this series of tests.
Simulation tools for guided wave based structural health monitoring
NASA Astrophysics Data System (ADS)
Mesnil, Olivier; Imperiale, Alexandre; Demaldent, Edouard; Baronian, Vahan; Chapuis, Bastien
2018-04-01
Structural Health Monitoring (SHM) is a discipline derived from Non Destructive Evaluation (NDE) based on the integration of sensors onto or into a structure in order to monitor its health without disturbing its regular operating cycle. Guided wave based SHM relies on the propagation of guided waves in plate-like or extruded structures. Using piezoelectric transducers to generate and receive guided waves is one of the most widely accepted paradigms due to the low cost and low weight of those sensors. A wide range of techniques for flaw detection based on the aforementioned setup is available in the literature, but very few of these techniques have found industrial applications yet. A major difficulty comes from the sensitivity of guided waves to a substantial number of parameters such as the temperature or geometrical singularities, making guided wave measurements difficult to analyze. In order to apply guided wave based SHM techniques to a wider spectrum of applications and to transfer those techniques to the industry, the CEA LIST develops novel numerical methods. These methods facilitate the evaluation of the robustness of SHM techniques for multiple applicative cases and ease the analysis of the influence of various parameters, such as sensor positioning or environmental conditions. The first numerical tool is the guided wave module integrated into the commercial software CIVA, relying on a hybrid modal-finite element formulation to compute the guided wave response of perturbations (cavities, flaws…) in extruded structures of arbitrary cross section such as rails or pipes. The second numerical tool is based on the spectral element method [2] and simulates guided waves in both isotropic (metals) and orthotropic (composites) plate-like structures. This tool is designed to match the widely accepted sparse piezoelectric transducer array SHM configuration in which each embedded sensor acts as both emitter and receiver of guided waves.
This tool is under development and will be adapted to simulate complex real-life structures such as curved composite panels with stiffeners. This communication will present these numerical tools and their main functionalities.
Kernel Ada Programming Support Environment (KAPSE) Interface Team: Public Report. Volume II.
1982-10-28
essential parameters from our work so far in this area and, using trade-offs concerning these, construct the KIT's recommended alternative. ...environment that are also in the development states. At this point in development it is essential for the KITEC to provide a forum and act as a focal...standardization in this area. Moreover, this is an area with considerable divergence in proposed approaches. On the other hand, an essential tool from the point of
Overview of Iron Metabolism in Health and Disease
Dev, Som; Babitt, Jodie L.
2017-01-01
Iron is an essential element for numerous fundamental biologic processes, but excess iron is toxic. Abnormalities in systemic iron balance are common in patients with chronic kidney disease (CKD) and iron administration is a mainstay of anemia management in many patients. This review provides an overview of the essential role of iron in biology, the regulation of systemic and cellular iron homeostasis, how imbalances in iron homeostasis contribute to disease, and the implications for CKD patients. PMID:28296010
USDA-ARS?s Scientific Manuscript database
Although slowly abandoned in developed countries, furrow irrigation systems continue to be a dominant irrigation method in developing countries. Numerical models represent powerful tools to assess irrigation and fertigation efficiency. While several models have been proposed in the past, the develop...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Nicholas R.; Carlsen, Brett W.; Dixon, Brent W.
Dynamic fuel cycle simulation tools are intended to model holistic transient nuclear fuel cycle scenarios. As with all simulation tools, fuel cycle simulators require verification through unit tests, benchmark cases, and integral tests. Model validation is a vital aspect as well. Although comparative studies have been performed, there is no comprehensive unit test and benchmark library for fuel cycle simulator tools. The objective of this paper is to identify the "must test" functionalities of a fuel cycle simulator tool within the context of specific problems of interest to the Fuel Cycle Options Campaign within the U.S. Department of Energy's Office of Nuclear Energy. The approach in this paper identifies the features needed to cover the range of promising fuel cycle options identified in the DOE-NE Fuel Cycle Evaluation and Screening (E&S) and categorizes these features to facilitate prioritization. Features were categorized as essential functions, integrating features, and exemplary capabilities. One objective of this paper is to propose a library of unit tests applicable to each of the essential functions. Another underlying motivation for this paper is to encourage an international dialog on the functionalities and standard test methods for fuel cycle simulator tools.
Kazaura, Kamugisha; Omae, Kazunori; Suzuki, Toshiji; Matsumoto, Mitsuji; Mutafungwa, Edward; Korhonen, Timo O; Murakami, Tadaaki; Takahashi, Koichi; Matsumoto, Hideki; Wakamori, Kazuhiko; Arimoto, Yoshinori
2006-06-12
The deterioration and deformation of a free-space optical beam wave-front as it propagates through the atmosphere can reduce the link availability and may introduce burst errors, thus degrading the performance of the system. We investigate the suitability of utilizing soft-computing (SC) based tools for improving the performance of free-space optical (FSO) communications systems. The SC-based tools are used for the prediction of key parameters of a FSO communications system. Measured data collected from an experimental FSO communication system are used as training and testing data for a proposed multi-layer neural network predictor (MNNP) used to predict future parameter values. The predicted parameters are essential for reducing transmission errors by improving the antenna's accuracy in tracking data beams. This is particularly essential for periods of strong atmospheric turbulence. The parameter values predicted using the proposed tool show acceptable conformity with original measurements.
Risk management measures for chemicals: the "COSHH essentials" approach.
Garrod, A N I; Evans, P G; Davy, C W
2007-12-01
"COSHH essentials" was developed in Great Britain to help duty holders comply with the Control of Substances Hazardous to Health (COSHH) Regulations. It uses a similar approach to that described in the new European "REACH" Regulation (Registration, Evaluation, Authorisation and Restriction of Chemicals; EC No. 1907/2006 of the European Parliament), insofar as it identifies measures for managing the risk for specified exposure scenarios. It can therefore assist REACH duty holders with the identification and communication of appropriate risk-management measures. The technical basis for COSHH essentials is explained in the original papers published in the Annals of Occupational Hygiene. Its details will, therefore, not be described here; rather, its ability to provide a suitable means for communicating risk-management measures will be explored. COSHH essentials is a simple tool based on an empirical approach to risk assessment and risk management. The output is a "Control Guidance Sheet" that lists the "dos" and "don'ts" for control in a specific task scenario. The guidance in COSHH essentials recognises that exposure in the workplace will depend not just on mechanical controls, but also on a number of other factors, including administrative and behavioural controls, such as systems of work, supervision and training. In 2002, COSHH essentials was made freely available via the internet (http://www.coshh-essentials.org.uk/). This electronic delivery enabled links to be made between product series that share tasks, such as drum filling, and with ancillary guidance, such as setting up health surveillance for work with a respiratory sensitiser. COSHH essentials has proved to be a popular tool for communicating good control practice. It has attracted over 1 million visits to its site since its launch. It offers a common benchmark of good practice for chemical users, manufacturers, suppliers and importers, as well as regulators and health professionals.
TDR Targets: a chemogenomics resource for neglected diseases.
Magariños, María P; Carmona, Santiago J; Crowther, Gregory J; Ralph, Stuart A; Roos, David S; Shanmugam, Dhanasekaran; Van Voorhis, Wesley C; Agüero, Fernán
2012-01-01
The TDR Targets Database (http://tdrtargets.org) has been designed and developed as an online resource to facilitate the rapid identification and prioritization of molecular targets for drug development, focusing on pathogens responsible for neglected human diseases. The database integrates pathogen-specific genomic information with functional data (e.g. expression, phylogeny, essentiality) for genes collected from various sources, including literature curation. This information can be browsed and queried using an extensive web interface with functionalities for combining, saving, exporting and sharing the query results. Target genes can be ranked and prioritized using numerical weights assigned to the criteria used for querying. In this report we describe recent updates to the TDR Targets database, including the addition of new genomes (specifically helminths), and the integration of chemical structure, property and bioactivity information for biological ligands, drugs and inhibitors, and cheminformatic tools for querying and visualizing these chemical data. These changes greatly facilitate exploration of linkages (both known and predicted) between genes and small molecules, yielding insight into whether particular proteins may be druggable, effectively allowing the navigation of chemical space in a genomics context. PMID:22116064
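The weighted ranking described above can be sketched as a weighted sum over boolean criteria. This is an illustrative reconstruction, not the database's actual schema; the criterion names and weights below are hypothetical.

```python
def rank_targets(targets, weights):
    """Rank candidate target genes by the sum of the weights of the
    criteria they satisfy (weighted prioritization, as described above).
    Criterion names are illustrative, not TDR Targets' actual fields."""
    def score(target):
        return sum(w for criterion, w in weights.items() if target.get(criterion))
    return sorted(targets, key=score, reverse=True)

# Hypothetical usage: prefer genes that are essential and have a known ligand.
weights = {"essential": 5, "has_ligand": 3, "has_structure": 2}
targets = [
    {"name": "geneA", "essential": True},
    {"name": "geneB", "essential": True, "has_ligand": True},
]
ranked = rank_targets(targets, weights)
```

Because the weights are user-assigned, the same query can yield different prioritizations depending on which criteria a researcher emphasizes.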
Accurate thermoelastic tensor and acoustic velocities of NaCl
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marcondes, Michel L., E-mail: michel@if.usp.br; Chemical Engineering and Material Science, University of Minnesota, Minneapolis, 55455; Shukla, Gaurav, E-mail: shukla@physics.umn.edu
Despite the importance of thermoelastic properties of minerals in geology and geophysics, their measurement at high pressures and temperatures is still challenging. Thus, ab initio calculations are an essential tool for predicting these properties at extreme conditions. Owing to the approximate description of the exchange-correlation energy, approximations used in calculations of vibrational effects, and numerical/methodological approximations, these methods produce systematic deviations. Hybrid schemes combining experimental data and theoretical results have emerged as a way to reconcile available information and offer more reliable predictions at experimentally inaccessible thermodynamic conditions. Here we introduce a method to improve the calculated thermoelastic tensor by using a highly accurate thermal equation of state (EoS). The corrective scheme is general, applicable to crystalline solids with any symmetry, and can produce accurate results at conditions where experimental data may not exist. We apply it to rock-salt-type NaCl, a material whose structural properties have been challenging to describe accurately by standard ab initio methods and whose acoustic/seismic properties are important for the gas and oil industry.
Optimization of hydrometric monitoring network in urban drainage systems using information theory.
Yazdi, J
2017-10-01
Regular and continuous monitoring of urban runoff, in both quality and quantity aspects, is of great importance for controlling and managing surface runoff. Due to the considerable costs of establishing new gauges, optimization of the monitoring network is essential. This research proposes an approach for site selection of new discharge stations in urban areas based on entropy theory in conjunction with multi-objective optimization tools and numerical models. The modeling framework provides an optimal trade-off between the maximum possible information content and the minimum shared information among stations. This approach was applied to the main surface-water collection system in Tehran to determine new optimal monitoring points under cost considerations. Experimental results on this drainage network show that the obtained cost-effective designs noticeably outperform the consulting engineers' proposal in terms of both information content and shared information. The research also identified the most frequently selected sites on the Pareto front, which may help decision makers prioritize gauge installation at those locations of the network.
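The entropy measures this approach relies on can be illustrated with a minimal sketch, assuming discharge series discretized into equal-width bins; this is a generic Shannon entropy and mutual information computation, not the authors' implementation.

```python
import math
from collections import Counter

def discretize(series, bins=8):
    """Map a continuous discharge series onto equal-width bin labels."""
    lo, hi = min(series), max(series)
    width = (hi - lo) / bins or 1.0  # avoid zero width for constant series
    return [min(int((x - lo) / width), bins - 1) for x in series]

def entropy(labels):
    """Shannon entropy (bits) of a discrete label sequence: H = -sum p*log2(p)."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def mutual_information(a, b, bins=8):
    """I(A;B) = H(A) + H(B) - H(A,B): information shared by two candidate gauges.
    A good network design maximizes information content while minimizing
    this pairwise shared information."""
    la, lb = discretize(a, bins), discretize(b, bins)
    return entropy(la) + entropy(lb) - entropy(list(zip(la, lb)))
```

In a multi-objective setting, these two quantities become the competing objectives whose trade-off defines the Pareto front mentioned in the abstract.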
Tunable plasmonic toroidal terahertz metamodulator
NASA Astrophysics Data System (ADS)
Gerislioglu, Burak; Ahmadivand, Arash; Pala, Nezih
2018-04-01
Optical modulators are essential, strategic components of micro- and nanophotonic circuits for encoding electro-optical signals in the optical domain. Here, using arrays of multipixel toroidal plasmonic terahertz (THz) metamolecules, we developed a functional plasmonic metamodulator with high efficiency and tunability. Technically, the dynamic toroidal dipole induces nonradiating charge-current configurations that play an exquisite role in defining the inherent spectral features of various materials. Belonging to a family of multipoles distinct from the traditional electromagnetic multipoles, the toroidal dipole corresponds to poloidal currents flowing on the surface of a closed-loop torus. Exploiting the sensitivity of the optically driven toroidal momentum to the incident THz beam power, and employing both numerical tools and experimental analysis, we systematically studied the spectral response of the proposed THz plasmonic metadevice. In this Rapid Communication, we uncover a correlation between the existence and excitation of the toroidal response and the incident beam power. This mechanism is employed to develop THz toroidal metamodulators with strong potential for practical advanced and next-generation communication, filtering, and routing applications.
Comet assay: an essential tool in toxicological research.
Glei, M; Schneider, T; Schlörmann, W
2016-10-01
The comet assay is a versatile, reliable, cost-efficient, and fast technique for detecting DNA damage and repair in any tissue. It is useable in almost any cell type and applicable to both eukaryotic and prokaryotic organisms. Instead of highlighting one of the numerous specific aspects of the comet assay, the present review aims at giving an overview about the evolution of this widely applicable method from the first description by Ostling and Johanson to the OECD Guideline 489 for the in vivo mammalian comet assay. In addition, methodical aspects and the influence of critical steps of the assay as well as the evaluation of results and improvements of the method are reviewed. Methodical aspects regarding oxidative DNA damage and repair are also addressed. An overview about the most recent works and relevant cutting-edge reviews based on the comet assay with special regard to, e.g., clinical applications, nanoparticles or environmental risk assessment concludes this review. Taken together, the presented overview raises expectations to further decades of successful applications and enhancements of this excellent method.
The Many Faces of Apomorphine: Lessons from the Past and Challenges for the Future.
Auffret, Manon; Drapier, Sophie; Vérin, Marc
2018-06-01
Apomorphine is now recognized as the oldest antiparkinsonian drug on the market. Though still underused, it is increasingly prescribed in Europe for patients with advanced Parkinson's disease (PD) with motor fluctuations. However, its history is far from being limited to movement disorders. This paper traces the history of apomorphine, from its earliest empirical use, to its synthesis, pharmacological development, and numerous indications in human and veterinary medicine, in light of its most recent uses and newest challenges. From shamanic rituals in ancient Egypt and Mesoamerica, to the treatment of erectile dysfunction, from being discarded as a pharmacological tool to becoming an essential antiparkinsonian drug, the path of apomorphine in the therapeutic armamentarium has been tortuous and punctuated by setbacks and groundbreaking discoveries. Throughout history, three main clinical indications stood out: emetic (gastric emptying, respiratory disorders, aversive conditioning), sedative (mental disorders, clinical anesthesia, alcoholism), and antiparkinsonian (fluctuations). New indications may arise in the future, both in PD (palliative care, nonmotor symptoms, withdrawal of oral dopaminergic medication), and outside PD, with promising work in neuroprotection or addiction.
Fast identification of the conduction-type of nanomaterials by field emission technique.
Yang, Xun; Gan, Haibo; Tian, Yan; Peng, Luxi; Xu, Ningsheng; Chen, Jun; Chen, Huanjun; Deng, Shaozhi; Liang, Shi-Dong; Liu, Fei
2017-10-12
Nanomaterials invariably contain some dopants or defects, so they often have different conduction types even for the same substrate. Fast identification of the conduction type of nanomaterials is therefore essential for their practical application in functional nanodevices. Here we use the field emission (FE) technique to study nanomaterials and establish a generalized Schottky-Nordheim (SN) model, in which an important parameter λ (the image potential factor) is introduced for the first time to describe the effective image potential. Using λ as the criterion, the energy-band structure can be identified: (a) λ = 1: metal; (b) 0.5 < λ < 1: n-type semiconductor; (c) 0 < λ < 0.5: p-type semiconductor. Moreover, this method can be used to qualitatively evaluate the doping degree of a given semiconductor. We tested a group of nanomaterial emitters both numerically and experimentally, and all results agree very well with our theoretical results, which suggests that our method based on FE measurements is an ideal and powerful tool to rapidly ascertain the conduction type of nanomaterials.
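The classification rule stated in the abstract maps directly to code. This is a minimal sketch using the thresholds as quoted; the handling of boundary values such as λ = 0.5 exactly is an assumption (treated as uncovered here).

```python
def classify_conduction(lam):
    """Classify an emitter's conduction type from the image potential
    factor λ of the generalized SN model, using the thresholds stated
    in the abstract."""
    if lam == 1:
        return "metal"
    if 0.5 < lam < 1:
        return "n-type semiconductor"
    if 0 < lam < 0.5:
        return "p-type semiconductor"
    # Boundary and out-of-range values are not covered by the stated rule.
    raise ValueError(f"λ = {lam} not covered by the classification rule")
```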
Protection heater design validation for the LARP magnets using thermal imaging
Marchevsky, M.; Turqueti, M.; Cheng, D. W.; ...
2016-03-16
Protection heaters are essential elements of a quench protection scheme for high-field accelerator magnets. Various heater designs fabricated by LARP and CERN have already been tested in the LARP high-field quadrupole HQ and are presently being built into the coils of the high-field quadrupole MQXF. In order to compare the heat flow characteristics and thermal diffusion timescales of different heater designs, we powered heaters of two different geometries in ambient conditions and imaged the resulting thermal distributions using a high-sensitivity thermal video camera. We observed a peculiar spatial periodicity in the temperature distribution maps, potentially linked to the structure of the underlying cable. Two-dimensional numerical simulations of heat diffusion and spatial heat distribution have been conducted, and the results of simulation and experiment have been compared. Imaging revealed hot spots due to current concentration around high-curvature points of heater strips of varying cross section, and visualized the thermal effects of various interlayer structural defects. Furthermore, thermal imaging can become a future quality control tool for the MQXF coil heaters.
G-DYN Multibody Dynamics Engine
NASA Technical Reports Server (NTRS)
Acikmese, Behcet; Blackmore, James C.; Broderick, Daniel
2011-01-01
G-DYN is a multi-body dynamic simulation software engine that automatically assembles and integrates equations of motion for arbitrarily connected multibody dynamic systems. The algorithm behind G-DYN is based on a primal-dual formulation of the dynamics that captures the position and velocity vectors (primal variables) of each body and the interaction forces (dual variables) between bodies, which are particularly useful for control and estimation analysis and synthesis. It also takes full advantage of the sparse matrix structure resulting from the system dynamics to numerically integrate the equations of motion efficiently. Furthermore, the dynamic model for each body can easily be replaced without re-deriving the overall equations of motion, and the assembly of the equations of motion is done automatically. G-DYN proved to be an essential software tool in the simulation of spacecraft systems used for small celestial body surface sampling, specifically in simulating touch-and-go (TAG) maneuvers of a robotic sampling system from a comet or asteroid. It has been used extensively in validating mission concepts for small body sample return, such as the Comet Odyssey and Galahad New Frontiers proposals.
Biomathematical Description of Synthetic Peptide Libraries
Trepel, Martin
2015-01-01
Libraries of randomised peptides displayed on phages or viral particles are essential tools in a wide spectrum of applications. However, there is only limited understanding of a library's fundamental dynamics and of the influence of encoding schemes and library sizes on quality. Numeric properties of libraries, such as the expected number of different peptides and the library's coverage, have long been in use as measures of a library's quality. Here, we present a graphical framework of these measures, together with a library's relative efficiency, to help describe libraries in enough detail for researchers to plan new experiments in a more informed manner. In particular, these values allow us to answer, in a probabilistic fashion, the question of whether a specific library does indeed contain one of the "best" possible peptides. The framework is implemented in a web interface based on two packages, discreteRV and peptider, for the statistical software environment R. We further provide a user-friendly web interface called PeLiCa (Peptide Library Calculator, http://www.pelica.org), allowing scientists to plan and analyse their peptide libraries. PMID:26042419
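The two numeric quality measures named above have standard closed forms under the simplifying assumption that clones are drawn uniformly at random from the set of possible peptides; real encoding schemes (e.g. NNK) are biased, so this is only a sketch, not the peptider package's computation.

```python
def expected_distinct(library_size, num_possible):
    """Expected number of different peptides among N uniform random clones
    drawn from v possible peptides: E = v * (1 - (1 - 1/v)**N)."""
    v, n = num_possible, library_size
    return v * (1.0 - (1.0 - 1.0 / v) ** n)

def coverage(library_size, num_possible):
    """Expected fraction of all possible peptides present at least once."""
    return expected_distinct(library_size, num_possible) / num_possible
```

For example, under this uniform-sampling assumption, a library whose clone count equals the number of possible peptides covers about 1 - 1/e, roughly 63%, of them.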
NASA Astrophysics Data System (ADS)
Le, Trung; Borazjani, Iman; Sotiropoulos, Fotis
2009-11-01
In order to test and optimize heart valve prostheses and enable virtual implantation of other biomedical devices, it is essential to develop and validate high-resolution FSI-CFD codes for carrying out simulations in patient-specific geometries. We have developed a powerful numerical methodology for carrying out FSI simulations of cardiovascular flows based on the CURVIB approach (Borazjani, L. Ge, and F. Sotiropoulos, Journal of Computational Physics, vol. 227, pp. 7587-7620, 2008). We have extended our FSI method to overset grids to handle more complicated geometries efficiently, e.g. simulating an MHV implanted in an anatomically realistic aorta and left ventricle. A compliant, anatomic left ventricle is modeled using prescribed motion in one domain. The mechanical heart valve is placed inside the second domain, i.e. the body-fitted curvilinear mesh of the anatomic aorta. The simulations of an MHV with a left-ventricle model underscore the importance of inflow conditions and ventricular compliance for such simulations and demonstrate the potential of our method as a powerful tool for patient-specific simulations.
Aboriginal and invasive rats of genus Rattus as hosts of infectious agents.
Kosoy, Michael; Khlyap, Lyudmila; Cosson, Jean-Francois; Morand, Serge
2015-01-01
From the perspective of ecology of zoonotic pathogens, the role of the Old World rats of the genus Rattus is exceptional. The review analyzes specific characteristics of rats that contribute to their important role in hosting pathogens, such as host-pathogen relations and rates of rat-borne infections, taxonomy, ecology, and essential factors. Specifically the review addresses recent taxonomic revisions within the genus Rattus that resulted from applications of new genetic tools in understanding relationships between the Old World rats and the infectious agents that they carry. Among the numerous species within the genus Rattus, only three species-the Norway rat (R. norvegicus), the black or roof rat (R. rattus), and the Asian black rat (R. tanezumi)-have colonized urban ecosystems globally for a historically long period of time. The fourth invasive species, R. exulans, is limited to tropical Asia-Pacific areas. One of the points highlighted in this review is the necessity to discriminate the roles played by rats as pathogen reservoirs within the land of their original diversification and in regions where only one or few rat species were introduced during the recent human history.
Nim, Hieu T; Furtado, Milena B; Costa, Mauro W; Rosenthal, Nadia A; Kitano, Hiroaki; Boyd, Sarah E
2015-05-01
Existing de novo software platforms have largely overlooked a valuable resource: the expertise of the intended biologist users. Typical data representations, such as long gene lists or highly dense and overlapping transcription factor networks, often hinder biologists from relating these results to their expertise. VISIONET, a streamlined visualisation tool built from experimental needs, enables biologists to transform large and dense overlapping transcription factor networks into sparse human-readable graphs via numerical filtering. The VISIONET interface allows users without a computing background to interactively explore and filter their data, and empowers them to apply their specialist knowledge to far more complex and substantial data sets than is currently possible. Applying VISIONET to the Tbx20-Gata4 transcription factor network led to the discovery and validation of Aldh1a2, an essential developmental gene associated with various important cardiac disorders, as a healthy adult cardiac fibroblast gene co-regulated by the cardiogenic transcription factors Gata4 and Tbx20. We demonstrate with experimental validations the utility of VISIONET for expertise-driven gene discovery that opens new experimental directions that would not otherwise have been identified.
NASA Astrophysics Data System (ADS)
Neuberg, J. W.; Thomas, M.; Pascal, K.; Karl, S.
2012-04-01
Geophysical datasets are essential to guide short-term forecasting of volcanic activity in particular. Key parameters are derived from these datasets and interpreted in different ways; however, the biggest impact on the interpretation is determined not by the range of parameters but by the parameterisation and the underlying conceptual model of the volcanic process. On the other hand, the increasing number of sophisticated geophysical models needs to be constrained by monitoring data, to transform a merely numerical exercise into a useful forecasting tool. We utilise datasets from the "big three", seismology, deformation and gas emissions, to gain insight into the mutual relationship between conceptual models and constraining data. We show that, e.g., the same seismic dataset can be interpreted with respect to a wide variety of different models, with very different implications for forecasting. In turn, different data processing procedures lead to different outcomes even though they are based on the same conceptual model. Unsurprisingly, the most reliable interpretation will be achieved by employing multi-disciplinary models with overlapping constraints.
VAAPA: a web platform for visualization and analysis of alternative polyadenylation.
Guan, Jinting; Fu, Jingyi; Wu, Mingcheng; Chen, Longteng; Ji, Guoli; Quinn Li, Qingshun; Wu, Xiaohui
2015-02-01
Polyadenylation [poly(A)] is an essential process during the maturation of most mRNAs in eukaryotes. Alternative polyadenylation (APA) as an important layer of gene expression regulation has been increasingly recognized in various species. Here, a web platform for visualization and analysis of alternative polyadenylation (VAAPA) was developed. This platform can visualize the distribution of poly(A) sites and poly(A) clusters of a gene or a section of a chromosome. It can also highlight genes with switched APA sites among different conditions. VAAPA is an easy-to-use web-based tool that provides functions of poly(A) site query, data uploading, downloading, and APA sites visualization. It was designed in a multi-tier architecture and developed based on Smart GWT (Google Web Toolkit) using Java as the development language. VAAPA will be a valuable addition to the community for the comprehensive study of APA, not only by making the high quality poly(A) site data more accessible, but also by providing users with numerous valuable functions for poly(A) site analysis and visualization. Copyright © 2014 Elsevier Ltd. All rights reserved.
Minimizing finite-volume discretization errors on polyhedral meshes
NASA Astrophysics Data System (ADS)
Mouly, Quentin; Evrard, Fabien; van Wachem, Berend; Denner, Fabian
2017-11-01
Tetrahedral meshes are widely used in CFD to simulate flows in and around complex geometries, as automatic generation tools now allow tetrahedral meshes to represent arbitrary domains in a relatively accessible manner. Polyhedral meshes, however, are an increasingly popular alternative. While tetrahedra have at most four neighbours, the higher number of neighbours per polyhedral cell leads to a more accurate evaluation of gradients, essential for the numerical resolution of PDEs. The use of polyhedral meshes nonetheless introduces discretization errors for finite-volume methods: skewness and non-orthogonality, which occur with all sorts of unstructured meshes, as well as errors due to non-planar faces, specific to polygonal faces with more than three vertices. Indeed, polyhedral mesh generation algorithms cannot, in general, guarantee to produce meshes free of non-planar faces. The presented work focuses on the quantification and optimization of discretization errors on polyhedral meshes in the context of finite-volume methods. A quasi-Newton method is employed to optimize the relevant mesh quality measures. Various meshes are optimized, and CFD results of cases with known solutions are presented to assess the improvements the optimization approach can provide.
Sources of Emittance in RF Photocathode Injectors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dowell, David
2016-12-11
Advances in electron beam technology have been central to creating the current generation of x-ray free electron lasers and ultra-fast electron microscopes. These once-exotic devices have become essential tools for basic research and applied science. One important beam technology for both is the electron source which, for many of these instruments, is the photocathode RF gun. The invention of the photocathode gun and the concepts of emittance compensation and beam matching in the presence of space charge and RF forces have made these high-quality beams possible. Achieving even brighter beams requires taking a finer-resolution view of the electron dynamics near the cathode during photoemission and the initial acceleration of the beam. In addition, the high-brightness beam is more sensitive to degradation by the optical aberrations of the gun's RF and magnetic lenses. This paper discusses these topics, including the beam properties due to fundamental photoemission physics, space charge effects close to the cathode, and optical distortions introduced by the RF and solenoid fields. Analytic relations for these phenomena are derived and compared with numerical simulations.
Marmorat, Thibaud; Canat, Hélène Labrosse; Préau, Marie; Farsi, Fadila
2017-03-06
Objectives: As a result of organizational and therapeutic progress in the management of cancer, retail pharmacies are faced with numerous challenges in the follow-up of cancer patients. This study was designed to provide a better understanding of the way in which retail pharmacists define their role in the management of cancer patients and to identify actions that would promote more efficient coordination with other oncology professionals. Methods: Semi-structured qualitative interviews were conducted with retail pharmacists and data were analysed by thematic analysis. Results: The majority (53%) of retail pharmacists provide patients with explanations concerning their treatments. Participants in this study described in detail patients' questions concerning adverse effects (79%) as well as certain forms of alternative medicine (37%). Difficulties with an impact on patient follow-up were also reported, such as the lack of medical information concerning cancer treatments (21%) and their relationship with the hospital (26%). Conclusion: The availability of information tools shared by all healthcare professionals therefore appears to be essential to address the difficulties of follow-up of cancer patients by retail pharmacists.
A thermodynamic approach to obtain materials properties for engineering applications
NASA Technical Reports Server (NTRS)
Chang, Y. Austin
1993-01-01
With the ever-increasing capabilities of computers for numerical computation, we are on the verge of using these tools to model manufacturing processes, improving both the efficiency of these processes and the quality of the products. One such process is casting for the production of metals. However, in order to model metal casting processes in a meaningful way, it is essential to have the basic properties of these materials in their molten state, their solid state, and the mixed state of solid and liquid. Some of the properties needed may be considered intrinsic, such as the density, heat capacity or enthalpy of freezing of a pure metal, while others are not. For instance, the enthalpy of solidification of an alloy is not a defined thermodynamic quantity; its value depends on the micro-segregation of the phases during the course of solidification. The objective of the present study is to present a thermodynamic approach to obtain some of the intrinsic properties, and to combine thermodynamics with kinetic models to estimate quantities such as the enthalpy of solidification of an alloy.
Efficient numerical modeling of the cornea, and applications
NASA Astrophysics Data System (ADS)
Gonzalez, L.; Navarro, Rafael M.; Hdez-Matamoros, J. L.
2004-10-01
Corneal topography has been shown to be an essential tool in the ophthalmology clinic, both in diagnosis and in custom treatments (refractive surgery, keratoplasty), and it also has strong potential in optometry. The post-processing and analysis of corneal elevation, or local curvature data, is a necessary step to refine the data and also to extract relevant information for the clinician. In this context, a parametric cornea model is proposed, consisting of a surface described mathematically by two terms: a general ellipsoid corresponding to a regular base surface, expressed by a general quadric term located at an arbitrary position and free orientation in 3D space, and a second term, described by a Zernike polynomial expansion, which accounts for irregularities and departures from the basic geometry. The model has been validated, obtaining a better fit to experimental data than previous models. Among other potential applications, here we present the determination of the optical axis of the cornea by transforming the general quadric to its canonical form. This has permitted us to perform 3D registration of corneal topographical maps to improve the signal-to-noise ratio. Other basic and clinical applications are also explored.
Resonant Absorption in GaAs-Based Nanowires by Means of Photo-Acoustic Spectroscopy
NASA Astrophysics Data System (ADS)
Petronijevic, E.; Leahu, G.; Belardini, A.; Centini, M.; Li Voti, R.; Hakkarainen, T.; Koivusalo, E.; Guina, M.; Sibilia, C.
2018-03-01
Semiconductor nanowires made of high refractive index materials can couple the incoming light to specific waveguide modes that offer resonant absorption enhancement under the bandgap wavelength, essential for light harvesting, lasing and detection applications. Moreover, the non-trivial ellipticity of such modes can offer near field interactions with chiral molecules, governed by near chiral field. These modes are therefore very important to detect. Here, we present the photo-acoustic spectroscopy as a low-cost, reliable, sensitive and scattering-free tool to measure the spectral position and absorption efficiency of these modes. The investigated samples are hexagonal nanowires with GaAs core; the fabrication by means of lithography-free molecular beam epitaxy provides controllable and uniform dimensions that allow for the excitation of the fundamental resonant mode around 800 nm. We show that the modulation frequency increase leads to the discrimination of the resonant mode absorption from the overall absorption of the substrate. As the experimental data are in great agreement with numerical simulations, the design can be optimized and followed by photo-acoustic characterization for a specific application.
Effective Communication: An Essential Tool To Cope with the Challenge of Technological Change.
ERIC Educational Resources Information Center
Coing, Marga
For a library to function effectively, it is essential that it fosters an open management style, which encourages communication of ideas and objectives both within the library itself and, by example, in other elements in the overall administration of which the library is a part. This paper describes the improvement in morale, efficiency, and…
Verification of RRA and CMC in OpenSim
NASA Astrophysics Data System (ADS)
Ieshiro, Yuma; Itoh, Toshiaki
2013-10-01
OpenSim is free software that can handle various analyses and simulations of skeletal muscle dynamics on a PC. This study treated the RRA and CMC tools in OpenSim. It is remarkable that these tools allow human motion to be simulated with respect to the nerve signals of muscles. However, the tools still seem to be in a developmental stage. In order to verify their applicability, we used them to analyze bending and stretching motion data obtained from a motion capture device. In this study, we checked the consistency between real muscle behavior and the numerical results from these tools.
Numerical methods for systems of conservation laws of mixed type using flux splitting
NASA Technical Reports Server (NTRS)
Shu, Chi-Wang
1990-01-01
The essentially non-oscillatory (ENO) finite difference scheme is applied to systems of conservation laws of mixed hyperbolic-elliptic type. A flux splitting, with the corresponding Jacobian matrices having real and positive/negative eigenvalues, is used. The hyperbolic ENO operator is applied separately. The scheme is numerically tested on the van der Waals equation in fluid dynamics. Convergence was observed, with good resolution, to weak solutions for various Riemann problems, which are then numerically checked to be admissible as viscosity-capillarity limits. The interesting phenomenon of the shrinking of elliptic regions, if present in the initial conditions, was also observed.
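For a scalar conservation law, the flux-splitting idea can be sketched with a Lax-Friedrichs-type splitting and first-order upwind differencing, the building block that ENO extends to high order. This is an illustrative sketch, not the authors' mixed-type system or their ENO reconstruction.

```python
import numpy as np

def lf_split(f, u, alpha):
    """Lax-Friedrichs-type splitting f = f+ + f-, with d(f+)/du >= 0 and
    d(f-)/du <= 0 provided alpha >= max |f'(u)| over the data."""
    fu = f(u)
    return 0.5 * (fu + alpha * u), 0.5 * (fu - alpha * u)

def upwind_step(f, u, dx, dt, alpha):
    """One conservative first-order step on a periodic grid: f+ is
    differenced backward (right-going waves), f- forward (left-going)."""
    fp, fm = lf_split(f, u, alpha)
    dfp = (fp - np.roll(fp, 1)) / dx   # backward difference
    dfm = (np.roll(fm, -1) - fm) / dx  # forward difference
    return u - dt * (dfp + dfm)
```

Because each split flux has a single-signed characteristic speed, the one-sided (and, in ENO, adaptively chosen) stencils remain stable even where the unsplit system changes type.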
Local lubrication model for spherical particles within incompressible Navier-Stokes flows.
Lambert, B; Weynans, L; Bergmann, M
2018-03-01
Lubrication forces are short-range hydrodynamic interactions that are essential for describing particle suspensions. They are usually underestimated in direct numerical simulations of particle-laden flows. In this paper, we propose a lubrication model for a coupled volume penalization method and discrete element method solver that estimates the unresolved hydrodynamic forces and torques in an incompressible Navier-Stokes flow. Corrections are made locally on the surface of the interacting particles without any assumption on the global particle shape. The numerical model has been validated against experimental data and performs as well as existing numerical models that are limited to spherical particles.
Wind conditions in urban layout - Numerical and experimental research
NASA Astrophysics Data System (ADS)
Poćwierz, Marta; Zielonko-Jung, Katarzyna
2018-01-01
This paper presents research comparing numerical and experimental results for different cases of airflow around a few urban layouts. The study is concerned mostly with the analysis of parameters, such as pressure and velocity fields, which are essential in the building industry. Numerical simulations have been performed with the commercial software Fluent, using several different turbulence models, including the popular k-ɛ, k-ɛ realizable and k-ω models. Particular attention has been paid to an accurate description of the conditions at the inlet and the selection of a suitable computational grid. Pressure measurements near the buildings and oil visualization were also undertaken and are described accordingly.
Numerical simulation of eigenmodes of ring and race-track optical microresonators
NASA Astrophysics Data System (ADS)
Raskhodchikov, A. V.; Raskhodchikov, D. V.; Scherbak, S. A.; Lipovskii, A. A.
2017-11-01
We have performed a numerical study of whispering gallery modes (WGMs) of ring and race-track optical microresonators. Mode excitation was considered, and mode spectra and electromagnetic field distributions were calculated via numerical solution of the Helmholtz equation. We pay particular attention to features of race-track eigenmodes that contrast with those of ring resonators. In particular, we demonstrate that race-track modes are not "classic" WGMs in the sense of total internal reflection from a single boundary: the inner boundary is essential for their formation. The dependence of the effective refractive index of race-track modes on the resonator width is also shown.
The use of Skype in a community hospital inpatient palliative medicine consultation service.
Brecher, David B
2013-01-01
Skype™, an Internet-based communication tool, has enhanced communication in numerous circumstances. As telemedicine becomes an increasing part of medical practice, there will be more opportunities to use Skype and similar tools. Numerous scenarios in the lay literature have highlighted its potential uses. Although most commonly used to enhance physician-to-patient communication, Skype has seen limited reported use for patient-to-family communication, especially in end-of-life and palliative care. Our inpatient Palliative Medicine Consultation Service has offered and used this technology to enhance our patients' quality of life. The objective was to provide another tool for our patients to communicate with family and/or friends, especially when clinical symptoms, functional status, financial concerns, or geographic limitations preclude in-person, face-to-face communication.
Successes and Challenges of Incompressible Flow Simulation
NASA Technical Reports Server (NTRS)
Kwak, Dochan; Kiris, Cetin
2003-01-01
During the past thirty years, numerical methods and simulation tools for incompressible flows have advanced as a subset of the CFD discipline. Even though incompressible flows are encountered in many areas of engineering, simulation of compressible flow has been the major driver for developing computational algorithms and tools. This is probably due to the rather stringent requirements for predicting the aerodynamic performance characteristics of flight vehicles, whereas flow devices involving low-speed or incompressible flow could be designed reasonably well without resorting to accurate numerical simulations. As flow devices are required to be more sophisticated and highly efficient, CFD tools have become indispensable in fluid engineering for incompressible and low-speed flow. This paper reviews some of the successes made possible by advances in computational technologies during this period and discusses some of the current challenges.
Elements of orbit-determination theory - Textbook
NASA Technical Reports Server (NTRS)
Solloway, C. B.
1971-01-01
Text applies to solution of various optimization problems. Concepts are logically introduced and refinements and complexities for computerized numerical solutions are avoided. Specific topics and essential equivalence of several different approaches to various aspects of the problem are given.
On the application of subcell resolution to conservation laws with stiff source terms
NASA Technical Reports Server (NTRS)
Chang, Shih-Hung
1989-01-01
LeVeque and Yee recently investigated a one-dimensional scalar conservation law with stiff source terms modeling reacting flow problems and discovered that, for the very stiff case, most current finite difference methods developed for non-reacting flows produce wrong solutions when there is a propagating discontinuity. A numerical scheme, essentially nonoscillatory/subcell resolution - characteristic direction (ENO/SRCD), is proposed for solving conservation laws with stiff source terms. This scheme is a modification of Harten's ENO scheme with subcell resolution, ENO/SR. The locations of the discontinuities and the characteristic directions are essential to the design. Strang's time-splitting method is used, and time evolution is done by advancing along the characteristics. Numerical experiments using this scheme show excellent results on the model problem of LeVeque and Yee. Comparisons of the results of ENO, ENO/SR, and ENO/SRCD are also presented.
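The Strang time-splitting step mentioned above can be sketched as follows (an illustrative toy, not ENO/SRCD itself): for a linear advection equation with a stiff decay source u_t + u_x = -k·u, alternate an exact half-step of the source, a full advection step, and another source half-step.

```python
import numpy as np

# Illustrative toy (not ENO/SRCD) of Strang time-splitting for
# u_t + u_x = -k*u with a stiff decay source: exact half-step of the
# source, full-step first-order upwind advection, second source
# half-step; the splitting error is second order in dt.

def strang_step(u, dx, dt, k):
    u = u * np.exp(-0.5 * k * dt)             # exact half-step of u' = -k u
    u = u - (dt / dx) * (u - np.roll(u, 1))   # first-order upwind, periodic
    return u * np.exp(-0.5 * k * dt)          # second source half-step

nx = 200
x = np.linspace(0.0, 1.0, nx, endpoint=False)
u = np.exp(-100.0 * (x - 0.5) ** 2)
dx = 1.0 / nx
dt = 0.5 * dx                                  # CFL = 0.5
k = 50.0                                       # stiff relative to advection
for _ in range(100):
    u = strang_step(u, dx, dt, k)
# Upwinding is a convex combination and the source is integrated exactly,
# so the peak obeys the exact decay bound exp(-k*t).
print(u.max() <= np.exp(-k * 100 * dt))
```

Because each source sub-step is integrated exactly, the stiffness of k poses no stability restriction; only the advection step is CFL-limited, which is the appeal of splitting for stiff source terms.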
Gene Transfers Shaped the Evolution of De Novo NAD+ Biosynthesis in Eukaryotes
Ternes, Chad M.; Schönknecht, Gerald
2014-01-01
NAD+ is an essential molecule for life, present in every living cell. It can function as an electron carrier or cofactor in redox biochemistry and energetics, and it serves as the substrate for generating the secondary messengers cyclic ADP-ribose and nicotinic acid adenine dinucleotide phosphate. Although de novo NAD+ biosynthesis is essential, different metabolic pathways exist in different eukaryotic clades. The kynurenine pathway, starting with tryptophan, was most likely present in the last common ancestor of all eukaryotes and is active in fungi and animals. The aspartate pathway, detected in most photosynthetic eukaryotes, was probably acquired from the cyanobacterial endosymbiont that gave rise to chloroplasts. An evolutionary analysis of enzymes catalyzing de novo NAD+ biosynthesis yielded evolutionary trees incongruent with established organismal phylogeny, indicating numerous gene transfers. Endosymbiotic gene transfers probably introduced the aspartate pathway into eukaryotes and may have distributed it among different photosynthetic clades. In addition, several horizontal gene transfers substituted eukaryotic genes with bacterial orthologs. Although horizontal gene transfer is accepted as a key mechanism in prokaryotic evolution, it is thought to be rare in eukaryotic evolution. The essential metabolic pathway of de novo NAD+ biosynthesis in eukaryotes was nevertheless shaped by numerous gene transfers. PMID:25169983
Computational fluid dynamics applications to improve crop production systems
USDA-ARS?s Scientific Manuscript database
Computational fluid dynamics (CFD), numerical analysis and simulation tools of fluid flow processes have emerged from the development stage and become nowadays a robust design tool. It is widely used to study various transport phenomena which involve fluid flow, heat and mass transfer, providing det...
Development of a Design Tool for Planning Aqueous Amendment Injection Systems
2012-08-01
Table of contents (excerpt): Chemical Oxidation with Permanganate (MnO4-); 1.4 Implementation Issues; 6.4 SS Design Tool Development and Evaluation; 7.0 Chemical Oxidation with Permanganate; 7.1 Numerical Modeling of Permanganate Distribution; 7.2 CDISCO Development and Evaluation
Finding Your Voice: Talent Development Centers and the Academic Talent Search
ERIC Educational Resources Information Center
Rushneck, Amy S.
2012-01-01
Talent Development Centers are just one of many tools every family, teacher, and gifted advocate should have in their tool box. To understand the importance of Talent Development Centers, it is essential to also understand the Academic Talent Search Program. Talent Search participants who obtain scores comparable to college-bound high school…
Innovative Assessment Tools for a Short, Fast-Paced, Summer Field Course
ERIC Educational Resources Information Center
Baustian, Melissa M.; Bentley, Samuel J.; Wandersee, James H.
2008-01-01
An experiential science program, such as a summer course at a field station, requires unique assessment tools. Traditional assessment via a pencil-and-paper exam cannot capture the essential skills and concepts learned at a summer field station. Therefore, the authors developed a pre- and postcourse image-based analysis to evaluate student…
Sold! The Elementary Classroom Auction as Learning Tool of Communication and Economics
ERIC Educational Resources Information Center
Boyd, Josh; Boyd, Gina
2014-01-01
An auction, though an economic tool, is essentially a performance dependent on communication (Smith, 1989). The auctioneer dictates the pace, asks for bids, and acknowledges responses; the enterprise is controlled by a voice (Boyce, 2001). Bidders must listen and respond strategically to the communication of the people around them. An auction…
Practice versus Politics in Danish Day-Care Centres: How to Bridge the Gap in Early Learning?
ERIC Educational Resources Information Center
Clasen, Line Engel; Jensen de López, Kristine
2016-01-01
It is essential that early educators in day-care services possess adequate pedagogical tools for supporting children's communicative development. Early literacy programmes (ELPs) are potential tools. However, studies investigating the effects of ELPs seldom address implementation processes or the programme users' perspectives. This study sheds…
Small Wonders Close Encounters
ERIC Educational Resources Information Center
Kniseley, MacGregor; Capraro, Karen
2013-01-01
This article introduces students to the world of digital microscopy. Looking at small objects through a digital microscope is like traveling through a foreign country for the first time. The experience is new, engaging, and exciting. A handheld digital microscope is an essential tool in a 21st century teacher's toolkit and the perfect tool to…
NASA Astrophysics Data System (ADS)
Davis, Brian; Turner, Travis L.; Seelecke, Stefan
2005-05-01
Previous work at NASA Langley Research Center (LaRC) involved fabrication and testing of composite beams with embedded, pre-strained shape memory alloy (SMA) ribbons within the beam structures. That study also provided comparison of experimental results with numerical predictions from a research code making use of a new thermoelastic model for shape memory alloy hybrid composite (SMAHC) structures. The previous work showed qualitative validation of the numerical model. However, deficiencies in the experimental-numerical correlation were noted and hypotheses for the discrepancies were given for further investigation. The goal of this work is to refine the experimental measurement and numerical modeling approaches in order to better understand the discrepancies, improve the correlation between prediction and measurement, and provide rigorous quantitative validation of the numerical analysis/design tool. The experimental investigation is refined by a more thorough test procedure and incorporation of higher fidelity measurements such as infrared thermography and projection moire interferometry. The numerical results are produced by a recently commercialized version of the constitutive model as implemented in ABAQUS and are refined by incorporation of additional measured parameters such as geometric imperfection. Thermal buckling, post-buckling, and random responses to thermal and inertial (base acceleration) loads are studied. The results demonstrate the effectiveness of SMAHC structures in controlling static and dynamic responses by adaptive stiffening. Excellent agreement is achieved between the predicted and measured results of the static and dynamic thermomechanical response, thereby providing quantitative validation of the numerical tool.
Aspects of Strongly Correlated Many-Body Fermi Systems
NASA Astrophysics Data System (ADS)
Porter, William J., III
A by-now well-known signal-to-noise problem plagues Monte Carlo calculations of quantum-information-theoretic observables in systems of interacting fermions, particularly the Renyi entanglement entropies Sn, even in many cases where the infamous sign problem does not appear. Several methods have been put forward to circumvent this affliction, including ensemble-switching techniques using auxiliary partition-function ratios. This dissertation presents an algorithm that modifies the recently proposed free-fermion decomposition in an essential way: we incorporate the entanglement-sensitive correlations directly into the probability measure in a natural fashion. Implementing this algorithm, we demonstrate that it is compatible with the hybrid Monte Carlo algorithm, the workhorse of the lattice quantum chromodynamics community and an essential tool for studying gauge theories that contain dynamical fermions. By studying a simple one-dimensional Hubbard model, we demonstrate that our method does not exhibit the debilitating numerical difficulties that naive attempts to study entanglement often encounter. Following that, we illustrate some key probabilistic insights, using intuition derived from the previous method and its successes to construct a simpler, better behaved, and more elegant algorithm. Using this method, in combination with new identities that allow us to avoid a seemingly necessary numerical difficulty, the inversion of the restricted one-body density matrices, we compute high-order Renyi entropies and perform a thorough comparison to this new algorithm's predecessor using the Hubbard model mentioned before. Finally, we characterize non-perturbatively the Renyi entropies of degree n = 2, 3, 4, and 5 of three-dimensional, strongly coupled many-fermion systems in the scale-invariant regime of short interaction range and large scattering length, i.e., in the unitary limit, using the algorithms detailed herein.
We also detail an exact, few-body projective method which we use to characterize the entanglement properties of the two-body sector across a broad range of attractive couplings. In the many-body case, we determine universal scaling properties of this system, and for the two-body case, we compute the entanglement spectrum exactly, successfully characterizing a vast range of entanglement behavior across the BCS-BEC crossover.
Numerical and experimental study on buckling and postbuckling behavior of cracked cylindrical shells
NASA Astrophysics Data System (ADS)
Saemi, J.; Sedighi, M.; Shariati, M.
2015-09-01
The effect of cracks on the load-bearing capacity and buckling behavior of cylindrical shells is an essential consideration in their design. In this paper, experimental and numerical buckling analyses of cracked steel cylindrical shells of various lengths and diameters have been carried out using the finite element method, and the effects of crack position, crack orientation, the crack length-to-shell perimeter ratio (λ = a/(2πr)), and the shell length-to-diameter ratio (L/D) on the buckling and post-buckling behavior have been investigated. For several specimens, buckling tests were performed using an INSTRON 8802 servo-hydraulic machine, and the experimental results were compared to the numerical ones; very good correlation was observed between numerical simulation and experiment. Finally, based on the experimental and numerical results, the sensitivity of the buckling load to the shell length, crack length, and crack orientation has also been investigated.
Page, Grier P; Coulibaly, Issa
2008-01-01
Microarrays are a very powerful tool for quantifying the amount of RNA in samples; however, their ability to query essentially every gene in a genome, which can number in the tens of thousands, presents analytical and interpretative problems. As a result, a variety of software and web-based tools have been developed to help with these issues. This article highlights and reviews some of the tools for the first steps in the analysis of a microarray study. We have tried for a balance between free and commercial systems. We have organized the tools by topics including image processing tools (Section 2), power analysis tools (Section 3), image analysis tools (Section 4), database tools (Section 5), databases of functional information (Section 6), annotation tools (Section 7), statistical and data mining tools (Section 8), and dissemination tools (Section 9).
Sandia National Laboratories analysis code data base
NASA Astrophysics Data System (ADS)
Peterson, C. W.
1994-11-01
Sandia National Laboratories' mission is to solve important problems in the areas of national defense, energy security, environmental integrity, and industrial technology. The laboratories' strategy for accomplishing this mission is to conduct research to provide an understanding of the important physical phenomena underlying any problem, and then to construct validated computational models of the phenomena which can be used as tools to solve the problem. In the course of implementing this strategy, Sandia's technical staff has produced a wide variety of numerical problem-solving tools which they use regularly in the design, analysis, performance prediction, and optimization of Sandia components, systems, and manufacturing processes. This report provides the relevant technical and accessibility data on the numerical codes used at Sandia, including information on the technical competency or capability area that each code addresses, code 'ownership' and release status, and references describing the physical models and numerical implementation.
Fast algorithms for Quadrature by Expansion I: Globally valid expansions
NASA Astrophysics Data System (ADS)
Rachh, Manas; Klöckner, Andreas; O'Neil, Michael
2017-09-01
The use of integral equation methods for the efficient numerical solution of PDE boundary value problems requires two main tools: quadrature rules for the evaluation of layer potential integral operators with singular kernels, and fast algorithms for solving the resulting dense linear systems. Classically, these tools were developed separately. In this work, we present a unified numerical scheme based on coupling Quadrature by Expansion, a recent quadrature method, to a customized Fast Multipole Method (FMM) for the Helmholtz equation in two dimensions. The method allows the evaluation of layer potentials in linear-time complexity, anywhere in space, with a uniform, user-chosen level of accuracy as a black-box computational method. Providing this capability requires geometric and algorithmic considerations beyond the needs of standard FMMs as well as careful consideration of the accuracy of multipole translations. We illustrate the speed and accuracy of our method with various numerical examples.
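The core quadrature-by-expansion idea (evaluating a nearly singular layer potential through a local expansion about an off-surface center) can be sketched for the 2D Laplace kernel. This is an illustrative toy, not the paper's FMM-coupled scheme; the geometry and source density below are made up for the demonstration.

```python
import numpy as np

# Toy sketch of quadrature by expansion (QBX) for the 2D Laplace kernel:
# evaluate the potential of sources on the unit circle near the boundary
# via a local expansion about an off-surface center c, using
#   log(z - y) = log(c - y) + sum_k (-1)^(k+1)/k * ((z - c)/(c - y))^k,
# which is valid for |z - c| < |y - c|.

def direct_potential(z, sources, weights):
    return np.sum(weights * np.log(np.abs(z - sources)))

def qbx_potential(z, sources, weights, center, order):
    # Expansion coefficients (reusable for any target near the center).
    coeffs = [np.sum(weights * np.log(center - sources))]
    for k in range(1, order + 1):
        coeffs.append(np.sum(weights * (-1.0) ** (k + 1) / k
                             / (center - sources) ** k))
    val = coeffs[0]
    for k in range(1, order + 1):
        val += coeffs[k] * (z - center) ** k
    return val.real                 # Re log(.) is the Laplace kernel

N = 200
theta = 2.0 * np.pi * np.arange(N) / N
sources = np.exp(1j * theta)                         # unit-circle "boundary"
weights = (1.5 + np.cos(theta)) * (2.0 * np.pi / N)  # smooth density

z = 0.95 + 0.0j        # target close to the boundary (nearly singular sum)
center = 0.85 + 0.0j   # expansion center pushed off the boundary
print(abs(direct_potential(z, sources, weights)
          - qbx_potential(z, sources, weights, center, order=50)))
```

The expansion coefficients depend only on the center, so once formed (e.g., by an FMM, as in the paper) they can be reused for every target in the center's convergence disk, which is what makes the approach a uniform black-box evaluator.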
Thermo-elasto-plastic simulations of femtosecond laser-induced multiple-cavity in fused silica
NASA Astrophysics Data System (ADS)
Beuton, R.; Chimier, B.; Breil, J.; Hébert, D.; Mishchik, K.; Lopez, J.; Maire, P. H.; Duchateau, G.
2018-04-01
The formation and interaction of multiple cavities induced by tightly focused femtosecond laser pulses are studied using a purpose-built numerical tool that includes the thermo-elasto-plastic material response. Simulations are performed in fused silica for one, two, and four spots of laser energy deposition. The relaxation of the heated matter launches shock waves into the surrounding cold material, leading to cavity formation and the emergence of areas where cracks may be induced. The results show that the shape of the laser-induced structure depends on the energy-deposition configuration and demonstrate the potential of the numerical tool for obtaining a desired structure or technological process.
Computer-Numerical-Control and the EMCO Compact 5 Lathe.
ERIC Educational Resources Information Center
Mullen, Frank M.
This laboratory manual is intended for use in teaching computer-numerical-control (CNC) programming using the Emco Maier Compact 5 Lathe. Developed for use at the postsecondary level, this material contains a short introduction to CNC machine tools. This section covers CNC programs, CNC machine axes, and CNC coordinate systems. The following…
Numerical model for healthy and injured ankle ligaments.
Forestiero, Antonella; Carniel, Emanuele Luigi; Fontanella, Chiara Giulia; Natali, Arturo Nicola
2017-06-01
The aim of this work is to provide a computational tool for the investigation of ankle mechanics under different loading conditions. Attention is focused on the biomechanical role of the ankle ligaments, which are fundamental for joint stability. A finite element model of the human foot is developed from Computed Tomography and Magnetic Resonance Imaging data, with particular attention to the definition of the ankle ligaments. A refined fiber-reinforced visco-hyperelastic constitutive model is assumed to characterize the mechanical response of the ligaments. Numerical analyses interpreting the anterior drawer and talar tilt tests reported in the literature are performed. The numerical results agree with the range of values obtained in experimental tests, confirming the accuracy of the adopted procedure. The increase in the ankle range of motion after rupture of certain ligaments is also evaluated, demonstrating the capability of the numerical models to interpret damage conditions. The developed computational model provides a tool for investigating foot and ankle functionality in terms of tissue stress-strain and ankle motion, considering different types of damage to the ankle ligaments.
On numerical modeling of one-dimensional geothermal histories
Haugerud, R.A.
1989-01-01
Numerical models of one-dimensional geothermal histories are one way of understanding the relations between tectonics and transient thermal structure in the crust. Such models can be powerful tools for interpreting geochronologic and thermobarometric data. A flexible program to calculate these models on a microcomputer is available and examples of its use are presented. Potential problems with this approach include the simplifying assumptions that are made, limitations of the numerical techniques, and the neglect of convective heat transfer. © 1989.
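A one-dimensional geothermal history of the kind described above can be sketched in a few lines (an illustrative toy, not Haugerud's program; all material parameters and the intrusion scenario are assumed for the example): explicit finite-difference conduction T_t = κ·T_zz with a fixed surface temperature and a constant basal heat flux, applied to the cooling of a mid-crustal thermal anomaly.

```python
import numpy as np

# Toy 1D transient geothermal model (not Haugerud's program; parameters
# are assumed): explicit finite differences for T_t = kappa * T_zz with
# fixed surface temperature and constant basal heat flux, tracking the
# conductive decay of a mid-crustal hot anomaly (e.g., an intrusion).

kappa = 1e-6          # thermal diffusivity, m^2/s
cond = 2.5            # conductivity, W/(m K)
q_b = 0.06            # basal heat flux, W/m^2
depth = 30e3          # model depth, m
nz = 101
dz = depth / (nz - 1)
z = np.linspace(0.0, depth, nz)

grad = q_b / cond                                # steady basal gradient, K/m
geotherm = grad * z                              # steady conductive geotherm
T = geotherm + 300.0 * np.exp(-((z - 15e3) / 2e3) ** 2)   # hot anomaly

dt = 0.4 * dz ** 2 / kappa                       # explicit stability limit
t_total = 1e6 * 3.15e7                           # ~1 Myr in seconds
for _ in range(int(t_total / dt)):
    Tn = T.copy()
    Tn[1:-1] += kappa * dt / dz ** 2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    Tn[0] = 0.0                                  # fixed surface temperature
    Tn[-1] = Tn[-2] + grad * dz                  # constant basal heat flux
    T = Tn

peak = (T - geotherm).max()
print(round(peak, 1))   # the 300 K anomaly has decayed substantially
```

Pairing temperature-time paths like this one with closure temperatures of geochronometers is what lets such models constrain cooling histories from thermochronologic data.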
The effectiveness of a self-reporting bedside pain assessment tool for oncology inpatients.
Kim, Eun Bi; Han, Hye-Suk; Chung, Jung Hwa; Park, Bo Ram; Lim, Sung-Nam; Yim, Kyoung Hoon; Shin, Young Duck; Lee, Ki Hyeong; Kim, Wun-Jae; Kim, Seung Taik
2012-11-01
Pain is common during cancer treatment, and patient self-reporting of pain is an essential first step toward ideal cancer pain management. However, many studies of cancer pain management report that, because pain may be underestimated, it is often inadequately managed. The aim of this study was to evaluate the effectiveness of bedside self-assessment of pain intensity for inpatients using a self-reporting pain board. Fifty consecutive inpatients admitted to the Oncology Department of Chungbuk National University Hospital were included in this observational prospective study from February 2011 to December 2011. The medical staff performed pain assessments by questioning patients and using verbal rating scales (VRS) over 3 consecutive days. Then, for 3 additional days, patients used a self-reporting pain board attached to the bed, which had movable indicators representing 0-10 on a numeric rating scale (NRS) and the frequency of breakthrough pain. Patient-rated reliability of the medical staff's pain assessment increased from 74% to 96% after the self-reporting pain board was introduced (p=0.004). The gap (mean±standard deviation [SD]) between the NRS reported by patients and the NRS recorded in the medical records decreased from 3.16±2.08 to 1.00±1.02 (p<0.001), and patient satisfaction with pain management increased from 54% to 82% (p=0.002). This study suggests that the self-reporting bedside pain assessment tool provides a reliable and effective means of assessing pain in oncology inpatients.
3D visualization of solar wind ion data from the Chang'E-1 exploration
NASA Astrophysics Data System (ADS)
Zhang, Tian; Sun, Yankui; Tang, Zesheng
2011-10-01
Chang'E-1 (CE-1), China's first Moon-orbiting spacecraft, launched in 2007, carried an instrument called the Solar Wind Ion Detector (SWID), which sent back tens of gigabytes of solar wind ion differential number flux data. These data are essential for furthering our understanding of the cislunar space environment, but fully comprehending and analyzing them presents considerable difficulties, not only because of their size (57 GB) but also because of their complexity. Therefore, a new 3D visualization method is developed to give a more intuitive representation than traditional 1D and 2D visualizations, and in particular to better indicate the direction of the incident ion differential number flux and the relative spatial position of CE-1 with respect to the Sun, the Earth, and the Moon. First, the Selenocentric Solar Ecliptic (SSE) coordinate system, which is well suited to this goal, is chosen; solar wind ion differential number flux vectors in SSE are calculated from the Geocentric Solar Ecliptic (GSE) and Moon Center Coordinate (MCC) coordinates of the spacecraft, and the ion differential number flux distribution in SSE is then visualized in 3D space. This visualization method is integrated into an interactive visualization analysis software tool named vtSWIDs, developed in MATLAB, which enables researchers to browse through numerous records and manipulate the visualization results in real time. The tool also provides some useful statistical analysis functions and can be easily expanded.
NASA Astrophysics Data System (ADS)
Srinivasan, Yeshwanth; Hernes, Dana; Tulpule, Bhakti; Yang, Shuyu; Guo, Jiangling; Mitra, Sunanda; Yagneswaran, Sriraja; Nutter, Brian; Jeronimo, Jose; Phillips, Benny; Long, Rodney; Ferris, Daron
2005-04-01
Automated segmentation and classification of diagnostic markers in medical imagery are challenging tasks. Numerous algorithms for segmentation and classification based on statistical approaches of varying complexity are found in the literature. However, the design of an efficient and automated algorithm for precise classification of desired diagnostic markers is extremely image-specific. The National Library of Medicine (NLM), in collaboration with the National Cancer Institute (NCI), is creating an archive of 60,000 digitized color images of the uterine cervix. NLM is developing tools for the analysis and dissemination of these images over the Web for the study of visual features correlated with precancerous neoplasia and cancer. To enable indexing of images of the cervix, it is essential to develop algorithms for the segmentation of regions of interest, such as acetowhitened regions, and for automatic identification and classification of regions exhibiting mosaicism and punctation. The success of such algorithms depends primarily on the selection of relevant features representing the region of interest. We present statistical classification and segmentation algorithms based on color and geometric features that yield excellent identification of the regions of interest. Distinct classification of mosaic regions from non-mosaic ones has been obtained by clustering multiple geometric and color features of the segmented sections using various morphological and statistical approaches. Such automated classification methodologies will facilitate content-based image retrieval from the digital archive of the uterine cervix and have the potential to develop into an image-based screening tool for cervical cancer.
ERIC Educational Resources Information Center
Dahiya, Sunita; Dahiya, Rajiv
2015-01-01
Theory and practicals are two essential components of the pharmacy course curriculum; but in addition to sitting and passing examinations with good grades, students pursuing pharmacy post-graduation (PG) are also required to develop professional skills that might not be attained solely through conventional classroom programs. This…
Evaluating the Zebrafish Embryo Toxicity Test for Pesticide Hazard Screening
Given the numerous chemicals used in society, it is critical to develop tools for accurate and efficient evaluation of potential risks to human and ecological receptors. Fish embryo acute toxicity tests are one tool that has been shown to be highly predictive of standard, more reso...
Modelling Student Misconceptions Using Nested Logit Item Response Models
ERIC Educational Resources Information Center
Yildiz, Mustafa
2017-01-01
Student misconceptions have been studied for decades from a curricular/instructional perspective and from the assessment/test level perspective. Numerous misconception assessment tools have been developed in order to measure students' misconceptions relative to the correct content. Often, these tools are used to make a variety of educational…
AtCHX13 is a plasma membrane K(+) transporter
USDA-ARS?s Scientific Manuscript database
Potassium (K+) homeostasis is essential for diverse cellular processes, although how various cation transporters collaborate to maintain a suitable K(+) required for growth and development is poorly understood. The Arabidopsis ("Arabidopsis thaliana") genome contains numerous cation:proton antiporte...
AtCHX13 is a plasma membrane K+ transporter
USDA-ARS?s Scientific Manuscript database
Potassium (K+) homeostasis is essential for diverse cellular processes, although how various cation transporters collaborate to maintain a suitable K+ required for growth and development is poorly understood. The Arabidopsis (Arabidopsis thaliana) genome contains numerous cation:proton antiporters (...
Evaluation of equipment and methods to map lost circulation zones in geothermal wells
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDonald, W.J.; Leon, P.A.; Pittard, G.
A study and evaluation of methods to locate, characterize, and quantify lost circulation zones are described. Twenty-five methods of mapping and quantifying lost circulation zones were evaluated, including electrical, acoustical, mechanical, radioactive, and optical systems. Each tool studied is described. The structured numerical evaluation plan used as the basis for comparing the 25 tools, and the resulting ranking among them, are presented.
Achieving better cooling of turbine blades using numerical simulation methods
NASA Astrophysics Data System (ADS)
Inozemtsev, A. A.; Tikhonov, A. S.; Sendyurev, C. I.; Samokhvalov, N. Yu.
2013-02-01
A new design of the first-stage nozzle vane for the turbine of a prospective gas-turbine engine is considered. The blade's thermal state is numerically simulated in a conjugate formulation using the ANSYS CFX 13.0 software package. Critical locations in the blade design are determined from the distribution of heat fluxes, and measures aimed at achieving more efficient cooling are analyzed. As a result of this work, a substantially lower (by 50-100°C) maximum metal temperature has been achieved.
Computation of transonic viscous-inviscid interacting flow
NASA Technical Reports Server (NTRS)
Whitfield, D. L.; Thomas, J. L.; Jameson, A.; Schmidt, W.
1983-01-01
Transonic viscous-inviscid interaction is considered using the Euler and inverse compressible turbulent boundary-layer equations. Certain improvements in the inverse boundary-layer method are mentioned, along with experiences in using various Runge-Kutta schemes to solve the Euler equations. Numerical conditions imposed on the Euler equations at a surface for viscous-inviscid interaction using the method of equivalent sources are developed, and numerical solutions are presented and compared with experimental data to illustrate essential points. Previously announced in STAR N83-17829
1987-09-01
one commercial code based on the p- and h-p versions of the finite element method, the program PROBE of NOETIC Technologies (St. Louis, MO). PROBE deals with two… Among the stated goals: to be an international center of study and research for foreign students in numerical mathematics who are supported by foreign governments or agencies such as the National Bureau of Standards.
NASA Astrophysics Data System (ADS)
Reis, C.; Clain, S.; Figueiredo, J.; Baptista, M. A.; Miranda, J. M. A.
2015-12-01
Numerical tools are very important for scenario evaluations of hazardous phenomena such as tsunamis. Nevertheless, the predictions depend strongly on the quality of the numerical tool, and the design of efficient numerical schemes still receives considerable attention in order to provide robust and accurate solutions. In this study we compare the efficiency of two second-order finite volume codes that implement different methods to solve the non-conservative shallow water equations: MUSCL (Monotonic Upstream-centered Scheme for Conservation Laws) and MOOD (Multi-dimensional Optimal Order Detection), which optimizes the accuracy of the approximation as a function of the local smoothness of the solution. MUSCL is based on a priori criteria, with the limiting procedure performed before the solution is updated to the next time step, which can lead to unnecessary accuracy reduction. By contrast, the newer MOOD technique uses a posteriori detectors to prevent the solution from oscillating in the vicinity of discontinuities: a candidate solution is computed, and corrections are performed only in cells where non-physical oscillations are detected. Using a simple one-dimensional analytical benchmark, 'Single wave on a sloping beach', we show that the classical 1D shallow-water system can be solved accurately with the finite volume method equipped with the MOOD technique, which provides a better approximation with sharper shocks and less numerical diffusion. For code validation, we also use the 2011 Tohoku-Oki tsunami and reproduce two DART records, demonstrating that the quality of the solution may strongly affect the scenario one can assess. This work is funded by the Portugal-France research agreement, through the research project GEONUM FCT-ANR/MAT-NAN/0122/2012.
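The a priori limiting that distinguishes MUSCL from MOOD can be illustrated with a minimal sketch: a minmod-limited second-order reconstruction of cell-face values in 1D. This is an illustrative example of the general MUSCL idea, not code from the paper.

```python
def minmod(a, b):
    """Minmod limiter: 0 at local extrema (opposite-sign slopes),
    otherwise the slope of smaller magnitude."""
    if a * b <= 0.0:
        return 0.0
    return a if abs(a) < abs(b) else b

def muscl_faces(u, dx):
    """Second-order MUSCL reconstruction of left/right face states for
    the interior cells of a 1D array u. The limiter is applied *before*
    the update (a priori), in contrast to MOOD's a posteriori detection."""
    faces = []
    for i in range(1, len(u) - 1):
        slope = minmod((u[i] - u[i - 1]) / dx, (u[i + 1] - u[i]) / dx)
        uL = u[i] - 0.5 * dx * slope  # value at the left cell face
        uR = u[i] + 0.5 * dx * slope  # value at the right cell face
        faces.append((uL, uR))
    return faces
```

On smooth data the limited slope equals the centered slope, recovering second order; at an extremum the slope collapses to zero, which is exactly the "non-necessary accuracy reduction" the abstract attributes to a priori limiting.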
Analysis of Ten Reverse Engineering Tools
NASA Astrophysics Data System (ADS)
Koskinen, Jussi; Lehmonen, Tero
Reverse engineering tools can be used to satisfy the information needs of software maintainers. Especially in the case of maintaining large-scale legacy systems, tool support is essential. Reverse engineering tools provide various kinds of capabilities for delivering the needed information to the tool user. In this paper we analyze the provided capabilities in terms of four aspects: provided data structures, visualization mechanisms, information request specification mechanisms, and navigation features. We provide a compact analysis of ten representative reverse engineering tools for supporting C, C++ or Java: Eclipse Java Development Tools, Wind River Workbench (for C and C++), Understand (for C++), Imagix 4D, Creole, Javadoc, Javasrc, Source Navigator, Doxygen, and HyperSoft. The results of the study supplement the earlier findings in this important area.
Building ProteomeTools based on a complete synthetic human proteome
Zolg, Daniel P.; Wilhelm, Mathias; Schnatbaum, Karsten; Zerweck, Johannes; Knaute, Tobias; Delanghe, Bernard; Bailey, Derek J.; Gessulat, Siegfried; Ehrlich, Hans-Christian; Weininger, Maximilian; Yu, Peng; Schlegl, Judith; Kramer, Karl; Schmidt, Tobias; Kusebauch, Ulrike; Deutsch, Eric W.; Aebersold, Ruedi; Moritz, Robert L.; Wenschuh, Holger; Moehring, Thomas; Aiche, Stephan; Huhmer, Andreas; Reimer, Ulf; Kuster, Bernhard
2018-01-01
The ProteomeTools project builds molecular and digital tools from the human proteome to facilitate biomedical and life science research. Here, we report the generation and multimodal LC-MS/MS analysis of >330,000 synthetic tryptic peptides representing essentially all canonical human gene products and exemplify the utility of this data. The resource will be extended to >1 million peptides and all data will be shared with the community via ProteomicsDB and proteomeXchange. PMID:28135259
Orchard, Ané; Sandasi, Maxleene; Kamatou, Guy; Viljoen, Alvaro; van Vuuren, Sandy
2017-01-01
This study reports on the inhibitory concentration of 59 commercial essential oils recommended for dermatological conditions, and identifies putative compounds responsible for antimicrobial activity. Essential oils were investigated for antimicrobial activity using minimum inhibitory concentration assays. Ten essential oils were identified as having superior antimicrobial activity. The essential oil compositions were determined using gas chromatography coupled to mass spectrometry and the data analysed with the antimicrobial activity using multivariate tools. Orthogonal projections to latent structures models were created for seven of the pathogens. Eugenol was identified as the main biomarker responsible for antimicrobial activity in the majority of the essential oils. The essential oils mostly displayed noteworthy antimicrobial activity, with five oils displaying broad-spectrum activity against the 13 tested micro-organisms. The antimicrobial efficacies of the essential oils highlight their potential in treating dermatological infections and through chemometric modelling, bioactive volatiles have been identified. © 2017 Wiley-VHCA AG, Zurich, Switzerland.
Numerical and Experimental Investigations of the Flow in a Stationary Pelton Bucket
NASA Astrophysics Data System (ADS)
Nakanishi, Yuji; Fujii, Tsuneaki; Kawaguchi, Sho
A numerical code based on the Moving-Particle Semi-implicit (MPS) method, one of the mesh-free particle methods, has been used for the simulation of free surface flows in the buckets of Pelton turbines. In this study, the flow in a stationary bucket is investigated by MPS simulation and experiment to validate the numerical code. The free surface flow, which depends on the angular position of the bucket, and the corresponding pressure distribution on the bucket computed by the numerical code are compared with those obtained experimentally. The comparison shows that the numerical code based on the MPS method is useful as a tool to gain insight into the free surface flows in Pelton turbines.
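The MPS method referenced above evaluates field quantities through a kernel weight summed over neighboring particles. A minimal sketch of the standard MPS weight function and the resulting particle number density (shown here in 1D for brevity; the paper's solver is not reproduced):

```python
def mps_weight(r, re):
    """Standard MPS kernel weight: w = re/r - 1 inside the effective
    radius re, zero outside. Diverges as r -> 0, which penalizes
    particle clustering; the particle itself (r = 0) is excluded."""
    return re / r - 1.0 if 0.0 < r < re else 0.0

def particle_number_density(positions, i, re):
    """Particle number density n_i = sum_j w(|x_j - x_i|), j != i.
    In MPS, keeping n_i near a reference value enforces incompressibility."""
    xi = positions[i]
    return sum(mps_weight(abs(x - xi), re)
               for j, x in enumerate(positions) if j != i)
```

A free surface is typically detected where this density falls below a fraction of its interior reference value, which is how a particle method tracks the kind of free-surface flow described in the abstract.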
WEC3: Wave Energy Converter Code Comparison Project: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Combourieu, Adrien; Lawson, Michael; Babarit, Aurelien
This paper describes the recently launched Wave Energy Converter Code Comparison (WEC3) project and presents preliminary results from this effort. The objectives of WEC3 are to verify and validate numerical modelling tools that have been developed specifically to simulate wave energy conversion devices and to inform the upcoming IEA OES Annex VI Ocean Energy Modelling Verification and Validation project. WEC3 is divided into two phases: Phase I consists of code-to-code verification, and Phase II entails code-to-experiment validation. WEC3 focuses on mid-fidelity codes that simulate WECs using time-domain multibody dynamics methods to model device motions and hydrodynamic coefficients to model hydrodynamic forces. Consequently, high-fidelity numerical modelling tools, such as Navier-Stokes computational fluid dynamics simulation, and simple frequency-domain modelling tools were not included in the WEC3 project.
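The "hydrodynamic coefficients" approach that WEC3 targets can be sketched in its simplest form: a single-degree-of-freedom heave equation with constant added mass, damping and hydrostatic stiffness, integrated in the time domain. The coefficient values and function names here are illustrative assumptions, not taken from the WEC3 codes.

```python
def simulate_heave(m, A33, B33, C33, Fexc, dt, steps):
    """Integrate (m + A33) z'' + B33 z' + C33 z = Fexc(t) with
    semi-implicit Euler. A33, B33, C33 are constant added-mass, damping
    and hydrostatic-stiffness coefficients; a deliberate simplification
    of the coefficient-based, time-domain models compared in WEC3."""
    z, v = 0.0, 0.0
    history = []
    for n in range(steps):
        t = n * dt
        a = (Fexc(t) - B33 * v - C33 * z) / (m + A33)
        v += dt * a        # update velocity first (semi-implicit)
        z += dt * v        # then position, using the new velocity
        history.append(z)
    return history
```

Under a constant load the response settles to the hydrostatic balance z = F/C33, a quick sanity check for any such coefficient-based solver.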
Transforming Mobile Platform with KI-SIM Card into an Open Mobile Identity Tool
NASA Astrophysics Data System (ADS)
Hyppönen, Konstantin; Hassinen, Marko; Trichina, Elena
Recent introduction of Near Field Communication (NFC) in mobile phones has stimulated the development of new proximity payment and identification services. We present an architecture that facilitates the use of the mobile phone as a personalised electronic identity tool. The tool can work as a replacement for numerous ID cards and licenses. Design-for-privacy principles have been applied, such as minimisation of data collection and informed consent of the user. We describe an implementation of a lightweight version of the mobile identity tool using currently available handset technology and off-the-shelf development tools.
An integrated modeling and design tool for advanced optical spacecraft
NASA Technical Reports Server (NTRS)
Briggs, Hugh C.
1992-01-01
Consideration is given to the design and status of the Integrated Modeling of Optical Systems (IMOS) tool and to critical design issues. A multidisciplinary spacecraft design and analysis tool with support for structural dynamics, controls, thermal analysis, and optics, IMOS provides rapid and accurate end-to-end performance analysis, simulations, and optimization of advanced space-based optical systems. The requirements for IMOS-supported numerical arrays, user defined data structures, and a hierarchical data base are outlined, and initial experience with the tool is summarized. A simulation of a flexible telescope illustrates the integrated nature of the tools.
Analysis of the electromagnetic wave resistivity tool in deviated well drilling
NASA Astrophysics Data System (ADS)
Zhang, Yumei; Xu, Lijun; Cao, Zhang
2014-04-01
Electromagnetic wave resistivity (EWR) tools are used to provide real-time measurements of resistivity in the formation around the tool in Logging While Drilling (LWD). In this paper, the acquired resistivity information in the formation is analyzed to extract more information, including the dipping angle and azimuth direction of the drill. A finite element (FE) model of an EWR tool working in layered earth formations is established. Numerical analysis and FE simulations are employed to analyze the amplitude ratio and phase difference between the voltages measured at the two receivers of the EWR tool in deviated well drilling.
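The two raw quantities the abstract names, amplitude ratio and phase difference between the receiver voltages, are straightforward to compute once the voltages are represented as complex phasors. A minimal illustrative helper (the dB convention and function name are assumptions, not from the paper):

```python
import cmath
import math

def amp_ratio_phase_diff(V_near, V_far):
    """Amplitude ratio (in dB) and phase difference (in degrees)
    between the complex phasor voltages at the two EWR receivers.
    Both quantities are later mapped to apparent resistivity."""
    ratio_db = 20.0 * math.log10(abs(V_far) / abs(V_near))
    dphase = math.degrees(cmath.phase(V_far) - cmath.phase(V_near))
    return ratio_db, dphase
```

For example, a far-receiver voltage at half the near-receiver amplitude and 90° ahead yields roughly -6 dB and 90°.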
ERIC Educational Resources Information Center
Wendt, Oliver; Miller, Bridget
2012-01-01
Critical appraisal of the research literature is an essential step in informing and implementing evidence-based practice. Quality appraisal tools that assess the methodological quality of experimental studies provide a means to identify the most rigorous research suitable for evidence-based decision-making. In single-subject experimental research,…
Digital Pedagogies for Teachers' CPD
ERIC Educational Resources Information Center
Montebello, Matthew
2017-01-01
The continuous professional development of educators is not only essential to highly maintain their expertise levels and ensure that their knowledge is up to scratch, but also to catch up and adopt new pedagogical tools, skills and techniques. The advent of the Web 2.0 brought about a plethora of digital tools that teachers have not only struggled…
Toward a New Approach to the Evaluation of a Digital Curriculum Using Learning Analytics
ERIC Educational Resources Information Center
Rangel, Virginia Snodgrass; Bell, Elizabeth R.; Monroy, Carlos; Whitaker, J. Reid
2015-01-01
Understanding how an educational intervention is implemented is essential to evaluating its effectiveness. With the increased use of digital tools in classrooms, however, traditional methods of measuring implementation fall short. Fortunately, there is a way to learn about the interactions that users have with digital tools that are embedded into…
ERIC Educational Resources Information Center
Britton, Emily; Simper, Natalie; Leger, Andrew; Stephenson, Jenn
2017-01-01
Effective teamwork skills are essential for success in an increasingly team-based workplace. However, research suggests that there is often confusion concerning how teamwork is measured and assessed, making it difficult to develop these skills in undergraduate curricula. The goal of the present study was to develop a sustainable tool for assessing…
USDA-ARS?s Scientific Manuscript database
Positional cloning in bread wheat is a tedious task due to its huge genome size (~17 Gbp) and polyploid character. BAC libraries represent an essential tool for positional cloning. However, wheat BAC libraries comprise more than million clones, which make their screening very laborious. Here we pres...
ERIC Educational Resources Information Center
DiLuzio, Geneva J.; And Others
This document accompanies Conceptual Learning and Development Assessment Series II: Cutting Tool, a test constructed to chart the conceptual development of individuals. As a technical manual, it contains information on the rationale, development, standardization, and reliability of the test, as well as essential information and statistical data…
Tabari, Mohaddeseh Abouhosseini; Youssefi, Mohammad Reza; Esfandiari, Aryan; Benelli, Giovanni
2017-10-01
Insect vectors are responsible for spreading devastating parasites and pathogens. A large number of botanicals have been suggested for eco-friendly control programs against mosquito vectors, and some of them are aromatic plants. Pelargonium roseum, a species belonging to the Geraniaceae family, may represent a suitable candidate as mosquito repellent and/or larvicide due to its pleasant rose-like odor. In this research, we evaluated the toxicity of the essential oil from P. roseum and its major constituents against the West Nile and filariasis vector Culex pipiens. The chemical composition of P. roseum essential oil was analyzed by gas chromatography-mass spectroscopy. Major constituents were citronellol (35.9%), geraniol (18.5%), and linalool (5.72%). The bioactivity of P. roseum essential oil and its three major compounds on larvae and egg rafts of Cx. pipiens was evaluated. The essential oil had a significant toxic effect on larvae and egg rafts of Cx. pipiens, with 50% lethal concentration (LC50) values of 5.49 and 0.45 μg/mL, respectively. The major constituents geraniol, citronellol and linalool showed LC50 values of 6.86, 7.64 and 14.87 μg/mL on larvae, and 0.8, 0.67 and 1.27 μg/mL on egg rafts. The essential oil and two of its constituents, citronellol and geraniol, showed moderate knock-down on Cx. pipiens adults. Overall, the present investigation revealed that the major components of P. roseum, and especially the whole essential oil, could be helpful in developing novel and safe mosquito control tools, and also offer an environmentally safe and cheap tool for reducing Cx. pipiens mosquito populations. Copyright © 2017. Published by Elsevier Ltd.
Brown, Nicholas R.; Carlsen, Brett W.; Dixon, Brent W.; ...
2016-06-09
Dynamic fuel cycle simulation tools are intended to model holistic transient nuclear fuel cycle scenarios. As with all simulation tools, fuel cycle simulators require verification through unit tests, benchmark cases, and integral tests. Model validation is a vital aspect as well. Although comparative studies have been performed, there is no comprehensive unit test and benchmark library for fuel cycle simulator tools. The objective of this paper is to identify the must-test functionalities of a fuel cycle simulator tool within the context of specific problems of interest to the Fuel Cycle Options Campaign within the U.S. Department of Energy's Office of Nuclear Energy. The approach in this paper identifies the features needed to cover the range of promising fuel cycle options identified in the DOE-NE Fuel Cycle Evaluation and Screening (E&S) and categorizes these features to facilitate prioritization. Features were categorized as essential functions, integrating features, and exemplary capabilities. One objective of this paper is to propose a library of unit tests applicable to each of the essential functions. Another underlying motivation for this paper is to encourage an international dialog on the functionalities and standard test methods for fuel cycle simulator tools.
Growing evidence for human health benefits of boron
USDA-ARS?s Scientific Manuscript database
Growing evidence from numerous laboratories using a variety of experimental models shows that boron is a bioactive beneficial, perhaps essential, element for humans. Reported beneficial actions of boron include arthritis alleviation or risk reduction; bone growth and maintenance; central nervous sys...
NASA Astrophysics Data System (ADS)
Zavrazhina, T. V.
2007-10-01
A mathematical modeling technique is proposed for oscillation chaotization in an essentially nonlinear dissipative Duffing oscillator with two-frequency excitation on an invariant torus in ℝ². The technique is based on the joint application of the parameter continuation method, Floquet stability criteria, bifurcation theory, and the Everhart high-accuracy numerical integration method. This approach is used for the numerical construction of subharmonic solutions in the case when the oscillator passes to chaos through a sequence of period-multiplying bifurcations. The value of a universal constant obtained earlier by the author while investigating oscillation chaotization in dissipative oscillators with single-frequency periodic excitation is confirmed.
ERIC Educational Resources Information Center
Agus, Mirian; Penna, Maria Pietronilla; Peró-Cebollero, Maribel; Guàrdia-Olmos, Joan
2016-01-01
Research on the graphical facilitation of probabilistic reasoning has been characterised by the effort expended to identify valid assessment tools. The authors developed an assessment instrument to compare reasoning performances when problems were presented in verbal-numerical and graphical-pictorial formats. A sample of undergraduate psychology…
On the use of the line integral in the numerical treatment of conservative problems
NASA Astrophysics Data System (ADS)
Brugnano, Luigi; Iavernaro, Felice
2016-06-01
We sketch out the use of the line integral as a tool to devise numerical methods suitable for conservative and, in particular, Hamiltonian problems. The monograph [3] presents the fundamental theory on line integral methods and this short note aims at exploring some aspects and results emerging from their study.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heckman, B.K.; Chinn, V.K.
1981-01-01
The development and use of computer programs written to produce the paper tape needed for the automation, or numeric control, of drill presses employed to fabricate computer-designed printed circuit boards are described. (LCL)
Brown, Amanda M.V.; Howe, Dana K.; Wasala, Sulochana K.; Peetz, Amy B.; Zasada, Inga A.; Denver, Dee R.
2015-01-01
Bacterial mutualists can modulate the biochemical capacity of animals. Highly coevolved nutritional mutualists do this by synthesizing nutrients missing from the host’s diet. Genomics tools have advanced the study of these partnerships. Here we examined the endosymbiont Xiphinematobacter (phylum Verrucomicrobia) from the dagger nematode Xiphinema americanum, a migratory ectoparasite of numerous crops that also vectors nepovirus. Previously, this endosymbiont was identified in the gut, ovaries, and eggs, but its role was unknown. We explored the potential role of this symbiont using fluorescence in situ hybridization, genome sequencing, and comparative functional genomics. We report the first genome of an intracellular Verrucomicrobium and the first exclusively intracellular non-Wolbachia nematode symbiont. Results revealed that Xiphinematobacter had a small 0.916-Mb genome with only 817 predicted proteins, resembling genomes of other mutualist endosymbionts. Compared with free-living relatives, conserved proteins were shorter on average, and there was large-scale loss of regulatory pathways. Despite massive gene loss, more genes were retained for biosynthesis of amino acids predicted to be essential to the host. Gene ontology enrichment tests showed enrichment for biosynthesis of arginine, histidine, and aromatic amino acids, as well as thiamine and coenzyme A, diverging from the profiles of relatives Akkermansia muciniphila (in the human colon), Methylacidiphilum infernorum, and the mutualist Wolbachia from filarial nematodes. Together, these features and the location in the gut suggest that Xiphinematobacter functions as a nutritional mutualist, supplementing essential nutrients that are depleted in the nematode diet. This pattern points to evolutionary convergence with endosymbionts found in sap-feeding insects. PMID:26362082
Numerical Simulations of the Digital Microfluidic Manipulation of Single Microparticles.
Lan, Chuanjin; Pal, Souvik; Li, Zhen; Ma, Yanbao
2015-09-08
Single-cell analysis techniques have been developed as a valuable bioanalytical tool for elucidating cellular heterogeneity at genomic, proteomic, and cellular levels. Cell manipulation is an indispensable process for single-cell analysis. Digital microfluidics (DMF) is an important platform for conducting cell manipulation and single-cell analysis in a high-throughput fashion. However, the manipulation of single cells in DMF has not been quantitatively studied so far. In this article, we investigate the interaction of a single microparticle with a liquid droplet on a flat substrate using numerical simulations. The droplet is driven by capillary force generated from the wettability gradient of the substrate. Considering the Brownian motion of microparticles, we utilize many-body dissipative particle dynamics (MDPD), an off-lattice mesoscopic simulation technique, in this numerical study. The manipulation processes (including pickup, transport, and drop-off) of a single microparticle with a liquid droplet are simulated. Parametric studies are conducted to investigate the effects on the manipulation processes from the droplet size, wettability gradient, wetting properties of the microparticle, and particle-substrate friction coefficients. The numerical results show that the pickup, transport, and drop-off processes can be precisely controlled by these parameters. On the basis of the numerical results, a trap-free delivery of a hydrophobic microparticle to a destination on the substrate is demonstrated in the numerical simulations. The numerical results not only provide a fundamental understanding of interactions among the microparticle, the droplet, and the substrate but also demonstrate a new technique for the trap-free immobilization of single hydrophobic microparticles in the DMF design. Finally, our numerical method also provides a powerful design and optimization tool for the manipulation of microparticles in DMF systems.
Economic Consequence Analysis of Disasters: The ECAT Software Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rose, Adam; Prager, Fynn; Chen, Zhenhua
This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
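The "reduced form" step the abstract describes, fitting a single regression to synthetic data generated by many runs of a complex model, can be sketched generically. Everything below (variable names, coefficient values, the linear functional form) is an illustrative assumption standing in for the CGE runs, not the actual E-CAT equations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a batch of CGE simulations: economic loss
# as a function of two threat characteristics (duration, severity),
# plus noise representing unmodeled background conditions.
n = 200
duration = rng.uniform(1.0, 30.0, n)
severity = rng.uniform(0.1, 1.0, n)
loss = 2.0 + 0.8 * duration + 15.0 * severity + rng.normal(0.0, 0.5, n)

# Reduced-form model: one least-squares regression replaces the
# expensive full model for rapid-turnaround estimates.
X = np.column_stack([np.ones(n), duration, severity])
coef, *_ = np.linalg.lstsq(X, loss, rcond=None)

def predict(duration, severity, coef=coef):
    """Rapid approximate loss estimate from the fitted reduced form."""
    return coef[0] + coef[1] * duration + coef[2] * severity
```

Once fitted, `predict` is cheap enough to embed in a spreadsheet tool, which is exactly the design trade-off the abstract describes.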
ERIC Educational Resources Information Center
Adhikari, Manahari
2014-01-01
This essay examines how Virginia Woolf uses writing as a tool to locate sites of negations, such as women's exclusion from places of power and knowledge, and to expose negative essentializing that permeates patriarchal structure in "A Room of One's Own." Whereas scholarship on the book has explored a wide range of issues including sex,…
Fatigue crack growth and life prediction under mixed-mode loading
NASA Astrophysics Data System (ADS)
Sajith, S.; Murthy, K. S. R. K.; Robi, P. S.
2018-04-01
Fatigue crack growth life as a function of crack length is essential for the prevention of catastrophic failures from a damage tolerance perspective. In the damage tolerance design approach, principles of fracture mechanics are usually applied to predict the fatigue life of structural components. Numerical prediction of crack growth versus number of cycles is essential in damage tolerance design. For cracks under mixed mode I/II loading, the modified Paris law (da/dN = C(ΔKeq)^m) together with an equivalent stress intensity factor (ΔKeq) model is used for fatigue crack growth rate prediction. There are a large number of ΔKeq models available for mixed mode I/II loading, and the selection of a proper ΔKeq model has a significant impact on fatigue life prediction. In the present investigation, the performance of ΔKeq models in fatigue life prediction is compared against experimental findings, as there are no guidelines/suggestions available on the selection of these models for accurate and/or conservative predictions of fatigue life. Within the limitations of the available experimental data and currently available numerical simulation techniques, the results of the present study attempt to identify models that provide accurate and conservative life predictions. Such a study aids numerical analysts and engineers in the proper selection of a model for numerical simulation of fatigue life. Moreover, the present investigation also suggests a procedure to enhance the accuracy of life prediction using the Paris law.
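The life prediction described above amounts to integrating the Paris law from an initial to a final crack length, with the ΔKeq model supplied as an input. A minimal sketch follows; the particular ΔKeq used here (pure mode I with an assumed constant stress range) and all numeric values are illustrative, not one of the mixed-mode models the paper compares.

```python
import math

def fatigue_life(a0, af, C, m, delta_keq, da=1e-5):
    """Integrate the Paris law da/dN = C * (dKeq)^m from initial crack
    length a0 to final length af, returning the cycle count N.
    delta_keq(a) supplies the equivalent stress intensity range; its
    form (one of many mixed-mode I/II models) is the caller's choice."""
    N, a = 0.0, a0
    while a < af:
        dK = delta_keq(a)
        N += da / (C * dK ** m)  # cycles consumed by this crack increment
        a += da
    return N

# Illustrative dKeq: mode I center crack under an assumed constant
# stress range of 100 (consistent units with C assumed).
dKeq = lambda a, dsigma=100.0: dsigma * math.sqrt(math.pi * a)
```

For the pure mode I case the integral has a closed form, which makes a convenient check on the step size `da` before substituting a mixed-mode ΔKeq model.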
NASA Technical Reports Server (NTRS)
Zepeda, J. L.
1983-01-01
New tool measures separation between recessed parallel surfaces. Tiles have overhanging edges, so tool is designed to slip into gap from end and extend through 0.040-inch crack. Measures gaps between 0.200 and 0.400 inch so gap fillers of proper thickness can be selected. Useful in numerous industrial situations involving gap measurements in inaccessible places.
USDA-ARS?s Scientific Manuscript database
Tick-borne Babesia parasites are responsible for costly diseases worldwide. Improved control and prevention tools are urgently needed, but development of such tools is limited by numerous gaps in knowledge of the parasite-host relationships. We hereby used atomic force microscopy (AFM) and Kelvin pr...
Education Faculty Students' Views about Use of E-Books
ERIC Educational Resources Information Center
Yalman, Murat
2015-01-01
Parallel to technological developments, numerous new tools are now available for people's use. Societies adapt these tools to their professional lives by learning how to use them. In this way, they try to establish more comfortable working environments. Universities giving vocational education are supposed to teach these new technologies to their…
An Open-Access Educational Tool for Teaching Motion Dynamics in Multi-Axis Servomotor Control
ERIC Educational Resources Information Center
Rivera-Guillen, J. R.; de Jesus Rangel-Magdaleno, J.; de Jesus Romero-Troncoso, R.; Osornio-Rios, R. A.; Guevara-Gonzalez, R. G.
2012-01-01
Servomotors are widely used in computerized numerically controlled (CNC) machines, hence motion control is a major topic covered in undergraduate/graduate engineering courses. Despite the fact that several syllabi include the motion dynamics topic in their courses, there are neither suitable tools available for designing and simulating multi-axis…
ACED IT: A Tool for Improved Ethical and Moral Decision-Making
ERIC Educational Resources Information Center
Kreitler, Crystal Mata; Stenmark, Cheryl K.; Rodarte, Allen M.; Piñón DuMond, Rebecca
2014-01-01
Numerous examples of unethical organizational decision-making highlighted in the media have led many to question the general moral perception and ethical judgments of individuals. The present study examined two forms of a straightforward ethical decision-making (EDM) tool (ACED IT cognitive map) that could be a relatively simple instrument for…
ERIC Educational Resources Information Center
Santos-Trigo, Manuel; Espinosa-Perez, Hugo; Reyes-Rodriguez, Aaron
2006-01-01
Technological tools have the potential to offer students the possibility to represent information and relationships embedded in problems and concepts in ways that involve numerical, algebraic, geometric, and visual approaches. In this paper, the authors present and discuss an example in which an initial representation of a mathematical object…
Step-and-Repeat Nanoimprint-, Photo- and Laser Lithography from One Customised CNC Machine.
Greer, Andrew Im; Della-Rosa, Benoit; Khokhar, Ali Z; Gadegaard, Nikolaj
2016-12-01
The conversion of a computer numerical control machine into a nanoimprint step-and-repeat tool with additional laser- and photolithography capacity is documented here. All three processes, each demonstrated on a variety of photoresists, are performed successfully and analysed so as to enable the reader to relate their known lithography process(es) to the findings. Using the converted tool, 1 cm(2) of nanopattern may be exposed in 6 s, over 3300 times faster than the electron beam equivalent. Nanoimprint tools are commercially available, but these can cost around 1000 times more than this customised computer numerical control (CNC) machine. The converted equipment facilitates rapid production and large area micro- and nanoscale research on small grants, ultimately enabling faster and more diverse growth in this field of science. In comparison to commercial tools, this converted CNC also boasts capacity to handle larger substrates, temperature control and active force control, up to ten times more curing dose and compactness. Actual devices are fabricated using the machine including an expanded nanotopographic array and microfluidic PDMS Y-channel mixers.
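A quick throughput estimate follows from the figure quoted above (1 cm² of nanopattern exposed in 6 s). The sketch below is illustrative only: the per-step stage-move overhead `move_s` is an assumed value, not a measured property of the converted CNC tool.

```python
import math

def total_time_s(substrate_cm2, field_cm2=1.0, expose_s=6.0, move_s=2.0):
    """Number of exposure fields needed, times (exposure + assumed stage move)."""
    steps = math.ceil(substrate_cm2 / field_cm2)
    return steps * (expose_s + move_s)

wafer_cm2 = 78.5   # roughly a 100 mm wafer
print(f"{total_time_s(wafer_cm2) / 60:.1f} min for {wafer_cm2} cm^2")
```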
NASA Astrophysics Data System (ADS)
Vincent, Timothy J.; Rumpfkeil, Markus P.; Chaudhary, Anil
2018-03-01
The complex, multi-faceted physics of laser-based additive metals processing tends to demand high-fidelity models and costly simulation tools to provide predictions accurate enough to aid in selecting process parameters. Of particular difficulty is the accurate determination of melt pool shape and size, which are useful for predicting lack-of-fusion, as this typically requires an adequate treatment of thermal and fluid flow. In this article we describe a novel numerical simulation tool which aims to achieve a balance between accuracy and cost. This is accomplished by making simplifying assumptions regarding the behavior of the gas-liquid interface for processes with a moderate energy density, such as Laser Engineered Net Shaping (LENS). The details of the implementation, which is based on the solver simpleFoam of the well-known software suite OpenFOAM, are given here and the tool is verified and validated for a LENS process involving Ti-6Al-4V. The results indicate that the new tool predicts width and height of a deposited track to engineering accuracy levels.
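For a rough sense of the melt-pool dimensions such a reduced-cost tool must predict, the classical Rosenthal point-source conduction solution gives a back-of-the-envelope steady-state estimate. This is not the paper's model (which resolves fluid flow in OpenFOAM); the material values below are typical handbook numbers for Ti-6Al-4V, laser absorptivity is ignored, and the process parameters are assumed, not taken from the article.

```python
import math

def rosenthal_temp(xi, y, power_w, speed, k, alpha, t0=293.0):
    """Rosenthal thick-plate point-source solution: temperature at (xi, y)
    in the travel plane, with the source moving in the +xi direction."""
    r = math.hypot(xi, y)
    if r == 0.0:
        return float("inf")
    return t0 + power_w / (2.0 * math.pi * k * r) * math.exp(
        -speed * (r + xi) / (2.0 * alpha))

# Assumed, LENS-like parameters and handbook Ti-6Al-4V properties
P, v = 300.0, 0.01                   # laser power (W), travel speed (m/s)
k, alpha, tm = 7.0, 3.0e-6, 1923.0   # conductivity, diffusivity, melt temp (K)

# Scan a grid around/behind the source for the widest molten cross-section
width = 0.0
for i in range(-400, 51):
    xi = i * 1.0e-5                  # 10 um grid, -4.0 mm .. +0.5 mm
    for j in range(0, 201):
        y = j * 1.0e-5               # 0 .. 2.0 mm off-axis
        if rosenthal_temp(xi, y, P, v, k, alpha) >= tm:
            width = max(width, 2.0 * y)
print(f"estimated melt-pool width ~ {width * 1e3:.2f} mm")
```

The point of the exercise is the scaling: conduction-only estimates like this bound the pool size, which is why the article's simplified treatment of the gas-liquid interface can still reach engineering accuracy for moderate energy densities.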
Power, Kevin; Kirwan, Grainne; Palmer, Marion
2011-01-01
Research has indicated that use of cognitive skills training tools can produce positive benefits with older adults. However, little research has compared the efficacy of technology-based interventions and the more traditional, text-based interventions that are also available. This study aimed to investigate cognitive skills improvements experienced by 40 older adults using cognitive skills training tools. A Solomon four-group design was employed to determine which intervention demonstrated the greatest improvement. Participants were asked to use the interventions for 5-10 minutes per day, over a period of 60 days. Pre- and post-tests consisted of measures of numerical ability, self-reported memory and intelligence. Following training, older adults indicated significant improvements on numerical ability and intelligence regardless of intervention type. No improvement in self-reported memory was observed. This research provides a critical appraisal of brain training tools and can help point the way for future improvements in the area. Brain training improvements could lead to improved quality of life and, perhaps, have financial and independent-living ramifications for older adults.
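The Solomon design mentioned above crosses pretesting (yes/no) with intervention (training/control), which lets the analysis separate genuine training gains from pretest-sensitisation effects. A minimal sketch of the group assignment, assuming the study's 40 participants; group names and random assignment here are illustrative, not the authors' procedure:

```python
import random

def solomon_assign(participants, seed=0):
    """Randomly split participants across the four Solomon groups."""
    rng = random.Random(seed)
    shuffled = participants[:]
    rng.shuffle(shuffled)
    names = ["pretest+training", "pretest+control",
             "no-pretest+training", "no-pretest+control"]
    groups = {name: [] for name in names}
    for i, p in enumerate(shuffled):
        groups[names[i % 4]].append(p)   # round-robin keeps sizes balanced
    return groups

groups = solomon_assign(list(range(1, 41)))
for name, members in groups.items():
    print(name, len(members))
```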
Valdor, Paloma F; Gómez, Aina G; Velarde, Víctor; Puente, Araceli
2016-04-01
Oil spills are one of the most widespread problems in port areas (loading/unloading of bulk liquid, fuel supply). Specific environmental risk analysis procedures for diffuse oil sources that are based on the evolution of oil in the marine environment are needed. Diffuse sources such as oil spills usually present a lack of information, which makes the use of numerical models an arduous and occasionally impossible task. For that reason, a tool that can assess the risk of oil spills in near-shore areas by using Geographical Information System (GIS) is presented. The SPILL Tool provides immediate results by automating the process without miscalculation errors. The tool was developed using the Python and ArcGIS scripting library to build a non-ambiguous geoprocessing workflow. The SPILL Tool was implemented for oil facilities at Tarragona Harbor (NE Spain) and validated showing a satisfactory correspondence (around 0.60 RSR error index) with the results obtained using a 2D calibrated oil transport numerical model. Copyright © 2016 Elsevier Ltd. All rights reserved.
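A GIS risk workflow of the kind the SPILL Tool automates can be sketched without ArcGIS. The decay model, cell names, and sensitivity values below are hypothetical placeholders for illustration, not the tool's actual formulation (which works on arcpy geoprocessing layers):

```python
import math

def reach_probability(distance_m, decay=1.0 / 2000.0):
    """Assumed exponential fall-off of the chance spilled oil reaches a cell."""
    return math.exp(-decay * distance_m)

def risk_index(cells, source_xy):
    """Risk per shoreline cell = reach probability x ecological sensitivity."""
    sx, sy = source_xy
    return {name: reach_probability(math.hypot(x - sx, y - sy)) * sens
            for name, (x, y, sens) in cells.items()}

cells = {  # x (m), y (m), sensitivity in [0, 1] -- all made-up values
    "marina":     (500.0, 0.0, 0.9),
    "beach":      (2000.0, 500.0, 0.7),
    "breakwater": (4000.0, 0.0, 0.2),
}
risks = risk_index(cells, (0.0, 0.0))
for name, r in sorted(risks.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {r:.3f}")
```

Ranking cells this way, rather than running a full transport model per scenario, is what makes the approach workable when diffuse sources lack the input data a numerical model needs.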
TSOS and TSOS-FK hybrid methods for modelling the propagation of seismic waves
NASA Astrophysics Data System (ADS)
Ma, Jian; Yang, Dinghui; Tong, Ping; Ma, Xiao
2018-05-01
We develop a new time-space optimized symplectic (TSOS) method for numerically solving elastic wave equations in heterogeneous isotropic media. We use the phase-preserving symplectic partitioned Runge-Kutta method to evaluate the time derivatives and optimized explicit finite-difference (FD) schemes to discretize the space derivatives. We introduce the averaged medium scheme into the TSOS method to further increase its capability of dealing with heterogeneous media and match the boundary-modified scheme for implementing free-surface boundary conditions and the auxiliary differential equation complex frequency-shifted perfectly matched layer (ADE CFS-PML) non-reflecting boundaries with the TSOS method. A comparison of the TSOS method with analytical solutions and standard FD schemes indicates that the waveform generated by the TSOS method is more similar to the analytic solution and has a smaller error than other FD methods, which illustrates the efficiency and accuracy of the TSOS method. Subsequently, we focus on the calculation of synthetic seismograms for teleseismic P- or S-waves entering and propagating in the local heterogeneous region of interest. To improve the computational efficiency, we successfully combine the TSOS method with the frequency-wavenumber (FK) method and apply the ADE CFS-PML to absorb the scattered waves caused by the regional heterogeneity. The TSOS-FK hybrid method is benchmarked against semi-analytical solutions provided by the FK method for a 1-D layered model. Several numerical experiments, including a vertical cross-section of the Chinese capital area crustal model, illustrate that the TSOS-FK hybrid method works well for modelling waves propagating in complex heterogeneous media and remains stable for long-time computation. These numerical examples also show that the TSOS-FK method can tackle the converted and scattered waves of the teleseismic plane waves caused by local heterogeneity. 
Thus, the TSOS and TSOS-FK methods proposed in this study present an essential tool for the joint inversion of local, regional, and teleseismic waveform data.
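As a minimal illustration of the finite-difference time stepping discussed above, here is a 1-D leapfrog scheme for the wave equation u_tt = c²u_xx on a string with fixed ends. This is only a low-order analogue: the actual TSOS method uses optimised spatial stencils and a phase-preserving symplectic partitioned Runge-Kutta update in 2-D/3-D heterogeneous media.

```python
import math

def step_wave(u_prev, u_curr, courant2):
    """One leapfrog step of u_tt = c^2 u_xx with fixed (u = 0) ends."""
    n = len(u_curr)
    u_next = [0.0] * n
    for i in range(1, n - 1):
        u_next[i] = (2.0 * u_curr[i] - u_prev[i]
                     + courant2 * (u_curr[i + 1] - 2.0 * u_curr[i] + u_curr[i - 1]))
    return u_next

n, c, dx, dt = 101, 1.0, 0.01, 0.005     # CFL number c*dt/dx = 0.5: stable
courant2 = (c * dt / dx) ** 2
u0 = [math.exp(-(((i * dx) - 0.5) / 0.05) ** 2) for i in range(n)]
u_prev, u_curr = u0[:], u0[:]            # zero initial velocity (first-order start)
for _ in range(60):
    u_prev, u_curr = u_curr, step_wave(u_prev, u_curr, courant2)
# The Gaussian pulse splits into two counter-propagating ~half-amplitude waves
print(f"peak after 60 steps: {max(u_curr):.2f}")
```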
Sugawara, Kotaro; Yamashita, Hiroharu; Uemura, Yukari; Mitsui, Takashi; Yagi, Koichi; Nishida, Masato; Aikou, Susumu; Mori, Kazuhiko; Nomura, Sachiyo; Seto, Yasuyuki
2017-10-01
The current eighth tumor node metastasis lymph node category pathologic lymph node staging system for esophageal squamous cell carcinoma is based solely on the number of metastatic nodes and does not consider anatomic distribution. We aimed to assess the prognostic capability of the eighth tumor node metastasis pathologic lymph node staging system (numeric-based) compared with the 11th Japan Esophageal Society (topography-based) pathologic lymph node staging system in patients with esophageal squamous cell carcinoma. We retrospectively reviewed the clinical records of 289 patients with esophageal squamous cell carcinoma who underwent esophagectomy with extended lymph node dissection during the period from January 2006 through June 2016. We compared discrimination abilities for overall survival, recurrence-free survival, and cancer-specific survival between these 2 staging systems using C-statistics. The median number of dissected and metastatic nodes was 61 (25% to 75% quartile range, 45 to 79) and 1 (25% to 75% quartile range, 0 to 3), respectively. The eighth tumor node metastasis pathologic lymph node staging system had a greater ability to accurately determine overall survival (C-statistics: tumor node metastasis classification, 0.69, 95% confidence interval, 0.62-0.76; Japan Esophageal Society classification, 0.65, 95% confidence interval, 0.58-0.71; P = .014) and cancer-specific survival (C-statistics: tumor node metastasis classification, 0.78, 95% confidence interval, 0.70-0.87; Japan Esophageal Society classification, 0.72, 95% confidence interval, 0.64-0.80; P = .018). Rates of total recurrence rose as the eighth tumor node metastasis pathologic lymph node stage increased, while stratification of patients according to the topography-based node classification system was not feasible. 
Numeric nodal staging is an essential tool for stratifying the oncologic outcomes of patients with esophageal squamous cell carcinoma even in the cohort in which adequate numbers of lymph nodes were harvested. Copyright © 2017 Elsevier Inc. All rights reserved.
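The C-statistic used above to compare the two staging systems is Harrell's concordance index: among usable patient pairs, the fraction where the patient with the higher risk (here, nodal stage) has the shorter observed survival. A minimal sketch on toy data (all values invented):

```python
import itertools

def c_statistic(times, events, risk_scores):
    """Harrell's C: fraction of usable pairs in which the higher-risk
    patient has the shorter observed survival time (ties count half)."""
    concordant = permissible = 0.0
    for (t1, e1, r1), (t2, e2, r2) in itertools.combinations(
            zip(times, events, risk_scores), 2):
        # A pair is usable only if the shorter time ends in an event
        if t1 == t2:
            continue
        if t1 < t2 and not e1:
            continue
        if t2 < t1 and not e2:
            continue
        permissible += 1
        shorter_risk, longer_risk = (r1, r2) if t1 < t2 else (r2, r1)
        if shorter_risk > longer_risk:
            concordant += 1.0
        elif shorter_risk == longer_risk:
            concordant += 0.5
    return concordant / permissible

# Toy cohort: survival (months), event indicator (1 = death), nodal-stage score
times = [6, 12, 24, 36, 60]
events = [1, 1, 1, 0, 0]
stage = [4, 3, 2, 1, 1]
print(round(c_statistic(times, events, stage), 2))  # -> 1.0 (perfectly concordant)
```

A C of 0.5 is chance-level discrimination and 1.0 is perfect ranking, which is why the 0.69 vs 0.65 gap reported above, though small, favours the numeric system.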
NASA Astrophysics Data System (ADS)
Nagaso, Masaru; Komatitsch, Dimitri; Moysan, Joseph; Lhuillier, Christian
2018-01-01
The ASTRID project, a French fourth-generation sodium-cooled nuclear reactor, is currently under development by the French Alternative Energies and Atomic Energy Commission (CEA). Within this project, the development of techniques for monitoring the reactor during operation has been identified as a major issue for improving plant safety. Because liquid sodium is opaque, ultrasonic measurement techniques (e.g., thermometry, visualization of internal objects) are regarded as powerful inspection tools for sodium-cooled fast reactors (SFR) such as ASTRID. Inside a sodium cooling circuit, the complex flow state, especially during operation, makes the medium heterogeneous, and the effect of this heterogeneity on acoustic propagation is not negligible. Verification experiments are therefore needed to develop the component technologies, but experiments using liquid sodium tend to be relatively large in scale. This is why numerical simulation is essential, both to precede real experiments and to complement the limited number of experimental results. Although various numerical methods have been applied to wave propagation in liquid sodium, none has yet been verified against three-dimensional heterogeneity. Moreover, the interior of a reactor core is a complex coupled acousto-elastic region that has been difficult to simulate with conventional methods. The objective of this study is to address these two points by applying a three-dimensional spectral-element method. In this paper, our initial results on three-dimensional simulation of a heterogeneous medium (the first point) are presented. To represent the heterogeneity of the liquid sodium, a four-dimensional temperature field (three spatial dimensions and one temporal dimension) calculated by computational fluid dynamics (CFD) with large-eddy simulation was used instead of the conventional Gaussian random field. 
This three-dimensional numerical experiment verifies the effects of the heterogeneity of the propagation medium on waves in liquid sodium.
A computer controlled power tool for the servicing of the Hubble Space Telescope
NASA Technical Reports Server (NTRS)
Richards, Paul W.; Konkel, Carl; Smith, Chris; Brown, Lee; Wagner, Ken
1996-01-01
The Hubble Space Telescope (HST) Pistol Grip Tool (PGT) is a self-contained, microprocessor-controlled, battery-powered, 3/8-inch-drive hand-held tool. The PGT also functions as a non-powered ratchet wrench. This tool will be used by astronauts during Extravehicular Activity (EVA) to apply torque to the HST and HST Servicing Support Equipment mechanical interfaces and fasteners. Numerous torque, speed, and turn or angle limits are programmed into the PGT for use during various missions. Batteries are replaceable during ground operations, Intravehicular Activities, and EVAs.
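The programmed limits described above amount to a guarded control loop: drive the fastener until either the torque limit or the turn limit trips. The sketch below is a hypothetical simulation of that logic; the profile fields, numeric values, and linear torque-per-turn joint model are all illustrative assumptions, not PGT flight parameters.

```python
from dataclasses import dataclass

@dataclass
class Profile:
    torque_limit_nm: float   # stop driving once this torque is reached
    max_turns: float         # stop after this many output turns
    speed_rpm: float

def run_fastener(profile, torque_per_turn_nm):
    """Drive until either the torque limit or the turn limit trips."""
    turns = torque = 0.0
    step = 0.05                      # simulate in 1/20-turn increments
    while turns < profile.max_turns and torque < profile.torque_limit_nm:
        turns += step
        torque = torque_per_turn_nm * turns   # assumed linear joint stiffness
    reason = "torque" if torque >= profile.torque_limit_nm else "turns"
    return round(turns, 2), round(torque, 2), reason

p = Profile(torque_limit_nm=16.0, max_turns=10.0, speed_rpm=30.0)
print(run_fastener(p, torque_per_turn_nm=4.0))   # stiff joint: torque limit trips
print(run_fastener(p, torque_per_turn_nm=1.0))   # free-running: turn limit trips
```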
Turkmenoglu, Fatma Pinar; Agar, Osman Tuncay; Akaydin, Galip; Hayran, Mutlu; Demirci, Betul
2015-06-22
According to the distribution of the genus Achillea, two main centers of diversity occur in southeastern Europe and southwestern Asia. Diverse essential oil compositions from the Balkan Peninsula have been reported numerous times. However, reports on the essential oils of Achillea species growing in Turkey, one of the main centers of diversity, are very limited. This paper presents the chemical compositions of the essential oils obtained by hydrodistillation from the aerial parts of eleven Achillea species, identified simultaneously by gas chromatography and gas chromatography-mass spectrometry. The main components were found to be 1,8-cineole, p-cymene, viridiflorol, nonacosane, α-bisabolol, caryophyllene oxide, α-bisabolon oxide A, β-eudesmol, 15-hexadecanolide and camphor. A principal component analysis based on thirty identified compounds revealed three species groups and a subgroup, each constituting a chemotype. This is the first report on the chemical composition of the A. hamzaoglui essential oil, as well as the first antioxidant and antimicrobial evaluation of its essential oil and methanolic extract.
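Grouping species into chemotypes by principal component analysis, as above, amounts to projecting each composition onto the leading eigenvector of the compound covariance matrix and clustering the scores. A minimal two-compound sketch (the study used thirty compounds; species labels and percentage values below are fictitious):

```python
import math

def pca_leading_axis(samples):
    """Leading eigenvector of the 2x2 covariance matrix, in closed form."""
    n = len(samples)
    mx = sum(x for x, _ in samples) / n
    my = sum(y for _, y in samples) / n
    sxx = sum((x - mx) ** 2 for x, _ in samples) / n
    syy = sum((y - my) ** 2 for _, y in samples) / n
    sxy = sum((x - mx) * (y - my) for x, y in samples) / n
    lam = 0.5 * (sxx + syy + math.hypot(sxx - syy, 2.0 * sxy))  # largest eigenvalue
    if abs(sxy) > 1e-12:
        vx, vy = sxy, lam - sxx
    else:
        vx, vy = (1.0, 0.0) if sxx >= syy else (0.0, 1.0)
    norm = math.hypot(vx, vy)
    return (vx / norm, vy / norm), (mx, my)

# Fictitious % content of two marker compounds (e.g. 1,8-cineole, camphor)
species = {"A": (45.0, 5.0), "B": (40.0, 8.0), "C": (10.0, 30.0), "D": (12.0, 28.0)}
(vx, vy), (mx, my) = pca_leading_axis(list(species.values()))
scores = {name: (x - mx) * vx + (y - my) * vy for name, (x, y) in species.items()}
for name, s in scores.items():
    print(name, "chemotype-1" if s > 0 else "chemotype-2")
```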
Xiao, Zuobing; Liu, Wanlong; Zhu, Guangyong; Zhou, Rujun; Niu, Yunwei
2014-06-01
This paper briefly introduces the preparation and application of flavour and essential oils microcapsules based on complex coacervation technology. The conventional encapsulating agents of oppositely charged proteins and polysaccharides that are used for microencapsulation of flavours and essential oils are reviewed along with the recent advances in complex coacervation methods. Proteins extracted from animal-derived products (gelatin, whey proteins, silk fibroin) and from vegetables (soy proteins, pea proteins), and polysaccharides such as gum Arabic, pectin, chitosan, agar, alginate, carrageenan and sodium carboxymethyl cellulose are described in depth. In recent decades, flavour and essential oils microcapsules have found numerous potential practical applications in food, textiles, agriculturals and pharmaceuticals. In this paper, the different coating materials and their application are discussed in detail. Consequently, the information obtained allows criteria to be established for selecting a method for the preparation of microcapsules according to their advantages, limitations and behaviours as carriers of flavours and essential oils. © 2013 Society of Chemical Industry.
Tenhaven, Christoph; Tipold, Andrea; Fischer, Martin R; Ehlers, Jan P
2013-01-01
Informal and formal lifelong learning is essential at university and in the workplace. Apart from classical learning techniques, Web 2.0 tools can be used. Whether a so-called net generation exists among people under 30 is controversial. The aim was to test the hypothesis that a net generation exists among students and young veterinarians. An online survey of students and veterinarians was conducted in the German-speaking countries, advertised via online media and traditional print media. 1780 people took part in the survey. Students and veterinarians showed different usage patterns for social networks (91.9% vs. 69%) and instant messaging (55.9% vs. 24.5%). All tools were used predominantly passively and for private purposes, and to a lesser extent professionally and for studying. Web 2.0 tools are useful, but teaching information and media skills, preparing codes of conduct for the internet, and verifying user-generated content are essential.
Maintenance Audit through Value Analysis Technique: A Case Study
NASA Astrophysics Data System (ADS)
Carnero, M. C.; Delgado, S.
2008-11-01
The increase in competitiveness, technological changes and the increase in the requirements of quality and service have forced a change in the design and application of maintenance, as well as the way in which it is considered within the managerial strategy. There are numerous maintenance activities that must be developed in a service company. As a result the maintenance functions as a whole have to be outsourced. Nevertheless, delegating this subject to specialized personnel does not exempt the company from responsibilities, but rather leads to the need for control of each maintenance activity. In order to achieve this control and to evaluate the efficiency and effectiveness of the company it is essential to carry out an audit that diagnoses the problems that could develop. In this paper a maintenance audit applied to a service company is developed. The methodology applied is based on the expert systems. The expert system by means of rules uses the weighting technique SMART and value analysis to obtain the weighting between the decision functions and between the alternatives. The expert system applies numerous rules and relations between different variables associated with the specific maintenance functions, to obtain the maintenance state by sections and the general maintenance state of the enterprise. The contributions of this paper are related to the development of a maintenance audit in a service enterprise, in which maintenance is not generally considered a strategic subject and to the integration of decision-making tools such as the weighting technique SMART with value analysis techniques, typical in the design of new products, in the area of the rule-based expert systems.
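The SMART weighting step described above reduces to assigning raw importance points to the decision criteria, normalising them to weights, and scoring each audited section as a weighted sum. A minimal sketch; the criteria names, point allocations, and 0-10 ratings below are invented for illustration, not taken from the case study.

```python
def smart_weights(raw_points):
    """SMART: assign raw importance points to criteria, then normalise to sum 1."""
    total = sum(raw_points.values())
    return {k: v / total for k, v in raw_points.items()}

def audit_score(weights, ratings):
    """Weighted sum of 0-10 ratings for each maintenance function."""
    return sum(weights[k] * ratings[k] for k in weights)

criteria = {"preventive": 40, "corrective": 25, "spares": 20, "documentation": 15}
w = smart_weights(criteria)
sections = {
    "workshop": {"preventive": 7, "corrective": 6, "spares": 5, "documentation": 4},
    "fleet":    {"preventive": 4, "corrective": 8, "spares": 6, "documentation": 5},
}
for name, ratings in sections.items():
    print(name, round(audit_score(w, ratings), 2))
```

In the rule-based expert system, scores like these per section would then feed the rules that aggregate a maintenance state for the whole enterprise.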
Solving the three-body Coulomb breakup problem using exterior complex scaling
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCurdy, C.W.; Baertschy, M.; Rescigno, T.N.
2004-05-17
Electron-impact ionization of the hydrogen atom is the prototypical three-body Coulomb breakup problem in quantum mechanics. The combination of subtle correlation effects and the difficult boundary conditions required to describe two electrons in the continuum has made this one of the outstanding challenges of atomic physics. A complete solution of this problem in the form of a "reduction to computation" of all aspects of the physics is given by the application of exterior complex scaling, a modern variant of the mathematical tool of analytic continuation of the electronic coordinates into the complex plane that was used historically to establish the formal analytic properties of the scattering matrix. This review first discusses the essential difficulties of the three-body Coulomb breakup problem in quantum mechanics. It then describes the formal basis of exterior complex scaling of electronic coordinates as well as the details of its numerical implementation using a variety of methods including finite difference, finite elements, discrete variable representations, and B-splines. Given these numerical implementations of exterior complex scaling, the scattering wave function can be generated with arbitrary accuracy on any finite volume in the space of electronic coordinates, but there remains the fundamental problem of extracting the breakup amplitudes from it. Methods are described for evaluating these amplitudes. The question of the volume-dependent overall phase that appears in the formal theory of ionization is resolved. A summary is presented of accurate results that have been obtained for the case of electron-impact ionization of hydrogen as well as a discussion of applications to the double photoionization of helium.
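The exterior complex scaling transformation referred to above takes a compact form: the electronic radial coordinates are left real inside a radius $R_0$ and rotated into the complex plane beyond it,

```latex
r \;\longrightarrow\;
\begin{cases}
  r, & r \le R_0, \\[4pt]
  R_0 + (r - R_0)\, e^{i\theta}, & r > R_0,
\end{cases}
```

so that purely outgoing waves $e^{ikr}$ become exponentially damped for $r > R_0$ and the troublesome two-electron scattering boundary conditions reduce to square-integrability on the scaled coordinates.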
Singh, Ajai R; Singh, Shakuntala A
2004-01-01
A lot of Indian research is replicative in nature. This is because originality is at a premium here and mediocrity is in great demand. But replication has its merit as well, because it helps in corroboration. And that is the bedrock on which many a fancied scientific hypothesis or theory stands, or falls. However, to go from replicative to original research will involve a massive effort to restructure the Indian psyche and an all-round effort from numerous quarters. The second part of this paper deals with the essence of scientific temper, which need not have any basic friendship, or animosity, with religion, faith, superstition and other such entities. A true scientist follows two cardinal rules. He is never unwilling to accept the worth of evidence, howsoever damning to the most favourite of his theories. Second, and perhaps more important, for want of evidence, he withholds comment. He says neither yes nor no. Where Science will ultimately lead Man is the third part of this essay. One argument is that the conflict between Man and Science will continue till either of them is exhausted or wiped out. The other believes that it is Science which has to be harnessed for Man and not Man used for Science. And with the numerous checks and balances in place, Science will remain an effective tool for man's progress. The essential value-neutrality of Science will have to be supplemented by the values that man has upheld for centuries as fundamental, and which religious thought and moral philosophy have continuously professed.