Active Inference and Learning in the Cerebellum.
Friston, Karl; Herreros, Ivan
2016-09-01
This letter offers a computational account of Pavlovian conditioning in the cerebellum based on active inference and predictive coding. Using eyeblink conditioning as a canonical paradigm, we formulate a minimal generative model that can account for spontaneous blinking, startle responses, and (delay or trace) conditioning. We then establish the face validity of the model using simulated responses to unconditioned and conditioned stimuli to reproduce the sorts of behavior that are observed empirically. The scheme's anatomical validity is then addressed by associating variables in the predictive coding scheme with nuclei and neuronal populations to match the (extrinsic and intrinsic) connectivity of the cerebellar (eyeblink conditioning) system. Finally, we try to establish predictive validity by reproducing selective failures of delay conditioning, trace conditioning, and extinction using (simulated and reversible) focal lesions. Although rather metaphorical, the ensuing scheme can account for a remarkable range of anatomical and neurophysiological aspects of cerebellar circuitry, and the specificity of lesion-deficit mappings that have been established experimentally. From a computational perspective, this work shows how conditioning or learning can be formulated in terms of minimizing variational free energy (or maximizing Bayesian model evidence) using exactly the same principles that underlie predictive coding in perception.
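As a hedged illustration of the final computational claim, the following is a generic single-level predictive-coding sketch under Gaussian assumptions: the free energy F and the gradient descent of a hidden state μ on precision-weighted prediction errors. The symbols (y, g, μ, η, σ) are standard textbook notation, not the letter's actual generative model for eyeblink conditioning.

```latex
% Generic single-level predictive-coding sketch (illustrative; not the letter's generative model):
% sensory datum y, hidden state \mu, prediction g(\mu), prior mean \eta, variances \sigma_y^2, \sigma_\mu^2.
\begin{align}
F &= \frac{(y - g(\mu))^2}{2\sigma_y^2} + \frac{(\mu - \eta)^2}{2\sigma_\mu^2}
     + \tfrac{1}{2}\ln\!\left(\sigma_y^2 \sigma_\mu^2\right) + \mathrm{const}, \\
\dot{\mu} &= -\frac{\partial F}{\partial \mu}
           = \frac{y - g(\mu)}{\sigma_y^2}\, g'(\mu) - \frac{\mu - \eta}{\sigma_\mu^2}.
\end{align}
```

Conditioning then amounts to updating model parameters by descending the same free energy, which is the sense in which learning and perception share one principle.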
Simulation of a main steam line break with steam generator tube rupture using trace
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallardo, S.; Querol, A.; Verdu, G.
A simulation of the OECD/NEA ROSA-2 Project Test 5 was made with the thermal-hydraulic code TRACE5. Test 5, performed in the Large Scale Test Facility (LSTF), reproduced a Main Steam Line Break (MSLB) with a Steam Generator Tube Rupture (SGTR) in a Pressurized Water Reactor (PWR). The result of these simultaneous breaks is a depressurization of the secondary and primary systems in loop B, because both systems are connected through the SGTR. Good agreement was obtained between TRACE5 results and experimental data. TRACE5 qualitatively reproduces the phenomena that occur in this transient: the primary pressure falls after the break, the pressure stagnates after the opening of the relief valve of the intact steam generator, the pressure falls after the two openings of the PORV, and the liquid level in the pressurizer recovers after each closure of the PORV. Furthermore, a sensitivity analysis has been performed to determine the effect of varying the High Pressure Injection (HPI) flow rate in both loops on the evolution of the system pressures. (authors)
Full wave simulations of helicon wave losses in the scrape-off-layer of the DIII-D tokamak
NASA Astrophysics Data System (ADS)
Lau, Cornwall; Jaeger, Fred; Berry, Lee; Bertelli, Nicola; Pinsker, Robert
2017-10-01
Helicon waves have recently been proposed as an off-axis current drive actuator for DIII-D. Previous modeling using the hot plasma, full wave code AORSA has shown good agreement with the ray tracing code GENRAY for helicon wave propagation and absorption in the core plasma. AORSA and a new, reduced finite-element model show that discrepancies between ray tracing and full wave calculations occur in the scrape-off-layer (SOL), especially at high densities. The reduced model is much faster than AORSA and reproduces most of the important features of the AORSA model. The reduced model also allows for larger parametric scans and for the easy use of arbitrary tokamak geometry. Results from the full wave codes AORSA and COMSOL are shown for helicon wave losses in the SOL over a large range of parameters, such as SOL density profiles, n||, radial and vertical locations of the antenna, and different tokamak vessel geometries. This work was supported by DE-AC05-00OR22725, DE-AC02-09CH11466, and DE-FC02-04ER54698.
ASTRORAY: General relativistic polarized radiative transfer code
NASA Astrophysics Data System (ADS)
Shcherbakov, Roman V.
2014-07-01
ASTRORAY employs a method of ray tracing and performs polarized radiative transfer of (cyclo-)synchrotron radiation. The radiative transfer is conducted in curved space-time near rotating black holes described by the Kerr-Schild metric. Three-dimensional general relativistic magnetohydrodynamic (3D GRMHD) simulations, in particular those performed with variations of the HARM code, serve as an input to ASTRORAY. The code has been applied to reproduce the sub-mm synchrotron bump in the spectrum of Sgr A*, and to test the detectability of quasi-periodic oscillations in its light curve. ASTRORAY can be readily applied to model radio/sub-mm polarized spectra of jets and cores of other low-luminosity active galactic nuclei. For example, ASTRORAY is uniquely suitable to self-consistently model Faraday rotation measure and circular polarization fraction in jets.
NASA Astrophysics Data System (ADS)
Assari, Amin; Mohammadi, Zargham
2017-09-01
Karst systems show high spatial variability of hydraulic parameters over small distances, and this makes their modeling a difficult task with several uncertainties. Interconnections of fractures play a major role in the transport of groundwater, but many of the stochastic methods in use are not able to reproduce these complex structures. A methodology is presented for the quantification of tortuosity using the single normal equation simulation (SNESIM) algorithm and a groundwater flow model. A training image was produced based on the statistical parameters of fractures and then used in the simulation process. The SNESIM algorithm was used to generate 75 realizations of the four classes of fractures in a karst aquifer in Iran. The results from six dye tracing tests were used to assign hydraulic conductivity values to each class of fractures. In the next step, the MODFLOW-CFP and MODPATH codes were consecutively implemented to compute the groundwater flow paths. The 9,000 flow paths obtained from the MODPATH code were further analyzed to calculate the tortuosity factor. Finally, the hydraulic conductivity values calculated from the dye tracing experiments were refined using the actual flow paths of groundwater. The key outcomes of this research are: (1) a methodology for the quantification of tortuosity; (2) hydraulic conductivities that are incorrectly estimated (biased low) with empirical equations that assume Darcian (laminar) flow with parallel rather than tortuous streamlines; and (3) an understanding of the scale dependence and non-normal distributions of tortuosity.
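To make the tortuosity factor concrete, here is a minimal sketch assuming the common definition of tortuosity as traveled path length divided by straight-line length (the authors' exact formulation may differ), computed from the vertices of a MODPATH-style pathline:

```python
import numpy as np

def tortuosity(path_xyz):
    """Tortuosity factor of one flow path: traveled length / straight-line length.

    path_xyz : (N, 3) array of particle positions along the path
    (e.g., vertices of a MODPATH-style pathline).
    """
    path_xyz = np.asarray(path_xyz, dtype=float)
    step_lengths = np.linalg.norm(np.diff(path_xyz, axis=0), axis=1)
    straight = np.linalg.norm(path_xyz[-1] - path_xyz[0])
    return step_lengths.sum() / straight

# Illustrative use on a synthetic, slightly meandering path:
t = np.linspace(0.0, 100.0, 200)
path = np.column_stack([t, 5.0 * np.sin(t / 10.0), np.zeros_like(t)])
print(f"tortuosity factor = {tortuosity(path):.3f}")   # > 1 for any non-straight path
```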
MALBEC: a new CUDA-C ray-tracer in general relativity
NASA Astrophysics Data System (ADS)
Quiroga, G. D.
2018-06-01
A new CUDA-C code for tracing orbits around uncharged black holes is presented. This code, named MALBEC, takes advantage of graphics processing units and the CUDA platform for tracking null and timelike test particles in the Schwarzschild and Kerr metrics. Also, a new general set of equations that describe the closed circular orbits of any timelike test particle in the equatorial plane is derived. These equations are essential for comparing the analytical behavior of the orbits with the numerical results and verifying the correct implementation of the Runge-Kutta algorithm in MALBEC. Finally, other numerical tests are performed, demonstrating that MALBEC is able to reproduce some well-known results in these metrics in a faster and more efficient way than a conventional CPU implementation.
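For context on the kind of analytical check described above, the sketch below uses the standard textbook expression for the coordinate angular velocity of circular equatorial geodesics in Kerr, Ω = ±√M/(r^{3/2} ± a√M). This is a well-known relation offered for illustration only; it is not the new set of equations derived in the paper.

```python
import numpy as np

def kerr_circular_omega(r, M=1.0, a=0.0, prograde=True):
    """Coordinate angular velocity dphi/dt of a circular equatorial geodesic in Kerr
    (Boyer-Lindquist radius r, geometrized units G = c = 1).
    Standard result: Omega = +/- sqrt(M) / (r**1.5 +/- a*sqrt(M))."""
    s = 1.0 if prograde else -1.0
    return s * np.sqrt(M) / (r**1.5 + s * a * np.sqrt(M))

# Example check one might run against a numerical integrator (Schwarzschild limit, r = 6M):
omega = kerr_circular_omega(6.0, M=1.0, a=0.0)
print(f"Omega = {omega:.6f},  coordinate period T = {2 * np.pi / omega:.3f} M")
```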
Cosmic microwave background reconstruction from WMAP and Planck PR2 data
NASA Astrophysics Data System (ADS)
Bobin, J.; Sureau, F.; Starck, J.-L.
2016-06-01
We describe a new estimate of the cosmic microwave background (CMB) intensity map reconstructed by a joint analysis of the full Planck 2015 data (PR2) and nine years of WMAP data. The proposed map provides more than a mere update of the CMB map introduced in a previous paper since it benefits from an improvement of the component separation method L-GMCA (Local-Generalized Morphological Component Analysis), which facilitates efficient separation of correlated components. Based on the most recent CMB data, we further confirm previous results showing that the proposed CMB map estimate exhibits appealing characteristics for astrophysical and cosmological applications: I) it is a full-sky map as it did not require any inpainting or interpolation postprocessing; II) foreground contamination is very low even on the galactic center; and III) the map does not exhibit any detectable trace of thermal Sunyaev-Zel'dovich contamination. We show that its power spectrum is in good agreement with the Planck PR2 official theoretical best-fit power spectrum. Finally, following the principle of reproducible research, we provide the codes to reproduce the L-GMCA, which makes it the only reproducible CMB map. The reconstructed CMB map and the code are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/591/A50
Transport of Energetic Ions in the Ring Current During Geomagnetic Storms
NASA Technical Reports Server (NTRS)
Kistler, Lynn M.; Kaufmann, Richard
2001-01-01
In the final year (plus no-cost extensions) of this grant, we have: used the particle tracing code to perform a systematic study of the expected energy spectra over the full range of local times in the ring current using a variety of electric and magnetic field models; shown that the Weimer electric field is superior to the Volland-Stern electric field in reproducing the observed energy spectra on the AMPTE CCE spacecraft; and redone our analysis of the pitch angle spectra of energetic ions during storms in the magnetosphere, using a larger data set and a more reliable classification technique.
SolTrace | Concentrating Solar Power | NREL
SolTrace is an NREL software tool for concentrating solar power applications that uses a Monte-Carlo ray-tracing methodology. It is available as an NREL packaged distribution or from source code at the SolTrace open source project website.
Benchmarking contactless acquisition sensor reproducibility for latent fingerprint trace evidence
NASA Astrophysics Data System (ADS)
Hildebrandt, Mario; Dittmann, Jana
2015-03-01
Optical, nanometer-range, contactless, non-destructive sensor devices are promising acquisition techniques in crime scene trace forensics, e.g. for digitizing latent fingerprint traces. Before new approaches are introduced in crime investigations, innovations need to be positively tested and quality ensured. In this paper we investigate sensor reproducibility by studying different scans from four sensors: two chromatic white light sensors (CWL600/CWL1mm), one confocal laser scanning microscope, and one NIR/VIS/UV reflection spectrometer. First, we perform intra-sensor reproducibility testing for the CWL600 with a privacy-conform test set of artificial-sweat-printed, computer-generated fingerprints. We use 24 different fingerprint patterns as original samples (printing samples/templates) for printing with artificial sweat (physical trace samples) and their acquisition with the contactless sensor, resulting in 96 sensor images, called scans or acquired samples. The second test set, for inter-sensor reproducibility assessment, consists of the first three patterns from the first test set, acquired in two consecutive scans using each device. We suggest using a simple feature set in the spatial and frequency domains, known from signal processing, and test its suitability for six different classifiers that classify scan data into small differences (reproducible) and large differences (non-reproducible). Furthermore, we suggest comparing the classification results with biometric verification scores (calculated with NBIS, with a threshold of 40) as a biometric reproducibility score. The Bagging classifier is the most reliable classifier in nearly all cases in our experiments, and the results are also confirmed by the biometric matching rates.
Comparing TID simulations using 3-D ray tracing and mirror reflection
NASA Astrophysics Data System (ADS)
Huang, X.; Reinisch, B. W.; Sales, G. S.; Paznukhov, V. V.; Galkin, I. A.
2016-04-01
Measuring the time variations of Doppler frequencies and angles of arrival (AoA) of ionospherically reflected HF waves has been proposed as a means of detecting the occurrence of traveling ionospheric disturbances (TIDs). Simulations are made using ray tracing through the International Reference Ionosphere (IRI) electron density model in an effort to reproduce measured signatures. The TID is represented by a wavelike perturbation of the 3-D electron density traveling horizontally in the ionosphere with an amplitude that varies sinusoidally with time. By judiciously selecting the TID parameters, the ray tracing simulation reproduces the observed Doppler frequencies and AoAs. Ray tracing in a realistic 3-D ionosphere is, however, excessively time-consuming, considering the homing procedures involved. It is shown that a carefully selected corrugated reflecting mirror can reproduce the time variations of the AoA and Doppler frequency. The results from the ray tracing through the IRI model ionosphere and the mirror model reflections are compared to assess the applicability of the mirror-reflection model.
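A minimal sketch of the mirror-reflection idea, assuming vertical incidence and small mirror slopes (the parameter values are illustrative, not those used in the study): the TID is modeled as a moving corrugation of the reflecting surface, the Doppler shift follows the two-way path-length change rate, and the AoA tilt is roughly twice the local mirror slope.

```python
import numpy as np

# Corrugated-mirror caricature of a TID signature (vertical incidence, small-slope approximations).
f_carrier = 5e6                               # HF sounding frequency [Hz]
c = 3e8
lam = c / f_carrier                           # wavelength [m]
A, wavelength_tid, v_tid = 2e3, 200e3, 150.0  # corrugation amplitude [m], TID wavelength [m], speed [m/s]
k = 2 * np.pi / wavelength_tid

t = np.linspace(0.0, 3600.0, 2000)            # one hour of "observations"
x = 0.0                                       # sub-reflection point
h = A * np.sin(k * (x - v_tid * t))           # mirror height perturbation
dh_dt = -A * k * v_tid * np.cos(k * (x - v_tid * t))
dh_dx = A * k * np.cos(k * (x - v_tid * t))

doppler = 2.0 * dh_dt / lam                        # positive Doppler when the reflector rises (two-way path shortens)
aoa_tilt_deg = np.degrees(2.0 * np.arctan(dh_dx))  # reflected-ray tilt ~ twice the mirror slope
print(f"peak Doppler ~ {np.max(np.abs(doppler)):.2f} Hz, peak AoA tilt ~ {np.max(np.abs(aoa_tilt_deg)):.2f} deg")
```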
On the Information Content of Program Traces
NASA Technical Reports Server (NTRS)
Frumkin, Michael; Hood, Robert; Yan, Jerry; Saini, Subhash (Technical Monitor)
1998-01-01
Program traces are used for analysis of program performance, memory utilization, and communications, as well as for program debugging. The trace contains records of execution events generated by monitoring units inserted into the program. The trace size limits the resolution of execution events and restricts the user's ability to analyze the program execution. We present a study of the information content of program traces and develop a coding scheme which reduces the trace size to the limit given by the trace entropy. We apply the coding to the traces of AIMS-instrumented programs executed on the IBM SP2 and the SGI Power Challenge and compare it with other coding methods. Our technique shows that the size of the trace can be reduced by more than a factor of 5.
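As a rough illustration of the entropy limit referred to above (not the paper's actual coding scheme), the sketch below computes the zeroth-order empirical entropy of a synthetic event trace and compares it with a fixed-width encoding of the event types:

```python
import math
from collections import Counter

def trace_entropy_bits(events):
    """Empirical (zeroth-order) entropy of a sequence of trace event records, in bits/event."""
    counts = Counter(events)
    n = len(events)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Illustrative trace: a loop body dominates, so the entropy is well below the fixed-width cost.
events = ["send", "recv", "compute", "compute", "compute", "compute", "send", "recv"] * 1000
h = trace_entropy_bits(events)
fixed = math.ceil(math.log2(len(set(events))))   # bits/event for a fixed-width code
print(f"entropy bound: {h:.2f} bits/event vs fixed-width: {fixed} bits/event "
      f"(compression limit ~ {fixed / h:.1f}x)")
```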
Automated Diversity in Computer Systems
2005-09-01
traces that started with trace heads, namely backwards-taken branches. These branches are indicative of loops within the program, and Dynamo assumes that ... would be the ones the program would normally take. Therefore when a trace head became hot (was visited enough times), only a single code trace would ... all encountered trace heads. When an interesting instruction is being emulated, the tracing code checks to see if it has been encountered before
[The reproducibility of multifocal ERG recordings].
Meigen, T; Friedrich, A
2002-09-01
Multifocal electroretinogram recordings (mfERGs) can be used to detect a local dysfunction of the retina. In this study we tested both the intra-sessional and inter-sessional reproducibility of mfERG amplitudes. MfERGs from 6 eyes of 6 normal subjects were recorded on two different days using DTL electrodes. The relative coefficient of variation (RCV) was used to quantify the amplitude reproducibility. We tested the effect of (a) session (inter- vs. intra-sessional), (b) recording duration (7.3 vs. 3.6 min), (c) trace type (hexagon traces vs. ring averages), and (d) amplitude definition (peak-trough analysis vs. scalar product) on RCV. RCV was 6.5 ± 0.4% (mean ± SEM, n = 96) when averaged across all recording conditions and all subjects. The ANOVA showed a significant difference (p = 0.018) between hexagon traces and ring averages. Another significant effect (p = 0.016) occurred for the interaction of (a) and (b). MfERGs can be recorded with a high degree of reproducibility even for short recording durations and single hexagon traces. As factor (a) did not show a significant effect, the new placement of the DTL electrode in the second session does not necessarily increase the retest variability compared to a second recording within the same session.
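A minimal sketch of the relative coefficient of variation used above, assuming the usual definition RCV = 100 x SD/mean over repeated amplitude measurements (the study's exact computation may differ); the numbers are illustrative, not data from the study:

```python
import numpy as np

def relative_coefficient_of_variation(amplitudes):
    """RCV in percent for repeated mfERG amplitude measurements of the same location."""
    a = np.asarray(amplitudes, dtype=float)
    return 100.0 * a.std(ddof=1) / a.mean()

# Hypothetical repeated amplitudes (nV/deg^2) for one hexagon across sessions:
print(f"RCV = {relative_coefficient_of_variation([18.2, 19.1, 17.6, 18.8]):.1f} %")
```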
Context-sensitive trace inlining for Java.
Häubl, Christian; Wimmer, Christian; Mössenböck, Hanspeter
2013-12-01
Method inlining is one of the most important optimizations in method-based just-in-time (JIT) compilers. It widens the compilation scope and therefore allows optimizing multiple methods as a whole, which increases performance. However, if method inlining is used too frequently, the compilation time increases and too much machine code is generated. This has negative effects on performance. Trace-based JIT compilers only compile frequently executed paths, so-called traces, instead of whole methods. This may result in faster compilation, less generated machine code, and better optimized machine code. In previous work, we implemented a trace recording infrastructure and a trace-based compiler for Java by modifying the Java HotSpot VM. Based on this work, we evaluate the effect of trace inlining on the performance and the amount of generated machine code. Trace inlining has several major advantages when compared to method inlining. First, trace inlining is more selective than method inlining, because only frequently executed paths are inlined. Second, the recorded traces may capture information about virtual calls, which simplifies inlining. A third advantage is that trace information is context sensitive, so that different method parts can be inlined depending on the specific call site. These advantages allow more aggressive inlining while the amount of generated machine code is still reasonable. We evaluate several inlining heuristics on the benchmark suites DaCapo 9.12 Bach, SPECjbb2005, and SPECjvm2008 and show that our trace-based compiler achieves an up to 51% higher peak performance than the method-based Java HotSpot client compiler. Furthermore, we show that the large compilation scope of our trace-based compiler has a positive effect on other compiler optimizations such as constant folding or null check elimination.
Three dimensional ray tracing of the Jovian magnetosphere in the low frequency range
NASA Technical Reports Server (NTRS)
Menietti, J. D.
1984-01-01
Ray tracing studies of Jovian low frequency emissions were performed. A comprehensive three-dimensional ray tracing computer code for examination of model Jovian decametric (DAM) emission was developed. The improvements to the computer code are outlined and described. The results of the ray tracings of Jovian emissions will be presented in summary form.
Liakhovetskiĭ, V A; Bobrova, E V; Skopin, G N
2012-01-01
Transposition errors during the reproduction of a hand movement sequence make it possible to obtain important information on the internal representation of this sequence in motor working memory. Analysis of such errors showed that learning to reproduce sequences of left-hand movements improves the system of positional coding (coding of positions), while learning of right-hand movements improves the system of vector coding (coding of movements). Learning of right-hand movements after left-hand performance involved the system of positional coding "imposed" by the left hand. Learning of left-hand movements after right-hand performance activated the system of vector coding. Transposition errors during learning to reproduce movement sequences can be explained by a neural network using either vector coding or both vector and positional coding.
The MIMIC Code Repository: enabling reproducibility in critical care research.
Johnson, Alistair Ew; Stone, David J; Celi, Leo A; Pollard, Tom J
2018-01-01
Lack of reproducibility in medical studies is a barrier to the generation of a robust knowledge base to support clinical decision-making. In this paper we outline the Medical Information Mart for Intensive Care (MIMIC) Code Repository, a centralized code base for generating reproducible studies on an openly available critical care dataset. Code is provided to load the data into a relational structure, create extractions of the data, and reproduce entire analysis plans including research studies. Concepts extracted include severity of illness scores, comorbid status, administrative definitions of sepsis, physiologic criteria for sepsis, organ failure scores, treatment administration, and more. Executable documents are used for tutorials and reproduce published studies end-to-end, providing a template for future researchers to replicate. The repository's issue tracker enables community discussion about the data and concepts, allowing users to collaboratively improve the resource. The centralized repository provides a platform for users of the data to interact directly with the data generators, facilitating greater understanding of the data. It also provides a location for the community to collaborate on necessary concepts for research progress and share them with a larger audience. Consistent application of the same code for underlying concepts is a key step in ensuring that research studies on the MIMIC database are comparable and reproducible. By providing open source code alongside the freely accessible MIMIC-III database, we enable end-to-end reproducible analysis of electronic health records. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.
Kristjánsson, Tómas; Thorvaldsson, Tómas Páll; Kristjánsson, Arni
2014-01-01
Previous research involving both unimodal and multimodal studies suggests that single-response change detection is a capacity-free process while a discriminatory up or down identification is capacity-limited. The trace/context model assumes that this reflects different memory strategies rather than inherent differences between identification and detection. To perform such tasks, one of two strategies is used, a sensory trace or a context coding strategy, and if one is blocked, people will automatically use the other. A drawback to most preceding studies is that stimuli are presented at separate locations, creating the possibility of a spatial confound, which invites alternative interpretations of the results. We describe a series of experiments, investigating divided multimodal attention, without the spatial confound. The results challenge the trace/context model. Our critical experiment involved a gap before a change in volume and brightness, which according to the trace/context model blocks the sensory trace strategy, simultaneously with a roaming pedestal, which should block the context coding strategy. The results clearly show that people can use strategies other than sensory trace and context coding in the tasks and conditions of these experiments, necessitating changes to the trace/context model.
Validation of Ray Tracing Code Refraction Effects
NASA Technical Reports Server (NTRS)
Heath, Stephanie L.; McAninch, Gerry L.; Smith, Charles D.; Conner, David A.
2008-01-01
NASA's current predictive capabilities using the ray tracing program (RTP) are validated using helicopter noise data taken at Eglin Air Force Base in 2007. By including refractive propagation effects due to wind and temperature, the ray tracing code is able to explain large variations in the data observed during the flight test.
LENSED: a code for the forward reconstruction of lenses and sources from strong lensing observations
NASA Astrophysics Data System (ADS)
Tessore, Nicolas; Bellagamba, Fabio; Metcalf, R. Benton
2016-12-01
Robust modelling of strong lensing systems is fundamental to exploit the information they contain about the distribution of matter in galaxies and clusters. In this work, we present LENSED, a new code which performs forward parametric modelling of strong lenses. LENSED takes advantage of a massively parallel ray-tracing kernel to perform the necessary calculations on a modern graphics processing unit (GPU). This makes the precise rendering of the background lensed sources much faster, and allows the simultaneous optimization of tens of parameters for the selected model. With a single run, the code is able to obtain the full posterior probability distribution for the lens light, the mass distribution and the background source at the same time. LENSED is first tested on mock images which reproduce realistic space-based observations of lensing systems. In this way, we show that it is able to recover unbiased estimates of the lens parameters, even when the sources do not follow exactly the assumed model. Then, we apply it to a subsample of the Sloan Lens ACS Survey lenses, in order to demonstrate its use on real data. The results generally agree with the literature, and highlight the flexibility and robustness of the algorithm.
Simulation of multi-pulse coaxial helicity injection in the Sustained Spheromak Physics Experiment
NASA Astrophysics Data System (ADS)
O'Bryan, J. B.; Romero-Talamás, C. A.; Woodruff, S.
2018-03-01
Nonlinear, numerical computation with the NIMROD code is used to explore magnetic self-organization during multi-pulse coaxial helicity injection in the Sustained Spheromak Physics eXperiment. We describe multiple distinct phases of spheromak evolution, starting from vacuum magnetic fields and the formation of the initial magnetic flux bubble through multiple refluxing pulses and the eventual onset of the column mode instability. Experimental and computational magnetic diagnostics agree on the onset of the column mode instability, which first occurs during the second refluxing pulse of the simulated discharge. Our computations also reproduce the injector voltage traces, despite only specifying the injector current and not explicitly modeling the external capacitor bank circuit. The computations demonstrate that global magnetic evolution is fairly robust to different transport models and, therefore, that a single fluid-temperature model is sufficient for a broader, qualitative assessment of spheromak performance. Although discharges with similar traces of normalized injector current produce similar global spheromak evolution, details of the current distribution during the column mode instability impact the relative degree of poloidal flux amplification and magnetic helicity content.
An empirical analysis of journal policy effectiveness for computational reproducibility.
Stodden, Victoria; Seiler, Jennifer; Ma, Zhaokun
2018-03-13
A key component of scientific communication is sufficient information for other researchers in the field to reproduce published findings. For computational and data-enabled research, this has often been interpreted to mean making available the raw data from which results were generated, the computer code that generated the findings, and any additional information needed such as workflows and input parameters. Many journals are revising author guidelines to include data and code availability. This work evaluates the effectiveness of journal policy that requires the data and code necessary for reproducibility be made available postpublication by the authors upon request. We assess the effectiveness of such a policy by (i) requesting data and code from authors and (ii) attempting replication of the published findings. We chose a random sample of 204 scientific papers published in the journal Science after the implementation of their policy in February 2011. We found that we were able to obtain artifacts from 44% of our sample and were able to reproduce the findings for 26%. We find this policy (author remission of data and code postpublication upon request) an improvement over no policy, but currently insufficient for reproducibility.
NASA Technical Reports Server (NTRS)
Desautel, Richard
1993-01-01
The objectives of this research include supporting the Aerothermodynamics Branch's research by developing graphical visualization tools for both the branch's adaptive grid code and flow field ray tracing code. The completed research for the reporting period includes development of a graphical user interface (GUI) and its implementation into the NAS Flowfield Analysis Software Tool kit (FAST), for both the adaptive grid code (SAGE) and the flow field ray tracing code (CISS).
Lázzari, J O; Pereira, M; Antunes, C M; Guimarães, A; Moncayo, A; Chávez Domínguez, R; Hernández Pieretti, O; Macedo, V; Rassi, A; Maguire, J; Romero, A
1998-11-01
An electrocardiographic recording method with an associated reading guide, designed for epidemiological studies on Chagas' disease, was tested to assess its diagnostic reproducibility. Six cardiologists from five countries each read 100 electrocardiographic (ECG) tracings, including 30 from chronic chagasic patients, then reread them after an interval of 6 months. The readings were blind, with the tracings numbered randomly for the first reading and renumbered randomly for the second reading. The physicians, all experienced in interpreting ECGs from chagasic patients, followed printed instructions for reading the tracings. Reproducibility of the readings was evaluated using the kappa (κ) index for concordance. The results showed a high degree of interobserver concordance with respect to the diagnosis of normal vs. abnormal tracings (κ = 0.66; SE 0.02). While the interpretations of some categories of ECG abnormalities were highly reproducible, others, especially those having a low prevalence, showed lower levels of concordance. Intraobserver concordance was uniformly higher than interobserver concordance. The findings of this study justify the use by specialists of the recording and reading method proposed for epidemiological studies on Chagas' disease, but warrant caution in the interpretation of some categories of electrocardiographic alterations.
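For readers unfamiliar with the kappa index of concordance, here is a minimal sketch of Cohen's kappa for two readers' normal/abnormal calls on the same tracings; the readings below are made up for illustration and are not the study's data:

```python
def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two readers' normal/abnormal calls on the same ECG tracings."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    categories = sorted(set(ratings_a) | set(ratings_b))
    p_obs = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n          # observed agreement
    p_exp = sum((ratings_a.count(c) / n) * (ratings_b.count(c) / n)        # chance agreement
                for c in categories)
    return (p_obs - p_exp) / (1.0 - p_exp)

# Hypothetical readings (N = normal, A = abnormal) by two cardiologists:
reader1 = list("NNANNAANNANNNNAANNNA")
reader2 = list("NNANNANNNANNNNAANNAA")
print(f"kappa = {cohens_kappa(reader1, reader2):.2f}")
```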
The Alba ray tracing code: ART
NASA Astrophysics Data System (ADS)
Nicolas, Josep; Barla, Alessandro; Juanhuix, Jordi
2013-09-01
The Alba ray tracing code (ART) is a suite of Matlab functions and tools for the ray tracing simulation of x-ray beamlines. The code is structured in different layers, which allow its usage as part of optimization routines as well as easy control from a graphical user interface. Additional tools for slope error handling and for grating efficiency calculations are also included. Generic characteristics of ART include the accumulation of rays to improve statistics without memory limitations, while still providing normalized values of flux and resolution in physically meaningful units.
Provenance of Earth Science Datasets - How Deep Should One Go?
NASA Astrophysics Data System (ADS)
Ramapriyan, H.; Manipon, G. J. M.; Aulenbach, S.; Duggan, B.; Goldstein, J.; Hua, H.; Tan, D.; Tilmes, C.; Wilson, B. D.; Wolfe, R.; Zednik, S.
2015-12-01
For credibility of scientific research, transparency and reproducibility are essential. This fundamental tenet has been emphasized for centuries, and has been receiving increased attention in recent years. The Office of Management and Budget (2002) addressed reproducibility and other aspects of quality and utility of information from federal agencies. Specific guidelines from NASA (2002) are derived from the above. According to these guidelines, "NASA requires a higher standard of quality for information that is considered influential. Influential scientific, financial, or statistical information is defined as NASA information that, when disseminated, will have or does have clear and substantial impact on important public policies or important private sector decisions." For information to be compliant, "the information must be transparent and reproducible to the greatest possible extent." We present how the principles of transparency and reproducibility have been applied to NASA data supporting the Third National Climate Assessment (NCA3). The depth of provenance trace needed for data used to derive conclusions in NCA3 depends on how the data were used (e.g., qualitatively or quantitatively). Given that the information is diligently maintained in the agency archives, it is possible to trace from a figure in the publication through the datasets, specific files, algorithm versions, instruments used for data collection, and satellites, as well as the individuals and organizations involved in each step. Such trace back permits transparency and reproducibility.
Representativeness of laboratory sampling procedures for the analysis of trace metals in soil.
Dubé, Jean-Sébastien; Boudreault, Jean-Philippe; Bost, Régis; Sona, Mirela; Duhaime, François; Éthier, Yannic
2015-08-01
This study was conducted to assess the representativeness of laboratory sampling protocols for purposes of trace metal analysis in soil. Five laboratory protocols were compared, including conventional grab sampling, to assess the influence of sectorial splitting, sieving, and grinding on measured trace metal concentrations and their variability. It was concluded that grinding was the most important factor in controlling the variability of trace metal concentrations. Grinding increased the reproducibility of sample mass reduction by rotary sectorial splitting by up to two orders of magnitude. Combined with rotary sectorial splitting, grinding increased the reproducibility of trace metal concentrations by almost three orders of magnitude compared to grab sampling. Moreover, results showed that if grinding is used as part of a mass reduction protocol by sectorial splitting, the effect of sieving on reproducibility became insignificant. Gy's sampling theory and practice was also used to analyze the aforementioned sampling protocols. While the theoretical relative variances calculated for each sampling protocol qualitatively agreed with the experimental variances, their quantitative agreement was very poor. It was assumed that the parameters used in the calculation of theoretical sampling variances may not correctly estimate the constitutional heterogeneity of soils or soil-like materials. Finally, the results have highlighted the pitfalls of grab sampling, namely, the fact that it does not exert control over incorrect sampling errors and that it is strongly affected by distribution heterogeneity.
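Since the analysis above applies Gy's sampling theory, the following sketch evaluates Gy's fundamental sampling error variance, σ² = f·g·c·ℓ·d³·(1/M_sample − 1/M_lot); the factor values are generic illustrative defaults, not the calibration used in the study, but they show why grinding (reducing d) dominates the variance:

```python
def gy_fundamental_variance(d_cm, m_sample_g, m_lot_g, f=0.5, g=0.25, c=1.0e3, liberation=1.0):
    """Relative variance of Gy's fundamental sampling error:
    sigma^2 = f * g * c * l * d^3 * (1/M_sample - 1/M_lot),
    with d the nominal top particle size [cm], masses in grams, c in g/cm^3.
    Factor values here are illustrative defaults, not the study's calibration."""
    return f * g * c * liberation * d_cm**3 * (1.0 / m_sample_g - 1.0 / m_lot_g)

# Effect of grinding: reducing the top size from 2 mm to 0.1 mm for a 10 g subsample of a 1 kg lot.
for d in (0.2, 0.01):
    var = gy_fundamental_variance(d, 10.0, 1000.0)
    print(f"d = {d * 10:.1f} mm -> relative std of fundamental error ~ {var**0.5:.3f}")
```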
Research Reproducibility in Geosciences: Current Landscape, Practices and Perspectives
NASA Astrophysics Data System (ADS)
Yan, An
2016-04-01
Reproducibility of research can gauge the validity of its findings. Yet we currently lack understanding of how much of a problem research reproducibility is in the geosciences. We developed an online survey of faculty and graduate students in the geosciences and received 136 responses from research institutions and universities in the Americas, Asia, Europe, and other parts of the world. The survey examined (1) the current state of research reproducibility in the geosciences, by asking about researchers' experiences with unsuccessful replication work and the obstacles that led to their replication failures; (2) current reproducibility practices in the community, by asking what efforts researchers made to try to reproduce others' work and to make their own work reproducible, and what underlying factors contribute to irreproducibility; and (3) perspectives on reproducibility, by collecting researchers' thoughts and opinions on this issue. The survey results indicated that nearly 80% of respondents who had ever tried to reproduce a published study had failed at least once. Only one third of the respondents received helpful feedback when they contacted the authors of a published study for data, code, or other information. The primary factors that lead to unsuccessful replication attempts are insufficient detail in the instructions in published literature and inaccessibility of the data, code, and tools needed in the study. Our findings suggest a remarkable lack of research reproducibility in the geosciences. Changing the incentive mechanism in academia, as well as developing policies and tools that facilitate open data and code sharing, are promising ways for the geosciences community to alleviate this reproducibility problem.
RAY-RAMSES: a code for ray tracing on the fly in N-body simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barreira, Alexandre; Llinares, Claudio; Bose, Sownak
2016-05-01
We present a ray tracing code to compute integrated cosmological observables on the fly in AMR N-body simulations. Unlike conventional ray tracing techniques, our code takes full advantage of the time and spatial resolution attained by the N-body simulation by computing the integrals along the line of sight on a cell-by-cell basis through the AMR simulation grid. Moreover, since it runs on the fly in the N-body run, our code can produce maps of the desired observables without storing large (or any) amounts of data for post-processing. We implemented our routines in the RAMSES N-body code and tested the implementation using an example of a weak lensing simulation. We analyse basic statistics of lensing convergence maps and find good agreement with semi-analytical methods. The ray tracing methodology presented here can be used in several cosmological analyses, such as Sunyaev-Zel'dovich and integrated Sachs-Wolfe effect studies, as well as modified gravity. Our code can also be used in cross-checks of the more conventional methods, which can be important in tests of theory systematics in preparation for upcoming large scale structure surveys.
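To illustrate the cell-by-cell line-of-sight integration described above for the weak-lensing example, here is a minimal sketch that accumulates the convergence κ with the standard Born-approximation kernel; the cosmology, the χ-to-a mapping, and the synthetic overdensities are illustrative stand-ins for quantities the real code takes from the AMR grid:

```python
import numpy as np

# kappa = (3/2) Omega_m (H0/c)^2 * sum over cells of [chi (chi_s - chi) / chi_s] * delta / a * dchi
omega_m, h = 0.3, 0.7
H0_over_c = 100.0 * h / 299792.458           # [1/Mpc]
chi_source = 3000.0                          # comoving distance to the sources [Mpc]

n_cells = 600
dchi = chi_source / n_cells
chi = (np.arange(n_cells) + 0.5) * dchi      # cell centres along the line of sight
a = 1.0 / (1.0 + chi / 3000.0)               # crude chi -> a mapping, for illustration only
delta = np.random.default_rng(0).normal(0.0, 0.5, n_cells)   # overdensity sampled in each cell

kernel = chi * (chi_source - chi) / chi_source
kappa = 1.5 * omega_m * H0_over_c**2 * np.sum(kernel * delta / a * dchi)
print(f"convergence along this line of sight: kappa = {kappa:.4f}")
```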
Paixão, Fernanda; Silva, Wilkens Aurélio Buarque e; Silva, Frederico Andrade e; Ramos, Guilherme da Gama; Cruz, Mônica Vieira de Jesus
2007-01-01
The centric relation is a mandibular position that determines a balanced relation among the temporomandibular joints, the masticatory muscles and the occlusion. This position makes it possible for the dentist to plan and execute oral rehabilitation respecting the physiological principles of the stomatognathic system. The aim of this study was to investigate the reproducibility of centric relation records obtained using two techniques: Dawson's Bilateral Manipulation and Gysi's Gothic Arch Tracing. Twenty volunteers (14 females and 6 males) with no tooth loss, presenting occlusal contacts according to those described in Angle's Class I classification and without signs or symptoms of temporomandibular disorders, were selected. All volunteers were submitted five times, at 1-week intervals and always at the same time of day, to Dawson's Bilateral Manipulation and to Gysi's Gothic Arch Tracing with the aid of an intraoral apparatus. The average standard error of each technique was calculated (Bilateral Manipulation 0.94; Gothic Arch Tracing 0.27). The Shapiro-Wilk test was applied and the results allowed application of Student's t-test (sampling error of 5%). The techniques showed different degrees of variability. Gysi's Gothic Arch Tracing was found to be more accurate than Bilateral Manipulation in reproducing centric relation records. PMID:19089144
ERIC Educational Resources Information Center
Naturescope, 1987
1987-01-01
Provides background information on how scientists have learned about the history of the Earth, including studying fossils, dating rocks, and tracing geological movements. Included are teaching activities about prehistoric animals, state fossils, tracing animal movement and evolution, and discovering fossils. Contains reproducible handouts and…
Mizumachi, Hideyuki; Sakuma, Megumi; Ikezumi, Mayu; Saito, Kazutoshi; Takeyoshi, Midori; Imai, Noriyasu; Okutomi, Hiroko; Umetsu, Asami; Motohashi, Hiroko; Watanabe, Mika; Miyazawa, Masaaki
2018-05-03
The epidermal sensitization assay (EpiSensA) is an in vitro skin sensitization test method based on gene expression of four markers related to the induction of skin sensitization; the assay uses commercially available reconstructed human epidermis. EpiSensA has exhibited an accuracy of 90% for 72 chemicals, including lipophilic chemicals and pre-/pro-haptens, when compared with the results of the murine local lymph node assay. In this work, a ring study was performed by one lead and two naive laboratories to evaluate the transferability, as well as within- and between-laboratory reproducibilities, of EpiSensA. Three non-coded chemicals (two lipophilic sensitizers and one non-sensitizer) were tested for the assessment of transferability and 10 coded chemicals (seven sensitizers and three non-sensitizers, including four lipophilic chemicals) were tested for the assessment of reproducibility. In the transferability phase, the non-coded chemicals (two sensitizers and one non-sensitizer) were correctly classified at the two naive laboratories, indicating that the EpiSensA protocol was transferred successfully. For the within-laboratory reproducibility, the data generated with three coded chemicals tested in three independent experiments in each laboratory gave consistent predictions within laboratories. For the between-laboratory reproducibility, 9 of the 10 coded chemicals tested once in each laboratory provided consistent predictions among the three laboratories. These results suggested that EpiSensA has good transferability, as well as within- and between-laboratory reproducibility. Copyright © 2018 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit
2017-03-01
In this article, we reply to a comment made on our previous commentary regarding reproducibility in computational hydrology. Software licensing and version control of code are important technical aspects of making code and workflows of scientific experiments open and reproducible. However, in our view, it is the cultural change that is the greatest challenge to overcome to achieve reproducible scientific research in computational hydrology. We believe that from changing the culture and attitude among hydrological scientists, details will evolve to cover more (technical) aspects over time.
Trace Replay and Network Simulation Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Acun, Bilge; Jain, Nikhil; Bhatele, Abhinav
2015-03-23
TraceR is a trace replay tool built upon the ROSS-based CODES simulation framework. TraceR can be used for predicting network performance and understanding network behavior by simulating messaging in High Performance Computing applications on interconnection networks.
Trace Replay and Network Simulation Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jain, Nikhil; Bhatele, Abhinav; Acun, Bilge
TraceR is a trace replay tool built upon the ROSS-based CODES simulation framework. TraceR can be used for predicting network performance and understanding network behavior by simulating messaging in High Performance Computing applications on interconnection networks.
NASA Astrophysics Data System (ADS)
Wrable-Rose, Madeline; Primera-Pedrozo, Oliva M.; Pacheco-Londoño, Leonardo C.; Hernandez-Rivera, Samuel P.
2010-12-01
This research examines surface contamination properties, trace sample preparation methodologies, detection system response, and the generation of explosive contamination standards for trace detection systems. Homogeneous and reproducible sample preparation is relevant for trace detection of chemical threats such as warfare agents, highly energetic materials (HEM), and toxic industrial chemicals. The objective of this research was to develop a technology capable of producing samples and standards of HEM with controlled size and distribution on a substrate, to generate specimens that reproduce real contamination conditions. The research activities included (1) a study of the properties of particles generated by two deposition techniques, sample smearing deposition and inkjet deposition, on gold-coated silicon, glass and stainless steel substrates; (2) characterization of the composition, distribution and adhesion characteristics of deposits; (3) evaluation of accuracy and reproducibility for depositing neat highly energetic materials such as TNT, RDX and ammonium nitrate; (4) a study of HEM-surface interactions using FTIR-RAIRS; and (5) establishment of protocols for validation of surface concentration using destructive methods such as HPLC.
NASA Astrophysics Data System (ADS)
Basu, Sukanta; Nunalee, Christopher G.; He, Ping; Fiorino, Steven T.; Vorontsov, Mikhail A.
2014-10-01
In this paper, we reconstruct the meteorological and optical environment during the time of Titanic's disaster utilizing a state-of-the-art meteorological model, a ray-tracing code, and a unique public-domain dataset called the Twentieth Century Global Reanalysis. With high fidelity, our simulation captured the occurrence of an unusually high Arctic pressure system over the disaster site with calm wind. It also reproduced the movement of a polar cold front through the region bringing a rapid drop in air temperature. The simulated results also suggest that unusual meteorological conditions persisted several hours prior to the Titanic disaster which contributed to super-refraction and intermittent optical turbulence. However, according to the simulations, such anomalous conditions were not present at the time of the collision of Titanic with an iceberg.
Disk Emission from Magnetohydrodynamic Simulations of Spinning Black Holes
NASA Technical Reports Server (NTRS)
Schnittman, Jeremy D.; Krolik, Julian H.; Noble, Scott C.
2016-01-01
We present the results of a new series of global, three-dimensional, relativistic magnetohydrodynamic (MHD) simulations of thin accretion disks around spinning black holes. The disks have aspect ratios of H/R approx. 0.05 and spin parameters of a/M = 0, 0.5, 0.9, and 0.99. Using the ray-tracing code Pandurata, we generate broadband thermal spectra and polarization signatures from the MHD simulations. We find that the simulated spectra can be well fit with a simple, universal emissivity profile that better reproduces the behavior of the emission from the inner disk, compared to traditional analyses carried out using a Novikov-Thorne thin disk model. Finally, we show how spectropolarization observations can be used to convincingly break the spin-inclination degeneracy well known to the continuum-fitting method of measuring black hole spin.
Trace-shortened Reed-Solomon codes
NASA Technical Reports Server (NTRS)
Mceliece, R. J.; Solomon, G.
1994-01-01
Reed-Solomon (RS) codes have been part of standard NASA telecommunications systems for many years. RS codes are character-oriented error-correcting codes, and their principal use in space applications has been as outer codes in concatenated coding systems. However, for a given character size, say m bits, RS codes are limited to a length of at most 2^m. It is known in theory that longer character-oriented codes would be superior to RS codes in concatenation applications, but until recently no practical class of 'long' character-oriented codes had been discovered. In 1992, however, Solomon discovered an extensive class of such codes, which are now called trace-shortened Reed-Solomon (TSRS) codes. In this article, we will continue the study of TSRS codes. Our main result is a formula for the dimension of any TSRS code, as a function of its error-correcting power. Using this formula, we will give several examples of TSRS codes, some of which look very promising as candidate outer codes in high-performance coded telecommunications systems.
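For orientation, a small arithmetic sketch of conventional RS parameters over GF(2^m); the RS(255,223) example is the classic concatenated-system outer code and is given only for illustration, and the TSRS dimension formula that is the article's main result is not reproduced here.

```python
# Basic Reed-Solomon parameter arithmetic over GF(2^m).
m = 8                                  # bits per character (symbol)
n_conventional = 2**m - 1              # conventional RS block length; extension reaches the 2^m bound cited above
n, k = 255, 223                        # the classic RS(255,223) outer code
t = (n - k) // 2                       # correctable symbol errors per block
print(f"GF(2^{m}): conventional length limit {n_conventional} symbols; "
      f"RS({n},{k}) corrects up to {t} symbol errors, rate {k / n:.3f}")
```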
NASA Technical Reports Server (NTRS)
Wey, Thomas; Liu, Nan-Suey
2003-01-01
The overall objective of the current effort at NASA GRC is to evaluate, develop, and apply methodologies suitable for modeling intra-engine trace chemical changes over the post-combustor flow path, relevant to the pollutant emissions from aircraft engines. At the present time, the focus is the high pressure turbine environment. First, the trace chemistry model of CNEWT was implemented into GLENN-HT as well as NCC. Then, CNEWT, CGLENN-HT, and NCC were applied to the trace species evolution in a cascade of Cambridge University's No. 2 rotor and in a turbine vane passage. In general, the results from these different codes provide similar features. However, the details of some of the quantities of interest can be sensitive to the differences among these codes. This report summarizes the implementation effort and presents the comparison of the No. 2 rotor results obtained from these different codes. The comparison of the turbine vane passage results is reported elsewhere. In addition to the implementation of the trace chemistry model into existing CFD codes, several pre/post-processing tools that can handle the manipulation of the geometry, the unstructured and structured grids, as well as the CFD solutions have also been enhanced and seamlessly tied with NCC, CGLENN-HT, and CNEWT. Thus, a complete CFD package consisting of pre/post-processing tools and flow solvers suitable for post-combustor intra-engine trace chemistry study is assembled.
Schmuker, Michael; Yamagata, Nobuhiro; Nawrot, Martin Paul; Menzel, Randolf
2011-01-01
The honeybee Apis mellifera has a remarkable ability to detect and locate food sources during foraging, and to associate odor cues with food rewards. In the honeybee's olfactory system, sensory input is first processed in the antennal lobe (AL) network. Uniglomerular projection neurons (PNs) convey the sensory code from the AL to higher brain regions via two parallel but anatomically distinct pathways, the lateral and the medial antenno-cerebral tract (l- and m-ACT). Neurons innervating either tract show characteristic differences in odor selectivity, concentration dependence, and representation of mixtures. It is still unknown how this differential stimulus representation is achieved within the AL network. In this contribution, we use a computational network model to demonstrate that the experimentally observed features of odor coding in PNs can be reproduced by varying lateral inhibition and gain control in an otherwise unchanged AL network. We show that odor coding in the l-ACT supports detection and accurate identification of weak odor traces at the expense of concentration sensitivity, while odor coding in the m-ACT provides the basis for the computation and following of concentration gradients but provides weaker discrimination power. Both coding strategies are mutually exclusive, which creates a tradeoff between detection accuracy and sensitivity. The development of two parallel systems may thus reflect an evolutionary solution to this problem that enables honeybees to achieve both tasks during bee foraging in their natural environment, and which could inspire the development of artificial chemosensory devices for odor-guided navigation in robots.
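As a toy illustration of how lateral inhibition and gain control can reshape projection-neuron responses (a deliberately simplified caricature, not the antennal-lobe network model used in the study), the sketch below applies subtractive inhibition scaled by the mean glomerular input followed by a saturating nonlinearity:

```python
import numpy as np

def pn_response(orn_rates, inhibition=0.6, gain=5.0):
    """Toy antennal-lobe transform: subtractive lateral inhibition (scaled mean glomerular input)
    followed by a saturating gain-control nonlinearity. Parameters are illustrative, not fitted."""
    orn = np.asarray(orn_rates, dtype=float)
    net = orn - inhibition * orn.mean()          # lateral inhibition across glomeruli
    net = np.clip(net, 0.0, None)
    return net / (net + gain)                    # simple saturating gain control

odor = np.array([2.0, 8.0, 1.0, 0.5, 4.0])       # ORN drive to five glomeruli (arbitrary units)
print("weak lateral inhibition  :", np.round(pn_response(odor, inhibition=0.2), 2))
print("strong lateral inhibition:", np.round(pn_response(odor, inhibition=0.9), 2))
```

Stronger inhibition sharpens the pattern around the most strongly driven glomeruli, which is the qualitative tradeoff between identification and concentration coding discussed above.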
Assessment of the TRACE Reactor Analysis Code Against Selected PANDA Transient Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zavisca, M.; Ghaderi, M.; Khatib-Rahbar, M.
2006-07-01
The TRACE (TRAC/RELAP Advanced Computational Engine) code is an advanced, best-estimate thermal-hydraulic program intended to simulate the transient behavior of light-water reactor systems, using a two-fluid (steam and water, with non-condensable gas), seven-equation representation of the conservation equations and flow-regime dependent constitutive relations in a component-based model with one-, two-, or three-dimensional elements, as well as solid heat structures and logical elements for the control system. The U.S. Nuclear Regulatory Commission is currently supporting the development of the TRACE code and its assessment against a variety of experimental data pertinent to existing and evolutionary reactor designs. This paper presents the results of TRACE post-test predictions of the P-series of experiments (i.e., tests comprising the ISP-42 blind and open phases) conducted at the PANDA large-scale test facility in the 1990s. These results show reasonable agreement with the reported test results, indicating good performance of the code and of the relevant underlying thermal-hydraulic and heat transfer models. (authors)
A Measurement and Simulation Based Methodology for Cache Performance Modeling and Tuning
NASA Technical Reports Server (NTRS)
Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)
1998-01-01
We present a cache performance modeling methodology that facilitates the tuning of uniprocessor cache performance for applications executing on shared memory multiprocessors by accurately predicting the effects of source code level modifications. Measurements on a single processor are initially used for identifying parts of the code where cache utilization improvements may significantly impact the overall performance. Cache simulation based on trace-driven techniques can be carried out without gathering detailed address traces. Minimal runtime information for modeling cache performance of a selected code block includes: base virtual addresses of arrays, virtual addresses of variables, and loop bounds for that code block. The rest of the information is obtained from the source code. We show that the cache performance predictions are as reliable as those obtained through trace-driven simulations. This technique is particularly helpful for the exploration of various "what-if" scenarios regarding the cache performance impact of alternative code structures. We explain and validate this methodology using a simple matrix-matrix multiplication program. We then apply this methodology to predict and tune the cache performance of two realistic scientific applications taken from the Computational Fluid Dynamics (CFD) domain.
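A minimal sketch in the spirit of the methodology described above, under assumed cache geometry and array layout: from base addresses and loop bounds of a matrix-multiplication code block, generate the address stream and count misses in a direct-mapped cache. This is a toy stand-in for trace-driven simulation, not the paper's tool.

```python
# Illustrative cache geometry and array layout (assumptions, not the paper's configuration).
LINE, N_SETS, WORD = 64, 512, 8          # 64-byte lines, 32 KB direct-mapped cache, 8-byte doubles
BASE_A, BASE_B, BASE_C = 0x100000, 0x200000, 0x300000
N = 64                                   # matrix order

def addresses_matmul(n):
    """Address stream of a naive C[i][j] += A[i][k] * B[k][j] loop nest over row-major arrays."""
    for i in range(n):
        for j in range(n):
            for k in range(n):
                yield BASE_A + (i * n + k) * WORD
                yield BASE_B + (k * n + j) * WORD
                yield BASE_C + (i * n + j) * WORD

def miss_rate(addr_stream):
    tags = [None] * N_SETS               # one tag per set (direct-mapped)
    refs = misses = 0
    for addr in addr_stream:
        line = addr // LINE
        s, tag = line % N_SETS, line // N_SETS
        refs += 1
        if tags[s] != tag:
            misses += 1
            tags[s] = tag
    return misses / refs

print(f"predicted miss rate for the ijk loop nest: {miss_rate(addresses_matmul(N)):.3f}")
```

Swapping in a tiled loop nest for `addresses_matmul` is the kind of "what-if" comparison such a model makes cheap to explore.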
MCViNE - An object oriented Monte Carlo neutron ray tracing simulation package
Lin, J. Y. Y.; Smith, Hillary L.; Granroth, Garrett E.; ...
2015-11-28
MCViNE (Monte-Carlo VIrtual Neutron Experiment) is an open-source Monte Carlo (MC) neutron ray-tracing software for performing computer modeling and simulations that mirror real neutron scattering experiments. We exploited the close similarity between how instrument components are designed and operated and how such components can be modeled in software. For example, we used object oriented programming concepts for representing neutron scatterers and detector systems, and recursive algorithms for implementing multiple scattering. Combining these features in MCViNE allows one to handle sophisticated neutron scattering problems in modern instruments, including, for example, neutron detection by complex detector systems, and single and multiple scattering events in a variety of samples and sample environments. In addition, MCViNE can use simulation components from linear-chain-based MC ray tracing packages, which facilitates porting instrument models from those codes. Furthermore, it allows for components written solely in Python, which expedites prototyping of new components. These developments have enabled detailed simulations of neutron scattering experiments, with non-trivial samples, for time-of-flight inelastic instruments at the Spallation Neutron Source. Examples of such simulations for powder and single-crystal samples with various scattering kernels, including kernels for phonon and magnon scattering, are presented. As a result, with simulations that closely reproduce experimental results, scattering mechanisms can be turned on and off to determine how they contribute to the measured scattering intensities, improving our understanding of the underlying physics.
A users' guide to the trace contaminant control simulation computer program
NASA Technical Reports Server (NTRS)
Perry, J. L.
1994-01-01
The Trace Contaminant Control Simulation computer program is a tool for assessing the performance of various trace contaminant control technologies for removing trace chemical contamination from a spacecraft cabin atmosphere. The results obtained from the program can be useful in assessing different technology combinations, system sizing, system location with respect to other life support systems, and the overall life cycle economics of a trace contaminant control system. The user's manual is extracted in its entirety from NASA TM-108409 to provide a stand-alone reference for using any version of the program. The first publication of the manual as part of TM-108409 also included a detailed listing of version 8.0 of the program. As changes to the code were necessary, it became apparent that the user's manual should be separate from the computer code documentation and be general enough to provide guidance in using any version of the program. Provided in the guide are tips for input file preparation, general program execution, and output file manipulation. Information concerning source code listings of the latest version of the computer program may be obtained by contacting the author.
Conformally encapsulated multi-electrode arrays with seamless insulation
Tabada, Phillipe J.; Shah, Kedar G.; Tolosa, Vanessa; Pannu, Satinderall S.; Tooker, Angela; Delima, Terri; Sheth, Heeral; Felix, Sarah
2016-11-22
Thin-film multi-electrode arrays (MEAs) having one or more electrically conductive beams conformally encapsulated in a seamless block of electrically insulating material, and methods of fabricating such MEAs using reproducible microfabrication processes. One or more electrically conductive traces are formed on scaffold material that is subsequently removed to suspend the traces over a substrate by support portions of the trace beam in contact with the substrate. By encapsulating the suspended traces, either individually or together, with a single continuous layer of an electrically insulating material, a seamless block of electrically insulating material is formed that conforms to the shape of the trace beam structure, including any trace backings which provide suspension support. Electrical contacts, electrodes, or leads of the traces are exposed from the encapsulated trace beam structure by removing the substrate.
Mise en Scene: Conversion of Scenarios to CSP Traces for the Requirements-to-Design-to-Code Project
NASA Technical Reports Server (NTRS)
Carter, John D.; Gardner, William B.; Rash, James L.; Hinchey, Michael G.
2007-01-01
The "Requirements-to-Design-to-Code" (R2D2C) project at NASA's Goddard Space Flight Center is based on deriving a formal specification expressed in Communicating Sequential Processes (CSP) notation from system requirements supplied in the form of CSP traces. The traces, in turn, are to be extracted from scenarios, a user-friendly medium often used to describe the required behavior of computer systems under development. This work, called Mise en Scene, defines a new scenario medium (Scenario Notation Language, SNL) suitable for control-dominated systems, coupled with a two-stage process for automatic translation of scenarios to a new trace medium (Trace Notation Language, TNL) that encompasses CSP traces. Mise en Scene is offered as an initial solution to the problem of the scenarios-to-traces "D2" phase of R2D2C. A survey of the "scenario" concept and some case studies are also provided.
Grybauskas, Simonas; Balciuniene, Irena; Vetra, Janis
2007-01-01
The emerging market of digital cephalographs and computerized cephalometry makes it necessary to examine the advantages and drawbacks of manual cephalometry; meanwhile, small offices continue to benefit from the economic efficiency and ease of use of analogue cephalograms. The use of modern cephalometric software requires the import of digital cephalograms or the digital capture of analogue data by scanning or digital photography. The validity of using digital photographs of analogue headfilms, rather than the original headfilms, in clinical practice has not been well established. Digital photography could be a fast and inexpensive method of digital capture of analogue cephalograms for use in digital cephalometry. The objective of this study was to determine the validity and reproducibility of measurements obtained from digital photographs of analogue headfilms in lateral cephalometry. Analogue cephalometric radiographs were taken of 15 human dry skulls. Each was traced on acetate paper and photographed three times independently. Acetate tracings and digital photographs were digitized and analyzed in cephalometric software. A linear regression model, paired t-test intergroup analysis, and the coefficient of repeatability were used to assess validity and reproducibility for 63 angular, linear, and derivative measurements. In the acetate tracing group, 54 of 63 measurements had clinically acceptable reproducibility, as did 46 of 63 in the digital photography group. The worst reproducibility was found for measurements dependent on incisor landmarks and poorly defined outlines, the majority of them angular measurements. Validity was acceptable for all measurements, and although statistically significant differences between methods existed for as many as 15 parameters, they appeared to be clinically insignificant, being smaller than 1 unit of measurement. Validity was acceptable for 59 of 63 measurements obtained from digital photographs, substantiating the use of digital photography for headfilm capture and computer-aided cephalometric analysis.
SilMush: A procedure for modeling of the geochemical evolution of silicic magmas and granitic rocks
NASA Astrophysics Data System (ADS)
Hertogen, Jan; Mareels, Joyce
2016-07-01
A boundary layer crystallization modeling program is presented that specifically addresses the chemical fractionation in silicic magma systems and the solidification of plutonic bodies. The model is a Langmuir (1989) type approach and does not invoke crystal settling in high-viscosity silicic melts. The primary aim is to model a granitic rock as a congealed crystal-liquid mush, and to integrate major element and trace element modeling. The procedure allows for some exploratory investigation of the exsolution of H2O-fluids and of the fluid/melt partitioning of trace elements. The procedure is implemented as a collection of subroutines for the MS Excel spreadsheet environment and is coded in the Visual Basic for Applications (VBA) language. To increase the flexibility of the modeling, the procedure is based on discrete numeric process simulation rather than on solution of continuous differential equations. The program is applied to a study of the geochemical variation within and among three granitic units (Senones, Natzwiller, Kagenfels) from the Variscan Northern Vosges Massif, France. The three units cover the compositional range from monzogranite, over syenogranite to alkali-feldspar granite. An extensive set of new major element and trace element data is presented. Special attention is paid to the essential role of accessory minerals in the fractionation of the Rare Earth Elements. The crystallization model is able to reproduce the essential major and trace element variation trends in the data sets of the three separate granitic plutons. The Kagenfels alkali-feldspar leucogranite couples very limited variation in major element composition to a considerable and complex variation of trace elements. The modeling results can serve as a guide for the reconstruction of the emplacement sequence of petrographically distinct units. Although the modeling procedure essentially deals with geochemical fractionation within a single pluton, the modeling results bring up a number of questions about the petrogenetic relationships among parental magmas of nearly coeval granitic units emplaced in close proximity.
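To make the discrete-step approach mentioned above concrete, the sketch below tracks a trace element through incremental crystal removal and compares the result with the closed-form Rayleigh law. It is a generic Python illustration, not the SilMush procedure itself (which is implemented in VBA); the concentrations and partition coefficient are placeholders.

# Minimal sketch of discrete-step trace-element fractionation (not the SilMush code):
# remove small crystal increments from a melt and track a trace element with
# bulk partition coefficient D.
def fractionate(c0, D, f_final=0.3, step=0.001):
    """Return melt concentration after crystallizing down to melt fraction f_final."""
    c_melt, f = c0, 1.0
    while f > f_final:
        df = min(step, f - f_final)
        # mass removed into crystals has concentration D * c_melt (equilibrium with melt)
        c_melt = (c_melt * f - D * c_melt * df) / (f - df)
        f -= df
    return c_melt

# Compare with the closed-form Rayleigh law C = C0 * F**(D - 1)
c0, D, F = 100.0, 0.1, 0.3
print(fractionate(c0, D, F), c0 * F ** (D - 1))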
Solar Proton Transport Within an ICRU Sphere Surrounded by a Complex Shield: Ray-trace Geometry
NASA Technical Reports Server (NTRS)
Slaba, Tony C.; Wilson, John W.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.
2015-01-01
A computationally efficient 3DHZETRN code with enhanced neutron and light ion (Z is less than or equal to 2) propagation was recently developed for complex, inhomogeneous shield geometry described by combinatorial objects. Comparisons were made between 3DHZETRN results and Monte Carlo (MC) simulations at locations within the combinatorial geometry, and it was shown that 3DHZETRN agrees with the MC codes to the extent they agree with each other. In the present report, the 3DHZETRN code is extended to enable analysis in ray-trace geometry. This latest extension enables the code to be used within current engineering design practices utilizing fully detailed vehicle and habitat geometries. Through convergence testing, it is shown that fidelity in an actual shield geometry can be maintained in the discrete ray-trace description by systematically increasing the number of discrete rays used. It is also shown that this fidelity is carried into transport procedures and resulting exposure quantities without sacrificing computational efficiency.
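The convergence test described above can be pictured with a toy calculation: sample an increasing number of discrete ray directions from a detector point and watch an integrated quantity settle. The slab geometry and attenuation coefficient below are illustrative placeholders, not the 3DHZETRN shield model.

# Sketch of the ray-count convergence idea (illustrative geometry only):
# mean transmission through a slab shield above a detector point, sampled with
# an increasing number of discrete rays over the upward hemisphere.
import math, random

def slab_path_length(direction, thickness=10.0):
    """Chord length through a slab normal to z, entered from its lower face."""
    cz = abs(direction[2])
    return thickness / max(cz, 1e-9)

def random_up_direction():
    z = random.uniform(0.0, 1.0)                  # upward hemisphere, uniform in solid angle
    phi = random.uniform(0.0, 2.0 * math.pi)
    r = math.sqrt(1.0 - z * z)
    return (r * math.cos(phi), r * math.sin(phi), z)

mu = 0.05  # attenuation coefficient per unit length (arbitrary units)
for n_rays in (16, 128, 1024, 8192):
    mean_t = sum(math.exp(-mu * slab_path_length(random_up_direction()))
                 for _ in range(n_rays)) / n_rays
    print(f"{n_rays:5d} rays -> mean transmission {mean_t:.4f}")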
Solar proton exposure of an ICRU sphere within a complex structure part II: Ray-trace geometry.
Slaba, Tony C; Wilson, John W; Badavi, Francis F; Reddell, Brandon D; Bahadori, Amir A
2016-06-01
A computationally efficient 3DHZETRN code with enhanced neutron and light ion (Z ≤ 2) propagation was recently developed for complex, inhomogeneous shield geometry described by combinatorial objects. Comparisons were made between 3DHZETRN results and Monte Carlo (MC) simulations at locations within the combinatorial geometry, and it was shown that 3DHZETRN agrees with the MC codes to the extent they agree with each other. In the present report, the 3DHZETRN code is extended to enable analysis in ray-trace geometry. This latest extension enables the code to be used within current engineering design practices utilizing fully detailed vehicle and habitat geometries. Through convergence testing, it is shown that fidelity in an actual shield geometry can be maintained in the discrete ray-trace description by systematically increasing the number of discrete rays used. It is also shown that this fidelity is carried into transport procedures and resulting exposure quantities without sacrificing computational efficiency. Published by Elsevier Ltd.
Evolvix BEST Names for semantic reproducibility across code2brain interfaces
Scheuer, Katherine S.; Keel, Seth A.; Vyas, Vaibhav; Liblit, Ben; Hanlon, Bret; Ferris, Michael C.; Yin, John; Dutra, Inês; Pietsch, Anthony; Javid, Christine G.; Moog, Cecilia L.; Meyer, Jocelyn; Dresel, Jerdon; McLoone, Brian; Loberger, Sonya; Movaghar, Arezoo; Gilchrist‐Scott, Morgaine; Sabri, Yazeed; Sescleifer, Dave; Pereda‐Zorrilla, Ivan; Zietlow, Andrew; Smith, Rodrigo; Pietenpol, Samantha; Goldfinger, Jacob; Atzen, Sarah L.; Freiberg, Erika; Waters, Noah P.; Nusbaum, Claire; Nolan, Erik; Hotz, Alyssa; Kliman, Richard M.; Mentewab, Ayalew; Fregien, Nathan; Loewe, Martha
2016-01-01
Names in programming are vital for understanding the meaning of code and big data. We define code2brain (C2B) interfaces as maps in compilers and brains between meaning and naming syntax, which help to understand executable code. While working toward an Evolvix syntax for general‐purpose programming that makes accurate modeling easy for biologists, we observed how names affect C2B quality. To protect learning and coding investments, C2B interfaces require long‐term backward compatibility and semantic reproducibility (accurate reproduction of computational meaning from coder‐brains to reader‐brains by code alone). Semantic reproducibility is often assumed until confusing synonyms degrade modeling in biology to deciphering exercises. We highlight empirical naming priorities from diverse individuals and roles of names in different modes of computing to show how naming easily becomes impossibly difficult. We present the Evolvix BEST (Brief, Explicit, Summarizing, Technical) Names concept for reducing naming priority conflicts, test it on a real challenge by naming subfolders for the Project Organization Stabilizing Tool system, and provide naming questionnaires designed to facilitate C2B debugging by improving names used as keywords in a stabilizing programming language. Our experiences inspired us to develop Evolvix using a flipped programming language design approach with some unexpected features and BEST Names at its core. PMID:27918836
Tsien, Joe Z.
2013-01-01
Mapping and decoding the brain activity patterns underlying learning and memory is of great interest and poses an immense challenge. At present, very little is known regarding many of the most basic questions about the neural codes of memory: are fear memories retrieved during the freezing state or the non-freezing state of the animals? How do individual memory traces give rise to a holistic, real-time associative memory engram? How are memory codes regulated by synaptic plasticity? Here, by applying high-density electrode arrays and dimensionality-reduction decoding algorithms, we investigate hippocampal CA1 activity patterns of the trace fear conditioning memory code in inducible NMDA receptor knockout mice and their control littermates. Our analyses showed that the conditioned tone (CS) and unconditioned foot-shock (US) can evoke hippocampal ensemble responses in control and mutant mice. Yet the temporal formats and contents of CA1 fear memory engrams differ significantly between the genotypes. The mutant mice with disabled NMDA receptor plasticity failed to generate CS-to-US or US-to-CS associative memory traces. Moreover, the mutant CA1 region lacked memory traces for the "what at when" information that predicts the timing relationship between the conditioned tone and the foot shock. The degraded associative fear memory engram is further manifested in its lack of the intertwined and alternating temporal association between CS and US memory traces that is characteristic of holistic memory recall in the wild-type animals. Therefore, our study has decoded real-time memory contents, the timing relationship between CS and US, and the temporal organizing patterns of fear memory engrams, and demonstrated how hippocampal memory codes are regulated by NMDA receptor synaptic plasticity. PMID:24302990
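The decoding step named above (dimensionality reduction of ensemble activity followed by classification) can be sketched on synthetic data. The firing rates, component count, and nearest-centroid decoder below are illustrative assumptions, not the study's recordings or algorithms.

# Illustrative sketch only (synthetic data): reduce ensemble firing-rate vectors
# to a few components and decode CS vs. US trials with a nearest-centroid rule.
import numpy as np

rng = np.random.default_rng(0)
n_cells, n_trials = 100, 40
cs = rng.normal(5.0, 1.0, (n_trials, n_cells)) + rng.normal(0, 0.5, n_cells)   # CS-evoked rates
us = rng.normal(8.0, 1.0, (n_trials, n_cells)) + rng.normal(0, 0.5, n_cells)   # US-evoked rates
X = np.vstack([cs, us])
y = np.array([0] * n_trials + [1] * n_trials)

# Project onto the top principal components (dimensionality reduction via SVD).
Xc = X - X.mean(axis=0)
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ vt[:3].T

# Nearest-centroid decoding with leave-one-out cross-validation.
correct = 0
for i in range(len(y)):
    train = np.arange(len(y)) != i
    centroids = [Z[train & (y == c)].mean(axis=0) for c in (0, 1)]
    pred = int(np.linalg.norm(Z[i] - centroids[1]) < np.linalg.norm(Z[i] - centroids[0]))
    correct += pred == y[i]
print("decoding accuracy:", correct / len(y))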
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weeratunga, S K
Ares and Kull are mature code frameworks that support ALE hydrodynamics for a variety of HEDP applications at LLNL, using two widely different meshing approaches. While Ares is based on a 2-D/3-D block-structured mesh data base, Kull is designed to support unstructured, arbitrary polygonal/polyhedral meshes. In addition, both frameworks are capable of running applications on large, distributed-memory parallel machines. Currently, both these frameworks separately support assorted collections of physics packages related to HEDP, including one for the energy deposition by laser/ion-beam ray tracing. This study analyzes the options available for developing a common laser/ion-beam ray tracing package that can be easily shared between these two code frameworks and concludes with a set of recommendations for its development.
Language-Agnostic Reproducible Data Analysis Using Literate Programming.
Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa
2016-01-01
A modern biomedical research project can easily contain hundreds of analysis steps and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators in the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir.
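The core mechanic of literate programming described above, extracting the executable code from a prose document and running it, can be sketched briefly. This is a generic illustration, not how Lir itself works; the noweb-style chunk delimiters are an assumption.

# Minimal sketch of "tangling" a literate document (not Lir itself): collect the
# code chunks out of the prose and execute the assembled program.
import re, subprocess, sys, tempfile

document = """\
We first count the patients in the cohort.

<<analysis>>=
print(2 + 3)
@

The result feeds into the survival analysis below.
"""

chunks = re.findall(r"<<.*?>>=\n(.*?)\n@", document, flags=re.DOTALL)
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write("\n".join(chunks))          # assemble the executable program
subprocess.run([sys.executable, f.name], check=True)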
Language-Agnostic Reproducible Data Analysis Using Literate Programming
Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa
2016-01-01
A modern biomedical research project can easily contain hundreds of analysis steps and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators in the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir. PMID:27711123
Environmental Ethics and Civil Engineering.
ERIC Educational Resources Information Center
Vesilind, P. Aarne
1987-01-01
Traces the development of the civil engineering code of ethics. Points out that the code does have an enforceable provision that addresses the engineer's responsibility toward the environment. Suggests revisions to the code to accommodate the environmental impacts of civil engineering. (TW)
SolTrace Background | Concentrating Solar Power | NREL
codes was written to model a very specific optical geometry, and each one built upon the others in an evolutionary way. Examples of such codes include: OPTDSH, a code written to model circular aperture parabolic
Towards Reproducibility in Computational Hydrology
NASA Astrophysics Data System (ADS)
Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit
2017-04-01
Reproducibility is a foundational principle in scientific research. The ability to independently re-run an experiment helps to verify the legitimacy of individual findings, to evolve (or reject) hypotheses and models of how environmental systems function, and to move them from specific circumstances to more general theory. Yet in computational hydrology (and in environmental science more widely) the code and data that produce published results are not regularly made available, and even if they are made available, there remains a multitude of generally unreported choices that an individual scientist may have made that impact the study result. This situation strongly inhibits the ability of our community to reproduce and verify previous findings, as all the information and boundary conditions required to set up a computational experiment simply cannot be reported in an article's text alone. In Hutton et al. (2016) [1], we argue that a cultural change is required in the computational hydrological community in order to advance and make more robust the process of knowledge creation and hypothesis testing. We need to adopt common standards and infrastructures to: (1) make code readable and re-useable; (2) create well-documented workflows that combine re-useable code together with data to enable published scientific findings to be reproduced; (3) make code and workflows available, easy to find, and easy to interpret, using code and code metadata repositories. To create change we argue for improved graduate training in these areas. In this talk we reflect on our progress in achieving reproducible, open science in computational hydrology, which is relevant to the broader computational geoscience community. In particular, we draw on our experience in the Switch-On (EU funded) virtual water science laboratory (http://www.switch-on-vwsl.eu/participate/), which is an open platform for collaboration in hydrological experiments (e.g. [2]). While we use computational hydrology as the example application area, we believe that our conclusions are of value to the wider environmental and geoscience community as far as the use of code and models for scientific advancement is concerned. References: [1] Hutton, C., T. Wagener, J. Freer, D. Han, C. Duffy, and B. Arheimer (2016), Most computational hydrology is not reproducible, so is it really science?, Water Resour. Res., 52, 7548-7555, doi:10.1002/2016WR019285. [2] Ceola, S., et al. (2015), Virtual laboratories: New opportunities for collaborative water science, Hydrol. Earth Syst. Sci. Discuss., 11(12), 13443-13478, doi:10.5194/hessd-11-13443-2014.
Modelling Metamorphism by Abstract Interpretation
NASA Astrophysics Data System (ADS)
Dalla Preda, Mila; Giacobazzi, Roberto; Debray, Saumya; Coogan, Kevin; Townsend, Gregg M.
Metamorphic malware apply semantics-preserving transformations to their own code in order to foil detection systems based on signature matching. In this paper we consider the problem of automatically extracting metamorphic signatures from such malware. We introduce a semantics for self-modifying code, termed phase semantics, and prove its correctness by showing that it is an abstract interpretation of the standard trace semantics. Phase semantics precisely models the metamorphic code behavior by providing a set of traces of programs which correspond to the possible evolutions of the metamorphic code during execution. We show that metamorphic signatures can be automatically extracted by abstract interpretation of the phase semantics, and that regular metamorphism can be modelled as a finite-state automata abstraction of the phase semantics.
El-Damanhoury, Hatem M.; Fakhruddin, Kausar Sadia; Awad, Manal A.
2014-01-01
Objective: To assess the feasibility of teaching the International Caries Detection and Assessment System (ICDAS) II and its e-learning program as tools for occlusal caries detection to freshman dental students, in comparison to dental graduates with 2 years of experience. Materials and Methods: Eighty-four freshmen and 32 dental graduates examined the occlusal surfaces of molars/premolars (n = 72) after a lecture and a hands-on workshop. The same procedure was repeated 1 month after training with the ICDAS II e-learning program. Validation of ICDAS II codes was done histologically. Intra- and inter-examiner reproducibility of ICDAS II severity scores were assessed before and after e-learning using Fleiss's kappa. Results: The kappa values showed intra-examiner reproducibility ranging from 0.53 (ICDAS II code cut-off ≥ 1) to 0.70 (ICDAS II code cut-off ≥ 3) for undergraduates and from 0.69 (ICDAS II code cut-off ≥ 1) to 0.95 (ICDAS II code cut-off ≥ 3) for graduates. Inter-examiner reproducibility ranged from 0.64 (ICDAS II code cut-off ≥ 1) to 0.89 (ICDAS II code cut-off ≥ 3). No statistically significant difference was found between the two groups in intra-examiner agreement for assessing ICDAS II codes. A highly statistically significant difference (P ≤ 0.01) in the correct identification of codes 1, 2, and 4 from before to after e-learning was observed in both groups. The bias indices for the undergraduate group were higher than those of the graduate group. Conclusions: Early exposure of students to ICDAS II is a valuable method of teaching caries detection, and its e-learning program significantly improves their caries diagnostic skills. PMID:25512730
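For readers unfamiliar with the agreement statistic used above, the sketch below computes Cohen's kappa for two scoring sessions of the same teeth; the study itself used Fleiss's kappa, which generalizes this to more than two raters, and the ICDAS codes shown are invented example data.

# Simplified sketch: agreement between two scoring sessions of ICDAS II codes
# using Cohen's kappa (a two-rater stand-in for the Fleiss's kappa used above).
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    pa, pb = Counter(rater_a), Counter(rater_b)
    expected = sum((pa[c] / n) * (pb[c] / n) for c in categories)
    return (observed - expected) / (1 - expected)

exam1 = [0, 1, 2, 2, 3, 4, 1, 0, 2, 3]   # ICDAS II codes, first scoring session (invented)
exam2 = [0, 1, 2, 3, 3, 4, 1, 0, 2, 2]   # second scoring session, same teeth (invented)
print(round(cohens_kappa(exam1, exam2), 2))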
DOE Office of Scientific and Technical Information (OSTI.GOV)
Epiney, A.; Canepa, S.; Zerkak, O.
The STARS project at the Paul Scherrer Institut (PSI) has adopted the TRACE thermal-hydraulic (T-H) code for best-estimate system transient simulations of the Swiss Light Water Reactors (LWRs). For analyses involving interactions between system and core, a coupling of TRACE with the SIMULATE-3K (S3K) LWR core simulator has also been developed. In this configuration, the TRACE code and associated nuclear power reactor simulation models play a central role to achieve a comprehensive safety analysis capability. Thus, efforts have now been undertaken to consolidate the validation strategy by implementing a more rigorous and structured assessment approach for TRACE applications involving either only system T-H evaluations or requiring interfaces to e.g. detailed core or fuel behavior models. The first part of this paper presents the preliminary concepts of this validation strategy. The principle is to systematically track the evolution of a given set of predicted physical Quantities of Interest (QoIs) over a multidimensional parametric space where each of the dimensions represent the evolution of specific analysis aspects, including e.g. code version, transient specific simulation methodology and model "nodalisation". If properly set up, such environment should provide code developers and code users with persistent (less affected by user effect) and quantified information (sensitivity of QoIs) on the applicability of a simulation scheme (codes, input models, methodology) for steady state and transient analysis of full LWR systems. Through this, for each given transient/accident, critical paths of the validation process can be identified that could then translate into defining reference schemes to be applied for downstream predictive simulations. In order to illustrate this approach, the second part of this paper presents a first application of this validation strategy to an inadvertent blowdown event that occurred in a Swiss BWR/6. The transient was initiated by the spurious actuation of the Automatic Depressurization System (ADS). The validation approach progresses through a number of dimensions here: First, the same BWR system simulation model is assessed for different versions of the TRACE code, up to the most recent one. The second dimension is the "nodalisation" dimension, where changes to the input model are assessed. The third dimension is the "methodology" dimension. In this case imposed power and an updated TRACE core model are investigated. For each step in each validation dimension, a common set of QoIs are investigated. For the steady-state results, these include fuel temperatures distributions. For the transient part of the present study, the evaluated QoIs include the system pressure evolution and water carry-over into the steam line.
Using Docker Containers to Extend Reproducibility Architecture for the NASA Earth Exchange (NEX)
NASA Technical Reports Server (NTRS)
Votava, Petr; Michaelis, Andrew; Spaulding, Ryan; Becker, Jeffrey C.
2016-01-01
NASA Earth Exchange (NEX) is a data, supercomputing and knowledge collaboratory that houses NASA satellite, climate and ancillary data where a focused community can come together to address large-scale challenges in Earth sciences. As NEX has been growing into a petabyte-size platform for analysis, experiments and data production, it has been increasingly important to enable users to easily retrace their steps, identify what datasets were produced by which process chains, and give them the ability to readily reproduce their results. This can be a tedious and difficult task even for a small project, but is almost impossible on large processing pipelines. We have developed an initial reproducibility and knowledge capture solution for the NEX, however, if users want to move the code to another system, whether it is their home institution cluster, laptop or the cloud, they have to find, build and install all the required dependencies that would run their code. This can be a very tedious and tricky process and is a big impediment to moving code to data and reproducibility outside the original system. The NEX team has tried to assist users who wanted to move their code into OpenNEX on Amazon cloud by creating custom virtual machines with all the software and dependencies installed, but this, while solving some of the issues, creates a new bottleneck that requires the NEX team to be involved with any new request, updates to virtual machines and general maintenance support. In this presentation, we will describe a solution that integrates NEX and Docker to bridge the gap in code-to-data migration. The core of the solution is semi-automatic conversion of science codes, tools and services that are already tracked and described in the NEX provenance system, to Docker - an open-source Linux container software. Docker is available on most computer platforms, easy to install and capable of seamlessly creating and/or executing any application packaged in the appropriate format. We believe this is an important step towards seamless process deployment in heterogeneous environments that will enhance community access to NASA data and tools in a scalable way, promote software reuse, and improve reproducibility of scientific results.
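One way to picture the code-to-Docker conversion described above is to generate a Dockerfile from a provenance record of a tracked tool. The record fields, package list, and paths below are hypothetical placeholders, not the NEX provenance schema.

# Illustrative sketch only (field names and tool are hypothetical): turn a
# provenance record of a tracked tool into a Dockerfile.
provenance = {
    "tool": "ndvi_composite.py",
    "base_image": "python:3.10-slim",
    "pip_packages": ["numpy", "rasterio"],
    "entry": ["python", "/opt/tool/ndvi_composite.py"],
}

dockerfile = "\n".join([
    f"FROM {provenance['base_image']}",
    f"RUN pip install --no-cache-dir {' '.join(provenance['pip_packages'])}",
    f"COPY {provenance['tool']} /opt/tool/{provenance['tool']}",
    "ENTRYPOINT " + str(provenance["entry"]).replace("'", '"'),
])

with open("Dockerfile", "w") as f:
    f.write(dockerfile + "\n")
print(dockerfile)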
Using Docker Containers to Extend Reproducibility Architecture for the NASA Earth Exchange (NEX)
NASA Astrophysics Data System (ADS)
Votava, P.; Michaelis, A.; Spaulding, R.; Becker, J. C.
2016-12-01
NASA Earth Exchange (NEX) is a data, supercomputing and knowledge collaboratory that houses NASA satellite, climate and ancillary data where a focused community can come together to address large-scale challenges in Earth sciences. As NEX has been growing into a petabyte-size platform for analysis, experiments and data production, it has been increasingly important to enable users to easily retrace their steps, identify what datasets were produced by which process chains, and give them the ability to readily reproduce their results. This can be a tedious and difficult task even for a small project, but is almost impossible on large processing pipelines. We have developed an initial reproducibility and knowledge capture solution for the NEX, however, if users want to move the code to another system, whether it is their home institution cluster, laptop or the cloud, they have to find, build and install all the required dependencies that would run their code. This can be a very tedious and tricky process and is a big impediment to moving code to data and reproducibility outside the original system. The NEX team has tried to assist users who wanted to move their code into OpenNEX on Amazon cloud by creating custom virtual machines with all the software and dependencies installed, but this, while solving some of the issues, creates a new bottleneck that requires the NEX team to be involved with any new request, updates to virtual machines and general maintenance support. In this presentation, we will describe a solution that integrates NEX and Docker to bridge the gap in code-to-data migration. The core of the solution is semi-automatic conversion of science codes, tools and services that are already tracked and described in the NEX provenance system, to Docker - an open-source Linux container software. Docker is available on most computer platforms, easy to install and capable of seamlessly creating and/or executing any application packaged in the appropriate format. We believe this is an important step towards seamless process deployment in heterogeneous environments that will enhance community access to NASA data and tools in a scalable way, promote software reuse, and improve reproducibility of scientific results.
Contribution of finger tracing to the recognition of Chinese characters.
Yim-Ng, Y Y; Varley, R; Andrade, J
2000-01-01
Finger tracing is a simulation of the act of writing without the use of pen and paper. It is claimed to help in the processing of Chinese characters, possibly by providing additional motor coding. In this study, blindfolded subjects were equally good at identifying Chinese characters and novel visual stimuli through passive movements made with the index finger of the preferred hand and those made with the last finger of that hand. This suggests that finger tracing provides a relatively high level of coding specific to individual characters, but non-specific to motor effectors. Beginning each stroke from the same location, i.e. removing spatial information, impaired recognition of the familiar characters and the novel nonsense figures. Passively tracing the strokes in a random sequence also impaired recognition of the characters. These results therefore suggest that the beneficial effect of finger tracing on writing or recall of Chinese characters is mediated by sequence and spatial information embedded in the motor movements, and that the proprioceptive channel may play a part in mediating visuo-spatial information. Finger tracing may be a useful strategy for remediation of Chinese language impairments.
Thermal-chemical Mantle Convection Models With Adaptive Mesh Refinement
NASA Astrophysics Data System (ADS)
Leng, W.; Zhong, S.
2008-12-01
In numerical modeling of mantle convection, resolution is often crucial for resolving small-scale features. New techniques, adaptive mesh refinement (AMR), allow local mesh refinement wherever high resolution is needed, while leaving other regions at relatively low resolution. Both computational efficiency for large-scale simulation and accuracy for small-scale features can thus be achieved with AMR. Based on the octree data structure [Tu et al. 2005], we implement AMR techniques in 2-D mantle convection models. For pure thermal convection models, benchmark tests show that our code can achieve high accuracy with a relatively small number of elements, both for isoviscous cases (7492 AMR elements vs. 65536 uniform elements) and for temperature-dependent viscosity cases (14620 AMR elements vs. 65536 uniform elements). We further implement a tracer method in the models for simulating thermal-chemical convection. By appropriately adding and removing tracers according to the refinement of the meshes, our code successfully reproduces the benchmark results of van Keken et al. [1997] with far fewer elements and tracers than uniform-mesh models (7552 AMR elements vs. 16384 uniform elements, and ~83000 tracers vs. ~410000 tracers). The boundaries of the chemical piles in our AMR code can easily be refined to scales of a few kilometers for the Earth's mantle, and the tracers are concentrated near the chemical boundaries to precisely trace the evolution of those boundaries. Our AMR code is thus well suited to thermal-chemical convection problems that need high resolution to resolve the evolution of chemical boundaries, such as entrainment problems [Sleep, 1988].
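A one-dimensional toy version of the refinement decision described above is sketched below: cells are flagged wherever the local temperature gradient exceeds a threshold, so resolution concentrates near sharp boundaries. The grid, profile, and threshold are illustrative and unrelated to the octree implementation.

# Toy 1-D sketch of the refinement criterion (not the octree code): flag cells
# where the temperature gradient is large, so resolution concentrates near
# thermal/chemical boundaries.
import numpy as np

x = np.linspace(0.0, 1.0, 65)                       # coarse, uniform cell edges
T = 0.5 * (1.0 + np.tanh((x - 0.5) / 0.02))         # sharp boundary near x = 0.5

grad = np.abs(np.diff(T) / np.diff(x))              # per-cell gradient magnitude
refine = grad > 5.0                                 # refinement criterion (illustrative threshold)

print(f"{refine.sum()} of {refine.size} cells flagged for refinement")
print("flagged interval:", x[:-1][refine].min(), "-", x[1:][refine].max())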
TEA: A Code Calculating Thermochemical Equilibrium Abundances
NASA Astrophysics Data System (ADS)
Blecic, Jasmina; Harrington, Joseph; Bowman, M. Oliver
2016-07-01
We present an open-source Thermochemical Equilibrium Abundances (TEA) code that calculates the abundances of gaseous molecular species. The code is based on the methodology of White et al. and Eriksson. It applies Gibbs free-energy minimization using an iterative, Lagrangian optimization scheme. Given elemental abundances, TEA calculates molecular abundances for a particular temperature and pressure or a list of temperature-pressure pairs. We tested the code against the method of Burrows & Sharp, the free thermochemical equilibrium code Chemical Equilibrium with Applications (CEA), and the example given by Burrows & Sharp. Using their thermodynamic data, TEA reproduces their final abundances, but with higher precision. We also applied the TEA abundance calculations to models of several hot-Jupiter exoplanets, producing expected results. TEA is written in Python in a modular format. There is a start guide, a user manual, and a code document in addition to this theory paper. TEA is available under a reproducible-research, open-source license via https://github.com/dzesmin/TEA.
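The Gibbs-minimization step performed by TEA can be illustrated with a deliberately tiny toy problem: equilibrium between H and H2 at fixed temperature and unit pressure under a hydrogen mass-balance constraint, solved with SciPy's constrained minimizer. The standard free-energy values are placeholders, not real thermodynamic data, and this is not the TEA code.

# Toy sketch of Gibbs free-energy minimization under an elemental-abundance
# constraint (placeholder free energies; P = 1 bar assumed; not the TEA code).
import numpy as np
from scipy.optimize import minimize

R, T = 8.314, 3000.0                 # J/(mol K), K
g0 = {"H": 150e3, "H2": 0.0}         # placeholder standard free energies (J/mol)
total_H_atoms = 2.0                  # elemental abundance constraint

def gibbs(n):
    n = np.maximum(n, 1e-12)
    n_tot = n.sum()
    mu = np.array([g0["H"], g0["H2"]]) + R * T * np.log(n / n_tot)
    return float(n @ mu)

cons = {"type": "eq", "fun": lambda n: n[0] + 2.0 * n[1] - total_H_atoms}
res = minimize(gibbs, x0=[1.0, 0.5], bounds=[(1e-12, None)] * 2,
               constraints=[cons], method="SLSQP")
print(dict(zip(["H", "H2"], res.x.round(4))))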
TEA: A CODE CALCULATING THERMOCHEMICAL EQUILIBRIUM ABUNDANCES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blecic, Jasmina; Harrington, Joseph; Bowman, M. Oliver, E-mail: jasmina@physics.ucf.edu
2016-07-01
We present an open-source Thermochemical Equilibrium Abundances (TEA) code that calculates the abundances of gaseous molecular species. The code is based on the methodology of White et al. and Eriksson. It applies Gibbs free-energy minimization using an iterative, Lagrangian optimization scheme. Given elemental abundances, TEA calculates molecular abundances for a particular temperature and pressure or a list of temperature–pressure pairs. We tested the code against the method of Burrows and Sharp, the free thermochemical equilibrium code Chemical Equilibrium with Applications (CEA), and the example given by Burrows and Sharp. Using their thermodynamic data, TEA reproduces their final abundances, but with higher precision. We also applied the TEA abundance calculations to models of several hot-Jupiter exoplanets, producing expected results. TEA is written in Python in a modular format. There is a start guide, a user manual, and a code document in addition to this theory paper. TEA is available under a reproducible-research, open-source license via https://github.com/dzesmin/TEA.
Digitized forensics: retaining a link between physical and digital crime scene traces using QR-codes
NASA Astrophysics Data System (ADS)
Hildebrandt, Mario; Kiltz, Stefan; Dittmann, Jana
2013-03-01
The digitization of physical traces from crime scenes in forensic investigations in effect creates a digital chain-of-custody and brings the challenge of creating a link between two or more representations of the same trace. In order to be forensically sound, the two security aspects of integrity and authenticity in particular need to be maintained at all times. Maintaining authenticity by technical means proves especially challenging at the boundary between the physical object and its digital representations. In this article we propose a new method of linking physical objects with their digital counterparts using two-dimensional bar codes and additional meta-data accompanying the acquired data, for integration into the conventional documentation of the collection of items of evidence (the bagging and tagging process). Using the QR-code as an exemplary bar code implementation and a model of the forensic process, we also supply a means to integrate our suggested approach into forensically sound proceedings as described by Holder et al. [1]. We use the example of digital dactyloscopy as a forensic discipline where progress is currently being made by digitizing some of the processing steps. We show an exemplary demonstrator of the suggested approach using a smartphone as a mobile device for verification of the physical trace, to extend the chain-of-custody from the physical to the digital domain. Our evaluation of the demonstrator addresses the readability of the bar code and the verification of its contents. We can read the bar code despite its limited size of 42 x 42 mm and the rather large amount of embedded data, using various devices. Furthermore, the QR-code's error correction features help to recover the contents of damaged codes. Subsequently, our appended digital signature allows for detecting malicious manipulations of the embedded data.
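A minimal sketch of the labeling idea follows: pack trace metadata and an authenticity tag into a QR code that can be printed and attached to the evidence bag. It assumes the third-party "qrcode" package (pip install qrcode[pil]); an HMAC stands in for the digital signature mentioned above, and all field values are invented.

# Minimal sketch (not the authors' implementation): trace metadata plus an
# integrity/authenticity tag packed into a QR code for the evidence label.
import hashlib, hmac, json
import qrcode

secret_key = b"lab-issued-secret"                      # placeholder key material
meta = {
    "trace_id": "scene07/print-12",                    # invented identifiers
    "discipline": "dactyloscopy",
    "scan_sha256": hashlib.sha256(b"...raw scan bytes...").hexdigest(),
    "collected_by": "examiner-42",
}
payload = json.dumps(meta, sort_keys=True)
tag = hmac.new(secret_key, payload.encode(), hashlib.sha256).hexdigest()

img = qrcode.make(json.dumps({"meta": meta, "tag": tag}))
img.save("evidence_label.png")                         # printed and attached to the evidence bag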
Reproducible research in vadose zone sciences
USDA-ARS?s Scientific Manuscript database
A significant portion of present-day soil and Earth science research is computational, involving complex data analysis pipelines, advanced mathematical and statistical models, and sophisticated computer codes. Opportunities for scientific progress are greatly diminished if reproducing and building o...
NASA Astrophysics Data System (ADS)
Mishchenko, Michael I.; Liu, Li; Mackowski, Daniel W.
2013-07-01
We use state-of-the-art public-domain Fortran codes based on the T-matrix method to calculate orientation and ensemble averaged scattering matrix elements for a variety of morphologically complex black carbon (BC) and BC-containing aerosol particles, with a special emphasis on the linear depolarization ratio (LDR). We explain theoretically the quasi-Rayleigh LDR peak at side-scattering angles typical of low-density soot fractals and conclude that the measurement of this feature enables one to evaluate the compactness state of BC clusters and trace the evolution of low-density fluffy fractals into densely packed aggregates. We show that small backscattering LDRs measured with ground-based, airborne, and spaceborne lidars for fresh smoke generally agree with the values predicted theoretically for fluffy BC fractals and densely packed near-spheroidal BC aggregates. To reproduce higher lidar LDRs observed for aged smoke, one needs alternative particle models such as shape mixtures of BC spheroids or cylinders.
NASA Technical Reports Server (NTRS)
Mishchenko, Michael I.; Liu, Li; Mackowski, Daniel W.
2013-01-01
We use state-of-the-art public-domain Fortran codes based on the T-matrix method to calculate orientation and ensemble averaged scattering matrix elements for a variety of morphologically complex black carbon (BC) and BC-containing aerosol particles, with a special emphasis on the linear depolarization ratio (LDR). We explain theoretically the quasi-Rayleigh LDR peak at side-scattering angles typical of low-density soot fractals and conclude that the measurement of this feature enables one to evaluate the compactness state of BC clusters and trace the evolution of low-density fluffy fractals into densely packed aggregates. We show that small backscattering LDRs measured with ground-based, airborne, and spaceborne lidars for fresh smoke generally agree with the values predicted theoretically for fluffy BC fractals and densely packed near-spheroidal BC aggregates. To reproduce higher lidar LDRs observed for aged smoke, one needs alternative particle models such as shape mixtures of BC spheroids or cylinders.
A hybrid model of laser energy deposition for multi-dimensional simulations of plasmas and metals
NASA Astrophysics Data System (ADS)
Basko, Mikhail M.; Tsygvintsev, Ilia P.
2017-05-01
The hybrid model of laser energy deposition is a combination of the geometrical-optics ray-tracing method with a one-dimensional (1D) solution of the Helmholtz wave equation in regions where geometrical optics becomes inapplicable. We propose an improved version of this model, in which a new, physically consistent criterion for the transition to 1D wave optics is derived and a special rescaling procedure for the wave-optics deposition profile is introduced. The model is intended for applications in large-scale two- and three-dimensional hydrodynamic codes. Comparison with exact 1D solutions demonstrates that it can fairly accurately reproduce the absorption fraction in both the s- and p-polarizations on arbitrarily steep density gradients, provided that a sufficiently accurate algorithm for gradient evaluation is used. The accuracy of the model becomes questionable for long laser pulses simulated on overly fine grids, where the hydrodynamic self-focusing instability strongly manifests itself.
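The paper derives its own, physically consistent hand-off criterion; purely as an illustration of the idea, the sketch below falls back to wave optics wherever the local density scale length approaches the laser wavelength. The density profile, wavelength, and threshold factor are arbitrary assumptions.

# Sketch of one plausible switching criterion (illustration only, not the paper's
# criterion): use wave optics where the density scale length L = n/|dn/dx|
# approaches the laser wavelength.
import numpy as np

wavelength = 1.06e-6                              # m (Nd-glass laser, for example)
x = np.linspace(0.0, 50e-6, 2001)                 # m
n_e = 1e27 * np.exp(-(x / 20e-6) ** 2)            # toy electron density profile (1/m^3)

grad = np.abs(np.gradient(n_e, x))
scale_length = n_e / np.maximum(grad, 1e-30)
use_wave_optics = scale_length < 5.0 * wavelength # hand-off threshold (illustrative factor)

print(f"wave-optics treatment on {use_wave_optics.mean():.0%} of the grid")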
AlBarakati, SF; Kula, KS; Ghoneima, AA
2012-01-01
Objective: The aim of this study was to assess the reliability and reproducibility of angular and linear measurements obtained by conventional and digital cephalometric methods. Methods: A total of 13 landmarks and 16 skeletal and dental parameters were defined and measured on pre-treatment cephalometric radiographs of 30 patients. The conventional and digital tracings and measurements were performed twice by the same examiner, with a 6-week interval between measurements. Reliability within each method was determined using Pearson's correlation coefficient (r2). Reproducibility between methods was assessed by paired t-test. The level of statistical significance was set at p < 0.05. Results: All measurements for each method had r2 above 0.90 (strong correlation) except maxillary length, which had a correlation of 0.82 for conventional tracing. Significant differences between the two methods were observed in most angular and linear measurements, except for the ANB angle (p = 0.5), angle of convexity (p = 0.09), anterior cranial base (p = 0.3), and lower anterior facial height (p = 0.6). Conclusion: In general, both conventional and digital cephalometric analysis are highly reliable. Although the reproducibility of the two methods showed some statistically significant differences, most differences were not clinically significant. PMID:22184624
NASA Astrophysics Data System (ADS)
Fagre, M.; Elias, A. G.; Chum, J.; Cabrera, M. A.
2017-12-01
In the present work, ray tracing of high frequency (HF) signals under disturbed ionospheric conditions is analyzed, particularly in the presence of electron density perturbations generated by gravity waves (GWs). The three-dimensional numerical ray tracing code of Jones and Stephenson, based on Hamilton's equations and commonly used to study radio propagation through the ionosphere, is employed. An electron density perturbation model is implemented in this code, based on atmospheric GWs generated at a height of 150 km in the thermosphere and propagating up into the ionosphere. The motion of the neutral gas at these altitudes induces disturbances in the background plasma which affect HF signal propagation. To obtain a realistic model of the GWs and analyze their propagation and dispersion characteristics, a GW ray tracing method with kinematic viscosity and thermal diffusivity was applied. The IRI-2012, HWM14, and NRLMSISE-00 models were incorporated to provide the electron density, wind velocities, neutral temperature, and total mass density needed by the ray tracing codes. Preliminary results of gravity wave effects on ground range and reflection height are presented for the low- and mid-latitude ionosphere.
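A stripped-down, two-dimensional sketch of Hamiltonian ray stepping is given below, with a toy refractive-index profile standing in for the IRI/HWM/MSIS-driven ionosphere of the study; the launch elevation, layer parameters, and step size are arbitrary assumptions.

# Minimal 2-D sketch of Hamiltonian ray stepping (toy refractive index, not the
# study's ionosphere): H = (|k|^2 - n(x, z)^2) / 2 in units where omega/c = 1,
# integrated with simple Euler steps.
import numpy as np

def n(x, z):
    """Toy refractive index: dips around z = 200 km, mimicking a refracting layer."""
    return 1.0 - 0.3 * np.exp(-((z - 200.0) / 50.0) ** 2)

def grad_n(x, z, h=1e-3):
    return np.array([(n(x + h, z) - n(x - h, z)) / (2 * h),
                     (n(x, z + h) - n(x, z - h)) / (2 * h)])

r = np.array([0.0, 0.0])                             # ground transmitter (km)
elev = np.radians(25.0)
k = n(*r) * np.array([np.cos(elev), np.sin(elev)])   # |k| = n on the dispersion surface

dt, path = 1.0, [r.copy()]
for _ in range(1500):
    r = r + dt * k                                   # dr/dt =  dH/dk = k
    k = k + dt * n(*r) * grad_n(*r)                  # dk/dt = -dH/dr = n * grad(n)
    path.append(r.copy())
    if r[1] < 0.0:                                   # ray returned to the ground
        break

print(f"ground range at return: {r[0]:.0f} km after {len(path)} steps")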
Review Of Piping And Pressure Vessel Code Design Criteria. Technical Report 217.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
1969-04-18
This Technical Report summarizes a review of the design philosophies and criteria of the ASME Boiler and Pressure Vessel Code and the USASI Code for Pressure Piping. It traces the history of the Codes since their inception and critically reviews their present status. Recommendations are made concerning the applicability of the Codes to the special needs of LMFBR liquid sodium piping.
Porcupine: A visual pipeline tool for neuroimaging analysis
Snoek, Lukas; Knapen, Tomas
2018-01-01
The field of neuroimaging is rapidly adopting a more reproducible approach to data acquisition and analysis. Data structures and formats are being standardised and data analyses are getting more automated. However, as data analysis becomes more complicated, researchers often have to write longer analysis scripts, spanning different tools across multiple programming languages. This makes it more difficult to share or recreate code, reducing the reproducibility of the analysis. We present a tool, Porcupine, that constructs one’s analysis visually and automatically produces analysis code. The graphical representation improves understanding of the performed analysis, while retaining the flexibility of modifying the produced code manually to custom needs. Not only does Porcupine produce the analysis code, it also creates a shareable environment for running the code in the form of a Docker image. Together, this forms a reproducible way of constructing, visualising and sharing one’s analysis. Currently, Porcupine links to Nipype functionalities, which in turn accesses most standard neuroimaging analysis tools. Our goal is to release researchers from the constraints of specific implementation details, thereby freeing them to think about novel and creative ways to solve a given problem. Porcupine improves the overview researchers have of their processing pipelines, and facilitates both the development and communication of their work. This will reduce the threshold at which less expert users can generate reusable pipelines. With Porcupine, we bridge the gap between a conceptual and an implementational level of analysis and make it easier for researchers to create reproducible and shareable science. We provide a wide range of examples and documentation, as well as installer files for all platforms on our website: https://timvanmourik.github.io/Porcupine. Porcupine is free, open source, and released under the GNU General Public License v3.0. PMID:29746461
Reproducibility and Transparency in Ocean-Climate Modeling
NASA Astrophysics Data System (ADS)
Hannah, N.; Adcroft, A.; Hallberg, R.; Griffies, S. M.
2015-12-01
Reproducibility is a cornerstone of the scientific method. Within geophysical modeling and simulation, achieving reproducibility can be difficult, especially given the complexity of numerical codes, enormous and disparate data sets, and the variety of supercomputing technology. We have made progress on this problem in the context of a large project - the development of new ocean and sea ice models, MOM6 and SIS2. Here we present useful techniques and experience. We use version control not only for code but for the entire experiment working directory, including configuration (run-time parameters, component versions), input data, and checksums on experiment output. This allows us to document when the solutions to experiments change, whether due to code updates or changes in input data. To avoid distributing large input datasets, we provide the tools for generating these from the sources rather than providing raw input data. Bugs can be a source of non-determinism and hence irreproducibility, e.g. reading from or branching on uninitialized memory. To expose these we routinely run system tests, using a memory debugger, multiple compilers, and different machines. Additional confidence in the code comes from specialised tests, for example automated dimensional analysis and domain transformations. This has entailed adopting a code style in which we deliberately restrict what a compiler can do when re-arranging mathematical expressions. In the spirit of open science, all development is in the public domain. This leads to a positive feedback, where increased transparency and reproducibility make using the model easier for external collaborators, who in turn provide valuable contributions. To facilitate users installing and running the model, we provide (version-controlled) digital notebooks that illustrate and record analysis of output. This has the dual role of providing a gross, platform-independent testing capability and a means to document model output and analysis.
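The output-checksum practice described above can be sketched in a few lines: record SHA-256 digests of a run's output files and flag any change after a code or input update. The file pattern and paths are placeholders, and this is not the MOM6/SIS2 tooling itself.

# Small sketch of checksumming experiment output (illustrative paths/patterns):
# record digests of output files and flag any change after a code or input update.
import hashlib, json, pathlib

def checksum_outputs(run_dir, pattern="*.nc"):
    digests = {}
    for path in sorted(pathlib.Path(run_dir).glob(pattern)):
        digests[path.name] = hashlib.sha256(path.read_bytes()).hexdigest()
    return digests

def changed_outputs(reference_file, run_dir):
    reference = json.loads(pathlib.Path(reference_file).read_text())
    current = checksum_outputs(run_dir)
    return [name for name, digest in current.items() if reference.get(name) != digest]

# Typical use: commit 'reference_checksums.json' to version control alongside the
# experiment configuration, then re-check after every code or input update:
# print(changed_outputs("reference_checksums.json", "experiment/output"))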
Evolvix BEST Names for semantic reproducibility across code2brain interfaces.
Loewe, Laurence; Scheuer, Katherine S; Keel, Seth A; Vyas, Vaibhav; Liblit, Ben; Hanlon, Bret; Ferris, Michael C; Yin, John; Dutra, Inês; Pietsch, Anthony; Javid, Christine G; Moog, Cecilia L; Meyer, Jocelyn; Dresel, Jerdon; McLoone, Brian; Loberger, Sonya; Movaghar, Arezoo; Gilchrist-Scott, Morgaine; Sabri, Yazeed; Sescleifer, Dave; Pereda-Zorrilla, Ivan; Zietlow, Andrew; Smith, Rodrigo; Pietenpol, Samantha; Goldfinger, Jacob; Atzen, Sarah L; Freiberg, Erika; Waters, Noah P; Nusbaum, Claire; Nolan, Erik; Hotz, Alyssa; Kliman, Richard M; Mentewab, Ayalew; Fregien, Nathan; Loewe, Martha
2017-01-01
Names in programming are vital for understanding the meaning of code and big data. We define code2brain (C2B) interfaces as maps in compilers and brains between meaning and naming syntax, which help to understand executable code. While working toward an Evolvix syntax for general-purpose programming that makes accurate modeling easy for biologists, we observed how names affect C2B quality. To protect learning and coding investments, C2B interfaces require long-term backward compatibility and semantic reproducibility (accurate reproduction of computational meaning from coder-brains to reader-brains by code alone). Semantic reproducibility is often assumed until confusing synonyms degrade modeling in biology to deciphering exercises. We highlight empirical naming priorities from diverse individuals and roles of names in different modes of computing to show how naming easily becomes impossibly difficult. We present the Evolvix BEST (Brief, Explicit, Summarizing, Technical) Names concept for reducing naming priority conflicts, test it on a real challenge by naming subfolders for the Project Organization Stabilizing Tool system, and provide naming questionnaires designed to facilitate C2B debugging by improving names used as keywords in a stabilizing programming language. Our experiences inspired us to develop Evolvix using a flipped programming language design approach with some unexpected features and BEST Names at its core. © 2016 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.
Colour-dressed hexagon tessellations for correlation functions and non-planar corrections
NASA Astrophysics Data System (ADS)
Eden, Burkhard; Jiang, Yunfeng; le Plat, Dennis; Sfondrini, Alessandro
2018-02-01
We continue the study of four-point correlation functions by the hexagon tessellation approach initiated in [38] and [39]. We consider planar tree-level correlation functions in N=4 supersymmetric Yang-Mills theory involving two non-protected operators. We find that, in order to reproduce the field theory result, it is necessary to include SU( N) colour factors in the hexagon formalism; moreover, we find that the hexagon approach as it stands is naturally tailored to the single-trace part of correlation functions, and does not account for multi-trace admixtures. We discuss how to compute correlators involving double-trace operators, as well as more general 1 /N effects; in particular we compute the whole next-to-leading order in the large- N expansion of tree-level BMN two-point functions by tessellating a torus with punctures. Finally, we turn to the issue of "wrapping", Lüscher-like corrections. We show that SU( N) colour-dressing reproduces an earlier empirical rule for incorporating single-magnon wrapping, and we provide a direct interpretation of such wrapping processes in terms of N=2 supersymmetric Feynman diagrams.
Ray tracing through a hexahedral mesh in HADES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henderson, G L; Aufderheide, M B
In this paper we describe a new ray tracing method targeted for inclusion in HADES. The algorithm tracks rays through three-dimensional tetrakis hexahedral mesh objects, like those used by the ARES code to model inertial confinement experiments.
Adverb Code-Switching among Miami's Haitian Creole-English Second Generation
ERIC Educational Resources Information Center
Hebblethwaite, Benjamin
2010-01-01
The findings for adverbs and adverbial phrases in a naturalistic corpus of Miami Haitian Creole-English code-switching show that one language, Haitian Creole, asymmetrically supplies the grammatical frame while the other language, English, asymmetrically supplies mixed lexical categories like adverbs. Traces of code-switching with an English frame…
Coding of Stimuli by Animals: Retrospection, Prospection, Episodic Memory and Future Planning
ERIC Educational Resources Information Center
Zentall, Thomas R.
2010-01-01
When animals code stimuli for later retrieval they can either code them in terms of the stimulus presented (as a retrospective memory) or in terms of the response or outcome anticipated (as a prospective memory). Although retrospective memory is typically assumed (as in the form of a memory trace), evidence of prospective coding has been found…
Towards a Consolidated Approach for the Assessment of Evaluation Models of Nuclear Power Reactors
Epiney, A.; Canepa, S.; Zerkak, O.; ...
2016-11-02
The STARS project at the Paul Scherrer Institut (PSI) has adopted the TRACE thermal-hydraulic (T-H) code for best-estimate system transient simulations of the Swiss Light Water Reactors (LWRs). For analyses involving interactions between system and core, a coupling of TRACE with the SIMULATE-3K (S3K) LWR core simulator has also been developed. In this configuration, the TRACE code and associated nuclear power reactor simulation models play a central role in achieving a comprehensive safety analysis capability. Thus, efforts have now been undertaken to consolidate the validation strategy by implementing a more rigorous and structured assessment approach for TRACE applications involving either only system T-H evaluations or requiring interfaces to e.g. detailed core or fuel behavior models. The first part of this paper presents the preliminary concepts of this validation strategy. The principle is to systematically track the evolution of a given set of predicted physical Quantities of Interest (QoIs) over a multidimensional parametric space where each of the dimensions represents the evolution of a specific analysis aspect, including e.g. code version, transient-specific simulation methodology and model "nodalisation". If properly set up, such an environment should provide code developers and code users with persistent (less affected by user effect) and quantified information (sensitivity of QoIs) on the applicability of a simulation scheme (codes, input models, methodology) for steady-state and transient analysis of full LWR systems. Through this, for each given transient/accident, critical paths of the validation process can be identified that could then translate into defining reference schemes to be applied for downstream predictive simulations. To illustrate this approach, the second part of this paper presents a first application of this validation strategy to an inadvertent blowdown event that occurred in a Swiss BWR/6. The transient was initiated by the spurious actuation of the Automatic Depressurization System (ADS). The validation approach progresses through a number of dimensions here: first, the same BWR system simulation model is assessed for different versions of the TRACE code, up to the most recent one. The second dimension is the "nodalisation" dimension, where changes to the input model are assessed. The third dimension is the "methodology" dimension; in this case imposed power and an updated TRACE core model are investigated. For each step in each validation dimension, a common set of QoIs is investigated. For the steady-state results, these include fuel temperature distributions. For the transient part of the present study, the evaluated QoIs include the system pressure evolution and water carry-over into the steam line.
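To make the "multidimensional parametric space" of this validation strategy concrete, the following minimal Python sketch (not PSI's actual tooling; version labels and QoI values are invented) records a common set of QoIs over the dimensions of code version, nodalisation and methodology, so that sensitivities along any single dimension can be extracted.

```python
# Minimal sketch: track Quantities of Interest (QoIs) across validation
# dimensions (code version, "nodalisation", methodology). All labels and
# values are placeholders, not results from the paper.
import itertools

code_versions = ["TRACE vA", "TRACE vB"]          # hypothetical labels
nodalisations = ["coarse", "refined"]
methodologies = ["imposed power", "coupled core model"]

def run_case(version, nodalisation, methodology):
    """Placeholder for launching a run and extracting its QoIs."""
    return {"system_pressure_peak_MPa": 7.1, "carry_over_kg": 12.3}  # dummy values

results = {}
for point in itertools.product(code_versions, nodalisations, methodologies):
    results[point] = run_case(*point)

# Sensitivity of one QoI along the code-version dimension, other dimensions fixed
baseline = results[(code_versions[0], "coarse", "imposed power")]
variant = results[(code_versions[1], "coarse", "imposed power")]
delta = variant["system_pressure_peak_MPa"] - baseline["system_pressure_peak_MPa"]
print(f"Change in peak system pressure across code versions: {delta:+.2f} MPa")
```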
Belostotsky, Inessa; Gridin, Vladimir V; Schechter, Israel; Yarnitzky, Chaim N
2003-02-01
An improved analytical method for airborne lead traces is reported. It is based on using a Venturi scrubber sampling device for simultaneous thin-film stripping and droplet entrapment of aerosol influxes. At least threefold enhancement of the lead-trace pre-concentration is achieved. The sampled traces are analyzed by square-wave anodic stripping voltammetry. The method was tested by a series of pilot experiments. These were performed using contaminant-controlled air intakes. Reproducible calibration plots were obtained. The data were validated by traditional analysis using filter sampling. LODs are comparable with the conventional techniques. The method was successfully applied to on-line and in situ environmental monitoring of lead.
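As an illustration of the calibration step underlying such voltammetric measurements, the short Python sketch below fits a linear calibration of stripping peak current against lead concentration and inverts it for an unknown sample; all numbers are invented, not the authors' data.

```python
# Illustrative linear calibration for anodic stripping voltammetry (invented data).
import numpy as np

conc_ppb = np.array([0.0, 5.0, 10.0, 20.0, 40.0])       # hypothetical standards
current_uA = np.array([0.02, 0.51, 1.03, 2.05, 4.10])    # hypothetical peak currents

slope, intercept = np.polyfit(conc_ppb, current_uA, 1)

def to_concentration(peak_current_uA):
    """Invert the calibration to estimate an unknown lead concentration."""
    return (peak_current_uA - intercept) / slope

print(f"sensitivity = {slope:.3f} uA/ppb, unknown ~ {to_concentration(1.5):.1f} ppb")
```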
The Evolution of Random Number Generation in MUVES
2017-01-01
…mathematical basis and statistical justification for algorithms used in the code. The working code provided produces results identical to the current MUVES implementation… questionable numerical and statistical properties. The development of the modern system is traced through software change requests, resulting in a random number…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laugeman, E; Weiss, E; Chen, S
2014-06-01
Purpose: Evaluate and compare the cycle-to-cycle consistency of breathing patterns and their reproducibility over the course of treatment, for supine and prone positioning. Methods: Respiratory traces from 25 patients were recorded for sequential supine/prone 4DCT scans acquired prior to treatment, and during the course of the treatment (weekly or bi-weekly). For each breathing cycle, the average (AVE), end-of-exhale (EoE) and end-of-inhale (EoI) locations were identified using in-house developed software. In addition, the mean values and variations for the above quantities were computed for each breathing trace. F-tests were used to compare the cycle-to-cycle consistency of all pairs of sequential supine and prone scans. Analysis of variances was also performed using population means for AVE, EoE and EoI to quantify differences between the reproducibility of prone and supine respiration traces over the treatment course. Results: Consistency: Cycle-to-cycle variations are less in prone than supine in the pre-treatment and during-treatment scans for AVE, EoE and EoI points, for the majority of patients (differences significant at p<0.05). The few cases where the respiratory pattern had more variability in prone appeared to be random events. Reproducibility: The reproducibility of breathing patterns (supine and prone) improved as treatment progressed, perhaps due to patients becoming more comfortable with the procedure. However, variability in supine position continued to remain significantly larger than in prone (p<0.05), as indicated by the variance analysis of population means for the pre-treatment and subsequent during-treatment scans. Conclusions: Prone positioning stabilizes breathing patterns in most subjects investigated in this study. Importantly, a parallel analysis of the same group of patients revealed a tendency towards increasing motion amplitude of tumor targets in prone position regardless of their size or location; thus, the choice for body positioning during radiation therapy will have to consider the clinical relevance of the two opposing trends - breathing consistency and motion amplitude.
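The F-test used above to compare cycle-to-cycle variability can be sketched in a few lines of Python; the breathing-trace values below are invented and serve only to show the variance-ratio test.

```python
# Minimal F-test sketch comparing cycle-to-cycle variability of a breathing-trace
# quantity (e.g. end-of-exhale position) between supine and prone scans.
# Numbers are invented, not patient data.
import numpy as np
from scipy import stats

supine_eoe = np.array([1.2, 0.8, 1.5, 1.1, 0.9, 1.4, 1.0])   # hypothetical, mm
prone_eoe = np.array([0.6, 0.7, 0.5, 0.8, 0.6, 0.7, 0.5])    # hypothetical, mm

f_stat = np.var(supine_eoe, ddof=1) / np.var(prone_eoe, ddof=1)
dfn, dfd = len(supine_eoe) - 1, len(prone_eoe) - 1
p_one_sided = stats.f.sf(f_stat, dfn, dfd)   # H1: supine variance > prone variance

print(f"F = {f_stat:.2f}, p = {p_one_sided:.3f}")
```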
The impact of preclinical irreproducibility on drug development.
Freedman, L P; Gibson, M C
2015-01-01
The development of novel therapeutics depends and builds upon the validity and reproducibility of previously published data and findings. Yet irreproducibility is pervasive in preclinical life science research and can be traced to cumulative errors or flaws in several areas, including reference materials, study design, laboratory protocols, and data collection and analysis. The expanded development and use of consensus-based standards and well-documented best practices is needed to both enhance reproducibility and drive therapeutic innovations. © 2014 ASCPT.
Reproducibility and validity of a semi-quantitative FFQ for trace elements.
Lee, Yujin; Park, Kyong
2016-09-01
The aim of this study was to test the reproducibility and validity of a self-administered FFQ for the Trace Element Study of Korean Adults in the Yeungnam area (SELEN). Study subjects were recruited from the SELEN cohort selected from rural and urban areas in Yeungnam, Korea. A semi-quantitative FFQ with 146 items was developed considering the dietary characteristics of cohorts in the study area. In a validation study, seventeen men and forty-eight women aged 38-62 years completed 3-d dietary records (DR) and two FFQ over a 3-month period. The validity was examined with the FFQ and DR, and the reproducibility was estimated using partial correlation coefficients, the Bland-Altman method and cross-classification. There were no significant differences between the mean intakes of selected nutrients as estimated from FFQ1, FFQ2 and DR. The median correlation coefficients for all nutrients were 0·47 and 0·56 in the reproducibility and validity tests, respectively. Bland-Altman's index and cross-classification showed acceptable agreement between FFQ1 and FFQ2 and between FFQ2 and DR. Ultimately, 78 % of the subjects were classified into the same and adjacent quartiles for most nutrients. In addition, the weighted κ value indicated that the two methods agreed fairly. In conclusion, this newly developed FFQ was a suitable dietary assessment method for the SELEN cohort study.
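A minimal sketch of the Bland-Altman analysis referred to above, assuming invented intake values for one nutrient from the two FFQ administrations:

```python
# Bland-Altman agreement between FFQ1 and FFQ2 for one nutrient (invented data).
import numpy as np

ffq1 = np.array([55.0, 62.0, 48.0, 70.0, 66.0, 58.0])   # hypothetical intakes
ffq2 = np.array([58.0, 60.0, 51.0, 73.0, 63.0, 61.0])

diff = ffq2 - ffq1
bias = diff.mean()
loa_low = bias - 1.96 * diff.std(ddof=1)
loa_high = bias + 1.96 * diff.std(ddof=1)

print(f"bias = {bias:.2f}, 95% limits of agreement = [{loa_low:.2f}, {loa_high:.2f}]")
```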
A three-dimensional spacecraft-charging computer code
NASA Technical Reports Server (NTRS)
Rubin, A. G.; Katz, I.; Mandell, M.; Schnuelle, G.; Steen, P.; Parks, D.; Cassidy, J.; Roche, J.
1980-01-01
A computer code is described which simulates the interaction of the space environment with a satellite at geosynchronous altitude. Employing finite elements, a three-dimensional satellite model has been constructed with more than 1000 surface cells and 15 different surface materials. Free space around the satellite is modeled by nesting grids within grids. Applications of this NASA Spacecraft Charging Analyzer Program (NASCAP) code to the study of a satellite photosheath and the differential charging of the SCATHA (satellite charging at high altitudes) satellite in eclipse and in sunlight are discussed. In order to understand detector response when the satellite is charged, the code is used to trace the trajectories of particles reaching the SCATHA detectors. Particle trajectories from positive and negative emitters on SCATHA also are traced to determine the location of returning particles, to estimate the escaping flux, and to simulate active control of satellite potentials.
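The particle-trajectory tracing described above can be illustrated with a much-simplified sketch (this is not NASCAP): integrate an electron through a hypothetical monopole field around a charged body and record its path. The field model, potential, and step sizes are assumptions for illustration only.

```python
# Rough sketch of tracing a charged-particle trajectory through a static
# electric field; not NASCAP, and all parameters are assumed values.
import numpy as np

Q_OVER_M = -1.76e11          # electron charge-to-mass ratio, C/kg

def efield(r):
    """Hypothetical field of a sphere at fixed potential (monopole approximation)."""
    phi_surface, radius = -2.0e3, 1.0        # volts, metres (assumed)
    rnorm = np.linalg.norm(r)
    return phi_surface * radius / rnorm**3 * r   # E ~ phi_s * R / r^2, radial

def trace(r0, v0, dt=1e-9, steps=10000):
    """Semi-implicit Euler integration of dv/dt = (q/m) E(r)."""
    r, v = np.array(r0, float), np.array(v0, float)
    path = [r.copy()]
    for _ in range(steps):
        v += Q_OVER_M * efield(r) * dt
        r += v * dt
        path.append(r.copy())
    return np.array(path)

path = trace(r0=[5.0, 0.0, 0.0], v0=[-1.0e6, 2.0e5, 0.0])
print("final position (m):", path[-1])
```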
Monte Carlo simulation of ion-neutral charge exchange collisions and grid erosion in an ion thruster
NASA Technical Reports Server (NTRS)
Peng, Xiaohang; Ruyten, Wilhelmus M.; Keefer, Dennis
1991-01-01
A combined particle-in-cell (PIC)/Monte Carlo simulation model has been developed in which the PIC method is used to simulate the charge exchange collisions. It is noted that a number of features were reproduced correctly by this code, but that its assumption of two-dimensional axisymmetry for a single set of grid apertures precluded the reproduction of the most characteristic feature of actual test data; namely, the concentrated grid erosion at the geometric center of the hexagonal aperture array. The first results of a three-dimensional code, which takes into account the hexagonal symmetry of the grid, are presented. It is shown that, with this code, the experimentally observed erosion patterns are reproduced correctly, demonstrating explicitly the concentration of sputtering between apertures.
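A hedged sketch of the Monte Carlo charge-exchange step that such PIC/Monte Carlo models typically use; the density, cross-section and time step below are assumed values, not those of the cited simulation.

```python
# Generic Monte Carlo charge-exchange step: each fast ion collides with the
# background neutral gas with probability 1 - exp(-n*sigma*v*dt).
import numpy as np

rng = np.random.default_rng(0)

N_NEUTRAL = 1.0e18      # background neutral density, m^-3 (assumed)
SIGMA_CEX = 5.0e-19     # charge-exchange cross-section, m^2 (assumed)
DT = 1.0e-7             # time step, s (assumed)

def charge_exchange_step(ion_velocities, neutral_thermal_speed=300.0):
    """Return updated velocities and a mask of ions that underwent charge exchange."""
    speeds = np.linalg.norm(ion_velocities, axis=1)
    p_collision = 1.0 - np.exp(-N_NEUTRAL * SIGMA_CEX * speeds * DT)
    collided = rng.random(len(speeds)) < p_collision
    # A charge-exchange event swaps identities: the new (slow) ion starts with a
    # neutral velocity, sampled here from an isotropic thermal spread.
    new_v = ion_velocities.copy()
    new_v[collided] = rng.normal(0.0, neutral_thermal_speed, size=(collided.sum(), 3))
    return new_v, collided

ions = rng.normal([3.0e4, 0.0, 0.0], 1.0e3, size=(10000, 3))   # fast beam ions
ions, collided = charge_exchange_step(ions)
print(f"{collided.sum()} of {len(ions)} ions underwent charge exchange this step")
```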
NASA Astrophysics Data System (ADS)
Jiang, Chaowei; Yan, Xiaoli; Feng, Xueshang; Duan, Aiying; Hu, Qiang; Zuo, Pingbing; Wang, Yi
2017-11-01
As a fundamental magnetic structure in the solar corona, electric current sheets (CSs) can form either prior to or during a solar flare, and they are essential for magnetic energy dissipation in the solar corona because they enable magnetic reconnection. However, the static reconstruction of a CS is rare, possibly due to limitations that are inherent in the available coronal field extrapolation codes. Here we present the reconstruction of a large-scale pre-flare CS in solar active region 11967 using an MHD-relaxation model constrained by the SDO/HMI vector magnetogram. The CS is associated with a set of peculiar homologous flares that exhibit unique X-shaped ribbons and loops occurring in a quadrupolar magnetic configuration. This is evidenced by an 'X' shape, formed from the field lines traced from the CS to the photosphere. This nearly reproduces the shape of the observed flare ribbons, suggesting that the flare is a product of the dissipation of the CS via reconnection. The CS forms in a hyperbolic flux tube, which is an intersection of two quasi-separatrix layers. The recurrence of the X-shaped flares might be attributed to the repetitive formation and dissipation of the CS, as driven by the photospheric footpoint motions. These results demonstrate the power of a data-constrained MHD model in reproducing a CS in the corona as well as providing insight into the magnetic mechanism of solar flares.
Comparing models of star formation simulating observed interacting galaxies
NASA Astrophysics Data System (ADS)
Quiroga, L. F.; Muñoz-Cuartas, J. C.; Rodrigues, I.
2017-07-01
In this work, we make a comparison between different models of star formation to reproduce observed interacting galaxies. We use observational data to model the evolution of a pair of galaxies undergoing a minor merger. Minor mergers represent situations only weakly deviated from the equilibrium configuration, but significant changes in star formation (SF) efficiency can take place; minor mergers therefore provide a unique setting to study SF in galaxies in a realistic yet simple way. Reproducing observed systems also gives us the opportunity to compare the results of the simulations with observations, which in the end can be used as probes to characterize the SF models implemented in the comparison. In this work we compare two different star formation recipes implemented in the Gadget3 and GIZMO codes. Both codes share the same numerical background, and differences arise mainly in the star formation recipe they use. We use observations from the Pico dos Dias and GEMINI telescopes and show how we use observational data of the interacting pair AM2229-735 to characterize it. Later we use this information to simulate the evolution of the system and finally reproduce the observations: mass distribution, morphology and main features of the merger-induced star formation burst. We show that both methods manage to roughly reproduce the star formation activity. We show, through a careful study, that resolution plays a major role in the reproducibility of the system. In that sense, the star formation recipe implemented in the GIZMO code has shown a more robust performance. Acknowledgements: This work is supported by Colciencias, Doctorado Nacional - 617 program.
Yüksel, Sezin; Schwenke, Almut M; Soliveri, Guido; Ardizzone, Silvia; Weber, Karina; Cialla-May, Dana; Hoeppener, Stephanie; Schubert, Ulrich S; Popp, Jürgen
2016-10-05
In the present study, an ultra-sensitive and highly reproducible novel SERS-based capillary platform was developed and utilized for the trace detection of tetrahydrocannabinol (THC). The approach combines the advantages of microwave-assisted nanoparticle synthesis, plasmonics and capillary forces. By employing a microwave-assisted preparation method, glass capillaries were reproducibly coated with silver nanoparticles in a batch fabrication process that required a processing time of 3 min without needing to use any pre-surface modifications or add surfactants. The coated capillaries exhibited an excellent SERS activity with a high reproducibility and enabled the detection of low concentrations of target molecules. At the same time, only a small amount of analyte and a short and simple incubation process was required. The developed platform was applied to the spectroscopic characterization of tetrahydrocannabinol (THC) and its identification at concentration levels down to 1 nM. Thus, a highly efficient detection system for practical applications, e.g., in drug monitoring/detection, is introduced, which can be fabricated at low cost by using microwave-assisted batch synthesis techniques. Copyright © 2016 Elsevier B.V. All rights reserved.
Wang, Zhongshun; Feng, Lei; Xiao, Dongyang; Li, Ning; Li, Yao; Cao, Danfeng; Shi, Zuosen; Cui, Zhanchen; Lu, Nan
2017-11-09
The performance of surface-enhanced Raman scattering (SERS) for detecting trace amounts of analytes depends highly on the enrichment of the diluted analytes into a small region that can be detected. A super-hydrophobic delivery (SHD) process is an excellent way to enrich even femtomolar analytes for SERS detection. However, it is still challenging to easily fabricate an SHD-SERS substrate with a low detection limit, high sensitivity and good reproducibility. In this article, we present a cost-effective method with few fabrication steps for making an SHD-SERS substrate, named the "silver nanoislands on silica spheres" (SNOSS) platform. It is easily prepared via the thermal evaporation of silver onto a layer of super-hydrophobic paint, which contains single-scale surface-fluorinated silica spheres. The SNOSS platform performs reproducible detection, bringing the relative standard deviation down to 8.85% and 5.63% for detecting 10^-8 M R6G in one-spot and spot-to-spot set-ups, respectively. The coefficient of determination (R^2) is 0.9773 for R6G. The SNOSS platform can be applied to the quantitative detection of analytes whose concentrations range from sub-micromolar to femtomolar levels.
NASA Astrophysics Data System (ADS)
Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit
2017-03-01
In this article, we reply to a comment made by Melsen et al. [2017] on our previous commentary regarding reproducibility in computational hydrology. Re-executing someone else's code and workflow to derive a set of published results does not by itself constitute reproducibility. However, it forms a key part of the process: it demonstrates that all the degrees of freedom and choices made by the scientist in running the experiment are contained within that code and workflow. This does not only allow us to build and extend directly from the original work, but with full knowledge of decisions made in the original experimental setup, we can then focus our attention to the degrees of freedom of interest: those that occur in hydrological systems that are ultimately our subject of study.
NASA Astrophysics Data System (ADS)
Yang, Yong; Li, Zhi-Yuan; Yamaguchi, Kohei; Tanemura, Masaki; Huang, Zhengren; Jiang, Dongliang; Chen, Yuhui; Zhou, Fei; Nogami, Masayuki
2012-03-01
Novel surface-enhanced Raman scattering (SERS) substrates with high SERS activity are ideal for novel SERS sensors and detectors to detect illicitly sold narcotics and explosives. The key to the wider application of the SERS technique is to develop plasmon-resonant structures with novel geometries to enhance Raman signals and to control the periodic ordering of these structures over a large area to obtain reproducible Raman enhancement. In this work, a simple Ar+-ion sputtering route has been developed to fabricate silver nanoneedle arrays on silicon substrates as SERS-active substrates to detect trace-level illicitly sold narcotics. These silver nanoneedles possess a very sharp apex with an apex diameter of 15 nm and an apex angle of 20°. A SERS enhancement factor of greater than 10^10 was reproducibly achieved by the well-aligned nanoneedle arrays. Furthermore, ketamine hydrochloride molecules, one kind of illicitly sold narcotic, can be detected down to 27 ppb by using our SERS substrate within 3 s, indicating the sensitivity of our SERS substrates for trace amounts of narcotics and that SERS technology can become an important analytical technique in forensic laboratories because it provides a rapid and nondestructive method for trace detection.
Stodden, Victoria; Guo, Peixuan; Ma, Zhaokun
2013-01-01
Journal policy on research data and code availability is an important part of the ongoing shift toward publishing reproducible computational science. This article extends the literature by studying journal data sharing policies by year (for both 2011 and 2012) for a referent set of 170 journals. We make a further contribution by evaluating code sharing policies, supplemental materials policies, and open access status for these 170 journals for each of 2011 and 2012. We build a predictive model of open data and code policy adoption as a function of impact factor and publisher and find higher impact journals more likely to have open data and code policies and scientific societies more likely to have open data and code policies than commercial publishers. We also find open data policies tend to lead open code policies, and we find no relationship between open data and code policies and either supplemental material policies or open access journal status. Of the journals in this study, 38% had a data policy, 22% had a code policy, and 66% had a supplemental materials policy as of June 2012. This reflects a striking one year increase of 16% in the number of data policies, a 30% increase in code policies, and a 7% increase in the number of supplemental materials policies. We introduce a new dataset to the community that categorizes data and code sharing, supplemental materials, and open access policies in 2011 and 2012 for these 170 journals.
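The predictive model described above can be sketched, on invented data, as a logistic regression of policy adoption on impact factor and publisher type; nothing below reproduces the study's actual dataset or coefficients.

```python
# Illustrative logistic model of open-data policy adoption vs. impact factor and
# publisher type (all data generated here, not the study's data).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
impact_factor = rng.gamma(shape=2.0, scale=2.0, size=170)
is_society = rng.integers(0, 2, size=170)                  # 1 = scientific society
# Hypothetical generative rule: higher IF and society publishers adopt more often
p = 1 / (1 + np.exp(-(0.4 * impact_factor + 0.8 * is_society - 2.0)))
has_data_policy = rng.random(170) < p

X = np.column_stack([impact_factor, is_society])
model = LogisticRegression().fit(X, has_data_policy)
print("coefficients (impact factor, society publisher):", model.coef_[0])
```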
Historical Roots and Future Perspectives Related to Nursing Ethics.
ERIC Educational Resources Information Center
Freitas, Lorraine
1990-01-01
This article traces the evolution of the development and refinement of the professional code from concerns about the ethical conduct of nurses to its present state as a professional code for all nurses. The relationship of the Ethics Committee of the American Nurses' Association to the development of the code is also discussed. (Author/MLW)
Steady State Film Boiling Heat Transfer Simulated With Trace V4.160
DOE Office of Scientific and Technical Information (OSTI.GOV)
Audrius Jasiulevicius; Rafael Macian-Juan
2006-07-01
This paper presents the results of the assessment and analysis of TRACE v4.160 heat transfer predictions in the post-CHF (critical heat flux) region and discusses the possibilities to improve the TRACE v4.160 code predictions of film boiling heat transfer when applying different film boiling correlations. For this purpose, the TRACE v4.160-calculated film boiling heat flux and the resulting maximum inner wall temperatures during film boiling in single tubes were compared with experimental data obtained at the Royal Institute of Technology (KTH) in Stockholm, Sweden. The experimental database included measurements for pressures ranging from 30 to 200 bar and coolant mass fluxes from 500 to 3000 kg/m²s. It was found that TRACE v4.160 does not produce correct predictions of the film boiling heat flux, and consequently of the maximum inner wall temperature in the test section, under the wide range of conditions documented in the KTH experiments. In particular, it was found that the standard TRACE v4.160 under-predicts the film boiling heat transfer coefficient at low pressure-low mass flux and high pressure-high mass flux conditions. For most of the rest of the investigated range of parameters, TRACE v4.160 over-predicts the film boiling heat transfer coefficient, which can lead to non-conservative predictions in applications to nuclear power plant analyses. Since no satisfactory agreement with the experimental database was obtained with the standard TRACE v4.160 film boiling heat transfer correlations, we have added seven film boiling correlations to TRACE v4.160 in order to investigate the possibility of improving the code predictions for conditions similar to the KTH tests. The film boiling correlations were selected among the most commonly used film boiling correlations found in the open literature, namely the Groeneveld 5.7, Bishop (2 correlations), Tong, Konkov, Miropolskii and Groeneveld-Delorme correlations. The only correlation among those investigated that resulted in a significant improvement of TRACE predictions was Groeneveld 5.7. It was found that replacing the current film boiling correlation (Dougall-Rohsenow) for the wall-to-gas heat transfer with Groeneveld 5.7 improves the code predictions of film boiling heat transfer at high qualities in single tubes in the entire range of pressure and coolant mass flux considered. (authors)
Value-Based Requirements Traceability: Lessons Learned
NASA Astrophysics Data System (ADS)
Egyed, Alexander; Grünbacher, Paul; Heindl, Matthias; Biffl, Stefan
Traceability from requirements to code is mandated by numerous software development standards. These standards, however, are not explicit about the appropriate level of quality of trace links. From a technical perspective, trace quality should meet the needs of the intended trace utilizations. Unfortunately, long-term trace utilizations are typically unknown at the time of trace acquisition which represents a dilemma for many companies. This chapter suggests ways to balance the cost and benefits of requirements traceability. We present data from three case studies demonstrating that trace acquisition requires broad coverage but can tolerate imprecision. With this trade-off our lessons learned suggest a traceability strategy that (1) provides trace links more quickly, (2) refines trace links according to user-defined value considerations, and (3) supports the later refinement of trace links in case the initial value consideration has changed over time. The scope of our work considers the entire life cycle of traceability instead of just the creation of trace links.
Image Tracing: An Analysis of Its Effectiveness in Children's Pictorial Discrimination Learning
ERIC Educational Resources Information Center
Levin, Joel R.; And Others
1977-01-01
A total of 45 fifth grade students were the subjects of an experiment offering support for a component of learning strategy (memory imagery). Various theoretical explanations of the image-tracing phenomenon are considered, including depth of processing, dual coding and frequency. (MS)
CRKSPH: A new meshfree hydrodynamics method with applications to astrophysics
NASA Astrophysics Data System (ADS)
Owen, John Michael; Raskin, Cody; Frontiere, Nicholas
2018-01-01
The study of astrophysical phenomena such as supernovae, accretion disks, galaxy formation, and large-scale structure formation requires computational modeling of, at a minimum, hydrodynamics and gravity. Developing numerical methods appropriate for these kinds of problems requires a number of properties: shock-capturing hydrodynamics benefits from rigorous conservation of invariants such as total energy, linear momentum, and mass; lack of obvious symmetries or a simplified spatial geometry to exploit necessitate 3D methods that ideally are Galilean invariant; the dynamic range of mass and spatial scales that need to be resolved can span many orders of magnitude, requiring methods that are highly adaptable in their space and time resolution. We have developed a new Lagrangian meshfree hydrodynamics method called Conservative Reproducing Kernel Smoothed Particle Hydrodynamics, or CRKSPH, in order to meet these goals. CRKSPH is a conservative generalization of the meshfree reproducing kernel method, combining the high-order accuracy of reproducing kernels with the explicit conservation of mass, linear momentum, and energy necessary to study shock-driven hydrodynamics in compressible fluids. CRKSPH's Lagrangian, particle-like nature makes it simple to combine with well-known N-body methods for modeling gravitation, similar to the older Smoothed Particle Hydrodynamics (SPH) method. Indeed, CRKSPH can be substituted for SPH in existing SPH codes due to these similarities. In comparison to SPH, CRKSPH is able to achieve substantially higher accuracy for a given number of points due to the explicitly consistent (and higher-order) interpolation theory of reproducing kernels, while maintaining the same conservation principles (and therefore applicability) as SPH. There are currently two coded implementations of CRKSPH available: one in the open-source research code Spheral, and the other in the high-performance cosmological code HACC. Using these codes we have applied CRKSPH to a number of astrophysical scenarios, such as rotating gaseous disks, supernova remnants, and large-scale cosmological structure formation. In this poster we present an overview of CRKSPH and show examples of these astrophysical applications.
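For readers unfamiliar with SPH-style interpolation, the sketch below shows the standard kernel sum that CRKSPH corrects with reproducing-kernel coefficients; it is a generic 1-D illustration, not code from Spheral or HACC.

```python
# Standard (uncorrected) SPH kernel interpolation in 1-D with a cubic spline kernel.
import numpy as np

def cubic_spline_w(q, h):
    """1-D cubic spline kernel W(q) with q = |x|/h."""
    sigma = 2.0 / (3.0 * h)
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return sigma * w

def sph_interpolate(x_eval, x_j, f_j, m_j, rho_j, h):
    """f(x) ~ sum_j (m_j / rho_j) f_j W(|x - x_j|, h)."""
    q = np.abs(x_eval[:, None] - x_j[None, :]) / h
    return np.sum((m_j / rho_j) * f_j * cubic_spline_w(q, h), axis=1)

# Particles sampling f(x) = sin(x); plain SPH shows the low-order interpolation
# error that reproducing-kernel (CRK) corrections are designed to remove.
x_j = np.linspace(0.0, 2 * np.pi, 200)
dx = x_j[1] - x_j[0]
f_j, m_j, rho_j = np.sin(x_j), np.full_like(x_j, dx), np.ones_like(x_j)
x_eval = np.linspace(0.5, 5.5, 50)
approx = sph_interpolate(x_eval, x_j, f_j, m_j, rho_j, h=2 * dx)
print("max interpolation error:", np.max(np.abs(approx - np.sin(x_eval))))
```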
Comparison of a 3-D GPU-Assisted Maxwell Code and Ray Tracing for Reflectometry on ITER
NASA Astrophysics Data System (ADS)
Gady, Sarah; Kubota, Shigeyuki; Johnson, Irena
2015-11-01
Electromagnetic wave propagation and scattering in magnetized plasmas are important diagnostics for high temperature plasmas. 1-D and 2-D full-wave codes are standard tools for measurements of the electron density profile and fluctuations; however, ray tracing results have shown that beam propagation in tokamak plasmas is inherently a 3-D problem. The GPU-Assisted Maxwell Code utilizes the FDTD (Finite-Difference Time-Domain) method for solving the Maxwell equations with the cold plasma approximation in a 3-D geometry. Parallel processing with GPGPU (General-Purpose computing on Graphics Processing Units) is used to accelerate the computation. Previously, we reported on initial comparisons of the code results to 1-D numerical and analytical solutions, where the size of the computational grid was limited by the on-board memory of the GPU. In the current study, this limitation is overcome by using domain decomposition and an additional GPU. As a practical application, this code is used to study the current design of the ITER Low Field Side Reflectometer (LSFR) for the Equatorial Port Plug 11 (EPP11). A detailed examination of Gaussian beam propagation in the ITER edge plasma will be presented, as well as comparisons with ray tracing. This work was made possible by funding from the Department of Energy for the Summer Undergraduate Laboratory Internship (SULI) program. This work is supported by the US DOE Contract No.DE-AC02-09CH11466 and DE-FG02-99-ER54527.
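As background for the FDTD method mentioned above, here is a toy 1-D vacuum Yee update in Python; the GPU-assisted code solves the full 3-D cold-plasma problem, so this sketch is only a generic illustration of the scheme.

```python
# Toy 1-D FDTD (Yee) update in vacuum with a soft Gaussian source.
import numpy as np

nx, nsteps = 400, 600
ez = np.zeros(nx)
hy = np.zeros(nx - 1)
courant = 0.5                       # c*dt/dx, kept below 1 for stability

for n in range(nsteps):
    hy += courant * np.diff(ez)                  # update H from curl E
    ez[1:-1] += courant * np.diff(hy)            # update E from curl H
    ez[50] += np.exp(-((n - 60) / 20.0) ** 2)    # soft Gaussian source

print("peak field after propagation:", ez.max())
```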
NASA Technical Reports Server (NTRS)
Thaller, Lawrence H.; Quinzio, Michael V.
1997-01-01
A summary of the investigation of an aberrant cell voltage observed during the filling of a large lithium thionyl chloride cell: an aberrant voltage trace was noted during the review of cell filling data; the incident was traced to an interruption during filling; experimentation suggested that oxidizable sites within the carbon electrode were responsible for the drop in voltage; the voltage anomaly could be reproduced by interrupting the filling of similar cells; and the anomalous voltage dip was not due to a short.
Software Attribution for Geoscience Applications in the Computational Infrastructure for Geodynamics
NASA Astrophysics Data System (ADS)
Hwang, L.; Dumit, J.; Fish, A.; Soito, L.; Kellogg, L. H.; Smith, M.
2015-12-01
Scientific software is largely developed by individual scientists and represents a significant intellectual contribution to the field. As the scientific culture and funding agencies move towards an expectation that software be open-source, there is a corresponding need for mechanisms to cite software, both to provide credit and recognition to developers, and to aid in discoverability of software and scientific reproducibility. We assess the geodynamic modeling community's current citation practices by examining more than 300 predominantly self-reported publications utilizing scientific software in the past 5 years that is available through the Computational Infrastructure for Geodynamics (CIG). Preliminary results indicate that authors cite and attribute software either through citing (in rank order) peer-reviewed scientific publications, a user's manual, and/or a paper describing the software code. Attributions may be found directly in the text, in acknowledgements, in figure captions, or in footnotes. What is considered citable varies widely. Citations predominantly lack software version numbers or persistent identifiers to find the software package. Versioning may be implied through reference to a versioned user manual. Authors sometimes report code features used and whether they have modified the code. As an open-source community, CIG requests that researchers contribute their modifications to the repository. However, such modifications may not be contributed back to a repository code branch, decreasing the chances of discoverability and reproducibility. Survey results through CIG's Software Attribution for Geoscience Applications (SAGA) project suggest that lack of knowledge, tools, and workflows to cite codes are barriers to effectively implementing the emerging citation norms. On-demand attributions generated on software landing pages and a prototype extensible plug-in to automatically generate attributions in codes are the first steps towards reproducibility.
Trident: A Universal Tool for Generating Synthetic Absorption Spectra from Astrophysical Simulations
NASA Astrophysics Data System (ADS)
Hummels, Cameron B.; Smith, Britton D.; Silvia, Devin W.
2017-09-01
Hydrodynamical simulations are increasingly able to accurately model physical systems on stellar, galactic, and cosmological scales; however, the utility of these simulations is often limited by our ability to directly compare them with the data sets produced by observers: spectra, photometry, etc. To address this problem, we have created trident, a Python-based open-source tool for post-processing hydrodynamical simulations to produce synthetic absorption spectra and related data. trident can (i) create absorption-line spectra for any trajectory through a simulated data set mimicking both background quasar and down-the-barrel configurations; (ii) reproduce the spectral characteristics of common instruments like the Cosmic Origins Spectrograph; (iii) operate across the ultraviolet, optical, and infrared using customizable absorption-line lists; (iv) trace simulated physical structures directly to spectral features; (v) approximate the presence of ion species absent from the simulation outputs; (vi) generate column density maps for any ion; and (vii) provide support for all major astrophysical hydrodynamical codes. trident was originally developed to aid in the interpretation of observations of the circumgalactic medium and intergalactic medium, but it remains a general tool applicable in other contexts.
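A rough usage sketch of trident follows, based on its publicly documented workflow; the function names and arguments are recalled from the documentation and should be checked against the installed version, and the dataset path is a placeholder.

```python
# Hedged usage sketch of trident (check names/arguments against your version).
import yt
import trident

ds = yt.load("enzo_cosmology_output/RD0009/RD0009")   # placeholder dataset path

# (i) cast a trajectory ("simple ray") through the simulated volume
ray = trident.make_simple_ray(
    ds,
    start_position=ds.domain_left_edge,
    end_position=ds.domain_right_edge,
    data_filename="ray.h5",
    lines=["H", "C", "O"],          # species for which ion fields are added
)

# (ii) mimic a real instrument, here the Cosmic Origins Spectrograph
sg = trident.SpectrumGenerator("COS")
sg.make_spectrum(ray, lines=["H I 1216", "C IV 1548"])
sg.add_gaussian_noise(30)           # approximate signal-to-noise ratio
sg.save_spectrum("spectrum.txt")
sg.plot_spectrum("spectrum.png")
```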
NASA Astrophysics Data System (ADS)
Grunloh, Timothy P.
The objective of this dissertation is to develop a 3-D domain-overlapping coupling method that leverages the superior flow field resolution of the Computational Fluid Dynamics (CFD) code STAR-CCM+ and the fast execution of the System Thermal Hydraulic (STH) code TRACE to efficiently and accurately model thermal hydraulic transport properties in nuclear power plants under complex conditions of regulatory and economic importance. The primary contribution is the novel Stabilized Inertial Domain Overlapping (SIDO) coupling method, which allows for on-the-fly correction of TRACE solutions for local pressures and velocity profiles inside multi-dimensional regions based on the results of the CFD simulation. The method is found to outperform the more frequently-used domain decomposition coupling methods. An STH code such as TRACE is designed to simulate large, diverse component networks, requiring simplifications to the fluid flow equations for reasonable execution times. Empirical correlations are therefore required for many sub-grid processes. The coarse grids used by TRACE diminish sensitivity to small scale geometric details such as Reactor Pressure Vessel (RPV) internals. A CFD code such as STAR-CCM+ uses much finer computational meshes that are sensitive to the geometric details of reactor internals. In turbulent flows, it is infeasible to fully resolve the flow solution, but the correlations used to model turbulence are at a low level. The CFD code can therefore resolve smaller scale flow processes. The development of a 3-D coupling method was carried out with the intention of improving predictive capabilities of transport properties in the downcomer and lower plenum regions of an RPV in reactor safety calculations. These regions are responsible for the multi-dimensional mixing effects that determine the distribution at the core inlet of quantities with reactivity implications, such as fluid temperature and dissolved neutron absorber concentration.
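The flavor of such domain-overlapping coupling can be sketched as an under-relaxed correction of the system-code solution toward the CFD solution in the overlapped region; the sketch below is a generic illustration under assumed interfaces and tolerances, not the SIDO implementation.

```python
# Generic sketch of one fixed-point coupling loop between a system thermal-
# hydraulic (STH) field and a CFD field over an overlapped region; field values,
# relaxation factor and tolerance are assumptions for illustration only.
import numpy as np

def coupling_iteration(sth_pressure, cfd_pressure, relax=0.5):
    """Blend the STH pressures in the overlapped cells toward the CFD values."""
    return (1.0 - relax) * sth_pressure + relax * cfd_pressure

# Hypothetical overlapped-region pressures (Pa) from the two codes
sth_p = np.array([15.50e6, 15.48e6, 15.46e6, 15.44e6])
cfd_p = np.array([15.52e6, 15.47e6, 15.45e6, 15.41e6])

for it in range(20):
    new_p = coupling_iteration(sth_p, cfd_p)
    if np.max(np.abs(new_p - sth_p)) < 1.0e2:     # 100 Pa convergence tolerance
        sth_p = new_p
        break
    sth_p = new_p

print(f"converged after {it + 1} iterations:", sth_p)
```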
Modelling of the EAST lower-hybrid current drive experiment using GENRAY/CQL3D and TORLH/CQL3D
NASA Astrophysics Data System (ADS)
Yang, C.; Bonoli, P. T.; Wright, J. C.; Ding, B. J.; Parker, R.; Shiraiwa, S.; Li, M. H.
2014-12-01
The coupled GENRAY-CQL3D code has been used to perform systematic ray-tracing and Fokker-Planck analysis for EAST Lower Hybrid wave Current Drive (LHCD) experiments. Despite being in the weak absorption regime, the experimental level of LH current drive is successfully simulated by taking into account the variations in the parallel wavenumber due to the toroidal effect. The effect of radial transport of the fast LH electrons in EAST has also been studied, which shows that a modest amount of radial transport diffusion can redistribute the fast LH current significantly. Taking advantage of the new capability in GENRAY, the actual Scrape-Off Layer (SOL) model with magnetic field, density, temperature, and geometry is included in the simulation for both the lower and the higher density cases, so that the collisional losses of Lower Hybrid Wave (LHW) power in the SOL have been accounted for, which together with fast electron losses can reproduce the LHCD experimental observations in different discharges of EAST. We have also analyzed EAST discharges where there is a significant ohmic contribution to the total current, and good agreement with experiment in terms of total current has been obtained. Also, the full-wave code TORLH has been used for the simulation of the LH physics in EAST, including full-wave effects such as diffraction and focusing, which may also play an important role in bridging the spectral gap. The comparisons between the GENRAY and TORLH codes are done for both the Maxwellian and the quasi-linear electron Landau damping cases. These simulations represent an important addition to the validation studies of the GENRAY-CQL3D and TORLH models being used in weak absorption scenarios of tokamaks with large aspect ratio.
Method for rapid high-frequency seismogram calculation
NASA Astrophysics Data System (ADS)
Stabile, Tony Alfredo; De Matteis, Raffaella; Zollo, Aldo
2009-02-01
We present a method for rapid, high-frequency seismogram calculation that makes use of an algorithm to automatically generate an exhaustive set of seismic phases with an appreciable amplitude on the seismogram. The method uses a hierarchical order of ray and seismic-phase generation, taking into account some existing constraints for ray paths and some physical constraints. To compute synthetic seismograms, the COMRAD code (from the Italian: "COdice Multifase per il RAy-tracing Dinamico") uses a dynamic ray-tracing code as its core. To validate the code, we have computed synthetic seismograms in a layered medium using both COMRAD and a code that computes the complete wave field by the discrete wave number method. The seismograms are compared according to a time-frequency misfit criterion based on the continuous wavelet transform of the signals. Although the number of phases is considerably reduced by the selection criteria, the results show that the loss in amplitude on the whole seismogram is negligible. Moreover, the time needed to compute the synthetics using the COMRAD code (truncating the ray series at the 10th generation) is 3-4-fold less than that needed by the AXITRA code (up to a frequency of 25 Hz).
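A simplified stand-in for the time-frequency misfit idea follows, using a small hand-rolled Morlet wavelet transform on two synthetic traces; this is illustrative only and is not the exact criterion used in the paper.

```python
# Simplified time-frequency (wavelet-envelope) misfit between a reference and a
# test seismogram; Morlet CWT implemented directly with NumPy, synthetic traces.
import numpy as np

def morlet_cwt(x, dt, freqs, omega0=6.0):
    """Continuous wavelet transform with a Morlet wavelet, one row per frequency."""
    n = len(x)
    t = (np.arange(n) - n // 2) * dt
    out = np.empty((len(freqs), n), dtype=complex)
    for i, f in enumerate(freqs):
        s = omega0 / (2 * np.pi * f)                      # scale for this frequency
        wavelet = np.exp(1j * omega0 * t / s) * np.exp(-0.5 * (t / s) ** 2)
        wavelet /= np.sqrt(s)
        out[i] = np.convolve(x, np.conj(wavelet[::-1]), mode="same")
    return out

dt = 0.004
t = np.arange(0, 8, dt)
ref = np.exp(-((t - 3.0) / 0.3) ** 2) * np.sin(2 * np.pi * 5.0 * t)     # reference
test = 0.95 * np.exp(-((t - 3.05) / 0.3) ** 2) * np.sin(2 * np.pi * 5.0 * t)

freqs = np.linspace(1.0, 20.0, 40)
env_ref = np.abs(morlet_cwt(ref, dt, freqs))
env_test = np.abs(morlet_cwt(test, dt, freqs))
misfit = np.sqrt(np.sum((env_test - env_ref) ** 2) / np.sum(env_ref ** 2))
print(f"normalized time-frequency envelope misfit: {misfit:.3f}")
```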
How to Write a Reproducible Paper
NASA Astrophysics Data System (ADS)
Irving, D. B.
2016-12-01
The geosciences have undergone a computational revolution in recent decades, to the point where almost all modern research relies heavily on software and code. Despite this profound change in the research methods employed by geoscientists, the reporting of computational results has changed very little in academic journals. This lag has led to something of a reproducibility crisis, whereby it is impossible to replicate and verify most of today's published computational results. While it is tempting to decry the slow response of journals and funding agencies in the face of this crisis, there are very few examples of reproducible research upon which to base new communication standards. In an attempt to address this deficiency, this presentation will describe a procedure for reporting computational results that was employed in a recent Journal of Climate paper. The procedure was developed to be consistent with recommended computational best practices and seeks to minimize the time burden on authors, which has been identified as the most important barrier to publishing code. It should provide a starting point for geoscientists looking to publish reproducible research, and could be adopted by journals as a formal minimum communication standard.
Massive Data, the Digitization of Science, and Reproducibility of Results
Stodden, Victoria
2018-04-27
As the scientific enterprise becomes increasingly computational and data-driven, the nature of the information communicated must change. Without inclusion of the code and data with published computational results, we are engendering a credibility crisis in science. Controversies such as ClimateGate, the microarray-based drug sensitivity clinical trials under investigation at Duke University, and retractions from prominent journals due to unverified code suggest the need for greater transparency in our computational science. In this talk I argue that the scientific method be restored to (1) a focus on error control as central to scientific communication and (2) complete communication of the underlying methodology producing the results, i.e., reproducibility. I outline barriers to these goals based on recent survey work (Stodden 2010), and suggest solutions such as the "Reproducible Research Standard" (Stodden 2009), giving open licensing options designed to create an intellectual property framework for scientists consonant with longstanding scientific norms.
A self-updating road map of The Cancer Genome Atlas.
Robbins, David E; Grüneberg, Alexander; Deus, Helena F; Tanik, Murat M; Almeida, Jonas S
2013-05-15
Since 2011, The Cancer Genome Atlas' (TCGA) files have been accessible through HTTP from a public site, creating entirely new possibilities for cancer informatics by enhancing data discovery and retrieval. Significantly, these enhancements enable the reporting of analysis results that can be fully traced to and reproduced using their source data. However, to realize this possibility, a continually updated road map of files in the TCGA is required. Creation of such a road map represents a significant data modeling challenge, due to the size and fluidity of this resource: each of the 33 cancer types is instantiated in only partially overlapping sets of analytical platforms, while the number of data files available doubles approximately every 7 months. We developed an engine to index and annotate the TCGA files, relying exclusively on third-generation web technologies (Web 3.0). Specifically, this engine uses JavaScript in conjunction with the World Wide Web Consortium's (W3C) Resource Description Framework (RDF), and SPARQL, the query language for RDF, to capture metadata of files in the TCGA open-access HTTP directory. The resulting index may be queried using SPARQL, and enables file-level provenance annotations as well as discovery of arbitrary subsets of files, based on their metadata, using web standard languages. In turn, these abilities enhance the reproducibility and distribution of novel results delivered as elements of a web-based computational ecosystem. The development of the TCGA Roadmap engine was found to provide specific clues about how biomedical big data initiatives should be exposed as public resources for exploratory analysis, data mining and reproducible research. These specific design elements align with the concept of knowledge reengineering and represent a sharp departure from top-down approaches in grid initiatives such as CaBIG. They also present a much more interoperable and reproducible alternative to the still pervasive use of data portals. A prepared dashboard, including links to source code and a SPARQL endpoint, is available at http://bit.ly/TCGARoadmap. A video tutorial is available at http://bit.ly/TCGARoadmapTutorial. robbinsd@uab.edu.
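To illustrate the kind of file-level query such an RDF index enables, here is a hedged Python/SPARQL sketch; the endpoint URL and predicate names are placeholders, not the actual TCGA Roadmap vocabulary.

```python
# Sketch of querying a roadmap-style SPARQL endpoint for file metadata;
# endpoint and predicates are placeholders for illustration only.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://example.org/tcga-roadmap/sparql")  # placeholder
sparql.setQuery("""
    PREFIX tcga: <https://example.org/tcga#>
    SELECT ?file ?diseaseCode ?platform
    WHERE {
        ?file tcga:diseaseCode ?diseaseCode ;
              tcga:platform    ?platform .
        FILTER(?diseaseCode = "BRCA")
    }
    LIMIT 10
""")
sparql.setReturnFormat(JSON)
results = sparql.query().convert()
for row in results["results"]["bindings"]:
    print(row["file"]["value"], row["platform"]["value"])
```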
Shimansky, Y; Saling, M; Wunderlich, D A; Bracha, V; Stelmach, G E; Bloedel, J R
1997-01-01
This study addresses the issue of the role of the cerebellum in the processing of sensory information by determining the capability of cerebellar patients to acquire and use kinesthetic cues received via the active or passive tracing of an irregular shape while blindfolded. Patients with cerebellar lesions and age-matched healthy controls were tested on four tasks: (1) learning to discriminate a reference shape from three others through the repeated tracing of the reference template; (2) reproducing the reference shape from memory by drawing blindfolded; (3) performing the same task with vision; and (4) visually recognizing the reference shape. The cues used to acquire and then to recognize the reference shape were generated under four conditions: (1) "active kinesthesia," in which cues were acquired by the blindfolded subject while actively tracing a reference template; (2) "passive kinesthesia," in which the tracing was performed while the hand was guided passively through the template; (3) "sequential vision," in which the shape was visualized by the serial exposure of small segments of its outline; and (4) "full vision," in which the entire shape was visualized. The sequential vision condition was employed to emulate the sequential way in which kinesthetic information is acquired while tracing the reference shape. The results demonstrate a substantial impairment of cerebellar patients in their capability to perceive two-dimensional irregular shapes based only on kinesthetic cues. There also is evidence that this deficit in part relates to a reduced capacity to integrate temporal sequences of sensory cues into a complete image useful for shape discrimination tasks or for reproducing the shape through drawing. Consequently, the cerebellum has an important role in this type of sensory information processing even when it is not directly associated with the execution of movements.
Maize GO annotation—methods, evaluation, and review (maize-GAMER)
USDA-ARS?s Scientific Manuscript database
We created a new high-coverage, robust, and reproducible functional annotation of maize protein-coding genes based on Gene Ontology (GO) term assignments. Whereas the existing Phytozome and Gramene maize GO annotation sets only cover 41% and 56% of maize protein-coding genes, respectively, this stu...
NASA Astrophysics Data System (ADS)
Jenkins, Thomas G.; Held, Eric D.
2015-09-01
Neoclassical tearing modes are macroscopic (L ∼ 1 m) instabilities in magnetic fusion experiments; if unchecked, these modes degrade plasma performance and may catastrophically destroy plasma confinement by inducing a disruption. Fortunately, the use of properly tuned and directed radiofrequency waves (λ ∼ 1 mm) can eliminate these modes. Numerical modeling of this difficult multiscale problem requires the integration of separate mathematical models for each length and time scale (Jenkins and Kruger, 2012 [21]); the extended MHD model captures macroscopic plasma evolution while the RF model tracks the flow and deposition of injected RF power through the evolving plasma profiles. The scale separation enables use of the eikonal (ray-tracing) approximation to model the RF wave propagation. In this work we demonstrate a technique, based on methods of computational geometry, for mapping the ensuing RF data (associated with discrete ray trajectories) onto the finite-element/pseudospectral grid that is used to model the extended MHD physics. In the new representation, the RF data can then be used to construct source terms in the equations of the extended MHD model, enabling quantitative modeling of RF-induced tearing mode stabilization. Though our specific implementation uses the NIMROD extended MHD (Sovinec et al., 2004 [22]) and GENRAY RF (Smirnov et al., 1994 [23]) codes, the approach presented can be applied more generally to any code coupling requiring the mapping of ray tracing data onto Eulerian grids.
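The mapping step described above can be illustrated, in much simplified form, by depositing ray-segment powers onto a fixed (R,Z) grid; the ray data and grid below are invented, and the actual NIMROD/GENRAY coupling uses finite-element/pseudospectral machinery and computational-geometry searches rather than simple binning.

```python
# Simplified deposition of power carried along discrete ray trajectories onto a
# fixed 2-D (R,Z) grid, to act as a source term on that grid; all values invented.
import numpy as np

# Hypothetical ray data: sample points along a ray and power deposited per segment
ray_R = np.array([1.60, 1.62, 1.65, 1.70, 1.74])    # m
ray_Z = np.array([0.10, 0.08, 0.05, 0.01, -0.03])   # m
ray_dP = np.array([0.0, 0.02, 0.10, 0.30, 0.08])    # MW deposited per segment

r_edges = np.linspace(1.5, 1.8, 16)
z_edges = np.linspace(-0.1, 0.2, 16)
deposited, _, _ = np.histogram2d(ray_R, ray_Z, bins=[r_edges, z_edges], weights=ray_dP)

# Convert binned power to a volumetric source density (toroidal symmetry assumed)
dr, dz = np.diff(r_edges)[0], np.diff(z_edges)[0]
r_centers = 0.5 * (r_edges[:-1] + r_edges[1:])
cell_volume = 2 * np.pi * r_centers[:, None] * dr * dz
source_density = deposited / cell_volume            # MW / m^3
print("total power mapped onto grid:", deposited.sum(), "MW")
```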
Hong, Samin; Kim, Chan Yun; Lee, Won Seok; Seong, Gong Je
2010-01-01
To assess the reproducibility of the new spectral domain Cirrus high-definition optical coherence tomography (HD-OCT; Carl Zeiss Meditec, Dublin, CA, USA) for analysis of peripapillary retinal nerve fiber layer (RNFL) thickness in healthy eyes. Thirty healthy Korean volunteers were enrolled. Three optic disc cube 200 x 200 Cirrus HD-OCT scans were taken on the same day in discontinuous sessions by the same operator without using the repeat scan function. The reproducibility of the calculated RNFL thickness and probability code was determined by the intraclass correlation coefficient (ICC), coefficient of variation (CV), test-retest variability, and Fleiss' generalized kappa (kappa). Thirty-six eyes were analyzed. For average RNFL thickness, the ICC was 0.970, CV was 2.38%, and test-retest variability was 4.5 µm. For all quadrants except the nasal, ICCs were 0.972 or higher and CVs were 4.26% or less. Overall test-retest variability ranged from 5.8 to 8.1 µm. The kappa value of probability codes for average RNFL thickness was 0.690. The kappa values of quadrants and clock-hour sectors were lower in the nasal areas than in other areas. The reproducibility of Cirrus HD-OCT to analyze peripapillary RNFL thickness in healthy eyes was excellent compared with the previous reports for time domain Stratus OCT. For the calculated RNFL thickness and probability code, variability was relatively higher in the nasal area, and more careful analyses are needed.
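A sketch of the repeatability statistics reported above (a one-way ICC, CV, and one common definition of test-retest variability) on invented repeated measurements; the published study may use a different ICC model, so this is illustrative only.

```python
# Repeatability statistics from three repeated RNFL scans per eye (invented data).
import numpy as np

rnfl = np.array([[96.0, 97.5, 95.8],     # rows = eyes, columns = repeated scans
                 [88.2, 89.0, 87.5],
                 [102.4, 101.0, 103.1],
                 [91.7, 93.0, 92.2]])

n, k = rnfl.shape
grand_mean = rnfl.mean()
ms_between = k * ((rnfl.mean(axis=1) - grand_mean) ** 2).sum() / (n - 1)
ms_within = ((rnfl - rnfl.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))

icc_1_1 = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)  # one-way ICC
cv_percent = 100 * np.mean(rnfl.std(axis=1, ddof=1) / rnfl.mean(axis=1))
test_retest = 1.96 * np.sqrt(2) * np.sqrt(ms_within)   # one common definition

print(f"ICC(1,1) = {icc_1_1:.3f}, CV = {cv_percent:.2f}%, test-retest = {test_retest:.1f} um")
```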
Simulating intrafraction prostate motion with a random walk model.
Pommer, Tobias; Oh, Jung Hun; Munck Af Rosenschöld, Per; Deasy, Joseph O
2017-01-01
Prostate motion during radiation therapy (ie, intrafraction motion) can cause unwanted loss of radiation dose to the prostate and increased dose to the surrounding organs at risk. A compact but general statistical description of this motion could be useful for simulation of radiation therapy delivery or margin calculations. We investigated whether prostate motion could be modeled with a random walk model. Prostate motion recorded during 548 radiation therapy fractions in 17 patients was analyzed and used for input in a random walk prostate motion model. The recorded motion was categorized on the basis of whether any transient excursions (ie, rapid prostate motion in the anterior and superior direction followed by a return) occurred in the trace, and transient motion was separately modeled as a large step in the anterior/superior direction followed by a returning large step. Random walk simulations were conducted with and without added artificial transient motion using either motion data from all observed traces or only traces without transient excursions as model input, respectively. A general estimate of motion was derived with reasonable agreement between simulated and observed traces, especially during the first 5 minutes of the excursion-free simulations. Simulated and observed diffusion coefficients agreed within 0.03, 0.2 and 0.3 mm²/min in the left/right, superior/inferior, and anterior/posterior directions, respectively. A rapid increase in variance at the start of observed traces was difficult to reproduce and seemed to represent the patient's need to adjust before treatment. This could be estimated somewhat using artificial transient motion. Random walk modeling is feasible and recreated the characteristics of the observed prostate motion. Introducing artificial transient motion did not improve the overall agreement, although the first 30 seconds of the traces were better reproduced. The model provides a simple estimate of prostate motion during delivery of radiation therapy.
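A minimal sketch of the random-walk idea with an optional artificial transient excursion; the diffusion coefficients, step timings, and excursion size below are illustrative assumptions, not the fitted patient values.

```python
# Random-walk prostate-motion sketch: per-axis Gaussian steps with variance
# 2*D*dt, plus an optional "transient excursion" (large SI/AP step and return).
import numpy as np

rng = np.random.default_rng(42)

def simulate_trace(duration_s=300, dt_s=1.0,
                   D=(0.1, 0.3, 0.4),            # mm^2/min (LR, SI, AP), illustrative
                   excursion_at=None, excursion_mm=3.0):
    n = int(duration_s / dt_s)
    dt_min = dt_s / 60.0
    steps = rng.normal(0.0, np.sqrt(2 * np.array(D) * dt_min), size=(n, 3))
    if excursion_at is not None:                  # add one transient excursion
        i = int(excursion_at / dt_s)
        steps[i, 1:] += excursion_mm              # large SI/AP step ...
        steps[min(i + 20, n - 1), 1:] -= excursion_mm   # ... and its return
    return np.cumsum(steps, axis=0)               # position trace in mm

trace = simulate_trace(excursion_at=120)
print("final displacement (LR, SI, AP) in mm:", trace[-1].round(2))
```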
TRACE/PARCS analysis of the OECD/NEA Oskarshamn-2 BWR stability benchmark
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kozlowski, T.; Downar, T.; Xu, Y.
2012-07-01
On February 25, 1999, the Oskarshamn-2 NPP experienced a stability event which culminated in diverging power oscillations with a decay ratio of about 1.4. The event was successfully modeled by the TRACE/PARCS coupled code system, and further analysis of the event is described in this paper. The results show very good agreement with the plant data, capturing the entire behavior of the transient including the onset of instability, growth of the oscillations (decay ratio) and oscillation frequency. This provides confidence in the prediction of other parameters which are not available from the plant records. The event provides coupled code validation for a challenging BWR stability event, which involves the accurate simulation of neutron kinetics (NK), thermal-hydraulics (TH), and TH/NK coupling. The success of this work has demonstrated the ability of the 3-D coupled systems code TRACE/PARCS to capture the complex behavior of BWR stability events. The problem was released as an international OECD/NEA benchmark, and it is the first benchmark based on measured plant data for a stability event with a DR greater than one. Interested participants are invited to contact the authors for more information. (authors)
Boussen, S; Spiegler, A; Benar, C; Carrère, M; Bartolomei, F; Metellus, P; Voituriez, R; Velly, L; Bruder, N; Trébuchon, A
2018-04-16
General anesthesia (GA) is a reversible manipulation of consciousness whose mechanism at the level of neural networks remains mysterious, leaving space for several competing hypotheses. We recorded electrocorticography (ECoG) signals in patients who underwent intracranial monitoring during awake surgery for the treatment of cerebral tumors in functional areas of the brain. We therefore recorded the transition from unconsciousness to consciousness directly on the brain surface. Using frequency-resolved interferometry, we studied the intermediate ECoG frequencies (4-40 Hz). In the theoretical study, we used a computational Jansen and Rit neuron model to simulate recovery of consciousness (ROC). During ROC, we found that f increased by a factor equal to 1.62 ± 0.09 and δf varied by the same factor (1.61 ± 0.09), suggesting the existence of a scaling factor. We accelerated the time course of an unconscious EEG trace by an approximate factor of 1.6 and showed that the resulting EEG trace matched the conscious state. Using the theoretical model, we successfully reproduced this behavior. We show that the recovery of consciousness corresponds to a transition in the frequency (f, δf) space, which is exactly reproduced by a simple time rescaling. These findings may perhaps be applied to other altered consciousness states.
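The time-rescaling observation above can be illustrated with a short sketch: compressing a trace's time axis by a factor of 1.6 shifts every spectral peak up by the same factor. The sampling rate and the synthetic 6 Hz "unconscious" rhythm below are assumptions used only to make the effect visible.

```python
import numpy as np

def rescale_time(signal, factor=1.6, fs=512.0):
    """Play a trace back 'factor' times faster by resampling onto a compressed time axis.

    A spectral peak at f Hz moves to factor*f Hz. (Illustrative sketch; fs is assumed,
    factor = 1.6 follows the scaling reported in the abstract.)
    """
    t = np.arange(len(signal)) / fs
    t_fast = np.arange(0.0, t[-1] / factor, 1.0 / fs)          # compressed time axis
    return np.interp(t_fast * factor, t, signal)

# Sanity check with a synthetic 6 Hz rhythm: the dominant frequency becomes ~9.6 Hz.
fs = 512.0
t = np.arange(0.0, 10.0, 1.0 / fs)
slow = np.sin(2 * np.pi * 6.0 * t)
fast = rescale_time(slow, factor=1.6, fs=fs)
freqs = np.fft.rfftfreq(len(fast), 1.0 / fs)
print("dominant frequency after rescaling:", freqs[np.argmax(np.abs(np.fft.rfft(fast)))])
```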
Comparison of laser ray-tracing and skiascopic ocular wavefront-sensing devices
Bartsch, D-UG; Bessho, K; Gomez, L; Freeman, WR
2009-01-01
Purpose To compare two wavefront-sensing devices based on different principles. Methods Thirty-eight healthy eyes of 19 patients were measured five times in the reproducibility study. Twenty eyes of 10 patients were measured in the comparison study. The Tracey Visual Function Analyzer (VFA), based on the ray-tracing principle, and the Nidek optical pathway difference (OPD)-Scan, based on the dynamic skiascopy principle, were compared. The standard deviation (SD) of root mean square (RMS) errors was compared to verify the reproducibility. We evaluated RMS errors, Zernike terms and conventional refractive indexes (Sph, Cyl, Ax, and spherical equivalent). Results In the RMS error readings, both devices showed similar ratios of SD to the mean measurement value (VFA: 57.5±11.7%, OPD-Scan: 53.9±10.9%). Comparison on the same eye showed that almost all terms were significantly greater using the VFA than using the OPD-Scan. However, certain high spatial frequency aberrations (tetrafoil, pentafoil, and hexafoil) were consistently measured near zero with the OPD-Scan. Conclusion Both devices showed a similar level of reproducibility; however, there was a considerable difference in the wavefront readings between the machines when measuring the same eye. Differences in the number of sample points, centration, and measurement algorithms between the two instruments may explain our results. PMID:17571088
Studying the precision of ray tracing techniques with Szekeres models
NASA Astrophysics Data System (ADS)
Koksbang, S. M.; Hannestad, S.
2015-07-01
The simplest standard ray tracing scheme employing the Born and Limber approximations and neglecting lens-lens coupling is used for computing the convergence along individual rays in mock N-body data based on Szekeres swiss cheese and onion models. The results are compared with the exact convergence computed using the exact Szekeres metric combined with the Sachs formalism. A comparison is also made with an extension of the simple ray tracing scheme which includes the Doppler convergence. The exact convergence is reproduced very precisely as the sum of the gravitational and Doppler convergences along rays in Lemaitre-Tolman-Bondi swiss cheese and single void models. This is not the case when the swiss cheese models are based on nonsymmetric Szekeres models. For such models, there is a significant deviation between the exact and ray traced paths and hence also the corresponding convergences. There is also a clear deviation between the exact and ray tracing results obtained when studying both nonsymmetric and spherically symmetric Szekeres onion models.
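For orientation, the "simple ray tracing scheme" referred to above typically discretizes the standard Born-approximation convergence integral along each line of sight. The sketch below evaluates that textbook integral for a single ray; the cosmological parameters, the χ-to-a mapping, and the density contrast along the ray are placeholders standing in for N-body output, not quantities from this paper.

```python
import numpy as np

# Born-approximation convergence along one line of sight (comoving distances in Mpc):
#   kappa = (3/2) Omega_m (H0/c)^2 * integral[ delta(chi) * chi (chi_s - chi) / chi_s / a(chi) dchi ]
Omega_m = 0.3
H0_over_c = 70.0 / 2.998e5                  # H0/c in 1/Mpc (H0 = 70 km/s/Mpc, assumption)

chi_s = 3.0e3                               # comoving distance to the source plane, Mpc (assumption)
chi = np.linspace(1.0, chi_s, 2000)         # sample points along the ray, Mpc
a = 1.0 / (1.0 + chi / 4.0e3)               # crude distance-redshift relation, placeholder only
delta = 0.05 * np.sin(chi / 50.0)           # density contrast along the ray (stand-in for N-body data)

lens_kernel = chi * (chi_s - chi) / chi_s / a
kappa = 1.5 * Omega_m * H0_over_c ** 2 * np.trapz(delta * lens_kernel, chi)
print("Born-approximation convergence kappa =", kappa)
```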
Tissue Trace Elements and Lipid Peroxidation in Breeding Female Bank Voles Myodes glareolus.
Bonda-Ostaszewska, Elżbieta; Włostowski, Tadeusz; Łaszkiewicz-Tiszczenko, Barbara
2018-04-27
Recent studies have demonstrated that reproduction reduces oxidative damage in various tissues of small mammal females. The present work was designed to determine whether the reduction of oxidative stress in reproductive bank vole females was associated with changes in tissue trace elements (iron, copper, zinc) that play an essential role in the production of reactive oxygen species. Lipid peroxidation (a marker of oxidative stress) and iron concentration in liver, kidneys, and skeletal muscles of reproducing bank vole females that weaned one litter were significantly lower than in non-reproducing females; linear regression analysis confirmed a positive relation between the tissue iron and lipid peroxidation. The concentrations of copper were significantly lower only in skeletal muscles of reproductive females and correlated positively with lipid peroxidation. No changes in tissue zinc were found in breeding females when compared with non-breeding animals. These data indicate that decreases in tissue iron and copper concentrations may be responsible for the reduction of oxidative stress in reproductive bank vole females.
NASA Astrophysics Data System (ADS)
El-Kader, M. S. A.; Godet, J.-L.; El-Sadek, A. A.; Maroulis, G.
2017-10-01
Quantum mechanical line shapes of collision-induced light scattering at room temperature (295 K) and collision-induced absorption at T = 195 K are computed for gaseous mixtures of molecular hydrogen and argon, using theoretical values for the pair-polarisability trace and anisotropy and the induced dipole moments as input. Comparison with other theoretical spectra of isotropic and anisotropic light scattering and with measured absorption spectra shows satisfactory agreement, although the uncertainty in the measured spectral moments is large. Ab initio models of the trace and anisotropy polarisability which reproduce the recent scattering spectra are given. An empirical model of the dipole moment which reproduces the experimental spectra and the first three spectral moments more closely than the fundamental theory is also given. Good agreement between computed and/or experimental line shapes of both absorption and scattering is obtained when the potential model constructed from transport and thermo-physical properties is used.
Detecting Runtime Anomalies in AJAX Applications through Trace Analysis
2011-08-10
statements by adding the instrumentation to the GWT UI classes, leaving the user code untouched. Some content management frameworks such as Drupal [12] … "Google web toolkit," http://code.google.com/webtoolkit/. [12] "Form generation - Drupal API," http://api.drupal.org/api/group/form_api/6.
Effects of a wavy neutral sheet on cosmic ray anisotropies
NASA Technical Reports Server (NTRS)
Kota, J.; Jokipii, J. R.
1985-01-01
The first results of a three-dimensional numerical code calculating cosmic ray anisotropies are presented. The code includes diffusion, convection, adiabatic cooling, and drift in an interplanetary magnetic field model containing a wavy neutral sheet. The 3-D model can reproduce all the principal observations for a reasonable set of parameters.
NASA Astrophysics Data System (ADS)
Burganos, Vasilis N.; Skouras, Eugene D.; Kalarakis, Alexandros N.
2017-10-01
The lattice-Boltzmann (LB) method is used in this work to reproduce the controlled addition of binder and hydrophobicity-promoting agents, like polytetrafluoroethylene (PTFE), into gas diffusion layers (GDLs) and to predict flow permeabilities in the through- and in-plane directions. The present simulator manages to reproduce spreading of binder and hydrophobic additives, sequentially, into the neat fibrous layer using a two-phase flow model. Gas flow simulation is achieved by the same code, sidestepping the need for a post-processing flow code and avoiding the usual input/output and data interface problems that arise in other techniques. Compression effects on flow anisotropy of the impregnated GDL are also studied. The permeability predictions for different compression levels and for different binder or PTFE loadings are found to compare well with experimental data for commercial GDL products and with computational fluid dynamics (CFD) predictions. Alternatively, the PTFE-impregnated structure is reproduced from Scanning Electron Microscopy (SEM) images using an independent, purely geometrical approach. A comparison of the two approaches is made regarding their adequacy to reproduce correctly the main structural features of the GDL and to predict anisotropic flow permeabilities at different volume fractions of binder and hydrophobic additives.
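The flow permeabilities mentioned above are conventionally extracted from the simulated velocity field via Darcy's law. The snippet below shows only that bookkeeping step; the viscosity, pressure drop, sample thickness, and mean velocities are illustrative placeholders rather than LB results for any particular GDL.

```python
def darcy_permeability(u_mean, mu, dP, L):
    """Permeability from Darcy's law, k = mu * <u> * L / dP.

    u_mean : superficial (volume-averaged) velocity along the applied pressure gradient [m/s]
    mu     : dynamic viscosity of the fluid [Pa s]
    dP     : pressure drop across the sample [Pa]
    L      : sample length along the flow direction [m]
    """
    return mu * u_mean * L / dP

# Placeholder numbers standing in for LB output on a GDL sample (not measured values):
mu, dP, L = 1.8e-5, 10.0, 200e-6                  # air viscosity, applied pressure drop, thickness
u_through, u_inplane = 2.5e-3, 4.0e-3             # mean velocities for two flow directions [m/s]
print("through-plane k [m^2]:", darcy_permeability(u_through, mu, dP, L))
print("in-plane      k [m^2]:", darcy_permeability(u_inplane, mu, dP, L))
```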
Predictions of one-group interfacial area transport in TRACE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Worosz, T.; Talley, J. D.; Kim, S.
In current nuclear reactor system analysis codes utilizing the two-fluid model, flow regime dependent correlations are used to specify the interfacial area concentration (a_i). This approach does not capture the continuous evolution of the interfacial structures, and thus it can pose issues near the transition boundaries. Consequently, a pilot version of the system analysis code TRACE is being developed that employs the interfacial area transport equation (IATE). In this approach, dynamic estimation of a_i is provided through mechanistic models for bubble coalescence and breakup. The implementation of the adiabatic, one-group IATE into TRACE is assessed against experimental data from 50 air-water, two-phase flow conditions in pipes ranging in inner diameter from 2.54 to 20.32 cm for both vertical co-current upward and downward flows. Predictions of pressure, void fraction, bubble velocity, and a_i data are made. TRACE employing the conventional flow regime-based approach is found to underestimate a_i and can only predict linear trends since the calculation is governed by the pressure. Furthermore, trends opposite to that of the data are predicted for some conditions. In contrast, TRACE with the one-group IATE demonstrates a significant improvement in predicting the experimental data with an average disagreement of ±13%. Additionally, TRACE with the one-group IATE is capable of predicting nonlinear axial development of a_i by accounting for various bubble interaction mechanisms, such as coalescence and disintegration. (authors)
SNPmplexViewer--toward a cost-effective traceability system
2011-01-01
Background Beef traceability has become mandatory in many regions of the world and is typically achieved through the use of unique numerical codes on ear tags and animal passports. DNA-based traceability uses the animal's own DNA code to identify it and the products derived from it. Using SNaPshot, a primer-extension-based method, a multiplex of 25 SNPs in a single reaction has been used to reduce the expense of genotyping a panel of SNPs useful for identity control. Findings To further decrease SNaPshot's cost, we introduced the Perl script SNPmplexViewer, which facilitates the analysis of trace files for reactions performed without the use of fluorescent size standards. SNPmplexViewer automatically aligns reference and target trace electropherograms, run with and without fluorescent size standards, respectively. SNPmplexViewer produces a modified target trace file containing a normalised trace in which the reference size standards are embedded. SNPmplexViewer also outputs aligned images of the two electropherograms together with a difference profile. Conclusions Modified trace files generated by SNPmplexViewer enable genotyping of SNaPshot reactions performed without fluorescent size standards, using common fragment-sizing software packages. SNPmplexViewer's normalised output may also improve the genotyping software's performance. Thus, SNPmplexViewer is a general free tool enabling the reduction of SNaPshot's cost as well as fast viewing and comparison of trace electropherograms for fragment analysis. SNPmplexViewer is available at http://cowry.agri.huji.ac.il/cgi-bin/SNPmplexViewer.cgi. PMID:21600063
Accurate Ray-tracing of Realistic Neutron Star Atmospheres for Constraining Their Parameters
NASA Astrophysics Data System (ADS)
Vincent, Frederic H.; Bejger, Michał; Różańska, Agata; Straub, Odele; Paumard, Thibaut; Fortin, Morgane; Madej, Jerzy; Majczyna, Agnieszka; Gourgoulhon, Eric; Haensel, Paweł; Zdunik, Leszek; Beldycki, Bartosz
2018-03-01
Thermal-dominated X-ray spectra of neutron stars in quiescent, transient X-ray binaries and neutron stars that undergo thermonuclear bursts are sensitive to mass and radius. The mass-radius relation of neutron stars depends on the equation of state (EoS) that governs their interior. Constraining this relation accurately is therefore of fundamental importance for understanding the nature of dense matter. In this context, we introduce a pipeline to calculate realistic model spectra of rotating neutron stars with hydrogen and helium atmospheres. An arbitrarily fast-rotating neutron star with a given EoS generates the spacetime in which the atmosphere emits radiation. We use the LORENE/NROTSTAR code to compute the spacetime numerically and the ATM24 code to solve the radiative transfer equations self-consistently. Emerging specific intensity spectra are then ray-traced through the neutron star's spacetime from the atmosphere to a distant observer with the GYOTO code. Here, we present and test our fully relativistic numerical pipeline. To discuss and illustrate the importance of realistic atmosphere models, we compare our model spectra to simpler models like the commonly used isotropic color-corrected blackbody emission. We highlight the importance of considering realistic model-atmosphere spectra together with relativistic ray-tracing to obtain accurate predictions. We also emphasize the crucial impact of the star's rotation on the observables. Finally, we close a controversy that has been ongoing in the literature in recent years regarding the validity of the ATM24 code.
Study of statistical coding for digital TV
NASA Technical Reports Server (NTRS)
Gardenhire, L. W.
1972-01-01
The results are presented for a detailed study to determine a pseudo-optimum statistical code to be installed in a digital TV demonstration test set. Studies of source encoding were undertaken, using redundancy removal techniques in which the picture is reproduced within a preset tolerance. A method of source encoding, which preliminary studies show to be encouraging, is statistical encoding. A pseudo-optimum code was defined and the associated performance of the code was determined. The format was fixed at 525 lines per frame, 30 frames per second, as per commercial standards.
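As a generic illustration of statistical (entropy) encoding of the kind discussed above, the sketch below builds a Huffman code for a toy stream of pixel differences, so that frequent values receive short codewords. This is not the pseudo-optimum code defined in the study, merely a minimal example of the principle.

```python
import heapq
from collections import Counter

def huffman_code(symbols):
    """Build a Huffman code: frequent symbols receive short codewords (a statistical code)."""
    freq = Counter(symbols)
    heap = [[weight, idx, {sym: ""}] for idx, (sym, weight) in enumerate(freq.items())]
    heapq.heapify(heap)
    next_idx = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        lo_codes = {s: "0" + c for s, c in lo[2].items()}   # prefix the two subtrees
        hi_codes = {s: "1" + c for s, c in hi[2].items()}
        heapq.heappush(heap, [lo[0] + hi[0], next_idx, {**lo_codes, **hi_codes}])
        next_idx += 1
    return heap[0][2]

# Toy "pixel difference" stream: small differences dominate, as in typical TV imagery.
stream = [0] * 60 + [1] * 15 + [-1] * 15 + [2] * 5 + [-2] * 5
code = huffman_code(stream)
avg_bits = sum(len(code[s]) for s in stream) / len(stream)
print(code)
print(f"{avg_bits:.2f} bits/sample versus 3 bits/sample for a fixed-length code")
```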
A Combinatorial Geometry Computer Description of the MEP-021A Generator Set
1979-02-01
Keywords: generator computer description, gasoline generator, GIFT, MEP-021A. This … GIFT code is also stored on magnetic tape for future vulnerability analysis. … the Geometric Information for Targets (GIFT) computer code. The GIFT code traces shotlines through a COM-GEOM description from any specified attack
3D Laser Imprint Using a Smoother Ray-Traced Power Deposition Method
NASA Astrophysics Data System (ADS)
Schmitt, Andrew J.
2017-10-01
Imprinting of laser nonuniformities in directly-driven icf targets is a challenging problem to accurately simulate with large radiation-hydro codes. One of the most challenging aspects is the proper construction of the complex and rapidly changing laser interference structure driving the imprint using the reduced laser propagation models (usually ray-tracing) found in these codes. We have upgraded the modelling capability in our massively-parallel
NASA Astrophysics Data System (ADS)
Bohrson, Wendy A.; Spera, Frank J.
2007-11-01
Volcanic and plutonic rocks provide abundant evidence for complex processes that occur in magma storage and transport systems. The fingerprint of these processes, which include fractional crystallization, assimilation, and magma recharge, is captured in petrologic and geochemical characteristics of suites of cogenetic rocks. Quantitatively evaluating the relative contributions of each process requires integration of mass, species, and energy constraints, applied in a self-consistent way. The energy-constrained model Energy-Constrained Recharge, Assimilation, and Fractional Crystallization (EC-RAχFC) tracks the trace element and isotopic evolution of a magmatic system (melt + solids) undergoing simultaneous fractional crystallization, recharge, and assimilation. Mass, thermal, and compositional (trace element and isotope) output is provided for melt in the magma body, cumulates, enclaves, and anatectic (i.e., country rock) melt. Theory of the EC computational method has been presented by Spera and Bohrson (2001, 2002, 2004), and applications to natural systems have been elucidated by Bohrson and Spera (2001, 2003) and Fowler et al. (2004). The purpose of this contribution is to make the final version of the EC-RAχFC computer code available and to provide instructions for code implementation, description of input and output parameters, and estimates of typical values for some input parameters. A brief discussion highlights measures by which the user may evaluate the quality of the output and also provides some guidelines for implementing nonlinear productivity functions. The EC-RAχFC computer code is written in Visual Basic, the programming language of Excel. The code therefore launches in Excel and is compatible with both PC and MAC platforms. The code is available on the authors' Web sites (http://magma.geol.ucsb.edu/ and http://www.geology.cwu.edu/ecrafc) as well as in the auxiliary material.
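For readers wanting a concrete point of reference, the trace-element effect of fractional crystallization alone, one of the processes EC-RAχFC couples with recharge and assimilation, follows the classical Rayleigh law. The sketch below evaluates that closed-system limit; it is not the EC-RAχFC formulation itself.

```python
import numpy as np

def rayleigh_melt_concentration(C0, F, D):
    """Trace-element concentration in the residual melt under Rayleigh fractional crystallization.

    C0 : initial concentration in the melt
    F  : fraction of melt remaining (0 < F <= 1)
    D  : bulk solid/melt partition coefficient
    Classical closed-system law C_melt = C0 * F**(D - 1); not the EC-RAxFC formulation.
    """
    return C0 * np.asarray(F, dtype=float) ** (D - 1.0)

F = np.linspace(1.0, 0.2, 5)
print("incompatible element (D = 0.1):", rayleigh_melt_concentration(100.0, F, 0.1))   # enriched
print("compatible element   (D = 5.0):", rayleigh_melt_concentration(100.0, F, 5.0))   # depleted
```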
nIFTY galaxy cluster simulations - III. The similarity and diversity of galaxies and subhaloes
NASA Astrophysics Data System (ADS)
Elahi, Pascal J.; Knebe, Alexander; Pearce, Frazer R.; Power, Chris; Yepes, Gustavo; Cui, Weiguang; Cunnama, Daniel; Kay, Scott T.; Sembolini, Federico; Beck, Alexander M.; Davé, Romeel; February, Sean; Huang, Shuiyao; Katz, Neal; McCarthy, Ian G.; Murante, Giuseppe; Perret, Valentin; Puchwein, Ewald; Saro, Alexandro; Teyssier, Romain
2016-05-01
We examine subhaloes and galaxies residing in a simulated Λ cold dark matter galaxy cluster (M^crit_{200} = 1.1 × 10^{15} h^{-1} M_⊙) produced by hydrodynamical codes ranging from classic smooth particle hydrodynamics (SPH), newer SPH codes, adaptive and moving mesh codes. These codes use subgrid models to capture galaxy formation physics. We compare how well these codes reproduce the same subhaloes/galaxies in gravity-only, non-radiative hydrodynamics and full feedback physics runs by looking at the overall subhalo/galaxy distribution and on an individual object basis. We find that the subhalo population is reproduced to within ≲10 per cent for both dark matter only and non-radiative runs, with individual objects showing code-to-code scatter of ≲0.1 dex, although the gas in non-radiative simulations shows significant scatter. Including feedback physics significantly increases the diversity. Subhalo mass and Vmax distributions vary by ≈20 per cent. The galaxy populations also show striking code-to-code variations. Although the Tully-Fisher relation is similar in almost all codes, the number of galaxies with 10^9 h^{-1} M⊙ ≲ M* ≲ 10^{12} h^{-1} M⊙ can differ by a factor of 4. Individual galaxies show code-to-code scatter of ∼0.5 dex in stellar mass. Moreover, systematic differences exist, with some codes producing galaxies 70 per cent smaller than others. The diversity partially arises from the inclusion/absence of active galactic nucleus feedback. Our results combined with our companion papers demonstrate that subgrid physics is not just subject to fine-tuning, but the complexity of building galaxies in all environments remains a challenge. We argue that even basic galaxy properties, such as stellar mass to halo mass, should be treated with error bars of ∼0.2-0.4 dex.
Estimation of trace amounts of benzene in solvent-extracted vegetable oils and oil seed cakes.
Masohan, A; Parsad, G; Khanna, M K; Chopra, S K; Rawat, B S; Garg, M O
2000-09-01
A new method is presented for the qualitative and quantitative estimation of trace amounts (up to 0.15 ppm) of benzene in crude as well as refined vegetable oils obtained by extraction with food grade hexane (FGH), and in the oil seed cakes left after extraction. The method involves the selection of two solvents: cyclohexanol, for thinning of the viscous vegetable oil, and heptane, for azeotroping out trace benzene as a concentrate from the resulting mixture. Benzene is then estimated in the resulting azeotrope either by UV spectroscopy or by GC-MS, subject to the availability and cost-effectiveness of the latter. Repeatability and reproducibility of the method are within 1-3% error. This method is suitable for estimating benzene in vegetable oils and oil seed cakes.
Ming, Liang; Xi, Xia; Chen, Tingting; Liu, Jie
2008-01-01
We have developed a simple, convenient and inexpensive voltammetric method for determining trace Sudan I contamination in chili powder, based on the catalyzed electrochemical reduction of Sudan I at the carbon nanotube modified electrode. Under optimized conditions, the method exhibited acceptable analytical performance in terms of linearity (over the concentration range 6.0×10⁻⁷ to 7.5×10⁻⁵ M, r = 0.9967), detection limit (2.0×10⁻⁷ M) and reproducibility (RSD = 4.6%, n = 10, for 2.0×10⁻⁵ M Sudan I). PMID:27879800
Reus, Astrid A; Reisinger, Kerstin; Downs, Thomas R; Carr, Gregory J; Zeller, Andreas; Corvi, Raffaella; Krul, Cyrille A M; Pfuhler, Stefan
2013-11-01
Reconstructed 3D human epidermal skin models are being used increasingly for safety testing of chemicals. Based on EpiDerm™ tissues, an assay was developed in which the tissues were topically exposed to test chemicals for 3h followed by cell isolation and assessment of DNA damage using the comet assay. Inter-laboratory reproducibility of the 3D skin comet assay was initially demonstrated using two model genotoxic carcinogens, methyl methane sulfonate (MMS) and 4-nitroquinoline-n-oxide, and the results showed good concordance among three different laboratories and with in vivo data. In Phase 2 of the project, intra- and inter-laboratory reproducibility was investigated with five coded compounds with different genotoxicity liability tested at three different laboratories. For the genotoxic carcinogens MMS and N-ethyl-N-nitrosourea, all laboratories reported a dose-related and statistically significant increase (P < 0.05) in DNA damage in every experiment. For the genotoxic carcinogen, 2,4-diaminotoluene, the overall result from all laboratories showed a smaller, but significant genotoxic response (P < 0.05). For cyclohexanone (CHN) (non-genotoxic in vitro and in vivo, and non-carcinogenic), an increase compared to the solvent control acetone was observed only in one laboratory. However, the response was not dose related and CHN was judged negative overall, as was p-nitrophenol (p-NP) (genotoxic in vitro but not in vivo and non-carcinogenic), which was the only compound showing clear cytotoxic effects. For p-NP, significant DNA damage generally occurred only at doses that were substantially cytotoxic (>30% cell loss), and the overall response was comparable in all laboratories despite some differences in doses tested. The results of the collaborative study for the coded compounds were generally reproducible among the laboratories involved and intra-laboratory reproducibility was also good. These data indicate that the comet assay in EpiDerm™ skin models is a promising model for the safety assessment of compounds with a dermal route of exposure.
Illusory Motion Reproduced by Deep Neural Networks Trained for Prediction
Watanabe, Eiji; Kitaoka, Akiyoshi; Sakamoto, Kiwako; Yasugi, Masaki; Tanaka, Kenta
2018-01-01
The cerebral cortex predicts visual motion to adapt human behavior to surrounding objects moving in real time. Although the underlying mechanisms are still unknown, predictive coding is one of the leading theories. Predictive coding assumes that the brain's internal models (which are acquired through learning) predict the visual world at all times and that errors between the prediction and the actual sensory input further refine the internal models. In the past year, deep neural networks based on predictive coding were reported for a video prediction machine called PredNet. If the theory substantially reproduces the visual information processing of the cerebral cortex, then PredNet can be expected to represent the human visual perception of motion. In this study, PredNet was trained with natural scene videos of the self-motion of the viewer, and the motion prediction ability of the obtained computer model was verified using unlearned videos. We found that the computer model accurately predicted the magnitude and direction of motion of a rotating propeller in unlearned videos. Surprisingly, it also represented the rotational motion for illusion images that were not moving physically, much like human visual perception. While the trained network accurately reproduced the direction of illusory rotation, it did not detect motion components in negative control pictures wherein people do not perceive illusory motion. This research supports the exciting idea that the mechanism assumed by the predictive coding theory is one of the bases of motion illusion generation. Using sensory illusions as indicators of human perception, deep neural networks are expected to contribute significantly to the development of brain research. PMID:29599739
Hovnanians, Ninel; Win, Theresa; Makkiya, Mohammed; Zheng, Qi; Taub, Cynthia
2017-11-01
To assess the efficiency and reproducibility of automated measurements of left ventricular (LV) volumes and LV ejection fraction (LVEF) in comparison to the manually traced biplane Simpson's method. This is a single-center prospective study. Apical four- and two-chamber views were acquired in patients in sinus rhythm. Two operators independently measured LV volumes and LVEF using the biplane Simpson's method. In addition, the image analysis software a2DQ on the Philips EPIQ system was applied to automatically assess the LV volumes and LVEF. The time spent on each analysis, using both methods, was documented. Concordance of echocardiographic measures was evaluated using intraclass correlation (ICC) and Bland-Altman analysis. Manual tracing and automated measurement of LV volumes and LVEF were performed in 184 patients with a mean age of 67.3 ± 17.3 years and BMI 28.0 ± 6.8 kg/m². ICC and Bland-Altman analysis showed good agreement between the manual and automated methods for measuring LVEF, end-systolic, and end-diastolic volumes. The average analysis time was significantly less using the automated method than manual tracing (116 vs 217 seconds/patient, P < .0001). Automated measurement using the novel image analysis software a2DQ on the Philips EPIQ system produced accurate, efficient, and reproducible assessment of LV volumes and LVEF compared with manual measurement. © 2017, Wiley Periodicals, Inc.
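The manual reference method above is the biplane Simpson's method of disks; for completeness, its standard formulation (not spelled out in the abstract) is:

```latex
% Biplane Simpson's method of disks: the left ventricle is divided into N disks of
% elliptical cross-section along its long axis L, with diameters a_i and b_i measured
% in the apical four- and two-chamber views at end-diastole and end-systole.
V_{\mathrm{LV}} = \frac{\pi}{4}\,\frac{L}{N}\sum_{i=1}^{N} a_i\, b_i ,
\qquad
\mathrm{LVEF} = \frac{V_{\mathrm{ED}} - V_{\mathrm{ES}}}{V_{\mathrm{ED}}} \times 100\% .
```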
A Combinatorial Geometry Computer Description of the M9 ACE (Armored Combat Earthmover) Vehicle
1984-12-01
program requires as input the M9 target descriptions as processed by the Geometric Information for Targets (GIFT) computer code. The first step is … model of the target. This COM-GEOM target description is used as input to the Geometric Information for Targets (GIFT) computer code. Among other things, the GIFT code traces shotlines through a COM-GEOM description from any specified aspect, listing pertinent information about each component hit
FLiT: a field line trace code for magnetic confinement devices
NASA Astrophysics Data System (ADS)
Innocente, P.; Lorenzini, R.; Terranova, D.; Zanca, P.
2017-04-01
This paper presents a field line tracing code (FLiT) developed to study particle and energy transport as well as other phenomena related to magnetic topology in reversed-field pinch (RFP) and tokamak experiments. The code computes magnetic field lines in toroidal geometry using curvilinear coordinates (r, ϑ, ϕ) and calculates the intersections of these field lines with specified planes. The code also computes the magnetic and thermal diffusivity due to stochastic magnetic field in the collisionless limit. Compared to Hamiltonian codes, there are no constraints on the magnetic field functional formulation, which allows the integration of whichever magnetic field is required. The code uses the magnetic field computed by solving the zeroth-order axisymmetric equilibrium and the Newcomb equation for the first-order helical perturbation matching the edge magnetic field measurements in toroidal geometry. Two algorithms are developed to integrate the field lines: one is a dedicated implementation of a first-order semi-implicit volume-preserving integration method, and the other is based on the Adams-Moulton predictor-corrector method. As expected, the volume-preserving algorithm is accurate in conserving divergence, but slow because the low integration order requires small amplitude steps. The second algorithm proves to be quite fast and it is able to integrate the field lines in many partially and fully stochastic configurations accurately. The code has already been used to study the core and edge magnetic topology of the RFX-mod device in both the reversed-field pinch and tokamak magnetic configurations.
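A minimal fixed-step field-line integrator conveys the core operation such a code performs: advancing dX/ds = B/|B| and recording intersections with a chosen plane. The sketch below uses a classical RK4 step and a simple analytic placeholder field; it reproduces neither FLiT's semi-implicit volume-preserving scheme nor its Adams-Moulton predictor-corrector, and the field model is an assumption.

```python
import numpy as np

def B_field(xyz):
    """Placeholder analytic field: a 1/R toroidal field plus a weak vertical component (assumption)."""
    x, y, z = xyz
    R = np.hypot(x, y) + 1e-12
    return np.array([-y / R, x / R, 0.05])

def trace_field_line(x0, ds=0.01, n_steps=5000):
    """Integrate dX/ds = B/|B| with a classical fixed-step RK4 scheme; returns the traced points."""
    def unit_b(x):
        b = B_field(x)
        return b / np.linalg.norm(b)
    pts = np.empty((n_steps + 1, 3))
    pts[0] = x0
    for i in range(n_steps):
        x = pts[i]
        k1 = unit_b(x)
        k2 = unit_b(x + 0.5 * ds * k1)
        k3 = unit_b(x + 0.5 * ds * k2)
        k4 = unit_b(x + ds * k3)
        pts[i + 1] = x + ds / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
    return pts

line = trace_field_line(np.array([1.5, 0.0, 0.0]))
# Crossings of the phi = 0 half-plane (y = 0, x > 0): a crude Poincare section of the field line.
crossings = [p for a, p in zip(line[:-1], line[1:]) if a[1] < 0 <= p[1] and p[0] > 0]
print(len(crossings), "Poincare crossings recorded")
```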
Ordering Traces Logically to Identify Lateness in Message Passing Programs
Isaacs, Katherine E.; Gamblin, Todd; Bhatele, Abhinav; ...
2015-03-30
Event traces are valuable for understanding the behavior of parallel programs. However, automatically analyzing a large parallel trace is difficult, especially without a specific objective. We aid this endeavor by extracting a trace's logical structure, an ordering of trace events derived from happened-before relationships, while taking into account developer intent. Using this structure, we can calculate an operation's delay relative to its peers on other processes. The logical structure also serves as a platform for comparing and clustering processes as well as highlighting communication patterns in a trace visualization. We present an algorithm for determining this idealized logical structure from traces of message passing programs, and we develop metrics to quantify delays and differences among processes. We implement our techniques in Ravel, a parallel trace visualization tool that displays both logical and physical timelines. Rather than showing the duration of each operation, we display where delays begin and end, and how they propagate. As a result, we apply our approach to the traces of several message passing applications, demonstrating the accuracy of our extracted structure and its utility in analyzing these codes.
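A toy version of the idea, assigning logical steps from happened-before relations among matched sends and receives and measuring each event's lateness against peers at the same step, is sketched below. The event list, the stepping rule, and the lateness metric are simplifications for illustration, not Ravel's actual algorithm.

```python
# Toy message-passing trace: each event is (process, kind, matched event index, wall-clock time).
# Logical step rule used here (a simplification, not Ravel's): an event's step is one more than
# the larger of (a) the previous step on its own process and (b) for a receive, the step of the
# matched send. Lateness = wall time minus the earliest wall time among events at the same step.
events = [
    {"proc": 0, "kind": "send", "match": 2, "t": 1.0},
    {"proc": 1, "kind": "send", "match": 3, "t": 1.1},
    {"proc": 1, "kind": "recv", "match": 0, "t": 2.0},
    {"proc": 0, "kind": "recv", "match": 1, "t": 3.5},   # late relative to its peer receive
]

step = [0] * len(events)
last_step_on_proc = {}
for idx in sorted(range(len(events)), key=lambda k: events[k]["t"]):   # process in wall-time order
    e = events[idx]
    prev = last_step_on_proc.get(e["proc"], 0)
    sender = step[e["match"]] if e["kind"] == "recv" else 0
    step[idx] = max(prev, sender) + 1
    last_step_on_proc[e["proc"]] = step[idx]

for s in sorted(set(step)):
    peers = [k for k in range(len(events)) if step[k] == s]
    t0 = min(events[k]["t"] for k in peers)
    for k in peers:
        e = events[k]
        print(f"step {s}: event {k} (proc {e['proc']}, {e['kind']}) lateness {e['t'] - t0:.1f}")
```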
Students' Views and Attitudes Towards the Communication Code Used in Press Articles about Science
ERIC Educational Resources Information Center
Halkia, Krystallia; Mantzouridis, Dimitris
2005-01-01
The present research was designed to investigate the reaction of secondary school students to the communication code that the press uses in science articles: it attempts to trace which communication techniques can be of potential use in science education. The sample of the research consists of 351 secondary school students. The research instrument…
Sticks and Stones: Why First Amendment Absolutism Fails When Applied to Campus Harassment Codes.
ERIC Educational Resources Information Center
Lumsden, Linda
This paper analyzes how absolutist arguments against campus harassment codes violate the spirit of the first amendment, examining in particular the United States Supreme Court ruling in "RAV v. St. Paul." The paper begins by tracing the current development of first amendment doctrine, analyzing its inadequacy in the campus hate speech…
Trident: A Universal Tool for Generating Synthetic Absorption Spectra from Astrophysical Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hummels, Cameron B.; Smith, Britton D.; Silvia, Devin W.
Hydrodynamical simulations are increasingly able to accurately model physical systems on stellar, galactic, and cosmological scales; however, the utility of these simulations is often limited by our ability to directly compare them with the data sets produced by observers: spectra, photometry, etc. To address this problem, we have created trident, a Python-based open-source tool for post-processing hydrodynamical simulations to produce synthetic absorption spectra and related data. trident can (i) create absorption-line spectra for any trajectory through a simulated data set mimicking both background quasar and down-the-barrel configurations; (ii) reproduce the spectral characteristics of common instruments like the Cosmic Origins Spectrograph; (iii) operate across the ultraviolet, optical, and infrared using customizable absorption-line lists; (iv) trace simulated physical structures directly to spectral features; (v) approximate the presence of ion species absent from the simulation outputs; (vi) generate column density maps for any ion; and (vii) provide support for all major astrophysical hydrodynamical codes. trident was originally developed to aid in the interpretation of observations of the circumgalactic medium and intergalactic medium, but it remains a general tool applicable in other contexts.
Constraining the Post-Thermal Pulse Mass-Loss History of R Scl with SOFIA/FORCAST
NASA Astrophysics Data System (ADS)
Hankins, Matthew; Herter, Terry; maercker, matthias; Lau, Ryan M.; Sloan, Greg
2018-06-01
R Sculptoris (R Scl) is a nearby (~370 pc) carbon star with a massive circumstellar shell (M_shell ∼ 7×10⁻³ M⊙) which is thought to have been produced by a thermal pulse event ∼2200 years ago. We observed R Scl with the Faint Object InfraRed CAMera for the SOFIA Telescope (FORCAST) at 19.7, 25.2, 31.5, 34.8, and 37.1 μm to study its circumstellar dust emission. Maps of the infrared emission were used to examine the morphology and temperature structure of the spatially extended dust emission. We used the radiative transfer code DUSTY to fit the radial density profile of the circumstellar material, and find that a geometrically thin dust shell cannot reproduce the observed emission. Instead, a second dust component is needed to model the emission. This component, which lies interior to the dust shell, traces the post-thermal pulse mass loss of R Scl and is indicative of a slow decline in the star's mass loss over thousands of years. This result is at odds with 'classical' thermal pulse models but is consistent with earlier observations of molecular gas in R Scl's circumstellar environment.
NASA Astrophysics Data System (ADS)
Guarnieri, Vittorio; Francini, Franco
1997-12-01
The latest generation of digital printers is usually characterized by a spatial resolution high enough to allow a designer to realize a binary CGH directly on a transparent film, avoiding photographic reduction techniques. These devices are able to produce slides or offset prints. Furthermore, services supplied by commercial printing companies provide an inexpensive method to rapidly verify the validity of a design by means of a test-and-trial process. Notably, this low-cost approach appears to be suitable for a didactical environment. On the basis of these considerations, a set of software tools able to design CGHs has been developed. The guidelines inspiring the work have been the following: (1) a ray-tracing approach, considering the object to be reproduced as a source of spherical waves; (2) optimization and speed-up of the algorithms used, in order to produce a portable code, runnable on several hardware platforms. In this paper, calculation methods to obtain some fundamental geometric functions (points, lines, curves) are described. Furthermore, by the juxtaposition of these primitive functions it is possible to produce the holograms of more complex objects. Many examples of generated CGHs are presented.
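The point-source ray-tracing approach described above can be sketched in a few lines: superpose spherical waves from the object points with a tilted plane reference wave on the hologram plane and threshold the interference pattern to obtain a binary transmittance suitable for a printer. The wavelength, tilt, pixel pitch, and object points below are illustrative assumptions, not the parameters used by the authors.

```python
import numpy as np

# Binary CGH of two object points: interfere spherical waves from the points with a tilted
# plane reference wave on the hologram plane, then threshold to a binary transmittance.
wavelength = 633e-9                                   # HeNe wavelength [m] (assumption)
k = 2.0 * np.pi / wavelength
n = 512                                               # hologram samples per side
pitch = 10e-6                                         # printer spot size ~10 um (assumption)
coords = (np.arange(n) - n / 2) * pitch
X, Y = np.meshgrid(coords, coords)

object_points = [(0.0, 0.0, 0.05), (1e-3, 0.5e-3, 0.05)]   # (x, y, z) in metres (illustrative)
field = np.zeros((n, n), dtype=complex)
for px, py, pz in object_points:
    r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + pz ** 2)
    field += np.exp(1j * k * r) / r                   # spherical wave from each object point

reference = np.mean(np.abs(field)) * np.exp(1j * k * np.sin(np.deg2rad(1.0)) * X)  # 1 deg tilt
intensity = np.abs(field + reference) ** 2
binary_cgh = (intensity > np.median(intensity)).astype(np.uint8)   # 1 = transparent, 0 = opaque
print("binary CGH:", binary_cgh.shape, "fraction transparent:", binary_cgh.mean())
```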
2006-01-01
collected, code both. Code / type of analysis pairs include: A, physical properties; I, common ions/trace elements; B, common ions; J, sanitary analysis; … (1) A ground-water site is coded as if it is a single point, not a geographic area or property. (2) Latitude and longitude should be determined at a … terrace from an adjacent upland on one side, and a lowland coast or valley on the other. Due to the effects of erosion, the terrace surface may not be as
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matenine, D; Cote, G; Mascolo-Fortin, J
2016-06-15
Purpose: Iterative reconstruction algorithms in computed tomography (CT) require a fast method for computing the intersections between the photons' trajectories and the object, also called ray-tracing or system matrix computation. This work evaluates different ways to store the system matrix, aiming to reconstruct dense image grids in reasonable time. Methods: We propose an optimized implementation of Siddon's algorithm using graphics processing units (GPUs) with a novel data storage scheme. The algorithm computes a part of the system matrix on demand, typically for one projection angle. The proposed method was enhanced with accelerating options: storage of larger subsets of the system matrix, systematic reuse of data via geometric symmetries, an arithmetic-rich parallel code and code configuration via machine learning. It was tested on geometries mimicking a cone beam CT acquisition of a human head. To realistically assess the execution time, the ray-tracing routines were integrated into a regularized Poisson-based reconstruction algorithm. The proposed scheme was also compared to a different approach, where the system matrix is fully pre-computed and loaded at reconstruction time. Results: Fast ray-tracing of realistic acquisition geometries, which often lack spatial symmetry properties, was enabled via the proposed method. Ray-tracing interleaved with projection and backprojection operations required significant additional time. In most cases, ray-tracing was shown to use about 66% of the total reconstruction time. In absolute terms, tracing times varied from 3.6 s to 7.5 min, depending on the problem size. The presence of geometrical symmetries allowed for non-negligible ray-tracing and reconstruction time reduction. Arithmetic-rich parallel code and machine learning permitted a modest reconstruction time reduction, on the order of 1%. Conclusion: Partial system matrix storage permitted the reconstruction of higher 3D image grid sizes and larger projection datasets at the cost of additional time, when compared to the fully pre-computed approach. This work was supported in part by the Fonds de recherche du Quebec - Nature et technologies (FRQ-NT). The authors acknowledge partial support by the CREATE Medical Physics Research Training Network grant of the Natural Sciences and Engineering Research Council of Canada (Grant No. 432290).
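For reference, the kernel being parallelized and stored is essentially Siddon's parametric ray-grid traversal: each ray yields one sparse row of the system matrix containing its intersection length with every pixel it crosses. A compact 2D version is sketched below; it is a plain Python illustration, not the authors' GPU implementation or storage scheme.

```python
import numpy as np

def siddon_2d(p0, p1, nx, ny, spacing=1.0, origin=(0.0, 0.0)):
    """Intersection lengths of segment p0 -> p1 with an nx-by-ny pixel grid (Siddon-style).

    Returns a list of ((ix, iy), length) pairs, i.e. one sparse row of the system matrix.
    """
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    d = p1 - p0
    alphas = [0.0, 1.0]                                   # parametric crossings of grid planes
    for axis, n in ((0, nx), (1, ny)):
        planes = origin[axis] + spacing * np.arange(n + 1)
        if abs(d[axis]) > 1e-12:
            alphas.extend((planes - p0[axis]) / d[axis])
    alphas = np.unique(np.clip(alphas, 0.0, 1.0))
    seg_len = np.linalg.norm(d)
    row = []
    for a0, a1 in zip(alphas[:-1], alphas[1:]):
        if a1 <= a0:
            continue
        mid = p0 + 0.5 * (a0 + a1) * d                    # midpoint identifies the pixel
        ix = int(np.floor((mid[0] - origin[0]) / spacing))
        iy = int(np.floor((mid[1] - origin[1]) / spacing))
        if 0 <= ix < nx and 0 <= iy < ny:
            row.append(((ix, iy), (a1 - a0) * seg_len))
    return row

# A diagonal ray across a 4x4 grid: total intersection length equals the in-grid path length.
row = siddon_2d((-0.5, -0.5), (4.5, 4.5), nx=4, ny=4)
print(row, "total:", sum(length for _, length in row))
```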
Etxano, J; García-Lallana Valbuena, A; Antón Ibáñez, I; Elizalde, A; Pina, L; García-Foncillas, J; Boni, V
2015-01-01
To evaluate the reproducibility of a protocol for dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) for the pharmacokinetic study of breast tumors. We carried out this prospective study from October 2009 through December 2009. We studied 12 patients with stage II-III invasive breast cancer without prior treatment. Our center's research ethics committee approved the study. The 12 patients underwent DCE-MRI on two consecutive days with a high temporal resolution protocol (21 acquisitions/minute). The data obtained in an ROI traced around the largest diameter of the tumor (ROI 1) and in another ROI traced around the area of the lesion's highest K(trans) intensity (ROI 2) were analyzed separately. We used parametric and nonparametric statistical tests to study the reproducibility and concordance of the principal pharmacokinetic variables (K(trans), Kep, Ve and AUC90). The correlations were very high (r>.80; P<.01) for all the variables for ROI 1 and high (r=.70-.80; P<.01) for all the variables for ROI 2, with the exception of Ve both in ROI 1 (r=.44; P=.07) and in ROI 2 (r=.13; P=.235). There were no statistically significant differences between the two studies in the values obtained for K(trans), Kep and AUC90 (P>.05 for each), but there was a statistically significant difference between the two studies in the values obtained for Ve in ROI 2 (P=.008). The high temporal resolution protocol for DCE-MRI used at our center is very reproducible for the principal pharmacokinetic constants of the breast. Copyright © 2012 SERAM. Published by Elsevier España, S.L.U. All rights reserved.
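The parameters quoted (K(trans), Kep, Ve) are those of the standard Tofts pharmacokinetic model; since the abstract does not state the exact model fitted by the analysis software, the conventional definition is given here only for orientation:

```latex
% Standard Tofts model relating tissue contrast concentration C_t(t) to the arterial input
% function C_p(t) via the transfer constant K^{trans}, the rate constant k_{ep} = K^{trans}/v_e,
% and the extravascular extracellular volume fraction v_e:
C_t(t) = K^{\mathrm{trans}} \int_0^{t} C_p(\tau)\, e^{-k_{ep}\,(t-\tau)}\, d\tau ,
\qquad
k_{ep} = \frac{K^{\mathrm{trans}}}{v_e} .
```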
Au coated PS nanopillars as a highly ordered and reproducible SERS substrate
NASA Astrophysics Data System (ADS)
Kim, Yong-Tae; Schilling, Joerg; Schweizer, Stefan L.; Sauer, Guido; Wehrspohn, Ralf B.
2017-07-01
Noble metal nanostructures with nanometer gap size provide strong surface-enhanced Raman scattering (SERS) which can be used to detect trace amounts of chemical and biological molecules. Although several approaches were reported to obtain active SERS substrates, it still remains a challenge to fabricate SERS substrates with high sensitivity and reproducibility using low-cost techniques. In this article, we report on the fabrication of Au sputtered PS nanopillars based on a template synthetic method as highly ordered and reproducible SERS substrates. The SERS substrates are fabricated by anodic aluminum oxide (AAO) template-assisted infiltration of polystyrene (PS) resulting in hemispherical structures, and a following Au sputtering process. The optimum gap size between adjacent PS nanopillars and thickness of the Au layers for high SERS sensitivity are investigated. Using the Au sputtered PS nanopillars as an active SERS substrate, the Raman signal of 4-methylbenzenethiol (4-MBT) with a concentration down to 10-9 M is identified with good signal reproducibility, showing great potential as promising tool for SERS-based detection.
Ray-tracing critical-angle transmission gratings for the X-ray Surveyor and Explorer-size missions
NASA Astrophysics Data System (ADS)
Günther, Hans M.; Bautz, Marshall W.; Heilmann, Ralf K.; Huenemoerder, David P.; Marshall, Herman L.; Nowak, Michael A.; Schulz, Norbert S.
2016-07-01
We study a critical angle transmission (CAT) grating spectrograph that delivers a spectral resolution significantly above that of any X-ray spectrograph ever flown. This new technology will allow us to resolve kinematic components in absorption and emission lines of galactic and extragalactic matter down to unprecedented dispersion levels. We perform ray-trace simulations to characterize the performance of the spectrograph in the context of an X-ray Surveyor or Arcus-like layout (two mission concepts currently under study). Our newly developed ray-trace code is a tool suite to simulate the performance of X-ray observatories. The simulator code is written in Python, because the use of a high-level scripting language allows modifications of the simulated instrument design in very few lines of code. This is especially important in the early phase of mission development, when the performances of different configurations are contrasted. To reduce the run-time and allow for simulations of a few million photons in a few minutes on a desktop computer, the simulator code uses tabulated input (from theoretical models or laboratory measurements of samples) for grating efficiencies and mirror reflectivities. We find that the grating facet alignment tolerances required to maintain at least 90% of the resolving power that the spectrometer has with perfect alignment are (i) translation parallel to the optical axis below 0.5 mm, (ii) rotation around the optical axis or the groove direction below a few arcminutes, and (iii) constancy of the grating period to 1:10^5. Translations along and rotations around the remaining axes can be significantly larger than this without impacting the performance.
Post-test analysis of PIPER-ONE PO-IC-2 experiment by RELAP5/MOD3 codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bovalini, R.; D`Auria, F.; Galassi, G.M.
1996-11-01
RELAP5/MOD3.1 was applied to the PO-IC-2 experiment performed in the PIPER-ONE facility, which has been modified to reproduce typical isolation condenser thermal-hydraulic conditions. RELAP5 is a well known code widely used at the University of Pisa during the past seven years. RELAP5/MOD3.1 was the latest version of the code made available by the Idaho National Engineering Laboratory at the time of the reported study. PIPER-ONE is an experimental facility simulating a General Electric BWR-6 with volume and height scaling ratios of 1/2200 and 1/1, respectively. In the frame of the present activity, a once-through heat exchanger immersed in a pool of ambient-temperature water, installed approximately 10 m above the core, was utilized to reproduce qualitatively the phenomenologies expected for the Isolation Condenser in the simplified BWR (SBWR). The PO-IC-2 experiment is the flood-up of PO-SD-8 and was designed to solve some of the problems encountered in the analysis of the PO-SD-8 experiment. A very wide analysis is presented hereafter, including the use of different code versions.
ERIC Educational Resources Information Center
Kloza, Brad
2000-01-01
The Internet can help teach students about women's achievements during Women's History Month. Children can go online and see pictures of the space shuttle commanded by Eileen Collins, trace Amelia Earhart's flight, or see how the late Florence Joyner captured two Olympic gold medals. A student reproducible has students visit specific web sites and…
Drawing as Instrument, Drawings as Evidence: Capturing Mental Processes with Pencil and Paper
Puglionesi, Alicia
2016-01-01
Researchers in the mind sciences often look to the production and analysis of drawings to reveal the mental processes of their subjects. This essay presents three episodes that trace the emergence of drawing as an instrumental practice in the study of the mind. Between 1880 and 1930, drawings gained currency as a form of scientific evidence – as stable, reproducible signals from a hidden interior. I begin with the use of drawings as data in the child study movement, move to the telepathic transmission of drawings in psychical research and conclude with the development of drawing as an experimental and diagnostic tool for studying neurological impairment. Despite significant shifts in the theoretical and disciplinary organisation of the mind sciences in the early twentieth century, researchers attempted to stabilise the use of subject-generated drawings as evidence by controlling the contexts in which drawings were produced and reproduced, and crafting subjects whose interiority could be effectively circumscribed. While movements such as psychoanalysis and art therapy would embrace the narrative interpretation of patient art, neuropsychology continued to utilise drawings as material traces of cognitive functions. PMID:27292325
Generating Code Review Documentation for Auto-Generated Mission-Critical Software
NASA Technical Reports Server (NTRS)
Denney, Ewen; Fischer, Bernd
2009-01-01
Model-based design and automated code generation are increasingly used at NASA to produce actual flight code, particularly in the Guidance, Navigation, and Control domain. However, since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently auto-generated code still needs to be fully tested and certified. We have thus developed AUTOCERT, a generator-independent plug-in that supports the certification of auto-generated code. AUTOCERT takes a set of mission safety requirements, and formally verifies that the autogenerated code satisfies these requirements. It generates a natural language report that explains why and how the code complies with the specified requirements. The report is hyper-linked to both the program and the verification conditions and thus provides a high-level structured argument containing tracing information for use in code reviews.
Prospective Coding by Spiking Neurons
Brea, Johanni; Gaál, Alexisz Tamás; Senn, Walter
2016-01-01
Animals learn to make predictions, such as associating the sound of a bell with upcoming feeding or predicting a movement that a motor command is eliciting. How predictions are realized on the neuronal level and what plasticity rule underlies their learning is not well understood. Here we propose a biologically plausible synaptic plasticity rule to learn predictions on a single neuron level on a timescale of seconds. The learning rule allows a spiking two-compartment neuron to match its current firing rate to its own expected future discounted firing rate. For instance, if an originally neutral event is repeatedly followed by an event that elevates the firing rate of a neuron, the originally neutral event will eventually also elevate the neuron’s firing rate. The plasticity rule is a form of spike timing dependent plasticity in which a presynaptic spike followed by a postsynaptic spike leads to potentiation. Even if the plasticity window has a width of 20 milliseconds, associations on the time scale of seconds can be learned. We illustrate prospective coding with three examples: learning to predict a time varying input, learning to predict the next stimulus in a delayed paired-associate task and learning with a recurrent network to reproduce a temporally compressed version of a sequence. We discuss the potential role of the learning mechanism in classical trace conditioning. In the special case that the signal to be predicted encodes reward, the neuron learns to predict the discounted future reward and learning is closely related to the temporal difference learning algorithm TD(λ). PMID:27341100
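The link the authors draw to temporal difference learning can be caricatured with a tabular, rate-based sketch: each time step carries its own value estimate, and a TD(λ) update with an eligibility trace makes the value at the time of a neutral cue converge to the discounted future activity evoked by the later event. The time base, discount, and learning rate below are arbitrary choices; this is not the paper's two-compartment spiking model.

```python
import numpy as np

# Tabular TD(lambda) caricature of prospective coding: v[t] learns to predict the discounted
# future activity. The "unconditioned" burst arrives on the transition into t = 20, so the
# value ten steps earlier converges to gamma**9.
T, gamma, lam, lr = 40, 0.9, 0.8, 0.2
activity = np.zeros(T)
activity[20] = 1.0                              # later event that elevates the firing rate
v = np.zeros(T)                                 # predicted discounted future activity per step

for episode in range(2000):
    trace = np.zeros(T)                         # eligibility trace over time steps
    for t in range(T - 1):
        delta = activity[t + 1] + gamma * v[t + 1] - v[t]   # TD error between successive steps
        trace *= gamma * lam
        trace[t] += 1.0                          # mark the current step as eligible
        v += lr * delta * trace
print("value 10 steps before the burst:", v[10], " target gamma**9 =", gamma ** 9)
```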
[Determination of trace gallium by graphite furnace atomic absorption spectrometry in urine].
Zhou, L Z; Fu, S; Gao, S Q; He, G W
2016-06-20
To establish a method for the determination of trace gallium in urine by graphite furnace atomic absorption spectrometry (GFAAS). Ammonium dihydrogen phosphate was used as the matrix modifier. The pyrolysis (Tpyr) and atomization temperatures were optimized for the determination of trace gallium, and the within-run precision, between-run precision and recovery of standard additions were evaluated. The method showed a linear relationship within the range of 0.20~80.00 μg/L (r=0.998). The within-run and between-run relative standard deviations (RSD) of repeated measurements at the 5.0, 10.0 and 20.0 μg/L concentration levels were 2.1%~5.5% and 2.3%~3.0%, respectively. The detection limit was 0.06 μg/L. The recoveries of gallium were 98.2%~101.1%. The method is simple, accurate, reliable and reproducible, with a low detection limit. It has been applied to the determination of trace gallium in urine samples from persons undergoing occupational health examination or poisoning diagnosis.
Trace explosives sensor testbed (TESTbed)
NASA Astrophysics Data System (ADS)
Collins, Greg E.; Malito, Michael P.; Tamanaha, Cy R.; Hammond, Mark H.; Giordano, Braden C.; Lubrano, Adam L.; Field, Christopher R.; Rogers, Duane A.; Jeffries, Russell A.; Colton, Richard J.; Rose-Pehrsson, Susan L.
2017-03-01
A novel vapor delivery testbed, referred to as the Trace Explosives Sensor Testbed, or TESTbed, is demonstrated that is amenable to both high- and low-volatility explosives vapors including nitromethane, nitroglycerine, ethylene glycol dinitrate, triacetone triperoxide, 2,4,6-trinitrotoluene, pentaerythritol tetranitrate, and hexahydro-1,3,5-trinitro-1,3,5-triazine. The TESTbed incorporates a six-port dual-line manifold system allowing for rapid actuation between a dedicated clean air source and a trace explosives vapor source. Explosives and explosives-related vapors can be sourced through a number of means including gas cylinders, permeation tube ovens, dynamic headspace chambers, and a Pneumatically Modulated Liquid Delivery System coupled to a perfluoroalkoxy total-consumption microflow nebulizer. Key features of the TESTbed include continuous and pulseless control of trace vapor concentrations with wide dynamic range of concentration generation, six sampling ports with reproducible vapor profile outputs, limited low-volatility explosives adsorption to the manifold surface, temperature and humidity control of the vapor stream, and a graphical user interface for system operation and testing protocol implementation.
Neutron transport analysis for nuclear reactor design
Vujic, Jasmina L.
1993-01-01
Replacing regular mesh-dependent ray tracing modules in a collision/transfer probability (CTP) code with a ray tracing module based upon combinatorial geometry of a modified geometrical module (GMC) provides a general geometry transfer theory code in two dimensions (2D) for analyzing nuclear reactor design and control. The primary modification of the GMC module involves generation of a fixed inner frame and a rotating outer frame, where the inner frame contains all reactor regions of interest, e.g., part of a reactor assembly, an assembly, or several assemblies, and the outer frame, with a set of parallel equidistant rays (lines) attached to it, rotates around the inner frame. The modified GMC module allows for determining for each parallel ray (line), the intersections with zone boundaries, the path length between the intersections, the total number of zones on a track, the zone and medium numbers, and the intersections with the outer surface, which parameters may be used in the CTP code to calculate collision/transfer probability and cross-section values.
Neutron transport analysis for nuclear reactor design
Vujic, J.L.
1993-11-30
Replacing regular mesh-dependent ray tracing modules in a collision/transfer probability (CTP) code with a ray tracing module based upon combinatorial geometry of a modified geometrical module (GMC) provides a general geometry transfer theory code in two dimensions (2D) for analyzing nuclear reactor design and control. The primary modification of the GMC module involves generation of a fixed inner frame and a rotating outer frame, where the inner frame contains all reactor regions of interest, e.g., part of a reactor assembly, an assembly, or several assemblies, and the outer frame, with a set of parallel equidistant rays (lines) attached to it, rotates around the inner frame. The modified GMC module allows for determining for each parallel ray (line), the intersections with zone boundaries, the path length between the intersections, the total number of zones on a track, the zone and medium numbers, and the intersections with the outer surface, which parameters may be used in the CTP code to calculate collision/transfer probability and cross-section values. 28 figures.
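A minimal sketch of the parallel-ray bookkeeping described in the two records above, assuming the simplest possible geometry (concentric circular zone boundaries) rather than the general combinatorial geometry of the GMC module; the radii and ray spacing are illustrative values.

```python
import math

radii = [0.4, 0.8, 1.2]      # zone boundary radii (cm), innermost to outermost
ray_spacing = 0.1            # distance between adjacent parallel rays (cm)

def chords_for_ray(b):
    """Path lengths of one ray (impact parameter b) through each annular zone."""
    # half-chord through a circle of radius R at impact parameter b
    half = [math.sqrt(R * R - b * b) if R > abs(b) else 0.0 for R in radii]
    lengths = []
    for i, h in enumerate(half):
        inner = half[i - 1] if i > 0 else 0.0
        lengths.append(2.0 * (h - inner))   # annulus chord = outer chord - inner chord
    return lengths

# Sweep the set of parallel, equidistant rays across the outer boundary. For this
# circularly symmetric case, rotating the outer frame changes only the orientation,
# not the chord lengths.
n_rays = int(2 * radii[-1] / ray_spacing) + 1
for k in range(n_rays):
    b = -radii[-1] + k * ray_spacing
    lengths = "  ".join(f"{L:.3f}" for L in chords_for_ray(b))
    print(f"impact parameter {b:+.2f} cm -> zone chord lengths: {lengths}")
```

In a collision/transfer probability code these path lengths, together with the zone and medium numbers along each track, are the inputs to the probability integrals.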
Properties of galaxies reproduced by a hydrodynamic simulation
NASA Astrophysics Data System (ADS)
Vogelsberger, M.; Genel, S.; Springel, V.; Torrey, P.; Sijacki, D.; Xu, D.; Snyder, G.; Bird, S.; Nelson, D.; Hernquist, L.
2014-05-01
Previous simulations of the growth of cosmic structures have broadly reproduced the `cosmic web' of galaxies that we see in the Universe, but failed to create a mixed population of elliptical and spiral galaxies, because of numerical inaccuracies and incomplete physical models. Moreover, they were unable to track the small-scale evolution of gas and stars to the present epoch within a representative portion of the Universe. Here we report a simulation that starts 12 million years after the Big Bang, and traces 13 billion years of cosmic evolution with 12 billion resolution elements in a cube of 106.5 megaparsecs a side. It yields a reasonable population of ellipticals and spirals, reproduces the observed distribution of galaxies in clusters and characteristics of hydrogen on large scales, and at the same time matches the `metal' and hydrogen content of galaxies on small scales.
Introduction to the Security Engineering Risk Analysis (SERA) Framework
2014-11-01
military aircraft has increased from 8% to 80%. At the same time, the size of software in military aircraft has grown from 1,000 lines of code in the F...4A to 1.7 million lines of code in the F-22. This growth trend is expected to continue over time [NASA 2009]. As software exerts more control of...their root causes can be traced to the software’s requirements, architecture, design, or code. Studies have shown that the cost of addressing a software
NASA Technical Reports Server (NTRS)
Owen, Albert K.
1987-01-01
A computer code was written which utilizes ray tracing techniques to predict the changes in position and geometry of a laser Doppler velocimeter probe volume resulting from refraction effects. The code predicts the position change, changes in beam crossing angle, and the amount of uncrossing that occur when the beams traverse a region with a changed index of refraction, such as a glass window. The code calculates the changes for flat plate, cylinder, general axisymmetric and general surface windows and is currently operational on a VAX 8600 computer system.
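The kind of calculation the abstract describes can be illustrated for the simplest window type, a flat plate, in two dimensions. This is a hedged sketch, not the NTRS code; the window thickness, refractive indices, and beam geometry are illustrative assumptions. Two converging beams are refracted with Snell's law at each face, and the resulting shift of the crossing point (the probe volume) is reported.

```python
import numpy as np

# Coordinates are (x, z); beams propagate toward increasing z.
n_air, n_glass = 1.0, 1.5
window_z0, thickness = 0.05, 0.01        # window front face at 5 cm, 1 cm thick (m)

def refract(d, n_normal, n1, n2):
    """Refract unit direction d at a plane with unit normal n_normal (Snell's law)."""
    cos_i = -np.dot(d, n_normal)
    sin2_t = (n1 / n2) ** 2 * (1.0 - cos_i ** 2)
    cos_t = np.sqrt(1.0 - sin2_t)        # assumes no total internal reflection
    return (n1 / n2) * d + (n1 / n2 * cos_i - cos_t) * n_normal

def propagate_to_plane(p, d, z_plane):
    t = (z_plane - p[1]) / d[1]
    return p + t * d

def trace_beam(x0):
    # beams start at z = 0, aimed to cross on the axis at z = 0.10 m without the window
    target = np.array([0.0, 0.10])
    p = np.array([x0, 0.0])
    d = (target - p) / np.linalg.norm(target - p)
    normal = np.array([0.0, -1.0])                       # window faces are z = const planes
    p = propagate_to_plane(p, d, window_z0)              # to front face
    d = refract(d, normal, n_air, n_glass)
    p = propagate_to_plane(p, d, window_z0 + thickness)  # to back face
    d = refract(d, normal, n_glass, n_air)
    return p, d

pA, dA = trace_beam(-0.01)
pB, dB = trace_beam(+0.01)

# intersection of the two refracted rays: pA + tA*dA = pB + tB*dB
A = np.column_stack((dA, -dB))
tA, tB = np.linalg.solve(A, pB - pA)
crossing = pA + tA * dA
print("crossing point moves from z = 0.100 m to z = %.4f m" % crossing[1])
```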
Resolving the Small-Scale Structure of the Circumgalactic Medium in Cosmological Simulations
NASA Astrophysics Data System (ADS)
Corlies, Lauren
2017-08-01
We propose to resolve the circumgalactic medium (CGM) of L* galaxies down to 100 Msun (250 pc) in a full cosmological simulation to examine how mixing and cooling shape the physical nature of this gas on the scales expected from observations. COS has provided the best characterization of the low-z CGM to date, revealing the extent and amount of low- and high-ions and hinting at the kinematic relations between them. Yet cosmological galaxy simulations that can reproduce the stellar properties of galaxies have all struggled to reproduce these results even qualitatively. However, while the COS data imply that the low-ion absorption is occurring on sub-kpc scales, such scales can not be traced by simulations with resolution between 1-5 kpc in the CGM. Our proposed simulations will, for the first time, reach the resolution required to resolve these structures in the outer halo of L* galaxies. Using the adaptive mesh refinement code enzo, we will experiment with the size, shape, and resolution of an enforced high refinement region extending from the disk into the CGM to identify the best configuration for probing the flows of gas throughout the CGM. Our test case has found that increasing the resolution alone can have dramatic consequences for the density, temperature, and kinematics along a line of sight. Coupling this technique with an independent feedback study already underway will help disentangle the roles of global and small scale physics in setting the physical state of the CGM. Finally, we will use the MISTY pipeline to generate realistic mock spectra for direct comparison with COS data which will be made available through MAST.
An Infrared Study of the Circumstellar Material Associated with the Carbon Star R Sculptoris
NASA Astrophysics Data System (ADS)
Hankins, M. J.; Herter, T. L.; Maercker, M.; Lau, R. M.; Sloan, G. C.
2018-01-01
The asymptotic giant branch (AGB) star R Sculptoris (R Scl) is one of the most extensively studied stars on the AGB. R Scl is a carbon star with a massive circumstellar shell (M_shell ∼ 7.3 × 10^−3 M_⊙) that is thought to have been produced during a thermal pulse event ∼2200 years ago. To study the thermal dust emission associated with its circumstellar material, observations were taken with the Faint Object InfraRed CAMera for the SOFIA Telescope (FORCAST) at 19.7, 25.2, 31.5, 34.8, and 37.1 μm. Maps of the infrared emission at these wavelengths were used to study the morphology and temperature structure of the spatially extended dust emission. Using the radiative-transfer code DUSTY, and fitting the spatial profile of the emission, we find that a geometrically thin dust shell cannot reproduce the observed spatially resolved emission. Instead, a second dust component in addition to the shell is needed to reproduce the observed emission. This component, which lies interior to the dust shell, traces the circumstellar envelope of R Scl. It is best fit by a density profile n ∝ r^α, where α = 0.75 (+0.45, −0.25), and a dust mass of M_d = 9.0 (+2.3, −4.1) × 10^−6 M_⊙. The strong departure from an r^−2 law indicates that the mass-loss rate of R Scl has not been constant. This result is consistent with a slow decline in the post-pulse mass loss that has been inferred from observations of the molecular gas.
Laser Ray Tracing in a Parallel Arbitrary Lagrangian-Eulerian Adaptive Mesh Refinement Hydrocode
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masters, N D; Kaiser, T B; Anderson, R W
2009-09-28
ALE-AMR is a new hydrocode that we are developing as a predictive modeling tool for debris and shrapnel formation in high-energy laser experiments. In this paper we present our approach to implementing laser ray-tracing in ALE-AMR. We present the equations of laser ray tracing, our approach to efficient traversal of the adaptive mesh hierarchy in which we propagate computational rays through a virtual composite mesh consisting of the finest resolution representation of the modeled space, and anticipate simulations that will be compared to experiments for code validation.
ITK: enabling reproducible research and open science
McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis
2014-01-01
Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46. PMID:24600387
Wood construction codes issues in the United States
Douglas R. Rammer
2006-01-01
The current wood construction codes find their origin in the 1935 Wood Handbook: Wood as an Engineering Material published by the USDA Forest Service. Many of the current design recommendations can be traced back to statements from this book. Since that time, a series of developments, both historical and recent, has led to a multi-layered system for the use of wood products in...
On initial Brain Activity Mapping of episodic and semantic memory code in the hippocampus.
Tsien, Joe Z; Li, Meng; Osan, Remus; Chen, Guifen; Lin, Longian; Wang, Phillip Lei; Frey, Sabine; Frey, Julietta; Zhu, Dajiang; Liu, Tianming; Zhao, Fang; Kuang, Hui
2013-10-01
It has been widely recognized that the understanding of the brain code would require large-scale recording and decoding of brain activity patterns. In 2007, with support from the Georgia Research Alliance, we launched the Brain Decoding Project Initiative with a basic idea that is now similarly advocated by the BRAIN project and the Brain Activity Map proposal. As the planning of the BRAIN project is currently underway, we share our insights and lessons from our efforts in mapping real-time episodic memory traces in the hippocampus of freely behaving mice. We show that appropriate large-scale statistical methods are essential to decipher and measure real-time memory traces and neural dynamics. We also provide an example of how carefully designed, sometimes thinking-outside-the-box behavioral paradigms can be highly instrumental in unraveling the memory-coding cell-assembly organizing principle in the hippocampus. Our observations to date have led us to conclude that the specific-to-general categorical and combinatorial feature-coding cell assembly mechanism represents an emergent property for enabling the neural networks to generate and organize not only episodic memory, but also semantic knowledge and imagination. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.
On Initial Brain Activity Mapping of Associative Memory Code in the Hippocampus
Tsien, Joe Z.; Li, Meng; Osan, Remus; Chen, Guifen; Lin, Longian; Lei Wang, Phillip; Frey, Sabine; Frey, Julietta; Zhu, Dajiang; Liu, Tianming; Zhao, Fang; Kuang, Hui
2013-01-01
It has been widely recognized that the understanding of the brain code would require large-scale recording and decoding of brain activity patterns. In 2007, with support from the Georgia Research Alliance, we launched the Brain Decoding Project Initiative with a basic idea that is now similarly advocated by the BRAIN project and the Brain Activity Map proposal. As the planning of the BRAIN project is currently underway, we share our insights and lessons from our efforts in mapping real-time episodic memory traces in the hippocampus of freely behaving mice. We show that appropriate large-scale statistical methods are essential to decipher and measure real-time memory traces and neural dynamics. We also provide an example of how carefully designed, sometimes thinking-outside-the-box behavioral paradigms can be highly instrumental in unraveling the memory-coding cell-assembly organizing principle in the hippocampus. Our observations to date have led us to conclude that the specific-to-general categorical and combinatorial feature-coding cell assembly mechanism represents an emergent property for enabling the neural networks to generate and organize not only episodic memory, but also semantic knowledge and imagination. PMID:23838072
NASA Astrophysics Data System (ADS)
Andre, R.; Carlsson, J.; Gorelenkova, M.; Jardin, S.; Kaye, S.; Poli, F.; Yuan, X.
2016-10-01
TRANSP is an integrated interpretive and predictive transport analysis tool that incorporates state-of-the-art heating/current drive sources and transport models. The treatments and transport solvers are becoming increasingly sophisticated and comprehensive. For instance, the ISOLVER component provides a free boundary equilibrium solution, while the PT-SOLVER transport solver is especially suited for stiff transport models such as TGLF. TRANSP incorporates high fidelity heating and current drive source models, such as NUBEAM for neutral beam injection, the beam tracing code TORBEAM for EC, TORIC for ICRF, and the ray tracing codes TORAY and GENRAY for EC. The implementation of selected components makes efficient use of MPI to speed up code calculations. Recently the GENRAY-CQL3D solver for modeling of LH heating and current drive has been implemented and is currently being extended to multiple antennas, to allow modeling of EAST discharges. Also, GENRAY+CQL3D is being extended to the use of EC/EBW and of HHFW for NSTX-U. This poster will describe present uses of the code worldwide, as well as plans for upgrading the physics modules and code framework. Work supported by the US Department of Energy under DE-AC02-CH0911466.
Ray-tracing 3D dust radiative transfer with DART-Ray: code upgrade and public release
NASA Astrophysics Data System (ADS)
Natale, Giovanni; Popescu, Cristina C.; Tuffs, Richard J.; Clarke, Adam J.; Debattista, Victor P.; Fischera, Jörg; Pasetto, Stefano; Rushton, Mark; Thirlwall, Jordan J.
2017-11-01
We present an extensively updated version of the purely ray-tracing 3D dust radiation transfer code DART-Ray. The new version includes five major upgrades: 1) a series of optimizations for the ray-angular density and the scattered radiation source function; 2) the implementation of several data and task parallelizations using hybrid MPI+OpenMP schemes; 3) the inclusion of dust self-heating; 4) the ability to produce surface brightness maps for observers within the models in HEALPix format; 5) the possibility to set the expected numerical accuracy already at the start of the calculation. We tested the updated code with benchmark models where the dust self-heating is not negligible. Furthermore, we performed a study of the extent of the source influence volumes, using galaxy models, which are critical in determining the efficiency of the DART-Ray algorithm. The new code is publicly available, documented for both users and developers, and accompanied by several programmes to create input grids for different model geometries and to import the results of N-body and SPH simulations. These programmes can be easily adapted to different input geometries, and for different dust models or stellar emission libraries.
Röper, Andrea; Reichert, Walter; Mattern, Rainer
2007-01-01
In the field of forensic DNA typing, the analysis of Short Tandem Repeats (STRs) can fail in cases of degraded DNA. The typing of coding region Single Nucleotide Polymorphisms (SNPs) of the mitochondrial genome provides an approach to acquire additional information. In the examined case of aggravated theft, both suspects could be excluded by SNP typing from having left the analyzed hair at the crime scene; this conclusion was not possible from STR typing alone. SNP typing of the trace on the torch light left at the crime scene increased the likelihood that suspect no. 2 was the origin of this trace, a finding already indicated by STR analysis. Suspect no. 1 was excluded as the origin of this trace by SNP typing, which was also indicated by STR analysis. A limiting factor for the analysis of SNPs is the maternal inheritance of mitochondrial DNA; individualisation is not possible. In conclusion, for traces that cause problems with conventional STR typing, the supplementary analysis of coding region SNPs from the mitochondrial genome is very reasonable and greatly contributes to the refinement of analysis methods in the field of forensic genetics.
GRay: A Massively Parallel GPU-based Code for Ray Tracing in Relativistic Spacetimes
NASA Astrophysics Data System (ADS)
Chan, Chi-kwan; Psaltis, Dimitrios; Özel, Feryal
2013-11-01
We introduce GRay, a massively parallel integrator designed to trace the trajectories of billions of photons in a curved spacetime. This graphics-processing-unit (GPU)-based integrator employs the stream processing paradigm, is implemented in CUDA C/C++, and runs on nVidia graphics cards. The peak performance of GRay using single-precision floating-point arithmetic on a single GPU exceeds 300 GFLOP (or 1 ns per photon per time step). For a realistic problem, where the peak performance cannot be reached, GRay is two orders of magnitude faster than existing central-processing-unit-based ray-tracing codes. This performance enhancement allows more effective searches of large parameter spaces when comparing theoretical predictions of images, spectra, and light curves from the vicinities of compact objects to observations. GRay can also perform on-the-fly ray tracing within general relativistic magnetohydrodynamic algorithms that simulate accretion flows around compact objects. Making use of this algorithm, we calculate the properties of the shadows of Kerr black holes and the photon rings that surround them. We also provide accurate fitting formulae of their dependencies on black hole spin and observer inclination, which can be used to interpret upcoming observations of the black holes at the center of the Milky Way, as well as M87, with the Event Horizon Telescope.
System and method for deriving a process-based specification
NASA Technical Reports Server (NTRS)
Hinchey, Michael Gerard (Inventor); Rouff, Christopher A. (Inventor); Rash, James Larry (Inventor)
2009-01-01
A system and method for deriving a process-based specification for a system is disclosed. The process-based specification is mathematically inferred from a trace-based specification. The trace-based specification is derived from a non-empty set of traces or natural language scenarios. The process-based specification is mathematically equivalent to the trace-based specification. Code is generated, if applicable, from the process-based specification. A process, or phases of a process, using the features disclosed can be reversed and repeated to allow for an interactive development and modification of legacy systems. The process is applicable to any class of system, including, but not limited to, biological and physical systems, electrical and electro-mechanical systems in addition to software, hardware and hybrid hardware-software systems.
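One simple way to picture mathematically inferring a process-style description from a non-empty set of traces (a hedged sketch, not the patented derivation, which goes through an intermediate trace-based specification) is to fold the traces into a prefix-tree automaton; the event names and traces below are illustrative.

```python
traces = [
    ["open", "read", "close"],
    ["open", "read", "read", "close"],
    ["open", "close"],
]

def infer_spec(traces):
    """Return (transitions, accepting) of a prefix-tree automaton over the traces."""
    transitions = {}          # (state, event) -> next state
    accepting = set()
    next_state = 1            # state 0 is the initial state
    for trace in traces:
        state = 0
        for event in trace:
            if (state, event) not in transitions:
                transitions[(state, event)] = next_state
                next_state += 1
            state = transitions[(state, event)]
        accepting.add(state)  # the trace ends here
    return transitions, accepting

transitions, accepting = infer_spec(traces)
for (state, event), nxt in sorted(transitions.items()):
    print(f"s{state} --{event}--> s{nxt}")
print("accepting states:", sorted(accepting))
```

By construction the resulting machine accepts exactly the supplied traces, i.e. it is equivalent to the trace set it was built from; a real derivation would additionally generalize and generate code from the result.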
GenePattern | Informatics Technology for Cancer Research (ITCR)
GenePattern is a genomic analysis platform that provides access to hundreds of tools for the analysis and visualization of multiple data types. A web-based interface provides easy access to these tools and allows the creation of multi-step analysis pipelines that enable reproducible in silico research. A new GenePattern Notebook environment allows users to combine GenePattern analyses with text, graphics, and code to create complete reproducible research narratives.
Analysis of volatile organic compounds. [trace amounts of organic volatiles in gas samples
NASA Technical Reports Server (NTRS)
Zlatkis, A. (Inventor)
1977-01-01
An apparatus and method are described for reproducibly analyzing trace amounts of a large number of organic volatiles existing in a gas sample. Direct injection of the trapped volatiles into a cryogenic precolumn provides a sharply defined plug. Applications of the method include: (1) analyzing the headspace gas of body fluids and comparing a profile of the organic volatiles with standard profiles for the detection and monitoring of disease; (2) analyzing the headspace gas of foods and beverages and comparing the profile with standard profiles to monitor and control flavor and aroma; and (3) analyses for determining the organic pollutants in air or water samples.
Almeida, Jonas S.; Iriabho, Egiebade E.; Gorrepati, Vijaya L.; Wilkinson, Sean R.; Grüneberg, Alexander; Robbins, David E.; Hackney, James R.
2012-01-01
Background: Image bioinformatics infrastructure typically relies on a combination of server-side high-performance computing and client desktop applications tailored for graphic rendering. On the server side, matrix manipulation environments are often used as the back-end where deployment of specialized analytical workflows takes place. However, neither the server-side nor the client-side desktop solution, by themselves or combined, is conducive to the emergence of open, collaborative, computational ecosystems for image analysis that are both self-sustained and user driven. Materials and Methods: ImageJS was developed as a browser-based webApp, untethered from a server-side backend, by making use of recent advances in the modern web browser such as a very efficient compiler, high-end graphical rendering capabilities, and I/O tailored for code migration. Results: Multiple versioned code hosting services were used to develop distinct ImageJS modules to illustrate its amenability to collaborative deployment without compromise of reproducibility or provenance. The illustrative examples include modules for image segmentation, feature extraction, and filtering. The deployment of image analysis by code migration is in sharp contrast with the more conventional, heavier, and less safe reliance on data transfer. Accordingly, code and data are loaded into the browser by exactly the same script tag loading mechanism, which offers a number of interesting applications that would be hard to attain with more conventional platforms, such as NIH's popular ImageJ application. Conclusions: The modern web browser was found to be advantageous for image bioinformatics in both the research and clinical environments. This conclusion reflects advantages in deployment scalability and analysis reproducibility, as well as the critical ability to deliver advanced computational statistical procedures to machines where access to sensitive data is controlled, that is, without local “download and installation”. PMID:22934238
Nöremark, Maria; Widgren, Stefan
2014-03-17
During outbreaks of livestock diseases, contact tracing can be an important part of disease control. Animal movements can also be of relevance for risk-based surveillance and sampling, i.e. both when assessing the consequences of an introduction and when assessing the likelihood of introduction. In many countries, animal movement data are collected with one of the major objectives being to enable contact tracing. However, an analytical step is often needed to retrieve appropriate information for contact tracing or surveillance. In this study, an open source tool was developed to structure livestock movement data to facilitate contact tracing in real time during disease outbreaks and for input in risk-based surveillance and sampling. The tool, EpiContactTrace, was written in the R language and uses the network parameters in-degree, out-degree, ingoing contact chain and outgoing contact chain (also called infection chain), which are relevant for backward and forward tracing, respectively. The time frames for backward and forward tracing can be specified independently and the search can be done on one farm at a time or for all farms within the dataset. Different outputs are available: datasets with network measures, contacts visualised in a map, and automatically generated reports for each farm in either HTML or PDF format intended for the end users, i.e. the veterinary authorities, regional disease control officers and field veterinarians. EpiContactTrace is available as an R package at the R-project website (http://cran.r-project.org/web/packages/EpiContactTrace/). We believe this tool can help in disease control since it can rapidly structure essential contact information from large datasets. The reproducible reports make this tool robust and independent of manual compilation of data. Being open source makes it accessible and easily adaptable to different needs.
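EpiContactTrace itself is an R package; the Python sketch below only illustrates the underlying network quantities named above: in-degree, out-degree, and a time-respecting outgoing contact chain computed from movement records. The farm identifiers, dates, and time window are illustrative, and same-day multi-hop chains are ignored for brevity.

```python
from datetime import date

movements = [
    # (from_farm, to_farm, movement_date) -- illustrative data
    ("A", "B", date(2024, 1, 5)),
    ("B", "C", date(2024, 1, 10)),
    ("C", "D", date(2024, 1, 3)),   # too early to extend the A -> B -> C chain
    ("E", "A", date(2024, 1, 2)),
]

def degrees(movs, farm, start, end):
    """Number of distinct source/destination holdings in direct contact with `farm`."""
    window = [m for m in movs if start <= m[2] <= end]
    in_deg = len({src for src, dst, _ in window if dst == farm})
    out_deg = len({dst for src, dst, _ in window if src == farm})
    return in_deg, out_deg

def outgoing_contact_chain(movs, farm, start, end):
    """Holdings reachable from `farm` via movements with non-decreasing dates."""
    window = sorted((m for m in movs if start <= m[2] <= end), key=lambda m: m[2])
    earliest = {farm: start}            # earliest date each holding could be reached
    for src, dst, d in window:
        if src in earliest and earliest[src] <= d:
            if dst not in earliest or d < earliest[dst]:
                earliest[dst] = d
    return set(earliest) - {farm}

print(degrees(movements, "A", date(2024, 1, 1), date(2024, 1, 31)))                 # (1, 1)
print(outgoing_contact_chain(movements, "A", date(2024, 1, 1), date(2024, 1, 31)))  # {'B', 'C'}
```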
Bandodkar, Amay J; Jia, Wenzhao; Ramírez, Julian; Wang, Joseph
2015-06-03
The development of enzymatic-ink-based roller pens for direct drawing of biocatalytic sensors, in general, and for realizing renewable glucose sensor strips, in particular, is described. The resulting enzymatic-ink pen allows facile fabrication of high-quality inexpensive electrochemical biosensors of any design by the user on a wide variety of surfaces having complex textures with minimal user training. Unlike prefabricated sensors, this approach empowers the end user with the ability of "on-demand" and "on-site" designing and fabricating of biocatalytic sensors to suit their specific requirement. The resulting devices are thus referred to as "do-it-yourself" sensors. The bio-active pens produce highly reproducible biocatalytic traces with minimal edge roughness. The composition of the new enzymatic inks has been optimized for ensuring good biocatalytic activity, electrical conductivity, biocompatibility, reproducible writing, and surface adherence. The resulting inks are characterized using spectroscopic, viscometric, electrochemical, thermal and microscopic techniques. Applicability to renewable blood glucose testing, epidermal glucose monitoring, and on-leaf phenol detection are demonstrated in connection to glucose oxidase and tyrosinase-based carbon inks. The "do-it-yourself" renewable glucose sensor strips offer a "fresh," reproducible, low-cost biocatalytic sensor surface for each blood test. The ability to directly draw biocatalytic conducting traces even on unconventional surfaces opens up new avenues in various sensing applications in low-resource settings and holds great promise for diverse healthcare, environmental, and defense domains. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
2007-09-01
practically have dropped the collaboration with Biotraces as the company was not able to provide us with an improved version of their instrument...Although the claimed sensitivity was reproduced in studies conducted at BioTraces with recombinant PrP. The question was whether the same sensitivity
Functional Equivalence Acceptance Testing of FUN3D for Entry Descent and Landing Applications
NASA Technical Reports Server (NTRS)
Gnoffo, Peter A.; Wood, William A.; Kleb, William L.; Alter, Stephen J.; Glass, Christopher E.; Padilla, Jose F.; Hammond, Dana P.; White, Jeffery A.
2013-01-01
The functional equivalence of the unstructured grid code FUN3D to the structured grid code LAURA (Langley Aerothermodynamic Upwind Relaxation Algorithm) is documented for applications of interest to the Entry, Descent, and Landing (EDL) community. Examples from an existing suite of regression tests are used to demonstrate the functional equivalence, encompassing various thermochemical models and vehicle configurations. Algorithm modifications required for the node-based unstructured grid code (FUN3D) to reproduce functionality of the cell-centered structured code (LAURA) are also documented. Challenges associated with computation on tetrahedral grids versus computation on structured-grid derived hexahedral systems are discussed.
Digital Isotope Coding to Trace the Growth Process of Individual Single-Walled Carbon Nanotubes.
Otsuka, Keigo; Yamamoto, Shun; Inoue, Taiki; Koyano, Bunsho; Ukai, Hiroyuki; Yoshikawa, Ryo; Xiang, Rong; Chiashi, Shohei; Maruyama, Shigeo
2018-04-24
Single-walled carbon nanotubes (SWCNTs) are attracting increasing attention as an ideal material for high-performance electronics through the preparation of arrays of purely semiconducting SWCNTs. Despite significant progress in the controlled synthesis of SWCNTs, their growth mechanism remains unclear due to difficulties in analyzing the time-resolved growth of individual SWCNTs under practical growth conditions. Here we present a method for tracing the diverse growth profiles of individual SWCNTs by embedding digitally coded isotope labels. Raman mapping showed that, after various incubation times, SWCNTs elongated monotonically until their abrupt termination. Ex situ analysis offered an opportunity to capture rare chirality changes along the SWCNTs, which resulted in sudden acceleration/deceleration of the growth rate. Dependence on growth parameters, such as temperature and carbon concentration, was also traced along individual SWCNTs, which could provide clues to chirality control. Systematic growth studies with a variety of catalysts and conditions, which combine the presented method with other characterization techniques, will lead to further understanding and control of chirality, length, and density of SWCNTs.
Particle tracing modeling of ion fluxes at geosynchronous orbit
Brito, Thiago V.; Woodroffe, Jesse; Jordanova, Vania K.; ...
2017-10-31
The initial results of a coupled MHD/particle tracing method to evaluate particle fluxes in the inner magnetosphere are presented. This setup is capable of capturing the earthward particle acceleration process resulting from dipolarization events in the tail region of the magnetosphere. During the period of study, the MHD code was able to capture a dipolarization event, and the particle tracing algorithm was able to capture the effects of these disturbances and calculate proton fluxes in the nightside geosynchronous orbit region. The simulation captured dispersionless injections as well as the energy dispersion signatures that are frequently observed by satellites at geosynchronous orbit. Currently, ring current models rely on Maxwellian-type distributions based on either empirical flux values or sparse satellite data for their boundary conditions close to geosynchronous orbit. In spite of some differences in intensity and timing, the setup presented here is able to capture substorm injections, which represents an improvement in the way these ring current models can be coupled with MHD codes through the use of boundary conditions.
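To illustrate the particle tracing half of the method (this is not the authors' coupled MHD/particle code, and the MHD field perturbations that actually drive the injections are omitted), the sketch below pushes a single proton through a static dipole magnetic field with the standard Boris integrator; the field model, particle energy, starting point, and time step are illustrative assumptions.

```python
import numpy as np

q, m = 1.602e-19, 1.673e-27        # proton charge (C) and mass (kg)
RE = 6.371e6                        # Earth radius (m)
B0 = 3.12e-5                        # equatorial surface field (T)

def dipole_B(r):
    """Centred dipole field (T) at position r (m), giving a northward field at the equator."""
    rmag = np.linalg.norm(r)
    m_dip = np.array([0.0, 0.0, -B0 * RE**3])
    return 3.0 * r * np.dot(m_dip, r) / rmag**5 - m_dip / rmag**3

def boris_step(r, v, dt):
    """One Boris push with no electric field (pure gyration/drift in B)."""
    B = dipole_B(r)
    t = (q * dt / (2.0 * m)) * B
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v + np.cross(v, t)
    v_new = v + np.cross(v_prime, s)
    return r + v_new * dt, v_new

# 10 keV proton starting at geosynchronous distance (~6.6 RE) in the equatorial plane
E_k = 10e3 * 1.602e-19
speed = np.sqrt(2.0 * E_k / m)
r = np.array([6.6 * RE, 0.0, 0.0])
v = speed * np.array([0.0, 0.7, 0.714])    # mix of gyration and bounce motion

dt = 1e-3                                  # s, a small fraction of the ~0.6 s gyroperiod here
for _ in range(20000):
    r, v = boris_step(r, v, dt)

print("final radial distance (RE):", np.linalg.norm(r) / RE)
```

In the full method, the fields would come from the time-dependent MHD solution and millions of such test particles would be weighted to produce fluxes at the geosynchronous boundary.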
NASA Astrophysics Data System (ADS)
D'Amico, S.; Lombardo, C.; Moscato, I.; Polidori, M.; Vella, G.
2015-11-01
In the past few decades, a great deal of theoretical and experimental research has been done to understand the physical phenomena characterizing nuclear accidents. In particular, after the Three Mile Island accident, several reactors have been designed to handle LOCA events successfully. This paper presents a comparison between experimental and numerical results obtained for the “2 inch Direct Vessel Injection line break” in SPES-2. This integral test facility, built at the SIET laboratories in Piacenza, simulates the primary circuit, the relevant parts of the secondary circuits and the passive safety systems typical of the AP600 nuclear power plant. The numerical analysis presented here was performed using the TRACE and CATHARE thermal-hydraulic codes with the purpose of evaluating their prediction capability. The main results show that the TRACE model predicts the overall behaviour of the plant during the transient well; in particular, it is able to simulate the principal thermal-hydraulic phenomena related to all passive safety systems. The performance of the presented CATHARE noding has suggested some possible improvements to the model.
Evaluating progressive-rendering algorithms in appearance design tasks.
Jiawei Ou; Karlik, Ondrej; Křivánek, Jaroslav; Pellacini, Fabio
2013-01-01
Progressive rendering is becoming a popular alternative to precomputational approaches to appearance design. However, progressive algorithms create images exhibiting visual artifacts at early stages. A user study investigated these artifacts' effects on user performance in appearance design tasks. Novice and expert subjects performed lighting and material editing tasks with four algorithms: random path tracing, quasirandom path tracing, progressive photon mapping, and virtual-point-light rendering. Both the novices and experts strongly preferred path tracing to progressive photon mapping and virtual-point-light rendering. None of the participants preferred random path tracing to quasirandom path tracing or vice versa; the same situation held between progressive photon mapping and virtual-point-light rendering. The user workflow didn’t differ significantly with the four algorithms. The Web Extras include a video showing how four progressive-rendering algorithms converged (at http://youtu.be/ck-Gevl1e9s), the source code used, and other supplementary materials.
Properties of galaxies reproduced by a hydrodynamic simulation.
Vogelsberger, M; Genel, S; Springel, V; Torrey, P; Sijacki, D; Xu, D; Snyder, G; Bird, S; Nelson, D; Hernquist, L
2014-05-08
Previous simulations of the growth of cosmic structures have broadly reproduced the 'cosmic web' of galaxies that we see in the Universe, but failed to create a mixed population of elliptical and spiral galaxies, because of numerical inaccuracies and incomplete physical models. Moreover, they were unable to track the small-scale evolution of gas and stars to the present epoch within a representative portion of the Universe. Here we report a simulation that starts 12 million years after the Big Bang, and traces 13 billion years of cosmic evolution with 12 billion resolution elements in a cube of 106.5 megaparsecs a side. It yields a reasonable population of ellipticals and spirals, reproduces the observed distribution of galaxies in clusters and characteristics of hydrogen on large scales, and at the same time matches the 'metal' and hydrogen content of galaxies on small scales.
3D-PDR: Three-dimensional photodissociation region code
NASA Astrophysics Data System (ADS)
Bisbas, T. G.; Bell, T. A.; Viti, S.; Yates, J.; Barlow, M. J.
2018-03-01
3D-PDR is a three-dimensional photodissociation region code written in Fortran. It uses the Sundials package (written in C) to solve the set of ordinary differential equations and it is the successor of the one-dimensional PDR code UCL_PDR (ascl:1303.004). Using the HEALpix ray-tracing scheme (ascl:1107.018), 3D-PDR solves a three-dimensional escape probability routine and evaluates the attenuation of the far-ultraviolet radiation in the PDR and the propagation of FIR/submm emission lines out of the PDR. The code is parallelized (OpenMP) and can be applied to 1D and 3D problems.
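The attenuation-of-the-FUV-field idea can be pictured with a toy one-dimensional slab (this is not 3D-PDR's HEALPix-based scheme): the external field at each depth is attenuated along a fan of outward rays and averaged. The density, dust cross section, slab depth, and ray set are illustrative assumptions.

```python
import numpy as np

n_H = 1e3            # gas density (cm^-3), uniform slab for simplicity
sigma_fuv = 1.9e-21  # effective FUV dust cross section per H nucleus (cm^2), typical ISM value
depths = np.linspace(0, 3.086e18, 200)   # depth into the slab (cm), up to ~1 pc
mu = np.linspace(0.05, 1.0, 20)          # cosines of ray angles toward the illuminated surface

chi = np.empty_like(depths)
for i, z in enumerate(depths):
    N = n_H * z / mu                      # column density to the surface along each ray
    chi[i] = np.mean(np.exp(-sigma_fuv * N))   # attenuated external field, ray-averaged

print("attenuation of the external field at the slab centre:", chi[len(depths) // 2])
```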
Bidirectional holographic codes and sub-AdS locality
NASA Astrophysics Data System (ADS)
Yang, Zhao; Hayden, Patrick; Qi, Xiaoliang
Tensor networks implementing quantum error correcting codes have recently been used as toy models of the holographic duality which explicitly realize some of the more puzzling features of the AdS/CFT correspondence. These models reproduce the Ryu-Takayanagi entropy formula for boundary intervals, and allow bulk operators to be mapped to the boundary in a redundant fashion. These exactly solvable, explicit models have provided valuable insight but nonetheless suffer from many deficiencies, some of which we attempt to address in this talk. We propose a new class of tensor network models that subsume the earlier advances and, in addition, incorporate additional features of holographic duality, including: (1) a holographic interpretation of all boundary states, not just those in a "code" subspace, (2) a set of bulk states playing the role of "classical geometries" which reproduce the Ryu-Takayanagi formula for boundary intervals, (3) a bulk gauge symmetry analogous to diffeomorphism invariance in gravitational theories, (4) emergent bulk locality for sufficiently sparse excitations, and the ability to describe geometry at sub-AdS resolutions or even flat space. David and Lucile Packard Foundation.
Bidirectional holographic codes and sub-AdS locality
NASA Astrophysics Data System (ADS)
Yang, Zhao; Hayden, Patrick; Qi, Xiao-Liang
2016-01-01
Tensor networks implementing quantum error correcting codes have recently been used to construct toy models of holographic duality explicitly realizing some of the more puzzling features of the AdS/CFT correspondence. These models reproduce the Ryu-Takayanagi entropy formula for boundary intervals, and allow bulk operators to be mapped to the boundary in a redundant fashion. These exactly solvable, explicit models have provided valuable insight but nonetheless suffer from many deficiencies, some of which we attempt to address in this article. We propose a new class of tensor network models that subsume the earlier advances and, in addition, incorporate additional features of holographic duality, including: (1) a holographic interpretation of all boundary states, not just those in a "code" subspace, (2) a set of bulk states playing the role of "classical geometries" which reproduce the Ryu-Takayanagi formula for boundary intervals, (3) a bulk gauge symmetry analogous to diffeomorphism invariance in gravitational theories, (4) emergent bulk locality for sufficiently sparse excitations, and (5) the ability to describe geometry at sub-AdS resolutions or even flat space.
NASA Astrophysics Data System (ADS)
Abu-Taha, M. I.; Abu-Teir, M. M.; Al-Jamal, A. J.; Eideh, H.
The aim of this work was to establish the feasibility of the combined photoacoustic (PA) and photopyroelectric (PPE) detection of the vapours emitted from essential oils and their corresponding uncrushed leaves or flowers. Gas traces of jasmine (Jessamine (Jasminum)), mint (Mentha arvensis L.) and Damask rose (Rosa damascena Miller) and their essential oils were tested using a combined cell fitted with both a photopyroelectric film (PVDF) and a microphone in conjunction with a pulsed wideband infrared source (PWBS) source. Infrared PA and PPE absorbances were obtained simultaneously at room temperatures with excellent reproducibility and high signal-to-noise ratios. Significant similarities found between the PA and PPE spectra of the trace gas emissions of plant parts, i.e., flowers or leaves and their related essential oils show the good correlation of their emissions and that both effects are initiated by the same absorbing molecules.
Quantum criticality and duality in the Sachdev-Ye-Kitaev/AdS2 chain
NASA Astrophysics Data System (ADS)
Jian, Shao-Kai; Xian, Zhuo-Yu; Yao, Hong
2018-05-01
We show that the quantum critical point (QCP) between a diffusive metal and ferromagnetic (or antiferromagnetic) phases in the SYK chain has a gravitational description corresponding to the double-trace deformation in an AdS2 chain. Specifically, by studying a double-trace deformation of a Z2 scalar in an AdS2 chain where the Z2 scalar is dual to the order parameter in the SYK chain, we find that the susceptibility and renormalization group equation describing the QCP in the SYK chain can be exactly reproduced in the holographic model. Our results suggest that the infrared geometry in the gravity theory dual to the diffusive metal of the SYK chain is also an AdS2 chain. We further show that the transition in SYK model captures universal information about double-trace deformation in generic black holes with near horizon AdS2 space-time.
Screening for trace explosives by AccuTOF™-DART®: an in-depth validation study.
Sisco, Edward; Dake, Jeffrey; Bridge, Candice
2013-10-10
Ambient ionization mass spectrometry is finding increasing utility as a rapid analysis technique in a number of fields. In forensic science specifically, analysis of many types of samples, including drugs, explosives, inks, bank dye, and lotions, has been shown to be possible using these techniques [1]. This paper focuses on one type of ambient ionization mass spectrometry, Direct Analysis in Real Time Mass Spectrometry (DART-MS or DART), and its viability as a screening tool for trace explosives analysis. In order to assess viability, a validation study was completed which focused on the analysis of trace amounts of nitro and peroxide based explosives. Topics which were studied, and are discussed, include method optimization, reproducibility, sensitivity, development of a search library, discrimination of mixtures, and blind sampling. Advantages and disadvantages of this technique over other similar screening techniques are also discussed. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Determination of trace amounts of formaldehyde based on a bromate-Malachite Green system.
Tang, Yufang; Chen, Hao; Weng, Chao; Tang, Xiaohui; Zhang, Miaoling; Hu, Tao
2015-01-25
A novel catalytic kinetic spectrophotometric method for the determination of trace amounts of formaldehyde (FA), reported here for the first time, has been established based on the catalytic effect of trace amounts of FA on the oxidation of Malachite Green (MG) by potassium bromate in a sulfuric acid medium. The reaction was monitored by measuring the decrease in absorbance of MG at 617 nm and allowed a precise determination of FA in the range of 0.003-0.08 μg mL(-1), with a limit of detection down to 1 ng mL(-1). The relative standard deviation of 10 replicate measurements was 1.63%. The method proved to be sensitive, selective and accurate, and was used to determine free FA in samples directly with good accuracy and reproducibility. Copyright © 2014 Elsevier B.V. All rights reserved.
Effective holographic models for QCD: Glueball spectrum and trace anomaly
NASA Astrophysics Data System (ADS)
Ballon-Bayona, Alfonso; Boschi-Filho, Henrique; Mamani, Luis A. H.; Miranda, Alex S.; Zanchin, Vilson T.
2018-02-01
We investigate effective holographic models for QCD arising from five-dimensional dilaton gravity. The models are characterized by a dilaton with a mass term in the UV, dual to a CFT deformation by a relevant operator, and quadratic in the IR. The UV constraint leads to the explicit breaking of conformal symmetry, whereas the IR constraint guarantees linear confinement. We propose semianalytic interpolations between the UV and the IR and obtain a spectrum for scalar and tensor glueballs consistent with lattice QCD data. We use the glueball spectrum as a physical constraint to find the evolution of the model parameters as the mass term goes to 0. Finally, we reproduce the universal result for the trace anomaly of deformed CFTs and propose a dictionary between this result and the QCD trace anomaly. A nontrivial consequence of this dictionary is the emergence of a β function similar to the two-loop perturbative QCD result.
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Barringer, Howard
2012-01-01
TraceContract is an API (Application Programming Interface) for trace analysis. A trace is a sequence of events, and can, for example, be generated by a running program, instrumented appropriately to generate events. An event can be any data object. An example of a trace is a log file containing events that a programmer has found important to record during a program execution. TraceContract takes as input such a trace together with a specification formulated using the API and reports on any violations of the specification, potentially calling code (reactions) to be executed when violations are detected. The software is developed as an internal DSL (Domain Specific Language) in the Scala programming language. Scala is a relatively new programming language that is specifically convenient for defining such internal DSLs due to a number of language characteristics. These include Scala's elegant combination of object-oriented and functional programming, a succinct notation, and an advanced type system. The DSL offers a combination of data-parameterized state machines and temporal logic, which is novel. As an extension of Scala, it is a very expressive and convenient log file analysis framework.
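TraceContract is a Scala DSL, so the fragment below is only a toy analogue in Python of the kind of data-parameterized property it checks: every "request" carrying an id must eventually be followed by an "ack" with the same id, with a reaction invoked on violation. The event structure and field names are illustrative.

```python
log = [
    {"kind": "request", "id": 1},
    {"kind": "request", "id": 2},
    {"kind": "ack", "id": 1},
    {"kind": "end"},
]

def monitor(events, on_violation):
    """Scan a trace once, tracking requested-but-unacknowledged ids."""
    pending = set()
    for event in events:
        if event["kind"] == "request":
            pending.add(event["id"])
        elif event["kind"] == "ack":
            pending.discard(event["id"])
        elif event["kind"] == "end":
            for missing in sorted(pending):
                on_violation(f"request {missing} was never acknowledged")

monitor(log, on_violation=lambda msg: print("VIOLATION:", msg))
# prints: VIOLATION: request 2 was never acknowledged
```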
Analysis of memory use for improved design and compile-time allocation of local memory
NASA Technical Reports Server (NTRS)
Mcniven, Geoffrey D.; Davidson, Edward S.
1986-01-01
Trace analysis techniques are used to study memory referencing behavior for the purpose of designing local memories and determining how to allocate them for data and instructions. In an attempt to assess the inherent behavior of the source code, the trace analysis system described here reduces the effects of the compiler and host architecture on the trace by using a technique called flattening. The variables in the trace, their associated single-assignment values, and references are histogrammed on the basis of various parameters describing memory referencing behavior. Bounds are developed specifying the amount of memory space required to store all live values in a particular histogram class. The reduction achieved in main memory traffic by allocating local memory is specified for each class.
The Increasing Urgency for Standards in Basic Biological Research
Freedman, Leonard P.; Inglese, James
2016-01-01
Research advances build upon the validity and reproducibility of previously published data and findings. Yet irreproducibility in basic biological and preclinical research is pervasive in both academic and commercial settings. Lack of reproducibility has led to invalidated research breakthroughs, retracted papers, and aborted clinical trials. Concerns and requirements for transparent, reproducible, and translatable research are accelerated by the rapid growth of “post-publication peer review,” open access publishing, and data sharing that facilitate the identification of irreproducible data/studies; they are magnified by the explosion of high-throughput technologies, genomics, and other data-intensive disciplines. Collectively, these changes and challenges are decreasing the effectiveness of traditional research quality mechanisms and are contributing to unacceptable—and unsustainable—levels of irreproducibility. The global oncology and basic biological research communities can no longer tolerate or afford widespread irreproducible research. This article discusses (1) how irreproducibility in preclinical research can ultimately be traced to an absence of a unifying life science standards framework, and (2) makes an urgent case for the expanded development and use of consensus-based standards to both enhance reproducibility and drive innovations in cancer research. PMID:25035389
The Sagittarius tidal stream as a gravitational experiment in the Milky Way
NASA Astrophysics Data System (ADS)
Thomas, G. F.; Famaey, B.; Ibata, R.; Lüghausen, F.; Kroupa, P.
2015-12-01
Modified Newtonian Dynamics (MOND or Milgromian dynamics) gives a successful description of many galaxy properties that are hard to understand in the classical framework. The rotation curves of spiral galaxies are, for instance, perfectly reproduced and understood within this framework. Nevertheless, rotation curves only trace the potential in the galactic plane, and it is thus useful to test the shape of the potential outside the plane. Here we use the Sagittarius tidal stream as a gravitational experiment in the Milky Way, in order to check whether MOND can explain both its characteristics and those of the remnant dwarf spheroidal galaxy progenitor. We show that a MOND model of the Sagittarius stream can both perfectly reproduce the observed positions of stars in the stream, and even more strikingly, perfectly reproduce the observed properties of the remnant. Nevertheless, this first model does not reproduce well the observed radial velocities, which could be a signature of a rotating component in the progenitor or of the presence of a massive hot gaseous halo around the Milky Way.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Favorite, Jeffrey A.
The Second-Level Adjoint Sensitivity System (2nd-LASS) that yields the second-order sensitivities of a response of uncollided particles with respect to isotope densities, cross sections, and source emission rates is derived in Refs. 1 and 2. In Ref. 2, we solved problems for the uncollided leakage from a homogeneous sphere and a multiregion cylinder using the PARTISN multigroup discrete-ordinates code. In this memo, we derive solutions of the 2nd-LASS for the particular case when the response is a flux or partial current density computed at a single point on the boundary, and the inner products are computed using ray-tracing. Both the PARTISN approach and the ray-tracing approach are implemented in a computer code, SENSPG. The next section of this report presents the equations of the 1st- and 2nd-LASS for uncollided particles and the first- and second-order sensitivities that use the solutions of the 1st- and 2nd-LASS. Section III presents solutions of the 1st- and 2nd-LASS equations for the case of ray-tracing from a detector point. Section IV presents specific solutions of the 2nd-LASS and derives the ray-trace form of the inner products needed for second-order sensitivities. Numerical results for the total leakage from a homogeneous sphere are presented in Sec. V and for the leakage from one side of a two-region slab in Sec. VI. Section VII is a summary and conclusions.
NVIDIA OptiX ray-tracing engine as a new tool for modelling medical imaging systems
NASA Astrophysics Data System (ADS)
Pietrzak, Jakub; Kacperski, Krzysztof; Cieślar, Marek
2015-03-01
The most accurate technique to model the X- and gamma radiation path through a numerically defined object is the Monte Carlo simulation which follows single photons according to their interaction probabilities. A simplified and much faster approach, which just integrates total interaction probabilities along selected paths, is known as ray tracing. Both techniques are used in medical imaging for simulating real imaging systems and as projectors required in iterative tomographic reconstruction algorithms. These approaches are ready for massively parallel implementation, e.g. on Graphics Processing Units (GPU), which can greatly accelerate the computation time at a relatively low cost. In this paper we describe the application of the NVIDIA OptiX ray-tracing engine, popular in professional graphics and rendering applications, as a new powerful tool for X- and gamma ray-tracing in medical imaging. It allows the implementation of a variety of physical interactions of rays with pixel-, mesh- or NURBS-based objects, and recording any required quantities, like path integrals, interaction sites, deposited energies, and others. Using the OptiX engine we have implemented a code for rapid Monte Carlo simulations of Single Photon Emission Computed Tomography (SPECT) imaging, as well as the ray-tracing projector, which can be used in reconstruction algorithms. The engine generates efficient, scalable and optimized GPU code, ready to run on multi-GPU heterogeneous systems. We have compared the results of our simulations with the GATE package. With the OptiX engine the computation time of a Monte Carlo simulation can be reduced from days to minutes.
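The path-integral flavour of ray tracing mentioned here can be sketched in a few lines. The Python toy below is an assumption-laden illustration (voxel volume, sampling step, and geometry are invented, and a production projector such as one built on OptiX would traverse voxels exactly and run on the GPU): it accumulates the attenuation line integral along a straight ray and applies Beer-Lambert.

```python
# Minimal CPU sketch of a ray-tracing projector: integrate attenuation
# coefficients along a ray through a voxel volume, then apply Beer-Lambert.

import numpy as np

def path_integral(mu, origin, direction, step=0.5, n_steps=400):
    """Approximate the line integral of mu along a ray (sampling-based)."""
    direction = np.asarray(direction, float)
    direction /= np.linalg.norm(direction)
    pos = np.asarray(origin, float)
    total = 0.0
    for _ in range(n_steps):
        idx = np.floor(pos).astype(int)
        if np.all(idx >= 0) and np.all(idx < mu.shape):
            total += mu[tuple(idx)] * step
        pos = pos + direction * step
    return total

mu = np.zeros((64, 64, 64))
mu[24:40, 24:40, 24:40] = 0.02             # an attenuating cube (per voxel unit)
line_int = path_integral(mu, origin=(0, 32, 32), direction=(1, 0, 0))
print("transmission:", np.exp(-line_int))  # Beer-Lambert attenuation factor
```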
140 GHz EC waves propagation and absorption for normal/oblique injection on FTU tokamak
NASA Astrophysics Data System (ADS)
Nowak, S.; Airoldi, A.; Bruschi, A.; Buratti, P.; Cirant, S.; Gandini, F.; Granucci, G.; Lazzaro, E.; Panaccione, L.; Ramponi, G.; Simonetto, A.; Sozzi, C.; Tudisco, O.; Zerbini, M.
1999-09-01
Most of the interest in ECRH experiments is linked to the high localization of EC waves absorption in well known portions of the plasma volume. In order to take full advantage of this capability a reliable code has been developed for beam tracing and absorption calculations. The code is particularly important for oblique (poloidal and toroidal) injection, when the absorbing layer is not simply dependent on the position of the EC resonance only. An experimental estimate of the local heating power density is given by the jump in the time derivative of the local electron pressure at the switching ON of the gyrotron power. The evolution of the temperature profile increase (from ECE polychromator) during the nearly adiabatic phase is also considered for ECRH profile reconstruction. An indirect estimate of optical thickness and of the overall absorption coefficient is given by the measure of the residual e.m. power at the tokamak walls. Beam tracing code predictions of the power deposition profile are compared with experimental estimates. The impact of the finite spatial resolution of the temperature diagnostic on profile reconstruction is also discussed.
Computational models for the analysis of three-dimensional internal and exhaust plume flowfields
NASA Technical Reports Server (NTRS)
Dash, S. M.; Delguidice, P. D.
1977-01-01
This paper describes computational procedures developed for the analysis of three-dimensional supersonic ducted flows and multinozzle exhaust plume flowfields. The models/codes embodying these procedures cater to a broad spectrum of geometric situations via the use of multiple reference plane grid networks in several coordinate systems. Shock capturing techniques are employed to trace the propagation and interaction of multiple shock surfaces while the plume interface, separating the exhaust and external flows, and the plume external shock are discretely analyzed. The computational grid within the reference planes follows the trace of streamlines to facilitate the incorporation of finite-rate chemistry and viscous computational capabilities. Exhaust gas properties consist of combustion products in chemical equilibrium. The computational accuracy of the models/codes is assessed via comparisons with exact solutions, results of other codes and experimental data. Results are presented for the flows in two-dimensional convergent and divergent ducts, expansive and compressive corner flows, flow in a rectangular nozzle and the plume flowfields for exhausts issuing out of single and multiple rectangular nozzles.
Optimizing multi-dimensional high throughput screening using zebrafish
Truong, Lisa; Bugel, Sean M.; Chlebowski, Anna; Usenko, Crystal Y.; Simonich, Michael T.; Massey Simonich, Staci L.; Tanguay, Robert L.
2016-01-01
The use of zebrafish for high throughput screening (HTS) for chemical bioactivity assessments is becoming routine in the fields of drug discovery and toxicology. Here we report current recommendations from our experiences in zebrafish HTS. We compared the effects of different high throughput chemical delivery methods on nominal water concentration, chemical sorption to multi-well polystyrene plates, transcription responses, and resulting whole animal responses. We demonstrate that digital dispensing consistently yields higher data quality and reproducibility compared to standard plastic tip-based liquid handling. Additionally, we illustrate the challenges in using this sensitive model for chemical assessment when test chemicals have trace impurities. Adaptation of these better practices for zebrafish HTS should increase reproducibility across laboratories. PMID:27453428
Primary acoustic signal structure during free falling drop collision with a water surface
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chashechkin, Yu. D., E-mail: chakin@ipmnet.ru; Prokhorov, V. E., E-mail: prohorov@ipmnet.ru
2016-04-15
Consistent optical and acoustic techniques have been used to study the structure of hydrodynamic disturbances and acoustic signals generated as a free falling drop penetrates water. The relationship between the structures of hydrodynamic and acoustic perturbations arising as a result of a falling drop contacting with the water surface and subsequent immersion into water is traced. The primary acoustic signal is characterized, in addition to stably reproduced features (steep leading edge followed by long decay with local pressure maxima), by irregular high-frequency packets, which are studied for the first time. Reproducible experimental data are used to recognize constant and variable components of the primary acoustic signal.
Mechanism on brain information processing: Energy coding
NASA Astrophysics Data System (ADS)
Wang, Rubin; Zhang, Zhikang; Jiao, Xianfa
2006-09-01
According to the experimental result that signal transmission and neuronal energetic demands are tightly coupled to information coding in the cerebral cortex, the authors present a new scientific theory that offers a unique mechanism for brain information processing. They demonstrate that the neural coding produced by the activity of the brain is well described by the theory of energy coding. Because the energy coding model can reveal mechanisms of brain information processing based upon known biophysical properties, they can not only reproduce various experimental results of neuroelectrophysiology but also quantitatively explain the recent experimental results from neuroscientists at Yale University by means of the principle of energy coding. Because the theory of energy coding bridges the gap between functional connections within a biological neural network and energetic consumption, they estimate that the theory has very important consequences for quantitative research of cognitive function.
Top ten reasons to register your code with the Astrophysics Source Code Library
NASA Astrophysics Data System (ADS)
Allen, Alice; DuPrie, Kimberly; Berriman, G. Bruce; Mink, Jessica D.; Nemiroff, Robert J.; Robitaille, Thomas; Schmidt, Judy; Shamir, Lior; Shortridge, Keith; Teuben, Peter J.; Wallin, John F.; Warmels, Rein
2017-01-01
With 1,400 codes, the Astrophysics Source Code Library (ASCL, ascl.net) is the largest indexed resource for codes used in astronomy research in existence. This free online registry was established in 1999, is indexed by Web of Science and ADS, and is citable, with citations to its entries tracked by ADS. Registering your code with the ASCL is easy with our online submissions system. Making your software available for examination shows confidence in your research and makes your research more transparent, reproducible, and falsifiable. ASCL registration allows your software to be cited on its own merits and provides a citation that is trackable and accepted by all astronomy journals and journals such as Science and Nature. Registration also allows others to find your code more easily. This presentation covers the benefits of registering astronomy research software with the ASCL.
Energy coding in biological neural networks
Zhang, Zhikang
2007-01-01
According to the experimental result that signal transmission and neuronal energetic demands are tightly coupled to information coding in the cerebral cortex, we present a new scientific theory that offers a unique mechanism for brain information processing. We demonstrate that the neural coding produced by the activity of the brain is well described by our theory of energy coding. Because the energy coding model can reveal mechanisms of brain information processing based upon known biophysical properties, we can not only reproduce various experimental results of neuro-electrophysiology, but also quantitatively explain the recent experimental results from neuroscientists at Yale University by means of the principle of energy coding. Because the theory of energy coding bridges the gap between functional connections within a biological neural network and energetic consumption, we estimate that the theory has very important consequences for quantitative research of cognitive function. PMID:19003513
GRay: A MASSIVELY PARALLEL GPU-BASED CODE FOR RAY TRACING IN RELATIVISTIC SPACETIMES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chan, Chi-kwan; Psaltis, Dimitrios; Özel, Feryal
We introduce GRay, a massively parallel integrator designed to trace the trajectories of billions of photons in a curved spacetime. This graphics-processing-unit (GPU)-based integrator employs the stream processing paradigm, is implemented in CUDA C/C++, and runs on nVidia graphics cards. The peak performance of GRay using single-precision floating-point arithmetic on a single GPU exceeds 300 GFLOP (or 1 ns per photon per time step). For a realistic problem, where the peak performance cannot be reached, GRay is two orders of magnitude faster than existing central-processing-unit-based ray-tracing codes. This performance enhancement allows more effective searches of large parameter spaces when comparing theoretical predictions of images, spectra, and light curves from the vicinities of compact objects to observations. GRay can also perform on-the-fly ray tracing within general relativistic magnetohydrodynamic algorithms that simulate accretion flows around compact objects. Making use of this algorithm, we calculate the properties of the shadows of Kerr black holes and the photon rings that surround them. We also provide accurate fitting formulae of their dependencies on black hole spin and observer inclination, which can be used to interpret upcoming observations of the black holes at the center of the Milky Way, as well as M87, with the Event Horizon Telescope.
TransFit: Finite element analysis data fitting software
NASA Technical Reports Server (NTRS)
Freeman, Mark
1993-01-01
The Advanced X-Ray Astrophysics Facility (AXAF) mission support team has made extensive use of geometric ray tracing to analyze the performance of AXAF developmental and flight optics. One important aspect of this performance modeling is the incorporation of finite element analysis (FEA) data into the surface deformations of the optical elements. TransFit is software designed for the fitting of FEA data of Wolter I optical surface distortions with a continuous surface description which can then be used by SAO's analytic ray tracing software, currently OSAC (Optical Surface Analysis Code). The improved capabilities of TransFit over previous methods include bicubic spline fitting of FEA data to accommodate higher spatial frequency distortions, fitted data visualization for assessing the quality of fit, the ability to accommodate input data from three FEA codes plus other standard formats, and options for alignment of the model coordinate system with the ray trace coordinate system. TransFit uses the AnswerGarden graphical user interface (GUI) to edit input parameters and then access routines written in PV-WAVE, C, and FORTRAN to allow the user to interactively create, evaluate, and modify the fit. The topics covered include an introduction to TransFit: requirements, design philosophy, and implementation; design specifics: modules, parameters, fitting algorithms, and data displays; a procedural example; verification of performance; future work; and appendices on online help and ray trace results of the verification section.
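A minimal illustration of the bicubic-spline fitting step can be written with SciPy. This is not TransFit itself; the grid, units, and the synthetic "FEA" deformation below are assumptions made only to show the idea of fitting gridded deformation data so that a ray tracer can later query it at arbitrary points.

```python
# Sketch: fit a bicubic spline to gridded surface-deformation values and
# evaluate it where a ray would intersect the optic.

import numpy as np
from scipy.interpolate import RectBivariateSpline

# synthetic deformation sampled on a coarse axial/azimuthal grid
z = np.linspace(0.0, 100.0, 21)            # axial position [mm]
phi = np.linspace(0.0, 2 * np.pi, 37)      # azimuth [rad]
Z, P = np.meshgrid(z, phi, indexing="ij")
deformation = 1e-4 * np.sin(3 * P) * (Z / 100.0) ** 2   # invented values [mm]

# bicubic spline (kx = ky = 3) over the grid
fit = RectBivariateSpline(z, phi, deformation, kx=3, ky=3)

# query the fitted surface at an arbitrary (z, phi) point
print(fit(42.0, 1.1)[0, 0])
```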
Hara, Liuichi; Guirguis, Ramy; Hummel, Keith; Villanueva, Monica
2017-12-28
The United Nations Population Fund (UNFPA) and the United States Agency for International Development (USAID) DELIVER PROJECT work together to strengthen public health commodity supply chains by standardizing bar coding under a single set of global standards. From 2015, UNFPA and USAID collaborated to pilot test how tracking and tracing of bar coded health products could be operationalized in the public health supply chains of Ethiopia and Pakistan and inform the ecosystem needed to begin full implementation. Pakistan had been using proprietary bar codes for inventory management of contraceptive supplies but transitioned to global standards-based bar codes during the pilot. The transition allowed Pakistan to leverage the original bar codes that were preprinted by global manufacturers as opposed to printing new bar codes at the central warehouse. However, barriers at lower service delivery levels prevented full realization of end-to-end data visibility. Key barriers at the district level were the lack of a digital inventory management system and absence of bar codes at the primary-level packaging level, such as single blister packs. The team in Ethiopia developed an open-sourced smartphone application that allowed the team to scan bar codes using the mobile phone's camera and to push the captured data to the country's data mart. Real-time tracking and tracing occurred from the central warehouse to the Addis Ababa distribution hub and to 2 health centers. These pilots demonstrated that standardized product identification and bar codes can significantly improve accuracy over manual stock counts while significantly streamlining the stock-taking process, resulting in efficiencies. The pilots also showed that bar coding technology by itself is not sufficient to ensure data visibility. Rather, by using global standards for identification and data capture of pharmaceuticals and medical devices, and integrating the data captured into national and global tracking systems, countries are able to lay the foundation for interoperability and ensure a harmonized language between global health stakeholders. © Hara et al.
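The tracking described here rests on GS1-style identification, in which each pack carries an element string of Application Identifiers (AIs) such as GTIN, expiry date, batch, and serial number. As a hedged illustration only (the AI subset, the example data, and the parsing rules below are simplified assumptions, not the systems piloted in Ethiopia or Pakistan), such a string can be split like this:

```python
# Illustrative sketch, not a production GS1 parser. Fixed-length AIs handled:
# (01) GTIN-14 and (17) expiry YYMMDD; variable-length AIs (10) batch and
# (21) serial end at the FNC1 group separator (ASCII 29) or end of string.

GS = "\x1d"                      # FNC1 group separator
FIXED = {"01": 14, "17": 6}      # AI -> fixed payload length
VARIABLE = {"10", "21"}          # variable-length AIs

def parse_gs1(data):
    fields, i = {}, 0
    while i < len(data):
        ai = data[i:i + 2]
        i += 2
        if ai in FIXED:
            n = FIXED[ai]
            fields[ai] = data[i:i + n]
            i += n
        elif ai in VARIABLE:
            end = data.find(GS, i)
            end = len(data) if end == -1 else end
            fields[ai] = data[i:end]
            i = end + 1
        else:
            raise ValueError(f"unsupported AI: {ai}")
    return fields

print(parse_gs1("0109506000134352172005311012345" + GS + "21987654"))
# {'01': '09506000134352', '17': '200531', '10': '12345', '21': '987654'}
```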
Rapid Analysis of Trace Drugs and Metabolites Using a Thermal Desorption DART-MS Configuration.
Sisco, Edward; Forbes, Thomas P; Staymates, Matthew E; Gillen, Greg
2016-01-01
The need to analyze trace narcotic samples rapidly for screening or confirmatory purposes is of increasing interest to the forensic, homeland security, and criminal justice sectors. This work presents a novel method for the detection and quantification of trace drugs and metabolites from a swipe material using a thermal desorption direct analysis in real time mass spectrometry (TD-DART-MS) configuration. A variation on traditional DART, this configuration allows for desorption of the sample into a confined tube, completely independent of the DART source, allowing for more efficient and thermally precise analysis of material present on a swipe. Over thirty trace samples of narcotics, metabolites, and cutting agents deposited onto swipes were rapidly differentiated using this methodology. The non-optimized method led to sensitivities ranging from single nanograms to hundreds of picograms. Direct comparison to traditional DART with a subset of the samples highlighted an improvement in sensitivity by a factor of twenty to thirty and an improvement in sample-to-sample reproducibility from approximately 45% RSD to less than 15% RSD. Rapid, extraction-less quantification was also possible.
Ghosh, Adarsh; Singh, Tulika; Singla, Veenu; Bagga, Rashmi; Khandelwal, Niranjan
2017-12-01
Apparent diffusion coefficient (ADC) maps are usually generated by built-in software provided by the MRI scanner vendors; however, various open-source postprocessing software packages are available for image manipulation and parametric map generation. The purpose of this study is to establish the reproducibility of absolute ADC values obtained using different postprocessing software programs. DW images with three b values were obtained with a 1.5-T MRI scanner, and the trace images were obtained. ADC maps were automatically generated by the in-line software provided by the vendor during image generation and were also separately generated on postprocessing software. These ADC maps were compared on the basis of ROIs using the paired t test, Bland-Altman plot, mountain plot, and Passing-Bablok regression plot. There was a statistically significant difference in the mean ADC values obtained from the different postprocessing software programs when the same baseline trace DW images were used for the ADC map generation. For using ADC values as a quantitative cutoff for histologic characterization of tissues, standardization of the postprocessing algorithm is essential across processing software packages, especially in view of the implementation of vendor-neutral archiving.
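For context, the quantity being compared can be computed directly from the trace DW images under the usual mono-exponential model S(b) = S0 exp(-b ADC), giving ADC = ln(S(b0)/S(b1)) / (b1 - b0) for two b values. The sketch below is a generic illustration with synthetic values, not any vendor's or package's algorithm; with more than two b values a per-voxel log-linear least-squares fit would be used instead.

```python
# Generic two-point ADC map from trace DW images (mono-exponential model).

import numpy as np

def adc_map(s_low, s_high, b_low=0.0, b_high=1000.0, eps=1e-6):
    """ADC in mm^2/s given two trace DW images of the same shape."""
    ratio = np.clip(s_low, eps, None) / np.clip(s_high, eps, None)
    return np.log(ratio) / (b_high - b_low)

s0 = np.full((4, 4), 1000.0)                 # b = 0 s/mm^2 image (synthetic)
s1000 = s0 * np.exp(-1000.0 * 0.8e-3)        # simulated ADC = 0.8e-3 mm^2/s
print(adc_map(s0, s1000)[0, 0])              # ~0.0008
```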
NASA Technical Reports Server (NTRS)
Awtry, A. R.; Miller, J. H.
2002-01-01
The progress in the development of a sensor for the detection of trace air constituents to monitor spacecraft air quality is reported. A continuous-wave (cw), external-cavity tunable diode laser centered at 1.55 micrometers is used to pump an optical cavity absorption cell in cw-cavity ringdown spectroscopy (cw-CRDS). Preliminary results are presented that demonstrate the sensitivity, selectivity and reproducibility of this method. Detection limits of 2.0 ppm for CO, 2.5 ppm for CO2, 1.8 ppm for H2O, 19.4 ppb for NH3, 7.9 ppb for HCN and 4.0 ppb for C2H2 are calculated.
In-injection port thermal desorption for explosives trace evidence analysis.
Sigman, M E; Ma, C Y
1999-10-01
A gas chromatographic method utilizing thermal desorption of a dry surface wipe for the analysis of explosives trace chemical evidence has been developed and validated using electron capture and negative ion chemical ionization mass spectrometric detection. Thermal desorption was performed within a split/splitless injection port with minimal instrument modification. Surface-abraded Teflon tubing provided the solid support for sample collection and desorption. Performance was characterized by desorption efficiency, reproducibility, linearity of the calibration, and method detection and quantitation limits. Method validation was performed with a series of dinitrotoluenes, trinitrotoluene, two nitroester explosives, and one nitramine explosive. The method was applied to the sampling of a single piece of debris from an explosion containing trinitrotoluene.
DNA-PCR analysis of bloodstains sampled by the polyvinyl-alcohol method.
Schyma, C; Huckenbeck, W; Bonte, W
1999-01-01
Among the usual techniques for sampling gunshot residues (GSR), the polyvinyl-alcohol method (PVAL) offers the advantage of embedding all particles, foreign bodies and stains on the surface of the shooter's hand in an exact and reproducible topographic localization. The aim of the present study on ten persons killed by firearms was to check the possibility of DNA-PCR typing of blood traces embedded in the PVAL gloves in a second step following GSR analysis. The results of these examinations verify that the PVAL technique does not introduce factors that inhibit successful PCR typing. Thus the PVAL method can be recommended as a combination technique to secure and preserve inorganic and biological traces at the same time.
Detection of chemical residues in food oil via surface-enhanced Raman spectroscopy
NASA Astrophysics Data System (ADS)
Sun, Kexi; Huang, Qing
2016-05-01
Highly ordered, hexagonally patterned Ag-nanorod (Ag-NR) arrays for surface-enhanced Raman scattering (SERS) detection of unhealthy chemical residues in food oil are reported; they were obtained by sputtering Ag on the alumina nanotip arrays protruding from conical-pore anodic aluminum oxide (AAO) templates. SERS measurements demonstrate that the as-fabricated large-scale Ag nanostructures can serve as highly sensitive and reproducible SERS substrates for detection of trace amounts of chemicals in oil, with lower detection limits of 2×10⁻⁶ M for thiram and 10⁻⁷ M for rhodamine B, showing the potential of SERS for rapid trace detection of pesticide residues and illegal additives in food oils.
Validation of Multitemperature Nozzle Flow Code
NASA Technical Reports Server (NTRS)
Park, Chul; Lee, Seung-Ho
1994-01-01
A computer code nozzle in n-temperatures (NOZNT), which calculates one-dimensional flows of partially dissociated and ionized air in an expanding nozzle, is tested against three existing sets of experimental data taken in arcjet wind tunnels. The code accounts for the differences among various temperatures, i.e., translational-rotational temperature, vibrational temperatures of individual molecular species, and electron-electronic temperature, and the effects of impurities. The experimental data considered are (1) the spectroscopic emission data; (2) electron beam data on vibrational temperature; and (3) mass-spectrometric species concentration data. It is shown that the impurities are inconsequential for the arcjet flows, and the NOZNT code is validated by numerically reproducing the experimental data.
A Comparison of Three Elliptical Galaxy Photochemical Evolution Codes
NASA Astrophysics Data System (ADS)
Gibson, Brad K.
1996-09-01
Working within the classic supernovae-driven wind framework for elliptical galaxy evolution, we perform a systematic investigation into the discrepancies between the predictions of three contemporary codes (by Arimoto & Yoshii, Bressan et al., and Gibson). By being primarily concerned with reproducing the present-day color-metallicity-luminosity (CML) relations among elliptical galaxies, the approaches taken in the theoretical modeling have managed to obscure many of the hidden differences between the codes. Targeting the timescale for the onset of the initial galactic wind, t_GW, as a primary "difference" indicator, we demonstrate exactly how and why each code is able to claim successful reproduction of the CML relations, despite possessing apparently incompatible input ingredients.
Noël, Marie; Christensen, Jennie R; Spence, Jody; Robbins, Charles T
2015-10-01
We enhanced an existing technique, laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS), to function as a non-lethal tool in the temporal characterization of trace element exposure in wild mammals. Mercury (Hg), copper (Cu), cadmium (Cd), lead (Pb), iron (Fe) and zinc (Zn) were analyzed along the hair of captive and wild grizzly bears (Ursus arctos horribilis). Laser parameters were optimized (consecutive 2000 μm line scans along the middle line of the hair at a speed of 50 μm/s; spot size = 30 μm) for consistent ablation of the hair. A pressed pellet of reference material DOLT-2 and sulfur were used as external and internal standards, respectively. Our newly adapted method passed the quality control tests with strong correlations between trace element concentrations obtained using LA-ICP-MS and those obtained with regular solution-ICP-MS (r² = 0.92, 0.98, 0.63, 0.57, 0.99 and 0.90 for Hg, Fe, Cu, Zn, Cd and Pb, respectively). Cross-correlation analyses revealed good reproducibility between trace element patterns obtained from hair collected from the same bear. One exception was Cd, for which external contamination was observed resulting in poor reproducibility. In order to validate the method, we used LA-ICP-MS on the hair of five captive grizzly bears fed known and varying amounts of cutthroat trout over a period of 33 days. Trace element patterns along the hair revealed strong Hg, Cu and Zn signals coinciding with fish consumption. Accordingly, significant correlations between Hg, Cu, and Zn in the hair and Hg, Cu, and Zn intake were evident and we were able to develop accumulation models for each of these elements. While the use of LA-ICP-MS for the monitoring of trace elements in wildlife is in its infancy, this study highlights the robustness and applicability of this newly adapted method. Copyright © 2015 Elsevier B.V. All rights reserved.
Predicting Morphology of Polymers Using Mesotek+
2010-02-01
[Garbled abstract: only fragments are recoverable. They indicate that an input file is produced for Mesotek+ to reproduce the phase behavior of an experimental poly(styrene-b-isoprene) system in the solvent tetradecane, with figures comparing the theoretical and experimental codes and gnuplot maps of the styrene, isoprene, and tetradecane volume fractions for a 40/60 volume styrene-b-isoprene + tetradecane mixture.]
NASA Astrophysics Data System (ADS)
Morozov, A.; Defendi, I.; Engels, R.; Fraga, F. A. F.; Fraga, M. M. F. R.; Guerard, B.; Jurkovic, M.; Kemmerling, G.; Manzin, G.; Margato, L. M. S.; Niko, H.; Pereira, L.; Petrillo, C.; Peyaud, A.; Piscitelli, F.; Raspino, D.; Rhodes, N. J.; Sacchetti, F.; Schooneveld, E. M.; Van Esch, P.; Zeitelhack, K.
2012-08-01
A custom and fully interactive simulation package ANTS (Anger-camera type Neutron detector: Toolkit for Simulations) has been developed to optimize the design and operation conditions of secondary scintillation Anger-camera type gaseous detectors for thermal neutron imaging. The simulation code accounts for all physical processes related to the neutron capture, energy deposition pattern, drift of electrons of the primary ionization and secondary scintillation. The photons are traced considering the wavelength-resolved refraction and transmission of the output window. Photo-detection accounts for the wavelength-resolved quantum efficiency, angular response, area sensitivity, gain and single-photoelectron spectra of the photomultipliers (PMTs). The package allows for several geometrical shapes of the PMT photocathode (round, hexagonal and square) and offers a flexible PMT array configuration: up to 100 PMTs in a custom arrangement with the square or hexagonal packing. Several read-out patterns of the PMT array are implemented. Reconstruction of the neutron capture position (projection on the plane of the light emission) is performed using the center of gravity, maximum likelihood or weighted least squares algorithm. Simulation results reproduce well the preliminary results obtained with a small-scale detector prototype. ANTS executables can be downloaded from http://coimbra.lip.pt/~andrei/.
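Of the reconstruction options listed, the centre-of-gravity estimate is simple enough to sketch. In the Python toy below the PMT layout and signal amplitudes are invented, and the package's maximum-likelihood and weighted least-squares options are not shown; this is an illustration of the idea, not the ANTS implementation.

```python
# Centre-of-gravity (Anger logic) position estimate: the event position is
# the signal-weighted mean of the PMT centre coordinates.

import numpy as np

def centre_of_gravity(pmt_xy, signals):
    signals = np.asarray(signals, float)
    centres = np.asarray(pmt_xy, float)
    return (signals[:, None] * centres).sum(axis=0) / signals.sum()

# 2 x 2 array of PMTs at +/- 25 mm and their measured amplitudes
pmt_xy = [(-25, -25), (25, -25), (-25, 25), (25, 25)]
signals = [120, 300, 80, 200]
print(centre_of_gravity(pmt_xy, signals))   # pulled toward the brightest PMTs
```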
Simulating Cosmic Reionization and Its Observable Consequences
NASA Astrophysics Data System (ADS)
Shapiro, Paul
2017-01-01
I summarize recent progress in modelling the epoch of reionization by large-scale simulations of cosmic structure formation, radiative transfer and their interplay, which trace the ionization fronts that swept across the IGM, to predict observable signatures. Reionization by starlight from early galaxies affected their evolution, impacting reionization itself and imprinting the galaxies with a memory of reionization. Star formation suppression, e.g., may explain the observed underabundance of Local Group dwarfs relative to N-body predictions for Cold Dark Matter. I describe CoDa ("Cosmic Dawn"), the first fully-coupled radiation-hydrodynamical simulation of reionization and galaxy formation in the Local Universe, in a volume large enough to model reionization globally but with enough resolving power to follow all the atomic-cooling galactic halos in that volume. A 90 Mpc box was simulated from a constrained realization of primordial fluctuations, chosen to reproduce present-day features of the Local Group, including the Milky Way and M31, and the local universe beyond, including the Virgo cluster. The new RAMSES-CUDATON hybrid CPU-GPU code took 11 days to perform this simulation on the Titan supercomputer at Oak Ridge National Laboratory, with 4096-cubed N-body particles for the dark matter and 4096-cubed cells for the atomic gas and ionizing radiation.
A novel optical assay system for the quantitative measurement of chemotaxis.
Kanegasaki, Shiro; Nomura, Yuka; Nitta, Nao; Akiyama, Shuichi; Tamatani, Takuya; Goshoh, Yasuhiro; Yoshida, Takashi; Sato, Tsuyoshi; Kikuchi, Yuji
2003-11-01
We have developed an optically accessible, horizontal chemotaxis apparatus consisting of an etched silicon substrate and a flat glass plate, both of which form two compartments with a 5-microm-deep microchannel in between. The device is held together with a stainless steel holder with holes for injecting cells and a chemoattractant to the different compartments. Migration of cells in the channel is traced with time-lapse intervals using a CCD camera. By developing a method for aligning cells at the edge of the channel, we could successfully reduce the number of cells required for a chemotactic assay, depending on the experiment, to 100 or less. To prevent ceaseless flow of contents between the adjacent compartments via the communicating microchannel, a space at the top end of the holder was filled with medium after aligning the cells. By using a fluorescent probe, we demonstrated experimentally that a stable concentration gradient could be maintained. Furthermore, we determined theoretical details of the gradient established using a model chemokine and a computational fluid dynamics code. Reproducible kinetic results of cell migration were obtained using human neutrophils and IL-8 as a model. Migration of other cells such as eosinophils, basophils and Jurkat lymphocytes toward the appropriate chemokines were also demonstrated.
NASA Astrophysics Data System (ADS)
de Winter, Niels; Goderis, Steven; van Malderen, Stijn; Vanhaecke, Frank; Claeys, Philippe
2016-04-01
A combination of laboratory micro-X-ray fluorescence (μXRF) and stable carbon and oxygen isotope analysis shows that trace element profiles from modern horse molars reveal a seasonal pattern that co-varies with seasonality in the oxygen isotope records of enamel carbonate from the same teeth. A combination of six cheek teeth (premolars and molars) from the same individual yields a seasonal isotope and trace element record of approximately three years recorded during the growth of the molars. This record shows that reproducible measurements of various trace element ratios (e.g., Sr/Ca, Zn/Ca, Fe/Ca, K/Ca and S/Ca) lag the seasonal pattern in oxygen isotope records by 2-3 months. Laser Ablation-ICP-Mass Spectrometry (LA-ICP-MS) analysis on a cross-section of the first molar of the same individual is compared to the bench-top tube-excitation μXRF results to test the robustness of the measurements and to compare both methods. Furthermore, trace element (e.g. Sr, Zn, Mg & Ba) profiles perpendicular to the growth direction of the same tooth, as well as profiles parallel to the growth direction, are measured with LA-ICP-MS and μXRF to study the internal distribution of trace element ratios in two dimensions. Results of this extensive complementary line-scanning procedure show the robustness of state-of-the-art laboratory micro-XRF scanning for the measurement of trace elements in bioapatite. The comparison highlights the advantages and disadvantages of both methods for trace element analysis and illustrates their complementarity. Results of internal variation within the teeth shed light on the origins of trace elements in mammal teeth and their potential use for paleo-environmental reconstruction.
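One simple way to quantify a lag like the reported 2-3 months is to cross-correlate two evenly resampled seasonal profiles and take the lag that maximizes the correlation. The sketch below uses synthetic monthly series standing in for the element and isotope records; it is an illustration of the approach, not the analysis performed in the study, and real data would first be resampled onto a common axis.

```python
# Estimate the lag (in samples) by which series a trails series b.

import numpy as np

def best_lag(a, b):
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    corr = np.correlate(a, b, mode="full")
    return np.argmax(corr) - (len(b) - 1)

t = np.arange(36)                            # three years, monthly samples
isotope = np.sin(2 * np.pi * t / 12)         # seasonal cycle
element = np.sin(2 * np.pi * (t - 2) / 12)   # same cycle, 2 months later
print(best_lag(element, isotope))            # ~2
```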
Simulations of Laboratory Astrophysics Experiments using the CRASH code
NASA Astrophysics Data System (ADS)
Trantham, Matthew; Kuranz, Carolyn; Fein, Jeff; Wan, Willow; Young, Rachel; Keiter, Paul; Drake, R. Paul
2015-11-01
Computer simulations can assist in the design and analysis of laboratory astrophysics experiments. The Center for Radiative Shock Hydrodynamics (CRASH) at the University of Michigan developed a code that has been used to design and analyze high-energy-density experiments on OMEGA, NIF, and other large laser facilities. This Eulerian code uses block-adaptive mesh refinement (AMR) with implicit multigroup radiation transport, electron heat conduction and laser ray tracing. This poster will demonstrate some of the experiments the CRASH code has helped design or analyze including: Kelvin-Helmholtz, Rayleigh-Taylor, magnetized flows, jets, and laser-produced plasmas. This work is funded by the following grants: DEFC52-08NA28616, DE-NA0001840, and DE-NA0002032.
Audetat, Andreas; Garbe-Schonberg, Dieter; Kronz, Andreas; Pettke, Thomas; Rusk, Brian G.; Donovan, John J.; Lowers, Heather
2015-01-01
A natural smoky quartz crystal from Shandong province, China, was characterised by laser ablation ICP-MS, electron probe microanalysis (EPMA) and solution ICP-MS to determine the concentration of twenty-four trace and ultra-trace elements. Our main focus was on Ti quantification because of the increased use of this element for titanium-in-quartz (TitaniQ) thermobarometry. Pieces of a uniform growth zone of 9 mm thickness within the quartz crystal were analysed in four different LA-ICP-MS laboratories, three EPMA laboratories and one solution-ICP-MS laboratory. The results reveal reproducible concentrations of Ti (57 ± 4 μg g⁻¹), Al (154 ± 15 μg g⁻¹), Li (30 ± 2 μg g⁻¹), Fe (2.2 ± 0.3 μg g⁻¹), Mn (0.34 ± 0.04 μg g⁻¹), Ge (1.7 ± 0.2 μg g⁻¹) and Ga (0.020 ± 0.002 μg g⁻¹) and detectable, but less reproducible, concentrations of Be, B, Na, Cu, Zr, Sn and Pb. Concentrations of K, Ca, Sr, Mo, Ag, Sb, Ba and Au were below the limits of detection of all three techniques. The uncertainties on the average concentration determinations by multiple techniques and laboratories for Ti, Al, Li, Fe, Mn, Ga and Ge are low; hence, this quartz can serve as a reference material or a secondary reference material for microanalytical applications involving the quantification of trace elements in quartz.
Tracing Multiple Generations of Active Galactic Nucleus Feedback in the Core of Abell 262
2009-06-01
[Garbled abstract: only fragments are recoverable, mixed with report documentation boilerplate. They refer to a series of filaments, analogous to those seen in the Virgo cluster, that trace regions in the core of Abell 262, and list the authors' affiliations (Naval Research Laboratory, Code 7213, Washington, DC; Interferometrics Inc., Herndon, VA; Institute for Astrophysical Research).]
Signal-processing theory for the TurboRogue receiver
NASA Technical Reports Server (NTRS)
Thomas, J. B.
1995-01-01
Signal-processing theory for the TurboRogue receiver is presented. The signal form is traced from its formation at the GPS satellite, to the receiver antenna, and then through the various stages of the receiver, including extraction of phase and delay. The analysis treats the effects of ionosphere, troposphere, signal quantization, receiver components, and system noise, covering processing in both the 'code mode' when the P code is not encrypted and in the 'P-codeless mode' when the P code is encrypted. As a possible future improvement to the current analog front end, an example of a highly digital front end is analyzed.
Progress toward openness, transparency, and reproducibility in cognitive neuroscience.
Gilmore, Rick O; Diaz, Michele T; Wyble, Brad A; Yarkoni, Tal
2017-05-01
Accumulating evidence suggests that many findings in psychological science and cognitive neuroscience may prove difficult to reproduce; statistical power in brain imaging studies is low and has not improved recently; software errors in analysis tools are common and can go undetected for many years; and, a few large-scale studies notwithstanding, open sharing of data, code, and materials remains the rare exception. At the same time, there is a renewed focus on reproducibility, transparency, and openness as essential core values in cognitive neuroscience. The emergence and rapid growth of data archives, meta-analytic tools, software pipelines, and research groups devoted to improved methodology reflect this new sensibility. We review evidence that the field has begun to embrace new open research practices and illustrate how these can begin to address problems of reproducibility, statistical power, and transparency in ways that will ultimately accelerate discovery. © 2017 New York Academy of Sciences.
TIM, a ray-tracing program for METATOY research and its dissemination
NASA Astrophysics Data System (ADS)
Lambert, Dean; Hamilton, Alasdair C.; Constable, George; Snehanshu, Harsh; Talati, Sharvil; Courtial, Johannes
2012-03-01
TIM (The Interactive METATOY) is a ray-tracing program specifically tailored towards our research in METATOYs, which are optical components that appear to be able to create wave-optically forbidden light-ray fields. For this reason, TIM possesses features not found in other ray-tracing programs. TIM can either be used interactively or by modifying the openly available source code; in both cases, it can easily be run as an applet embedded in a web page. Here we describe the basic structure of TIM's source code and how to extend it, and we give examples of how we have used TIM in our own research. Program summary: Program title: TIM. Catalogue identifier: AEKY_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKY_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License. No. of lines in distributed program, including test data, etc.: 124 478. No. of bytes in distributed program, including test data, etc.: 4 120 052. Distribution format: tar.gz. Programming language: Java. Computer: Any computer capable of running the Java Virtual Machine (JVM) 1.6. Operating system: Any; developed under Mac OS X Version 10.6. RAM: Typically 145 MB (interactive version running under Mac OS X Version 10.6). Classification: 14, 18. External routines: JAMA [1] (source code included). Nature of problem: Visualisation of scenes that include scene objects that create wave-optically forbidden light-ray fields. Solution method: Ray tracing. Unusual features: Specifically designed to visualise wave-optically forbidden light-ray fields; can visualise ray trajectories; can visualise geometric optic transformations; can create anaglyphs (for viewing with coloured "3D glasses") and random-dot autostereograms of the scene; integrable into web pages. Running time: Problem-dependent; typically seconds for a simple scene.
Cross-species inference of long non-coding RNAs greatly expands the ruminant transcriptome.
Bush, Stephen J; Muriuki, Charity; McCulloch, Mary E B; Farquhar, Iseabail L; Clark, Emily L; Hume, David A
2018-04-24
mRNA-like long non-coding RNAs (lncRNAs) are a significant component of mammalian transcriptomes, although most are expressed only at low levels, with high tissue-specificity and/or at specific developmental stages. Thus, in many cases lncRNA detection by RNA-sequencing (RNA-seq) is compromised by stochastic sampling. To account for this and create a catalogue of ruminant lncRNAs, we compared de novo assembled lncRNAs derived from large RNA-seq datasets in transcriptional atlas projects for sheep and goats with previous lncRNAs assembled in cattle and human. We then combined the novel lncRNAs with the sheep transcriptional atlas to identify co-regulated sets of protein-coding and non-coding loci. Few lncRNAs could be reproducibly assembled from a single dataset, even with deep sequencing of the same tissues from multiple animals. Furthermore, there was little sequence overlap between lncRNAs that were assembled from pooled RNA-seq data. We combined positional conservation (synteny) with cross-species mapping of candidate lncRNAs to identify a consensus set of ruminant lncRNAs and then used the RNA-seq data to demonstrate detectable and reproducible expression in each species. In sheep, 20 to 30% of lncRNAs were located close to protein-coding genes with which they are strongly co-expressed, which is consistent with the evolutionary origin of some ncRNAs in enhancer sequences. Nevertheless, most of the lncRNAs are not co-expressed with neighbouring protein-coding genes. Alongside substantially expanding the ruminant lncRNA repertoire, the outcomes of our analysis demonstrate that stochastic sampling can be partly overcome by combining RNA-seq datasets from related species. This has practical implications for the future discovery of lncRNAs in other species.
2014-01-01
Background: During outbreaks of livestock diseases, contact tracing can be an important part of disease control. Animal movements can also be relevant for risk-based surveillance and sampling, i.e. both when assessing the consequences of introduction and when assessing the likelihood of introduction. In many countries, animal movement data are collected with contact tracing as one of the major objectives. However, an analytical step is often needed to retrieve appropriate information for contact tracing or surveillance. Results: In this study, an open source tool was developed to structure livestock movement data to facilitate contact tracing in real time during disease outbreaks and for input in risk-based surveillance and sampling. The tool, EpiContactTrace, was written in the R language and uses the network parameters in-degree, out-degree, ingoing contact chain and outgoing contact chain (also called infection chain), which are relevant for forward and backward tracing respectively. The time-frames for backward and forward tracing can be specified independently and the search can be done for one farm at a time or for all farms within the dataset. Different outputs are available: datasets with network measures, contacts visualised on a map, and automatically generated reports for each farm in either HTML or PDF format intended for the end users, i.e. the veterinary authorities, regional disease control officers and field veterinarians. EpiContactTrace is available as an R package at the R-project website (http://cran.r-project.org/web/packages/EpiContactTrace/). Conclusions: We believe this tool can help in disease control since it can rapidly structure essential contact information from large datasets. The reproducible reports make this tool robust and independent of manual compilation of data. The open source makes it accessible and easily adaptable for different needs. PMID:24636731
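The "outgoing contact chain" (infection chain) mentioned above can be illustrated with a small Python sketch: starting from a root holding, collect every holding reachable through movements that respect time ordering within a tracing window. The movement-record format below is an invented tuple encoding, and this is not the EpiContactTrace implementation (which is an R package).

```python
# Toy outgoing contact chain over (date, from, to) movement records.

from datetime import date

def outgoing_contact_chain(movements, root, t_start, t_end):
    reached = {root: t_start}      # holding -> earliest time it could be reached
    changed = True
    while changed:                 # relax until no new holdings are added
        changed = False
        for day, src, dst in sorted(movements):
            if not (t_start <= day <= t_end):
                continue
            if src in reached and day >= reached[src]:
                if dst not in reached or day < reached[dst]:
                    reached[dst] = day
                    changed = True
    return set(reached) - {root}

moves = [
    (date(2024, 1, 5),  "A", "B"),
    (date(2024, 1, 10), "B", "C"),
    (date(2024, 1, 2),  "B", "D"),   # too early to have come via A
]
print(outgoing_contact_chain(moves, "A", date(2024, 1, 1), date(2024, 1, 31)))
# {'B', 'C'}
```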
Benchmarking the SPHINX and CTH shock physics codes for three problems in ballistics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, L.T.; Hertel, E.; Schwalbe, L.
1998-02-01
The CTH Eulerian hydrocode and the SPHINX smooth particle hydrodynamics (SPH) code were used to model a shock tube, two long rod penetrations into semi-infinite steel targets, and a long rod penetration into a spaced plate array. The results were then compared to experimental data. Both SPHINX and CTH modeled the one-dimensional shock tube problem well. Both codes did a reasonable job in modeling the outcome of the axisymmetric rod impact problem. Neither code correctly reproduced the depth of penetration in both experiments. In the 3-D problem, both codes reasonably replicated the penetration of the rod through the first plate. After this, however, the predictions of both codes began to diverge from the results seen in the experiment. In terms of computer resources, the run times are problem dependent, and are discussed in the text.
Astrophysics Source Code Library: Incite to Cite!
NASA Astrophysics Data System (ADS)
DuPrie, K.; Allen, A.; Berriman, B.; Hanisch, R. J.; Mink, J.; Nemiroff, R. J.; Shamir, L.; Shortridge, K.; Taylor, M. B.; Teuben, P.; Wallen, J. F.
2014-05-01
The Astrophysics Source Code Library (ASCL, http://ascl.net/) is an on-line registry of over 700 source codes that are of interest to astrophysicists, with more being added regularly. The ASCL actively seeks out codes as well as accepting submissions from the code authors, and all entries are citable and indexed by ADS. All codes have been used to generate results published in or submitted to a refereed journal and are available either via a download site or from an identified source. In addition to being the largest directory of scientist-written astrophysics programs available, the ASCL is also an active participant in the reproducible research movement, with presentations at various conferences, numerous blog posts and a journal article. This poster provides a description of the ASCL and the changes that we are starting to see in the astrophysics community as a result of the work we are doing.
Introducing GAMER: A Fast and Accurate Method for Ray-tracing Galaxies Using Procedural Noise
NASA Astrophysics Data System (ADS)
Groeneboom, N. E.; Dahle, H.
2014-03-01
We developed a novel approach for fast and accurate ray-tracing of galaxies using procedural noise fields. Our method allows for efficient and realistic rendering of synthetic galaxy morphologies, where individual components such as the bulge, disk, stars, and dust can be synthesized in different wavelengths. These components follow empirically motivated overall intensity profiles but contain an additional procedural noise component that gives rise to complex natural patterns that mimic interstellar dust and star-forming regions. These patterns produce more realistic-looking galaxy images than using analytical expressions alone. The method is fully parallelized and creates accurate high- and low- resolution images that can be used, for example, in codes simulating strong and weak gravitational lensing. In addition to having a user-friendly graphical user interface, the C++ software package GAMER is easy to implement into an existing code.
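To make the idea concrete, here is a toy Python sketch that modulates an analytic exponential-disk intensity profile with smooth value noise to mimic texture from dust lanes and star-forming regions. It is not GAMER (which is a C++ package with a GUI); the noise model, profile, and all parameters are invented for illustration.

```python
# Analytic galaxy profile multiplied by a smooth procedural noise field.

import numpy as np

def value_noise(shape, cells=8, seed=0):
    """Bilinearly interpolated value noise in [0, 1] on a coarse random grid."""
    rng = np.random.default_rng(seed)
    coarse = rng.random((cells + 1, cells + 1))
    y, x = np.mgrid[0:shape[0], 0:shape[1]]
    u = x / (shape[1] - 1) * cells
    v = y / (shape[0] - 1) * cells
    i = np.clip(np.floor(u).astype(int), 0, cells - 1)
    j = np.clip(np.floor(v).astype(int), 0, cells - 1)
    fu, fv = u - i, v - j
    top = coarse[j, i] * (1 - fu) + coarse[j, i + 1] * fu
    bot = coarse[j + 1, i] * (1 - fu) + coarse[j + 1, i + 1] * fu
    return top * (1 - fv) + bot * fv

n = 256
y, x = np.mgrid[0:n, 0:n]
r = np.hypot(x - n / 2, y - n / 2)
disk = np.exp(-r / 40.0)                          # exponential disk profile
image = disk * (0.6 + 0.8 * value_noise((n, n)))  # noise-modulated galaxy image
print(image.shape, float(image.max()))
```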
46 CFR 54.10-5 - Maximum allowable working pressure (reproduces UG-98).
Code of Federal Regulations, 2010 CFR
2010-10-01
... section VIII of the ASME Boiler and Pressure Vessel Code, together with the effect of any combination of... operating temperature, using for each temperature the applicable allowable stress value. Note: Table 54.10-5...
NASA Astrophysics Data System (ADS)
Caserta, A.; Doumaz, F.; Pischiutta, M.; Costanzo, A.
2017-12-01
In the European design code EU08, used in Italy as NT08, site effects are accounted for through several scaling factors that depend on Vs30 and on topographic conditions. The effectiveness of this approach has been tested in two case studies. The first is located in the Tiber valley, the main sedimentary basin of the city of Rome. The second is located in the town of Acquasanta Terme (central Apennines). In both cases, the expected amplification levels according to the Italian design code were calculated on the basis of the velocity profile and other geological information collected in situ. The expected values were compared, in the former case (Rome), with data recorded during the seismic sequence following the 2009 April 6th, Mw = 6.3 L'Aquila earthquake (mainshock and aftershocks) and, in the latter case (Acquasanta Terme), with moderate-magnitude aftershocks following the (missed) 2016 August 24th, Mw = 6.0 Amatrice main shock. Our results highlight that the parameterizations adopted by the design code are not sufficient to reproduce the real ground shaking occurring during earthquakes. This means that the Vs30 parameter ignores three-dimensional and frequency-dependent effects, as well as the influence of the near-surface geology deeper than 30 meters.
Calibration of the Nikon 200 for Close Range Photogrammetry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sheriff, Lassana; /City Coll., N.Y. /SLAC
2010-08-25
The overall objective of this project is to study the stability and reproducibility of the calibration parameters of the Nikon D200 camera with a Nikkor 20 mm lens for close-range photogrammetric surveys. The well known 'central perspective projection' model is used to determine the camera parameters for interior orientation. The Brown model extends it with the introduction of radial distortion and other less critical variables. The calibration process requires a dense network of targets to be photographed at different angles. For faster processing, reflective coded targets are chosen. Two scenarios have been used to check the reproducibility of the parameters. The first one is using a flat 2D wall with 141 coded targets and 12 custom targets that were previously measured with a laser tracker. The second one is a 3D Unistrut structure with a combination of coded targets and 3D reflective spheres. The study has shown that this setup is only stable during a short period of time. In conclusion, this camera is acceptable when calibrated before each use. Future work should include actual field tests and possible mechanical improvements, such as securing the lens to the camera body.
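A minimal sketch of the projection model referred to above is a central perspective (pin-hole) projection followed by Brown radial distortion with two coefficients. The parameter values below are invented; a real calibration estimates them, together with decentring terms and the principal point, from the photographed target network.

```python
# Pin-hole projection plus Brown radial distortion (two coefficients).

import numpy as np

def project(point_cam, focal=20.0, k1=-2.0e-4, k2=1.0e-7):
    """Project a 3D point (camera frame) to distorted image coordinates (mm)."""
    X, Y, Z = point_cam
    x, y = focal * X / Z, focal * Y / Z        # ideal (pin-hole) projection
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2      # Brown radial distortion
    return np.array([x * factor, y * factor])

print(project((0.3, -0.1, 2.0)))   # slightly pulled toward the image centre
```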
A Practical Guide for Improving Transparency and Reproducibility in Neuroimaging Research
Poldrack, Russell A.
2016-01-01
Recent years have seen an increase in alarming signals regarding the lack of replicability in neuroscience, psychology, and other related fields. To avoid a widespread crisis in neuroimaging research and consequent loss of credibility in the public eye, we need to improve how we do science. This article aims to be a practical guide for researchers at any stage of their careers that will help them make their research more reproducible and transparent while minimizing the additional effort that this might require. The guide covers three major topics in open science (data, code, and publications) and offers practical advice as well as highlighting advantages of adopting more open research practices that go beyond improved transparency and reproducibility. PMID:27389358
Optimizing multi-dimensional high throughput screening using zebrafish.
Truong, Lisa; Bugel, Sean M; Chlebowski, Anna; Usenko, Crystal Y; Simonich, Michael T; Simonich, Staci L Massey; Tanguay, Robert L
2016-10-01
The use of zebrafish for high throughput screening (HTS) for chemical bioactivity assessments is becoming routine in the fields of drug discovery and toxicology. Here we report current recommendations from our experiences in zebrafish HTS. We compared the effects of different high throughput chemical delivery methods on nominal water concentration, chemical sorption to multi-well polystyrene plates, transcription responses, and resulting whole animal responses. We demonstrate that digital dispensing consistently yields higher data quality and reproducibility compared to standard plastic tip-based liquid handling. Additionally, we illustrate the challenges in using this sensitive model for chemical assessment when test chemicals have trace impurities. Adaptation of these better practices for zebrafish HTS should increase reproducibility across laboratories. Copyright © 2016 Elsevier Inc. All rights reserved.
Program Instrumentation and Trace Analysis
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Goldberg, Allen; Filman, Robert; Rosu, Grigore; Koga, Dennis (Technical Monitor)
2002-01-01
Several attempts have been made recently to apply techniques such as model checking and theorem proving to the analysis of programs. This can be seen as part of a current trend to analyze real software systems instead of just their designs. It includes our own effort to develop a model checker for Java, the Java PathFinder 1, one of the very first of its kind in 1998. However, model checking cannot handle very large programs without some kind of abstraction of the program. This paper describes a complementary, scalable technique to handle such large programs. Our interest is focused on the observation part of the equation: how much information can be extracted about a program from observing a single execution trace? It is our intention to develop a technology that can be applied automatically and to large full-size applications, with minimal modification to the code. We present a tool, Java PathExplorer (JPaX), for exploring execution traces of Java programs. The tool prioritizes scalability over completeness and is directed towards detecting errors in programs, not towards proving correctness. One core element in JPaX is an instrumentation package that allows Java bytecode files to be instrumented so that they log various events when executed. The instrumentation is driven by a user-provided script that specifies what information to log. Examples of instructions that such a script can contain are: 'report name and arguments of all called methods defined in class C, together with a timestamp'; 'report all updates to all variables'; and 'report all acquisitions and releases of locks'. In more complex instructions one can specify that certain expressions should be evaluated and even that certain code should be executed under various conditions. The instrumentation package can hence be seen as implementing Aspect Oriented Programming for Java, in the sense that one can add functionality to a Java program without explicitly changing the code of the original program: one writes an aspect and compiles it into the original program using the instrumentation. Another core element of JPaX is an observation package that supports the analysis of the generated event stream. Two kinds of analysis are currently supported. In temporal analysis the execution trace is evaluated against formulae written in temporal logic. We have implemented a temporal logic evaluator on finite traces using the Maude rewriting system from SRI International, USA. Temporal logic is defined in Maude by giving its syntax as a signature and its semantics as rewrite equations. The resulting evaluator is extremely efficient and can handle event streams of hundreds of millions of events in a few minutes. Furthermore, the implementation is very succinct. The second form of event stream analysis supported is error pattern analysis, where an execution trace is analyzed using various error detection algorithms that can identify error-prone programming practices that may potentially lead to errors in other executions. Two such algorithms focusing on concurrency errors have been implemented in JPaX, one for deadlocks and the other for data races. It is important to note that a deadlock or data race does not need to occur in order for its potential to be detected with these algorithms. This is what makes them very scalable in practice. The data race algorithm implemented is the Eraser algorithm from Compaq, adapted to Java.
The tool is currently being applied to a code base for controlling a spacecraft by the developers of that software in order to evaluate its applicability.
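To make the error-pattern-analysis idea concrete, the sketch below shows a minimal lockset (Eraser-style) check over a logged event trace, of the kind JPaX applies to Java programs. The event tuple format, field names, and the example trace are hypothetical illustrations, not JPaX's actual log format or API.

```python
# Minimal Eraser-style lockset analysis over a trace of logged events.
# Event format: (thread, op, target), op in {'acquire','release','read','write'}.

def lockset_analysis(trace):
    held = {}        # thread -> set of locks currently held
    candidates = {}  # shared variable -> candidate set of protecting locks
    races = set()
    for thread, op, target in trace:
        locks = held.setdefault(thread, set())
        if op == 'acquire':
            locks.add(target)
        elif op == 'release':
            locks.discard(target)
        else:  # 'read' or 'write' of a shared variable
            if target not in candidates:
                candidates[target] = set(locks)   # first access initializes the set
            else:
                candidates[target] &= locks       # refine by intersection
            if not candidates[target]:
                races.add(target)                 # no common lock protects this variable
    return races

# Example: thread B writes 'x' without holding the lock that thread A used.
events = [
    ('A', 'acquire', 'L'), ('A', 'write', 'x'), ('A', 'release', 'L'),
    ('B', 'write', 'x'),
]
print(lockset_analysis(events))  # {'x'}
```

As in the abstract, the race potential is reported even though no race actually manifested in this particular interleaving.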
NASA Technical Reports Server (NTRS)
Denney, Ewen W.; Fischer, Bernd
2009-01-01
Model-based development and automated code generation are increasingly used for production code in safety-critical applications, but since code generators are typically not qualified, the generated code must still be fully tested, reviewed, and certified. This is particularly arduous for mathematical and control engineering software, which requires reviewers to trace subtle details of textbook formulas and algorithms to the code, and to match requirements (e.g., physical units or coordinate frames) not represented explicitly in models or code. Both tasks are complicated by the often opaque nature of auto-generated code. We address these problems by developing a verification-driven approach to traceability and documentation. We apply the AUTOCERT verification system to identify and then verify mathematical concepts in the code, based on a mathematical domain theory, and then use these verified traceability links between concepts, code, and verification conditions to construct a natural language report that provides a high-level structured argument explaining why and how the code uses the assumptions and complies with the requirements. We have applied our approach to generate review documents for several sub-systems of NASA's Project Constellation.
Force-Free Magnetic Fields Calculated from Automated Tracing of Coronal Loops with AIA/SDO
NASA Astrophysics Data System (ADS)
Aschwanden, M. J.
2013-12-01
One of the most realistic magnetic field models of the solar corona is a nonlinear force-free field (NLFFF) solution. There exist about a dozen numeric codes that compute NLFFF solutions based on extrapolations of photospheric vector magnetograph data. However, since the photosphere and lower chromosphere are not force-free, a suitable correction has to be applied to the lower boundary condition. Despite such "pre-processing" corrections, the resulting theoretical magnetic field lines deviate substantially from observed coronal loop geometries. Here we develop an alternative method that fits an analytical NLFFF approximation to the observed geometry of coronal loops. The 2D coordinates of coronal loop structures observed with AIA/SDO are traced with the "Oriented Coronal CUrved Loop Tracing" (OCCULT-2) code, an automated pattern recognition algorithm whose loop tracing has been demonstrated to match visual perception. A potential magnetic field solution is then derived from a line-of-sight magnetogram observed with HMI/SDO, and an analytical NLFFF approximation is forward-fitted to the twisted geometry of the coronal loops. We demonstrate the performance of this magnetic field modeling method for a number of solar active regions, before and after major flares observed with SDO. The difference between the NLFFF and potential field energies then allows us to compute the free magnetic energy, which is an upper limit of the energy that is released during a solar flare.
Predictions of Critical Heat Flux in Annular Pipes with TRACEv4.160 code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jasiulevicius, Audrius; Macian-Juan, Rafael
2006-07-01
This paper presents the assessment of TRACE (version v4.160) against the Critical Heat Flux (CHF) experiments in annular tubes performed at the Royal Institute of Technology (KTH) in Stockholm, Sweden. The experimental database includes data for coolant mass fluxes between 250 and 2500 kg/(m^2 s) and inlet subcooling of 10 and 40 K at a pressure of 70 bar. The work presented in this paper supplements the calculations of single round tube experiments carried out earlier and provides a broader scope of validated geometries. In addition to the Biasi and CISE-GE CHF correlations available in the code, a number of experimental points at low flow conditions are available for the annular geometry experiments, which also permitted the assessment of the Biasi/Zuber CHF correlation used in TRACE v4.160 for low flow conditions. Experiments with different axial power distributions were simulated, and the effects of the axial power profile and the coolant inlet subcooling on the TRACE predictions were investigated. The results of this work show that the Biasi/Zuber correlation provides a good estimation of the CHF at 70 bar and that, for the same conditions, the simulation of the annular experiments resulted in the calculation of lower CHF values compared to single-tube experiments. The analysis of the performance of the standard TRACE CHF correlations shows that the CISE-GE correlation yields critical qualities (quality at CHF) closer to the experimental values at 70 bar than the Biasi correlation for annular flow conditions. Regarding the power profile, the results of the TRACE calculations seem to be very sensitive to its shape, since, depending on the profile, different accuracies in the predictions were noted while other system conditions remained constant. The inlet coolant subcooling was also an important factor in the accuracy of the TRACE CHF predictions. Thus, an increase in the inlet subcooling led to a clear improvement in the estimation of the critical quality with both the Biasi and CISE-GE correlations. To complement the work, three additional CHF correlations were implemented in TRACE v4.160, namely the Bowring, Tong W-3 and Levitan-Lantsman CHF models, in order to assess their applicability to simulating CHF in annular tubes. An improvement of the CHF predictions for low coolant mass flows (up to 1500 kg/(m^2 s)) is noted when applying the Bowring CHF correlation. However, an increase in the inlet subcooling increases the error in the critical quality predicted with the Bowring correlation. The Levitan-Lantsman and Tong W-3 correlations provide results similar to the Biasi model. Therefore, the most accurate CHF predictions among the investigated correlations were obtained using the CISE-GE model in the standard TRACE v4.160 code. (authors)
Glenn, Rachel; Dantus, Marcos
2016-01-07
Recent success with trace explosives detection based on the single ultrafast pulse excitation for remote stimulated Raman scattering (SUPER-SRS) prompts us to provide new results and a Perspective that describes the theoretical foundation of the strategy used for achieving the desired sensitivity and selectivity. SUPER-SRS provides fast and selective imaging while being blind to optical properties of the substrate such as color, texture, or laser speckle. We describe the strategy of combining coherent vibrational excitation with a reference pulse in order to detect stimulated Raman gain or loss. A theoretical model is used to reproduce experimental spectra and to determine the ideal pulse parameters for best sensitivity, selectivity, and resolution when detecting one or more compounds simultaneously.
Quantum-dot-tagged microbeads for multiplexed optical coding of biomolecules.
Han, M; Gao, X; Su, J Z; Nie, S
2001-07-01
Multicolor optical coding for biological assays has been achieved by embedding different-sized quantum dots (zinc sulfide-capped cadmium selenide nanocrystals) into polymeric microbeads at precisely controlled ratios. Their novel optical properties (e.g., size-tunable emission and simultaneous excitation) render these highly luminescent quantum dots (QDs) ideal fluorophores for wavelength-and-intensity multiplexing. The use of 10 intensity levels and 6 colors could theoretically code one million nucleic acid or protein sequences. Imaging and spectroscopic measurements indicate that the QD-tagged beads are highly uniform and reproducible, yielding bead identification accuracies as high as 99.99% under favorable conditions. DNA hybridization studies demonstrate that the coding and target signals can be simultaneously read at the single-bead level. This spectral coding technology is expected to open new opportunities in gene expression studies, high-throughput screening, and medical diagnostics.
NASA Technical Reports Server (NTRS)
Brieda, Lubos
2015-01-01
This talk presents three tools developed recently for contamination analysis: (1) an HTML QCM analyzer, which runs in a web browser and allows for data analysis of QCM log files; (2) a Java RGA extractor, which can load multiple SRS.ana files and extract pressure vs. time data; and (3) a C++ contamination simulation code, a 3D particle tracing code for modeling the transport of dust particulates and molecules. The simulation code uses residence time to determine whether molecules stick; particulates can be sampled from IEST-STD-1246 and accelerated by aerodynamic forces.
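As a rough illustration of a residence-time sticking test of the kind the simulation code is described as using, the sketch below applies a Frenkel-type residence time with an assumed criterion and invented constants; the threshold, activation energy, and function names are illustrative assumptions, not values or logic from the actual code.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def residence_time(E_a, T, tau0=1e-13):
    """Frenkel-type surface residence time: tau = tau0 * exp(E_a / (R*T))."""
    return tau0 * math.exp(E_a / (R * T))

def sticks(E_a, T, timestep):
    """Assumed criterion: treat a molecule as adsorbed if its expected
    residence time on the surface exceeds the simulation timestep."""
    return residence_time(E_a, T) > timestep

# Example: 80 kJ/mol binding energy on a 300 K surface, 1 ms timestep.
print(sticks(80e3, 300.0, 1e-3))  # True -> molecule counted as stuck
```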
Multi-code analysis of scrape-off layer filament dynamics in MAST
NASA Astrophysics Data System (ADS)
Militello, F.; Walkden, N. R.; Farley, T.; Gracias, W. A.; Olsen, J.; Riva, F.; Easy, L.; Fedorczak, N.; Lupelli, I.; Madsen, J.; Nielsen, A. H.; Ricci, P.; Tamain, P.; Young, J.
2016-11-01
Four numerical codes are employed to investigate the dynamics of scrape-off layer filaments in tokamak relevant conditions. Experimental measurements were taken in the MAST device using visual camera imaging, which allows the evaluation of the perpendicular size and velocity of the filaments, as well as the combination of density and temperature associated with the perturbation. A new algorithm based on the light emission integrated along the field lines associated with the position of the filament is developed to ensure that it is properly detected and tracked. The filaments are found to have velocities of the order of 1 km/s, a perpendicular diameter of around 2-3 cm and a density amplitude 2-3.5 times the background plasma. 3D and 2D numerical codes (the STORM module of BOUT++, GBS, HESEL and TOKAM3X) are used to reproduce the motion of the observed filaments with the purpose of validating the codes and of better understanding the experimental data. Good agreement is found between the 3D codes. The seeded filament simulations are also able to reproduce the dynamics observed in experiments with accuracy within the experimental error bar levels. In addition, the numerical results showed that filaments characterised by similar size and light emission intensity can have quite different dynamics if the pressure perturbation is distributed differently between density and temperature components. As an additional benefit, several observations on the dynamics of the filaments in the presence of evolving temperature fields were made and led to a better understanding of the behaviour of these coherent structures.
NASA Technical Reports Server (NTRS)
Potapczuk, Mark G.; Berkowitz, Brian M.
1989-01-01
An investigation of the ice accretion pattern and performance characteristics of a multi-element airfoil was undertaken in the NASA Lewis 6- by 9-Foot Icing Research Tunnel. Several configurations of main airfoil, slat, and flaps were employed to examine the effects of ice accretion and provide further experimental information for code validation purposes. The test matrix consisted of glaze, rime, and mixed icing conditions. Airflow and icing cloud conditions were set to correspond to those typical of the operating environment anticipated for a commercial transport vehicle. Results obtained included ice profile tracings, photographs of the ice accretions, and force balance measurements obtained both during the accretion process and in a post-accretion evaluation over a range of angles of attack. The tracings and photographs indicated significant accretions on the slat leading edge, in gaps between slat or flaps and the main wing, on the flap leading-edge surfaces, and on flap lower surfaces. Force measurements indicate the possibility of severe performance degradation, especially near C_Lmax, for both light and heavy ice accretions. The results were also used to evaluate ice accretion and performance analysis codes presently in use. The LEWICE code was used to evaluate the ice accretion shape developed during one of the rime ice tests. The actual ice shape was then evaluated, using a Navier-Stokes code, for changes in performance characteristics. These predicted results were compared to the measured results and indicate very good agreement.
Application of a GPU-Assisted Maxwell Code to Electromagnetic Wave Propagation in ITER
NASA Astrophysics Data System (ADS)
Kubota, S.; Peebles, W. A.; Woodbury, D.; Johnson, I.; Zolfaghari, A.
2014-10-01
The Low Field Side Reflectometer (LSFR) on ITER is envisioned to provide capabilities for electron density profile and fluctuations measurements in both the plasma core and edge. The current design for the Equatorial Port Plug 11 (EPP11) employs seven monostatic antennas for use with both fixed-frequency and swept-frequency systems. The present work examines the characteristics of this layout using the 3-D version of the GPU-Assisted Maxwell Code (GAMC-3D). Previous studies in this area were performed with either 2-D full wave codes or 3-D ray- and beam-tracing. GAMC-3D is based on the FDTD method and can be run with either a fixed-frequency or modulated (e.g. FMCW) source, and with either a stationary or moving target (e.g. Doppler backscattering). The code is designed to run on a single NVIDIA Tesla GPU accelerator, and utilizes a technique based on the moving window method to overcome the size limitation of the onboard memory. Effects such as beam drift, linear mode conversion, and diffraction/scattering will be examined. Comparisons will be made with beam-tracing calculations using the complex eikonal method. Supported by U.S. DoE Grants DE-FG02-99ER54527 and DE-AC02-09CH11466, and the DoE SULI Program at PPPL.
Optimization of immunolabeling and clearing techniques for indelibly-labeled memory traces.
Pavlova, Ina P; Shipley, Shannon C; Lanio, Marcos; Hen, René; Denny, Christine A
2018-04-16
Recent genetic tools have allowed researchers to visualize and manipulate memory traces (i.e., engrams) in small brain regions. However, the ultimate goal is to visualize memory traces across the entire brain in order to better understand how memories are stored in neural networks and how multiple memories may coexist. Intact tissue clearing and imaging is a new and rapidly growing area of focus that could accomplish this task. Here, we utilized the leading protocols for whole-brain clearing and applied them to the ArcCreERT2 mice, a murine line that allows for the indelible labeling of memory traces. We found that CLARITY and PACT greatly distorted the tissue, and iDISCO quenched enhanced yellow fluorescent protein (EYFP) fluorescence and hindered immunolabeling. Alternative clearing solutions, such as tert-butanol, circumvented these harmful effects, but still did not permit whole-brain immunolabeling. CUBIC and CUBIC with Reagent-1A produced improved antibody penetration and preserved EYFP fluorescence, but also did not allow for whole-brain memory trace visualization. Modification of CUBIC with Reagent-1A resulted in EYFP fluorescence preservation and immunolabeling of the immediate early gene (IEG) Arc in deep brain areas; however, optimized memory trace labeling still required slicing into mm-thick tissue sections. In summary, our data show that CUBIC with Reagent-1A* is the ideal method for reproducible clearing and immunolabeling for the visualization of memory traces in mm-thick tissue sections from ArcCreERT2 mice. This article is protected by copyright. All rights reserved. © 2018 Wiley Periodicals, Inc.
Boulyga, Sergei F; Loreti, Valeria; Bettmer, Jörg; Heumann, Klaus G
2004-09-01
Size exclusion chromatography (SEC) was coupled on-line to inductively coupled plasma mass spectrometry (ICP-MS) for a speciation study of trace metals in cancerous thyroid tissues in comparison to healthy thyroids, aimed at estimating changes in metalloprotein speciation in pathological tissue. The study showed the presence of species binding Cu, Zn, Cd and Pb in healthy thyroid tissue, with good reproducibility of the chromatographic results, whereas the same species could not be detected in cancerous tissues. Thus, remarkable differences with respect to metal-binding species were revealed between healthy and pathological thyroid samples, pointing to a completely different distribution of trace metals in cancerous tissues. The metal-binding species could not be identified within the scope of this work because of a lack of appropriate standards. Nevertheless, the results obtained confirm the suitability of SEC-ICP-MS for monitoring changes in trace metal distribution in cancerous tissue and will help to better understand the role of metal-containing species in thyroid pathology.
Software Writing Skills for Your Research - Lessons Learned from Workshops in the Geosciences
NASA Astrophysics Data System (ADS)
Hammitzsch, Martin
2016-04-01
Findings presented in scientific papers are based on data and software. Once in a while they come along with data, but not commonly with software. However, the software used to obtain the findings plays a crucial role in the scientific work. Nevertheless, software is rarely seen as publishable. Thus researchers may not be able to reproduce the findings without the software, which is in conflict with the principle of reproducibility in science. For both the writing of publishable software and the reproducibility issue, the quality of software is of utmost importance. For many programming scientists the treatment of source code, e.g. with code design, version control, documentation, and testing, is associated with additional work that is not covered in the primary research task. This includes the adoption of processes following the software development life cycle. However, the adoption of software engineering rules and best practices has to be recognized and accepted as part of the scientific performance. Most scientists have little incentive to improve code and do not publish code, because software engineering habits are rarely practised by researchers or students. Software engineering skills are not passed on to followers the way paper-writing skills are. Thus it is often felt that the software or code produced is not publishable. The quality of software and its source code has a decisive influence on the quality of the research results obtained and their traceability. So establishing best practices from software engineering to serve scientific needs is crucial for the success of scientific software. Even though scientists use existing software and code, e.g. from open source software repositories, only few contribute their code back into the repositories. Writing and opening code for Open Science means that subsequent users are able to run the code, e.g. through the provision of sufficient documentation, sample data sets, tests and comments, which in turn can be proven by adequate and qualified reviews. This assumes that scientists learn to write and release code and software as they learn to write and publish papers. Having this in mind, software could be valued and assessed as a contribution to science. But this requires the relevant skills that can be passed to colleagues and followers. Therefore, the GFZ German Research Centre for Geosciences performed three workshops in 2015 to address the passing of software writing skills to young scientists, the next generation of researchers in the Earth, planetary and space sciences. Experiences in running these workshops and the lessons learned are summarized in this presentation. The workshops received support and funding from Software Carpentry, a volunteer organization whose goal is to make scientists more productive, and their work more reliable, by teaching them basic computing skills, and from FOSTER (Facilitate Open Science Training for European Research), a two-year, EU-funded (FP7) project whose goal is to produce a European-wide training programme that will help to incorporate Open Access approaches into existing research methodologies and to integrate Open Science principles and practice in the current research workflow by targeting young researchers and other stakeholders.
1977-02-10
SSPARAMA: A Nonlinear, Wave Optics Multipulse (and CW) Steady-State Propagation Code with Adaptive Coordinates. K. G. Whitney.
New developments of the CARTE thermochemical code: A two-phase equation of state for nanocarbons
NASA Astrophysics Data System (ADS)
Dubois, Vincent; Pineau, Nicolas
2016-01-01
We developed a new equation of state (EOS) for nanocarbons in the thermodynamic range of high explosives detonation products (up to 50 GPa and 4000 K). This EOS was fitted to an extensive database of thermodynamic properties computed by molecular dynamics simulations of nanodiamonds and nano-onions with the LCBOPII potential. We reproduced the detonation properties of a variety of high explosives with the CARTE thermochemical code, including carbon-poor and carbon-rich explosives, with excellent accuracy.
NASA Astrophysics Data System (ADS)
Schiavi, A.; Senzacqua, M.; Pioli, S.; Mairani, A.; Magro, G.; Molinelli, S.; Ciocca, M.; Battistoni, G.; Patera, V.
2017-09-01
Ion beam therapy is a rapidly growing technique for tumor radiation therapy. Ions allow for a high dose deposition in the tumor region, while sparing the surrounding healthy tissue. For this reason, the highest possible accuracy in the calculation of dose and its spatial distribution is required in treatment planning. On one hand, commonly used treatment planning software solutions adopt a simplified beam-body interaction model by remapping pre-calculated dose distributions into a 3D water-equivalent representation of the patient morphology. On the other hand, Monte Carlo (MC) simulations, which explicitly take into account all the details in the interaction of particles with human tissues, are considered to be the most reliable tool to address the complexity of mixed field irradiation in a heterogeneous environment. However, full MC calculations are not routinely used in clinical practice because they typically demand substantial computational resources. Therefore MC simulations are usually only used to check treatment plans for a restricted number of difficult cases. The advent of general-purpose programmable GPU cards prompted the development of trimmed-down MC-based dose engines which can significantly reduce the time needed to recalculate a treatment plan with respect to standard MC codes on CPU hardware. In this work, we report on the development of fred, a new MC simulation platform for treatment planning in ion beam therapy. The code can transport particles through a 3D voxel grid using a class II MC algorithm. Both primary and secondary particles are tracked and their energy deposition is scored along the trajectory. Effective models for particle-medium interaction have been implemented, balancing accuracy in dose deposition with computational cost. Currently, the most refined module is the transport of proton beams in water: single pencil beam dose-depth distributions obtained with fred agree with those produced by standard MC codes within 1-2% of the Bragg peak in the therapeutic energy range. A comparison with measurements taken at the CNAO treatment center shows that the lateral dose tails are reproduced within 2% in the field size factor test up to 20 cm. The tracing kernel can run on GPU hardware, achieving 10 million primaries per second on a single card. This performance allows one to recalculate a proton treatment plan at 1% of the total particles in just a few minutes.
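To illustrate the voxel-grid dose scoring that such engines perform while transporting particles, the toy sketch below spreads the energy lost along a straight step over the voxels it crosses. The fixed-substep sampling, the grid dimensions, and all numbers are deliberate simplifications for illustration, not fred's actual algorithm.

```python
import numpy as np

def score_step(dose, voxel_size, start, end, energy_deposited, n_sub=10):
    """Spread the energy lost along a straight step over the voxels crossed,
    by depositing equal fractions at n_sub midpoints along the step."""
    start, end = np.asarray(start, float), np.asarray(end, float)
    de = energy_deposited / n_sub
    for s in (np.arange(n_sub) + 0.5) / n_sub:
        point = start + s * (end - start)
        i, j, k = (point // voxel_size).astype(int)
        if 0 <= i < dose.shape[0] and 0 <= j < dose.shape[1] and 0 <= k < dose.shape[2]:
            dose[i, j, k] += de

grid = np.zeros((100, 100, 100))   # 100^3 voxels of 1 mm (hypothetical geometry)
score_step(grid, 1.0, (50, 50, 0), (50, 50, 30), energy_deposited=5.0)
print(grid.sum())                  # 5.0 (hypothetical energy units) along the track
```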
Dieltjes, Patrick; Mieremet, René; Zuniga, Sofia; Kraaijenbrink, Thirsa; Pijpe, Jeroen; de Knijff, Peter
2011-07-01
Exploring technological limits is a common practice in forensic DNA research. Reliable genetic profiling based on only a few cells isolated from trace material retrieved from a crime scene is nowadays more and more the rule rather than the exception. At many crime scenes, cartridges, bullets, and casings (jointly abbreviated as CBCs) are regularly found, and even after firing, these potentially carry trace amounts of biological material. Since 2003, the Forensic Laboratory for DNA Research has been routinely involved in the forensic investigation of CBCs in the Netherlands. Reliable DNA profiles were frequently obtained from CBCs and used to match suspects, victims, or other crime scene-related DNA traces. In this paper, we describe the sensitive method developed by us to extract DNA from CBCs. Using PCR-based genotyping of autosomal short tandem repeats, we were able to obtain reliable and reproducible DNA profiles in 163 out of 616 criminal cases (26.5%) and in 283 out of 4,085 individual CBC items (6.9%) during the period January 2003-December 2009. We discuss practical aspects of the method and the sometimes unexpected effects of using cell lysis buffer on the subsequent investigation of striation patterns on CBCs.
Tools for open geospatial science
NASA Astrophysics Data System (ADS)
Petras, V.; Petrasova, A.; Mitasova, H.
2017-12-01
Open science uses open source to deal with reproducibility challenges in data and computational sciences. However, just using open source software or making the code public does not make the research reproducible. Moreover, the scientists face the challenge of learning new unfamiliar tools and workflows. In this contribution, we will look at a graduate-level course syllabus covering several software tools which make validation and reuse by a wider professional community possible. For the novices in the open science arena, we will look at how scripting languages such as Python and Bash help us reproduce research (starting with our own work). Jupyter Notebook will be introduced as a code editor, data exploration tool, and a lab notebook. We will see how Git helps us not to get lost in revisions and how Docker is used to wrap all the parts together using a single text file so that figures for a scientific paper or a technical report can be generated with a single command. We will look at examples of software and publications in the geospatial domain which use these tools and principles. Scientific contributions to GRASS GIS, a powerful open source desktop GIS and geoprocessing backend, will serve as an example of why and how to publish new algorithms and tools as part of a bigger open source project.
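The following minimal script illustrates the "one command regenerates the figure" workflow described above; the file names and synthetic data are hypothetical stand-ins for a real analysis, not material from the course itself.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")            # headless backend, e.g. inside a Docker container
import matplotlib.pyplot as plt

def make_figure(out_path="figure1.png", seed=42):
    rng = np.random.default_rng(seed)             # fixed seed -> identical output every run
    elevation = np.cumsum(rng.normal(size=200))   # stand-in for real geospatial data
    fig, ax = plt.subplots()
    ax.plot(elevation)
    ax.set_xlabel("distance along profile")
    ax.set_ylabel("relative elevation")
    fig.savefig(out_path, dpi=150)

if __name__ == "__main__":
    make_figure()   # `python make_figure.py` rebuilds the paper figure from scratch
```

Committing a script like this to Git, and pinning its environment in a container, is what makes the figure reproducible by a reviewer with a single command.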
NASA Astrophysics Data System (ADS)
Ojha, Narendra; Pozzer, Andrea; Jöckel, Patrick; Fischer, Horst; Zahn, Andreas; Tomsche, Laura; Lelieveld, Jos
2017-04-01
The Asian monsoon convection redistributes trace species, affecting the tropospheric chemistry and radiation budget over Asia and downwind as far as the Mediterranean. It remains challenging to model these impacts due to uncertainties, e.g. associated with the convection parameterization and input emissions. Here, we perform a series of numerical experiments using the global ECHAM5/MESSy atmospheric chemistry model (EMAC) to investigate the tropospheric distribution of O3 and related tracers measured during the Oxidation Mechanism Observations (OMO) campaign conducted during July-August 2015. The reference simulation can reproduce the spatio-temporal variations to some extent (e.g. r2 = 0.7 for O3, 0.6 for CO). However, this simulation underestimates mean CO in the lower troposphere by about 30 ppbv and overestimates mean O3 by up to 35 ppbv, especially in the middle-upper troposphere. Interestingly, sensitivity simulations with 50% higher biofuel emissions of CO over South Asia had an insignificant effect on the CO underestimation, pointing to sources upwind of South Asia. Use of an alternative convection parameterization is found to significantly improve simulated O3. The study reveals the abilities as well as the limitations of the model to reproduce observations and to study atmospheric chemistry and climate implications of the monsoon.
Giles, Tracey M; de Lacey, Sheryl; Muir-Cochrane, Eimear
2016-01-01
Grounded theory method has been described extensively in the literature. Yet, the varying processes portrayed can be confusing for novice grounded theorists. This article provides a worked example of the data analysis phase of a constructivist grounded theory study that examined family presence during resuscitation in acute health care settings. Core grounded theory methods are exemplified, including initial and focused coding, constant comparative analysis, memo writing, theoretical sampling, and theoretical saturation. The article traces the construction of the core category "Conditional Permission" from initial and focused codes, subcategories, and properties, through to its position in the final substantive grounded theory.
Performance Analysis of Multilevel Parallel Applications on Shared Memory Architectures
NASA Technical Reports Server (NTRS)
Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Caubet, Jordi; Biegel, Bryan A. (Technical Monitor)
2002-01-01
In this paper we describe how to apply powerful performance analysis techniques to understand the behavior of multilevel parallel applications. We use the Paraver/OMPItrace performance analysis system for our study. This system consists of two major components: the OMPItrace dynamic instrumentation mechanism, which allows the tracing of processes and threads, and the Paraver graphical user interface for inspection and analysis of the generated traces. We describe how to use the system to conduct a detailed comparative study of a benchmark code implemented in five different programming paradigms applicable for shared memory architectures.
NUSC Technical Publications Guide.
1985-05-01
Acknowledges the assistance of facility personnel, especially A. Castelluzzo, E. Deland, J. Gesel, and E. Szlosek (all of Code 4343). Reviewed and approved: 14 July 1980. Responsibilities include reviewing technical content and format, and approving the manual outline, the review manuscript, and the final camera-reproducible copy.
Patient training in respiratory-gated radiotherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kini, Vijay R.; Vedam, Subrahmanya S.; Keall, Paul J.
2003-03-31
Respiratory gating is used to counter the effects of organ motion during radiotherapy for chest tumors. The effects of variations in patient breathing patterns during a single treatment and from day to day are unknown. We evaluated the feasibility of using patient training tools and their effect on the breathing cycle regularity and reproducibility during respiratory-gated radiotherapy. To monitor respiratory patterns, we used a component of a commercially available respiratory-gated radiotherapy system (Real Time Position Management (RPM) System, Varian Oncology Systems, Palo Alto, CA 94304). This passive marker video tracking system consists of reflective markers placed on the patient's chest or abdomen, which are detected by a wall-mounted video camera. Software installed on a PC interfaced to this camera detects the marker motion digitally and records it. The marker position as a function of time serves as the motion signal that may be used to trigger imaging or treatment. The training tools used were audio prompting and visual feedback, with free breathing as a control. The audio prompting method used instructions to 'breathe in' or 'breathe out' at periodic intervals deduced from patients' own breathing patterns. In the visual feedback method, patients were shown a real-time trace of their abdominal wall motion due to breathing. Using this, they were asked to maintain a constant amplitude of motion. Motion traces of the abdominal wall were recorded for each patient for various maneuvers. Free breathing showed a variable amplitude and frequency. Audio prompting resulted in a reproducible frequency; however, the variability and the magnitude of amplitude increased. Visual feedback gave a better control over the amplitude but showed minor variations in frequency. We concluded that training improves the reproducibility of amplitude and frequency of patient breathing cycles. This may increase the accuracy of respiratory-gated radiation therapy.
Examining the Reproducibility of 6 Published Studies in Public Health Services and Systems Research.
Harris, Jenine K; B Wondmeneh, Sarah; Zhao, Yiqiang; Leider, Jonathon P
2018-02-23
Research replication, or repeating a study de novo, is the scientific standard for building evidence and identifying spurious results. While replication is ideal, it is often expensive and time consuming. Reproducibility, or reanalysis of data to verify published findings, is one proposed minimum alternative standard. While a lack of research reproducibility has been identified as a serious and prevalent problem in biomedical research and a few other fields, little work has been done to examine the reproducibility of public health research. We examined reproducibility in 6 studies from the public health services and systems research subfield of public health research. Following the methods described in each of the 6 papers, we computed the descriptive and inferential statistics for each study. We compared our results with the original study results and examined the percentage differences in descriptive statistics and differences in effect size, significance, and precision of inferential statistics. All project work was completed in 2017. We found consistency between original and reproduced results for each paper in at least 1 of the 4 areas examined. However, we also found some inconsistency. We identified incorrect transcription of results and omitting detail about data management and analyses as the primary contributors to the inconsistencies. Increasing reproducibility, or reanalysis of data to verify published results, can improve the quality of science. Researchers, journals, employers, and funders can all play a role in improving the reproducibility of science through several strategies including publishing data and statistical code, using guidelines to write clear and complete methods sections, conducting reproducibility reviews, and incentivizing reproducible science.
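The sketch below illustrates the kind of comparison described above: percentage differences between originally published descriptive statistics and values recomputed from the data. The statistic names and numbers are invented for illustration and are not results from the six studies examined.

```python
def percent_difference(original, reproduced):
    """Absolute percentage difference between a published and a recomputed value."""
    return 100.0 * abs(reproduced - original) / abs(original)

# Hypothetical published vs. recomputed descriptive statistics.
published  = {"mean_staff_per_capita": 0.52, "pct_accredited": 38.0}
recomputed = {"mean_staff_per_capita": 0.52, "pct_accredited": 36.5}

for stat, orig in published.items():
    diff = percent_difference(orig, recomputed[stat])
    flag = "consistent" if diff < 5 else "inconsistent"   # assumed tolerance
    print(f"{stat}: original={orig}, reproduced={recomputed[stat]}, "
          f"difference={diff:.1f}% ({flag})")
```

Publishing such a comparison script alongside the data is one concrete way to make a reproducibility review itself reproducible.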
A Tractography Comparison between Turboprop and Spin-Echo Echo-Planar Diffusion Tensor Imaging
Gui, Minzhi; Peng, Huiling; Carew, John D.; Lesniak, Maciej S.; Arfanakis, Konstantinos
2008-01-01
The development of accurate, non-invasive methods for mapping white matter fiber-tracts is of critical importance. However, fiber-tracking is typically performed on diffusion tensor imaging (DTI) data obtained with echo-planar-based imaging techniques (EPI), which suffer from susceptibility-related image artifacts, and image warping due to eddy-currents. Thus, a number of white matter fiber-bundles mapped using EPI-based DTI data are distorted and/or terminated early. This severely limits the clinical potential of fiber-tracking. In contrast, Turboprop-MRI provides images with significantly fewer susceptibility and eddy-current-related artifacts than EPI. The purpose of this work was to compare fiber-tracking results obtained from DTI data acquired with Turboprop-DTI and EPI-based DTI. It was shown that, in brain regions near magnetic field inhomogeneities, white matter fiber-bundles obtained with EPI-based DTI were distorted and/or partially detected, when magnetic susceptibility-induced distortions were not corrected. After correction, residual distortions were still present and several fiber-tracts remained partially detected. In contrast, when using Turboprop-DTI data, all traced fiber-tracts were in agreement with known anatomy. The inter-session reproducibility of tractography results was higher for Turboprop than EPI-based DTI data in regions near field inhomogeneities. Thus, Turboprop may be a more appropriate DTI data acquisition technique for tracing white matter fibers near regions with significant magnetic susceptibility differences, as well as in longitudinal studies of such fibers. However, the intra-session reproducibility of tractography results was higher for EPI-based than Turboprop DTI data. Thus, EPI-based DTI may be more advantageous for tracing fibers minimally affected by field inhomogeneities. PMID:18621131
NASA Astrophysics Data System (ADS)
Fourny, Anaïs.; Weis, Dominique; Scoates, James S.
2016-03-01
Controlling the accuracy and precision of geochemical analyses requires the use of characterized reference materials with matrices similar to those of the unknown samples being analyzed. We report a comprehensive Pb-Sr-Nd-Hf isotopic and trace element concentration data set, combined with quantitative phase analysis by XRD Rietveld refinement, for a wide range of mafic to ultramafic rock reference materials analyzed at the Pacific Centre for Isotopic and Geochemical Research, University of British Columbia. The samples include a pyroxenite (NIM-P), five basalts (BHVO-2, BIR-1a, JB-3, BE-N, GSR-3), a diabase (W-2), a dolerite (DNC-1), a norite (NIM-N), and an anorthosite (AN-G); results from a leucogabbro (Stillwater) are also reported. Individual isotopic ratios determined by MC-ICP-MS and TIMS, and multielement analyses by HR-ICP-MS are reported with 4-12 complete analytical duplicates for each sample. The basaltic reference materials have coherent Sr and Nd isotopic ratios with external precision below 50 ppm (2SD) and below 100 ppm for Hf isotopes (except BIR-1a). For Pb isotopic reproducibility, several of the basalts (JB-3, BHVO-2) require acid leaching prior to dissolution. The plutonic reference materials also have coherent Sr and Nd isotopic ratios (<50 ppm), however, obtaining good reproducibility for Pb and Hf isotopic ratios is more challenging for NIM-P, NIM-N, and AN-G due to a variety of factors, including postcrystallization Pb mobility and the presence of accessory zircon. Collectively, these results form a comprehensive new database that can be used by the geochemical community for evaluating the radiogenic isotope and trace element compositions of volcanic and plutonic mafic-ultramafic rocks.
Cyberinfrastructure to Support Collaborative and Reproducible Computational Hydrologic Modeling
NASA Astrophysics Data System (ADS)
Goodall, J. L.; Castronova, A. M.; Bandaragoda, C.; Morsy, M. M.; Sadler, J. M.; Essawy, B.; Tarboton, D. G.; Malik, T.; Nijssen, B.; Clark, M. P.; Liu, Y.; Wang, S. W.
2017-12-01
Creating cyberinfrastructure to support reproducibility of computational hydrologic models is an important research challenge. Addressing this challenge requires open and reusable code and data with machine and human readable metadata, organized in ways that allow others to replicate results and verify published findings. Specific digital objects that must be tracked for reproducible computational hydrologic modeling include (1) raw initial datasets, (2) data processing scripts used to clean and organize the data, (3) processed model inputs, (4) model results, and (5) the model code with an itemization of all software dependencies and computational requirements. HydroShare is a cyberinfrastructure under active development designed to help users store, share, and publish digital research products in order to improve reproducibility in computational hydrology, with an architecture supporting hydrologic-specific resource metadata. Researchers can upload data required for modeling, add hydrology-specific metadata to these resources, and use the data directly within HydroShare.org for collaborative modeling using tools like CyberGIS, Sciunit-CLI, and JupyterHub that have been integrated with HydroShare to run models using notebooks, Docker containers, and cloud resources. Current research aims to implement the Structure For Unifying Multiple Modeling Alternatives (SUMMA) hydrologic model within HydroShare to support hypothesis-driven hydrologic modeling while also taking advantage of the HydroShare cyberinfrastructure. The goal of this integration is to create the cyberinfrastructure that supports hypothesis-driven model experimentation, education, and training efforts by lowering barriers to entry, reducing the time spent on informatics technology and software development, and supporting collaborative research within and across research groups.
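One way to picture the five classes of digital objects listed above is as a machine-readable manifest shipped with the published resource. The sketch below is a hypothetical schema with invented file names and dependencies; it is not HydroShare's actual resource metadata model.

```python
import json

manifest = {
    "raw_data":      ["forcing/daymet_2010_2015.nc"],                      # (1) raw initial datasets
    "processing":    ["scripts/clean_forcing.py"],                          # (2) data processing scripts
    "model_inputs":  ["inputs/summa_settings.txt", "inputs/basin_attrs.nc"],# (3) processed model inputs
    "model_results": ["output/streamflow_daily.csv"],                       # (4) model results
    "model_code": {                                                         # (5) code and its environment
        "repository":   "https://example.org/my-summa-setup.git",
        "dependencies": ["summa==3.0", "python>=3.9", "xarray"],
        "container":    "example/summa-run:1.0",
    },
}

with open("reproducibility_manifest.json", "w") as f:
    json.dump(manifest, f, indent=2)   # stored alongside the shared resource
```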
Certifying Auto-Generated Flight Code
NASA Technical Reports Server (NTRS)
Denney, Ewen
2008-01-01
Model-based design and automated code generation are being used increasingly at NASA. Many NASA projects now use MathWorks Simulink and Real-Time Workshop for at least some of their modeling and code development. However, there are substantial obstacles to more widespread adoption of code generators in safety-critical domains. Since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently the generated code still needs to be fully tested and certified. Moreover, the regeneration of code can require complete recertification, which offsets many of the advantages of using a generator. Indeed, manual review of autocode can be more challenging than for hand-written code. Since the direct V&V of code generators is too laborious and complicated due to their complex (and often proprietary) nature, we have developed a generator plug-in to support the certification of the auto-generated code. Specifically, the AutoCert tool supports certification by formally verifying that the generated code is free of different safety violations, by constructing an independently verifiable certificate, and by explaining its analysis in a textual form suitable for code reviews. The generated documentation also contains substantial tracing information, allowing users to trace between model, code, documentation, and V&V artifacts. This enables missions to obtain assurance about the safety and reliability of the code without excessive manual V&V effort and, as a consequence, eases the acceptance of code generators in safety-critical contexts. The generation of explicit certificates and textual reports is particularly well-suited to supporting independent V&V. The primary contribution of this approach is the combination of human-friendly documentation with formal analysis. The key technical idea is to exploit the idiomatic nature of auto-generated code in order to automatically infer logical annotations. The annotation inference algorithm itself is generic, and parametrized with respect to a library of coding patterns that depend on the safety policies and the code generator. The patterns characterize the notions of definitions and uses that are specific to the given safety property. For example, for initialization safety, definitions correspond to variable initializations while uses are statements which read a variable, whereas for array bounds safety, definitions are the array declarations, while uses are statements which access an array variable. The inferred annotations are thus highly dependent on the actual program and the properties being proven. The annotations, themselves, need not be trusted, but are crucial to obtain the automatic formal verification of the safety properties without requiring access to the internals of the code generator. The approach has been applied to both in-house and commercial code generators, but is independent of the particular generator used. It is currently being adapted to flight code generated using MathWorks Real-Time Workshop, an automatic code generator that translates from Simulink/Stateflow models into embedded C code.
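The definition/use idea behind a safety policy such as initialization safety can be pictured with a toy check: assignments are definitions, reads are uses, and a use before any definition is flagged. This is a didactic sketch only, not AutoCert's annotation inference algorithm, and the "generated code" lines are invented.

```python
import re

def check_init_safety(statements):
    """Flag variables read before any assignment in a straight-line listing."""
    defined, violations = set(), []
    for lineno, stmt in enumerate(statements, 1):
        lhs, _, rhs = stmt.partition("=")
        for var in re.findall(r"[A-Za-z_]\w*", rhs):
            if var not in defined:
                violations.append((lineno, var))   # use before initialization
        defined.add(lhs.strip())                   # the assignment defines its LHS
    return violations

generated_code = [
    "dt = 0.01",
    "x0 = 0.0",
    "x = x0 + v * dt",   # reads v before any assignment to it
    "v = 0.0",
]
print(check_init_safety(generated_code))  # [(3, 'v')]
```

In the approach described above, the corresponding definitions and uses are not merely flagged but turned into logical annotations that a verifier can discharge and a report generator can explain.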
Worthwhile optical method for free-form mirrors qualification
NASA Astrophysics Data System (ADS)
Sironi, G.; Canestrari, R.; Toso, G.; Pareschi, G.
2013-09-01
We present an optical method for free-form mirror qualification developed by the Italian National Institute for Astrophysics (INAF) in the context of the ASTRI (Astrofisica con Specchi a Tecnologia Replicante Italiana) Project, which includes, among its items, the design, development and installation of a dual-mirror telescope prototype for the Cherenkov Telescope Array (CTA) observatory. The primary mirror panels of the telescope prototype are free-form concave mirrors with an accuracy of a few microns required on the shape error. The developed technique is based on the synergy between a Ronchi-like optical test performed on the reflecting surface and the image that a perfect optic would generate in the same configuration, computed by means of the proprietary TraceIT ray-tracing code. This deflectometry test allows the reconstruction of the slope error map, which the TraceIT code can process to evaluate the measured mirror's optical performance at the telescope focus. The advantage of the proposed method is that it replaces the use of a 3D coordinate measuring machine, reducing production time and costs and offering the possibility to evaluate the mirror image quality at the focus on-site. In this paper we report the measuring concept and compare the results obtained to similar ones obtained by processing the shape error acquired with a 3D coordinate measuring machine.
DREAM-3D and the importance of model inputs and boundary conditions
NASA Astrophysics Data System (ADS)
Friedel, Reiner; Tu, Weichao; Cunningham, Gregory; Jorgensen, Anders; Chen, Yue
2015-04-01
Recent work on radiation belt 3D diffusion codes, such as the Los Alamos "DREAM-3D" code, has demonstrated the ability of such codes to reproduce realistic magnetospheric storm events in the relativistic electron dynamics - as long as sufficient "event-oriented" boundary conditions and code inputs such as wave powers, low energy boundary conditions, background plasma densities, and the last closed drift shell (outer boundary) are available. In this talk we will argue that the main limiting factor in our modeling ability is no longer our inability to represent the key physical processes that govern the dynamics of the radiation belts (radial, pitch angle and energy diffusion) but rather our limitations in specifying accurate boundary conditions and code inputs. Here we use DREAM-3D runs to show the sensitivity of the modeled outcomes to these boundary conditions and inputs, and we also discuss alternative "proxy" approaches for obtaining the required inputs from other (ground-based) sources.
Yu, Jun; Shen, Zhengxiang; Sheng, Pengfeng; Wang, Xiaoqiang; Hailey, Charles J; Wang, Zhanshan
2018-03-01
The nested grazing incidence telescope can achieve a large collecting area in x-ray astronomy with a large number of closely packed, thin conical mirrors. Exploiting surface metrological data, the ray tracing method used to reconstruct the shell surface topography and evaluate the imaging performance is a powerful tool to assist iterative improvement in the fabrication process. However, current two-dimensional (2D) ray tracing codes, especially when utilized with densely sampled surface shape data, may not provide sufficient reconstruction accuracy and are computationally cumbersome. In particular, the 2D ray tracing currently employed considers coplanar rays and thus simulates only rays along the meridional plane. This captures axial figure errors but leaves other important errors, such as roundness errors, unaccounted for. We introduce a semianalytic, three-dimensional (3D) ray tracing approach for x-ray optics that overcomes these shortcomings and is both computationally fast and accurate. We first introduce the principles and the computational details of this 3D ray tracing method. Then computer simulations of this approach are compared against 2D ray tracing, using an ideal conic Wolter-I telescope for benchmarking. Finally, the present 3D ray tracing is used to evaluate the performance of a prototype x-ray telescope fabricated for the enhanced x-ray timing and polarization mission.
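A basic geometric building block of such 3D ray tracing is specular reflection of a ray off a mirror element with a measured surface normal, followed by propagation to the focal plane. The sketch below is didactic only, with invented geometry, and is not the authors' code.

```python
import numpy as np

def reflect(direction, normal):
    """Specular reflection: r = d - 2 (d . n) n, with n the unit surface normal."""
    d = direction / np.linalg.norm(direction)
    n = normal / np.linalg.norm(normal)
    return d - 2.0 * np.dot(d, n) * n

def to_focal_plane(point, direction, z_focal):
    """Propagate a ray from `point` along `direction` to the plane z = z_focal."""
    t = (z_focal - point[2]) / direction[2]
    return point + t * direction

# A near-axial incoming ray hits a shell element whose local normal is
# slightly tilted (hypothetical numbers, lengths in mm).
hit      = np.array([190.0, 0.0, 0.0])      # intersection point on the mirror shell
incoming = np.array([0.0, 0.0, -1.0])       # ray travelling toward -z
normal   = np.array([-1.0, 0.0, 0.013])     # measured local surface normal
reflected = reflect(incoming, normal)
print(to_focal_plane(hit, reflected, z_focal=-3500.0))  # spot position on the focal plane
```

Repeating this for rays sampled over the full azimuth, rather than only in the meridional plane, is what lets a 3D trace capture roundness errors that a 2D trace misses.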
Madejón, P; Ciadamidaro, L; Marañón, T; Murillo, J M
2013-01-01
Phytostabilization aims to immobilize soil contaminants using higher plants. The accumulation of trace elements in Populus alba leaves was monitored for 12 years after a mine spill. Concentrations of As and Pb decreased significantly, while concentrations of Cd and Zn did not change significantly over time. Soil concentrations extracted by CaCl2 were measured by ICP-OES, and results for As and Pb were below the detection limit. Long-term biomonitoring of soil contamination using poplar leaves was proven to be better suited for the study of trace elements. Plants suitable for phytostabilization must also be able to survive and reproduce in contaminated soils. Concentrations of trace elements were also measured in P. alba fruiting catkins to determine the effect on its reproduction potential. Cadmium and Zn were found to accumulate in fruiting catkins, with the transfer coefficient for Cd significantly greater than that for Zn. It is possible for trace elements to translocate to the seed, which presents a concern for seed germination, establishment and colonization. We conclude that white poplar is a suitable tree for long-term monitoring of soil contaminated with Cd and Zn, and for phytostabilization in riparian habitats, although some caution should be taken with the possible effects on the food web. Supplemental materials are available for this article. Go to the publisher's online edition of the International Journal of Phytoremediation to view the supplemental file.
Propagation Effects of Wind and Temperature on Acoustic Ground Contour Levels
NASA Technical Reports Server (NTRS)
Heath, Stephanie L.; McAninch, Gerry L.
2006-01-01
Propagation characteristics for varying wind and temperature atmospheric conditions are identified using physically-limiting propagation angles to define shadow boundary regions. These angles are graphically illustrated for various wind and temperature cases using a newly developed ray-tracing propagation code.
Fission time scale from pre-scission neutron and α multiplicities in the 16O + 194Pt reaction
NASA Astrophysics Data System (ADS)
Kapoor, K.; Verma, S.; Sharma, P.; Mahajan, R.; Kaur, N.; Kaur, G.; Behera, B. R.; Singh, K. P.; Kumar, A.; Singh, H.; Dubey, R.; Saneesh, N.; Jhingan, A.; Sugathan, P.; Mohanto, G.; Nayak, B. K.; Saxena, A.; Sharma, H. P.; Chamoli, S. K.; Mukul, I.; Singh, V.
2017-11-01
Pre- and post-scission α-particle multiplicities have been measured for the reaction 16O + 194Pt at 98.4 MeV, forming the 210Rn compound nucleus. α particles were measured at various angles in coincidence with the fission fragments. A moving-source technique was used to extract the pre- and post-scission contributions to the particle multiplicity. Studying the fission mechanism with different probes helps in understanding the detailed reaction dynamics. The neutron multiplicities for this reaction have been reported earlier. The multiplicities of neutrons and α particles were reproduced using the standard statistical model code joanne2 by varying the transient (τtr) and saddle-to-scission (τssc) times. This code includes deformation-dependent particle transmission coefficients, binding energies and level densities. Fission time scales of the order of 50-65 × 10^-21 s are required to reproduce the neutron and α-particle multiplicities.
Nonequilibrium radiation behind a strong shock wave in CO2-N2
NASA Astrophysics Data System (ADS)
Rond, C.; Boubert, P.; Félio, J.-M.; Chikhaoui, A.
2007-11-01
This work presents experiments reproducing plasma re-entry conditions for one trajectory point of a Martian mission. The typical facility for investigating such hypersonic flows is the shock tube; here we used the free-piston shock tube TCM2. Radiative flux behind the shock wave was measured by time-resolved emission spectroscopy calibrated in intensity. As the CN violet system is the main radiator in the near-UV-visible range, we focused our study on its spectrum. Moreover, a physical model, based on a multi-temperature kinetic code and a radiative code, was developed to calculate the nonequilibrium radiation behind a shock wave in CO2-N2-Ar mixtures. Comparisons between experiments and calculations show that standard kinetic models (Park, McKenzie) fail to reproduce our experimental results. We therefore propose new rate coefficients, in particular for the dissociation of CO2, pointing the way towards a better description of the chemistry of the mixture.
HESS Opinions: Repeatable research: what hydrologists can learn from the Duke cancer research scandal
Fienen, Michael; Bakker, Mark
2016-01-01
In the past decade, difficulties encountered in reproducing the results of a cancer study at Duke University resulted in a scandal and an investigation which concluded that tools used for data management, analysis, and modeling were inappropriate for the documentation of the study, let alone the reproduction of the results. New protocols were developed which require that data analysis and modeling be carried out with scripts that can be used to reproduce the results and are a record of all decisions and interpretations made during an analysis or a modeling effort. In the hydrological sciences, we face similar challenges and need to develop similar standards for transparency and repeatability of results. A promising route is to start making use of open-source languages (such as R and Python) to write scripts and to use collaborative coding environments (such as Git) to share our codes for inspection and use by the hydrological community. An important side-benefit to adopting such protocols is consistency and efficiency among collaborators.
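As an illustration only (not part of the opinion piece), a minimal Python sketch of recording provenance alongside an analysis script could look as follows, assuming the analysis lives in a Git repository; the file name and recorded fields are arbitrary choices.

    import json
    import platform
    import subprocess
    from datetime import datetime, timezone

    def record_provenance(path="provenance.json"):
        # Store the commit hash, timestamp and interpreter version used for this run.
        commit = subprocess.run(["git", "rev-parse", "HEAD"],
                                capture_output=True, text=True).stdout.strip()
        info = {
            "git_commit": commit,
            "run_at": datetime.now(timezone.utc).isoformat(),
            "python": platform.python_version(),
        }
        with open(path, "w") as fh:
            json.dump(info, fh, indent=2)
        return info

    if __name__ == "__main__":
        print(record_provenance())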
CMCpy: Genetic Code-Message Coevolution Models in Python
Becich, Peter J.; Stark, Brian P.; Bhat, Harish S.; Ardell, David H.
2013-01-01
Code-message coevolution (CMC) models represent coevolution of a genetic code and a population of protein-coding genes (“messages”). Formally, CMC models are sets of quasispecies coupled together for fitness through a shared genetic code. Although CMC models display plausible explanations for the origin of multiple genetic code traits by natural selection, useful modern implementations of CMC models are not currently available. To meet this need we present CMCpy, an object-oriented Python API and command-line executable front-end that can reproduce all published results of CMC models. CMCpy implements multiple solvers for leading eigenpairs of quasispecies models. We also present novel analytical results that extend and generalize applications of perturbation theory to quasispecies models and pioneer the application of a homotopy method for quasispecies with non-unique maximally fit genotypes. Our results therefore facilitate the computational and analytical study of a variety of evolutionary systems. CMCpy is free open-source software available from http://pypi.python.org/pypi/CMCpy/. PMID:23532367
Pan-cancer transcriptomic analysis associates long non-coding RNAs with key mutational driver events
Ashouri, Arghavan; Sayin, Volkan I.; Van den Eynden, Jimmy; Singh, Simranjit X.; Papagiannakopoulos, Thales; Larsson, Erik
2016-01-01
Thousands of long non-coding RNAs (lncRNAs) lie interspersed with coding genes across the genome, and a small subset has been implicated as downstream effectors in oncogenic pathways. Here we make use of transcriptome and exome sequencing data from thousands of tumours across 19 cancer types, to identify lncRNAs that are induced or repressed in relation to somatic mutations in key oncogenic driver genes. Our screen confirms known coding and non-coding effectors and also associates many new lncRNAs to relevant pathways. The associations are often highly reproducible across cancer types, and while many lncRNAs are co-expressed with their protein-coding hosts or neighbours, some are intergenic and independent. We highlight lncRNAs with possible functions downstream of the tumour suppressor TP53 and the master antioxidant transcription factor NFE2L2. Our study provides a comprehensive overview of lncRNA transcriptional alterations in relation to key driver mutational events in human cancers. PMID:28959951
Reliability of SNOMED-CT Coding by Three Physicians using Two Terminology Browsers
Chiang, Michael F.; Hwang, John C.; Yu, Alexander C.; Casper, Daniel S.; Cimino, James J.; Starren, Justin
2006-01-01
SNOMED-CT has been promoted as a reference terminology for electronic health record (EHR) systems. Many important EHR functions are based on the assumption that medical concepts will be coded consistently by different users. This study is designed to measure agreement among three physicians using two SNOMED-CT terminology browsers to encode 242 concepts from five ophthalmology case presentations in a publicly-available clinical journal. Inter-coder reliability, based on exact coding match by each physician, was 44% using one browser and 53% using the other. Intra-coder reliability testing revealed that a different SNOMED-CT code was obtained up to 55% of the time when the two browsers were used by one user to encode the same concept. These results suggest that the reliability of SNOMED-CT coding is imperfect, and may be a function of browsing methodology. A combination of physician training, terminology refinement, and browser improvement may help increase the reproducibility of SNOMED-CT coding. PMID:17238317
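As a simple illustration of the exact-match agreement measure reported above (the codes below are placeholders, not real SNOMED-CT identifiers), such a rate can be computed as follows.

    def exact_match_agreement(codes_a, codes_b):
        # Fraction of concepts for which two coders chose exactly the same code.
        matches = sum(a == b for a, b in zip(codes_a, codes_b))
        return matches / len(codes_a)

    coder1 = ["C-1", "C-2", "C-3", "C-4"]     # placeholder codes
    coder2 = ["C-1", "C-9", "C-3", "C-4"]
    print(f"agreement = {exact_match_agreement(coder1, coder2):.0%}")   # 75%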
GPU-accelerated simulations of isolated black holes
NASA Astrophysics Data System (ADS)
Lewis, Adam G. M.; Pfeiffer, Harald P.
2018-05-01
We present a port of the numerical relativity code SpEC which is capable of running on NVIDIA GPUs. Since this code must be maintained in parallel with SpEC itself, a primary design consideration is to perform as few explicit code changes as possible. We therefore rely on a hierarchy of automated porting strategies. At the highest level we use TLoops, a C++ library of our design, to automatically emit CUDA code equivalent to tensorial expressions written into C++ source using a syntax similar to analytic calculation. Next, we trace out and cache explicit matrix representations of the numerous linear transformations in the SpEC code, which allows these to be performed on the GPU using pre-existing matrix-multiplication libraries. We port the few remaining important modules by hand. In this paper we detail the specifics of our port, and present benchmarks of it simulating isolated black hole spacetimes on several generations of NVIDIA GPU.
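The idea of tracing out and caching a linear transformation as an explicit matrix, so that later applications reduce to a matrix multiply a GPU BLAS library could execute, can be sketched generically as follows (illustrative NumPy, not SpEC or TLoops code; the example operator is hypothetical).

    import numpy as np

    def trace_out_matrix(linear_op, n):
        # Build the dense matrix of a linear operator by applying it to basis vectors.
        eye = np.eye(n)
        return np.column_stack([linear_op(eye[:, j]) for j in range(n)])

    op = lambda v: np.diff(v, append=v[-1])   # hypothetical linear operator (a difference stencil)
    A = trace_out_matrix(op, 8)               # traced out and cached once
    x = np.random.default_rng(0).normal(size=8)
    assert np.allclose(A @ x, op(x))          # later calls are plain matrix-vector products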
Loops formed by tidal tails as fossil records of a major merger
NASA Astrophysics Data System (ADS)
Wang, J.; Hammer, F.; Athanassoula, E.; Puech, M.; Yang, Y.; Flores, H.
2012-02-01
Context. Many haloes of nearby disc galaxies contain faint and extended features, including loops, which are often interpreted as relics of satellite infall in the main galaxy's potential well. In most cases, however, the residual nucleus of the satellite is not seen, although it is predicted by numerical simulations. Aims: We test whether such faint and extended features can be associated with gas-rich major mergers, which may also lead to disc rebuilding and thus be a cornerstone for the formation of spiral galaxies. Our goal is to test whether the major merger scenario can provide a good model for a particularly difficult case, that of NGC 5907, and to compare it with the satellite-infall scenario. Methods: Using the TreeSPH code GADGET-2, we model the formation of an almost bulge-less galaxy similar to NGC 5907 (B/T ≲ 0.2) after a gas-rich major merger. First, we trace tidal tail particles captured by the galaxy gravitational potential to verify whether they can form loops similar to those discovered in the galactic haloes. Results: We indeed find that 3:1 major mergers can form features similar to the loops found in many galactic haloes, including in NGC 5907, and can reproduce an extended thin disc, a bulge, as well as the pronounced warp of the gaseous disc. Relatively small bulge fractions can be reproduced by a large gas fraction in the progenitors, as well as appropriate orbital parameters. Conclusions: Even though it remains difficult to fully cover the large volume of free parameters, the present modelling of the loops in NGC 5907 proves that they could well be the result of a major merger. This scenario has many advantages over the satellite infall scenario; e.g., it solves the problem of the visibility of the satellite remnant, and it may explain some additional features in the NGC 5907 halo, as well as some gas properties of this system. For orbital parameters derived from cosmological simulations, the loops in NGC 5907 can be reproduced by major mergers (3:1 to 5:1) and possibly by intermediate mergers (5:1 to 12:1). The major merger scenario thus challenges the minor merger one and could explain many properties that haloes of spiral galaxies have in common, including their red colours and the presence of faint extended features.
Reproducible Research in the Geosciences at Scale: Achievable Goal or Elusive Dream?
NASA Astrophysics Data System (ADS)
Wyborn, L. A.; Evans, B. J. K.
2016-12-01
Reproducibility is a fundamental tenet of the scientific method: it implies that any researcher, or a third party working independently, can duplicate any experiment or investigation and produce the same results. Historically, computationally based research involved an individual using their own data and processing it in their own private area, often with software they wrote or inherited from close collaborators. Today, a researcher is likely to be part of a large team that uses a subset of data from an external repository and processes the data on a public or private cloud or on a large centralised supercomputer, using a mixture of their own code, third-party software and libraries, or global community codes. In 'Big Geoscience' research it is common for data inputs to be extracts from externally managed dynamic data collections, where new data are regularly appended and existing data are revised when errors are detected and/or as processing methods are improved. New workflows increasingly use services to access data dynamically and create subsets on-the-fly from distributed sources, each of which can have a complex history. At major computational facilities, underlying systems, libraries, software and services are constantly being tuned and optimised, or new or replacement infrastructure is being installed. Likewise, code used from a community repository is continually being refined, re-packaged and ported to the target platform. To achieve reproducibility, today's researcher increasingly needs to track their workflow, including querying information on the current or historical state of the facilities used. Versioning methods are standard practice for software repositories or packages, but it is not common for data repositories or data services to provide information about their state, or for systems to provide query-able access to changes in the underlying software. While a researcher can achieve transparency and describe steps in their workflow so that others can repeat them and replicate the processes undertaken, they cannot achieve exact reproducibility or even transparency of the results generated. In Big Geoscience, full reproducibility will be an elusive dream until data repositories and compute facilities can provide provenance information in a standards-compliant, machine-queryable way.
The Automated Instrumentation and Monitoring System (AIMS) reference manual
NASA Technical Reports Server (NTRS)
Yan, Jerry; Hontalas, Philip; Listgarten, Sherry
1993-01-01
Whether a researcher is designing the 'next parallel programming paradigm,' another 'scalable multiprocessor,' or investigating resource allocation algorithms for multiprocessors, a facility that enables parallel program execution to be captured and displayed is invaluable. Careful analysis of execution traces can help computer designers and software architects to uncover system behavior and to take advantage of specific application characteristics and hardware features. A software tool kit that facilitates performance evaluation of parallel applications on multiprocessors is described. The Automated Instrumentation and Monitoring System (AIMS) has four major software components: a source code instrumentor, which automatically inserts active event recorders into the program's source code before compilation; a run-time performance-monitoring library, which collects performance data; a trace file animation and analysis tool kit, which reconstructs program execution from the trace file; and a trace post-processor, which compensates for data collection overhead. Besides being used as a prototype for developing new techniques for instrumenting, monitoring, and visualizing parallel program execution, AIMS is also being incorporated into the run-time environments of various hardware test beds to evaluate their impact on user productivity. Currently, AIMS instrumentors accept FORTRAN and C parallel programs written for Intel's NX operating system on the iPSC family of multicomputers. A run-time performance-monitoring library for the iPSC/860 is included in this release. We plan to release monitors for other platforms (such as PVM and TMC's CM-5) in the near future. Performance data collected can be graphically displayed on workstations (e.g., Sun Sparc and SGI) supporting X-Windows (in particular, X11R5, Motif 1.1.3).
Springate, David A.; Kontopantelis, Evangelos; Ashcroft, Darren M.; Olier, Ivan; Parisi, Rosa; Chamapiwa, Edmore; Reeves, David
2014-01-01
Lists of clinical codes are the foundation for research undertaken using electronic medical records (EMRs). If clinical code lists are not available, reviewers are unable to determine the validity of research, full study replication is impossible, researchers are unable to make effective comparisons between studies, and the construction of new code lists is subject to much duplication of effort. Despite this, the publication of clinical codes is rarely if ever a requirement for obtaining grants, validating protocols, or publishing research. In a representative sample of 450 EMR primary research articles indexed on PubMed, we found that only 19 (5.1%) were accompanied by a full set of published clinical codes and 32 (8.6%) stated that code lists were available on request. To help address these problems, we have built an online repository where researchers using EMRs can upload and download lists of clinical codes. The repository will enable clinical researchers to better validate EMR studies, build on previous code lists and compare disease definitions across studies. It will also assist health informaticians in replicating database studies, tracking changes in disease definitions or clinical coding practice through time and sharing clinical code information across platforms and data sources as research objects. PMID:24941260
Watson, Jessica; Nicholson, Brian D; Hamilton, Willie; Price, Sarah
2017-11-22
Analysis of routinely collected electronic health record (EHR) data from primary care relies on the creation of codelists to define clinical features of interest. To improve scientific rigour, transparency and replicability, we describe and demonstrate a standardised, reproducible methodology for clinical codelist development. We describe a three-stage process for developing clinical codelists. First, the clinical feature of interest is clearly defined a priori using reliable clinical resources. Second, a list of potential codes is developed using statistical software to comprehensively search all available codes. Third, a modified Delphi process is used to reach consensus between primary care practitioners on the most relevant codes, including the generation of an 'uncertainty' variable to allow sensitivity analysis. These methods are illustrated by developing a codelist for shortness of breath in a primary care EHR sample, including modifiable syntax for commonly used statistical software. The codelist was used to estimate the frequency of shortness of breath in a cohort of 28 216 patients aged over 18 years who received an incident diagnosis of lung cancer between 1 January 2000 and 30 November 2016 in the Clinical Practice Research Datalink (CPRD). Of 78 candidate codes, 29 were excluded as inappropriate. Complete agreement was reached for 44 (90%) of the remaining codes, with partial disagreement over 5 (10%). 13 091 episodes of shortness of breath were identified in the cohort of 28 216 patients. Sensitivity analysis demonstrates that codes with the greatest uncertainty tend to be rarely used in clinical practice. Although initially time consuming, using a rigorous and reproducible method for codelist generation 'future-proofs' findings, and an auditable, modifiable syntax for codelist generation enables sharing and replication of EHR studies. Published codelists should be badged by quality and report the methods of codelist generation including: definitions and justifications associated with each codelist; the syntax or search method; the number of candidate codes identified; and the categorisation of codes after Delphi review. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
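A minimal sketch of the second stage, searching a code dictionary for candidate terms, is given below; the dictionary entries and the search pattern are hypothetical and are not the authors' published syntax.

    import re

    code_dictionary = {                     # placeholder codes and terms
        "R060.00": "Breathlessness",
        "R060100": "Shortness of breath",
        "173H.00": "SOB - shortness of breath",
        "R062.00": "Wheezing",
    }

    pattern = re.compile(r"breathless|shortness of breath|\bSOB\b", re.IGNORECASE)
    candidates = {code: term for code, term in code_dictionary.items() if pattern.search(term)}
    print(candidates)   # candidate codes to take forward to the Delphi review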
Toward Real-Time Infoveillance of Twitter Health Messages.
Colditz, Jason B; Chu, Kar-Hai; Emery, Sherry L; Larkin, Chandler R; James, A Everette; Welling, Joel; Primack, Brian A
2018-06-21
There is growing interest in conducting public health research using data from social media. In particular, Twitter "infoveillance" has demonstrated utility across health contexts. However, rigorous and reproducible methodologies for using Twitter data in public health are not yet well articulated, particularly those related to content analysis, which is a highly popular approach. In 2014, we gathered an interdisciplinary team of health science researchers, computer scientists, and methodologists to begin implementing an open-source framework for real-time infoveillance of Twitter health messages (RITHM). Through this process, we documented common challenges and novel solutions to inform future work in real-time Twitter data collection and subsequent human coding. The RITHM framework allows researchers and practitioners to use well-planned and reproducible processes in retrieving, storing, filtering, subsampling, and formatting data for health topics of interest. Further considerations for human coding of Twitter data include coder selection and training, data representation, codebook development and refinement, and monitoring coding accuracy and productivity. We illustrate methodological considerations through practical examples from formative work related to hookah tobacco smoking, and we reference essential methods literature related to understanding and using Twitter data. (Am J Public Health. Published online ahead of print June 21, 2018: e1-e6. doi:10.2105/AJPH.2018.304497).
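As an illustration of the filtering and subsampling steps (not the RITHM implementation itself), a minimal Python sketch with hypothetical topic keywords could be:

    import random

    KEYWORDS = ("hookah", "shisha", "waterpipe")   # hypothetical topic terms

    def keep_for_coding(tweets, rate=0.1, seed=42):
        # Keep topic-relevant tweets, then subsample a fixed fraction for human coding.
        rng = random.Random(seed)
        relevant = [t for t in tweets if any(k in t.lower() for k in KEYWORDS)]
        return [t for t in relevant if rng.random() < rate]

    print(keep_for_coding(["Trying a new hookah lounge tonight", "Unrelated post"], rate=1.0))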
mocca code for star cluster simulations - VI. Bimodal spatial distribution of blue stragglers
NASA Astrophysics Data System (ADS)
Hypki, Arkadiusz; Giersz, Mirek
2017-11-01
The paper presents an analysis of the formation mechanism and properties of the spatial distributions of blue stragglers in evolving globular clusters, based on numerical simulations done with the mocca code. First, we present N-body and mocca simulations that attempt to reproduce the simulations presented by Ferraro et al. (2012). Then, we show the agreement between the N-body and mocca codes. Finally, we discuss the formation process of the bimodal distribution. We report that we could not reproduce the simulations from Ferraro et al. (2012). Moreover, we show that the so-called bimodal spatial distribution of blue stragglers is a very transient feature: it is formed for one snapshot in time and can easily vanish in the next one. Moreover, we show that the radius of avoidance proposed by Ferraro et al. (2012) goes out of sync with the apparent minimum of the bimodal distribution after about two half-mass relaxation times (although we could not determine the reason for this). This finding creates a real challenge for the dynamical clock, which uses this radius to determine the dynamical age of globular clusters. Additionally, the paper discusses a few important problems concerning the apparent visibility of the bimodal distributions, which have to be taken into account while studying the spatial distributions of blue stragglers.
Development of an Acoustic Signal Analysis Tool “Auto-F” Based on the Temperament Scale
NASA Astrophysics Data System (ADS)
Modegi, Toshio
The MIDI interface was originally designed for electronic musical instruments, but we consider that this music-note-based coding concept can be extended to general acoustic signal description. We proposed applying MIDI technology to the coding of biomedical auscultation sound signals, such as heart sounds, for retrieving medical records and performing telemedicine. We have since tried to extend our encoding targets to include vocal sounds, natural sounds and electronic bio-signals such as ECG, using the Generalized Harmonic Analysis method. Currently, we are trying to separate the vocal sounds included in popular songs and encode the vocal sounds and background instrumental sounds into separate MIDI channels. We are also trying to extract articulation parameters such as MIDI pitch-bend parameters in order to reproduce natural acoustic sounds using a GM-standard MIDI tone generator. In this paper, we present the overall algorithm of our acoustic signal analysis tool, based on those research works, which can analyze given time-based signals on the musical temperament scale. The prominent feature of this tool is that it produces high-precision MIDI codes, which reproduce signals similar to the given source signal on a GM-standard MIDI tone generator, and also provides the analysis results as XML text.
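The mapping from a measured frequency onto the equal-temperament MIDI scale can be sketched as follows, assuming the common default pitch-bend range of plus or minus two semitones; this is an illustration, not the tool's actual code.

    import math

    def frequency_to_midi(freq_hz):
        # Fractional MIDI note number relative to A4 = 440 Hz = note 69.
        semitones = 69 + 12 * math.log2(freq_hz / 440.0)
        note = round(semitones)
        # 14-bit pitch-bend value, 8192 = no bend, assuming a +/-2 semitone bend range.
        bend = int(round(8192 + (semitones - note) * 8192 / 2))
        return note, bend

    print(frequency_to_midi(440.0))   # (69, 8192): A4, no bend
    print(frequency_to_midi(450.0))   # slightly sharp A4, bend above centre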
NASA Astrophysics Data System (ADS)
Cattaneo, A.; Blaizot, J.; Devriendt, J. E. G.; Mamon, G. A.; Tollet, E.; Dekel, A.; Guiderdoni, B.; Kucukbas, M.; Thob, A. C. R.
2017-10-01
GalICS 2.0 is a new semi-analytic code to model the formation and evolution of galaxies in a cosmological context. N-body simulations based on a Planck cosmology are used to construct halo merger trees, track subhaloes, compute spins and measure concentrations. The accretion of gas on to galaxies and the morphological evolution of galaxies are modelled with prescriptions derived from hydrodynamic simulations. Star formation and stellar feedback are described with phenomenological models (as in other semi-analytic codes). GalICS 2.0 computes rotation speeds from the gravitational potential of the dark matter, the disc and the central bulge. As the rotation speed depends not only on the virial velocity but also on the ratio of baryons to dark matter within a galaxy, our calculation predicts a different Tully-Fisher relation from models in which v_rot ∝ v_vir. This is why GalICS 2.0 is able to reproduce the galaxy stellar mass function and the Tully-Fisher relation simultaneously. Our results are also in agreement with halo masses from weak lensing and satellite kinematics, gas fractions, the relation between star formation rate (SFR) and stellar mass, the evolution of the cosmic SFR density, bulge-to-disc ratios, disc sizes and the Faber-Jackson relation.
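For illustration only (this is not the GalICS 2.0 implementation), the circular speed set by the mass enclosed within a radius follows v_c(r) = sqrt(G M(<r) / r); a minimal sketch with hypothetical numbers:

    import math

    G = 6.674e-11                      # m^3 kg^-1 s^-2
    M_SUN = 1.989e30                   # kg
    KPC = 3.086e19                     # m

    def circular_speed(m_enclosed_msun, r_kpc):
        # Circular (rotation) speed in km/s from the enclosed mass in solar masses.
        return math.sqrt(G * m_enclosed_msun * M_SUN / (r_kpc * KPC)) / 1e3

    # Hypothetical galaxy: 1e11 solar masses (dark matter + disc + bulge) within 10 kpc.
    print(f"{circular_speed(1e11, 10.0):.0f} km/s")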
Simulations of vertical disruptions with VDE code: Hiro and Evans currents
NASA Astrophysics Data System (ADS)
Li, Xujing; Di Hu Team; Leonid Zakharov Team; Galkin Team
2014-10-01
The recently created numerical code VDE for simulations of vertical instability in tokamaks is presented. The numerical scheme uses the Tokamak MHD model, where the plasma inertia is replaced by the friction force, and an adaptive grid numerical scheme. The code reproduces well the surface currents generated at the plasma boundary by the instability. Five regimes of the vertical instability are presented: (1) vertical instability in a given plasma shaping field without a wall; (2) the same with a wall and magnetic flux ΔΨ|_pl^X < ΔΨ|_wall^X (where X corresponds to the X-point of a separatrix); (3) the same with a wall and magnetic flux ΔΨ|_pl^X > ΔΨ|_wall^X; (4) vertical instability without a wall with a tile surface at the plasma path; (5) the same in the presence of a wall and a tile surface. The generation of negative Hiro currents along the tile surface, predicted earlier by the theory and measured on EAST in 2012, is well-reproduced by simulations. In addition, the instability generates the force-free Evans currents at the free plasma surface. The new pattern of reconnection of the plasma with the vacuum magnetic field is discovered. This work is supported by US DoE Contract No. DE-AC02-09-CH11466.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cleveland, Mathew A., E-mail: cleveland7@llnl.gov; Brunner, Thomas A.; Gentile, Nicholas A.
2013-10-15
We describe and compare different approaches for achieving numerical reproducibility in photon Monte Carlo simulations. Reproducibility is desirable for code verification, testing, and debugging. Parallelism creates a unique problem for achieving reproducibility in Monte Carlo simulations because it changes the order in which values are summed. This is a numerical problem because double precision arithmetic is not associative. Parallel Monte Carlo simulations, both domain-replicated and domain-decomposed, will run their particles in a different order during different runs of the same simulation because of the non-reproducibility of communication between processors. In addition, runs of the same simulation using different domain decompositions will also result in particles being simulated in a different order. In [1], a way of eliminating non-associative accumulations using integer tallies was described. This approach successfully achieves reproducibility at the cost of lost accuracy by rounding double precision numbers to fewer significant digits. This integer approach, and other extended and reduced precision reproducibility techniques, are described and compared in this work. Increased precision alone is not enough to ensure reproducibility of photon Monte Carlo simulations. Non-arbitrary precision approaches require a varying degree of rounding to achieve reproducibility. For the problems investigated in this work, double precision global accuracy was achievable by using 100 bits of precision or greater on all unordered sums, which were subsequently rounded to double precision at the end of every time-step.
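A minimal Python illustration of the underlying issue (not taken from the paper) shows how the order of double-precision summation changes a tally, and how a correctly rounded, order-independent sum recovers the expected value.

    import math

    values = [1e16, 1.0, -1e16] * 1000   # hypothetical per-particle contributions
    naive = sum(values)                  # order-dependent double-precision sum
    exact = math.fsum(values)            # correctly rounded, order-independent sum
    print(naive, exact)                  # 0.0 versus 1000.0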
Impact of Different Correlations on TRACEv4.160 Predicted Critical Heat Flux
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jasiulevicius, A.; Macian-Juan, R.
2006-07-01
This paper presents an independent assessment of the Critical Heat Flux (CHF) models implemented in TRACEv4.160 with data from the experiments carried out at the Royal Institute of Technology (RIT) in Stockholm, Sweden, with single vertical uniformly heated 7.0 m long tubes. In previous CHF assessment studies with TRACE, it was noted that, although the overall code predictions in long single tubes with inner diameters of 1.0 to 2.49 cm agreed rather well with the results of experiments (with an r.m.s. error of 25.6%), several regions of pressure and coolant mass flux could be identified in which the code strongly under-predicts or over-predicts the CHF. In order to evaluate the possibility of improving the code performance, some of the most widely used and assessed CHF correlations were additionally implemented in TRACEv4.160, namely Bowring, Levitan-Lantsman, and Tong-W3. The results obtained for the CHF predictions in single tubes with uniform axial heat flux using these correlations were compared to the results produced with the standard TRACE correlations (Biasi and CISE-GE), and with the experimental data from RIT, which covered a broad range of pressures (3-20 MPa) and coolant mass fluxes (500-3000 kg/m^2 s). Several hundred experimental points were calculated to cover the parameter range mentioned above for the evaluation of the newly implemented correlations in the TRACEv4.160 code. (author)
2016-01-01
Background A high-quality search strategy is considered an essential component of systematic reviews but many do not contain reproducible search strategies. It is unclear if low reproducibility spans medical disciplines, is affected by librarian/search specialist involvement or has improved with increased awareness of reporting guidelines. Objectives To examine the reporting of search strategies in systematic reviews published in Pediatrics, Surgery or Cardiology journals in 2012 and determine rates and predictors of including a reproducible search strategy. Methods We identified all systematic reviews published in 2012 in the ten highest impact factor journals in Pediatrics, Surgery and Cardiology. Each search strategy was coded to indicate what elements were reported and whether the overall search was reproducible. Reporting and reproducibility rates were compared across disciplines and we measured the influence of librarian/search specialist involvement, discipline or endorsement of a reporting guideline on search reproducibility. Results 272 articles from 25 journals were included. Reporting of search elements ranged widely from 91% of articles naming search terms to 33% providing a full search strategy and 22% indicating the date the search was executed. Only 22% of articles provided at least one reproducible search strategy and 13% provided a reproducible strategy for all databases searched in the article. Librarians or search specialists were reported as involved in 17% of articles. There were strong disciplinary differences on the reporting of search elements. In the multivariable analysis, only discipline (Pediatrics) was a significant predictor of the inclusion of a reproducible search strategy. Conclusions Despite recommendations to report full, reproducible search strategies, many articles still do not. In addition, authors often report a single strategy as covering all databases searched, further decreasing reproducibility. Further research is needed to determine how disciplinary culture may encourage reproducibility and the role that journal editors and peer reviewers could play. PMID:27669416
Using AORSA to simulate helicon waves in DIII-D
NASA Astrophysics Data System (ADS)
Lau, C.; Jaeger, E. F.; Bertelli, N.; Berry, L. A.; Blazevski, D.; Green, D. L.; Murakami, M.; Park, J. M.; Pinsker, R. I.; Prater, R.
2015-12-01
Recent efforts have shown that helicon waves (fast waves at > 20ωci) may be an attractive option for driving efficient off-axis current drive during non-inductive tokamak operation in DIII-D, ITER and DEMO. For DIII-D scenarios, the ray tracing code GENRAY has been extensively used to study helicon current drive efficiency and location as a function of many plasma parameters. The full wave code AORSA, which is applicable to arbitrary Larmor radius and can resolve arbitrary ion cyclotron harmonic order, has recently been used to validate the ray tracing technique at these high cyclotron harmonics. It will be shown that, if the SOL is ignored, the GENRAY- and AORSA-calculated current drive profiles are comparable for the envisioned high-beta advanced scenarios in DIII-D, where there is high single-pass absorption due to electron Landau damping and minimal ion damping. AORSA has also been used to estimate possible SOL effects on helicon current drive coupling and SOL absorption due to collisional and slow wave effects.
Altzitzoglou, Timotheos; Rožkov, Andrej
2016-03-01
The (129)I, (151)Sm and (166m)Ho standardisations using the CIEMAT/NIST efficiency tracing method, that have been carried out in the frame of the European Metrology Research Program project "Metrology for Radioactive Waste Management" are described. The radionuclide beta counting efficiencies were calculated using two computer codes CN2005 and MICELLE2. The sensitivity analysis of the code input parameters (ionization quenching factor, beta shape factor) on the calculated efficiencies was performed, and the results are discussed. The combined relative standard uncertainty of the standardisations of the (129)I, (151)Sm and (166m)Ho solutions were 0.4%, 0.5% and 0.4%, respectively. The stated precision obtained using the CIEMAT/NIST method is better than that previously reported in the literature obtained by the TDCR ((129)I), the 4πγ-NaI ((166m)Ho) counting or the CIEMAT/NIST method ((151)Sm). Copyright © 2015 The Authors. Published by Elsevier Ltd.. All rights reserved.
MMA-EoS: A Computational Framework for Mineralogical Thermodynamics
NASA Astrophysics Data System (ADS)
Chust, T. C.; Steinle-Neumann, G.; Dolejš, D.; Schuberth, B. S. A.; Bunge, H.-P.
2017-12-01
We present a newly developed software framework, MMA-EoS, that evaluates phase equilibria and thermodynamic properties of multicomponent systems by Gibbs energy minimization, with application to mantle petrology. The code is versatile in terms of the equation-of-state and mixing properties and allows for the computation of properties of single phases, solution phases, and multiphase aggregates. Currently, the open program distribution contains equation-of-state formulations widely used, that is, Caloric-Murnaghan, Caloric-Modified-Tait, and Birch-Murnaghan-Mie-Grüneisen-Debye models, with published databases included. Through its modular design and easily scripted database, MMA-EoS can readily be extended with new formulations of equations-of-state and changes or extensions to thermodynamic data sets. We demonstrate the application of the program by reproducing and comparing physical properties of mantle phases and assemblages with previously published work and experimental data, successively increasing complexity, up to computing phase equilibria of six-component compositions. Chemically complex systems allow us to trace the budget of minor chemical components in order to explore whether they lead to the formation of new phases or extend stability fields of existing ones. Self-consistently computed thermophysical properties for a homogeneous mantle and a mechanical mixture of slab lithologies show no discernible differences that require a heterogeneous mantle structure as has been suggested previously. Such examples illustrate how thermodynamics of mantle mineralogy can advance the study of Earth's interior.
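As an example of one ingredient of such model families, the third-order isothermal Birch-Murnaghan equation of state can be sketched as follows; the parameter values are illustrative and are not taken from the MMA-EoS databases.

    def birch_murnaghan_pressure(v_over_v0, k0_gpa, k0_prime):
        # Third-order Birch-Murnaghan pressure (GPa) at compression V/V0, given the
        # zero-pressure bulk modulus K0 and its pressure derivative K0'.
        x = (1.0 / v_over_v0) ** (2.0 / 3.0)          # (V0/V)^(2/3)
        return 1.5 * k0_gpa * (x**3.5 - x**2.5) * (1.0 + 0.75 * (k0_prime - 4.0) * (x - 1.0))

    # Hypothetical mineral compressed to 80% of its zero-pressure volume.
    print(f"P = {birch_murnaghan_pressure(0.8, 160.0, 4.0):.1f} GPa")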
Field aligned flows driven by neutral puffing at MAST
NASA Astrophysics Data System (ADS)
Waters, I.; Frerichs, H.; Silburn, S.; Feng, Y.; Harrison, J.; Kirk, A.; Schmitz, O.
2018-06-01
Neutral deuterium gas puffing at the high field side of the mega ampere spherical tokamak (MAST) is shown to drive carbon impurity flows that are aligned with the trajectory of the magnetic field lines in the plasma scrape-off-layer. These impurity flows were directly imaged with emissions from C2+ ions at MAST by coherence imaging spectroscopy and were qualitatively reproduced in deuterium plasmas by modeling with the EMC3-EIRENE plasma edge fluid and kinetic neutral transport code. A reduced one-dimensional momentum and particle balance shows that a localized increase in the static plasma pressure in front of the neutral gas puff yields an acceleration of the plasma due to local ionization. Perpendicular particle transport yields a decay from which a parallel length scale can be determined. Parameter scans in EMC3-EIRENE were carried out to determine the sensitivity of the deuterium plasma flow phenomena to local fueling and diffusion parameters and it is found that these flows robustly form across a wide variety of plasma conditions. Finally, efforts to couple this behavior in the background plasma directly to the impurity flows observed experimentally in MAST using a trace impurity model are discussed. These results provide insight into the fueling and exhaust features at this pivotal point of the radial and parallel particle flux balance, which is a major part of the plasma fueling and exhaust characteristics in a magnetically confined fusion device.
Moral codes of mothering and the introduction of welfare-to-work in Ontario.
Gazso, Amber
2012-02-01
In this paper, I trace how the reform of social assistance in Ontario, especially the post-1990s enforcement of lone mothers' employability via welfare-to-work programs, parallels shifts in dominant moral codes of mothering, from "mother-carer" to "mother-worker." Additionally, I use this case as an entry point to consider the implications of public and policy allegiance to these moral codes for all mothers. The central argument I make is that the introduction of welfare-to-work programs in Ontario did not occur in a neoliberal state-sanctioned vacuum but also involved the circulation of ideas about moral mothering outside of policy into policy.
An update on the BQCD Hybrid Monte Carlo program
NASA Astrophysics Data System (ADS)
Haar, Taylor Ryan; Nakamura, Yoshifumi; Stüben, Hinnerk
2018-03-01
We present an update of BQCD, our Hybrid Monte Carlo program for simulating lattice QCD. BQCD is one of the main production codes of the QCDSF collaboration and is used by CSSM and in some Japanese finite temperature and finite density projects. Since the first publication of the code at Lattice 2010 the program has been extended in various ways. New features of the code include: dynamical QED, action modification in order to compute matrix elements by using Feynman-Hellmann theory, more trace measurements (like Tr(D^-n) for K, cSW and chemical potential reweighting), a more flexible integration scheme, polynomial filtering, term-splitting for RHMC, and a portable implementation of performance critical parts employing SIMD.
Method for calculating internal radiation and ventilation with the ADINAT heat-flow code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butkovich, T.R.; Montan, D.N.
1980-04-01
One objective of the spent fuel test in Climax Stock granite (SFTC) is to correctly model the thermal transport and the changes in the stress field and accompanying displacements from the application of the thermal loads. We have chosen the ADINA and ADINAT finite element codes to do these calculations. ADINAT is a heat transfer code compatible with the ADINA displacement and stress analysis code. The heat flow problem encountered at SFTC requires a code with conduction, radiation, and ventilation capabilities, which the present version of ADINAT does not have. We have devised a method for calculating internal radiation and ventilation with the ADINAT code. This method effectively reproduces the results from the TRUMP multi-dimensional finite difference code, which correctly models radiative heat transport between drift surfaces, conductive and convective thermal transport to and through air in the drifts, and mass flow of air in the drifts. The temperature histories for each node in the finite element mesh calculated with ADINAT using this method can be used directly in the ADINA thermal-mechanical calculation.
NASA Astrophysics Data System (ADS)
Meléndez, A.; Korenaga, J.; Sallarès, V.; Miniussi, A.; Ranero, C. R.
2015-10-01
We present a new 3-D traveltime tomography code (TOMO3D) for the modelling of active-source seismic data that uses the arrival times of both refracted and reflected seismic phases to derive the velocity distribution and the geometry of reflecting boundaries in the subsurface. This code is based on its popular 2-D version TOMO2D, from which it inherited the methods to solve the forward and inverse problems. The traveltime calculations are done using a hybrid ray-tracing technique combining the graph and bending methods. The LSQR algorithm is used to perform the iterative regularized inversion to improve the initial velocity and depth models. In order to cope with an increased computational demand due to the incorporation of the third dimension, the forward problem solver, which takes most of the run time (∼90 per cent in the test presented here), has been parallelized with a combination of multi-processing and message passing interface standards. This parallelization distributes the ray-tracing and traveltime calculations among available computational resources. The code's performance is illustrated with a realistic synthetic example, including a checkerboard anomaly and two reflectors, which simulates the geometry of a subduction zone. The code is designed to invert for a single reflector at a time. A data-driven layer-stripping strategy is proposed for cases involving multiple reflectors, and it is tested for the successive inversion of the two reflectors. Layers are bound by consecutive reflectors, and an initial velocity model for each inversion step incorporates the results from previous steps. This strategy poses simpler inversion problems at each step, allowing the recovery of strong velocity discontinuities that would otherwise be smoothened.
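The damped least-squares step at the heart of such inversions can be illustrated with a toy problem; this is not TOMO3D code, and the sensitivity matrix and model below are synthetic.

    import numpy as np
    from scipy.sparse import random as sparse_random
    from scipy.sparse.linalg import lsqr

    rng = np.random.default_rng(0)
    G = sparse_random(200, 50, density=0.05, random_state=0)   # synthetic sensitivity kernel
    m_true = rng.normal(size=50)                               # synthetic model perturbations
    d = G @ m_true + 0.01 * rng.normal(size=200)               # noisy traveltime residuals

    m_est = lsqr(G, d, damp=0.1)[0]                            # damped (regularized) LSQR solution
    print(np.linalg.norm(m_est - m_true) / np.linalg.norm(m_true))   # relative model error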
TRAC posttest calculations of Semiscale Test S-06-3. [PWR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ireland, J.R.; Bleiweis, P.B.
A comparison of Transient Reactor Analysis Code (TRAC) steady-state and transient results with Semiscale Test S-06-3 (US Standard Problem 8) experimental data is discussed. The TRAC model used employs fewer mesh cells than normal data comparison models so that TRAC's ability to obtain reasonable results with less computer time can be assessed. In general, the TRAC results are in good agreement with the data and the major phenomena found in the experiment are reproduced by the code with a substantial reduction in computing times.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miyamoto, K.; Okuda, S.; Hatayama, A.
2013-01-14
To understand the physical mechanism of the beam halo formation in negative ion beams, a two-dimensional particle-in-cell code for simulating the trajectories of negative ions created via surface production has been developed. The simulation code reproduces a beam halo observed in an actual negative ion beam. The negative ions extracted from the periphery of the plasma meniscus (an electro-static lens in a source plasma) are over-focused in the extractor due to large curvature of the meniscus.
NASA Astrophysics Data System (ADS)
Work, Paul R.
1991-12-01
This thesis investigates the parallelization of existing serial programs in computational electromagnetics for use in a parallel environment. Existing algorithms for calculating the radar cross section of an object are covered, and a ray-tracing code is chosen for implementation on a parallel machine. Current parallel architectures are introduced and a suitable parallel machine is selected for the implementation of the chosen ray-tracing algorithm. The standard techniques for the parallelization of serial codes are discussed, including load balancing and decomposition considerations, and appropriate methods for the parallelization effort are selected. A load balancing algorithm is modified to increase the efficiency of the application, and a high level design of the structure of the serial program is presented. A detailed design of the modifications for the parallel implementation is also included, with both the high level and the detailed design specified in a high level design language called UNITY. The correctness of the design is proven using UNITY and standard logic operations. The theoretical and empirical results show that it is possible to achieve an efficient parallel application for a serial computational electromagnetic program where the characteristics of the algorithm and the target architecture critically influence the development of such an implementation.
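One common static decomposition for balancing ray-tracing work, shown here only as a generic illustration and not as the modified load-balancing algorithm developed in the thesis, is to interleave rays across processors.

    def assign_rays(n_rays, n_workers):
        # Return, for each worker, the list of ray indices it will trace.
        return [list(range(w, n_rays, n_workers)) for w in range(n_workers)]

    for worker, rays in enumerate(assign_rays(10, 3)):
        print(f"worker {worker}: {rays}")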
Decoding the genome with an integrative analysis tool: combinatorial CRM Decoder.
Kang, Keunsoo; Kim, Joomyeong; Chung, Jae Hoon; Lee, Daeyoup
2011-09-01
The identification of genome-wide cis-regulatory modules (CRMs) and characterization of their associated epigenetic features are fundamental steps toward the understanding of gene regulatory networks. Although integrative analysis of available genome-wide information can provide new biological insights, the lack of novel methodologies has become a major bottleneck. Here, we present a comprehensive analysis tool called combinatorial CRM decoder (CCD), which utilizes publicly available information to identify and characterize genome-wide CRMs in a species of interest. CCD first defines a set of epigenetic features significantly associated with a set of known CRMs as a code called a 'trace code', and subsequently uses the trace code to pinpoint putative CRMs throughout the genome. Using 61 genome-wide data sets obtained from 17 independent mouse studies, CCD successfully catalogued ∼12 600 CRMs (five distinct classes) including polycomb repressive complex 2 target sites as well as imprinting control regions. Interestingly, we discovered that ∼4% of the identified CRMs belong to at least two different classes; we name these 'multi-functional CRMs', suggesting their functional importance for regulating spatiotemporal gene expression. From these examples, we show that CCD can be applied to any potential genome-wide dataset and therefore will shed light on unveiling genome-wide CRMs in various species.
Students' Views and Attitudes Towards the Communication Code Used in Press Articles About Science
NASA Astrophysics Data System (ADS)
Halkia, Krystallia; Mantzouridis, Dimitris
2005-10-01
The present research was designed to investigate the reaction of secondary school students to the communication code that the press uses in science articles: it attempts to trace which communication techniques can be of potential use in science education. The sample of the research consists of 351 secondary school students. The research instrument is a questionnaire, which attempts to trace students’ preferences regarding newspaper science articles, to explore students’ attitudes towards the science articles published in the press and to investigate students’ reactions towards four newspaper science articles. These articles deal with different aspects of science and reflect different communication strategies. The results of the research reveal that secondary school students view the communication codes used in press science articles as being more interesting and comprehensible than those of their science textbooks. Predominantly, they do not select science articles that present their data in a scientific way (diagrams and abstract graphs). On the contrary, they do select science articles and passages in them, which use an emotional/‘poetic’ language with a lot of metaphors and analogies to introduce complex science concepts. It also seems that the narrative elements found in popularized science articles attract students’ interest and motivate them towards further reading.
Liu, Baodong; Liu, Xiaoling; Lai, Weiyi; Wang, Hailin
2017-06-06
DNA N6-methyl-2'-deoxyadenosine (6mdA) is an epigenetic modification in both eukaryotes and bacteria. Here we exploited the stable isotope-labeled deoxynucleoside [15N5]-2'-deoxyadenosine ([15N5]-dA) as an initiation tracer and for the first time developed a metabolically differential tracing code for monitoring DNA 6mdA in human cells. We demonstrate that the initiation tracer [15N5]-dA undergoes a specific and efficient adenine deamination reaction leading to the loss of the exocyclic amine 15N, and further utilizes the purine salvage pathway to generate mainly both [15N4]-dA and [15N4]-2'-deoxyguanosine ([15N4]-dG) in mammalian genomes. However, [15N5]-dA is largely retained in the genomes of mycoplasmas, which are often found in cultured cells and experimental animals. Consequently, the methylation of dA generates 6mdA with a consistent coding pattern, with a predominance of [15N4]-6mdA. Therefore, mammalian DNA 6mdA can potentially be discriminated from that generated by infecting mycoplasmas. Collectively, we show a promising approach for identifying authentic DNA 6mdA in human cells and for determining whether the cells are contaminated with mycoplasmas.
NASA Astrophysics Data System (ADS)
Guillong, M.; Günther, D.
2001-07-01
A homogenized 193 nm excimer laser with a flat-top beam profile was used to study the capabilities of LA-ICP-MS for 'quasi' non-destructive fingerprinting and sourcing of sapphires from different locations. Sapphires contain 97-99% Al2O3 (corundum), with the remainder composed of several trace elements, which can be used to distinguish the origin of these gemstones. The ablation behavior of sapphires, as well as the minimum quantity of sample removal required to determine these trace elements, was investigated. The optimum ablation conditions were a fluence of 6 J cm^-2, a crater diameter of 120 μm, and a laser repetition rate of 10 Hz. The optimum ablation time was determined to be 2 s, equivalent to 20 laser pulses. The mean sample removal was 60 nm per pulse (approx. 3 ng per pulse). This allowed satisfactory trace element determination and was found to cause the minimum amount of damage, while still allowing the fingerprinting of sapphires. More than 40 isotopes were measured using different spatial resolutions (20-120 μm) and eight elements were reproducibly detected in 25 sapphire samples from five different locations. The reproducibility of the trace element distribution is limited by the heterogeneity of the sample, so the mean of five or more replicate analyses per sample was used. Calibration was carried out using NIST 612 glass reference material as an external standard. The linear dynamic range of the ICP-MS (nine orders of magnitude) allowed the use of Al, the major element in sapphire, as an internal standard. The limits of detection for most of the light elements were in the μg g^-1 range and were better for heavier elements (mass >85), being in the 0.1 μg g^-1 range. The accuracy of the determinations was demonstrated by comparison with XRF analyses of the same set of samples. Using the quantitative analyses obtained by LA-ICP-MS, natural sapphires from five different origins were statistically classified using ternary plots and principal component analysis.
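The internal-standardisation step commonly used in such quantification can be sketched as follows; all intensities and concentrations below are made-up illustrative numbers, not measured values.

    def quantify(i_analyte_sam, i_is_sam, i_analyte_std, i_is_std,
                 c_analyte_std, c_is_std, c_is_sam):
        # Analyte concentration in the sample from intensity ratios normalised to the
        # internal standard and calibrated against the external standard.
        sample_ratio = i_analyte_sam / i_is_sam
        std_ratio = i_analyte_std / i_is_std
        return sample_ratio / std_ratio * (c_analyte_std / c_is_std) * c_is_sam

    # Hypothetical trace element in sapphire, with Al as internal standard (values invented).
    print(quantify(i_analyte_sam=1.2e4, i_is_sam=5.0e6,
                   i_analyte_std=3.0e4, i_is_std=1.0e6,
                   c_analyte_std=36.0, c_is_std=1.0e4,
                   c_is_sam=5.3e5))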
Emotional expressions in voice and music: same code, same effect?
Escoffier, Nicolas; Zhong, Jidan; Schirmer, Annett; Qiu, Anqi
2013-08-01
Scholars have documented similarities in the way voice and music convey emotions. By using functional magnetic resonance imaging (fMRI) we explored whether these similarities imply overlapping processing substrates. We asked participants to trace changes in either the emotion or pitch of vocalizations and music using a joystick. Compared to music, vocalizations more strongly activated superior and middle temporal cortex, cuneus, and precuneus. However, despite these differences, overlapping rather than differing regions emerged when comparing emotion with pitch tracing for music and vocalizations, respectively. Relative to pitch tracing, emotion tracing activated medial superior frontal and anterior cingulate cortex regardless of stimulus type. Additionally, we observed emotion specific effects in primary and secondary auditory cortex as well as in medial frontal cortex that were comparable for voice and music. Together these results indicate that similar mechanisms support emotional inferences from vocalizations and music and that these mechanisms tap on a general system involved in social cognition. Copyright © 2011 Wiley Periodicals, Inc.
MetaboAnalystR: an R package for flexible and reproducible analysis of metabolomics data.
Chong, Jasmine; Xia, Jianguo
2018-06-28
The MetaboAnalyst web application has been widely used for metabolomics data analysis and interpretation. Despite its user-friendliness, the web interface has inherent limitations (especially for advanced users) with regard to flexibility in creating customized workflows, support for reproducible analysis, and capacity for dealing with large data. To address these limitations, we have developed a companion R package (MetaboAnalystR) based on the R code base of the web server. The package has been thoroughly tested to ensure that the same R commands will produce identical results from both interfaces. MetaboAnalystR complements the MetaboAnalyst web server to facilitate transparent, flexible and reproducible analysis of metabolomics data. MetaboAnalystR is freely available from https://github.com/xia-lab/MetaboAnalystR. Supplementary data are available at Bioinformatics online.
Reproducing a Prospective Clinical Study as a Computational Retrospective Study in MIMIC-II.
Kury, Fabrício S P; Huser, Vojtech; Cimino, James J
2015-01-01
In this paper we sought to reproduce, as a computational retrospective study in an EHR database (MIMIC-II), a recent large prospective clinical study: the 2013 publication, by the Japanese Association for Acute Medicine (JAAM), about disseminated intravascular coagulation, in the journal Critical Care (PMID: 23787004). We designed in SQL and Java a set of electronic phenotypes that reproduced the study's data sampling, and used R to perform the same statistical inference procedures. All produced source code is available online at https://github.com/fabkury/paamia2015. Our program identified 2,257 eligible patients in MIMIC-II, and the results remarkably agreed with the prospective study. A minority of the needed data elements was not found in MIMIC-II, and statistically significant inferences were possible in the majority of the cases.
NASA Astrophysics Data System (ADS)
Tichý, Vladimír; Hudec, René; Němcová, Šárka
2016-06-01
The algorithm presented is intended mainly for lobster eye optics. This type of optics (and some similar types) allows for a simplification of the classical ray-tracing procedure, which requires a great many rays to simulate. The method presented performs the simulation of only a few rays; therefore it is extremely effective. Moreover, to simplify the equations, a specific mathematical formalism is used. Only a few simple equations are used, therefore the program code can be simple as well. The paper also outlines how to apply the method to some other reflective optical systems.
A Parameter Study for Modeling Mg ii h and k Emission during Solar Flares
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rubio da Costa, Fatima; Kleint, Lucia, E-mail: frubio@stanford.edu
2017-06-20
Solar flares show highly unusual spectra in which the thermodynamic conditions of the solar atmosphere are encoded. Current models are unable to fully reproduce the spectroscopic flare observations, especially the single-peaked spectral profiles of the Mg ii h and k lines. We aim to understand the formation of the chromospheric and optically thick Mg ii h and k lines in flares through radiative transfer calculations. We take a flare atmosphere obtained from a simulation with the radiative hydrodynamic code RADYN as input for a radiative transfer modeling with the RH code. By iteratively changing this model atmosphere and varying thermodynamic parameters such as temperature, electron density, and velocity, we study their effects on the emergent intensity spectra. We reproduce the typical single-peaked Mg ii h and k flare spectral shape and approximate the intensity ratios to the subordinate Mg ii lines by increasing either densities, temperatures, or velocities at the line core formation height range. Additionally, by combining unresolved upflows and downflows up to ∼250 km s⁻¹ within one resolution element, we reproduce the widely broadened line wings. While we cannot unambiguously determine which mechanism dominates in flares, future modeling efforts should investigate unresolved components, additional heat dissipation, larger velocities, and higher densities and combine the analysis of multiple spectral lines.
Abo, Takayuki; Hilberer, Allison; Behle-Wagner, Christine; Watanabe, Mika; Cameron, David; Kirst, Annette; Nukada, Yuko; Yuki, Takuo; Araki, Daisuke; Sakaguchi, Hitoshi; Itagaki, Hiroshi
2018-04-01
The Short Time Exposure (STE) test method is an alternative method for assessing eye irritation potential using Statens Seruminstitut Rabbit Cornea cells and has been adopted as test guideline 491 by the Organisation for Economic Co-operation and Development. Its good predictive performance in identifying the Globally Harmonized System (GHS) No Category (NC) or Irritant Category has been demonstrated in evaluations of water-soluble substances, oil-soluble substances, and water-soluble mixtures. However, its predictive performance for oil-soluble mixtures had not been evaluated. Twenty-four oil-soluble mixtures were evaluated using the STE test method. The GHS NC or Irritant Category classifications of 22 oil-soluble mixtures were consistent with those of a Reconstructed human Cornea-like Epithelium (RhCE) test method. Inter-laboratory reproducibility was then confirmed using 20 blind-coded water- and oil-soluble mixtures. The concordance in GHS NC or Irritant Category among four laboratories was 90%-100%. In conclusion, the concordance with the results of the RhCE test method for the 24 oil-soluble mixtures and the inter-laboratory reproducibility for the 20 blind-coded water- and oil-soluble mixtures were good, indicating that the STE test method is a suitable alternative for predicting the eye irritation potential of both substances and mixtures. Copyright © 2018 Elsevier Ltd. All rights reserved.
Göbl, Rüdiger; Navab, Nassir; Hennersperger, Christoph
2018-06-01
Research in ultrasound imaging is limited in reproducibility by two factors: First, many existing ultrasound pipelines are protected by intellectual property, rendering exchange of code difficult. Second, most pipelines are implemented in special hardware, resulting in limited flexibility of implemented processing steps on such platforms. With SUPRA, we propose an open-source pipeline for fully software-defined ultrasound processing for real-time applications to alleviate these problems. Covering all steps from beamforming to output of B-mode images, SUPRA can help improve the reproducibility of results and make modifications to the image acquisition mode accessible to the research community. We evaluate the pipeline qualitatively, quantitatively, and regarding its run time. The pipeline shows image quality comparable to that of a clinical system and, backed by point spread function measurements, comparable resolution. Including all processing stages of a usual ultrasound pipeline, the run-time analysis shows that it can be executed in 2D and 3D on consumer GPUs in real time. Our software ultrasound pipeline opens up research in image acquisition. Given access to ultrasound data from early stages (raw channel data, radiofrequency data), it simplifies development in imaging. Furthermore, it tackles the reproducibility of research results, as code can be shared easily and even be executed without dedicated ultrasound hardware.
Current reinforcement model reproduces center-in-center vein trajectory of Physarum polycephalum.
Akita, Dai; Schenz, Daniel; Kuroda, Shigeru; Sato, Katsuhiko; Ueda, Kei-Ichi; Nakagaki, Toshiyuki
2017-06-01
Vein networks span the whole body of the amoeboid organism in the plasmodial slime mould Physarum polycephalum, and the network topology is rearranged within an hour in response to spatio-temporal variations of the environment. It has been reported that this tube morphogenesis is capable of solving mazes, and a mathematical model, named the 'current reinforcement rule', was proposed based on the adaptability of the veins. Although it is known that this model works well for reproducing some key characters of the organism's maze-solving behaviour, one important issue is still open: In the real organism, the thick veins tend to trace the shortest possible route by cutting the corners at the turn of corridors, following a center-in-center trajectory, but it has not yet been examined whether this feature also appears in the mathematical model, using corridors of finite width. In this report, we confirm that the mathematical model reproduces the center-in-center trajectory of veins around corners observed in the maze-solving experiment. © 2017 Japanese Society of Developmental Biologists.
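A minimal sketch of a current-reinforcement update of the kind discussed above is given below: conductivities grow with the flux they carry and decay otherwise. The toy network, parameters and the linear reinforcement function are illustrative assumptions, not the authors' exact model.

```python
# Minimal sketch of a 'current reinforcement rule' for vein adaptation.
# Graph, source/sink choice and parameter values are invented for illustration.
import numpy as np

# Edges of a tiny network: (node_i, node_j, length). Node 0 = food source, node 3 = sink.
edges = [(0, 1, 1.0), (1, 3, 1.0), (0, 2, 1.0), (2, 3, 1.0), (1, 2, 0.5)]
n_nodes, I0, dt = 4, 1.0, 0.1
D = np.ones(len(edges))                      # edge conductivities (vein-thickness proxy)

for step in range(200):
    # Kirchhoff equations for node pressures, with node 3 used as the pressure reference.
    A = np.zeros((n_nodes, n_nodes))
    b = np.zeros(n_nodes)
    b[0] = I0
    for (i, j, L), d in zip(edges, D):
        g = d / L
        A[i, i] += g; A[j, j] += g
        A[i, j] -= g; A[j, i] -= g
    A[3, :] = 0.0; A[3, 3] = 1.0; b[3] = 0.0  # pin reference pressure p3 = 0
    p = np.linalg.solve(A, b)

    # Flux through each edge and reinforcement update dD/dt = |Q| - D.
    Q = np.array([d / L * (p[i] - p[j]) for (i, j, L), d in zip(edges, D)])
    D += dt * (np.abs(Q) - D)

print(np.round(D, 3))   # edges carrying flux are reinforced; the unused cross-link decays
```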
FormTracer. A mathematica tracing package using FORM
NASA Astrophysics Data System (ADS)
Cyrol, Anton K.; Mitter, Mario; Strodthoff, Nils
2017-10-01
We present FormTracer, a high-performance, general purpose, easy-to-use Mathematica tracing package which uses FORM. It supports arbitrary space and spinor dimensions as well as an arbitrary number of simple compact Lie groups. While keeping the usability of the Mathematica interface, it relies on the efficiency of FORM. An additional performance gain is achieved by a decomposition algorithm that avoids redundant traces in the product tensor spaces. FormTracer supports a wide range of syntaxes, which gives it high flexibility. Mathematica notebooks that automatically install the package and guide the user through performing standard traces in space-time, spinor and gauge-group spaces are provided. Program Files doi: http://dx.doi.org/10.17632/7rd29h4p3m.1. Licensing provisions: GPLv3. Programming language: Mathematica and FORM. Nature of problem: efficiently compute traces of large expressions. Solution method: the expression to be traced is decomposed into its subspaces by a recursive Mathematica expansion algorithm. The result is subsequently translated to a FORM script that takes the traces. After FORM is executed, the final result is either imported into Mathematica or exported as optimized C/C++/Fortran code. Unusual features: the outstanding features of FormTracer are the simple interface, the capability to efficiently handle an arbitrary number of Lie groups in addition to Dirac and Lorentz tensors, and a customizable input syntax.
Nanofiber-net-binary structured membranes for highly sensitive detection of trace HCl gas
NASA Astrophysics Data System (ADS)
Wang, Xianfeng; Wang, Jialin; Si, Yang; Ding, Bin; Yu, Jianyong; Sun, Gang; Luo, Wenjing; Zheng, Gang
2012-11-01
This work describes the detection of trace hydrogen chloride (HCl) gas through analyses of the resonance frequency signal from quartz crystal microbalance (QCM) sensors coated with polyaniline (PANI) functionalized polyamide 6 (PA 6) (PANI-PA 6) nanofiber-net-binary (NNB) structured membranes. The PA 6 NNB substrate comprising nanofibers and spider-web-like nano-nets fabricated by a versatile electro-spinning/netting (ESN) process offered an ideal interface for the uniform PANI functionalization and enhanced sensing performance. Benefiting from the large specific surface area, high porosity, and strong adhesive force to the QCM electrode of the PANI-PA 6 NNB membranes, the developed HCl-selective sensors exhibited a rapid response, good reproducibility and stability, and low detection limit (7 ppb) at room temperature. Additionally, the PANI-PA 6 NNB sensing membranes presented visible color changes upon cycled exposure to HCl and ammonia, suggesting their potential application in the development of colorimetric sensors. The PANI-PA 6 NNB coated QCM sensors are considered to be a promising candidate for trace HCl gas detection in practical applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Binotti, M.; Zhu, G.; Gray, A.
An analytical approach, as an extension of one newly developed method -- First-principle OPTical Intercept Calculation (FirstOPTIC) -- is proposed to treat the geometrical impact of three-dimensional (3-D) effects on parabolic trough optical performance. The mathematical steps of this analytical approach are presented and implemented numerically as part of the suite of FirstOPTIC code. In addition, the new code has been carefully validated against ray-tracing simulation results and available numerical solutions. This new analytical approach to treating 3-D effects will facilitate further understanding and analysis of the optical performance of trough collectors as a function of incidence angle.
Force field development with GOMC, a fast new Monte Carlo molecular simulation code
NASA Astrophysics Data System (ADS)
Mick, Jason Richard
In this work GOMC (GPU Optimized Monte Carlo), a new fast, flexible, and free molecular Monte Carlo code for the simulation of atomistic chemical systems, is presented. The results of a large Lennard-Jonesium simulation in the Gibbs ensemble are presented. Force fields developed using the code are also presented. To fit the models, a quantitative fitting process using a scoring function and heat maps is outlined. The presented n-6 force fields include force fields for noble gases and branched alkanes. These force fields are shown to be the most accurate LJ or n-6 force fields to date for these compounds, capable of reproducing pure fluid behavior and binary mixture behavior to a high degree of accuracy.
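To illustrate the kind of ingredients mentioned above, the sketch below defines a Mie n-6 pair potential and a simple weighted scoring function that could be evaluated over a grid of (epsilon, sigma, n) values to build parameter heat maps; the reference and simulated observables are invented placeholders rather than GOMC output.

```python
# Illustrative sketch (not the GOMC fitting code): a Mie n-6 potential and a
# simple scoring function that could be tabulated over parameter grids to
# produce 'heat maps'. All numerical values below are invented.
import numpy as np

def mie_n6(r, epsilon, sigma, n=12.0):
    """Mie n-6 pair potential; n = 12 recovers the Lennard-Jones form."""
    c = (n / (n - 6.0)) * (n / 6.0) ** (6.0 / (n - 6.0))
    return c * epsilon * ((sigma / r) ** n - (sigma / r) ** 6)

def score(simulated, reference, weights):
    """Weighted relative deviation between simulated and reference observables."""
    s = 0.0
    for key, w in weights.items():
        s += w * abs(simulated[key] - reference[key]) / abs(reference[key])
    return s

# Example: score one hypothetical parameter set against made-up reference data.
reference = {"liquid_density": 0.60, "vapor_pressure": 0.012}   # reduced units, invented
simulated = {"liquid_density": 0.58, "vapor_pressure": 0.014}   # would come from simulation runs
print(mie_n6(1.1, epsilon=1.0, sigma=1.0, n=14.0))
print(score(simulated, reference, {"liquid_density": 1.0, "vapor_pressure": 0.5}))
```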
NASA Astrophysics Data System (ADS)
Bogiatzis, P.; Altoé, I. L.; Karamitrou, A.; Ishii, M.; Ishii, H.
2015-12-01
DigitSeis is a new open-source, interactive digitization software written in MATLAB that converts digital, raster images of analog seismograms to readily usable, discretized time series using image processing algorithms. DigitSeis automatically identifies and corrects for various geometrical distortions of seismogram images that are acquired through the original recording, storage, and scanning procedures. With human supervision, the software further identifies and classifies important features such as time marks and notes, corrects time-mark offsets from the main trace, and digitizes the combined trace with an analysis to obtain as accurate timing as possible. Although a large effort has been made to minimize the human input, DigitSeis provides interactive tools for challenging situations such as trace crossings and stains in the paper. The effectiveness of the software is demonstrated with the digitization of seismograms that are over half a century old from the Harvard-Adam Dziewoński observatory that is still in operation as a part of the Global Seismographic Network (station code HRV and network code IU). The spectral analysis of the digitized time series shows no spurious features that may be related to the occurrence of minute and hour marks. They also display signals associated with significant earthquakes, and a comparison of the spectrograms with modern recordings reveals similarities in the background noise.
Label-Free Toxin Detection by Means of Time-Resolved Electrochemical Impedance Spectroscopy
Chai, Changhoon; Takhistov, Paul
2010-01-01
The real-time detection of trace concentrations of biological toxins requires significant improvement of detection methods over those reported in the literature. To develop a highly sensitive and selective detection device, it is necessary to determine the optimal measuring conditions for the electrochemical sensor in three domains: time, frequency and polarization potential. In this work we utilized time-resolved electrochemical impedance spectroscopy for the detection of trace concentrations of Staphylococcus enterotoxin B (SEB). An anti-SEB antibody has been attached to the nano-porous aluminum surface using a 3-aminopropyltriethoxysilane/glutaraldehyde coupling system. This immobilization method allows fabrication of a highly reproducible and stable sensing device. Using the developed immobilization procedure and an optimized detection regime, it is possible to determine the presence of SEB at levels as low as 10 pg/mL in 15 minutes. PMID:22315560
Tracking and Establishing Provenance of Earth Science Datasets: A NASA-based Example
NASA Technical Reports Server (NTRS)
Ramapriyan, Hampapuram K.; Goldstein, Justin C.; Hua, Hook; Wolfe, Robert E.
2016-01-01
Information quality is of paramount importance to science. Accurate, scientifically vetted and statistically meaningful and, ideally, reproducible information engenders scientific trust and research opportunities. Therefore, so-called Highly Influential Scientific Assessments (HISA) such as the U.S. Third National Climate Assessment undergo a very rigorous process to ensure transparency and credibility. As an activity to support the transparency of such reports, the U.S. Global Change Research Program has developed the Global Change Information System (GCIS). Specifically related to the transparency of NCA3, a recent activity was carried out to trace the provenance as completely as possible for all figures in the NCA3 report that predominantly used NASA data. This paper discusses lessons learned from this activity that trace the provenance of NASA figures in a major HISA-class pdf report.
The computational nature of memory modification.
Gershman, Samuel J; Monfils, Marie-H; Norman, Kenneth A; Niv, Yael
2017-03-15
Retrieving a memory can modify its influence on subsequent behavior. We develop a computational theory of memory modification, according to which modification of a memory trace occurs through classical associative learning, but which memory trace is eligible for modification depends on a structure learning mechanism that discovers the units of association by segmenting the stream of experience into statistically distinct clusters (latent causes). New memories are formed when the structure learning mechanism infers that a new latent cause underlies current sensory observations. By the same token, old memories are modified when old and new sensory observations are inferred to have been generated by the same latent cause. We derive this framework from probabilistic principles, and present a computational implementation. Simulations demonstrate that our model can reproduce the major experimental findings from studies of memory modification in the Pavlovian conditioning literature.
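A heavily simplified sketch of the latent-cause idea is given below: each trial is assigned to the latent cause with the highest (prior x likelihood) score, and only that cause's associative weights are updated. The similarity-based likelihood, the new-cause baseline, and all parameter values are illustrative assumptions, not the published model.

```python
# Toy latent-cause sketch: cluster trials into latent causes, then update only
# the assigned cause's weights with a Rescorla-Wagner rule. All choices here
# (likelihood form, parameters) are illustrative assumptions.
import numpy as np

alpha, lr = 1.0, 0.3                # CRP concentration parameter, learning rate (assumed)
causes = []                         # each cause: {"count", "proto", "w"}

def assign(cues):
    """Assign the trial to an existing latent cause or create a new one (simplified MAP)."""
    total = sum(c["count"] for c in causes)
    new_score = alpha / (total + alpha) * np.exp(-1.0)   # baseline likelihood for a new cause
    scores = [c["count"] / (total + alpha) * np.exp(-4.0 * np.sum((cues - c["proto"]) ** 2))
              for c in causes]
    if not causes or new_score > max(scores):
        causes.append({"count": 0, "proto": cues.astype(float).copy(), "w": np.zeros(len(cues))})
        return len(causes) - 1
    return int(np.argmax(scores))

def trial(cues, outcome):
    k = assign(cues)
    c = causes[k]
    c["count"] += 1
    c["proto"] += (cues - c["proto"]) / c["count"]        # running mean of this cause's cue pattern
    pred = float(c["w"] @ cues)
    c["w"] += lr * (outcome - pred) * cues                # only this memory trace is modified
    return k

# Acquisition (cue -> outcome), then extinction with an added context feature:
for _ in range(10):
    trial(np.array([1.0, 0.0]), outcome=1.0)
for _ in range(10):
    trial(np.array([1.0, 1.0]), outcome=0.0)
print("latent causes:", len(causes), "weights:", [np.round(c["w"], 2) for c in causes])
```

In this toy run the extinction trials, carrying an extra context feature, are attributed to a new latent cause, so the original cue-outcome association survives largely intact; this is the qualitative behaviour the theory uses to explain why extinction often fails to erase the original memory.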
Comparing TCV experimental VDE responses with DINA code simulations
NASA Astrophysics Data System (ADS)
Favez, J.-Y.; Khayrutdinov, R. R.; Lister, J. B.; Lukash, V. E.
2002-02-01
The DINA free-boundary equilibrium simulation code has been implemented for TCV, including the full TCV feedback and diagnostic systems. First results showed good agreement with control coil perturbations and correctly reproduced certain non-linear features in the experimental measurements. The latest DINA code simulations, presented in this paper, exploit discharges with different cross-sectional shapes and different vertical instability growth rates which were subjected to controlled vertical displacement events (VDEs), extending previous work with the DINA code on the DIII-D tokamak. The height of the TCV vessel allows observation of the non-linear evolution of the VDE growth rate as regions of different vertical field decay index are crossed. The vertical movement of the plasma is found to be well modelled. For most experiments, DINA reproduces the S-shape of the vertical displacement in TCV with excellent precision. This behaviour cannot be modelled using linear time-independent models because of the predominant exponential shape due to the unstable pole of any linear time-independent model. The other most common equilibrium parameters like the plasma current Ip, the elongation κ, the triangularity δ, the safety factor q, the ratio between the averaged plasma kinetic pressure and the pressure of the poloidal magnetic field at the edge of the plasma βp, and the internal self inductance li also show acceptable agreement. The evolution of the growth rate γ is estimated and compared with the evolution of the closed-loop growth rate calculated with the RZIP linear model, confirming the origin of the observed behaviour.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Petitpas, Guillaume; Whitesides, Russel
UQHCCI_2 propagates the uncertainties of mass-averaged quantities (temperature, heat capacity ratio) and of the output performance metrics (IMEP, heat release, CA50 and RI) of an HCCI engine test bench, using the pressure trace, the intake and exhaust molar fractions, and the IVC temperature distributions as inputs (those inputs may be computed using another code, UQHCCI_2, or entered independently).
Learning ESL Literacy among Indo-Canadian Women.
ERIC Educational Resources Information Center
Cumming, Alister; Gill, Jaswinder
1991-01-01
Reports findings from an action research project that set up an instructional program for Punjabi-speaking women immigrants and traced their English and literacy development in classroom and home settings. Data indicate that efforts to teach and acquire literacy focused on language code; self-control strategies and schematic representations for…
Nonimaging light concentration using total internal reflection films.
Ouellette, G; Waltham, C E; Drees, R M; Poon, A; Schubank, R; Whitehead, L A
1992-05-01
We present a method of fabricating nonimaging light concentrators from total internal reflection film. A prototype has been made and tested and found to operate in agreement with predictions of ray-tracing codes. The performance of the prototype is comparable with that of concentrators made from specular reflecting materials.
NASA Technical Reports Server (NTRS)
Wang, Yongli; Benson, Robert F.
2011-01-01
Two software applications have been produced specifically for the analysis of some million digital topside ionograms produced by a recent analog-to-digital conversion effort of selected analog telemetry tapes from the Alouette-2, ISIS-1 and ISIS-2 satellites. One, TOPIST (TOPside Ionogram Scalar with True-height algorithm) from the University of Massachusetts Lowell, is designed for the automatic identification of the topside-ionogram ionospheric-reflection traces and their inversion into vertical electron-density profiles Ne(h). TOPIST also has the capability of manual intervention. The other application, from the Goddard Space Flight Center based on the FORTRAN code of John E. Jackson from the 1960s, is designed as an IDL-based interactive program for the scaling of selected digital topside-sounder ionograms. The Jackson code has also been modified, with some effort, so as to run on modern computers. This modification was motivated by the need to scale selected ionograms from the millions of Alouette/ISIS topside-sounder ionograms that only exist on 35-mm film. During this modification, it became evident that it would be more efficient to design a new code, based on the capabilities of present-day computers, than to continue to modify the old code. Such a new code has been produced and here we will describe its capabilities and compare Ne(h) profiles produced from it with those produced by the Jackson code. The concept of the new code is to assume an initial Ne(h) and derive a final Ne(h) through an iteration process that makes the resulting apparent-height profile fit the scaled values within a certain error range. The new code can be used on the X-, O-, and Z-mode traces. It does not assume any predefined profile shape between two contiguous points, like the exponential rule used in Jackson's program. Instead, Monotone Piecewise Cubic Interpolation is applied to the global profile to keep the monotone nature of the profile, which also ensures better smoothness in the final profile than in Jackson's program. The new code uses the complete refractive index expression for a cold collisionless plasma and can accommodate the IGRF, T96, and other geomagnetic field models.
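The interpolation choice mentioned above can be illustrated with SciPy's PCHIP implementation: on a made-up monotonically decreasing profile, the monotone piecewise cubic stays monotone between the scaled points, whereas an ordinary cubic spline may overshoot. This is only a sketch of the interpolation behaviour, not of the ionogram inversion itself.

```python
# Sketch: monotone piecewise cubic (PCHIP) vs. ordinary cubic spline on an
# invented, monotonically decreasing profile. Altitudes and densities are
# placeholders, not scaled ionogram values.
import numpy as np
from scipy.interpolate import PchipInterpolator, CubicSpline

h = np.array([400.0, 600.0, 800.0, 1000.0, 1500.0, 2000.0])    # altitude, km (invented)
ne = np.array([1.0e5, 9.9e4, 9.8e4, 2.0e4, 3.0e3, 1.0e3])      # electron density, cm^-3 (invented)

h_fine = np.linspace(h[0], h[-1], 500)
ne_pchip = PchipInterpolator(h, ne)(h_fine)
ne_spline = CubicSpline(h, ne)(h_fine)

print("PCHIP monotone:       ", bool(np.all(np.diff(ne_pchip) <= 0)))
print("Cubic spline monotone:", bool(np.all(np.diff(ne_spline) <= 0)))
```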
Reproducing Epidemiologic Research and Ensuring Transparency.
Coughlin, Steven S
2017-08-15
Measures for ensuring that epidemiologic studies are reproducible include making data sets and software available to other researchers so they can verify published findings, conduct alternative analyses of the data, and check for statistical errors or programming errors. Recent developments related to the reproducibility and transparency of epidemiologic studies include the creation of a global platform for sharing data from clinical trials and the anticipated future extension of the global platform to non-clinical trial data. Government agencies and departments such as the US Department of Veterans Affairs Cooperative Studies Program have also enhanced their data repositories and data sharing resources. The Institute of Medicine and the International Committee of Medical Journal Editors released guidance on sharing clinical trial data. The US National Institutes of Health has updated their data-sharing policies. In this issue of the Journal, Shepherd et al. (Am J Epidemiol. 2017;186:387-392) outline a pragmatic approach for reproducible research with sensitive data for studies for which data cannot be shared because of legal or ethical restrictions. Their proposed quasi-reproducible approach facilitates the dissemination of statistical methods and codes to independent researchers. Both reproducibility and quasi-reproducibility can increase transparency for critical evaluation, further dissemination of study methods, and expedite the exchange of ideas among researchers. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Validation of multi-temperature nozzle flow code NOZNT
NASA Technical Reports Server (NTRS)
Park, Chul; Lee, Seung-Ho
1993-01-01
A computer code NOZNT (Nozzle in n-Temperatures), which calculates one-dimensional flows of partially dissociated and ionized air in an expanding nozzle, is tested against five existing sets of experimental data. The code accounts for: a) the differences among various temperatures, i.e., translational-rotational temperature, vibrational temperatures of individual molecular species, and electron-electronic temperature, b) radiative cooling, and c) the effects of impurities. The experimental data considered are: 1) the sodium line reversal and 2) the electron temperature and density data, both obtained in a shock tunnel, and 3) the spectroscopic emission data, 4) electron beam data on vibrational temperature, and 5) mass-spectrometric species concentration data, all obtained in arc-jet wind tunnels. It is shown that the impurities are most likely responsible for the observed phenomena in shock tunnels. For the arc-jet flows, impurities are inconsequential and the NOZNT code is validated by numerically reproducing the experimental data.
NASA Technical Reports Server (NTRS)
Bobbitt, Percy J.
1992-01-01
A discussion is given of the many factors that affect sonic booms with particular emphasis on the application and development of improved computational fluid dynamics (CFD) codes. The benefits that accrue from interference (induced) lift, distributing lift using canard configurations, the use of wings with dihedral or anhedral and hybrid laminar flow control for drag reduction are detailed. The application of the most advanced codes to a wider variety of configurations along with improved ray-tracing codes to arrive at more accurate and, hopefully, lower sonic booms is advocated. Finally, it is speculated that when all of the latest technology is applied to the design of a supersonic transport it will be found environmentally acceptable.
Weindling, P
2001-01-01
The Nuremberg Code has generally been seen as arising from the Nuremberg Medical Trial. This paper examines developments prior to the Trial, involving the physiologist Andrew Conway Ivy and an inter-Allied Scientific Commission on Medical War Crimes. The paper traces the formulation of the concept of a medical war crime by the physiologist John West Thompson, as part of the background to Ivy's code on human experiments of 1 August 1946. It evaluates subsequent responses by the American Medical Association, and by other war crimes experts, notably Leo Alexander, who developed Ivy's conceptual framework. Ivy's interaction with the judges at Nuremberg alerted them to the importance of formulating ethical guidelines for clinical research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strait, R.S.; Pearson, P.K.; Sengupta, S.K.
A password system comprises a set of codewords spaced apart from one another by a Hamming distance (HD) that exceeds twice the variability that can be projected for a series of biometric measurements for a particular individual and that is less than the HD that can be encountered between two individuals. To enroll an individual, a biometric measurement is taken and exclusive-ORed with a random codeword to produce a reference value. To verify the individual later, a biometric measurement is taken and exclusive-ORed with the reference value to reproduce the original random codeword or its approximation. If the reproduced value is not a codeword, the nearest codeword to it is found, which, if the measurements are consistent, is the codeword generated during enrollment, and the bits that were corrected to produce that codeword are also toggled in the biometric measurement taken. The correction scheme can be implemented by any conventional error correction code such as a Reed-Muller code R(m,n). In the implementation using a hand geometry device, an R(2,5) code has been used in this invention. Such codeword and biometric measurement can then be used to see if the individual is an authorized user. Conventional Diffie-Hellman public key encryption schemes and hashing procedures can then be used to secure the communications lines carrying the biometric information and to secure the database of authorized users.
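A toy sketch of the enrollment/verification XOR scheme described in this record is shown below, with a 5x repetition code standing in for the Reed-Muller R(2,5) code and random bit strings standing in for hand-geometry measurements; everything here is illustrative.

```python
# Toy sketch of the codeword/XOR scheme. A repetition code replaces the
# Reed-Muller R(2,5) code of the record; bit strings are random stand-ins
# for biometric templates.
import numpy as np

rng = np.random.default_rng(1)
K, R = 6, 5                                    # message bits, repetition factor (illustrative)
N = K * R                                      # codeword / template length in bits

def encode(msg):                               # repetition encoding (stand-in for Reed-Muller)
    return np.repeat(msg, R)

def decode(word):                              # majority-vote decoding to the nearest codeword
    return (word.reshape(K, R).sum(axis=1) > R // 2).astype(np.uint8)

# Enrollment: XOR the biometric bit string with a random codeword -> stored reference.
biometric_enroll = rng.integers(0, 2, N, dtype=np.uint8)
codeword = encode(rng.integers(0, 2, K, dtype=np.uint8))
reference = biometric_enroll ^ codeword        # stored value; reveals neither part alone

# Verification: a re-measurement that differs in a few bits (within correction capacity).
noise = np.zeros(N, dtype=np.uint8)
noise[[0, 7, 16, 23]] = 1                      # one flipped bit in four of the six blocks
biometric_verify = biometric_enroll ^ noise

recovered = biometric_verify ^ reference       # approximately the enrollment codeword
corrected = encode(decode(recovered))          # snap to the nearest codeword
# Toggle the corrected bits in the measurement too, as the scheme describes.
cleaned_biometric = biometric_verify ^ (recovered ^ corrected)

print("codeword recovered:", bool(np.array_equal(corrected, codeword)))
print("enrollment measurement restored:", bool(np.array_equal(cleaned_biometric, biometric_enroll)))
```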
40 CFR 372.23 - SIC and NAICS codes to which this Part applies.
Code of Federal Regulations, 2010 CFR
2010-07-01
... facilities primarily engaged in reproducing text, drawings, plans, maps, or other copy, by blueprinting...)); 212324Kaolin and Ball Clay Mining Limited to facilities operating without a mine or quarry and that are...)); 212393Other Chemical and Fertilizer Mineral Mining Limited to facilities operating without a mine or quarry...
Constraining physical parameters of ultra-fast outflows in PDS 456 with Monte Carlo simulations
NASA Astrophysics Data System (ADS)
Hagino, K.; Odaka, H.; Done, C.; Gandhi, P.; Takahashi, T.
2014-07-01
Deep absorption lines with an extremely high velocity of ∼0.3c observed in PDS 456 spectra strongly indicate the existence of ultra-fast outflows (UFOs). However, the launching and acceleration mechanisms of UFOs are still uncertain. One possible way to address this is to constrain physical parameters as a function of distance from the source. In order to study the spatial dependence of parameters, it is essential to adopt 3-dimensional Monte Carlo simulations that treat radiation transfer in arbitrary geometry. We have developed a new simulation code for X-ray radiation reprocessed in AGN outflows. Our code implements radiative transfer in a 3-dimensional biconical disk wind geometry, based on the Monte Carlo simulation framework MONACO (Watanabe et al. 2006, Odaka et al. 2011). Our simulations reproduce the FeXXV and FeXXVI absorption features seen in the spectra. Also, broad Fe emission lines, which reflect the geometry and viewing angle, are successfully reproduced. By comparing the simulated spectra with Suzaku data, we obtained constraints on physical parameters. We discuss the launching and acceleration mechanisms of UFOs in PDS 456 based on our analysis.
Effect of doping in the Bi-Sr-Ca-Cu-O superconductor
NASA Technical Reports Server (NTRS)
Akbar, S. A.; Wong, M. S.; Botelho, M. J.; Sung, Y. M.; Alauddin, M.; Drummer, C. E.; Fair, M. J.
1991-01-01
The results of the effect of doping on the superconducting transition in the Bi-Sr-Ca-Cu-O system are reported. Samples were prepared under identical conditions with varying types (Pb, Sb, Sn, Nb) and amounts of dopants. All samples consisted of multiple phases, and showed stable and reproducible superconducting transitions. Stabilization of the well known 110 K phase depends on both the type and amount of dopant. No trace of superconducting phase of 150 K and above was observed.
Simulation of Fatigue Behavior of High Temperature Metal Matrix Composites
NASA Technical Reports Server (NTRS)
Tong, Mike T.; Singhal, Suren N.; Chamis, Christos C.; Murthy, Pappu L. N.
1996-01-01
A generalized, relatively new approach is described for the computational simulation of fatigue behavior of high temperature metal matrix composites (HT-MMCs). This theory is embedded in a special-purpose computer code. The effectiveness of the computer code in predicting the fatigue behavior of HT-MMCs is demonstrated by applying it to a silicon-fiber/titanium-matrix HT-MMC. Comparative results are shown for mechanical fatigue, thermal fatigue, thermomechanical (in-phase and out-of-phase) fatigue, as well as the effects of oxidizing environments on fatigue life. These results show that the new approach reproduces available experimental data remarkably well.
Development of battering ram vibrator system
NASA Astrophysics Data System (ADS)
Sun, F.; Chen, Z.; Lin, J.; Tong, X.
2012-12-01
This paper investigates a battering ram vibrator system in which the ram's hydraulic system is controlled by electric machinery, allowing exact control of the ram. After analysing pseudorandom coding, in which the codes "0" and "1" correspond to the rest and shake states of the ram, a pseudorandom code matched to the battering ram vibrator can be generated. In tests using the reference trace and single-shot records, the ratio of the seismic wavelet to correlation interference is about 68 dB in pseudorandom coding mode, whereas in the general mode it is only 27.9 dB. The battering ram vibrator system thus reduces the correlation interference that comes from the single shaking frequency of the ram and improves the signal-to-noise ratio of the seismic data, which can guide the application of battering ram vibrators in metal mine exploration and high-resolution seismic exploration.
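The benefit of a pseudorandom source code can be sketched with a small correlation experiment: an m-sequence generated by a linear-feedback shift register is embedded, delayed and noisy, in a synthetic record, and cross-correlation compresses it back into a sharp arrival. Sequence length, delay and noise level are invented for illustration.

```python
# Illustrative sketch (parameters invented): a pseudorandom binary sequence from
# a simple linear-feedback shift register, and the cross-correlation that recovers
# a delayed, noisy copy of it as a sharp peak.
import numpy as np

def lfsr_msequence(taps=(7, 6), nbits=7):
    """Pseudorandom sequence of length 2**nbits - 1 from a Fibonacci LFSR."""
    state = [1] * nbits
    out = []
    for _ in range(2 ** nbits - 1):
        out.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t - 1]
        state = [fb] + state[:-1]
    return np.array(out) * 2 - 1            # map {0, 1} -> {-1, +1}

code = lfsr_msequence()                      # source code: +1 = shake, -1 = rest (illustrative)
rng = np.random.default_rng(0)
delay = 40
record = np.zeros(400)
record[delay:delay + len(code)] += code      # 'ground response': delayed copy of the code
record += 0.5 * rng.standard_normal(record.size)

xcorr = np.correlate(record, code, mode="valid")
print("true delay:", delay, " estimated delay:", int(np.argmax(xcorr)))
```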
Simulations of Laboratory Astrophysics Experiments using the CRASH code
NASA Astrophysics Data System (ADS)
Trantham, Matthew; Kuranz, Carolyn; Manuel, Mario; Keiter, Paul; Drake, R. P.
2014-10-01
Computer simulations can assist in the design and analysis of laboratory astrophysics experiments. The Center for Radiative Shock Hydrodynamics (CRASH) at the University of Michigan developed a code that has been used to design and analyze high-energy-density experiments on OMEGA, NIF, and other large laser facilities. This Eulerian code uses block-adaptive mesh refinement (AMR) with implicit multigroup radiation transport, electron heat conduction and laser ray tracing. This poster/talk will demonstrate some of the experiments the CRASH code has helped design or analyze including: Kelvin-Helmholtz, Rayleigh-Taylor, imploding bubbles, and interacting jet experiments. This work is funded by the Predictive Sciences Academic Alliances Program in NNSA-ASC via Grant DEFC52-08NA28616, by the NNSA-DS and SC-OFES Joint Program in High-Energy-Density Laboratory Plasmas, Grant Number DE-NA0001840, and by the National Laser User Facility Program, Grant Number DE-NA0000850.
NASA Astrophysics Data System (ADS)
Rabie, M.; Franck, C. M.
2016-06-01
We present a freely available MATLAB code for the simulation of electron transport in arbitrary gas mixtures in the presence of uniform electric fields. For steady-state electron transport, the program provides the transport coefficients, reaction rates and the electron energy distribution function. The program uses established Monte Carlo techniques and is compatible with the electron scattering cross section files from the open-access Plasma Data Exchange Project LXCat. The code is written in object-oriented design, allowing the tracing and visualization of the spatiotemporal evolution of electron swarms and the temporal development of the mean energy and the electron number due to attachment and/or ionization processes. We benchmark our code with well-known model gases as well as the real gases argon, N2, O2, CF4, SF6 and mixtures of N2 and O2.
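As a toy illustration of the general approach (not the published MATLAB code), the sketch below follows a single electron under a uniform field with a constant total collision frequency and isotropic elastic scattering; real swarm codes sample free-flight times from energy-dependent cross sections such as those in the LXCat files mentioned above, and all units here are arbitrary.

```python
# Toy single-electron Monte Carlo drift: constant collision frequency, ballistic
# free flights along the field, isotropic elastic scattering that preserves speed.
# Values are in arbitrary, invented units.
import numpy as np

rng = np.random.default_rng(2)
qm_E = 1.0          # (q/m)*E, acceleration along z (arbitrary units)
nu = 5.0            # constant total collision frequency (arbitrary units)
v = np.zeros(3)     # electron velocity
z_total, t_total = 0.0, 0.0

for _ in range(100000):
    tau = rng.exponential(1.0 / nu)                 # free-flight time
    z_total += v[2] * tau + 0.5 * qm_E * tau ** 2   # ballistic motion along the field
    v[2] += qm_E * tau
    t_total += tau
    # Isotropic elastic scattering: keep the speed, randomize the direction.
    speed = np.linalg.norm(v)
    cos_t = rng.uniform(-1.0, 1.0)
    phi = rng.uniform(0.0, 2.0 * np.pi)
    sin_t = np.sqrt(1.0 - cos_t ** 2)
    v = speed * np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])

print("drift velocity estimate:", z_total / t_total)   # ~ qm_E / nu for this toy model
```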
P1198: software for tracing decision behavior in lending to small businesses.
Andersson, P
2001-05-01
This paper describes a process-tracing software program specially designed to capture decision behavior in lending to small businesses. The source code was written in Lotus Notes. The software runs in a Web browser and consists of two interacting systems: a database and a user interface. The database includes three realistic loan applications. The user interface consists of different but interacting screens that enable the participant to operate the software. Log files register the decision behavior of the participant. An empirical example is presented in order to show the software's potential in providing insights into judgment and decision making. The implications of the software are discussed.
Stojanovska, Jadranka; Ibrahim, El-Sayed H.; Chughtai, Aamer R.; Jackson, Elizabeth A.; Gross, Barry H.; Jacobson, Jon A.; Tsodikov, Alexander; Daneshvar, Brian; Long, Benjamin D.; Chenevert, Thomas L.; Kazerooni, Ella A.
2017-01-01
Intrathoracic fat volume, and more specifically epicardial fat volume, is an emerging imaging biomarker of adverse cardiovascular events. The purpose of this work is to show the feasibility and reproducibility of intrathoracic fat volume measurement applied to contrast-enhanced multidetector computed tomography images. A retrospective cohort study of 62 subjects free of cardiovascular disease (55% females, age = 49 ± 11 years) conducted from 2008 to 2011 formed the study group. Intrathoracic fat volume was defined as all fat voxels measuring −50 to −250 Hounsfield units within the intrathoracic cavity from the level of the pulmonary artery bifurcation to the heart apex. The intrathoracic fat was separated into epicardial and extrapericardial fat by tracing the pericardium. The measurements were obtained by 2 readers and compared for interrater reproducibility. The fat volume measurements for the study group were 141 ± 72 cm³ for intrathoracic fat, 58 ± 27 cm³ for epicardial fat, and 84 ± 50 cm³ for extrapericardial fat. There was no statistically significant difference in intrathoracic fat volume measurements between the 2 readers, with correlation coefficients of 0.88 (P = .55) for intrathoracic fat volume and −0.12 (P = .33) for epicardial fat volume. Voxel-based measurement of intrathoracic fat, including the separation into epicardial and extrapericardial fat, is feasible and highly reproducible from multidetector computed tomography scans. PMID:28626797
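The voxel-counting step described above can be sketched in a few lines: select voxels whose attenuation falls in the fat window inside a region mask and convert the count to a volume. The synthetic volume, voxel spacing and mask below are invented for illustration.

```python
# Minimal sketch of Hounsfield-unit thresholding for fat volume: count voxels in
# the fat window (-250 to -50 HU) inside a region mask and convert to cm^3.
# The synthetic CT volume, spacing and mask are invented placeholders.
import numpy as np

rng = np.random.default_rng(3)
hu = rng.normal(20, 120, size=(40, 128, 128))        # fake CT volume in Hounsfield units
voxel_volume_cm3 = 0.07 * 0.07 * 0.25                # assumed 0.7x0.7 mm in-plane, 2.5 mm slices
mask = np.zeros(hu.shape, dtype=bool)
mask[:, 32:96, 32:96] = True                         # stand-in for the traced pericardium/thorax

fat = (hu >= -250) & (hu <= -50) & mask
fat_volume_cm3 = fat.sum() * voxel_volume_cm3
print(f"fat volume: {fat_volume_cm3:.1f} cm^3")
```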
NASA Technical Reports Server (NTRS)
Baumeister, Joseph F.
1990-01-01
Analysis of energy emitted from simple or complex cavity designs can lead to intricate solutions due to nonuniform radiosity and irradiation within a cavity. A numerical ray tracing technique was applied to simulate radiation propagating within and from various cavity designs. To obtain the energy balance relationships between isothermal and nonisothermal cavity surfaces and space, the computer code NEVADA was utilized for its statistical technique applied to numerical ray tracing. The analysis method was validated by comparing results with known theoretical and limiting solutions, and the electrical resistance network method. In general, for nonisothermal cavities the performance (apparent emissivity) is a function of cylinder length-to-diameter ratio, surface emissivity, and cylinder surface temperatures. The extent of nonisothermal conditions in a cylindrical cavity significantly affects the overall cavity performance. Results are presented over a wide range of parametric variables for use as a possible design reference.
Asymptotic Learning of Alphanumeric Coding in Autobiographical Memory
ERIC Educational Resources Information Center
Martin, Maryanne; Jones, Gregory V.
2007-01-01
Studies of autobiographical memory have shown that observed levels of incidental learning are often relatively low. Do low levels of retention result simply from a low learning rate, or is learning also asymptotic? To address this question, it is necessary to trace performance over a large number of learning opportunities, and this was carried out…
Using AORSA to simulate helicon waves in DIII-D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lau, C., E-mail: lauch@ornl.gov; Blazevski, D.; Green, D. L.
2015-12-10
Recent efforts have shown that helicon waves (fast waves at >20 ω_ci) may be an attractive option for driving efficient off-axis current drive during non-inductive tokamak operation for DIII-D, ITER and DEMO. For DIII-D scenarios, the ray tracing code, GENRAY, has been extensively used to study helicon current drive efficiency and location as a function of many plasma parameters. The full wave code, AORSA, which is applicable to arbitrary Larmor radius and can resolve arbitrary ion cyclotron harmonic order, has recently been used to validate the ray tracing technique at these high cyclotron harmonics. If the SOL is ignored, it will be shown that the GENRAY and AORSA calculated current drive profiles are comparable for the envisioned high beta advanced scenarios for DIII-D, where there is high single pass absorption due to electron Landau damping and minimal ion damping. AORSA has also been used to estimate possible SOL effects on helicon current drive coupling and SOL absorption due to collisional and slow wave effects.
Combining analysis with optimization at Langley Research Center. An evolutionary process
NASA Technical Reports Server (NTRS)
Rogers, J. L., Jr.
1982-01-01
The evolutionary process of combining analysis and optimization codes was traced with a view toward providing insight into the long-term goal of developing the methodology for an integrated, multidisciplinary software system for the concurrent analysis and optimization of aerospace structures. It was traced along the lines of strength sizing, concurrent strength and flutter sizing, and general optimization to define a near-term goal for combining analysis and optimization codes. Development of a modular software system combining general-purpose, state-of-the-art, production-level analysis computer programs for structures, aerodynamics, and aeroelasticity with a state-of-the-art optimization program is required. Incorporation of a modular and flexible structural optimization software system into a state-of-the-art finite element analysis computer program will facilitate this effort. This effort results in a software system that is controlled with a special-purpose language, communicates with a data management system, and is easily modified for adding new programs and capabilities. A 337 degree-of-freedom finite element model is used in verifying the accuracy of this system.
A model of polarized-beam AGS in the ray-tracing code Zgoubi
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meot, F.; Ahrens, L.; Brown, K.
A model of the Alternating Gradient Synchrotron, based on the AGS snapramps, has been developed in the stepwise ray-tracing code Zgoubi. It has been used over the past 5 years in a number of accelerator studies aimed at enhancing RHIC proton beam polarization. It is also used to study and optimize proton and Helion beam polarization in view of future RHIC and eRHIC programs. The AGS model in Zgoubi is operational on-line via three different applications, 'ZgoubiFromSnaprampCmd', 'AgsZgoubiModel' and 'AgsModelViewer', with the latter two essentially serving as interfaces to the former, which is the actual model 'engine'. All three commands are available from the controls system application launcher in the AGS 'StartUp' menu, or from eponymous commands on shell terminals. Main aspects of the model and of its operation are presented in this technical note, brief excerpts from various studies performed so far are given for illustration, and the means and methods entering ZgoubiFromSnaprampCmd are developed further in an appendix.
Using AORSA to simulate helicon waves in DIII-D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lau, Cornwall H; Jaeger, E. F.; Bertelli, Nicola
2015-01-01
Recent efforts have shown that helicon waves (fast waves at >20 ω_ci) may be an attractive option for driving efficient off-axis current drive during non-inductive tokamak operation for DIII-D, ITER and DEMO. For DIII-D scenarios, the ray tracing code, GENRAY, has been extensively used to study helicon current drive efficiency and location as a function of many plasma parameters. The full wave code, AORSA, which is applicable to arbitrary Larmor radius and can resolve arbitrary ion cyclotron harmonic order, has recently been used to validate the ray tracing technique at these high cyclotron harmonics. If the SOL is ignored, it will be shown that the GENRAY and AORSA calculated current drive profiles are comparable for the envisioned high beta advanced scenarios for DIII-D, where there is high single pass absorption due to electron Landau damping and minimal ion damping. AORSA has also been used to estimate possible SOL effects on helicon current drive coupling and SOL absorption due to collisional and slow wave effects.
NASA Astrophysics Data System (ADS)
Kredler, L.; Häußler, W.; Martin, N.; Böni, P.
The flux is still a major limiting factor in neutron research. For instruments supplied with cold neutrons via neutron guides, both at present steady-state sources and at new spallation neutron sources, it is therefore important to optimize the instrumental setup and the neutron guidance. Optimization of the neutron guide geometry and of the instrument itself can be performed by numerical ray-tracing simulations using existing open-access codes. In this paper, we discuss how such Monte Carlo simulations have been employed to plan improvements of the Neutron Resonant Spin Echo spectrometer RESEDA (FRM II, Germany) as well as of the neutron guides before and within the instrument. The essential components have been represented with the help of the McStas ray-tracing package. The expected intensity has been tested by means of several virtual detectors implemented in the simulation code. Comparison between simulations and preliminary measurement results shows good agreement and demonstrates the reliability of the numerical approach. These results will be taken into account in the planning of new components installed in the guide system.
NASA Astrophysics Data System (ADS)
Zhang, Wending; Li, Cheng; Gao, Kun; Lu, Fanfan; Liu, Min; Li, Xin; Zhang, Lu; Mao, Dong; Gao, Feng; Huang, Ligang; Mei, Ting; Zhao, Jianlin
2018-05-01
Au-nanoparticle (Au-NP) substrates for surface-enhanced Raman spectroscopy (SERS) were fabricated by grid-like scanning of a Au film with femtosecond pulses. The Au-NPs were directly deposited on the Au-film surface due to the scanning process. The experimentally obtained Au-NPs exhibited a local surface plasmon resonance effect in the visible spectral range, as verified by finite difference time domain simulations and the measured reflection spectrum. SERS experiments using the Au-NP substrates exhibited high activity, excellent substrate reproducibility and stability, and clear Raman spectra of target analytes, e.g. Rhodamine-6G, Rhodamine-B and Malachite green, at concentrations down to 10⁻⁹ M. This work presents an effective approach to producing Au-NP SERS substrates with advantages in activity, reproducibility and stability, which could be used in a wide variety of practical applications for trace amount detection.
Zhang, Wending; Li, Cheng; Gao, Kun; Lu, Fanfan; Liu, Min; Li, Xin; Zhang, Lu; Mao, Dong; Gao, Feng; Huang, Ligang; Mei, Ting; Zhao, Jianlin
2018-05-18
Au-nanoparticle (Au-NP) substrates for surface-enhanced Raman spectroscopy (SERS) were fabricated by grid-like scanning of a Au film with femtosecond pulses. The Au-NPs were directly deposited on the Au-film surface due to the scanning process. The experimentally obtained Au-NPs exhibited a local surface plasmon resonance effect in the visible spectral range, as verified by finite difference time domain simulations and the measured reflection spectrum. SERS experiments using the Au-NP substrates exhibited high activity, excellent substrate reproducibility and stability, and clear Raman spectra of target analytes, e.g. Rhodamine-6G, Rhodamine-B and Malachite green, at concentrations down to 10⁻⁹ M. This work presents an effective approach to producing Au-NP SERS substrates with advantages in activity, reproducibility and stability, which could be used in a wide variety of practical applications for trace amount detection.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merolle, L., E-mail: lucia.merolle@elettra.eu; Gianoncelli, A.; Malucelli, E., E-mail: emil.malucelli@unibo.it
2016-01-28
Elemental analysis of biological samples can give information about the content and distribution of elements essential for human life, or of trace elements whose absence is the cause of abnormal biological function or development. However, biological systems contain an ensemble of cells with heterogeneous chemistry and elemental content; therefore, accurate characterization of samples with high cellular heterogeneity may only be achieved by analyzing single cells. Powerful methods in molecular biology are abundant; among them, X-ray microscopy based on synchrotron light sources has been gaining increasing attention thanks to its extreme sensitivity. However, the reproducibility and repeatability of these measurements is one of the major obstacles to achieving statistical significance in single-cell population analysis. In this study, we compared the elemental content of human colon adenocarcinoma cells obtained by three distinct accesses to synchrotron radiation light.
NASA Astrophysics Data System (ADS)
Monaco, P.
2007-12-01
We present some results of the new MORGANA model for the rise of galaxies and active nuclei, and show that the improved physical motivation of the description of star formation and feedback allows one to obtain hints about the physical processes at play. We propose that the high level of turbulence in star-forming bulges is at the base of the observed downsizing of AGNs. In this framework it is also possible to reproduce the recently obtained evidence that most low-redshift accretion is powered by relatively massive, slowly accreting black holes. Besides, we notice that many galaxy formation models (including MORGANA) fail to reproduce a basic observable, namely the number density of 10^{11} M_⊙ galaxies at z ∼ 1, as traced by the GOODS-MUSIC sample. This points to a possibly missing ingredient in the modeling of stellar feedback.
Solid-state modeling of the terahertz spectrum of the high explosive HMX.
Allis, Damian G; Prokhorova, Darya A; Korter, Timothy M
2006-02-09
The experimental solid-state terahertz (THz) spectrum (3-120 cm⁻¹) of the beta-crystal form of the high explosive octahydro-1,3,5,7-tetranitro-1,3,5,7-tetrazocine (HMX) has been analyzed using solid-state density functional theory calculations. Various density functionals (both generalized gradient approximation and local density approximation) are compared in terms of their abilities to reproduce the experimentally observed solid-state structure and low-frequency vibrational motions. Good-to-excellent agreement between solid-state theory and experiment can be achieved in the THz region where isolated-molecule calculations fail to reproduce the observed spectral features, demonstrating a clear limitation of using isolated-molecule calculations for the assignment of THz frequency motions in molecular solids. The deficiency of isolated-molecule calculations is traced to modification of the molecular structure in the solid state through crystal packing effects and the formation of weak C-H...O hydrogen bonds.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Platania, P., E-mail: platania@ifp.cnr.it; Figini, L.; Farina, D.
The purpose of this work is the optical modeling and physical performance evaluation of the JT-60SA ECRF launcher system. The beams have been simulated with the electromagnetic code GRASP® and used as input for ECCD calculations performed with the beam tracing code GRAY, which is capable of modeling propagation, absorption and current drive of an EC Gaussian beam with general astigmatism. Full details of the optical analysis have been taken into account to model the launched beams. Inductive and advanced reference scenarios have been analysed for physical evaluations in the full poloidal and toroidal steering ranges for two slightly different layouts of the launcher system.
Analysis Of The Boeing FEL Mirror Measurements
NASA Astrophysics Data System (ADS)
Knapp, Charles E.; Viswanathan, Vriddhachalam K.; Appert, Quentin D.
1989-07-01
The aberrations have been measured for the finished mirrors that are part of the Burst Mode ring resonator of the Free Electron Laser (FEL) being constructed at the Boeing Aerospace Company in Seattle, Washington. This paper presents analysis of these measurements using the GLAD code, a diffraction ray-tracing code. The diffraction losses within the resonator due to the aberrations are presented. The analysis was conducted in two different modes, a paraxial approximation and a full 3-D calculation, and good agreement between the two approaches is shown. Finally, a proposed solution to the problems caused by the aberrations is presented and analyzed.
Baba, H; Onizuka, Y; Nakao, M; Fukahori, M; Sato, T; Sakurai, Y; Tanaka, H; Endo, S
2011-02-01
In this study, microdosimetric energy distributions of secondary charged particles from the ¹⁰B(n,α)⁷Li reaction in boron-neutron capture therapy (BNCT) field were calculated using the Particle and Heavy Ion Transport code System (PHITS). The PHITS simulation was performed to reproduce the geometrical set-up of an experiment that measured the microdosimetric energy distributions at the Kyoto University Reactor where two types of tissue-equivalent proportional counters were used, one with A-150 wall alone and another with a 50-ppm-boron-loaded A-150 wall. It was found that the PHITS code is a useful tool for the simulation of the energy deposited in tissue in BNCT based on the comparisons with experimental results.
NASA Astrophysics Data System (ADS)
Bagli, Enrico; Guidi, Vincenzo
2013-08-01
A toolkit for the simulation of coherent interactions between high-energy charged particles and complex crystal structures, called DYNECHARM++, has been developed. The code is written in C++, taking advantage of object-oriented programming. It is capable of evaluating the electrical characteristics of complex atomic structures and of simulating and tracking particle trajectories within them. A calculation method for the electrical characteristics based on their expansion in Fourier series has been adopted. Two different approaches to simulating the interaction have been adopted, relying on the full integration of particle trajectories under the continuum potential approximation and on the definition of cross-sections of coherent processes. Finally, the code has been shown to reproduce experimental results and to simulate the interaction of charged particles with complex structures.
Reproducibility in Computational Neuroscience Models and Simulations
McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.
2016-01-01
Objective: Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods: Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results: Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: (1) standardized neural simulators; (2) shared computational resources; (3) declarative model descriptors, ontologies and standardized annotations; (4) model sharing repositories and sharing standards. Conclusion: A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance: Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845
Running an open experiment: transparency and reproducibility in soil and ecosystem science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bond-Lamberty, Benjamin; Smith, Ashly P.; Bailey, Vanessa L.
Researchers in soil and ecosystem science, and almost every other field, are being pushed--by funders, journals, governments, and their peers--to increase transparency and reproducibility of their work. A key part of this effort is a move towards open data as a way to fight post-publication data loss, improve data and code quality, enable powerful meta- and cross-disciplinary analyses, and increase trust in, and the efficiency of, publicly-funded research. Many scientists however lack experience in, and may be unsure of the benefits of, making their data and fully-reproducible analyses publicly available. Here we describe a recent "open experiment", in which we documented every aspect of a soil incubation online, making all raw data, scripts, diagnostics, final analyses, and manuscripts available in real time. We found that using tools such as version control, issue tracking, and open-source statistical software improved data integrity, accelerated our team's communication and productivity, and ensured transparency. There are many avenues to improve scientific reproducibility and data availability, of which this is only one example, and it is not an approach suited for every experiment or situation. Nonetheless, we encourage the communities in our respective fields to consider its advantages, and to lead rather than follow with respect to scientific reproducibility, transparency, and data availability.
Running an open experiment: transparency and reproducibility in soil and ecosystem science
NASA Astrophysics Data System (ADS)
Bond-Lamberty, Ben; Peyton Smith, A.; Bailey, Vanessa
2016-08-01
Researchers in soil and ecosystem science, and almost every other field, are being pushed—by funders, journals, governments, and their peers—to increase transparency and reproducibility of their work. A key part of this effort is a move towards open data as a way to fight post-publication data loss, improve data and code quality, enable powerful meta- and cross-disciplinary analyses, and increase trust in, and the efficiency of, publicly-funded research. Many scientists, however, lack experience in, and may be unsure of the benefits of, making their data and fully-reproducible analyses publicly available. Here we describe a recent ‘open experiment’, in which we documented every aspect of a soil incubation online, making all raw data, scripts, diagnostics, final analyses, and manuscripts available in real time. We found that using tools such as version control, issue tracking, and open-source statistical software improved data integrity, accelerated our team’s communication and productivity, and ensured transparency. There are many avenues to improve scientific reproducibility and data availability, of which this is only one example, and it is not an approach suited for every experiment or situation. Nonetheless, we encourage the communities in our respective fields to consider its advantages, and to lead rather than follow with respect to scientific reproducibility, transparency, and data availability.
RELAP-7 Code Assessment Plan and Requirement Traceability Matrix
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Junsoo; Choi, Yong-joon; Smith, Curtis L.
2016-10-01
RELAP-7, a safety analysis code for nuclear reactor systems, is under development at Idaho National Laboratory (INL). Overall, the code development is directed towards leveraging the advancements in computer science technology, numerical solution methods and physical models over the last decades. Recently, INL has also been making an effort to establish a code assessment plan, which aims to ensure an improved final product quality through the RELAP-7 development process. The ultimate goal of this plan is to propose a suitable way to systematically assess the wide range of software requirements for RELAP-7, including the software design, user interface, and technical requirements. To this end, we first survey the literature (i.e., international and domestic reports and research articles) addressing the desirable features generally required for advanced nuclear system safety analysis codes. In addition, the V&V (verification and validation) efforts as well as the legacy issues of several recently developed codes (e.g., RELAP5-3D, TRACE V5.0) are investigated. Lastly, this paper outlines the Requirement Traceability Matrix (RTM) for RELAP-7, which can be used to systematically evaluate and identify the code development process and its present capability.
NASA Astrophysics Data System (ADS)
Bahramvash Shams, S.; Walden, V. P.; Turner, D. D.
2017-12-01
Measurements of trace gases at high temporal resolution are important for understanding variations and trends at high latitudes. Trace gases over Greenland can be influenced by both long-range transport from pollution sources and local chemical processes. Satellite retrievals are an important data source in the polar regions, but accurate ground-based measurements are needed for proper validation, especially in data-sparse regions. A moderate-resolution (0.5 cm⁻¹) Fourier transform infrared spectrometer (FTIR), the Polar Atmospheric Emitted Radiance Interferometer (P-AERI), has been operated at Summit Station, Greenland, as part of the ICECAPS project since 2010. In this study, trace gas concentrations, including ozone, nitrous oxide, and methane, are retrieved using different optimal estimation retrieval codes. We first present results of retrieved gases using synthetic spectra (from a radiative transfer model) that mimic P-AERI measurements to evaluate systematic errors in the inverse models. We also retrieve time series of trace gas concentrations during periods of clear skies over Summit. We investigate the amount of vertical information that can be obtained with moderate resolution spectra for each of the trace gases, and also the impact of the seasonal variation of atmospheric water vapor on the retrievals. Data from surface observations and ozonesondes obtained by the NOAA Global Monitoring Division are used to improve the retrievals and as validation.
Reproducibility of Ba/Ca variations recorded by northeast Pacific bamboo corals
NASA Astrophysics Data System (ADS)
Serrato Marks, G.; LaVigne, M.; Hill, T. M.; Sauthoff, W.; Guilderson, T. P.; Roark, E. B.; Dunbar, R. B.; Horner, T. J.
2017-09-01
Trace elemental ratios preserved in the calcitic skeleton of bamboo corals have been shown to serve as archives of past ocean conditions. The concentration of dissolved barium (Ba_SW), a bioactive nutrient-like element, is linked to biogeochemical processes such as the cycling and export of nutrients. Recent work has calibrated bamboo coral Ba/Ca, a new Ba_SW proxy, using corals spanning the oxygen minimum zone beneath the California Current System. However, it was previously unclear whether Ba/Ca_coral records were internally reproducible. Here we investigate the accuracy of using laser ablation inductively coupled plasma mass spectrometry for Ba/Ca_coral analyses and test the internal reproducibility of Ba/Ca among replicate radial transects in the calcite of nine bamboo corals collected from the Gulf of Alaska (643-720 m) and the California margin (870-2054 m). Data from replicate Ba/Ca transects were aligned using visible growth bands to account for nonconcentric growth; smoothed data were reproducible within 4% for eight corals (
Sharma, Kanishka; Caroli, Anna; Quach, Le Van; Petzold, Katja; Bozzetto, Michela; Serra, Andreas L.; Remuzzi, Giuseppe; Remuzzi, Andrea
2017-01-01
Background: In autosomal dominant polycystic kidney disease (ADPKD), total kidney volume (TKV) is regarded as an important biomarker of disease progression and different methods are available to assess kidney volume. The purpose of this study was to identify the most efficient kidney volume computation method to be used in clinical studies evaluating the effectiveness of treatments on ADPKD progression. Methods and findings: We measured single kidney volume (SKV) on two series of MR and CT images from clinical studies on ADPKD (experimental dataset) by two independent operators (expert and beginner), twice, using all of the available methods: polyline manual tracing (reference method), free-hand manual tracing, semi-automatic tracing, Stereology, Mid-slice and Ellipsoid method. Additionally, the expert operator also measured the kidney length. We compared different methods for reproducibility, accuracy, precision, and time required. In addition, we performed a validation study to evaluate the sensitivity of these methods to detect the between-treatment group difference in TKV change over one year, using MR images from a previous clinical study. Reproducibility was higher on CT than MR for all methods, being highest for manual and semiautomatic contouring methods (planimetry). On MR, planimetry showed the highest accuracy and precision, while on CT accuracy and precision of both planimetry and Stereology methods were comparable. The Mid-slice and Ellipsoid methods, as well as kidney length, were fast but provided only a rough estimate of kidney volume. The results of the validation study indicated that planimetry and Stereology allow a substantially lower number of patients to be used to detect changes in kidney volume induced by drug treatment, as compared to other methods. Conclusions: Planimetry should be preferred over fast and simplified methods for accurately monitoring ADPKD progression and assessing drug treatment effects. Expert operators, especially on MR images, are required for performing reliable estimation of kidney volume. The use of efficient TKV quantification methods considerably reduces the number of patients to enrol in clinical investigations, making them more feasible and significant. PMID:28558028
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makwana, K. D., E-mail: kirit.makwana@gmx.com; Cattaneo, F.; Zhdankin, V.
Simulations of decaying magnetohydrodynamic (MHD) turbulence are performed with a fluid and a kinetic code. The initial condition is an ensemble of long-wavelength, counter-propagating, shear-Alfvén waves, which interact and rapidly generate strong MHD turbulence. The total energy is conserved and the rate of turbulent energy decay is very similar in both codes, although the fluid code has numerical dissipation, whereas the kinetic code has kinetic dissipation. The inertial range power spectrum index is similar in both codes. The fluid code shows a perpendicular wavenumber spectral slope of k⊥^(-1.3). The kinetic code shows a spectral slope of k⊥^(-1.5) for the smaller simulation domain, and k⊥^(-1.3) for the larger domain. We estimate that collisionless damping mechanisms in the kinetic code can account for the dissipation of the observed nonlinear energy cascade. Current sheets are geometrically characterized. Their lengths and widths are in good agreement between the two codes. The length scales linearly with the driving scale of the turbulence. In the fluid code, their thickness is determined by the grid resolution as there is no explicit diffusivity. In the kinetic code, their thickness is very close to the skin depth, irrespective of the grid resolution. This work shows that kinetic codes can reproduce the MHD inertial range dynamics at large scales, while at the same time capturing important kinetic physics at small scales.
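As an aside on how such spectral slopes are typically extracted (this is only an illustration, not the authors' analysis), a power-law exponent can be estimated by a least-squares fit in log-log space over the inertial range; the function and variable names below are assumptions.

```python
import numpy as np

def spectral_slope(k_perp, E_perp, k_min, k_max):
    """Least-squares power-law fit E(k) ~ k^alpha over a chosen inertial range.

    k_perp, E_perp : 1D arrays of perpendicular wavenumbers and spectral energy
    k_min, k_max   : bounds of the fitting range
    Returns the fitted exponent alpha (e.g. roughly -1.3 or -1.5).
    """
    sel = (k_perp >= k_min) & (k_perp <= k_max) & (E_perp > 0)
    alpha, _ = np.polyfit(np.log(k_perp[sel]), np.log(E_perp[sel]), 1)
    return alpha

# Toy usage: a synthetic k^-1.5 spectrum should return approximately -1.5.
k = np.logspace(0, 2, 200)
E = k**-1.5 * (1.0 + 0.05 * np.random.default_rng(0).standard_normal(k.size))
print(spectral_slope(k, E, k_min=2.0, k_max=50.0))
```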
The computational nature of memory modification
Gershman, Samuel J; Monfils, Marie-H; Norman, Kenneth A; Niv, Yael
2017-01-01
Retrieving a memory can modify its influence on subsequent behavior. We develop a computational theory of memory modification, according to which modification of a memory trace occurs through classical associative learning, but which memory trace is eligible for modification depends on a structure learning mechanism that discovers the units of association by segmenting the stream of experience into statistically distinct clusters (latent causes). New memories are formed when the structure learning mechanism infers that a new latent cause underlies current sensory observations. By the same token, old memories are modified when old and new sensory observations are inferred to have been generated by the same latent cause. We derive this framework from probabilistic principles, and present a computational implementation. Simulations demonstrate that our model can reproduce the major experimental findings from studies of memory modification in the Pavlovian conditioning literature. DOI: http://dx.doi.org/10.7554/eLife.23763.001 PMID:28294944
High resolution fate map of the zebrafish diencephalon.
Russek-Blum, Niva; Nabel-Rosen, Helit; Levkowitz, Gil
2009-07-01
The diencephalon acts as an interactive site between the sensory, central, and endocrine systems and is one of the most elaborate structures in the vertebrate brain. To better understand the embryonic development and morphogenesis of the diencephalon, we developed an improved photoactivation (uncaging)-based lineage tracing strategy. To determine the exact position of a given diencephalic progenitor domain, we used a transgenic line driving green fluorescent protein (GFP) in cells expressing the proneural protein, Neurogenin1 (Neurog1), which was used as a visible neural plate landmark. This approach facilitated precise labeling of defined groups of cells in the prospective diencephalon of the zebrafish neural plate. In this manner, we labeled multiple overlapping areas of the diencephalon, thereby ensuring both accuracy and reproducibility of our lineage tracing regardless of the dynamic changes of the developing neural plate. We present a fate map of the zebrafish diencephalon at a higher spatial resolution than previously described. (c) 2009 Wiley-Liss, Inc.
NASA Astrophysics Data System (ADS)
Zuo, Zewen; Zhu, Kai; Ning, Lixin; Cui, Guanglei; Qu, Jun; Cheng, Ying; Wang, Junzhuan; Shi, Yi; Xu, Dongsheng; Xin, Yu
2015-01-01
Wafer-scale three-dimensional (3D) surface-enhanced Raman scattering (SERS) substrates were prepared using the plasma etching and ion sputtering methods that are completely compatible with well-established silicon device technologies. The substrates are highly sensitive with excellent uniformity and reproducibility, exhibiting an enhancement factor up to 10¹² with a very low relative standard deviation (RSD) around 5%. These are attributed mainly to the uniformly distributed, multiple-type high-density hot spots originating from the structural characteristics of Ag nanoparticle (NP) decorated Si nanocone (NC) arrays. We demonstrate that trace dimethyl phthalate (DMP) at a concentration of 10⁻⁷ M can be well detected using this SERS substrate, showing that the AgNP-decorated SiNC arrays can serve as efficient SERS substrates for phthalate acid ester (PAE) detection with high sensitivity.
Bobrowski, A
1994-05-01
The catalytic adsorptive stripping voltammetric method with alpha-benzil dioxime and nitrite affords numerous advantages in cobalt determination. The detailed conditions of the determination of cobalt traces in metallic zinc by catalytic adsorptive stripping voltammetry have been investigated. Both the linear sweep and the differential pulse stripping modes can be used with similar sensitivity. Possible interferences by Mn, Pb, Cu, Ni and Fe are evaluated. In the presence of a 5 × 10⁵-fold excess of Zn, the linear dependence of the cobalt CASV peak current on concentration ranged from 0.05 μg/l to 3 μg/l. Optimal conditions include an accumulation potential of -0.65 V and an accumulation time of 10 s. The results of the determination of Co at the 10⁻⁵% level in metallic zinc showed good reproducibility (relative standard deviation, RSD = 0.07) and reliability.
Moyo, Mambo; Okonkwo, Jonathan O; Agyei, Nana M
2014-03-05
A biosensor for trace metal ions based on horseradish peroxidase (HRP) immobilized on maize tassel-multiwalled carbon nanotube (MT-MWCNT) through electrostatic interactions is described herein. The biosensor was characterized using Fourier transform infrared (FTIR), UV-vis spectrometry, voltammetric and amperometric methods. The FTIR and UV-vis results inferred that HRP was not denatured during its immobilization on MT-MWCNT composite. The biosensing principle was based on the determination of the cathodic responses of the immobilized HRP to H₂O₂, before and after incubation in trace metal standard solutions. Under optimum conditions, the inhibition rates of trace metals were proportional to their concentrations in the range of 0.092-0.55 mg L⁻¹, 0.068-2 mg L⁻¹ for Pb²⁺ and Cu²⁺ respectively. The limits of detection were 2.5 μg L⁻¹ for Pb²⁺ and 4.2 μg L⁻¹ for Cu²⁺. Representative Dixon and Cornish-Bowden plots were used to deduce the mode of inhibition induced by the trace metal ions. The inhibition was reversible and mixed for both metal ions. Furthermore, the biosensor showed good stability, selectivity, repeatability and reproducibility. Copyright © 2013 Elsevier Inc. All rights reserved.
What to do with a Dead Research Code
NASA Astrophysics Data System (ADS)
Nemiroff, Robert J.
2016-01-01
The project has ended -- should all of the computer codes that enabled the project be deleted? No. Like research papers, research codes typically carry valuable information past project end dates. Several possible end states to the life of research codes are reviewed. Historically, codes are typically left dormant on an increasingly obscure local disk directory until forgotten. These codes will likely become any or all of: lost, impossible to compile and run, difficult to decipher, and deleted when the code's proprietor moves on or dies. It is argued here, though, that it would be better for both code authors and astronomy generally if project codes were archived after use in some way. Archiving is advantageous for code authors because archived codes might increase the author's ADS citable publications, while astronomy as a science gains transparency and reproducibility. Paper-specific codes should be included in the publication of the journal papers they support, just like figures and tables. General codes that support multiple papers, possibly written by multiple authors, including their supporting websites, should be registered with a code registry such as the Astrophysics Source Code Library (ASCL). Codes developed on GitHub can be archived with a third-party service such as, currently, BackHub. An important code version might be uploaded to a web archiving service like, currently, Zenodo or Figshare, so that this version receives a Digital Object Identifier (DOI), enabling it to be found at a stable address into the future. Similar archiving services that are not DOI-dependent include perma.cc and the Internet Archive Wayback Machine at archive.org. Perhaps most simply, copies of important codes with lasting value might be kept on a cloud service like, for example, Google Drive, while activating Google's Inactive Account Manager.
A new software for deformation source optimization, the Bayesian Earthquake Analysis Tool (BEAT)
NASA Astrophysics Data System (ADS)
Vasyura-Bathke, H.; Dutta, R.; Jonsson, S.; Mai, P. M.
2017-12-01
Modern studies of crustal deformation and the related source estimation, including magmatic and tectonic sources, increasingly use non-linear optimization strategies to estimate geometric and/or kinematic source parameters and often jointly consider both geodetic and seismic data. Bayesian inference is increasingly being used for estimating posterior distributions of deformation source model parameters, given measured/estimated/assumed data and model uncertainties. For instance, some studies consider uncertainties of a layered medium and propagate these into source parameter uncertainties, while others use informative priors to reduce the model parameter space. In addition, innovative sampling algorithms have been developed to efficiently explore the high-dimensional parameter spaces. Compared to earlier studies, these improvements have resulted in overall more robust source model parameter estimates that include uncertainties. However, the computational burden of these methods is high and estimation codes are rarely made available along with the published results. Even if the codes are accessible, it is usually challenging to assemble them into a single optimization framework as they are typically coded in different programming languages. Therefore, further progress and future applications of these methods/codes are hampered, while reproducibility and validation of results have become essentially impossible. In the spirit of providing open-access and modular codes to facilitate progress and reproducible research in deformation source estimation, we undertook the effort of developing BEAT, a Python package that comprises all the above-mentioned features in a single programming environment. The package builds on the pyrocko seismological toolbox (www.pyrocko.org), and uses the pymc3 module for Bayesian statistical model fitting. BEAT is an open-source package (https://github.com/hvasbath/beat), and we encourage and solicit contributions to the project. Here, we present our strategy for developing BEAT and show application examples, especially the effect of including the model prediction uncertainty of the velocity model in subsequent source optimizations: full moment tensor, Mogi source, moderate strike-slip earthquake.
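To illustrate the kind of Bayesian source estimation such packages automate, here is a minimal conceptual sketch using a plain random-walk Metropolis sampler and a toy one-parameter forward model; it is not BEAT's pymc3 machinery, and all names and numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

def forward(volume_change, r):
    """Placeholder forward model: toy surface uplift from a buried point source."""
    depth = 3.0  # km, held fixed in this sketch
    return volume_change * depth / (r**2 + depth**2) ** 1.5

# Synthetic "observed" uplift with a known (diagonal) data covariance.
r_obs = np.linspace(0.5, 10.0, 20)
sigma = 0.002
d_obs = forward(0.05, r_obs) + rng.normal(0, sigma, r_obs.size)

def log_post(vc):
    if not (0.0 < vc < 1.0):                 # uniform prior bounds
        return -np.inf
    resid = d_obs - forward(vc, r_obs)
    return -0.5 * np.sum((resid / sigma) ** 2)   # Gaussian likelihood

# Random-walk Metropolis sampling of the posterior.
samples, vc = [], 0.1
lp = log_post(vc)
for _ in range(20000):
    prop = vc + rng.normal(0, 0.01)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        vc, lp = prop, lp_prop
    samples.append(vc)

samples = np.array(samples[5000:])           # discard burn-in
print(samples.mean(), samples.std())         # posterior mean and spread of the source parameter
```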
NASA Astrophysics Data System (ADS)
Lizotte, Todd E.; Ohar, Orest P.
2009-05-01
At a border security conference in August 2008, Michael Sullivan, acting director of the Bureau of Alcohol, Tobacco, Firearms and Explosives, stated that "Nearly all illegal firearms (90% to 95%) seized in Mexico come from the United States"[1]. When firearms are recovered at a crime scene, they can be traced, providing specific details on illegal firearm dealers or straw purchasers within the United States. Criminals or narco-terrorist groups target US dealers to source firearms for drug cartels in Mexico and South America. Joint programs between US and Mexican law enforcement have been effective; however, in most cases the firearms that are seized are only a small fraction of the firearms trafficked across the United States border. A technology called Microstamping, when applied to newly manufactured firearms, will provide further opportunities for tracing illegal firearms for law enforcement in the United States and across the globe. Microstamping is a patented technology and trace solution where intentional tooling marks are formed or micromachined onto a firearm's interior surfaces that come into contact with or impact the surfaces of cartridge casings. The intentional tooling marks can take the form of alphanumeric codes or encoded geometric codes, such as a barcode. As the firearm is discharged, the intentional tooling marks transfer a code to the cartridge casing before it is ejected out of the firearm. When recovered at the scene of an incident, the microstamped cartridge can identify a specific firearm, without the need to recover that firearm. Microstamping provides critical intelligence for use in border security operations and cross-border violent drug-related crime investigations. This paper will explain the key attributes of microstamping technology, including its potential benefits in border security operations and how data gathered from the technique can be used in geospatial information systems to identify illicit firearm sources and trafficking routes, as well as to support spatial and temporal mapping of narco-terrorist movements on either side of the border.
Wei, Tianfu; Chen, Zhengyi; Li, Gongke; Zhang, Zhuomin
2018-05-04
Aflatoxins are highly toxic mycotoxin contaminants that pose serious food safety risks. It is very important to determine trace aflatoxins in food precisely and rapidly. In this study, we designed a porous monolithic column based on covalent cross-linked polymer gels for online extraction and analysis of trace aflatoxins in food samples with complicated matrices, coupled with high-performance liquid chromatography with ultraviolet detection (HPLC-UV). The prepared monolithic column showed excellent enrichment performance due to its good permeability, good reproducibility and long life span. A study of the adsorption mechanism suggested that the excellent enrichment performance of this monolithic column was attributable to the combined effects of π-π stacking interaction, hydrophobic effect and steric effect. When the online analytical method was applied to the determination of trace aflatoxins in real food samples, aflatoxin G1 and aflatoxin B1 were found in one positive bean sauce sample and quantified at 32.8 and 26.4 μg/kg, respectively. Aflatoxin G1 was also found in one bean sample and quantified at 25.9 μg/kg. The detection limits of the developed method were in the range of 0.08-0.2 μg/kg, and the recoveries for spiked samples were in the range of 76.1 to 113% with RSDs of 1.1-9.6%. The developed method proved to be a promising approach for online enrichment and analysis of trace aflatoxins in complicated food samples. Copyright © 2018 Elsevier B.V. All rights reserved.
Epiphytic cryptogams as a source of bioaerosols and trace gases
NASA Astrophysics Data System (ADS)
Ruckteschler, Nina; Hrabe de Angelis, Isabella; Zartman, Charles E.; Araùjo, Alessandro; Pöschl, Ulrich; Manzi, Antonio O.; Andreae, Meinrat O.; Pöhlker, Christopher; Weber, Bettina
2016-04-01
Cryptogamic covers comprise (cyano-)bacteria, algae, lichens, bryophytes, fungi, and archaea in varying proportions. These organisms do not form flowers, but reproduce by spores or cell cleavage, with these reproductive units being dispersed via the atmosphere. As so-called poikilohydric organisms they are unable to regulate their water content, and their physiological activity pattern mainly follows the external water conditions. We hypothesize that both spore dispersal and the release of trace gases are governed by the moisture patterns of these organisms and thus they could have a greater impact on the atmosphere than previously thought. In order to test this hypothesis, we initiated experiments at the study site Amazonian Tall Tower Observatory (ATTO) in September 2014. We installed microclimate sensors in epiphytic cryptogams at four different heights of a tree to monitor the activity patterns of these organisms. Self-developed moisture probes are used to analyze the water status of the organisms, accompanied by light and temperature sensors. The continuously logged data are linked to ongoing measurements of trace gases and particulate bioaerosols to analyze the relevance of cryptogams for these measurements. Here, we are particularly interested in diurnal cycles of coarse mode particles and the atmospheric abundance of fine potassium-rich particles from a currently unknown biogenic source. Based upon the results of this field study we also investigate the bioaerosol and trace gas release patterns of cryptogamic covers under controlled conditions. With this combined approach of field and laboratory experiments we aim to disclose the role of cryptogamic covers in bioaerosol and trace gas release patterns in the Amazonian rainforest.
NASA Astrophysics Data System (ADS)
Alyami, Abeer; Saviello, Daniela; McAuliffe, Micheal A. P.; Cucciniello, Raffaele; Mirabile, Antonio; Proto, Antonio; Lewis, Liam; Iacopino, Daniela
2017-08-01
Au nanorods were used as an alternative to commonly used Ag nanoparticles as Surface Enhanced Raman Scattering (SERS) probes for identification of dye composition of blue BIC ballpoint pens. When used in combination with Thin Layer Chromatography (TLC), Au nanorod colloids allowed identification of the major dye components of the BIC pen ink, otherwise not identifiable by normal Raman spectroscopy. Thanks to their enhanced chemical stability compared to Ag colloids, Au nanorods provided stable and reproducible SERS signals and allowed easy identification of phthalocyanine and triarylene dyes in the pen ink mixture. These findings were supported by FTIR and MALDI analyses, also performed on the pen ink. Furthermore, the self-assembly of Au nanorods into large area ordered superstructures allowed identification of BIC pen traces. SERS spectra of good intensity and high reproducibility were obtained using Au nanorod vertical arrays, due to the high density of hot spots and morphological reproducibility of these superstructures. These results open the way to the employment of SERS for fast screening analysis and for quantitative analysis of pens and faded pens which are relevant for the fields of forensic and art conservation sciences.
Recommendations for open data science.
Gymrek, Melissa; Farjoun, Yossi
2016-01-01
Life science research increasingly relies on large-scale computational analyses. However, the code and data used for these analyses are often lacking in publications. To maximize scientific impact, reproducibility, and reuse, it is crucial that these resources are made publicly available and are fully transparent. We provide recommendations for improving the openness of data-driven studies in life sciences.
Shuai, Lan; Malins, Jeffrey G
2017-02-01
Despite its status as one of the most influential models of spoken word recognition, the TRACE model has yet to be extended to consider tonal languages such as Mandarin Chinese. A key reason for this is that the model in its current state does not encode lexical tone. In this report, we present a modified version of the jTRACE model in which we built on its existing architecture to code for Mandarin phonemes and tones. Units are coded in a way that is meant to capture the similarity in timing of access to vowel and tone information that has been observed in previous studies of Mandarin spoken word recognition. We validated the model by first simulating a recent experiment that had used the visual world paradigm to investigate how native Mandarin speakers process monosyllabic Mandarin words (Malins & Joanisse, 2010). We then simulated two psycholinguistic phenomena: (1) differences in the timing of resolution of tonal contrast pairs, and (2) the interaction between syllable frequency and tonal probability. In all cases, the model gave rise to results comparable to those of published data with human subjects, suggesting that it is a viable working model of spoken word recognition in Mandarin. It is our hope that this tool will be of use to practitioners studying the psycholinguistics of Mandarin Chinese and will help inspire similar models for other tonal languages, such as Cantonese and Thai.
NASA Astrophysics Data System (ADS)
Hut, R. W.; van de Giesen, N. C.; Drost, N.
2017-05-01
The suggestions by Hutton et al. might not be enough to guarantee reproducible computational hydrology. Archiving software code and research data alone will not be enough. We add to the suggestion of Hutton et al. that hydrologists not only document their (computer) work, but that hydrologists use the latest best practices in designing research software, most notably the use of containers and open interfaces. To make sure hydrologists know of these best practices, we urge close collaboration with Research Software Engineers (RSEs).
Neutron production cross sections for (d,n) reactions at 55 MeV
NASA Astrophysics Data System (ADS)
Wakasa, T.; Goto, S.; Matsuno, M.; Mitsumoto, S.; Okada, T.; Oshiro, H.; Sakaguchi, S.
2017-08-01
The cross sections for (d,n) reactions on targets ranging from natC to 197Au have been measured at a bombarding energy of 55 MeV and a laboratory scattering angle of θ_lab = 9.5°. The angular distributions for the natC(d,n) reaction have also been obtained at θ_lab = 0°-40°. The neutron energy spectra are dominated by deuteron breakup contributions and their peak positions can be reasonably reproduced by considering the Coulomb force effects. The data are compared with the TENDL-2015 nuclear data and Particle and Heavy Ion Transport code System (PHITS) calculations. Both calculations fail to reproduce the measured energy spectra and angular distributions.
Three-dimensional Monte Carlo calculation of atmospheric thermal heating rates
NASA Astrophysics Data System (ADS)
Klinger, Carolin; Mayer, Bernhard
2014-09-01
We present a fast Monte Carlo method for thermal heating and cooling rates in three-dimensional atmospheres. These heating/cooling rates are relevant particularly in broken cloud fields. We compare forward and backward photon tracing methods and present new variance reduction methods to speed up the calculations. For this application it turns out that backward tracing is in most cases superior to forward tracing. Since heating rates may be either calculated as the difference between emitted and absorbed power per volume or alternatively from the divergence of the net flux, both approaches have been tested. We found that the absorption/emission method is superior (with respect to computational time for a given uncertainty) if the optical thickness of the grid box under consideration is smaller than about 5 while the net flux divergence may be considerably faster for larger optical thickness. In particular, we describe the following three backward tracing methods: the first and most simple method (EMABS) is based on a random emission of photons in the grid box of interest and a simple backward tracing. Since only those photons which cross the grid box boundaries contribute to the heating rate, this approach behaves poorly for large optical thicknesses which are common in the thermal spectral range. For this reason, the second method (EMABS_OPT) uses a variance reduction technique to improve the distribution of the photons in a way that more photons are started close to the grid box edges and thus contribute to the result which reduces the uncertainty. The third method (DENET) uses the flux divergence approach where - in backward Monte Carlo - all photons contribute to the result, but in particular for small optical thickness the noise becomes large. The three methods have been implemented in MYSTIC (Monte Carlo code for the phYSically correct Tracing of photons In Cloudy atmospheres). All methods are shown to agree within the photon noise with each other and with a discrete ordinate code for a one-dimensional case. Finally a hybrid method is built using a combination of EMABS_OPT and DENET, and application examples are shown. It should be noted that for this application, only little improvement is gained by EMABS_OPT compared to EMABS.
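The two bookkeeping routes mentioned above (absorbed-minus-emitted power per volume versus divergence of the net flux) reduce to simple arithmetic once a Monte Carlo run has produced the tallies. The sketch below only illustrates that bookkeeping, not the MYSTIC code itself; the input tallies and numbers are assumptions.

```python
import numpy as np

def heating_rate_abs_em(P_absorbed, P_emitted, volume):
    """Heating rate [W m^-3] from absorbed minus emitted power in a grid box."""
    return (P_absorbed - P_emitted) / volume

def heating_rate_flux_div(F_net_bottom, F_net_top, dz):
    """Heating rate [W m^-3] from the divergence of the net upward flux
    across the bottom and top faces of a layer of thickness dz."""
    return (F_net_bottom - F_net_top) / dz

# Toy usage with made-up tallies for one grid box (both cases are net cooling):
print(heating_rate_abs_em(P_absorbed=120.0, P_emitted=150.0, volume=1.0e3))
print(heating_rate_flux_div(F_net_bottom=240.0, F_net_top=270.0, dz=100.0))
```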
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tominaga, Nozomu; Shibata, Sanshiro; Blinnikov, Sergei I., E-mail: tominaga@konan-u.ac.jp, E-mail: sshibata@post.kek.jp, E-mail: Sergei.Blinnikov@itep.ru
We develop a time-dependent, multi-group, multi-dimensional relativistic radiative transfer code, which is required to numerically investigate radiation from relativistic fluids that are involved in, e.g., gamma-ray bursts and active galactic nuclei. The code is based on the spherical harmonic discrete ordinate method (SHDOM) which evaluates a source function including anisotropic scattering in spherical harmonics and implicitly solves the static radiative transfer equation with ray tracing in discrete ordinates. We implement treatments of time dependence, multi-frequency bins, Lorentz transformation, and elastic Thomson and inelastic Compton scattering to the publicly available SHDOM code. Our code adopts a mixed-frame approach; the source function is evaluated in the comoving frame, whereas the radiative transfer equation is solved in the laboratory frame. This implementation is validated using various test problems and comparisons with the results from a relativistic Monte Carlo code. These validations confirm that the code correctly calculates the intensity and its evolution in the computational domain. The code enables us to obtain an Eddington tensor that relates the first and third moments of intensity (energy density and radiation pressure) and is frequently used as a closure relation in radiation hydrodynamics calculations.
Tracing the evolution of the Galactic bulge with chemodynamical modelling of alpha-elements
NASA Astrophysics Data System (ADS)
Friaça, A. C. S.; Barbuy, B.
2017-02-01
Context. Galactic bulge abundances can be best understood as indicators of bulge formation and nucleosynthesis processes by comparing them with chemo-dynamical evolution models. Aims: The aim of this work is to study the abundances of alpha-elements in the Galactic bulge, including a revision of the oxygen abundance in a sample of 56 bulge red giants. Methods: Literature abundances for O, Mg, Si, Ca and Ti in Galactic bulge stars are compared with chemical evolution models. For oxygen in particular, we reanalysed high-resolution spectra obtained using FLAMES+UVES on the Very Large Telescope, now taking each star's carbon abundances, derived from CI and C2 lines, into account simultaneously. Results: We present a chemical evolution model of alpha-element enrichment in a massive spheroid that represents a typical classical bulge evolution. The code includes multi-zone chemical evolution coupled with hydrodynamics of the gas. Comparisons between the model predictions and the abundance data suggest a typical bulge formation timescale of 1-2 Gyr. The main constraint on the bulge evolution is provided by the O data from analyses that have taken the C abundance and dissociative equilibrium into account. Mg, Si, Ca and Ti trends are well reproduced, whereas the level of overabundance critically depends on the adopted nucleosynthesis prescriptions. Observations collected both at the European Southern Observatory, Paranal, Chile (ESO programmes 71.B-0617A, 73.B0074A, and GTO 71.B-0196)
RSEIS and RFOC: Seismic Analysis in R
NASA Astrophysics Data System (ADS)
Lees, J. M.
2015-12-01
Open software is essential for reproducible scientific exchange. R-packages provide a platform for development of seismological investigation software that can be properly documented and traced for data processing. A suite of R packages designed for a wide range of seismic analysis is currently available in the free software platform called R. R is a software platform based on the S-language developed at Bell Labs decades ago. Routines in R can be run as standalone function calls, or developed in object-oriented mode. R comes with a base set of routines, and thousands of user developed packages. The packages developed at UNC include subroutines and interactive codes for processing seismic data, analyzing geographic information (GIS) and inverting data involved in a variety of geophysical applications. On CRAN (Comprehensive R Archive Network, http://www.r-project.org/) currently available packages related to seismic analysis are RSEIS, Rquake, GEOmap, RFOC, zoeppritz, RTOMO, and geophys, Rwave, PEIP, hht, rFDSN. These include signal processing, data management, mapping, earthquake location, deconvolution, focal mechanisms, wavelet transforms, Hilbert-Huang Transforms, tomographic inversion, and Mogi deformation among other useful functionality. All software in R packages is required to have detailed documentation, making the exchange and modification of existing software easy. In this presentation, I will focus on packages RSEIS and RFOC, showing examples from a variety of seismic analyses. The R approach has similarities to the popular (and expensive) MATLAB platform, although R is open source and free to down load.
Temporal evolution of photon energy emitted from two-component advective flows: origin of time lag
NASA Astrophysics Data System (ADS)
Chatterjee, Arka; Chakrabarti, Sandip K.; Ghosh, Himadri
2017-12-01
X-ray time lags of black hole candidates contain important information regarding the emission geometry. Recently, studies of time lags from observational data have revealed very intriguing properties. To investigate the real cause of this lag behavior with energy and spectral states, we study photon paths inside a two-component advective flow (TCAF), which appears to be a satisfactory model to explain the spectral and timing properties. We employ the Monte Carlo simulation technique to carry out the Comptonization process. We use a relativistic thick disk in Schwarzschild geometry as the CENtrifugal pressure supported BOundary Layer (CENBOL), which is the Compton cloud. In TCAF, this is the post-shock region of the advective component. The Keplerian disk on the equatorial plane, which is truncated at the inner edge, i.e. at the outer boundary of the CENBOL, acts as the soft photon source. A ray-tracing code is employed to track the photons to a distantly located observer. We compute the cumulative time taken by a photon during Comptonization, reflection and following the curved geometry on the way to the observer. Time lags between various hard and soft bands have been calculated. We study the variation of time lags with accretion rates, CENBOL size and inclination angle. Time lags for different energy channels are plotted for different inclination angles. The general trend of variation of time lag with QPO frequency and energy as observed in satellite data is reproduced.
Structural lineaments of Gaspe from ERTS imagery
NASA Technical Reports Server (NTRS)
Steffensen, R.
1973-01-01
A test study was conducted to assess the value of ERTS images for mapping geologic features of the Gaspe Peninsula, Quebec. The specific objectives of the study were: 1) to ascertain the best procedure to follow in order to obtain valuable geologic data as a result of interpretation; and 2) to indicate in which way these data could relate to mineral exploration. Of the four spectral bands of the Multispectral Scanner, the band from 700 to 800 nanometers, which seems to possess the best informational content for geologic study, was selected for analysis. The original ERTS image at a scale of 1:3,700,000 was enlarged about 15 times and reproduced on film. Geologically meaningful lines, called structural lineaments, were outlined and classified according to five categories: morpho-lithologic boundaries, morpho-lithologic lineaments, fault traces, fracture zones and undefined lineaments. Comparison with the geologic map of Gaspe shows that morpho-lithologic boundaries correspond to contacts between regional stratigraphic units. Morpho-lithologic lineaments follow bedding trends, whereas fracture traces appear as sets of parallel lineaments intersecting the previous category of lineaments at high angles. Fault traces mark more precisely the location of faults already mapped and indicate the presence of presumed faults not shown on the geologic map.
A finite volume method for trace element diffusion and partitioning during crystal growth
NASA Astrophysics Data System (ADS)
Hesse, Marc A.
2012-09-01
A finite volume method on a uniform grid is presented to compute the polythermal diffusion and partitioning of a trace element during the growth of a porphyroblast crystal in a uniform matrix and in linear, cylindrical and spherical geometry. The motion of the crystal-matrix interface and the thermal evolution are prescribed functions of time. The motion of the interface is discretized and it advances from one cell boundary to the next as the prescribed interface position passes the cell center. The appropriate conditions for the flux across the crystal-matrix interface are derived from discrete mass conservation. Numerical results are benchmarked against steady and transient analytic solutions for isothermal diffusion with partitioning and growth. Two applications illustrate the ability of the model to reproduce observed rare-earth element patterns in garnets (Skora et al., 2006) and water concentration profiles around spherulites in obsidian (Watkins et al., 2009). Simulations with diffusion inside the growing crystal show complex concentration evolutions for trace elements with high diffusion coefficients, such as argon or hydrogen, but demonstrate that rare-earth element concentrations in typical metamorphic garnets are not affected by intracrystalline diffusion.
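As a rough illustration of the cell-by-cell interface advance and discrete mass balance described above, the following is a toy explicit scheme written for this summary, not the paper's method; all parameters and the mass-return rule are made up.

```python
import numpy as np

# Toy 1D finite-volume sketch: trace-element diffusion in a matrix while a
# crystal grows from the left at a prescribed rate. When the prescribed
# interface position passes a cell center, that cell joins the crystal: it
# retains K*C (partitioning) and the rejected (1-K)*C is returned to the first
# remaining matrix cell, so total mass is conserved discretely.

n, dx, dt = 200, 1.0, 0.1
D, K = 1.0, 0.1              # matrix diffusivity and crystal/matrix partition coefficient
growth_rate = 0.05           # prescribed interface velocity
C = np.ones(n)               # normalized trace-element concentration in the matrix
crystal = []                 # concentrations locked into the crystal, core first
i_int, x_int = 0, 0.0        # index of the first matrix cell and interface position

assert D * dt / dx**2 <= 0.5  # explicit stability condition

for step in range(2000):
    x_int += growth_rate * dt
    # Advance the interface cell by cell as it passes cell centers.
    while i_int < n - 1 and x_int > (i_int + 0.5) * dx:
        crystal.append(K * C[i_int])            # taken up by the crystal
        C[i_int + 1] += (1.0 - K) * C[i_int]    # rejected mass pushed ahead of the interface
        C[i_int] = 0.0
        i_int += 1
    # Explicit finite-volume diffusion in the remaining matrix (no-flux ends).
    F = np.zeros(n - i_int + 1)                 # face fluxes
    F[1:-1] = -D * np.diff(C[i_int:]) / dx
    C[i_int:] -= dt / dx * (F[1:] - F[:-1])

# Core-to-rim record in the crystal and boundary-layer enrichment in the matrix.
print(len(crystal), crystal[0], crystal[-1], C[i_int])
```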
Trubyanov, Maxim M; Mochalov, Georgy M; Suvorov, Sergey S; Puzanov, Egor S; Petukhov, Anton N; Vorotyntsev, Ilya V; Vorotyntsev, Vladimir M
2018-07-27
The current study focuses on the processes involved during the flow conversion of water into acetylene in a calcium carbide reaction cell for the trace moisture analysis of ammonia by reaction gas chromatography. The factors negatively affecting the reproducibility and the accuracy of the measurements are suggested and discussed. The intramolecular reaction of the HOCaCCH intermediate was found to be a side reaction producing background acetylene during the contact of wet ammonia gas with calcium carbide. The presence of the HOCaCCH intermediate among the reaction products is confirmed by an FTIR spectral study of calcium carbide powder exposed to wet gas. The side reaction kinetics is evaluated experimentally and its influence on the results of the gas chromatographic measurements is discussed in relation to the determination of the optimal operating parameters for ammonia analysis. The reaction gas chromatography method for the trace moisture measurements in an ammonia matrix was experimentally compared to an FTIR long-path length gas cell technique to evaluate the accuracy limitations and the resource intensity. Copyright © 2018 Elsevier B.V. All rights reserved.
DeepNeuron: an open deep learning toolbox for neuron tracing.
Zhou, Zhi; Kuo, Hsien-Chi; Peng, Hanchuan; Long, Fuhui
2018-06-06
Reconstructing three-dimensional (3D) morphology of neurons is essential for understanding brain structures and functions. Over the past decades, a number of neuron tracing tools including manual, semiautomatic, and fully automatic approaches have been developed to extract and analyze 3D neuronal structures. Nevertheless, most of them were developed based on coding certain rules to extract and connect structural components of a neuron, showing limited performance on complicated neuron morphology. Recently, deep learning has outperformed many other machine learning methods in a wide range of image analysis and computer vision tasks. Here we developed a new open-source toolbox, DeepNeuron, which uses deep learning networks to learn features and rules from data and trace neuron morphology in light microscopy images. DeepNeuron provides a family of modules to solve basic yet challenging problems in neuron tracing. These problems include, but are not limited to: (1) detecting neuron signal under different image conditions, (2) connecting neuronal signals into tree(s), (3) pruning and refining tree morphology, (4) quantifying the quality of morphology, and (5) classifying dendrites and axons in real time. We have tested DeepNeuron using light microscopy images including bright-field and confocal images of human and mouse brain, on which DeepNeuron demonstrates robustness and accuracy in neuron tracing.
Spin dynamics modeling in the AGS based on a stepwise ray-tracing method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dutheil, Yann
The AGS provides a polarized proton beam to RHIC. The beam is accelerated in the AGS from Gγ = 4.5 to Gγ = 45.5 and the polarization transmission is critical to the RHIC spin program. In recent years, various systems were implemented to improve the AGS polarization transmission. These upgrades include the double partial snakes configuration and the tune jumps system. However, 100% polarization transmission through the AGS acceleration cycle has not yet been reached. The current efficiency of the polarization transmission is estimated to be around 85% in typical running conditions. Understanding the sources of depolarization in the AGS is critical to improving the AGS polarized proton performance. The complexity of beam and spin dynamics, which is in part due to the specialized Siberian snake magnets, drove a strong interest in original methods of simulation. For that, the Zgoubi code, capable of direct particle and spin tracking through field maps, was here used to model the AGS. A model of the AGS using the Zgoubi code was developed and interfaced with the current system through a simple command: the AgsFromSnapRampCmd. Interfacing with the machine control system allows for fast modelling using actual machine parameters. Those developments allowed the model to realistically reproduce the optics of the AGS along the acceleration ramp. Additional developments on the Zgoubi code, as well as on post-processing and pre-processing tools, granted long-term multiturn beam tracking capabilities: the tracking of realistic beams along the complete AGS acceleration cycle. Beam multiturn tracking simulations in the AGS, using realistic beam and machine parameters, provided a unique insight into the mechanisms behind the evolution of the beam emittance and polarization during the acceleration cycle. Post-processing software was developed to allow the representation of the relevant quantities from the Zgoubi simulation data. The Zgoubi simulations proved particularly useful to better understand the polarization losses through horizontal intrinsic spin resonances. The Zgoubi model as well as the tools developed were also used for some direct applications. For instance, some beam experiment simulations allowed an accurate estimation of the expected polarization gains from machine changes. In particular, the simulations that involved the tune jumps system provided an accurate estimation of polarization gains and the optimum settings that would improve the performance of the AGS.
NASA Astrophysics Data System (ADS)
Nilsson, E.; Decker, J.; Peysson, Y.; Artaud, J.-F.; Ekedahl, A.; Hillairet, J.; Aniel, T.; Basiuk, V.; Goniche, M.; Imbeaux, F.; Mazon, D.; Sharma, P.
2013-08-01
Fully non-inductive operation with lower hybrid current drive (LHCD) in the Tore Supra tokamak is achieved using either a fully active multijunction (FAM) launcher or a more recent ITER-relevant passive active multijunction (PAM) launcher, or both launchers simultaneously. While both antennas show comparable experimental efficiencies, the analysis of stability properties in long discharges suggest different current profiles. We present comparative modelling of LHCD with the two different launchers to characterize the effect of the respective antenna spectra on the driven current profile. The interpretative modelling of LHCD is carried out using a chain of codes calculating, respectively, the global discharge evolution (tokamak simulator METIS), the spectrum at the antenna mouth (LH coupling code ALOHA), the LH wave propagation (ray-tracing code C3PO), and the distribution function (3D Fokker-Planck code LUKE). Essential aspects of the fast electron dynamics in time, space and energy are obtained from hard x-ray measurements of fast electron bremsstrahlung emission using a dedicated tomographic system. LHCD simulations are validated by systematic comparisons between these experimental measurements and the reconstructed signal calculated by the code R5X2 from the LUKE electron distribution. An excellent agreement is obtained in the presence of strong Landau damping (found under low density and high-power conditions in Tore Supra) for which the ray-tracing model is valid for modelling the LH wave propagation. Two aspects of the antenna spectra are found to have a significant effect on LHCD. First, the driven current is found to be proportional to the directivity, which depends upon the respective weight of the main positive and main negative lobes and is particularly sensitive to the density in front of the antenna. Second, the position of the main negative lobe in the spectrum is different for the two launchers. As this lobe drives a counter-current, the resulting driven current profile is also different for the FAM and PAM launchers.
Using the Astrophysics Source Code Library
NASA Astrophysics Data System (ADS)
Allen, Alice; Teuben, P. J.; Berriman, G. B.; DuPrie, K.; Hanisch, R. J.; Mink, J. D.; Nemiroff, R. J.; Shamir, L.; Wallin, J. F.
2013-01-01
The Astrophysics Source Code Library (ASCL) is a free on-line registry of source codes that are of interest to astrophysicists; with over 500 codes, it is the largest collection of scientist-written astrophysics programs in existence. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are available either via a download site or from an identified source. An advisory committee formed in 2011 provides input and guides the development and expansion of the ASCL, and since January 2012, all accepted ASCL entries are indexed by ADS. Though software is increasingly important for the advancement of science in astrophysics, these methods are still often hidden from view or difficult to find. The ASCL (ascl.net/) seeks to improve the transparency and reproducibility of research by making these vital methods discoverable, and to provide recognition and incentive to those who write and release programs useful for astrophysics research. This poster provides a description of the ASCL, an update on recent additions, and the changes in the astrophysics community we are starting to see because of the ASCL.
ERIC Educational Resources Information Center
de Freitas, Elizabeth
2005-01-01
This paper draws from theorists in critical pedagogy and cultural studies in order to name and then trace the re-inscription and circulation of normative whiteness in geographically isolated rural communities. The paper examines a particular rural Canadian maritime community where my role as teacher-educator and my commitment to developing…
Tracing Ideologies of Learning in Group Talk and Their Impediments to Collaboration
ERIC Educational Resources Information Center
Anderson, Kate T.; Weninger, Csilla
2012-01-01
In this paper we examine the complex relationship between dynamics of group talk and students' ideologies of learning. Through an interactional analysis and thematic coding of group talk, this study details barriers to collaboration in a digital storytelling workshop with primary-aged youth in Singapore. Drawing on 25 h of video-recorded data, we…
USDA-ARS?s Scientific Manuscript database
Small, coded, pill-sized tracers embedded in grain are proposed as a method for grain traceability. A sampling process for a grain traceability system was designed and investigated by applying probability statistics using a science-based sampling approach to collect an adequate number of tracers fo...
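For orientation, the kind of probability calculation such a sampling design rests on can be sketched as follows; this is an assumption-laden illustration rather than the study's actual procedure (tracers are taken to be uniformly mixed so sample counts follow a Poisson distribution, and all numbers are invented).

```python
from math import exp, factorial

def prob_at_least(n_target, tracers_per_tonne, sample_kg):
    """P(sample contains >= n_target tracers) under a Poisson mixing assumption."""
    lam = tracers_per_tonne * sample_kg / 1000.0      # expected tracers in the sample
    p_less = sum(exp(-lam) * lam**k / factorial(k) for k in range(n_target))
    return 1.0 - p_less

# Example: 100 tracers per tonne of grain, a 2 kg probe sample, at least 1 tracer wanted.
print(prob_at_least(1, tracers_per_tonne=100, sample_kg=2.0))   # about 0.18
```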
Learning Uncertainty Tolerant Plans through Approximation in Complex Domains
1989-01-01
...controlling robots [Andreae84, Whitehall87]. Whitehall's PLAND system observes a trace of robot activity and develops macro-operators which include
Lithographically Encrypted Inverse Opals for Anti-Counterfeiting Applications.
Heo, Yongjoon; Kang, Hyelim; Lee, Joon-Seok; Oh, You-Kwan; Kim, Shin-Hyun
2016-07-01
Colloidal photonic crystals possess inimitable optical properties of iridescent structural colors and unique spectral shape, which render them useful for security materials. This work reports a novel method to encrypt graphical and spectral codes in polymeric inverse opals to provide advanced security. To accomplish this, this study prepares lithographically featured micropatterns on the top surface of hydrophobic inverse opals, which serve as shadow masks against the surface modification of air cavities to achieve hydrophilicity. The resultant inverse opals allow rapid infiltration of aqueous solution into the hydrophilic cavities while retaining air in the hydrophobic cavities. Therefore, the structural color of inverse opals is regioselectively red-shifted, disclosing the encrypted graphical codes. The decoded inverse opals also deliver unique reflectance spectral codes originated from two distinct regions. The combinatorial code composed of graphical and optical codes is revealed only when the aqueous solution agreed in advance is used for decoding. In addition, the encrypted inverse opals are chemically stable, providing invariant codes with high reproducibility. In addition, high mechanical stability enables the transfer of the films onto any surfaces. This novel encryption technology will provide a new opportunity in a wide range of security applications. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Physics behind the mechanical nucleosome positioning code
NASA Astrophysics Data System (ADS)
Zuiddam, Martijn; Everaers, Ralf; Schiessel, Helmut
2017-11-01
The positions along DNA molecules of nucleosomes, the most abundant DNA-protein complexes in cells, are influenced by the sequence-dependent DNA mechanics and geometry. This leads to the "nucleosome positioning code", a preference of nucleosomes for certain sequence motifs. Here we introduce a simplified model of the nucleosome where a coarse-grained DNA molecule is frozen into an idealized superhelical shape. We calculate the exact sequence preferences of our nucleosome model and find it to reproduce qualitatively all the main features known to influence nucleosome positions. Moreover, using well-controlled approximations to this model allows us to come to a detailed understanding of the physics behind the sequence preferences of nucleosomes.
Production of Pions in pA-collisions
NASA Technical Reports Server (NTRS)
Moskalenko, I. V.; Mashnik, S. G.
2003-01-01
Accurate knowledge of pion production cross sections in pA-collisions is of interest for astrophysics, cosmic-ray (CR) physics, and space radiation studies. Meanwhile, pion production in pA-reactions is often accounted for by simple scaling of that for pp-collisions, which is not sufficient for many real applications. We evaluate the quality of existing parameterizations using the data and simulations with the Los Alamos version of the Quark-Gluon String Model code LAQGSM and the improved Cascade-Exciton Model code CEM2k. The LAQGSM and CEM2k models have been shown to reproduce nuclear reaction and hadronic data well in the range 0.01-800 GeV/nucleon.
Benchmark Testing of a New 56Fe Evaluation for Criticality Safety Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leal, Luiz C; Ivanov, E.
2015-01-01
The SAMMY code was used to evaluate resonance parameters of the 56Fe cross section in the resolved resonance energy range of 0–2 MeV using transmission, capture, elastic, inelastic, and double-differential elastic cross-section data. SAMMY fits R-matrix resonance parameters using the generalized least-squares technique (Bayes' theorem). The evaluation yielded a set of resonance parameters that reproduced the experimental data very well, along with a resonance parameter covariance matrix for data uncertainty calculations. Benchmark tests were conducted to assess the performance of the evaluation in criticality safety benchmark calculations.
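For readers unfamiliar with the generalized least-squares (Bayes) update used in such R-matrix fits, a generic sketch is as follows (the notation is ours, not taken from the SAMMY documentation): with prior parameters P and prior covariance M, measured data D with covariance V, theoretical values T(P), and sensitivity matrix G = dT/dP, the updated parameters and covariance are

P' = P + M G^T (V + G M G^T)^{-1} [D - T(P)]
M' = M - M G^T (V + G M G^T)^{-1} G M

Iterating this update to convergence yields both the fitted resonance parameters and the covariance matrix quoted above.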
Geometric calibration of Colour and Stereo Surface Imaging System of ESA's Trace Gas Orbiter
NASA Astrophysics Data System (ADS)
Tulyakov, Stepan; Ivanov, Anton; Thomas, Nicolas; Roloff, Victoria; Pommerol, Antoine; Cremonese, Gabriele; Weigel, Thomas; Fleuret, Francois
2018-01-01
There are many geometric calibration methods for "standard" cameras. These methods, however, cannot be used to calibrate telescopes with large focal lengths and complex off-axis optics. Moreover, specialized calibration methods for telescopes are scarce in the literature. We describe the calibration method that we developed for the Colour and Stereo Surface Imaging System (CaSSIS) telescope, on board the ExoMars Trace Gas Orbiter (TGO). Although our method is described in the context of CaSSIS, with camera-specific experiments, it is general and can be applied to other telescopes. We further encourage re-use of the proposed method by making our calibration code and data available on-line.
Strait, Robert S.; Pearson, Peter K.; Sengupta, Sailes K.
2000-01-01
A password system comprises a set of codewords spaced apart from one another by a Hamming distance (HD) that exceeds twice the variability that can be projected for a series of biometric measurements for a particular individual and that is less than the HD that can be encountered between two individuals. To enroll an individual, a biometric measurement is taken and exclusive-ORed with a random codeword to produce a "reference value." To verify the individual later, a biometric measurement is taken and exclusive-ORed with the reference value to reproduce the original random codeword or its approximation. If the reproduced value is not a codeword, the nearest codeword to it is found, and the bits that were corrected to produce the codeword are also toggled in the biometric measurement taken and the codeword generated during enrollment. The correction scheme can be implemented by any conventional error correction code, such as a Reed-Muller code R(m,n). In the implementation using a hand geometry device, an R(2,5) code has been used in this invention. Such codeword and biometric measurement can then be used to see if the individual is an authorized user. Conventional Diffie-Hellman public key encryption schemes and hashing procedures can then be used to secure the communications lines carrying the biometric information and to secure the database of authorized users.
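The enrollment/verification logic described above can be illustrated with a minimal Python sketch. It substitutes a toy repetition code for the Reed-Muller R(2,5) code named in the patent, and the array sizes and function names are invented for the example.

import numpy as np

def rep_encode(msg_bits, n=5):
    # toy stand-in for the Reed-Muller code: repeat each message bit n times
    return np.repeat(msg_bits, n)

def rep_decode(word, n=5):
    # majority vote per block returns the nearest repetition codeword
    blocks = word.reshape(-1, n)
    msg = (blocks.sum(axis=1) > n // 2).astype(np.uint8)
    return np.repeat(msg, n)

def enroll(biometric, rng):
    msg = rng.integers(0, 2, biometric.size // 5, dtype=np.uint8)
    codeword = rep_encode(msg)
    return biometric ^ codeword                 # stored "reference value"

def verify(measurement, reference):
    noisy = measurement ^ reference             # approximately the enrollment codeword
    codeword = rep_decode(noisy)                # correct to the nearest codeword
    recovered = codeword ^ reference            # toggles the corrected bits back
    return codeword, recovered

rng = np.random.default_rng(0)
b_enroll = rng.integers(0, 2, 40, dtype=np.uint8)
reference = enroll(b_enroll, rng)
b_verify = b_enroll.copy()
b_verify[3] ^= 1                                # later measurement differs by one bit
codeword, b_recovered = verify(b_verify, reference)
print(np.array_equal(b_recovered, b_enroll))    # True: enrollment measurement recovered

In the described system, the recovered codeword and corrected measurement would then be checked against the database of authorized users, with the Diffie-Hellman and hashing layers protecting the communication lines and the database itself.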
The Impact of Modeling Assumptions in Galactic Chemical Evolution Models
NASA Astrophysics Data System (ADS)
Côté, Benoit; O'Shea, Brian W.; Ritter, Christian; Herwig, Falk; Venn, Kim A.
2017-02-01
We use the OMEGA galactic chemical evolution code to investigate how the assumptions used for the treatment of galactic inflows and outflows impact numerical predictions. The goal is to determine how our capacity to reproduce the chemical evolution trends of a galaxy is affected by the choice of implementation used to include those physical processes. In pursuit of this goal, we experiment with three different prescriptions for galactic inflows and outflows and use OMEGA within a Markov Chain Monte Carlo code to recover the set of input parameters that best reproduces the chemical evolution of nine elements in the dwarf spheroidal galaxy Sculptor. This provides a consistent framework for comparing the best-fit solutions generated by our different models. Despite their different degrees of intended physical realism, we found that all three prescriptions can reproduce in an almost identical way the stellar abundance trends observed in Sculptor. This result supports the similar conclusions originally reached by Romano & Starkenburg for Sculptor. While the three models have the same capacity to fit the data, the best values recovered for the parameters controlling the number of SNe Ia and the strength of galactic outflows are substantially different and in fact mutually exclusive from one model to another. For the purpose of understanding how a galaxy evolves, we conclude that reproducing the evolution of only a limited number of elements is insufficient and can lead to misleading conclusions. More elements or additional constraints such as the galaxy's star-formation efficiency and gas fraction are needed in order to break the degeneracy between the different modeling assumptions. Our results show that the successes and failures of chemical evolution models are predominantly driven by the input stellar yields, rather than by the complexity of the galaxy model itself. Simple models such as OMEGA are therefore sufficient to test and validate stellar yields. OMEGA is part of the NuGrid chemical evolution package and is publicly available online at http://nugrid.github.io/NuPyCEE.
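The MCMC-plus-forward-model loop described here can be sketched generically. In the sketch below, a toy two-parameter "chemical evolution model" merely stands in for OMEGA; the parameter names, synthetic data, and step sizes are invented for illustration.

import numpy as np

# Toy stand-in for the forward model: maps (n_Ia, eta_out) to predicted
# abundance ratios on a grid of metallicities. The real study runs OMEGA here.
def toy_model(theta, feh):
    n_ia, eta_out = theta
    return -0.3 * np.log10(n_ia) * feh + 0.1 * eta_out

def log_post(theta, feh, obs, sigma):
    if np.any(theta <= 0):
        return -np.inf                        # flat prior on positive parameters
    resid = (obs - toy_model(theta, feh)) / sigma
    return -0.5 * np.sum(resid ** 2)

rng = np.random.default_rng(1)
feh = np.linspace(-2.5, -1.0, 9)              # metallicities of the observed stars
obs = toy_model((1.0, 2.0), feh) + rng.normal(0.0, 0.05, feh.size)

theta, lp = np.array([0.5, 1.0]), -np.inf
chain = []
for _ in range(5000):                         # simple Metropolis random walk
    prop = theta + rng.normal(0.0, 0.05, 2)
    lp_prop = log_post(prop, feh, obs, 0.05)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)
print(np.mean(chain[1000:], axis=0))          # should roughly recover (1.0, 2.0)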
Status of BOUT fluid turbulence code: improvements and verification
NASA Astrophysics Data System (ADS)
Umansky, M. V.; Lodestro, L. L.; Xu, X. Q.
2006-10-01
BOUT is an electromagnetic fluid turbulence code for tokamak edge plasmas [1]. BOUT performs time integration of reduced Braginskii plasma fluid equations, using spatial discretization in realistic geometry and employing the standard ODE integration package PVODE. BOUT has been applied to several tokamak experiments, and in some cases the calculated spectra of turbulent fluctuations compared favorably to experimental data. On the other hand, the desire to understand the code results better and to gain more confidence in them motivated investing effort in rigorous verification of BOUT. In parallel with the testing, the code underwent substantial modification, mainly to improve its readability and the tractability of physical terms, with some algorithmic improvements as well. In the verification process, a series of linear and nonlinear test problems was applied to BOUT, targeting different subgroups of physical terms. The tests include reproducing basic electrostatic and electromagnetic plasma modes in simplified geometry, axisymmetric benchmarks against the 2D edge code UEDGE in real divertor geometry, and neutral fluid benchmarks against the hydrodynamic code LCPFCT. After completion of the testing, the new version of the code is being applied to actual tokamak edge turbulence problems, and the results will be presented. [1] X. Q. Xu et al., Contr. Plas. Phys., 36, 158 (1998). *Work performed for USDOE by Univ. Calif. LLNL under contract W-7405-ENG-48.
Młynarski, Wiktor
2014-01-01
To date, a number of studies have shown that the receptive field shapes of early sensory neurons can be reproduced by optimizing coding efficiency over natural stimulus ensembles. A still unresolved question is whether the efficient coding hypothesis explains the formation of neurons that explicitly represent environmental features of different functional importance. This paper proposes that the spatial selectivity of higher auditory neurons emerges as a direct consequence of learning efficient codes for natural binaural sounds. First, it is demonstrated that a linear efficient coding transform (Independent Component Analysis, ICA) trained on spectrograms of naturalistic simulated binaural sounds extracts the spatial information present in the signal. A simple hierarchical ICA extension allowing for decoding of sound position is proposed. Furthermore, it is shown that units revealing spatial selectivity can be learned from a binaural recording of a natural auditory scene. In both cases a relatively small subpopulation of learned spectrogram features suffices to perform accurate sound localization. A representation of auditory space is therefore learned in a purely unsupervised way by maximizing coding efficiency and without any task-specific constraints. These results imply that efficient coding is a useful strategy for learning structures that allow behaviorally vital inferences about the environment. PMID:24639644
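A minimal sketch of the first step (linear ICA on binaural spectrogram fragments) might look like the following. The synthetic data generation is a placeholder for the naturalistic simulated sounds used in the paper, and the interaural-level-difference summary at the end is just one crude way to inspect the spatial tuning of the learned features.

import numpy as np
from sklearn.decomposition import FastICA

# Stand-in for binaural spectrogram fragments: each row concatenates a left-ear
# and a right-ear log-spectrogram slice.
rng = np.random.default_rng(0)
n_samples, n_freq, n_feat = 2000, 64, 16
sources = rng.laplace(size=(n_samples, n_feat))        # sparse latent causes
mixing = rng.normal(size=(n_feat, 2 * n_freq))
X = sources @ mixing + 0.01 * rng.normal(size=(n_samples, 2 * n_freq))

ica = FastICA(n_components=n_feat, random_state=0, max_iter=1000)
activations = ica.fit_transform(X)          # per-fragment feature activations
A = ica.mixing_                             # columns are learned binaural features
left, right = A[:n_freq, :], A[n_freq:, :]
# interaural level difference of each feature, a crude proxy for spatial tuning
ild = 10 * np.log10(right.var(axis=0) / left.var(axis=0))
print(np.round(ild, 1))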
Gyrokinetic simulations of DIII-D near-edge L-mode plasmas
NASA Astrophysics Data System (ADS)
Neiser, Tom; Jenko, Frank; Carter, Troy; Schmitz, Lothar; Merlo, Gabriele; Told, Daniel; Banon Navarro, Alejandro; McKee, George; Yan, Zheng
2017-10-01
In order to understand the L-H transition, a good understanding of the L-mode edge region is necessary. We perform nonlinear gyrokinetic simulations of a DIII-D L-mode discharge with the GENE code in the near-edge, which we define as ρtor >= 0.8. At ρ = 0.9, ion-scale simulations reproduce experimental heat fluxes within the uncertainty of the experiment. At ρ = 0.8, electron-scale simulations reproduce the experimental electron heat flux while ion-scale simulations do not reproduce the respective ion heat flux due to a strong poloidal zonal flow. However, we reproduce both electron and ion heat fluxes by increasing the local ion temperature gradient by 80%. Local fitting to the CER data in the domain 0.7 <= ρ <= 0.9 is compatible with such an increase in ion temperature gradient within the error bars. Ongoing multi-scale simulations are investigating whether radial electron streamers could dampen the poloidal zonal flows at ρ = 0.8 and increase the radial ion-scale flux. Supported by U.S. DOE under Contract Numbers DE-FG02-08ER54984, DE-FC02-04ER54698, and DE-AC02-05CH11231.
Wall-touching kink mode calculations with the M3D code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Breslau, J. A., E-mail: jbreslau@pppl.gov; Bhattacharjee, A.
This paper seeks to address a controversy regarding the applicability of the 3D nonlinear extended MHD code M3D [W. Park et al., Phys. Plasmas 6, 1796 (1999)] and similar codes to calculations of the electromagnetic interaction of a disrupting tokamak plasma with the surrounding vessel structures. M3D is applied to a simple test problem involving an external kink mode in an ideal cylindrical plasma, used also by the Disruption Simulation Code (DSC) as a model case for illustrating the nature of transient vessel currents during a major disruption. While comparison of the results with those of the DSC is complicated by effects arising from the higher dimensionality and complexity of M3D, we verify that M3D is capable of reproducing both the correct saturation behavior of the free boundary kink and the "Hiro" currents arising when the kink interacts with a conducting tile surface interior to the ideal wall.
Extension of the XGC code for global gyrokinetic simulations in stellarator geometry
NASA Astrophysics Data System (ADS)
Cole, Michael; Moritaka, Toseo; White, Roscoe; Hager, Robert; Ku, Seung-Hoe; Chang, Choong-Seock
2017-10-01
In this work, the total-f, gyrokinetic particle-in-cell code XGC is extended to treat stellarator geometries. Improvements to meshing tools and the code itself have enabled the first physics studies, including single particle tracing and flux surface mapping in the magnetic geometry of the heliotron LHD and quasi-isodynamic stellarator Wendelstein 7-X. These have provided the first successful test cases for our approach. XGC is uniquely placed to model the complex edge physics of stellarators. A roadmap to such a global confinement modeling capability will be presented. Single particle studies will include the physics of energetic particles' global stochastic motions and their effect on confinement. Good confinement of energetic particles is vital for a successful stellarator reactor design. These results can be compared in the core region with those of other codes, such as ORBIT3d. In subsequent work, neoclassical transport and turbulence can then be considered and compared to results from codes such as EUTERPE and GENE. After sufficient verification in the core region, XGC will move into the stellarator edge region including the material wall and neutral particle recycling.
Developing a Multi-Dimensional Hydrodynamics Code with Astrochemical Reactions
NASA Astrophysics Data System (ADS)
Kwak, Kyujin; Yang, Seungwon
2015-08-01
The Atacama Large Millimeter/submillimeter Array (ALMA) has revealed high-resolution molecular lines, some of which are still unidentified. Because the formation of these astrochemical molecules has seldom been studied in traditional chemistry, observations of new molecular lines have drawn much attention not only from astronomers but also from experimental and theoretical chemists. Theoretical calculations of the formation of these astrochemical molecules have been carried out, providing reaction rates for some important molecules, and some of the theoretical predictions have been measured in laboratories. The reaction rates for astronomically important molecules are now collected into databases, some of which are publicly available. By utilizing these databases, we develop a multi-dimensional hydrodynamics code that includes the reaction rates of astrochemical molecules. Because this type of hydrodynamics code is able to trace molecular formation in a non-equilibrium fashion, it is useful for studying the formation history of these molecules, which affects the spatial distribution of specific molecules. We present the development procedure of this code and some test problems used to verify and validate the developed code.
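The chemistry sub-step that such a code performs inside each hydrodynamic cell can be sketched as a stiff ODE integration of a reaction network. The two-reaction toy network and rate coefficients below are illustrative and not taken from any actual astrochemical database.

import numpy as np
from scipy.integrate import solve_ivp

# Toy network: A + B -> AB (rate k1) and AB photodissociation back to A + B (k2).
k1, k2 = 1.0e-10, 1.0e-9            # cm^3 s^-1 and s^-1, illustrative values

def rhs(t, n):
    nA, nB, nAB = n
    form = k1 * nA * nB
    dest = k2 * nAB
    return [-form + dest, -form + dest, form - dest]

n0 = [1.0e3, 1.0e2, 0.0]            # initial number densities (cm^-3)
sol = solve_ivp(rhs, (0.0, 1.0e9), n0, method="LSODA", rtol=1e-8, atol=1e-20)
print(sol.y[:, -1])                 # non-equilibrium abundances after ~30 yr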
Makwana, K. D.; Zhdankin, V.; Li, H.; ...
2015-04-10
We performed simulations of decaying magnetohydrodynamic (MHD) turbulence with a fluid and a kinetic code. The initial condition is an ensemble of long-wavelength, counter-propagating, shear-Alfvén waves, which interact and rapidly generate strong MHD turbulence. The total energy is conserved and the rate of turbulent energy decay is very similar in both codes, although the fluid code has numerical dissipation, whereas the kinetic code has kinetic dissipation. The inertial range power spectrum index is similar in both codes. The fluid code shows a perpendicular wavenumber spectral slope of k⊥^(-1.3). The kinetic code shows a spectral slope of k⊥^(-1.5) for the smaller simulation domain, and k⊥^(-1.3) for the larger domain. We then estimate that collisionless damping mechanisms in the kinetic code can account for the dissipation of the observed nonlinear energy cascade. Current sheets are geometrically characterized. Their lengths and widths are in good agreement between the two codes. The length scales linearly with the driving scale of the turbulence. In the fluid code, their thickness is determined by the grid resolution, as there is no explicit diffusivity. In the kinetic code, their thickness is very close to the skin depth, irrespective of the grid resolution. Finally, this work shows that kinetic codes can reproduce the MHD inertial range dynamics at large scales, while at the same time capturing important kinetic physics at small scales.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zou, Ling; Berry, R. A.; Martineau, R. C.
The RELAP-7 code is the next-generation nuclear reactor system safety analysis code being developed at the Idaho National Laboratory (INL). The code is based on INL's modern scientific software development framework, MOOSE (Multi-Physics Object Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5's and TRACE's capabilities and extends their analysis capabilities to all reactor system simulation scenarios. The RELAP-7 code utilizes the well-posed 7-equation two-phase flow model for compressible two-phase flow. Closure models used in the TRACE code have been reviewed and selected to reflect the progress made during the past decades and to provide a basis for the closure correlations implemented in the RELAP-7 code. This document provides a summary of the closure correlations that are currently implemented in the RELAP-7 code. The closure correlations include sub-grid models that describe interactions between the fluids and the flow channel, and interactions between the two phases.
2007-07-20
science.msfc.nasa.gov/ss/pad/sppb/workshoV7/&eoma&ne/Reo cgn/geo c&m.for. See also the GEOPACK library at http://nssdcftp.gsfc.nasa.gov/models... calculations, fed into the inverse computation, should reproduce the original coordinate grid. This test of the consistency of the direct and inverse... algorithm. A test of this type was performed for a uniform grid, for line traces from 7200 km to the ground, and for 800 km to the ground. The maximum
Co-occurrence profiles of trace elements in potable water systems: a case study.
Andra, Syam S; Makris, Konstantinos C; Charisiadis, Pantelis; Costa, Costas N
2014-11-01
Potable water samples (N = 74) from 19 zip-code locations in a region of Greece were profiled for 13 trace elements using inductively coupled plasma mass spectrometry. The primary objective was to monitor drinking water quality, while the primary focus was to find novel associations in trace element occurrence that may further shed light on common links in their occurrence and fate in the pipe scales and corrosion products observed in urban drinking water distribution systems. Except for arsenic at two locations and in six samples, the rest of the analyzed elements were below the maximum contaminant levels for which regulatory values are available. Further, we attempted to hierarchically cluster the trace elements based on their covariances, resulting in two groups: one with arsenic, antimony, zinc, cadmium, and copper, and the second with the rest of the elements. The grouping trends were partially explained by the elements' similar chemical activities in water, underscoring their potential for co-accumulation and co-mobilization from pipe scales into finished water. Profiling patterns of trace elements in finished water could be indicative of their load on pipe scales and corrosion products, with a corresponding risk of episodic contaminant release. We speculate on the role of disinfectants and disinfection byproducts in mobilizing chemically similar trace elements of human health interest from pipe scales to tap water. Further studies are warranted and may eventually prove useful to water regulators by incorporating the acquired knowledge into drinking water safety plans.
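The hierarchical clustering step can be sketched as follows. The element list, the synthetic concentrations, and the choice of a correlation-based distance with average linkage are illustrative assumptions, not the authors' exact settings.

import numpy as np
import pandas as pd
from scipy.cluster.hierarchy import linkage, fcluster

# Rows are water samples, columns are measured trace-element concentrations.
rng = np.random.default_rng(0)
elements = ["As", "Sb", "Zn", "Cd", "Cu", "Fe", "Mn", "Ni", "Pb"]
data = pd.DataFrame(rng.lognormal(0.0, 1.0, (74, len(elements))), columns=elements)

corr = data.corr()                                              # element co-occurrence
dist = 1.0 - corr.values[np.triu_indices(len(elements), k=1)]   # condensed distance vector
tree = linkage(dist, method="average")
groups = fcluster(tree, t=2, criterion="maxclust")              # cut the tree into two groups
print(dict(zip(elements, groups.tolist())))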
NASA Astrophysics Data System (ADS)
Singh, Udaybir; Kumar, Nitin; Kumar, Anil; Purohit, Laxmi Prasad; Sinha, Ashok Kumar
2011-07-01
This paper presents the design of two types of magnetron injection guns (MIGs) for a 1 MW, 127.5 GHz gyrotron. The TE24,8 mode has been chosen as the operating mode. The in-house developed code MIGSYN has been used to estimate the initial gun parameters. The electron trajectory tracing program EGUN and the in-house developed code MIGANS have been used to optimize the single-anode and double-anode designs for an 80 kV, 40 A MIG. A parametric analysis of the MIG is also presented. The advantages and disadvantages of each configuration are critically examined.
Yaniv, Ziv; Lowekamp, Bradley C; Johnson, Hans J; Beare, Richard
2018-06-01
Modern scientific endeavors increasingly require team collaborations to construct and interpret complex computational workflows. This work describes an image-analysis environment that supports the use of computational tools that facilitate reproducible research and support scientists with varying levels of software development skills. The Jupyter notebook web application is the basis of an environment that enables flexible, well-documented, and reproducible workflows via literate programming. Image-analysis software development is made accessible to scientists with varying levels of programming experience via the use of the SimpleITK toolkit, a simplified interface to the Insight Segmentation and Registration Toolkit. Additional features of the development environment include user-friendly data sharing using online data repositories and a testing framework that facilitates code maintenance. SimpleITK provides a large number of examples illustrating educational and research-oriented image-analysis workflows, available for free download from GitHub under an Apache 2.0 license: github.com/InsightSoftwareConsortium/SimpleITK-Notebooks.
Global simulation study for the time sequence of events leading to the substorm onset
NASA Astrophysics Data System (ADS)
Tanaka, T.; Ebihara, Y.; Watanabe, M.; Den, M.; Fujita, S.; Kikuchi, T.; Hashimoto, K. K.; Kataoka, R.
2017-06-01
We have developed a global simulation code that gives numerical solutions with extremely high resolution. The substorm solution obtained from this simulation code reproduces the precise features of the substorm onset in the ionosphere. It can reproduce the onset that starts from the equatorward side of the quiet arc, the two-step development of the onset, and the westward traveling surge (WTS) that starts 2 min after the initial brightening. We then investigated the counterpart structures in the magnetosphere that correspond to each event in the ionosphere. The structure in the magnetosphere promoting the onset is the near-Earth dynamo in the inner magnetospheric region away from the equatorial plane. The near-Earth dynamo is driven by the field-aligned pressure increase due to the parallel flow associated with the squeezing, combined with the equatorward field-perpendicular flow induced by the near-Earth neutral line (NENL). The dipolarization front is launched from the NENL in association with the convection transient from the growth phase to the expansion phase, but neither the launch nor the arrival of the dipolarization front coincides with the onset timing. The arrival of the flow at the equatorial plane of the inner magnetosphere occurs 2 min after the onset, when the WTS starts to develop toward the west. The expansion phase is further developed by this flow. Given that the onset sequence induced by the near-Earth dynamo reproduces the observed details quite well, we cannot avoid concluding that the current wedge is a misleading concept.
NASA Astrophysics Data System (ADS)
Trofimov, Vyacheslav A.; Trofimov, Vladislav V.
2015-05-01
As is well known, application of the passive THz camera to security problems is a very promising approach. It allows concealed objects to be seen without contact and poses no danger to the person being screened. In previous papers, we demonstrated a new possibility of using the passive THz camera to observe temperature differences on the human skin when these differences are caused by different temperatures inside the body. To validate this claim, we performed a similar physical experiment using an IR camera. We show that a temperature trace can appear on the skin of the human body, caused by a change of temperature inside the body due to water drinking. We use both the software available for processing images captured by a commercially available IR camera manufactured by FLIR and our own computer code for processing these images. Using both codes, we clearly demonstrate the change in skin temperature induced by water drinking. The observed phenomena are very important for the non-destructive detection of forbidden samples and substances concealed inside the human body, without the use of X-rays. Earlier, we demonstrated this possibility using THz radiation. These experiments can be applied to counter-terrorism problems. We developed original filters for computer processing of images captured by IR cameras. Their application results in enhanced temperature resolution of the cameras.
Forde, Arnell S.; Flocks, James G.; Wiese, Dana S.; Fredericks, Jake J.
2016-03-29
The archived trace data are in standard SEG Y rev. 0 format (Barry and others, 1975); the first 3,200 bytes of the card image header are in American Standard Code for Information Interchange (ASCII) format instead of Extended Binary Coded Decimal Interchange Code (EBCDIC) format. The SEG Y files are available on the DVD version of this report or online, downloadable via the USGS Coastal and Marine Geoscience Data System (http://cmgds.marine.usgs.gov). The data are also available for viewing using GeoMapApp (http://www.geomapapp.org) and Virtual Ocean (http://www.virtualocean.org) multi-platform open source software. The Web version of this archive does not contain the SEG Y trace files. To obtain the complete DVD archive, contact USGS Information Services at 1-888-ASK-USGS or infoservices@usgs.gov. The SEG Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU) (Cohen and Stockwell, 2010). See the How To Download SEG Y Data page for download instructions. The printable profiles are provided as Graphics Interchange Format (GIF) images, processed and gained using SU software, and can be viewed from the Profiles page or by using the links located on the trackline maps; refer to the Software page for links to example SU processing scripts.
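Reading the 3,200-byte card-image header of one of these files is straightforward in, for example, Python. The file name below is hypothetical, and the EBCDIC (cp037) fallback is included only because standard SEG Y files, unlike this archive, use it.

# Read the textual header and split it into 40 "cards" of 80 characters each.
def read_segy_text_header(path):
    with open(path, "rb") as f:
        raw = f.read(3200)
    try:
        text = raw.decode("ascii")       # this archive stores ASCII
    except UnicodeDecodeError:
        text = raw.decode("cp037")       # EBCDIC fallback for standard files
    return [text[i:i + 80] for i in range(0, 3200, 80)]

for card in read_segy_text_header("line01.sgy")[:5]:
    print(card.rstrip())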
NASA Astrophysics Data System (ADS)
Bonfiglio, D.; Veranda, M.; Cappello, S.; Chacón, L.; Spizzo, G.
2010-11-01
The emergence of a self-organized reversed-field pinch (RFP) helical regime, first shown by 3D MHD numerical simulations, has been highlighted in the RFX-mod experiment at high current operation (IP above 1 MA). In fact, a quasi-stationary helical configuration spontaneously appears, characterized by strong internal electron transport barriers. In such a regime, electron temperature and density become, to a very good approximation, functions of the helical flux coordinate related to the dominant helical magnetic component. In addition, this regime is diagnosed to be associated with the topological transition to a single-helical-axis (SHAx) state, achieved after the expulsion of the separatrix of the dominant mode's magnetic island. The SHAx state is theoretically predicted to be resilient to the magnetic chaos induced by secondary modes. In this paper, we present initial results of the volume-preserving field line tracing code NEMATO [Finn J M and Chacón L 2005 Phys. Plasmas 12 054503] applied to study the magnetic topology resulting from 3D MHD simulations of the RFP. First, a successful 2D verification test of the code is shown; then, an initial application to a systematic study of chaos healing in the helical RFP is discussed. The separatrix disappearance is confirmed to play an essential role in chaos healing. The triggering effect of a reversed magnetic shear on the formation of ordered surfaces within magnetic chaos is also diagnosed.
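Generic field-line tracing of the kind NEMATO performs can be sketched by integrating dx/ds = B(x)/|B(x)|. Note that NEMATO uses a volume-preserving integrator, which the off-the-shelf Runge-Kutta integrator below does not reproduce, and the analytic field is purely illustrative.

import numpy as np
from scipy.integrate import solve_ivp

def B(x):
    # simple screw-pinch-like field in Cartesian coordinates (illustrative)
    return np.array([-x[1], x[0], 1.0 + x[0] ** 2 + x[1] ** 2])

def rhs(s, x):
    b = B(x)
    return b / np.linalg.norm(b)     # unit tangent along the field line

sol = solve_ivp(rhs, (0.0, 200.0), [0.3, 0.0, 0.0], max_step=0.05, rtol=1e-9)
print(sol.y[:, -1])                  # end point of the traced line; in practice one
                                     # would record punctures of a Poincare section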
Hung, Ling-Hong; Kristiyanto, Daniel; Lee, Sung Bong; Yeung, Ka Yee
2016-01-01
Reproducibility is vital in science. For complex computational methods, it is often necessary, not just to recreate the code, but also the software and hardware environment to reproduce results. Virtual machines, and container software such as Docker, make it possible to reproduce the exact environment regardless of the underlying hardware and operating system. However, workflows that use Graphical User Interfaces (GUIs) remain difficult to replicate on different host systems as there is no high level graphical software layer common to all platforms. GUIdock allows for the facile distribution of a systems biology application along with its graphics environment. Complex graphics based workflows, ubiquitous in systems biology, can now be easily exported and reproduced on many different platforms. GUIdock uses Docker, an open source project that provides a container with only the absolutely necessary software dependencies and configures a common X Windows (X11) graphic interface on Linux, Macintosh and Windows platforms. As proof of concept, we present a Docker package that contains a Bioconductor application written in R and C++ called networkBMA for gene network inference. Our package also includes Cytoscape, a java-based platform with a graphical user interface for visualizing and analyzing gene networks, and the CyNetworkBMA app, a Cytoscape app that allows the use of networkBMA via the user-friendly Cytoscape interface.
Koch, Iris; Reimer, Kenneth J; Bakker, Martine I; Basta, Nicholas T; Cave, Mark R; Denys, Sébastien; Dodd, Matt; Hale, Beverly A; Irwin, Rob; Lowney, Yvette W; Moore, Margo M; Paquin, Viviane; Rasmussen, Pat E; Repaso-Subang, Theresa; Stephenson, Gladys L; Siciliano, Steven D; Wragg, Joanna; Zagury, Gerald J
2013-01-01
Bioaccessibility is a measurement of a substance's solubility in the human gastro-intestinal system, and is often used in the risk assessment of soils. The present study was designed to determine the variability among laboratories using different methods to measure the bioaccessibility of 24 inorganic contaminants in one standardized soil sample, the standard reference material NIST 2710. Fourteen laboratories used a total of 17 bioaccessibility extraction methods. The variability between methods was assessed by calculating the reproducibility relative standard deviations (RSDs), where reproducibility is the sum of within-laboratory and between-laboratory variability. Whereas within-laboratory repeatability was usually better than (<) 15% for most elements, reproducibility RSDs were much higher, indicating more variability, although for many elements they were comparable to typical uncertainties (e.g., 30% in commercial laboratories). For five trace elements of interest, reproducibility RSDs were: arsenic (As), 22-44%; cadmium (Cd), 11-41%; Cu, 15-30%; lead (Pb), 45-83%; and Zn, 18-56%. Only one method variable, pH, was found to correlate significantly with bioaccessibility for aluminum (Al), Cd, copper (Cu), manganese (Mn), Pb, and zinc (Zn), but other method variables could not be examined systematically because of the study design. When bioaccessibility results were directly compared with bioavailability results for As (swine and mouse) and Pb (swine), four methods returned results within uncertainty ranges for both elements: two that were defined as simpler (gastric phase only, limited chemicals) and two that were more complex (gastric + intestinal phases, with a mixture of chemicals).
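A reproducibility RSD of the kind quoted above can be computed from replicate per-laboratory results with a one-way ANOVA decomposition (ISO 5725 style). The numbers below are invented, and the exact formula used in the study may differ.

import numpy as np

labs = [np.array([38.0, 41.0, 40.0]),      # lab 1 replicates, % bioaccessible
        np.array([55.0, 52.0, 57.0]),      # lab 2
        np.array([30.0, 33.0, 31.0])]      # lab 3

k = len(labs)
n = np.array([len(x) for x in labs])
means = np.array([x.mean() for x in labs])
grand = np.concatenate(labs).mean()
s_r2 = sum(((x - x.mean()) ** 2).sum() for x in labs) / (n.sum() - k)   # within-lab variance
msb = (n * (means - grand) ** 2).sum() / (k - 1)                        # between-lab mean square
n_bar = (n.sum() - (n ** 2).sum() / n.sum()) / (k - 1)
s_L2 = max((msb - s_r2) / n_bar, 0.0)                                   # between-lab variance
s_R = np.sqrt(s_r2 + s_L2)                                              # reproducibility SD
print(f"repeatability RSD {100 * np.sqrt(s_r2) / grand:.1f}%, "
      f"reproducibility RSD {100 * s_R / grand:.1f}%")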
WOLF: a computer code package for the calculation of ion beam trajectories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vogel, D.L.
1985-10-01
The WOLF code solves Poisson's equation within a user-defined problem boundary of arbitrary shape. The code is compatible with ANSI FORTRAN and uses a two-dimensional Cartesian coordinate geometry represented on a triangular lattice. The vacuum electric fields and equipotential lines are calculated for the input problem. The user may then introduce a series of emitters from which particles of different charge-to-mass ratios and initial energies can originate. These non-relativistic particles will then be traced by WOLF through the user-defined region. Effects of ion and electron space charge are included in the calculation. A subprogram PISA forms part of this code and enables optimization of various aspects of the problem. The WOLF package also allows detailed graphics analysis of the computed results to be performed.
NASA Astrophysics Data System (ADS)
Lawrence, G.; Barnard, C.; Viswanathan, V.
1986-11-01
Historically, wave optics computer codes have been paraxial in nature. Folded systems could be modeled by "unfolding" the optical system. Calculation of optical aberrations is, in general, left for the analyst to do with off-line codes. While such paraxial codes were adequate for the simpler systems being studied 10 years ago, current problems such as phased arrays, ring resonators, coupled resonators, and grazing incidence optics require a major advance in analytical capability. This paper describes extension of the physical optics codes GLAD and GLAD V to include a global coordinate system and exact ray aberration calculations. The global coordinate system allows components to be positioned and rotated arbitrarily. Exact aberrations are calculated for components in aligned or misaligned configurations by using ray tracing to compute optical path differences and diffraction propagation. Optical path lengths between components and beam rotations in complex mirror systems are calculated accurately so that coherent interactions in phased arrays and coupled devices may be treated correctly.
Applications of automatic differentiation in computational fluid dynamics
NASA Technical Reports Server (NTRS)
Green, Lawrence L.; Carle, A.; Bischof, C.; Haigler, Kara J.; Newman, Perry A.
1994-01-01
Automatic differentiation (AD) is a powerful computational method for computing exact sensitivity derivatives (SD) from existing computer programs for multidisciplinary design optimization (MDO) or sensitivity analysis. A pre-compiler AD tool for FORTRAN programs called ADIFOR has been developed. The ADIFOR tool has been easily and quickly applied by NASA Langley researchers to assess the feasibility and computational impact of AD in MDO with several different FORTRAN programs. These include a state-of-the-art three-dimensional multigrid Navier-Stokes flow solver for wings or aircraft configurations in transonic turbulent flow. With ADIFOR the user specifies sets of independent and dependent variables within an existing computer code. ADIFOR then traces the dependency path throughout the code, applies the chain rule to formulate derivative expressions, and generates new code to compute the required SD matrix. The resulting codes have been verified to compute exact non-geometric and geometric SD for a variety of cases, in less time than is required to compute the SD matrix using centered divided differences.
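The chain-rule bookkeeping that ADIFOR generates as Fortran source can be illustrated with a forward-mode "dual number" sketch in Python; this is a conceptual analogue of AD, not ADIFOR's actual source-transformation approach.

class Dual:
    """Forward-mode AD value: carries f and df/dx together through arithmetic."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def f(x):
    # f(x) = 3*x*x + 2*x + 1, so f'(x) = 6*x + 2
    return 3 * x * x + 2 * x + 1

x = Dual(2.0, 1.0)          # seed the derivative dx/dx = 1
y = f(x)
print(y.val, y.der)          # 17.0 14.0, i.e. f(2) and f'(2)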
Stability or stasis in the names of organisms: the evolving codes of nomenclature.
Knapp, Sandra; Lamas, Gerardo; Lughadha, Eimear Nic; Novarino, Gianfranco
2004-01-01
Nomenclature, far from being a dry, dusty subject, is today more relevant than ever before. Researchers in genomics are discovering again the need for systems of nomenclature: names are what we use to communicate about organisms, and by extension the rest of their biology. Here, we briefly outline the history of the published international codes of nomenclature, tracing them from the time of Linnaeus in the eighteenth century to the present day. We then outline some of what we feel are the major challenges that face the codes in the twenty-first century, focusing primarily on publication, priority, typification, and the role of science in the naming of organisms. We conclude that the codes are essential for taxonomists in the pursuance of their science, and that the democratic nature of decision-making in the regulation of the rules of nomenclature, though sometimes perceived as a potential weakness, is in fact one of its great strengths. PMID:15253348
Characteristic Evolution and Matching
NASA Astrophysics Data System (ADS)
Winicour, Jeffrey
2012-01-01
I review the development of numerical evolution codes for general relativity based upon the characteristic initial-value problem. Progress in characteristic evolution is traced from the early stage of 1D feasibility studies to 2D-axisymmetric codes that accurately simulate the oscillations and gravitational collapse of relativistic stars and to current 3D codes that provide pieces of a binary black-hole spacetime. Cauchy codes have now been successful at simulating all aspects of the binary black-hole problem inside an artificially constructed outer boundary. A prime application of characteristic evolution is to extend such simulations to null infinity where the waveform from the binary inspiral and merger can be unambiguously computed. This has now been accomplished by Cauchy-characteristic extraction, where data for the characteristic evolution is supplied by Cauchy data on an extraction worldtube inside the artificial outer boundary. The ultimate application of characteristic evolution is to eliminate the role of this outer boundary by constructing a global solution via Cauchy-characteristic matching. Progress in this direction is discussed.
Hill, Katherine E; Kelly, Andrew D; Kuijjer, Marieke L; Barry, William; Rattani, Ahmed; Garbutt, Cassandra C; Kissick, Haydn; Janeway, Katherine; Perez-Atayde, Antonio; Goldsmith, Jeffrey; Gebhardt, Mark C; Arredouani, Mohamed S; Cote, Greg; Hornicek, Francis; Choy, Edwin; Duan, Zhenfeng; Quackenbush, John; Haibe-Kains, Benjamin; Spentzos, Dimitrios
2017-05-15
A microRNA (miRNA) collection on the imprinted 14q32 MEG3 region has been associated with outcome in osteosarcoma. We assessed the clinical utility of this miRNA set and their association with methylation status. We integrated coding and non-coding RNA data from three independent annotated clinical osteosarcoma cohorts (n = 65, n = 27, and n = 25) and miRNA and methylation data from one in vitro (19 cell lines) and one clinical (NCI Therapeutically Applicable Research to Generate Effective Treatments (TARGET) osteosarcoma dataset, n = 80) dataset. We used time-dependent receiver operating characteristic (tdROC) analysis to evaluate the clinical value of candidate miRNA profiles and machine learning approaches to compare the coding and non-coding transcriptional programs of high- and low-risk osteosarcoma tumors and high- versus low-aggressiveness cell lines. In the cell line and TARGET datasets, we also studied the methylation patterns of the MEG3 imprinting control region on 14q32 and their association with miRNA expression and tumor aggressiveness. In the tdROC analysis, miRNA sets on 14q32 showed strong discriminatory power for recurrence and survival in the three clinical datasets. High- or low-risk tumor classification was robust to using different microRNA sets or classification methods. Machine learning approaches showed that genome-wide miRNA profiles and miRNA regulatory networks were quite different between the two outcome groups and mRNA profiles categorized the samples in a manner concordant with the miRNAs, suggesting potential molecular subtypes. Further, miRNA expression patterns were reproducible in comparing high-aggressiveness versus low-aggressiveness cell lines. Methylation patterns in the MEG3 differentially methylated region (DMR) also distinguished high-aggressiveness from low-aggressiveness cell lines and were associated with expression of several 14q32 miRNAs in both the cell lines and the large TARGET clinical dataset. Within the limits of available CpG array coverage, we observed a potential methylation-sensitive regulation of the non-coding RNA cluster by CTCF, a known enhancer-blocking factor. Loss of imprinting/methylation changes in the 14q32 non-coding region defines reproducible previously unrecognized osteosarcoma subtypes with distinct transcriptional programs and biologic and clinical behavior. Future studies will define the precise relationship between 14q32 imprinting, non-coding RNA expression, genomic enhancer binding, and tumor aggressiveness, with possible therapeutic implications for both early- and advanced-stage patients.
Influence of Stress Corrosion Crack Morphology on Ultrasonic Examination Performances
NASA Astrophysics Data System (ADS)
Dupond, O.; Duwig, V.; Fouquet, T.
2009-03-01
Stress corrosion cracking represents a potential damage mechanism for several components in PWRs. For this reason, NDE of stress corrosion cracks is an important issue for Electricité de France (EDF), both for plant availability and for safety. This paper is dedicated to the ultrasonic examination of SCC crack defects. The study combines an experimental approach conducted on artificial flaws—meant to represent the characteristic morphologic features often encountered in SCC cracks—and 2D finite element modelling with the code ATHENA 2D developed by EDF. Results indicate that ATHENA correctly reproduces the interaction of the beam with the complex defect. Indeed, specific ultrasonic responses resulting from the defect morphology have been observed experimentally and reproduced with the modelling.
Assessment of early attrition using an ordinary flatbed scanner.
Van't Spijker, Arie; Kreulen, Cees M; Bronkhorst, Ewald M; Creugers, Nico H J
2012-07-01
The aim of this study was to assess a two-dimensional method to monitor occlusal tooth wear quantitatively using a commercially available flatbed scanner. A flatbed scanner, measuring software, and gypsum casts were used. In Part I, two observers (A and B) independently traced scans of marked wear facets of ten sets of casts in two sessions (test and retest). In Part II, three other sets of casts were duplicated, and two observers (C and D) marked the wear facets and traced the scanned images independently. Intra- and inter-observer agreement was determined by comparing measured values (mm(2)) in paired t-tests. Duplicate measurement errors (DME) were calculated. In Part I, the test and retest values (10 casts, 218 teeth) of observers A and B did not differ significantly (A: p = 0.289; B: p = 0.666); correlation coefficients were 0.998 (A) and 0.999 (B). "Tracing wear facets" showed a DME of 0.30 mm(2) for observer A and 0.15 mm(2) for observer B. In Part II, assessment of 70 teeth resulted in correlation coefficients of 0.994 for observer C and 0.997 for observer D; no differences between test and retest values were found for C (p = 0.061), although D differed significantly (p = 0.000). The DME for "marking and tracing wear facets" was 0.39 mm(2) (C) and 0.27 mm(2) (D). DMEs for inter-observer agreement were 0.45 mm(2) (test) and 0.42 mm(2) (retest). We conclude that marking and tracing of occlusal wear facets to assess occlusal tooth wear quantitatively can be done accurately and reproducibly. Copyright © 2012 Elsevier Ltd. All rights reserved.
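A duplicate measurement error of the kind reported can be computed with Dahlberg's double-determination formula; whether this is the exact formula used in the study is an assumption, and the facet areas below are invented.

import numpy as np

first  = np.array([4.2, 1.8, 7.5, 3.3, 5.1])    # test tracing, mm^2
second = np.array([4.5, 1.7, 7.2, 3.6, 5.0])    # retest tracing, mm^2
d = first - second
dme = np.sqrt(np.sum(d ** 2) / (2 * d.size))    # Dahlberg's double-determination error
print(f"DME = {dme:.2f} mm^2")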
NASA Technical Reports Server (NTRS)
Shiller, Alan M.
2003-01-01
It is well established that sampling and sample processing can easily introduce contamination into dissolved trace element samples if precautions are not taken. However, work in remote locations sometimes precludes bringing bulky clean-lab equipment into the field and likewise may make timely transport of samples to the lab for processing impossible. Straightforward syringe filtration methods are described here for collecting small quantities (15 mL) of 0.45- and 0.02-µm filtered river water in an uncontaminated manner. These filtration methods take advantage of recent advances in analytical capabilities that require only small amounts of water for analysis of a suite of dissolved trace elements. Filter clogging and solute rejection artifacts appear to be minimal, although some adsorption of metals and organics does affect the first approximately 10 mL of water passing through the filters. Overall the methods are clean, easy to use, and provide reproducible representations of the dissolved and colloidal fractions of trace elements in river waters. Furthermore, sample processing materials can be prepared well in advance in a clean lab and transported cleanly and compactly to the field. Application of these methods is illustrated with data from remote locations in the Rocky Mountains and along the Yukon River. Evidence from field flow fractionation suggests that the 0.02-µm filters may provide a practical cutoff to distinguish metals associated with small inorganic and organic complexes from those associated with silicate and oxide colloids.
NASA Astrophysics Data System (ADS)
Brenan, J. M.; Shaw, H. F.; Ryerson, F. J.; Phinney, D. L.
1995-10-01
In order to more fully establish a basis for quantifying the role of amphibole in trace-element fractionation processes, we have measured pargasite/silicate melt partitioning of a variety of trace elements (Rb, Ba, Nb, Ta, Hf, Zr, Ce, Nd, Sm, Yb), including the first published values for U, Th and Pb. Experiments conducted at 1000°C and 1.5 GPa yielded large crystals free of compositional zoning. Partition coefficients were found to be constant at total concentrations ranging from ˜1 to >100 ppm, indicating that Henry's Law is operative over this interval. Comparison of partition coefficients measured in this study with previous determinations yields good agreement for similar compositions at comparable pressure and temperature. The compatibility of U, Th and Pb in amphibole decreases in the order Pb > Th > U. Partial melting or fractional crystallization of amphibole-bearing assemblages will therefore result in the generation of excesses in 238U activity relative to 230Th, similar in magnitude to that produced by clinopyroxene. The compatibility of Pb in amphibole relative to U or Th indicates that melt generation in the presence of residual amphibole will result in long-term enrichment in Pb relative to U or Th in the residue. This process is therefore incapable of producing the depletion in Pb relative to U or Th inferred from the Pb isotopic composition of MORB and OIB. Comparison of partition coefficients measured in this study with previous values for clinopyroxene allows some distinction to be made between the expected trace-element fractionations produced during dry (cpx present) and wet (cpx + amphibole present) melting. Rb, Ba, Nb and Ta are dramatically less compatible in clinopyroxene than in amphibole, whereas Th, U, Hf and Zr have similar compatibilities in both phases. Interelement fractionations, such as DNb/DBa, are also different for clinopyroxene and amphibole. Changes in certain ratios, such as Ba/Nb, Ba/Th, and Nb/Th, within comagmatic suites may therefore offer a means to discern the loss of amphibole from the melting assemblage. Elastic strain theory is applied to the partitioning data following the approaches of Beattie and of Blundy and Wood and is used to predict amphibole/melt partition coefficients at conditions of P, T and composition other than those employed in this study. Given values of DCa, DTi and DK from previous partitioning studies, this approach yields amphibole/melt trace-element partition coefficients that reproduce measured values from the literature to within 40-45%. This degree of reproducibility is considered reasonable given that the model parameters are derived from partitioning relations involving iron- and potassium-free amphibole.
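The elastic strain approach referred to (Beattie; Blundy and Wood) is usually written as the lattice strain relation. The form below is the standard textbook expression and is given here only as a sketch of the method, not necessarily the exact parameterization used in this study:

D_i = D_0 exp[ (-4 pi E N_A / (R T)) ( (r_0/2)(r_i - r_0)^2 + (1/3)(r_i - r_0)^3 ) ]

where D_0 is the strain-free partition coefficient for a cation of ideal radius r_0 entering the site, E is the effective Young's modulus of the site, r_i is the ionic radius of the substituting trace cation, N_A is Avogadro's number, R is the gas constant, and T is temperature. Anchoring D_0, r_0, and E with D_Ca, D_Ti, and D_K then yields predicted partition coefficients for other cations from their ionic radii alone.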
1988-09-01
Keywords: Autodrift, ARTIST Autoscaling, Electron Density Profiles. Figures: ARTIST Scaled Parameters; ARTIST ASCII Ionogram; ARTISTSV Optifont Ionogram; Autoscaling of Es Trace Before... diagnostic programs for testing communication ports. The aforementioned contract required a performance evaluation of ARTIST. Manual and autoscaled
ERIC Educational Resources Information Center
Ho, Kai Fai; Tan, Preston
2013-01-01
The term "professional vision" points to the many nuanced ways professionals see. This paper traces the development of a professional vision of a researcher and a teacher looking at classroom practices. The researcher's interest was to capture and study notable aspects of the teacher's practice. Through a coding scheme, disparate…
NASA Astrophysics Data System (ADS)
Bonfiglio, D.; Veranda, M.; Cappello, S.; Chacon, L.; Escande, D. F.; Piovesan, P.
2009-11-01
The existence of a Reversed Field Pinch (RFP) dynamo as a (laminar) helical self-organization was anticipated by MHD numerical studies [1]. High current operation in the RFX-mod experiment shows such a helical self-organization: strong internal electron transport barriers (ITB) appear and magnetic chaos healing is diagnosed when Single Helical Axis (SHAx) regimes are achieved [2]. We present results of the field line tracing code NEMATO [3] applied to study the magnetic topology resulting from 3D MHD simulations, with the aim of clarifying the conditions for chaos healing in SHAx states. First tests confirm the basic picture: the magnetic chaos due to island overlap is significantly reduced after the expulsion of the dominant mode separatrix. The possible synergy with the presence of magnetic and/or flow shear at the SHAx ITB will also be discussed [4]. [1] S. Cappello, Plasma Phys. Control. Fusion (2004) and references therein. [2] R. Lorenzini et al., Nature Phys. (2009). [3] J. M. Finn and L. Chacon, Phys. Plasmas (2005). [4] M. E. Puiatti et al., invited presentation, EPS 2009 conference, submitted to Plasma Phys. Control. Fusion.
MultiElec: A MATLAB Based Application for MEA Data Analysis.
Georgiadis, Vassilis; Stephanou, Anastasis; Townsend, Paul A; Jackson, Thomas R
2015-01-01
We present MultiElec, an open source MATLAB based application for data analysis of microelectrode array (MEA) recordings. MultiElec displays an extremely user-friendly graphic user interface (GUI) that allows the simultaneous display and analysis of voltage traces for 60 electrodes and includes functions for activation-time determination, the production of activation-time heat maps with activation time and isoline display. Furthermore, local conduction velocities are semi-automatically calculated along with their corresponding vector plots. MultiElec allows ad hoc signal suppression, enabling the user to easily and efficiently handle signal artefacts and for incomplete data sets to be analysed. Voltage traces and heat maps can be simply exported for figure production and presentation. In addition, our platform is able to produce 3D videos of signal progression over all 60 electrodes. Functions are controlled entirely by a single GUI with no need for command line input or any understanding of MATLAB code. MultiElec is open source under the terms of the GNU General Public License as published by the Free Software Foundation, version 3. Both the program and source code are available to download from http://www.cancer.manchester.ac.uk/MultiElec/.
The identification of incident cancers in UK primary care databases: a systematic review.
Rañopa, Michael; Douglas, Ian; van Staa, Tjeerd; Smeeth, Liam; Klungel, Olaf; Reynolds, Robert; Bhaskaran, Krishnan
2015-01-01
UK primary care databases are frequently used in observational studies with cancer outcomes. We aimed to systematically review methods used by such studies to identify and validate incident cancers of the breast, colorectum, and prostate. Medline and Embase (1980-2013) were searched for UK primary care database studies with incident breast, colorectal, or prostate cancer outcomes. Data on the methods used for case ascertainment were extracted and summarised. Questionnaires were sent to corresponding authors to obtain details about case ascertainment. Eighty-four studies of breast (n = 51), colorectal (n = 54), and prostate cancer (n = 31) were identified; 30 examined >1 cancer type. Among the 84 studies, 57 defined cancers using only diagnosis codes, while 27 required further evidence such as chemotherapy. Few studies described methods used to create cancer code lists (n = 5); or made lists available directly (n = 5). Twenty-eight code lists were received on request from study authors. All included malignant neoplasm diagnosis codes, but there was considerable variation in the specific codes included which was not explained by coding dictionary changes. Code lists also varied in terms of other types of codes included, such as in-situ, cancer morphology, history of cancer, and secondary/suspected/borderline cancer codes. In UK primary care database studies, methods for identifying breast, colorectal, and prostate cancers were often unclear. Code lists were often unavailable, and where provided, we observed variation in the individual codes and types of codes included. Clearer reporting of methods and publication of code lists would improve transparency and reproducibility of studies. Copyright © 2014 John Wiley & Sons, Ltd.
Three dimensional ray tracing Jovian magnetosphere in the low frequency range
NASA Technical Reports Server (NTRS)
Menietti, J. D.
1982-01-01
Ray tracing of the Jovian magnetosphere in the low frequency range (1-40 MHz) has resulted in a new understanding of the source mechanism for Io-dependent decametric radiation (DAM). Our three dimensional ray tracing computer code has provided model DAM arcs at 10 deg. intervals of Io longitude source positions for the full 360 deg of Jovian system III longitude. In addition, particularly interesting arcs were singled out for detailed study and modelling. Io-dependent DAM arcs are categorized according to curvature--the higher curvature arcs are apparently due to wave stimulation at a nonconstant wave normal angle, psi. The psi(f) relationship has a signature that is common to most of the higher curvature arcs. The low curvature arcs, on the other hand, are adequately modelled with a constant wave normal angle of close to 90 deg. These results imply that for higher curvature arcs observed far from Jupiter (to diminish spacecraft motion effects) the electrons providing the gyroemission are relativistically beamed.
Partial Automation of Requirements Tracing
NASA Technical Reports Server (NTRS)
Hayes, Jane; Dekhtyar, Alex; Sundaram, Senthil; Vadlamudi, Sravanthi
2006-01-01
Requirements Tracing on Target (RETRO) is software for after-the-fact tracing of textual requirements to support independent verification and validation of software. RETRO applies one of three user-selectable information-retrieval techniques: (1) term frequency/inverse document frequency (TF/IDF) vector retrieval, (2) TF/IDF vector retrieval with simple thesaurus, or (3) keyword extraction. One component of RETRO is the graphical user interface (GUI) for use in initiating a requirements-tracing project (a pair of artifacts to be traced to each other, such as a requirements spec and a design spec). Once the artifacts have been specified and the IR technique chosen, another component constructs a representation of the artifact elements and stores it on disk. Next, the IR technique is used to produce a first list of candidate links (potential matches between the two artifact levels). This list, encoded in Extensible Markup Language (XML), is optionally processed by a filtering component designed to make the list somewhat smaller without sacrificing accuracy. Through the GUI, the user examines a number of links and returns decisions (yes, these are links; no, these are not links). Coded in XML, these decisions are provided to a "feedback processor" component that prepares the data for the next application of the IR technique. The feedback reduces the incidence of erroneous candidate links. Unlike related prior software, RETRO does not require the user to assign keywords, and automatically builds a document index.
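As a rough illustration of the retrieval step described above (not RETRO's actual implementation), the following sketch builds TF-IDF vectors for two artifact levels and emits candidate links above a similarity cut-off; the artifact text, identifiers and threshold are all invented for the example.

```python
# Hypothetical sketch of a TF-IDF candidate-link step (not RETRO's code):
# vectorize both artifact levels and keep pairs above a similarity threshold.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

requirements = {"R1": "The system shall log every user action.",
                "R2": "The system shall encrypt stored passwords."}
design_elems = {"D1": "Audit module writes user actions to the event log.",
                "D2": "Password hashing module encrypts stored passwords at rest."}

vec = TfidfVectorizer(stop_words="english")
tfidf = vec.fit_transform(list(requirements.values()) + list(design_elems.values()))
n_req = len(requirements)
sims = cosine_similarity(tfidf[:n_req], tfidf[n_req:])

# Emit candidate links (potential matches) above a cut-off; an analyst then
# confirms or rejects them, and that feedback re-weights the next pass.
for i, rid in enumerate(requirements):
    for j, did in enumerate(design_elems):
        if sims[i, j] > 0.1:
            print(rid, did, round(float(sims[i, j]), 3))
```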
IJS procedure for RELAP5 to TRACE input model conversion using SNAP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prosek, A.; Berar, O. A.
2012-07-01
The TRAC/RELAP Advanced Computational Engine (TRACE), an advanced, best-estimate reactor systems code developed by the U.S. Nuclear Regulatory Commission, comes with a graphical user interface called the Symbolic Nuclear Analysis Package (SNAP). Much effort has been devoted in the past to developing RELAP5 input decks. The purpose of this study is to demonstrate the Institut 'Josef Stefan' (IJS) procedure for converting a RELAP5 input model of the BETHSY facility to TRACE. The IJS conversion procedure consists of eleven steps and is based on the use of SNAP. For calculations of the selected BETHSY 6.2TC test, RELAP5/MOD3.3 Patch 4 and TRACE V5.0 Patch 1 were used. The selected BETHSY 6.2TC test was a 15.24 cm equivalent-diameter horizontal cold-leg break in the reference pressurized water reactor without high-pressure and low-pressure safety injection. The application of the IJS procedure to the conversion of the BETHSY input model showed that it is important to perform the steps in the proper sequence. The overall results calculated with TRACE using the converted RELAP5 model were close to the experimental data and comparable to the RELAP5/MOD3.3 calculations. It can therefore be concluded that the proposed IJS conversion procedure was successfully demonstrated on the BETHSY integral test facility input model. (authors)
Matong, Joseph M; Nyaba, Luthando; Nomngongo, Philiswa N
2016-07-01
The main objectives of this study were to determine the concentration of fourteen trace elements and to investigate their distribution and contamination levels in selected agricultural soils. An ultrasonic assisted sequential extraction procedure derived from the three-step BCR method was used for fractionation of the trace elements. The total concentration of trace elements in the soil samples was obtained by a total digestion method with aqua regia. The results of the extractable fractions revealed that most of the target trace elements can be transferred to human beings through the food chain, thus posing a serious risk to human health. Enrichment factor (EF), geo-accumulation index (Igeo), contamination factor (CF), risk assessment code (RAC) and individual contamination factors (ICF) were used to assess the environmental impacts of trace metals in the soil samples. The EF revealed that Cd was enriched by 3.1-7.2 (except in Soil 1). The Igeo results showed that the soils in the study area were moderately contaminated with Fe, and heavily to extremely polluted with Cd. The soil samples from the unplanted field were found to have the highest contamination factor for Cd and the lowest for Pb. Soil 3 showed a high risk for Tl and Cd, with RAC values of greater than or equal to 50%. In addition, Fe, Ni, Cu, V, As, Mo (except Soil 2), Sb and Pb posed low environmental risk. The modified BCR sequential extraction method provided more information about the mobility and environmental implications of the studied trace elements in the study area. Copyright © 2016 Elsevier Ltd. All rights reserved.
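For readers unfamiliar with these indices, the sketch below computes EF, Igeo, CF and RAC from their commonly used textbook definitions; the background values, the choice of Fe as reference element, and the example concentrations are assumptions of this illustration, not values from the study.

```python
# Hedged sketch of the standard pollution indices named above; the background
# values and reference element (Fe) are placeholders for illustration only.
import math

def contamination_factor(c_sample, c_background):
    return c_sample / c_background

def enrichment_factor(c_m, c_ref, b_m, b_ref):
    # (metal / reference element) in sample over the same ratio in background
    return (c_m / c_ref) / (b_m / b_ref)

def igeo(c_sample, c_background):
    # geo-accumulation index with the conventional factor 1.5 on the background
    return math.log2(c_sample / (1.5 * c_background))

def rac_percent(exchangeable_fraction, total):
    # risk assessment code: share of the easily mobilised (BCR step 1) pool
    return 100.0 * exchangeable_fraction / total

# Example with made-up Cd and Fe numbers (mg/kg)
print(contamination_factor(1.2, 0.3))             # CF
print(enrichment_factor(1.2, 25000, 0.3, 35000))  # EF versus Fe
print(round(igeo(1.2, 0.3), 2))                   # Igeo
print(rac_percent(0.6, 1.2))                      # RAC (%)
```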
Inductively Coupled Plasma Optical Emission Spectrometry for Rare Earth Elements Analysis
NASA Astrophysics Data System (ADS)
He, Man; Hu, Bin; Chen, Beibei; Jiang, Zucheng
2017-01-01
Inductively coupled plasma optical emission spectrometry (ICP-OES) offers multielement capability, high sensitivity, good reproducibility, low matrix effects and a wide dynamic linear range for rare earth element (REE) analysis. However, spectral interference in trace REE analysis by ICP-OES is a serious problem due to the complicated emission spectra of REEs, which demands correction techniques including the interference factor method, derivative spectra, Kalman filtering algorithms and partial least-squares (PLS) methods. Matrix-matching calibration, internal standards, correction factors and sample dilution are usually employed to overcome or decrease matrix effects. Coupled with various sample introduction techniques, the analytical performance of ICP-OES for REE analysis can be improved. Compared with conventional pneumatic nebulization (PN), acid and matrix effects are decreased to some extent in flow injection ICP-OES, with a higher tolerable matrix concentration and better reproducibility. By using electrothermal vaporization as the sample introduction system, direct analysis of solid samples by ICP-OES is achieved, and the vaporization behavior of refractory REEs with high boiling points, which can easily form involatile carbides in the graphite tube, can be improved by using chemical modifiers such as polytetrafluoroethylene and 1-phenyl-3-methyl-4-benzoyl-5-pyrazone. Laser ablation-ICP-OES is suitable for the analysis of both conductive and nonconductive solid samples, with absolute detection limits at the ng-pg level and extremely low sample consumption (0.2% of that in conventional PN introduction). ICP-OES has been extensively employed for trace REE analysis in high-purity materials, and environmental and biological samples.
You, Chao; Song, Lili; Xu, Baiqing; Gao, Shaopeng
2016-02-01
A method is developed for determination of levoglucosan at trace concentration levels in the complex matrices of snow and ice samples. This method uses an injection mixture comprising acetonitrile and melted sample at a ratio of 50/50 (v/v). Samples are analyzed using an ultra-performance liquid chromatography system coupled to triple-quadrupole tandem mass spectrometry (UPLC-MS/MS). Levoglucosan is analyzed on a BEH Amide column (2.1 mm × 100 mm, 1.7 μm), and a Z-spray electrospray ionization source is used for levoglucosan ionization. A polyethersulfone filter is selected for filtering out insoluble particles because of its low impact on levoglucosan. The matrix effect is evaluated by using a standard addition method. During the method validation, the limit of detection (LOD), linearity, recovery, repeatability and reproducibility were evaluated using the standard addition method. The LOD of this method is 0.11 ng mL(-1). Recoveries vary from 91.2% at 0.82 ng mL(-1) to 99.3% at 4.14 ng mL(-1). Repeatability ranges from 17.9% at a concentration of 0.82 ng mL(-1) to 2.8% at 4.14 ng mL(-1). Reproducibility ranges from 15.1% at a concentration of 0.82 ng mL(-1) to 1.9% at 4.14 ng mL(-1). This method can be implemented using less than 0.50 mL sample volume in low and middle latitude regions like the Tibetan Plateau. Copyright © 2015 Elsevier B.V. All rights reserved.
Amadei, Gianluca; Ross, Brian M
2012-02-15
Basil (Ocimum basilicum) is an important flavourant plant which constitutes the major ingredient of the pasta sauce 'Pesto alla Genovese'. The characteristic smell of basil stems mainly from a handful of terpenoids (methyl cinnamate, eucalyptol, linalool and estragole), the concentration of which varies according to basil cultivars. The simple and rapid analysis of the terpenoid constituents of basil would be useful as a means to optimise harvesting times and to act as a quality control process for basil-containing foodstuffs. Classical analytical techniques such as gas chromatography/mass spectrometry (GC/MS) are, however, slow, technically demanding and therefore less suitable for routine analysis. A new chemical ionisation technique which allows real-time quantification of trace gases, Selected Ion Flow Tube Mass Spectrometry (SIFT-MS), was therefore utilised to determine its usefulness for the assay of terpenoid concentrations in basil and pesto sauce headspace. Trace gas analysis was performed using the NO(+) precursor ion which minimised interference from other compounds. Character-impacting compound concentrations were measured in basil headspace with good reproducibility and statistically significant differences were observed between cultivars. Quantification of linalool in pesto sauce headspace proved more difficult due to the presence of interfering compounds. This was resolved by careful selection of reaction product ions which allowed us to detect differences between various commercial brands of pesto. We conclude that SIFT-MS may be a valid tool for the fast and reproducible analysis of flavourant terpenoids in basil and basil-derived foodstuffs. Copyright © 2011 John Wiley & Sons, Ltd.
Cosmic-ray propagation with DRAGON2: I. numerical solver and astrophysical ingredients
NASA Astrophysics Data System (ADS)
Evoli, Carmelo; Gaggero, Daniele; Vittino, Andrea; Di Bernardo, Giuseppe; Di Mauro, Mattia; Ligorini, Arianna; Ullio, Piero; Grasso, Dario
2017-02-01
We present version 2 of the DRAGON code, designed for computing realistic predictions of CR densities in the Galaxy. The code numerically solves the interstellar CR transport equation (including inhomogeneous and anisotropic diffusion, both in space and momentum, advective transport and energy losses) under realistic conditions. The new version includes an updated numerical solver and several models for the astrophysical ingredients involved in the transport equation. Improvements in the accuracy of the numerical solution are demonstrated against analytical solutions and in reference diffusion scenarios. The novel features implemented in the code allow the simulation of the diverse scenarios proposed to reproduce the most recent measurements of local and diffuse CR fluxes, going beyond the limitations of the homogeneous galactic transport paradigm. To this end, several applications using DRAGON2 are presented as well. This new version also makes it easy for users to include their own physical models, by means of a modular C++ structure.
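For orientation, the transport equation referred to above can be written in its commonly quoted general form; the exact terms and notation adopted in DRAGON2 may differ, so this is a generic statement rather than the code's specification.

```latex
% Generic Galactic CR transport equation (standard form assumed here; see the
% DRAGON2 paper for the exact terms and notation used by the code).
\frac{\partial N_i}{\partial t} =
  \nabla\cdot\left(D\,\nabla N_i - \mathbf{v}_w N_i\right)
  + \frac{\partial}{\partial p}\left[p^2 D_{pp}\frac{\partial}{\partial p}\frac{N_i}{p^2}\right]
  - \frac{\partial}{\partial p}\left[\dot{p}\,N_i - \frac{p}{3}\left(\nabla\cdot\mathbf{v}_w\right)N_i\right]
  + Q_i - \frac{N_i}{\tau_{f,i}} - \frac{N_i}{\tau_{r,i}}
```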
Parallelization of KENO-Va Monte Carlo code
NASA Astrophysics Data System (ADS)
Ramón, Javier; Peña, Jorge
1995-07-01
KENO-Va is a code integrated within the SCALE system developed by Oak Ridge that solves the transport equation through the Monte Carlo method. It is being used at the Consejo de Seguridad Nuclear (CSN) to perform criticality calculations for fuel storage pools and shipping casks. Two parallel versions of the code have been generated: one for shared memory machines and another for distributed memory systems using the message-passing interface PVM. In both versions the neutrons of each generation are tracked in parallel. In order to preserve the reproducibility of the results in both versions, pre-advanced random-number seeds were used. The CONVEX C3440 with four processors and shared memory at CSN was used to implement the shared memory version. An FDDI network of 6 HP9000/735 workstations was employed to implement the message-passing version using proprietary PVM. The speedup obtained was 3.6 in both cases.
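The reproducibility trick described above, seeds that do not depend on how histories are scheduled, can be illustrated with the following toy Monte Carlo sketch (plain Python, not KENO-Va); the per-history seeding scheme and the toy absorption probability are assumptions of the example.

```python
# Toy sketch: every particle history gets a seed derived from its
# (generation, history) index, so the tally is identical no matter how the
# histories are split across workers.
import numpy as np
from multiprocessing import Pool

BASE_SEED = 20240101

def history(args):
    generation, index = args
    rng = np.random.default_rng(np.random.SeedSequence([BASE_SEED, generation, index]))
    # toy "transport": count the history as absorbed with probability 0.3
    return rng.random() < 0.3

def run_generation(generation, n_histories, n_workers):
    with Pool(n_workers) as pool:
        absorbed = pool.map(history, [(generation, i) for i in range(n_histories)])
    return sum(absorbed)

if __name__ == "__main__":
    # same answer for 1 or 4 workers, because seeds do not depend on scheduling
    print(run_generation(0, 10000, 1), run_generation(0, 10000, 4))
```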
NASA Astrophysics Data System (ADS)
Robotham, A. S. G.; Howlett, Cullan
2018-06-01
In this short note we publish the analytic quantile function for the Navarro, Frenk & White (NFW) profile. All known published and coded methods for sampling from the 3D NFW PDF use either accept-reject, or numeric interpolation (sometimes via a lookup table) for projecting random Uniform samples through the quantile distribution function to produce samples of the radius. This is a common requirement in N-body initial condition (IC), halo occupation distribution (HOD), and semi-analytic modelling (SAM) work for correctly assigning particles or galaxies to positions given an assumed concentration for the NFW profile. Using this analytic description allows for much faster and cleaner code to solve a common numeric problem in modern astronomy. We release R and Python versions of simple code that achieves this sampling, which we note is trivial to reproduce in any modern programming language.
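A minimal sketch of the sampling this enables is given below, assuming the standard NFW enclosed-mass profile and its inversion through the Lambert W function; the function names are illustrative and the notation does not necessarily match the paper's.

```python
# Inverse-transform sampling of the 3D NFW radius via the Lambert W function.
import numpy as np
from scipy.special import lambertw

def nfw_mu(x):
    # enclosed-mass profile shape, with x = r / r_s
    return np.log(1.0 + x) - x / (1.0 + x)

def nfw_quantile(p, c):
    """Return x = r/r_s for uniform quantile p in [0, 1], truncated at x = c."""
    a = p * nfw_mu(c) + 1.0
    return -1.0 / np.real(lambertw(-np.exp(-a), 0)) - 1.0

rng = np.random.default_rng(1)
c = 10.0
samples = nfw_quantile(rng.random(100000), c)

# quick self-check: the mass fraction enclosed at the median sample should be ~0.5
x_med = np.median(samples)
print(round(nfw_mu(x_med) / nfw_mu(c), 3))
```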
Evaluation of Production Cross Sections of Li, Be, B in CR
NASA Technical Reports Server (NTRS)
Moskalenko, I. V.; Mashnik, S. G.
2003-01-01
Accurate evaluation of the production cross sections of light elements is important for models of cosmic ray (CR) propagation, galactic chemical evolution, and cosmological studies. However, the experimental spallation cross section data are scarce and often unavailable to the CR community, while semi-empirical systematics are frequently wrong by a significant factor. Running sophisticated nuclear codes is not a practical option for everyone either. We use the Los Alamos versions of the Quark-Gluon String Model code LAQGSM and the improved Cascade-Exciton Model code CEM2k, together with all available data from the Los Alamos National Laboratory (LANL) nuclear database, to produce evaluated production cross sections of isotopes of Li, Be, and B suitable for astrophysical applications. The LAQGSM and CEM2k models have been shown to reproduce nuclear reaction and hadronic data well in the range 0.01-800 GeV/nucleon.
Wang, G; Doyle, E J; Peebles, W A
2016-11-01
A monostatic antenna array arrangement has been designed for the microwave front-end of the ITER low-field-side reflectometer (LFSR) system. This paper presents details of the antenna coupling coefficient analyses performed using GENRAY, a 3-D ray tracing code, to evaluate the plasma height accommodation capability of such an antenna array design. Utilizing modeled data for the plasma equilibrium and profiles for the ITER baseline and half-field scenarios, a design study was performed for measurement locations varying from the plasma edge to inside the top of the pedestal. A front-end antenna configuration is recommended for the ITER LFSR system based on the results of this coupling analysis.
A Phase-Space Approach to Collisionless Stellar Systems Using a Particle Method
NASA Astrophysics Data System (ADS)
Hozumi, Shunsuke
1997-10-01
A particle method for reproducing the phase space of collisionless stellar systems is described. The key idea originates in Liouville's theorem, which states that the distribution function (DF) at time t can be derived from tracing necessary orbits back to t = 0. To make this procedure feasible, a self-consistent field (SCF) method for solving Poisson's equation is adopted to compute the orbits of arbitrary stars. As an example, for the violent relaxation of a uniform density sphere, the phase-space evolution generated by the current method is compared to that obtained with a phase-space method for integrating the collisionless Boltzmann equation, on the assumption of spherical symmetry. Excellent agreement is found between the two methods if an optimal basis set for the SCF technique is chosen. Since this reproduction method requires only the functional form of initial DFs and does not require any assumptions to be made about the symmetry of the system, success in reproducing the phase-space evolution implies that there would be no need of directly solving the collisionless Boltzmann equation in order to access phase space even for systems without any special symmetries. The effects of basis sets used in SCF simulations on the reproduced phase space are also discussed.
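The statement of Liouville's theorem underlying this reconstruction can be written, in assumed notation, as follows.

```latex
% Liouville's theorem as used above: the fine-grained DF is conserved along
% characteristics, so f at time t is obtained by tracing each phase-space
% point back to its initial position (notation assumed for illustration).
f(\mathbf{x},\mathbf{v},t) \;=\; f_0\!\left(\mathbf{x}_0(\mathbf{x},\mathbf{v},t),\,\mathbf{v}_0(\mathbf{x},\mathbf{v},t)\right)
```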
Shear wave elastography for breast masses is highly reproducible.
Cosgrove, David O; Berg, Wendie A; Doré, Caroline J; Skyba, Danny M; Henry, Jean-Pierre; Gay, Joel; Cohen-Bacrie, Claude
2012-05-01
To evaluate intra- and interobserver reproducibility of shear wave elastography (SWE) for breast masses. For intraobserver reproducibility, each observer obtained three consecutive SWE images of 758 masses that were visible on ultrasound. 144 (19%) were malignant. Weighted kappa was used to assess the agreement of qualitative elastographic features; the reliability of quantitative measurements was assessed by intraclass correlation coefficients (ICC). For the interobserver reproducibility, a blinded observer reviewed images and agreement on features was determined. Mean age was 50 years; mean mass size was 13 mm. Qualitatively, SWE images were at least reasonably similar for 666/758 (87.9%). Intraclass correlation for SWE diameter, area and perimeter was almost perfect (ICC ≥ 0.94). Intraobserver reliability for maximum and mean elasticity was almost perfect (ICC = 0.84 and 0.87) and was substantial for the ratio of mass-to-fat elasticity (ICC = 0.77). Interobserver agreement was moderate for SWE homogeneity (κ = 0.57), substantial for qualitative colour assessment of maximum elasticity (κ = 0.66), fair for SWE shape (κ = 0.40), fair for B-mode mass margins (κ = 0.38), and moderate for B-mode mass shape (κ = 0.58), orientation (κ = 0.53) and BI-RADS assessment (κ = 0.59). SWE is highly reproducible for assessing elastographic features of breast masses within and across observers. SWE interpretation is at least as consistent as that of BI-RADS ultrasound B-mode features. • Shear wave ultrasound elastography can measure the stiffness of breast tissue • It provides a qualitatively and quantitatively interpretable colour-coded map of tissue stiffness • Intraobserver reproducibility of SWE is almost perfect while interobserver reproducibility of SWE proved to be moderate to substantial • The most reproducible SWE features between observers were SWE image homogeneity and maximum elasticity.
Third Party Interaction in the Medical Context: Code-switching and Control
Vickers, Caroline H.; Goble, Ryan; Deckert, Sharon K.
2015-01-01
The purpose of this paper is to examine the micro-interactional co-construction of power within Spanish language concordant medical consultations in California involving a third party family member. Findings indicate that the third party instigates code-switching by medical providers into English, a language that the patient does not understand, rendering the patient a non-participant in the medical consultation. In these consultations involving a third party family member, monolingual Spanish-speaking patients are stripped of control in ways that are similar to other powerless groups in medical consultations. Implications include the need to further examine how micro-level interactions reproduce societal ideologies and shape policy on the ground. PMID:27667896
Transport modeling of L- and H-mode discharges with LHCD on EAST
NASA Astrophysics Data System (ADS)
Li, M. H.; Ding, B. J.; Imbeaux, F.; Decker, J.; Zhang, X. J.; Kong, E. H.; Zhang, L.; Wei, W.; Shan, J. F.; Liu, F. K.; Wang, M.; Xu, H. D.; Yang, Y.; Peysson, Y.; Basiuk, V.; Artaud, J.-F.; Yuynh, P.; Wan, B. N.
2013-04-01
High-confinement (H-mode) discharges with lower hybrid current drive (LHCD) as the only heating source have been obtained on EAST. In this paper, an empirical mixed Bohm/gyro-Bohm transport model for electron and ion heat transport was first calibrated against a database of 3 L-mode shots on EAST. The electron and ion temperature profiles are well reproduced in predictive modeling with the calibrated model coupled to the CRONOS suite of codes. CRONOS calculations with experimental profiles are also performed for electron power balance analysis. In addition, the time evolution of the LHCD is calculated with the C3PO/LUKE code, taking current diffusion into account, and the results are compared with experimental observations.
Time-resolved x-ray spectra from laser-generated high-density plasmas
NASA Astrophysics Data System (ADS)
Andiel, U.; Eidmann, Klaus; Witte, Klaus-Juergen
2001-04-01
We focused frequency-doubled ultrashort laser pulses on solid C, F, Na and Al targets, and K-shell emission was systematically investigated by time-resolved spectroscopy using a sub-ps streak camera. A large number of laser shots can be accumulated when triggering the camera with an Auston switch system at very high temporal precision. The system provides an outstanding time resolution of 1.7 ps while accumulating thousands of laser shots. The duration of the He-α K-shell resonance lines was observed to be in the range of (2-4) ps and decreases with atomic number. The experimental results are well reproduced by hydro code simulations post-processed with an atomic kinetics code.
Superhydrophobic Ag nanostructures on polyaniline membranes with strong SERS enhancement.
Liu, Weiyu; Miao, Peng; Xiong, Lu; Du, Yunchen; Han, Xijiang; Xu, Ping
2014-11-07
We demonstrate here a facile fabrication of n-dodecyl mercaptan-modified superhydrophobic Ag nanostructures on polyaniline membranes for molecular detection based on SERS technique, which combines the superhydrophobic condensation effect and the high enhancement factor. It is calculated that the as-fabricated superhydrophobic substrate can exhibit a 21-fold stronger molecular condensation, and thus further amplifies the SERS signal to achieve more sensitive detection. The detection limit of the target molecule, methylene blue (MB), on this superhydrophobic substrate can be 1 order of magnitude higher than that on the hydrophilic substrate. With high reproducibility, the feasibility of using this SERS-active superhydrophobic substrate for quantitative molecular detection is explored. A partial least squares (PLS) model was established for the quantification of MB by SERS, with correlation coefficient R(2) = 95.1% and root-mean-squared error of prediction (RMSEP) = 0.226. We believe this superhydrophobic SERS substrate can be widely used in trace analysis due to its facile fabrication, high signal reproducibility and promising SERS performance.
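As an illustration of the PLS calibration step mentioned above (with synthetic spectra, not the paper's data), a quantification sketch might look like this:

```python
# Hedged sketch of PLS calibration for SERS quantification on synthetic data:
# fit concentration against spectra, then report RMSEP and R^2 on a test split.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
conc = rng.uniform(0.1, 5.0, 60)                       # MB concentrations (a.u.)
base_peak = np.exp(-0.5 * ((np.arange(200) - 100) / 8.0) ** 2)
spectra = conc[:, None] * base_peak + rng.normal(0, 0.05, (60, 200))

X_tr, X_te, y_tr, y_te = train_test_split(spectra, conc, random_state=0)
pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
pred = pls.predict(X_te).ravel()

rmsep = np.sqrt(np.mean((pred - y_te) ** 2))
r2 = 1 - np.sum((pred - y_te) ** 2) / np.sum((y_te - y_te.mean()) ** 2)
print(round(rmsep, 3), round(r2, 3))
```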
Multiplex DNA detection of food allergens on a digital versatile disk.
Tortajada-Genaro, Luis A; Santiago-Felipe, Sara; Morais, Sergi; Gabaldón, José Antonio; Puchades, Rosa; Maquieira, Ángel
2012-01-11
The development of a DNA microarray method on a digital versatile disk (DVD) is described for the simultaneous detection of traces of hazelnut (Corylus avellana L.), peanut (Arachis hypogaea), and soybean (Glycine max) in foods. After DNA extraction, multiplex PCR was set up using 5'-labeled specific primers for the Cor a 1, Ara h 2, and Le genes, respectively. Digoxin-labeled PCR products were detected by hybridization with 5'-biotinylated probes immobilized on a streptavidin-modified DVD surface. The reaction product attenuates the signal intensity of the laser that reaches the DVD drive used as detector, correlating well with the amount of amplified sequence. Analytical performance showed a detection limit of 1 μg/g and good assay reproducibility (RSD 8%), suitable for the simultaneous detection of the three targeted allergens. The developed methodology was tested with several commercially available foodstuffs, demonstrating its applicability. The results were in good agreement, in terms of sensitivity and reproducibility, with those obtained with ELISA, PCR-gel agarose electrophoresis, and RT-PCR.
NASA Astrophysics Data System (ADS)
Wessels, Philipp; Ewald, Johannes; Wieland, Marek; Nisius, Thomas; Vogel, Andreas; Viefhaus, Jens; Meier, Guido; Wilhein, Thomas; Drescher, Markus
2014-11-01
The destruction and formation of equilibrium multidomain patterns in permalloy (Ni80Fe20 ) microsquares has been captured using pump-probe x-ray magnetic circular dichroism (XMCD) spectromicroscopy at a new full-field magnetic transmission soft x-ray microscopy endstation with subnanosecond time resolution. The movie sequences show the dynamic magnetization response to intense Oersted field pulses of approximately 200-ps root mean square (rms) duration and the magnetization reorganization to the ground-state domain configuration. The measurements display how a vortex flux-closure magnetization distribution emerges out of a nonequilibrium uniform single-domain state. During the destruction of the initial vortex pattern, we have traced the motion of the central vortex core that is ejected out of the microsquare at high velocities exceeding 1 km/s. A reproducible recovery into a defined final vortex state with stable chirality and polarity could be achieved. Using an additional external bias field, the transient reversal of the square magnetization direction could be monitored and consistently reproduced by micromagnetic simulations.
Geochemical constraints on depth of origin of oceanic carbonatites: The Cape Verde case
NASA Astrophysics Data System (ADS)
Doucelance, Régis; Hammouda, Tahar; Moreira, Manuel; Martins, João C.
2010-12-01
We present new Sr-Nd isotope compositions together with major- and trace element concentrations measured for whole rocks and mineral separate phases (apatite, biotite and calcite) from fifteen Cape Verde oceanic carbonatites (Atlantic Ocean). Trace element patterns of calcio- and magnesio-carbonatites present a strong depletion in K, Hf, Zr and Ti and an overall enrichment in Sr and REE relative to Cape Verde basalts, arguing for distinct source components between carbonatites and basalts. Sr and Nd isotopic ratios show small but significant variations defining a binary mixing between a depleted end-member with unradiogenic Sr and radiogenic Nd values and an 'enriched' end-member compatible with old marine carbonates. We interpret the depleted end-member as the Cape Verde oceanic lithosphere by comparison with previous studies on Cape Verde basalts. We thus propose that the oceanic carbonatites result from the interaction of a deep-rooted mantle plume, carrying a lower 4He/3He signature from the lower mantle, with a carbonated metasomatized lithosphere, which by low-degree melting produced the carbonatite magmas. Sr-Nd compositions and trace element patterns of the carbonatites argue in favor of a metasomatic agent originating from partial melting of recycled, carbonated oceanic crust. We have successfully reproduced the main geochemical features of this model using a Monte-Carlo-type simulation.
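A Monte-Carlo-type binary mixing calculation of the kind mentioned above can be sketched as follows, using the standard concentration-weighted isotope mixing relation; all end-member concentrations and isotope ratios below are placeholders, not the study's Cape Verde values.

```python
# Two-end-member, concentration-weighted isotope mixing with Monte Carlo draws
# of the end-member parameters (placeholder numbers for illustration only).
import numpy as np

rng = np.random.default_rng(42)

def mix_ratio(f, c1, r1, c2, r2):
    # isotope ratio of a mixture containing mass fraction f of end-member 1
    return (f * c1 * r1 + (1 - f) * c2 * r2) / (f * c1 + (1 - f) * c2)

n = 5000
f = rng.random(n)
# "depleted lithosphere" end-member: Sr content (ppm) and 87Sr/86Sr (placeholders)
c_sr_1, r_sr_1 = rng.normal(300, 30, n), rng.normal(0.7030, 0.0002, n)
# "recycled carbonate" end-member (placeholders)
c_sr_2, r_sr_2 = rng.normal(2000, 200, n), rng.normal(0.7055, 0.0003, n)

sr_mix = mix_ratio(f, c_sr_1, r_sr_1, c_sr_2, r_sr_2)
print(round(sr_mix.min(), 5), round(sr_mix.max(), 5))
```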
Zheng, Ying; Chen, Zhuo; Zheng, Chengbin; Lee, Yong-Ill; Hou, Xiandeng; Wu, Li; Tian, Yunfei
2016-08-01
A facile method was developed for determination of trace volatile acetone by coupling a derivatization reaction to surface-enhanced Raman scattering (SERS). With iodide-modified Ag nanoparticles (Ag IMNPs) as the SERS substrate, acetone without obvious Raman signal could be converted to SERS-sensitive species via a chemical derivatization reaction with 2,4-dinitrophenylhydrazine (2,4-DNPH). In addition, acetone can be effectively separated from the liquid phase with a purge-sampling device, and then any serious interference from sample matrices can be significantly reduced. The optimal conditions for the derivatization reaction and the SERS analysis were investigated in detail, and the selectivity and reproducibility of this method were also evaluated. Under the optimal conditions, the limit of detection (LOD) for acetone was 5 mg L(-1) or 0.09 mM (3σ). The relative standard deviation (RSD) for 80 mg L(-1) acetone (n=9) was 1.7%. This method was successfully used for the determination of acetone in artificial urine and human urine samples with spiked recoveries ranging from 92% to 110%. The present method is convenient, sensitive, selective, reliable and suitable for analysis of trace acetone, and it could have a promising clinical application in early diabetes diagnosis. Copyright © 2016 Elsevier B.V. All rights reserved.
Zhang, Li; Li, Zhenhua; Hu, Zheng; Chang, Xijun
2011-09-01
The high efficiency of thiocarbohydrazide-modified attapulgite as a solid-phase extractant for the preconcentration of trace Au(III) prior to measurement by inductively coupled plasma optical emission spectrometry (ICP-OES) is reported here for the first time. Experimental conditions for effective adsorption of trace levels of Au(III) were optimized in detail with respect to different experimental parameters using batch and column procedures. At pH 3, Au(III) could be quantitatively adsorbed on the new sorbent, and the adsorbed Au(III) could be completely eluted from the sorbent surface by 2.0 mL of 1.0 mol L(-1) HCl + 2% CS(NH2)2 solution. An enrichment factor of 150 was accomplished. Moreover, common interfering ions did not interfere with either separation or determination. The maximum adsorption capacity of the sorbent for Au(III) was found to be 66.7 mg g(-1). The detection limit (3σ) of this method was 0.32 μg L(-1) and the relative standard deviation (RSD) was 3.3% (n=8). The method, with high selectivity, sensitivity and reproducibility, was validated using certified reference materials, and has been applied to the determination of trace Au(III) with satisfactory results. Copyright © 2011 Elsevier B.V. All rights reserved.
Bartyzel, Jakub; Rozanski, Kazimierz
2016-01-01
A dedicated, GC-based analytical system is presented which allows detection of four anthropogenic trace gases (SF6, SF5CF3, CFC-12 and Halon-1301) in a single water sample, with detection limits and measurement uncertainties sufficiently low to employ them as quantitative indicators of groundwater age. The gases dissolved in water are extracted in the field using the method based on a dynamic head-space concept. In the laboratory, the investigated gases are cryogenically enriched, separated and measured using an electron capture detector. Reproducibility of the analyses is in the order of 2-5 %. The investigated tracers were measured in several production wells located in the recharge area of an intensively exploited aquifer in southern Poland. While the piston-flow ages of groundwater in the investigated wells revealed internal consistency, they appeared to be generally smaller than the ages derived from time series of tritium content in those wells, interpreted by lumped-parameter models. This difference stems mainly from significantly longer travel times of tritium through the unsaturated zone, when compared to the gaseous tracers being used. The results of this study highlight the benefits of using multiple tracing in quantifying timescales of groundwater flow in shallow aquifer systems.
Chelatable trace zinc causes low, irreproducible KDAC8 activity.
Toro, Tasha B; Edenfield, Samantha A; Hylton, Brandon J; Watt, Terry J
2018-01-01
Acetylation is an important regulatory mechanism in cells, and emphasis is being placed on identifying substrates and small molecule modulators of this post-translational modification. However, the reported in vitro activity of the lysine deacetylase KDAC8 is inconsistent across experimental setups, even with the same substrate, complicating progress in the field. We detected trace levels of zinc, a known inhibitor of KDAC8 when present in excess, even in high-quality buffer reagents, at concentrations that are sufficient to significantly inhibit the enzyme under common reaction conditions. We hypothesized that trace zinc in solution could account for the observed variability in KDAC8 activity. We demonstrate that addition of chelators, including BSA, EDTA, and citrate, and/or the use of a phosphate-based buffer instead of the more common tris-based buffer, eliminates the inhibition from low levels of zinc as well as the dependence of specific activity on enzyme concentration. This results in high KDAC8 activity that is consistent across buffer systems, even using low concentrations of enzyme. We report conditions that are suitable for several assays to increase both enzyme activity and reproducibility. Our results have significant implications for approaches used to identify substrates and small molecule modulators of KDAC8 and interpretation of existing data. Copyright © 2017 Elsevier Inc. All rights reserved.
Simulating cosmologies beyond ΛCDM with PINOCCHIO
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rizzo, Luca A.; Villaescusa-Navarro, Francisco; Monaco, Pierluigi
2017-01-01
We present a method that extends the capabilities of the PINpointing Orbit-Crossing Collapsed HIerarchical Objects (PINOCCHIO) code, allowing it to generate accurate dark matter halo mock catalogues in cosmological models where the linear growth factor and the growth rate depend on scale. Such cosmologies comprise, among others, models with massive neutrinos and some classes of modified gravity theories. We validate the code by comparing the halo properties from PINOCCHIO against N-body simulations, focusing on cosmologies with massive neutrinos: νΛCDM. We analyse the halo mass function, halo two-point correlation function and halo power spectrum, showing that PINOCCHIO reproduces the results from simulations with the same level of precision as the original code (∼ 5-10%). We demonstrate that the abundance of halos in cosmologies with massless and massive neutrinos from PINOCCHIO matches very well the outcome of simulations, and point out that PINOCCHIO can reproduce the Ω_ν-σ_8 degeneracy that affects the halo mass function. We finally show that the clustering properties of the halos from PINOCCHIO match accurately those from simulations both in real and redshift-space, in the latter case up to k = 0.3 h Mpc^-1. We emphasize that the computational time required by PINOCCHIO to generate mock halo catalogues is orders of magnitude lower than that needed for N-body simulations. This makes this tool ideal for applications like covariance matrix studies within the standard ΛCDM model but also in cosmologies with massive neutrinos or some modified gravity theories.
Hybrid information privacy system: integration of chaotic neural network and RSA coding
NASA Astrophysics Data System (ADS)
Hsu, Ming-Kai; Willey, Jeff; Lee, Ting N.; Szu, Harold H.
2005-03-01
Electronic mail is used worldwide, and much of it is easily compromised by hackers. In this paper, we propose a free, fast and convenient hybrid privacy system to protect email communication. The privacy system is implemented by combining the RSA algorithm with a specific chaotic neural network encryption process. The receiver can decrypt a received email as long as it can reproduce the specified chaotic neural network series, the so-called spatial-temporal keys. The chaotic typing and initial seed value of the chaotic neural network series, encrypted by the RSA algorithm, can reproduce the spatial-temporal keys. The encrypted chaotic typing and initial seed value are hidden in a watermark mixed nonlinearly with the message media, wrapped with convolutional error-correction codes for wireless 3rd generation cellular phones. The message media can be an arbitrary image. Pattern noise has to be considered during transmission, as it could affect or change the spatial-temporal keys. Since any change or modification of the chaotic typing or initial seed value of the chaotic neural network series is not acceptable, the RSA codec system must be robust and fault-tolerant over the wireless channel. The robust and fault-tolerant properties of chaotic neural networks (CNN) were proved by a field theory of Associative Memory by Szu in 1997. The 1-D chaos-generating nodes based on the logistic map with arbitrary negative slope a = p/q, generating the N-shaped sigmoid, were first given by Szu in 1992. In this paper, we simulate the robustness and fault-tolerance properties of CNN under additive noise and pattern noise. We also implement a private version of RSA coding and the chaos encryption process on messages.
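To make the keystream idea concrete, here is a toy sketch of a logistic-map keystream cipher; it is not the authors' chaotic-neural-network scheme, and the RSA protection of the seed is omitted (the seed value is an assumption of the example).

```python
# Toy chaotic keystream: both parties who share the seed reproduce the same
# key bytes, so XOR encryption and decryption are symmetric.
def logistic_keystream(seed, length, a=3.99):
    x, out = seed, []
    for _ in range(length):
        x = a * x * (1.0 - x)           # chaotic logistic-map iteration
        out.append(int(x * 256) % 256)  # quantise to one key byte
    return bytes(out)

def xor_bytes(data, key):
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at noon"
key = logistic_keystream(seed=0.654321, length=len(message))
cipher = xor_bytes(message, key)
# the receiver regenerates the same keystream from the same seed
print(xor_bytes(cipher, logistic_keystream(0.654321, len(cipher))))
```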
NASA Astrophysics Data System (ADS)
Jiaming, Liu; Guohui, Zhu; Tianlong, Yang; Aihong, Wu; Yan, Fu; Longdi, Li
2003-07-01
The effects of different surfactants on the solid substrate-room temperature phosphorescence (SS-RTP) properties of Sn4+-morin systems were investigated. It was found that the SS-RTP intensity of the luminescence system was increased greatly in the presence of sodium dodecyl sulfate (SDS). A new, highly sensitive method for the determination of trace tin is proposed based on the sensitization by SDS of the SS-RTP intensity of the morin-tin system on a filter paper substrate. The linear dynamic range of this method is 8.0-112 ag per spot (with a volume of 0.4 μl per spot) with a detection limit of 4.0 ag per spot, and the regression equation is ΔI_p = 199.7 + 3.456 m_Sn(IV) (ag per spot), with the correlation coefficient r = 0.9998 (n = 7). This simple, rapid and reproducible method has been applied to determine the amount of tin in real samples with satisfactory results.
Kim, Kimin; Ahn, J. -W.; Scotti, F.; ...
2015-09-03
Ideal plasma shielding and amplification of resonant magnetic perturbations in a non-axisymmetric tokamak are presented based on field line tracing simulations with the full ideal plasma response, compared to measurements of divertor lobe structures. Magnetic field line tracing simulations in NSTX with toroidal non-axisymmetry indicate the ideal plasma response can significantly shield/amplify and phase shift the vacuum resonant magnetic perturbations. Ideal plasma shielding for the n = 3 mode is found to prevent magnetic islands from opening, as consistently shown in the field line connection length profile and magnetic footprints on the divertor target. It is also found that the ideal plasma shielding modifies the degree of stochasticity but does not change the overall helical lobe structures of the vacuum field for n = 3. Furthermore, amplification of vacuum fields by the ideal plasma response is predicted for the low toroidal mode n = 1, better reproducing measurements of strong striation of the field lines on the divertor plate in NSTX.
Low-cost and large-scale flexible SERS-cotton fabric as a wipe substrate for surface trace analysis
NASA Astrophysics Data System (ADS)
Chen, Yanmin; Ge, Fengyan; Guang, Shanyi; Cai, Zaisheng
2018-04-01
Large-scale surface-enhanced Raman scattering (SERS) cotton fabrics were fabricated from traditional woven fabrics using a dyeing-like method analogous to vat dyeing, in which silver nanoparticles (Ag NPs) were synthesized in situ by a 'dipping-reducing-drying' process. By controlling the concentration of the AgNO3 solution, an optimal SERS cotton fabric was obtained, which had a homogeneous close packing of Ag NPs. The SERS cotton fabric was employed to detect p-aminothiophenol (PATP). It was found that the new fabric possessed excellent reproducibility (about 20%), long-term stability (about 57 days) and high SERS sensitivity, with a detected concentration as low as 10^-12 M. Furthermore, owing to its excellent mechanical flexibility and good absorption ability, the SERS cotton fabric was employed to detect carbaryl on the surface of an apple by simple swabbing, which showed great potential for fast trace analysis. More importantly, this study may enable large-scale, low-cost production based on a traditional cotton fabric.
Murphy, K E; Beary, E S; Rearick, M S; Vocke, R D
2000-10-01
Lead (Pb) and cadmium (Cd) have been determined in six new environmental standard reference materials (SRMs) using isotope dilution inductively coupled plasma mass spectrometry (ID ICP-MS). The SRMs are the following: SRM 1944, New York-New Jersey Waterway Sediment, SRMs 2583 and 2584, Trace Elements in Indoor Dust, Nominal 90 mg/kg and 10,000 mg/kg Lead, respectively, SRMs 2586 and 2587, Trace Elements in Soil Containing Lead from Paint, Nominal 500 mg/kg and 3,000 mg/kg Lead, respectively, and SRM 2782, Industrial Sludge. The capabilities of ID ICP-MS for the certification of Pb and Cd in these materials are assessed. Sample preparation and ratio measurement uncertainties have been evaluated. Reproducibility and accuracy of the established procedures are demonstrated by determination of gravimetrically prepared primary standard solutions and by comparison with isotope dilution thermal ionization mass spectrometry (ID TIMS). Material heterogeneity was readily demonstrated to be the dominant source of uncertainty in the certified values.
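For reference, the single isotope-dilution relation that underlies such measurements can be sketched in mole form as below; this is the standard textbook formulation with placeholder numbers, not NIST's actual data-reduction procedure.

```python
# Standard single isotope-dilution equation (mole form); all values placeholders.
def id_moles_analyte(n_spike, abund_a_spike, abund_b_spike,
                     abund_a_sample, abund_b_sample, r_measured):
    """
    Moles of analyte element in the sample aliquot, where a is the spike
    isotope, b the reference isotope and r_measured = (a/b) in the blend.
    """
    return n_spike * (abund_a_spike - r_measured * abund_b_spike) / \
           (r_measured * abund_b_sample - abund_a_sample)

# placeholder example for Pb: spike enriched in 206Pb, reference isotope 208Pb
n_pb = id_moles_analyte(n_spike=1.0e-8,
                        abund_a_spike=0.92, abund_b_spike=0.03,
                        abund_a_sample=0.241, abund_b_sample=0.524,
                        r_measured=1.5)
print(f"{n_pb:.3e} mol Pb in aliquot")
```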
Zhou, Shaofeng; Han, Xiaojuan; Fan, Honglei; Liu, Yaqing
2016-06-22
Au nanoparticle-decorated mesoporous MnFe₂O₄ nanocrystal clusters (MnFe₂O₄/Au hybrid nanospheres) were used for the electrochemical sensing of As(III) by square wave anodic stripping voltammetry (SWASV). Modified onto a cheap glassy carbon electrode, these MnFe₂O₄/Au hybrid nanospheres show favorable sensitivity (0.315 μA/ppb) and limit of detection (LOD) (3.37 ppb) toward As(III) under the optimized conditions in 0.1 M NaAc-HAc (pH 5.0), depositing for 150 s at a deposition potential of -0.9 V. No obvious interference from Cd(II) and Hg(II) was recognized during the detection of As(III). Additionally, the developed electrode displayed good reproducibility, stability, and repeatability, and offered potential practical applicability for electrochemical detection of As(III) in real water samples. The present work provides a potential method for the design of new and cheap sensors for the electrochemical determination of trace As(III) and other toxic metal ions.
Trace detection of oxygen--ionic liquids in gas sensor design.
Baltes, N; Beyle, F; Freiner, S; Geier, F; Joos, M; Pinkwart, K; Rabenecker, P
2013-11-15
This paper presents a novel electrochemical membrane sensor based on ionic liquids for trace analysis of oxygen in gaseous atmospheres. The faradaic response currents for the reduction of oxygen, obtained by multiple-potential-step chronoamperometry, could be used for real-time detection of oxygen down to concentrations of 30 ppm. The theoretical limit of detection was 5 ppm. The simple, inexpensive sensors varied in electrolyte composition and demonstrated high sensitivity, rapid response times and excellent reproducibility at room temperature. Some of them were used continuously for at least one week, and first results promise good long-term stability. Voltammetric, impedance and oxygen detection studies at temperatures up to 200 °C (in the presence and absence of humidity and CO2) also revealed the limitations of certain ionic liquids for some electrochemical high-temperature applications. Application areas for the developed sensors are the control and analysis of non-oxidative and oxygen-free atmospheres. Copyright © 2013 Elsevier B.V. All rights reserved.
Decoding DNA labels by melting curve analysis using real-time PCR.
Balog, József A; Fehér, Liliána Z; Puskás, László G
2017-12-01
Synthetic DNA has been used as an authentication code for a diverse number of applications. However, existing decoding approaches are based on either DNA sequencing or the determination of DNA length variations. Here, we present a simple alternative protocol for labeling different objects using a small number of short DNA sequences that differ in their melting points. Code amplification and decoding can be done in two steps using quantitative PCR (qPCR). To obtain a DNA barcode with high complexity, we defined 8 template groups, each having 4 different DNA templates, yielding 15^8 (>2.5 billion) combinations of different individual melting temperature (Tm) values and corresponding ID codes. The reproducibility and specificity of the decoding was confirmed by using the most complex template mixture, which had 32 different products in 8 groups with different Tm values. The industrial applicability of our protocol was also demonstrated by labeling a drone with an oil-based paint containing a predefined DNA code, which was then successfully decoded. The method presented here consists of a simple code system based on a small number of synthetic DNA sequences and a cost-effective, rapid decoding protocol using a few qPCR reactions, enabling a wide range of authentication applications.
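A sketch of the decoding idea, with invented Tm values and tolerances (not the study's templates), might look like this: each group contributes up to four known melting peaks, and the subset observed in that group's melt curve maps to one of the 15 non-empty patterns.

```python
# Illustrative Tm-to-pattern decoding; the Tm values and tolerance are assumptions.
GROUP_TMS = [78.0, 81.5, 85.0, 88.5]   # candidate Tm values (degC) in one group
TOLERANCE = 0.5                        # matching window for a measured peak

def decode_group(measured_peaks):
    """Return a 4-bit pattern (1 = template present) for one group."""
    bits = 0
    for i, tm in enumerate(GROUP_TMS):
        if any(abs(p - tm) <= TOLERANCE for p in measured_peaks):
            bits |= 1 << i
    return bits

# eight groups, each read out from its own qPCR melting-curve analysis
groups = [[78.1, 84.9], [81.4], [88.6, 78.0], [85.2],
          [78.2, 81.6, 88.4], [84.8], [81.3, 88.7], [78.0]]
code = [decode_group(g) for g in groups]
print(code, "of", 15 ** 8, "possible ID codes")
```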
User Manual for the NASA Glenn Ice Accretion Code LEWICE: Version 2.0
NASA Technical Reports Server (NTRS)
Wright, William B.
1999-01-01
A research project is underway at NASA Glenn to produce a computer code which can accurately predict ice growth under a wide range of meteorological conditions for any aircraft surface. This report presents a description of the code inputs and outputs for version 2.0 of this code, which is called LEWICE. This version differs from previous releases in its robustness and its ability to reproduce results accurately for different spacing and time step criteria across computing platforms. It also differs in the extensive effort undertaken to compare the results against the database of ice shapes which have been generated in the NASA Glenn Icing Research Tunnel (IRT) [1]. This report only describes the features of the code related to the use of the program. The report does not describe the inner workings of the code or the physical models used. This information is available in the form of several unpublished documents which will be collectively referred to as a Programmers Manual for LEWICE 2 in this report. These reports are intended as an update/replacement for all previous user manuals of LEWICE. In addition to describing the changes and improvements made for this version, information from previous manuals may be duplicated so that the user will not need to consult previous manuals to use this code.
A new Bayesian Earthquake Analysis Tool (BEAT)
NASA Astrophysics Data System (ADS)
Vasyura-Bathke, Hannes; Dutta, Rishabh; Jónsson, Sigurjón; Mai, Martin
2017-04-01
Modern earthquake source estimation studies increasingly use non-linear optimization strategies to estimate kinematic rupture parameters, often considering geodetic and seismic data jointly. However, the optimization process is complex and consists of several steps that need to be followed in the earthquake parameter estimation procedure. These include pre-describing or modeling the fault geometry, calculating the Green's Functions (often assuming a layered elastic half-space), and estimating the distributed final slip and possibly other kinematic source parameters. Recently, Bayesian inference has become popular for estimating posterior distributions of earthquake source model parameters given measured/estimated/assumed data and model uncertainties. For instance, some research groups consider uncertainties of the layered medium and propagate these to the source parameter uncertainties. Other groups make use of informative priors to reduce the model parameter space. In addition, innovative sampling algorithms have been developed that efficiently explore the often high-dimensional parameter spaces. Compared to earlier studies, these improvements have resulted in overall more robust source model parameter estimates that include uncertainties. However, the computational demands of these methods are high and estimation codes are rarely distributed along with the published results. Even if codes are made available, it is often difficult to assemble them into a single optimization framework as they are typically coded in different programming languages. Therefore, further progress and future applications of these methods/codes are hampered, while reproducibility and validation of results has become essentially impossible. In the spirit of providing open-access and modular codes to facilitate progress and reproducible research in earthquake source estimation, we undertook the effort of producing BEAT, a python package that comprises all the above-mentioned features in one single programming environment. The package is built on top of the pyrocko seismological toolbox (www.pyrocko.org) and makes use of the pymc3 module for Bayesian statistical model fitting. BEAT is an open-source package (https://github.com/hvasbath/beat) and we encourage and solicit contributions to the project. In this contribution, we present our strategy for developing BEAT, show application examples, and discuss future developments.
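As a minimal illustration of the Bayesian estimation idea (plain Metropolis sampling in NumPy rather than BEAT's pymc3/pyrocko machinery), consider inferring a single hypothetical slip parameter from synthetic data; the Green's function weights, prior bounds and noise level are all assumptions of the example.

```python
# Toy Bayesian estimation: posterior of one "slip" parameter via Metropolis.
import numpy as np

rng = np.random.default_rng(7)
true_slip, sigma = 2.0, 0.3
greens = rng.uniform(0.5, 1.5, 20)               # assumed Green's function weights
data = true_slip * greens + rng.normal(0, sigma, 20)

def log_post(slip):
    if not 0.0 < slip < 10.0:                    # uniform prior on (0, 10)
        return -np.inf
    resid = data - slip * greens
    return -0.5 * np.sum((resid / sigma) ** 2)   # Gaussian likelihood

samples, current = [], 1.0
for _ in range(20000):
    proposal = current + rng.normal(0, 0.1)
    if np.log(rng.random()) < log_post(proposal) - log_post(current):
        current = proposal
    samples.append(current)

post = np.array(samples[5000:])                  # discard burn-in
print(round(post.mean(), 2), round(post.std(), 2))
```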
IonRayTrace: An HF Propagation Model for Communications and Radar Applications
2014-12-01
IonRayTrace is an HF propagation model for communications and radar applications, including modeling the impact of ionosphere variability on detection algorithms. Modification of IonRayTrace's source code to include flexible gridding is discussed; the model relies on an ionosphere model for its environmental background [3], and IonRayTrace's operation is summarized briefly in Section 3 of the report. Figures include ray-path plots in which color denotes plasma frequency in MHz, and plots of ionospheric absorption (dB).
ERIC Educational Resources Information Center
Erduran, Sibel
Eight physical science textbooks were analyzed for coverage on acids, bases, and neutralization. At the level of the text, clarity and coherence of statements were investigated. The conceptual framework for this topic was represented in a concept map which was used as a coding tool for tracing concepts and links present in textbooks. Cognitive…
Methodology to Improve Aviation Security With Terrorist Using Aircraft as a Weapon
2013-09-01
The aviation industry… Acronyms used include EDS (Explosive Detection System), EMMI (Energy, Matter, Material wealth, and Information), and ETD (Explosives Trace Detection). All checked baggage in the United States has been subjected to 100% screening since December 2003 under TSA's Electronic Baggage Screening Program.
ERIC Educational Resources Information Center
Levine, Gavrielle
This investigation traced changes in anxiety for teaching mathematics (ATM) among pre-service elementary school teachers (n=36) enrolled in a mathematics methods course by analyzing their weekly journal entries. Journal entries were coded for high level of ATM (ATM-high) or absence of ATM (ATM-absent) during the first class session, as well as…
NASA Astrophysics Data System (ADS)
da Silva, Roberto; Vainstein, Mendeli H.; Lamb, Luis C.; Prado, Sandra D.
2013-03-01
We propose a novel probabilistic model that outputs the final standings of a soccer league, based on a simple dynamics that mimics a soccer tournament. In our model, a team is created with a defined potential (ability) which is updated during the tournament according to the results of previous games. The updated potential modifies a team future winning/losing probabilities. We show that this evolutionary game is able to reproduce the statistical properties of final standings of actual editions of the Brazilian tournament (Brasileirão) if the starting potential is the same for all teams. Other leagues such as the Italian (Calcio) and the Spanish (La Liga) tournaments have notoriously non-Gaussian traces and cannot be straightforwardly reproduced by this evolutionary non-Markovian model with simple initial conditions. However, we show that by setting the initial abilities based on data from previous tournaments, our model is able to capture the stylized statistical features of double round robin system (DRRS) tournaments in general. A complete understanding of these phenomena deserves much more attention, but we suggest a simple explanation based on data collected in Brazil: here several teams have been crowned champion in previous editions corroborating that the champion typically emerges from random fluctuations that partly preserve the Gaussian traces during the tournament. On the other hand, in the Italian and Spanish cases, only a few teams in recent history have won their league tournaments. These leagues are based on more robust and hierarchical structures established even before the beginning of the tournament. For the sake of completeness, we also elaborate a totally Gaussian model (which equalizes the winning, drawing, and losing probabilities) and we show that the scores of the Brazilian tournament “Brasileirão” cannot be reproduced. This shows that the evolutionary aspects are not superfluous and play an important role which must be considered in other alternative models. Finally, we analyze the distortions of our model in situations where a large number of teams is considered, showing the existence of a transition from a single to a double peaked histogram of the final classification scores. An interesting scaling is presented for different sized tournaments.
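One possible reading of the tournament dynamics described above can be sketched as follows; the draw probability, potential increment and point scheme are assumptions made for illustration, not the authors' exact update rules.

```python
# Toy double-round-robin tournament with evolving team potentials (assumed rules).
import numpy as np

rng = np.random.default_rng(3)
n_teams, delta = 20, 0.1
potential = np.ones(n_teams)           # equal starting abilities
points = np.zeros(n_teams, dtype=int)

for _ in range(2):                     # double round robin: home and away
    for i in range(n_teams):
        for j in range(n_teams):
            if i == j:
                continue
            p_home = potential[i] / (potential[i] + potential[j])
            r = rng.random()
            if r < 0.25:               # assumed fixed draw probability
                points[i] += 1; points[j] += 1
            elif r < 0.25 + 0.75 * p_home:
                points[i] += 3; potential[i] += delta   # winner is reinforced
            else:
                points[j] += 3; potential[j] += delta

print(np.sort(points)[::-1])           # final standings (points per team)
```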
Nanoparticle based bio-bar code technology for trace analysis of aflatoxin B1 in Chinese herbs.
Yu, Yu-Yan; Chen, Yuan-Yuan; Gao, Xuan; Liu, Yuan-Yuan; Zhang, Hong-Yan; Wang, Tong-Ying
2018-04-01
A novel and sensitive assay for aflatoxin B1 (AFB1) detection has been developed using the bio-bar code assay (BCA). The method relies on DNA-encoded gold nanoparticles (NPs) modified with polyclonal antibodies and magnetic microparticles (MMPs) modified with monoclonal antibodies, with subsequent detection of the amplified target in the form of the bio-bar code using a fluorescent quantitative polymerase chain reaction (FQ-PCR) detection method. First, NP probes encoded with DNA unique to AFB1 and MMP probes with monoclonal antibodies that bind AFB1 specifically were prepared. Then the MMP-AFB1-NP sandwich compounds were acquired; dehybridization of the oligonucleotides on the nanoparticle surface allows the presence of AFB1 to be determined by identifying the oligonucleotide sequence released from the NP through FQ-PCR detection. A bio-bar code system for detecting AFB1 was thus established, with a sensitivity limit of about 10^-8 ng/mL, comparable to ELISA assays for detecting the same target, showing that AFB1 can be detected at low attomolar levels with the bio-bar-code amplification approach. This is also the first demonstration of a bio-bar-code-type assay for the detection of AFB1 in Chinese herbs. Copyright © 2017. Published by Elsevier B.V.