Tank 241-AP-105, cores 208, 209 and 210, analytical results for the final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nuzum, J.L.
1997-10-24
This document is the final laboratory report for Tank 241-AP-105. Push mode core segments were removed from Risers 24 and 28 between July 2, 1997, and July 14, 1997. Segments were received and extruded at 222-S Laboratory. Analyses were performed in accordance with Tank 241-AP-105 Push Mode Core Sampling and Analysis Plan (TSAP) (Hu, 1997) and Tank Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995). None of the subsamples submitted for total alpha activity (AT), differential scanning calorimetry (DSC) analysis, or total organic carbon (TOC) analysis exceeded the notification limits as stated in TSAP and DQO. The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group, and are not considered in this report. Appearance and Sample Handling: Two cores, each consisting of four segments, were expected from Tank 241-AP-105. Three cores were sampled, and complete cores were not obtained. TSAP states core samples should be transported to the laboratory within three calendar days from the time each segment is removed from the tank. This requirement was not met for all cores. Attachment 1 illustrates subsamples generated in the laboratory for analysis and identifies their sources. This reference also relates tank farm identification numbers to their corresponding 222-S Laboratory sample numbers.
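The safety-screening evaluations above hinge on a 95% confidence interval on the mean of replicate subsample results. As a minimal, hedged illustration (not the Tank Waste Remediation Systems Technical Basis Group's actual procedure, and with invented values rather than Tank 241-AP-105 data), such an interval could be computed as follows:

```python
# Minimal sketch: 95% confidence interval on the mean of replicate analytical
# results, using the Student-t distribution. Values are invented placeholders.
import math
from statistics import mean, stdev
from scipy.stats import t

results = [0.42, 0.47, 0.45, 0.44]   # hypothetical replicate measurements (uCi/g)
n = len(results)
m = mean(results)
se = stdev(results) / math.sqrt(n)   # standard error of the mean
half_width = t.ppf(0.975, df=n - 1) * se
print(f"mean = {m:.3f}, 95% CI = [{m - half_width:.3f}, {m + half_width:.3f}]")
```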
An improved method for field extraction and laboratory analysis of large, intact soil cores
Tindall, J.A.; Hemmen, K.; Dowd, J.F.
1992-01-01
Various methods have been proposed for the extraction of large, undisturbed soil cores and for subsequent analysis of fluid movement within the cores. The major problems associated with these methods are expense, cumbersome field extraction, and inadequate simulation of unsaturated flow conditions. A field and laboratory procedure is presented that is economical, convenient, and simulates unsaturated and saturated flow without interface flow problems and can be used on a variety of soil types. In the field, a stainless steel core barrel is hydraulically pressed into the soil (30-cm diam. and 38 cm high), the barrel and core are extracted from the soil, and after the barrel is removed from the core, the core is then wrapped securely with flexible sheet metal and a stainless mesh screen is attached to the bottom of the core for support. In the laboratory the soil core is set atop a porous ceramic plate over which a soil-diatomaceous earth slurry has been poured to assure good contact between plate and core. A cardboard cylinder (mold) is fastened around the core and the empty space filled with paraffin wax. Soil cores were tested under saturated and unsaturated conditions using a hanging water column for potentials ???0. Breakthrough curves indicated that no interface flow occurred along the edge of the core. This procedure proved to be reliable for field extraction of large, intact soil cores and for laboratory analysis of solute transport.
Magnetic resonance imaging in laboratory petrophysical core analysis
NASA Astrophysics Data System (ADS)
Mitchell, J.; Chandrasekera, T. C.; Holland, D. J.; Gladden, L. F.; Fordham, E. J.
2013-05-01
Magnetic resonance imaging (MRI) is a well-known technique in medical diagnosis and materials science. In the more specialized arena of laboratory-scale petrophysical rock core analysis, the role of MRI has undergone a substantial change in focus over the last three decades. Initially, alongside the continual drive to exploit higher magnetic field strengths in MRI applications for medicine and chemistry, the same trend was followed in core analysis. However, the spatial resolution achievable in heterogeneous porous media is inherently limited due to the magnetic susceptibility contrast between solid and fluid. As a result, imaging resolution at the length-scale of typical pore diameters is not practical and so MRI of core-plugs has often been viewed as an inappropriate use of expensive magnetic resonance facilities. Recently, there has been a paradigm shift in the use of MRI in laboratory-scale core analysis. The focus is now on acquiring data in the laboratory that are directly comparable to data obtained from magnetic resonance well-logging tools (i.e., a common physics of measurement). To maintain consistency with well-logging instrumentation, it is desirable to measure distributions of transverse (T2) relaxation time-the industry-standard metric in well-logging-at the laboratory-scale. These T2 distributions can be spatially resolved over the length of a core-plug. The use of low-field magnets in the laboratory environment is optimal for core analysis not only because the magnetic field strength is closer to that of well-logging tools, but also because the magnetic susceptibility contrast is minimized, allowing the acquisition of quantitative image voxel (or pixel) intensities that are directly scalable to liquid volume. Beyond simple determination of macroscopic rock heterogeneity, it is possible to utilize the spatial resolution for monitoring forced displacement of oil by water or chemical agents, determining capillary pressure curves, and estimating wettability. The history of MRI in petrophysics is reviewed and future directions considered, including advanced data processing techniques such as compressed sensing reconstruction and Bayesian inference analysis of under-sampled data. Although this review focuses on rock core analysis, the techniques described are applicable in a wider context to porous media in general, such as cements, soils, ceramics, and catalytic materials.
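Since the review centres on laboratory measurement of T2 relaxation-time distributions, a schematic illustration may help. The sketch below inverts a synthetic multi-exponential decay into a T2 amplitude spectrum using Tikhonov-regularized non-negative least squares; the kernel, grid, and noise level are assumptions for illustration, not the authors' processing chain.

```python
# Sketch: estimate a T2 distribution from a multi-exponential decay
# via regularized non-negative least squares (generic illustration).
import numpy as np
from scipy.optimize import nnls

t = np.linspace(2e-4, 2.0, 400)                # echo times (s), assumed
T2_grid = np.logspace(-3, 1, 60)               # candidate T2 values (s)
K = np.exp(-t[:, None] / T2_grid[None, :])     # exponential kernel

# Synthetic "measured" decay: two pore populations plus noise (assumed values).
true_f = np.zeros_like(T2_grid)
true_f[np.argmin(np.abs(T2_grid - 0.05))] = 0.6
true_f[np.argmin(np.abs(T2_grid - 0.5))] = 0.4
signal = K @ true_f + 0.005 * np.random.default_rng(0).standard_normal(t.size)

# Tikhonov regularization: append alpha*I rows so nnls penalizes rough solutions.
alpha = 0.1
A = np.vstack([K, alpha * np.eye(T2_grid.size)])
b = np.concatenate([signal, np.zeros(T2_grid.size)])
f, _ = nnls(A, b)                              # non-negative T2 amplitude spectrum
print("log-mean T2 estimate (s):", np.exp(np.sum(f * np.log(T2_grid)) / f.sum()))
```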
Chakrabarti, Anjan K.; Grau-Sepulveda, Maria V.; O’Brien, Sean; Abueg, Cassandra; Ponirakis, Angelo; Delong, Elizabeth; Peterson, Eric; Klein, Lloyd W.; Garratt, Kirk N.; Weintraub, William S.; Gibson, C. Michael
2017-01-01
Background: The goal of this study was to compare angiographic interpretation of coronary arteriograms by sites in community practice versus those made by a centralized angiographic core laboratory. Methods and Results: The study population consisted of 2013 American College of Cardiology–National Cardiovascular Data Registry (ACC–NCDR) records with 2- and 3-vessel coronary disease from 54 sites in 2004 to 2007. The primary analysis compared Registry (NCDR)-defined 2- and 3-vessel disease versus those from an angiographic core laboratory analysis. Vessel-level kappa coefficients suggested moderate agreement between NCDR and core laboratory analysis, ranging from kappa=0.39 (95% confidence intervals, 0.32–0.45) for the left anterior descending artery to kappa=0.59 (95% confidence intervals, 0.55–0.64) for the right coronary artery. Overall, 6.3% (n=127 out of 2013) of those patients identified with multivessel disease at NCDR sites had had 0- or 1-vessel disease by core laboratory reading. There was no directional bias with regard to overcall; that is, 12.3% of cases read as 3-vessel disease by the sites were read as <3-vessel disease by the core laboratory, and 13.9% of core laboratory 3-vessel cases were read as <3-vessel by the sites. For a subset of patients with left main coronary disease, registry overcall was not linked to increased rates of mortality or myocardial infarction. Conclusions: There was only modest agreement between angiographic readings in clinical practice and those from an independent core laboratory. Further study will be needed because the implications for patient management are uncertain. PMID:24496239
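The vessel-level agreement figures quoted above are kappa coefficients. A minimal sketch of how such a coefficient is computed from paired site and core-laboratory calls is shown below; the counts are invented, not NCDR data.

```python
# Sketch: Cohen's kappa for agreement between site and core-lab reads of a
# single vessel (diseased / not diseased). Counts are hypothetical.
from sklearn.metrics import cohen_kappa_score

site_reads = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]   # 1 = >=50% stenosis per site
core_reads = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]   # 1 = >=50% stenosis per core lab
print("kappa =", round(cohen_kappa_score(site_reads, core_reads), 2))
```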
Publications - GMC 313 | Alaska Division of Geological & Geophysical Surveys
Authors: Saltmarsh, Art, and Core Laboratories. Bibliographic Reference: Saltmarsh, Art, and Core Laboratories, 2004, Porosity and permeability core analysis ('-13,075.0') from the Pan American Redoubt Shoal State (29690) #1 well.
Laboratory ultrasonic pulse velocity logging for determination of elastic properties from rock core
NASA Astrophysics Data System (ADS)
Blacklock, Natalie Erin
During the development of deep underground excavations spalling and rockbursting have been recognized as significant mechanisms of violent brittle failure. In order to predict whether violent brittle failure will occur, it is important to identify the location of stiffness transitions that are associated with geologic structure. One approach to identify the effect of geologic structures is to apply borehole geophysical tools ahead of the tunnel advance. Stiffness transitions can be identified using mechanical property analysis surveys that combine acoustic velocity and density data to calculate acoustic estimates of elastic moduli. However, logistical concerns arise since the approach must be conducted at the advancing tunnel face. As a result, borehole mechanical property analyses are rarely used. Within this context, laboratory ultrasonic pulse velocity testing has been proposed as a potential alternative to borehole mechanical property analysis since moving the analysis to the laboratory would remove logistical constraints and improve safety for the evaluators. In addition to the traditional method of conducting velocity testing along the core axis, two new methodologies for point-focused testing were developed across the core diameter, and indirectly along intact lengths of drill core. The indirect test procedure was implemented in a continuous ultrasonic velocity test program along 573m of drill core to identify key geologic structures that generated transitions in ultrasonic elastic moduli. The test program was successful at identifying the location of geologic contacts, igneous intrusions, faults and shear structures. Ultrasonic values of Young's modulus and bulk modulus were determined at locations of significant velocity transitions to examine the potential for energy storage and energy release. Comparison of results from different ultrasonic velocity test configurations determined that the indirect test configuration provided underestimates for values of Young's modulus. This indicated that the test procedure will require modifications to improve coupling of the transducers to the core surface. In order to assess whether laboratory testing can be an alternative to borehole surveys, laboratory velocity testing must be directly assessed with results from acoustic borehole logging. There is also potential for the laboratory velocity program to be used to assess small scale stiffness changes, differences in mineral composition and the degree of fracturing of drill core.
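The thesis converts measured velocities and density into dynamic elastic moduli. The sketch below applies the standard isotropic-elasticity relations with assumed example values; it is the textbook conversion, not the author's specific test configuration.

```python
# Sketch: dynamic elastic moduli from P- and S-wave velocities and density,
# using standard isotropic elasticity relations (assumed example values).
rho = 2700.0      # density, kg/m^3
vp = 5500.0       # P-wave velocity, m/s
vs = 3200.0       # S-wave velocity, m/s

G = rho * vs**2                           # shear modulus, Pa
K = rho * (vp**2 - (4.0 / 3.0) * vs**2)   # bulk modulus, Pa
E = 9 * K * G / (3 * K + G)               # Young's modulus, Pa
nu = (3 * K - 2 * G) / (2 * (3 * K + G))  # Poisson's ratio

print(f"G = {G/1e9:.1f} GPa, K = {K/1e9:.1f} GPa, E = {E/1e9:.1f} GPa, nu = {nu:.2f}")
```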
Introductory Archaeology: The Inexpensive Laboratory.
ERIC Educational Resources Information Center
Rice, Patricia C.
1990-01-01
Describes a number of student-focused laboratory exercises that are inexpensive, yet show the scientific character of archaeology. Describes the environmental laboratory exercise which includes the following analysis topics: (1) pollen; (2) earth core; (3) microfaunal; and (4) microwear. Describes the ceramic laboratory which involves…
NASA Astrophysics Data System (ADS)
Ja'fari, Ahmad; Hamidzadeh Moghadam, Rasoul
2012-10-01
Routine core analysis provides useful information for petrophysical study of hydrocarbon reservoirs. Effective porosity and fluid conductivity (permeability) can be obtained from core analysis in the laboratory, but coring hydrocarbon-bearing intervals and analyzing the cores is expensive and time consuming. In this study, an improved method is proposed for making a quantitative correlation between porosity and permeability obtained from core data and conventional well log data by integrating different artificial intelligence systems. The proposed method combines the results of adaptive neuro-fuzzy inference system (ANFIS) and neural network (NN) algorithms for overall estimation of core data from conventional well log data, multiplying the output of each algorithm by a weight factor. Simple averaging and weighted averaging were used for determining the weight factors; in the weighted averaging method, a genetic algorithm (GA) is used to determine the weights. The overall algorithm was applied in one of SW Iran's oil fields with two cored wells. One-third of all data were used as the test dataset and the rest were used for training the networks. Results show that the output of the GA averaging method provided the best mean square error and also the best correlation coefficient with real core data.
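A hedged sketch of the committee-machine idea described above, combining two estimators' outputs with a weight chosen to minimize mean square error, is given below. A simple exhaustive search over the weight stands in for the genetic algorithm, and all data are synthetic.

```python
# Sketch: combine two permeability estimators with a weighted average whose
# weight minimizes mean square error on training data (stand-in for the GA).
import numpy as np

rng = np.random.default_rng(1)
core_perm = rng.lognormal(mean=2.0, sigma=0.5, size=100)      # "true" core data
pred_anfis = core_perm * rng.normal(1.0, 0.15, size=100)      # hypothetical ANFIS output
pred_nn = core_perm * rng.normal(1.0, 0.20, size=100)         # hypothetical NN output

def mse(w):
    combined = w * pred_anfis + (1.0 - w) * pred_nn
    return np.mean((combined - core_perm) ** 2)

weights = np.linspace(0.0, 1.0, 101)            # exhaustive search over the weight
best_w = weights[np.argmin([mse(w) for w in weights])]
print(f"best ANFIS weight = {best_w:.2f}, MSE = {mse(best_w):.3f}")
```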
Hydrogen Safety Project: Chemical analysis support task. Window "E" analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, T E; Campbell, J A; Hoppe, E W
1992-09-01
Core samples taken from tank 101-SY at Hanford during "window E" were analyzed for organic and radiochemical constituents by staff of the Analytical Chemistry Laboratory at Pacific Northwest Laboratory. Westinghouse Hanford Company submitted these samples to the laboratory.
Blau, Ashley; Brown, Alison; Mahanta, Lisa; Amr, Sami S.
2016-01-01
The Translational Genomics Core (TGC) at Partners Personalized Medicine (PPM) serves as a fee-for-service core laboratory for Partners Healthcare researchers, providing access to technology platforms and analysis pipelines for genomic, transcriptomic, and epigenomic research projects. The interaction of the TGC with various components of PPM provides it with a unique infrastructure that allows for greater IT and bioinformatics opportunities, such as sample tracking and data analysis. The following article describes some of the unique opportunities available to an academic research core operating within PPM, such as the ability to develop analysis pipelines with a dedicated bioinformatics team and maintain a flexible Laboratory Information Management System (LIMS) with the support of an internal IT team, as well as the operational challenges encountered in responding to emerging technologies, diverse investigator needs, and high staff turnover. In addition, the implementation and operational role of the TGC in the Partners Biobank genotyping project of over 25,000 samples is presented as an example of core activities working with other components of PPM. PMID:26927185
Fitzgibbons, Patrick L; Murphy, Douglas A; Dorfman, David M; Roche, Patrick C; Tubbs, Raymond R
2006-10-01
Correct assessment of human epidermal growth factor receptor 2 (HER2) status is essential in managing patients with invasive breast carcinoma, but few data are available on the accuracy of laboratories performing HER2 testing by immunohistochemistry (IHC). To review the results of the 2004 and 2005 College of American Pathologists HER2 Immunohistochemistry Tissue Microarray Survey. The HER2 survey is designed for laboratories performing immunohistochemical staining and interpretation for HER2. The survey uses tissue microarrays, each consisting of ten 3-mm tissue cores obtained from different invasive breast carcinomas. All cases are also analyzed by fluorescence in situ hybridization. Participants receive 8 tissue microarrays (80 cases) with instructions to perform immunostaining for HER2 using the laboratory's standard procedures. The laboratory interprets the stained slides and returns results to the College of American Pathologists for analysis. In 2004 and 2005, a core was considered "graded" when at least 90% of laboratories agreed on the result--negative (0, 1+) versus positive (2+, 3+). This interlaboratory comparison survey included 102 laboratories in 2004 and 141 laboratories in 2005. Of the 160 cases in both surveys, 111 (69%) achieved 90% consensus (graded). All 43 graded cores scored as IHC-positive were fluorescence in situ hybridization-positive, whereas all but 3 of the 68 IHC-negative graded cores were fluorescence in situ hybridization-negative. Ninety-seven (95%) of 102 laboratories in 2004 and 129 (91%) of 141 laboratories in 2005 correctly scored at least 90% of the graded cores. Performance among laboratories performing HER2 IHC in this tissue microarray-based survey was excellent. Cores found to be IHC-positive or IHC-negative by participant consensus can be used as validated benchmarks for interlaboratory comparison, allowing laboratories to assess their performance and determine if improvements are needed.
Tank 241-T-204, core 188 analytical results for the final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nuzum, J.L.
This document is the final laboratory report for Tank 241-T-204. Push mode core segments were removed from Riser 3 between March 27, 1997, and April 11, 1997. Segments were received and extruded at 222-S Laboratory. Analyses were performed in accordance with Tank 241-T-204 Push Mode Core Sampling and Analysis Plan (TSAP) (Winkleman, 1997), Letter of Instruction for Core Sample Analysis of Tanks 241-T-201, 241-T-202, 241-T-203, and 241-T-204 (LOI) (Bell, 1997), and Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995). None of the subsamples submitted for total alpha activity (AT) or differential scanning calorimetry (DSC) analyses exceeded the notification limits stated in the DQO. The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group and are not considered in this report.
Enhancements to the Image Analysis Tool for Core Punch Experiments and Simulations (vs. 2014)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hogden, John Edward; Unal, Cetin
A previous paper (Hogden & Unal, 2012, Image Analysis Tool for Core Punch Experiments and Simulations) described an image processing computer program developed at Los Alamos National Laboratory. This program has proven useful, so development has been continued. In this paper we describe enhancements to the program as of 2014.
Video networking of cardiac catheterization laboratories.
Tobis, J; Aharonian, V; Mansukhani, P; Kasaoka, S; Jhandyala, R; Son, R; Browning, R; Youngblood, L; Thompson, M
1999-02-01
The purpose of this study was to assess the feasibility and accuracy of a video telecommunication network to transmit coronary images to provide on-line interaction between personnel in a cardiac catheterization laboratory and a remote core laboratory. A telecommunication system was installed in the cardiac catheterization laboratory at Kaiser Hospital, Los Angeles, and the core laboratory at the University of California, Irvine, approximately 40 miles away. Cineangiograms, live fluoroscopy, intravascular ultrasound studies and images of the catheterization laboratory were transmitted in real time over a dedicated T1 line at 768 kilobytes/second at 15 frames/second. These cases were performed during a clinical study of angiographic guidance versus intravascular ultrasound (IVUS) guidance of stent deployment. During the cases the core laboratory performed quantitative analysis of the angiograms and ultrasound images. Selected images were then annotated and transmitted back to the catheterization laboratory to facilitate discussion during the procedure. A successful communication hookup was obtained in 39 (98%) of 40 cases. Measurements of angiographic parameters were very close between the original cinefilm and the transmitted images. Quantitative analysis of the ultrasound images showed no significant difference in any of the diameter or cross-sectional area measurements between the original ultrasound tape and the transmitted images. The telecommunication link during the interventional procedures had a significant impact in 23 (58%) of 40 cases affecting the area to be treated, the size of the inflation balloon, recognition of stent underdeployment, or the existence of disease in other areas that was not noted on the original studies. Current video telecommunication systems provide high-quality images on-line with accurate representation of cineangiograms and intravascular ultrasound images. This system had a significant impact on 58% of the cases in this small clinical trial. Telecommunication networks between hospitals and a central core laboratory may facilitate physician training and improve technical skills and judgement during interventional procedures. This project has implications for how multicenter clinical trials could be operated through telecommunication networks to ensure conformity with the protocol.
Posttest analysis of a laboratory-cast monolith of salt-saturated concrete. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wakeley, L.D.; Poole, T.S.
A salt-saturated concrete was formulated for laboratory testing of cementitious mixtures with potential for use in disposal of radioactive wastes in a geologic repository in halite rock. Cores were taken from a laboratory-cast concrete monolith on completion of tests of permeability, strain, and stress. The cores were analyzed for physical and chemical evidence of brine migration through the concrete, and other features with potential impact on installation of concrete plugs at the Waste Isolation Pilot Plant (WIPP) in New Mexico. The posttest analyses of the cores provided evidence of brine movement along the interface between concrete and pipe, and little indication of permeability through the monolith itself. There may also have been diffusion of chloride into the monolith without actual brine flow.
Optical Methods for Identifying Hard Clay Core Samples During Petrophysical Studies
NASA Astrophysics Data System (ADS)
Morev, A. V.; Solovyeva, A. V.; Morev, V. A.
2018-01-01
X-ray phase analysis of the general mineralogical composition of core samples from one of the West Siberian fields was performed. Electronic absorption spectra of the clay core samples with an added indicator were studied. The speed and availability of applying the two methods in petrophysical laboratories during sample preparation for standard and special studies were estimated.
Suba, Eric J; Pfeifer, John D; Raab, Stephen S
2007-10-01
Patient identification errors in surgical pathology often involve switches of prostate or breast needle core biopsy specimens among patients. We assessed strategies for decreasing the occurrence of these uncommon and yet potentially catastrophic events. Root cause analyses were performed following 3 cases of patient identification error involving prostate needle core biopsy specimens. Patient identification errors in surgical pathology result from slips and lapses of automatic human action that may occur at numerous steps during pre-laboratory, laboratory and post-laboratory work flow processes. Patient identification errors among prostate needle biopsies may be difficult to entirely prevent through the optimization of work flow processes. A DNA time-out, whereby DNA polymorphic microsatellite analysis is used to confirm patient identification before radiation therapy or radical surgery, may eliminate patient identification errors among needle biopsies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campbell, J.A.; Clauss, S.A.; Grant, K.E.
The objectives of this task are to develop and document extraction and analysis methods for organics in waste tanks, and to extend these methods to the analysis of actual core samples to support the Waste Tank Organic Safety Program. This report documents progress at Pacific Northwest Laboratory during FY 1994 on methods development, the analysis of waste from Tank 241-C-103 (Tank C-103) and T-111, and the transfer of documented, developed analytical methods to personnel in the Analytical Chemistry Laboratory (ACL) and 222-S laboratory. This report is intended as an annual report, not a completed work.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carter, J. M.
The Concrete and Materials Branch (CMB) of the Geotechnical and Structures Laboratory was requested to perform an analysis on concrete cores collected from the north and south walls of the H-Canyon Section 3 Personnel Tunnel, Savannah River Site, Aiken, South Carolina, to determine the cause of the lower than expected compressive strength. This study examined five cores provided to the ERDC by the Department of Energy. The cores were logged in as CMB No. 170051-1 to 170051-5 and subjected to petrographic examination, air void analysis, chemical sprays, scanning electron microscopy, and x-ray diffraction.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, T.A.
This is the final sample analysis report for tank 241-BX-104 (BX-104), cores 126 and 127. Two segments from each core yielded a total of 11 samples which were analyzed. The data quality objectives (DQOs) applicable to this sampling event were the Safety Screening DQO (Dukelow et al. 1995) and the Organic Safety DQO (Turner et al. 1995). The samples were received, extruded and analyzed at PNNL 325 Analytical Chemistry Laboratory (ACL). The analyses were performed in accordance with the Sample Analysis Plan (Gretsinger 1996) and indicated that the tank is safe with respect to the criteria in the Safety Screening and Organic DQO. Detailed analytical results were described in the analytical laboratory 45-day Report (Attachment 1, WHC-SD-WM-DP-171, REV. 0) and final report (Attachment 2, PNL-BX-104 REV.1) prepared by PNNL, 325 Laboratory. Corrections and/or exceptions to the PNNL final report are provided.
Ramírez, Juan C; Parrado, Rudy; Sulleiro, Elena; de la Barra, Anabelle; Rodríguez, Marcelo; Villarroel, Sandro; Irazu, Lucía; Alonso-Vega, Cristina; Alves, Fabiana; Curto, María A; García, Lineth; Ortiz, Lourdes; Torrico, Faustino; Gascón, Joaquim; Flevaud, Laurence; Molina, Israel; Ribeiro, Isabela; Schijman, Alejandro G
2017-01-01
Real-Time PCR (qPCR) testing is recommended as both a diagnostic and outcome measurement of etiological treatment in clinical practice and clinical trials of Chagas disease (CD), but no external quality assurance (EQA) program provides performance assessment of the assays in use. We implemented an EQA system to evaluate the performance of molecular biology laboratories involved in qPCR based follow-up in clinical trials of CD. An EQA program was devised for three clinical trials of CD: the E1224 (NCT01489228), a pro-drug of ravuconazole; the Sampling Study (NCT01678599), that used benznidazole, both conducted in Bolivia; and the CHAGASAZOL (NCT01162967), that tested posaconazole, conducted in Spain. Four proficiency testing panels containing negative controls and seronegative blood samples spiked with 1, 10 and 100 parasite equivalents (par. eq.)/mL of four Trypanosoma cruzi stocks, were sent from the Core Lab in Argentina to the participating laboratories located in Bolivia and Spain. Panels were analyzed simultaneously, blinded to sample allocation, at 4-month intervals. In addition, 302 random blood samples from both trials carried out in Bolivia were sent to Core Lab for retesting analysis. The analysis of proficiency testing panels gave 100% of accordance (within laboratory agreement) and concordance (between laboratory agreement) for all T. cruzi stocks at 100 par. eq./mL; whereas their values ranged from 71 to 100% and from 62 to 100% at 1 and 10 par. eq./mL, respectively, depending on the T. cruzi stock. The results obtained after twelve months of preparation confirmed the stability of blood samples in guanidine-EDTA buffer. No significant differences were found between qPCR results from Bolivian laboratory and Core Lab for retested clinical samples. This EQA program for qPCR analysis of CD patient samples may significantly contribute to ensuring the quality of laboratory data generated in clinical trials and molecular diagnostics laboratories of CD.
Laurinaviciene, Aida; Plancoulaine, Benoit; Baltrusaityte, Indra; Meskauskas, Raimundas; Besusparis, Justinas; Lesciute-Krilaviciene, Daiva; Raudeliunas, Darius; Iqbal, Yasir; Herlin, Paulette; Laurinavicius, Arvydas
2014-01-01
Digital immunohistochemistry (IHC) is one of the most promising applications brought by new generation image analysis (IA). While conventional IHC staining quality is monitored by semi-quantitative visual evaluation of tissue controls, IA may require more sensitive measurement. We designed an automated system to digitally monitor IHC multi-tissue controls, based on SQL-level integration of the laboratory information system with image and statistical analysis tools. Consecutive sections of a TMA containing 10 cores of breast cancer tissue were used as tissue controls in routine Ki67 IHC testing. The Ventana slide label barcode ID was sent to the LIS to register the serial section sequence. The slides were stained and scanned (Aperio ScanScope XT), and IA was performed by the Aperio/Leica Colocalization and Genie Classifier/Nuclear algorithms. SQL-based integration ensured automated statistical analysis of the IA data by the SAS Enterprise Guide project. Factor analysis and plot visualizations were performed to explore slide-to-slide variation of the Ki67 IHC staining results in the control tissue. Slide-to-slide intra-core IHC staining analysis revealed rather significant variation of the variables reflecting the sample size, while Brown and Blue Intensity were relatively stable. To further investigate this variation, the IA results from the 10 cores were aggregated to minimize tissue-related variance. Factor analysis revealed an association between the variables reflecting the sample size detected by IA and Blue Intensity. Since the main feature to be extracted from the tissue controls was staining intensity, we further explored the variation of the intensity variables in the individual cores. MeanBrownBlue Intensity ((Brown+Blue)/2) and DiffBrownBlue Intensity (Brown-Blue) were introduced to better contrast the absolute intensity and the colour balance variation in each core; relevant factor scores were extracted. Finally, tissue-related factors of IHC staining variance were explored in the individual tissue cores. Our solution enabled monitoring of IHC multi-tissue control staining by means of IA, followed by automated statistical analysis, integrated into the laboratory workflow. We found that, even in consecutive serial tissue sections, tissue-related factors affected the IHC IA results; meanwhile, less intense blue counterstain was associated with a smaller amount of tissue detected by the IA tools.
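The derived intensity variables in this abstract are simple linear combinations of the per-core Brown and Blue intensity measurements. A minimal pandas sketch of how they could be computed is shown below; the values are invented, not the study's measurements.

```python
# Sketch: derived intensity variables from per-core IHC image-analysis output.
# MeanBrownBlue = (Brown + Blue) / 2, DiffBrownBlue = Brown - Blue.
import pandas as pd

cores = pd.DataFrame({
    "core_id": [1, 2, 3],
    "BrownIntensity": [142.0, 150.5, 138.2],   # hypothetical values
    "BlueIntensity": [120.3, 118.9, 125.7],
})
cores["MeanBrownBlueIntensity"] = (cores["BrownIntensity"] + cores["BlueIntensity"]) / 2
cores["DiffBrownBlueIntensity"] = cores["BrownIntensity"] - cores["BlueIntensity"]
print(cores)
```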
Challenges for proteomics core facilities.
Lilley, Kathryn S; Deery, Michael J; Gatto, Laurent
2011-03-01
Many analytical techniques have been executed by core facilities established within academic, pharmaceutical and other industrial institutions. The centralization of such facilities ensures a level of expertise and hardware which often cannot be supported by individual laboratories. The establishment of a core facility thus makes the technology available for multiple researchers in the same institution. Often, the services within the core facility are also opened out to researchers from other institutions, frequently with a fee being levied for the service provided. In the 1990s, with the onset of the age of genomics, there was an abundance of DNA analysis facilities, many of which have since disappeared from institutions and are now available through commercial sources. Ten years on, as proteomics was beginning to be utilized by many researchers, this technology found itself an ideal candidate for being placed within a core facility. We discuss what in our view are the daily challenges of proteomics core facilities. We also examine the potential unmet needs of the proteomics core facility that may also be applicable to proteomics laboratories which do not function as core facilities. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Core vs. Bulk Samples in Soil-Moisture Tension Analyses
Walter M. Broadfoot
1954-01-01
The usual laboratory procedure in determining soil-moisture tension values is to use "undisturbed" soil cores for tensions up to 60 cm. of water and bulk soil samples for higher tensions. Low tensions are usually obtained with a tension table and the higher tensions by use of pressure plate apparatus. In tension analysis at the Vicksburg Infiltration Project...
ERIC Educational Resources Information Center
DeSantis, Kara A.; Reinking, Jeffrey L.
2011-01-01
This laboratory exercise is an inquiry-based investigation developed around the core experiment where students, working alone or in groups, each purify and analyze their own prescreened colored proteins using immobilized metal affinity chromatography (IMAC). Here, we present reagents and protocols that allow 12 different proteins to be purified in…
Pharmacology Portal: An Open Database for Clinical Pharmacologic Laboratory Services.
Karlsen Bjånes, Tormod; Mjåset Hjertø, Espen; Lønne, Lars; Aronsen, Lena; Andsnes Berg, Jon; Bergan, Stein; Otto Berg-Hansen, Grim; Bernard, Jean-Paul; Larsen Burns, Margrete; Toralf Fosen, Jan; Frost, Joachim; Hilberg, Thor; Krabseth, Hege-Merete; Kvan, Elena; Narum, Sigrid; Austgulen Westin, Andreas
2016-01-01
More than 50 Norwegian public and private laboratories provide one or more analyses for therapeutic drug monitoring or testing for drugs of abuse. Practices differ among laboratories, and analytical repertoires can change rapidly as new substances become available for analysis. The Pharmacology Portal was developed to provide an overview of these activities and to standardize the practices and terminology among laboratories. The Pharmacology Portal is a modern dynamic web database comprising all available analyses within therapeutic drug monitoring and testing for drugs of abuse in Norway. Content can be retrieved by using the search engine or by scrolling through substance lists. The core content is a substance registry updated by a national editorial board of experts within the field of clinical pharmacology. This ensures quality and consistency regarding substance terminologies and classification. All laboratories publish their own repertoires in a user-friendly workflow, adding laboratory-specific details to the core information in the substance registry. The user management system ensures that laboratories are restricted from editing content in the database core or in repertoires within other laboratory subpages. The portal is for nonprofit use, and has been fully funded by the Norwegian Medical Association, the Norwegian Society of Clinical Pharmacology, and the 8 largest pharmacologic institutions in Norway. The database server runs an open-source content management system that ensures flexibility with respect to further development projects, including the potential expansion of the Pharmacology Portal to other countries. Copyright © 2016 Elsevier HS Journals, Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Osetkovsky, I. V.; Kozyrev, N. A.; Kryukov, R. E.; Usoltsev, A. A.; Gusev, A. I.
2017-09-01
The effect of adding cobalt to the charge of a Fe-C-Si-Mn-Cr-Ni-Mo-V flux cored wire intended for service under abrasive and abrasive-shock loads is studied. Sample flux cored wires were manufactured in the laboratory from appropriate powdered materials, with dust from the gas cleaning units of aluminum production used as the carbon-fluorine-containing component. Deposition was performed, and the influence of cobalt on the structure, the nature of nonmetallic inclusions, the hardness, and the wear resistance of the weld metal was evaluated. In the course of the study the chemical composition of the weld metal was determined, metallographic analysis was performed, and mechanical properties were measured. The metallographic analysis established the size of the former austenite grain, the dispersion of martensite in the weld-metal structure, and the level of contamination with nonmetallic inclusions.
ERIC Educational Resources Information Center
Han, Duanduan; Ugaz, Victor
2017-01-01
Three self-contained mini-labs were integrated into a core undergraduate fluid mechanics course, with the goal of delivering hands-on content in a manner scalable to large class sizes. These mini-labs supported learning objectives involving friction loss in pipes, flow measurement, and centrifugal pump analysis. The hands-on experiments were…
The role of total laboratory automation in a consolidated laboratory network.
Seaberg, R S; Stallone, R O; Statland, B E
2000-05-01
In an effort to reduce overall laboratory costs and improve overall laboratory efficiencies at all of its network hospitals, the North Shore-Long Island Health System recently established a Consolidated Laboratory Network with a Core Laboratory at its center. We established and implemented a centralized Core Laboratory designed around the Roche/Hitachi CLAS Total Laboratory Automation system to perform the general and esoteric laboratory testing throughout the system in a timely and cost-effective fashion. All remaining STAT testing will be performed within the Rapid Response Laboratories (RRLs) at each of the system's hospitals. Results for this laboratory consolidation and implementation effort demonstrated a decrease in labor costs and improved turnaround time (TAT) at the core laboratory. Anticipated system savings are approximately $2.7 million. TATs averaged 1.3 h within the Core Laboratory and less than 30 min in the RRLs. When properly implemented, automation systems can reduce overall laboratory expenses, enhance patient services, and address the overall concerns facing the laboratory today: job satisfaction, decreased length of stay, and safety. The financial savings realized are primarily a result of labor reductions.
TREAT Transient Analysis Benchmarking for the HEU Core
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kontogeorgakos, D. C.; Connaway, H. M.; Wright, A. E.
2014-05-01
This work was performed to support the feasibility study on the potential conversion of the Transient Reactor Test Facility (TREAT) at Idaho National Laboratory from the use of high enriched uranium (HEU) fuel to the use of low enriched uranium (LEU) fuel. The analyses were performed by the GTRI Reactor Conversion staff at the Argonne National Laboratory (ANL). The objective of this study was to benchmark the transient calculations against temperature-limited transients performed in the final operating HEU TREAT core configuration. The MCNP code was used to evaluate steady-state neutronics behavior, and the point kinetics code TREKIN was used to determine core power and energy during transients. The first part of the benchmarking process was to calculate with MCNP all the neutronic parameters required by TREKIN to simulate the transients: the transient rod-bank worth, the prompt neutron generation lifetime, the temperature reactivity feedback as a function of total core energy, and the core-average temperature and peak temperature as functions of total core energy. The results of these calculations were compared against measurements or against reported values as documented in the available TREAT reports. The heating of the fuel was simulated as an adiabatic process. The reported values were extracted from ANL reports, intra-laboratory memos and experiment logsheets, and in some cases it was not clear if the values were based on measurements, on calculations, or a combination of both. Therefore, it was decided to use the term "reported" values when referring to such data. The methods and results from the HEU core transient analyses will be used for the potential LEU core configurations to predict the converted (LEU) core's performance.
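As a schematic of the point-kinetics-with-adiabatic-feedback calculation described above (not TREKIN itself), the sketch below integrates one-delayed-group point kinetics with an energy-dependent reactivity feedback; every parameter value is an assumed placeholder rather than a TREAT quantity.

```python
# Sketch: one-delayed-group point kinetics with an energy-dependent reactivity
# feedback and adiabatic heating. Parameters are illustrative, not TREAT values.
from scipy.integrate import solve_ivp

beta, lam, Lambda = 0.0070, 0.08, 1.0e-4   # delayed fraction, decay const (1/s), generation time (s)
rho_insert = 0.009                          # step reactivity insertion (assumed)
alpha_E = 1.0e-5                            # reactivity lost per MJ of core energy (assumed)

def rhs(t, y):
    n, c, energy = y                        # power (MW), precursor amplitude, energy (MJ)
    rho = rho_insert - alpha_E * energy     # adiabatic feedback: reactivity falls with energy
    dn = (rho - beta) / Lambda * n + lam * c
    dc = beta / Lambda * n - lam * c
    return [dn, dc, n]                      # dE/dt = power

y0 = [1.0, beta / (lam * Lambda), 0.0]      # equilibrium precursors at 1 MW initial power
sol = solve_ivp(rhs, (0.0, 5.0), y0, max_step=1e-3)
print(f"peak power ~ {sol.y[0].max():.0f} MW, total energy ~ {sol.y[2, -1]:.0f} MJ")
```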
CHAP-2 heat-transfer analysis of the Fort St. Vrain reactor core
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kotas, J.F.; Stroh, K.R.
1983-01-01
The Los Alamos National Laboratory is developing the Composite High-Temperature Gas-Cooled Reactor Analysis Program (CHAP) to provide advanced best-estimate predictions of postulated accidents in gas-cooled reactor plants. The CHAP-2 reactor-core model uses the finite-element method to initialize a two-dimensional temperature map of the Fort St. Vrain (FSV) core and its top and bottom reflectors. The code generates a finite-element mesh, initializes noding and boundary conditions, and solves the nonlinear Laplace heat equation using temperature-dependent thermal conductivities, variable coolant-channel-convection heat-transfer coefficients, and specified internal fuel and moderator heat-generation rates. This paper discusses this method and analyzes an FSV reactor-core accident that simulates a control-rod withdrawal at full power.
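The nonlinear conduction problem described above can be illustrated with a small one-dimensional finite-difference analogue solved by Picard iteration. The sketch below uses assumed geometry, source, and conductivity law; it is not the CHAP-2 finite-element model.

```python
# Sketch: 1-D steady heat conduction with temperature-dependent conductivity
# and a uniform heat source, solved by Picard (successive substitution) iteration.
# Geometry, k(T), and source are assumed illustrative values, not FSV data.
import numpy as np

L, N = 0.5, 51                      # slab thickness (m), number of nodes
x = np.linspace(0.0, L, N)
dx = x[1] - x[0]
q = 2.0e5                           # volumetric heat source, W/m^3
T_wall = 600.0                      # fixed boundary temperature, K

def k_of_T(T):                      # conductivity falling with temperature (assumed law)
    return 60.0 / (1.0 + 1.5e-3 * (T - 300.0))

T = np.full(N, T_wall)
for _ in range(200):                # Picard iterations on the nonlinear conductivity
    k_face = 0.5 * (k_of_T(T[:-1]) + k_of_T(T[1:]))   # face-averaged conductivity
    A = np.zeros((N, N))
    b = np.full(N, -q * dx**2)
    A[0, 0] = A[-1, -1] = 1.0
    b[0] = b[-1] = T_wall
    for i in range(1, N - 1):
        A[i, i - 1], A[i, i + 1] = k_face[i - 1], k_face[i]
        A[i, i] = -(k_face[i - 1] + k_face[i])
    T_new = np.linalg.solve(A, b)
    if np.max(np.abs(T_new - T)) < 1e-6:
        break
    T = T_new
print(f"peak temperature ~ {T.max():.0f} K at x = {x[np.argmax(T)]:.2f} m")
```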
Implementation of Quality Management in Core Service Laboratories
Creavalle, T.; Haque, K.; Raley, C.; Subleski, M.; Smith, M.W.; Hicks, B.
2010-01-01
CF-28 The Genetics and Genomics group of the Advanced Technology Program of SAIC-Frederick exists to bring innovative genomic expertise, tools and analysis to NCI and the scientific community. The Sequencing Facility (SF) provides next generation short read (Illumina) sequencing capacity to investigators using a streamlined production approach. The Laboratory of Molecular Technology (LMT) offers a wide range of genomics core services including microarray expression analysis, miRNA analysis, array comparative genome hybridization, long read (Roche) next generation sequencing, quantitative real time PCR, transgenic genotyping, Sanger sequencing, and clinical mutation detection services to investigators from across the NIH. As the technology supporting this genomic research becomes more complex, the need for basic quality processes within all aspects of the core service groups becomes critical. The Quality Management group works alongside members of these labs to establish or improve processes supporting operations control (equipment, reagent and materials management), process improvement (reengineering/optimization, automation, acceptance criteria for new technologies and tech transfer), and quality assurance and customer support (controlled documentation/SOPs, training, service deficiencies and continual improvement efforts). Implementation and expansion of quality programs within unregulated environments demonstrates SAIC-Frederick's dedication to providing the highest quality products and services to the NIH community.
Flow Cytometry Technician | Center for Cancer Research
PROGRAM DESCRIPTION: The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR). KEY ROLES/RESPONSIBILITIES: The Flow Cytometry Core (Flow Core) of the Cancer and Inflammation Program (CIP) is a service core which supports the research efforts of the CCR by providing expertise in the field of flow cytometry (using analyzers and sorters) with the goal of gaining a more thorough understanding of the biology of cancer and cancer cells. The Flow Core provides service to 12-15 CIP laboratories and more than 22 non-CIP laboratories. Flow Core staff provide technical advice on the experimental design of applications, which include immunological phenotyping, cell function assays, and cell cycle analysis. Work is performed per customer requirements, and no independent research is involved. The Flow Cytometry Technician will be responsible for: monitoring performance of and maintaining high dimensional flow cytometer analyzers and cell sorters; operating high dimensional flow cytometer analyzers and cell sorters; monitoring lab supply levels and ordering lab supplies; performing various record keeping responsibilities; and assisting in the training of scientific end users on the use of flow cytometry in their research, as well as how to operate and troubleshoot the bench-top analyzer instruments. Experience with sterile technique and tissue culture is required.
Booth, James S.
1979-01-01
The purpose of this report is to present the results of geotechnical, textural, and chemical tests performed on samples from the upper Continental Slope, northern Gulf of Mexico. The samples were collected by a piston corer up to 12 m (40 ft.) in length with a head weight of 908 kg (one ton). The inside diameter of the C. A. B. liner was 89 mm (3.5 inches). Upon retrieval, the cores were cut in 1.5 m sections, examined for evidence of disturbance, then, if in acceptable condition, were sealed and placed in their in situ vertical position in a refrigerated van. Once ashore, the sections were opened, sealed with wax, recapped and stored as before. The cores were split lengthwise for analysis. One half of the core was X-rayed and the radiograph was carefully examined as a further check for disturbance. This half was then archived. The other half of the core was used for the laboratory work.
Jagust, William J.; Landau, Susan M.; Koeppe, Robert A.; Reiman, Eric M.; Chen, Kewei; Mathis, Chester A.; Price, Julie C.; Foster, Norman L.; Wang, Angela Y.
2015-01-01
INTRODUCTION: This paper reviews the work done in the ADNI PET core over the past 5 years, largely concerning techniques, methods, and results related to amyloid imaging in ADNI. METHODS: The PET Core has utilized [18F]florbetapir routinely on ADNI participants, with over 1600 scans available for download. Four different laboratories are involved in data analysis, and have examined factors such as longitudinal florbetapir analysis, use of FDG-PET in clinical trials, and relationships between different biomarkers and cognition. RESULTS: Converging evidence from the PET Core has indicated that cross-sectional and longitudinal florbetapir analyses require different reference regions. Studies have also examined the relationship between florbetapir data obtained immediately after injection, which reflects perfusion, and FDG-PET results. Finally, standardization has included the translation of florbetapir PET data to a centiloid scale. CONCLUSION: The PET Core has demonstrated a variety of methods for standardization of biomarkers such as florbetapir PET in a multicenter setting. PMID:26194311
The statistical analysis of circadian phase and amplitude in constant-routine core-temperature data
NASA Technical Reports Server (NTRS)
Brown, E. N.; Czeisler, C. A.
1992-01-01
Accurate estimation of the phases and amplitude of the endogenous circadian pacemaker from constant-routine core-temperature series is crucial for making inferences about the properties of the human biological clock from data collected under this protocol. This paper presents a set of statistical methods based on a harmonic-regression-plus-correlated-noise model for estimating the phases and the amplitude of the endogenous circadian pacemaker from constant-routine core-temperature data. The methods include a Bayesian Monte Carlo procedure for computing the uncertainty in these circadian functions. We illustrate the techniques with a detailed study of a single subject's core-temperature series and describe their relationship to other statistical methods for circadian data analysis. In our laboratory, these methods have been successfully used to analyze more than 300 constant routines and provide a highly reliable means of extracting phase and amplitude information from core-temperature data.
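As a hedged illustration of the harmonic-regression portion of such a model (omitting the correlated-noise and Bayesian Monte Carlo components), the sketch below fits a single 24-h harmonic to synthetic core-temperature data and recovers mesor, amplitude, and acrophase; all parameter values are invented.

```python
# Sketch: single-harmonic regression fit to core-temperature data, recovering
# mesor, amplitude, and acrophase. Synthetic data with an assumed 24-h period;
# the published method adds higher harmonics and a correlated-noise model.
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(0.0, 40.0, 1.0 / 6.0)                   # hours, samples every 10 min
period = 24.0
true_temp = 37.0 + 0.35 * np.cos(2 * np.pi / period * (t - 17.0))
y = true_temp + 0.05 * rng.standard_normal(t.size)    # add measurement noise

# Design matrix: intercept (mesor), cosine, and sine terms.
X = np.column_stack([np.ones_like(t),
                     np.cos(2 * np.pi * t / period),
                     np.sin(2 * np.pi * t / period)])
mesor, a, b = np.linalg.lstsq(X, y, rcond=None)[0]

amplitude = np.hypot(a, b)
acrophase_h = (np.arctan2(b, a) * period / (2 * np.pi)) % period  # hour of temperature peak
print(f"mesor = {mesor:.2f} C, amplitude = {amplitude:.2f} C, acrophase ~ {acrophase_h:.1f} h")
```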
Lack of Association between Hepatitis C Virus core Gene Variation 70/91aa and Insulin Resistance.
Scalioni, Letícia de Paula; da Silva, Allan Peres; Miguel, Juliana Custódio; Espírito Santo, Márcia Paschoal do; Marques, Vanessa Alves; Brandão-Mello, Carlos Eduardo; Villela-Nogueira, Cristiane Alves; Lewis-Ximenez, Lia Laura; Lampe, Elisabeth; Villar, Livia Melo
2017-07-21
The role of hepatitis C virus (HCV) in insulin resistance (IR) is not fully understood. The aim of this study was to determine the impact of amino acid (aa) substitutions in the core region of HCV according to IR and to identify clinical and laboratory associations. Ninety-two treatment-naive HCV patients were recruited to determine laboratory data and blood cell count. IR was determined using the Homeostasis Model Assessment (HOMA) index, where IR was defined as HOMA ≥2. HCV RNA load and genotype were determined by the Abbott RealTime HCV assay. The HCV core region was determined by direct nucleotide sequencing. Bivariate analysis was conducted using HOMA-IR ≥2 as a dependent factor. IR prevalence was 43.5% (n = 40), vitamin D sufficiency was found in 76.1% (n = 70), and 72.8% (n = 67) had advanced liver fibrosis. In the bivariate analyses, elevated values of γGT (p = 0.024) and fibrosis staging (p = 0.004) were associated with IR, but IR was not related to core mutations. The presence of glutamine in position 70 was associated with low vitamin D concentration (p = 0.005). In the multivariate analysis, no variable was independently associated with HOMA-IR. In conclusion, lack of association between IR and HCV core mutations in positions 70 and 91 suggests that genetic variability of this region has little impact on IR.
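The IR classification in this study rests on the HOMA index with a cutoff of 2. A worked example of the standard HOMA-IR formula with assumed fasting values is given below.

```python
# Sketch: HOMA-IR from fasting insulin and glucose, with the HOMA >= 2 cutoff
# used in the study. The patient values here are assumed for illustration.
fasting_insulin_uU_mL = 11.0      # microU/mL (assumed)
fasting_glucose_mmol_L = 5.2      # mmol/L (assumed)

homa_ir = fasting_insulin_uU_mL * fasting_glucose_mmol_L / 22.5
print(f"HOMA-IR = {homa_ir:.2f}, insulin resistant: {homa_ir >= 2}")
```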
Streitberg, George S; Angel, Lyndall; Sikaris, Kenneth A; Bwititi, Phillip T
2012-10-01
Pathology has developed substantially since the 1990s with the introduction of total laboratory automation (TLA), in response to workloads and the need to improve quality. TLA has enhanced core laboratories, which evolved from discipline-based laboratories. Work practices have changed, with central reception now loading samples onto the Inlet module of the TLA. It is important to continually appraise technology. This study looked at the impact of technology using a self-administered survey to seniors in clinical biochemistry in NATA GX/GY-classified laboratories in Australia. The responses were yes, no, or not applicable and are expressed as percentages of responses. Some of the questions sourced for descriptive answers. Eighty-one laboratories responded, and the locations were 63%, 33%, and 4% in capital cities, regional cities, and country towns, respectively. Forty-two percent were public and 58% private. Clinical biochemistry was in all core laboratories of various sizes, and most performed up to 20 tests per sample. Thirty percent of the 121 surveyed laboratories had plans to install an automated line. Fifty-eight percent had hematology and biochemistry instrumentations in their peripheral laboratory, and 16% had a STAT laboratory on the same site as the core laboratory. There were varied instruments in specialist laboratories, and analyzers with embedded computers were in all laboratories. Medium and large laboratories had workstations with integrated instruments, and some large laboratories had TLA. Technology evolution and rising demand for pathology services make it imperative for laboratories to embrace such changes and reorganize the laboratories to take into account point-of-care testing and the efficiencies of core laboratories and TLA.
Trbusek, J
2009-11-01
Detection of HCV core antigen as a direct marker of hepatitis C infection clearly improves the diagnosis of this disease (especially by reducing the window period) and has broad clinical utility. Abbott Laboratories offers a fully automated laboratory test for measurement of HCV core antigen on ARCHITECT analyzers.
ERIC Educational Resources Information Center
Barreto, Jose C.; Dubetz, Terry A.; Schmidt, Diane L.; Isern, Sharon; Beatty, Thomas; Brown, David W.; Gillman, Edward; Alberte, Randall S.; Egiebor, Nosa O.
2007-01-01
Core concepts can be integrated throughout lower-division science and engineering courses by using a series of related, cross-referenced laboratory experiments. Starting with butane combustion in chemistry, the authors expanded the underlying core concepts of energy transfer into laboratories designed for biology, physics, and engineering. This…
NASA Astrophysics Data System (ADS)
Robbins, William L.; Conklin, James J.
1995-10-01
Medical images (angiography, CT, MRI, nuclear medicine, ultrasound, x ray) play an increasingly important role in the clinical development and regulatory review process for pharmaceuticals and medical devices. Since medical images are increasingly acquired and archived digitally, or are readily digitized from film, they can be visualized, processed and analyzed in a variety of ways using digital image processing and display technology. Moreover, with image-based data management and data visualization tools, medical images can be electronically organized and submitted to the U.S. Food and Drug Administration (FDA) for review. The collection, processing, analysis, archival, and submission of medical images in a digital format versus an analog (film-based) format presents both challenges and opportunities for the clinical and regulatory information management specialist. The medical imaging 'core laboratory' is an important resource for clinical trials and regulatory submissions involving medical imaging data. Use of digital imaging technology within a core laboratory can increase efficiency and decrease overall costs in the image data management and regulatory review process.
Publications - GMC 367 | Alaska Division of Geological & Geophysical Surveys
Bibliographic reference: U.S. Minerals Management Service and Core Laboratories, 2009, Sidewall core analyses. Publisher: Alaska Division of Geological & Geophysical Surveys. Publication date: Aug 2009.
Lu, Michael T; Meyersohn, Nandini M; Mayrhofer, Thomas; Bittner, Daniel O; Emami, Hamed; Puchner, Stefan B; Foldyna, Borek; Mueller, Martin E; Hearne, Steven; Yang, Clifford; Achenbach, Stephan; Truong, Quynh A; Ghoshhajra, Brian B; Patel, Manesh R; Ferencik, Maros; Douglas, Pamela S; Hoffmann, Udo
2018-04-01
Purpose To assess concordance and relative prognostic utility between central core laboratory and local site interpretation for significant coronary artery disease (CAD) and cardiovascular events. Materials and Methods In the Prospective Multicenter Imaging Study for Evaluation of Chest Pain (PROMISE) trial, readers at 193 North American sites interpreted coronary computed tomographic (CT) angiography as part of the clinical evaluation of stable chest pain. Readers at a central core laboratory also interpreted CT angiography blinded to clinical data, site interpretation, and outcomes. Significant CAD was defined as stenosis greater than or equal to 50%; cardiovascular events were defined as a composite of cardiovascular death or myocardial infarction. Results In 4347 patients (51.8% women; mean age ± standard deviation, 60.4 years ± 8.2), core laboratory and site interpretations were discordant in 16% (683 of 4347), most commonly because of a finding of significant CAD by site but not by core laboratory interpretation (80%, 544 of 683). Overall, core laboratory interpretation resulted in 41% fewer patients being reported as having significant CAD (14%, 595 of 4347 vs 23%, 1000 of 4347; P < .001). Over a median follow-up period of 25 months, 1.3% (57 of 4347) sustained myocardial infarction or cardiovascular death. The C statistic for future myocardial infarction or cardiovascular death was 0.61 (95% confidence interval [CI]: 0.54, 0.68) for the core laboratory and 0.63 (95% CI: 0.56, 0.70) for the sites. Conclusion Compared with interpretation by readers at 193 North American sites, standardized core laboratory interpretation classified 41% fewer patients as having significant CAD. © RSNA, 2017 Online supplemental material is available for this article. Clinical trial registration no. NCT01174550.
NASA Astrophysics Data System (ADS)
Yang, Lei; Zhou, Weihua; Xue, Kaihua; Wei, Rupeng; Ling, Zheng
2018-05-01
The enormous potential as an alternative energy resource has made natural gas hydrates a material of intense research interest. Their exploration and sample characterization require a quick and effective analysis of the hydrate-bearing cores recovered under in situ pressures. Here a novel Pressure Core Ultrasonic Test System (PCUTS) for on-board analysis of sediment cores containing gas hydrates at in situ pressures is presented. The PCUTS is designed to be compatible with an on-board pressure core transfer device and a long gravity-piston pressure-retained corer. It provides several advantages over laboratory core analysis including quick and non-destructive detection, in situ and successive acoustic property acquisition, and remission of sample storage and transportation. The design of the unique assembly units to ensure the in situ detection is demonstrated, involving the U-type protecting jackets, transducer precession device, and pressure stabilization system. The in situ P-wave velocity measurements make the detection of gas hydrate existence in the sediments possible on-board. Performance tests have verified the feasibility and sensitivity of the ultrasonic test unit, showing the dependence of P-wave velocity on gas hydrate saturation. The PCUTS has been successfully applied for analysis of natural samples containing gas hydrates recovered from the South China Sea. It is indicated that on-board P-wave measurements could provide a quick and effective understanding of the hydrate occurrence in natural samples, which can assist further resource exploration, assessment, and subsequent detailed core analysis.
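As a rough illustration of the acoustic measurement described above (not the PCUTS acquisition code itself), the P-wave velocity follows from the transducer separation and the picked first-arrival time; higher velocities generally indicate higher hydrate saturation. The sample dimension and travel time below are hypothetical.

    # Illustrative only; the path length and travel time are hypothetical placeholder values.
    def p_wave_velocity(path_length_m: float, travel_time_s: float) -> float:
        """Vp = L / t, returned in m/s."""
        return path_length_m / travel_time_s

    if __name__ == "__main__":
        vp = p_wave_velocity(0.060, 2.5e-5)        # 60 mm core, 25 microsecond first arrival
        print(f"Vp = {vp:.0f} m/s")                # 2400 m/s; Vp rises with hydrate saturation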
Tracing footprints of environmental events in tree ring chemistry using neutron activation analysis
NASA Astrophysics Data System (ADS)
Sahin, Dagistan
The aim of this study is to identify environmental effects on tree-ring chemistry. It is known that industrial pollution, volcanic eruptions, dust storms, acid rain, and similar events can cause substantial changes in soil chemistry. Establishing whether a particular group of trees is sensitive to these changes in the soil environment and registers them in the elemental chemistry of contemporary growth rings is the overriding goal of any Dendrochemistry research. In this study, elemental concentrations were measured in tree-ring samples of eleven absolutely dated modern forest trees grown in the Mediterranean region of Turkey, collected and dated by the Malcolm and Carolyn Wiener Laboratory for Aegean and Near Eastern Dendrochronology at Cornell University. Correlations between measured elemental concentrations in the tree-ring samples were analyzed using statistical tests to answer two questions. Does the current concentration of a particular element depend on any other element within the tree? And are there any elements showing correlated abnormal concentration changes across the majority of the trees? The detailed analysis identified the low mobility of sodium and bromine, positive correlations among calcium, zinc, and manganese, and positive correlations among the trace elements lanthanum, samarium, antimony, and gold within tree rings. Moreover, zinc, lanthanum, samarium, and bromine showed strong, positive correlations among the trees and were identified as possible environmental signature elements. The new Dendrochemistry information found in this study would also be useful in explaining tree physiology and elemental chemistry in Pinus nigra trees grown in Turkey. Elemental concentrations in tree-ring samples were measured using Neutron Activation Analysis (NAA) at the Pennsylvania State University Radiation Science and Engineering Center (RSEC). Through this study, advanced methodological, computational, and experimental NAA techniques were developed to ensure acceptable accuracy and certainty in the elemental concentration measurements in tree-ring samples. Two independent NAA analysis methods were used: the well-known k-zero method and a novel method developed in this study, called the Multi-isotope Iterative Westcott (MIW) method. The MIW method uses reaction rate probabilities for a group of isotopes, which can be calculated by a neutronic simulation or measured by experimentation, and determines representative values for the neutron flux and neutron flux characterization parameters based on the Westcott convention. Elemental concentration calculations for standard reference material and tree-ring samples were then performed using the MIW and k-zero analysis methods of NAA, and the results were cross-verified. In the computational part of this study, a detailed burnup-coupled neutronic simulation was developed to analyze real-time neutronic changes in a TRIGA Mark III reactor core, in this case the Penn State Breazeale Reactor (PSBR) core. To the best of the author's knowledge, this is the first burnup-coupled neutronic simulation with realistic time steps and a full fuel temperature profile for a TRIGA reactor using coupling of the Monte Carlo Utility for Reactor Evolutions (MURE) code and the Monte Carlo N-Particle code (MCNP). High fidelity and flexibility in the simulation were sought to replicate actual core operation through the day.
This approach resulted in an enhanced accuracy in neutronic representation of the PSBR core with respect to previous neutronic simulation models for the PSBR core. An important contribution was made in the NAA experimentation practices employed in Dendrochemistry studies at the RSEC. Automated laboratory control and analysis software for NAA measurements in the RSEC Radionuclide Applications Laboratory was developed. Detailed laboratory procedures were written in this study comprising preparation, handling and measurements of tree-ring samples in the Radionuclide Applications Laboratory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
VISWANATH, R.S.
This data package presents sampling data and analytical results from the September 22 and 27, 1999, headspace vapor sampling of Hanford Site Tank 241-Z-361 during sludge core removal. The Lockheed Martin Hanford Corporation (LMHC) sampling team collected the samples, and the Waste Management Laboratory (WML) analyzed them in accordance with the requirements specified in the 241-Z-361 Sludge Characterization Sampling and Analysis Plan (SAP), HNF-4371, Rev. 1 (Babcock and Wilcox Hanford Corporation, 1999). Six SUMMA™ canister samples were collected on each day (1 ambient field blank and 5 tank vapor samples collected as each core segment was removed). The samples were radiologically released on September 28 and October 4, 1999, and received at the laboratory on September 29 and October 6, 1999. Target analytes were not detected at concentrations greater than their notification limits as specified in the SAP. Analytical results for the target analytes and tentatively identified compounds (TICs) are presented in Section 2.2.2 starting on page 2B-7. Three compounds identified for analysis in the SAP were analyzed as TICs. The discussion of this modification is presented in Section 2.2.1.2.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aguilo Valentin, Miguel Alejandro; Trujillo, Susie
During calendar year 2017, Sandia National Laboratories (SNL) made strides toward developing an open, portable design platform rich in high-performance computing (HPC)-enabled modeling, analysis, and synthesis tools. The main focus was to lay the foundations of the core interfaces that will enable plug-and-play insertion of synthesis optimization technologies in the areas of modeling, analysis, and synthesis.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., identification of lithologic and fossil content, core analysis, laboratory analyses of physical and chemical... form of schematic cross sections, 3-dimensional representations, and maps, developed by determining the... means geophysical knowledge, often in the form of schematic cross sections, 3-dimensional...
Code of Federal Regulations, 2014 CFR
2014-07-01
..., identification of lithologic and fossil content, core analysis, laboratory analyses of physical and chemical... form of schematic cross sections, 3-dimensional representations, and maps, developed by determining the... means geophysical knowledge, often in the form of schematic cross sections, 3-dimensional...
Code of Federal Regulations, 2012 CFR
2012-07-01
... include, but is not limited to, identification of lithologic and fossil content, core analysis, laboratory... means geological knowledge, often in the form of schematic cross sections, 3-dimensional representations... information. Interpreted geophysical information means geophysical knowledge, often in the form of schematic...
Code of Federal Regulations, 2013 CFR
2013-07-01
..., identification of lithologic and fossil content, core analysis, laboratory analyses of physical and chemical... form of schematic cross sections, 3-dimensional representations, and maps, developed by determining the... means geophysical knowledge, often in the form of schematic cross sections, 3-dimensional...
Code of Federal Regulations, 2012 CFR
2012-07-01
..., identification of lithologic and fossil content, core analysis, laboratory analyses of physical and chemical... form of schematic cross sections, 3-dimensional representations, and maps, developed by determining the... means geophysical knowledge, often in the form of schematic cross sections, 3-dimensional...
Code of Federal Regulations, 2011 CFR
2011-07-01
... include, but is not limited to, identification of lithologic and fossil content, core analysis, laboratory... form of schematic cross sections, 3-dimensional representations, and maps, developed by determining the... means geophysical knowledge, often in the form of schematic cross sections, 3-dimensional...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fountain, Matthew S.; Fiskum, Sandra K.; Baldwin, David L.
This data package contains the K Basin sludge characterization results obtained by Pacific Northwest National Laboratory during processing and analysis of four sludge core samples collected from Engineered Container SCS-CON-210 in 2010, as requested by CH2M Hill Plateau Remediation Company. Sample processing requirements, analytes of interest, detection limits, and quality control sample requirements are defined in KBC-33786, Rev. 2. The core processing scope included reconstitution of a sludge core sample distributed among four to six 4-L polypropylene bottles into a single container. The reconstituted core sample was then mixed and subsampled to support a variety of characterization activities. Additional core sludge subsamples were combined to prepare a container composite. The container composite was fractionated by wet sieving through a 2,000-micron mesh sieve and a 500-micron mesh sieve. Each sieve fraction was sampled to support a suite of analyses. The core composite analysis scope included density determination, radioisotope analysis, and metals analysis, including the Waste Isolation Pilot Plant Hazardous Waste Facility Permit metals (with the exception of mercury). The container composite analysis included most of the core composite analysis scope plus particle size distribution, particle density, rheology, and crystalline phase identification. A summary of the received samples, core sample reconstitution and subsampling activities, container composite preparation and subsampling activities, physical properties, and analytical results is presented. Supporting data and documentation are provided in the appendices. There were no cases of sample or data loss, and all of the available samples and data are reported as required by the Quality Assurance Project Plan/Sampling and Analysis Plan.
Code of Federal Regulations, 2013 CFR
2013-07-01
... include, but is not limited to, identification of lithologic and fossil content, core analysis, laboratory.... Interpreted geological information means geological knowledge, often in the form of schematic cross sections... knowledge, often in the form of schematic cross sections, 3-dimensional representations, and maps, developed...
Code of Federal Regulations, 2014 CFR
2014-07-01
... include, but is not limited to, identification of lithologic and fossil content, core analysis, laboratory.... Interpreted geological information means geological knowledge, often in the form of schematic cross sections... knowledge, often in the form of schematic cross sections, 3-dimensional representations, and maps, developed...
Integration of Biosafety into Core Facility Management
Fontes, Benjamin
2013-01-01
This presentation will discuss the implementation of biosafety policies for small, medium and large core laboratories with primary shared objectives of ensuring the control of biohazards to protect core facility operators and assure conformity with applicable state and federal policies, standards and guidelines. Of paramount importance is the educational process to inform core laboratories of biosafety principles and policies and to illustrate the technology and process pathways of the core laboratory for biosafety professionals. Elevating awareness of biohazards and the biosafety regulatory landscape among core facility operators is essential for the establishment of a framework for both project and material risk assessment. The goal of the biohazard risk assessment process is to identify the biohazard risk management parameters to conduct the procedure safely and in compliance with applicable regulations. An evaluation of the containment, protective equipment and work practices for the procedure for the level of risk identified is facilitated by the establishment of a core facility registration form for work with biohazards and other biological materials with potential risk. The final step in the biocontainment process is the assumption of Principal Investigator role with full responsibility for the structure of the site-specific biosafety program plan by core facility leadership. The presentation will provide example biohazard protocol reviews and accompanying containment measures for core laboratories at Yale University.
Tank 241-AY-101 Privatization Push Mode Core Sampling and Analysis Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
TEMPLETON, A.M.
2000-01-12
This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for samples obtained from tank 241-AY-101. The purpose of this sampling event is to obtain information about the characteristics of the contents of 241-AY-101 required to satisfy Data Quality Objectives For RPP Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For High-Level Waste Feed Batch X (HLW DQO) (Nguyen 1999a), Data Quality Objectives For TWRS Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For Low-Activity Waste Feed Batch X (LAW DQO) (Nguyen 1999b), Low Activity Waste and High-Level Waste Feed Data Quality Objectives (L and H DQO) (Patello et al. 1999), and Characterization Data Needs for Development, Design, and Operation of Retrieval Equipment Developed through the Data Quality Objective Process (Equipment DQO) (Bloom 1996). Special instructions regarding support to the LAW and HLW DQOs are provided by Baldwin (1999). Push mode core samples will be obtained from risers 15G and 150 to provide sufficient material for the chemical analyses and tests required to satisfy these data quality objectives. The 222-S Laboratory will extrude core samples; composite the liquids and solids; perform chemical analyses on composite and segment samples; archive half-segment samples; and provide subsamples to the Process Chemistry Laboratory. The Process Chemistry Laboratory will prepare test plans and perform process tests to evaluate the behavior of the 241-AY-101 waste undergoing the retrieval and treatment scenarios defined in the applicable DQOs. Requirements for analyses of samples originating in the process tests will be documented in the corresponding test plans and are not within the scope of this SAP.
Tank 241-AY-101 Privatization Push Mode Core Sampling and Analysis Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
TEMPLETON, A.M.
2000-05-19
This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for samples obtained from tank 241-AY-101. The purpose of this sampling event is to obtain information about the characteristics of the contents of 241-AY-101 required to satisfy "Data Quality Objectives For RPP Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For High-Level Waste Feed Batch X (HLW DQO)" (Nguyen 1999a), "Data Quality Objectives For TWRS Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For Low-Activity Waste Feed Batch X (LAW DQO)" (Nguyen 1999b), "Low Activity Waste and High-Level Waste Feed Data Quality Objectives (L&H DQO)" (Patello et al. 1999), and "Characterization Data Needs for Development, Design, and Operation of Retrieval Equipment Developed through the Data Quality Objective Process (Equipment DQO)" (Bloom 1996). Special instructions regarding support to the LAW and HLW DQOs are provided by Baldwin (1999). Push mode core samples will be obtained from risers 15G and 150 to provide sufficient material for the chemical analyses and tests required to satisfy these data quality objectives. The 222-S Laboratory will extrude core samples; composite the liquids and solids; perform chemical analyses on composite and segment samples; archive half-segment samples; and provide subsamples to the Process Chemistry Laboratory. The Process Chemistry Laboratory will prepare test plans and perform process tests to evaluate the behavior of the 241-AY-101 waste undergoing the retrieval and treatment scenarios defined in the applicable DQOs. Requirements for analyses of samples originating in the process tests will be documented in the corresponding test plans and are not within the scope of this SAP.
Diffraction data of core-shell nanoparticles from an X-ray free electron laser
Li, Xuanxuan; Chiu, Chun-Ya; Wang, Hsiang-Ju; ...
2017-04-11
X-ray free-electron lasers provide novel opportunities to conduct single particle analysis on nanoscale particles. Coherent diffractive imaging experiments were performed at the Linac Coherent Light Source (LCLS), SLAC National Laboratory, exposing single inorganic core-shell nanoparticles to femtosecond hard-X-ray pulses. Each facetted nanoparticle consisted of a crystalline gold core and a differently shaped palladium shell. Scattered intensities were observed up to about 7 nm resolution. Analysis of the scattering patterns revealed the size distribution of the samples, which is consistent with that obtained from direct real-space imaging by electron microscopy. Furthermore, scattering patterns resulting from single particles were selected and compiled into a dataset which can be valuable for algorithm developments in single particle scattering research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Budnitz, R.J.; Davis, P.R.; Ravindra, M.K.
1994-08-01
In 1989 the US Nuclear Regulatory Commission (NRC) initiated an extensive program to examine carefully the potential risks during low-power and shutdown operations. The program included two parallel projects, one at Brookhaven National Laboratory studying a pressurized water reactor (Surry Unit 1) and the other at Sandia National Laboratories studying a boiling water reactor (Grand Gulf). Both the Brookhaven and Sandia projects have examined only accidents initiated by internal plant faults, so-called "internal initiators." This project, which has explored the likelihood of seismic-initiated core damage accidents during refueling shutdown conditions, is complementary to the internal-initiator analyses at Brookhaven and Sandia. This report covers the seismic analysis at Surry Unit 1. All of the many systems modeling assumptions, component non-seismic failure rates, and human error rates that were used in the internal-initiator study at Surry have been adopted here, so that the results of the two studies can be as comparable as possible. Both the Brookhaven study and this study examine only two shutdown plant operating states (POSs) during refueling outages at Surry, called POS 6 and POS 10, which represent mid-loop operation before and after refueling, respectively. This analysis has been limited to work analogous to a level-1 seismic PRA, in which estimates have been developed for the core-damage frequency from seismic events during POSs 6 and 10. The results of the analysis are that the core-damage frequency of earthquake-initiated accidents during refueling outages in POS 6 and POS 10 is found to be low in absolute terms, less than 10⁻⁶/year.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Budnitz, R.J.; Davis, P.R.; Ravindra, M.K.
In 1989 the US Nuclear Regulatory Commission (NRC) initiated an extensive program to examine carefully the potential risks during low-power and shutdown operations. The program included two parallel projects, one at Sandia National Laboratories studying a boiling water reactor (Grand Gulf), and the other at Brookhaven National Laboratory studying a pressurized water reactor (Surry Unit 1). Both the Sandia and Brookhaven projects have examined only accidents initiated by internal plant faults, so-called "internal initiators." This project, which has explored the likelihood of seismic-initiated core damage accidents during refueling outage conditions, is complementary to the internal-initiator analyses at Brookhaven and Sandia. This report covers the seismic analysis at Grand Gulf. All of the many systems modeling assumptions, component non-seismic failure rates, and human error rates that were used in the internal-initiator study at Grand Gulf have been adopted here, so that the results of the study can be as comparable as possible. Both the Sandia study and this study examine only one shutdown plant operating state (POS) at Grand Gulf, namely POS 5, representing cold shutdown during a refueling outage. This analysis has been limited to work analogous to a level-1 seismic PRA, in which estimates have been developed for the core-damage frequency from seismic events during POS 5. The results of the analysis are that the core-damage frequency for earthquake-initiated accidents during refueling outages in POS 5 is found to be quite low in absolute terms, less than 10⁻⁷/year.
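For orientation, the level-1 seismic result quoted in the two studies above is conventionally obtained by convolving a seismic hazard curve with a fragility curve. The sketch below shows only that generic convolution; the hazard frequencies, median capacity, and lognormal uncertainty are hypothetical and are not the Surry or Grand Gulf values.

    # Generic hazard-fragility convolution for a level-1 seismic PRA (illustrative values only).
    from math import log
    from statistics import NormalDist

    def fragility(a_g: float, a_median_g: float, beta: float) -> float:
        """Lognormal conditional core-damage probability at peak ground acceleration a_g."""
        return NormalDist().cdf(log(a_g / a_median_g) / beta)

    def core_damage_frequency(accels_g, exceed_freqs_per_yr, a_median_g, beta) -> float:
        """Sum over acceleration bins of (annual frequency in bin) x (conditional failure probability)."""
        cdf = 0.0
        for i in range(len(accels_g) - 1):
            freq_in_bin = exceed_freqs_per_yr[i] - exceed_freqs_per_yr[i + 1]
            a_mid = 0.5 * (accels_g[i] + accels_g[i + 1])
            cdf += freq_in_bin * fragility(a_mid, a_median_g, beta)
        return cdf

    if __name__ == "__main__":
        accels = [0.1, 0.2, 0.4, 0.8, 1.6]              # g, hypothetical hazard curve
        freqs = [1e-3, 3e-4, 6e-5, 8e-6, 1e-6]          # per-year exceedance, hypothetical
        print(core_damage_frequency(accels, freqs, a_median_g=1.2, beta=0.45))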
Wassenaar, L I; Terzer-Wassmuth, S; Douence, C; Araguas-Araguas, L; Aggarwal, P K; Coplen, T B
2018-03-15
Water stable isotope ratios (δ²H and δ¹⁸O values) are widely used tracers in environmental studies; hence, accurate and precise assays are required for providing sound scientific information. We tested the analytical performance of 235 international laboratories conducting water isotope analyses using dual-inlet and continuous-flow isotope ratio mass spectrometers and laser spectrometers through a water isotope inter-comparison test. Eight test water samples were distributed by the IAEA to international stable isotope laboratories. These consisted of a core set of five samples spanning the common δ-range of natural waters, and three optional samples (highly depleted, enriched, and saline). The fifth core sample contained unrevealed trace methanol to assess analyst vigilance to the impact of organic contamination on water isotopic measurements made by all instrument technologies. For the core and optional samples ~73% of laboratories gave acceptable results within 0.2 ‰ and 1.5 ‰ of the reference values for δ¹⁸O and δ²H, respectively; ~27% produced unacceptable results. Top performance for δ¹⁸O values was dominated by dual-inlet IRMS laboratories; top performance for δ²H values was led by laser spectrometer laboratories. Continuous-flow instruments yielded comparatively intermediate results. Trace methanol contamination of water resulted in extreme outlier δ-values for laser instruments, but also affected reactor-based continuous-flow IRMS systems; however, dual-inlet IRMS δ-values were unaffected. Analysis of the laboratory results and their metadata suggested inaccurate or imprecise performance stemmed mainly from skill- and knowledge-based errors including: calculation mistakes, inappropriate or compromised laboratory calibration standards, poorly performing instrumentation, lack of vigilance to contamination, or inattention to unreasonable isotopic outcomes. To counteract common errors, we recommend that laboratories include 1-2 'known' control standards in all autoruns; laser laboratories should screen each autorun for spectral contamination; and all laboratories should evaluate whether derived d-excess values are realistic when both isotope ratios are measured. Combined, these data evaluation strategies should immediately inform the laboratory about fundamental mistakes or compromised samples. Copyright © 2018 John Wiley & Sons, Ltd.
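The screening checks recommended in the abstract are simple to automate. The sketch below applies the standard deuterium-excess relation and the acceptance limits quoted above (0.2 ‰ for δ¹⁸O, 1.5 ‰ for δ²H); the sample values are hypothetical.

    # Standard d-excess relation and the acceptance limits quoted in the abstract; values are hypothetical.
    def d_excess(delta2h_permil: float, delta18o_permil: float) -> float:
        """Deuterium excess: d = delta2H - 8 * delta18O (per mil)."""
        return delta2h_permil - 8.0 * delta18o_permil

    def within_acceptance(meas_d18o, ref_d18o, meas_d2h, ref_d2h) -> bool:
        """True if both deviations fall inside 0.2 per mil (d18O) and 1.5 per mil (d2H)."""
        return abs(meas_d18o - ref_d18o) <= 0.2 and abs(meas_d2h - ref_d2h) <= 1.5

    if __name__ == "__main__":
        print(d_excess(-75.2, -10.1))                           # 5.6 per mil, plausible meteoric water
        print(within_acceptance(-10.25, -10.10, -75.9, -75.2))  # True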
Wassenaar, L. I.; Terzer-Wassmuth, S.; Douence, C.; Araguas-Araguas, L.; Aggarwal, P. K.; Coplen, Tyler B.
2018-01-01
Rationale: Water stable isotope ratios (δ²H and δ¹⁸O values) are widely used tracers in environmental studies; hence, accurate and precise assays are required for providing sound scientific information. We tested the analytical performance of 235 international laboratories conducting water isotope analyses using dual-inlet and continuous-flow isotope ratio mass spectrometers and laser spectrometers through a water isotope inter-comparison test. Methods: Eight test water samples were distributed by the IAEA to international stable isotope laboratories. These consisted of a core set of five samples spanning the common δ-range of natural waters, and three optional samples (highly depleted, enriched, and saline). The fifth core sample contained unrevealed trace methanol to assess analyst vigilance to the impact of organic contamination on water isotopic measurements made by all instrument technologies. Results: For the core and optional samples ~73% of laboratories gave acceptable results within 0.2 ‰ and 1.5 ‰ of the reference values for δ¹⁸O and δ²H, respectively; ~27% produced unacceptable results. Top performance for δ¹⁸O values was dominated by dual-inlet IRMS laboratories; top performance for δ²H values was led by laser spectrometer laboratories. Continuous-flow instruments yielded comparatively intermediate results. Trace methanol contamination of water resulted in extreme outlier δ-values for laser instruments, but also affected reactor-based continuous-flow IRMS systems; however, dual-inlet IRMS δ-values were unaffected. Conclusions: Analysis of the laboratory results and their metadata suggested inaccurate or imprecise performance stemmed mainly from skill- and knowledge-based errors including: calculation mistakes, inappropriate or compromised laboratory calibration standards, poorly performing instrumentation, lack of vigilance to contamination, or inattention to unreasonable isotopic outcomes. To counteract common errors, we recommend that laboratories include 1-2 'known' control standards in all autoruns; laser laboratories should screen each autorun for spectral contamination; and all laboratories should evaluate whether derived d-excess values are realistic when both isotope ratios are measured. Combined, these data evaluation strategies should immediately inform the laboratory about fundamental mistakes or compromised samples.
Method for tracking core-contributed publications.
Loomis, Cynthia A; Curchoe, Carol Lynn
2012-12-01
Accurately tracking core-contributed publications is an important and often difficult task. Many core laboratories are supported by programmatic grants (such as Cancer Center Support Grant and Clinical Translational Science Awards) or generate data with instruments funded through S10, Major Research Instrumentation, or other granting mechanisms. Core laboratories provide their research communities with state-of-the-art instrumentation and expertise, elevating research. It is crucial to demonstrate the specific projects that have benefited from core services and expertise. We discuss here the method we developed for tracking core contributed publications.
Green, Cynthia L.; Kligfield, Paul; George, Samuel; Gussak, Ihor; Vajdic, Branislav; Sager, Philip; Krucoff, Mitchell W.
2013-01-01
Background The Cardiac Safety Research Consortium (CSRC) provides both “learning” and blinded “testing” digital ECG datasets from thorough QT (TQT) studies annotated for submission to the US Food and Drug Administration (FDA) to developers of ECG analysis technologies. This manuscript reports the first results from a blinded “testing” dataset that examines Developer re-analysis of original Sponsor-reported core laboratory data. Methods 11,925 anonymized ECGs including both moxifloxacin and placebo arms of a parallel-group TQT in 191 subjects were blindly analyzed using a novel ECG analysis algorithm applying intelligent automation. Developer measured ECG intervals were submitted to CSRC for unblinding, temporal reconstruction of the TQT exposures, and statistical comparison to core laboratory findings previously submitted to FDA by the pharmaceutical sponsor. Primary comparisons included baseline-adjusted interval measurements, baseline- and placebo-adjusted moxifloxacin QTcF changes (ddQTcF), and associated variability measures. Results Developer and Sponsor-reported baseline-adjusted data were similar with average differences less than 1 millisecond (ms) for all intervals. Both Developer and Sponsor-reported data demonstrated assay sensitivity with similar ddQTcF changes. Average within-subject standard deviation for triplicate QTcF measurements was significantly lower for Developer than Sponsor-reported data (5.4 ms and 7.2 ms, respectively; p<0.001). Conclusion The virtually automated ECG algorithm used for this analysis produced similar yet less variable TQT results compared to the Sponsor-reported study, without the use of a manual core laboratory. These findings indicate CSRC ECG datasets can be useful for evaluating novel methods and algorithms for determining QT/QTc prolongation by drugs. While the results should not constitute endorsement of specific algorithms by either CSRC or FDA, the value of a public domain digital ECG warehouse to provide prospective, blinded comparisons of ECG technologies applied for QT/QTc measurement is illustrated. PMID:22424006
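For readers outside the ECG core laboratory world, the two quantities compared above are straightforward to compute once intervals are measured. The sketch below shows the standard Fridericia correction and the baseline- and placebo-adjusted change (ddQTcF); the interval values are hypothetical and are not drawn from the CSRC dataset.

    # Standard Fridericia correction and ddQTcF bookkeeping; all numbers are hypothetical.
    def qtcf(qt_ms: float, rr_s: float) -> float:
        """Fridericia-corrected QT: QTcF = QT / RR^(1/3), QT in ms, RR in s."""
        return qt_ms / rr_s ** (1.0 / 3.0)

    def ddqtcf(drug_post: float, drug_base: float, placebo_post: float, placebo_base: float) -> float:
        """Baseline- and placebo-adjusted QTcF change, in ms."""
        return (drug_post - drug_base) - (placebo_post - placebo_base)

    if __name__ == "__main__":
        print(round(qtcf(400.0, 0.85), 1))            # ~422 ms
        print(ddqtcf(425.0, 412.0, 410.0, 408.0))     # 11.0 ms, a moxifloxacin-like effect size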
NASA Astrophysics Data System (ADS)
Ingraham, M. D.; Dewers, T. A.; Heath, J. E.
2016-12-01
Utilizing the localization conditions laid out in Rudnicki (2002), the failure of a series of tests performed on Mancos shale has been analyzed. Shale specimens were tested under constant mean stress conditions in an axisymmetric stress state, with specimens cored both parallel and perpendicular to bedding. Failure data indicate that, for the range of pressures tested, the failure surface is well represented by a Mohr-Coulomb failure surface with a friction angle of 34.4° for specimens cored parallel to bedding and 26.5° for specimens cored perpendicular to bedding. There is no evidence of a yield cap up to 200 MPa mean stress. Comparison with the theory shows that the best agreement in terms of band angles comes from assuming normality of the plastic strain increment. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
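The sketch below is not the Rudnicki (2002) localization analysis used in the study; it only evaluates the classical Mohr-Coulomb relations implied by the reported friction angles, as a point of comparison. The cohesion and normal-stress values are placeholders.

    # Classical Mohr-Coulomb relations for the two reported friction angles (illustrative only).
    from math import tan, radians

    def coulomb_strength(cohesion_mpa: float, sigma_n_mpa: float, phi_deg: float) -> float:
        """Shear strength: tau = c + sigma_n * tan(phi)."""
        return cohesion_mpa + sigma_n_mpa * tan(radians(phi_deg))

    def coulomb_band_angle_deg(phi_deg: float) -> float:
        """Angle between the predicted shear plane and the sigma_1 direction: 45 - phi/2."""
        return 45.0 - phi_deg / 2.0

    if __name__ == "__main__":
        for phi in (34.4, 26.5):                      # parallel / perpendicular to bedding
            print(phi, round(coulomb_band_angle_deg(phi), 1), round(coulomb_strength(10.0, 50.0, phi), 1))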
Pacific Northwest National Laboratory institutional plan: FY 1996--2001
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-01-01
This report contains the operation and direction plan for the Pacific Northwest National Laboratory of the US Department of Energy. The topics of the plan include the laboratory mission and core competencies, the laboratory strategic plan; the laboratory initiatives in molecular sciences, microbial biotechnology, global environmental change, complex modeling of physical systems, advanced processing technology, energy technology development, and medical technologies and systems; core business areas, critical success factors, and resource projections.
Flow Cytometry Scientist | Center for Cancer Research
PROGRAM DESCRIPTION
The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR).
KEY ROLES/RESPONSIBILITIES
The Flow Cytometry Core (Flow Core) in the Cancer and Inflammation Program (CIP) is a service core which supports the research efforts of the CCR by providing expertise in the field of flow cytometry (using analyzers and sorters) with the goal of gaining a more thorough understanding of the biology of the immune system, cancer, and inflammation processes. The Flow Core provides service to 12-15 CIP laboratories and more than 22 non-CIP laboratories. Flow Core staff provide technical advice on the experimental design of applications, which include immunological phenotyping, cell function assays, and cell cycle analysis. Work is performed per customer requirements, and no independent research is involved. The Flow Cytometry Scientist will be responsible for:
- Daily management of the Flow Cytometry Core, including the supervision and guidance of technical staff members
- Monitoring performance of and maintaining high-dimensional flow cytometer analyzers and cell sorters
- Operating high-dimensional flow cytometer analyzers and cell sorters
- Providing scientific expertise to the user community and facilitating the development of cutting-edge technologies
- Interacting with Flow Core users and customers, and providing technical and scientific advice and guidance regarding their experiments, including possible collaborations
- Training staff and scientific end users on the use of flow cytometry in their research, as well as teaching them how to operate and troubleshoot the bench-top analyzer instruments
- Preparing and delivering lectures, as well as one-on-one training sessions, with customers/users
- Ensuring that protocols are up to date and appropriately adhered to
- Experience with sterile technique and tissue culture
Experimenting with Mathematical Biology
ERIC Educational Resources Information Center
Sanft, Rebecca; Walter, Anne
2016-01-01
St. Olaf College recently added a Mathematical Biology concentration to its curriculum. The core course, Mathematics of Biology, was redesigned to include a wet laboratory. The lab classes required students to collect data and implement the essential modeling techniques of formulation, implementation, validation, and analysis. The four labs…
Results and analysis of saltstone cores taken from saltstone disposal unit cell 2A
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reigel, M. M.; Hill, K. A.
2016-03-01
As part of an ongoing Performance Assessment (PA) Maintenance Plan, Savannah River Remediation (SRR) has developed a sampling and analysis strategy to facilitate the comparison of field-emplaced samples (i.e., saltstone placed and cured in a Saltstone Disposal Unit (SDU)) with samples prepared and cured in the laboratory. The primary objectives of the Sampling and Analyses Plan (SAP) are: (1) to demonstrate a correlation between the measured properties of laboratory-prepared, simulant samples (termed Sample Set 3) and the field-emplaced saltstone samples (termed Sample Set 9), and (2) to validate property values assumed for the Saltstone Disposal Facility (SDF) PA modeling. The analysis and property data for Sample Set 9 (i.e., six core samples extracted from SDU Cell 2A (SDU2A)) are documented in this report, and where applicable, the results are compared to the results for Sample Set 3. Relevant properties to demonstrate the aforementioned objectives include bulk density, porosity, saturated hydraulic conductivity (SHC), and radionuclide leaching behavior.
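Two of the properties listed above are linked by a standard relation that is useful when cross-checking laboratory and field-emplaced samples: total porosity follows from dry bulk density and particle density. The sketch below uses placeholder values, not the SDU2A results.

    # Standard porosity relation; the densities below are placeholders, not measured saltstone data.
    def total_porosity(dry_bulk_density_g_cm3: float, particle_density_g_cm3: float) -> float:
        """Porosity = 1 - rho_bulk / rho_particle."""
        return 1.0 - dry_bulk_density_g_cm3 / particle_density_g_cm3

    if __name__ == "__main__":
        print(round(total_porosity(1.35, 2.30), 3))   # hypothetical values -> ~0.413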
Real-time oil-saturation monitoring in rock cores with low-field NMR.
Mitchell, J; Howe, A M; Clarke, A
2015-07-01
Nuclear magnetic resonance (NMR) provides a powerful suite of tools for studying oil in reservoir core plugs at the laboratory scale. Low-field magnets are preferred for well-log calibration and to minimize magnetic-susceptibility-induced internal gradients in the porous medium. We demonstrate that careful data processing, combined with prior knowledge of the sample properties, enables real-time acquisition and interpretation of saturation state (relative amount of oil and water in the pores of a rock). Robust discrimination of oil and brine is achieved with diffusion weighting. We use this real-time analysis to monitor the forced displacement of oil from porous materials (sintered glass beads and sandstones) and to generate capillary desaturation curves. The real-time output enables in situ modification of the flood protocol and accurate control of the saturation state prior to the acquisition of standard NMR core analysis data, such as diffusion-relaxation correlations. Although applications to oil recovery and core analysis are demonstrated, the implementation highlights the general practicality of low-field NMR as an inline sensor for real-time industrial process control. Copyright © 2015 Elsevier Inc. All rights reserved.
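Once diffusion weighting has separated the oil and brine signals, the saturation bookkeeping itself is simple. The sketch below shows only that final step (amplitudes scaled by hydrogen index); it is not the authors' acquisition or inversion code, and the amplitudes are hypothetical.

    # Saturation from separated oil and water signal amplitudes; not the authors' processing code.
    def oil_saturation(amp_oil: float, amp_water: float, hi_oil: float = 1.0, hi_water: float = 1.0) -> float:
        """So = (A_oil/HI_oil) / (A_oil/HI_oil + A_water/HI_water)."""
        v_oil = amp_oil / hi_oil
        v_water = amp_water / hi_water
        return v_oil / (v_oil + v_water)

    if __name__ == "__main__":
        print(round(oil_saturation(0.32, 0.68), 2))   # hypothetical normalized amplitudes -> 0.32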
An optical method for characterizing carbon content in ceramic pot filters.
Goodwin, J Y; Elmore, A C; Salvinelli, C; Reidmeyer, Mary R
2017-08-01
Ceramic pot filter (CPF) technology is a relatively common means of household water treatment in developing areas, and performance characteristics of CPFs have been characterized using production CPFs, experimental CPFs fabricated in research laboratories, and ceramic disks intended as CPF surrogates. There is evidence that CPF manufacturers do not always fire their products according to best practices, and the result is incomplete combustion of the pore-forming material and the creation of a carbon core in the final CPFs. Researchers seldom acknowledge the potential existence of carbon cores, and at least one CPF producer has postulated that the carbon may be beneficial in terms of final water quality because of the presence of activated carbon in consumer filters marketed in the Western world. An initial step in assessing the presence and impact of carbon cores is characterizing those cores. An optical method, which may be more viable for producers than off-site laboratory analysis of carbon content, has been developed and verified. The use of the optical method is demonstrated via preliminary disinfection and flow-rate studies, and the results of these studies indicate that the method may be of use in studying production kiln operation.
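The abstract does not give the details of the optical method, so the sketch below is only a generic illustration of one way an image-based carbon estimate could work: thresholding a grayscale photograph of a cut filter cross-section and reporting the dark-area fraction. It assumes the Pillow imaging library is available; the file name and threshold are hypothetical.

    # Generic grayscale-threshold sketch (not the published method); assumes Pillow is installed.
    from PIL import Image

    def dark_area_fraction(image_path: str, threshold: int = 80) -> float:
        """Fraction of pixels darker than `threshold` (0-255) in the grayscale image."""
        pixels = list(Image.open(image_path).convert("L").getdata())
        return sum(1 for p in pixels if p < threshold) / len(pixels)

    if __name__ == "__main__":
        print(dark_area_fraction("cpf_cross_section.jpg"))   # hypothetical photo of a cut CPF core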
Hunter, Adam; Dayalan, Saravanan; De Souza, David; Power, Brad; Lorrimar, Rodney; Szabo, Tamas; Nguyen, Thu; O'Callaghan, Sean; Hack, Jeremy; Pyke, James; Nahid, Amsha; Barrero, Roberto; Roessner, Ute; Likic, Vladimir; Tull, Dedreia; Bacic, Antony; McConville, Malcolm; Bellgard, Matthew
2017-01-01
An increasing number of research laboratories and core analytical facilities around the world are developing high-throughput metabolomic analytical and data processing pipelines that are capable of handling hundreds to thousands of individual samples per year, often over multiple projects, collaborations, and sample types. At present, there are no Laboratory Information Management Systems (LIMS) specifically tailored for metabolomics laboratories that are capable of tracking samples and associated metadata from the beginning to the end of an experiment, including data processing and archiving, and which are also suitable for use in large institutional core facilities or multi-laboratory consortia as well as single laboratory environments. Here we present MASTR-MS, a downloadable and installable LIMS solution that can be deployed either within a single laboratory or used to link workflows across a multisite network. It comprises a Node Management System that can be used to link and manage projects across one or multiple collaborating laboratories; a User Management System which defines different user groups and privileges of users; a Quote Management System where client quotes are managed; a Project Management System in which metadata is stored and all aspects of project management, including experimental setup, sample tracking and instrument analysis, are defined; and a Data Management System that allows the automatic capture and storage of raw and processed data from the analytical instruments to the LIMS. MASTR-MS is a comprehensive LIMS solution specifically designed for metabolomics. It captures the entire lifecycle of a sample starting from project and experiment design to sample analysis, data capture, and storage. It acts as an electronic notebook, facilitating project management within a single laboratory or a multi-node collaborative environment. This software is being developed in close consultation with members of the metabolomics research community. It is freely available under the GNU GPL v3 licence and can be accessed from https://muccg.github.io/mastr-ms/.
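As a rough picture of the sample lifecycle MASTR-MS tracks (project to experiment to sample to instrument run to data file), the sketch below defines a minimal data model. The class and field names are illustrative assumptions, not the actual MASTR-MS schema.

    # Minimal illustrative data model for metabolomics sample tracking; not the MASTR-MS schema.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class InstrumentRun:
        instrument: str
        raw_file: str
        status: str = "queued"            # queued -> acquired -> processed -> archived

    @dataclass
    class Sample:
        sample_id: str
        sample_type: str
        runs: List[InstrumentRun] = field(default_factory=list)

    @dataclass
    class Project:
        name: str
        node: str                          # owning laboratory in a multi-site deployment
        samples: List[Sample] = field(default_factory=list)

    if __name__ == "__main__":
        project = Project("plasma-pilot", node="lab-A")
        sample = Sample("S001", "plasma")
        sample.runs.append(InstrumentRun("GC-MS", "S001_001.raw"))
        project.samples.append(sample)
        print(project)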
Altering Reservoir Wettability to Improve Production from Single Wells
DOE Office of Scientific and Technical Information (OSTI.GOV)
W. W. Weiss
2006-09-30
Many carbonate reservoirs are naturally fractured and typically produce less than 10% original oil in place during primary recovery. Spontaneous imbibition has proven an important mechanism for oil recovery from fractured reservoirs, which are usually weak waterflood candidates. In some situations, chemical stimulation can promote imbibition of water to alter the reservoir wettability toward water-wetness such that oil is produced at an economic rate from the rock matrix into fractures. In this project, cores and fluids from five reservoirs were used in laboratory tests: the San Andres formation (Fuhrman Masho and Eagle Creek fields) in the Permian Basin of Texas and New Mexico; and the Interlake, Stony Mountain, and Red River formations from the Cedar Creek Anticline in Montana and South Dakota. Solutions of nonionic, anionic, and amphoteric surfactants with formation water were used to promote water-wetness. Some Fuhrman Masho cores soaked in surfactant solution had improved oil recovery up to 38%. Most Eagle Creek cores did not respond to any of the tested surfactants. Some Cedar Creek Anticline cores had good response to two anionic surfactants (CD 128 and A246L). The results indicate that cores with higher permeability responded better to the surfactants. The increased recovery is mainly ascribed to increased water-wetness. It is suspected that rock mineralogy is also an important factor. The laboratory work generated three field tests of the surfactant soak process in the West Fuhrman Masho San Andres Unit. The flawlessly designed tests included mechanical well clean out, installation of new pumps, and daily well tests before and after the treatments. Treatments were designed using artificial intelligence (AI) correlations developed from 23 previous surfactant soak treatments. The treatments were conducted during the last quarter of 2006. One of the wells produced a marginal volume of incremental oil through October. It is interesting to note that the field tests were conducted in an area of the field that has not met production expectations. The dataset on the 23 Phosphoria well surfactant soaks was updated. An analysis of the oil decline curves indicated that 4.5 lb of chemical produced a barrel of incremental oil. The AI analysis supports the adage 'good wells are the best candidates.' The generally better performance of surfactant in the high permeability core laboratory tests supports this observation. AI correlations were developed to predict the response to water-frac stimulations in a tight San Andres reservoir. The correlations may be useful in the design of Cedar Creek Anticline surfactant soak treatments planned for next year. Nuclear magnetic resonance scans of dolomite cores were acquired to measure porosity and saturation during the high-temperature laboratory work. The scans could not be correlated with physical measurement using either conventional or AI methods.
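The chemical-efficiency figure quoted above (4.5 lb of chemical per barrel of incremental oil) comes from simple decline-curve bookkeeping: incremental oil is actual production minus the extrapolated baseline decline, and the injected chemical mass is divided by that increment. The sketch below shows the arithmetic with hypothetical numbers.

    # Decline-curve bookkeeping for the lb-per-incremental-barrel figure; all values are hypothetical.
    def incremental_oil_bbl(actual_bbl: float, baseline_decline_bbl: float) -> float:
        return max(actual_bbl - baseline_decline_bbl, 0.0)

    def lb_per_incremental_bbl(chemical_lb: float, incremental_bbl: float) -> float:
        return chemical_lb / incremental_bbl

    if __name__ == "__main__":
        inc = incremental_oil_bbl(actual_bbl=5200.0, baseline_decline_bbl=4300.0)
        print(lb_per_incremental_bbl(4050.0, inc))    # 4.5 lb/bbl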
The Alzheimer's Disease Neuroimaging Initiative 2 PET Core: 2015.
Jagust, William J; Landau, Susan M; Koeppe, Robert A; Reiman, Eric M; Chen, Kewei; Mathis, Chester A; Price, Julie C; Foster, Norman L; Wang, Angela Y
2015-07-01
This article reviews the work done in the Alzheimer's Disease Neuroimaging Initiative positron emission tomography (ADNI PET) core over the past 5 years, largely concerning techniques, methods, and results related to amyloid imaging in ADNI. The PET Core has used [(18)F]florbetapir routinely on ADNI participants, with over 1600 scans available for download. Four different laboratories are involved in data analysis, and have examined factors such as longitudinal florbetapir analysis, use of [(18)F]fluorodeoxyglucose (FDG)-PET in clinical trials, and relationships between different biomarkers and cognition. Converging evidence from the PET Core has indicated that cross-sectional and longitudinal florbetapir analyses require different reference regions. Studies have also examined the relationship between florbetapir data obtained immediately after injection, which reflects perfusion, and FDG-PET results. Finally, standardization has included the translation of florbetapir PET data to a centiloid scale. The PET Core has demonstrated a variety of methods for the standardization of biomarkers such as florbetapir PET in a multicenter setting. Copyright © 2015 The Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
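The centiloid translation mentioned above is, in its generic form, a linear rescaling of a tracer-specific SUVR so that a young-control mean maps to 0 and a typical-AD mean maps to 100. The sketch below shows that generic transform; the anchor values are placeholders, not the ADNI florbetapir calibration.

    # Generic Centiloid linear transform; anchor SUVR values are placeholders, not ADNI's calibration.
    def to_centiloid(suvr: float, suvr_young_mean: float, suvr_ad_mean: float) -> float:
        return 100.0 * (suvr - suvr_young_mean) / (suvr_ad_mean - suvr_young_mean)

    if __name__ == "__main__":
        print(round(to_centiloid(1.25, suvr_young_mean=1.05, suvr_ad_mean=1.45), 1))   # 50.0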
Integration of Video-Based Demonstrations to Prepare Students for the Organic Chemistry Laboratory
NASA Astrophysics Data System (ADS)
Nadelson, Louis S.; Scaggs, Jonathan; Sheffield, Colin; McDougal, Owen M.
2015-08-01
Consistent, high-quality introductions to organic chemistry laboratory techniques effectively and efficiently support student learning in the organic chemistry laboratory. In this work, we developed and deployed a series of instructional videos to communicate core laboratory techniques and concepts. Using a quasi-experimental design, we tested the videos in five traditional laboratory experiments by integrating them with the standard pre-laboratory student preparation presentations and instructor demonstrations. We assessed the influence of the videos on student laboratory knowledge and performance, using sections of students who did not view the videos as the control. Our analysis of pre-quizzes revealed the control group had equivalent scores to the treatment group, while the post-quiz results show consistently greater learning gains for the treatment group. Additionally, the students who watched the videos as part of their pre-laboratory instruction completed their experiments in less time.
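The abstract reports greater learning gains for the video group without naming a specific metric; one common choice for comparing pre/post quiz improvement across groups is the normalized gain, sketched below with hypothetical scores.

    # Normalized learning gain g = (post - pre) / (100 - pre); scores below are hypothetical percentages.
    from statistics import mean

    def normalized_gain(pre_pct: float, post_pct: float) -> float:
        return (post_pct - pre_pct) / (100.0 - pre_pct)

    if __name__ == "__main__":
        treatment = [normalized_gain(55, 85), normalized_gain(60, 82)]
        control = [normalized_gain(54, 70), normalized_gain(61, 72)]
        print(round(mean(treatment), 2), round(mean(control), 2))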
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khan, E.U.; George, T.L.; Rector, D.R.
The natural circulation tests of the Fast Flux Test Facility (FFTF) demonstrated a safe and stable transition from forced convection to natural convection and showed that natural convection may adequately remove decay heat from the reactor core. The COBRA-WC computer code was developed by the Pacific Northwest Laboratory (PNL) to account for buoyancy-induced coolant flow redistribution and interassembly heat transfer, effects that become important in mitigating temperature gradients and reducing reactor core temperatures when coolant flow rate in the core is low. This report presents work sponsored by the US Department of Energy (DOE) with the objective of checking the validity of COBRA-WC during the first 220 seconds (sec) of the FFTF natural-circulation (plant-startup) tests using recorded data from two instrumented Fuel Open Test Assemblies (FOTAs). Comparison of COBRA-WC predictions with the FOTA data is part of the final confirmation of the COBRA-WC methodology for core natural-convection analysis.
Green, Cynthia L; Kligfield, Paul; George, Samuel; Gussak, Ihor; Vajdic, Branislav; Sager, Philip; Krucoff, Mitchell W
2012-03-01
The Cardiac Safety Research Consortium (CSRC) provides both "learning" and blinded "testing" digital electrocardiographic (ECG) data sets from thorough QT (TQT) studies annotated for submission to the US Food and Drug Administration (FDA) to developers of ECG analysis technologies. This article reports the first results from a blinded testing data set that examines developer reanalysis of original sponsor-reported core laboratory data. A total of 11,925 anonymized ECGs including both moxifloxacin and placebo arms of a parallel-group TQT in 181 subjects were blindly analyzed using a novel ECG analysis algorithm applying intelligent automation. Developer-measured ECG intervals were submitted to CSRC for unblinding, temporal reconstruction of the TQT exposures, and statistical comparison to core laboratory findings previously submitted to FDA by the pharmaceutical sponsor. Primary comparisons included baseline-adjusted interval measurements, baseline- and placebo-adjusted moxifloxacin QTcF changes (ddQTcF), and associated variability measures. Developer and sponsor-reported baseline-adjusted data were similar with average differences <1 ms for all intervals. Both developer- and sponsor-reported data demonstrated assay sensitivity with similar ddQTcF changes. Average within-subject SD for triplicate QTcF measurements was significantly lower for developer- than sponsor-reported data (5.4 and 7.2 ms, respectively; P < .001). The virtually automated ECG algorithm used for this analysis produced similar yet less variable TQT results compared with the sponsor-reported study, without the use of a manual core laboratory. These findings indicate that CSRC ECG data sets can be useful for evaluating novel methods and algorithms for determining drug-induced QT/QTc prolongation. Although the results should not constitute endorsement of specific algorithms by either CSRC or FDA, the value of a public domain digital ECG warehouse to provide prospective, blinded comparisons of ECG technologies applied for QT/QTc measurement is illustrated. Copyright © 2012 Mosby, Inc. All rights reserved.
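The variability metric compared in this version of the report, the average within-subject standard deviation of triplicate QTcF readings, is sketched below with hypothetical triplicates.

    # Average within-subject SD of triplicate QTcF readings; the triplicates are hypothetical.
    from statistics import mean, stdev

    def mean_within_subject_sd(triplicates_by_subject) -> float:
        """triplicates_by_subject: list of [QTcF1, QTcF2, QTcF3] per subject, in ms."""
        return mean(stdev(t) for t in triplicates_by_subject)

    if __name__ == "__main__":
        subjects = [[418.0, 423.5, 420.1], [401.2, 399.8, 405.6], [433.0, 429.4, 436.2]]
        print(round(mean_within_subject_sd(subjects), 1))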
Group B Strep Infection in Newborns
Perkins, Kimberlie; Johnson, Brittany D.; Mirus, Benjamin B.
2014-01-01
During 2013–14, the USGS, in cooperation with the U.S. Department of Energy, focused on further characterization of the sedimentary interbeds below the future site of the proposed Remote Handled Low-Level Waste (RHLLW) facility, which is intended for the long-term storage of low-level radioactive waste. Twelve core samples from the sedimentary interbeds from a borehole near the proposed facility were collected for laboratory analysis of hydraulic properties, which also allowed further testing of the property-transfer modeling approach. For each core sample, the steady-state centrifuge method was used to measure relations between matric potential, saturation, and conductivity. These laboratory measurements were compared to water-retention and unsaturated hydraulic conductivity parameters estimated using the established property-transfer models. For each core sample obtained, the agreement between measured and estimated hydraulic parameters was evaluated quantitatively using the Pearson correlation coefficient (r). The highest correlation is for saturated hydraulic conductivity (Ksat) with an r value of 0.922. The saturated water content (θsat) also exhibits a strong linear correlation with an r value of 0.892. The curve shape parameter (λ) has an r value of 0.731, whereas the curve scaling parameter (ψo) has the lowest r value of 0.528. The r values demonstrate that model predictions correspond well to the laboratory measured properties for most parameters, which supports the value of extending this approach for quantifying unsaturated hydraulic properties at various sites throughout INL.
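The comparison of measured and model-estimated parameters described above reduces to a Pearson correlation on paired values. A minimal sketch follows, using hypothetical paired log-Ksat values for twelve cores; the numbers are illustrative only and do not reproduce the study's data.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical paired values for one parameter (e.g., log10 Ksat in cm/s) from twelve
# cores: laboratory (steady-state centrifuge) versus property-transfer model estimate.
measured  = np.array([-4.1, -3.8, -5.0, -4.6, -3.9, -4.4, -5.2, -4.0, -4.8, -4.3, -3.7, -4.9])
estimated = np.array([-4.0, -3.9, -4.8, -4.5, -4.1, -4.3, -5.1, -4.2, -4.6, -4.4, -3.8, -4.7])

r, p_value = pearsonr(measured, estimated)
print(f"Pearson r = {r:.3f} (p = {p_value:.3g})")
```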
Oak Ridge National Laboratory Core Competencies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roberto, J.B.; Anderson, T.D.; Berven, B.A.
1994-12-01
A core competency is a distinguishing integration of capabilities which enables an organization to deliver mission results. Core competencies represent the collective learning of an organization and provide the capacity to perform present and future missions. Core competencies are distinguishing characteristics which offer comparative advantage and are difficult to reproduce. They exhibit customer focus, mission relevance, and vertical integration from research through applications. They are demonstrable by metrics such as level of investment, uniqueness of facilities and expertise, and national impact. The Oak Ridge National Laboratory (ORNL) has identified four core competencies which satisfy the above criteria. Each core competency represents an annual investment of at least $100M and is characterized by an integration of Laboratory technical foundations in physical, chemical, and materials sciences; biological, environmental, and social sciences; engineering sciences; and computational sciences and informatics. The ability to integrate broad technical foundations to develop and sustain core competencies in support of national R&D goals is a distinguishing strength of the national laboratories. The ORNL core competencies are: Energy Production and End-Use Technologies; Biological and Environmental Sciences and Technology; Advanced Materials Synthesis, Processing, and Characterization; and Neutron-Based Science and Technology. The distinguishing characteristics of each ORNL core competency are described. In addition, written material is provided for two emerging competencies: Manufacturing Technologies and Computational Science and Advanced Computing. Distinguishing institutional competencies in the Development and Operation of National Research Facilities, R&D Integration and Partnerships, Technology Transfer, and Science Education are also described. Finally, financial data for the ORNL core competencies are summarized in the appendices.
NASA Astrophysics Data System (ADS)
Judge, S. A.; Wilson, T. J.
2005-12-01
The International Polar Year (IPY) provides an excellent opportunity for highlighting polar research in education. The ultimate goal of our outreach and education program is to develop a series of modules that are focused on societally-relevant topics being investigated in Antarctic earth science, while teaching basic geologic concepts that are standard elements of school curricula. For example, we envision a university-level, undergraduate, introductory earth science class with the entire semester/quarter laboratory program focused on polar earth science research during the period of the International Polar Year. To attain this goal, a series of modules will be developed, including inquiry-based exercises founded on imagery (video, digital photos, digital core scans), GIS data layers, maps, and data sets available from OSU research groups. Modules that highlight polar research are also suitable for the K-12 audience. Scaleable/grade appropriate modules that use some of the same data sets as the undergraduate modules can be outlined for elementary through high school earth science classes. An initial module is being developed that focuses on paleoclimate data. The module provides a hands-on investigation of the climate history archived in both ice cores and sedimentary rock cores in order to understand time scales, drivers, and processes of global climate change. The paleoclimate module also demonstrates the types of polar research that are ongoing at OSU, allowing students to observe what research the faculty are undertaking in their respective fields. This will link faculty research with student education in the classroom, enhancing learning outcomes. Finally, this module will provide a direct link to U.S. Antarctic Program research related to the International Polar Year, when new ice and sedimentary rock cores will be obtained and analyzed. As a result of this laboratory exercise, the students will be able to: (1) Define an ice core and a sedimentary rock core. (Knowledge) (2) Identify climate indicators in each type of core by using digital core images. These include layers of particulate material (such as volcanic tephra) in ice cores and layers of larger grains (such as ice-rafted debris) in sedimentary rock cores. (Knowledge) (3) Describe how cores are taken in extreme environments, such as Antarctica. (Comprehension) (4) Use actual data from proxies in the ice and sedimentary records to graph changes through time in the cores. (Application) (5) Recognize variances in data sets that might illustrate periods of climate change. (Analysis) (6) Integrate data results from several proxies in order to construct a climate record for both ice cores and sedimentary rock cores. (Synthesis) (7) Interpret both the ice core and sedimentary rock core records to ascertain the effectiveness of both of these tools in archiving climate records. (Evaluation)
This testing is available only after mandatory prior consultation with Dr. Jeff Lifson (lifsonj@mail.nih.gov) of the Quantitative Molecular Diagnostics Core (QMDC). This assay is performed by the QMDC for measuring levels of cell- or t
Field testing a mobile inelastic neutron scattering system to measure soil carbon
USDA-ARS?s Scientific Manuscript database
Cropping history in conjunction with soil management practices can have a major impact on the amount of organic carbon (C) stored in soil. Current methods of assessing soil C based on soil coring and subsequent processing procedures prior to laboratory analysis are labor intensive and time consuming...
The Tissue Analysis Core (TAC) within the AIDS and Cancer Virus Program will process, embed, and perform microtomy on fixed tissue samples presented in ethanol. Collagen I, Collagen III, or Fibronectin immunohistochemistry will be performed, in order
Technology Development and Deployment | Energy Analysis | NREL
Role of CT scanning in formation evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bergosh, J.L.; Dibona, B.G.
1988-01-01
The use of the computerized tomographic (CT) scanner in formation evaluation of difficult-to-analyze core samples has moved from the research and development phase to daily, routine use in the core-analysis laboratory. The role of the CT scanner has become increasingly important as geologists try to obtain more representative core material for accurate formation evaluation. The most common problem facing the core analyst when preparing to measure petrophysical properties is the selection of representative and unaltered core samples for routine and special core testing. Recent data have shown that heterogeneous reservoir rock can be very difficult, if not impossible, to assess correctly when using standard core examination procedures, because many features, such as fractures, are not visible on the core surface. Another problem is the invasion of drilling mud into the core sample. Flushing formation oil and water from the core can greatly alter the saturation and distribution of fluids and lead to serious formation evaluation problems. Because the quality and usefulness of the core data are directly tied to proper sample selection, it has become imperative that the CT scanner be used whenever possible.
Steigen, Terje K; Claudio, Cheryl; Abbott, David; Schulzer, Michael; Burton, Jeff; Tymchak, Wayne; Buller, Christopher E; John Mancini, G B
2008-06-01
To assess reproducibility of core laboratory performance and impact on sample size calculations. Little information exists about overall reproducibility of core laboratories in contradistinction to performance of individual technicians. Also, qualitative parameters are being adjudicated increasingly as either primary or secondary end-points. The comparative impact of using diverse indexes on sample sizes has not been previously reported. We compared initial and repeat assessments of five quantitative parameters [e.g., minimum lumen diameter (MLD), ejection fraction (EF), etc.] and six qualitative parameters [e.g., TIMI myocardial perfusion grade (TMPG) or thrombus grade (TTG), etc.], as performed by differing technicians and separated by a year or more. Sample sizes were calculated from these results. TMPG and TTG were also adjudicated by a second core laboratory. MLD and EF were the most reproducible, yielding the smallest sample size calculations, whereas percent diameter stenosis and centerline wall motion require substantially larger trials. Of the qualitative parameters, all except TIMI flow grade gave reproducibility characteristics yielding sample sizes of many 100's of patients. Reproducibility of TMPG and TTG was only moderately good both within and between core laboratories, underscoring an intrinsic difficulty in assessing these. Core laboratories can be shown to provide reproducibility performance that is comparable to performance commonly ascribed to individual technicians. The differences in reproducibility yield huge differences in sample size when comparing quantitative and qualitative parameters. TMPG and TTG are intrinsically difficult to assess and conclusions based on these parameters should arise only from very large trials.
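The link the abstract draws between measurement reproducibility and trial size can be illustrated with the standard two-group sample-size approximation n = 2((z_{1-α/2} + z_{1-β})·σ/δ)². The sketch below uses hypothetical SDs and a hypothetical detectable difference; it is not the authors' calculation.

```python
from scipy.stats import norm

def n_per_group(sd: float, delta: float, alpha: float = 0.05, power: float = 0.80) -> float:
    """Approximate per-group sample size for a two-sample comparison of means."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    return 2 * ((z_a + z_b) * sd / delta) ** 2

# Hypothetical measurement SDs illustrating the abstract's point: a highly reproducible
# index versus a noisier one needs far fewer patients to detect the same 0.2-unit difference.
for label, sd in [("reproducible index", 0.35), ("noisy index", 0.90)]:
    print(f"{label}: ~{n_per_group(sd, 0.2):.0f} patients per group")
```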
Trace-element analyses of core samples from the 1967-1988 drillings of Kilauea Iki lava lake, Hawaii
Helz, Rosalind Tuthill
2012-01-01
This report presents previously unpublished analyses of trace elements in drill core samples from Kilauea Iki lava lake and from the 1959 eruption that fed the lava lake. The two types of data presented were obtained by instrumental neutron-activation analysis (INAA) and energy-dispersive X-ray fluorescence analysis (EDXRF). The analyses were performed in U.S. Geological Survey (USGS) laboratories from 1989 to 1994. This report contains 93 INAA analyses on 84 samples and 68 EDXRF analyses on 68 samples. The purpose of the study was to document trace-element variation during chemical differentiation, especially during the closed-system differentiation of Kilauea Iki lava lake.
NASA Astrophysics Data System (ADS)
Yoneda, J.; Masui, A.; Konno, Y.; Jin, Y.; Kida, M.; Suzuki, K.; Nakatsuka, Y.; Tenma, N.; Nagao, J.
2014-12-01
Natural gas hydrate-bearing pressure core sediments have been sheared in compression using a newly developed Transparent Acrylic Cell Triaxial Testing (TACTT) system to investigate the geophysical and geomechanical behavior of sediments recovered from the deep seabed in the Eastern Nankai Trough, the first Japanese offshore production test region. The sediments were recovered by hybrid pressure core system (hybrid PCS) and pressure cores were cut by pressure core analysis tools (PCATs) on board. These pressure cores were transferred to the AIST Hokkaido centre and trimmed by pressure core non-destructive analysis tools (PNATs) for the TACTT system, which maintained the pressure and temperature conditions within the hydrate stability boundary through the entire process of core handling, from drilling to the end of laboratory testing. An image processing technique was used to capture the motion of sediment in a transparent acrylic cell, and digital photographs were obtained at every 0.1% of vertical strain during the test. Analysis of the optical images showed that sediments with 63% hydrate saturation exhibited brittle failure, whereas nonhydrate-bearing sediments exhibited ductile failure. In addition, the increase in shear strength with increasing hydrate saturation in the natural gas hydrate-bearing sediments is in agreement with previous data from synthetic gas hydrate. This research was financially supported by the Research Consortium for Methane Hydrate Resources in Japan (MH21 Research Consortium) that carries out Japan's Methane Hydrate R&D Program by the Ministry of Economy, Trade and Industry (METI).
Neutronics Analyses of the Minimum Original HEU TREAT Core
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kontogeorgakos, D.; Connaway, H.; Yesilyurt, G.
2014-04-01
This work was performed to support the feasibility study on the potential conversion of the Transient Reactor Test Facility (TREAT) at Idaho National Laboratory from the use of high-enriched uranium (HEU) fuel to the use of low-enriched uranium (LEU) fuel. The analyses were performed by the GTRI Reactor Conversion staff at the Argonne National Laboratory (ANL). The objective of this study was to validate the MCNP model of the TREAT reactor with the well-documented measurements which were taken during the start-up and early operation of TREAT. Furthermore, the effect of carbon graphitization was also addressed. The graphitization level was assumed to be 100% (ANL/GTRI/TM-13/4). For this purpose, a set of experiments was chosen to validate the TREAT MCNP model, involving the approach to criticality procedure, in-core neutron flux measurements with foils, and isothermal temperature coefficient and temperature distribution measurements. The results of this study extended the knowledge base for the TREAT MCNP calculations and established the credibility of the MCNP model to be used in the core conversion feasibility analysis.
Metallurgical failure analysis of MH-1A reactor core hold-down bolts. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hawthorne, J.R.; Watson, H.E.
1976-11-01
The Naval Research Laboratory has performed a failure analysis on two MH-1A reactor core hold-down bolts that broke in service. Adherence to fabrication specifications, post-service properties and possible causes of bolt failure were investigated. The bolt material was verified as 17-4PH precipitation hardening stainless steel. Measured bolt dimensions also were in accordance with fabrication drawing specifications. Bolt failure occurred in the region of a locking pin hole which reduced the bolt net section by 47 percent. The failure analysis indicates that the probable cause of failure was net section overloading resulting from a lateral bending force on the bolt. The analysis indicates that net section overloading could also have resulted from combined tensile stresses (bolt preloading plus differential thermal expansion). Recommendations are made for improved bolting.
Data for ground-water test hole near Zamora, Central Valley Aquifer Project, California
French, J.J.; Page, R.W.; Bertoldi, G.L.
1982-01-01
Preliminary data are presented for the first of seven test holes drilled as a part of the Central Valley Aquifer Project, which is part of the National Regional Aquifer Systems Analysis Program. The test hole was drilled in the SW 1/4 SE 1/4 sec. 34, T. 12 N., R. 1 E., Yolo County, California, about 3 miles northeast of the town of Zamora. Drilled to a depth of 2,500 feet below land surface, the hole is cased to a depth of 190 feet and equipped with three piezometer tubes to depths of 947, 1,401, and 2,125 feet. A 5-foot well screen is at the bottom of each piezometer. Eighteen cores and 68 sidewall cores were recovered. Laboratory tests were made for mineralogy, hydraulic conductivity, porosity, consolidation, grain-size distribution, Atterberg limits, X-ray diffraction, diatom identification, thermal conductivity, and chemical analysis of water. Geophysical and thermal gradient logs were made. The hole is sampled periodically for chemical analysis and measured for water level in the three tapped zones. This report presents methods used to obtain field samples, laboratory procedures, and the data obtained. (USGS)
NASA Astrophysics Data System (ADS)
Hadi Mosleh, M.; Turner, M.; Sedighi, M.; Vardon, P. J.
2017-01-01
This paper presents the design, development, and application of a laboratory setup for the experimental investigations of gas flow and reactions in a fractured rock. The laboratory facility comprises (i) a high pressure manometric sorption apparatus, where equilibrium and kinetic phenomena of adsorption and desorption can be examined, (ii) a high pressure triaxial core flooding system where the chemical reactive transport properties or processes can be explored, and (iii) an ancillary system including pure and mixed gas supply and analysis units. Underground conditions, in terms of pore pressure, confining pressure, and temperature, can be replicated using the triaxial core flooding system developed for depths up to 2 km. Core flooding experiments can be conducted under a range of gas injection pressures up to 20 MPa and temperatures up to 338 K. Details of the design considerations and the specification for the critical measuring instruments are described. The newly developed laboratory facility has been applied to study the adsorption of N2, CH4, and CO2 relevant to applications in carbon sequestration in coal and enhanced coalbed methane recovery. Under a wide range of pressures, the flow of helium in a core sample was studied and the evolution of absolute permeability at different effective stress conditions has been investigated. A comprehensive set of high resolution data has been produced on anthracite coal samples from the South Wales coalfield, using the developed apparatus. The results of the applications provide improved insight into the high pressure flow and reaction of various gas species in the coal samples from the South Wales coalfield.
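The absolute-permeability evolution mentioned above is typically derived from steady-state gas flow through the plug via Darcy's law for compressible flow. A minimal sketch follows; the plug dimensions, helium viscosity, flow rate, and pressures are assumed for illustration and are not values from the facility described in the paper.

```python
import math

def gas_permeability_m2(q_std_m3_s, mu_pa_s, length_m, area_m2, p_up_pa, p_down_pa,
                        p_std_pa=101325.0):
    """Steady-state gas permeability from a core flood (Darcy's law for compressible flow):
    k = 2 * Q_std * P_std * mu * L / (A * (P_up^2 - P_down^2))."""
    return (2.0 * q_std_m3_s * p_std_pa * mu_pa_s * length_m
            / (area_m2 * (p_up_pa ** 2 - p_down_pa ** 2)))

# Hypothetical helium flood on a 38 mm diameter, 76 mm long plug (illustrative numbers)
area = math.pi * (0.038 / 2) ** 2
k = gas_permeability_m2(q_std_m3_s=1.0e-6, mu_pa_s=1.96e-5, length_m=0.076,
                        area_m2=area, p_up_pa=2.0e6, p_down_pa=1.0e6)
print(f"k = {k:.2e} m^2 ({k / 9.869e-16:.3f} mD)")
```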
Contributed Review: Nuclear magnetic resonance core analysis at 0.3 T
NASA Astrophysics Data System (ADS)
Mitchell, Jonathan; Fordham, Edmund J.
2014-11-01
Nuclear magnetic resonance (NMR) provides a powerful toolbox for petrophysical characterization of reservoir core plugs and fluids in the laboratory. Previously, there has been considerable focus on low field magnet technology for well log calibration. Now there is renewed interest in the study of reservoir samples using stronger magnets to complement these standard NMR measurements. Here, the capabilities of an imaging magnet with a field strength of 0.3 T (corresponding to 12.9 MHz for proton) are reviewed in the context of reservoir core analysis. Quantitative estimates of porosity (saturation) and pore size distributions are obtained under favorable conditions (e.g., in carbonates), with the added advantage of multidimensional imaging, detection of lower gyromagnetic ratio nuclei, and short probe recovery times that make the system suitable for shale studies. Intermediate field instruments provide quantitative porosity maps of rock plugs that cannot be obtained using high field medical scanners due to the field-dependent susceptibility contrast in the porous medium. Example data are presented that highlight the potential applications of an intermediate field imaging instrument as a complement to low field instruments in core analysis and for materials science studies in general.
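T2 distributions of the kind discussed above are commonly recovered by inverting a multi-exponential CPMG decay with a regularized non-negative least-squares fit. The sketch below applies that generic approach to a synthetic two-component decay; the regularization choice and all values are illustrative and do not represent the authors' processing chain.

```python
import numpy as np
from scipy.optimize import nnls

# Synthetic CPMG echo decay: two T2 components (hypothetical, for illustration only)
t = np.linspace(0.001, 1.0, 500)                      # echo times, s
signal = 0.7 * np.exp(-t / 0.05) + 0.3 * np.exp(-t / 0.4)
signal += np.random.default_rng(0).normal(0, 0.005, t.size)

# Inversion grid of candidate T2 values (log-spaced) and the exponential kernel matrix
T2 = np.logspace(-3, 1, 100)
K = np.exp(-t[:, None] / T2[None, :])

# Tikhonov-regularized non-negative least squares: append alpha*I rows to the kernel
alpha = 0.1
K_aug = np.vstack([K, alpha * np.eye(T2.size)])
s_aug = np.concatenate([signal, np.zeros(T2.size)])
amplitudes, _ = nnls(K_aug, s_aug)

peak = T2[np.argmax(amplitudes)]
print(f"dominant T2 component near {peak * 1e3:.0f} ms")
```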
The Tissue Analysis Core (TAC) within the AIDS and Cancer Virus Program will process, embed, and perform microtomy on fixed tissue samples presented in ethanol. CD4 (DAB) and CD68/CD163 (FastRed) double immunohistochemistry will be performed, in whic
Two-Dimensional Neutronic and Fuel Cycle Analysis of the Transatomic Power Molten Salt Reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Betzler, Benjamin R.; Powers, Jeffrey J.; Worrall, Andrew
2017-01-15
This status report presents the results from the first phase of the collaboration between Transatomic Power Corporation (TAP) and Oak Ridge National Laboratory (ORNL) to provide neutronic and fuel cycle analysis of the TAP core design through the Department of Energy Gateway for Accelerated Innovation in Nuclear, Nuclear Energy Voucher program. The TAP design is a molten salt reactor using movable moderator rods to shift the neutron spectrum in the core from mostly epithermal at beginning of life to thermal at end of life. Additional developments in the ChemTriton modeling and simulation tool provide the critical moderator-to-fuel ratio searches and time-dependent parameters necessary to simulate the continuously changing physics in this complex system. Results from simulations with these tools show agreement with TAP-calculated performance metrics for core lifetime, discharge burnup, and salt volume fraction, verifying the viability of reducing actinide waste production with this design. Additional analyses of time step sizes, mass feed rates and enrichments, and isotopic removals provide additional information to make informed design decisions. This work further demonstrates capabilities of ORNL modeling and simulation tools for analysis of molten salt reactor designs and strongly positions this effort for the upcoming three-dimensional core analysis.
Faster, better, cheaper: lean labs are the key to future survival.
Bryant, Patsy M; Gulling, Richard D
2006-03-28
Process improvement techniques have been used in manufacturing for many years to rein in costs and improve quality. Health care is now grappling with similar challenges. The Department of Laboratory Services at Good Samaritan Hospital, a 560-bed facility in Dayton, OH, used the Lean process improvement method in a 12-week project to streamline its core laboratory processes. By analyzing the flow of samples through the system and identifying value-added and non-value-added steps, both in the laboratory and during the collection process, Good Samaritan's project team redesigned systems and reconfigured the core laboratory layout to trim collection-to-results time from 65 minutes to 40 minutes. As a result, virtually all morning results are available to physicians by 7 a.m., critical values are called to nursing units within 30 minutes, and core laboratory services are optimally staffed for maximum cost-effectiveness.
Pacific Northwest Laboratory Institutional Plan FY 1995-2000
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1994-12-01
This report serves as a document to describe the role PNL is positioned to take in the Department of Energy's plans for its national centers in the period 1995-2000. It highlights the strengths of the facilities and personnel present at the laboratory, touches on the accomplishments and projects they have contributed to, and the direction being taken to prepare for the demands to be placed on DOE facilities in the near and far term. It consists of sections titled: director's statement; laboratory mission and core competencies; laboratory strategic plan; laboratory initiatives; core business areas; critical success factors.
Core courses in public health laboratory science and practice: findings from 2006 and 2011 surveys.
DeBoy, John M; Beck, Angela J; Boulton, Matthew L; Kim, Deborah H; Wichman, Michael D; Luedtke, Patrick F
2013-01-01
We identified academic training courses or topics most important to the careers of U.S. public health, environmental, and agricultural laboratory (PHEAL) scientist-managers and directors, and determined what portions of the national PHEAL workforce completed these courses. We conducted electronic national surveys in 2006 and 2011, and analyzed data using numerical ranking, Chi-square tests comparing rates, and Spearman's formula measuring rank correlation. In 2006, 40 of 50 PHEAL directors identified 56 course topics as either important, useful, or not needed for someone in their position. These course topics were then ranked to provide a list of 31 core courses. In 2011, 1,659 of approximately 5,555 PHEAL scientific and technical staff, using a subset of 25 core courses, evidenced higher core course completion rates associated with higher-level job classification, advanced academic degree, and age. The 2011 survey showed that 287 PHEAL scientist-managers and directors, on average, completed 37.7% (n=5/13) of leadership/managerial core courses and 51.7% (n=6/12) of scientific core courses. For 1,659 laboratorians in all scientific and technical classifications, core-subject completion rates were higher in local laboratories (42.8%, n=11/25) than in state (36.0%, n=9/25), federal (34.4%, n=9/25), and university (31.2%, n=8/25) laboratories. There is a definable range of scientific, leadership, and managerial core courses needed by PHEAL scientist-managers and directors to function effectively in their positions. Potential PHEAL scientist-managers and directors need greater and continuing access to these courses, and academic and practice entities supporting development of this workforce should adopt curricula and core competencies aligned with these course topics.
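The rate comparisons reported above lend themselves to a chi-square test on a contingency table of completed versus not-completed core courses by laboratory type. The sketch below scales the reported average proportions to a hypothetical 100 respondents per laboratory type purely for illustration; it does not use the survey's raw counts or reproduce its statistics.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Illustrative 2x4 table (completed vs. not completed) by laboratory type, built from the
# reported average proportions (~43%, 36%, 34%, 31%) applied to a hypothetical 100
# respondents per type -- not the survey's actual counts.
completed     = np.array([43, 36, 34, 31])   # local, state, federal, university
not_completed = 100 - completed
table = np.vstack([completed, not_completed])

chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```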
Prostate needle biopsy processing: a survey of laboratory practice across Europe.
Varma, Murali; Berney, Daniel M; Algaba, Ferran; Camparo, Philippe; Compérat, Eva; Griffiths, David F R; Kristiansen, Glen; Lopez-Beltran, Antonio; Montironi, Rodolfo; Egevad, Lars
2013-02-01
To determine the degree of variation in the handling of prostate needle biopsies (PNBx) in laboratories across Europe. A web-based survey was emailed to members of the European Network of Uropathology and the British Association of Urological Pathologists. Responses were received from 241 laboratories in 15 countries. PNBx were generally taken by urologists (93.8%) or radiologists (23.7%) but in 8.7% were also taken by non-medical personnel such as radiographers, nurses or biomedical assistants. Of the responding laboratories, 40.8% received cores in separate containers, 42.3% processed one core/block, 54.2% examined three levels/block, 49.4% examined one H&E section/level and 56.1% retained spare sections for potential immunohistochemistry. Of the laboratories, 40.9% retained unstained spares for over a year while 36.2% discarded spares within 1 month of reporting. Only two (0.8%) respondents routinely performed immunohistochemistry on all PNBx. There were differences in laboratory practice between the UK and the rest of Europe (RE). Procurement of PNBx by non-medical personnel was more common in the UK. RE laboratories more commonly received each core in a separate container, processed one core/block, examined fewer levels/block and examined more H&E sections/level. RE laboratories also retained spares for potential immunohistochemistry less often and for shorter periods. Use of p63 as the sole basal cell marker was more common in RE. There are marked differences in procurement, handling and processing of PNBx in laboratories across Europe. These data can help the development of best practice guidelines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gillespie, B.M.; Stromatt, R.W.; Ross, G.A.
This data package contains the results obtained by Pacific Northwest Laboratory (PNL) staff in the characterization of samples for the 101-SY Hydrogen Safety Project. The samples were submitted for analysis by Westinghouse Hanford Company (WHC) under the Technical Project Plan (TPP) 17667 and the Quality Assurance Plan MCS-027. They came from a core taken during Window "C" after the May 1991 gas release event. The analytical procedures required for analysis were defined in the Test Instructions (TI) prepared by the PNL 101-SY Analytical Chemistry Laboratory (ACL) Project Management Office in accordance with the TPP and the QA Plan. The requested analysis for these samples was volatile organic analysis. The quality control (QC) requirements for each sample are defined in the Test Instructions for each sample. The QC requirements outlined in the procedures and requested in the WHC statement of work were followed.
The Point-of-Care Laboratory in Clinical Microbiology
Michel-Lepage, Audrey; Boyer, Sylvie; Raoult, Didier
2016-01-01
Point-of-care (POC) laboratories that deliver rapid diagnoses of infectious diseases were invented to balance the centralization of core laboratories. POC laboratories operate 24 h a day and 7 days a week to provide diagnoses within 2 h, largely based on immunochromatography and real-time PCR tests. In our experience, these tests are conveniently combined into syndrome-based kits that facilitate sampling, including self-sampling and test operations, as POC laboratories can be operated by trained operators who are not necessarily biologists. POC laboratories are a way of easily providing clinical microbiology testing for populations distant from laboratories in developing and developed countries and on ships. Modern Internet connections enable support from core laboratories. The cost-effectiveness of POC laboratories has been established for the rapid diagnosis of tuberculosis and sexually transmitted infections in both developed and developing countries. PMID:27029593
Ranking of sabotage/tampering avoidance technology alternatives
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrews, W.B.; Tabatabai, A.S.; Powers, T.B.
1986-01-01
Pacific Northwest Laboratory conducted a study to evaluate alternatives to the design and operation of nuclear power plants, emphasizing a reduction of their vulnerability to sabotage. Estimates of core melt accident frequency during normal operations and from sabotage/tampering events were used to rank the alternatives. Core melt frequency for normal operations was estimated using sensitivity analysis of results of probabilistic risk assessments. Core melt frequency for sabotage/tampering was estimated by developing a model based on probabilistic risk analyses, historic data, engineering judgment, and safeguards analyses of plant locations where core melt events could be initiated. Results indicate the most effective alternatives focus on large areas of the plant, increase safety system redundancy, and reduce reliance on single locations for mitigation of transients. Less effective options focus on specific areas of the plant, reduce reliance on some plant areas for safe shutdown, and focus on less vulnerable targets.
NASA Astrophysics Data System (ADS)
Riegel, H. B.; Zambrano, M.; Jablonska, D.; Emanuele, T.; Agosta, F.; Mattioni, L.; Rustichelli, A.
2017-12-01
The hydraulic properties of fault zones depend upon the individual contributions of the damage zone and the fault core. In the case of the damage zone, it is generally characterized by means of fracture analysis and modelling implementing multiple approaches, for instance the discrete fracture network model, the continuum model, and the channel network model. Conversely, the fault core is more difficult to characterize because it is normally composed of fine-grained material generated by friction and wear. If the dimensions of the fault core allow it, the porosity and permeability are normally studied by means of laboratory analysis, or otherwise by two-dimensional microporosity analysis and in situ measurements of permeability (e.g. micro-permeameter). In this study, a combined approach consisting of fracture modeling, three-dimensional microporosity analysis, and computational fluid dynamics was applied to characterize the hydraulic properties of fault zones. The studied fault zones crosscut a well-cemented heterolithic succession (sandstone and mudstones) and may vary in terms of fault core thickness and composition, fracture properties, kinematics (normal or strike-slip), and displacement. These characteristics produce various splay and fault core behavior. The alternation of sandstone and mudstone layers is responsible for the concurrent occurrence of brittle (fractures) and ductile (clay smearing) deformation. When these alternating layers are faulted, they produce corresponding fault cores which act as conduits or barriers for fluid migration. When analyzing damage zones, accurate field data acquisition and stochastic modeling were used to determine the hydraulic properties of the rock volume, in relation to the surrounding, undamaged host rock. In the fault cores, the three-dimensional pore network quantitative analysis based on X-ray microtomography images includes porosity, pore connectivity, and specific surface area. In addition, images were used to perform computational fluid simulation (Lattice-Boltzmann multi-relaxation-time method) and estimate the permeability. These results will be useful for understanding the deformation process and hydraulic properties across meter-scale damage zones.
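Porosity and specific surface area from segmented microtomography volumes, as referenced above, can be estimated by counting pore voxels and pore-solid voxel faces. A minimal sketch on a random synthetic volume follows; the voxel size and the volume itself are hypothetical stand-ins for real X-ray microtomography data, not results from this study.

```python
import numpy as np

# Hypothetical segmented micro-CT volume: 1 = pore, 0 = solid (random field for illustration)
rng = np.random.default_rng(1)
volume = (rng.random((100, 100, 100)) > 0.8).astype(np.int8)
voxel = 2.0e-6  # voxel edge length in metres (assumed)

# Porosity: pore-voxel fraction
porosity = volume.mean()

# Specific surface area: count pore/solid voxel interfaces along each axis, then
# convert face count to area and normalize by the bulk volume
faces = sum(np.count_nonzero(np.diff(volume, axis=ax)) for ax in range(3))
surface_area = faces * voxel ** 2
bulk_volume = volume.size * voxel ** 3
ssa = surface_area / bulk_volume          # m^2 of pore wall per m^3 of rock

print(f"porosity = {porosity:.3f}, specific surface = {ssa:.3e} m^-1")
```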
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swift, T.E.; Marlow, R.E.; Wilhelm, M.H.
1981-11-01
This report describes part of the work done to fulfill a contract awarded to Gruy Federal, Inc., by the Department of Energy (DOE) on February 12, 1979. The work includes pressure-coring and associated logging and testing programs to provide data on in-situ oil saturation, porosity and permeability distribution, and other data needed for resource characterization of fields and reservoirs in which CO2 injection might have a high probability of success. This report details the second such project. Core porosities agreed well with computed log porosities. Core water saturation and computed log porosities agree fairly well from 3692 to 3712 feet, poorly from 3712 to 3820 feet and in a general way from 4035 to 4107 feet. Computer log analysis techniques incorporating the a, m, and n values obtained from Core Laboratories analysis did not improve the agreement of log versus core derived water saturations. However, both core and log analysis indicated the ninth zone had the highest residual hydrocarbon saturations and production data confirmed the validity of oil saturation determinations. Residual oil saturations for the perforated and tested intervals were 259 STB/acre-ft for the interval from 4035 to 4055 feet, and 150 STB/acre-ft for the interval from 3692 to 3718 feet. Nine BOPD was produced from the interval 4035 to 4055 feet and no oil was produced from interval 3692 to 3718 feet, qualitatively confirming the relative oil saturations as calculated. The low oil production in the zone from 4022 to 4055 and the lack of production from 3692 to 3718 feet indicated the zone to be at or near residual waterflood conditions as determined by log analysis. This project demonstrates the usefulness of integrating pressure core, log, and production data to realistically evaluate a reservoir for carbon dioxide flood.
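The residual-oil figures quoted in STB/acre-ft follow from the standard volumetric constant of 7,758 barrels per acre-foot. The sketch below shows the unit conversion with assumed porosity, oil saturation, and formation volume factor; these inputs are hypothetical and merely happen to reproduce the order of one of the reported values.

```python
def oil_in_place_stb_per_acre_ft(porosity: float, so: float, bo: float = 1.0) -> float:
    """Volumetric oil in place per acre-foot: 7,758 bbl/acre-ft * phi * So / Bo."""
    return 7758.0 * porosity * so / bo

# Hypothetical inputs chosen only to illustrate the unit conversion; the report's actual
# porosity, saturation, and formation volume factor are not given here.
print(f"{oil_in_place_stb_per_acre_ft(porosity=0.20, so=0.17, bo=1.02):.0f} STB/acre-ft")
```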
Shallow Carbon Sequestration Demonstration Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pendergrass, Gary; Fraley, David; Alter, William
The potential for carbon sequestration at relatively shallow depths was investigated at four power plant sites in Missouri. Exploratory boreholes were cored through the Davis Shale confining layer into the St. Francois aquifer (Lamotte Sandstone and Bonneterre Formation). Precambrian basement contact ranged from 654.4 meters at the John Twitty Energy Center in Southwest Missouri to over 1100 meters near the Sioux Power Plant in St. Charles County. Investigations at the John Twitty Energy Center included 3D seismic reflection surveys, downhole geophysical logging and pressure testing, and laboratory analysis of rock core and water samples. Plans to perform injectivity tests at the John Twitty Energy Center, using food grade CO2, had to be abandoned when the isolated aquifer was found to have very low dissolved solids content. Investigations at the Sioux Plant and Thomas Hill Energy Center in Randolph County found suitably saline conditions in the St. Francois. A fourth borehole in Platte County was discontinued before reaching the aquifer. Laboratory analyses of rock core and water samples indicate that the St. Charles and Randolph County sites could have storage potentials worthy of further study. The report suggests additional Missouri areas for further investigation as well.
Carroll, R.D.
1969-01-01
A statistical analysis was made of the relationship of various acoustic parameters of volcanic rocks to compressional wave velocities for data obtained in a volcanic region in Nevada. Some additional samples, chiefly granitic rocks, were also included in the study to extend the range of parameters and the variety of siliceous rock types sampled. Laboratory acoustic measurements obtained on 62 dry core samples were grouped with similar measurements obtained from geophysical logging devices at several depth intervals in a hole from which 15 of the core samples had been obtained. The effects of lithostatic and hydrostatic load on changing the rock acoustic parameters measured in the hole were noticeable when compared with the laboratory measurements on the same core. The results of the analyses determined by grouping all of the data, however, indicate that dynamic Young's, shear and bulk modulus, shear velocity, shear and compressional characteristic impedance, as well as amplitude and energy reflection coefficients may be reliably estimated on the basis of the compressional wave velocities of the rocks investigated. Less precise estimates can be made of density based on the rock compressional velocity. The possible extension of these relationships to include many siliceous rocks is suggested. © 1969.
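The dynamic moduli and impedances listed above follow from standard isotropic elasticity relations once Vp, Vs, and density are known. A minimal sketch follows; the velocity and density values are hypothetical, not measurements from the study.

```python
def dynamic_moduli(vp: float, vs: float, rho: float) -> dict:
    """Dynamic elastic properties (SI units) from P/S velocities (m/s) and density (kg/m^3)."""
    mu = rho * vs ** 2                                   # shear modulus, Pa
    k = rho * (vp ** 2 - 4.0 * vs ** 2 / 3.0)            # bulk modulus, Pa
    e = 9.0 * k * mu / (3.0 * k + mu)                    # Young's modulus, Pa
    nu = (3.0 * k - 2.0 * mu) / (2.0 * (3.0 * k + mu))   # Poisson's ratio
    z_p, z_s = rho * vp, rho * vs                        # characteristic impedances, kg/(m^2 s)
    return {"G": mu, "K": k, "E": e, "nu": nu, "Zp": z_p, "Zs": z_s}

# Hypothetical dry volcanic core values, for illustration only
for name, value in dynamic_moduli(vp=4500.0, vs=2600.0, rho=2400.0).items():
    print(name, f"{value:.3e}")
```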
NASA Astrophysics Data System (ADS)
Cortinovis, Silvia; Balsamo, Fabrizio; Storti, Fabrizio
2017-04-01
The study of the microstructural and petrophysical evolution of cataclasites and gouges has a fundamental impact on both hydraulic and frictional properties of fault zones. In the last decades, growing attention has been paid to the characterization of carbonate fault core rocks due to the nucleation and propagation of coseismic ruptures in carbonate successions (e.g., Umbria-Marche 1997, L'Aquila 2009, Amatrice 2016 earthquakes in Central Apennines, Italy). Among several physical parameters, grain size and shape in fault core rocks are expected to control the way of sliding along the slip surfaces in active fault zones, thus influencing the propagation of coseismic ruptures during earthquakes. Nevertheless, the role of grain size and shape distribution evolution in controlling the weakening or strengthening behavior in seismogenic fault zones is still not fully understood, also because a comprehensive database from natural fault cores is still missing. In this contribution, we present a preliminary study of seismogenic extensional fault zones in Central Apennines by combining detailed field mapping with grain size and microstructural analysis of fault core rocks. Field mapping was aimed at describing the structural architecture of fault systems and the along-strike fault rock distribution and fracturing variations. In the laboratory we used a Malvern Mastersizer 3000 granulometer to obtain a precise grain size characterization of loose fault rocks, combined with sieving for coarser size classes. In addition, we employed image analysis on thin sections to quantify the grain shape and size in cemented fault core rocks. The studied fault zones consist of an up to 5-10 m-thick fault core where most of the slip is accommodated, surrounded by a tens-of-meters wide fractured damage zone. Fault core rocks consist of (1) loose to partially cemented breccias characterized by different grain size (from several cm down to mm) and variable grain shape (from very angular to sub-rounded), and (2) very fine-grained gouges (< 1 mm) localized along major and minor mirror-like slip surfaces. Damage zones mostly consist of fractured rocks and, locally, pulverized rocks. Collectively, field observations and laboratory analyses indicate that within the fault cores of the studied fault zones, grain size progressively decreases approaching the master slip surfaces. Furthermore, grain shape changes from very angular to sub-rounded clasts moving toward the master slip surfaces. These features suggest that the progressive evolution of grain size and shape distributions within fault cores may have determined the development of strain localization by the softening and cushioning effects of smaller particles in loose fault rocks.
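Grain-size statistics of the kind produced by laser granulometry (e.g., D10/D50/D90) can be read off the cumulative volume curve by interpolation. The sketch below uses a hypothetical bin/volume-fraction output; it is not Mastersizer data from this study.

```python
import numpy as np

# Hypothetical laser-granulometer output: bin centres (micrometres) and volume fractions
bins = np.array([0.5, 1, 2, 4, 8, 16, 31, 63, 125, 250, 500, 1000], dtype=float)
vol_frac = np.array([0.01, 0.02, 0.04, 0.07, 0.10, 0.14, 0.18, 0.17, 0.13, 0.08, 0.04, 0.02])
vol_frac /= vol_frac.sum()

# Cumulative curve, then percentile diameters by interpolation on a log size scale
cum = np.cumsum(vol_frac)
d10, d50, d90 = (10 ** np.interp(q, cum, np.log10(bins)) for q in (0.10, 0.50, 0.90))

print(f"D10 = {d10:.1f} um, D50 = {d50:.1f} um, D90 = {d90:.1f} um")
```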
28. MODIFIED CHAIN SAW FOR CUTTING ROCK CORES; BRUNTON COMPASS ...
28. MODIFIED CHAIN SAW FOR CUTTING ROCK CORES; BRUNTON COMPASS STAND FOR DETERMINING CORE'S FIELD ORIENTATION; INSECTICIDE DISPENSER MODIFIED TO LUBRICATE CORE DRILLING PROCESS. - U.S. Geological Survey, Rock Magnetics Laboratory, 345 Middlefield Road, Menlo Park, San Mateo County, CA
Ejector subassembly for dual wall air drilling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kolle, J.J.
1996-09-01
The dry drilling system developed for the Yucca Mountain Site Characterization Project incorporates a surface vacuum system to prevent drilling air and cuttings from contaminating the borehole wall during coring operations. As the drilling depth increases, however, there is a potential for borehole contamination because of the limited volume of air which can be removed by the vacuum system. A feasibility analysis has shown that an ejector subassembly mounted in the drill string above the core barrel could significantly enhance the depth capacity of the dry drilling system. The ejector subassembly would use a portion of the air supplied to the core bit to maintain a vacuum on the hole bottom. The results of a design study, including performance testing of a laboratory-scale ejector simulator, are presented here.
Finning, Kirstin; Bhandari, Radhika; Sellers, Fiona; Revelli, Nicoletta; Villa, Maria Antonietta; Muñiz-Díaz, Eduardo; Nogués, Núria
2016-03-01
High-throughput genotyping platforms enable simultaneous analysis of multiple polymorphisms for blood group typing. BLOODchip® ID is a genotyping platform based on Luminex® xMAP technology for simultaneous determination of 37 red blood cell (RBC) antigens (ID CORE XT) and 18 human platelet antigens (HPA) (ID HPA XT) using the BIDS XT software. In this international multicentre study, the performance of ID CORE XT and ID HPA XT, using the centres' current genotyping methods as the reference for comparison, and the usability and practicality of these systems, were evaluated under working laboratory conditions. DNA was extracted from whole blood in EDTA with Qiagen methodologies. Ninety-six previously phenotyped/genotyped samples were processed per assay: 87 testing samples plus five positive controls and four negative controls. Results were available for 519 samples: 258 with ID CORE XT and 261 with ID HPA XT. There were three "no calls" that were either caused by human error or resolved after repeating the test. Agreement between the tests and reference methods was 99.94% for ID CORE XT (9,540/9,546 antigens determined) and 100% for ID HPA XT (all 4,698 alleles determined). There were six discrepancies in antigen results in five RBC samples, four of which (in VS, N, S and Do(a)) could not be investigated due to lack of sufficient sample to perform additional tests and two of which (in S and C) were resolved in favour of ID CORE XT (100% accuracy). The total hands-on time was 28-41 minutes for a batch of 16 samples. Compared with the reference platforms, ID CORE XT and ID HPA XT were considered simpler to use and had shorter processing times. ID CORE XT and ID HPA XT genotyping platforms for RBC and platelet systems were accurate and user-friendly in working laboratory settings.
Advanced core-analyses for subsurface characterization
NASA Astrophysics Data System (ADS)
Pini, R.
2017-12-01
The heterogeneity of geological formations varies over a wide range of length scales and represents a major challenge for predicting the movement of fluids in the subsurface. Although they are inherently limited in the accessible length-scale, laboratory measurements on reservoir core samples still represent the only way to make direct observations on key transport properties. Yet, properties derived on these samples are of limited use and should be regarded as sample-specific (or `pseudos'), if the presence of sub-core scale heterogeneities is not accounted for in data processing and interpretation. The advent of imaging technology has significantly reshaped the landscape of so-called Special Core Analysis (SCAL) by providing unprecedented insight on rock structure and processes down to the scale of a single pore throat (i.e. the scale at which all reservoir processes operate). Accordingly, improved laboratory workflows are needed that make use of such wealth of information by e.g., referring to the internal structure of the sample and in-situ observations, to obtain accurate parameterisation of both rock- and flow-properties that can be used to populate numerical models. We report here on the development of such workflow for the study of solute mixing and dispersion during single- and multi-phase flows in heterogeneous porous systems through a unique combination of two complementary imaging techniques, namely X-ray Computed Tomography (CT) and Positron Emission Tomography (PET). The experimental protocol is applied to both synthetic and natural porous media, and it integrates (i) macroscopic observations (tracer effluent curves), (ii) sub-core scale parameterisation of rock heterogeneities (e.g., porosity, permeability and capillary pressure), and direct 3D observation of (iii) fluid saturation distribution and (iv) the dynamic spreading of the solute plumes. Suitable mathematical models are applied to reproduce experimental observations, including both 1D and 3D numerical schemes populated with the parameterisation above. While it validates the core-flooding experiments themselves, the calibrated mathematical model represents a key element for extending them to conditions prevalent in the subsurface, which would be otherwise not attainable in the laboratory.
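Dispersion coefficients from tracer effluent (breakthrough) curves, as referenced above, are often obtained by fitting a one-dimensional advection-dispersion solution to the measured curve. The sketch below fits the common approximate form of the Ogata-Banks solution to a synthetic curve; the core length, pore velocity, and noise level are assumed, and the approach is a generic illustration rather than the author's workflow.

```python
import numpy as np
from scipy.special import erfc
from scipy.optimize import curve_fit

L = 0.10          # core length, m (assumed)
v = 1.0e-5        # mean pore velocity, m/s (assumed)

def breakthrough(t, log10_D):
    """Approximate 1D advection-dispersion breakthrough at x = L (boundary term neglected)."""
    D = 10.0 ** log10_D
    return 0.5 * erfc((L - v * t) / (2.0 * np.sqrt(D * t)))

# Synthetic "measured" effluent curve generated with a known dispersion coefficient plus noise
t = np.linspace(1e3, 3e4, 200)
data = breakthrough(t, np.log10(2.0e-8)) + np.random.default_rng(2).normal(0, 0.01, t.size)

(log10_D_fit,), _ = curve_fit(breakthrough, t, data, p0=[-8.0])
print(f"fitted dispersion coefficient: {10 ** log10_D_fit:.2e} m^2/s")
```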
U.S. Army Research Laboratory (ARL) XPairIt Simulator for Peptide Docking and Analysis
2014-07-01
results from a case study, docking a short peptide to a small protein. For this test we chose the 1RXZ system from the Protein Data Bank, which ... core of XPairIt, which additionally contains many data management and organization options, analysis tools, and custom simulation methodology. Two
The Tissue Analysis Core within the AIDS and Cancer Virus Program will process, embed and perform microtomy on fixed tissue samples presented in ethanol. HIV/SIV in situ hybridization for detection of vRNA and vDNA will be performed using the next-gene
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, Sean; Dewan, Leslie; Massie, Mark
This report presents results from a collaboration between Transatomic Power Corporation (TAP) and Oak Ridge National Laboratory (ORNL) to provide neutronic and fuel cycle analysis of the TAP core design through the Department of Energy Gateway for Accelerated Innovation in Nuclear (GAIN) Nuclear Energy Voucher program. The TAP concept is a molten salt reactor using configurable zirconium hydride moderator rod assemblies to shift the neutron spectrum in the core from mostly epithermal at beginning of life to thermal at end of life. Additional developments in the ChemTriton modeling and simulation tool provide the critical moderator-to-fuel ratio searches and time-dependent parameters necessary to simulate the continuously changing physics in this complex system. The implementation of continuous-energy Monte Carlo transport and depletion tools in ChemTriton provides for full-core three-dimensional modeling and simulation. Results from simulations with these tools show agreement with TAP-calculated performance metrics for core lifetime, discharge burnup, and salt volume fraction, verifying the viability of reducing actinide waste production with this concept. Additional analyses of mass feed rates and enrichments, isotopic removals, tritium generation, core power distribution, core vessel helium generation, moderator rod heat deposition, and reactivity coefficients provide additional information to make informed design decisions. This work demonstrates capabilities of ORNL modeling and simulation tools for neutronic and fuel cycle analysis of molten salt reactor concepts.
Nunez, A.; Strahan, G.; Soroka, D.S.; Damert, W.; Needleman, D.
2011-01-01
The Core Technologies (CT) unit, located at the Eastern Regional Research Center (ERRC), is a centralized resource of specialized instrumentation and technologies. Its objective is to provide supplementary research data processing, interpretation, analysis, and consultation for a broad range of research programs approved by the Agricultural Research Service (ARS), the in-house research arm of the United States Department of Agriculture. The CT unit comprises four research-related components: genetic analysis, proteomics/biopolymers mass spectrometry, electron microscopy, and magnetic resonance spectroscopy (NMR). In addition, the Research Data Systems, the information pipeline of the CT, provides the means to facilitate data distribution to researchers, stakeholders, and the general public. The availability of integrated resource laboratories assures professional and dependable support to the goals of the ARS community.
USDA-ARS?s Scientific Manuscript database
The Genetics Analysis Core Facility at the Eastern Regional Research Center (ERRC) is a centralized laboratory working in support of the United States Department of Agriculture scientists within the ERRC. The application of molecular diagnostics within the Center has resulted in the morphing of the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Denman, Matthew R.; Brooks, Dusty Marie
Sandia National Laboratories (SNL) has conducted an uncertainty analysis (UA) on the Fukushima Daiichi unit 1 (1F1) accident progression with the MELCOR code. Volume I of the 1F1 UA discusses the physical modeling details and time history results of the UA. Volume II of the 1F1 UA discusses the statistical viewpoint. The model used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). The goal of this work was to perform a focused evaluation of uncertainty in core damage progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, fraction of intact fuel, vessel lower head failure) and in doing so assess the applicability of traditional sensitivity analysis techniques.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cardoni, Jeffrey N.; Kalinich, Donald A.
2014-02-01
Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit 1 (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study only examined a handful of various model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.
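Uncertainty analyses of this kind typically propagate sampled input sets (for example, Latin hypercube samples of uncertain model parameters) through the code and collect the figures of merit across realizations. The sketch below shows only that sampling step with hypothetical parameter names and ranges; it is not SNL's methodology and not a MELCOR input deck.

```python
import numpy as np
from scipy.stats import qmc

# Hypothetical uncertain severe-accident inputs with uniform ranges (illustration only)
names = ["cladding_failure_T_K", "debris_porosity", "radial_relocation_time_s"]
lower = np.array([2100.0, 0.30, 60.0])
upper = np.array([2500.0, 0.50, 600.0])

sampler = qmc.LatinHypercube(d=len(names), seed=0)
samples = qmc.scale(sampler.random(n=100), lower, upper)   # 100 sampled input sets

# Each row would define one code realization; figures of merit (e.g., hydrogen produced)
# collected across realizations then form the output uncertainty distributions.
print(samples[:3])
```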
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lindenmeier, Clark W.; Serne, R. Jeffrey; Bjornstad, Bruce N.
2003-09-11
This report summarizes data collected from samples in borehole 299-E33-338 (C3391). Borehole 299-E33-338 was drilled for two purposes. One purpose was for installation of a RCRA ground-water monitoring well and the other was to characterize the in situ soils and background porewater chemistry near WMA B-BX-BY that have been largely uncontaminated by tank farm and crib and trench discharge operations. This borehole was drilled just outside the southeast fence line of the B tank farm. The borehole was drilled between July 23 and August 8, 2001 to a total depth of 80.05 m (275.75 ft) bgs using the cable-tool method (Horton 2002). The water table was contacted at 77.5 m (254.2 ft) bgs and the top of basalt at 82.6 m (271 ft) bgs. Samples to the top of basalt were collected via a drive barrel/splitspoon, before switching to a hard tool to drill 5 feet into the basalt. Nearly continuous core was obtained down to a depth of ~78.6 m (258 ft) bgs. One hundred and two 2-ft long by 4-in diameter cores were retrieved, which accounts for ~75% of the total length of the borehole. Each 2-ft splitspoon contained two 1-ft Lexan-lined core segments. The lithology of this borehole was summarized onto a field geologist's log by a CH2M HILL Hanford Group, Inc. geologist (L. D. Walker); subsequently, visual inspection of the cores was performed in the laboratory by K. A. Lindsey (Kennedy/Jenks), K. D. Reynolds (Duratek), and B. N. Bjornstad (Pacific Northwest National Laboratory), who also collected 24 samples for paleomagnetic analysis. Subsamples were taken from all 102 cores for moisture content (Table B.1). In addition, 21 core subsamples were collected from a depth of geological interest for mineralogical and geochemical analysis. Data from these samples allow for comparison of uncontaminated versus contaminated soils to better understand the contributions of tank wastes and other wastewaters on the vadose zone in and around WMA B-BX-BY.
NASA Astrophysics Data System (ADS)
Jackson, S. J.; Krevor, S. C.; Agada, S.
2017-12-01
A number of studies have demonstrated the prevalent impact that small-scale rock heterogeneity can have on larger scale flow in multiphase flow systems, including petroleum production and CO2 sequestration. Larger scale modeling has shown that this has a significant impact on fluid flow and is possibly a significant source of inaccuracy in reservoir simulation. Yet no core analysis protocol has been developed that faithfully represents the impact of these heterogeneities on flow functions used in modeling. Relative permeability is derived from core floods performed at conditions with high flow potential in which the impact of capillary heterogeneity is voided. A more accurate representation would be obtained if measurements were made at flow conditions where the impact of capillary heterogeneity on flow is scaled to be representative of the reservoir system. This, however, is generally impractical due to laboratory constraints and the role of the orientation of the rock heterogeneity. We demonstrate a workflow of combined observations and simulations, in which the impact of capillary heterogeneity may be faithfully represented in the derivation of upscaled flow properties. Laboratory measurements that are a variation of conventional protocols are used for the parameterization of an accurate digital rock model for simulation. The relative permeability at the range of capillary numbers relevant to flow in the reservoir is derived primarily from numerical simulations of core floods that include capillary pressure heterogeneity. This allows flexibility in the orientation of the heterogeneity and in the range of flow rates considered. We demonstrate the approach in which digital rock models have been developed alongside core flood observations for three applications: (1) A Bentheimer sandstone with a simple axial heterogeneity to demonstrate the validity and limitations of the approach, (2) a set of reservoir rocks from the Captain sandstone in the UK North Sea targeted for CO2 storage, and for which the use of capillary pressure hysteresis is necessary, and (3) a secondary CO2-EOR production of residual oil from a Berea sandstone with layered heterogeneities. In all cases the incorporation of heterogeneity is shown to be key to the ultimate derivation of flow properties representative of the reservoir system.
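Whether a core flood is capillary- or viscous-dominated is usually judged from the capillary number, and the upscaled flow functions themselves are often expressed as relative permeability curves. The sketch below shows both quantities in their simplest textbook form; all parameter values are hypothetical and are not taken from the Bentheimer, Captain, or Berea cases above.

    # Minimal sketch: capillary number and a Brooks-Corey style relative
    # permeability curve.  All parameter values are hypothetical.
    import numpy as np

    def capillary_number(darcy_velocity_m_s, viscosity_pa_s, ift_n_m):
        """Nc = mu * v / sigma (dimensionless)."""
        return viscosity_pa_s * darcy_velocity_m_s / ift_n_m

    def brooks_corey_krw(sw, swirr=0.2, sor=0.2, krw_max=0.4, n=3.0):
        """Water relative permeability from a Brooks-Corey style power law."""
        swn = np.clip((sw - swirr) / (1.0 - swirr - sor), 0.0, 1.0)
        return krw_max * swn ** n

    # Typical reservoir-like flow condition (hypothetical numbers):
    nc = capillary_number(darcy_velocity_m_s=1e-6, viscosity_pa_s=5e-4, ift_n_m=0.03)
    print(f"capillary number ~ {nc:.1e}")  # ~2e-8: capillary-dominated regime

    sw = np.linspace(0.2, 0.8, 7)
    print(np.round(brooks_corey_krw(sw), 4))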
NASA Astrophysics Data System (ADS)
Soom, F.; Ulrich, C.; Dafflon, B.; Wu, Y.; Kneafsey, T. J.; López, R. D.; Peterson, J.; Hubbard, S. S.
2016-12-01
The Arctic tundra with its permafrost-dominated soils is one of the regions most affected by global climate change, and in turn, can also influence the changing climate through biogeochemical processes, including greenhouse gas release or storage. Characterization of shallow permafrost distribution and characteristics is required for predicting ecosystem feedbacks to a changing climate over decadal to century timescales, because they can drive active layer deepening and land surface deformation, which in turn can significantly affect hydrological and biogeochemical responses, including greenhouse gas dynamics. In this study, part of the Next-Generation Ecosystem Experiment (NGEE-Arctic), we use X-ray computed tomography (CT) to estimate wet bulk density of cores extracted from a field site near Barrow, AK, which extend 2-3 m through the active layer into the permafrost. We use multi-dimensional relationships inferred from destructive core sample analysis to estimate organic matter density, dry bulk density, and ice content, along with some geochemical properties, from nondestructive CT scans along the entire length of the cores, coverage that could not be obtained by the spatially limited destructive laboratory analysis. Multi-parameter cross-correlations showed good agreement between soil properties estimated from CT scans versus properties obtained through destructive sampling. Soil properties estimated from cores located in different types of polygons provide valuable information about the vertical distribution of soil and permafrost properties as a function of geomorphology.
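The core of the approach is a calibration step: properties measured destructively on a few subsamples are regressed against the CT-derived wet bulk density, and the fit is then applied along the whole core. A minimal single-variable sketch with invented calibration numbers is shown below; the actual NGEE-Arctic relationships are multi-dimensional.

    # Minimal sketch of calibrating CT-derived wet bulk density against
    # destructively measured ice content, then applying the fit along the core.
    # The calibration numbers are made up for illustration.
    import numpy as np

    # Hypothetical calibration pairs from destructive subsamples:
    wet_bulk_density = np.array([0.9, 1.1, 1.3, 1.5, 1.7])       # g/cm^3 from CT
    ice_content      = np.array([0.80, 0.65, 0.52, 0.38, 0.25])  # volumetric fraction

    slope, intercept = np.polyfit(wet_bulk_density, ice_content, deg=1)

    def predict_ice_content(ct_density):
        return slope * ct_density + intercept

    # Apply the calibration to a continuous CT density log down the core:
    ct_log = np.linspace(0.95, 1.65, 8)
    print(np.round(predict_ice_content(ct_log), 2))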
NASA Astrophysics Data System (ADS)
Syafrina, R.; Rohman, I.; Yuliani, G.
2018-05-01
This study aims to analyze the concept characteristics of solubility and solubility product that will serve as the basis for the development of a virtual laboratory and students' science process skills. Characteristics of the analyzed concepts include concept definitions, concept attributes, and types of concepts. The concepts were analyzed using Herron's method of concept analysis. The results show that there are twelve chemical concepts that serve as prerequisites before studying solubility and solubility product, and five core concepts that students must understand within the solubility and solubility product topic. As many as 58.3% of the definitions of the concepts contained in high school textbooks support students' science process skills; the remaining definitions are presented as material to be memorized. Concept attributes that meet the three levels of chemical representation and can be incorporated into a virtual laboratory account for 66.6%. By type of concept, 83.3% are concepts based on principles and 16.6% are concepts that describe processes. Meanwhile, the science process skills that can be developed based on the concept analysis are the abilities to observe, calculate, measure, predict, interpret, hypothesize, apply, classify, and infer.
Reed, Michael F.; Bartholomay, Roy C.
1994-01-01
The U.S. Geological Survey (USGS) Project Office at the Idaho National Engineering Laboratory (INEL), in cooperation with the U.S. Department of Energy and Idaho State University, analyzed 66 samples from sedimentary interbed cores during a 38-month period beginning in October 1990 to determine bulk and clay mineralogy. These cores had been collected from 19 sites in the Big Lost River Basin, 2 sites in the Birch Creek Basin, and 1 site in the Mud Lake Basin, and were archived at the USGS lithologic core library at the INEL. Mineralogy data indicate that the core samples from the Big Lost River Basin have larger mean and median percentages of quartz, total feldspar, and total clay minerals, but smaller mean and median percentages of calcite than the core samples from the Birch Creek Basin. Core samples from the Mud Lake Basin have abundant quartz, total feldspar, calcite, and total clay minerals.
Accuracy of finite-difference modeling of seismic waves : Simulation versus laboratory measurements
NASA Astrophysics Data System (ADS)
Arntsen, B.
2017-12-01
The finite-difference technique for numerical modeling of seismic waves is still important and for some areas extensively used. For exploration purposes, finite-difference simulation is at the core of both traditional imaging techniques such as reverse-time migration and more elaborate Full-Waveform Inversion techniques. The accuracy and fidelity of finite-difference simulation of seismic waves are hard to quantify, and meaningful error analysis is really only easily available for simplistic media. A possible alternative to theoretical error analysis is provided by comparing finite-difference simulated data with laboratory data created using a scale model. The advantage of this approach is the accurate knowledge of the model, within measurement precision, and the location of sources and receivers. We use a model made of PVC immersed in water and containing horizontal and tilted interfaces together with several spherical objects to generate ultrasonic pressure reflection measurements. The physical dimensions of the model are of the order of a meter, which after scaling represents a model with dimensions of the order of 10 kilometers and frequencies in the range of one to thirty hertz. We find that for plane horizontal interfaces the laboratory data can be reproduced by the finite-difference scheme with relatively small error, but for steeply tilted interfaces the error increases. For spherical interfaces the discrepancy between laboratory data and simulated data is sometimes much more severe, to the extent that it is not possible to simulate reflections from parts of highly curved bodies. The results are important in view of the fact that finite-difference modeling is often at the core of imaging and inversion algorithms tackling complicated geological areas with highly curved interfaces.
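For readers unfamiliar with the method, the simplest member of the finite-difference family discussed above is the explicit second-order scheme for the 1-D acoustic wave equation. The sketch below is a generic illustration with invented grid and source parameters, not the scheme or model used in the study.

    # Minimal 1-D acoustic finite-difference scheme (2nd order in time and
    # space).  Grid size, velocity model, and source are illustrative only.
    import numpy as np

    nx, dx = 400, 10.0          # grid points, spacing (m)
    c = np.full(nx, 2000.0)     # P-wave velocity model (m/s)
    dt = 0.8 * dx / c.max()     # time step satisfying the CFL condition
    nt = 500

    u_prev = np.zeros(nx)
    u_curr = np.zeros(nx)
    src_ix = nx // 2

    for it in range(nt):
        lap = np.zeros(nx)
        lap[1:-1] = u_curr[2:] - 2.0 * u_curr[1:-1] + u_curr[:-2]
        u_next = 2.0 * u_curr - u_prev + (c * dt / dx) ** 2 * lap
        # Ricker-like source (10 Hz, 0.1 s delay) injected at the model centre
        t = it * dt
        u_next[src_ix] += (1.0 - 2.0 * (np.pi * 10.0 * (t - 0.1)) ** 2) * \
                          np.exp(-(np.pi * 10.0 * (t - 0.1)) ** 2)
        u_prev, u_curr = u_curr, u_next

    print(f"max |u| after {nt} steps: {np.abs(u_curr).max():.3e}")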
NASA Astrophysics Data System (ADS)
Raef, Abdelmoneam; Gad, Sabreen; Tucker-Kulesza, Stacey
2015-10-01
Seismic site characteristics, as pertaining to earthquake hazard reduction, are a function of the subsurface elastic moduli and the geologic structures. This study explores how multiscale (surface, downhole, and laboratory) datasets can be utilized to improve "constrained" average Vs30 (shear-wave velocity to a 30-meter depth). We integrate borehole, surface, and laboratory measurements for a seismic site classification based on the standards of the National Earthquake Hazard Reduction Program (NEHRP). The seismic shear-wave velocity (Vs30) was derived from a geophysical inversion workflow that utilized multichannel analysis of surface waves (MASW) and downhole acoustic televiewer imaging (DATV). P-wave and S-wave velocities, based on laboratory measurements of arrival times of ultrasonic-frequency signals, supported the workflow by enabling us to calculate Poisson's ratio, which was incorporated in building an initial model for the geophysical inversion of MASW. Extraction of core samples from two boreholes provided lithology and thickness calibration of the amplitudes of the acoustic televiewer imaging for each layer. The MASW inversion, for calculating Vs sections, was constrained with both ultrasonic laboratory measurements (from first arrivals of Vs and Vp waveforms at simulated in situ overburden stress conditions) and the DATV amplitude logs. The Vs30 calculations enabled categorizing the studied site as NEHRP-class "C" - very dense soil and soft rock. Unlike shallow fractured carbonates in the studied area, S-wave and P-wave velocities at ultrasonic frequency for the deeper intact shale core-samples from two boreholes were in better agreement with the corresponding velocities from both a zero-offset vertical seismic profiling (VSP) and inversion of Rayleigh-wave velocity dispersion curves.
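Vs30 itself is the time-averaged shear-wave velocity over the top 30 m, Vs30 = 30 / sum(h_i / Vs_i), where h_i and Vs_i are layer thicknesses and velocities. The sketch below evaluates this definition for a hypothetical layered profile; the values are not those measured at the study site.

    # Vs30 as the time-averaged shear-wave velocity over the top 30 m.
    # Layer thicknesses and velocities below are hypothetical.

    def vs30(thicknesses_m, velocities_m_s, target_depth=30.0):
        travel_time, depth = 0.0, 0.0
        for h, vs in zip(thicknesses_m, velocities_m_s):
            h_used = min(h, target_depth - depth)
            if h_used <= 0.0:
                break
            travel_time += h_used / vs
            depth += h_used
        return depth / travel_time

    layers_h  = [3.0, 7.0, 12.0, 20.0]        # m
    layers_vs = [180.0, 320.0, 520.0, 900.0]  # m/s

    v = vs30(layers_h, layers_vs)
    print(f"Vs30 = {v:.0f} m/s")
    # NEHRP class C corresponds to 360-760 m/s (very dense soil and soft rock).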
Oak Ridge National Laboratory Institutional Plan, FY 1995--FY 2000
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1994-11-01
This report discusses the institutional plan for Oak Ridge National Laboratory for the next five years (1995-2000). Included in this report are the laboratory director's statement; the laboratory mission, vision, and core competencies; the laboratory plan; major laboratory initiatives; scientific and technical programs; critical success factors; summaries of other plans; and resource projections.
Interrelating the breakage and composition of mined and drill core coal
NASA Astrophysics Data System (ADS)
Wilson, Terril Edward
Particle size distribution of coal is important if the coal is to be beneficiated, or if a coal sales contract includes particle size specifications. An exploration bore core sample of coal ought to be reduced from its original cylindrical form to a particle size distribution and particle composition that reflects, insofar as possible, the process stream of raw coal it represents. Often, coal cores are reduced with a laboratory crushing machine, the product of which does not match the raw coal size distribution. This study proceeds from work in coal bore core reduction by Australian investigators. In this study, as differentiated from the Australian work, drop-shatter impact breakage followed by dry batch tumbling in a steel cylinder rotated about its transverse axis are employed to characterize the core material in terms of first-order and zeroth-order breakage rate constants, which are indices of the propensity of the coal to degrade during excavation and handling. Initial drop-shatter and dry tumbling calibrations were done with synthetic cores composed of controlled low-strength concrete incorporating fly ash (as a partial substitute for Portland cement) in order to reduce material variables and conserve difficult-to-obtain coal cores. Cores of three different coalbeds--Illinois No. 6, Upper Freeport, and Pocahontas No. 5--were subjected to drop-shatter and dry batch tumbling tests to determine breakage response. First-order breakage, characterized by a first-order breakage index for each coal, occurred in the drop-shatter tests. First- and zeroth-order breakage occurred in dry batch tumbling; disappearance of coarse particles and creation of fine particles occurred in a systematic way that could be represented mathematically. Certain of the coal cores available for testing were dry and friable. Comparison of coal preparation plant feed with a crushed bore core and a bore core prepared by drop-shatter and tumbling (all from the same Illinois No. 6 coal mining property) indicated that the size distribution and size fraction composition of the drop-shattered/tumbled core more closely resembled the plant feed than the crushed core. An attempt to determine breakage parameters (to allow use of selection and breakage functions and population balance models in the description of bore core size reduction) was initiated. Rank determination of the three coal types was done, indicating that higher rank is associated with higher breakage propensity. The two-step procedure of drop-shatter and dry batch tumbling simulates the first-order (volume breakage) and zeroth-order (abrasion of particle surfaces) breakage that occurs in excavation and handling operations, and is appropriate for drill core reduction prior to laboratory analysis.
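The first-order and zeroth-order breakage description can be reduced to two simple rate laws: the coarse parent size class disappears exponentially with tumbling time, while fines accumulate at a roughly constant rate. The sketch below fits both constants to an invented tumbling data set purely for illustration.

    # Sketch of the first-order / zeroth-order breakage description: coarse
    # parent material disappears exponentially (first order) while fines are
    # created at a roughly constant rate (zeroth order).  Data are invented.
    import numpy as np

    tumble_time_min  = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
    coarse_mass_frac = np.array([1.00, 0.82, 0.66, 0.45, 0.20])  # parent size class
    fines_mass_frac  = np.array([0.00, 0.02, 0.04, 0.08, 0.16])  # finest size class

    # First-order breakage index k1 from ln(M/M0) = -k1 * t
    k1 = -np.polyfit(tumble_time_min, np.log(coarse_mass_frac), 1)[0]

    # Zeroth-order fines-creation rate k0 from F = k0 * t
    k0 = np.polyfit(tumble_time_min, fines_mass_frac, 1)[0]

    print(f"first-order breakage index k1 ~ {k1:.3f} 1/min")
    print(f"zeroth-order fines rate     k0 ~ {k0:.4f} mass fraction/min")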
The Tissue Analysis Core (TAC) within the AIDS and Cancer Virus Program will process, embed, and perform microtomy on fixed tissue samples presented in ethanol. CD4 (DAB) and CD68/CD163 (FastRed) double immunohistochemistry will be performed, allowin
Evaluation of bonded boron/epoxy doublers for commercial aircraft aluminum structures
NASA Technical Reports Server (NTRS)
Belason, Bruce; Rutherford, Paul; Miller, Matthew; Raj, Shreeram
1994-01-01
An 18-month laboratory test and stress analysis program was conducted to evaluate bonded boron/epoxy doublers for repairing cracks on aluminum aircraft structures. The objective was to obtain a core body of substantiating data which will support approval for use on commercial transports of a technology that is being widely used by the military. The data showed that the doublers had excellent performance.
NASA Astrophysics Data System (ADS)
Bogaard, T. A.
2003-04-01
This paper’s objectives are twofold: to test the potential of cation exchange capacity (CEC) analysis for refinement of the knowledge of the hydrological system in landslide areas; and to examine two laboratory CEC analysis techniques for their applicability to partly weathered marls. The NH4Ac and NaCl laboratory techniques are tested. The geochemical results are compared with the core descriptions and interpreted with respect to their usefulness. Both analysis techniques give identical results for CEC, and are plausible on the basis of the available clay content information. The determination of the exchangeable cations was more difficult, since part of the marls dissolved. With the ammonium-acetate method, more of the marls dissolve than with the sodium-chloride method. This negatively affects the results for the exchangeable cations. Therefore, the NaCl method is to be preferred for the determination of the cation fractions at the complex, albeit with the disadvantage that the sodium fraction cannot be determined. To overcome this problem it is recommended to try another salt, e.g., SrCl2, as displacement fluid. Both the Alvera and Boulc-Mondorès examples show transitions in cation composition with depth. It was shown that the exchangeable cation fractions can be useful in locating boundaries between water types, especially the boundary between the superficial, rain-fed hydrological system and the lower, regional ground water system. This information may be important for landslide interventions since the hydrological system and the origin of the water need to be known in detail. It is also plausible that long-term predictions of slope stability may be improved by knowledge of the hydrogeochemical evolution of clayey landslides. In the Boulc-Mondorès example the subsurface information that can be extracted from CEC analyses was presented. In the Boulc-Mondorès cores, intervals of anomalous CEC could be identified. These are interpreted as weathered layers that may develop or have already developed into slip surfaces. The CEC analyses of the cores revealed ‘differences in chemical composition’ that can have an influence on slope stability. It is known that the chemical composition of a soil may have a large effect on the strength parameters of the material. The technique described here can also be used before core sampling for laboratory strength tests. The major problem of the CEC analyses turned out to be the explanation of the origin of the differences found in the core samples. From the above it is concluded that geochemistry is a potentially valuable technique for, e.g., landslide research, but it is recognised that a lot of work still has to be done before the technique can be applied in engineering practice.
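For orientation, the quantity being compared between the two displacement methods is usually reported as a charge sum over the displaced cations in cmol(+)/kg of soil. The sketch below shows that bookkeeping with invented concentrations; the numbers are not from the Alvera or Boulc-Mondorès cores.

    # Sketch: effective CEC as the charge sum of displaced cations, in
    # cmol(+)/kg.  Concentrations are invented for illustration.
    # mg of cation displaced per kg of soil, and mg per cmol of charge
    # (atomic mass / valence * 10 -> mg per cmol of charge).
    measured_mg_per_kg     = {"Ca": 2400.0, "Mg": 360.0, "K": 195.0, "Na": 115.0}
    eq_weight_mg_per_cmolc = {"Ca": 200.4, "Mg": 121.5, "K": 391.0, "Na": 230.0}

    fractions = {ion: measured_mg_per_kg[ion] / eq_weight_mg_per_cmolc[ion]
                 for ion in measured_mg_per_kg}
    cec = sum(fractions.values())

    for ion, frac in fractions.items():
        print(f"{ion}: {frac:5.2f} cmol(+)/kg  ({100 * frac / cec:4.1f} % of complex)")
    print(f"sum of exchangeable cations ~ {cec:.2f} cmol(+)/kg")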
DOE Office of Scientific and Technical Information (OSTI.GOV)
van Poelgeest, F.; Niko, H.; Modwid, A.R.
1991-03-01
Shell Expro and Koninklijke/Shell E and P Laboratorium (KSEPL) have been engaged in a multidisciplinary effort to determine the water flood residual oil saturation (ROS) in two principal reservoirs of the Cormorant oil field in the U.K. sector of the North Sea. Data acquisition included special coring and testing. The study, which involved new reservoir-engineering and petrophysical techniques, was aimed at establishing consistent ROS values. This paper reports that reservoir-engineering work centered on reservoir-condition corefloods in the relative-permeability-at-reservoir-conditions (REPARC) apparatus, in which restoration of representative wettability condition was attempted with the aging technique. Aging results in a consistent reduction of water-wetness of all core samples. The study indicated that ROS values obtained on aged cores at water throughputs of at least 5 PV represented reservoir conditions. The petrophysical part of the study involved ROS estimation from sponge-core analysis and log evaluation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vrieling, P. Douglas
2016-01-01
The Livermore Valley Open Campus (LVOC), a joint initiative of the National Nuclear Security Administration (NNSA), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL), enhances the national security missions of NNSA by promoting greater collaboration between world-class scientists at the national security laboratories, and their partners in industry and academia. Strengthening the science, technology, and engineering (ST&E) base of our nation is one of the NNSA’s top goals. By conducting coordinated and collaborative programs, LVOC enhances both the NNSA and the broader national science and technology base, and helps to ensure the health of core capabilities at LLNL and SNL. These capabilities must remain strong to enable the laboratories to execute their primary mission for NNSA.
The impact of the EUSCLE Core Set Questionnaire for the assessment of cutaneous lupus erythematosus.
Kuhn, A; Patsinakidis, N; Bonsmann, G
2010-08-01
Epidemiological data and standard European guidelines for the diagnosis and treatment of cutaneous lupus erythematosus (CLE) are lacking in the current literature. In order to provide a standardized tool for an extensive consistent data collection, a study group of the European Society of Cutaneous Lupus Erythematosus (EUSCLE) recently developed a Core Set Questionnaire for the assessment of patients with different subtypes of CLE. The EUSCLE Core Set Questionnaire includes six sections on patient data, diagnosis, skin involvement, activity and damage of disease, laboratory analysis, and treatment. An instrument like the EUSCLE Core Set Questionnaire is essential to gain a broad and comparable data collection of patients with CLE from different European centres and to achieve consensus concerning clinical standards for the disease. The data will also be important for further characterization of the different CLE subtypes and the evaluation of therapeutic strategies; moreover, the EUSCLE Core Set Questionnaire might also be useful for the comparison of data in clinical trials. In this review, the impact of the EUSCLE Core Set Questionnaire is discussed in detail with regard to clinical and serological features as well as therapeutic modalities in CLE.
Data for ground-water test hole near Nicolaus, Central Valley aquifer project, California
French, James J.; Page, R.W.; Bertoldi, Gilbert L.
1983-01-01
Preliminary data are provided for the third of seven test holes drilled as a part of the Central Valley Aquifer Project which is part of the National Regional Aquifer Systems Analysis Program. The test hole was drilled in the SW 1/4 NE 1/4 sec. 2, T.12N., R.3E., Sutter County, California, about 1 1/2 miles northwest of the town of Nicolaus. Drilled to a depth of 1,150 feet below land surface, the hole is cased to a depth of 100 feet and equipped with three piezometer tubes to depths of 311, 711, and 1,071 feet. A 5-foot well screen is set in sand at the bottom of each piezometer. Each screened interval has a cement plug above and below it to isolate it from other parts of the aquifer, and the well bore is filled between the plugs with sediment. Thirty-one cores and 34 sidewall cores were recovered. Laboratory tests were made for mineralogy, consolidation, grain-size distribution, Atterberg limits, X-ray diffraction, thermal conductivity, and chemical analysis of water. Geophysical and thermal gradient logs were made. The hole is sampled periodically for chemical analysis of the three tapped zones and measured for water level. This report presents methods used to obtain field samples, laboratory procedures, and the data obtained. (USGS)
Hodges, Mary K.V.; Davis, Linda C.; Bartholomay, Roy C.
2018-01-30
In 1990, the U.S. Geological Survey, in cooperation with the U.S. Department of Energy Idaho Operations Office, established the Lithologic Core Storage Library at the Idaho National Laboratory (INL). The facility was established to consolidate, catalog, and permanently store nonradioactive drill cores and cuttings from subsurface investigations conducted at the INL, and to provide a location for researchers to examine, sample, and test these materials. The facility is open by appointment to researchers for examination, sampling, and testing of cores and cuttings. This report describes the facility and cores and cuttings stored at the facility. Descriptions of cores and cuttings include the corehole names, corehole locations, and depth intervals available. Most cores and cuttings stored at the facility were drilled at or near the INL, on the eastern Snake River Plain; however, two cores drilled on the western Snake River Plain are stored for comparative studies. Basalt, rhyolite, sedimentary interbeds, and surficial sediments compose most cores and cuttings, most of which are continuous from land surface to their total depth. The deepest continuously drilled core stored at the facility was drilled to 5,000 feet below land surface. This report describes procedures and researchers' responsibilities for access to the facility and for examination, sampling, and return of materials.
Kneafsey, T.J.; Lu, H.; Winters, W.; Boswell, R.; Hunter, R.; Collett, T.S.
2011-01-01
Collecting and preserving undamaged core samples containing gas hydrates from depth is difficult because of the pressure and temperature changes encountered upon retrieval. Hydrate-bearing core samples were collected at the BPXA-DOE-USGS Mount Elbert Gas Hydrate Stratigraphic Test Well in February 2007. Coring was performed while using a custom oil-based drilling mud, and the cores were retrieved by a wireline. The samples were characterized and subsampled at the surface under ambient winter arctic conditions. Samples thought to be hydrate bearing were preserved either by immersion in liquid nitrogen (LN), or by storage under methane pressure at ambient arctic conditions, and later depressurized and immersed in LN. Eleven core samples from hydrate-bearing zones were scanned using x-ray computed tomography to examine core structure and homogeneity. Features observed include radial fractures, spalling-type fractures, and reduced density near the periphery. These features were induced during sample collection, handling, and preservation. Isotopic analysis of the methane from hydrate in an initially LN-preserved core and a pressure-preserved core indicates that secondary hydrate formation occurred throughout the pressurized core, whereas none occurred in the LN-preserved core; however, no hydrate was found near the periphery of the LN-preserved core. To replicate some aspects of the preservation methods, natural and laboratory-made saturated porous media samples were frozen in a variety of ways, with radial fractures observed in some LN-frozen sands, and needle-like ice crystals forming in slowly frozen clay-rich sediments. Suggestions for hydrate-bearing core preservation are presented.
Computed Tomography Scanning and Geophysical Measurements of Core from the Coldstream 1MH Well
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crandall, Dustin M.; Brown, Sarah; Moore, Johnathan E.
The computed tomography (CT) facilities and the Multi-Sensor Core Logger (MSCL) at the National Energy Technology Laboratory (NETL) Morgantown, West Virginia site were used to characterize core of the Marcellus Shale from a vertical well, the Coldstream 1MH Well in Clearfield County, PA. The core consists primarily of the Marcellus Shale from a depth of 7,002 to 7,176 ft. The primary impetus of this work is a collaboration between West Virginia University (WVU) and NETL to characterize core from multiple wells to better understand the structure and variation of the Marcellus and Utica shale formations. As part of this effort, bulk scans of core were obtained from the Coldstream 1MH well, provided by the Energy Corporation of America (now Greylock Energy). This report, and the associated scans, provide detailed datasets not typically available from unconventional shales for analysis. The resultant datasets are presented in this report, and can be accessed from NETL's Energy Data eXchange (EDX) online system using the following link: https://edx.netl.doe.gov/dataset/coldstream-1mh-well. All equipment and techniques used were non-destructive, enabling future examinations to be performed on these cores. None of the equipment used was suitable for direct visualization of the shale pore space, although fractures and discontinuities were detectable with the methods tested. Low resolution CT imagery with the NETL medical CT scanner was performed on the entire core. Qualitative analysis of the medical CT images, coupled with x-ray fluorescence (XRF), P-wave, and magnetic susceptibility measurements from the MSCL were useful in identifying zones of interest for more detailed analysis as well as fractured zones. En echelon fractures were observed at 7,100 ft and were CT scanned using NETL’s industrial CT scanner at higher resolution. The ability to quickly identify key areas for more detailed study with higher resolution will save time and resources in future studies. The combination of methods used provided a multi-scale analysis of this core and provides both a macro and micro description of the core that is relevant for many subsurface energy-related examinations that have traditionally been performed at NETL.
Dorfman, David M; Bui, Marilyn M; Tubbs, Raymond R; Hsi, Eric D; Fitzgibbons, Patrick L; Linden, Michael D; Rickert, Robert R; Roche, Patrick C
2006-06-01
We have developed tissue microarray-based surveys to allow laboratories to compare their performance in staining predictive immunohistochemical markers, including proto-oncogene CD117 (c-kit), which is characteristically expressed in gastrointestinal stromal tumors (GISTs). GISTs exhibit activating mutations in the c-kit proto-oncogene, which render them amenable to treatment with imatinib mesylate. Consequently, correct identification of c-Kit expression is important for the diagnosis and treatment of GISTs. To analyze CD117 immunohistochemical staining performance by a large number of clinical laboratories. A mechanical device was used to construct tissue microarrays consisting of 3 x 1-mm cores of 10 tumor samples, which can be used to generate hundreds of tissue sections from the arrayed cases, suitable for large-scale interlaboratory comparison of immunohistochemical staining. An initial survey of 63 laboratories and a second survey of 90 laboratories, performed in 2004 and 2005, exhibited >81% concordance for 7 of 10 cores, including all 4 GIST cases, which were immunoreactive for CD117 with >95% staining concordance. Three of the cores achieved less than 81% concordance of results, possibly due to the presence of foci of necrosis in one core and CD117-positive mast cells in 2 cores of CD117-negative neoplasms. There was good performance among a large number of laboratories performing CD117 immunohistochemical staining, with consistently higher concordance of results for CD117-positive GIST cases than for nonimmunoreactive cases. Tissue microarrays for CD117 and other predictive markers should be useful for interlaboratory comparisons, quality assurance, and education of participants regarding staining nuances such as the expression of CKIT by nonneoplastic mast cells.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gauntt, Randall O.; Mattie, Patrick D.
Sandia National Laboratories (SNL) has conducted an uncertainty analysis (UA) on the Fukushima Daiichi unit (1F1) accident progression with the MELCOR code. The model used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). That study focused on reconstructing the accident progressions, as postulated by the limited plant data. This work was a focused evaluation of uncertainty in core damage progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, reactor damage state, fraction of intact fuel, vessel lower head failure). The primary intent of this study was to characterize the range of predicted damage states in the 1F1 reactor considering state of knowledge uncertainties associated with MELCOR modeling of core damage progression and to generate information that may be useful in informing the decommissioning activities that will be employed to defuel the damaged reactors at the Fukushima Daiichi Nuclear Power Plant. Additionally, core damage progression variability inherent in MELCOR modeling numerics is investigated.
Breault, Ronald W.; Monazam, Esmail R.
2015-04-01
Chemical looping combustion is a promising technology for the capture of CO2 that uses redox materials as oxygen carriers. In this study, the effects of reduction conditions, namely temperature and fuel partial pressure, on the conversion products are investigated. The experiments were conducted in a laboratory fixed-bed reactor that was operated cyclically with alternating reduction and oxidation periods. Reactions are assumed to occur in the shell surrounding the particle grains with diffusion of oxygen to the surface from the grain core. Activation energies for the shell and core reactions range from 9 to 209 kJ/mol depending on the reaction step.
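The temperature dependence of the shell and grain-core reaction steps follows the Arrhenius form k = A exp(-Ea / RT). The sketch below evaluates that expression at the extremes of the 9-209 kJ/mol range quoted above; the pre-exponential factors and temperatures are invented for illustration.

    # Arrhenius scaling of the shell and grain-core reaction steps.  The
    # activation energies span the 9-209 kJ/mol range quoted in the abstract;
    # pre-exponential factors and temperatures are invented.
    import math

    R = 8.314  # J/(mol K)

    def arrhenius(A, Ea_J_mol, T_K):
        return A * math.exp(-Ea_J_mol / (R * T_K))

    steps = {
        "shell reaction (low barrier)": (1.0e2, 9.0e3),     # (A [1/s], Ea [J/mol])
        "core reaction (high barrier)": (1.0e10, 209.0e3),
    }

    for T in (1073.0, 1173.0, 1273.0):  # 800, 900, 1000 deg C
        rates = {name: arrhenius(A, Ea, T) for name, (A, Ea) in steps.items()}
        print(f"T = {T - 273.15:.0f} C: " +
              ", ".join(f"{name} k = {k:.2e} 1/s" for name, k in rates.items()))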
Publications - GMC 416 | Alaska Division of Geological & Geophysical Surveys
DGGS GMC 416 Publication Details. Title: Total organic carbon and rock-eval pyrolysis of core and core ... Resolution Inc. Analytical Laboratories, 2013. Table(s): gmc416.xls (44.0 K). Keywords: Organic Chemistry. Department of Natural Resources
Generation and Characterization of States of Matter at Solar Core Conditions
NASA Astrophysics Data System (ADS)
Bachmann, Benjamin
2016-10-01
The equation-of-state (EOS) of matter at solar core conditions is important to stellar evolution models and understanding the origin of high-Z elements. Temperatures, densities, and pressures of stellar cores are, however, orders of magnitude greater than those obtained in state-of-the-art laboratory EOS experiments, and therefore such conditions have been limited to observational astronomy and theoretical models. Here we present a method to generate and diagnose these conditions in the laboratory, which is the first step towards characterizing the EOS of such extreme states of matter. By launching a converging shock wave into a deuterated plastic sphere (CD2) we produce solar core conditions (R/RSun < 0.2) which are initiated when the shock reaches the center of the CD2 sphere and extend during transit of the reflected wave until the temperature drops to a level where the neutron production and x-ray self emission drop below threshold levels of the detectors. These conditions are diagnosed by both the neutron spectral data from D-D nuclear reactions and temporal, spatial, and spectral x-ray emission data. We will discuss how these observables can be measured and used to help our understanding of dense plasma states that reach well into the thermonuclear regime of stellar cores. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was supported by Laboratory Directed Research and Development Grant No. 13-ERD-073.
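One of the neutron-based observables mentioned above is the thermal broadening of the 2.45 MeV D-D neutron line, which is commonly related to ion temperature through the Brysk scaling FWHM [keV] ≈ 82.5 sqrt(Ti [keV]). The sketch below inverts that relation for an invented spectrometer measurement; it is not data from the experiment described.

    # Sketch of inferring an ion temperature from the Doppler-broadened width
    # of the 2.45 MeV D-D neutron line, using the commonly quoted Brysk
    # scaling FWHM [keV] ~ 82.5 * sqrt(Ti [keV]).  The measured width below is
    # invented.

    def ion_temperature_keV(fwhm_keV, coefficient=82.5):
        """Invert FWHM = coefficient * sqrt(Ti) for the D-D neutron line."""
        return (fwhm_keV / coefficient) ** 2

    measured_fwhm_keV = 95.0  # hypothetical spectrometer measurement
    ti = ion_temperature_keV(measured_fwhm_keV)
    print(f"inferred ion temperature ~ {ti:.2f} keV")  # ~1.3 keV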
Digital Core Modelling for Clastic Oil and Gas Reservoir
NASA Astrophysics Data System (ADS)
Belozerov, I.; Berezovsky, V.; Gubaydullin, M.; Yur’ev, A.
2018-05-01
"Digital core" is a multi-purpose tool for solving a variety of tasks in the field of geological exploration and production of hydrocarbons at various stages, designed to improve the accuracy of geological study of subsurface resources, the efficiency of reproduction and use of mineral resources, as well as applying the results obtained in production practice. The actuality of the development of the "Digital core" software is that even a partial replacement of natural laboratory experiments with mathematical modelling can be used in the operative calculation of reserves in exploratory drilling, as well as in the absence of core material from wells. Or impossibility of its research by existing laboratory methods (weakly cemented, loose, etc. rocks). 3D-reconstruction of the core microstructure can be considered as a cheap and least time-consuming method for obtaining petrophysical information about the main filtration-capacitive properties and fluid motion in reservoir rocks.
Hodges, Mary K.V.; Orr, Stephanie M.; Potter, Katherine E.; LeMaitre, Tynan
2012-01-01
This report, prepared in cooperation with the U.S. Department of Energy, summarizes construction, geophysical, and lithologic data collected from about 4,509 feet of core from seven boreholes deepened or drilled by the U.S. Geological Survey (USGS), Idaho National Laboratory (INL) Project Office, from 2006 to 2009 at the INL. USGS 103, 105, 108, and 131 were deepened and cored from 759 to 1,307 feet, 800 to 1,409 feet, 760 to 1,218 feet, and 808 to 1,239 feet, respectively. Boreholes USGS 135, NRF-15, and NRF-16 were drilled and continuously cored from land surface to 1,198, 759, and 425 feet, respectively. Cores were photographed and digitally logged by using commercially available software. Borehole descriptions summarize location, completion date, and amount and type of core recovered.
Twining, Brian V.; Hodges, Mary K.V.; Orr, Stephanie
2008-01-01
This report summarizes construction, geophysical, and lithologic data collected from ten U.S. Geological Survey (USGS) boreholes completed between 1999 and 2006 at the Idaho National Laboratory (INL): USGS 126a, 126b, 127, 128, 129, 130, 131, 132, 133, and 134. Nine boreholes were continuously cored; USGS 126b had 5 ft of core. Completion depths range from 472 to 1,238 ft. Geophysical data were collected for each borehole, and those data are summarized in this report. Cores were photographed and digitally logged using commercially available software. Digital core logs are in appendixes A through J. Borehole descriptions summarize location, completion date, and amount and type of core recovered. This report was prepared by the USGS in cooperation with the U.S. Department of Energy (DOE).
The Viral Evolution Core within the AIDS and Cancer Virus Program will extract viral RNA/DNA from cell-free or cell-associated samples. Complementary (cDNA) will be generated as needed, and cDNA or DNA will be diluted to a single copy prior to nested
Protein Changes in Macrophages Induced by Plasma from Rats Exposed to 35-GHz Millimeter Waves
2010-12-01
Human Effectiveness Directorate, Air Force Research Laboratory, Brooks City-Base, Texas. A macrophage assay and proteomic screening were used to ... mW/cm2 until core temperature reached 41.0 °C. Two-dimensional polyacrylamide gel electrophoresis, image analysis, and Western blotting were used to ... stimulation. Proteins of interest were identified using peptide mass fingerprinting. Compared to plasma from sham-exposed rats, plasma from
Sodium Based Heat Pipe Modules for Space Reactor Concepts: Stainless Steel SAFE-100 Core
NASA Technical Reports Server (NTRS)
Martin, James J.; Reid, Robert S.
2004-01-01
A heat pipe cooled reactor is one of several candidate reactor cores being considered for advanced space power and propulsion systems to support future space exploration applications. Long-life heat pipe modules, with designs verified through a combination of theoretical analysis and experimental lifetime evaluations, would be necessary to establish the viability of any of these candidates, including the heat pipe reactor option. A hardware-based program was initiated to establish the infrastructure necessary to build heat pipe modules. This effort, initiated by Los Alamos National Laboratory and referred to as the Safe Affordable Fission Engine (SAFE) project, set out to fabricate and perform non-nuclear testing on a modular heat pipe reactor prototype that can provide 100 kilowatts from the core to an energy conversion system at 700 C. Prototypic heat pipe hardware was designed, fabricated, filled, closed-out, and acceptance tested.
Szymanski, Jacek; Wilson, David L; Zhang, Guo-Qiang
2009-10-01
The rapid expansion of biomedical research has brought substantial scientific and administrative data management challenges to modern core facilities. Scientifically, a core facility must be able to manage experimental workflow and the corresponding set of large and complex scientific data. It must also disseminate experimental data to relevant researchers in a secure and expedient manner that facilitates collaboration and provides support for data interpretation and analysis. Administratively, a core facility must be able to manage the scheduling of its equipment and to maintain a flexible and effective billing system to track material, resource, and personnel costs and charge for services to sustain its operation. It must also have the ability to regularly monitor the usage and performance of its equipment and to provide summary statistics on resources spent on different categories of research. To address these informatics challenges, we introduce a comprehensive system called MIMI (multimodality, multiresource, information integration environment) that integrates the administrative and scientific support of a core facility into a single web-based environment. We report the design, development, and deployment experience of a baseline MIMI system at an imaging core facility and discuss the general applicability of such a system in other types of core facilities. These initial results suggest that MIMI will be a unique, cost-effective approach to addressing the informatics infrastructure needs of core facilities and similar research laboratories.
NASA Astrophysics Data System (ADS)
Bonaccorsi, Rosalba; Stoker, Carol R.
2008-10-01
Science results from a field-simulated lander payload and post-mission laboratory investigations provided "ground truth" to interpret remote science observations made as part of the 2005 Mars Astrobiology Research and Technology Experiment (MARTE) drilling mission simulation. The experiment was successful in detecting evidence for life, habitability, and preservation potential of organics in a relevant astrobiological analogue of Mars. Science results. Borehole 7 was drilled near the Río Tinto headwaters at Peña de Hierro (Spain) in the upper oxidized remnant of an acid rock drainage system. Analysis of 29 cores (215 cm of core was recovered from 606 cm penetrated depth) revealed a matrix of goethite- (42-94%) and hematite-rich (47-87%) rocks with pockets of phyllosilicates (47-74%) and fine- to coarse-grained loose material. Post-mission X-ray diffraction (XRD) analysis confirmed the range of hematite:goethite mixtures that were visually recognizable (˜1:1, ˜1:2, and ˜1:3 mixtures displayed a yellowish-red color whereas 3:1 mixtures displayed a dark reddish-brown color). Organic carbon was poorly preserved in hematite/goethite-rich materials (Corg <0.05 wt %) beneath the biologically active organic-rich soil horizon (Corg ˜3-11 wt %) in contrast to the phyllosilicate-rich zones (Corg ˜0.23 wt %). Ground truth vs. remote science analysis. Laboratory-based analytical results were compared to the analyses obtained by a Remote Science Team (RST) using a blind protocol. Ferric iron phases, lithostratigraphy, and inferred geologic history were correctly identified by the RST with the exception of phyllosilicate-rich materials that were misinterpreted as weathered igneous rock. Adenosine 5‧-triphosphate (ATP) luminometry, a tool available to the RST, revealed ATP amounts above background noise, i.e., 278-876 Relative Luminosity Units (RLUs) in only 6 cores, whereas organic carbon was detected in all cores. Our manned vs. remote observations based on automated acquisitions during the project provide insights for the preparation of future astrobiology-driven Mars missions.
Bonaccorsi, Rosalba; Stoker, Carol R
2008-10-01
Science results from a field-simulated lander payload and post-mission laboratory investigations provided "ground truth" to interpret remote science observations made as part of the 2005 Mars Astrobiology Research and Technology Experiment (MARTE) drilling mission simulation. The experiment was successful in detecting evidence for life, habitability, and preservation potential of organics in a relevant astrobiological analogue of Mars. SCIENCE RESULTS: Borehole 7 was drilled near the Río Tinto headwaters at Peña de Hierro (Spain) in the upper oxidized remnant of an acid rock drainage system. Analysis of 29 cores (215 cm of core was recovered from 606 cm penetrated depth) revealed a matrix of goethite- (42-94%) and hematite-rich (47-87%) rocks with pockets of phyllosilicates (47-74%) and fine- to coarse-grained loose material. Post-mission X-ray diffraction (XRD) analysis confirmed the range of hematite:goethite mixtures that were visually recognizable (approximately 1:1, approximately 1:2, and approximately 1:3 mixtures displayed a yellowish-red color whereas 3:1 mixtures displayed a dark reddish-brown color). Organic carbon was poorly preserved in hematite/goethite-rich materials (C(org) <0.05 wt %) beneath the biologically active organic-rich soil horizon (C(org) approximately 3-11 wt %) in contrast to the phyllosilicate-rich zones (C(org) approximately 0.23 wt %). GROUND TRUTH VS. REMOTE SCIENCE ANALYSIS: Laboratory-based analytical results were compared to the analyses obtained by a Remote Science Team (RST) using a blind protocol. Ferric iron phases, lithostratigraphy, and inferred geologic history were correctly identified by the RST with the exception of phyllosilicate-rich materials that were misinterpreted as weathered igneous rock. Adenosine 5'-triphosphate (ATP) luminometry, a tool available to the RST, revealed ATP amounts above background noise, i.e., 278-876 Relative Luminosity Units (RLUs) in only 6 cores, whereas organic carbon was detected in all cores. Our manned vs. remote observations based on automated acquisitions during the project provide insights for the preparation of future astrobiology-driven Mars missions.
Mitsuhata, Yuji; Nishiwaki, Junko; Kawabe, Yoshishige; Utsuzawa, Shin; Jinguuji, Motoharu
2010-01-01
Non-destructive measurements of contaminated soil core samples are desirable prior to destructive measurements because they allow obtaining gross information from the core samples without touching harmful chemical species. Medical X-ray computed tomography (CT) and time-domain low-field nuclear magnetic resonance (NMR) relaxometry were applied to non-destructive measurements of sandy soil core samples from a real site contaminated with heavy oil. The medical CT visualized the spatial distribution of the bulk density averaged over the voxel of 0.31 × 0.31 × 2 mm3. The obtained CT images clearly showed an increase in the bulk density with increasing depth. Coupled analysis with in situ time-domain reflectometry logging suggests that this increase is derived from an increase in the water volume fraction of soils with depth (i.e., unsaturated to saturated transition). This was confirmed by supplementary analysis using high-resolution micro-focus X-ray CT at a resolution of ∼10 μm, which directly imaged the increase in pore water with depth. NMR transverse relaxation waveforms of protons were acquired non-destructively at 2.7 MHz by the Carr–Purcell–Meiboom–Gill (CPMG) pulse sequence. The nature of viscous petroleum molecules having short transverse relaxation times (T2) compared to water molecules enabled us to distinguish the water-saturated portion from the oil-contaminated portion in the core sample using an M0–T2 plot, where M0 is the initial amplitude of the CPMG signal. The present study demonstrates that non-destructive core measurements by medical X-ray CT and low-field NMR provide information on the groundwater saturation level and oil-contaminated intervals, which is useful for constructing an adequate plan for subsequent destructive laboratory measurements of cores. PMID:21258437
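The M0-T2 plot described above rests on fitting each CPMG echo train with M(t) = M0 exp(-t/T2). The sketch below does this for a synthetic mono-exponential decay; real contaminated-core data would normally require a multi-exponential (T2 distribution) inversion rather than a single component.

    # Minimal sketch of extracting M0 and T2 from a CPMG decay,
    # M(t) = M0 * exp(-t / T2), as used for the M0-T2 plot described above.
    # The decay below is synthetic.
    import numpy as np
    from scipy.optimize import curve_fit

    def cpmg(t, m0, t2):
        return m0 * np.exp(-t / t2)

    # Synthetic echo train: echo spacing 1 ms, true M0 = 100, true T2 = 50 ms
    t = np.arange(1, 301) * 1.0e-3          # s
    rng = np.random.default_rng(0)
    signal = cpmg(t, 100.0, 0.050) + rng.normal(0.0, 1.0, t.size)

    (m0_fit, t2_fit), _ = curve_fit(cpmg, t, signal, p0=(signal[0], 0.02))
    print(f"M0 ~ {m0_fit:.1f} (arb.), T2 ~ {1e3 * t2_fit:.1f} ms")
    # Short fitted T2 values (viscous oil) plot apart from long T2 (water) on
    # the M0-T2 cross-plot used to flag contaminated intervals.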
Biospecimen Core Resource - TCGA
The Cancer Genome Atlas (TCGA) Biospecimen Core Resource centralized laboratory reviews and processes blood and tissue samples and their associated data using optimized standard operating procedures for the entire TCGA Research Network.
Reproducibility in Data-Scarce Environments
NASA Astrophysics Data System (ADS)
Darch, P. T.
2016-12-01
Among the usual requirements for reproducibility are large volumes of data and computationally intensive methods. Many fields within earth sciences, however, do not meet these requirements. Data are scarce and data-intensive methods are not well established. How can science be reproducible under these conditions? What changes, both infrastructural and cultural, are needed to advance reproducibility? This paper presents findings from a long-term social scientific case study of an emergent and data scarce field, the deep subseafloor biosphere. This field studies interactions between microbial communities living in the seafloor and the physical environments they inhabit. Factors such as these make reproducibility seem a distant goal for this community: - The relative newness of the field. Serious study began in the late 1990s; - The highly multidisciplinary nature of the field. Researchers come from a range of physical and life science backgrounds; - Data scarcity. Domain researchers produce much of these data in their own onshore laboratories by analyzing cores from international ocean drilling expeditions. Allocation of cores is negotiated between researchers from many fields. These factors interact in multiple ways to inhibit reproducibility: - Incentive structures emphasize producing new data and new knowledge rather than reanalysing extant data; - Only a few steps of laboratory analyses can be reproduced - such as analysis of DNA sequences, but not extraction of DNA from cores - due to scarcity of cores; - Methodological heterogeneity is a consequence of multidisciplinarity, as researchers bring different techniques from diverse fields. - Few standards for data collection or analysis are available at this early stage of the field; - While datasets from multiple biological and physical phenomena can be integrated into a single workflow, curation tends to be divergent. Each type of dataset may be subject to different disparate policies and contributed to different databases. Our study demonstrates that data scarcity can be particularly acute in emerging scientific fields, and often results from resource scarcity more generally. Reproducibility tends to be a low priority among the many other scientific challenges they face.
Massonnet, Catherine; Vile, Denis; Fabre, Juliette; Hannah, Matthew A.; Caldana, Camila; Lisec, Jan; Beemster, Gerrit T.S.; Meyer, Rhonda C.; Messerli, Gaëlle; Gronlund, Jesper T.; Perkovic, Josip; Wigmore, Emma; May, Sean; Bevan, Michael W.; Meyer, Christian; Rubio-Díaz, Silvia; Weigel, Detlef; Micol, José Luis; Buchanan-Wollaston, Vicky; Fiorani, Fabio; Walsh, Sean; Rinn, Bernd; Gruissem, Wilhelm; Hilson, Pierre; Hennig, Lars; Willmitzer, Lothar; Granier, Christine
2010-01-01
A major goal of the life sciences is to understand how molecular processes control phenotypes. Because understanding biological systems relies on the work of multiple laboratories, biologists implicitly assume that organisms with the same genotype will display similar phenotypes when grown in comparable conditions. We investigated to what extent this holds true for leaf growth variables and metabolite and transcriptome profiles of three Arabidopsis (Arabidopsis thaliana) genotypes grown in 10 laboratories using a standardized and detailed protocol. A core group of four laboratories generated similar leaf growth phenotypes, demonstrating that standardization is possible. But some laboratories presented significant differences in some leaf growth variables, sometimes changing the genotype ranking. Metabolite profiles derived from the same leaf displayed a strong genotype × environment (laboratory) component. Genotypes could be separated on the basis of their metabolic signature, but only when the analysis was limited to samples derived from one laboratory. Transcriptome data revealed considerable plant-to-plant variation, but the standardization ensured that interlaboratory variation was not considerably larger than intralaboratory variation. The different impacts of the standardization on phenotypes and molecular profiles could result from differences of temporal scale between processes involved at these organizational levels. Our findings underscore the challenge of describing, monitoring, and precisely controlling environmental conditions but also demonstrate that dedicated efforts can result in reproducible data across multiple laboratories. Finally, our comparative analysis revealed that small variations in growing conditions (light quality principally) and handling of plants can account for significant differences in phenotypes and molecular profiles obtained in independent laboratories. PMID:20200072
Using data to make decisions and drive results: a LEAN implementation strategy.
Panning, Rick
2005-03-28
During the process of facility planning, Fairview Laboratory Services utilized LEAN manufacturing to maximize efficiency, simplify processes, and improve laboratory support of patient care services. By incorporating the LEAN program's concepts in our pilot program, we were able to reduce turnaround time by 50%, improve productivity by greater than 40%, reduce costs by 31%, save more than 440 square feet of space, standardize work practices, reduce errors and error potential, continuously measure performance, eliminate excess unused inventory and visual noise, and cross-train 100% of staff in the core laboratory. In addition, we trained a core team of people that is available to coordinate future LEAN projects in the laboratory and other areas of the organization.
Bedell, T Aaron; Hone, Graham A B; Valette, Damien; Yu, Jin-Quan; Davies, Huw M L; Sorensen, Erik J
2016-07-11
Methods for functionalizing carbon-hydrogen bonds are featured in a new synthesis of the tricyclic core architecture that characterizes the indoxamycin family of secondary metabolites. A unique collaboration between three laboratories has engendered a design for synthesis featuring two sequential C-H functionalization reactions, namely a diastereoselective dirhodium carbene insertion followed by an ester-directed oxidative Heck cyclization, to rapidly assemble the congested tricyclic core of the indoxamycins. This project exemplifies how multi-laboratory collaborations can foster conceptually novel approaches to challenging problems in chemical synthesis. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Zahasky, Christopher; Benson, Sally M.
2018-05-01
Accurate descriptions of heterogeneity in porous media are important for understanding and modeling single phase (e.g. contaminant transport, saltwater intrusion) and multiphase (e.g. geologic carbon storage, enhanced oil recovery) transport problems. Application of medical imaging to experimentally quantify these processes has led to significant progress in material characterization and understanding fluid transport behavior at laboratory scales. While widely utilized in cancer diagnosis and management, cardiology, and neurology, positron emission tomography (PET) has had relatively limited applications in earth science. This study utilizes a small-bore micro-PET scanner to image and quantify the transport behavior of pulses of a conservative aqueous radiotracer injected during single and multiphase flow experiments in two heterogeneous Berea sandstone cores. The cores are discretized into axial-parallel streamtubes, and using the reconstructed micro-PET data, expressions are derived from spatial moment analysis for calculating sub-core tracer flux and pore water velocity. Using the flux and velocity measurements, it is possible to calculate porosity and saturation from volumetric flux balance, and calculate permeability and water relative permeability from Darcy's law. Second spatial moment analysis enables measurement of sub-core solute dispersion during both single phase and multiphase experiments. A numerical simulation model is developed to verify the assumptions of the streamtube dimension reduction technique. A variation of the reactor ratio is presented as a diagnostic metric to efficiently determine the validity of the streamtube approximation in core and column-scale experiments. This study introduces a new method to quantify sub-core permeability, relative permeability, and dispersion. These experimental and analytical methods provide a foundation for future work on experimental measurements of differences in transport behavior across scales.
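The streamtube analysis sketched in the abstract boils down to two steps: the slope of the tracer centroid position versus time (first spatial moment) gives a pore-water velocity, and Darcy's law converts the corresponding flux into a permeability. The code below illustrates those two steps with invented numbers; it is not the authors' processing chain.

    # Sketch of the moment-based analysis: the first spatial moment (centroid)
    # of the imaged tracer pulse gives a mean pore-water velocity per
    # streamtube, and Darcy's law converts flux to permeability.
    # All numbers are invented for illustration.
    import numpy as np

    # Tracer centroid position along the core axis at successive PET frames:
    frame_time_s = np.array([0.0, 300.0, 600.0, 900.0, 1200.0])
    centroid_m   = np.array([0.010, 0.028, 0.047, 0.065, 0.083])

    pore_velocity = np.polyfit(frame_time_s, centroid_m, 1)[0]   # m/s
    porosity = 0.20
    darcy_flux = pore_velocity * porosity                        # q = v * phi

    # Darcy's law: q = (k / mu) * dP/dx  ->  k = q * mu * L / dP
    mu, L, dP = 1.0e-3, 0.10, 2.0e4       # Pa*s, m, Pa (hypothetical)
    k = darcy_flux * mu * L / dP
    print(f"pore velocity ~ {pore_velocity:.2e} m/s, "
          f"permeability ~ {k / 9.87e-13 * 1e3:.1f} mD")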
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The Department of Energy Order DOE 5000.4A establishes DOE's policy and guidelines regarding Laboratory Directed Research and Development (LDRD) at its multiprogram laboratories. As described in 5000.4A, LDRD is "research and development of a creative and innovative nature which is selected by the Laboratory Director or his or her designee, for the purpose of maintaining the scientific and technological vitality of the Laboratory and to respond to scientific and technological opportunities in conformance with the guidelines in this Order. LDRD includes activities previously defined as ER&D, as well as other discretionary research and development activities not provided for in a DOE program." Consistent with the Mission Statement and Strategic Plan provided in PNL's Institutional Plan, the LDRD investments are focused on developing new and innovative approaches in research related to our "core competencies." Currently, PNL's core competencies have been identified as integrated environmental research; process technology; and energy systems research. In this report, the individual summaries of Laboratory-level LDRD projects are organized according to these core competencies. The largest proportion of Laboratory-level LDRD funds is allocated to the core competency of integrated environmental research. A significant proportion of PNL's LDRD funds is also allocated to projects within the various research centers that are proposed by individual researchers or small research teams. The projects are described in Section 2.0. The projects described in this report represent PNL's investment in its future and are vital to maintaining the ability to develop creative solutions for the scientific and technical challenges faced by DOE and the nation. In accordance with DOE guidelines, the report provides an overview of PNL's LDRD program and the management process used for the program and project summaries for each LDRD project.
PDS Archive Release of Apollo 11, Apollo 12, and Apollo 17 Lunar Rock Sample Images
NASA Technical Reports Server (NTRS)
Garcia, P. A.; Stefanov, W. L.; Lofgren, G. E.; Todd, N. S.; Gaddis, L. R.
2013-01-01
Scientists at the Johnson Space Center (JSC) Lunar Sample Laboratory, Information Resources Directorate, and Image Science & Analysis Laboratory have been working to digitize (scan) the original film negatives of Apollo Lunar Rock Sample photographs [1, 2]. The rock samples, and associated regolith and lunar core samples, were obtained during the Apollo 11, 12, 14, 15, 16 and 17 missions. The images allow scientists to view the individual rock samples in their original or subdivided state prior to requesting physical samples for their research. In cases where access to the actual physical samples is not practical, the images provide an alternate mechanism for study of the subject samples. As the negatives are being scanned, they have been formatted and documented for permanent archive in the NASA Planetary Data System (PDS). The Astromaterials Research and Exploration Science Directorate (which includes the Lunar Sample Laboratory and Image Science & Analysis Laboratory) at JSC is working collaboratively with the Imaging Node of the PDS on the archiving of these valuable data. The PDS Imaging Node is now pleased to announce the release of the image archives for Apollo missions 11, 12, and 17.
ERIC Educational Resources Information Center
Sampson, Victor; Enderle, Patrick; Grooms, Jonathon; Witte, Shelbie
2013-01-01
This study examined how students' science-specific argumentative writing skills and understanding of core ideas changed over the course of a school year as they participated in a series of science laboratories designed using the Argument-Driven Inquiry (ADI) instructional model. The ADI model is a student-centered and writing-intensive approach to…
NASA Astrophysics Data System (ADS)
Sou, In Mei; Calantoni, Joseph; Reed, Allen; Furukawa, Yoko
2012-11-01
A synchronized dual stereo particle image velocimetry (PIV) measurement technique is used to examine the erosion process of a cohesive sediment core in the Small Oscillatory Flow Tunnel (S-OFT) in the Sediment Dynamics Laboratory at the Naval Research Laboratory, Stennis Space Center, MS. The dual stereo PIV windows were positioned on either side of a sediment core inserted along the centerline of the S-OFT allowing for a total measurement window of about 20 cm long by 10 cm high with sub-millimeter spacing on resolved velocity vectors. The period of oscillation ranged from 2.86 to 6.12 seconds with constant semi-excursion amplitude in the test section of 9 cm. During the erosion process, Kelvin-Helmholtz instabilities were observed as the flow accelerated in each direction and eventually were broken down when the flow reversed. The relative concentration of suspended sediments under different flow conditions was estimated using the intensity of light scattered from the sediment particles in suspension. By subtracting the initial light scattered from the core, the residual light intensity was assumed to be scattered from suspended sediments eroded from the core. Results from two different sediment core samples of mud and sand mixtures will be presented.
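As a rough illustration of the background-subtraction step described above, the sketch below (illustrative array names and normalization, not the authors' processing chain) removes the light scattered by the undisturbed core from each image so the residual intensity can be read as a relative suspended-sediment concentration.

```python
# Hedged sketch of the intensity-subtraction idea: background scattering from
# the core/bed is removed so that the residual intensity scales with the
# relative concentration of suspended sediment.  All values are synthetic.
import numpy as np

def relative_concentration(frames, background):
    """frames: (n_frames, ny, nx) image intensities; background: (ny, nx) intensity
    recorded before erosion begins.  Returns values scaled to the brightest residual."""
    residual = np.clip(frames - background, 0.0, None)    # remove scattering from the core itself
    return residual / max(residual.max(), 1e-12)          # relative (not absolute) concentration

rng = np.random.default_rng(0)
background = 20.0 + 5.0 * rng.random((64, 64))
frames = background + 30.0 * rng.random((8, 64, 64))      # stand-in for PIV images during erosion
c_rel = relative_concentration(frames, background)
print(c_rel.shape, float(c_rel.max()))
```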
Application of the Toyota Production System improves core laboratory operations.
Rutledge, Joe; Xu, Min; Simpson, Joanne
2010-01-01
To meet the increased clinical demands of our hospital expansion, improve quality, and reduce costs, our tertiary care, pediatric core laboratory used the Toyota Production System lean processing to reorganize our 24-hour, 7 d/wk core laboratory. A 4-month, consultant-driven process removed waste, led to a physical reset of the space to match the work flow, and developed a work cell for our random access analyzers. In addition, visual controls, single piece flow, standard work, and "5S" were instituted. The new design met our goals as reflected by achieving and maintaining improved turnaround time (TAT; mean for creatinine reduced from 54 to 23 minutes) with increased testing volume (20%), monetary savings (4 full-time equivalents), decreased variability in TAT, and better space utilization (25% gain). The project had the unanticipated consequence of eliminating STAT testing because our in-laboratory TAT for routine testing was less than our prior STAT turnaround goal. The viability of this approach is demonstrated by sustained gains and further PDCA (Plan, Do, Check, Act) improvements during the 4 years after completion of the project.
Ballen, Cissy J; Thompson, Seth K; Blum, Jessamina E; Newstrom, Nicholas P; Cotner, Sehoya
2018-01-01
Course-based undergraduate research experiences (CUREs) are a type of laboratory learning environment associated with a science course, in which undergraduates participate in novel research. According to Auchincloss et al. (CBE Life Sci Educ 2014; 13:29-40), CUREs are distinct from other laboratory learning environments because they possess five core design components, and while national calls to improve STEM education have led to an increase in CURE programs nationally, less work has specifically focused on which core components are critical to achieving desired student outcomes. Here we use a backward elimination experimental design to test the importance of two CURE components for a population of non-biology majors: the experience of discovery and the production of data broadly relevant to the scientific or local community. We found nonsignificant impacts of either laboratory component on students' academic performance, science self-efficacy, sense of project ownership, and perceived value of the laboratory experience. Our results challenge the assumption that all core components of CUREs are essential to achieve positive student outcomes when applied at scale.
Tank 241-AZ-102 Privatization Push Mode Core Sampling and Analysis Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
RASMUSSEN, J.H.
1999-08-02
This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for samples obtained from tank 241-AZ-102. The purpose of this sampling event is to obtain information about the characteristics of the contents of 241-AZ-102 required to satisfy the Data Quality Objectives For TWRS Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For High-Level Waste Feed Batch X (HLW DQO) (Nguyen 1999a), Data Quality Objectives For TWRS Privatization Phase 1: Confirm Tank T Is An Appropriate Feed Source For Low-Activity Waste Feed Batch X (LAW DQO) (Nguyen 1999b), Low Activity Waste and High Level Waste Feed Data Quality Objectives (L&H DQO) (Patello et al. 1999), and Characterization Data Needs for Development, Design, and Operation of Retrieval Equipment Developed through the Data Quality Objective Process (Equipment DQO) (Bloom 1996). The Tank Characterization Technical Sampling Basis document (Brown et al. 1998) indicates that these issues, except the Equipment DQO, apply to tank 241-AZ-102 for this sampling event. The Equipment DQO is applied for shear strength measurements of the solids segments only. Poppiti (1999) requires additional americium-241 analyses of the sludge segments. Brown et al. (1998) also identify safety screening, regulatory issues, and provision of samples to the Privatization Contractor(s) as applicable issues for this tank. However, these issues will not be addressed via this sampling event. Reynolds et al. (1999) concluded that information from previous sampling events was sufficient to satisfy the safety screening requirements for tank 241-AZ-102. Push mode core samples will be obtained from risers 15C and 24A to provide sufficient material for the chemical analyses and tests required to satisfy these data quality objectives. The 222-S Laboratory will extrude core samples, composite the liquids and solids, perform chemical analyses, and provide subsamples to the Process Chemistry Laboratory. The Process Chemistry Laboratory will prepare test plans and perform process tests to evaluate the behavior of the 241-AZ-102 waste undergoing the retrieval and treatment scenarios defined in the applicable DQOs. Requirements for analyses of samples originating in the process tests will be documented in the corresponding test plan.
Laboratory directed research and development program, FY 1996
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1997-02-01
The Ernest Orlando Lawrence Berkeley National Laboratory (Berkeley Lab) Laboratory Directed Research and Development Program FY 1996 report is compiled from annual reports submitted by principal investigators following the close of the fiscal year. This report describes the projects supported and summarizes their accomplishments. It constitutes a part of the Laboratory Directed Research and Development (LDRD) program planning and documentation process that includes an annual planning cycle, project selection, implementation, and review. The Berkeley Lab LDRD program is a critical tool for directing the Laboratory's forefront scientific research capabilities toward vital, excellent, and emerging scientific challenges. The program provides the resources for Berkeley Lab scientists to make rapid and significant contributions to critical national science and technology problems. The LDRD program also advances the Laboratory's core competencies, foundations, and scientific capability, and permits exploration of exciting new opportunities. Areas eligible for support include: (1) Work in forefront areas of science and technology that enrich Laboratory research and development capability; (2) Advanced study of new hypotheses, new experiments, and innovative approaches to develop new concepts or knowledge; (3) Experiments directed toward proof of principle for initial hypothesis testing or verification; and (4) Conception and preliminary technical analysis to explore possible instrumentation, experimental facilities, or new devices.
Biotechniques Laboratory: An Enabling Course in the Biological Sciences
ERIC Educational Resources Information Center
Di Trapani, Giovanna; Clarke, Frank
2012-01-01
Practical skills and competencies are critical to student engagement and effective learning in laboratory courses. This article describes the design of a yearlong, stand-alone laboratory course--the Biotechniques Laboratory--a common core course in the second year of all our degree programs in the biological sciences. It is an enabling,…
Integration of Video-Based Demonstrations to Prepare Students for the Organic Chemistry Laboratory
ERIC Educational Resources Information Center
Nadelson, Louis S.; Scaggs, Jonathan; Sheffield, Colin; McDougal, Owen M.
2015-01-01
Consistent, high-quality introductions to organic chemistry laboratory techniques effectively and efficiently support student learning in the organic chemistry laboratory. In this work, we developed and deployed a series of instructional videos to communicate core laboratory techniques and concepts. Using a quasi-experimental design, we tested the…
NASA Astrophysics Data System (ADS)
Place, P., Jr.; Petrenko, V. V.; Vimont, I.
2017-12-01
Carbon Monoxide (CO) is an important atmospheric trace gas that affects the oxidative capacity of the atmosphere and contributes indirectly to anthropogenic radiative forcing. Carbon monoxide stable isotopes can also serve as a tracer for variations in biomass burning, particularly in the preindustrial atmosphere. A good understanding of the past variations in CO mole fractions and isotopic composition can help improve the skill of chemical transport models and constrain biomass burning changes. Ice cores may preserve a record of past atmospheric CO for analysis and interpretation. To this end, a new extraction system has been developed for analysis of stable isotopes (δ13CO and δC18O) of atmospheric carbon monoxide from ice core and atmospheric air samples. This system has been designed to measure relatively small sample sizes (80 cc STP of air) to accommodate the limited availability of ice core samples. Trapped air is extracted from ice core samples via melting in a glass vacuum chamber. This air is expanded into a glass expansion loop and then compressed into the sample loop of a Reducing Gas Detector (Peak Laboratories, Peak Performer 1 RCP) for the CO mole fraction measurement. The remaining sample gas will be expelled from the melt vessel into a larger expansion loop via headspace compression for isotopic analysis. The headspace compression will be accomplished by introduction of clean degassed water into the bottom of the melt vessel. Isotopic analysis of the sample gas is done utilizing the Schütze Reagent to convert the carbon monoxide to carbon dioxide (CO2) which is then measured using continuous-flow isotope ratio mass spectrometry (Elementar Americas, IsoPrime 100). A series of cryogenic traps are used to purify the sample air, capture the converted sample CO2, and cryofocus the sample CO2 prior to injection.
Viral Evolution Core | FNLCR Staging
Brandon F. Keele, Ph.D. PI/Senior Principal Investigator, Retroviral Evolution Section Head, Viral Evolution Core Leidos Biomedical Research, Inc. Frederick National Laboratory for Cancer Research Frederick, MD 21702-1201 Tel: 301-846-173
Residual Strength Analysis Methodology: Laboratory Coupons to Structural Components
NASA Technical Reports Server (NTRS)
Dawicke, D. S.; Newman, J. C., Jr.; Starnes, J. H., Jr.; Rose, C. A.; Young, R. D.; Seshadri, B. R.
2000-01-01
The NASA Aircraft Structural Integrity (NASIP) and Airframe Airworthiness Assurance/Aging Aircraft (AAA/AA) Programs have developed a residual strength prediction methodology for aircraft fuselage structures. This methodology has been experimentally verified for structures ranging from laboratory coupons up to full-scale structural components. The methodology uses the critical crack tip opening angle (CTOA) fracture criterion to characterize the fracture behavior and a material and geometric nonlinear finite element shell analysis code to perform the structural analyses. The present paper presents the results of a study to evaluate the fracture behavior of 2024-T3 aluminum alloys with thicknesses from 0.04 to 0.09 inches. The critical CTOA and the corresponding plane-strain core height necessary to simulate through-the-thickness effects at the crack tip in an otherwise plane-stress analysis were determined from small laboratory specimens. Using these parameters, the CTOA fracture criterion was used to predict the behavior of middle crack tension specimens that were up to 40 inches wide, flat panels with riveted stiffeners and multiple-site damage cracks, 18-inch diameter pressurized cylinders, and full-scale curved stiffened panels subjected to internal pressure and mechanical loads.
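For readers unfamiliar with the CTOA criterion, the sketch below shows the basic check it implies inside a crack-growth analysis: the crack is advanced whenever the opening angle at a fixed distance behind the tip reaches a critical value. The evaluation distance, opening, and critical angle are assumed, illustrative values; the finite element methodology itself is not reproduced here.

```python
import math

def ctoa_degrees(opening, d_behind_tip):
    """Crack-tip opening angle from the flank opening measured a distance d behind the tip."""
    return math.degrees(2.0 * math.atan(opening / (2.0 * d_behind_tip)))

CRITICAL_CTOA = 5.5      # degrees; order of magnitude for thin aluminum sheet (assumed)
opening = 0.0015         # inches, crack-flank opening from the structural analysis (assumed)
d = 0.04                 # inches, evaluation distance behind the crack tip (assumed)

if ctoa_degrees(opening, d) >= CRITICAL_CTOA:
    print("criterion met: advance the crack front")
else:
    print("criterion not met: increase the applied load")
```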
DOE Office of Scientific and Technical Information (OSTI.GOV)
D. L. Chichester; S. J. Thompson
2013-09-01
This report serves as a literature review of prior work performed at Idaho National Laboratory, and its predecessor organizations Idaho National Engineering Laboratory (INEL) and Idaho National Engineering and Environmental Laboratory (INEEL), studying radionuclide partitioning within the melted fuel debris of the reactor of the Three Mile Island 2 (TMI-2) nuclear power plant. The purpose of this review is to document prior published work that provides supporting evidence of the utility of using 144Ce as a surrogate for plutonium within melted fuel debris. When the TMI-2 accident occurred, no quantitative nondestructive analysis (NDA) techniques existed that could assay plutonium in the unconventional wastes from the reactor. However, unpublished work performed at INL by D. W. Akers in the late 1980s through the 1990s demonstrated that passive gamma-ray spectrometry of 144Ce could potentially be used to develop a semi-quantitative correlation for estimating plutonium content in these materials. The fate and transport of radioisotopes in fuel from different regions of the core, including uranium, fission products, and actinides, appear to be well characterized based on the maximum temperature reached by fuel in different parts of the core and the melting point, boiling point, and volatility of those radioisotopes. Also, the chemical interactions between fuel, fuel cladding, control elements, and core structural components appear to have played a large role in determining when and how fuel relocation occurred in the core; perhaps the most important of these reactions appears to be related to the formation of mixed-material alloys, eutectics, in the fuel cladding. Because of its high melting point, low volatility, and similar chemical behavior to plutonium, the element cerium appears to have behaved similarly to plutonium during the evolution of the TMI-2 accident. Anecdotal evidence extrapolated from open-source literature strengthens this logical feasibility for using cerium, which is rather easy to analyze using passive nondestructive gamma-ray spectrometry, as a surrogate for plutonium in the final analysis of TMI-2 melted fuel debris. The generation of this report is motivated by the need to perform nuclear material accountancy measurements on the melted fuel debris that will be excavated from the damaged nuclear reactors at the Fukushima Daiichi nuclear power plant in Japan, which were destroyed by the Tohoku earthquake and tsunami on March 11, 2011. Lessons may be taken from prior U.S. work related to the study of the TMI-2 core debris to support the development of new assay methods for use at Fukushima Daiichi. While significant differences exist between the two reactor systems (pressurized water reactor (TMI-2) versus boiling water reactor (FD), fresh water post-accident cooling (TMI-2) versus salt water (FD), maintained containment (TMI-2) versus loss of containment (FD)), there remain sufficient similarities to motivate these comparisons.
Interpretation of well logs in a carbonate aquifer
MacCary, L.M.
1978-01-01
This report describes the log analysis of the Randolph and Sabinal core holes in the Edwards aquifer in Texas, with particular attention to the principles that can be applied generally to any carbonate system. The geologic and hydrologic data were obtained during the drilling of the two holes, from extensive laboratory analysis of the cores, and from numerous geophysical logs run in the two holes. Some logging methods are inherently superior to others for the analysis of limestone and dolomite aquifers. Three such systems are the density, neutron, and acoustic-velocity (sonic) logs. Most of the log analysis described here is based on the interpretation of suites of logs from these three systems. In certain instances, deeply focused resistivity logs can be used to good advantage in carbonate rock studies; this technique is used to compute the water resistivity in the Randolph core hole. The rocks penetrated by the Randolph core hole are typical of those carbonates that have undergone very little solution by recent ground-water circulation. There are few large solutional openings; the water is saline; and the rocks are dark, dolomitic, have pore space that is interparticle or intercrystalline, and contain unoxidized organic material. The total porosity of rocks in the saline zone is higher than that of rocks in the fresh-water aquifer; however, the intrinsic permeability is much less in the saline zone because there are fewer large solutional openings. The Sabinal core hole penetrates a carbonate environment that has experienced much solution by ground water during recent geologic time. The rocks have high secondary porosities controlled by sedimentary structures within the rock; the water is fresh; and the dominant rock composition is limestone. The relative percentages of limestone and dolomite, the average matrix (grain) densities of the rock mixtures, and the porosity of the rock mass can be calculated from density, neutron, and acoustic logs. With supporting data from resistivity logs, the formation water quality can be estimated, as well as the relative cementation or tortuosity of the rock. Many of these properties calculated from logs can be verified by analysis of the core available from test holes drilled in the saline and fresh water zones.
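The porosity and lithology calculations the report alludes to follow standard log-analysis relations; the sketch below (conventional matrix and fluid densities, illustrative log readings, not values from the report) shows the density-porosity equation and a two-mineral limestone/dolomite split from the apparent matrix density.

```python
RHO_LIMESTONE, RHO_DOLOMITE, RHO_FLUID = 2.71, 2.87, 1.00   # g/cm^3, conventional values

def density_porosity(rho_bulk, rho_matrix):
    """phi = (rho_matrix - rho_bulk) / (rho_matrix - rho_fluid)"""
    return (rho_matrix - rho_bulk) / (rho_matrix - RHO_FLUID)

def limestone_fraction(rho_matrix_apparent):
    """Fraction of limestone in a limestone/dolomite matrix, from a linear density mix."""
    f = (RHO_DOLOMITE - rho_matrix_apparent) / (RHO_DOLOMITE - RHO_LIMESTONE)
    return min(max(f, 0.0), 1.0)

# Illustrative zone: bulk density 2.45 g/cm^3, apparent matrix density 2.79 g/cm^3
print(f"porosity  ~ {density_porosity(2.45, 2.79):.2f}")
print(f"limestone ~ {limestone_fraction(2.79):.0%} of the matrix")
```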
Environmental Fate and Biological Consequences of Chemicals Related to Air Force Activities
1982-09-01
milliliters of a 10% solution over a 78.5-square-centimeter surface. Sample-treated cores were tested under laboratory conditions and in field studies and using...natural weather conditions for comparative tests. No significant differences were noted between field and laboratory conditions. Biological...JP-4- and JP-5-dosed cores initially show a stressed condition as indicated by an increased rate in CO2 production followed by a rate of CO2
JPRS Report Science and Technology, Japan: Atomic Energy Society 1989 Annual Meeting.
1989-10-13
Control Rod Hole in VHTRC-1 Core [F. Akino, T. Yamane, et al.] ... Measurement of MEU [Medium Enriched Uranium] Fuel Element Characteristics in... K. Yoshida, K. Kobayashi, I. Kimura, C. Yamanaka, and S. Nakai, Laser Laboratory, Osaka University; Nuclear Reactor Laboratory, Kyoto University... 1 core loaded with 278 fuel rods (4 percent enriched uranium). The PNS target was placed at the back center of the 1/2 assembly on the fixed side
Method and apparatus for recovering unstable cores
McGuire, Patrick L.; Barraclough, Bruce L.
1983-01-01
A method and apparatus suitable for stabilizing hydrocarbon cores are given. Such stabilized cores have not previously been obtainable for laboratory study, and such study is believed to be required before the hydrate reserves can become a utilizable resource. The apparatus can be built using commercially available parts and is very simple and safe to operate.
Environmental Response Laboratory Network Membership and Benefits
Member laboratories must meet core requirements including quality systems, policies and procedures, sample and data management, and analytical capabilities. Benefits include training and exercise opportunities, information sharing and technical support.
ERIC Educational Resources Information Center
Chen, Baiyun; DeMara, Ronald F.; Salehi, Soheil; Hartshorne, Richard
2018-01-01
A laboratory pedagogy interweaving weekly student portfolios with onsite formative electronic laboratory assessments (ELAs) is developed and assessed within the laboratory component of a required core course of the electrical and computer engineering (ECE) undergraduate curriculum. The approach acts to promote student outcomes, and neutralize…
Implementation of a platform dedicated to the biomedical analysis terminologies management
Cormont, Sylvie; Vandenbussche, Pierre-Yves; Buemi, Antoine; Delahousse, Jean; Lepage, Eric; Charlet, Jean
2011-01-01
Background and objectives. Assistance Publique - Hôpitaux de Paris (AP-HP) is implementing a new laboratory management system (LMS) common to the 12 hospital groups. The first step in this process was to acquire a biological analysis dictionary. This dictionary is interfaced with the international nomenclature LOINC, and has been developed in collaboration with experts from all biological disciplines. In this paper we describe in three steps (modeling, data migration and integration/verification) the implementation of a platform for publishing and maintaining the AP-HP laboratory data dictionary (AnaBio). Material and Methods. Due to data complexity and volume, setting up a platform dedicated to terminology management was a key requirement. This is an enhancement tackling identified weaknesses of the previous spreadsheet tool. Our core model allows interoperability regarding data exchange standards and dictionary evolution. Results. We completed our goals within one year. In addition, structuring the data representation has led to a significant data quality improvement (impacting more than 10% of the data). The platform is active in the 21 hospitals of the institution, spread across 165 laboratories. PMID:22195205
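As a hedged illustration of what such a dictionary holds, the sketch below shows one way a local analysis code can be mapped to LOINC for exchange; the field names are hypothetical and do not reproduce the actual AnaBio model.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AnalysisEntry:
    local_code: str     # institution-wide analysis identifier
    label: str          # human-readable analysis name
    loinc_code: str     # mapped LOINC code used for data exchange
    unit: str           # reporting unit
    version: int        # dictionary revision this entry belongs to

DICTIONARY = {
    "K-PLASMA": AnalysisEntry("K-PLASMA", "Potassium [Plasma]", "2823-3", "mmol/L", 3),
    "CREAT-SER": AnalysisEntry("CREAT-SER", "Creatinine [Serum]", "2160-0", "mg/dL", 3),
}

def to_loinc(local_code: str) -> str:
    """Resolve a local analysis code to its LOINC mapping."""
    return DICTIONARY[local_code].loinc_code

print(to_loinc("K-PLASMA"))   # -> 2823-3
```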
ENFIN--A European network for integrative systems biology.
Kahlem, Pascal; Clegg, Andrew; Reisinger, Florian; Xenarios, Ioannis; Hermjakob, Henning; Orengo, Christine; Birney, Ewan
2009-11-01
Integration of biological data of various types and the development of adapted bioinformatics tools represent critical objectives to enable research at the systems level. The European Network of Excellence ENFIN is engaged in developing an adapted infrastructure to connect databases, and platforms to enable both the generation of new bioinformatics tools and the experimental validation of computational predictions. With the aim of bridging the gap existing between standard wet laboratories and bioinformatics, the ENFIN Network runs integrative research projects to bring the latest computational techniques to bear directly on questions dedicated to systems biology in the wet laboratory environment. The Network maintains internally close collaboration between experimental and computational research, enabling a permanent cycling of experimental validation and improvement of computational prediction methods. The computational work includes the development of a database infrastructure (EnCORE), bioinformatics analysis methods and a novel platform for protein function analysis FuncNet.
BOREAS HYD-1 Soil Hydraulic Properties
NASA Technical Reports Server (NTRS)
Hall, Forrest G. (Editor); Knapp, David E. (Editor); Kelly, Shaun F.; Stangel, David E.; Smith, David E. (Technical Monitor)
2000-01-01
The Boreal Ecosystem-Atmosphere Study (BOREAS) Hydrology (HYD)-1 team coordinated a program of data collection to measure and monitor soil properties in collaboration with other science team measurement needs. This data set contains soil hydraulic properties determined at the Northern Study Area (NSA) and Southern Study Area (SSA) flux tower sites based on analysis of in situ tension infiltrometer tests and laboratory-determined water retention from soil cores collected during the 1994-95 field campaigns. Results from this analysis are saturated hydraulic conductivity, and fitting parameters for the van Genuchten-Mualem soil hydraulic conductivity and water retention function at flux tower sites. The data are contained in tabular ASCII files. The HYD-01 soil hydraulic properties data are available from the Earth Observing System Data and Information System (EOSDIS) Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC). The data files are available on a CD-ROM (see document number 20010000884).
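A sketch of how the fitted parameters in this data set are typically used is given below: the van Genuchten water retention function and the Mualem conductivity model evaluated at a given suction head. Parameter values are illustrative, not BOREAS results.

```python
def van_genuchten_theta(h, theta_r, theta_s, alpha, n):
    """Volumetric water content at suction head h (h >= 0, units consistent with 1/alpha)."""
    m = 1.0 - 1.0 / n
    se = (1.0 + (alpha * h) ** n) ** (-m)          # effective saturation
    return theta_r + (theta_s - theta_r) * se

def mualem_conductivity(h, k_sat, alpha, n):
    """Unsaturated hydraulic conductivity from the van Genuchten-Mualem model."""
    m = 1.0 - 1.0 / n
    se = (1.0 + (alpha * h) ** n) ** (-m)
    return k_sat * se ** 0.5 * (1.0 - (1.0 - se ** (1.0 / m)) ** m) ** 2

# Illustrative sandy soil at 100 cm suction (alpha in 1/cm, K_sat in cm/day)
print(van_genuchten_theta(100.0, theta_r=0.05, theta_s=0.40, alpha=0.035, n=1.8))
print(mualem_conductivity(100.0, k_sat=100.0, alpha=0.035, n=1.8))
```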
Deepwater Gulf of Mexico turbidites -- Compaction effects on porosity and permeability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ostermeier, R.M.
1995-06-01
The deepwater Gulf of Mexico is now a major area of activity for the US oil industry. Compaction causes particular concern because most prospective deepwater reservoirs are highly geo-pressured and many have limited aquifer support; water injection may also be problematic. To address some of the issues associated with compaction, the authors initiated a special core-analysis program to study compaction effects on turbidite sand porosity and permeability specifically. This program also addressed a number of subsidiary but no less important issues, such as sample characterization and quality, sample preparation, and test procedures. These issues are particularly pertinent, because Gulf of Mexico turbidites are generally unconsolidated, loose sands, and are thus susceptible to a whole array of potentially serious core-disturbing processes. One key result of the special core analysis program is that turbidite compressibilities exhibit large variations in both magnitude and stress dependence. These variations correlate with creep response in the laboratory measurements. The effects of compaction on permeability are significant. To eliminate complicating effects caused by fines movement, the authors made oil flow measurements at initial water saturation. The measurements indicate compaction reduces permeability four to five times more than porosity on a relative basis.
A data grid for imaging-based clinical trials
NASA Astrophysics Data System (ADS)
Zhou, Zheng; Chao, Sander S.; Lee, Jasper; Liu, Brent; Documet, Jorge; Huang, H. K.
2007-03-01
Clinical trials play a crucial role in testing new drugs or devices in modern medicine. Medical imaging has also become an important tool in clinical trials because images provide a unique and fast diagnosis with visual observation and quantitative assessment. A typical imaging-based clinical trial consists of: 1) A well-defined rigorous clinical trial protocol, 2) a radiology core that has a quality control mechanism, a biostatistics component, and a server for storing and distributing data and analysis results; and 3) many field sites that generate and send image studies to the radiology core. As the number of clinical trials increases, it becomes a challenge for a radiology core servicing multiple trials to have a server robust enough to administrate and quickly distribute information to participating radiologists/clinicians worldwide. The Data Grid can satisfy the aforementioned requirements of imaging based clinical trials. In this paper, we present a Data Grid architecture for imaging-based clinical trials. A Data Grid prototype has been implemented in the Image Processing and Informatics (IPI) Laboratory at the University of Southern California to test and evaluate performance in storing trial images and analysis results for a clinical trial. The implementation methodology and evaluation protocol of the Data Grid are presented.
Bartlett, John M S; Campbell, Fiona M; Ibrahim, Merdol; Thomas, Jeremy; Wencyk, Pete; Ellis, Ian; Kay, Elaine; Connolly, Yvonne; O'Grady, Anthony; Barnett, Sarah; Starczynski, Jane; Cunningham, Paul; Miller, Keith
2010-02-01
To assess a new HER2 fluorescence in situ hybridization (FISH) test and report on multicentre intrasite and intersite variation. HER2 results were scored from 45 breast cancers in eight laboratories using the Kreatech Poseidon HER2 FISH probe (Kreatech Diagnostics, Amsterdam, the Netherlands). Overall, 80.9% of cores were successfully analysed. Mean intrasite variation for HER2 ratio assessment was low (4.74%). Intersite variation in ratio was in line with previous reports (11.9 ± 0.8%) for both reference and non-reference laboratories; only one laboratory displayed significantly higher intersite variation (P=0.009) than the remaining seven laboratories. The overall incidence of misclassification of cores was <1.3%, demonstrating an excellent level of concordance (>98.7%) across all eight laboratories, irrespective of whether they were 'reference' or 'routine diagnostic' laboratories. The Kreatech Poseidon HER2 FISH test is robust and reproducible. Highly quantitatively reproducible FISH results were obtained from eight 'diagnostic' and 'reference' laboratories; however, continued quality assessments are essential to good performance.
Physiology undergraduate degree requirements in the U.S.
VanRyn, Valerie S; Poteracki, James M; Wehrwein, Erica A
2017-12-01
Course-level learning objectives and core concepts for undergraduate physiology teaching exist. The next step is to consider how these resources fit into generalizable program-level guidelines for Bachelor of Science (BS) degrees in Physiology. In the absence of program-level guidelines for Physiology degree programs, we compiled a selective internal report to review degree requirements from 18 peer BS programs entitled "Physiology" in the United States (U.S.). There was a range of zero to three required semesters of math, physics, physics laboratory, general biology, biology laboratory, general chemistry, chemistry laboratory, organic chemistry, organic chemistry laboratory, biochemistry, biochemistry laboratory, anatomy, anatomy laboratory, core systems physiology, and physiology laboratory. Required upper division credits ranged from 11 to 31 and included system-specific, exercise and environmental, clinically relevant, pathology/disease-related, and basic science options. We hope that this information will be useful for all programs that consider themselves to be physiology, regardless of name. Reports such as this can serve as a starting point for collaboration among BS programs to improve physiology undergraduate education and best serve our students. Copyright © 2017 the American Physiological Society.
Status of Undergraduate Pharmacology Laboratories in Colleges of Pharmacy in the United States
ERIC Educational Resources Information Center
Katz, Norman L.; And Others
1978-01-01
U.S. colleges of pharmacy were surveyed in 1976 to determine whether a trend exists in continuing, discontinuing, or restructuring laboratory time in pharmaceutical education. Data regarding core undergraduate pharmacology courses, undergraduate pharmacology laboratory status, and pharmacology faculty are presented. (LBH)
NASA Astrophysics Data System (ADS)
Baldauf, J.; Denton, J.
2003-12-01
In order to replenish the national supply of science and mathematics educators, the National Science Foundation has supported the formation of the Center for Applications of Information Technology in the Teaching and Learning of Science (ITS) at Texas A&M University. The center staff and affiliated faculty work to change in fundamental ways the culture and relationships among scientists, educational researchers, and teachers. ITS is a partnership among the colleges of education, science, geosciences, and agriculture and life science at Texas A&M University. Participants (teachers and graduate students) investigate how science is done and how science is taught and learned, how that learning is assessed, and how scholarly networks among all engaged in this work can be encouraged. While the center can offer graduate degrees, most students apply as non-degree seekers. ITS offers a certificate program consisting of two summer sessions over two years that results in 12 hours of graduate credit that can be applied to a degree. Interdisciplinary project teams spend three intense weeks connecting current research to classroom practices. During the past summer, with the beginning of the two-year sequence, a course was implemented that introduced secondary teachers to Ocean Drilling Program (ODP) contributions to major earth science themes, using core and logging data and engineering (technology) tools and processes. Information technology classroom applications were enhanced through hands-on laboratory exercises, web resources, and online databases. The course was structured around the following objectives. 1. Distinguish the purpose and goals of the Ocean Drilling Program from the Integrated Ocean Drilling Program and describe the comparable science themes (ocean circulation, marine sedimentation, climate history, sea level change, and geological time); this objective will be achieved by correctly answering 8 of 10 multiple-choice items on a course posttest covering the science themes of ODP/IODP. 2. Describe the technical tools and processes for determining sea level history by preparing and presenting a multimedia presentation on coring. 3. Describe the processes for describing a drill core and apply those processes to core samples from Leg 194 by developing a laboratory analysis report on core samples based on a protocol for analyzing cores. 4. Explain the features that distinguish scientific from industrial coring processes by developing a paper that contrasts the two. 5. Describe the substructure of the ocean basin and the scientific tools (equipment and processes) used to explore this substructure by preparing and presenting a multimedia presentation on borehole data interpretation. 6. Analyze and interpret data sets from a borehole by developing a laboratory analysis report on borehole data. Student performance data indicate a 16% average positive change on the science themes addressed in instruction related to objective one. Similarly, a 12% average positive change occurred on science education topics related to earth science among the students in this class.
Ongoing contact between faculty members during the academic year is planned as these summer participants implement IT interventions and professional development experiences based on the ocean science data explored during the summer session.
Multiscale Multiphysics Caprock Seal Analysis: A Case Study of the Farnsworth Unit, Texas, USA
NASA Astrophysics Data System (ADS)
Heath, J. E.; Dewers, T. A.; Mozley, P.
2015-12-01
Caprock sealing behavior depends on coupled processes that operate over a variety of length and time scales. Capillary sealing behavior depends on nanoscale pore throats and interfacial fluid properties. Larger-scale sedimentary architecture, fractures, and faults may govern properties of potential "seal-bypass" systems. We present the multiscale multiphysics investigation of sealing integrity of the caprock system that overlies the Morrow Sandstone reservoir, Farnsworth Unit, Texas. The Morrow Sandstone is the target injection unit for an on-going combined enhanced oil recovery-CO2 storage project by the Southwest Regional Partnership on Carbon Sequestration (SWP). Methods include small-to-large scale measurement techniques, including: focused ion beam-scanning electron microscopy; laser scanning confocal microscopy; electron and optical petrography; core examinations of sedimentary architecture and fractures; geomechanical testing; and a noble gas profile through sealing lithologies into the reservoir, as preserved from fresh core. The combined data set is used as part of a performance assessment methodology. The authors gratefully acknowledge the U.S. Department of Energy's (DOE) National Energy Technology Laboratory for sponsoring this project through the SWP under Award No. DE-FC26-05NT42591. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
Data-intensive computing on numerically-insensitive supercomputers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahrens, James P; Fasel, Patricia K; Habib, Salman
2010-12-03
With the advent of the era of petascale supercomputing, via the delivery of the Roadrunner supercomputing platform at Los Alamos National Laboratory, there is a pressing need to address the problem of visualizing massive petascale-sized results. In this presentation, I discuss progress on a number of approaches including in-situ analysis, multi-resolution out-of-core streaming and interactive rendering on the supercomputing platform. These approaches are placed in context by the emerging area of data-intensive supercomputing.
Case, David H.; Ijiri, Akira; Morono, Yuki; Tavormina, Patricia; Orphan, Victoria J.; Inagaki, Fumio
2017-01-01
High-pressure (HP) environments represent the largest volumetric majority of habitable space for microorganisms on the planet, including the deep-sea and subsurface biosphere. However, the importance of pressure as an environmental variable affecting deep microbial life and their biogeochemical functions in carbon cycling still remains poorly understood. Here, we designed a new high-volume HP-sediment core sampler that is deployable on the payload of a remotely operated vehicle and can maintain in situ HP conditions throughout multi-month enrichment incubations, including daily amendments with liquid media and gases and daily effluent sampling for geochemical or microbiological analysis. Using the HP core device, we incubated sediment and overlying water associated with methane hydrate exposed on the seafloor of the Joetsu Knoll, Japan, at 10 MPa and 4°C for 45 days in the laboratory. Diversity analyses based on 16S rRNA and methane-related functional genes, as well as carbon isotopic analysis of methane and bicarbonate, indicated the stimulation of both aerobic and anaerobic methanotrophy driven by members of the Methylococcales and ANME, respectively: i.e., aerobic methanotrophy was observed upon addition of oxygen whereas anaerobic processes subsequently occurred after oxygen consumption. These laboratory-measured rates at 10 MPa were generally in agreement with previously reported rates of methane oxidation in other oceanographic locations. PMID:29312247
Data for ground-water test hole near Butte City, Central Valley aquifer project, California
French, James J.; Page, R.W.; Bertoldi, G.L.
1983-01-01
This report provides preliminary data for the third of seven test holes drilled as part of the Central Valley Aquifer Project, which is part of the National Regional Aquifer Systems Analysis Program. The test hole was drilled in the SW 1/4 NE 1/4 sec. 32, T. 19 N., R. 1 W., Glenn County, California, about one-half mile south of the town of Butte City. Drilled to a depth of 1,432 feet below land surface, the hole is cased to a depth of 82 feet and equipped with three piezometer tubes to depths of 592 feet, 968 feet, and 1,330 feet. A 5-foot well screen is at the bottom of each piezometer. Each screened interval has a cement plug above and below it to isolate it from other parts of the aquifer, and the well bore is filled between the plugs with sediment. Nine cores and 49 sidewall cores were recovered. Laboratory tests were made for mineralogy, hydraulic conductivity, porosity, consolidation, grain-size distribution, Atterberg limits, X-ray diffraction, and chemical quality of water. Geophysical and thermal gradient logs were made. The hole is sampled periodically for chemical analysis and measured for water level in the three tapped zones. This report presents methods used to obtain field samples, laboratory procedures, and the data obtained. (USGS)
Physiological and psychological responses to outdoor vs. laboratory cycling.
Mieras, Molly E; Heesch, Matthew W S; Slivka, Dustin R
2014-08-01
The purpose of this study was to determine the physiological and psychological responses to laboratory vs. outdoor cycling. Twelve recreationally trained male cyclists participated in an initial descriptive testing session and 2 experimental trials consisting of 1 laboratory and 1 outdoor session, in a randomized order. Participants were given a standardized statement instructing them to give the same perceived effort for both the laboratory and outdoor 40-km trials. Variables measured include power output, heart rate (HR), core temperature, skin temperature, body weight, urine specific gravity (USG), Rating of Perceived Exertion (RPE), attentional focus, and environmental conditions. Wind speed was higher in the outdoor trial than in the laboratory trial (2.5 ± 0.6 vs. 0.0 ± 0.0 m·s-1, p = 0.02) whereas all other environmental conditions were similar. Power output (208.1 ± 10.2 vs. 163.4 ± 11.8 W, respectively, p < 0.001) and HR (152 ± 4 and 143 ± 6 b·min-1, respectively, p = 0.04) were higher in the outdoor trial than in the laboratory trial. Core temperature was similar, whereas skin temperature was cooler during the outdoor trial than during the laboratory trial (31.4 ± 0.3 vs. 33.0 ± 0.2° C, respectively, p < 0.001), thus creating a larger thermal gradient between the core and skin outdoors. No significant differences in body weight, USG, RPE, or attentional focus were observed between trials. These data indicate that outdoor cycling allows cyclists to exercise at a higher intensity than in laboratory cycling, despite similar environmental conditions and perceived exertion. In light of this, cyclists may want to ride at a higher perceived exertion in indoor settings to acquire the same benefit as they would from an outdoor ride.
Core Technical Capability Laboratory Management System
NASA Technical Reports Server (NTRS)
Shaykhian, Linda; Dugger, Curtis; Griffin, Laurie
2008-01-01
The Core Technical Capability Laboratory Management System (CTCLMS) consists of dynamically generated Web pages used to access a database containing detailed CTC lab data, with the software hosted on a server that allows users to have remote access.
Revising Laboratory Work: Sociological Perspectives on the Science Classroom
ERIC Educational Resources Information Center
Jobér, Anna
2017-01-01
This study uses sociological perspectives to analyse one of the core practices in science education: school children's and students' laboratory work. Applying an ethnographic approach to the laboratory work done by pupils at a Swedish compulsory school, data were generated through observations, field notes, interviews, and a questionnaire. The…
Evaluation of hydraulic conductivities calculated from multi-port permeameter measurements
Wolf, Steven H.; Celia, Michael A.; Hess, Kathryn M.
1991-01-01
A multiport permeameter was developed for use in estimating hydraulic conductivity over intact sections of aquifer core using the core liner as the permeameter body. Six cores obtained from one borehole through the upper 9 m of a stratified glacial-outwash aquifer were used to evaluate the reliability of the permeameter. Radiographs of the cores were used to assess core integrity and to locate 5- to 10-cm sections of similar grain size for estimation of hydraulic conductivity. After extensive testing of the permeameter, hydraulic conductivities were determined for 83 sections of the six cores. Other measurement techniques included permeameter measurements on repacked sections of core, estimates based on grain-size analyses, and estimates based on borehole flowmeter measurements. Permeameter measurements of 33 sections of core that had been extruded, homogenized, and repacked did not differ significantly from the original measurements. Hydraulic conductivities estimated from grain-size distributions were slightly higher than those calculated from permeameter measurements; the significance of the difference depended on the estimating equation used. Hydraulic conductivities calculated from field measurements, using a borehole flowmeter in the borehole from which the cores were extracted, were significantly higher than those calculated from laboratory measurements and more closely agreed with independent estimates of hydraulic conductivity based on tracer movement near the borehole. This indicates that hydraulic conductivities based on laboratory measurements of core samples may underestimate actual field hydraulic conductivities in this type of stratified glacial-outwash aquifer.
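One of the grain-size estimating equations alluded to above is Hazen's approximation; the sketch below shows it for a single sieve-analysis result. The coefficient and the example d10 are assumed, illustrative values, and the study's own choice of estimating equations is not stated in the abstract.

```python
def hazen_conductivity(d10_mm, c=100.0):
    """Hydraulic conductivity (cm/s) from the 10th-percentile grain diameter d10 (mm),
    using Hazen's K ~ C * d10^2 with d10 expressed in cm and C ~ 100."""
    d10_cm = d10_mm / 10.0
    return c * d10_cm ** 2

# A medium outwash sand with d10 = 0.2 mm
print(f"K ~ {hazen_conductivity(0.2):.3f} cm/s")
```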
The SAS4A/SASSYS-1 Safety Analysis Code System, Version 5
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fanning, T. H.; Brunett, A. J.; Sumner, T.
The SAS4A/SASSYS-1 computer code is developed by Argonne National Laboratory for thermal, hydraulic, and neutronic analysis of power and flow transients in liquid-metal-cooled nuclear reactors (LMRs). SAS4A was developed to analyze severe core disruption accidents with coolant boiling and fuel melting and relocation, initiated by a very low probability coincidence of an accident precursor and failure of one or more safety systems. SASSYS-1, originally developed to address loss-of-decay-heat-removal accidents, has evolved into a tool for margin assessment in design basis accident (DBA) analysis and for consequence assessment in beyond-design-basis accident (BDBA) analysis. SAS4A contains detailed, mechanistic models of transient thermal, hydraulic, neutronic, and mechanical phenomena to describe the response of the reactor core, its coolant, fuel elements, and structural members to accident conditions. The core channel models in SAS4A provide the capability to analyze the initial phase of core disruptive accidents, through coolant heat-up and boiling, fuel element failure, and fuel melting and relocation. Originally developed to analyze oxide fuel clad with stainless steel, the models in SAS4A have been extended and specialized to metallic fuel with advanced alloy cladding. SASSYS-1 provides the capability to perform a detailed thermal/hydraulic simulation of the primary and secondary sodium coolant circuits and the balance-of-plant steam/water circuit. These sodium and steam circuit models include component models for heat exchangers, pumps, valves, turbines, and condensers, and thermal/hydraulic models of pipes and plena. SASSYS-1 also contains a plant protection and control system modeling capability, which provides digital representations of reactor, pump, and valve controllers and their response to input signal changes.
Moura, Josemar de Almeida; Costa, Bruna Carvalho; de Faria, Rosa Malena Delbone; Soares, Taciana Figueiredo; Moura, Eliane Perlatto; Chiappelli, Francesco
2013-01-01
Requests for laboratory tests are among the most relevant additional tools used by physicians as part of patients' health problem-solving. However, the overestimation of complementary investigation may be linked to less reflective medical practice as a consequence of poor physician-patient communication, and may impair patient-centered care. This scenario is likely to result from reduced consultation time and a clinical model focused on the disease. We propose a new medical intervention program that specifically targets improving the patient-centered communication of laboratory test results, the core of bioinformation in health care. Expectations are that training medical students in communication skills will significantly improve the physician-patient relationship, reduce inappropriate use of laboratory tests, and raise stakeholder engagement.
NASA Astrophysics Data System (ADS)
Baumgarten, Kristyne A.
This study investigated the possible relationship between collaborative learning strategies and the learning of core concepts. The study examined the differences between two groups of nursing students enrolled in an introductory microbiology laboratory course. The control group consisted of students enrolled in sections taught using the traditional method. The experimental group consisted of those students enrolled in the sections using collaborative learning strategies. The groups were assessed on their degrees of learning core concepts using a pre-test/post-test method. Scores from the groups' laboratory reports were also analyzed. There was no difference in the two groups' pre-test scores. The post-test scores of the experimental group averaged 11 points higher than the scores of the control group. The lab report scores of the experimental group averaged 15 points higher than those of the control group. The data generated from this study demonstrated that collaborative learning strategies can be used to increase students' learning of core concepts in microbiology labs.
Total laboratory automation: Do stat tests still matter?
Dolci, Alberto; Giavarina, Davide; Pasqualetti, Sara; Szőke, Dominika; Panteghini, Mauro
2017-07-01
During the past decades healthcare systems have rapidly changed, and today hospital care is primarily advocated for critical patients and acute treatments, for which laboratory test results are crucial and always need to be reported within a predictably short turnaround time (TAT). Laboratories in the hospital setting can face this challenge by changing their organization from a compartmentalized laboratory department toward a decision-making-based laboratory department. This requires the implementation of a core laboratory that exploits total laboratory automation (TLA) using technological innovation in analytical platforms, track systems and information technology, including middleware, together with a number of specialized satellite laboratory sections cooperating with care teams for specific medical conditions. In this laboratory department model, the short TAT for all first-line tests performed by TLA in the core laboratory represents the key paradigm, where no stat testing is required because all samples are handled in real time and (auto)validated results are dispatched in a time that fulfills clinical needs. To optimally reach this goal, laboratories should be actively involved in managing all the steps covering the total examination process, also speeding up extra-laboratory phases such as sample delivery. Furthermore, to warrant effectiveness and not only efficiency, all processes, e.g. the specimen integrity check, should be managed by middleware through a predefined set of rules defined in light of clinical governance. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.
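A hedged sketch of the kind of middleware rule chain described above is given below: each result passes a specimen-integrity check, auto-validation limits, and a delta check before release. The analyte, flags, and thresholds are illustrative assumptions, not a recommended rule set.

```python
def autovalidate(result, hemolysis_index, previous_result=None):
    """Screen a potassium result (mmol/L); return 'release' or a hold reason."""
    if hemolysis_index > 50:                    # specimen integrity check
        return "hold for review: hemolysed specimen"
    if not 2.5 <= result <= 6.5:                # auto-validation limits
        return "hold for review: outside auto-validation limits"
    if previous_result is not None and abs(result - previous_result) > 1.0:
        return "hold for review: delta check failed"   # implausible change vs. prior result
    return "release"

print(autovalidate(4.1, hemolysis_index=12, previous_result=4.4))   # -> release
print(autovalidate(6.9, hemolysis_index=12))                        # -> hold for review: ...
```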
NASA Astrophysics Data System (ADS)
Boaga, J.; Sauermilch, I.; Mateo, Z. R. P.
2017-12-01
Time-depth relationships (TDRs) are crucial for correlating drillhole and core information to seismic reflection profiles, for accurate resource estimation and scientific interpretation, and to guide drilling operations. Conventional seismic time-depth domain conversion utilizes downhole sonic logs (DSI), calibrated using available checkshot data, which are local travel times from the surface to a particular depth. Scientific drilling programs (ODP and IODP) also measure P-wave velocity (PWL or C) on recovered core samples. Only three percent of all ODP and IODP sites record all three velocity measurements; however, this information can be instructive because these data inputs sometimes yield dissimilar TDRs. These representative sites provide us with an opportunity to perform a comparative analysis highlighting the differences and similarities of TDRs derived from checkshot, downhole, and laboratory measurements. We then discuss the impact of lithology, stratigraphy, water column and other petrophysical properties on the predictive accuracy of TDR calculations, in an effort to provide guidance for future drilling and coring expeditions.
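The conventional conversion workflow the study compares against can be sketched as follows: integrate sonic-log slowness to cumulative two-way time, then shift the curve so it honours the checkshots. Depths, slownesses, and the checkshot are illustrative values, not ODP/IODP data.

```python
import numpy as np

def twt_from_sonic(depth, slowness_us_per_m, twt_at_top=0.0):
    """Cumulative two-way time (s) from depth (m) and sonic slowness (microseconds/m)."""
    dz = np.diff(depth)
    interval_owt = dz * 0.5 * (slowness_us_per_m[1:] + slowness_us_per_m[:-1]) * 1e-6
    return twt_at_top + 2.0 * np.concatenate(([0.0], np.cumsum(interval_owt)))

def checkshot_shift(twt_sonic, depth, checkshot_depth, checkshot_twt):
    """Simplest calibration: bulk-shift the sonic TDR so it honours one checkshot."""
    return twt_sonic + (checkshot_twt - np.interp(checkshot_depth, depth, twt_sonic))

depth = np.linspace(0.0, 500.0, 6)                                 # metres below seafloor
slowness = np.array([600.0, 580.0, 540.0, 500.0, 470.0, 450.0])    # us/m (~1.7-2.2 km/s)
twt = twt_from_sonic(depth, slowness, twt_at_top=2.0)    # 2 s two-way time at the seafloor (assumed)
twt_cal = checkshot_shift(twt, depth, 300.0, 2.34)       # single checkshot pair (assumed)
print(np.round(twt_cal, 3))
```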
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Jun Hyuck; Park, Soo Jeong; Rho, Seong-Hwan
2005-11-01
The GluR0 ligand-binding core from N. punctiforme was expressed, purified and crystallized in the presence of l-glutamate. A diffraction data set was collected to a resolution of 2.1 Å. GluR0 from Nostoc punctiforme (NpGluR0) is a bacterial homologue of the ionotropic glutamate receptor. The ligand-binding core of NpGluR0 was crystallized at 294 K using the hanging-drop vapour-diffusion method. The l-glutamate-complexed crystal belongs to space group C222₁, with unit-cell parameters a = 78.0, b = 145.1, c = 132.1 Å. The crystals contain three subunits in the asymmetric unit, with a V_M value of 2.49 Å³ Da⁻¹. The diffraction limit of the l-glutamate complex data set was 2.1 Å using synchrotron X-ray radiation at beamline BL-4A of the Pohang Accelerator Laboratory (Pohang, Korea).
NASA Astrophysics Data System (ADS)
Moses, A. J.
1994-03-01
Flux rotating in the plane of laminations of amorphous materials or electrical steels can cause additional losses in electrical machines. To make full use of laboratory rotational magnetization studies, a better understanding of the nature of rotational flux in machine cores is needed. This paper highlights the need for careful laboratory simulation of the conditions which occur in actual machines. Single-specimen tests must produce uniform flux over a given measuring region, and the output from field and flux sensors needs careful analysis. Differences between thermal and flux-sensing methods are shown, as well as anomalies caused when the magnetization direction is reversed in an anisotropic specimen. Methods of overcoming these problems are proposed.
Analysis of the TREAT LEU Conceptual Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Connaway, H. M.; Kontogeorgakos, D. C.; Papadias, D. D.
2016-03-01
Analyses were performed to evaluate the performance of the low enriched uranium (LEU) conceptual design fuel for the conversion of the Transient Reactor Test Facility (TREAT) from its current highly enriched uranium (HEU) fuel. TREAT is an experimental nuclear reactor designed to produce high neutron flux transients for the testing of reactor fuels and other materials. TREAT is currently in non-operational standby, but is being restarted under the U.S. Department of Energy’s Resumption of Transient Testing Program. The conversion of TREAT is being pursued in keeping with the mission of the Department of Energy National Nuclear Security Administration’s Material Management and Minimization (M3) Reactor Conversion Program. The focus of this study was to demonstrate that the converted LEU core is capable of maintaining the performance of the existing HEU core, while continuing to operate safely. Neutronic and thermal hydraulic simulations have been performed to evaluate the performance of the LEU conceptual-design core under both steady-state and transient conditions, for both normal operation and reactivity insertion accident scenarios. In addition, ancillary safety analyses which were performed for previous LEU design concepts have been reviewed and updated as-needed, in order to evaluate if the converted LEU core will function safely with all existing facility systems. Simulations were also performed to evaluate the detailed behavior of the UO2-graphite fuel, to support future fuel manufacturing decisions regarding particle size specifications. The results of these analyses will be used in conjunction with work being performed at Idaho National Laboratory and Los Alamos National Laboratory, in order to develop the Conceptual Design Report project deliverable.
Experimental Simulations of Methane Gas Migration through Water-Saturated Sediment Cores
NASA Astrophysics Data System (ADS)
Choi, J.; Seol, Y.; Rosenbaum, E. J.
2010-12-01
Previous numerical simulations (Jain and Juanes, 2009) showed that the mode of gas migration is mainly determined by grain size, with capillary invasion preferentially occurring in coarse-grained sediments and fracturing dominating in fine-grained sediments. This study was intended to experimentally simulate preferential modes of gas migration in various water-saturated sediment cores. The cores compacted in the laboratory include a silica sand core (mean grain size of 180 μm), a silica silt core (1.7 μm), and a kaolin clay core (1.0 μm). Methane gas was injected into each core placed within an x-ray-transparent pressure vessel, which was under continuous x-ray computed tomography (CT) scanning with controlled radial (σr), axial (σa), and pore pressures (P). The CT image analysis reveals that, under a radial effective stress (σr') of 0.69 MPa and an axial effective stress (σa') of 1.31 MPa, fracturing by methane gas injection occurs in both the silt and clay cores. Fracturing initiates at capillary pressures (Pc) of ~0.41 MPa and ~2.41 MPa for the silt and clay cores, respectively. Fracturing appears as irregular fracture networks consisting of nearly invisibly fine multiple fractures, longitudinally oriented round tube-shaped conduits, or fine fractures branching off from the large conduits. For the sand core, however, only capillary invasion was observed at or above a capillary pressure of 0.034 MPa under confining pressure conditions of σr' = 1.38 MPa and σa' = 2.62 MPa. Compared to the numerical predictions under similar confining pressure conditions, fracturing occurs at relatively larger grain sizes, which may result from lower grain-contact compression and friction caused by the loose compaction and flexible lateral boundary employed in the experiment.
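The grain-size control on the invasion-versus-fracturing contrast described above can be illustrated with a rough capillary entry-pressure estimate, Pc ≈ 2γ/r_throat. The sketch below is not from the study: the interfacial tension and the pore-throat-to-grain-size ratio are assumed values, and the results are order-of-magnitude estimates only, not the thresholds measured in the experiments.

# Order-of-magnitude sketch (assumed parameters, not the study's measurements):
# capillary entry pressure scales inversely with pore-throat size, which in
# turn scales with grain size.
GAMMA = 0.07           # N/m, assumed methane-water interfacial tension
THROAT_FRACTION = 0.2  # assumed pore-throat radius / mean grain radius

cores = {"sand": 180e-6, "silt": 1.7e-6, "clay": 1.0e-6}  # mean grain size, m

for name, d_grain in cores.items():
    r_throat = THROAT_FRACTION * d_grain / 2.0
    pc_mpa = 2.0 * GAMMA / r_throat / 1e6
    print(f"{name}: estimated capillary entry pressure ~ {pc_mpa:.3f} MPa")

The point of the estimate is only the scaling: entry pressures rise by two to three orders of magnitude from sand to clay, which is why gas can invade the sand pore network but tends to open fractures in the finer sediments.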
Stevenson, D J
1981-11-06
Combined inferences from seismology, high-pressure experiment and theory, geomagnetism, fluid dynamics, and current views of terrestrial planetary evolution lead to models of the earth's core with the following properties. Core formation was contemporaneous with earth accretion; the core is not in chemical equilibrium with the mantle; the outer core is a fluid iron alloy containing significant quantities of lighter elements and is probably almost adiabatic and compositionally uniform; the more iron-rich inner solid core is a consequence of partial freezing of the outer core, and the energy release from this process sustains the earth's magnetic field; and the thermodynamic properties of the core are well constrained by the application of liquid-state theory to seismic and laboratory data.
Laboratory study of the characteristics of fault breccias in Busan area in Korea
NASA Astrophysics Data System (ADS)
Woo, I.; Um, J.
2012-12-01
The physical and mechanical characteristics of fault breccias from near Mt. Kumjung were estimated from laboratory tests on fractured fault breccias. Mt. Kumjung is surrounded by the Yangsan Fault and the Dongrae Fault, major faults traversing the southeastern part of Korea in a NE-SW direction. Undisturbed samples were obtained from boreholes drilled in this region. Microscopic analysis of thin sections of the fault breccias showed the microstructure and porosity of the breccias. The fault breccias are composed mainly of fine quartz grains, together with angular quartz grains and weathered microcline grains. This microstructure might have formed by cataclasis during brittle deformation of the fault. The 20 to 40% porosity of the fault breccias could play an important role in the passage of groundwater and, in turn, in the development of fault gouge in the fault core. The mechanical characteristics were estimated by means of uniaxial compressive strength tests on the undisturbed breccia samples. Since the fault breccias are not cohesive enough to be used directly as test specimens, epoxy resin was used to fix the outer surface of the core samples. A thin plastic wrap was applied before the epoxy resin so that the resin could not penetrate into the core specimens. The epoxy resin was less than 1 mm thick so as not to disturb the results of the uniaxial compressive strength tests. The measured uniaxial compressive strengths are 10 to 15 MPa for the only physically fractured breccias and 8 to 10 MPa for core specimens with hydrothermally altered surfaces. These results can be compared with the Hoek and Brown failure criterion: 7 to 10 MPa for GSI values of 40 to 50 for fault breccias with fresh surfaces. The overall measured strength of the fault breccias is less than the strength obtained empirically from the Hoek and Brown failure criterion.
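For readers unfamiliar with the comparison made above, the generalized Hoek-Brown criterion gives an unconfined rock-mass strength of sigma_c = sigma_ci * s^a, with s and a derived from GSI. The sketch below is a generic illustration only: the intact strength sigma_ci and disturbance factor are assumed inputs, and the printed values are not intended to reproduce the study's 7 to 10 MPa figures, which depend on the intact strength and criterion version the authors used.

# Illustrative Hoek-Brown unconfined rock-mass strength (assumed inputs).
import math

def hoek_brown_ucs(sigma_ci_mpa, gsi, disturbance=0.0):
    """Rock-mass UCS, sigma_c = sigma_ci * s**a (generalized Hoek-Brown)."""
    s = math.exp((gsi - 100.0) / (9.0 - 3.0 * disturbance))
    a = 0.5 + (math.exp(-gsi / 15.0) - math.exp(-20.0 / 3.0)) / 6.0
    return sigma_ci_mpa * s ** a

for gsi in (40, 50):
    ucs = hoek_brown_ucs(sigma_ci_mpa=50.0, gsi=gsi)  # sigma_ci assumed
    print(f"GSI {gsi}: rock-mass UCS ~ {ucs:.1f} MPa")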
Coupled field-structural analysis of HTGR fuel brick using ABAQUS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mohanty, S.; Jain, R.; Majumdar, S.
2012-07-01
High-temperature, gas-cooled reactors (HTGRs) are usually helium-gas cooled, with a graphite core that can operate at reactor outlet temperatures much higher than can conventional light water reactors. In HTGRs, graphite components moderate and reflect neutrons. During reactor operation, high temperature and high irradiation cause damage to the graphite crystal and grains and create other defects. This cumulative structural damage during the reactor lifetime leads to changes in graphite properties, which can alter the ability to support the designed loads. The aim of the present research is to develop a finite-element code using commercially available ABAQUS software for the structural integrity analysis of graphite core components under extreme temperature and irradiation conditions. In addition, the Reactor Geometry Generator tool-kit, developed at Argonne National Laboratory, is used to generate finite-element mesh for complex geometries such as fuel bricks with multiple pin holes and coolant flow channels. This paper presents the proposed concept and discusses results of stress analysis simulations of a fuel block with H-451 grade material properties. (authors)
Farmer, M. T.; Gerardi, C.; Bremer, N.; ...
2016-10-31
The reactor accidents at Fukushima-Dai-ichi have rekindled interest in late phase severe accident behavior involving reactor pressure vessel breach and discharge of molten core melt into the containment. Two technical issues of interest in this area include core-concrete interaction and the extent to which the core debris may be quenched and rendered coolable by top flooding. The OECD-sponsored Melt Coolability and Concrete Interaction (MCCI) programs at Argonne National Laboratory included the conduct of large scale reactor material experiments and associated analysis with the objectives of resolving the ex-vessel debris coolability issue, and to address remaining uncertainties related to long-term two-dimensional molten core-concrete interactions under both wet and dry cavity conditions. These tests provided a broad database to support accident management planning, as well as the development and validation of models and codes that can be used to extrapolate the experiment results to plant conditions. This paper provides a high level overview of the key experiment results obtained during the program. Finally, a discussion is also provided that describes technical gaps that remain in this area, several of which have arisen based on the sequence of events and operator actions during Fukushima.
Hydraulic Conductivity Measurements Barrow 2014
Katie McKnight; Tim Kneafsey; Craig Ulrich; Jil Geller
2015-02-22
Six individual ice cores were collected from Barrow Environmental Observatory in Barrow, Alaska, in May of 2013 as part of the Next Generation Ecosystem Experiment (NGEE). Each core was drilled from a different location at varying depths. A few days after drilling, the cores were stored in coolers packed with dry ice and flown to Lawrence Berkeley National Laboratory (LBNL) in Berkeley, CA. 3-dimensional images of the cores were constructed using a medical X-ray computed tomography (CT) scanner at 120kV. Hydraulic conductivity samples were extracted from these cores at LBNL Richmond Field Station in Richmond, CA, in February 2014 by cutting 5 to 8 inch segments using a chop saw. Samples were packed individually and stored at freezing temperatures to minimize any changes in structure or loss of ice content prior to analysis. Hydraulic conductivity was determined through falling head tests using a permeameter [ELE International, Model #: K-770B]. After approximately 12 hours of thaw, initial falling head tests were performed. Two to four measurements were collected on each sample and collection stopped when the applied head load exceeded 25% change from the original load. Analyses were performed between 2 to 3 times for each sample. The final hydraulic conductivity calculations were computed using methodology of Das et al., 1985.
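The falling-head permeameter calculation behind measurements like these is compact enough to show directly: K = (a·L)/(A·t)·ln(h1/h2). The sketch below is a minimal illustration; the geometry and head readings are hypothetical placeholders, not the Barrow measurements.

# Minimal falling-head permeameter calculation (hypothetical inputs).
import math

def falling_head_k(a_standpipe, area_sample, length_sample, t_seconds, h1, h2):
    """Hydraulic conductivity (m/s) from a falling-head test."""
    return (a_standpipe * length_sample) / (area_sample * t_seconds) * math.log(h1 / h2)

k = falling_head_k(a_standpipe=1.0e-4,   # m^2, standpipe cross-section
                   area_sample=8.0e-3,   # m^2, sample cross-section
                   length_sample=0.15,   # m, sample length
                   t_seconds=600.0,      # elapsed time between readings
                   h1=1.0, h2=0.75)      # m of head at start and end
print(f"K ~ {k:.2e} m/s")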
A highly efficient multi-core algorithm for clustering extremely large datasets
2010-01-01
Background In recent years, the demand for computational power in computational biology has increased due to rapidly growing data sets from microarray and other high-throughput technologies. This demand is likely to increase. Standard algorithms for analyzing data, such as cluster algorithms, need to be parallelized for fast processing. Unfortunately, most approaches for parallelizing algorithms largely rely on network communication protocols connecting and requiring multiple computers. One answer to this problem is to utilize the intrinsic capabilities in current multi-core hardware to distribute the tasks among the different cores of one computer. Results We introduce a multi-core parallelization of the k-means and k-modes cluster algorithms based on the design principles of transactional memory for clustering gene expression microarray type data and categorical SNP data. Our new shared-memory parallel algorithms prove to be highly efficient. We demonstrate their computational power and show their utility in cluster stability and sensitivity analysis employing repeated runs with slightly changed parameters. Computation speed of our Java-based algorithm was increased by a factor of 10 for large data sets while preserving computational accuracy compared to single-core implementations and a recently published network-based parallelization. Conclusions Most desktop computers and even notebooks provide at least dual-core processors. Our multi-core algorithms show that, using modern algorithmic concepts, parallelization makes it possible to perform even such laborious tasks as cluster sensitivity and cluster number estimation on the laboratory computer. PMID:20370922
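The authors' implementation is Java-based and built on transactional-memory design principles; as a loose illustration of the underlying idea (splitting the k-means assignment step across the cores of one machine), a minimal sketch in Python might look like the following. The function names and data are hypothetical, and this is not the published algorithm.

# Minimal multi-core k-means sketch: parallelize the assignment step only.
import numpy as np
from multiprocessing import Pool

def assign_chunk(args):
    chunk, centroids = args
    d = ((chunk[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)

def parallel_kmeans(data, k, n_iter=20, n_workers=4, seed=0):
    rng = np.random.default_rng(seed)
    centroids = data[rng.choice(len(data), k, replace=False)]
    chunks = np.array_split(data, n_workers)
    with Pool(n_workers) as pool:
        for _ in range(n_iter):
            labels = np.concatenate(
                pool.map(assign_chunk, [(c, centroids) for c in chunks]))
            for j in range(k):
                members = data[labels == j]
                if len(members):
                    centroids[j] = members.mean(axis=0)
    return centroids, labels

if __name__ == "__main__":
    X = np.random.rand(10_000, 50)          # expression-like matrix, hypothetical
    centroids, labels = parallel_kmeans(X, k=5)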
Impact Vaporization of Planetesimal Cores
NASA Astrophysics Data System (ADS)
Kraus, R. G.; Root, S.; Lemke, R. W.; Stewart, S. T.; Jacobsen, S. B.; Mattsson, T. R.
2013-12-01
The degree of mixing and chemical equilibration between the iron cores of planetesimals and the mantle of the growing Earth has important consequences for understanding the end stages of Earth's formation and planet formation in general. At the Sandia Z machine, we developed a new shock-and-release technique to determine the density on the liquid-vapor dome of iron, the entropy on the iron shock Hugoniot, and the criteria for shock-induced vaporization of iron. We find that the critical shock pressure to vaporize iron is 507(+65,-85) GPa and show that decompression from a 15 km/s impact will initiate vaporization of iron cores, a velocity that is readily achieved during the end stages of planet formation. Vaporization of the iron cores increases dispersal of planetesimal cores, enables more complete chemical equilibration of the planetesimal cores with Earth's mantle, and reduces the highly siderophile element abundance on the Moon relative to Earth due to the expanding iron vapor exceeding the Moon's escape velocity. Sandia National Laboratories is a multiprogram laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under Contract No. DE-AC04-94AL85000.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anthony Hendrickson; Brian Mennecke; Kevin Scheibe
2005-10-01
Modern forensics laboratories need Laboratory Information Management Systems (LIMS) implementations that allow the lab to track evidentiary items through their examination lifecycle and also serve all pertinent laboratory personnel. The research presented here describes LIMS core requirements as viewed by respondents serving in different forensic laboratory capacities as well as different forensic laboratory environments. A product-development methodology was employed to evaluate the relative value of the key features that constitute a LIMS, in order to develop a set of relative values for these features and the specifics of their implementation. In addition to the results of the product development analysis, this paper also provides an extensive review of LIMS and provides an overview of the preparation and planning process for the successful upgrade or implementation of a LIMS. Analysis of the data indicates that the relative value of LIMS components is viewed differently depending upon respondents' job roles (i.e., evidence technicians, scientists, and lab management), as well as by laboratory size. Specifically, the data show that: (1) Evidence technicians place the most value on chain of evidence capabilities and on chain of custody tracking; (2) Scientists generally place greatest value on report writing and generation, and on tracking daughter evidence that develops during their analyses; (3) Lab managers place the greatest value on chain of custody, daughter evidence, and not surprisingly, management reporting capabilities; and (4) Lab size affects LIMS preference in that, while all labs place daughter evidence tracking, chain of custody, and management and analyst report generation as their top three priorities, the order of this prioritization is size dependent.
Field Validation of Supercritical CO2 Reactivity with Basalts
DOE Office of Scientific and Technical Information (OSTI.GOV)
McGrail, B. Peter; Schaef, Herbert T.; Spane, Frank A.
2017-01-10
Continued global use of fossil fuels places a premium on developing technology solutions to minimize increases in atmospheric CO2 levels. CO2 storage in reactive basalts might be one of these solutions by permanently converting injected gaseous CO2 into solid carbonates. Herein we report results from a field demonstration where ~1000 MT of CO2 was injected into a natural basalt formation in Eastern Washington State. Following two years of post-injection monitoring, cores were obtained from within the injection zone and subjected to detailed physical and chemical analysis. Nodules found in vesicles throughout the cores were identified as the carbonate mineral ankerite, Ca[Fe, Mg, Mn](CO3)2. Carbon isotope analysis showed that the nodules are chemically distinct from natural carbonates present in the basalt and correlate clearly with the isotopic signature of the injected CO2. These findings provide field validation of rapid mineralization rates observed from years of laboratory testing with basalts.
Using the Laboratory to Engage All Students in Science Practices
ERIC Educational Resources Information Center
Walker, J. P.; Sampson, V.; Southerland, S.; Enderle, P. J.
2016-01-01
This study examines the extent to which the type of instruction used during a general chemistry laboratory course affects students' ability to use core ideas to engage in science practices. We use Ford's (2008) description of the nature of scientific practices to categorize what students do in the laboratory as either empirical or…
Developing Laboratory Skills by Incorporating Peer-Review and Digital Badges
ERIC Educational Resources Information Center
Seery, Michael K.; Agustian, Hendra Y.; Doidge, Euan D.; Kucharski, Maciej M.; O'Connor, Helen M.; Price, Amy
2017-01-01
Laboratory work is at the core of any chemistry curriculum but literature on the assessment of laboratory skills is scant. In this study we report the use of a peer-observation protocol underpinned by exemplar videos. Students are required to watch exemplar videos for three techniques (titrations, distillations, preparation of standard solutions)…
Medical Laboratory Services. Student's Manual. Cluster Core for Health Occupations Education.
ERIC Educational Resources Information Center
Williams, Catherine
This student's manual on medical laboratory services is one of a series of self-contained, individualized materials for students enrolled in training within the allied health field. It includes competencies that are associated with the performance of skills common to several occupations in the medical laboratory. The material is intended for use…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chu, T.L.; Musicki, Z.; Kohut, P.
1994-06-01
During 1989, the Nuclear Regulatory Commission (NRC) initiated an extensive program to carefully examine the potential risks during low power and shutdown operations. The program includes two parallel projects being performed by Brookhaven National Laboratory (BNL) and Sandia National Laboratories (SNL). Two plants, Surry (pressurized water reactor) and Grand Gulf (boiling water reactor), were selected as the plants to be studied. The objectives of the program are to assess the risks of severe accidents initiated during plant operational states other than full power operation and to compare the estimated core damage frequencies, important accident sequences and other qualitative and quantitative results with those of accidents initiated during full power operation as assessed in NUREG-1150. The objective of this report is to document the approach utilized for the Surry plant and discuss the results obtained. A parallel report for the Grand Gulf plant was prepared by SNL. This study shows that the core-damage frequency during mid-loop operation at the Surry plant is comparable to that of power operation. The authors recognize that there is very large uncertainty in the human error probabilities in this study. This study identified that only a few procedures are available for mitigating accidents that may occur during shutdown. Procedures written specifically for shutdown accidents would be useful.
Discovery of the Ubiquitous Cation NS+ in Space Confirmed by Laboratory Spectroscopy
NASA Astrophysics Data System (ADS)
Cernicharo, J.; Lefloch, B.; Agúndez, M.; Bailleux, S.; Margulès, L.; Roueff, E.; Bachiller, R.; Marcelino, N.; Tercero, B.; Vastel, C.; Caux, E.
2018-02-01
We report the detection in space of a new molecular species that has been characterized spectroscopically and fully identified from astrophysical data. The observations were carried out with the IRAM 30 m telescope. The molecule is ubiquitous as its J = 2 → 1 transition has been found in cold molecular clouds, prestellar cores, and shocks. However, it is not found in the hot cores of Orion-KL and in the carbon-rich evolved star IRC+10216. Three rotational transitions in perfect harmonic relation (J′ = 2, 3, 5) have been identified in the prestellar core B1b. The molecule has a ¹Σ electronic ground state and its J = 2 → 1 transition presents the hyperfine structure characteristic of a molecule containing a nucleus with spin 1. A careful analysis of possible carriers shows that the best candidate is NS+. The derived rotational constant agrees within 0.3%–0.7% with ab initio calculations. NS+ was also produced in the laboratory to unambiguously validate the astrophysical assignment. The observed rotational frequencies and determined molecular constants confirm the discovery of the nitrogen sulfide cation in space. The chemistry of NS+ and related nitrogen-bearing species has been analyzed by means of a time-dependent gas-phase model. The model reproduces well the observed NS/NS+ abundance ratio, in the range 30–50, and indicates that NS+ is formed by reactions of the neutral atoms N and S with the cations SH+ and NH+, respectively.
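The "perfect harmonic relation" mentioned above follows from the rigid-rotor expression for a closed-shell linear molecule, ν(J → J−1) ≈ 2BJ, so the three observed frequencies fall in the ratio 2:3:5. The sketch below uses an assumed, round rotational constant purely for illustration; it is not the value derived in the paper, and centrifugal distortion is neglected.

# Harmonic pattern of rotational transitions for a 1-Sigma linear rotor.
B_MHZ = 25_000.0  # assumed rotational constant, illustrative only

for j_up in (2, 3, 5):
    nu_mhz = 2.0 * B_MHZ * j_up          # rigid-rotor frequency, nu = 2*B*J
    print(f"J = {j_up} -> {j_up - 1}: ~{nu_mhz / 1000.0:.1f} GHz")
# The three frequencies scale as 2:3:5, the harmonic relation used to
# identify a single carrier behind the observed lines.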
2003-07-18
KENNEDY SPACE CENTER, FLA. - STS-120 Mission Specialists Piers Sellers and Michael Foreman are in the Space Station Processing Facility for hardware familiarization. The mission will deliver the second of three Station connecting modules, Node 2, which attaches to the end of the U.S. Lab. It will provide attach locations for the Japanese laboratory, the European laboratory, the Centrifuge Accommodation Module and, later, Multi-Purpose Logistics Modules. The addition of Node 2 will complete the U.S. core of the International Space Station.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daniel, G.; Rudisill, T.
2017-07-17
As part of the Spent Nuclear Fuel (SNF) processing campaign, H-Canyon is planning to begin dissolving High Flux Isotope Reactor (HFIR) fuel in late FY17 or early FY18. Each HFIR fuel core contains inner and outer fuel elements which were fabricated from uranium oxide (U3O8) dispersed in a continuous Al phase using traditional powder metallurgy techniques. Fuels fabricated in this manner, like other SNFs processed in H-Canyon, dissolve by the same general mechanisms with similar gas generation rates and the production of H2. The HFIR fuel cores will be dissolved using a flowsheet developed by the Savannah River National Laboratory (SRNL) in either the 6.4D or 6.1D dissolver using a unique insert. Multiple cores will be charged to the same dissolver solution to maximize the concentration of dissolved Al. The recovered U will be down-blended into low-enriched U for subsequent use as commercial reactor fuel. During the development of the HFIR fuel dissolution flowsheet, the cycle time for the initial core was estimated at 28 to 40 h. Once the cycle is complete, H-Canyon personnel will open the dissolver and probe the HFIR insert wells to determine the height of any fuel fragments which did not dissolve. Before the next core can be charged to the dissolver, an analysis of the potential for H2 gas generation must show that the combined surface area of the fuel fragments and the subsequent core will not generate H2 concentrations in the dissolver offgas which exceed 60% of the lower flammability limit (LFL) of H2 at 200 °C. The objective of this study is to identify the maximum fuel fragment height as a function of the Al concentration in the dissolving solution, which will provide criteria for charging successive HFIR cores to an H-Canyon dissolver.
A NEW METHOD TO QUANTIFY CORE TEMPERATURE INSTABILITY IN RODENTS.
Methods to quantify instability of autonomic systems such as temperature regulation should be important in toxicant and drug safety studies. Stability of core temperature (Tc) in laboratory rodents is susceptible to a variety of stimuli. Calculating the temperature differential o...
NCI Core Open House Shines Spotlight on Supportive Science and Basic Research | Poster
The lobby of Building 549 at NCI at Frederick bustled with activity for two hours on Tuesday, May 1, as several dozen scientists and staff gathered for the NCI Core Open House. The event aimed to encourage discussion and educate visitors about the capabilities of the cores, laboratories, and facilities that offer support to NCI’s Center for Cancer Research.
RALPH: An online computer program for acquisition and reduction of pulse height data
NASA Technical Reports Server (NTRS)
Davies, R. C.; Clark, R. S.; Keith, J. E.
1973-01-01
A background/foreground data acquisition and analysis system incorporating a high level control language was developed for acquiring both singles and dual parameter coincidence data from scintillation detectors at the Radiation Counting Laboratory at the NASA Manned Spacecraft Center in Houston, Texas. The system supports acquisition of gamma ray spectra in a 256 x 256 coincidence matrix (utilizing disk storage) and simultaneous operation of any of several background support and data analysis functions. In addition to special instruments and interfaces, the hardware consists of a PDP-9 with 24K core memory, 256K words of disk storage, and Dectape and Magtape bulk storage.
ERIC Educational Resources Information Center
Piunno, Paul A. E.; Zetina, Adrian; Chu, Norman; Tavares, Anthony J.; Noor, M. Omair; Petryayeva, Eleonora; Uddayasankar, Uvaraj; Veglio, Andrew
2014-01-01
An advanced analytical chemistry undergraduate laboratory module on microfluidics that spans 4 weeks (4 h per week) is presented. The laboratory module focuses on comprehensive experiential learning of microfluidic device fabrication and the core characteristics of microfluidic devices as they pertain to fluid flow and the manipulation of samples.…
Evaluation of the impact of a total automation system in a large core laboratory on turnaround time.
Lou, Amy H; Elnenaei, Manal O; Sadek, Irene; Thompson, Shauna; Crocker, Bryan D; Nassar, Bassam
2016-11-01
Growing financial and workload pressures on laboratories, coupled with user demands for faster turnaround time (TAT), have steered the implementation of total laboratory automation (TLA). The current study evaluates the impact of a complex TLA on core laboratory efficiency through analysis of the In-lab to Report TAT (IR-TAT) for five representative tests based on the different requested priorities. Mean, median and outlier percentages (OP) for IR-TAT were determined following TLA implementation and, where possible, compared to the pre-TLA era. The shortest mean IR-TAT via the priority lanes of the TLA was 22 min for Complete Blood Count (CBC), followed by 34 min, 39 min and 40 min for prothrombin time (PT), urea and potassium testing, respectively. The mean IR-TAT for STAT CBC loaded directly onto the analyzers was 5 min shorter than that processed via the TLA. The mean IR-TATs for both STAT potassium and urea via offline centrifugation were comparable to those processed by the TLA. The longest mean IR-TAT via regular lanes of the TLA was 62 min for Thyroid-Stimulating Hormone (TSH), while the shortest was 17 min for CBC. All IR-TAT parameters for CBC and PT tests decreased significantly post-TLA across all requested priorities, in particular the outlier percentage (OP) at 30 and 60 min. TLA helps to efficiently manage substantial volumes of samples across all requested priorities. Manual processing for small STAT volumes, at both the initial centrifugation stage and front loading directly onto analyzers, is however likely to yield the shortest IR-TAT. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
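The summary statistics used above (mean, median and outlier percentage at fixed thresholds) are simple to compute from timestamped in-lab and report times. The sketch below is a generic illustration with made-up values, not the study's data.

# Generic IR-TAT summary: mean, median, and outlier percentage (OP).
import numpy as np

ir_tat_min = np.array([18, 22, 25, 31, 44, 19, 27, 36, 58, 21])  # hypothetical, minutes

print("mean  :", ir_tat_min.mean())
print("median:", np.median(ir_tat_min))
for threshold in (30, 60):                       # OP at 30 and 60 min
    op = 100.0 * (ir_tat_min > threshold).mean()
    print(f"OP > {threshold} min: {op:.1f}%")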
NASA Astrophysics Data System (ADS)
Perea, D. E.; Evans, J. E.
2017-12-01
The ability to image biointerfaces over nanometer to micrometer length scales is fundamental to correlating biological composition and structure to physiological function, and is aided by a multimodal approach using advanced complementary microscopic and spectroscopic characterization techniques. Atom Probe Tomography (APT) is a rapidly expanding technique for atomic-scale three-dimensional structural and chemical analysis. However, the regular application of APT to soft biological materials is lacking, in large part due to difficulties in specimen preparation and an inability to yield meaningful tomographic reconstructions that produce atomic-scale compositional distributions, as no other technique currently can. Here we describe the atomic-scale tomographic analysis of biological materials using APT, facilitated by an advanced focused ion beam based approach. A novel specimen preparation strategy is used in the analysis of horse spleen ferritin protein embedded in an organic polymer resin, which provides chemical contrast to distinguish the inorganic-organic interface of the ferrihydrite mineral core and protein shell of the ferritin protein. One-dimensional composition profiles directly reveal an enhanced concentration of P and Na at the surface of the ferrihydrite mineral core. We will also describe the development of a unique multifunctional environmental transfer hub allowing controlled cryogenic transfer of specimens under vacuum pressure conditions between an Atom Probe and a cryo-FIB/SEM. The utility of the environmental transfer hub is demonstrated through the acquisition of previously unavailable mass spectral analysis of an intact organometallic molecule, made possible via controlled cryogenic transfer. The results demonstrate a viable application of APT analysis to study complex biological organic/inorganic interfaces relevant to energy and the environment. References: D.E. Perea et al., An environmental transfer hub for multimodal atom probe tomography, Adv. Struct. Chem. Imag., 2017, 3:12. The research was performed at the Environmental Molecular Sciences Laboratory, a national scientific user facility sponsored by the Department of Energy's Office of Biological and Environmental Research located at Pacific Northwest National Laboratory.
NASA Astrophysics Data System (ADS)
Bonnelye, Audrey; David, Christian; Schubnel, Alexandre; Wassermann, Jérôme; Lefèvre, Mélody; Henry, Pierre; Guglielmi, Yves; Castilla, Raymi; Dick, Pierre
2017-04-01
Faults in general, and in clay materials in particular, have complex structures that can be linked both to a polyphased tectonic history and to the anisotropic nature of the material. Drilling through faults in shaly materials allows one to measure the structure, the mineralogical composition, the stress orientation and other physical properties. These relations can be investigated in the laboratory in order to gain a better understanding of in-situ mechanisms. In this study we used shales of Toarcian age from the Tournemire underground research laboratory (France). We coupled different petrophysical measurements on core samples retrieved from a borehole drilled perpendicularly to a fault plane; the fault size is of the order of tens of meters. This 25 m long borehole was sampled in order to perform several types of measurements: density, porosity and saturation directly in the field, and elastic wave velocities and anisotropy of magnetic susceptibility in the laboratory. For all these measurements, special protocols were developed in order to preserve as much as possible the saturation state of the samples. All measurements were carried out in the three zones intersected by the borehole: the intact zone, the damaged zone and the fault core zone. From our measurements, we were able to associate specific properties with each zone of the fault. We then calculated Thomsen's parameters in order to quantify the elastic anisotropy across the fault. Our results show strong variations of the elastic anisotropy with distance to the fault core, as well as the occurrence of anisotropy reversal.
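Two of Thomsen's weak-anisotropy parameters, epsilon and gamma, can be obtained directly from P- and S-wave velocities measured parallel and perpendicular to bedding; delta additionally requires an oblique measurement and is omitted here. The sketch below is a generic illustration of those definitions with hypothetical velocities, not the authors' workflow or data.

# Thomsen epsilon and gamma from bedding-parallel and bedding-normal velocities.
def thomsen_epsilon(vp_parallel, vp_perpendicular):
    return (vp_parallel**2 - vp_perpendicular**2) / (2.0 * vp_perpendicular**2)

def thomsen_gamma(vsh_parallel, vsh_perpendicular):
    return (vsh_parallel**2 - vsh_perpendicular**2) / (2.0 * vsh_perpendicular**2)

eps = thomsen_epsilon(vp_parallel=3400.0, vp_perpendicular=2900.0)   # m/s, hypothetical
gam = thomsen_gamma(vsh_parallel=1900.0, vsh_perpendicular=1600.0)   # m/s, hypothetical
print(f"epsilon = {eps:.2f}, gamma = {gam:.2f}")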
DESIGN CHARACTERISTICS OF THE IDAHO NATIONAL LABORATORY HIGH-TEMPERATURE GAS-COOLED TEST REACTOR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sterbentz, James; Bayless, Paul; Strydom, Gerhard
2016-11-01
Uncertainty and sensitivity analysis is an indispensable element of any substantial attempt at reactor simulation validation. The quantification of uncertainties in nuclear engineering has grown more important, and the IAEA Coordinated Research Program (CRP) on High-Temperature Gas-Cooled Reactors (HTGRs), initiated in 2012, aims to investigate the various uncertainty quantification methodologies for this type of reactor. The first phase of the CRP is dedicated to the estimation of cell and lattice model uncertainties due to the neutron cross-section covariances. Phase II is oriented towards the investigation of uncertainties propagated from the lattice to the coupled neutronics/thermal hydraulics core calculations. Nominal results for the prismatic single block (Ex.I-2a) and super cell models (Ex.I-2c) have been obtained using the SCALE 6.1.3 two-dimensional lattice code NEWT coupled to the TRITON sequence for cross section generation. In this work, the TRITON/NEWT-flux-weighted cross sections obtained for Ex.I-2a and various models of Ex.I-2c are utilized to perform a sensitivity analysis of the MHTGR-350 core power densities and eigenvalues. The core solutions are obtained with the INL coupled code PHISICS/RELAP5-3D, utilizing a fixed-temperature feedback for Ex. II-1a. It is observed that the core power density does not vary significantly in shape, but the magnitude of these variations increases as the moderator-to-fuel ratio increases in the super cell lattice models.
Lead Coolant Test Facility Systems Design, Thermal Hydraulic Analysis and Cost Estimate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soli Khericha; Edwin Harvego; John Svoboda
2012-01-01
The Idaho National Laboratory prepared preliminary technical and functional requirements (T&FR), a thermal hydraulic design, and a cost estimate for a lead coolant test facility. The purpose of this small scale facility is to simulate lead coolant fast reactor (LFR) coolant flow in an open lattice geometry core using seven electrical rods and liquid lead or lead-bismuth eutectic coolant. Based on review of current world lead or lead-bismuth test facilities and research needs listed in the Generation IV Roadmap, five broad areas of requirements were identified as listed: (1) Develop and Demonstrate Feasibility of Submerged Heat Exchanger; (2) Develop and Demonstrate Open-lattice Flow in Electrically Heated Core; (3) Develop and Demonstrate Chemistry Control; (4) Demonstrate Safe Operation; and (5) Provision for Future Testing. This paper discusses the preliminary design of systems, thermal hydraulic analysis, and simplified cost estimate. The facility thermal hydraulic design is based on a maximum simulated core power of 420 kW using seven electrical heater rods, with an average linear heat generation rate of 300 W/cm. The core inlet temperature for liquid lead or Pb/Bi eutectic is 420 °C. The design includes approximately seventy-five data measurements such as pressure, temperature, and flow rates. The preliminary estimated cost of construction of the facility is $3.7M (in 2006 dollars). It is also estimated that the facility will require two years to be constructed and ready for operation.
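As a quick arithmetic cross-check using only the figures quoted above (420 kW total, seven rods, 300 W/cm average linear heat generation rate), the implied heated length per rod can be computed directly; the result is not a stated design parameter of the facility, just a consistency estimate.

# Consistency check from the quoted numbers only.
core_power_w = 420e3
n_rods = 7
linear_rate_w_per_cm = 300.0

power_per_rod_w = core_power_w / n_rods                      # 60 kW per rod
heated_length_cm = power_per_rod_w / linear_rate_w_per_cm    # ~200 cm per rod
print(f"implied heated length ~ {heated_length_cm:.0f} cm per rod")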
Optimizing the design of a reproduction toxicity test with the pond snail Lymnaea stagnalis.
Charles, Sandrine; Ducrot, Virginie; Azam, Didier; Benstead, Rachel; Brettschneider, Denise; De Schamphelaere, Karel; Filipe Goncalves, Sandra; Green, John W; Holbech, Henrik; Hutchinson, Thomas H; Faber, Daniel; Laranjeiro, Filipe; Matthiessen, Peter; Norrgren, Leif; Oehlmann, Jörg; Reategui-Zirena, Evelyn; Seeland-Fremer, Anne; Teigeler, Matthias; Thome, Jean-Pierre; Tobor Kaplon, Marysia; Weltje, Lennart; Lagadic, Laurent
2016-11-01
This paper presents the results from two ring-tests addressing the feasibility, robustness and reproducibility of a reproduction toxicity test with the freshwater gastropod Lymnaea stagnalis (RENILYS strain). Sixteen laboratories (from inexperienced to expert laboratories in mollusc testing) from nine countries participated in these ring-tests. Survival and reproduction were evaluated in L. stagnalis exposed to cadmium, tributyltin, prochloraz and trenbolone according to an OECD draft Test Guideline. In total, 49 datasets were analysed to assess the practicability of the proposed experimental protocol, and to estimate the between-laboratory reproducibility of toxicity endpoint values. The statistical analysis of count data (number of clutches or eggs per individual-day) leading to ECx estimation was specifically developed and automated through a free web-interface. Based on a complementary statistical analysis, the optimal test duration was established and the most sensitive and cost-effective reproduction toxicity endpoint was identified, to be used as the core endpoint. This validation process and the resulting optimized protocol were used to consolidate the OECD Test Guideline for the evaluation of reproductive effects of chemicals in L. stagnalis. Copyright © 2016 Elsevier Inc. All rights reserved.
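The ring-test analysis of count data and ECx estimation was run through a purpose-built, automated web interface; the sketch below is only a generic, simplified illustration of how an ECx can be derived from reproduction counts with a log-logistic fit, using made-up data and assumed parameter names.

# Generic log-logistic fit and ECx back-calculation (illustrative only).
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(conc, top, ec50, slope):
    return top / (1.0 + (conc / ec50) ** slope)

conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])        # hypothetical exposure levels
eggs_per_day = np.array([5.1, 4.9, 4.2, 2.8, 1.1, 0.3])  # hypothetical mean counts

(top, ec50, slope), _ = curve_fit(log_logistic, conc, eggs_per_day, p0=[5.0, 3.0, 1.5])
x = 10.0                                                  # percent effect of interest
ecx = ec50 * (x / (100.0 - x)) ** (1.0 / slope)
print(f"EC50 ~ {ec50:.2f}, EC{int(x)} ~ {ecx:.2f} (same units as conc)")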
Nano-catalysts with Magnetic Core: Sustainable Options for Greener Synthesis
Author’s perspective on nano-catalysts with magnetic core is summarized with recent work from his laboratory. Magnetically recyclable nano-catalysts and their use in benign media is an ideal blend for the development of sustainable methodologies in organic synthesis. Water or pol...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brauer, Carolyn S.; Pearson, John C.; Drouin, Brian J.
The spectrum of ethyl cyanide, or propionitrile (CH3CH2CN), has been repeatedly observed in the interstellar medium with large column densities and surprisingly high temperatures in hot core sources. The construction of new, more sensitive, observatories accessing higher frequencies such as Herschel, ALMA, and SOFIA have made it important to extend the laboratory data for ethyl cyanide to coincide with the capabilities of the new instruments. We report extensions of the laboratory measurements of the rotational spectrum of ethyl cyanide in its ground vibrational state to 1.6 THz. A global analysis of the ground state, which includes all of the previous data and 3356 newly assigned transitions, has been fitted to within experimental error to J = 132, K = 36, using both Watson A-reduced and Watson S-reduced Hamiltonians.
Laboratory Directed Research and Development FY2001 Annual Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Al-Ayat, R
2002-06-20
Established by Congress in 1991, the Laboratory Directed Research and Development (LDRD) Program provides the Department of Energy (DOE)/National Nuclear Security Administration (NNSA) laboratories, like Lawrence Livermore National Laboratory (LLNL or the Laboratory), with the flexibility to invest up to 6% of their budget in long-term, high-risk, and potentially high payoff research and development (R&D) activities to support the DOE/NNSA's national security missions. By funding innovative R&D, the LDRD Program at LLNL develops and extends the Laboratory's intellectual foundations and maintains its vitality as a premier research institution. As proof of the Program's success, many of the research thrusts that started many years ago under LDRD sponsorship are at the core of today's programs. The LDRD Program, which serves as a proving ground for innovative ideas, is the Laboratory's most important single resource for fostering excellent science and technology for today's needs and tomorrow's challenges. Basic and applied research activities funded by LDRD enhance the Laboratory's core strengths, driving its technical vitality to create new capabilities that enable LLNL to meet DOE/NNSA's national security missions. The Program also plays a key role in building a world-class multidisciplinary workforce by engaging the Laboratory's best researchers, recruiting its future scientists and engineers, and promoting collaborations with all sectors of the larger scientific community.
The Joint European Compound Library: boosting precompetitive research.
Besnard, Jérémy; Jones, Philip S; Hopkins, Andrew L; Pannifer, Andrew D
2015-02-01
The Joint European Compound Library (JECL) is a new high-throughput screening collection aimed at driving precompetitive drug discovery and target validation. The JECL has been established with a core of over 321,000 compounds from the proprietary collections of seven pharmaceutical companies and will expand to around 500,000 compounds. Here, we analyse the physicochemical profile and chemical diversity of the core collection, showing that the collection is diverse and has a broad spectrum of predicted biological activity. We also describe a model for sharing compound information from multiple proprietary collections, enabling diversity and quality analysis without disclosing structures. The JECL is available for screening at no cost to European academic laboratories and SMEs through the IMI European Lead Factory (http://www.europeanleadfactory.eu/). Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
Iles, Ray K; Cole, Laurence A; Butler, Stephen A
2014-06-05
The analysis of human chorionic gonadotropin (hCG) in clinical chemistry laboratories by specific immunoassay is well established. However, changes in glycosylation are not as easily assayed, and yet alterations in hCG glycosylation are associated with abnormal pregnancy. hCGβ-core fragment (hCGβcf) was isolated from the urine of women with normal, molar and hyperemesis gravidarum pregnancies. Each sample was subjected to matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) analysis following dithiothreitol (DTT) reduction, and fingerprint spectra of peptide hCGβ 6-40 were analyzed. Samples were variably glycosylated; most structures were small, core-type and largely mono-antennary. Larger single bi-antennary and mixtures of larger mono-antennary and bi-antennary moieties were also observed in some samples. Larger glycoforms were more abundant in the abnormal pregnancies, and tri-antennary carbohydrate moieties were only observed in the samples from molar and hyperemesis gravidarum pregnancies. Given that such spectral profiling differences may be characteristic, development of small-scale sample preparation for mass spectral analysis of hCG may lead to a simpler and faster approach to glycostructural analysis and potentially a novel clinical diagnostic test.
NASA Astrophysics Data System (ADS)
Strasser, M.; Dugan, B.; Henry, P.; Jurado, M. J.; Kanagawa, K.; Kanamatsu, T.; Moore, G. F.; Panieri, G.; Pini, G. A.
2014-12-01
Multibeam swath bathymetry and reflection seismic data image large submarine landslide complexes along ocean margins worldwide. However, slope failure initiation, acceleration of motion and mass-transport dynamics of submarine landslides, which are all key to assessing their tsunamigenic potential or impact on offshore infrastructure, cannot be conclusively deduced from the geometric expression and acoustic characteristics of geophysical data sets alone; cores and in situ data from the subsurface are needed to complement our understanding of submarine landslide dynamics. Here we present data and results from drilling, logging and coring thick mass-transport deposits (MTDs) in the Nankai Trough accretionary prism during Integrated Ocean Drilling Program (IODP) Expeditions 333 and 338. We integrate analysis of 3D seismic and Logging While Drilling (LWD) data sets with data from laboratory analysis of core samples (geotechnical shear experiments, X-ray Computed Tomography (X-CT), Scanning Electron Microscopy (SEM) of deformation indicators, and magnetic fabric analysis) to study the nature and mode of deformation and the dynamics of mass transport in this active tectonic setting. In particular, we show that Fe-S filaments commonly observed on X-ray CT data of marine sediments, likely resulting from early diagenesis of worm burrows, are folded in large MTDs and display preferential orientation at their base. The observed lineation has low dip and is interpreted as the consequence of shear along the basal surface, revealing a new proxy for strain in soft sediments that can be applied to cores that reach through the entire depth of MTDs. Shear deformation in the lower part of thick MTDs is also revealed from AMS data, which - in combination with other paleo-magnetic data - is used to reconstruct strain and transport direction of the landslides.
Newell, K.D.
2007-01-01
Drill cuttings can be used for desorption analyses, but with more uncertainty than desorption analyses done with cores. Drill cuttings are not recommended to take the place of core, but in some circumstances desorption work with cuttings can provide a timely and economic supplement to that of cores. The mixed lithologic nature of drill cuttings is the primary source of uncertainty in their analysis for gas content, for it is unclear how to apportion the gas generated between the coal and the dark-colored shale that is usually mixed in with the coal. In the Western Interior Coal Basin in eastern Kansas (Pennsylvanian-age coals), dark-colored shales with normal (~100 API units) gamma-ray levels seem to give off minimal amounts of gas, on the order of less than five standard cubic feet per ton (scf/ton). In some cuttings analyses this rule of thumb for the gas content of the shale is adequate for inferring the gas content of coals, but shales with high gamma-ray values (>150 API units) may yield several times this amount of gas. The uncertainty in desorption analysis of drill cuttings can be depicted graphically on a diagram identified as a "lithologic component sensitivity analysis diagram." Comparison of cuttings desorption results from nearby wells on this diagram can sometimes yield a unique solution for the gas content of both a dark shale and the coal mixed in a cuttings sample. A mathematical solution, based on equating the dry, ash-free gas contents of the admixed coal and dark-colored shale, also yields results that are correlative with data from nearby cores. © 2007 International Association for Mathematical Geology.
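The two-component apportionment problem described above can be illustrated as a small linear system: if each cuttings sample is a known mixture of coal and dark shale, the measured bulk gas contents of two samples give two equations in the two unknown end-member gas contents. The sketch below is illustrative only; the mixture fractions and gas contents are hypothetical and not taken from the paper.

# Two cuttings samples, two unknown end-member gas contents (coal, dark shale).
import numpy as np

# rows = cuttings samples; columns = weight fractions of [coal, dark shale]
fractions = np.array([[0.60, 0.40],
                      [0.25, 0.75]])
measured_gas = np.array([55.0, 28.0])   # scf/ton of each bulk cuttings sample

coal_gas, shale_gas = np.linalg.solve(fractions, measured_gas)
print(f"coal ~ {coal_gas:.0f} scf/ton, dark shale ~ {shale_gas:.0f} scf/ton")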
Raff, Lester J; Engel, George; Beck, Kenneth R; O'Brien, Andrea S; Bauer, Meagan E
2009-02-01
The elimination or reduction of medical errors has been a main focus of health care enterprises in the United States since the year 2000. Elimination of errors in patient and specimen identification is a key component of this focus and is the number one goal in the Joint Commission's 2008 National Patient Safety Goals Laboratory Services Program. To evaluate the effectiveness of using permanent inks to maintain specimen identity in sequentially submitted prostate needle biopsies. For a 12-month period, a grossing technician stained each prostate core with permanent ink developed for inking of pathology specimens. A different color was used for each patient, with all the prostate cores from all vials for a particular patient inked with the same color. Five colors were used sequentially: green, blue, yellow, orange, and black. The ink was diluted with distilled water to a consistency that allowed application of a thin, uniform coating of ink along the edges of the prostate core. The time required to ink patient specimens comprising different numbers of vials and prostate biopsies was recorded. The number and type of inked specimen discrepancies were evaluated. The identified discrepancy rate for prostate biopsy patients was 0.13%. The discrepancy rate in terms of total number of prostate blocks was 0.014%. Diluted inks adhered to biopsy contours throughout tissue processing. The tissue showed no untoward reactions to the inks. Inking did not affect staining (histochemical or immunohistochemical) or pathologic evaluation. On average, inking prostate needle biopsies increases grossing time by 20%. Inking of all prostate core biopsies with colored inks, in sequential order, is an aid in maintaining specimen identity. It is a simple and effective method of addressing Joint Commission patient safety goals by maintaining specimen identity during processing of similar types of gross specimens. This technique may be applicable in other specialty laboratories and high-volume laboratories, where many similar tissue specimens are processed.
ETD QA CORE TEAM: AN ELOQUENT SOLUTION TO A COMPLEX PROBLEM
ETD QA CORE TEAM: AN ELOQUENT SOLUTION TO A COMPLEX PROBLEMThomas J. Hughes, QA and Records Manager, Experimental Toxicology Division (ETD), National Health and Environmental Effects Research Laboratory (NHEERL), ORD, U.S. EPA, RTP, NC 27709
ETD is the largest health divis...
Study on fracture identification of shale reservoir based on electrical imaging logging
NASA Astrophysics Data System (ADS)
Yu, Zhou; Lai, Fuqiang; Xu, Lei; Liu, Lin; Yu, Tong; Chen, Junyu; Zhu, Yuantong
2017-05-01
In recent years, shale gas exploration has developed rapidly and achieved major breakthroughs, and in this context the study of mud shale fractures is extremely important. The development of fractures plays an important role in the development of gas reservoirs. Based on core observation and the analysis of laboratory thin sections and other laboratory materials, this paper classifies the lithology of the shale reservoirs of the XX well in the Zhanhua Depression. Based on the logging response of the mudstone fractures, the relationship between fracture development and logging response is used to combine conventional logging and electrical imaging logging for fracture identification, leading to the determination of fracture types in the area and a quantitative calculation of fracture parameters. From the analysis of the XX well in the Zhanhua Depression, it is concluded that high-angle fractures dominate the study area and that microfractures are developed. The shape of the fractures can be clearly seen with imaging logging technology, allowing their type to be determined.
Update on the magnetic resonance imaging core of the Alzheimer's disease neuroimaging initiative.
Jack, Clifford R; Bernstein, Matt A; Borowski, Bret J; Gunter, Jeffrey L; Fox, Nick C; Thompson, Paul M; Schuff, Norbert; Krueger, Gunnar; Killiany, Ronald J; Decarli, Charles S; Dale, Anders M; Carmichael, Owen W; Tosun, Duygu; Weiner, Michael W
2010-05-01
Functions of the Alzheimer's Disease Neuroimaging Initiative (ADNI) magnetic resonance imaging (MRI) core fall into three categories: (1) those of the central MRI core laboratory at Mayo Clinic, Rochester, Minnesota, needed to generate high quality MRI data in all subjects at each time point; (2) those of the funded ADNI MRI core imaging analysis groups responsible for analyzing the MRI data; and (3) the joint function of the entire MRI core in designing and problem solving MR image acquisition, pre-processing, and analyses methods. The primary objective of ADNI was and continues to be improving methods for clinical trials in Alzheimer's disease. Our approach to the present ("ADNI-GO") and future ("ADNI-2," if funded) MRI protocol will be to maintain MRI methodological consistency in the previously enrolled "ADNI-1" subjects who are followed up longitudinally in ADNI-GO and ADNI-2. We will modernize and expand the MRI protocol for all newly enrolled ADNI-GO and ADNI-2 subjects. All newly enrolled subjects will be scanned at 3T with a core set of three sequence types: 3D T1-weighted volume, FLAIR, and a long TE gradient echo volumetric acquisition for micro hemorrhage detection. In addition to this core ADNI-GO and ADNI-2 protocol, we will perform vendor-specific pilot sub-studies of arterial spin-labeling perfusion, resting state functional connectivity, and diffusion tensor imaging. One of these sequences will be added to the core protocol on systems from each MRI vendor. These experimental sub-studies are designed to demonstrate the feasibility of acquiring useful data in a multicenter (but single vendor) setting for these three emerging MRI applications. Copyright 2010 The Alzheimer
A comprehensive Laboratory Services Survey of State Public Health Laboratories.
Inhorn, Stanley L; Wilcke, Burton W; Downes, Frances Pouch; Adjanor, Oluwatosin Omolade; Cada, Ronald; Ford, James R
2006-01-01
In November 2004, the Association of Public Health Laboratories (APHL) conducted a Comprehensive Laboratory Services Survey of State Public Health Laboratories (SPHLs) in order to establish the baseline data necessary for Healthy People 2010 Objective 23-13. This objective aims to measure the increase in the proportion of health agencies that provide or assure access to comprehensive laboratory services to support essential public health services. This assessment addressed only SPHLs and served as a baseline to periodically evaluate the level of improvement in the provision of laboratory services over the decade ending 2010. The 2004 survey used selected questions that were identified as key indicators of provision of comprehensive laboratory services. The survey was developed in consultation with the Centers for Disease Control and Prevention National Center for Health Statistics, based on newly developed data sources. Forty-seven states and one territory responded to the survey. The survey was based on the 11 core functions of SPHLs as previously defined by APHL. The range of performance among individual laboratories for the 11 core functions (subobjectives) reflects the challenging issues that have confronted SPHLs in the first half of this decade. APHL is now working on a coordinated effort with other stakeholders to create seamless state and national systems for the provision of laboratory services in support of public health programs. These services are necessary to help face the threats raised by the specter of terrorism, emerging infections, and natural disasters.
Basic data from five core holes in the Raft River geothermal area, Cassia County, Idaho
Crosthwaite, E. G.
1976-01-01
meters) were completed in the area (Crosthwaite, 1974), and the Aerojet Nuclear Company, under the auspices of the U.S. Energy Research and Development Administration, was planning some deep drilling 4,000 to 6,000 feet (1,200 to 1,800 meters) (fig. 1). The purpose of the core drilling was to provide information to test geophysical interpretations of the subsurface structure and lithology and to provide hydrologic and geologic data on the shallow part of the geothermal system. Samples of the core were made available to several divisions and branches of the Geological Survey and to people and agencies outside the Survey. This report presents the basic data from the core holes that had been collected to September 1, 1975, and includes lithologic and geophysical well logs, chemical analyses of water (table 1), and laboratory analyses of cores (table 2) that were completed as of the above date. The data were collected by the Idaho District office, Hydrologic Laboratory, Borehole Geophysics Research Project, and Drilling, Sampling, and Testing Section, all of the Water Resources Division, and the Branch of Central Environmental Geology of the Geologic Division.
Blick, Kenneth E
2013-08-01
To develop a fully automated core laboratory, handling samples on a "first in, first out" real-time basis with Lean/Six Sigma management tools. Our primary goal was to provide services to critical care areas, eliminating turnaround time outlier percentage (TAT-OP) as a factor in patient length of stay (LOS). A secondary goal was to achieve a better laboratory return on investment. In 2011, we reached our primary goal when we calculated the TAT-OP distribution and found we had achieved a Six Sigma level of performance, ensuring that our laboratory service can be essentially eliminated as a factor in emergency department patient LOS. We also measured return on investment, showing a productivity improvement of 35%, keeping pace with our increased testing volume. As a result of our Lean process improvements and Six Sigma initiatives, in part through (1) strategic deployment of point-of-care testing and (2) core laboratory total automation with robotics, middleware, and expert system technology, physicians and nurses at the Oklahoma University Medical Center can more effectively deliver lifesaving health care using evidence-based protocols that depend heavily on "on time, every time" laboratory services.
A novel enzyme-based acidizing system: Matrix acidizing and drilling fluid damage removal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, R.E.; McKay, D.M.; Moses, V.
1995-12-31
A novel acidizing process is used to increase the permeability of carbonate rock cores in the laboratory and to remove drilling fluid damage from cores and wafers. Field results show the benefits of the technology as applied both to injector and producer wells.
Richard, Lucie; Torres, Sara; Tremblay, Marie-Claude; Chiocchio, François; Litvak, Éric; Fortin-Pellerin, Laurence; Beaudet, Nicole
2015-06-14
Professional development is a key component of effective public health infrastructures. To be successful, professional development programs in public health and health promotion must adapt to practitioners' complex real-world practice settings while preserving the core components of those programs' models and theoretical bases. An appropriate balance must be struck between implementation fidelity, defined as respecting the core nature of the program that underlies its effects, and adaptability to context to maximize benefit in specific situations. This article presents a professional development pilot program, the Health Promotion Laboratory (HPL), and analyzes how it was adapted to three different settings while preserving its core components. An exploratory analysis was also conducted to identify team and contextual factors that might have been at play in the emergence of implementation profiles in each site. This paper describes the program, its core components and adaptive features, along with three implementation experiences in local public health teams in Quebec, Canada. For each setting, documentary sources were analyzed to trace the implementation of activities, including temporal patterns throughout the project for each program component. Information about teams and their contexts/settings was obtained through documentary analysis and semi-structured interviews with HPL participants, colleagues and managers from each organization. While each team developed a unique pattern of implementing the activities, all the program's core components were implemented. Differences of implementation were observed in terms of numbers and percentages of activities related to different components of the program as well as in the patterns of activities across time. It is plausible that organizational characteristics influencing, for example, work schedule flexibility or learning culture might have played a role in the HPL implementation process. This paper shows how a professional development program model can be adapted to different contexts while preserving its core components. Capturing the heterogeneity of the intervention's exposure, as was done here, will make possible in-depth impact analyses involving, for example, the testing of program-context interactions to identify program outcomes predictors. Such work is essential to advance knowledge on the action mechanisms of professional development programs.
Gierczak, R F D; Devlin, J F; Rudolph, D L
2006-01-05
Elevated nitrate concentrations within a municipal water supply aquifer led to pilot testing of a field-scale, in situ denitrification technology based on carbon substrate injections. In advance of the pilot test, detailed characterization of the site was undertaken. The aquifer consisted of complex, discontinuous and interstratified silt, sand and gravel units, similar to other well studied aquifers of glaciofluvial origin, 15-40 m deep. Laboratory and field tests, including a conservative tracer test, a pumping test, a borehole flowmeter test, grain-size analysis of drill cuttings and core material, and permeameter testing performed on core samples, were performed on the most productive depth range (27-40 m), and the results were compared. The velocity profiles derived from the tracer tests served as the basis for comparison with other methods. The spatial variation in K, based on grain-size analysis using the Hazen method, was poorly correlated with the breakthrough data. Trends in relative hydraulic conductivity (K/K(avg)) from permeameter testing compared somewhat better. However, the trends in transient drawdown with depth, measured in multilevel sampling points, corresponded particularly well with those of solute mass flux. Estimates of absolute K, based on standard pumping test analysis of the multilevel drawdown data, were inversely correlated with the tracer test data. The inverse nature of the correlation was attributed to assumptions in the transient drawdown packages that were inconsistent with the variable diffusivities encountered at the scale of the measurements. Collectively, the data showed that despite a relatively low variability in K within the aquifer under study (within a factor of 3), water and solute mass fluxes were concentrated in discrete intervals that could be targeted for later bioremediation.
Tailoring the response of Autonomous Reactivity Control (ARC) systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qvist, Staffan A.; Hellesen, Carl; Gradecka, Malwina
The Autonomous Reactivity Control (ARC) system was developed to ensure inherent safety of Generation IV reactors while having a minimal impact on reactor performance and economic viability. In this study we present the transient response of fast reactor cores to postulated accident scenarios with and without ARC systems installed. Using a combination of analytical methods and numerical simulation, the principles of ARC system design that assure stability and avoid oscillatory behavior have been identified. A comprehensive transient analysis study for ARC-equipped cores, including a series of Unprotected Loss of Flow (ULOF) and Unprotected Loss of Heat Sink (ULOHS) simulations, was performed for Argonne National Laboratory (ANL) Advanced Burner Reactor (ABR) designs. With carefully designed ARC systems installed in the fuel assemblies, the cores exhibit a smooth non-oscillatory transition to stabilization at acceptable temperatures following all postulated transients. To avoid oscillations in power and temperature, the reactivity introduced per degree of temperature change in the ARC system needs to be kept below a certain threshold, the value of which is system dependent, and the temperature span of actuation needs to be as large as possible.
Evaluation of external hazards to nuclear power plants in the United States
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kimura, C.Y.; Budnitz, R.J.
1987-12-01
The Lawrence Livermore National Laboratory (LLNL) has performed a study of the risk of core damage to nuclear power plants in the United States due to externally initiated events. The broad objective has been to gain an understanding of whether or not each external initiator is among the major potential accident initiators that may pose a threat of severe reactor core damage or of large radioactive release to the environment from the reactor. Four external hazards were investigated in this report. These external hazards are internal fires, high winds/tornadoes, external floods, and transportation accidents. Analysis was based on two figures-of-merit, one based on core damage frequency and the other based on the frequency of large radioactive releases. Using these two figures-of-merit as evaluation criteria, it has been feasible to ascertain whether the risk from externally initiated accidents is, or is not, an important contributor to overall risk for the US nuclear power plants studied. This has been accomplished for each initiator separately. 208 refs., 17 figs., 45 tabs.
A Review of Gas-Cooled Reactor Concepts for SDI Applications
1989-08-01
710 program.) Wire-Core Reactor (proposed by Rockwell). The wire-core reactor utilizes thin fuel wires woven between spacer wires to form an open ... reactor is based on results of developmental studies of nuclear rocket propulsion systems. The reactor core is made up of annular fuel assemblies of ... XE Addendum to Volume II. NERVA Fuel Development, Westinghouse Astronuclear Laboratory, TNR-230, July 15, 1972. Rover Program Reactor Tests
CT Scanning and Geophysical Measurements of the Marcellus Formation from the Tippens 6HS Well
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crandall, Dustin; Paronish, Thomas; Brown, Sarah
The computed tomography (CT) facilities and the Multi-Sensor Core Logger (MSCL) at the National Energy Technology Laboratory (NETL) Morgantown, West Virginia site were used to characterize core of the Marcellus Shale from a vertical well drilled in Eastern Ohio. The core is from the Tippens 6HS Well in Monroe County, Ohio and consists primarily of the Marcellus Shale from depths of 5550 to 5663 ft.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chu, T.L.; Musicki, Z.; Kohut, P.
1994-06-01
During 1989, the Nuclear Regulatory Commission (NRC) initiated an extensive program to carefully examine the potential risks during low power and shutdown operations. The program includes two parallel projects being performed by Brookhaven National Laboratory (BNL) and Sandia National Laboratories (SNL). Two plants, Surry (pressurized water reactor) and Grand Gulf (boiling water reactor), were selected as the plants to be studied. The objectives of the program are to assess the risks of severe accidents initiated during plant operational states other than full power operation and to compare the estimated core damage frequencies, important accident sequences and other qualitative and quantitative results with those of accidents initiated during full power operation as assessed in NUREG-1150. The objective of this report is to document the approach utilized in the Surry plant and discuss the results obtained. A parallel report for the Grand Gulf plant is prepared by SNL. This study shows that the core-damage frequency during mid-loop operation at the Surry plant is comparable to that of power operation. We recognize that there is very large uncertainty in the human error probabilities in this study. This study identified that only a few procedures are available for mitigating accidents that may occur during shutdown. Procedures written specifically for shutdown accidents would be useful. This document, Volume 2, Pt. 2, provides appendices A through D of this report.
Stack-and-Draw Manufacture Process of a Seven-Core Optical Fiber for Fluorescence Measurements
NASA Astrophysics Data System (ADS)
Samir, Ahmed; Batagelj, Bostjan
2018-01-01
Multi-core, optical-fiber technology is expected to be used in telecommunications and sensory systems in a relatively short amount of time. However, a successful transition from research laboratories to industry applications will only be possible with an optimized design and manufacturing process. The fabrication process is an important aspect in designing and developing new multi-applicable, multi-core fibers, where the best candidate is a seven-core fiber. Here, the basics for designing and manufacturing a single-mode, seven-core fiber using the stack-and-draw process are described for the example of a fluorescence sensory system.
Characterization of Gas and Particle Emissions from Laboratory Burns of Peat
Peat cores collected from two locations in eastern North Carolina (NC, USA) were burned in a laboratory facility to characterize emissions during simulated field combustion. Particle and gas samples were analyzed to quantify emission factors for particulate matter (PM2.5), organi...
Integrated Laboratory and Field Investigations: Assessing Contaminant Risk to American Badgers
This manuscript provides an example of integrated laboratory and field approach to complete a toxicological ecological risk assessment at the landscape level. The core findings from the study demonstrate how radio telemetry data can allow for ranking the relative risks of contam...
2003-08-27
KENNEDY SPACE CENTER, FLA. - The U.S. Node 2 is undergoing a Multi-Element Integrated Test (MEIT) in the Space Station Processing Facility. Node 2 attaches to the end of the U.S. Lab on the ISS and provides attach locations for the Japanese laboratory, European laboratory, the Centrifuge Accommodation Module and, eventually, Multipurpose Logistics Modules. It will provide the primary docking location for the Shuttle when a pressurized mating adapter is attached to Node 2. Installation of the module will complete the U.S. Core of the ISS.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Damiani, D.; Dubrovin, M.; Gaponenko, I.
Psana (Photon Science Analysis) is a software package that is used to analyze data produced by the Linac Coherent Light Source X-ray free-electron laser at the SLAC National Accelerator Laboratory. The project began in 2011, is written primarily in C++ with some Python, and provides user interfaces in both C++ and Python. Most users use the Python interface. The same code can be run in real time while data are being taken as well as offline, executing on many nodes/cores using MPI for parallelization. It is publicly available and installable on the RHEL5/6/7 operating systems.
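A minimal sketch of what an MPI-parallel event loop of the kind described might look like from the Python side. The DataSource/events call names and the experiment string are assumptions taken from public psana documentation (not from this abstract), and the per-event analysis is a placeholder.

```python
# Hypothetical sketch of a round-robin, MPI-parallel psana analysis loop.
# The psana API names and the experiment/run string are assumptions based on
# public documentation; adjust to the installed psana version.
from mpi4py import MPI          # MPI parallelization across nodes/cores
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

def analyze(evt):
    """Placeholder per-event analysis; returns one scalar per event."""
    return 1.0  # e.g., an integrated detector intensity

try:
    import psana
    ds = psana.DataSource('exp=xpptut15:run=54')   # hypothetical experiment/run
    local = [analyze(evt) for i, evt in enumerate(ds.events()) if i % size == rank]
except ImportError:
    local = []  # psana is only available on SLAC systems; fall back gracefully

# Gather partial results on rank 0
totals = comm.gather((len(local), float(np.sum(local))), root=0)
if rank == 0 and totals:
    print("events processed:", sum(t[0] for t in totals))
```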
Phytoscreening with SPME: Variability Analysis.
Limmer, Matt A; Burken, Joel G
2015-01-01
Phytoscreening has been demonstrated at a variety of sites over the past 15 years as a low-impact, sustainable tool in delineation of shallow groundwater contaminated with chlorinated solvents. Collection of tree cores is rapid and straightforward, but low concentrations in tree tissues require sensitive analytics. Solid-phase microextraction (SPME) is amenable to the complex matrix while allowing for solvent-less extraction. Accurate quantification requires the absence of competitive sorption, examined here both in laboratory experiments and through comprehensive examination of field data. Analysis of approximately 2,000 trees at numerous field sites also allowed testing of the tree genus and diameter effects on measured tree contaminant concentrations. Collectively, while these variables were found to significantly affect site-adjusted perchloroethylene (PCE) concentrations, the explanatory power of these effects was small (adjusted R(2) = 0.031). 90th quantile chemical concentrations in trees were significantly reduced by increasing Henry's constant and increasing hydrophobicity. Analysis of replicate tree core data showed no correlation between replicate relative standard deviation (RSD) and wood type or tree diameter, with an overall median RSD of 30%. Collectively, these findings indicate SPME is an appropriate technique for sampling and analyzing chlorinated solvents in wood and that phytoscreening is robust against changes in tree type and diameter.
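For readers unfamiliar with the replicate-variability metric quoted above, a small sketch of the calculation (per-tree relative standard deviation, then the median across trees) is given below; the concentration values are illustrative, not data from the study.

```python
# Minimal sketch of the replicate RSD metric: RSD = 100 * std / mean per tree,
# then the median RSD across trees. Concentrations are hypothetical.
import numpy as np

replicate_cores = {                 # PCE concentrations per tree (e.g., ug/kg), illustrative
    "tree_01": [12.0, 15.5, 10.8],
    "tree_02": [3.2, 2.7, 4.1],
    "tree_03": [48.0, 61.0, 55.0],
}

rsd = {tree: 100.0 * np.std(vals, ddof=1) / np.mean(vals)
       for tree, vals in replicate_cores.items()}

print("per-tree RSD (%):", {k: round(v, 1) for k, v in rsd.items()})
print("median RSD (%):", round(float(np.median(list(rsd.values()))), 1))
```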
A laboratory model for solidification of Earth's core
NASA Astrophysics Data System (ADS)
Bergman, Michael I.; Macleod-Silberstein, Marget; Haskel, Michael; Chandler, Benjamin; Akpan, Nsikan
2005-11-01
To better understand the influence of rotating convection in the outer core on the solidification of the inner core we have constructed a laboratory model for solidification of Earth's core. The model consists of a 15 cm radius hemispherical acrylic tank concentric with a 5 cm radius hemispherical aluminum heat exchanger that serves as the incipient inner core onto which we freeze ice from salt water. Long exposure photographs of neutrally buoyant particles in illuminated planes suggest reduction of flow parallel to the rotation axis. Thermistors in the tank near the heat exchanger show that in experiments with rotation the temperature near the pole is lower than near the equator, unlike for control experiments without rotation or with a polymer that increases the fluid viscosity. The photographs and thermistors suggest that our observation that ice grows faster near the pole than near the equator for experiments with rotation is a result of colder water not readily convecting away from the pole. Because of the reversal of the thermal gradient, we expect faster equatorial solidification in the Earth's core. Such anisotropy in solidification has been suggested as a cause of inner core elastic (and attenuation) anisotropy, though the plausibility of this suggestion will depend on the core Nusselt number and the slope of the liquidus, and the effects of post-solidification deformation. Previous experiments on hexagonal close-packed alloys such as sea ice and zinc-tin have shown that fluid flow in the melt can result in a solidification texture transverse to the solidification direction, with the texture depending on the nature of the flow. A comparison of the visualized flow and the texture of columnar ice crystals in thin sections from these experiments confirms flow-induced transverse textures. This suggests that the convective pattern at the base of the outer core is recorded in the texture of the inner core, and that outer core convection might contribute to the complexity in the seismically inferred pattern of anisotropy in the Earth's inner core.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garner, P. L.; Hanan, N. A.
The MARIA reactor at the Institute of Atomic Energy (IAE) in Swierk (30 km SE of Warsaw) in the Republic of Poland is considering conversion from high-enriched uranium (HEU) to low-enriched uranium (LEU) fuel assemblies (FA). The FA design in MARIA is rather unique; a suitable LEU FA has never been designed or tested. IAE has contracted with CERCA (the fuel supply portion of AREVA in France) to supply 2 lead test assemblies (LTA). The LTAs will be irradiated in MARIA to burnup level of at least 40% for both LTAs and to 60% for one LTA. IAE may decide to purchase additional LEU FAs for a full core conversion after the test irradiation. The Reactor Safety Committee within IAE and the National Atomic Energy Agency in Poland (PAA) must approve the LTA irradiation process. The approval will be based, in part, on IAE submitting revisions to portions of the Safety Analysis Report (SAR) which are affected by the insertion of the LTAs. (A similar process will be required for the full core conversion to LEU fuel.) The analysis required was established during working meetings between Argonne National Laboratory (ANL) and IAE staff during August 2006, subsequent email correspondence, and subsequent staff visits. The analysis needs to consider the current high-enriched uranium (HEU) core and 4 core configurations containing 1 and 2 LEU LTAs in various core positions. Calculations have been performed at ANL in support of the LTA irradiation. These calculations are summarized in this report and include criticality, burn-up, neutronics parameters, steady-state thermal hydraulics, and postulated transients. These calculations have been performed at the request of the IAE staff, who are performing similar calculations to be used in their SAR amendment submittal to the PAA. The ANL analysis has been performed independently from that being performed by IAE and should only be used as one step in the verification process.
The makeover of the Lakeshore General Hospital laboratories.
Estioko-Taimuri, Teresa
2006-01-31
This article describes the expansion and reorganization of a moderate-sized Canadian laboratory from Day One to "Live Day." The key factors to the success of this project were organized planning by the laboratory staff and the introduction of core lab theories, team building, and organized training sessions. The successful makeover resulted in improved turnaround time for STAT tests, especially those coming from the Emergency Unit. The efforts of the laboratory personnel toward the improvement of laboratory services, in spite of budget, human resources constraints, and resistance to change, are addressed.
Ernest Orlando Lawrence Berkeley National Laboratory institutional plan, FY 1996--2001
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-11-01
The FY 1996--2001 Institutional Plan provides an overview of the Ernest Orlando Lawrence Berkeley National Laboratory mission, strategic plan, core business areas, critical success factors, and the resource requirements to fulfill its mission in support of national needs in fundamental science and technology, energy resources, and environmental quality. The Laboratory Strategic Plan section identifies long-range conditions that will influence the Laboratory, as well as potential research trends and management implications. The Core Business Areas section identifies those initiatives that are potential new research programs representing major long-term opportunities for the Laboratory, and the resources required for their implementation. It also summarizes current programs and potential changes in research program activity, science and technology partnerships, and university and science education. The Critical Success Factors section reviews human resources; work force diversity; environment, safety, and health programs; management practices; site and facility needs; and communications and trust. The Resource Projections are estimates of required budgetary authority for the Laboratory's ongoing research programs. The Institutional Plan is a management report for integration with the Department of Energy's strategic planning activities, developed through an annual planning process. The plan identifies technical and administrative directions in the context of the national energy policy and research needs and the Department of Energy's program planning initiatives. Preparation of the plan is coordinated by the Office of Planning and Communications from information contributed by the Laboratory's scientific and support divisions.
NASA Astrophysics Data System (ADS)
Pini, Ronny; Benson, Sally M.
2017-10-01
We report results from an experimental investigation on the hysteretic behaviour of the capillary pressure curve for the supercritical CO2-water system in a Berea Sandstone core. Previous observations have highlighted the importance of subcore-scale capillary heterogeneity in developing local saturations during drainage; we show in this study that the same is true for the imbibition process. Spatially distributed drainage and imbibition scanning curves were obtained for mm-scale subsets of the rock sample non-invasively using X-ray CT imagery. Core- and subcore-scale measurements are well described using the Brooks-Corey formalism, which uses a linear trapping model to compute mobile saturations during imbibition. Capillary scaling yields two separate universal drainage and imbibition curves that are representative of the full subcore-scale data set. This enables accurate parameterisation of rock properties at the subcore-scale in terms of capillary scaling factors and permeability, which in turn serve as effective indicators of heterogeneity at the same scale even when hysteresis is a factor. As such, the proposed core-analysis workflow is quite general and provides the required information to populate numerical models that can be used to extend core-flooding experiments to conditions prevalent in the subsurface, which would otherwise not be attainable in the laboratory.
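As a point of reference for the Brooks-Corey formalism and the linear trapping rule mentioned above, a minimal sketch follows; the entry pressure, pore-size index, and trapping coefficient are placeholder values, not the parameters fitted in the study.

```python
# Sketch of a Brooks-Corey drainage capillary pressure curve and a linear
# trapping rule (trapped gas proportional to initial gas saturation).
# All parameter values are illustrative placeholders.
import numpy as np

def brooks_corey_pc(sw, pe=5.0, lam=2.0, swr=0.2):
    """Drainage capillary pressure [kPa] at water saturation sw."""
    se = np.clip((sw - swr) / (1.0 - swr), 1e-6, 1.0)   # effective saturation
    return pe * se ** (-1.0 / lam)

def trapped_gas(sg_initial, c_trap=0.5):
    """Linear trapping rule: residual (trapped) non-wetting saturation after imbibition."""
    return c_trap * sg_initial

sw = np.linspace(0.25, 1.0, 6)
print("Pc (kPa):", np.round(brooks_corey_pc(sw), 2))
print("Sgr for Sgi = 0.6:", trapped_gas(0.6))
```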
Three-phase inductive-coupled structures for contactless PHEV charging system
NASA Astrophysics Data System (ADS)
Lee, Jia-You; Shen, Hung-Yu; Li, Cheng-Bin
2016-07-01
In this article, a new-type three-phase inductive-coupled structure is proposed for the contactless plug-in hybrid electric vehicle (PHEV) charging system in accordance with SAE J-1773. Four possible three-phase core structures are presented and subsequently investigated by finite element analysis. To study the correlation between the core geometric parameters and the coupling coefficient, the magnetic equivalent circuit model of each structure is also established. In accordance with the simulation results, low reluctance and the sharing of flux paths in the core material are achieved by the proposed inductive-coupled structure with an arc-shape and three-phase symmetrical core material. This results in a compensation of the magnetic flux between each phase and a continuous flow of the output power in the inductive-coupled structure. A higher coupling coefficient between the inductive-coupled structures is achieved. A comparison of coupling coefficient, mutual inductance, and self-inductance between theoretical and measured results is also performed to verify the proposed model. A 1 kW laboratory-scale prototype of the contactless PHEV charging system with the proposed arc-shape three-phase inductive-coupled structure is implemented and tested. An overall system efficiency of 88% is measured when two series lithium iron phosphate battery packs of 25.6 V/8.4 Ah are charged.
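The coupling-coefficient comparison referred to above reduces to the standard relation k = M / sqrt(L1*L2) between mutual and self-inductances. A small sketch is given below; the inductance values are illustrative placeholders, not measurements from the article.

```python
# Sketch of the per-phase coupling-coefficient calculation: k = M / sqrt(L1 * L2).
# Inductance values are hypothetical, chosen only to illustrate the arithmetic.
import math

def coupling_coefficient(M, L1, L2):
    """Coupling coefficient from mutual and self-inductances (same units)."""
    return M / math.sqrt(L1 * L2)

phases = {"A": (28e-6, 120e-6, 118e-6),   # (M, L_primary, L_secondary) in henries
          "B": (30e-6, 121e-6, 119e-6),
          "C": (27e-6, 119e-6, 120e-6)}

for name, (M, L1, L2) in phases.items():
    print(f"phase {name}: k = {coupling_coefficient(M, L1, L2):.3f}")
```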
The effect of rock fabric on P-wave velocity distribution in amphibolites
NASA Astrophysics Data System (ADS)
Vajdová, V.; Přikryl, R.; Pros, Z.; Klíma, K.
1999-07-01
This study presents a contribution to the laboratory investigation of elastic properties and rock fabric of amphibolites. P-wave velocity was determined on four spherical samples prepared from a shallow borehole core. The measurement was conducted in 132 directions under various conditions of hydrostatic pressure (up to 400 MPa). The rock fabric was investigated by image analysis of thin sections that enabled precise determination of grain size, modal composition and shape parameters of rock-forming minerals. Laboratory measurement of P-waves revealed pseudoorthorhombic symmetry of rock fabric in the amphibolites studied. This symmetry reflects the rocks' macro- and microfabric. Maximum P-wave velocity corresponds to the macroscopically visible stretching lineation. Minimum P-wave velocity is oriented perpendicular to the foliation plane. The average grain size is the main microstructural factor controlling mean P-wave velocity.
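Multidirectional velocity measurements of this kind are commonly summarized by a velocity-anisotropy coefficient relating the extreme and mean velocities; a hedged sketch follows, with illustrative velocities rather than the measured amphibolite values.

```python
# Sketch of a common anisotropy summary for directional P-wave data:
# A = 100 * (Vmax - Vmin) / Vmean. Velocities below are hypothetical.
import numpy as np

velocities = np.array([6.45, 6.62, 6.18, 6.71, 6.30, 6.55])  # km/s in several directions (illustrative)

v_max, v_min, v_mean = velocities.max(), velocities.min(), velocities.mean()
anisotropy = 100.0 * (v_max - v_min) / v_mean
print(f"Vmax = {v_max:.2f} km/s, Vmin = {v_min:.2f} km/s, A = {anisotropy:.1f}%")
```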
Penesyan, Anahit; Kumar, Sheemal S.; Kamath, Karthik; Shathili, Abdulrahman M.; Venkatakrishnan, Vignesh; Krisp, Christoph; Packer, Nicolle H.; Molloy, Mark P.; Paulsen, Ian T.
2015-01-01
The opportunistic pathogen Pseudomonas aeruginosa is among the main colonizers of the lungs of cystic fibrosis (CF) patients. We have isolated and sequenced several P. aeruginosa isolates from the sputum of CF patients and compared them with each other and with the model strain PAO1. Phenotypic analysis of CF isolates showed significant variability in colonization and virulence-related traits suggesting different strategies for adaptation to the CF lung. Genomic analysis indicated these strains shared a large set of core genes with the standard laboratory strain PAO1, and identified the genetic basis for some of the observed phenotypic differences. Proteomics revealed that in a conventional laboratory medium PAO1 expressed 827 proteins that were absent in the CF isolates while the CF isolates shared a distinctive signature set of 703 proteins not detected in PAO1. PAO1 expressed many transporters for the uptake of organic nutrients and relatively few biosynthetic pathways. Conversely, the CF isolates expressed a narrower range of transporters and a broader set of metabolic pathways for the biosynthesis of amino acids, carbohydrates, nucleotides and polyamines. The proteomic data suggests that in a common laboratory medium PAO1 may transport a diverse set of “ready-made” nutrients from the rich medium, whereas the CF isolates may only utilize a limited number of nutrients from the medium relying mainly on their own metabolism for synthesis of essential nutrients. These variations indicate significant differences between the metabolism and physiology of P. aeruginosa CF isolates and PAO1 that cannot be detected at the genome level alone. The widening gap between the increasing genomic data and the lack of phenotypic data means that researchers are increasingly reliant on extrapolating from genomic comparisons using experimentally characterized model organisms such as PAO1. While comparative genomics can provide valuable information, our data suggests that such extrapolations may be fraught with peril. PMID:26431321
NASA Astrophysics Data System (ADS)
Heath, J. E.; Dewers, T. A.; McPherson, B. J.; Wilson, T. H.; Flach, T.
2009-12-01
Understanding and characterizing transport properties of fine-grained rocks is critical in development of shale gas plays or assessing retention of CO2 at geologic storage sites. Difficulties arise in that both small scale (i.e., ~ nm) properties of the rock matrix and much larger scale fractures, faults, and sedimentological architecture govern migration of multiphase fluids. We present a multi-scale investigation of sealing and transport properties of the Kirtland Formation, which is a regional aquitard and reservoir seal in the San Juan Basin, USA. Sub-micron dual FIB/SEM imaging and reconstruction of 3D pore networks in core samples reveal a variety of pore types, including slit-shaped pores that are co-located with sedimentary structures and variations in mineralogy. Micron-scale chemical analysis and XRD reveal a mixture of mixed-layer smectite/illite, chlorite, quartz, and feldspar with little organic matter. Analysis of sub-micron digital reconstructions, mercury capillary injection pressure, and gas breakthrough measurements indicate a high quality sealing matrix. Natural full and partially mineralized fractures observed in core and in FMI logs include those formed from early soil-forming processes, differential compaction, and tectonic events. The potential impact of both fracture and matrix properties on large-scale transport is investigated through an analysis of natural helium from core samples, 3D seismic data and poro-elastic modeling. While seismic interpretations suggest considerable fracturing of the Kirtland, large continuous fracture zones and faults extending through the seal to the surface cannot be inferred from the data. Observed Kirtland Formation multi-scale transport properties are included as part of a risk assessment methodology for CO2 storage. Acknowledgements: The authors gratefully acknowledge the U.S. Department of Energy’s (DOE) National Energy Technology Laboratory for sponsoring this project. The DOE’s Basic Energy Science Office funded the dual FIB/SEM analysis. The Kirtland Formation overlies the coal seams of the Fruitland into which CO2 has been injected as a Phase II demonstration of the Southwest Regional Partnership on Carbon Sequestration. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the U.S. Department of Energy under contract DE-ACOC4-94AL85000.
Williams working on the JAXA MS (Marangoni Surface) Experiment
2009-11-05
ISS021-E-020304 (5 Nov. 2009) --- NASA astronaut Jeffrey Williams, Expedition 21 flight engineer, works with Fluid Physics Experiment Facility/Marangoni Surface (FPEF MS) Core hardware in the Kibo laboratory of the International Space Station. Williams first inserted the Marangoni Inside (MI) cassette in the MI Core for a leak check, and then installed the MI Core into the FPEF MI Body. The Marangoni convection experiment in the FPEF examines fluid tension flow in micro-G.
Evidence for a core gut microbiota in the zebrafish
Roeselers, Guus; Mittge, Erika K; Stephens, W Zac; Parichy, David M; Cavanaugh, Colleen M; Guillemin, Karen; Rawls, John F
2011-01-01
Experimental analysis of gut microbial communities and their interactions with vertebrate hosts is conducted predominantly in domesticated animals that have been maintained in laboratory facilities for many generations. These animal models are useful for studying coevolved relationships between host and microbiota only if the microbial communities that occur in animals in lab facilities are representative of those that occur in nature. We performed 16S rRNA gene sequence-based comparisons of gut bacterial communities in zebrafish collected recently from their natural habitat and those reared for generations in lab facilities in different geographic locations. Patterns of gut microbiota structure in domesticated zebrafish varied across different lab facilities in correlation with historical connections between those facilities. However, gut microbiota membership in domesticated and recently caught zebrafish was strikingly similar, with a shared core gut microbiota. The zebrafish intestinal habitat therefore selects for specific bacterial taxa despite radical differences in host provenance and domestication status. PMID:21472014
The talk will highlight key aspects and results of analytical methods the EPA National Health and Environmental Effects Research Laboratory (NHEERL) Analytical Chemistry Research Core (ACRC) develops and uses to provide data on disposition, metabolism, and effects of environmenta...
Science & Technology Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-07-01
This review is published ten times a year to communicate, to a broad audience, Lawrence Livermore National Laboratory's scientific and technological accomplishments, particularly in the Laboratory's core mission areas - global security, energy and the environment, and bioscience and biotechnology. This review for the month of July 1996 discusses: Frontiers of research in advanced computations; The multibeam Fabry-Perot velocimeter: Efficient measurement of high velocities; High-tech tools for the American textile industry; and Rock mechanics: can the Tuff take the stress.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mac Donald, Philip Elsworth; Buongiorno, Jacopo; Davis, Cliff Bybee
The purpose of this collaborative Idaho National Engineering and Environmental Laboratory (INEEL) and Massachusetts Institute of Technology (MIT) Laboratory Directed Research and Development (LDRD) project is to investigate the suitability of lead or lead-bismuth cooled fast reactors for producing low-cost electricity as well as for actinide burning. The goal is to identify and analyze the key technical issues in core neutronics, materials, thermal-hydraulics, fuels, and economics associated with the development of this reactor concept. Work has been accomplished in four major areas of research: core neutronic design, plant engineering, material compatibility studies, and coolant activation. The publications derived from work on this project (since project inception) are listed in Appendix A.
NASA Astrophysics Data System (ADS)
Smith, L. A.; Barbour, S. L.; Hendry, M. J.; Novakowski, K.; van der Kamp, G.
2016-07-01
Characterizing the hydraulic conductivity (K) of aquitards is difficult due to technical and logistical difficulties associated with field-based methods as well as the cost and challenge of collecting representative and competent core samples for laboratory analysis. The objective of this study was to produce a multiscale comparison of vertical and horizontal hydraulic conductivity (Kv and Kh, respectively) of a regionally extensive Cretaceous clay-rich aquitard in southern Saskatchewan. Ten vibrating wire pressure transducers were lowered into place at depths between 25 and 325 m, then the annular space was filled with a cement-bentonite grout. The in situ Kh was estimated at the location of each transducer by simulating the early-time pore pressure measurements following setting of the grout using a 2-D axisymmetric, finite element, numerical model. Core samples were collected during drilling for conventional laboratory testing for Kv to compare with the transducer-determined in situ Kh. Results highlight the importance of scale and consideration of the presence of possible secondary features (e.g., fractures) in the aquitard. The proximity of the transducers to an active potash mine (~1 km) where depressurization of an underlying aquifer resulted in drawdown through the aquitard provided a unique opportunity to model the current hydraulic head profile using both the Kh and Kv estimates. Results indicate that the transducer-determined Kh estimates would allow for the development of the current hydraulic head distribution, and that simulating the pore pressure recovery can be used to estimate moderately low in situ Kh (<10^-11 m s^-1).
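Conventional laboratory Kv testing on low-permeability core specimens typically reduces to a permeameter form of Darcy's law. Below is a generic falling-head sketch with placeholder dimensions; it illustrates the order-of-magnitude arithmetic only and is not the apparatus or data used in the study.

```python
# Sketch of the textbook falling-head permeameter reduction:
# K = (a * L) / (A * t) * ln(h0 / h1). All values are hypothetical.
import math

def falling_head_k(a, A, L, t, h0, h1):
    """Hydraulic conductivity [m/s]; a, A in m^2, L in m, t in s, heads in m."""
    return (a * L) / (A * t) * math.log(h0 / h1)

K = falling_head_k(a=7.9e-7,      # standpipe cross-section (m^2), illustrative
                   A=2.0e-3,      # specimen cross-section (m^2)
                   L=0.05,        # specimen length (m)
                   t=86400.0,     # elapsed time (s)
                   h0=1.00, h1=0.97)
print(f"Kv ~ {K:.2e} m/s")        # on the order of 1e-11 m/s for a clay-rich aquitard
```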
Determination of subsurface fluid contents at a crude-oil spill site
Hess, K.M.; Herkelrath, W.N.; Essaid, H.I.
1992-01-01
Measurement of the fluid-content distribution at sites contaminated by immiscible fluids, including crude oil, is needed to better understand the movement of these fluids in the subsurface and to provide data to calibrate and verify numerical models and geophysical methods. A laboratory method was used to quantify the fluid contents of 146 core sections retrieved from boreholes aligned along a 120-m longitudinal transect at a crude-oil spill site near Bemidji, Minnesota, U.S.A. The 47-mm-diameter, minimally disturbed cores spanned a 4-m vertical interval contaminated by oil. Cores were frozen on site in a dry ice-alcohol bath to prevent redistribution and loss of fluids while sectioning the cores. We gravimetrically determined oil and water contents using a two-step method: (1) samples were slurried and the oil was removed by absorption onto strips of hydrophobic porous polyethylene (PPE); and (2) the samples were oven-dried to remove the water. The resulting data show sharp vertical gradients in the water and oil contents and a clearly defined oil body. The subsurface distribution is complex and appears to be influenced by sediment heterogeneities and water-table fluctuations. The center of the oil body has depressed the water-saturated zone boundary, and the oil is migrating laterally within the capillary fringe. The oil contents are as high as 0.3 cm^3 cm^-3, which indicates that oil is probably still mobile 10 years after the spill occurred. The thickness of oil measured in wells suggests that accumulated thickness in wells is a poor indicator of the actual distribution of oil in the subsurface. Several possible sources of error are identified with the field and laboratory methods. An error analysis indicates that adsorption of water and sediment into the PPE adds as much as 4% to the measured oil masses and that uncertainties in the calculated sample volume and the assumed oil density introduce an additional ±3% error when the masses are converted to fluid contents.
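The two-step gravimetric reduction described above is, in essence, simple mass-balance arithmetic; a sketch follows in which all masses, densities, and the sample volume are illustrative placeholders rather than values from the study.

```python
# Sketch of the two-step gravimetric reduction: oil mass from the weight gain
# of the hydrophobic polyethylene (PPE) strips, water mass from oven drying,
# both converted to volumetric contents (cm^3 fluid per cm^3 sample).
# All inputs are hypothetical.
def fluid_contents(m_ppe_gain, m_before_drying, m_dry,
                   v_sample_cm3, rho_oil=0.87, rho_water=1.00):
    """Return (oil content, water content) as cm^3/cm^3 of bulk sample."""
    v_oil = m_ppe_gain / rho_oil                     # oil absorbed onto the PPE strips
    v_water = (m_before_drying - m_dry) / rho_water  # water lost on oven drying
    return v_oil / v_sample_cm3, v_water / v_sample_cm3

theta_oil, theta_water = fluid_contents(m_ppe_gain=18.0,       # g, illustrative
                                        m_before_drying=160.0,  # g
                                        m_dry=145.0,            # g
                                        v_sample_cm3=75.0)
print(f"oil content ~ {theta_oil:.2f}, water content ~ {theta_water:.2f} cm^3/cm^3")
```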
Phelan, Joan A.; Abrams, William R.; Norman, Robert G.; Li, Yihong; Laverty, Maura; Corby, Patricia M.; Nembhard, Jason; Neri, Dinah; Barber, Cheryl A.; Aberg, Judith A.; Fisch, Gene S.; Poles, Michael A.; Malamud, Daniel
2014-01-01
Introduction: The impaired host defense system in HIV infection impacts the oral and gastrointestinal microbiota and associated opportunistic infections. Antiretroviral treatment is predicted to partially restore host defenses and decrease the oral manifestation of HIV/AIDS. Well-designed longitudinal studies are needed to better understand the interactions of soluble host defense proteins with bacteria and virus in HIV/AIDS. “Crosstalk” was designed as a longitudinal study of host responses along the gastrointestinal (GI) tract and interactions between defense molecules and bacteria in HIV infection and subsequent therapy. Purpose: The clinical core formed the infrastructure for the study of the interactions between the proteome, microbiome and innate immune system. The core recruited and retained study subjects, scheduled visits, obtained demographic and medical data, assessed oral health status, collected samples, and guided analysis of the hypotheses. This manuscript presents a well-designed clinical core that may serve as a model for studies that combine clinical and laboratory data. Methods: Crosstalk was a case-control longitudinal clinical study with an initial planned enrollment of 170 subjects. HIV+ antiretroviral naïve subjects were followed for 9 visits over 96 weeks and HIV uninfected subjects for 3 visits over 24 weeks. Clinical prevalence of oral mucosal lesions, dental caries and periodontal disease were assessed. Results: During the study, 116 subjects (47 HIV+, 69 HIV-) were enrolled. Cohorts of HIV+ and HIV- were demographically similar except for a larger proportion of women in the HIV- group. The most prevalent oral mucosal lesions were oral candidiasis and hairy leukoplakia in the HIV+ group. Discussion: The clinical core was essential to enable the links between clinical and laboratory data. The study aims to determine specific differences between oral and GI tissues that account for unique patterns of opportunistic infections and to delineate the differences in their susceptibility to infection by HIV and their responses post-HAART. PMID:25409430
Geophysical Properties of Hard Rock for Investigation of Stress Fields in Deep Mines
NASA Astrophysics Data System (ADS)
Tibbo, M.; Young, R. P.; Schmitt, D. R.; Milkereit, B.
2014-12-01
A complication in geophysical monitoring of deep mines is the high-stress dependency of the physical properties of hard rocks. In-mine observations show anisotropic variability of the in situ P- and S-wave velocities and resistivity of the hard rocks that are likely related to stress field changes. As part of a comprehensive study in a deep, highly stressed mine located in Sudbury, Ontario, Canada, data from in situ monitoring of the seismicity, conductivity, stress, and stress-dependent physical properties have been obtained. In-laboratory experiments are also being performed on borehole cores from the Sudbury mines. These experiments will measure the Norite borehole core's properties including elastic modulus, bulk modulus, P- and S-wave velocities, and density. Hydraulic fracturing has been successfully implemented in industries such as oil and gas and enhanced geothermal systems, and is currently being investigated as a potential method for preconditioning in mining. However, further research is required to quantify how hydraulic fractures propagate through hard, unfractured rock as well as naturally fractured rock typically found in mines. These in-laboratory experiments will contribute to a hydraulic fracturing project evaluating the feasibility and effectiveness of hydraulic fracturing as a method of de-stressing hard rock mines. A tri-axial deformation cell equipped with 18 Acoustic Emission (AE) sensors will be used to bring the borehole cores to a tri-axial state of stress. The cores will then be injected with fluid until the hydraulic fracture has propagated to the edge of the core, while AE waveforms will be digitized continuously at 10 MHz and 12-bit resolution for the duration of each experiment. These laboratory hydraulic fracture experiments will contribute to understanding how parameters including stress ratio, fluid injection rate, and viscosity affect the fracturing process.
Revising laboratory work: sociological perspectives on the science classroom
NASA Astrophysics Data System (ADS)
Jobér, Anna
2017-09-01
This study uses sociological perspectives to analyse one of the core practices in science education: schoolchildren's and students' laboratory work. Applying an ethnographic approach to the laboratory work done by pupils at a Swedish compulsory school, data were generated through observations, field notes, interviews, and a questionnaire. The pupils, ages 14 and 15, were observed as they took a 5-week physics unit (specifically, mechanics). The analysis shows that the episodes of laboratory work could be filled with curiosity and exciting challenges; however, another picture emerged when sociological concepts and notions were applied to what is a very common way of working in the classroom. Laboratory work is characterised as a social activity that is expected to be organised as a group activity. This entails groups becoming, to some extent, 'safe havens' for the pupils. On the other hand, this way of working in groups required pupils to subject themselves to the groups and the peer effect, sometimes undermining their chances to learn and perform better. In addition, the practice of working in groups when doing laboratory work left some pupils and the teacher blaming themselves, even though the outcome of the learning situation was a result of a complex interplay of social processes. This article suggests a stronger emphasis on the contradictions and consequences of the science subjects, which are strongly influenced by their socio-historical legacy.
A New Resource for College Distance Education Astronomy Laboratory Exercises
ERIC Educational Resources Information Center
Vogt, Nicole P.; Cook, Stephen P.; Muise, Amy Smith
2013-01-01
This article introduces a set of distance education astronomy laboratory exercises for use by college students and instructors and discusses first usage results. This General Astronomy Education Source exercise set contains eight two-week projects designed to guide students through both core content and mathematical applications of general…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quirk, W.J.; Canada, J.; de Vore, L.
1994-04-01
This issue highlights the Lawrence Livermore National Laboratory's 1993 accomplishments in our mission areas and core programs: economic competitiveness, national security, energy, the environment, lasers, biology and biotechnology, engineering, physics, chemistry, materials science, computers and computing, and science and math education. Secondary topics include: nonproliferation, arms control, international security, environmental remediation, and waste management.
An Introductory Undergraduate Course Covering Animal Cell Culture Techniques
ERIC Educational Resources Information Center
Mozdziak, Paul E.; Petitte, James N.; Carson, Susan D.
2004-01-01
Animal cell culture is a core laboratory technique in many molecular biology, developmental biology, and biotechnology laboratories. Cell culture is a relatively old technique that has been sparingly taught at the undergraduate level. The traditional methodology for acquiring cell culture training has been through trial and error, instruction when…
An Investigative, Cooperative Learning Approach for General Chemistry Laboratories
ERIC Educational Resources Information Center
Díaz-Vázquez, Liz M.; Montes, Barbara Casañas; Echevarría Vargas, Ileabett M.; Hernandez-Cancel, Griselle; Gonzalez, Fernando; Molina, Anna M.; Morales-Cruz, Moraima; Torres-Díaz, Carlos M.; Griebenow, Kai
2012-01-01
The integration of research and education is an essential component of our university's teaching philosophy. Recently, we made a curricular revision to facilitate such an approach in the General Chemistry Laboratory, to teach students that investigative approaches are at the core of sciences. The curriculum revision included new interdisciplinary…
Mallory, Melanie A; Lucic, Danijela; Ebbert, Mark T W; Cloherty, Gavin A; Toolsie, Dan; Hillyard, David R
2017-05-01
HCV genotyping remains a critical tool for guiding initiation of therapy and selecting the most appropriate treatment regimen. Current commercial genotyping assays may have difficulty identifying 1a, 1b and genotype 6. To evaluate the concordance for identifying 1a, 1b, and genotype 6 between two methods: the PLUS assay and core/NS5B sequencing. This study included 236 plasma and serum samples previously genotyped by core/NS5B sequencing. Of these, 25 samples were also previously tested by the Abbott RealTime HCV GT II Research Use Only (RUO) assay and yielded ambiguous results. The remaining 211 samples were routine genotype 1 (n=169) and genotype 6 (n=42). Genotypes obtained from sequence data were determined using a laboratory-developed HCV sequence analysis tool and the NCBI non-redundant database. Agreement between the PLUS assay and core/NS5B sequencing for genotype 1 samples was 95.8% (162/169), with 96% (127/132) and 95% (35/37) agreement for 1a and 1b samples respectively. PLUS results agreed with core/NS5B sequencing for 83% (35/42) of unselected genotype 6 samples, with the remaining seven "not detected" by the PLUS assay. Among the 25 samples with ambiguous GT II results, 15 were concordant by PLUS and core/NS5B sequencing, nine were not detected by PLUS, and one sample had an internal control failure. The PLUS assay is an automated method that identifies 1a, 1b and genotype 6 with good agreement with gold-standard core/NS5B sequencing and can aid in the resolution of certain genotype samples with ambiguous GT II results. Copyright © 2017 Elsevier B.V. All rights reserved.
Biosynthesis of glycosaminoglycans: associated disorders and biochemical tests.
Sasarman, Florin; Maftei, Catalina; Campeau, Philippe M; Brunel-Guitton, Catherine; Mitchell, Grant A; Allard, Pierre
2016-03-01
Glycosaminoglycans (GAG) are long, unbranched heteropolymers with repeating disaccharide units that make up the carbohydrate moiety of proteoglycans. Six distinct classes of GAGs are recognized. Their synthesis follows one of three biosynthetic pathways, depending on the type of oligosaccharide linker they contain. Chondroitin sulfate, dermatan sulfate, heparan sulfate, and heparin sulfate contain a common tetrasaccharide linker that is O-linked to specific serine residues in core proteins. Keratan sulfate can contain three different linkers, either N-linked to asparagine or O-linked to serine/threonine residues in core proteins. Finally, hyaluronic acid does not contain a linker and is not covalently attached to a core protein. Most inborn errors of GAG biosynthesis are reported in small numbers of patients. To date, in 20 diseases, convincing evidence for pathogenicity has been presented for mutations in a total of 16 genes encoding glycosyltransferases, sulfotransferases, epimerases or transporters. GAG synthesis defects should be suspected in patients with a combination of characteristic clinical features in more than one connective tissue compartment: bone and cartilage (short long bones with or without scoliosis), ligaments (joint laxity/dislocations), and subepithelial (skin, sclerae). Some produce distinct clinical syndromes. The commonest laboratory tests used for this group of diseases are analysis of GAGs, enzyme assays, and molecular testing. In principle, GAG analysis has potential as a general first-line diagnostic test for GAG biosynthesis disorders.
Modeling and simulation of satellite subsystems for end-to-end spacecraft modeling
NASA Astrophysics Data System (ADS)
Schum, William K.; Doolittle, Christina M.; Boyarko, George A.
2006-05-01
During the past ten years, the Air Force Research Laboratory (AFRL) has been simultaneously developing high-fidelity spacecraft payload models as well as a robust distributed simulation environment for modeling spacecraft subsystems. Much of this research has occurred in the Distributed Architecture Simulation Laboratory (DASL). AFRL developers working in the DASL have effectively combined satellite power, attitude pointing, and communication link analysis subsystem models with robust satellite sensor models to create a first-order end-to-end satellite simulation capability. The merging of these two simulation areas has advanced the field of spacecraft simulation, design, and analysis, and enabled more in-depth mission and satellite utility analyses. A core capability of the DASL is the support of a variety of modeling and analysis efforts, ranging from physics and engineering-level modeling to mission and campaign-level analysis. The flexibility and agility of this simulation architecture will be used to support space mission analysis, military utility analysis, and various integrated exercises with other military and space organizations via direct integration, or through DOD standards such as Distributed Interaction Simulation. This paper discusses the results and lessons learned in modeling satellite communication link analysis, power, and attitude control subsystems for an end-to-end satellite simulation. It also discusses how these spacecraft subsystem simulations feed into and support military utility and space mission analyses.
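The communication link analysis mentioned above typically reduces to a link budget built on free-space path loss. A hedged sketch is given below; the frequencies, gains, and distance are illustrative placeholders, not values from the AFRL simulations.

```python
# Sketch of basic link-budget arithmetic: received power (dBW) = EIRP + Rx gain
# - free-space path loss - miscellaneous losses. All inputs are hypothetical.
import math

def free_space_path_loss_db(distance_km, freq_ghz):
    """Free-space path loss in dB for distance in km and frequency in GHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

def received_power_dbw(eirp_dbw, rx_gain_db, distance_km, freq_ghz, misc_loss_db=2.0):
    return eirp_dbw + rx_gain_db - free_space_path_loss_db(distance_km, freq_ghz) - misc_loss_db

p_rx = received_power_dbw(eirp_dbw=50.0, rx_gain_db=35.0,
                          distance_km=36000.0, freq_ghz=12.0)   # GEO downlink, illustrative
print(f"received power ~ {p_rx:.1f} dBW")
```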
Diagnostic Pathology and Laboratory Medicine in the Age of “Omics”
Finn, William G.
2007-01-01
Functional genomics and proteomics involve the simultaneous analysis of hundreds or thousands of expressed genes or proteins and have spawned the modern discipline of computational biology. Novel informatic applications, including sophisticated dimensionality reduction strategies and cancer outlier profile analysis, can distill clinically exploitable biomarkers from enormous experimental datasets. Diagnostic pathologists are now charged with translating the knowledge generated by the “omics” revolution into clinical practice. Food and Drug Administration-approved proprietary testing platforms based on microarray technologies already exist and will expand greatly in the coming years. However, for diagnostic pathology, the greatest promise of the “omics” age resides in the explosion in information technology (IT). IT applications allow for the digitization of histological slides, transforming them into minable data and enabling content-based searching and archiving of histological materials. IT will also allow for the optimization of existing (and often underused) clinical laboratory technologies such as flow cytometry and high-throughput core laboratory functions. The state of pathology practice does not always keep up with the pace of technological advancement. However, to use fully the potential of these emerging technologies for the benefit of patients, pathologists and clinical scientists must embrace the changes and transformational advances that will characterize this new era. PMID:17652635
NASA Astrophysics Data System (ADS)
Flemings, P. B.; Phillips, S. C.
2017-12-01
In May 2017, a science team led by the University of Texas-Austin conducted drilling and coring operations from the Helix Q4000 targeting gas hydrates in sand-rich reservoirs in the Green Canyon 955 block in the northern Gulf of Mexico. The UT-GOM2-1 expedition goals were to 1) test two configurations of pressure coring devices to assess relative performance with respect to recovery and quality of samples and 2) gather sufficient samples to allow laboratories throughout the US to investigate a range of outstanding science questions related to the origin and nature of gas hydrate-bearing sands. In the first well (UT-GOM2-1-H002), 1 of the 8 cores was recovered under pressure with 34% recovery. In the second well (UT-GOM2-1-H005), 12 of 13 cores were recovered under pressure with 77% recovery. The pressure cores were imaged and logged under pressure. Samples were degassed both shipboard and dockside to interpret hydrate concentration and gas composition. Samples for microbiological and porewater analysis were taken from the depressurized samples. Twenty-one 3-ft pressure cores were returned to the University of Texas for storage, distribution, and further analysis. Preliminary analyses document that the hydrate-bearing interval is composed of two interbedded (cm to m thickness) facies. Lithofacies II is composed of sandy silt and has trough cross bedding, whereas Lithofacies III is composed of clayey silt with no observed bedforms. Lithofacies II has low density (1.7 to 1.9 g/cc) and high velocity (3000-3250 m/s) beds, whereas Lithofacies III has high density (~1.9-2.1 g/cc) and low velocity (~1700 m/s). Quantitative degassing was used to determine that Lithofacies II contains high hydrate saturation (66-87%) and Lithofacies III contains moderate saturation (~18-30%). Gas samples were analyzed periodically in each experiment and were composed primarily of methane with an average of 94 ppm ethane and detectable, but not quantifiable, propane. The core data will provide a foundation for scientific exploration by the greater hydrate research community.
Transport of atrazine and dicamba through silt and loam soils
Tindall, James A.; Friedel, Michael J.
2016-01-01
The objectives of this research were to determine the role of preferential flow paths in the transport of atrazine (2-chloro-4-(ethylamino)-6-(isopropylamino)-s-triazine) and dicamba (3,6-dichloro-2-methoxybenzoic acid) through silt and loam soils overlying the High Plains aquifer in Nebraska. In a previous study, 3 of 6 study areas demonstrated high percentages of macropores; those three areas were used in this study for analysis of chemical transport. As a subsequent part of the study, 12 intact soil cores (30-cm diameter by 40-cm height) were excavated sequentially, two from each of the following depths: 0-40 cm and 40-80 cm. These cores were used to study preferential flow characteristics using dye staining and to determine hydraulic properties. Two undisturbed experimental field plots, each with a 3-m2 surface area, were installed in three study areas in Nebraska. Each was instrumented with suction lysimeters and tensiometers at depths of 10 cm to 80 cm in 10-cm increments. Additionally, each plot was planted with corn (Zea mays). A neutron probe access tube was installed in each plot to determine soil water content at 15-cm intervals. All plots were enclosed with a raised frame (of 8-cm height) to prevent surface runoff. All suction lysimeters were purged monthly for three months and were sampled immediately prior to pre-plant herbicide application to obtain background chemical concentrations. Atrazine and dicamba moved rapidly through the soil, but only after a heavy rainfall event, probably owing to the presence of preferential flow paths and lack of microbial degradation in these soil areas. Staining of laboratory cores showed a positive correlation between the percent area stained by depth and the subsequent breakthrough of Br- in the laboratory and leaching of field-applied herbicides owing to large rainfall events. Suction lysimeter samples in the field showed increases in concentrations of herbicides at depths where laboratory data indicated greater percentages of what appeared to be preferential flow paths. Concentrations of atrazine and dicamba exceeding 0.30 and 0.05 µg mL-1 were observed at depths of 10-30 cm and 50-70 cm after two months following heavy rainfall events. It appears from the laboratory experiment that preferential flow paths were a significant factor in transport of atrazine and dicamba.
The formation of the doubly stable stratification in the Mediterranean Outflow
NASA Astrophysics Data System (ADS)
Bormans, M.; Turner, J. S.
1990-11-01
The Mediterranean Outflow as it exits from the Strait of Gibraltar can be seen as a gravity current flowing down the slope and mixing with Atlantic Water until it reaches its own density level. Typical salinity and temperature profiles through the core region of a Meddy show that the bottom of the core is colder and saltier than the top, leading to a stably stratified core with respect to double-diffusive processes. The bottom of the core is also more enriched with Mediterranean Water than the top, and this behaviour can be explained by a reduced mixing of the source water with the environment close to the rigid bottom. Although the mechanism involved is different from the actual case, we have successfully produced these doubly stable gradients in laboratory experiments which incorporate the "filling-box" mechanism. Salt and sugar were used as laboratory analogues of temperature and salt, respectively. The laboratory experiments consisted of supplying a dense input fluid at the surface of a linearly salt-stratified environment. We suggest that r_eq, the ratio of the initial volume flux at the source to the volume flux at the equilibrium level, is an important parameter, and that in our experiments it must in general be smaller than 0.1 in order to produce a doubly stable region of salt and sugar. The most relevant experiments had a mixed sugar/salt input, which is the analogue of the Mediterranean Outflow as it mixes with Atlantic Water outside the Strait of Gibraltar.
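For clarity, the criterion described in words above can be written compactly; the symbols Q_0 (initial volume flux at the source) and Q_eq (volume flux at the equilibrium level) are our shorthand for the quantities named in the abstract, not notation taken from the paper.

```latex
r_{\mathrm{eq}} = \frac{Q_{0}}{Q_{\mathrm{eq}}} \lesssim 0.1
\qquad \text{(condition under which a doubly stable salt--sugar region was produced)}
```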
Constellation Architecture Team-Lunar: Lunar Habitat Concepts
NASA Technical Reports Server (NTRS)
Toups, Larry; Kennedy, Kriss J.
2008-01-01
This paper will describe lunar habitat concepts that were defined as part of the Constellation Architecture Team-Lunar (CxAT-Lunar) in support of the Vision for Space Exploration. There are many challenges to designing lunar habitats such as mission objectives, launch packaging, lander capability, and risks. Surface habitats are required in support of sustaining human life to meet the mission objectives of lunar exploration, operations, and sustainability. Lunar surface operations consist of crew operations, mission operations, EVA operations, science operations, and logistics operations. Habitats are crewed pressurized vessels that include surface mission operations, science laboratories, living support capabilities, EVA support, logistics, and maintenance facilities. The challenge is to deliver, unload, and deploy self-contained habitats and laboratories to the lunar surface. The CxAT-Lunar surface campaign analysis focused on three primary trade sets of analysis. Trade set one (TS1) investigated sustaining a crew of four for six months with full outpost capability and the ability to perform long surface mission excursions using large mobility systems. Two basic habitat concepts, a hard metallic horizontal cylinder and a larger inflatable torus, were investigated as options in response to the surface exploration architecture campaign analysis. Figures 1 and 2 depict the notional outpost configurations for this trade set. Trade set two (TS2) investigated a mobile architecture approach with the campaign focused on early exploration using two small pressurized rovers and a mobile logistics support capability. This exploration concept will not be described in this paper. Trade set three (TS3) investigated delivery of a "core" habitation capability in support of an early outpost that would mature into the TS1 full outpost capability. Three core habitat concepts were defined for this campaign analysis: one with a four-port core habitat, another with a two-port core habitat, and a third that investigated leveraging commonality of the lander ascent module and airlock pressure vessel hard shell. The paper will describe an overview of the various habitat concepts and their functionality. The Crew Operations area includes basic crew accommodations such as sleeping, eating, hygiene, and stowage. The EVA Operations area includes additional EVA capability beyond the suit-port airlock function such as redundant airlock(s), suit maintenance, spares stowage, and suit stowage. The Logistics Operations area includes the enhanced accommodations for 180 days such as closed-loop life support systems hardware, consumable stowage, spares stowage, interconnection to the other Hab units, and a common interface mechanism for future growth and mating to a pressurized rover. The Mission & Science Operations area includes enhanced outpost autonomy such as an IVA glove box, life support, and medical operations.
Infrastructure for Personalized Medicine at Partners HealthCare
Weiss, Scott T.; Shin, Meini Sumbada
2016-01-01
Partners HealthCare Personalized Medicine (PPM) is a center within the Partners HealthCare system (founded by Massachusetts General Hospital and Brigham and Women’s Hospital) whose mission is to utilize genetics and genomics to improve the care of patients in a cost effective manner. PPM consists of five interconnected components: (1) Laboratory for Molecular Medicine (LMM), a CLIA laboratory performing genetic testing for patients world-wide; (2) Translational Genomics Core (TGC), a core laboratory providing genomic platforms for Partners investigators; (3) Partners Biobank, a biobank of samples (DNA, plasma and serum) for 50,000 Consented Partners patients; (4) Biobank Portal, an IT infrastructure and viewer to bring together genotypes, samples, phenotypes (validated diagnoses, radiology, and clinical chemistry) from the electronic medical record to Partners investigators. These components are united by (5) a common IT system that brings researchers, clinicians, and patients together for optimal research and patient care. PMID:26927187
NASA Astrophysics Data System (ADS)
Rizzo, R. E.; Healy, D.; Farrell, N. J.
2017-12-01
Numerous laboratory brittle deformation experiments have shown that a rapid transition exists in the behaviour of porous materials under stress: at a certain point, early formed tensile cracks interact and coalesce into a `single' narrow zone, the shear plane, rather than remaining distributed throughout the material. In this work, we present and apply a novel image processing tool which is able to quantify this transition between distributed (`stable') damage accumulation and localised (`unstable') deformation, in terms of size, density, and orientation of cracks at the point of failure. Our technique, based on a two-dimensional (2D) continuous Morlet wavelet analysis, can recognise, extract and visually separate the multi-scale changes occurring in the fracture network during the deformation process. We have analysed high-resolution SEM-BSE images of thin sections of Hopeman Sandstone (Scotland, UK) taken from core plugs deformed under triaxial conditions, with increasing confining pressure. Through this analysis, we can determine the relationship between the initial orientation of tensile microcracks and the final geometry of the through-going shear fault, exploiting the total areal coverage of the analysed image. In addition, by comparing patterns of fractures in thin sections derived from triaxial (σ1>σ2=σ3=Pc) laboratory experiments conducted at different confining pressures (Pc), we can quantitatively explore the relationship between the observed geometry and the inferred mechanical processes. The methodology presented here can have important implications for larger-scale mechanical problems related to major fault propagation. Just as a core plug scale fault localises through extension and coalescence of microcracks, larger faults also grow by extension and coalescence of segments in a multi-scale process by which microscopic cracks can ultimately lead to macroscopic faulting. Consequently, wavelet analysis represents a useful tool for fracture pattern recognition, applicable to the detection of the transitions occurring at the time of catastrophic rupture.
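The following Python sketch illustrates the kind of oriented, multi-scale 2D continuous Morlet analysis described above; it is not the authors' tool, and the image, scales, orientations, and wavelet parameters are illustrative assumptions.

```python
# Minimal sketch of an oriented 2D continuous Morlet analysis of a grayscale
# image, in the spirit of the approach described above. Not the authors' code;
# the random image stands in for an SEM-BSE thin-section image.
import numpy as np
from scipy.signal import fftconvolve

def morlet_2d(scale, theta, size=65, k0=5.5):
    """Real part of a 2D Morlet wavelet at a given scale and orientation."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # rotate coordinates so the wavelet oscillates along direction theta
    xr = (x * np.cos(theta) + y * np.sin(theta)) / scale
    yr = (-x * np.sin(theta) + y * np.cos(theta)) / scale
    envelope = np.exp(-0.5 * (xr**2 + yr**2))
    wave = np.cos(k0 * xr)                       # carrier oscillation
    psi = envelope * wave
    return (psi - psi.mean()) / scale            # zero-mean, roughly scale-normalised

def wavelet_responses(image, scales, thetas):
    """Return |response| maps indexed by (scale, orientation)."""
    out = {}
    for s in scales:
        for t in thetas:
            out[(s, t)] = np.abs(fftconvolve(image, morlet_2d(s, t), mode="same"))
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.normal(size=(256, 256))            # stand-in for a crack-bearing image
    scales = [2, 4, 8, 16]
    thetas = np.deg2rad([0.0, 45.0, 90.0, 135.0])
    responses = wavelet_responses(img, scales, thetas)
    # dominant orientation per scale: a crude proxy for microcrack orientation
    for s in scales:
        energies = {t: responses[(s, t)].mean() for t in thetas}
        best = max(energies, key=energies.get)
        print(f"scale {s:2d} px -> dominant orientation {np.rad2deg(best):.0f} deg")
```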
Uncertainty assessment method for the Cs-137 fallout inventory and penetration depth.
Papadakos, G N; Karangelos, D J; Petropoulos, N P; Anagnostakis, M J; Hinis, E P; Simopoulos, S E
2017-05-01
Within the presented study, soil samples were collected in year 2007 at 20 different locations of the Greek terrain, both from the surface and also from depths down to 26 cm. Sampling locations were selected primarily from areas where high levels of 137Cs deposition after the Chernobyl accident had already been identified by the Nuclear Engineering Laboratory of the National Technical University of Athens during and after the year of 1986. At one location of relatively higher deposition, soil core samples were collected following a 60 m by 60 m Cartesian grid with a 20 m node-to-node distance. Single or pair core samples were also collected from the remaining 19 locations. Sample measurements and analysis were used to estimate the 137Cs inventory and the corresponding depth migration, twenty years after the deposition on Greek terrain. Based on these data, the uncertainty components of the whole sampling-to-results procedure were investigated. A cause-and-effect assessment process was used to apply the law of error propagation and demonstrate that the dominant component of the combined uncertainty is that due to the spatial variability of the contemporary (2007) 137Cs inventory. A secondary, yet also significant, component was identified to be the activity measurement process itself. Other less-significant uncertainty parameters were the sampling methods, the variation in the soil field density with depth, and the preparation of samples for measurement. The sampling grid experiment allowed for the quantitative evaluation of the uncertainty due to spatial variability, aided by semivariance analysis. A denser, optimized grid could return more accurate values for this component, but at a significantly elevated laboratory cost in terms of both human and material resources. Using the collected data, and for the case of single-core soil sampling under a well-defined sampling methodology and quality assurance, the uncertainty component due to spatial variability was evaluated at about 19% for the 137Cs inventory and up to 34% for the 137Cs penetration depth. Based on the presented results and also on related literature, it is argued that such high uncertainties should be anticipated for single-core samplings conducted using similar methodology and employed as 137Cs inventory and penetration depth estimators. Copyright © 2017 Elsevier Ltd. All rights reserved.
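As a minimal illustration of how such components combine under the law of propagation for independent terms (addition in quadrature), the Python sketch below uses the 19% spatial-variability figure quoted above; the remaining component values are placeholders, not numbers from the study.

```python
# Illustrative quadrature combination of relative uncertainty components.
# Only the spatial-variability value (0.19) comes from the abstract; the
# other entries are assumed placeholders for demonstration.
import math

components = {
    "spatial variability": 0.19,   # dominant term, quoted in the abstract
    "activity measurement": 0.08,  # illustrative
    "sampling method": 0.03,       # illustrative
    "field density vs depth": 0.02,
    "sample preparation": 0.02,
}

u_combined = math.sqrt(sum(u**2 for u in components.values()))
print(f"combined relative uncertainty ~ {u_combined:.1%}")
for name, u in components.items():
    print(f"  {name:<25s} contributes {u**2 / u_combined**2:.0%} of the variance")
```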
Mass Spectrometry on Future Mars Landers
NASA Technical Reports Server (NTRS)
Brinckerhoff, W. B.; Mahaffy, P. R.
2011-01-01
Mass spectrometry investigations on the 2011 Mars Science Laboratory (MSL) and the 2018 ExoMars missions will address core science objectives related to the potential habitability of their landing site environments and more generally the near-surface organic inventory of Mars. The analysis of complex solid samples by mass spectrometry is a well-known approach that can provide a broad and sensitive survey of organic and inorganic compounds as well as supportive data for mineralogical analysis. The science value of such compositional information is maximized when one appreciates the particular opportunities and limitations of in situ analysis with resource-constrained instrumentation in the context of a complete science payload and applied to materials found in a particular environment. The Sample Analysis at Mars (SAM) investigation on MSL and the Mars Organic Molecule Analyzer (MOMA) investigation on ExoMars will thus benefit from and inform broad-based analog field site work linked to the Mars environments where such analysis will occur.
Turning the Page: Advancing Paper-Based Microfluidics for Broad Diagnostic Application.
Gong, Max M; Sinton, David
2017-06-28
Infectious diseases are a major global health issue. Diagnosis is a critical first step in effectively managing their spread. Paper-based microfluidic diagnostics first emerged in 2007 as a low-cost alternative to conventional laboratory testing, with the goal of improving accessibility to medical diagnostics in developing countries. In this review, we examine the advances in paper-based microfluidic diagnostics for medical diagnosis in the context of global health from 2007 to 2016. The theory of fluid transport in paper is first presented. The next section examines the strategies that have been employed to control fluid and analyte transport in paper-based assays. Tasks such as mixing, timing, and sequential fluid delivery have been achieved in paper and have enabled analytical capabilities comparable to those of conventional laboratory methods. The following section examines paper-based sample processing and analysis. The most impactful advancement here has been the translation of nucleic acid analysis to a paper-based format. Smartphone-based analysis is another exciting development with potential for wide dissemination. The last core section of the review highlights emerging health applications, such as male fertility testing and wearable diagnostics. We conclude the review with the future outlook, remaining challenges, and emerging opportunities.
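The theory of fluid transport in paper referred to above is conventionally summarized by the Lucas-Washburn relation for capillary wicking; the equation below is that textbook result (with l the wicking distance, γ the surface tension, r the effective pore radius, θ the contact angle, and μ the fluid viscosity), not a formula quoted from the review.

```latex
l(t) = \sqrt{\frac{\gamma \, r \, \cos\theta}{2\mu}\, t}
```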
Portable apparatus for surface evaluation of furniture panels
B. G. Heebink
1963-01-01
In 1959, a new technique was devised at the Forest Products Laboratory that provided a means of examining, evaluating, and recording the show-through pattern (often called telegraphing) of panels made with particle board cores. Although the technique was devised as a working tool to evaluate show-through characteristics of particle board cores, it can be used equally...
Factors affecting the use of increment cores to assess fixation
Stan T. Lebow
2001-01-01
As part of an effort to ensure that treated wood products have minimal environmental and handling concerns, an American Wood Preservers Association task force is considering the development of a test to assess the degree of fixation of waterborne wood preservatives. The proposed test involves removal and leaching of increment cores. This paper describes a laboratory...
ERIC Educational Resources Information Center
Gilliland, John W.
Development of a design for a new elementary school facility is traced through evaluation of various innovative facilities. Significant features include--(1) the spiral plan form, (2) centralized core levels including teacher work center, "perception" core, and interior stream aquarium, (3) the learning laboratory classroom suites, (4) a unique…
The SIV plasma viral load assay performed by the Quantitative Molecular Diagnostics Core (QMDC) utilizes reagents specifically designed to detect and accurately quantify the full range of SIV/SHIV viral variants and clones in common usage in the rese
Proceedings of the wellbore sampling workshop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Traeger, R.K.; Harding, B.W.
Representatives from academia, industry and research laboratories participated in an intensive two-day review to identify major technological limitations in obtaining solid and fluid samples from wellbores. Top priorities identified for further development include: coring of hard and unconsolidated materials; flow-through fluid samplers with borehole measurements of T, P, and pH; and nonintrusive interrogation of pressure cores.
NASA Astrophysics Data System (ADS)
Nakagawa, S.; Kneafsey, T. J.; Chang, C.; Harper, E.
2014-12-01
During geological sequestration of CO2, fractures are expected to play a critical role in controlling the migration of the injected fluid in reservoir rock. To detect the invasion of supercritical (sc-) CO2 and to determine its saturation, velocity and attenuation of seismic waves can be monitored. When both fractures and matrix porosity connected to the fractures are present, wave-induced dynamic poroelastic interactions between these two different types of rock porosity—high-permeability, high-compliance fractures and low-permeability, low-compliance matrix porosity—result in complex velocity and attenuation changes of compressional waves as scCO2 invades the rock. We conducted core-scale laboratory scCO2 injection experiments on small (diameter 1.5 inches, length 3.5-4 inches), medium-porosity/permeability (porosity 15%, matrix permeability 35 md) sandstone cores. During the injection, the compressional and shear (torsion) wave velocities and attenuations of the entire core were determined using our Split Hopkinson Resonant Bar (short-core resonant bar) technique in the frequency range of 1-2 kHz, and the distribution and saturation of the scCO2 determined via X-ray CT imaging using a medical CT scanner. A series of tests were conducted on (1) intact rock cores, (2) a core containing a mated, core-parallel fracture, (3) a core containing a sheared core-parallel fracture, and (4) a core containing a sheared, core-normal fracture. For intact cores and a core containing a mated sheared fracture, injections of scCO2 into an initially water-saturated sample resulted in large and continuous decreases in the compressional velocity as well as temporary increases in the attenuation. For a sheared core-parallel fracture, large attenuation was also observed, but almost no changes in the velocity occurred. In contrast, a sample containing a core-normal fracture exhibited complex behavior of compressional wave attenuation: the attenuation peaked as the leading edge of the scCO2 approached the fracture; followed by an immediate drop as scCO2 invaded the fracture; and by another, gradual increase as the scCO2 infiltrated into the other side of the fracture. The compressional wave velocity declined monotonically, but the rate of velocity decrease changed with the changes in attenuation.
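As a rough companion to these observations, the sketch below applies a standard Gassmann fluid-substitution estimate of how the compressional velocity of a water-saturated sandstone falls as scCO2 replaces brine; it is not the authors' dual-porosity (fracture plus matrix) model, and all moduli, densities, and the porosity value are illustrative assumptions.

```python
# Standard Gassmann fluid substitution with a Reuss (uniform) fluid mix,
# used here only to illustrate the Vp decrease as scCO2 replaces brine.
# All inputs are assumed, generic sandstone values, not measured data.
import numpy as np

def gassmann_vp(s_co2, phi=0.15, k_dry=12e9, mu=10e9, k_min=37e9,
                k_w=2.25e9, k_co2=0.06e9, rho_grain=2650.0,
                rho_w=1000.0, rho_co2=650.0):
    # Reuss average of the pore-fluid bulk moduli
    k_fl = 1.0 / ((1.0 - s_co2) / k_w + s_co2 / k_co2)
    # Gassmann saturated bulk modulus
    num = (1.0 - k_dry / k_min) ** 2
    den = phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min ** 2
    k_sat = k_dry + num / den
    # bulk density of the partially saturated rock
    rho = (1.0 - phi) * rho_grain + phi * ((1.0 - s_co2) * rho_w + s_co2 * rho_co2)
    return np.sqrt((k_sat + 4.0 / 3.0 * mu) / rho)

for s in [0.0, 0.2, 0.5, 0.8]:
    print(f"scCO2 saturation {s:.1f}: Vp ~ {gassmann_vp(s):.0f} m/s")
```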
Moreno, Lilliana I; Brown, Alice L; Callaghan, Thomas F
2017-07-01
Rapid DNA platforms are fully integrated systems capable of producing and analyzing short tandem repeat (STR) profiles from reference sample buccal swabs in less than two hours. The technology requires minimal user interaction and experience, making it possible for high-quality profiles to be generated outside an accredited laboratory. The automated production of point-of-collection reference STR profiles could eliminate the time delay for shipment and analysis of arrestee samples at centralized laboratories. Furthermore, point-of-collection analysis would allow searching against profiles from unsolved crimes during the normal booking process once the infrastructure to immediately search the Combined DNA Index System (CODIS) database from the booking station is established. The DNAscan/ANDE™ Rapid DNA Analysis™ System developed by Network Biosystems was evaluated for robustness and reliability in the production of high-quality reference STR profiles for database enrollment and searching applications. A total of 193 reference samples were assessed for concordance at the 13 CODIS loci. Studies to evaluate contamination, reproducibility, precision, stutter, peak height ratio, noise and sensitivity were also performed. The system proved to be robust, consistent and dependable. Results indicated an overall success rate of 75% for the 13 CODIS core loci and, more importantly, no incorrect calls were identified. The DNAscan/ANDE™ system could be used confidently without human interaction in both laboratory and non-laboratory settings to generate reference profiles. Published by Elsevier B.V.
MAGIC database and interfaces: an integrated package for gene discovery and expression.
Cordonnier-Pratt, Marie-Michèle; Liang, Chun; Wang, Haiming; Kolychev, Dmitri S; Sun, Feng; Freeman, Robert; Sullivan, Robert; Pratt, Lee H
2004-01-01
The rapidly increasing rate at which biological data is being produced requires a corresponding growth in relational databases and associated tools that can help laboratories contend with that data. With this need in mind, we describe here a Modular Approach to a Genomic, Integrated and Comprehensive (MAGIC) Database. This Oracle 9i database derives from an initial focus in our laboratory on gene discovery via production and analysis of expressed sequence tags (ESTs), and subsequently on gene expression as assessed by both EST clustering and microarrays. The MAGIC Gene Discovery portion of the database focuses on information derived from DNA sequences and on its biological relevance. In addition to MAGIC SEQ-LIMS, which is designed to support activities in the laboratory, it contains several additional subschemas. The latter include MAGIC Admin for database administration, MAGIC Sequence for sequence processing as well as sequence and clone attributes, MAGIC Cluster for the results of EST clustering, MAGIC Polymorphism in support of microsatellite and single-nucleotide-polymorphism discovery, and MAGIC Annotation for electronic annotation by BLAST and BLAT. The MAGIC Microarray portion is a MIAME-compliant database with two components at present. These are MAGIC Array-LIMS, which makes possible remote entry of all information into the database, and MAGIC Array Analysis, which provides data mining and visualization. Because all aspects of interaction with the MAGIC Database are via a web browser, it is ideally suited not only for individual research laboratories but also for core facilities that serve clients at any distance.
Bio-inspired approach for intelligent unattended ground sensors
NASA Astrophysics Data System (ADS)
Hueber, Nicolas; Raymond, Pierre; Hennequin, Christophe; Pichler, Alexander; Perrot, Maxime; Voisin, Philippe; Moeglin, Jean-Pierre
2015-05-01
Improving the surveillance capacity over wide zones requires a set of smart battery-powered Unattended Ground Sensors capable of issuing an alarm to a decision-making center. Only high-level information has to be sent when a relevant suspicious situation occurs. In this paper we propose an innovative bio-inspired approach that mimics the human bi-modal vision mechanism and the parallel processing ability of the human brain. The designed prototype exploits two levels of analysis: a low-level panoramic motion analysis, the peripheral vision, and a high-level event-focused analysis, the foveal vision. By tracking moving objects and fusing multiple criteria (size, speed, trajectory, etc.), the peripheral vision module acts as a fast relevant event detector. The foveal vision module focuses on the detected events to extract more detailed features (texture, color, shape, etc.) in order to improve the recognition efficiency. The implemented recognition core is able to acquire human knowledge and to classify in real-time a huge amount of heterogeneous data thanks to its natively parallel hardware structure. This UGS prototype validates our system approach under laboratory tests. The peripheral analysis module demonstrates a low false alarm rate whereas the foveal vision correctly focuses on the detected events. A parallel FPGA implementation of the recognition core succeeds in fulfilling the embedded application requirements. These results are paving the way for future reconfigurable virtual field agents. By locally processing the data and sending only high-level information, their energy requirements and electromagnetic signature are optimized. Moreover, the embedded Artificial Intelligence core enables these bio-inspired systems to recognize and learn new significant events. By duplicating human expertise in potentially hazardous places, our miniature visual event detector will allow early warning and contribute to better human decision making.
Fahed, Robert; Ben Maacha, Malek; Ducroux, Célina; Khoury, Naim; Blanc, Raphaël; Piotin, Michel; Lapergue, Bertrand
2018-05-14
We aimed to assess the agreement between study investigators and the core laboratory (core lab) of a thrombectomy trial for imaging scores. The Alberta Stroke Program Early CT Score (ASPECTS), the European Collaborative Acute Stroke Study (ECASS) hemorrhagic transformation (HT) classification, and the Thrombolysis In Cerebral Infarction (TICI) scores as recorded by study investigators were compared with the core lab scores in order to assess interrater agreement, using Cohen's unweighted and weighted kappa statistics. There were frequent discrepancies between study sites and core lab for all the scores. Agreement for ASPECTS and ECASS HT classification was less than substantial, with disagreement occurring in more than one-third of cases. Agreement was higher on MRI-based scores than on CT, and was improved after dichotomization on both CT and MRI. Agreement for TICI scores was moderate (with disagreement occurring in more than 25% of patients), and went above the substantial level (less than 10% disagreement) after dichotomization (TICI 0/1/2a vs 2b/3). Discrepancies between scores assessed by the imaging core lab and those reported by study sites occurred in a significant proportion of patients. Disagreement in the assessment of ASPECTS and day 1 HT scores was more frequent on CT than on MRI. The agreement for the dichotomized TICI score (the trial's primary outcome) was substantial, with less than 10% of disagreement between study sites and core lab. NCT02523261, Post-results. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
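A minimal sketch of the agreement statistic used above (Cohen's kappa, here unweighted, plus the effect of dichotomizing TICI into 0/1/2a versus 2b/3) is shown below; the grades are invented for demonstration and scikit-learn is assumed as the implementation.

```python
# Illustrative site-versus-core-lab agreement on TICI grades using Cohen's
# kappa (unweighted) and after dichotomization. Scores below are made up.
from sklearn.metrics import cohen_kappa_score

site     = ["2b", "2a", "3", "0", "2b", "2a", "2b", "3", "1", "2b"]
core_lab = ["2b", "2b", "3", "0", "2a", "2a", "2b", "2b", "1", "2b"]

print("unweighted kappa:", round(cohen_kappa_score(site, core_lab), 2))

def dichotomize(grades):
    """Successful reperfusion = TICI 2b or 3."""
    return [int(g in ("2b", "3")) for g in grades]

print("dichotomized kappa:",
      round(cohen_kappa_score(dichotomize(site), dichotomize(core_lab)), 2))
```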
NASA Astrophysics Data System (ADS)
Conze, R.; Krysiak, F.; Wallrabe-Adams, H.; Graham, C. C.
2004-12-01
During August/September 2004, the Arctic Coring Expedition (ACEX) was used to trial a new Offshore Drilling Information System (OffshoreDIS). ACEX was the first Mission Specific Platform (MSP) expedition of the Integrated Ocean Drilling Programme (IODP), funded by the European Consortium for Ocean Research Drilling (ECORD). The British Geological Survey in conjunction with the University of Bremen and the European Petrophysics Consortium were the ECORD Science Operator (ESO) for ACEX. IODP MSP expeditions have very similar data management requirements and operate in similar working environments to the lake drilling projects conducted by the International Continental Scientific Drilling Program (ICDP), for example, the GLAD800, which has very restricted space on board and operates in difficult conditions. Both organizations require data capture and management systems that are mobile, flexible and that can be deployed quickly on small- to medium-sized drilling platforms for the initial gathering of data, and that can also be deployed onshore in laboratories where the bulk of the scientific work is conducted. ESO, therefore, decided that an adapted version of the existing Drilling Information System (DIS) used by ICDP projects would satisfy its requirements. Based on the existing DIS, an OffshoreDIS has been developed for MSP expeditions. The underlying data model is compatible with IODP(JANUS), the Bremen Core Repository, WDC-MARE/PANGAEA and the LacCore in Minneapolis. According to the specific expedition platform configuration and on-board workflow requirements for the Arctic, this data model, data pumps and user interfaces were adapted for the ACEX-OffshoreDIS. On the drill ship Vidar Viking the cores were catalogued and petrophysically logged using a GeoTek Multi-Sensor Core Logger System, while further initial measurements, lithological descriptions and biostratigraphic investigations were undertaken on the Oden, which provided laboratory facilities for the expedition. Onboard samples were registered in a corresponding sample archive on both vessels. The ACEX-OffshoreDIS used a local area network covering the two ships of the three icebreaker fleet by wireless LAN between the ships and partly wired LAN on the ships. A DIS-server was installed on each ship. These were synchronized by database replication and linked to a total of 10 client systems and label printers across both ships. The ACEX-OffshoreDIS will also be used for the scientific measurement and analysis phase of the expedition during the post-field operations `shore-party' in November 2004 at the Bremen Core Repository (BCR). The data management system employed in the Arctic will be reconfigured and deployed at the BCR. In addition, an eXtended DIS (XDIS) Web interface will be available. This will allow controlled sample distribution (core curation, sub-sampling) as well as sharing of data (registration, upload and download) with other laboratories which will be undertaking additional sampling and analyses. The OffshoreDIS data management system will be of long-term benefit to both IODP and ICDP, being deployed in forthcoming MSP offshore projects, ICDP lake projects and joint IODP-ICDP projects such as the New Jersey Coastal Plain Drilling Project.
Fast imaging of laboratory core floods using 3D compressed sensing RARE MRI.
Ramskill, N P; Bush, I; Sederman, A J; Mantle, M D; Benning, M; Anger, B C; Appel, M; Gladden, L F
2016-09-01
Three-dimensional (3D) imaging of the fluid distributions within the rock is essential to enable the unambiguous interpretation of core flooding data. Magnetic resonance imaging (MRI) has been widely used to image fluid saturation in rock cores; however, conventional acquisition strategies are typically too slow to capture the dynamic nature of the displacement processes that are of interest. Using Compressed Sensing (CS), it is possible to reconstruct a near-perfect image from significantly fewer measurements than was previously thought necessary, and this can result in a significant reduction in the image acquisition times. In the present study, a method using the Rapid Acquisition with Relaxation Enhancement (RARE) pulse sequence with CS to provide 3D images of the fluid saturation in rock core samples during laboratory core floods is demonstrated. An objective method using image quality metrics for the determination of the most suitable regularisation functional to be used in the CS reconstructions is reported. It is shown that for the present application, Total Variation outperforms the Haar and Daubechies3 wavelet families in terms of the agreement of their respective CS reconstructions with a fully-sampled reference image. Using the CS-RARE approach, 3D images of the fluid saturation in the rock core have been acquired in 16 min. The CS-RARE technique has been applied to image the residual water saturation in the rock during a water-water displacement core flood. With a flow rate corresponding to an interstitial velocity of v_i = 1.89 ± 0.03 ft/day, 0.1 pore volumes were injected over the course of each image acquisition, a four-fold reduction when compared to a fully-sampled RARE acquisition. Finally, the 3D CS-RARE technique has been used to image the drainage of dodecane into the water-saturated rock in which the dynamics of the coalescence of discrete clusters of the non-wetting phase are clearly observed. The enhancement in the temporal resolution that has been achieved using the CS-RARE approach enables dynamic transport processes pertinent to laboratory core floods to be investigated in 3D on a time-scale and with a spatial resolution that, until now, has not been possible. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
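The toy Python sketch below illustrates the principle of compressed-sensing reconstruction from under-sampled k-space with a Total Variation penalty, using simple data-consistency projections on a 2D phantom; it is a didactic stand-in under stated assumptions, not the 3D CS-RARE reconstruction used in the study.

```python
# Toy CS reconstruction: small smoothed-TV gradient steps alternated with a
# hard projection onto the measured k-space samples (POCS-style). A 2D numpy
# sketch with an invented phantom, not the study's 3D CS-RARE pipeline.
import numpy as np

def tv_grad(x, eps=0.1):
    """Gradient of a smoothed (Charbonnier) total variation of image x."""
    dx = np.diff(x, axis=0, append=x[-1:, :])
    dy = np.diff(x, axis=1, append=x[:, -1:])
    mag = np.sqrt(dx**2 + dy**2 + eps**2)
    px, py = dx / mag, dy / mag
    div = (px - np.roll(px, 1, axis=0)) + (py - np.roll(py, 1, axis=1))
    return -div

def cs_reconstruct(kspace, mask, tau=0.02, eps=0.1, iters=400):
    """Alternate a TV-smoothing step with a data-consistency projection."""
    x = np.real(np.fft.ifft2(kspace, norm="ortho"))      # zero-filled start
    for _ in range(iters):
        x = x - tau * tv_grad(x, eps)                    # regularization step
        X = np.fft.fft2(x, norm="ortho")
        X[mask] = kspace[mask]                           # keep measured samples
        x = np.real(np.fft.ifft2(X, norm="ortho"))
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    truth = np.zeros((64, 64))
    truth[20:44, 24:40] = 1.0                            # piecewise-constant phantom
    mask = rng.random((64, 64)) < 0.33                   # ~3x undersampling
    kspace = mask * np.fft.fft2(truth, norm="ortho")
    recon = cs_reconstruct(kspace, mask)
    err = np.linalg.norm(recon - truth) / np.linalg.norm(truth)
    print(f"relative reconstruction error: {err:.3f}")
```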
Mechanical Stability of Fractured Rift Basin Mudstones: from lab to basin scale
NASA Astrophysics Data System (ADS)
Zakharova, N. V.; Goldberg, D.; Collins, D.; Swager, L.; Payne, W. G.
2016-12-01
Understanding petrophysical and mechanical properties of caprock mudstones is essential for ensuring good containment and mechanical formation stability at potential CO2 storage sites. Natural heterogeneity and presence of fractures, however, create challenges for accurate prediction of mudstone behavior under injection conditions and at reservoir scale. In this study, we present a multi-scale geomechanical analysis for Mesozoic mudstones from the Newark Rift basin, integrating petrophysical core and borehole data, in situ stress measurements, and caprock stability modeling. The project, funded by the U.S. Department of Energy's National Energy Technology Laboratory (NETL), focuses on the Newark basin as a representative locality for a series of the Mesozoic rift basins in eastern North America considered as potential CO2 storage sites. An extensive core characterization program, which included laboratory CT scans, XRD, SEM, MICP, porosity, permeability, acoustic velocity measurements, and geomechanical testing under a range of confining pressures, revealed large variability and heterogeneity in both petrophysical and mechanical properties. Estimates of unconfined compressive strength for these predominantly lacustrine mudstones range from 5,000 to 50,000 psi, with only a weak correlation to clay content. Thinly bedded intervals exhibit up to 30% strength anisotropy. Mineralized fractures, abundant in most formations, are characterized by compressive strength as low as 10% of matrix strength. Upscaling these observations from core to reservoir scale is challenging. No simple one-to-one correlation between mechanical and petrophysical properties exists, and therefore, we develop multivariate empirical relationships among these properties. A large suite of geophysical logs, including new measurements of the in situ stress field, is used to extrapolate these relationships to a basin-scale geomechanical model and predict mudstone behavior under injection conditions.
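The multivariate empirical calibration mentioned above can be sketched as a simple regression of strength on log-derived properties; the synthetic data and coefficients below are placeholders, not the Newark basin measurements, and scikit-learn is assumed as the fitting tool.

```python
# Sketch of a multivariate empirical calibration: predicting unconfined
# compressive strength (UCS, psi) from porosity, clay fraction, and Vp.
# All data are synthetic stand-ins generated for illustration only.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
n = 60
porosity = rng.uniform(0.02, 0.15, n)      # fraction
clay = rng.uniform(0.1, 0.5, n)            # clay volume fraction
vp = rng.uniform(3000.0, 5500.0, n)        # m/s
# assumed "true" trend plus scatter, purely for demonstration
ucs = 2.0e4 + 6.0 * vp - 8.0e4 * porosity - 1.5e4 * clay + rng.normal(0, 2000, n)

X = np.column_stack([porosity, clay, vp])
model = LinearRegression().fit(X, ucs)
print("R^2 on training data:", round(model.score(X, ucs), 2))
print("coefficients [porosity, clay, Vp]:", np.round(model.coef_, 1))
# A fit of this kind, calibrated on core data, can then be applied to
# continuous logs to extrapolate strength to the basin scale.
```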
Developing laboratory networks: a practical guide and application.
Kirk, Carol J; Shult, Peter A
2010-01-01
The role of the public health laboratory (PHL) in support of public health response has expanded beyond testing to include a number of other core functions, such as emergency response, training and outreach, communications, laboratory-based surveillance, and laboratory data management. These functions can only be accomplished by a network that includes public health and other agency laboratories and clinical laboratories. It is a primary responsibility of the PHL to develop and maintain such a network. In this article, we present practical recommendations, based on 17 years of network development experience, for the development of statewide laboratory networks. These recommendations, and examples of current laboratory networks, are provided to facilitate laboratory network development in other states. The development of laboratory networks will enhance each state's public health system and is critical to the development of a robust national Laboratory Response Network.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dinehart, M.
1996-09-01
This document reports status and technical progress for Los Alamos National Laboratory's 94-1 Research and Development projects. An introduction to the project structure and an executive summary are included. Projects described include Electrolytic Decontamination; Combustibles; Detox; Sand, Slag, and Crucible; Surveillance; and Core Technology.
The Invention Factory: Thomas Edison's Laboratories. Teaching with Historic Places.
ERIC Educational Resources Information Center
Bolger, Benjamin
This lesson explores the group of buildings in West Orange, New Jersey, built in 1887, that formed the core of Thomas Edison's research and development complex. They consisted of chemistry, physics, and metallurgy laboratories; machine shop; pattern shop; research library; and rooms for experiments. The lesson explains that the prototypes (ideas…
An Ill-Structured PBL-Based Microprocessor Course without Formal Laboratory
ERIC Educational Resources Information Center
Kim, Jungkuk
2012-01-01
This paper introduces a problem-based learning (PBL) microprocessor application course designed according to the following strategies: 1) hands-on training without having a formal laboratory, and 2) intense student-centered cooperative learning through an ill-structured problem. PBL was adopted as the core educational technique of the course to…
Model-Based Reasoning in the Physics Laboratory: Framework and Initial Results
ERIC Educational Resources Information Center
Zwickl, Benjamin M.; Hu, Dehui; Finkelstein, Noah; Lewandowski, H. J.
2015-01-01
We review and extend existing frameworks on modeling to develop a new framework that describes model-based reasoning in introductory and upper-division physics laboratories. Constructing and using models are core scientific practices that have gained significant attention within K-12 and higher education. Although modeling is a broadly applicable…
The stability and transport of radio-labeled Fe2O3 particles were studied using laboratory batch and column techniques. Core material collected from shallow sand and gravel aquifer was used as the immobile column matrix material. Variables in the study included flow rate, pH, i...
McDonald, Peter R; Roy, Anuradha; Chaguturu, Rathnam
2011-07-01
The University of Kansas High-Throughput Screening (KU HTS) core is a state-of-the-art drug-discovery facility with an entrepreneurial open-service policy, which provides centralized resources supporting public- and private-sector research initiatives. The KU HTS core was established in 2002 at the University of Kansas with support from an NIH grant and the state of Kansas. It collaborates with investigators from national and international academic, nonprofit and pharmaceutical organizations in executing HTS-ready assay development and screening of chemical libraries for target validation, probe selection, hit identification and lead optimization. This is part two of a contribution from the KU HTS laboratory.
Exploration of Antarctic Subglacial environments: a challenge for analytical chemistry
NASA Astrophysics Data System (ADS)
Traversi, R.; Becagli, S.; Castellano, E.; Ghedini, C.; Marino, F.; Rugi, F.; Severi, M.; Udisti, R.
2009-12-01
The large number of subglacial lakes detected in the Dome C area in East Antarctica suggests that this region may be a valuable source of paleo-records essential for understanding the evolution of the Antarctic ice cap and climate changes over the last several million years. In the framework of the Project on “Exploration and characterization of Concordia Lake, Antarctica”, supported by the Italian Program for Antarctic Research (PNRA), a glaciological investigation of the Dome C “Lake District” is planned. Indeed, the glacio-chemical characterisation of the ice column over subglacial lakes will make it possible to evaluate the fluxes of major and trace chemical species along the ice column and in the accreted ice and, consequently, the availability of nutrients and oligo-elements for possible biological activity in the lake water and sediments. Melting and freezing at the base of the ice sheet should be able to deliver carbon and salts to the lake, as observed for the Vostok subglacial lake; these are thought to be able to support a low concentration of micro-organisms for extended periods of time. Thus, this investigation represents the first step toward exploring subglacial environments, including sampling and analysis of accreted ice, lake water and sediments. In order to perform reliable analytical measurements, especially of trace chemical species, clean sub-sampling and analytical techniques are required. For this purpose, the techniques already used by the CHIMPAC laboratory (Florence University) in the framework of international Antarctic drilling projects (EPICA - European Project for Ice Coring in Antarctica, TALDICE - TALos Dome ICE core, ANDRILL MIS - ANTarctic DRILLing McMurdo Ice Shelf) were optimised, and new techniques were developed to ensure safe sample handling. The CHIMPAC laboratory has been involved for several years in the study of the Antarctic continent, primarily focused on understanding the bio-geo-chemical cycles of chemical markers and the interpretation of their records in sedimentary archives (ice cores, sediment cores). This activity takes advantage of facilities for storage, decontamination and pre-analysis treatment of ice and sediment strips (cold room equipped with laminar flow hoods and decontamination devices at different automation levels, class 10000 clean room, systems for the complete acid digestion of sediment samples, production of ultra-pure acids and granulometric selection of sediments) and for analytical determination of a wide range of chemical tracers. In particular, the operative instrumental set includes several ion chromatographs for measurement of inorganic and selected organic ions (by classical Ion Chromatography and Fast Ion Chromatography), Atomic Absorption and Emission Spectrometers (F-AAS, GF-AAS, ICP-AES) and Inductively Coupled Plasma - Sector Field Mass Spectrometry (ICP-SFMS) for the analysis of the soluble or “available” inorganic fraction, together with Ion Beam Analysis techniques for elemental composition (PIXE-PIGE, in collaboration with INFN and the Physics Institute of Florence University) and geochemical analysis (SEM-EDS).
Prediction of Gas Injection Performance for Heterogeneous Reservoirs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blunt, Martin J.; Orr, Franklin M.
This report describes research carried out in the Department of Petroleum Engineering at Stanford University from September 1997 - September 1998 under the second year of a three-year grant from the Department of Energy on the "Prediction of Gas Injection Performance for Heterogeneous Reservoirs." The research effort is an integrated study of the factors affecting gas injection, from the pore scale to the field scale, and involves theoretical analysis, laboratory experiments, and numerical simulation. The original proposal described research in four areas: (1) Pore scale modeling of three phase flow in porous media; (2) Laboratory experiments and analysis of factors influencing gas injection performance at the core scale, with an emphasis on the fundamentals of three phase flow; (3) Benchmark simulations of gas injection at the field scale; and (4) Development of a streamline-based reservoir simulator. Each stage of the research is planned to provide input and insight into the next stage, such that at the end we should have an integrated understanding of the key factors affecting field scale displacements.
The French initiative for scientific cores virtual curating : a user-oriented integrated approach
NASA Astrophysics Data System (ADS)
Pignol, Cécile; Godinho, Elodie; Galabertier, Bruno; Caillo, Arnaud; Bernardet, Karim; Augustin, Laurent; Crouzet, Christian; Billy, Isabelle; Teste, Gregory; Moreno, Eva; Tosello, Vanessa; Crosta, Xavier; Chappellaz, Jérome; Calzas, Michel; Rousseau, Denis-Didier; Arnaud, Fabien
2016-04-01
Managing scientific data is probably one of the most challenging issues in modern science. The question is made even more sensitive by the need to preserve and manage high-value, fragile geological samples: cores. Large international scientific programs, such as IODP or ICDP, are leading an intense effort to solve this problem and propose detailed, high-standard work- and dataflows throughout core handling and curating. However, most results derive from rather small-scale research programs in which data and sample management is generally handled only locally, when it is handled at all. The national excellence equipment program (Equipex) CLIMCOR aims at developing French facilities for coring and drilling investigations. It concerns ice, marine and continental samples alike. As part of this initiative, we initiated a reflection about core curating and associated coring-data management. The aim of the project is to conserve all metadata from fieldwork in an integrated cyber-environment which will evolve toward laboratory-acquired data storage in the near future. To that end, our approach was developed through a close relationship with field operators as well as laboratory core curators, in order to propose user-oriented solutions. The national core curating initiative currently proposes a single web portal in which all scientific teams can store their field data. For legacy samples, this requires the establishment of dedicated core lists with associated metadata. For forthcoming samples, we propose a mobile application, running under Android, to capture technical and scientific metadata in the field. This application is linked with a unique coring-tools library and is adapted to most coring devices (gravity, drilling, percussion, etc.), including multi-section and multi-hole coring operations. Those field data can be uploaded automatically to the national portal, but also referenced through international standards or persistent identifiers (IGSN, ORCID and INSPIRE) and displayed in international portals (currently, NOAA's IMLGS). In this paper, we present the architecture of the integrated system, future perspectives and the approach we adopted to reach our goals. Alongside our poster, we will also present one of the three mobile applications, dedicated in particular to continental drilling operations.
NASA Astrophysics Data System (ADS)
Pignol, C.; Arnaud, F.; Godinho, E.; Galabertier, B.; Caillo, A.; Billy, I.; Augustin, L.; Calzas, M.; Rousseau, D. D.; Crosta, X.
2016-12-01
Managing scientific data is probably one of the most challenging issues in modern science. In paleosciences, the question is made even more sensitive by the need to preserve and manage high-value, fragile geological samples: cores. Large international scientific programs, such as IODP or ICDP, have led an intense effort to solve this problem and proposed detailed, high-standard work- and dataflows throughout core handling and curating. However, many paleoscience results derive from small-scale research programs in which data and sample management is too often handled only locally, when it is handled at all. In this paper we present a national effort led in France to develop an integrated system to curate ice and sediment cores. Under the umbrella of the national excellence equipment program CLIMCOR, we launched a reflection about core curating and the management of associated fieldwork data. Our aim was to conserve all data from fieldwork in an integrated cyber-environment which will evolve toward laboratory-acquired data storage in the near future. To do so, our approach was developed in close collaboration with field operators as well as laboratory core curators, in order to propose user-oriented solutions. The national core curating initiative proposes a single web portal in which all teams can store their fieldwork data. This portal is used as a national hub to attribute IGSNs. For legacy samples, this requires the establishment of a dedicated core list with associated metadata. For forthcoming core data, however, we developed a mobile application to capture technical and scientific data directly in the field. This application is linked with a unique coring-tools library and is adapted to most coring devices (gravity, drilling, percussion, etc.), including multi-section and multi-hole coring operations. Those field data can be uploaded automatically to the national portal, but also referenced through international standards (IGSN and INSPIRE) and displayed in international portals (currently, NOAA's IMLGS). In this paper, we present the architecture of the integrated system, future perspectives and the approach we adopted to reach our goals. We will also present our mobile application through didactic examples.
Status Report on NEAMS PROTEUS/ORIGEN Integration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wieselquist, William A
2016-02-18
The US Department of Energy’s Nuclear Energy Advanced Modeling and Simulation (NEAMS) Program has contributed significantly to the development of the PROTEUS neutron transport code at Argonne National Laboratory and to the Oak Ridge Isotope Generation and Depletion (ORIGEN) depletion/decay code at Oak Ridge National Laboratory. PROTEUS’s key capability is the efficient and scalable (up to hundreds of thousands of cores) neutron transport solver on general, unstructured, three-dimensional finite-element-type meshes. The scalability and mesh generality enable the transfer of neutron and power distributions to other codes in the NEAMS toolkit for advanced multiphysics analysis. Recently, ORIGEN has received considerable modernization to provide the high-performance depletion/decay capability within the NEAMS toolkit. This work presents a description of the initial integration of ORIGEN in PROTEUS, mainly performed during FY 2015, with minor updates in FY 2016.
The Mouse Genomes Project: a repository of inbred laboratory mouse strain genomes.
Adams, David J; Doran, Anthony G; Lilue, Jingtao; Keane, Thomas M
2015-10-01
The Mouse Genomes Project was initiated in 2009 with the goal of using next-generation sequencing technologies to catalogue molecular variation in the common laboratory mouse strains, and a selected set of wild-derived inbred strains. The initial sequencing and survey of sequence variation in 17 inbred strains was completed in 2011 and included a comprehensive catalogue of single nucleotide polymorphisms, short insertions/deletions, larger structural variants including their fine-scale architecture, the landscape of transposable element variation, and genomic sites subject to post-transcriptional alteration of RNA. From this beginning, the resource has expanded significantly to include 36 fully sequenced inbred laboratory mouse strains, a refined and updated data processing pipeline, and new variation querying and data visualisation tools which are available on the project's website ( http://www.sanger.ac.uk/resources/mouse/genomes/ ). The focus of the project is now the completion of de novo assembled chromosome sequences and strain-specific gene structures for the core strains. We discuss how the assembled chromosomes will power comparative analysis, data access tools and future directions of mouse genetics.
Permeability analysis of Asbuton material used as core layers of water resistance in the body of dam
NASA Astrophysics Data System (ADS)
Rahim, H.; Tjaronge, M. W.; Thaha, A.; Djamaluddin, R.
2017-11-01
In order to increase the use of local materials and national products, the large Asbuton reserves of about 662.960 million tons in the Buton Islands have become an alternative source for the waterproof core layer in the body of a dam. The Asbuton material used in this research is Lawele Granular Asphalt (LGA). This experimental laboratory study comprised density (unit weight) and permeability testing of the Asbuton material. Permeability of the Asbuton material was measured using the falling-head method. The data analyzed are the relation between compaction energy and density, and the relation between density and permeability of the Asbuton material. The results show that increasing the number of blows applied to each layer of the Asbuton material increases its density. The density of Asbuton material that satisfies the requirements for use as an impermeable core layer in the dam body is 1.53 g/cm3. Increasing the density (unit weight) of the Asbuton material reduces its permeability.
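For reference, the falling-head permeability relation used in such tests has the standard form below, where a is the standpipe cross-sectional area, A the specimen cross-sectional area, L the specimen length, and h1, h2 the heads at times t1 and t2; this is the textbook expression, not one quoted from the paper.

```latex
k = \frac{a\,L}{A\,(t_{2}-t_{1})}\,\ln\!\left(\frac{h_{1}}{h_{2}}\right)
```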
TAMEE: data management and analysis for tissue microarrays.
Thallinger, Gerhard G; Baumgartner, Kerstin; Pirklbauer, Martin; Uray, Martina; Pauritsch, Elke; Mehes, Gabor; Buck, Charles R; Zatloukal, Kurt; Trajanoski, Zlatko
2007-03-07
With the introduction of tissue microarrays (TMAs) researchers can investigate gene and protein expression in tissues on a high-throughput scale. TMAs generate a wealth of data calling for extended, high level data management. Enhanced data analysis and systematic data management are required for traceability and reproducibility of experiments and provision of results in a timely and reliable fashion. Robust and scalable applications have to be utilized, which allow secure data access, manipulation and evaluation for researchers from different laboratories. TAMEE (Tissue Array Management and Evaluation Environment) is a web-based database application for the management and analysis of data resulting from the production and application of TMAs. It facilitates storage of production and experimental parameters, of images generated throughout the TMA workflow, and of results from core evaluation. Database content consistency is achieved using structured classifications of parameters. This allows the extraction of high quality results for subsequent biologically-relevant data analyses. Tissue cores in the images of stained tissue sections are automatically located and extracted and can be evaluated using a set of predefined analysis algorithms. Additional evaluation algorithms can be easily integrated into the application via a plug-in interface. Downstream analysis of results is facilitated via a flexible query generator. We have developed an integrated system tailored to the specific needs of research projects using high density TMAs. It covers the complete workflow of TMA production, experimental use and subsequent analysis. The system is freely available for academic and non-profit institutions from http://genome.tugraz.at/Software/TAMEE.
Permeability-porosity data sets for sandstones
Nelson, P.H.
2004-01-01
Due to the variable nature of permeability-porosity relations, core should be obtained and permeability (k) and porosity (φ) should be determined on core plugs in the laboratory for the formation of interest. A catalog of k versus φ data sets is now available on the Web. Examples from the catalog are considered to illustrate some aspects of k versus φ dependencies in siliciclastic reservoirs.
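A common way to summarize one of the catalog's k versus φ data sets is a semilog trend, log10(k) = m·φ + b. The sketch below is illustrative only; the porosity-permeability pairs are invented, not taken from the catalog.

```python
import numpy as np

# Hypothetical (phi, k) pairs for one sandstone data set; not values from the catalog.
phi = np.array([0.08, 0.12, 0.16, 0.20, 0.24])   # porosity, fraction
k   = np.array([0.5, 4.0, 30.0, 180.0, 900.0])    # permeability, mD

# Fit the semilog trend log10(k) = m*phi + b.
m, b = np.polyfit(phi, np.log10(k), 1)
print(f"log10(k [mD]) ≈ {m:.1f}*phi + {b:.1f}")
```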
Publications - GMC 364 | Alaska Division of Geological & Geophysical Surveys
Properties study on core samples from 8 wells in Alaska: Drew Pt #1, East Simpson Test Well #1, East Simpson #2, Ikpikpuk #1, J.W. Dalton #1, Seabee #1, Topogoruk Test #1, and W. Dease #1. Authors: Talisman Energy Inc. and Core Laboratories.
Shakofsky, S.M.
1995-01-01
In order to assess the effect of filled waste disposal trenches on transport-governing soil properties, comparisons were made between profiles of undisturbed soil and disturbed soil in a simulated waste trench. The changes in soil properties induced by the construction of a simulated waste trench were measured near the Radioactive Waste Management Complex at the Idaho National Engineering Laboratory (INEL) in the semi-arid southeast region of Idaho. The soil samples were collected, using a hydraulically driven sampler to minimize sample disruption, from both a simulated waste trench and an undisturbed area nearby. Results show that the undisturbed profile has distinct layers whose properties differ significantly, whereas the soil profile in the simulated waste trench is, by comparison, homogeneous. Porosity was increased in the disturbed cores, and, correspondingly, saturated hydraulic conductivities were on average three times higher. With higher soil-moisture contents (greater than 0.32), unsaturated hydraulic conductivities for the undisturbed cores were typically greater than those for the disturbed cores. With lower moisture contents, most of the disturbed cores had greater hydraulic conductivities. The observed differences in hydraulic conductivities are interpreted and discussed as changes in the soil pore geometry.
Deterministic Modeling of the High Temperature Test Reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ortensi, J.; Cogliati, J. J.; Pope, M. A.
2010-06-01
Idaho National Laboratory (INL) is tasked with the development of reactor physics analysis capability for the Next Generation Nuclear Plant (NGNP) project. In order to examine INL’s current prismatic reactor deterministic analysis tools, the project is conducting a benchmark exercise based on modeling the High Temperature Test Reactor (HTTR). This exercise entails the development of a model for the initial criticality, a 19-column thin annular core, and the fully loaded core critical condition with 30 columns. Special emphasis is devoted to the annular core modeling, which shares more characteristics with the NGNP base design. The DRAGON code is used in this study because it offers significant ease and versatility in modeling prismatic designs. Despite some geometric limitations, the code performs quite well compared to other lattice physics codes. DRAGON can generate transport solutions via collision probability (CP), method of characteristics (MOC), and discrete ordinates (Sn). A fine-group cross section library based on the SHEM 281 energy structure is used in the DRAGON calculations. HEXPEDITE is the hexagonal-z full-core solver used in this study and is based on the Green’s function solution of the transverse integrated equations. In addition, two Monte Carlo (MC) based codes, MCNP5 and PSG2/SERPENT, provide benchmarking capability for the DRAGON and the nodal diffusion solver codes. The results from this study show a consistent bias of 2–3% for the core multiplication factor. This systematic error has also been observed in other HTTR benchmark efforts and is well documented in the literature. The ENDF/B-VII graphite and U235 cross sections appear to be the main source of the error. The isothermal temperature coefficients calculated with the fully loaded core configuration agree well with other benchmark participants but are 40% higher than the experimental values. This discrepancy with the measurement stems from the fact that during the experiments the control rods were adjusted to maintain criticality, whereas in the model the rod positions were fixed. In addition, this work includes a brief study of a cross section generation approach that seeks to decouple the domain in order to account for neighbor effects. This spectral interpenetration is a dominant effect in annular HTR physics. This analysis methodology should be further explored in order to reduce the error that is systematically propagated in the traditional generation of cross sections.
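For context, a multiplication-factor bias like the 2–3% reported above is often re-expressed as a reactivity difference in pcm. The snippet below is a minimal illustration with made-up k values, not results from the benchmark.

```python
# Minimal sketch: expressing a multiplication-factor bias as a reactivity difference.
# k values are illustrative, chosen to reproduce a ~2-3% bias like the one reported.
k_experiment = 1.000
k_calculated = 1.025

percent_bias = 100.0 * (k_calculated - k_experiment) / k_experiment
reactivity_pcm = 1e5 * (k_calculated - k_experiment) / (k_calculated * k_experiment)
print(f"{percent_bias:.1f}% bias ≈ {reactivity_pcm:.0f} pcm")
```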
Laboratory and field evaluation of hot mix asphalt with high contents of reclaimed asphalt pavement
NASA Astrophysics Data System (ADS)
Van Winkle, Clinton Isaac
Currently in Iowa, the amount of RAP materials allowed for the surface layer is limited to 15% by weight. The objective of this project was to develop quality standards for inclusion of RAP content higher than 15% in asphalt mixtures. To meet Superpave mix design requirements, it was necessary to fractionate the RAP materials. Based on the extensive sieve-by-sieve analysis of RAP materials, the optimum sieve size to fractionate RAP materials was identified. To determine whether RAP contents higher than 15% can be used on Iowa's state highways, three test sections with 30.0%, 35.5% and 39.2% of RAP materials were constructed on Highway 6 in Iowa City. The construction of the field test sections was monitored and cores were obtained to measure field densities of the test sections. Field mixtures collected from the test sections were compacted in the laboratory in order to test moisture sensitivity using a Hamburg Wheel Tracking Device. The binder was extracted from the field mixtures with varying amounts of RAP materials and tested to determine the effects of RAP materials on the PG grade of a virgin binder. Field cores were taken from the various mix designs to determine the percent density of each test section. A condition survey of the test sections was then performed to evaluate the short-term performance.
Modular workcells: modern methods for laboratory automation.
Felder, R A
1998-12-01
Laboratory automation is beginning to become an indispensable survival tool for laboratories facing difficult market competition. However, estimates suggest that only 8% of laboratories will be able to afford total laboratory automation systems. Therefore, automation vendors have developed alternative hardware configurations called 'modular automation' to fit the smaller laboratory. Modular automation consists of consolidated analyzers, integrated analyzers, modular workcells, and pre- and post-analytical automation. These terms will be defined in this paper. Using a modular automation model, the automated core laboratory will become a site where laboratory data is evaluated by trained professionals to provide diagnostic information to practising physicians. Modern software information management and process control tools will complement modular hardware. Proper standardization that will allow vendor-independent modular configurations will assure success of this revolutionary new technology.
Tao, Yuqiang; Xue, Bin; Yao, Shuchun; Deng, Jiancai; Gui, Zhifan
2012-04-03
Although numerous studies have addressed sequestration of hydrophobic organic compounds (HOCs) in the laboratory, little attention has been paid to methods for evaluating it in the field at large temporal scales. A biomimetic tool, the triolein-embedded cellulose acetate membrane (TECAM), was therefore tested to evaluate sequestration of six PAHs of varying hydrophobicity in a well-dated sediment core sampled from Nanyi Lake, China. Properties of sediment organic matter (OM) varying with aging time dominated the sequestration of PAHs in the sediment core. TECAM-sediment accumulation factors (MSAFs) of the PAHs declined with aging time and correlated significantly with the corresponding biota-sediment accumulation factors (BSAFs) for the gastropod (Bellamya aeruginosa) simultaneously incubated in the same sediment slices. Sequestration rates of the PAHs in the sediment core evaluated by TECAM were much lower than those obtained from laboratory studies. The relationship between relative availability for TECAM (MSAF(t)/MSAF(0)) and aging time followed a first-order exponential decay model. MSAF(t)/MSAF(0) was well related to the minor changes in the properties of OM with aging time. Compared with chemical extraction, the sequestration reflected by TECAM was much closer to that observed for B. aeruginosa. In contrast to B. aeruginosa, TECAM avoids metabolism and the influences of feeding and other organism behaviors, and it is much easier to deploy and prepare in the laboratory. Hence TECAM provides an effective and convenient way to study sequestration of PAHs, and probably other HOCs, in the field at large temporal scales.
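The first-order exponential decay model referred to above can be fitted to relative availability data in a few lines. The sketch below assumes a simple form a·exp(-k·t)+c and uses invented MSAF(t)/MSAF(0) values, not the Nanyi Lake data.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical relative availabilities MSAF(t)/MSAF(0) at several aging times (years);
# illustrative numbers only, not the Nanyi Lake data.
t = np.array([0, 5, 10, 20, 40, 60], dtype=float)
y = np.array([1.00, 0.78, 0.63, 0.45, 0.31, 0.27])

decay = lambda t, a, k, c: a * np.exp(-k * t) + c   # first-order decay with a residual plateau
(a, k, c), _ = curve_fit(decay, t, y, p0=(0.8, 0.05, 0.2))
print(f"rate constant k ≈ {k:.3f} per year, plateau ≈ {c:.2f}")
```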
NASA Astrophysics Data System (ADS)
Bernstein, A.; Allen, M.; Bowden, N.; Brennan, J.; Carr, D. J.; Estrada, J.; Hagmann, C.; Lund, J. C.; Madden, N. W.; Winant, C. D.
2005-09-01
Our Lawrence Livermore National Laboratory/Sandia National Laboratories collaboration has deployed a cubic-meter-scale antineutrino detector to demonstrate non-intrusive and automatic monitoring of the power levels and plutonium content of a nuclear reactor. Reactor monitoring of this kind is required for all non-nuclear weapons states under the Nuclear Nonproliferation Treaty (NPT), and is implemented by the International Atomic Energy Agency (IAEA). Since the antineutrino count rate and energy spectrum depend on the relative yields of fissioning isotopes in the reactor core, changes in isotopic composition can be observed without ever directly accessing the core. Data from a cubic meter scale antineutrino detector, coupled with the well-understood principles that govern the core's evolution in time, can be used to determine whether the reactor is being operated in an illegitimate way. Our group has deployed a detector at the San Onofre reactor site in California to demonstrate this concept. This paper describes the concept and shows preliminary results from 8 months of operation.
Pearce, Madison E; Alikhan, Nabil-Fareed; Dallman, Timothy J; Zhou, Zhemin; Grant, Kathie; Maiden, Martin C J
2018-06-02
Multi-country outbreaks of foodborne bacterial disease present challenges in their detection, tracking, and notification. As food is increasingly distributed across borders, such outbreaks are becoming more common. This increases the need for high-resolution, accessible, and replicable isolate typing schemes. Here we evaluate a core genome multilocus sequence typing (cgMLST) scheme for the high-resolution reproducible typing of Salmonella enterica (S. enterica) isolates, by applying it to a large European outbreak of S. enterica serovar Enteritidis. This outbreak had been extensively characterised using single nucleotide polymorphism (SNP)-based approaches. The cgMLST analysis was congruent with the original SNP-based analysis, the epidemiological data, and whole genome MLST (wgMLST) analysis. Combination of the cgMLST and epidemiological data confirmed that the genetic diversity among the isolates predated the outbreak, and was likely present at the infection source. There was consequently no link between country of isolation and genetic diversity, but the cgMLST clusters were congruent with date of isolation. Furthermore, comparison with publicly available Enteritidis isolate data demonstrated that the cgMLST scheme presented is highly scalable, enabling outbreaks to be contextualised within the Salmonella genus. The cgMLST scheme is therefore shown to be a standardised and scalable typing method, which allows Salmonella outbreaks to be analysed and compared across laboratories and jurisdictions. Copyright © 2018. Published by Elsevier B.V.
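The core idea of cgMLST is that each isolate is reduced to a profile of allele numbers over a fixed set of core-genome loci, and isolates are compared by counting differing alleles. The sketch below illustrates that generic principle only; the profiles are invented and the missing-data handling is an assumption, not the scheme's actual implementation.

```python
# Minimal sketch of cgMLST comparison: isolates are allele-number profiles over a fixed
# set of core loci; distance = number of loci with differing (non-missing) alleles.
# Profiles below are made up; 0 marks a missing/untyped locus.
def cgmlst_distance(profile_a, profile_b):
    return sum(1 for a, b in zip(profile_a, profile_b)
               if a != 0 and b != 0 and a != b)

isolate_1 = [12, 5, 33, 7, 0, 21]
isolate_2 = [12, 5, 34, 7, 9, 21]
print(cgmlst_distance(isolate_1, isolate_2))   # -> 1 allele difference
```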
The changing role of the medical technologist from technologist to information specialist.
Miller, W G
2000-01-01
Pathology laboratory services are dependent on the laboratory information system (LIS) to organize the work, manage the operation, and communicate the results for effective laboratory medicine. For maximum efficiency, staffing for the LIS should be an integral component of laboratory operations and is facilitated by a two-tier structure. A core LIS staff provides system support and continuous services. A group of bench medical technologists have multitasking responsibilities, including LIS support for a specific laboratory work area. The two components form a team that uses staff efficiently to provide ongoing operational services and flexibility for problem solving and new functionality implementation.
NASA Astrophysics Data System (ADS)
Rhodes, Rachael H.; Faïn, Xavier; Stowasser, Christopher; Blunier, Thomas; Chappellaz, Jérôme; McConnell, Joseph R.; Romanini, Daniele; Mitchell, Logan E.; Brook, Edward J.
2013-04-01
Ancient air trapped inside bubbles in ice cores can now be analysed for methane concentration utilising a laser spectrometer coupled to a continuous melter system. We present a new ultra-high resolution record of atmospheric methane variability over the last 1800 yr obtained from continuous analysis of a shallow ice core from the North Greenland Eemian project (NEEM-2011-S1) during a 4-week laboratory-based measurement campaign. Our record faithfully replicates the form and amplitudes of multi-decadal oscillations previously observed in other ice cores and demonstrates the detailed depth resolution (5.3 cm), rapid acquisition time (30 m day⁻¹) and good long-term reproducibility (2.6%, 2σ) of the continuous measurement technique. In addition, we report the detection of high frequency ice core methane signals of non-atmospheric origin. Firstly, measurements of air from the firn-ice transition region and an interval of ice core dating from 1546-1560 AD (gas age) resolve apparently quasi-annual scale methane oscillations. Traditional gas chromatography measurements on discrete ice samples confirm these signals and indicate peak-to-peak amplitudes of ca. 22 parts per billion (ppb). We hypothesise that these oscillations result from staggered bubble close-off between seasonal layers of contrasting density during time periods of sustained multi-year atmospheric methane change. Secondly, we report the detection of abrupt (20-100 cm depth interval), high amplitude (35-80 ppb excess) methane spikes in the NEEM ice that are reproduced by discrete measurements. We show for the first time that methane spikes present in thin and infrequent layers in polar, glacial ice are accompanied by elevated concentrations of carbon- and nitrogen-based chemical impurities, and suggest that biological in-situ production may be responsible.
Efficiency of static core turn-off in a system-on-a-chip with variation
Cher, Chen-Yong; Coteus, Paul W; Gara, Alan; Kursun, Eren; Paulsen, David P; Schuelke, Brian A; Sheets, II, John E; Tian, Shurong
2013-10-29
A processor-implemented method for improving efficiency of a static core turn-off in a multi-core processor with variation, the method comprising: conducting via a simulation a turn-off analysis of the multi-core processor at the multi-core processor's design stage, wherein the turn-off analysis of the multi-core processor at the multi-core processor's design stage includes a first output corresponding to a first multi-core processor core to turn off; conducting a turn-off analysis of the multi-core processor at the multi-core processor's testing stage, wherein the turn-off analysis of the multi-core processor at the multi-core processor's testing stage includes a second output corresponding to a second multi-core processor core to turn off; comparing the first output and the second output to determine if the first output is referring to the same core to turn off as the second output; outputting a third output corresponding to the first multi-core processor core if the first output and the second output are both referring to the same core to turn off.
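The claimed comparison step can be paraphrased in code: the design-stage and test-stage analyses each nominate a core to turn off, and a third output confirms the choice only when they agree. The function below is a hypothetical paraphrase of that logic, not an implementation of the patented method.

```python
# Minimal sketch of the claimed comparison step: the design-stage and test-stage
# analyses each nominate a core to turn off; the third output confirms when they agree.
# Function name and inputs are hypothetical.
def select_core_to_turn_off(design_stage_core: int, test_stage_core: int):
    if design_stage_core == test_stage_core:
        return design_stage_core          # both analyses refer to the same core
    return None                           # disagreement: no confirmed core in this sketch

print(select_core_to_turn_off(3, 3))   # -> 3
print(select_core_to_turn_off(3, 5))   # -> None
```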
ERIC Educational Resources Information Center
Gupta, Tanya; Burke, K. A.; Mehta, Akash; Greenbowe, Thomas J.
2015-01-01
The Science Writing Heuristic (SWH) laboratory instruction approach has been used successfully over a decade to engage students in laboratory activities. SWH-based instruction emphasizes knowledge construction through individual writing and reflection, and collaborative learning as a group. In the SWH approach, writing is a core component of…
Low-Cost Computer-Controlled Current Stimulator for the Student Laboratory
ERIC Educational Resources Information Center
Guclu, Burak
2007-01-01
Electrical stimulation of nerve and muscle tissues is frequently used for teaching core concepts in physiology. It is usually expensive to provide every student group in the laboratory with an individual stimulator. This article presents the design and application of a low-cost [about $100 (U.S.)] isolated stimulator that can be controlled by two…
Beginning Plant Biotechnology Laboratories Using Fast Plants.
ERIC Educational Resources Information Center
Williams, Mike
This set of 16 laboratory activities is designed to illustrate the life cycle of Brassicae plants from seeds in pots to pods in 40 days. At certain points along the production cycle of the central core of labs, there are related lateral labs to provide additional learning opportunities employing this family of plants, referred to as "fast…
ERIC Educational Resources Information Center
Hiebert, Sara M; Noveral, Jocelyne
2007-01-01
This investigative laboratory exercise uses the different relations between ambient temperature and metabolic rate in endotherms and ectotherms as a core concept to answer the following question: What thermoregulatory mode is employed by chicken embryos? Emphasis is placed on the physiological concepts that can be taught with this exercise,…
The State Public Health Laboratory System.
Inhorn, Stanley L; Astles, J Rex; Gradus, Stephen; Malmberg, Veronica; Snippes, Paula M; Wilcke, Burton W; White, Vanessa A
2010-01-01
This article describes the development since 2000 of the State Public Health Laboratory System in the United States. These state systems collectively are related to several other recent public health laboratory (PHL) initiatives. The first is the Core Functions and Capabilities of State Public Health Laboratories, a white paper that defined the basic responsibilities of the state PHL. Another is the Centers for Disease Control and Prevention National Laboratory System (NLS) initiative, the goal of which is to promote public-private collaboration to assure quality laboratory services and public health surveillance. To enhance the realization of the NLS, the Association of Public Health Laboratories (APHL) launched in 2004 a State Public Health Laboratory System Improvement Program. In the same year, APHL developed a Comprehensive Laboratory Services Survey, a tool to measure improvement through the decade to assure that essential PHL services are provided.
Software forecasting as it is really done: A study of JPL software engineers
NASA Technical Reports Server (NTRS)
Griesel, Martha Ann; Hihn, Jairus M.; Bruno, Kristin J.; Fouser, Thomas J.; Tausworthe, Robert C.
1993-01-01
This paper presents a summary of the results to date of a Jet Propulsion Laboratory internally funded research task to study the costing process and parameters used by internally recognized software cost estimating experts. Protocol analysis and Markov process modeling were used to capture software engineers' forecasting mental models. While there is significant variation between the mental models that were studied, it was nevertheless possible to identify a core set of cost forecasting activities, and it was also found that the mental models cluster around three forecasting techniques. Further partitioning of the mental models revealed a clustering of activities that is very suggestive of a forecasting lifecycle. The different forecasting methods identified were based on the use of multiple decomposition steps or multiple forecasting steps. The multiple forecasting steps involved either forecasting software size or an additional effort forecast. Virtually no subject used risk reduction steps in combination. The results of the analysis include: the identification of a core set of well-defined costing activities, a proposed software forecasting life cycle, and the identification of several basic software forecasting mental models. The paper concludes with a discussion of the implications of the results for current individual and institutional practices.
Noninvasive hemoglobin monitoring in critically ill pediatric patients at risk of bleeding.
García-Soler, P; Camacho Alonso, J M; González-Gómez, J M; Milano-Manso, G
2017-05-01
To determine the accuracy and usefulness of noninvasive continuous hemoglobin (Hb) monitoring in critically ill patients at risk of bleeding. An observational prospective study was made, comparing core laboratory Hb measurement (LabHb) as the gold standard versus transcutaneous hemoglobin monitoring (SpHb). Pediatric Intensive Care Unit of a tertiary University Hospital. Patients weighing >3 kg at risk of bleeding. SpHb was measured using the Radical7 pulse co-oximeter (Masimo Corp., Irvine, CA, USA) each time a blood sample was drawn for core laboratory analysis (Siemens ADVIA 2120i). Sociodemographic characteristics, perfusion index (PI), pleth variability index, heart rate, SaO₂, rectal temperature, low signal quality and other events that can interfere with measurement. A total of 284 measurements were made (80 patients). Mean LabHb was 11.7±2.05 g/dl. Mean SpHb was 12.32±2 g/dl (Pearson r = 0.72, R² = 0.52). The intra-class correlation coefficient was 0.69 (95% CI 0.55-0.78) (p<0.001). Bland-Altman analysis showed a mean difference of 0.07±1.46 g/dl. A lower PI and higher temperature independently increased the risk of low signal quality (OR 0.531 [95% CI 0.32-0.88] and 0.529 [95% CI 0.33-0.85], respectively). SpHb shows a good overall correlation to LabHb, though with wide limits of agreement. Its main advantage is continuous monitoring of patients at risk of bleeding. The reliability of the method is limited in cases with poor peripheral perfusion. Copyright © 2016 Elsevier España, S.L.U. y SEMICYUC. All rights reserved.
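The agreement statistics reported above (bias and limits of agreement from a Bland-Altman analysis, plus the Pearson correlation) can be computed as in the following sketch; the paired Hb values are invented, not the study data.

```python
import numpy as np

# Hypothetical paired measurements (g/dl); not the study data.
lab_hb = np.array([11.2, 9.8, 13.4, 10.5, 12.1])
sp_hb  = np.array([11.8, 10.1, 13.0, 11.4, 12.6])

diff = sp_hb - lab_hb
bias = diff.mean()
loa  = 1.96 * diff.std(ddof=1)          # limits of agreement: bias +/- 1.96 SD
r = np.corrcoef(lab_hb, sp_hb)[0, 1]    # Pearson correlation
print(f"bias={bias:.2f} g/dl, limits of agreement=({bias-loa:.2f}, {bias+loa:.2f}), r={r:.2f}")
```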
López Ruiz, J A; Zabalza Estévez, I; Mieza Arana, J A
2016-01-01
To evaluate the possibility of determining the genetic profile of primary malignant tumors of the breast from specimens obtained by ultrasound-guided percutaneous biopsies during the diagnostic imaging workup. This is a retrospective study in 13 consecutive patients diagnosed with invasive breast cancer by B-mode ultrasound-guided 12 G core needle biopsy. After clinical indication, the pathologist decided whether the paraffin block specimens seemed suitable (on the basis of tumor size, validity of the sample, and percentage of tumor cells) before sending them for genetic analysis with the MammaPrint® platform. The size of the tumors on ultrasound ranged from 0.6 cm to 5 cm. In 11 patients the preserved specimen was considered valid and suitable for use in determining the genetic profile. In 1 patient (with a 1 cm tumor) the pathologist decided that it was necessary to repeat the core biopsy to obtain additional samples. In 1 patient (with a 5 cm tumor) the specimen was not considered valid by the genetic laboratory. The percentage of tumor cells in the samples ranged from 60% to 70%. In 11/13 cases (84.62%) it was possible to do the genetic analysis on the previously diagnosed samples. In most cases, regardless of tumor size, it is possible to obtain the genetic profile from tissue specimens obtained with ultrasound-guided 12 G core biopsy preserved in paraffin blocks. Copyright © 2015 SERAM. Published by Elsevier España, S.L.U. All rights reserved.
NASA Astrophysics Data System (ADS)
Carbonneau, A.; Allard, M.; L'Hérault, E.; LeBlanc, A.
2011-12-01
A study of permafrost conditions was undertaken in the Hamlet of Pangnirtung, Nunavut, by the Geological Survey of Canada (GSC) and Université Laval's Centre d'études nordiques (CEN) to support decision makers in their community planning work. The methods used for this project were based on geophysical and geomorphological approaches, including permafrost cores drilled in surficial deposits and ground-penetrating radar surveys with a GPR Pulse EKKO 100 covering the complete community area and its projected expansion sector. Laboratory analysis allowed a detailed characterization of permafrost in terms of water content, salinity and grain size. Cryostratigraphic analysis was done via CT-scan imagery of frozen cores using medical imaging software such as Osiris. This nondestructive method allows 3D imaging of the entire core in order to locate excess ice, determine the volumetric ice content and interpret the ice-formation processes that took place during freezing of the permafrost. Our new map of the permafrost conditions in Pangnirtung illustrates that the dominant mapping unit consists of ice-rich colluvial deposits. Aggradational ice formed syngenetically with slope sedimentation. Buried soils found embedded in this colluvial layer demonstrate that colluviation associated with overland flow during snowmelt has occurred almost continuously since 7080 cal. BP. In the eastern sector of town, the 1- to 4-meter-thick colluvium covers till and a network of ice wedges that were revealed as spaced hyperbolic reflectors on GPR profiles. The colluvium also covers ice-rich marine silt and bedrock in the western sector of the hamlet; marine shells found in a permafrost core yielded a radiocarbon date of 9553 cal. BP, which provides a revised age for the local deglaciation and also a revised marine submergence limit. Among the applied methods, shallow drilling in coarse-grained permafrost, core recovery and CT scanning allowed the discovery of the importance of Holocene slope processes in shaping the terrain surface and producing the observed cryostructures and ice contents in the near-surface permafrost.
McDonald, Peter R; Roy, Anuradha; Chaguturu, Rathnam
2011-01-01
The University of Kansas High-Throughput Screening (KU HTS) core is a state-of-the-art drug-discovery facility with an entrepreneurial open-service policy, which provides centralized resources supporting public- and private-sector research initiatives. The KU HTS core was established in 2002 at the University of Kansas with support from an NIH grant and the state of Kansas. It collaborates with investigators from national and international academic, nonprofit and pharmaceutical organizations in executing HTS-ready assay development and screening of chemical libraries for target validation, probe selection, hit identification and lead optimization. This is part two of a contribution from the KU HTS laboratory. PMID:21806374
Bloss, Benjamin R.; Bedrosian, Paul A.; Buesch, David C.
2015-01-01
Correlating laboratory resistivity measurements with geophysical resistivity models helps constrain these models to the geology and lithology of an area. Throughout the Fort Irwin National Training Center area, 111 samples from both cored boreholes and surface outcrops were collected and processed for laboratory measurements. These samples represent various lithologic types that include plutonic and metamorphic (basement) rocks, lava flows, consolidated sedimentary rocks, and unconsolidated sedimentary deposits that formed in a series of intermountain basins. Basement rocks, lava flows, and some lithified tuffs are generally resistive (≥100 ohm-meters [Ω·m]) when saturated. Saturated unconsolidated samples are moderately conductive to conductive, with resistivities generally less than 100 Ω·m, and many of these samples are less than 50 Ω·m. The unconsolidated samples can further be separated into two broad groups: (1) younger sediments that are moderately conductive, owing to their limited clay content, and (2) older, more conductive sediments with a higher clay content that reflects substantial amounts of originally glassy volcanic ash subsequently altered to clay. The older sediments are believed to be Tertiary. Time-domain electromagnetic (TEM) data were acquired near most of the boreholes, and, on the whole, close agreements between laboratory measurements and resistivity models were found.
Analysis of laboratory compaction methods of roller compacted concrete
NASA Astrophysics Data System (ADS)
Trtík, Tomáš; Chylík, Roman; Bílý, Petr; Fládr, Josef
2017-09-01
Roller-compacted concrete (RCC) is an ordinary concrete poured and compacted with machines typically used for laying asphalt road layers. One of the problems connected with this technology is the preparation of representative samples in the laboratory. The aim of this work was to analyse two methods of preparing RCC laboratory samples, with bulk density as the comparative parameter. The first method used dynamic compaction by pneumatic hammer. The second method of compaction had a static character: the specimens were loaded with a precisely defined force in a laboratory loading machine to create the same conditions as during static rolling (in the Czech Republic, only static rolling is commonly used). Bulk densities obtained by the two compaction methods were compared with core drills extracted from a real RCC structure. The results have shown that the samples produced by pneumatic hammer tend to overestimate the bulk density of the material. For both compaction methods, an immediate bearing index test was performed to verify the quality of compaction. A fundamental difference between static and dynamic compaction was identified. In static compaction, the initial resistance to penetration of the mandrel was higher; after a certain limit was exceeded, the resistance remained constant. This means that the samples were well compacted only near the surface. Specimens made by pneumatic hammer actively resisted throughout the test, and the whole volume was uniformly compacted.
Dowel-nut connection in Douglas-fir peeler cores
Ronald W. Wolfe; John R. King; Agron Gjinolli
As part of an effort to encourage more efficient use of small-diameter timber, the Forest Products Laboratory cooperated with Geiger Engineers in a study of the structural properties of Douglas-fir peeler cores and the efficacy of a "dowel-nut" connection detail for application in the design of a space frame roof system. A 44.5-mm- (1.75-in.-) diameter dowel-nut...
Laboratory determination of effective stress laws for deformation and permeability of chalk
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teufel, L W; Warpinski, N R
1990-01-01
Laboratory deformation and permeability measurements have been made on chalk samples from Ekofisk area fields as a function of confining stress and pore pressure to determine the effective stress laws for chalk. An understanding of the effective stress law is essential to obtain correct reservoir-property data from core analysis and is critical for reservoir management studies and reservoir compaction models. A powerful statistical technique known as the response surface method has been used to analyze our laboratory data and determine the form of the effective stress law for deformation and permeability. Experiments were conducted on chalk samples that had a range of porosities from 15% to 36%, because porosity is the dominant intrinsic property that affects deformation and permeability behavior of chalk. Deformation of a 36% porosity chalk was highly nonlinear, but the effective stress law was linear, with α equal to about unity. Lower-porosity samples showed linear strain behavior and a linear effective stress law with α as low as 0.74. Analysis of the effective stress law for permeability is presented only for the lowest-porosity chalk sample because the changes in permeability in the higher-porosity chalk samples due to increasing confining stress or pore pressure were not large enough to deduce meaningful effective stress relationships. 15 refs., 8 figs., 2 tabs.
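A minimal sketch of applying the linear effective stress law, σ_eff = Pc - α·Pp, when interpreting core data; the stress values below are placeholders, while the two α values echo the range quoted above.

```python
# Minimal sketch of a linear effective stress law, sigma_eff = Pc - alpha*Pp,
# as used when interpreting core measurements; stress inputs are placeholders.
def effective_stress(confining_MPa, pore_MPa, alpha):
    return confining_MPa - alpha * pore_MPa

for alpha in (1.0, 0.74):   # high-porosity vs. lower-porosity chalk, per the abstract
    print(alpha, effective_stress(confining_MPa=40.0, pore_MPa=30.0, alpha=alpha))
```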
NASA Astrophysics Data System (ADS)
Carpenter, B. M.; Kitajima, H.; Sutherland, R.; Townend, J.; Toy, V. G.; Saffer, D. M.
2014-03-01
We report on laboratory measurements of permeability and elastic wavespeed for a suite of samples obtained by drilling across the active Alpine Fault on the South Island of New Zealand, as part of the first phase of the Deep Fault Drilling Project (DFDP-1). We find that clay-rich cataclasite and principal slip zone (PSZ) samples exhibit low permeabilities (≤10⁻¹⁸ m²), and that the permeability of hanging-wall cataclasites increases (from c. 10⁻¹⁸ m² to 10⁻¹⁵ m²) with distance from the fault. Additionally, the PSZ exhibits a markedly lower P-wave velocity and Young's modulus relative to the wall rocks. Our laboratory data are in good agreement with in situ wireline logging measurements and are consistent with the identification of an alteration zone surrounding the PSZ defined by observations of core samples. The properties of this zone and the low permeability of the PSZ likely govern transient hydrologic processes during earthquake slip, including thermal pressurization and dilatancy strengthening.
Laboratory Measurements for H3+ Deuteration Reactions
NASA Astrophysics Data System (ADS)
Bowen, Kyle; Hillenbrand, Pierre-Michel; Urbain, Xavier; Savin, Daniel Wolf
2018-06-01
Deuterated molecules are important chemical tracers of protostellar cores. At the ~10⁶ cm⁻³ particle densities and ~20 K temperatures typical for protostellar cores, most molecules freeze onto dust grains. A notable exception is H3+ and its isotopologues. These become important carriers of positive charge in the gas, can couple to any ambient magnetic field, and can thereby alter the cloud dynamics. Knowing the total abundance of H3+ and its isotopologues is important for studying the evolution of protostellar cores. However, H3+ and D3+ have no dipole moment. They lack a pure rotational spectrum and are not observable at protostellar core temperatures. Fortunately H2D+ and D2H+ have dipole moments and a pure rotational spectrum that can be excited in protostellar cores. Observations of these two molecules, combined with astrochemical models, provide information about the total abundance of H3+ and all its isotopologues. The inferred abundances, though, rely on accurate astrochemical data for the deuteration of H3+ and its isotopologues. Here we present laboratory measurements of the rate coefficients for three important deuterating reactions, namely D + H3+/H2D+/D2H+ → H + H2D+/D2H+/D3+. Astrochemical models currently rely on rate coefficients from classical (Langevin) or semi-classical methods for these reactions, as fully quantum-mechanical calculations are beyond current computational capabilities. Laboratory studies are the most tractable means of providing the needed data. For our studies we used our novel dual-source, merged fast-beams apparatus, which enables us to study reactions of neutral atoms and molecular ions. Co-propagating beams allow us to measure experimental rate coefficients as a function of collision energy. We extract cross section data from these results, which we then convolve with a Maxwell-Boltzmann distribution to generate thermal rate coefficients. Here we present our results for these three reactions and discuss some implications.
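The convolution step described above, k(T) = sqrt(8/(π·μ))·(k_B·T)^(-3/2)·∫σ(E)·E·exp(-E/k_B·T)dE, can be sketched numerically as below. The cross section used is a toy Langevin-like form and the constants are placeholders, not the measured D + H3+ data.

```python
import numpy as np

kB = 1.380649e-23        # J/K
amu = 1.66053906660e-27  # kg

def thermal_rate(sigma_of_E, mu_kg, T):
    """k(T) = sqrt(8/(pi*mu)) * (kB*T)^(-3/2) * Int sigma(E) E exp(-E/kB*T) dE."""
    E = np.linspace(1e-25, 200 * kB * T, 20000)   # J; grid spans the thermal range
    dE = E[1] - E[0]
    integrand = sigma_of_E(E) * E * np.exp(-E / (kB * T))
    return np.sqrt(8.0 / (np.pi * mu_kg)) * (kB * T) ** -1.5 * np.sum(integrand) * dE

# Toy Langevin-like cross section, sigma ~ E^(-1/2); constants are placeholders,
# not the measured D + H3+ values.
sigma0, E0 = 1e-19, kB * 300.0      # m^2, J
toy_sigma = lambda E: sigma0 * np.sqrt(E0 / E)

mu = (2.0 * 3.0) / (2.0 + 3.0) * amu   # approximate reduced mass of D + H3+
print(f"k(20 K) ≈ {thermal_rate(toy_sigma, mu, 20.0):.2e} m^3/s")
```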
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, D. H.; Reigel, M. M.
A full-scale formed core sampler was designed and functionally tested for use in the Saltstone Disposal Facility (SDF). Savannah River National Laboratory (SRNL) was requested to compare properties of the formed core samples and core drilled samples taken from adjacent areas in the full-scale sampler. While several physical properties were evaluated, the primary property of interest was hydraulic conductivity. Differences in hydraulic conductivity between the samples from the formed core sampler and those representing the bulk material were noted with respect to the initial handling and storage of the samples. Due to testing conditions, the site port samples were exposed to uncontrolled temperature and humidity conditions prior to testing whereas the formed core samples were kept in sealed containers with minimal exposure to an uncontrolled environment prior to testing. Based on the results of the testing, no significant differences in porosity or density were found between the formed core samples and those representing the bulk material in the test stand.
Best, Michele; Sakande, Jean
2016-01-01
The role of national health laboratories in support of public health response has expanded beyond laboratory testing to include a number of other core functions such as emergency response, training and outreach, communications, laboratory-based surveillance and data management. These functions can only be accomplished by an efficient and resilient national laboratory network that includes public health, reference, clinical and other laboratories. It is a primary responsibility of the national health laboratory in the Ministry of Health to develop and maintain the national laboratory network in the country. In this article, we present practical recommendations based on 17 years of network development experience for the development of effective national laboratory networks. These recommendations and examples of current laboratory networks, are provided to facilitate laboratory network development in other states. The development of resilient, integrated laboratory networks will enhance each state's public health system and is critical to the development of a robust national laboratory response network to meet global health security threats.
ERIC Educational Resources Information Center
Kondratowicz, Izabela; Żelechowska, Kamila
2017-01-01
The aim of this laboratory experiment is to utilize graphene oxide (GO) material to introduce undergraduate students to many well-known concepts of general chemistry. GO is a new nanomaterial that has generated worldwide interest and can be easily produced in every well-equipped undergraduate chemical laboratory. An in-depth examination of GO…
ERIC Educational Resources Information Center
Olsen, Robert J.
2008-01-01
I describe how data pooling and data visualization can be employed in the first-semester general chemistry laboratory to introduce core statistical concepts such as central tendency and dispersion of a data set. The pooled data are plotted as a 1-D scatterplot, a purpose-designed number line through which statistical features of the data are…
An Inexpensive Kinetic Study: The Reaction of FD&C Red #3 (Erythrosin B) with Hypochlorite
ERIC Educational Resources Information Center
Henary, Maher M.; Russell, Arlene A.
2007-01-01
Kinetics constitutes a core topic in both the lecture and laboratory components of lower-level chemistry courses. While textbook examples can ignore issues of time, temperature and safety, the laboratory cannot. Reactions must occur slowly enough to be detected by students, occur rapidly enough for data collection in the few hours assigned to a…
Physical deterioration of preservative treated poles and pilings exposed to salt water
Grant T. Kirker; Jessie Glaeser; Stan T. Lebow; Frederick Green III; Carol A. Clausen
2011-01-01
This report details the results of laboratory analyses of wooden pilings sent to the USDA Forest Products Laboratory in March 2011. These samples were removed from coastal wooden posts, poles, piles, and deck boards. A total of 22 samples, consisting of either core borings or surface fiber samples, were removed from four installations along the South Carolina coast....
The Subsurface Ice Probe (SIPR): A Low-Power Thermal Probe for the Martian Polar Layered Deposits
NASA Technical Reports Server (NTRS)
Cardell, G.; Hecht, M. H.; Carsey, F. D.; Engelhardt, H.; Fisher, D.; Terrell, C.; Thompson, J.
2004-01-01
The distinctive layering visible in images from Mars Global Surveyor of the Martian polar caps, and particularly in the north polar cap, indicates that the stratigraphy of these polar layered deposits may hold a record of Martian climate history covering millions of years. On Earth, ice sheets are cored to retrieve a pristine record of the physical and chemical properties of the ice at depth, and then studied in exacting detail in the laboratory. On the Martian north polar cap, coring is probably not a practical method for implementation in an autonomous lander. As an alternative, thermal probes that drill by melting into the ice are feasible for autonomous operation, and are capable of reasonable approximations to the scientific investigations performed on terrestrial cores, while removing meltwater to the surface for analysis. The Subsurface Ice Probe (SIPR) is such a probe under development at JPL. To explore the dominant climate cycles, it is postulated that tens of meters of depth should be profiled, as this corresponds to the vertical separation of the major layers visible in the MOC images [1]. Optical and spectroscopic analysis of the layers, presumably demarcated by embedded dust and possibly by changes in the ice properties, would contribute to the construction of a chronology. Meltwater analysis may be used to determine the soluble chemistry of the embedded dust, and to monitor gradients of atmospheric gases, particularly hydrogen and oxygen, and isotopic variations that reflect atmospheric conditions at the time the layer was deposited. Thermal measurements can be used to determine the geothermal gradient and the bulk mechanical properties of the ice.
NASA Technical Reports Server (NTRS)
Mahaffy, P. R.
2006-01-01
The Mars Science Laboratory, under development for launch in 2009, is designed to explore and quantitatively assess a local region on Mars as a potential habitat for present or past life. Its ambitious goals are to (1) assess the past or present biological potential of the target environment, (2) characterize the geology and geochemistry at the MSL landing site, and (3) investigate planetary processes that influence habitability. The planned capabilities of the rover payload will enable a comprehensive search for organic molecules, a determination of definitive mineralogy of sampled rocks and fines, chemical and isotopic analysis of both atmospheric and solid samples, and precision isotope measurements of several volatile elements. A range of contact and remote surface and subsurface survey tools will establish context for these measurements and will facilitate sample identification and selection. The Sample Analysis at Mars (SAM) suite of MSL addresses several of the mission's core measurement goals. It includes a gas chromatograph, a mass spectrometer, and a tunable laser spectrometer. These instruments will be designed to analyze either atmospheric samples or gases extracted from solid phase samples such as rocks and fines. We will describe the range of measurement protocols under development and study by the SAM engineering and science teams for use on the surface of Mars.
NASA Technical Reports Server (NTRS)
Glavin, D. P.; Buch, A.; Cabane, M.; Coll, P.; Navarro-Gonzalez, R.; Mahaffy, P. R.
2005-01-01
One of the core science objectives of NASA's 2009 Mars Science Laboratory (MSL) mission is to determine the past or present habitability of Mars. The search for key organic compounds relevant to terrestrial life will be an important part of that assessment. We have developed a protocol for the analysis of amino acids and carboxylic acids in Mars analogue materials using gas chromatography mass spectrometry (GCMS). As shown, a variety of carboxylic acids were readily identified in soil collected from the Atacama Desert in Chile at part-per-billion levels by GCMS after extraction and chemical derivatization using the reagent N-methyl-N-(tert-butyldimethylsilyl)trifluoroacetamide (MTBSTFA). Several derivatized amino acids including glycine and alanine were also detected by GCMS in the Atacama soil at lower concentrations (chromatogram not shown). Lacking derivatization capability, the Viking pyrolysis GCMS instruments could not have detected amino acids and carboxylic acids, since these non-volatile compounds require chemical transformation into volatile species that are stable in a GC column. We are currently optimizing the chemical extraction and derivatization technique for in situ GCMS analysis on Mars. Laboratory results of analyses of Atacama Desert samples and other Mars analogue materials using this protocol will be presented.
Earl-Boehm, Jennifer E; Bolgla, Lori A; Emory, Carolyn; Hamstra-Wright, Karrie L; Tarima, Sergey; Ferber, Reed
2018-06-12
Patellofemoral pain (PFP) is a common injury that interferes with quality of life and physical activity. Clinical subgroups of patients may exist, one of which is proximal muscle dysfunction. To develop clinical prediction rules that predict a positive outcome after either a hip and core- or knee-focused strengthening program for individuals with PFP. Secondary analysis of data from a randomized control trial. Four university laboratories. A total of 199 participants with PFP. Participants were randomly allocated to either a hip and core-focused (n = 111) or knee-focused (n = 88) rehabilitation group for a 6-week program. Demographics, self-reported knee pain (visual analog scale) and function (Anterior Knee Pain Scale), hip strength, abdominal muscle endurance, and hip range of motion were evaluated at baseline. Treatment success was defined as a decrease in visual analog scale score by ≥2 cm or an increase in the Anterior Knee Pain Scale score by ≥8 points or both. Bivariate relationships between the outcome (treatment success) and the predictor variables were explored, followed by a forward stepwise logistic regression to predict a successful outcome. Patients with more pain, better function, greater lateral core endurance, and less anterior core endurance were more likely to have a successful outcome after hip and core strengthening (88% sensitivity and 54% specificity). Patients with lower weight, weaker hip internal rotation, stronger hip extension, and greater trunk-extension endurance were more likely to have success after knee strengthening (82% sensitivity and 58% specificity). The patients with PFP who have more baseline pain and yet maintain a high level of function may experience additional benefit from hip and core strengthening. The clinical prediction rules from this study remain in the developmental phase and should be applied with caution until externally validated.
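As a hedged illustration of how a prediction rule of this kind is built and evaluated, the sketch below fits a plain logistic regression (the study's forward stepwise selection is omitted) and reports sensitivity and specificity at a 0.5 threshold; all data are simulated, and the predictor names only mirror those in the abstract.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical baseline predictors and outcomes for a hip/core-focused group;
# names mirror the abstract but the numbers are made up.
rng = np.random.default_rng(0)
X = rng.normal(size=(111, 4))   # pain, function, lateral core endurance, anterior core endurance
y = (X[:, 0] + X[:, 2] - X[:, 3] + rng.normal(size=111) > 0).astype(int)  # 1 = treatment success

model = LogisticRegression().fit(X, y)          # stepwise selection omitted in this sketch
p = model.predict_proba(X)[:, 1]
pred = (p >= 0.5).astype(int)

tp = np.sum((pred == 1) & (y == 1)); fn = np.sum((pred == 0) & (y == 1))
tn = np.sum((pred == 0) & (y == 0)); fp = np.sum((pred == 1) & (y == 0))
print(f"sensitivity={tp/(tp+fn):.2f}  specificity={tn/(tn+fp):.2f}")
```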
Observing Tropical Cyclones from the Global Hawk: HAMSR Results from GRIP
NASA Astrophysics Data System (ADS)
Lambrigtsen, B.; Brown, S.; Behrangi, A.
2011-12-01
The Global Hawk unmanned aerial vehicle (UAV) recently acquired by NASA was flown for the first time in 2010 in a hurricane field campaign, the NASA Genesis and Rapid Intensification Processes (GRIP) experiment. One of the primary payloads was the High Altitude MMIC Sounding Radiometer (HAMSR) developed at the Jet Propulsion Laboratory. HAMSR is a cloud penetrating microwave sounder that provides a picture of the state of the atmosphere, such as the thermodynamic environment around hurricanes and the convective structure in the inner core. We show results from GRIP, including analysis of observations of Hurricane Karl during 13 hours during a period of rapid intensification.
None
2018-02-13
NETL's CT Scanner laboratory is equipped with three CT scanners and a mobile core logging unit that work together to provide characteristic geologic and geophysical information at different scales, non-destructively.
Chemical Convection in the Lunar Core from Melting Experiments on the Iron-Sulfur System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, J.; Liu, J.; Chen, B.
2012-03-26
By reanalyzing Apollo lunar seismograms using array-processing methods, a recent study suggests that the Moon has a solid inner core and a fluid outer core, much like the Earth. The volume fraction of the lunar inner core is 38%, compared with 4% for the Earth. The pressure at the Moon's core-mantle boundary is 4.8 GPa, and that at the inner core boundary (ICB) is 5.2 GPa. The partially molten state of the lunar core provides constraints on the thermal and chemical states of the Moon: the temperature at the ICB corresponds to the liquidus of the outer core composition, and the mass fraction of the solid core allows us to infer the bulk composition of the core from an estimated thermal profile. Moreover, knowledge of the extent of core solidification can be used to evaluate the role of chemical convection in the origin of the early lunar core dynamo. Sulfur is considered an antifreeze component in the lunar core. Here we investigate the melting behavior of the Fe-S system at the pressure conditions of the lunar core, using the multi-anvil apparatus and synchrotron and laboratory-based analytical methods. Our goal is to understand compositionally driven convection in the lunar core and assess its role in generating an internal magnetic field in the early history of the Moon.
Acoustic emission characterization of microcracking in laboratory-scale hydraulic fracturing tests
Hampton, Jesse; Gutierrez, Marte; Matzar, Luis; ...
2018-06-11
Understanding microcracking near coalesced fracture generation is critically important for hydrocarbon and geothermal reservoir characterization as well as damage evaluation in civil engineering structures. Dense and sometimes random microcracking near coalesced fracture formation alters the mechanical properties of the nearby virgin material. Individual microcrack characterization is also significant in quantifying the material changes near the fracture faces (i.e. damage). Acoustic emission (AE) monitoring and analysis provide unique information regarding the microcracking process temporally, and information concerning the source characterization of individual microcracks can be extracted. In this context, laboratory hydraulic fracture tests were carried out while monitoring the AEs from several piezoelectric transducers. In-depth post-processing of the AE event data was performed for the purpose of understanding the individual source mechanisms. Several source characterization techniques including moment tensor inversion, event parametric analysis, and volumetric deformation analysis were adopted. Post-test fracture characterization through coring, slicing and micro-computed tomographic imaging was performed to determine the coalesced fracture location and structure. Distinct differences in fracture characteristics were found spatially in relation to the openhole injection interval. Individual microcrack AE analysis showed substantial energy reduction emanating spatially from the injection interval. Lastly, it was quantitatively observed that the recorded AE signals provided sufficient information to generalize the damage radiating spatially away from the injection wellbore.
Fischer, Christoph; Domer, Benno; Wibmer, Thomas; Penzel, Thomas
2017-03-01
Photoplethysmography has been used in a wide range of medical devices for measuring oxygen saturation, cardiac output, assessing autonomic function, and detecting peripheral vascular disease. Artifacts can render the photoplethysmogram (PPG) useless. Thus, algorithms capable of identifying artifacts are critically important. However, the published PPG algorithms are limited in algorithm and study design. Therefore, the authors developed a novel embedded algorithm for real-time pulse waveform (PWF) segmentation and artifact detection based on a contour analysis in the time domain. This paper provides an overview about PWF and artifact classifications, presents the developed PWF analysis, and demonstrates the implementation on a 32-bit ARM core microcontroller. The PWF analysis was validated with data records from 63 subjects acquired in a sleep laboratory, ergometry laboratory, and intensive care unit in equal parts. The output of the algorithm was compared with harmonized experts' annotations of the PPG with a total duration of 31.5 h. The algorithm achieved a beat-to-beat comparison sensitivity of 99.6%, specificity of 90.5%, precision of 98.5%, and accuracy of 98.3%. The interrater agreement expressed as Cohen's kappa coefficient was 0.927 and as F-measure was 0.990. In conclusion, the PWF analysis seems to be a suitable method for PPG signal quality determination, real-time annotation, data compression, and calculation of additional pulse wave metrics such as amplitude, duration, and rise time.
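The beat-to-beat agreement metrics quoted above can all be derived from a 2x2 comparison table. The sketch below uses hypothetical counts, not the study's raw annotations.

```python
# Minimal sketch: agreement metrics from beat-to-beat comparison counts.
# tp/fp/fn/tn are hypothetical counts, not the study's raw data.
tp, fp, fn, tn = 9500, 145, 40, 1380

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
precision   = tp / (tp + fp)
accuracy    = (tp + tn) / (tp + fp + fn + tn)
f_measure   = 2 * precision * sensitivity / (precision + sensitivity)

# Cohen's kappa: observed vs. chance agreement for two raters (algorithm vs. expert).
n = tp + fp + fn + tn
p_o = (tp + tn) / n
p_e = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
kappa = (p_o - p_e) / (1 - p_e)
print(sensitivity, specificity, precision, accuracy, f_measure, kappa)
```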
Preparations to ship the TMI-2 damaged reactor core
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmitt, R.C.; Quinn, G.J.
1985-11-01
The March 1979 accident at Three Mile Island Unit 2 (TMI-2) resulted in a severely damaged core. Entries into that core using various tools and inspection devices have shown a significant void, large amounts of rubble, partially intact fuel assemblies, and some resolidified molten materials. The removal and disposition of that core has been of considerable public, regulatory, and governmental interest for some time. In a contractual agreement between General Public Utility Nuclear (GPUN) and the US Department of Energy (DOE), DOE has agreed to accept the TMI-2 core for interim storage at the Idaho National Engineering Laboratory (INEL), conduct research on fuel and materials of the core, and eventually dispose of the core either by processing or interment at the national repository. GPUN has removed various samples of material from the core and was scheduled to begin extensive defueling operations in September 1985. EG and G Idaho, Inc. (EG and G), acting on behalf of DOE, is responsible for transporting, receiving, examining, and storing the TMI-2 core. This paper addresses the preparations to ship the core to INEL, which is scheduled to commence in March 1986.
Image processing and machine learning in the morphological analysis of blood cells.
Rodellar, J; Alférez, S; Acevedo, A; Molina, A; Merino, A
2018-05-01
This review focuses on how image processing and machine learning can be useful for the morphological characterization and automatic recognition of cell images captured from peripheral blood smears. The basics of the 3 core elements (segmentation, quantitative features, and classification) are outlined, and recent literature is discussed. Although red blood cells are a significant part of this context, this study focuses on malignant lymphoid cells and blast cells. There is no doubt that these technologies may help the cytologist to perform efficient, objective, and fast morphological analysis of blood cells. They may also help in the interpretation of some morphological features and may serve as learning and survey tools. Although research is still needed, it is important to define screening strategies to exploit the potential of image-based automatic recognition systems integrated in the daily routine of laboratories along with other analysis methodologies. © 2018 John Wiley & Sons Ltd.
Code of Federal Regulations, 2010 CFR
2010-07-01
... or use of man-made or natural materials (such as slag, dumps, cores and debitage); (v) Organic..., laboratory reports, computer cards and tapes, computer disks and diskettes, printouts of computerized data...
Code of Federal Regulations, 2012 CFR
2012-07-01
... or use of man-made or natural materials (such as slag, dumps, cores and debitage); (v) Organic..., laboratory reports, computer cards and tapes, computer disks and diskettes, printouts of computerized data...
Code of Federal Regulations, 2011 CFR
2011-07-01
... or use of man-made or natural materials (such as slag, dumps, cores and debitage); (v) Organic..., laboratory reports, computer cards and tapes, computer disks and diskettes, printouts of computerized data...
Code of Federal Regulations, 2014 CFR
2014-07-01
... or use of man-made or natural materials (such as slag, dumps, cores and debitage); (v) Organic..., laboratory reports, computer cards and tapes, computer disks and diskettes, printouts of computerized data...
Code of Federal Regulations, 2013 CFR
2013-07-01
... or use of man-made or natural materials (such as slag, dumps, cores and debitage); (v) Organic..., laboratory reports, computer cards and tapes, computer disks and diskettes, printouts of computerized data...
Twenty Years of Active Bacterial Core Surveillance
Schaffner, William; Farley, Monica M.; Lynfield, Ruth; Bennett, Nancy M.; Reingold, Arthur; Thomas, Ann; Harrison, Lee H.; Nichols, Megin; Petit, Susan; Miller, Lisa; Moore, Matthew R.; Schrag, Stephanie J.; Lessa, Fernanda C.; Skoff, Tami H.; MacNeil, Jessica R.; Briere, Elizabeth C.; Weston, Emily J.; Van Beneden, Chris
2015-01-01
Active Bacterial Core surveillance (ABCs) was established in 1995 as part of the Centers for Disease Control and Prevention Emerging Infections Program (EIP) network to assess the extent of invasive bacterial infections of public health importance. ABCs is distinctive among surveillance systems because of its large, population-based, geographically diverse catchment area; active laboratory-based identification of cases to ensure complete case capture; detailed collection of epidemiologic information paired with laboratory isolates; infrastructure that allows for more in-depth investigations; and sustained commitment of public health, academic, and clinical partners to maintain the system. ABCs has directly affected public health policies and practices through the development and evaluation of vaccines and other prevention strategies, the monitoring of antimicrobial drug resistance, and the response to public health emergencies and other emerging infections. PMID:26292067
2008-01-01
Chapter 1 of this CD-ROM is a database of digitized Fischer (shale-oil) assays of cores and cuttings from boreholes drilled in the Eocene Green River oil shale deposits in southwestern Wyoming. Assays of samples from some surface sections are also included. Most of the Fischer assay analyses were made by the former U.S. Bureau of Mines (USBM) at its laboratory in Laramie, Wyoming. Other assays, made by institutional or private laboratories, were donated to the U.S. Geological Survey (USGS) and are included in this database, as are Adobe PDF-scanned images of some of the original laboratory assay reports and lithologic logs prepared by USBM geologists. The size of this database is 75.2 megabytes and includes information on 971 core holes and rotary-drilled boreholes and numerous surface sections. Most of these data were released previously by the USBM and the USGS through the National Technical Information Service but are no longer available from that agency. Fischer assays for boreholes in northeastern Utah and northwestern Colorado have been published by the USGS. Additional data include geophysical logs, groundwater data, chemical and X-ray diffraction analyses, and other data. These materials are available for inspection in the office of the USGS Central Energy Resources Team in Lakewood, Colorado. The digitized assays were checked against the original laboratory reports, but some errors likely remain. Other information, such as locations and elevations of core holes and oil and gas tests, was not thoroughly checked. However, owing to the current interest in oil-shale development, it was considered in the public interest to make this preliminary database available at this time. Chapter 2 of this CD-ROM presents oil-yield histograms of samples of cores and cuttings from exploration drill holes in the Eocene Green River Formation in the Great Divide, Green River, and Washakie Basins of southwestern Wyoming. A database was compiled that includes about 47,000 Fischer assays from 186 core holes and 240 rotary drill holes. Most of the oil yield data are from analyses performed by the former U.S. Bureau of Mines oil shale laboratory in Laramie, Wyoming, with some analyses made by private laboratories. Location data for 971 Wyoming oil-shale drill holes are listed in a spreadsheet that is included in the CD-ROM. These Wyoming Fischer assays and histograms are part of a much larger collection of oil-shale information, including geophysical and lithologic logs, water data, and chemical and X-ray diffraction analyses on the Green River oil-shale deposits in Colorado, Utah, and Wyoming held by the U.S. Geological Survey. Because of an increased interest in oil shale, this CD-ROM containing Fischer assay data and oil-yield histograms for the Green River oil-shale deposits in southwestern Wyoming is being released to the public. Microsoft Excel spreadsheets included with Chapter 2 contain the Fischer assay data from the 426 holes as well as the company name, drill-hole name, and location for each hole. Histograms of the oil yields obtained from the Fischer assays are presented in both Grapher and PDF format. Fischer assay text data files are also included in the CD-ROM.
Preliminary Results on Lunar Interior Properties from the GRAIL Mission
NASA Technical Reports Server (NTRS)
Williams, James G.; Konopliv, Alexander S.; Asmar, Sami W.; Lemoine, Frank G.; Melosh, H. Jay; Neumann, Gregory A.; Phillips, Roger J.; Smith, David E.; Solomon, Sean C.; Watkins, Michael M.;
2013-01-01
The Gravity Recovery and Interior Laboratory (GRAIL) mission has provided lunar gravity with unprecedented accuracy and resolution. GRAIL has produced a high-resolution map of the lunar gravity field while also determining tidal response. We present the latest gravity field solution and its preliminary implications for the Moon's interior structure, exploring properties such as the mean density, moment of inertia of the solid Moon, and tidal potential Love number k2. Lunar structure includes a thin crust, a deep mantle, a fluid core, and a suspected solid inner core. An accurate Love number mainly improves knowledge of the fluid core and deep mantle. In the future GRAIL will search for evidence of tidal dissipation and a solid inner core.
Teaching laboratory neuroscience at Bowdoin: the laboratory instructor perspective.
Hauptman, Stephen; Curtis, Nancy
2009-01-01
Bowdoin College is a small liberal arts college that offers a comprehensive Neuroscience major. The laboratory experience is an integral part of the major, and many students progress through three stages. A core course offers a survey of concepts and techniques. Four upper-level courses function to give students more intensive laboratory research experience in neurophysiology, molecular neurobiology, social behavior, and learning and memory. Finally, many majors choose to work in the individual research labs of the Neuroscience faculty. We, as laboratory instructors, are vital to the process, and are actively involved in all aspects of the lab-based courses. We provide student instruction in state of the art techniques in neuroscience research. By sharing laboratory teaching responsibilities with course professors, we help to prepare students for careers in laboratory neuroscience and also support and facilitate faculty research programs.
New virtual laboratories presenting advanced motion control concepts
NASA Astrophysics Data System (ADS)
Goubej, Martin; Krejčí, Alois; Reitinger, Jan
2015-11-01
The paper deals with the development of a software framework for rapid generation of remote virtual laboratories. A client-server architecture is chosen in order to employ a real-time simulation core running on a dedicated server. An ordinary web browser is used as the final renderer to achieve a hardware-independent solution that can be run on different target platforms including laptops, tablets or mobile phones. The provided toolchain allows automatic generation of the virtual laboratory source code from a configuration file created in the open-source Inkscape graphic editor. Three virtual laboratories presenting advanced motion control algorithms have been developed, showing the applicability of the proposed approach.
Waite, W.F.; Kneafsey, T.J.; Winters, W.J.; Mason, D.H.
2008-01-01
Physical property measurements of sediment cores containing natural gas hydrate are typically performed on material exposed, at least briefly, to non-in situ conditions during recovery. To examine the effects of a brief excursion from the gas-hydrate stability field, as can occur when pressure cores are transferred to pressurized storage vessels, we measured physical properties on laboratory-formed sand packs containing methane hydrate and methane pore gas. After depressurizing samples to atmospheric pressure, we repressurized them into the methane-hydrate stability field and remeasured their physical properties. Thermal conductivity, shear strength, acoustic compressional and shear wave amplitudes, and speeds of the original and depressurized/repressurized samples are compared. X-ray computed tomography images track how the gas-hydrate distribution changes in the hydrate-cemented sands owing to the depressurization/repressurization process. Because depressurization-induced property changes can be substantial and are not easily predicted, particularly in water-saturated, hydrate-bearing sediment, maintaining pressure and temperature conditions throughout the core recovery and measurement process is critical for using laboratory measurements to estimate in situ properties.
Armentano, Antonio; Summa, Simona; Magro, Sonia Lo; D’Antini, Pasquale; Palermo, Carmen; Muscarella, Marilena
2016-01-01
A C18 column packed with core-shell particles was used for the chromatographic separation of sulphonamides in feed and meat by a conventional high performance liquid chromatography system coupled with a diode array detector. Two analytical methods, already used in our laboratory, have been modified without any changes to the extraction and clean-up steps or to the liquid chromatography instrumentation. Chromatographic conditions applied on a traditional 5-µm column have been optimized for a column packed with 2.6 µm core-shell particles. A binary mobile phase [acetate buffer solution at pH 4.50 and a mixture of methanol-acetonitrile 50:50 (v/v)] was employed in gradient mode at a flow rate of 1.2 mL/min with an injection volume of 6 µL. These chromatographic conditions allow the separation of 13 sulphonamides within a total run time of 13 minutes. Preliminary studies have been carried out comparing blank and spiked samples of feed and meat. Good resolution and the absence of interferences were achieved in the chromatograms for both matrices. Since no change was made to the sample preparation, the optimized method does not require a complete revalidation and can be used to make routine analysis faster. PMID:28217560
Simulation of the planetary interior differentiation processes in the laboratory.
Fei, Yingwei
2013-11-15
A planetary interior is under high-pressure and high-temperature conditions and it has a layered structure. There are two important processes that led to that layered structure, (1) percolation of liquid metal in a solid silicate matrix by planet differentiation, and (2) inner core crystallization by subsequent planet cooling. We conduct high-pressure and high-temperature experiments to simulate both processes in the laboratory. Formation of percolative planetary core depends on the efficiency of melt percolation, which is controlled by the dihedral (wetting) angle. The percolation simulation includes heating the sample at high pressure to a target temperature at which iron-sulfur alloy is molten while the silicate remains solid, and then determining the true dihedral angle to evaluate the style of liquid migration in a crystalline matrix by 3D visualization. The 3D volume rendering is achieved by slicing the recovered sample with a focused ion beam (FIB) and taking SEM image of each slice with a FIB/SEM crossbeam instrument. The second set of experiments is designed to understand the inner core crystallization and element distribution between the liquid outer core and solid inner core by determining the melting temperature and element partitioning at high pressure. The melting experiments are conducted in the multi-anvil apparatus up to 27 GPa and extended to higher pressure in the diamond-anvil cell with laser-heating. We have developed techniques to recover small heated samples by precision FIB milling and obtain high-resolution images of the laser-heated spot that show melting texture at high pressure. By analyzing the chemical compositions of the coexisting liquid and solid phases, we precisely determine the liquidus curve, providing necessary data to understand the inner core crystallization process.
van der Heide, Astrid; Werth, Esther; Donjacour, Claire E H M; Reijntjes, Robert H A M; Lammers, Gert Jan; Van Someren, Eus J W; Baumann, Christian R; Fronczek, Rolf
2016-11-01
Previous laboratory studies in narcolepsy patients showed altered core body and skin temperatures, which are hypothesised to be related to disturbed sleep-wake regulation. In this ambulatory study we assessed temperature profiles in normal daily life, and whether sleep attacks are heralded by changes in skin temperature. Furthermore, the effects of three months of treatment with sodium oxybate (SXB) were investigated. Twenty-five narcolepsy patients and 15 healthy controls were included. Core body, proximal and distal skin temperatures, and sleep-wake state were measured simultaneously for 24 hours in ambulatory patients. This procedure was repeated in 16 narcolepsy patients after at least 3 months of stable treatment with SXB. Increases in distal skin temperature and distal-to-proximal temperature gradient (DPG) strongly predicted daytime sleep attacks (P < 0.001). As compared to controls, patients had a higher proximal and distal skin temperature in the morning, and a lower distal skin temperature during the night (all P < 0.05). Furthermore, they had a higher core body temperature during the first part of the night (P < 0.05), which SXB decreased (F = 4.99, df = 1, P = 0.03) to a level similar to controls. SXB did not affect skin temperature. This ambulatory study demonstrates that daytime sleep attacks were preceded by clear changes in distal skin temperature and DPG. Furthermore, changes in core body and skin temperature in narcolepsy, previously only studied in laboratory settings, were partially confirmed. Treatment with SXB resulted in a normalisation of the core body temperature profile. Future studies should explore whether predictive temperature changes can be used to signal or even prevent sleep attacks. © 2016 Associated Professional Sleep Societies, LLC.
Head-on collision of the second mode internal solitary waves
NASA Astrophysics Data System (ADS)
Terletska, Kateryna; Maderich, Vladimir; Jung, Kyung Tae
2017-04-01
Second mode internal waves are widespread in offshore areas, and they frequently follow the first mode internal waves on the oceanic shelf. Large amplitude internal solitary waves (ISW) of second mode containing trapped cores associated with closed streamlines can also transport plankton and nutrients. An interaction of ISWs with trapped cores takes place in a specific manner, which motivated us to carry out a computational study of the head-on collision of ISWs of second mode propagating in a laboratory-scale numerical tank, using a nonhydrostatic 3D numerical model based on the Navier-Stokes equations for a continuously stratified fluid. Three main classes of ISW of second mode propagating in the pycnocline layer of thickness h between homogeneous deep layers can be identified: (i) the weakly nonlinear waves; (ii) the stable strongly nonlinear waves with trapped cores; and (iii) the shear unstable strongly nonlinear waves (Maderich et al., 2015). Four interaction regimes for symmetric collisions were separated from the simulation results using this classification: (A) an almost elastic interaction of the weakly nonlinear waves; (B) a non-elastic interaction of waves with trapped cores when ISW amplitudes were close to the critical non-dimensional amplitude a/h; (C) an almost elastic interaction of stable strongly nonlinear waves with trapped cores; (D) a non-elastic interaction of the unstable strongly nonlinear waves. An unexpected result of the simulations was that the relative loss of energy due to the collision was maximal for regime B. A new regime appeared when ISWs of different amplitudes, both belonging to class (ii), collided. As a result of the interaction, an exchange of mass between the ISWs occurred: the trapped core of the smaller wave was entrained by the core of the larger ISW without mixing, forming a new ISW of larger amplitude, whereas the core of the resulting smaller ISW was entirely replaced by fluid from the larger wave. Overall, the wave characteristics induced by the head-on collision agree well with the results of several available laboratory experiments. References [1] V. Maderich, K. T. Jung, K. Terletska, I. Brovchenko, T. Talipova, "Incomplete similarity of internal solitary waves with trapped core," Fluid Dynamics Research 47, 035511 (2015).
Integrating Condensed Matter Physics into a Liberal Arts Physics Curriculum
NASA Astrophysics Data System (ADS)
Collett, Jeffrey
2008-03-01
The emergence of nanoscale science into the popular consciousness presents an opportunity to attract and retain future condensed matter scientists. We inject nanoscale physics into recruiting activities and into the introductory and the core portions of the curriculum. Laboratory involvement and research opportunity play important roles in maintaining student engagement. We use inexpensive scanning tunneling (STM) and atomic force (AFM) microscopes to introduce students to nanoscale structure early in their college careers. Although the physics of tip-surface interactions is sophisticated, the resulting images can be interpreted intuitively. We use the STM in introductory modern physics to explore quantum tunneling and the properties of electrons at surfaces. An interdisciplinary nanoscience and nanotechnology course, team-taught with chemists, looks at nanoscale phenomena in physics, chemistry, and biology. Core quantum and statistical physics courses look at effects of quantum mechanics and quantum statistics in degenerate systems. An upper level solid-state physics course takes up traditional condensed matter topics from a structural perspective by beginning with a study of both elastic and inelastic scattering of x-rays from crystalline solids and liquid crystals. Students encounter reciprocal space concepts through the analysis of laboratory scattering data and by the development of the scattering theory. The course then examines the importance of scattering processes in band structure and in electrical and thermal conduction. A segment of the course is devoted to surface physics and nanostructures where we explore the effects of restricting particles to two-dimensional surfaces, one-dimensional wires, and zero-dimensional quantum dots.
CRREL (Cold Regions Research and Engineering Laboratory) Technical Publications. Supplement
1986-09-01
Restoration and PDS Archive of Apollo Lunar Rock Sample Data
NASA Technical Reports Server (NTRS)
Garcia, P. A.; Todd, N. S.; Lofgren, G. E.; Stefanov, W. L.; Runco, S. K.; LaBasse, D.; Gaddis, L. R.
2011-01-01
In 2008, scientists at the Johnson Space Center (JSC) Lunar Sample Laboratory and Image Science & Analysis Laboratory (under the auspices of the Astromaterials Research and Exploration Science Directorate or ARES) began work on a 4-year project to digitize the original film negatives of Apollo Lunar Rock Sample photographs. These rock samples together with lunar regolith and core samples were collected as part of the lander missions for Apollos 11, 12, 14, 15, 16 and 17. The original film negatives are stored at JSC under cryogenic conditions. This effort is data restoration in the truest sense. The images represent the only record available to scientists which allows them to view the rock samples when making a sample request. As the negatives are being scanned, they are also being formatted and documented for permanent archive in the NASA Planetary Data System (PDS) archive. The ARES group is working collaboratively with the Imaging Node of the PDS on the archiving.
Operational Philosophy for the Advanced Test Reactor National Scientific User Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. Benson; J. Cole; J. Jackson
2013-02-01
In 2007, the Department of Energy (DOE) designated the Advanced Test Reactor (ATR) as a National Scientific User Facility (NSUF). At its core, the ATR NSUF Program combines access to a portion of the available ATR radiation capability, the associated required examination and analysis facilities at the Idaho National Laboratory (INL), and INL staff expertise with novel ideas provided by external contributors (universities, laboratories, and industry). These collaborations define the cutting edge of nuclear technology research in high-temperature and radiation environments, contribute to improved industry performance of current and future light-water reactors (LWRs), and stimulate cooperative research between user groups conducting basic and applied research. To make possible the broadest access to key national capability, the ATR NSUF formed a partnership program that also makes available access to critical facilities outside of the INL. Finally, the ATR NSUF has established a sample library that allows access to pre-irradiated samples as needed by national research teams.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kayser, Y. (Paul Scherrer Institut, 5232 Villigen-PSI; E-mail: yves.kayser@psi.ch); Błachucki, W.
2014-04-15
The high-resolution von Hamos bent crystal spectrometer of the University of Fribourg was upgraded with a focused X-ray beam source with the aim of performing micro-sized X-ray fluorescence (XRF) measurements in the laboratory. The focused X-ray beam source integrates a collimating optics mounted on a low-power micro-spot X-ray tube and a focusing polycapillary half-lens placed in front of the sample. The performances of the setup were probed in terms of spatial and energy resolution. In particular, the fluorescence intensity and energy resolution of the von Hamos spectrometer equipped with the novel micro-focused X-ray source and a standard high-power water-cooled X-ray tube were compared. The XRF analysis capability of the new setup was assessed by measuring the dopant distribution within the core of Er-doped SiO2 optical fibers.
Spectroscopy of Al wire array stagnation on Z
NASA Astrophysics Data System (ADS)
Jones, B.; Jennings, C. A.; Hansen, S. B.; Bailey, J. E.; Rochau, G. A.; Coverdale, C. A.; Yu, E. P.; Ampleford, D. J.; Cuneo, M. E.; Maron, Y.; Fisher, V. I.; Bernshtam, V.; Starobinets, A.; Weingarten, L.; Pinhas, S.
2011-10-01
In this work, we present analysis of time-gated spectra of ~2 keV K-shell emissions from Al (5% Mg) wire arrays on Z to provide details of the plasma conditions and dynamics at the onset of stagnation. The plasma is modeled as concentric radial zones, and collisional-radiative modeling with self-consistent radiation transport is used to constrain the temperatures and densities in these regions. A hot ~2 keV plasma core bearing a few percent of the total mass forms at the foot of the x-ray pulse, with participating mass increasing toward peak x-ray power as material arrives on axis with ~50 cm/μs implosion velocity. The atomic modeling accounts for K-shell line opacity and Doppler effects, and is compared to 3D MHD simulations. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. DOE National Nuclear Security Administration under contract DE-AC04-94AL85000.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen-Mayer, H; Tosh, R
2015-06-15
Purpose: To reconcile air kerma and calorimetry measurements in a prototype calorimeter for obtaining absorbed dose in diagnostic CT beams. While corrections for thermal artifacts are routine and generally small in calorimetry of radiotherapy beams, large differences in relative stopping powers of calorimeter materials at the lower energies typical of CT beams greatly magnify their effects. Work to date on the problem attempts to reconcile laboratory measurements with modeling output from Monte Carlo and finite-element analysis of heat transfer. Methods: Small thermistor beads were embedded in a polystyrene (PS) core element of 1 cm diameter, which was inserted into a cylindrical HDPE phantom of 30 cm diameter and subjected to radiation in a diagnostic CT x-ray imaging system. Resistance changes in the thermistors due to radiation heating were monitored via lock-in amplifier. Multiple 3-second exposures were recorded at 8 different dose-rates from the CT system, and least-squares fits to experimental data were compared to an expected thermal response obtained by finite-element analysis incorporating source terms based on semi-empirical modeling and Monte Carlo simulation. Results: Experimental waveforms exhibited large thermal artifacts with fast time constants, associated with excess heat in wires and glass, and smaller steps attributable to radiation heating of the core material. Preliminary finite-element analysis follows the transient component of the signal qualitatively, but predicts a slower decay of temperature spikes. This was supplemented by non-linear least-squares fits incorporating semi-empirical formulae for heat transfer, which were used to obtain dose-to-PS in reasonable agreement with the output of Monte Carlo calculations that converts air kerma to absorbed dose. Conclusion: Discrepancies between the finite-element analysis and our experimental data testify to the very significant heat transfer correction required for absorbed dose calorimetry of diagnostic CT beams. The results obtained here are being used to refine both simulations and design of calorimeter core components.
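The following sketch illustrates, under stated assumptions, the kind of non-linear least-squares separation of a fast thermal artifact from the slower radiation-heating step described above. The model form, the 3 s beam-on window, the synthetic trace, and the polystyrene specific heat are assumptions for illustration; this is not the authors' fitting code, and a real analysis would include the heat-transfer corrections discussed in the abstract.

```python
# Minimal sketch (not the study's code): separate a fast-decaying thermal artifact
# from the slower radiation-heating ramp in a calorimeter temperature trace.
# Assumed model: T(t) = T0 + A*exp(-t/tau) + r*min(t, T_BEAM)
import numpy as np
from scipy.optimize import curve_fit

T_BEAM = 3.0  # assumed beam-on duration, s (3-second exposures per the abstract)

def model(t, T0, A, tau, r):
    # artifact decay plus linear heating during the exposure, plateau afterwards
    return T0 + A * np.exp(-t / tau) + r * np.minimum(t, T_BEAM)

# Synthetic trace: small heating rate, a larger fast artifact, and noise
t = np.linspace(0.0, 10.0, 500)
true = model(t, 0.0, 5e-4, 0.4, 2e-5)
data = true + np.random.default_rng(1).normal(0.0, 2e-6, t.size)

popt, _ = curve_fit(model, t, data, p0=[0.0, 3e-4, 0.5, 1e-5])
T0, A, tau, r = popt
delta_T = r * T_BEAM            # radiation-induced temperature rise, K
c_p = 1200.0                    # assumed specific heat of polystyrene, J/(kg K)
dose = c_p * delta_T            # dose-to-PS in Gy, before heat-transfer corrections
print(f"artifact tau = {tau:.2f} s, delta_T = {delta_T:.2e} K, dose ~ {dose:.2e} Gy")
```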
Hadziyannis, Emilia; Minopetrou, Martha; Georgiou, Anastasia; Spanou, Fotini; Koskinas, John
2013-01-01
Background Hepatitis C viral (HCV) load detection and quantification is routinely accomplished by HCV RNA measurement, an expensive but essential test, both for the diagnosis and treatment of chronic hepatitis C (CHC). HCV core antigen (Ag) testing has been suggested as an attractive alternative to molecular diagnostics. The aim of the study was to evaluate an automated chemiluminescent immunoassay (CLIA) for HCV core Ag measurement in comparison to quantitative HCV RNA determination. Methods HCV Ag was measured in 105 anti-HCV positive patients, of whom 89 were HCV RNA positive with CHC and 16 HCV RNA negative after spontaneous HCV clearance. Viral load was quantified with branched DNA (bDNA, Versant, Siemens). Sera were stored at -70°C and then tested with the Architect HCV Ag test (Abbott Laboratories), a two-step CLIA assay with high throughput and minimal handling of the specimens. Statistical analysis was performed on logarithmically transformed values. Results HCV-Ag was detectable and quantifiable in 83/89 and in the grey zone in 4/89 HCV RNA positive sera. HCV-Ag was undetectable in all 16 HCV RNA negative samples. The sample with the lowest viral load that tested positive for HCV-Ag contained 1200 IU/mL HCV RNA. There was a positive correlation between HCV RNA and HCV-Ag (r=0.89). The HCV RNA/HCV Ag ratio varied from 1.5 to 3.25. Conclusion The HCV core Ag is an easy test with comparable sensitivity (>90%) and satisfactory correlation with the HCV RNA bDNA assay. Its role in diagnostics and other clinical applications has to be determined based on cost effectiveness. PMID:24714621
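A minimal sketch of the statistical comparison described above (Pearson correlation on log-transformed values and the per-sample RNA/Ag ratio) is given below. The numbers are synthetic and the assumed linear relation between log RNA and log Ag is illustrative only.

```python
# Illustrative sketch (synthetic values, not the study's data): correlate
# log-transformed HCV RNA with HCV core antigen and report the per-sample
# ratio of the log-transformed quantities.
import numpy as np

rng = np.random.default_rng(2)
log_rna = rng.uniform(3.1, 7.0, 89)                        # log10 HCV RNA, above ~1200 IU/mL
log_ag = (log_rna - 1.3) / 2.0 + rng.normal(0, 0.2, 89)    # assumed relation plus scatter

r = np.corrcoef(log_rna, log_ag)[0, 1]                     # Pearson r on log values
ratio = log_rna / log_ag                                   # HCV RNA / HCV Ag ratio (log scale)
print(f"r = {r:.2f}, ratio range = {ratio.min():.2f} to {ratio.max():.2f}")
```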
Designing for time-dependent material response in spacecraft structures
NASA Technical Reports Server (NTRS)
Hyer, M. W.; Oleksuk, Lynda L. S.; Bowles, D. E.
1992-01-01
To study the influence on overall deformations of the time-dependent constitutive properties of fiber-reinforced polymeric matrix composite materials being considered for use in orbiting precision segmented reflectors, simple sandwich beam models are developed. The beam models include layers representing the face sheets, the core, and the adhesive bonding of the face sheets to the core. A three-layer model lumps the adhesive layers with the face sheets or core, while a five-layer model considers the adhesive layers explicitly. The deformation response of the three-layer and five-layer sandwich beam models to a midspan point load is studied. This elementary loading leads to a simple analysis, and it is easy to create this loading in the laboratory. Using the correspondence principle of viscoelasticity, the models representing the elastic behavior of the two beams are transformed into time-dependent models. Representative cases of time-dependent material behavior for the facesheet material, the core material, and the adhesive are used to evaluate the influence of these constituents being time-dependent on the deformations of the beam. As an example of the results presented, if it is assumed that, as a worst case, the polymer-dominated shear properties of the core behave as a Maxwell fluid such that under constant shear stress the shear strain increases by a factor of 10 in 20 years, then it is shown that the beam deflection increases by a factor of 1.4 during that time. In addition to quantitative conclusions, several assumptions are discussed which simplify the analyses for use with more complicated material models. Finally, it is shown that the simpler three-layer model suffices in many situations.
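One way to parameterize the stated worst-case core behavior is the Maxwell-fluid creep law sketched below; the factor-of-10 shear strain growth over 20 years fixes the ratio of viscosity to shear modulus. The resulting factor-of-1.4 deflection growth then follows from the sandwich beam model via the correspondence principle, which is not reproduced here.

```latex
% Sketch: Maxwell fluid in shear (shear modulus G, viscosity \eta) under
% constant shear stress \tau_0, matched to the abstract's worst-case assumption.
\begin{aligned}
\dot{\gamma}(t) &= \frac{\dot{\tau}}{G} + \frac{\tau}{\eta}
  \;\;\Rightarrow\;\;
  \gamma(t) = \tau_0\!\left(\frac{1}{G} + \frac{t}{\eta}\right)
  \quad (\tau = \tau_0\ \text{constant}),\\[4pt]
\frac{\gamma(t)}{\gamma(0)} &= 1 + \frac{G\,t}{\eta} = 10
  \ \text{at}\ t = 20\ \text{yr}
  \;\;\Rightarrow\;\;
  \frac{\eta}{G} = \frac{20}{9}\ \text{yr} \approx 2.2\ \text{yr}.
\end{aligned}
```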
NASA Astrophysics Data System (ADS)
Ziegler, Leah; Stoner, Joseph
2013-04-01
The dynamic changes in the Earth's magnetic field, caused by fluid motions in its outer core, can be captured in global marine sediments. Here we extend recent efforts to reconstruct Holocene paleomagnetic secular variation and environmental conditions in the mid-high latitude North Pacific with analyses of a marine sediment core taken from Prince William Sound, southern Alaska. Natural and laboratory remanent magnetizations were studied by progressive alternating field (AF) demagnetization of u-channel samples from jumbo piston core EW0408-95JC (60.66278N, 147.70847W, water depth 745 m). The lithology is monitored by physical properties measurements, including CT scans and core descriptions. The lithology of the upper 8.5 m of the 17.6 meter core consists primarily of magnetically homogeneous bioturbated muds. Component directions calculated by PCA analysis are characterized by low MAD values (<4°), with inclinations consistent with GAD predictions and declinations varying in a manner consistent with PSV. Normalized remanences are comparable using a variety of normalizers and show minimal scatter through demagnetization, suggesting that reliable paleointensity estimates may be preserved. A detailed chronology developed from calibrated radiocarbon dating of benthic forams shows that the 8.5 m spans ~1500 years, and yields sedimentation rates of several hundred cm/kyr - ultra high for marine sediments. Comparison with Pacific Northwest and broader North American records provides a degree of reproducibility and allows us to assess the spatial scale of signal coherence at centennial resolution. The resulting record of paleosecular variation (PSV) and relative paleointensity is consistent with predictions from global geomagnetic field models, yet allows investigations of rates of change of the local field that cannot be accessed from global field models.
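A sketch of the principal component analysis step mentioned above (a Kirschvink-style line fit to demagnetization vector endpoints, with the maximum angular deviation as a quality measure) is given below. The demagnetization data are synthetic, and the function is an illustration rather than the processing code used for core EW0408-95JC.

```python
# Sketch: PCA on AF demagnetization data to estimate a characteristic component
# direction and the maximum angular deviation (MAD) after Kirschvink (1980).
import numpy as np

def pca_direction(xyz, anchored=False):
    """xyz: (n, 3) magnetization vector endpoints over progressive AF steps."""
    pts = np.asarray(xyz, float)
    if not anchored:
        pts = pts - pts.mean(axis=0)               # free (unanchored) line fit
    evals, evecs = np.linalg.eigh(pts.T @ pts)     # orientation tensor eigenanalysis
    order = np.argsort(evals)[::-1]
    l1, l2, l3 = evals[order]
    direction = evecs[:, order[0]]                 # best-fit line (sign is ambiguous)
    mad = np.degrees(np.arctan(np.sqrt((l2 + l3) / l1)))
    dec = np.degrees(np.arctan2(direction[1], direction[0])) % 360.0
    inc = np.degrees(np.arcsin(direction[2]))
    return dec, inc, mad

# Synthetic decay along a fixed direction plus small noise
steps = np.linspace(1.0, 0.1, 12)[:, None]
true_dir = np.array([0.4, 0.3, 0.87])
xyz = steps * true_dir + np.random.default_rng(3).normal(0, 0.01, (12, 3))
print(pca_direction(xyz))   # low MAD (< 4 deg) indicates a well-defined component
```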
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rouxelin, Pascal Nicolas; Strydom, Gerhard
Best-estimate plus uncertainty analysis of reactors is replacing the traditional conservative (stacked uncertainty) method for safety and licensing analysis. To facilitate uncertainty analysis applications, a comprehensive approach and methodology must be developed and applied. High temperature gas cooled reactors (HTGRs) have several features that require techniques not used in light-water reactor analysis (e.g., coated-particle design and large graphite quantities at high temperatures). The International Atomic Energy Agency has therefore launched the Coordinated Research Project on HTGR Uncertainty Analysis in Modeling to study uncertainty propagation in the HTGR analysis chain. The benchmark problem defined for the prismatic design is represented by the General Atomics Modular HTGR 350. The main focus of this report is the compilation and discussion of the results obtained for various permutations of Exercise I 2c and the use of the cross section data in Exercise II 1a of the prismatic benchmark, which are defined as the last and first steps of the lattice and core simulation phases, respectively. The report summarizes the Idaho National Laboratory (INL) best-estimate results obtained for Exercise I 2a (fresh single-fuel block), Exercise I 2b (depleted single-fuel block), and Exercise I 2c (super cell), in addition to the first results of an investigation into the cross section generation effects for the super-cell problem. The two-dimensional deterministic code known as the New ESC-based Weighting Transport (NEWT) code, included in the Standardized Computer Analyses for Licensing Evaluation (SCALE) 6.1.2 package, was used for the cross section evaluation, and the results obtained were compared to the three-dimensional stochastic SCALE module KENO VI. The NEWT cross section libraries were generated for several permutations of the current benchmark super-cell geometry and were then provided as input to the Phase II core calculation of the stand-alone neutronics Exercise II 1a. The steady-state core calculations were simulated with the INL coupled-code system known as the Parallel and Highly Innovative Simulation for INL Code System (PHISICS) and the system thermal-hydraulics code known as the Reactor Excursion and Leak Analysis Program (RELAP5-3D), using the nuclear data libraries previously generated with NEWT. It was observed that significant differences in terms of multiplication factor and neutron flux exist between the various permutations of the Phase I super-cell lattice calculations. The use of these cross section libraries only leads to minor changes in the Phase II core simulation results for fresh fuel but shows significantly larger discrepancies for spent fuel cores. Furthermore, large incongruities were found between the SCALE NEWT and KENO VI results for the super cells, and while some trends could be identified, a final conclusion on this issue could not yet be reached. This report will be revised in mid-2016 with more detailed analyses of the super-cell problems and their effects on the core models, using the latest version of SCALE (6.2). The super-cell models seem to show substantial improvements in terms of neutron flux as compared to single-block models, particularly at thermal energies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kontogeorgakos, D.; Derstine, K.; Wright, A.
2013-06-01
The purpose of the TREAT reactor is to generate large transient neutron pulses in test samples, without overheating the core, to simulate fuel assembly accident conditions. The power transients in the present HEU core are inherently self-limiting such that the core prevents itself from overheating even in the event of a reactivity insertion accident. The objective of this study was to support the assessment of the feasibility of the TREAT core conversion based on the present reactor performance metrics and the technical specifications of the HEU core. The LEU fuel assembly studied had the same overall design, materials (UO2 particles finely dispersed in graphite) and impurities content as the HEU fuel assembly. The Monte Carlo N-Particle code (MCNP) and the point kinetics code TREKIN were used in the analyses.
Laboratory investigations of earthquake dynamics
NASA Astrophysics Data System (ADS)
Xia, Kaiwen
In this thesis, earthquake dynamics are investigated through controlled laboratory experiments designed to mimic natural earthquake scenarios. The earthquake dynamic rupturing process itself is a complicated phenomenon, involving dynamic friction, wave propagation, and heat production. Because controlled experiments can produce results without the assumptions needed in theoretical and numerical analysis, the experimental method is advantageous over theoretical and numerical methods. Our laboratory fault is composed of carefully cut photoelastic polymer plates (Homalite-100, Polycarbonate) held together by uniaxial compression. As a unique unit of the experimental design, a controlled exploding wire technique provides the triggering mechanism of laboratory earthquakes. Three important components of real earthquakes (i.e., pre-existing fault, tectonic loading, and triggering mechanism) correspond to and are simulated by frictional contact, uniaxial compression, and the exploding wire technique. Dynamic rupturing processes are visualized using the photoelastic method and are recorded via a high-speed camera. Our experimental methodology, which is full-field, in situ, and non-intrusive, has better control and diagnostic capacity compared to other existing experimental methods. Using this experimental approach, we have investigated several problems: dynamics of earthquake faulting occurring along homogeneous faults separating identical materials, earthquake faulting along inhomogeneous faults separating materials with different wave speeds, and earthquake faulting along faults with a finite low wave speed fault core. We have observed supershear ruptures, subRayleigh-to-supershear rupture transition, crack-like to pulse-like rupture transition, self-healing (Heaton) pulses, and rupture directionality.
NASA Astrophysics Data System (ADS)
Angulo, A. A.; Kuranz, C. C.; Drake, R. P.; Huntington, C. M.; Park, H.-S.; Remington, B. A.; Kalantar, D.; MacLaren, S.; Raman, K.; Miles, A.; Trantham, Matthew; Kline, J. L.; Flippo, K.; Doss, F. W.; Shvarts, D.
2016-10-01
This poster will describe simulations based on results from ongoing laboratory astrophysics experiments at the National Ignition Facility (NIF) relevant to the effects of radiative shocks on hydrodynamically unstable surfaces. The experiments performed on NIF uniquely provide the conditions required to emulate the radiative shocks that occur in astrophysical systems. The core-collapse explosion of a red supergiant star is one such example, wherein the interaction between the supernova ejecta and the circumstellar medium creates a region susceptible to Rayleigh-Taylor (R-T) instabilities. Radiative and nonradiative experiments were performed to show that R-T growth should be reduced by the effects of the radiative shocks that occur during this core collapse. Simulations were performed with the radiation hydrodynamics code Hyades, using the experimental conditions to find the mean interface acceleration of the instability, and were then further analyzed with the buoyancy-drag model to observe how the material expansion contributes to the mix-layer growth. This work is funded by the NNSA-DS and SC-OFES Joint Program in High-Energy-Density Laboratory Plasmas under Grant Number DE-FG52-09NA29548.
An International Ki67 Reproducibility Study
2013-01-01
Background In breast cancer, immunohistochemical assessment of proliferation using the marker Ki67 has potential use in both research and clinical management. However, lack of consistency across laboratories has limited Ki67’s value. A working group was assembled to devise a strategy to harmonize Ki67 analysis and increase scoring concordance. Toward that goal, we conducted a Ki67 reproducibility study. Methods Eight laboratories received 100 breast cancer cases arranged into 1-mm core tissue microarrays—one set stained by the participating laboratory and one set stained by the central laboratory, both using antibody MIB-1. Each laboratory scored Ki67 as percentage of positively stained invasive tumor cells using its own method. Six laboratories repeated scoring of 50 locally stained cases on 3 different days. Sources of variation were analyzed using random effects models with log2-transformed measurements. Reproducibility was quantified by intraclass correlation coefficient (ICC), and the approximate two-sided 95% confidence intervals (CIs) for the true intraclass correlation coefficients in these experiments were provided. Results Intralaboratory reproducibility was high (ICC = 0.94; 95% CI = 0.93 to 0.97). Interlaboratory reproducibility was only moderate (central staining: ICC = 0.71, 95% CI = 0.47 to 0.78; local staining: ICC = 0.59, 95% CI = 0.37 to 0.68). Geometric mean of Ki67 values for each laboratory across the 100 cases ranged 7.1% to 23.9% with central staining and 6.1% to 30.1% with local staining. Factors contributing to interlaboratory discordance included tumor region selection, counting method, and subjective assessment of staining positivity. Formal counting methods gave more consistent results than visual estimation. Conclusions Substantial variability in Ki67 scoring was observed among some of the world’s most experienced laboratories. Ki67 values and cutoffs for clinical decision-making cannot be transferred between laboratories without standardizing scoring methodology because analytical validity is limited. PMID:24203987
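As an illustration of the reproducibility metric used above, the sketch below computes a one-way random-effects intraclass correlation coefficient on log2-transformed Ki67 scores, with cases as the random factor and repeat scorings as replicates. The data are synthetic and the model is a simplification of the random effects models used in the study.

```python
# Sketch of a one-way random-effects ICC on log2-transformed Ki67 scores.
# Not the working group's analysis code; scores below are synthetic.
import numpy as np

def icc_oneway(scores):
    """scores: (n_cases, k_repeats) array of log2 Ki67 measurements."""
    y = np.asarray(scores, float)
    n, k = y.shape
    grand = y.mean()
    ms_between = k * np.sum((y.mean(axis=1) - grand) ** 2) / (n - 1)
    ms_within = np.sum((y - y.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))
    # ICC(1,1): proportion of total variance attributable to true case differences
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

rng = np.random.default_rng(4)
true = rng.uniform(np.log2(5), np.log2(40), 50)            # 50 cases, Ki67 roughly 5-40%
repeats = true[:, None] + rng.normal(0, 0.15, (50, 3))     # 3 scoring days per case
print(f"ICC = {icc_oneway(repeats):.2f}")
```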
Tinegate, H N; Davies, T; Elshaw, R J; Jane, G; Lyon, M; Norfolk, D R; Plews, D E; Troy, C B; Watson, D
2010-08-01
This study was undertaken to provide data relating to the timing of laboratory crossmatch procedures, and the source of requests for out of hours crossmatch, to support interpretation of error reports originating in the transfusion laboratory, received by the Serious Hazards of Transfusion haemovigilance scheme. Data on the timing, origin and urgency of all crossmatch requests were collected in 34 hospitals in northern England over a 7-day period in 2008. Additional data on clinical urgency were collected on crossmatches that were performed out of hours. Data were obtained on 2423 crossmatches, including 610 (25.2%) performed outside core hours. 30.3% of out of hours crossmatch requests were for transfusions that were set up outside 4 h of completion of the crossmatch. 2008 Serious Hazards of Transfusion data showed that 29/39 (74%) of laboratory errors resulting in 'wrong blood' occurred out of hours whilst our audit shows that only 25% of crossmatch requests are made in that time period, suggesting that crossmatching performed outside core hours carries increased risks. The reason for increased risk of error needs further research, but 25 laboratories had only one member of staff working out of hours, often combining blood transfusion, haematology and coagulation work. A total of 25% of out of hours requests were not clinically urgent. Hospitals should develop policies to define indications for out of hours transfusion testing, empower laboratory staff to challenge inappropriate requests and ensure that staffing and expertise is appropriate for the workload at all times.
The European Network of Analytical and Experimental Laboratories for Geosciences
NASA Astrophysics Data System (ADS)
Freda, Carmela; Funiciello, Francesca; Meredith, Phil; Sagnotti, Leonardo; Scarlato, Piergiorgio; Troll, Valentin R.; Willingshofer, Ernst
2013-04-01
Integrating Earth Sciences infrastructures in Europe is the mission of the European Plate Observing System (EPOS). The integration of European analytical, experimental, and analogue laboratories plays a key role in this context and is the task of the EPOS Working Group 6 (WG6). Despite the presence in Europe of high performance infrastructures dedicated to geosciences, there is still limited collaboration in sharing facilities and best practices. The EPOS WG6 aims to overcome this limitation by pushing towards national and trans-national coordination, efficient use of current laboratory infrastructures, and future aggregation of facilities not yet included. This will be attained through the creation of common access and interoperability policies to foster and simplify personnel mobility. The EPOS ambition is to orchestrate European laboratory infrastructures with diverse, complementary tasks and competences into a single, but geographically distributed, infrastructure for rock physics, palaeomagnetism, analytical and experimental petrology and volcanology, and tectonic modeling. The WG6 is presently organizing its thematic core services within the EPOS distributed research infrastructure with the goal of joining the other EPOS communities (geologists, seismologists, volcanologists, etc.) and stakeholders (engineers, risk managers and other geosciences investigators) to: 1) develop tools and services to enhance visitor programs that will mutually benefit visitors and hosts (transnational access); 2) improve support and training activities to make facilities equally accessible to students, young researchers, and experienced users (training and dissemination); 3) collaborate in sharing technological and scientific know-how (transfer of knowledge); 4) optimize interoperability of distributed instrumentation by standardizing data collection, archiving, and quality control (data preservation and interoperability); 5) implement a unified e-Infrastructure for data analysis, numerical modelling, and joint development and standardization of numerical tools (e-science implementation); 6) collect and store data in a flexible inventory database accessible within and beyond the Earth Sciences community (open access and outreach); 7) connect to environmental and hazard protection agencies, stakeholders, and the public to raise consciousness of geo-hazards and geo-resources (innovation for society). We will inform scientists and industrial stakeholders about the most recent WG6 achievements in EPOS and we will show how our community is proceeding to design the thematic core services.
NASA Astrophysics Data System (ADS)
Carpenter, B. M.; Marone, C.; Saffer, D. M.
2010-12-01
The debate concerning the apparent low strength of tectonic faults, including the San Andreas Fault (SAF), continues to focus on: 1) low intrinsic friction resulting from mineralogy and/or fabric, and 2) decreased effective normal stress due to elevated pore pressure. Here we inform this debate with laboratory measurements of the frictional behavior and permeability of cuttings and core returned from the SAF at a vertical depth of 2.7 km. We conducted experiments on cuttings and core recovered during SAFOD Phase III drilling. All samples in this study are adjacent to and within the active fault zone penetrated at 10814.5 ft (3296 m) measured depth in the SAFOD borehole. We sheared gouge samples composed of drilling cuttings in a double-direct shear configuration subject to true-triaxial loading under constant effective normal stress, confining pressure, and pore pressure. Intact wafers of material were sheared in a single-direct shear configuration under similar conditions of effective stress, confining pressure, and pore pressure. We also report on permeability measurements on intact wafers of wall rock and fault gouge prior to shearing. Initial results from experiments on cuttings show: 1) a weak fault (µ = ~0.21) compared to the surrounding wall rock (µ = ~0.35), 2) velocity-strengthening behavior (a-b > 0), consistent with aseismic slip, and 3) near-zero healing rates in material from the active fault. XRD analysis on cuttings indicates that the main mineralogical difference between fault rock and wall rock is the presence of significant amounts of smectite within the fault rock. Taken together, the measured frictional behavior and clay mineral content suggest that the clay composition exerts a basic control on fault behavior. Our results document the first direct evidence of weak material from an active fault at seismogenic depths. In addition, our results could explain why the SAF in central California fails aseismically and hosts only small earthquakes.
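For orientation, the sketch below shows the standard relations behind two of the reported quantities: the friction coefficient under the effective-stress law and the rate-and-state parameter (a - b) estimated from a velocity step. All numerical values are hypothetical and are not the SAFOD measurements.

```python
# Illustrative relations only; values are hypothetical, not the SAFOD data.
import numpy as np

def friction(shear_stress, normal_stress, pore_pressure):
    # effective-stress law: mu = tau / (sigma_n - p)
    return shear_stress / (normal_stress - pore_pressure)

def a_minus_b(mu_ss_before, mu_ss_after, v_before, v_after):
    # (a - b) = d(mu_ss) / d(ln V); positive => velocity strengthening (aseismic creep)
    return (mu_ss_after - mu_ss_before) / np.log(v_after / v_before)

mu = friction(shear_stress=5.2, normal_stress=40.0, pore_pressure=15.0)  # MPa, assumed
ab = a_minus_b(0.210, 0.212, v_before=1.0, v_after=10.0)                 # assumed velocity step
print(f"mu = {mu:.2f}, (a - b) = {ab:.4f}")
```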
Hardware accelerated high performance neutron transport computation based on AGENT methodology
NASA Astrophysics Data System (ADS)
Xiao, Shanjie
The spatial heterogeneity of next-generation Gen-IV nuclear reactor core designs brings challenges to neutron transport analysis. The Arbitrary Geometry Neutron Transport (AGENT) code is a three-dimensional neutron transport analysis code being developed at the Laboratory for Neutronics and Geometry Computation (NEGE) at Purdue University. It can accurately describe spatial heterogeneity in a hierarchical structure through the R-function solid modeler. The previous version of AGENT coupled a 2D transport MOC solver and a 1D diffusion NEM solver to solve the three-dimensional Boltzmann transport equation. In this research, the 2D/1D coupling methodology was expanded to couple two transport solvers, a radial 2D MOC solver and an axial 1D MOC solver, for better accuracy. The expansion was benchmarked with the widely applied C5G7 benchmark models and two fast breeder reactor models, and showed good agreement with the reference Monte Carlo results. In practice, accurate neutron transport analysis for a full reactor core is still time-consuming, which limits its application. Therefore, another part of this research focused on designing specific hardware, based on reconfigurable computing techniques, to accelerate AGENT computations. This is the first time an application of this type has been used in reactor physics and neutron transport for reactor design. The most time-consuming part of the AGENT algorithm was identified, and the architecture of the AGENT acceleration system was designed based on that analysis. Through parallel computation on the specially designed, highly efficient architecture, the FPGA-based acceleration design achieves high performance at a much lower working frequency than CPUs. Whole-design simulations show that the acceleration design would be able to speed up large-scale AGENT computations by about a factor of 20. The high-performance AGENT acceleration system will drastically shorten the computation time for 3D full-core neutron transport analysis, making the AGENT methodology unique and advantageous, and thus extending the applicability of neutron transport analysis in both industrial engineering and academic research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martino, C
The Department of Energy (DOE) recognizes the need for the characterization of High-Level Waste (HLW) saltcake in the Savannah River Site (SRS) F- and H-area tank farms to support upcoming salt processing activities. As part of the enhanced characterization efforts, Tank 25F will be sampled and the samples analyzed at the Savannah River National Laboratory (SRNL). This Task Technical and Quality Assurance Plan documents the planned activities for the physical, chemical, and radiological analysis of the Tank 25F saltcake core samples. This plan does not cover other characterization activities that do not involve core sample analysis and it does not address issues regarding sampling or sample transportation. The objectives of this report are: (1) Provide information useful in projecting the composition of dissolved salt batches by quantifying important components (such as actinides, 137Cs, and 90Sr) on a per batch basis. This will assist in process selection for the treatment of salt batches and provide data for the validation of dissolution modeling. (2) Determine the properties of the heel resulting from dissolution of the bulk saltcake. Also note tendencies toward post-mixing precipitation. (3) Provide a basis for determining the number of samples needed for the characterization of future saltcake tanks. Gather information useful towards performing characterization in a manner that is more cost and time effective.
Properties of the Lunar Interior: Preliminary Results from the GRAIL Mission
NASA Technical Reports Server (NTRS)
Williams, James G.; Konopliv, Alexander S.; Asmar, Sami W.; Lemoine, Frank G.; Melosh, H. Jay; Neumann, Gregory A.; Phillips, Roger J.; Smith, David E.; Solomon, Sean C.; Watkins, Michael M.;
2013-01-01
The Gravity Recovery and Interior Laboratory (GRAIL) mission [1] has provided lunar gravity with unprecedented accuracy and resolution. GRAIL has produced a high-resolution map of the lunar gravity field [2,3] while also determining tidal response. We present the latest gravity field solution and its preliminary implications for the Moon's interior structure, exploring properties such as the mean density, moment of inertia of the solid Moon, and tidal potential Love number k2. Lunar structure includes a thin crust, a thick mantle layer, a fluid outer core, and a suspected solid inner core. An accurate Love number mainly improves knowledge of the fluid core and deep mantle. In the future, we will search for evidence of tidal dissipation and a solid inner core using GRAIL data.
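For context, the degree-2 potential Love number referenced above is conventionally defined as the ratio of the induced external potential to the applied tidal potential at the body's surface; the relation below is the standard definition, not a GRAIL-specific result.

```latex
% Standard definition of the degree-2 potential Love number k_2:
% the potential induced by the body's tidal deformation, evaluated at its
% surface radius R, is proportional to the applied tidal potential.
\Phi_{\mathrm{induced}}(R,\theta,\varphi) = k_2 \, \Phi_{\mathrm{tide}}(R,\theta,\varphi)
```

A larger k2 implies a more deformable deep interior, which is why it constrains the fluid outer core and deep mantle.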
75 FR 51280 - National Institute of Allergy and Infectious Diseases; Notice of Closed Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-19
... Infectious Diseases Special Emphasis Panel, Nonhuman Primate Core Humoral Immunology Vaccine Laboratory. Date..., [email protected] . (Catalogue of Federal Domestic Assistance Program Nos. 93.855, Allergy, Immunology...
Concrete probe-strength study : final report.
DOT National Transportation Integrated Search
1969-12-01
The Windsor probe - test system was evaluated for determining compressive strength of concrete by comparing probe strengths against cylinder and core strengths from both laboratory and field-poured concrete. Advantages and disadvantages of this syste...
In-situ stress measurements using core-based methods in the vicinity of Nojima fault.
NASA Astrophysics Data System (ADS)
Yano, S.; Sugimoto, T.; Lin, W.; Lin, A.
2017-12-01
In the earthquake cycle, stress accumulates on the source fault and in its surroundings during the interseismic period and is released abruptly when the next earthquake occurs. However, the quantitative relationship between stress change and earthquake occurrence remains largely unknown. Hence, to improve our understanding of the mechanisms of earthquake generation, it is important to determine the stress state in the vicinity of the source fault and to evaluate its change over time. In this study, we carried out in-situ stress measurements using core samples obtained from a scientific borehole drilled through the Nojima fault, which ruptured during the 1995 Hyogo-ken Nanbu earthquake, Japan. Our stress measurements were conducted from 2016 to 2017, approximately 22 years after the earthquake. For this purpose, we applied the Anelastic Strain Recovery (ASR) method and Diametrical Core Deformation Analysis (DCDA). For ASR, we measured the time-dependent anelastic strain recovery of cores shortly after stress release and calculated three-dimensional principal in-situ stress orientations and magnitudes from the ASR data. To ensure a sufficient amount of recoverable strain, the measurements were started on cores collected within a short time (2.5-3.5 hours) after stress release by drilling, at an on-site laboratory at the drill site on Awaji Island, Japan. The site is located in the southwestern part of the Nojima fault. For DCDA, we measured core diameters over all (360°) azimuths and determined the difference between the two horizontal principal stresses and their orientations, using cores other than those used for ASR. The DCDA experiments were conducted indoors, long after core collection. All core samples used for ASR and DCDA are granite; 19 cores were used for ASR and 7 for DCDA. The results show that the stress state in the depth range of 500-560 m and around 822 m corresponds to a normal-faulting regime, whereas that at 711-730 m corresponds to a strike-slip regime. In summary, we have obtained a data set of the current state of stress around the Nojima fault using the two core-based stress measurement methods. In the future, we will determine the core orientations and restore the principal stress axis directions to geologic coordinates.
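For reference, DCDA as described above rests on fitting the measured diameter as a sinusoid of azimuth; under a linear-elastic assumption the amplitude of that sinusoid scales with the horizontal differential stress divided by an effective elastic modulus. The generic form below is background on the method, with the proportionality written loosely because the exact modulus factor is not stated in this abstract.

```latex
% Generic diametrical core deformation fit (azimuth \theta around the core):
d(\theta) \approx d_0 + \Delta d \, \cos 2(\theta - \theta_0),
\qquad
\sigma_{H\max} - \sigma_{h\min} \;\propto\; E^{*} \, \frac{\Delta d}{d_0}
```

Here θ0 gives the azimuth of the maximum horizontal stress in the core frame, and E* denotes an effective elastic modulus of the rock (an assumption for this sketch).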
NASA Astrophysics Data System (ADS)
Kneafsey, T. J.; Nakagawa, S.
2015-12-01
Distribution of supercritical (sc) CO2 has a large impact on its flow behavior as well as on the properties of seismic waves used for monitoring. Simultaneous imaging of scCO2 distribution in a rock core using X-ray computed tomography (CT) and measurements of seismic waves in the laboratory can help us understand how the distribution evolves as scCO2 invades the rock, and the resulting seismic signatures. To this end, we performed a series of laboratory scCO2 core-flood experiments in intact and fractured anisotropic Carbon Tan sandstone samples. In these experiments, we monitored changes in the CO2 saturation distribution and sonic-frequency acoustic resonances (yielding both seismic velocity and attenuation) over the course of the floods. A short-core resonant bar test system (Split-Hopkinson Resonant Bar Apparatus) custom fit into a long X-ray transparent pressure vessel was used for the seismic measurements, and a modified General Electric medical CT scanner was used to acquire X-ray CT data from which scCO2 saturation distributions were determined. The focus of the experiments was on the impact of single fractures on the scCO2 distribution and the seismic properties. For this reason, we examined several cases: (1) an intact core, (2) a closely mated fracture along the core axis, (3) a sheared fracture along the core axis (both vertical and horizontal, to examine the buoyancy effect), and (4) a sheared fracture perpendicular to the core axis. For the intact and closely mated fractured cores, Young's modulus declined with increasing CO2 saturation, and attenuation increased up to about 15% CO2 saturation, after which it declined. For cores with wide axial fractures, Young's modulus was lower than for the intact and closely mated cases but did not change much with CO2 pore saturation. Much lower CO2 pore saturations were achieved in these cases. Attenuation, however, increased more rapidly than for the intact sample. For the core-perpendicular fracture, the Young's modulus decreased quickly with increasing CO2 saturation. Attenuation increased with increasing CO2 saturation until the CO2 front reached the fracture, after which it fell to below that for the brine-saturated case, increasing again as the CO2 invaded the downstream core region.
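For reference, resonant-bar measurements of the kind described above typically recover Young's modulus from the fundamental longitudinal resonance frequency and attenuation from the half-power bandwidth. The relations below are the idealized textbook forms for a free bar of length L and density ρ, not the exact processing applied to the Split-Hopkinson Resonant Bar data.

```latex
% Idealized free-free bar: Young's modulus from the fundamental longitudinal
% resonance f_1, and attenuation from the half-power bandwidth \Delta f
% about the resonance frequency f_r.
E \approx \rho \,(2 L f_1)^{2},
\qquad
Q^{-1} \approx \frac{\Delta f}{f_r}
```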
Deterministic Modeling of the High Temperature Test Reactor with DRAGON-HEXPEDITE
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. Ortensi; M.A. Pope; R.M. Ferrer
2010-10-01
The Idaho National Laboratory (INL) is tasked with the development of reactor physics analysis capability for the Next Generation Nuclear Plant (NGNP) project. In order to examine the INL's current prismatic reactor analysis tools, the project is conducting a benchmark exercise based on modeling the High Temperature Test Reactor (HTTR). This exercise entails the development of a model for the initial criticality, a 19-fuel-column thin annular core, and the fully loaded core critical condition with 30 fuel columns. Special emphasis is devoted to physical phenomena and artifacts in HTTR that are similar to phenomena and artifacts in the NGNP base design. The DRAGON code is used in this study since it offers significant ease and versatility in modeling prismatic designs. DRAGON can generate transport solutions via Collision Probability (CP), Method of Characteristics (MOC) and Discrete Ordinates (Sn) methods. A fine-group cross-section library based on the SHEM 281 energy structure is used in the DRAGON calculations. The results from this study show reasonable agreement between the calculated core multiplication factor and Monte Carlo (MC) results, but a consistent bias of 2–3% with respect to the experimental values is obtained. This systematic error has also been observed in other HTTR benchmark efforts and is well documented in the literature. The ENDF/B-VII graphite and 235U cross sections appear to be the main source of the error. The isothermal temperature coefficients calculated with the fully loaded core configuration agree well with those of other benchmark participants but are 40% higher than the experimental values. This discrepancy with the measurement partially stems from the fact that during the experiments the control rods were adjusted to maintain criticality, whereas in the model the rod positions were fixed. In addition, this work includes a brief study of a cross-section generation approach that seeks to decouple the domain in order to account for neighbor effects. This spectral interpenetration is a dominant effect in annular HTR physics. This analysis methodology should be further explored in order to reduce the error that is systematically propagated in the traditional generation of cross sections.
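For clarity on the quantities being compared, the standard definitions of reactivity and the isothermal temperature coefficient (ITC) are shown below; these are general reactor-physics conventions, not equations quoted from the report.

```latex
% Reactivity from the effective multiplication factor, and the isothermal
% temperature coefficient as the reactivity change per unit temperature:
\rho(T) = \frac{k_{\mathrm{eff}}(T) - 1}{k_{\mathrm{eff}}(T)},
\qquad
\alpha_{\mathrm{ITC}} \approx \frac{\rho(T_2) - \rho(T_1)}{T_2 - T_1}
```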
NASA Astrophysics Data System (ADS)
Muchitsch, Nanna; Van Nooten, Thomas; Bastiaens, Leen; Kjeldsen, Peter
2011-11-01
An important concern for permeable reactive iron barriers is their long-term efficiency over the long operational periods required. Mineral precipitation resulting from anaerobic corrosion of the iron filings and the bacteria present in the barrier may play an important role in the long-term performance. An integrated study of the Vapokon permeable reactive barrier (PRB) in Denmark was performed through groundwater and iron core sample characterization. Detailed field groundwater sampling from more than 75 well screens upstream and downstream of the barrier showed very efficient removal (>99%) of the most important chlorinated aliphatic hydrocarbons (CAHs: PCE, TCE and 1,1,1-TCA). However, significant formation of cis-DCE within the PRB resulted in an overall insufficient efficiency for cis-DCE removal. Detailed analysis of the upstream groundwater revealed a very heterogeneous spatial distribution of contaminant loading into the PRB, with the result that only about a quarter of the barrier system treats significant loads of CAHs. Laboratory batch experiments using contaminated groundwater from the site and iron material from the core samples revealed that the aged iron material performed as well as virgin granular iron of the same type, based on the determined degradation rates, even though parts of the cored iron material were covered by mineral precipitates (especially iron sulfides, carbonate green rust and aragonite). PCR analysis of the iron core samples indicated the presence of a microbial consortium in the barrier. A wide range of species were identified, including sulfate- and iron-reducing bacteria, together with Dehalococcoides and Desulfuromonas species, indicating microbial reductive dehalogenation potential. The microbes had a profound effect on barrier performance, as indicated by significant degradation of dichloromethane (typically unaffected by zero-valent iron) within the barrier.
Nuclear modules for space electric propulsion
NASA Technical Reports Server (NTRS)
Difilippo, F. C.
1998-01-01
Analysis of interplanetary cargo and piloted missions requires calculations of the performances and masses of subsystems to be integrated into a final design. In a preliminary and scoping stage the designer needs to evaluate options iteratively by using fast computer simulations. The Oak Ridge National Laboratory (ORNL) has been involved in the development of models and calculational procedures for the analysis (neutronic and thermal-hydraulic) of power sources for nuclear electric propulsion. The nuclear modules will be integrated into the whole simulation of the nuclear electric propulsion system. The vehicles use either a Brayton direct-conversion cycle, using the heated helium from a NERVA-type reactor, or a potassium Rankine cycle, with the working fluid heated on the secondary side of a heat exchanger and lithium on the primary side coming from a fast reactor. Given a set of input conditions, the codes calculate the composition, dimensions, volumes, and masses of the core, reflector, control system, pressure vessel, neutron and gamma shields, as well as the thermal-hydraulic conditions of the coolant, clad, and fuel. Input conditions are the power, core life, pressure and temperature of the coolant at the core inlet, either the coolant temperature at the core outlet or the coolant mass flow, and the fluences and integrated doses at the cargo area. Using state-of-the-art neutron cross sections and transport codes, a database was created for the neutronic performance of both reactor designs. The free parameters of the models are the moderator/fuel mass ratio for the NERVA reactor and the enrichment and the pitch of the lattice for the fast reactor. Reactivity and energy balance equations are simultaneously solved to find the reactor design. Thermal-hydraulic conditions are calculated by solving the one-dimensional versions of the equations of conservation of mass, energy, and momentum with compressible flow.
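A minimal version of the link between the power and coolant inputs listed above is the steady-state core energy balance; it is shown here as general background (the actual ORNL codes solve the full one-dimensional compressible-flow conservation equations, which are not reproduced).

```latex
% Steady-state core energy balance: given any three of the reactor power P,
% coolant mass flow \dot{m}, inlet temperature T_in and outlet temperature
% T_out, the fourth follows (c_p is the coolant specific heat).
P = \dot{m} \, c_p \, \left( T_{\mathrm{out}} - T_{\mathrm{in}} \right)
```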
NASA Astrophysics Data System (ADS)
Yoneda, J.; Oshima, M.; Kida, M.; Kato, A.; Konno, Y.; Jin, Y.; Waite, W. F.; Jang, J.; Kumar, P.; Tenma, N.
2017-12-01
Pressure coring and analysis technology allows for gas hydrate to be recovered from the deep seabed, transferred to the laboratory and characterized while continuously maintaining gas hydrate stability. For this study, dozens of hydrate-bearing pressure core sediment subsections recovered from the Krishna-Godavari Basin during India's National Gas Hydrate Program Expedition NGHP-02 were tested with Pressure Core Non-destructive Analysis Tools (PNATs) through a collaboration between Japan and India. PNATs, originally developed by AIST as a part of the Japanese National hydrate research program (MH21, funded by METI), were used to conduct permeability, compression and consolidation tests under various effective stress conditions, including the in situ stress state estimated from downhole bulk density measurements. At the in situ effective stress, gas hydrate-bearing sediments had an effective permeability range of 0.01-10 mD even at pore-space hydrate saturations above 60%. Permeability increased by 10 to 100 times after hydrate dissociation at the same effective stress, but these post-dissociation gains were erased when effective stress was increased from in situ values (approximately 1 MPa) to 10 MPa in a simulation of the depressurization method for methane extraction from hydrate. Vertical-to-horizontal permeability anisotropy was also investigated. First-ever multi-stage loading tests and strain-rate alternation compression tests were successfully conducted to evaluate the dependence of sediment strengthening on the rate and magnitude of effective confining stress changes. In addition, oedometer tests were performed up to 40 MPa of consolidation stress to simulate the depressurization method in ultra-deep sea environments. Consolidation curves measured with and without gas hydrate were investigated over a wide range of effective confining stresses. Compression curves for gas hydrate-bearing sediments were convex downward due to high hydrate saturations. Consolidation tests show that, regardless of the consolidation history with hydrate in place, the consolidation behavior after dissociation will first return to, then follow, the original normal consolidation curve for the hydrate-free host sediment.
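For reference, the effective permeabilities quoted above are obtained in steady-flow core tests of this kind from Darcy's law; the form below is the standard single-phase expression, not the specific reduction used in the PNAT analysis.

```latex
% Darcy's law for steady single-phase flow through a core of length L and
% cross-sectional area A, with volumetric flow rate Q, fluid viscosity \mu,
% and pressure drop \Delta P across the core:
k = \frac{Q \, \mu \, L}{A \, \Delta P}
```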
Phase Coexistence in Insect Swarms
NASA Astrophysics Data System (ADS)
Sinhuber, Michael; Ouellette, Nicholas T.
2017-10-01
Animal aggregations are visually striking, and as such are popular examples of collective behavior in the natural world. Quantitatively demonstrating the collective nature of such groups, however, remains surprisingly difficult. Inspired by thermodynamics, we applied topological data analysis to laboratory insect swarms and found evidence for emergent, material-like states. We show that the swarms consist of a core "condensed" phase surrounded by a dilute "vapor" phase. These two phases coexist in equilibrium, and maintain their distinct macroscopic properties even though individual insects pass freely between them. We further define a pressure and chemical potential to describe these phases, extending theories of active matter to aggregations of macroscopic animals and laying the groundwork for a thermodynamic description of collective animal groups.
International Perspective on Government Nanotechnology Funding in 2005
NASA Astrophysics Data System (ADS)
Roco, M. C.
2005-12-01
The worldwide investment in nanotechnology research and development (R&D) reported by national government organizations and the EC has increased approximately 9-fold in the last 8 years, from $432 million in 1997 to about $4,100 million in 2005. The proportions of national government investment are: for academic R&D and education, between 20% (Korea, Taiwan) and 65% (US); for industrial R&D, between 5% (US) and 60% (Korea, Taiwan); and for core facilities and government laboratories, about 20-25% in all major contributing economies. This evaluation uses the NNI definition of nanotechnology (which excludes MEMS and microelectronics) and is based on direct information from, and analysis with, managers of nanotechnology R&D programs in the respective countries.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carmack, William Jonathan; Braase, Lori Ann
Fuel recovery from severe accidents requires careful planning and execution. The Idaho National Laboratory played a key role in the Three Mile Island (TMI) fuel and core recovery. This involved technology development to locate and handle the damaged fuel; characterization of fuel and debris; analysis of fuel interaction with structural components and materials; and development of fuel drying technology for long-term storage. However, one of the critical activities of the TMI project was the extensive effort to document all the activities and archive the reports and photos. A historical review of the TMI project at the INL leads to the identification of current applications and considerations for facility designs, fuel handling, robotic applications, material characterization, etc.
Precision and manufacturing at the Lawrence Livermore National Laboratory
NASA Technical Reports Server (NTRS)
Saito, Theodore T.; Wasley, Richard J.; Stowers, Irving F.; Donaldson, Robert R.; Thompson, Daniel C.
1994-01-01
Precision Engineering is one of the Lawrence Livermore National Laboratory's core strengths. This paper discusses the past and current technology transfer efforts of LLNL's Precision Engineering program and the Livermore Center for Advanced Manufacturing and Productivity (LCAMP). More than a year ago the Precision Machine Commercialization project embodied several successful methods of transferring high technology from the National Laboratories to industry. Currently, LCAMP has already demonstrated successful technology transfer and is involved in a broad spectrum of programs. In addition, this paper discusses other technologies ripe for future transition, including the Large Optics Diamond Turning Machine.
Next generation tools for genomic data generation, distribution, and visualization
2010-01-01
Background With the rapidly falling cost and increasing availability of high-throughput sequencing and microarray technologies, the bottleneck for effectively using genomic analysis in the laboratory and clinic is shifting to one of effectively managing, analyzing, and sharing genomic data. Results Here we present three open-source, platform-independent software tools for generating, analyzing, distributing, and visualizing genomic data. These include a next-generation sequencing/microarray LIMS and analysis project center (GNomEx); an application for annotating and programmatically distributing genomic data using the community-vetted DAS/2 data exchange protocol (GenoPub); and a standalone Java Swing application (GWrap) that makes cutting-edge command-line analysis tools available to those who prefer graphical user interfaces. Both GNomEx and GenoPub use a rich-client Flex/Flash web browser interface to interact with Java classes and a relational database on a remote server. Both employ a public-private user-group security model enabling controlled distribution of patient and unpublished data alongside public resources. As such, they function as genomic data repositories that can be accessed manually or programmatically through DAS/2-enabled client applications such as the Integrated Genome Browser. Conclusions These tools have gained wide use in our core facilities, research laboratories and clinics and are freely available for non-profit use. See http://sourceforge.net/projects/gnomex/, http://sourceforge.net/projects/genoviz/, and http://sourceforge.net/projects/useq. PMID:20828407
NASA Astrophysics Data System (ADS)
Jenk, Theo Manuel; Rubino, Mauro; Etheridge, David; Ciobanu, Viorela Gabriela; Blunier, Thomas
2016-08-01
Palaeoatmospheric records of carbon dioxide and its stable carbon isotope composition (δ13C) obtained from polar ice cores provide important constraints on the natural variability of the carbon cycle. However, the measurements are both analytically challenging and time-consuming; thus data exist for only a limited number of sampling sites and time periods. Additional analytical resources with high analytical precision and throughput are thus desirable to extend the existing datasets. Moreover, consistent measurements derived by independent laboratories and a variety of analytical systems help to further increase confidence in the global CO2 palaeo-reconstructions. Here, we describe our new set-up for simultaneous measurements of atmospheric CO2 mixing ratios and atmospheric δ13C and δ18O-CO2 in air extracted from ice core samples. The centrepiece of the system is a newly designed needle cracker for the mechanical release of air entrapped in ice core samples of 8-13 g operated at -45 °C. The small sample size allows for high resolution and replicate sampling schemes. In our method, CO2 is cryogenically and chromatographically separated from the bulk air and its isotopic composition subsequently determined by continuous flow isotope ratio mass spectrometry (IRMS). In combination with thermal conductivity measurement of the bulk air, the CO2 mixing ratio is calculated. The analytical precision determined from standard air sample measurements over ice is ±1.9 ppm for CO2 and ±0.09 ‰ for δ13C. In a laboratory intercomparison study with CSIRO (Aspendale, Australia), good agreement between CO2 and δ13C results is found for Law Dome ice core samples. Replicate analysis of these samples resulted in a pooled standard deviation of 2.0 ppm for CO2 and 0.11 ‰ for δ13C. These values are good, though rather conservative, estimates of the overall analytical precision achieved for single ice sample measurements. Facilitated by the small sample requirement, replicate measurements are feasible, potentially allowing the method precision to be improved further. Further, new analytical approaches are introduced for the accurate correction of the procedural blank and for a consistent detection of measurement outliers, which is based on δ18O-CO2 and the exchange of oxygen between CO2 and the surrounding ice (H2O).
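The pooled standard deviation quoted for the replicate Law Dome analyses follows the usual definition over groups of replicate measurements; the formula below is the standard statistical form, included here only for clarity.

```latex
% Pooled standard deviation over m groups of replicates, where group i has
% n_i replicate measurements with sample standard deviation s_i:
s_{\mathrm{pooled}} = \sqrt{\frac{\sum_{i=1}^{m} (n_i - 1)\, s_i^{2}}{\sum_{i=1}^{m} (n_i - 1)}}
```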
NASA Astrophysics Data System (ADS)
Millett, John; Haskins, Eric; Thomas, Donald; Jerram, Dougal; Planke, Sverre; Healy, Dave; Kück, Jochem; Rossetti, Lucas; Farrell, Natalie; Pierdominici, Simona
2017-04-01
Volcanic reservoirs are becoming increasingly important in the targeting of petroleum, geothermal and water resources globally. However, key areas of uncertainty in relation to volcanic reservoir properties during burial in different settings remain. In this contribution, we present results from borehole logging and sampling operations within two fully cored c. 1.5 km deep boreholes, PTA2 and KMA1, from the Humu'ula saddle region on the Big Island of Hawai'i. The boreholes were drilled as part of the Humu'ula Groundwater Research Project (HGRP) between 2013 and 2016 and provide unique insights into the evolution of pore structure with increasing burial in a basalt-dominated lava sequence. The boreholes encounter mixed sequences of 'a'ā, pāhoehoe and transitional lava flows along with subsidiary intrusions and sediments from the shield to post-shield phases of Mauna Kea. Borehole wireline data including sonic, spectral gamma and Televiewer imagery were collected along with density, porosity, permeability and ultrasonic velocity laboratory measurements from core samples. A range of intra-facies were sampled for analysis from various depths within the two boreholes. By comparison with core data, the potential for high-resolution Televiewer imaging to reveal spectacular intra-facies features including individual vesicles, vesicle segregations, 'a'ā rubble zones, intrusive contacts, and intricate pāhoehoe lava flow lobe morphologies is demonstrated. High-quality core data enable the calibration of Televiewer facies, improving the interpretation of volcanic reservoir features in the more common exploration scenario where core is absent. Laboratory results record the ability of natural vesicular basalt samples to host very high porosity (>50%) and permeability (>10 darcies) within lava flow-top facies, which we demonstrate are associated with vesicle coalescence rather than micro-fractures. These properties may be maintained to depths of c. 1.5 km in regions of limited alteration and secondary mineralization and therefore, in addition to fractures, may constitute important fluid pathways at depth. Alteration and porosity occlusion by secondary minerals is highly vertically compartmentalized and does not increase systematically with depth, implying a strong but heterogeneous lateral component in the migration and effects of hydrothermal fluids in these systems. The distribution and timing of dyke feeder zones, coupled with the scale and spatial distribution of lava flows making up the lava pile, are first-order influences on the preservation potential of volcanic reservoir properties during burial. Our results demonstrate the complex relationship between the primary hydrogeology of lava flow fields and the resulting effects of hydrothermal fluid circulation on reservoir property evolution with burial.
Andrew B. Reinmann; Pamela H. Templer; John L. Campbell
2012-01-01
Considerable progress has been made in understanding the impacts of soil frost on carbon (C) and nitrogen (N) cycling, but the effects of soil frost on C and N fluxes during snowmelt remain poorly understood. We conducted a laboratory experiment to determine the effects of soil frost on C and N fluxes from forest floor soils during snowmelt. Soil cores were collected...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luther, Erik Paul; Leckie, Rafael M.; Dombrowski, David E.
This supplemental report describes fuel fabrication efforts conducted for the Idaho National Laboratory Trade Study for the TREAT Conversion project that is exploring the replacement of the HEU (Highly Enriched Uranium) fuel core of the TREAT reactor with LEU (Low Enriched Uranium) fuel. Previous reports have documented fabrication of fuel by the “upgrade” process developed at Los Alamos National Laboratory. These experiments supplement an earlier report that describes efforts to increase the graphite content of extruded fuel and minimize cracking.
NASA Astrophysics Data System (ADS)
Kiaalhosseini, Saeed
In modern contaminant hydrology, management of contaminated sites requires a holistic characterization of subsurface conditions. Delineation of contaminant distribution in all phases (i.e., aqueous, non-aqueous liquid, sorbed, and gas), as well as associated biogeochemical processes in a complex heterogeneous subsurface, is central to selecting effective remedies. Arguably, a factor contributing to the lack of success in managing contaminated sites effectively has been the limitations of site characterization methods that rely on monitoring wells and grab sediment samples. The overarching objective of this research is to advance a set of third-generation (3G) site characterization methods to overcome shortcomings of current site characterization techniques. 3G methods include 1) cryogenic core collection (C3) from the unconsolidated geological subsurface to improve recovery of sediments and preserve key attributes, 2) high-throughput analysis (HTA) of frozen core in the laboratory to provide high-resolution, depth-discrete data on subsurface conditions and processes, 3) resolution of non-aqueous phase liquid (NAPL) distribution within the porous media using a nuclear magnetic resonance (NMR) method, and 4) application of a complex resistivity method to track NAPL depletion in shallow geological formations over time. A series of controlled experiments were conducted to develop the C3 tools and methods. The critical aspects of C3 are downhole circulation of liquid nitrogen via a cooling system, the strategic use of thermal insulation to focus cooling into the core, and the use of back pressure to optimize cooling. The C3 methods were applied at two contaminated sites: 1) F.E. Warren (FEW) Air Force Base near Cheyenne, WY and 2) a former refinery in the western U.S. The results indicated that the rate of core collection using the C3 methods is on the order of 30 feet per day. The C3 methods also improve core recovery and limit potential biases associated with flowing sands. HTA of frozen core was employed at the former refinery and FEW. Porosity and fluid saturations (i.e., aqueous, non-aqueous liquid, and gas) from the former refinery indicate that, given in situ freezing, the results are not biased by drainage of pore fluids from the core during sample collection. At FEW, a comparison between the results of HTA of the frozen core collected in 2014 and the results of site characterization using unfrozen core (second-generation (2G) methods) at the same locations (performed in 2010) indicates consistently higher contaminant concentrations using C3. Many factors contribute to the higher quantification of contaminant concentrations using C3. The most significant factor is the preservation of the sediment attributes, in particular pore fluids and volatile organic compounds (VOCs), in comparison to the unfrozen conventional sediment core. The NMR study was performed on laboratory-fabricated sediment cores to resolve NAPL distribution within the porous media qualitatively and quantitatively. The fabricated cores consisted of Colorado silica sand saturated with deionized water and trichloroethylene (TCE). The cores were scanned with a BRUKER small-animal scanner (2.3 Tesla, 100 MHz) at 20 °C and while the core was frozen at -25 °C. The acquired images indicated that freezing the water within the core suppressed the NMR signals of water-bound hydrogen. The hydrogen associated with TCE was still detectable since the TCE was in its liquid state (melting point of TCE is -73 °C).
Therefore, qualitative detection of TCE within the sediment core was performed via NMR scanning with the water frozen. A one-dimensional NMR scanning method was used for quantification of TCE mass distribution within the frozen core. However, the results indicated inconsistency in estimating the total TCE mass within the porous media. Downhole NMR logging was performed at the former refinery in the western U.S. to detect NAPL and to discriminate NAPL from water in the formation. The results indicated that detection of NMR signals to discriminate NAPL from water is compromised by noise stemming from the active facilities and/or power lines passing over the site. A laboratory experiment was performed to evaluate the electrical response of unconsolidated porous media through time (30 days) while NAPL was being depleted. Sand columns (Colorado silica sand) contaminated with methyl tert-butyl ether (MTBE, a light non-aqueous phase liquid (LNAPL)) were studied. A multilevel electrode system was used to measure the electrical resistivity of impacted sand by imposing alternating current. The reduction in resistivity with time through the depth of the columns followed the depletion of LNAPL by volatilization. Finally, a field experiment was performed at the former refinery in the western U.S. to track natural losses of LNAPL over time. Multilevel systems consisting of water samplers, thermocouples, and electrodes were installed at a clean zone (background zone) and an LNAPL-impacted zone. In situ measurements of complex resistivity and temperature were taken and water sampling was performed for each depth (from 3 to 14 feet below the ground surface at one-foot spacing) over a period of almost a year. At both locations, the results indicated decreases in apparent resistivity below the water table over time. This trend was supported by the geochemistry of the pore fluids. Overall, results indicate that application of the electrical resistivity method to track LNAPL depletion at field sites is difficult due to multiple conflicting factors affecting the geoelectrical response of LNAPL-impacted zones over time.
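A common framework for interpreting resistivity changes of the kind described above is Archie's law, which relates bulk resistivity to porosity and water saturation. It is included here as general background on the interpretation, not as a relation stated in the dissertation.

```latex
% Archie's law for a partially water-saturated porous medium, with pore-water
% resistivity \rho_w, porosity \phi, water saturation S_w, and empirical
% tortuosity, cementation and saturation exponents a, m, n:
\rho_{\mathrm{bulk}} = a \, \rho_w \, \phi^{-m} \, S_w^{-n}
```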
The student perspective of high school laboratory experiences
NASA Astrophysics Data System (ADS)
Lambert, R. Mitch
High school science laboratory experiences are an accepted teaching practice across the nation despite a lack of research evidence to support them. The purpose of this study was to examine the perspective of students---stakeholders often ignored---on these experiences. Insight into the students' perspective was explored progressively using a grounded theory methodology. Field observations of science classrooms led to an open-ended survey of high school science students, garnering 665 responses. Twelve student interviews then focused on the data and questions evolving from the survey. The student perspective on laboratory experiences revealed varied information based on individual experience. Concurrent analysis of the data revealed that although most students like (348/665) or sometimes like (270/665) these experiences, some consistent factors yielded negative experiences and prompted suggestions for improvement. The category of responses that emerged as the core idea focused on student understanding of the experience. Students desire to understand the why do, the how to, and the what it means of laboratory experiences. Lacking any one of these, the experience loses educational value for them. This single recurring theme crossed the boundaries of age, level in school, gender, and even the student view of lab experiences as positive or negative. This study suggests reflection on the current laboratory activities in which science teachers engage their students. Is the activity appropriate (as opposed to being merely a favorite), does it encourage learning, does it fit, does it operate at the appropriate level of inquiry, and finally what can science teachers do to integrate these activities into the classroom curriculum more effectively? Simply stated, what can teachers do so that students understand what to do, what's the point, and how that point fits into what they are learning outside the laboratory?
Semi-empirical studies of atomic structure. Progress report, 1 July 1982-1 February 1983
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curtis, L.J.
1983-01-01
A program of studies of the properties of the heavy and highly ionized atomic systems which often occur as contaminants in controlled fusion devices is continuing. The project combines experimental measurements by fast-ion-beam excitation with semi-empirical data parametrizations to identify and exploit regularities in the properties of these very heavy and very highly ionized systems. The increasing use of spectroscopic line intensities as diagnostics for determining thermonuclear plasma temperatures and densities requires laboratory observation and analysis of such spectra, often to accuracies that exceed the capabilities of ab initio theoretical methods for these highly relativistic many electron systems. Through the acquisition and systematization of empirical data, remarkably precise methods for predicting excitation energies, transition wavelengths, transition probabilities, level lifetimes, ionization potentials, core polarizabilities, and core penetrabilities are being developed and applied. Although the data base for heavy, highly ionized atoms is still sparse, parametrized extrapolations and interpolations along isoelectronic, homologous, and Rydberg sequences are providing predictions for large classes of quantities, with a precision that is sharpened by subsequent measurements.
Semiempirical studies of atomic structure. Progress report, 1 July 1983-1 June 1984
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curtis, L.J.
1984-01-01
A program of studies of the properties of the heavy and highly ionized atomic systems which often occur as contaminants in controlled fusion devices is continuing. The project combines experimental measurements by fast ion beam excitation with semiempirical data parametrizations to identify and exploit regularities in the properties of these very heavy and very highly ionized systems. The increasing use of spectroscopic line intensities as diagnostics for determining thermonuclear plasma temperatures and densities requires laboratory observation and analysis of such spectra, often to accuracies that exceed the capabilities of ab initio theoretical methods for these highly relativistic many electron systems. Through the acquisition and systematization of empirical data, remarkably precise methods for predicting excitation energies, transition wavelengths, transition probabilities, level lifetimes, ionization potentials, core polarizabilities, and core penetrabilities are being developed and applied. Although the data base for heavy, highly ionized atoms is still sparse, parametrized extrapolations and interpolations along isoelectronic, homologous, and Rydberg sequences are providing predictions for large classes of quantities, with a precision that is sharpened by subsequent measurements.
NASA Astrophysics Data System (ADS)
Gurtler, G.
2017-12-01
We discuss the challenges and achievements that a small HSI college had integrating undergraduate research experiences into an existing natural sciences program. Like most introductory college science courses, our natural science courses used textbooks, PowerPoint presentations, and lectures to illustrate basic scientific concepts. Though a collective decision was made by our science faculty to incorporate undergraduate research projects into various STEM courses, our greatest challenge was incorporating mandatory research courses into the degree plans of our Natural Science program. We found that students made considerable progress in understanding natural science by critically evaluating primary research articles and undertaking small research projects. Many of these student projects were conducted in cooperation with the Albuquerque District of the US Army Corps of Engineers, United States Geological Survey in Denver, and the National Ice Core Laboratory. These projects illustrated the effects of climate change on the water quality, sediment buildup, and biodiversity at local reservoirs. Other projects involved the analysis of ice core samples from Greenland and Antarctica. Students presented research posters at various research venues, including Community College Undergraduate Research Initiative colloquiums.
A wireless fatigue monitoring system utilizing a bio-inspired tree ring data tracking technique.
Bai, Shi; Li, Xuan; Xie, Zhaohui; Zhou, Zhi; Ou, Jinping
2014-03-05
Fatigue, a hot scientific research topic for centuries, can trigger sudden failure of critical structures such as aircraft and railway systems, resulting in enormous casualties as well as economic losses. The fatigue life of certain structures is intrinsically random and few monitoring techniques are capable of tracking the full life-cycle fatigue damage. In this paper, a novel in-situ wireless real-time fatigue monitoring system using a bio-inspired tree ring data tracking technique is proposed. The general framework, methodology, and verification of this intelligent system are discussed in detail. The rain-flow counting (RFC) method is adopted as the core algorithm that quantifies fatigue damage, and Digital Signal Processing (DSP) is introduced as the core module for data collection and analysis. Laboratory test results based on strain gauges and polyvinylidene fluoride (PVDF) sensors have shown that the developed intelligent system can provide reliable, quick feedback and early warning of fatigue failure. With the merits of low cost, high accuracy and great reliability, the developed wireless fatigue sensing system can be further applied to mechanical engineering, civil infrastructures, transportation systems, aerospace engineering, etc.
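Since rain-flow counting is named as the core damage-quantification algorithm, a compact illustration may help; the sketch below is a simplified three-point rainflow counter in Python. It is a generic textbook-style implementation, not the code running on the DSP module, and its edge handling is simplified relative to ASTM E1049.

```python
# Simplified three-point rainflow cycle counter (textbook-style sketch).
# Edge handling is simplified relative to ASTM E1049; illustration only.

def turning_points(series):
    """Keep the first point, local peaks/valleys, and the last point."""
    tp = [series[0]]
    for i in range(1, len(series) - 1):
        if (series[i] - series[i - 1]) * (series[i + 1] - series[i]) < 0:
            tp.append(series[i])
    tp.append(series[-1])
    return tp

def rainflow(series):
    """Return a list of (stress_range, count) pairs; count is 1.0 or 0.5."""
    stack, cycles = [], []
    for point in turning_points(series):
        stack.append(point)
        while len(stack) >= 3:
            x = abs(stack[-1] - stack[-2])   # most recent range
            y = abs(stack[-2] - stack[-3])   # previous range
            if x < y:
                break
            cycles.append((y, 1.0))          # full cycle extracted
            del stack[-3:-1]                 # drop the two points forming it
    # remaining ranges count as half cycles
    for a, b in zip(stack, stack[1:]):
        cycles.append((abs(b - a), 0.5))
    return cycles

if __name__ == "__main__":
    strain_history = [0, 5, -3, 4, -2, 6, -4, 2, 0]
    for stress_range, count in rainflow(strain_history):
        print(f"range={stress_range}, count={count}")
```

In a full fatigue-life estimate, the counted ranges would then be fed into a damage-accumulation rule such as Miner's rule against an S-N curve.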
Impactor core disruption by high-energy planetary collisions
NASA Astrophysics Data System (ADS)
Landeau, M.; Phillips, D.; Deguen, R.; Neufeld, J.; Dalziel, S.; Olson, P.
2017-12-01
Understanding the fate of impactor cores during large planetary collisions is key for predicting metal-silicate equilibration during Earth's accretion. Accretion models and geochemical observations indicate that much of Earth's mass accreted through high-energy impacts between planetary embryos already differentiated into a metallic core and a silicate mantle. Previous studies of core formation assume that the metallic core of the impactor is left intact by the impact itself and mixes with silicates only during its post-impact fall through the magma ocean. Recent impact simulations, however, suggest that the impact cratering process induces significant core disruption and metal-silicate mixing. Unlike existing impact simulations, experiments can produce turbulence, a key ingredient for investigating disruption of the impactor core. Here we use laboratory experiments where a volume of salt solution (representing the impactor core) vertically impacts a pool of water (representing the magma ocean) to quantify impact-induced mixing between the impactor and the target as a function of impact velocity, impactor size and density difference. We find that the ratio of the impactor's inertia to its weight controls mixing. Extrapolated to planetary accretion, our results suggest that the impact process induces no significant mixing for impactors of comparable size to the protoplanet, whereas the impactor core is highly disrupted by impacts involving impactors much smaller than the protoplanet.
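The control parameter described above (impactor inertia relative to its weight) is a Froude number; the generic form below is written with the impact velocity U, impactor radius R, and gravitational acceleration g. The precise definition and any density-ratio prefactor used by the authors are not given in this abstract, so this is background notation rather than their exact scaling.

```latex
% Generic impact Froude number: inertia of the falling impactor relative to
% its weight, with impact speed U, impactor radius R, and gravity g.
\mathrm{Fr} = \frac{U^{2}}{g\,R}
```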
Blaya, Joaquin A; Shin, Sonya S; Yagui, Martin J A; Yale, Gloria; Suarez, Carmen Z; Asencios, Luis L; Cegielski, J Peter; Fraser, Hamish S F
2007-10-28
Multi-drug resistant tuberculosis patients in resource-poor settings experience large delays in starting appropriate treatment and may not be monitored appropriately due to an overburdened laboratory system, delays in communication of results, and missing or error-prone laboratory data. The objective of this paper is to describe an electronic laboratory information system implemented to alleviate these problems and its expanding use by the Peruvian public sector, as well as examine the broader issues of implementing such systems in resource-poor settings. A web-based laboratory information system "e-Chasqui" has been designed and implemented in Peru to improve the timeliness and quality of laboratory data. It was deployed in the national TB laboratory, two regional laboratories and twelve pilot health centres. Using needs assessment and workflow analysis tools, e-Chasqui was designed to provide for improved patient care, increased quality control, and more efficient laboratory monitoring and reporting. Since its full implementation in March 2006, 29,944 smear microscopy, 31,797 culture and 7,675 drug susceptibility test results have been entered. Over 99% of these results have been viewed online by the health centres. High user satisfaction and heavy use have led to the expansion of e-Chasqui to additional institutions. In total, e-Chasqui will serve a network of institutions providing medical care for over 3.1 million people. The cost to maintain this system is approximately US$0.53 per sample or 1% of the National Peruvian TB program's 2006 budget. Electronic laboratory information systems have a large potential to improve patient care and public health monitoring in resource-poor settings. Some of the challenges faced in these settings, such as lack of trained personnel, limited transportation, and large coverage areas, are obstacles that a well-designed system can overcome. e-Chasqui has the potential to provide a national TB laboratory network in Peru. Furthermore, the core functionality of e-Chasqui has been implemented in the open source medical record system OpenMRS (http://www.openmrs.org) for other countries to use.
Blaya, Joaquin A; Shin, Sonya S; Yagui, Martin JA; Yale, Gloria; Suarez, Carmen Z; Asencios, Luis L; Cegielski, J Peter; Fraser, Hamish SF
2007-01-01
Background Multi-drug resistant tuberculosis patients in resource-poor settings experience large delays in starting appropriate treatment and may not be monitored appropriately due to an overburdened laboratory system, delays in communication of results, and missing or error-prone laboratory data. The objective of this paper is to describe an electronic laboratory information system implemented to alleviate these problems and its expanding use by the Peruvian public sector, as well as examine the broader issues of implementing such systems in resource-poor settings. Methods A web-based laboratory information system "e-Chasqui" has been designed and implemented in Peru to improve the timeliness and quality of laboratory data. It was deployed in the national TB laboratory, two regional laboratories and twelve pilot health centres. Using needs assessment and workflow analysis tools, e-Chasqui was designed to provide for improved patient care, increased quality control, and more efficient laboratory monitoring and reporting. Results Since its full implementation in March 2006, 29,944 smear microscopy, 31,797 culture and 7,675 drug susceptibility test results have been entered. Over 99% of these results have been viewed online by the health centres. High user satisfaction and heavy use have led to the expansion of e-Chasqui to additional institutions. In total, e-Chasqui will serve a network of institutions providing medical care for over 3.1 million people. The cost to maintain this system is approximately US$0.53 per sample or 1% of the National Peruvian TB program's 2006 budget. Conclusion Electronic laboratory information systems have a large potential to improve patient care and public health monitoring in resource-poor settings. Some of the challenges faced in these settings, such as lack of trained personnel, limited transportation, and large coverage areas, are obstacles that a well-designed system can overcome. e-Chasqui has the potential to provide a national TB laboratory network in Peru. Furthermore, the core functionality of e-Chasqui has been implemented in the open source medical record system OpenMRS for other countries to use. PMID:17963522
Development of Mycoplasma synoviae (MS) core genome multilocus sequence typing (cgMLST) scheme.
Ghanem, Mostafa; El-Gazzar, Mohamed
2018-05-01
Mycoplasma synoviae (MS) is a poultry pathogen with reported increased prevalence and virulence in recent years. MS strain identification is essential for prevention, control efforts and epidemiological outbreak investigations. Multiple multilocus-based sequence typing schemes have been developed for MS, yet the resolution of these schemes could be limited for outbreak investigation. The cost of whole genome sequencing has become close to that of sequencing the seven MLST targets; however, there is no standardized method for typing MS strains based on whole genome sequences. In this paper, we propose a core genome multilocus sequence typing (cgMLST) scheme as a standardized and reproducible method for typing MS based on whole genome sequences. A diverse set of 25 MS whole genome sequences were used to identify 302 core genome genes as cgMLST targets (35.5% of the MS genome), and 44 whole genome sequences of MS isolates from six countries in four continents were typed using this scheme. cgMLST-based phylogenetic trees displayed a high degree of agreement with core genome SNP-based analysis and available epidemiological information. cgMLST allowed evaluation of two conventional MLST schemes of MS. The high discriminatory power of cgMLST allowed differentiation between samples of the same conventional MLST type. cgMLST represents a standardized, accurate, highly discriminatory, and reproducible method for differentiation between MS isolates. Like conventional MLST, it provides stable and expandable nomenclature, allowing typing results to be compared and shared between different laboratories worldwide. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
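To make the typing step concrete, cgMLST comparisons ultimately reduce to counting allele differences between isolates over the shared target loci. The sketch below shows a minimal pairwise allelic-distance computation in Python with hypothetical profile data; it is an illustration of the idea, not the authors' pipeline.

```python
# Minimal pairwise allelic distance for cgMLST profiles.
# Each profile maps locus name -> allele number (None = locus missing).
from itertools import combinations

def allelic_distance(p1: dict, p2: dict) -> int:
    """Count loci called in both profiles whose allele numbers differ."""
    shared = (locus for locus in p1
              if locus in p2 and p1[locus] is not None and p2[locus] is not None)
    return sum(1 for locus in shared if p1[locus] != p2[locus])

if __name__ == "__main__":
    # Hypothetical 5-locus profiles standing in for the 302 cgMLST targets.
    profiles = {
        "MS-A": {"locus1": 1, "locus2": 4, "locus3": 2, "locus4": 7, "locus5": 1},
        "MS-B": {"locus1": 1, "locus2": 4, "locus3": 3, "locus4": 7, "locus5": None},
        "MS-C": {"locus1": 2, "locus2": 5, "locus3": 3, "locus4": 7, "locus5": 1},
    }
    for a, b in combinations(profiles, 2):
        print(a, b, allelic_distance(profiles[a], profiles[b]))
```

A matrix of such distances is what a clustering or minimum-spanning-tree step would then operate on to group related isolates.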
An approach for coupled-code multiphysics core simulations from a common input
Schmidt, Rodney; Belcourt, Kenneth; Hooper, Russell; ...
2014-12-10
This study describes an approach for coupled-code multiphysics reactor core simulations that is being developed by the Virtual Environment for Reactor Applications (VERA) project in the Consortium for Advanced Simulation of Light-Water Reactors (CASL). In this approach a user creates a single problem description, called the “VERAIn” common input file, to define and set up the desired coupled-code reactor core simulation. A preprocessing step accepts the VERAIn file and generates a set of fully consistent input files for the different physics codes being coupled. The problem is then solved using a single-executable coupled-code simulation tool applicable to the problem, which is built using VERA infrastructure software tools and the set of physics codes required for the problem of interest. The approach is demonstrated by performing an eigenvalue and power distribution calculation of a typical three-dimensional 17 × 17 assembly with thermal–hydraulic and fuel temperature feedback. All neutronics aspects of the problem (cross-section calculation, neutron transport, power release) are solved using the Insilico code suite and are fully coupled to a thermal–hydraulic analysis calculated by the Cobra-TF (CTF) code. The single-executable coupled-code (Insilico-CTF) simulation tool is created using several VERA tools, including LIME (Lightweight Integrating Multiphysics Environment for coupling codes), DTK (Data Transfer Kit), Trilinos, and TriBITS. Parallel calculations are performed on the Titan supercomputer at Oak Ridge National Laboratory using 1156 cores, and a synopsis of the solution results and code performance is presented. Finally, ongoing development of this approach is briefly described.
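At its core, the coupled neutronics/thermal-hydraulics solve described above is a fixed-point (Picard) iteration in which the physics codes exchange fields until the feedback converges. The sketch below illustrates only that structure, with stand-in single-value "fields" and made-up feedback coefficients; it is not VERA, LIME, or Insilico/CTF code.

```python
# Schematic Picard (fixed-point) iteration between a neutronics solve and a
# thermal-hydraulics solve, illustrating the coupled-code structure only.
# All models and coefficients here are made up for illustration.

def neutronics(fuel_temp: float) -> float:
    """Stand-in neutronics: power decreases as fuel temperature rises (Doppler)."""
    return 100.0 / (1.0 + 1.0e-4 * (fuel_temp - 600.0))

def thermal_hydraulics(power: float) -> float:
    """Stand-in thermal hydraulics: fuel temperature rises with power."""
    return 300.0 + 4.0 * power

def coupled_solve(tol: float = 1e-6, max_iter: int = 100):
    fuel_temp = 600.0                          # initial guess for the feedback field
    for it in range(max_iter):
        power = neutronics(fuel_temp)          # "neutronics code" step
        new_temp = thermal_hydraulics(power)   # "thermal-hydraulics code" step
        if abs(new_temp - fuel_temp) < tol:
            return power, new_temp, it + 1
        fuel_temp = new_temp
    raise RuntimeError("Picard iteration did not converge")

if __name__ == "__main__":
    power, temp, iters = coupled_solve()
    print(f"converged in {iters} iterations: power={power:.3f}, fuel_temp={temp:.1f}")
```

In the real tool the exchanged quantities are full spatial fields (power and temperature distributions) transferred between codes by infrastructure such as DTK, but the outer iteration has the same shape.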
Reengineering of waste management at the Oak Ridge National Laboratory. Volume 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Myrick, T.E.
1997-08-01
A reengineering evaluation of the waste management program at the Oak Ridge National Laboratory (ORNL) was conducted during the months of February through July 1997. The goal of the reengineering was to identify ways in which the waste management process could be streamlined and improved to reduce costs while maintaining full compliance and customer satisfaction. A Core Team conducted preliminary evaluations and determined that eight particular aspects of the ORNL waste management program warranted focused investigations during the reengineering. The eight areas included Pollution Prevention, Waste Characterization, Waste Certification/Verification, Hazardous/Mixed Waste Stream, Generator/WM Teaming, Reporting/Records, Disposal End Points, and On-Site Treatment/Storage. The Core Team commissioned and assembled Process Teams to conduct in-depth evaluations of each of these eight areas. The Core Team then evaluated the Process Team results and consolidated the 80 process-specific recommendations into 15 overall recommendations. Benchmarking of a commercial nuclear facility, a commercial research facility, and a DOE research facility was conducted to both validate the efficacy of these findings and seek additional ideas for improvement. The outcome of this evaluation is represented by the 15 final recommendations that are described in this report.
Qualitative Flow Visualization of a 110-N Hydrogen/Oxygen Laboratory Model Thruster
NASA Technical Reports Server (NTRS)
deGroot, Wim A.; McGuire, Thomas J.; Schneider, Steven J.
1997-01-01
The flow field inside a 110 N gaseous hydrogen/oxygen thruster was investigated using an optically accessible, two-dimensional laboratory test model installed in a high altitude chamber. The injector for this study produced an oxidizer-rich core flow, which was designed to fully mix and react inside a fuel-film sleeve insert before emerging into the main chamber section, where a substantial fuel film cooling layer was added to protect the chamber wall. Techniques used to investigate the flow consisted of spontaneous Raman spectra measurements, visible emission imaging, ultraviolet hydroxyl spectroscopy, and high speed schlieren imaging. Experimental results indicate that the oxygen rich core flow continued to react while emerging from the fuel-film sleeve, suggesting incomplete mixing of the hydrogen in the oxygen rich core flow. Experiments also showed that the fuel film cooling protective layer retained its integrity throughout the straight section of the combustion chamber. In the converging portion of the chamber, however, a turbulent reaction zone near the wall destroyed the integrity of the film layer, a result which implies that a lower contraction angle may improve the fuel film cooling in the converging section and extend the hardware lifetime.
Mercier, Tracey J.; Brownfield, Michael E.; Johnson, Ronald C.; Self, Jesse G.
1998-01-01
This CD-ROM includes updated files containing Fischer assays of samples of core holes and cuttings from exploration drill holes drilled in the Eocene Green River Formation in the Piceance Basin of northwestern Colorado. A database was compiled that includes more than 321,380 Fischer assays from 782 boreholes. Most of the oil yield data were analyzed by the former U.S. Bureau of Mines oil shale laboratory in Laramie, Wyoming, and some analyses were made by private laboratories. Location data for 1,042 core and rotary holes, oil and gas tests, as well as a few surface sections are listed in a spreadsheet and included in the CD-ROM. These assays are part of a larger collection of subsurface information held by the U.S. Geological Survey, including geophysical and lithologic logs, water data, and chemical and X-ray diffraction analyses having to do with the Green River oil shale deposits in Colorado, Wyoming, and Utah. Because of an increased interest in oil shale, this CD-ROM disc containing updated Fischer assay data for the Piceance Basin oil shale deposits in northwestern Colorado is being released to the public.
Quasi-Static Tensile Stress-Strain Curves. 1, 2024-T3510 Aluminum Alloy
1976-02-01
herein were conducted as part of the Core Materials Program of the Solid Mechanics Branch of the Terminal Ballistics Laboratory. The objective of this...describing the results of the Core Materials Program, covers quasi-static tensile tests of 2024-T3510 aluminum alloy. The results include Young's...[Table II, Material Properties of 2024-T3510 Aluminum Alloy, lists results of tensile, compression, and sonic testing.]
2012-07-01
and Avoid (SAA) testbed that provides some of the core services. This paper describes the general architecture and a SAA testbed implementation that...that provides data and software services to enable a set of Unmanned Aircraft (UA) platforms to operate in a wide range of air domains which may...implemented by MIT Lincoln Laboratory in the form of a Sense and Avoid (SAA) testbed that provides some of the core services.
Multi-Core Processor Memory Contention Benchmark Analysis Case Study
NASA Technical Reports Server (NTRS)
Simon, Tyler; McGalliard, James
2009-01-01
Multi-core processors dominate current mainframe, server, and high performance computing (HPC) systems. This paper provides synthetic kernel and natural benchmark results from an HPC system at the NASA Goddard Space Flight Center that illustrate the performance impacts of multi-core (dual- and quad-core) vs. single core processor systems. Analysis of processor design, application source code, and synthetic and natural test results all indicate that multi-core processors can suffer from significant memory subsystem contention compared to similar single-core processors.
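A minimal sketch of the kind of measurement behind such results (not the benchmarks used in the paper; the array size, repetition count, and worker counts below are arbitrary choices made for illustration): a memory-bound, STREAM-style triad run with one and then several worker processes. On a multi-core node, per-worker throughput that falls as workers are added is a symptom of memory-subsystem contention.

    # Hypothetical contention probe; numbers and structure are illustrative only.
    import time
    import multiprocessing as mp
    import numpy as np

    N = 10_000_000   # array length (~80 MB per float64 array, well beyond cache)
    REPS = 5

    def triad(_):
        a = np.empty(N)
        b = np.random.rand(N)
        c = np.random.rand(N)
        t0 = time.perf_counter()
        for _ in range(REPS):
            np.multiply(c, 2.0, out=a)   # a = 2*c
            a += b                       # a = 2*c + b, dominated by memory traffic
        elapsed = time.perf_counter() - t0
        # approximate traffic per rep: read b and c, write a (8-byte floats);
        # ignores the extra read of a during the in-place add
        bytes_moved = 3 * 8 * N * REPS
        return bytes_moved / elapsed / 1e9   # GB/s per worker

    if __name__ == "__main__":
        for workers in (1, 2, 4):
            with mp.Pool(workers) as pool:
                rates = pool.map(triad, range(workers))
            print(f"{workers} workers: {sum(rates)/len(rates):.1f} GB/s per worker")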
Guidelines on Good Clinical Laboratory Practice
Ezzelle, J.; Rodriguez-Chavez, I. R.; Darden, J. M.; Stirewalt, M.; Kunwar, N.; Hitchcock, R.; Walter, T.; D’Souza, M. P.
2008-01-01
A set of Good Clinical Laboratory Practice (GCLP) standards that embraces both the research and clinical aspects of GLP were developed utilizing a variety of collected regulatory and guidance material. We describe eleven core elements that constitute the GCLP standards with the objective of filling a gap for laboratory guidance, based on IND sponsor requirements, for conducting laboratory testing using specimens from human clinical trials. These GCLP standards provide guidance on implementing GLP requirements that are critical for laboratory operations, such as performance of protocol-mandated safety assays, peripheral blood mononuclear cell processing and immunological or endpoint assays from biological interventions on IND-registered clinical trials. The expectation is that compliance with the GCLP standards, monitored annually by external audits, will allow research and development laboratories to maintain data integrity and to provide immunogenicity, safety, and product efficacy data that is repeatable, reliable, auditable and that can be easily reconstructed in a research setting. PMID:18037599
Controlled-Release Microcapsules for Smart Coatings for Corrosion Applications
NASA Technical Reports Server (NTRS)
2008-01-01
Corrosion is a serious problem that has enormous costs and serious safety implications. Localized corrosion, such as pitting, is very dangerous and can cause catastrophic failures. The NASA Corrosion Technology Laboratory at Kennedy Space Center is developing a smart coating based on pH-sensitive microcapsules for corrosion applications. These versatile microcapsules are designed to be incorporated into a smart coating and deliver their core content when corrosion starts. Corrosion indication was the first function incorporated into the microcapsules. Current efforts are focused on incorporating the corrosion inhibition function through the encapsulation of corrosion inhibitors into water core and oil core microcapsules. Scanning electron microscopy (SEM) images of encapsulated corrosion inhibitors are shown.
Thirsk with FPEF MS hardware in Kibo
2009-10-07
ISS020-E-048792 (7 Oct. 2009) --- Canadian Space Agency astronaut Robert Thirsk, Expedition 20/21 flight engineer, holds Fluid Physics Experiment Facility/Marangoni Surface (FPEF MS) Core hardware in the Kibo laboratory of the International Space Station.
Report: Review of Region 5 Laboratory Operations
Report #2000-P-3, Jan 1, 2000. In September 1998, the Region 5 Quality Assurance Core initiated a Management Systems Review of the CRL to determine whether the lab’s quality management system was operating as designed.
Publications - GMC 58 | Alaska Division of Geological & Geophysical Surveys
DGGS GMC 58 Publication Details. Title: X-ray diffraction and scanning electron microscopy mineral analyses. Citation: ..., Michael, and Core Laboratories, 1985, X-ray diffraction and scanning electron microscopy mineral analyses.
Atmospheric Science Data Center
2013-04-01
... free of charge from JPL, upon completion of a license agreement. hdfscan software consists of two components - a core hdf file ... at the Jet Propulsion Laboratory. To obtain the license agreement, go to the MISR Science Software web page, read the introductory ...
Self-stressed sandwich bridge decks.
DOT National Transportation Integrated Search
1971-01-01
Proposed is an entirely new type of bridge deck, consisting of an unreinforced lightweight concrete slab made of expanding cement sandwiched between two thin plates of steel. The expanding core serves to prestress the panel. Laboratory tests were con...
GRAIL Twin Spacecraft -- Crust to Core Artist Concept
2009-05-18
The Gravity Recovery and Interior Laboratory (GRAIL) mission utilizes the technique of twin spacecraft flying in formation with a known altitude above the lunar surface and known separation distance to investigate the gravity field of the moon.
NASA Astrophysics Data System (ADS)
Gómez-Gómez, Felipe; Rodriguez-Manfredi, Jose-Antonio; Perez, Lidia; Prieto-Ballesteros, Olga; Amils, Ricardo; Gomez-Elvira, Javier
We are developing a Universal Habitability Index for life prospection studies in space missions. The authors present in this abstract the results of the application of the habitability index in two field case studies: Alaskan permafrost and the Atacama Desert. We are using extreme environments as test facilities from an astrobiological perspective, in order to reach three main objectives: 1) define preservation patterns of biosignatures in extreme environments (cold, low water stress, high radiation...) that may be used in future space exploration missions; 2) develop new instrumentation for detecting life in situ or remotely, and new instrumentation for detection and mapping of extreme niches where life (or biochemical tracers of past life) may be preserved; and 3) develop a Universal Habitability Index for astrobiological space mission application (Mars or Europa life prospection). These aims will be achieved by selected site characterization using geophysical sounding and drilling, atmospheric characterization by meteorological analysis, soil water and temperature profile analysis and, finally, by sampling different levels of the rock cores and analyzing their mineralogy, geochemistry and microbiology in the laboratory. First case: studying the permafrost in the Imuruk Lake volcanic field area (Alaska). In order to map the permafrost underground, electric tomography sounding was performed. The resulting tomographic data indicate that the permafrost of the studied area is at a mean depth of 0.50 meter from the surface, sometimes even shallower. Drilling points were selected depending on the permafrost depth known from the tomographic data analysis. Three perforations were made along the hill. Samples were collected at several depths in the three holes for mineralogical, geochemical and biological analysis. They were fixed in situ with formaldehyde so that they could be maintained until laboratory analysis was performed. Several fresh growth media were inoculated with samples from different depths in the field for microorganism enrichment. First results report enrichment in several inoculated media, including some specific for heterotrophic aerobic bacteria, anaerobic chemolithotrophic bacteria and methanogens. Two different molecular methods are being used for microbial determination: in situ hybridization (also used for cell counting) and 16S rRNA gene amplification, cloning and sequencing. First results in cell counting determined a population density gradient vs. depth. Second case: studying the Atacama Desert. Habitability studies in the Atacama Desert were developed by selecting a 36 square meter area where an atmospheric station was installed. The following parameters were measured: UV radiation, atmosphere temperature, ground temperature at three different depths, and wind direction. Samples at the pre-selected depths were taken in order to develop microbiological studies in the laboratory. Interesting results that will be presented at the COSPAR session were obtained, with decreasing levels of life presence along the core profile. Acknowledgments: Centro de Astrobiologia INTA/CSIC (Spain) supported the expedition to Imuruk Lake. The Atacama expedition was supported with the Grant ESP 2004-05008 "Detección de biomoléculas en exploración planetaria" from the Spanish Government. Laboratory analysis was supported with the Grant ESP-2006/06640 "Desarrollo de Tecnología para la Identificación de Vida de forma Automática (DTIVA)".
McDonald, Peter R; Roy, Anuradha; Chaguturu, Rathnam
2011-05-01
The University of Kansas High-Throughput Screening (KU HTS) core is a state-of-the-art drug-discovery facility with an entrepreneurial open-service policy, which provides centralized resources supporting public- and private-sector research initiatives. The KU HTS core applies pharmaceutical industry project-management principles in an academic setting by bringing together multidisciplinary teams to fill critical scientific and technology gaps, using an experienced team of industry-trained researchers and project managers. The KU HTS proactively engages in supporting grant applications for extramural funding, intellectual-property management and technology transfer. The KU HTS staff further provides educational opportunities for the KU faculty and students to learn cutting-edge technologies in drug-discovery platforms through seminars, workshops, internships and course teaching. This is the first instalment of a two-part contribution from the KU HTS laboratory.
Core components of a comprehensive quality assurance program in anatomic pathology.
Nakhleh, Raouf E
2009-11-01
In this article the core components of a comprehensive quality assurance and improvement plan are outlined. Quality anatomic pathology work depends on a focus on accurate, timely, and complete reports. A commitment to continuous quality improvement and a systems approach with persistent effort help to achieve this end. Departments should have a quality assurance and improvement plan that includes a risk assessment of real and potential problems facing the laboratory. The plan should also list the individuals responsible for carrying out the program with adequate resources, a defined timetable, and annual assessment of progress and future directions. Quality assurance monitors should address regulatory requirements and be organized by laboratory division (surgical pathology, cytology, etc.) as well as by 5 segments (preanalytic, analytic, and postanalytic phases of the test cycle, turnaround time, and customer satisfaction). Quality assurance data can also be used to evaluate individual pathologists using multiple parameters with peer group comparison.
Cesarovic, Nikola; Jirkof, Paulin; Rettich, Andreas; Arras, Margarete
2011-11-21
The laboratory mouse is the animal species of choice for most biomedical research, in both the academic sphere and the pharmaceutical industry. Mice are a manageable size and relatively easy to house. These factors, together with the availability of a wealth of spontaneous and experimentally induced mutants, make laboratory mice ideally suited to a wide variety of research areas. In cardiovascular, pharmacological and toxicological research, accurate measurement of parameters relating to the circulatory system of laboratory animals is often required. Determination of heart rate, heart rate variability, and duration of PQ and QT intervals are based on electrocardiogram (ECG) recordings. However, obtaining reliable ECG curves as well as physiological data such as core body temperature in mice can be difficult using conventional measurement techniques, which require connecting sensors and lead wires to a restrained, tethered, or even anaesthetized animal. Data obtained in this fashion must be interpreted with caution, as it is well known that restraint and anesthesia can have a major artifactual influence on physiological parameters. Radiotelemetry enables data to be collected from conscious and untethered animals. Measurements can be conducted even in freely moving animals, and without requiring the investigator to be in the proximity of the animal. Thus, known sources of artifacts are avoided, and accurate and reliable measurements are assured. This methodology also reduces interanimal variability, thus reducing the number of animals used, rendering this technology the most humane method of monitoring physiological parameters in laboratory animals. Constant advancements in data acquisition technology and implant miniaturization mean that it is now possible to record physiological parameters and locomotor activity continuously and in real time over longer periods such as hours, days or even weeks. Here, we describe a surgical technique for implantation of a commercially available telemetry transmitter used for continuous measurements of core body temperature, locomotor activity and biopotential (i.e. one-lead ECG), from which heart rate, heart rate variability, and PQ and QT intervals can be established in free-roaming, untethered mice. We also present pre-operative procedures and protocols for post-operative intensive care and pain treatment that improve recovery, well-being and survival rates in implanted mice.
A Novel In-Beam Delayed Neutron Counting Technique for Characterization of Special Nuclear Materials
NASA Astrophysics Data System (ADS)
Bentoumi, G.; Rogge, R. B.; Andrews, M. T.; Corcoran, E. C.; Dimayuga, I.; Kelly, D. G.; Li, L.; Sur, B.
2016-12-01
A delayed neutron counting (DNC) system, where the sample to be analyzed remains stationary in a thermal neutron beam outside of the reactor, has been developed at the National Research Universal (NRU) reactor of the Canadian Nuclear Laboratories (CNL) at Chalk River. The new in-beam DNC is a novel approach for non-destructive characterization of special nuclear materials (SNM) that could enable identification and quantification of fissile isotopes within a large and shielded sample. Despite the orders of magnitude reduction in neutron flux, the in-beam DNC method can be as informative as the conventional in-core DNC for most cases while offering practical advantages and mitigated risk when dealing with large radioactive samples of unknown origin. This paper addresses (1) the qualification of in-beam DNC using a monochromatic thermal neutron beam in conjunction with a proven counting apparatus designed originally for in-core DNC, and (2) application of in-beam DNC to an examination of large sealed capsules containing unknown radioactive materials. Initial results showed that the in-beam DNC setup permits non-destructive analysis of bulky and gamma shielded samples. The method does not lend itself to trace analysis, and at best could only reveal the presence of a few milligrams of 235U via the assay of in-beam DNC total counts. Through analysis of DNC count rates, the technique could be used in combination with other neutron or gamma techniques to quantify isotopes present within samples.
New constant-temperature operating mode for graphite calorimeter at LNE-LNHB.
Daures, J; Ostrowsky, A
2005-09-07
The realization of the unit of absorbed dose at LNE-LNHB is based on calorimetry with the present GR8 graphite calorimeter. For this reason the calorimetric technique must be maintained, developed and improved in the laboratory. The usual quasi-adiabatic operating mode at LNHB is based on the thermal feedback between the core (sensitive element) and the jacket (adjacent body). When a core-jacket temperature difference is detected, a commercially available analogue PID (Proportional, Integral, Derivative) controller sends to the jacket an amount of electrical power to reduce this difference. Nevertheless, the core and jacket temperatures increase with irradiations and electrical calibrations, whereas the surroundings are maintained at a fixed temperature to shield against room temperature variations. At radiotherapy dose rates, fewer than ten measurements, or electrical calibrations, per day can be performed. This paper describes the new constant-temperature operating mode which has been implemented recently to improve flexibility in use and, to some extent, accuracy. The core and the jacket temperatures are maintained at fixed temperatures. A steady state is achieved without irradiation. Then, under irradiation, the electrical power needed to maintain the assigned temperature in the core is reduced by the amount of heat generated by ionizing radiation. The difference between these electrical powers, without and with irradiation, gives the mean absorbed dose rate to the core. The quality of this electrical power substitution measurement is strongly dependent upon the quality of the core and jacket thermal control. The core temperature is maintained at the set value using a digital PID regulator developed at the laboratory for this purpose with LabVIEW software on a PC. This regulator is versatile and particularly well suited for calorimetry purposes. Measurements in a cobalt-60 beam have shown no significant difference (<0.09%) between the two operating modes, with an equivalent reproducibility (1σ < 0.06%). These results corroborate the negligible difference of heat transfer between steady and irradiation periods when working in quasi-adiabatic mode with thermal feedback between the core and the jacket. The new constant-temperature mode allows numerous and fully automated measurements. The electrical calibration is an integral part of the measurement; no extra runs are needed. It also allows faster thermal equilibrium before starting runs. Moreover, the quality of vacuum within the gaps between the bodies is less important.
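A minimal sketch of the electrical-substitution relation implied by this mode (the symbols and the explicit division by core mass are ours, not quoted from the paper): the mean absorbed dose rate to the core follows from the electrical heating powers needed to hold the core at its set temperature without and with the beam,

\dot{D}_{\mathrm{core}} = \frac{P_{\mathrm{elec}}^{\mathrm{no\ beam}} - P_{\mathrm{elec}}^{\mathrm{beam}}}{m_{\mathrm{core}}},

where m_core is the mass of the graphite core.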
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laney, R.; Laughlin, A.W.; Aldrich, M.J. Jr.
1981-07-01
Petrologic, geochemical, and structural analyses of cores and cuttings obtained from 3000 to 4389-m true vertical depth in drill hole EE-2 indicate that this deeper part of the Precambrian section at Fenton Hill, New Mexico is composed primarily of a very heterogeneous and structurally anisotropic metamorphic complex, locally intruded by dikes and sills of granodioritic and monzogranitic composition. In this borehole none of these igneous bodies approach in size the 335-m-thick biotite-granodiorite body encountered at 2591-m depth beneath Fenton Hill in the other two drill holes. Contacts between the igneous and metamorphic rocks range from sharp and discordant to gradational. Analysis of cuttings indicates that clay-rich alteration zones are relatively common in the openhole portion of EE-2. These zones average about 20 m in thickness. Fracture sets in the Precambrian basement rock intersected by the EE-2 well bore mostly trend northeast and are steeply dipping to vertical; however, one of the sets dips gently to the northwest. Slickensided fault planes are present in a core (No. 5) taken from a true vertical depth of 4195 m. Available core orientation data and geologic inference suggest that the faults dip steeply and trend between N. 42° E. and N. 59° E.
Ataman, Meric
2017-01-01
Genome-scale metabolic reconstructions have proven to be valuable resources in enhancing our understanding of metabolic networks, as they encapsulate all known metabolic capabilities of the organisms from genes to proteins to their functions. However, the complexity of these large metabolic networks often hinders their utility in various practical applications. Although reduced models are commonly used for modeling and for integrating experimental data, they are often inconsistent across different studies and laboratories due to different criteria and levels of detail, which can compromise transferability of the findings and also integration of experimental data from different groups. In this study, we have developed a systematic semi-automatic approach to reduce genome-scale models into core models in a consistent and logical manner, focusing on the central metabolism or subsystems of interest. The method minimizes the loss of information using an approach that combines graph-based search and optimization methods. The resulting core models are shown to be able to capture key properties of the genome-scale models and preserve consistency in terms of biomass and by-product yields, flux and concentration variability and gene essentiality. The development of these "consistently-reduced" models will help to clarify and facilitate integration of different experimental data to draw new understanding that can be directly extendable to genome-scale models. PMID:28727725
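To illustrate the graph-based side of such a reduction in the simplest possible terms (this is a deliberately simplified sketch, not the authors' workflow, which also involves optimization steps; metabolite names and the path-length cutoff are hypothetical): keep a chosen core subsystem plus the short paths that connect its members, and discard the rest of the network.

    # Toy core-reduction sketch; requires networkx.
    import itertools
    import networkx as nx

    def reduce_to_core(G, core_nodes, max_len=4):
        """Return the subgraph containing the core nodes plus any shortest
        paths (up to max_len edges) that connect pairs of core nodes."""
        keep = set(core_nodes)
        for u, v in itertools.combinations(core_nodes, 2):
            try:
                path = nx.shortest_path(G, u, v)
            except nx.NetworkXNoPath:
                continue
            if len(path) - 1 <= max_len:   # only bridge core pairs via short routes
                keep.update(path)
        return G.subgraph(keep).copy()

    # usage with hypothetical metabolite names
    G = nx.Graph([("glc", "g6p"), ("g6p", "f6p"), ("f6p", "pyr"),
                  ("g6p", "6pg"), ("6pg", "ru5p"), ("ru5p", "f6p")])
    core = {"glc", "pyr", "ru5p"}
    print(sorted(reduce_to_core(G, core).nodes))

The actual method additionally checks that the reduced model reproduces genome-scale behavior (yields, flux ranges, gene essentiality), which a purely topological cut like this cannot guarantee.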
Research Associate | Center for Cancer Research
The Basic Science Program (BSP) at the Frederick National Laboratory for Cancer Research (FNLCR) pursues independent, multidisciplinary research programs in basic or applied molecular biology, immunology, retrovirology, cancer biology or human genetics. As part of the BSP, the Microbiome and Genetics Core (the Core) characterizes microbiomes by next-generation sequencing to determine their composition and variation, as influenced by immune, genetic, and host health factors. The Core provides support across a spectrum of processes, from nucleic acid isolation through bioinformatics and statistical analysis. KEY ROLES/RESPONSIBILITIES The Research Associate II will provide support in the areas of automated isolation, preparation, PCR and sequencing of DNA on next-generation platforms (Illumina MiSeq and NextSeq). An opportunity exists to join the Core’s team of highly trained experimentalists and bioinformaticians working to characterize microbiome samples. The following represent requirements of the position: A minimum of five (5) years of related biomedical experience. Experience with high-throughput nucleic acid (DNA/RNA) extraction. Experience in performing PCR amplification (including quantitative real-time PCR). Experience or familiarity with robotic liquid handling protocols (especially on the Eppendorf epMotion 5073 or 5075 platforms). Experience in operating and maintaining benchtop Illumina sequencers (MiSeq and NextSeq). Ability to evaluate experimental quality and to troubleshoot molecular biology protocols. Experience with sample tracking, inventory management and biobanking. Ability to operate and communicate effectively in a team-oriented work environment.
Site characterization design and techniques used at the Southern Shipbuilding Corporation site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mueller, J.P.; Geraghty, C.A.; Moore, G.W.
1995-12-31
The Southern Shipbuilding Corporation (SSC) site is an inactive barge/ship manufacturing and repair facility situated on approximately 54 acres in Slidell, St. Tammany Parish, Louisiana. Two unlined surface impoundments (North and South impoundments) are situated on the northwest portion of the site and are surrounded on three sides by Bayou Bonfouca. These impoundments are the sources of carcinogenic polynuclear aromatic hydrocarbon (CPAH) contamination at the site. Inadequate containment has resulted in the release of impoundment wastes into the bayou. To evaluate potential response alternatives for the site, an Engineering Evaluation/Cost Analysis (EE/CA) field investigation was conducted from July through October 1994. A two-phase sampling approach was used in combination with innovative and traditional sampling techniques, field screening technologies, and exploitation of the visual characteristics of the waste to determine the extent of waste migration with limited off-site laboratory confirmation. A skid-mounted mobile drilling unit, secured to a specialized sampling platform designed for multiple applications, was used for collection of sediment cores from the bayou as well as tarry sludge cores from the impoundments. Field screening of core samples was accomplished on site using an organic vapor analyzer and a total petroleum hydrocarbon (TPH) field analyzer. Pollutants of concern include metals, cyanide, dioxin, and organic compounds. This paper presents details on the sampling design and characterization techniques used to accomplish the EE/CA field investigation.
Weaver, J; Leon, E; Edan, M; D'Alessio, F
2012-08-01
The World Organisation for Animal Health (OIE) carries out Gap Analysis missions (if a country so wishes) as part of its programme to assess and improve the Performance of Veterinary Services (the 'PVS Pathway') in Member Countries. These Gap Analysis missions have found that many national Veterinary Services comply to only a limited extent with the international standards established by the OIE and that their competence is compromised by poor governance. This failure threatens animal and public health not only nationally but also internationally. The OIE PVS Gap Analysis reports reviewed found that all the Veterinary Services have a strong vision and commitment to improvement but are held back by a weak chain of command, inadequate and outdated legislation, insufficient funding, weak technical competencies, compromised technical independence, poor communications and limited joint programmes. There are weaknesses across all the core technical areas of trade, animal health, veterinary public health and veterinary laboratories and also in the overall management of the Veterinary Services. The OIE PVS Gap Analysis missions recommend significant increases in budget in all countries.
NASA Astrophysics Data System (ADS)
Eshelman, E.; Wanger, G.; Manatt, K.; Malaska, M.; Willis, M.; Abbey, W.; Doloboff, I.; Beegle, L. W.; DeFlores, L. P.; Priscu, J. C.; Lane, A. L.; Carrier, B. L.; Mellerowicz, B.; Kim, D.; Paulsen, G.; Zacny, K.; Bhartia, R.
2017-12-01
Future astrobiological missions to Europa and other ocean worlds may benefit from next-generation instrumentation capable of in situ organic and life detection in subsurface ice environments. WATSON (Wireline Analysis Tool for in Situ Observation of Northern ice sheets) is an instrument under development at NASA's Jet Propulsion Laboratory. WATSON contains high-TRL instrumentation developed for SHERLOC, the Mars 2020 deep-UV fluorescence and Raman spectrometer, including a 248.6 nm NeCu hollow cathode laser as an excitation source. In WATSON, these technologies provide spectroscopic capabilities highly sensitive to many organic compounds, including microbes, in an instrument package approximately 1.2 m long with a 101.6 mm diameter, designed to accommodate a 108 mm ice borehole. Interrogation into the ice wall with a laser allows for a non-destructive in situ measurement that preserves the spatial distribution of material within the ice. We report on a successful deployment of WATSON to Kangerlussuaq, Greenland, where the instrument was lowered to a 4.5 m depth in a hand-cored hole on the Kangerlussuaq sector of the Greenland ice sheet. Motorized stages within the instrument were used to raster a laser across cm-scale regions of the interior surface of the borehole, obtaining fluorescence spectral maps with a 200 µm spatial resolution and a spectral range from 265 nm to 440 nm. This region includes the UV emission bands of many aromatic compounds and microbes, and includes the water and ice Raman O-H stretching modes. We additionally report on experiments designed to inform an early-2018 deployment to Kangerlussuaq where WATSON will be incorporated into a Honeybee Robotics planetary deep drill, with a goal of drilling to a depth of 100 m and investigating the distribution of organic material within the ice sheet. These experiments include laboratory calibrations to determine the sensitivity to organic compounds embedded in ice at various depths, as well as analysis of ice cores obtained during the deployment and returned for subsequent study.
NASA Technical Reports Server (NTRS)
Briggs, G. A.; McKay, C.; George, J.; Derkowski, G.; Cooper, G.; Zacny, K.; Baker, R. Fincher; Pollard, W.; Clifford, S.
2003-01-01
As a project that is part of NASA's Astrobiology Technology & Instrument Development Program (ASTID), we are developing a low mass (approx. 20 kg) drill that will be operated without drilling fluids and at very low power levels (approx. 60 watts electrical) to access and retrieve samples from permafrost regions of Earth and Mars. The drill, designed and built as a joint effort by NASA JSC and Baker-Hughes International, takes the form of a down-hole unit attached to a cable so that it can, in principle, be scaled easily to reach significant depths. A parallel laboratory effort is being carried out at UC Berkeley to characterize the physics of dry drilling under martian conditions of pressure, temperature and atmospheric composition. Data from the UCB and JSC laboratory experiments are being used as input to a drill simulation program which is under development to provide autonomous control of the drill. The first Arctic field test of the unit is planned for May 2004. A field expedition to Eureka on Ellesmere Island in Spring 2003 provided an introduction for several team members to the practical aspects of drilling under Arctic conditions. The field effort was organized by Wayne Pollard of McGill University and Christopher McKay of NASA ARC. A conventional science drill provided by New Zealand colleagues was used to recover ground ice cores for analysis of their microbial content and also to develop techniques using tracers to track the depth of penetration of contamination from the core surface into the interior of the samples.
Center for Fuel Cell Research and Applications development phase. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-12-01
The deployment and operation of clean power generation is becoming critical as the energy and transportation sectors seek ways to comply with clean air standards and the national deregulation of the utility industry. However, for strategic business decisions, considerable analysis is required over the next few years to evaluate the appropriate application and value added from this emerging technology. To this end the Houston Advanced Research Center (HARC) is proposing a three-year industry-driven project that centers on the creation of "The Center for Fuel Cell Research and Applications." A collaborative laboratory housed at and managed by HARC, the Center will enable a core group of six diverse participating companies--industry participants--to investigate the economic and operational feasibility of proton-exchange-membrane (PEM) fuel cells in a variety of applications (the core project). This document describes the unique benefits of a collaborative approach to PEM applied research, among them a shared laboratory concept leading to cost savings and shared risks as well as access to outstanding research talent and lab facilities. It also describes the benefits provided by implementing the project at HARC, with particular emphasis on HARC's history of managing successful long-term research projects as well as its experience in dealing with industry consortia projects. The Center is also unique in that it will not duplicate the traditional university role of basic research or that of the fuel cell industry in developing commercial products. Instead, the Center will focus on applications, testing, and demonstration of fuel cell technology.
NASA Astrophysics Data System (ADS)
Jeppson, T.; Tobin, H. J.
2013-12-01
In the summer of 2005, Phase 2 of the San Andreas Fault Observatory at Depth (SAFOD) borehole was completed and logged with wireline tools, including a dipole sonic tool to measure P- and S-wave velocities. A zone of anomalously low velocity was detected from 3150 to 3414 m measured depth (MD), corresponding with the subsurface location of the San Andreas Fault Zone (SAFZ). This low velocity zone is 5-30% slower than the surrounding host rock. Within this broad low-velocity zone, several slip surfaces were identified as well as two actively deforming shear zones: the southwest deformation zone (SDZ) and the central deformation zone (CDZ), located at 3192 and 3302 m MD, respectively. The SAFZ had also previously been identified as a low velocity zone in seismic velocity inversion models. The anomalously low velocity was hypothesized to result from either (a) brittle deformation in the damage zone of the fault, (b) high fluid pressures within the fault zone, or (c) lithological variation, or a combination of the above. We measured P- and S-wave velocities at ultrasonic frequencies on saturated 2.5 cm diameter core plug samples taken from SAFOD core obtained in 2007 from within the low velocity zone. The resulting values fall into two distinct groups: foliated fault gouge and non-gouge. Samples of the foliated fault gouge have P-wave velocities between 2.3-3.5 km/s while non-gouge samples lie between 4.1-5.4 km/s over a range of effective pressures from 5-70 MPa. There is a good correlation between the log measurements and laboratory values of P- and S-wave velocity at in situ pressure conditions, especially for the foliated fault gouge. For non-gouge samples the laboratory values are approximately 0.08-0.73 km/s faster than the log values. This difference places the non-gouge velocities within the Great Valley siltstone velocity range, as measured by logs and ultrasonic measurements performed on outcrop samples. As a high fluid pressure zone was not encountered during SAFOD drilling, we use the ultrasonic velocities of SAFOD core and analogous outcrop samples to determine if the velocity reduction is due to lithologic variations or the presence of deformational fabrics and alteration in the fault zone. Preliminary analysis indicates that while the decrease in velocity across the broad fault zone is heavily influenced by fractures, the extremely low velocities associated with the actively deforming zones are more likely caused by the development of scaly fabric with clay coatings on the fracture surfaces. Analysis of thin sections and well logs is used to support this interpretation.
2013-01-01
Background Monitoring the progress of the Integrated Disease Surveillance (IDS) strategy is an important component to ensure its sustainability in the state of Maharashtra in India. The purpose of the study was to document the baseline performance of the system on its core and support functions and to understand the challenges for its transition from an externally funded “project” to a state owned surveillance “program”. Methods Multi-centre, retrospective cross-sectional evaluation study to assess the structure, core and support surveillance functions using modified WHO generic questionnaires. All 34 districts in the state and randomly identified 46 facilities and 25 labs were included in the study. Results Case definitions were rarely used at the periphery. Limited laboratory capacity at all levels compromised case and outbreak confirmation. Only 53% districts could confirm all priority diseases. Stool sample processing was the weakest at the periphery. Availability of transport media, trained staff, and rapid diagnostic tests were main challenges at the periphery. Data analysis was weak at both district and facility levels. Outbreak thresholds were better understood at facility level (59%) than at the district (18%). None of the outbreak indicator targets were met and submission of final outbreak report was the weakest. Feedback and training was significantly better (p < 0.0001) at district level (65%; 76%) than at facility level (15%; 37%). Supervision was better at the facility level (37%) than at district (18%) and so were coordination, communication and logistic resources. Contractual part time positions, administrative delays in recruitment, and vacancies (30%) were main human resource issues that hampered system performance. Conclusions Significant progress has been made in the core and support surveillance functions in Maharashtra, however some challenges exist. Support functions (laboratory, transport and communication equipment, training, supervision, human and other resources) are particularly weak at the district level. Structural integration and establishing permanent state and district surveillance officer positions will ensure leadership; improve performance; support continuity; and offer sustainability to the program. Institutionalizing the integrated disease surveillance strategy through skills based personnel development and infrastructure strengthening at district levels is the only way to avoid it from ending up isolated! Improving surveillance quality should be the next on agenda for the state. PMID:23764137
Tobias, B.; Chen, M.; Classen, I. G. J.; ...
2016-04-15
The electromagnetic coupling of helical modes, including those having different toroidal mode numbers, modifies the distribution of toroidal angular momentum in tokamak discharges. This can have deleterious effects on other transport channels as well as on magnetohydrodynamic (MHD) stability and disruptivity. At low levels of externally injected momentum, the coupling of core-localized modes initiates a chain of events, whereby flattening of the core rotation profile inside successive rational surfaces leads to the onset of a large m/n = 2/1 tearing mode and locked-mode disruption. Furthermore, with increased torque from neutral beam injection, neoclassical tearing modes in the core may phase-lock to each other without locking to external fields or structures that are stationary in the laboratory frame. The dynamic processes observed in these cases are in general agreement with theory, and detailed diagnosis allows for momentum transport analysis to be performed, revealing a significant torque density that peaks near the 2/1 rational surface. However, as the coupled rational surfaces are brought closer together by reducing q95, additional momentum transport in excess of that required to attain a phase-locked state is sometimes observed. Rather than maintaining zero differential rotation (as is predicted to be dynamically stable by single-fluid, resistive MHD theory), these discharges develop hollow toroidal plasma fluid rotation profiles with reversed plasma flow shear in the region between the m/n = 3/2 and 2/1 islands. Additional forces expressed in this state are not readily accounted for, and therefore, analysis of these data highlights the impact of mode coupling on torque balance and the challenges associated with predicting the rotation dynamics of a fusion reactor, a key issue for ITER. Published by AIP Publishing.
Monte Carlo modelling of TRIGA research reactor
NASA Astrophysics Data System (ADS)
El Bakkari, B.; Nacir, B.; El Bardouni, T.; El Younoussi, C.; Merroun, O.; Htet, A.; Boulaich, Y.; Zoubair, M.; Boukhal, H.; Chakir, M.
2010-10-01
The Moroccan 2 MW TRIGA MARK II research reactor at Centre des Etudes Nucléaires de la Maâmora (CENM) achieved initial criticality on May 2, 2007. The reactor is designed to effectively implement the various fields of basic nuclear research, manpower training, and production of radioisotopes for their use in agriculture, industry, and medicine. This study deals with the neutronic analysis of the 2-MW TRIGA MARK II research reactor at CENM and validation of the results by comparisons with the experimental, operational, and available final safety analysis report (FSAR) values. The study was prepared in collaboration between the Laboratory of Radiation and Nuclear Systems (ERSN-LMR) of the Faculty of Sciences of Tetuan (Morocco) and CENM. The 3-D continuous energy Monte Carlo code MCNP (version 5) was used to develop a versatile and accurate full model of the TRIGA core. The model represents in detail all components of the core with literally no physical approximation. Continuous energy cross-section data from the more recent nuclear data evaluations (ENDF/B-VI.8, ENDF/B-VII.0, JEFF-3.1, and JENDL-3.3), as well as S(α, β) thermal neutron scattering functions distributed with the MCNP code, were used. The cross-section libraries were generated by using the NJOY99 system updated to its more recent patch file "up259". The consistency and accuracy of both the Monte Carlo simulation and neutron transport physics were established by benchmarking the TRIGA experiments. Core excess reactivity, total and integral control rod worths, as well as power peaking factors were used in the validation process. Results of calculations are analysed and discussed.
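For context, a standard relation rather than anything specific to this paper: the core excess reactivity used in such benchmarks is conventionally obtained from the effective multiplication factor k_eff calculated (or measured) with all control rods withdrawn,

\rho_{\mathrm{excess}} = \frac{k_{\mathrm{eff}} - 1}{k_{\mathrm{eff}}},

and is often quoted in dollars by dividing by the effective delayed neutron fraction \beta_{\mathrm{eff}}.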
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tobias, B.; Grierson, B. A.; Okabayashi, M.
2016-05-15
The electromagnetic coupling of helical modes, even those having different toroidal mode numbers, modifies the distribution of toroidal angular momentum in tokamak discharges. This can have deleterious effects on other transport channels as well as on magnetohydrodynamic (MHD) stability and disruptivity. At low levels of externally injected momentum, the coupling of core-localized modes initiates a chain of events, whereby flattening of the core rotation profile inside successive rational surfaces leads to the onset of a large m/n = 2/1 tearing mode and locked-mode disruption. With increased torque from neutral beam injection, neoclassical tearing modes in the core may phase-lock to each other without locking to external fields or structures that are stationary in the laboratory frame. The dynamic processes observed in these cases are in general agreement with theory, and detailed diagnosis allows for momentum transport analysis to be performed, revealing a significant torque density that peaks near the 2/1 rational surface. However, as the coupled rational surfaces are brought closer together by reducing q95, additional momentum transport in excess of that required to attain a phase-locked state is sometimes observed. Rather than maintaining zero differential rotation (as is predicted to be dynamically stable by single-fluid, resistive MHD theory), these discharges develop hollow toroidal plasma fluid rotation profiles with reversed plasma flow shear in the region between the m/n = 3/2 and 2/1 islands. The additional forces expressed in this state are not readily accounted for, and therefore, analysis of these data highlights the impact of mode coupling on torque balance and the challenges associated with predicting the rotation dynamics of a fusion reactor, a key issue for ITER.
Whyte, E F; Richter, C; O'Connor, S; Moran, K A
2018-02-01
Deficits in trunk control predict ACL injuries, which frequently occur during high-risk activities such as cutting. However, no existing trunk control/core stability program has been found to positively affect trunk kinematics during cutting activities. This study investigated the effectiveness of a 6-week dynamic core stability program (DCS) on the biomechanics of anticipated and unanticipated side and crossover cutting maneuvers. Thirty-one male varsity footballers participated in this randomized controlled trial. Three-dimensional trunk and lower limb biomechanics were captured in a motion analysis laboratory during the weight acceptance phase of anticipated and unanticipated side and crossover cutting maneuvers at baseline and 6-week follow-up. The DCS group performed a DCS program three times weekly for 6 weeks in a university rehabilitation room. Both the DCS and control groups concurrently completed their regular practice and match play. Statistical parametric mapping and repeated measures analysis of variance were used to determine any group (DCS vs control) by time (pre vs post) interactions. The DCS resulted in greater internal hip extensor (P = .017, η² = 0.079), smaller internal knee valgus (P = .026, η² = 0.076), and smaller internal knee external rotator moments (P = .041, η² = 0.066) during anticipated side cutting compared with the control group. It also led to reduced posterior ground reaction forces for all cutting activities (P = .015-.030, η² = 0.074-0.105). A 6-week DCS program did not affect trunk kinematics, but it did reduce a small number of biomechanical risk factors for ACL injury, predominantly during anticipated side cutting. A DCS program could play a role in multimodal ACL injury prevention programs. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Moreira, Daniel M; Nickel, J Curtis; Andriole, Gerald L; Castro-Santamaria, Ramiro; Freedland, Stephen J
2015-09-01
To evaluate whether baseline acute and chronic prostate inflammation among men with initial negative biopsy for prostate cancer (PC) is associated with PC volume at the 2-year repeat prostate biopsy in a clinical trial with systematic biopsies. Retrospective analysis of 886 men with negative baseline prostate biopsy and positive 2-year repeat biopsy in the Reduction by Dutasteride of PC Events (REDUCE) study. Acute and chronic inflammation and tumor volume were determined by central pathology. The association of baseline inflammation with 2-year repeat biopsy cancer volume was evaluated with linear and Poisson regressions controlling for demographics and laboratory variables. Chronic, acute inflammation, and both were detected in 531 (60%), 12 (1%), and 84 (9%) baseline biopsies, respectively. Acute and chronic inflammation were significantly associated with each other (P < 0.001). Chronic inflammation was associated with larger prostate (P < 0.001) and lower pre-repeat biopsy PSA (P = 0.01). At 2-year biopsy, baseline chronic inflammation was associated with lower mean tumor volume (2.07 µl vs. 3.15 µl; P = 0.001), number of biopsy cores involved (1.78 vs. 2.19; P < 0.001), percent of cores involved (17.8% vs. 22.8%; P < 0.001), core involvement (0.21 µl vs. 0.31 µl; P < 0.001), and overall percent tumor involvement (1.40% vs. 2.01%; P < 0.001). Results were unchanged in multivariable analysis. Baseline acute inflammation was not associated with any tumor volume measurement. In a cohort of men with 2-year repeat prostate biopsy positive for PC after a negative baseline biopsy, baseline chronic inflammation was associated with lower PC volume. © 2015 Wiley Periodicals, Inc.
Development of a SPARK Training Dataset
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sayre, Amanda M.; Olson, Jarrod R.
2015-03-01
In its first five years, the National Nuclear Security Administration’s (NNSA) Next Generation Safeguards Initiative (NGSI) sponsored more than 400 undergraduate, graduate, and post-doctoral students in internships and research positions (Wyse 2012). In the past seven years, the NGSI program has produced, and continues to produce, a large body of scientific, technical, and policy work in targeted core safeguards capabilities and human capital development activities. Not only does the NGSI program carry out activities across multiple disciplines, but also across all U.S. Department of Energy (DOE)/NNSA locations in the United States. However, products are not readily shared among disciplines and across locations, nor are they archived in a comprehensive library. Rather, knowledge of NGSI-produced literature is localized to the researchers, clients, and internal laboratory/facility publication systems such as the Electronic Records and Information Capture Architecture (ERICA) at the Pacific Northwest National Laboratory (PNNL). There is also no incorporated way of analyzing existing NGSI literature to determine whether the larger NGSI program is achieving its core safeguards capabilities and activities. A complete library of NGSI literature could prove beneficial to a cohesive, sustainable, and more economical NGSI program. The Safeguards Platform for Automated Retrieval of Knowledge (SPARK) has been developed as a knowledge storage, retrieval, and analysis capability to capture safeguards knowledge so that it exists beyond the lifespan of NGSI. During the development process, it was necessary to build a SPARK training dataset (a corpus of documents) for initial entry into the system and for demonstration purposes. We manipulated these data to gain new information about the breadth of NGSI publications and evaluated the science-policy interface at PNNL as a practical demonstration of SPARK’s intended analysis capability. The analysis demonstration sought to answer the question, “Who leads research and development at PNNL, scientists or policy researchers?” The analysis was inconclusive as to whether policy researchers or scientists are the primary drivers for research at PNNL. However, the dataset development and analysis activity did demonstrate the utility and usability of the SPARK dataset. After the initiation of the NGSI program there is a clear increase in the number of publications of safeguards products. Employing the natural language analysis tool IN-SPIRE™ showed the presence of vocation- and topic-specific vernacular within NGSI sub-topics. The methodology developed to define the scope of the dataset was useful in describing safeguards applications, but may be applicable for research on other topics beyond safeguards. The analysis emphasized the need for an expanded dataset to fully understand the scope of safeguards publications and research both nationally and internationally. As the SPARK dataset grows to include publications outside PNNL, topics crosscutting disciplines and DOE/NNSA locations should become more apparent. NGSI was established in 2008 to cultivate the next generation of safeguards professionals and support the development of core safeguards capabilities (NNSA 2012). Now a robust system to preserve and share institutional memory such as SPARK is needed to inspire and equip the next generation of safeguards experts, technologies, and policies.
Chamber-core structures for fairing acoustic mitigation
NASA Astrophysics Data System (ADS)
Ardelean, Emil; Williams, Andrew; Korshin, Nicholas; Henderson, Kyle; Lane, Steven; Richard, Robert
2005-05-01
Extreme noise and vibration levels at lift-off and during ascent can damage sensitive payload components. Recently, the Air Force Research Laboratory, Space Vehicles Directorate has investigated a composite structure fabrication approach, called chamber-core, for building payload fairings. Chamber-core offers a strong, lightweight structure with inherent noise attenuation characteristics. It uses one-inch square axial tubes that are sandwiched between inner and outer face-sheets to form a cylindrical fairing structure. These hollow tubes can be used as acoustic dampers to attenuate the amplitude response of low frequency acoustic resonances within the fairing's volume. A cylindrical, graphite-epoxy chamber-core structure was built to study noise transmission characteristics and to quantify the achievable performance improvement. The cylinder was tested in a semi-reverberant acoustics laboratory using band-limited random noise at sound pressure levels up to 110 dB. The performance was measured using external and internal microphones. The noise reduction was computed as the ratio of the spatially averaged external response to the spatially averaged interior response. The noise reduction provided by the chamber-core cylinder was measured over three bandwidths: 20 Hz to 500 Hz, 20 Hz to 2000 Hz, and 20 Hz to 5000 Hz. For the bare cylinder with no acoustic resonators, the structure provided approximately 13 dB of attenuation over the 20 Hz to 500 Hz bandwidth. With the axial tubes acting as acoustic resonators at various frequencies over the bandwidth, the noise reduction provided by the cylinder increased to 18.2 dB, an overall increase of 4.8 dB over the bandwidth. Narrow-band reductions greater than 10 dB were observed at specific low frequency acoustic resonances. This was accomplished with virtually no added mass to the composite cylinder.
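Expressed as a formula (our notation, consistent with the ratio described above but not quoted from the paper): the noise reduction in decibels follows from the spatially averaged mean-square sound pressures measured outside and inside the cylinder,

NR = 10\,\log_{10}\!\left(\frac{\langle p_{\mathrm{ext}}^{2} \rangle}{\langle p_{\mathrm{int}}^{2} \rangle}\right)\ \mathrm{dB},

so a 4.8 dB increase in NR corresponds to roughly a factor-of-three further reduction in interior mean-square pressure.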
Smith, R.L.; Garabedian, S.P.; Brooks, M.H.
1996-01-01
The transport of many solutes in groundwater is dependent upon the relative rates of physical flow and microbial metabolism. Quantifying rates of microbial processes under subsurface conditions is difficult and is most commonly approximated using laboratory studies with aquifer materials. In this study, we measured in situ rates of denitrification in a nitrate-contaminated aquifer using small-scale, natural-gradient tracer tests and compared the results with rates obtained from laboratory incubations with aquifer core material. Activity was measured using the acetylene block technique. For the tracer tests, co-injection of acetylene and bromide into the aquifer produced a 30 µM increase in nitrous oxide after 10 m of transport (23-30 days). An advection-dispersion transport model was modified to include an acetylene-dependent nitrous oxide production term and used to simulate the tracer breakthrough curves. The model required a 4-day lag period and a relatively low sensitivity to acetylene to match the narrow nitrous oxide breakthrough curves. Estimates of in situ denitrification rates were 0.60 and 1.51 nmol of N2O produced per cm³ of aquifer per day for two successive tests. Aquifer core material collected from the tracer test site and incubated as mixed slurries in flasks and as intact cores yielded rates that were 1.2-26 times higher than the tracer test rate estimates. Results with the coring-dependent techniques were variable and subject to the small-scale heterogeneity within the aquifer, while the tracer tests integrated the heterogeneity along a flow path, giving a rate estimate that is more applicable to transport at the scale of the aquifer.
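A sketch of the kind of model described above (our notation and functional form; the paper's exact formulation is not reproduced here): a one-dimensional advection-dispersion equation for nitrous oxide with an acetylene-dependent production term that switches on after the fitted lag,

\frac{\partial C}{\partial t} = D \frac{\partial^{2} C}{\partial x^{2}} - v \frac{\partial C}{\partial x} + k\, f\!\left(C_{\mathrm{C_2H_2}}\right) H\!\left(t - t_{\mathrm{lag}}\right),

where C is the N2O concentration, D the dispersion coefficient, v the groundwater velocity, k the in situ (zero-order) production rate being estimated, f a saturation-type dependence on acetylene concentration reflecting the low sensitivity to acetylene, and H a step function implementing the 4-day lag.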
Kammerer-Jacquet, Solene-Florence; Compérat, Eva; Egevad, Lars; Hes, Ondra; Oxley, Jon; Varma, Murali; Kristiansen, Glen; Berney, Daniel M
2018-04-01
Transperineal template prostate biopsies (TTPB) are performed for assessments after unexpected negative transrectal ultrasound biopsies (TRUSB), correlation with imaging findings and during active surveillance. The impact of TTPBs on pathology has not been analysed. The European Network of Uropathology (ENUP) distributed a survey on TTPB, including how specimens were received, processed and analysed. Two hundred forty-four replies were received from 22 countries with TTPBs seen by 68.4% of the responders (n = 167). Biopsies were received in more than 12 pots in 35.2%. The number of cores embedded per cassette varied between 1 (39.5%) and 3 or more (39.5%). Three levels were cut in 48.3%, between 2 and 3 serial sections in 57.2% and unstained spare sections in 45.1%. No statistical difference was observed with TRUSB management. The number of positive cores was always reported and the majority gave extent per core (82.3%), per region (67.1%) and greatest involvement per core (69.4%). Total involvement in the whole series and continuous/discontinuous infiltrates were reported in 42.2 and 45.4%, respectively. The majority (79.4%) reported Gleason score in each site or core, and 59.6% gave an overall score. A minority (28.5%) provided a map or a diagram. For 19%, TTPB had adversely affected laboratory workload with only 27% managing to negotiate extra costs. Most laboratories process samples thoroughly and report TTPB similarly to TRUSB. Although TTPB have caused considerable extra work, it remains uncosted in most centres. Guidance is needed for workload impact and minimum standards of processing if TTPB work continues to increase.
Standardization of Laboratory Methods for the PERCH Study
Karron, Ruth A.; Morpeth, Susan C.; Bhat, Niranjan; Levine, Orin S.; Baggett, Henry C.; Brooks, W. Abdullah; Feikin, Daniel R.; Hammitt, Laura L.; Howie, Stephen R. C.; Knoll, Maria Deloria; Kotloff, Karen L.; Madhi, Shabir A.; Scott, J. Anthony G.; Thea, Donald M.; Adrian, Peter V.; Ahmed, Dilruba; Alam, Muntasir; Anderson, Trevor P.; Antonio, Martin; Baillie, Vicky L.; Dione, Michel; Endtz, Hubert P.; Gitahi, Caroline; Karani, Angela; Kwenda, Geoffrey; Maiga, Abdoul Aziz; McClellan, Jessica; Mitchell, Joanne L.; Morailane, Palesa; Mugo, Daisy; Mwaba, John; Mwansa, James; Mwarumba, Salim; Nyongesa, Sammy; Panchalingam, Sandra; Rahman, Mustafizur; Sawatwong, Pongpun; Tamboura, Boubou; Toure, Aliou; Whistler, Toni; O’Brien, Katherine L.; Murdoch, David R.
2017-01-01
The Pneumonia Etiology Research for Child Health (PERCH) study was conducted across 7 diverse research sites and relied on standardized clinical and laboratory methods for the accurate and meaningful interpretation of pneumonia etiology data. Blood, respiratory specimens, and urine were collected from children aged 1–59 months hospitalized with severe or very severe pneumonia and from community controls of the same age without severe pneumonia, and were tested with an extensive array of laboratory diagnostic tests. A standardized testing algorithm and standard operating procedures were applied across all study sites. Site laboratories received uniform training, equipment, and reagents for core testing methods. Standardization was further assured by routine teleconferences, in-person meetings, site monitoring visits, and internal and external quality assurance testing. Targeted confirmatory testing and testing by specialized assays were done at a central reference laboratory. PMID:28575358
Staff Scientist | Center for Cancer Research
The scientist will be tasked with independent research projects that support and/or further our laboratory's goals as determined by the Principal Investigator. The scientist will be responsible for overseeing daily operations and coordination of projects in close conjunction with all laboratory personnel. The scientist will participate in teaching laboratory methods to first-time post-docs, research fellows, and students. The scientist will work closely with a full-time research biologist, both in collaborating on research projects and in lab-critical administrative tasks such as IRB approval, animal protocols, and budgeting. Our laboratory has two post-doctoral researchers at any given time. This is a great opportunity for candidates who are interested in cancer biology and want to grow their research career by working in our program with outstanding support from other established laboratories and core facilities in the National Cancer Institute.
NASA Astrophysics Data System (ADS)
Limaye, A. B.; Komatsu, Y.; Suzuki, K.; Paola, C.
2017-12-01
Turbidity currents deliver clastic sediment from continental margins to the deep ocean, and are the main driver of landscape and stratigraphic evolution in many low-relief, submarine environments. The sedimentary architecture of turbidites, including the spatial organization of coarse and fine sediments, is closely related to the aggradation, scour, and lateral shifting of channels. Seismic stratigraphy indicates that submarine meandering channels often aggrade rapidly relative to lateral shifting, and develop channel sand bodies with high vertical connectivity. In comparison, the stratigraphic architecture developed by submarine braided channels is relatively uncertain. We present a new stratigraphic model for submarine braided channels that integrates predictions from laboratory experiments and flow modeling with constraints from sediment cores. In the laboratory experiments, a saline density current developed subaqueous channels in plastic sediment. The channels aggraded to form a deposit with a vertical scale of approximately five channel depths. We collected topography data during aggradation to (1) establish relative stratigraphic age, and (2) estimate the sorting patterns of a hypothetical grain size distribution. We applied a numerical flow model to each topographic surface and used modeled flow depth as a proxy for relative grain size. We then conditioned the resulting stratigraphic model to observed grain size distributions using sediment core data from the Nankai Trough, offshore Japan. Using this stratigraphic model, we establish new, quantitative predictions for the two- and three-dimensional connectivity of coarse sediment as a function of fine-sediment fraction. Using this case study as an example, we will highlight outstanding challenges in relating the evolution of low-relief landscapes to the stratigraphic record.
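The workflow outlined above, a stack of time-stamped topographic surfaces with modeled flow depth standing in for relative grain size and later conditioning to core data, can be sketched as a simple gridded procedure. The quantile-matching step and the variable names below are illustrative assumptions rather than the authors' implementation:

    import numpy as np

    def proxy_grain_size(flow_depth, observed_sizes):
        """Map modeled flow depth to grain size by quantile matching
        (assumption for illustration: deeper flow deposits coarser sediment)."""
        ranks = np.argsort(np.argsort(flow_depth.ravel()))
        quantiles = ranks / max(ranks.size - 1, 1)
        return np.quantile(observed_sizes, quantiles).reshape(flow_depth.shape)

    def build_stratigraphy(surfaces, observed_sizes):
        """surfaces: list of (topography, flow_depth) grids in order of deposition.
        Returns one grain-size grid per preserved surface (erosion ignored for brevity)."""
        return [(topo, proxy_grain_size(depth, observed_sizes)) for topo, depth in surfaces]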
Reconfigurable Hardware Adapts to Changing Mission Demands
NASA Technical Reports Server (NTRS)
2003-01-01
A new class of computing architectures and processing systems, which use reconfigurable hardware, is creating a revolutionary approach to implementing future spacecraft systems. With the increasing complexity of electronic components, engineers must design next-generation spacecraft systems with new technologies in both hardware and software. Derivation Systems, Inc., of Carlsbad, California, has been working through NASA's Small Business Innovation Research (SBIR) program to develop key technologies in reconfigurable computing and Intellectual Property (IP) soft cores. Founded in 1993, Derivation Systems has received several SBIR contracts from NASA's Langley Research Center and the U.S. Department of Defense Air Force Research Laboratories in support of its mission to develop hardware and software for high-assurance systems. Through these contracts, Derivation Systems began developing leading-edge technology in formal verification, embedded Java, and reconfigurable computing for its PF3100, Derivational Reasoning System (DRS), FormalCORE IP, FormalCORE PCI/32, FormalCORE DES, and LavaCORE Configurable Java Processor, which are designed for greater flexibility and security on all space missions.
Resolving Supercritical Orion Cores
NASA Astrophysics Data System (ADS)
Li, Di; Chapman, N.; Goldsmith, P.; Velusamy, T.
2009-01-01
The theoretical framework for high mass star formation (HMSF) is unclear. Observations reveal a seeming dichotomy between high- and low-mass star formation, with HMSF occurring only in Giant Molecular Clouds (GMC), mostly in clusters, and with higher star formation efficiencies than low-mass star formation. One crucial constraint on any theoretical model is the dynamical state of massive cores, in particular, whether a massive core is in supercritical collapse. Based on the mass-size relation of dust emission, we select likely unstable targets from a sample of massive cores (Li et al. 2007 ApJ 655, 351) in the nearest GMC, Orion. We have obtained N2H+ (1-0) maps using CARMA with resolution (2.5", 0.006 pc) significantly better than existing observations. We present observational and modeling results for ORI22. By revealing the dynamic structure down to the Jeans scale, the CARMA data confirm the dominance of gravity over turbulence in this core. This work was performed by the Jet Propulsion Laboratory, California Institute of Technology, under contract with the National Aeronautics and Space Administration.
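For reference, the Jeans scale invoked above is the standard thermal-gravitational stability length; the expressions below are the textbook forms rather than values derived in this work:

\[
\lambda_J = c_s \sqrt{\frac{\pi}{G \rho}}, \qquad
M_J \approx \frac{4\pi}{3}\, \rho \left(\frac{\lambda_J}{2}\right)^{3},
\]

where c_s is the isothermal sound speed and \rho the mean density; a core is broadly supercritical when its mass exceeds this threshold, with turbulent and magnetic support raising the effective critical mass.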
Scientific information repository assisting reflectance spectrometry in legal medicine.
Belenki, Liudmila; Sterzik, Vera; Bohnert, Michael; Zimmermann, Klaus; Liehr, Andreas W
2012-06-01
Reflectance spectrometry is a fast and reliable method for the characterization of human skin if the spectra are analyzed with respect to a physical model describing the optical properties of human skin. For a field study performed at the Institute of Legal Medicine and the Freiburg Materials Research Center of the University of Freiburg, a scientific information repository has been developed, which is a variant of an electronic laboratory notebook and assists in the acquisition, management, and high-throughput analysis of reflectance spectra in heterogeneous research environments. At the core of the repository is a database management system hosting the master data. It is filled with primary data via a graphical user interface (GUI) programmed in Java, which also enables the user to browse the database and access the results of data analysis. The latter is carried out via Matlab, Python, and C programs, which retrieve the primary data from the scientific information repository, perform the analysis, and store the results in the database for further usage.
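The division of labor described here, a central database of master data filled through a Java GUI and queried by external analysis programs that write their results back, can be illustrated with a minimal Python sketch. The table and column names are hypothetical; the repository's actual schema and its Matlab and C clients are not shown:

    import sqlite3

    def fit_skin_model(spectrum_blob):
        # Placeholder for the physical skin-optics model fit applied to one spectrum.
        return {"fit_quality": 0.0}

    conn = sqlite3.connect("repository.db")  # hypothetical local copy of the master data
    cur = conn.cursor()
    cur.execute("SELECT id, spectrum FROM reflectance_spectra WHERE analyzed = 0")
    for spec_id, blob in cur.fetchall():
        result = fit_skin_model(blob)
        cur.execute(
            "UPDATE reflectance_spectra SET analyzed = 1, fit_quality = ? WHERE id = ?",
            (result["fit_quality"], spec_id),
        )
    conn.commit()
    conn.close()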
Development of advanced strain diagnostic techniques for reactor environments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fleming, Darryn D.; Holschuh, Thomas Vernon; Miller, Timothy J.
2013-02-01
The following research was conducted as a Laboratory Directed Research and Development (LDRD) initiative at Sandia National Laboratories. The long-term goals of the program include sophisticated diagnostics of advanced fuels testing for nuclear reactors for the Department of Energy (DOE) Gen IV program, with the future capability to provide real-time measurement of strain in fuel rod cladding during operation in situ at any research or power reactor in the United States. By quantifying the stress and strain in fuel rods, it is possible to significantly improve fuel rod design, and consequently, to improve the performance and lifetime of the cladding. During the past year of this program, two sets of experiments were performed: small-scale tests to ensure reliability of the gages, and reactor pulse experiments involving the most viable samples in the Annular Core Research Reactor (ACRR), located onsite at Sandia. Strain measurement techniques that can provide useful data in the extreme environment of a nuclear reactor core are needed to characterize nuclear fuel rods. This report documents the progression of solutions to this issue that were explored for feasibility in FY12 at Sandia National Laboratories, Albuquerque, NM.
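The report does not spell out the conversion from gage output to strain, but for resistive strain gages of the kind evaluated here the standard relation is strain = (ΔR/R) / GF, where GF is the gage factor. A minimal sketch, assuming a nominal gage factor of 2.0:

    def strain_from_resistance(delta_r, r_nominal, gage_factor=2.0):
        """Engineering strain from a resistive gage: epsilon = (dR/R) / GF.
        The gage factor here is an assumed nominal value, not a measured one."""
        return (delta_r / r_nominal) / gage_factor

    # Example: a 0.24 ohm change on a 120 ohm gage corresponds to 1000 microstrain.
    print(strain_from_resistance(0.24, 120.0) * 1e6, "microstrain")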
Gerwin, Philip M; Norinsky, Rada M; Tolwani, Ravi J
2018-03-01
Laboratory animal programs and core laboratories often set service rates based on cost estimates. However, actual costs may be unknown, and service rates may not reflect the actual cost of services. Accurately evaluating the actual costs of services can be challenging and time-consuming. We used a time-driven activity-based costing (ABC) model to determine the cost of services provided by a resource laboratory at our institution. The time-driven approach is a more efficient approach to calculating costs than using a traditional ABC model. We calculated only 2 parameters: the time required to perform an activity and the unit cost of the activity based on employee cost. This method allowed us to rapidly and accurately calculate the actual cost of services provided, including microinjection of a DNA construct, microinjection of embryonic stem cells, embryo transfer, and in vitro fertilization. We successfully implemented a time-driven ABC model to evaluate the cost of these services and the capacity of labor used to deliver them. We determined how actual costs compared with current service rates. In addition, we determined that the labor supplied to conduct all services (10,645 min/wk) exceeded the practical labor capacity (8400 min/wk), indicating that the laboratory team was highly efficient and that additional labor capacity was needed to prevent overloading of the current team. Importantly, this time-driven ABC approach allowed us to establish a baseline model that can easily be updated to reflect operational changes or changes in labor costs. We demonstrated that a time-driven ABC model is a powerful management tool that can be applied to other core facilities as well as to entire animal programs, providing valuable information that can be used to set rates based on the actual cost of services and to improve operating efficiency.
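The two parameters of the time-driven ABC model, minutes of labor per activity and cost per minute of labor, combine directly into a service cost and a weekly capacity check. The rates and volumes below are illustrative placeholders, not the laboratory's actual figures; only the 8,400 min/wk practical capacity is taken from the abstract:

    # Time-driven activity-based costing: service cost = minutes required x cost per minute.
    cost_per_minute = 1.10  # illustrative fully loaded labor cost, $/min
    minutes_per_unit = {    # illustrative labor time per unit of service
        "DNA construct microinjection": 90,
        "ES cell microinjection": 120,
        "embryo transfer": 60,
        "in vitro fertilization": 150,
    }
    weekly_volume = {"DNA construct microinjection": 20, "ES cell microinjection": 10,
                     "embryo transfer": 35, "in vitro fertilization": 8}

    for service, minutes in minutes_per_unit.items():
        print(f"{service}: ${minutes * cost_per_minute:.2f} per unit")

    labor_used = sum(minutes_per_unit[s] * weekly_volume[s] for s in weekly_volume)
    practical_capacity = 8400  # min/wk, from the abstract
    print(f"labor used: {labor_used} min/wk of {practical_capacity} min/wk capacity")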
FIELD TRAPPING OF SUBSURFACE VAPOR PHASE PETROLEUM HYDROCARBONS
Soil gas samples from intact soil cores were collected on adsorbents at a field site, then thermally desorbed and analyzed by laboratory gas chromatography (GC). Vertical concentration profiles of predominant vapor phase petroleum hydrocarbons under ambient conditions were obtained...
Code of Federal Regulations, 2014 CFR
2014-07-01
... include, but is not limited to, identification of lithologic and fossil content, core analyses, laboratory... geophysical information. Interpreted geological information means knowledge, often in the form of schematic... geological information. Interpreted geophysical information means knowledge, often in the form of schematic...
Code of Federal Regulations, 2014 CFR
2014-07-01
... fossil content, core analyses, laboratory analyses of physical and chemical properties, well logs or... geological information means knowledge, often in the form of schematic cross sections, 3-dimensional... form of seismic cross sections, 3-dimensional representations, and maps, developed by determining the...
Code of Federal Regulations, 2012 CFR
2012-07-01
... fossil content, core analyses, laboratory analyses of physical and chemical properties, well logs or... geological information means knowledge, often in the form of schematic cross sections, 3-dimensional... form of seismic cross sections, 3-dimensional representations, and maps, developed by determining the...
Code of Federal Regulations, 2013 CFR
2013-07-01
... include, but is not limited to, identification of lithologic and fossil content, core analyses, laboratory... geophysical information. Interpreted geological information means knowledge, often in the form of schematic... geological information. Interpreted geophysical information means knowledge, often in the form of schematic...
Code of Federal Regulations, 2012 CFR
2012-07-01
... include, but is not limited to, identification of lithologic and fossil content, core analyses, laboratory... geophysical information. Interpreted geological information means knowledge, often in the form of schematic... geological information. Interpreted geophysical information means knowledge, often in the form of schematic...