Sample records for laboratory core analysis

  1. Magnetic resonance imaging in laboratory petrophysical core analysis

    NASA Astrophysics Data System (ADS)

    Mitchell, J.; Chandrasekera, T. C.; Holland, D. J.; Gladden, L. F.; Fordham, E. J.

    2013-05-01

    Magnetic resonance imaging (MRI) is a well-known technique in medical diagnosis and materials science. In the more specialized arena of laboratory-scale petrophysical rock core analysis, the role of MRI has undergone a substantial change in focus over the last three decades. Initially, alongside the continual drive to exploit higher magnetic field strengths in MRI applications for medicine and chemistry, the same trend was followed in core analysis. However, the spatial resolution achievable in heterogeneous porous media is inherently limited due to the magnetic susceptibility contrast between solid and fluid. As a result, imaging resolution at the length-scale of typical pore diameters is not practical and so MRI of core-plugs has often been viewed as an inappropriate use of expensive magnetic resonance facilities. Recently, there has been a paradigm shift in the use of MRI in laboratory-scale core analysis. The focus is now on acquiring data in the laboratory that are directly comparable to data obtained from magnetic resonance well-logging tools (i.e., a common physics of measurement). To maintain consistency with well-logging instrumentation, it is desirable to measure distributions of transverse (T2) relaxation time, the industry-standard metric in well-logging, at the laboratory scale. These T2 distributions can be spatially resolved over the length of a core-plug. The use of low-field magnets in the laboratory environment is optimal for core analysis not only because the magnetic field strength is closer to that of well-logging tools, but also because the magnetic susceptibility contrast is minimized, allowing the acquisition of quantitative image voxel (or pixel) intensities that are directly scalable to liquid volume. Beyond simple determination of macroscopic rock heterogeneity, it is possible to utilize the spatial resolution for monitoring forced displacement of oil by water or chemical agents, determining capillary pressure curves, and estimating
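
    A minimal Python sketch of the kind of inversion behind such T2 distributions: a CPMG echo decay is modeled as a sum of exponentials and inverted by Tikhonov-regularized non-negative least squares. The echo times, T2 grid, synthetic decay, and regularization weight are illustrative assumptions, not values from the paper.

      import numpy as np
      from scipy.optimize import nnls

      # Assumed acquisition: echo times out to 2 s, T2 grid spanning 0.1 ms to 10 s
      t = np.linspace(2e-4, 2.0, 2000)                    # echo times (s)
      T2 = np.logspace(-4, 1, 100)                        # log-spaced T2 grid (s)
      K = np.exp(-t[:, None] / T2[None, :])               # multi-exponential kernel

      # Synthetic bimodal distribution standing in for a measured rock-core decay
      f_true = np.exp(-0.5 * ((np.log10(T2) + 2.0) / 0.2) ** 2) \
             + 0.5 * np.exp(-0.5 * (np.log10(T2) / 0.2) ** 2)
      y = K @ f_true + 0.01 * np.random.randn(t.size)     # noisy echo amplitudes

      # Tikhonov regularization folded into a non-negative least-squares solve
      alpha = 1.0                                         # weight, tuned to the noise level
      K_aug = np.vstack([K, np.sqrt(alpha) * np.eye(T2.size)])
      y_aug = np.concatenate([y, np.zeros(T2.size)])
      f_est, _ = nnls(K_aug, y_aug)                       # recovered T2 distribution

    The integral of the recovered distribution scales with liquid volume, which is what makes quantitative low-field measurements directly scalable to porosity and saturation.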

  2. An improved method for field extraction and laboratory analysis of large, intact soil cores

    USGS Publications Warehouse

    Tindall, J.A.; Hemmen, K.; Dowd, J.F.

    1992-01-01

    Various methods have been proposed for the extraction of large, undisturbed soil cores and for subsequent analysis of fluid movement within the cores. The major problems associated with these methods are expense, cumbersome field extraction, and inadequate simulation of unsaturated flow conditions. A field and laboratory procedure is presented that is economical, convenient, and simulates unsaturated and saturated flow without interface flow problems and can be used on a variety of soil types. In the field, a stainless steel core barrel is hydraulically pressed into the soil (30-cm diam. and 38 cm high), the barrel and core are extracted from the soil, and after the barrel is removed from the core, the core is then wrapped securely with flexible sheet metal and a stainless mesh screen is attached to the bottom of the core for support. In the laboratory the soil core is set atop a porous ceramic plate over which a soil-diatomaceous earth slurry has been poured to assure good contact between plate and core. A cardboard cylinder (mold) is fastened around the core and the empty space filled with paraffin wax. Soil cores were tested under saturated and unsaturated conditions using a hanging water column for potentials ≤0. Breakthrough curves indicated that no interface flow occurred along the edge of the core. This procedure proved to be reliable for field extraction of large, intact soil cores and for laboratory analysis of solute transport.

  3. Oak Ridge National Laboratory Core Competencies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberto, J.B.; Anderson, T.D.; Berven, B.A.

    1994-12-01

    A core competency is a distinguishing integration of capabilities which enables an organization to deliver mission results. Core competencies represent the collective learning of an organization and provide the capacity to perform present and future missions. Core competencies are distinguishing characteristics which offer comparative advantage and are difficult to reproduce. They exhibit customer focus, mission relevance, and vertical integration from research through applications. They are demonstrable by metrics such as level of investment, uniqueness of facilities and expertise, and national impact. The Oak Ridge National Laboratory (ORNL) has identified four core competencies which satisfy the above criteria. Each core competency represents an annual investment of at least $100M and is characterized by an integration of Laboratory technical foundations in physical, chemical, and materials sciences; biological, environmental, and social sciences; engineering sciences; and computational sciences and informatics. The ability to integrate broad technical foundations to develop and sustain core competencies in support of national R&D goals is a distinguishing strength of the national laboratories. The ORNL core competencies are: Energy Production and End-Use Technologies; Biological and Environmental Sciences and Technology; Advanced Materials Synthesis, Processing, and Characterization; and Neutron-Based Science and Technology. The distinguishing characteristics of each ORNL core competency are described. In addition, written material is provided for two emerging competencies: Manufacturing Technologies and Computational Science and Advanced Computing. Distinguishing institutional competencies in the Development and Operation of National Research Facilities, R&D Integration and Partnerships, Technology Transfer, and Science Education are also described. Finally, financial data for the ORNL core competencies are summarized in the appendices.

  4. Laboratory ultrasonic pulse velocity logging for determination of elastic properties from rock core

    NASA Astrophysics Data System (ADS)

    Blacklock, Natalie Erin

    During the development of deep underground excavations spalling and rockbursting have been recognized as significant mechanisms of violent brittle failure. In order to predict whether violent brittle failure will occur, it is important to identify the location of stiffness transitions that are associated with geologic structure. One approach to identify the effect of geologic structures is to apply borehole geophysical tools ahead of the tunnel advance. Stiffness transitions can be identified using mechanical property analysis surveys that combine acoustic velocity and density data to calculate acoustic estimates of elastic moduli. However, logistical concerns arise since the approach must be conducted at the advancing tunnel face. As a result, borehole mechanical property analyses are rarely used. Within this context, laboratory ultrasonic pulse velocity testing has been proposed as a potential alternative to borehole mechanical property analysis since moving the analysis to the laboratory would remove logistical constraints and improve safety for the evaluators. In addition to the traditional method of conducting velocity testing along the core axis, two new methodologies for point-focused testing were developed across the core diameter, and indirectly along intact lengths of drill core. The indirect test procedure was implemented in a continuous ultrasonic velocity test program along 573m of drill core to identify key geologic structures that generated transitions in ultrasonic elastic moduli. The test program was successful at identifying the location of geologic contacts, igneous intrusions, faults and shear structures. Ultrasonic values of Young's modulus and bulk modulus were determined at locations of significant velocity transitions to examine the potential for energy storage and energy release. Comparison of results from different ultrasonic velocity test configurations determined that the indirect test configuration provided underestimates for values of
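

    For reference, the standard isotropic relations used to turn measured P- and S-wave velocities and bulk density into dynamic elastic moduli (the quantities tracked along the core in this kind of survey) are, assuming both velocities are measured:

      G = \rho V_s^2, \qquad
      K = \rho\left(V_p^2 - \tfrac{4}{3}V_s^2\right), \qquad
      E = \frac{\rho V_s^2\,(3V_p^2 - 4V_s^2)}{V_p^2 - V_s^2}, \qquad
      \nu = \frac{V_p^2 - 2V_s^2}{2\,(V_p^2 - V_s^2)}

    Velocity transitions along the core therefore translate directly into transitions in Young's and bulk modulus.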

  5. Implementation of Quality Management in Core Service Laboratories

    PubMed Central

    Creavalle, T.; Haque, K.; Raley, C.; Subleski, M.; Smith, M.W.; Hicks, B.

    2010-01-01

    CF-28 The Genetics and Genomics group of the Advanced Technology Program of SAIC-Frederick exists to bring innovative genomic expertise, tools and analysis to NCI and the scientific community. The Sequencing Facility (SF) provides next generation short read (Illumina) sequencing capacity to investigators using a streamlined production approach. The Laboratory of Molecular Technology (LMT) offers a wide range of genomics core services including microarray expression analysis, miRNA analysis, array comparative genome hybridization, long read (Roche) next generation sequencing, quantitative real time PCR, transgenic genotyping, Sanger sequencing, and clinical mutation detection services to investigators from across the NIH. As the technology supporting this genomic research becomes more complex, the need for basic quality processes within all aspects of the core service groups becomes critical. The Quality Management group works alongside members of these labs to establish or improve processes supporting operations control (equipment, reagent and materials management), process improvement (reengineering/optimization, automation, acceptance criteria for new technologies and tech transfer), and quality assurance and customer support (controlled documentation/SOPs, training, service deficiencies and continual improvement efforts). Implementation and expansion of quality programs within unregulated environments demonstrates SAIC-Frederick's dedication to providing the highest quality products and services to the NIH community.

  6. Automation in clinical biochemistry: core, peripheral, STAT, and specialist laboratories in Australia.

    PubMed

    Streitberg, George S; Angel, Lyndall; Sikaris, Kenneth A; Bwititi, Phillip T

    2012-10-01

    Pathology has developed substantially since the 1990s with the introduction of total laboratory automation (TLA), in response to workloads and the need to improve quality. TLA has enhanced core laboratories, which evolved from discipline-based laboratories. Work practices have changed, with central reception now loading samples onto the Inlet module of the TLA. It is important to continually appraise technology. This study looked at the impact of technology using a self-administered survey to seniors in clinical biochemistry in NATA GX/GY-classified laboratories in Australia. The responses were yes, no, or not applicable and are expressed as percentages of responses. Some of the questions solicited descriptive answers. Eighty-one laboratories responded, and the locations were 63%, 33%, and 4% in capital cities, regional cities, and country towns, respectively. Forty-two percent were public and 58% private. Clinical biochemistry was in all core laboratories of various sizes, and most performed up to 20 tests per sample. Thirty percent of the 121 surveyed laboratories had plans to install an automated line. Fifty-eight percent had hematology and biochemistry instrumentations in their peripheral laboratory, and 16% had a STAT laboratory on the same site as the core laboratory. There were varied instruments in specialist laboratories, and analyzers with embedded computers were in all laboratories. Medium and large laboratories had workstations with integrated instruments, and some large laboratories had TLA. Technology evolution and rising demand for pathology services make it imperative for laboratories to embrace such changes and reorganize the laboratories to take into account point-of-care testing and the efficiencies of core laboratories and TLA.

  7. Application of the Toyota Production System improves core laboratory operations.

    PubMed

    Rutledge, Joe; Xu, Min; Simpson, Joanne

    2010-01-01

    To meet the increased clinical demands of our hospital expansion, improve quality, and reduce costs, our tertiary care, pediatric core laboratory used the Toyota Production System lean processing to reorganize our 24-hour, 7 d/wk core laboratory. A 4-month, consultant-driven process removed waste, led to a physical reset of the space to match the work flow, and developed a work cell for our random access analyzers. In addition, visual controls, single piece flow, standard work, and "5S" were instituted. The new design met our goals as reflected by achieving and maintaining improved turnaround time (TAT; mean for creatinine reduced from 54 to 23 minutes) with increased testing volume (20%), monetary savings (4 full-time equivalents), decreased variability in TAT, and better space utilization (25% gain). The project had the unanticipated consequence of eliminating STAT testing because our in-laboratory TAT for routine testing was less than our prior STAT turnaround goal. The viability of this approach is demonstrated by sustained gains and further PDCA (Plan, Do, Check, Act) improvements during the 4 years after completion of the project.

  8. A laboratory model for solidification of Earth's core

    NASA Astrophysics Data System (ADS)

    Bergman, Michael I.; Macleod-Silberstein, Marget; Haskel, Michael; Chandler, Benjamin; Akpan, Nsikan

    2005-11-01

    To better understand the influence of rotating convection in the outer core on the solidification of the inner core we have constructed a laboratory model for solidification of Earth's core. The model consists of a 15 cm radius hemispherical acrylic tank concentric with a 5 cm radius hemispherical aluminum heat exchanger that serves as the incipient inner core onto which we freeze ice from salt water. Long exposure photographs of neutrally buoyant particles in illuminated planes suggest reduction of flow parallel to the rotation axis. Thermistors in the tank near the heat exchanger show that in experiments with rotation the temperature near the pole is lower than near the equator, unlike for control experiments without rotation or with a polymer that increases the fluid viscosity. The photographs and thermistors suggest that our observation that ice grows faster near the pole than near the equator for experiments with rotation is a result of colder water not readily convecting away from the pole. Because of the reversal of the thermal gradient, we expect faster equatorial solidification in the Earth's core. Such anisotropy in solidification has been suggested as a cause of inner core elastic (and attenuation) anisotropy, though the plausibility of this suggestion will depend on the core Nusselt number and the slope of the liquidus, and the effects of post-solidification deformation. Previous experiments on hexagonal close-packed alloys such as sea ice and zinc-tin have shown that fluid flow in the melt can result in a solidification texture transverse to the solidification direction, with the texture depending on the nature of the flow. A comparison of the visualized flow and the texture of columnar ice crystals in thin sections from these experiments confirms flow-induced transverse textures. This suggests that the convective pattern at the base of the outer core is recorded in the texture of the inner core, and that outer core convection might contribute to the

  9. [Contribution of HCV core antigen testing in HCV diagnosis by test from the company Abbott Laboratories].

    PubMed

    Trbusek, J

    2009-11-01

    Detection of HCV core antigen as direct marker of hepatitis C infection clearly improves diagnosis of this disease (especially reduction of window period) and brings broad clinical utilization. The company Abbott Laboratories offers fully automated laboratory test for measurement of HCV core antigen on ARCHITECT analyzers.

  10. Combustion and Energy Transfer Experiments: A Laboratory Model for Linking Core Concepts across the Science Curriculum

    ERIC Educational Resources Information Center

    Barreto, Jose C.; Dubetz, Terry A.; Schmidt, Diane L.; Isern, Sharon; Beatty, Thomas; Brown, David W.; Gillman, Edward; Alberte, Randall S.; Egiebor, Nosa O.

    2007-01-01

    Core concepts can be integrated throughout lower-division science and engineering courses by using a series of related, cross-referenced laboratory experiments. Starting with butane combustion in chemistry, the authors expanded the underlying core concepts of energy transfer into laboratories designed for biology, physics, and engineering. This…

  11. Updated procedures for using drill cores and cuttings at the Lithologic Core Storage Library, Idaho National Laboratory, Idaho

    USGS Publications Warehouse

    Hodges, Mary K.V.; Davis, Linda C.; Bartholomay, Roy C.

    2018-01-30

    In 1990, the U.S. Geological Survey, in cooperation with the U.S. Department of Energy Idaho Operations Office, established the Lithologic Core Storage Library at the Idaho National Laboratory (INL). The facility was established to consolidate, catalog, and permanently store nonradioactive drill cores and cuttings from subsurface investigations conducted at the INL, and to provide a location for researchers to examine, sample, and test these materials. The facility is open by appointment to researchers for examination, sampling, and testing of cores and cuttings. This report describes the facility and cores and cuttings stored at the facility. Descriptions of cores and cuttings include the corehole names, corehole locations, and depth intervals available. Most cores and cuttings stored at the facility were drilled at or near the INL, on the eastern Snake River Plain; however, two cores drilled on the western Snake River Plain are stored for comparative studies. Basalt, rhyolite, sedimentary interbeds, and surficial sediments compose most cores and cuttings, most of which are continuous from land surface to their total depth. The deepest continuously drilled core stored at the facility was drilled to 5,000 feet below land surface. This report describes procedures and researchers' responsibilities for access to the facility and for examination, sampling, and return of materials.

  12. Core courses in public health laboratory science and practice: findings from 2006 and 2011 surveys.

    PubMed

    DeBoy, John M; Beck, Angela J; Boulton, Matthew L; Kim, Deborah H; Wichman, Michael D; Luedtke, Patrick F

    2013-01-01

    We identified academic training courses or topics most important to the careers of U.S. public health, environmental, and agricultural laboratory (PHEAL) scientist-managers and directors, and determined what portions of the national PHEAL workforce completed these courses. We conducted electronic national surveys in 2006 and 2011, and analyzed data using numerical ranking, Chi-square tests comparing rates, and Spearman's formula measuring rank correlation. In 2006, 40 of 50 PHEAL directors identified 56 course topics as either important, useful, or not needed for someone in their position. These course topics were then ranked to provide a list of 31 core courses. In 2011, 1,659 of approximately 5,555 PHEAL scientific and technical staff, using a subset of 25 core courses, evidenced higher core course completion rates associated with higher-level job classification, advanced academic degree, and age. The 2011 survey showed that 287 PHEAL scientist-managers and directors, on average, completed 37.7% (n=5/13) of leadership/managerial core courses and 51.7% (n=6/12) of scientific core courses. For 1,659 laboratorians in all scientific and technical classifications, core-subject completion rates were higher in local laboratories (42.8%, n=11/25) than in state (36.0%, n=9/25), federal (34.4%, n=9/25), and university (31.2%, n=8/25) laboratories. There is a definable range of scientific, leadership, and managerial core courses needed by PHEAL scientist-managers and directors to function effectively in their positions. Potential PHEAL scientist-managers and directors need greater and continuing access to these courses, and academic and practice entities supporting development of this workforce should adopt curricula and core competencies aligned with these course topics.

  13. Core Courses in Public Health Laboratory Science and Practice: Findings from 2006 and 2011 Surveys

    PubMed Central

    Beck, Angela J.; Boulton, Matthew L.; Kim, Deborah H.; Wichman, Michael D.; Luedtke, Patrick F.

    2013-01-01

    Objectives We identified academic training courses or topics most important to the careers of U.S. public health, environmental, and agricultural laboratory (PHEAL) scientist-managers and directors, and determined what portions of the national PHEAL workforce completed these courses. Methods We conducted electronic national surveys in 2006 and 2011, and analyzed data using numerical ranking, Chi-square tests comparing rates, and Spearman's formula measuring rank correlation. Results In 2006, 40 of 50 PHEAL directors identified 56 course topics as either important, useful, or not needed for someone in their position. These course topics were then ranked to provide a list of 31 core courses. In 2011, 1,659 of approximately 5,555 PHEAL scientific and technical staff, using a subset of 25 core courses, evidenced higher core course completion rates associated with higher-level job classification, advanced academic degree, and age. The 2011 survey showed that 287 PHEAL scientist-managers and directors, on average, completed 37.7% (n=5/13) of leadership/managerial core courses and 51.7% (n=6/12) of scientific core courses. For 1,659 laboratorians in all scientific and technical classifications, core-subject completion rates were higher in local laboratories (42.8%, n=11/25) than in state (36.0%, n=9/25), federal (34.4%, n=9/25), and university (31.2%, n=8/25) laboratories. Conclusions There is a definable range of scientific, leadership, and managerial core courses needed by PHEAL scientist-managers and directors to function effectively in their positions. Potential PHEAL scientist-managers and directors need greater and continuing access to these courses, and academic and practice entities supporting development of this workforce should adopt curricula and core competencies aligned with these course topics. PMID:23997310

  14. TREAT Transient Analysis Benchmarking for the HEU Core

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kontogeorgakos, D. C.; Connaway, H. M.; Wright, A. E.

    2014-05-01

    This work was performed to support the feasibility study on the potential conversion of the Transient Reactor Test Facility (TREAT) at Idaho National Laboratory from the use of high enriched uranium (HEU) fuel to the use of low enriched uranium (LEU) fuel. The analyses were performed by the GTRI Reactor Conversion staff at the Argonne National Laboratory (ANL). The objective of this study was to benchmark the transient calculations against temperature-limited transients performed in the final operating HEU TREAT core configuration. The MCNP code was used to evaluate steady-state neutronics behavior, and the point kinetics code TREKIN was used to determine core power and energy during transients. The first part of the benchmarking process was to calculate with MCNP all the neutronic parameters required by TREKIN to simulate the transients: the transient rod-bank worth, the prompt neutron generation lifetime, the temperature reactivity feedback as a function of total core energy, and the core-average temperature and peak temperature as functions of total core energy. The results of these calculations were compared against measurements or against reported values as documented in the available TREAT reports. The heating of the fuel was simulated as an adiabatic process. The reported values were extracted from ANL reports, intra-laboratory memos and experiment logsheets and in some cases it was not clear if the values were based on measurements, on calculations or a combination of both. Therefore, it was decided to use the term “reported” values when referring to such data. The methods and results from the HEU core transient analyses will be used for the potential LEU core configurations to predict the converted (LEU) core’s performance.
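
    A minimal Python sketch of the point-kinetics approach described above (one delayed-neutron group, adiabatic energy deposition, reactivity feedback as a function of total core energy); all parameter values are illustrative placeholders, not TREAT or TREKIN data.

      import numpy as np
      from scipy.integrate import solve_ivp

      beta, lam, Lam = 0.007, 0.08, 5.0e-4      # delayed fraction, decay const (1/s), generation time (s)
      rho_ins = 0.006                            # assumed step rod-bank insertion (< beta)
      fb = lambda E: -2.0e-4 * E                 # assumed reactivity feedback vs deposited energy

      def rhs(t, y):
          n, C, E = y                            # relative power, precursors, deposited energy
          rho = rho_ins + fb(E)
          dn = (rho - beta) / Lam * n + lam * C
          dC = beta / Lam * n - lam * C
          return [dn, dC, n]                     # dE/dt = n (energy in relative power-seconds)

      # start at equilibrium power with equilibrium precursor concentration
      sol = solve_ivp(rhs, (0.0, 20.0), [1.0, beta / (lam * Lam), 0.0], rtol=1e-8)
      print(f"peak relative power {sol.y[0].max():.1f}, deposited energy {sol.y[2, -1]:.1f}")

    The negative feedback term is what limits a temperature-limited transient: power rises on the insertion, then turns over once the accumulated energy has cancelled the inserted reactivity.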

  15. Central Core Laboratory versus Site Interpretation of Coronary CT Angiography: Agreement and Association with Cardiovascular Events in the PROMISE Trial.

    PubMed

    Lu, Michael T; Meyersohn, Nandini M; Mayrhofer, Thomas; Bittner, Daniel O; Emami, Hamed; Puchner, Stefan B; Foldyna, Borek; Mueller, Martin E; Hearne, Steven; Yang, Clifford; Achenbach, Stephan; Truong, Quynh A; Ghoshhajra, Brian B; Patel, Manesh R; Ferencik, Maros; Douglas, Pamela S; Hoffmann, Udo

    2018-04-01

    Purpose To assess concordance and relative prognostic utility between central core laboratory and local site interpretation for significant coronary artery disease (CAD) and cardiovascular events. Materials and Methods In the Prospective Multicenter Imaging Study for Evaluation of Chest Pain (PROMISE) trial, readers at 193 North American sites interpreted coronary computed tomographic (CT) angiography as part of the clinical evaluation of stable chest pain. Readers at a central core laboratory also interpreted CT angiography blinded to clinical data, site interpretation, and outcomes. Significant CAD was defined as stenosis greater than or equal to 50%; cardiovascular events were defined as a composite of cardiovascular death or myocardial infarction. Results In 4347 patients (51.8% women; mean age ± standard deviation, 60.4 years ± 8.2), core laboratory and site interpretations were discordant in 16% (683 of 4347), most commonly because of a finding of significant CAD by site but not by core laboratory interpretation (80%, 544 of 683). Overall, core laboratory interpretation resulted in 41% fewer patients being reported as having significant CAD (14%, 595 of 4347 vs 23%, 1000 of 4347; P < .001). Over a median follow-up period of 25 months, 1.3% (57 of 4347) sustained myocardial infarction or cardiovascular death. The C statistic for future myocardial infarction or cardiovascular death was 0.61 (95% confidence interval [CI]: 0.54, 0.68) for the core laboratory and 0.63 (95% CI: 0.56, 0.70) for the sites. Conclusion Compared with interpretation by readers at 193 North American sites, standardized core laboratory interpretation classified 41% fewer patients as having significant CAD. © RSNA, 2017 Online supplemental material is available for this article. Clinical trial registration no. NCT01174550.
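
    The headline percentages follow directly from the counts quoted above, as this small Python check shows:

      n_patients = 4347
      n_discordant = 683
      site_cad, core_cad = 1000, 595                   # patients read as significant CAD

      discordance = n_discordant / n_patients           # 0.157 -> reported as 16%
      reduction = 1 - core_cad / site_cad               # 0.405 -> reported as ~41% fewer
      print(f"discordant: {discordance:.1%}, core-lab reduction: {reduction:.1%}")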

  16. Angiographic core laboratory reproducibility analyses: implications for planning clinical trials using coronary angiography and left ventriculography end-points.

    PubMed

    Steigen, Terje K; Claudio, Cheryl; Abbott, David; Schulzer, Michael; Burton, Jeff; Tymchak, Wayne; Buller, Christopher E; John Mancini, G B

    2008-06-01

    To assess reproducibility of core laboratory performance and impact on sample size calculations. Little information exists about overall reproducibility of core laboratories in contradistinction to performance of individual technicians. Also, qualitative parameters are being adjudicated increasingly as either primary or secondary end-points. The comparative impact of using diverse indexes on sample sizes has not been previously reported. We compared initial and repeat assessments of five quantitative parameters [e.g., minimum lumen diameter (MLD), ejection fraction (EF), etc.] and six qualitative parameters [e.g., TIMI myocardial perfusion grade (TMPG) or thrombus grade (TTG), etc.], as performed by differing technicians and separated by a year or more. Sample sizes were calculated from these results. TMPG and TTG were also adjudicated by a second core laboratory. MLD and EF were the most reproducible, yielding the smallest sample size calculations, whereas percent diameter stenosis and centerline wall motion require substantially larger trials. Of the qualitative parameters, all except TIMI flow grade gave reproducibility characteristics yielding sample sizes of many 100's of patients. Reproducibility of TMPG and TTG was only moderately good both within and between core laboratories, underscoring an intrinsic difficulty in assessing these. Core laboratories can be shown to provide reproducibility performance that is comparable to performance commonly ascribed to individual technicians. The differences in reproducibility yield huge differences in sample size when comparing quantitative and qualitative parameters. TMPG and TTG are intrinsically difficult to assess and conclusions based on these parameters should arise only from very large trials.
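
    A hedged sketch of the kind of sample-size calculation discussed above: for a continuous end-point compared between two arms, the required patients per arm grow with the square of the measurement standard deviation, so a highly reproducible parameter such as MLD needs far fewer patients than a noisier one. The standard deviations and detectable difference below are illustrative, not the paper's values.

      from scipy.stats import norm

      def n_per_arm(sd, delta, alpha=0.05, power=0.80):
          # two-sample comparison of means, equal variances, two-sided alpha
          z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
          return 2 * (z * sd / delta) ** 2

      print(f"{n_per_arm(sd=0.40, delta=0.20):.0f}")   # reproducible end-point: ~63 per arm
      print(f"{n_per_arm(sd=0.80, delta=0.20):.0f}")   # noisier end-point: ~251 per arm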

  17. Digital management and regulatory submission of medical images from clinical trials: role and benefits of the core laboratory

    NASA Astrophysics Data System (ADS)

    Robbins, William L.; Conklin, James J.

    1995-10-01

    Medical images (angiography, CT, MRI, nuclear medicine, ultrasound, x ray) play an increasingly important role in the clinical development and regulatory review process for pharmaceuticals and medical devices. Since medical images are increasingly acquired and archived digitally, or are readily digitized from film, they can be visualized, processed and analyzed in a variety of ways using digital image processing and display technology. Moreover, with image-based data management and data visualization tools, medical images can be electronically organized and submitted to the U.S. Food and Drug Administration (FDA) for review. The collection, processing, analysis, archival, and submission of medical images in a digital format versus an analog (film-based) format presents both challenges and opportunities for the clinical and regulatory information management specialist. The medical imaging 'core laboratory' is an important resource for clinical trials and regulatory submissions involving medical imaging data. Use of digital imaging technology within a core laboratory can increase efficiency and decrease overall costs in the image data management and regulatory review process.

  18. Core analysis of heterogeneous rocks using experimental observations and digital whole core simulation

    NASA Astrophysics Data System (ADS)

    Jackson, S. J.; Krevor, S. C.; Agada, S.

    2017-12-01

    A number of studies have demonstrated the prevalent impact that small-scale rock heterogeneity can have on larger scale flow in multiphase flow systems including petroleum production and CO2 sequestration. Larger scale modeling has shown that this has a significant impact on fluid flow and is possibly a significant source of inaccuracy in reservoir simulation. Yet no core analysis protocol has been developed that faithfully represents the impact of these heterogeneities on flow functions used in modeling. Relative permeability is derived from core floods performed at conditions with high flow potential in which the impact of capillary heterogeneity is voided. A more accurate representation would be obtained if measurements were made at flow conditions where the impact of capillary heterogeneity on flow is scaled to be representative of the reservoir system. This, however, is generally impractical due to laboratory constraints and the role of the orientation of the rock heterogeneity. We demonstrate a workflow of combined observations and simulations, in which the impact of capillary heterogeneity may be faithfully represented in the derivation of upscaled flow properties. Laboratory measurements that are a variation of conventional protocols are used for the parameterization of an accurate digital rock model for simulation. The relative permeability at the range of capillary numbers relevant to flow in the reservoir is derived primarily from numerical simulations of core floods that include capillary pressure heterogeneity. This allows flexibility in the orientation of the heterogeneity and in the range of flow rates considered. We demonstrate the approach in which digital rock models have been developed alongside core flood observations for three applications: (1) A Bentheimer sandstone with a simple axial heterogeneity to demonstrate the validity and limitations of the approach, (2) a set of reservoir rocks from the Captain sandstone in the UK North Sea targeted
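
    The scaling argument rests on the capillary number, N_c = mu*v/sigma. The short Python calculation below uses assumed textbook fluid properties (not the paper's values) to show why laboratory floods typically sit at higher N_c than reservoir flow, suppressing the effect of capillary heterogeneity.

      mu = 1.0e-3       # brine viscosity (Pa s), assumed
      sigma = 0.03      # interfacial tension (N/m), assumed
      for label, v in [("laboratory flood", 1.0e-5), ("reservoir far-field", 1.0e-7)]:
          print(f"{label}: N_c = {mu * v / sigma:.1e}")   # v in m/s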

  19. Enhancements to the Image Analysis Tool for Core Punch Experiments and Simulations (vs. 2014)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogden, John Edward; Unal, Cetin

    A previous paper (Hogden & Unal, 2012, Image Analysis Tool for Core Punch Experiments and Simulations) described an image processing computer program developed at Los Alamos National Laboratory. This program has proven useful, so development has continued. In this paper we describe enhancements to the program as of 2014.

  20. Introductory Archaeology: The Inexpensive Laboratory.

    ERIC Educational Resources Information Center

    Rice, Patricia C.

    1990-01-01

    Describes a number of student-focused laboratory exercises that are inexpensive, yet show the scientific character of archaeology. Describes the environmental laboratory exercise which includes the following analysis topics: (1) pollen; (2) earth core; (3) microfaunal; and (4) microwear. Describes the ceramic laboratory which involves…

  1. Reliability on intra-laboratory and inter-laboratory data of hair mineral analysis comparing with blood analysis.

    PubMed

    Namkoong, Sun; Hong, Seung Phil; Kim, Myung Hwa; Park, Byung Cheol

    2013-02-01

    Nowadays, although its clinical value remains controversial, institutions utilize hair mineral analysis. Arguments about the reliability of hair mineral analysis persist, and there have been evaluations of commercial laboratories performing hair mineral analysis. The objective of this study was to assess the reliability of intra-laboratory and inter-laboratory data at three commercial laboratories conducting hair mineral analysis, compared to serum mineral analysis. Two divided hair samples taken from near the scalp were submitted for analysis at the same time, to all laboratories, from one healthy volunteer. Each laboratory sent a report consisting of quantitative results and their interpretation of health implications. Differences among intra-laboratory and inter-laboratory data were analyzed using SPSS version 12.0 (SPSS Inc., USA). All the laboratories used identical methods for quantitative analysis, and they generated consistent numerical results according to Friedman analysis of variance. However, the normal reference ranges of each laboratory varied. As such, each laboratory interpreted the patient's health differently. On intra-laboratory data, Wilcoxon analysis suggested that the laboratories generated relatively coherent data, but laboratory B did not for one element, so its reliability was doubtful. In comparison with the blood test, laboratory C generated identical results, but not laboratories A and B. Hair mineral analysis has its limitations, considering the reliability of inter- and intra-laboratory analysis compared with blood analysis. As such, clinicians should be cautious when applying hair mineral analysis as an ancillary tool. Each laboratory included in this study requires continuous refinement to establish standardized normal reference ranges.
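
    A minimal Python illustration of the two nonparametric tests named above, applied to made-up placeholder measurements (not the study's data): a Friedman test for agreement across the three laboratories and a Wilcoxon signed-rank test on one laboratory's split-sample duplicates.

      import numpy as np
      from scipy.stats import friedmanchisquare, wilcoxon

      rng = np.random.default_rng(1)
      true = rng.uniform(0.1, 5.0, size=20)                     # 20 elements, one volunteer
      lab_a, lab_b, lab_c = [true + rng.normal(0.0, 0.05, 20) for _ in range(3)]
      print(friedmanchisquare(lab_a, lab_b, lab_c))             # inter-laboratory agreement

      dup_1 = true + rng.normal(0.0, 0.05, 20)                  # one lab, split sample 1
      dup_2 = true + rng.normal(0.0, 0.05, 20)                  # one lab, split sample 2
      print(wilcoxon(dup_1, dup_2))                             # intra-laboratory agreement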

  2. CHAP-2 heat-transfer analysis of the Fort St. Vrain reactor core

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kotas, J.F.; Stroh, K.R.

    1983-01-01

    The Los Alamos National Laboratory is developing the Composite High-Temperature Gas-Cooled Reactor Analysis Program (CHAP) to provide advanced best-estimate predictions of postulated accidents in gas-cooled reactor plants. The CHAP-2 reactor-core model uses the finite-element method to initialize a two-dimensional temperature map of the Fort St. Vrain (FSV) core and its top and bottom reflectors. The code generates a finite-element mesh, initializes noding and boundary conditions, and solves the nonlinear Laplace heat equation using temperature-dependent thermal conductivities, variable coolant-channel-convection heat-transfer coefficients, and specified internal fuel and moderator heat-generation rates. This paper discusses this method and analyzes an FSV reactor-core accident that simulates a control-rod withdrawal at full power.
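
    A minimal Python sketch of the numerical problem described above, reduced to one dimension: steady heat conduction with temperature-dependent conductivity and internal heat generation, solved by Picard iteration on a finite-difference grid. CHAP-2 itself uses a 2-D finite-element model; the geometry, k(T), source strength, and boundary temperatures here are illustrative assumptions.

      import numpy as np

      n, L = 101, 1.0                          # grid points, slab thickness (m)
      x = np.linspace(0.0, L, n); dx = x[1] - x[0]
      q = 5.0e4 * np.ones(n)                   # heat generation (W/m^3), assumed uniform
      k = lambda T: 20.0 + 0.02 * T            # conductivity (W/m-K), assumed linear in T
      T = np.full(n, 600.0)                    # initial guess; both ends held at 600 K

      for _ in range(200):                     # Picard (fixed-point) iterations
          kf = 0.5 * (k(T[:-1]) + k(T[1:]))    # face conductivities from current T
          A = np.zeros((n, n)); b = -q * dx**2
          A[0, 0] = A[-1, -1] = 1.0; b[0] = b[-1] = 600.0
          for i in range(1, n - 1):            # d/dx(k dT/dx) + q = 0, central differences
              A[i, i - 1], A[i, i], A[i, i + 1] = kf[i - 1], -(kf[i - 1] + kf[i]), kf[i]
          T_new = np.linalg.solve(A, b)
          if np.max(np.abs(T_new - T)) < 1e-6:
              T = T_new; break
          T = T_new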

  3. Fast imaging of laboratory core floods using 3D compressed sensing RARE MRI.

    PubMed

    Ramskill, N P; Bush, I; Sederman, A J; Mantle, M D; Benning, M; Anger, B C; Appel, M; Gladden, L F

    2016-09-01

    Three-dimensional (3D) imaging of the fluid distributions within the rock is essential to enable the unambiguous interpretation of core flooding data. Magnetic resonance imaging (MRI) has been widely used to image fluid saturation in rock cores; however, conventional acquisition strategies are typically too slow to capture the dynamic nature of the displacement processes that are of interest. Using Compressed Sensing (CS), it is possible to reconstruct a near-perfect image from significantly fewer measurements than was previously thought necessary, and this can result in a significant reduction in the image acquisition times. In the present study, a method using the Rapid Acquisition with Relaxation Enhancement (RARE) pulse sequence with CS to provide 3D images of the fluid saturation in rock core samples during laboratory core floods is demonstrated. An objective method using image quality metrics for the determination of the most suitable regularisation functional to be used in the CS reconstructions is reported. It is shown that for the present application, Total Variation outperforms the Haar and Daubechies3 wavelet families in terms of the agreement of their respective CS reconstructions with a fully-sampled reference image. Using the CS-RARE approach, 3D images of the fluid saturation in the rock core have been acquired in 16 min. The CS-RARE technique has been applied to image the residual water saturation in the rock during a water-water displacement core flood. With a flow rate corresponding to an interstitial velocity of v_i = 1.89 ± 0.03 ft day⁻¹, 0.1 pore volumes were injected over the course of each image acquisition, a four-fold reduction when compared to a fully-sampled RARE acquisition. Finally, the 3D CS-RARE technique has been used to image the drainage of dodecane into the water-saturated rock in which the dynamics of the coalescence of discrete clusters of the non-wetting phase are clearly observed. The enhancement in the temporal resolution that has
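
    A minimal Python sketch of compressed-sensing reconstruction from undersampled Fourier data, reduced to 1-D with a plain l1 penalty and iterative soft thresholding. The paper's reconstruction is 3-D CS-RARE with a Total Variation penalty; the signal, sampling fraction, and regularization weight here are illustrative.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 256
      x_true = np.zeros(n)
      x_true[rng.choice(n, size=10, replace=False)] = rng.normal(size=10)   # sparse signal

      mask = rng.random(n) < 0.3                               # keep ~30% of "k-space"
      A = lambda x: np.fft.fft(x, norm="ortho")[mask]          # undersampled unitary FFT

      def At(y):                                               # adjoint: zero-fill + inverse FFT
          z = np.zeros(n, dtype=complex)
          z[mask] = y
          return np.fft.ifft(z, norm="ortho")

      b = A(x_true)                                            # "acquired" samples
      lam, x = 0.02, np.zeros(n, dtype=complex)
      for _ in range(300):                                     # ISTA, unit step (operator norm = 1)
          z = x + At(b - A(x))
          x = z * np.maximum(1.0 - lam / np.maximum(np.abs(z), 1e-12), 0.0)  # soft threshold
      print(np.max(np.abs(x.real - x_true)))                   # residual reconstruction error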

  4. Contributed Review: Nuclear magnetic resonance core analysis at 0.3 T

    NASA Astrophysics Data System (ADS)

    Mitchell, Jonathan; Fordham, Edmund J.

    2014-11-01

    Nuclear magnetic resonance (NMR) provides a powerful toolbox for petrophysical characterization of reservoir core plugs and fluids in the laboratory. Previously, there has been considerable focus on low field magnet technology for well log calibration. Now there is renewed interest in the study of reservoir samples using stronger magnets to complement these standard NMR measurements. Here, the capabilities of an imaging magnet with a field strength of 0.3 T (corresponding to 12.9 MHz for proton) are reviewed in the context of reservoir core analysis. Quantitative estimates of porosity (saturation) and pore size distributions are obtained under favorable conditions (e.g., in carbonates), with the added advantage of multidimensional imaging, detection of lower gyromagnetic ratio nuclei, and short probe recovery times that make the system suitable for shale studies. Intermediate field instruments provide quantitative porosity maps of rock plugs that cannot be obtained using high field medical scanners due to the field-dependent susceptibility contrast in the porous medium. Example data are presented that highlight the potential applications of an intermediate field imaging instrument as a complement to low field instruments in core analysis and for materials science studies in general.
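
    The quoted frequency is just the proton Larmor relation; in LaTeX,

      f_0 = \frac{\gamma}{2\pi} B_0, \qquad \frac{\gamma}{2\pi} \approx 42.58~\mathrm{MHz\,T^{-1}}
      \;\Rightarrow\; f_0 \approx 12.8~\mathrm{MHz} \ \text{at} \ B_0 = 0.3~\mathrm{T},

    so the 12.9 MHz quoted above corresponds to a nominal field just above 0.30 T.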

  5. Tank 241-AP-105, cores 208, 209 and 210, analytical results for the final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nuzum, J.L.

    1997-10-24

    This document is the final laboratory report for Tank 241-AP-105. Push mode core segments were removed from Risers 24 and 28 between July 2, 1997, and July 14, 1997. Segments were received and extruded at 222-S Laboratory. Analyses were performed in accordance with Tank 241-AP-105 Push Mode Core Sampling and Analysis Plan (TSAP) (Hu, 1997) and Tank Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995). None of the subsamples submitted for total alpha activity (AT), differential scanning calorimetry (DSC) analysis, or total organic carbon (TOC) analysis exceeded the notification limits as stated in TSAP and DQO. The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group, and are not considered in this report. Appearance and Sample Handling: Two cores, each consisting of four segments, were expected from Tank 241-AP-105. Three cores were sampled, and complete cores were not obtained. TSAP states core samples should be transported to the laboratory within three calendar days from the time each segment is removed from the tank. This requirement was not met for all cores. Attachment 1 illustrates subsamples generated in the laboratory for analysis and identifies their sources. This reference also relates tank farm identification numbers to their corresponding 222-S Laboratory sample numbers.

  6. Challenges for proteomics core facilities.

    PubMed

    Lilley, Kathryn S; Deery, Michael J; Gatto, Laurent

    2011-03-01

    Many analytical techniques have been executed by core facilities established within academic, pharmaceutical and other industrial institutions. The centralization of such facilities ensures a level of expertise and hardware which often cannot be supported by individual laboratories. The establishment of a core facility thus makes the technology available for multiple researchers in the same institution. Often, the services within the core facility are also opened out to researchers from other institutions, frequently with a fee being levied for the service provided. In the 1990s, with the onset of the age of genomics, there was an abundance of DNA analysis facilities, many of which have since disappeared from institutions and are now available through commercial sources. Ten years on, as proteomics was beginning to be utilized by many researchers, this technology found itself an ideal candidate for being placed within a core facility. We discuss what in our view are the daily challenges of proteomics core facilities. We also examine the potential unmet needs of the proteomics core facility that may also be applicable to proteomics laboratories which do not function as core facilities. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. The statistical analysis of circadian phase and amplitude in constant-routine core-temperature data

    NASA Technical Reports Server (NTRS)

    Brown, E. N.; Czeisler, C. A.

    1992-01-01

    Accurate estimation of the phases and amplitude of the endogenous circadian pacemaker from constant-routine core-temperature series is crucial for making inferences about the properties of the human biological clock from data collected under this protocol. This paper presents a set of statistical methods based on a harmonic-regression-plus-correlated-noise model for estimating the phases and the amplitude of the endogenous circadian pacemaker from constant-routine core-temperature data. The methods include a Bayesian Monte Carlo procedure for computing the uncertainty in these circadian functions. We illustrate the techniques with a detailed study of a single subject's core-temperature series and describe their relationship to other statistical methods for circadian data analysis. In our laboratory, these methods have been successfully used to analyze more than 300 constant routines and provide a highly reliable means of extracting phase and amplitude information from core-temperature data.
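
    A minimal Python sketch of the harmonic-regression core of such an analysis: mesor, amplitude, and acrophase are fitted by least squares at an assumed circadian period. The published method additionally models correlated noise and derives Bayesian Monte Carlo uncertainty, which is omitted here; the synthetic data and period are illustrative.

      import numpy as np

      period_h = 24.2                                   # assumed circadian period (hours)
      t = np.arange(0.0, 40.0, 1 / 60)                  # 40-h constant routine, 1-min samples
      w = 2 * np.pi / period_h
      y = 37.0 + 0.35 * np.cos(w * t - 2.0) + 0.1 * np.random.randn(t.size)  # synthetic temp (deg C)

      X = np.column_stack([np.ones_like(t), np.cos(w * t), np.sin(w * t)])
      mesor, a, b = np.linalg.lstsq(X, y, rcond=None)[0]
      amplitude = np.hypot(a, b)                        # circadian amplitude (deg C)
      acrophase_h = (np.arctan2(b, a) / w) % period_h   # time of the fitted temperature maximum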

  8. The role of total laboratory automation in a consolidated laboratory network.

    PubMed

    Seaberg, R S; Stallone, R O; Statland, B E

    2000-05-01

    In an effort to reduce overall laboratory costs and improve overall laboratory efficiencies at all of its network hospitals, the North Shore-Long Island Health System recently established a Consolidated Laboratory Network with a Core Laboratory at its center. We established and implemented a centralized Core Laboratory designed around the Roche/Hitachi CLAS Total Laboratory Automation system to perform the general and esoteric laboratory testing throughout the system in a timely and cost-effective fashion. All remaining STAT testing will be performed within the Rapid Response Laboratories (RRLs) at each of the system's hospitals. Results for this laboratory consolidation and implementation effort demonstrated a decrease in labor costs and improved turnaround time (TAT) at the core laboratory. Anticipated system savings are approximately $2.7 million. TATs averaged 1.3 h within the Core Laboratory and less than 30 min in the RRLs. When properly implemented, automation systems can reduce overall laboratory expenses, enhance patient services, and address the overall concerns facing the laboratory today: job satisfaction, decreased length of stay, and safety. The financial savings realized are primarily a result of labor reductions.

  9. s-core network decomposition: A generalization of k-core analysis to weighted networks

    NASA Astrophysics Data System (ADS)

    Eidsaa, Marius; Almaas, Eivind

    2013-12-01

    A broad range of systems spanning biology, technology, and social phenomena may be represented and analyzed as complex networks. Recent studies of such networks using k-core decomposition have uncovered groups of nodes that play important roles. Here, we present s-core analysis, a generalization of k-core (or k-shell) analysis to complex networks where the links have different strengths or weights. We demonstrate the s-core decomposition approach on two random networks (ER and configuration model with scale-free degree distribution) where the link weights are (i) random, (ii) correlated, and (iii) anticorrelated with the node degrees. Finally, we apply the s-core decomposition approach to the protein-interaction network of the yeast Saccharomyces cerevisiae in the context of two gene-expression experiments: oxidative stress in response to cumene hydroperoxide (CHP), and fermentation stress response (FSR). We find that the innermost s-cores are (i) different from innermost k-cores, (ii) different for the two stress conditions CHP and FSR, and (iii) enriched with proteins whose biological functions give insight into how yeast manages these specific stresses.
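
    A minimal Python/networkx sketch of the s-core idea described above: the s-core is the maximal subgraph in which every node's strength (sum of incident link weights) is at least s, obtained by repeatedly pruning weak nodes, since each removal lowers its neighbours' strengths. The toy weighted graph is illustrative.

      import networkx as nx

      def s_core(G, s, weight="weight"):
          H = G.copy()
          while True:
              weak = [n for n, strength in H.degree(weight=weight) if strength < s]
              if not weak:
                  return H                       # every remaining node has strength >= s
              H.remove_nodes_from(weak)

      G = nx.Graph()
      G.add_weighted_edges_from([("a", "b", 1.0), ("b", "c", 2.0), ("c", "a", 1.5),
                                 ("c", "d", 0.4), ("d", "e", 0.3)])
      print(sorted(s_core(G, s=2.0).nodes()))    # -> ['a', 'b', 'c']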

  10. Laboratory Equipment for Investigation of Coring Under Mars-like Conditions

    NASA Astrophysics Data System (ADS)

    Zacny, K.; Cooper, G.

    2004-12-01

    To develop a suitable drill bit and set of operating conditions for Mars sample coring applications, it is essential to make tests under conditions that match those of the mission. The goal of the laboratory test program was to determine the drilling performance of diamond-impregnated bits under simulated Martian conditions, particularly those of low pressure and low temperature in a carbon dioxide atmosphere. For this purpose, drilling tests were performed in a vacuum chamber kept at a pressure of 5 torr. Prior to drilling, a rock, soil or a clay sample was cooled down to minus 80 degrees Celsius (Zacny et al, 2004). Thus, all Martian conditions, except the low gravity were simulated in the controlled environment. Input drilling parameters of interest included the weight on bit and rotational speed. These two independent variables were controlled from a PC station. The dependent variables included the bit reaction torque, the depth of the bit inside the drilled hole and the temperatures at various positions inside the drilled sample, in the center of the core as it was being cut and at the bit itself. These were acquired every second by a data acquisition system. Additional information such as the rate of penetration and the drill power were calculated after the test was completed. The weight of the rock and the bit prior to and after the test were measured to aid in evaluating the bit performance. In addition, the water saturation of the rock was measured prior to the test. Finally, the bit was viewed under the Scanning Electron Microscope and the Stereo Optical Microscope. The extent of the bit wear and its salient features were captured photographically. The results revealed that drilling or coring under Martian conditions in a water saturated rock is different in many respects from drilling on Earth. This is mainly because the Martian atmospheric pressure is in the vicinity of the pressure at the triple point of water. Thus ice, heated by contact with the

  11. 7 CFR 160.17 - Laboratory analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    Title 7 (Agriculture), Standards for Naval Stores, Methods of Analysis, Inspection, Sampling and Grading, § 160.17 Laboratory analysis: The analysis and laboratory testing of naval stores shall be conducted, so far as is practicable...

  12. 7 CFR 160.17 - Laboratory analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Title 7 (Agriculture), Standards for Naval Stores, Methods of Analysis, Inspection, Sampling and Grading, § 160.17 Laboratory analysis: The analysis and laboratory testing of naval stores shall be conducted, so far as is practicable...

  13. Residual Strength Analysis Methodology: Laboratory Coupons to Structural Components

    NASA Technical Reports Server (NTRS)

    Dawicke, D. S.; Newman, J. C., Jr.; Starnes, J. H., Jr.; Rose, C. A.; Young, R. D.; Seshadri, B. R.

    2000-01-01

    The NASA Aircraft Structural Integrity (NASIP) and Airframe Airworthiness Assurance/Aging Aircraft (AAA/AA) Programs have developed a residual strength prediction methodology for aircraft fuselage structures. This methodology has been experimentally verified for structures ranging from laboratory coupons up to full-scale structural components. The methodology uses the critical crack tip opening angle (CTOA) fracture criterion to characterize the fracture behavior and a material and a geometric nonlinear finite element shell analysis code to perform the structural analyses. The present paper presents the results of a study to evaluate the fracture behavior of 2024-T3 aluminum alloys with thickness of 0.04 inches to 0.09 inches. The critical CTOA and the corresponding plane strain core height necessary to simulate through-the-thickness effects at the crack tip in an otherwise plane stress analysis, were determined from small laboratory specimens. Using these parameters, the CTOA fracture criterion was used to predict the behavior of middle crack tension specimens that were up to 40 inches wide, flat panels with riveted stiffeners and multiple-site damage cracks, 18-inch diameter pressurized cylinders, and full scale curved stiffened panels subjected to internal pressure and mechanical loads.

  14. Video networking of cardiac catheterization laboratories.

    PubMed

    Tobis, J; Aharonian, V; Mansukhani, P; Kasaoka, S; Jhandyala, R; Son, R; Browning, R; Youngblood, L; Thompson, M

    1999-02-01

    The purpose of this study was to assess the feasibility and accuracy of a video telecommunication network to transmit coronary images to provide on-line interaction between personnel in a cardiac catheterization laboratory and a remote core laboratory. A telecommunication system was installed in the cardiac catheterization laboratory at Kaiser Hospital, Los Angeles, and the core laboratory at the University of California, Irvine, approximately 40 miles away. Cineangiograms, live fluoroscopy, intravascular ultrasound studies and images of the catheterization laboratory were transmitted in real time over a dedicated T1 line at 768 kilobits/second at 15 frames/second. These cases were performed during a clinical study of angiographic guidance versus intravascular ultrasound (IVUS) guidance of stent deployment. During the cases the core laboratory performed quantitative analysis of the angiograms and ultrasound images. Selected images were then annotated and transmitted back to the catheterization laboratory to facilitate discussion during the procedure. A successful communication hookup was obtained in 39 (98%) of 40 cases. Measurements of angiographic parameters were very close between the original cinefilm and the transmitted images. Quantitative analysis of the ultrasound images showed no significant difference in any of the diameter or cross-sectional area measurements between the original ultrasound tape and the transmitted images. The telecommunication link during the interventional procedures had a significant impact in 23 (58%) of 40 cases affecting the area to be treated, the size of the inflation balloon, recognition of stent underdeployment, or the existence of disease in other areas that was not noted on the original studies. Current video telecommunication systems provide high-quality images on-line with accurate representation of cineangiograms and intravascular ultrasound images. This system had a significant impact on 58% of the cases in this small

  15. Tank 241-AY-101 Privatization Push Mode Core Sampling and Analysis Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    TEMPLETON, A.M.

    2000-01-12

    This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for samples obtained from tank 241-AY-101. The purpose of this sampling event is to obtain information about the characteristics of the contents of 241-AY-101 required to satisfy Data Quality Objectives For RPP Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For High-Level Waste Feed Batch X (HLW DQO) (Nguyen 1999a), Data Quality Objectives For TWRS Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For Low-Activity Waste Feed Batch X (LAW DQO) (Nguyen 1999b), Low Activity Waste and High-Level Waste Feed Data Quality Objectives (L and H DQO) (Patello et al. 1999), and Characterization Data Needs for Development, Design, and Operation of Retrieval Equipment Developed through the Data Quality Objective Process (Equipment DQO) (Bloom 1996). Special instructions regarding support to the LAW and HLW DQOs are provided by Baldwin (1999). Push mode core samples will be obtained from risers 15G and 150 to provide sufficient material for the chemical analyses and tests required to satisfy these data quality objectives. The 222-S Laboratory will extrude core samples; composite the liquids and solids; perform chemical analyses on composite and segment samples; archive half-segment samples; and provide subsamples to the Process Chemistry Laboratory. The Process Chemistry Laboratory will prepare test plans and perform process tests to evaluate the behavior of the 241-AY-101 waste undergoing the retrieval and treatment scenarios defined in the applicable DQOs. Requirements for analyses of samples originating in the process tests will be documented in the corresponding test plans and are not within the scope of this SAP.

  16. Tank 241-AY-101 Privatization Push Mode Core Sampling and Analysis Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    TEMPLETON, A.M.

    2000-05-19

    This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for samples obtained from tank 241-AY-101. The purpose of this sampling event is to obtain information about the characteristics of the contents of 241-AY-101 required to satisfy "Data Quality Objectives For RPP Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For High-Level Waste Feed Batch X (HLW DQO)" (Nguyen 1999a), "Data Quality Objectives For TWRS Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For Low-Activity Waste Feed Batch X (LAW DQO)" (Nguyen 1999b), "Low Activity Waste and High-Level Waste Feed Data Quality Objectives (L&H DQO)" (Patello et al. 1999), and "Characterization Data Needs for Development, Design, and Operation of Retrieval Equipment Developed through the Data Quality Objective Process (Equipment DQO)" (Bloom 1996). Special instructions regarding support to the LAW and HLW DQOs are provided by Baldwin (1999). Push mode core samples will be obtained from risers 15G and 150 to provide sufficient material for the chemical analyses and tests required to satisfy these data quality objectives. The 222-S Laboratory will extrude core samples; composite the liquids and solids; perform chemical analyses on composite and segment samples; archive half-segment samples; and provide sub-samples to the Process Chemistry Laboratory. The Process Chemistry Laboratory will prepare test plans and perform process tests to evaluate the behavior of the 241-AY-101 waste undergoing the retrieval and treatment scenarios defined in the applicable DQOs. Requirements for analyses of samples originating in the process tests will be documented in the corresponding test plans and are not within the scope of this SAP.

  17. A pressure core ultrasonic test system for on-board analysis of gas hydrate-bearing sediments under in situ pressures.

    PubMed

    Yang, Lei; Zhou, Weihua; Xue, Kaihua; Wei, Rupeng; Ling, Zheng

    2018-05-01

    The enormous potential as an alternative energy resource has made natural gas hydrates a material of intense research interest. Their exploration and sample characterization require a quick and effective analysis of the hydrate-bearing cores recovered under in situ pressures. Here a novel Pressure Core Ultrasonic Test System (PCUTS) for on-board analysis of sediment cores containing gas hydrates at in situ pressures is presented. The PCUTS is designed to be compatible with an on-board pressure core transfer device and a long gravity-piston pressure-retained corer. It provides several advantages over laboratory core analysis including quick and non-destructive detection, in situ and successive acoustic property acquisition, and remission of sample storage and transportation. The design of the unique assembly units to ensure the in situ detection is demonstrated, involving the U-type protecting jackets, transducer precession device, and pressure stabilization system. The in situ P-wave velocity measurements make the detection of gas hydrate existence in the sediments possible on-board. Performance tests have verified the feasibility and sensitivity of the ultrasonic test unit, showing the dependence of P-wave velocity on gas hydrate saturation. The PCUTS has been successfully applied for analysis of natural samples containing gas hydrates recovered from the South China Sea. It is indicated that on-board P-wave measurements could provide a quick and effective understanding of the hydrate occurrence in natural samples, which can assist further resource exploration, assessment, and subsequent detailed core analysis.

  18. A pressure core ultrasonic test system for on-board analysis of gas hydrate-bearing sediments under in situ pressures

    NASA Astrophysics Data System (ADS)

    Yang, Lei; Zhou, Weihua; Xue, Kaihua; Wei, Rupeng; Ling, Zheng

    2018-05-01

    The enormous potential as an alternative energy resource has made natural gas hydrates a material of intense research interest. Their exploration and sample characterization require a quick and effective analysis of the hydrate-bearing cores recovered under in situ pressures. Here a novel Pressure Core Ultrasonic Test System (PCUTS) for on-board analysis of sediment cores containing gas hydrates at in situ pressures is presented. The PCUTS is designed to be compatible with an on-board pressure core transfer device and a long gravity-piston pressure-retained corer. It provides several advantages over laboratory core analysis including quick and non-destructive detection, in situ and successive acoustic property acquisition, and remission of sample storage and transportation. The design of the unique assembly units to ensure the in situ detection is demonstrated, involving the U-type protecting jackets, transducer precession device, and pressure stabilization system. The in situ P-wave velocity measurements make the detection of gas hydrate existence in the sediments possible on-board. Performance tests have verified the feasibility and sensitivity of the ultrasonic test unit, showing the dependence of P-wave velocity on gas hydrate saturation. The PCUTS has been successfully applied for analysis of natural samples containing gas hydrates recovered from the South China Sea. It is indicated that on-board P-wave measurements could provide a quick and effective understanding of the hydrate occurrence in natural samples, which can assist further resource exploration, assessment, and subsequent detailed core analysis.
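
    The core measurement behind both records above is the conversion of an ultrasonic first-arrival time into a P-wave velocity over a known sample length. The following is a minimal sketch of that step only; the synthetic trace, sampling interval, sample length, and pick threshold are illustrative assumptions, not PCUTS parameters.

```python
# Minimal sketch: estimating P-wave velocity from an ultrasonic first arrival.
# The trace, sample length, and pick threshold are illustrative values only.
import numpy as np

def p_wave_velocity(trace, dt, sample_length_m, system_delay_s=0.0, threshold=0.1):
    """Pick the first arrival as the first sample exceeding `threshold` of the
    trace's peak amplitude, then convert travel time to velocity (m/s)."""
    envelope = np.abs(trace)
    pick_index = np.argmax(envelope > threshold * envelope.max())
    travel_time = pick_index * dt - system_delay_s
    return sample_length_m / travel_time

# Example with a synthetic trace: a Gaussian "arrival" centered at 40 us.
dt = 1e-7                                   # 0.1 us sampling interval
t = np.arange(0, 1e-4, dt)
trace = np.exp(-((t - 4e-5) / 2e-6) ** 2)
v_p = p_wave_velocity(trace, dt, sample_length_m=0.06)
print(f"P-wave velocity: {v_p:.0f} m/s")    # ~1600 m/s for a 6 cm sample
```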

  19. Exploration Laboratory Analysis - ARC

    NASA Technical Reports Server (NTRS)

    Krihak, Michael K.; Fung, Paul P.

    2012-01-01

    The Exploration Laboratory Analysis (ELA) project supports the Exploration Medical Capability (ExMC) risk, Risk of Inability to Adequately Treat an Ill or Injured Crew Member, and ExMC Gap 4.05: Lack of minimally invasive in-flight laboratory capabilities with limited consumables required for diagnosing identified Exploration Medical Conditions. To mitigate this risk, the availability of inflight laboratory analysis instrumentation has been identified as an essential capability in future exploration missions. Mission architecture poses constraints on equipment and procedures that will be available to treat evidence-based medical conditions according to the Space Medicine Exploration Medical Conditions List (SMEMCL). The SMEMCL provided diagnosis and treatment for the evidence-based medical conditions and hence, a basis for developing ELA functional requirements.

  20. The ADNI PET Core: 2015

    PubMed Central

    Jagust, William J.; Landau, Susan M.; Koeppe, Robert A.; Reiman, Eric M.; Chen, Kewei; Mathis, Chester A.; Price, Julie C.; Foster, Norman L.; Wang, Angela Y.

    2015-01-01

    INTRODUCTION This paper reviews the work done in the ADNI PET core over the past 5 years, largely concerning techniques, methods, and results related to amyloid imaging in ADNI. METHODS The PET Core has utilized [18F]florbetapir routinely on ADNI participants, with over 1600 scans available for download. Four different laboratories are involved in data analysis, and have examined factors such as longitudinal florbetapir analysis, use of FDG-PET in clinical trials, and relationships between different biomarkers and cognition. RESULTS Converging evidence from the PET Core has indicated that cross-sectional and longitudinal florbetapir analyses require different reference regions. Studies have also examined the relationship between florbetapir data obtained immediately after injection, which reflects perfusion, and FDG-PET results. Finally, standardization has included the translation of florbetapir PET data to a centiloid scale. CONCLUSION The PET Core has demonstrated a variety of methods for standardization of biomarkers such as florbetapir PET in a multicenter setting. PMID:26194311

  1. Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders

    NASA Technical Reports Server (NTRS)

    Lovejoy, Andrew E.; Schultz, Marc R.

    2012-01-01

    Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.

  2. Agreement between core laboratory and study investigators for imaging scores in a thrombectomy trial.

    PubMed

    Fahed, Robert; Ben Maacha, Malek; Ducroux, Célina; Khoury, Naim; Blanc, Raphaël; Piotin, Michel; Lapergue, Bertrand

    2018-05-14

    We aimed to assess the agreement between study investigators and the core laboratory (core lab) of a thrombectomy trial for imaging scores. The Alberta Stroke Program Early CT Score (ASPECTS), the European Collaborative Acute Stroke Study (ECASS) hemorrhagic transformation (HT) classification, and the Thrombolysis In Cerebral Infarction (TICI) scores as recorded by study investigators were compared with the core lab scores in order to assess interrater agreement, using Cohen's unweighted and weighted kappa statistics. There were frequent discrepancies between study sites and core lab for all the scores. Agreement for ASPECTS and ECASS HT classification was less than substantial, with disagreement occurring in more than one-third of cases. Agreement was higher on MRI-based scores than on CT, and was improved after dichotomization on both CT and MRI. Agreement for TICI scores was moderate (with disagreement occurring in more than 25% of patients), and went above the substantial level (less than 10% disagreement) after dichotomization (TICI 0/1/2a vs 2b/3). Discrepancies between scores assessed by the imaging core lab and those reported by study sites occurred in a significant proportion of patients. Disagreement in the assessment of ASPECTS and day 1 HT scores was more frequent on CT than on MRI. The agreement for the dichotomized TICI score (the trial's primary outcome) was substantial, with less than 10% of disagreement between study sites and core lab. NCT02523261, Post-results. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
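
    As a rough illustration of the agreement statistics described above, the sketch below computes Cohen's unweighted kappa for hypothetical site versus core-lab TICI readings, before and after dichotomization (TICI 0/1/2a vs 2b/3); the scores are invented, not trial data.

```python
# Minimal sketch of interrater agreement on TICI scores (hypothetical data).
from sklearn.metrics import cohen_kappa_score

site     = ["2b", "3", "2a", "0", "2b", "2b", "1", "3", "2a", "2b"]
core_lab = ["2b", "2b", "2a", "0", "3",  "2b", "1", "3", "2b", "2b"]

# Unweighted kappa on the full ordinal scale (a weighted kappa would first
# map the ordinal categories to integers).
kappa_full = cohen_kappa_score(site, core_lab)

# Agreement after dichotomization (successful reperfusion: TICI 2b/3).
dichot = lambda scores: [s in ("2b", "3") for s in scores]
kappa_dichot = cohen_kappa_score(dichot(site), dichot(core_lab))

print(f"kappa (full scale):   {kappa_full:.2f}")
print(f"kappa (dichotomized): {kappa_dichot:.2f}")
```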

  3. Exploration Laboratory Analysis FY13

    NASA Technical Reports Server (NTRS)

    Krihak, Michael; Perusek, Gail P.; Fung, Paul P.; Shaw, Tianna L.

    2013-01-01

    The Exploration Laboratory Analysis (ELA) project supports the Exploration Medical Capability (ExMC) risk, which is stated as the Risk of Inability to Adequately Treat an Ill or Injured Crew Member, and ExMC Gap 4.05: Lack of minimally invasive in-flight laboratory capabilities with limited consumables required for diagnosing identified Exploration Medical Conditions. To mitigate this risk, the availability of inflight laboratory analysis instrumentation has been identified as an essential capability in future exploration missions. Mission architecture poses constraints on equipment and procedures that will be available to treat evidence-based medical conditions according to the Space Medicine Exploration Medical Conditions List (SMEMCL), and to perform human research studies on the International Space Station (ISS) that are supported by the Human Health and Countermeasures (HHC) element. Since there are significant similarities in the research and medical operational requirements, ELA hardware development has emerged as a joint effort between ExMC and HHC. In 2012, four significant accomplishments were achieved towards the development of exploration laboratory analysis for medical diagnostics. These achievements included (i) the development of high-priority analytes for research and medical operations, (ii) the development of Level 1 functional requirements and concept of operations documentation, (iii) the selection and head-to-head competition of in-flight laboratory analysis instrumentation, and (iv) the phase one completion of the Small Business Innovation Research (SBIR) projects under the topic Smart Phone Driven Blood-Based Diagnostics. To utilize resources efficiently, the associated documentation and advanced technologies were integrated into a single ELA plan that encompasses ExMC and HHC development efforts. The requirements and high-priority analytes were used in the selection of the four in-flight laboratory analysis performers. Based upon the

  4. Exploration Laboratory Analysis

    NASA Technical Reports Server (NTRS)

    Krihak, M.; Ronzano, K.; Shaw, T.

    2016-01-01

    The Exploration Laboratory Analysis (ELA) project supports the Exploration Medical Capability (ExMC) risk to minimize or reduce the risk of adverse health outcomes and decrements in performance due to in-flight medical capabilities on human exploration missions. To mitigate this risk, the availability of inflight laboratory analysis instrumentation has been identified as an essential capability for manned exploration missions. Since a single, compact space-ready laboratory analysis capability to perform all exploration clinical measurements is not commercially available, the ELA project objective is to demonstrate the feasibility of emerging operational and analytical capability as a biomedical diagnostics precursor to long duration manned exploration missions. The initial step towards ground and flight demonstrations in fiscal year (FY) 2015 was the downselection of platform technologies for demonstrations in the space environment. The technologies selected included two Small Business Innovation Research (SBIR) performers: DNA Medicine Institute's rHEALTH X and Intelligent Optical Systems' lateral flow assays combined with Holomic's smartphone analyzer. The selection of these technologies was based on their compact size, breadth of analytical capability and favorable ability to process fluids in a space environment, among several factors. These two technologies will be advanced to meet ground and flight demonstration success criteria and requirements that will be finalized in FY16. Also, the downselected performers will continue the technology development phase towards meeting prototype deliverables in either late 2016 or 2017.

  5. Exploration Laboratory Analysis

    NASA Technical Reports Server (NTRS)

    Krihak, M.; Ronzano, K.; Shaw, T.

    2016-01-01

    The Exploration Laboratory Analysis (ELA) project supports the Exploration Medical Capability (ExMC) risk to minimize or reduce the risk of adverse health outcomes and decrements in performance due to in-flight medical capabilities on human exploration missions. To mitigate this risk, the availability of inflight laboratory analysis instrumentation has been identified as an essential capability for manned exploration missions. Since a single, compact space-ready laboratory analysis capability to perform all exploration clinical measurements is not commercially available, the ELA project objective is to demonstrate the feasibility of emerging operational and analytical capability as a biomedical diagnostics precursor to long duration manned exploration missions. The initial step towards ground and flight demonstrations in fiscal year (FY) 2015 was the downselection of platform technologies for demonstrations in the space environment. The technologies selected included two Small Business Innovation Research (SBIR) performers: DNA Medicine Institute's rHEALTH X and Intelligent Optical Systems' lateral flow assays combined with Holomic's smartphone analyzer. The selection of these technologies was based on their compact size, breadth of analytical capability and favorable ability to process fluids in a space environment, among several factors. These two technologies will be advanced to meet ground and flight demonstration success criteria and requirements. The technology demonstrations and metrics for success will be finalized in FY16. Also, the downselected performers will continue the technology development phase towards meeting prototype deliverables in either late 2016 or 2017.

  6. Multi-Core Processor Memory Contention Benchmark Analysis Case Study

    NASA Technical Reports Server (NTRS)

    Simon, Tyler; McGalliard, James

    2009-01-01

    Multi-core processors dominate current mainframe, server, and high performance computing (HPC) systems. This paper provides synthetic kernel and natural benchmark results from an HPC system at the NASA Goddard Space Flight Center that illustrate the performance impacts of multi-core (dual- and quad-core) vs. single core processor systems. Analysis of processor design, application source code, and synthetic and natural test results all indicate that multi-core processors can suffer from significant memory subsystem contention compared to similar single-core processors.
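
    A minimal way to reproduce the qualitative effect described above is to time a memory-bandwidth-bound kernel while an increasing number of identical workers run concurrently; the sketch below does this with synthetic data and is not the benchmark suite used in the paper.

```python
# Illustrative micro-benchmark (not the paper's kernels): time a memory-bound
# array sum while 1..N worker processes run the same kernel concurrently.
# Slowdown with added workers is a rough signal of memory-subsystem contention.
import time
import numpy as np
from multiprocessing import Process

def memory_bound_kernel(n_elements=50_000_000, repeats=5):
    data = np.ones(n_elements, dtype=np.float64)   # ~400 MB working set
    for _ in range(repeats):
        data.sum()                                  # streams the whole array

def timed_run(n_workers):
    workers = [Process(target=memory_bound_kernel) for _ in range(n_workers)]
    start = time.perf_counter()
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    return time.perf_counter() - start

if __name__ == "__main__":
    for n in (1, 2, 4):
        print(f"{n} concurrent worker(s): {timed_run(n):.2f} s")
```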

  7. Posttest analysis of a laboratory-cast monolith of salt-saturated concrete. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wakeley, L.D.; Poole, T.S.

    A salt-saturated concrete was formulated for laboratory testing of cementitious mixtures with potential for use in disposal of radioactive wastes in a geologic repository in halite rock. Cores were taken from a laboratory-cast concrete monolith on completion of tests of permeability, strain, and stress. The cores were analyzed for physical and chemical evidence of brine migration through the concrete, and other features with potential impact on installation of concrete plugs at the Waste Isolation Pilot Plant (WIPP) in New Mexico. The posttest analyses of the cores provided evidence of brine movement along the interface between concrete and pipe, and little indication of permeability through the monolith itself. There may also have been diffusion of chloride into the monolith without actual brine flow.

  8. Tank 241-T-204, core 188 analytical results for the final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nuzum, J.L.

    This document is the final laboratory report for Tank 241-T-204. Push mode core segments were removed from Riser 3 between March 27, 1997, and April 11, 1997. Segments were received and extruded at the 222-S Laboratory. Analyses were performed in accordance with the Tank 241-T-204 Push Mode Core Sampling and Analysis Plan (TSAP) (Winkleman, 1997), the Letter of Instruction for Core Sample Analysis of Tanks 241-T-201, 241-T-202, 241-T-203, and 241-T-204 (LOI) (Bell, 1997), and the Safety Screening Data Quality Objective (DQO) (Dukelow et al., 1995). None of the subsamples submitted for total alpha activity or differential scanning calorimetry (DSC) analyses exceeded the notification limits stated in the DQO. The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group and are not considered in this report.

  9. Method for tracking core-contributed publications.

    PubMed

    Loomis, Cynthia A; Curchoe, Carol Lynn

    2012-12-01

    Accurately tracking core-contributed publications is an important and often difficult task. Many core laboratories are supported by programmatic grants (such as Cancer Center Support Grant and Clinical Translational Science Awards) or generate data with instruments funded through S10, Major Research Instrumentation, or other granting mechanisms. Core laboratories provide their research communities with state-of-the-art instrumentation and expertise, elevating research. It is crucial to demonstrate the specific projects that have benefited from core services and expertise. We discuss here the method we developed for tracking core contributed publications.

  10. Cost analysis in the toxicology laboratory.

    PubMed

    Travers, E M

    1990-09-01

    The process of determining laboratory sectional and departmental costs and test costs for instrument-generated and manually generated reportable results for toxicology laboratories has been outlined in this article. It is hoped that the basic principles outlined in the preceding text will clarify and elucidate one of the most important areas needed for laboratory fiscal integrity and its survival in these difficult times for health care providers. The following general principles derived from this article are helpful aids for managers of toxicology laboratories. 1. To manage a cost-effective, efficient toxicology laboratory, several factors must be considered: the laboratory's instrument configuration, test turnaround time needs, the test menu offered, the analytic methods used, the cost of labor based on time expended and the experience and educational level of the staff, and logistics that determine specimen delivery time and costs. 2. There is a wide variation in costs for toxicologic methods, which requires that an analysis of capital (equipment) purchase and operational (test performance) costs be performed to avoid waste, purchase wisely, and determine which tests consume the majority of the laboratory's resources. 3. Toxicologic analysis is composed of many complex steps. Each step must be individually cost-accounted. Screening test results must be confirmed, and the cost for both steps must be included in the cost per reportable result. 4. Total costs will vary in the same laboratory and between laboratories based on differences in salaries paid to technical staff, differences in reagent/supply costs, the number of technical staff needed to operate the analyzer or perform the method, and the inefficient use of highly paid staff to operate the analyzer or perform the method. 5. Since direct test costs vary directly with the type and number of analyzers or methods and are dependent on the operational mode designed by the manufacturer, laboratory managers

  11. Evolution of petrophysical properties across natural faults: a study on cores from the Tournemire underground research laboratory (France)

    NASA Astrophysics Data System (ADS)

    Bonnelye, Audrey; David, Christian; Schubnel, Alexandre; Wassermann, Jérôme; Lefèvre, Mélody; Henry, Pierre; Guglielmi, Yves; Castilla, Raymi; Dick, Pierre

    2017-04-01

    Faults in general, and in clay materials in particular, have complex structures that can be linked to both a polyphased tectonic history and the anisotropic nature of the material. Drilling through faults in shaly materials allows one to characterize the structure, the mineralogical composition, the stress orientation, and the physical properties of the fault zone. These relations can be investigated in the laboratory in order to gain a better understanding of in-situ mechanisms. In this study we used shales of Toarcian age from the Tournemire underground research laboratory (France). We coupled different petrophysical measurements on core samples retrieved from a borehole drilled perpendicular to a fault plane whose size is of the order of tens of meters. This 25 m long borehole was sampled in order to perform several types of measurements: density, porosity, and saturation directly in the field, and elastic wave velocity and magnetic susceptibility anisotropy in the laboratory. For all these measurements, special protocols were developed in order to preserve the saturation state of the samples as much as possible. All these measurements were carried out in three zones intersected by the borehole: the intact zone, the damaged zone, and the fault core zone. From our measurements, we were able to associate specific properties with each zone of the fault. We then calculated Thomsen's parameters in order to quantify the elastic anisotropy across the fault. Our results show strong variations of the elastic anisotropy with distance to the fault core as well as the occurrence of anisotropy reversal.

  12. Bio-Oil Analysis Laboratory Procedures | Bioenergy | NREL

    Science.gov Websites

    NREL develops standard laboratory analytical procedures for bio-oil analysis. These procedures have been validated and allow for reliable bio-oil analysis, including determination of the different hydroxyl groups (-OH) in pyrolysis bio-oil: aliphatic-OH, phenolic-OH, and carboxylic-OH.

  13. Core Technical Capability Laboratory Management System

    NASA Technical Reports Server (NTRS)

    Shaykhian, Linda; Dugger, Curtis; Griffin, Laurie

    2008-01-01

    The Core Technical Capability Laboratory Management System (CTCLMS) consists of dynamically generated Web pages used to access a database containing detailed CTC lab data. The software is hosted on a server that allows users remote access.

  14. Optical Methods for Identifying Hard Clay Core Samples During Petrophysical Studies

    NASA Astrophysics Data System (ADS)

    Morev, A. V.; Solovyeva, A. V.; Morev, V. A.

    2018-01-01

    X-ray phase analysis of the general mineralogical composition of core samples from one of the West Siberian fields was performed. Electronic absorption spectra of the clay core samples with an added indicator were studied. The speed and availability of applying the two methods in petrophysical laboratories during sample preparation for standard and special studies were estimated.

  15. Metallurgical failure analysis of MH-1A reactor core hold-down bolts. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hawthorne, J.R.; Watson, H.E.

    1976-11-01

    The Naval Research Laboratory has performed a failure analysis on two MH-1A reactor core hold-down bolts that broke in service. Adherence to fabrication specifications, post-service properties and possible causes of bolt failure were investigated. The bolt material was verified as 17-4PH precipitation hardening stainless steel. Measured bolt dimensions also were in accordance with fabrication drawing specifications. Bolt failure occurred in the region of a locking pin hole, which reduced the bolt net section by 47 percent. The failure analysis indicates that the probable cause of failure was net section overloading resulting from a lateral bending force on the bolt. The analysis indicates that net section overloading could also have resulted from combined tensile stresses (bolt preloading plus differential thermal expansion). Recommendations are made for improved bolting.
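
    As a back-of-the-envelope illustration of why a 47 percent net-section reduction matters, the sketch below compares the axial stress on the gross and net bolt sections; the bolt diameter and applied load are assumed values for illustration, not figures from the NRL report.

```python
# Net-section stress amplification from the locking pin hole (illustrative only;
# the 47% area reduction is the one figure taken from the abstract).
import math

d = 0.019                                   # shank diameter, m (assumed)
gross_area = math.pi * d**2 / 4             # full cross-section
net_area = gross_area * (1 - 0.47)          # 47% reduction at the pin hole

axial_load = 60e3                           # preload + thermal load, N (assumed)
stress_gross = axial_load / gross_area
stress_net = axial_load / net_area

print(f"gross-section stress: {stress_gross/1e6:6.1f} MPa")
print(f"net-section stress:   {stress_net/1e6:6.1f} MPa "
      f"({stress_net/stress_gross:.2f}x higher at the pin hole)")
```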

  16. NETL - Thermogravimetric Analysis Laboratory

    ScienceCinema

    Richards, George

    2018-06-22

    Researchers in NETL's Thermal Analysis Laboratory are investigating chemical looping combustion. As a clean and efficient fossil fuel technology, chemical looping combustion controls CO2 emissions and offers a promising alternative to traditional combustion.

  17. Tank 241-AZ-102 Privatization Push Mode Core Sampling and Analysis Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    RASMUSSEN, J.H.

    1999-08-02

    This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for samples obtained from tank 241-AZ-102. The purpose of this sampling event is to obtain information about the characteristics of the contents of 241-AZ-102 required to satisfy the Data Quality Objectives For TWRS Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For High-Level Waste Feed Batch X (HLW DQO) (Nguyen 1999a), Data Quality Objectives For TWRS Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For Low-Activity Waste Feed Batch X (LAW DQO) (Nguyen 1999b), Low Activity Waste and High-Level Waste Feed Data Quality Objectives (L&H DQO) (Patello et al. 1999) and Characterization Data Needs for Development, Design, and Operation of Retrieval Equipment Developed through the Data Quality Objective Process (Equipment DQO) (Bloom 1996). The Tank Characterization Technical Sampling Basis document (Brown et al. 1998) indicates that these issues, except the Equipment DQO, apply to tank 241-AZ-102 for this sampling event. The Equipment DQO is applied for shear strength measurements of the solids segments only. Poppiti (1999) requires additional americium-241 analyses of the sludge segments. Brown et al. (1998) also identify safety screening, regulatory issues and provision of samples to the Privatization Contractor(s) as applicable issues for this tank. However, these issues will not be addressed via this sampling event. Reynolds et al. (1999) concluded that information from previous sampling events was sufficient to satisfy the safety screening requirements for tank 241-AZ-102. Push mode core samples will be obtained from risers 15C and 24A to provide sufficient material for the chemical analyses and tests required to satisfy these data quality objectives. The 222-S Laboratory will extrude core samples, composite the liquids and solids, perform chemical

  18. Gait Analysis Laboratory

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Complete motion analysis laboratory has evolved out of analyzing walking patterns of crippled children at Stanford Children's Hospital. Data is collected by placing tiny electrical sensors over muscle groups of child's legs and inserting step-sensing switches in soles of shoes. Miniature radio transmitters send signals to receiver for continuous recording of abnormal walking pattern. Engineers are working to apply space electronics miniaturization techniques to reduce size and weight of telemetry system further as well as striving to increase signal bandwidth so analysis can be performed faster and more accurately using a mini-computer.

  19. Integration of Biosafety into Core Facility Management

    PubMed Central

    Fontes, Benjamin

    2013-01-01

    This presentation will discuss the implementation of biosafety policies for small, medium and large core laboratories with primary shared objectives of ensuring the control of biohazards to protect core facility operators and assure conformity with applicable state and federal policies, standards and guidelines. Of paramount importance is the educational process to inform core laboratories of biosafety principles and policies and to illustrate the technology and process pathways of the core laboratory for biosafety professionals. Elevating awareness of biohazards and the biosafety regulatory landscape among core facility operators is essential for the establishment of a framework for both project and material risk assessment. The goal of the biohazard risk assessment process is to identify the biohazard risk management parameters to conduct the procedure safely and in compliance with applicable regulations. An evaluation of the containment, protective equipment and work practices for the procedure for the level of risk identified is facilitated by the establishment of a core facility registration form for work with biohazards and other biological materials with potential risk. The final step in the biocontainment process is the assumption of Principal Investigator role with full responsibility for the structure of the site-specific biosafety program plan by core facility leadership. The presentation will provide example biohazard protocol reviews and accompanying containment measures for core laboratories at Yale University.

  20. Integration of ANFIS, NN and GA to determine core porosity and permeability from conventional well log data

    NASA Astrophysics Data System (ADS)

    Ja'fari, Ahmad; Hamidzadeh Moghadam, Rasoul

    2012-10-01

    Routine core analysis provides useful information for petrophysical study of the hydrocarbon reservoirs. Effective porosity and fluid conductivity (permeability) could be obtained from core analysis in laboratory. Coring hydrocarbon bearing intervals and analysis of obtained cores in laboratory is expensive and time consuming. In this study, an improved method is proposed for making a quantitative correlation between porosity and permeability obtained from core and conventional well log data by integrating different artificial intelligence systems. The proposed method combines the results of adaptive neuro-fuzzy inference system (ANFIS) and neural network (NN) algorithms for overall estimation of core data from conventional well log data. These methods multiply the output of each algorithm with a weight factor. Simple averaging and weighted averaging were used for determining the weight factors. In the weighted averaging method the genetic algorithm (GA) is used to determine the weight factors. The overall algorithm was applied in one of SW Iran’s oil fields with two cored wells. One-third of all data were used as the test dataset and the rest of them were used for training the networks. Results show that the output of the GA averaging method provided the best mean square error and also the best correlation coefficient with real core data.
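
    A minimal sketch of the committee idea described above follows: two off-the-shelf regressors stand in for the ANFIS and NN estimators, and an evolutionary optimizer (SciPy's differential evolution, standing in here for the GA) searches for the output weight that minimizes the mean square error of the combined permeability prediction. The data are synthetic, not the field data used in the paper.

```python
# Weighted-committee sketch: combine two estimators with a weight found by an
# evolutionary search, and compare against simple averaging (synthetic data).
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.neural_network import MLPRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
logs = rng.normal(size=(300, 4))                                  # synthetic well-log inputs
perm = logs @ [0.8, -0.5, 0.3, 0.1] + 0.1 * rng.normal(size=300)  # synthetic core permeability

X_tr, X_te, y_tr, y_te = train_test_split(logs, perm, test_size=1/3, random_state=0)

m1 = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X_tr, y_tr)
m2 = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_tr, y_tr)
p1, p2 = m1.predict(X_te), m2.predict(X_te)

def mse_of_weight(w):
    combined = w[0] * p1 + (1 - w[0]) * p2          # weights constrained to sum to 1
    return np.mean((combined - y_te) ** 2)

best = differential_evolution(mse_of_weight, bounds=[(0.0, 1.0)], seed=0)
print(f"simple average MSE:   {mse_of_weight([0.5]):.4f}")
print(f"weighted average MSE: {best.fun:.4f} (w = {best.x[0]:.2f})")
```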

  1. Multiple pre- and post-analytical lean approaches to the improvement of the laboratory turnaround time in a large core laboratory.

    PubMed

    Lou, Amy H; Elnenaei, Manal O; Sadek, Irene; Thompson, Shauna; Crocker, Bryan D; Nassar, Bassam A

    2017-10-01

    Core laboratory (CL), as a new business model, facilitates consolidation and integration of laboratory services to enhance efficiency and reduce costs. This study evaluates the impact of a total laboratory automation system (TLA), electric track vehicle (ETV) system and auto-verification (AV) of results on overall turnaround time (TAT) (phlebotomy to reporting TAT: PR-TAT) within a CL setting. Mean, median and outlier percentage (OP) of PR-TAT were compared for pre- and post-CL eras using five representative tests based on different request priorities. Comparison studies were also carried out on the intra-laboratory TAT (in-lab to reporting TAT: IR-TAT) and the delivery TAT (phlebotomy to in-lab TAT: PI-TAT) to reflect the efficiency of the TLA (both before and after introducing result AV) and ETV systems respectively. Median PR-TATs for the urgent samples were reduced on average by 16% across all representative analytes. Median PR-TATs for the routine samples were curtailed by 51%, 50%, 49%, 34% and 22% for urea, potassium, thyroid stimulating hormone (TSH), complete blood count (CBC) and prothrombin time (PT) respectively. The shorter PR-TAT was attributed to a significant reduction of IR-TAT through the TLA. However, the median PI-TAT was delayed when the ETV was used. Application of various AV rules shortened the median IR-TATs for potassium and urea. However, the OPs of PR-TAT exceeding 60 min for the STAT requests were all higher than those from the pre-CL era. TLA and auto-verification rules help to efficiently manage substantial volumes of urgent and routine samples. However, the ETV application as it stands shows a negative impact on the PR-TAT. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
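
    The turnaround-time metrics used in this and the following record (mean, median, and outlier percentage of PR-TAT, plus the PI-TAT and IR-TAT components) reduce to simple timestamp arithmetic; the sketch below shows the computation on hypothetical timestamps, with an illustrative 60 min outlier threshold.

```python
# Minimal sketch of PR-TAT, PI-TAT and IR-TAT metrics from hypothetical data.
import pandas as pd

df = pd.DataFrame({
    "phlebotomy": pd.to_datetime(["2017-01-05 08:00", "2017-01-05 08:10", "2017-01-05 08:20"]),
    "in_lab":     pd.to_datetime(["2017-01-05 08:25", "2017-01-05 08:50", "2017-01-05 08:55"]),
    "reported":   pd.to_datetime(["2017-01-05 08:55", "2017-01-05 09:40", "2017-01-05 09:20"]),
})

df["PI_TAT"] = (df["in_lab"]   - df["phlebotomy"]).dt.total_seconds() / 60   # delivery
df["IR_TAT"] = (df["reported"] - df["in_lab"]).dt.total_seconds() / 60       # intra-laboratory
df["PR_TAT"] = (df["reported"] - df["phlebotomy"]).dt.total_seconds() / 60   # overall

threshold_min = 60   # outlier threshold for STAT requests (illustrative)
summary = {
    "mean PR-TAT (min)":   df["PR_TAT"].mean(),
    "median PR-TAT (min)": df["PR_TAT"].median(),
    "outlier % (>60 min)": 100 * (df["PR_TAT"] > threshold_min).mean(),
}
print(summary)
```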

  2. Evaluation of the impact of a total automation system in a large core laboratory on turnaround time.

    PubMed

    Lou, Amy H; Elnenaei, Manal O; Sadek, Irene; Thompson, Shauna; Crocker, Bryan D; Nassar, Bassam

    2016-11-01

    Growing financial and workload pressures on laboratories coupled with user demands for faster turnaround time (TAT) has steered the implementation of total laboratory automation (TLA). The current study evaluates the impact of a complex TLA on core laboratory efficiency through the analysis of the In-lab to Report TAT (IR-TAT) for five representative tests based on the different requested priorities. Mean, median and outlier percentages (OP) for IR-TAT were determined following TLA implementation and where possible, compared to the pre-TLA era. The shortest mean IR-TAT via the priority lanes of the TLA was 22 min for Complete Blood Count (CBC), followed by 34 min, 39 min and 40 min for Prothrombin time (PT), urea and potassium testing respectively. The mean IR-TAT for STAT CBC loaded directly on to the analyzers was 5 min shorter than that processed via the TLA. The mean IR-TATs for both STAT potassium and urea via offline centrifugation were comparable to that processed by the TLA. The longest mean IR-TAT via regular lanes of the TLA was 62 min for Thyroid-Stimulating Hormone (TSH) while the shortest was 17 min for CBC. All parameters for IR-TAT for CBC and PT tests decreased significantly post-TLA across all requested priorities, in particular the outlier percentage (OP) at 30 and 60 min. TLA helps to efficiently manage substantial volumes of samples across all requested priorities. Manual processing for small STAT volumes, at both the initial centrifugation stage and front loading directly on to analyzers, is however likely to yield the shortest IR-TAT. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  3. Petrographic Analysis of Cores from Plant 42

    DTIC Science & Technology

    2016-10-01

    ERDC TR (draft): Petrographic Analysis of Cores from Plant 42. Engineer Research and Development Center. E. Rae Reed-Gore, Kyle Klaus, and Robert D. Moser. October 2016. Approved for public release; distribution is unlimited. Figure 1: Test location map of AF Plant 42 with core number locations.

  4. Advanced Materials and Solids Analysis Research Core (AMSARC)

    EPA Science Inventory

    The Advanced Materials and Solids Analysis Research Core (AMSARC), centered at the U.S. Environmental Protection Agency's (EPA) Andrew W. Breidenbach Environmental Research Center in Cincinnati, Ohio, is the foundation for the Agency's solids and surfaces analysis capabilities. ...

  5. Instrument Synthesis and Analysis Laboratory

    NASA Technical Reports Server (NTRS)

    Wood, H. John

    2004-01-01

    The topics addressed in this viewgraph presentation include information on 1) Historic instruments at Goddard; 2) Integrated Design Capability at Goddard; 3) The Instrument Synthesis and Analysis Laboratory (ISAL).

  6. The Translational Genomics Core at Partners Personalized Medicine: Facilitating the Transition of Research towards Personalized Medicine

    PubMed Central

    Blau, Ashley; Brown, Alison; Mahanta, Lisa; Amr, Sami S.

    2016-01-01

    The Translational Genomics Core (TGC) at Partners Personalized Medicine (PPM) serves as a fee-for-service core laboratory for Partners Healthcare researchers, providing access to technology platforms and analysis pipelines for genomic, transcriptomic, and epigenomic research projects. The interaction of the TGC with various components of PPM provides it with a unique infrastructure that allows for greater IT and bioinformatics opportunities, such as sample tracking and data analysis. The following article describes some of the unique opportunities available to an academic research core operating within PPM, such as the ability to develop analysis pipelines with a dedicated bioinformatics team and maintain a flexible Laboratory Information Management System (LIMS) with the support of an internal IT team, as well as the operational challenges encountered in responding to emerging technologies, diverse investigator needs, and high staff turnover. In addition, the implementation and operational role of the TGC in the Partners Biobank genotyping project of over 25,000 samples is presented as an example of core activities working with other components of PPM. PMID:26927185

  7. The Translational Genomics Core at Partners Personalized Medicine: Facilitating the Transition of Research towards Personalized Medicine.

    PubMed

    Blau, Ashley; Brown, Alison; Mahanta, Lisa; Amr, Sami S

    2016-02-26

    The Translational Genomics Core (TGC) at Partners Personalized Medicine (PPM) serves as a fee-for-service core laboratory for Partners Healthcare researchers, providing access to technology platforms and analysis pipelines for genomic, transcriptomic, and epigenomic research projects. The interaction of the TGC with various components of PPM provides it with a unique infrastructure that allows for greater IT and bioinformatics opportunities, such as sample tracking and data analysis. The following article describes some of the unique opportunities available to an academic research core operating within PPM, such as the ability to develop analysis pipelines with a dedicated bioinformatics team and maintain a flexible Laboratory Information Management System (LIMS) with the support of an internal IT team, as well as the operational challenges encountered in responding to emerging technologies, diverse investigator needs, and high staff turnover. In addition, the implementation and operational role of the TGC in the Partners Biobank genotyping project of over 25,000 samples is presented as an example of core activities working with other components of PPM.

  8. Comparability of river suspended-sediment sampling and laboratory analysis methods

    USGS Publications Warehouse

    Groten, Joel T.; Johnson, Gregory D.

    2018-03-06

    Accurate measurements of suspended sediment, a leading water-quality impairment in many Minnesota rivers, are important for managing and protecting water resources; however, water-quality standards for suspended sediment in Minnesota are based on grab field sampling and total suspended solids (TSS) laboratory analysis methods that have underrepresented concentrations of suspended sediment in rivers compared to U.S. Geological Survey equal-width-increment or equal-discharge-increment (EWDI) field sampling and suspended sediment concentration (SSC) laboratory analysis methods. Because of this underrepresentation, the U.S. Geological Survey, in collaboration with the Minnesota Pollution Control Agency, collected concurrent grab and EWDI samples at eight sites to compare results obtained using different combinations of field sampling and laboratory analysis methods. Study results determined that grab field sampling and TSS laboratory analysis results were biased substantially low compared to EWDI sampling and SSC laboratory analysis results, respectively. Differences in both field sampling and laboratory analysis methods caused grab and TSS methods to be biased substantially low. The difference caused by laboratory analysis methods was slightly greater than that caused by field sampling methods. Sand-sized particles had a strong effect on the comparability of the field sampling and laboratory analysis methods. These results indicated that grab field sampling and TSS laboratory analysis methods fail to capture most of the sand being transported by the stream. The results indicate there is less of a difference between samples collected with grab field sampling and analyzed for TSS and the concentration of fines determined by SSC analysis. Even though differences are present, the presence of strong correlations between SSC and TSS concentrations provides the opportunity to develop site-specific relations to address transport processes not captured by grab field sampling and TSS laboratory analysis methods.
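
    The "site-specific relations" mentioned above amount to regressing SSC on TSS for a given site; the sketch below shows an ordinary least-squares version with synthetic concentrations (not USGS data).

```python
# Minimal sketch of a site-specific SSC-vs-TSS relation (synthetic mg/L values).
import numpy as np

tss = np.array([12,  25,  40,  63,  88, 120, 150, 210])   # grab / TSS results
ssc = np.array([18,  36,  61,  95, 130, 185, 240, 330])   # EWDI / SSC results

slope, intercept = np.polyfit(tss, ssc, deg=1)
predicted = slope * tss + intercept
r_squared = 1 - np.sum((ssc - predicted) ** 2) / np.sum((ssc - ssc.mean()) ** 2)

print(f"SSC ~ {slope:.2f} * TSS + {intercept:.1f}  (R^2 = {r_squared:.3f})")
```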

  9. MULTI-CORE AND OPTICAL PROCESSOR RELATED APPLICATIONS RESEARCH AT OAK RIDGE NATIONAL LABORATORY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barhen, Jacob; Kerekes, Ryan A; ST Charles, Jesse Lee

    2008-01-01

    High-speed parallelization of common tasks holds great promise as a low-risk approach to achieving the significant increases in signal processing and computational performance required for next generation innovations in reconfigurable radio systems. Researchers at the Oak Ridge National Laboratory have been working on exploiting the parallelization offered by this emerging technology and applying it to a variety of problems. This paper will highlight recent experience with four different parallel processors applied to signal processing tasks that are directly relevant to signal processing required for SDR/CR waveforms. The first is the EnLight Optical Core Processor applied to matched filter (MF) correlation processing via fast Fourier transform (FFT) of broadband Doppler-sensitive waveforms (DSW) using active sonar arrays for target tracking. The second is the IBM CELL Broadband Engine applied to a 2-D discrete Fourier transform (DFT) kernel for image processing and frequency domain processing. And the third is the NVIDIA graphical processor applied to document feature clustering. EnLight Optical Core Processor. Optical processing is inherently capable of high-parallelism that can be translated to very high performance, low power dissipation computing. The EnLight 256 is a small form factor signal processing chip (5x5 cm2) with a digital optical core that is being developed by an Israeli startup company. As part of its evaluation of foreign technology, ORNL's Center for Engineering Science Advanced Research (CESAR) had access to precursor EnLight 64 Alpha hardware for a preliminary assessment of capabilities in terms of large Fourier transforms for matched filter banks and on applications related to Doppler-sensitive waveforms. This processor is optimized for array operations, which it performs in fixed-point arithmetic at the rate of 16 TeraOPS at 8-bit precision. This is approximately 1000 times faster than the fastest DSP available today. The optical

  10. Advanced core-analyses for subsurface characterization

    NASA Astrophysics Data System (ADS)

    Pini, R.

    2017-12-01

    The heterogeneity of geological formations varies over a wide range of length scales and represents a major challenge for predicting the movement of fluids in the subsurface. Although they are inherently limited in the accessible length-scale, laboratory measurements on reservoir core samples still represent the only way to make direct observations on key transport properties. Yet, properties derived on these samples are of limited use and should be regarded as sample-specific (or 'pseudos'), if the presence of sub-core scale heterogeneities is not accounted for in data processing and interpretation. The advent of imaging technology has significantly reshaped the landscape of so-called Special Core Analysis (SCAL) by providing unprecedented insight on rock structure and processes down to the scale of a single pore throat (i.e. the scale at which all reservoir processes operate). Accordingly, improved laboratory workflows are needed that make use of such wealth of information by, e.g., referring to the internal structure of the sample and in-situ observations, to obtain accurate parameterisation of both rock- and flow-properties that can be used to populate numerical models. We report here on the development of such workflow for the study of solute mixing and dispersion during single- and multi-phase flows in heterogeneous porous systems through a unique combination of two complementary imaging techniques, namely X-ray Computed Tomography (CT) and Positron Emission Tomography (PET). The experimental protocol is applied to both synthetic and natural porous media, and it integrates (i) macroscopic observations (tracer effluent curves), (ii) sub-core scale parameterisation of rock heterogeneities (e.g., porosity, permeability and capillary pressure), and direct 3D observation of (iii) fluid saturation distribution and (iv) the dynamic spreading of the solute plumes. Suitable mathematical models are applied to reproduce experimental observations, including both 1D and 3D
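
    For the 1D case mentioned above, the tracer effluent (breakthrough) curve is commonly modeled with the advection-dispersion equation; the sketch below evaluates the classical Ogata-Banks approximation for a continuous injection, with illustrative core length, velocity, and dispersion coefficient rather than values from the abstract.

```python
# Minimal sketch: 1D advection-dispersion breakthrough curve for a core flood.
import numpy as np
from scipy.special import erfc

def breakthrough(t, length, velocity, dispersion):
    """Effluent concentration C/C0 for a continuous tracer injection into a
    1D homogeneous column (classical Ogata-Banks solution, leading term)."""
    return 0.5 * erfc((length - velocity * t) / (2.0 * np.sqrt(dispersion * t)))

length, velocity, dispersion = 0.10, 1e-5, 1e-7   # m, m/s, m^2/s (illustrative)
times = np.array([0.5, 1.0, 1.5, 2.0]) * length / velocity   # 0.5 to 2 pore volumes
for t, c in zip(times, breakthrough(times, length, velocity, dispersion)):
    print(f"t = {t:7.0f} s  (PV = {t * velocity / length:.1f}):  C/C0 = {c:.2f}")
```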

  11. Results and analysis of saltstone cores taken from saltstone disposal unit cell 2A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reigel, M. M.; Hill, K. A.

    2016-03-01

    As part of an ongoing Performance Assessment (PA) Maintenance Plan, Savannah River Remediation (SRR) has developed a sampling and analyses strategy to facilitate the comparison of field-emplaced samples (i.e., saltstone placed and cured in a Saltstone Disposal Unit (SDU)) with samples prepared and cured in the laboratory. The primary objectives of the Sampling and Analyses Plan (SAP) are: (1) to demonstrate a correlation between the measured properties of laboratory-prepared, simulant samples (termed Sample Set 3), and the field-emplaced saltstone samples (termed Sample Set 9), and (2) to validate property values assumed for the Saltstone Disposal Facility (SDF) PA modeling. The analysis and property data for Sample Set 9 (i.e., six core samples extracted from SDU Cell 2A (SDU2A)) are documented in this report, and where applicable, the results are compared to the results for Sample Set 3. Relevant properties to demonstrate the aforementioned objectives include bulk density, porosity, saturated hydraulic conductivity (SHC), and radionuclide leaching behavior.
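
    Two of the properties compared above, bulk density and porosity, can be obtained gravimetrically from dry and saturated core masses; the sketch below shows that arithmetic with illustrative numbers (not SDU2A data), leaving saturated hydraulic conductivity and leaching behavior to dedicated tests.

```python
# Minimal sketch: water-accessible porosity and dry bulk density from core masses.
# Masses and bulk volume are illustrative assumptions, not measured values.
RHO_WATER = 0.998  # g/cm^3 at ~20 C

def core_properties(mass_dry_g, mass_saturated_g, bulk_volume_cm3):
    porosity = (mass_saturated_g - mass_dry_g) / (RHO_WATER * bulk_volume_cm3)
    bulk_density = mass_dry_g / bulk_volume_cm3
    return porosity, bulk_density

phi, rho_b = core_properties(mass_dry_g=310.0, mass_saturated_g=395.0,
                             bulk_volume_cm3=200.0)
print(f"porosity: {phi:.2f}, dry bulk density: {rho_b:.2f} g/cm^3")
```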

  12. Tank Vapor Sampling and Analysis Data Package for Tank 241-Z-361 Sampled 09/22/1999 and 09/27/1999 During Sludge Core Removal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    VISWANATH, R.S.

    This data package presents sampling data and analytical results from the September 22 and 27, 1999, headspace vapor sampling of Hanford Site Tank 241-Z-361 during sludge core removal. The Lockheed Martin Hanford Corporation (LMHC) sampling team collected the samples and the Waste Management Laboratory (WML) analyzed the samples in accordance with the requirements specified in the 241-Z-361 Sludge Characterization Sampling and Analysis Plan (SAP), HNF-4371, Rev. 1 (Babcock and Wilcox Hanford Corporation, 1999). Six SUMMA{trademark} canister samples were collected on each day (1 ambient field blank and 5 tank vapor samples collected when each core segment was removed). The samples were radiologically released on September 28 and October 4, 1999, and received at the laboratory on September 29 and October 6, 1999. Target analytes were not detected at concentrations greater than their notification limits as specified in the SAP. Analytical results for the target analytes and tentatively identified compounds (TICs) are presented in Section 2.2.2 starting on page 2B-7. Three compounds identified for analysis in the SAP were analyzed as TICs. The discussion of this modification is presented in Section 2.2.1.2.

  13. In-Flight Laboratory Analysis

    NASA Technical Reports Server (NTRS)

    Baumann, David; Perusek, Gail; Nelson, Emily; Krihak, Michael; Brown, Dan

    2012-01-01

    One-year study objectives align with HRP requirements. HRP requirements include measurement panels for research and for medical operations; these measurement panels are distinctly different. Instrument requirements are defined: power, volume, and mass are not as critical a limitation as they are for medical operations on deep space exploration missions. One-year evaluation goals will lead HHC towards an in-flight laboratory analysis capability.

  14. Fukushima Daiichi Unit 1 Uncertainty Analysis-Exploration of Core Melt Progression Uncertain Parameters-Volume II.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Denman, Matthew R.; Brooks, Dusty Marie

    Sandia National Laboratories (SNL) has conducted an uncertainty analysis (UA) on the Fukushima Daiichi Unit 1 (1F1) accident progression with the MELCOR code. Volume I of the 1F1 UA discusses the physical modeling details and time history results of the UA. Volume II of the 1F1 UA discusses the statistical viewpoint. The model used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). The goal of this work was to perform a focused evaluation of uncertainty in core damage progression behavior and its effect on key figures of merit (e.g., hydrogen production, fraction of intact fuel, vessel lower head failure) and in doing so assess the applicability of traditional sensitivity analysis techniques.

  15. Pharmacology Portal: An Open Database for Clinical Pharmacologic Laboratory Services.

    PubMed

    Karlsen Bjånes, Tormod; Mjåset Hjertø, Espen; Lønne, Lars; Aronsen, Lena; Andsnes Berg, Jon; Bergan, Stein; Otto Berg-Hansen, Grim; Bernard, Jean-Paul; Larsen Burns, Margrete; Toralf Fosen, Jan; Frost, Joachim; Hilberg, Thor; Krabseth, Hege-Merete; Kvan, Elena; Narum, Sigrid; Austgulen Westin, Andreas

    2016-01-01

    More than 50 Norwegian public and private laboratories provide one or more analyses for therapeutic drug monitoring or testing for drugs of abuse. Practices differ among laboratories, and analytical repertoires can change rapidly as new substances become available for analysis. The Pharmacology Portal was developed to provide an overview of these activities and to standardize the practices and terminology among laboratories. The Pharmacology Portal is a modern dynamic web database comprising all available analyses within therapeutic drug monitoring and testing for drugs of abuse in Norway. Content can be retrieved by using the search engine or by scrolling through substance lists. The core content is a substance registry updated by a national editorial board of experts within the field of clinical pharmacology. This ensures quality and consistency regarding substance terminologies and classification. All laboratories publish their own repertoires in a user-friendly workflow, adding laboratory-specific details to the core information in the substance registry. The user management system ensures that laboratories are restricted from editing content in the database core or in repertoires within other laboratory subpages. The portal is for nonprofit use, and has been fully funded by the Norwegian Medical Association, the Norwegian Society of Clinical Pharmacology, and the 8 largest pharmacologic institutions in Norway. The database server runs an open-source content management system that ensures flexibility with respect to further development projects, including the potential expansion of the Pharmacology Portal to other countries. Copyright © 2016 Elsevier HS Journals, Inc. All rights reserved.

  16. Core vs. Bulk Samples in Soil-Moisture Tension Analyses

    Treesearch

    Walter M. Broadfoot

    1954-01-01

    The usual laboratory procedure in determining soil-moisture tension values is to use "undisturbed" soil cores for tensions up to 60 cm. of water and bulk soil samples for higher tensions. Low tensions are usually obtained with a tension table and the higher tensions by use of pressure plate apparatus. In tension analysis at the Vicksburg Infiltration Project...

  17. The Alzheimer's Disease Neuroimaging Initiative 2 PET Core: 2015.

    PubMed

    Jagust, William J; Landau, Susan M; Koeppe, Robert A; Reiman, Eric M; Chen, Kewei; Mathis, Chester A; Price, Julie C; Foster, Norman L; Wang, Angela Y

    2015-07-01

    This article reviews the work done in the Alzheimer's Disease Neuroimaging Initiative positron emission tomography (ADNI PET) core over the past 5 years, largely concerning techniques, methods, and results related to amyloid imaging in ADNI. The PET Core has used [(18)F]florbetapir routinely on ADNI participants, with over 1600 scans available for download. Four different laboratories are involved in data analysis, and have examined factors such as longitudinal florbetapir analysis, use of [(18)F]fluorodeoxyglucose (FDG)-PET in clinical trials, and relationships between different biomarkers and cognition. Converging evidence from the PET Core has indicated that cross-sectional and longitudinal florbetapir analyses require different reference regions. Studies have also examined the relationship between florbetapir data obtained immediately after injection, which reflects perfusion, and FDG-PET results. Finally, standardization has included the translation of florbetapir PET data to a centiloid scale. The PET Core has demonstrated a variety of methods for the standardization of biomarkers such as florbetapir PET in a multicenter setting. Copyright © 2015 The Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
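
    Two quantities recur in this and the earlier ADNI PET Core record: the SUVR formed from a target-cortex mean over a reference-region mean, and its linear mapping to the centiloid scale anchored at a young-control level (0 CL) and a typical-AD level (100 CL). The sketch below shows both computations; the anchor SUVR values are placeholders for illustration, not the calibration constants published by the ADNI PET Core.

```python
# Minimal sketch of SUVR and a linear centiloid transform (placeholder anchors).
def suvr(cortical_mean, reference_mean):
    """Standardized uptake value ratio: target cortex over reference region."""
    return cortical_mean / reference_mean

def to_centiloid(suvr_value, suvr_young_control, suvr_typical_ad):
    """Linear centiloid mapping: 0 CL at the young-control anchor,
    100 CL at the typical-AD anchor."""
    return 100.0 * (suvr_value - suvr_young_control) / (suvr_typical_ad - suvr_young_control)

s = suvr(cortical_mean=1.35, reference_mean=1.00)          # illustrative regional means
print(f"SUVR = {s:.2f}, centiloid ~ {to_centiloid(s, 1.05, 1.85):.0f} CL")
```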

  18. Core outcome measures for opioid abuse liability laboratory assessment studies in humans: IMMPACT recommendations

    PubMed Central

    Comer, Sandra D.; Zacny, James P.; Dworkin, Robert H.; Turk, Dennis C.; Bigelow, George E.; Foltin, Richard W.; Jasinski, Donald R.; Sellers, Edward M.; Adams, Edgar H.; Balster, Robert; Burke, Laurie B.; Cerny, Igor; Colucci, Robert D.; Cone, Edward; Cowan, Penney; Farrar, John T.; Haddox, J. David; Haythornthwaite, Jennifer A.; Hertz, Sharon; Jay, Gary W.; Johanson, Chris-Ellyn; Junor, Roderick; Katz, Nathaniel P.; Klein, Michael; Kopecky, Ernest A.; Leiderman, Deborah B.; McDermott, Michael P.; O’Brien, Charles; O’Connor, Alec B.; Palmer, Pamela P.; Raja, Srinivasa N.; Rappaport, Bob A.; Rauschkolb, Christine; Rowbotham, Michael C.; Sampaio, Cristina; Setnik, Beatrice; Sokolowska, Marta; Stauffer, Joseph W.; Walsh, Sharon L.

    2012-01-01

    A critical component in development of opioid analgesics is assessment of their abuse liability (AL). Standardization of approaches and measures used in assessing AL has the potential to facilitate comparisons across studies, research laboratories, and drugs. The goal of this report is to provide consensus recommendations regarding core outcome measures for assessing abuse potential of opioid medications in humans in a controlled laboratory setting. Although many of the recommended measures are appropriate for assessing the AL of medications from other drug classes, the focus here is on opioid medications because they present unique risks from both physiological (e.g., respiratory depression, physical dependence) and public health (e.g., individuals in pain) perspectives. A brief historical perspective on AL testing is provided and then those measures that can be considered primary and secondary outcomes and possible additional outcomes in AL assessment are discussed. These outcome measures include: (1) subjective effects (some of which comprise the primary outcome measures, including drug liking); (2) physiological responses; (3) drug self-administration behavior; and (4) cognitive and psychomotor performance. Prior to presenting recommendations for standardized approaches and measures to be used in AL assessments, the appropriateness of using these measures in clinical trials with patients in pain is discussed. PMID:22998781

  19. Laboratory theory and methods for sediment analysis

    USGS Publications Warehouse

    Guy, Harold P.

    1969-01-01

    The diverse character of fluvial sediments makes the choice of laboratory analysis somewhat arbitrary and the processing of sediment samples difficult. This report presents some theories and methods used by the Water Resources Division for analysis of fluvial sediments to determine the concentration of suspended-sediment samples and the particle-size distribution of both suspended-sediment and bed-material samples. Other analyses related to these determinations may include particle shape, mineral content, and specific gravity, the organic matter and dissolved solids of samples, and the specific weight of soils. The merits and techniques of both the evaporation and filtration methods for concentration analysis are discussed. Methods used for particle-size analysis of suspended-sediment samples may include the sieve pipet, the VA tube-pipet, or the BW tube-VA tube depending on the equipment available, the concentration and approximate size of sediment in the sample, and the settling medium used. The choice of method for most bed-material samples is usually limited to procedures suitable for sand or to some type of visual analysis for large sizes. Several tested forms are presented to help insure a well-ordered system in the laboratory to handle the samples, to help determine the kind of analysis required for each, to conduct the required processes, and to assist in the required computations. Use of the manual should further 'standardize' methods of fluvial sediment analysis among the many laboratories and thereby help to achieve uniformity and precision of the data.

  20. A comprehensive Laboratory Services Survey of State Public Health Laboratories.

    PubMed

    Inhorn, Stanley L; Wilcke, Burton W; Downes, Frances Pouch; Adjanor, Oluwatosin Omolade; Cada, Ronald; Ford, James R

    2006-01-01

    In November 2004, the Association of Public Health Laboratories (APHL) conducted a Comprehensive Laboratory Services Survey of State Public Health Laboratories (SPHLs) in order to establish the baseline data necessary for Healthy People 2010 Objective 23-13. This objective aims to measure the increase in the proportion of health agencies that provide or assure access to comprehensive laboratory services to support essential public health services. This assessment addressed only SPHLs and served as a baseline to periodically evaluate the level of improvement in the provision of laboratory services over the decade ending 2010. The 2004 survey used selected questions that were identified as key indicators of provision of comprehensive laboratory services. The survey was developed in consultation with the Centers for Disease Control and Prevention National Center for Health Statistics, based on newly developed data sources. Forty-seven states and one territory responded to the survey. The survey was based on the 11 core functions of SPHLs as previously defined by APHL. The range of performance among individual laboratories for the 11 core functions (subobjectives) reflects the challenging issues that have confronted SPHLs in the first half of this decade. APHL is now working on a coordinated effort with other stakeholders to create seamless state and national systems for the provision of laboratory services in support of public health programs. These services are necessary to help face the threats raised by the specter of terrorism, emerging infections, and natural disasters.

  1. High Resolution Continuous Flow Analysis System for Polar Ice Cores

    NASA Astrophysics Data System (ADS)

    Dallmayr, Remi; Azuma, Kumiko; Yamada, Hironobu; Kjær, Helle Astrid; Vallelonga, Paul; Azuma, Nobuhiko; Takata, Morimasa

    2014-05-01

    In recent decades, Continuous Flow Analysis (CFA) technology for ice core analyses has been developed to reconstruct past changes of the climate system 1), 2). Compared with traditional analyses of discrete samples, a CFA system offers much faster and higher depth resolution analyses. It also generates a decontaminated sample stream without a time-consuming sample processing procedure by using the inner area of an ice-core sample. The CFA system that we have been developing is currently able to continuously measure stable water isotopes 3) and electrolytic conductivity, as well as to collect discrete samples for both the inner and outer areas with variable depth resolutions. Chemistry analyses 4) and methane-gas analysis 5) are planned to be added using the continuous water stream system 5). In order to optimize the resolution of the current system with minimal sample volumes necessary for different analyses, our CFA system typically melts an ice core at 1.6 cm/min. Instead of using a wire position encoder with a typical 1 mm positioning resolution 6), we decided to use a high-accuracy CCD laser displacement sensor (LKG-G505, Keyence). At the 1.6 cm/min melt rate, the positioning resolution was improved to 0.27 mm. Also, the mixing volume that occurs in our open-split debubbler is regulated using its weight. The overflow pumping rate is smoothly PID controlled to maintain the weight as low as possible, while keeping a safety buffer of water to avoid air bubbles downstream. To evaluate the system's depth resolution, we will present the preliminary data of electrolytic conductivity obtained by melting 12 bags of the North Greenland Eemian Ice Drilling (NEEM) ice core. The samples correspond to different climate intervals (Greenland Stadial 21, 22, Greenland Stadial 5, Greenland Interstadial 5, Greenland Interstadial 7, Greenland Stadial 8). We will present results for Greenland Stadial 8, whose depths and ages are between 1723.7 and 1724.8 meters, and 35.520 to
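
    The weight-regulated overflow control described in this record is a standard feedback problem. The following sketch shows a generic discrete PID loop holding a debubbler weight near a small setpoint by adjusting the overflow pump rate; the gains, setpoint, and pump interface are hypothetical, not the values used in the system above.

      # Minimal sketch of a discrete PID loop holding an open-split debubbler
      # weight near a small setpoint by adjusting the overflow pump rate.
      # Gains, setpoint, and the pump/balance interfaces are hypothetical.

      class PID:
          def __init__(self, kp, ki, kd, setpoint):
              self.kp, self.ki, self.kd = kp, ki, kd
              self.setpoint = setpoint
              self.integral = 0.0
              self.prev_error = None

          def update(self, measured, dt):
              error = measured - self.setpoint          # positive when too heavy
              self.integral += error * dt
              deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
              self.prev_error = error
              return self.kp * error + self.ki * self.integral + self.kd * deriv

      pid = PID(kp=0.8, ki=0.05, kd=0.0, setpoint=2.0)    # keep ~2 g buffer of water

      def control_step(balance_weight_g, dt_s=1.0):
          correction = pid.update(balance_weight_g, dt_s)
          base_rate_ml_min = 0.5                           # nominal overflow pumping rate
          return max(0.0, base_rate_ml_min + correction)   # pump faster when overweight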

  2. Sandia Laboratories technical capabilities: engineering analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lundergan, C. D.

    1975-12-01

    This report characterizes the engineering analysis capabilities at Sandia Laboratories. Selected applications of these capabilities are presented to illustrate the extent to which they can be applied in research and development programs. (auth)

  3. Purification and Analysis of Colorful Hypothetical Open Reading Frames: An Inexpensive Gateway Laboratory

    ERIC Educational Resources Information Center

    DeSantis, Kara A.; Reinking, Jeffrey L.

    2011-01-01

    This laboratory exercise is an inquiry-based investigation developed around the core experiment where students, working alone or in groups, each purify and analyze their own prescreened colored proteins using immobilized metal affinity chromatography (IMAC). Here, we present reagents and protocols that allow 12 different proteins to be purified in…

  4. Models of the Earth's Core.

    PubMed

    Stevenson, D J

    1981-11-06

    Combined inferences from seismology, high-pressure experiment and theory, geomagnetism, fluid dynamics, and current views of terrestrial planetary evolution lead to models of the earth's core with the following properties. Core formation was contemporaneous with earth accretion; the core is not in chemical equilibrium with the mantle; the outer core is a fluid iron alloy containing significant quantities of lighter elements and is probably almost adiabatic and compositionally uniform; the more iron-rich inner solid core is a consequence of partial freezing of the outer core, and the energy release from this process sustains the earth's magnetic field; and the thermodynamic properties of the core are well constrained by the application of liquid-state theory to seismic and laboratory data.

  5. A Full Virial Analysis of the Prestellar Cores in the Ophiuchus Molecular Cloud

    NASA Astrophysics Data System (ADS)

    Pattle, Kate; Ward-Thompson, Derek

    We use SCUBA-2, HARP C18O J= 3 -> 2, Herschel and IRAM N2H+ J= 1 -> 0 observations of the Ophiuchus molecular cloud to identify and characterise the properties of the starless cores in the region. The SCUBA-2, HARP and Herschel data were taken as part of the JCMT and Herschel Gould Belt Surveys. We determine masses and temperatures and perform a full virial analysis on our cores, and find that our cores are all either bound or virialised, with gravitational energy and external pressure energy on average of similar importance in confining the cores. There is wide variation from region to region, with cores in the region influenced by B stars (Oph A) being substantially gravitationally bound, and cores in the most quiescent region (Oph C) being pressure-confined. We observe dissipation of turbulence in all our cores, and find that this dissipation is more effective in regions which do not contain outflow-driving protostars. Full details of this analysis are presented by Pattle et al. (2015).
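
    To make the virial bookkeeping in this record concrete, the sketch below evaluates the gravitational, kinetic, and external-pressure terms for a uniform-density sphere; the core properties used are illustrative, not the Ophiuchus values, and the published analysis uses more detailed density profiles.

      # Minimal sketch of a uniform-density virial balance for a starless core,
      # in the common form 2*E_kin + E_grav + E_pressure ~ 0.
      # The example core properties are illustrative, not the Ophiuchus values.
      import math

      G = 6.674e-11          # m^3 kg^-1 s^-2
      M_SUN = 1.989e30       # kg
      PARSEC = 3.086e16      # m
      K_B = 1.381e-23        # J/K

      def virial_terms(mass_msun, radius_pc, sigma_kms, p_ext_k_cm3):
          """Return (E_kin, E_grav, E_pressure) in joules for a uniform sphere."""
          m = mass_msun * M_SUN
          r = radius_pc * PARSEC
          sigma = sigma_kms * 1e3
          p_ext = p_ext_k_cm3 * K_B * 1e6            # P/k_B in K cm^-3 -> Pa
          e_kin = 1.5 * m * sigma**2                 # thermal + non-thermal motions
          e_grav = -(3.0 / 5.0) * G * m**2 / r
          e_press = -4.0 * math.pi * p_ext * r**3    # confinement by external pressure
          return e_kin, e_grav, e_press

      ek, eg, ep = virial_terms(1.0, 0.05, 0.2, 1e6)
      print("confinement ratio -(Eg+Ep)/(2Ek) =", -(eg + ep) / (2 * ek))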

  6. Web-Based Virtual Laboratory for Food Analysis Course

    NASA Astrophysics Data System (ADS)

    Handayani, M. N.; Khoerunnisa, I.; Sugiarti, Y.

    2018-02-01

    Implementation of learning on food analysis course in Program Study of Agro-industrial Technology Education faced problems. These problems include the availability of space and tools in the laboratory that is not comparable with the number of students also lack of interactive learning tools. On the other hand, the information technology literacy of students is quite high as well the internet network is quite easily accessible on campus. This is a challenge as well as opportunities in the development of learning media that can help optimize learning in the laboratory. This study aims to develop web-based virtual laboratory as one of the alternative learning media in food analysis course. This research is R & D (research and development) which refers to Borg & Gall model. The results showed that assessment’s expert of web-based virtual labs developed, in terms of software engineering aspects; visual communication; material relevance; usefulness and language used, is feasible as learning media. The results of the scaled test and wide-scale test show that students strongly agree with the development of web based virtual laboratory. The response of student to this virtual laboratory was positive. Suggestions from students provided further opportunities for improvement web based virtual laboratory and should be considered for further research.

  7. Pacific Northwest National Laboratory institutional plan: FY 1996--2001

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-01-01

    This report contains the operation and direction plan for the Pacific Northwest National Laboratory of the US Department of Energy. The topics of the plan include the laboratory mission and core competencies, the laboratory strategic plan; the laboratory initiatives in molecular sciences, microbial biotechnology, global environmental change, complex modeling of physical systems, advanced processing technology, energy technology development, and medical technologies and systems; core business areas, critical success factors, and resource projections.

  8. Planetary Sample Analysis Laboratory at DLR

    NASA Astrophysics Data System (ADS)

    Helbert, J.; Maturilli, A.; de Vera, J. P.

    2018-04-01

    Building on the available infrastructure and the long heritage, DLR is planning to create a Planetary Sample Analysis laboratory (PSA), which can be later extended to a full sample curation facility in collaboration with the Robert-Koch Institute.

  9. Biospecimen Core Resource - TCGA

    Cancer.gov

    The Cancer Genome Atlas (TCGA) Biospecimen Core Resource is a centralized laboratory that reviews and processes blood and tissue samples and their associated data using optimized standard operating procedures for the entire TCGA Research Network.

  10. Impact Vaporization of Planetesimal Cores

    NASA Astrophysics Data System (ADS)

    Kraus, R. G.; Root, S.; Lemke, R. W.; Stewart, S. T.; Jacobsen, S. B.; Mattsson, T. R.

    2013-12-01

    The degree of mixing and chemical equilibration between the iron cores of planetesimals and the mantle of the growing Earth has important consequences for understanding the end stages of Earth's formation and planet formation in general. At the Sandia Z machine, we developed a new shock-and-release technique to determine the density on the liquid-vapor dome of iron, the entropy on the iron shock Hugoniot, and the criteria for shock-induced vaporization of iron. We find that the critical shock pressure to vaporize iron is 507(+65,-85) GPa and show that decompression from a 15 km/s impact will initiate vaporization of iron cores, a velocity that is readily achieved at the end stages of planet formation. Vaporization of the iron cores increases dispersal of planetesimal cores, enables more complete chemical equilibration of the planetesimal cores with Earth's mantle, and reduces the highly siderophile element abundance on the Moon relative to Earth due to the expanding iron vapor exceeding the Moon's escape velocity. Sandia National Laboratories is a multiprogram laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under Contract No. DE-AC04-94AL85000.
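
    The link between impact velocity and peak shock pressure quoted in this record can be estimated from the Rankine-Hugoniot jump conditions with a linear Us-up fit. The sketch below assumes a symmetric iron-on-iron planar impact and uses approximate, illustrative Hugoniot parameters rather than anything derived from the Z-machine experiments.

      # Minimal sketch: peak shock pressure from the Rankine-Hugoniot jump
      # conditions for a symmetric (iron-on-iron) planar impact with a linear
      # Us = c0 + s*up fit. Parameters are approximate, illustrative values.

      RHO0 = 7870.0   # kg/m^3, initial density of iron
      C0 = 3.9e3      # m/s, Hugoniot intercept (approximate)
      S = 1.6         # dimensionless Hugoniot slope (approximate)

      def shock_pressure_symmetric(impact_velocity_ms):
          up = 0.5 * impact_velocity_ms       # symmetric impact: up = v_imp / 2
          us = C0 + S * up                    # linear Us-up Hugoniot
          return RHO0 * us * up               # P = rho0 * Us * up (initial P ~ 0)

      p = shock_pressure_symmetric(15e3)
      print(f"peak shock pressure ~ {p / 1e9:.0f} GPa")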

  11. A three-dimensional stratigraphic model for aggrading submarine channels based on laboratory experiments, numerical modeling, and sediment cores

    NASA Astrophysics Data System (ADS)

    Limaye, A. B.; Komatsu, Y.; Suzuki, K.; Paola, C.

    2017-12-01

    Turbidity currents deliver clastic sediment from continental margins to the deep ocean, and are the main driver of landscape and stratigraphic evolution in many low-relief, submarine environments. The sedimentary architecture of turbidites—including the spatial organization of coarse and fine sediments—is closely related to the aggradation, scour, and lateral shifting of channels. Seismic stratigraphy indicates that submarine, meandering channels often aggrade rapidly relative to lateral shifting, and develop channel sand bodies with high vertical connectivity. In comparison, the stratigraphic architecture developed by submarine, braided channels is relatively uncertain. We present a new stratigraphic model for submarine braided channels that integrates predictions from laboratory experiments and flow modeling with constraints from sediment cores. In the laboratory experiments, a saline density current developed subaqueous channels in plastic sediment. The channels aggraded to form a deposit with a vertical scale of approximately five channel depths. We collected topography data during aggradation to (1) establish relative stratigraphic age, and (2) estimate the sorting patterns of a hypothetical grain size distribution. We applied a numerical flow model to each topographic surface and used modeled flow depth as a proxy for relative grain size. We then conditioned the resulting stratigraphic model to observed grain size distributions using sediment core data from the Nankai Trough, offshore Japan. Using this stratigraphic model, we establish new, quantitative predictions for the two- and three-dimensional connectivity of coarse sediment as a function of fine-sediment fraction. Using this case study as an example, we will highlight outstanding challenges in relating the evolution of low-relief landscapes to the stratigraphic record.

  12. Hydraulic and acoustic properties of the active Alpine Fault, New Zealand: Laboratory measurements on DFDP-1 drill core

    NASA Astrophysics Data System (ADS)

    Carpenter, B. M.; Kitajima, H.; Sutherland, R.; Townend, J.; Toy, V. G.; Saffer, D. M.

    2014-03-01

    We report on laboratory measurements of permeability and elastic wavespeed for a suite of samples obtained by drilling across the active Alpine Fault on the South Island of New Zealand, as part of the first phase of the Deep Fault Drilling Project (DFDP-1). We find that clay-rich cataclasite and principal slip zone (PSZ) samples exhibit low permeabilities (⩽10⁻¹⁸ m²), and that the permeability of hanging-wall cataclasites increases (from c. 10⁻¹⁸ m² to 10⁻¹⁵ m²) with distance from the fault. Additionally, the PSZ exhibits a markedly lower P-wave velocity and Young's modulus relative to the wall rocks. Our laboratory data are in good agreement with in situ wireline logging measurements and are consistent with the identification of an alteration zone surrounding the PSZ defined by observations of core samples. The properties of this zone and the low permeability of the PSZ likely govern transient hydrologic processes during earthquake slip, including thermal pressurization and dilatancy strengthening.
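
    For orientation, the sketch below shows the steady-state Darcy calculation commonly used to reduce core-plug flow measurements to a permeability; the numbers are placeholders, and the DFDP-1 measurements may have relied on a different (e.g., transient pulse-decay) method.

      # Minimal sketch of a steady-state Darcy permeability calculation for a
      # cylindrical core plug; placeholder numbers, not the DFDP-1 data.
      import math

      def darcy_permeability(q_m3_s, mu_pa_s, length_m, diameter_m, dp_pa):
          """k = Q * mu * L / (A * dP), returned in m^2."""
          area = math.pi * (diameter_m / 2.0) ** 2
          return q_m3_s * mu_pa_s * length_m / (area * dp_pa)

      # Example: 1e-11 m^3/s of water (mu ~ 1e-3 Pa s) through a 25 mm x 50 mm
      # plug under a 1 MPa differential gives k on the order of 1e-18 m^2.
      k = darcy_permeability(1e-11, 1e-3, 0.05, 0.025, 1e6)
      print(f"k ~ {k:.1e} m^2")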

  13. Laboratory Mid-frequency (Kilohertz) Range Seismic Property Measurements and X-ray CT Imaging of Fractured Sandstone Cores During Supercritical CO2 Injection

    NASA Astrophysics Data System (ADS)

    Nakagawa, S.; Kneafsey, T. J.; Chang, C.; Harper, E.

    2014-12-01

    During geological sequestration of CO2, fractures are expected to play a critical role in controlling the migration of the injected fluid in reservoir rock. To detect the invasion of supercritical (sc-) CO2 and to determine its saturation, velocity and attenuation of seismic waves can be monitored. When both fractures and matrix porosity connected to the fractures are present, wave-induced dynamic poroelastic interactions between these two different types of rock porosity—high-permeability, high-compliance fractures and low-permeability, low-compliance matrix porosity—result in complex velocity and attenuation changes of compressional waves as scCO2 invades the rock. We conducted core-scale laboratory scCO2 injection experiments on small (diameter 1.5 inches, length 3.5-4 inches), medium-porosity/permeability (porosity 15%, matrix permeability 35 md) sandstone cores. During the injection, the compressional and shear (torsion) wave velocities and attenuations of the entire core were determined using our Split Hopkinson Resonant Bar (short-core resonant bar) technique in the frequency range of 1-2 kHz, and the distribution and saturation of the scCO2 determined via X-ray CT imaging using a medical CT scanner. A series of tests were conducted on (1) intact rock cores, (2) a core containing a mated, core-parallel fracture, (3) a core containing a sheared core-parallel fracture, and (4) a core containing a sheared, core-normal fracture. For intact cores and a core containing a mated sheared fracture, injections of scCO2 into an initially water-saturated sample resulted in large and continuous decreases in the compressional velocity as well as temporary increases in the attenuation. For a sheared core-parallel fracture, large attenuation was also observed, but almost no changes in the velocity occurred. In contrast, a sample containing a core-normal fracture exhibited complex behavior of compressional wave attenuation: the attenuation peaked as the leading edge of
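
    As a generic companion to the resonant-bar measurements described here, the sketch below shows the textbook relations for extensional wave velocity from a free-free bar's fundamental resonance and attenuation (1/Q) from the half-power bandwidth. The actual Split Hopkinson Resonant Bar analysis accounts for end masses and boundary conditions and is more involved; the numbers are illustrative only.

      # Minimal sketch of generic resonant-bar relations; not the full
      # Split Hopkinson Resonant Bar analysis used in the study above.

      def extensional_velocity(length_m, f_fundamental_hz):
          """Free-free bar: f1 = V / (2L)  ->  V = 2 * L * f1."""
          return 2.0 * length_m * f_fundamental_hz

      def inverse_q(f_peak_hz, bandwidth_hz):
          """Attenuation 1/Q from the -3 dB (half-power) bandwidth."""
          return bandwidth_hz / f_peak_hz

      # Illustrative numbers: a 0.45 m bar assembly resonating at 1.5 kHz
      # with a 30 Hz half-power bandwidth.
      print(extensional_velocity(0.45, 1500.0))   # m/s
      print(inverse_q(1500.0, 30.0))              # dimensionless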

  14. Core Needle Lung Biopsy Specimens: Adequacy for EGFR and KRAS Mutational Analysis

    PubMed Central

    Zakowski, Maureen F.; Pao, William; Thornton, Raymond H.; Ladanyi, Marc; Kris, Mark G.; Rusch, Valerie W.; Rizvi, Naiyer A.

    2013-01-01

    OBJECTIVE The purpose of this study was to prospectively compare the adequacy of core needle biopsy specimens with the adequacy of specimens from resected tissue, the histologic reference standard, for mutational analysis of malignant tumors of the lung. SUBJECTS AND METHODS The first 18 patients enrolled in a phase 2 study of gefitinib for lung cancer in July 2004 through August 2005 underwent CT- or fluoroscopy-guided lung biopsy before the start of gefitinib therapy. Three weeks after gefitinib therapy, the patients underwent lung tumor resection. The results of EGFR and KRAS mutational analysis of the core needle biopsy specimens were compared with those of EGFR and KRAS mutational analysis of the surgical specimens. RESULTS Two specimens were unsatisfactory for mutational analysis. The mutational assay results of the other 16 specimens were the same as those of analysis of the surgical specimens obtained an average of 31 days after biopsy. CONCLUSION Biopsy with small (18- to 20-gauge) core needles can yield sufficient and reliable samples for mutational analysis. This technique is likely to become an important tool with the increasing use of pharmacotherapy based on the genetics of specific tumors in individual patients. PMID:20028932

  15. Modelling of magnetostriction of transformer magnetic core for vibration analysis

    NASA Astrophysics Data System (ADS)

    Marks, Janis; Vitolina, Sandra

    2017-12-01

    Magnetostriction is a phenomenon occurring in transformer cores in normal operation mode. Yet over time, it can cause delamination of the magnetic core, resulting in a higher level of vibrations measured on the surface of the transformer tank during diagnostic tests. The aim of this paper is to create a model for evaluating elastic deformations in the magnetic core that can be used for power transformers with intensive vibrations, in order to eliminate magnetostriction as their cause. A description of the developed model in Matlab and COMSOL software is provided, including restrictions concerning geometry and material properties, together with the results of research performed on magnetic core anisotropy. As a case study, modelling of magnetostriction for a 5-legged 200 MVA power transformer with a rated voltage of 13.8/137 kV is conducted, based on which a comparative analysis of vibration levels and elastic deformations is performed.
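
    For context, a frequently used first estimate of magnetostrictive strain is the quadratic law strain ≈ λ_s·(B/B_s)², shown in the sketch below. The paper's Matlab/COMSOL model for anisotropic, grain-oriented core steel is considerably more detailed; the material constants here are illustrative only.

      # Minimal sketch of a simple quadratic magnetostriction approximation,
      # not the anisotropic model developed in the paper above.

      LAMBDA_S = 8e-6    # saturation magnetostriction (illustrative)
      B_SAT = 2.0        # saturation flux density, T (illustrative)

      def magnetostrictive_strain(b_peak_t):
          b = min(abs(b_peak_t), B_SAT)
          return LAMBDA_S * (b / B_SAT) ** 2

      # Because the strain is even in B, it oscillates at twice the supply
      # frequency, which is why core vibration is dominated by 100/120 Hz.
      for b in (0.5, 1.0, 1.7):
          print(b, magnetostrictive_strain(b))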

  16. Gap analysis: a method to assess core competency development in the curriculum.

    PubMed

    Fater, Kerry H

    2013-01-01

    To determine the extent to which safety and quality improvement core competency development occurs in an undergraduate nursing program. Rapid change and the increased complexity of health care environments demand that health care professionals be adequately prepared to provide high quality, safe care. A gap analysis compared the present state of competency development to a desirable (ideal) state. The core competencies, Nurse of the Future Nursing Core Competencies, reflect the ideal state and represent minimal expectations for entry into practice from pre-licensure programs. Findings from the gap analysis suggest significant strengths in numerous competency domains, deficiencies in two competency domains, and areas of redundancy in the curriculum. Gap analysis provides valuable data to direct curriculum revision. Opportunities for competency development were identified, and strategies were created jointly with the practice partner, thereby enhancing relevant knowledge, attitudes, and skills nurses need for clinical practice currently and in the future.

  17. Pacific Northwest Laboratory Institutional Plan FY 1995-2000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-12-01

    This report serves as a document to describe the role PNL is positioned to take in the Department of Energy's plans for its national centers in the period 1995-2000. It highlights the strengths of the facilities and personnel present at the laboratory, touches on the accomplishments and projects they have contributed to, and the direction being taken to prepare for the demands to be placed on DOE facilities in the near and far term. It consists of sections titled: director's statement; laboratory mission and core competencies; laboratory strategic plan; laboratory initiatives; core business areas; critical success factors.

  18. The Point-of-Care Laboratory in Clinical Microbiology

    PubMed Central

    Michel-Lepage, Audrey; Boyer, Sylvie; Raoult, Didier

    2016-01-01

    SUMMARY Point-of-care (POC) laboratories that deliver rapid diagnoses of infectious diseases were invented to balance the centralization of core laboratories. POC laboratories operate 24 h a day and 7 days a week to provide diagnoses within 2 h, largely based on immunochromatography and real-time PCR tests. In our experience, these tests are conveniently combined into syndrome-based kits that facilitate sampling, including self-sampling and test operations, as POC laboratories can be operated by trained operators who are not necessarily biologists. POC laboratories are a way of easily providing clinical microbiology testing for populations distant from laboratories in developing and developed countries and on ships. Modern Internet connections enable support from core laboratories. The cost-effectiveness of POC laboratories has been established for the rapid diagnosis of tuberculosis and sexually transmitted infections in both developed and developing countries. PMID:27029593

  19. Core psychopathology in anorexia nervosa and bulimia nervosa: A network analysis.

    PubMed

    Forrest, Lauren N; Jones, Payton J; Ortiz, Shelby N; Smith, April R

    2018-04-25

    The cognitive-behavioral theory of eating disorders (EDs) proposes that shape and weight overvaluation are the core ED psychopathology. Core symptoms can be statistically identified using network analysis. Existing ED network studies support that shape and weight overvaluation are the core ED psychopathology, yet no studies have estimated AN core psychopathology and concerns exist about the replicability of network analysis findings. The current study estimated ED symptom networks among people with anorexia nervosa (AN) and bulimia nervosa (BN) and among a combined group of people with AN and BN. Participants were girls and women with AN (n = 604) and BN (n = 477) seeking residential ED treatment. ED symptoms were assessed with the Eating Disorder Examination-Questionnaire (EDE-Q); 27 of the EDE-Q items were included as nodes in symptom networks. Core symptoms were determined by expected influence and strength values. In all networks, desiring weight loss, restraint, shape and weight preoccupation, and shape overvaluation emerged as the most important symptoms. In addition, in the AN and combined networks, fearing weight gain emerged as an important symptom. In the BN network, weight overvaluation emerged as another important symptom. Findings support the cognitive-behavioral premise that shape and weight overvaluation are at the core of AN psychopathology. Our BN and combined network findings provide a high degree of replication of previous findings. Clinically, findings highlight the importance of considering shape and weight overvaluation as a severity specifier and primary treatment target for people with EDs. © 2018 Wiley Periodicals, Inc.
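
    The centrality indices named in this record (strength and expected influence) are simple functions of the network's edge weights. The sketch below computes both from a symmetric edge-weight matrix; the matrix is a toy example, and the study's networks were estimated from EDE-Q items, typically with regularized partial-correlation methods in R packages such as qgraph or bootnet.

      # Minimal sketch of node "strength" and one-step "expected influence"
      # from a symmetric edge-weight matrix. Toy weights, not the study data.
      import numpy as np

      def centralities(weights, labels):
          w = np.asarray(weights, dtype=float)
          np.fill_diagonal(w, 0.0)
          strength = np.abs(w).sum(axis=1)          # sum of absolute edge weights
          expected_influence = w.sum(axis=1)        # signed sum (one-step EI)
          return {lab: (s, ei) for lab, s, ei in zip(labels, strength, expected_influence)}

      toy = [[0.0, 0.4, 0.1],
             [0.4, 0.0, -0.2],
             [0.1, -0.2, 0.0]]
      print(centralities(toy, ["shape_overvaluation", "weight_overvaluation", "restraint"]))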

  20. Clinical laboratory as an economic model for business performance analysis.

    PubMed

    Buljanović, Vikica; Patajac, Hrvoje; Petrovecki, Mladen

    2011-08-15

    To perform SWOT (strengths, weaknesses, opportunities, and threats) analysis of a clinical laboratory as an economic model that may be used to improve business performance of laboratories by removing weaknesses, minimizing threats, and using external opportunities and internal strengths. Impact of possible threats to and weaknesses of the Clinical Laboratory at Našice General County Hospital business performance and use of strengths and opportunities to improve operating profit were simulated using models created on the basis of SWOT analysis results. The operating profit as a measure of profitability of the clinical laboratory was defined as total revenue minus total expenses and presented using a profit and loss account. Changes in the input parameters in the profit and loss account for 2008 were determined using opportunities and potential threats, and economic sensitivity analysis was made by using changes in the key parameters. The profit and loss account and economic sensitivity analysis were tools for quantifying the impact of changes in the revenues and expenses on the business operations of clinical laboratory. Results of simulation models showed that operational profit of €470 723 in 2008 could be reduced to only €21 542 if all possible threats became a reality and current weaknesses remained the same. Also, operational gain could be increased to €535 804 if laboratory strengths and opportunities were utilized. If both the opportunities and threats became a reality, the operational profit would decrease by €384 465. The operational profit of the clinical laboratory could be significantly reduced if all threats became a reality and the current weaknesses remained the same. The operational profit could be increased by utilizing strengths and opportunities as much as possible. This type of modeling may be used to monitor business operations of any clinical laboratory and improve its financial situation by implementing changes in the next fiscal
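
    The arithmetic behind this kind of profit-and-loss sensitivity analysis is simple scenario recomputation of operating profit = total revenue - total expenses. The sketch below illustrates the idea with made-up figures, not the Našice laboratory's accounts.

      # Minimal sketch of scenario-based operating-profit sensitivity analysis.
      # All figures and scenario factors are illustrative placeholders.

      def operating_profit(revenues, expenses):
          return sum(revenues.values()) - sum(expenses.values())

      baseline_rev = {"tests_inpatient": 900_000, "tests_outpatient": 400_000}
      baseline_exp = {"reagents": 500_000, "staff": 300_000, "overhead": 130_000}

      scenarios = {
          "baseline": (1.00, 1.00),
          "all_threats": (0.80, 1.10),       # revenue falls, expenses rise
          "use_strengths": (1.06, 0.98),     # more referrals, cheaper reagent contracts
      }

      for name, (rev_factor, exp_factor) in scenarios.items():
          rev = {k: v * rev_factor for k, v in baseline_rev.items()}
          exp = {k: v * exp_factor for k, v in baseline_exp.items()}
          print(name, operating_profit(rev, exp))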

  1. An Efficient Analysis Methodology for Fluted-Core Composite Structures

    NASA Technical Reports Server (NTRS)

    Oremont, Leonard; Schultz, Marc R.

    2012-01-01

    The primary loading condition in launch-vehicle barrel sections is axial compression, and it is therefore important to understand the compression behavior of any structures, structural concepts, and materials considered in launch-vehicle designs. This understanding will necessarily come from a combination of test and analysis. However, certain potentially beneficial structures and structural concepts do not lend themselves to commonly used simplified analysis methods, and therefore innovative analysis methodologies must be developed if these structures and structural concepts are to be considered. This paper discusses such an analysis technique for the fluted-core sandwich composite structural concept. The presented technique is based on commercially available finite-element codes, and uses shell elements to capture behavior that would normally require solid elements to capture the detailed mechanical response of the structure. The shell thicknesses and offsets using this analysis technique are parameterized, and the parameters are adjusted through a heuristic procedure until this model matches the mechanical behavior of a more detailed shell-and-solid model. Additionally, the detailed shell-and-solid model can be strategically placed in a larger, global shell-only model to capture important local behavior. Comparisons between shell-only models, experiments, and more detailed shell-and-solid models show excellent agreement. The discussed analysis methodology, though only discussed in the context of fluted-core composites, is widely applicable to other concepts.

  2. Patient identification error among prostate needle core biopsy specimens--are we ready for a DNA time-out?

    PubMed

    Suba, Eric J; Pfeifer, John D; Raab, Stephen S

    2007-10-01

    Patient identification errors in surgical pathology often involve switches of prostate or breast needle core biopsy specimens among patients. We assessed strategies for decreasing the occurrence of these uncommon and yet potentially catastrophic events. Root cause analyses were performed following 3 cases of patient identification error involving prostate needle core biopsy specimens. Patient identification errors in surgical pathology result from slips and lapses of automatic human action that may occur at numerous steps during pre-laboratory, laboratory and post-laboratory work flow processes. Patient identification errors among prostate needle biopsies may be difficult to entirely prevent through the optimization of work flow processes. A DNA time-out, whereby DNA polymorphic microsatellite analysis is used to confirm patient identification before radiation therapy or radical surgery, may eliminate patient identification errors among needle biopsies.

  3. Microbial Analysis of Australian Dry Lake Cores; Analogs For Biogeochemical Processes

    NASA Astrophysics Data System (ADS)

    Nguyen, A. V.; Baldridge, A. M.; Thomson, B. J.

    2014-12-01

    Lake Gilmore in Western Australia is an acidic ephemeral lake that serves as an analog for Martian geochemical processes represented by interbedded phyllosilicates and sulfates. These areas preserve remnants of a global-scale change on Mars from neutral to alkaline pH in the late Noachian era to relatively lower pH in the Hesperian era, conditions that persist today. The geochemistry of these areas could possibly be caused by small-scale changes such as microbial metabolism. Two approaches were used to determine the presence of microbes in the Australian dry lake cores: DNA analysis and lipid analysis. Detecting DNA or lipids in the cores will provide evidence of living or deceased organisms, since they provide distinct markers for life. Basic DNA analysis consists of extraction, amplification through PCR, plasmid cloning, and DNA sequencing. Once the sequence of the unknown DNA is known, an online program, BLAST, will be used to identify the microbes for further analysis. The lipid analysis approach consists of phospholipid fatty acid analysis performed by Microbial ID, which will provide direct identification of any microbes from the presence of lipids. Identified microbes are then compared to mineralogy results from x-ray diffraction of the core samples to determine if the types of metabolic reactions are consistent with the variation in composition in these analog deposits. If so, this has intriguing implications for the presence of life in similar Martian deposits.
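
    The BLAST identification step described in this record can be scripted; the sketch below submits a sequence to NCBI BLAST through Biopython and lists the top hits. The query sequence is a placeholder, and whether the study used the web interface or a scripted search is an assumption.

      # Minimal sketch of remote BLAST identification of an unknown sequence.
      # Requires Biopython and network access; the query is a placeholder.
      from Bio.Blast import NCBIWWW, NCBIXML

      query_seq = "AGAGTTTGATCCTGGCTCAG"   # placeholder 16S rRNA fragment, not real data

      def identify(sequence, top_n=5):
          handle = NCBIWWW.qblast("blastn", "nt", sequence)    # remote BLAST search
          record = NCBIXML.read(handle)
          hits = []
          for alignment in record.alignments[:top_n]:
              best_hsp = alignment.hsps[0]
              hits.append((alignment.title, best_hsp.expect))  # description, E-value
          return hits

      # for title, evalue in identify(query_seq):
      #     print(evalue, title)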

  4. 3D Magnetic Field Analysis of a Turbine Generator Stator Core-end Region

    NASA Astrophysics Data System (ADS)

    Wakui, Shinichi; Takahashi, Kazuhiko; Ide, Kazumasa; Takahashi, Miyoshi; Watanabe, Takashi

    In this paper we calculated magnetic flux density and eddy current distributions of a 71 MVA turbine generator stator core-end using three-dimensional numerical magnetic field analysis. The magnetic flux densities and eddy current densities in the stator core-end region under the no-load and three-phase short-circuit conditions obtained by the analysis show good agreement with the measurements. Furthermore, the differences in eddy current and eddy current loss in the stator core-end region for various load conditions are shown numerically. As a result, the facing was found to decrease the eddy current loss of the end plate by about 84%.

  5. Hybrid Analysis of Engine Core Noise

    NASA Astrophysics Data System (ADS)

    O'Brien, Jeffrey; Kim, Jeonglae; Ihme, Matthias

    2015-11-01

    Core noise, or the noise generated within an aircraft engine, is becoming an increasing concern for the aviation industry as other noise sources are progressively reduced. The prediction of core noise generation and propagation is especially challenging for computationalists since it involves extensive multiphysics including chemical reaction and moving blades in addition to the aerothermochemical effects of heated jets. In this work, a representative engine flow path is constructed using experimentally verified geometries to simulate the physics of core noise. A combustor, single-stage turbine, nozzle and jet are modeled in separate calculations using appropriate high fidelity techniques including LES, actuator disk theory and Ffowcs-Williams Hawkings surfaces. A one way coupling procedure is developed for passing fluctuations downstream through the flowpath. This method effectively isolates the core noise from other acoustic sources, enables straightforward study of the interaction between core noise and jet exhaust, and allows for simple distinction between direct and indirect noise. The impact of core noise on the farfield jet acoustics is studied extensively and the relative efficiency of different disturbance types and shapes is examined in detail.

  6. Analysis of chemical concepts as the basic of virtual laboratory development and process science skills in solubility and solubility product subject

    NASA Astrophysics Data System (ADS)

    Syafrina, R.; Rohman, I.; Yuliani, G.

    2018-05-01

    This study aims to analyze the concept characteristics of solubility and solubility product that will serve as the basis for the development of a virtual laboratory and students' science process skills. Characteristics of the analyzed concepts include concept definitions, concept attributes, and types of concepts. The concept analysis method follows Herron. The results of the concept analysis show that there are twelve chemical concepts that serve as prerequisites before studying solubility and solubility product, and five core concepts that students must understand in solubility and solubility product. As many as 58.3% of the concept definitions contained in high school textbooks support students' science process skills; the remaining definitions are presented as material to be memorized. Concept attributes that meet the three levels of chemical representation and can be incorporated into a virtual laboratory amount to 66.6%. By type of concept, 83.3% are concepts based on principles and 16.6% are concepts that describe a process. Meanwhile, the science process skills that can be developed based on the concept analysis are the abilities to observe, calculate, measure, predict, interpret, hypothesize, apply, classify, and infer.

  7. [Development of laboratory sequence analysis software based on WWW and UNIX].

    PubMed

    Huang, Y; Gu, J R

    2001-01-01

    Sequence analysis tools based on WWW and UNIX were developed in our laboratory to meet the needs of molecular genetics research in our laboratory. General principles of computer analysis of DNA and protein sequences were also briefly discussed in this paper.

  8. Analysis of laboratory compaction methods of roller compacted concrete

    NASA Astrophysics Data System (ADS)

    Trtík, Tomáš; Chylík, Roman; Bílý, Petr; Fládr, Josef

    2017-09-01

    Roller-Compacted Concrete (RCC) is an ordinary concrete poured and compacted with machines typically used for laying asphalt road layers. One of the problems connected with this technology is the preparation of representative samples in the laboratory. The aim of this work was to analyse two methods of preparation of RCC laboratory samples, with bulk density as the comparative parameter. The first method used dynamic compaction by pneumatic hammer. The second method of compaction had a static character. The specimens were loaded by a precisely defined force in a laboratory loading machine to create the same conditions as during static rolling (in the Czech Republic, only static rolling is commonly used). Bulk densities obtained by the two compaction methods were compared with core drills extracted from a real RCC structure. The results have shown that the samples produced by pneumatic hammer tend to overestimate the bulk density of the material. For both compaction methods, an immediate bearing index test was performed to verify the quality of compaction. A fundamental difference between static and dynamic compaction was identified. In static compaction, the initial resistance to penetration of the mandrel was higher; after a certain limit was exceeded, the resistance was constant. This means that the samples were well compacted only near the surface. Specimens made by pneumatic hammer actively resisted throughout the test, and the whole volume was uniformly compacted.

  9. Teaching laboratory neuroscience at bowdoin: the laboratory instructor perspective.

    PubMed

    Hauptman, Stephen; Curtis, Nancy

    2009-01-01

    Bowdoin College is a small liberal arts college that offers a comprehensive Neuroscience major. The laboratory experience is an integral part of the major, and many students progress through three stages. A core course offers a survey of concepts and techniques. Four upper-level courses function to give students more intensive laboratory research experience in neurophysiology, molecular neurobiology, social behavior, and learning and memory. Finally, many majors choose to work in the individual research labs of the Neuroscience faculty. We, as laboratory instructors, are vital to the process, and are actively involved in all aspects of the lab-based courses. We provide student instruction in state of the art techniques in neuroscience research. By sharing laboratory teaching responsibilities with course professors, we help to prepare students for careers in laboratory neuroscience and also support and facilitate faculty research programs.

  10. Development of a three-dimensional core dynamics analysis program for commercial boiling water reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bessho, Yasunori; Yokomizo, Osamu; Yoshimoto, Yuichiro

    1997-03-01

    Development and qualification results are described for a three-dimensional, time-domain core dynamics analysis program for commercial boiling water reactors (BWRs). The program allows analysis of the reactor core with a detailed mesh division, which eliminates calculational ambiguity in the nuclear-thermal-hydraulic stability analysis caused by reactor core regional division. During development, emphasis was placed on high calculational speed and large memory size as attained by the latest supercomputer technology. The program consists of six major modules, namely a core neutronics module, a fuel heat conduction/transfer module, a fuel channel thermal-hydraulic module, an upper plenum/separator module, a feedwater/recirculation flow module, and a control system module. Its core neutronics module is based on the modified one-group neutron kinetics equation with the prompt jump approximation and with six delayed neutron precursor groups. The module is used to analyze one fuel bundle of the reactor core with one mesh (region). The fuel heat conduction/transfer module solves the one-dimensional heat conduction equation in the radial direction with ten nodes in the fuel pin. The fuel channel thermal-hydraulic module is based on separated three-equation, two-phase flow equations with the drift flux correlation, and it analyzes one fuel bundle of the reactor core with one channel to evaluate flow redistribution between channels precisely. Thermal margin is evaluated by using the GEXL correlation, for example, in the module.
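
    To illustrate the prompt jump approximation mentioned in this record, the sketch below integrates zero-dimensional point kinetics with six delayed-neutron precursor groups, letting the neutron density follow the precursors instantaneously. The program described above is a spatially detailed modified one-group model; this sketch only shows the kinetics idea, and the parameters are illustrative.

      # Minimal sketch of point kinetics with the prompt jump approximation and
      # six delayed-neutron precursor groups. Illustrative parameters only; not
      # the spatial one-group model used in the program described above.
      import numpy as np

      BETA_I = np.array([2.1e-4, 1.4e-3, 1.3e-3, 2.6e-3, 7.5e-4, 2.7e-4])  # group fractions
      LAMBDA_I = np.array([0.0125, 0.0318, 0.109, 0.317, 1.35, 8.64])      # decay constants, 1/s
      BETA = BETA_I.sum()
      GEN_TIME = 4.0e-5                                                    # neutron generation time, s

      def simulate(reactivity_dollars, t_end=5.0, dt=1e-3):
          rho = reactivity_dollars * BETA          # valid only for rho < beta
          n = 1.0                                  # initial normalized power
          c = BETA_I * n / (GEN_TIME * LAMBDA_I)   # equilibrium precursor densities
          history = []
          for step in range(int(t_end / dt)):
              # prompt jump: neutron density follows the precursors instantaneously
              n = GEN_TIME * np.dot(LAMBDA_I, c) / (BETA - rho)
              c += dt * (BETA_I / GEN_TIME * n - LAMBDA_I * c)
              history.append((step * dt, n))
          return history

      print(simulate(0.1)[-1])   # power after a +10 cent reactivity step (illustrative)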

  11. 28. MODIFIED CHAIN SAW FOR CUTTING ROCK CORES; BRUNTON COMPASS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    28. MODIFIED CHAIN SAW FOR CUTTING ROCK CORES; BRUNTON COMPASS STAND FOR DETERMINING CORE'S FIELD ORIENTATION; INSECTICIDE DISPENSER MODIFIED TO LUBRICATE CORE DRILLING PROCESS. - U.S. Geological Survey, Rock Magnetics Laboratory, 345 Middlefield Road, Menlo Park, San Mateo County, CA

  12. Continuous analysis of phosphate in a Greenland shallow ice core

    NASA Astrophysics Data System (ADS)

    Kjær, Helle Astrid; Svensson, Anders; Bigler, Matthias; Vallelonga, Paul; Kettner, Ernesto; Dahl-Jensen, Dorthe

    2010-05-01

    Phosphate is an important and sometimes limiting nutrient for primary production in the oceans. Because of deforestation and the use of phosphate as a fertilizer, changes in the phosphate cycle have occurred over the last centuries. On longer time scales, sea level changes are thought to have also caused changes in the phosphate cycle. Analyzing phosphate concentrations in ice cores may help to gain important knowledge about those processes. In the present study, we attach a phosphate detection line to an existing continuous flow analysis (CFA) setup for ice core analysis at the University of Copenhagen. The CFA system is optimized for high-resolution measurements of insoluble dust particles, electrolytic melt water conductivity, and the concentrations of ammonium and sodium. For the phosphate analysis we apply a continuous and highly sensitive absorption method that has been successfully applied to determine phosphate concentrations in sea water (Zhang and Chi, 2002). A line of melt water from the CFA melt head (1.01 ml per minute) is combined with a molybdate blue reagent and an ascorbic acid buffer. An incomplete reaction takes place in five meters of heated mixing coils before the absorption measurement at a wavelength of 710 nanometers takes place in a 2 m long liquid waveguide cell (LWCC) with an inner volume of 0.5 ml. The method has a detection limit of around 0.1 ppb, and we are currently investigating a possible interference from molybdate reacting with silicates that are present in low amounts in the ice. Preliminary analysis of early Holocene samples from the NGRIP ice core shows phosphate concentration values of a few ppb. In this study, we will attempt to determine past levels of phosphate in a shallow Northern Greenland firn core with an annual layer thickness of about 20 cm ice equivalent. With a melt speed of 2.5 cm ice per minute, our method should allow the resolution of any seasonal variability in phosphate concentrations.
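
    The absorbance-to-concentration step behind such a detection line follows the Beer-Lambert relation A = ε·l·c. The sketch below shows the conversion; the molar absorptivity is a placeholder rather than the molybdenum-blue value, and in practice a CFA line of this kind is calibrated against phosphate standards.

      # Minimal sketch of the Beer-Lambert conversion A = epsilon * l * c,
      # so c = A / (epsilon * l). Placeholder absorptivity; real systems are
      # calibrated against phosphate standard solutions.

      MOLAR_MASS_PO4 = 94.97      # g/mol

      def phosphate_ppb(absorbance, path_length_cm=200.0, epsilon_l_per_mol_cm=2.0e4):
          molar = absorbance / (epsilon_l_per_mol_cm * path_length_cm)   # mol/L
          return molar * MOLAR_MASS_PO4 * 1e6                            # g/L -> micrograms/L (~ppb)

      print(phosphate_ppb(0.05))   # a small absorbance in a 2 m waveguide cell -> ~1 ppb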

  13. A cost benefit analysis of outsourced laboratory services.

    PubMed

    Bowers, J A

    1995-11-01

    As healthcare moves toward increased capitation, hospital administrators must be aware of all costs associated with patient services. This article describes the cost benefit analysis process used by northern Indiana hospital consumers during 1994-1995 to evaluate a local laboratory service outsource provider, South Bend Medical Foundation (SBMF). In an effort to meet the best interests of the community at large, three competing hospitals, medical leadership, and the local outsource provider joined forces to ensure that cost effective quality services would be provided. Laboratory utilization patterns for common DRGs were also analyzed. The team created a reconfiguration analysis to help develop benchmark figures for consideration in future contract negotiations.

  14. TRAC-BF1 thermal-hydraulic, ANSYS stress analysis for core shroud cracking phenomena

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shoop, U.; Feltus, M.A.; Baratta, A.J.

    1996-12-31

    The U.S. Nuclear Regulatory Commission sent Generic Letter 94-03 informing all licensees about the intergranular stress corrosion cracking (IGSCC) of core shrouds found in both Dresden unit I and Quad Cities unit 1. The letter directed all licensees to perform safety analysis of their boiling water reactor (BWR) units. Two transients of special concern for the core shroud safety analysis include the main steam line break (MSLB) and recirculation line break transient.

  15. An Improved Extraction and Analysis Technique for Determination of Carbon Monoxide Stable Isotopes and Mixing Ratios from Ice Core and Atmospheric Air Samples.

    NASA Astrophysics Data System (ADS)

    Place, P., Jr.; Petrenko, V. V.; Vimont, I.

    2017-12-01

    Carbon Monoxide (CO) is an important atmospheric trace gas that affects the oxidative capacity of the atmosphere and contributes indirectly to anthropogenic radiative forcing. Carbon monoxide stable isotopes can also serve as a tracer for variations in biomass burning, particularly in the preindustrial atmosphere. A good understanding of the past variations in CO mole fractions and isotopic composition can help improve the skill of chemical transport models and constrain biomass burning changes. Ice cores may preserve a record of past atmospheric CO for analysis and interpretation. To this end, a new extraction system has been developed for analysis of stable isotopes (δ13CO and δC18O) of atmospheric carbon monoxide from ice core and atmospheric air samples. This system has been designed to measure relatively small sample sizes (80 cc STP of air) to accommodate the limited availability of ice core samples. Trapped air is extracted from ice core samples via melting in a glass vacuum chamber. This air is expanded into a glass expansion loop and then compressed into the sample loop of a Reducing Gas Detector (Peak Laboratories, Peak Performer 1 RCP) for the CO mole fraction measurement. The remaining sample gas will be expelled from the melt vessel into a larger expansion loop via headspace compression for isotopic analysis. The headspace compression will be accomplished by introduction of clean degassed water into the bottom of the melt vessel. Isotopic analysis of the sample gas is done utilizing the Schütze Reagent to convert the carbon monoxide to carbon dioxide (CO2) which is then measured using continuous-flow isotope ratio mass spectrometry (Elementar Americas, IsoPrime 100). A series of cryogenic traps are used to purify the sample air, capture the converted sample CO2, and cryofocus the sample CO2 prior to injection.
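
    For readers unfamiliar with the notation, the δ values reported by such systems are per-mil deviations of an isotope ratio from a reference standard. The sketch below shows the arithmetic; the reference ratio is an order-of-magnitude placeholder (in practice VPDB is used for ¹³C/¹²C and VSMOW for ¹⁸O/¹⁶O).

      # Minimal sketch of stable-isotope delta notation:
      # delta = (R_sample / R_standard - 1) * 1000, in per mil.
      # The reference ratio below is a placeholder, not the exact VPDB value.

      def delta_permil(r_sample, r_standard):
          return (r_sample / r_standard - 1.0) * 1000.0

      R_STD_13C = 0.0112   # approximate 13C/12C reference ratio (placeholder)
      print(delta_permil(0.01115, R_STD_13C))   # a sample slightly depleted in 13C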

  16. Clinical laboratory as an economic model for business performance analysis

    PubMed Central

    Buljanović, Vikica; Patajac, Hrvoje; Petrovečki, Mladen

    2011-01-01

    Aim To perform SWOT (strengths, weaknesses, opportunities, and threats) analysis of a clinical laboratory as an economic model that may be used to improve business performance of laboratories by removing weaknesses, minimizing threats, and using external opportunities and internal strengths. Methods Impact of possible threats to and weaknesses of the Clinical Laboratory at Našice General County Hospital business performance and use of strengths and opportunities to improve operating profit were simulated using models created on the basis of SWOT analysis results. The operating profit as a measure of profitability of the clinical laboratory was defined as total revenue minus total expenses and presented using a profit and loss account. Changes in the input parameters in the profit and loss account for 2008 were determined using opportunities and potential threats, and economic sensitivity analysis was made by using changes in the key parameters. The profit and loss account and economic sensitivity analysis were tools for quantifying the impact of changes in the revenues and expenses on the business operations of clinical laboratory. Results Results of simulation models showed that operational profit of €470 723 in 2008 could be reduced to only €21 542 if all possible threats became a reality and current weaknesses remained the same. Also, operational gain could be increased to €535 804 if laboratory strengths and opportunities were utilized. If both the opportunities and threats became a reality, the operational profit would decrease by €384 465. Conclusion The operational profit of the clinical laboratory could be significantly reduced if all threats became a reality and the current weaknesses remained the same. The operational profit could be increased by utilizing strengths and opportunities as much as possible. This type of modeling may be used to monitor business operations of any clinical laboratory and improve its financial situation by

  17. Implantation of radiotelemetry transmitters yielding data on ECG, heart rate, core body temperature and activity in free-moving laboratory mice.

    PubMed

    Cesarovic, Nikola; Jirkof, Paulin; Rettich, Andreas; Arras, Margarete

    2011-11-21

    The laboratory mouse is the animal species of choice for most biomedical research, in both the academic sphere and the pharmaceutical industry. Mice are a manageable size and relatively easy to house. These factors, together with the availability of a wealth of spontaneous and experimentally induced mutants, make laboratory mice ideally suited to a wide variety of research areas. In cardiovascular, pharmacological and toxicological research, accurate measurement of parameters relating to the circulatory system of laboratory animals is often required. Determination of heart rate, heart rate variability, and duration of PQ and QT intervals are based on electrocardiogram (ECG) recordings. However, obtaining reliable ECG curves as well as physiological data such as core body temperature in mice can be difficult using conventional measurement techniques, which require connecting sensors and lead wires to a restrained, tethered, or even anaesthetized animal. Data obtained in this fashion must be interpreted with caution, as it is well known that restraining and anesthesia can have a major artifactual influence on physiological parameters. Radiotelemetry enables data to be collected from conscious and untethered animals. Measurements can be conducted even in freely moving animals, and without requiring the investigator to be in the proximity of the animal. Thus, known sources of artifacts are avoided, and accurate and reliable measurements are assured. This methodology also reduces interanimal variability, thus reducing the number of animals used, rendering this technology the most humane method of monitoring physiological parameters in laboratory animals. Constant advancements in data acquisition technology and implant miniaturization mean that it is now possible to record physiological parameters and locomotor activity continuously and in realtime over longer periods such as hours, days or even weeks. Here, we describe a surgical technique for implantation of a

  18. Capability of the Gas Analysis and Testing Laboratory at the NASA Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Broerman, Craig; Jimenez, Javier; Sweterlitsch, Jeff

    2012-01-01

    The Gas Analysis and Testing Laboratory is an integral part of the testing performed at the NASA Johnson Space Center. The Gas Analysis and Testing Laboratory is a high performance laboratory providing real time analytical instruments to support manned and unmanned testing. The lab utilizes precision gas chromatographs, gas analyzers and spectrophotometers to support the technology development programs within the NASA community. The Gas Analysis and Testing Laboratory works with a wide variety of customers and provides engineering support for user-specified applications in compressed gas, chemical analysis, general and research laboratory.

  19. Capability of the Gas Analysis and Testing Laboratory at the NASA Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Broerman, Craig; Jimenez, Javier; Sweterlitsch, Jeff

    2011-01-01

    The Gas Analysis and Testing Laboratory is an integral part of the testing performed at the NASA Johnson Space Center. The Gas Analysis and Testing Laboratory is a high performance laboratory providing real time analytical instruments to support manned and unmanned testing. The lab utilizes precision gas chromatographs, gas analyzers and spectrophotometers to support the technology development programs within the NASA community. The Gas Analysis and Testing Laboratory works with a wide variety of customers and provides engineering support for user-specified applications in compressed gas, chemical analysis, general and research laboratory.

  20. Laboratory Information Management Systems for Forensic Laboratories: A White Paper for Directors and Decision Makers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anthony Hendrickson; Brian Mennecke; Kevin Scheibe

    2005-10-01

    Modern forensics laboratories need Laboratory Information Management Systems (LIMS) implementations that allow the lab to track evidentiary items through their examination lifecycle and also serve all pertinent laboratory personnel. The research presented here reports LIMS core requirements as viewed by respondents serving in different forensic laboratory capacities as well as different forensic laboratory environments. A product-development methodology was employed to evaluate the relative value of the key features that constitute a LIMS, in order to develop a set of relative values for these features and the specifics of their implementation. In addition to the results of the product development analysis, this paper also provides an extensive review of LIMS and provides an overview of the preparation and planning process for the successful upgrade or implementation of a LIMS. Analysis of the data indicates that the relative value of LIMS components is viewed differently depending upon respondents' job roles (i.e., evidence technicians, scientists, and lab management), as well as by laboratory size. Specifically, the data show that: (1) Evidence technicians place the most value on chain of evidence capabilities and on chain of custody tracking; (2) Scientists generally place greatest value on report writing and generation, and on tracking daughter evidence that develops during their analyses; (3) Lab managers place the greatest value on chain of custody, daughter evidence, and, not surprisingly, management reporting capabilities; and (4) Lab size affects LIMS preference in that, while all labs place daughter evidence tracking, chain of custody, and management and analyst report generation as their top three priorities, the order of this prioritization is size dependent.

  1. The effectiveness of inking needle core prostate biopsies for preventing patient specimen identification errors: a technique to address Joint Commission patient safety goals in specialty laboratories.

    PubMed

    Raff, Lester J; Engel, George; Beck, Kenneth R; O'Brien, Andrea S; Bauer, Meagan E

    2009-02-01

    The elimination or reduction of medical errors has been a main focus of health care enterprises in the United States since the year 2000. Elimination of errors in patient and specimen identification is a key component of this focus and is the number one goal in the Joint Commission's 2008 National Patient Safety Goals Laboratory Services Program. To evaluate the effectiveness of using permanent inks to maintain specimen identity in sequentially submitted prostate needle biopsies. For a 12-month period, a grossing technician stained each prostate core with permanent ink developed for inking of pathology specimens. A different color was used for each patient, with all the prostate cores from all vials for a particular patient inked with the same color. Five colors were used sequentially: green, blue, yellow, orange, and black. The ink was diluted with distilled water to a consistency that allowed application of a thin, uniform coating of ink along the edges of the prostate core. The time required to ink patient specimens comprising different numbers of vials and prostate biopsies was recorded. The number and type of inked specimen discrepancies were evaluated. The identified discrepancy rate for prostate biopsy patients was 0.13%. The discrepancy rate in terms of total number of prostate blocks was 0.014%. Diluted inks adhered to biopsy contours throughout tissue processing. The tissue showed no untoward reactions to the inks. Inking did not affect staining (histochemical or immunohistochemical) or pathologic evaluation. On average, inking prostate needle biopsies increases grossing time by 20%. Inking of all prostate core biopsies with colored inks, in sequential order, is an aid in maintaining specimen identity. It is a simple and effective method of addressing Joint Commission patient safety goals by maintaining specimen identity during processing of similar types of gross specimens. This technique may be applicable in other specialty laboratories and high

  2. Nonlinear seismic analysis of a reactor structure impact between core components

    NASA Technical Reports Server (NTRS)

    Hill, R. G.

    1975-01-01

    The seismic analysis of the FFTF-PIOTA (Fast Flux Test Facility-Postirradiation Open Test Assembly), subjected to a horizontal DBE (Design Base Earthquake) is presented. The PIOTA is the first in a set of open test assemblies to be designed for the FFTF. Employing the direct method of transient analysis, the governing differential equations describing the motion of the system are set up directly and are implicitly integrated numerically in time. A simple lumped-mass beam model of the FFTF which includes small clearances between core components is used as a "driver" for a fine mesh model of the PIOTA. The nonlinear forces due to the impact of the core components and their effect on the PIOTA are computed.

  3. MEGA-CC: computing core of molecular evolutionary genetics analysis program for automated and iterative data analysis.

    PubMed

    Kumar, Sudhir; Stecher, Glen; Peterson, Daniel; Tamura, Koichiro

    2012-10-15

    There is a growing need in the research community to apply the molecular evolutionary genetics analysis (MEGA) software tool for batch processing a large number of datasets and to integrate it into analysis workflows. Therefore, we now make available the computing core of the MEGA software as a stand-alone executable (MEGA-CC), along with an analysis prototyper (MEGA-Proto). MEGA-CC provides users with access to all the computational analyses available through MEGA's graphical user interface version. This includes methods for multiple sequence alignment, substitution model selection, evolutionary distance estimation, phylogeny inference, substitution rate and pattern estimation, tests of natural selection and ancestral sequence inference. Additionally, we have upgraded the source code for phylogenetic analysis using the maximum likelihood methods for parallel execution on multiple processors and cores. Here, we describe MEGA-CC and outline the steps for using MEGA-CC in tandem with MEGA-Proto for iterative and automated data analysis. http://www.megasoftware.net/.

  4. Embedding Hands-On Mini Laboratory Experiences in a Core Undergraduate Fluid Mechanics Course: A Pilot Study

    ERIC Educational Resources Information Center

    Han, Duanduan; Ugaz, Victor

    2017-01-01

    Three self-contained mini-labs were integrated into a core undergraduate fluid mechanics course, with the goal of delivering hands-on content in a manner scalable to large class sizes. These mini-labs supported learning objectives involving friction loss in pipes, flow measurement, and centrifugal pump analysis. The hands-on experiments were…

  5. Viral Evolution Core | FNLCR Staging

    Cancer.gov

    Brandon F. Keele, Ph.D. PI/Senior Principal Investigator, Retroviral Evolution Section Head, Viral Evolution Core Leidos Biomedical Research, Inc. Frederick National Laboratory for Cancer Research Frederick, MD 21702-1201 Tel: 301-846-173

  6. Compression After Impact on Honeycomb Core Sandwich Panels with Thin Facesheets, Part 2: Analysis

    NASA Technical Reports Server (NTRS)

    Mcquigg, Thomas D.; Kapania, Rakesh K.; Scotti, Stephen J.; Walker, Sandra P.

    2012-01-01

    A two-part research study has been completed on the topic of compression after impact (CAI) of thin facesheet honeycomb core sandwich panels. The research has focused on both experiments and analysis in an effort to establish and validate a new understanding of the damage tolerance of these materials. Part 2, the subject of the current paper, is focused on the analysis, which corresponds to the CAI testing described in Part 1. Of interest are sandwich panels, with aerospace applications, which consist of very thin, woven S2-fiberglass (with MTM45-1 epoxy) facesheets adhered to a Nomex honeycomb core. Two sets of materials, which were identical with the exception of the density of the honeycomb core, were tested in Part 1. The results highlighted the need for analysis methods which take into account multiple failure modes. A finite element model (FEM) is developed here, in Part 2. A commercial implementation of the Multicontinuum Failure Theory (MCT) for progressive failure analysis (PFA) in composite laminates, Helius:MCT, is included in this model. The inclusion of PFA in the present model provided a new, unique ability to account for multiple failure modes. In addition, significant impact damage detail is included in the model. A sensitivity study, used to assess the effect of each damage parameter on overall analysis results, is included in an appendix. Analysis results are compared to the experimental results for each of the 32 CAI sandwich panel specimens tested to failure. The failure of each specimen is predicted using the high-fidelity, physics-based analysis model developed here, and the results highlight key improvements in the understanding of honeycomb core sandwich panel CAI failure. Finally, a parametric study highlights the strength benefits compared to mass penalty for various core densities.

  7. Analysis and Test Support for Phillips Laboratory Precision Structures

    DTIC Science & Technology

    1998-11-01

    Air Force Research Laboratory (AFRL), Phillips Research Site. Task objectives centered...around analysis and structural dynamic test support on experiments within the Space Vehicles Directorate at Kirtland Air Force Base. These efforts help...support for Phillips Laboratory Precision Structures." Mr. James Goodding of CSA Engineering was the principal investigator for this task. Mr.

  8. SAXS analysis of single- and multi-core iron oxide magnetic nanoparticles

    PubMed Central

    Szczerba, Wojciech; Costo, Rocio; Morales, Maria del Puerto; Thünemann, Andreas F.

    2017-01-01

    This article reports on the characterization of four superparamagnetic iron oxide nanoparticles stabilized with dimercaptosuccinic acid, which are suitable candidates for reference materials for magnetic properties. Particles p1 and p2 are single-core particles, while p3 and p4 are multi-core particles. Small-angle X-ray scattering analysis reveals a lognormal type of size distribution for the iron oxide cores of the particles. Their mean radii are 6.9 nm (p1), 10.6 nm (p2), 5.5 nm (p3) and 4.1 nm (p4), with narrow relative distribution widths of 0.08, 0.13, 0.08 and 0.12. The cores are arranged as a clustered network in the form of dense mass fractals with a fractal dimension of 2.9 in the multi-core particles p3 and p4, but the cores are well separated from each other by a protecting organic shell. The radii of gyration of the mass fractals are 48 and 44 nm, and each network contains 117 and 186 primary particles, respectively. The radius distributions of the primary particles were confirmed with transmission electron microscopy. All particles consist purely of maghemite, as shown by X-ray absorption fine structure spectroscopy. PMID:28381973
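
    For reference, the lognormal size distribution named in the abstract is commonly written as below (one common parameterization, with median radius R_0 and width parameter sigma; the abstract does not state the exact convention used by the authors):

        % Lognormal number distribution of core radii (assumed parameterization)
        \[
          p(R) \;=\; \frac{1}{\sqrt{2\pi}\,\sigma R}\,
                     \exp\!\left[-\,\frac{\ln^{2}(R/R_{0})}{2\sigma^{2}}\right]
        \]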

  9. 7 CFR 57.960 - Small importations for consignee's personal use, display, or laboratory analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., display, or laboratory analysis. 57.960 Section 57.960 Agriculture Regulations of the Department of..., display, or laboratory analysis. Any eggs that are offered for importation, exclusively for the consignee's personal use, display, or laboratory analysis, and not for sale or distribution; which is sound...

  10. Operating environmental laboratories--an overview of analysis equipment procurement and management.

    PubMed

    Pandya, G H; Shinde, V M; Kanade, G S; Kondawar, V K

    2003-10-01

    Management of equipment in an environmental laboratory requires planning involving assessment of the workload on each piece of equipment, establishment of criteria and specifications for the purchase of equipment, creation of infrastructure for installation and testing of the equipment, optimization of analysis conditions, development of preventive maintenance procedures and establishment of in-house repair facilities. The paper reports the results of such an analysis carried out for operating environmental laboratories associated with R&D work, serving as a government laboratory, or attached to an industry for analysing industrial emissions.

  11. Lack of Association between Hepatitis C Virus core Gene Variation 70/91aa and Insulin Resistance.

    PubMed

    Scalioni, Letícia de Paula; da Silva, Allan Peres; Miguel, Juliana Custódio; Espírito Santo, Márcia Paschoal do; Marques, Vanessa Alves; Brandão-Mello, Carlos Eduardo; Villela-Nogueira, Cristiane Alves; Lewis-Ximenez, Lia Laura; Lampe, Elisabeth; Villar, Livia Melo

    2017-07-21

    The role of hepatitis C virus (HCV) in insulin resistance (IR) is not fully understood. The aim of this study was to determine the impact of amino acid (aa) substitutions in the core region of HCV according to IR and to identify clinical and laboratory associations. Ninety-two treatment-naive HCV patients were recruited to determine laboratory data and blood cell count. IR was determined using the Homeostasis Model Assessment (HOMA) index, where IR was defined as HOMA ≥2. HCV RNA load and genotype were determined by Abbott RealTime HCV. The HCV core region was determined by direct nucleotide sequencing. Bivariate analysis was conducted using HOMA-IR ≥2 as a dependent factor. IR prevalence was 43.5% (n = 40), vitamin D sufficiency was found in 76.1% (n = 70) and 72.8% (n = 67) had advanced liver fibrosis. In the bivariate analyses, elevated values of γGT (p = 0.024) and fibrosis staging (p = 0.004) were associated with IR, but IR was not related to core mutations. The presence of glutamine in position 70 was associated with low vitamin D concentration (p = 0.005). In the multivariate analysis, no variable was independently associated with HOMA-IR. In conclusion, the lack of association between IR and HCV core mutations at positions 70 and 91 suggests that genetic variability of this region has little impact on IR.
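
    The HOMA cut-off used above can be made concrete with the standard formula of Matthews et al.; the sketch below is illustrative only and assumes the usual units (fasting insulin in µU/mL, fasting glucose in mmol/L), which the abstract does not state:

        # Minimal sketch of the HOMA >= 2 insulin-resistance rule described in the
        # abstract. The standard HOMA-IR formula is assumed:
        #   HOMA-IR = fasting insulin [uU/mL] * fasting glucose [mmol/L] / 22.5

        def homa_ir(insulin_uU_mL: float, glucose_mmol_L: float) -> float:
            """Homeostasis Model Assessment index of insulin resistance."""
            return insulin_uU_mL * glucose_mmol_L / 22.5

        def is_insulin_resistant(insulin_uU_mL: float, glucose_mmol_L: float) -> bool:
            """Apply the HOMA >= 2 cut-off used in the study."""
            return homa_ir(insulin_uU_mL, glucose_mmol_L) >= 2.0

        # Illustrative values, not taken from the paper:
        print(homa_ir(10.0, 5.5))               # ~2.44
        print(is_insulin_resistant(10.0, 5.5))  # True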

  12. Global Precipitation Measurement - Report 9 Core Coverage Trade Space Analysis

    NASA Technical Reports Server (NTRS)

    Mailhe, Laurie; Schiff, Conrad; Mendelsohn, Chad; Everett, David; Folta, David

    2002-01-01

    This paper summarizes the GPM-Core coverage trade space analysis. The goal of this analysis was to determine the GPM-Core sensitivity to changes in altitude and inclination for the three onboard instruments: the radiometer, the Ku-band radar and the Ka-band radar. This study will enable a better choice of the nominal GPM-Core orbit as well as the optimal size of the maintenance box (+/-1 km, +/-5 km, etc.). For this work, we used two different figures-of-merit: (1) the time required to cover 100% of the +/-65 deg latitude band and (2) the coverage obtained for a given propagation time (7 days and 30 days). The first figure-of-merit is used for the radiometer as it has a sensor cone half-angle 3 to 5 times larger than that of the radars. Thus, we anticipate that for this instrument the period of the orbit (i.e. altitude) will be the main driver and that the 100% coverage value will be reached within less than a week. The second figure-of-merit is used for the radar instruments as they have small sensor cone half-angles and will, in some cases, never reach the 100% coverage threshold point.

  13. Evaluation of potential severe accidents during low power and shutdown operations at Surry, Unit 1. Volume 5: Analysis of core damage frequency from seismic events during mid-loop operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Budnitz, R.J.; Davis, P.R.; Ravindra, M.K.

    1994-08-01

    In 1989 the US Nuclear Regulatory Commission (NRC) initiated an extensive program to examine carefully the potential risks during low-power and shutdown operations. The program included two parallel projects, one at Brookhaven National Laboratory studying a pressurized water reactor (Surry Unit 1) and the other at Sandia National Laboratories studying a boiling water reactor (Grand Gulf). Both the Brookhaven and Sandia projects have examined only accidents initiated by internal plant faults--so-called "internal initiators." This project, which has explored the likelihood of seismic-initiated core damage accidents during refueling shutdown conditions, is complementary to the internal-initiator analyses at Brookhaven and Sandia. This report covers the seismic analysis at Surry Unit 1. All of the many systems modeling assumptions, component non-seismic failure rates, and human error rates that were used in the internal-initiator study at Surry have been adopted here, so that the results of the two studies can be as comparable as possible. Both the Brookhaven study and this study examine only two shutdown plant operating states (POSs) during refueling outages at Surry, called POS 6 and POS 10, which represent mid-loop operation before and after refueling, respectively. This analysis has been limited to work analogous to a level-1 seismic PRA, in which estimates have been developed for the core-damage frequency from seismic events during POSs 6 and 10. The results of the analysis are that the core-damage frequency of earthquake-initiated accidents during refueling outages in POS 6 and POS 10 is found to be low in absolute terms, less than 10⁻⁶/year.

  14. Non-destructive Analysis of Oil-Contaminated Soil Core Samples by X-ray Computed Tomography and Low-Field Nuclear Magnetic Resonance Relaxometry: a Case Study

    PubMed Central

    Mitsuhata, Yuji; Nishiwaki, Junko; Kawabe, Yoshishige; Utsuzawa, Shin; Jinguuji, Motoharu

    2010-01-01

    Non-destructive measurements of contaminated soil core samples are desirable prior to destructive measurements because they allow obtaining gross information from the core samples without touching harmful chemical species. Medical X-ray computed tomography (CT) and time-domain low-field nuclear magnetic resonance (NMR) relaxometry were applied to non-destructive measurements of sandy soil core samples from a real site contaminated with heavy oil. The medical CT visualized the spatial distribution of the bulk density averaged over the voxel of 0.31 × 0.31 × 2 mm3. The obtained CT images clearly showed an increase in the bulk density with increasing depth. Coupled analysis with in situ time-domain reflectometry logging suggests that this increase is derived from an increase in the water volume fraction of soils with depth (i.e., unsaturated to saturated transition). This was confirmed by supplementary analysis using high-resolution micro-focus X-ray CT at a resolution of ∼10 μm, which directly imaged the increase in pore water with depth. NMR transverse relaxation waveforms of protons were acquired non-destructively at 2.7 MHz by the Carr–Purcell–Meiboom–Gill (CPMG) pulse sequence. The nature of viscous petroleum molecules having short transverse relaxation times (T2) compared to water molecules enabled us to distinguish the water-saturated portion from the oil-contaminated portion in the core sample using an M0–T2 plot, where M0 is the initial amplitude of the CPMG signal. The present study demonstrates that non-destructive core measurements by medical X-ray CT and low-field NMR provide information on the groundwater saturation level and oil-contaminated intervals, which is useful for constructing an adequate plan for subsequent destructive laboratory measurements of cores. PMID:21258437
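
    For readers unfamiliar with the M0-T2 quantities used above, the sketch below shows how an (M0, T2) pair can be extracted from a CPMG echo train by fitting a monoexponential decay, M(t) = M0 exp(-t/T2). The data are synthetic and single-exponential; the study itself works with full relaxation waveforms:

        # Fit M(t) = M0 * exp(-t / T2) to a synthetic CPMG echo train to obtain
        # the (M0, T2) pair plotted in an M0-T2 diagram. Echo spacing, T2 and
        # noise level are invented for illustration.
        import numpy as np
        from scipy.optimize import curve_fit

        def cpmg_decay(t, m0, t2):
            return m0 * np.exp(-t / t2)

        t = np.arange(1, 501) * 1e-3                  # echo times, s (1 ms spacing)
        rng = np.random.default_rng(0)
        signal = cpmg_decay(t, 1.0, 0.080) + rng.normal(0.0, 0.01, t.size)

        (m0_fit, t2_fit), _ = curve_fit(cpmg_decay, t, signal, p0=(signal[0], 0.05))
        print(f"M0 = {m0_fit:.3f}, T2 = {t2_fit * 1e3:.1f} ms")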

  15. Core-Shell Columns in High-Performance Liquid Chromatography: Food Analysis Applications

    PubMed Central

    Preti, Raffaella

    2016-01-01

    The increased separation efficiency provided by the new technology of columns packed with core-shell particles in high-performance liquid chromatography (HPLC) has resulted in their widespread diffusion in several analytical fields: pharmaceutical, biological, environmental, and toxicological. The present paper reviews their most recent applications in food analysis. Their use has proved to be particularly advantageous for the determination of compounds at trace levels or when a large number of samples must be analyzed quickly using reliable and solvent-saving apparatus. The literature described here shows how the outstanding performance provided by core-shell particle columns on traditional HPLC instruments is comparable to that obtained with costly UHPLC instrumentation, making this novel column a promising key tool in food analysis. PMID:27143972

  16. Diffraction data of core-shell nanoparticles from an X-ray free electron laser

    DOE PAGES

    Li, Xuanxuan; Chiu, Chun-Ya; Wang, Hsiang-Ju; ...

    2017-04-11

    X-ray free-electron lasers provide novel opportunities to conduct single particle analysis on nanoscale particles. Coherent diffractive imaging experiments were performed at the Linac Coherent Light Source (LCLS), SLAC National Laboratory, exposing single inorganic core-shell nanoparticles to femtosecond hard-X-ray pulses. Each facetted nanoparticle consisted of a crystalline gold core and a differently shaped palladium shell. Scattered intensities were observed up to about 7 nm resolution. Analysis of the scattering patterns revealed the size distribution of the samples, which is consistent with that obtained from direct real-space imaging by electron microscopy. Furthermore, scattering patterns resulting from single particles were selected and compiled into a dataset which can be valuable for algorithm developments in single particle scattering research.

  17. Method and apparatus for recovering unstable cores

    DOEpatents

    McGuire, Patrick L.; Barraclough, Bruce L.

    1983-01-01

    A method and apparatus suitable for stabilizing hydrocarbon cores are given. Such stabilized cores have not previously been obtainable for laboratory study, and such study is believed to be required before the hydrate reserves can become a utilizable resource. The apparatus can be built using commercially available parts and is very simple and safe to operate.

  18. Modal analysis and acoustic transmission through offset-core honeycomb sandwich panels

    NASA Astrophysics Data System (ADS)

    Mathias, Adam Dustin

    The work presented in this thesis is motivated by earlier research that showed that double, offset-core honeycomb sandwich panels increased thermal resistance and, hence, decreased heat transfer through the panels. This result led to the hypothesis that these panels could be used for acoustic insulation. Using commercial finite element modeling software, COMSOL Multiphysics, the acoustical properties, specifically the transmission loss across a variety of offset-core honeycomb sandwich panels, are studied for the case of a plane acoustic wave impacting the panel at normal incidence. The transmission loss results are compared with those of single-core honeycomb panels with the same cell sizes. The fundamental frequencies of the panels are also computed in an attempt to better understand the vibrational modes of these particular sandwich-structured panels. To ensure that the finite element analysis software is adequate for the task at hand, two relevant benchmark problems are solved and compared with theory. Results from these benchmark problems compared well to those obtained from theory. Transmission loss results from the offset-core honeycomb sandwich panels show increased transmission loss, especially for large cell honeycombs, when compared to single-core honeycomb panels.

  19. Novel laboratory methods for determining the fine scale electrical resistivity structure of core

    NASA Astrophysics Data System (ADS)

    Haslam, E. P.; Gunn, D. A.; Jackson, P. D.; Lovell, M. A.; Aydin, A.; Prance, R. J.; Watson, P.

    2014-12-01

    High-resolution electrical resistivity measurements are made on saturated rocks using novel laboratory instrumentation and multiple voltage measurements that follow, in principle, a four-point electrode scheme but use a single, moving electrode. Flat, rectangular core samples are scanned by varying the electrode position over a range of hundreds of millimetres with an accuracy of a tenth of a millimetre. Two approaches are tested, involving a contact electrode and a non-contact electrode arrangement. The first, galvanic method uses balanced cycle switching of a floating direct current (DC) source to minimise charge polarisation effects masking the resistivity distribution related to fine scale structure. These contacting electrode measurements are made with high common mode noise rejection via differential amplification with respect to a reference point within the current flow path. A computer based multifunction data acquisition system logs the current through the sample and voltages along equipotentials from which the resistivity measurements are derived. Multiple measurements are combined to create images of the surface resistivity structure, with variable spatial resolution controlled by the electrode spacing. Fine scale sedimentary features and open fractures in saturated rocks are interpreted from the measurements with reference to established relationships between electrical resistivity and porosity. Our results successfully characterise grainfall lamination and sandflow cross-stratification in a brine-saturated, dune-bedded core sample representative of a southern North Sea reservoir sandstone, studied using the system in constant-current, variable-voltage mode. In contrast, in a low porosity marble, identification of open fracture porosity against a background of very low matrix porosity is achieved using the constant-voltage, variable-current mode. This new system is limited by the diameter of the electrode that for practical reasons can only be

  20. Integration of Video-Based Demonstrations to Prepare Students for the Organic Chemistry Laboratory

    NASA Astrophysics Data System (ADS)

    Nadelson, Louis S.; Scaggs, Jonathan; Sheffield, Colin; McDougal, Owen M.

    2015-08-01

    Consistent, high-quality introductions to organic chemistry laboratory techniques effectively and efficiently support student learning in the organic chemistry laboratory. In this work, we developed and deployed a series of instructional videos to communicate core laboratory techniques and concepts. Using a quasi-experimental design, we tested the videos in five traditional laboratory experiments by integrating them with the standard pre-laboratory student preparation presentations and instructor demonstrations. We assessed the influence of the videos on student laboratory knowledge and performance, using sections of students who did not view the videos as the control. Our analysis of pre-quizzes revealed the control group had equivalent scores to the treatment group, while the post-quiz results show consistently greater learning gains for the treatment group. Additionally, the students who watched the videos as part of their pre-laboratory instruction completed their experiments in less time.

  1. Improving communication skill training in patient centered medical practice for enhancing rational use of laboratory tests: The core of bioinformation for leveraging stakeholder engagement in regulatory science.

    PubMed

    Moura, Josemar de Almeida; Costa, Bruna Carvalho; de Faria, Rosa Malena Delbone; Soares, Taciana Figueiredo; Moura, Eliane Perlatto; Chiappelli, Francesco

    2013-01-01

    Requests for laboratory tests are among the most relevant additional tools used by physicians as part of patients' health problem-solving. However, the overestimation of complementary investigation may be linked to less reflective medical practice as a consequence of poor physician-patient communication, and may impair patient-centered care. This scenario is likely to result from reduced consultation time and a clinical model focused on the disease. We propose a new medical intervention program that specifically targets improving the patient-centered communication of laboratory test results, the core of bioinformation in health care. The expectation is that training medical students in communication skills will significantly improve the physician-patient relationship, reduce inappropriate use of laboratory tests, and raise stakeholder engagement.

  2. Laboratory measurements of electrical resistivity versus water content on small soil cores

    NASA Astrophysics Data System (ADS)

    Robain, H.; Camerlynck, C.; Bellier, G.; Tabbagh, A.

    2003-04-01

    The assessment of soil water content variations increasingly relies on geophysical methods that are non-invasive and allow dense spatial sampling. Among the different methods, DC electrical imaging is gaining ground. DC electrical resistivity indeed shows strong seasonal variations that depend principally on soil water content. Nevertheless, the widely used empirical Archie's law [1], which links resistivity to void saturation and water conductivity, is not well suited to soil materials with high clay content. Furthermore, the shrinking and swelling properties of soil materials have to be considered. Hence, it is relevant to develop new laboratory experiments in order to establish a relation between electrical resistivity and water content that takes into account the rheological and granulometric specificities of soil materials. The experimental device developed in the IRD laboratory monitors simultaneously (i) the water content, (ii) the electrical resistivity and (iii) the volume of a small cylindrical soil core (100 cm3) placed in a temperature-controlled incubator (30°C). It provides both the shrinkage curve of the soil core (void volume versus water content) and the electrical resistivity versus water content curve. Modelling of the shrinkage curve gives, for each moisture state, the water contained respectively in macro- and micro-voids [2], and then allows a generalized Archie-type law to be proposed, as follows: 1/Rs = 1/(Fma.Rma) + 1/(Fmi.Rmi), with Fi = Ai/(Vi^Mi.Si^Ni), where Rs is the soil resistivity; Fma and Fmi are the so-called "formation factors" for macro- and micro-voids, respectively; Rma and Rmi are the resistivities of the water contained in macro- and micro-voids, respectively; Vi is the volume of macro- or micro-voids; Si is the saturation of macro- or micro-voids; and Ai, Mi and Ni are adjustment coefficients. The variations of Rmi are calculated assuming that Rma is constant. Indeed, the rise of ionic
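
    The parallel-conductance relation quoted in the abstract is easier to read when typeset. A LaTeX rendering with the same symbols (the interpretation that the dot denotes multiplication is assumed):

        % Generalized Archie-type law quoted in the abstract above
        \[
          \frac{1}{R_s} = \frac{1}{F_{ma} R_{ma}} + \frac{1}{F_{mi} R_{mi}},
          \qquad
          F_i = \frac{A_i}{V_i^{M_i} S_i^{N_i}}, \quad i \in \{ma,\, mi\},
        \]
        % where R_s is the soil resistivity, F_ma and F_mi the formation factors
        % for macro- and micro-voids, R_ma and R_mi the resistivities of the water
        % held in macro- and micro-voids, V_i the void volumes, S_i the saturations,
        % and A_i, M_i, N_i adjustment coefficients.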

  3. Neutronics Analyses of the Minimum Original HEU TREAT Core

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kontogeorgakos, D.; Connaway, H.; Yesilyurt, G.

    2014-04-01

    This work was performed to support the feasibility study on the potential conversion of the Transient Reactor Test Facility (TREAT) at Idaho National Laboratory from the use of high-enriched uranium (HEU) fuel to the use of low-enriched uranium (LEU) fuel. The analyses were performed by the GTRI Reactor Conversion staff at the Argonne National Laboratory (ANL). The objective of this study was to validate the MCNP model of the TREAT reactor with the well-documented measurements which were taken during the start-up and early operation of TREAT. Furthermore, the effect of carbon graphitization was also addressed. The graphitization level was assumed to be 100% (ANL/GTRI/TM-13/4). For this purpose, a set of experiments was chosen to validate the TREAT MCNP model, involving the approach to criticality procedure, in-core neutron flux measurements with foils, and isothermal temperature coefficient and temperature distribution measurements. The results of this study extended the knowledge base for the TREAT MCNP calculations and established the credibility of the MCNP model to be used in the core conversion feasibility analysis.

  4. Laboratory directed research and development program, FY 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1997-02-01

    The Ernest Orlando Lawrence Berkeley National Laboratory (Berkeley Lab) Laboratory Directed Research and Development Program FY 1996 report is compiled from annual reports submitted by principal investigators following the close of the fiscal year. This report describes the projects supported and summarizes their accomplishments. It constitutes a part of the Laboratory Directed Research and Development (LDRD) program planning and documentation process that includes an annual planning cycle, project selection, implementation, and review. The Berkeley Lab LDRD program is a critical tool for directing the Laboratory's forefront scientific research capabilities toward vital, excellent, and emerging scientific challenges. The program provides the resources for Berkeley Lab scientists to make rapid and significant contributions to critical national science and technology problems. The LDRD program also advances the Laboratory's core competencies, foundations, and scientific capability, and permits exploration of exciting new opportunities. Areas eligible for support include: (1) Work in forefront areas of science and technology that enrich Laboratory research and development capability; (2) Advanced study of new hypotheses, new experiments, and innovative approaches to develop new concepts or knowledge; (3) Experiments directed toward proof of principle for initial hypothesis testing or verification; and (4) Conception and preliminary technical analysis to explore possible instrumentation, experimental facilities, or new devices.

  5. Qualitative Analysis of Common Definitions for Core Advanced Pharmacy Practice Experiences

    PubMed Central

    Danielson, Jennifer; Weber, Stanley S.

    2014-01-01

    Objective. To determine how colleges and schools of pharmacy interpreted the Accreditation Council for Pharmacy Education’s (ACPE’s) Standards 2007 definitions for core advanced pharmacy practice experiences (APPEs), and how they differentiated community and institutional practice activities for introductory pharmacy practice experiences (IPPEs) and APPEs. Methods. A cross-sectional, qualitative, thematic analysis was done of survey data obtained from experiential education directors in US colleges and schools of pharmacy. Open-ended responses to invited descriptions of the 4 core APPEs were analyzed using grounded theory to determine common themes. Type of college or school of pharmacy (private vs public) and size of program were compared. Results. Seventy-one schools (72%) with active APPE programs at the time of the survey responded. Lack of strong frequent themes describing specific activities for the acute care/general medicine core APPE indicated that most respondents agreed on the setting (hospital or inpatient) but the student experience remained highly variable. Themes were relatively consistent between public and private institutions, but there were differences across programs of varying size. Conclusion. Inconsistencies existed in how colleges and schools of pharmacy defined the core APPEs as required by ACPE. More specific descriptions of core APPEs would help to standardize the core practice experiences across institutions and provide an opportunity for quality benchmarking. PMID:24954931

  6. Frictional and hydrologic behavior of the San Andreas Fault: Insights from laboratory experiments on SAFOD cuttings and core

    NASA Astrophysics Data System (ADS)

    Carpenter, B. M.; Marone, C.; Saffer, D. M.

    2010-12-01

    The debate concerning the apparent low strength of tectonic faults, including the San Andreas Fault (SAF), continues to focus on: 1) low intrinsic friction resulting from mineralogy and/or fabric, and 2) decreased effective normal stress due to elevated pore pressure. Here we inform this debate with laboratory measurements of the frictional behavior and permeability of cuttings and core returned from the SAF at a vertical depth of 2.7 km. We conducted experiments on cuttings and core recovered during SAFOD Phase III drilling. All samples in this study are adjacent to and within the active fault zone penetrated at 10814.5 ft (3296 m) measured depth in the SAFOD borehole. We sheared gouge samples composed of drilling cuttings in a double-direct shear configuration subject to true-triaxial loading under constant effective normal stress, confining pressure, and pore pressure. Intact wafers of material were sheared in a single-direct shear configuration under similar conditions of effective stress, confining pressure, and pore pressure. We also report on permeability measurements on intact wafers of wall rock and fault gouge prior to shearing. Initial results from experiments on cuttings show: 1) a weak fault (µ = ~0.21) compared to the surrounding wall rock (µ = ~0.35), 2) velocity-strengthening behavior (a-b > 0), consistent with aseismic slip, and 3) near-zero healing rates in material from the active fault. XRD analysis on cuttings indicates that the main mineralogical difference between fault rock and wall rock is the presence of significant amounts of smectite within the fault rock. Taken together, the measured frictional behavior and clay mineral content suggest that the clay composition exerts a basic control on fault behavior. Our results document the first direct evidence of weak material from an active fault at seismogenic depths. In addition, our results could explain why the SAF in central California fails aseismically and hosts only small earthquakes.

  7. Core Physics and Kinetics Calculations for the Fissioning Plasma Core Reactor

    NASA Technical Reports Server (NTRS)

    Butler, C.; Albright, D.

    2007-01-01

    Highly efficient, compact nuclear reactors would provide high specific impulse spacecraft propulsion. This analysis and numerical simulation effort has focused on the technical feasibility issues related to the nuclear design characteristics of a novel reactor design. The Fissioning Plasma Core Reactor (FPCR) is a shockwave-driven gaseous-core nuclear reactor, which uses magnetohydrodynamic effects to generate electric power to be used for propulsion. The nuclear design of the system depends on two major calculations: core physics calculations and kinetics calculations. Presently, core physics calculations have concentrated on the use of the MCNP4C code. However, initial results from other codes such as COMBINE/VENTURE and SCALE4a are also shown. Several significant modifications were made to the ISR-developed QCALC1 kinetics analysis code. These modifications include testing the state of the core materials, an improvement to the calculation of the material properties of the core, the addition of an adiabatic core temperature model and improvement of the first order reactivity correction model. The accuracy of these modifications has been verified, and the accuracy of the point-core kinetics model used by the QCALC1 code has also been validated. Previously calculated kinetics results for the FPCR were described in the ISR report, "QCALC1: A Code for FPCR Kinetics Model Feasibility Analysis," dated June 1, 2002.
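
    For context on the "point-core kinetics model" mentioned above, the textbook point reactor kinetics equations are reproduced below. This is the conventional form that the term usually denotes, not a statement of the exact model implemented in QCALC1:

        % Standard point reactor kinetics equations with six delayed-neutron
        % precursor groups (textbook form, given here for reference only).
        \begin{align*}
          \frac{dn(t)}{dt}   &= \frac{\rho(t) - \beta}{\Lambda}\, n(t)
                                + \sum_{i=1}^{6} \lambda_i C_i(t), \\
          \frac{dC_i(t)}{dt} &= \frac{\beta_i}{\Lambda}\, n(t) - \lambda_i C_i(t),
          \qquad i = 1, \dots, 6,
        \end{align*}
        % where n is the neutron population (or power), rho the reactivity,
        % beta = sum_i beta_i the total delayed-neutron fraction, Lambda the
        % prompt-neutron generation time, and lambda_i, C_i the decay constant
        % and concentration of precursor group i.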

  8. MIMI: multimodality, multiresource, information integration environment for biomedical core facilities.

    PubMed

    Szymanski, Jacek; Wilson, David L; Zhang, Guo-Qiang

    2009-10-01

    The rapid expansion of biomedical research has brought substantial scientific and administrative data management challenges to modern core facilities. Scientifically, a core facility must be able to manage experimental workflow and the corresponding set of large and complex scientific data. It must also disseminate experimental data to relevant researchers in a secure and expedient manner that facilitates collaboration and provides support for data interpretation and analysis. Administratively, a core facility must be able to manage the scheduling of its equipment and to maintain a flexible and effective billing system to track material, resource, and personnel costs and charge for services to sustain its operation. It must also have the ability to regularly monitor the usage and performance of its equipment and to provide summary statistics on resources spent on different categories of research. To address these informatics challenges, we introduce a comprehensive system called MIMI (multimodality, multiresource, information integration environment) that integrates the administrative and scientific support of a core facility into a single web-based environment. We report the design, development, and deployment experience of a baseline MIMI system at an imaging core facility and discuss the general applicability of such a system in other types of core facilities. These initial results suggest that MIMI will be a unique, cost-effective approach to addressing the informatics infrastructure needs of core facilities and similar research laboratories.

  9. Quantifying inter-laboratory variability in stable isotope analysis of ancient skeletal remains.

    PubMed

    Pestle, William J; Crowley, Brooke E; Weirauch, Matthew T

    2014-01-01

    Over the past forty years, stable isotope analysis of bone (and tooth) collagen and hydroxyapatite has become a mainstay of archaeological and paleoanthropological reconstructions of paleodiet and paleoenvironment. Despite this method's frequent use across anthropological subdisciplines (and beyond), the present work represents the first attempt at gauging the effects of inter-laboratory variability engendered by differences in a) sample preparation, and b) analysis (instrumentation, working standards, and data calibration). Replicate analyses of a 14C-dated ancient human bone by twenty-one archaeological and paleoecological stable isotope laboratories revealed significant inter-laboratory isotopic variation for both collagen and carbonate. For bone collagen, we found a sizeable range of 1.8‰ for δ13Ccol and 1.9‰ for δ15Ncol among laboratories, but an interpretatively insignificant average pairwise difference of 0.2‰ and 0.4‰ for δ13Ccol and δ15Ncol respectively. For bone hydroxyapatite the observed range increased to a troublingly large 3.5‰ for δ13Cap and 6.7‰ for δ18Oap, with average pairwise differences of 0.6‰ for δ13Cap and a disquieting 2.0‰ for δ18Oap. In order to assess the effects of preparation versus analysis on isotopic variability among laboratories, a subset of the samples prepared by the participating laboratories were analyzed a second time on the same instrument. Based on this duplicate analysis, it was determined that roughly half of the isotopic variability among laboratories could be attributed to differences in sample preparation, with the other half resulting from differences in analysis (instrumentation, working standards, and data calibration). These findings have serious implications for choices made in the preparation and extraction of target biomolecules, the comparison of results obtained from different laboratories, and the interpretation of small differences in bone collagen and hydroxyapatite isotope values
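
    The "range" and "average pairwise difference" statistics quoted above are straightforward to reproduce. A minimal sketch with invented values (not the study's data):

        # Inter-laboratory range and average pairwise difference for one isotope
        # value; the five delta-13C collagen values below are invented examples.
        from itertools import combinations

        d13C = [-19.2, -19.6, -18.9, -19.4, -19.1]  # per mil, one value per lab

        lab_range = max(d13C) - min(d13C)
        pairwise = [abs(a - b) for a, b in combinations(d13C, 2)]
        avg_pairwise = sum(pairwise) / len(pairwise)

        print(f"range = {lab_range:.1f} per mil")                       # 0.7
        print(f"average pairwise difference = {avg_pairwise:.2f} per mil")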

  10. Quantifying Inter-Laboratory Variability in Stable Isotope Analysis of Ancient Skeletal Remains

    PubMed Central

    Pestle, William J.; Crowley, Brooke E.; Weirauch, Matthew T.

    2014-01-01

    Over the past forty years, stable isotope analysis of bone (and tooth) collagen and hydroxyapatite has become a mainstay of archaeological and paleoanthropological reconstructions of paleodiet and paleoenvironment. Despite this method's frequent use across anthropological subdisciplines (and beyond), the present work represents the first attempt at gauging the effects of inter-laboratory variability engendered by differences in a) sample preparation, and b) analysis (instrumentation, working standards, and data calibration). Replicate analyses of a 14C-dated ancient human bone by twenty-one archaeological and paleoecological stable isotope laboratories revealed significant inter-laboratory isotopic variation for both collagen and carbonate. For bone collagen, we found a sizeable range of 1.8‰ for δ13Ccol and 1.9‰ for δ15Ncol among laboratories, but an interpretatively insignificant average pairwise difference of 0.2‰ and 0.4‰ for δ13Ccol and δ15Ncol respectively. For bone hydroxyapatite the observed range increased to a troublingly large 3.5‰ for δ13Cap and 6.7‰ for δ18Oap, with average pairwise differences of 0.6‰ for δ13Cap and a disquieting 2.0‰ for δ18Oap. In order to assess the effects of preparation versus analysis on isotopic variability among laboratories, a subset of the samples prepared by the participating laboratories were analyzed a second time on the same instrument. Based on this duplicate analysis, it was determined that roughly half of the isotopic variability among laboratories could be attributed to differences in sample preparation, with the other half resulting from differences in analysis (instrumentation, working standards, and data calibration). These findings have serious implications for choices made in the preparation and extraction of target biomolecules, the comparison of results obtained from different laboratories, and the interpretation of small differences in bone collagen and hydroxyapatite isotope values

  11. Digital Core Modelling for Clastic Oil and Gas Reservoir

    NASA Astrophysics Data System (ADS)

    Belozerov, I.; Berezovsky, V.; Gubaydullin, M.; Yur’ev, A.

    2018-05-01

    "Digital core" is a multi-purpose tool for solving a variety of tasks in the field of geological exploration and production of hydrocarbons at various stages, designed to improve the accuracy of geological study of subsurface resources, the efficiency of reproduction and use of mineral resources, as well as applying the results obtained in production practice. The actuality of the development of the "Digital core" software is that even a partial replacement of natural laboratory experiments with mathematical modelling can be used in the operative calculation of reserves in exploratory drilling, as well as in the absence of core material from wells. Or impossibility of its research by existing laboratory methods (weakly cemented, loose, etc. rocks). 3D-reconstruction of the core microstructure can be considered as a cheap and least time-consuming method for obtaining petrophysical information about the main filtration-capacitive properties and fluid motion in reservoir rocks.

  12. Uncertainty quantification and sensitivity analysis with CASL Core Simulator VERA-CS

    DOE PAGES

    Brown, C. S.; Zhang, Hongbin

    2016-05-24

    Uncertainty quantification and sensitivity analysis are important for nuclear reactor safety design and analysis. A 2x2 fuel assembly core design was developed and simulated by the Virtual Environment for Reactor Applications, Core Simulator (VERA-CS) coupled neutronics and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed and a new toolkit was created to perform uncertainty quantification and sensitivity analysis with fourteen uncertain input parameters. The minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the selected figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in sensitivity analysis, and coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.
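
    As a concrete illustration of the correlation-based sensitivity measures named in the abstract, here is a minimal sketch with invented inputs and a stand-in response surface (no actual VERA-CS data or coupling is involved; partial correlations are omitted for brevity):

        # Pearson and Spearman correlation coefficients between sampled uncertain
        # inputs and a figure of merit (here a stand-in for MDNBR). All numbers
        # and the response model are invented for illustration.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        n = 200

        inlet_temp = rng.normal(565.0, 2.0, n)   # hypothetical coolant inlet T, K
        power = rng.normal(1.00, 0.02, n)        # hypothetical power, fraction of nominal

        # Stand-in response: MDNBR decreases with inlet temperature and power
        mdnbr = (2.2 - 0.015 * (inlet_temp - 565.0)
                     - 0.8 * (power - 1.0)
                     + rng.normal(0.0, 0.01, n))

        for name, x in [("inlet_temp", inlet_temp), ("power", power)]:
            r, _ = stats.pearsonr(x, mdnbr)
            rho, _ = stats.spearmanr(x, mdnbr)
            print(f"{name}: Pearson r = {r:+.2f}, Spearman rho = {rho:+.2f}")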

  13. An approach to model reactor core nodalization for deterministic safety analysis

    NASA Astrophysics Data System (ADS)

    Salim, Mohd Faiz; Samsudin, Mohd Rafie; Mamat @ Ibrahim, Mohd Rizal; Roslan, Ridha; Sadri, Abd Aziz; Farid, Mohd Fairus Abd

    2016-01-01

    Adopting a good nodalization strategy is essential to produce an accurate and high quality input model for Deterministic Safety Analysis (DSA) using a System Thermal-Hydraulic (SYS-TH) computer code. The purpose of such analysis is to demonstrate compliance against regulatory requirements and to verify the behavior of the reactor during normal and accident conditions as it was originally designed. Numerous studies in the past have been devoted to the development of nodalization strategies for small research reactors (e.g. 250 kW) up to bigger research reactors (e.g. 30 MW). As such, this paper aims to discuss the state-of-the-art thermal hydraulics channel to be employed in the nodalization for the RTP-TRIGA Research Reactor, specifically for the reactor core. At present, the required thermal-hydraulic parameters for the reactor core, such as core geometrical data (length, coolant flow area, hydraulic diameters, and axial power profile) and material properties (including the UZrH1.6, stainless steel clad, and graphite reflector), have been collected, analyzed and consolidated in the Reference Database of RTP using a standardized methodology, mainly derived from the available technical documentation. Based on the available information in the database, the assumptions made on the nodalization approach and the calculations performed will be discussed and presented. The development and identification of the thermal hydraulics channel for the reactor core will be implemented during the SYS-TH calculation using the RELAP5-3D® computer code. The activity presented in this paper is part of the development of the overall nodalization description for the RTP-TRIGA Research Reactor under the IAEA Norwegian Extra-Budgetary Programme (NOKEBP) mentoring project on Expertise Development through the Analysis of Reactor Thermal-Hydraulics for Malaysia, denoted as EARTH-M.

  14. An approach to model reactor core nodalization for deterministic safety analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salim, Mohd Faiz, E-mail: mohdfaizs@tnb.com.my; Samsudin, Mohd Rafie, E-mail: rafies@tnb.com.my; Mamat Ibrahim, Mohd Rizal, E-mail: m-rizal@nuclearmalaysia.gov.my

    Adopting a good nodalization strategy is essential to produce an accurate and high quality input model for Deterministic Safety Analysis (DSA) using a System Thermal-Hydraulic (SYS-TH) computer code. The purpose of such analysis is to demonstrate compliance against regulatory requirements and to verify the behavior of the reactor during normal and accident conditions as it was originally designed. Numerous studies in the past have been devoted to the development of nodalization strategies for small research reactors (e.g. 250 kW) up to bigger research reactors (e.g. 30 MW). As such, this paper aims to discuss the state-of-the-art thermal hydraulics channel to be employed in the nodalization for the RTP-TRIGA Research Reactor, specifically for the reactor core. At present, the required thermal-hydraulic parameters for the reactor core, such as core geometrical data (length, coolant flow area, hydraulic diameters, and axial power profile) and material properties (including the UZrH1.6, stainless steel clad, and graphite reflector), have been collected, analyzed and consolidated in the Reference Database of RTP using a standardized methodology, mainly derived from the available technical documentations. Based on the available information in the database, the assumptions made on the nodalization approach and the calculations performed will be discussed and presented. The development and identification of the thermal hydraulics channel for the reactor core will be implemented during the SYS-TH calculation using the RELAP5-3D® computer code. The activity presented in this paper is part of the development of the overall nodalization description for the RTP-TRIGA Research Reactor under the IAEA Norwegian Extra-Budgetary Programme (NOKEBP) mentoring project on Expertise Development through the Analysis of Reactor Thermal-Hydraulics for Malaysia, denoted as EARTH-M.

  15. Real-time oil-saturation monitoring in rock cores with low-field NMR.

    PubMed

    Mitchell, J; Howe, A M; Clarke, A

    2015-07-01

    Nuclear magnetic resonance (NMR) provides a powerful suite of tools for studying oil in reservoir core plugs at the laboratory scale. Low-field magnets are preferred for well-log calibration and to minimize magnetic-susceptibility-induced internal gradients in the porous medium. We demonstrate that careful data processing, combined with prior knowledge of the sample properties, enables real-time acquisition and interpretation of saturation state (relative amount of oil and water in the pores of a rock). Robust discrimination of oil and brine is achieved with diffusion weighting. We use this real-time analysis to monitor the forced displacement of oil from porous materials (sintered glass beads and sandstones) and to generate capillary desaturation curves. The real-time output enables in situ modification of the flood protocol and accurate control of the saturation state prior to the acquisition of standard NMR core analysis data, such as diffusion-relaxation correlations. Although applications to oil recovery and core analysis are demonstrated, the implementation highlights the general practicality of low-field NMR as an inline sensor for real-time industrial process control. Copyright © 2015 Elsevier Inc. All rights reserved.

  16. Utilization of Multimedia Laboratory: An Acceptance Analysis using TAM

    NASA Astrophysics Data System (ADS)

    Modeong, M.; Palilingan, V. R.

    2018-02-01

    Multimedia is often utilized by teachers to present learning materials. Learning delivered by multimedia enables people to understand, in general, up to 60% of the information presented. To apply creative learning in the classroom, multimedia presentation requires a laboratory that provides for multimedia needs. This study aims to reveal the level of student acceptance of multimedia laboratories by explaining the direct and indirect effects of internal support and technology infrastructure. The Technology Acceptance Model (TAM) is used as the basis of measurement in this research; through the perceptions of usefulness, ease of use, and intention, it is recognized as capable of predicting user acceptance of technology. This study used a quantitative method. Data analysis employed path analysis with a focus on model trimming, which improves the path-analysis structure by removing exogenous variables that have insignificant path coefficients. The results state that Internal Support and Technology Infrastructure are well mediated by the TAM variables to measure the level of technology acceptance. The implications suggest that TAM can measure the success of multimedia laboratory utilization in the Faculty of Engineering, UNIMA.

  17. Estimating the spatial distribution of soil organic matter density and geochemical properties in a polygonal shaped Arctic Tundra using core sample analysis and X-ray computed tomography

    NASA Astrophysics Data System (ADS)

    Soom, F.; Ulrich, C.; Dafflon, B.; Wu, Y.; Kneafsey, T. J.; López, R. D.; Peterson, J.; Hubbard, S. S.

    2016-12-01

    The Arctic tundra, with its permafrost-dominated soils, is one of the regions most affected by global climate change and, in turn, can also influence the changing climate through biogeochemical processes, including greenhouse gas release or storage. Characterization of shallow permafrost distribution and characteristics is required for predicting ecosystem feedbacks to a changing climate over decadal to century timescales, because they can drive active layer deepening and land surface deformation, which in turn can significantly affect hydrological and biogeochemical responses, including greenhouse gas dynamics. In this study, part of the Next-Generation Ecosystem Experiment (NGEE-Arctic), we use X-ray computed tomography (CT) to estimate the wet bulk density of cores extracted from a field site near Barrow, AK, which extend 2-3 m through the active layer into the permafrost. We use multi-dimensional relationships inferred from destructive core sample analysis to infer organic matter density, dry bulk density and ice content, along with some geochemical properties, from nondestructive CT scans along the entire length of the cores, which was not possible with the spatially limited destructive laboratory analysis. Multi-parameter cross-correlations showed good agreement between soil properties estimated from CT scans and properties obtained through destructive sampling. Soil properties estimated from cores located in different types of polygons provide valuable information about the vertical distribution of soil and permafrost properties as a function of geomorphology.

  18. Dynamic analysis of gas-core reactor system

    NASA Technical Reports Server (NTRS)

    Turner, K. H., Jr.

    1973-01-01

    A heat transfer analysis was incorporated into a previously developed model CODYN to obtain a model of open-cycle gaseous core reactor dynamics which can predict the heat flux at the cavity wall. The resulting model was used to study the sensitivity of the model to the value of the reactivity coefficients and to determine the system response for twenty specified perturbations. In addition, the model was used to study the effectiveness of several control systems in controlling the reactor. It was concluded that control drums located in the moderator region capable of inserting reactivity quickly provided the best control.

  19. Writing to Learn by Learning to Write during the School Science Laboratory: Helping Middle and High School Students Develop Argumentative Writing Skills as They Learn Core Ideas

    ERIC Educational Resources Information Center

    Sampson, Victor; Enderle, Patrick; Grooms, Jonathon; Witte, Shelbie

    2013-01-01

    This study examined how students' science-specific argumentative writing skills and understanding of core ideas changed over the course of a school year as they participated in a series of science laboratories designed using the Argument-Driven Inquiry (ADI) instructional model. The ADI model is a student-centered and writing-intensive approach to…

  20. Analysis of the core genome and pangenome of Pseudomonas putida.

    PubMed

    Udaondo, Zulema; Molina, Lázaro; Segura, Ana; Duque, Estrella; Ramos, Juan L

    2016-10-01

    Pseudomonas putida are strict aerobes that proliferate in a range of temperate niches and are of interest for environmental applications due to their capacity to degrade pollutants and ability to promote plant growth. Furthermore, solvent-tolerant strains are useful for biosynthesis of added-value chemicals. We present a comprehensive comparative analysis of nine strains and the first characterization of the Pseudomonas putida pangenome. The core genome of P. putida comprises approximately 3386 genes. The most abundant genes within the core genome are those that encode nutrient transporters. Other conserved genes include those for central carbon metabolism through the Entner-Doudoroff pathway, the pentose phosphate cycle, arginine and proline metabolism, and pathways for degradation of aromatic chemicals. Genes that encode transporters, enzymes and regulators for amino acid metabolism (synthesis and degradation) are all part of the core genome, as are various electron transporters, which enable aerobic metabolism under different oxygen regimes. Within the core genome are 30 genes for flagella biosynthesis and 12 key genes for biofilm formation. Pseudomonas putida strains share 85% of the coding regions with Pseudomonas aeruginosa; however, in P. putida, virulence factors such as exotoxins and type III secretion systems are absent. © 2015 Society for Applied Microbiology and John Wiley & Sons Ltd.
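
    The core genome / pangenome distinction reported here can be illustrated with a minimal set-based sketch: the pangenome is the union of gene families found across strains and the core genome is their intersection. The strain names and gene-family identifiers below are hypothetical placeholders; a real analysis would operate on orthologous gene clusters derived from the nine genomes.

      # Hypothetical gene-family presence sets for a few strains (illustrative only)
      strains = {
          "KT2440":  {"proA", "eddA", "flgB", "benA", "argG"},
          "DOT-T1E": {"proA", "eddA", "flgB", "ttgB", "argG"},
          "F1":      {"proA", "eddA", "flgB", "todC", "argG"},
      }

      pangenome = set().union(*strains.values())          # gene families seen in any strain
      core_genome = set.intersection(*strains.values())   # gene families present in every strain
      accessory = pangenome - core_genome                 # strain-variable gene families

      print(f"pangenome size: {len(pangenome)}")
      print(f"core genome size: {len(core_genome)}")      # ~3386 genes in the actual study
      print(f"accessory genes: {sorted(accessory)}")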

  1. Coherent network analysis of gravitational waves from three-dimensional core-collapse supernova models

    NASA Astrophysics Data System (ADS)

    Hayama, Kazuhiro; Kuroda, Takami; Kotake, Kei; Takiwaki, Tomoya

    2015-12-01

    Using predictions from three-dimensional (3D) hydrodynamics simulations of core-collapse supernovae (CCSNe), we present a coherent network analysis for the detection, reconstruction, and source localization of the gravitational-wave (GW) signals. We use the RIDGE pipeline for the analysis, in which the network of LIGO Hanford, LIGO Livingston, VIRGO, and KAGRA is considered. Combined with a GW spectrogram analysis, we show that several important hydrodynamics features in the original waveforms persist in the waveforms of the reconstructed signals. The characteristic excess in the spectrograms originates not only from the rotating core collapse, bounce, and subsequent ringdown of the proto-neutron star (PNS), as previously identified, but also from the formation of magnetohydrodynamics jets and nonaxisymmetric instabilities in the vicinity of the PNS. Regarding the GW signals emitted near the rotating core bounce, the horizon distance extends up to ˜18 kpc for the most rapidly rotating 3D model in this work. Following the rotating core bounce, the dominant source of the GW emission shifts to the nonaxisymmetric instabilities. The horizon distances extend maximally up to ˜40 kpc seen from the spin axis. With an increasing number of 3D models trending towards explosion recently, our results suggest that, in addition to the best-studied GW signals due to rotating core collapse and bounce, the time is ripe to consider how we can do science from GWs of CCSNe much more seriously than before. In particular, the quasiperiodic signals due to the nonaxisymmetric instabilities and their detectability deserve further investigation to elucidate the inner workings of rapidly rotating CCSNe.
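
    A GW spectrogram of the kind referred to above can be produced with a short-time Fourier transform; the sketch below uses a synthetic chirp-like strain series rather than actual simulation waveforms, and the sampling rate and window length are arbitrary illustrative choices, not those of the RIDGE pipeline.

      import numpy as np
      from scipy import signal

      fs = 4096.0                                   # sampling rate (Hz), arbitrary choice
      t = np.arange(0.0, 1.0, 1.0 / fs)
      # Synthetic stand-in for a post-bounce GW strain: a rising quasi-periodic component plus noise
      strain = 1e-21 * np.sin(2 * np.pi * (200 + 400 * t) * t) + 2e-22 * np.random.randn(t.size)

      # Short-time Fourier transform -> time-frequency (spectrogram) representation
      f, tt, Sxx = signal.spectrogram(strain, fs=fs, nperseg=256, noverlap=192)

      # Track the dominant frequency in each time bin (e.g., to follow a PNS ringdown feature)
      dominant_f = f[np.argmax(Sxx, axis=0)]
      print(dominant_f[:10])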

  2. Prostate needle biopsy processing: a survey of laboratory practice across Europe.

    PubMed

    Varma, Murali; Berney, Daniel M; Algaba, Ferran; Camparo, Philippe; Compérat, Eva; Griffiths, David F R; Kristiansen, Glen; Lopez-Beltran, Antonio; Montironi, Rodolfo; Egevad, Lars

    2013-02-01

    To determine the degree of variation in the handling of prostate needle biopsies (PNBx) in laboratories across Europe. A web-based survey was emailed to members of the European Network of Uropathology and the British Association of Urological Pathologists. Responses were received from 241 laboratories in 15 countries. PNBx were generally taken by urologists (93.8%) or radiologists (23.7%) but in 8.7% were also taken by non-medical personnel such as radiographers, nurses or biomedical assistants. Of the responding laboratories, 40.8% received cores in separate containers, 42.3% processed one core/block, 54.2% examined three levels/block, 49.4% examined one H&E section/level and 56.1% retained spare sections for potential immunohistochemistry. Of the laboratories, 40.9% retained unstained spares for over a year while 36.2% discarded spares within 1 month of reporting. Only two (0.8%) respondents routinely performed immunohistochemistry on all PNBx. There were differences in laboratory practice between the UK and the rest of Europe (RE). Procurement of PNBx by non-medical personnel was more common in the UK. RE laboratories more commonly received each core in a separate container, processed one core/block, examined fewer levels/block and examined more H&E sections/level. RE laboratories also retained spares for potential immunohistochemistry less often and for shorter periods. Use of p63 as the sole basal cell marker was more common in RE. There are marked differences in procurement, handling and processing of PNBx in laboratories across Europe. These data can help the development of best practice guidelines.

  3. Oak Ridge National Laboratory Institutional Plan, FY 1995--FY 2000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-11-01

    This report discusses the institutional plan for Oak Ridge National Laboratory for the next five years (1995-2000). Included in this report are: the laboratory director's statement; the laboratory mission, vision, and core competencies; the laboratory plan; major laboratory initiatives; scientific and technical programs; critical success factors; summaries of other plans; and resource projections.

  4. An Analysis of the Laboratory Assisting Occupation.

    ERIC Educational Resources Information Center

    McGee, Patricia; And Others

    The general purpose of the occupational analysis is to provide workable, basic information dealing with the many and varied duties performed in the laboratory assistant occupation. The document opens with a brief introduction followed by a job description. The bulk of the document is presented in table form. Eleven duties are broken down into a…

  5. A full virial analysis of the prestellar cores in the Ophiuchus molecular cloud

    NASA Astrophysics Data System (ADS)

    Pattle, Kate; Ward-Thompson, Derek

    2015-08-01

    We present the first observations of the Ophiuchus molecular cloud performed as part of the James Clerk Maxwell Telescope (JCMT) Gould Belt Survey with the SCUBA-2 instrument. We demonstrate methods for combining these data with HARP CO, Herschel and IRAM N2H+ observations in order to accurately quantify the properties of the SCUBA-2 sources in Ophiuchus. We perform a full virial analysis on the starless cores in Ophiuchus, including external pressure. We find that the majority of our cores are either bound or virialised, and that gravity and external pressure are typically of similar importance in confining cores. We find that the critical Bonnor-Ebert stability criterion is not a good indicator of the boundedness of our cores. We determine that N2H+ is a good tracer of the bound material of prestellar cores, and find that non-thermal linewidths decrease substantially between the intermediate-density gas traced by C18O and the high-density gas traced by N2H+, indicating the dissipation of turbulence within cores. We find variation from region to region in the virial balance of cores and the relative contributions of pressure and gravity to core support, as well as variation in the degree to which turbulence is dissipated within cores and in the relative numbers of protostellar and starless sources. We find further support for our previous hypothesis of a global evolutionary gradient from southwest to northeast across Ophiuchus, indicating sequential star formation across the region.
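
    For reference, a commonly used form of the virial balance that includes an external pressure term is sketched below in LaTeX. The uniform-density assumption and the neglect of magnetic terms are simplifications for illustration and do not necessarily match the exact formulation used in the survey analysis.

      % Virial balance for a core of mass M, radius R and velocity dispersion \sigma,
      % confined by gravity and external pressure P_ext (magnetic terms neglected):
      2\Omega_K + \Omega_G + \Omega_P = 0, \qquad
      \Omega_K = \frac{3}{2} M \sigma^2, \qquad
      \Omega_G = -\frac{3}{5} \frac{G M^2}{R} \;\;(\text{uniform sphere}), \qquad
      \Omega_P = -4 \pi P_{\mathrm{ext}} R^3 .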

  6. Interrelating the breakage and composition of mined and drill core coal

    NASA Astrophysics Data System (ADS)

    Wilson, Terril Edward

    property) indicated that the size distribution and size fraction composition of the drop-shattered/tumbled core more closely resembled the plant feed than the crushed core. An attempt to determine breakage parameters (to allow use of selection and breakage functions and population balance models in the description of bore core size reduction) was initiated. Rank determination of the three coal types was performed, indicating that higher rank is associated with higher breakage propensity. The two-step procedure of drop-shatter and dry batch tumbling simulates the first-order breakage (volume breakage) and zeroth-order breakage (abrasion of particle surfaces) that occur in excavation and handling operations, and is appropriate for drill core reduction prior to laboratory analysis.

  7. Meta-analysis of the effectiveness of computer-based laboratory versus traditional hands-on laboratory in college and pre-college science instructions

    NASA Astrophysics Data System (ADS)

    Onuoha, Cajetan O.

    The purpose of this research study was to determine the overall effectiveness of computer-based laboratory compared with the traditional hands-on laboratory for improving students' science academic achievement and attitudes towards science subjects at the college and pre-college levels of education in the United States. Meta-analysis was used to synthesize the findings from 38 primary research studies conducted and/or reported in the United States between 1996 and 2006 that compared the effectiveness of computer-based laboratory with the traditional hands-on laboratory on measures related to science academic achievement and attitudes towards science subjects. The 38 primary research studies, with a total of 3,824 subjects, generated 67 weighted individual effect sizes that were used in this meta-analysis. The study found that computer-based laboratory had a small positive effect over the traditional hands-on laboratory on measures related to students' science academic achievement (ES = +0.26) and attitudes towards science subjects (ES = +0.22). It was also found that computer-based laboratory produced larger effects for physical science subjects than for biological sciences (ES = +0.34 vs. +0.17).
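
    The pooled effect sizes quoted above come from weighting individual study effects; a minimal sketch of a weighted mean effect size is given below. The per-study values are hypothetical and the choice of sample-size weights is an assumption for illustration; the meta-analysis itself may have used inverse-variance weights.

      import numpy as np

      # Hypothetical per-study standardized mean differences and sample sizes (illustrative only)
      effect_sizes = np.array([0.35, 0.10, 0.42, -0.05, 0.28, 0.31])
      sample_sizes = np.array([60, 45, 120, 38, 90, 75])

      weights = sample_sizes                          # assumed weighting scheme (sample size)
      pooled_es = np.sum(weights * effect_sizes) / np.sum(weights)

      # Rough standard error of the pooled effect under equal-variance assumptions
      se = np.sqrt(1.0 / np.sum(weights))
      print(f"pooled effect size = {pooled_es:.2f} (SE ~ {se:.2f})")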

  8. Dating a tropical ice core by time-frequency analysis of ion concentration depth profiles

    NASA Astrophysics Data System (ADS)

    Gay, M.; De Angelis, M.; Lacoume, J.-L.

    2014-09-01

    Ice core dating is a key parameter for the interpretation of the ice archives. However, the relationship between ice depth and ice age generally cannot be easily established and requires the combination of numerous investigations and/or modelling efforts. This paper presents a new approach to ice core dating based on time-frequency analysis of chemical profiles at a site where seasonal patterns may be significantly distorted by sporadic events of regional importance, specifically at the summit area of Nevado Illimani (6350 m a.s.l.), located in the eastern Bolivian Andes (16°37' S, 67°46' W). We used ion concentration depth profiles collected along a 100 m deep ice core. The results of Fourier time-frequency and wavelet transforms were first compared. Both methods were applied to a nitrate concentration depth profile. The resulting chronologies were checked by comparison with the multi-proxy year-by-year dating published by de Angelis et al. (2003) and with volcanic tie points. With this first experiment, we demonstrated the efficiency of Fourier time-frequency analysis when tracking the nitrate natural variability. In addition, we were able to show spectrum aliasing due to under-sampling below 70 m. In this article, we propose a method of de-aliasing which significantly improves the core dating in comparison with annual layer manual counting. Fourier time-frequency analysis was applied to concentration depth profiles of seven other ions, providing information on the suitability of each of them for the dating of tropical Andean ice cores.
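
    The time-frequency (here, depth-frequency) idea can be sketched as follows: a short-time Fourier transform of an ion concentration depth profile yields the dominant spatial frequency (layers per metre) as a function of depth, from which annual layer thickness, and hence age, can be accumulated. The synthetic nitrate profile, sampling step and window length below are assumptions for illustration; they do not reproduce the Illimani data or the de-aliasing step.

      import numpy as np
      from scipy import signal

      dz = 0.02                                      # sampling step along the core (m), assumed
      depth = np.arange(0.0, 100.0, dz)
      # Synthetic nitrate profile: annual layers that thin with depth, plus noise (assumed)
      layer_thickness = 0.6 * np.exp(-depth / 80.0)  # metres per year, thinning downcore
      phase = 2 * np.pi * np.cumsum(dz / layer_thickness)
      nitrate = 1.0 + 0.4 * np.sin(phase) + 0.1 * np.random.randn(depth.size)

      # Depth-frequency analysis: dominant cycles-per-metre in sliding windows
      f, zz, Sxx = signal.spectrogram(nitrate, fs=1.0 / dz, nperseg=512, noverlap=384)
      dominant_cpm = f[np.argmax(Sxx[1:, :], axis=0) + 1]   # skip the zero-frequency bin

      # Integrate layers-per-metre over depth to build an approximate depth-age relation
      window_step_m = zz[1] - zz[0]
      estimated_years = np.cumsum(dominant_cpm * window_step_m)
      print(f"estimated age at ~{zz[-1]:.0f} m depth: {estimated_years[-1]:.0f} years")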

  9. Development of Safety Analysis Code System of Beam Transport and Core for Accelerator Driven System

    NASA Astrophysics Data System (ADS)

    Aizawa, Naoto; Iwasaki, Tomohiko

    2014-06-01

    A safety analysis code system for the beam transport and core of an accelerator driven system (ADS) has been developed for analyzing beam transients such as changes in the shape and position of the incident beam. The code system consists of a beam transport analysis part and a core analysis part. TRACE 3-D is employed in the beam transport analysis part, and the shape and incident position of the beam at the target are calculated. In the core analysis part, the neutronics, thermal-hydraulics and cladding failure analyses are performed with the ADS dynamic calculation code ADSE, on the basis of the external source database calculated by PHITS and the cross-section database calculated by SRAC, together with programs for cladding failure analysis based on thermoelastic and creep models. Using the code system, beam transient analyses were performed for the ADS proposed by the Japan Atomic Energy Agency. The results show a rapid increase in cladding temperature and the onset of plastic deformation within several seconds; in addition, the cladding is evaluated to fail by creep within a hundred seconds. These results show that such beam transients can cause cladding failure.

  10. Multivariate analysis of heavy metal contamination using river sediment cores of Nankan River, northern Taiwan

    NASA Astrophysics Data System (ADS)

    Lee, An-Sheng; Lu, Wei-Li; Huang, Jyh-Jaan; Chang, Queenie; Wei, Kuo-Yen; Lin, Chin-Jung; Liou, Sofia Ya Hsuan

    2016-04-01

    Owing to the geology and climate of Taiwan, rivers generally carry large amounts of suspended particles. After these particles settle, they become sediments, which are good sorbents for heavy metals in river systems. Consequently, sediments record the contamination footprint in low-flow-energy regions such as estuaries. Seven sediment cores were collected along the Nankan River, northern Taiwan, which is seriously contaminated by industrial, household and agricultural inputs. Physico-chemical properties of these cores were derived from the Itrax-XRF Core Scanner and grain size analysis. In order to interpret these complex data matrices, multivariate statistical techniques (cluster analysis, factor analysis and discriminant analysis) were applied. The statistical results indicate four types of sediment. One of them represents a contamination event, showing high concentrations of Cu, Zn, Pb, Ni and Fe and low concentrations of Si and Zr. Furthermore, three possible contamination sources for this type of sediment were revealed by factor analysis. The combination of sediment analysis and multivariate statistical techniques provides new insights into the contamination depositional history of the Nankan River and could be similarly applied to other river systems to determine the scale of anthropogenic contamination.
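
    A minimal sketch of this kind of multivariate workflow is given below: element counts from the core scanner are standardized, reduced with principal component analysis (used here as a simple stand-in for the factor analysis step), and grouped with k-means clustering into sediment types. The element list, the random data and the choice of four clusters mirror the four sediment types reported above but are otherwise illustrative assumptions.

      import numpy as np
      from sklearn.preprocessing import StandardScaler
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      elements = ["Cu", "Zn", "Pb", "Ni", "Fe", "Si", "Zr"]
      # Hypothetical Itrax-style counts for 500 downcore measurement points
      counts = rng.lognormal(mean=3.0, sigma=0.5, size=(500, len(elements)))

      X = StandardScaler().fit_transform(np.log10(counts))   # log-transform and standardize
      scores = PCA(n_components=3).fit_transform(X)          # reduce to three components

      labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
      for k in range(4):
          mean_profile = counts[labels == k].mean(axis=0)
          print(f"sediment type {k}:", dict(zip(elements, np.round(mean_profile, 1))))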

  11. Quantifying the Impact of Nanoparticle Coatings and Non-uniformities on XPS Analysis: Gold/silver Core-shell Nanoparticles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yung-Chen Andrew; Engelhard, Mark H.; Baer, Donald R.

    2016-03-07

    Spectral modeling of photoelectrons can serve as a valuable tool when combined with X-ray photoelectron spectroscopy (XPS) analysis. Herein, a new version of the NIST Simulation of Electron Spectra for Surface Analysis (SESSA 2.0) software, capable of directly simulating spherical multilayer NPs, was applied to model citrate stabilized Au/Ag-core/shell nanoparticles (NPs). The NPs were characterized using XPS and scanning transmission electron microscopy (STEM) to determine the composition and morphology of the NPs. The Au/Ag-core/shell NPs were observed to be polydispersed in size, non-circular, and contain off-centered Au-cores. Using the average NP dimensions determined from STEM analysis, SESSA spectral modeling indicated that washed Au/Ag-core shell NPs were stabilized with a 0.8 nm l

  12. Laboratory Building

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herrera, Joshua M.

    2015-03-01

    This report is an analysis of the means of egress and life safety requirements for the laboratory building. The building is located at Sandia National Laboratories (SNL) in Albuquerque, NM. The report includes a prescriptive-based analysis as well as a performance-based analysis. Following the analysis are appendices which contain maps of the laboratory building used throughout the analysis. The top of all the maps is assumed to be north.

  13. Resolving Supercritical Orion Cores

    NASA Astrophysics Data System (ADS)

    Li, Di; Chapman, N.; Goldsmith, P.; Velusamy, T.

    2009-01-01

    The theoretical framework for high mass star formation (HMSF) is unclear. Observations reveal a seeming dichotomy between high- and low-mass star formation, with HMSF occurring only in Giant Molecular Clouds (GMC), mostly in clusters, and with higher star formation efficiencies than low-mass star formation. One crucial constraint on any theoretical model is the dynamical state of massive cores, in particular, whether a massive core is in supercritical collapse. Based on the mass-size relation of dust emission, we select likely unstable targets from a sample of massive cores (Li et al. 2007 ApJ 655, 351) in the nearest GMC, Orion. We have obtained N2H+ (1-0) maps using CARMA with resolution (2.5", 0.006 pc) significantly better than existing observations. We present observational and modeling results for ORI22. By revealing the dynamic structure down to the Jeans scale, the CARMA data confirm the dominance of gravity over turbulence in this core. This work was performed by the Jet Propulsion Laboratory, California Institute of Technology, under contract with the National Aeronautics and Space Administration.

  14. VALIDATION GUIDELINES FOR LABORATORIES PERFORMING FORENSIC ANALYSIS OF CHEMICAL TERRORISM

    EPA Science Inventory

    The Scientific Working Group on Forensic Analysis of Chemical Terrorism (SWGFACT) has developed the following guidelines for laboratories engaged in the forensic analysis of chemical evidence associated with terrorism. This document provides a baseline framework and guidance for...

  15. 2. VIEW IN ROOM 111, ATOMIC ABSORPTION BERYLLIUM ANALYSIS LABORATORY. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. VIEW IN ROOM 111, ATOMIC ABSORPTION BERYLLIUM ANALYSIS LABORATORY. AIR FILTERS AND SWIPES ARE DISSOLVED WITH ACIDS AND THE REMAINING RESIDUES ARE SUSPENDED IN NITRIC ACID SOLUTION. THE SOLUTION IS PROCESSED THROUGH THE ATOMIC ABSORPTION SPECTROPHOTOMETER TO DETECT THE PRESENCE AND LEVELS OF BERYLLIUM. - Rocky Flats Plant, Health Physics Laboratory, On Central Avenue between Third & Fourth Streets, Golden, Jefferson County, CO

  16. MASTR-MS: a web-based collaborative laboratory information management system (LIMS) for metabolomics.

    PubMed

    Hunter, Adam; Dayalan, Saravanan; De Souza, David; Power, Brad; Lorrimar, Rodney; Szabo, Tamas; Nguyen, Thu; O'Callaghan, Sean; Hack, Jeremy; Pyke, James; Nahid, Amsha; Barrero, Roberto; Roessner, Ute; Likic, Vladimir; Tull, Dedreia; Bacic, Antony; McConville, Malcolm; Bellgard, Matthew

    2017-01-01

    An increasing number of research laboratories and core analytical facilities around the world are developing high throughput metabolomic analytical and data processing pipelines that are capable of handling hundreds to thousands of individual samples per year, often over multiple projects, collaborations and sample types. At present, there are no Laboratory Information Management Systems (LIMS) that are specifically tailored for metabolomics laboratories that are capable of tracking samples and associated metadata from the beginning to the end of an experiment, including data processing and archiving, and which are also suitable for use in large institutional core facilities or multi-laboratory consortia as well as single laboratory environments. Here we present MASTR-MS, a downloadable and installable LIMS solution that can be deployed either within a single laboratory or used to link workflows across a multisite network. It comprises a Node Management System that can be used to link and manage projects across one or multiple collaborating laboratories; a User Management System which defines different user groups and privileges of users; a Quote Management System where client quotes are managed; a Project Management System in which metadata is stored and all aspects of project management, including experimental setup, sample tracking and instrument analysis, are defined, and a Data Management System that allows the automatic capture and storage of raw and processed data from the analytical instruments to the LIMS. MASTR-MS is a comprehensive LIMS solution specifically designed for metabolomics. It captures the entire lifecycle of a sample starting from project and experiment design to sample analysis, data capture and storage. It acts as an electronic notebook, facilitating project management within a single laboratory or a multi-node collaborative environment. This software is being developed in close consultation with members of the metabolomics research

  17. High Temperature Reactor (HTR) Deep Burn Core and Fuel Analysis: Design Selection for the Prismatic Block Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Francesco Venneri; Chang-Keun Jo; Jae-Man Noh

    2010-09-01

    The Deep Burn (DB) Project is a U.S. Department of Energy sponsored feasibility study of Transuranic Management using high burnup fuel in the high temperature helium cooled reactor (HTR). The DB Project consists of seven tasks: project management, core and fuel analysis, spent fuel management, fuel cycle integration, TRU fuel modeling, TRU fuel qualification, and HTR fuel recycle. In Phase II of the Project, we conducted nuclear analysis of TRU destruction/utilization in the HTR prismatic block design (Task 2.1), deep burn fuel/TRISO microanalysis (Task 2.3), and synergy with fast reactors (Task 4.2). Task 2.1 covers the core physics design, thermo-hydraulic CFD analysis, and the thermofluid and safety analysis (low pressure conduction cooling, LPCC) of the HTR prismatic block design. Task 2.3 covers the analysis of the structural behavior of TRISO fuel containing TRU at very high burnup levels, i.e. exceeding 50% FIMA. Task 4.2 covers the self-cleaning HTR based on recycle of HTR-generated TRU in the same HTR. Chapter IV contains the design and analysis results for the 600MWth DB-HTR core physics, including the cycle length, the average discharge burnup, heavy metal and plutonium consumption, radial and axial power distributions, and temperature reactivity coefficients. It also contains the analysis results for the 450MWth DB-HTR core physics and the analysis of the decay heat of a TRU-loaded DB-HTR core. The evaluation of the hot spot fuel temperature of the fuel block in the DB-HTR (Deep-Burn High Temperature Reactor) core under full operating power conditions is described in Chapter V. The investigated designs are the 600MWth and 460MWth DB-HTRs. In Chapter VI, the thermo-fluid and safety performance of the 600MWth DB-HTR is analyzed to investigate the thermal-fluid design at steady state and the passive safety performance during an LPCC event. Chapter VII describes the analysis results of the TRISO fuel microanalysis of the 600MWth

  18. Medical Laboratory Services. Student's Manual. Cluster Core for Health Occupations Education.

    ERIC Educational Resources Information Center

    Williams, Catherine

    This student's manual on medical laboratory services is one of a series of self-contained, individualized materials for students enrolled in training within the allied health field. It includes competencies that are associated with the performance of skills common to several occupations in the medical laboratory. The material is intended for use…

  19. NASA Laboratory Analysis for Manned Exploration Missions

    NASA Technical Reports Server (NTRS)

    Krihak, Michael (Editor); Shaw, Tianna

    2014-01-01

    The Exploration Laboratory Analysis (ELA) project supports the Exploration Medical Capability Element under the NASA Human Research Program. ELA instrumentation is identified as an essential capability for future exploration missions to diagnose and treat evidence-based medical conditions. However, mission architecture limits the medical equipment, consumables, and procedures that will be available to treat medical conditions during human exploration missions. Allocated resources such as mass, power, volume, and crew time must be used efficiently to optimize the delivery of in-flight medical care. Although commercial instruments can provide the blood and urine based measurements required for exploration missions, these commercial-off-the-shelf devices are unsuitable for deployment in the space environment. The objective of the ELA project is to close the technology gap between current minimally invasive laboratory capabilities and the analytical measurements needed, within the constraints that the mission architecture imposes on exploration missions. Besides microgravity and radiation tolerance, other principal issues that generally fail to meet NASA requirements include excessive mass, volume, power and consumables, and nominal reagent shelf-life. Though manned exploration missions will not occur for nearly a decade, NASA has already taken strides towards the development of ELA medical diagnostics by establishing mission requirements and concepts of operations that are coupled with strategic investments and partnerships towards meeting these challenges. This paper focuses on the remote environment, its challenges, biomedical diagnostics requirements and candidate technologies that may lead to successful blood/urine chemistry and biomolecular measurements in future space exploration missions. Summary: The NASA Exploration Laboratory Analysis project seeks to develop the capability to diagnose anticipated space exploration medical conditions on future manned missions. To achieve

  20. Practical, transparent prospective risk analysis for the clinical laboratory.

    PubMed

    Janssens, Pim Mw

    2014-11-01

    Prospective risk analysis (PRA) is an essential element in quality assurance for clinical laboratories. Practical approaches to conducting PRA in laboratories, however, are scarce. On the basis of the classical Failure Mode and Effect Analysis method, an approach to PRA was developed for application to key laboratory processes. First, the separate, major steps of the process under investigation are identified. Scores are then given for the Probability (P) and Consequence (C) of predefined types of failures and the chances of Detecting (D) these failures. Based on the P and C scores (on a 10-point scale), an overall Risk score (R) is calculated. The scores for each process were recorded in a matrix table. Based on predetermined criteria for R and D, it was determined whether a more detailed analysis was required for potential failures and, ultimately, where risk-reducing measures were necessary, if any. As an illustration, this paper presents the results of the application of PRA to our pre-analytical and analytical activities. The highest R scores were obtained in the stat processes, the most common failure type in the collective process steps was 'delayed processing or analysis', the failure type with the highest mean R score was 'inappropriate analysis' and the failure type most frequently rated as suboptimal was 'identification error'. The PRA designed is a useful semi-objective tool to identify process steps with potential failures rated as risky. Its systematic design and convenient output in matrix tables makes it easy to perform, practical and transparent. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
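
    A minimal sketch of the scoring logic is given below. The abstract does not state how R is combined from P and C, so the product R = P x C and the threshold values used for flagging are assumptions for illustration; only the general idea of scoring process steps and flagging risky ones follows the described approach.

      # Hypothetical process steps with Probability (P), Consequence (C) and Detectability (D)
      # scores on a 10-point scale (illustrative values only)
      steps = [
          {"step": "sample reception",     "P": 3, "C": 4, "D": 2},
          {"step": "centrifugation",       "P": 2, "C": 3, "D": 3},
          {"step": "stat analysis",        "P": 4, "C": 7, "D": 5},
          {"step": "result authorization", "P": 2, "C": 8, "D": 4},
      ]

      R_THRESHOLD = 20   # assumed criterion for the overall Risk score R
      D_THRESHOLD = 4    # assumed criterion for poor detectability

      for s in steps:
          s["R"] = s["P"] * s["C"]                  # assumed combination rule
          s["needs_detailed_analysis"] = s["R"] >= R_THRESHOLD or s["D"] >= D_THRESHOLD

      for s in steps:
          flag = "REVIEW" if s["needs_detailed_analysis"] else "ok"
          print(f'{s["step"]:<22} P={s["P"]} C={s["C"]} D={s["D"]} R={s["R"]:>2}  {flag}')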

  1. Laboratories | NREL

    Science.gov Websites

    Partial A-Z index of NREL laboratories: Accelerated Exposure Testing Laboratory, Advanced Optical Materials Laboratory, Advanced Thermal Laboratory, Structural Testing Laboratory, Surface Analysis Laboratory, Systems Performance Laboratory, Thermal Storage Materials Laboratory, Thermal Storage Process and Components Laboratory, Thin-Film Deposition

  2. Evaluation of potential severe accidents during low power and shutdown operations at Grand Gulf, Unit 1. Volume 5: Analysis of core damage frequency from seismic events for plant operational state 5 during a refueling outage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Budnitz, R.J.; Davis, P.R.; Ravindra, M.K.

    In 1989 the US Nuclear Regulatory Commission (NRC) initiated an extensive program to examine carefully the potential risks during low-power and shutdown operations. The program included two parallel projects, one at Sandia National Laboratories studying a boiling water reactor (Grand Gulf), and the other at Brookhaven National Laboratory studying a pressurized water reactor (Surry Unit 1). Both the Sandia and Brookhaven projects examined only accidents initiated by internal plant faults, so-called "internal initiators." This project, which explored the likelihood of seismically initiated core damage accidents during refueling outage conditions, is complementary to the internal-initiator analyses at Brookhaven and Sandia. This report covers the seismic analysis at Grand Gulf. All of the many systems modeling assumptions, component non-seismic failure rates, and human error rates that were used in the internal-initiator study at Grand Gulf have been adopted here, so that the results of the two studies are as comparable as possible. Both the Sandia study and this study examine only one shutdown plant operating state (POS) at Grand Gulf, namely POS 5, representing cold shutdown during a refueling outage. This analysis has been limited to work analogous to a level-1 seismic PRA, in which estimates have been developed for the core-damage frequency from seismic events during POS 5. The analysis finds that the core-damage frequency for earthquake-initiated accidents during refueling outages in POS 5 is quite low in absolute terms, less than 10^-7 per year.

  3. Determination of power distribution in the VVER-440 core on the basis of data from in-core monitors by means of a metric analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kryanev, A. V.; Udumyan, D. K.; Kurchenkov, A. Yu., E-mail: s327@vver.kiae.ru

    2014-12-15

    Problems associated with determining the power distribution in the VVER-440 core on the basis of a neutron-physics calculation and data from in-core monitors are considered. A new mathematical scheme based on a metric analysis is proposed for this purpose. Compared with existing mathematical schemes, the proposed scheme improves the accuracy and reliability of the resulting power distribution.

  4. Comparative Analysis of Hexagonal Solid Silica and Nitro-benzene Filled Hollow Core Photonic Crystal Fiber

    NASA Astrophysics Data System (ADS)

    Shahiruddin; Singh, Dharmendra K.; Hassan, M. A.

    2018-02-01

    A comparative study of a five-ring solid-core photonic crystal fiber (PCF) and a nitrobenzene-filled hollow-core PCF is presented. The same structure is considered in both cases: one design uses a solid silica core and the other a hollow core filled with nitrobenzene. The paper examines the confinement loss, dispersion properties and birefringence of an index-guiding PCF with asymmetric cladding, designed and analyzed by the finite-element method. The proposed structure shows low confinement loss in the solid-silica case, negative dispersion in the nitrobenzene-filled hollow-core PCF, and high birefringence in both cases. The calculated values show nearly zero confinement loss over the 0.7 µm to 1.54 µm range, flat zero dispersion in the solid core and about -2000 ps/(km·nm) in the nitrobenzene-filled hollow-core PCF, and birefringence on the order of 10^-3 in the nitrobenzene-filled hollow-core PCF. The results show the comparative behavior at different air-fill fractions.

  5. Biotechniques Laboratory: An Enabling Course in the Biological Sciences

    ERIC Educational Resources Information Center

    Di Trapani, Giovanna; Clarke, Frank

    2012-01-01

    Practical skills and competencies are critical to student engagement and effective learning in laboratory courses. This article describes the design of a yearlong, stand-alone laboratory course--the Biotechniques Laboratory--a common core course in the second year of all our degree programs in the biological sciences. It is an enabling,…

  6. Methodology of full-core Monte Carlo calculations with leakage parameter evaluations for benchmark critical experiment analysis

    NASA Astrophysics Data System (ADS)

    Sboev, A. G.; Ilyashenko, A. S.; Vetrova, O. A.

    1997-02-01

    The method of buckling evaluation implemented in the Monte Carlo code MCS is described. The method was applied to the calculational analysis of the well-known light-water experiments TRX-1 and TRX-2. The comparison shows that there is no agreement among Monte Carlo calculations obtained in different ways: MCS calculations with the given experimental bucklings; MCS calculations with bucklings evaluated from full-core MCS direct simulations; full-core MCNP and MCS direct simulations; and MCNP and MCS calculations in which the results of cell calculations are corrected by coefficients taking into account the leakage from the core. The buckling values evaluated by full-core MCS calculations also differed from the experimental ones, especially in the case of TRX-1, where the difference corresponds to a 0.5 percent increase in the Keff value.
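
    For context, the standard textbook relation linking an infinite-lattice (cell) calculation to a finite core through a buckling-dependent leakage correction is sketched below in LaTeX; the actual correction coefficients applied in the MCS/MCNP cell calculations may differ.

      % One-group leakage correction: k_inf from a cell (infinite-lattice) calculation is
      % reduced by the non-leakage probability of a finite core with geometric buckling B^2
      % and migration area M^2; the second relation gives B^2 for a bare finite cylinder.
      k_{\mathrm{eff}} = \frac{k_\infty}{1 + M^2 B^2},
      \qquad
      B^2_{\text{cyl}} = \left(\frac{2.405}{R}\right)^2 + \left(\frac{\pi}{H}\right)^2 .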

  7. Total laboratory automation: Do stat tests still matter?

    PubMed

    Dolci, Alberto; Giavarina, Davide; Pasqualetti, Sara; Szőke, Dominika; Panteghini, Mauro

    2017-07-01

    During the past decades healthcare systems have changed rapidly, and today hospital care is primarily advocated for critical patients and acute treatments, for which laboratory test results are crucial and always need to be reported within a predictably short turnaround time (TAT). Laboratories in the hospital setting can face this challenge by changing their organization from a compartmentalized laboratory department toward a decision making-based laboratory department. This requires the implementation of a core laboratory that exploits total laboratory automation (TLA), using technological innovation in analytical platforms, track systems and information technology, including middleware, together with a number of satellite specialized laboratory sections cooperating with care teams for specific medical conditions. In this laboratory department model, the short TAT for all first-line tests performed by TLA in the core laboratory represents the key paradigm: no stat testing is required because all samples are handled in real time and (auto)validated results are dispatched in a time that fulfills clinical needs. To optimally reach this goal, laboratories should be actively involved in managing all the steps covering the total examination process, also speeding up extra-laboratory phases such as sample delivery. Furthermore, to warrant effectiveness and not only efficiency, all the processes, e.g. specimen integrity checks, should be managed by middleware through a predefined set of rules defined in light of clinical governance. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.

  8. Conceptual Core Analysis of Long Life PWR Utilizing Thorium-Uranium Fuel Cycle

    NASA Astrophysics Data System (ADS)

    Rouf; Su'ud, Zaki

    2016-08-01

    A conceptual core analysis of a long-life PWR utilizing thorium-uranium based fuel has been conducted. The purpose of this study is to evaluate the neutronic behavior of a reactor core using combined thorium and enriched uranium fuel. With this fuel composition, the core has a higher conversion ratio than conventional fuel, which allows a longer operating cycle. The simulation was performed using the SRAC code system with the SRACLIB-JDL32 library. The calculations were carried out for (Th-U)O2 and (Th-U)C fuels with a uranium fraction of 30-40% and gadolinium (Gd2O3) as burnable poison at 0.0125%. The fuel composition was adjusted to obtain a burnup length of 10-15 years at a thermal power of 600-1000 MWt. Key parameters such as uranium enrichment, fuel volume fraction, and uranium fraction were evaluated. The core calculation in this study adopted an R-Z geometry divided into three regions, each with a different uranium enrichment. The results show the multiplication factor at every burnup step over the 15-year operation length, the power distribution behavior, the power peaking factor, and the conversion ratio. The optimum core design is achieved at a thermal power of 600 MWt, a uranium fraction of 35%, and a U-235 enrichment of 11-13%, with a 14-year operation length and axial and radial power peaking factors of about 1.5 and 1.2, respectively.

  9. An analysis of laboratory activities found in "Applications In Biology/Chemistry: A Contextual Approach to Laboratory Science"

    NASA Astrophysics Data System (ADS)

    Haskins, Sandra Sue

    The purpose of this study was to quantitatively determine whether the material found in ABC promotes scientific inquiry through the inclusion of science process skills, and to quantitatively determine the type (experimental, comparative, or descriptive) and character (wet-lab, paper and pencil, model, or computer) of its laboratory activities. The research design allowed for an examination of the frequency and type of science process skills required of students in 79 laboratory activities sampled from all 12 units, using a modified 33-item laboratory analysis inventory (LAI) (Germane et al., 1996). Interrater reliability for the science process skills was established on 19 of the laboratory activities with a mean score of 86.1%. Interrater reliability for the type and character of the laboratory activities, on the same 19 activities, was established with mean scores of 79.0% and 96.5%, respectively. It was found that all laboratory activities provide a prelaboratory activity. In addition, the science process skill category of student performance is required most often of students, with the skill of learning techniques or manipulating apparatus occurring 99% of the time. The science process skill category observed least was student planning and design, occurring only 3% of the time. Students were rarely given the opportunity to practice science process skills such as developing and testing hypotheses through experiments they have designed. Chi-square tests, applied at the .05 level of significance, revealed a significant difference in the type of laboratory activities; comparative laboratory activities appeared most often (59%). For the character of laboratory activities, "wet-lab" activities appeared more often (90%) than any of the others.
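
    The chi-square comparison mentioned above can be reproduced in outline as a goodness-of-fit test of the observed counts of laboratory types against a uniform expectation. The counts below are back-calculated roughly from the reported percentages (59% comparative of 79 activities) and are illustrative, not the study's exact tabulation.

      from scipy.stats import chisquare

      # Approximate counts of the 79 activities by type (comparative ~59%); illustrative values
      observed = [47, 20, 12]               # comparative, experimental, descriptive
      stat, p_value = chisquare(observed)   # expected counts default to a uniform distribution

      print(f"chi-square = {stat:.2f}, p = {p_value:.4f}")
      if p_value < 0.05:
          print("significant difference in the distribution of laboratory activity types")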

  10. Development of a wear-resistant flux cored wire of Fe-C-Si-Mn-Cr-Ni-Mo-V system for deposit welding of mining equipment parts

    NASA Astrophysics Data System (ADS)

    Osetkovsky, I. V.; Kozyrev, N. A.; Kryukov, R. E.; Usoltsev, A. A.; Gusev, A. I.

    2017-09-01

    The effect of introducing cobalt into the charge of a flux-cored wire of the Fe-C-Si-Mn-Cr-Ni-Mo-V system, intended for operation under abrasive and abrasive-shock loads, is studied. Sample flux-cored wires were manufactured in the laboratory from appropriate powdered materials, with dust from the gas-cleaning units of aluminum production used as the carbon-fluorine-containing material. Deposition welding was performed, and the influence of cobalt on the structure, the nature of the nonmetallic inclusions, the hardness, and the wear resistance of the weld metal was evaluated. In the course of the study the chemical composition of the weld metal was determined, metallographic analysis was performed, and mechanical properties were measured. The metallographic analysis established the size of the former austenite grains, the dispersion of martensite in the weld metal structure, and the level of contamination with nonmetallic inclusions.

  11. The State Public Health Laboratory System.

    PubMed

    Inhorn, Stanley L; Astles, J Rex; Gradus, Stephen; Malmberg, Veronica; Snippes, Paula M; Wilcke, Burton W; White, Vanessa A

    2010-01-01

    This article describes the development since 2000 of the State Public Health Laboratory System in the United States. These state systems collectively are related to several other recent public health laboratory (PHL) initiatives. The first is the Core Functions and Capabilities of State Public Health Laboratories, a white paper that defined the basic responsibilities of the state PHL. Another is the Centers for Disease Control and Prevention National Laboratory System (NLS) initiative, the goal of which is to promote public-private collaboration to assure quality laboratory services and public health surveillance. To enhance the realization of the NLS, the Association of Public Health Laboratories (APHL) launched in 2004 a State Public Health Laboratory System Improvement Program. In the same year, APHL developed a Comprehensive Laboratory Services Survey, a tool to measure improvement through the decade to assure that essential PHL services are provided.

  12. Analysis of core-periphery organization in protein contact networks reveals groups of structurally and functionally critical residues.

    PubMed

    Isaac, Arnold Emerson; Sinha, Sitabhra

    2015-10-01

    The representation of proteins as networks of interacting amino acids, referred to as protein contact networks (PCN), and their subsequent analyses using graph theoretic tools, can provide novel insights into the key functional roles of specific groups of residues. We have characterized the networks corresponding to the native states of 66 proteins (belonging to different families) in terms of their core-periphery organization. The resulting hierarchical classification of the amino acid constituents of a protein arranges the residues into successive layers (having higher core order) with increasing connection density, ranging from a sparsely linked periphery to a densely intra-connected core (distinct from the earlier concept of protein core defined in terms of the three-dimensional geometry of the native state, which has least solvent accessibility). Our results show that residues in the inner cores are more conserved than those at the periphery. Underlining the functional importance of the network core, we see that the receptor sites for known ligand molecules of most proteins occur in the innermost core. Furthermore, the association of residues with structural pockets and cavities in binding or active sites increases with the core order. From mutation sensitivity analysis, we show that the probability of deleterious or intolerant mutations also increases with the core order. We also show that stabilization centre residues are in the innermost cores, suggesting that the network core is critically important in maintaining the structural stability of the protein. A publicly available Web resource for performing core-periphery analysis of any protein whose native state is known has been made available by us at http://www.imsc.res.in/~sitabhra/proteinKcore/index.html.
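
    One standard way to obtain a layered "core order" of the kind described above is a k-core (k-shell) decomposition of the contact network; whether this matches the paper's exact definition is not guaranteed, so the sketch below is illustrative. It builds a toy contact network from pairwise residue distances (random coordinates stand in for a real structure) and reports each residue's core number with networkx.

      import numpy as np
      import networkx as nx

      rng = np.random.default_rng(1)
      n_residues = 60
      coords = rng.uniform(0.0, 30.0, size=(n_residues, 3))  # stand-in C-alpha coordinates (angstroms)

      # Build a protein contact network: nodes are residues, edges join residues closer than a cutoff
      CUTOFF = 8.0                                            # typical contact cutoff (assumed)
      G = nx.Graph()
      G.add_nodes_from(range(n_residues))
      for i in range(n_residues):
          for j in range(i + 1, n_residues):
              if np.linalg.norm(coords[i] - coords[j]) < CUTOFF:
                  G.add_edge(i, j)

      core_number = nx.core_number(G)     # k-shell index of each residue (higher = deeper in the core)
      innermost = max(core_number.values())
      inner_core = [r for r, k in core_number.items() if k == innermost]
      print(f"innermost core order: {innermost}; residues in innermost core: {inner_core}")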

  13. Comparative analysis of core genome MLST and SNP typing within a European Salmonella serovar Enteritidis outbreak.

    PubMed

    Pearce, Madison E; Alikhan, Nabil-Fareed; Dallman, Timothy J; Zhou, Zhemin; Grant, Kathie; Maiden, Martin C J

    2018-06-02

    Multi-country outbreaks of foodborne bacterial disease present challenges in their detection, tracking, and notification. As food is increasingly distributed across borders, such outbreaks are becoming more common. This increases the need for high-resolution, accessible, and replicable isolate typing schemes. Here we evaluate a core genome multilocus sequence typing (cgMLST) scheme for the high-resolution, reproducible typing of Salmonella enterica isolates, by applying it to a large European outbreak of S. enterica serovar Enteritidis. This outbreak had been extensively characterised using single nucleotide polymorphism (SNP)-based approaches. The cgMLST analysis was congruent with the original SNP-based analysis, the epidemiological data, and whole genome MLST (wgMLST) analysis. Combination of the cgMLST and epidemiological data confirmed that the genetic diversity among the isolates predated the outbreak, and was likely present at the infection source. There was consequently no link between country of isolation and genetic diversity, but the cgMLST clusters were congruent with date of isolation. Furthermore, comparison with publicly available Enteritidis isolate data demonstrated that the cgMLST scheme presented is highly scalable, enabling outbreaks to be contextualised within the Salmonella genus. The cgMLST scheme is therefore shown to be a standardised and scalable typing method, which allows Salmonella outbreaks to be analysed and compared across laboratories and jurisdictions. Copyright © 2018. Published by Elsevier B.V.
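
    At its simplest, cgMLST clustering compares allele profiles locus by locus; the sketch below counts allele differences between isolates (ignoring missing loci) and groups isolates by single-linkage under an assumed threshold. The profiles, locus names and the 5-allele threshold are illustrative assumptions, not the scheme's actual parameters.

      from itertools import combinations

      # Hypothetical cgMLST allele profiles: locus -> allele number (0 denotes a missing locus)
      profiles = {
          "iso1": {"L1": 4, "L2": 7, "L3": 1, "L4": 9, "L5": 2},
          "iso2": {"L1": 4, "L2": 7, "L3": 1, "L4": 9, "L5": 3},
          "iso3": {"L1": 4, "L2": 8, "L3": 0, "L4": 2, "L5": 6},
      }

      def allele_distance(a, b):
          """Number of loci with differing alleles, ignoring loci missing in either profile."""
          shared = [loc for loc in a if a[loc] and b[loc]]
          return sum(1 for loc in shared if a[loc] != b[loc])

      THRESHOLD = 5   # assumed single-linkage clustering threshold (allele differences)
      clusters = {name: {name} for name in profiles}
      for x, y in combinations(profiles, 2):
          if allele_distance(profiles[x], profiles[y]) <= THRESHOLD:
              merged = clusters[x] | clusters[y]
              for member in merged:
                  clusters[member] = merged

      print({name: sorted(members) for name, members in clusters.items()})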

  14. Structural, evolutionary and genetic analysis of the histidine biosynthetic "core" in the genus Burkholderia.

    PubMed

    Papaleo, Maria Cristiana; Russo, Edda; Fondi, Marco; Emiliani, Giovanni; Frandi, Antonio; Brilli, Matteo; Pastorelli, Roberta; Fani, Renato

    2009-12-01

    In this work a detailed analysis of the structure, the expression and the organization of his genes belonging to the core of histidine biosynthesis (hisBHAF) in 40 newly determined and 13 available sequences of Burkholderia strains was carried out. The data obtained revealed a strong conservation of the structure and organization of these genes throughout the entire genus. The phylogenetic analysis showed the monophyletic origin of this gene cluster and indicated that it did not undergo horizontal gene transfer events. The analysis of the intergenic regions, based on the substitution rate, entropy plot and bendability, suggested the existence of a putative transcription promoter upstream of hisB, which was supported by genetic analysis showing that this cluster was able to complement Escherichia coli hisA, hisB, and hisF mutations. Moreover, a preliminary transcriptional analysis and the analysis of microarray data revealed that the expression of the his core was constitutive. These findings are in agreement with the fact that the entire Burkholderia his operon is heterogeneous, in that it contains "alien" genes apparently not involved in histidine biosynthesis. They also support the idea that the proteobacterial his operon was assembled piecewise, i.e. through accretion of smaller units containing only some of the genes (eventually together with their own promoters) involved in this biosynthetic route. The correlation existing between the structure, organization and regulation of his "core" genes and the function(s) they perform in cellular metabolism is discussed.

  15. Heat deposition analysis for the High Flux Isotope Reactor’s HEU and LEU core models

    DOE PAGES

    Davidson, Eva E.; Betzler, Benjamin R.; Chandler, David; ...

    2017-08-01

    The High Flux Isotope Reactor at Oak Ridge National Laboratory is an 85 MWth pressurized light-water-cooled and -moderated flux-trap type research reactor. The reactor is used to conduct numerous experiments, advancing various scientific and engineering disciplines. As part of an ongoing program sponsored by the US Department of Energy National Nuclear Security Administration Office of Material Management and Minimization, studies are being performed to assess the feasibility of converting the reactor’s highly enriched uranium fuel to low-enriched uranium fuel. To support this conversion project, reference models with representative experiment target loading and explicit fuel plate representation were developed and benchmarked for both fuels to (1) allow for consistent comparison between designs for both fuel types and (2) assess the potential impact of low-enriched uranium conversion. These high-fidelity models were used to conduct heat deposition analyses at the beginning and end of the reactor cycle and are presented herein. This article (1) discusses the High Flux Isotope Reactor models developed to facilitate detailed heat deposition analyses of the reactor’s highly enriched and low-enriched uranium cores, (2) examines the computational approach for performing heat deposition analysis, which includes a discussion on the methodology for calculating the amount of energy released per fission, heating rates, power and volumetric heating rates, and (3) provides results calculated throughout various regions of the highly enriched and low-enriched uranium core at the beginning and end of the reactor cycle. These are the first detailed high-fidelity heat deposition analyses for the High Flux Isotope Reactor’s highly enriched and low-enriched core models with explicit fuel plate representation. Lastly, these analyses are used to compare heat distributions obtained for both fuel designs at the beginning and end of the reactor cycle, and they are essential
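
    The basic quantities behind such a heat deposition analysis can be summarized in one textbook relation, sketched below in LaTeX; this is a generic fission-heating expression and a simplification of the full calculation (it ignores, for example, gamma transport of deposited energy between regions).

      % Local volumetric heating rate from fission: neutron flux phi, macroscopic fission
      % cross section Sigma_f, and recoverable energy per fission w_f (~200 MeV/fission);
      % integrating over the core recovers the total thermal power P.
      q'''(\mathbf{r}) \approx w_f \, \Sigma_f(\mathbf{r}) \, \phi(\mathbf{r}),
      \qquad
      P = \int_{\text{core}} q'''(\mathbf{r}) \, \mathrm{d}V .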

  16. Idaho National Laboratory Quarterly Performance Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, Lisbeth

    2014-11-01

    This report is published quarterly by the Idaho National Laboratory (INL) Quality and Performance Management Organization. The Department of Energy (DOE) Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of 60 reportable events (23 from the 4th Qtr FY14 and 37 from the prior three reporting quarters) as well as 58 other issue reports (including not reportable events and Significant Category A and B conditions) identified at INL from July 2013 through October 2014. Battelle Energy Alliance (BEA) operates the INL under contract DE-AC07-05ID14517.

  17. Idaho National Laboratory Quarterly Occurrence Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, Lisbeth Ann

    This report is published quarterly by the Idaho National Laboratory (INL) Quality and Performance Management Organization. The Department of Energy (DOE) Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of 85 reportable events (18 from the 4th Qtr FY-15 and 67 from the prior three reporting quarters), as well as 25 other issue reports (including events found to be not reportable and Significant Category A and B conditions) identified at INL during the past 12 months (8 from this quarter and 17 from the prior three quarters).

  18. Inter-laboratory comparison of the in vivo comet assay including three image analysis systems.

    PubMed

    Plappert-Helbig, Ulla; Guérard, Melanie

    2015-12-01

    To compare the extent of potential inter-laboratory variability and the influence of different comet image analysis systems, in vivo comet experiments were conducted using the genotoxicants ethyl methanesulfonate and methyl methanesulfonate. Tissue samples from the same animals were processed and analyzed, including independent slide evaluation by image analysis, in two laboratories with extensive experience in performing the comet assay. The analysis revealed low inter-laboratory experimental variability. Neither the use of different image analysis systems nor the DNA staining procedure (propidium iodide vs. SYBR® Gold) considerably impacted the results or sensitivity of the assay. In addition, relatively high stability of the staining intensity of propidium iodide-stained slides was found in slides that had been refrigerated for over 3 months. In conclusion, following a thoroughly defined protocol and standardized routine procedures ensures that the comet assay is robust and generates comparable results between different laboratories. © 2015 Wiley Periodicals, Inc.

  19. Inquiry-based Laboratory Activities on Drugs Analysis for High School Chemistry Learning

    NASA Astrophysics Data System (ADS)

    Rahmawati, I.; Sholichin, H.; Arifin, M.

    2017-09-01

    Laboratory activity is an important part of chemistry learning, but cookbook-style instructions are still commonly used. Activities of that kind do not improve students' thinking skills, especially their creativity. This study aims to improve high school students' creativity through an inquiry-based laboratory activity on drugs analysis. Acid-base titration involving a color-changing indicator is used as the method for drugs analysis. The following tools were used to assess the activity's achievement: a creative-thinking test on acid-base titration, creative attitude and action observation sheets, a questionnaire on inquiry-based lab activities, and interviews. The results showed that the inquiry-based laboratory activity improved students' creative thinking, creative attitude, and creative action. The students reacted positively to this teaching strategy, as demonstrated by the questionnaire responses and interviews. This result is expected to help teachers overcome the shortcomings of other laboratory learning approaches.

  20. 9 CFR 590.960 - Small importations for consignee's personal use, display, or laboratory analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... personal use, display, or laboratory analysis. 590.960 Section 590.960 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE EGG PRODUCTS INSPECTION INSPECTION OF EGGS AND... personal use, display, or laboratory analysis. Any egg products which are offered for importation...

  1. Barium and calcium analyses in sediment cores using µ-XRF core scanners

    NASA Astrophysics Data System (ADS)

    Acar, Dursun; Çaǧatay, Namık; Genç, S. Can; Eriş, K. Kadir; Sarı, Erol; Uçarkus, Gülsen

    2017-04-01

    Barium and Ca are used as proxies for organic productivity in paleoceanographic studies. With its heavy atomic weight (137.33 u), barium is easily detectable at small concentrations (several ppm) in marine sediments using XRF methods, including analysis by µ-XRF core scanners. Calcium has an intermediate atomic weight (40.078 u) but is a major element in the earth's crust and in sediments and sedimentary rocks, and hence it is also easily detectable by µ-XRF techniques. Normally, µ-XRF elemental analysis of cores is carried out on split half cores or 1-2 cm thick u-channels at their original moisture. Sediment cores show variation in water content (and porosity) along their length, which in turn causes variation in the XRF counts of the elements and errors in the derived elemental concentrations. We compared µ-XRF elemental analysis of split half cores, of 1 cm thick u-channel subsamples at original moisture, and of 0.3 mm thin slices of the core measured both wet and after air drying under a protective mylar film. We found a considerable increase in the counts of most elements, in particular Ba and Ca, when the 0.3 mm dried thin slice was used. In the case of Ba, the counts increased to about three times those of the analysis made with wet, 1 cm thick u-channels. The higher Ba and Ca counts are mainly due to the possible precipitation of Ba as barite and Ca as gypsum from the oxidation of Fe-sulphides and the evaporation of pore waters. Such secondary barite and gypsum precipitation would be especially serious in anoxic sediment units, such as sapropels, with considerable Fe-sulphides and bio-barite. It is therefore suggested that researchers should be cautious about such secondary precipitation on core surfaces when analyzing cores that have long been exposed to atmospheric conditions.

  2. Sandia National Laboratories analysis code data base

    NASA Astrophysics Data System (ADS)

    Peterson, C. W.

    1994-11-01

    Sandia National Laboratories' mission is to solve important problems in the areas of national defense, energy security, environmental integrity, and industrial technology. The laboratories' strategy for accomplishing this mission is to conduct research to provide an understanding of the important physical phenomena underlying any problem, and then to construct validated computational models of the phenomena which can be used as tools to solve the problem. In the course of implementing this strategy, Sandia's technical staff has produced a wide variety of numerical problem-solving tools which they use regularly in the design, analysis, performance prediction, and optimization of Sandia components, systems, and manufacturing processes. This report provides the relevant technical and accessibility data on the numerical codes used at Sandia, including information on the technical competency or capability area that each code addresses, code 'ownership' and release status, and references describing the physical models and numerical implementation.

  3. CoreFlow: a computational platform for integration, analysis and modeling of complex biological data.

    PubMed

    Pasculescu, Adrian; Schoof, Erwin M; Creixell, Pau; Zheng, Yong; Olhovsky, Marina; Tian, Ruijun; So, Jonathan; Vanderlaan, Rachel D; Pawson, Tony; Linding, Rune; Colwill, Karen

    2014-04-04

    A major challenge in mass spectrometry and other large-scale applications is how to handle, integrate, and model the data that is produced. Given the speed at which technology advances and the need to keep pace with biological experiments, we designed a computational platform, CoreFlow, which provides programmers with a framework to manage data in real-time. It allows users to upload data into a relational database (MySQL), and to create custom scripts in high-level languages such as R, Python, or Perl for processing, correcting and modeling this data. CoreFlow organizes these scripts into project-specific pipelines, tracks interdependencies between related tasks, and enables the generation of summary reports as well as publication-quality images. As a result, the gap between experimental and computational components of a typical large-scale biology project is reduced, decreasing the time between data generation, analysis and manuscript writing. CoreFlow is being released to the scientific community as an open-sourced software package complete with proteomics-specific examples, which include corrections for incomplete isotopic labeling of peptides (SILAC) or arginine-to-proline conversion, and modeling of multiple/selected reaction monitoring (MRM/SRM) results. CoreFlow was purposely designed as an environment for programmers to rapidly perform data analysis. These analyses are assembled into project-specific workflows that are readily shared with biologists to guide the next stages of experimentation. Its simple yet powerful interface provides a structure where scripts can be written and tested virtually simultaneously to shorten the life cycle of code development for a particular task. The scripts are exposed at every step so that a user can quickly see the relationships between the data, the assumptions that have been made, and the manipulations that have been performed. Since the scripts use commonly available programming languages, they can easily be
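
    The CoreFlow script interface itself is not reproduced in this record. As a minimal sketch of the workflow pattern it describes (scripts pulling experiment data from a relational database, applying a correction, and writing results back), assuming a hypothetical measurements table and using Python with sqlite3 as a stand-in for the MySQL backend:

```python
# Minimal sketch of a pipeline step in the style described for CoreFlow
# (not CoreFlow's actual API): read peptide intensities from a relational
# database, apply a simple labeling-efficiency correction, write results back.
# The table name, columns, and correction factor are illustrative assumptions.
import sqlite3

LABELING_EFFICIENCY = 0.95  # assumed fraction of peptides carrying the heavy label

def correct_intensities(db_path: str) -> None:
    con = sqlite3.connect(db_path)
    cur = con.cursor()
    cur.execute("CREATE TABLE IF NOT EXISTS corrected (peptide TEXT, ratio REAL)")
    # Raw heavy/light intensities, assumed to be stored by an upstream step.
    rows = cur.execute(
        "SELECT peptide, heavy_intensity, light_intensity FROM measurements").fetchall()
    for peptide, heavy, light in rows:
        corrected_heavy = heavy / LABELING_EFFICIENCY  # undo incomplete labeling
        cur.execute("INSERT INTO corrected VALUES (?, ?)",
                    (peptide, corrected_heavy / light))
    con.commit()
    con.close()
```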

  4. Improved methodologies for continuous-flow analysis of stable water isotopes in ice cores

    NASA Astrophysics Data System (ADS)

    Jones, Tyler R.; White, James W. C.; Steig, Eric J.; Vaughn, Bruce H.; Morris, Valerie; Gkinis, Vasileios; Markle, Bradley R.; Schoenemann, Spruce W.

    2017-02-01

    Water isotopes in ice cores are used as a climate proxy for local temperature and regional atmospheric circulation as well as evaporative conditions in moisture source regions. Traditional measurements of water isotopes have been achieved using magnetic sector isotope ratio mass spectrometry (IRMS). However, a number of recent studies have shown that laser absorption spectrometry (LAS) performs as well or better than IRMS. The new LAS technology has been combined with continuous-flow analysis (CFA) to improve data density and sample throughput in numerous prior ice coring projects. Here, we present a comparable semi-automated LAS-CFA system for measuring high-resolution water isotopes of ice cores. We outline new methods for partitioning both system precision and mixing length into liquid and vapor components - useful measures for defining and improving the overall performance of the system. Critically, these methods take into account the uncertainty of depth registration that is not present in IRMS nor fully accounted for in other CFA studies. These analyses are achieved using samples from a South Pole firn core, a Greenland ice core, and the West Antarctic Ice Sheet (WAIS) Divide ice core. The measurement system utilizes a 16-position carousel contained in a freezer to consecutively deliver ˜ 1 m × 1.3 cm2 ice sticks to a temperature-controlled melt head, where the ice is converted to a continuous liquid stream and eventually vaporized using a concentric nebulizer for isotopic analysis. An integrated delivery system for water isotope standards is used for calibration to the Vienna Standard Mean Ocean Water (VSMOW) scale, and depth registration is achieved using a precise overhead laser distance device with an uncertainty of ±0.2 mm. As an added check on the system, we perform inter-lab LAS comparisons using WAIS Divide ice samples, a corroboratory step not taken in prior CFA studies. The overall results are important for substantiating data obtained from LAS
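
    The record mentions calibration to the VSMOW scale via an integrated standards delivery system but does not give the routine. A minimal sketch, assuming a conventional two-point linear calibration against two working standards with known VSMOW-referenced values (all numbers below are illustrative):

```python
# Minimal sketch (not the authors' code): two-point linear calibration of
# measured delta values onto the VSMOW scale, the usual approach when two
# laboratory standards bracket the expected sample range.
import numpy as np

# Assumed known (true) and measured values for two working standards, in permil.
true_std = np.array([-10.0, -40.0])
meas_std = np.array([-9.4, -38.8])

# Fit measured -> true with a straight line (slope, intercept).
slope, intercept = np.polyfit(meas_std, true_std, 1)

def to_vsmow(measured_delta):
    """Map raw instrument delta values onto the VSMOW scale."""
    return slope * np.asarray(measured_delta) + intercept

print(to_vsmow([-20.3, -31.7]))  # calibrated sample values, permil
```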

  5. Characterizing core-periphery structure of complex network by h-core and fingerprint curve

    NASA Astrophysics Data System (ADS)

    Li, Simon S.; Ye, Adam Y.; Qi, Eric P.; Stanley, H. Eugene; Ye, Fred Y.

    2018-02-01

    It is proposed that the core-periphery structure of complex networks can be simulated by h-cores and fingerprint curves. While the features of the core structure are characterized by the h-core, the features of the periphery structure are visualized by a rose or spiral curve serving as the fingerprint curve linked to entire-network parameters. It is suggested that a complex network can be approached by h-core and rose curves as a first-order Fourier approach, where the core-periphery structure is characterized by five parameters: network h-index, network radius, degree power, network density and average clustering coefficient. The simulation resembles a first-order Fourier-like analysis.
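
    The record names the network h-index as one of the five core-periphery parameters without spelling out its computation. A minimal sketch, assuming the usual degree-sequence definition (the largest h such that at least h nodes have degree of at least h):

```python
# Minimal sketch (not the authors' code): compute the network h-index from an
# adjacency list; nodes with degree >= h form the h-core in this degree-based sense.
def network_h_index(adjacency: dict) -> int:
    degrees = sorted((len(nbrs) for nbrs in adjacency.values()), reverse=True)
    h = 0
    for i, d in enumerate(degrees, start=1):
        if d >= i:
            h = i
        else:
            break
    return h

# Toy undirected graph given as an adjacency list.
g = {1: {2, 3, 4, 5}, 2: {1, 3, 4}, 3: {1, 2, 4}, 4: {1, 2, 3}, 5: {1}}
print(network_h_index(g))  # -> 3
```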

  6. Computed Tomography Scanning and Geophysical Measurements of Core from the Coldstream 1MH Well

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crandall, Dustin M.; Brown, Sarah; Moore, Johnathan E.

    The computed tomography (CT) facilities and the Multi-Sensor Core Logger (MSCL) at the National Energy Technology Laboratory (NETL) Morgantown, West Virginia site were used to characterize core of the Marcellus Shale from a vertical well, the Coldstream 1MH Well in Clearfield County, PA. The core consists primarily of the Marcellus Shale from a depth of 7,002 to 7,176 ft. The primary impetus of this work is a collaboration between West Virginia University (WVU) and NETL to characterize core from multiple wells to better understand the structure and variation of the Marcellus and Utica shale formations. As part of this effort, bulk scans of core were obtained from the Coldstream 1MH well, provided by the Energy Corporation of America (now Greylock Energy). This report, and the associated scans, provide detailed datasets not typically available from unconventional shales for analysis. The resultant datasets are presented in this report, and can be accessed from NETL's Energy Data eXchange (EDX) online system using the following link: https://edx.netl.doe.gov/dataset/coldstream-1mh-well. All equipment and techniques used were non-destructive, enabling future examinations to be performed on these cores. None of the equipment used was suitable for direct visualization of the shale pore space, although fractures and discontinuities were detectable with the methods tested. Low resolution CT imagery with the NETL medical CT scanner was performed on the entire core. Qualitative analysis of the medical CT images, coupled with x-ray fluorescence (XRF), P-wave, and magnetic susceptibility measurements from the MSCL, was useful in identifying zones of interest for more detailed analysis as well as fractured zones. En echelon fractures were observed at 7,100 ft and were CT scanned using NETL's industrial CT scanner at higher resolution. The ability to quickly identify key areas for more detailed study with higher resolution will save time and resources in future

  7. QUALITY ASSURANCE GUIDELINES FOR LABORATORIES PERFORMING FORENSIC ANALYSIS OF CHEMICAL TERRORISM

    EPA Science Inventory

    The Scientific Working Group on Forensic Analysis of Chemical Terrorism (SWGFACT) has developed the following quality assurance guidelines to provide laboratories engaged in forensic analysis of chemical evidence associated with terrorism a framework to implement a quality assura...

  8. Digital immunohistochemistry platform for the staining variation monitoring based on integration of image and statistical analyses with laboratory information system.

    PubMed

    Laurinaviciene, Aida; Plancoulaine, Benoit; Baltrusaityte, Indra; Meskauskas, Raimundas; Besusparis, Justinas; Lesciute-Krilaviciene, Daiva; Raudeliunas, Darius; Iqbal, Yasir; Herlin, Paulette; Laurinavicius, Arvydas

    2014-01-01

    Digital immunohistochemistry (IHC) is one of the most promising applications brought by new generation image analysis (IA). While conventional IHC staining quality is monitored by semi-quantitative visual evaluation of tissue controls, IA may require more sensitive measurement. We designed an automated system to digitally monitor IHC multi-tissue controls, based on SQL-level integration of laboratory information system with image and statistical analysis tools. Consecutive sections of TMA containing 10 cores of breast cancer tissue were used as tissue controls in routine Ki67 IHC testing. Ventana slide label barcode ID was sent to the LIS to register the serial section sequence. The slides were stained and scanned (Aperio ScanScope XT), IA was performed by the Aperio/Leica Colocalization and Genie Classifier/Nuclear algorithms. SQL-based integration ensured automated statistical analysis of the IA data by the SAS Enterprise Guide project. Factor analysis and plot visualizations were performed to explore slide-to-slide variation of the Ki67 IHC staining results in the control tissue. Slide-to-slide intra-core IHC staining analysis revealed rather significant variation of the variables reflecting the sample size, while Brown and Blue Intensity were relatively stable. To further investigate this variation, the IA results from the 10 cores were aggregated to minimize tissue-related variance. Factor analysis revealed association between the variables reflecting the sample size detected by IA and Blue Intensity. Since the main feature to be extracted from the tissue controls was staining intensity, we further explored the variation of the intensity variables in the individual cores. MeanBrownBlue Intensity ((Brown+Blue)/2) and DiffBrownBlue Intensity (Brown-Blue) were introduced to better contrast the absolute intensity and the colour balance variation in each core; relevant factor scores were extracted. Finally, tissue-related factors of IHC staining variance were

  9. Digital immunohistochemistry platform for the staining variation monitoring based on integration of image and statistical analyses with laboratory information system

    PubMed Central

    2014-01-01

    Background Digital immunohistochemistry (IHC) is one of the most promising applications brought by new generation image analysis (IA). While conventional IHC staining quality is monitored by semi-quantitative visual evaluation of tissue controls, IA may require more sensitive measurement. We designed an automated system to digitally monitor IHC multi-tissue controls, based on SQL-level integration of laboratory information system with image and statistical analysis tools. Methods Consecutive sections of TMA containing 10 cores of breast cancer tissue were used as tissue controls in routine Ki67 IHC testing. Ventana slide label barcode ID was sent to the LIS to register the serial section sequence. The slides were stained and scanned (Aperio ScanScope XT), IA was performed by the Aperio/Leica Colocalization and Genie Classifier/Nuclear algorithms. SQL-based integration ensured automated statistical analysis of the IA data by the SAS Enterprise Guide project. Factor analysis and plot visualizations were performed to explore slide-to-slide variation of the Ki67 IHC staining results in the control tissue. Results Slide-to-slide intra-core IHC staining analysis revealed rather significant variation of the variables reflecting the sample size, while Brown and Blue Intensity were relatively stable. To further investigate this variation, the IA results from the 10 cores were aggregated to minimize tissue-related variance. Factor analysis revealed association between the variables reflecting the sample size detected by IA and Blue Intensity. Since the main feature to be extracted from the tissue controls was staining intensity, we further explored the variation of the intensity variables in the individual cores. MeanBrownBlue Intensity ((Brown+Blue)/2) and DiffBrownBlue Intensity (Brown-Blue) were introduced to better contrast the absolute intensity and the colour balance variation in each core; relevant factor scores were extracted. Finally, tissue-related factors of IHC
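
    The two derived intensity variables are defined directly in the record; a minimal sketch of computing them per core, with illustrative column names and values:

```python
# Minimal sketch (not the platform's code): derive MeanBrownBlue and
# DiffBrownBlue per tissue core from image-analysis output. Values are
# illustrative placeholders.
records = [
    {"core": 1, "brown": 142.0, "blue": 118.0},
    {"core": 2, "brown": 131.5, "blue": 120.2},
]

for r in records:
    mean_bb = (r["brown"] + r["blue"]) / 2.0   # MeanBrownBlue: absolute intensity
    diff_bb = r["brown"] - r["blue"]           # DiffBrownBlue: colour balance
    print(r["core"], round(mean_bb, 1), round(diff_bb, 1))
```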

  10. Economic Analysis of Alternatives for PC Upgrade of OR Department Laboratory

    DTIC Science & Technology

    1990-09-01

    Economic Analysis of Alternatives for PC Upgrade of OR Department Laboratory, by Chen Lung-shan, September 1990. Thesis Advisor: Thomas E. Halwachs. Approved for public release; distribution is unlimited.

  11. An Interpretative Phenomenological Analysis of the Common Core Standards Program in the State of South Dakota

    ERIC Educational Resources Information Center

    Alase, Abayomi

    2017-01-01

    This interpretative phenomenological analysis (IPA) study investigated and interpreted the Common Core State Standards program (the phenomenon) that has been the dominating topic of discussions amongst educators all across the country since the inauguration of the program in 2014/2015 school session. Common Core State Standards (CCSS) was a…

  12. Development of Analytical Protocols For Organics and Isotopes Analysis on the 2009 MARS Science Laboratory.

    NASA Technical Reports Server (NTRS)

    Mahaffy, P. R.

    2006-01-01

    The Mars Science Laboratory, under development for launch in 2009, is designed to explore and quantitatively assess a local region on Mars as a potential habitat for present or past life. Its ambitious goals are to (1) assess the past or present biological potential of the target environment, (2) characterize the geology and geochemistry at the MSL landing site, and (3) investigate planetary processes that influence habitability. The planned capabilities of the rover payload will enable a comprehensive search for organic molecules, a determination of definitive mineralogy of sampled rocks and fines, chemical and isotopic analysis of both atmospheric and solid samples, and precision isotope measurements of several volatile elements. A range of contact and remote surface and subsurface survey tools will establish context for these measurements and will facilitate sample identification and selection. The Sample Analysis at Mars (SAM) suite of MSL addresses several of the mission's core measurement goals. It includes a gas chromatograph, a mass spectrometer, and a tunable laser spectrometer. These instruments will be designed to analyze either atmospheric samples or gases extracted from solid phase samples such as rocks and fines. We will describe the range of measurement protocols under development and study by the SAM engineering and science teams for use on the surface of Mars.

  13. The total laboratory solution: a new laboratory E-business model based on a vertical laboratory meta-network.

    PubMed

    Friedman, B A

    2001-08-01

    Major forces are now reshaping all businesses on a global basis, including the healthcare and clinical laboratory industries. One of the major forces at work is information technology (IT), which now provides the opportunity to create a new economic and business model for the clinical laboratory industry based on the creation of an integrated vertical meta-network, referred to here as the "total laboratory solution" (TLS). Participants at the most basic level of such a network would include a hospital-based laboratory, a reference laboratory, a laboratory information system/application service provider/laboratory portal vendor, an in vitro diagnostic manufacturer, and a pharmaceutical/biotechnology manufacturer. It is suggested that each of these participants would add value to the network primarily in its area of core competency. Subvariants of such a network have evolved over recent years, but a TLS comprising all or most of these participants does not exist at this time. Although the TLS, enabled by IT and closely akin to the various e-businesses that are now taking shape, offers many advantages from a theoretical perspective over the current laboratory business model, its success will depend largely on (a) market forces, (b) how the collaborative networks are organized and managed, and (c) whether the network can offer healthcare organizations higher quality testing services at lower cost. If the concept is successful, new demands will be placed on hospital-based laboratory professionals to shift the range of professional services that they offer toward clinical consulting, integration of laboratory information from multiple sources, and laboratory information management. These information management and integration tasks can only increase in complexity in the future as new genomic and proteomics testing modalities are developed and come on-line in clinical laboratories.

  14. A Methodology for Loading the Advanced Test Reactor Driver Core for Experiment Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cowherd, Wilson M.; Nielsen, Joseph W.; Choe, Dong O.

    In support of experiments in the ATR, a new methodology was devised for loading the ATR Driver Core. This methodology will replace the existing methodology used by the INL Neutronic Analysis group to analyze experiments. This paper presents the as-run analysis for ATR Cycle 152B, specifically comparing measured lobe powers with eigenvalue calculations.

  15. Environmental Response Laboratory Network Membership and Benefits

    EPA Pesticide Factsheets

    Member laboratories must meet core requirements including quality systems, policies and procedures, sample and data management, and analytical capabilities. Benefits include training and exercise opportunities, information sharing and technical support.

  16. Core-valence stockholder AIM analysis and its connection to nonadiabatic effects in small molecules.

    PubMed

    Amaral, Paulo H R; Mohallem, José R

    2017-05-21

    A previous theory of separation of motions of core and valence fractions of electrons in a molecule [J. R. Mohallem et al., Chem. Phys. Lett. 501, 575 (2011)] is invoked as basis for the useful concept of Atoms-in-Molecules (AIM) in the stockholder scheme. The output is a new tool for the analysis of the chemical bond that identifies core and valence electron density fractions (core-valence stockholder AIM (CVSAIM)). One-electron effective potentials for each atom are developed, which allow the identification of the parts of the AIM which move along with the nuclei (cores). This procedure results in a general method for obtaining effective masses that yields accurate non-adiabatic corrections to vibrational energies, necessary to attain cm−1 accuracy in molecular spectroscopy. The clear-cut determination of the core masses is exemplified for either homonuclear (H2+, H2) or heteronuclear (HeH+, LiH) molecules. The connection of CVSAIM with independent physically meaningful quantities can resume the question of whether they are observable or not.

  17. Core-valence stockholder AIM analysis and its connection to nonadiabatic effects in small molecules

    PubMed Central

    Amaral, Paulo H. R.; Mohallem, José R.

    2017-01-01

    A previous theory of separation of motions of core and valence fractions of electrons in a molecule [J. R. Mohallem et al., Chem. Phys. Lett. 501, 575 (2011)] is invoked as basis for the useful concept of Atoms-in-Molecules (AIM) in the stockholder scheme. The output is a new tool for the analysis of the chemical bond that identifies core and valence electron density fractions (core-valence stockholder AIM (CVSAIM)). One-electron effective potentials for each atom are developed, which allow the identification of the parts of the AIM which move along with the nuclei (cores). This procedure results in a general method for obtaining effective masses that yields accurate non-adiabatic corrections to vibrational energies, necessary to attain cm−1 accuracy in molecular spectroscopy. The clear-cut determination of the core masses is exemplified for either homonuclear (H2+, H2) or heteronuclear (HeH+, LiH) molecules. The connection of CVSAIM with independent physically meaningful quantities can resume the question of whether they are observable or not. PMID:28527456

  18. Generation and Characterization of States of Matter at Solar Core Conditions

    NASA Astrophysics Data System (ADS)

    Bachmann, Benjamin

    2016-10-01

    The equation-of-state (EOS) of matter at solar core conditions is important to stellar evolution models and understanding the origin of high Z elements. Temperatures, densities and pressures of stellar cores are, however, orders of magnitude greater than those obtained in state-of-the-art laboratory EOS experiments and therefore such conditions have been limited to observational astronomy and theoretical models. Here we present a method to generate and diagnose these conditions in the laboratory, which is the first step towards characterizing the EOS of such extreme states of matter. By launching a converging shock wave into a deuterated plastic sphere (CD2) we produce solar core conditions (R/RSun < 0.2), which are initiated when the shock reaches the center of the CD2 sphere and extend during transit of the reflected wave until the temperature drops to a level where the neutron production and x-ray self emission drop below the threshold levels of the detectors. These conditions are diagnosed by both the neutron spectral data from D-D nuclear reactions and the temporal, spatial, and spectral x-ray emission data. We will discuss how these observables can be measured and used to help our understanding of dense plasma states that reach well into the thermonuclear regime of stellar cores. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was supported by Laboratory Directed Research and Development Grant No. 13-ERD-073.

  19. A highly efficient multi-core algorithm for clustering extremely large datasets

    PubMed Central

    2010-01-01

    Background In recent years, the demand for computational power in computational biology has increased due to rapidly growing data sets from microarray and other high-throughput technologies. This demand is likely to increase. Standard algorithms for analyzing data, such as cluster algorithms, need to be parallelized for fast processing. Unfortunately, most approaches for parallelizing algorithms largely rely on network communication protocols connecting and requiring multiple computers. One answer to this problem is to utilize the intrinsic capabilities in current multi-core hardware to distribute the tasks among the different cores of one computer. Results We introduce a multi-core parallelization of the k-means and k-modes cluster algorithms based on the design principles of transactional memory for clustering gene expression microarray type data and categorical SNP data. Our new shared memory parallel algorithms prove to be highly efficient. We demonstrate their computational power and show their utility in cluster stability and sensitivity analysis employing repeated runs with slightly changed parameters. Computation speed of our Java based algorithm was increased by a factor of 10 for large data sets while preserving computational accuracy compared to single-core implementations and a recently published network based parallelization. Conclusions Most desktop computers and even notebooks provide at least dual-core processors. Our multi-core algorithms show that using modern algorithmic concepts, parallelization makes it possible to perform even such laborious tasks as cluster sensitivity and cluster number estimation on the laboratory computer. PMID:20370922
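
    The paper's shared-memory implementation is in Java and built on transactional-memory design principles, which are not reproduced here. A minimal sketch of the general idea, distributing the k-means assignment step across the cores of one machine using Python's multiprocessing:

```python
# Minimal sketch (not the paper's Java implementation): parallelize the
# assignment step of k-means across the cores of a single machine, in the
# spirit of the shared-memory approach described in the record.
import numpy as np
from multiprocessing import Pool

def nearest_centroid(args):
    chunk, centroids = args
    # Squared Euclidean distance of each point in the chunk to each centroid.
    d = ((chunk[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)

def parallel_assign(points, centroids, n_workers=4):
    chunks = np.array_split(points, n_workers)
    with Pool(n_workers) as pool:
        labels = pool.map(nearest_centroid, [(c, centroids) for c in chunks])
    return np.concatenate(labels)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts = rng.normal(size=(10_000, 5))
    ctr = rng.normal(size=(8, 5))
    print(parallel_assign(pts, ctr)[:10])
```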

  20. Incorporating Basic Optical Microscopy in the Instrumental Analysis Laboratory

    ERIC Educational Resources Information Center

    Flowers, Paul A.

    2011-01-01

    A simple and versatile approach to incorporating basic optical microscopy in the undergraduate instrumental analysis laboratory is described. Attaching a miniature CCD spectrometer to the video port of a standard compound microscope yields a visible microspectrophotometer suitable for student investigations of fundamental spectrometry concepts,…

  1. Quality Assessment of Urinary Stone Analysis: Results of a Multicenter Study of Laboratories in Europe

    PubMed Central

    Siener, Roswitha; Buchholz, Noor; Daudon, Michel; Hess, Bernhard; Knoll, Thomas; Osther, Palle J.; Reis-Santos, José; Sarica, Kemal; Traxer, Olivier; Trinchieri, Alberto

    2016-01-01

    After stone removal, accurate analysis of urinary stone composition is the most crucial laboratory diagnostic procedure for the treatment and recurrence prevention in the stone-forming patient. The most common techniques for routine analysis of stones are infrared spectroscopy, X-ray diffraction and chemical analysis. The aim of the present study was to assess the quality of urinary stone analysis of laboratories in Europe. Nine laboratories from eight European countries participated in six quality control surveys for urinary calculi analyses of the Reference Institute for Bioanalytics, Bonn, Germany, between 2010 and 2014. Each participant received the same blinded test samples for stone analysis. A total of 24 samples, comprising pure substances and mixtures of two or three components, were analysed. The evaluation of the quality of the laboratory in the present study was based on the attainment of 75% of the maximum total points, i.e. 99 points. The methods of stone analysis used were infrared spectroscopy (n = 7), chemical analysis (n = 1) and X-ray diffraction (n = 1). In the present study only 56% of the laboratories, four using infrared spectroscopy and one using X-ray diffraction, fulfilled the quality requirements. According to the current standard, chemical analysis is considered to be insufficient for stone analysis, whereas infrared spectroscopy or X-ray diffraction is mandatory. However, the poor results of infrared spectroscopy highlight the importance of equipment, reference spectra and qualification of the staff for an accurate analysis of stone composition. Regular quality control is essential in carrying out routine stone analysis. PMID:27248840

  2. Quality Assessment of Urinary Stone Analysis: Results of a Multicenter Study of Laboratories in Europe.

    PubMed

    Siener, Roswitha; Buchholz, Noor; Daudon, Michel; Hess, Bernhard; Knoll, Thomas; Osther, Palle J; Reis-Santos, José; Sarica, Kemal; Traxer, Olivier; Trinchieri, Alberto

    2016-01-01

    After stone removal, accurate analysis of urinary stone composition is the most crucial laboratory diagnostic procedure for the treatment and recurrence prevention in the stone-forming patient. The most common techniques for routine analysis of stones are infrared spectroscopy, X-ray diffraction and chemical analysis. The aim of the present study was to assess the quality of urinary stone analysis of laboratories in Europe. Nine laboratories from eight European countries participated in six quality control surveys for urinary calculi analyses of the Reference Institute for Bioanalytics, Bonn, Germany, between 2010 and 2014. Each participant received the same blinded test samples for stone analysis. A total of 24 samples, comprising pure substances and mixtures of two or three components, were analysed. The evaluation of the quality of the laboratory in the present study was based on the attainment of 75% of the maximum total points, i.e. 99 points. The methods of stone analysis used were infrared spectroscopy (n = 7), chemical analysis (n = 1) and X-ray diffraction (n = 1). In the present study only 56% of the laboratories, four using infrared spectroscopy and one using X-ray diffraction, fulfilled the quality requirements. According to the current standard, chemical analysis is considered to be insufficient for stone analysis, whereas infrared spectroscopy or X-ray diffraction is mandatory. However, the poor results of infrared spectroscopy highlight the importance of equipment, reference spectra and qualification of the staff for an accurate analysis of stone composition. Regular quality control is essential in carrying out routine stone analysis.

  3. Measurement and Analysis of Structural Integrity of Reactor Core Support Structure in Pressurized Water Reactor (PWR) Plant

    NASA Astrophysics Data System (ADS)

    Ansari, Saleem A.; Haroon, Muhammad; Rashid, Atif; Kazmi, Zafar

    2017-02-01

    Extensive calculations and measurements of flow-induced vibrations (FIV) of reactor internals were made in a PWR plant to assess the structural integrity of the reactor core support structure against coolant flow. The work was done to meet the requirements of the Fukushima Response Action Plan (FRAP) for enhancement of reactor safety, and the regulatory guide RG-1.20. For the core surveillance measurements the Reactor Internals Vibration Monitoring System (IVMS) has been developed based on detailed neutron noise analysis of the flux signals from the four ex-core neutron detectors. The natural frequencies, displacement and mode shapes of the reactor core barrel (CB) motion were determined with the help of the IVMS. The random pressure fluctuations in the reactor coolant flow due to turbulence forces have been identified as the predominant cause of beam-mode deflection of the CB. Dynamic FIV calculations were also made to supplement the core surveillance measurements. The calculational package employed computational fluid dynamics, mode shape analysis, calculation of power spectral densities of the flow and pressure fields, and the structural response to random flow excitation forces. The dynamic loads and stiffness of the Hold-Down Spring that keeps the core structure in position against upward coolant thrust were also determined by noise measurements. In addition, the boron concentration in the primary coolant at any time of the core cycle has been determined with the IVMS.
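
    The IVMS software itself is not described at code level in this record. As a generic illustration of the spectral step involved, a minimal sketch, on synthetic data, of estimating the power spectral density of a digitized ex-core detector signal with Welch's method and reading off the dominant peak; the sampling rate and the 8 Hz line are assumed values, not plant data:

```python
# Minimal sketch (not the plant's IVMS software): Welch PSD estimate of a noisy
# detector signal; in beam-mode analysis the dominant peak is read as the
# core-barrel natural frequency. The signal here is synthetic for illustration.
import numpy as np
from scipy.signal import welch

FS = 200.0                      # Hz, assumed sampling rate of the detector signal
t = np.arange(0, 60.0, 1.0 / FS)
rng = np.random.default_rng(1)
# Synthetic signal: broadband turbulence noise plus a weak 8 Hz beam-mode line.
signal = rng.normal(scale=1.0, size=t.size) + 0.3 * np.sin(2 * np.pi * 8.0 * t)

freqs, psd = welch(signal, fs=FS, nperseg=4096)
print(f"dominant peak at {freqs[np.argmax(psd)]:.2f} Hz")
```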

  4. Multivariate analysis and geochemical approach for assessment of metal pollution state in sediment cores.

    PubMed

    Jamshidi-Zanjani, Ahmad; Saeedi, Mohsen

    2017-07-01

    Vertical distribution of metals (Cu, Zn, Cr, Fe, Mn, Pb, Ni, Cd, and Li) in four sediment core samples (C1, C2, C3, and C4) from Anzali international wetland, located southwest of the Caspian Sea, was examined. The background concentration of each metal was calculated according to different statistical approaches. The results of multivariate statistical analysis showed that Fe and Mn might have a significant role in the fate of Ni and Zn in the sediment core samples. Different sediment quality indexes were utilized to assess metal pollution in the sediment cores. Moreover, a new sediment quality index named aggregative toxicity index (ATI), based on sediment quality guidelines (SQGs), was developed to assess the degree of metal toxicity in an aggregative manner. The increasing pattern of metal pollution and toxicity degree in the upper layers of the core samples indicated increasing effects of anthropogenic sources in the study area.
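
    The record does not give the formula of the proposed aggregative toxicity index, so it is not reproduced here. As a sketch of the kind of background-normalized sediment quality index such assessments build on, the standard geoaccumulation index Igeo = log2(Cn/(1.5·Bn)) with illustrative concentrations:

```python
# Minimal sketch (not the paper's ATI): the widely used geoaccumulation index,
# Igeo = log2(Cn / (1.5 * Bn)), where Cn is the measured metal concentration in
# a sediment layer and Bn its background concentration; the factor 1.5 buffers
# natural variability in the background. Values here are illustrative, in mg/kg.
import math

def igeo(measured: float, background: float) -> float:
    return math.log2(measured / (1.5 * background))

layers = {"surface": 85.0, "mid-core": 52.0, "bottom": 31.0}  # e.g. Pb
BACKGROUND_PB = 30.0

for layer, c in layers.items():
    print(f"{layer}: Igeo = {igeo(c, BACKGROUND_PB):.2f}")
```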

  5. Modular workcells: modern methods for laboratory automation.

    PubMed

    Felder, R A

    1998-12-01

    Laboratory automation is beginning to become an indispensable survival tool for laboratories facing difficult market competition. However, estimates suggest that only 8% of laboratories will be able to afford total laboratory automation systems. Therefore, automation vendors have developed alternative hardware configurations called 'modular automation', to fit the smaller laboratory. Modular automation consists of consolidated analyzers, integrated analyzers, modular workcells, and pre- and post-analytical automation. These terms will be defined in this paper. Using a modular automation model, the automated core laboratory will become a site where laboratory data is evaluated by trained professionals to provide diagnostic information to practising physicians. Modern software information management and process control tools will complement modular hardware. Proper standardization that will allow vendor-independent modular configurations will assure success of this revolutionary new technology.

  6. Physiological and psychological responses to outdoor vs. laboratory cycling.

    PubMed

    Mieras, Molly E; Heesch, Matthew W S; Slivka, Dustin R

    2014-08-01

    The purpose of this study was to determine the physiological and psychological responses to laboratory vs. outdoor cycling. Twelve recreationally trained male cyclists participated in an initial descriptive testing session and 2 experimental trials consisting of 1 laboratory and 1 outdoor session, in a randomized order. Participants were given a standardized statement instructing them to give the same perceived effort for both the laboratory and outdoor 40-km trials. Variables measured include power output, heart rate (HR), core temperature, skin temperature, body weight, urine specific gravity (USG), Rating of Perceived Exertion (RPE), attentional focus, and environmental conditions. Wind speed was higher in the outdoor trial than in the laboratory trial (2.5 ± 0.6 vs. 0.0 ± 0.0 m·s-1, p = 0.02) whereas all other environmental conditions were similar. Power output (208.1 ± 10.2 vs. 163.4 ± 11.8 W, respectively, p < 0.001) and HR (152 ± 4 and 143 ± 6 b·min-1, respectively, p = 0.04) were higher in the outdoor trial than in the laboratory trial. Core temperature was similar, whereas skin temperature was cooler during the outdoor trial than during the laboratory trial (31.4 ± 0.3 vs. 33.0 ± 0.2° C, respectively, p < 0.001), thus creating a larger thermal gradient between the core and skin outdoors. No significant differences in body weight, USG, RPE, or attentional focus were observed between trials. These data indicate that outdoor cycling allows cyclists to exercise at a higher intensity than in laboratory cycling, despite similar environmental conditions and perceived exertion. In light of this, cyclists may want to ride at a higher perceived exertion in indoor settings to acquire the same benefit as they would from an outdoor ride.

  7. Idaho National Laboratory Quarterly Event Performance Analysis FY 2013 4th Quarter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, Lisbeth A.

    2013-11-01

    This report is published quarterly by the Idaho National Laboratory (INL) Performance Assurance Organization. The Department of Energy Occurrence Reporting and Processing System (ORPS) as prescribed in DOE Order 232.2 “Occurrence Reporting and Processing of Operations Information” requires a quarterly analysis of events, both reportable and not reportable for the previous twelve months. This report is the analysis of occurrence reports and deficiency reports (including not reportable events) identified at the Idaho National Laboratory (INL) during the period of October 2012 through September 2013.

  8. The makeover of the Lakeshore General Hospital laboratories.

    PubMed

    Estioko-Taimuri, Teresa

    2006-01-31

    This article describes the expansion and reorganization of a moderate-sized Canadian laboratory from Day One to "Live Day." The key factors to the success of this project were organized planning by the laboratory staff and the introduction of core lab theories, team building, and organized training sessions. The successful makeover resulted in improved turnaround time for STAT tests, especially those coming from the Emergency Unit. The efforts of the laboratory personnel toward the improvement of laboratory services, in spite of budget, human resources constraints, and resistance to change, are addressed.

  9. Revising Laboratory Work: Sociological Perspectives on the Science Classroom

    ERIC Educational Resources Information Center

    Jobér, Anna

    2017-01-01

    This study uses sociological perspectives to analyse one of the core practices in science education: school children's and students' laboratory work. Applying an ethnographic approach to the laboratory work done by pupils at a Swedish compulsory school, data were generated through observations, field notes, interviews, and a questionnaire. The…

  10. Core-core and core-valence correlation

    NASA Technical Reports Server (NTRS)

    Bauschlicher, Charles W., Jr.; Langhoff, Stephen R.; Taylor, Peter R.

    1988-01-01

    The effect of (1s) core correlation on properties and energy separations was analyzed using full configuration-interaction (FCI) calculations. The Be 1S-1P, C 3P-5S, and CH+ 1Σ+-1Π separations, and the CH+ spectroscopic constants, dipole moment, and 1Σ+-1Π transition dipole moment were studied. The results of the FCI calculations are compared to those obtained using approximate methods. In addition, the generation of atomic natural orbital (ANO) basis sets, as a method for contracting a primitive basis set for both valence and core correlation, is discussed. When both core-core and core-valence correlation are included in the calculation, no suitable truncated CI approach consistently reproduces the FCI, and contraction of the basis set is very difficult. If the (nearly constant) core-core correlation is eliminated, and only the core-valence correlation is included, CASSCF/MRCI approaches reproduce the FCI results and basis set contraction is significantly easier.

  11. Laboratory Directed Research and Development FY2001 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Ayat, R

    2002-06-20

    Established by Congress in 1991, the Laboratory Directed Research and Development (LDRD) Program provides the Department of Energy (DOE)/National Nuclear Security Administration (NNSA) laboratories, like Lawrence Livermore National Laboratory (LLNL or the Laboratory), with the flexibility to invest up to 6% of their budget in long-term, high-risk, and potentially high payoff research and development (R&D) activities to support the DOE/NNSA's national security missions. By funding innovative R&D, the LDRD Program at LLNL develops and extends the Laboratory's intellectual foundations and maintains its vitality as a premier research institution. As proof of the Program's success, many of the research thrusts that started many years ago under LDRD sponsorship are at the core of today's programs. The LDRD Program, which serves as a proving ground for innovative ideas, is the Laboratory's most important single resource for fostering excellent science and technology for today's needs and tomorrow's challenges. Basic and applied research activities funded by LDRD enhance the Laboratory's core strengths, driving its technical vitality to create new capabilities that enable LLNL to meet DOE/NNSA's national security missions. The Program also plays a key role in building a world-class multidisciplinary workforce by engaging the Laboratory's best researchers, recruiting its future scientists and engineers, and promoting collaborations with all sectors of the larger scientific community.

  12. User’s guide to the collection and analysis of tree cores to assess the distribution of subsurface volatile organic compounds

    USGS Publications Warehouse

    Vroblesky, Don A.

    2008-01-01

    Analysis of the volatile organic compound content of tree cores is an inexpensive, rapid, simple approach to examining the distribution of subsurface volatile organic compound contaminants. The method has been shown to detect several volatile petroleum hydrocarbons and chlorinated aliphatic compounds associated with vapor intrusion and ground-water contamination. Tree cores, which are approximately 3 inches long, are obtained by using an increment borer. The cores are placed in vials and sealed. After a period of equilibration, the cores can be analyzed by headspace analysis gas chromatography. Because the roots are exposed to volatile organic compound contamination in the unsaturated zone or shallow ground water, the volatile organic compound concentrations in the tree cores are an indication of the presence of subsurface volatile organic compound contamination. Thus, tree coring can be used to detect and map subsurface volatile organic compound contamination. For comparison of tree-core data at a particular site, it is important to maintain consistent methods for all aspects of tree-core collection, handling, and analysis. Factors affecting the volatile organic compound concentrations in tree cores include the type of volatile organic compound, the tree species, the rooting depth, ground-water chemistry, the depth to the contaminated horizon, concentration differences around the trunk related to variations in the distribution of subsurface volatile organic compounds, concentration differences with depth of coring related to volatilization loss through the bark and possibly other unknown factors, dilution by rain, seasonal influences, sorption, vapor-exchange rates, and within-tree volatile organic compound degradation.

  13. Applied behavior analysis: New directions from the laboratory

    PubMed Central

    Epling, W. Frank; Pierce, W. David

    1983-01-01

    Applied behavior analysis began when laboratory-based principles were extended to humans in order to change socially significant behavior. Recent laboratory findings may have applied relevance; however, the majority of basic researchers have not clearly communicated the practical implications of their work. The present paper samples some of the new findings and attempts to demonstrate their applied importance. Schedule-induced behavior which occurs as a by-product of contingencies of reinforcement is discussed. Possible difficulties in treatment and management of induced behaviors are considered. Next, the correlation-based law of effect and the implications of relative reinforcement are explored in terms of applied examples. Relative rate of reinforcement is then extended to the literature dealing with concurrent operants. Concurrent operant models may describe human behavior of applied importance, and several techniques for modification of problem behavior are suggested. As a final concern, the paper discusses several new paradigms. While the practical importance of these models is not clear at the moment, it may be that new practical advantages will soon arise. Thus, it is argued that basic research continues to be of theoretical and practical importance to applied behavior analysis. PMID:22478574

  14. The Analysis of Seawater: A Laboratory-Centered Learning Project in General Chemistry.

    ERIC Educational Resources Information Center

    Selco, Jodye I.; Roberts, Julian L., Jr.; Wacks, Daniel B.

    2003-01-01

    Describes a sea-water analysis project that introduces qualitative and quantitative analysis methods and laboratory methods such as gravimetric analysis, potentiometric titration, ion-selective electrodes, and the use of calibration curves. Uses a problem-based cooperative teaching approach. (Contains 24 references.) (YDS)

  15. Fine Structure of the Outermost Solid Core from Analysis of PKiKP Coda Waves

    NASA Astrophysics Data System (ADS)

    Krasnoshchekov, D.; Kaazik, P.; Ovtchinnikov, V.

    2006-05-01

    Near-surface heterogeneities in the Earth's inner core have recently been confirmed to exist, and pods of partial melt or variations in seismic anisotropy, either due to orientation of iron crystals or changes in strength, were indicated as possible sources for such peculiarities. At the same time, analysis of the phase reflected from the inner core boundary (PKiKP) predicts a complex character of the reflecting discontinuity in the form of local thin transition layers, resulting in a mosaic structure of the Earth's inner core surface. Precritical PKiKP waveforms and coda waves provide the necessary seismological constraints to investigate the fine structure of the upper part of the Earth's inner core and its boundary, and rank high among the studies that have detected the described features of the solid core. PKiKP coda studies deal with weak amplitudes and subtle effects, which frequently require using a reference core-related seismic phase and array data processing, as well as eliminating as many factors as possible that bias the resulting estimates (for example, source-related inaccuracies typical of earthquake analysis). In this work we report new observations of PKiKP coda waves detected on records of a group of Underground Nuclear Explosions (UNEs) carried out in the USSR and recorded at distances from 6 to 95 degrees by stations of the world seismological network. Our dataset benefits from using accurate ground truth information on source parameters (locations, origin times, depths, etc.), requires no accounting for different source radiation patterns, and contains records corresponding to the whole range of precritical reflection, including the so-called transparent zone where amplitudes of the direct PKiKP phase are negligible. The processed dataset incorporates records from an array of sources consisting of explosions of similar magnitude carried out close together at the Semipalatinsk Test Site and recorded by stations located in Eurasia, Africa and North America. We detect PKiKP coda waves on

  16. Experimental Simulations of Methane Gas Migration through Water-Saturated Sediment Cores

    NASA Astrophysics Data System (ADS)

    Choi, J.; Seol, Y.; Rosenbaum, E. J.

    2010-12-01

    Previous numerical simulations (Jain and Juanes, 2009) showed that the mode of gas migration is mainly determined by grain size, with capillary invasion occurring preferentially in coarse-grained sediments and fracturing dominating in fine-grained sediments. This study was intended to experimentally simulate the preferential modes of gas migration in various water-saturated sediment cores. The cores compacted in the laboratory include a silica sand core (mean size of 180 μm), a silica silt core (1.7 μm), and a kaolin clay core (1.0 μm). Methane gas was injected into the core placed within an x-ray-transparent pressure vessel, which was under continuous x-ray computed tomography (CT) scanning with controlled radial (σr), axial (σa), and pore pressures (P). The CT image analysis reveals that, under a radial effective stress (σr') of 0.69 MPa and an axial effective stress (σa') of 1.31 MPa, fracturing by methane gas injection occurs in both the silt and clay cores. Fracturing initiates at a capillary pressure (Pc) of ~ 0.41 MPa and ~ 2.41 MPa for the silt and clay cores, respectively. Fracturing appears as irregular fracture networks consisting of multiple nearly invisible fine fractures, longitudinally oriented round tube-shaped conduits, or fine fractures branching off from the large conduits. For the sand core, however, only capillary invasion was observed, at or above a capillary pressure of 0.034 MPa under the confining pressure condition of σr' = 1.38 MPa and σa' = 2.62 MPa. Compared to the numerical predictions under similar confining pressure conditions, fracturing occurs at relatively larger grain sizes, which may result from lower grain-contact compression and friction caused by the loose compaction and flexible lateral boundary employed in the experiment.
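
    As a rough cross-check of the reported pressures, and not part of the study's own analysis, the Young-Laplace relation Pc = 2γcosθ/r can be inverted to estimate the equivalent pore-throat radius implied by each observed entry or fracturing pressure; the interfacial tension and contact angle below are assumed round values for a water-wet system:

```python
# Minimal sketch (not the study's model): invert the Young-Laplace relation,
# Pc = 2*gamma*cos(theta)/r, to estimate the pore-throat radius consistent with
# the capillary/fracturing pressures reported above. Gamma and theta are assumed.
import math

GAMMA = 0.072   # N/m, assumed methane-water interfacial tension
THETA = 0.0     # rad, assumed contact angle (strongly water-wet)

def equivalent_throat_radius(pc_mpa: float) -> float:
    """Pore-throat radius (m) consistent with a given entry pressure (MPa)."""
    return 2.0 * GAMMA * math.cos(THETA) / (pc_mpa * 1e6)

for label, pc in [("sand, capillary invasion", 0.034),
                  ("silt, fracture initiation", 0.41),
                  ("clay, fracture initiation", 2.41)]:
    print(f"{label}: r ~ {equivalent_throat_radius(pc) * 1e6:.2f} um")
```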

  17. Laboratory Governance: Issues for the Study Group on Regional Laboratories.

    ERIC Educational Resources Information Center

    Schultz, Thomas; Dominic, Joseph

    Background information and an analysis of issues involved in the governance of new regional educational laboratories are presented. The new laboratories are to be established through a 1984 competition administered by the National Institute of Education (NIE). The analysis is designed to assist the Study Group on Regional Laboratories to advise…

  18. Developing laboratory networks: a practical guide and application.

    PubMed

    Kirk, Carol J; Shult, Peter A

    2010-01-01

    The role of the public health laboratory (PHL) in support of public health response has expanded beyond testing to include a number of other core functions, such as emergency response, training and outreach, communications, laboratory-based surveillance, and laboratory data management. These functions can only be accomplished by a network that includes public health and other agency laboratories and clinical laboratories. It is a primary responsibility of the PHL to develop and maintain such a network. In this article, we present practical recommendations, based on 17 years of network development experience, for the development of statewide laboratory networks. These recommendations, and examples of current laboratory networks, are provided to facilitate laboratory network development in other states. The development of laboratory networks will enhance each state's public health system and is critical to the development of a robust national Laboratory Response Network.

  19. Inaccurate reporting of mineral composition by commercial stone analysis laboratories: implications for infection and metabolic stones.

    PubMed

    Krambeck, Amy E; Khan, Naseem F; Jackson, Molly E; Lingeman, James E; McAteer, James A; Williams, James C

    2010-10-01

    We determined the accuracy of stone composition analysis at commercial laboratories. A total of 25 human renal stones with infrared spectroscopy determined composition were fragmented into aliquots and studied with micro computerized tomography to ensure fragment similarity. Representative fragments of each stone were submitted to 5 commercial stone laboratories for blinded analysis. All laboratories agreed on the composition of 6 pure stones. Only 2 of 4 stones (50%) known to contain struvite were identified as struvite at all laboratories. Struvite was reported as a component by some laboratories for 4 stones previously determined not to contain struvite. Overall there was disagreement regarding struvite in 6 stones (24%). For 9 calcium oxalate stones all laboratories reported some mixture of calcium oxalate but the quantity of subtypes differed significantly among laboratories. In 6 apatite containing stones apatite was missed by the laboratories in 20% of samples. None of the laboratories identified atazanavir in a stone containing that antiviral drug. One laboratory reported protein in every sample while all others reported it in only 1. Nomenclature for apatite differed among laboratories with 1 reporting apatite as carbonate apatite and never hydroxyapatite, another never reporting carbonate apatite and always reporting hydroxyapatite, and a third reporting carbonate apatite as apatite with calcium carbonate. Commercial laboratories reliably recognize pure calculi. However, variability in the reporting of mixed calculi suggests a problem with the accuracy of stone analysis results. There is also a lack of standard nomenclature used by laboratories. Copyright © 2010 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  20. INACCURATE REPORTING OF MINERAL COMPOSITION BY COMMERCIAL STONE ANALYSIS LABORATORIES: IMPLICATIONS FOR INFECTION AND METABOLIC STONES

    PubMed Central

    Krambeck, Amy E.; Khan, Naseem F.; Jackson, Molly E.; Lingeman, James E.; McAteer, James A; Williams, James C.

    2011-01-01

    INTRODUCTION The goal of this study was to determine the accuracy of stone composition analysis by commercial laboratories. METHODS 25 human renal stones with infrared spectroscopy (IR) determined compositions were fragmented into aliquots and studied with micro-computed tomography (CT) to ensure fragment similarity. Representative fragments of each stone were submitted to 5 commercial stone laboratories for blinded analysis. RESULTS All laboratories agreed on composition for 6 pure stones. Of 4 stones known to contain struvite, only 2(50%) were identified as struvite by all laboratories. Struvite was reported as a component by some laboratories for 4 stones previously determined not to contain struvite. Overall, there was disagreement regarding struvite in 6(24%) stones. For 9 calcium oxalate (CaOx) stones, all laboratories reported some mixture of CaOx, but the quantities of subtypes differed significantly among laboratories. In 6 apatite containing stones, apatite was missed by the laboratories in 20% of the samples. None of the laboratories identified atazanavir in a stone containing that antiviral drug. One laboratory reported protein in every sample, while all others reported it in only 1 sample. Nomenclature for apatite differed among laboratories, with one reporting apatite as carbonate apatite (CA) and never hydroxyapatite (HA), another never reporting CA and always reporting HA, and a third reporting CA as apatite with calcium carbonate. CONCLUSIONS Commercial laboratories reliably recognize pure calculi; however, variability in reporting of mixed calculi suggests a problem with accuracy of stone analysis results. Furthermore, there is a lack of standard nomenclature used by laboratories. PMID:20728108

  1. Core ADHD Symptom Improvement with Atomoxetine versus Methylphenidate: A Direct Comparison Meta-Analysis

    ERIC Educational Resources Information Center

    Hazell, Philip L.; Kohn, Michael R.; Dickson, Ruth; Walton, Richard J.; Granger, Renee E.; van Wyk, Gregory W.

    2011-01-01

    Objective: Previous studies comparing atomoxetine and methylphenidate to treat ADHD symptoms have been equivocal. This noninferiority meta-analysis compared core ADHD symptom response between atomoxetine and methylphenidate in children and adolescents. Method: Selection criteria included randomized, controlled design; duration 6 weeks; and…

  2. The impact of the EUSCLE Core Set Questionnaire for the assessment of cutaneous lupus erythematosus.

    PubMed

    Kuhn, A; Patsinakidis, N; Bonsmann, G

    2010-08-01

    Epidemiological data and standard European guidelines for the diagnosis and treatment of cutaneous lupus erythematosus (CLE) are lacking in the current literature. In order to provide a standardized tool for an extensive consistent data collection, a study group of the European Society of Cutaneous Lupus Erythematosus (EUSCLE) recently developed a Core Set Questionnaire for the assessment of patients with different subtypes of CLE. The EUSCLE Core Set Questionnaire includes six sections on patient data, diagnosis, skin involvement, activity and damage of disease, laboratory analysis, and treatment. An instrument like the EUSCLE Core Set Questionnaire is essential to gain a broad and comparable data collection of patients with CLE from different European centres and to achieve consensus concerning clinical standards for the disease. The data will also be important for further characterization of the different CLE subtypes and the evaluation of therapeutic strategies; moreover, the EUSCLE Core Set Questionnaire might also be useful for the comparison of data in clinical trials. In this review, the impact of the EUSCLE Core Set Questionnaire is discussed in detail with regard to clinical and serological features as well as therapeutic modalities in CLE.

  3. Cost analysis in a clinical microbiology laboratory.

    PubMed

    Brezmes, M F; Ochoa, C; Eiros, J M

    2002-08-01

    The use of models for business management and cost control in public hospitals has led to a need for microbiology laboratories to know the real cost of the different products they offer. For this reason, a catalogue of microbiological products was prepared, and the costs (direct and indirect) for each product were analysed, along with estimated profitability. All tests performed in the microbiology laboratory of the "Virgen de la Concha" Hospital in Zamora over a 2-year period (73192 tests) were studied. The microbiological product catalogue was designed using homogeneity criteria with respect to procedures used, workloads and costs. For each product, the direct personnel costs (estimated from workloads following the method of the College of American Pathologists, 1992 version), the indirect personnel costs, the direct and indirect material costs and the portion of costs corresponding to the remaining laboratory costs (capital and structural costs) were calculated. The average product cost was 16.05 euros. The average cost of a urine culture (considered, for purposes of this study, as a relative value unit) reached 13.59 euros, with a significant difference observed between positive and negative cultures (negative urine culture, 10.72 euros; positive culture, 29.65 euros). Significant heterogeneity exists, both in the costs of different products and especially in the cost per positive test. The application of a detailed methodology of cost analysis facilitates the calculation of the real cost of microbiological products. This information provides a basic tool for establishing clinical management strategies.
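
    A minimal sketch of the kind of workload-based product costing described above (direct personnel cost derived from workload units, plus materials and an apportioned share of overhead) is shown below; the function name and all figures are illustrative placeholders, not values from the study.

        # Hedged sketch: workload-based product costing in the spirit of the approach
        # described above. All names and figures are illustrative placeholders.

        def product_cost(workload_minutes, cost_per_minute, direct_material, indirect_share):
            """Direct personnel cost from workload, plus materials and apportioned overhead."""
            direct_personnel = workload_minutes * cost_per_minute
            return direct_personnel + direct_material + indirect_share

        # Example: a hypothetical urine culture costed with made-up figures (euros).
        print(product_cost(workload_minutes=12.0, cost_per_minute=0.55,
                           direct_material=3.10, indirect_share=3.90))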

  4. Inner core rotation from event-pair analysis

    NASA Astrophysics Data System (ADS)

    Song, Xiaodong; Poupinet, Georges

    2007-09-01

    The last decade has witnessed an animated debate on whether the inner core rotation is a fact or an artifact. Here we examine the temporal change of inner core waves using a technique that compares differential travel times at the same station but between two events. The method does not require precise knowledge of earthquake locations and earth models. The pairing of the events creates a large data set for the application of statistical tools. Using measurements from 87 events in the South Sandwich Islands recorded at the College, Alaska station, we conclude that the temporal change is robust. The estimates of the temporal change range from about 0.07 to 0.10 s/decade over the past 50 yr. If only pairs with small inter-event distances, which reduce the influence of mantle heterogeneity, are used, the rates range from 0.084 to 0.098 s/decade, nearly identical to the rate inferred by Zhang et al. [Zhang, J., Song, X.D., Li, Y.C., Richards, P.G., Sun, X.L., Waldhauser, F., Inner core differential motion confirmed by earthquake waveform doublets, Science 309 (5739) (2005) 1357-1360.] from waveform doublets. The rate of the DF change seems to change with time, which may be explained by lateral variation of the inner core structure or a change in rotation rate on a decadal time scale.
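
    A rough sketch of how such a temporal rate (in s/decade) might be estimated from differential travel-time residuals by simple linear regression follows; the years and residual values are invented for illustration and are not the study's measurements.

        # Hedged sketch: estimating a temporal rate (s/decade) from differential
        # travel-time residuals by linear regression. The data below are invented.
        import numpy as np

        years = np.array([1964, 1971, 1980, 1988, 1995, 2003, 2010], dtype=float)
        ddt = np.array([0.02, 0.08, 0.15, 0.21, 0.27, 0.33, 0.40])  # residuals in seconds

        slope, intercept = np.polyfit(years, ddt, 1)  # seconds per year
        print(f"estimated rate = {slope * 10:.3f} s/decade")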

  5. Experimental investigation and CFD analysis on cross flow in the core of PMR200

    DOE PAGES

    Lee, Jeong -Hun; Yoon, Su -Jong; Cho, Hyoung -Kyu; ...

    2015-04-16

    The Prismatic Modular Reactor (PMR) is one of the major Very High Temperature Reactor (VHTR) concepts, which consists of hexagonal prismatic fuel blocks and reflector blocks made of nuclear grade graphite. However, the shape of the graphite blocks can easily be changed by neutron damage during reactor operation, and the shape change can create gaps between the blocks, inducing bypass flow. In the VHTR core, two types of gaps can be formed: a vertical gap and a horizontal gap, called the bypass gap and the cross gap, respectively. The cross gap complicates the flow field in the reactor core by connecting the coolant channel to the bypass gap, and it could lead to a loss of effective coolant flow in the fuel blocks. Thus, a cross flow experimental facility was constructed to investigate the cross flow phenomena in the core of the VHTR, and a series of experiments were carried out under varying flow rates and gap sizes. The results of the experiments were compared with CFD (Computational Fluid Dynamics) analysis results in order to verify its prediction capability for the cross flow phenomena. Fairly good agreement was seen between experimental results and CFD predictions, and the local characteristics of the cross flow were discussed in detail. Based on the calculation results, the pressure loss coefficient across the cross gap was evaluated, which is necessary for the thermo-fluid analysis of the VHTR core using a lumped parameter code.
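
    The pressure loss coefficient mentioned above is conventionally defined as K = Δp / (½ρV²); a minimal sketch of that evaluation is shown below, with placeholder values for the pressure drop, gas density, and reference velocity rather than the experiment's data.

        # Hedged sketch of the conventional loss-coefficient definition
        # K = dp / (0.5 * rho * V**2); all numbers are placeholders.

        def loss_coefficient(dp_pa, rho_kg_m3, v_m_s):
            """Pressure loss coefficient from pressure drop and dynamic pressure."""
            return dp_pa / (0.5 * rho_kg_m3 * v_m_s ** 2)

        print(loss_coefficient(dp_pa=150.0, rho_kg_m3=5.2, v_m_s=8.0))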

  6. 10 CFR 707.12 - Specimen collection, handling and laboratory analysis for drug testing.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... drug testing. 707.12 Section 707.12 Energy DEPARTMENT OF ENERGY WORKPLACE SUBSTANCE ABUSE PROGRAMS AT DOE SITES Procedures § 707.12 Specimen collection, handling and laboratory analysis for drug testing... collection to final disposition of specimens, and testing laboratories shall use appropriate cutoff levels in...

  7. 10 CFR 707.12 - Specimen collection, handling and laboratory analysis for drug testing.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... drug testing. 707.12 Section 707.12 Energy DEPARTMENT OF ENERGY WORKPLACE SUBSTANCE ABUSE PROGRAMS AT DOE SITES Procedures § 707.12 Specimen collection, handling and laboratory analysis for drug testing... collection to final disposition of specimens, and testing laboratories shall use appropriate cutoff levels in...

  8. 10 CFR 707.12 - Specimen collection, handling and laboratory analysis for drug testing.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... drug testing. 707.12 Section 707.12 Energy DEPARTMENT OF ENERGY WORKPLACE SUBSTANCE ABUSE PROGRAMS AT DOE SITES Procedures § 707.12 Specimen collection, handling and laboratory analysis for drug testing... collection to final disposition of specimens, and testing laboratories shall use appropriate cutoff levels in...

  9. 10 CFR 707.12 - Specimen collection, handling and laboratory analysis for drug testing.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... drug testing. 707.12 Section 707.12 Energy DEPARTMENT OF ENERGY WORKPLACE SUBSTANCE ABUSE PROGRAMS AT DOE SITES Procedures § 707.12 Specimen collection, handling and laboratory analysis for drug testing... collection to final disposition of specimens, and testing laboratories shall use appropriate cutoff levels in...

  10. 10 CFR 707.12 - Specimen collection, handling and laboratory analysis for drug testing.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... drug testing. 707.12 Section 707.12 Energy DEPARTMENT OF ENERGY WORKPLACE SUBSTANCE ABUSE PROGRAMS AT DOE SITES Procedures § 707.12 Specimen collection, handling and laboratory analysis for drug testing... collection to final disposition of specimens, and testing laboratories shall use appropriate cutoff levels in...

  11. The study on the core personality trait words of Chinese medical university students based on social network analysis.

    PubMed

    Wu, Ying; Xue, Yunzhen; Xue, Zhanling

    2017-09-01

    The medical university students in China, whose school work is relatively heavy and whose educational programme is long, are a special professional group, and many of these students have psychological problems to some degree. Understanding their personality characteristics will therefore provide a scientific basis for psychological health interventions. We selected the top 30 personality trait words according to their order of frequency. Additionally, methods such as social network analysis (SNA) and the visualization technology of mapping knowledge domains were used in this study. Among these core personality trait words, "Family conscious" had the highest values of the 3 centralities and possessed the largest core status and influence. The analysis of core-peripheral structure showed that a polarized core-peripheral structure was quite obvious. The K-plex analysis found a total of 588 "K-2" K-plexes, and the principal component analysis yielded 11 principal components. This study of personality can not only help prevent disease, but also provide a scientific basis for students' psychological health education. In addition, we adopted SNA to pay closer attention to the relationships between personality trait words and the connections among personality dimensions. This study may provide new ideas and methods for research on personality structure.

  12. The study on the core personality trait words of Chinese medical university students based on social network analysis

    PubMed Central

    Wu, Ying; Xue, Yunzhen; Xue, Zhanling

    2017-01-01

    Abstract The medical university students in China, whose school work is relatively heavy and whose educational programme is long, are a special professional group, and many of these students have psychological problems to some degree. Understanding their personality characteristics will therefore provide a scientific basis for psychological health interventions. We selected the top 30 personality trait words according to their order of frequency. Additionally, methods such as social network analysis (SNA) and the visualization technology of mapping knowledge domains were used in this study. Among these core personality trait words, "Family conscious" had the highest values of the 3 centralities and possessed the largest core status and influence. The analysis of core-peripheral structure showed that a polarized core-peripheral structure was quite obvious. The K-plex analysis found a total of 588 "K-2" K-plexes, and the principal component analysis yielded 11 principal components. This study of personality can not only help prevent disease, but also provide a scientific basis for students' psychological health education. In addition, we adopted SNA to pay closer attention to the relationships between personality trait words and the connections among personality dimensions. This study may provide new ideas and methods for research on personality structure. PMID:28906409
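
    A small, hedged illustration of the flavor of social network analysis described above: degree centrality computed with networkx on a tiny invented co-occurrence network of trait words. The study's actual word network and its K-plex and principal-component results are not reproduced here.

        # Hedged sketch: degree centrality on an invented co-occurrence network of
        # trait words; the study's data are not reproduced.
        import networkx as nx

        G = nx.Graph()
        G.add_edges_from([
            ("family-conscious", "responsible"),
            ("family-conscious", "diligent"),
            ("family-conscious", "honest"),
            ("responsible", "diligent"),
            ("diligent", "introverted"),
        ])

        centrality = nx.degree_centrality(G)
        print(sorted(centrality.items(), key=lambda kv: kv[1], reverse=True))

        # Cohesive-subgroup relaxations (k-plexes) are usually explored with dedicated
        # SNA software; networkx offers related tools such as nx.k_core(G, k=2).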

  13. Analysis of ultradian heat production and aortic core temperature rhythms in the rat.

    PubMed

    Gómez-Sierra, J M; Canela, E I; Esteve, M; Rafecas, I; Closa, D; Remesar, X; Alemany, M

    1993-01-01

    The rhythms of aortic core temperature and overall heat production in Wistar rats were analyzed using long series of temperature recordings obtained from implanted thermocouple probes and heat release values from a chamber calorimeter. There was a very high degree of repetitiveness in the presentation of actual heat rhythms, with high cross-correlation values ascertained with paired periodograms. No differences in heat production were observed between male and female adult rats. The cross-correlation for temperature gave similar figures. The cross-correlation study between heat production and aortic core temperature in the same animals was significant and showed a displacement of about 30 minutes between heat release and aortic core temperature. The analysis of heat production showed a strong predominance of rhythms with periods of 24 hours (frequencies < 11.6 microHz) or more; other rhythms detected (of roughly the same relative importance) had periods of 8 or 2.2 hours (35 or 126 microHz, respectively). The analysis of aortic core temperature showed a smaller quantitative contribution of the 8 or 2.2 hour (35 or 126 microHz) rhythms, with other harmonic rhythms interspersed (5.1 and 4.0 hours, i.e. 54 and 69 microHz). The proportion of 'noise', or cycles shorter than 30 minutes (< 550 microHz), was higher in internal temperature than in the actual release of heat. The results are in agreement with the existence of a basic period of about 130 minutes (126 microHz) of warming/cooling of the blood, with a number of other harmonic rhythms superimposed upon the basic circadian rhythm.
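
    A brief sketch of how dominant periods can be located in such a time series with a periodogram follows, in the spirit of the rhythm analysis above; the synthetic signal mixes 24 h and 2.2 h components plus noise and is not the recorded rat data.

        # Hedged sketch: locating dominant periods with a periodogram; synthetic data.
        import numpy as np
        from scipy.signal import periodogram

        dt_h = 0.1                                   # sampling step in hours
        t = np.arange(0, 240, dt_h)                  # ten days of samples
        x = np.sin(2 * np.pi * t / 24) + 0.4 * np.sin(2 * np.pi * t / 2.2) \
            + 0.1 * np.random.randn(t.size)

        f, pxx = periodogram(x, fs=1.0 / dt_h)       # frequency in cycles per hour
        strongest = f[np.argsort(pxx)[-3:]]          # three strongest frequencies
        print("dominant periods (h):", np.sort(1.0 / strongest[strongest > 0]))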

  14. Accuracy of finite-difference modeling of seismic waves : Simulation versus laboratory measurements

    NASA Astrophysics Data System (ADS)

    Arntsen, B.

    2017-12-01

    The finite-difference technique for numerical modeling of seismic waves remains important and, in some areas, is extensively used. For exploration purposes, finite-difference simulation is at the core of both traditional imaging techniques such as reverse-time migration and more elaborate full-waveform inversion techniques. The accuracy and fidelity of finite-difference simulation of seismic waves are hard to quantify, and meaningful error analysis is really only readily available for simplistic media. A possible alternative to theoretical error analysis is provided by comparing finite-difference simulated data with laboratory data created using a scale model. The advantage of this approach is the accurate knowledge of the model, within measurement precision, and of the locations of sources and receivers. We use a model made of PVC immersed in water, containing horizontal and tilted interfaces together with several spherical objects, to generate ultrasonic pressure reflection measurements. The physical dimensions of the model are of the order of a meter, which after scaling represents a model with dimensions of the order of 10 kilometers and frequencies in the range of one to thirty hertz. We find that for plane horizontal interfaces the laboratory data can be reproduced by the finite-difference scheme with relatively small error, but for steeply tilted interfaces the error increases. For spherical interfaces the discrepancy between laboratory data and simulated data is sometimes much more severe, to the extent that it is not possible to simulate reflections from parts of highly curved bodies. The results are important in view of the fact that finite-difference modeling is often at the core of imaging and inversion algorithms tackling complicated geological areas with highly curved interfaces.
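
    For orientation, a minimal one-dimensional second-order finite-difference acoustic scheme is sketched below to illustrate the class of simulation being benchmarked; the grid, velocity model, and source are arbitrary assumptions, and real benchmark runs are multi-dimensional and elastic.

        # Hedged sketch: 1-D second-order FD acoustic wave propagation; all parameters
        # are arbitrary placeholders.
        import numpy as np

        nx, nt = 400, 800
        dx, dt = 5.0, 0.001                 # grid spacing (m) and time step (s)
        c = np.full(nx, 2000.0)             # homogeneous velocity model (m/s)
        p_prev, p = np.zeros(nx), np.zeros(nx)

        for it in range(nt):
            lap = np.zeros(nx)
            lap[1:-1] = (p[2:] - 2 * p[1:-1] + p[:-2]) / dx ** 2
            p_next = 2 * p - p_prev + (c * dt) ** 2 * lap
            p_next[nx // 2] += np.exp(-((it * dt - 0.05) / 0.01) ** 2)  # Gaussian source
            p_prev, p = p, p_next

        print("peak pressure amplitude:", np.abs(p).max())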

  15. Preparations to ship the TMI-2 damaged reactor core

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmitt, R.C.; Quinn, G.J.

    1985-11-01

    The March 1979 accident at Three Mile Island Unit 2 (TMI-2) resulted in a severely damaged core. Entries into that core using various tools and inspection devices have shown a significant void, large amounts of rubble, partially intact fuel assemblies, and some resolidified molten materials. The removal and disposition of that core has been of considerable public, regulatory, and governmental interest for some time. In a contractual agreement between General Public Utility Nuclear (GPUN) and the US Department of Energy (DOE), DOE has agreed to accept the TMI-2 core for interim storage at the Idaho National Engineering Laboratory (INEL), conduct research on fuel and materials of the core, and eventually dispose of the core either by processing or interment at the national repository. GPUN has removed various samples of material from the core and was scheduled to begin extensive defueling operations in September 1985. EG and G Idaho, Inc. (EG and G), acting on behalf of DOE, is responsible for transporting, receiving, examining, and storing the TMI-2 core. This paper addresses the preparations to ship the core to INEL, which is scheduled to commence in March 1986.

  16. The Alcohol Dehydrogenase Kinetics Laboratory: Enhanced Data Analysis and Student-Designed Mini-Projects

    ERIC Educational Resources Information Center

    Silverstein, Todd P.

    2016-01-01

    A highly instructive, wide-ranging laboratory project in which students study the effects of various parameters on the enzymatic activity of alcohol dehydrogenase has been adapted for the upper-division biochemistry and physical biochemistry laboratory. Our two main goals were to provide enhanced data analysis, featuring nonlinear regression, and…

  17. Combustion and Engine-Core Noise

    NASA Astrophysics Data System (ADS)

    Ihme, Matthias

    2017-01-01

    The implementation of advanced low-emission aircraft engine technologies and the reduction of noise from airframe, fan, and jet exhaust have made noise contributions from an engine core increasingly important. Therefore, meeting future ambitious noise-reduction goals requires the consideration of engine-core noise. This article reviews progress on the fundamental understanding, experimental analysis, and modeling of engine-core noise; addresses limitations of current techniques; and identifies opportunities for future research. After identifying core-noise contributions from the combustor, turbomachinery, nozzles, and jet exhaust, they are examined in detail. Contributions from direct combustion noise, originating from unsteady combustion, and indirect combustion noise, resulting from the interaction of flow-field perturbations with mean-flow variations in turbine stages and nozzles, are analyzed. A new indirect noise-source contribution arising from mixture inhomogeneities is identified by extending the theory. Although typically omitted in core-noise analysis, the impact of mean-flow variations and nozzle-upstream perturbations on the jet-noise modulation is examined, providing potential avenues for future core-noise mitigation.

  18. Polybrominated Diphenyl Ethers in Dryer Lint: An Advanced Analysis Laboratory

    ERIC Educational Resources Information Center

    Thompson, Robert Q.

    2008-01-01

    An advanced analytical chemistry laboratory experiment is described that involves environmental analysis and gas chromatography-mass spectrometry. Students analyze lint from clothes dryers for traces of flame retardant chemicals, polybrominated diphenylethers (PBDEs), compounds receiving much attention recently. In a typical experiment, ng/g…

  19. 7 CFR 91.37 - Standard hourly fee rate for laboratory testing, analysis, and other services.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 3 2012-01-01 2012-01-01 false Standard hourly fee rate for laboratory testing... AGRICULTURE (CONTINUED) COMMODITY LABORATORY TESTING PROGRAMS SERVICES AND GENERAL INFORMATION Fees and Charges § 91.37 Standard hourly fee rate for laboratory testing, analysis, and other services. (a) The...

  20. 7 CFR 91.37 - Standard hourly fee rate for laboratory testing, analysis, and other services.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Standard hourly fee rate for laboratory testing... AGRICULTURE (CONTINUED) COMMODITY LABORATORY TESTING PROGRAMS SERVICES AND GENERAL INFORMATION Fees and Charges § 91.37 Standard hourly fee rate for laboratory testing, analysis, and other services. (a) The...

  1. 7 CFR 91.37 - Standard hourly fee rate for laboratory testing, analysis, and other services.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Standard hourly fee rate for laboratory testing... AGRICULTURE (CONTINUED) COMMODITY LABORATORY TESTING PROGRAMS SERVICES AND GENERAL INFORMATION Fees and Charges § 91.37 Standard hourly fee rate for laboratory testing, analysis, and other services. (a) The...

  2. 7 CFR 91.37 - Standard hourly fee rate for laboratory testing, analysis, and other services.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Standard hourly fee rate for laboratory testing... AGRICULTURE (CONTINUED) COMMODITY LABORATORY TESTING PROGRAMS SERVICES AND GENERAL INFORMATION Fees and Charges § 91.37 Standard hourly fee rate for laboratory testing, analysis, and other services. (a) The...

  3. Analysis and Design of ITER 1 MV Core Snubber

    NASA Astrophysics Data System (ADS)

    Wang, Haitian; Li, Ge

    2012-11-01

    The core snubber, as a passive protection device, can suppress arc current and absorb the energy stored in stray capacitance during electrical breakdown in the accelerating electrodes of the ITER NBI. In order to design the core snubber of ITER, the control parameters of the arc peak current were first analyzed by the Fink-Baker-Owren (FBO) method, which was used for designing the DIIID 100 kV snubber. The B-H curve can be derived from the measured voltage and current waveforms, and the hysteresis loss of the core snubber can be derived using the revised parallelogram method. The core snubber can be represented in simplified form as an equivalent parallel resistance and inductance, which the FBO method neglects. A simulation code including the parallel equivalent resistance and inductance has been set up. The simulation and experiments show dramatically larger arc shorting currents due to the parallel inductance effect. The case shows that the core snubber designed with the FBO method gives a more compact design.
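
    A hedged sketch of recovering a B-H trajectory from measured winding voltage and current, as outlined above, is given below: B follows from the time integral of voltage (Faraday's law) and H from the winding current (Ampere's law), with the enclosed loop area approximating the hysteresis energy density per cycle. The core geometry and waveforms are placeholders, not ITER snubber data.

        # Hedged sketch: B-H reconstruction from voltage/current waveforms; placeholders.
        import numpy as np

        dt = 1e-6                                    # sampling step (s)
        t = np.arange(0, 1e-3, dt)                   # one 1 kHz cycle
        v = 50.0 * np.sin(2 * np.pi * 1e3 * t)       # measured winding voltage (V)
        i = 2.0 * np.sin(2 * np.pi * 1e3 * t - 0.5)  # measured current (A), lagging

        turns, area, path = 10, 4e-3, 0.8            # turns, core area (m^2), magnetic path (m)
        B = np.cumsum(v) * dt / (turns * area)       # flux density, from v = N*A*dB/dt
        H = turns * i / path                         # field strength, from H = N*i/l

        loop_area = abs(np.sum(0.5 * (H[1:] + H[:-1]) * np.diff(B)))
        print("approximate hysteresis loss density (J/m^3 per cycle):", loop_area)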

  4. Integration of Video-Based Demonstrations to Prepare Students for the Organic Chemistry Laboratory

    ERIC Educational Resources Information Center

    Nadelson, Louis S.; Scaggs, Jonathan; Sheffield, Colin; McDougal, Owen M.

    2015-01-01

    Consistent, high-quality introductions to organic chemistry laboratory techniques effectively and efficiently support student learning in the organic chemistry laboratory. In this work, we developed and deployed a series of instructional videos to communicate core laboratory techniques and concepts. Using a quasi-experimental design, we tested the…

  5. A Content Analysis of General Chemistry Laboratory Manuals for Evidence of Higher-Order Cognitive Tasks

    NASA Astrophysics Data System (ADS)

    Domin, Daniel S.

    1999-01-01

    The science laboratory instructional environment is ideal for fostering the development of problem-solving, manipulative, and higher-order thinking skills: the skills needed by today's learner to compete in an increasingly technology-based society. This paper reports the results of a content analysis of ten general chemistry laboratory manuals. Three experiments from each manual were examined for evidence of higher-order cognitive activities. Analysis was based upon the six major cognitive categories of Bloom's Taxonomy of Educational Objectives: knowledge, comprehension, application, analysis, synthesis, and evaluation. The results of this study show that the overwhelming majority of general chemistry laboratory manuals provide tasks that require the use of only the lower-order cognitive skills: knowledge, comprehension, and application. Two of the laboratory manuals were disparate in having activities that utilized higher-order cognition. I describe the instructional strategies used within these manuals to foster higher-order cognitive development.

  6. Revising laboratory work: sociological perspectives on the science classroom

    NASA Astrophysics Data System (ADS)

    Jobér, Anna

    2017-09-01

    This study uses sociological perspectives to analyse one of the core practices in science education: schoolchildren's and students' laboratory work. Applying an ethnographic approach to the laboratory work done by pupils at a Swedish compulsory school, data were generated through observations, field notes, interviews, and a questionnaire. The pupils, ages 14 and 15, were observed as they took a 5-week physics unit (specifically, mechanics). The analysis shows that the episodes of laboratory work could be filled with curiosity and exciting challenges; however, another picture emerged when sociological concepts and notions were applied to what is a very common way of working in the classroom. Laboratory work is characterised as a social activity that is expected to be organised as a group activity. This entails groups becoming, to some extent, 'safe havens' for the pupils. On the other hand, this way of working in groups required pupils to submit to the group and the peer effect, sometimes undermining their chances to learn and perform better. In addition, the practice of working in groups when doing laboratory work left some pupils and the teacher blaming themselves, even though the outcome of the learning situation was the result of a complex interplay of social processes. This article suggests a stronger emphasis on the contradictions and consequences of the science subjects, which are strongly influenced by their socio-historical legacy.

  7. Importance of implementing an analytical quality control system in a core laboratory.

    PubMed

    Marques-Garcia, F; Garcia-Codesal, M F; Caro-Narros, M R; Contreras-SanFeliciano, T

    2015-01-01

    The aim of the clinical laboratory is to provide useful information for the screening, diagnosis and monitoring of disease. The laboratory should ensure the quality of the extra-analytical and analytical processes, based on set criteria. To do this, it develops and implements a system of internal quality control, designed to detect errors, and compares its data with those of other laboratories through external quality control. In this way it has a tool to verify the fulfilment of the objectives set and, in case of errors, to allow corrective actions to be taken and the reliability of the results to be ensured. This article sets out to describe the design and implementation of an internal quality control protocol, as well as its periodic assessment at 6-month intervals to determine compliance with pre-determined specifications (Stockholm Consensus(1)). A total of 40 biochemical and 15 immunochemical methods were evaluated using three different control materials. Next, a standard operating procedure was planned to develop a system of internal quality control that included calculating the error of the analytical process, setting quality specifications, and verifying compliance. The quality control data were then statistically summarized as means, standard deviations, and coefficients of variation, as well as systematic, random, and total errors. The quality specifications were then fixed and the operational rules to apply in the analytical process were calculated. Finally, our data were compared with those of other laboratories through an external quality assurance program. The development of an analytical quality control system is a highly structured process. It should be designed to detect errors that compromise the stability of the analytical process. The laboratory should review its quality indicators (systematic, random and total error) at regular intervals, in order to ensure that they meet pre-determined specifications and, if not, to apply the appropriate corrective actions.
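
    A minimal sketch of the basic internal quality control statistics such a protocol reviews for each analyte follows (coefficient of variation as random error, bias as systematic error, and a common total-error combination); the control results and target value are made up.

        # Hedged sketch: basic internal QC statistics for one analyte; data are invented.
        import statistics as stats

        results = [4.9, 5.1, 5.0, 5.2, 4.8, 5.1, 5.0, 4.9]   # control material, mmol/L
        target = 5.0                                         # assigned value, mmol/L

        mean = stats.mean(results)
        cv = 100 * stats.stdev(results) / mean               # random error (%)
        bias = 100 * (mean - target) / target                # systematic error (%)
        total_error = abs(bias) + 1.65 * cv                  # common TE formulation

        print(f"CV = {cv:.2f}%  bias = {bias:.2f}%  total error = {total_error:.2f}%")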

  8. Selenium semiconductor core optical fibers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, G. W.; Qian, Q., E-mail: qianqi@scut.edu.cn; Peng, K. L.

    2015-02-15

    Phosphate glass-clad optical fibers containing a selenium (Se) semiconductor core were fabricated using a molten core method. The cores were found to be amorphous, as evidenced by X-ray diffraction and corroborated by micro-Raman spectroscopy. Elemental analysis across the core/clad interface suggests that there is some diffusion of about 3 wt % oxygen in the core region. Phosphate glass-clad crystalline selenium core optical fibers were obtained by a post-drawing annealing process. Two-cm-long crystalline selenium semiconductor core optical fibers, electrically contacted to external circuitry through the fiber end facets, exhibit a threefold change in conductivity between dark and illuminated states. Such crystalline selenium semiconductor core optical fibers have promising utility in optical switching and photoconductive optical fiber arrays.

  9. Improvement of analytical capabilities of neutron activation analysis laboratory at the Colombian Geological Survey

    NASA Astrophysics Data System (ADS)

    Parrado, G.; Cañón, Y.; Peña, M.; Sierra, O.; Porras, A.; Alonso, D.; Herrera, D. C.; Orozco, J.

    2016-07-01

    The Neutron Activation Analysis (NAA) laboratory at the Colombian Geological Survey has developed a technique for multi-elemental analysis of soil and plant matrices, based on Instrumental Neutron Activation Analysis (INAA) using the comparator method. In order to evaluate the analytical capabilities of the technique, the laboratory has been participating in inter-comparison tests organized by Wepal (Wageningen Evaluating Programs for Analytical Laboratories). In this work, the experimental procedure and results for the multi-elemental analysis of four soil and four plant samples during participation in the first round of the 2015 Wepal proficiency test are presented. Only elements with radioactive isotopes of medium and long half-lives have been evaluated: 15 elements for soils (As, Ce, Co, Cr, Cs, Fe, K, La, Na, Rb, Sb, Sc, Th, U and Zn) and 7 elements for plants (Br, Co, Cr, Fe, K, Na and Zn). The performance assessment by Wepal, based on Z-score distributions, showed that most results had |Z-scores| ≤ 3.
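
    The Z-score check used in proficiency schemes such as Wepal is simply z = (x_lab − x_assigned) / σ_p, with |z| ≤ 3 usually treated as acceptable; a small sketch with invented element values follows.

        # Hedged sketch of a proficiency-test Z-score check; all values are invented.

        def z_score(x_lab, x_assigned, sigma_p):
            return (x_lab - x_assigned) / sigma_p

        reported = {"Fe": (41.2, 40.0, 1.5), "Zn": (118.0, 105.0, 4.0)}  # lab, assigned, sigma_p
        for element, (lab, assigned, sigma) in reported.items():
            z = z_score(lab, assigned, sigma)
            verdict = "acceptable" if abs(z) <= 3 else "action required"
            print(f"{element}: z = {z:+.2f} ({verdict})")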

  10. Laboratory investigations of earthquake dynamics

    NASA Astrophysics Data System (ADS)

    Xia, Kaiwen

    In this thesis, natural earthquake scenarios are investigated through controlled laboratory experiments designed to mimic them. The earthquake dynamic rupturing process itself is a complicated phenomenon, involving dynamic friction, wave propagation, and heat production. Because controlled experiments can produce results without the assumptions needed in theoretical and numerical analysis, the experimental method is advantageous over theoretical and numerical methods. Our laboratory fault is composed of carefully cut photoelastic polymer plates (Homalite-100, polycarbonate) held together by uniaxial compression. As a unique element of the experimental design, a controlled exploding wire technique provides the triggering mechanism for laboratory earthquakes. Three important components of real earthquakes (i.e., a pre-existing fault, tectonic loading, and a triggering mechanism) correspond to and are simulated by frictional contact, uniaxial compression, and the exploding wire technique. Dynamic rupturing processes are visualized using the photoelastic method and are recorded via a high-speed camera. Our experimental methodology, which is full-field, in situ, and non-intrusive, has better control and diagnostic capacity compared to other existing experimental methods. Using this experimental approach, we have investigated several problems: the dynamics of earthquake faulting occurring along homogeneous faults separating identical materials, earthquake faulting along inhomogeneous faults separating materials with different wave speeds, and earthquake faulting along faults with a finite low-wave-speed fault core. We have observed supershear ruptures, sub-Rayleigh to supershear rupture transition, crack-like to pulse-like rupture transition, the self-healing (Heaton) pulse, and rupture directionality.

  11. Seeking excellence: An evaluation of 235 international laboratories conducting water isotope analyses by isotope-ratio and laser-absorption spectrometry.

    PubMed

    Wassenaar, L I; Terzer-Wassmuth, S; Douence, C; Araguas-Araguas, L; Aggarwal, P K; Coplen, T B

    2018-03-15

    Water stable isotope ratios (δ2H and δ18O values) are widely used tracers in environmental studies; hence, accurate and precise assays are required for providing sound scientific information. We tested the analytical performance of 235 international laboratories conducting water isotope analyses using dual-inlet and continuous-flow isotope ratio mass spectrometers and laser spectrometers through a water isotope inter-comparison test. Eight test water samples were distributed by the IAEA to international stable isotope laboratories. These consisted of a core set of five samples spanning the common δ-range of natural waters, and three optional samples (highly depleted, enriched, and saline). The fifth core sample contained unrevealed trace methanol to assess analyst vigilance to the impact of organic contamination on water isotopic measurements made by all instrument technologies. For the core and optional samples ~73 % of laboratories gave acceptable results within 0.2 ‰ and 1.5 ‰ of the reference values for δ18O and δ2H, respectively; ~27 % produced unacceptable results. Top performance for δ18O values was dominated by dual-inlet IRMS laboratories; top performance for δ2H values was led by laser spectrometer laboratories. Continuous-flow instruments yielded comparatively intermediate results. Trace methanol contamination of water resulted in extreme outlier δ-values for laser instruments, but also affected reactor-based continuous-flow IRMS systems; however, dual-inlet IRMS δ-values were unaffected. Analysis of the laboratory results and their metadata suggested inaccurate or imprecise performance stemmed mainly from skill- and knowledge-based errors including: calculation mistakes, inappropriate or compromised laboratory calibration standards, poorly performing instrumentation, lack of vigilance to contamination, or inattention to unreasonable isotopic outcomes. To counteract common errors, we recommend that laboratories include 1-2 'known

  12. Seeking excellence: An evaluation of 235 international laboratories conducting water isotope analyses by isotope-ratio and laser-absorption spectrometry

    USGS Publications Warehouse

    Wassenaar, L. I.; Terzer-Wassmuth, S.; Douence, C.; Araguas-Araguas, L.; Aggarwal, P. K.; Coplen, Tyler B.

    2018-01-01

    Rationale: Water stable isotope ratios (δ2H and δ18O values) are widely used tracers in environmental studies; hence, accurate and precise assays are required for providing sound scientific information. We tested the analytical performance of 235 international laboratories conducting water isotope analyses using dual-inlet and continuous-flow isotope ratio mass spectrometers and laser spectrometers through a water isotope inter-comparison test. Methods: Eight test water samples were distributed by the IAEA to international stable isotope laboratories. These consisted of a core set of five samples spanning the common δ-range of natural waters, and three optional samples (highly depleted, enriched, and saline). The fifth core sample contained unrevealed trace methanol to assess analyst vigilance to the impact of organic contamination on water isotopic measurements made by all instrument technologies. Results: For the core and optional samples ~73 % of laboratories gave acceptable results within 0.2 ‰ and 1.5 ‰ of the reference values for δ18O and δ2H, respectively; ~27 % produced unacceptable results. Top performance for δ18O values was dominated by dual-inlet IRMS laboratories; top performance for δ2H values was led by laser spectrometer laboratories. Continuous-flow instruments yielded comparatively intermediate results. Trace methanol contamination of water resulted in extreme outlier δ-values for laser instruments, but also affected reactor-based continuous-flow IRMS systems; however, dual-inlet IRMS δ-values were unaffected. Conclusions: Analysis of the laboratory results and their metadata suggested inaccurate or imprecise performance stemmed mainly from skill- and knowledge-based errors including: calculation mistakes, inappropriate or compromised laboratory calibration standards, poorly performing instrumentation, lack of vigilance to contamination, or inattention to unreasonable isotopic outcomes. To counteract common errors, we recommend that
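
    A tiny sketch of flagging reported δ-values against the acceptance limits quoted above (0.2 ‰ for δ18O and 1.5 ‰ for δ2H) follows; the reported and reference values are invented.

        # Hedged sketch: acceptance check against the quoted limits; values are invented.
        LIMITS = {"d18O": 0.2, "d2H": 1.5}

        def acceptable(isotope, reported, reference):
            return abs(reported - reference) <= LIMITS[isotope]

        print(acceptable("d18O", -10.05, -9.90))  # True: within 0.2 per mil
        print(acceptable("d2H", -78.2, -75.0))    # False: off by 3.2 per mil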

  13. Tidal excitation of elliptical instability in the Martian core: Possible mechanism for generating the core dynamo

    NASA Astrophysics Data System (ADS)

    Arkani-Hamed, J.; Seyed-Mahmoud, B.; Aldridge, K. D.; Baker, R. E.

    2008-06-01

    We propose a causal relationship between the creation of the giant impact basins on Mars by a large asteroid, ruptured when it entered the Roche limit, and the excitation of the Martian core dynamo. Our laboratory experiments indicate that the elliptical instability of the Martian core can be excited if the asteroid continually exerts tidal forces on Mars for ~20,000 years. Our numerical experiments suggest that the growth-time of the instability was 5,000-15,000 years when the asteroid was at a distance of 50,000-75,000 km. We demonstrate the stability of the orbital motion of an asteroid captured by Mars at a distance of 100,000 km in the presence of the Sun and Jupiter. We also present our results for the tidal interaction of the asteroid with Mars. An asteroid captured by Mars in prograde fashion can survive and excite the elliptical instability of the core for only a few million years, whereas a captured retrograde asteroid can excite the elliptical instability for hundreds of millions of years before colliding with Mars. The rate at which tidal energy dissipates in Mars during this period is over two orders of magnitude greater than the rate at which magnetic energy dissipates. If only 1% of the tidal energy dissipation is partitioned to the core, sufficient energy would be available to maintain the core dynamo. Accordingly, a retrograde asteroid is quite capable of exciting an elliptical instability in the Martian core, thus providing a candidate process to drive a core dynamo.

  14. Analysis of Stainless Steel Sandwich Panels with a Metal Foam Core for Lightweight Fan Blade Design

    NASA Technical Reports Server (NTRS)

    Min, James B.; Ghosn, Louis J.; Lerch, Bradley A.; Raj, Sai V.; Holland, Frederic A., Jr.; Hebsur, Mohan G.

    2004-01-01

    The quest for cheap, low density and high performance materials in the design of aircraft and rotorcraft engine fan and propeller blades poses immense challenges to materials and structural design engineers. The present study investigates the use of a sandwich foam fan blade made up of solid face sheets and a metal foam core. The face sheets and the metal foam core material were an aerospace grade precipitation hardened 17-4 PH stainless steel with high strength and high toughness. The resulting structure possesses a high stiffness while being lighter than a similar solid construction. The material properties of 17-4 PH metal foam are reviewed briefly to describe the characteristics of the sandwich structure for a fan blade application. A vibration analysis for natural frequencies and a detailed stress analysis of the 17-4 PH sandwich foam blade design for different combinations of skin thickness and core volume are presented, with a comparison to a solid titanium blade.
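
    As a first-pass illustration of sizing such a panel, the sketch below evaluates the flexural rigidity of a symmetric sandwich beam with thin face sheets and a compliant foam core; the dimensions and moduli are placeholders, not the values used in the study.

        # Hedged sketch: preliminary flexural rigidity of a sandwich beam; placeholders only.

        def sandwich_rigidity(e_face, t_face, e_core, t_core, width):
            d = t_core + t_face                        # distance between face-sheet centroids
            faces = e_face * width * t_face * d ** 2 / 2
            core = e_core * width * t_core ** 3 / 12
            return faces + core                        # flexural rigidity, N*m^2

        print(sandwich_rigidity(e_face=196e9, t_face=0.5e-3,
                                e_core=2e9, t_core=10e-3, width=0.05))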

  15. Results of Laboratory Tests on Surficial Sediments from the Upper Continental Slope Northern Gulf of Mexico

    USGS Publications Warehouse

    Booth, James S.

    1979-01-01

    The purpose of this report is to present the results of geotechnical, textural, and chemical tests performed on samples from the upper Continental Slope, northern Gulf of Mexico. The samples were collected by a piston corer up to 12 m (40 ft.) in length with a head weight of 908 kg (one ton). The inside diameter of the C. A. B. liner was 89 mm (3.5 inches). Upon retrieval, the cores were cut into 1.5 m sections and examined for evidence of disturbance; then, if in acceptable condition, they were sealed and placed in their in situ vertical position in a refrigerated van. Once ashore, the sections were opened, sealed with wax, recapped and stored as before. The cores were split lengthwise for analysis. One half of the core was X-rayed and the radiograph was carefully examined as a further check for disturbance. This half was then archived. The other half of the core was used for the laboratory work.

  16. Logging-while-coring method and apparatus

    DOEpatents

    Goldberg, David S.; Myers, Gregory J.

    2007-11-13

    A method and apparatus for downhole coring while receiving logging-while-drilling tool data. The apparatus includes a core collar and a retrievable core barrel. The retrievable core barrel receives core from a borehole, which is sent to the surface for analysis via a wireline and latching tool. The core collar includes logging-while-drilling tools for the simultaneous measurement of formation properties during the core excavation process. Examples of logging-while-drilling tools include nuclear sensors, resistivity sensors, gamma ray sensors, and bit resistivity sensors. The disclosed method allows for precise core-log depth calibration and core orientation within a single borehole, and without a pipe trip, providing both time savings and unique scientific advantages.

  17. Logging-while-coring method and apparatus

    DOEpatents

    Goldberg, David S.; Myers, Gregory J.

    2007-01-30

    A method and apparatus for downhole coring while receiving logging-while-drilling tool data. The apparatus includes a core collar and a retrievable core barrel. The retrievable core barrel receives core from a borehole, which is sent to the surface for analysis via a wireline and latching tool. The core collar includes logging-while-drilling tools for the simultaneous measurement of formation properties during the core excavation process. Examples of logging-while-drilling tools include nuclear sensors, resistivity sensors, gamma ray sensors, and bit resistivity sensors. The disclosed method allows for precise core-log depth calibration and core orientation within a single borehole, and without a pipe trip, providing both time savings and unique scientific advantages.

  18. Design and analysis of three-layer-core optical fiber

    NASA Astrophysics Data System (ADS)

    Zheng, Siwen; Liu, Yazhuo; Chang, Guangjian

    2018-03-01

    A three-layer-core single-mode large-mode-area fiber is investigated. The three-layer structure in the core, which is composed of a core-index layer, a cladding-index layer, and a depression-index layer, can achieve a large effective area Aeff while maintaining an ultralow bending loss without deteriorating cutoff behavior. A single-mode large mode area of 100 to 330 μm2 can be achieved in the fiber, and the effective area Aeff can be further enlarged by adjusting the layer parameters. Furthermore, the bending properties are improved in this three-layer-core structure: the bending loss can decrease by 2 to 4 orders of magnitude compared with a conventional step-index fiber with the same Aeff. These characteristics suggest that the three-layer-core fiber can be used in large-mode-area, wide-bandwidth, high-capacity transmission or in high-power optical fiber lasers and amplifiers for optical communications, which could serve as the basic physical-layer structure for big-data storage, reading, computation, and transmission applications.
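
    The effective area referred to above is conventionally defined as Aeff = (∫|E|² dA)² / ∫|E|⁴ dA; the sketch below evaluates it numerically for an assumed Gaussian mode, for which the analytic result is πw². The paper's actual refractive-index profile is not modelled here.

        # Hedged sketch: numerical effective area of an assumed Gaussian mode.
        import numpy as np

        w = 6e-6                                      # assumed mode-field radius (m)
        x = y = np.linspace(-30e-6, 30e-6, 601)
        X, Y = np.meshgrid(x, y)
        E2 = np.exp(-2 * (X ** 2 + Y ** 2) / w ** 2)  # |E|^2 of a Gaussian mode

        dA = (x[1] - x[0]) * (y[1] - y[0])
        a_eff = (E2.sum() * dA) ** 2 / ((E2 ** 2).sum() * dA)
        print(f"Aeff = {a_eff * 1e12:.1f} um^2")      # ~113 um^2 for w = 6 um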

  19. Using the Laboratory to Engage All Students in Science Practices

    ERIC Educational Resources Information Center

    Walker, J. P.; Sampson, V.; Southerland, S.; Enderle, P. J.

    2016-01-01

    This study examines the extent to which the type of instruction used during a general chemistry laboratory course affects students' ability to use core ideas to engage in science practices. We use Ford's (2008) description of the nature of scientific practices to categorize what students do in the laboratory as either empirical or…

  20. Developing Laboratory Skills by Incorporating Peer-Review and Digital Badges

    ERIC Educational Resources Information Center

    Seery, Michael K.; Agustian, Hendra Y.; Doidge, Euan D.; Kucharski, Maciej M.; O'Connor, Helen M.; Price, Amy

    2017-01-01

    Laboratory work is at the core of any chemistry curriculum but literature on the assessment of laboratory skills is scant. In this study we report the use of a peer-observation protocol underpinned by exemplar videos. Students are required to watch exemplar videos for three techniques (titrations, distillations, preparation of standard solutions)…

  1. Coagulation Testing in the Core Laboratory.

    PubMed

    Winter, William E; Flax, Sherri D; Harris, Neil S

    2017-11-08

    and the PT or aPTT are repeated on the 1:1 mix. Factor activity assays are most commonly performed as a one-stage assay. The patient's citrated plasma is diluted and mixed 1-to-1 with a single factor-deficient substrate plasma. A PT or aPTT is performed on the above mix, depending on the factor being tested. Factor inhibitors are antibodies that are most commonly diagnosed in male patients with severe hemophilia A (FVIII deficiency), where they are induced by factor replacement therapy. Factor inhibitors can also appear in the form of spontaneous autoantibodies in both male and female individuals who were previously well; this is an autoimmune condition called "acquired hemophilia." Most coagulation laboratories can measure the plasma concentration of VWF protein (VWF antigen) by an immunoturbidimetric technique. Testing the functional activity of VWF utilizes the drug ristocetin. The state of multimerization of VWF is important and is assessed by electrophoresis on agarose gels. Type 2a and 2b VWD are associated with the lack of intermediate- and high-molecular-weight multimers. The antiphospholipid syndrome (APLS) is an acquired autoimmune phenomenon associated with an increased incidence of both venous and arterial thromboses, as well as fetal loss. Typically, there is a paradoxical prolongation of the aPTT in the absence of any clinical features of bleeding; this is the so-called "lupus anticoagulant (LA) effect." The laboratory definition of the APLS requires the presence of either a "lupus anticoagulant" or a persistent titer of antiphospholipid antibodies. There are now 2 broad classes of direct-acting oral anticoagulants (DOACs): [1] the oral direct thrombin inhibitors (DTIs) such as dabigatran; and [2] the oral direct factor Xa inhibitors such as rivaroxaban and apixaban. The PT and aPTT are variably affected by the DOACs and are generally unhelpful in monitoring their concentrations. Most importantly, a normal PT or aPTT does NOT exclude the presence of any of the

  2. Quantifying and overcoming bioturbation in marine sediment cores: dual 14C and δ18O analysis on single foraminifera

    NASA Astrophysics Data System (ADS)

    Lougheed, Bryan; Metcalfe, Brett; Wacker, Lukas

    2017-04-01

    Marine sediment cores used in palaeoceanography form the basis of our current understanding of past global climate and ocean chemistry. Precision and accuracy of geochronological control in these sediment cores are crucial in unravelling the timing of rapid shifts in palaeoclimate and, ultimately, the interdependency of global climate mechanisms and their causality. Aware of the problems associated with bioturbation (the mixing of ocean sediments by benthic organisms), palaeoceanographers generally aim to retrieve sediment cores from locations with high sediment accumulation rates, thus minimising the influence of bioturbation as much as possible. However, the practice of concentrating only on areas of the ocean floor with high sediment accumulation rates has the potential to introduce a geographical bias into our understanding of global palaeoclimate. For example, global time-averaged sediment accumulation rates for the ocean floor (excluding continental margins) indicate that vast areas of the ocean floor have sediment accumulation rates below the commonly recommended minimum of 10 cm/ka. Whilst many studies have focussed on quantifying the impact of bioturbation on our understanding of the past, few have attempted to overcome the problems associated with bioturbation. Recent pioneering developments in 14C AMS at the Laboratory of Ion Beam Physics at ETH Zürich have led to the development of the Mini Carbon Dating System (MICADAS). This compact 14C AMS system can be coupled to a carbonate handling system, thus enabling the direct AMS measurement of gaseous samples, i.e. without graphitisation, allowing for the analysis of carbonate samples of <100 μg. Likewise, while earlier isotope ratio mass spectrometry (IRMS) technology required a minimum of 100 μg of carbonate to produce a successful δ18O measurement, more recent advances in IRMS technology have made routine measurements of as little as 5 μg possible

  3. Laboratory testing and economic analysis of high RAP warm mixed asphalt.

    DOT National Transportation Integrated Search

    2009-03-24

    This report contains laboratory testing, economic analysis, literature review, and information obtained from multiple producers throughout the state of Mississippi regarding the use of high RAP (50 % to 100%) mixtures containing warm mix additives. T...

  4. Knowledge Economy Core Journals: Identification through LISTA Database Analysis.

    PubMed

    Nouri, Rasool; Karimi, Saeed; Ashrafi-rizi, Hassan; Nouri, Azadeh

    2013-03-01

    Knowledge economy has become an increasingly broad field over the years, and identification of core journals in this field can be useful for librarians in the journal selection process, as well as for researchers in directing their studies and finding appropriate journals for publishing their articles. The present research attempts to determine the core journals of knowledge economy indexed in LISTA (Library, Information Science and Technology Abstracts). The research method was bibliometric, and the research population included the journals indexed in LISTA (from its start until the beginning of 2011) with at least one article about "knowledge economy". For data collection, keywords related to "knowledge economy" were extracted from the literature in this area and searched in LISTA using the title, keyword and abstract fields, taking advantage of the LISTA thesaurus. Using this search strategy, 1608 articles from 390 journals were retrieved. The retrieved records were imported into an Excel sheet; the journals were then grouped and the Bradford coefficient was calculated for each group. Finally, the average of the Bradford coefficients was calculated and the core journals in the subject area of "knowledge economy" were determined using Bradford's formula. By using Bradford's scattering law, 15 journals with the highest publication rates were identified as "knowledge economy" core journals indexed in LISTA. In this list, "Library and Information Update" with 64 articles was at the top; "ASLIB Proceedings" and "Serials" with 51 and 40 articles were next in rank. Also, 41 journals were identified as beyond the core, with "Library Hi Tech" (20 articles) at the top of that group. The increased importance of the knowledge economy has led to growth in the production of articles in this subject area, so the evaluation and ranking of journals becomes a very challenging task for librarians, and generating a core journal list can provide a useful tool for journal selection and also quick and easy access to information. Core
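
    A rough sketch of a Bradford-style partition follows, splitting a ranked journal list into zones of roughly equal article counts so that the first zone approximates the core; the journal names and counts are invented, not the LISTA data from the study.

        # Hedged sketch: simple Bradford-style zoning of a ranked journal list; data invented.
        from itertools import accumulate

        journals = [("J1", 64), ("J2", 51), ("J3", 40), ("J4", 22), ("J5", 20),
                    ("J6", 15), ("J7", 12), ("J8", 9), ("J9", 7), ("J10", 5)]

        total = sum(count for _, count in journals)
        cumulative = list(accumulate(count for _, count in journals))
        core_zone = [name for (name, _), c in zip(journals, cumulative) if c <= total / 3]
        print("core zone:", core_zone)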

  5. Incorporating the International Polar Year Into Introductory Geology Laboratories at Ohio State University

    NASA Astrophysics Data System (ADS)

    Judge, S. A.; Wilson, T. J.

    2005-12-01

    The International Polar Year (IPY) provides an excellent opportunity for highlighting polar research in education. The ultimate goal of our outreach and education program is to develop a series of modules that are focused on societally-relevant topics being investigated in Antarctic earth science, while teaching basic geologic concepts that are standard elements of school curricula. For example, we envision a university-level, undergraduate, introductory earth science class with the entire semester/quarter laboratory program focused on polar earth science research during the period of the International Polar Year. To attain this goal, a series of modules will be developed, including inquiry-based exercises founded on imagery (video, digital photos, digital core scans), GIS data layers, maps, and data sets available from OSU research groups. Modules that highlight polar research are also suitable for the K-12 audience. Scaleable/grade appropriate modules that use some of the same data sets as the undergraduate modules can be outlined for elementary through high school earth science classes. An initial module is being developed that focuses on paleoclimate data. The module provides a hands-on investigation of the climate history archived in both ice cores and sedimentary rock cores in order to understand time scales, drivers, and processes of global climate change. The paleoclimate module also demonstrates the types of polar research that are ongoing at OSU, allowing students to observe what research the faculty are undertaking in their respective fields. This will link faculty research with student education in the classroom, enhancing learning outcomes. Finally, this module will provide a direct link to U.S. Antarctic Program research related to the International Polar Year, when new ice and sedimentary rock cores will be obtained and analyzed. As a result of this laboratory exercise, the students will be able to: (1) Define an ice core and a sedimentary rock core

  6. Photoelastic stress analysis of different prefabricated post-and-core materials.

    PubMed

    Asvanund, Pattapon; Morgano, Steven M

    2011-01-01

    The purpose of this study was to investigate the stress developed by a combination of a stainless steel post or a fiber-reinforced resin post with a silver amalgam core or a composite resin core. Two-dimensional photoelastic models were used to simulate root dentin. Posts (ParaPost XT and ParaPost-FiberWhite) were cemented with a luting agent (RelyX Unicem). Silver amalgam cores and composite resin cores were fabricated on the posts. Complete crowns were fabricated and cemented on the cores. Each model was analyzed with 2 force magnitudes and in 2 directions. Fringe orders were recorded and compared using ANOVA (p=0.05) and the Scheffé test. With a vertical force, no stress differences occurred among the 4 groups (p=0.159). With a 30-degree force, there were stress differences among the 4 groups (p<0.001). The combination of a fiber-reinforced post and a composite resin core could potentially reduce stresses within the radicular dentin when angled loads are applied.
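
    A brief sketch of the statistical comparison described above, a one-way ANOVA across the four post-and-core groups using scipy, is shown below; the fringe-order values are fabricated for illustration.

        # Hedged sketch: one-way ANOVA across four groups; fringe-order values are fabricated.
        from scipy.stats import f_oneway

        groups = {
            "steel post + amalgam core":   [2.1, 2.3, 2.2, 2.4],
            "steel post + composite core": [2.0, 2.2, 2.1, 2.3],
            "fiber post + amalgam core":   [1.8, 1.9, 2.0, 1.9],
            "fiber post + composite core": [1.5, 1.6, 1.7, 1.6],
        }

        stat, p = f_oneway(*groups.values())
        print(f"F = {stat:.2f}, p = {p:.4f}")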

  7. Geometric Analysis of Vein Fracture Networks From the Awibengkok Core, Indonesia

    NASA Astrophysics Data System (ADS)

    Khatwa, A.; Bruhn, R. L.; Brown, S. R.

    2003-12-01

    Fracture network systems within rocks are important features for the transportation and remediation of hazardous waste, oil and gas production, geothermal energy extraction, and the formation of vein fillings and ore deposits. A variety of methods, including computational and laboratory modeling, have been employed to further understand the dynamic nature of fractures and fracture systems (e.g. Ebel and Brown, this session). To substantiate these studies, it is also necessary to analyze the characteristics and morphology of naturally occurring vein systems. The Awibengkok core from a geothermal system in West Java, Indonesia, provided an excellent opportunity to study geometric and petrologic characteristics of vein systems in volcanic rock. Vein minerals included chlorite, calcite, quartz, zeolites and sulphides. To obtain geometric data on the veins, we employed a neural net image processing technique to analyze high-resolution digital photography of the veins. We trained a neural net processor to map the extent of each vein using RGB pixel training classes. The resulting classification image was then converted to a binary image file and processed through a MatLab program that we designed to calculate vein geometric statistics, including aperture and roughness. We also performed detailed petrographic and microscopic geometric analysis on the veins to determine the history of mineralization and fracturing. We found that multi-phase mineralization due to chemical dissolution and re-precipitation, as well as mechanical fracturing, was a common feature in many of the veins and that it had a significant role in interpreting vein tortuosity and the history of permeability. We used our micro- and macro-scale observations to construct four hypothetical permeability models that complement the numerical and laboratory-modeled data reported by Ebel and Brown. In each model, permeability changes, and in most cases fluctuates, differently over time as the tortuosity and aperture of

  8. Laboratory-acquired infections of Salmonella enterica serotype Typhi in South Africa: phenotypic and genotypic analysis of isolates.

    PubMed

    Smith, Anthony Marius; Smouse, Shannon Lucrecia; Tau, Nomsa Pauline; Bamford, Colleen; Moodley, Vineshree Mischka; Jacobs, Charlene; McCarthy, Kerrigan Mary; Lourens, Adré; Keddy, Karen Helena

    2017-09-29

    Workers in clinical microbiology laboratories are exposed to a variety of pathogenic microorganisms. Salmonella species is among the most commonly reported bacterial causes of laboratory-acquired infections. We report on three cases of laboratory-acquired Salmonella enterica serotype Typhi (Salmonella Typhi) infection which occurred over the period 2012 to 2016 in South Africa. Laboratory investigation included phenotypic and genotypic characterization of isolates. Phenotypic analysis included standard microbiological identification techniques, serotyping and antimicrobial susceptibility testing. Genotypic analysis included the molecular subtyping methodologies of pulsed-field gel electrophoresis analysis, multilocus sequence typing and whole-genome sequencing (WGS); with WGS data analysis including phylogenetic analysis based upon comparison of single nucleotide polymorphism profiles of isolates. All cases of laboratory-acquired infection were most likely the result of lapses in good laboratory practice and laboratory safety. The following critical issues were highlighted. There was misdiagnosis and misreporting of Salmonella Typhi as nontyphoidal Salmonella by a diagnostic laboratory, with associated public health implications. We highlight issues concerning the importance of accurate fluoroquinolone susceptibility testing and interpretation of results according to updated guidelines. We describe potential shortcomings of a single disk susceptibility screening test for fluoroquinolone susceptibility and suggest that confirmatory minimum inhibitory concentration testing should always be performed in cases of invasive Salmonella infections. These antimicrobial susceptibility testing issues resulted in inappropriate ciprofloxacin therapy which may have been responsible for failure in clearance of pathogen from patients. Salmonella Typhi capsular polysaccharide vaccine was not protective in one case, possibly secondarily to a faulty vaccine. Molecular subtyping of

  9. Analysis of graphical representation among freshmen in undergraduate physics laboratory

    NASA Astrophysics Data System (ADS)

    Adam, A. S.; Anggrayni, S.; Kholiq, A.; Putri, N. P.; Suprapto, N.

    2018-03-01

    Understanding physics concepts is an important outcome of the physics laboratory for freshmen in the undergraduate program. This includes the ability to interpret the meaning of a graph and to draw appropriate conclusions. This particular study analyses graphical representation among freshmen in an undergraduate physics laboratory, using an empirical, quantitative approach. The graphical representation covers 3 physics topics: velocity of sound, simple pendulum and spring system. The results show that most of the freshmen (90% of the sample) can make a graph based on the data from the physics laboratory; that is, they can transfer raw data presented in a table into a physics graph. Most of the freshmen use the proportionality between variables in graph analysis. However, the freshmen cannot choose appropriate variables for the graph to gain more information and cannot analyse the graph to obtain useful information from its slope.

  10. Guidelines on Good Clinical Laboratory Practice

    PubMed Central

    Ezzelle, J.; Rodriguez-Chavez, I. R.; Darden, J. M.; Stirewalt, M.; Kunwar, N.; Hitchcock, R.; Walter, T.; D’Souza, M. P.

    2008-01-01

    A set of Good Clinical Laboratory Practice (GCLP) standards that embraces both the research and clinical aspects of GLP was developed utilizing a variety of collected regulatory and guidance material. We describe eleven core elements that constitute the GCLP standards with the objective of filling a gap in laboratory guidance, based on IND sponsor requirements, for conducting laboratory testing using specimens from human clinical trials. These GCLP standards provide guidance on implementing GLP requirements that are critical for laboratory operations, such as performance of protocol-mandated safety assays, peripheral blood mononuclear cell processing and immunological or endpoint assays from biological interventions on IND-registered clinical trials. The expectation is that compliance with the GCLP standards, monitored annually by external audits, will allow research and development laboratories to maintain data integrity and to provide immunogenicity, safety, and product efficacy data that are repeatable, reliable, auditable and that can be easily reconstructed in a research setting. PMID:18037599

  11. Entrapment of carbon dioxide with chitosan-based core-shell particles containing changeable cores.

    PubMed

    Dong, Yanrui; Fu, Yinghao; Lin, Xia; Xiao, Congming

    2016-08-01

    Water-soluble chitosan-based core-shell particles that contained changeable cores were successfully applied to anchor carbon dioxide. The entrapment capacity of the particles for carbon dioxide (EC) depended on the cores. It was found that the EC of particles that contained aqueous cores was higher than that of beads with water-soluble chitosan (WSC) gel cores, which was confirmed with thermogravimetric analysis. In addition, calcium ions and sodium hydroxide were introduced within the particles to examine their effect on the entrapment. The EC of the particles was enhanced with sodium hydroxide when the cores were WSC gel. The incorporation of calcium ions was helpful for stabilizing carbon dioxide through the formation of calcium carbonate, which was verified with Fourier transform infrared spectra and scanning electron microscopy/energy-dispersive spectrometry. This phenomenon indicates that calcium ions play a significant role in fixing carbon dioxide. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Microstructural investigations on carbonate fault core rocks in active extensional fault zones from the central Apennines (Italy)

    NASA Astrophysics Data System (ADS)

    Cortinovis, Silvia; Balsamo, Fabrizio; Storti, Fabrizio

    2017-04-01

    The study of the microstructural and petrophysical evolution of cataclasites and gouges has a fundamental impact on both hydraulic and frictional properties of fault zones. In the last decades, growing attention has been paid to the characterization of carbonate fault core rocks due to the nucleation and propagation of coseismic ruptures in carbonate successions (e.g., Umbria-Marche 1997, L'Aquila 2009, Amatrice 2016 earthquakes in Central Apennines, Italy). Among several physical parameters, grain size and shape in fault core rocks are expected to control the way of sliding along the slip surfaces in active fault zones, thus influencing the propagation of coseismic ruptures during earthquakes. Nevertheless, the role of grain size and shape distribution evolution in controlling the weakening or strengthening behavior in seismogenic fault zones is still not fully understood, partly because a comprehensive database from natural fault cores is still missing. In this contribution, we present a preliminary study of seismogenic extensional fault zones in Central Apennines by combining detailed field mapping with grain size and microstructural analysis of fault core rocks. Field mapping aimed to describe the structural architecture of fault systems and the along-strike fault rock distribution and fracturing variations. In the laboratory we used a Malvern Mastersizer 3000 granulometer to obtain a precise grain size characterization of loose fault rocks combined with sieving for coarser size classes. In addition, we employed image analysis on thin sections to quantify the grain shape and size in cemented fault core rocks. The studied fault zones consist of an up to 5-10 m-thick fault core where most of the slip is accommodated, surrounded by a tens-of-meters wide fractured damage zone. Fault core rocks consist of (1) loose to partially cemented breccias characterized by different grain size (from several cm up to mm) and variable grain shape (from very angular to sub

  13. Modeling Core Collapse Supernovae

    NASA Astrophysics Data System (ADS)

    Mezzacappa, Anthony

    2017-01-01

    Core collapse supernovae, or the death throes of massive stars, are general relativistic, neutrino-magneto-hydrodynamic events. The core collapse supernova mechanism is still not in hand, though key components have been illuminated, and the potential for multiple mechanisms for different progenitors exists. Core collapse supernovae are the single most important source of elements in the Universe, and serve other critical roles in galactic chemical and thermal evolution, the birth of neutron stars, pulsars, and stellar mass black holes, the production of a subclass of gamma-ray bursts, and as potential cosmic laboratories for fundamental nuclear and particle physics. Given this, the so called "supernova problem" is one of the most important unsolved problems in astrophysics. It has been fifty years since the first numerical simulations of core collapse supernovae were performed. Progress in the past decade, and especially within the past five years, has been exponential, yet much work remains. Spherically symmetric simulations over nearly four decades laid the foundation for this progress. Two-dimensional modeling that assumes axial symmetry is maturing. And three-dimensional modeling, while in its infancy, has begun in earnest. I will present some of the recent work from the "Oak Ridge" group, and will discuss this work in the context of the broader work by other researchers in the field. I will then point to future requirements and challenges. Connections with other experimental, observational, and theoretical efforts will be discussed, as well.

  14. Promoting Utilization of Saccharum spp. Genetic Resources through Genetic Diversity Analysis and Core Collection Construction

    PubMed Central

    Pathak, Bhuvan; Ayala-Silva, Tomas; Yang, Xiping; Todd, James; Glynn, Neil C.; Kuhn, David N.; Glaz, Barry; Gilbert, Robert A.; Comstock, Jack C.; Wang, Jianping

    2014-01-01

    Sugarcane (Saccharum spp.) and other members of Saccharum spp. are attractive biofuel feedstocks. One of the two World Collections of Sugarcane and Related Grasses (WCSRG) is in Miami, FL. This WCSRG has 1002 accessions, presumably with valuable alleles for biomass, other important agronomic traits, and stress resistance. However, the WCSRG has not been fully exploited by breeders due to its lack of characterization and unmanageable population. In order to optimize the use of this genetic resource, we aim to 1) genotypically evaluate all the 1002 accessions to understand the collection's genetic diversity and population structure and 2) form a core collection, which captures most of the genetic diversity in the WCSRG. We screened 36 microsatellite markers on 1002 genotypes and recorded 209 alleles. Genetic diversity of the WCSRG ranged from 0 to 0.5 with an average of 0.304. The population structure analysis and principal coordinate analysis revealed three clusters with all S. spontaneum in one cluster, S. officinarum and S. hybrids in the second cluster and mostly non-Saccharum spp. in the third cluster. A core collection of 300 accessions was identified that captures the maximum genetic diversity of the entire WCSRG and can be further exploited for sugarcane and energy cane breeding. Sugarcane and energy cane breeders can effectively utilize this core collection for cultivar improvement. Further, the core collection can provide resources for forming an association panel to evaluate the traits of agronomic and commercial importance. PMID:25333358
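
    One common way to compute per-marker "genetic diversity" values like those quoted above is Nei's gene diversity, H = 1 - sum(p_i^2) over the allele frequencies p_i at that marker. The Python sketch below uses invented allele calls; the study's own statistic and data may differ.

    from collections import Counter

    def gene_diversity(allele_calls):
        """Nei's gene diversity for one SSR marker."""
        counts = Counter(allele_calls)
        n = sum(counts.values())
        return 1.0 - sum((c / n) ** 2 for c in counts.values())

    # e.g. one marker scored across ten accessions (allele sizes in base pairs)
    marker_calls = [180, 180, 184, 184, 184, 188, 180, 184, 188, 180]
    print(f"H = {gene_diversity(marker_calls):.3f}")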

  15. Laboratory analysis of phacoemulsifier compliance and capacity.

    PubMed

    Nejad, Mitra; Injev, Valentine P; Miller, Kevin M

    2012-11-01

    To compare the compliance and capacity of 7 fluidics modules used by 6 phacoemulsifiers from 3 manufacturers. Jules Stein Eye Institute, Los Angeles, California, USA. Experimental study. Previous-model and current-model phacoemulsifiers from 3 manufacturers were subjected to laboratory analysis of compliance and capacity. Previous-generation models tested included the Legacy Advantec, Whitestar Sovereign Phacoemulsification System, and Millennium Microsurgical System. Current models tested were the Infiniti Vision System with standard and Intrepid cassettes, Whitestar Signature Phacoemulsification System, and Stellaris Vision Enhancement System. To measure compliance, the aspiration line was connected to an electronic pressure transducer and small volumes of fluid were injected or aspirated. To measure capacity, the space between the distal end of the aspiration line and the pump was filled with methylene blue-dyed fluid. The Legacy was the most compliant phacoemulsifier. The old and new Whitestar systems, Millennium system, and Stellaris system showed similar midrange compliances. The Infiniti Vision System with the Intrepid fluidic management system was the least compliant. The Infiniti cassettes had the greatest capacity, which is a detriment from a surge-control perspective, and Signature cassettes had the least capacity. The Infiniti Intrepid system had the lowest compliance of the 6 units tested, which is optimum from a surge-control perspective. All other things being equal, the Infiniti should have the safest occlusion-break surge response. Mr. Injev is an employee of Alcon Laboratories. Dr. Miller is a consultant to and investigator for Alcon Laboratories. Ms. Nejad has no financial or proprietary interest in any material or method mentioned. Copyright © 2012 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
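
    The compliance measurement described above reduces, in essence, to the slope dV/dP of injected volume against measured line pressure. The Python sketch below uses invented volume and pressure readings purely to illustrate the calculation; it is not the study's protocol or data.

    import numpy as np

    injected_volume_ml = np.array([0.00, 0.05, 0.10, 0.15, 0.20])   # assumed injections
    line_pressure_mmHg = np.array([0.0, 12.0, 25.0, 37.0, 50.0])    # assumed transducer readings

    # compliance is the slope of a linear fit of volume against pressure
    slope_ml_per_mmHg, _ = np.polyfit(line_pressure_mmHg, injected_volume_ml, 1)
    print(f"compliance ~ {slope_ml_per_mmHg * 1000:.1f} microliters per mmHg")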

  16. Impactor core disruption by high-energy planetary collisions

    NASA Astrophysics Data System (ADS)

    Landeau, M.; Phillips, D.; Deguen, R.; Neufeld, J.; Dalziel, S.; Olson, P.

    2017-12-01

    Understanding the fate of impactor cores during large planetary collisions is key for predicting metal-silicate equilibration during Earth's accretion. Accretion models and geochemical observations indicate that much of Earth's mass accreted through high-energy impacts between planetary embryos already differentiated into a metallic core and a silicate mantle. Previous studies on core formation assume that the metallic core of the impactor is left intact by the impact, but that it mixes with silicates during its post-impact fall through the magma ocean. Recent impact simulations, however, suggest that the impact cratering process induces significant core disruption and metal-silicate mixing. Unlike existing impact simulations, experiments can produce turbulence, a key ingredient to investigate disruption of the impactor core. Here we use laboratory experiments in which a volume of salt solution (representing the impactor core) vertically impacts a pool of water (representing the magma ocean) to quantify impact-induced mixing between the impactor and the target as a function of impact velocity, impactor size and density difference. We find that the ratio between the impactor inertia and its weight controls mixing. Extrapolated to planetary accretion, our results suggest that the impact process induces no significant mixing for impactors of comparable size to the protoplanet, whereas the impactor core is highly disrupted by impacts involving impactors much smaller than the protoplanet.
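
    One standard way to write the controlling ratio named above (impactor inertia relative to its weight) is a Froude number, Fr = U^2 / (g R), where U is the impact velocity, R the impactor radius, and g the target's gravity. The Python sketch and the numbers in it are illustrative assumptions, not values or definitions taken from the experiments.

    def froude_number(impact_velocity_m_s, impactor_radius_m, g_m_s2=9.81):
        """Ratio of impactor inertia to its weight (a Froude number)."""
        return impact_velocity_m_s ** 2 / (g_m_s2 * impactor_radius_m)

    # laboratory scale: ~1 m/s impact of a ~2 cm blob of salt solution
    print(froude_number(1.0, 0.02))
    # planetary scale: ~10 km/s impact of a ~100 km radius embryo core (Earth-like g assumed)
    print(froude_number(1.0e4, 1.0e5))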

  17. Evaluation of Counter-Based Dynamic Load Balancing Schemes for Massive Contingency Analysis on Over 10,000 Cores

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Huang, Zhenyu; Rice, Mark J.

    Contingency analysis studies are necessary to assess the impact of possible power system component failures. The results of the contingency analysis are used to ensure the grid reliability, and in power market operation for the feasibility test of market solutions. Currently, these studies are performed in real time based on the current operating conditions of the grid with a pre-selected contingency list, which might result in overlooking some critical contingencies caused by variable system status. To have a complete picture of a power grid, more contingencies need to be studied to improve grid reliability. High-performance computing techniques hold the promise of being able to perform the analysis for more contingency cases within a much shorter time frame. This paper evaluates the performance of counter-based dynamic load balancing schemes for a massive contingency analysis program on 10,000+ cores. One million N-2 contingency analysis cases with a Western Electricity Coordinating Council power grid model have been used to demonstrate the performance. Speedups of 3964 with 4096 cores and 7877 with 10,240 cores were obtained. This paper reports the performance of the load balancing scheme with a single counter and two counters, describes disk I/O issues, and discusses other potential techniques for further improving the performance.
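
    The balancing idea evaluated above can be illustrated with a toy, single-machine Python sketch: each worker atomically increments a shared counter to claim the next contingency case, so faster workers naturally take on more cases. The real study used large-scale parallelism on 10,000+ cores; the case count, timings, and worker count below are placeholders.

    import multiprocessing as mp
    import random
    import time

    N_CASES = 200                     # stand-in for the pre-generated contingency list

    def worker(wid, counter, lock, done_counts):
        handled = 0
        while True:
            with lock:                # atomic "fetch and increment" on the shared counter
                case = counter.value
                if case >= N_CASES:
                    break
                counter.value += 1
            time.sleep(random.uniform(0.001, 0.01))   # pretend to solve contingency `case`
            handled += 1
        done_counts[wid] = handled

    if __name__ == "__main__":
        counter = mp.Value("i", 0)    # the single shared task counter
        lock = mp.Lock()
        done_counts = mp.Array("i", 4)
        procs = [mp.Process(target=worker, args=(w, counter, lock, done_counts))
                 for w in range(4)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
        # uneven per-worker counts are the signature of dynamic balancing
        print("cases per worker:", list(done_counts))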

  18. A tracking system for laboratory mice to support medical researchers in behavioral analysis.

    PubMed

    Macrì, S; Mainetti, L; Patrono, L; Pieretti, S; Secco, A; Sergi, I

    2015-08-01

    The behavioral analysis of laboratory mice plays a key role in several medical and scientific research areas, such as biology, toxicology, pharmacology, and so on. Important information on mice behavior and their reaction to a particular stimulus is deduced from a careful analysis of their movements. Moreover, behavioral analysis of genetically modified mice allows obtaining important information about particular genes, phenotypes or drug effects. The techniques commonly adopted to support such analysis have many limitations, which make the related systems particularly ineffective. Currently, the engineering community is working to explore innovative identification and sensing technologies to develop new tracking systems able to guarantee benefits to animals' behavior analysis. This work presents a tracking solution based on passive Radio Frequency Identification Technology (RFID) in Ultra High Frequency (UHF) band. Much emphasis is given to the software component of the system, based on a Web-oriented solution, able to process the raw tracking data coming from a hardware system, and offer 2D and 3D tracking information as well as reports and dashboards about mice behavior. The system has been widely tested using laboratory mice and compared with an automated video-tracking software (i.e., EthoVision). The obtained results have demonstrated the effectiveness and reliability of the proposed solution, which is able to correctly detect the events occurring in the animals' cage, and to offer a complete and user-friendly tool to support researchers in behavioral analysis of laboratory mice.

  19. Design of a Clinical Information Management System to Support DNA Analysis Laboratory Operation

    PubMed Central

    Dubay, Christopher J.; Zimmerman, David; Popovich, Bradley

    1995-01-01

    The LabDirector system has been developed at the Oregon Health Sciences University to support the operation of our clinical DNA analysis laboratory. Through an iterative design process which has spanned two years, we have produced a system that is both highly tailored to a clinical genetics production laboratory and flexible in its implementation, to support the rapid growth and change of protocols and methodologies in use in the field. The administrative aspects of the system are integrated with an enterprise schedule management system. The laboratory side of the system is driven by a protocol modeling and execution system. The close integration between these two aspects of the clinical laboratory facilitates smooth operations, and allows management to accurately measure costs and performance. The entire application has been designed and documented to provide utility to a wide range of clinical laboratory environments.

  20. Quantitative analysis of core fucosylation of serum proteins in liver diseases by LC-MS-MRM.

    PubMed

    Ma, Junfeng; Sanda, Miloslav; Wei, Renhuizi; Zhang, Lihua; Goldman, Radoslav

    2018-02-07

    Aberrant core fucosylation of proteins has been linked to liver diseases. In this study, we carried out multiple reaction monitoring (MRM) quantification of core fucosylated N-glycopeptides of serum proteins partially deglycosylated by a combination of endoglycosidases (endoF1, endoF2, and endoF3). To minimize variability associated with the preparatory steps, the analysis was performed without enrichment of glycopeptides or fractionation of serum besides the nanoRP chromatography. Specifically, we quantified core fucosylation of 22 N-glycopeptides derived from 17 proteins together with protein abundance of these glycoproteins in a cohort of 45 participants (15 disease-free controls, 15 fibrosis and 15 cirrhosis patients) using a multiplex nanoUPLC-MS-MRM workflow. We find increased core fucosylation of 5 glycopeptides at the stage of liver fibrosis (i.e., N630 of serotransferrin, N107 of alpha-1-antitrypsin, N253 of plasma protease C1 inhibitor, N397 of ceruloplasmin, and N86 of vitronectin), an increase in an additional 6 glycopeptides at the stage of cirrhosis (i.e., N138 and N762 of ceruloplasmin, N354 of clusterin, N187 of hemopexin, N71 of immunoglobulin J chain, and N127 of lumican), while the degree of core fucosylation of 10 glycopeptides did not change. Interestingly, although we observe an increase in the core fucosylation at N86 of vitronectin in liver fibrosis, core fucosylation decreases on the N169 glycopeptide of the same protein. Our results demonstrate that the changes in core fucosylation are protein and site specific during the progression of fibrotic liver disease and independent of the changes in the quantity of N-glycoproteins. It is expected that the fully optimized multiplex LC-MS-MRM assay of core fucosylated glycopeptides will be useful for the serologic assessment of liver fibrosis. We have quantified the difference in core fucosylation among three comparison groups (healthy control, fibrosis and cirrhosis patients) using a sensitive and

  1. Providing critical laboratory results on time, every time to help reduce emergency department length of stay: how our laboratory achieved a Six Sigma level of performance.

    PubMed

    Blick, Kenneth E

    2013-08-01

    To develop a fully automated core laboratory, handling samples on a "first in, first out" real-time basis with Lean/Six Sigma management tools. Our primary goal was to provide services to critical care areas, eliminating turnaround time outlier percentage (TAT-OP) as a factor in patient length of stay (LOS). A secondary goal was to achieve a better laboratory return on investment. In 2011, we reached our primary goal when we calculated the TAT-OP distribution and found we had achieved a Six Sigma level of performance, ensuring that our laboratory service can be essentially eliminated as a factor in emergency department patient LOS. We also measured return on investment, showing a productivity improvement of 35%, keeping pace with our increased testing volume. As a result of our Lean process improvements and Six Sigma initiatives, in part through (1) strategic deployment of point-of-care testing and (2) core laboratory total automation with robotics, middleware, and expert system technology, physicians and nurses at the Oklahoma University Medical Center can more effectively deliver lifesaving health care using evidence-based protocols that depend heavily on "on time, every time" laboratory services.
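
    For readers unfamiliar with the metric, a turnaround-time outlier percentage maps to a "sigma level" through the normal quantile function plus the conventional 1.5-sigma shift. The Python sketch below shows only that mapping; the 3.4 defects-per-million figure is the textbook Six Sigma benchmark and the 1% rate is an invented example, not data from this laboratory.

    from scipy.stats import norm

    def sigma_level(outlier_fraction, shift=1.5):
        """Convert a defect (TAT outlier) fraction to a Six Sigma 'sigma level'."""
        return norm.ppf(1.0 - outlier_fraction) + shift

    print(f"{sigma_level(3.4e-6):.2f} sigma")   # ~6.0 at 3.4 defects per million
    print(f"{sigma_level(0.01):.2f} sigma")     # ~3.8 at a 1% TAT outlier rate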

  2. Sodium Based Heat Pipe Modules for Space Reactor Concepts: Stainless Steel SAFE-100 Core

    NASA Technical Reports Server (NTRS)

    Martin, James J.; Reid, Robert S.

    2004-01-01

    A heat pipe cooled reactor is one of several candidate reactor cores being considered for advanced space power and propulsion systems to support future space exploration applications. Long life heat pipe modules, with designs verified through a combination of theoretical analysis and experimental lifetime evaluations, would be necessary to establish the viability of any of these candidates, including the heat pipe reactor option. A hardware-based program was initiated to establish the infrastructure necessary to build heat pipe modules. This effort, initiated by Los Alamos National Laboratory and referred to as the Safe Affordable Fission Engine (SAFE) project, set out to fabricate and perform non-nuclear testing on a modular heat pipe reactor prototype that can provide 100 kilowatts from the core to an energy conversion system at 700 °C. Prototypic heat pipe hardware was designed, fabricated, filled, closed-out and acceptance tested.

  3. Design and analysis of large-core single-mode windmill single crystal sapphire optical fiber

    DOE PAGES

    Cheng, Yujie; Hill, Cary; Liu, Bo; ...

    2016-06-01

    We present a large-core single-mode “windmill” single crystal sapphire optical fiber (SCSF) design, which exhibits single-mode operation by stripping off the higher-order modes (HOMs) while maintaining the fundamental mode. The “windmill” SCSF design was analyzed using the finite element analysis method, in which all the HOMs are leaky. The numerical simulation results show single-mode operation in the spectral range from 0.4 to 2 μm in the windmill SCSF, with an effective core diameter as large as 14 μm. Such a fiber is expected to improve the performance of many of the current sapphire fiber optic sensor structures.

  4. Evaluation of red blood cell and platelet antigen genotyping platforms (ID CORE XT/ID HPA XT) in routine clinical practice.

    PubMed

    Finning, Kirstin; Bhandari, Radhika; Sellers, Fiona; Revelli, Nicoletta; Villa, Maria Antonietta; Muñiz-Díaz, Eduardo; Nogués, Núria

    2016-03-01

    High-throughput genotyping platforms enable simultaneous analysis of multiple polymorphisms for blood group typing. BLOODchip® ID is a genotyping platform based on Luminex® xMAP technology for simultaneous determination of 37 red blood cell (RBC) antigens (ID CORE XT) and 18 human platelet antigens (HPA) (ID HPA XT) using the BIDS XT software. In this international multicentre study, the performance of ID CORE XT and ID HPA XT, using the centres' current genotyping methods as the reference for comparison, and the usability and practicality of these systems, were evaluated under working laboratory conditions. DNA was extracted from whole blood in EDTA with Qiagen methodologies. Ninety-six previously phenotyped/genotyped samples were processed per assay: 87 testing samples plus five positive controls and four negative controls. Results were available for 519 samples: 258 with ID CORE XT and 261 with ID HPA XT. There were three "no calls" that were either caused by human error or resolved after repeating the test. Agreement between the tests and reference methods was 99.94% for ID CORE XT (9,540/9,546 antigens determined) and 100% for ID HPA XT (all 4,698 alleles determined). There were six discrepancies in antigen results in five RBC samples, four of which (in VS, N, S and Do(a)) could not be investigated due to lack of sufficient sample to perform additional tests and two of which (in S and C) were resolved in favour of ID CORE XT (100% accuracy). The total hands-on time was 28-41 minutes for a batch of 16 samples. Compared with the reference platforms, ID CORE XT and ID HPA XT were considered simpler to use and had shorter processing times. ID CORE XT and ID HPA XT genotyping platforms for RBC and platelet systems were accurate and user-friendly in working laboratory settings.

  5. Status of Undergraduate Pharmacology Laboratories in Colleges of Pharmacy in the United States

    ERIC Educational Resources Information Center

    Katz, Norman L.; And Others

    1978-01-01

    U.S. colleges of pharmacy were surveyed in 1976 to determine whether a trend exists in continuing, discontinuing, or restructuring laboratory time in pharmaceutical education. Data regarding core undergraduate pharmacology courses, undergraduate pharmacology laboratory status, and pharmacology faculty are presented. (LBH)

  6. Fault Tree Analysis: Investigation of Epidemic Hemorrhagic Fever Infection Acquired in Animal Laboratories in China.

    PubMed

    Liu, Xiao Yu; Xue, Kang Ning; Rong, Rong; Zhao, Chi Hong

    2016-01-01

    Epidemic hemorrhagic fever has been an ongoing threat to laboratory personnel involved in animal care and use. Laboratory transmissions and severe infections have occurred over the past twenty years, even though standards and regulations for laboratory biosafety have been issued, upgraded, and implemented in China. Therefore, there is an urgent need to identify risk factors and to seek effective preventive measures that can curb the incidence of epidemic hemorrhagic fever among laboratory personnel. In the present study, we reviewed literature relevant to animal laboratory-acquired hemorrhagic fever infections reported from 1995 to 2015, and analyzed these incidents using fault tree analysis (FTA). The analysis showed that purchasing qualified animals and guarding against wild rats, which together ensure that laboratory animals are free of hantaviruses, are the basic measures for preventing infection. In daily management, awareness of personal protection and the ability to apply protective measures need to be further improved. Vaccination is undoubtedly the most direct and effective measure, but it exerts its effect only after infection has occurred, so avoiding infection cannot rely entirely on vaccination. Copyright © 2016 The Editorial Board of Biomedical and Environmental Sciences. Published by China CDC. All rights reserved.

  7. Analysis of pan-genome to identify the core genes and essential genes of Brucella spp.

    PubMed

    Yang, Xiaowen; Li, Yajie; Zang, Juan; Li, Yexia; Bie, Pengfei; Lu, Yanli; Wu, Qingmin

    2016-04-01

    Brucella spp. are facultative intracellular pathogens that cause a contagious zoonotic disease, which can result in outcomes such as abortion or sterility in susceptible animal hosts and grave, debilitating illness in humans. To help decipher the survival mechanism of Brucella spp. in vivo, 42 complete Brucella genomes from NCBI were analyzed to characterize the composition and function of the pan-genome and core genome. The results showed that the total of 132,143 protein-coding genes in these genomes were divided into 5369 clusters. Among these, 1710 clusters were associated with the core genome, 1182 clusters with strain-specific genes and 2477 clusters with dispensable genomes. COG analysis indicated that 44 % of the core genes were devoted to metabolism, mainly energy production and conversion (COG category C) and amino acid transport and metabolism (COG category E). Meanwhile, approximately 35 % of the core genes were under positive selection. In addition, 1252 potential essential genes were predicted in the core genome by comparison with a prokaryote database of essential genes. The results suggested that the core genes in Brucella genomes are relatively conserved, and that energy and amino acid metabolism play a particularly important role in growth and reproduction in Brucella spp. This study might help us to better understand the mechanisms of Brucella persistent infection and provide some clues for further exploring the gene modules of intracellular survival in Brucella spp.
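
    The core/dispensable/strain-specific split described above follows directly from each cluster's presence or absence across genomes. The Python sketch below uses a toy presence table with invented cluster and genome names; it is not the pipeline used in the study.

    presence = {                      # cluster -> set of genomes carrying it (toy data)
        "clusterA": {"g1", "g2", "g3", "g4"},
        "clusterB": {"g1"},
        "clusterC": {"g1", "g3"},
    }
    n_genomes = 4

    def classify(genomes_with_cluster, n_genomes):
        """Core if present in all genomes, strain-specific if in exactly one."""
        k = len(genomes_with_cluster)
        if k == n_genomes:
            return "core"
        if k == 1:
            return "strain-specific"
        return "dispensable"

    for cluster, genomes in presence.items():
        print(cluster, classify(genomes, n_genomes))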

  8. Is HCV core antigen a reliable marker of viral load? An evaluation of HCV core antigen automated immunoassay

    PubMed Central

    Hadziyannis, Emilia; Minopetrou, Martha; Georgiou, Anastasia; Spanou, Fotini; Koskinas, John

    2013-01-01

    Background Hepatitis C viral (HCV) load detection and quantification is routinely accomplished by HCV RNA measurement, an expensive but essential test, both for the diagnosis and treatment of chronic hepatitis C (CHC). HCV core antigen (Ag) testing has been suggested as an attractive alternative to molecular diagnostics. The aim of the study was to evaluate an automated chemiluminescent immunoassay (CLIA) for HCV core Ag measurement in comparison to quantitative HCV RNA determination. Methods HCV Ag was measured in 105 anti-HCV positive patients, of whom 89 were HCV RNA positive with CHC and 16 were HCV RNA negative after spontaneous HCV clearance. Viral load was quantified with branched DNA (bDNA, Versant, Siemens). Sera were stored at -70°C and then tested with the Architect HCV Ag test (Abbott Laboratories), a two-step CLIA assay, with high throughput and minimal handling of the specimens. Statistical analysis was performed on logarithmically transformed values. Results HCV-Ag was detectable and quantifiable in 83/89 and in the grey zone in 4/89 HCV RNA positive sera. HCV-Ag was undetectable in all 16 HCV RNA negative samples. The sample with the lowest viral load that tested positive for HCV-Ag contained 1200 IU/mL HCV RNA. There was a positive correlation between HCV RNA and HCV-Ag (r=0.89). The HCV RNA/HCV Ag ratio varied from 1.5 to 3.25. Conclusion HCV core Ag is an easy test with comparable sensitivity (>90%) and satisfactory correlation with the HCV RNA bDNA assay. Its role in diagnostics and other clinical applications has to be determined based on cost effectiveness. PMID:24714621
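
    The correlation step mentioned above (Pearson r on logarithmically transformed values) can be sketched in Python as follows. The paired HCV RNA and core antigen values are invented for illustration and are not the study's measurements.

    import numpy as np
    from scipy.stats import pearsonr

    hcv_rna_iu_ml = np.array([1.2e3, 5.0e4, 2.1e5, 8.7e5, 3.3e6, 1.0e7])   # assumed
    hcv_core_ag   = np.array([3.1, 95.0, 410.0, 1500.0, 5200.0, 16000.0])  # assumed

    # correlate on log10-transformed values, as described in the abstract
    r, p = pearsonr(np.log10(hcv_rna_iu_ml), np.log10(hcv_core_ag))
    print(f"r = {r:.2f}, p = {p:.3g}")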

  9. Crystallization and preliminary X-ray crystallographic analysis of the GluR0 ligand-binding core from Nostoc punctiforme

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Jun Hyuck; Park, Soo Jeong; Rho, Seong-Hwan

    2005-11-01

    The GluR0 ligand-binding core from N. punctiforme was expressed, purified and crystallized in the presence of L-glutamate. A diffraction data set was collected to a resolution of 2.1 Å. GluR0 from Nostoc punctiforme (NpGluR0) is a bacterial homologue of the ionotropic glutamate receptor. The ligand-binding core of NpGluR0 was crystallized at 294 K using the hanging-drop vapour-diffusion method. The L-glutamate-complexed crystal belongs to space group C222₁, with unit-cell parameters a = 78.0, b = 145.1, c = 132.1 Å. The crystals contain three subunits in the asymmetric unit, with a V_M value of 2.49 Å³ Da⁻¹. The diffraction limit of the L-glutamate complex data set was 2.1 Å using synchrotron X-ray radiation at beamline BL-4A of the Pohang Accelerator Laboratory (Pohang, Korea).

  10. Paleoclimatological analysis of Late Eocene core, Manning Formation, Brazos County, Texas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yancey, T.; Elsik, W.

    1994-09-01

    A core of the basal part of the Manning Formation was drilled to provide a baseline for paleoclimate analysis of the expanded section of siliciclastic sediments of late Eocene age in the outcrop belt. The interdeltaic Jackson Stage deposits of this area include 20+ cyclic units containing both lignite and shallow marine sediments. Depositional environments can be determined with precision and the repetitive nature of cycles allows comparisons of the same environment throughout, effectively removing depositional environment as a variable in interpretation of climate signal. Underlying Yegua strata contain similar cycles, providing 35+ equivalent environmental transects within a 6 m.y. time interval of Jackson and Yegua section, when additional cores are taken. The core is from a cycle deposited during maximum flooding of the Jackson Stage, with deposits ranging from shoreface (carbonaceous) to midshelf, beyond the range of storm sand deposition. Sediments are leached of carbonate, but contain foram test linings, agglutinated forams, fish debris, and rich assemblages of terrestrial and marine palynomorphs. All samples examined contain marine dinoflagellates, which are most abundant in transgressive and maximum flood zones, along with agglutinated forams and fish debris. This same interval contains two separate pulses of reworked palynomorphs. The transgressive interval contains Glaphyrocysta intricata, normally present in Yegua sediments. Pollen indicates fluctuating subtropical to tropical paleoclimates, with three short cycles of cooler temperatures, indicated by abundance peaks of alder pollen (Alnus) in transgressive, maximum flood, and highstand deposits.

  11. Update on the magnetic resonance imaging core of the Alzheimer's disease neuroimaging initiative.

    PubMed

    Jack, Clifford R; Bernstein, Matt A; Borowski, Bret J; Gunter, Jeffrey L; Fox, Nick C; Thompson, Paul M; Schuff, Norbert; Krueger, Gunnar; Killiany, Ronald J; Decarli, Charles S; Dale, Anders M; Carmichael, Owen W; Tosun, Duygu; Weiner, Michael W

    2010-05-01

    Functions of the Alzheimer's Disease Neuroimaging Initiative (ADNI) magnetic resonance imaging (MRI) core fall into three categories: (1) those of the central MRI core laboratory at Mayo Clinic, Rochester, Minnesota, needed to generate high quality MRI data in all subjects at each time point; (2) those of the funded ADNI MRI core imaging analysis groups responsible for analyzing the MRI data; and (3) the joint function of the entire MRI core in designing and problem solving MR image acquisition, pre-processing, and analyses methods. The primary objective of ADNI was and continues to be improving methods for clinical trials in Alzheimer's disease. Our approach to the present ("ADNI-GO") and future ("ADNI-2," if funded) MRI protocol will be to maintain MRI methodological consistency in the previously enrolled "ADNI-1" subjects who are followed up longitudinally in ADNI-GO and ADNI-2. We will modernize and expand the MRI protocol for all newly enrolled ADNI-GO and ADNI-2 subjects. All newly enrolled subjects will be scanned at 3T with a core set of three sequence types: 3D T1-weighted volume, FLAIR, and a long TE gradient echo volumetric acquisition for micro hemorrhage detection. In addition to this core ADNI-GO and ADNI-2 protocol, we will perform vendor-specific pilot sub-studies of arterial spin-labeling perfusion, resting state functional connectivity, and diffusion tensor imaging. One of these sequences will be added to the core protocol on systems from each MRI vendor. These experimental sub-studies are designed to demonstrate the feasibility of acquiring useful data in a multicenter (but single vendor) setting for these three emerging MRI applications. Copyright 2010 The Alzheimer

  12. Instrumental Analysis Chemistry Laboratory

    ERIC Educational Resources Information Center

    Munoz de la Pena, Arsenio; Gonzalez-Gomez, David; Munoz de la Pena, David; Gomez-Estern, Fabio; Sequedo, Manuel Sanchez

    2013-01-01

    A system designed for automating the collection and assessment of laboratory exercises is presented. This Web-based system has been extensively used in engineering courses such as control systems, mechanics, and computer programming. Goodle GMS allows the students to submit their results to a…

  13. Chamber-core structures for fairing acoustic mitigation

    NASA Astrophysics Data System (ADS)

    Ardelean, Emil; Williams, Andrew; Korshin, Nicholas; Henderson, Kyle; Lane, Steven; Richard, Robert

    2005-05-01

    Extreme noise and vibration levels at lift-off and during ascent can damage sensitive payload components. Recently, the Air Force Research Laboratory, Space Vehicles Directorate has investigated a composite structure fabrication approach, called chamber-core, for building payload fairings. Chamber-core offers a strong, lightweight structure with inherent noise attenuation characteristics. It uses one-inch square axial tubes that are sandwiched between inner and outer face-sheets to form a cylindrical fairing structure. These hollow tubes can be used as acoustic dampers to attenuate the amplitude response of low frequency acoustic resonances within the fairing's volume. A cylindrical, graphite-epoxy chamber-core structure was built to study noise transmission characteristics and to quantify the achievable performance improvement. The cylinder was tested in a semi-reverberant acoustics laboratory using bandlimited random noise at sound pressure levels up to 110 dB. The performance was measured using external and internal microphones. The noise reduction was computed as the ratio of the spatially averaged external response to the spatially averaged interior response. The noise reduction provided by the chamber-core cylinder was measured over three bandwidths, 20 Hz to 500 Hz, 20 Hz to 2000 Hz, and 20 Hz to 5000 Hz. For the bare cylinder with no acoustic resonators, the structure provided approximately 13 dB of attenuation over the 20 Hz to 500 Hz bandwidth. With the axial tubes acting as acoustic resonators at various frequencies over the bandwidth, the noise reduction provided by the cylinder increased to 18.2 dB, an overall increase of 4.8 dB over the bandwidth. Narrow-band reductions greater than 10 dB were observed at specific low frequency acoustic resonances. This was accomplished with virtually no added mass to the composite cylinder.
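
    The noise-reduction metric described above, the ratio of spatially averaged external to interior response expressed in decibels, can be written out in a few lines of Python. The microphone values below are invented; they illustrate only the calculation, not the test data.

    import numpy as np

    external_pa = np.array([2.1, 2.3, 1.9, 2.2])    # RMS pressure at external microphones (assumed)
    internal_pa = np.array([0.42, 0.47, 0.45])      # RMS pressure at interior microphones (assumed)

    # spatially averaged mean-square pressures, expressed as a dB ratio
    nr_db = 10 * np.log10(np.mean(external_pa ** 2) / np.mean(internal_pa ** 2))
    print(f"noise reduction = {nr_db:.1f} dB")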

  14. Analysis of genetic diversity of rapeseed genetic resources in Japan and core collection construction

    PubMed Central

    Chen, Ruikun; Hara, Takashi; Ohsawa, Ryo; Yoshioka, Yosuke

    2017-01-01

    Diversity analysis of rapeseed accessions preserved in the Japanese Genebank can provide valuable information for breeding programs. In this study, 582 accessions were genotyped with 30 SSR markers covering all 19 rapeseed chromosomes. These markers amplified 311 alleles (10.37 alleles per marker; range, 3–39). The genetic diversity of Japanese accessions was lower than that of overseas accessions. Analysis of molecular variance indicated significant genetic differentiation between Japanese and overseas accessions. Small but significant differences were found among geographical groups in Japan, and genetic differentiation tended to increase with geographical distance. STRUCTURE analysis indicated the presence of two main genetic clusters in the NARO rapeseed collection. With a membership probability threshold, 227 accessions, mostly originating from overseas, were assigned to one subgroup, and 276 accessions, mostly originating from Japan, were assigned to the other subgroup. The remaining 79 accessions were assigned to the admixed group. The constructed core collection comprises 96 accessions of diverse origin. It represents the whole collection well and thus may be useful for rapeseed genetic research and breeding programs. The core collection improves the efficiency of management, evaluation, and utilization of genetic resources. PMID:28744177

  15. Utility of repeat testing of critical values: a Q-probes analysis of 86 clinical laboratories.

    PubMed

    Lehman, Christopher M; Howanitz, Peter J; Souers, Rhona; Karcher, Donald S

    2014-06-01

    A common laboratory practice is to repeat critical values before reporting the test results to the clinical care provider. This may be an unnecessary step that delays the reporting of critical test results without adding value to the accuracy of the test result. To determine the proportions of repeated chemistry and hematology critical values that differ significantly from the original value as defined by the participating laboratory, to determine the threshold differences defined by the laboratory as clinically significant, and to determine the additional time required to analyze the repeat test. Participants prospectively reviewed critical test results for 4 laboratory tests: glucose, potassium, white blood cell count, and platelet count. Participants reported the following information: initial and repeated test result; time initial and repeat results were first known to laboratory staff; critical result notification time; if the repeat result was still a critical result; if the repeat result was significantly different from the initial result, as judged by the laboratory professional or policy; significant difference threshold, as defined by the laboratory; the make and model of the instrument used for primary and repeat testing. Routine, repeat analysis of critical values is a common practice. Most laboratories did not formally define a significant difference between repeat results. Repeated results were rarely considered significantly different. Median repeated times were at least 17 to 21 minutes for 10% of laboratories. Twenty percent of laboratories reported at least 1 incident in the last calendar year of delayed result reporting that clinicians indicated had adversely affected patient care. Routine repeat analysis of automated chemistry and hematology critical values is unlikely to be clinically useful and may adversely affect patient care.

  16. 40 CFR 141.705 - Approved laboratories.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Cryptosporidium analysis by an equivalent State laboratory certification program. (b) E. coli. Any laboratory... coliform or fecal coliform analysis under § 141.74 is approved for E. coli analysis under this subpart when the laboratory uses the same technique for E. coli that the laboratory uses for § 141.74. (c...

  17. 40 CFR 141.705 - Approved laboratories.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Cryptosporidium analysis by an equivalent State laboratory certification program. (b) E. coli. Any laboratory... coliform or fecal coliform analysis under § 141.74 is approved for E. coli analysis under this subpart when the laboratory uses the same technique for E. coli that the laboratory uses for § 141.74. (c...

  18. 40 CFR 141.705 - Approved laboratories.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Cryptosporidium analysis by an equivalent State laboratory certification program. (b) E. coli. Any laboratory... coliform or fecal coliform analysis under § 141.74 is approved for E. coli analysis under this subpart when the laboratory uses the same technique for E. coli that the laboratory uses for § 141.74. (c...

  19. 40 CFR 141.705 - Approved laboratories.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Cryptosporidium analysis by an equivalent State laboratory certification program. (b) E. coli. Any laboratory... coliform or fecal coliform analysis under § 141.74 is approved for E. coli analysis under this subpart when the laboratory uses the same technique for E. coli that the laboratory uses for § 141.74. (c...

  20. 40 CFR 141.705 - Approved laboratories.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Cryptosporidium analysis by an equivalent State laboratory certification program. (b) E. coli. Any laboratory... coliform or fecal coliform analysis under § 141.74 is approved for E. coli analysis under this subpart when the laboratory uses the same technique for E. coli that the laboratory uses for § 141.74. (c...

  1. Investigation of mechanical properties of hydrate-bearing pressure core sediments recovered from the Eastern Nankai Trough using transparent acrylic cell triaxial testing system (TACTT-system)

    NASA Astrophysics Data System (ADS)

    Yoneda, J.; Masui, A.; Konno, Y.; Jin, Y.; Kida, M.; Suzuki, K.; Nakatsuka, Y.; Tenma, N.; Nagao, J.

    2014-12-01

    Natural gas hydrate-bearing pressure core sediments have been sheared in compression using a newly developed Transparent Acrylic Cell Triaxial Testing (TACTT) system to investigate the geophysical and geomechanical behavior of sediments recovered from the deep seabed in the Eastern Nankai Trough, the first Japanese offshore production test region. The sediments were recovered by a hybrid pressure core system (hybrid PCS) and the pressure cores were cut by pressure core analysis tools (PCATs) on board. These pressure cores were transferred to the AIST Hokkaido centre and trimmed by pressure core non-destructive analysis tools (PNATs) for the TACTT system, which maintained the pressure and temperature conditions within the hydrate stability boundary through the entire process of core handling, from drilling to the end of laboratory testing. An image processing technique was used to capture the motion of sediment in a transparent acrylic cell, and digital photographs were obtained at every 0.1% of vertical strain during the test. Analysis of the optical images showed that sediments with 63% hydrate saturation exhibited brittle failure, whereas nonhydrate-bearing sediments exhibited ductile failure. In addition, the increase in shear strength with increasing hydrate saturation for natural gas hydrate is in agreement with previous data from synthetic gas hydrate. This research was financially supported by the Research Consortium for Methane Hydrate Resources in Japan (MH21 Research Consortium) that carries out Japan's Methane Hydrate R&D Program by the Ministry of Economy, Trade and Industry (METI).

  2. Ernest Orlando Lawrence Berkeley National Laboratory institutional plan, FY 1996--2001

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-11-01

    The FY 1996--2001 Institutional Plan provides an overview of the Ernest Orlando Lawrence Berkeley National Laboratory mission, strategic plan, core business areas, critical success factors, and the resource requirements to fulfill its mission in support of national needs in fundamental science and technology, energy resources, and environmental quality. The Laboratory Strategic Plan section identifies long-range conditions that will influence the Laboratory, as well as potential research trends and management implications. The Core Business Areas section identifies those initiatives that are potential new research programs representing major long-term opportunities for the Laboratory, and the resources required for their implementation. It also summarizes current programs and potential changes in research program activity, science and technology partnerships, and university and science education. The Critical Success Factors section reviews human resources; work force diversity; environment, safety, and health programs; management practices; site and facility needs; and communications and trust. The Resource Projections are estimates of required budgetary authority for the Laboratory's ongoing research programs. The Institutional Plan is a management report for integration with the Department of Energy's strategic planning activities, developed through an annual planning process. The plan identifies technical and administrative directions in the context of the national energy policy and research needs and the Department of Energy's program planning initiatives. Preparation of the plan is coordinated by the Office of Planning and Communications from information contributed by the Laboratory's scientific and support divisions.

  3. Ice Chemistry in Starless Molecular Cores

    NASA Astrophysics Data System (ADS)

    Kalvāns, J.

    2015-06-01

    Starless molecular cores are natural laboratories for interstellar molecular chemistry research. The chemistry of ices in such objects was investigated with a three-phase (gas, surface, and mantle) model. We considered the center part of five starless cores, with their physical conditions derived from observations. The ice chemistry of oxygen, nitrogen, sulfur, and complex organic molecules (COMs) was analyzed. We found that an ice-depth dimension, measured, e.g., in monolayers, is essential for modeling of chemistry in interstellar ices. Particularly, the H2O:CO:CO2:N2:NH3 ice abundance ratio regulates the production and destruction of minor species. It is suggested that photodesorption during the core-collapse period is responsible for the high abundance of interstellar H2O2 and O2H and other species synthesized on the surface. The calculated abundances of COMs in ice were compared to observed gas-phase values. Smaller activation barriers for CO and H2CO hydrogenation may help explain the production of a number of COMs. The observed abundance of methyl formate HCOOCH3 could be reproduced with a 1 kyr, 20 K temperature spike. Possible desorption mechanisms, relevant for COMs, are gas turbulence (ice exposure to interstellar photons) or a weak shock within the cloud core (grain collisions). To reproduce the observed COM abundances with the present 0D model, 1%-10% of ice mass needs to be sublimated. We estimate that the lifetime for starless cores likely does not exceed 1 Myr. Taurus cores are likely to be younger than their counterparts in most other clouds.

  4. A NEW METHOD TO QUANTIFY CORE TEMPERATURE INSTABILITY IN RODENTS.

    EPA Science Inventory

    Methods to quantify instability of autonomic systems such as temperature regulation should be important in toxicant and drug safety studies. Stability of core temperature (Tc) in laboratory rodents is susceptible to a variety of stimuli. Calculating the temperature differential o...

  5. Conducting a surveillance problem analysis on poor feedback from Reference Laboratory, Liberia, February 2016.

    PubMed

    Frimpong, Joseph Asamoah; Amo-Addae, Maame Pokuah; Adewuyi, Peter Adebayo; Hall, Casey Daniel; Park, Meeyoung Mattie; Nagbe, Thomas Knue

    2017-01-01

    The laboratory plays a major role in surveillance, including confirming the start and end of an outbreak. Knowing the causative agent for an outbreak informs the development of response strategies and management plans for a public health event. However, issues and challenges may arise that limit the effectiveness or efficiency of laboratories in surveillance. This case study applies a systematic approach to analyse gaps in laboratory surveillance, thereby improving the ability to mitigate these gaps. Although this case study concentrates on factors resulting in poor feedback from the laboratory, practice of this general approach to problem analysis will confer skills required in analysing most public health issues. This case study was developed based on a report submitted by the district surveillance officer in Grand Bassa County, Liberia, as a resident of the Liberian Frontline Field Epidemiology Training Program in 2016. This case study will serve as a training tool to reinforce lectures on surveillance problem analysis using the fishbone approach. It is designed for public health training in a classroom setting and can be completed within 2 hours 30 minutes.

  6. Optimal design for crosstalk analysis in 12-core 5-LP mode homogeneous multicore fiber for different lattice structure

    NASA Astrophysics Data System (ADS)

    Kumar, Dablu; Ranjan, Rakesh

    2018-03-01

    12-Core 5-LP mode homogeneous multicore fibers have been proposed for analysis of inter-core crosstalk and dispersion, with four different lattice structures (circular, 2-ring, square lattice, and triangular lattice) having cladding diameter of 200 μm and a fixed cladding thickness of 35 μm. The core-to-core crosstalk impact has been studied numerically with respect to bending radius, core pitch, transmission distance, wavelength, and core diameter for all 5-LP modes. In anticipation of further reduction in crosstalk levels, the trench-assisted cores have been incorporated for all respective designs. Ultra-low crosstalk (-138 dB/100 km) has been achieved through the triangular lattice arrangement, with trench depth Δ2 = -1.40% for fundamental (LP01) mode. It has been noted that the impact of mode polarization on crosstalk behavior is minor, with difference in crosstalk levels between two polarized spatial modes as ≤0.2 dB. Moreover, the optimized cladding diameter has been obtained for all 5-LP modes for a target value of crosstalk of -50 dB/100 km, with all the core arrangements. The dispersion characteristic has also been analyzed with respect to wavelength, which is nearly 2.5 ps/nm km at operating wavelength 1550 nm. The relative core multiplicity factor (RCMF) for the proposed design is obtained as 64.

  7. Ultra-trace analysis of 41Ca in urine by accelerator mass spectrometry: an inter-laboratory comparison

    PubMed Central

    Jackson, George S.; Hillegonds, Darren J.; Muzikar, Paul; Goehring, Brent

    2013-01-01

    A 41Ca interlaboratory comparison between Lawrence Livermore National Laboratory (LLNL) and the Purdue Rare Isotope Laboratory (PRIME Lab) has been completed. Analysis of the ratios assayed by accelerator mass spectrometry (AMS) shows that there is no statistically significant difference in the ratios. Further, Bayesian analysis shows that the uncertainties reported by both facilities are correct with the possibility of a slight under-estimation by one laboratory. Finally, the chemistry procedures used by the two facilities to produce CaF2 for the cesium sputter ion source are robust and don't yield any significant differences in the final result. PMID:24179312

  8. Measurement of unsaturated hydraulic properties and evaluation of property-transfer models for deep sedimentary interbeds, Idaho National Laboratory, Idaho

    USGS Publications Warehouse

    Perkins, Kimberlie; Johnson, Brittany D.; Mirus, Benjamin B.

    2014-01-01

    During 2013–14, the USGS, in cooperation with the U.S. Department of Energy, focused on further characterization of the sedimentary interbeds below the future site of the proposed Remote Handled Low-Level Waste (RHLLW) facility, which is intended for the long-term storage of low-level radioactive waste. Twelve core samples from the sedimentary interbeds from a borehole near the proposed facility were collected for laboratory analysis of hydraulic properties, which also allowed further testing of the property-transfer modeling approach. For each core sample, the steady-state centrifuge method was used to measure relations between matric potential, saturation, and conductivity. These laboratory measurements were compared to water-retention and unsaturated hydraulic conductivity parameters estimated using the established property-transfer models. For each core sample obtained, the agreement between measured and estimated hydraulic parameters was evaluated quantitatively using the Pearson correlation coefficient (r). The highest correlation is for saturated hydraulic conductivity (Ksat), with an r value of 0.922. The saturated water content (θsat) also exhibits a strong linear correlation, with an r value of 0.892. The curve shape parameter (λ) has an r value of 0.731, whereas the curve scaling parameter (ψ0) has the lowest r value of 0.528. The r values demonstrate that model predictions correspond well to the laboratory-measured properties for most parameters, which supports the value of extending this approach for quantifying unsaturated hydraulic properties at various sites throughout INL.
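
    As a hedged illustration of the comparison step described above (not the USGS workflow itself), the short Python sketch below computes a Pearson r between a laboratory-measured parameter and its model-estimated counterpart; the twelve paired values are invented placeholders, and a log transform of conductivity, often appropriate, is omitted for simplicity.

```python
# Minimal sketch (not the USGS procedure): Pearson r between laboratory-measured
# and property-transfer-model-estimated values of one hydraulic parameter.
# The twelve values below are placeholders, not data from the report.
import numpy as np

measured = np.array([3.1e-5, 1.2e-5, 8.4e-6, 2.2e-5, 5.0e-6, 1.8e-5,
                     9.1e-6, 2.9e-5, 7.3e-6, 1.1e-5, 4.4e-6, 2.5e-5])   # e.g., Ksat (m/s), hypothetical
estimated = np.array([2.8e-5, 1.5e-5, 7.9e-6, 2.0e-5, 6.2e-6, 1.6e-5,
                      1.0e-5, 2.6e-5, 8.1e-6, 1.3e-5, 5.0e-6, 2.2e-5])  # model output, hypothetical

# np.corrcoef returns the 2x2 correlation matrix; r is the off-diagonal term.
r = np.corrcoef(measured, estimated)[0, 1]
print(f"Pearson r = {r:.3f}")
```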

  9. NCI Core Open House Shines Spotlight on Supportive Science and Basic Research | Poster

    Cancer.gov

    The lobby of Building 549 at NCI at Frederick bustled with activity for two hours on Tuesday, May 1, as several dozen scientists and staff gathered for the NCI Core Open House. The event aimed to encourage discussion and educate visitors about the capabilities of the cores, laboratories, and facilities that offer support to NCI’s Center for Cancer Research.

  10. 7 CFR 91.37 - Standard hourly fee rate for laboratory testing, analysis, and other services.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Standard hourly fee rate for laboratory testing, analysis, and other services. 91.37 Section 91.37 Agriculture Regulations of the Department of Agriculture... AGRICULTURE (CONTINUED) COMMODITY LABORATORY TESTING PROGRAMS SERVICES AND GENERAL INFORMATION Fees and...

  11. The Synthetic Experiment: E. B. Titchener's Cornell Psychological Laboratory and the Test of Introspective Analysis.

    PubMed

    Evans, Rand B

    2017-01-01

    Beginning in 1900, a major thread of research was added to E. B. Titchener's Cornell laboratory: the synthetic experiment. Titchener and his graduate students used introspective analysis to reduce a perception, a complex experience, into its simple sensory constituents. To test the validity of that analysis, stimulus patterns were selected to reproduce the patterns of sensations found in the introspective analyses. If the original perception could be reconstructed in this way, then the analysis was considered validated. This article reviews the development of the synthetic method in E. B. Titchener's laboratory at Cornell University and examines its impact on psychological research.

  12. Development and analysis of a meteorological database, Argonne National Laboratory, Illinois

    USGS Publications Warehouse

    Over, Thomas M.; Price, Thomas H.; Ishii, Audrey L.

    2010-01-01

    A database of hourly values of air temperature, dewpoint temperature, wind speed, and solar radiation from January 1, 1948, to September 30, 2003, primarily using data collected at the Argonne National Laboratory station, was developed for use in continuous-time hydrologic modeling in northeastern Illinois. Missing and apparently erroneous data values were replaced with adjusted values from nearby stations used as 'backup'. Temporal variations in the statistical properties of the data resulting from changes in measurement and data-storage methodologies were adjusted to match the statistical properties resulting from the data-collection procedures that have been in place since January 1, 1989. The adjustments were computed based on the regressions between the primary data series from Argonne National Laboratory and the backup series using data obtained during common periods; the statistical properties of the regressions were used to assign estimated standard errors to values that were adjusted or filled from other series. Each hourly value was assigned a corresponding data-source flag that indicates the source of the value and its transformations. An analysis of the data-source flags indicates that all the series in the database except dewpoint have a similar fraction of Argonne National Laboratory data, with about 89 percent for the entire period, about 86 percent from 1949 through 1988, and about 98 percent from 1989 through 2003. The dewpoint series, for which observations at Argonne National Laboratory did not begin until 1958, has only about 71 percent Argonne National Laboratory data for the entire period, about 63 percent from 1948 through 1988, and about 93 percent from 1989 through 2003, indicating a lower reliability of the dewpoint sensor. A basic statistical analysis of the filled and adjusted data series in the database, and a series of potential evapotranspiration computed from them using the computer program LXPET (Lamoreux Potential
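
    The gap-filling and adjustment idea described above can be sketched as follows. This is a minimal illustration under assumed synthetic data, not the report's procedure; the series, gap location, and error estimate are all placeholders.

```python
# Hedged sketch of the general idea (not the USGS workflow): fill gaps in a
# primary hourly series from a backup station via a regression fitted over
# their common period, and attach the regression's standard error to filled values.
import numpy as np

rng = np.random.default_rng(0)
primary = 10 + 5 * np.sin(np.linspace(0, 20, 500)) + rng.normal(0, 1, 500)
backup = primary * 0.9 + 1.5 + rng.normal(0, 0.8, 500)      # a correlated nearby station
primary[100:140] = np.nan                                    # simulated data gap

common = ~np.isnan(primary) & ~np.isnan(backup)
slope, intercept = np.polyfit(backup[common], primary[common], 1)
residuals = primary[common] - (slope * backup[common] + intercept)
std_err = residuals.std(ddof=2)                              # rough standard error of the fit

filled = primary.copy()
gap = np.isnan(primary) & ~np.isnan(backup)
filled[gap] = slope * backup[gap] + intercept                # adjusted backup values
flags = np.where(gap, "backup-regression", "primary")        # data-source flag per value
print(f"filled {gap.sum()} hours, standard error ~ {std_err:.2f}")
```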

  13. Earth's core and inner-core resonances from analysis of VLBI nutation and superconducting gravimeter data

    NASA Astrophysics Data System (ADS)

    Rosat, S.; Lambert, S. B.; Gattano, C.; Calvo, M.

    2017-01-01

    Geophysical parameters of the deep Earth's interior can be evaluated through the resonance effects associated with the core and inner-core wobbles on the forced nutations of the Earth's figure axis, as observed by very long baseline interferometry (VLBI), or on the diurnal tidal waves, retrieved from the time-varying surface gravity recorded by superconducting gravimeters (SGs). In this paper, we invert for the rotational mode parameters from both techniques to retrieve geophysical parameters of the deep Earth. We analyse surface gravity data from 15 SG stations and VLBI delays accumulated over the last 35 yr. We show the correlations that exist between several basic Earth parameters and therefore choose to invert for the rotational mode parameters. We employ a Bayesian inversion based on the Metropolis-Hastings algorithm with a Markov-chain Monte Carlo method. We obtain estimates of the free core nutation resonant period and quality factor that are consistent between the two techniques. We also attempt an inversion for the free inner-core nutation (FICN) resonant period from gravity data. The most probable solution gives a period close to the annual prograde term (the S1 tide). However, the 95 per cent confidence interval spans roughly 28 to 725 d for gravity data and 362 to 414 d for nutation data, depending on the prior bounds. The precision of the estimated long-period nutation and of the respective small diurnal tidal constituents is hence not sufficient for a correct determination of the FICN complex frequency.
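
    As a rough illustration of the sampling machinery named above (a Metropolis-Hastings Markov-chain Monte Carlo inversion), the sketch below samples a single resonance-period parameter from synthetic observations; the likelihood, prior bounds, and data are assumptions for illustration, not the authors' model.

```python
# Minimal Metropolis-Hastings sketch, illustrative only: sample a single resonance
# period T (in days) given synthetic "observations" with Gaussian noise.
import numpy as np

rng = np.random.default_rng(1)
obs = rng.normal(430.0, 5.0, size=20)          # fake period-like observations (days)
prior_lo, prior_hi = 300.0, 500.0              # assumed uniform prior bounds

def log_post(T):
    if not (prior_lo < T < prior_hi):
        return -np.inf                         # outside the prior support
    return -0.5 * np.sum((obs - T) ** 2 / 5.0 ** 2)

chain = [400.0]
for _ in range(20000):
    prop = chain[-1] + rng.normal(0, 2.0)      # random-walk proposal (symmetric)
    if np.log(rng.uniform()) < log_post(prop) - log_post(chain[-1]):
        chain.append(prop)                     # accept
    else:
        chain.append(chain[-1])                # reject, keep current state
samples = np.array(chain[5000:])               # discard burn-in
print(f"posterior period: {samples.mean():.1f} +/- {samples.std():.1f} d")
```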

  14. New virtual laboratories presenting advanced motion control concepts

    NASA Astrophysics Data System (ADS)

    Goubej, Martin; Krejčí, Alois; Reitinger, Jan

    2015-11-01

    The paper deals with development of software framework for rapid generation of remote virtual laboratories. Client-server architecture is chosen in order to employ real-time simulation core which is running on a dedicated server. Ordinary web browser is used as a final renderer to achieve hardware independent solution which can be run on different target platforms including laptops, tablets or mobile phones. The provided toolchain allows automatic generation of the virtual laboratory source code from the configuration file created in the open-source Inkscape graphic editor. Three virtual laboratories presenting advanced motion control algorithms have been developed showing the applicability of the proposed approach.

  15. Low back pain in 17 countries, a Rasch analysis of the ICF core set for low back pain.

    PubMed

    Røe, Cecilie; Bautz-Holter, Erik; Cieza, Alarcos

    2013-03-01

    Previous studies indicate that a worldwide measurement tool may be developed based on the International Classification of Functioning, Disability and Health (ICF) Core Sets for chronic conditions. The aim of the present study was to explore the possibility of constructing a cross-cultural measurement of functioning for patients with low back pain (LBP) on the basis of the Comprehensive ICF Core Set for LBP and to evaluate the properties of the ICF Core Set. The Comprehensive ICF Core Set for LBP was scored by health professionals for 972 patients with LBP from 17 countries. Qualifier levels of the categories; invariance across age, sex, and country; construct validity; and the ordering of the categories in the components of body function, body structure, and activities and participation were explored by Rasch analysis. The item-trait χ2-statistics showed that the 53 categories in the ICF Core Set for LBP did not fit the Rasch model (P<0.001). The main challenge was the invariance of the responses across countries. Analysis of the four countries with the largest sample sizes indicated that the data from Germany fit the Rasch model, and the data from Norway, Serbia and Kuwait also fit the model in terms of the components of body functions and activities and participation. The components of body functions and of activities and participation had negative mean locations of -2.19 (SD 1.19) and -2.98 (SD 1.07), respectively. The negative locations indicate that the ICF Core Set reflects patients with a lower level of function than the present patient sample. The present results indicate that it may be possible to construct a clinical measure of function on the basis of the Comprehensive ICF Core Set for LBP by calculating country-specific scores before pooling the data.
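
    For readers unfamiliar with the model being tested, the dichotomous Rasch model relates the probability that person n endorses item i to the person's functioning level θ_n and the item's difficulty δ_i; the ICF qualifiers are polytomous, so a study such as this would in practice rely on a polytomous extension (e.g., the partial credit model), but the core form is

    $$ P(X_{ni} = 1) = \frac{\exp(\theta_n - \delta_i)}{1 + \exp(\theta_n - \delta_i)}. $$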

  16. BOLD VENTURE COMPUTATION SYSTEM for nuclear reactor core analysis, Version III

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W. III.

    1981-06-01

    This report is a condensed documentation for VERSION III of the BOLD VENTURE COMPUTATION SYSTEM for nuclear reactor core analysis. An experienced analyst should be able to use this system routinely for solving problems by referring to this document. Individual reports must be referenced for details. This report covers basic input instructions and describes recent extensions to the modules as well as to the interface data file specifications. Some application considerations are discussed and an elaborate sample problem is used as an instruction aid. Instructions for creating the system on IBM computers are also given.

  17. [caCORE: core architecture of bioinformation on cancer research in America].

    PubMed

    Gao, Qin; Zhang, Yan-lei; Xie, Zhi-yun; Zhang, Qi-peng; Hu, Zhang-zhi

    2006-04-18

    A critical factor in the advancement of biomedical research is the ease with which data can be integrated, redistributed, and analyzed both within and across domains. This paper summarizes the biomedical information core infrastructure built by the National Cancer Institute Center for Bioinformatics (NCICB) in the United States. The main product of this infrastructure is caCORE, the cancer Common Ontologic Reference Environment, which is the backbone supporting data management and application development at NCICB. The paper explains the structure and function of caCORE: (1) Enterprise Vocabulary Services (EVS), which provide controlled vocabulary, dictionary, and thesaurus services and produce the NCI Thesaurus and the NCI Metathesaurus; (2) the Cancer Data Standards Repository (caDSR), which provides a metadata registry for common data elements; and (3) Cancer Bioinformatics Infrastructure Objects (caBIO), which provide Java, Simple Object Access Protocol, and HTTP-XML application programming interfaces. The vision for caCORE is to provide a common data-management framework that supports the consistency, clarity, and comparability of biomedical research data and information. In addition to providing facilities for data management and redistribution, caCORE helps solve problems of data integration. All NCICB-developed caCORE components are distributed under open-source licenses that permit unrestricted use by both non-profit and commercial entities, and caCORE has laid the foundation for a number of scientific and clinical applications. Building on this, the paper briefly describes caCORE-based applications in several NCI projects, including CMAP (Cancer Molecular Analysis Project) and caBIG (Cancer Biomedical Informatics Grid). Finally, the paper outlines the prospects of caCORE: although it was born out of the needs of the cancer research community, it is intended to serve as a general resource. Cancer research has historically

  18. Nano-catalysts with Magnetic Core: Sustainable Options for Greener Synthesis

    EPA Science Inventory

    The author's perspective on nano-catalysts with a magnetic core is summarized, together with recent work from his laboratory. Magnetically recyclable nano-catalysts and their use in benign media are an ideal blend for the development of sustainable methodologies in organic synthesis. Water or pol...

  19. Micro-positron emission tomography for measuring sub-core scale single and multiphase transport parameters in porous media

    NASA Astrophysics Data System (ADS)

    Zahasky, Christopher; Benson, Sally M.

    2018-05-01

    Accurate descriptions of heterogeneity in porous media are important for understanding and modeling single phase (e.g. contaminant transport, saltwater intrusion) and multiphase (e.g. geologic carbon storage, enhanced oil recovery) transport problems. Application of medical imaging to experimentally quantify these processes has led to significant progress in material characterization and understanding fluid transport behavior at laboratory scales. While widely utilized in cancer diagnosis and management, cardiology, and neurology, positron emission tomography (PET) has had relatively limited applications in earth science. This study utilizes a small-bore micro-PET scanner to image and quantify the transport behavior of pulses of a conservative aqueous radiotracer injected during single and multiphase flow experiments in two heterogeneous Berea sandstone cores. The cores are discretized into axial-parallel streamtubes, and using the reconstructed micro-PET data, expressions are derived from spatial moment analysis for calculating sub-core tracer flux and pore water velocity. Using the flux and velocity measurements, it is possible to calculate porosity and saturation from volumetric flux balance, and calculate permeability and water relative permeability from Darcy's law. Second spatial moment analysis enables measurement of sub-core solute dispersion during both single phase and multiphase experiments. A numerical simulation model is developed to verify the assumptions of the streamtube dimension reduction technique. A variation of the reactor ratio is presented as a diagnostic metric to efficiently determine the validity of the streamtube approximation in core and column-scale experiments. This study introduces a new method to quantify sub-core permeability, relative permeability, and dispersion. These experimental and analytical methods provide a foundation for future work on experimental measurements of differences in transport behavior across scales.
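
    A minimal sketch of the moment-and-Darcy reasoning described above (not the paper's streamtube workflow): the first spatial moment of a 1D tracer profile gives the plume centroid, its displacement over time gives pore-water velocity, and Darcy's law converts an assumed pressure drop into permeability. All profiles, the porosity, and the pressure drop below are invented placeholders.

```python
# Illustrative only: first spatial moment -> centroid; centroid displacement -> velocity;
# Darcy's law -> permeability. Numbers are made up.
import numpy as np

x = np.linspace(0.0, 0.1, 50)                     # axial position along core (m)
c_t1 = np.exp(-((x - 0.03) / 0.010) ** 2)         # tracer profile at time t1 (arbitrary units)
c_t2 = np.exp(-((x - 0.05) / 0.012) ** 2)         # tracer profile at time t2
dt = 600.0                                        # seconds between the two scans

def centroid(c):
    return np.sum(x * c) / np.sum(c)              # first spatial moment on a uniform grid

v = (centroid(c_t2) - centroid(c_t1)) / dt        # pore-water velocity (m/s)

# Darcy's law for the whole core: k = q * mu * L / dP, with q the Darcy flux.
phi = 0.20                                        # assumed porosity
q = v * phi                                       # Darcy flux from pore velocity
mu, L, dP = 1.0e-3, 0.1, 5.0e4                    # water viscosity (Pa s), core length (m), pressure drop (Pa)
k = q * mu * L / dP
print(f"velocity {v:.2e} m/s, permeability {k:.2e} m^2")
```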

  20. The AVRDC - The World Vegetable Center mungbean (Vigna radiata) core and mini core collections.

    PubMed

    Schafleitner, Roland; Nair, Ramakrishnan Madhavan; Rathore, Abhishek; Wang, Yen-wei; Lin, Chen-yu; Chu, Shu-hui; Lin, Pin-yun; Chang, Jian-Cheng; Ebert, Andreas W

    2015-04-29

    Large ex situ germplasm collections generally harbor a wide range of crop diversity. AVRDC--The World Vegetable Center is holding in trust the world's second largest mungbean (Vigna radiata) germplasm collection with more than 6,700 accessions. Screening large collections for traits of interest is laborious and expensive. To enhance the access of breeders to the diversity of the crop, mungbean core and mini core collections have been established. The core collection of 1,481 entries has been built by random selection of 20% of the accessions after geographical stratification and subsequent cluster analysis of eight phenotypic descriptors in the whole collection. Summary statistics, especially the low differences of means, equal variance of the traits in both the whole and core collection and the visual inspection of quantile-quantile plots comparing the variation of phenotypic traits present in both collections indicated that the core collection well represented the pattern of diversity of the whole collection. The core collection was genotyped with 20 simple sequence repeat markers and a mini core set of 289 accessions was selected, which depicted the allele and genotype diversity of the core collection. The mungbean core and mini core collections plus their phenotypic and genotypic data are available for distribution to breeders. It is expected that these collections will enhance the access to biodiverse mungbean germplasm for breeding.

  1. Rapid Construction of a Benzo-Fused Indoxamycin Core Enabled by Site-Selective C-H Functionalizations.

    PubMed

    Bedell, T Aaron; Hone, Graham A B; Valette, Damien; Yu, Jin-Quan; Davies, Huw M L; Sorensen, Erik J

    2016-07-11

    Methods for functionalizing carbon-hydrogen bonds are featured in a new synthesis of the tricyclic core architecture that characterizes the indoxamycin family of secondary metabolites. A unique collaboration between three laboratories has engendered a design for synthesis featuring two sequential C-H functionalization reactions, namely a diastereoselective dirhodium carbene insertion followed by an ester-directed oxidative Heck cyclization, to rapidly assemble the congested tricyclic core of the indoxamycins. This project exemplifies how multi-laboratory collaborations can foster conceptually novel approaches to challenging problems in chemical synthesis. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. GENIE: a software package for gene-gene interaction analysis in genetic association studies using multiple GPU or CPU cores.

    PubMed

    Chikkagoudar, Satish; Wang, Kai; Li, Mingyao

    2011-05-26

    Gene-gene interaction in genetic association studies is computationally intensive when a large number of SNPs are involved. Most of the latest Central Processing Units (CPUs) have multiple cores, whereas Graphics Processing Units (GPUs) also have hundreds of cores and have been recently used to implement faster scientific software. However, currently there are no genetic analysis software packages that allow users to fully utilize the computing power of these multi-core devices for genetic interaction analysis for binary traits. Here we present a novel software package GENIE, which utilizes the power of multiple GPU or CPU processor cores to parallelize the interaction analysis. GENIE reads an entire genetic association study dataset into memory and partitions the dataset into fragments with non-overlapping sets of SNPs. For each fragment, GENIE analyzes: 1) the interaction of SNPs within it in parallel, and 2) the interaction between the SNPs of the current fragment and other fragments in parallel. We tested GENIE on a large-scale candidate gene study on high-density lipoprotein cholesterol. Using an NVIDIA Tesla C1060 graphics card, the GPU mode of GENIE achieves a speedup of 27 times over its single-core CPU mode run. GENIE is open-source, economical, user-friendly, and scalable. Since the computing power and memory capacity of graphics cards are increasing rapidly while their cost is going down, we anticipate that GENIE will achieve greater speedups with faster GPU cards. Documentation, source code, and precompiled binaries can be downloaded from http://www.cceb.upenn.edu/~mli/software/GENIE/.
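
    A hedged sketch of the partitioning strategy described above, not GENIE itself: SNPs are split into non-overlapping fragments, and SNP-SNP pairs within each fragment and between fragment pairs are scored in parallel worker processes. The "interaction score" here is a toy statistic on synthetic genotypes.

```python
# Not GENIE: a toy fragment-parallel pairwise scan over synthetic genotype data.
import numpy as np
from itertools import combinations
from multiprocessing import Pool

rng = np.random.default_rng(2)
n_samples, n_snps, frag_size = 200, 40, 10
geno = rng.integers(0, 3, size=(n_samples, n_snps))    # 0/1/2 genotype matrix
trait = rng.integers(0, 2, size=n_samples)             # binary trait

fragments = [list(range(i, min(i + frag_size, n_snps)))
             for i in range(0, n_snps, frag_size)]      # non-overlapping SNP fragments

def score_pairs(pairs):
    # Toy interaction score: |correlation| between a genotype product and the trait.
    out = []
    for i, j in pairs:
        prod = geno[:, i] * geno[:, j]
        r = np.corrcoef(prod, trait)[0, 1]
        out.append((i, j, abs(r)))
    return out

if __name__ == "__main__":
    tasks = [list(combinations(f, 2)) for f in fragments]              # within-fragment pairs
    tasks += [[(i, j) for i in fa for j in fb]                         # between-fragment pairs
              for fa, fb in combinations(fragments, 2)]
    with Pool() as pool:
        results = [hit for chunk in pool.map(score_pairs, tasks) for hit in chunk]
    top = max(results, key=lambda t: t[2])
    print(f"{len(results)} pairs scored; strongest toy interaction: SNPs {top[0]} and {top[1]}")
```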

  3. GENIE: a software package for gene-gene interaction analysis in genetic association studies using multiple GPU or CPU cores

    PubMed Central

    2011-01-01

    Background Gene-gene interaction in genetic association studies is computationally intensive when a large number of SNPs are involved. Most of the latest Central Processing Units (CPUs) have multiple cores, whereas Graphics Processing Units (GPUs) also have hundreds of cores and have been recently used to implement faster scientific software. However, currently there are no genetic analysis software packages that allow users to fully utilize the computing power of these multi-core devices for genetic interaction analysis for binary traits. Findings Here we present a novel software package GENIE, which utilizes the power of multiple GPU or CPU processor cores to parallelize the interaction analysis. GENIE reads an entire genetic association study dataset into memory and partitions the dataset into fragments with non-overlapping sets of SNPs. For each fragment, GENIE analyzes: 1) the interaction of SNPs within it in parallel, and 2) the interaction between the SNPs of the current fragment and other fragments in parallel. We tested GENIE on a large-scale candidate gene study on high-density lipoprotein cholesterol. Using an NVIDIA Tesla C1060 graphics card, the GPU mode of GENIE achieves a speedup of 27 times over its single-core CPU mode run. Conclusions GENIE is open-source, economical, user-friendly, and scalable. Since the computing power and memory capacity of graphics cards are increasing rapidly while their cost is going down, we anticipate that GENIE will achieve greater speedups with faster GPU cards. Documentation, source code, and precompiled binaries can be downloaded from http://www.cceb.upenn.edu/~mli/software/GENIE/. PMID:21615923

  4. Kinetic analysis of bypass of abasic site by the catalytic core of yeast DNA polymerase eta.

    PubMed

    Yang, Juntang; Wang, Rong; Liu, Binyan; Xue, Qizhen; Zhong, Mengyu; Zeng, Hao; Zhang, Huidong

    2015-09-01

    Abasic (apurinic/apyrimidinic, AP) sites, produced ∼50,000 times per cell per day, are strongly blocking and miscoding lesions. To better understand the miscoding mechanisms of the abasic site for yeast DNA polymerase η, pre-steady-state nucleotide incorporation and LC-MS/MS sequence analysis of extension products were studied using pol η(core) (the catalytic core, residues 1-513), which completely eliminates the potential effects of the C-terminal C2H2 motif of pol η on dNTP incorporation. Extension beyond the abasic site was very inefficient. Compared with incorporation of dCTP opposite G, the incorporation efficiencies opposite the abasic site were greatly reduced, in the order dGTP > dATP > dCTP and dTTP. Pol η(core) showed no fast burst phase for any incorporation opposite G or the abasic site, suggesting that the catalytic step is not faster than the dissociation of the polymerase from DNA. LC-MS/MS sequence analysis of extension products showed that 53% of products arose from dGTP misincorporation, 33% from dATP, and 14% from -1 frameshifts, indicating that pol η(core) bypasses the abasic site by a combination of the G-rule, the A-rule, and -1 frameshift deletions. Compared with full-length pol η, pol η(core) showed a relatively reduced efficiency of dCTP incorporation opposite G, increased efficiencies of dNTP incorporation opposite the abasic site and more exclusive incorporation of dGTP opposite the abasic site, but inhibited extension beyond the abasic site and increased the preference for extending A:abasic site pairs relative to G:abasic site pairs. This study provides further understanding of the mutation mechanism of abasic sites for yeast DNA polymerase η. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Stack-and-Draw Manufacture Process of a Seven-Core Optical Fiber for Fluorescence Measurements

    NASA Astrophysics Data System (ADS)

    Samir, Ahmed; Batagelj, Bostjan

    2018-01-01

    Multi-core optical-fiber technology is expected to be used in telecommunications and sensory systems in a relatively short amount of time. However, a successful transition from research laboratories to industry applications will only be possible with an optimized design and manufacturing process. The fabrication process is an important aspect in designing and developing new multi-applicable, multi-core fibers, for which the best candidate is a seven-core fiber. Here, the basics of designing and manufacturing a single-mode, seven-core fiber using the stack-and-draw process are described for the example of a fluorescence sensory system.

  6. A motional Stark effect diagnostic analysis routine for improved resolution of iota in the core of the large helical device.

    PubMed

    Dobbins, T J; Ida, K; Suzuki, C; Yoshinuma, M; Kobayashi, T; Suzuki, Y; Yoshida, M

    2017-09-01

    A new Motional Stark Effect (MSE) analysis routine has been developed for improved spatial resolution in the core of the Large Helical Device (LHD). The routine was developed to reduce the dependency of the analysis on the Pfirsch-Schlüter (PS) current in the core. The technique used the change in the polarization angle as a function of flux in order to find the value of diota/dflux at each measurement location. By integrating inwards from the edge, the iota profile can be recovered from this method. This reduces the results' dependency on the PS current because the effect of the PS current on the MSE measurement is almost constant as a function of flux in the core; therefore, the uncertainty in the PS current has a minimal effect on the calculation of the iota profile. In addition, the VMEC database was remapped from flux into r/a space by interpolating in mode space in order to improve the database core resolution. These changes resulted in a much smoother iota profile, conforming more to the physics expectations of standard discharge scenarios in the core of the LHD.
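
    The integrate-inward step described above can be illustrated with a small sketch (not the LHD routine): given d(iota)/d(flux) estimated at discrete normalized-flux points and a known edge value, a cumulative trapezoidal sum recovers the iota profile. The derivative values and boundary value below are synthetic placeholders.

```python
# Illustrative only: recover iota(flux) by integrating an assumed derivative
# inward from the edge with trapezoidal steps.
import numpy as np

flux = np.linspace(1.0, 0.0, 41)              # normalized flux, edge (1) -> core (0)
diota_dflux = 0.8 + 0.4 * flux                # assumed measured derivative at each point
iota_edge = 1.0                               # boundary value taken from the edge

iota = np.empty_like(flux)
iota[0] = iota_edge
for k in range(1, len(flux)):
    # trapezoidal step; d(flux) is negative going inward
    dflux = flux[k] - flux[k - 1]
    iota[k] = iota[k - 1] + 0.5 * (diota_dflux[k] + diota_dflux[k - 1]) * dflux

print(f"iota at edge {iota[0]:.2f}, iota on axis {iota[-1]:.2f}")
```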

  7. This Is not Participatory Design - A Critical Analysis of Eight Living Laboratories.

    PubMed

    Bygholm, Ann; Kanstrup, Anne Marie

    2017-01-01

    The design of health technology for the elderly and for care personnel is a high priority because of a sharp increase in the number of elderly citizens in need of health care combined with a decrease in resources in the health care sector. The desire to maintain and improve the quality of care while reducing costs has resulted in a search for approaches that support co-operation between technology designers, elderly persons, and health care professionals in innovating future care technology. Living laboratories, in which areas of a care environment are transformed into a so-called platform for technology innovation, are popular. Expectations for living laboratories are high, but examinations of how such laboratories support the intended participatory innovation are few. This paper presents and examines eight living laboratories set up in Danish nursing homes for technology innovation. We present the notion of a living laboratory, explicate the aspirations and expectations of this approach, and discuss why these expectations are hard to meet both on a general level and in the investigated labs. We question the basic assumption that the different interests of the stakeholders involved can be reconciled. Our analysis focuses on the users in the living laboratories. We use guiding principles developed within Participatory Design to reveal the role and participation of the users - the health care professionals and the elderly - in the eight living laboratories. In general, these users played a minor role in the labs, where resolving technical problems turned out to be the main activity. We conclude that living laboratories do not nullify differing or conflicting interests and that a real-life setting by itself is no guarantee of user participation.

  8. Biomechanical Evaluation of a Tooth Restored with High Performance Polymer PEKK Post-Core System: A 3D Finite Element Analysis.

    PubMed

    Lee, Ki-Sun; Shin, Joo-Hee; Kim, Jong-Eun; Kim, Jee-Hwan; Lee, Won-Chang; Shin, Sang-Wan; Lee, Jeong-Yol

    2017-01-01

    The aim of this study was to evaluate the biomechanical behavior and long-term safety of high performance polymer PEKK as an intraradicular dental post-core material through comparative finite element analysis (FEA) with other conventional post-core materials. A 3D FEA model of a maxillary central incisor was constructed. A cyclic loading force of 50 N was applied at an angle of 45° to the longitudinal axis of the tooth at the palatal surface of the crown. For comparison with traditionally used post-core materials, three materials (gold, fiberglass, and PEKK) were simulated to determine their post-core properties. PEKK, with a lower elastic modulus than root dentin, showed comparably high failure resistance and a more favorable stress distribution than conventional post-core material. However, the PEKK post-core system showed a higher probability of debonding and crown failure under long-term cyclic loading than the metal or fiberglass post-core systems.

  9. Biomechanical Evaluation of a Tooth Restored with High Performance Polymer PEKK Post-Core System: A 3D Finite Element Analysis

    PubMed Central

    Shin, Joo-Hee; Kim, Jong-Eun; Kim, Jee-Hwan; Lee, Won-Chang; Shin, Sang-Wan

    2017-01-01

    The aim of this study was to evaluate the biomechanical behavior and long-term safety of high performance polymer PEKK as an intraradicular dental post-core material through comparative finite element analysis (FEA) with other conventional post-core materials. A 3D FEA model of a maxillary central incisor was constructed. A cyclic loading force of 50 N was applied at an angle of 45° to the longitudinal axis of the tooth at the palatal surface of the crown. For comparison with traditionally used post-core materials, three materials (gold, fiberglass, and PEKK) were simulated to determine their post-core properties. PEKK, with a lower elastic modulus than root dentin, showed comparably high failure resistance and a more favorable stress distribution than conventional post-core material. However, the PEKK post-core system showed a higher probability of debonding and crown failure under long-term cyclic loading than the metal or fiberglass post-core systems. PMID:28386547

  10. The Effects of Acute Stress on Core Executive Functions: A Meta-Analysis and Comparison with Cortisol

    PubMed Central

    Shields, Grant S.; Sazma, Matthew A.; Yonelinas, Andrew P.

    2016-01-01

    Core executive functions such as working memory, inhibition, and cognitive flexibility are integral to daily life. A growing body of research has suggested that acute stress may impair core executive functions. However, there are a number of inconsistencies in the literature, leading to uncertainty about how or even if acute stress influences core executive functions. We addressed this by conducting a meta-analysis of acute stress effects on working memory, inhibition, and cognitive flexibility. We found that stress impaired working memory and cognitive flexibility, whereas it had nuanced effects on inhibition. Many of these effects were moderated by other variables, such as sex. In addition, we compared effects of acute stress on core executive functions to effects of cortisol administration and found some striking differences. Our findings indicate that stress works through mechanisms aside from or in addition to cortisol to produce a state characterized by more reactive processing of salient stimuli but greater control over actions. We conclude by highlighting some important future directions for stress and executive function research. PMID:27371161
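
    As a hedged illustration of the pooling step in such a meta-analysis (not the authors' analysis), the sketch below combines invented per-study effect sizes with inverse-variance weights and a DerSimonian-Laird random-effects variance.

```python
# Random-effects pooling sketch with placeholder effect sizes and variances.
import numpy as np

g = np.array([-0.35, -0.20, -0.48, -0.10, -0.30])     # per-study effect sizes (hypothetical)
v = np.array([0.04, 0.06, 0.05, 0.08, 0.03])          # per-study sampling variances (hypothetical)

w = 1.0 / v                                           # fixed-effect weights
fixed = np.sum(w * g) / np.sum(w)
Q = np.sum(w * (g - fixed) ** 2)                      # heterogeneity statistic
df = len(g) - 1
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - df) / c)                         # DerSimonian-Laird between-study variance

w_re = 1.0 / (v + tau2)                               # random-effects weights
pooled = np.sum(w_re * g) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
print(f"pooled effect = {pooled:.2f} (95% CI {pooled - 1.96*se:.2f} to {pooled + 1.96*se:.2f})")
```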

  11. Analysis of the Gas Core Actinide Transmutation Reactor (GCATR)

    NASA Technical Reports Server (NTRS)

    Clement, J. D.; Rust, J. H.

    1977-01-01

    Power plant design studies were carried out for two applications of the plasma core reactor: (1) as a breeder reactor and (2) as a reactor able to transmute actinides effectively. In both applications the reactor also produced electrical power with high efficiency. A reactor subsystem was designed for each of the two applications. For the breeder reactor, neutronics calculations were carried out for a U-233 plasma core with a molten salt breeding blanket. A reactor was designed with a low critical mass (less than a few hundred kilograms of U-233) and a breeding ratio of 1.01. The plasma core actinide transmutation reactor was designed to transmute the nuclear waste from conventional LWRs. The spent fuel is reprocessed, during which 100% of the Np, Am, Cm, and higher actinides are separated from the other components. These actinides are then manufactured as oxides into zirconium-clad fuel rods and charged as fuel assemblies in the reflector region of the plasma core actinide transmutation reactor. In the equilibrium cycle, about 7% of the actinides are directly fissioned away, while about 31% are removed by reprocessing.

  12. Forensic Analysis of Canine DNA Samples in the Undergraduate Biochemistry Laboratory

    ERIC Educational Resources Information Center

    Carson, Tobin M.; Bradley, Sharonda Q.; Fekete, Brenda L.; Millard, Julie T.; LaRiviere, Frederick J.

    2009-01-01

    Recent advances in canine genomics have allowed the development of highly distinguishing methods of analysis for both nuclear and mitochondrial DNA. We describe a laboratory exercise suitable for an undergraduate biochemistry course in which the polymerase chain reaction is used to amplify hypervariable regions of DNA from dog hair and saliva…

  13. Key findings and remaining questions in the areas of core-concrete interaction and debris coolability

    DOE PAGES

    Farmer, M. T.; Gerardi, C.; Bremer, N.; ...

    2016-10-31

    The reactor accidents at Fukushima-Dai-ichi have rekindled interest in late phase severe accident behavior involving reactor pressure vessel breach and discharge of molten core melt into the containment. Two technical issues of interest in this area include core-concrete interaction and the extent to which the core debris may be quenched and rendered coolable by top flooding. The OECD-sponsored Melt Coolability and Concrete Interaction (MCCI) programs at Argonne National Laboratory included the conduct of large scale reactor material experiments and associated analysis with the objectives of resolving the ex-vessel debris coolability issue, and to address remaining uncertainties related to long-term two-dimensional molten core-concrete interactions under both wet and dry cavity conditions. These tests provided a broad database to support accident management planning, as well as the development and validation of models and codes that can be used to extrapolate the experiment results to plant conditions. This paper provides a high level overview of the key experiment results obtained during the program. Finally, a discussion is also provided that describes technical gaps that remain in this area, several of which have arisen based on the sequence of events and operator actions during Fukushima.

  14. Key findings and remaining questions in the areas of core-concrete interaction and debris coolability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farmer, M. T.; Gerardi, C.; Bremer, N.

    The reactor accidents at Fukushima-Dai-ichi have rekindled interest in late phase severe accident behavior involving reactor pressure vessel breach and discharge of molten core melt into the containment. Two technical issues of interest in this area include core-concrete interaction and the extent to which the core debris may be quenched and rendered coolable by top flooding. The OECD-sponsored Melt Coolability and Concrete Interaction (MCCI) programs at Argonne National Laboratory included the conduct of large scale reactor material experiments and associated analysis with the objectives of resolving the ex-vessel debris coolability issue, and to address remaining uncertainties related to long-term two-dimensional molten core-concrete interactions under both wet and dry cavity conditions. These tests provided a broad database to support accident management planning, as well as the development and validation of models and codes that can be used to extrapolate the experiment results to plant conditions. This paper provides a high level overview of the key experiment results obtained during the program. Finally, a discussion is also provided that describes technical gaps that remain in this area, several of which have arisen based on the sequence of events and operator actions during Fukushima.

  15. Destructive analysis capabilities for plutonium and uranium characterization at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tandon, Lav; Kuhn, Kevin J; Drake, Lawrence R

    Los Alamos National Laboratory's (LANL) Actinide Analytical Chemistry (AAC) group has been in existence since the Manhattan Project. It maintains a complete set of analytical capabilities for performing full characterization (elemental assay, isotopics, and metallic and non-metallic trace impurities) of uranium and plutonium samples in different forms. For the majority of customers there are strong quality assurance (QA) and quality control (QC) objectives, including the highest accuracy and precision with well-defined uncertainties associated with the analytical results. Los Alamos participates in various international and national programs, such as the Plutonium Metal Exchange Program, New Brunswick Laboratory's (NBL's) Safeguards Measurement Evaluation (SME) Program, and several other inter-laboratory round-robin exercises, to monitor and evaluate the data quality generated by AAC. These programs also provide independent verification of analytical measurement capabilities and allow any technical problems with analytical measurements to be identified and corrected. This presentation focuses on key analytical capabilities for destructive analysis in AAC and on comparative data between LANL and peer groups for Pu assay and isotopic analysis.

  16. High Temperature Reactor (HTR) Deep Burn Core and Fuel Analysis: Design Selection for the Prismatic Block Reactor With Results from FY-2011 Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael A. Pope

    2011-10-01

    The Deep Burn (DB) Project is a U.S. Department of Energy sponsored feasibility study of transuranic management using high-burnup fuel in the high temperature helium cooled reactor (HTR). The DB Project consists of seven tasks: project management, core and fuel analysis, spent fuel management, fuel cycle integration, TRU fuel modeling, TRU fuel qualification, and HTR fuel recycle. In Phase II of the Project, we conducted nuclear analysis of TRU destruction/utilization in the HTR prismatic block design (Task 2.1), deep burn fuel/TRISO microanalysis (Task 2.3), and synergy with fast reactors (Task 4.2). Task 2.1 covers the core physics design, thermo-hydraulic CFD analysis, and the thermofluid and safety analysis (low pressure conduction cooling, LPCC) of the HTR prismatic block design. Task 2.3 covers the analysis of the structural behavior of TRISO fuel containing TRU at very high burnup levels, i.e., exceeding 50% FIMA. Task 4.2 includes the self-cleaning HTR based on recycle of HTR-generated TRU in the same HTR. Chapter IV contains the design and analysis results of the 600MWth DB-HTR core physics, with the cycle length, the average discharge burnup, heavy metal and plutonium consumptions, radial and axial power distributions, and temperature reactivity coefficients. It also contains the analysis results of the 450MWth DB-HTR core physics and the analysis of the decay heat of a TRU-loaded DB-HTR core. The evaluation of the hot-spot fuel temperature of the fuel block in the DB-HTR (Deep-Burn High Temperature Reactor) core under full operating power conditions is described in Chapter V. The investigated designs are the 600MWth and 460MWth DB-HTRs. In Chapter VI, the thermo-fluid and safety of the 600MWth DB-HTRs has been analyzed to investigate the thermal-fluid design performance at steady state and the passive safety performance during an LPCC event. Chapter VII describes the analysis results of the TRISO fuel microanalysis of the 600MWth

  17. Inter-laboratory exercise on antibiotic drugs analysis in aqueous samples.

    PubMed

    Roig, B; Brogat, M; Mompelat, S; Leveque, J; Cadiere, A; Thomas, O

    2012-08-30

    An inter-laboratory exercise was organized under the PHARMAS EU project, by the Advanced School of Public Health (EHESP), in order to evaluate the performances of analytical methods for the measurement of antibiotics in waters (surface and tap). This is the first time such an exercise on antibiotics has been organized in Europe, using different kinds of analytical methods and devices. In this exercise thirteen laboratories from five countries (Canada, France, Italy, the Netherlands and Portugal) participated, and a total number of 78 samples were distributed. During the exercise, 2 testing samples (3 bottles of each) prepared from tap water and river water, respectively, spiked with antibiotics, were sent to participants and analyzed over a period of one month. A final number of 77 (98.7%) testing samples were considered. Depending on substances studied by each participant, 305 values in duplicate were collected, with the results for each sample being expressed as the target concentration. A statistical study was initiated using 611 results. The mean value, standard deviation, coefficient of variation, standard uncertainty of the mean, median, the minimum and maximum values of each series as well as the 95% confidence interval were obtained from each participant laboratory. In this exercise, 36 results (6% of accounted values) were outliers according to the distribution over the median (box plot). The outlier results were excluded. In order to establish the stability of testing samples in the course of the exercise, differences between variances obtained for every type of sample at different intervals were evaluated. The results showed no representative variations and it can be considered that all samples were stable during the exercise. The goals of this inter-laboratory study were to assess results variability when analysis is conducted by different laboratories, to evaluate the influence of different matrix samples, and to determine the rate at which
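
    A minimal sketch of the per-substance summary statistics and outlier screen described above, under assumed placeholder concentrations; the IQR (box-plot) rule used here stands in for the exercise's median-based screening and is an assumption, not the PHARMAS protocol.

```python
# Summary statistics for one substance across participating laboratories (placeholder data, ng/L).
import numpy as np

values = np.array([102., 98., 110., 95., 104., 99., 250., 101., 97., 106., 93., 108.])

q1, med, q3 = np.percentile(values, [25, 50, 75])
iqr = q3 - q1
outliers = (values < q1 - 1.5 * iqr) | (values > q3 + 1.5 * iqr)   # box-plot (IQR) rule
kept = values[~outliers]

mean = kept.mean()
sd = kept.std(ddof=1)
cv = 100 * sd / mean                                  # coefficient of variation, %
sem = sd / np.sqrt(len(kept))                         # standard uncertainty of the mean
ci = (mean - 1.96 * sem, mean + 1.96 * sem)           # 95% confidence interval

print(f"{outliers.sum()} outlier(s) excluded; mean {mean:.1f}, CV {cv:.1f}%, "
      f"95% CI {ci[0]:.1f}-{ci[1]:.1f} ng/L")
```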

  18. Measuring laboratory-based influenza surveillance capacity: development of the 'International Influenza Laboratory Capacity Review' Tool.

    PubMed

    Muir-Paulik, S A; Johnson, L E A; Kennedy, P; Aden, T; Villanueva, J; Reisdorf, E; Humes, R; Moen, A C

    2016-01-01

    The 2005 International Health Regulations (IHR 2005) emphasized the importance of laboratory capacity to detect emerging diseases including novel influenza viruses. To support IHR 2005 requirements and the need to enhance influenza laboratory surveillance capacity, the Association of Public Health Laboratories (APHL) and the Centers for Disease Control and Prevention (CDC) Influenza Division developed the International Influenza Laboratory Capacity Review (Tool). Data from 37 assessments were reviewed and analyzed to verify that the quantitative analysis results accurately depicted a laboratory's capacity and capabilities. Subject matter experts in influenza and laboratory practice used an iterative approach to develop the Tool incorporating feedback and lessons learnt through piloting and implementation. To systematically analyze assessment data, a quantitative framework for analysis was added to the Tool. The review indicated that changes in scores consistently reflected enhanced or decreased capacity. The review process also validated the utility of adding a quantitative analysis component to the assessments and the benefit of establishing a baseline from which to compare future assessments in a standardized way. Use of the Tool has provided APHL, CDC and each assessed laboratory with a standardized analysis of the laboratory's capacity. The information generated is used to improve laboratory systems for laboratory testing and enhance influenza surveillance globally. We describe the development of the Tool and lessons learnt. Copyright © 2015 The Authors. Published by Elsevier Ltd.. All rights reserved.

  19. Idaho National Laboratory Quarterly Performance Analysis - 2nd Quarter FY2014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lisbeth A. Mitchell

    2014-06-01

    This report is published quarterly by the Idaho National Laboratory (INL) Performance Assurance Organization. The Department of Energy Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of occurrence reports and other deficiency reports (including not reportable events) identified at INL from January 2014 through March 2014.

  20. Trace-element analyses of core samples from the 1967-1988 drillings of Kilauea Iki lava lake, Hawaii

    USGS Publications Warehouse

    Helz, Rosalind Tuthill

    2012-01-01

    This report presents previously unpublished analyses of trace elements in drill core samples from Kilauea Iki lava lake and from the 1959 eruption that fed the lava lake. The two types of data presented were obtained by instrumental neutron-activation analysis (INAA) and energy-dispersive X-ray fluorescence analysis (EDXRF). The analyses were performed in U.S. Geological Survey (USGS) laboratories from 1989 to 1994. This report contains 93 INAA analyses on 84 samples and 68 EDXRF analyses on 68 samples. The purpose of the study was to document trace-element variation during chemical differentiation, especially during the closed-system differentiation of Kilauea Iki lava lake.

  1. Microfluidic Gel Electrophoresis in the Undergraduate Laboratory Applied to Food Analysis

    ERIC Educational Resources Information Center

    Chao, Tzu-Chiao; Bhattacharya, Sanchari; Ros, Alexandra

    2012-01-01

    A microfluidics-based laboratory experiment for the analysis of DNA fragments in an analytical undergraduate course is presented. The experiment is set within the context of food species identification via amplified DNA fragments. The students are provided with berry samples from which they extract DNA and perform polymerase chain reaction (PCR)…

  2. Risk assessment and optimization (ALARA) analysis for the environmental remediation of Brookhaven National Laboratory's hazardous waste management facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dionne, B.J.; Morris, S.C. III; Baum, J.W.

    1998-01-01

    The Department of Energy's (DOE) Office of Environment, Safety, and Health (EH) sought examples of risk-based approaches to environmental restoration to include in their guidance for DOE nuclear facilities. Extensive measurements of radiological contamination in soil and ground water have been made at Brookhaven National Laboratory's Hazardous Waste Management Facility (HWMF) as part of a Comprehensive Environmental Response, Compensation and Liability Act (CERCLA) remediation process. This provided an ideal opportunity for a case study. This report provides a risk assessment and an "As Low as Reasonably Achievable" (ALARA) analysis for use at other DOE nuclear facilities as an example of a risk-based decision technique. This document contains the Appendices for the report.

  3. Subannual layer variability in Greenland firn cores

    NASA Astrophysics Data System (ADS)

    Kjær, Helle Astrid; Vallelonga, Paul; Vinther, Bo; Winstrup, Mai; Simonsen, Marius; Maffezzoli, Niccoló; Jensen, Camilla Marie

    2017-04-01

    Ice cores are used to infer information about the past, and modern techniques allow for high-resolution (sub-centimetre) continuous flow analysis (CFA) of the ice. Such analysis is often used to identify annual layers and thereby constrain the dating of ice cores, but it can also be extended to provide information on sub-annual deposition patterns. In this study we use available high-resolution data from multiple shallow cores around Greenland to investigate the seasonality and trends in the most commonly measured continuous components - sodium, insoluble dust, calcium, ammonium, and conductivity (or acidity) - from 1800 AD to today. We evaluate the similarities and differences between the records and discuss their causes, from differences in sources, transport, deposition, and post-depositional effects to differences in measurement set-up. Further, we add to the array of already published cores with measurements from the newly drilled ReCAP ice core from a coastal ice cap in eastern Greenland and from a shallow core drilled at the high-accumulation South Dome site in Greenland.

  4. Statistical Analysis Tools for Learning in Engineering Laboratories.

    ERIC Educational Resources Information Center

    Maher, Carolyn A.

    1990-01-01

    Described are engineering programs that have used automated data acquisition systems to implement data collection and analyze experiments. Applications include a biochemical engineering laboratory, heat transfer performance, engineering materials testing, mechanical system reliability, statistical control laboratory, thermo-fluid laboratory, and a…

  5. Development, Implementation, and Analysis of a National Survey of Faculty Goals for Undergraduate Chemistry Laboratory

    ERIC Educational Resources Information Center

    Bruck, Aaron D.; Towns, Marcy

    2013-01-01

    This work reports the development of a survey for laboratory goals in undergraduate chemistry, the analysis of reliable and valid data collected from a national survey of college chemistry faculty, and a synthesis of the findings. The study used a sequential exploratory mixed-methods design. Faculty goals for laboratory emerged across seven…

  6. Cores Of Recurrent Events (CORE) | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    CORE is a statistically supported computational method for finding recurrently targeted regions in massive collections of genomic intervals, such as those arising from DNA copy number analysis of single tumor cells or bulk tumor tissues.
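
    The sketch below is not the published CORE algorithm, only a hedged illustration of the underlying idea: a sweep over interval endpoints counts how many input intervals (e.g., per-cell copy-number events) cover each genomic position and reports segments hit by at least a chosen number of intervals. Coordinates are arbitrary placeholders on a single chromosome.

```python
# Sweep-line coverage count over genomic intervals (illustrative placeholder data).
intervals = [(100, 250), (120, 300), (130, 220), (500, 650), (510, 600)]
min_recurrence = 3

# +1 at each interval start, -1 just past each end; process events in position order.
events = sorted([(s, +1) for s, e in intervals] + [(e + 1, -1) for s, e in intervals])
depth, prev, recurrent = 0, None, []
for pos, delta in events:
    if prev is not None and depth >= min_recurrence and pos > prev:
        recurrent.append((prev, pos - 1, depth))       # segment covered by >= min_recurrence intervals
    depth += delta
    prev = pos

print("recurrently targeted segments (start, end, depth):", recurrent)
```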

  7. An analysis of high school students' perceptions and academic performance in laboratory experiences

    NASA Astrophysics Data System (ADS)

    Mirchin, Robert Douglas

    This research study investigates student laboratory (lab) learning based on students' perceptions of their experiences, gathered through questionnaire data, and on evidence of their science-laboratory performance from paper-and-pencil assessments scored against Maryland-mandated criteria, Montgomery County Public Schools (MCPS) criteria, and published laboratory questions. A 20-item questionnaire, consisting of 18 Likert-scale items and 2 open-ended items addressing what students liked most and least about lab, was administered to students before labs were observed. A pre-test and post-test assessing laboratory achievement were administered before and after the laboratory experiences. The three labs observed were soda distillation, stoichiometry, and separation of a mixture. Five significant correlations were found. For soda distillation, there were two positive correlations: student preference for analyzing data was positively correlated with achievement on the data-analysis dimension of the lab rubric, and student preference for using numbers and graphs to analyze data was positively correlated with achievement on the analysis dimension of the lab rubric. For the separation-of-a-mixture lab, the following correlations were significant: student preference for chemistry labs in which numbers and graphs were used to analyze data was positively correlated with writing a correctly worded hypothesis, and student reports that lab experiences help them learn science were positively correlated with achievement on the data dimension of the lab rubric. The only negative correlation related to the first result: students' preference for using computers was inversely correlated with their performance in analyzing data in their lab reports. Other findings included the following: students like the actual experimental work most and the write-up and analysis of a lab least. It is recommended that lab science instruction be inquiry-based, hands-on, and that students be

  8. Vibroacoustic Characterization of Corrugated-Core and Honeycomb-Core Sandwich Panels

    NASA Technical Reports Server (NTRS)

    Allen, Albert; Schiller, Noah

    2016-01-01

    The vibroacoustic characteristics of two candidate launch vehicle fairing structures, corrugated- core and honeycomb-core sandwich designs, were studied. The study of these structures has been motivated by recent risk reduction efforts focused on mitigating high noise levels within the payload bays of large launch vehicles during launch. The corrugated-core sandwich concept is of particular interest as a dual purpose structure due to its ability to harbor resonant noise control systems without appreciably adding mass or taking up additional volume. Specifically, modal information, wavelength dispersion, and damping were determined from a series of vibrometer measurements and subsequent analysis procedures carried out on two test panels. Numerical and analytical modeling techniques were also used to assess assumed material properties and to further illuminate underlying structural dynamic aspects. Results from the tests and analyses described herein may serve as a reference for additional vibroacoustic studies involving these or similar structures.

  9. Efficiency of static core turn-off in a system-on-a-chip with variation

    DOEpatents

    Cher, Chen-Yong; Coteus, Paul W; Gara, Alan; Kursun, Eren; Paulsen, David P; Schuelke, Brian A; Sheets, II, John E; Tian, Shurong

    2013-10-29

    A processor-implemented method for improving efficiency of a static core turn-off in a multi-core processor with variation, the method comprising: conducting via a simulation a turn-off analysis of the multi-core processor at the multi-core processor's design stage, wherein the turn-off analysis of the multi-core processor at the multi-core processor's design stage includes a first output corresponding to a first multi-core processor core to turn off; conducting a turn-off analysis of the multi-core processor at the multi-core processor's testing stage, wherein the turn-off analysis of the multi-core processor at the multi-core processor's testing stage includes a second output corresponding to a second multi-core processor core to turn off; comparing the first output and the second output to determine if the first output is referring to the same core to turn off as the second output; outputting a third output corresponding to the first multi-core processor core if the first output and the second output are both referring to the same core to turn off.
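
    A hedged sketch of the comparison step in the claim above, with hypothetical data structures rather than the patented implementation: each analysis stage nominates a core to switch off, and a core is selected for static turn-off only when the design-stage and testing-stage analyses agree.

```python
# Hypothetical illustration of design-stage vs. testing-stage agreement before static core turn-off.
from typing import Optional

def design_stage_analysis(leakage_by_core: dict[int, float]) -> int:
    # e.g., a simulated leakage/variation model at design time: nominate the worst core
    return max(leakage_by_core, key=leakage_by_core.get)

def testing_stage_analysis(measured_power_by_core: dict[int, float]) -> int:
    # e.g., per-core measurements on the manufactured part: nominate the worst core
    return max(measured_power_by_core, key=measured_power_by_core.get)

def select_core_to_turn_off(sim: dict[int, float], meas: dict[int, float]) -> Optional[int]:
    first = design_stage_analysis(sim)
    second = testing_stage_analysis(meas)
    return first if first == second else None          # "third output" only on agreement

simulated = {0: 1.2, 1: 2.7, 2: 1.1, 3: 1.4}            # placeholder leakage estimates (W)
measured = {0: 1.3, 1: 2.9, 2: 1.0, 3: 1.5}             # placeholder measured power (W)
print("core to turn off:", select_core_to_turn_off(simulated, measured))
```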

  10. A Sketch of the Taiwan Zebrafish Core Facility.

    PubMed

    You, May-Su; Jiang, Yun-Jin; Yuh, Chiou-Hwa; Wang, Chien-Ming; Tang, Chih-Hao; Chuang, Yung-Jen; Lin, Bo-Hung; Wu, Jen-Leih; Hwang, Sheng-Ping L

    2016-07-01

    In the past three decades, the number of zebrafish laboratories in Taiwan has increased significantly. The Taiwan Zebrafish Core Facility (TZCF), a government-funded core facility, was launched to serve this growing community. The Core Facility was built on two sites, one located at the National Health Research Institutes (NHRI; the Taiwan Zebrafish Core Facility at NHRI, or TZeNH) and the other at Academia Sinica (the Taiwan Zebrafish Core Facility at AS, or TZCAS). The total surface area of the TZCF is about 180 m(2), encompassing 2880 fish tanks. Each site has a separate quarantine room and centralized water-recirculating systems that monitor key water parameters. To prevent diseases, three main strategies have been implemented: (1) imported fish must be quarantined; (2) only bleached embryos are introduced into the main facilities; and (3) working practices have been implemented to minimize pathogen transfer between stocks and facilities. Currently, there is no health program in place; however, a fourth measure, regular testing for specific pathogens, is being planned. In March 2015, the TZCF at NHRI was accredited by AAALAC. It is our goal to ensure that we provide "disease-free" fish and embryos to the Taiwanese research community.

  11. [SWOT analysis of laboratory certification and accreditation on detection of parasitic diseases].

    PubMed

    Xiong, Yan-hong; Zheng, Bin

    2014-04-01

    This study comprehensively analyzes the strengths, weaknesses, opportunities and threats (SWOT) of laboratory certification and accreditation for the detection of parasitic diseases, and puts forward specific development strategies in order to provide a reference for further development.

  12. Thermo-Physics Technical Note No. 60: thermal analysis of SNAP 10A reactor core during atmospheric reentry and resulting core disintegration and fuel element separation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mouradian, E.M.

    1966-02-16

    A thermal analysis is carried out to determine the temperature distribution throughout a SNAP 10A reactor core, particularly in the vicinity of the grid plates, during atmospheric reentry. The transient temperature distribution of the grid plate indicates when sufficient melting occurs so that fuel elements are free to be released and continue their descent individually.

  13. High-sensitivity O-glycomic analysis of mice deficient in core 2 β1,6-N-acetylglucosaminyltransferases

    PubMed Central

    Ismail, Mohd Nazri; Stone, Erica L; Panico, Maria; Lee, Seung Ho; Luu, Ying; Ramirez, Kevin; Ho, Samuel B; Fukuda, Minoru; Marth, Jamey D; Haslam, Stuart M; Dell, Anne

    2011-01-01

    Core 2 β1,6-N-acetylglucosaminyltransferase (C2GnT), which exists in three isoforms, C2GnT1, C2GnT2 and C2GnT3, is one of the key enzymes in the O-glycan biosynthetic pathway. These isoenzymes produce core 2 O-glycans and have been correlated with the biosynthesis of core 4 O-glycans and I-branches. Previously, we have reported mice with single and multiple deficiencies of C2GnT isoenzyme(s) and have evaluated the biological and structural consequences of the loss of core 2 function. We now present more comprehensive O-glycomic analyses of neutral and sialylated glycans expressed in the colon, small intestine, stomach, kidney, thyroid/trachea and thymus of wild-type, C2GnT2 and C2GnT3 single knockouts and the C2GnT1–3 triple knockout mice. Very high-quality data have emerged from our mass spectrometry techniques with the capability of detecting O-glycans up to at least 3500 Da. We were able to unambiguously elucidate the types of O-glycan core, branching location and residue linkages, which allowed us to exhaustively characterize structural changes in the knockout tissues. The C2GnT2 knockout mice suffered a major loss of core 2 O-glycans as well as glycans with I-branches on core 1 antennae especially in the stomach and the colon. In contrast, core 2 O-glycans still dominated the O-glycomic profile of most tissues in the C2GnT3 knockout mice. Analysis of the C2GnT triple knockout mice revealed a complete loss of both core 2 O-glycans and branched core 1 antennae, confirming that the three known isoenzymes are entirely responsible for producing these structures. Unexpectedly, O-linked mannosyl glycans are upregulated in the triple deficient stomach. In addition, our studies have revealed an interesting terminal structure detected on O-glycans of the colon tissues that is similar to the RM2 antigen from glycolipids. PMID:20855471

  14. Beginning Plant Biotechnology Laboratories Using Fast Plants.

    ERIC Educational Resources Information Center

    Williams, Mike

    This set of 16 laboratory activities is designed to illustrate the life cycle of Brassicae plants from seeds in pots to pods in 40 days. At certain points along the production cycle of the central core of labs, there are related lateral labs to provide additional learning opportunities employing this family of plants, referred to as "fast…

  15. 78 FR 300 - IDEXX Laboratories, Inc.; Analysis of Proposed Consent Order To Aid Public Comment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-03

    ... FEDERAL TRADE COMMISSION [File No. 101 0023] IDEXX Laboratories, Inc.; Analysis of Proposed... or deceptive acts or practices or unfair methods of competition. The attached Analysis to Aid Public.... The following Analysis to Aid Public Comment describes the terms of the consent agreement, and the...

  16. Permeability analysis of Asbuton material used as core layers of water resistance in the body of dam

    NASA Astrophysics Data System (ADS)

    Rahim, H.; Tjaronge, M. W.; Thaha, A.; Djamaluddin, R.

    2017-11-01

    In order to increase the use of local materials and national products, the large Asbuton reserves of about 662.960 million tons in the Buton Islands have become an alternative source for the waterproof core layer in the body of a dam. The Asbuton material used in this research was Lawele Granular Asphalt (LGA). This experimental laboratory study comprised density (unit weight) and permeability testing on the Asbuton material, with permeability measured by the falling-head method. The data analyzed were the relation between compaction energy and density and the relation between density and permeability of the Asbuton material. The results show that increasing the number of blows applied to each layer of Asbuton material increases its density. The density of Asbuton material that satisfies the requirement for use as an impermeable core layer in the dam body is 1.53 grams/cm3. Increasing the density (unit weight) of the Asbuton material reduces its permeability.
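
    Because the study's permeability values come from falling-head tests, a minimal sketch of the standard falling-head relation k = (a*L)/(A*t) * ln(h1/h2) is given below; the input numbers are placeholders for illustration and are not taken from the paper.

        import math

        def falling_head_permeability(a_cm2, A_cm2, L_cm, t_s, h1_cm, h2_cm):
            """Coefficient of permeability k (cm/s) from a falling-head test:
            k = (a * L) / (A * t) * ln(h1 / h2), with a the standpipe area and A the sample area."""
            return (a_cm2 * L_cm) / (A_cm2 * t_s) * math.log(h1_cm / h2_cm)

        # Placeholder geometry and heads, not the paper's data.
        k = falling_head_permeability(a_cm2=1.0, A_cm2=78.5, L_cm=12.0,
                                      t_s=3600.0, h1_cm=100.0, h2_cm=60.0)
        print(f"k = {k:.2e} cm/s")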

  17. V-Sipal - a Virtual Laboratory for Satellite Image Processing and Analysis

    NASA Astrophysics Data System (ADS)

    Buddhiraju, K. M.; Eeti, L.; Tiwari, K. K.

    2011-09-01

    In this paper a virtual laboratory for Satellite Image Processing and Analysis (v-SIPAL), being developed at the Indian Institute of Technology Bombay, is described. v-SIPAL comprises a set of experiments that are normally carried out by students learning digital processing and analysis of satellite images using commercial software. Currently, the experiments available on the server include Image Viewer, Image Contrast Enhancement, Image Smoothing, Edge Enhancement, Principal Component Transform, Texture Analysis by the Co-occurrence Matrix method, Image Indices, Color Coordinate Transforms, Fourier Analysis, Mathematical Morphology, Unsupervised Image Classification, Supervised Image Classification and Accuracy Assessment. The virtual laboratory includes a theory module for each option of every experiment, a description of the procedure to perform each experiment, the menu to choose and perform the experiment, a module on interpretation of results when performed with a given image and pre-specified options, a bibliography, links to useful internet resources and user feedback. Users can upload their own images for performing the experiments and can also reuse outputs of one experiment in another experiment where applicable. Other experiments currently under development include georeferencing of images, data fusion, feature evaluation by divergence and J-M distance, image compression, wavelet image analysis and change detection. Additions to the theory module include self-assessment quizzes, audio-video clips on selected concepts, and a discussion of elements of visual image interpretation. v-SIPAL is at the stage of internal evaluation within IIT Bombay and will soon be open to selected educational institutions in India for evaluation.
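
    To make the flavour of these exercises concrete, the sketch below computes one of the simplest image indices, NDVI = (NIR - Red) / (NIR + Red), on a toy array; the band arrays are invented stand-ins for whatever imagery a user uploads, and NDVI is only assumed here to be among the indices offered.

        import numpy as np

        def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
            """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
            nir = nir.astype(float)
            red = red.astype(float)
            return (nir - red) / np.clip(nir + red, 1e-9, None)  # clip avoids division by zero

        # Toy 2x2 reflectance bands standing in for an uploaded satellite image.
        nir_band = np.array([[0.6, 0.5], [0.4, 0.7]])
        red_band = np.array([[0.2, 0.3], [0.3, 0.1]])
        print(ndvi(nir_band, red_band))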

  18. Synthesis and Plasmonic Understanding of Core/Satellite and Core Shell Nanostructures

    NASA Astrophysics Data System (ADS)

    Ruan, Qifeng

    Localized surface plasmon resonance, which stems from the collective oscillations of conduction-band electrons, endows Au nanocrystals with unique optical properties. Au nanocrystals possess extremely large scattering/absorption cross-sections and enhanced local electromagnetic field, both of which are synthetically tunable. Moreover, when Au nanocrystals are closely placed or hybridized with semiconductors, the coupling and interaction between the individual components bring about more fascinating phenomena and promising applications, including plasmon-enhanced spectroscopies, solar energy harvesting, and cancer therapy. The continuous development in the field of plasmonics calls for further advancements in the preparation of high-quality plasmonic nanocrystals, the facile construction of hybrid plasmonic nanostructures with desired functionalities, as well as deeper understanding and efficient utilization of the interaction between plasmonic nanocrystals and semiconductor components. In this thesis, I developed a seed-mediated growth method for producing size-controlled Au nanospheres with high monodispersity and assembled Au nanospheres of different sizes into core/satellite nanostructures for enhancing Raman signals. For investigating the interactions between Au nanocrystals and semiconductors, I first prepared (Au core) (TiO2 shell) nanostructures, and then studied their synthetically controlled plasmonic properties and light-harvesting applications. Au nanocrystals with spherical shapes are desirable in plasmon-coupled systems owing to their high geometrical symmetry, which facilitates the analysis of electrodynamic responses in a classical electromagnetic framework and the investigation of quantum tunneling and nonlocal effects. I prepared remarkably uniform Au nanospheres with diameters ranging from 20 nm to 220 nm using a simple seed-mediated growth method associated with mild oxidation. Core/satellite nanostructures were assembled out of differently sized

  19. Mars Science Laboratory Heatshield Flight Data Analysis

    NASA Technical Reports Server (NTRS)

    Mahzari, Milad; White, Todd

    2017-01-01

    NASA's Mars Science Laboratory (MSL), which landed the Curiosity rover on the surface of Mars on August 5th, 2012, was the largest and heaviest Mars entry vehicle, representing a significant advancement in planetary entry, descent and landing capability. Hypersonic flight performance data were collected using MSL's on-board sensors, called the Mars Entry, Descent and Landing Instrumentation (MEDLI). This talk will give an overview of MSL entry and a description of the MEDLI sensors. Observations from flight data will be examined, followed by a discussion of analysis efforts to reconstruct surface heating from the heatshield's in-depth temperature measurements. Finally, a brief overview of the MEDLI2 instrumentation, which will fly on NASA's Mars 2020 mission, will be presented with a discussion of how lessons learned from MEDLI data affected the design of the MEDLI2 instrumentation.

  20. Hydraulic manipulator design, analysis, and control at Oak Ridge National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kress, R.L.; Jansen, J.F.; Love, L.J.

    1996-09-01

    To meet the increased payload capacities demanded by present-day tasks, manipulator designers have turned to hydraulics as a means of actuation. Hydraulics have always been the actuator of choice when designing heavy-lift construction and mining equipment such as bulldozers, backhoes, and tunneling devices. In order to successfully design, build, and deploy a new hydraulic manipulator (or subsystem), sophisticated modeling, analysis, and control experiments are usually needed. To support the development and deployment of new hydraulic manipulators, Oak Ridge National Laboratory (ORNL) has outfitted a significant experimental laboratory and has developed the software capability for research into hydraulic manipulators, hydraulic actuators, hydraulic systems, modeling of hydraulic systems, and hydraulic controls. The hydraulics laboratory at ORNL has three different manipulators. First is a 6-Degree-of-Freedom (6-DoF), multi-planar, teleoperated, flexible controls test bed used for the development of waste tank clean-up manipulator controls, thermal studies, system characterization, and manipulator tracking. Finally, there is a human amplifier test bed used for the development of an entire new class of teleoperated systems. To complement the hardware in the hydraulics laboratory, ORNL has developed a hydraulics simulation capability including a custom package to model the hydraulic systems and manipulators for performance studies and control development. This paper outlines the history of hydraulic manipulator developments at ORNL, describes the hydraulics laboratory, discusses the use of the equipment within the laboratory, and presents some of the initial results from experiments and modeling associated with these hydraulic manipulators. Included are some of the results from the development of the human amplifier/de-amplifier concepts, the characterization of the thermal sensitivity of hydraulic systems, and end-point tracking accuracy studies. Experimental and

  1. Breast epithelium procurement from stereotactic core biopsy washings: flow cytometry-sorted cell count analysis.

    PubMed

    Stoler, Daniel L; Stewart, Carleton C; Stomper, Paul C

    2002-02-01

    Molecular studies of breast lesions have been constrained by difficulties in procuring adequate tissues for analyses. Standard procedures are restricted to larger, palpable masses or the use of paraffin-embedded materials, precluding facile procurement of fresh specimens of early lesions. We describe a study to determine the yield and characteristics of sorted cell populations retrieved in core needle biopsy specimen rinses from a spectrum of breast lesions. Cells from 114 consecutive stereotactic core biopsies of mammographic lesions released into saline washes were submitted for flow cytometric analysis. For each specimen, epithelial cells were separated from stromal and blood tissue based on the presence of cytokeratin 8 and 18 markers. Epithelial cell yields based on pathological diagnoses of the biopsy specimen, patient age, and mammographic appearance of the lesion were determined. Biopsies containing malignant lesions yielded significantly higher numbers of cells than were obtained from benign lesion biopsies. Significantly greater cell counts were observed from lesions from women age 50 or above compared with those of younger women. Mammographic density surrounding the biopsy site, the mammographic appearance of the lesion, and the number of cores taken at the time of biopsy appeared to have little effect on the yield of epithelial cells. We demonstrate the use of flow cytometric sorting of stereotactic core needle biopsy washes from lesions spanning the spectrum of breast pathology to obtain epithelial cells in sufficient numbers to meet the requirements of a variety of molecular and genetic analyses.

  2. Data analysis considerations for pesticides determined by National Water Quality Laboratory schedule 2437

    USGS Publications Warehouse

    Shoda, Megan E.; Nowell, Lisa H.; Stone, Wesley W.; Sandstrom, Mark W.; Bexfield, Laura M.

    2018-04-02

    In 2013, the U.S. Geological Survey National Water Quality Laboratory (NWQL) made a new method available for the analysis of pesticides in filtered water samples: laboratory schedule 2437. Schedule 2437 is an improvement on previous analytical methods because it determines the concentrations of 225 fungicides, herbicides, insecticides, and associated degradates in one method at similar or lower concentrations than previously available methods. Additionally, the pesticides included in schedule 2437 were strategically identified in a prioritization analysis that assessed likelihood of occurrence, prevalence of use, and potential toxicity. When the NWQL reports pesticide concentrations for analytes in schedule 2437, the laboratory also provides supplemental information useful to data users for assessing method performance and understanding data quality. That supplemental information is discussed in this report, along with an initial analysis of analytical recovery of pesticides in water-quality samples analyzed by schedule 2437 during 2013–2015. A total of 523 field matrix spike samples and their paired environmental samples and 277 laboratory reagent spike samples were analyzed for this report (1,323 samples total). These samples were collected in the field as part of the U.S. Geological Survey National Water-Quality Assessment groundwater and surface-water studies and as part of the NWQL quality-control program. This report reviews how pesticide samples are processed by the NWQL, addresses how to obtain all the data necessary to interpret pesticide concentrations, explains the circumstances that result in a reporting level change or the occurrence of a raised reporting level, and describes the calculation and assessment of recovery. This report also discusses reasons why a data user might choose to exclude data in an interpretive analysis and outlines the approach used to identify the potential for decreased data quality in the assessment of method recovery. The

  3. Efficient Design and Analysis of Lightweight Reinforced Core Sandwich and PRSEUS Structures

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Yarrington, Phillip W.; Lucking, Ryan C.; Collier, Craig S.; Ainsworth, James J.; Toubia, Elias A.

    2012-01-01

    Design, analysis, and sizing methods for two novel structural panel concepts have been developed and incorporated into the HyperSizer Structural Sizing Software. Reinforced Core Sandwich (RCS) panels consist of a foam core with reinforcing composite webs connecting composite facesheets. Boeing's Pultruded Rod Stitched Efficient Unitized Structure (PRSEUS) panels use a pultruded unidirectional composite rod to provide axial stiffness along with integrated transverse frames and stitching. Both of these structural concepts are oven-cured and have shown great promise for applications in lightweight structures, but have suffered from the lack of efficient sizing capabilities similar to those that exist for honeycomb sandwich, foam sandwich, hat-stiffened, and other, more traditional concepts. Now, with accurate design methods for RCS and PRSEUS panels available in HyperSizer, these concepts can be traded and used in designs as is done with the more traditional structural concepts. The methods developed to enable sizing of RCS and PRSEUS are outlined, as are results showing the validity and utility of the methods. Applications include several large NASA heavy-lift launch vehicle structures.

  4. Standardization of Laboratory Methods for the PERCH Study

    PubMed Central

    Karron, Ruth A.; Morpeth, Susan C.; Bhat, Niranjan; Levine, Orin S.; Baggett, Henry C.; Brooks, W. Abdullah; Feikin, Daniel R.; Hammitt, Laura L.; Howie, Stephen R. C.; Knoll, Maria Deloria; Kotloff, Karen L.; Madhi, Shabir A.; Scott, J. Anthony G.; Thea, Donald M.; Adrian, Peter V.; Ahmed, Dilruba; Alam, Muntasir; Anderson, Trevor P.; Antonio, Martin; Baillie, Vicky L.; Dione, Michel; Endtz, Hubert P.; Gitahi, Caroline; Karani, Angela; Kwenda, Geoffrey; Maiga, Abdoul Aziz; McClellan, Jessica; Mitchell, Joanne L.; Morailane, Palesa; Mugo, Daisy; Mwaba, John; Mwansa, James; Mwarumba, Salim; Nyongesa, Sammy; Panchalingam, Sandra; Rahman, Mustafizur; Sawatwong, Pongpun; Tamboura, Boubou; Toure, Aliou; Whistler, Toni; O’Brien, Katherine L.; Murdoch, David R.

    2017-01-01

    The Pneumonia Etiology Research for Child Health (PERCH) study was conducted across 7 diverse research sites and relied on standardized clinical and laboratory methods for the accurate and meaningful interpretation of pneumonia etiology data. Blood, respiratory specimens, and urine were collected from children aged 1–59 months hospitalized with severe or very severe pneumonia and community controls of the same age without severe pneumonia and were tested with an extensive array of laboratory diagnostic tests. A standardized testing algorithm and standard operating procedures were applied across all study sites. Site laboratories received uniform training, equipment, and reagents for core testing methods. Standardization was further assured by routine teleconferences, in-person meetings, site monitoring visits, and internal and external quality assurance testing. Targeted confirmatory testing and testing by specialized assays were done at a central reference laboratory. PMID:28575358

  5. Clinical features and ryanodine receptor type 1 gene mutation analysis in a Chinese family with central core disease.

    PubMed

    Chang, Xingzhi; Jin, Yiwen; Zhao, Haijuan; Huang, Qionghui; Wang, Jingmin; Yuan, Yun; Han, Ying; Qin, Jiong

    2013-03-01

    Central core disease is a rare inherited neuromuscular disorder caused by mutations in the ryanodine receptor type 1 gene. The clinical phenotype of the disease is highly variable. We report a Chinese pedigree with central core disease confirmed by gene sequencing. All 3 patients in the family presented with mild proximal limb weakness. The serum level of creatine kinase was normal, and electromyography suggested myogenic changes. The histologic analysis of muscle biopsy showed identical central core lesions in almost all of the muscle fibers in the index case. Exons 90-106 in the C-terminal domain of the ryanodine receptor type 1 gene were amplified using polymerase chain reaction. One heterozygous missense mutation, G14678A (Arg4893Gln), in exon 102 was identified in all 3 patients. This is the first report of a familial case of central core disease confirmed by molecular study in mainland China.

  6. Laboratory Identity: A Linguistic Landscape Analysis of Personalized Space within a Microbiology Laboratory

    ERIC Educational Resources Information Center

    Hanauer, David I.

    2010-01-01

    This study provides insights into what constitutes a laboratory identity and the ways in which it is spatially constructed. This article explores students' professional identities as microbiologists as manifest in their usage of representational space in a laboratory and as such extends understandings of science identity and spatial identity. The…

  7. Correlative and multivariate analysis of increased radon concentration in underground laboratory.

    PubMed

    Maletić, Dimitrije M; Udovičić, Vladimir I; Banjanac, Radomir M; Joković, Dejan R; Dragić, Aleksandar L; Veselinović, Nikola B; Filipović, Jelena

    2014-11-01

    The results of an analysis, using correlative and multivariate methods developed for data analysis in high-energy physics and implemented in the Toolkit for Multivariate Analysis (TMVA) software package, of the relation between variations in increased radon concentration and climate variables in a shallow underground laboratory are presented. Multivariate regression analysis identified a number of multivariate methods that give a good evaluation of increased radon concentrations based on climate variables. The use of multivariate regression methods will enable investigation of the relation of specific climate variables to increased radon concentrations, with the regression methods providing a 'mapped' underlying functional behaviour of radon concentration over a wide spectrum of climate variables.
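
    The abstract's workflow, regressing radon concentration on a set of climate variables and scoring how well each method reproduces it, can be sketched as follows. The original work uses the ROOT/TMVA toolkit; scikit-learn and the synthetic data below are stand-ins for illustration only.

        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n = 500
        # Four synthetic "climate" predictors, e.g. pressure, temperature, humidity, rainfall.
        climate = rng.normal(size=(n, 4))
        radon = 50 + 10 * climate[:, 0] - 5 * climate[:, 2] + rng.normal(scale=3.0, size=n)

        X_train, X_test, y_train, y_test = train_test_split(climate, radon, random_state=0)
        model = GradientBoostingRegressor().fit(X_train, y_train)
        print("R^2 on held-out data:", round(model.score(X_test, y_test), 3))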

  8. Intestinal Microbiota in Healthy Adults: Temporal Analysis Reveals Individual and Common Core and Relation to Intestinal Symptoms

    PubMed Central

    Nikkilä, Janne; Immonen, Outi; Kekkonen, Riina; Lahti, Leo; Palva, Airi; de Vos, Willem M.

    2011-01-01

    Background While our knowledge of the intestinal microbiota during disease is accumulating, basic information of the microbiota in healthy subjects is still scarce. The aim of this study was to characterize the intestinal microbiota of healthy adults and specifically address its temporal stability, core microbiota and relation with intestinal symptoms. We carried out a longitudinal study by following a set of 15 healthy Finnish subjects for seven weeks and regularly assessed their intestinal bacteria and archaea with the Human Intestinal Tract (HIT)Chip, a phylogenetic microarray, in conjunction with qPCR analyses. The health perception and occurrence of intestinal symptoms was recorded by questionnaire at each sampling point. Principal Findings A high overall temporal stability of the microbiota was observed. Five subjects showed transient microbiota destabilization, which correlated not only with the intake of antibiotics but also with overseas travelling and temporary illness, expanding the hitherto known factors affecting the intestinal microbiota. We identified significant correlations between the microbiota and common intestinal symptoms, including abdominal pain and bloating. The most striking finding was the inverse correlation between Bifidobacteria and abdominal pain: subjects who experienced pain had over five-fold less Bifidobacteria compared to those without pain. Finally, a novel computational approach was used to define the common core microbiota, highlighting the role of the analysis depth in finding the phylogenetic core and estimating its size. The in-depth analysis suggested that we share a substantial number of our intestinal phylotypes but as they represent highly variable proportions of the total community, many of them often remain undetected. Conclusions/Significance A global and high-resolution microbiota analysis was carried out to determine the temporal stability, the associations with intestinal symptoms, and the individual and common

  9. Laboratory Measurements for H3+ Deuteration Reactions

    NASA Astrophysics Data System (ADS)

    Bowen, Kyle; Hillenbrand, Pierre-Michel; Urbain, Xavier; Savin, Daniel Wolf

    2018-06-01

    Deuterated molecules are important chemical tracers of protostellar cores. At the ~10^6 cm^-3 particle densities and ~20 K temperatures typical for protostellar cores, most molecules freeze onto dust grains. A notable exception is H3+ and its isotopologues. These become important carriers of positive charge in the gas, can couple to any ambient magnetic field, and can thereby alter the cloud dynamics. Knowing the total abundance of H3+ and its isotopologues is important for studying the evolution of protostellar cores. However, H3+ and D3+ have no dipole moment. They lack a pure rotational spectrum and are not observable at protostellar core temperatures. Fortunately H2D+ and D2H+ have dipole moments and a pure rotational spectrum that can be excited in protostellar cores. Observations of these two molecules, combined with astrochemical models, provide information about the total abundance of H3+ and all its isotopologues. The inferred abundances, though, rely on accurate astrochemical data for the deuteration of H3+ and its isotopologues. Here we present laboratory measurements of the rate coefficients for three important deuterating reactions, namely D + H3+/H2D+/D2H+ → H + H2D+/D2H+/D3+. Astrochemical models currently rely on rate coefficients from classical (Langevin) or semi-classical methods for these reactions, as fully quantum-mechanical calculations are beyond current computational capabilities. Laboratory studies are the most tractable means of providing the needed data. For our studies we used our novel dual-source, merged fast-beams apparatus, which enables us to study reactions of neutral atoms and molecular ions. Co-propagating beams allow us to measure experimental rate coefficients as a function of collision energy. We extract cross section data from these results, which we then convolve with a Maxwell-Boltzmann distribution to generate thermal rate coefficients. Here we present our results for these three reactions and discuss some implications.
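
    The final step described here, convolving an energy-dependent cross section with a Maxwell-Boltzmann distribution to obtain a thermal rate coefficient, amounts to evaluating k(T) = sqrt(8/(pi*mu)) * (kB*T)^(-3/2) * integral of sigma(E) * E * exp(-E/(kB*T)) dE. The sketch below applies this quadrature to a constant placeholder cross section, not to the measured data; the reduced mass is a rough value for D + H3+.

        import numpy as np

        KB = 1.380649e-23     # Boltzmann constant, J/K
        AMU = 1.66053907e-27  # atomic mass unit, kg

        def thermal_rate(sigma_of_E, mu_kg, T):
            """Thermal rate coefficient from a cross section sigma(E) at temperature T."""
            E = np.linspace(1e-25, 200.0 * KB * T, 20000)                 # collision energies, J
            integrand = sigma_of_E(E) * E * np.exp(-E / (KB * T))
            integral = float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(E)))
            return np.sqrt(8.0 / (np.pi * mu_kg)) * (KB * T) ** -1.5 * integral

        sigma = lambda E: 1e-18 * np.ones_like(E)          # placeholder cross section, m^2
        mu = (2.014 * 3.023) / (2.014 + 3.023) * AMU       # approximate D + H3+ reduced mass
        print(f"k(20 K) ~ {thermal_rate(sigma, mu, 20.0):.2e} m^3 s^-1")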

  10. Examination of core samples from the Mount Elbert Gas Hydrate Stratigraphic Test Well, Alaska North Slope: Effects of retrieval and preservation

    USGS Publications Warehouse

    Kneafsey, T.J.; Lu, H.; Winters, W.; Boswell, R.; Hunter, R.; Collett, T.S.

    2011-01-01

    Collecting and preserving undamaged core samples containing gas hydrates from depth is difficult because of the pressure and temperature changes encountered upon retrieval. Hydrate-bearing core samples were collected at the BPXA-DOE-USGS Mount Elbert Gas Hydrate Stratigraphic Test Well in February 2007. Coring was performed while using a custom oil-based drilling mud, and the cores were retrieved by a wireline. The samples were characterized and subsampled at the surface under ambient winter arctic conditions. Samples thought to be hydrate bearing were preserved either by immersion in liquid nitrogen (LN), or by storage under methane pressure at ambient arctic conditions and later depressurized and immersed in LN. Eleven core samples from hydrate-bearing zones were scanned using x-ray computed tomography to examine core structure and homogeneity. Features observed include radial fractures, spalling-type fractures, and reduced density near the periphery. These features were induced during sample collection, handling, and preservation. Isotopic analysis of the methane from hydrate in an initially LN-preserved core and a pressure-preserved core indicates that secondary hydrate formation occurred throughout the pressurized core, whereas none occurred in the LN-preserved core; however, no hydrate was found near the periphery of the LN-preserved core. To replicate some aspects of the preservation methods, natural and laboratory-made saturated porous media samples were frozen in a variety of ways, with radial fractures observed in some LN-frozen sands and needle-like ice crystals forming in slowly frozen clay-rich sediments. Suggestions for hydrate-bearing core preservation are presented.

  11. PNNL Researchers Collect Permafrost Cores in Alaska

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2016-11-23

    Permafrost is ground that is frozen for two or more years. In the Arctic, discontinuous regions of this saturated admixture of soil and rock store a large fraction of the Earth’s carbon – about 1672 petagrams (1672 trillion kilograms). As temperatures increase in the Northern Hemisphere, a lot of that carbon may be released to the atmosphere, making permafrost an important factor to represent accurately in global climate models. At Pacific Northwest National Laboratory, a group led by James C. Stegen periodically extracts permafrost core samples from a site near Fairbanks, Alaska. Back at the lab in southeastern Washington State, they study the cores for levels of microbial activity, carbon fluxes, hydrologic patterns, and other factors that reveal the dynamics of this consequential layer of soil and rock.

  12. Photovoltaic Calibrations at the National Renewable Energy Laboratory and Uncertainty Analysis Following the ISO 17025 Guidelines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emery, Keith

    The measurement of photovoltaic (PV) performance with respect to reference conditions requires measuring current versus voltage for a given tabular reference spectrum, junction temperature, and total irradiance. This report presents the procedures implemented by the PV Cell and Module Performance Characterization Group at the National Renewable Energy Laboratory (NREL) to achieve the lowest practical uncertainty. A rigorous uncertainty analysis of these procedures is presented, which follows the International Organization for Standardization (ISO) Guide to the Expression of Uncertainty in Measurement. This uncertainty analysis is required for the team’s laboratory accreditation under ISO standard 17025, “General Requirements for the Competence of Testing and Calibration Laboratories.” The report also discusses additional areas where the uncertainty can be reduced.
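
    At the heart of a GUM-style budget is the combination of uncorrelated standard uncertainty components in quadrature, followed by an expanded uncertainty with a coverage factor (commonly k = 2 for roughly 95 % coverage). The component names and percentages below are invented placeholders, not NREL's actual budget.

        import math

        def combined_standard_uncertainty(components):
            """Root-sum-of-squares combination of uncorrelated standard uncertainties."""
            return math.sqrt(sum(u ** 2 for u in components))

        # Illustrative relative standard uncertainties (%) for a PV calibration.
        budget = {"reference cell calibration": 0.50, "spectral mismatch": 0.30,
                  "device temperature": 0.20, "irradiance nonuniformity": 0.25}
        u_c = combined_standard_uncertainty(budget.values())
        U = 2.0 * u_c   # expanded uncertainty, coverage factor k = 2
        print(f"u_c = {u_c:.2f} %, U(k=2) = {U:.2f} %")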

  13. Integration of Pharmacy Practice and Pharmaceutical Analysis: Quality Assessment of Laboratory Performance.

    ERIC Educational Resources Information Center

    McGill, Julian E.; Holly, Deborah R.

    1996-01-01

    Laboratory portions of courses in pharmacy practice and pharmaceutical analysis at the Medical University of South Carolina are integrated and coordinated to provide feedback on student performance in compounding medications. Students analyze the products they prepare, with early exposure to compendia requirements and other references. Student…

  14. Core body temperature in obesity.

    PubMed

    Heikens, Marc J; Gorbach, Alexander M; Eden, Henry S; Savastano, David M; Chen, Kong Y; Skarulis, Monica C; Yanovski, Jack A

    2011-05-01

    A lower core body temperature set point has been suggested to be a factor that could potentially predispose humans to develop obesity. We tested the hypothesis that obese individuals have lower core temperatures than those in normal-weight individuals. In study 1, nonobese [body mass index (BMI; in kg/m^2) <30] and obese (BMI ≥30) adults swallowed wireless core temperature-sensing capsules, and we measured core temperatures continuously for 24 h. In study 2, normal-weight (BMI of 18-25) and obese subjects swallowed temperature-sensing capsules to measure core temperatures continuously for ≥48 h and kept activity logs. We constructed daily, 24-h core temperature profiles for analysis. Mean (±SE) daily core body temperature did not differ significantly between the 35 nonobese and 46 obese subjects (36.92 ± 0.03°C compared with 36.89 ± 0.03°C; P = 0.44). Core temperature 24-h profiles did not differ significantly between 11 normal-weight and 19 obese subjects (P = 0.274). Women had a mean core body temperature ≈0.23°C greater than that of men (36.99 ± 0.03°C compared with 36.76 ± 0.03°C; P < 0.0001). Obesity is not generally associated with a reduced core body temperature. It may be necessary to study individuals with function-altering mutations in core temperature-regulating genes to determine whether differences in the core body temperature set point affect the regulation of human body weight. These trials were registered at clinicaltrials.gov as NCT00428987 and NCT00266500.

  15. Dowel-nut connection in Douglas-fir peeler cores

    Treesearch

    Ronald W. Wolfe; John R. King; Agron Gjinolli

    As part of an effort to encourage more efficient use of small-diameter timber, the Forest Products Laboratory cooperated with Geiger Engineers in a study of the structural properties of Douglas-fir peeler cores and the efficacy of a "dowel-nut" connection detail for application in the design of a space frame roof system. A 44.5-mm- (1.75-in.-) diameter dowel-nut...

  16. The student perspective of high school laboratory experiences

    NASA Astrophysics Data System (ADS)

    Lambert, R. Mitch

    High school science laboratory experiences are an accepted teaching practice across the nation despite a lack of research evidence to support them. The purpose of this study was to examine the perspective of students---stakeholders often ignored---on these experiences. Insight into the students' perspective was explored progressively using a grounded theory methodology. Field observations of science classrooms led to an open-ended survey of high school science students, garnering 665 responses. Twelve student interviews then focused on the data and questions evolving from the survey. The student perspective on laboratory experiences revealed varied information based on individual experience. Concurrent analysis of the data revealed that although most students like (348/665) or sometimes like (270/665) these experiences, some consistent factors yielded negative experiences and prompted suggestions for improvement. The category of responses that emerged as the core idea focused on student understanding of the experience. Students desire to understand the why do, the how to, and the what it means of laboratory experiences. Lacking any one of these, the experience loses educational value for them. This single recurring theme crossed the boundaries of age, level in school, gender, and even the student view of lab experiences as positive or negative. This study suggests reflection on the current laboratory activities in which science teachers engage their students. Is the activity appropriate (as opposed to being merely a favorite), does it encourage learning, does it fit, does it operate at the appropriate level of inquiry, and finally what can science teachers do to integrate these activities into the classroom curriculum more effectively? Simply stated, what can teachers do so that students understand what to do, what's the point, and how that point fits into what they are learning outside the laboratory?

  17. ETD QA CORE TEAM: AN ELOQUENT SOLUTION TO A COMPLEX PROBLEM

    EPA Science Inventory

    ETD QA CORE TEAM: AN ELOQUENT SOLUTION TO A COMPLEX PROBLEM. Thomas J. Hughes, QA and Records Manager, Experimental Toxicology Division (ETD), National Health and Environmental Effects Research Laboratory (NHEERL), ORD, U.S. EPA, RTP, NC 27709

    ETD is the largest health divis...

  18. Modeling analysis of pulsed magnetization process of magnetic core based on inverse Jiles-Atherton model

    NASA Astrophysics Data System (ADS)

    Liu, Yi; Zhang, He; Liu, Siwei; Lin, Fuchang

    2018-05-01

    The J-A (Jiles-Atherton) model is widely used to describe the magnetization characteristics of magnetic cores in a low-frequency alternating field. However, this model is deficient in the quantitative analysis of the eddy current loss and residual loss in a high-frequency magnetic field. Based on the decomposition of magnetization intensity, an inverse J-A model is established which uses magnetic flux density B as an input variable. Static and dynamic core losses under high frequency excitation are separated based on the inverse J-A model. Optimized parameters of the inverse J-A model are obtained based on particle swarm optimization. The platform for the pulsed magnetization characteristic test is designed and constructed. The hysteresis curves of ferrite and Fe-based nanocrystalline cores at high magnetization rates are measured. The simulated and measured hysteresis curves are presented and compared. It is found that the inverse J-A model can be used to describe the magnetization characteristics at high magnetization rates and to separate the static loss and dynamic loss accurately.
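
    For orientation, a minimal sketch of the classical (forward) Jiles-Atherton model is given below, integrated with an explicit Euler step; the inverse formulation discussed in the paper instead takes the flux density B as the input variable. The parameter values are illustrative textbook-style numbers, not the fitted values reported for the ferrite or Fe-based nanocrystalline cores.

        import math
        import numpy as np

        MU0 = 4e-7 * math.pi
        Ms, a, alpha, k, c = 1.6e6, 1100.0, 1.6e-3, 400.0, 0.2   # illustrative parameters

        def anhysteretic(He):
            """Langevin anhysteretic magnetization M_an = Ms*(coth(He/a) - a/He)."""
            x = He / a
            if abs(x) < 1e-4:
                return Ms * x / 3.0                      # small-field series limit
            return Ms * (1.0 / math.tanh(x) - 1.0 / x)

        H = 2000.0 * np.sin(np.linspace(0.0, 4.0 * math.pi, 10000))   # applied field, A/m
        M_irr = M = 0.0
        B = []
        for i in range(1, len(H)):
            dH = H[i] - H[i - 1]
            delta = 1.0 if dH >= 0.0 else -1.0
            Man = anhysteretic(H[i] + alpha * M)
            dMirr_dH = (Man - M_irr) / (k * delta - alpha * (Man - M_irr))
            if (Man - M_irr) * delta < 0.0:              # common fix for unphysical reversal behaviour
                dMirr_dH = 0.0
            M_irr += dMirr_dH * dH
            M = c * Man + (1.0 - c) * M_irr
            B.append(MU0 * (H[i] + M))                   # B = mu0 * (H + M)

        print(f"peak |B| in the simulated loop: {max(abs(b) for b in B):.2f} T")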

  19. The French initiative for scientific cores virtual curating : a user-oriented integrated approach

    NASA Astrophysics Data System (ADS)

    Pignol, Cécile; Godinho, Elodie; Galabertier, Bruno; Caillo, Arnaud; Bernardet, Karim; Augustin, Laurent; Crouzet, Christian; Billy, Isabelle; Teste, Gregory; Moreno, Eva; Tosello, Vanessa; Crosta, Xavier; Chappellaz, Jérome; Calzas, Michel; Rousseau, Denis-Didier; Arnaud, Fabien

    2016-04-01

    Managing scientific data is probably one of the most challenging issues in modern science. The question is even more sensitive when it comes to preserving and managing high-value, fragile geological samples: cores. Large international scientific programs, such as IODP or ICDP, are leading an intense effort to solve this problem and propose detailed, high-standard work- and dataflows for core handling and curating. However, most results derive from rather small-scale research programs in which data and sample management is generally handled only locally, when it is handled at all. The national excellence equipment program (Equipex) CLIMCOR aims at developing French facilities for coring and drilling investigations. It concerns ice, marine and continental samples alike. As part of this initiative, we initiated a reflection on core curating and the management of associated coring data. The aim of the project is to conserve all metadata from fieldwork in an integrated cyber-environment, which will evolve toward laboratory-acquired data storage in the near future. To that end, our approach was developed in close relationship with field operators as well as laboratory core curators, in order to propose user-oriented solutions. The national core curating initiative currently offers a single web portal in which all scientific teams can store their field data. For legacy samples, this requires the establishment of dedicated core lists with associated metadata. For forthcoming samples, we propose a mobile application, running under Android, to capture technical and scientific metadata in the field. This application is linked with a unique library of coring tools and is adapted to most coring devices (gravity, drilling, percussion, etc.), including multi-section and multi-hole coring operations. The field data can be uploaded automatically to the national portal, but also referenced through international standards or persistent identifiers (IGSN, ORCID and INSPIRE

  20. Us army corps of engineers - Engineering research and development center - Petrographic analysis of section 3 personnel tunnel concrete

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carter, J. M.

    The Concrete and Materials Branch (CMB) of the Geotechnical and Structures Laboratory was requested to perform an analysis of concrete cores collected from the north and south walls of the H-Canyon Section 3 Personnel Tunnel, Savannah River Site, Aiken, South Carolina, to determine the cause of the lower-than-expected compressive strength. This study examined five cores provided to the ERDC by the Department of Energy. The cores were logged in as CMB No. 170051-1 to 170051-5 and subjected to petrographic examination, air void analysis, chemical sprays, scanning electron microscopy, and x-ray diffraction.

  1. [Errors in laboratory daily practice].

    PubMed

    Larrose, C; Le Carrer, D

    2007-01-01

    Legislation set by the GBEA (Guide de bonne exécution des analyses) requires that, before performing an analysis, laboratory directors check both the nature of the samples and the patient's identity. The processing of requisition forms, which identifies key errors, was established in 2000 and 2002 by the specialized biochemistry laboratory, with the contribution of the reception centre for biological samples. The laboratories follow strict acceptability criteria at reception as a starting point, and then check requisition forms and biological samples. All errors are logged into the laboratory database, and analysis reports are sent to the care unit specifying the problems and their consequences for the analysis. The data are then assessed by the laboratory directors to produce monthly or annual statistical reports. These indicate the number of errors, indexed to patient files to reveal the specific problem areas, thereby allowing the laboratory directors to train the nurses and enable corrective action.

  2. Chemical Convection in the Lunar Core from Melting Experiments on the Iron-Sulfur System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, J.; Liu, J.; Chen, B.

    2012-03-26

    By reanalyzing Apollo lunar seismograms using array-processing methods, a recent study suggests that the Moon has a solid inner core and a fluid outer core, much like the Earth. The volume fraction of the lunar inner core is 38%, compared with 4% for the Earth. The pressure at the Moon's core-mantle boundary is 4.8 GPa, and that at the inner core boundary (ICB) is 5.2 GPa. The partially molten state of the lunar core provides constraints on the thermal and chemical states of the Moon: the temperature at the ICB corresponds to the liquidus of the outer core composition, and the mass fraction of the solid core allows us to infer the bulk composition of the core from an estimated thermal profile. Moreover, knowledge of the extent of core solidification can be used to evaluate the role of chemical convection in the origin of the early lunar core dynamo. Sulfur is considered an antifreeze component in the lunar core. Here we investigate the melting behavior of the Fe-S system at the pressure conditions of the lunar core, using the multi-anvil apparatus and synchrotron and laboratory-based analytical methods. Our goal is to understand compositionally driven convection in the lunar core and assess its role in generating an internal magnetic field in the early history of the Moon.
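
    The inference sketched above is, at bottom, a mass balance (lever rule): with the inner core taken as nearly pure Fe and the outer core liquid pinned to the Fe-S liquidus at the ICB temperature, the bulk sulfur content follows from the solid mass fraction. The sketch below only illustrates that arithmetic; all numbers are placeholders, not results of the study.

        def bulk_sulfur_wt_pct(f_solid, s_solid_wt_pct, s_liquid_wt_pct):
            """Lever rule: bulk S = f_solid * S_solid + (1 - f_solid) * S_liquid."""
            return f_solid * s_solid_wt_pct + (1.0 - f_solid) * s_liquid_wt_pct

        # Placeholder values: a 30 wt% solid fraction, S-free solid, 8 wt% S liquid.
        print(bulk_sulfur_wt_pct(f_solid=0.30, s_solid_wt_pct=0.0,
                                 s_liquid_wt_pct=8.0))   # -> 5.6 wt% S in the bulk core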

  3. A natural laboratory for 40Ar/39Ar geochronology: ICDP cores from Lake Van, Turkey

    NASA Astrophysics Data System (ADS)

    Engelhardt, Jonathan; Sudo, Masafumi; Oberhänsli, Roland

    2015-04-01

    Pore water samples from ICDP Paleovan cores indicate a limited pore water exchange within the Quaternary lake sediments. The cores' volcaniclastic sections bear unaltered K-rich ternary feldspar and fresh to altered glass shards of predominantly rhyolitic composition. Whereas applying the 40Ar/39Ar method to the feldspars yields ages timing a late-stage crystallization, the glass shards have the potential to date the eruption itself. Volcanic glass is prone to modifications such as hydrous alteration (palagonitization) and devitrification (Cerling et al., 1985). These modifications affect the glass chemistry and challenge the application of the 40Ar/39Ar method. Gaining precise radiometric ages from two phases has the potential to strengthen a climate-stratigraphic age model (Stockhecke et al., 2014) and to significantly increase the temporal resolution on the deposition of the lake sediments. Conversely, the core's previous age model can be used to question the reliability of 40Ar/39Ar eruption ages derived from ternary feldspars and glass shards. Multi- and single-grain total-fusion analyses of alkali feldspars from six volcaniclastic deposits yielded Pleistocene ages that are in good agreement with the predicted age model. Feldspar phenocrysts from three ashes in the core's youngest section yielded consistent isochron ages that are significantly older than the model's prediction. Several distinct stratigraphic and paleomagnetic time markers at similar stratigraphic positions contradict the older radiometric dates (Stockhecke et al., 2014). Partial resorption features of inherited feldspar domains and the involvement of excess 40Ar indicate incomplete degassing of older domains. To evaluate the magmatic history of the different domains, EMPA mappings of trace elements that could be interpreted as Ar diffusion couples are currently being conducted. Geochronology on Paleovan cores offers unique opportunities to monitor the effect of alteration on the Ar systematics of volcanic glass

  4. Lunar Polar Coring Lander

    NASA Technical Reports Server (NTRS)

    Angell, David; Bealmear, David; Benarroche, Patrice; Henry, Alan; Hudson, Raymond; Rivellini, Tommaso; Tolmachoff, Alex

    1990-01-01

    Plans to build a lunar base are presently being studied with a number of considerations. One of the most important considerations is qualifying the presence of water on the Moon. The existence of water on the Moon implies that future lunar settlements may be able to use this resource to produce things such as drinking water and rocket fuel. Due to the very high cost of transporting these materials to the Moon, in situ production could save billions of dollars in operating costs of the lunar base. Scientists have suggested that the polar regions of the Moon may contain some amount of water ice in the regolith. Six possible mission scenarios are suggested which would allow lunar polar soil samples to be collected for analysis. The options presented are: a remote sensing satellite, two unmanned robotic lunar coring missions (one a sample return and one a data return only), two combined manned and robotic polar coring missions, and one fully manned core retrieval mission. One of the combined manned and robotic missions has been singled out for detailed analysis. This mission proposes sending at least three unmanned robotic landers to the lunar pole to take core samples as deep as 15 meters. Upon successful completion of the coring operations, a manned mission would be sent to retrieve the samples and perform extensive experiments in the polar region. It is recommended that, as a first step in returning to the Moon, the issue of lunar polar water be investigated. The potential benefits of lunar water more than warrant sending either astronauts, robots, or both to the Moon before any permanent facility is constructed.

  5. Analysis of Dextromethorphan in Cough Drops and Syrups: A Medicinal Chemistry Laboratory

    ERIC Educational Resources Information Center

    Hamilton, Todd M.; Wiseman, Frank L., Jr.

    2009-01-01

    Fluorescence spectroscopy is used to determine the quantity of dextromethorphan hydrobromide (DM) in over-the-counter (OTC) cough drops and syrups. This experiment is appropriate for an undergraduate medicinal chemistry laboratory course when studying OTC medicines and active ingredients. Students prepare the cough drops and syrups for analysis,…

  6. Environmental Technology (Laboratory Analysis and Environmental Sampling) Curriculum Development Project. Final Report.

    ERIC Educational Resources Information Center

    Hinojosa, Oscar V.; Guillen, Alfonso

    A project assessed the need and developed a curriculum for environmental technology (laboratory analysis and environmental sampling) in the emerging high technology centered around environmental safety and health in Texas. Initial data were collected through interviews by telephone and in person and through onsite visits. Additional data was…

  7. A Laboratory Exercise Illustrating the Sensitivity and Specificity of Western Blot Analysis

    ERIC Educational Resources Information Center

    Chang, Ming-Mei; Lovett, Janice

    2011-01-01

    Western blot analysis, commonly known as "Western blotting," is a standard tool in every laboratory where proteins are analyzed. It involves the separation of polypeptides in polyacrylamide gels followed by the electrophoretic transfer of the separated polypeptides onto a nitrocellulose or polyvinylidene fluoride membrane. A replica of the…

  8. Application Performance Analysis and Efficient Execution on Systems with multi-core CPUs, GPUs and MICs: A Case Study with Microscopy Image Analysis

    PubMed Central

    Teodoro, George; Kurc, Tahsin; Andrade, Guilherme; Kong, Jun; Ferreira, Renato; Saltz, Joel

    2015-01-01

    We carry out a comparative performance study of multi-core CPUs, GPUs and Intel Xeon Phi (Many Integrated Core-MIC) with a microscopy image analysis application. We experimentally evaluate the performance of computing devices on core operations of the application. We correlate the observed performance with the characteristics of computing devices and data access patterns, computation complexities, and parallelization forms of the operations. The results show a significant variability in the performance of operations with respect to the device used. The performances of operations with regular data access are comparable or sometimes better on a MIC than that on a GPU. GPUs are more efficient than MICs for operations that access data irregularly, because of the lower bandwidth of the MIC for random data accesses. We propose new performance-aware scheduling strategies that consider variabilities in operation speedups. Our scheduling strategies significantly improve application performance compared to classic strategies in hybrid configurations. PMID:28239253
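
    A toy version of the performance-aware scheduling idea: each operation goes to whichever device (CPU, GPU or MIC) finishes it soonest given its estimated speedup and the device's current load. The operation names and the speedup table are invented for illustration and are not the paper's measured values.

        def schedule(operations, speedups):
            """operations: list of (name, cpu_cost); speedups: {device: {name: speedup vs CPU}}."""
            load = {device: 0.0 for device in speedups}
            plan = {}
            # Greedy list scheduling: place the most expensive operations first.
            for name, cost in sorted(operations, key=lambda op: -op[1]):
                best = min(load, key=lambda d: load[d] + cost / speedups[d].get(name, 1.0))
                load[best] += cost / speedups[best].get(name, 1.0)
                plan[name] = best
            return plan, load

        ops = [("morphological_reconstruction", 10.0), ("watershed", 8.0),
               ("distance_transform", 6.0), ("color_deconvolution", 4.0)]
        table = {"cpu": {}, "gpu": {"morphological_reconstruction": 2.0, "watershed": 6.0},
                 "mic": {"distance_transform": 1.5, "color_deconvolution": 3.0}}
        plan, load = schedule(ops, table)
        print(plan)
        print({device: round(t, 2) for device, t in load.items()})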

  9. Thermal Hydraulics Design and Analysis Methodology for a Solid-Core Nuclear Thermal Rocket Engine Thrust Chamber

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Canabal, Francisco; Chen, Yen-Sen; Cheng, Gary; Ito, Yasushi

    2013-01-01

    Nuclear thermal propulsion is a leading candidate for in-space propulsion for human Mars missions. This chapter describes a thermal hydraulics design and analysis methodology developed at the NASA Marshall Space Flight Center in support of the nuclear thermal propulsion development effort. The objective of this campaign is to bridge the design methods of the Rover/NERVA era with a modern computational fluid dynamics and heat transfer methodology, to predict the thermal, fluid, and hydrogen environments of a hypothetical solid-core nuclear thermal engine, the Small Engine, designed in the 1960s. The computational methodology is based on an unstructured-grid, pressure-based, all-speeds, chemically reacting, computational fluid dynamics and heat transfer platform, while formulations of flow and heat transfer through porous and solid media were implemented to describe those of the hydrogen flow channels inside the solid core. Design analyses of a single flow element and the entire solid-core thrust chamber of the Small Engine were performed and the results are presented herein
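
    For scale, the channel-level thermal environment that the full CFD analysis resolves can be bounded by a simple steady-state energy balance, T_out = T_in + Q / (mdot * cp). The numbers below are rough placeholders, not Small Engine design data.

        def channel_exit_temperature(T_in_K, power_W, mdot_kg_s, cp_J_per_kgK):
            """Bulk exit temperature of a coolant channel: T_out = T_in + Q / (mdot * cp)."""
            return T_in_K + power_W / (mdot_kg_s * cp_J_per_kgK)

        # e.g. ~1.5 kW deposited in a channel carrying 0.1 g/s of hydrogen (cp ~ 15 kJ/kg-K).
        print(channel_exit_temperature(T_in_K=300.0, power_W=1500.0,
                                       mdot_kg_s=1.0e-4, cp_J_per_kgK=1.5e4))  # ~1300 K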

  10. Preanalytical errors in medical laboratories: a review of the available methodologies of data collection and analysis.

    PubMed

    West, Jamie; Atherton, Jennifer; Costelloe, Seán J; Pourmahram, Ghazaleh; Stretton, Adam; Cornes, Michael

    2017-01-01

    Preanalytical errors have previously been shown to contribute a significant proportion of errors in laboratory processes and contribute to a number of patient safety risks. Accreditation against ISO 15189:2012 requires that laboratory Quality Management Systems consider the impact of preanalytical processes in areas such as the identification and control of non-conformances, continual improvement, internal audit and quality indicators. Previous studies have shown that there is a wide variation in the definition, repertoire and collection methods for preanalytical quality indicators. The International Federation of Clinical Chemistry Working Group on Laboratory Errors and Patient Safety has defined a number of quality indicators for the preanalytical stage, and the adoption of harmonized definitions will support interlaboratory comparisons and continual improvement. There are a variety of data collection methods, including audit, manual recording processes, incident reporting mechanisms and laboratory information systems. Quality management processes such as benchmarking, statistical process control, Pareto analysis and failure mode and effect analysis can be used to review data and should be incorporated into clinical governance mechanisms. In this paper, The Association for Clinical Biochemistry and Laboratory Medicine PreAnalytical Specialist Interest Group review the various data collection methods available. Our recommendation is the use of the laboratory information management systems as a recording mechanism for preanalytical errors as this provides the easiest and most standardized mechanism of data capture.
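
    Of the quality management tools listed, Pareto analysis is the most mechanical to reproduce: rank the nonconformance categories by count and accumulate their share of the total to find the "vital few". The categories and counts below are invented for illustration, not drawn from any laboratory's data.

        from collections import Counter

        # Hypothetical counts of preanalytical nonconformances recorded over a quarter.
        errors = Counter({"hemolysed sample": 120, "insufficient volume": 80,
                          "clotted sample": 60, "mislabelled tube": 45,
                          "wrong container": 20, "missing request details": 15})

        total = sum(errors.values())
        cumulative = 0
        print(f"{'category':<26}{'count':>6}{'cum %':>8}")
        for category, count in errors.most_common():
            cumulative += count
            print(f"{category:<26}{count:>6}{100 * cumulative / total:>7.1f}%")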

  11. NASA Laboratory Analysis for Manned Exploration Missions

    NASA Technical Reports Server (NTRS)

    Krihak, Michael K.; Shaw, Tianna E.

    2014-01-01

    The Exploration Laboratory Analysis (ELA) project supports the Exploration Medical Capability Element under the NASA Human Research Program. ELA instrumentation is identified as an essential capability for future exploration missions to diagnose and treat evidence-based medical conditions. However, the mission architecture limits the medical equipment, consumables, and procedures that will be available to treat medical conditions during human exploration missions. Allocated resources such as mass, power, volume, and crew time must be used efficiently to optimize the delivery of in-flight medical care. Although commercial instruments can provide the blood- and urine-based measurements required for exploration missions, these commercial-off-the-shelf devices are prohibitive for deployment in the space environment. The objective of the ELA project is to close the technology gap between current minimally invasive laboratory capabilities and the analytical measurements required within the constraints that the mission architecture imposes on exploration missions. Besides microgravity and radiation tolerances, other principal issues that generally fail to meet NASA requirements include excessive mass, volume, power and consumables, and nominal reagent shelf-life. Though manned exploration missions will not occur for nearly a decade, NASA has already taken strides towards the development of ELA medical diagnostics by developing mission requirements and concepts of operations that are coupled with strategic investments and partnerships towards meeting these challenges. This paper focuses on the remote environment, its challenges, biomedical diagnostics requirements, and candidate technologies that may lead to successful blood-urine chemistry and biomolecular measurements in future space exploration missions.

  12. An Analysis of Laboratory Safety in Texas.

    ERIC Educational Resources Information Center

    Fuller, Edward J.; Picucci, Ali Callicoatte; Collins, James W.; Swann, Philip

    This paper reports on a survey to discover the types of laboratory accidents that occur in Texas public schools, the factors associated with such accidents, and the practices of schools with regard to current laboratory safety requirements. The purpose of the survey is to better understand safety conditions in Texas public schools and to help…

  13. Basic data from five core holes in the Raft River geothermal area, Cassia County, Idaho

    USGS Publications Warehouse

    Crosthwaite, E. G.

    1976-01-01

    meters) were completed in the area (Crosthwaite, 1974), and the Aerojet Nuclear Company, under the auspices of the U.S. Energy Research and Development Administration, was planning some deep drilling to 4,000 to 6,000 feet (1,200 to 1,800 meters) (fig. 1). The purpose of the core drilling was to provide information to test geophysical interpretations of the subsurface structure and lithology and to provide hydrologic and geologic data on the shallow part of the geothermal system. Samples of the core were made available to several divisions and branches of the Geological Survey and to people and agencies outside the Survey. This report presents the basic data from the core holes that had been collected to September 1, 1975, and includes lithologic and geophysical well logs, chemical analyses of water (table 1), and laboratory analyses of cores (table 2) that were completed as of the above date. The data were collected by the Idaho District office, Hydrologic Laboratory, Borehole Geophysics Research Project, and Drilling, Sampling, and Testing Section, all of the Water Resources Division, and the Branch of Central Environmental Geology of the Geologic Division.

  14. Explicit Instruction in Core Reading Programs

    ERIC Educational Resources Information Center

    Reutzel, D. Ray; Child, Angela; Jones, Cindy D.; Clark, Sarah K.

    2014-01-01

    The purpose of this study was to conduct a content analysis of the types and occurrences of explicit instructional moves recommended for teaching five essentials of effective reading instruction in grades 1, 3, and 5 core reading program teachers' editions in five widely marketed core reading programs. Guided practice was the most frequently…

  15. An Investigative, Cooperative Learning Approach for General Chemistry Laboratories

    ERIC Educational Resources Information Center

    Díaz-Vázquez, Liz M.; Montes, Barbara Casañas; Echevarría Vargas, Ileabett M.; Hernandez-Cancel, Griselle; Gonzalez, Fernando; Molina, Anna M.; Morales-Cruz, Moraima; Torres-Díaz, Carlos M.; Griebenow, Kai

    2012-01-01

    The integration of research and education is an essential component of our university's teaching philosophy. Recently, we made a curricular revision to facilitate such an approach in the General Chemistry Laboratory, to teach students that investigative approaches are at the core of sciences. The curriculum revision included new interdisciplinary…

  16. Diagnostic accuracy of 22/25-gauge core needle in endoscopic ultrasound-guided sampling: systematic review and meta-analysis.

    PubMed

    Oh, Hyoung-Chul; Kang, Hyun; Lee, Jae Young; Choi, Geun Joo; Choi, Jung Sik

    2016-11-01

    To compare the diagnostic accuracy of endoscopic ultrasound-guided core needle aspiration with that of standard fine-needle aspiration by systematic review and meta-analysis. Studies using 22/25-gauge core needles, irrespective of comparison with standard fine needles, were comprehensively reviewed. Pooled sensitivity, specificity, diagnostic odds ratio (DOR), and summary receiver operating characteristic curves for the diagnosis of malignancy were used to estimate the overall diagnostic efficiency. The pooled sensitivity, specificity, and DOR of the core needle for the diagnosis of malignancy were 0.88 (95% confidence interval [CI], 0.84 to 0.90), 0.99 (95% CI, 0.96 to 1), and 167.37 (95% CI, 65.77 to 425.91), respectively. The pooled sensitivity, specificity, and DOR of the standard needle were 0.84 (95% CI, 0.79 to 0.88), 1 (95% CI, 0.97 to 1), and 130.14 (95% CI, 34.00 to 495.35), respectively. The area under the curve of core and standard needle in the diagnosis of malignancy was 0.974 and 0.955, respectively. The core and standard needle were comparable in terms of pancreatic malignancy diagnosis. There was no significant difference in procurement of optimal histologic cores between core and standard needles (risk ratio [RR], 0.545; 95% CI, 0.187 to 1.589). The number of needle passes for diagnosis was significantly lower with the core needle (standardized mean difference, -0.72; 95% CI, -1.02 to -0.41). There were no significant differences in overall complications (RR, 1.26; 95% CI, 0.34 to 4.62) and technical failure (RR, 5.07; 95% CI, 0.68 to 37.64). Core and standard needles were comparable in terms of diagnostic accuracy, technical performance, and safety profile.
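
    For readers unfamiliar with the summary statistic, the sketch below (not taken from the study) shows how a diagnostic odds ratio and its 95% confidence interval are computed from a single hypothetical 2x2 table; a meta-analysis such as the one above pools these study-level estimates on the log scale.

```python
# Minimal sketch (hypothetical counts): diagnostic odds ratio (DOR) and 95% CI
# from one 2x2 table. Pooling across studies would combine such estimates,
# e.g. by inverse-variance weighting of log(DOR).
import math

def diagnostic_odds_ratio(tp, fp, fn, tn):
    # 0.5 continuity correction guards against zero cells
    tp, fp, fn, tn = (x + 0.5 for x in (tp, fp, fn, tn))
    dor = (tp * tn) / (fp * fn)
    se_log = math.sqrt(1/tp + 1/fp + 1/fn + 1/tn)
    lo = math.exp(math.log(dor) - 1.96 * se_log)
    hi = math.exp(math.log(dor) + 1.96 * se_log)
    return dor, (lo, hi)

if __name__ == "__main__":
    # hypothetical study: 88 TP, 1 FP, 12 FN, 99 TN
    dor, ci = diagnostic_odds_ratio(88, 1, 12, 99)
    print(f"DOR = {dor:.1f}, 95% CI = ({ci[0]:.1f}, {ci[1]:.1f})")
```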

  17. Elevating Learner Achievement Using Formative Electronic Lab Assessments in the Engineering Laboratory: A Viable Alternative to Weekly Lab Reports

    ERIC Educational Resources Information Center

    Chen, Baiyun; DeMara, Ronald F.; Salehi, Soheil; Hartshorne, Richard

    2018-01-01

    A laboratory pedagogy interweaving weekly student portfolios with onsite formative electronic laboratory assessments (ELAs) is developed and assessed within the laboratory component of a required core course of the electrical and computer engineering (ECE) undergraduate curriculum. The approach acts to promote student outcomes, and neutralize…

  18. Laboratory Workflow Analysis of Culture of Periprosthetic Tissues in Blood Culture Bottles.

    PubMed

    Peel, Trisha N; Sedarski, John A; Dylla, Brenda L; Shannon, Samantha K; Amirahmadi, Fazlollaah; Hughes, John G; Cheng, Allen C; Patel, Robin

    2017-09-01

    Culture of periprosthetic tissue specimens in blood culture bottles is more sensitive than conventional techniques, but the impact on laboratory workflow has yet to be addressed. Herein, we examined the impact of culture of periprosthetic tissues in blood culture bottles on laboratory workflow and cost. The workflow was process mapped, decision tree models were constructed using probabilities of positive and negative cultures drawn from our published study (T. N. Peel, B. L. Dylla, J. G. Hughes, D. T. Lynch, K. E. Greenwood-Quaintance, A. C. Cheng, J. N. Mandrekar, and R. Patel, mBio 7:e01776-15, 2016, https://doi.org/10.1128/mBio.01776-15), and the processing times and resource costs from the laboratory staff time viewpoint were used to compare periprosthetic tissues culture processes using conventional techniques with culture in blood culture bottles. Sensitivity analysis was performed using various rates of positive cultures. Annualized labor savings were estimated based on salary costs from the U.S. Labor Bureau for Laboratory staff. The model demonstrated a 60.1% reduction in mean total staff time with the adoption of tissue inoculation into blood culture bottles compared to conventional techniques (mean ± standard deviation, 30.7 ± 27.6 versus 77.0 ± 35.3 h per month, respectively; P < 0.001). The estimated annualized labor cost savings of culture using blood culture bottles was $10,876.83 (±$337.16). Sensitivity analysis was performed using various rates of culture positivity (5 to 50%). Culture in blood culture bottles was cost-effective, based on the estimated labor cost savings of $2,132.71 for each percent increase in test accuracy. In conclusion, culture of periprosthetic tissue in blood culture bottles is not only more accurate than but is also cost-saving compared to conventional culture methods. Copyright © 2017 American Society for Microbiology.
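
    A minimal sketch of the style of calculation described above (not the authors' model): expected hands-on time per specimen under two processing pathways, weighted by culture positivity, and the resulting annualized labor saving. All times, the positivity rate, workload and wage below are illustrative assumptions.

```python
# Minimal decision-tree-style sketch with assumed inputs: compare expected
# staff minutes per specimen for two culture workflows, then annualize the
# labor-cost difference.
def expected_minutes(p_positive, t_negative, t_positive):
    return p_positive * t_positive + (1 - p_positive) * t_negative

p_pos = 0.20                      # assumed culture positivity rate
conventional = expected_minutes(p_pos, t_negative=25.0, t_positive=55.0)
blood_bottles = expected_minutes(p_pos, t_negative=10.0, t_positive=22.0)

specimens_per_year = 1200         # assumed workload
wage_per_hour = 30.0              # assumed technologist wage (USD)
hours_saved = specimens_per_year * (conventional - blood_bottles) / 60.0
print(f"expected minutes/specimen: conventional={conventional:.1f}, bottles={blood_bottles:.1f}")
print(f"annual labor saving: {hours_saved:.0f} h  (~${hours_saved * wage_per_hour:,.0f})")
```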

  19. Size-exclusion chromatography using core-shell particles.

    PubMed

    Pirok, Bob W J; Breuer, Pascal; Hoppe, Serafine J M; Chitty, Mike; Welch, Emmet; Farkas, Tivadar; van der Wal, Sjoerd; Peters, Ron; Schoenmakers, Peter J

    2017-02-24

    Size-exclusion chromatography (SEC) is an indispensable technique for the separation of high-molecular-weight analytes and for determining molar-mass distributions. The potential application of SEC as second-dimension separation in comprehensive two-dimensional liquid chromatography demands very short analysis times. Liquid chromatography benefits from the advent of highly efficient core-shell packing materials, but because of the reduced total pore volume these materials have so far not been explored in SEC. The feasibility of using core-shell particles in SEC has been investigated and contemporary core-shell materials were compared with conventional packing materials for SEC. Columns packed with very small core-shell particles showed excellent resolution in specific molar-mass ranges, depending on the pore size. The analysis times were about an order of magnitude shorter than what could be achieved using conventional SEC columns. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Laboratory and Airborne BRDF Analysis of Vegetation Leaves and Soil Samples

    NASA Technical Reports Server (NTRS)

    Georgiev, Georgi T.; Gatebe, Charles K.; Butler, James J.; King, Michael D.

    2008-01-01

    Laboratory-based Bidirectional Reflectance Distribution Function (BRDF) analysis of vegetation leaves, soil, and leaf litter samples is presented. The leaf litter and soil samples, numbered 1 and 2, were obtained from a site located in the savanna biome of South Africa (Skukuza: 25.0degS, 31.5degE). A third soil sample, number 3, was obtained from Etosha Pan, Namibia (19.20degS, 15.93degE, alt. 1100 m). In addition, BRDF of local fresh and dry leaves from tulip tree (Liriodendron tulipifera) and acacia tree (Acacia greggii) were studied. It is shown how the BRDF depends on the incident and scatter angles, sample size (i.e., crushed versus whole leaf), soil sample fraction size, sample status (i.e., fresh versus dry leaves), vegetation species (poplar versus acacia), and the vegetation's biochemical composition. As a demonstration of the application of the results of this study, airborne BRDF measurements acquired with NASA's Cloud Absorption Radiometer (CAR) over the same general site where the soil and leaf litter samples were obtained are compared to the laboratory results. Good agreement between laboratory and airborne measured BRDF is reported.
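
    For context, the quantity being measured is defined as the ratio of reflected radiance to incident irradiance; this is the standard textbook definition, not an expression taken from the paper.

```latex
% Standard BRDF definition (units of sr^-1): reflected radiance in the viewing
% direction per unit incident irradiance from the source direction.
\[
  f_r(\theta_i,\phi_i;\theta_v,\phi_v)
    = \frac{\mathrm{d}L_r(\theta_v,\phi_v)}{\mathrm{d}E_i(\theta_i,\phi_i)}
    = \frac{\mathrm{d}L_r(\theta_v,\phi_v)}
           {L_i(\theta_i,\phi_i)\cos\theta_i\,\mathrm{d}\Omega_i}
    \quad [\mathrm{sr}^{-1}]
\]
```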

  1. Real-world daptomycin use across wide geographical regions: results from a pooled analysis of CORE and EU-CORE.

    PubMed

    Seaton, R Andrew; Gonzalez-Ruiz, Armando; Cleveland, Kerry O; Couch, Kimberly A; Pathan, Rashidkhan; Hamed, Kamal

    2016-03-15

    Pooled data from two large registries, Cubicin(®) Outcomes Registry and Experience (CORE; USA) and European Cubicin(®) Outcomes Registry and Experience (EU-CORE; Europe, Latin America, and Asia), were analyzed to determine the characteristics and clinical outcomes of daptomycin therapy in patients with Gram-positive infections across wide geographical regions. Patients receiving at least one dose of daptomycin between 2004 and 2012 for the treatment of Gram-positive infections were included. Clinical success was defined as an outcome of 'cured' or 'improved'. Post-treatment follow-up data were collected for a subset of patients (CORE: osteomyelitis and orthopedic foreign body device infection; EU-CORE: endocarditis, intracardiac/intravascular device infection, osteomyelitis, and orthopedic device infection). Safety was assessed for up to 30 days after daptomycin treatment. In 11,557 patients (CORE, 5482; EU-CORE, 6075) treated with daptomycin (median age, 62 [range, 1-103] years), the most frequent underlying conditions were cardiovascular disease (54.7 %) and diabetes mellitus (28.0 %). The most commonly treated primary infections were complicated skin and soft tissue infection (cSSTI; 31.2 %) and bacteremia (21.8 %). The overall clinical success rate was 77.2 % (uncomplicated SSTI, 88.3 %; cSSTI, 81.0 %; osteomyelitis, 77.7 %; foreign body/prosthetic infection (FBPI), 75.9 %; endocarditis, 75.4 %; and bacteremia, 69.5 %). The clinical success rate was 79.1 % in patients with Staphylococcus aureus infections (MRSA, 78.1 %). An increasing trend toward high-dose daptomycin (>6 mg/kg/day) prescribing was observed over time. Clinical success rates were higher with high-dose daptomycin treatment for endocarditis and FBPI. Adverse events (AEs) and serious AEs possibly related to daptomycin therapy were reported in 628 (5.4 %) and 133 (1.2 %) patients, respectively. The real-world data showed that daptomycin was effective and safe in the treatment

  2. Naval Research Laboratory Industrial Chemical Analysis and Respiratory Filter Standards Development

    DTIC Science & Technology

    2017-09-29

    Naval Research Laboratory Industrial Chemical Analysis and Respiratory Filter Standards Development, September 29, 2017. Thomas E. Sutto, Materials and Systems Branch, Naval Research Laboratory. Approved for public release; distribution is unlimited. The approach, developed by NRL, is tested by examining the filter behavior against a number of chemicals to determine if the NRL approach resulted in the…

  3. [Analysis of core virion polypeptides from the pathogen causing chicken egg-drop syndrome].

    PubMed

    Iurov, G K; Dadykov, V A; Neugodova, G L; Naroditskiĭ, B S

    1998-01-01

    The cores of egg-drop syndrome virus (EDS-76) were isolated by the pyridine technique. EDS-76 proved to be much more resistant to pyridine disruption than other adenoviruses: treatment with 10% pyridine did not lead to complete dissociation of capsid and cores, and only an increase of the pyridine concentration to 20% produced satisfactory results. At least three polypeptides (24, 10.5, and 6.5 kDa) were found in the core by SDS-PAGE, whereas the 40-kDa polypeptide reacting with the core is most probably not a core component. The much more intensive reaction of the core with the EDS-76 virion capsid suggests that its virion structure differs from that of other adenoviruses.

  4. Synthesis of parallel and antiparallel core-shell triangular nanoparticles

    NASA Astrophysics Data System (ADS)

    Bhattacharjee, Gourab; Satpati, Biswarup

    2018-04-01

    Core-shell triangular nanoparticles were synthesized by seed-mediated growth. Using triangular gold (Au) nanoparticles as templates, we have grown silver (Ag) shells to obtain core-shell nanoparticles. By changing the chemistry, we have grown two types of core-shell structure, in which the core and shell have either the same or opposite symmetry. Both the core and the core-shell nanoparticles were characterized using transmission electron microscopy (TEM) and energy-dispersive X-ray spectroscopy (EDX) to determine the crystal structure and composition of the synthesized core-shell nanoparticles. From diffraction pattern analysis and energy-filtered TEM (EFTEM) we have confirmed that the crystal facet of the core is responsible for the two-dimensional growth of these core-shell nanostructures.

  5. Twenty Years of Active Bacterial Core Surveillance

    PubMed Central

    Schaffner, William; Farley, Monica M.; Lynfield, Ruth; Bennett, Nancy M.; Reingold, Arthur; Thomas, Ann; Harrison, Lee H.; Nichols, Megin; Petit, Susan; Miller, Lisa; Moore, Matthew R.; Schrag, Stephanie J.; Lessa, Fernanda C.; Skoff, Tami H.; MacNeil, Jessica R.; Briere, Elizabeth C.; Weston, Emily J.; Van Beneden, Chris

    2015-01-01

    Active Bacterial Core surveillance (ABCs) was established in 1995 as part of the Centers for Disease Control and Prevention Emerging Infections Program (EIP) network to assess the extent of invasive bacterial infections of public health importance. ABCs is distinctive among surveillance systems because of its large, population-based, geographically diverse catchment area; active laboratory-based identification of cases to ensure complete case capture; detailed collection of epidemiologic information paired with laboratory isolates; infrastructure that allows for more in-depth investigations; and sustained commitment of public health, academic, and clinical partners to maintain the system. ABCs has directly affected public health policies and practices through the development and evaluation of vaccines and other prevention strategies, the monitoring of antimicrobial drug resistance, and the response to public health emergencies and other emerging infections. PMID:26292067

  6. Application of failure mode and effect analysis in an assisted reproduction technology laboratory.

    PubMed

    Intra, Giulia; Alteri, Alessandra; Corti, Laura; Rabellotti, Elisa; Papaleo, Enrico; Restelli, Liliana; Biondo, Stefania; Garancini, Maria Paola; Candiani, Massimo; Viganò, Paola

    2016-08-01

    Assisted reproduction technology laboratories have a very high degree of complexity. Mismatches of gametes or embryos can occur, with catastrophic consequences for patients. To minimize the risk of error, a multi-institutional working group applied failure mode and effects analysis (FMEA) to each critical activity/step as a method of risk assessment. This analysis led to the identification of the potential failure modes, together with their causes and effects, using the risk priority number (RPN) scoring system. In total, 11 individual steps and 68 different potential failure modes were identified. The highest ranked failure modes, with an RPN score of 25, encompassed 17 failures and pertained to "patient mismatch" and "biological sample mismatch". The maximum reduction in risk, with RPN reduced from 25 to 5, was mostly related to the introduction of witnessing. The critical failure modes in sample processing were improved by 50% in the RPN by focusing on staff training. Three indicators of FMEA success, based on technical skill, competence and traceability, have been evaluated after FMEA implementation. Witnessing by a second human operator should be introduced in the laboratory to avoid sample mix-ups. These findings confirm that FMEA can effectively reduce errors in assisted reproduction technology laboratories. Copyright © 2016 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.
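
    A minimal sketch of the RPN scoring described above (the failure modes and the 1-5 scores below are hypothetical, not the working group's actual worksheet):

```python
# Minimal sketch: risk priority number (RPN = severity x occurrence x detection,
# each scored 1-5 here) computed and ranked for hypothetical failure modes.
failure_modes = [
    # (description, severity, occurrence, detection)
    ("patient mismatch at oocyte retrieval",     5, 1, 5),
    ("biological sample mismatch during culture", 5, 1, 5),
    ("embryo culture dish mislabelled",           4, 2, 2),
    ("wrong medium used for culture",             3, 1, 2),
]

ranked = sorted(((s * o * d, desc) for desc, s, o, d in failure_modes), reverse=True)
for rpn, desc in ranked:
    print(f"RPN {rpn:>3}  {desc}")
# Mitigations such as double witnessing lower the occurrence/detection scores
# and therefore the RPN (e.g. from 25 to 5, as reported above).
```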

  7. United States Department of Agriculture/Agricultural Research Service (USDA-ARS) Eastern Regional Research Center Core Technologies

    PubMed Central

    Nunez, A.; Strahan, G.; Soroka, D.S.; Damert, W.; Needleman, D.

    2011-01-01

    The Core Technologies (CT) unit, located at the Eastern Regional Research Center (ERRC), is a centralized resource of specialized instrumentation and technologies. Its objective is to provide supplementary research data processing, interpretation, analysis and consultation for a broad range of research programs approved by the Agricultural Research Service (ARS), the in-house research arm of the United States Department of Agriculture. The CT unit comprises four research-related components: genetic analysis, proteomics/biopolymers mass spectrometry, electron microscopy, and nuclear magnetic resonance (NMR) spectroscopy. In addition, the Research Data Systems, the information pipeline of the CT, provides the means to facilitate data distribution to researchers, stakeholders, and the general public. The availability of integrated resource laboratories assures professional and dependable support to the goals of the ARS community.

  8. Analysis of actinides in an ombrotrophic peat core - evidence of post-depositional migration of fallout radionuclides

    NASA Astrophysics Data System (ADS)

    Quinto, Francesca; Hrnecek, Erich; Krachler, Michael; Shotyk, William; Steier, Peter; Winkler, Stephan R.

    2013-04-01

    Plutonium (239Pu, 240Pu, 241Pu, 242Pu) and uranium (236U, 238U) isotopes were analyzed in an ombrotrophic peat core from the Black Forest, Germany, representing the last 80 years of atmospheric deposition. The reliable determination of these isotopes at ultra-trace levels was possible using ultra-clean laboratory procedures and accelerator mass spectrometry. The 240Pu/239Pu isotopic ratios are constant along the core, with a mean value of 0.19 ± 0.02 (N = 32). This result is consistent with the acknowledged average 240Pu/239Pu isotopic ratio from global fallout in the Northern Hemisphere. The global fallout origin of Pu is confirmed by the corresponding 241Pu/239Pu (0.0012 ± 0.0005) and 242Pu/239Pu (0.004 ± 0.001) isotopic ratios. The identification of the Pu isotopic composition characteristic of global fallout in peat layers pre-dating the period of atmospheric atom bomb testing (AD 1956 - AD 1980) is clear evidence of the migration of Pu down the peat profile. The maximum of global-fallout-derived 236U is detected at the age/depth layer of maximum stratospheric fallout (AD 1963). This finding demonstrates that the 236U bomb peak can be successfully used as an independent chronological marker complementing the 210Pb dating of peat cores. The profiles of the global-fallout-derived 236U and 239Pu are compared with those of 137Cs and 241Am. As is typical of ombrotrophic peat, the temporal fallout pattern of 137Cs is poorly retained. As for Pu, post-depositional migration of 241Am into peat layers preceding the era of atmospheric nuclear tests is observed.

  9. Clinical pharmacology quality assurance program: models for longitudinal analysis of antiretroviral proficiency testing for international laboratories.

    PubMed

    DiFrancesco, Robin; Rosenkranz, Susan L; Taylor, Charlene R; Pande, Poonam G; Siminski, Suzanne M; Jenny, Richard W; Morse, Gene D

    2013-10-01

    Among National Institutes of Health HIV Research Networks conducting multicenter trials, samples from protocols that span several years are analyzed at multiple clinical pharmacology laboratories (CPLs) for multiple antiretrovirals. Drug assay data are, in turn, entered into study-specific data sets that are used for pharmacokinetic analyses, merged to conduct cross-protocol pharmacokinetic analysis, and integrated with pharmacogenomics research to investigate pharmacokinetic-pharmacogenetic associations. The CPLs participate in a semiannual proficiency testing (PT) program implemented by the Clinical Pharmacology Quality Assurance program. Using results from multiple PT rounds, longitudinal analyses of recovery are reflective of accuracy and precision within/across laboratories. The objectives of this longitudinal analysis of PT across multiple CPLs were to develop and test statistical models that longitudinally: (1) assess the precision and accuracy of concentrations reported by individual CPLs and (2) determine factors associated with round-specific and long-term assay accuracy, precision, and bias using a new regression model. A measure of absolute recovery is explored as a simultaneous measure of accuracy and precision. Overall, the analysis outcomes assured 97% accuracy (±20% of the final target concentration of all (21) drug concentration results reported for clinical trial samples by multiple CPLs). Using the Clinical Laboratory Improvement Act acceptance of meeting criteria for ≥2/3 consecutive rounds, all 10 laboratories that participated in 3 or more rounds per analyte maintained Clinical Laboratory Improvement Act proficiency. Significant associations were present between magnitude of error and CPL (Kruskal-Wallis P < 0.001) and antiretroviral (Kruskal-Wallis P < 0.001).
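
    The acceptance criterion mentioned above (results within ±20% of the final target concentration) can be checked per result as in the sketch below; the target and reported concentrations are hypothetical.

```python
# Minimal sketch (hypothetical values): percent recovery for a reported
# proficiency-testing result and a check against a +/-20%-of-target window.
def recovery(reported, target):
    return 100.0 * reported / target

def within_acceptance(reported, target, tolerance=0.20):
    return abs(reported - target) <= tolerance * target

target_ng_ml = 1500.0      # assumed final target concentration
reported_ng_ml = 1342.0    # assumed laboratory result
print(f"recovery = {recovery(reported_ng_ml, target_ng_ml):.1f}%  "
      f"acceptable = {within_acceptance(reported_ng_ml, target_ng_ml)}")
```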

  10. Genome-Wide Analysis of the Core DNA Replication Machinery in the Higher Plants Arabidopsis and Rice

    PubMed Central

    Shultz, Randall W.; Tatineni, Vinaya M.; Hanley-Bowdoin, Linda; Thompson, William F.

    2007-01-01

    Core DNA replication proteins mediate the initiation, elongation, and Okazaki fragment maturation functions of DNA replication. Although this process is generally conserved in eukaryotes, important differences in the molecular architecture of the DNA replication machine and the function of individual subunits have been reported in various model systems. We have combined genome-wide bioinformatic analyses of Arabidopsis (Arabidopsis thaliana) and rice (Oryza sativa) with published experimental data to provide a comprehensive view of the core DNA replication machinery in plants. Many components identified in this analysis have not been studied previously in plant systems, including the GINS (go ichi ni san) complex (PSF1, PSF2, PSF3, and SLD5), MCM8, MCM9, MCM10, NOC3, POLA2, POLA3, POLA4, POLD3, POLD4, and RNASEH2. Our results indicate that the core DNA replication machinery from plants is more similar to vertebrates than single-celled yeasts (Saccharomyces cerevisiae), suggesting that animal models may be more relevant to plant systems. However, we also uncovered some important differences between plants and vertebrate machinery. For example, we did not identify geminin or RNASEH1 genes in plants. Our analyses also indicate that plants may be unique among eukaryotes in that they have multiple copies of numerous core DNA replication genes. This finding raises the question of whether specialized functions have evolved in some cases. This analysis establishes that the core DNA replication machinery is highly conserved across plant species and displays many features in common with other eukaryotes and some characteristics that are unique to plants. PMID:17556508

  11. Starless Cores as Fundamental Physics Labs

    NASA Astrophysics Data System (ADS)

    Mignano, Arturo; Molaro, Paolo; Levshakov, Sergei; Centurión, Miriam; Maccaferri, Giuseppe; Lapinov, Alexander

    We present high resolution observations of the starless dense molecular core L1512 performed with the Medicina 32m radio telescope. The resolved hfs components of HC3N and NH3 show no kinematic sub-structure and consist of an apparently symmetric peak profile without broadened line wings or self-absorption features, suggesting that they sample the same material. The velocity dispersion is 101 (±1) m s^-1 for NH3 and 85 (±2) m s^-1 for HC3N. The kinetic temperature of the cloud is estimated at 9.2 (±1.2) K and the turbulence is 76 m s^-1, in the subsonic regime. This places L1512 among the most quiescent dark cores and makes it an ideal laboratory to study variations of the electron-to-proton mass ratio, μ = m_e/m_p, by means of observations of inversion lines of NH3 combined with rotational lines of other molecular species.
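
    Assuming the quoted turbulent velocity is obtained by the standard thermal/non-thermal decomposition of the line width (a common practice, not stated explicitly above), the NH3 numbers are reproduced as follows:

```latex
% Thermal/non-thermal decomposition of the observed velocity dispersion
% (standard relations; the NH3 values reproduce the ~76 m/s quoted above,
% which is below the isothermal sound speed at 9.2 K, hence subsonic).
\[
  \sigma_{\mathrm{NT}} = \sqrt{\sigma_{\mathrm{obs}}^{2} - \frac{k_{\mathrm{B}} T_{\mathrm{k}}}{m}},
  \qquad
  \sigma_{\mathrm{T},\mathrm{NH_3}} = \sqrt{\frac{k_{\mathrm{B}}\,(9.2\ \mathrm{K})}{17\,m_{\mathrm{H}}}} \approx 67\ \mathrm{m\,s^{-1}},
\]
\[
  \sigma_{\mathrm{NT}} \approx \sqrt{101^{2} - 67^{2}}\ \mathrm{m\,s^{-1}} \approx 76\ \mathrm{m\,s^{-1}}
  \;<\;
  c_{\mathrm{s}} = \sqrt{\frac{k_{\mathrm{B}} T_{\mathrm{k}}}{2.33\,m_{\mathrm{H}}}} \approx 180\ \mathrm{m\,s^{-1}} .
\]
```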

  12. Core Hunter 3: flexible core subset selection.

    PubMed

    De Beukelaer, Herman; Davenport, Guy F; Fack, Veerle

    2018-05-31

    Core collections provide genebank curators and plant breeders with a way to reduce the size of their collections and populations, while minimizing impact on genetic diversity and allele frequency. Many methods have been proposed to generate core collections, often using distance metrics to quantify the similarity of two accessions, based on genetic marker data or phenotypic traits. Core Hunter is a multi-purpose core subset selection tool that uses local search algorithms to generate subsets relying on one or more metrics, including several distance metrics and allelic richness. In version 3 of Core Hunter (CH3) we have incorporated two new, improved methods for summarizing distances to quantify the diversity or representativeness of the core collection. A comparison of CH3 and Core Hunter 2 (CH2) showed that these new metrics can be effectively optimized with less complex algorithms than those used in CH2. CH3 is more effective at maximizing the improved diversity metric than CH2, still ensures a high average and minimum distance, and is faster for large datasets. Using CH3, a simple stochastic hill-climber is able to find highly diverse core collections, and the more advanced parallel tempering algorithm further increases the quality of the core and further reduces variability across independent samples. We also evaluate the ability of CH3 to simultaneously maximize diversity and either representativeness or allelic richness, and compare the results with those of the GDOpt and SimEli methods. CH3 can sample cores as representative as those of GDOpt, which was specifically designed for this purpose, and is able to construct cores that are simultaneously more diverse and either more representative or higher in allelic richness than those obtained by SimEli. In version 3, Core Hunter has been updated to include two new core subset selection metrics that construct cores for representativeness or diversity, with improved performance. It combines and outperforms the
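
    As an illustration of the stochastic hill-climbing idea mentioned above (a toy sketch, not the Core Hunter implementation), the code below swaps accessions in and out of a candidate core to maximize the average distance from each selected entry to its nearest other selected entry; the distance matrix is synthetic.

```python
# Toy sketch of core subset selection by stochastic hill-climbing on a
# nearest-entry diversity measure. Not the Core Hunter code; the distance
# matrix, core size and iteration count are illustrative assumptions.
import random

def diversity(core, dist):
    # average distance from each selected entry to its nearest other selected entry
    return sum(min(dist[i][j] for j in core if j != i) for i in core) / len(core)

def sample_core(dist, size, iterations=5000, seed=0):
    rng = random.Random(seed)
    n = len(dist)
    core = set(rng.sample(range(n), size))
    best = diversity(core, dist)
    for _ in range(iterations):
        out = rng.choice(sorted(core))
        inn = rng.choice([k for k in range(n) if k not in core])
        candidate = (core - {out}) | {inn}
        score = diversity(candidate, dist)
        if score >= best:          # accept non-worsening swaps
            core, best = candidate, score
    return sorted(core), best

if __name__ == "__main__":
    rng = random.Random(1)
    pts = [(rng.random(), rng.random()) for _ in range(30)]   # fake marker-space coordinates
    dist = [[((a[0]-b[0])**2 + (a[1]-b[1])**2) ** 0.5 for b in pts] for a in pts]
    core, score = sample_core(dist, size=8)
    print("selected accessions:", core, "diversity:", round(score, 3))
```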

  13. Diagnostic Pathology and Laboratory Medicine in the Age of “Omics”

    PubMed Central

    Finn, William G.

    2007-01-01

    Functional genomics and proteomics involve the simultaneous analysis of hundreds or thousands of expressed genes or proteins and have spawned the modern discipline of computational biology. Novel informatic applications, including sophisticated dimensionality reduction strategies and cancer outlier profile analysis, can distill clinically exploitable biomarkers from enormous experimental datasets. Diagnostic pathologists are now charged with translating the knowledge generated by the “omics” revolution into clinical practice. Food and Drug Administration-approved proprietary testing platforms based on microarray technologies already exist and will expand greatly in the coming years. However, for diagnostic pathology, the greatest promise of the “omics” age resides in the explosion in information technology (IT). IT applications allow for the digitization of histological slides, transforming them into minable data and enabling content-based searching and archiving of histological materials. IT will also allow for the optimization of existing (and often underused) clinical laboratory technologies such as flow cytometry and high-throughput core laboratory functions. The state of pathology practice does not always keep up with the pace of technological advancement. However, to use fully the potential of these emerging technologies for the benefit of patients, pathologists and clinical scientists must embrace the changes and transformational advances that will characterize this new era. PMID:17652635

  14. Definition of experiments and instruments for a communication/navigation research laboratory. Volume 3: Laboratory descriptions

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The following study objectives are covered: (1) identification of major laboratory equipment; (2) systems and operations analysis in support of the laboratory design; and (3) conceptual design of the comm/nav research laboratory.

  15. Materials and Methods for Streamlined Laboratory Analysis of Environmental Samples, FY 2016 Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Addleman, Raymond S.; Naes, Benjamin E.; McNamara, Bruce K.

    The International Atomic Energy Agency (IAEA) relies upon laboratory analysis of environmental samples (typically referred to as “swipes”) collected during on-site inspections of safeguarded facilities to support the detection and deterrence of undeclared activities. Unfortunately, chemical processing and assay of the samples is slow and expensive. A rapid, effective, and simple extraction process and analysis method is needed to provide certified results with improved timeliness at reduced costs (principally in the form of reduced labor), while maintaining or improving sensitivity and efficacy. To address these safeguards needs, the Pacific Northwest National Laboratory (PNNL) explored and demonstrated improved methods for environmental sample (ES) analysis. Improvements for both bulk and particle analysis were explored. To facilitate continuity and adoption, the new sampling materials and processing methods will be compatible with existing IAEA protocols for ES analysis. PNNL collaborated with Oak Ridge National Laboratory (ORNL), which performed independent validation of the new bulk analysis methods and compared performance to the traditional IAEA Network of Analytical Laboratories (NWAL) protocol. ORNL efforts are reported separately. This report describes PNNL’s FY 2016 progress, which was focused on analytical applications supporting environmental monitoring of uranium enrichment plants and nuclear fuel processing. In the future the technology could be applied to other safeguards applications and analytes related to fuel manufacturing, reprocessing, etc. PNNL’s FY 2016 efforts were broken into two tasks, and a summary of progress, accomplishments and highlights is provided below. Principal progress and accomplishments on Task 1, Optimize Materials and Methods for ICP-MS Environmental Sample Analysis, are listed below. • Completed initial procedure for rapid uranium extraction from ES swipes based upon carbonate-peroxide chemistry (delivered to

  16. Observations of Pre-Stellar Cores

    NASA Astrophysics Data System (ADS)

    Tafalla, M.

    2005-08-01

    Our understanding of the physical and chemical structure of pre-stellar cores, the simplest star-forming sites, has significantly improved since the last IAU Symposium on Astrochemistry (South Korea, 1999). Research done over these years has revealed that major molecular species like CO and CS systematically deplete onto dust grains in the interior of pre-stellar cores, while species like N2H+ and NH3 survive in the gas phase and can usually be detected toward the core centers. Such a selective behavior of molecular species gives rise to a differentiated (onion-like) chemical composition, and manifests itself in molecular maps as a dichotomy between centrally peaked and ring-shaped distributions. From the point of view of star-formation studies, the identification of molecular inhomogeneities in cores helps to resolve past discrepancies between observations made using different tracers, and brings the possibility of self-consistent modelling of the core internal structure. Here I present recent work on determining the physical and chemical structure of two pre-stellar cores, L1498 and L1517B, using observations in a large number of molecules and Monte Carlo radiative transfer analysis. These two cores are typical examples of the pre-stellar core population, and their chemical composition is characterized by the presence of large `freeze out holes' in most molecular species. In contrast with these chemically processed objects, a new population of chemically young cores has begun to emerge. The characteristics of its most extreme representative, L1521E, are briefly reviewed.

  17. The Information Systems Core: A Study from the Perspective of IS Core Curricula in the U.S.

    ERIC Educational Resources Information Center

    Hwang, Drew; Ma, Zhongming; Wang, Ming

    2015-01-01

    To keep up with technology changes and industry trends, it is essential for Information Systems (IS) programs to maintain up to date curricula. In doing so, IS educators need to determine what the IS core is and implement it in their curriculum. This study performed a descriptive analysis of 2,229 core courses offered by 394 undergraduate IS…

  18. Design and analysis of a toroidal tester for the measurement of core losses under axial compressive stress

    NASA Astrophysics Data System (ADS)

    Alatawneh, Natheer; Rahman, Tanvir; Lowther, David A.; Chromik, Richard

    2017-06-01

    Electric machine cores are subjected to mechanical stresses due to manufacturing processes. These stresses include radial, circumferential and axial components that may have significant influences on the magnetic properties of the electrical steel and hence on the output and efficiency of electrical machines. Previously, most studies of iron losses due to mechanical stress have considered only the radial and circumferential components. In this work, an improved toroidal tester has been designed and developed to measure the core losses and the magnetic properties of electrical steel under a compressive axial stress. The shape of the toroidal ring has been verified using 3D stress analysis. Also, 3D electromagnetic simulations show a uniform flux density distribution in the specimen, with a variation of 0.03 T and a maximum average induction level of 1.5 T. The developed design has been prototyped, and measurements were carried out using a steel sample of grade 35WW300. Measurements show that applying small mechanical stresses normal to the sample thickness raises the measured core losses; the losses then decrease continuously as the stress increases. However, the drop in core losses at high stresses does not bring them below the stress-free value. Physical explanations for the observed trend of core losses as a function of stress are provided, based on the separation of core losses into hysteresis and eddy current loss components. The experimental results show that the effect of axial compressive stress on the magnetic properties of electrical steel becomes less pronounced at high induction levels.
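
    For reference, the loss separation referred to above is commonly written in the standard form below, with coefficients fitted to measurement; this is a generic expression, not the authors' specific model.

```latex
% Generic core-loss separation into hysteresis and classical eddy-current terms
% (an excess-loss term k_exc f^{1.5} B_m^{1.5} is sometimes added); the
% coefficients k_h, alpha and k_e are fitted to measured loss data.
\[
  P_{\mathrm{Fe}}(f, B_{\mathrm{m}})
    = \underbrace{k_{\mathrm{h}}\, f\, B_{\mathrm{m}}^{\alpha}}_{\text{hysteresis}}
    + \underbrace{k_{\mathrm{e}}\, f^{2} B_{\mathrm{m}}^{2}}_{\text{classical eddy current}}
\]
```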

  19. Simulation of cracking cores when molding piston components

    NASA Astrophysics Data System (ADS)

    Petrenko, Alena; Soukup, Josef

    2014-08-01

    The article deals with pistons cast from aluminum alloy. The pistons are cast in a steel mold with a steel core on a gravity casting machine. Each machine is equipped with two metal molds, which are preheated to above 160 °C before use. The steel core is also preheated with a flame. The metal molds and cores heat up further during the casting process: the mold temperature rises to about 200 °C and the core temperature is higher still. The surface of the core is treated by nitriding. The mold and core are cooled with water during the casting process. The core nevertheless overheats, and its top part eventually cracks despite the intensive water cooling. The service life of the core is reduced to approximately 5 to 15 thousand castings, which is only 15 % of the life of cores used for the production of other pistons. The article presents a temperature analysis of the core.

  20. Multiphysics Analysis of a Solid-Core Nuclear Thermal Engine Thrust Chamber

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Canabal, Francisco; Cheng, Gary; Chen, Yen-Sen

    2006-01-01

    The objective of this effort is to develop an efficient and accurate thermo-fluid computational methodology to predict environments for a hypothetical solid-core, nuclear thermal engine thrust chamber. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics methodology. Formulations for heat transfer in solids and porous media were implemented and anchored. A two-pronged approach was employed in this effort: A detailed thermo-fluid analysis on a multi-channel flow element for mid-section corrosion investigation; and a global modeling of the thrust chamber to understand the effect of hydrogen dissociation and recombination on heat transfer and thrust performance. The formulations and preliminary results on both aspects are presented.

  1. Analysis and Exchange of Multimedia Laboratory Data Using the Brain Database

    PubMed Central

    Wertheim, Steven L.

    1990-01-01

    Two principal goals of the Brain Database are: 1) to support laboratory data collection and analysis of multimedia information about the nervous system and 2) to support exchange of these data among researchers and clinicians who may be physically distant. This has been achieved by an implementation of experimental and clinical records within a relational database. An Image Series Editor has been created that provides a graphical interface to these data for the purposes of annotation, quantification and other analyses. Cooperating laboratories each maintain their own copies of the Brain Database to which they may add private data. Although the data in a given experimental or patient record will be distributed among many tables and external image files, the user can treat each record as a unit that can be extracted from the local database and sent to a distant colleague.
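
    A minimal sketch of the kind of organization described above, using a hypothetical schema (not the Brain Database's actual tables): records are split across relational tables while images stay in external files referenced by path.

```python
# Minimal sketch with a hypothetical schema: an experimental record distributed
# across relational tables, image files kept outside the database and referenced
# by path, and a join that reassembles the record for exchange.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE experiment (
    id INTEGER PRIMARY KEY,
    title TEXT NOT NULL,
    laboratory TEXT
);
CREATE TABLE image_series (
    id INTEGER PRIMARY KEY,
    experiment_id INTEGER REFERENCES experiment(id),
    modality TEXT,               -- e.g. histology, MRI
    annotation TEXT
);
CREATE TABLE image (
    id INTEGER PRIMARY KEY,
    series_id INTEGER REFERENCES image_series(id),
    file_path TEXT NOT NULL      -- external image file, not stored in the DB
);
""")
conn.execute("INSERT INTO experiment (title, laboratory) VALUES (?, ?)",
             ("tracer injection study", "Lab A"))
conn.execute("INSERT INTO image_series (experiment_id, modality, annotation) "
             "VALUES (1, 'histology', 'coronal sections')")
conn.execute("INSERT INTO image (series_id, file_path) "
             "VALUES (1, '/data/series1/section_001.tif')")

# Extracting a whole record for exchange = joining its rows plus copying the files.
rows = conn.execute("""
SELECT e.title, s.modality, i.file_path
FROM experiment e
JOIN image_series s ON s.experiment_id = e.id
JOIN image i ON i.series_id = s.id
""").fetchall()
print(rows)
```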

  2. Provenance of whitefish in the Gulf of Bothnia determined by elemental analysis of otolith cores

    NASA Astrophysics Data System (ADS)

    Lill, J.-O.; Finnäs, V.; Slotte, J. M. K.; Jokikokko, E.; Heimbrand, Y.; Hägerstrand, H.

    2018-02-01

    The strontium concentration in the core of otoliths was used to determine the provenance of whitefish found in the Gulf of Bothnia, Baltic Sea. To that end, a database of strontium concentration in fish otoliths representing different habitats (sea, river and fresh water) had to be built. Otoliths from juvenile whitefish were therefore collected from freshwater ponds at 5 hatcheries, from adult whitefish from 6 spawning sites at sea along the Finnish west coast, and from adult whitefish ascending to spawn in the Torne River, in total 67 otoliths. PIXE was applied to determine the elemental concentrations in these otoliths. While otoliths from the juveniles raised in the freshwater ponds showed low but varying strontium concentrations (194-1664 μg/g), otoliths from sea-spawning fish showed high uniform strontium levels (3720-4333 μg/g). The otolith core analysis of whitefish from Torne River showed large variations in the strontium concentrations (1525-3650 μg/g). These otolith data form a database to be used for provenance studies of wild adult whitefish caught at sea. The applicability of the database was evaluated by analyzing the core of polished otoliths from 11 whitefish from a test site at sea in the Larsmo archipelago. Our results show that by analyzing strontium in the otolith core, we can differentiate between hatchery-origin and wild-origin whitefish, but not always between river and sea spawning whitefish.
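
    A minimal sketch of how such a database could be applied for classification (not the authors' procedure); the cut-off values are illustrative assumptions informed by the concentration ranges reported above.

```python
# Minimal sketch: classify otolith-core origin from Sr concentration.
# Thresholds are illustrative assumptions loosely based on the reported ranges
# (freshwater/hatchery 194-1664 ug/g, sea-spawned 3720-4333 ug/g).
def classify_origin(sr_core_ug_g, freshwater_max=2000.0, sea_min=3500.0):
    if sr_core_ug_g <= freshwater_max:
        return "freshwater origin (hatchery or river juvenile phase)"
    if sr_core_ug_g >= sea_min:
        return "sea-spawned origin"
    return "intermediate - origin ambiguous"

for sr in (800.0, 2900.0, 4100.0):     # hypothetical core concentrations
    print(f"{sr:6.0f} ug/g -> {classify_origin(sr)}")
```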

  3. A Comprehensive Microfluidics Device Construction and Characterization Module for the Advanced Undergraduate Analytical Chemistry Laboratory

    ERIC Educational Resources Information Center

    Piunno, Paul A. E.; Zetina, Adrian; Chu, Norman; Tavares, Anthony J.; Noor, M. Omair; Petryayeva, Eleonora; Uddayasankar, Uvaraj; Veglio, Andrew

    2014-01-01

    An advanced analytical chemistry undergraduate laboratory module on microfluidics that spans 4 weeks (4 h per week) is presented. The laboratory module focuses on comprehensive experiential learning of microfluidic device fabrication and the core characteristics of microfluidic devices as they pertain to fluid flow and the manipulation of samples.…

  4. Field analysis of the Cerenkov doubling of infrared coherent radiation utilizing an organic crystal core bounded by a glass capillary

    NASA Astrophysics Data System (ADS)

    Hayata, K.; Yanagawa, K.; Koshiba, M.

    1990-12-01

    A mode field analysis is presented of the second-harmonic electromagnetic wave that radiates from a nonlinear core bounded by a dielectric cladding. With this analysis the ultimate performance of the organic crystal-cored single-mode optical fiber waveguide as a guided-wave frequency doubler is evaluated through the solution of nonlinear parametric equations derived from Maxwell's equations under some assumptions. As a phase-matching scheme, a Cerenkov approach is considered because of advantages in actual device applications, in which the phase matching is achievable between the fundamental guided LP01 mode and the second-harmonic radiation (leaky) mode. Calculated results for organic cores made of benzil, 4-(N,N-dimethylamino)-3-acetamidonitrobenzene, 2-methyl-4-nitroaniline, and 4'-nitrobenzylidene-3-acetamino-4-methoxyaniline provide useful data for designing an efficient fiber-optic wavelength converter utilizing nonlinear parametric processes. A detailed comparison is made between results for infinite and finite cladding thicknesses.
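
    As background for the Cerenkov scheme mentioned above, the usual phase-matching picture is that the second harmonic radiates into the cladding at an angle set by the guided fundamental; this is a standard relation, not an equation taken from the paper.

```latex
% Cerenkov-type phase matching: the second harmonic leaks into the cladding at
% angle theta whenever the cladding index at 2w exceeds the effective index of
% the guided fundamental mode (standard relation, stated here for context).
\[
  n_{\mathrm{clad}}(2\omega)\cos\theta = N_{\mathrm{eff}}(\omega),
  \qquad
  n_{\mathrm{clad}}(2\omega) \ge N_{\mathrm{eff}}(\omega).
\]
```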

  5. Competency Guidelines for Public Health Laboratory Professionals: CDC and the Association of Public Health Laboratories.

    PubMed

    Ned-Sykes, Renée; Johnson, Catherine; Ridderhof, John C; Perlman, Eva; Pollock, Anne; DeBoy, John M

    2015-05-15

    These competency guidelines outline the knowledge, skills, and abilities necessary for public health laboratory (PHL) professionals to deliver the core services of PHLs efficiently and effectively. As part of a 2-year workforce project sponsored in 2012 by CDC and the Association of Public Health Laboratories (APHL), competencies for 15 domain areas were developed by experts representing state and local PHLs, clinical laboratories, academic institutions, laboratory professional organizations, CDC, and APHL. The competencies were developed and reviewed by approximately 170 subject matter experts with diverse backgrounds and experiences in laboratory science and public health. The guidelines comprise general, cross-cutting, and specialized domain areas and are divided into four levels of proficiency: beginner, competent, proficient, and expert. The 15 domain areas are 1) Quality Management System, 2) Ethics, 3) Management and Leadership, 4) Communication, 5) Security, 6) Emergency Management and Response, 7) Workforce Training, 8) General Laboratory Practice, 9) Safety, 10) Surveillance, 11) Informatics, 12) Microbiology, 13) Chemistry, 14) Bioinformatics, and 15) Research. These competency guidelines are targeted to scientists working in PHLs, defined as governmental public health, environmental, and agricultural laboratories that provide analytic biological and/or chemical testing and testing-related services that protect human populations against infectious diseases, foodborne and waterborne diseases, environmental hazards, treatable hereditary disorders, and natural and human-made public health emergencies. The competencies support certain PHL workforce needs such as identifying job responsibilities, assessing individual performance, and providing a guiding framework for producing education and training programs. Although these competencies were developed specifically for the PHL community, this does not preclude their broader application to other professionals

  6. Structure-Function Analysis of the Drosophila melanogaster Caudal Transcription Factor Provides Insights into Core Promoter-preferential Activation.

    PubMed

    Shir-Shapira, Hila; Sharabany, Julia; Filderman, Matan; Ideses, Diana; Ovadia-Shochat, Avital; Mannervik, Mattias; Juven-Gershon, Tamar

    2015-07-10

    Regulation of RNA polymerase II transcription is critical for the proper development, differentiation, and growth of an organism. The RNA polymerase II core promoter is the ultimate target of a multitude of transcription factors that control transcription initiation. Core promoters encompass the RNA start site and consist of functional elements such as the TATA box, initiator, and downstream core promoter element (DPE), which confer specific properties to the core promoter. We have previously discovered that Drosophila Caudal, which is a master regulator of genes involved in development and differentiation, is a DPE-specific transcriptional activator. Here, we show that the mouse Caudal-related homeobox (Cdx) proteins (mCdx1, mCdx2, and mCdx4) are also preferential core promoter transcriptional activators. To elucidate the mechanism that enables Caudal to preferentially activate DPE transcription, we performed structure-function analysis. Using a systematic series of deletion mutants (all containing the intact DNA-binding homeodomain) we discovered that the C-terminal region of Caudal contributes to the preferential activation of the fushi tarazu (ftz) Caudal target gene. Furthermore, the region containing both the homeodomain and the C terminus of Caudal was sufficient to confer core promoter-preferential activation to the heterologous GAL4 DNA-binding domain. Importantly, we discovered that Drosophila CREB-binding protein (dCBP) is a co-activator for Caudal-regulated activation of ftz. Strikingly, dCBP conferred the ability to preferentially activate the DPE-dependent ftz reporter to mini-Caudal proteins that were unable to preferentially activate ftz transcription themselves. Taken together, it is the unique combination of dCBP and Caudal that enables the co-activation of ftz in a core promoter-preferential manner. © 2015 by The American Society for Biochemistry and Molecular Biology, Inc.

  7. Factors affecting the use of increment cores to assess fixation

    Treesearch

    Stan T. Lebow

    2001-01-01

    As part of an effort to ensure that treated wood products have minimal environmental and handling concerns, an American Wood Preservers Association task force is considering the development of a test to assess the degree of fixation of waterborne wood preservatives. The proposed test involves removal and leaching of increment cores. This paper describes a laboratory...

  8. Quality-assurance plan for the analysis of fluvial sediment by the U.S. Geological Survey Kentucky Water Science Center Sediment Laboratory

    USGS Publications Warehouse

    Shreve, Elizabeth A.; Downs, Aimee C.

    2005-01-01

    This report describes laboratory procedures used by the U.S. Geological Survey Kentucky Water Science Center Sediment Laboratory for the processing and analysis of fluvial-sediment samples for concentration of sand and finer material. The report details the processing of a sediment sample through the laboratory from receiving the sediment sample, through the analytical process, to compiling results of the requested analysis. Procedures for preserving sample integrity, calibrating and maintaining of laboratory and field instruments and equipment, analyzing samples, internal quality assurance and quality control, and validity of the sediment-analysis results also are described. The report includes a list of references cited and a glossary of sediment and quality-assurance terms.

  9. Core Muscle Activation in Suspension Training Exercises.

    PubMed

    Cugliari, Giovanni; Boccia, Gennaro

    2017-02-01

    A quantitative observational laboratory study was conducted to characterize and classify core training exercises executed in a suspension modality on the basis of muscle activation. In a prospective single-group repeated measures design, seventeen active male participants performed four suspension exercises typically associated with core training (roll-out, bodysaw, pike and knee-tuck). Surface electromyographic signals were recorded from the lower and upper parts of rectus abdominis, external oblique, internal oblique, and the lower and upper parts of erector spinae muscles using concentric bipolar electrodes. The average rectified values of the electromyographic signals were normalized with respect to the individual maximum voluntary isometric contraction of each muscle. The roll-out exercise showed the highest activation of the rectus abdominis and oblique muscles compared to the other exercises. The rectus abdominis and external oblique reached an activation higher than 60% of the maximal voluntary contraction (or very close to that threshold, 55%) in the roll-out and bodysaw exercises. Findings from this study allow the selection of suspension core training exercises on the basis of quantitative information about the activation of the muscles of interest. Roll-out and bodysaw exercises can be considered suitable for strength training of the rectus abdominis and external oblique muscles.
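
    A minimal sketch of the normalization described above (not the study's processing pipeline); the signal values are hypothetical and the 60% MVIC threshold is used here only as an illustrative cut-off.

```python
# Minimal sketch with hypothetical EMG samples: average rectified value (ARV)
# normalized to a maximal voluntary isometric contraction (MVIC), then compared
# against an illustrative 60% MVIC strength-training threshold.
def average_rectified_value(samples):
    return sum(abs(x) for x in samples) / len(samples)

def percent_mvic(exercise_samples, mvic_samples):
    return 100.0 * average_rectified_value(exercise_samples) / average_rectified_value(mvic_samples)

exercise = [0.42, -0.55, 0.61, -0.48, 0.50]   # hypothetical EMG amplitudes (mV)
mvic = [0.90, -0.85, 0.95, -0.88, 0.92]
activation = percent_mvic(exercise, mvic)
print(f"activation = {activation:.0f}% MVIC -> "
      f"{'suitable for strength training' if activation >= 60 else 'below 60% MVIC'}")
```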

  10. Stability analysis of Hawaiian Island flanks using insight gained from strength testing of the HSDP core

    NASA Astrophysics Data System (ADS)

    Thompson, Nick; Watters, Robert J.; Schiffman, Peter

    2008-04-01

    Hawaiian Island flank failures are recognized as the largest landslide events on Earth, reaching volumes of several thousand cubic kilometers and lengths of over 200 km and occurring on an average of once every 100 000 years. The 3.1 km deep Hawaii Scientific Drilling Project (HSDP) enabled an investigation of the rock mass strength variations on the island of Hawaii [Schiffman, P., Watters, R.J., Thompson, N., Walton, A.W., 2006. Hyaloclastites and the slope stability of Hawaiian volcanoes: insights from the Hawaiian Scientific Drilling Project's 3-km drill core. Journal of Volcanology and Geothermal Research, 151 (1-3): 217-228]. This study builds on that of Schiffman et al. [Schiffman, P., Watters, R.J., Thompson, N., Walton, A.W., 2006. Hyaloclastites and the slope stability of Hawaiian volcanoes: Insights from the Hawaiian Scientific Drilling Project's 3-km drill core. Journal of Volcanology and Geothermal Research, 151 (1-3): 217-228] by considering more in-depth rock mass classification and strength testing methods of the HSDP core. Geotechnical core logging techniques combined with laboratory strength testing methods show that rock strength differences exist within the edifice. Comparing the rock strength parameters obtained from the various volcano lithologies identified weak zones, suggesting the possible location of future slip surfaces for large flank failures. Relatively weak rock layers were recognized within poorly consolidated hyaloclastite zones, with increases in strength based on degree of alteration. Subaerial and submarine basalt flows are found to be significantly stronger. With the aid of digital elevation models, cross-sections have been developed of key flank areas on the island of Hawaii. Limit equilibrium slope stability analyses are performed on each cross-section using various failure criteria for the rock mass strength calculations. Based on the stability analyses the majority of the slopes analyzed are considered stable. In cases
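
    For reference, limit-equilibrium analyses of this kind reduce to a factor of safety of the generic form below, with shear strength given by a Mohr-Coulomb criterion; this is a standard expression, not the authors' specific slice formulation.

```latex
% Generic limit-equilibrium factor of safety with Mohr-Coulomb shear strength
% (c' cohesion, sigma_n normal stress, u pore pressure, phi' friction angle).
\[
  \mathrm{FS} = \frac{\text{available shear resistance along the slip surface}}
                     {\text{shear stress required for equilibrium}},
  \qquad
  \tau_{\mathrm{f}} = c' + (\sigma_{\mathrm{n}} - u)\tan\varphi' .
\]
```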

  11. Beryllium Laboratory Analysis--The Regulations May Drive the Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taruru, Seuri K

    Beryllium has many industry-specific applications, such as medical X-ray windows for diagnostic equipment, nuclear reactors, aerospace applications, precision instrumentation, and other consumer products for which lightness and rigidity are essential. According to the National Toxicology Program, beryllium oxide (BeO) is one of the most significant beryllium compounds in production. Although beryllium and its compounds have a wide array of beneficial uses, due to its unique properties it is not an ideal metal to be used in all situations. Exposure to beryllium is linked to beryllium sensitization and Chronic Beryllium Disease (CBD), which is incurable, debilitating, and potentially fatal. The International Agency for Research on Cancer classifies beryllium and beryllium compounds as “carcinogenic to humans” (Group I), and EPA classifies beryllium as a likely human carcinogen, the lung being the primary target organ. Laboratory analysis of beryllium samples has always presented a challenge to the analytical community. While most metals of interest to industrial hygienists have occupational exposure limits (OELs) in milligrams per cubic meter (mg/m3), the beryllium OELs are in micrograms per cubic meter (μg/m3). Some regulatory agencies have recently published beryllium OELs so low that in some cases a laboratory limit of detection (LOD) in nanograms (ng) is required. For most substances, science drives the regulations, but for beryllium, regulations appear to be driving science to develop laboratory analytical methods that can adequately support the proposed OELs. (EPA has issued guidelines regarding ambient and community airborne beryllium exposure, but this article focuses on beryllium from an occupational exposure perspective.)

  13. Idaho National Laboratory Quarterly Performance Analysis for the 2nd Quarter FY 2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, Lisbeth A.

    2015-04-01

    This report is published quarterly by the Idaho National Laboratory (INL) Quality and Performance Management Organization. The Department of Energy (DOE) Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of events for the 2nd Qtr FY-15.

  14. Warmed, humidified CO2 insufflation benefits intraoperative core temperature during laparoscopic surgery: A meta‐analysis

    PubMed Central

    Dean, Meara; Ramsay, Robert; Heriot, Alexander; Mackay, John; Hiscock, Richard

    2016-01-01

    Abstract Background Intraoperative hypothermia is linked to postoperative adverse events. The use of warmed, humidified CO2 to establish pneumoperitoneum during laparoscopy has been associated with reduced incidence of intraoperative hypothermia. However, the small number and variable quality of published studies have caused uncertainty about the potential benefit of this therapy. This meta‐analysis was conducted to specifically evaluate the effects of warmed, humidified CO2 during laparoscopy. Methods An electronic database search identified randomized controlled trials performed on adults who underwent laparoscopic abdominal surgery under general anesthesia with either warmed, humidified CO2 or cold, dry CO2. The main outcome measure of interest was change in intraoperative core body temperature. Results The database search identified 320 studies as potentially relevant, and of these, 13 met the inclusion criteria and were included in the analysis. During laparoscopic surgery, use of warmed, humidified CO2 is associated with a significant increase in intraoperative core temperature (mean temperature change, 0.3°C), when compared with cold, dry CO2 insufflation. Conclusion Warmed, humidified CO2 insufflation during laparoscopic abdominal surgery has been demonstrated to improve intraoperative maintenance of normothermia when compared with cold, dry CO2. PMID:27976517
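
    The pooled effect described above is, at heart, an inverse-variance weighted mean difference across trials. The sketch below illustrates that calculation in its simplest fixed-effect form; the per-study means, standard deviations and sample sizes are entirely hypothetical and are not the data of the 13 included trials.

    import math

    # Hypothetical per-study summaries:
    # (mean core-temperature difference in °C, sd_warm, n_warm, sd_cold, n_cold)
    studies = [
        (0.35, 0.5, 30, 0.6, 30),
        (0.20, 0.4, 25, 0.5, 24),
        (0.40, 0.7, 40, 0.7, 41),
    ]

    def pooled_mean_difference(studies):
        """Fixed-effect, inverse-variance pooling of per-study mean differences."""
        num = den = 0.0
        for md, sd1, n1, sd2, n2 in studies:
            var = sd1**2 / n1 + sd2**2 / n2   # variance of each study's mean difference
            weight = 1.0 / var
            num += weight * md
            den += weight
        pooled = num / den
        se = math.sqrt(1.0 / den)
        return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

    md, ci = pooled_mean_difference(studies)
    print(f"Pooled mean difference: {md:.2f} °C (95% CI {ci[0]:.2f} to {ci[1]:.2f})")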

  15. Tracking Student Progression through the Core Curriculum. CCRC Analytics

    ERIC Educational Resources Information Center

    Hodara, Michelle; Rodriguez, Olga

    2013-01-01

    This report demonstrates useful methods for examining student progression through the core curriculum. The authors carry out analyses at two colleges in two different states, illustrating students' overall progression through the core curriculum and the relationship of this "core" progression to their college outcomes. By means of this analysis,…

  16. BNL program in support of LWR degraded-core accident analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ginsberg, T.; Greene, G.A.

    1982-01-01

    Two major sources of loading on dry water reactor containments are steam generation from core debris-water thermal interactions and molten core-concrete interactions. Experiments are in progress at BNL in support of analytical model development related to aspects of the above containment loading mechanisms. The work supports development and evaluation of the CORCON (Muir, 1981) and MARCH (Wooton, 1980) computer codes. Progress in the two programs is described in this paper. 8 figures.

  17. Development of Optimized Core Design and Analysis Methods for High Power Density BWRs

    NASA Astrophysics Data System (ADS)

    Shirvan, Koroush

    temperature was kept the same for the BWR-HD and ABWR, which resulted in a 4 K cooler core inlet temperature for the BWR-HD given that its feedwater makes up a larger fraction of total core flow. The stability analysis using the STAB and S3K codes showed satisfactory results for the hot channel, coupled regional out-of-phase and coupled core-wide in-phase modes. A RELAP5 model of the ABWR system was constructed and applied to six transients for the BWR-HD and ABWR. The ΔMCPRs during all the transients were found to be equal or lower for the new design, and the core remained covered for both. The lower void coefficient along with the smaller core volume proved to be advantageous for the simulated transients. Helical Cruciform Fuel (HCF) rods were proposed in prior MIT studies to enhance the fuel surface-to-volume ratio. In this work, higher fidelity models (e.g. CFD instead of subchannel methods for the hydraulic behaviour) are used to investigate the resolution needed for accurate assessment of the HCF design. For neutronics, conserving the fuel area of cylindrical rods results in a different reactivity level with a lower void coefficient for the HCF design. In single-phase flow, for which experimental results existed, the friction factor is found to be sensitive to HCF geometry and cannot be calculated using current empirical models. A new approach for analysis of flow crisis conditions for HCF rods in the context of Departure from Nucleate Boiling (DNB) and dryout using the two-phase interface tracking method was proposed and initial results are presented. It is shown that the twist of the HCF rods promotes detachment of a vapour bubble along the elbows, which indicates no possibility of an early DNB for the HCF rods and in fact a potential for a higher DNB heat flux. Under annular flow conditions, it was found that the twist suppressed the liquid film thickness on the HCF rods at the locations of the highest heat flux, which increases the possibility of reaching early dryout. It

  18. The Core Services of the European Plate Observing System (EPOS)

    NASA Astrophysics Data System (ADS)

    Hoffmann, T. L.; Euteneuer, F. H.; Lauterjung, J.

    2013-12-01

    The ESFRI project European Plate Observing System (EPOS) was launched in November 2010 and has now completed year 3 of its four-year preparatory phase. EPOS will create a single sustainable, permanent observation infrastructure, integrating existing geophysical monitoring networks, local observatories and experimental laboratories in Europe and adjacent regions. EPOS' technical Work Package 6 has developed a three-layer architectural model for the construction of the EPOS Core Services (CS) during the subsequent implementation phase. The poster will present and detail these three layers, consisting of the EPOS Integrated Core Services (ICS), the Thematic Core Services (TCS) and the existing National Research Infrastructures & Data Centers. The basic layer of the architecture is established by the National Research Infrastructures (RIs) & Data Centers, which generate data and information and are responsible for the operation of the instrumentation. National RIs will provide their data to the Thematic Core Services. The Thematic Core Services constitute the community layer of the EPOS architecture and they will: 1) consist of existing (e.g. ORFEUS, EMSC), developing (e.g. EUREF/GNSS) or still to be developed Service Providers for specific thematic communities, as represented within EPOS through the technical EPOS Working Groups (e.g., seismology, volcanology, geodesy, geology, analytic labs for rock physics, geomagnetism, geo-resources ... and many others), 2) provide data services to specific communities, 3) link the National Research Infrastructures to the EPOS Integrated Services, 4) include Service Providers (e.g. OneGeology+, Intermagnet) that may be merely linked or partially integrated and 5) consist of Integrated Laboratories and RIs spanning multiple EPOS disciplines and taking advantage of other existing Thematic Services. The EPOS Integrated Services constitute the ICT layer of the EPOS portal and they will: 1) provide access to multidisciplinary data

  19. From field to database: a user-oriented approach to promote cyber-curating of scientific drilling cores

    NASA Astrophysics Data System (ADS)

    Pignol, C.; Arnaud, F.; Godinho, E.; Galabertier, B.; Caillo, A.; Billy, I.; Augustin, L.; Calzas, M.; Rousseau, D. D.; Crosta, X.

    2016-12-01

    Managing scientific data is probably one of the most challenging issues in modern science. In paleosciences the question is made even more sensitive by the need to preserve and manage high-value, fragile geological samples: cores. Large international scientific programs, such as IODP or ICDP, have led intense efforts to solve this problem and have proposed detailed, high-standard work- and dataflows throughout core handling and curating. However, many paleoscience results derive from small-scale research programs in which data and sample management is too often handled only locally - when it is… In this paper we present a national effort led in France to develop an integrated system to curate ice and sediment cores. Under the umbrella of the national excellence equipment program CLIMCOR, we launched a reflection on core curating and the management of associated fieldwork data. Our aim was to conserve all fieldwork data in an integrated cyber-environment which will evolve toward laboratory-acquired data storage in the near future. To do so, our approach was developed through a close relationship with field operators as well as laboratory core curators in order to propose user-oriented solutions. The national core curating initiative proposes a single web portal in which all teams can store their fieldwork data. This portal is used as a national hub to attribute IGSNs. For legacy samples, this requires the establishment of a dedicated core list with associated metadata. For forthcoming core data, however, we developed a mobile application to capture technical and scientific data directly in the field. This application is linked with a unique coring-tools library and is adapted to most coring devices (gravity, drilling, percussion, etc.), including multiple-section and multiple-hole coring operations. Those field data can be uploaded automatically to the national portal, but also referenced through international standards (IGSN and INSPIRE) and displayed in international

  20. An Inexpensive Electrodeposition Device and Its Use in a Quantitative Analysis Laboratory Exercise

    ERIC Educational Resources Information Center

    Parker, Richard H.

    2011-01-01

    An experimental procedure, using an apparatus that is easy to construct, was developed to incorporate a quantitative electrogravimetric determination of the solution nickel content into an undergraduate or advanced high school quantitative analysis laboratory. This procedure produces results comparable to the procedure used for the gravimetric…

  1. Core-to-core uniformity improvement in multi-core fiber Bragg gratings

    NASA Astrophysics Data System (ADS)

    Lindley, Emma; Min, Seong-Sik; Leon-Saval, Sergio; Cvetojevic, Nick; Jovanovic, Nemanja; Bland-Hawthorn, Joss; Lawrence, Jon; Gris-Sanchez, Itandehui; Birks, Tim; Haynes, Roger; Haynes, Dionne

    2014-07-01

    Multi-core fiber Bragg gratings (MCFBGs) will be a valuable tool not only in communications but also in various astronomical, sensing and industrial applications. In this paper we address some of the technical challenges of fabricating effective multi-core gratings by simulating improvements to the writing method. These methods allow a system designed for inscribing single-core fibers to cope with MCFBG fabrication with only minor, passive changes to the writing process. Using a capillary tube that was polished on one side, the field entering the fiber was flattened, which improved the coverage and uniformity of all cores.

  2. Nondestructive laboratory measurement of geotechnical and geoacoustic properties through intact core-liner

    USGS Publications Warehouse

    Kayen, R.E.; Edwards, B.D.; Lee, H.J.

    1999-01-01

    High-resolution automated measurement of the geotechnical and geoacoustic properties of soil at the U.S. Geological Survey (USGS) is performed with a state-of-the-art multi-sensor whole-core logging device. The device takes measurements, directly through the intact sample-tube wall, of p-wave acoustic velocity, soil wet bulk density, and magnetic susceptibility. This paper summarizes our methodology for determining soil sound speed and wet bulk density for material encased in an unsplit liner. Our methodology for nondestructive measurement allows rapid, accurate, and high-resolution (1 cm-spaced) mapping of the mass physical properties of soil prior to sample extrusion.
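
    As a rough illustration of the through-liner sound-speed determination described above, the sediment velocity can be recovered from the measured wall-to-wall transit time by subtracting the travel time spent in the two liner walls. This is only a sketch of the general idea; the liner dimensions, liner velocity and transit time below are assumed values, not the calibration of the USGS logging device.

    def sediment_velocity(transit_time_us, outer_diameter_m, wall_thickness_m, liner_velocity_mps):
        """Estimate sediment p-wave velocity measured through an intact core liner.

        Assumes a straight ray path across the core diameter with the transducers
        coupled directly to the outside of the liner.
        """
        t_total = transit_time_us * 1e-6                       # total transit time, s
        t_liner = 2.0 * wall_thickness_m / liner_velocity_mps  # time spent in the two walls
        inner_diameter = outer_diameter_m - 2.0 * wall_thickness_m
        return inner_diameter / (t_total - t_liner)            # m/s

    # Hypothetical example: 76 mm OD liner, 3 mm walls, liner velocity ~2300 m/s
    print(sediment_velocity(transit_time_us=48.0, outer_diameter_m=0.076,
                            wall_thickness_m=0.003, liner_velocity_mps=2300.0))  # ~1540 m/s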

  3. RIPOSTE: a framework for improving the design and analysis of laboratory-based research

    PubMed Central

    Masca, Nicholas GD; Hensor, Elizabeth MA; Cornelius, Victoria R; Buffa, Francesca M; Marriott, Helen M; Eales, James M; Messenger, Michael P; Anderson, Amy E; Boot, Chris; Bunce, Catey; Goldin, Robert D; Harris, Jessica; Hinchliffe, Rod F; Junaid, Hiba; Kingston, Shaun; Martin-Ruiz, Carmen; Nelson, Christopher P; Peacock, Janet; Seed, Paul T; Shinkins, Bethany; Staples, Karl J; Toombs, Jamie; Wright, Adam KA; Teare, M Dawn

    2015-01-01

    Lack of reproducibility is an ongoing problem in some areas of the biomedical sciences. Poor experimental design and a failure to engage with experienced statisticians at key stages in the design and analysis of experiments are two factors that contribute to this problem. The RIPOSTE (Reducing IrreProducibility in labOratory STudiEs) framework has been developed to support early and regular discussions between scientists and statisticians in order to improve the design, conduct and analysis of laboratory studies and, therefore, to reduce irreproducibility. This framework is intended for use during the early stages of a research project, when specific questions or hypotheses are proposed. The essential points within the framework are explained and illustrated using three examples (a medical equipment test, a macrophage study and a gene expression study). Sound study design minimises the possibility of bias being introduced into experiments and leads to higher quality research with more reproducible results. DOI: http://dx.doi.org/10.7554/eLife.05519.001 PMID:25951517

  4. RIPOSTE: a framework for improving the design and analysis of laboratory-based research.

    PubMed

    Masca, Nicholas Gd; Hensor, Elizabeth Ma; Cornelius, Victoria R; Buffa, Francesca M; Marriott, Helen M; Eales, James M; Messenger, Michael P; Anderson, Amy E; Boot, Chris; Bunce, Catey; Goldin, Robert D; Harris, Jessica; Hinchliffe, Rod F; Junaid, Hiba; Kingston, Shaun; Martin-Ruiz, Carmen; Nelson, Christopher P; Peacock, Janet; Seed, Paul T; Shinkins, Bethany; Staples, Karl J; Toombs, Jamie; Wright, Adam Ka; Teare, M Dawn

    2015-05-07

    Lack of reproducibility is an ongoing problem in some areas of the biomedical sciences. Poor experimental design and a failure to engage with experienced statisticians at key stages in the design and analysis of experiments are two factors that contribute to this problem. The RIPOSTE (Reducing IrreProducibility in labOratory STudiEs) framework has been developed to support early and regular discussions between scientists and statisticians in order to improve the design, conduct and analysis of laboratory studies and, therefore, to reduce irreproducibility. This framework is intended for use during the early stages of a research project, when specific questions or hypotheses are proposed. The essential points within the framework are explained and illustrated using three examples (a medical equipment test, a macrophage study and a gene expression study). Sound study design minimises the possibility of bias being introduced into experiments and leads to higher quality research with more reproducible results.

  5. Cool Core Disruption in Abell 1763

    NASA Astrophysics Data System (ADS)

    Douglass, Edmund; Blanton, Elizabeth L.; Clarke, Tracy E.; Randall, Scott W.; Edwards, Louise O. V.; Sabry, Ziad

    2017-01-01

    We present the analysis of a 20 ksec Chandra archival observation of the massive galaxy cluster Abell 1763. A model-subtracted image highlighting excess cluster emission reveals a large spiral structure winding outward from the core to a radius of ~950 kpc. We measure the gas of the inner spiral to have significantly lower entropy than non-spiral regions at the same radius. This is consistent with the structure resulting from merger-induced motion of the cluster’s cool core, a phenomenon seen in many systems. Atypical of spiral-hosting clusters, an intact cool core is not detected. Its absence suggests the system has experienced significant disruption since the initial dynamical encounter that set the sloshing core in motion. Along the major axis of the elongated ICM distribution we detect thermal features consistent with the merger event most likely responsible for cool core disruption. The merger-induced transition towards non-cool core status will be discussed. The interaction between the powerful (P_1.4 ~ 10^26 W Hz^-1) cluster-center WAT radio source and its ICM environment will also be discussed.

  6. Engineering Water Analysis Laboratory Activity.

    ERIC Educational Resources Information Center

    Schlenker, Richard M.

    The purposes of water treatment in a marine steam power plant are to prevent damage to boilers, steam-operated equipment, and steam and condensate lines, and to keep all equipment operating at the highest level of efficiency. This laboratory exercise is designed to provide students with experiences in making accurate boiler water tests and to…

  7. Interval Analysis Approach to Prototype the Robust Control of the Laboratory Overhead Crane

    NASA Astrophysics Data System (ADS)

    Smoczek, J.; Szpytko, J.; Hyla, P.

    2014-07-01

    The paper describes the software-hardware equipment and control-measurement solutions elaborated to prototype the laboratory-scale overhead crane control system. A novel approach to crane dynamic system modelling and fuzzy robust control scheme design is presented. The iterative procedure for designing a fuzzy scheduling control scheme is developed based on the interval analysis of discrete-time closed-loop system characteristic polynomial coefficients in the presence of rope length and payload mass variation, to select the minimum set of operating points, corresponding to the midpoints of membership functions, at which the linear controllers are determined through desired pole assignment. The experimental results obtained on the laboratory stand are presented.

  8. Core body temperature in obesity

    PubMed Central

    Heikens, Marc J; Gorbach, Alexander M; Eden, Henry S; Savastano, David M; Chen, Kong Y; Skarulis, Monica C

    2011-01-01

    Background: A lower core body temperature set point has been suggested to be a factor that could potentially predispose humans to develop obesity. Objective: We tested the hypothesis that obese individuals have lower core temperatures than those in normal-weight individuals. Design: In study 1, nonobese [body mass index (BMI; in kg/m2) <30] and obese (BMI ≥30) adults swallowed wireless core temperature–sensing capsules, and we measured core temperatures continuously for 24 h. In study 2, normal-weight (BMI of 18–25) and obese subjects swallowed temperature-sensing capsules to measure core temperatures continuously for ≥48 h and kept activity logs. We constructed daily, 24-h core temperature profiles for analysis. Results: Mean (±SE) daily core body temperature did not differ significantly between the 35 nonobese and 46 obese subjects (36.92 ± 0.03°C compared with 36.89 ± 0.03°C; P = 0.44). Core temperature 24-h profiles did not differ significantly between 11 normal-weight and 19 obese subjects (P = 0.274). Women had a mean core body temperature ≈0.23°C greater than that of men (36.99 ± 0.03°C compared with 36.76 ± 0.03°C; P < 0.0001). Conclusions: Obesity is not generally associated with a reduced core body temperature. It may be necessary to study individuals with function-altering mutations in core temperature–regulating genes to determine whether differences in the core body temperature set point affect the regulation of human body weight. These trials were registered at clinicaltrials.gov as NCT00428987 and NCT00266500. PMID:21367952

  9. Characterization of Gas and Particle Emissions from Laboratory Burns of Peat

    EPA Science Inventory

    Peat cores collected from two locations in eastern North Carolina (NC, USA) were burned in a laboratory facility to characterize emissions during simulated field combustion. Particle and gas samples were analyzed to quantify emission factors for particulate matter (PM2.5), organi...

  10. Integrated Laboratory and Field Investigations: Assessing Contaminant Risk to American Badgers

    EPA Science Inventory

    This manuscript provides an example of integrated laboratory and field approach to complete a toxicological ecological risk assessment at the landscape level. The core findings from the study demonstrate how radio telemetry data can allow for ranking the relative risks of contam...

  11. Fractal analysis of the hydraulic conductivity on a sandy porous media reproduced in a laboratory facility.

    NASA Astrophysics Data System (ADS)

    de Bartolo, S.; Fallico, C.; Straface, S.; Troisi, S.; Veltri, M.

    2009-04-01

    The complexity of the porous media structure, in terms of the "pore" phase and the "solid" phase, can be characterized by means of fractal geometry, which relates the soil structural properties to the water content. It is particularly difficult to describe the hydraulic conductivity analytically because of the irregularity of the porous media structure. However, it can be described by many fractal models that treat the soil structure as the distribution of particle dimensions, the distribution of the solid aggregates, the surface of the pore-solid interface and the fractal mass of the "pore" and "solid" phases. In this paper the fractal model of Yu and Cheng (2002) and Yu and Liu (2004) for a saturated bidispersed porous medium was considered. This model, based on a Sierpinski-type gasket scheme, does not contain empirical constants and agrees well with the experimental data. For this study an unconfined aquifer was reproduced by means of a tank with a volume of 10 × 7 × 3 m3, filled with a homogeneous sand (95% SiO2) with a high percentage (86.4%) of grains between 0.063 mm and 0.125 mm and a medium-high permeability. From the hydraulic point of view, 17 boreholes, a pumping well and a drainage ring around its edge were installed. The permeability was measured using three different methods: a pumping test, slug tests and laboratory analysis of undisturbed soil cores, each involving a different support volume in the measurement. The time series of drawdown obtained from the pumping test were analyzed by the Neuman type-curve method (1972), because the saturated part above the bottom of the facility represents an unconfined aquifer. The slug test data were analyzed by the Bouwer & Rice (1976) method, and the laboratory analyses were performed on undisturbed saturated soil samples using a falling-head permeameter. The obtained values either of the
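
    For reference, the falling-head permeameter determination mentioned above reduces to a single standard relation; the form and symbols below are the textbook ones, given here only as a reminder and not as the authors' exact procedure:

        K = \frac{a\,L}{A\,\Delta t}\,\ln\!\left(\frac{h_1}{h_2}\right)

    where a is the cross-sectional area of the standpipe, A and L are the cross-sectional area and length of the soil sample, and h_1 and h_2 are the hydraulic heads at the start and end of the time interval \Delta t.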

  12. Measurements of ethane in Antarctic ice cores

    NASA Astrophysics Data System (ADS)

    Verhulst, K. R.; Fosse, E. K.; Aydin, K. M.; Saltzman, E. S.

    2011-12-01

    Ethane is one of the most abundant hydrocarbons in the atmosphere. The major ethane sources are fossil fuel production and use, biofuel combustion, and biomass-burning emissions and the primary loss pathway is via reaction with OH. A paleoatmospheric ethane record would be useful as a tracer of biomass-burning emissions, providing a constraint on past changes in atmospheric methane and methane isotopes. An independent biomass-burning tracer would improve our understanding of the relationship between biomass burning and climate. The mean annual atmospheric ethane level at high southern latitudes is about 230 parts per trillion (ppt), and Antarctic firn air measurements suggest that atmospheric ethane levels in the early 20th century were considerably lower (Aydin et al., 2011). In this study, we present preliminary measurements of ethane (C2H6) in Antarctic ice core samples with gas ages ranging from 0-1900 C.E. Samples were obtained from dry-drilled ice cores from South Pole and Vostok in East Antarctica, and from the West Antarctic Ice Sheet Divide (WAIS-D). Gases were extracted from the ice by melting under vacuum in a glass vessel sealed by indium wire and were analyzed using high resolution GC/MS with isotope dilution. Ethane levels measured in ice core samples were in the range 100-220 ppt, with a mean of 157 ± 45 ppt (n=12). System blanks contribute roughly half the amount of ethane extracted from a 300 g ice core sample. These preliminary data exhibit a temporal trend, with higher ethane levels from 0-900 C.E., followed by a decline, reaching a minimum between 1600-1700 C.E. These trends are consistent with variations in ice core methane isotopes and carbon monoxide isotopes (Ferretti et al., 2005, Wang et al., 2010), which indicate changes in biomass burning emissions over this time period. These preliminary data suggest that Antarctic ice core bubbles contain paleoatmospheric ethane levels. With further improvement of laboratory techniques it appears

  13. Best estimate plus uncertainty analysis of departure from nucleate boiling limiting case with CASL core simulator VERA-CS in response to PWR main steam line break event

    DOE PAGES

    Brown, Cameron S.; Zhang, Hongbin; Kucukboyaci, Vefa; ...

    2016-09-07

    VERA-CS (Virtual Environment for Reactor Applications, Core Simulator) is a coupled neutron transport and thermal-hydraulics subchannel code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). VERA-CS was used to simulate a typical pressurized water reactor (PWR) full core response with 17x17 fuel assemblies for a main steam line break (MSLB) accident scenario with the most reactive rod cluster control assembly stuck out of the core. The accident scenario was initiated at the hot zero power (HZP) at the end of the first fuel cycle with return to power state points that were determined by a system analysis code and the most limiting state point was chosen for core analysis. The best estimate plus uncertainty (BEPU) analysis method was applied using Wilks’ nonparametric statistical approach. In this way, 59 full core simulations were performed to provide the minimum departure from nucleate boiling ratio (MDNBR) at the 95/95 (95% probability with 95% confidence level) tolerance limit. The results show that this typical PWR core remains within MDNBR safety limits for the MSLB accident.
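
    The 59-run figure quoted above follows from first-order, one-sided Wilks sampling: the smallest number of code runs N such that the most limiting of the N results bounds the 95th percentile with 95% confidence, i.e. 1 - 0.95^N >= 0.95. A minimal check of that arithmetic (a generic illustration, not the CASL workflow itself):

    import math

    def wilks_first_order(coverage=0.95, confidence=0.95):
        """Smallest N such that the extreme of N random runs bounds the `coverage`
        quantile with probability `confidence` (one-sided, first order)."""
        return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

    print(wilks_first_order())  # -> 59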

  14. Best estimate plus uncertainty analysis of departure from nucleate boiling limiting case with CASL core simulator VERA-CS in response to PWR main steam line break event

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Cameron S.; Zhang, Hongbin; Kucukboyaci, Vefa

    VERA-CS (Virtual Environment for Reactor Applications, Core Simulator) is a coupled neutron transport and thermal-hydraulics subchannel code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). VERA-CS was used to simulate a typical pressurized water reactor (PWR) full core response with 17x17 fuel assemblies for a main steam line break (MSLB) accident scenario with the most reactive rod cluster control assembly stuck out of the core. The accident scenario was initiated at the hot zero power (HZP) at the end of the first fuel cycle with return to power state points that were determined by a system analysis code and the most limiting state point was chosen for core analysis. The best estimate plus uncertainty (BEPU) analysis method was applied using Wilks’ nonparametric statistical approach. In this way, 59 full core simulations were performed to provide the minimum departure from nucleate boiling ratio (MDNBR) at the 95/95 (95% probability with 95% confidence level) tolerance limit. The results show that this typical PWR core remains within MDNBR safety limits for the MSLB accident.

  15. A New Resource for College Distance Education Astronomy Laboratory Exercises

    ERIC Educational Resources Information Center

    Vogt, Nicole P.; Cook, Stephen P.; Muise, Amy Smith

    2013-01-01

    This article introduces a set of distance education astronomy laboratory exercises for use by college students and instructors and discusses first usage results. This General Astronomy Education Source exercise set contains eight two-week projects designed to guide students through both core content and mathematical applications of general…

  16. Intraoral laser welding: ultrastructural and mechanical analysis to compare laboratory laser and dental laser.

    PubMed

    Fornaini, Carlo; Passaretti, Francesca; Villa, Elena; Rocca, Jean-Paul; Merigo, Elisabetta; Vescovi, Paolo; Meleti, Marco; Manfredi, Maddalena; Nammour, Samir

    2011-07-01

    The Nd:YAG laser has been used since 1970 in dental laboratories to weld metals on dental prostheses. Recently in several clinical cases, we have suggested that the Nd:YAG laser device commonly utilized in the dental office could be used to repair broken fixed, removable and orthodontic prostheses and to weld metals directly in the mouth. The aim of this work was to evaluate, using scanning electron microscopy (SEM), energy dispersive X-ray spectroscopy (EDS) and dynamic mechanical analysis (DMA), the quality of the weld and its mechanical strength, comparing a device normally used in the dental laboratory and a device normally used in the dental office for oral surgery, the same as that described for intraoral welding. Metal plates of a Co-Cr-Mo dental alloy and steel orthodontic wires were subjected to four welding procedures: welding without filler metal using the laboratory laser, welding with filler metal using the laboratory laser, welding without filler metal using the office laser, and welding with filler metal using the office laser. The welded materials were then analysed by SEM, EDS and DMA. SEM analysis did not show significant differences between the samples, although the plates welded using the office laser without filler metal showed a greater number of fissures than the other samples. EDS microanalysis of the welding zone showed a homogeneous composition of the metals. Mechanical tests showed similar elastic behaviours of the samples, with minimal differences between the samples welded with the two devices. No wire broke even under the maximum force applied by the analyser. This study seems to demonstrate that the welds produced using the office Nd:YAG laser device and the laboratory Nd:YAG laser device, as analysed by SEM, EDS and DMA, showed minimal and nonsignificant differences, although these findings need to be confirmed using a greater number of samples.

  17. So These Numbers Really Mean Something? A Role Playing Scenario-Based Approach to the Undergraduate Instrumental Analysis Laboratory

    ERIC Educational Resources Information Center

    Grannas, Amanda M.; Lagalante, Anthony F.

    2010-01-01

    A new curricular approach in our undergraduate second-year instrumental analysis laboratory was implemented. Students work collaboratively on scenarios in diverse fields including pharmaceuticals, forensics, gemology, art conservation, and environmental chemistry. Each laboratory section (approximately 12 students) is divided into three groups…

  18. The contaminant analysis automation robot implementation for the automated laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Younkin, J.R.; Igou, R.E.; Urenda, T.D.

    1995-12-31

    The Contaminant Analysis Automation (CAA) project defines the automated laboratory as a series of standard laboratory modules (SLM) serviced by a robotic standard support module (SSM). These SLMs are designed to allow plug-and-play integration into automated systems that perform standard analysis methods (SAM). While the SLMs are autonomous in the execution of their particular chemical processing task, the SAM concept relies on a high-level task sequence controller (TSC) to coordinate the robotic delivery of materials requisite for SLM operations, initiate an SLM operation with the chemical-method-dependent operating parameters, and coordinate the robotic removal of materials from the SLM when its operation is complete, readying them for transport. The Supervisor and Subsystems (GENISAS) software governs events from the SLMs and robot. The Intelligent System Operating Environment (ISOE) enables the inter-process communications used by GENISAS. CAA selected the Hewlett-Packard Optimized Robot for Chemical Analysis (ORCA) and its associated Windows-based Methods Development Software (MDS) as the robot SSM. The MDS software is used to teach the robot each SLM position and the required material port motions. To allow the TSC to command these SLM motions, a hardware and software implementation was required that allowed message passing between different operating systems. This implementation involved the use of a Virtual Memory Extended (VME) rack with a Force CPU-30 computer running VxWorks, a real-time multitasking operating system, and a RadiSys PC-compatible VME computer running MDS. A GENISAS server on the Force computer accepts a transport command from the TSC, a GENISAS supervisor, over Ethernet and notifies software on the RadiSys PC of the pending command through VMEbus shared memory. The command is then delivered to the MDS robot control software using a Windows Dynamic Data Exchange conversation.

  19. Digital core based transmitted ultrasonic wave simulation and velocity accuracy analysis

    NASA Astrophysics Data System (ADS)

    Zhu, Wei; Shan, Rui

    2016-06-01

    Transmitted ultrasonic wave simulation (TUWS) in a digital core is one of the important elements of digital rock physics and is used to study wave propagation in porous cores and calculate equivalent velocity. When simulating wave propagation in a 3D digital core, two additional layers are attached to the two surfaces perpendicular to the wave direction, and one planar wave source and two receiver arrays are properly installed. After source excitation, the two receivers record the incident and transmitted waves of the digital rock. The wave propagation velocity, which is taken as the velocity of the digital core, is computed from the picked peak-time difference between the two recorded waves. To evaluate the accuracy of TUWS, a digital core is fully saturated with gas, oil, and water to calculate the corresponding velocities. The velocities increase with decreasing wave frequencies in the simulation frequency band, and this is considered to be the result of scattering. When the pore fluids are varied from gas to oil and finally to water, the velocity-variation characteristics between the different frequencies are similar, thereby approximately following the variation law of velocities obtained from linear elastic statics simulation (LESS), although their absolute values are different. However, LESS has been widely used. The results of this paper show that the transmitted ultrasonic simulation has high relative precision.
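
    A minimal sketch of the velocity pick described above (the peak-time difference between the incident and transmitted traces), with a pair of synthetic pulses standing in for the simulated records; the pulse shapes, sampling interval and travel length are assumptions chosen only for illustration:

    import numpy as np

    def core_velocity(incident, transmitted, dt, travel_length):
        """Equivalent velocity of a digital core from two recorded traces.

        dt            : sampling interval of the simulation, s
        travel_length : distance between the incident and transmitted receivers, m
        The pick is simply the sample index of each trace's peak amplitude.
        """
        t_in = np.argmax(np.abs(incident)) * dt
        t_out = np.argmax(np.abs(transmitted)) * dt
        return travel_length / (t_out - t_in)

    # Synthetic example: two Gaussian pulses 20 samples apart, dt = 10 ns, 0.5 mm path
    n = np.arange(256)
    incident = np.exp(-((n - 60) / 6.0) ** 2)
    transmitted = 0.6 * np.exp(-((n - 80) / 6.0) ** 2)
    print(core_velocity(incident, transmitted, dt=1e-8, travel_length=0.0005))  # ~2500 m/s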

  20. Precision and manufacturing at the Lawrence Livermore National Laboratory

    NASA Technical Reports Server (NTRS)

    Saito, Theodore T.; Wasley, Richard J.; Stowers, Irving F.; Donaldson, Robert R.; Thompson, Daniel C.

    1994-01-01

    Precision Engineering is one of the Lawrence Livermore National Laboratory's core strengths. This paper discusses the past and current technology transfer efforts of LLNL's Precision Engineering program and the Livermore Center for Advanced Manufacturing and Productivity (LCAMP). More than a year ago the Precision Machine Commercialization project embodied several successful methods of transferring high technology from the National Laboratories to industry. Currently, LCAMP has already demonstrated successful technology transfer and is involved in a broad spectrum of current programs. In addition, this paper discusses other technologies ripe for future transition, including the Large Optics Diamond Turning Machine.

  1. Precision and manufacturing at the Lawrence Livermore National Laboratory

    NASA Astrophysics Data System (ADS)

    Saito, Theodore T.; Wasley, Richard J.; Stowers, Irving F.; Donaldson, Robert R.; Thompson, Daniel C.

    1994-02-01

    Precision Engineering is one of the Lawrence Livermore National Laboratory's core strengths. This paper discusses the past and current technology transfer efforts of LLNL's Precision Engineering program and the Livermore Center for Advanced Manufacturing and Productivity (LCAMP). More than a year ago the Precision Machine Commercialization project embodied several successful methods of transferring high technology from the National Laboratories to industry. Currently, LCAMP has already demonstrated successful technology transfer and is involved in a broad spectrum of current programs. In addition, this paper discusses other technologies ripe for future transition, including the Large Optics Diamond Turning Machine.

  2. Cone Calorimeter Analysis of FRT Intumescent and Untreated Foam Core Particleboards

    Treesearch

    Mark A. Dietenberger; Ali Shalbafan; Johannes Welling; Charles Boardman

    2012-01-01

    The effectiveness of surface-layer treatments of novel foam core particleboards was evaluated by means of Cone calorimeter tests. Foam core particleboards with variations of surface layer treatment, adhesives and surface layer thickness, produced under similar processing conditions, were used as test specimens for the Cone calorimeter tests. Ignitability,...

  3. Core-core and core-valence correlation energy atomic and molecular benchmarks for Li through Ar

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ranasinghe, Duminda S.; Frisch, Michael J.; Petersson, George A., E-mail: gpetersson@wesleyan.edu

    2015-12-07

    We have established benchmark core-core, core-valence, and valence-valence absolute coupled-cluster single double (triple) correlation energies (±0.1%) for 210 species covering the first- and second-rows of the periodic table. These species provide 194 energy differences (±0.03 mE_h) including ionization potentials, electron affinities, and total atomization energies. These results can be used for calibration of less expensive methodologies for practical routine determination of core-core and core-valence correlation energies.
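
    As context for the quantities benchmarked above, the combined core contribution is conventionally obtained by differencing correlation energies computed with and without the core electrons frozen; the partition written below is the standard one and is stated here as an assumption rather than as the authors' exact working definition:

        \Delta E_{\mathrm{CC+CV}} = E_{\mathrm{corr}}^{\text{all-electron}} - E_{\mathrm{corr}}^{\text{frozen-core}},
        \qquad
        \Delta E_{\mathrm{CV}} = \Delta E_{\mathrm{CC+CV}} - \Delta E_{\mathrm{CC}},

    where \Delta E_{\mathrm{CC}}, the core-core part, can be isolated from a calculation that correlates only the core electrons.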

  4. Preliminary engineering design of sodium-cooled CANDLE core

    NASA Astrophysics Data System (ADS)

    Takaki, Naoyuki; Namekawa, Azuma; Yoda, Tomoyuki; Mizutani, Akihiko; Sekimoto, Hiroshi

    2012-06-01

    The CANDLE burning process is characterized by the autonomous shifting of the burning region with constant reactivity and constant spatial power distribution. Evaluating such a critical burning process with widely used neutron diffusion and burnup codes under realistic engineering constraints is valuable to confirm the technical feasibility of the CANDLE concept and to put the idea into a concrete core design. The first part of this paper discusses whether the sustainable and stable CANDLE burning process can be reproduced even with conventional core analysis tools such as SLAROM and CITATION-FBR. As a result, it is certainly possible to demonstrate it if the proper core configuration and initial fuel composition required for a CANDLE core are applied in the analysis. In the latter part, an example of a concrete sodium-cooled, metal-fuel, 2000 MWt CANDLE core is presented, assuming an emerging and inevitable technology of recladding. The core satisfies engineering design criteria including cladding temperature, pressure drop, linear heat rate, cumulative damage fraction (CDF) of cladding, fast neutron fluence and sodium void reactivity, as defined in the Japanese FBR design project. It can be concluded that it is feasible to design a CANDLE core using conventional codes while satisfying realistic engineering design constraints, assuming that recladding at a certain time interval is technically feasible.

  5. EPA Environmental Chemistry Laboratory

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Environmental Protection Agency's (EPA) Chemistry Laboratory (ECL) is a national program laboratory specializing in residue chemistry analysis under the jurisdiction of the EPA's Office of Pesticide Programs in Washington, D.C. At Stennis Space Center, the laboratory's work supports many federal anti-pollution laws. The laboratory analyzes environmental and human samples to determine the presence and amount of agricultural chemicals and related substances. Pictured, ECL chemists analyze environmental and human samples for the presence of pesticides and other pollutants.

  6. Supplement analysis for continued operation of Lawrence Livermore National Laboratory and Sandia National Laboratories, Livermore. Volume 2: Comment response document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1999-03-01

    The US Department of Energy (DOE) prepared a draft Supplement Analysis (SA) for Continued Operation of Lawrence Livermore National Laboratory (LLNL) and Sandia National Laboratories, Livermore (SNL-L), in accordance with DOE's requirements for implementation of the National Environmental Policy Act of 1969 (NEPA) (10 Code of Federal Regulations [CFR] Part 1021.314). It considers whether the Final Environmental Impact Statement and Environmental Impact Report for Continued Operation of Lawrence Livermore National Laboratory and Sandia National Laboratories, Livermore (1992 EIS/EIR) should be supplemented, whether a new environmental impact statement (EIS) should be prepared, or whether no further NEPA documentation is required. The SA examines the current project and program plans and proposals for LLNL and SNL-L operations to identify new or modified projects or operations, or new information, for the period from 1998 to 2002 that was not considered in the 1992 EIS/EIR. When such changes, modifications, and information are identified, they are examined to determine whether they could be considered substantial or significant in reference to the 1992 proposed action and the 1993 Record of Decision (ROD). DOE released the draft SA to the public to obtain stakeholder comments and to consider those comments in the preparation of the final SA. DOE distributed copies of the draft SA to those who were known to have an interest in LLNL or SNL-L activities in addition to those who requested a copy. In response to comments received, DOE prepared this Comment Response Document.

  7. GALFIT-CORSAIR: Implementing the Core-Sérsic Model Into GALFIT

    NASA Astrophysics Data System (ADS)

    Bonfini, Paolo

    2014-10-01

    We introduce GALFIT-CORSAIR: a publicly available, fully retro-compatible modification of the 2D fitting software GALFIT (v.3) that adds an implementation of the core-Sersic model. We demonstrate the software by fitting the images of NGC 5557 and NGC 5813, which have been previously identified as core-Sersic galaxies by their 1D radial light profiles. These two examples are representative of different dust obscuration conditions, and of bulge/disk decomposition. To perform the analysis, we obtained deep Hubble Legacy Archive (HLA) mosaics in the F555W filter (~V-band). We successfully reproduce the results of the previous 1D analysis, modulo the intrinsic differences between the 1D and the 2D fitting procedures. The code and the analysis procedure described here have been developed for the first coherent 2D analysis of a sample of core-Sersic galaxies, which will be presented in a forthcoming paper. As the 2D analysis provides better constraints on multi-component fitting, and is fully seeing-corrected, it will yield complementary constraints on the missing mass in depleted galaxy cores.
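
    For readers unfamiliar with the profile being added to GALFIT, the core-Sersic surface-brightness law, in the form commonly quoted in the literature (e.g. Graham et al. 2003) and reproduced here from general knowledge rather than from this paper, is

        I(r) = I'\left[1 + \left(\frac{r_b}{r}\right)^{\alpha}\right]^{\gamma/\alpha}
               \exp\left\{-b\left[\frac{r^{\alpha} + r_b^{\alpha}}{r_e^{\alpha}}\right]^{1/(\alpha n)}\right\},

    where r_b is the break (core) radius, \gamma the inner power-law slope, \alpha the sharpness of the transition, and n and r_e the outer Sersic index and effective radius.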

  8. Evidence for a core gut microbiota in the zebrafish

    PubMed Central

    Roeselers, Guus; Mittge, Erika K; Stephens, W Zac; Parichy, David M; Cavanaugh, Colleen M; Guillemin, Karen; Rawls, John F

    2011-01-01

    Experimental analysis of gut microbial communities and their interactions with vertebrate hosts is conducted predominantly in domesticated animals that have been maintained in laboratory facilities for many generations. These animal models are useful for studying coevolved relationships between host and microbiota only if the microbial communities that occur in animals in lab facilities are representative of those that occur in nature. We performed 16S rRNA gene sequence-based comparisons of gut bacterial communities in zebrafish collected recently from their natural habitat and those reared for generations in lab facilities in different geographic locations. Patterns of gut microbiota structure in domesticated zebrafish varied across different lab facilities in correlation with historical connections between those facilities. However, gut microbiota membership in domesticated and recently caught zebrafish was strikingly similar, with a shared core gut microbiota. The zebrafish intestinal habitat therefore selects for specific bacterial taxa despite radical differences in host provenance and domestication status. PMID:21472014

  9. Analysis and Presentation of Cumulative Antimicrobial Susceptibility Test Data--The Influence of Different Parameters in a Routine Clinical Microbiology Laboratory.

    PubMed

    Kohlmann, Rebekka; Gatermann, Sören G

    2016-01-01

    Many clinical microbiology laboratories report on cumulative antimicrobial susceptibility testing (cAST) data on a regular basis. Criteria for generation of cAST reports, however, are often obscure and inconsistent. Whereas the CLSI has published a guideline for analysis and presentation of cAST data, national guidelines directed at clinical microbiology laboratories are not available in Europe. Thus, we sought to describe the influence of different parameters in the process of cAST data analysis in the setting of a German routine clinical microbiology laboratory during 2 consecutive years. We developed various program scripts to assess the consequences ensuing from different algorithms for calculation of cumulative antibiograms from the data collected in our clinical microbiology laboratory in 2013 and 2014. One of the most pronounced effects was caused by exclusion of screening cultures for multi-drug resistant organisms which decreased the MRSA rate in some cases to one third. Dependent on the handling of duplicate isolates, i.e. isolates of the same species recovered from successive cultures on the same patient during the time period analyzed, we recorded differences in resistance rates of up to 5 percentage points for S. aureus, E. coli and K. pneumoniae and up to 10 percentage points for P. aeruginosa. Stratification by site of care and specimen type, testing of antimicrobials selectively on resistant isolates, change of interpretation rules and analysis at genus level instead of species level resulted in further changes of calculated antimicrobial resistance rates. The choice of parameters for cAST data analysis may have a substantial influence on calculated antimicrobial resistance rates. Consequently, comparability of cAST reports from different clinical microbiology laboratories may be limited. We suggest that laboratories communicate the strategy used for cAST data analysis as long as national guidelines for standardized cAST data analysis and reporting
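
    The duplicate-isolate handling discussed above is often reduced to a "first isolate per patient and species per analysis period" rule, as recommended for example in CLSI M39. The sketch below shows that filter with an assumed, simplified record layout; it illustrates the general algorithm, not the program scripts used by the authors.

    from datetime import date

    # Assumed record layout: (patient_id, species, collection_date, is_susceptible)
    isolates = [
        ("P1", "E. coli", date(2014, 1, 5), True),
        ("P1", "E. coli", date(2014, 3, 2), False),   # later duplicate: excluded
        ("P1", "K. pneumoniae", date(2014, 2, 1), True),
        ("P2", "E. coli", date(2014, 6, 9), False),
    ]

    def first_isolates(records):
        """Keep only the first isolate of each species per patient in the period."""
        seen, kept = set(), []
        for rec in sorted(records, key=lambda r: r[2]):
            key = (rec[0], rec[1])
            if key not in seen:
                seen.add(key)
                kept.append(rec)
        return kept

    kept = first_isolates(isolates)
    ecoli = [r for r in kept if r[1] == "E. coli"]
    print(f"E. coli susceptible: {100 * sum(r[3] for r in ecoli) / len(ecoli):.0f}%")  # 50%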

  10. In-situ stress measurements using core-based methods in the vicinity of Nojima fault.

    NASA Astrophysics Data System (ADS)

    Yano, S.; Sugimoto, T.; Lin, W.; Lin, A.

    2017-12-01

    In the cycle of repeated earthquake occurrence, stress accumulates at the source fault and its surroundings during the interseismic period until the next earthquake and is released abruptly when the earthquake occurs. However, the quantitative relationship between stress change and earthquake occurrence is still largely unknown. Hence, in order to improve our understanding of the mechanisms of earthquake generation, it is important to grasp the stress state in the vicinity of the source fault and to evaluate its change over time. In this study, we carried out in-situ stress measurements using core samples obtained from a scientific drilling that penetrated the Nojima fault, which ruptured and caused the 1995 Hyogo-ken Nanbu earthquake, Japan. Our stress measurements were conducted from 2016 to 2017, 22 years after the earthquake. For this purpose, we applied the Anelastic Strain Recovery (ASR) method and Diametrical Core Deformation Analysis (DCDA). First, we measure the time-dependent ASR of the cores soon after stress release and calculate the three-dimensional principal in-situ stress orientations and magnitudes from the ASR data. To ensure a sufficient amount of ASR, we conducted the measurements using cores collected within a short time (e.g. 2.5 - 3.5 hours) after stress release by drilling, at an on-site laboratory at the drilling site on Awaji Island, Japan. The site is located at the southwest part of the Nojima fault. In DCDA, we measure the core diameters over all (360°) azimuths and determine the difference between the two horizontal principal stresses and their orientation, using cores other than those used for ASR. The DCDA experiments were conducted indoors, a long time after core collection. The lithology of all core samples used for ASR and DCDA is granite; 19 and 7 cores were used for ASR and DCDA, respectively. As a result, it was found that the stress state in the depth range of 500 - 560 m and

  11. Probing the Reproducibility of Leaf Growth and Molecular Phenotypes: A Comparison of Three Arabidopsis Accessions Cultivated in Ten Laboratories

    PubMed Central

    Massonnet, Catherine; Vile, Denis; Fabre, Juliette; Hannah, Matthew A.; Caldana, Camila; Lisec, Jan; Beemster, Gerrit T.S.; Meyer, Rhonda C.; Messerli, Gaëlle; Gronlund, Jesper T.; Perkovic, Josip; Wigmore, Emma; May, Sean; Bevan, Michael W.; Meyer, Christian; Rubio-Díaz, Silvia; Weigel, Detlef; Micol, José Luis; Buchanan-Wollaston, Vicky; Fiorani, Fabio; Walsh, Sean; Rinn, Bernd; Gruissem, Wilhelm; Hilson, Pierre; Hennig, Lars; Willmitzer, Lothar; Granier, Christine

    2010-01-01

    A major goal of the life sciences is to understand how molecular processes control phenotypes. Because understanding biological systems relies on the work of multiple laboratories, biologists implicitly assume that organisms with the same genotype will display similar phenotypes when grown in comparable conditions. We investigated to what extent this holds true for leaf growth variables and metabolite and transcriptome profiles of three Arabidopsis (Arabidopsis thaliana) genotypes grown in 10 laboratories using a standardized and detailed protocol. A core group of four laboratories generated similar leaf growth phenotypes, demonstrating that standardization is possible. But some laboratories presented significant differences in some leaf growth variables, sometimes changing the genotype ranking. Metabolite profiles derived from the same leaf displayed a strong genotype × environment (laboratory) component. Genotypes could be separated on the basis of their metabolic signature, but only when the analysis was limited to samples derived from one laboratory. Transcriptome data revealed considerable plant-to-plant variation, but the standardization ensured that interlaboratory variation was not considerably larger than intralaboratory variation. The different impacts of the standardization on phenotypes and molecular profiles could result from differences of temporal scale between processes involved at these organizational levels. Our findings underscore the challenge of describing, monitoring, and precisely controlling environmental conditions but also demonstrate that dedicated efforts can result in reproducible data across multiple laboratories. Finally, our comparative analysis revealed that small variations in growing conditions (light quality principally) and handling of plants can account for significant differences in phenotypes and molecular profiles obtained in independent laboratories. PMID:20200072

  12. Industry Application ECCS / LOCA Integrated Cladding/Emergency Core Cooling System Performance: Demonstration of LOTUS-Baseline Coupled Analysis of the South Texas Plant Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Hongbin; Szilard, Ronaldo; Epiney, Aaron

    Under the auspices of the DOE LWRS Program RISMC Industry Application ECCS/LOCA, INL has engaged staff from both the South Texas Project (STP) and Texas A&M University (TAMU) to produce a generic pressurized water reactor (PWR) model including reactor core, clad/fuel design and systems thermal-hydraulics based on the South Texas Project (STP) nuclear power plant, a 4-Loop Westinghouse PWR. A RISMC toolkit, named LOCA Toolkit for the U.S. (LOTUS), has been developed for use in this generic PWR plant model to assess safety margins for the proposed NRC 10 CFR 50.46c rule, Emergency Core Cooling System (ECCS) performance during LOCA. This demonstration includes coupled analysis of core design, fuel design, thermal-hydraulics and systems analysis, using advanced risk analysis tools and methods to investigate a wide range of results. Within this context, a multi-physics best estimate plus uncertainty (MPBEPU) methodology framework is proposed.

  13. Exploring cosmic origins with CORE: Mitigation of systematic effects

    NASA Astrophysics Data System (ADS)

    Natoli, P.; Ashdown, M.; Banerji, R.; Borrill, J.; Buzzelli, A.; de Gasperis, G.; Delabrouille, J.; Hivon, E.; Molinari, D.; Patanchon, G.; Polastri, L.; Tomasi, M.; Bouchet, F. R.; Henrot-Versillé, S.; Hoang, D. T.; Keskitalo, R.; Kiiveri, K.; Kisner, T.; Lindholm, V.; McCarthy, D.; Piacentini, F.; Perdereau, O.; Polenta, G.; Tristram, M.; Achucarro, A.; Ade, P.; Allison, R.; Baccigalupi, C.; Ballardini, M.; Banday, A. J.; Bartlett, J.; Bartolo, N.; Basak, S.; Baumann, D.; Bersanelli, M.; Bonaldi, A.; Bonato, M.; Boulanger, F.; Brinckmann, T.; Bucher, M.; Burigana, C.; Cai, Z.-Y.; Calvo, M.; Carvalho, C.-S.; Castellano, M. G.; Challinor, A.; Chluba, J.; Clesse, S.; Colantoni, I.; Coppolecchia, A.; Crook, M.; D'Alessandro, G.; de Bernardis, P.; De Zotti, G.; Di Valentino, E.; Diego, J.-M.; Errard, J.; Feeney, S.; Fernandez-Cobos, R.; Finelli, F.; Forastieri, F.; Galli, S.; Genova-Santos, R.; Gerbino, M.; González-Nuevo, J.; Grandis, S.; Greenslade, J.; Gruppuso, A.; Hagstotz, S.; Hanany, S.; Handley, W.; Hernandez-Monteagudo, C.; Hervías-Caimapo, C.; Hills, M.; Keihänen, E.; Kitching, T.; Kunz, M.; Kurki-Suonio, H.; Lamagna, L.; Lasenby, A.; Lattanzi, M.; Lesgourgues, J.; Lewis, A.; Liguori, M.; López-Caniego, M.; Luzzi, G.; Maffei, B.; Mandolesi, N.; Martinez-González, E.; Martins, C. J. A. P.; Masi, S.; Matarrese, S.; Melchiorri, A.; Melin, J.-B.; Migliaccio, M.; Monfardini, A.; Negrello, M.; Notari, A.; Pagano, L.; Paiella, A.; Paoletti, D.; Piat, M.; Pisano, G.; Pollo, A.; Poulin, V.; Quartin, M.; Remazeilles, M.; Roman, M.; Rossi, G.; Rubino-Martin, J.-A.; Salvati, L.; Signorelli, G.; Tartari, A.; Tramonte, D.; Trappe, N.; Trombetti, T.; Tucker, C.; Valiviita, J.; Van de Weijgaert, R.; van Tent, B.; Vennin, V.; Vielva, P.; Vittorio, N.; Wallis, C.; Young, K.; Zannoni, M.

    2018-04-01

    We present an analysis of the main systematic effects that could impact the measurement of CMB polarization with the proposed CORE space mission. We employ timeline-to-map simulations to verify that the CORE instrumental set-up and scanning strategy allow us to measure sky polarization to a level of accuracy adequate to the mission science goals. We also show how the CORE observations can be processed to mitigate the level of contamination by potentially worrying systematics, including intensity-to-polarization leakage due to bandpass mismatch, asymmetric main beams, pointing errors and correlated noise. We use analysis techniques that are well validated on data from current missions such as Planck to demonstrate how the residual contamination of the measurements by these effects can be brought to a level low enough not to hamper the scientific capability of the mission, nor significantly increase the overall error budget. We also present a prototype of the CORE photometric calibration pipeline, based on that used for Planck, and discuss its robustness to systematics, showing how CORE can achieve its calibration requirements. While a fine-grained assessment of the impact of systematics requires a level of knowledge of the system that can only be achieved in a future study phase, the analysis presented here strongly suggests that the main areas of concern for the CORE mission can be addressed using existing knowledge, techniques and algorithms.

  14. The Invention Factory: Thomas Edison's Laboratories. Teaching with Historic Places.

    ERIC Educational Resources Information Center

    Bolger, Benjamin

    This lesson explores the group of buildings in West Orange, New Jersey, built in 1887, that formed the core of Thomas Edison's research and development complex. The complex consisted of chemistry, physics, and metallurgy laboratories; a machine shop; a pattern shop; a research library; and rooms for experiments. The lesson explains that the prototypes (ideas…

  15. An Ill-Structured PBL-Based Microprocessor Course without Formal Laboratory

    ERIC Educational Resources Information Center

    Kim, Jungkuk

    2012-01-01

    This paper introduces a problem-based learning (PBL) microprocessor application course designed according to the following strategies: 1) hands-on training without having a formal laboratory, and 2) intense student-centered cooperative learning through an ill-structured problem. PBL was adopted as the core educational technique of the course to…

  16. Laboratory experiments on liquid fragmentation during Earth's core formation

    NASA Astrophysics Data System (ADS)

    Landeau, M.; Deguen, R.; Olson, P.

    2013-12-01

    Buoyancy-driven fragmentation of one liquid in another immiscible liquid likely occurred on a massive scale during the formation of the Earth, when dense liquid metal blobs were released within deep molten silicate magma oceans. Another example of this phenomenon is the sudden release of petroleum into the ocean during the Deepwater Horizon disaster (Gulf of Mexico, 2010). We present experiments on the instability and fragmentation of blobs of a heavy liquid released into a lighter immiscible liquid. During the fragmentation process, we observe deformation of the released fluid, formation of filamentary structures, capillary instability, and eventually drop formation. We find that, at low and intermediate Weber numbers (the Weber number measures the importance of inertia relative to surface tension), the fragmentation regime mainly results from the competition between a Rayleigh-Taylor instability and the roll-up of a vortex ring. At sufficiently high Weber numbers (the relevant regime for core formation), the fragmentation process becomes turbulent. The large-scale flow then behaves as a turbulent vortex ring or a turbulent thermal: it forms a coherent structure whose shape remains self-similar during the fall and which grows by turbulent entrainment of ambient fluid. An integral model based on the entrainment assumption, and adapted to buoyant vortex rings with initial momentum, is consistent with our experimental data. This indicates that the concept of turbulent entrainment is valid for non-dispersed immiscible fluids at large Weber and Reynolds numbers. [Figure: series of photographs of the turbulent fragmentation regime at time intervals of about 0.2 s; magnified portions are indicated by red boxes on the right.]
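
    The entrainment-based integral model mentioned in the abstract can be sketched as a simple time integration in which the falling blob grows by entraining ambient fluid at a rate proportional to its surface area and fall speed, while its total buoyancy is conserved. The entrainment coefficient, densities, initial radius, and time step below are illustrative laboratory-scale assumptions, not the authors' fitted values.

      # Turbulent-thermal entrainment sketch (simplified; illustrative values only).
      import math

      alpha = 0.25                      # assumed entrainment coefficient
      g = 9.81                          # gravity (m/s^2)
      rho_m, rho_s = 7000.0, 3500.0     # "metal" and "silicate" analog densities (kg/m^3)
      r0 = 0.05                         # initial blob radius (m), illustrative
      V0 = (4.0 / 3.0) * math.pi * r0**3

      V, momentum, z = V0, 0.0, 0.0
      F_buoy = (rho_m - rho_s) * g * V0   # buoyancy force, conserved here for simplicity
      dt = 1e-4

      for _ in range(int(5.0 / dt)):                     # integrate 5 s of fall
          mass = rho_m * V0 + rho_s * (V - V0)           # released metal plus entrained silicate
          w = momentum / mass                            # fall speed
          r = (3.0 * V / (4.0 * math.pi)) ** (1.0 / 3.0)
          V += alpha * 4.0 * math.pi * r**2 * abs(w) * dt  # entrainment: dV/dt = alpha*A*|w|
          momentum += F_buoy * dt                          # d(m w)/dt = buoyancy force
          z += w * dt

      print(f"after 5 s: depth {z:.2f} m, radius {r:.3f} m (radius grows roughly linearly with depth)")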

  17. Description and Analysis of Core Samples: The Lunar Experience

    NASA Technical Reports Server (NTRS)

    McKay, David S.; Allton, Judith H.

    1997-01-01

    Although no samples have yet been returned from a comet, extensive experience exists from sampling another solar system body, the Moon. While in overall structure, composition, and physical properties the Moon bears little resemblance to what is expected for a comet, sampling the Moon has provided basic lessons that may be equally applicable to cometary samples. In particular, an extensive series of core samples has been taken on the Moon, and coring is the best way to sample a comet in three dimensions. Data from cores taken at 24 Apollo collection stations and 3 Luna sites have been used to provide insight into the evolution of the lunar regolith. It is now well understood that this regolith is very complex and reflects gardening (stirring of grains by micrometeorites), erosion (from impacts and solar wind sputtering), maturation (exposure on the bare lunar surface to solar wind ions and micrometeorite impacts), comminution of coarse grains into finer grains, deposition of coarse-grained blanket layers, and other processes. All of these processes have been documented in cores. While a cometary regolith should not be expected to parallel the lunar regolith in detail, the upper part of a cometary regolith may include textural, mineralogical, and chemical features that reflect the original accretion of the comet, including a form of gardening. Differences in relative velocities and gravitational attraction no doubt made this accretionary gardening qualitatively much different from the lunar version. Furthermore, at least some comets, depending on their orbits, have been subjected to impacts on their uppermost surface by small projectiles at some time in their history. Consequently, a more recent post-accretional gardening may have occurred. Finally, for comets that approach the Sun, large-scale erosion driven by gas loss may have occurred. The uppermost material of these comets may reflect some of the features

  18. Fuel Breeding and Core Behavior Analyses on In Core Fuel Management of Water Cooled Thorium Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Permana, Sidik; Department of Physics, Bandung Institute of Technology, Gedung Fisika, Jl. Ganesha 10, Bandung 40132; Sekimoto, Hiroshi

    2010-12-23

    The thorium fuel cycle with recycled U-233 is widely recognized as improving the prospects of a water-cooled breeder reactor: earlier work identified a feasible region of simultaneous breeding and negative void reactivity, confirming that 233U as the main fissile material supports fuel breeding and an effective negative void reactivity coefficient. The present study estimates the effects of whole-core configuration and burnup on the core performance profile using a two-dimensional in-core fuel management model. A cycle length of roughly 40 months is employed for one fuel irradiation cycle of a three-batch fuel system in a large water-cooled thorium reactor. All fuel positions contribute to the total core conversion ratio, which is below unity at the beginning of cycle (BOC) and rises above unity (1.01) at the end of cycle (EOC) after irradiation. The inner and central regions provide the dominant breeding contribution as burnup increases, while criticality decreases with irradiation time. The breeding capability of the whole-core fuel arrangement is confirmed by the conversion ratio exceeding unity. Whole-core evaluation of the reactivity change caused by voiding was performed under the conservative assumption that the coolant and moderator are 100% voided. The void reactivity coefficient remains negative throughout operation; it is most negative at BOC (fresh fuel composition) and becomes less negative with increasing operation time. A negative void reactivity coefficient indicates good safety characteristics with respect to the reactivity profile, the main parameter in criticality safety analysis. Therefore
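
    The two figures of merit discussed in the abstract, the conversion ratio and the void reactivity coefficient, follow from standard definitions; the short sketch below evaluates them for illustrative placeholder reaction rates and k-effective values, not for the reported core.

      # Standard definitions of conversion ratio and void reactivity (placeholder inputs).

      def conversion_ratio(fissile_production_rate, fissile_destruction_rate):
          """CR > 1.0 indicates net breeding (e.g. Th-232 captures producing U-233)."""
          return fissile_production_rate / fissile_destruction_rate

      def void_reactivity(k_nominal, k_voided):
          """Reactivity change (in pcm) when the coolant/moderator is fully voided."""
          rho_nom = (k_nominal - 1.0) / k_nominal
          rho_void = (k_voided - 1.0) / k_voided
          return (rho_void - rho_nom) * 1.0e5   # pcm

      print(conversion_ratio(1.01, 1.00))       # > 1: net breeding, as reported at EOC
      print(void_reactivity(1.000, 0.985))      # negative: voiding reduces reactivity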

  19. Composite Cores

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Spang & Company's new configuration of converter transformer cores is a composite of gapped and ungapped cores assembled together in concentric relationship. The net effect of the composite design is to combine the protection from saturation offered by the gapped core with the lower magnetizing requirement of the ungapped core. The uncut core functions under normal operating conditions and the cut core takes over during abnormal operation to prevent power surges and their potentially destructive effect on transistors. Principal customers are aerospace and defense manufacturers. Cores also have applicability in commercial products where precise power regulation is required, as in the power supplies for large mainframe computers.
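
    The trade-off the composite design exploits can be quantified with standard magnetic-circuit formulas: introducing a gap lowers the effective permeability, which raises the magnetizing requirement, but greatly increases the ampere-turns needed to drive the core into saturation. The permeability, path length, gap, and saturation flux density below are illustrative assumptions, not figures from the article.

      # Gapped vs. ungapped core comparison using standard magnetic-circuit formulas
      # (illustrative material and geometry values).
      import math

      mu0 = 4.0e-7 * math.pi     # permeability of free space (H/m)
      mu_r = 10000.0             # assumed relative permeability of the core material
      l_core = 0.20              # magnetic path length (m)
      l_gap = 0.5e-3             # air gap length (m)
      B_sat = 1.5                # assumed saturation flux density (T)

      # Effective permeability drops sharply once a gap is introduced.
      mu_eff_gapped = mu_r / (1.0 + mu_r * l_gap / l_core)
      print(f"ungapped mu_r = {mu_r:.0f}, gapped mu_eff = {mu_eff_gapped:.0f}")

      # Ampere-turns needed to reach saturation flux density: N*I = H*l = B*l/(mu0*mu).
      NI_ungapped = B_sat * l_core / (mu0 * mu_r)
      NI_gapped = B_sat * l_core / (mu0 * mu_eff_gapped)
      print(f"NI to saturate: ungapped ~{NI_ungapped:.0f} A-turns, gapped ~{NI_gapped:.0f} A-turns")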

  20. Biological and Physical Space Research Laboratory 2002 Science Review

    NASA Technical Reports Server (NTRS)

    Curreri, P. A. (Editor); Robinson, M. B. (Editor); Murphy, K. L. (Editor)

    2003-01-01

    With the International Space Station Program approaching core complete, our NASA Headquarters sponsor, the new Code U Enterprise, Biological and Physical Research, is shifting its research emphasis from purely fundamental microgravity and biological sciences to strategic research aimed at enabling human missions beyond Earth orbit. Although we anticipate supporting microgravity research on the ISS for some time to come, our laboratory has been vigorously engaged in developing these new strategic research areas. This Technical Memorandum documents the internal science research at our laboratory as presented in a review to Dr. Ann Whitaker, MSFC Science Director, in July 2002. These presentations have been revised and updated as appropriate for this report. It provides a snapshot of the internal science capability of our laboratory as an aid to other NASA organizations and the external scientific community.