Omega Hawaii Antenna System: Modification and Validation Tests. Volume 2. Data Sheets.
1979-10-19
Data Sheet 5 (DS-5), Radio Field Intensity Measurements, Omega Station Hawaii: tabulated site readings (site numbers, field intensities, and distances), with the recurring comment "Not considered for a benchmark because of potential hotel construction."
Fingerprinting sea-level variations in response to continental ice loss: a benchmark exercise
NASA Astrophysics Data System (ADS)
Barletta, Valentina R.; Spada, Giorgio; Riva, Riccardo E. M.; James, Thomas S.; Simon, Karen M.; van der Wal, Wouter; Martinec, Zdenek; Klemann, Volker; Olsson, Per-Anders; Hagedoorn, Jan; Stocchi, Paolo; Vermeersen, Bert
2013-04-01
Understanding the response of the Earth to the waxing and waning of the ice sheets is crucial in various contexts, ranging from the interpretation of modern satellite geodetic measurements to projections of future sea-level trends in response to climate change. All the processes accompanying Glacial Isostatic Adjustment (GIA) can be described by solving the so-called Sea Level Equation (SLE), an integral equation that accounts for the interactions between the ice sheets, the solid Earth, and the oceans. Modern approaches to the SLE are based on various techniques that range from purely analytical formulations to fully numerical methods. Here we present the results of a benchmark exercise of independently developed codes designed to solve the SLE. The study involves predictions of current sea-level changes due to present-day ice mass loss. In spite of the differences in the methods employed, the comparison shows that a significant number of GIA modellers can reproduce their sea-level computations to within 2% for well-defined, large-scale present-day ice mass changes. Smaller and more detailed loads need further, dedicated benchmarking and high-resolution computation. This study shows that the details of the implementation and the input specifications are an important, and often underappreciated, aspect. Hence it represents a step toward assessing the reliability of sea-level projections obtained with benchmarked SLE codes.
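For reference, a schematic form of the SLE of the kind these codes solve, in the notation of Spada and Stocchi's formulation (stated here as background; individual benchmarked codes differ in details such as moving shorelines and rotational feedback):

```latex
% S: sea-level change; I: ice-thickness change; G_s: sea-level Green's function;
% rho_i, rho_w: ice and water densities; gamma: surface gravity;
% m_i: ice-mass change; A_o: ocean area;
% \otimes_i, \otimes_o: convolutions over ice and ocean; overbars: ocean averages.
S = \frac{\rho_i}{\gamma}\, G_s \otimes_i I
  + \frac{\rho_w}{\gamma}\, G_s \otimes_o S
  - \frac{\rho_i}{\gamma}\, \overline{G_s \otimes_i I}
  - \frac{\rho_w}{\gamma}\, \overline{G_s \otimes_o S}
  + S^E,
\qquad
S^E = -\frac{m_i}{\rho_w A_o}
```

Because S appears on both sides, the SLE is solved iteratively; how individual codes discretize the convolutions and truncate this iteration is precisely what such intercomparisons expose.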
NASA Astrophysics Data System (ADS)
Brinkerhoff, D. J.; Johnson, J. V.
2013-07-01
We introduce a novel, higher-order, finite element ice sheet model called VarGlaS (Variational Glacier Simulator), which is built on the finite element framework FEniCS. Contrary to standard procedure in ice sheet modelling, VarGlaS formulates ice sheet motion as the minimization of an energy functional, conferring advantages such as a consistent platform for making numerical approximations, a coherent relationship between motion and heat generation, and implicit boundary treatment. VarGlaS also solves an enthalpy equation rather than a temperature equation, avoiding the solution of a contact problem. Rather than including a lengthy model spin-up procedure, VarGlaS possesses an automated framework for model inversion. These capabilities are brought to bear on several benchmark problems in ice sheet modelling, as well as a 500 yr simulation of the Greenland ice sheet at high resolution. VarGlaS performs well in benchmarking experiments and, given a constant climate and a 100 yr relaxation period, predicts a mass evolution of the Greenland ice sheet that matches present-day observations of mass loss. VarGlaS predicts thinning in the interior and thickening at the margins of the ice sheet.
Benchmark Testing of the Largest Titanium Aluminide Sheet Subelement Conducted
NASA Technical Reports Server (NTRS)
Bartolotta, Paul A.; Krause, David L.
2000-01-01
To evaluate wrought titanium aluminide (gamma TiAl) as a viable candidate material for the High-Speed Civil Transport (HSCT) exhaust nozzle, an international team led by the NASA Glenn Research Center at Lewis Field successfully fabricated and tested the largest gamma TiAl sheet structure ever manufactured. The gamma TiAl sheet structure, a 56-percent subscale divergent flap subelement, was fabricated for benchmark testing in three-point bending. Overall, the subelement was 84-cm (33-in.) long by 13-cm (5-in.) wide by 8-cm (3-in.) deep. Incorporated into the subelement were features that might be used in the fabrication of a full-scale divergent flap. These features include the use of: (1) gamma TiAl shear clips to join together sections of corrugations, (2) multiple gamma TiAl face sheets, (3) double hot-formed gamma TiAl corrugations, and (4) brazed joints. The structural integrity of the gamma TiAl sheet subelement was evaluated by conducting a room-temperature three-point static bend test.
The GLAS Standard Data Products Specification-Data Dictionary, Version 1.0. Volume 15
NASA Technical Reports Server (NTRS)
Lee, Jeffrey E.
2013-01-01
The Geoscience Laser Altimeter System (GLAS) is the primary instrument for the ICESat (Ice, Cloud and Land Elevation Satellite) laser altimetry mission. ICESat was the benchmark Earth Observing System (EOS) mission for measuring ice sheet mass balance, cloud and aerosol heights, as well as land topography and vegetation characteristics. From 2003 to 2009, the ICESat mission provided multi-year elevation data needed to determine ice sheet mass balance as well as cloud property information, especially for stratospheric clouds common over polar areas. It also provided topography and vegetation data around the globe, in addition to the polar-specific coverage over the Greenland and Antarctic ice sheets. This document contains the data dictionary for the GLAS standard data products. It details the parameters present on the GLAS standard data products. Each parameter is defined with a short name, a long name, units on the product, variable type, a long description, and the products that contain it. The term "standard data products" refers to those EOS instrument data that are routinely generated for public distribution. These products are distributed by the National Snow and Ice Data Center (NSIDC).
The GLAS Standard Data Products Specification--Level 2, Version 9. Volume 14
NASA Technical Reports Server (NTRS)
Lee, Jeffrey E.
2013-01-01
The Geoscience Laser Altimeter System (GLAS) is the primary instrument for the ICESat (Ice, Cloud and Land Elevation Satellite) laser altimetry mission. ICESat was the benchmark Earth Observing System (EOS) mission for measuring ice sheet mass balance, cloud and aerosol heights, as well as land topography and vegetation characteristics. From 2003 to 2009, the ICESat mission provided multi-year elevation data needed to determine ice sheet mass balance as well as cloud property information, especially for stratospheric clouds common over polar areas. It also provided topography and vegetation data around the globe, in addition to the polar-specific coverage over the Greenland and Antarctic ice sheets. This document defines the Level-2 GLAS standard data products. It addresses the data flow, interfaces, record and data formats associated with the GLAS Level 2 standard data products. The term "standard data products" refers to those EOS instrument data that are routinely generated for public distribution. The National Snow and Ice Data Center (NSIDC) distributes these products. Each data product has a unique Product Identification code assigned by the Senior Project Scientist. The Level 2 Standard Data Products specifically include the derived geophysical data values (e.g., ice sheet elevation, cloud height, vegetation height). Additionally, the appropriate correction elements used to transform the Level 1A and Level 1B Data Products into Level 2 Data Products are included. The data are packaged with time tags, precision orbit location coordinates, and data quality and usage flags.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Will, M.E.
1994-01-01
This report presents a standard method for deriving benchmarks for the purpose of "contaminant screening," performed by comparing measured ambient concentrations of chemicals with the benchmark concentrations. The work was performed under Work Breakdown Structure 1.4.12.2.3.04.07.02 (Activity Data Sheet 8304). In addition, this report presents sets of data concerning the effects of chemicals in soil on invertebrates and soil microbial processes, benchmarks for chemicals potentially associated with United States Department of Energy sites, and literature describing the experiments from which data were drawn for benchmark derivation.
Polar ice-sheet contributions to sea level during past warm periods
NASA Astrophysics Data System (ADS)
Dutton, A.
2015-12-01
Recent sea-level rise has been dominated by thermal expansion and glacier loss, but the contribution from mass loss from the Greenland and Antarctic ice sheets is expected to exceed other contributions under future sustained warming. Due to limitations of existing ice sheet models and the lack of relevant analogues in the historical record, projecting the timing and magnitude of future polar ice sheet mass loss remains challenging. One approach to improving our understanding of how polar ice-sheet retreat will unfold is to integrate observations and models of sea level, ice sheets, and climate during past intervals of warmth when the polar ice sheets contributed to higher sea levels. A recent review evaluated the evidence of polar ice sheet mass loss during several warm periods, including the mid-Pliocene warm period and the interglacials of Marine Isotope Stage (MIS) 11, MIS 5e (Last Interglacial), and MIS 1 (Holocene). Sea-level benchmarks of ice-sheet retreat during the first three of these periods, when global mean climate was ~1 to 3 °C warmer than preindustrial, are useful for understanding the long-term potential for future sea-level rise. Despite existing uncertainties in these reconstructions, it is clear that our present climate is warming to a level associated with significant polar ice-sheet loss in the past, yielding a conservative estimate of a global mean sea-level rise of 6 meters (or more) above present. This presentation will focus on identifying the approaches that have yielded significant advances in past sea level and ice sheet reconstruction, as well as outstanding challenges. A key element of recent advances in sea-level reconstructions is the ability to recognize and quantify the imprint of geophysical processes, such as glacial isostatic adjustment (GIA) and dynamic topography, that lead to significant spatial variability in sea-level reconstructions. Identifying the specific ice-sheet sources that contributed to higher sea levels is a challenge that is currently hindered by limited field evidence at high latitudes. Finally, I will explore how increasing the quantity and quality of paleo sea-level and ice-sheet reconstructions can lead to improved quantification of contemporary changes in ice sheets and sea level.
The GLAS Standard Data Products Specification-Level 1, Version 9
NASA Technical Reports Server (NTRS)
Lee, Jeffrey E.
2013-01-01
The Geoscience Laser Altimeter System (GLAS) is the primary instrument for the ICESat (Ice, Cloud and Land Elevation Satellite) laser altimetry mission. ICESat was the benchmark Earth Observing System (EOS) mission for measuring ice sheet mass balance, cloud and aerosol heights, as well as land topography and vegetation characteristics. From 2003 to 2009, the ICESat mission provided multi-year elevation data needed to determine ice sheet mass balance as well as cloud property information, especially for stratospheric clouds common over polar areas. It also provided topography and vegetation data around the globe, in addition to the polar-specific coverage over the Greenland and Antarctic ice sheets. This document defines the Level-1 GLAS standard data products. It addresses the data flow, interfaces, record and data formats associated with the GLAS Level 1 standard data products. GLAS Level 1 standard data products are composed of Level 1A and Level 1B data products. The term "standard data products" refers to those EOS instrument data that are routinely generated for public distribution. The National Snow and Ice Data Center (NSIDC) distributes these products. Each data product has a unique Product Identification code assigned by the Senior Project Scientist. GLAS Level 1A and Level 1B Data Products are composed of Level 0 data that have been reformatted or transformed into corrected and calibrated data in physical units at the full instrument rate and resolution.
Albuquerque, Kevin; Rodgers, Kellie; Spangler, Ann; Rahimi, Asal; Willett, DuWayne
2018-03-01
The on-treatment visit (OTV) for radiation oncology is essential for patient management. Radiation toxicities recorded during the OTV may be inconsistent because of the use of free text and the lack of treatment site-specific templates. We developed a radiation oncology toxicity recording instrument (ROTOX) in a health system electronic medical record (EMR). Our aims were to assess improvement in documentation of toxicities and to develop clinic toxicity benchmarks. A ROTOX that was based on the National Cancer Institute Common Terminology Criteria for Adverse Events (CTCAE, version 4.0) with flow-sheet functionality was developed in the EMR. Improvement in documentation was assessed at various time intervals. High-grade toxicities (i.e., grade ≥ 3 by CTCAE) by site were audited to develop benchmarks and to track nursing and physician actions taken in response to these. A random sample of OTV notes from each clinic physician before ROTOX implementation was reviewed and assigned a numerical document quality score (DQS) that was based on completeness and comprehensiveness of toxicity grading. The mean DQS improved from an initial level of 41% to 99% (of the maximum possible DQS) when resampled at 6 months post-ROTOX. This high-level DQS was maintained 3 years after ROTOX implementation at 96% of the maximum. For months 7 to 9 after implementation (during a 3-month period), toxicity grading was recorded in 4,443 OTVs for 698 unique patients; 107 episodes of high-grade toxicity were identified during this period, and toxicity-specific intervention was documented in 95%. An EMR-based ROTOX enables consistent recording of treatment toxicity. In a uniform sample of patients, local population toxicity benchmarks can be developed, and clinic response can be tracked.
The GLAS Science Algorithm Software (GSAS) Detailed Design Document Version 6. Volume 16
NASA Technical Reports Server (NTRS)
Lee, Jeffrey E.
2013-01-01
The Geoscience Laser Altimeter System (GLAS) is the primary instrument for the ICESat (Ice, Cloud and Land Elevation Satellite) laser altimetry mission. ICESat was the benchmark Earth Observing System (EOS) mission for measuring ice sheet mass balance, cloud and aerosol heights, as well as land topography and vegetation characteristics. From 2003 to 2009, the ICESat mission provided multi-year elevation data needed to determine ice sheet mass balance as well as cloud property information, especially for stratospheric clouds common over polar areas. It also provided topography and vegetation data around the globe, in addition to the polar-specific coverage over the Greenland and Antarctic ice sheets. This document describes the detailed design of the GLAS Science Algorithm Software (GSAS). The GSAS is used to create the ICESat GLAS standard data products, which are distributed by the National Snow and Ice Data Center (NSIDC). The document contains descriptions, flow charts, data flow diagrams, and structure charts for each major component of the GSAS. The purpose of this document is to present the detailed design of the GSAS. It is intended as a reference source to assist the maintenance programmer in making changes that fix or enhance the documented software.
The GLAS Science Algorithm Software (GSAS) User's Guide Version 7
NASA Technical Reports Server (NTRS)
Lee, Jeffrey E.
2013-01-01
The Geoscience Laser Altimeter System (GLAS) is the primary instrument for the ICESat (Ice, Cloud and Land Elevation Satellite) laser altimetry mission. ICESat was the benchmark Earth Observing System (EOS) mission for measuring ice sheet mass balance, cloud and aerosol heights, as well as land topography and vegetation characteristics. From 2003 to 2009, the ICESat mission provided multi-year elevation data needed to determine ice sheet mass balance as well as cloud property information, especially for stratospheric clouds common over polar areas. It also provided topography and vegetation data around the globe, in addition to the polar-specific coverage over the Greenland and Antarctic ice sheets. This document is the final version of the GLAS Science Algorithm Software User's Guide. It contains the instructions to install the GLAS Science Algorithm Software (GSAS) in the production environment that was used to create the standard data products. It also describes the usage of each GSAS program in that environment, with its required inputs and outputs. Included are a number of utility programs that are used to create ancillary data files that are used in processing but generally are not distributed to the public as data products. Notably, the values of the large number of constants used by the GSAS algorithms during processing are provided in an appendix.
The Role of Focus Groups with Other Performance Measurement Methods.
ERIC Educational Resources Information Center
Hart, Elizabeth
Huddersfield University Library (England) has undertaken a wide range of evaluative studies of its services and systems, using various data collection techniques such as user surveys; exit interviews; online and CD-ROM analysis; benchmarking; user groups; staffing and staff development evaluation; suggestion sheets; student project work; group…
The GLAS Algorithm Theoretical Basis Document for Precision Attitude Determination (PAD)
NASA Technical Reports Server (NTRS)
Bae, Sungkoo; Smith, Noah; Schutz, Bob E.
2013-01-01
The Geoscience Laser Altimeter System (GLAS) was the sole instrument for NASA's Ice, Cloud and land Elevation Satellite (ICESat) laser altimetry mission. The primary purpose of the ICESat mission was to make ice sheet elevation measurements of the polar regions. Additional goals were to measure the global distribution of clouds and aerosols and to map sea ice, land topography and vegetation. ICESat was the benchmark Earth Observing System (EOS) mission to be used to determine the mass balance of the ice sheets, as well as for providing cloud property information, especially for stratospheric clouds common over polar areas.
Fuel Cell Technology Status Analysis Project: Partnership Opportunities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fact sheet describing the National Renewable Energy Laboratory's (NREL's) Fuel Cell Technology Status Analysis Project. NREL is seeking fuel cell industry partners from the United States and abroad to participate in an objective and credible analysis of commercially available fuel cell products to benchmark the current state of the technology and support industry growth.
NASA Astrophysics Data System (ADS)
Dang, Van Tuan; Lafon, Pascal; Labergere, Carl
2017-10-01
In this work, a combination of Proper Orthogonal Decomposition (POD) and Radial Basis Functions (RBF) is proposed to build a surrogate model based on the 3D springback bending benchmark from the Numisheet 2011 congress. The influence of two design parameters, the geometrical parameter of the die radius and the process parameter of the blank-holder force, on the springback of the sheet after a stamping operation is analyzed. A classical full-factorial design of experiments (DoE) samples the parameter space, and the sample points serve as input data for finite element method (FEM) simulations of the sheet-metal stamping process. The basic idea is to consider the design parameters as additional dimensions of the solution space of the displacement fields. The order of the resulting high-fidelity model is reduced with the POD method, which performs model-space reduction and yields the basis functions of the low-order model. Specifically, the snapshot method is used in our work, in which the basis functions are derived from the deviations of the snapshot matrix of final displacement fields from the FEM simulations. The obtained basis functions are then used to determine the POD coefficients, and RBF interpolation of these POD coefficients is performed over the parameter space. Finally, the presented POD-RBF approach achieves high accuracy and can be used for shape optimization.
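A minimal sketch of the snapshot-POD plus RBF pipeline summarized above (illustrative only, not the authors' code; the DoE points, snapshot matrix, and truncation threshold are placeholder assumptions):

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Full-factorial DoE over (die radius [mm], blank-holder force [kN]) -- hypothetical values.
params = np.array([[5.0, 10.0], [5.0, 20.0], [8.0, 10.0], [8.0, 20.0]])
U = np.random.rand(3000, len(params))   # placeholder: one flattened FEM displacement field per column

U_mean = U.mean(axis=1, keepdims=True)            # snapshot method works with deviations
Phi, s, _ = np.linalg.svd(U - U_mean, full_matrices=False)
k = int(np.searchsorted(np.cumsum(s**2) / np.sum(s**2), 0.999)) + 1
Phi_k = Phi[:, :k]                                # truncated POD basis of the low-order model

A = Phi_k.T @ (U - U_mean)                        # POD coefficients of each snapshot
rbf = RBFInterpolator(params, A.T)                # interpolate coefficients over the parameter space

def predict(p):
    """Surrogate displacement field at a new parameter point p = (radius, force)."""
    return U_mean[:, 0] + Phi_k @ rbf(np.atleast_2d(p))[0]
```

In a real application the random matrix U would be replaced by the final displacement fields from the stamping simulations, and springback measures would be post-processed from the reconstructed field.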
International Space Station Alpha (ISSA) Integrated Traffic Model
NASA Technical Reports Server (NTRS)
Gates, R. E.
1995-01-01
The paper discusses the development process of the International Space Station Alpha (ISSA) Integrated Traffic Model, a subsystem analysis tool utilized in the ISSA design analysis cycles. Fast-track prototyping of the detailed relationships between daily crew and station consumables, propellant needs, maintenance requirements, and crew rotation via spreadsheets provides adequate benchmarks to assess cargo vehicle design and performance characteristics.
A benchmark study of the sea-level equation in GIA modelling
NASA Astrophysics Data System (ADS)
Martinec, Zdenek; Klemann, Volker; van der Wal, Wouter; Riva, Riccardo; Spada, Giorgio; Simon, Karen; Blank, Bas; Sun, Yu; Melini, Daniele; James, Tom; Bradley, Sarah
2017-04-01
The sea-level load in glacial isostatic adjustment (GIA) is described by the so-called sea-level equation (SLE), which represents the mass redistribution between ice sheets and oceans on a deforming earth. Various levels of complexity of the SLE have been proposed in the past, ranging from a simple mean global sea level (the so-called eustatic sea level) to the load with a deforming ocean bottom, migrating coastlines, and a changing shape of the geoid. Several approaches to solve the SLE have been derived, from purely analytical formulations to fully numerical methods. Despite various teams independently investigating GIA, there has been no systematic intercomparison amongst the solvers through which the methods may be validated. The goal of this paper is to present a series of benchmark experiments designed for testing and comparing numerical implementations of the SLE. Our approach starts with simple load cases, even though the benchmark will not result in GIA predictions for a realistic loading scenario. In the longer term we aim for a benchmark with a realistic loading scenario, and also for benchmark solutions with rotational feedback. The current benchmark uses an earth model for which Love numbers have been computed and benchmarked in Spada et al. (2011). In spite of the significant differences in the numerical methods employed, the test computations performed so far show a satisfactory agreement between the results provided by the participants. The differences found can often be attributed to the different approximations inherent to the various algorithms. Reference: Spada, G., Barletta, V. R., Klemann, V., Riva, R. E. M., Martinec, Z., Gasperini, P., Lund, B., Wolf, D., Vermeersen, L. L. A., and King, M. A., 2011. A benchmark study for glacial isostatic adjustment codes. Geophys. J. Int., 185, 106-132, doi:10.1111/j.1365-246X.2011.04952.x.
Non-Abelian semilocal strings in N=2 supersymmetric QCD
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shifman, M.; Yung, A.
2006-06-15
We consider a benchmark bulk theory in four dimensions: N=2 supersymmetric QCD with the gauge group U(N) and N_f flavors of fundamental matter hypermultiplets (quarks). The nature of the Bogomol'nyi-Prasad-Sommerfield (BPS) strings in this benchmark theory crucially depends on N_f. If N_f ≥ N and all quark masses are equal, it supports non-Abelian BPS strings which have internal (orientational) moduli. If N_f > N these strings become semilocal, developing additional moduli ρ related to (unlimited) variations of their transverse size. Using the U(2) gauge group with N_f = 3, 4 as an example, we derive an effective low-energy theory on the (two-dimensional) string world sheet. Our derivation is field theoretic, direct and explicit: we first analyze the Bogomol'nyi equations for string-geometry solitons, suggest an ansatz, and solve it at large ρ. Then we use this solution to obtain the world-sheet theory. In the semiclassical limit our result confirms the Hanany-Tong conjecture, which rests on brane-based arguments, that the world-sheet theory is an N=2 supersymmetric U(1) gauge theory with N positively and N_e = N_f − N negatively charged matter multiplets and the Fayet-Iliopoulos term determined by the four-dimensional coupling constant. We conclude that the Higgs branch of this model is not lifted by quantum effects. As a result, such strings cannot confine. Our analysis of infrared effects, not seen in the Hanany-Tong consideration, shows that, in fact, the derivative expansion can make sense only provided that the theory under consideration is regularized in the infrared, e.g. by the quark mass differences. The world-sheet action discussed in this paper becomes a bona fide low-energy effective action only if δm_AB ≠ 0.
Simple Benchmark Specifications for Space Radiation Protection
NASA Technical Reports Server (NTRS)
Singleterry, Robert C. Jr.; Aghara, Sukesh K.
2013-01-01
This report defines space radiation benchmark specifications. The specification starts with simple, monoenergetic, mono-directional particles on slabs and progresses to human models in spacecraft. It specifies the models and sources needed, as well as what the team performing the benchmark needs to produce in its report. Also included are brief descriptions of how OLTARIS, the NASA Langley website for space radiation analysis, performs its analysis.
Resistive switching near electrode interfaces: Estimations by a current model
NASA Astrophysics Data System (ADS)
Schroeder, Herbert; Zurhelle, Alexander; Stemmer, Stefanie; Marchewka, Astrid; Waser, Rainer
2013-02-01
The growing resistive switching database is accompanied by many detailed mechanisms which often remain pure hypotheses. Some of these suggested models can be verified by checking their predictions against the benchmarks for future memory cells. The valence change memory model assumes that the different resistances in the ON and OFF states arise from changes in the defect density profile in a sheet near one working electrode during switching. The resulting READ current densities in the ON and OFF states were calculated using an appropriate simulation model, varying several important defect and material parameters of the metal/insulator (oxide)/metal thin-film stack, such as the defect density and its profile changes in density and thickness, the height of the interface barrier, the dielectric permittivity, and the applied voltage. The results were compared to the benchmarks, and memory windows for the varied parameters can be defined: the required ON-state READ current density of 10⁵ A/cm² can only be achieved for barriers smaller than 0.7 eV and defect densities larger than 3 × 10²⁰ cm⁻³. The required current ratio of at least 10 between the ON and OFF states requires a defect-density reduction of approximately an order of magnitude in a sheet several nanometers thick near the working electrode.
Moorthy, A; Alkadhimi, A F; Stassen, Leo F; Duncan, H F
2016-01-01
Concerns were expressed that written postoperative instructions following endodontic treatment were not available in the Dublin Dental University Hospital (DDUH). Data were collected in three phases: retrospective analysis of clinical notes for evidence of the delivery of postoperative instructions; a randomly distributed questionnaire to patients undergoing root canal treatment prior to the introduction of a written postoperative advice sheet; and another survey following introduction of the advice sheet. Some 56% of patients' charts documented that postoperative advice was given. Analysis of phase two revealed that patients were not consistently informed of the key postoperative messages. In the phase-three analysis, the proposed benchmarks were met in four out of six categories. Postoperative advice after root canal treatment in the DDUH is both poorly recorded and inconsistently delivered. A combination of oral postoperative instructions and written postoperative advice provided the most effective delivery of patient information.
The General Concept of Benchmarking and Its Application in Higher Education in Europe
ERIC Educational Resources Information Center
Nazarko, Joanicjusz; Kuzmicz, Katarzyna Anna; Szubzda-Prutis, Elzbieta; Urban, Joanna
2009-01-01
The purposes of this paper are twofold: a presentation of the theoretical basis of benchmarking and a discussion on practical benchmarking applications. Benchmarking is also analyzed as a productivity accelerator. The authors study benchmarking usage in the private and public sectors with due consideration of the specificities of the two areas.…
NASA Technical Reports Server (NTRS)
Bailey, David (Editor); Barton, John (Editor); Lasinski, Thomas (Editor); Simon, Horst (Editor)
1993-01-01
A new set of benchmarks was developed for the performance evaluation of highly parallel supercomputers. These benchmarks consist of a set of kernels, the 'Parallel Kernels,' and a simulated application benchmark. Together they mimic the computation and data movement characteristics of large scale computational fluid dynamics (CFD) applications. The principal distinguishing feature of these benchmarks is their 'pencil and paper' specification - all details of these benchmarks are specified only algorithmically. In this way many of the difficulties associated with conventional benchmarking approaches on highly parallel systems are avoided.
NASA Astrophysics Data System (ADS)
Asay-Davis, Xylar; Cornford, Stephen; Martin, Daniel; Gudmundsson, Hilmar; Holland, David; Holland, Denise
2015-04-01
The MISMIP and MISMIP3D marine ice sheet model intercomparison exercises have become popular benchmarks, and several modeling groups have used them to show how their models compare to both analytical results and other models. Similarly, the ISOMIP (Ice Shelf-Ocean Model Intercomparison Project) experiments have acted as a proving ground for ocean models with sub-ice-shelf cavities. As coupled ice sheet-ocean models become available, an updated set of benchmark experiments is needed. To this end, we propose sequel experiments, MISMIP+ and ISOMIP+, with an end goal of coupling the two in a third intercomparison exercise, MISOMIP (the Marine Ice Sheet-Ocean Model Intercomparison Project). Like MISMIP3D, the MISMIP+ experiments take place in an idealized, three-dimensional setting and compare full 3D (Stokes) and reduced, hydrostatic models. Unlike the earlier exercises, the primary focus will be the response of models to sub-shelf melting. The chosen configuration features an ice shelf that experiences substantial lateral shear and buttresses the upstream ice, and so is well suited to melting experiments. Differences between the steady states of each model are minor compared to the response to melt-rate perturbations, reflecting typical real-world applications where parameters are chosen so that the initial states of all models tend to match observations. The three ISOMIP+ experiments have been designed to make use of the same bedrock topography as MISMIP+, using ice-shelf geometries from MISMIP+ results produced by the BISICLES ice-sheet model. The first two experiments use static ice-shelf geometries to simulate the evolution of ocean dynamics and resulting melt rates to a quasi-steady state when the far-field forcing switches either from cold to warm or from warm to cold. The third experiment prescribes 200 years of dynamic ice-shelf geometry (with both retreating and advancing ice) based on a BISICLES simulation, along with similar flips between warm and cold states in the far-field ocean forcing. The MISOMIP experiment combines the MISMIP+ experiments with the third ISOMIP+ experiment. Changes in far-field ocean forcing lead to a rapid (over ~1-2 years) increase in sub-ice-shelf melting, which is allowed to drive ice-shelf retreat for ~100 years. Then, the far-field forcing is switched to a cold state, leading to a rapid decrease in melting and a subsequent advance over ~100 years. To illustrate, we present results from BISICLES and POP2x experiments for each of the three intercomparison exercises.
Formation of current singularity in a topologically constrained plasma
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Yao; Huang, Yi-Min; Qin, Hong
2016-02-01
Recently a variational integrator for ideal magnetohydrodynamics in Lagrangian labeling has been developed. Its built-in frozen-in equation makes it optimal for studying current sheet formation. We use this scheme to study the Hahm-Kulsrud-Taylor problem, which considers the response of a 2D plasma magnetized by a sheared field under sinusoidal boundary forcing. We obtain an equilibrium solution that preserves the magnetic topology of the initial field exactly, with a fluid mapping that is non-differentiable. Unlike previous studies that examine the current density output, we identify a singular current sheet from the fluid mapping. These results are benchmarked with a constrained Grad-Shafranov solver. The same signature of current singularity can be found in other cases with more complex magnetic topologies.
Land Ice Verification and Validation Kit
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-07-15
To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community-prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of tests where differences occur.
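As background, a minimal sketch of the kind of bit-for-bit check that such a toolkit automates (this illustrates the concept only and is not LIVV's actual API; file and variable names are hypothetical):

```python
import numpy as np
from netCDF4 import Dataset  # assumes netCDF model output

def bit4bit(ref_file, test_file, variables):
    """Report whether each variable matches the stored benchmark output exactly."""
    with Dataset(ref_file) as ref, Dataset(test_file) as test:
        for name in variables:
            a, b = np.asarray(ref[name][:]), np.asarray(test[name][:])
            same = a.shape == b.shape and np.array_equal(a, b)
            print(f"{name}: {'bit-for-bit' if same else 'DIFFERS'}")

# hypothetical usage against a stored benchmark run
bit4bit("bench/dome30.out.nc", "test/dome30.out.nc", ["thk", "velnorm", "temp"])
```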
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sandor, Debra; Chung, Donald; Keyser, David
This report documents the CEMAC methodologies for developing and reporting annual global clean energy manufacturing benchmarks. The report reviews previously published manufacturing benchmark reports and foundational data, establishes a framework for benchmarking clean energy technologies, describes the CEMAC benchmark analysis methodologies, and describes the application of the methodologies to the manufacturing of four specific clean energy technologies.
Rooney, Alan D.; Selby, David; Llyod, Jeremy M.; Roberts, David H.; Luckge, Andreas; Sageman, Bradley B.; Prouty, Nancy G.
2016-01-01
High-resolution Os isotope stratigraphy can aid in reconstructing Pleistocene ice sheet fluctuation and elucidating the role of local and regional weathering fluxes on the marine Os residence time. This paper presents new Os isotope data from ocean cores adjacent to the West Greenland ice sheet that have excellent chronological controls. Cores MSM-520 and DA00-06 represent distal to proximal sites adjacent to two West Greenland ice streams. Core MSM-520 has a steadily decreasing Os signal over the last 10 kyr (187Os/188Os = 1.35–0.81). In contrast, Os isotopes from core DA00-06 (proximal to the calving front of Jakobshavn Isbræ) highlight four stages of ice stream retreat and advance over the past 10 kyr (187Os/188Os = 2.31; 1.68; 2.09; 1.47). Our high-resolution chemostratigraphic records provide vital benchmarks for ice-sheet modelers as we attempt to better constrain the future response of major ice sheets to climate change. Variations in Os isotope composition from sediment and macro-algae (seaweed) sourced from regional and global settings serve to emphasize the overwhelming effect weathering sources have on seawater Os isotope composition. Further, these findings demonstrate that the residence time of Os is shorter than previous estimates of ∼10⁴ yr.
Numerical analysis of tailored sheets to improve the quality of components made by SPIF
NASA Astrophysics Data System (ADS)
Gagliardi, Francesco; Ambrogio, Giuseppina; Cozza, Anna; Pulice, Diego; Filice, Luigino
2018-05-01
In this paper, the authors present a study on the profitable combination of forming techniques. More specifically, attention has been focused on combining single point incremental forming (SPIF) with an additional process that can locally thicken the initial blank, compensating for the local thinning that the sheet undergoes. Focusing on the excessive thinning of parts made by SPIF, a hybrid approach can be regarded as a viable solution to reduce the inhomogeneous thickness distribution of the sheet. The basic idea is to work on a blank previously modified by a deformation step performed, for instance, by forming, additive, or subtractive processes. To evaluate the effectiveness of this hybrid solution, an FE numerical model has been defined to analyze the thickness variation of tailored sheets that are incrementally formed, optimizing the material distribution according to the shape to be manufactured. Simulations based on the explicit formulation have been set up for the model implementation. The mechanical properties of the sheet material have been taken from the literature, and a frustum of a cone has been considered as the benchmark profile for the performed analysis. The outcomes of the numerical model have been evaluated in terms of both maximum thinning and final thickness distribution. The feasibility of the proposed approach is detailed in the paper.
NASA Technical Reports Server (NTRS)
Bailey, D. H.; Barszcz, E.; Barton, J. T.; Carter, R. L.; Lasinski, T. A.; Browning, D. S.; Dagum, L.; Fatoohi, R. A.; Frederickson, P. O.; Schreiber, R. S.
1991-01-01
A new set of benchmarks has been developed for the performance evaluation of highly parallel supercomputers in the framework of the NASA Ames Numerical Aerodynamic Simulation (NAS) Program. These consist of five 'parallel kernel' benchmarks and three 'simulated application' benchmarks. Together they mimic the computation and data movement characteristics of large-scale computational fluid dynamics applications. The principal distinguishing feature of these benchmarks is their 'pencil and paper' specification - all details of these benchmarks are specified only algorithmically. In this way many of the difficulties associated with conventional benchmarking approaches on highly parallel systems are avoided.
Multi-GPU three dimensional Stokes solver for simulating glacier flow
NASA Astrophysics Data System (ADS)
Licul, Aleksandar; Herman, Frédéric; Podladchikov, Yuri; Räss, Ludovic; Omlin, Samuel
2016-04-01
Here we present a recently developed three-dimensional Stokes solver running on GPUs and apply it to glacier flow. We numerically solve the Stokes momentum balance equations together with the incompressibility equation, while also taking into account the strong nonlinearity of ice rheology. We have developed a fully three-dimensional numerical MATLAB application based on an iterative finite difference scheme with preconditioning of residuals. The differential equations are discretized on a regular staggered grid. We have ported the application to C-CUDA to run it on GPUs in parallel, using MPI. We demonstrate the accuracy and efficiency of the developed model with the manufactured analytical solution test for three-dimensional Stokes ice-sheet models (Leng et al., 2013) and by comparison with other well-established ice-sheet models on the diagnostic ISMIP-HOM benchmark experiments (Pattyn et al., 2008). The results show that the model accurately and efficiently solves the Stokes system of equations in a variety of test scenarios, while preserving good parallel efficiency on up to 80 GPUs. For example, in 3D test scenarios with 250,000 grid points our solver converges in around 3 minutes for single-precision computations and around 10 minutes for double-precision computations. We have also optimized the code to run efficiently on our newly acquired state-of-the-art GPU cluster, octopus. This allows us to solve the problem on more than 20 million grid points by simply increasing the number of GPUs used, while keeping the computation time the same. In future work we will apply the solver to real-world applications and implement free-surface evolution capabilities. References: Leng, W., Ju, L., Gunzburger, M. & Price, S., 2013. Manufactured solutions and the verification of three-dimensional Stokes ice-sheet models. The Cryosphere 7, 19-29. Pattyn, F., Perichon, L., Aschwanden, A., Breuer, B., de Smedt, B., Gagliardini, O., Gudmundsson, G.H., Hindmarsh, R.C.A., Hubbard, A., Johnson, J.V., Kleiner, T., Konovalov, Y., Martin, C., Payne, A.J., Pollard, D., Price, S., Rückamp, M., Saito, F., Soucek, O., Sugiyama, S. & Zwinger, T., 2008. Benchmark experiments for higher-order and full-Stokes ice sheet models (ISMIP-HOM). The Cryosphere 2, 95-108.
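The "strong nonlinearity of ice rheology" mentioned above comes from Glen's flow law; as a sketch, the effective viscosity that such an iterative solver re-evaluates at every grid point is (standard formula; the rate factor and regularization values below are typical assumptions, not this solver's settings):

```python
import numpy as np

def effective_viscosity(eps_II, A=2.4e-24, n=3.0, eps_reg=1e-15):
    """Glen's-law effective viscosity: eta = 0.5 * A**(-1/n) * eps_II**((1-n)/n).

    eps_II  : second invariant of the strain-rate tensor [1/s]
    A       : rate factor [Pa^-n s^-1]; 2.4e-24 is a typical value for temperate ice
    eps_reg : regularization keeping eta finite where the strain rate vanishes
    """
    eps = np.maximum(eps_II, eps_reg)
    return 0.5 * A ** (-1.0 / n) * eps ** ((1.0 - n) / n)
```

Because the viscosity depends on the velocity solution itself, the momentum balance must be iterated to convergence, which is where the preconditioned iterative scheme and GPU parallelism pay off.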
ERIC Educational Resources Information Center
McGregor, Ellen N.; Attinasi, Louis C., Jr.
This paper describes the processes involved in selecting peer institutions for appropriate benchmarking using national databases (NCES-IPEDS). Benchmarking involves the identification of peer institutions and/or best practices in specific operational areas for the purpose of developing standards. The benchmarking process was born in the early…
How to Advance TPC Benchmarks with Dependability Aspects
NASA Astrophysics Data System (ADS)
Almeida, Raquel; Poess, Meikel; Nambiar, Raghunath; Patil, Indira; Vieira, Marco
Transactional systems are the core of the information systems of most organizations. Although there is general acknowledgement that failures in these systems often entail significant impact both on the proceeds and reputation of companies, the benchmarks developed and managed by the Transaction Processing Performance Council (TPC) still maintain their focus on reporting bare performance. Each TPC benchmark has to pass a list of dependability-related tests (to verify ACID properties), but not all benchmarks require measuring their performances. While TPC-E measures the recovery time of some system failures, TPC-H and TPC-C only require functional correctness of such recovery. Consequently, systems used in TPC benchmarks are tuned mostly for performance. In this paper we argue that nowadays systems should be tuned for a more comprehensive suite of dependability tests, and that a dependability metric should be part of TPC benchmark publications. The paper discusses WHY and HOW this can be achieved. Two approaches are introduced and discussed: augmenting each TPC benchmark in a customized way, by extending each specification individually; and pursuing a more unified approach, defining a generic specification that could be adjoined to any TPC benchmark.
Evaluation of control strategies using an oxidation ditch benchmark.
Abusam, A; Keesman, K J; Spanjers, H; van Straten, G; Meinema, K
2002-01-01
This paper presents validation and implementation results of a benchmark developed for a specific full-scale oxidation ditch wastewater treatment plant. A benchmark is a standard simulation procedure that can be used as a tool in evaluating various control strategies proposed for wastewater treatment plants. It is based on model and performance criteria development. Testing of this benchmark, by comparing benchmark predictions to real measurements of the electrical energy consumption and the amount of sludge disposed for a specific oxidation ditch WWTP, has shown that it can reasonably be used for evaluating the performance of this WWTP. Subsequently, the validated benchmark was used in evaluating some basic and advanced control strategies. Some of the interesting results obtained are the following: (i) the influent flow splitting ratio, between the first and the fourth aerated compartments of the ditch, has no significant effect on the TN concentrations in the effluent; and (ii) for evaluation of long-term control strategies, future benchmarks need to be able to assess settlers' performance.
Unstructured Adaptive (UA) NAS Parallel Benchmark. Version 1.0
NASA Technical Reports Server (NTRS)
Feng, Huiyu; VanderWijngaart, Rob; Biswas, Rupak; Mavriplis, Catherine
2004-01-01
We present a complete specification of a new benchmark for measuring the performance of modern computer systems when solving scientific problems featuring irregular, dynamic memory accesses. It complements the existing NAS Parallel Benchmark suite. The benchmark involves the solution of a stylized heat transfer problem in a cubic domain, discretized on an adaptively refined, unstructured mesh.
Time resolved PIV and flow visualization of 3D sheet cavitation
NASA Astrophysics Data System (ADS)
Foeth, E. J.; van Doorne, C. W. H.; van Terwisga, T.; Wieneke, B.
2006-04-01
Time-resolved PIV was applied to study fully developed sheet cavitation on a hydrofoil with a spanwise varying angle of attack. The hydrofoil was designed to have a three-dimensional cavitation pattern closely related to propeller cavitation, studied for its adverse effects such as vibration, noise, and erosion. For the PIV measurements, fluorescent tracer particles were applied in combination with an optical filter, in order to remove the reflections of the laser light sheet by the cavitation. An adaptive mask was developed to find the interface between the vapor and liquid phases. The velocity at the interface of the cavity was found to be very close to the velocity predicted by a simple streamline model. For a visualization of the global flow dynamics, the laser beam was expanded and used to illuminate the entire hydrofoil and cavitation structure. The time-resolved recordings reveal the growth of the attached cavity and the cloud shedding. Our investigation proves the viability of accurate PIV measurements around developed sheet cavitation. The presented results will further be made available as a benchmark for the validation of numerical simulations of this complicated flow.
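The "simple streamline model" invoked for the interface velocity is, on the usual reading, a Bernoulli argument: on a streamline along the cavity surface the pressure equals the vapor pressure, which fixes the interface speed (stated here as background; the paper's exact formulation may differ):

```latex
% Cavitation number and Bernoulli-predicted cavity-interface speed:
\sigma = \frac{p_\infty - p_v}{\tfrac{1}{2}\rho U_\infty^2},
\qquad
q_c = U_\infty \sqrt{1 + \sigma}
```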
HPGMG 1.0: A Benchmark for Ranking High Performance Computing Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Mark; Brown, Jed; Shalf, John
2014-05-05
This document provides an overview of the benchmark, HPGMG, for ranking large-scale general-purpose computers for use on the Top500 list [8]. We provide a rationale for the need for a replacement for the current metric, HPL, some background on the Top500 list, and the challenges of developing such a metric; we discuss our design philosophy and methodology, and give an overview of the specification of the benchmark. The primary documentation, with maintained details on the specification, can be found at hpgmg.org, and the Wiki and the benchmark code itself can be found in the repository https://bitbucket.org/hpgmg/hpgmg.
High-Strength Composite Fabric Tested at Structural Benchmark Test Facility
NASA Technical Reports Server (NTRS)
Krause, David L.
2002-01-01
Large sheets of ultrahigh-strength fabric were put to the test at NASA Glenn Research Center's Structural Benchmark Test Facility. The material was stretched like a snare drum head until the last ounce of strength was reached, when it burst with a cacophonous release of tension. Along the way, the 3-ft square samples were also pulled, warped, tweaked, pinched, and yanked to predict the material's physical reactions to the many loads that it will experience during its proposed use. The material tested was a unique multi-ply composite fabric, reinforced with fibers that had a tensile strength eight times that of common carbon steel. The fiber plies were oriented at 0° and 90° to provide great membrane stiffness, as well as at 45° to provide an unusually high resistance to shear distortion. The fabric's heritage is in astronaut space suits and other NASA programs.
NASA Astrophysics Data System (ADS)
Sergeev, D. A.; Kandaurov, A. A.; Troitskaya, Yu I.
2017-11-01
In this paper we describe a PIV system specially designed for the study of hydrophysical processes in a large-scale benchmark setup of a promising fast reactor. The system allows PIV measurements under the complicated conditions of the reactor benchmark: reflections and distortions of the laser sheet section, blackout regions, and a closed volume. The use of filtering techniques and an image-mask method reduced the number of incorrectly measured flow velocity vectors by an order of magnitude. A method was implemented for converting image coordinates and velocity fields into the reference frame of the reactor model using virtual 3D calibration targets, without loss of accuracy compared with the use of physical targets in the imaging area. Velocity fields were measured in various modes, both stationary (operating) and non-stationary (emergency).
ff14ipq: A Self-Consistent Force Field for Condensed-Phase Simulations of Proteins
2015-01-01
We present the ff14ipq force field, implementing the previously published IPolQ charge set for simulations of complete proteins. Minor modifications to the charge-derivation scheme and to van der Waals interactions between polar atoms are introduced. Torsion parameters are developed through a generational learning approach, based on gas-phase MP2/cc-pVTZ single-point energies computed for structures optimized by the force field itself rather than by the quantum benchmark. In this manner, we sacrifice information about the true quantum minima in order to ensure that the force field maintains optimal agreement with the MP2/cc-pVTZ benchmark for the ensembles it will actually produce in simulations. A means of making the gas-phase torsion parameters compatible with solution-phase IPolQ charges is presented. The ff14ipq model is an alternative to ff99SB and other Amber force fields for protein simulations in programs that accommodate pair-specific Lennard-Jones combining rules. The force field gives strong performance on α-helical and β-sheet oligopeptides as well as globular proteins over microsecond-timescale simulations, although it has not yet been tested in conjunction with lipid and nucleic acid models. We show how our choices in parameter development influence the resulting force field and how other choices that may have appeared reasonable would actually have led to poorer results. The tools we developed may also aid in the development of future fixed-charge and even polarizable biomolecular force fields. PMID:25328495
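As a sketch of the torsion-fitting step described above, under simplifying assumptions (the Amber torsion term is linear in its amplitudes once the phases are fixed at 0 or π, so fitting reduces to linear least squares against the quantum-minus-force-field residual; the scan and energies below are placeholders, not the ff14ipq data):

```python
import numpy as np

# phi: torsion angles (radians) of scanned conformations;
# residual: E_MP2 - E_FF(torsion zeroed), in kcal/mol -- placeholder values.
phi = np.linspace(-np.pi, np.pi, 24)
residual = 1.5 * np.cos(phi) - 0.4 * np.cos(3 * phi) + 0.1

# Amber-style torsion: sum_n (V_n/2) * (1 + cos(n*phi - gamma_n)); with gamma_n in {0, pi}
# the model is linear in the amplitudes, so the fit is ordinary least squares on cos(n*phi).
periods = [1, 2, 3, 4]
X = np.column_stack([np.cos(n * phi) for n in periods] + [np.ones_like(phi)])
coeffs, *_ = np.linalg.lstsq(X, residual, rcond=None)
for n, c in zip(periods, coeffs):
    print(f"n={n}: amplitude {c:+.3f} kcal/mol")
```

The "generational" aspect would wrap this fit in an outer loop: re-optimize structures with the newly fitted force field, recompute the quantum residuals, and refit.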
Benchmarking and testing the "Sea Level Equation"
NASA Astrophysics Data System (ADS)
Spada, G.; Barletta, V. R.; Klemann, V.; van der Wal, W.; James, T. S.; Simon, K.; Riva, R. E. M.; Martinec, Z.; Gasperini, P.; Lund, B.; Wolf, D.; Vermeersen, L. L. A.; King, M. A.
2012-04-01
The study of the process of Glacial Isostatic Adjustment (GIA) and of the consequent sea-level variations is gaining an increasingly important role within the geophysical community. Understanding the response of the Earth to the waxing and waning ice sheets is crucial in various contexts, ranging from the interpretation of modern satellite geodetic measurements to the projections of future sea-level trends in response to climate change. All the processes accompanying GIA can be described by solving the so-called Sea Level Equation (SLE), an integral equation that accounts for the interactions between the ice sheets, the solid Earth, and the oceans. Modern approaches to the SLE are based on various techniques that range from purely analytical formulations to fully numerical methods. Despite various teams independently investigating GIA, we do not have a suitably large set of agreed numerical results through which the methods may be validated. Following the example of the mantle convection community and our recent successful benchmark for post-glacial rebound codes (Spada et al., 2011, doi:10.1111/j.1365-246X.2011.04952.x), here we present the results of a benchmark study of independently developed codes designed to solve the SLE. This study has taken place within a collaboration facilitated through the European Cooperation in Science and Technology (COST) Action ES0701. The tests involve predictions of past and current sea-level variations and 3D deformations of the Earth's surface. In spite of the significant differences in the numerical methods employed, the test computations performed so far show a satisfactory agreement between the results provided by the participants. The differences found, which can often be attributed to the different numerical algorithms employed within the community, help to constrain the intrinsic errors in model predictions. These are of fundamental importance for a correct interpretation of the geodetic variations observed today, and particularly for the evaluation of climate-driven sea-level variations.
2015-09-15
Full-Chain Benchmarking for Open Architecture Airborne ISR Systems: A Case Study for GMTI Radar Applications. Beebe, Matthias; Alexander, Matthew; et al. Abstract fragments: middleware implementations via a common object-oriented software hierarchy, with library-specific implementations of the five GMTI benchmark…; effective benchmarks are necessary to ensure that an ARP system can meet the mission constraints and performance requirements.
Social Studies: Grades 4, 8, & 11. Content Specifications for Statewide Assessment by Standard.
ERIC Educational Resources Information Center
Missouri State Dept. of Elementary and Secondary Education, Jefferson City.
This state of Missouri guide to content specifications for social studies assessment is designed to give teachers direction for assessment at the benchmark levels of grades 4, 8, and 11 for each standard that is appropriate for a statewide assessment. The guide includes specifications of what students are expected to know at the benchmark levels…
ERIC Educational Resources Information Center
Canadian Health Libraries Association.
Nine Canadian health libraries participated in a pilot test of the Benchmarking Tool Kit between January and April, 1998. Although the Tool Kit was designed specifically for health libraries, the content and approach are useful to other types of libraries as well. Used to its full potential, benchmarking can provide a common measuring stick to…
The NAS kernel benchmark program
NASA Technical Reports Server (NTRS)
Bailey, D. H.; Barton, J. T.
1985-01-01
A collection of benchmark test kernels that measure supercomputer performance has been developed for the use of the NAS (Numerical Aerodynamic Simulation) program at the NASA Ames Research Center. This benchmark program is described in detail and the specific ground rules are given for running the program as a performance test.
Transaction Processing Performance Council (TPC): State of the Council 2010
NASA Astrophysics Data System (ADS)
Nambiar, Raghunath; Wakou, Nicholas; Carman, Forrest; Majdalany, Michael
The Transaction Processing Performance Council (TPC) is a non-profit corporation founded to define transaction processing and database benchmarks and to disseminate objective, verifiable performance data to the industry. Established in August 1988, the TPC has been integral in shaping the landscape of modern transaction processing and database benchmarks over the past twenty-two years. This paper provides an overview of the TPC's existing benchmark standards and specifications, introduces two new TPC benchmarks under development, and examines the TPC's active involvement in the early creation of additional future benchmarks.
MARC calculations for the second WIPP structural benchmark problem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morgan, H.S.
1981-05-01
This report describes calculations made with the MARC structural finite element code for the second WIPP structural benchmark problem. Specific aspects of problem implementation such as element choice, slip line modeling, creep law implementation, and thermal-mechanical coupling are discussed in detail. Also included are the computational results specified in the benchmark problem formulation.
[Developing patient information sheets in general practice. Proposal for a methodology].
Sustersic, Mélanie; Meneau, Aurélia; Drémont, Roger; Paris, Adeline; Laborde, Laurent; Bosson, Jean-Luc
2008-12-15
Health information is both a wish and a right of patients. For general practitioners, it is a duty, a legal obligation and a prerequisite to any preventive approach. Written information should complement oral information, since it improves health care quality. However, in general practice there are no patient documents that are scientifically valid, understandable and efficient in terms of communication. Our objectives were to develop a method for creating patient information sheets and to test its feasibility through the development of 125 sheets covering the most common clinical conditions in general practice. Research and literature review informed the development of specifications, followed by the creation of 125 sheets following these specifications. The specifications developed consist of the following 10 steps: selection of the topic and the objectives, literature review, selection of the sections, drafting, validation of the scientific contents, assessment among patients, validation of the layout, selection of the media, delivery to patients, and updating. Following these specifications, we developed 125 information sheets. Each was reviewed by several physicians and assessed with the Flesch readability test (the established acceptable threshold value was 40). The 30 sheets with the lowest scores were selected and revised to improve their overall readability. Even though some difficulties cannot be avoided when developing patient information sheets, each physician or physician association can create its own documents following the proposed specifications and thus deliver a customized message.
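For illustration, a minimal Flesch-style readability check against the threshold of 40 used in the study; the syllable counter is a crude vowel-group heuristic, and the original sheets were in French, which would call for language-appropriate constants.

```python
import re

def count_syllables(word):
    # crude vowel-group heuristic; real tools use language-specific rules
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    n_syll = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (n_words / sentences) - 84.6 * (n_syll / n_words)

sheet_text = "Take one tablet twice a day. Drink plenty of water with it."
needs_rework = flesch_reading_ease(sheet_text) < 40   # study's threshold
```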
NASA Technical Reports Server (NTRS)
Chubb, Donald L.; White, K. Alan, III
1987-01-01
A new external flow radiator concept, the liquid sheet radiator (LSR), is introduced. The LSR sheet flow is described and an expression for the length/width (l/w) ratio is presented. A linear dependence of l/w on velocity is predicted that agrees with experimental results. Specific power for the LSR is calculated and is found to be nearly the same as the specific power of a liquid droplet radiator (LDR). Several sheet thicknesses and widths were experimentally investigated. In no case was the flow found to be unstable.
Noiseless Vlasov-Poisson simulations with linearly transformed particles
Pinto, Martin C.; Sonnendrucker, Eric; Friedman, Alex; ...
2014-06-25
We introduce a deterministic discrete-particle simulation approach, the Linearly-Transformed Particle-In-Cell (LTPIC) method, that employs linear deformations of the particles to reduce the noise traditionally associated with particle schemes. Formally, transforming the particles is justified by local first order expansions of the characteristic flow in phase space. In practice the method amounts to using deformation matrices within the particle shape functions; these matrices are updated via local evaluations of the forward numerical flow. Because it is necessary to periodically remap the particles on a regular grid to avoid excessively deforming their shapes, the method can be seen as a development of Denavit's Forward Semi-Lagrangian (FSL) scheme (Denavit, 1972 [8]). However, it has recently been established (Campos Pinto, 2012 [20]) that the underlying Linearly-Transformed Particle scheme converges for abstract transport problems, with no need to remap the particles; deforming the particles can thus be seen as a way to significantly lower the remapping frequency needed in the FSL schemes, and hence the associated numerical diffusion. To couple the method with electrostatic field solvers, two specific charge deposition schemes are examined, and their performance compared with that of the standard deposition method. Numerical 1d1v simulations involving benchmark test cases and halo formation in an initially mismatched thermal sheet beam demonstrate some advantages of our LTPIC scheme over the classical PIC and FSL methods. These benchmark cases also indicate that, for numerical choices involving similar computational effort, the LTPIC method is capable of accuracy comparable to or exceeding that of state-of-the-art, high-resolution Vlasov schemes.
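A minimal sketch of the linearly-transformed deposition idea, collapsed to one dimension so the deformation "matrix" is a scalar; the function and variable names are illustrative, not the paper's.

```python
import numpy as np

def b_spline(u):
    # quadratic B-spline shape function with support [-1.5, 1.5]
    u = np.abs(u)
    return np.where(u < 0.5, 0.75 - u**2,
           np.where(u < 1.5, 0.5 * (1.5 - u)**2, 0.0))

def ltp_density(x_grid, x_p, D_p, w_p, h):
    # deposit particle weights with linearly transformed shapes:
    # |D_k| * phi(D_k (x - x_k) / h) instead of phi((x - x_k) / h);
    # the D_k would be updated from local evaluations of the forward flow
    rho = np.zeros_like(x_grid)
    for xk, Dk, wk in zip(x_p, D_p, w_p):
        rho += wk * abs(Dk) * b_spline(Dk * (x_grid - xk) / h) / h
    return rho

x_grid = np.linspace(0.0, 1.0, 200)
rho = ltp_density(x_grid, x_p=[0.4, 0.6], D_p=[1.0, 1.3], w_p=[1.0, 1.0], h=0.05)
```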
Algorithm and Architecture Independent Benchmarking with SEAK
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tallent, Nathan R.; Manzano Franco, Joseph B.; Gawande, Nitin A.
2016-05-23
Many applications of high performance embedded computing are limited by performance or power bottlenecks. We have designed the Suite for Embedded Applications & Kernels (SEAK), a new benchmark suite, (a) to capture these bottlenecks in a way that encourages creative solutions; and (b) to facilitate rigorous, objective, end-user evaluation for their solutions. To avoid biasing solutions toward existing algorithms, SEAK benchmarks use a mission-centric (abstracted from a particular algorithm) and goal-oriented (functional) specification. To encourage solutions that are any combination of software or hardware, we use an end-user black-box evaluation that can capture tradeoffs between performance, power, accuracy, size, and weight. The tradeoffs are especially informative for procurement decisions. We call our benchmarks future proof because each mission-centric interface and evaluation remains useful despite shifting algorithmic preferences. It is challenging to create both concise and precise goal-oriented specifications for mission-centric problems. This paper describes the SEAK benchmark suite and presents an evaluation of sample solutions that highlights power and performance tradeoffs.
A Field-Based Aquatic Life Benchmark for Conductivity in ...
EPA announced the availability of the final report, A Field-Based Aquatic Life Benchmark for Conductivity in Central Appalachian Streams. This report describes a method to characterize the relationship between the extirpation (the effective extinction) of invertebrate genera and salinity (measured as conductivity) and from that relationship derives a freshwater aquatic life benchmark. This benchmark of 300 µS/cm may be applied to waters in Appalachian streams that are dominated by calcium and magnesium salts of sulfate and bicarbonate at circum-neutral to mildly alkaline pH. This report provides scientific evidence for a conductivity benchmark in a specific region rather than for the entire United States.
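A trivial illustration of applying such a benchmark as a screening threshold; the site names and conductivity values are hypothetical.

```python
def exceeds_conductivity_benchmark(conductivity_uS_cm, benchmark=300.0):
    """Flag samples above the 300 uS/cm Central Appalachian benchmark.

    Applies only to streams dominated by Ca/Mg salts of sulfate and
    bicarbonate at circum-neutral to mildly alkaline pH.
    """
    return conductivity_uS_cm > benchmark

samples = {"site_A": 250.0, "site_B": 480.0}   # hypothetical field data
flags = {site: exceeds_conductivity_benchmark(c) for site, c in samples.items()}
```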
ERIC Educational Resources Information Center
Moskovkin, Vladimir M.; Bocharova, Emilia A.; Balashova, Oksana V.
2014-01-01
Purpose: The purpose of this paper is to introduce and develop the methodology of journal benchmarking. Design/Methodology/ Approach: The journal benchmarking method is understood to be an analytic procedure of continuous monitoring and comparing of the advance of specific journal(s) against that of competing journals in the same subject area,…
75 FR 45144 - Recovery Fact Sheet 9580.203, Debris Monitoring
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-02
...] Recovery Fact Sheet 9580.203, Debris Monitoring AGENCY: Federal Emergency Management Agency, DHS. ACTION... accepting comments on Recovery Fact Sheet 9580.203, Debris Monitoring. DATES: Comments must be received by... guidelines. Specifically, the fact sheet provides information on debris monitoring roles and responsibilities...
ICSBEP Benchmarks For Nuclear Data Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Briggs, J. Blair
2005-05-24
The International Criticality Safety Benchmark Evaluation Project (ICSBEP) was initiated in 1992 by the United States Department of Energy. The ICSBEP became an official activity of the Organization for Economic Cooperation and Development (OECD) -- Nuclear Energy Agency (NEA) in 1995. Representatives from the United States, United Kingdom, France, Japan, the Russian Federation, Hungary, Republic of Korea, Slovenia, Serbia and Montenegro (formerly Yugoslavia), Kazakhstan, Spain, Israel, Brazil, Poland, and the Czech Republic are now participating. South Africa, India, China, and Germany are considering participation. The purpose of the ICSBEP is to identify, evaluate, verify, and formally document a comprehensive and internationally peer-reviewed set of criticality safety benchmark data. The work of the ICSBEP is published as an OECD handbook entitled ''International Handbook of Evaluated Criticality Safety Benchmark Experiments.'' The 2004 Edition of the Handbook contains benchmark specifications for 3331 critical or subcritical configurations that are intended for use in validation efforts and for testing basic nuclear data. New to the 2004 Edition of the Handbook is a draft criticality alarm / shielding type benchmark that should be finalized in 2005 along with two other similar benchmarks. The Handbook is being used extensively for nuclear data testing and is expected to be a valuable resource for code and data validation and improvement efforts for decades to come. Specific benchmarks that are useful for testing structural materials such as iron, chromium, nickel, and manganese; beryllium; lead; thorium; and 238U are highlighted.
Rahman, Sajjad; Salameh, Khalil; Al-Rifai, Hilal; Masoud, Ahmed; Lutfi, Samawal; Salama, Husam; Abdoh, Ghassan; Omar, Fahmi; Bener, Abdulbari
2011-09-01
To analyze and compare the current gestational age-specific neonatal survival rates between Qatar and international benchmarks. An analytical comparative study. Women's Hospital, Hamad Medical Corporation, Doha, Qatar, from 2003-2008. Six years' (2003-2008) gestational age-specific neonatal mortality data was stratified for each completed week of gestation at birth from 24 weeks till term. The data from World Health Statistics by WHO (2010), Vermont Oxford Network (VON, 2007) and National Statistics United Kingdom (2006) were used as international benchmarks for comparative analysis. A total of 82,002 babies were born during the study period. Qatar's neonatal mortality rate (NMR) dropped from 6/1000 in 2003 to 4.3/1000 in 2008 (p < 0.05). The overall and gestational age-specific neonatal mortality rates of Qatar were comparable with international benchmarks. The survival of < 27 weeks and term babies was better in Qatar (p=0.01 and p < 0.001 respectively) as compared to VON. The survival of > 32 weeks babies was better in the UK (p=0.01) as compared to Qatar. The relative risk (RR) of death decreased with increasing gestational age (p < 0.0001). Prematurity (45%) followed by lethal chromosomal and congenital anomalies (26.5%) were the two leading causes of neonatal deaths in Qatar. The current total and gestational age-specific neonatal survival rates in the State of Qatar are comparable with international benchmarks. In Qatar, persistently high rates of low birth weight and lethal chromosomal and congenital anomalies significantly contribute towards neonatal mortality.
Benchmarking specialty hospitals, a scoping review on theory and practice.
Wind, A; van Harten, W H
2017-04-04
Although benchmarking may improve hospital processes, research on this subject is limited. The aim of this study was to provide an overview of publications on benchmarking in specialty hospitals and a description of study characteristics. We searched PubMed and EMBASE for articles published in English in the last 10 years. Eligible articles described a project stating benchmarking as its objective and involving a specialty hospital or specific patient category, or dealt with the methodology or evaluation of benchmarking. Of 1,817 articles identified in total, 24 were included in the study. Articles were categorized into: pathway benchmarking, institutional benchmarking, articles on benchmark methodology or evaluation, and benchmarking using a patient registry. There was a large degree of variability: (1) study designs were mostly descriptive and retrospective; (2) not all studies generated and showed data in sufficient detail; and (3) there was variety in whether a benchmarking model was just described or if quality improvement as a consequence of the benchmark was reported upon. Most of the studies that described a benchmark model described the use of benchmarking partners from the same industry category, sometimes from all over the world. Benchmarking seems to be more developed in eye hospitals, emergency departments and oncology specialty hospitals. Some studies showed promising improvement effects. However, the majority of the articles lacked a structured design, and did not report on benchmark outcomes. In order to evaluate the effectiveness of benchmarking to improve quality in specialty hospitals, robust and structured designs are needed, including a follow-up to check whether the benchmark study has led to improvements.
Benchmarking Using Basic DBMS Operations
NASA Astrophysics Data System (ADS)
Crolotte, Alain; Ghazal, Ahmad
The TPC-H benchmark proved to be successful in the decision support area. Many commercial database vendors and their related hardware vendors used this benchmark to show the superiority and competitive edge of their products. However, over time, TPC-H became less representative of industry trends as vendors kept tuning their databases to this benchmark-specific workload. In this paper, we present XMarq, a simple benchmark framework that can be used to compare various software/hardware combinations. Our benchmark model is currently composed of 25 queries that measure the performance of basic operations such as scans, aggregations, joins and index access. This benchmark model is based on the TPC-H data model due to its maturity and well-understood data generation capability. We also propose metrics to evaluate single-system performance and compare two systems. Finally, we illustrate the effectiveness of this model by showing experimental results comparing two systems under different conditions.
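To make the idea concrete, here is a minimal basic-operations micro-benchmark in the spirit described, using SQLite; the table, data, and queries are illustrative stand-ins, not XMarq's actual 25 queries.

```python
import sqlite3
import time

# build a small TPC-H-flavoured table in memory
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE lineitem (l_orderkey INT, l_quantity REAL)")
conn.executemany("INSERT INTO lineitem VALUES (?, ?)",
                 [(i % 1000, float(i % 50)) for i in range(100_000)])

def timed(query):
    # time a single basic operation end to end
    t0 = time.perf_counter()
    rows = conn.execute(query).fetchall()
    return time.perf_counter() - t0, len(rows)

scan_t, _ = timed("SELECT * FROM lineitem")                       # full scan
agg_t, _ = timed("SELECT l_orderkey, SUM(l_quantity) "
                 "FROM lineitem GROUP BY l_orderkey")             # aggregation
```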
Xia, Yuan; Deshpande, Sameer; Bonates, Tiberius
2016-11-01
Social marketing managers promote desired behaviors to an audience by making them tangible in the form of environmental opportunities to enhance benefits and reduce barriers. This study proposed "benchmarks," modified from those found in the past literature, that would match important concepts of the social marketing framework and the inclusion of which would ensure behavior change effectiveness. In addition, we analyzed behavior change interventions on a "social marketing continuum" to assess whether the number of benchmarks and the role of specific benchmarks influence the effectiveness of physical activity promotion efforts. A systematic review of social marketing interventions available in academic studies published between 1997 and 2013 revealed 173 conditions in 92 interventions. Findings based on χ², Mallows' Cp, and Logical Analysis of Data tests revealed that the presence of more benchmarks in interventions increased the likelihood of success in promoting physical activity. The presence of more than 3 benchmarks improved the success of the interventions; specifically, all interventions were successful when more than 7.5 benchmarks were present. Further, primary formative research, core product, actual product, augmented product, promotion, and behavioral competition all had a significant influence on the effectiveness of interventions. Social marketing is an effective approach in promoting physical activity among adults when a substantial number of benchmarks are used and when managers understand the audience, make the desired behavior tangible, and promote the desired behavior persuasively.
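A sketch of the kind of association test reported, using SciPy's chi-squared test on a hypothetical benchmarks-vs-success contingency table; the counts are invented for illustration, not the study's data.

```python
from scipy.stats import chi2_contingency

# hypothetical 2x2 table: interventions using >3 benchmarks vs <=3,
# against success/failure counts
table = [[48, 12],   # >3 benchmarks: success, failure
         [55, 58]]   # <=3 benchmarks: success, failure
chi2, p, dof, expected = chi2_contingency(table)
more_benchmarks_help = p < 0.05   # reject independence at the 5% level
```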
46 CFR 160.005-1 - Incorporation by reference.
Code of Federal Regulations, 2014 CFR
2014-10-01
... makes reference to the following documents: (1) Federal Specification: L-P-375C—Plastic Film, Flexible....005-1: (Sheet 1) Cutting Pattern and General Arrangement (Adult). (Sheet 2) Alternate Stitching of Tapes and Webbing (Adult and Child). (Sheet 3) Pad Detail (Adult). (Sheet 4) Cutting Pattern and General...
46 CFR 160.005-1 - Incorporation by reference.
Code of Federal Regulations, 2013 CFR
2013-10-01
... makes reference to the following documents: (1) Federal Specification: L-P-375C—Plastic Film, Flexible....005-1: (Sheet 1) Cutting Pattern and General Arrangement (Adult). (Sheet 2) Alternate Stitching of Tapes and Webbing (Adult and Child). (Sheet 3) Pad Detail (Adult). (Sheet 4) Cutting Pattern and General...
46 CFR 160.005-1 - Incorporation by reference.
Code of Federal Regulations, 2012 CFR
2012-10-01
... makes reference to the following documents: (1) Federal Specification: L-P-375C—Plastic Film, Flexible....005-1: (Sheet 1) Cutting Pattern and General Arrangement (Adult). (Sheet 2) Alternate Stitching of Tapes and Webbing (Adult and Child). (Sheet 3) Pad Detail (Adult). (Sheet 4) Cutting Pattern and General...
46 CFR 160.005-1 - Incorporation by reference.
Code of Federal Regulations, 2011 CFR
2011-10-01
... makes reference to the following documents: (1) Federal Specification: L-P-375C—Plastic Film, Flexible....005-1: (Sheet 1) Cutting Pattern and General Arrangement (Adult). (Sheet 2) Alternate Stitching of Tapes and Webbing (Adult and Child). (Sheet 3) Pad Detail (Adult). (Sheet 4) Cutting Pattern and General...
Length of stay benchmarks for inpatient rehabilitation after stroke.
Meyer, Matthew; Britt, Eileen; McHale, Heather A; Teasell, Robert
2012-01-01
In Canada, no standardized benchmarks for length of stay (LOS) have been established for post-stroke inpatient rehabilitation. This paper describes the development of a severity-specific median length of stay benchmarking strategy, assessment of its impact after one year of implementation in a Canadian rehabilitation hospital, and establishment of updated benchmarks that may be useful for comparison with other facilities across Canada. Patient data were retrospectively assessed for all patients admitted to a single post-acute stroke rehabilitation unit in Ontario, Canada between April 2005 and March 2008. Rehabilitation Patient Groups (RPGs) were used to establish stratified median length of stay benchmarks for each group that were incorporated into team rounds beginning in October 2009. Benchmark impact was assessed using mean LOS, FIM® gain, and discharge destination for each RPG group, collected prospectively for one year, compared against similar information from the previous calendar year. Benchmarks were then adjusted accordingly for future use. Between October 2009 and September 2010, a significant reduction in average LOS was noted compared to the previous year (35.3 vs. 41.2 days; p < 0.05). Reductions in LOS were noted in each RPG group, including statistically significant reductions in 4 of the 7 groups. As intended, reductions in LOS were achieved with no significant reduction in mean FIM® gain or proportion of patients discharged home compared to the previous year. Adjusted benchmarks for LOS ranged from 13 to 48 days depending on the RPG group. After a single year of implementation, severity-specific benchmarks helped the rehabilitation team reduce LOS while maintaining the same levels of functional gain and achieving the same rate of discharge to the community. © 2012 Informa UK, Ltd.
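A minimal sketch of how severity-specific median LOS benchmarks might be derived and applied at team rounds, assuming pandas; the RPG codes and LOS values are illustrative.

```python
import pandas as pd

# hypothetical discharge records: RPG group and length of stay in days
df = pd.DataFrame({
    "rpg": ["1100", "1100", "1110", "1110", "1120"],
    "los_days": [52, 44, 31, 27, 18],
})

# severity-specific benchmark: median LOS within each RPG group
benchmarks = df.groupby("rpg")["los_days"].median()

# compare each patient's LOS against the benchmark for their group
over_benchmark = df["los_days"] > df["rpg"].map(benchmarks)
```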
Flow interaction with a flexible viscoelastic sheet
NASA Astrophysics Data System (ADS)
Shoele, Kourosh
2017-11-01
Many new engineered materials and almost all soft biological tissues are made up of heterogeneous multi-scale components with complex viscoelastic behavior. This implies that their macro constitutive relations cannot be modeled sufficiently with a typical integer-order viscoelastic relation, and a more general model is required. Here, we study the flow-induced vibration of a viscoelastic sheet where a generalized fractional constitutive model is employed to represent the relation between the bending stress and the temporal response of the structure. A new method is proposed for the calculation of the convolution integral inside the fractional model, and its computational benefits will be discussed. Using a coupled fluid-structure interaction (FSI) methodology based on the immersed boundary technique, dynamic fluttering modes of the structure as a result of the fluid force will be presented, and the role of fractional viscoelasticity in the dynamics of the structure will be shown. Finally, it will be argued how the stress relaxation modifies the flow-induced oscillatory responses of this benchmark problem.
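For readers unfamiliar with fractional constitutive laws, the history convolution that such models require (and that the paper's method is designed to accelerate) can be sketched with a naive Grünwald-Letnikov discretization; alpha, the strain history, and the function name are illustrative.

```python
import numpy as np

def gl_fractional_derivative(eps, alpha, dt):
    # naive Grunwald-Letnikov discretization of d^alpha eps / dt^alpha,
    # with weights w_0 = 1, w_j = w_{j-1} * (j - 1 - alpha) / j;
    # cost is O(n^2) because every step sums over the full history
    n = len(eps)
    w = np.ones(n)
    for j in range(1, n):
        w[j] = w[j - 1] * (j - 1 - alpha) / j
    out = np.zeros(n)
    for k in range(n):
        out[k] = np.dot(w[:k + 1], eps[k::-1]) / dt**alpha
    return out

t = np.linspace(0.0, 1.0, 400)
eps = np.sin(2 * np.pi * t)                       # toy strain history
stress_like = gl_fractional_derivative(eps, alpha=0.5, dt=t[1] - t[0])
```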
Tie-fibre structure and organization in the knee menisci
Andrews, Stephen H J; Rattner, Jerome B; Abusara, Ziad; Adesida, Adetola; Shrive, Nigel G; Ronsky, Janet L
2014-01-01
The collagenous structure of the knee menisci is integral to the mechanical integrity of the tissue and the knee joint. The tie-fibre structure of the tissue has largely been neglected, despite previous studies demonstrating its correlation with radial stiffness. This study has evaluated the structure of the tie-fibres of bovine menisci using 2D and 3D microscopy techniques. Standard collagen and proteoglycan (PG) staining and 2D light microscopy techniques were conducted. For the first time, the collagenous structure of the menisci was evaluated using 3D, second harmonic generation (SHG) microscopy. This technique facilitated the imaging of collagen structure in thick sections (50–100 μm). Imaging identified that tie-fibres of the menisci arborize from the outer margin of the meniscus toward the inner tip. This arborization is associated with the structural arrangement of the circumferential fibres. SHG microscopy has definitively demonstrated the 3D organization of tie-fibres in both sheets and bundles. The hierarchy of the structure is related to the organization of circumferential fascicles. Large tie-fibre sheets bifurcate into smaller sheets to surround circumferential fascicles of decreasing size. The tie-fibres emanate from the lamellar layer that appears to surround the entire meniscus. At the tibial and femoral surfaces these tie-fibre sheets branch perpendicularly into the meniscal body. The relationship between tie-fibres and blood vessels in the menisci was also observed in this study. Tie-fibre sheets surround the blood vessels and an associated PG-rich region. This subunit of the menisci has not previously been described. The size of tie-fibre sheets surrounding the vessels appeared to be associated with the size of blood vessel. These structural findings have implications in understanding the mechanics of the menisci. Further, refinement of the complex structure of the tie-fibres is important in understanding the consequences of injury and disease in the menisci. The framework of meniscus architecture also defines benchmarks for the development of tissue-engineered replacements in the future. PMID:24617800
Foreign currency-related translation complexities in cross-border healthcare applications.
Kumar, Anand; Rodrigues, Jean M
2009-01-01
International cross-border private hospital chains need to apply the standards for foreign currency translation in order to consolidate the balance sheet and income statements. This not only exposes such chains to exchange rate fluctuations in different ways, but also creates added requirements for enterprise-level IT systems especially when they produce parameters which are used to measure the financial and operational performance of the foreign subsidiary or the parent hospital. Such systems would need to come to terms with the complexities involved in such currency-related translations in order to provide the correct data for performance benchmarking.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hennig, J., E-mail: jonas.hennig@ovgu.de; Dadgar, A.; Witte, H.
2015-07-15
We report on GaN-based field-effect transistor (FET) structures exhibiting sheet carrier densities of n = 2.9 × 10^13 cm^-2 for high-power transistor applications. By grading the indium content of InGaN layers grown prior to a conventional GaN/AlN/AlInN FET structure, control of the channel width at the GaN/AlN interface is obtained. The composition of the InGaN layer was graded from nominally x_In = 30% to pure GaN just below the AlN/AlInN interface. Simulations reveal the impact of the additional InGaN layer on the potential well width, which controls the sheet carrier density within the channel region of the devices. Benchmarking the In_xGa_(1-x)N/GaN/AlN/Al_0.87In_0.13N based FETs against GaN/AlN/AlInN FET reference structures, we found increased maximum current densities of I_SD = 1300 mA/mm (560 mA/mm). In addition, the InGaN layer helps to achieve broader transconductance profiles as well as reduced leakage currents.
Did accelerated North American ice sheet melt contribute to the 8.2 ka cooling event ?
NASA Astrophysics Data System (ADS)
Matero, Ilkka S. O.; Gregoire, Lauren J.; Ivanović, Ruža F.; Tindall, Julia C.; Haywood, Alan M.
2016-04-01
The 8.2 ka event was an abrupt cooling of the Northern Hemisphere 8,200 years ago. It is an almost ideal case study to benchmark the sensitivity of climate models to freshening of the North Atlantic by ice sheet melt (Schmidt and LeGrande, 2005). The event is attributed to the outburst of North American proglacial lakes into the Labrador Sea, causing a slow-down in Atlantic overturning circulation and cooling of 1-2.5 °C around the N. Atlantic (Alley and Ágústsdóttir, 2005). Climate models fail to simulate the ~150 year duration of the event when forced with a sudden (0.5 to 5 years) drainage of the lakes (Morrill et al., 2013a). This could be because of missing forcings. For example, the separation of ice sheet domes around the Hudson Bay is thought to have produced a pronounced acceleration in ice sheet melt through a saddle collapse mechanism around the time of the event (Gregoire et al., 2012). Here we investigate whether this century scale acceleration of melt contributed to the observed climatic perturbation, using the coupled Ocean-Atmosphere climate model HadCM3. We designed and ran a set of simulations with temporally variable ice melt scenarios based on a model of the North American ice sheet. The simulated magnitude and duration of the cold period are controlled by the duration and amount of freshwater introduced to the ocean. With a 100-200 year-long acceleration of ice melt up to a maximum of 0.61 Sv, we simulate 1-3 °C cooling in the North Atlantic and ~0.5-1 °C cooling in Continental Europe, which are similar in magnitude to the ~1-2 °C cooling estimated from records for these areas (Morrill et al., 2013b). Some of the observed features are however not reproduced in our experiments, such as the most pronounced cooling of ~6 °C observed in central Greenland (Alley and Ágústsdóttir, 2005). The results suggest that the ~150 year North Atlantic and European cooling could be caused by ~200 years of accelerated North American ice sheet melt. This forcing should therefore be taken into account in the setup of 8.2 ka simulations. References: Alley, R.B., Ágústsdóttir, A.M., 2005. The 8 k event: cause and consequences of a major Holocene abrupt climate change. Quaternary Science Reviews 24 (10-11), 1123-1149. Gregoire, L. J., A. J. Payne, and P. J. Valdes (2012), Deglacial rapid sea level rises caused by ice-sheet saddle collapses, Nature, 487, 219-223. Morrill, C., A. N. LeGrande, H. Renssen, P. Bakker, and B. L. Otto-Bliesner (2013a), Model sensitivity to North Atlantic freshwater forcing at 8.2 ka, Clim. Past, 9, 955-968. Morrill, C., D. M. Anderson, B. A. Bauer, R. Buckner, E. P. Gille, W. S. Gross, M. Hartman, and A. Shah (2013b), Proxy benchmarks for inter-comparison of 8.2 ka simulations, Clim. Past, 9, 423-432. Schmidt, G. A., and A. N. LeGrande (2005), The Goldilocks abrupt climate change event, Quat. Sci. Rev., 24, 1109-1110.
PMLB: a large benchmark suite for machine learning evaluation and comparison.
Olson, Randal S; La Cava, William; Orzechowski, Patryk; Urbanowicz, Ryan J; Moore, Jason H
2017-01-01
The selection, development, or comparison of machine learning methods in data mining can be a difficult task based on the target problem and goals of a particular study. Numerous publicly available real-world and simulated benchmark datasets have emerged from different sources, but their organization and adoption as standards have been inconsistent. As such, selecting and curating specific benchmarks remains an unnecessary burden on machine learning practitioners and data scientists. The present study introduces an accessible, curated, and developing public benchmark resource to facilitate identification of the strengths and weaknesses of different machine learning methodologies. We compare meta-features among the current set of benchmark datasets in this resource to characterize the diversity of available data. Finally, we apply a number of established machine learning methods to the entire benchmark suite and analyze how datasets and algorithms cluster in terms of performance. From this study, we find that existing benchmarks lack the diversity to properly benchmark machine learning algorithms, and there are several gaps in benchmarking problems that still need to be considered. This work represents another important step towards understanding the limitations of popular benchmarking suites and developing a resource that connects existing benchmarking standards to more diverse and efficient standards in the future.
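A minimal usage sketch, assuming the `pmlb` Python package and scikit-learn are installed; `fetch_data` is the package's documented entry point, but treat the exact dataset name and argument set as assumptions to check against the current release.

```python
# assumes: pip install pmlb scikit-learn
from pmlb import fetch_data
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# pull one benchmark dataset from the curated suite (name: assumption)
X, y = fetch_data("mushroom", return_X_y=True)

# evaluate a baseline learner the same way across the whole suite
scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)
print(scores.mean())
```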
46 CFR 160.002-1 - Incorporation by reference.
Code of Federal Regulations, 2014 CFR
2014-10-01
.... General Purpose, Natural or in Colors. (2) Federal Specification: L-P-375—Plastic Film, Flexible, Vinyl...: (Sheet 1) Cutting Pattern and General Arrangement (adult). (Sheet 1A) Alternate stitching of tapes and webbing (adult and child). (Sheet 2) Pad Detail (adult). Dwg. No. F-49-6-5: (Sheet 1) Cutting Pattern and...
46 CFR 160.002-1 - Incorporation by reference.
Code of Federal Regulations, 2013 CFR
2013-10-01
.... General Purpose, Natural or in Colors. (2) Federal Specification: L-P-375—Plastic Film, Flexible, Vinyl...: (Sheet 1) Cutting Pattern and General Arrangement (adult). (Sheet 1A) Alternate stitching of tapes and webbing (adult and child). (Sheet 2) Pad Detail (adult). Dwg. No. F-49-6-5: (Sheet 1) Cutting Pattern and...
46 CFR 160.002-1 - Incorporation by reference.
Code of Federal Regulations, 2010 CFR
2010-10-01
.... General Purpose, Natural or in Colors. (2) Federal Specification: L-P-375—Plastic Film, Flexible, Vinyl...: (Sheet 1) Cutting Pattern and General Arrangement (adult). (Sheet 1A) Alternate stitching of tapes and webbing (adult and child). (Sheet 2) Pad Detail (adult). Dwg. No. F-49-6-5: (Sheet 1) Cutting Pattern and...
46 CFR 160.002-1 - Incorporation by reference.
Code of Federal Regulations, 2011 CFR
2011-10-01
.... General Purpose, Natural or in Colors. (2) Federal Specification: L-P-375—Plastic Film, Flexible, Vinyl...: (Sheet 1) Cutting Pattern and General Arrangement (adult). (Sheet 1A) Alternate stitching of tapes and webbing (adult and child). (Sheet 2) Pad Detail (adult). Dwg. No. F-49-6-5: (Sheet 1) Cutting Pattern and...
46 CFR 160.002-1 - Incorporation by reference.
Code of Federal Regulations, 2012 CFR
2012-10-01
.... General Purpose, Natural or in Colors. (2) Federal Specification: L-P-375—Plastic Film, Flexible, Vinyl...: (Sheet 1) Cutting Pattern and General Arrangement (adult). (Sheet 1A) Alternate stitching of tapes and webbing (adult and child). (Sheet 2) Pad Detail (adult). Dwg. No. F-49-6-5: (Sheet 1) Cutting Pattern and...
A review and assessment of Virginia's license plate sheeting specifications.
DOT National Transportation Integrated Search
2003-01-01
At the request of Virginia's Secretary of Transportation, the Virginia Transportation Research Council undertook a review and assessment of Virginia's license plate sheeting specifications. The review was focused on the five test methods or specifica...
NASA Astrophysics Data System (ADS)
Steen-Larsen, Hans Christian; Sveinbjörnsdottir, Arny; Masson-Delmotte, Valerie; Werner, Martin; Risi, Camille; Yoshimura, Kei
2016-04-01
We have since 2010 carried out in-situ continuous water vapor isotope observations on top of the Greenland Ice Sheet (3 seasons at NEEM), in Svalbard (1 year), in Iceland (4 years), and in Bermuda (4 years). This expansive dataset, containing high accuracy and precision measurements of δ18O, δD, and the d-excess, allows us to validate and benchmark the treatment of the atmospheric hydrological cycle's processes in General Circulation Models using simulations nudged to reanalysis products. Recent findings from both Antarctica and Greenland have documented strong interaction between the snow surface isotopes and the near surface atmospheric water vapor isotopes on diurnal to synoptic time scales. In fact, it has been shown that the snow surface isotopes take up the synoptic driven atmospheric water vapor isotopic signal in-between precipitation events, erasing the precipitation isotope signal in the surface snow. This highlights the importance of using General or Regional Climate Models that are able to accurately simulate the atmospheric water vapor isotopic composition to understand and interpret the ice core isotope signal. With this in mind we have used three isotope-enabled General Circulation Models (isoGSM, ECHAM5-wiso, and LMDZiso) nudged to reanalysis products. We have compared the simulations of daily mean isotope values directly with our in-situ observations. This has allowed us to characterize the variability of the isotopic composition in the models and compare it to our observations. We have specifically focused on the d-excess in order to characterize why both the mean and the variability are significantly lower in the models than in our observations. We argue that using water vapor isotopes to benchmark General Circulation Models offers an excellent tool for improving the treatment and parameterization of the atmospheric hydrological cycle. Recent studies have documented a very large inter-model dispersion in the treatment of the Arctic water cycle under a future global warming and greenhouse gas emission scenario. Our results call for action to create an international pan-Arctic water vapor isotope monitoring network in order to improve future projections of Arctic climate.
SP2Bench: A SPARQL Performance Benchmark
NASA Astrophysics Data System (ADS)
Schmidt, Michael; Hornung, Thomas; Meier, Michael; Pinkel, Christoph; Lausen, Georg
A meaningful analysis and comparison of both existing storage schemes for RDF data and evaluation approaches for SPARQL queries necessitates a comprehensive and universal benchmark platform. We present SP2Bench, a publicly available, language-specific performance benchmark for the SPARQL query language. SP2Bench is settled in the DBLP scenario and comprises a data generator for creating arbitrarily large DBLP-like documents and a set of carefully designed benchmark queries. The generated documents mirror vital key characteristics and social-world distributions encountered in the original DBLP data set, while the queries implement meaningful requests on top of this data, covering a variety of SPARQL operator constellations and RDF access patterns. In this chapter, we discuss requirements and desiderata for SPARQL benchmarks and present the SP2Bench framework, including its data generator, benchmark queries and performance metrics.
46 CFR 160.061-1 - Applicable specifications.
Code of Federal Regulations, 2011 CFR
2011-10-01
...—Iron and steel; sheet, tinned (tin plate). QQ-W-423—Wire, steel, corrosion-resisting HH-P-91—Packing, fiber, hard sheet. CCC-F-451—Flannel, canton. (2) Military specifications: MIL-H-2846—Hooks, fish, steel...
46 CFR 160.061-1 - Applicable specifications.
Code of Federal Regulations, 2010 CFR
2010-10-01
...—Iron and steel; sheet, tinned (tin plate). QQ-W-423—Wire, steel, corrosion-resisting HH-P-91—Packing, fiber, hard sheet. CCC-F-451—Flannel, canton. (2) Military specifications: MIL-H-2846—Hooks, fish, steel...
Benchmarking Discount Rate in Natural Resource Damage Assessment with Risk Aversion.
Wu, Desheng; Chen, Shuzhen
2017-08-01
Benchmarking a credible discount rate is of crucial importance in natural resource damage assessment (NRDA) and restoration evaluation. This article integrates a holistic framework of NRDA with prevailing low discount rate theory, and proposes a discount rate benchmarking decision support system based on service-specific risk aversion. The proposed approach has the flexibility of choosing appropriate discount rates for gauging long-term services, as opposed to decisions based simply on duration. It improves injury identification in NRDA since potential damages and side-effects to ecosystem services are revealed within the service-specific framework. A real embankment case study demonstrates valid implementation of the method. © 2017 Society for Risk Analysis.
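A sketch of service-specific discounting in the spirit of the proposed decision support: long-horizon ecosystem services receive a lower, risk-aversion-informed rate instead of a single rate chosen by duration alone. All figures and service names are invented for illustration.

```python
def present_value(annual_benefit, years, rate):
    # discrete discounting of a constant annual service benefit
    return sum(annual_benefit / (1 + rate) ** t for t in range(1, years + 1))

services = {
    # name: (annual benefit in $, horizon in years, service-specific rate)
    "recreation":      (120_000, 10, 0.05),
    "habitat_support": ( 80_000, 80, 0.015),  # low rate for a long-term service
}
pv = {name: present_value(*args) for name, args in services.items()}
```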
Exploring Deliberate Practice & the Use of Skill Sheets in the Collegiate Leadership Competition
ERIC Educational Resources Information Center
Allen, Scott J.; Jenkins, Daniel M.; Krizanovic, Bela
2018-01-01
Little has been written about the use of skill sheets in leadership education and this paper demonstrates how they have been implemented in one specific context. Used in a number of domains (e.g., karate, cardiopulmonary resuscitation) skill sheets are checklists or rubrics that record skill performance. The use of skill sheets in leadership…
NAS Grid Benchmarks: A Tool for Grid Space Exploration
NASA Technical Reports Server (NTRS)
Frumkin, Michael; VanderWijngaart, Rob F.; Biegel, Bryan (Technical Monitor)
2001-01-01
We present an approach for benchmarking services provided by computational Grids. It is based on the NAS Parallel Benchmarks (NPB) and is called the NAS Grid Benchmark (NGB) in this paper. We present NGB as a data flow graph encapsulating an instance of an NPB code in each graph node, which communicates with other nodes by sending/receiving initialization data. These nodes may be mapped to the same or different Grid machines. Like NPB, NGB will specify several different classes (problem sizes). NGB also specifies the generic Grid services sufficient for running the benchmark. The implementor has the freedom to choose any specific Grid environment. However, we describe a reference implementation in Java, and present some scenarios for using NGB.
NASA Technical Reports Server (NTRS)
VanderWijngaart, Rob; Frumkin, Michael; Biegel, Bryan A. (Technical Monitor)
2002-01-01
We provide a paper-and-pencil specification of a benchmark suite for computational grids. It is based on the NAS (NASA Advanced Supercomputing) Parallel Benchmarks (NPB) and is called the NAS Grid Benchmarks (NGB). NGB problems are presented as data flow graphs encapsulating an instance of a slightly modified NPB task in each graph node, which communicates with other nodes by sending/receiving initialization data. Like NPB, NGB specifies several different classes (problem sizes). In this report we describe classes S, W, and A, and provide verification values for each. The implementor has the freedom to choose any language, grid environment, security model, fault tolerance/error correction mechanism, etc., as long as the resulting implementation passes the verification test and reports the turnaround time of the benchmark.
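An NGB implementation is free to choose its language and grid machinery; below is a toy Python rendering of the data-flow-graph idea (Python 3.9+ for `graphlib`), with stand-in task bodies rather than real NPB kernels.

```python
import graphlib   # stdlib topological sorting, Python 3.9+

# node -> set of predecessors; BT consumes SP's output, SP consumes LU's
graph = {"BT": {"SP"}, "SP": {"LU"}, "LU": set()}

def run_task(name, inputs):
    # stand-in for launching an NPB-like task on some grid machine and
    # forwarding its output as the next node's initialization data
    return f"{name}-output({','.join(inputs) or 'seed'})"

results = {}
for node in graphlib.TopologicalSorter(graph).static_order():
    results[node] = run_task(node, [results[d] for d in graph[node]])
```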
Recommended specifications and a tentative testing procedure for reflective sheeting.
DOT National Transportation Integrated Search
1977-01-01
The Department spends approximately a half million dollars annually on reflective sheeting for highway signs. For years this sheeting has been purchased primarily from one manufacturer, because of the inability of other firms to produce a competitive...
U.S. EPA'S ACUTE REFERENCE EXPOSURE METHODOLOGY FOR ACUTE INHALATION EXPOSURES
The US EPA National Center for Environmental Assessment has developed a methodology to derive acute inhalation toxicity benchmarks, called acute reference exposures (AREs), for noncancer effects. The methodology provides guidance for the derivation of chemical-specific benchmark...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karl Anderson, Steve Plimpton
2015-01-27
The FireHose Streaming Benchmarks are a suite of stream-processing benchmarks defined to enable comparison of streaming software and hardware, both quantitatively vis-a-vis the rate at which they can process data, and qualitatively by judging the effort involved to implement and run the benchmarks. Each benchmark has two parts. The first is a generator which produces and outputs datums at a high rate in a specific format. The second is an analytic which reads the stream of datums and is required to perform a well-defined calculation on the collection of datums, typically to find anomalous datums that have been created in the stream by the generator. The FireHose suite provides code for the generators, sample code for the analytics (which users are free to re-implement in their own custom frameworks), and a precise definition of each benchmark calculation.
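A toy generator/analytic pair showing the two-part structure described; the datum format, window, and threshold are invented for illustration and do not match the suite's defined benchmarks.

```python
import random
from collections import defaultdict

def generator(n, anomaly_rate=0.001):
    # emit datums as "key,value" strings in a fixed, simple format
    for _ in range(n):
        key = random.randrange(10_000)
        value = 1 if random.random() < anomaly_rate else 0
        yield f"{key},{value}"

def analytic(stream, min_seen=20, threshold=3):
    # flag a key once its positive count crosses a threshold after
    # enough datums have been seen for that key
    stats = defaultdict(lambda: [0, 0])    # key -> [seen, positives]
    flagged = set()
    for datum in stream:
        key, value = datum.split(",")
        s = stats[key]
        s[0] += 1
        s[1] += int(value)
        if key not in flagged and s[0] >= min_seen and s[1] >= threshold:
            flagged.add(key)
            yield key

anomalies = list(analytic(generator(200_000)))
```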
Reducing accounts receivable through benchmarking and best practices identification.
Berkey, T
1998-01-01
As HIM professionals look for ways to become more competitive and achieve the best results, the importance of discovering best practices becomes more apparent. Here's how one team used a benchmarking project to provide specific best practices that reduced accounts receivable days.
APPLICATION OF BENCHMARK DOSE METHODOLOGY TO DATA FROM PRENATAL DEVELOPMENTAL TOXICITY STUDIES
The benchmark dose (BMD) concept was applied to 246 conventional developmental toxicity datasets from government, industry and commercial laboratories. Five modeling approaches were used, two generic and three specific to developmental toxicity (DT models). BMDs for both quantal ...
New NAS Parallel Benchmarks Results
NASA Technical Reports Server (NTRS)
Yarrow, Maurice; Saphir, William; VanderWijngaart, Rob; Woo, Alex; Kutler, Paul (Technical Monitor)
1997-01-01
NPB2 (NAS (NASA Advanced Supercomputing) Parallel Benchmarks 2) is an implementation, based on Fortran and the MPI (message passing interface) message passing standard, of the original NAS Parallel Benchmark specifications. NPB2 programs are run with little or no tuning, in contrast to NPB vendor implementations, which are highly optimized for specific architectures. NPB2 results complement, rather than replace, NPB results. Because they have not been optimized by vendors, NPB2 implementations approximate the performance a typical user can expect for a portable parallel program on distributed memory parallel computers. Together these results provide an insightful comparison of the real-world performance of high-performance computers. New NPB2 features: New implementation (CG), new workstation class problem sizes, new serial sample versions, more performance statistics.
Benchmark matrix and guide: Part II.
1991-01-01
In the last issue of the Journal of Quality Assurance (September/October 1991, Volume 13, Number 5, pp. 14-19), the benchmark matrix developed by Headquarters Air Force Logistics Command was published. Five horizontal levels on the matrix delineate progress in TQM: business as usual, initiation, implementation, expansion, and integration. The six vertical categories that are critical to the success of TQM are leadership, structure, training, recognition, process improvement, and customer focus. In this issue, "Benchmark Matrix and Guide: Part II" will show specifically how to apply the categories of leadership, structure, and training to the benchmark matrix progress levels. At the intersection of each category and level, specific behavior objectives are listed with supporting behaviors and guidelines. Some categories will have objectives that are relatively easy to accomplish, allowing quick progress from one level to the next. Other categories will take considerable time and effort to complete. In the next issue, Part III of this series will focus on recognition, process improvement, and customer focus.
The Suite for Embedded Applications and Kernels
DOE Office of Scientific and Technical Information (OSTI.GOV)
2016-05-10
Many applications of high performance embedded computing are limited by performance or power bottlenecks. We have designed SEAK, a new benchmark suite, (a) to capture these bottlenecks in a way that encourages creative solutions; and (b) to facilitate rigorous, objective, end-user evaluation for their solutions. To avoid biasing solutions toward existing algorithms, SEAK benchmarks use a mission-centric (abstracted from a particular algorithm) and goal-oriented (functional) specification. To encourage solutions that are any combination of software or hardware, we use an end-user black-box evaluation that can capture tradeoffs between performance, power, accuracy, size, and weight. The tradeoffs are especially informative for procurement decisions. We call our benchmarks future proof because each mission-centric interface and evaluation remains useful despite shifting algorithmic preferences. It is challenging to create both concise and precise goal-oriented specifications for mission-centric problems. This paper describes the SEAK benchmark suite and presents an evaluation of sample solutions that highlights power and performance tradeoffs.
14 CFR Section 23 - Certification and Balance Sheet Elements
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Certification and Balance Sheet Elements... AIR CARRIERS Financial Reporting Requirements Section 23 Certification and Balance Sheet Elements... report except as specifically noted in the financial and statistical statements. Schedule B-1 Balance...
14 CFR Section 23 - Certification and Balance Sheet Elements
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Certification and Balance Sheet Elements... AIR CARRIERS Financial Reporting Requirements Section 23 Certification and Balance Sheet Elements... report except as specifically noted in the financial and statistical statements. Schedule B-1 Balance...
Benchmarking Strategies for Measuring the Quality of Healthcare: Problems and Prospects
Lovaglio, Pietro Giorgio
2012-01-01
Over the last few years, increasing attention has been directed toward the problems inherent to measuring the quality of healthcare and implementing benchmarking strategies. Besides offering accreditation and certification processes, recent approaches measure the performance of healthcare institutions in order to evaluate their effectiveness, defined as the capacity to provide treatment that modifies and improves the patient's state of health. This paper, dealing with hospital effectiveness, focuses on research methods for effectiveness analyses within a strategy comparing different healthcare institutions. The paper, after having introduced readers to the principal debates on benchmarking strategies, which depend on the perspective and type of indicators used, focuses on the methodological problems related to performing consistent benchmarking analyses. Particularly, statistical methods suitable for controlling case-mix, analyzing aggregate data, rare events, and continuous outcomes measured with error are examined. Specific challenges of benchmarking strategies, such as the risk of risk adjustment (case-mix fallacy, underreporting, risk of comparing noncomparable hospitals), selection bias, and possible strategies for the development of consistent benchmarking analyses, are discussed. Finally, to demonstrate the feasibility of the illustrated benchmarking strategies, an application focused on determining regional benchmarks for patient satisfaction (using 2009 Lombardy Region Patient Satisfaction Questionnaire) is proposed. PMID:22666140
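As one concrete example of the case-mix control discussed, a sketch of indirect standardization (observed vs. expected events under benchmark rates), with invented data and column names.

```python
import pandas as pd

# benchmark event rates per risk stratum (illustrative)
benchmark_rates = {"low": 0.02, "medium": 0.08, "high": 0.20}

# one hospital's patients: risk stratum and observed event indicator
patients = pd.DataFrame({
    "risk":  ["low"] * 50 + ["medium"] * 30 + ["high"] * 20,
    "event": [0] * 48 + [1] * 2 + [0] * 26 + [1] * 4 + [0] * 14 + [1] * 6,
})

# expected events if benchmark rates applied to this hospital's case-mix
expected = patients["risk"].map(benchmark_rates).sum()
observed = patients["event"].sum()
smr = observed / expected   # >1: worse than the benchmark case-mix predicts
```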
Benchmark Problems for Spacecraft Formation Flying Missions
NASA Technical Reports Server (NTRS)
Carpenter, J. Russell; Leitner, Jesse A.; Burns, Richard D.; Folta, David C.
2003-01-01
To provide high-level focus to distributed space system flight dynamics and control research, several benchmark problems are suggested. These problems are not specific to any current or proposed mission, but instead are intended to capture high-level features that would be generic to many similar missions.
NASA Astrophysics Data System (ADS)
Fermo, Raymond Luis Lachica
2011-12-01
Magnetic reconnection is a process responsible for the conversion of magnetic energy into plasma flows in laboratory, space, and astrophysical plasmas. A product of reconnection, magnetic islands have been observed in long current layers for various space plasmas, including the magnetopause, the magnetotail, and the solar corona. In this thesis, a statistical model is developed for the dynamics of magnetic islands in very large current layers, for which conventional plasma simulations prove inadequate. An island distribution function f characterizes islands by the flux ψ they contain and the area A they enclose. An integro-differential evolution equation for f describes their creation at small scales, growth due to quasi-steady reconnection, convection along the current sheet, and their coalescence with one another. The steady-state solution of the evolution equation predicts a distribution of islands in which the signature of island merging is an asymmetry in ψ-A phase space. A Hall MHD (magnetohydrodynamic) simulation of a very long current sheet with large numbers of magnetic islands is used to explore their dynamics, specifically their growth via two distinct mechanisms: quasi-steady reconnection and merging. The results of the simulation enable validation of the statistical model and benchmarking of its parameters. A PIC (particle-in-cell) simulation investigates how secondary islands form in guide field reconnection, revealing that they are born at electron skin depth scales not as islands from the tearing instability but as vortices from a flow instability. A database of 1,098 flux transfer events (FTEs) observed by Cluster between 2001 and 2003 compares favorably with the model's predictions, and also suggests island merging plays a significant role in the magnetopause. Consequently, the magnetopause is likely populated by many FTEs too small to be recognized by spacecraft instrumentation. The results of this research suggest that a complete theory of reconnection in large current sheets should account for the disparate separation of scales---from the kinetic scales at which islands are produced to the macroscale objects observed in the systems in question.
NASA Astrophysics Data System (ADS)
Cui, Xiangyang; Li, She; Feng, Hui; Li, Guangyao
2017-05-01
In this paper, a novel triangular prism solid and shell interactive mapping element is proposed to solve the coupled magnetic-mechanical formulation in the electromagnetic sheet metal forming process. A linear six-node "Triprism" element is first proposed for transient eddy current analysis in the electromagnetic field. In the present "Triprism" element, shape functions are given explicitly, and a cell-wise gradient smoothing operation is used to obtain the gradient matrices without evaluating derivatives of shape functions. In the mechanical field analysis, a shear-locking-free triangular shell element is employed in the internal force computation, and a data mapping method is developed to transfer the Lorentz force on the solid into the external forces acting on the shell structure for dynamic elasto-plastic deformation analysis. Based on the deformed triangular shell structure, a "Triprism" element generation rule is established for updated electromagnetic analysis, meaning that inter-transformation of meshes between the coupled fields can be performed automatically. In addition, a dynamic moving mesh is adopted for air mesh updating based on the deformation of the sheet metal. A benchmark problem is carried out to confirm the accuracy of the proposed "Triprism" element in predicting flux density in the electromagnetic field. Solutions of several EMF problems obtained by the present work are compared with experimental results and those of the traditional method, showing the excellent performance of the present interactive mapping element.
Higher-order ice-sheet modelling accelerated by multigrid on graphics cards
NASA Astrophysics Data System (ADS)
Brædstrup, Christian; Egholm, David
2013-04-01
Higher-order ice flow modelling is a very computer intensive process, owing primarily to the nonlinear influence of the horizontal stress coupling. When applied for simulating long-term glacial landscape evolution, the ice-sheet models must consider very long time series, while both high temporal and spatial resolution are needed to resolve small effects. The use of higher-order and full-Stokes models has therefore seen very limited usage in this field. However, recent advances in graphics card (GPU) technology for high performance computing have proven extremely efficient in accelerating many large-scale scientific computations. The general purpose GPU (GPGPU) technology is cheap, has a low power consumption and fits into a normal desktop computer. It could therefore provide a powerful tool for many glaciologists working on ice flow models. Our current research focuses on utilising the GPU as a tool in ice-sheet and glacier modelling. To this extent we have implemented the Integrated Second-Order Shallow Ice Approximation (iSOSIA) equations on the device using the finite difference method. To accelerate the computations, the GPU solver uses a non-linear Red-Black Gauss-Seidel iterator coupled with a Full Approximation Scheme (FAS) multigrid setup to further aid convergence. The GPU finite difference implementation provides the inherent parallelization that scales from hundreds to several thousands of cores on newer cards. We demonstrate the efficiency of the GPU multigrid solver using benchmark experiments.
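The red-black ordering that makes Gauss-Seidel GPU-friendly can be sketched in a few lines; this is a plain linear Poisson smoother for illustration, not the nonlinear iSOSIA operator.

```python
import numpy as np

def rb_gauss_seidel_sweep(u, f, h):
    # one smoothing sweep: all "red" cells (i+j even), then all "black";
    # cells of one colour depend only on the other colour, so each
    # half-sweep parallelizes across thousands of GPU threads
    for parity in (0, 1):
        for i in range(1, u.shape[0] - 1):
            for j in range(1, u.shape[1] - 1):
                if (i + j) % 2 == parity:
                    u[i, j] = 0.25 * (u[i-1, j] + u[i+1, j] +
                                      u[i, j-1] + u[i, j+1] - h * h * f[i, j])
    return u

u = np.zeros((65, 65))
f = np.ones((65, 65))
for _ in range(50):
    rb_gauss_seidel_sweep(u, f, h=1.0 / 64)
```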
The GLAS Algorithm Theoretical Basis Document for Precision Orbit Determination (POD)
NASA Technical Reports Server (NTRS)
Rim, Hyung Jin; Yoon, S. P.; Schultz, Bob E.
2013-01-01
The Geoscience Laser Altimeter System (GLAS) was the sole instrument for NASA's Ice, Cloud and land Elevation Satellite (ICESat) laser altimetry mission. The primary purpose of the ICESat mission was to make ice sheet elevation measurements of the polar regions. Additional goals were to measure the global distribution of clouds and aerosols and to map sea ice, land topography and vegetation. ICESat was the benchmark Earth Observing System (EOS) mission to be used to determine the mass balance of the ice sheets, as well as for providing cloud property information, especially for stratospheric clouds common over polar areas. The GLAS instrument operated from 2003 to 2009 and provided multi-year elevation data needed to determine changes in sea ice freeboard, land topography and vegetation around the globe, in addition to elevation changes of the Greenland and Antarctic ice sheets. This document describes the Precision Orbit Determination (POD) algorithm for the ICESat mission. The problem of determining an accurate ephemeris for an orbiting satellite involves estimating the position and velocity of the satellite from a sequence of observations. The ICESat/GLAS elevation measurements must be very accurately geolocated, combining precise orbit information with precision pointing information. The ICESat mission POD requirement states that the position of the instrument should be determined with an accuracy of 5 and 20 cm (1-σ) in radial and horizontal components, respectively, to meet the science requirements for determining elevation change.
A Field-Based Aquatic Life Benchmark for Conductivity in ...
This report adapts the standard U.S. EPA methodology for deriving ambient water quality criteria. Rather than use toxicity test results, the adaptation uses field data to determine the loss of 5% of genera from streams. The method is applied to derive effect benchmarks for dissolved salts as measured by conductivity in Central Appalachian streams using data from West Virginia and Kentucky. This report provides scientific evidence for a conductivity benchmark in a specific region rather than for the entire United States.
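A minimal sketch of the underlying calculation, assuming hypothetical per-genus extirpation concentrations (XC95 values): the field-based benchmark corresponds to the conductivity at which 5% of genera are lost, i.e. the 5th percentile of the genus sensitivity distribution. The values below are made up; the report derives them from large state monitoring datasets.

```python
import numpy as np

# Hypothetical XC95 values: conductivity (uS/cm) above which each genus is
# no longer observed in field samples. Illustrative only, not report data.
xc95 = np.array([180, 220, 250, 310, 340, 420, 500, 640, 800, 1100,
                 1400, 1800, 2300, 2900, 3600, 4400, 5200, 6100, 7300, 9000])

# Field-based benchmark: the 5th percentile of the genus sensitivity
# distribution, i.e. the conductivity expected to extirpate 5% of genera.
hc05 = np.percentile(xc95, 5)
print(f"illustrative conductivity benchmark (HC05): {hc05:.0f} uS/cm")
```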
Review of the GMD Benchmark Event in TPL-007-1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Backhaus, Scott N.; Rivera, Michael Kelly
2015-07-21
Los Alamos National Laboratory (LANL) examined the approaches suggested in NERC Standard TPL-007-1 for defining the geo-electric field for the Benchmark Geomagnetic Disturbance (GMD) Event. Specifically: 1. estimating the 100-year exceedance geo-electric field magnitude; 2. the scaling of the GMD Benchmark Event to geomagnetic latitudes below 60 degrees north; and 3. the effect of uncertainties in earth conductivity data on the conversion from geomagnetic field to geo-electric field. This document summarizes the review and presents recommendations for consideration.
46 CFR 160.052-1 - Incorporation by reference.
Code of Federal Regulations, 2012 CFR
2012-10-01
...: SPECIFICATIONS AND APPROVAL LIFESAVING EQUIPMENT Specification for a Buoyant Vest, Unicellular Plastic Foam... Preservers, Unicellular Plastic Foam, Adult and Child. 164.015—Plastic Foam, Unicellular, Buoyant Sheet and... manufactured, form a part of this subpart: Dwg. No. 160.052-1: Sheet 1—Cutting Pattern and General Arrangement...
46 CFR 160.052-1 - Incorporation by reference.
Code of Federal Regulations, 2013 CFR
2013-10-01
...: SPECIFICATIONS AND APPROVAL LIFESAVING EQUIPMENT Specification for a Buoyant Vest, Unicellular Plastic Foam... Preservers, Unicellular Plastic Foam, Adult and Child. 164.015—Plastic Foam, Unicellular, Buoyant Sheet and... manufactured, form a part of this subpart: Dwg. No. 160.052-1: Sheet 1—Cutting Pattern and General Arrangement...
46 CFR 160.052-1 - Incorporation by reference.
Code of Federal Regulations, 2014 CFR
2014-10-01
...: SPECIFICATIONS AND APPROVAL LIFESAVING EQUIPMENT Specification for a Buoyant Vest, Unicellular Plastic Foam... Preservers, Unicellular Plastic Foam, Adult and Child. 164.015—Plastic Foam, Unicellular, Buoyant Sheet and... manufactured, form a part of this subpart: Dwg. No. 160.052-1: Sheet 1—Cutting Pattern and General Arrangement...
46 CFR 160.052-1 - Incorporation by reference.
Code of Federal Regulations, 2011 CFR
2011-10-01
...: SPECIFICATIONS AND APPROVAL LIFESAVING EQUIPMENT Specification for a Buoyant Vest, Unicellular Plastic Foam... Preservers, Unicellular Plastic Foam, Adult and Child. 164.015—Plastic Foam, Unicellular, Buoyant Sheet and... manufactured, form a part of this subpart: Dwg. No. 160.052-1: Sheet 1—Cutting Pattern and General Arrangement...
46 CFR 114.600 - Incorporation by reference.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Conshohocken, PA 19428-2959 ASTM B 96-93, Standard Specification for Copper-Silicon Alloy Plate, Sheet, Strip... Operating Salt Spray (Fog) Apparatus 114.400 ASTM B 122/B 122M-95, Standard Specification for Copper-Nickel-Tin Alloy , Copper-Nickel-Zinc Alloy (Nickel Silver), and Copper-Nickel Alloy Plate, Sheet, Strip, and...
46 CFR 114.600 - Incorporation by reference.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Conshohocken, PA 19428-2959 ASTM B 96-93, Standard Specification for Copper-Silicon Alloy Plate, Sheet, Strip... Operating Salt Spray (Fog) Apparatus 114.400 ASTM B 122/B 122M-95, Standard Specification for Copper-Nickel-Tin Alloy , Copper-Nickel-Zinc Alloy (Nickel Silver), and Copper-Nickel Alloy Plate, Sheet, Strip, and...
General Metal Trades Book I. Units of Instruction. Teacher's Guide.
ERIC Educational Resources Information Center
East Texas State Univ., Commerce. Occupational Curriculum Lab.
This teacher's guide provides instructional materials for a 10-unit course in the General Metal Trades program. Each unit includes most or all of these basic components: performance objectives (unit and specific objectives), suggested teaching activities (a sheet outlining steps to follow to accomplish specific objectives), information sheets,…
Rotator cuff repair using cell sheets derived from human rotator cuff in a rat model.
Harada, Yoshifumi; Mifune, Yutaka; Inui, Atsuyuki; Sakata, Ryosuke; Muto, Tomoyuki; Takase, Fumiaki; Ueda, Yasuhiro; Kataoka, Takeshi; Kokubu, Takeshi; Kuroda, Ryosuke; Kurosaka, Masahiro
2017-02-01
To achieve biological regeneration of tendon-bone junctions, cell sheets of human rotator-cuff-derived cells were used in a rat rotator cuff injury model. Human rotator-cuff-derived cells were isolated, and cell sheets were made using temperature-responsive culture plates. Infraspinatus tendons in immunodeficient rats were resected bilaterally at the enthesis. In right shoulders, infraspinatus tendons were repaired by the transosseous method and covered with the cell sheet (sheet group), whereas the left infraspinatus tendons were repaired in the same way without the cell sheet (control group). Histological examinations (safranin-O and fast green staining, isolectin B4, type II collagen, and human-specific CD31) and mRNA expression of vascular endothelial growth factor (VEGF), type II collagen (Col2), and tenomodulin (TeM) were analyzed 4 weeks after surgery. Biomechanical tests were performed at 8 weeks. In the sheet group, more proteoglycan at the enthesis, together with more type II collagen and isolectin B4-positive cells, was seen than in the control group. Human-specific CD31-positive cells were detected only in the sheet group. VEGF and Col2 gene expressions were higher, and TeM gene expression was lower, in the sheet group than in the control group. In mechanical testing, the sheet group showed a significantly higher ultimate failure load than the control group at 8 weeks. Our results indicated that the rotator-cuff-derived cell sheet could promote cartilage regeneration and angiogenesis at the enthesis, with superior mechanical strength compared with the control. Treatment of rotator cuff injury using cell sheets could be a promising strategy for enthesis regeneration in tendon tissue engineering. © 2016 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res 35:289-296, 2017.
Communication Fact Sheets for Parents.
ERIC Educational Resources Information Center
Stremel, Kathleen; Bixler, Betsy; Morgan, Susanne; Layton, Kristen
This booklet contains 28 fact sheets on communication written primarily for parents and families with a child who is deaf-blind. They attempt to address fundamental but complex issues related to the communication needs of children with vision and hearing impairments. Each fact sheet targets a specific area, including: (1) communication; (2)…
46 CFR 160.052-1 - Incorporation by reference.
Code of Federal Regulations, 2010 CFR
2010-10-01
..., Model AP. Sheet 2—Cutting Pattern and General Arrangement, Model CPM. Sheet 3—Cutting Pattern and General Arrangement, Model CPS. Sheet 4—Insert Patterns. (c) Copies on file. The manufacturer shall keep a... Specifications and Standards may be purchased from the Business Service Center, General Services Administration...
XWeB: The XML Warehouse Benchmark
NASA Astrophysics Data System (ADS)
Mahboubi, Hadj; Darmont, Jérôme
With the emergence of XML as a standard for representing business data, new decision support applications are being developed. These XML data warehouses aim at supporting On-Line Analytical Processing (OLAP) operations that manipulate irregular XML data. To ensure the feasibility of these new tools, important performance issues must be addressed. Performance is customarily assessed with the help of benchmarks. However, decision support benchmarks do not currently support XML features. In this paper, we introduce the XML Warehouse Benchmark (XWeB), which aims at filling this gap. XWeB derives from the relational decision support benchmark TPC-H. It is mainly composed of a test data warehouse that is based on a unified reference model for XML warehouses and that features XML-specific structures, together with its associated XQuery decision support workload. XWeB's usage is illustrated by experiments on several XML database management systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marck, Steven C. van der, E-mail: vandermarck@nrg.eu
Recent releases of three major world nuclear reaction data libraries, ENDF/B-VII.1, JENDL-4.0, and JEFF-3.1.1, have been tested extensively using benchmark calculations. The calculations were performed with the latest release of the continuous energy Monte Carlo neutronics code MCNP, i.e. MCNP6. Three types of benchmarks were used, viz. criticality safety benchmarks, (fusion) shielding benchmarks, and reference systems for which the effective delayed neutron fraction is reported. For criticality safety, more than 2000 benchmarks from the International Handbook of Criticality Safety Benchmark Experiments were used. Benchmarks from all categories were used, ranging from low-enriched uranium, compound fuel, thermal spectrum ones (LEU-COMP-THERM), to mixed uranium-plutonium, metallic fuel, fast spectrum ones (MIX-MET-FAST). For fusion shielding many benchmarks were based on IAEA specifications for the Oktavian experiments (for Al, Co, Cr, Cu, LiF, Mn, Mo, Si, Ti, W, Zr), the Fusion Neutronics Source in Japan (for Be, C, N, O, Fe, Pb), and Pulsed Sphere experiments at Lawrence Livermore National Laboratory (for 6Li, 7Li, Be, C, N, O, Mg, Al, Ti, Fe, Pb, D2O, H2O, concrete, polyethylene and teflon). The new functionality in MCNP6 to calculate the effective delayed neutron fraction was tested by comparison with more than thirty measurements in widely varying systems. Among these were measurements in the Tank Critical Assembly (TCA in Japan) and IPEN/MB-01 (Brazil), both with a thermal spectrum, two cores in Masurca (France) and three cores in the Fast Critical Assembly (FCA, Japan), all with fast spectra. The performance of the three libraries, in combination with MCNP6, is shown to be good. The results for the LEU-COMP-THERM category are on average very close to the benchmark value. Also for most other categories the results are satisfactory. Deviations from the benchmark values do occur in certain benchmark series, or in isolated cases within benchmark series. Such instances can often be related to nuclear data for specific non-fissile elements, such as C, Fe, or Gd. Indications are that the intermediate and mixed spectrum cases are less well described. The results for the shielding benchmarks are generally good, with very similar results for the three libraries in the majority of cases. Nevertheless there are, in certain cases, strong deviations between calculated and benchmark values, such as for Co and Mg. Also, the results show discrepancies at certain energies or angles for e.g. C, N, O, Mo, and W. The functionality of MCNP6 to calculate the effective delayed neutron fraction yields very good results for all three libraries.
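Validation results of this kind are typically summarized as calculated-over-expected (C/E) ratios of k_eff, with deviations expressed in units of the combined benchmark uncertainty. A hedged sketch with made-up values (the study itself used more than 2000 ICSBEP cases):

```python
# Illustrative C/E summary for criticality benchmarks (made-up values).
# Each entry: benchmark name, calculated k_eff, benchmark k_eff, 1-sigma uncertainty.
cases = [
    ("LEU-COMP-THERM-001-01", 0.9988, 1.0000, 0.0031),
    ("MIX-MET-FAST-001-01",   1.0012, 1.0000, 0.0016),
    ("HEU-MET-FAST-001-01",   0.9996, 1.0000, 0.0010),
]

for name, calc, bench, sigma in cases:
    ce = calc / bench
    dev = (calc - bench) / sigma  # deviation in standard deviations
    print(f"{name}: C/E = {ce:.4f}  ({dev:+.1f} sigma)")
```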
Memory-Intensive Benchmarks: IRAM vs. Cache-Based Machines
NASA Technical Reports Server (NTRS)
Biswas, Rupak; Gaeke, Brian R.; Husbands, Parry; Li, Xiaoye S.; Oliker, Leonid; Yelick, Katherine A.; Biegel, Bryan (Technical Monitor)
2002-01-01
The increasing gap between processor and memory performance has led to new architectural models for memory-intensive applications. In this paper, we explore the performance of a set of memory-intensive benchmarks and use them to compare the performance of conventional cache-based microprocessors to a mixed logic and DRAM processor called VIRAM. The benchmarks are based on problem statements, rather than specific implementations, and in each case we explore the fundamental hardware requirements of the problem, as well as alternative algorithms and data structures that can help expose fine-grained parallelism or simplify memory access patterns. The benchmarks are characterized by their memory access patterns, their basic control structures, and the ratio of computation to memory operations.
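The ratio of computation to memory operations used to characterize these benchmarks is essentially what is now called arithmetic intensity. A small illustrative sketch (the operation counts are assumptions, not measurements from the paper):

```python
def arithmetic_intensity(flops, bytes_moved):
    """Ratio of computation to memory traffic (FLOPs per byte)."""
    return flops / bytes_moved

# Example: daxpy, y = a*x + y, over n doubles.
# Per element: 2 FLOPs (mul + add); 3 doubles moved (load x, load y, store y).
n = 1_000_000
ai = arithmetic_intensity(flops=2 * n, bytes_moved=3 * 8 * n)
print(f"daxpy arithmetic intensity ~ {ai:.3f} FLOP/byte")  # memory-bound kernel
```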
NASA Astrophysics Data System (ADS)
Pattyn, Frank
2017-08-01
The magnitude of the Antarctic ice sheet's contribution to global sea-level rise is dominated by the potential of its marine sectors to become unstable and collapse as a response to ocean (and atmospheric) forcing. This paper presents Antarctic sea-level response to sudden atmospheric and oceanic forcings on multi-centennial timescales with the newly developed fast Elementary Thermomechanical Ice Sheet (f.ETISh) model. The f.ETISh model is a vertically integrated hybrid ice sheet-ice shelf model with vertically integrated thermomechanical coupling, making the model two-dimensional. Its marine boundary is represented by two different flux conditions, coherent with power-law basal sliding and Coulomb basal friction. The model has been compared to existing benchmarks. Modelled Antarctic ice sheet response to forcing is dominated by sub-ice shelf melt and the sensitivity is highly dependent on basal conditions at the grounding line. Coulomb friction in the grounding-line transition zone leads to significantly higher mass loss in both West and East Antarctica on centennial timescales, leading to 1.5 m sea-level rise after 500 years for a limited melt scenario of 10 m a⁻¹ under freely floating ice shelves, up to 6 m for a 50 m a⁻¹ scenario. The higher sensitivity is attributed to higher ice fluxes at the grounding line due to vanishing effective pressure. Removing the ice shelves altogether results in a disintegration of the West Antarctic ice sheet and (partially) marine basins in East Antarctica. After 500 years, this leads to a 5 m and a 16 m sea-level rise for the power-law basal sliding and Coulomb friction conditions at the grounding line, respectively. The latter value agrees with simulations by DeConto and Pollard (2016) over a similar period (but with different forcing and including processes of hydrofracturing and cliff failure). The chosen parametrizations make model results largely independent of spatial resolution so that f.ETISh can potentially be integrated in large-scale Earth system models.
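A minimal sketch of the two basal-stress closures contrasted above, under the common parametrizations τ_b = C·u^(1/m) (Weertman-type power law) and τ_b = f·N (Coulomb, with effective pressure N): as N vanishes near flotation, Coulomb resistance disappears, which is consistent with the higher grounding-line fluxes reported. All parameter values are illustrative assumptions, not f.ETISh settings.

```python
def tau_power_law(u, C=2.2e4, m=3.0):
    """Weertman-type sliding: basal shear stress (Pa) from sliding speed u (m/yr)."""
    return C * u ** (1.0 / m)

def tau_coulomb(N, f=0.5):
    """Coulomb friction: basal stress (Pa) proportional to effective pressure N (Pa)."""
    return f * N

# Near the grounding line the effective pressure tends to zero at flotation,
# so Coulomb basal resistance vanishes while power-law resistance does not.
for N in (1e5, 1e4, 1e3, 0.0):
    print(f"N = {N:9.1f} Pa -> Coulomb tau_b = {tau_coulomb(N):8.1f} Pa, "
          f"power-law tau_b (u=100 m/yr) = {tau_power_law(100.0):8.1f} Pa")
```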
Ellis, Judith
2006-07-01
The aim of this article is to review published descriptions of benchmarking activity and synthesize benchmarking principles, in order to encourage the acceptance and use of Essence of Care as a new benchmarking approach to continuous quality improvement, and to promote its acceptance as an integral and effective part of benchmarking activity in health services. Essence of Care was launched by the Department of Health in England in 2001 to provide a benchmarking tool kit to support continuous improvement in the quality of fundamental aspects of health care, for example, privacy and dignity, nutrition and hygiene. The tool kit is now being used effectively by some frontline staff. However, use is inconsistent, and the value of the tool kit, or the support that clinical practice benchmarking requires to be effective, is not always recognized or provided by National Health Service managers, who are absorbed with quantitative benchmarking approaches and the measurability of comparative performance data. The published benchmarking literature reviewed here was obtained through an ever-narrowing search strategy, commencing from benchmarking within the quality improvement literature, through benchmarking activity in health services, and including not only published examples of benchmarking approaches and models but also web-based benchmarking data. This supported identification of how benchmarking approaches have developed and been used, while remaining true to the basic benchmarking principles of continuous improvement through comparison and sharing (Camp 1989). Descriptions of models and exemplars of quantitative, and specifically performance, benchmarking activity in industry abound (Camp 1998), with far fewer examples of more qualitative and process benchmarking approaches in use in the public services and applied to the health service (Bullivant 1998). The literature is also mainly descriptive in its support of the effectiveness of benchmarking activity; although this does not seem to have restricted the popularity of quantitative activity, reticence about the value of the more qualitative approaches, for example Essence of Care, needs to be overcome in order to improve the quality of patient care and experiences. The perceived immeasurability and subjectivity of Essence of Care and clinical practice benchmarks mean that these benchmarking approaches are not always accepted or supported by health service organizations as valid benchmarking activity. In conclusion, Essence of Care benchmarking is a sophisticated clinical practice benchmarking approach that needs to be accepted as an integral part of health service benchmarking activity to support improvement in the quality of patient care and experiences.
IMAGESEER - IMAGEs for Education and Research
NASA Technical Reports Server (NTRS)
Le Moigne, Jacqueline; Grubb, Thomas; Milner, Barbara
2012-01-01
IMAGESEER is a new Web portal that brings easy access to NASA image data for non-NASA researchers, educators, and students. The IMAGESEER Web site and database are specifically designed to be utilized by the university community, to enable teaching image processing (IP) techniques on NASA data, as well as to provide reference benchmark data to validate new IP algorithms. Along with the data and a Web user interface front-end, basic knowledge of the application domains, benchmark information, and specific NASA IP challenges (or case studies) are provided.
A Benchmark and Comparative Study of Video-Based Face Recognition on COX Face Database.
Huang, Zhiwu; Shan, Shiguang; Wang, Ruiping; Zhang, Haihong; Lao, Shihong; Kuerban, Alifu; Chen, Xilin
2015-12-01
Face recognition with still face images has been widely studied, while research on video-based face recognition is relatively inadequate, especially in terms of benchmark datasets and comparisons. Real-world video-based face recognition applications require techniques for three distinct scenarios: 1) Video-to-Still (V2S); 2) Still-to-Video (S2V); and 3) Video-to-Video (V2V), respectively taking video or still image as query or target. To the best of our knowledge, few datasets and evaluation protocols have been benchmarked for all three scenarios. In order to facilitate the study of this specific topic, this paper contributes a benchmarking and comparative study based on a newly collected still/video face database, named COX Face DB. Specifically, we make three contributions. First, we collect and release a large-scale still/video face database to simulate video surveillance with three different video-based face recognition scenarios (i.e., V2S, S2V, and V2V). Second, for benchmarking the three scenarios designed on our database, we review and experimentally compare a number of existing set-based methods. Third, we further propose a novel Point-to-Set Correlation Learning (PSCL) method, and experimentally show that it can be used as a promising baseline method for V2S/S2V face recognition on COX Face DB. Extensive experimental results clearly demonstrate that video-based face recognition needs more effort, and our COX Face DB is a good benchmark database for evaluation.
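For the identification protocols (V2S/S2V), the basic figure of merit is the rank-1 identification rate: a query is counted correct when its nearest gallery entry shares its identity. A hedged NumPy sketch, with random toy features standing in for real face descriptors:

```python
import numpy as np

def rank1_rate(query_feats, query_ids, gallery_feats, gallery_ids):
    """Rank-1 identification rate with cosine similarity.

    query_feats:   (nq, d) features, e.g. pooled from video frames (V2S)
    gallery_feats: (ng, d) features from still enrolment images
    """
    q = query_feats / np.linalg.norm(query_feats, axis=1, keepdims=True)
    g = gallery_feats / np.linalg.norm(gallery_feats, axis=1, keepdims=True)
    nearest = np.argmax(q @ g.T, axis=1)  # best gallery match per query
    return np.mean(gallery_ids[nearest] == query_ids)

# Toy example: 3 identities, 5-D random features, noisy video-side queries.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(3, 5))
gallery_ids = np.array([0, 1, 2])
queries = gallery + 0.1 * rng.normal(size=(3, 5))
print(rank1_rate(queries, np.array([0, 1, 2]), gallery, gallery_ids))  # ~1.0
```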
DOT National Transportation Integrated Search
2010-09-01
This project focused on the evaluation of traffic sign sheeting performance in terms of meeting the nighttime : driver needs. The goal was to develop a nighttime driver needs specification for traffic signs. The : researchers used nighttime sign legi...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-17
... cards, and other commercial printing applications requiring high quality print graphics. Specifically... Suitable for High-Quality Print Graphics Using Sheet-Fed Presses From the People's Republic of China... on certain coated paper suitable for high-quality print graphics using sheet-fed presses (``coated...
An Amino Acid Code for β-sheet Packing Structure
Joo, Hyun; Tsai, Jerry
2014-01-01
To understand the relationship between protein sequence and structure, this work extends the knob-socket model in an investigation of β-sheet packing. Over a comprehensive set of β-sheet folds, the contacts between residues were used to identify packing cliques: sets of residues that all contact each other. These packing cliques were then classified based on size and contact order. From this analysis, the 2 types of 4 residue packing cliques necessary to describe β-sheet packing were characterized. Both occur between 2 adjacent hydrogen bonded β-strands. First, defining the secondary structure packing within β-sheets, the combined socket or XY:HG pocket consists of 4 residues i,i+2 on one strand and j,j+2 on the other. Second, characterizing the tertiary packing between β-sheets, the knob-socket XY:H+B consists of a 3 residue XY:H socket (i,i+2 on one strand and j on the other) packed against a knob B residue (residue k distant in sequence). Depending on the packing depth of the knob B residue, 2 types of knob-sockets are found: side-chain and main-chain sockets. The amino acid composition of the pockets and knob-sockets reveal the sequence specificity of β-sheet packing. For β-sheet formation, the XY:HG pocket clearly shows sequence specificity of amino acids. For tertiary packing, the XY:H+B side-chain and main-chain sockets exhibit distinct amino acid preferences at each position. These relationships define an amino acid code for β-sheet structure and provide an intuitive topological mapping of β-sheet packing. PMID:24668690
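In graph terms, the packing cliques described above are complete subgraphs of the residue contact graph. A minimal sketch that enumerates 4-residue cliques from a pairwise contact list (the contact criterion and residue numbering are hypothetical):

```python
from itertools import combinations

def packing_cliques(contacts, size=4):
    """Enumerate residue sets of a given size in which all pairs contact.

    contacts: iterable of (residue_i, residue_j) pairs, e.g. from a
    distance cutoff applied to a beta-sheet structure.
    """
    edges = {frozenset(p) for p in contacts}
    residues = sorted({r for p in contacts for r in p})
    return [c for c in combinations(residues, size)
            if all(frozenset(p) in edges for p in combinations(c, 2))]

# Toy contact list across two adjacent strands (i, i+2 vs j, j+2).
contacts = [(10, 12), (10, 55), (10, 57), (12, 55), (12, 57), (55, 57)]
print(packing_cliques(contacts))  # -> [(10, 12, 55, 57)]
```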
Developing of Indicators of an E-Learning Benchmarking Model for Higher Education Institutions
ERIC Educational Resources Information Center
Sae-Khow, Jirasak
2014-01-01
This study developed e-learning indicators to be used in an e-learning benchmarking model for higher education institutions. Specifically, it aimed to: 1) synthesize the e-learning indicators; 2) examine content validity by specialists; and 3) explore appropriateness of the e-learning indicators. Review of related literature included…
Mathematics Content Standards Benchmarks and Performance Standards
ERIC Educational Resources Information Center
New Mexico Public Education Department, 2008
2008-01-01
New Mexico Mathematics Content Standards, Benchmarks, and Performance Standards identify what students should know and be able to do across all grade levels, forming a spiraling framework in the sense that many skills, once introduced, develop over time. While the Performance Standards are set forth at grade-specific levels, they do not exist as…
ERIC Educational Resources Information Center
Kroll, Juidith A.
2012-01-01
The inaugural Advancement Investment Metrics Study, or AIMS, benchmarked investments and staffing in each of the advancement disciplines (advancement services, alumni relations, communications and marketing, fundraising and advancement management) as well as the return on the investment in fundraising specifically. This white paper reports on the…
NASA Astrophysics Data System (ADS)
Rohrer, Brandon
2010-12-01
Measuring progress in the field of Artificial General Intelligence (AGI) can be difficult without commonly accepted methods of evaluation. An AGI benchmark would allow evaluation and comparison of the many computational intelligence algorithms that have been developed. In this paper I propose that a benchmark for natural world interaction would possess seven key characteristics: fitness, breadth, specificity, low cost, simplicity, range, and task focus. I also outline two benchmark examples that meet most of these criteria. In the first, the direction task, a human coach directs a machine to perform a novel task in an unfamiliar environment. The direction task is extremely broad, but may be idealistic. In the second, the AGI battery, AGI candidates are evaluated based on their performance on a collection of more specific tasks. The AGI battery is designed to be appropriate to the capabilities of currently existing systems. Both the direction task and the AGI battery would require further definition before implementing. The paper concludes with a description of a task that might be included in the AGI battery: the search and retrieve task.
Bonnet, F; Solignac, S; Marty, J
2008-03-01
The purpose of benchmarking is to establish improvement processes by comparing activities against quality standards. The proposed methodology is illustrated by benchmark business cases performed inside medical facilities on items such as nosocomial diseases or the organization of surgical facilities. Moreover, the authors have built a specific graphic tool, enhanced with balanced-score numbers and mappings, so that comparison between different anesthesia and intensive care services willing to start an improvement program is easy and relevant. This ready-made application becomes even more accurate as detailed activity tariffs are implemented.
NASA Technical Reports Server (NTRS)
1973-01-01
A specification catalog to define the equipment to be used for conducting life sciences experiments in a space laboratory is presented. The specification sheets list the purpose of the equipment item, and any specific technical requirements which can be identified. The status of similar hardware for ground use is stated with comments regarding modifications required to achieve spaceflight qualified hardware. Pertinent sketches, commercial catalog sheets, or drawings of the applicable equipment are included.
Fabrication method for cores of structural sandwich materials including star shaped core cells
Christensen, Richard M.
1997-01-01
A method for fabricating structural sandwich materials having a core pattern which utilizes star and non-star shaped cells. The sheets of material are bonded together, or a single folded sheet is used and bonded or welded at specific locations, into a flat configuration; the sheets are then mechanically pulled or expanded normal to their plane, expanding to form the cells. This method can be utilized to fabricate geometric cell arrangements other than the star/non-star shaped cells. Four sheets of material (either a pair of bonded sheets or a single folded sheet) are bonded so as to define an area therebetween, which forms the star shaped cell when expanded.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-14
... Film, Sheet and Strip From the United Arab Emirates: Extension of Time Limit for Preliminary Results of... polyethylene terephthalate film, sheet and strip from the United Arab Emirates (UAE) for the period November 06... analysis of all information on the record, specifically considering the cost and affiliation issues in this...
LED Outdoor Area Lighting Fact Sheet
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
This fact sheet reviews the major design and specification concerns for outdoor area lighting, and discusses the potential for LED luminaires to save energy while providing high quality lighting for outdoor areas.
ERIC Educational Resources Information Center
Adams, J. M.; Evans, S.
1980-01-01
Describes a student project in analytical chemistry using sheet silicates. Provides specific information regarding the use of phlogopite in an experiment to analyze samples for silicon, aluminum, magnesium, iron, potassium, and fluoride. (CS)
IT-benchmarking of clinical workflows: concept, implementation, and evaluation.
Thye, Johannes; Straede, Matthias-Christopher; Liebe, Jan-David; Hübner, Ursula
2014-01-01
Due to the emerging evidence of health IT as both an opportunity and a risk for clinical workflows, health IT must undergo continuous measurement of its efficacy and efficiency. IT-benchmarks are a proven means of providing this information. The aim of this study was to enhance the methodology of an existing benchmarking procedure by including, in particular, new indicators of clinical workflows and by proposing new types of visualisation. Drawing on the concept of information logistics, we propose four workflow descriptors that were applied to four clinical processes. General and specific indicators were derived from these descriptors and processes. 199 chief information officers (CIOs) took part in the benchmarking. Their hospitals were assigned to reference groups of similar size and ownership, drawn from a total of 259 hospitals. Stepwise and comprehensive feedback was given to the CIOs. Most participants who evaluated the benchmark rated the procedure as very good, good, or rather good (98.4%). Benchmark information was used by CIOs for getting a general overview, advancing IT, preparing negotiations with board members, and arguing for a new IT project.
Pagano, Timothy S.; Terry, David B.; Ingram, Arlynn W.
1986-01-01
Seven sheets of map data comprise this geohydrologic report. Sheet 1, surficial geology, illustrates the distribution of: open water areas; artificial fill; made land; urban land; alluvial silt and sand; alluvial sand and gravel; peat, marl, muck and clay; lake silt and/or clay; delta sand and gravel; beach sand and gravel; outwash sand and gravel; ice-contact sand and gravel; thick till over bedrock; and thin till over bedrock in the Baldwinsville area. Sheet 2, geologic sections, shows the layering of the aforementioned components below the surface layer. Sheet 3 illustrates water infiltration of the soil zone. Sheet 4 depicts the aquifer thickness. Sheet 5 illustrates the potentiometric surface, and Sheet 6 the well yield. Finally, Sheet 7 shows the land use in the region, specifically: industrial and extractive; commercial and services; transportation; farmland; forestland; residential; open public land; and water and wetlands. (Lantz-PTT)
Generation of openEHR Test Datasets for Benchmarking.
El Helou, Samar; Karvonen, Tuukka; Yamamoto, Goshiro; Kume, Naoto; Kobayashi, Shinji; Kondo, Eiji; Hiragi, Shusuke; Okamoto, Kazuya; Tamura, Hiroshi; Kuroda, Tomohiro
2017-01-01
openEHR is a widely used EHR specification. Given its technology-independent nature, different approaches for implementing openEHR data repositories exist. Public openEHR datasets are needed to conduct benchmark analyses over different implementations. To address their current unavailability, we propose a method for generating openEHR test datasets that can be publicly shared and used.
A Seafloor Benchmark for 3-dimensional Geodesy
NASA Astrophysics Data System (ADS)
Chadwell, C. D.; Webb, S. C.; Nooner, S. L.
2014-12-01
We have developed an inexpensive, permanent seafloor benchmark to increase the longevity of seafloor geodetic measurements. The benchmark provides a physical tie to the sea floor lasting for decades (perhaps longer) on which geodetic sensors can be repeatedly placed and removed with millimeter resolution. Global coordinates estimated with seafloor geodetic techniques will remain attached to the benchmark allowing for the interchange of sensors as they fail or become obsolete, or for the sensors to be removed and used elsewhere, all the while maintaining a coherent series of positions referenced to the benchmark. The benchmark has been designed to free fall from the sea surface with transponders attached. The transponder can be recalled via an acoustic command sent from the surface to release from the benchmark and freely float to the sea surface for recovery. The duration of the sensor attachment to the benchmark will last from a few days to a few years depending on the specific needs of the experiment. The recovered sensors are then available to be reused at other locations, or again at the same site in the future. Three pins on the sensor frame mate precisely and unambiguously with three grooves on the benchmark. To reoccupy a benchmark a Remotely Operated Vehicle (ROV) uses its manipulator arm to place the sensor pins into the benchmark grooves. In June 2014 we deployed four benchmarks offshore central Oregon. We used the ROV Jason to successfully demonstrate the removal and replacement of packages onto the benchmark. We will show the benchmark design and its operational capabilities. Presently models of megathrust slip within the Cascadia Subduction Zone (CSZ) are mostly constrained by the sub-aerial GPS vectors from the Plate Boundary Observatory, a part of Earthscope. More long-lived seafloor geodetic measures are needed to better understand the earthquake and tsunami risk associated with a large rupture of the thrust fault within the Cascadia subduction zone. Using a ROV to place and remove sensors on the benchmarks will significantly reduce the number of sensors required by the community to monitor offshore strain in subduction zones.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Zhenyu; Lin, Yu; Wang, Xueyi
The eigenmode stability properties of three-dimensional lower-hybrid-drift instabilities (LHDI) in a Harris current sheet with a small but finite guide magnetic field have been systematically studied by employing the gyrokinetic electron and fully kinetic ion (GeFi) particle-in-cell (PIC) simulation model with a realistic ion-to-electron mass ratio $m_i/m_e$. In contrast to the fully kinetic PIC simulation scheme, the fast electron cyclotron motion and plasma oscillations are systematically removed in the GeFi model, and hence one can employ the realistic $m_i/m_e$. The GeFi simulations are benchmarked against, and show excellent agreement with, both the fully kinetic PIC simulation and the analytical eigenmode theory. Our studies indicate that, for small wavenumbers $k_y$ along the current direction, the most unstable eigenmodes are peaked at the location where $\vec{k} \cdot \vec{B} = 0$, consistent with previous analytical and simulation studies. Here, $\vec{B}$ is the equilibrium magnetic field and $\vec{k}$ is the wavevector perpendicular to the nonuniformity direction. As $k_y$ increases, however, the most unstable eigenmodes are found to be peaked at $\vec{k} \cdot \vec{B} \neq 0$. Additionally, the simulation results indicate that varying $m_i/m_e$, the current sheet width, and the guide magnetic field can affect the stability of the LHDI. Simulations with the varying mass ratio confirm the lower hybrid frequency and wave number scalings.
Single Point Incremental Forming to increase material knowledge and production flexibility
NASA Astrophysics Data System (ADS)
Habraken, A. M.
2016-08-01
Nowadays, manufactured pieces can be divided into two groups: mass production and low-volume production. Within the second group (prototyping or small batch production), an emerging solution relies on Incremental Sheet Forming (ISF). ISF refers to processes where plastic deformation occurs by repeated contact with a relatively small tool. More specifically, many publications over the past decade investigate Single Point Incremental Forming (SPIF), where the final shape is determined only by the tool movement. This manufacturing process is characterized by the forming of sheets by means of a CNC-controlled generic tool stylus, with the sheets clamped by a non-workpiece-specific clamping system, in the absence of a partial or full die. The advantage is the lack of tooling requirements and often enhanced formability; however, this poses a challenge in terms of process control and accuracy assurance. Note that the most commonly used materials in incremental forming are aluminum and steel alloys; however, other alloys are also used, especially for medical industry applications, such as cobalt and chromium alloys, stainless steel and titanium alloys. Some scientists have applied incremental forming to PVC plates, others to sandwich panels composed of propylene with mild steel, and to aluminum metallic foams with aluminum sheet metal. Micro incremental forming of thin foils has also been developed. Starting from the scatter in the results of Finite Element (FE) simulations when one tries to predict the tool force (see the SPIF benchmark of the 2014 Numisheet conference), we will see how SPIF, and even micro SPIF (the process applied to thin metallic sheet with a few grains within the thickness), allows investigating the material behavior. This lecture will focus on the identification of constitutive laws, on the SPIF forming mechanisms and formability, as well as on the failure mechanism. Different hypotheses have been proposed to explain SPIF formability; they will be listed, but the lecture will focus more on the use of SPIF to identify material parameters of well-chosen constitutive laws. Results of FE simulations with damage models will be examined to better understand the relation between the particular stress and strain states in the material during SPIF and the material degradation leading to localization or fracture. Last but not least, as the industrial world does not wait for academic scientists to provide a deep and complete understanding of how a process works before using it, the lecture will review some applications. Examples in fields as different as automotive guards, engine heat shields, gas turbines, electronic sensors, shower basins, medical components (patient-fitted organic shapes) and architecture demonstrate that the integration of SPIF within industry is more and more a reality. Note that this plenary lecture is the result of the research performed by the author at the University of Liege (Belgium) and in Aveiro (Portugal) with the team of R. de Souza during the PhD theses of C. Henrard, J. Sena and C. Guzman, and different research projects. It is also a synthesis of the knowledge gathered during her interactions with many research teams, such as those of J. R. Duflou from KU Leuven in Belgium, J. Cao from Northwestern University in the USA, M. Bambach from BTU Cottbus-Senftenberg in Germany, and J. Jeswiet from Queen's University, Kingston, Canada, who are currently working together on a state-of-the-art paper. The micro SPIF knowledge relies on contacts with S. Thibaud from the University of Franche-Comté.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campbell, C G; Mathews, S
2006-09-07
Current regulatory schemes use generic or industrial-sector-specific benchmarks to evaluate the quality of industrial stormwater discharges. While benchmarks can be a useful tool for facility stormwater managers in evaluating the quality of stormwater runoff, benchmarks typically do not take into account site-specific conditions, such as soil chemistry, atmospheric deposition, seasonal changes in water source, and upstream land use. Failing to account for these factors may lead to unnecessary costs to trace a source of natural variation, or to potentially missing a significant local water quality problem. Site-specific water quality thresholds, established through statistical evaluation of historic data, take these factors into account; they are a better tool for the direct evaluation of runoff quality and a more cost-effective trigger for investigating anomalous results. Lawrence Livermore National Laboratory (LLNL), a federal facility, established stormwater monitoring programs to comply with the requirements of the industrial stormwater permit and Department of Energy orders, which require the evaluation of the impact of effluent discharges on the environment. LLNL recognized the need to create a tool to evaluate and manage stormwater quality that would allow analysts to identify trends in stormwater quality and recognize anomalous results so that trace-back and corrective actions could be initiated. LLNL created the site-specific water quality threshold tool to better understand the nature of the stormwater influent and effluent, to establish a technical basis for determining when facility operations might be impacting the quality of stormwater discharges, and to provide "action levels" to initiate follow-up to analytical results. The threshold criteria were based on a statistical analysis of the historic stormwater monitoring data and a review of relevant water quality objectives.
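The abstract does not spell out the exact statistics used, but a common way to turn a historic monitoring record into a site-specific action level is an upper percentile or a control-chart-style mean-plus-k-sigma bound. A hedged sketch with made-up concentrations:

```python
import numpy as np

# Hypothetical historic stormwater results for one analyte at one outfall
# (ug/L). Illustrative only; the actual thresholds were derived from the
# full monitoring record and water quality objectives.
historic = np.array([3.1, 4.2, 2.8, 5.0, 3.7, 6.1, 4.4, 3.9, 5.6, 4.8])

p95 = np.percentile(historic, 95)                       # percentile-based level
mean_3sd = historic.mean() + 3 * historic.std(ddof=1)   # control-chart level

threshold = max(p95, mean_3sd)  # illustrative choice, not the report's rule
print(f"site-specific action level ~ {threshold:.1f} ug/L")

new_result = 7.9
if new_result > threshold:
    print("anomalous result -> initiate trace-back and corrective action")
```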
Wang, Zhongshan; Feng, Zhihong; Wu, Guofeng; Bai, Shizhu; Dong, Yan; Zhao, Yimin
2016-05-01
Numerous preclinical and clinical studies have focused on the periodontal regenerative functions of enamel matrix derivative (EMD), a heat-treated preparation derived from enamel matrix proteins (EMPs) of developing porcine teeth. In this study, periodontal ligament (PDL) stem cells (PDLSCs) were isolated, and the effects of EMD on the extracorporeal induction process and the characteristics of PDLSC sheets were investigated for their potential as a more effective stem-cell therapy. EMD-enhanced cell sheets could be induced by complete medium supplemented with 50 μg/mL vitamin C and 100 μg/mL EMD. The EMD-enhanced cell sheets appeared thicker and more compact than the normal PDLSC sheets, demonstrated more layers of cells (3-7 layers), secreted richer extracellular matrix (ECM), showed varying degrees of increases in mRNA expression of periodontal tissue-specific genes (COL I, POSTN), calcification-related genes (RUNX2, OPN, OCN) and a cementum tissue-specific gene (CAP), and possessed a better mineralization ability in terms of osteogenic differentiation in vitro. These EMD-enhanced cell sheets may represent a potential option for stem-cell therapy for PDL regeneration. Copyright © 2016 Elsevier B.V. All rights reserved.
Moldable cork ablation material
NASA Technical Reports Server (NTRS)
1977-01-01
A successful thermal ablative material was manufactured. Moldable cork sheets were tested for density, tensile strength, tensile elongation, thermal conductivity, compression set, and specific heat. A moldable cork sheet, therefore, was established as a realistic product.
Fabrication method for cores of structural sandwich materials including star shaped core cells
Christensen, R.M.
1997-07-15
A method for fabricating structural sandwich materials having a core pattern which utilizes star and non-star shaped cells is disclosed. The sheets of material are bonded together, or a single folded sheet is used and bonded or welded at specific locations, into a flat configuration; the sheets are then mechanically pulled or expanded normal to their plane, expanding to form the cells. This method can be utilized to fabricate geometric cell arrangements other than the star/non-star shaped cells. Four sheets of material (either a pair of bonded sheets or a single folded sheet) are bonded so as to define an area therebetween, which forms the star shaped cell when expanded. 3 figs.
Peeters, Dominique; Sekeris, Elke; Verschaffel, Lieven; Luwel, Koen
2017-01-01
Some authors argue that age-related improvements in number line estimation (NLE) performance result from changes in strategy use. More specifically, children’s strategy use develops from only using the origin of the number line, to using the origin and the endpoint, to eventually also relying on the midpoint of the number line. Recently, Peeters et al. (unpublished) investigated whether the provision of additional unlabeled benchmarks at 25, 50, and 75% of the number line positively affects third and fifth graders’ NLE performance and benchmark-based strategy use. It was found that only the older children benefitted from the presence of these benchmarks at the quartiles of the number line (i.e., 25 and 75%), as they made more use of these benchmarks, leading to more accurate estimates. A possible explanation for this lack of improvement in third graders might be their inability to correctly link the presented benchmarks with their corresponding numerical values. In the present study, we investigated whether labeling these benchmarks with their corresponding numerical values would have a positive effect on younger children’s NLE performance and quartile-based strategy use as well. Third and sixth graders were assigned to one of three conditions: (a) a control condition with an empty number line bounded by 0 at the origin and 1,000 at the endpoint, (b) an unlabeled condition with three additional external benchmarks without numerical labels at 25, 50, and 75% of the number line, and (c) a labeled condition in which these benchmarks were labeled with 250, 500, and 750, respectively. Results indicated that labeling the benchmarks has a positive effect on third graders’ NLE performance and quartile-based strategy use, whereas sixth graders already benefited from the mere provision of unlabeled benchmarks. These findings imply that children’s benchmark-based strategy use can be stimulated by adding additional externally provided benchmarks on the number line, but that, depending on children’s age and familiarity with the number range, these additional external benchmarks might need to be labeled. PMID:28713302
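Accuracy in NLE studies of this kind is conventionally scored as percent absolute error (PAE), i.e. |estimate − target| divided by the number-line scale. A small sketch for the 0–1,000 line used here (the example estimates are invented):

```python
def percent_absolute_error(estimate, target, scale=1000):
    """Standard number-line-estimation accuracy score (lower is better)."""
    return abs(estimate - target) / scale * 100

# A child asked to place 750 on a 0-1000 line:
print(percent_absolute_error(690, 750))  # e.g. without labeled benchmarks -> 6.0
print(percent_absolute_error(745, 750))  # e.g. anchored on a labeled 750 -> 0.5
```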
Research on computer systems benchmarking
NASA Technical Reports Server (NTRS)
Smith, Alan Jay (Principal Investigator)
1996-01-01
This grant addresses the topic of research on computer systems benchmarking and is more generally concerned with performance issues in computer systems. This report reviews work in those areas during the period of NASA support under this grant. The bulk of the work performed concerned benchmarking and analysis of CPUs, compilers, caches, and benchmark programs. The first part of this work concerned the issue of benchmark performance prediction. A new approach to benchmarking and machine characterization was reported, using a machine characterizer that measures the performance of a given system in terms of a Fortran abstract machine. Another report focused on analyzing compiler performance: the performance impact of optimization was studied in the context of our methodology for CPU performance characterization, based on the abstract machine model. Benchmark programs are analyzed in another paper. A machine-independent model of program execution was developed to characterize both machine performance and program execution. By merging these machine and program characterizations, execution time can be estimated for arbitrary machine/program combinations. The work was continued into the domain of parallel and vector machines, including the issue of caches in vector processors and multiprocessors. All of the aforementioned accomplishments are more specifically summarized in this report, as well as those smaller in magnitude supported by this grant.
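The machine/program merging described above amounts to a linear execution-time model: per-operation times characterize the machine, operation counts characterize the program, and the estimate is their dot product. A hedged sketch with invented operation classes and counts:

```python
# Abstract-machine characterization (illustrative operation classes and times).
machine_ns_per_op = {           # measured once per machine (nanoseconds)
    "int_alu": 1.0, "fp_mul": 4.0, "mem_load": 8.0, "branch": 2.0,
}
program_op_counts = {           # measured once per program
    "int_alu": 5.0e9, "fp_mul": 2.0e9, "mem_load": 3.0e9, "branch": 1.0e9,
}

# Estimated time for this machine/program combination: T = sum(n_i * t_i)
t_sec = sum(program_op_counts[op] * machine_ns_per_op[op]
            for op in program_op_counts) / 1e9
print(f"estimated runtime: {t_sec:.1f} s")  # -> 39.0 s for these counts
```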
75 FR 41989 - Content of Periodicals Mail
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-20
... media, since they are not printed sheets. But specifically allowing de minimis product samples will... provision in the DMM allowing product samples in de minimis form to be included as part of a printed sheet...
Enhancement Approach of Object Constraint Language Generation
NASA Astrophysics Data System (ADS)
Salemi, Samin; Selamat, Ali
2018-01-01
OCL is the most prevalent language for documenting system constraints annotated in UML. Writing OCL specifications is not an easy task due to the complexity of the OCL syntax. Therefore, an approach to help and assist developers in writing OCL specifications is needed. There are two existing approaches: first, creating OCL specifications with a tool called COPACABANA; second, an MDA-based approach that helps developers write OCL specifications with another tool, called NL2OCLviaSBVR, which generates the OCL specification automatically. This study presents another MDA-based approach, called En2OCL, whose objective is twofold: (1) to improve the precision of the existing works, and (2) to present a benchmark of these approaches. The benchmark shows that the accuracies of COPACABANA, NL2OCLviaSBVR, and En2OCL are 69.23, 84.64, and 88.40, respectively.
Zhu, H.; Braun, W.
1999-01-01
A statistical analysis of a representative data set of 169 known protein structures was used to analyze the specificity of residue interactions between spatially neighboring strands in beta-sheets. Pairwise potentials were derived from the frequency of residue pairs in nearest, second-nearest, and third-nearest contacts across neighboring beta-strands, compared to the expected frequency of residue pairs in a random model. A pseudo-energy function based on these statistical pairwise potentials recognized native beta-sheets among possible alternative pairings. The native pairing was found within the three lowest energies in 73% of the cases in the training data set and in 63% of beta-sheets in a test data set of 67 proteins, which were not part of the training set. The energy function was also used to detect tripeptides, which occur frequently in beta-sheets of native proteins. The majority of native partners of tripeptides were distributed in a low energy range. Self-correcting distance geometry (SECODG) calculations using distance constraint sets derived from possible low-energy pairings of beta-strands uniquely identified the native pairing of the beta-sheet in pancreatic trypsin inhibitor (BPTI). These results will be useful for predicting the structure of proteins from their amino acid sequence as well as for the design of proteins containing beta-sheets. PMID:10048326
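Statistical pairwise potentials of this kind are log-odds scores, E(a,b) = −ln(f_obs/f_exp). A minimal sketch with tiny made-up counts standing in for the 169-protein statistics:

```python
import math
from collections import Counter

# Made-up counts of residue pairs in nearest contact across neighbouring
# beta-strands (the real statistics came from 169 protein structures).
pair_counts = Counter({("V", "V"): 120, ("V", "I"): 95, ("V", "K"): 30,
                       ("K", "E"): 80, ("E", "E"): 4})
residue_counts = Counter({"V": 400, "I": 250, "K": 200, "E": 180})

n_pairs = sum(pair_counts.values())
n_res = sum(residue_counts.values())

def pair_energy(a, b):
    """Statistical potential: -ln(observed / expected pair frequency)."""
    f_obs = pair_counts[(a, b)] / n_pairs
    p_a, p_b = residue_counts[a] / n_res, residue_counts[b] / n_res
    f_exp = p_a * p_b * (1 if a == b else 2)  # unordered-pair expectation
    return -math.log(f_obs / f_exp)

print(f"E(V,V) = {pair_energy('V', 'V'):+.2f}")  # negative -> favoured pairing
print(f"E(E,E) = {pair_energy('E', 'E'):+.2f}")  # positive -> disfavoured
```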
ERIC Educational Resources Information Center
Stremel, Kathleen; Wilson, Rebecca M.
This document consists of three separately published fact sheets combined here because of the close relationship of their subject matter. The first fact sheet, "Communication Interactions: It Takes Two" (Kathleen Stremel), defines communication; suggests ways to find opportunities for interactive communication; offers specific suggestions for…
ISMIP6: Ice Sheet Model Intercomparison Project for CMIP6
NASA Technical Reports Server (NTRS)
Nowicki, S.
2015-01-01
ISMIP6 (Ice Sheet Model Intercomparison Project for CMIP6) targets the Cryosphere in a Changing Climate and the Future Sea Level Grand Challenges of the WCRP (World Climate Research Program). The primary goal is to provide the future sea level contribution from the Greenland and Antarctic ice sheets, along with the associated uncertainty. A secondary goal is to investigate feedbacks due to dynamic ice sheet models. The experiment design uses and augments the existing CMIP6 (Coupled Model Intercomparison Project Phase 6) DECK (Diagnosis, Evaluation, and Characterization of Klima) experiments. Additional MIP (Model Intercomparison Project)-specific experiments will be designed for ISMs (Ice Sheet Models). The effort builds on the Ice2sea, SeaRISE (Sea-level Response to Ice Sheet Evolution), and COMBINE (Comprehensive Modelling of the Earth System for Better Climate Prediction and Projection) efforts.
GROWTH OF THE INTERNATIONAL CRITICALITY SAFETY AND REACTOR PHYSICS EXPERIMENT EVALUATION PROJECTS
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. Blair Briggs; John D. Bess; Jim Gulliford
2011-09-01
Since the International Conference on Nuclear Criticality Safety (ICNC) 2007, the International Criticality Safety Benchmark Evaluation Project (ICSBEP) and the International Reactor Physics Experiment Evaluation Project (IRPhEP) have continued to expand their efforts and broaden their scope. Eighteen countries participated on the ICSBEP in 2007. Now, there are 20, with recent contributions from Sweden and Argentina. The IRPhEP has also expanded from eight contributing countries in 2007 to 16 in 2011. Since ICNC 2007, the contents of the 'International Handbook of Evaluated Criticality Safety Benchmark Experiments' [1] have increased from 442 evaluations (38000 pages), containing benchmark specifications for 3955 critical or subcritical configurations, to 516 evaluations (nearly 55000 pages), containing benchmark specifications for 4405 critical or subcritical configurations in the 2010 Edition of the ICSBEP Handbook. The contents of the Handbook have also increased from 21 to 24 criticality-alarm-placement/shielding configurations with multiple dose points for each, and from 20 to 200 configurations categorized as fundamental physics measurements relevant to criticality safety applications. Approximately 25 new evaluations and 150 additional configurations are expected to be added to the 2011 edition of the Handbook. Since ICNC 2007, the contents of the 'International Handbook of Evaluated Reactor Physics Benchmark Experiments' [2] have increased from 16 different experimental series that were performed at 12 different reactor facilities to 53 experimental series that were performed at 30 different reactor facilities in the 2011 edition of the Handbook. Considerable effort has also been made to improve the functionality of the searchable database, DICE (Database for the International Criticality Safety Benchmark Evaluation Project), and to verify the accuracy of the data contained therein. DICE will be discussed in separate papers at ICNC 2011. The status of the ICSBEP and the IRPhEP will be discussed in the full paper, selected benchmarks that have been added to the ICSBEP Handbook will be highlighted, and a preview of the new benchmarks that will appear in the September 2011 edition of the Handbook will be provided. Accomplishments of the IRPhEP will also be highlighted and the future of both projects will be discussed. REFERENCES: (1) International Handbook of Evaluated Criticality Safety Benchmark Experiments, NEA/NSC/DOC(95)03/I-IX, Organisation for Economic Co-operation and Development-Nuclear Energy Agency (OECD-NEA), September 2010 Edition, ISBN 978-92-64-99140-8. (2) International Handbook of Evaluated Reactor Physics Benchmark Experiments, NEA/NSC/DOC(2006)1, Organisation for Economic Co-operation and Development-Nuclear Energy Agency (OECD-NEA), March 2011 Edition, ISBN 978-92-64-99141-5.
Benchmark Dataset for Whole Genome Sequence Compression.
C L, Biji; S Nair, Achuthsankar
2017-01-01
The research in DNA data compression lacks a standard dataset for testing compression tools specific to DNA. This paper argues that the current state of achievement in DNA compression cannot be benchmarked in the absence of such a scientifically compiled whole genome sequence dataset, and proposes a benchmark dataset constructed using a multistage sampling procedure. Taking the genome sequences of organisms available at the National Center for Biotechnology Information (NCBI) as the universe, the proposed dataset selects 1,105 prokaryotes, 200 plasmids, 164 viruses, and 65 eukaryotes. This paper reports the results of using three established tools on the newly compiled dataset and shows that their strengths and weaknesses become evident only in a comparison based on the scientifically compiled benchmark dataset. The sample dataset and the respective links are available at https://sourceforge.net/projects/benchmarkdnacompressiondataset/.
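A minimal sketch of the proportional, group-wise sampling such a multistage design implies (Python; the group lists, accession-ID formats, and sample size below are invented stand-ins for the NCBI holdings):

```python
# Illustrative sketch (not the authors' code): proportional sampling of
# genome accessions by taxonomic group, mimicking a multistage design.
import random

def sample_benchmark(accessions_by_group, total_size, seed=42):
    """accessions_by_group: dict mapping group name -> list of accession IDs."""
    rng = random.Random(seed)
    universe = sum(len(v) for v in accessions_by_group.values())
    chosen = {}
    for group, ids in accessions_by_group.items():
        # Stage 1: allocate the sample proportionally to group size.
        k = max(1, round(total_size * len(ids) / universe))
        # Stage 2: simple random sampling within the group.
        chosen[group] = rng.sample(ids, min(k, len(ids)))
    return chosen

groups = {                       # invented universe sizes, for illustration only
    "prokaryotes": [f"PRJNA{i}" for i in range(11050)],
    "plasmids":    [f"NC_{i:06d}" for i in range(2000)],
    "viruses":     [f"NC_{i:06d}" for i in range(1640)],
    "eukaryotes":  [f"GCF_{i:09d}" for i in range(650)],
}
picked = sample_benchmark(groups, total_size=1534)
print({g: len(v) for g, v in picked.items()})   # ~1105 / 200 / 164 / 65
```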
Benchmarking nitrogen removal suspended-carrier biofilm systems using dynamic simulation.
Vanhooren, H; Yuan, Z; Vanrolleghem, P A
2002-01-01
We are witnessing enormous growth in biological nitrogen removal from wastewater, which presents specific challenges beyond traditional COD (carbon) removal. One possibility for optimised process design is the use of biomass-supporting media. In this paper, attached growth processes (AGP) are evaluated using dynamic simulations. The advantages of these systems, which were qualitatively described elsewhere, are validated quantitatively based on a simulation benchmark for activated sludge treatment systems. This simulation benchmark is extended with a biofilm model that allows fast and accurate simulation of the conversion of different substrates in a biofilm. The economic feasibility of the system is evaluated using the data generated with the benchmark simulations: capital savings due to volume reduction and reduced sludge production are weighed against increased aeration costs. Effluent quality is integrated into this evaluation as well.
Analysis of contact zones from whole field isochromatics using reflection photoelasticity
NASA Astrophysics Data System (ADS)
Hariprasad, M. P.; Ramesh, K.
2018-06-01
This paper presents a method for evaluating unknown contact parameters by post-processing whole-field fringe order data obtained from reflection photoelasticity in a nonlinear least squares sense. Recent developments in Twelve Fringe Photoelasticity (TFP) for fringe order evaluation from a single isochromatic image are utilized for whole-field fringe order evaluation. One of the issues in using TFP for reflection photoelasticity is the smudging of isochromatic data at the contact zone, which leads to errors in identifying the origin of contact; this is successfully addressed by implementing a semi-automatic contact point refinement algorithm. The methodologies are first verified on benchmark problems and then demonstrated for two application problems: turbine blade and sheet pile contacting interfaces.
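For readers unfamiliar with the fitting step, the sketch below recovers contact parameters by nonlinear least squares in the same spirit, but using a textbook Flamant point-load stress field rather than the paper's actual model; the thickness, material fringe value, load, and offset are all assumed values:

```python
# Hedged illustration, not the paper's implementation: fit a contact load P
# and contact-origin offset x0 to whole-field fringe orders.
import numpy as np
from scipy.optimize import least_squares

h, f_sigma = 6e-3, 11e3          # plate thickness [m], material fringe value [N/m] (assumed)

def fringe_model(params, x, y):
    P, x0 = params               # load per unit thickness [N/m], origin offset [m]
    r = np.hypot(x - x0, y)
    theta = np.arctan2(np.abs(x - x0), y)               # angle from the load line
    sig_diff = 2.0 * P * np.cos(theta) / (np.pi * r)    # Flamant: sigma1 - sigma2
    return h * sig_diff / f_sigma                       # fringe order N

rng = np.random.default_rng(0)   # synthetic "measured" field in lieu of TFP data
x = rng.uniform(-0.02, 0.02, 400)
y = rng.uniform(0.002, 0.03, 400)
N_obs = fringe_model((5.0e4, 0.003), x, y) + rng.normal(0.0, 0.02, x.size)

fit = least_squares(lambda p: fringe_model(p, x, y) - N_obs, x0=[1.0e4, 0.0])
print(fit.x)                     # should approach (5.0e4, 0.003)
```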
NASA Technical Reports Server (NTRS)
Deshpande, M. D.; Cockrell, C. R.; Beck, F. B.; Nguyen, T. X.
1993-01-01
The validation of low-frequency measurements and electromagnetic (EM) scattering computations for several simple, generic shapes, such as an equilateral-triangular plate, an equilateral-triangular plate with a concentric equilateral-triangular hole, and diamond- and hexagonal-shaped plates, is discussed. The plates were constructed from a thin aluminum sheet with a thickness of 0.08 cm. EM scattering by the planar plates was measured in the experimental test range (ETR) facility of NASA Langley Research Center. The dimensions of the plates were selected such that, over the frequency range of interest, the dimensions were in the range of λ0 to 3λ0. In addition, the triangular plate with a triangular hole was selected to study internal-hole resonances.
NASA Astrophysics Data System (ADS)
Hashim; Khan, Masood; Alshomrani, Ali Saleh
2017-12-01
This article adopts a realistic approach to examine the magnetohydrodynamic (MHD) flow of a Carreau fluid induced by a shrinking sheet near the stagnation point. The study also explores the impact of non-linear thermal radiation on the heat transfer process. The governing equations of the physical model are expressed as a system of partial differential equations and are transformed into non-linear ordinary differential equations by introducing local similarity variables. The reduced equations of the problem are numerically integrated using the Runge-Kutta-Fehlberg integration scheme. In this study, we explore the conditions for existence, non-existence, uniqueness and duality of the numerical solutions. It is found that the solutions may possess a dual nature, with upper and lower branches, for a specific range of the shrinking parameter. Results indicate that, as the magnetic parameter increases, the range of the shrinking parameter over which dual solutions exist widens. Further, a strong magnetic field enhances the thickness of the momentum boundary layer for the second solution while reducing it for the first. We further note that fluid suction diminishes the fluid velocity, so the thickness of the hydrodynamic boundary layer decreases as well. A critical comparison with existing works shows that the outcomes benchmark well against them.
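The shooting machinery behind such boundary-layer solutions can be illustrated on the much simpler Blasius equation (not the authors' Carreau-MHD system); sweeping different brackets for the missing initial curvature is also how upper- and lower-branch dual solutions are located in problems of this kind:

```python
# Hedged sketch: Runge-Kutta integration plus shooting on
# f''' + 0.5*f*f'' = 0, f(0)=f'(0)=0, f'(inf)=1 (Blasius, not the paper's model).
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

ETA_MAX = 10.0                        # stand-in for "infinity"

def rhs(eta, y):                      # y = (f, f', f'')
    return [y[1], y[2], -0.5 * y[0] * y[2]]

def residual(fpp0):                   # mismatch in the far-field condition
    sol = solve_ivp(rhs, (0.0, ETA_MAX), [0.0, 0.0, fpp0],
                    method="RK45", rtol=1e-8, atol=1e-10)
    return sol.y[1, -1] - 1.0

fpp0 = brentq(residual, 0.1, 1.0)     # missing initial curvature f''(0)
print(f"f''(0) = {fpp0:.4f}")         # classical value ~0.3321
```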
ERIC Educational Resources Information Center
Thomas, Allyson
2015-01-01
This study examined the relationship between student engagement and graduation rates between Black/African American students and White (Non-Hispanic) students in their senior year at faith-based institutions in the southeastern region of the United States using the NSSE benchmarks of effective educational practices. Specifically, scores from the…
Energized Oxygen : Speiser Current Sheet Bifurcation
NASA Astrophysics Data System (ADS)
George, D. E.; Jahn, J. M.
2017-12-01
A single population of energized oxygen (O+) is shown to produce a cross-tail bifurcated current sheet in 2.5D PIC simulations of the magnetotail without the influence of magnetic reconnection. The treatment of oxygen in simulations of space plasmas, specifically a magnetotail current sheet, has been limited to thermal energies despite observations of, and mechanisms that explain, energized ions. We performed simulations with a homogeneous oxygen background, energized in a physically appropriate manner, to study the behavior of current sheets and magnetic reconnection, specifically their bifurcation. This work uses a 2.5D explicit Particle-in-Cell (PIC) code to investigate the dynamics of energized heavy ions as they stream dawn-to-dusk in the magnetotail current sheet. We present a simulation study of the response of a current sheet system to energized oxygen ions. We establish a well-known and well-studied two-species GEM Challenge Harris current sheet as a starting point; this system is known to eventually evolve and produce magnetic reconnection upon thinning of the current sheet. We added a uniform distribution of thermal O+ to the background; this three-species system is also known to eventually evolve and produce magnetic reconnection. We add one additional variable to the system by providing an initial duskward velocity to energize the O+. We also traced individual particle motion within the PIC simulation. Three main results are shown. First, energized dawn-dusk streaming ions are clearly seen to exhibit sustained Speiser motion. Second, a single population of heavy ions clearly produces a stable bifurcated current sheet. Third, magnetic reconnection is not required to produce the bifurcated current sheet. Finally, a bifurcated current sheet is compatible with the Harris current sheet model. This work is the first step in a series of investigations aimed at studying the effects of energized heavy ions on magnetic reconnection. It differs significantly from previous investigations involving heavy ions in that the ions are energized rather than simply thermal, a variation based firmly on published in-situ measurements. It also differs in that a complete population is used rather than test particles in a magnetic field model.
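The particle tracing mentioned above typically rests on the standard Boris pusher. A test-particle sketch follows, with a Harris-sheet-like field and all parameters assumed for illustration; this is not the authors' PIC code:

```python
# Boris pusher tracing an energized O+ ion in a 1-D Harris-sheet-like field.
import numpy as np

q_m = 9.58e7 / 16.0                 # O+ charge-to-mass ratio [C/kg]
B0, L = 20e-9, 1.0e6                # lobe field [T], sheet half-thickness [m] (assumed)
E = np.array([0.0, 2.0e-4, 0.0])    # dawn-dusk electric field [V/m] (assumed)

def B_field(z):                     # Bx reverses across z = 0; small normal Bz
    return np.array([B0 * np.tanh(z / L), 0.0, 0.1 * B0])

def boris_step(x, v, dt):
    B = B_field(x[2])
    t = 0.5 * q_m * dt * B                      # half-rotation vector
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_minus = v + 0.5 * q_m * dt * E            # half electric kick
    v_prime = v_minus + np.cross(v_minus, t)    # magnetic rotation
    v_plus = v_minus + np.cross(v_prime, s)
    v_new = v_plus + 0.5 * q_m * dt * E         # second half kick
    return x + v_new * dt, v_new

x, v = np.array([0.0, 0.0, 2.0e6]), np.array([0.0, 4.0e5, 0.0])
for _ in range(50000):
    x, v = boris_step(x, v, dt=0.01)
print(x, np.linalg.norm(v))         # meandering Speiser-type orbit near z = 0
```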
A Privacy-Preserving Platform for User-Centric Quantitative Benchmarking
NASA Astrophysics Data System (ADS)
Herrmann, Dominik; Scheuer, Florian; Feustel, Philipp; Nowey, Thomas; Federrath, Hannes
We propose a centralised platform for quantitative benchmarking of key performance indicators (KPI) among mutually distrustful organisations. Our platform offers users the opportunity to request an ad-hoc benchmarking for a specific KPI within a peer group of their choice. The architecture and protocol are designed to provide anonymity to users and to hide the sensitive KPI values from other clients and from the central server. To this end, we integrate user-centric peer group formation, exchangeable secure multi-party computation protocols, short-lived ephemeral key pairs as pseudonyms, and attribute certificates. We show by empirical evaluation of a prototype that the performance is acceptable for reasonably sized peer groups.
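One building block such a platform could use, shown here as a hedged sketch rather than the authors' actual protocol, is an additive-secret-sharing secure sum, which lets peers learn a group aggregate without revealing any individual KPI value:

```python
# Minimal secure-sum sketch via additive secret sharing (assumed, illustrative).
import random

PRIME = 2**61 - 1       # arithmetic is done modulo a public prime

def share(value, n_peers):
    """Split a KPI value into n additive shares that sum to it mod PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n_peers - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

kpis = [820, 945, 1010, 760]        # private KPI of each of four peers
n = len(kpis)
# Each peer distributes one share to every peer (including itself)...
dealt = [share(v, n) for v in kpis]
# ...and each peer publishes only the sum of the shares it received.
partial = [sum(dealt[i][j] for i in range(n)) % PRIME for j in range(n)]
total = sum(partial) % PRIME
print("group mean KPI:", total / n)  # 883.75, with no individual KPI revealed
```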
Child-Resistant Packaging for E-Liquid: A Review of US State Legislation.
Frey, Leslie T; Tilburg, William C
2016-02-01
A growing number of states have introduced or enacted legislation requiring child-resistant packaging for e-liquid containers; however, these laws involve varying terms, packaging standards, and enforcement provisions, raising concerns about their effectiveness. We evaluated bills against 4 benchmarks: broad product definitions that contemplate future developments in the market, citations to a specific packaging standard, stated penalties for violations, and express grants of authority to a state entity to enforce the packaging requirements. Our findings showed that 3 states meet all 4 benchmarks in their enacted legislation. We encourage states to consider these benchmarks when revising statutes or drafting future legislation.
Tug fleet and ground operations schedules and controls. Volume 2: Part 3, appendixes
NASA Technical Reports Server (NTRS)
1975-01-01
A space tug function description data sheet is prepared for each block of the space tug functional flow diagram. A summary of the basic information regarding the activities performed in its respective functional block is provided. The sheets are catalogued by functional flow block numbers with reference blocks at the end. The specific items of information contained in each data sheet are defined.
NASA Astrophysics Data System (ADS)
Jacques, Diederik
2017-04-01
As soil functions are governed by a multitude of interacting hydrological, geochemical and biological processes, simulation tools coupling mathematical models of these interacting processes are needed. Coupled reactive transport models are a typical example of such tools, focusing mainly on hydrological and geochemical coupling (see e.g. Steefel et al., 2015). The mathematical and numerical complexity of both the tool itself and the specific conceptual model can increase rapidly. Numerical verification of such models is therefore a prerequisite for guaranteeing reliability and confidence, and for qualifying simulation tools and approaches for further model application. In 2011, a first SeSBench (Subsurface Environmental Simulation Benchmarking) workshop was held in Berkeley (USA), followed by four others. The objective is to benchmark subsurface environmental simulation models and methods, with a current focus on reactive transport processes. The outcome so far is a special issue in Computational Geosciences (2015, issue 3, Reactive transport benchmarks for subsurface environmental simulation) with a collection of 11 benchmarks. Benchmarks proposed by the workshop participants should be relevant for environmental or geo-engineering applications, the latter mostly related to radioactive waste disposal issues; benchmarks defined for purely mathematical reasons are excluded. Another important feature is the tiered approach within a benchmark, with the definition of a single principal problem and different subproblems; the latter typically benchmark individual or simplified processes (e.g. inert solute transport, a simplified geochemical conceptual model) or geometries (e.g. batch or one-dimensional, homogeneous). Finally, at least three codes should be involved in each benchmark. The SeSBench initiative contributes to confidence building for applying reactive transport codes, and it illustrates the use of these types of models for different environmental and geo-engineering applications. SeSBench will organize new workshops to add new benchmarks in a new special issue. Steefel, C. I., et al. (2015). "Reactive transport codes for subsurface environmental simulation." Computational Geosciences 19: 445-478.
SU-E-T-148: Benchmarks and Pre-Treatment Reviews: A Study of Quality Assurance Effectiveness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowenstein, J; Nguyen, H; Roll, J
Purpose: To determine the impact benchmarks and pre-treatment reviews have on improving the quality of submitted clinical trial data. Methods: Benchmarks are used to evaluate a site's ability to develop a treatment plan that meets a specific protocol's treatment guidelines prior to placing its first patient on the protocol. A pre-treatment review is performed on an actual patient placed on the protocol, in which the dosimetry and contour volumes are evaluated against protocol guidelines before treatment is allowed to begin. A key component of these QA mechanisms is that sites are provided timely feedback to educate them on how to plan per the protocol and to prevent protocol deviations for patients accrued to a protocol. For both benchmarks and pre-treatment reviews, a dose volume analysis (DVA) was performed using MIM™ software; for pre-treatment reviews, a volume contour evaluation was also performed. Results: IROC Houston performed a QA effectiveness analysis of a protocol that required both benchmarks and pre-treatment reviews. In 70 percent of the patient cases submitted, the benchmark played an effective role in assuring that the pre-treatment review of the cases met protocol requirements. The 35 percent of sites failing the benchmark subsequently modified their planning technique to pass the benchmark before being allowed to submit a patient for pre-treatment review. However, in 30 percent of the submitted cases the pre-treatment review failed, with the majority (71 percent) failing the DVA. Twenty percent of sites submitting patients failed to correct the dose volume discrepancies indicated by the benchmark case. Conclusion: Benchmark cases and pre-treatment reviews can be an effective QA tool to educate sites on protocol guidelines and to minimize deviations. Without the benchmark cases, it is possible that 65 percent of the cases undergoing a pre-treatment review would have failed to meet the protocol's requirements. Support: U24-CA-180803.
Öchsner, Wolfgang; Böckers, Anja
2016-01-01
A competent review process is crucial to ensure the quality of multiple-choice (MC) questions. However, the acquisition of reviewing skills should not cause any unnecessary additional burden for medical staff already facing heavy workloads. 100 MC questions, for which an expert review existed, were presented to 12 novices. In advance, six participants received a specific information sheet covering the information critical for a high-calibre review; the other six participants attended a 2.5-hour workshop covering the same information. The review results of both groups were analysed with a licensed version of the IBM software SPSS 19.0 (SPSS Inc., Chicago, IL). The results of the workshop group were distinctly closer to the experts' results (the gold standard) than those of the information sheet group. For the quantitatively important category of medium-quality MC questions, the results of the workshop group did not differ significantly from the experts' results. In the information sheet group, the results were significantly poorer than the experts', regardless of the quality of the questions. Distributing specific information sheets to MC question reviewers is therefore not sufficient for ensuring the quality of the review, so that, despite the increased effort involved, conducting specific workshops must be recommended. Copyright © 2014. Published by Elsevier GmbH.
Beauchamp, Kyle A; Behr, Julie M; Rustenburg, Ariën S; Bayly, Christopher I; Kroenlein, Kenneth; Chodera, John D
2015-10-08
Atomistic molecular simulations are a powerful way to make quantitative predictions, but the accuracy of these predictions depends entirely on the quality of the force field employed. Although experimental measurements of fundamental physical properties offer a straightforward approach for evaluating force field quality, the bulk of this information has been tied up in formats that are not machine-readable. Compiling benchmark data sets of physical properties from non-machine-readable sources requires substantial human effort and is prone to the accumulation of human errors, hindering the development of reproducible benchmarks of force-field accuracy. Here, we examine the feasibility of benchmarking atomistic force fields against the NIST ThermoML data archive of physicochemical measurements, which aggregates thousands of experimental measurements in a portable, machine-readable, self-annotating IUPAC-standard format. As a proof of concept, we present a detailed benchmark of the generalized Amber small-molecule force field (GAFF) using the AM1-BCC charge model against experimental measurements (specifically, bulk liquid densities and static dielectric constants at ambient pressure) automatically extracted from the archive, and we discuss the extent of data available for use in larger scale (or continuously performed) benchmarks. The results of even this limited initial benchmark highlight a general problem with fixed-charge force fields in the representation of low-dielectric environments, such as those seen in binding cavities or biological membranes.
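Since ThermoML is plain XML, extraction of the kind described can be prototyped with the standard library alone. The tag names below follow our reading of the ThermoML schema and should be treated as assumptions, not as the authors' extraction pipeline:

```python
# Proof-of-concept sketch: pull numeric property values out of a ThermoML file.
# Element names (PureOrMixtureData, ePropName, nPropValue) are assumptions.
import xml.etree.ElementTree as ET

NS = {"tml": "http://www.iupac.org/namespaces/ThermoML"}  # assumed namespace

def extract_values(path):
    root = ET.parse(path).getroot()
    out = []
    for block in root.findall(".//tml:PureOrMixtureData", NS):
        names = [e.text for e in block.findall(".//tml:ePropName", NS)]
        vals = [float(e.text) for e in block.findall(".//tml:nPropValue", NS)]
        out.append((names, vals))
    return out

for names, vals in extract_values("density_measurements.xml"):  # hypothetical file
    print(names, vals[:5])
```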
Benchmarking: a method for continuous quality improvement in health.
Ettorchi-Tardy, Amina; Levif, Marie; Michel, Philippe
2012-05-01
Benchmarking, a management approach for implementing best practices at best cost, is a recent concept in the healthcare system. The objectives of this paper are to better understand the concept and its evolution in the healthcare sector, to propose an operational definition, and to describe some French and international experiences of benchmarking in the healthcare sector. To this end, we reviewed the literature on this approach's emergence in the industrial sector, its evolution, its fields of application and examples of how it has been used in the healthcare sector. Benchmarking is often thought to consist simply of comparing indicators and is not perceived in its entirety, that is, as a tool based on voluntary and active collaboration among several organizations to create a spirit of competition and to apply best practices. The key feature of benchmarking is its integration within a comprehensive and participatory policy of continuous quality improvement (CQI). Conditions for successful benchmarking focus essentially on careful preparation of the process, monitoring of the relevant indicators, staff involvement and inter-organizational visits. Compared to methods previously implemented in France (CQI and collaborative projects), benchmarking has specific features that set it apart as a healthcare innovation. This is especially true for healthcare or medical-social organizations, as the principle of inter-organizational visiting is not part of their culture. Thus, this approach will need to be assessed for feasibility and acceptability before it is more widely promoted.
Principles for designing proteins with cavities formed by curved β sheets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marcos, Enrique; Basanta, Benjamin; Chidyausiku, Tamuka M.
Active sites and ligand-binding cavities in native proteins are often formed by curved β sheets, and the ability to control β-sheet curvature would allow the design of binding proteins with cavities customized to specific ligands. Toward this end, we investigated the mechanisms controlling β-sheet curvature by studying the geometry of β sheets in naturally occurring protein structures and in folding simulations. The principles emerging from this analysis were used to design, de novo, a series of proteins with curved β sheets topped with α helices. Nuclear magnetic resonance and crystal structures of the designs closely match the computational models, showing that β-sheet curvature can be controlled with atomic-level accuracy. Our approach enables the design of proteins with cavities and provides a route to the custom design of ligand-binding and catalytic sites.
ERIC Educational Resources Information Center
Cowan, Earl; And Others
The curriculum guide for welding instruction contains 16 units presented in six sections. Each unit is divided into the following areas, each of which is color coded: terminal objectives, specific objectives, suggested activities, and instructional materials; information sheet; transparency masters; assignment sheet; test; and test answers. The…
Tezaur, I. K.; Perego, M.; Salinger, A. G.; ...
2015-04-27
This paper describes a new parallel, scalable and robust finite element based solver for the first-order Stokes momentum balance equations for ice flow. The solver, known as Albany/FELIX, is constructed using the component-based approach to building application codes, in which mature, modular libraries developed as a part of the Trilinos project are combined using abstract interfaces and template-based generic programming, resulting in a final code with access to dozens of algorithmic and advanced analysis capabilities. Following an overview of the relevant partial differential equations and boundary conditions, the numerical methods chosen to discretize the ice flow equations are described, along with their implementation. The results of several verification studies of the model accuracy are presented using (1) new test cases for simplified two-dimensional (2-D) versions of the governing equations derived using the method of manufactured solutions, and (2) canonical ice sheet modeling benchmarks. Model accuracy and convergence with respect to mesh resolution are then studied on problems involving a realistic Greenland ice sheet geometry discretized using hexahedral and tetrahedral meshes. Also explored as a part of this study is the effect of vertical mesh resolution on the solution accuracy and solver performance. The robustness and scalability of our solver on these problems is demonstrated. Lastly, we show that good scalability can be achieved by preconditioning the iterative linear solver using a new algebraic multilevel preconditioner, constructed based on the idea of semi-coarsening.
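The method of manufactured solutions mentioned in (1) is easy to demonstrate on a toy problem. The sketch below applies it to 1-D Poisson (far simpler than first-order Stokes): choose an exact solution, derive the forcing analytically, and confirm that the discrete error decays at the expected rate under mesh refinement:

```python
# Manufactured-solutions verification sketch on -u'' = f, u(0)=u(1)=0.
import numpy as np

u_exact = lambda x: np.sin(np.pi * x)
forcing = lambda x: np.pi**2 * np.sin(np.pi * x)   # f = -u'' for the chosen u

def solve_poisson(n):
    h = 1.0 / n
    x = np.linspace(0.0, 1.0, n + 1)
    # Standard second-order central-difference matrix on interior nodes.
    A = (np.diag(2.0 * np.ones(n - 1)) - np.diag(np.ones(n - 2), 1)
         - np.diag(np.ones(n - 2), -1)) / h**2
    u = np.zeros(n + 1)
    u[1:-1] = np.linalg.solve(A, forcing(x[1:-1]))
    return np.max(np.abs(u - u_exact(x)))           # max-norm error

errs = [solve_poisson(n) for n in (16, 32, 64, 128)]
rates = np.log2(np.array(errs[:-1]) / np.array(errs[1:]))
print(errs, rates)    # rates should approach 2 for this second-order scheme
```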
NASA Astrophysics Data System (ADS)
Yetna n'jock, M.; Houssem, B.; Labergere, C.; Saanouni, K.; Zhenming, Y.
2018-05-01
Springback is an important phenomenon that accompanies the forming of metallic sheets, especially for high-strength materials, and its quantitative prediction becomes very important for newly developed materials with high mechanical characteristics. In this work, a numerical methodology is developed to quantify this undesirable phenomenon. The methodology is based on the use of both the explicit and implicit finite element solvers of Abaqus®. Its most important ingredient is a highly predictive mechanical model: a thermodynamically-consistent, non-associative and fully anisotropic elastoplastic constitutive model, strongly coupled with isotropic ductile damage and accounting for distortional hardening. An algorithm for local integration of the complete set of constitutive equations is developed. This algorithm uses the rotated frame formulation (RFF) to ensure the incremental objectivity of the model in the framework of finite strains, and it is implemented in both the explicit (Abaqus/Explicit®) and implicit (Abaqus/Standard®) solvers through the user routines VUMAT and UMAT, respectively. The implicit solver of Abaqus® has been used to study springback, as it is generally a quasi-static unloading. In order to compare the methods' efficiency, the explicit dynamic relaxation method proposed by Rayleigh has also been used for springback prediction. The results obtained on the U-draw/bending benchmark are studied, discussed and compared with experimental results as reference. The purpose of this work is to evaluate the reliability of the different methods in efficiently predicting springback in sheet metal forming.
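The dynamic relaxation idea referenced above reduces to a few lines: march a damped explicit system until the kinetic energy dies out, leaving the static (sprung-back) equilibrium. This bare-bones sketch uses an invented scalar stiffness, mass and damping, not the Abaqus® implementation:

```python
# Dynamic relaxation toy: damped explicit time marching to static equilibrium.
k, m, c = 4.0e3, 1.0, 25.0      # stiffness, mass, damping (assumed values)
f_ext = 0.0                     # tool load removed: relax to the unloaded state
x, v = 0.05, 0.0                # start from the loaded (formed) configuration
dt = 1e-4

for step in range(200000):
    a = (f_ext - k * x - c * v) / m
    v += a * dt
    x += v * dt
    if abs(v) < 1e-10 and abs(a) < 1e-8:   # kinetic energy has dissipated
        break
print(f"equilibrium displacement: {x:.2e} after {step} steps")
```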
12. "TEST STAND; STRUCTURAL; DEFLECTOR PIT DETAILS, SHEET NO. 1." ...
12. "TEST STAND; STRUCTURAL; DEFLECTOR PIT DETAILS, SHEET NO. 1." Specifications No. ENG-04-353-55-72; Drawing No. 60-09-12; sheet 41 of 148; file no. 1320/92, Rev. A. Stamped: RECORD DRAWING - AS CONSTRUCTED. Below stamp: Contract no. 4338, no change. - Edwards Air Force Base, Air Force Rocket Propulsion Laboratory, Test Stand 1-A Terminal Room, Test Area 1-120, north end of Jupiter Boulevard, Boron, Kern County, CA
Data book: Space station/base food system study. Book 3: Study selection rationale sheets
NASA Technical Reports Server (NTRS)
1970-01-01
The supporting rationale sheets utilized in the selection and support of the concepts considered in the final phase of the study are presented. Each concept, conceived to fulfill a specific function of the food system, was assessed in terms of the eight critical factors depicted on the rationale sheet. When weighted and totaled, the resulting selection factor was used as a guide in making the final decision.
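The weighted-and-totaled arithmetic amounts to a weighted sum; a tiny sketch follows, with factor names, weights, and ratings invented for illustration:

```python
# Weighted rating-sheet arithmetic (all names and numbers are hypothetical).
factors = ["weight", "volume", "power", "crew time",
           "safety", "cost", "reliability", "acceptability"]
weights = [0.20, 0.10, 0.10, 0.15, 0.15, 0.10, 0.10, 0.10]

concepts = {                       # rating of each concept on each factor, 1-10
    "freeze-dried":      [9, 8, 7, 5, 8, 7, 8, 6],
    "thermostabilized":  [6, 5, 8, 8, 8, 8, 9, 7],
}
for name, ratings in concepts.items():
    selection_factor = sum(w * r for w, r in zip(weights, ratings))
    print(f"{name}: {selection_factor:.2f}")   # higher total guides the choice
```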
Spontaneous formation of electric current sheets and the origin of solar flares
NASA Technical Reports Server (NTRS)
Low, B. C.; Wolfson, R.
1988-01-01
It is demonstrated that the continuous boundary motion of a sheared magnetic field in a tenuous plasma with an infinite electrical conductivity can induce the formation of multiple electric current sheets in the interior plasma. In response to specific footpoint displacements, the quadrupolar magnetic field considered is shown to require the formation of multiple electric current sheets as it achieves a force-free state. Some of the current sheets are found to be of finite length, running along separatrix lines of force which separate lobes of magnetic flux. It is suggested that current sheets in the form of infinitely thin magnetic shear layers may be unstable to resistive tearing, a process which may have application to solar flares.
Pandya, Tara M.; Johnson, Seth R.; Evans, Thomas M.; ...
2015-12-21
This paper discusses the implementation, capabilities, and validation of Shift, a massively parallel Monte Carlo radiation transport package developed and maintained at Oak Ridge National Laboratory. It has been developed to scale well from laptop to small computing clusters to advanced supercomputers. Special features of Shift include hybrid capabilities for variance reduction such as CADIS and FW-CADIS, and advanced parallel decomposition and tally methods optimized for scalability on supercomputing architectures. Shift has been validated and verified against various reactor physics benchmarks and compares well to other state-of-the-art Monte Carlo radiation transport codes such as MCNP5, CE KENO-VI, and OpenMC. Some specific benchmarks used for verification and validation include the CASL VERA criticality test suite and several Westinghouse AP1000® problems. These benchmark and scaling studies show promising results.
NASA Astrophysics Data System (ADS)
Allen, Jeffery M.
This research develops several First-Order System Least Squares (FOSLS) formulations of a nonlinear-Stokes flow model for ice sheets. In Glen's flow law, a commonly used constitutive equation for ice rheology, the viscosity becomes infinite as the velocity gradients approach zero, which typically occurs near the ice surface or where there is basal sliding. The computational difficulties associated with the infinite viscosity are often overcome by an arbitrary modification of Glen's law that bounds the maximum viscosity; the FOSLS formulations developed in this thesis are designed to overcome this difficulty. The first FOSLS formulation is simply the first-order representation of the standard nonlinear full-Stokes equations; it is known as the viscosity formulation and suffers from the problem above. To overcome the problem of infinite viscosity, two new formulations exploit the fact that the deviatoric stress, the product of viscosity and strain rate, approaches zero as the viscosity goes to infinity. Using the deviatoric stress as the basis for a first-order system results in the basic fluidity system. Augmenting the basic fluidity system with a curl-type equation results in the augmented fluidity system, which is more amenable to the iterative solver, Algebraic MultiGrid (AMG). A Nested Iteration (NI) Newton-FOSLS-AMG approach is used to solve the nonlinear-Stokes problems. Several test problems from the ISMIP set of benchmarks are examined to test the effectiveness of the various formulations. These tests show that the viscosity-based method is more expensive and less accurate. The basic fluidity system shows optimal finite-element convergence; however, there is not yet an efficient iterative solver for this type of system, which is a topic of future research. Alternatively, AMG performs better on the augmented fluidity system when a specific scaling is used; unfortunately, this scaling results in reduced finite-element convergence.
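A small numerical illustration of the difficulty the abstract describes, using assumed flow-law constants: Glen's-law effective viscosity diverges as the effective strain rate vanishes, while the deviatoric stress the fluidity systems are built on goes to zero:

```python
# Glen's-law effective viscosity, with and without a small regularization.
import numpy as np

A, n = 1.0e-16, 3.0            # flow-rate factor [Pa^-3 a^-1], Glen exponent (assumed)
eps_reg = 1.0e-10              # small regularizing strain rate [a^-1]

def viscosity(eps_e, regularize=True):
    """Effective viscosity 0.5 * A**(-1/n) * eps_e**((1-n)/n)."""
    if regularize:
        eps_e = np.sqrt(eps_e**2 + eps_reg**2)
    return 0.5 * A ** (-1.0 / n) * eps_e ** ((1.0 - n) / n)

for eps in np.array([1e-2, 1e-6, 1e-12, 0.0]):
    print(eps, viscosity(eps, regularize=False), viscosity(eps))
# The unregularized value grows without bound (inf at zero strain rate, with a
# divide-by-zero warning), while deviatoric stress = 2*viscosity*strain-rate
# tends to zero, which is the fact the fluidity formulations exploit.
```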
Development of risk-based nanomaterial groups for occupational exposure control
NASA Astrophysics Data System (ADS)
Kuempel, E. D.; Castranova, V.; Geraci, C. L.; Schulte, P. A.
2012-09-01
Given the almost limitless variety of nanomaterials, it will be virtually impossible to assess the possible occupational health hazard of each nanomaterial individually. The development of science-based hazard and risk categories for nanomaterials is needed for decision-making about exposure control practices in the workplace. A possible strategy would be to select representative (benchmark) materials from various mode of action (MOA) classes, evaluate the hazard and develop risk estimates, and then apply a systematic comparison of new nanomaterials with the benchmark materials in the same MOA class. Poorly soluble particles are used here as an example to illustrate quantitative risk assessment methods for possible benchmark particles and occupational exposure control groups, given mode of action and relative toxicity. Linking such benchmark particles to specific exposure control bands would facilitate the translation of health hazard and quantitative risk information to the development of effective exposure control practices in the workplace. A key challenge is obtaining sufficient dose-response data, based on standard testing, to systematically evaluate the nanomaterials' physical-chemical factors influencing their biological activity. Categorization processes involve both science-based analyses and default assumptions in the absence of substance-specific information. Utilizing data and information from related materials may facilitate initial determinations of exposure control systems for nanomaterials.
Structural Benchmark Testing of Superalloy Lattice Block Subelements Completed
NASA Technical Reports Server (NTRS)
2004-01-01
Superalloy lattice block panels, which are produced directly by investment casting, are composed of thin ligaments arranged in three-dimensional triangulated trusslike structures (see the preceding figure). Optionally, solid panel face sheets can be formed integrally during casting. In either form, lattice block panels can easily be produced with weights less than 25 percent of the mass of a solid panel. Inconel 718 (IN 718) and MarM-247 superalloy lattice block panels have been developed under NASA's Ultra-Efficient Engine Technology Project and Higher Operating Temperature Propulsion Components Project to take advantage of the superalloys' high strength and elevated temperature capability with the inherent light weight and high stiffness of the lattice architecture (ref. 1). These characteristics are important in the future development of turbine engine components. Casting quality and structural efficiency were evaluated experimentally using small beam specimens machined from the cast and heat treated 140- by 300- by 11-mm panels. The matrix of specimens included samples of each superalloy in both open-celled and single-face-sheet configurations, machined from longitudinal, transverse, and diagonal panel orientations. Thirty-five beam subelements were tested in Glenn's Life Prediction Branch's material test machine at room temperature and 650 C under both static (see the following photograph) and cyclic load conditions. Surprisingly, test results exceeded initial linear elastic analytical predictions. This was likely a result of the formation of plastic hinges and redundancies inherent in lattice block geometry, which was not considered in the finite element models. The value of a single face sheet was demonstrated by increased bending moment capacity, where the face sheet simultaneously increased the gross section modulus and braced the compression ligaments against early buckling as seen in open-cell specimens. Preexisting flaws in specimens were not a discriminator in flexural, shear, or stiffness measurements, again because of redundant load paths available in the lattice block structure. Early test results are available in references 2 and 3; more complete analyses are scheduled for publication in 2004.
Benchmarking image fusion system design parameters
NASA Astrophysics Data System (ADS)
Howell, Christopher L.
2013-06-01
A clear and absolute method for discriminating between image fusion algorithm performances is presented. This method can effectively be used to assist in the design and modeling of image fusion systems. Specifically, it is postulated that quantifying human task performance using image fusion should be benchmarked to whether the fusion algorithm, at a minimum, retained the performance benefit achievable by each independent spectral band being fused. The established benchmark would then clearly represent the threshold that a fusion system should surpass to be considered beneficial to a particular task. A genetic algorithm is employed to characterize the fused system parameters using a Matlab® implementation of NVThermIP as the objective function. By setting the problem up as a mixed-integer constraint optimization problem, one can effectively look backwards through the image acquisition process: optimizing fused system parameters by minimizing the difference between modeled task difficulty measure and the benchmark task difficulty measure. The results of an identification perception experiment are presented, where human observers were asked to identify a standard set of military targets, and used to demonstrate the effectiveness of the benchmarking process.
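A toy version of the optimization loop described above follows. NVThermIP is not reproduced here, so the task-difficulty model is a stand-in objective, and the integer parameter ranges are invented:

```python
# Genetic algorithm over integer-coded fusion-system parameters, minimizing
# the gap between modeled task difficulty and the single-band benchmark.
import random

BOUNDS = [(1, 8), (1, 16), (1, 4)]   # e.g. blur kernel, bit depth, gain (invented)
BENCHMARK = 42.0                     # best single-band task-difficulty value

def task_difficulty(p):              # stand-in for the NVThermIP model
    return 10.0 * p[0] - 1.5 * p[1] + 7.0 * p[2]

def fitness(p):                      # distance to the benchmark threshold
    return abs(task_difficulty(p) - BENCHMARK)

def mutate(p):                       # resample one gene within its bounds
    i = random.randrange(len(p))
    lo, hi = BOUNDS[i]
    q = list(p); q[i] = random.randint(lo, hi)
    return tuple(q)

pop = [tuple(random.randint(lo, hi) for lo, hi in BOUNDS) for _ in range(40)]
for gen in range(100):
    pop.sort(key=fitness)
    elite = pop[:10]                 # truncation selection
    pop = elite + [mutate(random.choice(elite)) for _ in range(30)]
print(pop[0], fitness(pop[0]))       # a parameter set meeting the benchmark
```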
Laboratory Powder Metallurgy Makes Tough Aluminum Sheet
NASA Technical Reports Server (NTRS)
Royster, D. M.; Thomas, J. R.; Singleton, O. R.
1993-01-01
Aluminum alloy sheet exhibits high tensile and Kahn tear strengths. Rapid solidification of aluminum alloys in powder form and subsequent consolidation and fabrication processes used to tailor parts made of these alloys to satisfy such specific aerospace design requirements as high strength and toughness.
Continuous quality improvement for the clinical decision unit.
Mace, Sharon E
2004-01-01
Clinical decision units (CDUs) are a relatively new and growing area of medicine in which patients undergo rapid evaluation and treatment. Continuous quality improvement (CQI) is important for the establishment and functioning of CDUs. CQI in CDUs has many advantages: better CDU functioning, fulfillment of Joint Commission on Accreditation of Healthcare Organizations mandates, greater efficiency/productivity, increased job satisfaction, better performance improvement, data availability, and benchmarking. Key elements include a database with volume indicators, operational policies, clinical practice protocols (diagnosis specific/condition specific), monitors, benchmarks, and clinical pathways. Examples of these important parameters are given. The CQI process should be individualized for each CDU and hospital.
Brown, Timothy A.; Dunning, Charles P.; Sharpe, Jennifer B.
2000-01-01
The report series will enable investigators involved in site-specific studies within the subcrop area to understand the regional geologic framework of the unit and to find additional reference sources. This report consists of four sheets that show the altitude (sheet 1), depth from land surface (sheet 2), total thickness (sheet 3), and location of altitude data (sheet 4) of the lithologic units that constitute the Galena-Platteville bedrock unit within the subcrop area. The sheets also show major known geologic features within the Galena-Platteville study area in Illinois and Wisconsin. A geographic information system (GIS) was used to generate data layers (coverages) from point data and from published and unpublished contour maps at various scales and detail. Standard GIS procedures were used to change the coverages into the maps shown on the sheets presented in this report. A list of references for the data used to prepare the maps is provided.
Structure and Dynamics of Current Sheets in 3D Magnetic Fields with the X-line
NASA Astrophysics Data System (ADS)
Frank, Anna G.; Bogdanov, S. Yu.; Bugrov, S. G.; Markov, V. S.; Dreiden, G. V.; Ostrovskaya, G. V.
2004-11-01
Experimental results are presented on the structure of current sheets formed in 3D magnetic fields with singular lines of the X-type. Two basic diagnostics were used with the CS-3D device: two-exposure holographic interferometry and magnetic measurements. Formation of extended current sheets and plasma compression were observed in the presence of a longitudinal magnetic field component aligned with the X-line. The plasma density decreased and the sheet thickness increased with an increase of the longitudinal component. We succeeded in revealing the formation of sheets of unusual shape, namely tilted and asymmetric sheets, in plasmas with heavy ions. These current sheets were clearly different from the planar sheets formed in 2D magnetic fields, i.e. without a longitudinal component. Analysis of typical plasma parameters made it evident that plasma dynamics and current sheet evolution should be treated on the basis of the two-fluid approach. Specifically, it is necessary to take into account the Hall currents in the plane perpendicular to the X-line, and the dynamic effects resulting from interaction of the Hall currents with the 3D magnetic field. Supported by RFBR, grant 03-02-17282, and ISTC, project 2098.
Sakai, Yusuke; Koike, Makiko; Hasegawa, Hideko; Yamanouchi, Kosho; Soyama, Akihiko; Takatsuki, Mitsuhisa; Kuroki, Tamotsu; Ohashi, Kazuo; Okano, Teruo; Eguchi, Susumu
2013-01-01
Cell sheet engineering is attracting attention from investigators in various fields, from basic research scientists to clinicians focused on regenerative medicine. However, hepatocytes have a limited proliferation potential in vitro, and it generally takes several days to form a sheet morphology and multi-layered sheets. We herein report a rapid and efficient technique for generating multi-layered human hepatic cell (HepaRG® cell) sheets using pre-cultured fibroblast monolayers derived from human skin (TIG-118 cells) as a feeder layer on a temperature-responsive culture dish. Multi-layered TIG-118/HepaRG cell sheets with a thick morphology were harvested on day 4 of culturing HepaRG cells by forceful contraction of the TIG-118 cells, and the resulting sheets could be easily handled. In addition, the human albumin and alpha 1-antitrypsin synthesis activities of TIG-118/HepaRG cells were approximately 1.2 and 1.3 times higher than those of HepaRG cells, respectively. This technique is therefore considered a promising modality for rapidly fabricating multi-layered human hepatocyte sheets from cells with limited proliferation potential, and the engineered cell sheets could be used for cell transplantation with highly specific functions.
Popp, Alexander; Scheerer, David; Chi, Heng; Keiderling, Timothy A; Hauser, Karin
2016-05-04
Turn residues and side-chain interactions play an important role in the folding of β-sheets. We investigated the conformational dynamics of a three-stranded β-sheet peptide (ᴰPᴰP) and a two-stranded β-hairpin (WVYY-ᴰP) by time-resolved temperature-jump (T-jump) infrared spectroscopy. Both peptide sequences contain ᴰPro-Gly residues that favor a tight β-turn. The three-stranded β-sheet (Ac-VFITSᴰPGKTYTEVᴰPGOKILQ-NH2) is stabilized by the turn sequences, whereas the β-hairpin (SWTVEᴰPGKYTYK-NH2) folding is assisted by both the turn sequence and hydrophobic cross-strand interactions. Relaxation times after the T-jump were monitored as a function of temperature and occur on a sub-microsecond time scale, ᴰPᴰP being faster than WVYY-ᴰP. The Xxx-ᴰPro tertiary amide provides a detectable IR band, allowing us to probe the dynamics site-specifically. The relative importance of the turn versus intrastrand stability in β-sheet formation is discussed. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Closed-Loop Neuromorphic Benchmarks
Stewart, Terrence C.; DeWolf, Travis; Kleinhans, Ashley; Eliasmith, Chris
2015-01-01
Evaluating the effectiveness and performance of neuromorphic hardware is difficult. It is even more difficult when the task of interest is a closed-loop task; that is, a task where the output from the neuromorphic hardware affects some environment, which then in turn affects the hardware's future input. However, closed-loop situations are one of the primary potential uses of neuromorphic hardware. To address this, we present a methodology for generating closed-loop benchmarks that makes use of a hybrid of real physical embodiment and a type of “minimal” simulation. Minimal simulation has been shown to lead to robust real-world performance, while still maintaining the practical advantages of simulation, such as making it easy for the same benchmark to be used by many researchers. This method is flexible enough to allow researchers to explicitly modify the benchmarks to identify specific task domains where particular hardware excels. To demonstrate the method, we present a set of novel benchmarks that focus on motor control for an arbitrary system with unknown external forces. Using these benchmarks, we show that an error-driven learning rule can consistently improve motor control performance across a randomly generated family of closed-loop simulations, even when there are up to 15 interacting joints to be controlled. PMID:26696820
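A sketch in the spirit of the benchmark's error-driven learning rule follows (all gains and the disturbance are invented): a PD controller on a unit point mass plus an adaptive bias that learns away an unknown external force:

```python
# Error-driven adaptation on a 1-D point mass with an unknown disturbance.
dt, kp, kd, lr = 0.01, 40.0, 12.0, 10.0   # step, PD gains, learning rate (assumed)
disturbance = -3.5                        # unknown external force on the "joint"
x = v = w = 0.0                           # position, velocity, adaptive bias
target = 1.0

for step in range(3000):
    err = target - x
    u = kp * err - kd * v + w             # control signal with learned bias
    w += lr * err * dt                    # error-driven learning rule
    a = u + disturbance                   # unit mass: acceleration = net force
    v += a * dt
    x += v * dt
print(f"final error {target - x:.2e}, learned bias {w:.2f} (true {-disturbance})")
```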
Stratification of unresponsive patients by an independently validated index of brain complexity
Casarotto, Silvia; Comanducci, Angela; Rosanova, Mario; Sarasso, Simone; Fecchio, Matteo; Napolitani, Martino; Pigorini, Andrea; G. Casali, Adenauer; Trimarchi, Pietro D.; Boly, Melanie; Gosseries, Olivia; Bodart, Olivier; Curto, Francesco; Landi, Cristina; Mariotti, Maurizio; Devalle, Guya; Laureys, Steven; Tononi, Giulio
2016-01-01
Objective: Validating objective, brain-based indices of consciousness in behaviorally unresponsive patients represents a challenge due to the impossibility of obtaining independent evidence through subjective reports. Here we address this problem by first validating a promising metric of consciousness, the Perturbational Complexity Index (PCI), in a benchmark population who could confirm the presence or absence of consciousness through subjective reports, and then applying the same index to patients with disorders of consciousness (DOCs). Methods: The benchmark population encompassed 150 healthy controls and communicative brain-injured subjects in various states of conscious wakefulness, disconnected consciousness, and unconsciousness. Receiver operating characteristic curve analysis was performed to define an optimal cutoff for discriminating between the conscious and unconscious conditions. This cutoff was then applied to a cohort of noncommunicative DOC patients (38 in a minimally conscious state [MCS] and 43 in a vegetative state [VS]). Results: We found an empirical cutoff that discriminated with 100% sensitivity and specificity between the conscious and the unconscious conditions in the benchmark population. This cutoff resulted in a sensitivity of 94.7% in detecting MCS and allowed the identification of a number of unresponsive VS patients (9 of 43) with high values of PCI, overlapping with the distribution of the benchmark conscious condition. Interpretation: Given its high sensitivity and specificity in the benchmark and MCS populations, PCI offers a reliable, independently validated stratification of unresponsive patients that has important physiopathological and therapeutic implications. In particular, the high-PCI subgroup of VS patients may retain a capacity for consciousness that is not expressed in behavior. Ann Neurol 2016;80:718-729. PMID:27717082
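The cutoff selection can be sketched with synthetic scores (not patient data): sweep candidate PCI cutoffs and keep the one maximizing the Youden index, i.e. sensitivity plus specificity minus one:

```python
# ROC-style cutoff selection on a synthetic benchmark population.
import numpy as np

rng = np.random.default_rng(1)
pci_unconscious = rng.normal(0.20, 0.05, 75)   # benchmark: unconscious condition
pci_conscious = rng.normal(0.50, 0.08, 75)     # benchmark: conscious condition

scores = np.concatenate([pci_unconscious, pci_conscious])
labels = np.concatenate([np.zeros(75), np.ones(75)])

best = max(
    ((scores[labels == 1] >= c).mean()          # sensitivity at cutoff c
     + (scores[labels == 0] < c).mean() - 1, c) # + specificity - 1 (Youden J)
    for c in np.unique(scores)
)
print(f"Youden J = {best[0]:.3f} at PCI* = {best[1]:.3f}")
```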
Talaminos-Barroso, Alejandro; Estudillo-Valderrama, Miguel A; Roa, Laura M; Reina-Tosina, Javier; Ortega-Ruiz, Francisco
2016-06-01
M2M (Machine-to-Machine) communications represent one of the main pillars of the new Internet of Things (IoT) paradigm and are creating new opportunities for the eHealth business. Nevertheless, the large number of M2M protocols currently available hinders the selection of a suitable solution that satisfies the requirements eHealth applications can demand. Our first aim was to develop a tool providing a benchmarking analysis for objectively selecting among the most relevant M2M protocols for eHealth solutions; our second, to validate the tool with a particular use case: respiratory rehabilitation. A software tool, called the Distributed Computing Framework (DFC), has been designed and developed to execute the benchmarking tests and to facilitate deployment in environments with a large number of machines, independently of the protocol and the performance metrics selected. The DDS, MQTT, CoAP, JMS, AMQP and XMPP protocols were evaluated against specific performance metrics, including CPU usage, memory usage, bandwidth consumption, latency and jitter. The results obtained allowed us to validate a use case: respiratory rehabilitation of chronic obstructive pulmonary disease (COPD) patients in two scenarios with different types of requirements, Home-Based and Ambulatory. The results of the benchmark comparison can guide eHealth developers in the choice of M2M technologies. In this regard, the framework presented is a simple and powerful tool for the deployment of benchmark tests under specific environments and conditions. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
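Two of the metrics collected by such a framework are easy to compute from per-message round-trip times; the smoothed jitter estimator below is the RFC 3550 form, our choice rather than necessarily the DFC's:

```python
# Mean latency and smoothed jitter from a series of per-message RTTs.
def latency_stats(latencies_ms):
    mean = sum(latencies_ms) / len(latencies_ms)
    jitter = 0.0
    for prev, cur in zip(latencies_ms, latencies_ms[1:]):
        d = abs(cur - prev)
        jitter += (d - jitter) / 16.0   # RFC 3550-style exponential smoothing
    return mean, jitter

samples = [12.1, 11.8, 14.0, 12.2, 30.5, 12.0, 11.9]  # invented RTTs [ms]
print("mean latency %.2f ms, jitter %.2f ms" % latency_stats(samples))
```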
ARARS Q's and A's: The fund-balancing waiver. Fact sheet
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1991-01-01
This fact sheet concerns the applicable or relevant and appropriate requirements (ARARs) provisions. EPA has developed guidance and provided training to Regions and States on the identification of and compliance with ARARs. The fact sheet is part of a series providing guidance on a number of questions that arose in developing ARARs policies, in ARARs training sessions, and in identifying and complying with ARARs at specific sites. This fact sheet addresses the Fund-balancing waiver.
THE LANGUAGE LABORATORY--WORK SHEET.
ERIC Educational Resources Information Center
CROSBIE, KEITH
DESIGNED FOR TEACHERS AND ADMINISTRATORS, THIS WORK SHEET PROVIDES GENERAL AND SPECIFIC INFORMATION ABOUT THE PHILOSOPHY, TYPES, AND USES OF LANGUAGE LABORATORIES IN SECONDARY SCHOOL LANGUAGE PROGRAMS. THE FIRST SECTION DISCUSSES THE ADVANTAGES OF USING THE LABORATORY EFFECTIVELY TO REINFORCE AND CONSOLIDATE CLASSROOM LEARNING, AND MENTIONS SOME…
NASA Astrophysics Data System (ADS)
Rimov, A. A.; Chukanova, T. I.; Trofimov, Yu. V.
2016-12-01
Approaches to comparative quality analysis (benchmarking) of power installations applied in the power industry are systematized. It is shown that the most efficient implementation of the benchmarking technique is the analysis of statistical distributions of indicators within a composed homogeneous group of similar power installations. Building on this approach, a benchmarking technique is developed that is aimed at revealing available reserves for improving the reliability and heat-efficiency indicators of power installations at thermal power plants. The technique makes it possible to reliably compare the quality of power installations within a homogeneous group of limited size and to adopt adequate decisions on improving particular technical characteristics of a given installation. It structures the list of comparison indicators and the internal factors affecting them in accordance with the requirements of sectoral standards and with the price-formation characteristics of the Russian power industry; this structuring ensures traceability of the reasons for deviations of the internal influencing factors from their specified values. The starting point for a further detailed analysis of a given installation's lag behind best practice, expressed in specific monetary terms, is its position in the distribution of the key indicator, a convolution of the comparison indicators. The distribution of the key indicator is simulated by the Monte Carlo method from the actual distributions of the comparison indicators: specific lost profit due to the short supply of electric energy and short delivery of power, specific cost of losses due to nonoptimal expenditures for repairs, and specific cost of excess fuel-equivalent consumption. Quality-loss indicators facilitating the analysis of the benchmarking results are developed, representing the quality loss of a power installation as the difference between the actual value of the key (or comparison) indicator and the best quartile of the existing distribution. The uncertainty of the obtained quality-loss values was evaluated by transforming the standard uncertainties of the input values into expanded uncertainties of the output values at a confidence level of 95%. The efficiency of the technique is demonstrated by benchmarking the main thermal and mechanical equipment of T-250 extraction power-generating units and power installations of thermal power plants with a main steam pressure of 130 atm.
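The Monte Carlo convolution step can be sketched numerically (all distributions invented): the key indicator is simulated as the sum of the three specific-cost comparison indicators, and a unit's quality loss is its distance to the best quartile:

```python
# Monte Carlo convolution of comparison indicators into a key indicator.
import numpy as np

rng = np.random.default_rng(7)
N = 100_000
lost_profit = rng.lognormal(mean=3.0, sigma=0.4, size=N)   # short supply of energy
repair_cost = rng.lognormal(mean=2.5, sigma=0.5, size=N)   # nonoptimal repairs
excess_fuel = rng.lognormal(mean=2.8, sigma=0.3, size=N)   # excess fuel burned

key = lost_profit + repair_cost + excess_fuel              # key indicator samples
best_quartile = np.percentile(key, 25)                     # best practice boundary

unit_value = 65.0                        # this unit's actual key indicator (assumed)
quality_loss = max(0.0, unit_value - best_quartile)
print(f"best quartile {best_quartile:.1f}, quality loss {quality_loss:.1f}")
```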
INL Results for Phases I and III of the OECD/NEA MHTGR-350 Benchmark
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerhard Strydom; Javier Ortensi; Sonat Sen
2013-09-01
The Idaho National Laboratory (INL) Very High Temperature Reactor (VHTR) Technology Development Office (TDO) Methods Core Simulation group led the construction of the Organization for Economic Cooperation and Development (OECD) Modular High Temperature Reactor (MHTGR) 350 MW benchmark for comparing and evaluating prismatic VHTR analysis codes. The benchmark is sponsored by the OECD's Nuclear Energy Agency (NEA), and the project will yield a set of reference steady-state, transient, and lattice depletion problems that can be used by the Department of Energy (DOE), the Nuclear Regulatory Commission (NRC), and vendors to assess their code suites. The Methods group is responsible for defining the benchmark specifications, leading the data collection and comparison activities, and chairing the annual technical workshops. This report summarizes the latest INL results for Phase I (steady state) and Phase III (lattice depletion) of the benchmark. The INSTANT, Pronghorn and RattleSnake codes were used for the standalone core neutronics modeling of Exercise 1, and the results obtained from these codes are compared in Section 4. Exercise 2 of Phase I requires the standalone steady-state thermal fluids modeling of the MHTGR-350 design, and the results for the systems code RELAP5-3D are discussed in Section 5. The coupled neutronics and thermal fluids steady-state solution for Exercise 3 is reported in Section 6, utilizing the newly developed Parallel and Highly Innovative Simulation for INL Code System (PHISICS)/RELAP5-3D code suite. Finally, the lattice depletion models and results obtained for Phase III are compared in Section 7. The MHTGR-350 benchmark proved to be a challenging set of problems to model accurately, and even with the simplifications introduced in the benchmark specification this activity is an important step in the code-to-code verification of modern prismatic VHTR codes. A final OECD/NEA comparison report will compare the Phase I and III results of all international participants in 2014, while the remaining Phase II transient case results will be reported in 2015.
Metastable Autoionizing States of Molecules and Radicals in Highly Energetic Environment
2016-03-22
electronic states. The specific aims are to develop and calibrate complex-scaled equation-of-motion coupled cluster (cs-EOM-CC) and CAP (complex absorbing potential) augmented EOM-CC methods. We have implemented and benchmarked cs-EOM-CCSD and CAP-augmented EOM-CCSD methods for excitation energies.
The US EPA National Center for Environmental Assessment has developed a methodology to derive acute inhalation toxicity benchmarks, called acute reference exposures (AREs), for noncancer effects. The methodology provides guidance for the derivation of chemical-specific benchmark...
Chemical vapor deposition growth
NASA Technical Reports Server (NTRS)
Ruth, R. P.; Manasevit, H. M.; Kenty, J. L.; Moudy, L. A.; Simpson, W. I.; Yang, J. J.
1976-01-01
The chemical vapor deposition (CVD) method for the growth of Si sheet on inexpensive substrate materials is investigated. The objective is to develop CVD techniques for producing large areas of Si sheet on inexpensive substrate materials, with sheet properties suitable for fabricating solar cells meeting the technical goals of the Low Cost Silicon Solar Array Project. Specific areas covered include: (1) modification and test of existing CVD reactor system; (2) identification and/or development of suitable inexpensive substrate materials; (3) experimental investigation of CVD process parameters using various candidate substrate materials; (4) preparation of Si sheet samples for various special studies, including solar cell fabrication; (5) evaluation of the properties of the Si sheet material produced by the CVD process; and (6) fabrication and evaluation of experimental solar cell structures, using standard and near-standard processing techniques.
46 CFR 160.061-1 - Applicable specifications.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Applicable specifications. (a) The following specifications, of the issue in effect on the date emergency..., fiber, hard sheet. CCC-F-451—Flannel, canton. (2) Military specifications: MIL-H-2846—Hooks, fish, steel...
46 CFR 160.061-1 - Applicable specifications.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Applicable specifications. (a) The following specifications, of the issue in effect on the date emergency..., fiber, hard sheet. CCC-F-451—Flannel, canton. (2) Military specifications: MIL-H-2846—Hooks, fish, steel...
46 CFR 160.061-1 - Applicable specifications.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Applicable specifications. (a) The following specifications, of the issue in effect on the date emergency..., fiber, hard sheet. CCC-F-451—Flannel, canton. (2) Military specifications: MIL-H-2846—Hooks, fish, steel...
ERIC Educational Resources Information Center
Michigan State Univ., East Lansing. Coll. of Agriculture and Natural Resources Education Inst.
This task-based curriculum guide for agricultural production, specifically for dairy livestock, is intended to help the teacher develop a classroom management system where students learn by doing. Introductory materials include a Dictionary of Occupational Titles job code and title sheet, a task sheet for developing leadership skills, and a task…
ERIC Educational Resources Information Center
Michigan State Univ., East Lansing. Coll. of Agriculture and Natural Resources Education Inst.
This task-based curriculum guide for agricultural production, specifically for sheep, is intended to help the teacher develop a classroom management system where students learn by doing. Introductory materials include a Dictionary of Occupational Titles job code and title sheet, a task sheet for developing leadership skills, and a task list. Each…
ERIC Educational Resources Information Center
Michigan State Univ., East Lansing. Coll. of Agriculture and Natural Resources Education Inst.
This task-based curriculum guide for agricultural production, specifically for beef livestock, is intended to help the teacher develop a classroom management system where students learn by doing. Introductory materials include a Dictionary of Occupational Titles job code and title sheet, a task sheet for developing leadership skills, and a task…
Agriculture. Poultry Livestock.
ERIC Educational Resources Information Center
Michigan State Univ., East Lansing. Coll. of Agriculture and Natural Resources Education Inst.
This task-based curriculum guide for agricultural production, specifically for poultry, is intended to help the teacher develop a classroom management system where students learn by doing. Introductory materials include a Dictionary of Occupational Titles job code and title sheet, a task sheet for developing leadership skills, and a task list.…
ERIC Educational Resources Information Center
Michigan State Univ., East Lansing. Coll. of Agriculture and Natural Resources Education Inst.
This task-based curriculum guide for agricultural production, specifically for swine, is intended to help the teacher develop a classroom management system where students learn by doing. Introductory materials include a Dictionary of Occupational Titles job code and title sheet, a task sheet for developing leadership skills, and a task list. Each…
37 CFR 1.76 - Application data sheet.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Application data sheet. 1.76 Section 1.76 Patents, Trademarks, and Copyrights UNITED STATES PATENT AND TRADEMARK OFFICE, DEPARTMENT OF COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES National Processing Provisions Specification § 1.76...
Hatfield, Mark D; Ashton, Carol M; Bass, Barbara L; Shirkey, Beverly A
2016-02-01
Methods to assess a surgeon's individual performance based on clinically meaningful outcomes have not been fully developed, due to small numbers of adverse outcomes and wide variation in case volumes. The Achievable Benchmark of Care (ABC) method addresses these issues by identifying benchmark-setting surgeons with high levels of performance and greater case volumes. This method was used to help surgeons compare their surgical practice to that of their peers by using merged National Surgical Quality Improvement Program (NSQIP) and Metabolic and Bariatric Surgery Accreditation and Quality Improvement Program (MBSAQIP) data to generate surgeon-specific reports. A retrospective cohort study at a single institution's department of surgery was conducted involving 107 surgeons (8,660 cases) over 5.5 years. Stratification of more than 32,000 CPT codes into 16 CPT clusters served as the risk adjustment. Thirty-day outcomes of interest included surgical site infection (SSI), acute kidney injury (AKI), and mortality. Performance characteristics of the ABC method were explored by examining how many surgeons were identified as benchmark-setters in view of volume and outcome rates within CPT clusters. For the data captured, most surgeons performed cases spanning a median of 5 CPT clusters (range 1 to 15 clusters), with a median of 26 cases (range 1 to 776 cases) and a median of 2.8 years (range 0 to 5.5 years). The highest volume surgeon for that CPT cluster set the benchmark for 6 of 16 CPT clusters for SSIs, 8 of 16 CPT clusters for AKIs, and 9 of 16 CPT clusters for mortality. The ABC method appears to be a sound and useful approach to identifying benchmark-setting surgeons within a single institution. Such surgeons may be able to help their peers improve their performance. Copyright © 2016 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
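To make the ABC approach concrete, here is a minimal Python sketch of the calculation as it is usually described in the quality-measurement literature (the Bayesian ranking adjustment, the 10% coverage rule, and every number below are illustrative assumptions, not details drawn from this study):

def abc_benchmark(providers, min_fraction=0.10):
    """providers: (numerator, denominator) pairs per provider, where the
    numerator counts cases achieving the desirable outcome."""
    total_cases = sum(d for _, d in providers)
    # Rank by a volume-damped performance fraction so tiny denominators
    # cannot dominate the benchmark.
    ranked = sorted(providers, key=lambda nd: (nd[0] + 1) / (nd[1] + 2), reverse=True)
    num = den = 0
    for n, d in ranked:
        num, den = num + n, den + d
        if den >= min_fraction * total_cases:  # benchmark-setters cover >= 10% of cases
            break
    return num / den

# Toy data: five surgeons as (cases meeting the measure, total cases).
print(abc_benchmark([(45, 50), (18, 20), (90, 110), (7, 10), (300, 330)]))

For adverse outcomes such as SSI or AKI, the same logic applies with the ranking inverted (lowest adjusted rate first).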
Modelling the climate and ice sheets of the mid-Pliocene warm period: a test of model dependency
NASA Astrophysics Data System (ADS)
Dolan, Aisling; Haywood, Alan; Lunt, Daniel; Hill, Daniel
2010-05-01
The mid-Pliocene warm period (MPWP; c. 3.0-3.3 million years ago) has been the subject of a large number of published studies during the last decade. It is an interval in Earth history where conditions were similar to those predicted by climate models for the end of the 21st century. Not only is it important to increase our understanding of the climate dynamics in a warmer world, it is also important to determine exactly how well numerical models can retrodict a climate significantly different from the present day, in order to have confidence in them for predicting the future climate. Previous General Circulation Model (GCM) simulations have indicated that MPWP mean annual surface temperatures were on average 2 to 3°C warmer than the pre-industrial era. Coastal stratigraphy and benthic oxygen isotope records suggest that terrestrial ice volumes were reduced when compared to modern. Ice sheet modelling studies have supported this decrease in cryospheric extent. Generally speaking, both climate and ice sheet modelling studies have only used results from one numerical model when simulating the climate of the MPWP. However, recent projects such as PMIP (the Palaeoclimate Modelling Intercomparison Project) have emphasised the need to explore the dependency of past climate predictions on the specific climate model which is used. Here we present a comparison of MPWP climatologies produced by three atmosphere-only GCMs from the Goddard Institute for Space Studies (GISS), the National Center for Atmospheric Research (NCAR) and the Hadley Centre for Climate Prediction and Research (GCMAM3, CAM3-CLM and HadAM3, respectively). We focus on the ability of the GCMs to simulate climate fields needed to drive an offline ice sheet model to assess whether there are any significant differences between the climatologies. By taking the different temperature and precipitation predictions simulated by the three models as a forcing, and adopting GCM-specific topography, we have used the British Antarctic Survey thermomechanically coupled ice sheet model (BASISM) to test the extent to which equilibrium-state ice sheets in the Northern Hemisphere are GCM dependent. Initial results which do not use GCM-specific topography suggest that employing different GCM climatologies with only small differences in surface air temperature and precipitation has a dramatic effect on the resultant Greenland ice sheet, where the end-member ice sheets vary from near modern to almost zero ice volume. As an extension of this analysis, we will also present results using a second ice sheet model (Glimmer), with a view to testing the degree to which end-member ice sheets are ice sheet model dependent, something which has not previously been addressed. Initially, BASISM and Glimmer will be internally optimised for performance, but we will also present a comparison where BASISM will be configured to the Glimmer model setup in a further test of ice sheet model dependency.
2014-01-09
High-performance transparent and stretchable all-solid supercapacitors based on highly aligned carbon nanotube sheets
Chen, Tao; Peng, Huisheng; ...
Transparent and stretchable all-solid supercapacitors with good stability were developed. A transmittance up to 75% at the wavelength of 550 nm was achieved for a supercapacitor made from a cross-over assembly of two single-layer CNT sheets. The transparent supercapacitor has a specific capacitance of 7.3 F g⁻¹ and can be...
27. "SITE PLAN." Specifications No. OC1-57-75, Drawing No. AF-60-09-15, sheet 1 of 96, D.O. Series No. AF 1394/20, Rev. B. Stamped: RECORD DRAWING - AS CONSTRUCTED. Below stamp: Contract no. 5296 Rev. B, Date: 11/17/59. Site plan of 20,000-foot track, including construction phasing notes. - Edwards Air Force Base, South Base Sled Track, Edwards Air Force Base, North of Avenue B, between 100th & 140th Streets East, Lancaster, Los Angeles County, CA
Fact Sheet: Vulnerable Young Children
ERIC Educational Resources Information Center
Shaw, Evelyn, Comp.; Goode, Sue, Comp.
2008-01-01
This fact sheet provides data on infants, toddlers and young children who are experiencing high stress as a result of a number of risk factors specifically identified in the Individuals with Disabilities Education Improvement Act of 2004 (IDEA 2004), including substantiated abuse or neglect, foster care placement, homelessness, exposure to family…
Machine Shop Practice. Trade and Industrial Education Course of Study.
ERIC Educational Resources Information Center
Emerly, Robert J.; And Others
Designed for secondary school students who are interested in becoming machinists, this beginning course guide in machine shop practice is organized into the following sections: (1) Introduction, (2) instructional plan, (3) educational philosophy, (4) specific course objectives, (5) course outline, (6) job sheets, and (7) operation sheets. The…
14. Photographic copy of photocopy of bridge drawing, reinforced rod specifications (June 12, 1937, original drawing on file in Structures Section, Utah Department of Transportation, Salt Lake City, Utah). SHEET NO. 6 OF 6 SHEETS. - Gould Wash Bridge, Spanning Gould Wash at State Route 9, Hurricane, Washington County, UT
46 CFR 160.005-4 - Construction.
Code of Federal Regulations, 2010 CFR
2010-10-01
...: SPECIFICATIONS AND APPROVAL LIFESAVING EQUIPMENT Life Preservers, Fibrous Glass, Adult and Child (Jacket Type.... 160.005-1, Sheet 1, for adult size, and Sheet 4, for child size, joined by seams and stitching as shown on the drawing. A drawstring tunnel shall be formed by stitching a strip of the tunnel strip...
Sakai, Yusuke; Koike, Makiko; Hasegawa, Hideko; Yamanouchi, Kosho; Soyama, Akihiko; Takatsuki, Mitsuhisa; Kuroki, Tamotsu; Ohashi, Kazuo; Okano, Teruo; Eguchi, Susumu
2013-01-01
Cell sheet engineering is attracting attention from investigators in various fields, from basic research scientists to clinicians focused on regenerative medicine. However, hepatocytes have a limited proliferation potential in vitro, and it generally takes several days to form a sheet morphology and multi-layered sheets. We herein report our rapid and efficient technique for generating multi-layered human hepatic cell (HepaRG® cell) sheets using pre-cultured fibroblast monolayers derived from human skin (TIG-118 cells) as a feeder layer on a temperature-responsive culture dish. Multi-layered TIG-118/HepaRG cell sheets with a thick morphology were harvested on day 4 of culturing HepaRG cells by forceful contraction of the TIG-118 cells, and the resulting sheet could be easily handled. In addition, the human albumin and alpha 1-antitrypsin synthesis activities of TIG-118/HepaRG cells were approximately 1.2 and 1.3 times higher than those of HepaRG cells, respectively. Therefore, this technique is considered to be a promising modality for rapidly fabricating multi-layered human hepatocyte sheets from cells with limited proliferation potential, and the engineered cell sheet could be used for cell transplantation with highly specific functions. PMID:23923035
The influence of meltwater on the thermal structure and flow of the Greenland Ice Sheet
NASA Astrophysics Data System (ADS)
Poinar, Kristin
As the climate has warmed over the past decades, the amount of melt on the Greenland Ice Sheet has increased, and areas higher on the ice sheet have begun to melt regularly. This increase in melt has been hypothesized to enhance ice flow in myriad ways, including through basal lubrication and englacial refreezing. By developing and interpreting thermal ice-sheet models and analyzing remote sensing data, I evaluate the effect of these processes on ice flow and sea-level rise from the Greenland Ice Sheet. I first develop a thermal ice sheet model that is applicable to western Greenland. Key components of this model are its treatment of multiple phases (solid ice and liquid water) and its viscosity-dependent velocity field. I apply the model to Jakobshavn Isbrae, a fast-flowing outlet glacier. This is an important benchmark for my model, which I next apply to the topics outlined above. I use the thermal model to calculate the effect of englacial latent-heat transfer (meltwater refreezing within englacial features such as firn and crevasses) on ice dynamics in western Greenland. I find that in slow-moving areas, this can significantly warm the ice, but that englacial latent heat transfer has only a minimal effect on ice motion. In the fast-moving regions that carry the majority (>60%) of the ice flux into the ocean, evidence of deep englacial warming is virtually absent. Thus, the effects of englacial latent heat transfer on ice motion are likely limited to slow-moving regions, which limits its importance to ice-sheet mass balance. Next, I couple a model for ice fracture to a modified version of my thermal model to calculate the depth and shape evolution of water-filled crevasses that form in crevasse fields. At most elevations and for typical water input volumes, crevasses penetrate to the top ~200-300 meters depth, warm the ice there by ~10°C, and may persist englacially, in a liquid state, for multiple decades. The surface hydrological network limits the amount of water that can reach most crevasses. We find that the depth and longevity of such crevasses are relatively robust to realistic increases in melt volumes over the coming century, so that we should not expect large changes in the englacial hydrological system under near-future climate regimes. These inferences put important constraints on the timescales of the Greenland supraglacial-to-subglacial water cycle. Finally, I assess the likelihood that higher-elevation surface melt could deliver water to regions where the bed is currently frozen. This hypothetical process is important because it could potentially greatly accelerate the seaward motion of the ice sheet. By analyzing surface strain rates and comparing them to my modeled basal temperature field, I find that this scenario is unlikely to occur: the conditions necessary to form surface-to-bed conduits are rarely found at higher elevations (~1600 meters) that may overlie frozen beds.
Chemotherapy Extravasation: Establishing a National Benchmark for Incidence Among Cancer Centers.
Jackson-Rose, Jeannette; Del Monte, Judith; Groman, Adrienne; Dial, Linda S; Atwell, Leah; Graham, Judy; O'Neil Semler, Rosemary; O'Sullivan, Maryellen; Truini-Pittman, Lisa; Cunningham, Terri A; Roman-Fischetti, Lisa; Costantinou, Eileen; Rimkus, Chris; Banavage, Adrienne J; Dietz, Barbara; Colussi, Carol J; Catania, Kimberly; Wasko, Michelle; Schreffler, Kevin A; West, Colleen; Siefert, Mary Lou; Rice, Robert David
2017-08-01
Given the high-risk nature and nurse sensitivity of chemotherapy infusion and extravasation prevention, as well as the absence of an industry benchmark, a group of nurses studied oncology-specific nursing-sensitive indicators. The purpose was to establish a benchmark for the incidence of chemotherapy extravasation with vesicants, irritants, and irritants with vesicant potential. Infusions with actual or suspected extravasations of vesicant and irritant chemotherapies were evaluated. Extravasation events were reviewed by type of agent, occurrence by drug category, route of administration, level of harm, follow-up, and patient referrals to surgical consultation. A total of 739,812 infusions were evaluated, with 673 extravasation events identified. Incidence for all extravasation events was 0.09%.
Ontology for Semantic Data Integration in the Domain of IT Benchmarking.
Pfaff, Matthias; Neubig, Stefan; Krcmar, Helmut
2018-01-01
A domain-specific ontology for IT benchmarking has been developed to bridge the gap between a systematic characterization of IT services and their data-based valuation. Since information is generally collected during a benchmark exercise using questionnaires on a broad range of topics, such as employee costs, software licensing costs, and quantities of hardware, it is commonly stored as natural language text; thus, this information is stored in an intrinsically unstructured form. Although these data form the basis for identifying potentials for IT cost reductions, neither a uniform description of any measured parameters nor the relationship between such parameters exists. Hence, this work proposes an ontology for the domain of IT benchmarking, available at https://w3id.org/bmontology. The design of this ontology is based on requirements mainly elicited from a domain analysis, which considers analyzing documents and interviews with representatives from Small- and Medium-Sized Enterprises and Information and Communications Technology companies over the last eight years. The development of the ontology and its main concepts is described in detail (i.e., the conceptualization of benchmarking events, questionnaires, IT services, indicators and their values) together with its alignment with the DOLCE-UltraLite foundational ontology.
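As a toy illustration of how such an ontology could be used programmatically, the Python snippet below asserts a few triples with rdflib; the class and property names (BenchmarkingEvent, hasIndicator, hasValue) are invented for this sketch and may not match the actual terms published at https://w3id.org/bmontology:

from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

BM = Namespace("https://w3id.org/bmontology#")  # namespace form assumed for illustration
g = Graph()
event = BM.BenchmarkingEvent2018  # hypothetical individual
g.add((event, RDF.type, BM.BenchmarkingEvent))
g.add((event, BM.hasIndicator, BM.EmployeeCostsPerWorkplace))
g.add((BM.EmployeeCostsPerWorkplace, BM.hasValue, Literal(1234.5)))
print(g.serialize(format="turtle"))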
NASA Technical Reports Server (NTRS)
Nysmith, C. Robert; Summers, James L.
1961-01-01
Small pyrex glass spheres, representative of stony meteoroids, were fired into 2024-T3 aluminum alclad multiple-sheet structures at velocities to 11,000 feet per second to evaluate the effectiveness of multisheet hull construction as a means of increasing the resistance of a spacecraft to meteoroid penetrations. The results of these tests indicate that increasing the number of sheets in a structure while keeping the total sheet thickness constant and increasing the spacing between sheets both tend to increase the penetration resistance of a structure of constant weight per unit area. In addition, filling the space between the sheets with a light filler material was found to substantially increase structure penetration resistance with a small increase in weight. An evaluation of the meteoroid hazard to space vehicles is presented in the form of an illustrative example for two specific lunar mission vehicles, a single-sheet, monocoque hull vehicle and a glass-wool filled, double-sheet hull vehicle. The evaluation is presented in terms of the "best" and the "worst" conditions that might be expected as determined from astronomical and satellite measurements, high-speed impact data, and hypothesized meteoroid structures and compositions. It was observed that the vehicle flight time without penetration can be increased significantly by use of multiple-sheet rather than single-sheet hull construction with no increase in hull weight. Nevertheless, it is evident that a meteoroid hazard exists, even for the vehicle with the selected multiple-sheet hull.
Landercasper, Jeffrey; Fayanju, Oluwadamilola M; Bailey, Lisa; Berry, Tiffany S; Borgert, Andrew J; Buras, Robert; Chen, Steven L; Degnim, Amy C; Froman, Joshua; Gass, Jennifer; Greenberg, Caprice; Mautner, Starr Koslow; Krontiras, Helen; Ramirez, Luis D; Sowden, Michelle; Wexelman, Barbara; Wilke, Lee; Rao, Roshni
2018-02-01
Nine breast cancer quality measures (QM) were selected by the American Society of Breast Surgeons (ASBrS) for the Centers for Medicare and Medicaid Services (CMS) Quality Payment Programs (QPP) and other performance improvement programs. We report member performance. Surgeons entered QM data into an electronic registry. For each QM, aggregate "performance met" (PM) was reported (median, range and percentiles) and benchmarks (target goals) were calculated by CMS methodology, specifically, the Achievable Benchmark of Care™ (ABC) method. A total of 1,286,011 QM encounters were captured from 2011-2015. For 7 QM, first and last PM rates were as follows: (1) needle biopsy (95.8, 98.5%), (2) specimen imaging (97.9, 98.8%), (3) specimen orientation (98.5, 98.3%), (4) sentinel node use (95.1, 93.4%), (5) antibiotic selection (98.0, 99.4%), (6) antibiotic duration (99.0, 99.8%), and (7) no surgical site infection (98.8, 98.9%); all p values < 0.001 for trends. Variability and reasons for noncompliance by surgeon for each QM were identified. The CMS-calculated target goals (ABC™ benchmarks) for PM for 6 QM were 100%, suggesting that not meeting performance is a "never should occur" event. Surgeons self-reported a large number of specialty-specific patient-measure encounters into a registry for self-assessment and participation in QPP. Despite high levels of performance demonstrated initially in 2011 with minimal subsequent change, the ASBrS concluded "perfect" performance was not a realistic goal for QPP. Thus, after review of our normative performance data, the ASBrS recommended different benchmarks than CMS for each QM.
Zhan, Cheng; Zhang, Pengfei; Dai, Sheng; ...
2016-11-16
Supercapacitors based on the electric double-layer mechanism use porous carbons or graphene as electrodes. To move beyond this paradigm, we propose boron supercapacitors to leverage two-dimensional (2D) boron sheets' metallicity and low weight. Six 2D boron sheets from both previous theoretical design and experimental growth are chosen as test electrodes. By applying joint density functional theory (JDFT) to the electrode–electrolyte system, we examine how the 2D boron sheets charge up against applied potential. JDFT predicts that these 2D boron sheets exhibit specific capacitance on the order of 400 F/g, about four times that of graphene. As a result, our work suggests that 2D boron sheets are promising electrodes for supercapacitor applications.
Imaging Neuronal Seal Resistance on Silicon Chip using Fluorescent Voltage-Sensitive Dye
Braun, Dieter; Fromherz, Peter
2004-01-01
The electrical sheet resistance between living cells grown on planar electronic contacts of semiconductors or metals is a crucial parameter for bioelectronic devices. It determines the strength of electrical signal transduction from cells to chips and from chips to cells. We measured the sheet resistance by applying AC voltage to oxidized silicon chips and by imaging the voltage change across the attached cell membrane with a fluorescent voltage-sensitive dye. The phase map of voltage change was fitted with a planar core-coat conductor model using the sheet resistance as a free parameter. For nerve cells from rat brain on polylysine as well as for HEK293 cells and MDCK cells on fibronectin we find a similar sheet resistance of 10 MΩ. Taking into account the independently measured distance of 50 nm between chip and membrane for these cells, we obtain a specific resistance of 50 Ωcm that is indistinguishable from bulk electrolyte. On the other hand, the sheet resistance for erythrocytes on polylysine is far higher, at ∼1.5 GΩ. Considering the distance of 10 nm, the specific resistance in the narrow cleft is enhanced to 1500 Ωcm. We find this novel optical method to be a convenient tool to optimize the interface between cells and chips for bioelectronic devices. PMID:15298937
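The fit described above uses the full planar core-coat conductor model; as a deliberately simplified stand-in, the sketch below fits a single-pole (lumped RC) phase response to synthetic phase-vs-frequency data, to show how a time constant, and from it a sheet resistance, can be extracted. The model reduction, the contact area, and all numbers are assumptions for illustration only, not the paper's procedure.

import numpy as np
from scipy.optimize import curve_fit

def phase_deg(freq_hz, tau_s):
    # Phase lag of a one-pole low-pass: -atan(2*pi*f*tau), in degrees.
    return -np.degrees(np.arctan(2 * np.pi * freq_hz * tau_s))

freqs = np.array([1e2, 3e2, 1e3, 3e3, 1e4])                # drive frequencies (Hz)
measured = phase_deg(freqs, 1.6e-4) + np.random.normal(0, 1.0, freqs.size)

(tau_fit,), _ = curve_fit(phase_deg, freqs, measured, p0=[1e-4])
c_junction = 1e-6 * 1e-4   # ~1 uF/cm^2 membrane over an assumed 1e-4 cm^2 contact
print(f"tau = {tau_fit:.2e} s  ->  R_sheet ~ {tau_fit / c_junction:.2e} Ohm")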
Brandenburg, Marcus; Hahn, Gerd J
2018-06-01
Process industries typically involve complex manufacturing operations and thus require adequate decision support for aggregate production planning (APP). The need for powerful and efficient approaches to solve complex APP problems persists. Problem-specific solution approaches are advantageous compared to standardized approaches that are designed to provide basic decision support for a broad range of planning problems but are inadequate for optimizing under specific settings. This in turn calls for methods to compare different approaches regarding their computational performance and solution quality. In this paper, we present a benchmarking problem for APP in the chemical process industry. The presented problem focuses on (i) sustainable operations planning involving multiple alternative production modes/routings with specific production-related carbon emissions and the social dimension of varying operating rates and (ii) integrated campaign planning with production mix/volume on the operational level. The mutual trade-offs between economic, environmental and social factors can be considered as externalized factors (production-related carbon emissions and overtime working hours) as well as internalized ones (resulting costs). We provide data for all problem parameters in addition to a detailed verbal problem statement. We refer to Hahn and Brandenburg [1] for a first numerical analysis based on this benchmarking problem and for future research perspectives arising from it.
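To indicate the flavor of such an APP model (a toy linear program in the spirit of the benchmark, with invented names and numbers rather than the published data set), consider two production modes with different costs and carbon intensities, a demand floor, an emission cap, and overtime that extends one mode's capacity:

from scipy.optimize import linprog

# Decision variables: x = [mode_A_tons, mode_B_tons, overtime_hours]
cost = [50.0, 40.0, 30.0]                      # objective: minimize total cost
A_ub = [[-1.0, -1.0, 0.0],                     # demand: xA + xB >= 1000 t
        [0.8, 1.2, 0.0],                       # emissions: 0.8 xA + 1.2 xB <= 1100 t CO2
        [1.0, 0.0, 0.0],                       # capacity: xA <= 700 t
        [0.0, 1.0, -2.0]]                      # capacity: xB <= 400 t + 2 t per overtime hour
b_ub = [-1000.0, 1100.0, 700.0, 400.0]
res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
print(res.x, res.fun)                          # optimal plan and its cost

Campaign structure, multiple periods, and the emission/overtime externalization described above would add variables and constraints, but the trade-off mechanics are already visible in this toy: tightening the emission cap shifts volume toward the low-emission mode and raises cost.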
NASA Astrophysics Data System (ADS)
Oh, S.-T.; Chang, H.-J.; Oh, K. H.; Han, H. N.
2006-04-01
It has been observed that the forming limit curves at fracture (FLCF) of steel sheets with a relatively higher ductility limit have linear shapes, similar to those of a bulk forming process. In contrast, the FLCF of sheets with a relatively lower ductility limit have rather complex shapes approaching the forming limit curve at neck (FLCN) towards the equi-biaxial strain paths. In this study, the FLCFs of steel sheets were measured and compared with the fracture strains predicted from specific ductile fracture criteria, including a criterion suggested by the authors, which can accurately describe FLCFs with both linear and complex shapes. To predict the forming limit for hydro-mechanical deep drawing of steel sheets, the ductile fracture criteria were integrated into a finite element simulation. The simulation results based on the criterion suggested by the authors accurately predicted the experimental fracture limits of steel sheets for the hydro-mechanical deep drawing process.
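As a concrete example of the general form such criteria take (the classical Cockcroft-Latham criterion is shown here for orientation; it is not necessarily the criterion proposed by the authors), fracture is predicted once the tensile plastic work per unit volume reaches a material constant: ∫₀^ε̄f σ₁ dε̄ = C, where σ₁ is the maximum principal stress, ε̄ the equivalent plastic strain, ε̄f its value at fracture, and C a constant calibrated from experiments. Criteria of this integral form can be evaluated pointwise at each element during a finite element simulation, which is the sense in which fracture criteria are "integrated into" the simulation above.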
Fluctuation dynamics in reconnecting current sheets
NASA Astrophysics Data System (ADS)
von Stechow, Adrian; Grulke, Olaf; Ji, Hantao; Yamada, Masaaki; Klinger, Thomas
2015-11-01
During magnetic reconnection, a highly localized current sheet forms at the boundary between opposed magnetic fields. Its steep perpendicular gradients and fast parallel drifts can give rise to a range of instabilities which can contribute to the overall reconnection dynamics. In two complementary laboratory reconnection experiments, MRX (PPPL, Princeton) and VINETA.II (IPP, Greifswald, Germany), magnetic fluctuations are observed within the current sheet. Despite the large differences in geometries (toroidal vs. linear), plasma parameters (high vs. low beta) and magnetic configuration (low vs. high magnetic guide field), similar broadband fluctuation characteristics are observed in both experiments. These are identified as Whistler-like fluctuations in the lower hybrid frequency range that propagate along the current sheet in the electron drift direction. They are intrinsic to the localized current sheet and largely independent of the slower reconnection dynamics. This contribution characterizes these magnetic fluctuations within the wide parameter range accessible by both experiments. Specifically, the fluctuation spectra and wave dispersion are characterized with respect to the magnetic topology and plasma parameters of the reconnecting current sheet.
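For orientation, the lower hybrid frequency range referred to above is set by the field and plasma parameters through 1/ω_LH² = 1/(ω_ce ω_ci) + 1/ω_pi²; in the high-density limit (ω_pe ≫ ω_ce) this reduces to the geometric mean of the electron and ion gyrofrequencies, f_LH ≈ √(f_ce f_ci). This is a standard textbook relation quoted here for context, not a result of the study.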
Wang, Cunjing; Wu, Dapeng; Wang, Hongju; Gao, Zhiyong; Xu, Fang; Jiang, Kai
2018-08-01
A facile potassium chloride salt-locking technique combined with hydrothermal treatment of precursors was explored to prepare nitrogen-doped hierarchical porous carbon sheets in air from biomass. Benefiting from the effective synthesis strategy, the as-obtained carbon possesses a unique nitrogen-doped thin carbon sheet structure with abundant hierarchical pores and a large specific surface area of 1459 m² g⁻¹. The doped nitrogen in the carbon framework has a positive effect on the electrochemical properties of the electrode material, the thin carbon sheet structure benefits fast ion transfer, the abundant meso-pores provide convenient channels for rapid charge transportation, and the large specific surface area and numerous micro-pores guarantee sufficient ion-storage sites. Therefore, applied in supercapacitors, the carbon electrode material exhibits an outstanding specific capacitance of 451 F g⁻¹ at 0.5 A g⁻¹ in a three-electrode system. Moreover, the assembled symmetric supercapacitor based on two identical carbon electrodes also displays a high specific capacitance of 309 F g⁻¹ at 0.5 A g⁻¹, excellent rate capability and remarkable cycling stability, with 99.3% of the initial capacitance retained after 10,000 cycles at 5 A g⁻¹. The synthesis strategy avoids expensive inert gas protection and the use of corrosive KOH and toxic ZnCl₂ activating reagents, representing a promising green route to design advanced carbon electrode materials from biomass for high-capacity supercapacitors. Copyright © 2018. Published by Elsevier Inc.
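For readers wanting to connect such figures to raw measurements, gravimetric capacitance from a galvanostatic discharge is commonly computed as C = I·Δt/(m·ΔV); the helper below uses this standard relation with invented example numbers (it is not taken from the paper's methods):

def specific_capacitance(current_a, discharge_time_s, mass_g, voltage_window_v):
    # C (F/g) = I * dt / (m * dV) for a galvanostatic discharge
    return current_a * discharge_time_s / (mass_g * voltage_window_v)

# Example: 0.5 A per gram of active material sustained for 902 s over a 1.0 V window
print(specific_capacitance(0.5, 902.0, 1.0, 1.0))   # ~451 F/g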
Marshall, Margaret A.
2014-11-04
In the early 1970s Dr. John T. Mihalczo (team leader), J.J. Lynn, and J.R. Taylor performed experiments at the Oak Ridge Critical Experiments Facility (ORCEF) with highly enriched uranium (HEU) metal (called Oak Ridge Alloy or ORALLOY) in an effort to recreate GODIVA I results with greater accuracy than those performed at Los Alamos National Laboratory in the 1950s. The purpose of the Oak Ridge ORALLOY Sphere (ORSphere) experiments was to estimate the unreflected and unmoderated critical mass of an idealized sphere of uranium metal corrected to a density, purity, and enrichment such that it could be compared with the GODIVA I experiments. Additionally, various material reactivity worths, the surface material worth coefficient, the delayed neutron fraction, the prompt neutron decay constant, relative fission density, and relative neutron importance were all measured. The critical assembly, material reactivity worths, the surface material worth coefficient, and the delayed neutron fraction were all evaluated as benchmark experiment measurements. The reactor physics measurements are the focus of this paper, although for clarity the critical assembly benchmark specifications are briefly discussed.
Contributions to Integral Nuclear Data in ICSBEP and IRPhEP since ND 2013
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bess, John D.; Briggs, J. Blair; Gulliford, Jim
2016-09-01
The status of the International Criticality Safety Benchmark Evaluation Project (ICSBEP) and the International Reactor Physics Experiment Evaluation Project (IRPhEP) was last discussed directly with the international nuclear data community at ND2013. Since ND2013, the integral benchmark data available for nuclear data testing have continued to increase. The status of the international benchmark efforts and the latest contributions to integral nuclear data for testing are discussed. Select benchmark configurations that have been added to the ICSBEP and IRPhEP Handbooks since ND2013 are highlighted. The 2015 edition of the ICSBEP Handbook now contains 567 evaluations with benchmark specifications for 4,874 critical, near-critical, or subcritical configurations, 31 criticality alarm placement/shielding configurations with multiple dose points apiece, and 207 configurations that have been categorized as fundamental physics measurements that are relevant to criticality safety applications. The 2015 edition of the IRPhEP Handbook contains data from 143 different experimental series that were performed at 50 different nuclear facilities. Currently 139 of the 143 evaluations are published as approved benchmarks with the remaining four evaluations published in draft format only. Measurements found in the IRPhEP Handbook include criticality, buckling and extrapolation length, spectral characteristics, reactivity effects, reactivity coefficients, kinetics, reaction-rate distributions, power distributions, isotopic compositions, and/or other miscellaneous types of measurements for various types of reactor systems. Annual technical review meetings for both projects were held in April 2016; additional approved benchmark evaluations will be included in the 2016 editions of these handbooks.
Benchmarking routine psychological services: a discussion of challenges and methods.
Delgadillo, Jaime; McMillan, Dean; Leach, Chris; Lucock, Mike; Gilbody, Simon; Wood, Nick
2014-01-01
Policy developments in recent years have led to important changes in the level of access to evidence-based psychological treatments. Several methods have been used to investigate the effectiveness of these treatments in routine care, with different approaches to outcome definition and data analysis. To present a review of challenges and methods for the evaluation of evidence-based treatments delivered in routine mental healthcare. This is followed by a case example of a benchmarking method applied in primary care. High, average and poor performance benchmarks were calculated through a meta-analysis of published data from services working under the Improving Access to Psychological Therapies (IAPT) Programme in England. Pre-post treatment effect sizes (ES) and confidence intervals were estimated to illustrate a benchmarking method enabling services to evaluate routine clinical outcomes. High, average and poor performance ES for routine IAPT services were estimated to be 0.91, 0.73 and 0.46 for depression (using PHQ-9) and 1.02, 0.78 and 0.52 for anxiety (using GAD-7). Data from one specific IAPT service exemplify how to evaluate and contextualize routine clinical performance against these benchmarks. The main contribution of this report is to summarize key recommendations for the selection of an adequate set of psychometric measures, the operational definition of outcomes, and the statistical evaluation of clinical performance. A benchmarking method is also presented, which may enable a robust evaluation of clinical performance against national benchmarks. Some limitations concerned significant heterogeneity among data sources, and wide variations in ES and data completeness.
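A minimal sketch of the benchmarking computation described above, assuming the common uncontrolled pre-post effect size definition ES = (mean_pre - mean_post) / SD_pre and using invented toy scores (the benchmark cutoffs are the depression values reported in the abstract):

import statistics

def pre_post_es(pre, post):
    # Uncontrolled pre-post effect size, standardized by the intake SD.
    return (statistics.mean(pre) - statistics.mean(post)) / statistics.stdev(pre)

phq9_pre = [18, 15, 22, 12, 17, 20, 14, 16]    # toy PHQ-9 scores at intake
phq9_post = [9, 11, 14, 6, 8, 12, 7, 10]       # toy scores after treatment
es = pre_post_es(phq9_pre, phq9_post)
benchmarks = {"high": 0.91, "average": 0.73, "poor": 0.46}
print(f"ES = {es:.2f}", {level: es >= cutoff for level, cutoff in benchmarks.items()})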
Medicare Part D Roulette: Potential Implications of Random Assignment and Plan Restrictions
Patel, Rajul A.; Walberg, Mark P.; Woelfel, Joseph A.; Amaral, Michelle M.; Varu, Paresh
2013-01-01
Background: Dual-eligible (Medicare/Medicaid) beneficiaries are randomly assigned to a benchmark plan, which provides prescription drug coverage under the Part D benefit without consideration of their prescription drug profile. To date, the potential for beneficiary assignment to a plan with poor formulary coverage has been minimally studied and the resultant financial impact to beneficiaries unknown. Objective: We sought to determine cost variability and drug use restrictions under each available 2010 California benchmark plan. Methods: Dual-eligible beneficiaries were provided Part D plan assistance during the 2010 annual election period. The Medicare Web site was used to determine benchmark plan costs and prescription utilization restrictions for each of the six California benchmark plans available for random assignment in 2010. A standardized survey was used to record all de-identified beneficiary demographic and plan specific data. For each low-income subsidy-recipient (n = 113), cost, rank, number of non-formulary medications, and prescription utilization restrictions were recorded for each available 2010 California benchmark plan. Formulary matching rates (percent of beneficiary's medications on plan formulary) were calculated for each benchmark plan. Results: Auto-assigned beneficiaries had only a 34% chance of being assigned to the lowest cost plan; the remainder faced potentially significant avoidable out-of-pocket costs. Wide variations between benchmark plans were observed for plan cost, formulary coverage, formulary matching rates, and prescription utilization restrictions. Conclusions: Beneficiaries had a 66% chance of being assigned to a sub-optimal plan; thereby, they faced significant avoidable out-of-pocket costs. Alternative methods of beneficiary assignment could decrease beneficiary and Medicare costs while also reducing medication non-compliance. PMID:24753963
14. VIEW OF METAL ROLLING OPERATION. THE METALS ARE BEING PREPARED TO BE ROLLED INTO SHEETS OF SPECIFIC THICKNESS. COMPONENT PARTS WERE FABRICATED FROM THE METAL SHEETS. (11/82) - Rocky Flats Plant, Uranium Rolling & Forming Operations, Southeast section of plant, southeast quadrant of intersection of Central Avenue & Eighth Street, Golden, Jefferson County, CO
Agricultural Science I. Supplementary Units. Instructor Information.
ERIC Educational Resources Information Center
Martin, Donna; And Others
These supplementary units are designed to help students with special needs learn and apply agricultural skills in the areas of animal breeding, animal nutrition, leadership, and power tools. Specific competencies are listed as study questions at the beginning of each of the 10 self-paced and self-contained units. Skill sheets, activity sheets, and…
Code of Federal Regulations, 2011 CFR
2011-07-01
... exist. (2) Agency-acquired motion picture films: Two projection prints in good condition or one projection print and one videotape. (3) Unedited footage, outtakes, and trims (the discards of film...: (1) Existing finding aids such as data sheets, shot lists, continuities, review sheets, catalogs...
Code of Federal Regulations, 2014 CFR
2014-07-01
... exist. (2) Agency-acquired motion picture films: Two projection prints in good condition or one projection print and one videotape. (3) Unedited footage, outtakes, and trims (the discards of film...: (1) Existing finding aids such as data sheets, shot lists, continuities, review sheets, catalogs...
Code of Federal Regulations, 2013 CFR
2013-07-01
... exist. (2) Agency-acquired motion picture films: Two projection prints in good condition or one projection print and one videotape. (3) Unedited footage, outtakes, and trims (the discards of film...: (1) Existing finding aids such as data sheets, shot lists, continuities, review sheets, catalogs...
Code of Federal Regulations, 2012 CFR
2012-07-01
... exist. (2) Agency-acquired motion picture films: Two projection prints in good condition or one projection print and one videotape. (3) Unedited footage, outtakes, and trims (the discards of film...: (1) Existing finding aids such as data sheets, shot lists, continuities, review sheets, catalogs...
Code of Federal Regulations, 2010 CFR
2010-07-01
... exist. (2) Agency-acquired motion picture films: Two projection prints in good condition or one projection print and one videotape. (3) Unedited footage, outtakes, and trims (the discards of film...: (1) Existing finding aids such as data sheets, shot lists, continuities, review sheets, catalogs...
46 CFR 160.047-4 - Construction.
Code of Federal Regulations, 2010 CFR
2010-10-01
... and Child § 160.047-4 Construction. (a) General. This specification covers buoyant vests which... pattern shown on Dwg. No. 160.047-1, Sheet 1, for adult size, and Sheets 2 and 3 for child sizes, and sewed with seams and stitching as shown on the drawing. Three compartments shall be formed to hold the...
46 CFR 160.002-4 - Construction.
Code of Federal Regulations, 2010 CFR
2010-10-01
...: SPECIFICATIONS AND APPROVAL LIFESAVING EQUIPMENT Life Preservers, Kapok, Adult and Child (Jacket Type), Models 3...-49-6-1, Sheet 1, for adult size, and Dwg. F-49-6-5, Sheet 1, for child size, joined by seams and stitching as shown on the drawing. A drawstring tunnel shall be formed by stitching a strip of the tunnel...
Materials Safety Data Sheets: the basis for control of toxic chemicals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ketchen, E.E.; Porter, W.E.
1979-09-01
The Material Safety Data Sheets contained in this volume are the basis for the Toxic Chemical Control Program developed by the Industrial Hygiene Department, Health Division, ORNL. The three volumes are the update and expansion of ORNL/TM-5721 and ORNL/TM-5722, Material Safety Data Sheets: The Basis for Control of Toxic Chemicals, Volume I and Volume II. As such, they are a valuable adjunct to the data cards issued with specific chemicals. The chemicals are identified by name, stores catalog number where appropriate, and sequence numbers from the NIOSH Registry of Toxic Effects of Chemical Substances, 1977 Edition, if available. The data sheets were developed and compiled to aid in apprising employees of hazards peculiar to the handling and/or use of specific toxic chemicals. Space limitations necessitate the use of descriptive medical terms and toxicological abbreviations. A glossary and an abbreviation list were developed to define some of these sometimes unfamiliar terms and abbreviations. The page numbers are keyed to the catalog numbers in the chemical stores at ORNL.
NASA Astrophysics Data System (ADS)
Kahler, A. C.; MacFarlane, R. E.; Mosteller, R. D.; Kiedrowski, B. C.; Frankle, S. C.; Chadwick, M. B.; McKnight, R. D.; Lell, R. M.; Palmiotti, G.; Hiruta, H.; Herman, M.; Arcilla, R.; Mughabghab, S. F.; Sublet, J. C.; Trkov, A.; Trumbull, T. H.; Dunn, M.
2011-12-01
The ENDF/B-VII.1 library is the latest revision to the United States' Evaluated Nuclear Data File (ENDF). The ENDF library is currently in its seventh generation, with ENDF/B-VII.0 being released in 2006. This revision expands upon that library, including the addition of new evaluated files (was 393 neutron files previously, now 423 including replacement of elemental vanadium and zinc evaluations with isotopic evaluations) and extension or updating of many existing neutron data files. Complete details are provided in the companion paper [M. B. Chadwick et al., "ENDF/B-VII.1 Nuclear Data for Science and Technology: Cross Sections, Covariances, Fission Product Yields and Decay Data," Nuclear Data Sheets, 112, 2887 (2011)]. This paper focuses on how accurately application libraries may be expected to perform in criticality calculations with these data. Continuous energy cross section libraries, suitable for use with the MCNP Monte Carlo transport code, have been generated and applied to a suite of nearly one thousand critical benchmark assemblies defined in the International Criticality Safety Benchmark Evaluation Project's International Handbook of Evaluated Criticality Safety Benchmark Experiments. This suite covers uranium and plutonium fuel systems in a variety of forms such as metallic, oxide or solution, and under a variety of spectral conditions, including unmoderated (i.e., bare), metal reflected and water or other light element reflected. Assembly eigenvalues that were accurately predicted with ENDF/B-VII.0 cross sections such as unmoderated and uranium reflected 235U and 239Pu assemblies, HEU solution systems and LEU oxide lattice systems that mimic commercial PWR configurations continue to be accurately calculated with ENDF/B-VII.1 cross sections, and deficiencies in predicted eigenvalues for assemblies containing selected materials, including titanium, manganese, cadmium and tungsten are greatly reduced. Improvements are also confirmed for selected actinide reaction rates such as 236U, 238,242Pu and 241,243Am capture in fast systems. Other deficiencies, such as the overprediction of Pu solution system critical eigenvalues and a decreasing trend in calculated eigenvalue for 233U fueled systems as a function of Above-Thermal Fission Fraction remain. The comprehensive nature of this critical benchmark suite and the generally accurate calculated eigenvalues obtained with ENDF/B-VII.1 neutron cross sections support the conclusion that this is the most accurate general purpose ENDF/B cross section library yet released to the technical community.
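As a generic illustration of how suite-wide validation results like these are often summarized (this is not the paper's actual workflow or data), one can compute calculated-to-expected (C/E) eigenvalue ratios across the benchmark cases and report their mean and spread:

import statistics

def summarize_ce(keff_calc, keff_bench):
    # C/E ratios: calculated eigenvalue over the benchmark-model eigenvalue.
    ratios = [c / e for c, e in zip(keff_calc, keff_bench)]
    return statistics.mean(ratios), statistics.stdev(ratios)

calc = [0.9991, 1.0007, 1.0013, 0.9978, 1.0002]    # toy calculated eigenvalues
bench = [1.0000, 1.0000, 1.0000, 0.9998, 1.0005]   # toy benchmark eigenvalues
mean_ce, sd_ce = summarize_ce(calc, bench)
print(f"mean C/E = {mean_ce:.5f}, std dev = {sd_ce:.5f}")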
Reconnection and Bursty Bulk Flow Associated Turbulence in the Earth's Plasma Sheet
NASA Astrophysics Data System (ADS)
Voros, Z.; Nakamura, R.; Baumjohann, W.; Runov, A.; Volwerk, M.; Jankovicova, D.; Balogh, A.; Klecker, B.
2006-12-01
Reconnection-related fast flows in the Earth's plasma sheet can be associated with several accompanying phenomena, such as magnetic field dipolarization, current sheet thinning and turbulence. Statistical analysis of the multi-scale properties of turbulence helps in understanding the interaction of the plasma flow with the dipolar magnetic field and in recognizing the remote or nearby temporal and spatial characteristics of reconnection. The main emphasis of this presentation is on differentiating between the specific statistical features of flow-associated fluctuations at different distances from the reconnection site.
ERIC Educational Resources Information Center
Stremel, Kathleen
This document consists of three separately published fact sheets combined here because of the close relationship of their subject matter. The first fact sheet, "Communication Interactions: It Takes Two" (Kathleen Stremel), defines communication; suggests ways to find opportunities for interactive communication; offers specific suggestions for…
NASA Technical Reports Server (NTRS)
Rhodes, R. C.; Smith, E. I.
1972-01-01
Individual ash-flow sheets distributed over wide areas in the Mogollon-Datil volcanic province can be delineated and related by flow direction techniques to specific source cauldrons. Two major mid-Tertiary ash flows in the Mogollon Plateau have measurable microscopic directional fabric indicative of primary flow direction imprinted in the ash-flow sheets during late-stage laminar flow. Regional stratigraphic relationships and flow patterns of the ash-flow sheets indicate a late Tertiary origin of the Mogollon Plateau depression. They also show that Basin-Range faulting in southwestern New Mexico was not initiated until after emplacement of the younger ash flow (23 m.y. B.P.). Directional fabric is an inherent property of many calc-alkalic ash-flow sheets and measurement of preferred orientation provides a powerful tool in unravelling the geologic history of complex volcanic terrane.
NASA Technical Reports Server (NTRS)
Anderson, John B.
1991-01-01
Some of the questions to be addressed by SeaRISE include: (1) what was the configuration of the West Antarctic ice sheet during the last glacial maximum; (2) What is its configuration during a glacial minimum; and (3) has it, or any marine ice sheet, undergone episodic rapid mass wasting. These questions are addressed in terms of what is known about the history of the marine ice sheet, specifically in Ross Sea, and what further studies are required to resolve these problems. A second question concerns the extent to which disintegration of marine ice sheets may result in rises in sea level that are episodic in nature and extremely rapid, as suggested by several glaciologists. Evidence that rapid, episodic sea level changes have occurred during the Holocene is also reviewed.
Shamsir, Mohd S.; Dalby, Andrew R.
2007-01-01
Previous molecular dynamics simulations have reported elongation of the existing β-sheet in prion proteins. Detailed examination has shown that these elongations do not extend beyond the proline residues flanking these β-sheets. In addition, proline has also been suggested to possess a possible structural role in preserving protein interaction sites by preventing invasion of neighboring secondary structures. In this work, we have studied the possible structural role of the flanking proline residues by simulating mutant structures with alternate substitution of the proline residues with valine. Simulations showed a directional inhibition of elongation, with elongation progressing in the direction of the valine substitutions and evident inhibition by the remaining proline residues. This suggests that the flanking proline residues in prion proteins may have a containment role and would confine the β-sheet to a specific length. PMID:17172295
Campbell, Norm R C; Lackland, Daniel T; Lisheng, Liu; Niebylski, Mark L; Nilsson, Peter M; Zhang, Xin-Hua
2015-03-01
Increased blood pressure and high dietary salt are leading risks for death and disability globally. Reducing the burden of both health risks are United Nations' targets for reducing noncommunicable disease. Nongovernmental organizations and individuals can assist by ensuring widespread dissemination of the best available facts and recommended interventions for both health risks. Simple but impactful fact sheets can be useful for informing the public, healthcare professionals, and policy makers. The World Hypertension League has developed fact sheets on dietary salt and hypertension but in many circumstances the greatest impact would be obtained from national-level fact sheets. This manuscript provides instructions and a template for developing fact sheets based on the Global Burden of Disease study and national survey data. ©2015 Wiley Periodicals, Inc.
In Vitro Engineering of Vascularized Tissue Surrogates
Sakaguchi, Katsuhisa; Shimizu, Tatsuya; Horaguchi, Shigeto; Sekine, Hidekazu; Yamato, Masayuki; Umezu, Mitsuo; Okano, Teruo
2013-01-01
In vitro scaling up of bioengineered tissues is known to be limited by diffusion issues, specifically a lack of vasculature. Here, we report a new strategy for preserving cell viability in three-dimensional tissues using cell sheet technology and a perfusion bioreactor having collagen-based microchannels. When triple-layer cardiac cell sheets are incubated within this bioreactor, endothelial cells in the cell sheets migrate to vascularize in the collagen gel, and finally connect with the microchannels. Medium readily flows into the cell sheets through the microchannels and the newly developed capillaries, while the cardiac construct shows simultaneous beating. When additional triple-layer cell sheets are repeatedly layered, new multi-layer construct spontaneously integrates and the resulting construct becomes a vascularized thick tissue. These results confirmed our method to fabricate in vitro vascularized tissue surrogates that overcomes engineered-tissue thickness limitations. The surrogates promise new therapies for damaged organs as well as new in vitro tissue models. PMID:23419835
Tocilizumab: therapy and safety management.
Pham, Thao; Claudepierre, Pascal; Constantin, Arnaud; de Bandt, Michel; Fautrel, Bruno; Gossec, Laure; Gottenberg, Jacques-Eric; Goupille, Philippe; Guillaume, Séverine; Hachulla, Eric; Masson, Charles; Morel, Jacques; Puéchal, Xavier; Saraux, Alain; Schaeverbeke, Thierry; Wendling, Daniel; Bruckert, Eric; Pol, Stanislas; Mariette, Xavier; Sibilia, Jean
2010-06-01
To develop fact sheets about tocilizumab, in order to assist physicians in the management of patients with inflammatory joint disease. 1. selection by a committee of rheumatology experts of the main topics of interest for which fact sheets were desirable; 2. identification and review of publications relevant to each topic; 3. development of fact sheets based on three levels of evidence: evidence-based medicine, official recommendations, and expert opinion. The 20 experts were rheumatologists and invited specialists in other fields, and they had extensive experience with the management of RA. They were members of the CRI (Club Rhumatismes et Inflammation), a section of the Société Française de Rhumatologie. Each fact sheet was revised by several experts, and the overall process was coordinated by three experts. Several topics of major interest were selected: contraindications of tocilizumab; the management of adverse effects and concomitant diseases that may develop during tocilizumab therapy; and the management of everyday situations such as pregnancy, surgery, and immunizations. After a review of the literature and discussions among experts, a consensus was developed about the content of the fact sheets presented here. These fact sheets focus on several points: 1. in RA, initiation and monitoring of tocilizumab therapy, management of patients with specific past histories, and specific clinical situations such as pregnancy; 2. diseases other than RA, such as juvenile idiopathic arthritis; 3. model letters for informing the rheumatologist and general practitioner; 4. patient information. These tocilizumab fact sheets, built on evidence-based medicine and expert opinion, will serve as a practical tool for assisting physicians who manage patients on tocilizumab therapy. They will be available continuously at www.cri-net.com and updated at appropriate intervals. Copyright (c) 2010 Elsevier Masson SAS. All rights reserved.
A novel closed cell culture device for fabrication of corneal epithelial cell sheets.
Nakajima, Ryota; Kobayashi, Toyoshige; Moriya, Noboru; Mizutani, Manabu; Kan, Kazutoshi; Nozaki, Takayuki; Saitoh, Kazuo; Yamato, Masayuki; Okano, Teruo; Takeda, Shizu
2015-11-01
Automation technology for cell sheet-based tissue engineering would need to optimize the cell sheet fabrication process, stabilize cell sheet quality, and reduce biological contamination risks. Biological contamination must be avoided in clinical settings, and a closed culture system provides a solution for this. In the present study, we developed a closed culture device called a cell cartridge, to be used in a closed cell culture system for fabricating corneal epithelial cell sheets. Rabbit limbal epithelial cells were cultured on the surface of a porous membrane with 3T3 feeder cells, which were kept separate from the epithelial cells, in the cell cartridges and in cell-culture inserts as a control. To fabricate the stratified cell sheets, five different thicknesses of the gas-permeable membrane welded to the cell cartridge were examined. Multilayered corneal epithelial cell sheets were fabricated in cell cartridges welded to a 25 µm-thick gas-permeable membrane, similar to the results obtained with the cell-culture inserts. However, stratification of corneal epithelial cell sheets did not occur with cell cartridges welded to 100-300 µm-thick gas-permeable membranes. The fabricated cell sheets were evaluated by histological analyses to examine the expression of corneal epithelial-specific markers. Immunohistochemical analyses showed that a putative stem cell marker, p63, a corneal epithelial differentiation marker, CK3, and a barrier function marker, Claudin-1, were expressed in the appropriate positions in the cell sheets. These results suggest that the cell cartridge is effective for fabricating corneal epithelial cell sheets. Copyright © 2012 John Wiley & Sons, Ltd.
Interhemispheric ice-sheet synchronicity during the Last Glacial Maximum
NASA Astrophysics Data System (ADS)
Weber, M. E.; Clark, P. U.; Ricken, W.; Mitrovica, J. X.; Hostetler, S. W.; Kuhn, G.
2012-04-01
The timing of the last maximum extent of the Antarctic ice sheets relative to those in the Northern Hemisphere remains poorly understood because only a few findings with robust chronologies exist for Antarctic ice sheets. We developed a chronology for the Weddell Sea sector of the East Antarctic ice sheet that, combined with ages from other Antarctic ice-sheet sectors, indicates that the advance to maximum extent at 29-28 ka and the retreat from maximum extent at 19 ka were nearly synchronous with Northern Hemisphere ice sheets (Weber, M. E., Clark, P. U., Ricken, W., Mitrovica, J. X., Hostetler, S. W., and Kuhn, G. (2011): Interhemispheric ice-sheet synchronicity during the Last Glacial Maximum. Science, 334, 1265-1269, doi: 10.1126/science.1209299). As for the deglaciation, modeling studies suggest a late ice-sheet retreat starting around 14 ka BP and ending around 7 ka BP, with a large impact of an unstable West Antarctic Ice Sheet (WAIS) and a small impact of a stable East Antarctic Ice Sheet (EAIS). However, the Weddell Sea sites studied here, as well as sites from the Scotia Sea, provide evidence that the EAIS specifically responded much earlier, possibly provided a significant contribution to the last sea-level rise, and was much more dynamic than previously thought. Using the results of an atmospheric general circulation model, we conclude that surface climate forcing of Antarctic ice mass balance would likely cause an opposite response, whereby a warming climate would increase accumulation but not surface melting. Furthermore, our new data support teleconnections involving a sea-level fingerprint forced from Northern Hemisphere ice sheets, as indicated by gravitational modeling. Also, changes in North Atlantic Deep Water formation and the attendant heat flux to Antarctic grounding lines may have contributed to synchronizing the hemispheric ice sheets.
Wall, Jonathan S; Williams, Angela; Richey, Tina; Stuckey, Alan; Wooliver, Craig; Christopher Scott, J; Donnell, Robert; Martin, Emily B; Kennel, Stephen J
2017-10-01
The heparin-reactive, helical peptide p5 is an effective amyloid imaging agent in mice with systemic amyloidosis. Analogs of p5 with modified secondary structure characteristics exhibited altered binding to heparin, synthetic amyloid fibrils, and amyloid extracts in vitro. Herein, we further study the effects of peptide helicity and chirality on specific amyloid binding using a mouse model of systemic inflammation-associated (AA) amyloidosis. Peptides with disrupted helical structure [p5(coil) and p5(Pro3)], with an extended sheet conformation [p5(sheet)], or an all-D enantiomer [p5(D)] were chemically synthesized, radioiodinated, and their biodistribution studied in WT mice as well as transgenic animals with severe systemic AA amyloidosis. Peptide binding was assessed qualitatively by using small animal single-photon emission computed tomography/x-ray computed tomography imaging and microautoradiography, and quantitatively using tissue counting. Peptides with reduced helical propensity, p5(coil) and p5(Pro3), exhibited significantly reduced binding to AA amyloid-laden organs. In contrast, peptide p5(D) was retained by non-amyloid-related ligands in the liver and kidneys of both WT and AA mice, but it also bound AA amyloid in the spleen. The p5(sheet) peptide specifically bound AA amyloid in vivo and was not retained by healthy tissues in WT animals. Modification of amyloid-targeting peptides using D-amino acids should be performed cautiously due to the introduction of unexpected secondary pharmacologic effects. Peptides that adopt a helical structure, to align charged amino acid side chains along one face, exhibit specific reactivity with amyloid; however, polybasic peptides with a propensity for β-sheet conformation are also amyloid-reactive and may yield a novel class of amyloid-targeting agents for imaging and therapy.
ER sheet persistence is coupled to myosin 1c–regulated dynamic actin filament arrays
Joensuu, Merja; Belevich, Ilya; Rämö, Olli; Nevzorov, Ilya; Vihinen, Helena; Puhka, Maija; Witkos, Tomasz M.; Lowe, Martin; Vartiainen, Maria K.; Jokitalo, Eija
2014-01-01
The endoplasmic reticulum (ER) comprises a dynamic three-dimensional (3D) network with diverse structural and functional domains. Proper ER operation requires an intricate balance within and between dynamics, morphology, and functions, but how these processes are coupled in cells has been unclear. Using live-cell imaging and 3D electron microscopy, we identify a specific subset of actin filaments localizing to polygons defined by ER sheets and tubules, and we describe a role for these actin arrays in ER sheet persistence and, thereby, in maintenance of the characteristic network architecture: actin depolymerization leads to increased sheet fluctuation and transformations, and results in small and less abundant sheet remnants and a defective ER network distribution. Furthermore, we identify myosin 1c localizing to the ER-associated actin filament arrays and reveal a novel role for myosin 1c in regulating these actin structures, as myosin 1c manipulations lead to loss of the actin filaments and to an ER phenotype similar to that observed after actin depolymerization. We propose that ER-associated actin filaments have a role in regulating ER sheet persistence and thus support the maintenance of sheets as a stationary subdomain of the dynamic ER network. PMID:24523293
DOT National Transportation Integrated Search
2010-09-01
This report presents data and technical analyses for Texas Department of Transportation Project 0-5235. This project focused on the evaluation of traffic sign sheeting performance in terms of meeting nighttime driver needs. The goal was to de...
46 CFR 160.047-1 - Incorporation by reference.
Code of Federal Regulations, 2014 CFR
2014-10-01
... reference to the following documents: (1) Federal Specification: L-P-375C—Plastic Film, Flexible, Vinyl... part of this subpart: Dwg. No. 160.047-1: Sheet 1, Rev. 2—Cutting Pattern and General Arrangement, Models AK-1, and AF-1. Sheet 2, Rev. 2—Cutting Pattern and General Arrangement, Models CKM-1 and CFM-1...
46 CFR 160.047-1 - Incorporation by reference.
Code of Federal Regulations, 2013 CFR
2013-10-01
... reference to the following documents: (1) Federal Specification: L-P-375C—Plastic Film, Flexible, Vinyl... part of this subpart: Dwg. No. 160.047-1: Sheet 1, Rev. 2—Cutting Pattern and General Arrangement, Models AK-1, and AF-1. Sheet 2, Rev. 2—Cutting Pattern and General Arrangement, Models CKM-1 and CFM-1...
46 CFR 160.047-1 - Incorporation by reference.
Code of Federal Regulations, 2012 CFR
2012-10-01
... reference to the following documents: (1) Federal Specification: L-P-375C—Plastic Film, Flexible, Vinyl... part of this subpart: Dwg. No. 160.047-1: Sheet 1, Rev. 2—Cutting Pattern and General Arrangement, Models AK-1, and AF-1. Sheet 2, Rev. 2—Cutting Pattern and General Arrangement, Models CKM-1 and CFM-1...
46 CFR 160.047-1 - Incorporation by reference.
Code of Federal Regulations, 2011 CFR
2011-10-01
... reference to the following documents: (1) Federal Specification: L-P-375C—Plastic Film, Flexible, Vinyl... part of this subpart: Dwg. No. 160.047-1: Sheet 1, Rev. 2—Cutting Pattern and General Arrangement, Models AK-1, and AF-1. Sheet 2, Rev. 2—Cutting Pattern and General Arrangement, Models CKM-1 and CFM-1...
Gels as battery separators for soluble electrode cells
NASA Technical Reports Server (NTRS)
Sheibley, D. W.; Gahn, R. F. (Inventor)
1977-01-01
Gels are formed from silica powders and hydrochloric acid. The gels are then impregnated into a polymeric foam and the resultant sheet material is then used in applications where the transport of chloride ions is desired. Specifically disclosed is the utilization of the sheet in electrically rechargeable redox flow cells which find application in bulk power storage systems.
7. FLOOR PLAN, PLOT PLAN, ELEVATIONS, SHEET NO. 1-10-44/1 OF 11. (NOTE: BUILDINGS 821-823 WERE CONSTRUCTED FROM A SINGLE SET OF PLANS. SOME DRAWINGS ARE TYPICAL OF ALL 3 STRUCTURES, CERTAIN OTHER DRAWINGS IN THE SAME SET ARE BUILDING-SPECIFIC.) - Oakland Army Base, Storehouse Type, Ukraine & Maritime Streets, Oakland, Alameda County, CA
Sucrose Treated Carbon Nanotube and Graphene Yarns and Sheets
NASA Technical Reports Server (NTRS)
Sauti, Godfrey (Inventor); Kim, Jae-Woo (Inventor); Siochi, Emilie J. (Inventor); Wise, Kristopher E. (Inventor)
2017-01-01
Carbon nanotube or graphene yarns and woven sheets are consolidated through the formation of a carbon binder produced by the dehydration of sucrose. On a macro-scale, the resulting materials are lightweight and have a high specific modulus and/or strength. Sucrose is relatively inexpensive and readily available, making the process cost-effective.
School Preparation to the Terrorist Threat. SVRC Fact Sheet
ERIC Educational Resources Information Center
School Violence Resource Center, 2004
2004-01-01
This fact sheet provides a list of "lessons learned" to assist schools in better preparing for a crisis event. The list was compiled by the Centers for Disease Control and Prevention and the U.S. Department of Education specifically to assist schools in preparing for a terrorist attack. The lessons can help schools better identify appropriate…
Escobar, Gabriel J; Baker, Jennifer M; Turk, Benjamin J; Draper, David; Liu, Vincent; Kipnis, Patricia
2017-01-01
This article is not a traditional research report. It describes how conducting a specific set of benchmarking analyses led us to broader reflections on hospital benchmarking. We reexamined an issue that has received far less attention from researchers than in the past: how variations in the hospital admission threshold might affect hospital rankings. Considering this threshold made us reconsider what benchmarking is and what future benchmarking studies might be like. Although we recognize that some of our assertions are speculative, they are based on our reading of the literature and previous and ongoing data analyses being conducted in our research unit. We describe the benchmarking analyses that led to these reflections. The Centers for Medicare and Medicaid Services' Hospital Compare Web site includes data on fee-for-service Medicare beneficiaries but does not control for severity of illness, which requires physiologic data now available in most electronic medical records. To address this limitation, we compared hospital processes and outcomes among Kaiser Permanente Northern California's (KPNC) Medicare Advantage beneficiaries and non-KPNC California Medicare beneficiaries between 2009 and 2010. We assigned a simulated severity of illness measure to each record and explored the effect of having the additional information on outcomes. We found that if the admission severity of illness in non-KPNC hospitals increased, KPNC hospitals' mortality performance would appear worse; conversely, if admission severity at non-KPNC hospitals decreased, KPNC hospitals' performance would appear better. Future hospital benchmarking should consider the impact of variation in admission thresholds.
Xu, Yuxi; Shi, Gaoquan; Duan, Xiangfeng
2015-06-16
Graphene and its derivatives are versatile building blocks for bottom-up assembly of advanced functional materials. In particular, with exceptionally large specific surface area, excellent electrical conductivity, and superior chemical/electrochemical stability, graphene represents the ideal material for various electrochemical energy storage devices including supercapacitors. However, due to the strong π-π interaction between graphene sheets, the graphene flakes tend to restack to form graphite-like powders when they are processed into practical electrode materials, which can greatly reduce the specific surface area and lead to inefficient utilization of the graphene layers for electrochemical energy storage. The self-assembly of two-dimensional graphene sheets into three-dimensional (3D) framework structures can largely retain the unique properties of individual graphene sheets and has recently garnered intense interest for fundamental investigations and potential applications in diverse technologies. In this Account, we review the recent advances in preparing 3D graphene macrostructures and exploring them as a unique platform for supercapacitor applications. We first describe the synthetic strategies, in which reduction of a graphene oxide dispersion above a certain critical concentration can induce the reduced graphene oxide sheets to cross-link with each other via partial π-π stacking interactions to form a 3D interconnected porous macrostructure. Multiple reduction strategies, including hydrothermal/solvothermal reduction, chemical reduction, and electrochemical reduction, have been developed for the preparation of 3D graphene macrostructures. The versatile synthetic strategies allow for easy incorporation of heteroatoms, carbon nanomaterials, functional polymers, and inorganic nanostructures into the macrostructures to yield diverse composites with tailored structures and properties. We then summarize the applications of the 3D graphene macrostructures for high-performance supercapacitors. With a unique framework structure in which the graphene sheets are interlocked in 3D space to prevent their restacking, the graphene macrostructures feature very high specific surface areas, rapid electron and ion transport, and superior mechanical strength. They can thus be directly used as supercapacitor electrodes with excellent specific capacitances, rate capabilities, and cycling stabilities. We finally discuss the current challenges and future opportunities in this research field. By regarding the graphene as both a single-atom-thick carbon sheet and a conjugated macromolecule, our work opens a new avenue to bottom-up self-assembly of graphene macromolecule sheets into functional 3D graphene macrostructures with remarkable electrochemical performances. We hope that this Account will promote further efforts toward fundamental investigation of graphene self-assembly and the development of advanced 3D graphene materials for their real-world applications in electrochemical energy storage devices and beyond.
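The supercapacitor figures of merit cited in this Account (specific capacitance, rate capability) are conventionally extracted from galvanostatic discharge curves as C_sp = I·Δt/(m·ΔV). A minimal sketch of that calculation, with illustrative numbers rather than values from the reviewed work:

    # Typical evaluation of electrode specific capacitance from a
    # galvanostatic charge-discharge measurement. All numbers below are
    # hypothetical and only illustrate the formula C_sp = I*dt/(m*dV).

    def specific_capacitance(current_a, discharge_time_s, mass_g, voltage_window_v):
        """Return specific capacitance in F/g."""
        return current_a * discharge_time_s / (mass_g * voltage_window_v)

    # Example: a 2 mg electrode discharged at 2 mA over a 1 V window in 300 s
    print(specific_capacitance(0.002, 300.0, 0.002, 1.0))  # -> 300.0 F/g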
NAS Parallel Benchmark Results 11-96. 1.0
NASA Technical Reports Server (NTRS)
Bailey, David H.; Bailey, David; Chancellor, Marisa K. (Technical Monitor)
1997-01-01
The NAS Parallel Benchmarks have been developed at NASA Ames Research Center to study the performance of parallel supercomputers. The eight benchmark problems are specified in a "pencil and paper" fashion. In other words, the complete details of the problem to be solved are given in a technical document, and except for a few restrictions, benchmarkers are free to select the language constructs and implementation techniques best suited for a particular system. These results represent the best results that have been reported to us by the vendors for the specific systems listed. In this report, we present new NPB (Version 1.0) performance results for the following systems: DEC Alpha Server 8400 5/440, Fujitsu VPP Series (VX, VPP300, and VPP700), HP/Convex Exemplar SPP2000, IBM RS/6000 SP P2SC node (120 MHz), NEC SX-4/32, SGI/CRAY T3E, SGI Origin200, and SGI Origin2000. We also report High Performance Fortran (HPF) based NPB results for IBM SP2 Wide Nodes, HP/Convex Exemplar SPP2000, and SGI/CRAY T3D. These results have been submitted by Applied Parallel Research (APR) and Portland Group Inc. (PGI). We also present sustained performance per dollar for Class B LU, SP and BT benchmarks.
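The closing sustained-performance-per-dollar figures are plain ratios of measured rate to list price; a trivial sketch with made-up numbers (no vendor rates or prices are reproduced here):

    # Hypothetical illustration of the performance-per-dollar metric:
    # sustained Mflop/s on a benchmark divided by system list price.
    systems = {
        "system_a": (5200.0, 1_200_000),  # (sustained Mflop/s, price in US$)
        "system_b": (3900.0, 650_000),
    }
    for name, (mflops, price) in systems.items():
        print(f"{name}: {mflops / (price / 1e6):.0f} Mflop/s per $1M")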
Benchmarking Big Data Systems and the BigData Top100 List.
Baru, Chaitanya; Bhandarkar, Milind; Nambiar, Raghunath; Poess, Meikel; Rabl, Tilmann
2013-03-01
"Big data" has become a major force of innovation across enterprises of all sizes. New platforms with increasingly more features for managing big datasets are being announced almost on a weekly basis. Yet, there is currently a lack of any means of comparability among such platforms. While the performance of traditional database systems is well understood and measured by long-established institutions such as the Transaction Processing Performance Council (TCP), there is neither a clear definition of the performance of big data systems nor a generally agreed upon metric for comparing these systems. In this article, we describe a community-based effort for defining a big data benchmark. Over the past year, a Big Data Benchmarking Community has become established in order to fill this void. The effort focuses on defining an end-to-end application-layer benchmark for measuring the performance of big data applications, with the ability to easily adapt the benchmark specification to evolving challenges in the big data space. This article describes the efforts that have been undertaken thus far toward the definition of a BigData Top100 List. While highlighting the major technical as well as organizational challenges, through this article, we also solicit community input into this process.
Cereda, Carlo W; Christensen, Søren; Campbell, Bruce Cv; Mishra, Nishant K; Mlynash, Michael; Levi, Christopher; Straka, Matus; Wintermark, Max; Bammer, Roland; Albers, Gregory W; Parsons, Mark W; Lansberg, Maarten G
2016-10-01
Differences in research methodology have hampered the optimization of Computed Tomography Perfusion (CTP) for identification of the ischemic core. We aim to optimize CTP core identification using a novel benchmarking tool. The benchmarking tool consists of an imaging library and a statistical analysis algorithm to evaluate the performance of CTP. The tool was used to optimize and evaluate an in-house developed CTP software algorithm. Imaging data of 103 acute stroke patients were included in the benchmarking tool. Median time from stroke onset to CT was 185 min (IQR 180-238), and the median time between completion of CT and start of MRI was 36 min (IQR 25-79). Volumetric accuracy of the CTP-derived regions of interest was optimal at an rCBF threshold of <38%; at this threshold, the mean difference was 0.3 ml (SD 19.8 ml), the mean absolute difference was 14.3 (SD 13.7) ml, and CTP was 67% sensitive and 87% specific for identification of DWI-positive tissue voxels. The benchmarking tool can play an important role in optimizing CTP software, as it provides investigators with a novel method to directly compare the performance of alternative CTP software packages. © The Author(s) 2015.
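The statistical comparison described, a CTP-thresholded core mask scored against DWI-positive tissue voxel by voxel, reduces to confusion-matrix bookkeeping plus a volume difference; a minimal sketch under assumed conventions (the mask shapes, the 0.008 ml voxel volume, and the random test data are hypothetical):

    import numpy as np

    def evaluate_ctp(ctp_core_mask, dwi_mask, voxel_volume_ml):
        """Voxelwise agreement between a CTP-predicted core and DWI ground truth."""
        ctp = np.asarray(ctp_core_mask, bool)
        dwi = np.asarray(dwi_mask, bool)
        tp = np.sum(ctp & dwi)    # predicted core confirmed by DWI
        tn = np.sum(~ctp & ~dwi)
        fp = np.sum(ctp & ~dwi)
        fn = np.sum(~ctp & dwi)
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        volume_diff_ml = (ctp.sum() - dwi.sum()) * voxel_volume_ml
        return sensitivity, specificity, volume_diff_ml

    # Synthetic masks standing in for co-registered CTP and DWI lesion maps;
    # 0.008 ml corresponds to a hypothetical 2 x 2 x 2 mm voxel.
    rng = np.random.default_rng(0)
    ctp = rng.random((64, 64, 20)) < 0.10
    dwi = rng.random((64, 64, 20)) < 0.10
    print(evaluate_ctp(ctp, dwi, voxel_volume_ml=0.008))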
Xia, Jie; Jin, Hongwei; Liu, Zhenming; Zhang, Liangren; Wang, Xiang Simon
2014-05-27
Benchmarking data sets have become common in recent years for the purpose of virtual screening, though the main focus had been placed on the structure-based virtual screening (SBVS) approaches. Due to the lack of crystal structures, there is great need for unbiased benchmarking sets to evaluate various ligand-based virtual screening (LBVS) methods for important drug targets such as G protein-coupled receptors (GPCRs). To date these ready-to-apply data sets for LBVS are fairly limited, and the direct usage of benchmarking sets designed for SBVS could bring the biases to the evaluation of LBVS. Herein, we propose an unbiased method to build benchmarking sets for LBVS and validate it on a multitude of GPCRs targets. To be more specific, our methods can (1) ensure chemical diversity of ligands, (2) maintain the physicochemical similarity between ligands and decoys, (3) make the decoys dissimilar in chemical topology to all ligands to avoid false negatives, and (4) maximize spatial random distribution of ligands and decoys. We evaluated the quality of our Unbiased Ligand Set (ULS) and Unbiased Decoy Set (UDS) using three common LBVS approaches, with Leave-One-Out (LOO) Cross-Validation (CV) and a metric of average AUC of the ROC curves. Our method has greatly reduced the "artificial enrichment" and "analogue bias" of a published GPCRs benchmarking set, i.e., GPCR Ligand Library (GLL)/GPCR Decoy Database (GDD). In addition, we addressed an important issue about the ratio of decoys per ligand and found that for a range of 30 to 100 it does not affect the quality of the benchmarking set, so we kept the original ratio of 39 from the GLL/GDD.
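The evaluation protocol, Leave-One-Out cross-validation summarized by the average AUC of ROC curves over ligands versus decoys, rests on a ranking statistic that can be sketched compactly; the scores below are synthetic and stand in for any LBVS similarity method:

    import numpy as np

    def roc_auc(labels, scores):
        """AUC via the rank-sum (Mann-Whitney) formulation: the probability
        that a randomly chosen ligand is scored above a randomly chosen decoy."""
        labels = np.asarray(labels, bool)
        scores = np.asarray(scores, float)
        order = scores.argsort()
        ranks = np.empty(len(scores))
        ranks[order] = np.arange(1, len(scores) + 1)
        n_pos = labels.sum()
        n_neg = len(labels) - n_pos
        return (ranks[labels].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

    # Hypothetical screen: 5 ligands and, following the GLL/GDD ratio kept in
    # the paper, 39 decoys per ligand (scores drawn at random here).
    rng = np.random.default_rng(1)
    labels = np.r_[np.ones(5, bool), np.zeros(5 * 39, bool)]
    scores = np.r_[rng.normal(1.0, 1.0, 5), rng.normal(0.0, 1.0, 5 * 39)]
    print(f"AUC = {roc_auc(labels, scores):.2f}")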
Bu, Jiyoon; Kim, Young Jun; Kang, Yoon-Tae; Lee, Tae Hee; Kim, Jeongsuk; Cho, Young-Ho; Han, Sae-Won
2017-05-01
The metastasis of cancer is strongly associated with the spread of circulating tumor cells (CTCs). Based on microfluidic devices, which offer rapid recovery of CTCs, a number of studies have demonstrated the potential of CTCs as a diagnostic tool. However, not only the insufficient specificity and sensitivity arising from the rarity and heterogeneity of CTCs, but also the high-cost fabrication processes, limit the commercial use of CTC-based medical devices. Here, we present low-cost fabric sheet layers for CTC isolation, composed of polyester monofilament yarns. The fabric sheet layers are easily functionalized with graphene oxide (GO), which is beneficial for improving both sensitivity and specificity. The GO modification of the low-cost fabrics enhances the binding of anti-EpCAM antibodies, resulting in a 10-25% increase in capture efficiency compared to the surface without GO (anti-EpCAM antibodies bound directly onto the fabric sheets), while achieving high purity, with only 50-300 leukocytes isolated from 1 mL of human blood. We investigated CTCs in ten human blood samples and successfully isolated 4-42 CTCs/mL from cancer patients, while no cancerous cells were found among healthy donors. These remarkable results show the feasibility of GO-functionalized fabric sheet layers for use in various CTC-based clinical applications, with high sensitivity and selectivity. Copyright © 2017 Elsevier Ltd. All rights reserved.
Membrane Specifications for Multi-Configuration Membrane Distillation Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Villa, Daniel; Vanneste, Johan; Cath, Tzahi
The data include the membrane properties and specifications used for multi-configuration membrane distillation desalination. In this study, membranes from CLARCOR, 3M, and Aquastill are tested in counter-current, co-current, and air-gap configurations at the Colorado School of Mines (CSM) Advanced Water Technology Center (Aqwatech) laboratories. In the data sheets, the "Theoretical" worksheet contains steady-state values of the experimental runs and also provides several calculated values; the "Specifications" worksheet contains the inputs to the experiment; the "Data" worksheet contains the entire data set; and the remaining sheets ("20-40", "20-45", etc.) contain individual portions of the data at different feed temperatures.
Kamel Boulos, M N; Roudsari, A V; Gordon, C; Muir Gray, J A
2001-01-01
In 1998, the U.K. National Health Service Information for Health Strategy proposed the implementation of a National electronic Library for Health to provide clinicians, healthcare managers and planners, patients and the public with easy, round the clock access to high quality, up-to-date electronic information on health and healthcare. The Virtual Branch Libraries are among the most important components of the National electronic Library for Health. They aim at creating online knowledge based communities, each concerned with some specific clinical and other health-related topics. This study is about the envisaged Dermatology Virtual Branch Libraries of the National electronic Library for Health. It aims at selecting suitable dermatology Web resources for inclusion in the forthcoming Virtual Branch Libraries after establishing preliminary quality benchmarking rules for this task. Psoriasis, being a common dermatological condition, has been chosen as a starting point. Because quality is a principal concern of the National electronic Library for Health, the study includes a review of the major quality benchmarking systems available today for assessing health-related Web sites. The methodology of developing a quality benchmarking system has been also reviewed. Aided by metasearch Web tools, candidate resources were hand-selected in light of the reviewed benchmarking systems and specific criteria set by the authors. Over 90 professional and patient-oriented Web resources on psoriasis and dermatology in general are suggested for inclusion in the forthcoming Dermatology Virtual Branch Libraries. The idea of an all-in knowledge-hallmarking instrument for the National electronic Library for Health is also proposed based on the reviewed quality benchmarking systems. Skilled, methodical, organized human reviewing, selection and filtering based on well-defined quality appraisal criteria seems likely to be the key ingredient in the envisaged National electronic Library for Health service. Furthermore, by promoting the application of agreed quality guidelines and codes of ethics by all health information providers and not just within the National electronic Library for Health, the overall quality of the Web will improve with time and the Web will ultimately become a reliable and integral part of the care space.
A proposed benchmark problem for cargo nuclear threat monitoring
NASA Astrophysics Data System (ADS)
Wesley Holmes, Thomas; Calderon, Adan; Peeples, Cody R.; Gardner, Robin P.
2011-10-01
There is currently a great deal of technical and political effort focused on reducing the risk of potential attacks on the United States involving radiological dispersal devices or nuclear weapons. This paper proposes a benchmark problem for gamma-ray and X-ray cargo monitoring, with results calculated using MCNP5 v1.51. The primary goal is to provide a benchmark problem that will allow researchers in this area to evaluate Monte Carlo models for both speed and accuracy in both forward and inverse calculational codes and approaches for nuclear security applications. A previous benchmark problem was developed by one of the authors (RPG) for two similar oil well logging problems (Gardner and Verghese, 1991, [1]). One of those benchmarks has recently been used by at least two researchers in the nuclear threat area to evaluate the speed and accuracy of Monte Carlo codes combined with variance reduction techniques. This apparent need has prompted us to design this benchmark problem specifically for the nuclear threat researcher. The benchmark consists of a conceptual design and preliminary calculational results using gamma-ray interactions in a system containing three thicknesses of three different shielding materials. A point source is placed inside the three materials: lead, aluminum, and plywood. The first two materials are in right circular cylindrical form while the third is a cube. The entire system rests on a sufficiently thick lead base so as to reduce undesired scattering events. The configuration is arranged such that as a gamma ray moves from the source outward, it first passes through the lead circular cylinder, then the aluminum circular cylinder, and finally the wooden cube before reaching the detector. A 2 in.×4 in.×16 in. box-style NaI(Tl) detector was placed 1 m from the point source located at the center, with the 4 in.×16 in. side facing the system. The two sources used in the benchmark are 137Cs and 235U.
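As a rough sanity check on any such transport benchmark, the uncollided component through the nested shields follows simple exponential attenuation, I = I0·exp(-Σ μ_i·t_i); a minimal sketch with approximate textbook attenuation coefficients near 662 keV (the thicknesses and coefficients are illustrative, not the benchmark's specified values):

    import math

    # Approximate linear attenuation coefficients (1/cm) near 662 keV (137Cs);
    # rough textbook values, not those used in the MCNP5 benchmark.
    mu = {"lead": 1.2, "aluminum": 0.20, "plywood": 0.04}

    # Hypothetical layer thicknesses (cm) for the nested lead/aluminum/wood shields.
    thickness = {"lead": 2.0, "aluminum": 5.0, "plywood": 10.0}

    transmission = math.exp(-sum(mu[m] * thickness[m] for m in mu))
    print(f"Uncollided transmission fraction: {transmission:.2e}")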
Al-Kuwaiti, Ahmed; Homa, Karen; Maruthamuthu, Thennarasu
2016-01-01
A performance improvement model was developed that focuses on the analysis and interpretation of performance indicator (PI) data using statistical process control and benchmarking. PIs are suitable for comparison with benchmarks only if the data fall within the statistically accepted limits, that is, show only random variation. Specifically, if there is no significant special-cause variation over a period of time, then the data are ready to be benchmarked. The proposed Define, Measure, Control, Internal Threshold, and Benchmark model is adapted from the Define, Measure, Analyze, Improve, Control (DMAIC) model. The model consists of the following five steps: Step 1. Define the process; Step 2. Monitor and measure the variation over the period of time; Step 3. Check the variation of the process; if stable (no significant variation), go to Step 4; otherwise, control variation with the help of an action plan; Step 4. Develop an internal threshold and compare the process with it; Step 5.1. Compare the process with an internal benchmark; and Step 5.2. Compare the process with an external benchmark. The steps are illustrated using health care-associated infection (HAI) data collected for 2013 and 2014 from the Infection Control Unit, King Fahd Hospital, University of Dammam, Saudi Arabia. Monitoring variation is an important strategy in understanding and learning about a process. In the example, HAI was monitored for variation in 2013, and the need for a more predictable process prompted controlling variation through an action plan. The action plan was successful, as noted by the shift in the 2014 data compared to the historical average; in addition, the variation was reduced. The model is subject to limitations: for example, it cannot be used without benchmarks, which need to be calculated the same way with similar patient populations, and it focuses only on the "Analyze" part of the DMAIC model.
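Step 3, checking that the indicator shows only common-cause variation before any benchmark comparison, is a control-chart decision; a minimal sketch using a crude 3-sigma rule on an individuals chart (the monthly HAI rates below are invented):

    import statistics

    def is_stable(rates, sigma_limit=3.0):
        """Crude individuals-chart check: flag special-cause variation when any
        point falls outside mean +/- sigma_limit * stdev. Real SPC practice
        estimates sigma from the moving range; this is a simplification."""
        mean = statistics.mean(rates)
        sd = statistics.stdev(rates)
        lcl, ucl = mean - sigma_limit * sd, mean + sigma_limit * sd
        outliers = [r for r in rates if not (lcl <= r <= ucl)]
        return len(outliers) == 0, (lcl, ucl), outliers

    # Hypothetical monthly HAI rates per 1000 device-days
    monthly_rates = [2.1, 1.8, 2.4, 2.0, 1.9, 2.2, 2.3, 1.7, 2.0, 2.1, 1.9, 2.2]
    stable, limits, outliers = is_stable(monthly_rates)
    print(stable, limits, outliers)  # stable -> ready to compare to a benchmark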
NASA Astrophysics Data System (ADS)
Demler, Eugen; Rodman, Dmytro; Rodman, Mykhailo; Gerstein, Gregory; Grydin, Olexandr; Briukhanov, Arkadiy A.; Klose, Christian; Nürnberger, Florian; Maier, Hans Jürgen
2018-02-01
The process of cyclic bending was investigated using thin sheets of the magnesium alloy AZ31 and α-titanium. These materials possess an hcp crystal lattice with different c/a ratios, and these ratios turned out to have a substantial influence on sheet deformation behavior. Even for small deformations (up to 2% strain), a large influence on the yield stress was present for both materials. In addition, cyclic bending contributes to the activation of prismatic slip, which is accompanied by twinning and detwinning. The changes in sheet anisotropy following cyclic bending were determined using texture measurements. Specifically, the AZ31 alloy sheets exhibited a considerable change in anisotropy of the mechanical properties with an increasing number of bending cycles; the anisotropy in the yield stress increases from 15% in the initial condition to 40% after three cycles. For the α-titanium sheet, the change in anisotropy was approx. 26% smaller. In general, the largest changes in properties occurred in the first bending cycle, and stabilization took place upon further cycling.
Optimization of CO2 laser cutting parameters on Austenitic type Stainless steel sheet
NASA Astrophysics Data System (ADS)
Parthiban, A.; Sathish, S.; Chandrasekaran, M.; Ravikumar, R.
2017-03-01
Thin AISI 316L stainless steel sheet is widely used in sheet metal processing industries for specific applications. CO2 laser cutting is one of the most popular processes for cutting sheet metal into different profiles. In the present work, cutting parameters such as laser power (2000-4000 W), cutting speed (3500-5500 mm/min), and assist gas pressure (0.7-0.9 MPa) were varied for cutting 2 mm thick AISI 316L stainless steel sheet. The experiments were conducted based on a Box-Behnken design. The aim of this work is to develop mathematical models of kerf width for straight and curved profiles through response surface methodology, and to compare the models for the two profile types. The quadratic models show the best agreement with the experimental data, and the shape of the profile plays a substantial role in minimizing kerf width. Finally, numerical optimization was used to identify the optimum laser cutting parameters for both straight and curved profile cuts.
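The quadratic response-surface model referred to, kerf width as a full second-order polynomial in power, speed, and pressure, can be fitted by ordinary least squares on the Box-Behnken design matrix; a sketch with placeholder coded runs and responses (not the paper's data):

    import numpy as np

    def quadratic_design_matrix(X):
        """Full second-order model in 3 factors:
        1, x1, x2, x3, x1^2, x2^2, x3^2, x1*x2, x1*x3, x2*x3."""
        x1, x2, x3 = X.T
        return np.column_stack([
            np.ones(len(X)), x1, x2, x3,
            x1**2, x2**2, x3**2,
            x1*x2, x1*x3, x2*x3,
        ])

    # Coded Box-Behnken runs for 3 factors (power, speed, pressure): 12 edge
    # points plus 3 center points. Kerf widths (mm) are invented placeholders.
    X = np.array([[-1,-1,0],[1,-1,0],[-1,1,0],[1,1,0],
                  [-1,0,-1],[1,0,-1],[-1,0,1],[1,0,1],
                  [0,-1,-1],[0,1,-1],[0,-1,1],[0,1,1],
                  [0,0,0],[0,0,0],[0,0,0]], float)
    y = np.array([0.42,0.51,0.38,0.46,0.44,0.53,0.40,0.48,
                  0.47,0.41,0.45,0.39,0.43,0.44,0.43])
    coef, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
    print(coef)  # kerf = b0 + sum(bi*xi) + sum(bii*xi^2) + sum(bij*xi*xj)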
Experimental Method for Characterizing Electrical Steel Sheets in the Normal Direction
Hihat, Nabil; Lecointe, Jean Philippe; Duchesne, Stephane; Napieralska, Ewa; Belgrand, Thierry
2010-01-01
This paper proposes an experimental method to characterise magnetic laminations in the direction normal to the sheet plane. The principle, which is based on a static excitation to avoid planar eddy currents, is explained and specific test benches are proposed. Measurements of the flux density are made with a sensor moving in and out of an air-gap. A simple analytical model is derived in order to determine the permeability in the normal direction. The experimental results for grain oriented steel sheets are presented and a comparison is provided with values obtained from literature. PMID:22163394
Samman, Samir; McCarthur, Jennifer O; Peat, Mary
2006-01-01
Benchmarking has been adopted by educational institutions as a potentially sensitive tool for improving learning and teaching. To date there has been limited application of benchmarking methodology in the Discipline of Nutritional Science. The aim of this survey was to define core elements and outstanding practice in Nutritional Science through collaborative benchmarking. Questionnaires that aimed to establish proposed core elements for Nutritional Science, and inquired about definitions of "good" and "outstanding" practice, were posted to named representatives at eight Australian universities. Seven respondents identified core elements that included knowledge of nutrient metabolism and requirement, food production and processing, modern biomedical techniques that could be applied to understanding nutrition, and social and environmental issues as related to Nutritional Science. Four of the eight institutions that agreed to participate in the present survey identified the integration of teaching with research as an indicator of outstanding practice. Nutritional Science is a rapidly evolving discipline. Further and more comprehensive surveys are required to consolidate and update the definition of the discipline, and to identify the optimal way of teaching it. Global ideas and specific regional requirements also need to be considered.
Sczyrba, Alexander; Hofmann, Peter; Belmann, Peter; Koslicki, David; Janssen, Stefan; Dröge, Johannes; Gregor, Ivan; Majda, Stephan; Fiedler, Jessika; Dahms, Eik; Bremges, Andreas; Fritz, Adrian; Garrido-Oter, Ruben; Jørgensen, Tue Sparholt; Shapiro, Nicole; Blood, Philip D.; Gurevich, Alexey; Bai, Yang; Turaev, Dmitrij; DeMaere, Matthew Z.; Chikhi, Rayan; Nagarajan, Niranjan; Quince, Christopher; Meyer, Fernando; Balvočiūtė, Monika; Hansen, Lars Hestbjerg; Sørensen, Søren J.; Chia, Burton K. H.; Denis, Bertrand; Froula, Jeff L.; Wang, Zhong; Egan, Robert; Kang, Dongwan Don; Cook, Jeffrey J.; Deltel, Charles; Beckstette, Michael; Lemaitre, Claire; Peterlongo, Pierre; Rizk, Guillaume; Lavenier, Dominique; Wu, Yu-Wei; Singer, Steven W.; Jain, Chirag; Strous, Marc; Klingenberg, Heiner; Meinicke, Peter; Barton, Michael; Lingner, Thomas; Lin, Hsin-Hung; Liao, Yu-Chieh; Silva, Genivaldo Gueiros Z.; Cuevas, Daniel A.; Edwards, Robert A.; Saha, Surya; Piro, Vitor C.; Renard, Bernhard Y.; Pop, Mihai; Klenk, Hans-Peter; Göker, Markus; Kyrpides, Nikos C.; Woyke, Tanja; Vorholt, Julia A.; Schulze-Lefert, Paul; Rubin, Edward M.; Darling, Aaron E.; Rattei, Thomas; McHardy, Alice C.
2018-01-01
In metagenome analysis, computational methods for assembly, taxonomic profiling and binning are key components facilitating downstream biological data interpretation. However, a lack of consensus about benchmarking datasets and evaluation metrics complicates proper performance assessment. The Critical Assessment of Metagenome Interpretation (CAMI) challenge has engaged the global developer community to benchmark their programs on datasets of unprecedented complexity and realism. Benchmark metagenomes were generated from ~700 newly sequenced microorganisms and ~600 novel viruses and plasmids, including genomes with varying degrees of relatedness to each other and to publicly available ones and representing common experimental setups. Across all datasets, assembly and genome binning programs performed well for species represented by individual genomes, while performance was substantially affected by the presence of related strains. Taxonomic profiling and binning programs were proficient at high taxonomic ranks, with a notable performance decrease below the family level. Parameter settings substantially impacted performances, underscoring the importance of program reproducibility. While highlighting current challenges in computational metagenomics, the CAMI results provide a roadmap for software selection to answer specific research questions. PMID:28967888
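The binning performance summarized above is conventionally scored as per-bin purity (precision) and completeness (recall) against the known genome assignments, usually weighted by contig length; a minimal sketch of that bookkeeping on invented toy data:

    from collections import Counter

    def bin_purity_completeness(bin_of, genome_of, lengths):
        """Length-weighted purity and completeness per predicted bin,
        crediting each bin to its majority genome."""
        genome_size = Counter()
        for contig, genome in genome_of.items():
            genome_size[genome] += lengths[contig]
        results = {}
        for b in set(bin_of.values()):
            per_genome = Counter()
            for contig, assigned_bin in bin_of.items():
                if assigned_bin == b:
                    per_genome[genome_of[contig]] += lengths[contig]
            majority, majority_bp = per_genome.most_common(1)[0]
            results[b] = (majority_bp / sum(per_genome.values()),   # purity
                          majority_bp / genome_size[majority])      # completeness
        return results

    # Invented toy data: 5 contigs from 2 genomes across 2 predicted bins
    lengths   = {"c1": 100, "c2": 80, "c3": 120, "c4": 60, "c5": 40}
    genome_of = {"c1": "gA", "c2": "gA", "c3": "gB", "c4": "gB", "c5": "gA"}
    bin_of    = {"c1": "b1", "c2": "b1", "c3": "b2", "c4": "b2", "c5": "b2"}
    print(bin_purity_completeness(bin_of, genome_of, lengths))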
First benchmark of the Unstructured Grid Adaptation Working Group
NASA Technical Reports Server (NTRS)
Ibanez, Daniel; Barral, Nicolas; Krakos, Joshua; Loseille, Adrien; Michal, Todd; Park, Mike
2017-01-01
Unstructured grid adaptation is a technology that holds the potential to improve the automation and accuracy of computational fluid dynamics and other computational disciplines. Difficulty producing the highly anisotropic elements necessary to satisfy a resolution request on complex curved geometries has limited this technology's widespread adoption. The Unstructured Grid Adaptation Working Group is an open gathering of researchers working on adapting simplicial meshes to conform to a metric field. Current members span a wide range of institutions including academia, industry, and national laboratories. The purpose of this group is to create a common basis for understanding and improving mesh adaptation. We present our first major contribution: a common set of benchmark cases, including input meshes and analytic metric specifications, that are publicly available to be used for evaluating any mesh adaptation code. We also present the results of several existing codes on these benchmark cases, to illustrate their utility in identifying key challenges common to all codes and important differences between available codes. Future directions are defined to expand this benchmark to mature the technology necessary to impact practical simulation workflows.
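Conforming a simplicial mesh to a metric field means driving every edge toward unit length as measured in the metric, L_M(e) = sqrt(e^T M e); a minimal sketch of that core evaluation (the anisotropic metric below is an arbitrary example, not one of the published benchmark specifications):

    import numpy as np

    def metric_edge_length(p, q, M):
        """Length of edge pq in the Riemannian metric M (taken constant over
        the edge): L_M = sqrt(e^T M e). Adaptation drives this toward 1."""
        e = np.asarray(q, float) - np.asarray(p, float)
        return float(np.sqrt(e @ M @ e))

    # Arbitrary anisotropic 2D metric requesting spacing h_x = 0.1, h_y = 0.01
    # along the axes: M = diag(1/h_x^2, 1/h_y^2).
    M = np.diag([1 / 0.1**2, 1 / 0.01**2])
    print(metric_edge_length([0, 0], [0.1, 0.0], M))  # -> 1.0  (well-sized edge)
    print(metric_edge_length([0, 0], [0.0, 0.1], M))  # -> 10.0 (needs refinement)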
Personalized disease-specific protein corona influences the therapeutic impact of graphene oxide
NASA Astrophysics Data System (ADS)
Hajipour, Mohammad Javad; Raheb, Jamshid; Akhavan, Omid; Arjmand, Sareh; Mashinchian, Omid; Rahman, Masoud; Abdolahad, Mohammad; Serpooshan, Vahid; Laurent, Sophie; Mahmoudi, Morteza
2015-05-01
The hard corona, the protein shell that is strongly attached to the surface of nano-objects in biological fluids, is recognized as the first layer that interacts with biological objects (e.g., cells and tissues). The decoration of the hard corona (i.e., the type, amount, and conformation of the attached proteins) can define the biological fate of the nanomaterial. Recent developments have revealed that corona decoration strongly depends on the type of disease in human patients from which the plasma is obtained as a protein source for corona formation (referred to as the `personalized protein corona'). In this study, we demonstrate that graphene oxide (GO) sheets can trigger different biological responses in the presence of coronas obtained from various types of diseases. GO sheets were incubated with plasma from human subjects with different diseases/conditions, including hypofibrinogenemia, blood cancer, thalassemia major, thalassemia minor, rheumatism, fauvism, hypercholesterolemia, diabetes, and pregnancy. Identical sheets coated with varying protein corona decorations exhibited significantly different cellular toxicity, apoptosis, and uptake, reactive oxygen species production, lipid peroxidation and nitrogen oxide levels. The results of this report will help researchers design efficient and safe, patient-specific nano biomaterials in a disease type-specific manner for clinical and biological applications.
Formability of Annealed Ni-Ti Shape Memory Alloy Sheet
NASA Astrophysics Data System (ADS)
Fann, K. J.; Su, J. Y.; Chang, C. H.
2018-03-01
Ni-Ti shape memory alloy has two specific properties, superelasticity and the shape memory effect, and is thus widely applied in diverse industries. To extend its application, this study investigates the strength and cold formability of sheet blanks annealed at various temperatures, by hardness testing and by an Erichsen-like cupping test. As a result, the higher the annealing temperature, the lower the hardness, the lower the maximum punch load at which the sheet blank fractured, and the lower the Erichsen-like index, i.e., the lower the formability. In general, the Ni-Ti sheet after annealing has an Erichsen-like index between 8 mm and 9 mm. This study also confirmed via DSC that the Ni-Ti shape memory alloy is in the austenitic phase and shows superelasticity at room temperature.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-10
... Film, Sheet, and Strip (PET Film) From Taiwan: Extension of Time Limit for the Preliminary Results of... antidumping duty order on PET Film from Taiwan covering the period July 1, 2009, through June 30, 2010. See... preliminary results of the administrative review of PET Film from Taiwan within this time limit. Specifically...
Facts on Kids in South Dakota, 2000.
ERIC Educational Resources Information Center
Goebel, Pat, Ed.; Blad, Amy, Ed.
2000-01-01
This Kids Count report consists of four issues in a series of fact sheets that examine specific indicators of the well-being of children in South Dakota. Issue one focuses on teens and motor vehicle crashes. The fact sheet notes that teen death rates from car crashes have been higher than the national rate for 4 of the 5 years between 1992-1996.…
ERIC Educational Resources Information Center
Erisman, Wendy; Looney, Shannon
2008-01-01
This fact sheet presents a snapshot of important facts that are specific to the state of New York from the "Opening the Door to the American Dream: Increasing Higher Education Access and Success for Immigrants" report, which exposes systemic barriers that prevent immigrants from entering college and/or completing bachelor's degrees…
Steve Sutherland; Melanie Miller
2005-01-01
The Understory Response Model is a species-specific computer model that qualitatively predicts change in total species biomass for grasses, forbs, and shrubs after thinning, prescribed fire, or wildfire. The model examines the effect of fuels management on plant survivorship and reproduction. This fact sheet identifies the intended users and uses, required inputs, what...
7 CFR 1485.15 - Activity plan.
Code of Federal Regulations, 2010 CFR
2010-01-01
... participant shall develop a specific activity plan(s) based on its strategic plan and the allocation approval... any changes in strategy from the strategic plan; (iii) A budget for each proposed activity, identifying the source of funds; (iv) Specific goals and benchmarks to be used to measure the effectiveness of...
Definition study for an advanced cosmic ray experiment utilizing the long duration exposure facility
NASA Astrophysics Data System (ADS)
Price, P. B.
1982-06-01
To achieve the goals of cosmic ray astrophysics, an ultraheavy cosmic ray experiment on an LDEF reflight should be in an orbit with high inclination (approximately 57 deg) at approximately 230 nm for approximately 2 years near solar minimum (approximately 1986). It should fill 61 trays. Each tray should contain 4 modules of total active area 0.7 sq m, with a thermal blanket, thermal labyrinth mounts, aluminum honeycomb mechanical support, and total weight approximately 100 kg. Each module should contain interleaved CR39, Lexan, and thin copper sheets plus one event-thermometer canned in a thin metal canister sealed with approximately 0.2 atm dry O2. The CR39 and Lexan should be manufactured to specifications and the sheet copper rolled to specifications. The event-thermometer should be a stiffened CR39 sheet that slides via bimetal strips relative to a fixed CR39 sheet so that stack temperature can be read out for each event. The metal canister can be collapsed at launch and landing, capturing the sliding assembly to prevent damage. An engineering study should be made of a prototype LDEF tray; this will include thermal and mechanical tests of detectors and the event thermometer.
2012-08-01
This proceedings report presents the outcomes from an international workshop designed to establish consensus on: definitions for key performance indicators (KPIs) for oocyte and embryo cryopreservation, using either slow freezing or vitrification; minimum performance level values for each KPI, representing basic competency; and aspirational benchmark values for each KPI, representing best practice goals. This report includes general presentations about current practice and factors for consideration in the development of KPIs. A total of 14 KPIs were recommended and benchmarks for each are presented. No recommendations were made regarding specific cryopreservation techniques or devices, or whether vitrification is 'better' than slow freezing, or vice versa, for any particular stage or application, as this was considered to be outside the scope of this workshop. Copyright © 2012 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.
Benchmark Problems for Space Mission Formation Flying
NASA Technical Reports Server (NTRS)
Carpenter, J. Russell; Leitner, Jesse A.; Folta, David C.; Burns, Richard
2003-01-01
To provide a high-level focus to distributed space system flight dynamics and control research, several benchmark problems are suggested for space mission formation flying. The problems cover formation flying in low-altitude, near-circular Earth orbit; high-altitude, highly elliptical Earth orbits; and large-amplitude Lissajous trajectories about co-linear libration points of the Sun-Earth/Moon system. These problems are not specific to any current or proposed mission, but instead are intended to capture high-level features that would be generic to many similar missions that are of interest to various agencies.
Automatic Thread-Level Parallelization in the Chombo AMR Library
DOE Office of Scientific and Technical Information (OSTI.GOV)
Christen, Matthias; Keen, Noel; Ligocki, Terry
2011-05-26
The increasing on-chip parallelism has substantial implications for HPC applications. Currently, hybrid programming models (typically MPI+OpenMP) are employed to map software to the hardware and leverage its architectural features. In this paper, we present an approach that automatically introduces thread-level parallelism into Chombo, a parallel adaptive mesh refinement framework for finite-difference-type PDE solvers. In Chombo, core algorithms are specified in ChomboFortran, a macro language extension to F77 that is part of the Chombo framework. This domain-specific language forms an already-used target language for automatic migration of the large number of existing algorithms into a hybrid MPI+OpenMP implementation. It also provides access to an auto-tuning methodology that enables tuning certain aspects of an algorithm to hardware characteristics. Performance measurements are presented for a few of the most relevant kernels of a specific application benchmark using this technique, as well as benchmark results for the entire application. The kernel benchmarks show that, using auto-tuning, up to a factor of 11 in performance was gained with 4 threads relative to the serial reference implementation.
A web-based system architecture for ontology-based data integration in the domain of IT benchmarking
NASA Astrophysics Data System (ADS)
Pfaff, Matthias; Krcmar, Helmut
2018-03-01
In the domain of IT benchmarking (ITBM), a variety of data and information are collected. Although these data serve as the basis for business analyses, no unified semantic representation of such data yet exists. Consequently, data analysis across different distributed data sets and different benchmarks is almost impossible. This paper presents a system architecture and prototypical implementation for an integrated data management of distributed databases based on a domain-specific ontology. To preserve the semantic meaning of the data, the ITBM ontology is linked to data sources and functions as the central concept for database access. Thus, additional databases can be integrated by linking them to this domain-specific ontology and are directly available for further business analyses. Moreover, the web-based system supports the process of mapping ontology concepts to external databases by introducing a semi-automatic mapping recommender and by visualizing possible mapping candidates. The system also provides a natural language interface to easily query linked databases. The expected result of this ontology-based approach of knowledge representation and data access is an increase in knowledge and data sharing in this domain, which will enhance existing business analysis methods.
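As a rough illustration of the concept-to-column linking this architecture describes, the sketch below shows one minimal way such a mapping layer could look; the concept names, tables, and functions are hypothetical, not the authors' actual schema or API.

```python
# Hypothetical sketch of an ontology-concept-to-database mapping layer,
# in the spirit of the ITBM architecture; all names are illustrative.
from dataclasses import dataclass

@dataclass
class Mapping:
    concept: str   # ontology concept, e.g. "itbm:ServerCost"
    database: str  # linked data source
    table: str
    column: str

MAPPINGS = [
    Mapping("itbm:ServerCost", "benchmark_2016", "costs", "server_eur"),
    Mapping("itbm:StaffCount", "benchmark_2016", "org", "fte_total"),
]

def columns_for(concept: str):
    """Resolve an ontology concept to the physical columns behind it."""
    return [(m.database, m.table, m.column) for m in MAPPINGS if m.concept == concept]

print(columns_for("itbm:ServerCost"))
```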
NASA Astrophysics Data System (ADS)
Grilo, Tiago J.; Vladimirov, Ivaylo N.; Valente, Robertt A. F.; Reese, Stefanie
2016-06-01
In the present paper, a finite strain model for complex combined isotropic-kinematic hardening is presented. It accounts for finite elastic and finite plastic strains and is suitable for any anisotropic yield criterion. In order to model complex cyclic hardening phenomena, the kinematic hardening is described by several back stress components. To that end, a new procedure is proposed in which several multiplicative decompositions of the plastic part of the deformation gradient are considered. The formulation incorporates a completely general format of the yield function, which means that any yield function can be employed by following a procedure that ensures the principle of material frame indifference. The constitutive equations are derived in a thermodynamically consistent way and numerically integrated by means of a backward-Euler algorithm based on the exponential map. The performance of the constitutive model is assessed via numerical simulations of industry-relevant sheet metal forming processes (U-channel forming and draw/re-draw of a panel benchmarks), the results of which are compared to experimental data. The comparison between numerical and experimental results shows that the use of multiple back stress components is very advantageous in the description of springback. This holds in particular if one carries out a comparison with the results of using only one component. Moreover, the numerically obtained results are in excellent agreement with the experimental data.
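A hedged notation sketch of the kinematics summarized above, with symbols assumed rather than copied from the paper: the usual elastic-plastic split, one further multiplicative split of the plastic part per back stress component, and an additive total back stress.

```latex
\mathbf{F} = \mathbf{F}_e\,\mathbf{F}_p, \qquad
\mathbf{F}_p = \mathbf{F}_{pe}^{(i)}\,\mathbf{F}_{pi}^{(i)} \quad (i = 1,\dots,n), \qquad
\mathbf{X} = \sum_{i=1}^{n} \mathbf{X}_i
```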
The Effects of Grain Size and Texture on Dynamic Abnormal Grain Growth in Mo
NASA Astrophysics Data System (ADS)
Noell, Philip J.; Taleff, Eric M.
2016-10-01
This is the first report of abnormal grain morphologies specific to a Mo sheet material produced from a commercial-purity arc-melted ingot. Abnormal grains initiated and grew during plastic deformation of this material at temperatures of 1793 K and 1813 K (1520 °C and 1540 °C). This abnormal grain growth during high-temperature plastic deformation is termed dynamic abnormal grain growth, DAGG. DAGG in this material readily consumes nearly all grains near the sheet center while leaving many grains near the sheet surface unconsumed. Crystallographic texture, grain size, and other microstructural features are characterized. After recrystallization, a significant through-thickness variation in crystallographic texture exists in this material but does not appear to directly influence DAGG propagation. Instead, dynamic normal grain growth, which may be influenced by texture, preferentially occurs near the sheet surface prior to DAGG. The large grains thus produced near the sheet surface inhibit the subsequent growth of the abnormal grains produced by DAGG, which preferentially consume the finer grains near the sheet center. This produces abnormal grains that span the sheet center but leave unconsumed polycrystalline microstructure near the sheet surface. Abnormal grains are preferentially oriented with the ⟨110⟩ direction approximately along the tensile axis. These results provide additional new evidence that boundary curvature is the primary driving force for DAGG in Mo.
Bailey, Lucas; Sun, Jing; Courtney, Mark; Murphy, Paul
2015-05-01
To evaluate paediatric post-tonsillectomy pain management using oxycodone when a specific analgesia information sheet is included with standard postoperative information. Oxycodone information sheets were randomly allocated to half of the study children's post-tonsillectomy information packs. The trial was double-blinded to the surgeon, anaesthetist, nursing and administrative staff. Parents and children completed the pain assessment on days 3, 5 and 7. On day 10 the parents completed a questionnaire. A postoperative analgesia information sheet provides higher satisfaction and knowledge for parents using oxycodone (p<0.001), and children have improved postoperative pain control, most significantly at day 5 (p<0.05). Parent assessment of the child's analgesia was superior with the oxycodone information sheet, most significantly at days 3 and 7 postoperatively (p<0.05). There is also a positive correlation between the parents' observed pain score and the children's self-reported pain score, albeit with a low correlation coefficient (p<0.001). Information sheets are useful in the education and use of postoperative analgesia. The primary objective, to explore the efficacy of the information sheet, has proved successful in this setting. Given the risks of opioid analgesia, it is recommended that postoperative information sheets be given to all parents, to provide improved analgesia control and safe management of children in the postoperative period. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.
2017-06-01
To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
NASA Astrophysics Data System (ADS)
Hu, Liangbing; Wu, Hui; Cui, Yi
2010-05-01
We report carbon nanotube thin film-based supercapacitors fabricated with printing methods, where electrodes and separators are integrated into single sheets of commercial paper. Carbon nanotube films are easily printed with Meyer rod coating or ink-jet printing onto a paper substrate due to the excellent ink absorption of paper. A specific capacitance of 33 F/g at a high specific power of 250,000 W/kg is achieved with an organic electrolyte. Such a lightweight paper-based supercapacitor could be used to power paper electronics such as transistors or displays.
46 CFR 160.049-1 - Incorporation by reference.
Code of Federal Regulations, 2011 CFR
2011-10-01
...: SPECIFICATIONS AND APPROVAL LIFESAVING EQUIPMENT Specification for a Buoyant Cushion Plastic Foam § 160.049-1... Guard specifications: 160.055—Life Preservers, Unicellular Plastic Foam, Adult and Child. 164.015—Plastic Foam, Unicellular, Buoyant, Sheet and Molded Shapes. (4) Military specifications. MIL-C-43006...
46 CFR 160.049-1 - Incorporation by reference.
Code of Federal Regulations, 2010 CFR
2010-10-01
...: SPECIFICATIONS AND APPROVAL LIFESAVING EQUIPMENT Specification for a Buoyant Cushion Plastic Foam § 160.049-1... Guard specifications: 160.055—Life Preservers, Unicellular Plastic Foam, Adult and Child. 164.015—Plastic Foam, Unicellular, Buoyant, Sheet and Molded Shapes. (4) Military specifications. MIL-C-43006...
46 CFR 164.015-1 - Applicable specifications and standards.
Code of Federal Regulations, 2011 CFR
2011-10-01
..., CONSTRUCTION, AND MATERIALS: SPECIFICATIONS AND APPROVAL MATERIALS Plastic Foam, Unicellular, Buoyant, Sheet... following specification and standard, of the issue in effect on the date the plastic foam material is... be kept on file by the plastic foam manufacturer with this subpart. (1) The Federal Specification and...
46 CFR 164.015-1 - Applicable specifications and standards.
Code of Federal Regulations, 2014 CFR
2014-10-01
..., CONSTRUCTION, AND MATERIALS: SPECIFICATIONS AND APPROVAL MATERIALS Plastic Foam, Unicellular, Buoyant, Sheet... following specification and standard, of the issue in effect on the date the plastic foam material is... be kept on file by the plastic foam manufacturer with this subpart. (1) The Federal Specification and...
NASA Astrophysics Data System (ADS)
Kruger, J. M.
2016-12-01
This study determines the rates of subsidence or uplift in coastal areas of SE Texas by comparing recent GNSS measurements to the original orthometric heights of previously installed National Geodetic Survey (NGS) benchmarks. Understanding subsidence rates in coastal areas of SE Texas is critical when determining its vulnerability to local sea level rise and flooding, as well as for accurate survey control. The study area includes major metropolitan and industrial areas as well as more rural areas at risk for flooding and hurricane surge. The resurveying methods used in this RTK GNSS study allow a large area to be covered relatively quickly with enough detail to determine subsidence rates that are averaged over several decades, and to identify at-risk regions that can be monitored more closely with permanent or campaign-style measurements. The most recent measurements were acquired using a Trimble R8 GNSS system on all NGS benchmarks found in the study area. Differential corrections were applied in real time using a VRS network of base stations. Nominal vertical accuracies were 1.5 to 3.0 cm for a 2 to 5 minute reading. Usually three readings were measured and averaged for the final result. A total of 340 benchmarks were used for vertical rate calculations. Original NGS elevations were subtracted from the new elevations and divided by the number of years between the two elevation measurements to determine the average subsidence or uplift rate of the benchmark. Besides inaccuracies in the NGS datasheet and re-measured elevations, another source of error is uncertainty in the year the NGS datasheet elevations were measured. Overall, vertical rates of change vary from -6 to -15 mm/yr subsidence in Port Arthur, Nederland, and other areas of Jefferson County, as well as in areas northwest of Beaumont, Texas. Other areas with subsidence rates between -10 and -4 mm/yr include parts of the Bolivar Peninsula in Galveston County, northeastern Chambers County, and the Mont Belvieu area. Surprisingly, areas of uplift, with rates as great as +5 mm/yr, were found in some parts of the study area, mostly around Liberty, Texas, western Chambers County, east-central Beaumont, and in the northern part of the study area near Jasper, Texas.
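The rate computation described above is simple enough to state exactly; the sketch below reproduces it, with made-up numbers rather than survey data.

```python
# Minimal sketch of the rate computation described above: subtract the
# original NGS datasheet elevation from the re-measured elevation and
# divide by the elapsed years. Values below are illustrative only.
def vertical_rate_mm_per_yr(h_orig_m, year_orig, h_new_m, year_new):
    """Average vertical motion in mm/yr; negative = subsidence."""
    return (h_new_m - h_orig_m) * 1000.0 / (year_new - year_orig)

# A benchmark that dropped 30 cm over 40 years subsides at -7.5 mm/yr,
# within the -6 to -15 mm/yr range reported for Jefferson County.
print(vertical_rate_mm_per_yr(4.30, 1975, 4.00, 2015))
```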
Two-dimensional infrared spectroscopy reveals the complex behaviour of an amyloid fibril inhibitor
NASA Astrophysics Data System (ADS)
Middleton, Chris T.; Marek, Peter; Cao, Ping; Chiu, Chi-Cheng; Singh, Sadanand; Woys, Ann Marie; de Pablo, Juan J.; Raleigh, Daniel P.; Zanni, Martin T.
2012-05-01
Amyloid formation has been implicated in the pathology of over 20 human diseases, but the rational design of amyloid inhibitors is hampered by a lack of structural information about amyloid-inhibitor complexes. We use isotope labelling and two-dimensional infrared spectroscopy to obtain a residue-specific structure for the complex of human amylin (the peptide responsible for islet amyloid formation in type 2 diabetes) with a known inhibitor (rat amylin). Based on its sequence, rat amylin should block formation of the C-terminal β-sheet, but at 8 h after mixing, rat amylin blocks the N-terminal β-sheet instead. At 24 h after mixing, rat amylin blocks neither β-sheet and forms its own β-sheet, most probably on the outside of the human fibrils. This is striking, because rat amylin is natively disordered and not previously known to form amyloid β-sheets. The results show that even seemingly intuitive inhibitors may function by unforeseen and complex structural processes.
NASA Astrophysics Data System (ADS)
Wu, Zi Liang; Moshe, Michael; Greener, Jesse; Therien-Aubin, Heloise; Nie, Zhihong; Sharon, Eran; Kumacheva, Eugenia
2013-03-01
Although Nature has always been a common source of inspiration in the development of artificial materials, only recently has the ability of man-made materials to produce complex three-dimensional (3D) structures from two-dimensional sheets been explored. Here we present a new approach to the self-shaping of soft matter that mimics fibrous plant tissues by exploiting small-scale variations in the internal stresses to form three-dimensional morphologies. We design single-layer hydrogel sheets with chemically distinct, fibre-like regions that exhibit differential shrinkage and elastic moduli under the application of external stimulus. Using a planar-to-helical three-dimensional shape transformation as an example, we explore the relation between the internal architecture of the sheets and their transition to cylindrical and conical helices with specific structural characteristics. The ability to engineer multiple three-dimensional shape transformations determined by small-scale patterns in a hydrogel sheet represents a promising step in the development of programmable soft matter.
Lapão, Luís Velez
2015-01-01
The article by Catan et al. presents a benchmarking exercise comparing Israel and Portugal on the implementation of Information and Communication Technologies in the healthcare sector. Special attention was given to e-Health and m-Health. The authors collected information via a set of interviews with key stakeholders. They compared two different cultures and societies, which have reached slightly different implementation outcomes. Although the comparison is very enlightening, it is also challenging. Benchmarking exercises present a set of challenges, such as the choice of methodologies and the assessment of the impact on organizational strategy. Precise benchmarking methodology is a valid tool for eliciting information about alternatives for improving health systems. However, many beneficial interventions, which benchmark as effective, fail to translate into meaningful healthcare outcomes across contexts. There is a relationship between results and the innovation and competitive environments. Differences in healthcare governance and financing models are well known; but little is known about their impact on Information and Communication Technology implementation. The article by Catan et al. provides interesting clues about this issue. Public systems (such as those of Portugal, UK, Sweden, Spain, etc.) present specific advantages and disadvantages concerning Information and Communication Technology development and implementation. Meanwhile, private systems based fundamentally on insurance packages (such as those of Israel, Germany, the Netherlands or the USA) present a different set of advantages and disadvantages - especially a more open context for innovation. Challenging issues from both the Portuguese and Israeli cases will be addressed. Clearly, more research is needed on both benchmarking methodologies and on ICT implementation strategies.
Vaccari, M; Foladori, P; Nembrini, S; Vitali, F
2018-05-01
One of the largest surveys in Europe of energy consumption in Italian wastewater treatment plants (WWTPs) is presented, based on 241 WWTPs and a total population equivalent (PE) of more than 9,000,000 PE. The study contributes towards standardised, resilient data and benchmarking, and identifies potentials for energy savings. In the energy benchmark, three indicators were used: specific energy consumption expressed per population equivalent (kWh PE⁻¹ year⁻¹), per cubic meter (kWh/m³), and per unit of chemical oxygen demand (COD) removed (kWh/kgCOD). The indicator kWh/m³, even though widely applied, resulted in a biased benchmark, because it is highly influenced by stormwater and infiltrations. Plants with combined networks (often used in Europe) showed an apparently better energy performance. Conversely, the indicator kWh PE⁻¹ year⁻¹ resulted in a more meaningful definition of a benchmark. High energy efficiency was associated with: (i) large plant capacity, (ii) higher COD concentration in wastewater, (iii) separate sewer systems, (iv) capacity utilisation over 80%, and (v) high organic loads, but without overloading. The 25th percentile was proposed as a benchmark for four size classes: 23 kWh PE⁻¹ year⁻¹ for large plants (>100,000 PE); 42 kWh PE⁻¹ year⁻¹ for capacity 10,000-100,000 PE; 48 kWh PE⁻¹ year⁻¹ for capacity 2,000-10,000 PE; and 76 kWh PE⁻¹ year⁻¹ for small plants (<2,000 PE).
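The three indicators and the percentile-based benchmark lend themselves to a compact sketch; the plant figures below are invented for illustration only.

```python
# Sketch of the three energy indicators and the 25th-percentile benchmark
# described above; the plant records are made up for illustration.
import numpy as np

def indicators(kwh_per_year, pe, m3_per_year, kg_cod_removed):
    return {
        "kWh/PE/yr": kwh_per_year / pe,
        "kWh/m3": kwh_per_year / m3_per_year,
        "kWh/kgCOD": kwh_per_year / kg_cod_removed,
    }

# Benchmark = 25th percentile of kWh/PE/yr within one size class.
class_kwh_pe_yr = np.array([31.0, 25.0, 44.0, 22.0, 38.0, 29.0])
print(np.percentile(class_kwh_pe_yr, 25))
```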
Escobar, Gabriel J; Baker, Jennifer M; Turk, Benjamin J; Draper, David; Liu, Vincent; Kipnis, Patricia
2017-01-01
Introduction This article is not a traditional research report. It describes how conducting a specific set of benchmarking analyses led us to broader reflections on hospital benchmarking. We reexamined an issue that has received far less attention from researchers than in the past: How variations in the hospital admission threshold might affect hospital rankings. Considering this threshold made us reconsider what benchmarking is and what future benchmarking studies might be like. Although we recognize that some of our assertions are speculative, they are based on our reading of the literature and previous and ongoing data analyses being conducted in our research unit. We describe the benchmarking analyses that led to these reflections. Objectives The Centers for Medicare and Medicaid Services’ Hospital Compare Web site includes data on fee-for-service Medicare beneficiaries but does not control for severity of illness, which requires physiologic data now available in most electronic medical records. To address this limitation, we compared hospital processes and outcomes among Kaiser Permanente Northern California’s (KPNC) Medicare Advantage beneficiaries and non-KPNC California Medicare beneficiaries between 2009 and 2010. Methods We assigned a simulated severity of illness measure to each record and explored the effect of having the additional information on outcomes. Results We found that if the admission severity of illness in non-KPNC hospitals increased, KPNC hospitals’ mortality performance would appear worse; conversely, if admission severity at non-KPNC hospitals decreased, KPNC hospitals’ performance would appear better. Conclusion Future hospital benchmarking should consider the impact of variation in admission thresholds. PMID:29035176
NASA Astrophysics Data System (ADS)
Steen-Larsen, H. C.; Risi, C.; Werner, M.; Yoshimura, K.; Masson-Delmotte, V.
2017-01-01
The skills of isotope-enabled general circulation models are evaluated against atmospheric water vapor isotopes. We have combined in situ observations of surface water vapor isotopes spanning multiple field seasons (2010, 2011, and 2012) from the top of the Greenland Ice Sheet (NEEM site: 77.45°N, 51.05°W, 2484 m above sea level) with observations from the marine boundary layer of the North Atlantic and Arctic Ocean (Bermuda Islands 32.26°N, 64.88°W, year: 2012; south coast of Iceland 63.83°N, 21.47°W, year: 2012; South Greenland 61.21°N, 47.17°W, year: 2012; Svalbard 78.92°N, 11.92°E, year: 2014). This allows us to benchmark the ability to simulate the daily water vapor isotope variations from five different simulations using isotope-enabled general circulation models. Our model-data comparison documents clear isotope biases both on top of the Greenland Ice Sheet (1-11‰ for δ18O and 4-19‰ for d-excess depending on model and season) and in the marine boundary layer (maximum differences for the following: Bermuda δ18O = 1‰, d-excess = 3‰; South coast of Iceland δ18O = 2‰, d-excess = 5‰; South Greenland δ18O = 4‰, d-excess = 7‰; Svalbard δ18O = 2‰, d-excess = 7‰). We find that the simulated isotope biases are not just explained by simulated biases in temperature and humidity. Instead, we argue that these isotope biases are related to a poor simulation of the spatial structure of the marine boundary layer water vapor isotopic composition. Furthermore, we specifically show that the marine boundary layer water vapor isotopes of the Baffin Bay region show strong influence on the water vapor isotopes at the NEEM deep ice core-drilling site in northwest Greenland. Our evaluation of the simulations using isotope-enabled general circulation models also documents wide intermodel spatial variability in the Arctic. This stresses the importance of a coordinated water vapor isotope-monitoring network in order to discriminate amongst these model behaviors.
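For reference, the d-excess quoted above is the standard second-order isotope parameter (Dansgaard, 1964):

```latex
d\text{-excess} = \delta\mathrm{D} - 8\,\delta^{18}\mathrm{O}
```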
NASA Astrophysics Data System (ADS)
Wang, Cunjing; Wu, Dapeng; Wang, Hongju; Gao, Zhiyong; Xu, Fang; Jiang, Kai
2017-09-01
Highly porous carbon sheets were prepared from fresh clover stems under an air atmosphere via a facile potassium chloride salt-sealing technique, which not only avoids costly inert gas protection but also spontaneously introduces multi-level porosity into the carbon structure by taking advantage of the trace oxygen in the molten salt system. The as-obtained porous carbon sheets possess a high specific surface area of 2244 m² g⁻¹ and interconnected hierarchical pore structures from micro- to macro-scale, which provide abundant storage active sites and fast ion diffusion channels. In addition, the spontaneously formed N (2.55 at%) and O (6.94 at%) doping sites not only improve the electron conductivity of the electrode but also enhance the specific capacitance by introducing pseudocapacitance. When employed as supercapacitor electrodes, a high specific capacitance of 436 F g⁻¹ at 1 A g⁻¹ and excellent rate capability, with the capacitance remaining at 290 F g⁻¹ at 50 A g⁻¹, are demonstrated. Furthermore, the assembled symmetric supercapacitor delivers a high specific capacitance of 420 F g⁻¹ at 0.5 A g⁻¹, an excellent energy density of 58.4 Wh kg⁻¹ and good cycling stability, retaining 99.4% of the initial capacitance at 5 A g⁻¹ after 30,000 cycles.
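As a rough consistency check on the reported cell-level figures, one can apply the textbook relation E = (1/2)CV²; the abstract does not state the voltage window, so the value below is inferred, not reported.

```python
# Rough consistency check, assuming E = (1/2) C V^2 applies to the
# reported cell-level values; the voltage window is inferred here,
# not stated in the abstract.
C_cell = 420.0              # F/g, reported for the symmetric device
E = 58.4 * 3.6              # 58.4 Wh/kg -> J/g
V = (2.0 * E / C_cell) ** 0.5
print(round(V, 2))          # ~1.0 V, plausible for an aqueous electrolyte
```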
ORBDA: An openEHR benchmark dataset for performance assessment of electronic health record servers.
Teodoro, Douglas; Sundvall, Erik; João Junior, Mario; Ruch, Patrick; Miranda Freire, Sergio
2018-01-01
The openEHR specifications are designed to support implementation of flexible and interoperable Electronic Health Record (EHR) systems. Despite the increasing number of solutions based on the openEHR specifications, it is difficult to find publicly available healthcare datasets in the openEHR format that can be used to test, compare and validate different data persistence mechanisms for openEHR. To foster research on openEHR servers, we present the openEHR Benchmark Dataset, ORBDA, a very large healthcare benchmark dataset encoded using the openEHR formalism. To construct ORBDA, we extracted and cleaned a de-identified dataset from the Brazilian National Healthcare System (SUS) containing hospitalisation and high-complexity procedure information and formalised it using a set of openEHR archetypes and templates. Then, we implemented a tool to enrich the raw relational data and convert it into the openEHR model using the openEHR Java reference model library. The ORBDA dataset is available in composition, versioned composition and EHR openEHR representations in XML and JSON formats. In total, the dataset contains more than 150 million composition records. We describe the dataset and provide means to access it. Additionally, we demonstrate the usage of ORBDA for evaluating the insert throughput and query latency of several NoSQL database management systems. We believe that ORBDA is a valuable asset for assessing storage models for openEHR-based information systems during the software engineering process. It may also be a suitable component in future standardised benchmarking of available openEHR storage platforms.
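The two performance measurements named above can be sketched generically; `db` below is a stand-in client object, since the paper's actual harness and APIs are not reproduced here.

```python
# Generic sketch of the two measurements named above; `db` is a stand-in
# for any database client and not the paper's actual harness.
import time

def insert_throughput(db, records):
    """Records inserted per second."""
    t0 = time.perf_counter()
    for r in records:
        db.insert(r)
    return len(records) / (time.perf_counter() - t0)

def query_latency(db, query, runs=100):
    """Mean seconds per query over `runs` repetitions."""
    t0 = time.perf_counter()
    for _ in range(runs):
        db.execute(query)
    return (time.perf_counter() - t0) / runs
```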
Gao, Hui; Li, Bei; Zhao, Lingzhou; Jin, Yan
2015-01-01
Periodontal regeneration is an important part of regenerative medicine, with great clinical significance; however, the effects of nanotopography on the functions of periodontal ligament (PDL) stem cells (PDLSCs) and on PDLSC sheet based periodontal regeneration have never been explored. Titania nanotubes (NTs) layered on titanium (Ti) provide a good platform to study this. In the current study, the influence of NTs of different tube size on the functions of PDLSCs was observed. Afterward, an ectopic implantation model using a Ti/cell sheets/hydroxyapatite (HA) complex was applied to study the effect of the NTs on cell sheet based periodontal regeneration. The NTs were able to enhance the initial PDLSC adhesion and spread, as well as collagen secretion. With the Ti/cell sheets/HA complex model, it was demonstrated that the PDLSC sheets were capable of regenerating the PDL tissue, when combined with bone marrow mesenchymal stem cell (BMSC) sheets and HA, without the need for extra soluble chemical cues. Simultaneously, the NTs improved the periodontal regeneration result of the ectopically implanted Ti/cell sheets/HA complex, giving rise to functionally aligned collagen fiber bundles. Specifically, much denser collagen fibers, with abundant blood vessels as well as cementum-like tissue on the Ti surface, which well-resembled the structure of natural PDL, were observed in the NT5 and NT10 sample groups. Our study provides the first evidence that the nanotopographical cues obviously influence the functions of PDLSCs and improve the PDLSC sheet based periodontal regeneration size dependently, which provides new insight to the periodontal regeneration. The Ti/cell sheets/HA complex may constitute a good model to predict the effect of biomaterials on periodontal regeneration. PMID:26150714
A Business Case Analysis for the Vulture Program
2010-12-01
[Extraction residue: only citation fragments survive, quoting NASA Dryden fact sheets by Marty Curry ("The first flight of a solar-powered aircraft...", "Solar Power...", "Global Hawk - Performance & Specifications") and Lim, "Global Observer," on harnessing solar power for unmanned aerial vehicles.]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mkhabela, P.; Han, J.; Tyobeka, B.
2006-07-01
The Nuclear Energy Agency (NEA) of the Organization for Economic Cooperation and Development (OECD) has accepted, through the Nuclear Science Committee (NSC), the inclusion of the Pebble-Bed Modular Reactor 400 MW design (PBMR-400) coupled neutronics/thermal hydraulics transient benchmark problem as part of their official activities. The scope of the benchmark is to establish a well-defined problem, based on a common given library of cross sections, to compare methods and tools in core simulation and thermal hydraulics analysis with a specific focus on transient events through a set of multi-dimensional computational test problems. The benchmark includes three steady state exercises and six transient exercises. This paper describes the first two steady state exercises, their objectives and the international participation in terms of organization, country and computer code utilized. This description is followed by a comparison and analysis of the participants' results submitted for these two exercises. The comparison of results from different codes allows for an assessment of the sensitivity of a result to the method employed and can thus help to focus the development efforts on the most critical areas. The two first exercises also allow for removing of user-related modeling errors and prepare core neutronics and thermal-hydraulics models of the different codes for the rest of the exercises in the benchmark. (authors)
Nobels, Frank; Debacker, Noëmi; Brotons, Carlos; Elisaf, Moses; Hermans, Michel P; Michel, Georges; Muls, Erik
2011-09-22
To investigate the effect of physician- and patient-specific feedback with benchmarking on the quality of care in adults with type 2 diabetes mellitus (T2DM). Study centres in six European countries were randomised to either a benchmarking or control group. Physicians in both groups received feedback on modifiable outcome indicators (glycated haemoglobin [HbA1c], glycaemia, total cholesterol, high density lipoprotein-cholesterol, low density lipoprotein [LDL]-cholesterol and triglycerides) for each patient at 0, 4, 8 and 12 months, based on the four times yearly control visits recommended by international guidelines. The benchmarking group also received comparative results on three critical quality indicators of vascular risk (HbA1c, LDL-cholesterol and systolic blood pressure [SBP]), checked against the results of their colleagues from the same country, and versus pre-set targets. After 12 months of follow up, the percentage of patients achieving the pre-determined targets for the three critical quality indicators will be assessed in the two groups. Recruitment was completed in December 2008 with 3994 evaluable patients. This paper discusses the study rationale and design of OPTIMISE, a randomised controlled study, that will help assess whether benchmarking is a useful clinical tool for improving outcomes in T2DM in primary care. NCT00681850.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lopes, M. L.
2014-07-01
SolCalc is a software suite that computes and displays magnetic fields generated by a three-dimensional (3D) solenoid system. Examples of such systems are the Mu2e magnet system and helical solenoids for muon cooling systems. SolCalc was originally coded in Matlab, and later upgraded to a compiled version (called MEX) to improve solving speed. Matlab was chosen because its graphical capabilities represent an attractive feature over other computer languages. Solenoid geometries can be created using any text editor or spreadsheet and can be displayed dynamically in 3D. Fields are computed from any given list of coordinates. The field distribution on the surfaces of the coils can be displayed as well. SolCalc was benchmarked against well-known commercial software for speed and accuracy, and the results compared favorably.
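SolCalc itself computes full 3D fields from the coil geometry; as a much-simplified illustration of the underlying superposition idea, the sketch below sums the analytic on-axis field of individual current loops.

```python
# Simplified on-axis illustration of solenoid field superposition; this is
# not SolCalc's method (which handles full 3-D fields), only the idea of
# summing per-loop contributions.
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, T*m/A

def loop_bz_on_axis(I, R, z):
    """Axial field (T) of a single loop of radius R (m) at axial distance z (m)."""
    return MU0 * I * R**2 / (2.0 * (R**2 + z**2) ** 1.5)

def solenoid_bz(I, R, loop_positions, z):
    return sum(loop_bz_on_axis(I, R, z - zp) for zp in loop_positions)

# 1000 loops over 1 m carrying 100 A: infinite-solenoid limit mu0*n*I is
# ~0.126 T; the finite sum at the centre comes out slightly below that.
loops = [i / 1000.0 for i in range(1000)]
print(round(solenoid_bz(100.0, 0.05, loops, 0.5), 3))
```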
Automated detection and characterization of harmonic tremor in continuous seismic data
NASA Astrophysics Data System (ADS)
Roman, Diana C.
2017-06-01
Harmonic tremor is a common feature of volcanic, hydrothermal, and ice sheet seismicity and is thus an important proxy for monitoring changes in these systems. However, no automated methods for detecting harmonic tremor currently exist. Because harmonic tremor shares characteristics with speech and music, digital signal processing techniques for analyzing these signals can be adapted. I develop a novel pitch-detection-based algorithm to automatically identify occurrences of harmonic tremor and characterize their frequency content. The algorithm is applied to seismic data from Popocatepetl Volcano, Mexico, and benchmarked against a monthlong manually detected catalog of harmonic tremor events. During a period of heightened eruptive activity from December 2014 to May 2015, the algorithm detects 1465 min of harmonic tremor, which generally precede periods of heightened explosive activity. These results demonstrate the algorithm's ability to accurately characterize harmonic tremor while highlighting the need for additional work to understand its causes and implications at restless volcanoes.
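Pitch detection by autocorrelation is one generic member of the family of speech/music techniques the abstract alludes to; the sketch below applies it to a synthetic harmonic signal and is not a reproduction of the paper's algorithm.

```python
# Illustrative autocorrelation pitch detector on a synthetic harmonic
# signal; a generic technique in the same family as the paper's method.
import numpy as np

fs = 100.0                        # Hz, a typical seismic sampling rate
t = np.arange(0, 60, 1 / fs)
x = sum(np.sin(2 * np.pi * k * 1.2 * t) / k for k in (1, 2, 3))  # 1.2 Hz fundamental
x += 0.1 * np.random.randn(t.size)  # add noise

ac = np.correlate(x, x, mode="full")[x.size - 1:]  # autocorrelation, lags >= 0
lo, hi = int(fs / 10.0), int(fs / 0.5)             # search 0.5-10 Hz fundamentals
lag = lo + np.argmax(ac[lo:hi])
print(fs / lag)                    # detected fundamental, ~1.2 Hz
```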
NASA Astrophysics Data System (ADS)
Schwegler, Eric; Challacombe, Matt; Head-Gordon, Martin
1997-06-01
A new linear scaling method for computation of the Cartesian Gaussian-based Hartree-Fock exchange matrix is described, which employs a method numerically equivalent to standard direct SCF, and which does not enforce locality of the density matrix. With a previously described method for computing the Coulomb matrix [J. Chem. Phys. 106, 5526 (1997)], linear scaling incremental Fock builds are demonstrated for the first time. Microhartree accuracy and linear scaling are achieved for restricted Hartree-Fock calculations on sequences of water clusters and polyglycine α-helices with the 3-21G and 6-31G basis sets. Eightfold speedups are found relative to our previous method. For systems with a small ionization potential, such as graphitic sheets, the method naturally reverts to the expected quadratic behavior. Also, benchmark 3-21G calculations attaining microhartree accuracy are reported for the P53 tetramerization monomer involving 698 atoms and 3836 basis functions.
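For reference, the exchange matrix whose linear-scaling construction is described above is, in standard notation,

```latex
K_{\mu\nu} = \sum_{\lambda\sigma} P_{\lambda\sigma}\,(\mu\lambda\,|\,\nu\sigma)
```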
The Effect of Thermophoresis on Unsteady Oldroyd-B Nanofluid Flow over Stretching Surface
Awad, Faiz G.; Ahamed, Sami M. S.; Sibanda, Precious; Khumalo, Melusi
2015-01-01
There are currently only a few theoretical studies of convective heat transfer in polymer nanocomposites. In this paper, the unsteady incompressible flow of a polymer nanocomposite, represented by an Oldroyd-B nanofluid, along a stretching sheet is investigated. Recent studies have assumed that the nanoparticle fraction can be actively controlled on the boundary, similar to the temperature. However, in practice such control presents significant challenges, and in this study the nanoparticle flux at the boundary surface is assumed to be zero. We have used a relatively novel numerical scheme, the spectral relaxation method, to solve the momentum, heat and mass transport equations. The accuracy of the solutions has been determined by benchmarking the results against the quasilinearisation method. We have conducted a parametric study to determine the influence of the fluid parameters on the heat and mass transfer coefficients. PMID:26312754
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, D. A.; Chadwick, M. B.; Capote, R.
We describe the new ENDF/B-VIII.0 evaluated nuclear reaction data library. ENDF/B-VIII.0 fully incorporates the new IAEA standards, includes improved thermal neutron scattering data and uses new evaluated data from the CIELO project for neutron reactions on 1H, 16O, 56Fe, 235U, 238U and 239Pu described in companion papers in the present issue of Nuclear Data Sheets. The evaluations benefit from recent experimental data obtained in the U.S. and Europe, and improvements in theory and simulation. Notable advances include updated evaluated data for light nuclei, structural materials, actinides, fission energy release, prompt fission neutron and γ-ray spectra, thermal neutron scattering data, and charged-particle reactions. Integral validation testing is shown for a wide range of criticality, reaction rate, and neutron transmission benchmarks. In general, integral validation performance of the library is improved relative to the previous ENDF/B-VII.1 library.
Calendering and Rolling of Viscoplastic Materials: Theory and Experiments
NASA Astrophysics Data System (ADS)
Mitsoulis, E.; Sofou, S.; Muliawan, E. B.; Hatzikiriakos, S. G.
2007-04-01
The calendering and rolling processes are used in a wide variety of industries for the production of rolled sheets or films of specific thickness and final appearance. The final sheet thickness depends mainly on the rheological properties of the material. The materials used in the present study are foodstuffs (mozzarella cheese and flour-water dough) used in food processing. These materials are rheologically viscoplastic, obeying the Herschel-Bulkley model. The results give the final sheet thickness and the torque as a function of the roll speed. Theoretical analysis based on the Lubrication Approximation Theory (LAT) shows that LAT is a good predictive tool for calendering, where the sheet thickness is very small compared with the roll size. However, in rolling, where this is not true, LAT does not hold and a 2-D analysis is necessary.
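The Herschel-Bulkley model referenced above has the standard form, with yield stress τ_y, consistency K and power-law index n:

```latex
\tau = \tau_y + K\,\dot{\gamma}^{\,n} \quad (\tau > \tau_y), \qquad
\dot{\gamma} = 0 \quad (\tau \le \tau_y)
```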
Graphene electron cannon: High-current edge emission from aligned graphene sheets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Jianlong; Li, Nannan; Guo, Jing
2014-01-13
High-current field emitters are made from graphene paper consisting of aligned graphene sheets. Field emission luminance patterns show that their electron beams can be controlled by rolling the graphene paper from a sheet into a cylinder. Such shaped electron beams would be useful for vacuum devices and electron beam lithography. To obtain high-current emission, the graphene paper is rolled into an array to form a graphene cannon. Owing to the aligned emission array, the graphene cannon has a high emission current. Besides the high emission current, the graphene cannon is also robust, with excellent emission stability. With these good field emission properties, aligned graphene emitters show promise for such applications.
Chen, Tao; Peng, Huisheng; Durstock, Michael; Dai, Liming
2014-01-01
By using highly aligned carbon nanotube (CNT) sheets of excellent optical transmittance and mechanical stretchability as both the current collector and the active electrode, high-performance transparent and stretchable all-solid supercapacitors with good stability were developed. A transmittance of up to 75% at a wavelength of 550 nm was achieved for a supercapacitor made from a cross-over assembly of two single-layer CNT sheets. The transparent supercapacitor has a specific capacitance of 7.3 F g−1 and can be biaxially stretched by up to 30% strain without any obvious change in electrochemical performance, even over hundreds of stretching cycles. PMID:24402400
Long term ice sheet mass change rates and inter-annual variability from GRACE gravimetry.
NASA Astrophysics Data System (ADS)
Harig, C.
2017-12-01
The GRACE time series of gravimetry now stretches 15 years since its launch in 2002. Here we use Slepian functions to estimate the long term ice mass trends of Greenland, Antarctica, and several glaciated regions. The spatial representation shows multi-year to decadal regional shifts in accelerations, in agreement with increases in radar derived ice velocity. Interannual variations in ice mass are of particular interest since they can directly link changes in ice sheets to the drivers of change in the polar ocean and atmosphere. The spatial information retained in Slepian functions provides a tool to determine how this link varies in different regions within an ice sheet. We present GRACE observations of the 2013-2014 slowdown in mass loss of the Greenland ice sheet, which was concentrated in specific parts of the ice sheet and in certain months of the year. We also discuss estimating the relative importance of climate factors that control ice mass balance, as a function of location of the glacier/ice cap as well as the spatial variation within an ice sheet by comparing gravimetry with observations of surface air temperature, ocean temperature, etc. as well as model data from climate reanalysis products.
Interpretation of high-speed flows in the plasma sheet
NASA Technical Reports Server (NTRS)
Chen, C. X.; Wolf, R. A.
1993-01-01
Pursuing an idea suggested by Pontius and Wolf (1990), we propose that the 'bursty bulk flows' observed by Baumjohann et al. (1990) and Angelopoulos et al. (1992) are 'bubbles' in the Earth's plasma sheet. Specifically, they are flux tubes that have lower values of pV^(5/3) than their neighbors, where p is the thermal pressure of the particles and V is the volume of a tube containing one unit of magnetic flux. Whether they are created by reconnection or some other mechanism, the bubbles are propelled earthward by a magnetic buoyancy force, which is related to the interchange instability. Most of the major observed characteristics of the bursty bulk flows can be interpreted naturally in terms of the bubble picture. We propose a new 'stratified fluid' picture of the plasma sheet, based on the idea that bubbles constitute the crucial transport mechanism. Results from simple mathematical models of plasma sheet transport support the idea that bubbles can resolve the pressure balance inconsistency, particularly in cases where plasma sheet ions are lost by gradient/curvature drift out the sides of the tail or bubbles are generated by reconnection in the middle of the plasma sheet.
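The buoyancy criterion implied by this picture can be stated compactly: with the entropy parameter S = pV^(5/3), a flux tube moves earthward when its value is below that of the background,

```latex
S = p\,V^{5/3}, \qquad S_{\mathrm{bubble}} < S_{\mathrm{background}}
```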
NASA Astrophysics Data System (ADS)
Kuzmina, K. S.; Marchevsky, I. K.; Ryatina, E. P.
2017-11-01
We consider the methodology of numerical scheme development for the two-dimensional vortex method. We describe two different approaches to deriving the integral equation for the unknown vortex sheet intensity. We simulate the velocity on the surface line of an airfoil as the influence of attached vortex and source sheets. We consider a polygonal approximation of the airfoil and assume the intensity distributions of the free and attached vortex sheets and the attached source sheet to be approximated with piecewise-constant or piecewise-linear (continuous or discontinuous) functions. We describe several specific numerical schemes that provide different accuracy and have different computational cost. The study shows that a Galerkin-type approach to solving the boundary integral equation requires computing several integrals and double integrals over the panels. We obtain exact analytical formulae for all the necessary integrals, which makes it possible to significantly raise the accuracy of vortex sheet intensity computation and improve the quality of the velocity and vorticity field representation, especially in proximity to the surface line of the airfoil. All the formulae are written in invariant form and depend only on the geometric relationship between the positions of the beginnings and ends of the panels.
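The underlying influence law is the two-dimensional Biot-Savart integral for a vortex sheet of intensity γ along the airfoil contour K (notation assumed here, not taken from the paper):

```latex
\mathbf{V}(\mathbf{r}) = \frac{1}{2\pi}\oint_{K}
\gamma(\boldsymbol{\xi})\,
\frac{\mathbf{k}\times(\mathbf{r}-\boldsymbol{\xi})}{\lvert \mathbf{r}-\boldsymbol{\xi} \rvert^{2}}\,
\mathrm{d}l_{\boldsymbol{\xi}}
```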
46 CFR 164.015-1 - Applicable specifications and standards.
Code of Federal Regulations, 2010 CFR
2010-10-01
..., CONSTRUCTION, AND MATERIALS: SPECIFICATIONS AND APPROVAL MATERIALS Plastic Foam, Unicellular, Buoyant, Sheet... following specification and standard, of the issue in effect on the date the plastic foam material is...) ASTM D4986-98, Standard Test Method for Horizontal Burning Characteristics of Cellular Polymeric...
Extraction of Ice Sheet Layers from Two Intersected Radar Echograms Near Neem Ice Core in Greenland
NASA Astrophysics Data System (ADS)
Xiong, S.; Muller, J.-P.
2016-06-01
Accumulation of snow and ice over time results in ice sheet layers. These can be remotely sensed where there is a contrast in electromagnetic properties, which reflects variations in ice density, acidity and fabric orientation. Internal ice layers are assumed to be isochronous, deep beneath the ice surface, and parallel to the direction of ice flow. The distribution of internal layers is related to ice sheet dynamics, such as the basal melt rate, basal elevation variation and changes in ice flow mode, which are important parameters for modelling the ice sheet. A radar echo sounder is an effective instrument for studying the sedimentology of the Earth and planets. Ice Penetrating Radar (IPR) is a specific kind of radar echo sounder that, depending on the frequency utilised, extends studies of ice sheets from the surface and subsurface down to the deep internal layers. In this study, we examine a site where folded ice occurs in the internal ice sheet south of the North Greenland Eemian ice drilling (NEEM) station; two intersecting radar echograms acquired by the Multi-channel Coherent Radar Depth Sounder (MCoRDS) employed in NASA's Operation IceBridge (OIB) mission imaged this folded ice. We propose a slice processing flow based on the Radon transform to trace and extract these two sets of curved ice sheet layers, which can then be viewed in 3-D, demonstrating the 3-D structure of the ice folds.
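As a hedged sketch of how a Radon transform can pick out the local dip of layered reflections, the toy example below scans projection angles over a synthetic patch; the authors' actual slice-processing flow is more involved.

```python
# Toy sketch: use a Radon transform to find the dominant local dip of
# layered reflections in an echogram patch; illustrative only, not the
# authors' pipeline.
import numpy as np
from skimage.transform import radon

patch = np.zeros((64, 64))
patch[20::8, :] = 1.0                         # synthetic horizontal layers
angles = np.arange(-20.0, 21.0, 0.5) + 90.0   # scan around horizontal
sino = radon(patch, theta=angles, circle=False)
# Projections taken along the layering direction have maximal variance.
dip = angles[np.argmax(sino.var(axis=0))] - 90.0
print(dip)                                    # ~0 degrees for flat layers
```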
Using Grid Benchmarks for Dynamic Scheduling of Grid Applications
NASA Technical Reports Server (NTRS)
Frumkin, Michael; Hood, Robert
2003-01-01
Navigation, or dynamic scheduling, of applications on computational grids can be improved through the use of an application-specific characterization of grid resources. Current grid information systems provide a description of the resources but do not contain any application-specific information. We define a GridScape as the dynamic state of the grid resources. We measure the dynamic performance of these resources using the grid benchmarks. Then we use the GridScape for automatic assignment of the tasks of a grid application to grid resources. The scalability of the system is achieved by limiting the navigation overhead to a few percent of the application resource requirements. Our task submission and assignment protocol guarantees that the navigation system does not cause grid congestion. On a synthetic data mining application we demonstrate that GridScape-based task assignment reduces the application turnaround time.
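In the spirit of the GridScape idea, the toy sketch below assigns each task to the resource whose benchmark-measured rate (plus already-queued work) finishes it earliest; all names are illustrative, not the paper's API.

```python
# Toy benchmark-informed task assignment; names are illustrative.
def assign(tasks, rates):
    """tasks: {name: work units}; rates: {resource: units/s from benchmarks}."""
    busy = {r: 0.0 for r in rates}   # queued time per resource
    plan = {}
    for name, work in sorted(tasks.items(), key=lambda kv: -kv[1]):
        # pick the resource with the earliest estimated completion time
        r = min(rates, key=lambda res: busy[res] + work / rates[res])
        busy[r] += work / rates[r]
        plan[name] = r
    return plan

print(assign({"t1": 40, "t2": 10, "t3": 25}, {"nodeA": 5.0, "nodeB": 2.0}))
```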
NASA Astrophysics Data System (ADS)
Georges, F.; Remouche, M.; Meyrueis, P.
2011-06-01
Manufacturers' specifications usually do not address the ability of linear sheet polarizers to maintain a constant transmittance function over their geometric area. These parameters are fundamental for developing low-cost polarimetric sensors (for instance, of rotation, torque, or displacement), specifically for hybrid cars (combined thermal and electric power). It is therefore necessary to characterize commercial polarizer sheets to determine whether they are suited to this kind of application. In this paper, we present the measuring methods and test bench developed for this purpose, together with some preliminary characterization results. We state conclusions for effective application to hybrid car gearbox control and monitoring.
Personalized disease-specific protein corona influences the therapeutic impact of graphene oxide.
Hajipour, Mohammad Javad; Raheb, Jamshid; Akhavan, Omid; Arjmand, Sareh; Mashinchian, Omid; Rahman, Masoud; Abdolahad, Mohammad; Serpooshan, Vahid; Laurent, Sophie; Mahmoudi, Morteza
2015-05-21
The hard corona, the protein shell that is strongly attached to the surface of nano-objects in biological fluids, is recognized as the first layer that interacts with biological objects (e.g., cells and tissues). The decoration of the hard corona (i.e., the type, amount, and conformation of the attached proteins) can define the biological fate of the nanomaterial. Recent developments have revealed that corona decoration strongly depends on the type of disease in the human patients from whom the plasma is obtained as a protein source for corona formation (referred to as the 'personalized protein corona'). In this study, we demonstrate that graphene oxide (GO) sheets can trigger different biological responses in the presence of coronas obtained from various types of diseases. GO sheets were incubated with plasma from human subjects with different diseases/conditions, including hypofibrinogenemia, blood cancer, thalassemia major, thalassemia minor, rheumatism, favism, hypercholesterolemia, diabetes, and pregnancy. Identical sheets coated with varying protein corona decorations exhibited significantly different cellular toxicity, apoptosis, uptake, reactive oxygen species production, lipid peroxidation, and nitrogen oxide levels. The results of this report will help researchers design efficient and safe, patient-specific nanobiomaterials in a disease-specific manner for clinical and biological applications.
International consensus on a complications list after gastrectomy for cancer.
Baiocchi, Gian Luca; Giacopuzzi, Simone; Marrelli, Daniele; Reim, Daniel; Piessen, Guillaume; Matos da Costa, Paulo; Reynolds, John V; Meyer, Hans-Joachim; Morgagni, Paolo; Gockel, Ines; Lara Santos, Lucio; Jensen, Lone Susanne; Murphy, Thomas; Preston, Shaun R; Ter-Ovanesov, Mikhail; Fumagalli Romario, Uberto; Degiuli, Maurizio; Kielan, Wojciech; Mönig, Stefan; Kołodziejczyk, Piotr; Polkowski, Wojciech; Hardwick, Richard; Pera, Manuel; Johansson, Jan; Schneider, Paul M; de Steur, Wobbe O; Gisbertz, Suzanne S; Hartgrink, Henk; van Sandick, Joanna W; Portolani, Nazario; Hölscher, Arnulf H; Botticini, Maristella; Roviello, Franco; Mariette, Christophe; Allum, William; De Manzoni, Giovanni
2018-05-30
Perioperative complications can affect outcomes after gastrectomy for cancer, with high mortality and morbidity rates ranging between 10 and 40%. The absence of a standardized system for recording complications generates wide variation in evaluating their impacts on outcomes and hinders proposals of quality-improvement projects. The aim of this study was to provide a list of defined gastrectomy complications approved through international consensus. The Gastrectomy Complications Consensus Group consists of 34 European gastric cancer experts who are members of the International Gastric Cancer Association. A group meeting established the work plan for study implementation through Delphi surveys. A consensus was reached regarding a set of standardized methods to define gastrectomy complications. A standardized list of 27 defined complications (grouped into 3 intraoperative, 14 postoperative general, and 10 postoperative surgical complications) was created to provide a simple but accurate template for recording individual gastrectomy complications. A consensus was reached for both the list of complications that should be considered major adverse events after gastrectomy for cancer and their specific definitions. The study group also agreed that an assessment of each surgical case should be completed at patient discharge and 90 days postoperatively using a Complication Recording Sheet. The list of defined complications (soon to be validated in an international multicenter study) and the ongoing development of an electronic datasheet app to record them provide the basic infrastructure to reach the ultimate goals of standardized international data collection, establishment of benchmark results, and fostering of quality-improvement projects.
Automatic Keyword Extraction from Individual Documents
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rose, Stuart J.; Engel, David W.; Cramer, Nicholas O.
2010-05-03
This paper introduces a novel and domain-independent method for automatically extracting keywords, as sequences of one or more words, from individual documents. We describe the method’s configuration parameters and algorithm, and present an evaluation on a benchmark corpus of technical abstracts. We also present a method for generating lists of stop words for specific corpora and domains, and evaluate its ability to improve keyword extraction on the benchmark corpus. Finally, we apply our method of automatic keyword extraction to a corpus of news articles and define metrics for characterizing the exclusivity, essentiality, and generality of extracted keywords within a corpus.
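The method described here is RAKE (Rapid Automatic Keyword Extraction), and its core is compact enough to sketch: candidate keywords are maximal runs of content words between stop words, and each candidate is scored by the sum of its words' degree-to-frequency ratios. The toy stop list below is an illustrative stand-in for the corpus-specific lists the paper generates.

```python
# Condensed sketch of the RAKE scoring idea described above; the stop-word
# list is a toy stand-in, not a corpus-derived list.
import re
from collections import defaultdict

STOP = {"a", "an", "the", "of", "and", "or", "for", "to", "in", "on", "is", "over"}

def rake(text, top_n=3):
    words = re.findall(r"[a-z]+", text.lower())
    phrases, current = [], []
    for w in words:
        if w in STOP:                 # stop words delimit candidate phrases
            if current:
                phrases.append(current)
            current = []
        else:
            current.append(w)
    if current:
        phrases.append(current)
    freq, degree = defaultdict(int), defaultdict(int)
    for p in phrases:
        for w in p:
            freq[w] += 1
            degree[w] += len(p)       # degree: co-occurrence within candidates
    ranked = sorted(phrases, reverse=True,
                    key=lambda p: sum(degree[w] / freq[w] for w in p))
    return [" ".join(p) for p in ranked[:top_n]]

print(rake("Compatibility of systems of linear constraints "
           "over the set of natural numbers"))
```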
ERIC Educational Resources Information Center
Waldron, Laurie
This guide offers a nutrition education program for students in kindergarten through grade 6, with activities that span all grades as well as activities for specific grade levels. Nutrition education objectives are stated for each grade level: (1) grade four--students will explore how to balance food intake and energy output for overall health and physical…
NASA Astrophysics Data System (ADS)
Rothdiener, Miriam; Hegemann, Miriam; Uynuk-Ool, Tatiana; Walters, Brandan; Papugy, Piruntha; Nguyen, Phong; Claus, Valentin; Seeger, Tanja; Stoeckle, Ulrich; Boehme, Karen A.; Aicher, Wilhelm K.; Stegemann, Jan P.; Hart, Melanie L.; Kurz, Bodo; Klein, Gerd; Rolauffs, Bernd
2016-10-01
Matrix elasticity and cyclic stretch have each been investigated for inducing mesenchymal stromal cell (MSC) differentiation towards the smooth muscle cell (SMC) lineage, but not in combination. We hypothesized that combining lineage-specific stiffness with cyclic stretch would result in a significantly increased expression of SMC markers, compared to non-stretched controls. First, we generated dense collagen type I sheets by mechanically compressing collagen hydrogels. Atomic force microscopy revealed a nanoscale stiffness range known to support myogenic differentiation. Further characterization revealed viscoelasticity and stable biomechanical properties under cyclic stretch, with >99% of adherent human MSCs viable. MSCs on collagen sheets demonstrated a significantly increased mRNA but not protein expression of SMC markers, compared to MSCs on culture flasks. However, cyclic stretch of MSCs on collagen sheets significantly increased both mRNA and protein expression of α-smooth muscle actin, transgelin, and calponin versus plastic and non-stretched sheets. Thus, lineage-specific stiffness and cyclic stretch can be applied together for inducing MSC differentiation towards SMCs without the addition of recombinant growth factors or other soluble factors. This represents a novel stimulation method for modulating the phenotype of MSCs towards SMCs that could easily be incorporated into currently available methodologies to obtain a more targeted control of MSC phenotype.
Design and development of a community carbon cycle benchmarking system for CMIP5 models
NASA Astrophysics Data System (ADS)
Mu, M.; Hoffman, F. M.; Lawrence, D. M.; Riley, W. J.; Keppel-Aleks, G.; Randerson, J. T.
2013-12-01
Benchmarking has been widely used to assess the ability of atmosphere, ocean, sea ice, and land surface models to capture the spatial and temporal variability of observations during the historical period. For the carbon cycle and terrestrial ecosystems, the design and development of an open-source community platform has been an important goal as part of the International Land Model Benchmarking (ILAMB) project. Here we designed and developed a software system that enables the user to specify the models, benchmarks, and scoring systems so that results can be tailored to specific model intercomparison projects. We used this system to evaluate the performance of CMIP5 Earth system models (ESMs). Our scoring system used information from four different aspects of climate, including the climatological mean spatial pattern of gridded surface variables, seasonal cycle dynamics, the amplitude of interannual variability, and long-term decadal trends. We used this system to evaluate burned area, global biomass stocks, net ecosystem exchange, gross primary production, and ecosystem respiration from CMIP5 historical simulations. Initial results indicated that the multi-model mean often performed better than many of the individual models for most of the observational constraints.
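As a rough illustration of how such a scoring system can combine the four aspects named above, the sketch below turns per-aspect normalized errors into scores and averages them with weights. The exp(-error) mapping and the equal default weights are assumptions for illustration, not the project's exact formulation.

```python
# Hedged sketch of multi-aspect benchmark scoring: per-aspect normalized
# RMSE is mapped to a (0, 1] score and combined by weighted average. The
# scoring function and weights are illustrative assumptions.
import numpy as np

def aspect_score(model, obs):
    """exp(-nrmse): 1.0 for a perfect match, decaying as the error grows."""
    nrmse = np.sqrt(np.mean((model - obs) ** 2)) / (np.std(obs) + 1e-12)
    return float(np.exp(-nrmse))

def overall_score(aspects, weights=None):
    """aspects: {name: (model_array, obs_array)} for e.g. mean state,
    seasonal cycle, interannual variability, and decadal trend."""
    weights = weights or {name: 1.0 for name in aspects}
    total = sum(weights.values())
    return sum(w * aspect_score(*aspects[n]) for n, w in weights.items()) / total

rng = np.random.default_rng(0)
obs = rng.normal(size=120)                     # e.g. ten years of monthly GPP
model = obs + rng.normal(scale=0.3, size=120)  # a model with modest errors
print(overall_score({"mean_state": (model, obs), "seasonal": (model, obs)}))
```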
46 CFR 164.009-3 - Noncombustible materials not requiring specific approval.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Merchant Vessels § 164.009-3 Noncombustible materials not requiring specific approval. The following noncombustible materials may be used in merchant vessel construction though not specifically approved under this subpart: (a) Sheet glass, block glass, clay, ceramics, and uncoated fibers. (b) All metals, except...
Roudsari, AV; Gordon, C; Gray, JA Muir
2001-01-01
Background In 1998, the U.K. National Health Service Information for Health Strategy proposed the implementation of a National electronic Library for Health to provide clinicians, healthcare managers and planners, patients and the public with easy, round-the-clock access to high quality, up-to-date electronic information on health and healthcare. The Virtual Branch Libraries are among the most important components of the National electronic Library for Health. They aim at creating online knowledge-based communities, each concerned with some specific clinical and other health-related topics. Objectives This study is about the envisaged Dermatology Virtual Branch Libraries of the National electronic Library for Health. It aims at selecting suitable dermatology Web resources for inclusion in the forthcoming Virtual Branch Libraries after establishing preliminary quality benchmarking rules for this task. Psoriasis, being a common dermatological condition, has been chosen as a starting point. Methods Because quality is a principal concern of the National electronic Library for Health, the study includes a review of the major quality benchmarking systems available today for assessing health-related Web sites. The methodology of developing a quality benchmarking system has also been reviewed. Aided by metasearch Web tools, candidate resources were hand-selected in light of the reviewed benchmarking systems and specific criteria set by the authors. Results Over 90 professional and patient-oriented Web resources on psoriasis and dermatology in general are suggested for inclusion in the forthcoming Dermatology Virtual Branch Libraries. The idea of an all-in knowledge-hallmarking instrument for the National electronic Library for Health is also proposed, based on the reviewed quality benchmarking systems. Conclusions Skilled, methodical, organized human reviewing, selection and filtering based on well-defined quality appraisal criteria seems likely to be the key ingredient in the envisaged National electronic Library for Health service. Furthermore, by promoting the application of agreed quality guidelines and codes of ethics by all health information providers and not just within the National electronic Library for Health, the overall quality of the Web will improve with time and the Web will ultimately become a reliable and integral part of the care space. PMID:11720947
Takahashi, Hironobu; Okano, Teruo
2015-11-18
In some native tissues, appropriate microstructures, including orientation of the cells/extracellular matrix, provide specific mechanical and biological functions. For example, skeletal muscle is made of oriented myofibers that are responsible for its mechanical function. Native artery and myocardial tissues are organized three-dimensionally by stacking sheet-like tissues of aligned cells. Therefore, to construct any kind of complex tissue, the microstructures of cells such as myotubes, smooth muscle cells, and cardiomyocytes also need to be organized three-dimensionally, just as in the native tissues of the body. Cell sheet-based tissue engineering allows the production of scaffold-free engineered tissues through a layer-by-layer construction technique. Recently, using microfabricated thermoresponsive substrates, aligned cells are being harvested as single continuous cell sheets. The cell sheets act as anisotropic tissue units to build three-dimensional tissue constructs with the appropriate anisotropy. This cell sheet-based technology is straightforward and has the potential to engineer a wide variety of complex tissues. In addition, due to the scaffold-free, cell-dense environment, the physical and biological cell-cell interactions of these cell sheet constructs exhibit unique cell behaviors. These advantages will provide important clues to enable the production of well-organized tissues that closely mimic the structure and function of native tissues, as required for the future of tissue engineering.
NASA Astrophysics Data System (ADS)
Steefel, C. I.
2015-12-01
Over the last 20 years, we have seen the evolution of multicomponent reactive transport modeling and the expanding range and increasing complexity of the subsurface environmental applications it is being used to address. Reactive transport modeling is being asked to provide accurate assessments of engineering performance and risk for important issues with far-reaching consequences. As a result, the complexity and detail of subsurface processes, properties, and conditions that can be simulated have significantly expanded. Closed-form solutions are necessary and useful, but limited to situations that are far simpler than typical applications that combine many physical and chemical processes, in many cases in coupled form. In the absence of closed-form and yet realistic solutions for complex applications, numerical benchmark problems with an accepted set of results will be indispensable for qualifying codes for various environmental applications. The intent of this benchmarking exercise, now underway for more than five years, is to develop and publish a set of well-described benchmark problems that can be used to demonstrate simulator conformance with norms established by the subsurface science and engineering community. The objective is not to verify this or that specific code (the reactive transport codes play a supporting role in this regard) but rather to use the codes to verify that a common solution of the problem can be achieved. Thus, the objective of each of the manuscripts is to present an environmentally relevant benchmark problem that tests conceptual model capabilities, numerical implementation, process coupling, and accuracy. The benchmark problems developed to date include 1) microbially-mediated reactions, 2) isotopes, 3) multi-component diffusion, 4) uranium fate and transport, 5) metal mobility in mining-affected systems, and 6) waste repositories and related aspects.
NASA Technical Reports Server (NTRS)
Davis, G. J.
1994-01-01
One area of research of the Information Sciences Division at NASA Ames Research Center is devoted to the analysis and enhancement of processors and advanced computer architectures, specifically in support of automation and robotic systems. To compare systems' abilities to efficiently process Lisp and Ada, scientists at Ames Research Center have developed a suite of non-parallel benchmarks called ELAPSE. The benchmark suite was designed to test a single computer's efficiency as well as to compare alternative machines on Lisp and/or Ada workloads. ELAPSE tests the efficiency with which a machine can execute the various routines in each environment. The sample routines are based on numeric and symbolic manipulations and include two-dimensional fast Fourier transformations, Cholesky decomposition and substitution, Gaussian elimination, high-level data processing, and symbol-list references. Also included is a routine based on a Bayesian classification program that sorts data into optimized groups. The ELAPSE benchmarks are available for any computer with a validated Ada compiler and/or Common Lisp system. Of the 18 routines that comprise ELAPSE, 14 were developed or translated at Ames and are provided within this package; the others are readily available in the literature. The benchmark that requires the most memory is CHOLESKY.ADA. Under VAX/VMS, CHOLESKY.ADA requires 760K of main memory. ELAPSE is available on either two 5.25 inch 360K MS-DOS format diskettes (standard distribution) or a 9-track 1600 BPI ASCII CARD IMAGE format magnetic tape. The contents of the diskettes are compressed using the PKWARE archiving tools. The utility to unarchive the files, PKUNZIP.EXE, is included. The ELAPSE benchmarks were written in 1990. VAX and VMS are trademarks of Digital Equipment Corporation. MS-DOS is a registered trademark of Microsoft Corporation.
Gururaj, Anupama E.; Chen, Xiaoling; Pournejati, Saeid; Alter, George; Hersh, William R.; Demner-Fushman, Dina; Ohno-Machado, Lucila
2017-01-01
The rapid proliferation of publicly available biomedical datasets has provided abundant resources that are potentially of value as a means to reproduce prior experiments, and to generate and explore novel hypotheses. However, there are a number of barriers to the re-use of such datasets, which are distributed across a broad array of dataset repositories, focusing on different data types and indexed using different terminologies. New methods are needed to enable biomedical researchers to locate datasets of interest within this rapidly expanding information ecosystem, and new resources are needed for the formal evaluation of these methods as they emerge. In this paper, we describe the design and generation of a benchmark for information retrieval of biomedical datasets, which was developed and used for the 2016 bioCADDIE Dataset Retrieval Challenge. In the tradition of the seminal Cranfield experiments, and as exemplified by the Text Retrieval Conference (TREC), this benchmark includes a corpus (biomedical datasets), a set of queries, and relevance judgments relating these queries to elements of the corpus. This paper describes the process through which each of these elements was derived, with a focus on those aspects that distinguish this benchmark from typical information retrieval reference sets. Specifically, we discuss the origin of our queries in the context of a larger collaborative effort, the biomedical and healthCAre Data Discovery Index Ecosystem (bioCADDIE) consortium, and the distinguishing features of biomedical dataset retrieval as a task. The resulting benchmark set has been made publicly available to advance research in the area of biomedical dataset retrieval. Database URL: https://biocaddie.org/benchmark-data PMID:29220453
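To make concrete how a corpus/queries/judgments benchmark of this kind is consumed, here is a minimal sketch of scoring a ranked retrieval run against relevance judgments with precision at k; the identifiers below are invented placeholders, not the bioCADDIE data.

```python
# Minimal sketch of evaluating a ranked run against relevance judgments
# (qrels); dataset identifiers are invented placeholders.
def precision_at_k(ranked_ids, relevant_ids, k):
    return sum(1 for d in ranked_ids[:k] if d in relevant_ids) / k

qrels = {"q1": {"ds003", "ds017", "ds042"}}                  # judged-relevant sets
run = {"q1": ["ds017", "ds999", "ds003", "ds120", "ds042"]}  # system ranking
for q, ranking in run.items():
    print(q, precision_at_k(ranking, qrels[q], k=5))  # -> q1 0.6
```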
Buell, G.R.; Grams, S.C.
1985-01-01
Significant temporal trends in monthly pH, specific conductance, total alkalinity, hardness, total nitrite-plus-nitrate nitrogen, and total phosphorus measurements at five stream sites in Georgia were identified using a rank correlation technique, the seasonal Kendall test and slope estimator. These sites include a U.S. Geological Survey Hydrologic Bench-Mark site, Falling Creek near Juliette, and four periodic water-quality monitoring sites. Comparison of raw data trends with streamflow-residual trends and, where applicable, with chemical-discharge trends (instantaneous fluxes) shows that some of these trends are responses to factors other than changing streamflow. Percentages of forested, agricultural, and urban cover within each basin did not change much during the periods of water-quality record, and therefore these non-flow-related trends are not obviously related to changes in land cover or land use. Flow-residual water-quality trends at the Hydrologic Bench-Mark site and at the Chattooga River site probably indicate basin responses to changes in the chemical quality of atmospheric deposition. These two basins are predominantly forested and have received little recent human use. Observed trends at the other three sites probably indicate basin responses to various land uses and water uses associated with agricultural and urban land, or to changes in specific uses. (USGS)
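The seasonal Kendall test named here has a compact core: the Mann-Kendall S statistic is computed separately within each season (typically each calendar month) and the statistics and variances are summed, so regular seasonal cycles do not register as trends. The sketch below omits tie corrections and serial-correlation adjustments.

```python
# Simplified seasonal Kendall test: per-season Mann-Kendall S statistics and
# variances are summed; tie and autocorrelation corrections are omitted.
import numpy as np

def mann_kendall_s(x):
    x = np.asarray(x, dtype=float)
    return sum(np.sign(x[j] - x[i])
               for i in range(len(x) - 1) for j in range(i + 1, len(x)))

def seasonal_kendall(values, months):
    """values: measurements in time order; months: calendar month of each."""
    s_total, var_total = 0.0, 0.0
    for m in set(months):
        season = [v for v, mm in zip(values, months) if mm == m]
        n = len(season)
        s_total += mann_kendall_s(season)
        var_total += n * (n - 1) * (2 * n + 5) / 18.0  # no-ties MK variance
    z = (s_total - np.sign(s_total)) / np.sqrt(var_total) if var_total else 0.0
    return s_total, z  # |z| > 1.96 indicates a trend at about the 5% level
```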
Foam Core Shielding for Spacecraft
NASA Technical Reports Server (NTRS)
Adams, Marc
2007-01-01
A foam core shield (FCS) system is now being developed to supplant multilayer insulation (MLI) systems heretofore installed on spacecraft for thermal management and protection against meteoroid impacts. A typical FCS system consists of a core sandwiched between a face sheet and a back sheet. The core can consist of any of a variety of low-to-medium-density polymeric or inorganic foams chosen to satisfy application-specific requirements regarding heat transfer and temperature. The face sheet serves to shock and thereby shatter incident meteoroids, and is coated on its outer surface to optimize its absorptance and emittance for regulation of temperature. The back sheet can be dimpled to minimize undesired thermal contact with the underlying spacecraft component and can be metallized on the surface facing the component to optimize its absorptance and emittance. The FCS systems can perform better than do MLI systems, at lower mass and lower cost and with greater volumetric efficiency.
Song, Jiangxuan; Yu, Zhaoxin; Gordin, Mikhail L; Wang, Donghai
2016-02-10
Herein, we report a synthesis of highly crumpled nitrogen-doped graphene sheets with ultrahigh pore volume (5.4 cm³/g) via a simple thermally induced expansion strategy in the absence of any templates. The wrinkled graphene sheets are interwoven rather than stacked, enabling rich nitrogen-containing active sites. Benefiting from the unique pore structure and the strong polysulfide adsorption induced by nitrogen doping, lithium-sulfur battery cells using these wrinkled graphene sheets as both sulfur host and interlayer achieved a high capacity of ∼1000 mAh/g and exceptional cycling stability, even at high sulfur content (≥80 wt %) and sulfur loading (5 mg sulfur/cm²). The high specific capacity together with the high sulfur loading pushes the areal capacity of these sulfur cathodes to ∼5 mAh/cm², which is outstanding compared to other recently developed sulfur cathodes and ideal for practical applications.
Low cost tooling material and process for graphite and Kevlar composites
NASA Technical Reports Server (NTRS)
Childs, William I.
1987-01-01
An Extruded Sheet Tooling Compound (ESTC) was developed for use in quickly building low-cost molds for fabricating composites. The ESTC is a very highly mineral-filled resin system formed into a 6 mm thick sheet. The sheet is laid on the pattern, vacuum (bag) is applied to remove air from the pattern surface, and the assembly is heat cured. The formed ESTC is then backed and/or framed and ready for use. The cured ESTC exhibits a low coefficient of thermal expansion and maintains strength at temperatures of 180 to 200 C. Tools were made and used successfully for compression molding of high-strength epoxy sheet molding compound, stamping of aluminum, resin transfer molding of polyester, and liquid resin molding of polyester. Several variations of ESTC can be made for specific requirements. Higher thermal conductivity can be achieved by using an aluminum particle filler. Room-temperature gelation is possible, allowing the use of foam patterns.
Thermally stabilized heliostat
Anderson, Alfred J.
1983-01-01
An improvement in a heliostat having a main support structure, pivoting and tilting motors and gears, and a mirror module for reflecting solar energy onto a collector, the improvement being characterized by an internal support structure within each mirror module and front and back sheets attached to the internal support structure, the front and back sheets having the same coefficient of thermal expansion such that no curvature is induced by temperature change, and a layer of adhesive adhering the mirror to the front sheet. The adhesive is water repellent and has adequate set strength to support the mirror but has sufficient shear tolerance to permit the differential expansion of the mirror and the front sheet without inducing stresses or a curvature effect. The adhesive also serves to dampen fluttering of the mirror and to protect the mirror backside against the adverse effects of weather. Also disclosed are specific details of the preferred embodiment.
NASA Technical Reports Server (NTRS)
Ko, William L.; Jackson, Raymond H.
1993-01-01
Combined inplane compressive and shear buckling analysis was conducted on flat rectangular sandwich panels using the Rayleigh-Ritz minimum energy method with consideration of the transverse shear effect of the sandwich core. The sandwich panels were fabricated with titanium honeycomb core and laminated metal matrix composite face sheets. The results show that slightly slender (along the unidirectional compressive loading axis) rectangular sandwich panels have the most desirable stiffness-to-weight ratios for aerospace structural applications; the degradation of buckling strength of sandwich panels with rising temperature is faster in shear than in compression; and the fiber orientation of the face sheets for optimum combined-load buckling strength is a strong function of both loading condition and panel aspect ratio. At the same specific weight and panel aspect ratio, a sandwich panel with metal matrix composite face sheets has much higher buckling strength than one having monolithic face sheets.
NASA Technical Reports Server (NTRS)
Klingler, L. J.; Weinberger, W. R.; Bailey, P. G.; Baranow, S.
1972-01-01
Two dispersion strengthened nickel-base alloy systems were developed for use at temperatures up to 1204 °C (2200 °F): TD nickel chromium (TDNiCr) and TD nickel chromium aluminum (TDNiCrAl). They are considered candidate materials for use on the thermal protection systems of the space shuttle and for long-term use in aircraft gas turbine engine applications. Improved manufacturing processes were developed for the fabrication of TDNiCr sheet and foil to specifications. Sheet rolling process studies and extrusion studies were made on two aluminum-containing alloys: Ni-16%Cr-3.5%Al-2%ThO2 and Ni-16%Cr-5.0%Al-2%ThO2. Over 1600 kg (3500 lb) of plate, sheet, foil, bar, and extrusion products were supplied to NASA Centers for technology studies.
Performance Evaluation of NoSQL Databases: A Case Study
2015-02-01
…a centralized relational database. The customer decided to consider NoSQL technologies for two specific uses, namely as the primary data store for… [the remainder of this passage is garbled in the source; legible fragments refer to NoSQL availability, data models, benchmarking with a specific workload, and a benchmark for transactional workloads] …The choice of a particular NoSQL database imposes a specific distributed software architecture and data model, and is a major determinant of the
DOE Office of Scientific and Technical Information (OSTI.GOV)
Asay-Davis, Xylar S.; Cornford, Stephen L.; Durand, Gaël
Coupled ice sheet-ocean models capable of simulating moving grounding lines are just becoming available. Such models have a broad range of potential applications in studying the dynamics of marine ice sheets and tidewater glaciers, from process studies to future projections of ice mass loss and sea level rise. The Marine Ice Sheet-Ocean Model Intercomparison Project (MISOMIP) is a community effort aimed at designing and coordinating a series of model intercomparison projects (MIPs) for model evaluation in idealized setups, model verification based on observations, and future projections for key regions of the West Antarctic Ice Sheet (WAIS). Here we describe computational experiments constituting three interrelated MIPs for marine ice sheet models and regional ocean circulation models incorporating ice shelf cavities. These consist of ice sheet experiments under the Marine Ice Sheet MIP third phase (MISMIP+), ocean experiments under the Ice Shelf-Ocean MIP second phase (ISOMIP+) and coupled ice sheet-ocean experiments under the MISOMIP first phase (MISOMIP1). All three MIPs use a shared domain with idealized bedrock topography and forcing, allowing the coupled simulations (MISOMIP1) to be compared directly to the individual component simulations (MISMIP+ and ISOMIP+). The experiments, which have qualitative similarities to Pine Island Glacier Ice Shelf and the adjacent region of the Amundsen Sea, are designed to explore the effects of changes in ocean conditions, specifically the temperature at depth, on basal melting and ice dynamics. In future work, differences between model results will form the basis for the evaluation of the participating models.
NASA Astrophysics Data System (ADS)
Aghazadeh, Mustafa; Rashidi, Amir; Ganjali, Mohammad Reza
2018-01-01
In this paper, well-defined nano-sheets of α-Co(OH)2 were prepared through cathodic electrosynthesis from an additive-free aqueous cobalt nitrate bath. Pulsed-current cathodic electrodeposition (PC-CED) was used to control the OH- electrogeneration on the cathode surface. The characteristics and electrochemical behavior of the prepared cobalt hydroxide were assessed through SEM, TEM, XRD, BET, and IR. The results proved the product to be composed of crystalline, pure α-phase cobalt hydroxide with a sheet-like morphology at the nanoscale. Evaluation of the electrochemical behaviour of the α-Co(OH)2 nano-sheets revealed that they are capable of delivering a specific capacitance of 1122 F g-1 at a discharge load of 3 A g-1 with capacitance retention of 84% after 4000 continuous discharge cycles, suggesting the nano-sheets as promising candidates for use in electrochemical supercapacitors. Further, the method used for the preparation of the compounds has the capability of being scaled up.
NASA Astrophysics Data System (ADS)
Shi, Mingjie; Cui, Mangwei; Kang, Litao; Li, Taotao; Yun, Shan; Du, Jing; Xu, Shoudong; Liu, Ying
2018-01-01
For supercapacitors, pores in electrode materials can accelerate chemical reaction kinetics by shortening ion diffusion distances and by enlarging electrolyte/electrode interfaces. This article describes a simple one-step route for the preparation of pure-phase porous Ni3(NO3)2(OH)4 nano-sheets by directly heating a mild Ni(NO3)2 and urea solution. During heating, urea decomposed into NH3·H2O, which provided a suitably alkaline environment for the formation of Ni3(NO3)2(OH)4 nano-sheets. Meanwhile, the side product, NH4NO3, acted as a pore-forming agent and created numerous pores. After NH4NO3 removal, the specific surface areas and pore volumes of the products were boosted by ∼180 times (from 0.61 to 113.12 m²/g) and ∼90 times (from 3.40 × 10⁻³ to 3.17 × 10⁻¹ cm³/g), respectively. As a supercapacitor cathode material, the porous Ni3(NO3)2(OH)4 nano-sheets exhibited a high specific capacitance of 1094 F/g at an ultrahigh mass loading of 17.55 mg/cm², leading to an impressive areal capacitance of 19.2 F/cm². Furthermore, a Ni3(NO3)2(OH)4 nano-sheet//commercial activated carbon asymmetric supercapacitor was constructed and delivered an energy density of 33.2 Wh/kg at a power density of 190.5 W/kg, based on the mass of active materials on both electrodes.
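The reported areal figure follows directly from the specific capacitance and the mass loading, which makes a useful consistency check (my arithmetic, not the paper's):

```latex
% Areal capacitance = specific capacitance x mass loading:
\[
C_{\mathrm{areal}} = C_{\mathrm{sp}}\, m_{\mathrm{load}}
 = 1094\ \mathrm{F\,g^{-1}} \times 17.55\ \mathrm{mg\,cm^{-2}}
 = 1094 \times 0.01755\ \mathrm{F\,cm^{-2}} \approx 19.2\ \mathrm{F\,cm^{-2}}.
\]
```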
NASA Astrophysics Data System (ADS)
Qiu, Zenghui; He, Dawei; Wang, Yongsheng; Li, Jiayuan
2017-09-01
A simple cobalt-catalyzed gasification strategy is used to synthesize drilled graphene sheets (DGNs), and a 3D DGNs hydrogel is prepared at a relatively low temperature. Because the mesoporous hydrogel structure increases charge-transfer efficiency by providing ionic pathways into the overlapped regions of the DGNs hydrogel, and because the hole density can be controlled, the resulting DGNs hydrogel electrode provides excellent rate capability, with an ultrahigh specific capacitance of 264.1 F g-1 at 1 A g-1 compared to 187.8 F g-1 for a graphene sheet (GNs) electrode. The DGNs hydrogel expands the design space for developing high-performance energy storage devices.
Characterization of the mechanical and physical properties of TD-NiCr (Ni-20Cr-2ThO2) alloy sheet
NASA Technical Reports Server (NTRS)
Fritz, L. J.; Koster, W. P.; Taylor, R. E.
1973-01-01
Sheets of TD-NiCr processed using techniques developed to produce uniform material were tested to supply mechanical and physical property data. Two heats each of 0.025 and 0.051 cm thick sheet were tested. Mechanical properties evaluated included tensile strength, modulus of elasticity, Poisson's ratio, compressive strength, creep-rupture and creep strength, bearing strength, shear strength, sharp-notch strength, and fatigue strength. Test temperatures covered the range from ambient to 1589 K. Physical properties were also studied as a function of temperature. The physical properties measured were thermal conductivity, linear thermal expansion, specific heat, total hemispherical emittance, thermal diffusivity, and electrical conductivity.
Megias, Daniel; Phillips, Mark; Clifton-Hadley, Laura; Harron, Elizabeth; Eaton, David J; Sanghera, Paul; Whitfield, Gillian
2017-03-01
The HIPPO trial is a UK randomized Phase II trial of hippocampal sparing (HS) vs conventional whole-brain radiotherapy after surgical resection or radiosurgery in patients with a favourable prognosis and 1-4 brain metastases. Each participating centre completed a planning benchmark case as part of the dedicated radiotherapy trials quality assurance programme (RTQA), promoting the safe and effective delivery of HS intensity-modulated radiotherapy (IMRT) in a multicentre trial setting. Submitted planning benchmark cases were reviewed using visualization for radiotherapy software (VODCA), evaluating plan quality and compliance with the HIPPO radiotherapy planning and delivery guidelines. Comparison of the planning benchmark data highlighted a plan specified using dose to medium as an outlier by comparison with those specified using dose to water. Further evaluation identified that the reported plan statistics for dose to medium were lower because the dose calculated at regions of the PTV inclusive of the bony cranium is lower relative to brain. Specification of dose to water or medium remains a source of potential ambiguity, and it is essential that, as part of a multicentre trial, consideration is given to reported differences, particularly in the presence of bone. Evaluation of planning benchmark data as part of an RTQA programme has highlighted an important feature of HS IMRT dosimetry dependent on dose being specified to water or medium, informing the development and undertaking of HS IMRT as part of the HIPPO trial. Advances in knowledge: The potential clinical impact of differences between dose to medium and dose to water is demonstrated for the first time, in the setting of HS whole-brain radiotherapy.
Cleanliness audit of clinical surfaces and equipment: who cleans what?
Anderson, R E; Young, V; Stewart, M; Robertson, C; Dancer, S J
2011-07-01
Current guidelines recommend regular cleaning of clinical equipment. We monitored items on a surgical ward for predominant user, hand-touch frequency, cleaning responsibilities, and measurement of organic soil. Equipment was assessed in triplicate against a cleanliness benchmark of 100 relative light units (RLU) using the Hygiena® ATP system. There were 44 items, of which 21 were cleaned by clinical support workers (CSWs), five by domestic staff, three by nurses, and three by doctors, with 12 having no designated cleaning responsibility. Geometric mean RLU ranged from 60 to 550/100 cm² for small items such as hand-gel containers, bed controls, blood pressure cuffs, and clinical notes, with similar values of 80-540/100 cm² RLU for larger items such as the electrocardiogram machine, defibrillator, trolleys, and tables. The overall geometric mean was 249/100 cm² RLU for all surfaces, with 84% (37 of 44) of items exceeding the 100 RLU benchmark. Of 27 items cleaned by clinical staff, 24 (89%) failed the benchmark. Of 12 sites with no cleaning specification, 11 (92%) failed the benchmark. Three of seven 'clean' sites (<100/100 cm² RLU) were cleaned by domestic staff. Average log₁₀ RLU of surfaces cleaned by domestics was 64% lower compared with surfaces cleaned by CSWs (95% confidence interval: 35%, 80%; P=0.019). In conclusion, clinical equipment frequently demonstrates high levels of organic soil, whether or not items have assigned cleaning responsibility. These findings suggest that cleaning practices for clinical equipment may require review, along with education of staff with specific cleaning responsibilities.
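The audit arithmetic is simple to reproduce: triplicate ATP readings are combined as a geometric mean (i.e., averaged on the log scale) and compared against the 100 RLU benchmark. The readings in this sketch are invented for illustration.

```python
# Sketch of the cleanliness-audit arithmetic; the triplicate readings are
# invented, and 100 RLU/100 cm^2 is the benchmark quoted in the study.
import math

def geometric_mean(readings):
    return math.exp(sum(math.log(r) for r in readings) / len(readings))

triplicate = [180, 240, 310]          # hypothetical RLU values for one item
gm = geometric_mean(triplicate)
print(f"geometric mean = {gm:.0f} RLU -> {'FAIL' if gm > 100 else 'PASS'}")

# "64% lower" means domestic-cleaned surfaces read about 0.36 times the
# CSW-cleaned level:
print(f"domestic/CSW ratio = {1 - 0.64:.2f}")
```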
Present Status and Extensions of the Monte Carlo Performance Benchmark
NASA Astrophysics Data System (ADS)
Hoogenboom, J. Eduard; Petrovic, Bojan; Martin, William R.
2014-06-01
The NEA Monte Carlo Performance benchmark started in 2011 with the aim of monitoring, over the years, the ability to perform a full-size Monte Carlo reactor core calculation with detailed power production for each fuel pin, with axial distribution. This paper gives an overview of the results contributed thus far. It shows that reaching a statistical accuracy of 1% for most of the small fuel zones requires about 100 billion neutron histories. The efficiency of parallel execution of Monte Carlo codes on a large number of processor cores shows clear limitations for computer clusters with common compute nodes. However, on true supercomputers the speedup of parallel calculations continues to increase up to large numbers of processor cores. More experience is needed with calculations on true supercomputers using large numbers of processors in order to predict whether the requested calculations can be done in a short time. As the specifications of the reactor geometry for this benchmark test are well suited for further investigations of full-core Monte Carlo calculations, and a need is felt for testing issues other than computational performance, proposals are presented for extending the benchmark to a suite of benchmark problems: for evaluating fission source convergence for a system with a high dominance ratio, for coupling with thermal-hydraulics calculations to evaluate the use of different temperatures and coolant densities, and for studying the correctness and effectiveness of burnup calculations. Moreover, other contemporary proposals for a full-core calculation with realistic geometry and material composition are discussed.
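The quoted figure of roughly 100 billion histories is consistent with the usual Monte Carlo error scaling, in which the relative error of a tally falls as the inverse square root of the number of contributing histories. With illustrative numbers (the scoring fraction p is my assumption, not a figure from the benchmark):

```latex
% Monte Carlo statistical error scaling for a small tally region:
\[
\frac{\sigma}{\mu} \;\propto\; \frac{1}{\sqrt{pN}}, \qquad
p \approx 10^{-7},\; N = 10^{11}
\;\Rightarrow\; \frac{\sigma}{\mu} \sim \frac{1}{\sqrt{10^{4}}} = 1\%.
\]
```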
Automated Rapid Prototyping of 3D Ceramic Parts
NASA Technical Reports Server (NTRS)
McMillin, Scott G.; Griffin, Eugene A.; Griffin, Curtis W.; Coles, Peter W. H.; Engle, James D.
2005-01-01
An automated system of manufacturing equipment produces three-dimensional (3D) ceramic parts specified by computational models of the parts. The system implements an advanced, automated version of a generic rapid-prototyping process in which the fabrication of an object having a possibly complex 3D shape includes stacking of thin sheets, the outlines of which closely approximate the horizontal cross sections of the object at their respective heights. In this process, the thin sheets are made of a ceramic precursor material, and the stack is subsequently heated to transform it into a unitary ceramic object. In addition to the computer used to generate the computational model of the part to be fabricated, the equipment used in this process includes: 1) A commercially available laminated-object-manufacturing machine that was originally designed for building woodlike 3D objects from paper and was modified to accept sheets of ceramic precursor material, and 2) A machine designed specifically to feed single sheets of ceramic precursor material to the laminated-object-manufacturing machine. Like other rapid-prototyping processes that utilize stacking of thin sheets, this process begins with generation of the computational model of the part to be fabricated, followed by computational sectioning of the part into layers of predetermined thickness that collectively define the shape of the part. Information about each layer is transmitted to rapid-prototyping equipment, where the part is built layer by layer. What distinguishes this process from other rapid-prototyping processes that utilize stacking of thin sheets are the details of the machines and the actions that they perform. In this process, flexible sheets of ceramic precursor material (called "green" ceramic sheets) suitable for lamination are produced by tape casting. The binder used in the tape casting is specially formulated to enable lamination of layers with little or no applied heat or pressure. The tape is cut into individual sheets, which are stacked in the sheet-feeding machine until used. The sheet-feeding machine can hold enough sheets for about 8 hours of continuous operation.
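The computational sectioning step described above reduces, at its simplest, to choosing the stack of cutting planes for a part of height H built from sheets of thickness t; the sketch below shows that arithmetic with illustrative numbers.

```python
# Minimal sketch of computational sectioning: cutting-plane heights for a
# part of given height built from fixed-thickness sheets. Numbers invented.
import math

def layer_heights(part_height_mm, sheet_thickness_mm):
    n_layers = math.ceil(part_height_mm / sheet_thickness_mm)
    # take each sheet's outline at the mid-height of its layer, clamped
    # to the top of the part for a final partial layer
    return [min((i + 0.5) * sheet_thickness_mm, part_height_mm)
            for i in range(n_layers)]

print(layer_heights(25.0, 6.0))  # 6 mm sheets -> [3.0, 9.0, 15.0, 21.0, 25.0]
```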
NASA Technical Reports Server (NTRS)
Barrett, C. A.; Lowell, C. E.
1975-01-01
Twenty-five commercial nickel-, iron-, and cobalt-base sheet alloys incorporating chromium or chromium and aluminum additions for oxidation resistance were tested at 1150 C in air for 100 hr in both isothermal and 1-hr cyclic furnace exposures. The alloys were evaluated by sample specific weight change, by type of scale formed, by amount and type of spall, and by sample thickness change and microstructure.
EBIC Characterization and Hydrogen Passivation in Silicon Sheet
NASA Technical Reports Server (NTRS)
Hanoka, J. I.
1985-01-01
As a general qualitative tool, the electron beam induced current (EBIC) method can be very useful in imaging recombination in silicon sheet used for solar cells. Work using EBIC on EFG silicon ribbon is described. In particular, efforts at making the technique more quantitative and hence more useful, limitations of the method, and the specific application to hydrogen passivation are treated. Some brief remarks are made regarding the technique itself.
1980-09-01
…application of that specification. [The remainder of this entry is an illegible scanned report documentation page; legible list-of-figures entries include 'Sample data sheet for use in user analysis' and 'Sample data sheet G for use in user analysis'.]
Inverted light-sheet microscope for imaging mouse pre-implantation development.
Strnad, Petr; Gunther, Stefan; Reichmann, Judith; Krzic, Uros; Balazs, Balint; de Medeiros, Gustavo; Norlin, Nils; Hiiragi, Takashi; Hufnagel, Lars; Ellenberg, Jan
2016-02-01
Despite its importance for understanding human infertility and congenital diseases, early mammalian development has remained inaccessible to in toto imaging. We developed an inverted light-sheet microscope that enabled us to image mouse embryos from zygote to blastocyst, computationally track all cells and reconstruct a complete lineage tree of mouse pre-implantation development. We used this unique data set to show that the first cell fate specification occurs at the 16-cell stage.
Computational and Experimental Studies on β-Sheet Breakers Targeting Aβ1–40 Fibrils
Minicozzi, Velia; Chiaraluce, Roberta; Consalvi, Valerio; Giordano, Cesare; Narcisi, Claudia; Punzi, Pasqualina; Rossi, Giancarlo C.; Morante, Silvia
2014-01-01
In this work we present and compare the results of extensive molecular dynamics simulations of model systems comprising an Aβ1–40 peptide in water interacting with short peptides (β-sheet breakers) that mimic the 17–21 region of the Aβ1–40 sequence. Various systems differing in the customized β-sheet breaker structure have been studied. Specifically, we have considered three kinds of β-sheet breakers, namely Ac-LPFFD-NH2 and two variants thereof, one obtained by substituting the acetyl group with the sulfonic amino acid taurine (Tau-LPFFD-NH2) and a second, novel one in which the aspartic acid is substituted by an asparagine (Ac-LPFFN-NH2). Thioflavin T fluorescence, circular dichroism, and mass spectrometry experiments have been performed, indicating that β-sheet breakers are able to inhibit fibril formation in vitro and prevent the β-sheet folding of portions of the Aβ1–40 peptide. We show that molecular dynamics simulations and far-UV circular dichroism provide consistent evidence that the new Ac-LPFFN-NH2 β-sheet breaker is more effective than the other two in stabilizing the native α-helix structure of Aβ1–40. In agreement with these results, thioflavin T fluorescence experiments confirm its higher efficiency in inhibiting Aβ1–40 aggregation. Furthermore, mass spectrometry data and molecular dynamics simulations consistently identified the 17–21 portion of Aβ1–40 as the location of the interaction region between the peptide and the Ac-LPFFN-NH2 β-sheet breaker. PMID:24584938
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cohen, J; Dossa, D; Gokhale, M
Critical data science applications requiring frequent access to storage perform poorly on today's computing architectures. This project addresses efficient computation of data-intensive problems in national security and basic science by exploring, advancing, and applying a new form of computing called storage-intensive supercomputing (SISC). Our goal is to enable applications that simply cannot run on current systems and, for a broad range of data-intensive problems, to deliver an order of magnitude improvement in price/performance over today's data-intensive architectures. This technical report documents much of the work done under LDRD 07-ERD-063, Storage Intensive Supercomputing, during the period 05/07-09/07. The following chapters describe: (1) a new file I/O monitoring tool, iotrace, developed to capture the dynamic I/O profiles of Linux processes; (2) an out-of-core graph benchmark for level-set expansion of scale-free graphs; (3) an entity extraction benchmark consisting of a pipeline of eight components; and (4) an image resampling benchmark drawn from the SWarp program in the LSST data processing pipeline. The performance of the graph and entity extraction benchmarks was measured in three different scenarios: data sets residing on the NFS file server and accessed over the network; data sets stored on local disk; and data sets stored on the Fusion I/O parallel NAND Flash array. The image resampling benchmark compared software-only performance with GPU-accelerated performance. In addition to the work reported here, a text processing application was developed that used an FPGA to accelerate n-gram profiling for language classification. The n-gram application will be presented at SC07 at the High Performance Reconfigurable Computing Technologies and Applications Workshop. The graph and entity extraction benchmarks were run on a Supermicro server housing the 40 GB NAND Flash parallel disk array, the Fusion-io. The Fusion system specs are as follows: SuperMicro X7DBE Xeon dual-socket Blackford server motherboard; two Intel Xeon dual-core 2.66 GHz processors; 1 GB DDR2 PC2-5300 RAM (2 x 512); 80 GB hard drive (Seagate SATA II Barracuda). The Fusion board is presently capable of 4X in a PCIe slot. The image resampling benchmark was run on a dual Xeon workstation with an NVIDIA graphics card (see Chapter 5 for the full specification). An XtremeData Opteron+FPGA was used for the language classification application. We observed that these benchmarks are not uniformly I/O intensive. The only benchmark that showed greater than 50% of its time in I/O was the graph algorithm when it accessed data files over NFS. When local disk was used, the graph benchmark spent at most 40% of its time in I/O. The other benchmarks were CPU dominated. The image resampling and language classification benchmarks showed order-of-magnitude speedups over software by using co-processor technology to offload the CPU-intensive kernels. Our experiments to date suggest that emerging hardware technologies offer significant benefit to boosting the performance of data-intensive algorithms. Using GPU and FPGA co-processors, we were able to improve performance by more than an order of magnitude on the benchmark algorithms, eliminating the processor bottleneck of CPU-bound tasks. Experiments with a prototype solid-state nonvolatile memory available today show 10X better throughput on random reads than disk, with a 2X speedup on a graph processing benchmark when compared to the use of local SATA disk.
Armen, Roger S.; DeMarco, Mari L.; Alonso, Darwin O. V.; Daggett, Valerie
2004-01-01
Transthyretin, β2-microglobulin, lysozyme, and the prion protein are four of the best-characterized proteins implicated in amyloid disease. Upon partial acid denaturation, these proteins undergo conformational change into an amyloidogenic intermediate that can self-assemble into amyloid fibrils. Many experiments have shown that pH-mediated changes in structure are required for the formation of the amyloidogeneic intermediate, but it has proved impossible to characterize these conformational changes at high resolution using experimental means. To probe these conformational changes at atomic resolution, we have performed molecular dynamics simulations of these proteins at neutral and low pH. In low-pH simulations of all four proteins, we observe the formation of α-pleated sheet secondary structure, which was first proposed by L. Pauling and R. B. Corey [(1951) Proc. Natl. Acad. Sci. USA 37, 251–256]. In all β-sheet proteins, transthyretin and β2-microglobulin, α-pleated sheet structure formed over the strands that are highly protected in hydrogen-exchange experiments probing amyloidogenic conditions. In lysozyme and the prion protein, α-sheets formed in the specific regions of the protein implicated in the amyloidogenic conversion. We propose that the formation of α-pleated sheet structure may be a common conformational transition in amyloidosis. PMID:15280548
ComprehensiveBench: a Benchmark for the Extensive Evaluation of Global Scheduling Algorithms
NASA Astrophysics Data System (ADS)
Pilla, Laércio L.; Bozzetti, Tiago C.; Castro, Márcio; Navaux, Philippe O. A.; Méhaut, Jean-François
2015-10-01
Parallel applications that present tasks with imbalanced loads or complex communication behavior usually do not exploit the underlying resources of parallel platforms to their full potential. In order to mitigate this issue, global scheduling algorithms are employed. As finding the optimal task distribution is an NP-hard problem, identifying the most suitable algorithm for a specific scenario and comparing algorithms are not trivial tasks. In this context, this paper presents ComprehensiveBench, a benchmark for global scheduling algorithms that enables the variation of a vast range of parameters that affect performance. ComprehensiveBench can be used to assist in the development and evaluation of new scheduling algorithms, to help choose a specific algorithm for an arbitrary application, to emulate other applications, and to enable statistical tests. We illustrate its use in this paper with an evaluation of Charm++ periodic load balancers that stresses their characteristics.
Experimental Criticality Benchmarks for SNAP 10A/2 Reactor Cores
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krass, A.W.
2005-12-19
This report describes computational benchmark models for nuclear criticality derived from descriptions of the Systems for Nuclear Auxiliary Power (SNAP) Critical Assembly (SCA)-4B experimental criticality program conducted by Atomics International during the early 1960s. The selected experimental configurations consist of fueled SNAP 10A/2-type reactor cores subject to varied conditions of water immersion and reflection under experimental control to measure neutron multiplication. SNAP 10A/2-type reactor cores are compact volumes fueled and moderated with the hydride of highly enriched uranium-zirconium alloy. Specifications for the materials and geometry needed to describe a given experimental configuration for a model using MCNP5 are provided. The material and geometry specifications are adequate to permit user development of input for alternative nuclear safety codes, such as KENO. A total of 73 distinct experimental configurations are described.
NASA Astrophysics Data System (ADS)
Zhao, Shan
Zinc has begun to be studied as a bio-degradable material in recent years due to its excellent corrosion rate and optimal biocompatibility. Unfortunately, pure Zn's intrinsic ultimate tensile strength (UTS; below 120 MPa) is lower than the benchmark (about 300 MPa) for cardiovascular stent materials, raising concerns about sufficient strength to support the blood vessel. Thus, modifying pure Zn to improve its mechanical properties is an important research topic. In this dissertation project, a new Zn-Li alloy has been developed that retains the outstanding corrosion behavior of Zn while improving its mechanical characteristics and providing uniform biodegradation once implanted into the artery of Sprague-Dawley rats. The completed work includes:
• Manufactured Zn-Li alloy ingots and sheets via induction vacuum casting, melt spinning, hot-rolling deformation, and wire electro-discharge machining (wire EDM);
• Processed alloy samples using cross-sectioning, mounting, etching, and polishing techniques;
• Characterized the alloy ingots, sheets, and wires using hardness and tensile tests, XRD, BEI imaging, SEM, ESEM, FTIR, ICP-OES, and electrochemical tests, then selected the optimum composition for the in vitro and in vivo experiments;
• Mimicked the degradation behavior of the Zn-Li alloy in vitro using simulated body fluid (SBF) and explored the relations between corrosion rate, corrosion products, and surface morphology with changing composition;
• Implanted the Zn-Li alloy wire in the abdominal aorta of rats for up to 12 months and studied its degradation mechanism, rate of bioabsorption, cytotoxicity, and corrosion-product migration through histological analysis.
Unifying Principles of the Reductive Covalent Graphene Functionalization.
Abellán, Gonzalo; Schirowski, Milan; Edelthalhammer, Konstantin F; Fickert, Michael; Werbach, Katharina; Peterlik, Herwig; Hauke, Frank; Hirsch, Andreas
2017-04-12
Covalently functionalized graphene derivatives were synthesized via benchmark reductive routes using graphite intercalation compounds (GICs), in particular KC₈. We have compared the graphene arylation and alkylation of the GIC using 4-tert-butylphenyldiazonium and bis(4-(tert-butyl)phenyl)iodonium salts, as well as phenyl iodide, n-hexyl iodide, and n-dodecyl iodide, as electrophiles in model reactions. We have put a particular focus on the evaluation of the degree of addition and the bulk functionalization homogeneity (H_bulk). For this purpose, we have employed statistical Raman spectroscopy (SRS) and a forefront characterization tool using thermogravimetric analysis coupled with FT-IR, gas chromatography, and mass spectrometry (TGA/FT-IR/GC/MS). The present study unambiguously shows that graphene functionalization using alkyl iodides leads to the best results, in terms of both the degree of addition and the H_bulk. Moreover, we have identified the reversible character of the covalent addition chemistry, even at temperatures below 200 °C. The thermally induced addend cleavage proceeds homolytically, which allows for the detection of dimeric cleavage products by TGA/FT-IR/GC/MS. This dimerization points to a certain degree of regioselectivity, leading to a low sheet homogeneity (H_sheet). Finally, we extended this concept by performing the reductive alkylation reaction on monolayer CVD graphene films. This work provides important insights into the understanding of basic principles of reductive graphene functionalization and will serve as a guide in the design of new graphene functionalization concepts.
Lee, Hae-Min; Lee, Kangtaek; Kim, Chang-Koo
2014-01-09
Manganese-nickel (Mn-Ni) oxide films were electrodeposited on a graphite sheet in a bath consisting of manganese acetate and nickel chloride, and the structural, morphological, and electrochemical properties of these films were investigated. The electrodeposited Mn-Ni oxide films had porous structures covered with nanofibers. The X-ray diffraction pattern revealed the presence of separate manganese oxide (γ-MnO₂) and nickel oxide (NiO) phases in the films. The electrodeposited Mn-Ni oxide electrode exhibited a specific capacitance of 424 F/g in Na₂SO₄ electrolyte. This electrode maintained 86% of its initial specific capacitance over 2000 charge-discharge cycles, showing good cycling stability.
Undergraduate nursing students' perceptions regarding factors that affect math abilities
NASA Astrophysics Data System (ADS)
Pyo, Katrina A.
2011-07-01
A review of the nursing literature reveals that many undergraduate nursing students lack proficiency with the basic mathematical skills necessary for safe medication preparation and administration. Few studies exploring the phenomenon from the undergraduate nursing student perspective are reported in the nursing literature. The purpose of this study was to explore undergraduate nursing students' perceptions of math abilities, factors that affect math abilities, the use of math in nursing, and the extent to which specific math skills were addressed throughout a nursing curriculum. Polya's Model for Problem Solving and Bloom's Taxonomy of Educational Objectives (Affective Domain) served as the theoretical background for the study. Qualitative and quantitative methods were utilized to obtain data from a purposive sample of undergraduate nursing students from a private university in western Pennsylvania. Participants were selected based on their proficiency level with math skills, as determined by their score on the Elsevier HESI™ Admission Assessment (A2) Exam, Math Portion. Ten students from the "Excellent" benchmark group and eleven students from the "Needing Additional Assistance or Improvement" benchmark group participated in one-on-one, semi-structured interviews and completed a 25-item, 4-point Likert scale survey that rated confidence levels with specific math skills and the extent to which these skills were perceived to be addressed in the nursing curriculum. Responses from the two benchmark groups were compared and contrasted. Eight themes emerged from the qualitative data. Findings related to mathematical approach and confidence levels with specific math skills were determined to be statistically significant.
Finite Element Modeling of the World Federation's Second MFL Benchmark Problem
NASA Astrophysics Data System (ADS)
Zeng, Zhiwei; Tian, Yong; Udpa, Satish; Udpa, Lalita
2004-02-01
This paper presents results obtained by simulating the second magnetic flux leakage benchmark problem proposed by the World Federation of NDE Centers. The geometry consists of notches machined on the internal and external surfaces of a rotating steel pipe that is placed between two yokes that are part of a magnetic circuit energized by an electromagnet. The model calculates the radial component of the leaked field at specific positions. The nonlinear material property of the ferromagnetic pipe is taken into account in simulating the problem. The velocity effect caused by the rotation of the pipe is, however, ignored for reasons of simplicity.
A CPU benchmark for protein crystallographic refinement.
Bourne, P E; Hendrickson, W A
1990-01-01
The CPU time required to complete a cycle of restrained least-squares refinement of a protein structure from X-ray crystallographic data using the FORTRAN codes PROTIN and PROLSQ is reported for 48 different processors, ranging from single-user workstations to supercomputers. Sequential, vector, VLIW, multiprocessor, and RISC hardware architectures are compared using both a small and a large protein structure. Representative compile times for each hardware type are also given, and the improvement in run time when coding for a specific hardware architecture is considered. The benchmarks involve scalar integer and vector floating-point arithmetic and are representative of the calculations performed in many scientific disciplines.
Nair, Pradeep S; John, Eugene B
2007-01-01
Aligning specific sequences against a very large number of other sequences is a central aspect of bioinformatics. With the widespread availability of personal computers in biology laboratories, sequence alignment is now often performed locally. This makes it necessary to analyse the performance of personal computers on sequence-alignment bioinformatics benchmarks. In this paper, we analyse the performance of a personal computer for the popular BLAST and FASTA sequence alignment suites. Results indicate that these benchmarks have a large number of recurring operations and use memory operations extensively. It appears that performance could be improved with a larger L1 cache.
Fitzpatrick, Christopher; Fleming, Fiona M.; Madin-Warburton, Matthew; Schneider, Timm; Meheus, Filip; Asiedu, Kingsley; Solomon, Anthony W.; Montresor, Antonio; Biswas, Gautam
2016-01-01
Background Advocacy around mass treatment for the elimination of selected Neglected Tropical Diseases (NTDs) has typically put the cost per person treated at less than US$ 0.50. Whilst useful for advocacy, the focus on a single number misrepresents the complexity of delivering “free” donated medicines to about a billion people across the world. We perform a literature review and meta-regression of the cost per person per round of mass treatment against NTDs. We develop a web-based software application (https://healthy.shinyapps.io/benchmark/) to calculate setting-specific unit costs against which programme budgets and expenditures or results-based pay-outs can be benchmarked. Methods We reviewed costing studies of mass treatment for the control, elimination or eradication of lymphatic filariasis, schistosomiasis, soil-transmitted helminthiasis, onchocerciasis, trachoma and yaws. These are the main 6 NTDs for which mass treatment is recommended. We extracted financial and economic unit costs, adjusted to a standard definition and base year. We regressed unit costs on the number of people treated and other explanatory variables. Regression results were used to “predict” country-specific unit cost benchmarks. Results We reviewed 56 costing studies and included in the meta-regression 34 studies from 23 countries and 91 sites. Unit costs were found to be very sensitive to economies of scale, and the decision of whether or not to use local volunteers. Financial unit costs are expected to be less than 2015 US$ 0.50 in most countries for programmes that treat 100 thousand people or more. However, for smaller programmes, including those in the “last mile”, or those that cannot rely on local volunteers, both economic and financial unit costs are expected to be higher. Discussion The available evidence confirms that mass treatment offers a low cost public health intervention on the path towards universal health coverage. However, more costing studies focussed on elimination are needed. Unit cost benchmarks can help in monitoring value for money in programme plans, budgets and accounts, or in setting a reasonable pay-out for results-based financing mechanisms. PMID:27918573
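As a minimal sketch of the meta-regression step (all numbers and coefficients below are invented placeholders, not the study's data), one can regress log unit cost on log programme size and a volunteer dummy and then predict a setting-specific benchmark:

```python
import numpy as np

# Hypothetical observations: people treated per round, volunteer use, US$ cost per person.
people = np.array([5e3, 2e4, 1e5, 5e5, 2e6])
volunteers = np.array([0.0, 0.0, 1.0, 1.0, 1.0])
unit_cost = np.array([1.80, 1.10, 0.45, 0.30, 0.18])

# Ordinary least squares on the log scale captures economies of scale.
X = np.column_stack([np.ones_like(people), np.log(people), volunteers])
beta, *_ = np.linalg.lstsq(X, np.log(unit_cost), rcond=None)

def benchmark_unit_cost(n_treated, uses_volunteers):
    """Predicted US$ per person per round for a given setting."""
    x = np.array([1.0, np.log(n_treated), float(uses_volunteers)])
    return float(np.exp(x @ beta))

print(f"{benchmark_unit_cost(100_000, True):.2f}")  # benchmark for a 100k-person programme
```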
Tremblay, Marlène; Hess, Justin P; Christenson, Brock M; McIntyre, Kolby K; Smink, Ben; van der Kamp, Arjen J; de Jong, Lisanne G; Döpfer, Dörte
2016-07-01
Automatic milking systems (AMS) are implemented in a variety of situations and environments. Consequently, there is a need to characterize individual farming practices and regional challenges to streamline management advice and objectives for producers. Benchmarking is often used in the dairy industry to compare farms by computing percentile ranks of the production values of groups of farms. Grouping for conventional benchmarking is commonly limited to the use of a few factors such as farms' geographic region or breed of cattle. We hypothesized that herds' production data and management information could be clustered in a meaningful way using cluster analysis and that this clustering approach would yield better peer groups of farms than benchmarking methods based on criteria such as country, region, breed, or breed and region. By applying mixed latent-class model-based cluster analysis to 529 North American AMS dairy farms with respect to 18 significant risk factors, 6 clusters were identified. Each cluster (i.e., peer group) represented unique management styles, challenges, and production patterns. When compared with peer groups based on criteria similar to the conventional benchmarking standards, the 6 clusters better predicted milk produced (kilograms) per robot per day. Each cluster represented a unique management and production pattern that requires specialized advice. For example, cluster 1 farms were those that recently installed AMS robots, whereas cluster 3 farms (the most northern farms) fed high amounts of concentrates through the robot to compensate for low-energy feed in the bunk. In addition to general recommendations for farms within a cluster, individual farms can generate their own specific goals by comparing themselves to farms within their cluster. This is very comparable to benchmarking but adds the specific characteristics of the peer group, resulting in better farm management advice. The improvement offered by cluster analysis lies in its multivariable approach and in the fact that comparisons between production units can be made both within a cluster and between clusters.
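The ranking step can be illustrated with a short sketch (hypothetical data; the paper's clustering itself used mixed latent-class models, and the labels here are simply assumed given): a farm's percentile rank is computed against all farms and against its own cluster.

```python
import numpy as np

# Hypothetical milk yield (kg per robot per day) and cluster labels for 12 farms.
milk = np.array([1450., 1520., 1610., 1380., 1700., 1655.,
                 980., 1020., 1110., 940., 1060., 1005.])
cluster = np.array([0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1])

def percentile_rank(value, pool):
    """Percentage of the pool at or below `value`."""
    return 100.0 * np.mean(pool <= value)

farm = 7  # index of the farm being benchmarked
peers = milk[cluster == cluster[farm]]
print(f"vs all farms:   {percentile_rank(milk[farm], milk):.0f}th percentile")
print(f"vs own cluster: {percentile_rank(milk[farm], peers):.0f}th percentile")
```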
Simulations of flow induced structural transition of the β-switch region of glycoprotein Ibα.
Han, Mengzhi; Xu, Ji; Ren, Ying; Li, Jinghai
2016-02-01
Binding of glycoprotein Ibα to von Willebrand factor induces platelet adhesion to injured vessel walls and initiates a multistep hemostatic process. It has been hypothesized that flow conditions can induce a loop-to-β-sheet conformational change in the β-switch region of glycoprotein Ibα, which regulates its binding to von Willebrand factor and facilitates blood clot formation and wound healing. In this work, direct molecular dynamics (MD), flow MD, and metadynamics were employed to investigate the mechanisms of this flow-induced conformational transition. Specifically, the free energy landscape of the whole transition process was mapped by metadynamics with the path collective variable approach. The results reveal that without flow, the free energy landscape has two main basins: a random-loop basin stabilized by large conformational entropy and a partially folded β-sheet basin. The free energy barrier separating these two basins is relatively high, and the β-switch cannot fold spontaneously from the loop to the β-sheet state. The fully formed β-sheet conformations are located in high free energy regions; they are unstable and gradually unfold into the partially folded β-sheet state under flow. Relatively weak flow can trigger some folding of the β-switch but cannot fold it into the fully formed β-sheet state. Under strong flow conditions, the β-switch readily overcomes the high free energy barrier and folds into the fully formed β-sheet state.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-11
Documents, including an errata sheet, are available in the Agencywide Documents Access and Management System (ADAMS). The notice is intended to facilitate expedited approval of plant-specific adoption of TSTF-493, Revision 4.
Ledogar, Robert J; Fleming, John; Andersson, Neil
2009-10-14
In preparation for a cluster-randomized controlled trial of a community intervention to increase the demand for measles vaccination in Lasbela district of Pakistan, a balance sheet summarized published evidence on benefits and possible adverse effects of measles vaccination. The balance sheet listed: 1) major health conditions associated with measles; 2) the risk among the unvaccinated who contract measles; 3) the risk among the vaccinated; 4) the risk difference between vaccinated and unvaccinated; and 5) the likely net gain from vaccination for each condition. Two models revealed very different projections of net gain from measles vaccine. A Lasbela-specific combination of low period prevalence of measles among the unvaccinated, medium vaccination coverage and low vaccine efficacy rate, as revealed by the baseline survey, resulted in less-than-expected gains attributable to vaccination. Modelled on estimates where the vaccine had greater efficacy, the gains from vaccination would be more substantial. Specific local conditions probably explain the low rates among the unvaccinated while the high vaccine failure rate is likely due to weaknesses in the vaccination delivery system. Community perception of these realities may have had some role in household decisions about whether to vaccinate, although the major discouraging factor was inadequate access. The balance sheet may be useful as a communication tool in other circumstances, applied to up-to-date local evidence.
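A minimal numeric sketch of the balance-sheet arithmetic (all rates below are invented placeholders, not the Lasbela estimates):

```python
# Items follow the balance-sheet structure described above, for one health condition.
p_measles_unvacc = 0.05   # period prevalence of measles among the unvaccinated
vaccine_efficacy = 0.70   # fraction of vaccinees protected (low efficacy lowers the gain)
risk_if_measles = 0.12    # risk of the condition among those who contract measles

risk_unvaccinated = p_measles_unvacc * risk_if_measles                            # item 2
risk_vaccinated = p_measles_unvacc * (1.0 - vaccine_efficacy) * risk_if_measles   # item 3
risk_difference = risk_unvaccinated - risk_vaccinated                             # item 4

cohort = 10_000
print(f"net gain: {risk_difference * cohort:.0f} cases averted per {cohort:,} vaccinated")  # item 5
```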
Rehman, Wasif Ur; Xu, Youlong; Sun, Xiaofei; Ullah, Inam; Zhang, Yuan; Li, Long
2018-05-30
Volume expansion is a major challenge for tin oxide (SnOₓ) anodes, causing poor cyclability in lithium-ion batteries. Bare tin dioxide (SnO₂), tin dioxide with graphene sheets (SnO₂@GS), and a bouquet-like nanocomposite structure (Mn₂SnO₄@GS) were prepared via a hydrothermal method followed by annealing. The obtained composite presents a bouquet structure containing a manganese and tin oxide nanoparticle network with graphene sheets. Benefiting from this porous nanostructure, in which the graphene sheets provide pathways that enhance electronic conductivity, the uniformly distributed particles offer accelerated reaction kinetics with lithium ions and reduced volume change in the SnO₂ particles during charge-discharge testing. As a consequence, the ternary composite Mn₂SnO₄@GS showed high rate performance and outstanding cyclability as an anode material for lithium-ion batteries. The electrode achieved a specific capacity of about 1070 mA h g⁻¹ at a current density of 400 mA g⁻¹ after 200 cycles; meanwhile, it still delivered about 455 mA h g⁻¹ at a high current density of 2500 mA g⁻¹. The ternary Mn₂SnO₄@GS material could facilitate the fabrication of unique structures and conductive networks for advanced lithium-ion batteries.
Lang, Carrie L; Simon, Diane; Kilgore, Jane
The American College of Surgeons Committee on Trauma revised the Resources for Optimal Care of the Injured Patient to include the criterion that trauma centers participate in a risk-adjusted benchmarking system. The Trauma Quality Improvement Program is currently the risk-adjusted benchmarking program sponsored by the American College of Surgeons, in which all trauma centers will be required to participate in early 2017. Prior to this, there were no risk-adjusted programs for Level III verified trauma centers. The Ohio Society of Trauma Nurse Leaders is a collaborative group made up of trauma program managers, coordinators, and other trauma leaders who meet 6 times a year. Within this group, a Level III Subcommittee was formed, initially to provide a place for the Level III centers to discuss issues specific to them. When the new requirement regarding risk adjustment became official, the subcommittee agreed to begin reporting simple data points, with the idea of moving to risk adjustment in the future.
Equilibrium Partitioning Sediment Benchmarks (ESBs) for the ...
This document describes procedures to determine the concentrations of nonionic organic chemicals in sediment interstitial waters. In previous ESB documents, the general equilibrium partitioning (EqP) approach was chosen for the derivation of sediment benchmarks because it accounts for the varying bioavailability of chemicals in different sediments and allows for the incorporation of the appropriate biological effects concentration. This provides for the derivation of benchmarks that are causally linked to the specific chemical, applicable across sediments, and appropriately protective of benthic organisms. This equilibrium partitioning sediment benchmark (ESB) document was prepared by scientists from the Atlantic Ecology Division, Mid-Continent Ecology Division, and Western Ecology Division, the Office of Water, and private consultants. The document describes procedures to determine the interstitial water concentrations of nonionic organic chemicals in contaminated sediments. Based on these concentrations, guidance is provided on the derivation of toxic units to assess whether the sediments are likely to cause adverse effects to benthic organisms. The equilibrium partitioning (EqP) approach was chosen because it is based on the concentrations of chemical(s) that are known to be harmful and bioavailable in the environment. This document, and five others published over the last nine years, will be useful for the Program Offices, including Superfund.
Evaluation of the Pool Critical Assembly Benchmark with Explicitly-Modeled Geometry using MCNP6
Kulesza, Joel A.; Martz, Roger Lee
2017-03-01
Despite being one of the most widely used benchmarks for qualifying light water reactor (LWR) radiation transport methods and data, no benchmark calculation of the Oak Ridge National Laboratory (ORNL) Pool Critical Assembly (PCA) pressure vessel wall benchmark facility (PVWBF) using MCNP6 with explicitly modeled core geometry exists. As such, this paper provides results for such an analysis. First, a criticality calculation is used to construct the fixed source term. Next, ADVANTG-generated variance reduction parameters are used within the final MCNP6 fixed source calculations. These calculations provide unadjusted dosimetry results using three sets of dosimetry reaction cross sections of varying ages (those packaged with MCNP6, from the IRDF-2002 multi-group library, and from the ACE-formatted IRDFF v1.05 library). These results are then compared to two different sets of measured reaction rates. The comparison agrees within 2% overall and within 5% on a specific reaction and dosimetry-location basis. Except for the neptunium dosimetry, the individual foil raw calculation-to-experiment comparisons usually agree within 10%, though the ratio is typically greater than unity. Finally, in the course of developing these calculations, geometry that has previously not been completely specified is provided herein for the convenience of future analysts.
Estimating erosion in a riverine watershed: Bayou Liberty-Tchefuncta River in Louisiana.
Martin, August; Gunter, James T; Regens, James L
2003-01-01
GOAL, SCOPE, BACKGROUND: Sheet erosion from agricultural, forest, and urban lands may increase stream sediment loads as well as transport other pollutants that adversely affect water quality, reduce agricultural and forest production, and increase infrastructure maintenance costs. This study uses spatial analysis techniques and a numerical modeling approach to predict the areas with the greatest sheet erosion potential under different soil disturbance scenarios. A Geographic Information System (GIS) and the Universal Soil Loss Equation (USLE) were used to estimate sheet erosion from 0.64 ha parcels of land within the watershed. The Soil Survey of St. Tammany Parish, Louisiana was digitized, the required soil attributes were entered into the GIS database, and slope factors were determined for each 80 x 80 meter parcel in the watershed. The GIS/USLE model used series-specific erosion K factors, a rainfall factor of 89, and a GIS database of scenario-driven cropping and erosion control practice factors to estimate potential soil loss due to sheet erosion. A general trend of increased potential sheet erosion occurred for all land use categories (urban, agriculture/grasslands, forests) as soil disturbance increased from cropping, logging, and construction activities. Modeling indicated that rapidly growing urban areas have the greatest potential for sheet erosion. Evergreen and mixed forests (production forest) had lower sheet erosion potentials, with deciduous forests (mostly riparian) having the least. Erosion estimates from construction activities may be overestimated because of the value chosen for the erosion control practice factor. This study illustrates the ease with which GIS can be integrated with the USLE to identify areas with high sheet erosion potential for large-scale management and policy decision making. The GIS/USLE modeling approach used in this study offers a quick and inexpensive tool for estimating sheet erosion within watersheds using publicly available information. This method can quickly identify discrete locations with relatively precise spatial boundaries (approximately 80 meter resolution) that have a high sheet erosion potential, as well as areas where management interventions might be appropriate to prevent or ameliorate erosion.
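A minimal sketch of the per-cell USLE computation described above (A = R · K · LS · C · P); the grids are tiny invented examples, while R = 89 matches the rainfall factor used in the study:

```python
import numpy as np

R = 89.0  # rainfall factor used in the study

# Hypothetical 3x3 patch of 80 x 80 m cells; real values come from the digitized soil survey.
K = np.array([[0.28, 0.28, 0.32],
              [0.28, 0.32, 0.32],
              [0.24, 0.24, 0.28]])   # series-specific soil erodibility
LS = np.array([[0.5, 0.7, 1.2],
               [0.4, 0.6, 1.0],
               [0.3, 0.4, 0.6]])     # slope length-steepness factor
C = np.full((3, 3), 0.45)            # scenario-driven cropping/cover factor
P = np.ones((3, 3))                  # erosion control practice factor (none applied)

A = R * K * LS * C * P               # potential sheet erosion per cell
print(np.round(A, 1))
print("highest-risk cell:", np.unravel_index(A.argmax(), A.shape))
```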
Imaging galectin-3 dependent endocytosis with lattice light-sheet microscopy
NASA Astrophysics Data System (ADS)
Baek, Jongho; Lou, Jieqiong; Coelho, Simao; Lim, Yean Jin; Seidlitz, Silvia; Nicovich, Philip R.; Wunder, Christian; Johannes, Ludger; Gaus, Katharina
2017-04-01
Lattice light-sheet (LLS) microscopy provides ultrathin light sheets from a two-dimensional optical lattice that allow imaging of three-dimensional (3D) objects over hundreds of time points at sub-second intervals and at or below the diffraction limit. Galectin-3 (Gal3), a carbohydrate-binding protein, triggers glycosphingolipid (GSL)-dependent biogenesis of morphologically distinct endocytic vesicles that are cargo specific and clathrin independent. In this study, we apply LLS microscopy to study the dynamics of Gal3-dependent endocytosis in live T cells. This allows us to observe Gal3-mediated endocytosis at high temporal and excellent 3D spatial resolution, which may shed light on our understanding of the mechanism and physiological function of Gal3-induced endocytosis.
Finite element simulation and experimental verification of incremental sheet metal forming
NASA Astrophysics Data System (ADS)
Kaushik Yanamundra, Krishna; Karthikeyan, R., Dr.; Naranje, Vishal, Dr
2018-04-01
Incremental sheet metal forming is now a proven manufacturing technique that can be employed to obtain application-specific, customized, symmetric or asymmetric shapes required by the automobile or biomedical industries for purposes such as car body parts, dental implants, or knee implants. Finite element simulation of the metal forming process is performed successfully using the explicit dynamics analysis of commercial FE software. The simulation is mainly useful for optimization of the process as well as design of the final product. This paper focuses on simulating the incremental sheet metal forming process in ABAQUS and validating the results experimentally. The test shapes are trapezoidal, dome, and elliptical; their G-codes are written and fed into a CNC milling machine fitted with a forming tool that has a hemispherical tip. The same pre-generated coordinates are used to simulate similar forming conditions in ABAQUS, and the tool forces and the stresses and strains in the workpiece during forming are obtained as output data. The forces were recorded experimentally using a dynamometer. The experimental and simulated results were then compared and conclusions drawn.
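As a sketch of how such G-codes can be produced, the snippet below (dimensions, feed rate, and step-down are invented; actual toolpaths would come from CAM software or the authors' pre-generated coordinates) emits circular contours stepped down in z for a dome-shaped part:

```python
import math

R_DOME = 30.0    # mm, spherical-cap radius of the dome (illustrative)
DEPTH = 15.0     # mm, total forming depth
STEP_Z = 0.5     # mm, incremental step-down per contour
N_POINTS = 72    # points per circular contour

lines = ["G21 ; units: mm", "G90 ; absolute positioning"]
z = STEP_Z
while z <= DEPTH + 1e-9:
    r = math.sqrt(max(2.0 * R_DOME * z - z * z, 0.0))  # cap radius at depth z
    for i in range(N_POINTS + 1):
        a = 2.0 * math.pi * i / N_POINTS
        lines.append(f"G1 X{r * math.cos(a):.3f} Y{r * math.sin(a):.3f} Z{-z:.3f} F1000")
    z += STEP_Z

print("\n".join(lines[:6]) + "\n...")  # preview the first few blocks
```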
Ultrashort electromagnetic pulse control of intersubband quantum well transitions.
Paspalakis, Emmanuel; Boviatsis, John
2012-08-23
We study the creation of high-efficiency controlled population transfer in intersubband transitions of semiconductor quantum wells. We give emphasis to the case of interaction of the semiconductor quantum well with electromagnetic pulses with a duration of a few cycles and even a single cycle. We numerically solve the effective nonlinear Bloch equations for a specific double GaAs/AlGaAs quantum well structure, taking into account the ultrashort nature of the applied field, and show that high-efficiency population inversion is possible for specific pulse areas. The dependence of the efficiency of population transfer on the electron sheet density and the carrier envelope phase of the pulse is also explored. For electromagnetic pulses with a duration of several cycles, we find that the change in the electron sheet density leads to a very different response of the population in the two subbands to pulse area. However, for pulses with a duration equal to or shorter than 3 cycles, we show that efficient population transfer between the two subbands is possible, independent of the value of electron sheet density, if the pulse area is π.
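For intuition only: under the much simpler resonant rotating-wave approximation (not the effective nonlinear Bloch equations solved in the paper), the final upper-subband population obeys the area theorem, P = sin²(Θ/2), so complete transfer occurs at pulse area Θ = π. A small sketch:

```python
import numpy as np

dt = 0.1e-15                           # s, time step
t = np.arange(-200e-15, 200e-15, dt)   # s, time grid
tau = 50e-15                           # s, pulse duration parameter

def upper_population(theta_target):
    """Rescale a sech pulse to a given area and apply the RWA area theorem."""
    envelope = 1.0 / np.cosh(t / tau)                 # sech envelope of the Rabi frequency
    envelope *= theta_target / (envelope.sum() * dt)  # enforce the requested area
    theta = envelope.sum() * dt                       # pulse area = integral of Rabi frequency
    return np.sin(theta / 2.0) ** 2                   # area theorem: sin^2(area/2)

for k in (0.5, 1.0, 2.0):
    print(f"area = {k:.1f}*pi -> upper-subband population {upper_population(k * np.pi):.3f}")
```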
Zihni, Ceniz; Harris, Andrew R.; Bailly, Maryse; Charras, Guillaume T.; Balda, Maria S.; Matter, Karl
2012-01-01
Actinomyosin activity is an important driver of cell locomotion and has been shown to promote collective cell migration of epithelial sheets as well as single cell migration and tumor cell invasion. However, the molecular mechanisms underlying activation of cortical myosin to stimulate single cell movement, and the relationship between the mechanisms that drive single cell locomotion and those that mediate collective cell migration of epithelial sheets are incompletely understood. Here, we demonstrate that p114RhoGEF, an activator of RhoA that associates with non-muscle myosin IIA, regulates collective cell migration of epithelial sheets and tumor cell invasion. Depletion of p114RhoGEF resulted in specific spatial inhibition of myosin activation at cell-cell contacts in migrating epithelial sheets and the cortex of migrating single cells, but only affected double and not single phosphorylation of myosin light chain. In agreement, overall elasticity and contractility of the cells, processes that rely on persistent and more constant forces, were not affected, suggesting that p114RhoGEF mediates process-specific myosin activation. Locomotion was p114RhoGEF-dependent on Matrigel, which favors more roundish cells and amoeboid-like actinomyosin-driven movement, but not on fibronectin, which stimulates flatter cells and lamellipodia-driven, mesenchymal-like migration. Accordingly, depletion of p114RhoGEF led to reduced RhoA, but increased Rac activity. Invasion of 3D matrices was p114RhoGEF-dependent under conditions that do not require metalloproteinase activity, supporting a role of p114RhoGEF in myosin-dependent, amoeboid-like locomotion. Our data demonstrate that p114RhoGEF drives cortical myosin activation by stimulating myosin light chain double phosphorylation and, thereby, collective cell migration of epithelial sheets and amoeboid-like motility of tumor cells. PMID:23185572
Manufacture of a four-sheet complex component from different titanium alloys by superplastic forming
NASA Astrophysics Data System (ADS)
Allazadeh, M. R.; Zuelli, N.
2017-10-01
A superplastic forming (SPF) technology process was deployed to form a complex eight-pocket component from a four-sheet sandwich-panel sheetstock. Six sheetstock packs were composed of two core sheets made of Ti-6Al-4V or Ti-5Al-4Cr-4Mo-2Sn-2Zr titanium alloy and two skin sheets made of Ti-6Al-4V or Ti-6Al-2Sn-4Zr-2Mo titanium alloy, in three different combinations. The sheets were welded with two successive welding patterns over the core and skin sheets to meet the required component details. The applied welding methods were intermittent and continuous resistance seam welding, for bonding the core sheets to each other and the skin sheets over the core panel, respectively. The final component configuration was predicted based on the die drawings and finite element method (FEM) simulations for the sandwich panels. An SPF set-up with two inlet gas feeds allowed the trials to deliver two pressure-time load cycles acting simultaneously, which were extracted from FEM analysis for a specific forming temperature and strain rate. The SPF pressure-time cycles were optimized via GOM scanning and visual inspection of sections of the packs in order to assess the level of core panel formation during the inflation of the sheetstock. Two sets of GOM scan results were compared via GOM software to inspect the surface and internal features of the inflated multisheet packs. The results highlighted the capability of the tested SPF process to form complex components from a flat multisheet pack made of different titanium alloys.
Tomura, Kyosuke; Ohguri, Takayuki; Mulder, Hendrik Thijmen; Murakami, Motohiro; Nakahara, Sota; Yahara, Katsuya; Korogi, Yukunori
2017-11-20
To evaluate the feasibility and efficacy of deep regional hyperthermia with the use of mobile insulator sheets in a capacitively coupled heating device. Heat was applied using an 8-MHz radiofrequency-capacitive device. An insulator sheet was inserted between the regular bolus and the cooled overlay bolus on each of the upper and lower sides of the electrode. Several settings using the insulator sheets were investigated in an experimental study using an agar phantom to evaluate the temperature distributions. The specific absorption rate (SAR) distributions in several organs were also computed for a three-dimensional patient model. In a clinical prospective study, a total of five heating sessions were scheduled for pelvic tumours, to assess the thermal parameters. The conventional setting was used during the first, third, and fifth treatment sessions, and the insulator sheets were used during the second and fourth treatment sessions. In the phantom study, the region of higher heating extended towards the centre when the mobile insulator sheets were used. The subcutaneous fat/target ratios of the averaged SARs in the setting with the mobile insulator (median, 2.5) were significantly improved compared with those in the conventional setting (median, 3.4). In the clinical study, the thermal dose parameters of CEM43°C T90 in the sessions with the mobile insulator sheets (median, 1.9 min) were significantly better than those in the sessions using the conventional setting (median, 1.0 min). Our novel heating method using mobile insulator sheets was thus found to improve the thermal dose parameters. Further investigations are expected.
Surface water hydrology and the Greenland Ice Sheet
NASA Astrophysics Data System (ADS)
Smith, L. C.; Yang, K.; Pitcher, L. H.; Overstreet, B. T.; Chu, V. W.; Rennermalm, A. K.; Cooper, M. G.; Gleason, C. J.; Ryan, J.; Hubbard, A.; Tedesco, M.; Behar, A.
2016-12-01
Mass loss from the Greenland Ice Sheet now exceeds 260 Gt/year, raising global sea level by >0.7 mm annually. Approximately two-thirds of this total mass loss is now driven by negative ice sheet surface mass balance (SMB), attributed mainly to production and runoff of meltwater from the ice sheet surface. This new dominance of runoff as a driver of GrIS total mass loss will likely persist owing to anticipated further increases in surface melting, reduced meltwater storage in firn, and the waning importance of dynamical mass losses (ice calving) as the ice sheets retreat from their marine-terminating margins. It also creates the need and opportunity for integrative research pairing traditional surface water hydrology approaches with glaciology. As one example, we present a way to measure supraglacial "runoff" (i.e. specific discharge) at the supraglacial catchment scale (~10¹-10² km²), using in situ measurements of supraglacial river discharge and high-resolution satellite/drone mapping of the upstream catchment area. This approach, which is standard in terrestrial hydrology but novel for ice sheet science, enables independent verification and improvement of the modeled SMB runoff estimates used to project sea level rise. Furthermore, because current SMB models do not consider the role of fluvial watershed processes operating on the ice surface, inclusion of even a simple surface routing model materially improves simulations of runoff delivered to moulins, the critical pathways for meltwater entry into the ice sheet. Incorporating principles of surface water hydrology and fluvial geomorphology into glaciological models will thus aid estimates of Greenland meltwater runoff to the global ocean, as well as connections to subglacial hydrology and ice sheet dynamics.
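The catchment-scale calculation reduces to a unit conversion; a small sketch with invented numbers (not the field measurements):

```python
Q_m3s = 23.0    # supraglacial river discharge from in situ gauging (m^3/s), hypothetical
A_km2 = 60.0    # upstream catchment area from satellite/drone mapping (km^2), hypothetical

q_m_per_s = Q_m3s / (A_km2 * 1e6)              # water depth equivalent over the catchment
q_mm_per_day = q_m_per_s * 1000.0 * 86400.0    # convert m/s to mm/day

# This specific discharge can be compared directly with modeled SMB runoff.
print(f"specific discharge: {q_mm_per_day:.1f} mm/day")
```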
The sea-level fingerprints of ice-sheet collapse during interglacial periods
NASA Astrophysics Data System (ADS)
Hay, Carling; Mitrovica, Jerry X.; Gomez, Natalya; Creveling, Jessica R.; Austermann, Jacqueline; Kopp, Robert E.
2014-03-01
Studies of sea level during previous interglacials provide insight into the stability of polar ice sheets in the face of global climate change. Commonly, these studies correct ancient sea-level highstands for the contaminating effect of isostatic adjustment associated with past ice age cycles, and interpret the residuals as being equivalent to the peak eustatic sea level associated with excess melting, relative to present day, of ancient polar ice sheets. However, the collapse of polar ice sheets produces a distinct geometry, or fingerprint, of sea-level change, which must be accounted for to accurately infer peak eustatic sea level from site-specific residual highstands. To explore this issue, we compute fingerprints associated with the collapse of the Greenland Ice Sheet, West Antarctic Ice Sheet, and marine sectors of the East Antarctic Ice Sheet in order to isolate regions that would have been subject to greater-than-eustatic sea-level change for all three cases. These fingerprints are more robust than those associated with modern melting events, when applied to infer eustatic sea level, because: (1) a significant collapse of polar ice sheets reduces the sensitivity of the computed fingerprints to uncertainties in the geometry of the melt regions; and (2) the sea-level signal associated with the collapse will dominate the signal from steric effects. We evaluate these fingerprints at a suite of sites where sea-level records from interglacial marine isotope stages (MIS) 5e and 11 have been obtained. Using these results, we demonstrate that previously discrepant estimates of peak eustatic sea level during MIS 5e based on sea-level markers in Australia and the Seychelles are brought into closer accord.
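The way a fingerprint corrects a site-specific residual can be sketched in a few lines (the factors and highstands below are invented for illustration; actual fingerprints come from solving the sea-level equation):

```python
# Hypothetical normalized fingerprint factors for one melt source at two sites:
# local sea-level change = eustatic value x fingerprint factor.
fingerprint = {"Australia": 1.10, "Seychelles": 1.25}
residual_highstand_m = {"Australia": 6.6, "Seychelles": 7.5}  # invented residuals

# Dividing by the fingerprint reconciles apparently discrepant site estimates.
for site, f in fingerprint.items():
    eustatic = residual_highstand_m[site] / f
    print(f"{site:10s}: inferred peak eustatic sea level = {eustatic:.1f} m")
```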
Perthes Disease: The Quality and Reliability of Information on the Internet.
Nassiri, Mujtaba; Bruce-Brand, Robert A; O'Neill, Francis; Chenouri, Shojaeddin; Curtin, Paul
2015-01-01
Research has shown that up to 89% of parents used the Internet to seek health information regarding their child's medical condition. Much of the information on the Internet is valuable; however, the quality of health information is variable and unregulated. The aim of this study was to evaluate the quality and content of information about Perthes disease on the Internet using recognized scoring systems and identification of quality markers, and to describe a novel Perthes-specific score. We searched the top 3 search engines (Google, Yahoo!, and Bing) for the keywords "Perthes disease." Forty-five unique Web sites were identified. The Web sites were then categorized by type and assessed using the DISCERN score, the Journal of the American Medical Association (JAMA) benchmark criteria, and a novel Perthes-specific Content score. The presence of the Health On the Net (HON) code, a reported quality assurance marker, was noted. Of the Web sites analyzed, the majority were governmental and nonprofit organization (NPO) sites (37.8%), followed by commercial Web sites (22.2%). Only 6 of the Web sites were HONcode certified. The mean DISCERN score was 53.1 (SD=9.0). The governmental and NPO Web sites had the highest overall DISCERN scores, followed closely by physician Web sites. The mean JAMA benchmark criteria score was 2.1 (SD=1.2). Nine Web sites had maximal scores, and the academic Web sites had the highest overall JAMA benchmark scores. DISCERN scores, JAMA benchmark scores, and Perthes-specific Content scores were all greater for Web sites that bore the HONcode seal. The quality of information available online regarding Perthes disease is variable. Governmental and NPO Web sites predominate and also provide higher quality content. The HONcode seal is a reliable indicator of Web site quality. Physicians should recommend the HONcode seal to their patients as a reliable indicator of Web site quality or, better yet, refer patients to sites they have personally reviewed. Supplying parents with a guide to health information on the Internet will help exclude Web sites that are sources of misinformation.
Benchmarking of Typical Meteorological Year datasets dedicated to Concentrated-PV systems
NASA Astrophysics Data System (ADS)
Realpe, Ana Maria; Vernay, Christophe; Pitaval, Sébastien; Blanc, Philippe; Wald, Lucien; Lenoir, Camille
2016-04-01
Accurate analysis of meteorological and pyranometric data for long-term analysis is the basis of decision-making for banks and investors regarding solar energy conversion systems. This has led to the development of methodologies for the generation of Typical Meteorological Year (TMY) datasets. The method most used for solar energy conversion systems was proposed in 1978 by the Sandia Laboratory (Hall et al., 1978), considering a specific weighted combination of different meteorological variables, notably global, diffuse horizontal, and direct normal irradiances, air temperature, wind speed, and relative humidity. In 2012, a new approach was proposed in the framework of the European FP7 project ENDORSE. It introduced the concept of a "driver", defined by the user as an explicit function of the relevant pyranometric and meteorological variables, to improve the representativeness of TMY datasets with respect to the specific solar energy conversion system of interest. The present study aims at comparing and benchmarking different TMY datasets considering a specific Concentrated-PV (CPV) system as the solar energy conversion system of interest. Using long-term (15+ years) time series of high-quality meteorological and pyranometric ground measurements, three types of TMY datasets were generated: by the Sandia method, by a simplified driver with DNI as the only representative variable, and by a more sophisticated driver. The latter takes into account the sensitivities of the CPV system with respect to the spectral distribution of the solar irradiance and wind speed. Different TMY datasets from the three methods have been generated considering different numbers of years in the historical dataset, ranging from 5 to 15 years. The comparisons and benchmarking of these TMY datasets are conducted considering the long-term time series of simulated CPV electricity production as a reference. The results of this benchmarking clearly show that the Sandia method is not suitable for CPV systems. For these systems, the TMY datasets obtained using dedicated drivers (DNI-only or the more precise one) are more representative, even when derived from a limited number of years of meteorological data.
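The month-selection core of such TMY methods can be sketched with the Finkelstein-Schafer (FS) statistic; the snippet below implements a single-variable (DNI-only) selection with random placeholder data, in the spirit of the simplified driver:

```python
import numpy as np

rng = np.random.default_rng(0)
# Placeholder daily DNI sums for one calendar month across 15 candidate years.
years = {y: rng.gamma(2.0, 120.0, size=30) for y in range(2001, 2016)}
longterm = np.concatenate(list(years.values()))

def fs_statistic(candidate, reference):
    """Finkelstein-Schafer: mean absolute difference of the two empirical CDFs."""
    grid = np.sort(reference)
    cdf_ref = np.arange(1, grid.size + 1) / grid.size
    cdf_cand = np.searchsorted(np.sort(candidate), grid, side="right") / candidate.size
    return float(np.mean(np.abs(cdf_cand - cdf_ref)))

# Keep the candidate year whose month best matches the long-term distribution.
best = min(years, key=lambda y: fs_statistic(years[y], longterm))
print("selected year for this calendar month:", best)
```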
California State Waters Map Series-Offshore of Point Reyes, California
Watt, Janet T.; Dartnell, Peter; Golden, Nadine E.; Greene, H. Gary; Erdey, Mercedes D.; Cochrane, Guy R.; Johnson, Samuel Y.; Hartwell, Stephen R.; Kvitek, Rikk G.; Manson, Michael W.; Endris, Charles A.; Dieter, Bryan E.; Sliter, Ray W.; Krigsman, Lisa M.; Lowe, Erik; Chinn, John L.; Watt, Janet T.; Cochran, Susan A.
2015-01-01
This publication about the Offshore of Point Reyes map area includes ten map sheets that contain explanatory text, in addition to this descriptive pamphlet and a data catalog of geographic information system (GIS) files. Sheets 1, 2, and 3 combine data from four different sonar surveys to generate comprehensive high-resolution bathymetry and acoustic-backscatter coverage of the map area. These data reveal a range of physiographic features (highlighted in the perspective views on sheet 4) such as the flat, sediment-covered seafloor in Drakes Bay, as well as abundant “scour depressions” on the Bodega Head–Tomales Point shelf (see sheet 9) and local, tectonically controlled bedrock uplifts. To validate geological and biological interpretations of the sonar data shown in sheets 1, 2, and 3, the U.S. Geological Survey towed a camera sled over specific offshore locations, collecting both video and photographic imagery; these “ground-truth” surveying data are summarized on sheet 6. Sheet 5 is a “seafloor character” map, which classifies the seafloor on the basis of depth, slope, rugosity (ruggedness), and backscatter intensity and which is further informed by the ground-truth-survey imagery. Sheet 7 is a map of “potential habitats,” which are delineated on the basis of substrate type, geomorphology, seafloor process, or other attributes that may provide a habitat for a specific species or assemblage of organisms. Sheet 8 compiles representative seismic-reflection profiles from the map area, providing information on the subsurface stratigraphy and structure of the map area. Sheet 9 shows the distribution and thickness of young sediment (deposited over the last about 21,000 years, during the most recent sea-level rise) in both the map area and the larger Salt Point to Drakes Bay region, interpreted on the basis of the seismic-reflection data, and it identifies the Offshore of Point Reyes map area as lying within the Bodega Head–Tomales Point shelf, Point Reyes bar, and Bolinas shelf domains. Sheet 10 is a geologic map that merges onshore geologic mapping (compiled from existing maps by the California Geological Survey) and new offshore geologic mapping that is based on integration of high-resolution bathymetry and backscatter imagery (sheets 1, 2, 3), seafloor-sediment and rock samples (Reid and others, 2006), digital camera and video imagery (sheet 6), and high-resolution seismic-reflection profiles (sheet 8), as well as aerial-photographic interpretation of nearshore areas. The information provided by the map sheets, pamphlet, and data catalog have a broad range of applications. High-resolution bathymetry, acoustic backscatter, ground-truth-surveying imagery, and habitat mapping all contribute to habitat characterization and ecosystem-based management by providing essential data for delineation of marine protected areas and ecosystem restoration. Many of the maps provide high-resolution baselines that will be critical for monitoring environmental change associated with climate change, coastal development, or other forcings. High-resolution bathymetry is a critical component for modeling coastal flooding caused by storms and tsunamis, as well as inundation associated with longer term sea-level rise. Seismic-reflection and bathymetric data help characterize earthquake and tsunami sources, critical for natural-hazard assessments of coastal zones. 
Information on sediment distribution and thickness is essential to the understanding of local and regional sediment transport, as well as the development of regional sediment-management plans. In addition, siting of any new offshore infrastructure (for example, pipelines, cables, or renewable-energy facilities) will depend on high-resolution mapping. Finally, this mapping will both stimulate and enable new scientific research and also raise public awareness of, and education about, coastal environments and issues.
Denoising DNA deep sequencing data—high-throughput sequencing errors and their correction
Laehnemann, David; Borkhardt, Arndt
2016-01-01
Characterizing the errors generated by common high-throughput sequencing platforms and telling true genetic variation from technical artefacts are two interdependent steps, essential to many analyses such as single nucleotide variant calling, haplotype inference, sequence assembly and evolutionary studies. Both random and systematic errors can show a specific occurrence profile for each of the six prominent sequencing platforms surveyed here: 454 pyrosequencing, Complete Genomics DNA nanoball sequencing, Illumina sequencing by synthesis, Ion Torrent semiconductor sequencing, Pacific Biosciences single-molecule real-time sequencing and Oxford Nanopore sequencing. There is a large variety of programs available for error removal in sequencing read data, which differ in the error models and statistical techniques they use, the features of the data they analyse, the parameters they determine from them, and the data structures and algorithms they use. We highlight the assumptions they make and the data types for which these hold, providing guidance on which tools to consider for benchmarking with regard to the data properties. While no benchmarking results are included here, such specific benchmarks would greatly inform tool choices and future software development. The development of stand-alone error correctors, as well as single nucleotide variant and haplotype callers, could also benefit from using more of the knowledge about error profiles and from (re)combining ideas from the existing approaches presented here. PMID:26026159
Benchmarking reference services: step by step.
Buchanan, H S; Marshall, J G
1996-01-01
This article is a companion to an introductory article on benchmarking published in an earlier issue of Medical Reference Services Quarterly. Librarians interested in benchmarking often ask the following questions: How do I determine what to benchmark; how do I form a benchmarking team; how do I identify benchmarking partners; what's the best way to collect and analyze benchmarking information; and what will I do with the data? Careful planning is a critical success factor of any benchmarking project, and these questions must be answered before embarking on a benchmarking study. This article summarizes the steps necessary to conduct benchmarking research. Relevant examples of each benchmarking step are provided.
MGmapper: Reference based mapping and taxonomy annotation of metagenomics sequence reads.
Petersen, Thomas Nordahl; Lukjancenko, Oksana; Thomsen, Martin Christen Frølund; Maddalena Sperotto, Maria; Lund, Ole; Møller Aarestrup, Frank; Sicheritz-Pontén, Thomas
2017-01-01
An increasing amount of species and gene identification studies rely on the use of next generation sequence analysis of either single isolate or metagenomics samples. Several methods are available to perform taxonomic annotations, and a previous metagenomics benchmark study has shown that a vast number of false positive species annotations are a problem unless thresholds or post-processing are applied to differentiate between correct and false annotations. MGmapper is a package to process raw next generation sequence data and perform reference based sequence assignment, followed by a post-processing analysis to produce reliable taxonomy annotation at species and strain level resolution. An in-vitro bacterial mock community sample comprising 8 genera, 11 species and 12 strains was previously used to benchmark metagenomics classification methods. After applying a post-processing filter, we obtained 100% correct taxonomy assignments at species and genus level. A sensitivity and precision of 75% were obtained for strain level annotations. A comparison between MGmapper and Kraken at species level shows that MGmapper assigns taxonomy at species level using 84.8% of the sequence reads, compared to 70.5% for Kraken, and both methods identified all species with no false positives. Extensive read count statistics are provided in plain text and Excel sheets for both rejected and accepted taxonomy annotations. The use of custom databases is possible with the command-line version of MGmapper, and the complete pipeline is freely available as a Bitbucket package (https://bitbucket.org/genomicepidemiology/mgmapper). A web-version (https://cge.cbs.dtu.dk/services/MGmapper) provides the basic functionality for analysis of small fastq datasets.
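A generic post-processing filter of the kind described can be sketched as simple thresholds on read support (the cutoffs and records below are invented; MGmapper's actual filter criteria may differ):

```python
# (species, assigned reads, fraction of reference genome covered) -- invented records
annotations = [
    ("Escherichia coli", 48210, 0.91),
    ("Salmonella enterica", 37, 0.004),   # low support: likely cross-mapping artefact
    ("Staphylococcus aureus", 15890, 0.83),
]

MIN_READS, MIN_COVERAGE = 100, 0.01       # illustrative thresholds
accepted = [a for a in annotations if a[1] >= MIN_READS and a[2] >= MIN_COVERAGE]
rejected = [a for a in annotations if a not in accepted]

print("accepted:", [name for name, *_ in accepted])
print("rejected:", [name for name, *_ in rejected])
```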
NASA Astrophysics Data System (ADS)
Dargent, J.; Aunai, N.; Belmont, G.; Dorville, N.; Lavraud, B.; Hesse, M.
2016-06-01
Tangential current sheets are ubiquitous in space plasmas and yet hard to describe with a kinetic equilibrium. In this paper, we use a semi-analytical model, the BAS model, which provides a steady ion distribution function for a tangential asymmetric current sheet, and we prove that an ion kinetic equilibrium produced by this model remains steady in a fully kinetic particle-in-cell simulation even if the electron distribution function does not satisfy the time-independent Vlasov equation. We then apply this equilibrium to look at the dependence of magnetic reconnection simulations on their initial conditions. We show that, as the current sheet evolves from a symmetric to an asymmetric upstream plasma, the reconnection rate is impacted, and the X line and the electron flow stagnation point separate from one another and start to drift. For the simulated systems, we investigate the overall evolution of the reconnection process via the classical signatures discussed in the literature and searched for in the Magnetospheric MultiScale data. We show that they seem robust and do not depend on the specific details of the internal structure of the initial current sheet.
Universal inverse design of surfaces with thin nematic elastomer sheets.
Aharoni, Hillel; Xia, Yu; Zhang, Xinyue; Kamien, Randall D; Yang, Shu
2018-06-21
Programmable shape-shifting materials can take different physical forms to achieve multifunctionality in a dynamic and controllable manner. Although morphing a shape from 2D to 3D via programmed inhomogeneous local deformations has been demonstrated in various ways, the inverse problem (finding how to program a sheet in order for it to take an arbitrary desired 3D shape) is much harder yet critical to realize specific functions. Here, we address this inverse problem in thin liquid crystal elastomer (LCE) sheets, where the shape is preprogrammed by precise and local control of the molecular orientation of the liquid crystal monomers. We show how blueprints for arbitrary surface geometries can be generated using approximate numerical methods and how local extrinsic curvatures can be generated to assist in properly converting these geometries into shapes. Backed by faithfully alignable and rapidly lockable LCE chemistry, we precisely embed our designs in LCE sheets using advanced top-down microfabrication techniques. We thus successfully produce flat sheets that, upon thermal activation, take an arbitrary desired shape, such as a face. The general design principles presented here for creating an arbitrary 3D shape will allow for exploration of unmet needs in flexible electronics, metamaterials, aerospace and medical devices, and more.
NASA Astrophysics Data System (ADS)
Alves, J. L.; Oliveira, M. C.; Menezes, L. F.
2004-06-01
Two constitutive models used to describe the plastic behavior of sheet metals in the numerical simulation of sheet metal forming processes are studied: a recently proposed advanced constitutive model, based on the Teodosiu microstructural model and the Cazacu-Barlat yield criterion, is compared with a more classical one, based on the Swift law and the Hill 1948 yield criterion. These constitutive models are implemented in DD3IMP, a finite element home code specifically developed to simulate sheet metal forming processes; it is a 3-D elastoplastic finite element code with an updated Lagrangian formulation and a fully implicit time integration scheme, accommodating large elastoplastic strains and rotations. Solid finite elements and parametric surfaces are used to model the blank sheet and tool surfaces, respectively. Some details of the numerical implementation of the constitutive models are given. Finally, the theory is illustrated with the numerical simulation of the deep drawing of a cylindrical cup. The results show that the proposed advanced constitutive model predicts the final shape (mean cup height and earing profile) of the formed part more accurately, as can be concluded from the comparison with the experimental results.
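Of the two hardening descriptions compared, the Swift law admits a one-line implementation. A minimal sketch, with illustrative mild-steel parameters rather than the values used in the study:

```python
import numpy as np

def swift_flow_stress(eps_p, C=565.3, eps0=0.0079, n=0.26):
    """Swift isotropic hardening law: sigma_y = C * (eps0 + eps_p)**n.
    C in MPa; the parameter values are illustrative for a mild steel
    sheet, not those identified in the paper."""
    return C * (eps0 + eps_p) ** n

eps = np.linspace(0.0, 0.3, 7)           # equivalent plastic strain
print(np.round(swift_flow_stress(eps), 1))  # flow stress rises with strain
```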
Kelvin-Helmholtz instability of stratified jets.
NASA Astrophysics Data System (ADS)
Hanasz, M.; Sol, H.
1996-11-01
We investigate the Kelvin-Helmholtz instability of stratified jets. The internal component (core) is made of a relativistic gas moving with a relativistic bulk speed. The second component (sheath or envelope) flows between the core and the external gas with a nonrelativistic speed. Such a two-component jet describes a variety of possible astrophysical jet configurations, e.g. (1) a relativistic electron-positron beam penetrating a classical electron-proton disc wind or (2) a beam-cocoon structure. We perform a linear stability analysis of such a configuration in the hydrodynamic, plane-parallel, vortex-sheet approximation. The obtained solutions of the dispersion relation show clear differences with respect to the single-jet solutions. Due to the reflection of sound waves at the boundary between the sheath and the external gas, the growth rate as a function of wavenumber exhibits a characteristic oscillation pattern. Overdense sheaths can slow down the growth rate and help stabilize the configuration. Moreover, we find that even for relatively small sheath widths the properties of the sheath start to dominate the jet dynamics. Such effects could have important astrophysical implications, for instance for the origin of the dichotomy between radio-loud and radio-quiet objects.
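For orientation, the single-interface, incompressible, nonrelativistic limit of this problem has a closed-form growth rate against which the stratified-jet dispersion curves can be compared. A sketch of that classical baseline (a deliberate simplification; the paper's relativistic two-component analysis is far richer):

```python
import numpy as np

def kh_growth_rate(k, rho1, rho2, U1, U2):
    """Im(omega) for the classical incompressible vortex-sheet
    Kelvin-Helmholtz instability (no gravity or surface tension):
    Im(omega) = k * sqrt(rho1*rho2) * |U1 - U2| / (rho1 + rho2)."""
    return k * np.sqrt(rho1 * rho2) * np.abs(U1 - U2) / (rho1 + rho2)

k = np.array([0.1, 1.0, 10.0])  # wavenumbers, arbitrary units
print(kh_growth_rate(k, rho1=1.0, rho2=10.0, U1=1.0, U2=0.0))
# the rate grows linearly with k; a denser outer medium suppresses it,
# echoing the stabilizing role of overdense sheaths found in the paper
```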
Educating Next Generation Nuclear Criticality Safety Engineers at the Idaho National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. D. Bess; J. B. Briggs; A. S. Garcia
2011-09-01
One of the challenges in educating our next generation of nuclear safety engineers is the limited opportunity to receive significant experience or hands-on training prior to graduation. Such training is generally restricted to on-the-job training before this new engineering workforce can adequately assess nuclear systems and establish safety guidelines. Participation in the International Criticality Safety Benchmark Evaluation Project (ICSBEP) and the International Reactor Physics Experiment Evaluation Project (IRPhEP) can provide students and young professionals the opportunity to gain experience and enhance critical engineering skills. The ICSBEP and IRPhEP publish annual handbooks that contain evaluations of experiments, along with summarized experimental data and peer-reviewed benchmark specifications, to support the validation of neutronics codes, nuclear cross-section data, and reactor designs. Participation in the benchmark process not only benefits those who use these handbooks within the international community, but provides the individual with opportunities for professional development, networking with an international community of experts, and valuable experience to be used in future employment. Traditionally, students have participated in benchmarking activities via internships at national laboratories, universities, or companies involved with the ICSBEP and IRPhEP programs. Additional programs have been developed to facilitate the nuclear education of students while participating in the benchmark projects. These programs include coordination with the Center for Space Nuclear Research (CSNR) Next Degree Program, collaboration with the Department of Energy Idaho Operations Office to train nuclear and criticality safety engineers, and student evaluations serving as the basis for master's theses in nuclear engineering.
Ebselen Preserves Tissue-Engineered Cell Sheets and their Stem Cells in Hypothermic Conditions
Katori, Ryosuke; Hayashi, Ryuhei; Kobayashi, Yuki; Kobayashi, Eiji; Nishida, Kohji
2016-01-01
Clinical trials have been performed using autologous tissue-engineered epithelial cell sheets for corneal regenerative medicine. To improve stem cell-based therapy for convenient clinical practice, new techniques are required for preserving reconstructed tissues and their stem/progenitor cells until they are ready for use. In the present study, we screened potential preservative agents and developed a novel medium for preserving the cell sheets and their stem/progenitor cells; the effects were evaluated with a luciferase-based viability assay. Nrf2 activators, specifically ebselen, could maintain high ATP levels during preservation. Ebselen also showed a strong influence on maintenance of the viability, morphology, and stem cell function of the cell sheets preserved under hypothermia by protecting them from reactive oxygen species-induced damage. Furthermore, ebselen drastically improved the preservation performance of human cornea tissues and their stem cells. Therefore, ebselen shows good potential as a useful preservation agent in regenerative medicine as well as in cornea transplantation. PMID:27966584
Jemielita, Matthew; Taormina, Michael J; Delaurier, April; Kimmel, Charles B; Parthasarathy, Raghuveer
2013-12-01
The combination of genetically encoded fluorescent proteins and three-dimensional imaging enables cell-type-specific studies of embryogenesis. Light sheet microscopy, in which fluorescence excitation is provided by a plane of laser light, is an appealing approach to live imaging due to its high speed and efficient use of photons. While the advantages of rapid imaging are apparent from recent work, the importance of low light levels to studies of development is not well established. We examine the zebrafish opercle, a craniofacial bone that exhibits pronounced shape changes at early developmental stages, using both spinning disk confocal and light sheet microscopies of fluorescent osteoblast cells. We find normal and aberrant opercle morphologies for specimens imaged with short time intervals using light sheet and spinning disk confocal microscopies, respectively, under equivalent exposure conditions over developmentally-relevant time scales. Quantification of shapes reveals that the differently imaged specimens travel along distinct trajectories in morphological space. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Ultra-broadband microwave metamaterial absorber based on resistive sheets
NASA Astrophysics Data System (ADS)
Kim, Y. J.; Yoo, Y. J.; Hwang, J. S.; Lee, Y. P.
2017-01-01
We investigate a broadband perfect absorber for microwave frequencies, with a wide incident angle, using resistive sheets, based on both simulation and experiment. The absorber uses periodically arranged meta-atoms, consisting of snake-shaped metallic patterns and metal planes separated by three resistive sheet layers between four dielectric layers. We explain the mechanism of the broadband absorption through impedance matching with free space and the distribution of surface currents at specific frequencies. In simulation, the absorption was over 96% in 1.4-6.0 GHz. The corresponding experimental absorption band over 96% was 1.4-4.0 GHz; the absorption was lower than 96% in the 4.0-6.0 GHz range because of the somewhat irregular thickness of the resistive sheets. Furthermore, the absorber works at wide incident angles and is relatively independent of polarization. The design is scalable to smaller sizes for the THz range. The results of this study show potential for real applications in shielding against microwave exposure from devices such as cell phones, monitors, and microwave equipment.
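Absorption in such measurements is inferred from the scattering parameters. A minimal sketch of the standard relation (the S11 values below are assumed, not the paper's data):

```python
import numpy as np

def absorption(S11, S21=0.0):
    """A = 1 - |S11|^2 - |S21|^2; with a continuous metal back plane
    the transmission S21 vanishes, so A reduces to 1 - |S11|^2."""
    return 1.0 - np.abs(S11) ** 2 - np.abs(S21) ** 2

S11 = np.array([0.15, 0.10, 0.20])  # assumed reflection coefficients
print(absorption(S11))              # A >= 0.96 wherever |S11| <= 0.2
```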
Masumoto, Hidetoshi; Ikuno, Takeshi; Takeda, Masafumi; Fukushima, Hiroyuki; Marui, Akira; Katayama, Shiori; Shimizu, Tatsuya; Ikeda, Tadashi; Okano, Teruo; Sakata, Ryuzo; Yamashita, Jun K.
2014-01-01
To realize cardiac regeneration using human induced pluripotent stem cells (hiPSCs), strategies for cell preparation, tissue engineering and transplantation must be explored. Here we report a new protocol for the simultaneous induction of cardiomyocytes (CMs) and vascular cells [endothelial cells (ECs)/vascular mural cells (MCs)], and generate entirely hiPSC-engineered cardiovascular cell sheets, which showed advantageous therapeutic effects in infarcted hearts. The protocol builds on a previous CM differentiation protocol by using stage-specific supplementation of vascular endothelial growth factor for the additional induction of vascular cells. Using this cell sheet technology, we successfully generated physically integrated cardiac tissue sheets (hiPSC-CTSs). hiPSC-CTS transplantation into infarcted rat hearts significantly improved cardiac function. In addition to neovascularization, we confirmed that engrafted human cells consisted mainly of CMs in >40% of transplanted rats four weeks after transplantation. Thus, our hiPSC-CTSs show promise for cardiac regenerative therapy. PMID:25336194
2013-01-01
Background: The objective of screening programs is to discover life-threatening diseases in as many patients as early as possible and to increase the chance of survival. To be able to compare aspects of health care quality, methods are needed for benchmarking that allow comparisons on various health care levels (regional, national, and international). Objectives: Applications and extensions of algorithms can be used to link the information on disease phases with relative survival rates and to consolidate them in composite measures. The application of the developed SAS macros will give results for benchmarking of health care quality. Data examples for breast cancer care are given. Methods: A reference scale (expected, E) must be defined at a time point at which all benchmark objects (observed, O) are measured. All indices are defined as O/E, whereby the extended standardized screening index (eSSI), the standardized case-mix index (SCI), the work-up index (SWI), and the treatment index (STI) address different health care aspects. The composite measures called overall performance evaluation (OPE) and relative overall performance indices (ROPI) link the individual indices differently for cross-sectional or longitudinal analyses. Results: The algorithms allow a time-point- and time-interval-associated comparison of the benchmark objects in the indices eSSI, SCI, SWI, STI, OPE, and ROPI. Comparisons between countries, states and districts are possible. As an example, comparisons between two countries are made. The success of early detection and screening programs as well as clinical health care quality for breast cancer can be demonstrated while taking the population's background mortality into account. Conclusions: If external quality assurance programs and benchmark objects are based on population-based and corresponding demographic data, information on disease phase and relative survival rates can be combined into indices which offer approaches for comparative analyses between benchmark objects. Conclusions on screening programs and health care quality are possible. The macros can be transferred to other diseases if a disease-specific phase scale of prognostic value (e.g. stage) exists. PMID:23316692
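Since every index here is an observed-to-expected ratio, the computation itself is simple. A minimal sketch, with illustrative O and E values and a geometric-mean composite standing in for the paper's OPE definition (both are assumptions):

```python
# O/E benchmarking indices; index names follow the abstract, but the
# observed/expected values and the composite rule are illustrative.
def oe_index(observed, expected):
    return observed / expected

indices = {
    "eSSI": oe_index(observed=420, expected=500),    # screening
    "SCI":  oe_index(observed=0.35, expected=0.40),  # case mix
    "SWI":  oe_index(observed=950, expected=1000),   # work-up
    "STI":  oe_index(observed=880, expected=1000),   # treatment
}

# One plausible composite: the geometric mean of the four indices.
ope = 1.0
for v in indices.values():
    ope *= v
ope **= 1.0 / len(indices)
print(indices, round(ope, 3))
```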
NASA Astrophysics Data System (ADS)
Jiang, J.; Kaloti, A. P.; Levinson, H. R.; Nguyen, N.; Puckett, E. G.; Lokavarapu, H. V.
2016-12-01
We present the results of three standard benchmarks for the new active tracer particle algorithm in ASPECT. The three benchmarks are SolKz, SolCx, and SolVI (also known as the 'inclusion benchmark'), first proposed by Duretz, May, Gerya, and Tackley (G Cubed, 2011) and in subsequent work by Thielmann, May, and Kaus (Pure and Applied Geophysics, 2014). Each of the three benchmarks compares the accuracy of the numerical solution to a steady (time-independent) solution of the incompressible Stokes equations with a known exact solution. These benchmarks are specifically designed to test the accuracy and effectiveness of the numerical method when the viscosity varies by up to six orders of magnitude. ASPECT has been shown to converge to the exact solution of each of these benchmarks at the correct design rate when all of the flow variables, including the density and viscosity, are discretized on the underlying finite element grid (Kronbichler, Heister, and Bangerth, GJI, 2012). In our work we discretize the density and viscosity by initially placing the true values of the density and viscosity at the initial particle positions. At each time step, including the initialization step, the density and viscosity are interpolated from the particles onto the finite element grid. The resulting Stokes system is solved for the velocity and pressure, and the particle positions are advanced in time according to this new, numerical, velocity field. Note that this procedure effectively changes a steady solution of the Stokes equation (i.e., one that is independent of time) into a solution of the Stokes equations that is time dependent. Furthermore, the accuracy of the active tracer particle algorithm now also depends on the accuracy of the interpolation algorithm and of the numerical method used to advance the particle positions in time. Finally, we present new interpolation algorithms designed to increase the overall accuracy of the active tracer algorithms in ASPECT, and interpolation algorithms designed to conserve properties, such as mass density, that are carried by the particles.
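The particle-to-grid interpolation step is where these benchmarks become sensitive. A minimal 1-D sketch of the simplest variant, cell-wise arithmetic averaging of a particle-carried viscosity (an illustration, not ASPECT's implementation):

```python
import numpy as np

def cell_average(particle_x, particle_visc, cell_edges):
    """Average the particle-carried viscosity over each grid cell;
    cells without particles are flagged with NaN."""
    ncells = len(cell_edges) - 1
    visc = np.empty(ncells)
    for c in range(ncells):
        inside = (particle_x >= cell_edges[c]) & (particle_x < cell_edges[c + 1])
        visc[c] = particle_visc[inside].mean() if inside.any() else np.nan
    return visc

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 1000)          # tracer particle positions
visc = np.where(x < 0.5, 1.0, 1.0e6)     # sharp viscosity jump, as in SolCx
edges = np.linspace(0.0, 1.0, 11)        # ten grid cells
print(cell_average(x, visc, edges))      # smeared only in the jump cell
```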
Rand, Hugh; Shumway, Martin; Trees, Eija K.; Simmons, Mustafa; Agarwala, Richa; Davis, Steven; Tillman, Glenn E.; Defibaugh-Chavez, Stephanie; Carleton, Heather A.; Klimke, William A.; Katz, Lee S.
2017-01-01
Background As next generation sequence technology has advanced, there have been parallel advances in genome-scale analysis programs for determining evolutionary relationships as proxies for epidemiological relationship in public health. Most new programs skip traditional steps of ortholog determination and multi-gene alignment, instead identifying variants across a set of genomes, then summarizing results in a matrix of single-nucleotide polymorphisms or alleles for standard phylogenetic analysis. However, public health authorities need to document the performance of these methods with appropriate and comprehensive datasets so they can be validated for specific purposes, e.g., outbreak surveillance. Here we propose a set of benchmark datasets to be used for comparison and validation of phylogenomic pipelines. Methods We identified four well-documented foodborne pathogen events in which the epidemiology was concordant with routine phylogenomic analyses (reference-based SNP and wgMLST approaches). These are ideal benchmark datasets, as the trees, WGS data, and epidemiological data for each are all in agreement. We have placed these sequence data, sample metadata, and “known” phylogenetic trees in publicly-accessible databases and developed a standard descriptive spreadsheet format describing each dataset. To facilitate easy downloading of these benchmarks, we developed an automated script that uses the standard descriptive spreadsheet format. Results Our “outbreak” benchmark datasets represent the four major foodborne bacterial pathogens (Listeria monocytogenes, Salmonella enterica, Escherichia coli, and Campylobacter jejuni) and one simulated dataset where the “known tree” can be accurately called the “true tree”. The downloading script and associated table files are available on GitHub: https://github.com/WGS-standards-and-analysis/datasets. Discussion These five benchmark datasets will help standardize comparison of current and future phylogenomic pipelines, and facilitate important cross-institutional collaborations. Our work is part of a global effort to provide collaborative infrastructure for sequence data and analytic tools—we welcome additional benchmark datasets in our recommended format, and, if relevant, we will add these on our GitHub site. Together, these datasets, dataset format, and the underlying GitHub infrastructure present a recommended path for worldwide standardization of phylogenomic pipelines. PMID:29372115
Timme, Ruth E; Rand, Hugh; Shumway, Martin; Trees, Eija K; Simmons, Mustafa; Agarwala, Richa; Davis, Steven; Tillman, Glenn E; Defibaugh-Chavez, Stephanie; Carleton, Heather A; Klimke, William A; Katz, Lee S
2017-01-01
As next generation sequence technology has advanced, there have been parallel advances in genome-scale analysis programs for determining evolutionary relationships as proxies for epidemiological relationship in public health. Most new programs skip traditional steps of ortholog determination and multi-gene alignment, instead identifying variants across a set of genomes, then summarizing results in a matrix of single-nucleotide polymorphisms or alleles for standard phylogenetic analysis. However, public health authorities need to document the performance of these methods with appropriate and comprehensive datasets so they can be validated for specific purposes, e.g., outbreak surveillance. Here we propose a set of benchmark datasets to be used for comparison and validation of phylogenomic pipelines. We identified four well-documented foodborne pathogen events in which the epidemiology was concordant with routine phylogenomic analyses (reference-based SNP and wgMLST approaches). These are ideal benchmark datasets, as the trees, WGS data, and epidemiological data for each are all in agreement. We have placed these sequence data, sample metadata, and "known" phylogenetic trees in publicly-accessible databases and developed a standard descriptive spreadsheet format describing each dataset. To facilitate easy downloading of these benchmarks, we developed an automated script that uses the standard descriptive spreadsheet format. Our "outbreak" benchmark datasets represent the four major foodborne bacterial pathogens (Listeria monocytogenes, Salmonella enterica, Escherichia coli, and Campylobacter jejuni) and one simulated dataset where the "known tree" can be accurately called the "true tree". The downloading script and associated table files are available on GitHub: https://github.com/WGS-standards-and-analysis/datasets. These five benchmark datasets will help standardize comparison of current and future phylogenomic pipelines, and facilitate important cross-institutional collaborations. Our work is part of a global effort to provide collaborative infrastructure for sequence data and analytic tools; we welcome additional benchmark datasets in our recommended format, and, if relevant, we will add these on our GitHub site. Together, these datasets, dataset format, and the underlying GitHub infrastructure present a recommended path for worldwide standardization of phylogenomic pipelines.
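The descriptive spreadsheet format is what makes automated retrieval possible. A hypothetical sketch of consuming such a table (the column name 'sra_acc' and the use of the SRA toolkit's prefetch command are assumptions for illustration; the repository's own script should be preferred):

```python
import csv
import subprocess

def fetch_dataset(table_path):
    """Read a tab-delimited dataset description and fetch each run;
    the 'sra_acc' column is a hypothetical stand-in for the real schema."""
    with open(table_path, newline="") as fh:
        for row in csv.DictReader(fh, delimiter="\t"):
            acc = row["sra_acc"]
            # delegate the actual download to the SRA toolkit, if installed
            subprocess.run(["prefetch", acc], check=True)

fetch_dataset("outbreak_listeria.tsv")  # hypothetical file name
```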
Benchmarking passive transfer of immunity and growth in dairy calves.
Atkinson, D J; von Keyserlingk, M A G; Weary, D M
2017-05-01
Poor health and growth in young dairy calves can have lasting effects on their development and future production. This study benchmarked calf-rearing outcomes in a cohort of Canadian dairy farms, reported these findings back to producers and their veterinarians, and documented the results. A total of 18 Holstein dairy farms were recruited, all in British Columbia. Blood samples were collected from calves aged 1 to 7 d. We estimated serum total protein levels using digital refractometry, and failure of passive transfer (FPT) was defined as values below 5.2 g/dL. We estimated average daily gain (ADG) for preweaned heifers (1 to 70 d old) using heart-girth tape measurements, and analyzed early (≤35 d) and late (>35 d) growth separately. At first assessment, the average farm FPT rate was 16%. Overall, ADG was 0.68 kg/d, with early and late growth rates of 0.51 and 0.90 kg/d, respectively. Following delivery of the benchmark reports, all participants volunteered to undergo a second assessment. The majority (83%) made at least 1 change in their colostrum-management or milk-feeding practices, including increased colostrum at first feeding, reduced time to first colostrum, and increased initial and maximum daily milk allowances. The farms that made these changes experienced improved outcomes. On the 11 farms that made changes to improve colostrum feeding, the rate of FPT declined from 21 ± 10% before benchmarking to 11 ± 10% after making the changes. On the 10 farms that made changes to improve calf growth, ADG improved from 0.66 ± 0.09 kg/d before benchmarking to 0.72 ± 0.08 kg/d after making the management changes. Increases in ADG were greatest in the early milk-feeding period, averaging 0.13 kg/d higher than pre-benchmarking values for calves ≤35 d of age. Benchmarking specific outcomes associated with calf rearing can motivate producer engagement in calf care, leading to improved outcomes for calves on farms that apply relevant management changes. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
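Both benchmarked outcomes reduce to simple arithmetic on routine measurements. A minimal sketch using the definitions in the abstract (input values are illustrative):

```python
def fpt_rate(serum_total_protein_g_dl, cutoff=5.2):
    """Fraction of calves with failure of passive transfer,
    defined as serum total protein below 5.2 g/dL."""
    failures = sum(1 for tp in serum_total_protein_g_dl if tp < cutoff)
    return failures / len(serum_total_protein_g_dl)

def average_daily_gain(weight1_kg, weight2_kg, age1_d, age2_d):
    """ADG between two heart-girth-derived weight estimates, in kg/d."""
    return (weight2_kg - weight1_kg) / (age2_d - age1_d)

print(fpt_rate([5.8, 4.9, 6.1, 5.5, 4.8]))     # 0.4
print(average_daily_gain(45.0, 66.0, 10, 40))  # 0.7 kg/d
```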
Liu, Yonghuan; Wang, Rutao; Yan, Xingbin
2015-06-08
Nanoscale electrode materials including metal oxide nanoparticles and two-dimensional graphene have been employed for designing supercapacitors. However, inevitable agglomeration of nanoparticles and layer stacking of graphene largely hamper their practical applications. Here we demonstrate an efficient co-ordination and synergistic effect between ultra-small Ni(OH)2 nanoparticles and reduced graphene oxide (RGO) sheets for synthesizing ideal electrode materials. On one hand, to make the ultra-small Ni(OH)2 nanoparticles work at full capacity as an ideal pseudocapacitive material, RGO sheets are employed as a suitable substrate to anchor these nanoparticles against agglomeration. As a consequence, an ultrahigh specific capacitance of 1717 F g(-1) at 0.5 A g(-1) is achieved. On the other hand, to further facilitate ion transfer within RGO sheets as an ideal electrical double layer capacitor material, the ultra-small Ni(OH)2 nanoparticles are introduced among RGO sheets as a recyclable sacrificial spacer to prevent stacking. The resulting RGO sheets exhibit superior rate capability with a high capacitance of 182 F g(-1) at 100 A g(-1). On this basis, an asymmetric supercapacitor is assembled using the two materials, delivering a superior energy density of 75 Wh kg(-1) and an ultrahigh power density of 40 000 W kg(-1).
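The headline figures follow from the standard galvanostatic relation C = I·Δt/(m·ΔV). A minimal sketch (the discharge time and voltage window are assumed values chosen to reproduce the reported capacitance, not data from the paper):

```python
def specific_capacitance(current_a, dt_s, mass_g, dv_v):
    """Galvanostatic estimate C = I * dt / (m * dV), in F/g."""
    return current_a * dt_s / (mass_g * dv_v)

# Illustrative discharge: 0.5 A/g on a 1 mg electrode over a 1.0 V window
print(specific_capacitance(current_a=0.0005, dt_s=3434, mass_g=0.001, dv_v=1.0))
# ~1717 F/g, matching the reported value under these assumed conditions
```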
Picture book support for preparing children ahead of and during day surgery.
Nilsson, Elisabeth; Svensson, Gunnar; Frisman, Gunilla Hollman
2016-10-07
Aim: To develop and evaluate the use of a specific picture book aiming to prepare children for anaesthesia and surgery. Methods: An intervention comparing two different information methods before ear, nose and throat day surgery was performed. The intervention involved using a specific information sheet and a specific picture book. Parents (n=104) of children aged 2-12 years completed open-ended questions that were analysed with qualitative content analysis. They were divided into two groups: one group received routine information and one received routine information and the intervention. Findings: The picture sheet and picture book were valuable aids for preparing small children for anaesthesia and surgery by explaining the procedures that would take place. The parents expressed that knowledge of the procedures made them and the child feel secure. Conclusion: Peri-operative information through pictures supports children and their parents during day surgery and may be helpful in future healthcare visits.
Aponte-Patel, Linda; Sen, Anita
2015-01-01
Although many pediatric intensive care units (PICUs) use bedside communication sheets (BCSs) to highlight daily goals, the optimal format is unknown. A site-specific BCS could improve both PICU communication and compliance in completing the BCS. Via written survey, PICU staff at an academic children's hospital provided recommendations for improving and revising an existing BCS. Pre- and post-BCS revision, PICU staff were polled regarding PICU communication and BCS effectiveness, and daily compliance in completing the BCS was monitored. After implementation of the revised BCS, staff reporting "excellent" or "very good" day-to-day communication within the PICU increased from 57% to 77% (P = .02). Compliance in completing the BCS also increased significantly (75% vs 83%, P = .03). Introduction of a focused and concise BCS tailored to a specific PICU leads to improved perceptions of communication by PICU staff and increased compliance in completing the daily BCS. © The Author(s) 2014.
NASA Astrophysics Data System (ADS)
Lin, Mei; Chen, Bolei; Wu, Xiao; Qian, Jiasheng; Fei, Linfeng; Lu, Wei; Chan, Lai Wa Helen; Yuan, Jikang
2016-01-01
Well-organized epsilon-MnO2 hollow spheres/reduced graphene oxide (MnO2HS/RGO) composites have been successfully constructed via a facile and one-pot synthetic route. The ε-MnO2 hollow spheres with the diameter of ~500 nm were grown in situ with homogeneous distribution on both sides of graphene oxide (GO) sheets in aqueous suspensions. The formation mechanism of the MnO2HS/RGO composites has been systematically investigated, and a high specific capacitance and good cycling capability were achieved on using the composites as supercapacitors. The galvanostatic charge/discharge curves show a specific capacitance of 471.5 F g-1 at 0.8 A g-1. The hollow structures of ε-MnO2 and the crumpled RGO sheets can enhance the electroactive surface area and improve the electrical conductivity, thus further facilitating the charge transport. The MnO2HS/RGO composite exhibits a high capacitance of 272 F g-1 at 3 A g-1 (92% retention) even after 1000 cycles. The prominent electrochemical performance might be attributed to the combination of the pseudo-capacitance of the MnO2 nanospheres with a hollow structure and to the good electrical conductivity of the RGO sheets. This work explores a new concept in designing metal oxides/RGO composites as electrode materials. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr07900d
Yang, Tingzhou; Qian, Tao; Wang, Mengfan; Shen, Xiaowei; Xu, Na; Sun, Zhouzhou; Yan, Chenglin
2016-01-20
A sustainable route from the biomass byproduct okara as a natural nitrogen fertilizer to high-content N-doped carbon sheets is demonstrated. The as-prepared unique structure exhibits high specific capacity (292 mAh g(-1)) and extremely long cycle life (exceeding 2000 cycles). A full battery is devised for the practical use of materials with a flexible/wearable LED screen. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Faria, J.; Silva, J.; Bernardo, P.; Araújo, M.; Alves, J. L.
2016-08-01
The manufacturing process and the behaviour of a spring manufactured from an aluminium sheet are described and investigated in this work, considering the specifications for the in-service conditions. The spring is intended to be applied in the car multimedia industry to replace bolted connections. Among other aspects, the roles of the constitutive parameters and the hypothesis of elastic properties evolving with plastic work are investigated, both in the multistep forming process and under working conditions.
Facility Energy Performance Benchmarking in a Data-Scarce Environment
2017-08-01
environment, and analyze occupant-, system-, and component-level faults contributing to energy inefficiency. A methodology for developing DoD-specific... Research, Development, Test, and Evaluation (RDTE) Program to develop an intelligent framework, encompassing methodology and modeling, that... energy performers by installation, climate zone, and other criteria. A methodology for creating the DoD-specific EUIs would be an important part of a
ERIC Educational Resources Information Center
Storkel, Holly L.; Komesidou, Rouzana; Fleming, Kandace K.; Romine, Rebecca Swinburne
2017-01-01
Purpose: The goal of this study was to provide guidance to clinicians on early benchmarks of successful word learning in an interactive book reading treatment and to examine how encoding and memory evolution during treatment contribute to word learning outcomes by kindergarten children with specific language impairment (SLI). Method: Twenty-seven…
ERIC Educational Resources Information Center
Rice, Mabel L.; Redmond, Sean M.; Hoffman, Lesa
2006-01-01
Purpose: Although mean length of utterance (MLU) is a useful benchmark in studies of children with specific language impairment (SLI), some empirical and interpretive issues are unresolved. The authors report on 2 studies examining, respectively, the concurrent validity and temporal stability of MLU equivalency between children with SLI and…
NASA Astrophysics Data System (ADS)
Wuite, Jan; Nagler, Thomas; Hetzenecker, Markus; Blumthaler, Ursula; Ossowska, Joanna; Rott, Helmut
2017-04-01
The enhanced imaging capabilities of Sentinel-1A and 1B and ESA's systematic acquisition planning for polar regions form the basis for the development and implementation of an operational system for monitoring ice dynamics and discharge of Antarctica, Greenland and other polar ice caps. Within the framework of the ESA CCI and the Austrian ASAP/FFG programs we implemented an automatic system for generating ice velocity maps from repeat-pass Sentinel-1 Terrain Observation by Progressive Scans (TOPS) mode data, applying iterative offset tracking using both coherent and incoherent image cross-correlation. Greenland's margins have been monitored on 6 tracks continuously since mid-2015, with 12-day repeat observations using Sentinel-1A. With the twin satellite Sentinel-1B, launched in April 2016, the repeat acquisition period is reduced to only 6 days, allowing frequent velocity retrievals, even in regions with high accumulation rates and very fast flow, and providing insight into short-term variations of ice flow and discharge. The Sentinel-1 ice velocity products continue the sparse coverage in time and space of previous velocity mapping efforts. The annual Greenland-wide winter acquisition campaigns of 4 to 6 repeat-track observations, acquired within a few weeks, provide nearly gapless and seamless ice-sheet-wide flow velocity maps on a yearly basis, which are important for ice sheet modelling purposes and accurate mass balance assessments. An Antarctic ice-sheet-wide ice velocity map (with a polar gap) was generated from Sentinel-1A data acquired within 8 months, providing an important benchmark for gauging future changes in ice dynamics. For regions with significant warming, continuous monitoring of ice streams with 6- to 12-day repeat intervals, exploiting both satellites, is ongoing to detect changes of ice flow as indicators of climate change. We present annual ice-sheet-wide velocity maps of Greenland from 2014/15 to 2016/17 and Antarctica from 2015/16, as well as dense time series of short-term velocity changes of outlet glaciers since 2014. We highlight the improvements of the dual-satellite constellation of Sentinel-1A and 1B, in particular for fast-moving glaciers and regions with high accumulation rates. Derived surface velocities are combined with ice thickness from airborne radio echo sounding data to compute ice discharge and its short-term variation across flux gates of major outlet glaciers in Greenland and Antarctica. Ice velocity maps, including dense time series for outlet glaciers, and ice discharge products are made available to registered users through our webtool at cryoportal.enveo.at.
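The discharge computation at a flux gate is a sum of velocity times thickness times segment width, converted to mass. A minimal sketch (gate geometry and values are illustrative):

```python
import numpy as np

RHO_ICE = 917.0  # kg m^-3

def gate_discharge_gt_per_yr(v_m_per_yr, thickness_m, width_m):
    """Ice discharge through a flux gate, D = sum(rho * v_i * H_i * w_i),
    with velocity from SAR offset tracking and thickness from radio echo
    sounding; returned in gigatonnes per year."""
    flux_kg_per_yr = np.sum(RHO_ICE * v_m_per_yr * thickness_m * width_m)
    return flux_kg_per_yr / 1e12

v = np.array([800.0, 1200.0, 900.0])  # m/yr at three gate segments
H = np.array([600.0, 800.0, 650.0])   # ice thickness, m
w = np.array([500.0, 500.0, 500.0])   # segment width, m
print(round(gate_discharge_gt_per_yr(v, H, w), 2), "Gt/yr")
```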
Channel specificity and secondary structure of the glucose-inducible porins of Pseudomonas spp.
Adewoye, L O; Tschetter, L; O'Neil, J; Worobec, E A
1998-06-01
The OprB porin-mediated glucose transport system was investigated in Pseudomonas chlororaphis, Burkholderia cepacia, and Pseudomonas fluorescens. Kinetic studies of [U-14C]glucose uptake revealed an inducible system of low Km values (0.3-5 microM) and high specificity for glucose. OprB homologs were purified and reconstituted into proteoliposomes. The porin function and channel preference for glucose were demonstrated by liposome swelling assays. Examination of the periplasmic glucose-binding protein (GBP) components by Western immunoblotting using P. aeruginosa GBP-specific antiserum revealed some homology between P. aeruginosa GBP and periplasmic proteins from P. fluorescens and P. chlororaphis but not B. cepacia. Circular dichroism spectropolarimetry of purified OprB-like porins from the three species revealed beta-sheet contents of 31-50%, in agreement with the 40% beta-sheet content of the P. aeruginosa OprB porin. These findings suggest that the high-affinity glucose transport system is primarily specific for glucose and well conserved in the genus Pseudomonas although its outer membrane component may differ in channel architecture and specificity for other carbohydrates.
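The uptake kinetics reported here follow the Michaelis-Menten form v = Vmax·S/(Km + S), with Km in the low micromolar range. A minimal sketch (Vmax is an assumed value):

```python
import numpy as np

def uptake_rate(S_uM, Vmax=10.0, Km_uM=2.0):
    """Michaelis-Menten uptake, v = Vmax * S / (Km + S); Km is within the
    0.3-5 microM range reported in the abstract, Vmax is assumed."""
    return Vmax * S_uM / (Km_uM + S_uM)

S = np.array([0.3, 1.0, 5.0, 50.0])  # glucose concentration, micromolar
print(np.round(uptake_rate(S), 2))   # saturates toward Vmax at high S
```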
Zhang, Xiao; Tan, Wei; Smail, Fiona; De Volder, Michael; Fleck, Norman; Boies, Adam
2018-06-19
Assemblies of nanomaterials, such as carbon nanotube (CNT) sheets or films, often show outstanding and anisotropic thermal properties. However, there is still a lack of comprehensive thermal conductivity (κ) characterizations of CNT sheets, as well as of estimates of their true contribution to the thermal enhancement of polymer composites when used as additives. Such characterizations have typically been hindered by the low heat capacity, anisotropic thermal properties or low electrical conductivity of the assemblies and their nanocomposites. Transient κ measurements and calculations are further hampered by the need to accurately determine parameters such as specific heat capacity, density and cross-section, which can be difficult and controversial for nanomaterials such as CNT sheets. Here, to measure the anisotropic κ of CNT sheets directly with high fidelity, we modified the conventional steady-state method by measuring under vacuum with an infrared camera, comparing temperature profiles on a reference standard material and a CNT sheet sample. The highly anisotropic thermal conductivities of CNT sheets were characterized comprehensively, with κ/ρ in the alignment direction of ~95 mW·m^2/(K·kg). Furthermore, by comparing the measured thermal properties of different CNT-epoxy resin composites, the heat conduction pathway created by the CNT hierarchical network was shown to remain intact after the in-situ polymerization and curing process. The reliable and direct κ measurement procedures used here, dedicated to nanomaterials, will also be essential in applying such assemblies to heat dissipation and composite thermal enhancement. © 2018 IOP Publishing Ltd.
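In a comparative steady-state arrangement, the same heat flow passes through the reference standard and the sample, so the unknown conductivity follows from the ratio of temperature gradients and cross-sections. A minimal sketch (all numbers are illustrative, not the paper's measurements):

```python
def sample_conductivity(k_ref, grad_ref, A_ref, grad_s, A_s):
    """Comparative steady-state method: equal heat flow Q through both
    members means k_s * A_s * (dT/dx)_s = k_ref * A_ref * (dT/dx)_ref."""
    return k_ref * grad_ref * A_ref / (grad_s * A_s)

k = sample_conductivity(k_ref=16.0,      # W/(m K), e.g. a steel standard
                        grad_ref=250.0,  # K/m along the reference
                        A_ref=1.0e-6,    # m^2 cross-section
                        grad_s=40.0,     # K/m along the CNT sheet
                        A_s=0.5e-6)      # m^2 cross-section
print(round(k, 1), "W/(m K)")            # 200.0 under these assumptions
```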
Self-Contained Math Manual. Teacher's Guide.
ERIC Educational Resources Information Center
Grant, Shelia I.
This instructional manual consists of 11 competency-based units for a mathematics course for trade and industrial programs in Texas. Each instructional unit includes the following basic components: unit and specific objectives, notes to the instructor (outline of steps to follow in accomplishing specific objectives), information sheets,…
SITE-SPECIFIC MEASUREMENTS OF RESIDENTIAL RADON PROTECTION CATEGORY
The report describes a series of benchmark measurements of soil radon potential at seven Florida sites and compares the measurements with regional estimates of radon potential from the Florida radon protection map. The measurements and map were developed under the Florida Radon R...
Lightweight Specifications for Parallel Correctness
2012-12-05
Lepak, Jesse M.; Hooten, Mevin B.; Eagles-Smith, Collin A.; Tate, Michael T.; Lutz, Michelle A.; Ackerman, Joshua T.; Willacker, James J.; Jackson, Allyson K.; Evers, David C.; Wiener, James G.; Pritz, Colleen Flanagan; Davis, Jay
2016-01-01
Fish represent high quality protein and nutrient sources, but Hg contamination is ubiquitous in aquatic ecosystems and can pose health risks to fish and their consumers. Potential health risks posed to fish and humans by Hg contamination in fish were assessed in western Canada and the United States. A large compilation of inland fish Hg concentrations was evaluated in terms of potential health risk to the fish themselves, health risk to predatory fish that consume Hg contaminated fish, and to humans that consume Hg contaminated fish. The probability that a fish collected from a given location would exceed a Hg concentration benchmark relevant to a health risk was calculated. These exceedance probabilities and their associated uncertainties were characterized for fish of multiple size classes at multiple health-relevant benchmarks. The approach was novel and allowed for the assessment of the potential for deleterious health effects in fish and humans associated with Hg contamination in fish across this broad study area. Exceedance probabilities were relatively common at low Hg concentration benchmarks, particularly for fish in larger size classes. Specifically, median exceedances for the largest size classes of fish evaluated at the lowest Hg concentration benchmarks were 0.73 (potential health risks to fish themselves), 0.90 (potential health risk to predatory fish that consume Hg contaminated fish), and 0.97 (potential for restricted fish consumption by humans), but diminished to essentially zero at the highest benchmarks and smallest fish size classes. Exceedances of benchmarks are likely to have deleterious health effects on fish and limit recommended amounts of fish humans consume in western Canada and the United States. Results presented here are not intended to subvert or replace local fish Hg data or consumption advice, but provide a basis for identifying areas of potential health risk and developing more focused future research and monitoring efforts.
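At its core, each reported probability is the chance that a fish of a given size class exceeds a benchmark concentration. A minimal empirical sketch (concentrations and benchmarks below are illustrative; the study estimates these probabilities with a full uncertainty model):

```python
import numpy as np

def exceedance_probability(hg_ppm, benchmark_ppm):
    """Fraction of sampled fish whose Hg concentration exceeds a benchmark."""
    return float((np.asarray(hg_ppm) > benchmark_ppm).mean())

large_fish_hg = [0.45, 0.62, 0.30, 0.88, 0.51, 0.27, 0.73]  # ppm, assumed
for bench in (0.3, 0.5, 1.0):
    print(bench, exceedance_probability(large_fish_hg, bench))
# exceedance falls toward zero as the benchmark concentration rises,
# mirroring the pattern across benchmarks reported in the study
```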
Watson, Gregory S; Green, David W; Cribb, Bronwen W; Brown, Christopher L; Meritt, Christopher R; Tobin, Mark J; Vongsvivut, Jitraporn; Sun, Mingxia; Liang, Ai-Ping; Watson, Jolanta A
2017-07-19
Nature has produced many intriguing and spectacular surfaces at the micro- and nanoscales. These small surface decorations serve a single or, in most cases, a range of functions. The minute landscape found on the lotus leaf is one such example, displaying antiwetting behavior and low adhesion to foreign particulate matter. Indeed, the lotus leaf has often been considered the "benchmark" for such properties. One could expect that there are animal counterparts of this self-drying and self-cleaning surface system. In this study, we show that the planthopper insect wing (Desudaba danae) exhibits a remarkable architectural similarity to the lotus leaf surface. Not only does the wing demonstrate a topographical likeness, but some surface properties are also shared, such as nonwetting behavior and low adhesion forces with contaminants. In addition, the insect-wing cuticle exhibits an antibacterial property whereby Gram-negative bacteria (Porphyromonas gingivalis) are killed over many consecutive waves of attack over 7 days. In contrast, eukaryote cell associations, upon contact with the insect membrane, lead to the formation of integrated cell sheets (e.g., among human stem cells (SHED-MSC) and human dermal fibroblasts (HDF)). The multifunctional features of the insect membrane provide a potential natural template for man-made applications in which specific control of liquid, solid, and biological contacts is desired. Moreover, the planthopper wing cuticle provides a "new" natural surface on which numerous interfacial properties can be explored in comparative studies with both natural and man-made materials.
An audit of fresh frozen plasma usage in a tertiary referral centre in a developing country.
Prathiba, R; Jayaranee, S; Ramesh, J C; Lopez, C G; Vasanthi, N
2001-06-01
This paper evaluates the practice of fresh frozen plasma (FFP) transfusion at the University Hospital, Kuala Lumpur, and analyses its usage by the various clinical departments. The aim of this study is to identify where it is inappropriately used and the clinical indications in which such misuse is common. A retrospective analysis of the blood bank request forms and work sheets during a 6-month period between January 1998 and June 1998 formed the basis of this study. Overall, 40% of 2665 units transfused were considered appropriate. However, out of the 931 episodes of FFP transfusions only 31% were for appropriate indications. The average FFP requirement when used for appropriate indication was about 4 units per episode, whereas for inappropriate indication it was 2.5 units per episode. Inappropriate use in terms of the number of units was highest by the surgical services (68%) and Orthopaedics (64%), while the Department of Paediatrics had the lowest incidence of inappropriate use (40%). When Paediatrics was used as the benchmark, the incidence of inappropriate use by other departments was significantly higher (p < 0.01). As for FFP usage in common clinical indications, there was a high incidence of inappropriate use in burns (82%), perioperative period (73%), cardiac surgery (68%), massive bleeding (62%) and trauma (60%). The findings in this study, specifically the use of FFP for volume support in trauma, massive bleeding and burns, routine requests without identified indication in cardiac bypass surgery, and prophylactic use in the perioperative period can be the basis for recommendations to minimize the inappropriate use of FFP in the future.
Ponzio, Todd A; Feindt, Hans; Ferguson, Steven
2011-09-01
Biopharmaceuticals are therapeutic products based on biotechnology. They are manufactured by or from living organisms and are the most complex of all commercial medicines to develop, manufacture and qualify for regulatory approval. In recent years biopharmaceuticals have rapidly increased in number and importance with over 400 already marketed in the U.S. and European markets alone. Many companies throughout the world are now ramping up investments in biopharmaceutical R&D and expanding their portfolios through licensing of early-stage biotechnologies from universities and other non-profit research institutions, and there is an increasing number of license agreements for biopharmaceutical product development relative to traditional small molecule drug compounds. This trend will only continue as large numbers of biosimilars and biogenerics enter the market. A primary goal of technology transfer offices associated with publicly-funded, non-profit research institutions is to establish patent protection for inventions deemed to have commercial potential and license them for product development. Such licenses help stimulate economic development and job creation, bring a stream of royalty revenue to the institution and, hopefully, advance the public good or public health by bringing new and useful products to market. In the course of applying for such licenses, a commercial development plan is usually put forth by the license applicant. This plan indicates the path the applicant expects to follow to bring the licensed invention to market. In the case of small molecule drug compounds, there exists a widely-recognized series of clinical development steps, dictated by regulatory requirements, that must be met to bring a new drug to market, such as completion of preclinical toxicology, Phase 1, 2 and 3 testing and product approvals. These steps often become the milestone/benchmark schedule incorporated into license agreements which technology transfer offices use to monitor the licensee's diligence and progress; most exclusive licenses include a commercial development plan, with penalties, financial or even revocation of the license, if the plan is not followed, e.g., the license falls too far behind. This study examines whether developmental milestone schedules based on a small molecule drug development model are useful and realistic in setting expectations for biopharmaceutical product development. We reviewed the monitoring records of all exclusive Public Health Service (PHS) commercial development license agreements for small molecule drugs or therapeutics based on biotechnology (biopharmaceuticals) executed by the National Institutes of Health (NIH) Office of Technology Transfer (OTT) between 2003 and 2009. We found that most biopharmaceutical development license agreements required amending because developmental milestones in the negotiated schedule could not be met by the licensee. This was in stark contrast with license agreements for small molecule chemical compounds which rarely needed changes to their developmental milestone schedules. As commercial development licenses for biopharmaceuticals make up the vast majority of NIH's exclusive license agreements, there is clearly a need to: 1) more closely examine how these benchmark schedules are formed, 2) try to understand the particular risk factors contributing to benchmark schedule non-compliance, and 3) devise alternatives to the current license benchmark schedule structural model.
Schedules that properly weigh the most relevant risk factors such as technology classification (e.g., vaccine vs recombinant antibody vs gene therapy), likelihood of unforeseen regulatory issues, and company size/structure may help assure compliance with original license benchmark schedules. This understanding, coupled with a modified approach to the license negotiation process that makes use of a clear and comprehensive term sheet to minimize ambiguities should result in a more realistic benchmark schedule.
Seeding for pervasively overlapping communities
NASA Astrophysics Data System (ADS)
Lee, Conrad; Reid, Fergal; McDaid, Aaron; Hurley, Neil
2011-06-01
In some social and biological networks, the majority of nodes belong to multiple communities. It has recently been shown that a number of the algorithms specifically designed to detect overlapping communities do not perform well in such highly overlapping settings. Here, we consider one class of these algorithms, those which optimize a local fitness measure, typically by using a greedy heuristic to expand a seed into a community. We perform synthetic benchmarks which indicate that an appropriate seeding strategy becomes more important as the extent of community overlap increases. We find that distinct cliques provide the best seeds. We find further support for this seeding strategy with benchmarks on a Facebook network and the yeast interactome.
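The class of algorithms examined here grows a seed by greedily adding the neighbor that most improves a local fitness measure. A minimal sketch using the common k_in/(k_in + k_out)^α fitness with clique seeds (the fitness form and α are assumptions, and real implementations also re-check members for removal):

```python
import networkx as nx

def fitness(G, C, alpha=1.0):
    """Local fitness of node set C: internal degree over total degree."""
    k_in = 2 * G.subgraph(C).number_of_edges()
    k_out = sum(1 for u in C for v in G[u] if v not in C)
    return k_in / (k_in + k_out) ** alpha if (k_in + k_out) else 0.0

def expand(G, seed, alpha=1.0):
    """Greedily add the frontier node that most increases fitness."""
    C = set(seed)
    while True:
        frontier = {v for u in C for v in G[u]} - C
        best, best_f = None, fitness(G, C, alpha)
        for v in frontier:
            f = fitness(G, C | {v}, alpha)
            if f > best_f:
                best, best_f = v, f
        if best is None:
            return C
        C.add(best)

G = nx.karate_club_graph()
seed = max(nx.find_cliques(G), key=len)  # a maximal clique as the seed
print(sorted(expand(G, seed)))
```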
Simulation of springback and microstructural analysis of dual phase steels
NASA Astrophysics Data System (ADS)
Kalyan, T. Sri.; Wei, Xing; Mendiguren, Joseba; Rolfe, Bernard
2013-12-01
With increasing demand for weight reduction and better crashworthiness in car development, advanced high strength Dual Phase (DP) steels have been progressively used in automotive parts. These higher strength steels exhibit higher springback and lower dimensional accuracy after stamping, which has necessitated simulating each stamped component prior to production to estimate the part's dimensional accuracy. Understanding the micro-mechanical behaviour of AHSS sheet may provide more accuracy to stamping simulations. This work is divided into two parts: first, modelling a standard channel forming process; second, modelling the microstructure of the process. The standard top hat channel forming process, benchmark NUMISHEET'93, is used for investigating the springback of WISCO Dual Phase steels. The second part of this work includes finite element analysis of microstructures to understand the behaviour of the multi-phase steel at a more fundamental level. The outcomes of this work will help in the dimensional control of steels during the manufacturing stage based on the material's microstructure.
MOCAT: A Metagenomics Assembly and Gene Prediction Toolkit
Li, Junhua; Chen, Weineng; Chen, Hua; Mende, Daniel R.; Arumugam, Manimozhiyan; Pan, Qi; Liu, Binghang; Qin, Junjie; Wang, Jun; Bork, Peer
2012-01-01
MOCAT is a highly configurable, modular pipeline for fast, standardized processing of single or paired-end sequencing data generated by the Illumina platform. The pipeline uses state-of-the-art programs to quality control, map, and assemble reads from metagenomic samples sequenced at a depth of several billion base pairs, and predict protein-coding genes on assembled metagenomes. Mapping against reference databases allows for read extraction or removal, as well as abundance calculations. Relevant statistics for each processing step can be summarized into multi-sheet Excel documents and queryable SQL databases. MOCAT runs on UNIX machines and integrates seamlessly with the SGE and PBS queuing systems, commonly used to process large datasets. The open source code and modular architecture allow users to modify or exchange the programs that are utilized in the various processing steps. Individual processing steps and parameters were benchmarked and tested on artificial, real, and simulated metagenomes resulting in an improvement of selected quality metrics. MOCAT can be freely downloaded at http://www.bork.embl.de/mocat/. PMID:23082188
NASA Astrophysics Data System (ADS)
Brown, D. A.; Chadwick, M. B.; Capote, R.; Kahler, A. C.; Trkov, A.; Herman, M. W.; Sonzogni, A. A.; Danon, Y.; Carlson, A. D.; Dunn, M.; Smith, D. L.; Hale, G. M.; Arbanas, G.; Arcilla, R.; Bates, C. R.; Beck, B.; Becker, B.; Brown, F.; Casperson, R. J.; Conlin, J.; Cullen, D. E.; Descalle, M.-A.; Firestone, R.; Gaines, T.; Guber, K. H.; Hawari, A. I.; Holmes, J.; Johnson, T. D.; Kawano, T.; Kiedrowski, B. C.; Koning, A. J.; Kopecky, S.; Leal, L.; Lestone, J. P.; Lubitz, C.; Márquez Damián, J. I.; Mattoon, C. M.; McCutchan, E. A.; Mughabghab, S.; Navratil, P.; Neudecker, D.; Nobre, G. P. A.; Noguere, G.; Paris, M.; Pigni, M. T.; Plompen, A. J.; Pritychenko, B.; Pronyaev, V. G.; Roubtsov, D.; Rochman, D.; Romano, P.; Schillebeeckx, P.; Simakov, S.; Sin, M.; Sirakov, I.; Sleaford, B.; Sobes, V.; Soukhovitskii, E. S.; Stetcu, I.; Talou, P.; Thompson, I.; van der Marck, S.; Welser-Sherrill, L.; Wiarda, D.; White, M.; Wormald, J. L.; Wright, R. Q.; Zerkle, M.; Žerovnik, G.; Zhu, Y.
2018-02-01
We describe the new ENDF/B-VIII.0 evaluated nuclear reaction data library. ENDF/B-VIII.0 fully incorporates the new IAEA standards, includes improved thermal neutron scattering data and uses new evaluated data from the CIELO project for neutron reactions on 1H, 16O, 56Fe, 235U, 238U and 239Pu described in companion papers in the present issue of Nuclear Data Sheets. The evaluations benefit from recent experimental data obtained in the U.S. and Europe, and improvements in theory and simulation. Notable advances include updated evaluated data for light nuclei, structural materials, actinides, fission energy release, prompt fission neutron and γ-ray spectra, thermal neutron scattering data, and charged-particle reactions. Integral validation testing is shown for a wide range of criticality, reaction rate, and neutron transmission benchmarks. In general, integral validation performance of the library is improved relative to the previous ENDF/B-VII.1 library.
Anisotropic yield function capable of predicting eight ears
NASA Astrophysics Data System (ADS)
Yoon, J. H.; Cazacu, O.
2011-08-01
Deep drawing of a cylindrical cup from a rolled sheet is one of the typical forming operations where the effect of plastic anisotropy is most evident. Indeed, it is well documented in the literature that the number of ears and the shape of the earing pattern correlate with the r-value profile. For the strongly textured aluminum alloy AA 5042 (Numisheet Benchmark 2011), the experimental r-value distribution has two minima between the rolling and transverse directions, and the data provided for this benchmark show that the r-value along the transverse direction (TD) is five times larger than the value corresponding to the rolling direction (RD). Therefore, it is expected that the earing profile has more than four ears. The main objective of this paper is to assess whether a new form of the CPB06ex2 yield function (Plunkett et al. (2008)), tailored for metals with no tension-compression asymmetry, is capable of predicting more than four ears for this material.
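For reference, the Lankford coefficient (r-value) measured in a uniaxial tension test cut at angle θ to the rolling direction is the ratio of width to thickness plastic strain increments, and the normal anisotropy is often summarized by the average r̄ (standard definitions, not specific to this paper):

```latex
\[
r(\theta) \;=\; \frac{\mathrm{d}\varepsilon_{w}}{\mathrm{d}\varepsilon_{t}},
\qquad
\bar{r} \;=\; \frac{r_{0} + 2\,r_{45} + r_{90}}{4}
\]
```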
2003-01-12
VANDENBERG AFB, Calif. -- A Boeing Delta II rocket soars above the clouds here today at Vandenberg AFB, Calif. The NASA payloads aboard the rocket are the ICESat, an Ice Cloud and land Elevation Satellite, and CHIPSat, a Cosmic Hot Interstellar Plasma Spectrometer. ICESat, a 661-pound satellite, is a benchmark satellite for the Earth Observing System that will help scientists determine if the global sea level is rising or falling. It will observe the ice sheets that blanket the Earth’s poles to determine if they are growing or shrinking. It will assist in developing an understanding of how changes in the Earth’s atmosphere and climate affect polar ice masses and global sea level. The Geoscience Laser Altimeter System is the sole instrument on the satellite. CHIPSat, a suitcase-size 131-pound satellite, will provide information about the origin, physical processes and properties of the hot gas contained in the interstellar medium. This launch marks the first Delta from Vandenberg this year. (USAF photo by: SSgt. Lee A Osberry Jr.)
Limitations of Community College Benchmarking and Benchmarks
ERIC Educational Resources Information Center
Bers, Trudy H.
2006-01-01
This chapter distinguishes between benchmarks and benchmarking, describes a number of data and cultural limitations to benchmarking projects, and suggests that external demands for accountability are the dominant reason for growing interest in benchmarking among community colleges.
Role of Polyalanine Domains in β-Sheet Formation in Spider Silk Block Copolymers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rabotyagova, O.; Cebe, P; Kaplan, D
2010-01-01
Genetically engineered spider silk-like block copolymers were studied to determine the influence of polyalanine domain size on secondary structure. The role of polyalanine block distribution on β-sheet formation was explored using FT-IR and WAXS. The number of polyalanine blocks had a direct effect on the formation of crystalline β-sheets, reflected in the change in crystallinity index as the blocks of polyalanines increased. WAXS analysis confirmed the crystalline nature of the sample with the largest number of polyalanine blocks. This approach provides a platform for further exploration of the role of specific amino acid chemistries in regulating the assembly of β-sheet secondary structures, leading to options to regulate material properties through manipulation of this key component in spider silks.
Mean-field theory of active electrolytes: Dynamic adsorption and overscreening
NASA Astrophysics Data System (ADS)
Frydel, Derek; Podgornik, Rudolf
2018-05-01
We investigate active electrolytes within the mean-field level of description. The focus is on how the double-layer structure of passive, thermalized charges is affected by active dynamics of constituting ions. One feature of active dynamics is that particles adhere to hard surfaces, regardless of chemical properties of a surface and specifically in complete absence of any chemisorption or physisorption. To carry out the mean-field analysis of the system that is out of equilibrium, we develop the "mean-field simulation" technique, where the simulated system consists of charged parallel sheets moving on a line and obeying active dynamics, with the interaction strength rescaled by the number of sheets. The mean-field limit becomes exact in the limit of an infinite number of movable sheets.
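A rough sketch of the "mean-field simulation" idea as described: N charged sheets on a line with active dynamics and pairwise sheet forces rescaled by 1/N. The field of a uniform charged sheet is distance-independent, so the force on sheet i depends only on the net charge to its left versus its right. Run-and-tumble motion, hard walls, and all parameter values are illustrative assumptions, not the paper's exact setup:

```python
import numpy as np

rng = np.random.default_rng(0)
N, steps, dt = 200, 20_000, 1e-3
L, v0, tumble_rate = 10.0, 1.0, 1.0
q = rng.choice([-1.0, 1.0], size=N)        # sheet charges
x = rng.uniform(0, L, size=N)              # positions on a line
v = v0 * rng.choice([-1.0, 1.0], size=N)   # active self-propulsion velocities

for _ in range(steps):
    # Sheet fields are distance-independent: force on sheet i is proportional
    # to (net charge on its left) - (net charge on its right), times q_i.
    order = np.argsort(x)
    left = np.cumsum(q[order]) - q[order]
    right = q.sum() - left - q[order]
    force = np.empty(N)
    force[order] = q[order] * (left - right) / N   # 1/N mean-field rescaling
    tumble = rng.random(N) < tumble_rate * dt      # run-and-tumble reversals
    v[tumble] *= -1.0
    x = np.clip(x + (v + force) * dt, 0.0, L)      # hard walls where sheets adhere
```

In this overdamped picture, the density piling up at the clipped boundaries illustrates the dynamic adsorption discussed in the abstract, and the 1/N rescaling makes the interaction a mean field as N grows.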
NASA Astrophysics Data System (ADS)
Priyono, S.; Lubis, B. M.; Humaidi, S.; Prihandoko, B.
2018-05-01
The synthesis of Li4Ti5O12 (LTO) and the effect of heating during manufacture of LTO sheets on electrochemical performance have been investigated. The LTO anode material, with LiOH.H2O and TiO2 as raw materials, was synthesized by a solid-state process. All raw materials were stoichiometrically mixed and milled with a planetary ball mill for 4 h to become the LTO precursor. The precursor was characterized by Simultaneous Thermal Analysis (STA) to determine the sintering temperature. The STA analysis revealed that the minimum temperature to sinter the precursor was 600 °C. The precursor was sintered in a high-temperature furnace at 900 °C for 2 h in an air atmosphere. The final product was ground and sieved with a screen to obtain finer and more homogeneous particles, and was characterized by X-ray Diffraction (XRD) to determine crystal structure and phases. LTO sheets were prepared by mixing LTO powders with PTFE and AB in a ratio of 85:10:5 wt%, varying the heating temperature (40 °C, 50 °C and 70 °C) to form a slurry. The slurry was coated on Cu foil by the doctor blade method and dried at 80 °C for 1 h. The LTO sheet was characterized by FTIR to analyze functional groups, cut into circular discs 16 mm in diameter, and assembled with a separator, metallic lithium and electrolyte into a coin cell in a glove box. An automatic battery cycler was used to measure the electrochemical performance and specific capacity of the cell. The XRD analysis showed that a single LTO phase with a cubic crystal structure was formed. FTIR testing showed stretching vibrations of Ti-O and H-F from octahedral TiO6 and PTFE, respectively. Increasing the temperature during LTO sheet manufacturing does not change the structure of LTO. Cyclic voltammetry showed that the sample heated at 40 °C exhibited a better redox process than the others, and the charge-discharge test showed that this sample also has a higher specific capacity than the other samples, at 53 mAh·g-1.
Abatacept therapy and safety management.
Pham, Thao; Bachelez, Hervé; Berthelot, Jean-Marie; Blacher, Jacques; Claudepierre, Pascal; Constantin, Arnaud; Fautrel, Bruno; Gaujoux-Viala, Cécile; Goëb, Vincent; Gossec, Laure; Goupille, Philippe; Guillaume-Czitrom, Séverine; Hachulla, Eric; Lequerré, Thierry; Marolleau, Jean-Pierre; Martinez, Valérie; Masson, Charles; Mouthon, Luc; Puéchal, Xavier; Richette, Pascal; Saraux, Alain; Schaeverbeke, Thierry; Soubrier, Martin; Viguier, Manuelle; Vittecoq, Olivier; Wendling, Daniel; Mariette, Xavier; Sibilia, Jean
2012-03-01
To develop and/or update fact sheets about abatacept treatment, in order to assist physicians in the management of patients with inflammatory joint disease. 1. selection by a committee of rheumatology experts of the main topics of interest for which fact sheets were desirable 2. identification and review of publications relevant to each topic 3. development and/or update of fact sheets based on three levels of evidence: evidence-based medicine, official recommendations, and expert opinion. The experts were rheumatologists and invited specialists in other fields (dermatologist, cardiologist, pediatric rheumatologist, endocrinologist, hematologist, immunologist, infectiologist), and they had extensive experience with the management of chronic inflammatory diseases, such as rheumatoid arthritis (RA). They were members of the CRI (Club Rhumatismes et Inflammation), a section of the French Rheumatology Society (Societe Francaise de Rhumatologie). Each fact sheet was revised by several experts and the overall process was coordinated by three experts. Several topics of major interest were selected: contraindications of abatacept treatment; management of adverse effects and concomitant diseases that may develop during abatacept treatment; and management of common situations such as pregnancy, surgery, patient older than 75 years of age, and patients with co-morbidities (such as dialysis, hemoglobinopathy, or splenectomy). After a review of the literature and discussion among experts, a consensus was developed about the content of the fact sheets presented here. These fact sheets focus on several points: 1. in RA, initiation and monitoring of the abatacept treatment, management of patients with specific past histories, and specific clinical situations such as pregnancy 2. diseases other than RA, such as juvenile idiopathic arthritis, spondylarthropathies, or autoimmune diseases (systemic lupus erythematosus and other systemic autoimmune diseases) 3. models of letters for informing the rheumatologist and general practitioner 4. patient information about the use of abatacept in RA 5. and data on the new abatacept formulation for subcutaneous administration (approved by the FDA in August 2011 for patients with moderate-to-severe RA). These fact sheets built on evidence-based medicine and expert opinion will serve as a practical tool for assisting physicians who manage patients on abatacept. They will be available continuously on www.cri-net.com and will be updated at appropriate intervals. Copyright © 2012 Elsevier Masson SAS. All rights reserved.
Specifications for Supplementary Classroom Units, Stressed Skin Panel.
ERIC Educational Resources Information Center
Waring, Robert B.; And Others
Complete outline specifications are given for the construction of supplementary classroom units using stressed skin panels. Sections included are--(1) concrete and related work, (2) masonry, (3) structural and miscellaneous metal, (4) curtain walls and metal windows, (5) carpentry and related work, (6) roofing, sheet metal, and related work, (7)…
Driving personalized medicine: capturing maximum net present value and optimal return on investment.
Roth, Mollie; Keeling, Peter; Smart, Dave
2010-01-01
In order for personalized medicine to meet its potential future promise, a closer focus on the work being carried out today and the foundation it will provide for that future is imperative. While big picture perspectives of this still nascent shift in the drug-development process are important, it is more important that today's work on the first wave of targeted therapies is used to build specific benchmarking and financial models against which further such therapies may be more effectively developed. Today's drug-development teams need a robust tool to identify the exact drivers that will ensure the successful launch and rapid adoption of targeted therapies, and financial metrics to determine the appropriate resource levels to power those drivers. This special report will describe one such benchmarking and financial model that is specifically designed for the personalized medicine field and will explain how the use of this or similar models can help to capture the maximum net present value of targeted therapies and help to realize optimal return on investment.
Fan Noise Prediction with Applications to Aircraft System Noise Assessment
NASA Technical Reports Server (NTRS)
Nark, Douglas M.; Envia, Edmane; Burley, Casey L.
2009-01-01
This paper describes an assessment of current fan noise prediction tools by comparing measured and predicted sideline acoustic levels from a benchmark fan noise wind tunnel test. Specifically, an empirical method and newly developed coupled computational approach are utilized to predict aft fan noise for a benchmark test configuration. Comparisons with sideline noise measurements are performed to assess the relative merits of the two approaches. The study identifies issues entailed in coupling the source and propagation codes, as well as provides insight into the capabilities of the tools in predicting the fan noise source and subsequent propagation and radiation. In contrast to the empirical method, the new coupled computational approach provides the ability to investigate acoustic near-field effects. The potential benefits/costs of these new methods are also compared with the existing capabilities in a current aircraft noise system prediction tool. The knowledge gained in this work provides a basis for improved fan source specification in overall aircraft system noise studies.
Space station operating system study
NASA Technical Reports Server (NTRS)
Horn, Albert E.; Harwell, Morris C.
1988-01-01
The current phase of the Space Station Operating System study is based on the analysis, evaluation, and comparison of the operating systems implemented on the computer systems and workstations in the software development laboratory. Primary emphasis has been placed on the DEC MicroVMS operating system as implemented on the MicroVax II computer, with comparative analysis of the SUN UNIX system on the SUN 3/260 workstation computer, and to a limited extent, the IBM PC/AT microcomputer running PC-DOS. Some benchmark development and testing was also done for the Motorola MC68010 (VM03 system) before the system was taken from the laboratory. These systems were studied with the objective of determining their capability to support Space Station software development requirements, specifically for multi-tasking and real-time applications. The methodology utilized consisted of development, execution, and analysis of benchmark programs and test software, and the experimentation and analysis of specific features of the system or compilers in the study.
Roterman, I; Król, M; Nowak, M; Konieczny, L; Rybarska, J; Stopa, B; Piekarska, B; Zemanek, G
2001-01-01
The complexing of Congo red in two different ligand forms - unimolecular and supramolecular (seven molecules in a micelle) - with eight deca-peptides organized in a β-sheet was tested by computational analysis to identify its dye-binding preferences. Polyphenylalanine and polylysine peptides were selected to represent the specific side chain interactions expected to ensure, in particular, the stabilization of the dye-protein complex. Polyalanine was used to verify the participation of non-specific backbone-derived interactions. The initial complexes for calculation were constructed by intercalating the dye between the peptides in the middle of the β-sheet. The long axis of the dye molecule (in the case of unimolecular systems) or the long axis of the ribbon-like micelle (in the case of the supramolecular dye form) was oriented parallel to the peptide backbone. This positioning maximally reduced the exposure of the hydrophobic diphenyl (central dye fragment) to water. In general the complexes of supramolecular Congo red ligands appeared more stable than those formed by individual dye molecules. Specific interactions (electrostatic and/or ring stacking) dominated as binding forces in the case of the single molecule, while non-specific surface adsorption seemed decisive in complexing with the supramolecular ligand. Both the unimolecular and supramolecular versions of the dye ligand were found to be likely to form complexes of sufficient stability with peptides. The low stability of the protein and the gap accessible to penetration in the peptide sheet seem sufficient for supramolecular ligand binding, but the presence of positively charged or hydrophobic amino acids may strengthen binding significantly. The need for specific interaction makes single-molecule Congo red binding rather unusual as a general amyloid protein ligand. The structural feature of Congo red, which enables specific and common interaction with amyloid proteins, probably derives from the ribbon-like self-assembled form of the dye.
Long-term mortality study of steelworkers. IX. Mortality patterns among sheet and tin mill workers.
Mazumdar, S; Lerer, T; Redmond, C K
1975-12-01
As a result of findings of an earlier report in this series, this study examines the updated cause-specific mortality of men employed in the sheet and tin mill areas of the steel industry. In order to investigate possible relationships between occupational responsibilities or exposures and mortality from specific causes, the sheet and tin mills have been subdivided into 13 mutually exclusive work areas. Detailed analysis is limited primarily to white workers due to the small number of nonwhites in these areas. The most important observations are: 1. Increased overall mortality appears for men employed in 1953 in the sheet finishing and shipping area, confirming the findings of Lloyd, et al. The earlier observation of a significant excess in deaths from vascular lesions of the central nervous system does not hold over time. The previously noted excess for this cause may be related to selective factors or an extreme chance observation. The excess in mortality from all causes of death, which occurs over several disease categories, may not be a result of occupational exposures, but rather some selectivity. 2. Significant excesses in mortality from arteriosclerotic heart disease are noted among men employed in batch pickling and sheet dryer operations, which is in agreement with the earlier findings. Increased risks of dying from hypertensive heart disease are seen in the coating area. 3. Cancer of the lymphatic and hematopoietic tissues is found to be a significant source of excess mortality for workers in the heat treating and forging and tin finishing and shipping work areas. 4. Steelworkers employed in the annealing-normalizing work area show an excess in deaths from nonmalignant respiratory diseases, primarily pneumonia. Further study in these areas should attempt to investigate whether factors in the work environment may be responsible for the observed excess mortalities. More specifically, work should be done to find out whether men employed in heat treating and forging and tin finishing and shipping work in close proximity to chemicals or radiation exposure and whether workers employed in the annealing-normalizing area are exposed to any kind of oil, vapor, or chemical which might be irritating or infectious to the respiratory system. A similar analysis for men working in the batch pickling and sheet dryers and coating areas would also be worthwhile. The main emphasis of any future study should lie upon investigating whether the observed excess mortalities are due to any environmental factor, selection for health, or random fluctuation.
Stanislawski, L.V.
2009-01-01
The United States Geological Survey has been researching generalization approaches to enable multiple-scale display and delivery of geographic data. This paper presents automated methods to prune network and polygon features of the United States high-resolution National Hydrography Dataset (NHD) to lower resolutions. Feature-pruning rules, data enrichment, and partitioning are derived from knowledge of surface water, the NHD model, and associated feature specification standards. Relative prominence of network features is estimated from upstream drainage area (UDA). Network and polygon features are pruned by UDA and NHD reach code to achieve a drainage density appropriate for any less detailed map scale. Data partitioning maintains local drainage density variations that characterize the terrain. For demonstration, a 48-subbasin area of 1:24 000-scale NHD was pruned to 1:100 000-scale (100 K) and compared to a benchmark, the 100 K NHD. The coefficient of line correspondence (CLC) is used to evaluate how well pruned network features match the benchmark network. CLC values of 0.82 and 0.77 result from pruning with and without partitioning, respectively. The number of polygons that remain after pruning is about seven times that of the benchmark, but the area covered by the polygons that remain after pruning is only about 10% greater than the area covered by benchmark polygons. © 2009.
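A hedged sketch of the two operations discussed, pruning by upstream drainage area and scoring with a coefficient of line correspondence. The CLC formulation below (matched length over matched plus omitted plus committed length) is one common variant, not necessarily the paper's exact definition, and the reach records are hypothetical:

```python
# Hypothetical reach records: upstream drainage area (UDA) and reach length.
reaches = [
    {"uda_km2": 120.0, "length_km": 3.2},
    {"uda_km2": 4.5, "length_km": 1.1},
    {"uda_km2": 38.0, "length_km": 2.4},
]

def prune_by_uda(reaches, threshold_km2):
    # Keep only reaches prominent enough for the target drainage density.
    return [r for r in reaches if r["uda_km2"] >= threshold_km2]

def clc(matched_km, omitted_km, committed_km):
    # Matched length relative to matched plus omitted (benchmark-only)
    # plus committed (pruned-only) length.
    return matched_km / (matched_km + omitted_km + committed_km)

print(len(prune_by_uda(reaches, 10.0)))  # 2 reaches survive a 10 km^2 cut
print(round(clc(82.0, 10.0, 8.0), 2))    # 0.82, on the order of reported values
```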
NASA Astrophysics Data System (ADS)
Rodriguez, Tony F.; Cushman, David A.
2003-06-01
With the growing commercialization of watermarking techniques in various application scenarios, it has become increasingly important to quantify the performance of watermarking products. The quantification of the relative merits of various products is not only essential in enabling further adoption of the technology by society as a whole, but will also drive the industry to develop testing plans/methodologies to ensure quality and minimize cost (to both vendors and customers). While the research community understands the theoretical need for a publicly available benchmarking system to quantify performance, there has been less discussion on the practical application of these systems. By providing a standard set of acceptance criteria, benchmarking systems can dramatically increase the quality of a particular watermarking solution, validating product performance when they are used efficiently and frequently during the design process. In this paper we describe how to leverage specific design-of-experiments techniques to increase the quality of a watermarking scheme, to be used with the benchmark tools being developed by the Ad-Hoc Watermark Verification Group. A Taguchi Loss Function is proposed for an application, and orthogonal arrays are used to isolate optimal levels for a multi-factor experimental situation. Finally, the results are generalized to a population of cover works and validated through an exhaustive test.
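An illustrative sketch of the approach named above: a quadratic (nominal-is-best) Taguchi loss applied to a benchmark quality score, evaluated over an L4(2^3) orthogonal array. The score function, factor meanings, target, and loss constant are assumptions for illustration, not the paper's values:

```python
def taguchi_loss(y, target=1.0, k=100.0):
    # Quadratic Taguchi loss: cost grows with squared deviation of the
    # quality metric y from its target value.
    return k * (y - target) ** 2

# L4(2^3) orthogonal array: 4 runs covering 3 two-level factors so every
# pair of factors sees all level combinations equally often.
L4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]

def run_benchmark(strength, block_size, channel):
    # Stand-in for a real watermark benchmark run returning a robustness
    # score in [0, 1]; a real study would measure detection after attacks.
    return 0.70 + 0.12 * strength + 0.06 * block_size - 0.03 * channel

losses = [taguchi_loss(run_benchmark(*levels)) for levels in L4]
best = L4[losses.index(min(losses))]  # factor levels with the lowest loss
```

The orthogonal array keeps the experiment small (4 runs instead of 8 full-factorial runs) while still letting main effects be separated.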
Assessing Ecosystem Model Performance in Semiarid Systems
NASA Astrophysics Data System (ADS)
Thomas, A.; Dietze, M.; Scott, R. L.; Biederman, J. A.
2017-12-01
In ecosystem process modelling, comparing outputs to benchmark datasets observed in the field is an important way to validate models, allowing the modelling community to track model performance over time and compare models at specific sites. Multi-model comparison projects as well as models themselves have largely been focused on temperate forests and similar biomes. Semiarid regions, on the other hand, are underrepresented in land surface and ecosystem modelling efforts, and yet will be disproportionately impacted by disturbances such as climate change due to their sensitivity to changes in the water balance. Benchmarking models at semiarid sites is an important step in assessing and improving models' suitability for predicting the impact of disturbance on semiarid ecosystems. In this study, several ecosystem models were compared at a semiarid grassland in southwestern Arizona using PEcAn, or the Predictive Ecosystem Analyzer, an open-source eco-informatics toolbox ideal for creating the repeatable model workflows necessary for benchmarking. Models included SIPNET, DALEC, JULES, ED2, GDAY, LPJ-GUESS, MAESPA, CLM, CABLE, and FATES. Comparison between model output and benchmarks such as net ecosystem exchange (NEE) tended to produce high root mean square error and low correlation coefficients, reflecting poor simulation of seasonality and the tendency for models to create much higher carbon sources than observed. These results indicate that ecosystem models do not currently adequately represent semiarid ecosystem processes.
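A minimal sketch of the benchmark statistics mentioned above, root mean square error and correlation between modeled and observed NEE, using placeholder monthly values rather than site measurements:

```python
import numpy as np

# Toy monthly NEE series; values are placeholders, not site data.
obs = np.array([-0.1, 0.0, 0.3, 0.8, 1.2, 0.9, 0.4, 0.2, 0.1, 0.0, -0.1, -0.2])
mod = np.array([ 0.2, 0.3, 0.5, 0.7, 0.8, 0.8, 0.7, 0.6, 0.4, 0.3,  0.2,  0.1])

rmse = np.sqrt(np.mean((mod - obs) ** 2))   # root mean square error
r = np.corrcoef(mod, obs)[0, 1]             # Pearson correlation coefficient
print(f"RMSE = {rmse:.2f}, r = {r:.2f}")
```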
Experimental benchmark of kinetic simulations of capacitively coupled plasmas in molecular gases
NASA Astrophysics Data System (ADS)
Donkó, Z.; Derzsi, A.; Korolov, I.; Hartmann, P.; Brandt, S.; Schulze, J.; Berger, B.; Koepke, M.; Bruneau, B.; Johnson, E.; Lafleur, T.; Booth, J.-P.; Gibson, A. R.; O'Connell, D.; Gans, T.
2018-01-01
We discuss the origin of uncertainties in the results of numerical simulations of low-temperature plasma sources, focusing on capacitively coupled plasmas. These sources can be operated in various gases/gas mixtures, over a wide domain of excitation frequency, voltage, and gas pressure. At low pressures, the non-equilibrium character of the charged particle transport prevails and particle-based simulations become the primary tools for their numerical description. The particle-in-cell method, complemented with Monte Carlo type description of collision processes, is a well-established approach for this purpose. Codes based on this technique have been developed by several authors/groups, and have been benchmarked with each other in some cases. Such benchmarking demonstrates the correctness of the codes, but the underlying physical model remains unvalidated. This is a key point, as this model should ideally account for all important plasma chemical reactions as well as for the plasma-surface interaction via including specific surface reaction coefficients (electron yields, sticking coefficients, etc). In order to test the models rigorously, comparison with experimental ‘benchmark data’ is necessary. Examples will be given regarding the studies of electron power absorption modes in O2, and CF4-Ar discharges, as well as on the effect of modifications of the parameters of certain elementary processes on the computed discharge characteristics in O2 capacitively coupled plasmas.
Performance Monitoring of Distributed Data Processing Systems
NASA Technical Reports Server (NTRS)
Ojha, Anand K.
2000-01-01
Test and checkout systems are essential components in ensuring the safety and reliability of aircraft and related systems for space missions. A variety of systems, developed over several years, are in use at NASA/KSC. Many of these systems are configured as distributed data processing systems, with the functionality spread over several multiprocessor nodes interconnected through networks. To be cost-effective, a system should take the least amount of resources and perform a given testing task in the least amount of time. There are two aspects of performance evaluation: monitoring and benchmarking. While monitoring is valuable to system administrators in operating and maintaining systems, benchmarking is important in designing and upgrading computer-based systems. These two aspects of performance evaluation are the foci of this project. This paper first discusses various issues related to software, hardware, and hybrid performance monitoring as applicable to distributed systems, and specifically to the TCMS (Test Control and Monitoring System). Next, a comparison of several probing approaches is made to show that the hybrid monitoring technique developed by NIST (the National Institute of Standards and Technology) is the least intrusive and takes only one-fourth of the time taken by software monitoring probes. In the rest of the paper, issues related to benchmarking a distributed system are discussed, and finally a prescription for developing a micro-benchmark for the TCMS is provided.
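An illustrative software monitoring probe of the kind compared above, sketched as a Python decorator; the point is that software probes perturb the timing they measure, which is why the hybrid hardware-assisted approach is reported as far less intrusive. The names and example workload are hypothetical:

```python
import functools
import time

def probe(fn):
    # Software monitoring probe: wrap a function to record entry/exit times.
    # The wrapper itself adds overhead to every instrumented call.
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        t0 = time.perf_counter_ns()
        try:
            return fn(*args, **kwargs)
        finally:
            print(f"{fn.__name__} took {time.perf_counter_ns() - t0} ns")
    return wrapper

@probe
def checkout_task():
    # Stand-in for a test/checkout workload being monitored.
    return sum(i * i for i in range(10_000))

checkout_task()
```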
U.S. Solar Photovoltaic System Cost Benchmark: Q1 2017
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fu, Ran; Feldman, David; Margolis, Robert
This report benchmarks U.S. solar photovoltaic (PV) system installed costs as of the first quarter of 2017 (Q1 2017). We use a bottom-up methodology, accounting for all system and project development costs incurred during the installation to model the costs for residential, commercial, and utility-scale systems. In general, we attempt to model the typical installation techniques and business operations from an installed-cost perspective. Costs are represented from the perspective of the developer/installer; thus, all hardware costs represent the price at which components are purchased by the developer/installer, not accounting for preexisting supply agreements or other contracts. Importantly, the benchmark also represents the sales price paid to the installer; therefore, it includes profit in the cost of the hardware, along with the profit the installer/developer receives, as a separate cost category. However, it does not include any additional net profit, such as a developer fee or price gross-up, which is common in the marketplace. We adopt this approach owing to the wide variation in developer profits in all three sectors, where project pricing is highly dependent on region and project specifics such as local retail electricity rate structures, local rebate and incentive structures, competitive environment, and overall project or deal structures. Finally, our benchmarks are national averages weighted by state installed capacities.
The Pliocene-Pleistocene transition and the onset of the Northern Hemisphere glacial inception
NASA Astrophysics Data System (ADS)
Robinson, A.; Calov, R.; Ganopolski, A.
2011-12-01
The Pliocene-Pleistocene transition (PPT, ca. 3.3-2.4 Ma BP) marks a shift in the Earth's climate and is believed to coincide with the inception of the Northern Hemisphere (NH) ice sheets. This transition is not only characterized by a gradual reduction in atmospheric CO2 concentration; paleo records also show a strengthening in the amplitude of δ18O variations and intensified ice-rafted debris deposition in the North Atlantic. Previous modeling studies have demonstrated that the drop in atmospheric CO2 plays an important role in the glaciation of the NH ice sheets and, more specifically, it is considered to be the primary cause of the glaciation of Greenland. Here we apply a novel approach to produce transient simulations of the entire PPT, in order to study the glaciation of Greenland and the NH ice sheets and, additionally, to investigate which conditions are necessary for full-scale glaciation. The fully-coupled Earth system model of intermediate complexity CLIMBER-2 is used to explore the effects of a suite of orbital and CO2 forcing scenarios on total NH glaciation. CLIMBER-2 includes low-resolution sub-models of the atmosphere, vegetation, ocean and ice sheets; the latter is designed to simulate the big NH ice sheets with a rather low resolution (and high computational efficiency). As a refinement, the results of the global simulations are then used to force regional simulations of the Greenland Ice Sheet (GIS) using the higher resolution (20 km) regional climate-ice sheet model, REMBO-SICOPOLIS. We present results of transient simulations driven by orbital forcing and several CO2 reduction scenarios that are consistent with best estimates from data for this time period. We discuss the growth and persistence of the NH ice sheets in terms of the forcing and feedbacks involved. Additionally, we present a set of simulations with the growth of the NH ice sheets disabled, in order to quantify the effect the large ice sheets have on global and regional temperature anomalies. By simulating the GIS in our high-resolution coupled global-regional approach, we identify, with greater precision, the conditions necessary for inception of the GIS and link these to global climatic changes.
Kajiki, Shigeyuki; Kobayashi, Yuichi; Uehara, Masamichi; Nakanishi, Shigemoto; Mori, Koji
2016-06-07
This study aimed to develop an information gathering check sheet to efficiently collect information necessary for Japanese companies to build global occupational safety and health management systems in overseas business places. The study group consisted of 2 researchers with occupational physician careers in a foreign-affiliated company in Japan and 3 supervising occupational physicians who were engaged in occupational safety and health activities in overseas business places. After investigating information and sources of information necessary for implementing occupational safety and health activities and building relevant systems, we conducted information acquisition using an information gathering check sheet in the field, by visiting 10 regions in 5 countries (first phase). The accuracy of the information acquired and the appropriateness of the information sources were then verified in study group meetings to improve the information gathering check sheet. Next, the improved information gathering check sheet was used in another setting (3 regions in 1 country) to confirm its efficacy (second phase), and the information gathering check sheet was thereby completed. The information gathering check sheet was composed of 9 major items (basic information on the local business place, safety and health overview, safety and health systems, safety and health staff, planning/implementation/evaluation/improvement, safety and health activities, laws and administrative organs, local medical care systems and public health, and medical support for resident personnel) and 61 medium items. We relied on the following eight information sources: the internet, company (local business place and head office in Japan), embassy/consulate, ISO certification body, university or other educational institutions, and medical institutions (aimed at Japanese people or at local workers). Through multiple study group meetings and a two-phased field survey (13 regions in 6 countries), an information gathering check sheet was completed. We confirmed the possibility that this check sheet would enable the user to obtain necessary information when expanding safety and health activities in a country or region that is new to the user. It is necessary in the future to evaluate safety and health systems and activities using this information gathering check sheet in a local business place in any country in which a Japanese business will be established, and to verify the efficacy of the check sheet by conducting model programs to test specific approaches.
Gelli, Aulo; Suwa, Yuko
2014-09-01
School feeding programs have been a key response to the recent food and economic crises and function to some degree in nearly every country in the world. However, school feeding programs are complex and exhibit different, context-specific models or configurations. To examine the trade-offs, including the costs and cost-efficiency, of an innovative cluster kitchen implementation model in Bangladesh using a standardized framework. A supply chain framework based on international standards was used to provide benchmarks for meaningful comparisons across models. Implementation processes specific to the program in Bangladesh were mapped against this reference to provide a basis for standardized performance measures. Qualitative and quantitative data on key metrics were collected retrospectively using semistructured questionnaires following an ingredients approach, including both financial and economic costs. Costs were standardized to a 200-feeding-day year and 700 kcal daily. The cluster kitchen model had similarities with the semidecentralized model and outsourced models in the literature, the main differences involving implementation scale, scale of purchasing volumes, and frequency of purchasing. Two important features stand out in terms of implementation: the nutritional quality of meals and the level of community involvement. The standardized full cost per child per year was US$110. Despite the nutritious content of the meals, the overall cost-efficiency in cost per nutrient output was lower than the benchmark for centralized programs, due mainly to support and start-up costs. Cluster kitchens provide an example of an innovative implementation model, combining an emphasis on quality meal delivery with strong community engagement. However, the standardized costs-per child were above the average benchmarks for both low-and middle-income countries. In contrast to the existing benchmark data from mature, centralized models, the main cost drivers of the program were associated with support and start-up activities. Further research is required to better understand changes in cost drivers as programs mature.
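A sketch of the kind of standardization described, scaling an observed per-child cost to the 200-feeding-day, 700 kcal/day reference year; the linear scaling below is an assumption for illustration and may differ from the study's actual procedure:

```python
def standardized_cost(observed_cost, feeding_days, kcal_per_day):
    # Linearly rescale an observed annual per-child cost to a reference
    # year of 200 feeding days delivering 700 kcal per day (illustrative).
    return observed_cost * (200.0 / feeding_days) * (700.0 / kcal_per_day)

# e.g., $95 per child observed over a 180-day year at 650 kcal/day:
print(round(standardized_cost(95.0, 180, 650), 2))  # ~113.68
```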
Teaching Regular Classroom Success.
ERIC Educational Resources Information Center
Harries, Rhonda J.
1986-01-01
Seven strategies are described to encourage resource room students' development of independent organizational skills. Suggestions include use of specific duty sheets, time management instruction, and teaching of proofreading and checking techniques. (CL)
Bereskie, Ty; Haider, Husnain; Rodriguez, Manuel J; Sadiq, Rehan
2017-08-23
Traditional approaches for benchmarking drinking water systems are binary, based solely on the compliance and/or non-compliance of one or more water quality performance indicators against defined regulatory guidelines/standards. The consequence of water quality failure is dependent on location within a water supply system as well as time of the year (i.e., season) with varying levels of water consumption. Conventional approaches used for water quality comparison purposes fail to incorporate spatiotemporal variability and degrees of compliance and/or non-compliance. This can lead to misleading or inaccurate performance assessment data used in the performance benchmarking process. In this research, a hierarchical risk-based water quality performance benchmarking framework is proposed to evaluate small drinking water systems (SDWSs) through cross-comparison amongst similar systems. The proposed framework (R WQI framework) is designed to quantify consequence associated with seasonal and location-specific water quality issues in a given drinking water supply system to facilitate more efficient decision-making for SDWSs striving for continuous performance improvement. Fuzzy rule-based modelling is used to address imprecision associated with measuring performance based on singular water quality guidelines/standards and the uncertainties present in SDWS operations and monitoring. This proposed R WQI framework has been demonstrated using data collected from 16 SDWSs in Newfoundland and Labrador and Quebec, Canada, and compared to the Canadian Council of Ministers of the Environment WQI, a traditional, guidelines/standard-based approach. The study found that the R WQI framework provides an in-depth state of water quality and benchmarks SDWSs more rationally based on the frequency of occurrence and consequence of failure events.
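A toy sketch of the fuzzy rule-based idea: instead of a binary pass/fail against a guideline, a measurement maps to graded memberships in compliance categories. The ratio breakpoints below are invented for illustration and are not the framework's actual rules:

```python
def memberships(measured, guideline):
    # Fuzzy degrees of compliance for one water-quality parameter.
    # Breakpoints at 0.8x, 1.2x, and 1.4x of the guideline are illustrative.
    x = measured / guideline
    compliant = max(0.0, min(1.0, (1.2 - x) / 0.4))
    non_compliant = max(0.0, min(1.0, (x - 1.0) / 0.4))
    marginal = max(0.0, 1.0 - compliant - non_compliant)
    return compliant, marginal, non_compliant

print(memberships(0.9, 1.0))   # mostly compliant
print(memberships(1.15, 1.0))  # partly compliant, partly non-compliant
```

Graded memberships like these let seasonal, location-specific degrees of exceedance feed into the risk-based index rather than collapsing to a single pass/fail bit.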
Electric load shape benchmarking for small- and medium-sized commercial buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, Xuan; Hong, Tianzhen; Chen, Yixing
Small- and medium-sized commercial buildings owners and utility managers often look for opportunities for energy cost savings through energy efficiency and energy waste minimization. However, they currently lack easy access to low-cost tools that help interpret the massive amount of data needed to improve understanding of their energy use behaviors. Benchmarking is one of the techniques used in energy audits to identify which buildings are priorities for an energy analysis. Traditional energy performance indicators, such as the energy use intensity (annual energy per unit of floor area), consider only the total annual energy consumption, lacking consideration of the fluctuation of energy use behavior over time, which reveals the time of use information and represents distinct energy use behaviors during different time spans. To fill the gap, this study developed a general statistical method using 24-hour electric load shape benchmarking to compare a building or business/tenant space against peers. Specifically, the study developed new forms of benchmarking metrics and data analysis methods to infer the energy performance of a building based on its load shape. We first performed a data experiment with collected smart meter data using over 2,000 small- and medium-sized businesses in California. We then conducted a cluster analysis of the source data, and determined and interpreted the load shape features and parameters with peer group analysis. Finally, we implemented the load shape benchmarking feature in an open-access web-based toolkit (the Commercial Building Energy Saver) to provide straightforward and practical recommendations to users. The analysis techniques were generic and flexible for future datasets of other building types and in other utility territories.
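A minimal sketch of the load-shape benchmarking idea under stated assumptions: normalize each 24-hour profile so shape rather than magnitude drives the comparison, cluster shapes into peer groups, then measure a building's deviation from its peer-group mean. The data, cluster count, and distance metric are placeholders, not the study's parameters:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
profiles = rng.random((2000, 24))   # stand-in for hourly smart-meter data

# Normalize so each profile sums to 1: shape, not magnitude, matters.
shapes = profiles / profiles.sum(axis=1, keepdims=True)

# Cluster into peer groups of similar 24-hour load shapes.
labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(shapes)

# Benchmark building 0 against its peer group's average shape.
peer_mean = shapes[labels == labels[0]].mean(axis=0)
deviation = np.abs(shapes[0] - peer_mean).sum()
```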
Multi-Complementary Model for Long-Term Tracking
Zhang, Deng; Zhang, Junchang; Xia, Chenyang
2018-01-01
In recent years, video target tracking algorithms have been widely used. However, many tracking algorithms do not achieve satisfactory performance, especially when dealing with problems such as object occlusions, background clutters, motion blur, low illumination color images, and sudden illumination changes in real scenes. In this paper, we incorporate an object model based on contour information into a Staple tracker that combines the correlation filter model and color model to greatly improve the tracking robustness. Since each model is responsible for tracking specific features, the three complementary models combine for more robust tracking. In addition, we propose an efficient object detection model with contour and color histogram features, which has good detection performance and better detection efficiency compared to the traditional target detection algorithm. Finally, we optimize the traditional scale calculation, which greatly improves the tracking execution speed. We evaluate our tracker on the Object Tracking Benchmarks 2013 (OTB-13) and Object Tracking Benchmarks 2015 (OTB-15) benchmark datasets. With the OTB-13 benchmark datasets, our algorithm is improved by 4.8%, 9.6%, and 10.9% on the success plots of OPE, TRE and SRE, respectively, in contrast to another classic LCT (Long-term Correlation Tracking) algorithm. On the OTB-15 benchmark datasets, when compared with the LCT algorithm, our algorithm achieves 10.4%, 12.5%, and 16.1% improvement on the success plots of OPE, TRE, and SRE, respectively. At the same time, it needs to be emphasized that, due to the high computational efficiency of the color model and the object detection model using efficient data structures, and the speed advantage of the correlation filters, our tracking algorithm could still achieve good tracking speed. PMID:29425170
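A toy sketch of the model-fusion step in the spirit of Staple-like trackers as described above: the correlation-filter, color, and contour response maps are linearly combined and the fused maximum gives the new target estimate. The weights and random maps are illustrative, not the paper's learned values:

```python
import numpy as np

def fuse_responses(resp_cf, resp_color, resp_contour, weights=(0.5, 0.3, 0.2)):
    # Linearly merge three complementary response maps and return the
    # location of the fused maximum as the target position estimate.
    fused = (weights[0] * resp_cf + weights[1] * resp_color
             + weights[2] * resp_contour)
    return np.unravel_index(np.argmax(fused), fused.shape)

# Toy 50x50 response maps standing in for real model outputs.
rng = np.random.default_rng(1)
maps = [rng.random((50, 50)) for _ in range(3)]
print(fuse_responses(*maps))
```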
Impact of quality circles for improvement of asthma care: results of a randomized controlled trial
Schneider, Antonius; Wensing, Michel; Biessecker, Kathrin; Quinzler, Renate; Kaufmann-Kolle, Petra; Szecsenyi, Joachim
2008-01-01
Rationale and aims Quality circles (QCs) are well established as a means of aiding doctors. New quality improvement strategies include benchmarking activities. The aim of this paper was to evaluate the efficacy of QCs for asthma care working either with general feedback or with an open benchmark. Methods Twelve QCs, involving 96 general practitioners, were organized in a randomized controlled trial. Six worked with traditional anonymous feedback and six with an open benchmark; both had guided discussion from a trained moderator. Forty-three primary care practices agreed to give out questionnaires to patients to evaluate the efficacy of QCs. Results A total of 256 patients participated in the survey, of whom 185 (72.3%) responded to the follow-up 1 year later. Use of inhaled steroids at baseline was high (69%) and self-management low (asthma education 27%, individual emergency plan 8%, and peak flow meter at home 21%). Guideline adherence in drug treatment increased (P = 0.19), and asthma steps improved (P = 0.02). Delivery of individual emergency plans increased (P = 0.008), and unscheduled emergency visits decreased (P = 0.064). There was no change in asthma education and peak flow meter usage. High medication guideline adherence was associated with reduced emergency visits (OR 0.24; 95% CI 0.07–0.89). Use of theophylline was associated with hospitalization (OR 7.1; 95% CI 1.5–34.3) and emergency visits (OR 4.9; 95% CI 1.6–14.7). There was no difference between traditional and benchmarking QCs. Conclusions Quality circles working with individualized feedback are effective at improving asthma care. The trial may have been underpowered to detect specific benchmarking effects. Further research is necessary to evaluate strategies for improving the self-management of asthma patients. PMID:18093108
International health IT benchmarking: learning from cross-country comparisons.
Zelmer, Jennifer; Ronchi, Elettra; Hyppönen, Hannele; Lupiáñez-Villanueva, Francisco; Codagnone, Cristiano; Nøhr, Christian; Huebner, Ursula; Fazzalari, Anne; Adler-Milstein, Julia
2017-03-01
To pilot benchmark measures of health information and communication technology (ICT) availability and use to facilitate cross-country learning. A prior Organization for Economic Cooperation and Development-led effort involving 30 countries selected and defined functionality-based measures for availability and use of electronic health records, health information exchange, personal health records, and telehealth. In this pilot, an Organization for Economic Cooperation and Development Working Group compiled results for 38 countries for a subset of measures with broad coverage using new and/or adapted country-specific or multinational surveys and other sources from 2012 to 2015. We also synthesized country learnings to inform future benchmarking. While electronic records are widely used to store and manage patient information at the point of care (all but 2 pilot countries reported use by at least half of primary care physicians, and many had rates above 75%), patient information exchange across organizations/settings is less common. Large variations in the availability and use of telehealth and personal health records also exist. Pilot participation demonstrated interest in cross-national benchmarking. Using the most comparable measures available to date, it showed substantial diversity in health ICT availability and use in all domains. The project also identified methodological considerations (e.g., structural and health systems issues that can affect measurement) important for future comparisons. While health policies and priorities differ, many nations aim to increase access, quality, and/or efficiency of care through effective ICT use. By identifying variations and describing key contextual factors, benchmarking offers the potential to facilitate cross-national learning and accelerate the progress of individual countries. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association.
Development of High Toughness Sheet and Extruded Products for Airplane Fuselage Structures
NASA Astrophysics Data System (ADS)
Magnusen, P. E.; Mooy, D. C.; Yocum, L. A.; Rioja, R. J.
High specific ultimate strength and high plane-stress fracture toughness are primary requirements of aircraft fuselage skins. The performance of alloys/products used in high performance fuselage applications is first reviewed. The specific fracture toughness of products such as 2017-T3, 2024-T3, 2524-T3 and 6013-T6 is discussed as a function of their composition and microstructure. Then the performance of modern Al-Li alloys/products such as 2199 and 2060 sheet and 2099 and 2055 extrusions is examined. It is concluded that Li-containing alloys/products offer significant improvements over conventional non-Li fuselage products because of the optimization of strengthening precipitates and grain microstructures. The role of chemical composition on the resulting microstructures is discussed.
NASA Astrophysics Data System (ADS)
Zhou, Ping; Beeh, Elmar; Friedrich, Horst E.
2016-03-01
Magnesium alloys are promising materials for lightweight design in the automotive industry due to their high strength-to-mass ratio. This study investigates the influence of tension-compression asymmetry on the radius of curvature and energy absorption capacity of AZ31B-O magnesium alloy sheets in bending. The mechanical properties were characterized using tension, compression, and three-point bending tests. The material exhibits significant tension-compression asymmetry in terms of strength and strain hardening rate due to extension twinning in compression. The compressive yield strength is much lower than the tensile yield strength, while the strain hardening rate is much higher in compression. Furthermore, tension-compression asymmetry in terms of the r value (Lankford value) was also observed: the r value in tension is much higher than that in compression. The bending results indicate that the AZ31B-O sheet can outperform steel and aluminum sheets in terms of specific energy absorption in bending, mainly due to its low density. In addition, the AZ31B-O sheet deformed with a larger radius of curvature than the steel and aluminum sheets, which benefits energy absorption capacity. Finally, finite element simulation of three-point bending was performed using LS-DYNA, and the results confirmed that the larger radius of curvature of a magnesium specimen is mainly attributed to the high strain hardening rate in compression.
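For reference, the specific energy absorption used to compare sheets in bending is commonly defined as the absorbed energy per unit specimen mass, with F(s) the bending force over punch displacement s (a standard definition, assumed here rather than quoted from the paper):

```latex
\[
\mathrm{SEA} \;=\; \frac{E_{\mathrm{abs}}}{m}
\;=\; \frac{1}{m}\int_{0}^{s_{\max}} F(s)\,\mathrm{d}s
\]
```

Dividing by mass is what lets the low-density magnesium sheet outperform steel and aluminum even when its absolute absorbed energy is comparable or lower.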
NASA Astrophysics Data System (ADS)
Balter, A.; Bromley, G. R.; Balco, G.; Thomas, H.; Jackson, M. S.
2017-12-01
Ice-free areas at high elevation in the central Transantarctic Mountains preserve extensive moraine sequences and drift deposits that constitute a geologic record of former East Antarctic Ice Sheet thickness and extent. We are applying cosmogenic-nuclide exposure dating to determine the ages of these moraine sequences at Roberts Massif and Otway Massif, at the heads of the Shackleton and Beardmore Glaciers, respectively. Moraines at these sites are for the most part openwork boulder belts characteristic of deposition by cold-based ice, which is consistent with present climate and glaciological conditions. To develop our chronology, we collected samples from 30 distinct ice-marginal landforms and have so far measured >100 3He, 10Be, and 21Ne exposure ages. Apparent exposure ages range from 1 to 14 Ma, showing that these landforms record glacial events between the middle Pleistocene and the middle Miocene. These data show that the thickness of the East Antarctic Ice Sheet in this region was similar to or greater than present for long periods between the middle Miocene and today. The time range represented by these moraine sequences indicates that they may also provide direct geologic evidence of East Antarctic Ice Sheet behavior during past periods of warmer-than-present climate, specifically the Miocene and Pliocene. As the East Antarctic Ice Sheet is the largest ice sheet on Earth, understanding its sensitivity to warm-climate conditions is critical for projections of ice sheet behavior and sea-level rise in future warm climates.
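For context, in the simplest case (constant production, no erosion or burial) the exposure age for a radioactive nuclide such as 10Be follows from N = (P/lambda)(1 - e^(-lambda*t)). A hedged Python sketch with hypothetical values; a real Antarctic calculation must also treat production-rate scaling, erosion, and possible burial:

```python
import math

def be10_exposure_age_yr(N_atoms_g, P_atoms_g_yr, half_life_yr=1.387e6):
    """Simple-exposure age from a 10Be concentration, assuming constant
    production and zero erosion/burial: t = -ln(1 - lam*N/P) / lam."""
    lam = math.log(2.0) / half_life_yr        # decay constant [1/yr]
    ratio = lam * N_atoms_g / P_atoms_g_yr
    if ratio >= 1.0:                          # at/above secular equilibrium
        return float("inf")
    return -math.log(1.0 - ratio) / lam

# Hypothetical high-elevation sample: N = 5e7 atoms/g, P = 50 atoms/g/yr
print(f"{be10_exposure_age_yr(5e7, 50.0) / 1e6:.2f} Myr")   # ~1.39 Myr
```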
On the Magnetospheric Heating Problem
NASA Astrophysics Data System (ADS)
Nykyri, K.; Moore, T.; Dimmock, A. P.; Ma, X.; Johnson, J.; Delamere, P. A.
2016-12-01
In the Earth's magnetosphere, the specific entropy increases by approximately two orders of magnitude in the transition from the magnetosheath into the magnetosphere. However, the origin of this non-adiabatic heating is not well understood. In addition, there exists a dawn-dusk temperature asymmetry in the flanks of the plasma sheet: the cold-component ions are 30-40% hotter in the dawnside plasma sheet than in the duskside plasma sheet. Our recent statistical study of magnetosheath temperatures using 7 years of THEMIS data indicates that ion magnetosheath temperatures downstream of the quasi-parallel bow shock (the dawn flank for Parker-spiral IMF) are only about 15% higher than those downstream of the quasi-perpendicular shock. This magnetosheath temperature asymmetry is therefore inadequate to produce the observed level of plasma sheet temperature asymmetry. In this presentation we address the origin of non-adiabatic heating from the magnetosheath into the plasma sheet by utilizing small Cluster spacecraft separations, 9 years of statistical THEMIS data, and Hall-MHD and hybrid simulations. We present evidence of a new physical mechanism capable of cross-scale energy transport at the flank magnetopause with strong contributions to the non-adiabatic heating observed between the magnetosheath and the plasma sheet. The same heating mechanism may operate and drive asymmetries in the magnetospheres of the gas giants Jupiter and Saturn, and may play a role elsewhere in the universe where significant flow shears are present, such as in the solar corona and in other astrophysical and laboratory plasmas.
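The specific entropy invoked here is commonly evaluated from plasma moments as S = T/n^(gamma-1) with gamma = 5/3. A minimal sketch with assumed, order-of-magnitude values (not the THEMIS statistics themselves) reproduces the roughly two-orders-of-magnitude jump:

```python
def specific_entropy(T_eV, n_cm3, gamma=5.0 / 3.0):
    """Specific entropy proxy S = T / n**(gamma - 1); for T in eV and
    n in cm^-3 with gamma = 5/3, S is in eV cm^2."""
    return T_eV / n_cm3 ** (gamma - 1.0)

# Assumed illustrative values, not measured statistics:
S_sheath = specific_entropy(200.0, 10.0)    # magnetosheath: ~43 eV cm^2
S_sheet = specific_entropy(4000.0, 0.3)     # plasma sheet: ~8900 eV cm^2
print(f"ratio ~ {S_sheet / S_sheath:.0f}")  # ~200x, i.e. ~2 orders
```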
Inspiration & Insight - a tribute to Niels Reeh
NASA Astrophysics Data System (ADS)
Ahlstrom, A. P.; Vieli, A.
2009-12-01
Niels Reeh was highly regarded for his contributions to glaciology, specifically through his rigorous combination of numerical modelling and field observations. In 1966 he began his work on the application of beam mechanics to floating glaciers and ice shelves, and throughout his life Niels retained a strong interest in modelling glacier dynamics. In the early 1980s Niels developed a 3D model for ice sheets, and in the late 1980s an advanced flow-line model. Niels Reeh also took part in the early ice-core drilling efforts in Greenland and later pioneered the concept of retrieving similar records from the surface of the ice-sheet margin. The mass balance of glaciers and ice sheets was another theme of Niels Reeh's research, with a number of important contributions and insights still used when teaching the subject to students. Niels developed elegant models for ablation and snow densification, notable for their applicability in large-scale ice-sheet models, and studied the impact of climate change on ice sheets and glaciers. He also carried his interest in ice dynamics and mass balance into remote sensing, working successfully on methods to utilize radar and laser data from airborne surveys and satellites in glaciology. In this, he pioneered the combination of field experiments, satellite observations and numerical modelling to solve problems on the Greenland Ice Sheet. In this presentation we attempt to provide an overview of Niels Reeh's many-faceted career in acknowledgement of his contributions to the field of glaciology.
Sun, Jin; Dong, Zhiwei; Zhang, Yang; He, Xiaoning; Fei, Dongdong; Jin, Fang; Yuan, Lin; Li, Bei; Jin, Yan
2017-07-12
An inflammatory microenvironment alters epigenetic modification in periodontal ligament stem cells derived from periodontitis tissues (P-PDLSCs), which results in defective osteogenic differentiation compared with cells from healthy tissues. There is an urgent need to explore therapeutic strategies aimed at epigenetic targets associated with the regenerative ability of PDLSCs. Osthole, a small-molecule compound extracted from Chinese herbs, has been documented to promote osteogenesis and cell sheet formation in healthy PDLSCs. However, whether Osthole has the same effect on P-PDLSCs, and the mechanism of any such promotive effect, remained unknown. The purpose of this study was to determine whether Osthole could restore the defective osteogenic differentiation of P-PDLSCs via epigenetic modification. We found that 10^-7 mol/L was the optimal Osthole concentration for osteogenic differentiation and proliferation of P-PDLSCs. Mechanistically, we also found that Osthole upregulated MOZ and MORF, histone acetyltransferases that specifically catalyze acetylation of histone 3 lysine 9 (H3K9) and histone 3 lysine 14 (H3K14), which are key regulators of osteogenic differentiation in P-PDLSCs. Furthermore, Osthole treatment improved cell sheet formation and enhanced the bone formation of PDLSC sheets in animal models of periodontitis. Our study suggests that Osthole is a promising drug for treating periodontitis by regulating epigenetic modification in cell sheet engineering.
The microbiome of glaciers and ice sheets.
Anesio, Alexandre M; Lutz, Stefanie; Chrismas, Nathan A M; Benning, Liane G
2017-01-01
Glaciers and ice sheets, like other biomes, occupy a significant area of the planet and harbour biological communities with distinct interactions and feedbacks with their physical and chemical environment. In the glacial biome, biological processes are dominated almost exclusively by microbial communities. Habitats on glaciers and ice sheets with enough liquid water to sustain microbial activity include snow, surface ice, cryoconite holes, englacial systems and the interface between ice and overridden rock/soil. There is a remarkable similarity between the specific glacial habitats across glaciers and ice sheets worldwide, particularly regarding their main primary producers and ecosystem engineers. At the surface, cyanobacteria dominate carbon production in aquatic/sediment systems such as cryoconite holes, while eukaryotic Zygnematales and Chlamydomonadales dominate ice surfaces and snow, respectively. Microbially driven chemolithotrophic processes associated with the sulphur and iron cycles, together with carbon transformations, drive the chemical reactions at the rock interface beneath the ice and underpin an important mechanism for the delivery of nutrients to downstream ecosystems. In this review, we focus on the main ecosystem engineers of glaciers and ice sheets and how they interact with their chemical and physical environment. We then discuss the implications of the microbial activity of this icy microbiome for the biogeochemistry of downstream ecosystems.
A laboratory means to produce tough aluminum sheet from powder
NASA Technical Reports Server (NTRS)
Singleton, O. R.; Royster, D. M.; Thomas, J. R.
1990-01-01
The rapid solidification of aluminum alloys as powder and the subsequent fabrication processes can be used to develop and tailor alloys to satisfy specific aerospace design requirements, including high strength and toughness. Laboratory procedures to produce aluminum powder-metallurgy (PM) materials are efficient, but evidence is required that the laboratory methods used can produce a product with superior properties. This paper describes laboratory equipment and procedures which can be used to produce tough aluminum PM sheet, using the processing of a 2124 + 0.9 percent Zr aluminum alloy powder as an example. The fully hardened sheet product is evaluated in terms of properties and microstructure, and the key features of the vacuum hot-pressing operation used to consolidate the powder are described. The 2124 + 0.9 percent Zr sheet produced, in the T8 temper, was both strong (460-490 MPa yield strength) and tough (Kahn tear unit-propagation-energy values over three times those typical of ingot-metallurgy 2024-T81). Both the longitudinal and longitudinal-transverse directions of the sheet were tested. The microstructure was well refined, with subgrains of one or two micrometers. Fine dispersoids of Al3Zr in the precipitate-free regions adjacent to boundaries are believed to contribute to the improved toughness.
Huang, Yi Fu; Ruan, Wen Hong; Lin, Dong Ling; Zhang, Ming Qiu
2017-01-11
Replacing a conventional electrolyte with a redox electrolyte has provided an intriguing new route to extending battery life. The efficiency with which the redox species (RS) contained in the electrolyte are utilized can benefit from increasing the specific surface area of the battery electrodes, i.e., from the electrode side of the electrode-electrolyte interface, but is not limited to that. Herein, a new strategy using a nanocomposite electrolyte is proposed to enlarge the interface with the aid of nanoinclusions from the electrolyte side. To do this, graphene oxide (GO) sheets are first dispersed in an electrolyte solution of tungstosilicic salt/lithium sulfate/poly(vinyl alcohol) (SiWLi/Li2SO4/PVA), and the sheets are then bridged to the electrode by casting and evaporating the solution on the electrode surface. In situ conductive atomic force microscopy and Raman spectroscopy confirm that the GO sheets doped with the SiWLi/Li2SO4 RS can be bridged and electrically reduced to form an extended electrode-electrolyte interface. As a result, the RS-coated GO sheets bridged to LiTi2(PO4)3//LiMn2O4 battery electrodes are found to deliver extra energy capacity (∼30 mAh/g) with excellent electrochemical cycling stability, which successfully extends the battery life by over 50%.
Property Criteria for Automotive Al-Mg-Si Sheet Alloys
Prillhofer, Ramona; Rank, Gunther; Berneder, Josef; Antrekowitsch, Helmut; Uggowitzer, Peter J.; Pogatscher, Stefan
2014-01-01
In this study, property criteria for automotive Al-Mg-Si sheet alloys are outlined and investigated in the context of the commercial alloys AA6016, AA6005A, AA6063 and AA6013. The parameters crucial to predicting forming behavior were determined by tensile tests, bending tests, cross-die tests, hole-expansion tests and forming limit curve analysis in the pre-aged temper after various storage periods following sheet production. Roping tests were performed to evaluate surface quality with a view to deploying these alloys as outer panel material. Strength in service was also tested after a simulated paint-bake cycle of 20 min at 185 °C, and the corrosion behavior was analyzed. The study showed that forming behavior depends strongly on the type of alloy and is influenced by the storage period after sheet production. Alloy AA6016 achieves the highest surface quality, and pre-ageing of alloy AA6013 facilitates superior strength in service. Corrosion behavior is good in AA6005A, AA6063 and AA6016; only AA6013 shows a strong susceptibility to intergranular corrosion. The results are discussed with respect to the chemical composition, microstructure and texture of the Al-Mg-Si alloys studied, and decision-making criteria for selecting automotive sheet alloys for specific applications are presented.
Stress focusing and collapse of a thin film under constant pressure
NASA Astrophysics Data System (ADS)
Hamm, Eugenio; Cabezas, Nicolas
2012-02-01
Thin elastic sheets and shells are prone to focusing stress when forced, owing to their near inextensibility. Singular structures such as ridges, vertices, and folds arising from wrinkles are characteristic of the deformation of such systems. Usually the forcing is exerted at the boundaries or at specific points of the surface, in displacement-controlled experiments. Much of the phenomenology of stress focusing also appears at micro- and nanoscales, in physics and in biology, making it universal. We consider the post-buckling regime of a thin elastic sheet subjected to a constant distributed force normal to its surface. Specifically, we present experiments on thin elastoplastic sheets that collapse under atmospheric pressure. For instance, in vacuum-sealing technology, when a flat plastic bag is forced to wrap a solid volume, a series of self-contacts and folds develops. The unfolded bag shows a pattern of scars whose structure is determined by the geometry of the volume and by the exact way the bag adhered to the surface through friction. Inspired by this everyday example, we study the geometry of the folds that result from collapsing a hermetic bag onto regular rigid bodies.
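The near-inextensibility argument can be made quantitative with the Foeppl-von Karman number, gamma = Y*L^2/B, the ratio of the stretching stiffness Y = E*t to the bending stiffness B = E*t^3/(12*(1-nu^2)) over a sheet of size L; large gamma means bending is far cheaper than stretching, so deformation focuses into ridges and vertices. A sketch with hypothetical plastic-bag dimensions:

```python
def fvk_number(E_Pa, t_m, L_m, nu=0.3):
    """Foeppl-von Karman number gamma = Y*L^2/B, with stretching
    stiffness Y = E*t and bending stiffness B = E*t^3/(12*(1-nu^2)).
    Equivalent to 12*(1 - nu^2)*(L/t)^2, independent of E."""
    Y = E_Pa * t_m
    B = E_Pa * t_m**3 / (12.0 * (1.0 - nu**2))
    return Y * L_m**2 / B

# Hypothetical vacuum-bag film: t = 50 um, lateral size L = 0.3 m
print(f"gamma ~ {fvk_number(2e9, 50e-6, 0.3):.1e}")   # ~4e8: deep focusing
```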
Vadiyar, Madagonda M; Liu, Xudong; Ye, Zhibin
2018-05-14
In the present work, we demonstrate the synthesis of porous activated carbon (specific surface area of 1883 m2 g-1), Fe3O4 nanoparticles, and carbon-Fe3O4 nanocomposites from local waste thermocol sheets and rusted iron wires. The resulting carbon, Fe3O4 nanoparticles, and carbon-Fe3O4 composites are used as electrode materials for supercapacitor applications. In particular, the C-Fe3O4 composite electrodes exhibit a high specific capacitance of 1375 F g-1 at 1 A g-1 and long-term cyclic stability, with 98% capacitance retention over 10,000 cycles. An asymmetric supercapacitor, i.e., a C-Fe3O4//Ni(OH)2/CNT device, exhibits a high energy density of 91.1 Wh kg-1 and remarkable cyclic stability, again showing 98% capacitance retention over 10,000 cycles. This work thus has important implications not only for the fabrication of low-cost electrodes for high-performance supercapacitors but also for the recycling of waste thermocol sheets and rusted iron wires for value-added reuse.
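The headline numbers here follow from the standard galvanostatic relations C = I*dt/(m*dV) and E = 0.5*C*V^2 (converted from J/g to Wh/kg by dividing by 3.6). A short sketch with assumed discharge parameters chosen to reproduce the quoted single-electrode capacitance; the device-level 91.1 Wh/kg figure is computed on total device mass and the device's own voltage window, neither of which is given here:

```python
def specific_capacitance_F_g(I_A, dt_s, m_g, dV_V):
    """Specific capacitance from a galvanostatic discharge:
    C = I * dt / (m * dV), in F/g."""
    return I_A * dt_s / (m_g * dV_V)

def energy_density_Wh_kg(C_F_g, V):
    """E = 0.5 * C * V^2, converted from J/g to Wh/kg (factor 3.6).
    For a full device, C and mass must refer to both electrodes."""
    return 0.5 * C_F_g * V**2 / 3.6

# Assumed discharge: 10 mA on a 10 mg electrode (1 A/g), 1 V window,
# 1375 s discharge time -- chosen to match the quoted 1375 F/g.
C = specific_capacitance_F_g(0.010, 1375.0, 0.010, 1.0)
print(f"C = {C:.0f} F/g at 1 A/g")
print(f"E = {energy_density_Wh_kg(C, 1.0):.0f} Wh/kg (electrode basis)")
```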
Adaptive scallop height tool path generation for robot-based incremental sheet metal forming
NASA Astrophysics Data System (ADS)
Seim, Patrick; Möllensiep, Dennis; Störkle, Denis Daniel; Thyssen, Lars; Kuhlenkötter, Bernd
2016-10-01
Incremental sheet metal forming is an emerging process for the production of individualized products or prototypes in low batch sizes and with short times to market. In these processes, the desired shape is produced by the incremental inward motion of the workpiece-independent forming tool in the depth direction and its movement along the contour in the lateral direction. Because the shape is produced in this way, the tool path generation is a key factor in, e.g., the resulting geometric accuracy, the surface quality, and the working time. This paper presents an innovative tool path generation approach, based on a commercial milling CAM package, that considers both surface quality and working time. The approach offers the ability to define a specific scallop height, as an indicator of surface quality, for specific faces of a component. Moreover, it decreases the working time required to produce the entire component compared with a commercial software package lacking this adaptive capability. Various forming experiments were performed to verify the newly developed tool path generation. Above all, the approach resolves the existing trade-off between working time and surface quality in incremental sheet metal forming.
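The scallop height that such an approach controls is set by simple tool-path geometry: for a hemispherical tool of radius R and lateral stepover s on a locally flat face, h = R - sqrt(R^2 - (s/2)^2). Inverting this gives the stepover needed for a user-specified scallop height, which is the core of any adaptive scheme of this kind. A hedged sketch; the tool radius and target heights below are assumptions, not the paper's values:

```python
import math

def stepover_for_scallop(R_mm, h_mm):
    """Lateral stepover that leaves scallop height h for a hemispherical
    tool of radius R on a locally flat face:
    h = R - sqrt(R^2 - (s/2)^2)  =>  s = 2*sqrt(h*(2R - h))."""
    return 2.0 * math.sqrt(h_mm * (2.0 * R_mm - h_mm))

# Assumed forming tool, R = 5 mm; per-face target scallop heights:
for h in (0.002, 0.010, 0.050):        # finer faces get smaller h [mm]
    print(f"h = {h} mm -> stepover = {stepover_for_scallop(5.0, h):.3f} mm")
```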
Introduction to the IWA task group on biofilm modeling.
Noguera, D R; Morgenroth, E
2004-01-01
An International Water Association (IWA) Task Group on Biofilm Modeling was created with the purpose of comparatively evaluating different biofilm modeling approaches. The task group developed three benchmark problems for this comparison and used a diversity of modeling techniques that included analytical, pseudo-analytical, and numerical solutions to the biofilm problems. Models in one-, two-, and three-dimensional domains were also compared. The first benchmark problem (BM1) described a monospecies biofilm growing in a completely mixed reactor environment and had the purpose of comparing the ability of the models to predict substrate fluxes and concentrations for a biofilm system of fixed total biomass and fixed biomass density. The second problem (BM2) represented a situation in which substrate mass transport by convection was influenced by the hydrodynamic conditions of the liquid in contact with the biofilm. The third problem (BM3) was designed to compare the ability of the models to simulate multispecies and multisubstrate biofilms. These three benchmark problems allowed identification of the specific advantages and disadvantages of each modeling approach. A detailed presentation of the comparative analyses for each problem is provided elsewhere in these proceedings.
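To give a flavor of what a numerical solution to a BM1-style problem involves, the sketch below relaxes the steady 1D diffusion-reaction equation D*S'' = k*S/(K+S) across a flat biofilm of fixed thickness, with a no-flux substratum and a fixed surface concentration, and reports the substrate flux. All parameter values are assumptions for illustration, not the task group's benchmark inputs:

```python
import numpy as np

# Assumed illustrative parameters (not the BM1 specification):
D, k, K = 2.0e-9, 1.0e-3, 0.5       # m^2/s, kg/(m^3 s), kg/m^3
L, n = 200e-6, 101                  # biofilm thickness [m], grid points
S_bulk = 1.0                        # surface substrate conc. [kg/m^3]
dz = L / (n - 1)

S = np.full(n, S_bulk)
for _ in range(20000):              # Gauss-Seidel-style relaxation
    S_prev = S.copy()
    for i in range(1, n - 1):
        r = k * S[i] / (K + S[i])   # local Monod reaction rate
        S[i] = 0.5 * (S[i - 1] + S[i + 1] - dz**2 * r / D)
    S[0] = S[1]                     # no-flux condition at the substratum
    S[-1] = S_bulk                  # fixed concentration at the surface
    if np.max(np.abs(S - S_prev)) < 1e-12:
        break

flux = D * (S[-1] - S[-2]) / dz     # flux into the biofilm [kg/(m^2 s)]
print(f"substrate flux ~ {flux:.3e} kg/(m^2 s)")
```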