Science.gov

Sample records for addition verification studies

  1. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study, which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data for validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  2. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: ENVIROFUELS DIESEL FUEL CATALYZER FUEL ADDITIVE

    EPA Science Inventory

    EPA's Environmental Technology Verification Program has tested EnviroFuels diesel fuel additive, called the Diesel Fuel Catalyzer. EnviroFuels has stated that heavy-duty on and off road diesel engines are the intended market for the catalyzer. Preliminary tests conducted indicate...

  3. Structural dynamics verification facility study

    NASA Technical Reports Server (NTRS)

    Kiraly, L. J.; Hirchbein, M. S.; Mcaleese, J. M.; Fleming, D. P.

    1981-01-01

    The need for a structural dynamics verification facility to support structures programs was studied. Most of the industry-operated facilities are used for highly focused research, component development, and problem solving, and are not used for developing a generic understanding of the coupled dynamic response of major engine subsystems. Capabilities for the proposed facility include: the ability to both excite and measure coupled structural dynamic response of elastic blades on elastic shafting, the mechanical simulation of various dynamical loadings representative of those seen in operating engines, and the measurement of engine dynamic deflections and interface forces caused by alternative engine mounting configurations and compliances.

  4. Environmental Technology Verification Report: Taconic Energy, Inc. TEA Fuel Additive

    EPA Science Inventory

    The Greenhouse Gas Technology Center (GHG Center) is one of six verification organizations operating under EPA’s ETV program. One sector of significant interest to GHG Center stakeholders is transportation - particularly technologies that result in fuel economy improvements. Taco...

  5. Expert system verification and validation study

    NASA Technical Reports Server (NTRS)

    French, Scott W.; Hamilton, David

    1992-01-01

    Five workshops on verification and validation (V&V) of expert systems (ES) were taught during this recent period of performance. Two key activities, previously performed under this contract, supported these recent workshops: (1) a survey of the state-of-the-practice of V&V of ES and (2) development of workshop material and the first class. The first activity involved performing an extensive survey of ES developers in order to answer several questions regarding the state-of-the-practice in V&V of ES. These questions related to the amount and type of V&V done and how successful this V&V was. The next key activity involved developing an intensive hands-on workshop in V&V of ES. This activity involved surveying a large number of V&V techniques, conventional as well as ES-specific ones. In addition to explaining the techniques, we showed how each technique could be applied to a sample problem. References were included in the workshop material, and cross-referenced to techniques, so that students would know where to go to find additional information about each technique. In addition to teaching specific techniques, we included an extensive amount of material on V&V concepts and how to develop a V&V plan for an ES project. We felt this material was necessary so that developers would be prepared to develop an orderly and structured approach to V&V; that is, they would have a process that supported the use of the specific techniques. Finally, to provide hands-on experience, we developed a set of case study exercises. These exercises provided an opportunity for the students to apply all the material (concepts, techniques, and planning material) to a realistic problem.

  6. Shuttle payload interface verification equipment study. Volume 2: Technical document, part 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The technical analysis performed during the shuttle payload interface verification equipment study is reported. It describes: (1) the background and intent of the study; (2) the study approach and philosophy, covering all facets of shuttle payload/cargo integration; (3) shuttle payload integration requirements; (4) preliminary design of the horizontal IVE; (5) the vertical IVE concept; and (6) IVE program development plans, schedule, and cost. Also included is a payload integration analysis task to identify potential uses in addition to payload interface verification.

  7. Learner Verification: A Publisher's Case Study.

    ERIC Educational Resources Information Center

    Wilson, George

    Learner verification, a process by which publishers monitor the effectiveness of their products and strive to improve their services to schools, is a practice that most companies take seriously. The quality of educational materials may be ensured in many ways: by analysis of sales, through firsthand investigation, and by employing a system of…

  8. Verification study of an emerging fire suppression system

    SciTech Connect

    Cournoyer, Michael E.; Waked, R. Ryan; Granzow, Howard N.; Gubernatis, David C.

    2016-01-01

    Self-contained fire extinguishers are a robust, reliable and minimally invasive means of fire suppression for gloveboxes. Moreover, plutonium gloveboxes present harsh environmental conditions for polymer materials; these include radiation damage and chemical exposure, both of which tend to degrade the lifetime of engineered polymer components. Several studies have been conducted to determine the robustness of self-contained fire extinguishers in plutonium gloveboxes; before deployment in a nuclear facility, verification tests must be performed. These tests include activation and mass loss calorimeter tests. In addition, compatibility issues with chemical components of the self-contained fire extinguishers need to be addressed. Our study presents activation and mass loss calorimeter test results. After extensive studies, no critical areas of concern have been identified for the plutonium glovebox application of Fire Foe™, except for glovebox operations that use large quantities of bulk plutonium or uranium metal, such as metal casting and pyro-chemistry operations.

  9. Verification study of an emerging fire suppression system

    DOE PAGES (Beta)

    Cournoyer, Michael E.; Waked, R. Ryan; Granzow, Howard N.; Gubernatis, David C.

    2016-01-01

    Self-contained fire extinguishers are a robust, reliable and minimally invasive means of fire suppression for gloveboxes. Moreover, plutonium gloveboxes present harsh environmental conditions for polymer materials; these include radiation damage and chemical exposure, both of which tend to degrade the lifetime of engineered polymer components. Several studies have been conducted to determine the robustness of self-contained fire extinguishers in plutonium gloveboxes; before deployment in a nuclear facility, verification tests must be performed. These tests include activation and mass loss calorimeter tests. In addition, compatibility issues with chemical components of the self-contained fire extinguishers need to be addressed. Our study presents activation and mass loss calorimeter test results. After extensive studies, no critical areas of concern have been identified for the plutonium glovebox application of Fire Foe™, except for glovebox operations that use large quantities of bulk plutonium or uranium metal, such as metal casting and pyro-chemistry operations.

  10. Systematic study of source mask optimization and verification flows

    NASA Astrophysics Data System (ADS)

    Ben, Yu; Latypov, Azat; Chua, Gek Soon; Zou, Yi

    2012-06-01

    Source mask optimization (SMO) has emerged as a powerful resolution enhancement technique (RET) for advanced technology nodes. However, there is a plethora of flows and verification metrics in the field, confounding the end user of the technique. A systematic study of the different flows, and of their possible unification, has been missing. This contribution is intended to reveal the pros and cons of different SMO approaches and verification metrics, understand their commonality and differences, and provide a generic guideline for RET selection via SMO. The paper discusses three types of variation that commonly arise in SMO, namely pattern preparation and selection, availability of a relevant OPC recipe for a freeform source, and the metrics used in source verification. Several pattern selection algorithms are compared and the advantages of systematic pattern selection algorithms are discussed. In the absence of a full resist model for SMO, an alternative SMO flow without a full resist model is reviewed. A preferred verification flow with quality metrics of DOF and MEEF is examined.

  11. Organics Verification Study for Sinclair and Dyes Inlets, Washington

    SciTech Connect

    Kohn, Nancy P.; Brandenberger, Jill M.; Niewolny, Laurie A.; Johnston, Robert K.

    2006-09-28

    Sinclair and Dyes Inlets near Bremerton, Washington, are on the State of Washington 1998 303(d) list of impaired waters because of fecal coliform contamination in marine water, metals in sediment and fish tissue, and organics in sediment and fish tissue. Because significant cleanup and source control activities have been conducted in the inlets since the data supporting the 1998 303(d) listings were collected, two verification studies were performed to address the 303(d) segments that were listed for metal and organic contaminants in marine sediment. The Metals Verification Study (MVS) was conducted in 2003; the final report, Metals Verification Study for Sinclair and Dyes Inlets, Washington, was published in March 2004 (Kohn et al. 2004). This report describes the Organics Verification Study that was conducted in 2005. The study approach was similar to the MVS in that many surface sediment samples were screened for the major classes of organic contaminants, and then the screening results and other available data were used to select a subset of samples for quantitative chemical analysis. Because the MVS was designed to obtain representative data on concentrations of contaminants in surface sediment throughout Sinclair Inlet, Dyes Inlet, Port Orchard Passage, and Rich Passage, aliquots of the 160 MVS sediment samples were used in the analysis for the Organics Verification Study. However, unlike metals screening methods, organics screening methods are not specific to individual organic compounds, and are not available for some target organics. Therefore, only the quantitative analytical results were used in the organics verification evaluation. The results of the Organics Verification Study showed that sediment quality outside of Sinclair Inlet is unlikely to be impaired because of organic contaminants. Similar to the results for metals, in Sinclair Inlet, the distribution of residual organic contaminants is generally limited to nearshore areas already within the

  12. Metals Verification Study for Sinclair and Dyes Inlets,Washington

    SciTech Connect

    Kohn, Nancy P.; Miller, Martin C.; Brandenberger, Jill M.; Johnston, Robert K.

    2004-09-29

    Sinclair and Dyes Inlets near Bremerton, Washington, are on the State of Washington's 1998 303(d) list of impaired waters because of fecal coliform contamination in marine water, metals in sediment and fish tissue, and organics in sediment and fish tissue. This Metals Verification Study was conducted to address the 303(d) segments that are listed for metal contaminants in marine sediment, because significant cleanup and source control activities have been conducted in the Inlets since the data supporting the 1998 303(d) listings were collected. The study was designed to obtain present-day sediment metals concentrations throughout Sinclair Inlet, Dyes Inlet, Port Orchard Passage, and Rich Passage, with stations spatially distributed to support 303(d) listing updates and also watershed-level water quality and contaminant transport modeling efforts. A total of 160 surface sediment samples from Sinclair Inlet, Dyes Inlet, Port Orchard Passage, and Rich Passage were screened for copper, lead, and zinc using X-ray fluorescence (XRF). Forty samples (25%) were selected for confirmatory metals analysis by ICP-MS for cadmium, silver, and arsenic in addition to copper, lead, and zinc. Regression relationships between the ICP-MS and XRF datasets were developed to estimate copper, lead, and zinc concentrations in all samples. The XRF results for copper, lead, and zinc correlated well with ICP-MS results, and predicted concentrations were calculated for all samples. The results of the Metals Verification Study show that sediment quality in Sinclair Inlet has improved markedly since implementation of cleanup and source control actions, and that the distribution of residual contaminants is limited to nearshore areas already within the actively managed Puget Sound Naval Shipyard Superfund Site, where further source control actions and monitoring are under way. Outside of Sinclair Inlet, the target metals met state sediment quality standards.
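
    The abstract above describes fitting regression relationships between XRF screening values and confirmatory ICP-MS results, then using the fit to predict concentrations in all samples. The following is a minimal sketch of that calibration step with synthetic data; the arrays, noise level, and coefficients are illustrative assumptions, not study values.

    ```python
    # Sketch of an XRF-to-ICP-MS regression calibration; data are synthetic.
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical paired measurements for one metal (e.g., copper), in mg/kg
    icpms = rng.uniform(20.0, 400.0, size=40)           # confirmatory ICP-MS
    xrf = 1.1 * icpms + rng.normal(0.0, 15.0, size=40)  # XRF screening values

    # Ordinary least squares: icpms ~ slope * xrf + intercept
    slope, intercept = np.polyfit(xrf, icpms, deg=1)
    predicted = slope * xrf + intercept

    # Coefficient of determination as a check on how well XRF tracks ICP-MS
    ss_res = np.sum((icpms - predicted) ** 2)
    ss_tot = np.sum((icpms - icpms.mean()) ** 2)
    r_squared = 1.0 - ss_res / ss_tot
    print(f"slope={slope:.3f}, intercept={intercept:.1f}, R^2={r_squared:.3f}")

    # The fitted line would then convert XRF readings of screened-only
    # samples into estimated ICP-MS-equivalent concentrations.
    ```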

  13. Verification, Validation and Sensitivity Studies in Computational Biomechanics

    PubMed Central

    Anderson, Andrew E.; Ellis, Benjamin J.; Weiss, Jeffrey A.

    2012-01-01

    Computational techniques and software for the analysis of problems in mechanics have naturally moved from their origins in the traditional engineering disciplines to the study of cell, tissue and organ biomechanics. Increasingly complex models have been developed to describe and predict the mechanical behavior of such biological systems. While the availability of advanced computational tools has led to exciting research advances in the field, the utility of these models is often the subject of criticism due to inadequate model verification and validation. The objective of this review is to present the concepts of verification, validation and sensitivity studies with regard to the construction, analysis and interpretation of models in computational biomechanics. Specific examples from the field are discussed. It is hoped that this review will serve as a guide to the use of verification and validation principles in the field of computational biomechanics, thereby improving the peer acceptance of studies that use computational modeling techniques. PMID:17558646

  14. Addition polyimide end cap study

    NASA Technical Reports Server (NTRS)

    St. Clair, T. L.

    1980-01-01

    The characterization of addition polyimides with various end caps for adhesive applications in 120-250 C environments is discussed. Oligomeric polyimides were prepared from 3,3',4,4'-benzophenone tetracarboxylic dianhydride and 3,3'-methylenedianiline and were end-capped with functionally reactive moieties which cause crosslinking when the oligomers are heated to 200-400 C. The syntheses of the oligomers are outlined. The thermolysis of the oligomers was studied by differential scanning calorimetry, and the resulting polymers were characterized by differential thermal analysis and adhesive performance. The adhesive data include lap shear strengths on titanium 6-4 adherends both before and after aging for 1000 hours at 121 C and/or 232 C.

  15. Infectivity model verification studies, annual report - 1981

    SciTech Connect

    McGrath, J.J.

    1982-01-01

    The infectivity model has been used as one of the leading indicators of the potential health effects that may be associated with energy-related pollutants including nitrogen dioxide (NO2), ozone, and diesel exhaust. The original studies with the infectivity model and chronic exposure to NO2 reported by Ehrlich and Henry (1968) have not been replicated. This report details the work that has been performed in Texas Tech's laboratory thus far in initiating a chronic NO2 exposure study to replicate the original work by Ehrlich and Henry, and reviews the preliminary results. At the end of the first contract year, a functioning inhalation facility with a capability to expose animals continuously to low levels of NO2 is in place. One group of animals has been exposed to NO2 for eight months and challenged with Klebsiella pneumoniae by inhalation. The results are similar to, but do not entirely replicate, those reported by Ehrlich and Henry. Two additional exposures have been initiated, and the animals will be challenged with the infectious agent in a bacterial infectivity chamber similar to that used by EPA.

  16. Advanced NSTS propulsion system verification study

    NASA Technical Reports Server (NTRS)

    Wood, Charles

    1989-01-01

    The merits of propulsion system development testing are discussed. The existing data base of technical reports and specialists is utilized in this investigation. The study encompassed a review of all available test reports of propulsion system development testing for the Saturn stages, the Titan stages, and the Space Shuttle main propulsion system. The knowledge on propulsion system development and system testing available from specialists and managers was also 'tapped' for inclusion.

  17. Verification Studies for Multi-Fluid Plasma Algorithms with Applications to Fast MHD Physics

    NASA Astrophysics Data System (ADS)

    Becker, Joe; Hakim, Ammar; Loverich, John; Stoltz, Peter

    2011-10-01

    In this paper we present a series of verification studies for finite volume algorithms in Nautilus, a numerical solver for fluid plasmas. Results include a set of typical Euler, Maxwell, MHD, and two-fluid benchmarks. In addition, results and algorithms for a set of hyperbolic gauge cleaning schemes that can be applied to the MHD and two-fluid systems using finite volume type methods will be presented. Finally, we move on to applications in field reversed configuration (FRC) plasmas.

  18. Metrology test object for dimensional verification in additive manufacturing of metals for biomedical applications.

    PubMed

    Teeter, Matthew G; Kopacz, Alexander J; Nikolov, Hristo N; Holdsworth, David W

    2015-01-01

    Additive manufacturing continues to increase in popularity and is being used in applications such as biomaterial ingrowth that requires sub-millimeter dimensional accuracy. The purpose of this study was to design a metrology test object for determining the capabilities of additive manufacturing systems to produce common objects, with a focus on those relevant to medical applications. The test object was designed with a variety of features of varying dimensions, including holes, cylinders, rectangles, gaps, and lattices. The object was built using selective laser melting, and the produced dimensions were compared to the target dimensions. Location of the test objects on the build plate did not affect dimensions. Features with dimensions less than 0.300 mm did not build or were overbuilt to a minimum of 0.300 mm. The mean difference between target and measured dimensions was less than 0.100 mm in all cases. The test object is applicable to multiple systems and materials, tests the effect of location on the build, uses a minimum of material, and can be measured with a variety of efficient metrology tools (including measuring microscopes and micro-CT). Investigators can use this test object to determine the limits of systems and adjust build parameters to achieve maximum accuracy. PMID:25542613

  19. Study of techniques for redundancy verification without disrupting systems, phases 1-3

    NASA Technical Reports Server (NTRS)

    1970-01-01

    The problem of verifying the operational integrity of redundant equipment and the impact of a requirement for verification on such equipment are considered. Redundant circuits are examined and the characteristics which determine adaptability to verification are identified. Mutually exclusive and exhaustive categories for verification approaches are established. The range of applicability of these techniques is defined in terms of signal characteristics and redundancy features. Verification approaches are discussed and a methodology for the design of redundancy verification is developed. A case study is presented which involves the design of a verification system for a hypothetical communications system. Design criteria for redundant equipment are presented. Recommendations for the development of technological areas pertinent to the goal of increased verification capabilities are given.

  20. Additional field verification of convective scaling for the lateral dispersion parameter

    SciTech Connect

    Sakiyama, S.K.; Davis, P.A.

    1988-07-01

    The results of a series of diffusion trials over the heterogeneous surface of the Canadian Precambrian Shield provide additional support for the convective scaling of the lateral dispersion parameter. The data indicate that under convective conditions, the lateral dispersion parameter can be scaled with the convective velocity scale and the mixing depth. 10 references.
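
    For context, convective scaling expresses the lateral dispersion parameter through the convective velocity scale w* and the mixing depth z_i, commonly via the dimensionless travel distance X = w*x/(U z_i). A small illustrative sketch follows; the functional form and the 0.6 coefficient are common textbook choices, not values fitted to these trials.

    ```python
    # Illustrative convective scaling of the lateral dispersion parameter.
    def sigma_y_convective(x, u_mean, w_star, z_i, a=0.6):
        """Lateral dispersion sigma_y (m) under convective scaling.

        x      -- downwind distance (m)
        u_mean -- mean transport wind speed (m/s)
        w_star -- convective velocity scale (m/s)
        z_i    -- mixing depth (m)
        a      -- empirical coefficient (dimensionless, assumed here)
        """
        X = w_star * x / (u_mean * z_i)  # dimensionless travel distance
        return a * X * z_i               # sigma_y scales with mixing depth

    # Example: 2 km downwind, 4 m/s wind, w* = 1.5 m/s, 1200 m mixed layer
    print(f"sigma_y = {sigma_y_convective(2000.0, 4.0, 1.5, 1200.0):.0f} m")
    ```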

  1. Study of space shuttle orbiter system management computer function. Volume 2: Automated performance verification concepts

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The findings are presented of investigations on concepts and techniques in automated performance verification. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.

  2. Verification and Validation Studies for the LAVA CFD Solver

    NASA Technical Reports Server (NTRS)

    Moini-Yekta, Shayan; Barad, Michael F.; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.

    2013-01-01

    The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described incorporating verification tests, validation benchmarks, continuous integration and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of 2D Euler equations, 3D Navier-Stokes equations as well as turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and ONERA M6 wing are validated against experimental and numerical data.
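
    To illustrate the MMS idea referenced above: choose an exact solution, derive the source term it implies, and confirm that the discretization error decays at the scheme's formal order under grid refinement. The toy example below applies this to a 1D Poisson problem with a second-order finite-difference scheme; it is a self-contained sketch, not the LAVA solver or its equations.

    ```python
    # Method of Manufactured Solutions on -u'' = f, u(0) = u(1) = 0.
    # Manufactured solution u(x) = sin(pi x) implies f(x) = pi^2 sin(pi x).
    import numpy as np

    def solve_poisson(n):
        h = 1.0 / (n + 1)
        x = np.linspace(h, 1.0 - h, n)
        f = np.pi**2 * np.sin(np.pi * x)
        # Second-order central-difference approximation of -u''
        A = (np.diag(2.0 * np.ones(n))
             - np.diag(np.ones(n - 1), 1)
             - np.diag(np.ones(n - 1), -1)) / h**2
        u = np.linalg.solve(A, f)
        return h, np.max(np.abs(u - np.sin(np.pi * x)))  # L-inf error

    h1, e1 = solve_poisson(32)
    h2, e2 = solve_poisson(64)
    # Observed order of accuracy; should approach 2 for this scheme
    p = np.log(e1 / e2) / np.log(h1 / h2)
    print(f"observed order of accuracy: {p:.2f}")
    ```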

  3. Overhead imaging for verification and peacekeeping: Three studies. Arms Control Verification Studies No. 6

    SciTech Connect

    Banner, A.V.

    1991-01-01

    This paper examines commercially available overhead remote sensing systems and their applications for international security. The paper describes the basic operating characteristics and features of commercially available systems, then uses two case studies to examine potential applications. In the first, imagery acquired during the Soviet withdrawal from Afghanistan in 1988 and 1989 is used to assess whether commercially available satellite imagery would be useful for monitoring large scale withdrawals of conventionally armed forces. In the second case study, imagery of selected sites in Namibia and Angola is used to examine whether such imagery could have supported United Nations peacekeeping operations in those countries. Potential applications of airborne remote sensing systems are also demonstrated using previously acquired imagery to show the kinds of results which could be obtained using commercially available systems.

  4. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM CASE STUDIES: DEMONSTRATING PROGRAM OUTCOMES

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This bookle...

  5. Environmental Technology Verification Program - ETV - Case Studies: Demonstrating Program Outcomes

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This cd con...

  6. Shuttle payload interface verification equipment study. Volume 2: Technical document. Part 2: Appendices

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Appendices to the shuttle payload integration study provide for: (1) The interface verification equipment hardware utilization list; (2) the horizontal IVE in-field assembly procedure; and (3) payload integration baseline functional flow block diagrams and options.

  7. Reactive system verification case study: Fault-tolerant transputer communication

    NASA Technical Reports Server (NTRS)

    Crane, D. Francis; Hamory, Philip J.

    1993-01-01

    A reactive program is one which engages in an ongoing interaction with its environment. A system which is controlled by an embedded reactive program is called a reactive system. Examples of reactive systems are aircraft flight management systems, bank automatic teller machine (ATM) networks, airline reservation systems, and computer operating systems. Reactive systems are often naturally modeled (for logical design purposes) as a composition of autonomous processes which progress concurrently and which communicate to share information and/or to coordinate activities. Formal (i.e., mathematical) frameworks for system verification are tools used to increase the users' confidence that a system design satisfies its specification. A framework for reactive system verification includes formal languages for system modeling and for behavior specification and decision procedures and/or proof-systems for verifying that the system model satisfies the system specifications. Using the Ostroff framework for reactive system verification, an approach to achieving fault-tolerant communication between transputers was shown to be effective. The key components of the design, the decoupler processes, may be viewed as discrete-event-controllers introduced to constrain system behavior such that system specifications are satisfied. The Ostroff framework was also effective. The expressiveness of the modeling language permitted construction of a faithful model of the transputer network. The relevant specifications were readily expressed in the specification language. The set of decision procedures provided was adequate to verify the specifications of interest. The need for improved support for system behavior visualization is emphasized.

  8. Verification of multi-proxy paleoclimatic studies: a case study

    NASA Astrophysics Data System (ADS)

    McIntyre, S.; McKitrick, R.

    2004-12-01

    Multi-proxy studies have been the primary means of transmitting paleoclimatic findings to public policy. For policy use, such studies should be replicable in the sense of King (1995). The best-known and most widely applied multi-proxy study is Mann, Bradley and Hughes (1998) ("MBH98") and its 1999 extension, which claimed to have exceptional "robustness" and "skill". We attempted to replicate MBH98 results and found, among other problems, that MBH98 methodology included two important unreported steps: (1) Subtraction of the 1902-1980 mean prior to principal components (PC) calculations (rather than, say, the 1400-1980 mean in the AD1400 step); (2) Extrapolation of a duplicate version of the Gaspé tree ring series. We show that high early 15th century values occur in important variations and that their results are not robust to the following: (1) Presence or absence of the extrapolation of 4 years at the beginning of the Gaspé tree ring series; (2) subtraction of the 1400-1980 mean rather than subtraction of the 1902-1980 mean, while using the same number of retained PC series in each step as MBH98; (3) the presence or absence of the North American PC4, while subtracting the 1400-1980 mean and using 5 PCs in the AD1400 step; (4) presence or absence of a small subset of high-altitude tree ring sites, mostly "strip bark" bristlecone pines, mostly collected by one researcher, Donald Graybill. The subtraction of the 1902-1980 mean dramatically inflates the role of the bristlecone pine sites, which then impart a distinctive hockey stick shape to the MBH98 PC1 and then to the NH temperature reconstruction. MBH98 claimed "skill" through apparently significant Reduction of Error (RE) statistics, reporting 0.51 in the AD1400 step, as compared to a reported 99 percent significance level of 0, which they calculated through simulations using red noise with low AR1 coefficients (0.2). We benchmarked a more realistic significance level by applying MBH98 PC methods to 10
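
    To make the centering step at issue concrete (independent of the actual proxy data), the sketch below computes a leading principal component after subtracting a short calibration-period mean versus the full-period mean. The proxy matrix is synthetic noise, used only to show the mechanics of the two conventions.

    ```python
    # Short-segment vs. full-period centering before principal components.
    import numpy as np

    rng = np.random.default_rng(1)
    years = np.arange(1400, 1981)
    proxies = rng.normal(size=(years.size, 50))  # 50 synthetic proxy series

    def leading_pc(data, center_mask):
        """Leading PC after centering on the masked subset of years."""
        centered = data - data[center_mask].mean(axis=0)
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        return centered @ vt[0]  # project onto the first loading vector

    full_mask = np.ones(years.size, dtype=bool)      # 1400-1980 mean
    calib_mask = (years >= 1902) & (years <= 1980)   # 1902-1980 mean

    # Series that trend over the short calibration window acquire inflated
    # apparent variance under short-segment centering and can dominate PC1.
    corr = np.corrcoef(leading_pc(proxies, full_mask),
                       leading_pc(proxies, calib_mask))[0, 1]
    print(f"correlation between the two PC1 variants: {corr:.2f}")
    ```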

  9. Additives

    NASA Technical Reports Server (NTRS)

    Smalheer, C. V.

    1973-01-01

    The chemistry of lubricant additives is discussed to show what the additives are chemically and what functions they perform in the lubrication of various kinds of equipment. Current theories regarding the mode of action of lubricant additives are presented. The additive groups discussed include the following: (1) detergents and dispersants, (2) corrosion inhibitors, (3) antioxidants, (4) viscosity index improvers, (5) pour point depressants, and (6) antifouling agents.

  10. Students' Verification Strategies for Combinatorial Problems

    ERIC Educational Resources Information Center

    Mashiach Eizenberg, Michal; Zaslavsky, Orit

    2004-01-01

    We focus on a major difficulty in solving combinatorial problems, namely, on the verification of a solution. Our study aimed at identifying undergraduate students' tendencies to verify their solutions, and the verification strategies that they employ when solving these problems. In addition, an attempt was made to evaluate the level of efficiency…

  11. Additional EIPC Study Analysis. Final Report

    SciTech Connect

    Hadley, Stanton W.; Gotham, Douglas J.; Luciani, Ralph L.

    2014-12-01

    Between 2010 and 2012 the Eastern Interconnection Planning Collaborative (EIPC) conducted a major long-term resource and transmission study of the Eastern Interconnection (EI). With guidance from a Stakeholder Steering Committee (SSC) that included representatives from the Eastern Interconnection States Planning Council (EISPC) among others, the project was conducted in two phases. Phase 1 involved a long-term capacity expansion analysis that included the creation of eight major futures plus 72 sensitivities. Three scenarios were selected for more extensive transmission-focused evaluation in Phase 2. Five power flow analyses, nine production cost model runs (including six sensitivities), and three capital cost estimations were developed during this second phase. The results from Phases 1 and 2 provided a wealth of data that could be examined further to address energy-related questions. A list of 14 topics was developed for further analysis. This paper brings together the earlier interim reports of the first 13 topics plus one additional topic into a single final report.

  12. VAMOS: The verification and monitoring options study: Current research options for in-situ monitoring and verification of contaminant remediation and containment within the vadose zone

    SciTech Connect

    Betsill, J.D.; Gruebel, R.D.

    1995-09-01

    The Verification and Monitoring Options Study Project (VAMOS) was established to identify high-priority options for future vadose-zone environmental research in the areas of in-situ remediation monitoring, post-closure monitoring, and containment emplacement and verification monitoring. VAMOS examined projected needs not currently being met with applied technology in order to develop viable monitoring and verification research options. The study emphasized a compatible systems approach to reinforce the need for utilizing compatible components to provide user friendly site monitoring systems. To identify the needs and research options related to vadose-zone environmental monitoring and verification, a literature search and expert panel forums were conducted. The search included present drivers for environmental monitoring technology, technology applications, and research efforts. The forums included scientific, academic, industry, and regulatory environmental professionals as well as end users of environmental technology. The experts evaluated current and future monitoring and verification needs, methods for meeting these needs, and viable research options and directions. A variety of high-priority technology development, user facility, and technology guidance research options were developed and presented as an outcome of the literature search and expert panel forums.

  13. Feasibility study on dosimetry verification of volumetric-modulated arc therapy-based total marrow irradiation.

    PubMed

    Liang, Yun; Kim, Gwe-Ya; Pawlicki, Todd; Mundt, Arno J; Mell, Loren K

    2013-01-01

    The purpose of this study was to develop dosimetry verification procedures for volumetric-modulated arc therapy (VMAT)-based total marrow irradiation (TMI). VMAT-based TMI plans were generated for three patients: one child and two adults. The planning target volume (PTV) was defined as the bony skeleton, from head to mid-femur, with a 3 mm margin. A plan strategy similar to published studies was adopted: the PTV was divided into head and neck, chest, and pelvic regions, with a separate plan for each region composed of 2-3 arcs/fields. Multiple isocenters were evenly distributed along the patient's axial direction. The focus of this study is to establish a dosimetry quality assurance procedure involving both two-dimensional (2D) and three-dimensional (3D) volumetric verification, which is desirable for a large PTV treated with multiple isocenters. The 2D dose verification was performed with film for gamma evaluation, and absolute point dose was measured with an ion chamber, with attention to hot/cold spots at the junctions between neighboring plans. The 3D volumetric dose verification used commercial dose reconstruction software to reconstruct dose from electronic portal imaging device (EPID) images. The gamma evaluation criteria in both 2D and 3D verification were 5% absolute point dose difference and 3 mm distance to agreement. With film dosimetry, the overall average gamma passing rate was 98.2% and the absolute dose difference was 3.9% in junction areas among the test patients; with volumetric portal dosimetry, the corresponding numbers were 90.7% and 2.4%. A dosimetry verification procedure involving both 2D and 3D was developed for VMAT-based TMI. The initial results are encouraging and warrant further investigation in clinical trials. PMID:23470926
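
    For reference, a gamma evaluation combines a dose-difference criterion and a distance-to-agreement (DTA) criterion into a single index; the passing rate is the fraction of points with gamma at most 1. The sketch below implements a simplified 1D global gamma using the 5%/3 mm criteria quoted above; clinical tools operate on 2D film or 3D reconstructed dose grids, and the profiles here are hypothetical.

    ```python
    # Simplified 1D gamma-index evaluation (5% dose difference, 3 mm DTA).
    import numpy as np

    def gamma_pass_rate(x, measured, reference, dose_tol=0.05, dta_mm=3.0):
        """Fraction of measured points with gamma <= 1 (global gamma)."""
        d_norm = dose_tol * reference.max()  # global dose normalization
        gammas = []
        for xi, di in zip(x, measured):
            # Combined dose/distance metric against every reference point;
            # gamma at this point is the minimum over the reference profile.
            g = np.sqrt(((x - xi) / dta_mm) ** 2
                        + ((reference - di) / d_norm) ** 2)
            gammas.append(g.min())
        return np.mean(np.asarray(gammas) <= 1.0)

    x = np.linspace(-50.0, 50.0, 201)                     # positions in mm
    reference = np.exp(-((x / 30.0) ** 2))                # idealized profile
    measured = 1.02 * np.exp(-(((x - 1.0) / 30.0) ** 2))  # shifted, rescaled
    print(f"gamma passing rate: {gamma_pass_rate(x, measured, reference):.1%}")
    ```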

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION--TEST REPORT OF MOBILE SOURCE EMISSION CONTROL DEVICES, FLINT HILLS RESOURCES, LP, CCD15010 DIESEL FUEL FORMULATION WITH HITEC4121 ADDITIVE

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  15. Hands-on Verification of Mechanics Training: A Cost-Effectiveness Study of Videodisc Simulation.

    ERIC Educational Resources Information Center

    Maher, Thomas G.

    This document reports the results of a study on the feasibility of training smog check mechanics in California via hands-on verification of mechanics' ability to inspect and repair vehicles. The reviews of the research literature that compare the learning effectiveness of different delivery media tend to support the position that in learning, the…

  16. ECOLOGICAL STUDIES AND MATHEMATICAL MODELING OF 'CLADOPHORA' IN LAKE HURON: 7. MODEL VERIFICATION AND SYSTEM RESPONSE

    EPA Science Inventory

    This manuscript describes the verification of a calibrated mathematical model designed to predict the spatial and temporal distribution of Cladophora about a point source of nutrients. The study site was located at Harbor Beach, Michigan, on Lake Huron. The model is intended to h...

  17. Verification of a New Biocompatible Single-Use Film Formulation with Optimized Additive Content for Multiple Bioprocess Applications

    PubMed Central

    Jurkiewicz, Elke; Husemann, Ute; Greller, Gerhard; Barbaroux, Magali; Fenge, Christel

    2014-01-01

    Single-use bioprocessing bags and bioreactors gained significant importance in the industry as they offer a number of advantages over traditional stainless steel solutions. However, there is continued concern that the plastic materials might release potentially toxic substances negatively impacting cell growth and product titers, or even compromise drug safety when using single-use bags for intermediate or drug substance storage. In this study, we have focused on the in vitro detection of potentially cytotoxic leachables originating from the recently developed new polyethylene (PE) multilayer film called S80. This new film was developed to guarantee biocompatibility for multiple bioprocess applications, for example, storage of process fluids, mixing, and cell culture bioreactors. For this purpose, we examined a protein-free cell culture medium that had been used to extract leachables from freshly gamma-irradiated sample bags in a standardized cell culture assay. We investigated sample bags from films generated to establish the operating ranges of the film extrusion process. Further, we studied sample bags of different age after gamma-irradiation and finally, we performed extended media extraction trials at cold room conditions using sample bags. In contrast to a nonoptimized film formulation, our data demonstrate no cytotoxic effect of the S80 polymer film formulation under any of the investigated conditions. The S80 film formulation is based on an optimized PE polymer composition and additive package. Full traceability alongside specifications and controls of all critical raw materials, and process controls of the manufacturing process, that is, film extrusion and gamma-irradiation, have been established to ensure lot-to-lot consistency. © 2014 American Institute of Chemical Engineers Biotechnol. Prog., 30:1171–1176, 2014 PMID:24850537

  18. Structure Property Studies for Additively Manufactured Parts

    SciTech Connect

    Milenski, Helen M.; Schmalzer, Andrew Michael; Kelly, Daniel

    2015-08-17

    Since the invention of modern Additive Manufacturing (AM) processes, engineers and designers have worked hard to capitalize on the unique building capabilities that AM allows. By being able to customize the interior fill of parts, it is now possible to design components with a controlled density and customized internal structure. The creation of new polymers and polymer composites allows for even greater control over the mechanical properties of AM parts. One of the key reasons to explore AM is to bring about a new paradigm in part design, where materials can be strategically optimized in a way that conventional subtractive methods cannot achieve. The two processes investigated in my research were the Fused Deposition Modeling (FDM) process and the Direct Ink Write (DIW) process. The objectives of the research were to determine the impact of in-fill density and morphology on the mechanical properties of FDM parts, and to determine if DIW printed samples could be produced where the filament diameter was varied while the overall density remained constant.

  19. Passive Tomography for Spent Fuel Verification: Analysis Framework and Instrument Design Study

    SciTech Connect

    White, Timothy A.; Svard, Staffan J.; Smith, Leon E.; Mozin, Vladimir V.; Jansson, Peter; Davour, Anna; Grape, Sophie; Trellue, H.; Deshmukh, Nikhil S.; Wittman, Richard S.; Honkamaa, Tapani; Vaccaro, Stefano; Ely, James

    2015-05-18

    The potential for gamma emission tomography (GET) to detect partial defects within a spent nuclear fuel assembly is being assessed through a collaboration of Support Programs to the International Atomic Energy Agency (IAEA). In the first phase of this study, two safeguards verification objectives have been identified. The first is the independent determination of the number of active pins that are present in the assembly, in the absence of a priori information. The second objective is to provide quantitative measures of pin-by-pin properties, e.g. activity of key isotopes or pin attributes such as cooling time and relative burnup, for the detection of anomalies and/or verification of operator-declared data. The efficacy of GET to meet these two verification objectives will be evaluated across a range of fuel types, burnups, and cooling times, and with a target interrogation time of less than 60 minutes. The evaluation of GET viability for safeguards applications is founded on a modelling and analysis framework applied to existing and emerging GET instrument designs. Monte Carlo models of different fuel types are used to produce simulated tomographer responses to large populations of “virtual” fuel assemblies. Instrument response data are processed by a variety of tomographic-reconstruction and image-processing methods, and scoring metrics specific to each of the verification objectives are defined and used to evaluate the performance of the methods. This paper will provide a description of the analysis framework and evaluation metrics, example performance-prediction results, and describe the design of a “universal” GET instrument intended to support the full range of verification scenarios envisioned by the IAEA.

  20. Aircraft surface coatings study: Verification of selected materials

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Three liquid coatings and four films that might improve and/or maintain the smoothness of transport aircraft surfaces are considered. Laboratory tests were performed on the liquid coatings (elastomeric polyurethanes) exposed to synthetic type hydraulic fluid, with and without a protective topcoat. Results of a 14-month flight service evaluation of coatings applied to the leading edges of an airline 727 were analyzed. Two additional airline service evaluations were initiated. Laboratory tests were conducted on the films, bonded to aluminum substrate with various adhesives, to determine the best film/adhesive combinations. A cost/benefits analysis was performed and recommendations made for future work toward the application of this technology to commercial transports.

  1. Statistical Design for Biospecimen Cohort Size in Proteomics-based Biomarker Discovery and Verification Studies

    PubMed Central

    Skates, Steven J.; Gillette, Michael A.; LaBaer, Joshua; Carr, Steven A.; Anderson, N. Leigh; Liebler, Daniel C.; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L.; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R.; Rodriguez, Henry; Boja, Emily S.

    2014-01-01

    Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC), with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance, and the development of an approach to calculate biospecimen sample size for proteomics studies in the discovery and verification stages prior to the clinical validation stage. This represents a first step towards building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research. PMID:24063748
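
    As a concrete instance of the kind of calculation such a statistical framework informs, the sketch below sizes a two-group comparison of mean biomarker levels using a normal approximation; the effect size, alpha, and power are illustrative choices, not recommendations from the workshop.

    ```python
    # Biospecimens per group for a two-sided, two-sample comparison.
    from math import ceil
    from scipy.stats import norm

    def n_per_group(effect_size, alpha=0.05, power=0.90):
        """Sample size per group to detect a standardized mean difference
        (Cohen's d) between cases and controls, z-test approximation."""
        z_alpha = norm.ppf(1.0 - alpha / 2.0)
        z_beta = norm.ppf(power)
        return ceil(2.0 * ((z_alpha + z_beta) / effect_size) ** 2)

    # Example: detect d = 0.5 at alpha = 0.05 with 90% power
    print(n_per_group(0.5))  # 85 biospecimens per group
    ```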

  2. Verification and Improvement of ERS-1/2 Altimeter Geophysical Data Records for Global Change Studies

    NASA Technical Reports Server (NTRS)

    Shum, C. K.

    2000-01-01

    This Final Technical Report summarizes the research work conducted under NASA's Physical Oceanography Program entitled, Verification and Improvement of ERS-1/2 Altimeter Geophysical Data Records for Global Change Studies, for the time period from January 1, 2000 through June 30, 2000. This report also provides a summary of the investigation from July 1, 1997 - June 30, 2000. The primary objectives of this investigation include verification and improvement of the ERS-1 and ERS-2 radar altimeter geophysical data records for distribution to the ESA-approved U.S. ERS-1/-2 investigators for global climate change studies. Specifically, the investigation is to verify and improve the ERS geophysical data record products by calibrating the instrument and assessing the accuracy of the ERS-1/-2 orbital, geophysical, media, and instrument corrections. The purpose is to ensure the consistency of constants, standards, and algorithms with the TOPEX/POSEIDON radar altimeter for global climate change studies, such as the monitoring and interpretation of long-term sea level change. This investigation has provided the current best precise orbits, with the radial orbit accuracy for ERS-1 (Phases C-G) and ERS-2 estimated at the 3-5 cm rms level, a 30-fold improvement compared to the 1993 accuracy. We have finalized the production and verification of the value-added data products for the ERS-1 mission (Phases A, B, C, D, E, F, and G), in collaboration with JPL PODAAC and the University of Texas. Orbit and data verification and improvement of algorithms led to the best data product available to date. ERS-2 altimeter data have been improved, and we have been active in the Envisat (2001 launch) GDR algorithm review and improvement. The data improvement of ERS-1 and ERS-2 led to improvements in the global mean sea surface, marine gravity anomaly, and bathymetry models, and to a study of Antarctica mass balance, which was published in Science in 1998.

  3. Energy management and control system verification study. Master's thesis

    SciTech Connect

    Boulware, K.E.; Williamson, G.C.

    1983-09-01

    Energy Management and Control Systems (EMCS) are being installed and operated throughout the Air Force. Millions of dollars have been spent on EMCS, but no study has conclusively proved that EMCS has actually saved the Air Force energy. This thesis used the regression subprogram of the Statistical Package for the Social Sciences (SPSS) to determine if these systems are indeed saving the Air Force energy. Previous studies have shown that Multiple Linear Regression (MLR) is the best statistical predictor of base energy consumption. Eight bases were selected that had an operational EMCS. Two EMCS bases were compared with one control base for each of four CONUS winter heating zones. The results indicated that small (less than 2%) energy savings have occurred at half of the EMCS bases studied. Therefore, this study does not conclusively prove that EMCSs have saved energy on Air Force bases. However, the methodology developed in this report could be applied on a broader scale to develop a more conclusive result.
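
    A minimal sketch of the MLR approach the thesis describes: regress monthly base energy consumption on weather variables, then examine how actual consumption compares with the regression prediction. The variables, coefficients, and data below are hypothetical, standing in for the SPSS regression on real base data.

    ```python
    # Multiple linear regression of base energy consumption on weather.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 48                             # e.g., four years of monthly records
    hdd = rng.uniform(0.0, 1200.0, n)  # heating degree days
    cdd = rng.uniform(0.0, 500.0, n)   # cooling degree days
    energy = 300.0 + 0.9 * hdd + 0.6 * cdd + rng.normal(0.0, 40.0, n)  # MWh

    # Design matrix with intercept; coefficients via least squares
    X = np.column_stack([np.ones(n), hdd, cdd])
    beta, *_ = np.linalg.lstsq(X, energy, rcond=None)

    # Savings would be estimated by fitting on pre-EMCS data and comparing
    # post-EMCS consumption against the model's weather-adjusted prediction.
    residuals = energy - X @ beta
    print(f"coefficients: {beta.round(2)}, mean residual: {residuals.mean():.1f} MWh")
    ```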

  4. The Cooking and Pneumonia Study (CAPS) in Malawi: Implementation of Remote Source Data Verification

    PubMed Central

    Weston, William; Smedley, James; Bennett, Andrew; Mortimer, Kevin

    2016-01-01

    Background: Source data verification (SDV) is a data monitoring procedure which compares the original records with the Case Report Form (CRF). Traditionally, on-site SDV relies on monitors making multiple visits to study sites, requiring extensive resources. The Cooking And Pneumonia Study (CAPS) is a 24-month village-level cluster randomized controlled trial assessing the effectiveness of an advanced cook-stove intervention in preventing pneumonia in children under five in rural Malawi (www.capstudy.org). CAPS used smartphones to capture digital images of the original records on an electronic CRF (eCRF). In the present study, descriptive statistics are used to report the experience of electronic data capture with remote SDV in a challenging research setting in rural Malawi. Methods: At three-monthly intervals, fieldworkers, who were employed by CAPS, captured pneumonia data from the original records onto the eCRF. Fieldworkers also captured digital images of the original records. Once Internet connectivity was available, the data captured on the eCRF and the digital images of the original records were uploaded to a web-based SDV application. This enabled SDV to be conducted remotely from the UK. We conducted SDV of the pneumonia data (occurrence, severity, and clinical indicators) recorded in the eCRF against the data in the digital images of the original records. Results: 664 episodes of pneumonia were recorded after 6 months of follow-up. Of these 664 episodes, 611 (92%) had a finding of pneumonia in the original records. All digital images of the original records were clear and legible. Conclusion: Electronic data capture using eCRFs on mobile technology is feasible in rural Malawi. Capturing digital images of the original records in the field allows remote SDV to be conducted efficiently and securely without requiring additional field visits. We recommend these approaches in similar settings, especially those with health endpoints. PMID:27355447

  5. Shuttle payload interface verification equipment study. Volume 3: Specification data

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A complete description is given of the IVE physical and performance design requirements as evolved in this study. The data are presented in a format to facilitate the development of an item specification. Data were used to support the development of the project plan data (schedules, cost, etc.) contained in Volume 4 of this report.

  6. Verification of a Quality Management Theory: Using a Delphi Study

    PubMed Central

    Mosadeghrad, Ali Mohammad

    2013-01-01

    Background: A quality management model called the Strategic Collaborative Quality Management (SCQM) model was developed based on a review of the quality management literature, the findings of a survey on quality management assessment in healthcare organisations, semi-structured interviews with healthcare stakeholders, and a Delphi study of healthcare quality management experts. The purpose of this study was to verify the SCQM model. Methods: The proposed model was further developed using feedback from thirty quality management experts using a Delphi method. Further, a guidebook for its implementation was prepared, including a road map and performance measurement. Results: The research led to the development of a context-specific model of quality management for healthcare organisations and a series of guidelines for its implementation. Conclusion: A proper model of quality management should be developed and implemented properly in healthcare organisations to achieve business excellence. PMID:24596883

  7. Formulation verification study results for 241-AN-106 waste grout

    SciTech Connect

    Lokken, R.O.; Martin, P.F.C.; Morrison, L.C.; Palmer, S.E.; Anderson, C.M.

    1993-06-01

    Tests were conducted to determine whether the reference formulation and variations around the formulation are adequate for solidifying 241-AN-106 (106-AN) waste into a grout waste form. The reference formulation consists of 21 wt% type I/II Portland cement, 68 wt% fly ash, and 11 wt% attapulgite clay. The mix ratio is 8.4 lb/gal. Variations in dry blend component ratios, mix ratio, and waste concentration were assessed by using a statistically designed experimental matrix consisting of 44 grout compositions. Based on the results of the statistically designed variability study, the 106-AN grout formulations tested met all the formulation criteria except for the heat of hydration.

  8. Subject-Specific Planning of Femoroplasty: An Experimental Verification Study

    PubMed Central

    Basafa, Ehsan; Murphy, Ryan J.; Otake, Yoshito; Kutzer, Michael D.; Belkoff, Stephen M.; Mears, Simon C.; Armand, Mehran

    2014-01-01

    The risk of osteoporotic hip fractures may be reduced by augmenting susceptible femora with acrylic polymethylmethacrylate (PMMA) bone cement. Grossly filling the proximal femur with PMMA has shown promise, but the augmented bones can suffer from thermal necrosis or cement leakage, among other side effects. We hypothesized that, using subject-specific planning and computer-assisted augmentation, we can minimize cement volume while increasing bone strength and reducing the risk of fracture. We mechanically tested eight pairs of osteoporotic femora, after augmenting one from each pair following patient-specific planning reported earlier, which optimized cement distribution and strength increase. An average of 9.5 (±1.7) ml of cement was injected in the augmented set. Augmentation significantly (P < 0.05) increased the yield load by 33%, maximum load by 30%, yield energy by 118%, and maximum energy by 94% relative to the non-augmented controls. Also, predicted yield loads correlated well (R2 = 0.74) with the experiments and, for augmented specimens, cement profiles were predicted with an average surface error of <2 mm, further validating our simulation techniques. Results of the current study suggest that subject-specific planning of femoroplasty reduces the risk of hip fracture while minimizing the amount of cement required. PMID:25468663

  9. PET/CT imaging for treatment verification after proton therapy: a study with plastic phantoms and metallic implants.

    PubMed

    Parodi, Katia; Paganetti, Harald; Cascio, Ethan; Flanz, Jacob B; Bonab, Ali A; Alpert, Nathaniel M; Lohmann, Kevin; Bortfeld, Thomas

    2007-02-01

    The feasibility of off-line positron emission tomography/computed tomography (PET/CT) for routine three-dimensional in-vivo treatment verification of proton radiation therapy is currently under investigation at Massachusetts General Hospital in Boston. In preparation for clinical trials, phantom experiments were carried out to investigate the sensitivity and accuracy of the method depending on irradiation and imaging parameters. Furthermore, they addressed the feasibility of PET/CT as a robust verification tool in the presence of metallic implants. These produce x-ray CT artifacts and fluence perturbations which may compromise the accuracy of treatment planning algorithms. Spread-out Bragg peak proton fields were delivered to different phantoms consisting of polymethylmethacrylate (PMMA), PMMA stacked with lung and bone equivalent materials, and PMMA with titanium rods to mimic implants in patients. PET data were acquired in list mode starting within 20 min after irradiation at a commercial lutetium oxyorthosilicate (LSO)-based PET/CT scanner. The amount and spatial distribution of the measured activity could be well reproduced by calculations based on the GEANT4 and FLUKA Monte Carlo codes. This phantom study supports the potential of millimeter accuracy for range monitoring and lateral field position verification even after low therapeutic dose exposures of 2 Gy, despite the delay between irradiation and imaging. It also indicates the value of PET for treatment verification in the presence of metallic implants, demonstrating a higher sensitivity to fluence perturbations in comparison to a commercial analytical treatment planning system. Finally, it addresses the suitability of LSO-based PET detectors for hadron therapy monitoring. This unconventional application of PET involves count rates which are orders of magnitude lower than in diagnostic tracer imaging, i.e., the signal of interest is comparable to the noise originating from the intrinsic radioactivity of

  10. PET/CT imaging for treatment verification after proton therapy: A study with plastic phantoms and metallic implants

    SciTech Connect

    Parodi, Katia; Paganetti, Harald; Cascio, Ethan; Flanz, Jacob B.; Bonab, Ali A.; Alpert, Nathaniel M.; Lohmann, Kevin; Bortfeld, Thomas

    2007-02-15

    The feasibility of off-line positron emission tomography/computed tomography (PET/CT) for routine three-dimensional in-vivo treatment verification of proton radiation therapy is currently under investigation at Massachusetts General Hospital in Boston. In preparation for clinical trials, phantom experiments were carried out to investigate the sensitivity and accuracy of the method depending on irradiation and imaging parameters. Furthermore, they addressed the feasibility of PET/CT as a robust verification tool in the presence of metallic implants. These produce x-ray CT artifacts and fluence perturbations which may compromise the accuracy of treatment planning algorithms. Spread-out Bragg peak proton fields were delivered to different phantoms consisting of polymethylmethacrylate (PMMA), PMMA stacked with lung and bone equivalent materials, and PMMA with titanium rods to mimic implants in patients. PET data were acquired in list mode starting within 20 min after irradiation at a commercial lutetium-oxyorthosilicate (LSO)-based PET/CT scanner. The amount and spatial distribution of the measured activity could be well reproduced by calculations based on the GEANT4 and FLUKA Monte Carlo codes. This phantom study supports the potential of millimeter accuracy for range monitoring and lateral field position verification even after low therapeutic dose exposures of 2 Gy, despite the delay between irradiation and imaging. It also indicates the value of PET for treatment verification in the presence of metallic implants, demonstrating a higher sensitivity to fluence perturbations in comparison to a commercial analytical treatment planning system. Finally, it addresses the suitability of LSO-based PET detectors for hadron therapy monitoring. This unconventional application of PET involves count rates which are orders of magnitude lower than in diagnostic tracer imaging, i.e., the signal of interest is comparable to the noise originating from the intrinsic radioactivity of the LSO detectors themselves.

  11. Additional Treatments Offer Little Benefit for Pancreatic Cancer: Study

    MedlinePlus

    Neither extra chemotherapy drug nor add-on ... 2016 (HealthDay News) -- Additional treatments for locally advanced pancreatic cancer don't appear to boost survival, a new study finds.

  12. 4D offline PET-based treatment verification in scanned ion beam therapy: a phantom study

    NASA Astrophysics Data System (ADS)

    Kurz, Christopher; Bauer, Julia; Unholtz, Daniel; Richter, Daniel; Stützer, Kristin; Bert, Christoph; Parodi, Katia

    2015-08-01

    At the Heidelberg Ion-Beam Therapy Center, patient irradiation with scanned proton and carbon ion beams is verified by offline positron emission tomography (PET) imaging: the β+-activity measured within the patient is compared to a prediction calculated on the basis of the treatment planning data in order to identify potential delivery errors. Currently, this monitoring technique is limited to the treatment of static target structures. However, intra-fractional organ motion imposes considerable additional challenges to scanned ion beam radiotherapy. In this work, the feasibility and potential of time-resolved (4D) offline PET-based treatment verification with a commercial full-ring PET/CT (x-ray computed tomography) device are investigated for the first time, based on an experimental campaign with moving phantoms. Motion was monitored during the gated beam delivery as well as the subsequent PET acquisition and was taken into account in the corresponding 4D Monte Carlo simulations and data evaluation. Under the given experimental conditions, millimeter agreement between the prediction and measurement was found. Dosimetric consequences due to the phantom motion could be reliably identified. The agreement between PET measurement and prediction in the presence of motion was found to be similar to that in static reference measurements, thus demonstrating the potential of 4D PET-based treatment verification for future clinical applications.

  13. A Study of Additional Costs of Second Language Instruction.

    ERIC Educational Resources Information Center

    McEwen, Nelly

    A study was conducted whose primary aim was to identify and explain additional costs incurred by Alberta, Canada school jurisdictions providing second language instruction in 1980. Additional costs were defined as those which would not have been incurred had the second language program not been in existence. Three types of additional costs were…

  14. Computer Simulations to Study Diffraction Effects of Stacking Faults in Beta-SiC: II. Experimental Verification

    NASA Technical Reports Server (NTRS)

    Pujar, Vijay V.; Cawley, James D.; Levine, S. (Technical Monitor)

    2000-01-01

    Earlier results from computer simulation studies suggest a correlation between the spatial distribution of stacking errors in the Beta-SiC structure and features observed in X-ray diffraction patterns of the material. Reported here are experimental results obtained from two types of nominally Beta-SiC specimens, which yield distinct XRD data. These samples were analyzed using high resolution transmission electron microscopy (HRTEM) and the stacking error distribution was directly determined. The HRTEM results compare well to those deduced by matching the XRD data with simulated spectra, confirming the hypothesis that the XRD data indicate not only the presence and density of stacking errors, but also their distribution. In addition, the stacking error population in both specimens is related to their synthesis conditions, in a manner that appears similar to the relation developed by others to explain the formation of the corresponding polytypes.

  15. Verification of Geometric Model-Based Plant Phenotyping Methods for Studies of Xerophytic Plants

    PubMed Central

    Drapikowski, Paweł; Kazimierczak-Grygiel, Ewa; Korecki, Dominik; Wiland-Szymańska, Justyna

    2016-01-01

    This paper presents the results of verification of certain non-contact measurement methods of plant scanning to estimate morphological parameters such as length, width, area, volume of leaves and/or stems on the basis of computer models. The best results in reproducing the shape of scanned objects up to 50 cm in height were obtained with the structured-light DAVID Laserscanner. The optimal triangle mesh resolution for scanned surfaces was determined with the measurement error taken into account. The research suggests that measuring morphological parameters from computer models can supplement or even replace phenotyping with classic methods. Calculating precise values of area and volume makes determination of the S/V (surface/volume) ratio for cacti and other succulents possible, whereas for classic methods the result is an approximation only. In addition, the possibility of scanning and measuring plant species which differ in morphology was investigated. PMID:27355949
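
    The S/V calculation described above follows directly from a triangle mesh: the surface is the sum of per-triangle areas, and the enclosed volume comes from the divergence theorem (signed tetrahedra against the origin). A minimal Python sketch, not the authors' code, with a hypothetical unit cube standing in for a scanned plant mesh:

      import numpy as np

      def mesh_area_volume(vertices, faces):
          # Surface area: half the norm of each triangle's edge cross product.
          v0, v1, v2 = (vertices[faces[:, i]] for i in range(3))
          cross = np.cross(v1 - v0, v2 - v0)
          area = 0.5 * np.linalg.norm(cross, axis=1).sum()
          # Volume via the divergence theorem: sum of signed tetrahedron
          # volumes; faces must be consistently oriented (outward normals).
          volume = abs(np.einsum('ij,ij->i', v0, np.cross(v1, v2)).sum()) / 6.0
          return area, volume

      # Hypothetical stand-in mesh: a unit cube (S/V should be 6).
      verts = np.array([[0,0,0],[1,0,0],[1,1,0],[0,1,0],
                        [0,0,1],[1,0,1],[1,1,1],[0,1,1]], float)
      faces = np.array([[0,2,1],[0,3,2],[4,5,6],[4,6,7],[0,1,5],[0,5,4],
                        [1,2,6],[1,6,5],[2,3,7],[2,7,6],[3,0,4],[3,4,7]])
      area, vol = mesh_area_volume(verts, faces)
      print(f"area={area:.2f}, volume={vol:.2f}, S/V={area/vol:.2f}")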

  16. Verification of Geometric Model-Based Plant Phenotyping Methods for Studies of Xerophytic Plants.

    PubMed

    Drapikowski, Paweł; Kazimierczak-Grygiel, Ewa; Korecki, Dominik; Wiland-Szymańska, Justyna

    2016-01-01

    This paper presents the results of verification of certain non-contact measurement methods of plant scanning to estimate morphological parameters such as length, width, area, volume of leaves and/or stems on the basis of computer models. The best results in reproducing the shape of scanned objects up to 50 cm in height were obtained with the structured-light DAVID Laserscanner. The optimal triangle mesh resolution for scanned surfaces was determined with the measurement error taken into account. The research suggests that measuring morphological parameters from computer models can supplement or even replace phenotyping with classic methods. Calculating precise values of area and volume makes determination of the S/V (surface/volume) ratio for cacti and other succulents possible, whereas for classic methods the result is an approximation only. In addition, the possibility of scanning and measuring plant species which differ in morphology was investigated. PMID:27355949

  17. ICAN/PART: Particulate composite analyzer, user's manual and verification studies

    NASA Technical Reports Server (NTRS)

    Goldberg, Robert K.; Murthy, Pappu L. N.; Mital, Subodh K.

    1996-01-01

    A methodology for predicting the equivalent properties and constituent microstresses for particulate matrix composites, based on the micromechanics approach, is developed. These equations are integrated into a computer code, developed to predict the equivalent properties and microstresses of fiber reinforced polymer matrix composites, to form a new computer code, ICAN/PART. Details of the flowchart, input, and output for ICAN/PART are described, along with examples of the input and output. Only the differences between ICAN/PART and the original ICAN code are described in detail, and the user is assumed to be familiar with the structure and usage of the original ICAN code. Detailed verification studies, utilizing finite element and boundary element analyses, are conducted in order to verify that the micromechanics methodology accurately models the mechanics of particulate matrix composites. The equivalent properties computed by ICAN/PART fall within bounds established by the finite element and boundary element results. Furthermore, constituent microstresses computed by ICAN/PART agree in an average sense with results computed using the finite element method. The verification studies indicate that the micromechanics programmed into ICAN/PART do indeed accurately model the mechanics of particulate matrix composites.
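
    The record does not reproduce ICAN/PART's actual micromechanics equations, but the kind of equivalent-property prediction being verified can be illustrated with the textbook Voigt/Reuss rule-of-mixtures bounds, which bracket the equivalent modulus of a two-phase particulate composite (the constituent values below are hypothetical):

      def voigt_reuss_bounds(E_matrix, E_particle, vf_particle):
          # Voigt (iso-strain) upper bound and Reuss (iso-stress) lower bound
          # on the equivalent Young's modulus; not ICAN/PART's own equations.
          vf_matrix = 1.0 - vf_particle
          E_voigt = vf_matrix * E_matrix + vf_particle * E_particle
          E_reuss = 1.0 / (vf_matrix / E_matrix + vf_particle / E_particle)
          return E_voigt, E_reuss

      # Hypothetical epoxy matrix (3.5 GPa) with 30% vol glass particles (70 GPa).
      upper, lower = voigt_reuss_bounds(3.5, 70.0, 0.3)
      print(f"equivalent modulus between {lower:.2f} and {upper:.2f} GPa")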

  18. GFO-1 Geophysical Data Record and Orbit Verifications for Global Change Studies

    NASA Technical Reports Server (NTRS)

    Shum, C. K.

    2000-01-01

    This final report summarizes the research work conducted under NASA's Physical Oceanography Program, entitled GFO-1 Geophysical Data Record and Orbit Verifications for Global Change Studies, for the investigation time period from December 1, 1997 through November 30, 2000. The primary objectives of the investigation include providing verification and improvement for the precise orbit, media, geophysical, and instrument corrections to accurately reduce the U.S. Navy's Geosat Follow-On 1 (GFO-1) mission radar altimeter data to sea level measurements. The status of the GFO satellite (instrument and spacecraft operations, orbital tracking and altimeter) is summarized. The GFO spacecraft has been accepted by the Navy from Ball Aerospace and has been declared operational since November 2000. We have participated in four official GFO calibration/validation periods (Cal/Val I-IV), spanning from June 1999 through October 2000. Results of verification of the GFO orbit and geophysical data record measurements both from NOAA (IGDR) and from the Navy (NGDR) are reported. Our preliminary results indicate that: (1) the precise orbit (GSFC and OSU) can be determined to approx. 5 - 6 cm rms radially using SLR and altimeter crossovers; (2) estimated GFO MOE (GSFC or NRL) radial orbit accuracy is approx. 7 - 30 cm and Operational Doppler orbit accuracy is approx. 60 - 350 cm. After bias and tilt adjustment (1000 km arc), estimated Doppler orbit accuracy is approx. 1.2 - 6.5 cm rms and the MOE accuracy is approx. 1.0 - 2.3 cm; (3) the geophysical and media corrections have been validated versus in situ measurements and measurements from other operating altimeters (T/P and ERS-2). Altimeter time bias is insignificant at 0-2 ms. Sea state bias is approx. 3 - 4.5% of SWH. Wet troposphere correction has approx. 1 cm bias and approx. 3 cm rms when compared with ERS-2 data. Use of GIM and IRI95 provides ionosphere correction accurate to 2-3 cm rms during medium to high solar activities; (4

  19. A study on the factors that affect the advanced mask defect verification

    NASA Astrophysics Data System (ADS)

    Woo, Sungha; Jang, Heeyeon; Lee, Youngmo; Kim, Sangpyo; Yim, Donggyu

    2015-10-01

    Defect verification has become significantly more difficult at higher technology nodes over the years. The traditional primary method of defect (including repair point) control consists of inspection, AIMS, and repair steps. Among them, the AIMS process needs various wafer lithography conditions, such as NA, inner/outer sigma, illumination shape, etc. Its ability to analyze every layer accurately is limited because the AIMS tool uses a physical aperture system, and it requires meticulous management of the exposure conditions and CD target values, which change frequently in advanced masks. We report on the influence of several AIMS parameters on defect analysis, including repair points. Under various illumination conditions with different patterns, the defect analysis results showed significant correlation. Defects can be analyzed within a certain error budget based on the management specification required for each layer. In addition, this provided us with one of the clues in the analysis of repeating wafer defects. Finally, we present an 'optimal specification' for defect management with a common AIMS recipe and suggest an advanced mask process flow.

  20. Block 2 SRM conceptual design studies. Volume 1, Book 2: Preliminary development and verification plan

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Activities that will be conducted in support of the development and verification of the Block 2 Solid Rocket Motor (SRM) are described. Development includes design, fabrication, processing, and testing activities in which the results are fed back into the project. Verification includes analytical and test activities which demonstrate SRM component/subassembly/assembly capability to perform its intended function. The management organization responsible for formulating and implementing the verification program is introduced. It also identifies the controls which will monitor and track the verification program. Integral with the design and certification of the SRM are other pieces of equipment used in transportation, handling, and testing which influence the reliability and maintainability of the SRM configuration. The certification of this equipment is also discussed.

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM CASE STUDIES: DEMONSTRATING PROGRAM OUTCOMES, VOLUME II

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This bookle...

  2. Dosimetric study of 2D ion chamber array matrix for the modern radiotherapy treatment verification.

    PubMed

    Saminathan, Sathiyan; Manickam, Ravikumar; Chandraraj, Varatharaj; Supe, Sanjay S

    2010-01-01

    Intensity-modulated radiotherapy demands stringent quality assurance and accurate dose determination for the delivery of highly conformal doses to patients. Generally, 3D dose distributions obtained from a treatment planning system have to be verified by dosimetric methods, mainly by comparing two-dimensional calculated and measured data in several coplanar planes. In principle, there are many possibilities for measuring two-dimensional dose distributions, such as flat-panel electronic portal imaging devices (EPIDs), ion chambers and ionization chamber arrays, and radiographic and radiochromic films. The flat-panel EPIDs show good resolution and offer a possibility for real-time measurements; however, to convert the signal into dose, a separate commercial algorithm is required. The 2D ion chamber array system offers real-time measurements. In this study, the dosimetric characteristics of a 2D ion chamber array matrix were analyzed for the verification of radiotherapy treatments. The dose linearity and dose rate effect of the I'matriXX device were studied using 6 MV and 18 MV photons and 12 MeV electrons. The output factor was estimated using the I'matriXX device and compared with ion chamber measurements. The ion chamber array system was found to be linear in the dose range of 2-500 cGy, and the response of the detector was found to be independent of dose rate between 100 MU/min and 600 MU/min. The estimated relative output factor with I'matriXX was found to match very well with the ion chamber measurements. To check the final dose delivered in IMRT planning, dose distribution patterns such as field-in-field, pyramidal, and chair tests were generated with the treatment planning system (TPS), executed on the accelerator, and measured with the I'matriXX device. The dose distribution patterns measured by the matrix device for the field-in-field, pyramidal, and chair tests were found to be in good agreement with the calculated dose distributions.
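
    The dose linearity test reported here (2-500 cGy) amounts to regressing detector reading against delivered dose and checking that the residuals stay within tolerance. A sketch with hypothetical readings (not the published data):

      import numpy as np

      # Hypothetical delivered doses (cGy) and array readings (arbitrary units).
      dose = np.array([2., 5., 10., 50., 100., 200., 500.])
      reading = np.array([2.02, 5.04, 10.1, 50.3, 100.4, 201.0, 502.5])

      slope, intercept = np.polyfit(dose, reading, 1)
      fit = slope * dose + intercept
      max_dev = np.max(np.abs(reading - fit) / fit) * 100.0
      print(f"slope={slope:.4f}, intercept={intercept:.3f} a.u., "
            f"max deviation from linearity = {max_dev:.2f}%")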

  3. Continuous verification using multimodal biometrics.

    PubMed

    Sim, Terence; Zhang, Sheng; Janakiraman, Rajkumar; Kumar, Sandeep

    2007-04-01

    Conventional verification systems, such as those controlling access to a secure room, do not usually require the user to reauthenticate himself for continued access to the protected resource. This may not be sufficient for high-security environments in which the protected resource needs to be continuously monitored for unauthorized use. In such cases, continuous verification is needed. In this paper, we present the theory, architecture, implementation, and performance of a multimodal biometrics verification system that continuously verifies the presence of a logged-in user. Two modalities are currently used--face and fingerprint--but our theory can be readily extended to include more modalities. We show that continuous verification imposes additional requirements on multimodal fusion when compared to conventional verification systems. We also argue that the usual performance metrics of false accept and false reject rates are insufficient yardsticks for continuous verification and propose new metrics against which we benchmark our system. PMID:17299225
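
    A key idea in the paper is that continuous verification must discount stale observations: a fingerprint scanned a minute ago says less about who is present now. The fusion rule below is a hypothetical sketch (exponential decay plus a weighted sum), not the authors' fusion method:

      import math

      def decayed_score(score, age_s, half_life_s=30.0):
          # Exponentially discount a match score by its age in seconds.
          return score * math.exp(-math.log(2.0) * age_s / half_life_s)

      def fused_confidence(observations, now_s, weights=None):
          # observations: list of (modality, score in [0,1], timestamp_s).
          # Keep each modality's best decayed score, then weighted-sum fuse;
          # the weights are illustrative, not calibrated.
          weights = weights or {"face": 0.5, "fingerprint": 0.5}
          best = {}
          for modality, score, t in observations:
              c = decayed_score(score, now_s - t)
              best[modality] = max(best.get(modality, 0.0), c)
          return sum(weights.get(m, 0.0) * c for m, c in best.items())

      obs = [("face", 0.90, 8.0), ("fingerprint", 0.95, -25.0)]
      print(f"confidence at t=10 s: {fused_confidence(obs, 10.0):.3f}")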

  4. Registration of DRRs and portal images for verification of stereotactic body radiotherapy: a feasibility study in lung cancer treatment

    NASA Astrophysics Data System (ADS)

    Künzler, Thomas; Grezdo, Jozef; Bogner, Joachim; Birkfellner, Wolfgang; Georg, Dietmar

    2007-04-01

    Image guidance has become a pre-requisite for hypofractionated radiotherapy where the applied dose per fraction is increased. Particularly in stereotactic body radiotherapy (SBRT) for lung tumours, one has to account for set-up errors and intrafraction tumour motion. In our feasibility study, we compared digitally reconstructed radiographs (DRRs) of lung lesions with MV portal images (PIs) to obtain the displacement of the tumour before irradiation. The verification of the tumour position was performed by rigid intensity-based registration with three different merit functions: the sum of squared pixel intensity differences, normalized cross correlation, and normalized mutual information. The registration process then provided a translation vector that defines the displacement of the target in order to align the tumour with the isocentre. To evaluate the registration algorithms, 163 test images were created, and subsequently a lung phantom containing an 8 cm3 tumour was built. In a further step, the registration process was applied to patient data containing 38 tumours in 113 fractions. To potentially improve registration outcome, two filter types (histogram equalization and display equalization) were applied and their impact on the registration process was evaluated. Generated test images showed an increase in successful registrations when applying a histogram equalization filter, whereas the lung phantom study proved the accuracy of the selected algorithms, i.e. deviations of the calculated translation vector for all test algorithms were below 1 mm. For clinical patient data, successful registrations occurred in about 59% of anterior-posterior (AP) and 46% of lateral projections. When patients with a clinical target volume smaller than 10 cm3 were excluded, successful registrations rose to 90% in AP and 50% in lateral projection. In addition, a reliable identification of the tumour position was found to be difficult for clinical target volumes
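
    The three merit functions named above are standard and easy to state concretely. A sketch evaluating them on synthetic arrays standing in for a DRR and a portal image (a real registration would wrap these in a search over translations):

      import numpy as np

      def ssd(a, b):
          # Sum of squared pixel intensity differences (lower is better).
          return np.sum((a - b) ** 2)

      def ncc(a, b):
          # Normalized cross correlation (higher is better).
          a, b = a - a.mean(), b - b.mean()
          return np.sum(a * b) / (np.linalg.norm(a) * np.linalg.norm(b))

      def nmi(a, b, bins=32):
          # Normalized mutual information (H(A)+H(B))/H(A,B), higher is better.
          p, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
          p /= p.sum()
          px, py = p.sum(axis=1), p.sum(axis=0)
          h = lambda q: -np.sum(q[q > 0] * np.log(q[q > 0]))
          return (h(px) + h(py)) / h(p)

      rng = np.random.default_rng(0)
      drr = rng.random((64, 64))                  # stand-in for a DRR
      portal = drr + 0.05 * rng.random((64, 64))  # stand-in for a portal image
      print(ssd(drr, portal), ncc(drr, portal), nmi(drr, portal))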

  5. WTS-4 system verification unit for wind/hydroelectric integration study

    NASA Technical Reports Server (NTRS)

    Watts, A. W.

    1982-01-01

    The Bureau of Reclamation (Reclamation) initiated a study to investigate the concept of integrating 100 MW of wind energy from megawatt-size wind turbines with the Federal hydroelectric system. As a part of the study, one large wind turbine was purchased through the competitive bid process and is now being installed to serve as a system verification unit (SVU). Reclamation negotiated an agreement with NASA to provide technical management of the project for the design, fabrication, installation, testing, and initial operation. Hamilton Standard was awarded a contract to furnish and install its WTS-4 wind turbine rated at 4 MW at a site near Medicine Bow, Wyoming. The purposes for installing the SVU are to fully evaluate the wind/hydro integration concept, make a technical evaluation of the hardware design, train personnel in the technology, evaluate operation and maintenance aspects, and evaluate associated environmental impacts. The SVU will be operational in June 1982. Data from the WTS-4 and from a second SVU, Boeing's MOD-2, will be used to prepare a final design for a 100-MW farm if Congress authorizes the project.

  6. Transition Metal Complexes of Naproxen: Synthesis, Characterization, Forced Degradation Studies, and Analytical Method Verification

    PubMed Central

    Hasan, Md. Sharif; Kayesh, Ruhul; Begum, Farida; Rahman, S. M. Abdur

    2016-01-01

    The aim of our current research was to synthesize some transition metal complexes of Naproxen, determine their physical properties, and examine their relative stability under various conditions. Characterizations of these complexes were done by 1H-NMR, Differential Scanning Calorimetry (DSC), FT-IR, HPLC, and scanning electron microscopy (SEM). Complexes were subjected to acidic, basic, and aqueous hydrolysis as well as oxidation, reduction, and thermal degradation. Also, the reversed-phase high-performance liquid chromatography (RP-HPLC) method for Naproxen outlined in the USP was verified for the Naproxen-metal complexes with respect to accuracy, precision, solution stability, robustness, and system suitability. The melting points of the complexes were higher than that of the parent drug molecule, suggesting their thermal stability. In the forced degradation study, the complexes were found to be more stable than Naproxen itself under all conditions: acidic, basic, oxidative, and reductive media. All the HPLC verification parameters were found to be within acceptable values. Therefore, it can be concluded from the study that the metal complexes of Naproxen can be a more stable drug entity and offer better efficacy and longer shelf life than the parent Naproxen. PMID:27034891

  7. Formal Methods for Verification and Validation of Partial Specifications: A Case Study

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Callahan, John

    1997-01-01

    This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety-critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors, without the burden of full proofs of correctness. We describe a case study of the use of partial formal models for V&V of the requirements for Fault Detection, Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and it is the process of formalization, rather than the end product, that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.

  8. Transition Metal Complexes of Naproxen: Synthesis, Characterization, Forced Degradation Studies, and Analytical Method Verification.

    PubMed

    Hasan, Md Sharif; Kayesh, Ruhul; Begum, Farida; Rahman, S M Abdur

    2016-01-01

    The aim of our current research was to synthesize some transition metal complexes of Naproxen, determine their physical properties, and examine their relative stability under various conditions. Characterizations of these complexes were done by 1H-NMR, Differential Scanning Calorimetry (DSC), FT-IR, HPLC, and scanning electron microscopy (SEM). Complexes were subjected to acidic, basic, and aqueous hydrolysis as well as oxidation, reduction, and thermal degradation. Also, the reversed-phase high-performance liquid chromatography (RP-HPLC) method for Naproxen outlined in the USP was verified for the Naproxen-metal complexes with respect to accuracy, precision, solution stability, robustness, and system suitability. The melting points of the complexes were higher than that of the parent drug molecule, suggesting their thermal stability. In the forced degradation study, the complexes were found to be more stable than Naproxen itself under all conditions: acidic, basic, oxidative, and reductive media. All the HPLC verification parameters were found to be within acceptable values. Therefore, it can be concluded from the study that the metal complexes of Naproxen can be a more stable drug entity and offer better efficacy and longer shelf life than the parent Naproxen. PMID:27034891

  9. WTS-4 system verification unit for wind/hydroelectric integration study

    NASA Astrophysics Data System (ADS)

    Watts, A. W.

    The Bureau of Reclamation (Reclamation) initiated a study to investigate the concept of integrating 100 MW of wind energy from megawatt-size wind turbines with the Federal hydroelectric system. As a part of the study, one large wind turbine was purchased through the competitive bid process and is now being installed to serve as a system verification unit (SVU). Reclamation negotiated an agreement with NASA to provide technical management of the project for the design, fabrication, installation, testing, and initial operation. Hamilton Standard was awarded a contract to furnish and install its WTS-4 wind turbine rated at 4 MW at a site near Medicine Bow, Wyoming. The purposes for installing the SVU are to fully evaluate the wind/hydro integration concept, make a technical evaluation of the hardware design, train personnel in the technology, evaluate operation and maintenance aspects, and evaluate associated environmental impacts. The SVU will be operational in June 1982. Data from the WTS-4 and from a second SVU, Boeing's MOD-2, will be used to prepare a final design for a 100-MW farm if Congress authorizes the project.

  10. Electrostatic Levitation for Studies of Additive Manufactured Materials

    NASA Technical Reports Server (NTRS)

    SanSoucie, Michael P.; Rogers, Jan R.; Tramel, Terri

    2014-01-01

    The electrostatic levitation (ESL) laboratory at NASA's Marshall Space Flight Center is a unique facility for investigators studying high temperature materials. The laboratory boasts two levitators in which samples can be levitated, heated, melted, undercooled, and resolidified. Electrostatic levitation minimizes gravitational effects and allows materials to be studied without contact with a container or instrumentation. The lab also has a high temperature emissivity measurement system, which provides normal spectral and normal total emissivity measurements at use temperature. The ESL lab has been instrumental in many pioneering materials investigations of thermophysical properties, e.g., creep measurements, solidification, triggered nucleation, and emissivity at high temperatures. Research in the ESL lab has already led to the development of advanced high temperature materials for aerospace applications, coatings for rocket nozzles, improved medical and industrial optics, metallic glasses, ablatives for reentry vehicles, and materials with memory. Modeling of additive manufacturing processes is necessary for the study of the resulting material properties, and modeling of the selective laser melting process and prediction of its material properties are also underway. Unfortunately, there are very few data for the properties of these materials, especially in the liquid state. Some method to measure thermophysical properties of additive manufacturing materials is necessary. The ESL lab is ideal for these studies. The lab can provide surface tension and viscosity of molten materials, density measurements, emissivity measurements, and even creep strength measurements. The ESL lab can also determine melting temperature, surface temperatures, and phase transition temperatures of additive manufactured materials. This presentation will provide background on the ESL lab and its capabilities, provide an approach to using the ESL

  11. A Modeling and Verification Study of Summer Precipitation Systems Using NASA Surface Initialization Datasets

    NASA Technical Reports Server (NTRS)

    Case, Jonathan L.; Kumar, Sujay V.; Srikishen, Jayanthi; Jedlovec, Gary J.

    2010-01-01

    One of the most challenging weather forecast problems in the southeastern U.S. is daily summertime pulse-type convection. During the summer, atmospheric flow and forcing are generally weak in this region; thus, convection typically initiates in response to local forcing along sea/lake breezes, and other discontinuities often related to horizontal gradients in surface heating rates. Numerical simulations of pulse convection usually have low skill, even in local predictions at high resolution, due to the inherent chaotic nature of these precipitation systems. Forecast errors can arise from assumptions within parameterization schemes, model resolution limitations, and uncertainties in both the initial state of the atmosphere and land surface variables such as soil moisture and temperature. For this study, it is hypothesized that high-resolution, consistent representations of surface properties such as soil moisture, soil temperature, and sea surface temperature (SST) are necessary to better simulate the interactions between the surface and atmosphere, and ultimately improve predictions of summertime pulse convection. This paper describes a sensitivity experiment using the Weather Research and Forecasting (WRF) model. Interpolated land and ocean surface fields from a large-scale model are replaced with high-resolution datasets provided by unique NASA assets in an experimental simulation: the Land Information System (LIS) and Moderate Resolution Imaging Spectroradiometer (MODIS) SSTs. The LIS is run in an offline mode for several years at the same grid resolution as the WRF model to provide compatible land surface initial conditions in an equilibrium state. The MODIS SSTs provide detailed analyses of SSTs over the oceans and large lakes compared to current operational products. The WRF model runs initialized with the LIS+MODIS datasets result in a reduction in the overprediction of rainfall areas; however, the skill is almost equally as low in both experiments using

  12. Experimental Study and Numerical Verification of Heat Transfer in Squeeze Casting of Aluminum Alloy A443

    NASA Astrophysics Data System (ADS)

    Sun, Zhizhong; Hu, Henry; Niu, Xiaoping

    2012-12-01

    Simulation is an effective tool for performing analysis and optimization in advance and for undertaking preventive action. A critical portion of casting simulation is the heat transfer at the metal/mold interface. However, it is difficult to determine the values of interfacial heat-transfer coefficients (IHTCs) in squeeze casting of aluminum alloys due to the many influencing factors. In this work, IHTCs were determined by using an inverse algorithm based on measured temperature histories and finite-difference analysis in five-step squeeze casting of aluminum alloy A443. The results showed that the IHTCs initially reached a maximum peak value, followed by a gradual decline to a lower level. Similar characteristics of the IHTC peak values were also observed at applied pressures of 30, 60, and 90 MPa. At an applied pressure of 60 MPa, the peak IHTC values of aluminum alloy A443 from steps 1 to 5 varied from 5629 W/m²K to 9419 W/m²K. A comparison of the predicted cooling curves with the experimental measurements showed that the cooling temperatures calculated with the IHTC values determined in the current study were in the best agreement with the experimental ones. The verification of the determined IHTC values demonstrates that the inverse algorithm is an effective tool for determining the IHTC at the squeeze casting-die interface.
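
    The inverse determination of the IHTC can be sketched in miniature: forward-model the casting's cooling with a trial coefficient, then pick the coefficient that best reproduces the measured thermocouple curve. The lumped-capacitance model and all property values below are hypothetical simplifications, not the authors' finite-difference model or A443 data:

      import numpy as np

      def cooling_curve(h, n=1000, dt=0.01, T0=700.0, T_die=200.0,
                        rho=2700.0, cp=900.0, vol_per_area=0.005):
          # Lumped-capacitance casting temperature for a trial IHTC h (W/m^2 K),
          # advanced with explicit Euler steps of size dt seconds.
          T = np.empty(n)
          T[0] = T0
          for i in range(1, n):
              T[i] = T[i-1] - dt * h * (T[i-1] - T_die) / (rho * cp * vol_per_area)
          return T

      # "Measured" curve: generated with h = 7000 plus thermocouple-like noise.
      rng = np.random.default_rng(1)
      measured = cooling_curve(7000.0) + rng.normal(0.0, 1.0, 1000)

      # Inverse step: scan trial coefficients, keep the best least-squares fit.
      trials = np.arange(3000.0, 12000.0, 100.0)
      misfit = [np.sum((cooling_curve(h) - measured) ** 2) for h in trials]
      print(f"recovered IHTC = {trials[int(np.argmin(misfit))]:.0f} W/m^2 K")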

  13. BIG FROG WILDERNESS STUDY AREA AND ADDITIONS, TENNESSEE AND GEORGIA.

    USGS Publications Warehouse

    Slack, John F.; Gazdik, Gertrude C.

    1984-01-01

    A mineral-resource survey was made of the Big Frog Wilderness Study Area and additions, Tennessee-Georgia. Geochemical sampling found traces of gold, zinc, copper, and arsenic in rocks, stream sediments, and panned concentrates, but not in sufficient quantities to indicate the presence of deposits of these metals. The results of the survey indicate that there is little promise for the occurrence of metallic mineral deposits within the study area. The only apparent resources are nonmetallic commodities including rock suitable for construction materials, and small amounts of sand and gravel; however, these commodities are found in abundance outside the study area. A potential may exist for oil and natural gas at great depths, but this cannot be evaluated by the present study.

  14. Recommended Protocol for Round Robin Studies in Additive Manufacturing

    PubMed Central

    Moylan, Shawn; Brown, Christopher U.; Slotwinski, John

    2016-01-01

    One way to improve confidence and encourage proliferation of additive manufacturing (AM) technologies and parts is by generating more high quality data describing the performance of AM processes and parts. Many in the AM community see round robin studies as a way to generate large data sets while distributing the cost among the participants, thereby reducing the cost to individual users. The National Institute of Standards and Technology (NIST) has conducted and participated in several of these AM round robin studies. While the results of these studies are interesting and informative, many of the lessons learned in conducting these studies concern the logistics and methods of the study and unique issues presented by AM. Existing standards for conducting interlaboratory studies of measurement methods, along with NIST’s experience, form the basis for recommended protocols for conducting AM round robin studies. The role of round robin studies in AM qualification, some of the limitations of round robin studies, and the potential benefit of less formal collaborative experiments where multiple factors, AM machine being only one, are varied simultaneously are also discussed. PMID:27274602

  15. Preventing illegal tobacco and alcohol sales to minors through electronic age-verification devices: a field effectiveness study.

    PubMed

    Krevor, Brad; Capitman, John A; Oblak, Leslie; Cannon, Joanna B; Ruwe, Mathilda

    2003-01-01

    Efforts to prohibit sales of tobacco and alcohol products to minors are widespread. Electronic Age Verification (EAV) devices are one possible means to improve compliance with sales-to-minors laws. The purpose of this study was to evaluate the implementation and effectiveness of EAV devices in terms of the frequency and accuracy of age verification, as well as to examine the impact of EAVs on the retail environment. Two study locations were selected: Tallahassee, Florida and Iowa City, Iowa. Retail stores were invited to participate in the study, producing a self-selected experimental group. Stores that did not elect to test the EAVs comprised the comparison group. The data sources included: 1) mystery shopper inspections: two pre- and five post-EAV installation mystery shopper inspections of tobacco and alcohol retailers; 2) retail clerk and manager interviews; and 3) customer interviews. The study found that installing EAV devices with minimal training and encouragement did not increase age verification or refusal of underage sales. Surveyed clerks reported positive experiences using the electronic ID readers and customers reported almost no discomfort about being asked to swipe their IDs. Observations from this study support the need for a more comprehensive system for responsible retailing. PMID:15015859

  16. Expert system verification and validation study. Delivery 3A and 3B: Trip summaries

    NASA Technical Reports Server (NTRS)

    French, Scott

    1991-01-01

    Key results from attending the 4th workshop on verification, validation, and testing (VV&T) are documented. The most interesting part of the workshop was when representatives from the U.S., Japan, and Europe presented surveys of VV&T within their respective regions. Another interesting part focused on current efforts to define industry standards for artificial intelligence and how that might affect approaches to VV&T of expert systems. The next part of the workshop focused on methods of applying mathematical techniques to the verification of rule bases and on techniques for capturing information about the software development process. The final part focused on software tools. A summary is also presented of the EPRI conference on 'Methodologies, Tools, and Standards for Cost Effective Reliable Software Verification and Validation.' The conference was divided into discussion sessions on the following issues: development process, automated tools, software reliability, methods, standards, and cost/benefit considerations.

  17. Genotoxicity studies of the food additive ester gum.

    PubMed

    Mukherjee, A; Agarwal, K; Chakrabarti, J

    1992-07-01

    Ester gum (EG) is used in citrus oil-based beverage flavourings as a weighting or colouring agent. In the present study, doses of 50, 100 and 150 mg/kg body weight were administered orally to male Swiss albino mice, and sister chromatid exchange and chromosomal aberration were used as the cytogenetic endpoints to determine the genotoxic and clastogenic potential of the food additive. Although EG was weakly clastogenic and could induce a marginal increase in sister chromatid exchange frequencies, it was not a potential health hazard at the doses tested. PMID:1521837

  18. Making intelligent systems team players: Additional case studies

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Schreckenghost, Debra L.; Rhoads, Ron W.

    1993-01-01

    Observations from a case study of intelligent systems are reported as part of a multi-year interdisciplinary effort to provide guidance and assistance for designers of intelligent systems and their user interfaces. A series of studies were conducted to investigate issues in designing intelligent fault management systems in aerospace applications for effective human-computer interaction. The results of the initial study are documented in two NASA technical memoranda: TM 104738 Making Intelligent Systems Team Players: Case Studies and Design Issues, Volumes 1 and 2; and TM 104751, Making Intelligent Systems Team Players: Overview for Designers. The objective of this additional study was to broaden the investigation of human-computer interaction design issues beyond the focus on monitoring and fault detection in the initial study. The results of this second study are documented which is intended as a supplement to the original design guidance documents. These results should be of interest to designers of intelligent systems for use in real-time operations, and to researchers in the areas of human-computer interaction and artificial intelligence.

  19. Verification of the FtCayuga fault-tolerant microprocessor system. Volume 1: A case study in theorem prover-based verification

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam; Bickford, Mark

    1991-01-01

    The design and formal verification of a hardware system for a task that is an important component of a fault-tolerant computer architecture for flight control systems is presented. The hardware system implements an algorithm for obtaining interactive consistency (Byzantine agreement) among four microprocessors as a special instruction on the processors. The property verified ensures that an execution of the special instruction by the processors correctly accomplishes interactive consistency, provided certain preconditions hold. An assumption is made that the processors execute synchronously. For verification, the authors used a computer-aided hardware design verification tool, Spectool, and the theorem prover Clio. A major contribution of the work is the demonstration of a significant fault-tolerant hardware design that is mechanically verified by a theorem prover.

  20. RAMSEYS DRAFT WILDERNESS STUDY AREA AND ADDITION, VIRGINIA.

    USGS Publications Warehouse

    Lesure, Frank G.; Mory, Peter C.

    1984-01-01

    Mineral-resource surveys of the Ramseys Draft Wilderness Study Area and adjoining roadless area addition in George Washington National Forest in the western valley and ridge province, Augusta and Highland Counties, Virginia, were done. The surveys outlined three small areas containing anomalous amounts of copper, lead, and zinc related to stratabound red-bed copper mineralization, but these occurrences are not large and are not considered as having mineral-resource potential. The area contains abundant sandstone suitable for construction materials and shale suitable for making brick, tile, and other low-grade ceramic products, but these commodities occur in abundance outside the wilderness study area. Structural conditions are probably favorable for the accumulation of natural gas, but exploratory drilling has not been done sufficiently near the area to evaluate the gas potential.

  1. TEST/QA PLAN FOR THE VERIFICATION TESTING OF ALTERNATIVE OR REFORMULATED LIQUID FUELS, FUEL ADDITIVES, FUEL EMULSIONS, AND LUBRICANTS FOR HIGHWAY AND NONROAD USE HEAVY-DUTY DIESEL ENGINES AND LIGHT-DUTY GASOLINE ENGINES AND VEHICLES

    EPA Science Inventory

    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  2. Stormwater pollutant loads modelling: epistemological aspects and case studies on the influence of field data sets on calibration and verification.

    PubMed

    Bertrand-Krajewski, Jean-Luc

    2007-01-01

    In urban drainage, stormwater quality models have been used by researchers and practitioners for more than 15 years. Most of them were initially developed for research purposes and have later been implemented in commercial software packages devoted to operational needs. This paper presents some epistemological problems and difficulties with practical consequences in the application of stormwater quality models, such as simplified representation of reality, scaling-up, over-parameterisation, transition from calibration to verification and prediction, etc. Two case studies (one to estimate pollutant loads at the outlet of a catchment, one to design a detention tank to reach a given pollutant interception efficiency), with simple and detailed stormwater quality models, illustrate some of the above problems. It is hard, if not impossible, to find a unique 'optimum' or 'best' set of parameter values. Model calibration and verification appear to depend dramatically on the data sets used for calibration and verification. Compared to current practice, collecting more numerous and more reliable data is absolutely necessary. PMID:17425067
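
    The calibration/verification sensitivity described here can be reproduced with even the simplest regression-type load model. The sketch below calibrates a hypothetical power-law event-load model L = a·V^b on one half of a synthetic event set and verifies it on the other half; re-running with a different split shifts the fitted parameters, which is the paper's point:

      import numpy as np

      rng = np.random.default_rng(42)
      # Synthetic monitored events: runoff volume V (m^3), pollutant load L (kg).
      V = rng.uniform(50.0, 2000.0, 40)
      L = 0.02 * V ** 0.85 * rng.lognormal(0.0, 0.35, 40)  # scatter in the data

      def calibrate(Vc, Lc):
          # Least-squares fit of log L = log a + b log V.
          b, log_a = np.polyfit(np.log(Vc), np.log(Lc), 1)
          return np.exp(log_a), b

      a, b = calibrate(V[:20], L[:20])        # calibration set
      pred = a * V[20:] ** b                  # verification set
      bias = 100.0 * np.mean((pred - L[20:]) / L[20:])
      print(f"a={a:.4f}, b={b:.3f}, mean bias on verification events = {bias:+.1f}%")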

  3. Geometric verification

    NASA Technical Reports Server (NTRS)

    Grebowsky, G. J.

    1982-01-01

    Present LANDSAT data formats are reviewed to clarify how the geodetic location and registration capabilities were defined for P-tape products and RBV data. Since there is only one geometric model used in the master data processor, geometric location accuracy of P-tape products depends on the absolute accuracy of the model and registration accuracy is determined by the stability of the model. Due primarily to inaccuracies in data provided by the LANDSAT attitude management system, desired accuracies are obtained only by using ground control points and a correlation process. The verification of system performance with regard to geodetic location requires the capability to determine pixel positions of map points in a P-tape array. Verification of registration performance requires the capability to determine pixel positions of common points (not necessarily map points) in two or more P-tape arrays for a given world reference system scene. Techniques for registration verification can be more varied and automated since map data are not required. The verification of LACIE extractions is used as an example.

  4. Experimental Study of Additives on Viscosity biodiesel at Low Temperature

    NASA Astrophysics Data System (ADS)

    Fajar, Berkah; Sukarno

    2015-09-01

    An experimental investigation was performed to determine the viscosity of additive and biodiesel fuel mixtures in the temperature range from 283 K to 318 K. One solution to reduce the viscosity of biodiesel is to blend it with additives. The viscosity was measured using a Brookfield Rheometer DV-II. The additives were a generic additive (diethyl ether, DEE) and the commercial additive Viscoplex 10-330 CFI. Each biodiesel blend had a mixture concentration of 0.0, 0.25, 0.5, 0.75, 1.0, or 1.25% vol. The temperature of the biodiesel was controlled from 40°C to 0°C. The viscosity of a biodiesel and additive mixture at a constant temperature can be approximated by a polynomial equation, and at a constant concentration by an exponential equation. The optimum mixture is at 0.75% for diethyl ether and 0.5% for Viscoplex.
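
    The two fitted forms reported (polynomial in concentration at constant temperature, exponential in temperature at constant concentration) are easy to reproduce; the data points below are hypothetical placeholders, and NumPy/SciPy are assumed available:

      import numpy as np
      from scipy.optimize import curve_fit

      # Exponential (Andrade-type) model in temperature, mu = A * exp(B / T).
      def mu_of_T(T, A, B):
          return A * np.exp(B / T)

      T = np.array([283., 288., 293., 298., 303., 308., 313., 318.])  # K
      mu = np.array([9.8, 8.3, 7.1, 6.1, 5.3, 4.7, 4.2, 3.8])         # mPa s
      (A, B), _ = curve_fit(mu_of_T, T, mu, p0=(0.01, 2000.0))
      print(f"mu(T) = {A:.4g} * exp({B:.0f}/T)")

      # Quadratic model in additive concentration at a fixed temperature.
      c = np.array([0.0, 0.25, 0.5, 0.75, 1.0, 1.25])   # % vol additive
      mu_c = np.array([6.1, 5.9, 5.6, 5.4, 5.3, 5.3])   # mPa s, hypothetical
      print("mu(c) coefficients:", np.round(np.polyfit(c, mu_c, 2), 3))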

  5. Feasibility study of a dual detector configuration concept for simultaneous megavoltage imaging and dose verification in radiotherapy

    SciTech Connect

    Deshpande, Shrikant; McNamara, Aimee L.; Holloway, Lois; Metcalfe, Peter; Vial, Philip

    2015-04-15

    Purpose: To test the feasibility of a dual detector concept for comprehensive verification of external beam radiotherapy. Specifically, the authors test the hypothesis that a portal imaging device coupled to a 2D dosimeter provides a system capable of simultaneous imaging and dose verification, and that the presence of each device does not significantly detract from the performance of the other. Methods: The dual detector configuration comprised of a standard radiotherapy electronic portal imaging device (EPID) positioned directly on top of an ionization-chamber array (ICA) with 2 cm solid water buildup material (between EPID and ICA) and 5 cm solid backscatter material. The dose response characteristics of the ICA and the imaging performance of the EPID in the dual detector configuration were compared to the performance in their respective reference clinical configurations. The reference clinical configurations were 6 cm solid water buildup material, an ICA, and 5 cm solid water backscatter material as the reference dosimetry configuration, and an EPID with no additional buildup or solid backscatter material as the reference imaging configuration. The dose response of the ICA was evaluated by measuring the detector’s response with respect to off-axis position, field size, and transit object thickness. Clinical dosimetry performance was evaluated by measuring a range of clinical intensity-modulated radiation therapy (IMRT) beams in transit and nontransit geometries. The imaging performance of the EPID was evaluated quantitatively by measuring the contrast-to-noise ratio (CNR) and spatial resolution. Images of an anthropomorphic phantom were also used for qualitative assessment. Results: The measured off-axis and field size response with the ICA in both transit and nontransit geometries for both dual detector configuration and reference dosimetry configuration agreed to within 1%. Transit dose response as a function of object thickness agreed to within 0.5%. All

  6. Fixed-point arithmetic for mobile devices: a fingerprinting verification case study

    NASA Astrophysics Data System (ADS)

    Moon, Yiu S.; Luk, Franklin T.; Ho, Ho C.; Tang, T. Y.; Chan, Kit C.; Leung, C. W.

    2002-12-01

    Mobile devices use embedded processors with low computing capabilities to reduce power consumption. Since floating-point arithmetic units are power hungry, computationally intensive jobs must be accomplished with either digital signal processors or hardware co-processors. In this paper, we propose to perform fixed-point arithmetic on an integer hardware unit. We illustrate the advantages of our approach by implementing fingerprint verification on mobile devices.
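
    The core trick the paper describes, replacing floating-point values with scaled integers, looks like the Q16.16 sketch below; Python integers stand in for the 32/64-bit machine integers an embedded target would use, and this is a generic illustration rather than the authors' fingerprint code:

      # Q16.16 fixed-point: 16 integer bits, 16 fractional bits.
      FRAC_BITS = 16
      ONE = 1 << FRAC_BITS

      def to_fixed(x: float) -> int:
          return int(round(x * ONE))

      def to_float(q: int) -> float:
          return q / ONE

      def q_mul(a: int, b: int) -> int:
          # The raw product carries 32 fractional bits; shift back to 16.
          return (a * b) >> FRAC_BITS

      def q_div(a: int, b: int) -> int:
          # Pre-shift the dividend so the quotient keeps 16 fractional bits.
          return (a << FRAC_BITS) // b

      a, b = to_fixed(1.25), to_fixed(-0.4)
      print(to_float(q_mul(a, b)))  # approximately -0.5
      print(to_float(q_div(a, b)))  # approximately -3.125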

  7. Expert system verification and validation study. Phase 2: Requirements Identification. Delivery 2: Current requirements applicability

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The second phase of a task is described which has the ultimate purpose of ensuring that adequate Expert Systems (ESs) Verification and Validation (V and V) tools and techniques are available for Space Station Freedom Program Knowledge Based Systems development. The purpose of this phase is to recommend modifications to current software V and V requirements which will extend the applicability of the requirements to NASA ESs.

  8. Simulation verification techniques study. Task report 4: Simulation module performance parameters and performance standards

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Shuttle simulation software modules in the environment, crew station, vehicle configuration and vehicle dynamics categories are discussed. For each software module covered, a description of the module functions and operational modes, its interfaces with other modules, its stored data, inputs, performance parameters and critical performance parameters is given. Reference data sources which provide standards of performance are identified for each module. Performance verification methods are also discussed briefly.

  9. Verification and validation benchmarks.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
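
    The manufactured-solution style of code verification recommended here is mechanical: pick an exact solution, derive the source term it forces, feed that source to the solver, and confirm the error falls at the scheme's theoretical rate. A self-contained sketch for a second-order finite-difference solver of -u'' = f on [0,1]:

      import numpy as np

      # Manufactured solution u(x) = sin(pi x), so the source is f = pi^2 sin(pi x).
      def fd_error(n):
          h = 1.0 / (n + 1)
          x = np.linspace(h, 1.0 - h, n)
          f = np.pi ** 2 * np.sin(np.pi * x)
          # Standard tridiagonal second-order stencil for -u'' with u(0)=u(1)=0.
          A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h ** 2
          u = np.linalg.solve(A, f)
          return np.max(np.abs(u - np.sin(np.pi * x)))

      errors = [fd_error(n) for n in (15, 31, 63, 127)]   # h halves each time
      orders = [np.log2(e1 / e2) for e1, e2 in zip(errors, errors[1:])]
      print("observed convergence orders:", np.round(orders, 2))  # expect ~2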

  10. Additive Manufacturing in Production: A Study Case Applying Technical Requirements

    NASA Astrophysics Data System (ADS)

    Ituarte, Iñigo Flores; Coatanea, Eric; Salmi, Mika; Tuomi, Jukka; Partanen, Jouni

    Additive manufacturing (AM) is expanding manufacturing capabilities. However, the quality of AM-produced parts depends on a number of machine, geometry, and process parameters. The variability of these parameters affects manufacturing drastically, and therefore standardized processes and harmonized methodologies need to be developed to characterize the technology for end-use applications and enable the technology for manufacturing. This research proposes a composite methodology integrating Taguchi Design of Experiments, multi-objective optimization, and statistical process control to optimize the manufacturing process and fulfil multiple requirements imposed on an arbitrary geometry. The proposed methodology aims to characterize AM technology as a function of manufacturing process variables, as well as to perform a comparative assessment of three AM technologies (Selective Laser Sintering, Laser Stereolithography, and Polyjet). Results indicate that only one machine, based on Laser Stereolithography, could simultaneously fulfil the macro- and micro-level geometrical requirements, but its mechanical properties were not at the required level. Future research will study a single AM system at a time to characterize the machine's technical capabilities and stimulate pre-normative initiatives for the technology in end-use applications.

  11. A feasibility study of treatment verification using EPID cine images for hypofractionated lung radiotherapy

    NASA Astrophysics Data System (ADS)

    Tang, Xiaoli; Lin, Tong; Jiang, Steve

    2009-09-01

    We propose a novel approach for potential online treatment verification using cine EPID (electronic portal imaging device) images for hypofractionated lung radiotherapy based on a machine learning algorithm. Hypofractionated radiotherapy requires high precision. It is essential to effectively monitor the target to ensure that the tumor is within the beam aperture. We modeled the treatment verification problem as a two-class classification problem and applied an artificial neural network (ANN) to classify the cine EPID images acquired during the treatment into corresponding classes—with the tumor inside or outside of the beam aperture. Training samples were generated for the ANN using digitally reconstructed radiographs (DRRs) with artificially added shifts in the tumor location—to simulate cine EPID images with different tumor locations. Principal component analysis (PCA) was used to reduce the dimensionality of the training samples and cine EPID images acquired during the treatment. The proposed treatment verification algorithm was tested on five hypofractionated lung patients in a retrospective fashion. On average, our proposed algorithm achieved a 98.0% classification accuracy, a 97.6% recall rate and a 99.7% precision rate. This work was first presented at the Seventh International Conference on Machine Learning and Applications, San Diego, CA, USA, 11-13 December 2008.
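
    The pipeline described above, PCA for dimensionality reduction followed by a two-class neural network trained on images with artificially shifted tumor positions, maps directly onto standard tooling. A hedged sketch with synthetic stand-in images (scikit-learn assumed available; not the authors' implementation):

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.neural_network import MLPClassifier
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(0)

      def synthetic_frame(shift, size=32):
          # Stand-in for a DRR or cine EPID frame: a bright disc ("tumor")
          # displaced horizontally by `shift` pixels, plus imaging noise.
          y, x = np.mgrid[:size, :size]
          disc = ((x - size // 2 - shift) ** 2 + (y - size // 2) ** 2) < 25
          return (disc + 0.1 * rng.standard_normal((size, size))).ravel()

      # Class 1: tumor inside the aperture (small shift); class 0: outside.
      shifts = np.concatenate([rng.integers(-3, 4, 200), rng.integers(8, 14, 200)])
      labels = (np.abs(shifts) < 5).astype(int)
      X = np.array([synthetic_frame(s) for s in shifts])

      clf = make_pipeline(PCA(n_components=20),
                          MLPClassifier(max_iter=1000, random_state=0))
      clf.fit(X[::2], labels[::2])  # train on every other frame
      print("held-out accuracy:", clf.score(X[1::2], labels[1::2]))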

  12. Health studies indicate MTBE is safe gasoline additive

    SciTech Connect

    Anderson, E.V.

    1993-09-01

    Implementation of the oxygenated fuels program by EPA in 39 metropolitan areas, including Fairbanks and Anchorage, Alaska, in the winter of 1992 encountered some unexpected difficulties. Complaints of headaches, dizziness, nausea, and irritated eyes started in Fairbanks, jumped to Anchorage, and popped up in various locations in the lower 48 states. The suspected culprit behind these complaints was the main additive used to oxygenate gasoline: methyl tert-butyl ether (MTBE). A test program, hastily organized in response to these complaints, has indicated that MTBE is a safe gasoline additive. However, official certification of the safety of MTBE is still awaited.

  13. Pilot field-verification studies of the sodium sulfide/ferrous sulfate treatment process. Final report, September 1987-May 1988

    SciTech Connect

    Wiloff, P.M.; Suciu, D.F.; Prescott, D.S.; Schober, R.K.; Loyd, F.S.

    1988-09-01

    In a previous project, jar and dynamic testing showed that the sodium sulfide/ferrous sulfate process was a viable method for reducing hexavalent chromium and removing heavy metals from the Tinker AFB industrial wastewater, with a significant decrease in sludge production and treatment costs. In this phase, pilot-plant field verification studies were conducted to evaluate the chemical and physical parameters of the chromium reduction process, the precipitation and clarification process, and the activated-sludge system. Sludge production was evaluated and compared to the sulfuric acid/sulfur dioxide/lime process.

  14. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…

  15. Verification of Multiphysics software: Space and time convergence studies for nonlinearly coupled applications

    SciTech Connect

    Jean C. Ragusa; Vijay Mahadevan; Vincent A. Mousseau

    2009-05-01

    High-fidelity modeling of nuclear reactors requires the solution of a nonlinear coupled multi-physics stiff problem with widely varying time and length scales that need to be resolved correctly. A numerical method that converges the implicit nonlinear terms to a small tolerance is often referred to as nonlinearly consistent (or tightly coupled). This nonlinear consistency is still lacking in the vast majority of coupling techniques today. We present a tightly coupled multiphysics framework that tackles this issue and present code-verification and convergence analyses in space and time for several models of nonlinear coupled physics.
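
    A convergence analysis of the kind reported here typically computes the observed order of accuracy from errors on successively refined meshes or time steps. A minimal sketch, with made-up error values standing in for errors measured against an exact or manufactured solution:

        import numpy as np

        def observed_order(e_coarse, e_fine, r=2.0):
            # Observed order of accuracy from errors on two successive
            # refinements with ratio r: p = log(e_coarse/e_fine) / log(r)
            return np.log(e_coarse / e_fine) / np.log(r)

        # Hypothetical L2 errors as the time step is halved; a second-order
        # nonlinearly consistent integrator should give p -> 2.
        errors = [4.1e-3, 1.0e-3, 2.6e-4, 6.4e-5]
        for k in range(len(errors) - 1):
            print(f"level {k}: p = {observed_order(errors[k], errors[k + 1]):.2f}")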

  16. Design of the software development and verification system (SWDVS) for shuttle NASA study task 35

    NASA Technical Reports Server (NTRS)

    Drane, L. W.; Mccoy, B. J.; Silver, L. W.

    1973-01-01

    An overview of the Software Development and Verification System (SWDVS) for the space shuttle is presented. The design considerations, goals, assumptions, and major features of the design are examined. A scenario that shows three persons involved in flight software development using the SWDVS in response to a program change request is developed. The SWDVS is described from the standpoint of different groups of people with different responsibilities in the shuttle program to show the functional requirements that influenced the SWDVS design. The software elements of the SWDVS that satisfy the requirements of the different groups are identified.

  17. Expert system verification and validation study. Phase 2: Requirements identification. Delivery 1: Updated survey report

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The purpose is to report the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) in current NASA and industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues is being addressed and to what extent they have impacted the development of Expert Systems.

  18. Study of novel techniques for verification imaging and patient dose reconstruction in external beam radiation therapy

    NASA Astrophysics Data System (ADS)

    Jarry, Genevieve

    Treatment delivery verification is an essential step of radiotherapy. The purpose of this thesis is to develop new methods to improve the verification of photon and electron beam radiotherapy treatments. This is achieved through developing and testing (1) a way to acquire portal images during electron beam treatments, (2) a method to reconstruct the dose delivered to patients during photon beam treatments and (3) a technique to improve image quality in kilovoltage (kV) cone beam computed tomography (CBCT) by correcting for scattered radiation. The portal images were acquired using the Varian CL21EX linac and the Varian aS500 electronic portal imaging device (EPID). The EGSnrc code was used to fully model the CL21EX, the aS500 and the kV CBCT system. We demonstrate that portal images of electron beam treatments with adequate contrast and resolution can be produced using the bremsstrahlung photon portion of the electron beam. Monte Carlo (MC) calculations were used to characterize the bremsstrahlung photons and to obtain predicted images of various phantoms. The technique was applied to a head and neck patient. An algorithm to reconstruct the dose given to patients during photon beam radiotherapy was developed and validated. The algorithm uses portal images and MC simulations. The primary fluence at the detector is back-projected through the patient CT geometry to obtain a reconstructed phase space file. The reconstructed phase space file is used to calculate the reconstructed dose to the patient using MC simulations. The reconstruction method was validated in homogeneous and heterogeneous phantoms for conventional and IMRT fields. The scattered radiation present in kV CBCT images was evaluated using MC simulations. Simulated predictions of the scatter distribution were subtracted from CBCT projection images prior to the reconstruction to improve the reconstructed image quality. Reducing the scattered radiation was found to improve contrast and reduce shading.

  19. Increased cortical grey matter lesion detection in multiple sclerosis with 7 T MRI: a post-mortem verification study.

    PubMed

    Kilsdonk, Iris D; Jonkman, Laura E; Klaver, Roel; van Veluw, Susanne J; Zwanenburg, Jaco J M; Kuijer, Joost P A; Pouwels, Petra J W; Twisk, Jos W R; Wattjes, Mike P; Luijten, Peter R; Barkhof, Frederik; Geurts, Jeroen J G

    2016-05-01

    The relevance of cortical grey matter pathology in multiple sclerosis has become increasingly recognized over the past decade. Unfortunately, a large part of cortical lesions remain undetected on magnetic resonance imaging using standard field strength. In vivo studies have shown improved detection by using higher magnetic field strengths up to 7 T. So far, a systematic histopathological verification of ultra-high field magnetic resonance imaging pulse sequences has been lacking. The aim of this study was to determine the sensitivity of 7 T versus 3 T magnetic resonance imaging pulse sequences for the detection of cortical multiple sclerosis lesions by directly comparing them to histopathology. We obtained hemispheric coronally cut brain sections of 19 patients with multiple sclerosis and four control subjects after rapid autopsy and formalin fixation, and scanned them using 3 T and 7 T magnetic resonance imaging systems. Pulse sequences included T1-weighted, T2-weighted, fluid attenuated inversion recovery, double inversion recovery and T2*. Cortical lesions (type I-IV) were scored on all sequences by an experienced rater blinded to histopathology and clinical data. Staining was performed with antibodies against proteolipid protein and scored by a second reader blinded to magnetic resonance imaging and clinical data. Subsequently, magnetic resonance imaging images were matched to histopathology and sensitivity of pulse sequences was calculated. Additionally, a second unblinded (retrospective) scoring of magnetic resonance images was performed. Regardless of pulse sequence, 7 T magnetic resonance imaging detected more cortical lesions than 3 T. Fluid attenuated inversion recovery (7 T) detected 225% more cortical lesions than 3 T fluid attenuated inversion recovery (Z = 2.22, P < 0.05) and 7 T T2* detected 200% more cortical lesions than 3 T T2* (Z = 2.05, P < 0.05). Sensitivity of 7 T magnetic resonance imaging was influenced by cortical lesion type: 100% for type

  20. Requirements of Operational Verification of the NWSRFS-ESP Forecasts

    NASA Astrophysics Data System (ADS)

    Imam, B.; Werner, K.; Hartmann, H.; Sorooshian, S.; Pritchard, E.

    2006-12-01

    Forecast verification is the process of determining the quality of forecasts. This requires the utilization of quality measures that summarize one or more aspects of the relationship between forecasts and observations. Technically, the three main objectives of forecast verification are (a) monitoring, (b) improving, and (c) comparing the quality of different forecasting systems. However, users of forecast verification results range from administrators, who want to know the value of investing in forecast system improvement, to forecasters and modelers, who want to identify areas for improving their own predictions, to forecast users, who weigh their decisions based not only on the forecast but also on the perceived quality of that forecast. Our discussions with several forecasters and hydrologists in charge at various River Forecast Centers (RFCs) indicated that operational hydrologists view verification in a broader sense than their counterparts within the meteorological community. Their view encompasses verification as a possible tool in determining whether a forecast is ready for issuance as an "official" product or whether it needs more work. In addition to the common challenges associated with verification of monthly and seasonal probabilistic forecasts, which include determining and obtaining an appropriately sized data set of "forecast-observation" pairs, operational verification also requires the consideration of verification strategies for short-term forecasts. Under such conditions, the identification of conditional verification (i.e., similar conditions) samples, the tracking of model states, inputs, and outputs relative to their climatology, and the establishment of links between the forecast issuance, verification, and simulation components of the forecast system become important. In this presentation, we address the impacts of this view on the potential requirements of an operational verification system for the Ensemble Streamflow Prediction (ESP) component of the
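
    For the probabilistic forecasts discussed here, a standard quality measure over a set of forecast-observation pairs is the Brier score. A minimal sketch with made-up values; the score itself is standard, while its use in the NWSRFS-ESP context is the subject of the abstract, not of this snippet:

        import numpy as np

        def brier_score(p_forecast, occurred):
            # Mean squared error of probability forecasts against binary
            # outcomes: BS = mean((p - o)^2); 0 is perfect.
            p = np.asarray(p_forecast, dtype=float)
            o = np.asarray(occurred, dtype=float)
            return float(np.mean((p - o) ** 2))

        # Hypothetical forecast probabilities of exceeding a flow threshold,
        # paired with observed exceedances (1) or non-exceedances (0)
        p = [0.9, 0.2, 0.6, 0.1, 0.7]
        o = [1, 0, 1, 0, 0]
        bs = brier_score(p, o)
        bs_clim = brier_score([np.mean(o)] * len(o), o)  # climatology reference
        print(f"BS = {bs:.3f}, skill vs climatology = {1 - bs / bs_clim:.2f}")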

  1. Satellite Power Systems (SPS) concept definition study. Volume 6: SPS technology requirements and verification

    NASA Technical Reports Server (NTRS)

    Hanley, G.

    1978-01-01

    Volume 6 of the SPS Concept Definition Study is presented and also incorporates results of NASA/MSFC in-house effort. This volume includes a supporting research and technology summary. Other volumes of the final report that provide additional detail are as follows: (1) Executive Summary; (2) SPS System Requirements; (3) SPS Concept Evolution; (4) SPS Point Design Definition; (5) Transportation and Operations Analysis; and Volume 7, SPS Program Plan and Economic Analysis.

  2. Simulation verification techniques study: Simulation self test hardware design and techniques report

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The final results of the hardware verification task are presented. The basic objectives of the various subtasks are reviewed along with the ground rules under which the overall task was conducted and which impacted the approach taken in deriving techniques for hardware self test. The results of the first subtask and the definition of simulation hardware are presented. The hardware definition is based primarily on a brief review of the simulator configurations anticipated for the shuttle training program. The results of the survey of current self test techniques are presented. The data sources that were considered in the search for current techniques are reviewed, and results of the survey are presented in terms of the specific types of tests that are of interest for training simulator applications. Specifically, these types of tests are readiness tests, fault isolation tests and incipient fault detection techniques. The most applicable techniques were structured into software flows that are then referenced in discussions of techniques for specific subsystems.

  3. Expert system verification and validation study. ES V/V guidelines/workshop conference summary

    NASA Technical Reports Server (NTRS)

    French, Scott W.; Hamilton, David

    1992-01-01

    The intent of the workshop was to start moving research on the verification and validation (V&V) of knowledge based systems (KBSs) in the direction of providing tangible 'products' that a KBS developer could use. In the near term research will focus on identifying the kinds of experiences encountered during KBS development of 'real' KBSs. These will be stored in a repository and will serve as the foundation for the rest of the activities described here. One specific approach to be pursued is 'benchmarking'. With this approach, a KBS developer can use either 'canned' KBSs with seeded errors or existing KBSs with known errors to evaluate a given tool's ability to satisfactorily identify errors.

  4. Soil moisture verification study of the ESTAR microwave radiometer - Walnut Gulch, AZ 1991

    NASA Technical Reports Server (NTRS)

    Jackson, T. J.; Le Vine, D. M.; Griffis, A.; Goodrich, D. C.; Schmugge, T. J.; Swift, C. T.; O'Neill, P. E.; Roberts, R. R.; Parry, R.

    1992-01-01

    The application of an electronically steered thinned array L-band radiometer (ESTAR) for soil moisture mapping is investigated over the arid rangeland Walnut Gulch Watershed. Antecedent rainfall and evaporation for the flights are very different and result in a wide range of soil moisture conditions. The high spatial variability of rainfall events within this region results in moisture conditions with dramatic spatial patterns. Sensor performance is verified using two approaches. Microwave data are used in conjunction with a microwave emission model to predict soil moisture. These predictions are compared to ground observations of soil moisture. A second verification is possible using an extensive data set. Both tests showed that the ESTAR is capable of providing soil moisture with the same level of accuracy as existing systems.

  5. Expert system verification and validation study: ES V/V Workshop

    NASA Technical Reports Server (NTRS)

    French, Scott; Hamilton, David

    1992-01-01

    The primary purpose of this document is to build a foundation for applying principles of verification and validation (V&V) of expert systems. To achieve this, some background in V&V as applied to conventionally implemented software is required. Part one will discuss the background of V&V from the perspective of (1) what is V&V of software and (2) V&V's role in developing software. Part one will also review some common analysis techniques that are applied when performing V&V of software. All of these materials will be presented based on the assumption that the reader has little or no background in V&V or in developing procedural software. The primary purpose of part two is to explain the major techniques that have been developed for V&V of expert systems.

  6. Additional Treatments Offer Little Benefit for Pancreatic Cancer: Study

    MedlinePlus

    ... of gastroenterology-pancreatology at Beaujon Hospital, in Clichy, France. The study was funded by the pharmaceutical company ... D., department of gastroenterology-pancreatology, Beaujon Hospital, Clichy, France; Deborah Schrag, M.D., M.P.H., chief ...

  7. NMR relaxometry study of plaster mortar with polymer additives

    SciTech Connect

    Jumate, E.; Manea, D.; Moldovan, D.; Fechete, R.

    2013-11-13

    Cement mixed with water forms a plastic paste or slurry which stiffens in time and finally hardens into a resistant stone. The addition of sand aggregates, polymers (Walocel) and/or calcium carbonate will modify dramatically the final mortar mechanical and thermal properties. The hydration processes can be observed using 1D NMR measurements of transverse T{sub 2} relaxation time distributions analysed by a Laplace inversion algorithm. These distributions were obtained for mortar paste measured at 2 hours after preparation and then at 3, 7 and 28 days after preparation. Multiple components are identified in the T{sub 2} distributions. These can be associated with protons bound chemically or physically to the mortar minerals, characterized by a short T{sub 2} relaxation time, and with water protons in pores of three different pore sizes, as observed from SEM images. The evaporation process is fastest in the first hours after preparation, while mortar hydration (bonding of water molecules to mortar minerals) can still be observed days or months after preparation. Finally, the mechanical resistance was correlated with the transverse T{sub 2} relaxation rates corresponding to the bound water.
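
    The Laplace inversion step mentioned above recovers a distribution of T{sub 2} values from a multi-exponential decay. A bare-bones stand-in, assuming made-up two-pool decay data and using non-negative least squares on a discretized T2 grid (real analyses add regularization):

        import numpy as np
        from scipy.optimize import nnls

        # Hypothetical CPMG-style echo decay sampled at times t (s), built from
        # two proton pools with T2 = 10 ms and 100 ms plus a little noise
        t = np.linspace(0.001, 0.5, 200)
        signal = 0.6 * np.exp(-t / 0.01) + 0.4 * np.exp(-t / 0.1)
        signal += np.random.default_rng(0).normal(0.0, 1e-3, t.size)

        # Discretize T2 on a log grid and solve min ||K x - s|| with x >= 0
        T2 = np.logspace(-3, 0, 100)
        K = np.exp(-t[:, None] / T2[None, :])
        x, _ = nnls(K, signal)
        print("T2 peaks near (s):", T2[x > 0.05 * x.max()])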

  8. ETV - VERIFICATION TESTING (ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM)

    EPA Science Inventory

    Verification testing is a major component of the Environmental Technology Verification (ETV) program. The ETV Program was instituted to verify the performance of innovative technical solutions to problems that threaten human health or the environment and was created to substantia...

  9. Additional EIPC Study Analysis: Interim Report on High Priority Topics

    SciTech Connect

    Hadley, Stanton W

    2013-11-01

    Between 2010 and 2012 the Eastern Interconnection Planning Collaborative (EIPC) conducted a major long-term resource and transmission study of the Eastern Interconnection (EI). With guidance from a Stakeholder Steering Committee (SSC) that included representatives from the Eastern Interconnection States Planning Council (EISPC) among others, the project was conducted in two phases. Phase 1 involved a long-term capacity expansion analysis that entailed the creation of eight major futures plus 72 sensitivities. Three scenarios were selected for more extensive transmission-focused evaluation in Phase 2. Five power flow analyses, nine production cost model runs (including six sensitivities), and three capital cost estimations were developed during this second phase. The results from Phase 1 and 2 provided a wealth of data that could be examined further to address energy-related questions. A list of 13 topics was developed for further analysis; this paper discusses the first five.

  10. Swarm Verification

    NASA Technical Reports Server (NTRS)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks, it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
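
    The idea can be sketched as many cheap, diversified search attempts run in parallel, with the first member to find a defect reporting it. The randomized search below is a stand-in; a real swarm launches a model checker such as SPIN many times with different seeds, search orders, and depth bounds.

        import multiprocessing as mp
        import random

        BAD_STATE = 123457            # hypothetical error state in a huge space
        N_STATES = 10 ** 6

        def member(seed):
            # One swarm member: a randomized, budget-limited exploration.
            rng = random.Random(seed)
            for _ in range(200_000):  # small per-member budget
                if rng.randrange(N_STATES) == BAD_STATE:
                    return seed
            return None

        if __name__ == "__main__":
            with mp.Pool(processes=8) as pool:
                for found in pool.imap_unordered(member, range(64)):
                    if found is not None:
                        print("counterexample found by member", found)
                        pool.terminate()
                        break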

  11. GENERIC VERIFICATION PROTOCOL FOR DETERMINATION OF EMISSIONS REDUCTIONS OBTAINED BY USE OF ALTERNATIVE OR REFORMULATED LIQUID FUELS, FUEL ADDITIVES, FUEL EMULSIONS AND LUBRICANTS FOR HIGHWAY AND NONROAD USE DISEL ENGINES AND LIGHT DUTY GASOLINE ENGINES AND VEHICLES

    EPA Science Inventory

    This report sets standards by which the emissions reduction provided by fuel and lubricant technologies can be tested in a comparable way. It is a generic protocol under the Environmental Technology Verification program.

  12. A Preliminary Study of In-House Monte Carlo Simulations: An Integrated Monte Carlo Verification System

    SciTech Connect

    Mukumoto, Nobutaka; Tsujii, Katsutomo; Saito, Susumu; Yasunaga, Masayoshi; Takegawa, Hidek; Yamamoto, Tokihiro; Numasaki, Hodaka; Teshima, Teruki

    2009-10-01

    Purpose: To develop an infrastructure for the integrated Monte Carlo verification system (MCVS) to verify the accuracy of conventional dose calculations, which often fail to accurately predict dose distributions, mainly due to inhomogeneities in the patient's anatomy, for example, in lung and bone. Methods and Materials: The MCVS consists of the graphical user interface (GUI) based on a computational environment for radiotherapy research (CERR) with MATLAB language. The MCVS GUI acts as an interface between the MCVS and a commercial treatment planning system to import the treatment plan, create MC input files, and analyze MC output dose files. The MCVS consists of the EGSnrc MC codes, which include EGSnrc/BEAMnrc to simulate the treatment head and EGSnrc/DOSXYZnrc to calculate the dose distributions in the patient/phantom. In order to improve computation time without approximations, an in-house cluster system was constructed. Results: The phase-space data of a 6-MV photon beam from a Varian Clinac unit was developed and used to establish several benchmarks under homogeneous conditions. The MC results agreed with the ionization chamber measurements to within 1%. The MCVS GUI could import and display the radiotherapy treatment plan created by the MC method and various treatment planning systems, such as RTOG and DICOM-RT formats. Dose distributions could be analyzed by using dose profiles and dose volume histograms and compared on the same platform. With the cluster system, calculation time was improved in line with the increase in the number of central processing units (CPUs) at a computation efficiency of more than 98%. Conclusions: Development of the MCVS was successful for performing MC simulations and analyzing dose distributions.
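
    One of the analyses mentioned, the dose volume histogram, is straightforward to compute from a dose grid. A minimal sketch with a made-up dose array standing in for a DOSXYZnrc output:

        import numpy as np

        def cumulative_dvh(dose, bins=100):
            # Cumulative DVH: fraction of the structure's voxels receiving
            # at least each dose level.
            dose = np.ravel(dose)
            levels = np.linspace(0.0, dose.max(), bins)
            volume_fraction = np.array([(dose >= d).mean() for d in levels])
            return levels, volume_fraction

        rng = np.random.default_rng(1)
        dose = rng.gamma(shape=8.0, scale=0.25, size=(32, 32, 32))  # stand-in grid, Gy
        levels, vf = cumulative_dvh(dose)
        d95 = levels[np.searchsorted(-vf, -0.95)]  # dose covering 95% of the volume
        print(f"D95 ~ {d95:.2f} Gy")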

  13. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR PERFORMANCE OF COMPUTER SOFTWARE: VERIFICATION AND VALIDATION (UA-D-2.0)

    EPA Science Inventory

    The purpose of this SOP is to define the procedures used for the initial and periodic verification and validation of computer programs used during the Arizona NHEXAS project and the Border study. Keywords: Computers; Software; QA/QC.

    The U.S.-Mexico Border Program is sponsored ...

  14. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR PERFORMANCE OF COMPUTER SOFTWARE: VERIFICATION AND VALIDATION (UA-D-2.0)

    EPA Science Inventory

    The purpose of this SOP is to define the procedures used for the initial and periodic verification and validation of computer programs used during the Arizona NHEXAS project and the "Border" study. Keywords: Computers; Software; QA/QC.

    The National Human Exposure Assessment Sur...

  15. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR PERFORMANCE OF COMPUTER SOFTWARE: VERIFICATION AND VALIDATION (IIT-A-2.0)

    EPA Science Inventory

    The purpose of this SOP is to define the procedures for the initial and periodic verification and validation of computer programs. The programs are used during the NHEXAS project, and "Border" study at the Illinois Institute of Technology (IIT) site. Keywords: computers; softwa...

  16. Psychiatric Residents' Attitudes toward and Experiences with the Clinical-Skills Verification Process: A Pilot Study on U.S. and International Medical Graduates

    ERIC Educational Resources Information Center

    Rao, Nyapati R.; Kodali, Rahul; Mian, Ayesha; Ramtekkar, Ujjwal; Kamarajan, Chella; Jibson, Michael D.

    2012-01-01

    Objective: The authors report on a pilot study of the experiences and perceptions of foreign international medical graduate (F-IMG), United States international medical graduate (US-IMG), and United States medical graduate (USMG) psychiatric residents with the newly mandated Clinical Skills Verification (CSV) process. The goal was to identify and…

  17. Monte Carlo verification of polymer gel dosimetry applied to radionuclide therapy: a phantom study

    NASA Astrophysics Data System (ADS)

    Gear, J. I.; Charles-Edwards, E.; Partridge, M.; Flux, G. D.

    2011-11-01

    This study evaluates the dosimetric performance of the polymer gel dosimeter 'Methacrylic and Ascorbic acid in Gelatin, initiated by Copper' and its suitability for quality assurance and analysis of I-131-targeted radionuclide therapy dosimetry. Four batches of gel were manufactured in-house and sets of calibration vials and phantoms were created containing different concentrations of I-131-doped gel. Multiple dose measurements were made up to 700 h post preparation and compared to equivalent Monte Carlo simulations. In addition to uniformly filled phantoms the cross-dose distribution from a hot insert to a surrounding phantom was measured. In this example comparisons were made with both Monte Carlo and a clinical scintigraphic dosimetry method. Dose-response curves generated from the calibration data followed a sigmoid function. The gels appeared to be stable over many weeks of internal irradiation with a delay in gel response observed at 29 h post preparation. This was attributed to chemical inhibitors and slow reaction rates of long-chain radical species. For this reason, phantom measurements were only made after 190 h of irradiation. For uniformly filled phantoms of I-131 the accuracy of dose measurements agreed to within 10% when compared to Monte Carlo simulations. A radial cross-dose distribution measured using the gel dosimeter compared well to that calculated with Monte Carlo. Small inhomogeneities were observed in the dosimeter attributed to non-uniform mixing of monomer during preparation. However, they were not detrimental to this study where the quantitative accuracy and spatial resolution of polymer gel dosimetry were far superior to that calculated using scintigraphy. The difference between Monte Carlo and gel measurements was of the order of a few cGy, whilst with the scintigraphic method differences of up to 8 Gy were observed. A manipulation technique is also presented which allows 3D scintigraphic dosimetry measurements to be compared to polymer
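
    The sigmoid dose-response calibration described above can be sketched as a four-parameter fit; the functional form and the calibration data below are illustrative assumptions, not the study's measured values:

        import numpy as np
        from scipy.optimize import curve_fit

        def sigmoid(dose, r_min, r_max, d50, k):
            # Generic sigmoid dose response of the MR readout (e.g. R2) vs dose
            return r_min + (r_max - r_min) / (1.0 + np.exp(-(dose - d50) / k))

        # Hypothetical calibration-vial data: absorbed dose (Gy) vs measured R2 (1/s)
        dose = np.array([0, 2, 4, 6, 8, 12, 16, 20], dtype=float)
        r2 = np.array([1.10, 1.30, 1.90, 2.80, 3.60, 4.60, 5.00, 5.10])
        params, _ = curve_fit(sigmoid, dose, r2, p0=[1.0, 5.2, 7.0, 2.0])
        print(f"fitted D50 = {params[2]:.1f} Gy")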

  18. Exploring the Possible Use of Information Barriers for future Biological Weapons Verification Regimes

    SciTech Connect

    Luke, S J

    2011-12-20

    This report describes a path forward for implementing information barriers in a future generic biological arms-control verification regime. Information barriers have become a staple of discussion in the area of arms control verification approaches for nuclear weapons and components. Information barriers when used with a measurement system allow for the determination that an item has sensitive characteristics without releasing any of the sensitive information. Over the last 15 years the United States (with the Russian Federation) has led on the development of information barriers in the area of the verification of nuclear weapons and nuclear components. The work of the US and the Russian Federation has prompted other states (e.g., UK and Norway) to consider the merits of information barriers for possible verification regimes. In the context of a biological weapons control verification regime, the dual-use nature of the biotechnology will require protection of sensitive information while allowing for the verification of treaty commitments. A major question that has arisen is whether - in a biological weapons verification regime - the presence or absence of a weapon pathogen can be determined without revealing any information about possible sensitive or proprietary information contained in the genetic materials being declared under a verification regime. This study indicates that a verification regime could be constructed using a small number of pathogens that spans the range of known biological weapons agents. Since the number of possible pathogens is small it is possible and prudent to treat these pathogens as analogies to attributes in a nuclear verification regime. This study has determined that there may be some information that needs to be protected in a biological weapons control verification regime. To protect this information, the study concludes that the Lawrence Livermore Microbial Detection Array may be a suitable technology for the detection of the

  19. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  20. Simulated sudden increase in geomagnetic activity and its effect on heart rate variability: Experimental verification of correlation studies

    NASA Astrophysics Data System (ADS)

    Caswell, Joseph M.; Singh, Manraj; Persinger, Michael A.

    2016-08-01

    Previous research investigating the potential influence of geomagnetic factors on human cardiovascular state has tended to converge upon similar inferences although the results remain relatively controversial. Furthermore, previous findings have remained essentially correlational without accompanying experimental verification. An exception to this was noted for human brain activity in a previous study employing experimental simulation of sudden geomagnetic impulses in order to assess correlational results that had demonstrated a relationship between geomagnetic perturbations and neuroelectrical parameters. The present study employed the same equipment in a similar procedure in order to validate previous findings of a geomagnetic-cardiovascular dynamic with electrocardiography and heart rate variability measures. Results indicated that potential magnetic field effects on frequency components of heart rate variability tended to overlap with previous correlational studies where low frequency power and the ratio between low and high frequency components of heart rate variability appeared affected. In the present study, a significant increase in these particular parameters was noted during geomagnetic simulation compared to baseline recordings.
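
    The frequency-domain heart rate variability measures discussed (low-frequency power, high-frequency power, and their ratio) are conventionally computed from an evenly resampled RR-interval series. A minimal sketch with synthetic RR intervals in place of real ECG-derived data, using the customary 0.04-0.15 Hz (LF) and 0.15-0.40 Hz (HF) bands:

        import numpy as np
        from scipy.signal import welch
        from scipy.interpolate import interp1d
        from scipy.integrate import trapezoid

        # Hypothetical RR intervals (s); real ones come from ECG R-peak detection
        rng = np.random.default_rng(2)
        rr = 0.8 + 0.05 * rng.standard_normal(300)
        t = np.cumsum(rr)

        # Resample the irregular RR series to an even 4 Hz grid, then Welch PSD
        fs = 4.0
        tt = np.arange(t[0], t[-1], 1.0 / fs)
        rr_even = interp1d(t, rr)(tt)
        f, pxx = welch(rr_even - rr_even.mean(), fs=fs, nperseg=256)

        lf_band = (f >= 0.04) & (f < 0.15)
        hf_band = (f >= 0.15) & (f < 0.40)
        lf = trapezoid(pxx[lf_band], f[lf_band])
        hf = trapezoid(pxx[hf_band], f[hf_band])
        print(f"LF/HF = {lf / hf:.2f}")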

  1. Structural verification for GAS experiments

    NASA Technical Reports Server (NTRS)

    Peden, Mark Daniel

    1992-01-01

    The purpose of this paper is to assist the Get Away Special (GAS) experimenter in conducting a thorough structural verification of its experiment structural configuration, thus expediting the structural review/approval process and the safety process in general. Material selection for structural subsystems will be covered with an emphasis on fasteners (GSFC fastener integrity requirements) and primary support structures (Stress Corrosion Cracking requirements and National Space Transportation System (NSTS) requirements). Different approaches to structural verifications (tests and analyses) will be outlined especially those stemming from lessons learned on load and fundamental frequency verification. In addition, fracture control will be covered for those payloads that utilize a door assembly or modify the containment provided by the standard GAS Experiment Mounting Plate (EMP). Structural hazard assessment and the preparation of structural hazard reports will be reviewed to form a summation of structural safety issues for inclusion in the safety data package.

  2. Implementation of the Short-Term Ensemble Prediction System (STEPS) in Belgium and verification of case studies

    NASA Astrophysics Data System (ADS)

    Foresti, Loris; Reyniers, Maarten; Delobbe, Laurent

    2014-05-01

    The Short-Term Ensemble Prediction System (STEPS) is a probabilistic precipitation nowcasting scheme developed at the Australian Bureau of Meteorology in collaboration with the UK Met Office. In order to account for the multiscaling nature of rainfall structures, the radar field is decomposed into an 8 levels multiplicative cascade using a Fast Fourier Transform. The cascade is advected using the velocity field estimated with optical flow and evolves stochastically according to a hierarchy of auto-regressive processes. This allows reproducing the empirical observation that the rate of temporal evolution of the small scales is faster than the large scales. The uncertainty in radar rainfall measurement and the unknown future development of the velocity field are also considered by stochastic modelling in order to reflect their typical spatial and temporal variability. Recently, a 4 years national research program has been initiated by the University of Leuven, the Royal Meteorological Institute (RMI) of Belgium and 3 other partners: PLURISK ("forecasting and management of extreme rainfall induced risks in the urban environment"). The project deals with the nowcasting of rainfall and subsequent urban inundations, as well as socio-economic risk quantification, communication, warning and prevention. At the urban scale it is widely recognized that the uncertainty of hydrological and hydraulic models is largely driven by the input rainfall estimation and forecast uncertainty. In support to the PLURISK project the RMI aims at integrating STEPS in the current operational deterministic precipitation nowcasting system INCA-BE (Integrated Nowcasting through Comprehensive Analysis). This contribution will illustrate examples of STEPS ensemble and probabilistic nowcasts for a few selected case studies of stratiform and convective rain in Belgium. The paper focuses on the development of STEPS products for potential hydrological users and a preliminary verification of the nowcasts

  3. Test/QA plan for the verification testing of alternative or reformulated liquid fuels, fuel additives, fuel emulsions, and lubricants for highway and nonroad use heavy-duty diesel engines

    EPA Science Inventory

    This Environmental Technology Verification Program test/QA plan for heavy-duty diesel engine testing at the Southwest Research Institute’s Department of Emissions Research describes how the Federal Test Procedure (FTP), as listed in 40 CFR Part 86 for highway engines and 40 CFR P...

  4. Environmental Technology Verification (ETV) Program Case Studies: Demonstrating Program Outcomes, Volume III

    EPA Science Inventory

    This booklet, ETV Program Case Studies: Demonstrating Program Outcomes, Volume III, contains two case studies, addressing verified environmental technologies for decentralized wastewater treatment and converting animal waste to energy. Each case study contains a brief description ...

  5. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, is one tool for increasing computer dependability in the face of an exponentially increasing testing effort.

  6. Theoretical study of closed-loop recycling liquid-liquid chromatography and experimental verification of the theory.

    PubMed

    Kostanyan, Artak E; Erastov, Andrey A

    2016-09-01

    The non-ideal recycling equilibrium-cell model including the effects of extra-column dispersion is used to simulate and analyze closed-loop recycling counter-current chromatography (CLR CCC). Previously, the operating scheme with the detector located before the column was considered. In this study, analysis of the process is carried out for a more realistic and practical scheme with the detector located immediately after the column. Peak equation for individual cycles and equations describing the transport of single peaks and complex chromatograms inside the recycling closed-loop, as well as equations for the resolution between single solute peaks of the neighboring cycles, for the resolution of peaks in the recycling chromatogram and for the resolution between the chromatograms of the neighboring cycles are presented. It is shown that, unlike conventional chromatography, increasing of the extra-column volume (the recycling line length) may allow a better separation of the components in CLR chromatography. For the experimental verification of the theory, aspirin, caffeine, coumarin and the solvent system hexane/ethyl acetate/ethanol/water (1:1:1:1) were used. Comparison of experimental and simulated processes of recycling and distribution of the solutes in the closed-loop demonstrated a good agreement between theory and experiment. PMID:27492599
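
    The resolution quantities treated by the theory can be illustrated with the textbook definition for two neighboring peaks; the growth rates below (peak spacing proportional to cycle number, width proportional to its square root) capture the qualitative reason recycling improves separation, and the numbers are made up:

        import numpy as np

        def resolution(t1, w1, t2, w2):
            # Chromatographic resolution from retention times and baseline
            # peak widths: Rs = 2 * (t2 - t1) / (w1 + w2)
            return 2.0 * (t2 - t1) / (w1 + w2)

        # Hypothetical retention times and widths (min) after n cycles
        for n in (1, 2, 4, 8):
            t1, t2 = 10.0 * n, 11.0 * n
            w1, w2 = 0.8 * np.sqrt(n), 0.9 * np.sqrt(n)
            print(f"cycle {n}: Rs = {resolution(t1, w1, t2, w2):.2f}")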

  7. Verification of Adaptive Systems

    SciTech Connect

    Pullum, Laura L; Cui, Xiaohui; Vassev, Emil; Hinchey, Mike; Rouff, Christopher; Buskens, Richard

    2012-01-01

    Adaptive systems are critical for future space and other unmanned and intelligent systems. Verification of these systems is also critical for their use in systems with potential harm to human life or with large financial investments. Due to their nondeterministic nature and extremely large state space, current methods for verification of software systems are not adequate to provide a high level of assurance for them. The combination of stabilization science, high performance computing simulations, compositional verification and traditional verification techniques, plus operational monitors, provides a complete approach to verification and deployment of adaptive systems that has not been used before. This paper gives an overview of this approach.

  8. A Teacher-Verification Study of Speaking and Writing Prototype Tasks for a New TOEFL

    ERIC Educational Resources Information Center

    Cumming, A.; Grant, L.; Mulcahy-Ernt, P.; Powers, D.E.

    2004-01-01

    This study was undertaken, in conjunction with other studies field-testing prototype tasks for a new TOEFL, to evaluate the content validity, perceived authenticity and educational appropriateness of these prototype tasks. We interviewed seven highly experienced instructors of English as a Second Language (ESL) at three universities, asking them…

  9. Study of wood plastic composite in the presence of nitrogen containing additives

    NASA Astrophysics Data System (ADS)

    Ali, K. M. Idriss; Khan, Mubarak A.; Husain, M. M.

    1994-10-01

    The effect of nitrogen-containing additives on wood plastic composites of MMA with simul and mango wood of Bangladesh has been investigated. Nine different additives were used; those containing the carboamide group induced the highest tensile strength in the composite.

  10. RESULTS OF A METHOD VERIFICATION STUDY FOR ANALYSES OF PCP IN SOIL

    EPA Science Inventory

    As a prelude to a field demonstration of the fungal treatment technology by the SITE Program, a field treatability study was performed to select optimal fungal species and loading rates using the site-specific soil matrix contaminated with wood-preserving wastes: PCP and PAHs. ...

  11. Case studies of thermal energy storage (TES) systems: Evaluation and verification of system performance

    SciTech Connect

    Akbari, H.; Sezgen, O.

    1992-01-01

    We have developed two case studies to review and analyze the energy performance of thermal energy storage (TES) systems in commercial buildings. Our case studies considered two partial ice storage systems in Northern California. For each case, we compiled historical data on TES design, installation, and operation. This information was further enhanced by data obtained through interviews with the building owners and operators. The performance and historical data of the TES systems and their components were grouped into issues related to design, installation, operation, and maintenance of the systems. Our analysis indicated that (1) almost all problems related to the operation of TES and non-TES systems could be traced back to the design of the system, and (2) the identified problems were not unique to the TES systems. There were as many original problems with "conventional" HVAC systems and components as with TES systems. Judging from the problems related to non-TES components identified in these two case studies, it is reasonable to conclude that conventional systems have as many problems as TES systems, but a failure in a TES system may have a more dramatic impact on thermal comfort and electricity charges. The objective of the designers of the TES systems in the case-study buildings was to design just-the-right-size systems so that both the initial investment and operating costs would be minimized. Given such criteria, a system is typically designed only for normal and steady-state operating conditions, which often precludes due consideration of factors such as maintenance, growth in the needed capacity, ease of operation, and modularity of the systems. Therefore, it is not surprising to find that these systems, at least initially, did not perform to the design intent and expectation and that they had to go through extended periods of trouble-shooting.

  12. Case studies of thermal energy storage (TES) systems: Evaluation and verification of system performance. Final report

    SciTech Connect

    Akbari, H.; Sezgen, O.

    1992-01-01

    We have developed two case studies to review and analyze the energy performance of thermal energy storage (TES) systems in commercial buildings. Our case studies considered two partial ice storage systems in Northern California. For each case, we compiled historical data on TES design, installation, and operation. This information was further enhanced by data obtained through interviews with the building owners and operators. The performance and historical data of the TES systems and their components were grouped into issues related to design, installation, operation, and maintenance of the systems. Our analysis indicated that (1) almost all problems related to the operation of TES and non-TES systems could be traced back to the design of the system, and (2) the identified problems were not unique to the TES systems. There were as many original problems with "conventional" HVAC systems and components as with TES systems. Judging from the problems related to non-TES components identified in these two case studies, it is reasonable to conclude that conventional systems have as many problems as TES systems, but a failure in a TES system may have a more dramatic impact on thermal comfort and electricity charges. The objective of the designers of the TES systems in the case-study buildings was to design just-the-right-size systems so that both the initial investment and operating costs would be minimized. Given such criteria, a system is typically designed only for normal and steady-state operating conditions, which often precludes due consideration of factors such as maintenance, growth in the needed capacity, ease of operation, and modularity of the systems. Therefore, it is not surprising to find that these systems, at least initially, did not perform to the design intent and expectation and that they had to go through extended periods of trouble-shooting.

  13. Marine induction studies based on sea surface scalar magnetic field measurements. A concept and its verification

    NASA Astrophysics Data System (ADS)

    Kuvshinov, A. V.; Poedjono, B.; Matzka, J.; Olsen, N.; Pai, S.; Samrock, F.

    2013-12-01

    Most marine EM studies are based on sea-bottom measurements, which are expensive and logistically demanding. We propose a low-cost and easy-to-deploy magnetic survey concept which exploits sea surface measurements. It is assumed that the exciting source can be described by a plane wave. The concept is based on responses that relate variations of the scalar magnetic field at the survey sites to variations of the horizontal magnetic field at a base site. It can be shown that these scalar responses are a mixture of standard tipper responses and elements of the horizontal magnetic tensor, and thus can be used to probe the electrical conductivity of the subsoil. This opens an avenue for sea-surface induction studies, which so far were believed to be very difficult to conduct with conventional approaches based on vector measurements. We perform 3-D realistic model studies in which the target region was Oahu Island and its surroundings, and the USGS-operated Honolulu geomagnetic observatory was chosen as the base site. We compare the predicted responses with the responses estimated from the scalar data collected at a few locations around Oahu Island by the unmanned, autonomous, wave- and solar-powered 'Wave Glider' developed and operated by Liquid Robotics Oil and Gas/Schlumberger. The marine robot observation platform is equipped with a towed Overhauser magnetometer (validated by USGS). The studies show an encouraging agreement between predictions and experiment in both components of the scalar response at all locations, and we consider this a proof of the suggested concept.

  14. Microbial ureolysis in the seawater-catalysed urine phosphorus recovery system: Kinetic study and reactor verification.

    PubMed

    Tang, Wen-Tao; Dai, Ji; Liu, Rulong; Chen, Guang-Hao

    2015-12-15

    Our previous study has confirmed the feasibility of using seawater as an economical precipitant for urine phosphorus (P) precipitation. However, we still understand very little about ureolysis in the Seawater-based Urine Phosphorus Recovery (SUPR) system despite its being a crucial step for urine P recovery. In this study, batch experiments were conducted to investigate the kinetics of microbial ureolysis in the seawater-urine system. Indigenous bacteria from urine and seawater exhibited relatively low ureolytic activity, but they adapted quickly to the urine-seawater mixture during batch cultivation. During cultivation, both the abundance and specific ureolysis rate of the indigenous bacteria were greatly enhanced, as confirmed by a biomass-dependent Michaelis-Menten model. The period for full ureolysis decreased from 180 h to 2.5 h after four cycles of cultivation. Based on the successful cultivation, a lab-scale SUPR reactor was set up to verify the fast ureolysis and efficient P recovery in the SUPR system. Nearly complete urine P removal was achieved in the reactor in 6 h without adding any chemicals. Terminal Restriction Fragment Length Polymorphism (TRFLP) analysis revealed that the predominant groups of bacteria in the SUPR reactor likely originated from seawater rather than urine. Moreover, batch tests confirmed the high ureolysis rates and high phosphorus removal efficiency induced by cultivated bacteria in the SUPR reactor under seawater-to-urine mixing ratios ranging from 1:1 to 9:1. This study has proved that the enrichment of indigenous bacteria in the SUPR system can lead to sufficient ureolytic activity for phosphate precipitation, thus providing an efficient and economical method for urine P recovery. PMID:26378727
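
    The kinetic model referenced above is of Michaelis-Menten type; a minimal fitting sketch with made-up substrate concentrations and rates (the study's biomass-dependent variant, which scales the maximum rate with bacterial abundance, is omitted here):

        import numpy as np
        from scipy.optimize import curve_fit

        def mm_rate(s, vmax, km):
            # Michaelis-Menten rate law: v = Vmax * S / (Km + S)
            return vmax * s / (km + s)

        # Hypothetical urea concentrations (mM) and measured ureolysis rates
        s = np.array([5, 10, 25, 50, 100, 200, 400], dtype=float)
        v = np.array([0.8, 1.5, 2.9, 4.2, 5.4, 6.1, 6.5])
        (vmax, km), _ = curve_fit(mm_rate, s, v, p0=[7.0, 50.0])
        print(f"Vmax ~ {vmax:.2f}, Km ~ {km:.0f} mM")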

  15. An Interoperability Platform Enabling Reuse of Electronic Health Records for Signal Verification Studies

    PubMed Central

    Yuksel, Mustafa; Gonul, Suat; Laleci Erturkmen, Gokce Banu; Sinaci, Ali Anil; Invernizzi, Paolo; Facchinetti, Sara; Migliavacca, Andrea; Bergvall, Tomas; Depraetere, Kristof; De Roo, Jos

    2016-01-01

    Depending mostly on voluntarily sent spontaneous reports, pharmacovigilance studies are hampered by the low quantity and quality of patient data. Our objective is to improve postmarket safety studies by enabling safety analysts to seamlessly access a wide range of EHR sources for collecting deidentified medical data sets of selected patient populations and tracing the reported incidents back to original EHRs. We have developed an ontological framework where EHR sources and target clinical research systems can continue using their own local data models, interfaces, and terminology systems, while structural and semantic interoperability are handled through rule-based reasoning on formal representations of different models and terminology systems maintained in the SALUS Semantic Resource Set. The SALUS Common Information Model at the core of this set acts as the common mediator. We demonstrate the capabilities of our framework through one of the SALUS safety analysis tools, namely the Case Series Characterization Tool, which has been deployed on top of the regional EHR Data Warehouse of the Lombardy Region containing about 1 billion records from 16 million patients and validated by several pharmacovigilance researchers with real-life cases. The results confirm significant improvements in signal detection and evaluation compared to traditional methods, which lack this background information. PMID:27123451

  16. Experimental verification of a tank to tank He II transfer model with trade study results

    NASA Technical Reports Server (NTRS)

    Yuan, S. W. K.; Frederking, T. H. K.

    1990-01-01

    A computer program has been developed to study the thermodynamics of tank-to-tank superfluid helium transfer. The model includes a supply and a receiver tank connected by a transfer line. The transfer of He II from one tank to the other is controlled by a fountain effect pump (FEP). Phase separators are present in both the supply and receiver tanks to regulate the bath temperature. A description of this model has been published elsewhere. In the present paper, data from a transfer experiment are used to verify the accuracy of this model. The experiment used an FEP made of a 2-micron sintered stainless steel porous plug. Superfluid was transferred from a liquid helium bath into a glass beaker. Bath temperatures, flow rate, and heater power records are available. These results are compared to the predictions of the computer program, and good agreement is found between the two. This model is very useful for the study and design of superfluid transfer systems, e.g., the Superfluid Helium Tanker (SFHT) and the Particle Astrophysics Magnet Facility (ASTROMAG).

  17. An Interoperability Platform Enabling Reuse of Electronic Health Records for Signal Verification Studies.

    PubMed

    Yuksel, Mustafa; Gonul, Suat; Laleci Erturkmen, Gokce Banu; Sinaci, Ali Anil; Invernizzi, Paolo; Facchinetti, Sara; Migliavacca, Andrea; Bergvall, Tomas; Depraetere, Kristof; De Roo, Jos

    2016-01-01

    Depending mostly on voluntarily sent spontaneous reports, pharmacovigilance studies are hampered by the low quantity and quality of patient data. Our objective is to improve postmarket safety studies by enabling safety analysts to seamlessly access a wide range of EHR sources for collecting deidentified medical data sets of selected patient populations and tracing the reported incidents back to original EHRs. We have developed an ontological framework where EHR sources and target clinical research systems can continue using their own local data models, interfaces, and terminology systems, while structural and semantic interoperability are handled through rule-based reasoning on formal representations of different models and terminology systems maintained in the SALUS Semantic Resource Set. The SALUS Common Information Model at the core of this set acts as the common mediator. We demonstrate the capabilities of our framework through one of the SALUS safety analysis tools, namely the Case Series Characterization Tool, which has been deployed on top of the regional EHR Data Warehouse of the Lombardy Region containing about 1 billion records from 16 million patients and validated by several pharmacovigilance researchers with real-life cases. The results confirm significant improvements in signal detection and evaluation compared to traditional methods, which lack this background information. PMID:27123451

  18. A study on the estimation and verification of the blended precipitation forecast for hydrological use in Korea

    NASA Astrophysics Data System (ADS)

    Yang, H.; Jeong, J.; Nam, K.; Ko, H.; Choi, Y.

    2012-12-01

    Quantitative precipitation forecasts from nowcasting based on radar extrapolation and from numerical weather prediction models are crucial information for severe weather such as floods, droughts, debris flows, and water quality problems, and for determining the current and future availability of water resources. Meso-scale models represent the cumulus convection process and changes in precipitation magnitude well, but they need spin-up time, defined as the time needed to evolve from the initial non-existent cloud to the actual state of cumulus cloud. The spin-up problem of meso-scale models yields low skill scores at short forecast lead times. Nowcasting models, which include the advection process of rainfall, are one alternative for avoiding this problem. The purpose of this study is to produce an optimized quantitative precipitation forecast by blending the forecasted precipitation of nowcasting and numerical weather prediction (NWP) models at catchment scale for hydrometeorological application. The Korea Meteorological Administration (KMA) has been operating nowcasting models, namely VSRF (Very Short Range Forecast of precipitation) and MAPLE (McGill Algorithm for Precipitation Nowcasting by Lagrangian), and short-term forecast models, namely UMRG (Unified Model of Regional Grid) and KWRF (Korean WRF). The blended precipitation forecast is estimated using a weighting scheme based on the long-term average critical success index of each individual component model. The hydrological verification of the blended precipitation forecast was conducted for 117 mid-watersheds of Korea for summertime in 2011. The blended precipitation forecast has been shown to perform better than the individual forecasted precipitation.
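
    The weighting scheme described, based on the long-term average critical success index (CSI) of each component model, can be sketched as follows; the contingency counts and rain fields are made up, and the operational scheme would also vary the weights with forecast lead time:

        import numpy as np

        def csi(hits, misses, false_alarms):
            # Critical success index: hits / (hits + misses + false alarms)
            return hits / float(hits + misses + false_alarms)

        # Hypothetical long-term contingency-table counts for the two inputs
        w = np.array([csi(80, 20, 25), csi(60, 40, 30)])
        w /= w.sum()                   # normalized CSI-based weights

        # Blend two precipitation fields (mm/h) for one forecast lead time
        rain_nowcast = np.array([[2.0, 0.5], [1.0, 0.0]])
        rain_nwp = np.array([[1.2, 0.8], [0.6, 0.3]])
        print(w[0] * rain_nowcast + w[1] * rain_nwp)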

  19. High-Dose-Rate 192Ir Brachytherapy Dose Verification: A Phantom Study

    PubMed Central

    Nikoofar, Alireza; Hoseinpour, Zohreh; Rabi Mahdavi, Seied; Hasanzadeh, Hadi; Rezaei Tavirani, Mostafa

    2015-01-01

    Background: High-dose-rate (HDR) brachytherapy may be an effective tool for palliation of dysphagia. Because of concerns about adverse effects due to the absorbed radiation dose, it is important to estimate the absorbed dose in at-risk organs during this treatment. Objectives: This study aimed to measure the absorbed dose in the parotid, thyroid, and submandibular glands, eye, trachea, spinal cord, and manubrium of the sternum during brachytherapy in an anthropomorphic phantom. Materials and Methods: To measure the radiation dose to the eye, parotid, thyroid, and submandibular glands, spine, and sternum, an anthropomorphic phantom was fitted with applicators holding thermoluminescence dosimeters (TLDs). A target volume of about 23 cm3 in the upper thoracic esophagus was defined, and the phantom underwent computed tomography (CT)-based planning for HDR brachytherapy, followed by treatment with a micro-Selectron HDR (192Ir) remote after-loading unit. Results: Absorbed doses were measured with calibrated TLDs and are expressed in centi-Gray (cGy). In regions far from the target (≥ 16 cm), such as the submandibular, parotid and thyroid glands, the mean measured dose ranged from 1.65 to 5.5 cGy. In closer regions (≤ 16 cm), the absorbed dose could be as high as 113 cGy. Conclusions: Our study showed similar depth and surface doses in regions far from the target; in closer regions, the surface and depth doses differed significantly due to the role of primary radiation, which imposed a high dose gradient. Differences between the plan and the measurements were more severe in these regions because of the simplified treatment of tissue inhomogeneity in the TPS relative to the phantom. PMID:26413250

  20. Governance of preventive Health Intervention and On time Verification of its Efficiency: the GIOVE Study

    PubMed Central

    Baio, Gianluca; Montagano, Giuseppe; Cauzillo, Gabriella; Locuratolo, Francesco; Becce, Gerardo; Gitto, Lara; Marcellusi, Andrea; Zweifel, Peter; Capone, Alessandro; Favato, Giampiero

    2012-01-01

    Objectives The GIOVE Study aimed at achieving allocative efficiency for the budget devoted to the prevention of human papillomavirus (HPV)-induced diseases. An ex-ante determination of the most efficient allocation of resources between screening and multicohort quadrivalent immunisation programmes was followed by an ex-post assessment of the allocative efficiency actually achieved after a 12-month period. Design A bound optimisation model was developed to determine the ex-ante allocative efficiency of resources. The alternatives compared were the screening programme alone and quadrivalent immunisation with access to screening. A sensitivity analysis was carried out to assess the uncertainty associated with the main inputs of the model. Subsequently, a cohort of girls with a complete recorded vaccination history was enrolled in an observational retrospective study for 18 months to ensure full compliance with the recommended schedule of vaccination (0, 2, 6 months) within a 12-month time horizon. Setting Basilicata region, in the south of Italy. Participants 12 848 girls aged 12, 15, 18 or 25 years. Intervention Immunisation with the quadrivalent anti-HPV vaccine. Outcome measures The vaccination coverage rate was considered the indicator of the best achievable benefit, given the budgetary constraints. Results Assuming a vaccine price of €100 per dose, a vaccination coverage rate of 59.6% was required for the most effective allocation of resources. The optimal rate of coverage was initially in favour of the multicohort strategy of vaccination against HPV (72.8%±2%). When the price paid for the quadrivalent vaccine dropped to €85 per dose, the most efficient coverage rate (69.5%) shifted closer to the immunisation rate actually achieved during the 12-month observation period. Conclusions The bound optimisation model proved to be a useful approach to the ex-ante allocation and the ex-post assessment of the resources allocated to
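
    A bound optimisation model of this kind can be illustrated as a small linear program. The sketch below is not the study's actual model: the benefit coefficients, unit costs, budget, and the screening population bound are invented, and only the 12 848 eligible girls figure comes from the abstract.

        from scipy.optimize import linprog

        benefit = [-1.0, -0.6]   # benefit per person covered (negated: linprog minimizes)
        cost = [300.0, 40.0]     # EUR per fully vaccinated girl / per screened woman
        budget = 3.0e6           # EUR, total prevention budget (invented)

        res = linprog(c=benefit,
                      A_ub=[cost], b_ub=[budget],        # stay within the budget
                      bounds=[(0, 12848), (0, 50000)])   # eligible populations
        vaccinated, screened = res.x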

  1. Preliminary studies of PQS PET detector module for dose verification of carbon beam therapy

    NASA Astrophysics Data System (ADS)

    Kim, H.-I.; An, S. Jung; Lee, C. Y.; Jo, W. J.; Min, E.; Lee, K.; Kim, Y.; Joung, J.; Chung, Y. H.

    2014-05-01

    PET imaging can be used to verify the dose distributions of therapeutic particle beams such as carbon ion beams. The purpose of this study was to develop a PET detector module designed for an in-beam PET scanner geometry integrated into a carbon beam therapy system, and to evaluate its feasibility as a monitoring system for the patient dose distribution. A C-shaped PET geometry was proposed to avoid blockage of the carbon beam by the detector modules. The proposed PET system consists of 14 detector modules forming a bore with a 30.2 cm inner diameter for brain imaging. Each detector module is composed of a 9 × 9 array of 4.0 mm × 4.0 mm × 20.0 mm LYSO crystals optically coupled to four 29 mm diameter PMTs using the photomultiplier-quadrant-sharing (PQS) technique. Because each crystal pixel is identified from the distribution of scintillation light over the four PMTs, the design of the reflectors between crystal elements must be carefully optimized. The optical design of the reflectors was optimized using DETECT2000, a Monte Carlo code for light photon transport. A laser-cut reflector set was fabricated from Enhanced Specular Reflector (ESR, 3M Co.) mirror film with a high reflectance of 98% and a thickness of 0.064 mm. All 81 crystal elements of the detector module were identified. These results demonstrate the feasibility of the C-shaped PET system, which is under development, and we present the first reconstructed image.
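
    In PQS designs the crystal of interaction is typically decoded Anger-style from the light shared among the four PMTs. The abstract does not give the decoding details, so the following fragment is a generic illustration; the function name, assumed PMT layout, and signal values are invented.

        def pqs_position(a, b, c, d):
            """Anger-style position estimate from four quadrant-sharing PMT
            signals (a, b: top row; c, d: bottom row, assumed layout)."""
            total = a + b + c + d
            x = ((b + d) - (a + c)) / total   # left-right light asymmetry
            y = ((a + b) - (c + d)) / total   # top-bottom light asymmetry
            return x, y

        # The (x, y) estimates cluster into a 9 x 9 flood map; a calibration
        # lookup table then maps each cluster to one crystal element.
        x, y = pqs_position(120.0, 300.0, 90.0, 210.0)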

  2. NRC model simulations in support of the hydrologic code intercomparison study (HYDROCOIN): Level 1-code verification

    SciTech Connect

    Not Available

    1988-03-01

    HYDROCOIN is an international study for examining ground-water flow modeling strategies and their influence on safety assessments of geologic repositories for nuclear waste. This report summarizes only the combined NRC project teams' simulation efforts on the computer code benchmarking problems. The codes used to simulate these seven problems were SWIFT II, FEMWATER, UNSAT2M, USGS-3D, and TOUGH. In general, linear problems involving scalars such as hydraulic head were accurately simulated by both finite-difference and finite-element solution algorithms. Both types of codes produced accurate results even for complex geometries such as intersecting fractures. Difficulties were encountered in solving problems that involved nonlinear effects such as density-driven flow and unsaturated flow. To fully evaluate the accuracy of these codes, post-processing of results using particle tracking algorithms and flux calculations was examined. This proved very valuable by uncovering disagreements among code results even though the hydraulic-head solutions had been in agreement. 9 refs., 111 figs., 6 tabs.
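
    The value of flux-based post-processing is easy to see in a one-dimensional sketch: two codes can produce nearly identical head profiles whose derived Darcy fluxes still disagree. The head values, conductivity, and grid spacing below are invented for illustration.

        import numpy as np

        def darcy_flux(head, K, dx):
            """Cell-face Darcy fluxes q = -K * dh/dx from a 1-D head profile."""
            return -K * np.diff(head) / dx

        # Hypothetical head solutions from two codes that agree to ~0.02 m ...
        head_a = np.array([10.00, 9.20, 8.50, 7.90])
        head_b = np.array([10.00, 9.21, 8.52, 7.90])
        q_a = darcy_flux(head_a, K=1e-5, dx=50.0)
        q_b = darcy_flux(head_b, K=1e-5, dx=50.0)
        # ... yet whose fluxes differ by a few percent at interior faces
        max_rel_disagreement = np.max(np.abs(q_a - q_b) / np.abs(q_a))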

  3. Theory for noise of propellers in angular inflow with parametric studies and experimental verification

    NASA Technical Reports Server (NTRS)

    Hanson, Donald B.; Parzych, David J.

    1993-01-01

    This report presents the derivation of a frequency domain theory and working equations for the radiation of propeller harmonic noise in the presence of angular inflow. In applying the acoustic analogy, integration over the tangential coordinate of the source region is performed numerically, permitting the equations to be solved without approximation for any degree of angular inflow. The inflow angle is specified in terms of the yaw, pitch, and roll angles of the aircraft. Since these can be arbitrarily large, the analysis applies with equal accuracy to propellers and helicopter rotors. For thickness and loading, the derivation is given in complete detail with working equations for the near and far field; the quadrupole derivation has been carried only far enough to show the feasibility of the numerical approach. Explicit formulas are presented for computation of source elements, evaluation of Green's functions, and location of observer points in various visual and retarded coordinate systems. The resulting computer program, called WOBBLE, has been written in FORTRAN and follows the notation of this report very closely. The new theory is explored to establish the effects of varying inflow angle on axial and circumferential directivity. Parametric studies were also performed to evaluate phenomena outside the capabilities of earlier theories, such as an unsteady thickness effect. The validity of the theory was established by comparison with test data from conventional propellers and Prop Fans in flight and in wind tunnels under a variety of operating conditions and inflow angles.
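
    The report's key numerical step, integrating the tangential source distribution for each harmonic, can be sketched generically; this is not WOBBLE's actual code, and the once-per-revolution loading function below is invented.

        import numpy as np

        def tangential_harmonic(source_fn, m, n=256):
            """Approximate I_m = (1/2pi) * integral over 0..2pi of
            S(phi) * exp(-i*m*phi) dphi by uniform periodic sampling."""
            phi = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
            return np.mean(source_fn(phi) * np.exp(-1j * m * phi))

        # Angular inflow makes blade loading vary once per revolution, so the
        # m = 1 harmonic is non-zero here, unlike the pure axial-inflow case.
        loading = lambda phi: 1.0 + 0.3 * np.cos(phi)
        I_1 = tangential_harmonic(loading, m=1)   # = 0.15 for this loading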

  4. First in situ TOF-PET study using digital photon counters for proton range verification.

    PubMed

    Cambraia Lopes, P; Bauer, J; Salomon, A; Rinaldi, I; Tabacchini, V; Tessonnier, T; Crespo, P; Parodi, K; Schaart, D R

    2016-08-21

    Positron emission tomography (PET) is the imaging modality most extensively tested for treatment monitoring in particle therapy. Optimal use of PET in proton therapy requires in situ acquisition of the relatively strong 15O signal due to its relatively short half-life (~2 min) and high oxygen content in biological tissues, enabling shorter scans that are less sensitive to biological washout. This paper presents the first performance tests of a scaled-down in situ time-of-flight (TOF) PET system based on digital photon counters (DPCs) coupled to Cerium-doped Lutetium Yttrium Silicate (LYSO:Ce) crystals, providing quantitative results representative of a dual-head tomograph that complies with spatial constraints typically encountered in clinical practice (2 × 50° of 360° transaxial angular acceptance). The proton-induced activity inside polymethylmethacrylate (PMMA) and polyethylene (PE) phantoms was acquired within beam pauses (in-beam) and immediately after irradiation by an actively-delivered synchrotron pencil-beam, with clinically relevant 125.67 MeV/u, 4.6 × 10^8 protons s^-1, and 10^10 total protons. 3D activity maps reconstructed with and without TOF information are compared to FLUKA simulations, demonstrating the benefit of TOF-PET to reduce limited-angle artefacts using a 382 ps full width at half maximum coincidence resolving time. The time-dependent contributions from different radionuclides to the total count-rate are investigated. We furthermore study the impact of the acquisition time window on the laterally integrated activity depth-profiles, with emphasis on 2 min acquisitions starting at different time points. The results depend on phantom composition and reflect the differences in relative contributions from the radionuclides originating from carbon and oxygen. We observe very good agreement between the shapes of the simulated and measured activity depth-profiles for post-beam protocols. However, our results

  5. First in situ TOF-PET study using digital photon counters for proton range verification

    NASA Astrophysics Data System (ADS)

    Cambraia Lopes, P.; Bauer, J.; Salomon, A.; Rinaldi, I.; Tabacchini, V.; Tessonnier, T.; Crespo, P.; Parodi, K.; Schaart, D. R.

    2016-08-01

    Positron emission tomography (PET) is the imaging modality most extensively tested for treatment monitoring in particle therapy. Optimal use of PET in proton therapy requires in situ acquisition of the relatively strong 15O signal due to its relatively short half-life (~2 min) and high oxygen content in biological tissues, enabling shorter scans that are less sensitive to biological washout. This paper presents the first performance tests of a scaled-down in situ time-of-flight (TOF) PET system based on digital photon counters (DPCs) coupled to Cerium-doped Lutetium Yttrium Silicate (LYSO:Ce) crystals, providing quantitative results representative of a dual-head tomograph that complies with spatial constraints typically encountered in clinical practice (2 × 50° of 360° transaxial angular acceptance). The proton-induced activity inside polymethylmethacrylate (PMMA) and polyethylene (PE) phantoms was acquired within beam pauses (in-beam) and immediately after irradiation by an actively-delivered synchrotron pencil-beam, with clinically relevant 125.67 MeV/u, 4.6 × 10^8 protons s^-1, and 10^10 total protons. 3D activity maps reconstructed with and without TOF information are compared to FLUKA simulations, demonstrating the benefit of TOF-PET to reduce limited-angle artefacts using a 382 ps full width at half maximum coincidence resolving time. The time-dependent contributions from different radionuclides to the total count-rate are investigated. We furthermore study the impact of the acquisition time window on the laterally integrated activity depth-profiles, with emphasis on 2 min acquisitions starting at different time points. The results depend on phantom composition and reflect the differences in relative contributions from the radionuclides originating from carbon and oxygen. We observe very good agreement between the shapes of the simulated and measured activity depth-profiles for post-beam protocols. However, our results also

  6. Software verification and testing

    NASA Technical Reports Server (NTRS)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.

  7. VEG-01: Veggie Hardware Verification Testing

    NASA Technical Reports Server (NTRS)

    Massa, Gioia; Newsham, Gary; Hummerick, Mary; Morrow, Robert; Wheeler, Raymond

    2013-01-01

    The Veggie plant/vegetable production system is scheduled to fly on ISS at the end of 2013. Since much of the technology associated with Veggie has not been previously tested in microgravity, a hardware validation flight was initiated. This test will allow data to be collected about Veggie hardware functionality on ISS, allow crew interactions to be vetted for future improvements, validate the ability of the hardware to grow and sustain plants, and collect data that will be helpful to future Veggie investigators as they develop their payloads. Additionally, food safety data on the lettuce plants grown will be collected to help support the development of a pathway for the crew to safely consume produce grown on orbit. Significant background research has been performed on the Veggie plant growth system, with early tests focusing on the development of the rooting pillow concept and the selection of fertilizer, rooting medium, and plant species. More recent testing has been conducted to integrate the pillow concept into the Veggie hardware and to ensure that adequate water is provided throughout the growth cycle. Seed sanitation protocols have been established for flight, and hardware sanitation between experiments has been studied. Methods for shipping and storage of rooting pillows have been established, along with crew procedures and crew training videos for plant activities on orbit. Science verification testing was conducted: lettuce plants were successfully grown in prototype Veggie hardware, microbial samples were taken, and plants were harvested, frozen, stored, and later analyzed for microbial growth, nutrients, and ATP levels. An additional verification test, prior to the final payload verification testing, is desired to demonstrate similar growth in the flight hardware and also to test a second set of pillows containing zinnia seeds. Issues with root mat water supply are being resolved, with final testing and flight scheduled for later in 2013.

  8. Requirements, Verification, and Compliance (RVC) Database Tool

    NASA Technical Reports Server (NTRS)

    Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale

    2001-01-01

    This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information normally contained within documents (e.g. specifications, plans) is contained in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meets the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".
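
    The paper identifies the database software only as COTS, so the following sqlite3 sketch is purely illustrative of the data model described; every table name, column, and the sample requirement ID are invented.

        import sqlite3

        con = sqlite3.connect("rvc.db")
        con.executescript("""
        CREATE TABLE IF NOT EXISTS requirement (
            req_id    TEXT PRIMARY KEY,                    -- e.g. 'ISWE-123'
            text      TEXT NOT NULL,
            parent_id TEXT REFERENCES requirement(req_id)  -- traceability link
        );
        CREATE TABLE IF NOT EXISTS verification (
            ver_id           TEXT PRIMARY KEY,
            req_id           TEXT NOT NULL REFERENCES requirement(req_id),
            method           TEXT CHECK (method IN
                ('test','analysis','inspection','demonstration')),
            success_criteria TEXT,
            compliance       TEXT DEFAULT 'open'  -- open / verified / waived
        );
        """)
        # A typical tailored report: every requirement with its current status
        rows = con.execute("""
            SELECT r.req_id, v.method, v.compliance
            FROM requirement r
            LEFT JOIN verification v ON v.req_id = r.req_id
        """).fetchall()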

  9. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 11 2013-01-01 2013-01-01 false Engineering and cost studies-addition of generation... TO ELECTRIC LOANS AND GUARANTEES Construction Work Plans and Related Studies § 1710.253 Engineering... engineering and cost studies as specified by RUS. The studies shall cover a period from the beginning of...

  10. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 11 2010-01-01 2010-01-01 false Engineering and cost studies-addition of generation... TO ELECTRIC LOANS AND GUARANTEES Construction Work Plans and Related Studies § 1710.253 Engineering... engineering and cost studies as specified by RUS. The studies shall cover a period from the beginning of...

  11. Practical mask inspection system with printability and pattern priority verification

    NASA Astrophysics Data System (ADS)

    Tsuchiya, Hideo; Ozaki, Fumio; Takahara, Kenichi; Inoue, Takafumi; Kikuiri, Nobutaka

    2011-05-01

    Through four years of study in the Association of Super-Advanced Electronics Technologies (ASET) on reducing mask manufacturing Turn Around Time (TAT) and cost, we have established a technology to improve the efficiency of the review process by applying a printability verification function that utilizes computational lithography simulations to analyze defects detected by a high-resolution mask inspection system. With the advent of Source-Mask Optimization (SMO) and other technologies that extend the life of existing optical lithography, it is becoming extremely difficult to judge a defect solely by the shape of a mask pattern while avoiding pseudo-defects. Thus, printability verification is indispensable for filtering out nuisance defects from high-resolution mask inspection results. When computational lithography simulations are used to verify printability with high precision, the image captured by the inspection system must be prepared with extensive care; for practical applications, this preparation process needs to be simplified. In addition, utilizing Mask Data Rank (MDR) to vary the defect detection sensitivity according to the patterns is useful for simultaneously inspecting minute patterns and avoiding pseudo-defects. Combining these two technologies, we believe practical mask inspection for next-generation lithography is achievable. We have been improving the estimation accuracy of the printability verification function through discussions with several customers and evaluation of their masks. In this report, we describe the progress of these practical mask verification functions developed through customers' evaluations.

  12. Cold Flow Verification Test Facility

    SciTech Connect

    Shamsi, A.; Shadle, L.J.

    1996-12-31

    The cold flow verification test facility consists of a 15-foot high, 3-foot diameter domed vessel made of clear acrylic in two flanged sections. The unit can operate at pressures up to 14 psig. The internals include a 10-foot high jetting fluidized bed, a cylindrical baffle that hangs from the dome, and a rotating grate for control of continuous solids removal. The fluid bed is continuously fed solids (20 to 150 lb/hr) through a central nozzle made up of concentric pipes, and can be configured as either a half or a full cylinder of various dimensions. The fluid bed has flow loops for separate air flow control for conveying solids (inner jet, 500 to 100000 scfh), make-up into the jet (outer jet, 500 to 8000 scfh), spargers in the solids removal annulus (100 to 2000 scfh), and 6 air jets (20 to 200 scfh) on the sloping conical grid. Additional air (500 to 10000 scfh) can be added to the top of the dome and under the rotating grate. The outer vessel, the hanging cylindrical baffle or skirt, and the rotating grate can be used to study issues concerning moving bed reactors. There is ample allowance for access and instrumentation in the outer shell. Furthermore, this facility is available for future Cooperative Research and Development Agreements (CRADAs) to study issues and problems associated with fluid- and fixed-bed reactors. The design allows testing of different dimensions and geometries.

  13. The PASCAL-HDM Verification System

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The PASCAL-HDM verification system is described. This system supports the mechanical generation of verification conditions from PASCAL programs and HDM-SPECIAL specifications using the Floyd-Hoare axiomatic method. Tools are provided to parse programs and specifications, check their static semantics, generate verification conditions from Hoare rules, and translate the verification conditions appropriately for proof using the Shostak Theorem Prover. The differences between standard PASCAL and the language handled by this system are explained. These consist mostly of restrictions to the standard language definition, the only extensions or modifications being the addition of specifications to the code and the requirement that references to a function of no arguments have empty parentheses.
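
    For reference, the assignment axiom at the heart of Floyd-Hoare verification-condition generation can be written as (a textbook formulation, not quoted from the PASCAL-HDM documentation):

        \{\, Q[e/x] \,\}\; x := e \;\{\, Q \,\}

    so that proving, say, {x > 0} x := x + 1 {x > 1} reduces to the verification condition x > 0 \implies x + 1 > 1, which a prover such as Shostak's can discharge automatically.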

  14. COS Internal NUV Wavelength Verification

    NASA Astrophysics Data System (ADS)

    Keyes, Charles

    2009-07-01

    This program will be executed after the uplink of the OSM2 position updates derived from the determination of the wavelength-scale zero points and desired spectral ranges for each grating in activity COS14 {program 11474 - COS NUV Internal/External Wavelength Scales}. This program will verify that the operational spectral ranges for each grating, central wavelength, and FP-POS are those desired. Subsequent to a successful verification, COS NUV ERO observations and NUV science can be enabled. An internal wavelength calibration spectrum using the default PtNe lamp {lamp 1} with each NUV grating at each central wavelength setting and each FP-POS position will be obtained for the verification. Additional exposures and waits between certain exposures will be required to avoid - and to evaluate - mechanism drifts.

  15. COS Internal FUV Wavelength Verification

    NASA Astrophysics Data System (ADS)

    Keyes, Charles

    2009-07-01

    This program will be executed after the uplink of the OSM1 position updates derived from the determination of the wavelength-scale zero points and desired spectral ranges for each grating in activity COS29 {program 11487 - COS FUV Internal/External Wavelength Scales}. This program will verify that the operational spectral ranges for each grating, central wavelength, and FP-POS are those desired. Subsequent to a successful verification, COS FUV ERO observations that require accurate wavelength scales {if any} and FUV science can be enabled. An internal wavelength calibration spectrum using the default PtNe lamp {lamp 1} with each FUV grating at each central wavelength setting and each FP-POS position will be obtained for the verification. Additional exposures and waits between certain exposures will be required to avoid - and to evaluate - mechanism drifts.

  16. A direct anatomical study of additional renal arteries in a Colombian mestizo population.

    PubMed

    Saldarriaga, B; Pérez, A F; Ballesteros, L E

    2008-05-01

    Traditional anatomy describes each kidney as receiving irrigation from a single renal artery. However, the current literature reports great variability in the renal blood supply, the number of renal arteries being the most frequently found variation. Such variation has great implications when surgery is indicated, such as in renal transplants, uroradiological procedures, renovascular hypertension, renal trauma, and hydronephrosis. This article aims to determine the frequency of additional renal arteries and their morphological expression in a Colombian population in a cross-sectional study. A total of 196 renal blocks were analysed from autopsies carried out in the Bucaramanga Institute of Forensic Medicine, Colombia; these renal blocks were processed by the injection-corrosion technique. The average age of the people studied was 33.8 +/- 15.6 years; 85.4% of them were male and the rest female. One additional renal artery was found in 22.3% of the whole population, and two additional ones were found in 2.6% of the same sample. The additional renal artery was most frequently found on the left side. The additional artery arose from the aorta's lateral aspect (52.4%); these additional arteries usually entered the renal parenchyma through the hilum. No difference was established according to gender. Nearly a third of the Colombian population presents one additional renal artery and about 3% of the same population presents two additional renal arteries. Most of them reach the kidney through its hilar region. PMID:18521812

  17. Online fingerprint verification.

    PubMed

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints: local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results show that this system can be used effectively for secure online verification applications. PMID:17365425
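
    A filter-bank representation of the kind described is classically built from oriented Gabor kernels (in the spirit of the FingerCode approach; the study's exact kernel size, frequency, and orientation count are not given, so the values below are invented).

        import numpy as np

        def gabor_kernel(ksize, sigma, theta, freq):
            """2-D Gabor kernel tuned to ridge frequency and orientation."""
            half = ksize // 2
            y, x = np.mgrid[-half:half + 1, -half:half + 1]
            xr = x * np.cos(theta) + y * np.sin(theta)
            yr = -x * np.sin(theta) + y * np.cos(theta)
            envelope = np.exp(-(xr**2 + yr**2) / (2.0 * sigma**2))
            return envelope * np.cos(2.0 * np.pi * freq * xr)

        # Eight orientations; filtering the ridge image with each kernel and
        # taking block-wise standard deviations yields a fixed-length feature
        # vector capturing local ridge structure that minutiae alone miss.
        bank = [gabor_kernel(33, sigma=4.0, theta=k * np.pi / 8, freq=0.1)
                for k in range(8)]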

  18. National Energy Efficiency Evaluation, Measurement and Verification (EM&V) Standard: Scoping Study of Issues and Implementation Requirements

    SciTech Connect

    Schiller Consulting, Inc.; Schiller, Steven R.; Goldman, Charles A.; Galawish, Elsia

    2011-02-04

    This report is a scoping study that identifies issues associated with developing a national evaluation, measurement and verification (EM&V) standard for end-use, non-transportation, energy efficiency activities. The objectives of this study are to identify the scope of such a standard and to define the EM&V requirements and issues that would need to be addressed in it. To explore these issues, we provide and discuss: (1) a set of definitions applicable to an EM&V standard; (2) a literature review of existing guidelines, standards, and 'initiatives' relating to EM&V standards, as well as a review of 'bottom-up' versus 'top-down' evaluation approaches; (3) a summary of the EM&V-related provisions of two recent federal legislative proposals (Congressman Waxman's and Markey's American Clean Energy and Security Act of 2009 and Senator Bingaman's American Clean Energy Leadership Act of 2009) that include national efficiency resource requirements; (4) an annotated list of issues that are likely to be central to, and will need to be considered when developing, a national EM&V standard; and (5) a discussion of the implications of those issues. There are three primary reasons for developing a national efficiency EM&V standard. First, some policy makers, regulators, and practitioners believe that a national standard would streamline EM&V implementation, reduce costs and complexity, and improve comparability of results across jurisdictions, although there are benefits associated with each jurisdiction setting its own EM&V requirements based on its specific portfolio, evaluation budgets, and objectives. Second, if energy efficiency is determined by the US Environmental Protection Agency to be a Best Available Control Technology (BACT) for avoiding criteria pollutant and/or greenhouse gas emissions, then a standard can be required for documenting the emission reductions resulting from efficiency actions. The third reason for a national EM&V standard is that such a standard is

  19. Additional short-term plutonium urinary excretion data from the 1945-1947 plutonium injection studies

    SciTech Connect

    Moss, W.D.; Gautier, M.A.

    1986-01-01

    The amount of plutonium excreted per day following intravenous injection was shown to be significantly higher than predicted by the Langham power function model. Each of the Los Alamos National Laboratory notebooks used to record the original analytical data was studied for details that could influence the findings. It was discovered there were additional urine excretion data for case HP-3. This report presents the additional data, as well as data on case HP-6. (ACR)

  20. The 2014 Sandia Verification and Validation Challenge: Problem statement

    DOE PAGESBeta

    Hu, Kenneth; Orient, George

    2016-01-18

    This paper presents a case study in utilizing information from experiments, models, and verification and validation (V&V) to support a decision. It consists of a simple system with data and models provided, plus a safety requirement to assess. The goal is to pose a problem that is flexible enough to allow challengers to demonstrate a variety of approaches, but constrained enough to focus attention on a theme. This was accomplished by providing a good deal of background information in addition to the data, models, and code, but directing the participants' activities with specific deliverables. In this challenge, the theme is how to gather and present evidence about the quality of model predictions, in order to support a decision. This case study formed the basis of the 2014 Sandia V&V Challenge Workshop and this resulting special edition of the ASME Journal of Verification, Validation, and Uncertainty Quantification.

  1. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 11 2014-01-01 2014-01-01 false Engineering and cost studies-addition of generation capacity. 1710.253 Section 1710.253 Agriculture Regulations of the Department of Agriculture (Continued... TO ELECTRIC LOANS AND GUARANTEES Construction Work Plans and Related Studies § 1710.253...

  2. Evaluating Drugs and Food Additives for Public Use: A Case Studies Approach.

    ERIC Educational Resources Information Center

    Merritt, Sheridan V.

    1980-01-01

    Described is a case study used in an introductory college biology course that provides a basis for generating debate on an issue concerning the regulation of controversial food additives and prescription drugs. The case study contained within this article deals with drug screening, specifically with information related to thalidomide. (CS)

  3. Study on the Tritium Behaviors in the VHTR System. Part 1: Development of Tritium Analysis Code for VHTR and Verification

    SciTech Connect

    Eung Soo Kim; Chang Ho Oh; Mike Patterson

    2010-07-01

    A tritium permeation analysis code (TPAC) has been developed at Idaho National Laboratory (INL) using the MATLAB SIMULINK package for analysis of tritium behaviors in VHTRs integrated with hydrogen production and process heat application systems. The modeling is based on the mass balance of tritium-containing species and hydrogen (i.e., HT, H2, HTO, HTSO4, and TI) coupled with a variety of tritium source, sink, and permeation models. The code includes (1) tritium sources from ternary fission and neutron reactions with 6Li, 7Li, 10B, and 3He; (2) a tritium purification system; (3) leakage of tritium with the coolant; (4) permeation through pipes, vessels, and heat exchangers; (5) an electrolyzer for high temperature steam electrolysis (HTSE); and (6) isotope exchange for the SI process. Verification of the code has been performed by comparison with analytical solutions, experimental data, and benchmark code results based on the Peach Bottom reactor design. The results showed that all the governing equations are well implemented into the code and correctly solved. This paper summarizes the background, theory, code structure, and verification results related to the TPAC code development at Idaho National Laboratory (INL).
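
    The flavor of TPAC's mass-balance modeling can be conveyed with a one-compartment toy balance. This is not the actual TPAC system of equations; every rate constant below is invented except the tritium decay constant.

        from scipy.integrate import solve_ivp

        S = 1.0e9         # atoms/s, tritium birth rate (fission + activation), invented
        lam = 1.78e-9     # 1/s, tritium decay constant (half-life ~12.3 y)
        k_purif = 2.0e-6  # 1/s, purification removal, invented
        k_leak = 1.0e-8   # 1/s, loss with coolant leakage, invented
        k_perm = 5.0e-7   # 1/s, permeation through heat-exchanger walls, invented

        # dN/dt = source - (decay + purification + leakage + permeation) * N
        rhs = lambda t, N: S - (lam + k_purif + k_leak + k_perm) * N
        sol = solve_ivp(rhs, t_span=(0.0, 3.15e7), y0=[0.0])  # one year
        inventory = sol.y[0, -1]                              # atoms at year end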

  4. Fitting additive hazards models for case-cohort studies: a multiple imputation approach.

    PubMed

    Jung, Jinhyouk; Harel, Ofer; Kang, Sangwook

    2016-07-30

    In this paper, we consider fitting semiparametric additive hazards models for case-cohort studies using a multiple imputation approach. In a case-cohort study, the main exposure variables are measured only on selected subjects, but other covariates are often available for the whole cohort. We treat this as a special case of a covariate missing by design. We propose to employ a popular incomplete-data method, multiple imputation, for estimation of the regression parameters in additive hazards models. For the imputation models, an imputation modeling procedure based on rejection sampling is developed. A simple imputation model that can naturally be applied to a general missing-at-random situation is also considered and compared with the rejection sampling method via extensive simulation studies. In addition, misspecification of the imputation model is investigated. The proposed procedures are illustrated using a cancer data example. Copyright © 2015 John Wiley & Sons, Ltd. PMID:26194861
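
    The semiparametric additive hazards model referred to here is conventionally written in the Lin-Ying form (a standard formulation; the paper's exact specification is not reproduced in the abstract):

        \lambda(t \mid Z) = \lambda_0(t) + \beta^{\top} Z(t),

    where \lambda_0(t) is an unspecified baseline hazard and \beta is the vector of regression effects estimated from the case-cohort sample, with the missing components of Z multiply imputed.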

  5. [TG-FTIR study on pyrolysis of wheat-straw with abundant CaO additives].

    PubMed

    Han, Long; Wang, Qin-Hui; Yang, Yu-Kun; Yu, Chun-Jiang; Fang, Meng-Xiang; Luo, Zhong-Yang

    2011-04-01

    Biomass pyrolysis in the presence of abundant CaO additives is a fundamental process preceding CaO sorption-enhanced gasification in a biomass-based zero emission system. In the present study, thermogravimetric Fourier transform infrared (TG-FTIR) analysis was adopted to examine the effects of CaO additives on the mass loss process and volatiles evolution of wheat-straw pyrolysis. Observations from the TG and FTIR analyses simultaneously demonstrated a two-stage process for CaO-catalyzed wheat-straw pyrolysis, different from the single-stage process for pure wheat-straw pyrolysis. CaO additives could not only absorb the released CO2 but also reduce the yields of tar species such as toluene, phenol, and formic acid in the first stage, resulting in a decreased mass loss and maximum mass loss rate in this stage as CaO addition increased. The second stage was attributed to CaCO3 decomposition, and its mass loss and maximum mass loss rate increased with an increasing amount of CaO additives. The results demonstrate the great potential of CaO additives to capture CO2 and reduce tar yields in a biomass-based zero emission system. The gasification temperature in the system should be kept low enough to avoid CaCO3 decomposition. PMID:21714234

  6. SHEEP MOUNTAIN WILDERNESS STUDY AREA AND CUCAMONGA WILDERNESS AND ADDITIONS, CALIFORNIA.

    USGS Publications Warehouse

    Evans, James G.; Ridenour, James

    1984-01-01

    The Sheep Mountain Wilderness Study Area and Cucamonga Wilderness and additions encompass approximately 104 sq mi of the eastern San Gabriel Mountains, Los Angeles and San Bernardino Counties, California. A mineral survey indicates areas of probable and substantiated tungsten and gold resource potential for parts of the Sheep Mountain Wilderness Study Area and an area of probable tungsten and gold resource potential in the Cucamonga Wilderness and additions. The rugged topography, withdrawal of lands from mineral entry to protect watershed, and restricted entry of lands during periods of high fire danger have contributed to the continuing decline in mineral exploration. The geologic setting precludes the presence of energy resources.

  7. Influence of Polarization on Carbohydrate Hydration: A Comparative Study Using Additive and Polarizable Force Fields.

    PubMed

    Pandey, Poonam; Mallajosyula, Sairam S

    2016-07-14

    Carbohydrates are known to closely modulate their surrounding solvent structures and influence solvation dynamics. Spectroscopic investigations studying far-IR regions (below 1000 cm(-1)) have observed spectral shifts in the libration band (around 600 cm(-1)) of water in the presence of monosaccharides and polysaccharides. In this paper, we use molecular dynamics simulations to gain atomistic insight into carbohydrate-water interactions and to specifically highlight the differences between additive (nonpolarizable) and polarizable simulations. A total of six monosaccharide systems, α and β anomers of glucose, galactose, and mannose, were studied using additive and polarizable Chemistry at HARvard Macromolecular Mechanics (CHARMM) carbohydrate force fields. Solvents were modeled using three additive water models TIP3P, TIP4P, and TIP5P in additive simulations and polarizable water model SWM4 in polarizable simulations. The presence of carbohydrate has a significant effect on the microscopic water structure, with the effects being pronounced for proximal water molecules. Notably, disruption of the tetrahedral arrangement of proximal water molecules was observed due to the formation of strong carbohydrate-water hydrogen bonds in both additive and polarizable simulations. However, the inclusion of polarization resulted in significant water-bridge occupancies, improved ordered water structures (tetrahedral order parameter), and longer carbohydrate-water H-bond correlations as compared to those for additive simulations. Additionally, polarizable simulations also allowed the calculation of power spectra from the dipole-dipole autocorrelation function, which corresponds to the IR spectra. From the power spectra, we could identify spectral signatures differentiating the proximal and bulk water structures, which could not be captured from additive simulations. PMID:27266974
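
    The tetrahedral order parameter mentioned above is commonly computed with the Errington-Debenedetti definition (the standard form; the paper may use a variant):

        q = 1 - \frac{3}{8} \sum_{j=1}^{3} \sum_{k=j+1}^{4} \left( \cos\psi_{jk} + \tfrac{1}{3} \right)^{2},

    where \psi_{jk} is the angle subtended at a central water oxygen by its nearest-neighbour oxygens j and k; q approaches 1 for perfect tetrahedral order and averages 0 for an ideal gas.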

  8. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC robot arm and CASE backhoe validation and on a comparative study of DADS, DISCOS, and CONTOPS, which are existing public-domain and commercial multibody dynamics simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.

  9. Voltage verification unit

    DOEpatents

    Martin, Edward J.

    2008-01-15

    A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol: a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out, and, once the Lock Out/Tag Out is completed, testing or "trying" by simply reading a display on the voltage verification unit, without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  10. Verification of RADTRAN

    SciTech Connect

    Kanipe, F.L.; Neuhauser, K.S.

    1995-12-31

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes.

  11. Design, analysis, and test verification of advanced encapsulation system

    NASA Technical Reports Server (NTRS)

    Garcia, A.; Minning, C.

    1981-01-01

    Procurement of 4 in x 4 in polycrystalline solar cells proceeded with some delays. A total of 1200 cells were procured for use in both the verification testing and the qualification testing. Additional thermal structural analyses were run and the data are presented. An outline of the verification testing is included, with information on test specimen construction.

  12. Cognitive Bias in the Verification and Validation of Space Flight Systems

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Cognitive bias is generally recognized as playing a significant role in virtually all domains of human decision making. Insight into this role is informally built into many of the system engineering practices employed in the aerospace industry. The review process, for example, typically has features that help to counteract the effect of bias. This paper presents a discussion of how commonly recognized biases may affect the verification and validation process. Verifying and validating a system is arguably more challenging than development, both technically and cognitively. Whereas there may be a relatively limited number of options available for the design of a particular aspect of a system, there is a virtually unlimited number of potential verification scenarios that may be explored. The probability of any particular scenario occurring in operations is typically very difficult to estimate, which increases reliance on judgment that may be affected by bias. Implementing a verification activity often presents technical challenges that, if they can be overcome at all, often result in a departure from actual flight conditions (e.g., 1-g testing, simulation, time compression, artificial fault injection) that may raise additional questions about the meaningfulness of the results, and create opportunities for the introduction of additional biases. In addition to mitigating the biases it can introduce directly, the verification and validation process must also overcome the cumulative effect of biases introduced during all previous stages of development. A variety of cognitive biases will be described, with research results for illustration. A handful of case studies will be presented that show how cognitive bias may have affected the verification and validation process on recent JPL flight projects, identify areas of strength and weakness, and identify potential changes or additions to commonly used techniques that could provide a more robust verification and validation of

  13. Generating Scenarios of Addition and Subtraction: A Study of Japanese University Students

    ERIC Educational Resources Information Center

    Kinda, Shigehiro

    2013-01-01

    Students are presented with problems involving three scenario types of addition and subtraction in elementary mathematics: one dynamic ("Change") and two static ("Combine, Compare"). Previous studies have indicated that the dynamic type is easier for school children, whereas the static types are more difficult and comprehended only gradually…

  14. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 11 2011-01-01 2011-01-01 false Engineering and cost studies-addition of generation capacity. 1710.253 Section 1710.253 Agriculture Regulations of the Department of Agriculture (Continued) RURAL UTILITIES SERVICE, DEPARTMENT OF AGRICULTURE GENERAL AND PRE-LOAN POLICIES AND PROCEDURES COMMON TO ELECTRIC LOANS AND...

  15. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 11 2012-01-01 2012-01-01 false Engineering and cost studies-addition of generation capacity. 1710.253 Section 1710.253 Agriculture Regulations of the Department of Agriculture (Continued) RURAL UTILITIES SERVICE, DEPARTMENT OF AGRICULTURE GENERAL AND PRE-LOAN POLICIES AND PROCEDURES COMMON TO ELECTRIC LOANS AND...

  16. Experimental study of combustion of decane, dodecane and hexadecane with polymeric and nano-particle additives

    NASA Astrophysics Data System (ADS)

    Ghamari, Mohsen; Ratner, Albert

    2015-11-01

    Recent studies have shown that adding combustible nano-particles can have promising effects on the burning rate of liquid fuels: combustible nano-particles can enhance heat conduction and mixing within the droplet. Polymers also have a higher burning rate than regular hydrocarbon fuels because the flame sits closer to the droplet surface, so a polymeric additive likewise has the potential to increase the burning rate. In this study, the combustion of stationary fuel droplets of n-Decane, n-Dodecane, and n-Hexadecane doped with different percentages of a long-chain polymer and of a very fine nano carbon was examined and compared with pure-hydrocarbon behavior. In contrast with hydrocarbon droplets with no polymer addition, several combustion zones were detected: a slow and steady burning zone, a strong swelling zone, and a final fast and fairly steady combustion zone. Increasing the polymer percentage resulted in a more extended swelling zone and a shorter slow-burning zone, as well as a shorter total burning time. Addition of nano-particles also resulted in an overall increased burning rate and shortened burning time, attributed to enhanced heat conduction within the droplet.
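
    Burning rates in droplet studies of this kind are conventionally quantified via the classical d-squared law; the abstract does not state its metric, so this is the standard convention rather than the authors' stated method:

        d^2(t) = d_0^2 - K t,

    where d_0 is the initial droplet diameter and K is the burning-rate constant, so the total burning time scales as d_0^2 / K and falls as K rises.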

  17. Anatomically ordered tapping interferes more with one-digit addition than two-digit addition: a dual-task fMRI study.

    PubMed

    Soylu, Firat; Newman, Sharlene D

    2016-02-01

    Fingers are used as canonical representations for numbers across cultures. In previous imaging studies, it was shown that arithmetic processing activates neural resources that are known to participate in finger movements. Additionally, in one dual-task study, it was shown that anatomically ordered finger tapping disrupts addition and subtraction more than multiplication, possibly due to a long-lasting effect of early finger counting experiences on the neural correlates and organization of addition and subtraction processes. How arithmetic task difficulty and tapping complexity affect the concurrent performance is still unclear. If early finger counting experiences have bearing on the neural correlates of arithmetic in adults, then one would expect anatomically and non-anatomically ordered tapping to have different interference effects, given that finger counting is usually anatomically ordered. To unravel these issues, we studied how (1) arithmetic task difficulty and (2) the complexity of the finger tapping sequence (anatomical vs. non-anatomical ordering) affect concurrent performance and use of key neural circuits using a mixed block/event-related dual-task fMRI design with adult participants. The results suggest that complexity of the tapping sequence modulates interference on addition, and that one-digit addition (fact retrieval), compared to two-digit addition (calculation), is more affected from anatomically ordered tapping. The region-of-interest analysis showed higher left angular gyrus BOLD response for one-digit compared to two-digit addition, and in no-tapping conditions than dual tapping conditions. The results support a specific association between addition fact retrieval and anatomically ordered finger movements in adults, possibly due to finger counting strategies that deploy anatomically ordered finger movements early in the development. PMID:26410214

  18. Pilot study on verification of effectiveness on operability of assistance system for robotic tele-surgery using simulation.

    PubMed

    Kawamura, Kazuya; Kobayashi, Yo; Fujie, Masakatsu G

    2010-01-01

    Tele-surgery enables medical care even in remote regions and has been accomplished in clinical cases by means of dedicated communication lines. To make tele-surgery a more widespread method of providing medical care, a surgical environment needs to be made available over public communication lines, such as the Internet. Moreover, a support system is required during surgery, because the surgical tools are operated in an environment subject to delay. In our research, we focus on the operability of specific tasks conducted by surgeons during a medical procedure, with the aim of clarifying, by means of simulation, the optimum environment for robotic tele-surgery. In this study, we set up experimental systems using our proposed simulation system and, as a pilot study, investigated the mental workload on subjects to verify the effect of visual-assistance information. The operability of the task of gripping soft tissue was evaluated using a subjective workload assessment tool, the NASA Task Load Index. Results show that the tasks were completed, but the workload did not improve at delays of 300 ms and 400 ms in the simulated environment. Verifying the effect of the support system under delays of more than 200 ms was an important outcome of this experiment, and future studies will evaluate the operability of the system under varying conditions of comfort. In addition, an intra-operative assistance system will be constructed using simulation. PMID:21096798

  19. A synchrotron study of microstructure gradient in laser additively formed epitaxial Ni-based superalloy

    NASA Astrophysics Data System (ADS)

    Xue, Jiawei; Zhang, Anfeng; Li, Yao; Qian, Dan; Wan, Jingchun; Qi, Baolu; Tamura, Nobumichi; Song, Zhongxiao; Chen, Kai

    2015-10-01

    Laser additive forming is considered to be one of the promising techniques for repairing single crystal Ni-based superalloy parts to extend their life and reduce cost. Preservation of the single crystalline nature and prevention of thermal mechanical failure are two of the most essential issues for the application of this technique. Here we employ synchrotron X-ray microdiffraction to evaluate the quality, in terms of crystal orientation and defect distribution, of a Ni-based superalloy DZ125L directly formed by a laser additive process rooted in a single crystalline substrate of the same material. We show that a disorientation gradient caused by a high density of geometrically necessary dislocations and resultant subgrains exists in the interfacial region between the epitaxial and stray grains. This points to a relationship between stray grain formation and defect accumulation. The observation offers new directions for the study of performance control and reliability of laser additive manufactured superalloys.

  20. Microstructural Study Of Zinc Hot Dip Galvanized Coatings with Titanium Additions In The Zinc Melt

    NASA Astrophysics Data System (ADS)

    Konidaris, S.; Pistofidis, N.; Vourlias, G.; Pavlidou, E.; Stergiou, A.; Stergioudis, G.; Polychroniadis, E. K.

    2007-04-01

    Zinc hot-dip galvanizing is a method for protecting iron and steel against corrosion. Galvanizing with pure Zn or Zn with additions like Ni, Al, Pb and Bi has been extensively studied, but there is a lack of scientific information about other additions. The present work examines the effect of a 0.5 wt% Ti addition in the Zn melt. The samples were exposed to accelerated corrosion in a salt spray chamber (SSC). The microstructure and chemical composition of the coatings were determined by Optical Microscopy, XRD and SEM associated with an EDS Analyzer. The results indicate that the coatings have a typical morphology, while Zn-Ti phases were also detected.

  1. A synchrotron study of microstructure gradient in laser additively formed epitaxial Ni-based superalloy

    PubMed Central

    Xue, Jiawei; Zhang, Anfeng; Li, Yao; Qian, Dan; Wan, Jingchun; Qi, Baolu; Tamura, Nobumichi; Song, Zhongxiao; Chen, Kai

    2015-01-01

    Laser additive forming is considered to be one of the promising techniques for repairing single crystal Ni-based superalloy parts to extend their life and reduce cost. Preservation of the single crystalline nature and prevention of thermal mechanical failure are two of the most essential issues for the application of this technique. Here we employ synchrotron X-ray microdiffraction to evaluate the quality, in terms of crystal orientation and defect distribution, of a Ni-based superalloy DZ125L directly formed by a laser additive process rooted in a single crystalline substrate of the same material. We show that a disorientation gradient caused by a high density of geometrically necessary dislocations and resultant subgrains exists in the interfacial region between the epitaxial and stray grains. This points to a relationship between stray grain formation and defect accumulation. The observation offers new directions for the study of performance control and reliability of laser additive manufactured superalloys. PMID:26446425

  2. A synchrotron study of microstructure gradient in laser additively formed epitaxial Ni-based superalloy

    SciTech Connect

    Xue, Jiawei; Zhang, Anfeng; Li, Yao; Qian, Dan; Wan, Jingchun; Qi, Baolu; Tamura, Nobumichi; Song, Zhongxiao; Chen, Kai

    2015-10-08

    Laser additive forming is considered to be one of the promising techniques for repairing single crystal Ni-based superalloy parts to extend their life and reduce cost. Preservation of the single crystalline nature and prevention of thermal mechanical failure are two of the most essential issues for the application of this technique. Here we employ synchrotron X-ray microdiffraction to evaluate the quality, in terms of crystal orientation and defect distribution, of a Ni-based superalloy DZ125L directly formed by a laser additive process rooted in a single crystalline substrate of the same material. We show that a disorientation gradient caused by a high density of geometrically necessary dislocations and resultant subgrains exists in the interfacial region between the epitaxial and stray grains. This points to a relationship between stray grain formation and defect accumulation. In conclusion, the observation offers new directions for the study of performance control and reliability of laser additive manufactured superalloys.

  3. Summary of Previous Chamber or Controlled Anthrax Studies and Recommendations for Possible Additional Studies

    SciTech Connect

    Piepel, Gregory F.; Amidan, Brett G.; Morrow, Jayne B.

    2010-12-29

    This report and an associated Excel file(a) summarize the investigations and results of previous chamber and controlled studies(b) to characterize the performance of methods for collecting, storing and/or transporting, extracting, and analyzing samples from surfaces contaminated by Bacillus anthracis (BA) or related simulants. The report and the Excel file are the joint work of the Pacific Northwest National Laboratory (PNNL) and the National Institute of Standards and Technology (NIST) for the Department of Homeland Security, Science and Technology Directorate. The report was originally released as PNNL-SA-69338, Rev. 0 in November 2009 with limited distribution, but was subsequently cleared for release with unlimited distribution as this Rev. 1. Only minor changes were made to Rev. 0 to yield Rev. 1. A more substantial update (including summaries of data from other studies and more condensed summary tables) is underway.

  4. Studies of levels of biogenic amines in meat samples in relation to the content of additives.

    PubMed

    Jastrzębska, Aneta; Kowalska, Sylwia; Szłyk, Edward

    2016-01-01

    The impact of meat additives on the concentration of biogenic amines and on the quality of meat was studied. Fresh white and red meat samples were fortified with the following food additives: citric and lactic acids, disodium diphosphate, sodium nitrite, sodium metabisulphite, potassium sorbate, sodium chloride, ascorbic acid, α-tocopherol, propyl 3,4,5-trihydroxybenzoate (propyl gallate), and butylated hydroxyanisole. The content of spermine, spermidine, putrescine, cadaverine, histamine, tyramine, tryptamine, and 2-phenylethylamine was determined by capillary isotachophoretic methods in the meat samples (fresh and fortified) during four days of storage at 4°C. The results were used to estimate the impact of the tested additives on the formation of biogenic amines in white and red meat. For all tested meats, sodium nitrite, sodium chloride, and disodium diphosphate showed the best inhibition; however, cadaverine and putrescine showed the largest concentration changes during storage across all the additives. Based on the presented data on the content of biogenic amines in meat samples analysed as a function of storage time and additives, we suggest that cadaverine and putrescine have a significant impact on meat quality. PMID:26515667

  5. A patient-specific quality assurance study on absolute dose verification using ionization chambers of different volumes in RapidArc treatments

    SciTech Connect

    Syam Kumar, S.A.; Sukumar, Prabakar; Sriram, Padmanaban; Rajasekaran, Dhanabalan; Aketi, Srinu; Vivekanandan, Nagarajan

    2012-01-01

    The recalculation of 1 fraction from a patient treatment plan on a phantom and subsequent measurement have become the norm for measurement-based verification, which combines the quality assurance recommendations that deal with the treatment planning system and the beam delivery system. This type of evaluation has prompted attention to measurement equipment and techniques. Ionization chambers are considered the gold standard because of their precision, availability, and relative ease of use. This study evaluates and compares 5 different ionization chamber-phantom combinations for verification in routine patient-specific quality assurance of RapidArc treatments. Fifteen different RapidArc plans conforming to the clinical standards were selected for the study. Verification plans were then created for each treatment plan with the different chamber-phantom combinations scanned by computed tomography. These include a Medtec intensity-modulated radiation therapy (IMRT) phantom with a micro-ionization chamber (0.007 cm^3) and a pinpoint chamber (0.015 cm^3), a PTW-Octavius phantom with a semiflex chamber (0.125 cm^3) and a 2D array (0.125 cm^3), and an indigenously made circular wax phantom with a 0.6 cm^3 chamber. The measured isocenter absolute dose was compared with the treatment planning system (TPS) plan. The micro-ionization chamber shows larger deviations than the semiflex and 0.6 cm^3 chambers, with maximum variations of -4.76%, -1.49%, and 2.23% for the micro-ionization, semiflex, and Farmer chambers, respectively. The positive variations indicate that the chamber with larger volume overestimates. The Farmer chamber shows a higher deviation than the 0.125 cm^3 chamber. In general the deviation was found to be <1% with the semiflex and Farmer chambers. A maximum variation of 2% was observed for the 0.007 cm^3 ionization chamber, except in a few cases. The pinpoint chamber underestimates the calculated isocenter dose by a maximum of 4.8%. Absolute dose
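
    The pass/fail comparison in this kind of patient-specific QA reduces to a signed percent deviation of the measured isocenter dose from the TPS-calculated dose. A minimal Python sketch of that arithmetic (the dose values and the ~3% action level noted in the comment are illustrative assumptions, not figures from the study):

      def percent_deviation(measured_gy: float, tps_gy: float) -> float:
          """Signed percent deviation of a chamber reading from the TPS dose."""
          return (measured_gy - tps_gy) / tps_gy * 100.0

      # Example: TPS predicts 2.00 Gy at the isocenter; the chamber reads 1.95 Gy.
      dev = percent_deviation(measured_gy=1.95, tps_gy=2.00)
      print(f"deviation = {dev:+.2f}%")   # -2.50%, i.e., the chamber reads low;
                                          # a ~3% action level is a common assumption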

  6. Numerical study of water entry supercavitating flow around a vertical circular cylinder influenced by turbulent drag-reducing additives

    NASA Astrophysics Data System (ADS)

    Jiang, C. X.; Cheng, J. P.; Li, F. C.

    2015-01-01

    This paper attempts to introduce a numerical simulation procedure to simulate water-entry problems influenced by turbulent drag-reducing additives in a viscous incompressible medium. Firstly, we performed a numerical investigation of water-entry supercavities in water and in turbulent drag-reducing solution at the impact velocity of 28.4 m/s to confirm the accuracy of the numerical method. Based on this verification, a projectile entering water and turbulent drag-reducing solution at the relatively high velocity of 142.7 m/s (phase transition is considered) was then simulated. The Cross viscosity equation was adopted to represent the shear-thinning characteristic of the aqueous solution of drag-reducing additives. The configuration and dynamic characteristics of the water-entry supercavity and the flow resistance were discussed. The numerical simulation results are consistent with experimental data. Numerical results show that the supercavity length in drag-reducing solution is larger than that in water and that the velocity attenuates faster at high velocity than at low velocity; the influence of drag-reducing solution is more obvious at high impact velocity. Turbulent drag-reducing additives have great potential for enhancing supercavitation.
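
    The Cross viscosity equation referenced above has a standard closed form; a brief sketch of it in Python (the parameter values are illustrative placeholders, not the coefficients fitted in the paper):

      import numpy as np

      def cross_viscosity(shear_rate, eta0, eta_inf, lam, m):
          """Cross model: eta = eta_inf + (eta0 - eta_inf) / (1 + (lam * rate)**m).

          eta0    -- zero-shear viscosity (Pa*s)
          eta_inf -- infinite-shear viscosity (Pa*s)
          lam     -- time constant (s); 1/lam marks the onset of shear thinning
          m       -- dimensionless index setting the shear-thinning slope
          """
          return eta_inf + (eta0 - eta_inf) / (1.0 + (lam * shear_rate) ** m)

      # Illustrative dilute polymer solution across six decades of shear rate.
      rates = np.logspace(-1, 5, 7)
      for g, e in zip(rates, cross_viscosity(rates, 0.05, 0.001, 0.01, 0.8)):
          print(f"shear rate {g:10.1f} 1/s -> viscosity {e:.4f} Pa*s")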

  7. A Study on the Role of Sintering Additives for Fabrication of SiC Ceramic

    NASA Astrophysics Data System (ADS)

    Yoon, Han Ki; Lee, Young Ju; Cho, Ho Jun; Kim, Tae Gyu

    Silicon carbide (SiC) materials have been extensively studied for high-temperature components in advanced energy systems and advanced gas turbines. The SiC ceramics were fabricated by the NITE (Nano Infiltration Transient Eutectic phase) process using nano-SiC powder. The sintering additives, an Al2O3-Y2O3 system with or without added SiO2, were used to form a liquid phase during the sintering process. A major R&D focus for SiC ceramics is the production of high-purity material. In this study, we investigated the roles of the sintering additives (Al2O3:Y2O3) in the fabrication of SiC ceramics; the effects of SiO2 content on the density of the SiC ceramics were also investigated. To investigate the effects of SiO2, the Al2O3/Y2O3 composition was fixed and several SiO2 ratios were tested; to confirm the effects of the sintering additive ratio (Al2O3:Y2O3), it was varied between 4:6 and 6:4 in x wt.%.

  8. SU-E-J-82: Intra-Fraction Proton Beam-Range Verification with PET Imaging: Feasibility Studies with Monte Carlo Simulations and Statistical Modeling

    SciTech Connect

    Lou, K; Mirkovic, D; Sun, X; Zhu, X; Poenisch, F; Grosshans, D; Shao, Y; Clark, J

    2014-06-01

    Purpose: To study the feasibility of intra-fraction proton beam-range verification with PET imaging. Methods: Two homogeneous cylindrical PMMA phantoms (290 mm axial length; 38 mm and 200 mm diameter, respectively) were studied using PET imaging: the small phantom with a mouse-sized PET scanner (61 mm diameter field of view (FOV)) and the large phantom with a human brain-sized PET scanner (300 mm FOV). Monte Carlo (MC) simulations (MCNPX and GATE) were used to simulate 179.2 MeV proton pencil beams irradiating the two phantoms and imaged by the two PET systems. A total of 50 simulations were conducted to generate 50 positron activity distributions and, correspondingly, 50 measured activity-ranges. The accuracy and precision of these activity-ranges were calculated under different conditions (including count statistics and other factors, such as crystal cross-section). Separate from the MC simulations, an activity distribution measured from a simulated PET image was modeled as a noiseless positron activity distribution corrupted by Poisson counting noise. The results from these two approaches were compared to assess the impact of count statistics on the accuracy and precision of activity-range calculations. Results: MC simulations show that the accuracy and precision of an activity-range are dominated by the number (N) of coincidence events in the reconstructed image. Their uncertainty decreases in proportion to 1/sqrt(N), which can be understood from the statistical modeling. MC simulations also indicate that the coincidence events acquired within the first 60 seconds with 10^9 protons (small phantom) and 10^10 protons (large phantom) are sufficient to achieve both sub-millimeter accuracy and precision. Conclusion: Under the current MC simulation conditions, the initial study indicates that the accuracy and precision of beam-range verification are dominated by count statistics, and intra-fraction PET image-based beam-range verification is
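
    The 1/sqrt(N) behavior reported above can be reproduced with a toy statistical model, independent of any full MC transport code. The sketch below is an assumption-laden stand-in for the simulated activity profiles (not the authors' code): it applies Poisson noise to an idealized 1-D activity profile and measures the spread of a simple distal-edge range estimate.

      import numpy as np

      rng = np.random.default_rng(0)

      depth = np.linspace(0.0, 200.0, 2001)                   # mm, 0.1 mm bins
      profile = 1.0 / (1.0 + np.exp((depth - 150.0) / 2.0))   # sigmoid falloff near 150 mm

      def activity_range(counts):
          """Deepest bin still above 50% of the entrance plateau (distal edge)."""
          level = 0.5 * counts[:200].mean()
          return depth[np.nonzero(counts >= level)[0][-1]]

      for n_events in (1e4, 1e5, 1e6):
          expected = profile / profile.sum() * n_events       # mean counts per bin
          trials = [activity_range(rng.poisson(expected)) for _ in range(200)]
          print(f"N = {n_events:9.0f} events -> range std = {np.std(trials):.3f} mm")
      # The spread shrinks roughly as 1/sqrt(N), mirroring the reported scaling.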

  9. Using epidemiology to regulate food additives: saccharin case-control studies.

    PubMed

    Cordle, F; Miller, S A

    1984-01-01

    The increasing use of nonnutritive sweeteners and the widely publicized 1969 ban on cyclamate led to additional investigations in rodents of the carcinogenic potential of saccharin. Preliminary results of a long-term feeding study indicated formation of bladder tumors in rodents, and collective experimental evidence has demonstrated that high doses of the synthetic sweetener saccharin can cause bladder cancer in rodents. Based on the results of that and other rodent studies indicating an increased risk of bladder cancer associated with saccharin, the Commissioner of the Food and Drug Administration announced the agency's intention to propose a ban on saccharin. This intention was made known in April 1977 under the Delaney Clause of the Food, Drug, and Cosmetic Act. The clause essentially states that no additive shall be deemed safe if it is found to induce cancer in man or animals, or if it is found, after tests appropriate for the evaluation of the safety of food additives, to induce cancer in man or animals. Also in 1977, a group of epidemiologists began to assess the available epidemiologic information to determine the potential human risk. This report describes the assessment of several human epidemiologic studies available then and the results of more recent epidemiologic studies. PMID:6431484

  10. Monte Carlo patient study on the comparison of prompt gamma and PET imaging for range verification in proton therapy

    NASA Astrophysics Data System (ADS)

    Moteabbed, M.; España, S.; Paganetti, H.

    2011-02-01

    The purpose of this work was to compare the clinical adaptation of prompt gamma (PG) imaging and positron emission tomography (PET) as independent tools for non-invasive proton beam range verification and treatment validation. The PG range correlation and its differences with PET have been modeled for the first time in a highly heterogeneous tissue environment, using different field sizes and configurations. Four patients with different tumor locations (head and neck, prostate, spine and abdomen) were chosen to compare the site-specific behaviors of the PG and PET images, using both passive scattered and pencil beam fields. Accurate reconstruction of dose, PG and PET distributions was achieved by using the planning computed tomography (CT) image in a validated GEANT4-based Monte Carlo code capable of modeling the treatment nozzle and patient anatomy in detail. The physical and biological washout phenomenon and decay half-lives for PET activity for the most abundant isotopes such as 11C, 15O, 13N, 30P and 38K were taken into account in the data analysis. The attenuation of the gamma signal after traversing the patient geometry and respective detection efficiencies were estimated for both methods to ensure proper comparison. The projected dose, PG and PET profiles along many lines in the beam direction were analyzed to investigate the correlation consistency across the beam width. For all subjects, the PG method showed on average approximately 10 times higher gamma production rates than the PET method before, and 60 to 80 times higher production after including the washout correction and acquisition time delay. This rate strongly depended on tissue density and elemental composition. For broad passive scattered fields, it was demonstrated that large differences exist between PG and PET signal falloff positions and the correlation with the dose distribution for different lines in the beam direction. These variations also depended on the treatment site and the

  11. Study of asphalt/asphaltene precipitation during addition of solvents to West Sak crude

    SciTech Connect

    Jiang, J.C.; Patil, S.L.; Kamath, V.A.

    1990-07-01

    In this study, experimental data on the amount of asphalt and asphaltene precipitation due to the addition of solvents to West Sak crude were gathered. The first set of tests was conducted for two types of West Sak stock tank oils. Solvents used included ethane, carbon dioxide, propane, n-butane, n-pentane, n-heptane, Prudhoe Bay natural gas (PBG) and natural gas liquids (NGL). The effect of the solvent-to-oil dilution ratio on the amount of precipitation was studied. Alteration of the crude oil composition due to asphalt precipitation was measured using gas-liquid chromatography. A second set of experiments was conducted to measure asphaltene precipitation due to the addition of CO2 to live (recombined) West Sak crude.

  12. A Study of Aluminum Combustion in Solids, Powders, Foams, Additively-Manufactured Lattices, and Composites

    NASA Astrophysics Data System (ADS)

    Black, James; Trammell, Norman; Batteh, Jad; Curran, Nicholas; Rogers, John; Littrell, Donald

    2015-06-01

    This study examines the fireball characteristics, blast parameters, and combustion efficiency of explosively-shocked aluminum-based materials. The materials included structural and non-structural aluminum forms - such as solid cylinders, foams, additively-manufactured lattices, and powders - and some polytetrafluoroethylene-aluminum (PTFE-Al) composites. The materials were explosively dispersed in a small blast chamber, and the blast properties and products were measured with pressure transducers, thermocouples, slow and fast ultraviolet/visible spectrometers, and high-speed video.

  13. Spectra-temporal patterns underlying mental addition: an ERP and ERD/ERS study.

    PubMed

    Ku, Yixuan; Hong, Bo; Gao, Xiaorong; Gao, Shangkai

    2010-03-12

    Functional neuroimaging data have shown that mental calculation involves fronto-parietal areas that are composed of different subsystems shared with other cognitive functions such as working memory and language. Event-related potential (ERP) analysis has also indicated sequential information changes during the calculation process. However, little is known about the dynamic properties of oscillatory networks in this process. In the present study, we applied both ERP and event-related (de-)synchronization (ERS/ERD) analyses to EEG data recorded from normal human subjects performing tasks for sequential visual/auditory mental addition. Results in the study indicate that the late positive components (LPCs) can be decomposed into two separate parts. The earlier element LPC1 (around 360ms) reflects the computing attribute and is more prominent in calculation tasks. The later element LPC2 (around 590ms) indicates an effect of number size and appears larger only in a more complex 2-digit addition task. The theta ERS and alpha ERD show modality-independent frontal and parietal differential patterns between the mental addition and control groups, and discrepancies are noted in the beta ERD between the 2-digit and 1-digit mental addition groups. The 2-digit addition (both visual and auditory) results in similar beta ERD patterns to the auditory control, which may indicate a reliance on auditory-related resources in mental arithmetic, especially with increasing task difficulty. These results coincide with the theory of simple calculation relying on the visuospatial process and complex calculation depending on the phonological process. PMID:20105450

  14. Nuclear Data Verification and Standardization

    SciTech Connect

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards, including international coordination. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  15. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR DATA ENTRY AND DATA VERIFICATION (HAND ENTRY) (UA-D-15.0)

    EPA Science Inventory

    The purpose of this SOP is to define the steps involved in data entry and data verification of physical forms. It applies to the data entry and data verification of all physical forms. The procedure defined herein was developed for use in the Arizona NHEXAS project and the "Bor...

  16. Verification of heterogeneous multi-agent system using MCMAS

    NASA Astrophysics Data System (ADS)

    Choi, Jiyoung; Kim, Seungkeun; Tsourdos, Antonios

    2015-03-01

    The focus of the paper is how to model autonomous behaviours of heterogeneous multi-agent systems such that it can be verified that they will always operate within predefined mission requirements and constraints. This is done by using formal methods, with an abstraction of the behaviour modelling and model checking for their verification. Three case studies are presented to verify the decision-making behaviours of a heterogeneous multi-agent system using a convoy mission scenario. The multi-agent system in the case studies is extended gradually by increasing the number of agents and the functional complexity. For automatic verification, the model checker for multi-agent systems (MCMAS) is adopted due to its novel capability to accommodate multi-agent systems, and it successfully verifies the targeted behaviours of the team-level autonomous systems. The verification results retrospectively helped improve the design of the decision-making algorithms as additional agents and behaviours were considered across the three steps of scenario modification. Consequently, the last scenario deals with a system composed of a ground control system, two unmanned aerial vehicles, and four unmanned ground vehicles with fault-tolerant and communication relay capabilities.

  17. Mineral resources of the Buffalo Hump and Sand Dunes Addition Wilderness Study Areas, Sweetwater County, Wyoming

    SciTech Connect

    Gibbons, A.B.; Barbon, H.N.; Kulik, D.M.; McDonnell, J.R. Jr.

    1990-01-01

    The authors present a study to assess the potential for undiscovered mineral resources and appraise the identified resources of the Buffalo Hump and Sand Dunes Addition Wilderness Study Areas, southwestern Wyoming. There are no mines, prospects, or mineralized areas, nor any producing oil or gas wells; however, there are occurrences of coal, claystone and shale, and sand. There is a moderate resource potential for oil shale and natural gas, and a low resource potential for oil, for metals (including uranium), and for geothermal sources.

  18. Additional Treatment Services in a Cocaine Treatment Study: Level of Services Obtained and Impact on Outcome

    PubMed Central

    Worley, Matthew; Gallop, Robert; Gibbons, Mary Beth Connolly; Ring-Kurtz, Sarah; Present, Julie; Weiss, Roger D.; Crits-Christoph, Paul

    2009-01-01

    The objective of this study was to examine the level of additional treatment services obtained by patients enrolled in the NIDA Cocaine Collaborative Study, a multi-center efficacy trial of four treatments for cocaine dependence, and to determine whether these services impact treatment outcome. Cocaine-dependent patients (N = 487) were recruited at five sites and randomly assigned to six months of one of four psychosocial treatments. Assessments were made at baseline, monthly during treatment, and at follow-ups at 9, 12, 15, and 18 months post-randomization. On average, patients received little or no additional treatment services during active treatment (first 6 months), but the rate of obtaining most services increased during the follow-up phase (month 7 to 18). In general, the treatment groups did not differ in the rates of obtaining non-protocol services. For all treatment groups, patients with greater psychiatric severity received more medical and psychiatric services during active treatment and follow-up. Use of treatment services was unrelated to drug use outcomes during active treatment. However, during the follow-up period, increased use of psychiatric medication, 12-step attendance, and 12-step participation was related to less drug use. The results suggest that during uncontrolled follow-up phases, additional non-protocol services may potentially confound the interpretation of treatment group comparisons in drug use outcomes. PMID:18463998

  19. Patient Study of In Vivo Verification of Beam Delivery and Range, Using Positron Emission Tomography and Computed Tomography Imaging After Proton Therapy

    SciTech Connect

    Parodi, Katia. E-mail: Katia.Parodi@med.uni-heidelberg.de; Paganetti, Harald; Shih, Helen A.; Michaud, Susan; Loeffler, Jay S.; DeLaney, Thomas F.; Liebsch, Norbert J.; Munzenrider, John E.; Fischman, Alan J.; Knopf, Antje; Bortfeld, Thomas

    2007-07-01

    Purpose: To investigate the feasibility and value of positron emission tomography and computed tomography (PET/CT) for treatment verification after proton radiotherapy. Methods and Materials: This study included 9 patients with tumors in the cranial base, spine, orbit, and eye. Total doses of 1.8-3 GyE and 10 GyE (for an ocular melanoma) per fraction were delivered in 1 or 2 fields. Imaging was performed with a commercial PET/CT scanner for 30 min, starting within 20 min after treatment. The same treatment immobilization device was used during imaging for all but 2 patients. Measured PET/CT images were coregistered to the planning CT and compared with the corresponding PET expectation, obtained from CT-based Monte Carlo calculations complemented by functional information. For the ocular case, treatment position was approximately replicated, and spatial correlation was deduced from reference clips visible in both the planning radiographs and imaging CT. Here, the expected PET image was obtained from an analytical model. Results: Good spatial correlation and quantitative agreement within 30% were found between the measured and expected activity. For head-and-neck patients, the beam range could be verified with an accuracy of 1-2 mm in well-coregistered bony structures. Low spine and eye sites indicated the need for better fixation and coregistration methods. An analysis of activity decay revealed tissue-effective half-lives of 800-1,150 s. Conclusions: This study demonstrates the feasibility of postradiation PET/CT for in vivo treatment verification. It also indicates some technological and methodological improvements needed for optimal clinical application.

  20. Explaining Verification Conditions

    NASA Technical Reports Server (NTRS)

    Deney, Ewen; Fischer, Bernd

    2006-01-01

    The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support to trace, analyze, and understand the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.
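
    As a concrete illustration of where VCs come from (a minimal sketch, not the label-propagation machinery described above): for assignment, the Hoare rule gives wp(x := e, Q) = Q[e/x], and a straight-line program yields one VC stating that the precondition implies the computed weakest precondition. In Python, with ad hoc string-based predicates:

      import re

      def wp_assign(var: str, expr: str, post: str) -> str:
          """wp(x := e, Q) = Q[e/x]: substitute e for x in the postcondition."""
          return re.sub(rf"\b{var}\b", f"({expr})", post)

      def wp_program(assignments, post):
          """Backward pass over a list of (var, expr) assignments."""
          for var, expr in reversed(assignments):
              post = wp_assign(var, expr, post)
          return post

      # Program: i := i + 1; s := s + i   with postcondition s > 0
      vc_rhs = wp_program([("i", "i + 1"), ("s", "s + i")], "s > 0")
      print("VC: precondition ==>", vc_rhs)   # (s + (i + 1)) > 0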

  1. Nuclear disarmament verification

    SciTech Connect

    DeVolpi, A.

    1993-12-31

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  2. Wind gust warning verification

    NASA Astrophysics Data System (ADS)

    Primo, Cristina

    2016-07-01

    Operational meteorological centres around the world increasingly include warnings as one of their regular forecast products. Warnings are issued to warn the public about extreme weather situations that might occur leading to damages and losses. In forecasting these extreme events, meteorological centres help their potential users in preventing the damage or losses they might suffer. However, verifying these warnings requires specific methods. This is due not only to the fact that they happen rarely, but also because a new temporal dimension is added when defining a warning, namely the time window of the forecasted event. This paper analyses the issues that might appear when dealing with warning verification. It also proposes some new verification approaches that can be applied to wind warnings. These new techniques are later applied to a real life example, the verification of wind gust warnings at the German Meteorological Centre ("Deutscher Wetterdienst"). Finally, the results obtained from the latter are discussed.
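
    The paper's window-based approaches are not reproduced here, but categorical warning verification conventionally starts from a 2x2 contingency table of hits, misses, and false alarms. A minimal sketch of the standard scores (the event counts below are invented for illustration):

      def warning_scores(hits: int, misses: int, false_alarms: int):
          """Probability of detection, false alarm ratio, and critical success index."""
          pod = hits / (hits + misses)
          far = false_alarms / (hits + false_alarms)
          csi = hits / (hits + misses + false_alarms)
          return pod, far, csi

      # Hypothetical season of gust warnings: 42 hits, 8 missed events, 15 false alarms.
      pod, far, csi = warning_scores(42, 8, 15)
      print(f"POD = {pod:.2f}, FAR = {far:.2f}, CSI = {csi:.2f}")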

  3. TFE verification program

    NASA Astrophysics Data System (ADS)

    1994-01-01

    This is the final semiannual progress report for the Thermionic Fuel Element (TFE) Verification Program. A decision was made in August 1993 to begin a Close Out Program on October 1, 1993. Final reports summarizing the design analyses and test activities of the TFE Verification Program will be written as stand-alone documents for each task. The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein includes evaluated test data, design evaluations, the results of analyses and the significance of the results.

  4. General Environmental Verification Specification

    NASA Technical Reports Server (NTRS)

    Milne, J. Scott, Jr.; Kaufman, Daniel S.

    2003-01-01

    The NASA Goddard Space Flight Center's General Environmental Verification Specification (GEVS) for STS and ELV Payloads, Subsystems, and Components is currently being revised based on lessons learned from GSFC engineering and flight assurance. The GEVS has been used by Goddard flight projects for the past 17 years as a baseline from which to tailor their environmental test programs. A summary of the requirements and updates is presented along with the rationale behind the changes. The major test areas covered by the GEVS include mechanical, thermal, and EMC, as well as more general requirements for planning and tracking of the verification programs.

  5. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  6. A synchrotron study of microstructure gradient in laser additively formed epitaxial Ni-based superalloy

    DOE PAGES

    Xue, Jiawei; Zhang, Anfeng; Li, Yao; Qian, Dan; Wan, Jingchun; Qi, Baolu; Tamura, Nobumichi; Song, Zhongxiao; Chen, Kai

    2015-10-08

    Laser additive forming is considered to be one of the promising techniques to repair single crystal Ni-based superalloy parts to extend their life and reduce the cost. Preservation of the single crystalline nature and prevention of thermal mechanical failure are two of the most essential issues for the application of this technique. Here we employ synchrotron X-ray microdiffraction to evaluate the quality in terms of crystal orientation and defect distribution of a Ni-based superalloy DZ125L directly formed by a laser additive process rooted from a single crystalline substrate of the same material. We show that a disorientation gradient caused by a high density of geometrically necessary dislocations and resultant subgrains exists in the interfacial region between the epitaxial and stray grains. This suggests a link between stray grain formation and defect accumulation. In conclusion, the observation offers new directions for the study of performance control and reliability of laser additively manufactured superalloys.

  7. Isomeric Selective Studies of the Dominant Addition Channel in OH Initiated Oxidation of Isoprene

    NASA Astrophysics Data System (ADS)

    Ghosh, B.; Bugarin, A.; Connell, B.; North, S. W.

    2009-12-01

    We report the first isomeric selective study of the dominant isomeric pathway in the OH initiated oxidation of isoprene in the presence of O2 and NO using the Laser Photolysis-Laser Induced Fluorescence (LP-LIF) technique. The photolysis of monodeuterated/non-deuterated 2-iodo-2-methyl-but-3-en-1-ol results exclusively in the dominant OH-isoprene addition product, providing important insight into the oxidation mechanism. Based on kinetic analysis of OH cycling experiments we have determined the rate constant for O2 addition to the hydroxyalkyl radical to be (1.0±0.5) × 10^(-12) cm^(3) s^(-1), and we find a value of (8.05±2.3) × 10^(-12) cm^(3) s^(-1) for the overall reaction rate constant of the hydroxy peroxy radical with NO. We also report the first clear experimental evidence of the (E-) form of the δ-hydroxyalkoxy channel through isotopic labeling experiments and quantify its branching ratio to be 0.1±0.025. Since this channel corresponds to carbon previously missing from the isoprene oxidation balance, we have been able to account for some of that missing carbon. Since our measured isomeric selective rate constants for the dominant outer channel in OH initiated isoprene chemistry are similar to the overall rate constants derived from non-isomeric kinetics, we predict that the remaining outer addition channel will have similar reactivity. We have extended this study to the OH initiated oxidation of 1,3-butadiene, obtaining isomeric selective rate constants for the dominant channel of the butadiene oxidation chemistry and measuring the branching ratio for the δ-hydroxyalkoxy channel. These results on the butadiene studies will be discussed.
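
    The reported bimolecular rate constants translate directly into pseudo-first-order lifetimes via tau = 1/(k[M]). A back-of-the-envelope check in Python (the O2 and NO number densities are assumed typical surface values, not quantities from the study):

      K_O2 = 1.0e-12    # cm^3 s^-1, O2 addition to the hydroxyalkyl radical (reported)
      K_NO = 8.05e-12   # cm^3 s^-1, hydroxy peroxy radical + NO (reported)

      N_O2 = 5.2e18     # molecule cm^-3, ~21% of air at 1 atm and 298 K (assumed)
      N_NO = 2.5e10     # molecule cm^-3, roughly 1 ppb of NO (assumed)

      print(f"hydroxyalkyl lifetime vs O2 addition: {1.0 / (K_O2 * N_O2):.1e} s")
      print(f"peroxy radical lifetime vs NO:        {1.0 / (K_NO * N_NO):.1f} s")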

  8. Evaluation of the National School Lunch Program Application/Verification Pilot Projects: Volume II: Data Collection, Study Methods and Supplementary Tables on Certification Impacts. Nutrition Assistance Program Report Series. Report No. CN-04-AV2

    ERIC Educational Resources Information Center

    Burghardt, John; Gleason, Philip; Sinclair, Michael; Cohen, Rhoda; Hulsey, Lara; Milliner-Waddell, Julita

    2004-01-01

    This is Volume II of the report on the evaluation of the NSLP Application Verification Pilot Projects. It supplements Volume I, which presents the evaluation findings. Volume II has two objectives: (1) to provide a detailed description of the methods used to conduct the study; and (2) to present tabulations that supplement and extend the analyses…

  9. Immunotoxic effects of the color additive caramel color III: immune function studies in rats.

    PubMed

    Houben, G F; Penninks, A H; Seinen, W; Vos, J G; Van Loveren, H

    1993-01-01

    Administration of the color additive caramel color III (AC) may cause a reduction in total white blood cell counts in rats due to reduced lymphocyte counts. Besides lymphopenia, several other effects in rats have been described. The effects are caused by the imidazole derivative 2-acetyl-4(5)-(1,2,3,4-tetrahydroxybutyl)imidazole (THI) and occur in rats fed a diet low in vitamin B6. In the present paper, immune function studies on AC and THI with rats fed a diet low, but not deficient, in vitamin B6 are presented and discussed. Rats were exposed to 0.4 or 4% AC or to 5.72 ppm THI in drinking water during and for 28 days prior to the start of immune function assays. Resistance to Trichinella spiralis was examined in an oral infection model and clearance of Listeria monocytogenes upon an intravenous infection was studied. In addition, natural cell-mediated cytotoxicity of splenic and nonadherent peritoneal cells and the antibody response to sheep red blood cells were studied. From the results it is concluded that exposure of rats to AC or THI influenced various immune function parameters. Thymus-dependent immunity was suppressed, while parameters of the nonspecific resistance were also affected, as shown by a decreased natural cell-mediated cytotoxicity in the spleen and an enhanced clearance of L. monocytogenes. PMID:8432426

  10. Improving wound care simulation with the addition of odor: a descriptive, quasi-experimental study.

    PubMed

    Roberson, Donna W; Neil, Janice A; Bryant, Elizabeth T

    2008-08-01

    Improving problem-solving skills and expertise in complex clinical care provision requires engaging students in the learning process--a challenging goal when clinical practicums and supervisors are limited. High-fidelity simulation has created many new opportunities for educating healthcare professionals. Because addressing malodorous wounds is a common problem that may be difficult to "teach," a descriptive, quasi-experimental simulation study was conducted. After 137 undergraduate nursing students completed a wound care simulation and Laerdal's Simulation Experience Evaluation Tool, 50 control subjects were randomly selected, and 49 volunteer students (the experimental group) participated in a wound care simulation in which one of three strong-smelling cheeses was added to simulate a malodorous wound. Compared to the control group, study group responses were significantly better (P < 0.001) for eight of the 12 survey variables tested, indicating the addition of odor was beneficial in enhancing the perceived realism and value of the simulation. Students responded that the addition of odor in the simulation laboratory improved realism and that they felt better prepared to handle malodorous wounds in a clinical setting. An unanticipated outcome was the enhanced feeling of involvement associated with paired care teams as opposed to working in larger groups. The results of this study indicate that wound care education outcomes improve when nursing students are able to practice using a multi-sensorial wound care simulation model. PMID:18716340

  11. Verification and Validation of RADTRAN 5.5.

    SciTech Connect

    Osborn, Douglas; Weiner, Ruth F.; Mills, George Scott; Hamp, Steve C.

    2005-02-01

    This document contains a description of the verification and validation process used for the RADTRAN 5.5 code. The verification and validation process ensured that the proper calculational models and mathematical and numerical methods were used in the RADTRAN 5.5 code for the determination of risk and consequence assessments. The differences between RADTRAN 5 and RADTRAN 5.5 are the addition of tables, an expanded isotope library, and the additional User-Defined meteorological option for accident dispersion.

  12. Context Effects in Sentence Verification.

    ERIC Educational Resources Information Center

    Kiger, John I.; Glass, Arnold L.

    1981-01-01

    Three experiments examined what happens to the reaction time to verify easy items when they are mixed with difficult items in a verification task. Subjects' verification of simple arithmetic equations and sentences took longer when these items were placed in a difficult list. Difficult sentences also slowed the verification of easy arithmetic equations. (Author/RD)

  13. A water soluble additive to suppress respirable dust from concrete-cutting chainsaws: a case study.

    PubMed

    Summers, Michael P; Parmigiani, John P

    2015-01-01

    Respirable dust is of particular concern in the construction industry because it contains crystalline silica. Respirable forms of silica are a severe health threat because they heighten the risk of numerous respiratory diseases. Concrete cutting, a common work practice in the construction industry, is a major contributor to dust generation. No studies have been found that focus on dust suppression for concrete-cutting chainsaws, presumably because, during normal operation, water is supplied continuously and copiously to the dust generation points. However, there is a desire to better understand dust creation at low water flow rates. In this case study, a water-soluble surfactant additive was used in the chainsaw's water supply. Cutting was performed on a free-standing concrete wall in a covered outdoor lab with a hand-held, gas-powered, concrete-cutting chainsaw. Air was sampled at the operator's lapel and around the concrete wall to simulate nearby personnel. Two additive concentrations were tested (2.0% and 0.2%), across a range of fluid flow rates (0.38-3.8 Lpm [0.1-1.0 gpm] at 0.38 Lpm [0.1 gpm] increments). Results indicate that when a lower concentration of additive is used, exposure levels increase. However, all exposure levels, once adjusted for 3 hours of continuous cutting in an 8-hour work shift, are below the Occupational Safety and Health Administration (OSHA) permissible exposure limit (PEL) of 5 mg/m(3). Estimates were made using trend lines to predict the fluid flow rates that would cause respirable dust exposure to exceed both the OSHA PEL and the American Conference of Governmental Industrial Hygienists (ACGIH®) threshold limit value (TLV). PMID:25714034
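
    The shift adjustment mentioned above is plain arithmetic: a concentration measured over 3 hours of cutting is prorated over the 8-hour shift, assuming no exposure for the remainder. A minimal sketch (the measured concentration is an invented example; the PEL is the value cited in the abstract):

      OSHA_PEL_MG_M3 = 5.0   # respirable dust PEL cited in the abstract

      def eight_hour_twa(concentration_mg_m3: float, exposed_hours: float) -> float:
          """8-h TWA = C * t_exposed / 8, assuming zero exposure the rest of the shift."""
          return concentration_mg_m3 * exposed_hours / 8.0

      # Example: 9.0 mg/m^3 measured while cutting for 3 of the 8 hours.
      twa = eight_hour_twa(9.0, 3.0)
      verdict = "below" if twa < OSHA_PEL_MG_M3 else "at or above"
      print(f"8-h TWA = {twa:.2f} mg/m^3 ({verdict} the PEL)")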

  14. Augmenting a Waste Glass Mixture Experiment Study with Additional Glass Components and Experimental Runs

    SciTech Connect

    Piepel, Gregory F.; Cooley, Scott K.; Peeler, David K.; Vienna, John D.; Edwards, Tommy B.

    2002-01-01

    A glass composition variation study (CVS) for high-level waste (HLW) stored in Idaho is being statistically designed and performed in phases over several years. The purpose of the CVS is to investigate and model how HLW-glass properties depend on glass composition. The resulting glass property-composition models will be used to develop desirable glass formulations and for other purposes. Phases 1 and 2 of the CVS have been completed and are briefly described. This paper focuses on the CVS Phase 3 experimental design, which was chosen to augment the Phase 1 and 2 data with additional data points, as well as to account for additional glass components not studied in Phases 1 and/or 2. In total, 16 glass components were varied in the Phase 3 experimental design. The paper describes how these Phase 3 experimental design augmentation challenges were addressed using the previous data, preliminary property-composition models, and statistical mixture experiment and optimal experimental design methods and software.
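
    One common way to augment an existing design, in the spirit of the optimal experimental design methods mentioned above (though not necessarily the authors' software or criteria), is greedy D-optimal selection: repeatedly add the candidate run that most increases det(X'X) for an assumed model. A toy sketch for a first-order (linear Scheffé) three-component mixture model:

      import numpy as np

      rng = np.random.default_rng(1)

      def d_optimal_augment(existing, candidates, n_new):
          """Greedily pick candidate rows that most increase det(X'X)."""
          X, chosen = existing.copy(), []
          for _ in range(n_new):
              best_i, best_det = None, -np.inf
              for i in range(len(candidates)):
                  if i in chosen:
                      continue
                  trial = np.vstack([X, candidates[i]])
                  d = np.linalg.det(trial.T @ trial)
                  if d > best_det:
                      best_i, best_det = i, d
              chosen.append(best_i)
              X = np.vstack([X, candidates[best_i]])
          return chosen, X

      # Existing runs: the three pure components; candidates: random simplex points.
      existing = np.eye(3)
      raw = rng.random((50, 3))
      candidates = raw / raw.sum(axis=1, keepdims=True)   # each row sums to 1
      picked, design = d_optimal_augment(existing, candidates, n_new=3)
      print("candidate rows added:", picked)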

  15. Role of sulfite additives in wine induced asthma: single dose and cumulative dose studies

    PubMed Central

    Vally, H; Thompson, P

    2001-01-01

    BACKGROUND—Wine appears to be a significant trigger for asthma. Although sulfite additives have been implicated as a major cause of wine induced asthma, direct evidence is limited. Two studies were undertaken to assess sulfite reactivity in wine sensitive asthmatics. The first study assessed sensitivity to sulfites in wine using a single dose sulfited wine challenge protocol followed by a double blind, placebo controlled challenge. In the second study a cumulative dose sulfited wine challenge protocol was employed to establish if wine sensitive asthmatics as a group have an increased sensitivity to sulfites.
METHODS—In study 1, 24 asthmatic patients with a strong history of wine induced asthma were screened. Subjects showing positive responses to single blind high sulfite (300 ppm) wine challenge were rechallenged on separate days in a double blind, placebo controlled fashion with wines of varying sulfite levels to characterise their responses to these drinks. In study 2, wine sensitive asthmatic patients (n=12) and control asthmatics (n=6) were challenged cumulatively with wine containing increasing concentrations of sulfite in order to characterise further their sensitivity to sulfites in wine.
RESULTS—Four of the 24 self-reporting wine sensitive asthmatic patients were found to respond to sulfite additives in wine when challenged in a single dose fashion (study 1). In the double blind dose-response study all four had a significant fall in forced expiratory volume in one second (FEV1) (>15% from baseline) following exposure to wine containing 300 ppm sulfite, but did not respond to wines containing 20, 75 or 150 ppm sulfite. Responses were maximal at 5 minutes (mean (SD) maximal decline in FEV1 28.7 (13)%) and took 15-60 minutes to return to baseline levels. In the cumulative dose-response study (study 2) no significant difference was observed in any of the lung function parameters measured (FEV1, peak expiratory flow (PEF), mid phase forced expiratory

  16. Imaging quality full chip verification for yield improvement

    NASA Astrophysics Data System (ADS)

    Yang, Qing; Zhou, CongShu; Quek, ShyueFong; Lu, Mark; Foong, YeeMei; Qiu, JianHong; Pandey, Taksh; Dover, Russell

    2013-04-01

    Basic image intensity parameters, like maximum and minimum intensity values (Imin and Imax), image logarithm slope (ILS), normalized image logarithm slope (NILS) and mask error enhancement factor (MEEF), are well known as indexes of photolithography imaging quality. For full chip verification, hotspot detection is typically based on threshold values for line pinching or bridging. For image intensity parameters it is generally harder to quantify an absolute value that defines where the process limit will occur, and at which process stage (lithography, etch, or post-CMP). However, it is easy to conclude that hot spots captured by image intensity parameters are more susceptible to process variation and very likely to impact yield. In addition, these image intensity hot spots can be missed by resist model verification, because the resist model is normally calibrated against wafer data on a single resist plane and is an empirical model that fits the resist critical dimension with a mathematical algorithm combined with optical calculation. At the resolution enhancement technology (RET) development stage, a full chip imaging quality check is also a method to qualify the RET solution, such as Optical Proximity Correction (OPC) performance. Adding full chip verification using image intensity parameters is also not as costly as adding one more resist model simulation. From a foundry yield improvement and cost saving perspective, it is valuable to quantify the imaging quality to find design hot spots and to correctly define the inline process control margin. This paper studies the correlation between image intensity parameters and process weakness or catastrophic hard failures at different process stages. It also demonstrates how the OPC solution can improve full chip image intensity parameters. Rigorous 3D resist profile simulation across the full height of the resist stack was also performed to identify a correlation with the image intensity parameters. A methodology of post-OPC full
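
    For reference, ILS is the slope of the log aerial image at the feature edge, ILS = |d(ln I)/dx|, and NILS scales it by the target CD; MEEF is the ratio of the wafer CD change to the mask CD change. A small sketch computing ILS and NILS on an idealized (invented) 1-D aerial image profile:

      import numpy as np

      # Idealized aerial image of a 100 nm dark line centered at x = 0 (illustrative).
      x = np.linspace(-200.0, 200.0, 801)                              # nm
      image = 0.25 + 0.275 * (1.0 + np.tanh((np.abs(x) - 50.0) / 25.0))

      target_cd = 100.0   # nm, drawn line width
      edge = 50.0         # nm, nominal edge position

      # ILS = |d(ln I)/dx| evaluated at the edge; NILS = ILS * target CD.
      ils_profile = np.gradient(np.log(image), x)
      ils = abs(ils_profile[np.argmin(np.abs(x - edge))])
      print(f"ILS  = {ils:.4f} 1/nm")
      print(f"NILS = {ils * target_cd:.2f}")   # ~2 here; higher means more margin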

  17. Code Verification of the HIGRAD Computational Fluid Dynamics Solver

    SciTech Connect

    Van Buren, Kendra L.; Canfield, Jesse M.; Hemez, Francois M.; Sauer, Jeremy A.

    2012-05-04

    The purpose of this report is to outline code and solution verification activities applied to HIGRAD, a Computational Fluid Dynamics (CFD) solver of the compressible Navier-Stokes equations developed at the Los Alamos National Laboratory, and used to simulate various phenomena such as the propagation of wildfires and atmospheric hydrodynamics. Code verification efforts, as described in this report, are an important first step to establish the credibility of numerical simulations. They provide evidence that the mathematical formulation is properly implemented without significant mistakes that would adversely impact the application of interest. Highly accurate analytical solutions are derived for four code verification test problems that exercise different aspects of the code. These test problems are referred to as: (i) the quiet start, (ii) the passive advection, (iii) the passive diffusion, and (iv) the piston-like problem. These problems are simulated using HIGRAD with different levels of mesh discretization and the numerical solutions are compared to their analytical counterparts. In addition, the rates of convergence are estimated to verify the numerical performance of the solver. The first three test problems produce numerical approximations as expected. The fourth test problem (piston-like) indicates the extent to which the code is able to simulate a 'mild' discontinuity, which is a condition that would typically be better handled by a Lagrangian formulation. The current investigation concludes that the numerical implementation of the solver performs as expected. The quality of solutions is sufficient to provide credible simulations of fluid flows around wind turbines. The main caveat associated to these findings is the low coverage provided by these four problems, and somewhat limited verification activities. A more comprehensive evaluation of HIGRAD may be beneficial for future studies.
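
    A standard quantity in such exercises is the observed order of accuracy, obtained from the discretization errors on two mesh resolutions against the analytical solution: p = ln(e_coarse/e_fine) / ln(h_coarse/h_fine). A minimal sketch with invented error values for a nominally second-order scheme:

      import math

      def observed_order(h_coarse, h_fine, err_coarse, err_fine):
          """Observed order p = ln(e_coarse/e_fine) / ln(h_coarse/h_fine)."""
          return math.log(err_coarse / err_fine) / math.log(h_coarse / h_fine)

      # Halving the mesh spacing cuts the error ~4x for a second-order scheme.
      p = observed_order(h_coarse=0.02, h_fine=0.01,
                         err_coarse=4.1e-4, err_fine=1.05e-4)
      print(f"observed order of convergence: p = {p:.2f}")   # ~1.97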

  18. Characterization studies on the additives mixed L-arginine phosphate monohydrate (LAP) crystals

    NASA Astrophysics Data System (ADS)

    Haja Hameed, A. S.; Karthikeyan, C.; Ravi, G.; Rohani, S.

    2011-04-01

    L-arginine phosphate monohydrate (LAP), potassium thiocyanate (KSCN) mixed LAP (LAP:KSCN) and sodium sulfite (Na2SO3) mixed LAP (LAP:Na2SO3) single crystals were grown by the slow cooling technique. The effect of microbial contamination and coloration on the growth solutions was studied. The crystalline powders of the grown crystals were examined by X-ray diffraction and the lattice parameters of the crystals were estimated. From the FTIR spectroscopic analysis, the various functional group frequencies associated with the crystals were assigned. Vickers microhardness studies were done on the {1 0 0} faces of the pure and additive-mixed LAP crystals. From the preliminary surface second harmonic generation (SHG) results, it was found that the SHG intensity at the (1 0 0) face of the LAP:KSCN crystal was much stronger than that of pure LAP.

  19. The guanidine and maleic acid (1:1) complex. The additional theoretical and experimental studies

    NASA Astrophysics Data System (ADS)

    Drozd, Marek; Dudzic, Damian

    2012-04-01

    On the basis of experimental literature data, theoretical studies of the guanidinium and maleic acid complex were performed using the DFT method. In these studies the experimental X-ray data for two different forms of the investigated crystal were used. During the geometry optimization process only one equilibrium structure was found. According to this result, the infrared spectrum for one theoretical molecule was calculated. On the basis of potential energy distribution (PED) analysis, clear-cut assignments of the observed bands were performed. For the calculated molecule at the energy minimum, the highest occupied molecular orbital (HOMO) and the lowest unoccupied molecular orbital (LUMO) were obtained and graphically illustrated. The energy difference (GAP) between HOMO and LUMO was analyzed. Additionally, the nonlinear properties of this molecule were calculated, and the α and β (first- and second-order) hyperpolarizability values were obtained. On the basis of these results the title crystal was classified as a new second order NLO generator.

  20. Computer Graphics Verification

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Video processing creates technical animation sequences using studio quality equipment to realistically represent fluid flow over space shuttle surfaces, helicopter rotors, and turbine blades. Computer systems Co-op Tim Weatherford is shown performing computer graphics verification. Part of the Co-op brochure.

  1. FPGA Verification Accelerator (FVAX)

    NASA Technical Reports Server (NTRS)

    Oh, Jane; Burke, Gary

    2008-01-01

    Is Verification Acceleration Possible? - Increasing the visibility of the internal nodes of the FPGA results in much faster debug time - Forcing internal signals directly allows a problem condition to be set up very quickly. Is this all? - No, this is part of a comprehensive effort to improve the JPL FPGA design and V&V process.

  2. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    This presentation will be given at the EPA Science Forum 2005 in Washington, DC. The Environmental Technology Verification Program (ETV) was initiated in 1995 to speed implementation of new and innovative commercial-ready environmental technologies by providing objective, 3rd pa...

  3. Telescope performance verification

    NASA Astrophysics Data System (ADS)

    Swart, Gerhard P.; Buckley, David A. H.

    2004-09-01

    While Systems Engineering appears to be widely applied on the very large telescopes, it is lacking in the development of many of the medium and small telescopes currently in progress. The latter projects rely heavily on the experience of the project team, verbal requirements and conjecture based on the successes and failures of other telescopes. Furthermore, it is considered an unaffordable luxury to "close-the-loop" by carefully analysing and documenting the requirements and then verifying the telescope's compliance with them. In this paper the authors contend that a Systems Engineering approach is a keystone in the development of any telescope and that verification of the telescope's performance is not only an important management tool but also forms the basis upon which successful telescope operation can be built. The development of the Southern African Large Telescope (SALT) has followed such an approach and is now in the verification phase of its development. Parts of the SALT verification process will be discussed in some detail to illustrate the suitability of this approach, including oversight by the telescope shareholders, recording of requirements and results, design verification and performance testing. Initial test results will be presented where appropriate.

  4. Prazosin addition to fluvoxamine: A preclinical study and open clinical trial in OCD.

    PubMed

    Feenstra, Matthijs G P; Klompmakers, André; Figee, Martijn; Fluitman, Sjoerd; Vulink, Nienke; Westenberg, Herman G M; Denys, Damiaan

    2016-02-01

    The efficacy of selective serotonin reuptake inhibitors (SRIs) in psychiatric disorders may be "augmented" through the addition of atypical antipsychotic drugs. A synergistic increase in dopamine (DA) release in the prefrontal cortex has been suggested to underlie this augmentation effect, though the mechanism of action is not yet clear. We used in vivo microdialysis in rats to study DA release following the administration of combinations of fluvoxamine (10 mg/kg) and quetiapine (10 mg/kg) with various monoamine-related drugs. The results confirmed that the selective 5-HT1A antagonist WAY-100635 (0.05 mg/kg) partially blocked the fluvoxamine-quetiapine synergistic effect (maximum DA increase dropped from 325% to 214%). A novel finding is that the α1-adrenergic blocker prazosin (1 mg/kg), combined with fluvoxamine, partially mimicked the effect of augmentation (maximum DA increase 205%; area-under-the-curve 163%). As this suggested that prazosin augmentation might be tested in a clinical study, we performed an open clinical trial of prazosin 20 mg addition to an SRI in therapy-resistant patients with obsessive-compulsive disorder applying for neurosurgery. A small, non-significant reduction in Yale-Brown Obsessive Compulsive Scale (Y-BOCS) scores was observed in 10 patients, and one patient was classified as a responder with a reduction in Y-BOCS scores of more than 25%. We suggest that future clinical studies augmenting SRIs with an α1-adrenergic blocker in less treatment-resistant cases should be considered. The clinical trial "Prazosin in combination with a serotonin reuptake inhibitor for patients with Obsessive Compulsive disorder: an open label study" was registered on 24/05/2011 under trial number ISRCTN61562706: http://www.controlled-trials.com/ISRCTN61562706. PMID:26712326

  5. Exomars Mission Verification Approach

    NASA Astrophysics Data System (ADS)

    Cassi, Carlo; Gilardi, Franco; Bethge, Boris

    According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions: A first mission will be launched in 2016 under ESA lead, with the objectives to demonstrate the European capability to safely land a surface package on Mars, to perform Mars atmosphere investigation, and to provide communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an "Entry Descent & Landing Demonstrator Module (EDM)" and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the Orbiter for Mars atmosphere characterisation. A second mission, with its launch foreseen in 2018, is led by NASA, who provides the spacecraft and launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 meters, collecting samples and investigating them for signs of past and present life with exobiological experiments, and investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration and verification of the ESA ExoMars modules, i.e.: the Spacecraft Composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process of the above products is quite complex and will include some peculiarities with limited or no heritage in Europe. Furthermore the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints. The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the verification activities flow and the sharing of tests

  6. Impact of contacting study authors to obtain additional data for systematic reviews: diagnostic accuracy studies for hepatic fibrosis

    PubMed Central

    2014-01-01

    Background Seventeen of 172 included studies in a recent systematic review of blood tests for hepatic fibrosis or cirrhosis reported diagnostic accuracy results discordant from 2 × 2 tables, and 60 studies reported inadequate data to construct 2 × 2 tables. This study explores the yield of contacting authors of diagnostic accuracy studies and impact on the systematic review findings. Methods Sixty-six corresponding authors were sent letters requesting additional information or clarification of data from 77 studies. Data received from the authors were synthesized with data included in the previous review, and diagnostic accuracy sensitivities, specificities, and positive and likelihood ratios were recalculated. Results Of the 66 authors, 68% were successfully contacted and 42% provided additional data for 29 out of 77 studies (38%). All authors who provided data at all did so by the third emailed request (ten authors provided data after one request). Authors of more recent studies were more likely to be located and provide data compared to authors of older studies. The effects of requests for additional data on the conclusions regarding the utility of blood tests to identify patients with clinically significant fibrosis or cirrhosis were generally small for ten out of 12 tests. Additional data resulted in reclassification (using median likelihood ratio estimates) from less useful to moderately useful or vice versa for the remaining two blood tests and enabled the calculation of an estimate for a third blood test for which previously the data had been insufficient to do so. We did not identify a clear pattern for the directional impact of additional data on estimates of diagnostic accuracy. Conclusions We successfully contacted and received results from 42% of authors who provided data for 38% of included studies. Contacting authors of studies evaluating the diagnostic accuracy of serum biomarkers for hepatic fibrosis and cirrhosis in hepatitis C patients
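
    For reference, the quantities being recalculated from each reconstructed 2x2 table are standard. A minimal sketch (the table counts are invented for illustration):

      def diagnostic_accuracy(tp: int, fp: int, fn: int, tn: int):
          """Sensitivity, specificity, and likelihood ratios from a 2x2 table."""
          sens = tp / (tp + fn)
          spec = tn / (tn + fp)
          return sens, spec, sens / (1 - spec), (1 - sens) / spec

      # Hypothetical blood test vs. biopsy: 60 TP, 20 FP, 15 FN, 105 TN.
      sens, spec, lr_pos, lr_neg = diagnostic_accuracy(60, 20, 15, 105)
      print(f"sens = {sens:.2f}, spec = {spec:.2f}, "
            f"LR+ = {lr_pos:.2f}, LR- = {lr_neg:.2f}")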

  7. A combined toxicity study of zinc oxide nanoparticles and vitamin C in food additives

    NASA Astrophysics Data System (ADS)

    Wang, Yanli; Yuan, Lulu; Yao, Chenjie; Ding, Lin; Li, Chenchen; Fang, Jie; Sui, Keke; Liu, Yuanfang; Wu, Minghong

    2014-11-01

    At present, safety evaluation standards for nanofood additives are based on the toxic effects of a single additive. Since size, surface properties and chemical nature influence the toxicity of nanomaterials, the toxicity may change dramatically when nanomaterials are used as food additives in a complex system. Herein, we investigated the combined toxicity of zinc oxide nanoparticles (ZnO NPs) and vitamin C (Vc, ascorbic acid). The results showed that Vc increased the cytotoxicity significantly compared with that of the ZnO NPs alone. When the cells were exposed to ZnO NPs at a concentration less than 15 mg L-1, or to Vc at a concentration less than 300 mg L-1, there was no significant cytotoxicity, both in the case of the gastric epithelial cell line (GES-1) and neural stem cells (NSCs). However, when 15 mg L-1 of ZnO NPs and 300 mg L-1 of Vc were introduced to cells together, the cell viability decreased sharply, indicating significant cytotoxicity. Moreover, a significant increase in toxicity was also shown in the in vivo experiments. The dose of the ZnO NPs and Vc used in the in vivo study was calculated according to the state standard for food and nutrition enhancers. After repeated oral exposure to ZnO NPs plus Vc, injury to the liver and kidneys in mice was indicated by changes in the measured indices. These findings demonstrate that the synergistic toxicity presented in a complex system is essential for the toxicological evaluation and safety assessment of nanofood additives.

  8. Multibody modeling and verification

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1989-01-01

    A summary of a ten week project on flexible multibody modeling, verification and control is presented. Emphasis was on the need for experimental verification. A literature survey was conducted to gather information on the existence of experimental work related to flexible multibody systems. The first portion of the assigned task encompassed the modeling aspects of flexible multibodies that can undergo large angular displacements. Research on modeling aspects was also surveyed, with special attention given to the component mode approach. Resulting from this is a research plan on various modeling aspects to be investigated over the next year. The relationship between the large angular displacements, boundary conditions, mode selection, and system modes is of particular interest. The other portion of the assigned task was the generation of a test plan for experimental verification of analytical and/or computer analysis techniques used for flexible multibody systems. Based on current and expected frequency ranges of flexible multibody systems to be used in space applications, an initial test article was selected and designed. A preliminary TREETOPS computer analysis was run to ensure frequency content in the low frequency range, 0.1 to 50 Hz. The initial specifications of experimental measurement and instrumentation components were also generated. Resulting from this effort is the initial multi-phase plan for a Ground Test Facility of Flexible Multibody Systems for Modeling Verification and Control. The plan focuses on the Multibody Modeling and Verification (MMV) Laboratory. General requirements of the Unobtrusive Sensor and Effector (USE) and the Robot Enhancement (RE) laboratories were considered during the laboratory development.

  9. A study of pyrazines in cigarettes and how additives might be used to enhance tobacco addiction

    PubMed Central

    Alpert, Hillel R; Agaku, Israel T; Connolly, Gregory N

    2016-01-01

    Background Nicotine is known as the drug that is responsible for the addicted behaviour of tobacco users, but it has poor reinforcing effects when administered alone. Tobacco product design features enhance abuse liability by (A) optimising the dynamic delivery of nicotine to central nervous system receptors, and affecting smokers’ withdrawal symptoms, mood and behaviour; and (B) effecting conditioned learning, through sensory cues, including aroma, touch and visual stimulation, to create perceptions of pending nicotine reward. This study examines the use of additives called ‘pyrazines’, which may enhance abuse potential, their introduction in ‘lights’ and subsequently in the highly market-successful Marlboro Lights (Gold) cigarettes, and eventually in many major brands. Methods We conducted research of internal tobacco industry documents using online databases, in conjunction with published scientific literature, based on an iterative feedback process. Results Tobacco manufacturers developed the use of a range of compounds, including pyrazines, in order to enhance ‘light’ cigarette products’ acceptance and sales. Pyrazines with chemosensory and pharmacological effects were incorporated in the first ‘full-flavour, low-tar’ product to achieve high market success. Such additives may enhance dependence by helping to optimise nicotine delivery and dosing, and through cueing and learned behaviour. Conclusions Cigarette additives and ingredients with chemosensory effects that promote addiction by acting synergistically with nicotine, increasing product appeal, easing smoking initiation, discouraging cessation or promoting relapse should be regulated by the US Food and Drug Administration. Current models of tobacco abuse liability could be revised to include more explicit roles for non-nicotine constituents that enhance abuse potential. PMID:26063608

  10. Additional follow-up telephone counselling and initial smoking relapse: a longitudinal, controlled study

    PubMed Central

    Wu, Lei; He, Yao; Jiang, Bin; Zuo, Fang; Liu, Qinghui; Zhang, Li; Zhou, Changxi

    2016-01-01

    Objectives Smoking cessation services can help smokers to quit; however, many smoking relapse cases occur over time. Initial relapse prevention should play an important role in achieving the goal of long-term smoking cessation. Several studies have focused on the effect of extended telephone support in relapse prevention, but the conclusions remain conflicting. Design and setting From October 2008 to August 2013, a longitudinal, controlled study was performed in a large general hospital in Beijing. Participants The smokers who sought treatment at our smoking cessation clinic were non-randomised and divided into 2 groups: a face-to-face individual counselling group (FC group) and a face-to-face individual counselling plus telephone follow-up counselling group (FCF group). No pharmacotherapy was offered. Outcomes The timing of initial smoking relapse was compared between the FC and FCF groups. Predictors of initial relapse during the first 180 days were investigated using the Cox proportional hazards model. Results Of 547 eligible male smokers who volunteered to participate, 457 participants (117 in the FC group and 340 in the FCF group) achieved at least 24 h abstinence. The majority of the lapse episodes occurred during the first 2 weeks after the quit date. Smokers who did not receive the follow-up telephone counselling (FC group) tended to relapse to smoking earlier than those who received the additional follow-up telephone counselling (FCF group), and the log-rank test was statistically significant (p=0.003). A Cox regression model showed that, in the FCF group, being married and having a lower Fagerström test score, a normal body mass index and doctor-diagnosed tobacco-related chronic diseases were significant independent protective predictors of smoking relapse. Conclusions Within the limitations of this study, it can be concluded that additional follow-up telephone counselling might be an effective strategy in preventing relapse. Further research is still
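    The survival analysis described above (log-rank comparison of the two groups, then a Cox proportional hazards model for predictors) can be sketched as follows; this is a hedged illustration on synthetic records, not the study's data, and assumes the `lifelines` package:

    ```python
    # Sketch: time-to-relapse comparison and Cox regression (synthetic data).
    import pandas as pd
    from lifelines import CoxPHFitter
    from lifelines.statistics import logrank_test

    # Hypothetical records: days abstinent, relapse indicator (0 = censored at
    # 180 days), counselling group (1 = FCF, telephone follow-up), marital status.
    df = pd.DataFrame({
        "days":      [14, 60, 180, 30, 180, 90, 7, 180, 21, 180],
        "relapsed":  [1,  1,  0,   1,  0,   1,  1, 0,   1,  0],
        "fcf_group": [0,  1,  1,   0,  1,   1,  0, 1,   0,  1],
        "married":   [1,  1,  0,   0,  1,   0,  1, 1,   0,  1],
    })

    # Log-rank test between the FC (0) and FCF (1) groups.
    fc, fcf = df[df.fcf_group == 0], df[df.fcf_group == 1]
    res = logrank_test(fc.days, fcf.days, fc.relapsed, fcf.relapsed)
    print(f"log-rank p = {res.p_value:.3f}")

    # Cox model: hazard ratios for candidate predictors of relapse.
    cph = CoxPHFitter(penalizer=0.1)   # small ridge penalty for the tiny toy sample
    cph.fit(df, duration_col="days", event_col="relapsed")
    cph.print_summary()
    ```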

  11. A Mechanistic Study of Halogen Addition and Photoelimination from π-Conjugated Tellurophenes.

    PubMed

    Carrera, Elisa I; Lanterna, Anabel E; Lough, Alan J; Scaiano, Juan C; Seferos, Dwight S

    2016-03-01

    The ability to drive reactivity using visible light is of importance for many disciplines of chemistry and has significant implications for sustainable chemistry. Identifying photochemically active compounds and understanding photochemical mechanisms is important for the development of useful materials for synthesis and catalysis. Here we report a series of photoactive diphenyltellurophene compounds bearing electron-withdrawing and electron-donating substituents synthesized by alkyne coupling/ring closing or palladium-catalyzed ipso-arylation chemistry. The redox chemistry of these compounds was studied with respect to oxidative addition and photoelimination of bromine, which is of importance for energy storage reactions involving X2. The oxidative addition reaction mechanism was studied using density functional theory, the results of which support a three-step mechanism involving the formation of an initial η(1) association complex, a monobrominated intermediate, and finally the dibrominated product. All of the tellurophene derivatives undergo photoreduction using 430, 447, or 617 nm light depending on the absorption properties of the compound. Compounds bearing electron-withdrawing substituents have the highest photochemical quantum efficiencies in the presence of an alkene trap, with efficiencies of up to 42.4% for a pentafluorophenyl-functionalized tellurophene. The photoelimination reaction was studied in detail through bromine trapping experiments and laser flash photolysis, and a mechanism is proposed. The photoreaction, which occurs by release of bromine radicals, is competitive with intersystem crossing to the triplet state of the brominated species, as evidenced by the formation of singlet oxygen. These findings should be useful for the design of new photochemically active compounds supported by main-group elements. PMID:26853739

  12. Study of mandible reconstruction using a fibula flap with application of additive manufacturing technology

    PubMed Central

    2014-01-01

    Background This study aimed to establish surgical guiding techniques for completing mandible lesion resection and reconstruction of the mandible defect area with fibula sections in one surgery by applying additive manufacturing technology, which can reduce the surgical duration and enhance the surgical accuracy and success rate. Methods A computer assisted mandible reconstruction planning (CAMRP) program was used to calculate the optimal cutting length and number of fibula pieces and design the fixtures for mandible cutting, registration, and arrangement of the fibula segments. The mandible cutting and registering fixtures were then generated using an additive manufacturing system. The CAMRP calculated the optimal fibula cutting length and number of segments based on the location and length of the defective portion of the mandible. The mandible cutting jig was generated according to the boundary surface of the lesion resection on the mandible STL model. The fibular cutting fixture was based on the length of each segment, and the registered fixture was used to quickly arrange the fibula pieces into the shape of the defect area. In this study, the mandibular lesion was reconstructed using registered fibular sections in one step, and the method is very easy to perform. Results and conclusion The application of additive manufacturing technology provided customized models and the cutting fixtures and registered fixtures, which can improve the efficiency of clinical application. This study showed that the cutting fixture helped to rapidly complete lesion resection and fibula cutting, and the registered fixture enabled arrangement of the fibula pieces and allowed completion of the mandible reconstruction in a timely manner. Our method can overcome the disadvantages of traditional surgery, which requires a long and difficult course of treatment and is liable to cause error. With the help of optimal cutting planning by the CAMRP and the 3D printed mandible resection jig and

  13. Resources allocation in healthcare for cancer: a case study using generalised additive mixed models.

    PubMed

    Musio, Monica; Sauleau, Erik A; Augustin, Nicole H

    2012-11-01

    Our aim is to develop a method for helping resource re-allocation in healthcare linked to cancer, in order to replan the allocation of providers. Ageing of the population has a considerable impact on the use of health resources, because aged people require more specialised medical care, due notably to cancer. We propose a method useful for monitoring changes in cancer incidence in space and time, taking into account two age categories, according to the general organisation of healthcare. We use generalised additive mixed models with a Poisson response, according to the methodology presented in Wood, Generalised Additive Models: An Introduction with R, Chapman and Hall/CRC, 2006. Besides one-dimensional smooth functions accounting for non-linear effects of covariates, the space-time interaction can be modelled using scale-invariant smoothers. Incidence data collected by a general cancer registry between 1992 and 2007 in a specific area of France are studied. Our best model exhibits a strong increase in the incidence of cancer over time and an obvious spatial pattern for people older than 70 years, with a higher incidence in the central band of the region. This is a strong argument for re-allocating resources for cancer care of older people in this sub-region. PMID:23242683
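    In Python, a comparable Poisson model with a smooth time trend and a tensor-product spatial smooth can be sketched with the `pygam` package; this is a hedged stand-in for the R/mgcv methodology cited above, with random effects omitted for brevity and synthetic data in place of registry counts:

    ```python
    # Sketch: Poisson GAM for counts varying smoothly in time and space.
    import numpy as np
    from pygam import PoissonGAM, s, te

    rng = np.random.default_rng(0)
    n = 500
    year = rng.uniform(1992, 2007, n)                  # registry period
    x, y = rng.uniform(0, 1, n), rng.uniform(0, 1, n)  # spatial coordinates
    rate = np.exp(0.05 * (year - 1992) + np.sin(3 * x) * np.cos(3 * y))
    counts = rng.poisson(rate)

    X = np.column_stack([year, x, y])
    # s(0): smooth trend in time; te(1, 2): tensor-product spatial smooth.
    gam = PoissonGAM(s(0) + te(1, 2)).fit(X, counts)
    gam.summary()
    ```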

  14. Covalent binding of aniline to humic substances. 2. 15N NMR studies of nucleophilic addition reactions

    USGS Publications Warehouse

    Thorn, K.A.; Pettigrew, P.J.; Goldenberg, W.S.; Weber, E.J.

    1996-01-01

    Aromatic amines are known to undergo covalent binding with humic substances in the environment. Although previous studies have examined reaction conditions and proposed mechanisms, there has been no direct spectroscopic evidence for the covalent binding of the amines to the functional groups in humic substances. In order to further elucidate the reaction mechanisms, the Suwannee River and IHSS soil fulvic and humic acids were reacted with 15N-labeled aniline at pH 6 and analyzed using 15N NMR spectrometry. Aniline underwent nucleophilic addition reactions with the quinone and other carbonyl groups in the samples and became incorporated in the form of anilinohydroquinone, anilinoquinone, anilide, imine, and heterocyclic nitrogen, the latter comprising 50% or more of the bound amine. The anilide and anilinohydroquinone nitrogens were determined to be susceptible to chemical exchange by ammonia. In the case of Suwannee River fulvic acid, reaction under anoxic conditions and pretreatment with sodium borohydride or hydroxylamine prior to reaction under oxic conditions resulted in a decrease in the proportion of anilinohydroquinone nitrogen incorporated. The relative decrease in the incorporation of anilinohydroquinone nitrogen with respect to anilinoquinone nitrogen under anoxic conditions suggested that inter- or intramolecular redox reactions accompanied the nucleophilic addition reactions.

  15. Toxicogenomics concepts and applications to study hepatic effects of food additives and chemicals

    SciTech Connect

    Stierum, Rob . E-mail: stierum@voeding.tno.nl; Heijne, Wilbert; Kienhuis, Anne; Ommen, Ben van; Groten, John

    2005-09-01

    Transcriptomics, proteomics and metabolomics are genomics technologies with great potential in toxicological sciences. Toxicogenomics involves the integration of conventional toxicological examinations with gene, protein or metabolite expression profiles. An overview together with selected examples of the possibilities of genomics in toxicology is given. The expectations raised by toxicogenomics are earlier and more sensitive detection of toxicity. Furthermore, toxicogenomics will provide a better understanding of the mechanism of toxicity and may facilitate the prediction of toxicity of unknown compounds. Mechanism-based markers of toxicity can be discovered and improved interspecies and in vitro-in vivo extrapolations will drive model developments in toxicology. Toxicological assessment of chemical mixtures will benefit from the new molecular biological tools. In our laboratory, toxicogenomics is predominantly applied for elucidation of mechanisms of action and discovery of novel pathway-supported mechanism-based markers of liver toxicity. In addition, we aim to integrate transcriptome, proteome and metabolome data, supported by bioinformatics to develop a systems biology approach for toxicology. Transcriptomics and proteomics studies on bromobenzene-mediated hepatotoxicity in the rat are discussed. Finally, an example is shown in which gene expression profiling together with conventional biochemistry led to the discovery of novel markers for the hepatic effects of the food additives butylated hydroxytoluene, curcumin, propyl gallate and thiabendazole.

  16. Assessment of Nano Cellulose from Peach Palm Residue as Potential Food Additive: Part II: Preliminary Studies.

    PubMed

    Andrade, Dayanne Regina Mendes; Mendonça, Márcia Helena; Helm, Cristiane Vieira; Magalhães, Washington L E; de Muniz, Graciela Ines Bonzon; Kestur, Satyanarayana G

    2015-09-01

    High consumption of dietary fiber is related to a reduced risk of non-transmissible chronic diseases and the prevention of constipation, among other benefits. Diets rich in dietary fiber promote beneficial effects on the metabolism. Considering the above, and recognizing the multifaceted advantages of nano materials, there have been many attempts in recent times to use nano materials in the food sector, including as food additives. However, whenever a new product for human or animal consumption is developed, it has to be tested for its effectiveness regarding improvement in the health of consumers, safety aspects and side effects. Before it is tried with human beings, such materials would normally be assessed through biological tests on a living organism to understand their effect on the health of the consumer. Accordingly, based on the authors' findings reported in a previous paper, this paper presents body weight, biochemical (glucose, cholesterol and lipid profile in blood, analysis of feces) and histological tests carried out with the biomass-based cellulose nano fibrils prepared by the authors for their possible use as a food additive. Preliminary results of the study with mice have clearly brought out the potential of these fibers for the said purpose. PMID:26344977

  17. Feasibility study on using fast calorimetry technique to measure a mass attribute as part of a treaty verification regime

    SciTech Connect

    Hauck, Danielle K; Bracken, David S; Mac Arthur, Duncan W; Santi, Peter A; Thron, Jonathan

    2010-01-01

    The attribute measurement technique provides a method for determining whether or not an item containing special nuclear material (SNM) possesses attributes that fall within an agreed upon range of values. One potential attribute is whether the mass of an SNM item is larger than some threshold value that has been negotiated as part of a nonproliferation treaty. While the historical focus on measuring mass attributes has been on using neutron measurements, calorimetry measurements may be a viable alternative for measuring mass attributes for plutonium-bearing items. Traditionally, calorimetry measurements have provided a highly precise and accurate determination of the thermal power that is being generated by an item. In order to achieve this high level of precision and accuracy, the item must reach thermal equilibrium inside the calorimeter prior to determining the thermal power of the item. Because the approach to thermal equilibrium is exponential in nature, a large portion of the time spent approaching equilibrium is spent with the measurement being within approximately 10% of its final equilibrium value inside the calorimeter. Since a mass attribute measurement only needs to positively determine if the mass of a given SNM item is greater than a threshold value, performing a short calorimetry measurement to determine how the system is approaching thermal equilibrium may provide sufficient information to determine if an item has a larger mass than the agreed upon threshold. In previous research into a fast calorimetry attribute technique, a two-dimensional heat flow model of a calorimeter was used to investigate the possibility of determining a mass attribute for plutonium-bearing items using this technique. While the results of this study looked favorable for developing a fast calorimetry attribute technique, additional work was needed to determine the accuracy of the model used to make the calculations. In this paper, the results from the current work investigating
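    The idea of a short measurement exploiting the exponential approach can be illustrated with a simple fit: estimate the equilibrium power from early, pre-equilibrium readings and compare it against the negotiated threshold. A hedged sketch with synthetic values (not instrument data; the real technique models two-dimensional heat flow):

    ```python
    # Sketch: predict equilibrium calorimeter power from an early time window
    # by fitting P(t) = P_eq * (1 - exp(-t / tau)).
    import numpy as np
    from scipy.optimize import curve_fit

    def approach(t, p_eq, tau):
        return p_eq * (1.0 - np.exp(-t / tau))

    t = np.linspace(0, 60, 30)            # minutes: short, pre-equilibrium window
    true_p_eq, true_tau = 12.0, 45.0      # watts, minutes (synthetic ground truth)
    rng = np.random.default_rng(1)
    obs = approach(t, true_p_eq, true_tau) + rng.normal(0, 0.05, t.size)

    (p_eq, tau), _ = curve_fit(approach, t, obs, p0=(obs[-1], 30.0))
    threshold_w = 10.0                    # hypothetical negotiated power threshold
    print(f"estimated equilibrium power: {p_eq:.2f} W "
          f"-> attribute {'met' if p_eq > threshold_w else 'not met'}")
    ```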

  18. Comparative study of dimensional accuracy of different impression techniques using addition silicone impression material.

    PubMed

    Penaflor, C F; Semacio, R C; De Las Alas, L T; Uy, H G

    1998-01-01

    This study compared the dimensional accuracy of the single, double with spacer, double with cut-out and double mix impression techniques using addition silicone impression material. A typhodont containing an Ivorine teeth model with six (6) full-crown tooth preparations was used as the positive control. Two stone replication models for each impression technique were made as test materials. Accuracy of the techniques was assessed by measuring four dimensions on the stone dies poured from the impressions of the Ivorine teeth model. Results indicated that most of the measurements for height, width and diameter slightly decreased, and a few increased, compared with the Ivorine teeth model. The double with cut-out and double mix techniques presented the least difference from the master model compared with the other two impression techniques. PMID:10202524

  19. Spectroscopic studies of nucleic acid additions during seed-mediated growth of gold nanoparticles

    PubMed Central

    Tapp, Maeling; Sullivan, Rick; Dennis, Patrick; Naik, Rajesh R.

    2015-01-01

    The effect of adding nucleic acids to gold seeds during the growth stage of either nanospheres or nanorods was investigated using UV-Vis spectroscopy to reveal any oligonucleotide base or structure-specific effects on nanoparticle growth kinetics or plasmonic signatures. Spectral data indicate that the presence of DNA duplexes during seed ageing drastically accelerated nanosphere growth while the addition of single-stranded polyadenine at any point during seed ageing induces nanosphere aggregation. For seeds added to a gold nanorod growth solution, single-stranded polythymine induces a modest blue-shift in the longitudinal peak wavelength. Moreover, a particular sequence comprised of 50% thymine bases was found to induce a faster, more dramatic blue-shift in the longitudinal peak wavelength compared to any of the homopolymer incubation cases. Monomeric forms of the nucleic acids, however, do not yield discernable spectral differences in any of the gold suspensions studied. PMID:25960601

  20. Study on Friction and Wear Properties of Silver Matrix Brush Material with Different Additives

    NASA Astrophysics Data System (ADS)

    Chen, Xiaoli; Wang, Wenfang; Hong, Yu; Wu, Yucheng

    2013-07-01

    The friction and wear processes of AgCuX (X = graphite, carbon fibre or AlN) composite-CuAgV alloy friction pairs, and the effects of different additive contents in the silver-based composites on friction and wear behavior, are studied in this paper. The microstructure of the brush wear surface is observed by SEM. The results show that when the graphite content is up to 9 wt.%, the Ag-Cu-CF-G composite exhibits the best wear properties; when the content of aluminum nitride is up to 0.5 wt.%, the Ag-Cu-AlN-G composite has the best overall performance. The wear loss of both composites rises with increasing pressure and speed, but when the speed reaches a critical value, the increase in wear loss tends to level off.

  1. Genetic assessment of additional endophenotypes from the Consortium on the Genetics of Schizophrenia Family Study.

    PubMed

    Greenwood, Tiffany A; Lazzeroni, Laura C; Calkins, Monica E; Freedman, Robert; Green, Michael F; Gur, Raquel E; Gur, Ruben C; Light, Gregory A; Nuechterlein, Keith H; Olincy, Ann; Radant, Allen D; Seidman, Larry J; Siever, Larry J; Silverman, Jeremy M; Stone, William S; Sugar, Catherine A; Swerdlow, Neal R; Tsuang, Debby W; Tsuang, Ming T; Turetsky, Bruce I; Braff, David L

    2016-01-01

    The Consortium on the Genetics of Schizophrenia Family Study (COGS-1) has previously reported our efforts to characterize the genetic architecture of 12 primary endophenotypes for schizophrenia. We now report the characterization of 13 additional measures derived from the same endophenotype test paradigms in the COGS-1 families. Nine of the measures were found to discriminate between schizophrenia patients and controls, were significantly heritable (31 to 62%), and were sufficiently independent of previously assessed endophenotypes, demonstrating utility as additional endophenotypes. Genotyping via a custom array of 1536 SNPs from 94 candidate genes identified associations for CTNNA2, ERBB4, GRID1, GRID2, GRIK3, GRIK4, GRIN2B, NOS1AP, NRG1, and RELN across multiple endophenotypes. An experiment-wide p value of 0.003 suggested that the associations across all SNPs and endophenotypes collectively exceeded chance. Linkage analyses performed using a genome-wide SNP array further identified significant or suggestive linkage for six of the candidate endophenotypes, with several genes of interest located beneath the linkage peaks (e.g., CSMD1, DISC1, DLGAP2, GRIK2, GRIN3A, and SLC6A3). While the partial convergence of the association and linkage likely reflects differences in density of gene coverage provided by the distinct genotyping platforms, it is also likely an indication of the differential contribution of rare and common variants for some genes and methodological differences in detection ability. Still, many of the genes implicated by COGS through endophenotypes have been identified by independent studies of common, rare, and de novo variation in schizophrenia, all converging on a functional genetic network related to glutamatergic neurotransmission that warrants further investigation. PMID:26597662

  2. Serum Potassium and Glucose Regulation in the ADDITION-Leicester Screening Study

    PubMed Central

    Carter, Patrice; Bodicoat, Danielle H.; Quinn, Lauren M.; Zaccardi, Francesco; Webb, David R.; Khunti, Kamlesh; Davies, Melanie J.

    2015-01-01

    Introduction. Previous observational studies have shown conflicting results regarding the association between plasma K+ concentrations and risk of type 2 diabetes. To help clarify the evidence, we aimed to determine whether an association existed between serum K+ and glucose regulation within a UK multiethnic population. Methods. Participants were recruited as part of the ADDITION-Leicester study, a population-based screening study. Individuals were recruited from primary care, aged between 40 and 75 years if White European or between 25 and 75 years if South Asian or Afro-Caribbean. Tests for associations between baseline characteristics and K+ quartiles were conducted using linear regression models. Results. The data showed that individuals in the lowest K+ quartile had significantly greater 2-hour glucose levels (0.53 mmol/L, 95% CI: 0.36 to 0.70, P ≤ 0.001) than those in the highest K+ quartile. This estimate did not change with adjustment for potential confounders. Conversely, participants in the lowest K+ quartile had a 0.14% lower HbA1c (95% CI −0.19 to −0.10: P ≤ 0.001) compared to those in the highest K+ quartile. Conclusion. This cross-sectional analysis demonstrated that lower K+ was associated with greater 2-hour glucose. The data support the possibility that K+ may influence glucose regulation, and further research is warranted. PMID:25883988

  3. Study of triallyl phosphate as an electrolyte additive for high voltage lithium-ion cells

    NASA Astrophysics Data System (ADS)

    Xia, J.; Madec, L.; Ma, L.; Ellis, L. D.; Qiu, W.; Nelson, K. J.; Lu, Z.; Dahn, J. R.

    2015-11-01

    The role of triallyl phosphate as an electrolyte additive in Li(Ni0.42Mn0.42Co0.16)O2/graphite pouch cells was studied using ex-situ gas measurements, ultra high precision coulometry, automated storage experiments, electrochemical impedance spectroscopy, long-term cycling and X-ray photoelectron spectroscopy. Cells containing triallyl phosphate produced less gas during formation, cycling and storage than control cells. The use of triallyl phosphate led to higher coulombic efficiency and smaller charge endpoint capacity slippage during ultra high precision charger testing. Cells containing triallyl phosphate showed smaller potential drop during 500 h storage at 40 °C and 60 °C and the voltage drop decreased as the triallyl phosphate content in the electrolyte increased. However, large amounts of triallyl phosphate (>3% by weight in the electrolyte) led to large impedance after cycling and storage. Symmetric cell studies showed large amounts of triallyl phosphate (5% or more) led to significant impedance increase at both negative and positive electrodes. X-ray photoelectron spectroscopy studies suggested that the high impedance came from the polymerization of triallyl phosphate molecules which formed thick solid electrolyte interphase films at the surfaces of both negative and positive electrodes. An optimal amount of 2%-3% triallyl phosphate led to better capacity retention during long term cycling.

  4. A study on the relationship between the protein supplements intake satisfaction level and repurchase intention: Verification of mediation effects of word-of-mouth intention.

    PubMed

    Kim, Ill-Gwang

    2016-05-18

    The purpose of this study is to examine the relationship between the protein supplement intake satisfaction level and the repurchase intention of university students majoring in physical education, and to verify the mediation effects of word-of-mouth intention. To achieve the purpose of this study, 700 university students majoring in physical education from 10 universities in Korea were selected from October 2013 to December 2013 through cluster random sampling, and data from the 228 of them who had experience of taking protein supplements were analyzed. The composite reliability of each factor was between 0.869 and 0.958, and convergent validity and discriminant validity were verified. SPSS 18.0 and Amos 22.0 were utilized for data processing, and the verification of the significance of the mediation effects and indirect effects of word-of-mouth intention was carried out using frequency analysis, correlation analysis, CFA, SEM, and Amos bootstrapping. The results are as follows. The protein supplement intake satisfaction level had a positive effect on word-of-mouth intention, and word-of-mouth intention had a positive effect on repurchase intention. Also, it was shown that word-of-mouth intention played a full mediation role between the intake satisfaction level and the repurchase intention. PMID:26684403
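    The bootstrapped indirect-effect test reported above (satisfaction → word-of-mouth → repurchase) can be sketched outside of Amos with ordinary regressions; a hedged illustration on synthetic data, not the study's measurement model:

    ```python
    # Sketch: percentile-bootstrap test of an indirect (mediation) effect a*b.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 228                                   # matches the analyzed sample size
    satisfaction = rng.normal(0, 1, n)
    wom = 0.6 * satisfaction + rng.normal(0, 1, n)            # mediator
    repurchase = 0.5 * wom + 0.05 * satisfaction + rng.normal(0, 1, n)

    def indirect_effect(idx):
        s, m, y = satisfaction[idx], wom[idx], repurchase[idx]
        a = np.polyfit(s, m, 1)[0]                            # path a: s -> m
        X = np.column_stack([np.ones(idx.size), m, s])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        return a * coef[1]                                    # a * b

    boot = np.array([indirect_effect(rng.integers(0, n, n)) for _ in range(2000)])
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"95% bootstrap CI for indirect effect: [{lo:.3f}, {hi:.3f}]")
    # A CI excluding zero supports mediation through word-of-mouth intention.
    ```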

  5. SU-E-T-600: Patient Specific IMRT Verification Using a Phosphor-Screen Based Geometric QA System: A Preliminary Study

    SciTech Connect

    Lee, M; Hu, E; Yi, B

    2015-06-15

    Purpose: Raven QA (JPLC, MD) is a unified and comprehensive quality assurance system for the QA of TG-142, which uses a phosphor screen, a mirror system and a camera. The purpose of this work is to test whether this device can be used for IMRT QA dosimetry. Methods: A lung IMRT case is used to deliver dose to the Raven QA. The accuracy of the dose distribution for a 5 cm slab phantom calculated with the Eclipse planning system (Varian) had been confirmed both by Monte Carlo simulation and by a MapCheck (SunNuclear) measurement. Geometric distortion and variation of spatial dose response are corrected after background subtraction. A pin-hole grid plate was designed and used to determine the light scatter in the Raven QA box and the spatial dose response. An optic scatter model was not applied in this preliminary study. Dose is normalized to the response of the 10×10 field, and the TMR at 5 cm depth was considered. Results: Setting up the device for IMRT QA takes less than 5 minutes, as with other commercially available devices. It shows excellent dose linearity and dose-rate independence, within 1%. The background signal, however, changes for different field sizes; this is believed to be due to inaccurate correction of optic scatter. The absolute gamma (5%, 5 mm) passing rate was higher than 95%. Conclusion: This study shows that the Raven QA can be used for patient-specific IMRT verification. Part of this study is supported by the Maryland Industrial Partnership Grant.
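    The quoted pass rate comes from a gamma analysis combining dose difference and distance-to-agreement. A simplified 1-D global-gamma sketch (illustrative profiles; not the Raven QA algorithm):

    ```python
    # Sketch: 1-D global gamma analysis with 5% / 5 mm criteria.
    import numpy as np

    def gamma_pass_rate(x, dose_ref, dose_eval, dd=0.05, dta=5.0):
        """x in mm; dd is a fraction of the maximum reference dose."""
        dd_abs = dd * dose_ref.max()
        gammas = []
        for xi, di in zip(x, dose_ref):
            dist2 = ((x - xi) / dta) ** 2
            diff2 = ((dose_eval - di) / dd_abs) ** 2
            gammas.append(np.sqrt((dist2 + diff2).min()))
        return 100.0 * (np.asarray(gammas) <= 1.0).mean()

    x = np.arange(0.0, 100.0, 1.0)                 # 1 mm grid
    ref = np.exp(-((x - 50) / 20) ** 2)            # toy reference profile
    meas = 1.02 * np.exp(-((x - 51) / 20) ** 2)    # slightly scaled and shifted
    print(f"gamma (5%/5 mm) pass rate: {gamma_pass_rate(x, ref, meas):.1f}%")
    ```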

  6. Experimental Study of Disruption of Columnar Grains During Rapid Solidification in Additive Manufacturing

    NASA Astrophysics Data System (ADS)

    Manogharan, Guha; Yelamanchi, Bharat; Aman, Ronald; Mahbooba, Zaynab

    2016-03-01

    Over the years, many studies have been conducted to analyze the grain structures of metal alloys during additive manufacturing in order to improve mechanical properties. In particular, columnar grains are observed predominantly during rapid solidification of molten metal. This leads to lower mechanical properties and requires expensive secondary heat-treatment processes. This study is aimed at disrupting the formation of columnar grain growth during rapid solidification using ultrasonic vibration, and analyzes the effects on grain structure and mechanical properties. A gas-metal arc welder mounted on a RepRap-based low-cost metal 3D printer was used to deposit ER70S-6 mild steel layers on a plate. A contact-type ultrasonic transducer with a control system to vary the frequency and power of the vibration was used. The effects of ultrasonic vibration were determined from statistical analysis of the microstructure and micro-indentation measurements on the deposited layer and heat-affected zone. It was found that both the frequency and the interaction between frequency and power had a significant impact on the refinement of the average grain size, by up to 10.64%, and increased the number of grains by approximately 41.78%. Analysis of the micro-indentation tests showed an increase of approximately 14.30% in micro-hardness due to the applied frequency during rapid solidification. A pole diagram shows that application of vibration causes randomization of grain orientation. Along with the results from this study, further efforts in modeling and experimentation with multi-directional vibrations would lead to a better understanding of disrupting columnar grains in applications that use mechanical vibrations, such as welding, directed energy deposition, brazing, etc.

  7. Synthesis, Characterization, Molecular Modeling, and DNA Interaction Studies of Copper Complex Containing Food Additive Carmoisine Dye.

    PubMed

    Shahabadi, Nahid; Akbari, Alireza; Jamshidbeigi, Mina; Khodarahmi, Reza

    2016-06-01

    A copper complex of carmoisine dye, [Cu(carmoisine)2(H2O)2], was synthesized and characterized using physico-chemical and spectroscopic methods. The binding of this complex with calf thymus (ct) DNA was investigated by circular dichroism, absorption studies, emission spectroscopy, and viscosity measurements. UV-vis results confirmed that the Cu complex interacted with DNA to form a ground-state complex, and the observed binding constant (2 × 10^4 M^-1) is more consistent with groove binding to DNA. Furthermore, the viscosity measurements showed that the addition of the complex causes no significant change in DNA viscosity, indicating that an intercalation mode is ruled out. The thermodynamic parameters were calculated by the van't Hoff equation, which demonstrated that hydrogen bonds and van der Waals interactions played major roles in the reaction. The circular dichroism (CD) results suggested that the complex can change the conformation of DNA from a B-like form toward an A-like conformation. Cytotoxicity studies of the carmoisine dye and its copper complex indicated that both of them had anticancer effects on the HT-29 (colon cancer) cell line and that they may be new candidates for treatment of colon cancer. PMID:27152751
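    For reference, the van't Hoff analysis mentioned above takes the standard linear form: fitting ln K against 1/T yields the binding enthalpy and entropy, whose signs are what implicate hydrogen bonding and van der Waals contacts (shown here as a reminder of the method, not as study output):

    ```latex
    \ln K \;=\; -\frac{\Delta H^{\circ}}{R}\,\frac{1}{T} \;+\; \frac{\Delta S^{\circ}}{R},
    \qquad
    \Delta G^{\circ} \;=\; \Delta H^{\circ} - T\,\Delta S^{\circ}
    ```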

  8. Percutaneous Dorsal Instrumentation of Vertebral Burst Fractures: Value of Additional Percutaneous Intravertebral Reposition—Cadaver Study

    PubMed Central

    Krüger, Antonio; Schmuck, Maya; Noriega, David C.; Ruchholtz, Steffen; Baroud, Gamal; Oberkircher, Ludwig

    2015-01-01

    Purpose. The treatment of vertebral burst fractures is still controversial. The aim of this study is to evaluate the value of additional percutaneous intravertebral reduction when combined with dorsal instrumentation. Methods. In this biomechanical cadaver study, twenty-eight spine segments (T11-L3) were used (male donors, mean age 64.9 ± 6.5 years). Burst fractures of L1 were generated using a standardised protocol. After fracture, all spines were allocated to four similar groups and randomised according to surgical technique (posterior instrumentation; posterior instrumentation + intravertebral reduction device + cement augmentation; posterior instrumentation + intravertebral reduction device without cement; and intravertebral reduction device + cement augmentation). After treatment, 100000 cycles (100–600 N, 3 Hz) were applied using a servohydraulic loading frame. Results. Overall anatomical restoration was better in all groups where the intravertebral reduction device was used (p < 0.05). In particular, it was possible to restore the central endplates (p > 0.05). All techniques decreased narrowing of the spinal canal. After loading, clearance could be maintained in all groups fitted with the intravertebral reduction device. Narrowing increased in the group treated with dorsal instrumentation. Conclusions. For height and anatomical restoration, the combination of an intravertebral reduction device with dorsal instrumentation showed significantly better results than dorsal instrumentation alone. PMID:26137481

  9. Density functional theory study of the effects of alloying additions on sulfur adsorption on nickel surfaces

    NASA Astrophysics Data System (ADS)

    Malyi, Oleksandr I.; Chen, Zhong; Kulish, Vadym V.; Bai, Kewu; Wu, Ping

    2013-01-01

    Reactions of hydrogen sulfide (H2S) with Nickel/Yttria-doped zirconia (Ni/YDZ) anode materials might cause degradation of the performance of solid oxide fuel cells when S-containing fuels are used. In this paper, we employ density functional theory to investigate S adsorption on metal (M)-doped and undoped Ni(0 0 1) and Ni(1 1 1) surfaces. Based on the performed calculations, we analyze the effects of 12 alloying additions (Ag, Au, Al, Bi, Cd, Co, Cu, Fe, Sn, Sb, V, and Zn) on the temperature of transition between clean (S atoms do not adsorb on the surfaces) and contaminated (S atoms can adsorb on the surfaces spontaneously) M-doped Ni surfaces for different concentrations of H2S in the fuel. The predicted results are consistent with many experimental studies relevant to S poisoning of both Ni/YDZ and M-doped Ni/YDZ anode materials. This study is important for understanding S poisoning phenomena and for developing new S-tolerant anode materials.

  10. PBL Verification with Radiosonde and Aircraft Data

    NASA Astrophysics Data System (ADS)

    Tsidulko, M.; McQueen, J.; Dimego, G.; Ek, M.

    2008-12-01

    Boundary layer depth is an important characteristic in weather forecasting, and it is a key parameter in air quality modeling, determining the extent of turbulence and dispersion for pollutants. Real-time PBL depths from the NAM (WRF/NMM) model are verified against different types of observations. PBL depth verification is incorporated into the NCEP verification system, including an ability to provide a range of statistical characteristics for the boundary layer heights. For the model, several types of boundary layer definitions are used. PBL heights from the TKE scheme and the critical Ri number approach, as well as the mixed layer depth, are compared with observations. Observed PBL depths are determined by applying the Ri number approach to radiosonde profiles. Also, a preliminary study of using ACARS data for PBL verification is conducted.
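    A hedged sketch of the bulk Richardson number diagnosis used for the radiosonde profiles: scan upward and take the PBL top as the first level where Ri_b exceeds a critical value (0.25 assumed here; synthetic profile, not sounding data):

    ```python
    # Sketch: PBL depth as the first level where the bulk Richardson number
    # Ri_b = g (z - z0) (theta_v - theta_v0) / (theta_v0 * |dV|^2) exceeds 0.25.
    import numpy as np

    G, RI_CRIT = 9.81, 0.25

    def pbl_height(z, theta_v, u, v):
        shear2 = (u - u[0]) ** 2 + (v - v[0]) ** 2
        shear2 = np.maximum(shear2, 1e-6)          # avoid divide-by-zero
        ri_b = G * (z - z[0]) * (theta_v - theta_v[0]) / (theta_v[0] * shear2)
        above = np.nonzero(ri_b > RI_CRIT)[0]
        return z[above[0]] if above.size else z[-1]

    z = np.arange(10.0, 3000.0, 50.0)              # height AGL (m)
    theta_v = 300 + np.where(z < 1200, 0.0, 0.004 * (z - 1200))  # mixed to ~1.2 km
    u = 5.0 + 0.002 * z                            # wind components (m/s)
    v = np.zeros_like(z)
    print(f"diagnosed PBL depth: {pbl_height(z, theta_v, u, v):.0f} m")
    ```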

  11. Implications of Non-Systematic Observations for Verification of Forecasts of Aviation Weather Variables

    NASA Astrophysics Data System (ADS)

    Brown, B. G.; Young, G. S.; Fowler, T. L.

    2001-12-01

    Over the last several years, efforts have been undertaken to develop improved automated forecasts of weather phenomena that have large impacts on aviation, including turbulence and in-flight icing conditions. Verification of these forecasts - which has played a major role in their development - is difficult due to the nature of the limited observations available for these evaluations; in particular, voice reports by pilots (PIREPs). These reports, which are provided inconsistently by pilots, currently are the best observations of turbulence and in-flight icing conditions available. However, their sampling characteristics make PIREPs a difficult dataset to use for these evaluations. In particular, PIREPs have temporal and spatial biases (e.g., they are more frequent during daylight hours, and they occur most frequently along flight routes and in the vicinity of major airports, where aircraft are concentrated), and they are subjective. Most importantly, the observations are non-systematic. That is, observations are not consistently reported at the same location and time. This characteristic of the reports has numerous implications for the verification of forecasts of these phenomena. In particular, it is inappropriate to estimate certain common verification statistics that normally are of interest in forecast evaluations. For example, estimates of the false alarm ratio and critical success index are incorrect, due to the unrepresentativeness of the observations. Analytical explanations for this result have been developed, and the magnitudes of the errors associated with estimating these statistics have been estimated through Monte Carlo simulations. In addition, several approaches have been developed to compensate for these characteristics of PIREPs in verification studies, including methods for estimating confidence intervals for the verification statistics, which take into account their sampling variability. These approaches also have implications for verification
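    For concreteness, the categorical statistics at issue are simple functions of a forecast/observation contingency table; the abstract's point is that with non-systematic PIREPs the false-alarm and correct-null counts are unreliable, so the statistics below become biased. A sketch with illustrative counts:

    ```python
    # Sketch: categorical verification statistics from a 2x2 contingency table.
    def categorical_stats(hits, false_alarms, misses):
        far = false_alarms / (hits + false_alarms)   # false alarm ratio
        csi = hits / (hits + false_alarms + misses)  # critical success index
        pod = hits / (hits + misses)                 # probability of detection
        return far, csi, pod

    far, csi, pod = categorical_stats(hits=120, false_alarms=40, misses=30)
    print(f"FAR={far:.2f}  CSI={csi:.2f}  POD={pod:.2f}")
    ```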

  12. Bone Marrow Stromal Antigen 2 Is a Novel Plasma Biomarker and Prognosticator for Colorectal Carcinoma: A Secretome-Based Verification Study

    PubMed Central

    Chiang, Sum-Fu; Kan, Chih-Yen; Hsiao, Yung-Chin; Tang, Reiping; Hsieh, Ling-Ling; Chiang, Jy-Ming; Tsai, Wen-Sy; Yeh, Chien-Yuh; Hsieh, Pao-Shiu; Liang, Ying; Chen, Jinn-Shiun; Yu, Jau-Song

    2015-01-01

    Background. The cancer cell secretome has been recognized as a valuable reservoir for identifying novel serum/plasma biomarkers for different cancers, including colorectal cancer (CRC). This study aimed to verify four CRC cell-secreted proteins (tumor-associated calcium signal transducer 2/trophoblast cell surface antigen 2 (TACSTD2/TROP2), tetraspanin-6 (TSPAN6), bone marrow stromal antigen 2 (BST2), and tumor necrosis factor receptor superfamily member 16 (NGFR)) as potential plasma CRC biomarkers. Methods. The study population comprises 152 CRC patients and 152 controls. Target protein levels in plasma and tissue samples were assessed by ELISA and immunohistochemistry, respectively. Results. Among the four candidate proteins examined by ELISA in a small sample set, only BST2 showed significantly elevated plasma levels in CRC patients versus controls. Immunohistochemical analysis revealed the overexpression of BST2 in CRC tissues, and higher BST2 expression levels correlated with poorer 5-year survival (46.47% versus 65.57%; p = 0.044). Further verification confirmed the elevated plasma BST2 levels in CRC patients (2.35 ± 0.13 ng/mL) versus controls (1.04 ± 0.03 ng/mL) (p < 0.01), with an area under the ROC curve (AUC) of 0.858, comparable to that of CEA (0.867). Conclusion. BST2, a membrane protein selectively detected in the CRC cell secretome, may be a novel plasma biomarker and prognosticator for CRC. PMID:26494939

  14. Using a multi-scale approach to identify and quantify oil and gas emissions: a case study for GHG emissions verification

    NASA Astrophysics Data System (ADS)

    Sweeney, C.; Kort, E. A.; Rella, C.; Conley, S. A.; Karion, A.; Lauvaux, T.; Frankenberg, C.

    2015-12-01

    Along with a boom in oil and natural gas production in the US, there has been a substantial effort to understand the true environmental impact of these operations on air and water quality, as well as net radiation balance. This multi-institution effort, funded by both governmental and non-governmental agencies, has provided a case study for identification and verification of emissions using a multi-scale, top-down approach. This approach leverages a combination of remote sensing to identify areas that need specific focus and airborne in-situ measurements to quantify both regional and large- to mid-size single-point emitters. Ground-based networks of mobile and stationary measurements provide the bottom tier of measurements from which process-level information can be gathered to better understand the specific sources and temporal distribution of the emitters. The motivation for this type of approach is largely driven by recent work in the Barnett Shale region in Texas as well as the San Juan Basin in New Mexico and Colorado; these studies suggest that relatively few single-point emitters dominate the regional emissions of CH4.

  15. Robust verification analysis

    NASA Astrophysics Data System (ADS)

    Rider, William; Witkowski, Walt; Kamm, James R.; Wildey, Tim

    2016-02-01

    We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences of simulations incorporates expert judgment directly into the process via a flexible optimization framework, together with the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis and, together with the robust statistics, guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology is based on utilizing multiple constrained optimization problems to solve the verification model in a manner that varies the analysis' underlying assumptions. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach then produces a number of error models, which are analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data- and expert-informed error estimation, including uncertainties for both the solution itself and the order of convergence. Our method produces high-quality results for well-behaved cases, largely consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances, predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.
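    At the core of any such analysis is an estimate of the observed order of convergence from a refinement sequence. A hedged sketch of the classical three-grid estimate and Richardson extrapolation (the paper's contribution wraps estimates like these in constrained optimization and median statistics; this is only the underlying arithmetic):

    ```python
    # Sketch: observed order of convergence and Richardson extrapolation.
    import numpy as np

    def observed_order(f1, f2, f3, r):
        """f1 finest grid, f3 coarsest; r = constant refinement ratio."""
        return np.log((f3 - f2) / (f2 - f1)) / np.log(r)

    def richardson_extrapolate(f1, f2, r, p):
        """Error-corrected estimate from the two finest grids."""
        return f1 + (f1 - f2) / (r**p - 1.0)

    f1, f2, f3, r = 1.0010, 1.0040, 1.0160, 2.0   # toy grid-convergence data
    p = observed_order(f1, f2, f3, r)             # -> 2.0 for these values
    print(f"p = {p:.2f}, extrapolated = {richardson_extrapolate(f1, f2, r, p):.5f}")
    ```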

  16. RESRAD-BUILD verification.

    SciTech Connect

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-31

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides (H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238) were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft® Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-Particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified.

  17. Using laser diagnostics for studying the injection of a dry additive

    NASA Astrophysics Data System (ADS)

    Tokunov, Iu. M.; Zhilin, V. G.; Liuliukin, V. I.; Mostinskii, I. L.; Putin, Iu. A.

    1987-02-01

    In MHD generators using an ionizing additive, the uniform injection of the additive into the combustion chamber and the dispersity of the injected particles are important. Some of the problems associated with the injection of an ionizing additive can be alleviated by using dry additives. In the experiment reported here, an He-Ne laser was used to monitor the injection of potash powder with an average size of 40 microns. It is shown that laser diagnostics can be successfully used to determine the mean particle diameter and variation of the powder flow rate with time.

  18. The formal verification of generic interpreters

    NASA Technical Reports Server (NTRS)

    Windley, P.; Levitt, K.; Cohen, G. C.

    1991-01-01

    Task assignment 3 of the design and validation of digital flight control systems suitable for fly-by-wire applications is studied. Task 3 is concerned with the formal verification of embedded systems. In particular, results are presented that provide a methodological approach to microprocessor verification. A hierarchical decomposition strategy for specifying microprocessors is also presented. A theory of generic interpreters is presented that can be used to model microprocessor behavior. The generic interpreter theory abstracts away the details of instruction functionality, leaving a general model of what an interpreter does.

  19. Evaluation of additional head of biceps brachii: a study with autopsy material.

    PubMed

    Ballesteros, L E; Forero, P L; Buitrago, E R

    2014-05-01

    An additional head of the biceps brachii (AHBB) has been reported in different population groups with a frequency of 1-25%. The purpose of this study was to determine the incidence and morphologic expression of the AHBB in a sample of the Colombian population. An exploration was conducted on 106 arms from unclaimed corpses autopsied at the Institute of Legal and Forensic Medicine of Bucaramanga, Colombia. Using a medial incision involving skin, subcutaneous tissue, and brachial fascia, the heads of the biceps and their innervating branches were visualised. One AHBB was observed in 21 (19.8%) of the arms evaluated, with a non-significant difference (p = 0.568) per side of presentation: 11 (52.4%) cases on the right side and 10 (47.6%) on the left side. All AHBBs originated in the infero-medial segment of the humerus, with a mean thickness of 17.8 ± 6.8 mm. In 4 (19%) cases the fascicle was thin, less than 10 mm; in 7 (33.3%) cases it was of medium thickness, between 11 and 20 mm; whereas in 47.6% it was thicker than 20 mm. The length of the AHBB was 118.3 ± 26.8 mm; its motor point, supplied by the musculocutaneous nerve, was located at 101.3 ± 20.9 mm from the bi-epicondylar line. The incidence of AHBB in this study is at the upper end of what has been reported in the literature and could be a morphologic trait of the Colombian population; in agreement with prior studies, the origin was the infero-medial surface of the humerus. PMID:24902098

  20. Increased Risk of Additional Cancers Among Patients with Gastrointestinal Stromal Tumors: A Population-Based Study

    PubMed Central

    Murphy, James D.; Ma, Grace L.; Baumgartner, Joel M.; Madlensky, Lisa; Burgoyne, Adam M.; Tang, Chih-Min; Martinez, Maria Elena; Sicklick, Jason K.

    2015-01-01

    Purpose Most gastrointestinal stromal tumors (GIST) are considered non-hereditary or sporadic. However, single-institution studies suggest that GIST patients develop additional malignancies with increased frequencies. We hypothesized that we could gain greater insight into possible associations between GIST and other malignancies using a national cancer database inquiry. Methods Patients diagnosed with GIST (2001–2011) in the Surveillance, Epidemiology, and End Results database were included. Standardized prevalence ratios (SPRs) and standardized incidence ratios (SIRs) were used to quantify cancer risks incurred by GIST patients before and after GIST diagnoses, respectively, when compared with the general U.S. population. Results Of 6,112 GIST patients, 1,047 (17.1%) had additional cancers. There were significant increases in overall cancer rates: 44% (SPR=1.44) before diagnosis and 66% (SIR=1.66) after GIST diagnoses. Malignancies with significantly increased occurrence both before/after diagnoses included other sarcomas (SPR=5.24/SIR=4.02), neuroendocrine-carcinoid tumors (SPR=3.56/SIR=4.79), non-Hodgkin’s lymphoma (SPR=1.69/SIR=1.76), and colorectal adenocarcinoma (SPR=1.51/SIR=2.16). Esophageal adenocarcinoma (SPR=12.0), bladder adenocarcinoma (SPR=7.51), melanoma (SPR=1.46), and prostate adenocarcinoma (SPR=1.20) were significantly more common only before GIST. Ovarian carcinoma (SIR=8.72), small intestine adenocarcinoma (SIR=5.89), papillary thyroid cancer (SIR=5.16), renal cell carcinoma (SIR=4.46), hepatobiliary adenocarcinomas (SIR=3.10), gastric adenocarcinoma (SIR=2.70), pancreatic adenocarcinoma (SIR=2.03), uterine adenocarcinoma (SIR=1.96), non-small cell lung cancer (SIR=1.74), and transitional cell carcinoma of the bladder (SIR=1.65) were significantly more common only after GIST. Conclusion This is the first population-based study to characterize the associations and temporal relationships between GIST and other cancers, both by site and
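    The SPR/SIR values quoted above are observed-to-expected ratios; a hedged sketch of the SIR arithmetic with an exact Poisson confidence interval (illustrative counts, not SEER data; assumes scipy):

    ```python
    # Sketch: standardized incidence ratio (SIR) with an exact Poisson CI.
    from scipy.stats import chi2

    def sir(observed, expected, alpha=0.05):
        point = observed / expected
        lo = chi2.ppf(alpha / 2, 2 * observed) / (2 * expected)
        hi = chi2.ppf(1 - alpha / 2, 2 * (observed + 1)) / (2 * expected)
        return point, lo, hi

    # e.g. 50 cancers observed after GIST vs ~30 expected from population rates
    point, lo, hi = sir(observed=50, expected=30.1)
    print(f"SIR = {point:.2f} (95% CI {lo:.2f}-{hi:.2f})")
    ```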

  1. TFE verification program

    NASA Astrophysics Data System (ADS)

    1990-03-01

    The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a Thermionic Fuel Element (TFE) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern. The general logic and strategy of the program to achieve its objectives is shown. Five prior programs form the basis for the TFE Verification Program: (1) AEC/NASA program of the 1960s and early 1970s; (2) SP-100 concept development program; (3) SP-100 thermionic technology program; (4) Thermionic irradiations program in TRIGA in FY-88; and (5) Thermionic Technology Program in 1986 and 1987.

  2. TFE Verification Program

    SciTech Connect

    Not Available

    1990-03-01

    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern. The general logic and strategy of the program to achieve its objectives is shown on Fig. 1-1. Five prior programs form the basis for the TFE Verification Program: (1) AEC/NASA program of the 1960s and early 1970s; (2) SP-100 concept development program; (3) SP-100 thermionic technology program; (4) Thermionic irradiations program in TRIGA in FY-86; and (5) Thermionic Technology Program in 1986 and 1987. 18 refs., 64 figs., 43 tabs.

  3. Runtime Verification with State Estimation

    NASA Technical Reports Server (NTRS)

    Stoller, Scott D.; Bartocci, Ezio; Seyster, Justin; Grosu, Radu; Havelund, Klaus; Smolka, Scott A.; Zadok, Erez

    2011-01-01

    We introduce the concept of Runtime Verification with State Estimation and show how this concept can be applied to estimate the probability that a temporal property is satisfied by a run of a program when monitoring overhead is reduced by sampling. In such situations, there may be gaps in the observed program executions, thus making accurate estimation challenging. To deal with the effects of sampling on runtime verification, we view event sequences as observation sequences of a Hidden Markov Model (HMM), use an HMM model of the monitored program to "fill in" sampling-induced gaps in observation sequences, and extend the classic forward algorithm for HMM state estimation (which determines the probability of a state sequence, given an observation sequence) to compute the probability that the property is satisfied by an execution of the program. To validate our approach, we present a case study based on the mission software for a Mars rover. The results of our case study demonstrate high prediction accuracy for the probabilities computed by our algorithm. They also show that our technique is much more accurate than simply evaluating the temporal property on the given observation sequences, ignoring the gaps.
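
    The forward recursion at the core of this approach is compact. Below is a minimal sketch of the classic HMM forward algorithm the abstract builds on, with a naive gap-handling step (propagating the state distribution through the transition matrix when an observation is missing); the two-state toy model is hypothetical, not the paper's rover HMM.

      import numpy as np

      # Toy HMM: 2 hidden states, 2 observable event types (hypothetical).
      A = np.array([[0.9, 0.1],          # state-transition probabilities
                    [0.2, 0.8]])
      B = np.array([[0.7, 0.3],          # emission probabilities B[state, event]
                    [0.1, 0.9]])
      pi = np.array([0.6, 0.4])          # initial state distribution

      def forward(obs):
          """Return P(observations) under the HMM; None marks a sampling gap."""
          alpha = pi * B[:, obs[0]]
          for o in obs[1:]:
              if o is None:              # gap: no emission, just propagate states
                  alpha = alpha @ A
              else:
                  alpha = (alpha @ A) * B[:, o]
          return alpha.sum()

      print(forward([0, 1, None, 1]))    # probability of the gapped run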

  4. Additional Language Teaching within the International Baccalaureate Primary Years Programme: A Comparative Study

    ERIC Educational Resources Information Center

    Lebreton, Marlène

    2014-01-01

    The International Baccalaureate Primary Years Programme supports the learning of languages and cultures, but the role of the additional language within this programme is often unclear. There remains a great variability in schools regarding the frequency of lessons and the way that the additional language is taught within the Primary Years…

  5. Beyond the Call of Duty: A Qualitative Study of Teachers' Additional Responsibilities Related to Sexuality Education

    ERIC Educational Resources Information Center

    Eisenberg, Marla E.; Madsen, Nikki; Oliphant, Jennifer A.; Resnick, Michael

    2011-01-01

    Seven focus groups were conducted with sexuality educators in Minnesota to explore ways that teaching sexuality education differs from teaching other health education content and to determine if additional supports or resources are needed for sexuality educators. Teachers described many specific additional responsibilities or concerns related to…

  6. Experimental study of combustion characteristics of nanoscale metal and metal oxide additives in biofuel (ethanol)

    PubMed Central

    2011-01-01

    An experimental investigation of the combustion behavior of nano-aluminum (n-Al) and nano-aluminum oxide (n-Al2O3) particles stably suspended in biofuel (ethanol) as a secondary energy carrier was conducted. The heat of combustion (HoC) was studied using a modified static bomb calorimeter system. Combustion element composition and surface morphology were evaluated using a SEM/EDS system. N-Al and n-Al2O3 particles of 50- and 36-nm diameters, respectively, were utilized in this investigation. Combustion experiments were performed with volume fractions of 1, 3, 5, 7, and 10% for n-Al, and 0.5, 1, 3, and 5% for n-Al2O3. The results indicate that the amount of heat released from ethanol combustion increases almost linearly with n-Al concentration. N-Al volume fractions of 1 and 3% did not show enhancement in the average volumetric HoC, but higher volume fractions of 5, 7, and 10% increased the volumetric HoC by 5.82, 8.65, and 15.31%, respectively. N-Al2O3 and heavily passivated n-Al additives did not participate in combustion reactively, and there was no contribution from Al2O3 to the HoC in the tests. A combustion model utilizing the Chemical Equilibrium with Applications code was also developed and shown to be in good agreement with the experimental results. PMID:21711760

  7. A theoretical study of wave dispersion and thermal conduction for HMX/additive interfaces

    NASA Astrophysics Data System (ADS)

    Long, Yao; Chen, Jun

    2014-04-01

    The wave dispersion rule for non-uniform material is useful for ultrasonic inspection and engine life prediction, and is also key to understanding the energy dissipation and thermal conduction properties of solid materials. On the basis of linear response theory and molecular dynamics, we derive a set of formulas for calculating the wave dispersion rate of interface systems, and study four kinds of interfaces inside plastic bonded explosives: HMX/{HMX, TATB, F2312, F2313} (HMX: octahydro-1,3,5,7-tetranitro-1,3,5,7-tetrazocine; TATB: 1,3,5-triamino-2,4,6-trinitrobenzene; F2312, F2313: fluoropolymers). The wave dispersion rate is obtained over a wide frequency range from kHz to PHz. We find that at low frequency, the rate is proportional to the square of the frequency, and at high frequency, the rate couples with the molecular vibration modes at the interface. By using the results, the thermal conductivities of HMX/additive interfaces are derived, and a physical model is built for describing the total thermal conductivity of mixture explosives, including HMX multi-particle systems and {TATB, F2312, F2313}-coated HMX.
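
    The low-frequency scaling reported in the abstract can be written compactly as a math note; the symbols (α for the dispersion rate, ω for angular frequency, C for an interface-dependent constant) are generic choices, since the abstract states the relation only in words:

      % LaTeX: low-frequency limit of the interfacial wave dispersion rate
      \alpha(\omega) \approx C\,\omega^{2}, \qquad \omega \ll \omega_{\mathrm{vib}}

    with the quadratic law crossing over once ω reaches the band of interfacial molecular vibration modes, ω_vib.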

  8. Addition reaction of alkyl radical to C60 fullerene: Density functional theory study

    NASA Astrophysics Data System (ADS)

    Tachikawa, Hiroto; Kawabata, Hiroshi

    2016-02-01

    Functionalized fullerenes are known as high-performance molecules. In this study, the alkyl-functionalized fullerenes (denoted by R-C60) have been investigated by means of the density functional theory (DFT) method to elucidate the effects of functionalization on the electronic states of fullerene. Also, the reaction mechanism of alkyl radicals with C60 was investigated. The methyl, ethyl, propyl, and butyl radicals (denoted by n = 1-4, where n means the number of carbon atoms in the alkyl radical) were examined as alkyl radicals. The DFT calculation showed that the alkyl radical binds to the carbon atom of C60 at the on-top site, and a strong C-C single bond is formed. The binding energies of alkyl radicals to C60 were distributed in the range of 31.8-35.1 kcal mol-1 at the CAM-B3LYP/6-311G(d,p) level. It was found that an activation barrier exists before alkyl addition; the barrier heights were calculated to be 2.1-2.8 kcal mol-1. The electronic states of R-C60 complexes were discussed on the basis of the theoretical results.
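
    To give a feel for what barriers of this size mean kinetically, the sketch below converts them into transition-state-theory rate constants via the Eyring equation. This calculation is not from the paper, and it assumes the reported barriers can be read as free energies of activation at 298 K, which the abstract does not state:

      import math

      KB = 1.380649e-23    # Boltzmann constant, J/K
      H  = 6.62607015e-34  # Planck constant, J*s
      R  = 8.314462618     # gas constant, J/(mol*K)
      T  = 298.15          # assumed temperature, K

      def eyring_rate(dg_kcal_mol):
          """Eyring TST rate: k = (kB*T/h) * exp(-dG/(R*T))."""
          dg = dg_kcal_mol * 4184.0          # kcal/mol -> J/mol
          return (KB * T / H) * math.exp(-dg / (R * T))

      for barrier in (2.1, 2.8):             # barrier range from the abstract
          print(f"{barrier} kcal/mol -> k ~ {eyring_rate(barrier):.2e} s^-1")

    Barriers this low give rates within a few orders of magnitude of the attempt frequency, consistent with facile radical addition.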

  9. Ecological Optimization and Parametric Study of an Irreversible Regenerative Modified Brayton Cycle with Isothermal Heat Addition

    NASA Astrophysics Data System (ADS)

    Tyagi, Sudhir K.; Kaushik, Subhash C.; Tiwari, Vivek

    2003-12-01

    An ecological optimization along with a detailed parametric study of an irreversible regenerative Brayton heat engine with isothermal heat addition has been carried out with external as well as internal irreversibilities. The ecological function is defined as the power output minus the power loss (irreversibility), which is the ambient temperature times the entropy generation rate. The external irreversibility is due to the finite temperature difference between the heat engine and the external reservoirs, while the internal irreversibilities are due to nonisentropic compression and expansion processes in the compressor and the turbine, respectively, and the regenerative heat loss. The ecological function is found to be an increasing function of the isothermal-, sink- and regenerative-side effectiveness, isothermal-side inlet temperature, component efficiencies and sink-side temperature, while it is found to be a decreasing function of the isobaric-side temperature and effectiveness and the working fluid heat capacitance rate. The effects of the isobaric-side effectiveness are found to be more than those of the other parameters, and the effects of turbine efficiency are found to be more than those of the compressor efficiency on all the performance parameters of the cycle.
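
    The objective function described in words above has a standard compact form; the symbols here (E for the ecological function, P for power output, T0 for ambient temperature, σ̇ for entropy generation rate) are generic choices, since the abstract does not fix a notation:

      % LaTeX: ecological function = power output minus ambient-temperature-weighted irreversibility
      E = P - T_{0}\,\dot{\sigma}

    Maximizing E trades power against dissipation, which is why it responds oppositely to parameters that raise output versus those that raise entropy generation.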

  10. Theoretical study of addition reactions of carbene, silylene, and germylene to carbon nanotubes

    NASA Astrophysics Data System (ADS)

    Chu, Ying-Ying; Su, Ming-Der

    2004-08-01

    A theoretical study of the mechanism of the reaction of a single-walled carbon nanotube (SWCNT) with carbene (H2C), silylene (H2Si), and germylene (H2Ge) has been carried out using a two-layered ONIOM(B3LYP/6-311G*:PM3) approach. The main findings are as follows: (1) The computational results based on the method used in this work are in good agreement with recent theoretical findings [Angew. Chem. Int. Ed. 41 (2002) 1853]. That is, SWCNTs with H2C, H2Si, and H2Ge addends favor opened structures rather than three-membered rings. (2) The greater the atomic number of the carbene center, the larger the activation energy and the less exothermic (or the more endothermic) the cycloaddition reaction becomes. Therefore, addition to the C=C bond of a SWCNT is more difficult the heavier the carbene center. (3) The theoretical observations suggest that the singlet-triplet splitting of a carbene can be used as a guide to its reactivity during the SWCNT cycloaddition process.

  11. Verification of a Viscous Computational Aeroacoustics Code Using External Verification Analysis

    NASA Technical Reports Server (NTRS)

    Ingraham, Daniel; Hixon, Ray

    2015-01-01

    The External Verification Analysis approach to code verification is extended to solve the three-dimensional Navier-Stokes equations with constant properties, and is used to verify a high-order computational aeroacoustics (CAA) code. After a brief review of the relevant literature, the details of the EVA approach are presented and compared to the similar Method of Manufactured Solutions (MMS). Pseudocode representations of EVA's algorithms are included, along with the recurrence relations needed to construct the EVA solution. The code verification results show that EVA was able to convincingly verify a high-order, viscous CAA code without the addition of MMS-style source terms, or any other modifications to the code.
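
    For contrast with EVA, the Method of Manufactured Solutions mentioned above is easy to demonstrate: pick an exact solution, derive the source term it implies, feed that source to the solver, and confirm the error converges at the expected order. The sketch below does this for a toy 1D Poisson solver, a hypothetical stand-in for the CAA code:

      import numpy as np

      def solve_poisson(n, f):
          """Second-order finite-difference solve of -u'' = f on (0, pi),
          with u(0) = u(pi) = 0."""
          x = np.linspace(0.0, np.pi, n + 1)
          h = x[1] - x[0]
          main = 2.0 * np.ones(n - 1)
          off = -np.ones(n - 2)
          A = (np.diag(main) + np.diag(off, 1) + np.diag(off, -1)) / h**2
          u = np.zeros(n + 1)
          u[1:-1] = np.linalg.solve(A, f(x[1:-1]))
          return x, u

      u_exact = np.sin     # manufactured solution: u = sin(x)
      f_mms = np.sin       # implied MMS source: -u'' = sin(x)

      errors = []
      for n in (32, 64, 128):
          x, u = solve_poisson(n, f_mms)
          errors.append(np.max(np.abs(u - u_exact(x))))
      # observed order should approach 2 for this second-order scheme
      print([np.log2(errors[i] / errors[i + 1]) for i in range(2)])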

  13. Food additives.

    PubMed

    Berglund, F

    1978-01-01

    The use of additives to food fulfils many purposes, as shown by the index issued by the Codex Committee on Food Additives: Acids, bases and salts; Preservatives; Antioxidants and antioxidant synergists; Anticaking agents; Colours; Emulsifiers; Thickening agents; Flour-treatment agents; Extraction solvents; Carrier solvents; Flavours (synthetic); Flavour enhancers; Non-nutritive sweeteners; Processing aids; Enzyme preparations. Many additives occur naturally in foods, but this does not exclude toxicity at higher levels. Some food additives are nutrients, or even essential nutrients, e.g. NaCl. Examples are known of food additives causing toxicity in man even when used according to regulations, e.g. cobalt in beer. In other instances, poisoning has been due to carry-over, e.g. by nitrate in cheese whey when used as artificial feed for infants. Poisonings also occur as the result of the permitted substance being added at too high levels, by accident or carelessness, e.g. nitrite in fish. Finally, there are examples of hypersensitivity to food additives, e.g. to tartrazine and other food colours. The toxicological evaluation, based on animal feeding studies, may be complicated by impurities, e.g. orthotoluene-sulfonamide in saccharin; by transformation or disappearance of the additive in food processing or storage, e.g. bisulfite in raisins; by reaction products with food constituents, e.g. formation of ethylurethane from diethyl pyrocarbonate; by metabolic transformation products, e.g. formation in the gut of cyclohexylamine from cyclamate. Metabolic end products may differ in experimental animals and in man: guanylic acid and inosinic acid are metabolized to allantoin in the rat but to uric acid in man. The magnitude of the safety margin in man of the Acceptable Daily Intake (ADI) is not identical to the "safety factor" used when calculating the ADI. The symptoms of Chinese Restaurant Syndrome, although not hazardous, furthermore illustrate that the whole ADI

  14. Verification and validation studies of the time-averaged velocity field in the very near-wake of a finite elliptical cylinder

    NASA Astrophysics Data System (ADS)

    Flynn, Michael R.; Eisner, Alfred D.

    2004-04-01

    This paper presents verification and validation results for the time-averaged, three-dimensional velocity field immediately downstream of a finite elliptic cylinder at a Reynolds number of 1.35 × 10^4. Numerical simulations were performed with the finite element package, Fidap, using the steady state, standard k-epsilon model. The ratio of the cylinder height to the major axis of the elliptical cross section is 5.0; the aspect ratio of the cross section is 0.5625. This particular geometry is selected as a crude surrogate for the human form in consideration of further applied occupational and environmental health studies. Predictions of the velocity and turbulence kinetic energy fields in the very near-wake are compared to measurements taken in a wind tunnel using laser Doppler anemometry. Results show that at all locations where a reliable grid convergence index can be calculated there is not a demonstrable difference between simulated and measured values. The overall topology of the time-averaged flow field is reasonably well predicted, although the simulated near-wake is narrower than the measured one.
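
    The grid convergence index (GCI) invoked above is a standard uncertainty estimate from systematic grid refinement. The sketch below computes Roache's GCI and the observed order of accuracy from three grid levels; the numbers are made-up placeholders, not values from the paper:

      import math

      def observed_order(f1, f2, f3, r):
          """Observed convergence order from solutions on three grids
          (f1 finest, f3 coarsest) with constant refinement ratio r."""
          return math.log(abs((f3 - f2) / (f2 - f1))) / math.log(r)

      def gci_fine(f_coarse, f_fine, r, p, fs=1.25):
          """Roache's GCI on the fine grid: a relative error band."""
          eps = abs((f_coarse - f_fine) / f_fine)
          return fs * eps / (r**p - 1.0)

      f1, f2, f3, r = 1.02, 1.05, 1.14, 2.0   # hypothetical velocity values
      p = observed_order(f1, f2, f3, r)
      print(f"observed order p = {p:.2f}")
      print(f"GCI(fine) = {gci_fine(f2, f1, r, p):.3%}")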

  15. Cognitive Bias in Systems Verification

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: Patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: economics, politics, intelligence, marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways. Overconfidence -> Questionable decisions to deploy. Availability -> Inability to conceive critical tests. Representativeness -> Overinterpretation of results. Positive Test Strategies -> Confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. Worth considering at key points in the process.

  16. Quantum money with classical verification

    SciTech Connect

    Gavinsky, Dmitry

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, but nevertheless none of the earlier quantum money constructions is known to possess it.

  17. Quantum money with classical verification

    NASA Astrophysics Data System (ADS)

    Gavinsky, Dmitry

    2014-12-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, but nevertheless none of the earlier quantum money constructions is known to possess it.

  18. Fortification of yogurts with different antioxidant preservatives: A comparative study between natural and synthetic additives.

    PubMed

    Caleja, Cristina; Barros, Lillian; Antonio, Amilcar L; Carocho, Márcio; Oliveira, M Beatriz P P; Ferreira, Isabel C F R

    2016-11-01

    Consumers demand more and more so-called "natural" products and, therefore, the aim of this work was to compare the effects of natural versus synthetic antioxidant preservatives in yogurts. Matricaria recutita L. (chamomile) and Foeniculum vulgare Mill. (fennel) decoctions were tested as natural additives, while potassium sorbate (E202) was used as a synthetic additive. The fortification of yogurts with natural and synthetic antioxidants did not cause significant changes in the yogurt pH and nutritional value, in comparison with control samples (yogurt without any additive). However, the fortified yogurts showed higher antioxidant activity, mainly the yogurts with natural additives (and among these, the ones with chamomile decoction). Overall, it can be concluded that plant decoctions can be used to develop novel yogurts, by replacing synthetic preservatives and improving the antioxidant properties of the final product, without changing the nutritional profile. PMID:27211646

  19. ALUM ADDITION AND STEP-FEED STUDIES IN OXYGEN-ACTIVATED SLUDGE

    EPA Science Inventory

    A plug flow, O2-activated sludge process was operated with alum addition to remove phosphorus and with lime addition to prevent the process pH from decreasing below 6.4. The O2 reactor was operated at F/M ratios between 0.18 and 0.24 gm of BOD5/gm of MLVSS/day in a typical co-curr...

  20. Chemostat Studies of TCE-Dehalogenating Anaerobic Consortia under Excess and Limited Electron Donor Addition

    NASA Astrophysics Data System (ADS)

    Semprini, L.; Azizian, M.; Green, J.; Mayer-Blackwell, K.; Spormann, A. M.

    2015-12-01

    Two cultures - the Victoria Strain (VS) and the Evanite Strain (EV), enriched with the organohalide-respiring bacterium Dehalococcoides mccartyi - were grown in chemostats for more than 4 years at a mean cell residence time of 50 days. The slow doubling rate represents growth likely experienced in the subsurface. The chemostats were fed formate as an electron donor and trichloroethene (TCE) as the terminal electron acceptor. Under excess formate conditions, stable operation was observed with respect to TCE transformation, steady-state hydrogen (H2) concentrations (40 nM), and the structure of the dehalogenating community. Both cultures completely transformed TCE to ethene, with minor amounts of vinyl chloride (VC) observed, along with acetate formation. When formate was limited, TCE was transformed incompletely to ethene (40-60%) and VC (60-40%), and H2 concentrations ranged from 1 to 3 nM. The acetate concentration dropped below detection. Batch kinetic studies of TCE transformation with chemostat-harvested cells found that transformation rates of c-DCE and VC were greatly reduced when the cells were grown with limited formate. Upon increasing formate addition to the chemostats, from limited to excess, essentially complete transformation of TCE to ethene was achieved. The increase in formate was associated with an increase in H2 concentration and the production of acetate. Results of batch kinetic tests showed increases in transformation rates for TCE and c-DCE by factors of 3.5 and 2.5, respectively, while VC rates increased by factors of 33 to 500, over a six month period. Molecular analysis of chemostat samples is being performed to quantify the changes in copy numbers of reductase genes and to determine whether shifts in the strains of Dehalococcoides mccartyi were responsible for the observed rate increases. The results demonstrate the importance of electron donor supply for successful in-situ remediation.

  1. A study of the electrochemistry of nickel hydroxide electrodes with various additives

    NASA Astrophysics Data System (ADS)

    Zhu, Wen-Hua; Ke, Jia-Jun; Yu, Hong-Mei; Zhang, Deng-Jun

    Nickel composite electrodes (NCE) with various additives are prepared by a chemical impregnation method from nitrate solutions on sintered porous plaques. The electrochemical properties, such as utilization of active material, swelling and the discharge potential of the nickel oxide electrode (NOE), are determined mainly by the composition of the active material and the characteristics of the nickel plaques. Most additives (Mg, Ca, Sr, Ba, Zn, Cd, Co, Li and Al hydroxide) exert effects on the discharge potential and swelling of the NOE. Chemical co-precipitation with the addition of calcium, zinc, magnesium and barium hydroxide increases the discharge potential by more than 20 mV, but that with zinc hydroxide results in an obvious decrease of active-material utilization and that with calcium and magnesium hydroxide produces a larger increase of electrode thickness. The effects of anion additives are also examined. Less than 1 mol% of NiS in the active material increases the discharge potential. Cadmium, cobalt and zinc hydroxide are excellent additives for preventing swelling of the NCE. Slow voltammetry (0.2 mV s^-1) in 6 M KOH is applied to characterize the oxygen-evolving potential of the NCE. The difference between the oxygen-evolution potential and the potential of the oxidation peak for the NCE with additives of calcium, lithium, barium and aluminium hydroxide is at least +60 mV.

  2. NES++: number system for encryption based privacy preserving speaker verification

    NASA Astrophysics Data System (ADS)

    Xu, Lei; Feng, Tao; Zhao, Xi; Shi, Weidong

    2014-05-01

    As speech-based operation becomes a main hands-free interaction solution between humans and mobile devices (e.g., smartphones, Google Glass), privacy preserving speaker verification receives much attention nowadays. Privacy preserving speaker verification can be achieved through many different ways, such as fuzzy vault and encryption. Encryption based solutions are promising as cryptography is based on solid mathematical foundations and the security properties can be easily analyzed in a well established framework. Most current asymmetric encryption schemes work on finite algebraic structures, such as finite groups and finite fields. However, the encryption scheme for privacy preserving speaker verification must handle floating point numbers. This gap must be filled to make the overall scheme practical. In this paper, we propose a number system that meets the requirements of both speaker verification and the encryption scheme used in the process. It also supports the additive homomorphic property of Paillier's encryption, which is crucial for privacy preserving speaker verification. As asymmetric encryption is expensive, we propose a method of packing several numbers into one plaintext, so that the computation overhead is greatly reduced. To evaluate the performance of this method, we implement Paillier's encryption scheme over the proposed number system with the packing technique. Our findings show that the proposed solution can fill the gap between speaker verification and the encryption scheme very well, and the packing technique improves the overall performance. Furthermore, our solution is a building block of encryption based privacy preserving speaker verification; the privacy protection and accuracy rate are not affected.
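
    The additive homomorphism that makes Paillier encryption attractive here is easy to demonstrate end-to-end. Below is a minimal textbook Paillier sketch with toy parameters; the float encoding and number packing described in the paper are not reproduced:

      import math, random

      p, q = 293, 433                  # toy primes; never use sizes like this
      n = p * q
      n2 = n * n
      lam = math.lcm(p - 1, q - 1)
      g = n + 1                        # standard simple generator choice

      def L(x):
          return (x - 1) // n

      mu = pow(L(pow(g, lam, n2)), -1, n)

      def encrypt(m):
          r = random.randrange(1, n)
          while math.gcd(r, n) != 1:   # r must be invertible mod n
              r = random.randrange(1, n)
          return (pow(g, m, n2) * pow(r, n, n2)) % n2

      def decrypt(c):
          return (L(pow(c, lam, n2)) * mu) % n

      # Homomorphic addition: multiplying ciphertexts adds plaintexts.
      c = (encrypt(20) * encrypt(22)) % n2
      print(decrypt(c))                # -> 42

    Requires Python 3.9+ (math.lcm, and modular inverse via pow with exponent -1).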

  3. Verification for ALEGRA using magnetized shock hydrodynamics problems.

    SciTech Connect

    Rider, William J.; Niederhaus, John H.; Robinson, Allen Conrad; Gardiner, Thomas Anthony

    2008-10-01

    Two classical verification problems from shock hydrodynamics are adapted for verification in the context of ideal magnetohydrodynamics (MHD) by introducing strong transverse magnetic fields, and simulated using the finite element Lagrange-remap MHD code ALEGRA for purposes of rigorous code verification. The concern in these verification tests is that inconsistencies related to energy advection are inherent in Lagrange-remap formulations for MHD, such that conservation of the kinetic and magnetic components of the energy may not be maintained. Hence, total energy conservation may also not be maintained. MHD shock propagation may therefore not be treated consistently in Lagrange-remap schemes, as errors in energy conservation are known to result in unphysical shock wave speeds and post-shock states. That kinetic energy is not conserved in Lagrange-remap schemes is well known, and the correction of DeBar has been shown to eliminate the resulting errors. Here, the consequences of the failure to conserve magnetic energy are revealed using order verification in the two magnetized shock-hydrodynamics problems. Further, a magnetic analog to the DeBar correction is proposed and its accuracy evaluated using this verification testbed. Results indicate that only when the total energy is conserved, by implementing both the kinetic and magnetic components of the DeBar correction, can simulations in Lagrange-remap formulation capture MHD shock propagation accurately. Additional insight is provided by the verification results, regarding the implementation of the DeBar correction and the advection scheme.
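
    The energy-conservation issue at the heart of this study can be illustrated in one dimension. The sketch below is a hedged toy, not ALEGRA's scheme: after a donor-cell remap of mass, momentum, and energies, the kinetic energy implied by the remapped momentum differs from the directly remapped kinetic energy, and a DeBar-style fix deposits the difference into internal energy so the total is conserved:

      import numpy as np

      def donor_remap(q, frac):
          """Shift a fraction of each cell's content to its right neighbor
          (periodic), mimicking a conservative remap/advection sweep."""
          return q - frac * q + frac * np.roll(q, 1)

      m = np.array([1.0, 1.2, 0.9, 1.1])        # cell masses
      u = np.array([0.5, -0.2, 0.8, 0.1])       # cell velocities
      e_int = np.array([2.0, 2.1, 1.9, 2.2])    # internal energies

      p, ke = m * u, 0.5 * m * u**2
      m2 = donor_remap(m, 0.3)
      p2 = donor_remap(p, 0.3)
      ke2 = donor_remap(ke, 0.3)                 # remapped kinetic energy
      e2 = donor_remap(e_int, 0.3)

      u2 = p2 / m2
      ke_from_u = 0.5 * m2 * u2**2               # KE implied by remapped p, m
      e2 += ke2 - ke_from_u                      # DeBar-style energy correction

      print((ke + e_int).sum(), (ke_from_u + e2).sum())  # equal: total conserved

    The paper's point is that an analogous correction is needed for the magnetic component of the energy before MHD shocks are captured correctly.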

  4. 78 FR 68461 - Guidance for Industry: Studies To Evaluate the Utility of Anti-Salmonella Chemical Food Additives...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-14

    ... Guidance for Industry: Studies to Evaluate the Utility of Anti-Salmonella Chemical Food Additives in Feeds... HUMAN SERVICES Food and Drug Administration Guidance for Industry: Studies To Evaluate the Utility of Anti-Salmonella Chemical Food Additives in Feeds; Request for Comments AGENCY: Food and...

  5. Maltreated children's representations of mother and an additional caregiver: a longitudinal study.

    PubMed

    Manashko, Shany; Besser, Avi; Priel, Beatriz

    2009-04-01

    In the current longitudinal investigation, we explored the continuity of and changes in the mental representations of the mother and an additional caregiver among forty-five 9- to 11-year-old children who had been severely maltreated and subsequently placed in long-term residential care, as well as the relationships between the content and structure of these representations and teachers' assessments of the children's externalizing and internalizing symptoms. At Time 1, a nonmaltreated comparison group was assessed concomitantly. Compared to nonmaltreated children, maltreated children scored higher for externalizing and internalizing symptoms, and their maternal representations were found to be significantly less benevolent and integrated and more punitive. In addition, among the maltreated children, the additional caregiver representations were found to be more benevolent and integrated, and less punitive, than the maternal representations. After 30 months, the maltreated children's levels of externalizing and internalizing symptoms diminished, their maternal representations became more benevolent and less punitive, and the additional caregiver representations became less benevolent. Moreover, the Benevolence of the additional caregiver representation was found to predict these children's changes in externalizing symptoms beyond the effects of their symptomatology and its associations with the Benevolence of these representations at Time 1. PMID:19220720

  6. Mechanical characterization of filler sandcretes with rice husk ash additions. Study applied to Senegal

    SciTech Connect

    Cisse, I.K.; Laquerbe, M.

    2000-01-01

    To capitalize on the local materials of Senegal (agricultural and industrial wastes, residual fines from the crushing process, sands from dunes, etc.), rice husk ash and residues of industrial and agricultural wastes have been used as additions in sandcretes. The mechanical resistance of sandcrete blocks obtained when unground ash (and notably ground ash) is added reveals an increase in performance over classic mortar blocks. In addition, the use of unground rice husk ash enables production of a lightweight sandcrete with insulating properties, at a reduced cost. The pozzolanic reactivity of the ash explains the high strengths obtained.

  7. Verification of LHS distributions.

    SciTech Connect

    Swiler, Laura Painton

    2006-04-01

    This document provides verification test results for normal, lognormal, and uniform distributions that are used in Sandia's Latin Hypercube Sampling (LHS) software. The purpose of this testing is to verify that the sample values being generated in LHS are distributed according to the desired distribution types. The testing of distribution correctness is done by examining summary statistics, graphical comparisons using quantile-quantile plots, and formal statistical tests such as the Chi-square test, the Kolmogorov-Smirnov test, and the Anderson-Darling test. The overall results from the testing indicate that the generation of normal, lognormal, and uniform distributions in LHS is acceptable.
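
    The kind of distribution check described is straightforward to reproduce. The sketch below draws lognormal samples and applies summary statistics plus a Kolmogorov-Smirnov test against the target distribution; plain Monte Carlo sampling stands in for Sandia's LHS generator:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(42)
      samples = rng.lognormal(mean=0.0, sigma=0.5, size=1000)

      # Summary statistics as a first sanity check.
      print("mean =", samples.mean(), " median =", np.median(samples))

      # KS test against lognormal with shape s = sigma and scale = exp(mean) = 1.
      d, pval = stats.kstest(samples, "lognorm", args=(0.5,))
      print(f"KS statistic = {d:.4f}, p-value = {pval:.3f}")  # large p: consistent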

  8. Verification of NASA Emergent Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy K. C. S.; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. This mission, the prospective ANTS (Autonomous Nano Technology Swarm) mission, will comprise 1,000 autonomous robotic agents designed to cooperate in asteroid exploration. The emergent properties of swarm-type missions make them powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of future swarm-based missions. The advantage of using formal methods is their ability to mathematically assure the behavior of a swarm, emergent or otherwise. The ANTS mission is being used as an example and case study of swarm-based missions with which to experiment and test current formal methods on intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior.

  9. Computational study of the rate constants and free energies of intramolecular radical addition to substituted anilines

    PubMed Central

    Seddiqzai, Meriam; Dahmen, Tobias; Sure, Rebecca

    2013-01-01

    Summary The intramolecular radical addition to aniline derivatives was investigated by DFT calculations. The computational methods were benchmarked by comparing the calculated values of the rate constant for the 5-exo cyclization of the hexenyl radical with the experimental values. The dispersion-corrected PW6B95-D3 functional provided very good results with deviations for the free activation barrier compared to the experimental values of only about 0.5 kcal mol−1 and was therefore employed in further calculations. Corrections for intramolecular London dispersion and solvation effects in the quantum chemical treatment are essential to obtain consistent and accurate theoretical data. For the investigated radical addition reaction it turned out that the polarity of the molecules is important and that a combination of electrophilic radicals with preferably nucleophilic arenes results in the highest rate constants. This is opposite to the Minisci reaction where the radical acts as nucleophile and the arene as electrophile. The substitution at the N-atom of the aniline is crucial. Methyl substitution leads to slower addition than phenyl substitution. Carbamates as substituents are suitable only when the radical center is not too electrophilic. No correlations between free reaction barriers and energies (ΔG‡ and ΔG_R) are found. Addition reactions leading to indanes or dihydrobenzofurans are too slow to be useful synthetically. PMID:24062821

  10. Study on automatic optical element addition or deletion in lens optimization

    NASA Astrophysics Data System (ADS)

    Cheng, Xuemin; Wang, Yongtian; Hao, Qun

    2002-09-01

    Two lens form parameters, quantifying the symmetry of the optical system and the optical power distribution among the individual lens elements, are used as the criteria for automatic element addition or deletion in lens optimization. The scheme based on the criteria is described in this paper. Design examples are provided, which demonstrate that the scheme is practicable.

  11. Teaching Young Children Decomposition Strategies to Solve Addition Problems: An Experimental Study

    ERIC Educational Resources Information Center

    Cheng, Zi-Juan

    2012-01-01

    The ability to count has traditionally been considered an important milestone in children's development of number sense. However, using counting (e.g., counting on, counting all) strategies to solve addition problems is not the best way for children to achieve their full mathematical potential and to prepare them to develop more complex and…

  12. Computational study of the rate constants and free energies of intramolecular radical addition to substituted anilines.

    PubMed

    Gansäuer, Andreas; Seddiqzai, Meriam; Dahmen, Tobias; Sure, Rebecca; Grimme, Stefan

    2013-01-01

    The intramolecular radical addition to aniline derivatives was investigated by DFT calculations. The computational methods were benchmarked by comparing the calculated values of the rate constant for the 5-exo cyclization of the hexenyl radical with the experimental values. The dispersion-corrected PW6B95-D3 functional provided very good results with deviations for the free activation barrier compared to the experimental values of only about 0.5 kcal mol-1 and was therefore employed in further calculations. Corrections for intramolecular London dispersion and solvation effects in the quantum chemical treatment are essential to obtain consistent and accurate theoretical data. For the investigated radical addition reaction it turned out that the polarity of the molecules is important and that a combination of electrophilic radicals with preferably nucleophilic arenes results in the highest rate constants. This is opposite to the Minisci reaction where the radical acts as nucleophile and the arene as electrophile. The substitution at the N-atom of the aniline is crucial. Methyl substitution leads to slower addition than phenyl substitution. Carbamates as substituents are suitable only when the radical center is not too electrophilic. No correlations between free reaction barriers and energies (ΔG‡ and ΔG_R) are found. Addition reactions leading to indanes or dihydrobenzofurans are too slow to be useful synthetically. PMID:24062821

  13. Vector generalized additive models for extreme rainfall data analysis (study case rainfall data in Indramayu)

    NASA Astrophysics Data System (ADS)

    Utami, Eka Putri Nur; Wigena, Aji Hamim; Djuraidah, Anik

    2016-02-01

    Rainfall patterns are good indicators of potential disasters. A Global Circulation Model (GCM) contains global-scale information that can be used to predict rainfall data. Statistical downscaling (SD) utilizes the global-scale information to make inferences at the local scale. Essentially, SD can be used to predict local-scale variables based on global-scale variables. SD requires a method to accommodate nonlinear effects and extreme values. Extreme Value Theory (EVT) can be used to analyze the extreme values. One of the methods to identify extreme events is peaks over threshold, whose exceedances follow the Generalized Pareto Distribution (GPD). The vector generalized additive model (VGAM) is an extension of the generalized additive model. It is able to accommodate linear or nonlinear effects by involving more than one additive predictor. The advantage of VGAM is its ability to handle multi-response models. The key ideas of VGAM are iteratively reweighted least squares for maximum likelihood estimation, penalized smoothing, Fisher scoring, and additive models. This work aims to analyze extreme rainfall data in Indramayu using VGAM. The results show that VGAM with the GPD is able to predict extreme rainfall data accurately. The prediction in February is very close to the actual value at quantile 75.
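
    The peaks-over-threshold step described above is easy to sketch: exceedances over a high threshold are fit with a GPD, from which extreme quantiles follow. Synthetic data stand in for the Indramayu series, and the VGAM regression on GCM covariates is beyond this snippet:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      rainfall = rng.gamma(shape=2.0, scale=10.0, size=5000)  # synthetic daily rainfall

      u = np.quantile(rainfall, 0.95)           # high threshold (95th percentile)
      exceedances = rainfall[rainfall > u] - u  # peaks over threshold

      # Fit a GPD to the exceedances, pinning the location parameter at zero.
      xi, loc, sigma = stats.genpareto.fit(exceedances, floc=0.0)

      # Quantile of the exceedance distribution (e.g. the 75% level cited above).
      q75 = u + stats.genpareto.ppf(0.75, xi, scale=sigma)
      print(f"xi = {xi:.3f}, sigma = {sigma:.2f}, 75% quantile = {q75:.1f}")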

  14. Studies on the Food Additive Propyl Gallate: Synthesis, Structural Characterization, and Evaluation of the Antioxidant Activity

    ERIC Educational Resources Information Center

    Garrido, Jorge; Garrido, E. Manuela; Borges, Fernanda

    2012-01-01

    Antioxidants are additives largely used in industry for delaying, retarding, or preventing the development of oxidative deterioration. Propyl gallate (E310) is a phenolic antioxidant extensively used in the food, cosmetics, and pharmaceutical industries. A series of lab experiments have been developed to teach students about the importance and…

  15. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR CODING AND CODING VERIFICATION (HAND ENTRY) (UA-D-14.0)

    EPA Science Inventory

    The purpose of this SOP is to define the coding strategy for coding and coding verification of hand-entered data. It applies to the coding of all physical forms, especially those coded by hand. The strategy was developed for use in the Arizona NHEXAS project and the "Border" st...

  16. Analytical and experimental studies of ventilation systems subjected to simulated tornado conditions: Verification of the TVENT computer code

    SciTech Connect

    Martin, R.A.; Gregory, W.S.; Ricketts, C.I.; Smith, P.R.; Littleton, P.E.; Talbott, D.V.

    1988-04-01

    Analytical and experimental studies of ventilation systems have been conducted to verify the Los Alamos National Laboratory TVENT accident analysis computer code for simulated tornado conditions. This code was developed to be a user-friendly analysis tool for designers and regulatory personnel and was designed to predict pressure and flow transients in arbitrary ventilation systems. The experimental studies used two relatively simple, yet sensitive, physical systems designed using similitude analysis. These physical models were instrumented end-to-end for pressure and volumetric flow rate and then subjected to the worst credible tornado conditions using a special blowdown apparatus. We verified TVENT by showing that it successfully predicted our experimental results. By comparing experimental results from both physical models with TVENT results, we showed that we have derived the proper similitude relations (governed by compressibility effects) for all sizes of ventilation systems. As a by-product of our studies, we determined the need for fan speed variation modeling in TVENT. This modification was made and resulted in a significant improvement in our comparisons of analytical and experimental results.

  17. TFE Verification Program

    SciTech Connect

    Not Available

    1993-05-01

    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE (thermionic fuel element) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and early 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern.

  18. Software Verification and Validation Procedure

    SciTech Connect

    Olund, Thomas S.

    2008-09-15

    This Software Verification and Validation procedure provides the action steps for the Tank Waste Information Network System (TWINS) testing process. The primary objective of the testing process is to provide assurance that the software functions as intended, and meets the requirements specified by the client. Verification and validation establish the primary basis for TWINS software product acceptance.

  19. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the OpenSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.
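
    Self-composition proves a property of one program by reasoning about two runs of it side by side. As a loose illustration of the idea (a property-based test rather than a deductive proof), the sketch below checks a noninterference-style property: with equal public inputs, differing secrets must not change the public output. The function is hypothetical, not OpenSSL's RC4:

      import random

      def output_byte(public_nonce: int, secret_key: int) -> int:
          """Hypothetical cipher step; its error path must not leak the key."""
          if public_nonce == 0:
              return 0xFF                      # error marker, key-independent
          return (public_nonce * 31) & 0xFF

      # Self-composition: run the program "against itself" with the same
      # public input but independently chosen secrets, and compare outputs.
      for _ in range(10_000):
          nonce = random.randrange(256)
          k1, k2 = random.randrange(256), random.randrange(256)
          assert output_byte(nonce, k1) == output_byte(nonce, k2)
      print("no observable dependence on the secret detected")

    A deductive tool performs the same comparison symbolically, proving it for all inputs rather than sampling.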

  20. HDL to verification logic translator

    NASA Astrophysics Data System (ADS)

    Gambles, J. W.; Windley, P. J.

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  1. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.
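
    The core of such a translator is mechanical: each structural gate becomes a relational definition, and a netlist becomes the conjunction of its gate relations with internal wires existentially quantified. The toy sketch below emits HOL-flavored text for a half-adder; the gate set and concrete syntax are illustrative, not the tool's actual output format:

      # Toy netlist-to-HOL translation sketch (Python).
      GATE_SPECS = {
          "AND": "AND_spec",
          "OR":  "OR_spec",
          "NOT": "NOT_spec",
      }

      def translate(name, gates, internal_wires):
          """gates: list of (gate_type, [port names]); internal wires are
          hidden with existential quantifiers in the HOL definition."""
          body = " /\\ ".join(
              f"{GATE_SPECS[g]}({', '.join(ports)})" for g, ports in gates)
          quant = "".join(f"?{w}. " for w in internal_wires)
          return f"{name}_imp(a, b, sum, carry) = {quant}{body}"

      # Half-adder: carry = a AND b; sum = (a OR b) AND NOT(a AND b) = a XOR b.
      netlist = [("AND", ["a", "b", "carry"]),
                 ("OR",  ["a", "b", "w1"]),
                 ("NOT", ["carry", "w2"]),
                 ("AND", ["w1", "w2", "sum"])]
      print(translate("half_adder", netlist, ["w1", "w2"]))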

  2. A laboratory study of the perceived benefit of additional noise attenuation by houses

    NASA Technical Reports Server (NTRS)

    Flindell, I. H.

    1983-01-01

    Two experiments were conducted to investigate the perceived benefit of additional house attenuation against aircraft flyover noise. First, subjects made annoyance judgments in a simulated living room while an operative window with real and dummy storm windows was manipulated in full view of those subjects. Second, subjects made annoyance judgments, in an anechoic audiometric test chamber, of frequency-shaped noise signals having spectra closely matched to those of the aircraft flyover noises reproduced in the first experiment. These stimuli represented the aircraft flyover noises in levels and spectra but without the situational and visual cues present in the simulated living room. Perceptual constancy theory implies that annoyance tends to remain constant despite reductions in noise level caused by additional attenuation of which the subjects are fully aware. This theory was supported when account was taken of a reported annoyance overestimation for certain spectra and for a simulated condition cue overreaction.

  3. Enhanced flux pinning in MOCVD-YBCO films through Zr additions: Systematic feasibility studies

    SciTech Connect

    Aytug, Tolga; Paranthaman, Mariappan Parans; Specht, Eliot D; Kim, Kyunghoon; Zhang, Yifei; Cantoni, Claudia; Zuev, Yuri L; Goyal, Amit; Christen, David K; Maroni, Victor A.

    2009-01-01

    Systematic effects of Zr additions on the structural and flux pinning properties of YBa2Cu3O7-δ (YBCO) films deposited by metal-organic chemical vapor deposition (MOCVD) have been investigated. Detailed characterization, conducted by coordinated transport, x-ray diffraction, scanning and transmission electron microscopy analyses, and imaging Raman microscopy, has revealed trends in the resulting property/performance correlations of these films with respect to varying mole percentages (mol%) of added Zr. For compositions ≤ 7.5 mol%, Zr additions lead to improved in-field critical current density, as well as extra correlated pinning along the c-axis direction of the YBCO films via the formation of columnar, self-assembled stacks of BaZrO3 nanodots.

  4. Enhanced flux pinning in MOCVD-YBCO films through Zr additions : systematic feasibility studies.

    SciTech Connect

    Aytug, T.; Paranthaman, M.; Specht, E. D.; Zhang, Y.; Kim, K.; Zuev, Y. L.; Cantoni, C.; Goyal, A.; Christen, D. K.; Maroni, V. A.; Chen, Y.; Selvamanickam, V.; ORNL; SuperPower, Inc.

    2010-01-01

    Systematic effects of Zr additions on the structural and flux pinning properties of YBa2Cu3O7-δ (YBCO) films deposited by metal-organic chemical vapor deposition (MOCVD) have been investigated. Detailed characterization, conducted by coordinated transport, x-ray diffraction, scanning and transmission electron microscopy analyses, and imaging Raman microscopy, has revealed trends in the resulting property/performance correlations of these films with respect to varying mole percentages (mol%) of added Zr. For compositions ≤ 7.5 mol%, Zr additions lead to improved in-field critical current density, as well as extra correlated pinning along the c-axis direction of the YBCO films via the formation of columnar, self-assembled stacks of BaZrO3 nanodots.

  5. Indolyne Experimental and Computational Studies: Synthetic Applications and Origins of Selectivities of Nucleophilic Additions

    PubMed Central

    Im, G-Yoon J.; Bronner, Sarah M.; Goetz, Adam E.; Paton, Robert S.; Cheong, Paul H.-Y.; Houk, K. N.; Garg, Neil K.

    2010-01-01

    Efficient syntheses of 4,5-, 5,6-, and 6,7-indolyne precursors beginning from commercially available hydroxyindole derivatives are reported. The synthetic routes are versatile and allow access to indolyne precursors that remain unsubstituted on the pyrrole ring. Indolynes can be generated under mild fluoride-mediated conditions, trapped by a variety of nucleophilic reagents, and used to access a number of novel substituted indoles. Nucleophilic addition reactions to indolynes proceed with varying degrees of regioselectivity; distortion energies control regioselectivity and provide a simple model to predict the regioselectivity in the nucleophilic additions to indolynes and other unsymmetrical arynes. This model has led to the design of a substituted 4,5-indolyne that exhibits enhanced nucleophilic regioselectivity. PMID:21114321

  6. Structural changes in gluten protein structure after addition of emulsifier. A Raman spectroscopy study

    NASA Astrophysics Data System (ADS)

    Ferrer, Evelina G.; Gómez, Analía V.; Añón, María C.; Puppo, María C.

    2011-06-01

    Food protein product, gluten protein, was chemically modified by varying levels of sodium stearoyl lactylate (SSL), and the extent of the modifications (secondary and tertiary structures) of this protein was analyzed by using Raman spectroscopy. Analysis of the Amide I band showed an increase in its intensity, mainly after the addition of 0.25% SSL to wheat flour to produce modified gluten protein, pointing to the formation of a more ordered structure. Side chain vibrations also confirmed the observed changes.

  7. A design study for the addition of higher order parametric discrete elements to NASTRAN

    NASA Technical Reports Server (NTRS)

    Stanton, E. L.

    1972-01-01

    The addition of discrete elements to NASTRAN poses significant interface problems with the level 15.1 assembly modules and geometry modules. Potential problems in designing new modules for higher-order parametric discrete elements are reviewed in both areas. An assembly procedure is suggested that separates grid point degrees of freedom on the basis of admissibility. New geometric input data are described that facilitate the definition of surfaces in parametric space.

  8. Dosimetric Study and Verification of Total Body Irradiation Using Helical Tomotherapy and its Comparison to Extended SSD Technique

    SciTech Connect

    Zhuang, Audrey H.; Liu An; Schultheiss, Timothy E.; Wong, Jeffrey Y.C.

    2010-01-01

    The American College of Radiology practice guideline for total body irradiation (TBI) requires a back-up treatment delivery system. This study investigates the development of helical tomotherapy (HT) for delivering TBI and compares it with the conventional extended source-to-surface distance (X-SSD) technique. Four patients' head-to-thigh computed tomographic images were used in this study, with the target defined as the body volume without the left and right lungs. HT treatment plans with the standard TBI prescription (1.2 Gy/fx, 10 fractions) were generated and verified on phantoms. To compare HT plans with X-SSD treatment, the dose distribution of the X-SSD technique was simulated using the Eclipse software. The average dose received by 90% of the target volume was 12.3 Gy (range, 12.2-12.4 Gy) for HT plans and 10.3 Gy (range, 10.08-10.58 Gy) for X-SSD plans (p < 0.001). The left and right lung median doses were 5.44 Gy and 5.40 Gy, respectively, for HT plans and 8.34 Gy and 8.95 Gy, respectively, for X-SSD treatment. The treatment planning time was comparable between the two methods. The beam delivery time of HT treatment was longer than that of X-SSD treatment. In conclusion, HT-based TBI plans have better dose coverage of the target and better dose sparing of the lungs compared with the X-SSD technique, which applies dose compensators, lung blocks, and electron boosts. This study demonstrates that delivering TBI with HT is possible. Clinical validation of the feasibility of this approach would be of interest in the future.

  9. Load bearing and stiffness tailored NiTi implants produced by additive manufacturing: a simulation study

    NASA Astrophysics Data System (ADS)

    Rahmanian, Rasool; Shayesteh Moghaddam, Narges; Haberland, Christoph; Dean, David; Miller, Michael; Elahinia, Mohammad

    2014-03-01

    Common metals for stable long-term implants (e.g. stainless steel, titanium and titanium alloys) are much stiffer than spongy cancellous bone and even stiffer than cortical bone. When bone and implant are loaded, this stiffness mismatch results in stress shielding; as a consequence, degradation of the surrounding bony structure can lead to dissociation of the implant. Due to its lower stiffness and high reversible deformability, which is associated with its superelastic behavior, NiTi is an attractive biomaterial for load bearing implants. However, the stiffness of austenitic Nitinol, while closer to that of bone, is still too high. Additive manufacturing provides, in addition to the fabrication of patient-specific implants, the ability to solve the stiffness mismatch by adding engineered porosity to the implant. This in turn allows for the design of different stiffness profiles in one implant, tailored to the physiological load conditions. This work covers a fundamental approach to bring this vision to reality. First, modeling of the mechanical behavior of different scaffold designs is presented as a proof of concept of stiffness tailoring. Based on these results, different Nitinol scaffolds can be produced by additive manufacturing.
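
    As a rough feel for the porosity such stiffness tailoring requires, the sketch below applies the Gibson-Ashby scaling for cellular solids, E ≈ E_s(ρ/ρ_s)², a common first approximation for porous metals that the abstract itself does not specify; the moduli are approximate literature values:

      # Back-of-envelope porosity estimate via Gibson-Ashby scaling (assumed model).
      E_NITI = 40.0      # GPa, dense austenitic NiTi (approximate)
      E_CORTICAL = 15.0  # GPa, cortical bone (approximate)

      relative_density = (E_CORTICAL / E_NITI) ** 0.5
      print(f"relative density ~ {relative_density:.2f} "
            f"-> porosity ~ {1 - relative_density:.0%}")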

  10. Magnetic Force Microscopy Study of Zr2Co11 -Based Nanocrystalline Materials: Effect of Mo Addition

    DOE PAGES Beta

    Yue, Lanping; Jin, Yunlong; Zhang, Wenyong; Sellmyer, David J.

    2015-01-01

    The addition of molybdenum was used to modify the nanostructure and enhance the coercivity of rare-earth-free Zr2Co11-based nanocrystalline permanent magnets. The effect of Mo addition on the magnetic domain structures of melt-spun nanocrystalline Zr16Co84-xMox (x = 0, 0.5, 1, 1.5, and 2.0) ribbons has been investigated. It was found that magnetic properties and local domain structures are strongly influenced by Mo doping. The coercivity of the samples increases with increasing Mo content (x ≤ 1.5). The maximum energy product (BH)max increases with increasing x, from 0.5 MGOe for x = 0 to a maximum value of 4.2 MGOe for x = 1.5. The smallest domain size, with a relatively short magnetic correlation length of 128 nm, and the largest root-mean-square phase shift Φrms value of 0.66° are observed for the x = 1.5 sample. The optimal Mo addition promotes magnetic domain structure refinement and thus leads to a significant increase in coercivity and energy product in this sample.

  11. Arthroscopic verification of objectivity of the orthopaedic examination and magnetic resonance imaging in intra-articular knee injury. Retrospective study

    PubMed Central

    Skowronek, Michał; Skowronek, Paweł; Dutka, Łukasz

    2011-01-01

    Introduction Arthroscopy of the knee joint is regarded as the most objective diagnostic method for intra-articular knee joint lesions. Aim The purpose of this study was to assess the objectivity and diagnostic value of orthopaedic examination (OE) and magnetic resonance imaging (MRI) in reference to the arthroscopic result. Material and methods In a group of 113 patients treated by arthroscopic surgery for post-traumatic knee pathology between 2008 and 2010 in our department, the accuracy of the clinical and MRI findings that preceded surgery was studied retrospectively using statistical methods. Sensitivity, specificity, accuracy and negative and positive predictive values were the subject of analysis. Results In the presented trial, sensitivity values of the orthopaedic examination for injuries of the anterior cruciate ligament (ACL), meniscus medialis (MM), meniscus lateralis (ML) and chondral injuries (ChI) were 86%, 65%, 38% and 51%, respectively. Specificity values were 90%, 65%, 100% and 100%, respectively. The MRI sensitivity and specificity values were 80%, 88%, 44% and 32%, and 86%, 64%, 93% and 97%, respectively. Conclusions Assessment of intra-articular knee joint lesions is a difficult diagnostic problem. In making a decision about arthroscopy of the knee joint, an appropriate sequence of examinations should be carried out: OE, MRI and arthroscopy. Improving the effectiveness of the orthopaedic examination and MRI can limit the excessively high frequency of diagnostic arthroscopies, which generates operative risk and costs. PMID:23255995
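
    The metrics reported above all follow from a 2x2 confusion matrix against the arthroscopic reference standard. A minimal sketch in Python; the counts are hypothetical, not the study's data.

        def diagnostic_metrics(tp, fp, fn, tn):
            """Standard screening metrics from a 2x2 table, with arthroscopy
            taken as the reference standard."""
            return {
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
                "accuracy": (tp + tn) / (tp + fp + fn + tn),
                "ppv": tp / (tp + fp),
                "npv": tn / (tn + fn),
            }

        # Hypothetical ACL counts, for illustration only
        print(diagnostic_metrics(tp=43, fp=5, fn=7, tn=58))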

  12. Parameters and pitfalls to consider in the conduct of food additive research, Carrageenan as a case study.

    PubMed

    Weiner, Myra L

    2016-01-01

    This paper provides guidance on the conduct of new in vivo and in vitro studies on high molecular weight food additives, with carrageenan (CGN), the widely used food additive, as a case study. It is important to understand the physical/chemical properties and to verify the identity/purity, molecular weight and homogeneity/stability of the additive in the vehicle for oral delivery. The strong binding of CGN to protein in rodent chow or infant formula results in no gastrointestinal tract exposure to free CGN. It is recommended that doses of high-Mw, non-caloric, non-nutritive additives not exceed 5% by weight of total solid diet to avoid potential nutritional effects. Addition of some high-Mw additives at high concentrations to liquid nutritional supplements increases viscosity and may affect palatability, caloric intake and body weight gain. In in vitro studies, the use of well-characterized, relevant cell types and the appropriate composition of the culture media are necessary for proper conduct and interpretation. CGN is bound to media protein and not freely accessible to cells in vitro. Interpretation of new studies on food additives should consider the interaction of food additives with the vehicle components and the appropriateness of the animal or cell model and dose-response. PMID:26615870

  13. Synthesis, verification, and optimization of systolic arrays

    SciTech Connect

    Rajopadhye, S.V.

    1986-01-01

    This dissertation addresses the issue of providing a sound theoretical basis for three important issues relating to systolic arrays, namely synthesis, verification, and optimization. Former research has concentrated on analysis of the dependency structure of the computation, and there have been numerous approaches to map this dependency structure onto a locally interconnected network. This study pursues a similar approach, but with a major generalization of the class of problems analyzed. In earlier research, it was essential that the dependencies were expressible as constant vectors (from a point in the domain to the points that it depended on); here they are permitted to be arbitrary linear functions of the point. Theory for synthesizing systolic architectures from such generalized specifications is developed. Also, a systematic (mechanizable) approach to the synthesis of systolic architectures that have control signals is presented. In the areas of verification and optimization, a rigorous mathematical framework is presented that permits reasoning about the behavior of systolic arrays as functions on streams of data. Using this approach, the verification of such architectures reduces to the problem of verification of functional programs.

  14. Collaborative Localization and Location Verification in WSNs

    PubMed Central

    Miao, Chunyu; Dai, Guoyong; Ying, Kezhen; Chen, Qingzhang

    2015-01-01

    Localization is one of the most important technologies in wireless sensor networks. A lightweight distributed node localization scheme is proposed by considering the limited computational capacity of WSNs. The proposed scheme introduces the virtual force model to determine the location by incremental refinement. Aiming at solving the drifting problem and the malicious anchor problem, a location verification algorithm based on the virtual force model is presented. In addition, an anchor promotion algorithm using the localization reliability model is proposed to re-locate the drifted nodes. Extended simulation experiments indicate that the localization algorithm has relatively high precision and the location verification algorithm has relatively high accuracy. The communication overhead of these algorithms is relatively low, and the whole set of reliable localization methods is practical as well as comprehensive. PMID:25954948
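
    The virtual force idea can be sketched as iterative refinement in which each anchor "pushes" or "pulls" the node's position estimate in proportion to the mismatch between the measured range and the currently estimated distance. This is an illustrative reconstruction, not the authors' exact algorithm.

        import numpy as np

        def virtual_force_localize(anchors, ranges, x0, step=0.1, iters=200):
            """Refine a node position estimate under a simple virtual force
            model: each anchor exerts a force along the anchor-node line,
            proportional to (measured range - estimated distance)."""
            x = np.asarray(x0, dtype=float)
            for _ in range(iters):
                force = np.zeros_like(x)
                for a, r in zip(anchors, ranges):
                    d = np.linalg.norm(x - a)
                    if d > 1e-9:
                        # push away if too close, pull in if too far
                        force += (r - d) * (x - a) / d
                x += step * force
            return x

        anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
        true_pos = np.array([4.0, 3.0])
        ranges = np.linalg.norm(anchors - true_pos, axis=1)  # noise-free, for illustration
        print(virtual_force_localize(anchors, ranges, x0=[5.0, 5.0]))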

  15. Validation and verification of expert systems

    NASA Technical Reports Server (NTRS)

    Gilstrap, Lewey

    1991-01-01

    Validation and verification (V&V) are procedures used to evaluate system structure or behavior with respect to a set of requirements. Although expert systems are often developed as a series of prototypes without requirements, it is not possible to perform V&V on any system for which requirements have not been prepared. In addition, there are special problems associated with the evaluation of expert systems that do not arise in the evaluation of conventional systems, such as verification of the completeness and accuracy of the knowledge base. The criticality of most NASA missions makes it important to be able to certify the performance of the expert systems used to support these missions. Recommendations for the most appropriate method for integrating V&V into the Expert System Development Methodology (ESDM) and suggestions for the most suitable approaches for each stage of ESDM development are presented.

  16. PET-based dose delivery verification in proton therapy: a GATE based simulation study of five PET system designs in clinical conditions

    NASA Astrophysics Data System (ADS)

    Robert, Charlotte; Fourrier, Nicolas; Sarrut, David; Stute, Simon; Gueth, Pierre; Grevillot, Loïc; Buvat, Irène

    2013-10-01

    PET is a promising technique for in vivo treatment verification in hadrontherapy. Three main PET geometries dedicated to in-beam treatment monitoring have been proposed in the literature: the dual-head PET geometry, the OpenPET geometry and the slanted-closed ring geometry. The aim of this work is to characterize the performance of two of these dedicated PET detectors in realistic clinical conditions. Several configurations of the dual-head PET and OpenPET systems were simulated using GATE v6.2. For the dual-head configuration, two aperture angles (15° and 45°) were studied. For the OpenPET system, two gaps between rings were investigated (110 and 160 mm). A full-ring PET system was also simulated as a reference. After preliminary evaluation of the sensitivity and spatial resolution using a Derenzo phantom, a real small-field head and neck treatment plan was simulated, with and without introducing patient displacements. No wash-out was taken into account. 3D maps of the annihilation photon locations were deduced from the PET data acquired right after the treatment session (5 min acquisition) using a dedicated OS-EM reconstruction algorithm. Detection sensitivity at the center of the field-of-view (FOV) varied from 5.2% (45° dual-head system) to 7.0% (full-ring PET). The dual-head systems had a more uniform efficiency within the FOV than the OpenPET systems. The spatial resolution strongly depended on the location within the FOV for the ϕ = 45° dual-head system and for the two OpenPET systems. All investigated architectures identified the magnitude of the mispositionings introduced in the simulations to within 1.5 mm. The variability in the estimated mispositionings was less than 2 mm for all PET systems.

  17. Improved fluid dynamics similarity, analysis and verification. Part 5: Analytical and experimental studies of thermal stratification phenomena

    NASA Technical Reports Server (NTRS)

    Winter, E. R. F.; Schoenhals, R. J.; Haug, R. I.; Libby, T. L.; Nelson, R. N.; Stevenson, W. H.

    1968-01-01

    The stratification behavior of a contained fluid subjected to transient free convection heat transfer was studied. A rectangular vessel was employed with heat transfer from two opposite walls of the vessel to the fluid. The wall temperature was increased suddenly to initiate the process and was then maintained constant throughout the transient stratification period. Thermocouples were positioned on a post at the center of the vessel. They were adjusted so that temperatures could be measured at the fluid surface and at specific depths beneath the surface. The predicted values of the surface temperature and the stratified layer thickness were found to agree reasonably well with the experimental measurements. The experiments also provided information on the transient centerline temperature distribution and the transient flow distribution.

  18. Dosimetric study and in-vivo dose verification for conformal avoidance treatment of anal adenocarcinoma using helical tomotherapy

    SciTech Connect

    Han Chunhui (E-mail: chan@coh.org); Chen Yijen; Liu An; Schultheiss, Timothy E.; Wong, Jeffrey Y.C.

    2007-04-01

    This study evaluated the efficacy of using helical tomotherapy for conformal avoidance treatment of anal adenocarcinoma. We retrospectively generated step-and-shoot intensity-modulated radiotherapy (sIMRT) plans and helical tomotherapy plans for two anal cancer patients, one male and one female, who were treated by the sIMRT technique. Dose parameters for the planning target volume (PTV) and the organs-at-risk (OARs) were compared between the sIMRT and the helical tomotherapy plans. The helical tomotherapy plans showed better dose homogeneity in the PTV, better dose conformity around the PTV, and, therefore, better sparing of nearby OARs compared with the sIMRT plans. In-vivo skin dose measurements were performed during conformal avoidance helical tomotherapy treatment of an anal cancer patient to verify adequate delivery of skin dose and sparing of OARs.

  19. New method for detection of complex 3D fracture motion - Verification of an optical motion analysis system for biomechanical studies

    PubMed Central

    2012-01-01

    Background Fracture-healing depends on interfragmentary motion. For improved osteosynthesis and fracture-healing, the micromotion between fracture fragments is undergoing intensive research. The detection of 3D micromotions at the fracture gap still presents a challenge for conventional tactile measurement systems. Optical measurement systems may be easier to use than conventional systems, but, as yet, cannot guarantee accuracy. The purpose of this study was to validate the optical measurement system PONTOS 5M for use in biomechanical research, including measurement of micromotion. Methods A standardized transverse fracture model was created to detect interfragmentary motions under axial loadings of up to 200 N. Measurements were performed using the optical measurement system and compared with a conventional high-accuracy tactile system consisting of 3 standard digital dial indicators (1 μm resolution; 5 μm error limit). Results We found that the deviation in mean average motion detection between the systems was at most 5.3 μm, indicating that detection of micromotion was possible with the optical measurement system. Furthermore, we identified two considerable advantages of the optical measurement system: only with the optical system could interfragmentary motion be analyzed directly at the fracture gap, and the calibration of the optical system could be performed faster, more safely, and more easily than that of the tactile system. Conclusion The PONTOS 5M optical measurement system appears to be a favorable alternative to previously used tactile measurement systems for biomechanical applications. Easy handling, combined with high accuracy for 3D detection of micromotions (≤ 5 μm), suggests the likelihood of high user acceptance. This study was performed in the context of the deployment of a new implant (dynamic locking screw; Synthes, Oberdorf, Switzerland). PMID:22405047

  20. EXAMINING THE ROLE AND RESEARCH CHALLENGES OF SOCIAL MEDIA AS A TOOL FOR NONPROLIFERATION AND ARMS CONTROL TREATY VERIFICATION

    SciTech Connect

    Henry, Michael J.; Cramer, Nicholas O.; Benz, Jacob M.; Gastelum, Zoe N.; Kreyling, Sean J.; West, Curtis L.

    2014-05-13

    Traditional arms control treaty verification activities typically involve a combination of technical measurements via physical and chemical sensors, state declarations, political agreements, and on-site inspections involving international subject matter experts. However, the ubiquity of the internet, and the electronic sharing of data that it enables, has made available a wealth of open source information with the potential to benefit verification efforts. Open source information is already being used by organizations such as the International Atomic Energy Agency to support the verification of state-declared information, prepare inspectors for in-field activities, and maintain situational awareness. The recent explosion in social media use has opened new doors to exploring the attitudes, moods, and activities around a given topic. Social media platforms, such as Twitter, Facebook, and YouTube, offer an opportunity for individuals, as well as institutions, to participate in a global conversation at minimal cost. Social media data can also provide a more data-rich environment, with text data being augmented with images, videos, and location data. The research described in this paper investigates the utility of applying social media signatures as potential arms control and nonproliferation treaty verification tools and technologies, as determined through a series of case studies. The treaty-relevant events that these case studies touch upon include detection of undeclared facilities or activities, determination of unknown events recorded by the International Monitoring System (IMS), and the global media response to the occurrence of an Indian missile launch. The case studies examine how social media can be used to fill an information gap and provide additional confidence to a verification activity. The case studies represent, either directly or through a proxy, instances where social media information may be available that could potentially augment the evaluation.

  1. Early Additional Immune-Modulators for Mycoplasma pneumoniae Pneumonia in Children: An Observation Study

    PubMed Central

    Lee, Sung-Churl; Rhim, Jung-Woo; Shin, Myung-Seok; Kang, Jin-Han

    2014-01-01

    Background Mycoplasma pneumoniae (MP) pneumonia is a self-limiting disease, but some patients suffer progressive pneumonia despite appropriate antibiotic treatment. We aimed to describe the role of immune-modulator (corticosteroid and/or intravenous immunoglobulin, IVIG) treatment for childhood MP pneumonia based on our previous experience. Materials and Methods A retrospective case series analysis of 183 children with MP pneumonia was performed. MP pneumonia patients were diagnosed by two Immunoglobulin M (IgM) tests, the micro-particle agglutination method (≥1:40) and the cold agglutination test (≥1:4), performed twice: at initial admission and at discharge. Among the 183 MP pneumonia patients, 90 patients with persistent fever for over 48 hours after admission or with severe respiratory symptoms and signs received additional prednisolone (82 patients, 1 mg/kg/day) or intravenous methylprednisolone (8 patients, 5-10 mg/kg/day) with antibiotics. Four patients with aggravated clinical symptoms and chest radiographic findings after corticosteroid treatment received IVIG (1 g/kg/day, 1-2 doses). Results The mean age of the 183 patients was 5.5 ± 3.2 years (6 months-15 years), and the male:female ratio was 1.1:1 (96:87). Fifty-seven patients (31%) were seroconverters, and 126 seropositive patients showed increased diagnostic IgM antibody titres (over fourfold) during admission. The majority of the patients who received corticosteroids (86/90 cases) showed rapid defervescence within 48 hours with improved clinical symptoms, regardless of the antibiotics used. Also, the 4 patients who received additional IVIG improved both clinically and radiographically within 2 days without adverse reactions. Conclusions In the era of macrolide-resistant MP strains, early additional immune-modulator therapy with antibiotics might prevent disease progression and reduce disease morbidity without adverse reactions. PMID:25566403

  2. Strategic Petroleum Reserve (SPR) additional geologic site characterization studies, Bryan Mound Salt Dome, Texas

    SciTech Connect

    Neal, J.T.; Magorian, T.R.; Ahmad, S.

    1994-11-01

    This report revises the original report that was published in 1980. Some of the topics covered in the earlier report were provisional, and it is now practicable to reexamine them using new or revised geotechnical data and data obtained from SPR cavern operations, which involve 16 new caverns. Revised structure maps and sections show interpretative differences compared with the 1980 report and more definition in the dome shape and caprock structural contours, especially a major southeast-northwest trending anomalous zone. The original interpretation was of a westward tilt of the dome; this revision shows a tilt to the southeast, consistent with other gravity and seismic data. This interpretation refines the evaluation of additional cavern space by adding more salt buffer and allowing several more caverns. Additional storage space is constrained on this nearly full dome because of low-lying peripheral wetlands, but 60 MMBBL or more of additional volume could be gained in six or more new caverns. Subsidence values at Bryan Mound are among the lowest in the SPR system, averaging about 11 mm/yr (0.4 in/yr), but measurement and interpretation issues persist, as observed values are about the same as survey measurement accuracy. Periodic flooding is a continuing threat because of the coastal proximity and because peripheral portions of the site are at elevations of less than 15 ft. This threat may increase slightly as future subsidence lowers the surface, but the amount is apt to be small. Caprock integrity may be affected by structural features, especially the faulting associated with anomalous zones. Injection wells have not been used extensively at Bryan Mound, but could be a practicable solution to future brine disposal needs. Environmental issues center on the areas below 15 feet above mean sea level: the coastal proximity and lowland environment, combined with the potential for flooding, create conditions that require continuing surveillance.

  3. A near-infrared spectroscopic study of young field ultracool dwarfs: additional analysis

    NASA Astrophysics Data System (ADS)

    Allers, K. N.; Liu, M. C.

    We present additional analysis of the classification system presented in Allers & Liu (2013). We refer the reader to Allers & Liu (2013) for a detailed discussion of our near-IR spectral type and gravity classification system. Here, we address questions and comments from participants of the Brown Dwarfs Come of Age meeting. In particular, we examine the effects of binarity and metallicity on our classification system. We also present our classification of Pleiades brown dwarfs using published spectra. Lastly, we determine SpTs and calculate gravity-sensitive indices for the BT-Settl atmospheric models and compare them to observations.

  4. A simulation study of a dual-plate in-room PET system for dose verification in carbon ion therapy

    NASA Astrophysics Data System (ADS)

    Chen, Ze; Hu, Zheng-Guo; Chen, Jin-Da; Zhang, Xiu-Ling; Guo, Zhong-Yan; Xiao, Guo-Qing; Sun, Zhi-Yu; Huang, Wen-Xue; Wang, Jian-Song

    2014-08-01

    During carbon ion therapy, many positron emitters such as 11C, 15O and 10C are generated in irradiated tissues by nuclear reactions, and can be used to track the carbon beam in the tissue with a positron emission tomography (PET) scanner. In this study, a dual-plate in-room PET scanner has been designed and evaluated based on the GATE simulation platform to monitor patient dose in carbon ion therapy. The dual-plate PET is designed to avoid interference with the carbon beamline and with patient positioning. Its performance was compared with that of four-head and full-ring PET scanners. The dual-plate, four-head and full-ring PET scanners consisted of 30, 60, and 60 detector modules, respectively, with a 36 cm distance between directly opposite detector modules for dose deposition measurements. Each detector module consisted of a 24×24 array of 2 mm×2 mm×18 mm LYSO pixels coupled to a Hamamatsu H8500 PMT. To estimate the production yield of positron emitters, a 10 cm×15 cm×15 cm cuboid PMMA phantom was irradiated with 172, 200, and 250 MeV/u 12C beams. 3D images of the activity distribution measured by the three types of scanner are produced by an iterative reconstruction algorithm. Comparison of the longitudinal profiles of positron emitters along the carbon beam path indicates that the dual-plate PET scanner is feasible for monitoring the dose distribution in carbon ion therapy.

  5. Real time bolt preload monitoring using piezoceramic transducers and time reversal technique—a numerical study with experimental verification

    NASA Astrophysics Data System (ADS)

    Parvasi, Seyed Mohammad; Ho, Siu Chun Michael; Kong, Qingzhao; Mousavi, Reza; Song, Gangbing

    2016-08-01

    Bolted joints are ubiquitous structural elements, and form critical connections in mechanical and civil structures. As such, loosened bolted joints may lead to catastrophic failures of these structures, thus inspiring a growing interest in monitoring of bolted joints. A novel energy based wave method is proposed in this study to monitor the axial load of bolted joint connections. In this method, the time reversal technique was used to focus the energy of a piezoelectric (PZT)-generated ultrasound wave from one side of the interface to be measured as a signal peak by another PZT transducer on the other side of the interface. A tightness index (TI) was defined and used to correlate the peak amplitude to the bolt axial load. The TI bypasses the need for more complex signal processing required in other energy-based methods. A coupled, electro-mechanical analysis with elasto-plastic finite element method was used to simulate and analyze the PZT based ultrasonic wave propagation through the interface of two steel plates connected by a single nut and bolt connection. Numerical results, backed by experimental results from testing on a bolted connection between two steel plates, revealed that the peak amplitude of the focused signal increases as the bolt preload (torque level) increases due to the enlarging true contact area of the steel plates. The amplitude of the focused peak saturates and the TI reaches unity as the bolt axial load reaches a threshold value. These conditions are associated with the maximum possible true contact area between the surfaces of the bolted connection.
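
    The tightness index described above can be sketched as the focused-signal peak normalized by its value at full preload, where the true contact area (and hence the transmitted energy) saturates. This is a schematic reconstruction under that assumption, not the authors' published definition.

        import numpy as np

        def tightness_index(focused_signal, saturation_peak):
            """TI in [0, 1]: peak amplitude of the time-reversal focused
            signal, normalized by the peak at (or beyond) saturation preload."""
            peak = np.max(np.abs(focused_signal))
            return min(peak / saturation_peak, 1.0)

        # Illustrative focused pulse whose amplitude grows with contact area
        t = np.linspace(0.0, 1e-3, 2000)
        focused = 0.8 * np.exp(-((t - 5e-4) / 2e-5) ** 2)  # hypothetical waveform
        print(f"TI = {tightness_index(focused, saturation_peak=1.0):.2f}")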

  6. Study and verification of the superposition method used for determining the pressure losses of the heat exchangers

    NASA Astrophysics Data System (ADS)

    Petru, Michal; Kulhavy, Petr; Srb, Pavel; Rachitsky, Gary

    2015-05-01

    This paper deals with the study of the pressure losses of a new product line of heat convectors. For all devices connected to the heating circuit of a building, tabulated values of pressure drops must be declared. The heat exchangers are manufactured in many different dimensions and atypical shapes, and an individual assessment of the pressure losses for each type is very time consuming. Therefore, based on the data from experiments and numerical models, an electronic database was created that can be used to calculate the total pressure losses of an optionally assembled exchanger. The measurements are routinely performed in the hydrodynamic laboratory of the manufacturer, Licon heat, and the numerical models are carried out in COMSOL Multiphysics. Different variations of the convector geometry cause a non-linear progression of energy losses, which is proportionately about 30% larger for the smaller exchangers than for the larger types. The results of the experiments and the numerical simulations were in very good agreement. A considerable influence of the water temperature on the total energy losses has been demonstrated; this is mainly caused by the different ranges of the Reynolds number, which depends on the viscosity of the working liquid. Concerning the tested superposition method, it is not easy to find characteristic values appropriate for each individual component of the heat exchanger, since every component behaves differently depending on the complexity of the exchanger. However, a correction coefficient, dependent on the matrix of the exchanger, that is suitable for the entire range of the developed product line has been found.
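
    In a superposition approach of this kind, the total pressure drop of an assembled exchanger is approximated as the sum of component losses scaled by a matrix-dependent correction coefficient. The sketch below is a schematic reading of that idea; the function name, the component breakdown, and the form of the correction are assumptions, not the paper's exact formulation.

        def total_pressure_loss(component_losses_pa, correction=1.0):
            """Superposition estimate: dp_total ~ k * sum(dp_i), where k is an
            empirical correction coefficient tied to the exchanger matrix."""
            return correction * sum(component_losses_pa)

        # Illustrative component losses (Pa): inlet, tube bank, bends, outlet
        print(total_pressure_loss([120.0, 850.0, 240.0, 90.0], correction=1.12))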

  7. Automated radiotherapy treatment plan integrity verification

    SciTech Connect

    Yang Deshan; Moore, Kevin L.

    2012-03-15

    Purpose: In our clinic, physicists spend from 15 to 60 min to verify the physical and dosimetric integrity of radiotherapy plans before presentation to radiation oncology physicians for approval. The purpose of this study was to design and implement a framework to automate as many elements of this quality control (QC) step as possible. Methods: A comprehensive computer application was developed to carry out a majority of these verification tasks in the Philips PINNACLE treatment planning system (TPS). This QC tool functions based on both PINNACLE scripting elements and Perl subroutines. The core of this technique is the method of dynamic scripting, which involves a Perl programming module that is flexible and powerful for treatment plan data handling. Run-time plan data are collected, saved into temporary files, and analyzed against standard values and predefined logical rules. The results are summarized in a hypertext markup language (HTML) report that is displayed to the user. Results: This tool has been in clinical use for over a year. The occurrence frequency of technical problems, which would cause delays and suboptimal plans, has been reduced since clinical implementation. Conclusions: In addition to drastically reducing the set of human-driven logical comparisons, this QC tool also accomplished some tasks that are otherwise either quite laborious or impractical for humans to verify, e.g., identifying conflicts amongst IMRT optimization objectives.
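
    The core of such a tool is a set of machine-checkable rules evaluated against run-time plan data. The original implementation used Pinnacle scripting and Perl; the sketch below is a hypothetical Python analogue, with invented rule names and thresholds, meant only to show the pattern of logical checks feeding a report.

        def check_plan(plan):
            """Evaluate simple logical rules against extracted plan data and
            return (rule, passed, message) tuples for a report."""
            rules = [
                ("prescription dose set", plan.get("rx_dose_gy", 0) > 0,
                 "Rx dose must be positive"),
                ("dose grid fine enough", plan.get("dose_grid_mm", 99) <= 3.0,
                 "Dose grid should be <= 3 mm"),
                ("couch not in beam", plan.get("couch_in_beam") is False,
                 "Couch structure overlaps a beam"),
            ]
            return [(name, ok, "" if ok else msg) for name, ok, msg in rules]

        plan = {"rx_dose_gy": 60.0, "dose_grid_mm": 4.0, "couch_in_beam": False}
        for name, ok, msg in check_plan(plan):
            print(f"[{'PASS' if ok else 'FAIL'}] {name} {msg}")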

  8. The influence of bioaugmentation and biosurfactant addition on bioremediation efficiency of diesel-oil contaminated soil: feasibility during field studies.

    PubMed

    Szulc, Alicja; Ambrożewicz, Damian; Sydow, Mateusz; Ławniczak, Łukasz; Piotrowska-Cyplik, Agnieszka; Marecik, Roman; Chrzanowski, Łukasz

    2014-01-01

    The study focused on assessing the influence of bioaugmentation and addition of rhamnolipids on diesel oil biodegradation efficiency during field studies. Initial laboratory studies (measurement of emitted CO2 and dehydrogenase activity) were carried out in order to select the consortium for bioaugmentation as well as to evaluate the most appropriate concentration of rhamnolipids. The selected consortium consisted of the following bacterial taxa: Aeromonas hydrophila, Alcaligenes xylosoxidans, Gordonia sp., Pseudomonas fluorescens, Pseudomonas putida, Rhodococcus equi, Stenotrophomonas maltophilia, Xanthomonas sp. It was established that the application of rhamnolipids at 150 mg/kg of soil was most appropriate in terms of dehydrogenase activity. Based on the obtained results, four treatment methods were designed and tested during 365 days of field studies: I) natural attenuation; II) addition of rhamnolipids; III) bioaugmentation; IV) bioaugmentation and addition of rhamnolipids. It was observed that bioaugmentation contributed to the highest diesel oil biodegradation efficiency, whereas the addition of rhamnolipids did not notably influence the treatment process. PMID:24291585

  9. Kaolinite flocculation induced by smectite addition - a transmission X-ray microscopic study.

    PubMed

    Zbik, Marek S; Song, Yen-Fang; Frost, Ray L

    2010-09-01

    The influence of smectite addition on kaolinite suspensions in water was investigated by transmission X-ray microscopy (TXM) and scanning electron microscopy (SEM). Sedimentation test screening was also conducted. Micrographs were processed by the STatistic IMage Analysing (STIMAN) program and structural parameters were calculated. From the results of the sedimentation tests, an important influence of small smectite additions (up to about 3 wt.%) on kaolinite suspension flocculation was found. In order to determine the reason for this smectite impact on the macroscopic behaviour of kaolinite suspensions, micro-structural examination using the transmission X-ray microscope (TXM) and SEM was undertaken. TXM and SEM micrographs of freeze-dried kaolinite-smectite suspensions with up to 20% smectite showed the highest degree of fabric orientation, made of highly oriented particles, and the greatest density when 3 wt.% of smectite was added to the 10 wt.% dense kaolinite suspension. In contrast, suspensions containing pure kaolinite do not show such mutual platelet orientation but a homogeneous network of randomly oriented kaolinite platelets. This suggests that in kaolinite-smectite suspensions, smectite forms a highly oriented basic framework into which kaolinite platelets may bond in preferential face-to-face contacts, strengthening the structure and allowing plastic behaviour, which is the cause of platelet orientation. PMID:20621806

  10. Professional Competence Development of the Social Work Specialists in the Period of Study in the System of Additional Education

    ERIC Educational Resources Information Center

    Davletkaliev, Denis Kuanyshevich; Zueva, Natalia Konstantinovna; Lebedeva, Natalya Vasilevna; Mkrtumova, Irina Vladimirovna; Timofeeva, Olga

    2015-01-01

    The goal of this work is the study of psychological-pedagogical approaches to the understanding of the idea of professional competence of social work specialists, as well as the role of study in the system of additional education in the professional-personal development of the listeners. In the process of studying this problem we define main…

  11. Preliminary study of neutron absorption by concrete with boron carbide addition

    SciTech Connect

    Abdullah, Yusof; Yusof, Mohd Reusmaazran; Zali, Nurazila Mat; Ahmad, Megat Harun Al Rashid Megat; Yazid, Hafizal; Ariffin, Fatin Nabilah Tajul; Ahmad, Sahrim; Hamid, Roszilah; Mohamed, Abdul Aziz

    2014-02-12

    Concrete has become a conventional material in the construction of nuclear reactors due to properties such as safety and low cost. Boron carbide was added as an additive in the concrete construction as it has a good neutron absorption property. Concrete samples were produced with different weight percentages of boron carbide powder. The neutron absorption rate of these samples was determined by using a fast neutron source of Americium-241/Be (Am-Be 241) and detection with a portable backscattering neutron detector. Concrete with 20 wt% of boron carbide shows the lowest count of transmitted neutrons, indicating that the most neutrons have been absorbed by the concrete. Higher boron carbide content may affect the concrete strength and other properties.
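
    The transmitted-count comparison can be read through the standard narrow-beam attenuation law, I/I0 = exp(-Σt), where Σ is the macroscopic removal cross-section of the shield and t its thickness. A hedged sketch; the Σ values below are hypothetical, not the paper's measurements.

        import math

        def transmitted_fraction(sigma_per_cm, thickness_cm):
            """Narrow-beam attenuation: I/I0 = exp(-Sigma * t)."""
            return math.exp(-sigma_per_cm * thickness_cm)

        # Illustrative: a larger removal cross-section (e.g. from added B4C)
        # lowers the transmitted fraction through a 20 cm slab.
        for sigma in (0.09, 0.12, 0.15):  # hypothetical values, 1/cm
            print(sigma, transmitted_fraction(sigma, thickness_cm=20.0))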

  12. A study of the effects of an additional sound source on RASS performance

    SciTech Connect

    Coulter, R.L.

    1998-12-31

    The Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) site of the Atmospheric Radiation Measurement (ARM) Program continuously operates a nine-panel 915 MHz wind profiler with a Radio Acoustic Sounding System (RASS), measuring wind profiles for 50 minutes and virtual temperature profiles for the remaining 10 minutes of each hour. It is well recognized that one of the principal limits on RASS performance is high horizontal wind speed, which moves the acoustic wave front sufficiently to prevent the microwave energy produced by the radar and scattered from the acoustic wave from being reflected back to the radar antenna. With this limitation in mind, the ARM program purchased an additional, portable acoustic source that could be mounted on a small trailer and placed in strategic locations to enhance the RASS performance (when it was not being used for spare parts). A test of the resulting improvement in RASS performance was performed during the period 1995-1997.

  13. Additive Manufacturing of a Microbial Fuel Cell—A detailed study

    PubMed Central

    Calignano, Flaviana; Tommasi, Tonia; Manfredi, Diego; Chiolerio, Alessandro

    2015-01-01

    In contemporary society we observe an ever-growing permeation of electronic devices: smartphones, portable computing tools. The tiniest living organisms on Earth could become the key to addressing this challenge: energy generation by bacterial processes from renewable stocks/waste through devices such as microbial fuel cells (MFCs). However, the application of this solution has been limited by a moderately low efficiency. We explored the limits, if any, of additive manufacturing (AM) technology to fabricate a fully AM-based powering device, exploiting low density, open porosities able to host the microbes, and systems easy to fuel continuously and to run safely. We obtained an optimal energy recovery close to 3 kWh m−3 per day that can power sensors and low-power appliances, allowing data processing and transmission from remote/harsh environments. PMID:26611142

  14. THERMODYNAMIC STUDY OF THE NICKEL ADDITION IN ZINC HOT-DIP GALVANIZING BATHS

    SciTech Connect

    Pistofidis, N.; Vourlias, G.

    2010-01-21

    A usual practice during zinc hot-dip galvanizing is the addition of nickel to the liquid zinc, which is used to inhibit the Sandelin effect. Its action is due to the fact that the zeta (ζ) phase of the Fe-Zn system is replaced by the tau (τ) phase of the Fe-Zn-Ni system. In the present work an attempt is made to explain the formation of the τ phase thermodynamically. For this reason the Gibbs free energy changes for the τ and ζ phases were calculated. The excess free energy for the system was calculated with the Redlich-Kister polynomial. From this calculation it was deduced that the Gibbs energy change for the τ phase is negative. As a result its formation is spontaneous.
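
    For reference, the Redlich-Kister expansion mentioned above expresses the excess Gibbs energy of a binary solution in the standard form (the order of the expansion used in the paper is not stated):

        G^{E} = x_A \, x_B \sum_{k=0}^{n} L_k \, (x_A - x_B)^{k}

    where x_A and x_B are the mole fractions and the L_k are (possibly temperature-dependent) binary interaction parameters.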

  15. Preliminary study of neutron absorption by concrete with boron carbide addition

    NASA Astrophysics Data System (ADS)

    Abdullah, Yusof; Ariffin, Fatin Nabilah Tajul; Hamid, Roszilah; Yusof, Mohd Reusmaazran; Zali, Nurazila Mat; Ahmad, Megat Harun Al Rashid Megat; Yazid, Hafizal; Ahmad, Sahrim; Mohamed, Abdul Aziz

    2014-02-01

    Concrete has become a conventional material in the construction of nuclear reactors due to properties such as safety and low cost. Boron carbide was added as an additive in the concrete construction as it has a good neutron absorption property. Concrete samples were produced with different weight percentages of boron carbide powder. The neutron absorption rate of these samples was determined by using a fast neutron source of Americium-241/Be (Am-Be 241) and detection with a portable backscattering neutron detector. Concrete with 20 wt% of boron carbide shows the lowest count of transmitted neutrons, indicating that the most neutrons have been absorbed by the concrete. Higher boron carbide content may affect the concrete strength and other properties.

  16. Sulphur diffusion in β-NiAl and effect of Pt additive: an ab initio study

    NASA Astrophysics Data System (ADS)

    Chen, Kuiying

    2016-02-01

    Diffusivities of the detrimental impurity sulfur (S) in stoichiometric and Pt-doped β-NiAl were evaluated using density functional theory calculations. The apparent activation energy and the pre-exponential factor of the diffusivity via next-nearest-neighbour (NNN) and interstitial jumps were evaluated to identify the preferred diffusion mechanism(s). By calculating the electron localization function (ELF), the bonding characteristics of S with its surrounding atoms were assessed for the diffusion process. By comparison with experimental results, S diffusion through the NNN vacancy-mediated mechanism is found to be favoured. Addition of Pt to β-NiAl was found to significantly reduce the S diffusivity, and an associated electronic effect was explored. The elucidation of the above mechanisms may shed light on the development of new Pt-doped β-NiAl bond coats that can extend the life of oxidation-resistant and thermal barrier coatings.

  17. Additive Manufacturing of a Microbial Fuel Cell--A detailed study.

    PubMed

    Calignano, Flaviana; Tommasi, Tonia; Manfredi, Diego; Chiolerio, Alessandro

    2015-01-01

    In contemporary society we observe an ever-growing permeation of electronic devices: smartphones, portable computing tools. The tiniest living organisms on Earth could become the key to addressing this challenge: energy generation by bacterial processes from renewable stocks/waste through devices such as microbial fuel cells (MFCs). However, the application of this solution has been limited by a moderately low efficiency. We explored the limits, if any, of additive manufacturing (AM) technology to fabricate a fully AM-based powering device, exploiting low density, open porosities able to host the microbes, and systems easy to fuel continuously and to run safely. We obtained an optimal energy recovery close to 3 kWh m(-3) per day that can power sensors and low-power appliances, allowing data processing and transmission from remote/harsh environments. PMID:26611142

  18. Additive Manufacturing of a Microbial Fuel Cell—A detailed study

    NASA Astrophysics Data System (ADS)

    Calignano, Flaviana; Tommasi, Tonia; Manfredi, Diego; Chiolerio, Alessandro

    2015-11-01

    In contemporary society we observe an ever-growing permeation of electronic devices: smartphones, portable computing tools. The tiniest living organisms on Earth could become the key to addressing this challenge: energy generation by bacterial processes from renewable stocks/waste through devices such as microbial fuel cells (MFCs). However, the application of this solution has been limited by a moderately low efficiency. We explored the limits, if any, of additive manufacturing (AM) technology to fabricate a fully AM-based powering device, exploiting low density, open porosities able to host the microbes, and systems easy to fuel continuously and to run safely. We obtained an optimal energy recovery close to 3 kWh m-3 per day that can power sensors and low-power appliances, allowing data processing and transmission from remote/harsh environments.

  19. Verification of VENTSAR

    SciTech Connect

    Simpkins, A.A.

    1995-01-01

    The VENTSAR code is an upgraded and improved version of the VENTX code, which estimates concentrations on or near a building from a release at a nearby location. The code calculates the concentrations either for a given meteorological exceedance probability or for a given stability and wind speed combination. A single building can be modeled which lies in the path of the plume, or a penthouse can be added to the top of the building. Plume rise may also be considered. Release types can be either chemical or radioactive. Downwind concentrations are determined at user-specified incremental distances. This verification report was prepared to demonstrate that VENTSAR is properly executing all algorithms and transferring data. Hand calculations were also performed to ensure proper application of methodologies.
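
    For background, the kind of estimate VENTSAR produces can be illustrated with the textbook Gaussian plume formula for the ground-level centerline concentration of a ground release. This is the generic formula, not VENTSAR's actual algorithm, and the dispersion coefficients below are fixed illustrative numbers rather than stability-class curves.

        import math

        def gaussian_plume_centerline(q_g_per_s, u_m_per_s, sigma_y_m, sigma_z_m):
            """Ground-level centerline concentration (g/m^3) for a ground
            release with full ground reflection: C = Q / (pi * u * sy * sz)."""
            return q_g_per_s / (math.pi * u_m_per_s * sigma_y_m * sigma_z_m)

        # Illustrative numbers only
        print(gaussian_plume_centerline(q_g_per_s=1.0, u_m_per_s=3.0,
                                        sigma_y_m=25.0, sigma_z_m=12.0))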

  20. Sensitization to Food Additives in Patients with Allergy: A Study Based on Skin Test and Open Oral Challenge.

    PubMed

    Moghtaderi, Mozhgan; Hejrati, Zinatosadat; Dehghani, Zahra; Dehghani, Faranak; Kolahi, Niloofar

    2016-06-01

    There has been a great increase in the consumption of various food additives in recent years. The purpose of this study was to identify the incidence of sensitization to food additives by using the skin prick test in patients with allergy and to determine the concordance rate between positive skin tests and oral challenge in hypersensitivity to additives. This cross-sectional study included 125 patients (71 female, 54 male) aged 2-76 years with allergy and 100 healthy individuals. Skin tests were performed in both the patient and control groups with 25 fresh food additives. Among patients with allergy, 22.4% showed a positive skin test to at least one of the applied materials. The skin test was negative to all tested food additives in the control group. Oral food challenge was done in 28 patients with a positive skin test, of whom 9 showed a reaction to the culprit additive (concordance rate = 32.1%). The present study suggested that about one-third of allergic patients with a positive reaction to food additives showed a positive oral challenge; this points to the potential utility of the skin test to identify the role of food additives in patients with allergy. PMID:27424134

  1. Spectroscopic Evidence for Covalent Binding of Sulfadiazine to Natural Soils via 1,4-nucleophilic addition (Michael Type Addition) studied by Spin Labeling ESR

    NASA Astrophysics Data System (ADS)

    Aleksandrova, Olga

    2015-04-01

    Among different classes of veterinary pharmaceuticals, Sulfadiazine (SDZ) is widely used in animal husbandry. Its residues were detected in different environmental compartments. However, soil is a hot spot for SDZ, as it receives a large portion of excreted compounds through the application of manure during soil fertilization. Ample studies on the fate of SDZ in soils showed that a large portion forms nonextractable residues (NER) along with transformation products, with low mineralization (Mueller et al., 2013). A common observation was an initially fast formation of NER, up to 10% of the applied amount promptly after the application of SDZ to soil, and this portion increased up to 50% within a few days (Mueller et al., 2013; Nowak et al., 2011). A common finding for SDZ, as for other sulfonamides, was biphasic kinetics of the formation of NER, which was attributed to the occurrence of two reaction processes: a rapid, often reversible process and a slower, irreversible process (Weber et al., 1996). A single-phase reaction process was also established under anaerobic treatment (Gulkowska et al., 2014). A major focus of this work is to elucidate the reaction mechanism of covalent binding of SDZ to soil, which is currently required to estimate the risk posed to human health by NER formed by SDZ in soils. Taking into account the key role of the amine functional groups of SDZ in its reactivity in soil, soil samples were labeled with nitroxide radicals bearing attached aromatic or aliphatic amines and then investigated by means of ESR spectroscopy. 2,5,5-Trimethyl-2-(3-aminophenyl)pyrrolidin-1-yloxy and 4-amino-2,2,6,6-tetramethylpiperidin-1-oxyl modeled decomposition products of SDZ with the aromatic and aliphatic amines, respectively. The application of the defined combination of both spin labels (SL) to different soils simulated well the change in the paramagnetic signal of soil organic radicals interacting with SDZ. After their application to soil, SL were found in soil sites characterized

  3. Te Rita Papesch: Case Study of an Exemplary Learner of Maori as an Additional Language

    ERIC Educational Resources Information Center

    Ratima, Matiu Tai; Papesch, Te Rita

    2014-01-01

    This paper presents a case study of the life experiences of one exemplar adult second language Maori learner--Te Rita Papesch. Te Rita was one of 17 participants who were interviewed as a part of the first author's PhD study which sought to answer the question: what factors lead to the development of proficiency in te reo Maori amongst adult…

  4. Study of NiO cathode modified by ZnO additive for MCFC

    NASA Astrophysics Data System (ADS)

    Huang, Bo; Li, Fei; Yu, Qing-chun; Chen, Gang; Zhao, Bin-yuan; Hu, Ke-ao

    The preparation and subsequent oxidation of nickel cathodes modified by impregnation with zinc oxide (ZnO) were evaluated by surface and bulk analysis. The electrochemical behavior of ZnO-impregnated NiO cathodes was also evaluated in a molten 62 mol% Li2CO3 + 38 mol% K2CO3 eutectic at 650 °C by electrochemical impedance spectroscopy (EIS) as a function of ZnO content and immersion time. The ZnO-impregnated nickel cathodes showed porosity, pore size distribution and morphology similar to those of the reference nickel cathode. The stability tests of ZnO-impregnated NiO cathodes showed that the ZnO additive could dramatically reduce the solubility of NiO in a eutectic carbonate mixture under the standard cathode gas condition. The impedance spectra for the cathode materials show important variations during the 100 h of immersion. The incorporation of lithium into the structure and the low dissolution of nickel oxide and zinc oxide are responsible for these changes. After that, the structure reaches a stable state. The cathode material with 2 mol% of ZnO showed a very low dissolution and a good catalytic efficiency close to the NiO value. We conclude that 2 mol% ZnO/NiO materials could be adopted as alternative cathode materials for MCFCs.

  5. Generalized linear and generalized additive models in studies of species distributions: Setting the scene

    USGS Publications Warehouse

    Guisan, A.; Edwards, T.C., Jr.; Hastie, T.

    2002-01-01

    An important statistical development of the last 30 years has been the advance in regression analysis provided by generalized linear models (GLMs) and generalized additive models (GAMs). Here we introduce a series of papers prepared within the framework of an international workshop entitled: Advances in GLMs/GAMs modeling: from species distribution to environmental management, held in Riederalp, Switzerland, 6-11 August 2001. We first discuss some general uses of statistical models in ecology, as well as provide a short review of several key examples of the use of GLMs and GAMs in ecological modeling efforts. We next present an overview of GLMs and GAMs, and discuss some of their related statistics used for predictor selection, model diagnostics, and evaluation. Included is a discussion of several new approaches applicable to GLMs and GAMs, such as ridge regression, an alternative to stepwise selection of predictors, and methods for the identification of interactions by a combined use of regression trees and several other approaches. We close with an overview of the papers and how we feel they advance our understanding of their application to ecological modeling. ?? 2002 Elsevier Science B.V. All rights reserved.
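
    As a concrete illustration of the GLM family discussed here, a logistic-link model of species presence/absence against a single environmental predictor can be fitted in a few lines. The sketch uses the statsmodels library and synthetic data; it is not drawn from the workshop papers.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        elevation = rng.uniform(200, 2200, 500)                 # synthetic predictor
        p_true = 1 / (1 + np.exp(-(0.004 * elevation - 4.0)))   # assumed response curve
        presence = rng.binomial(1, p_true)                      # synthetic presence/absence

        X = sm.add_constant(elevation)                          # intercept + predictor
        glm = sm.GLM(presence, X, family=sm.families.Binomial()).fit()
        print(glm.summary())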

  6. Numerical study of the effect of water addition on gas explosion.

    PubMed

    Liang, Yuntao; Zeng, Wen

    2010-02-15

    By amending the SENKIN code of the CHEMKIN III chemical kinetics package, a computational model of gas explosion in a constant-volume bomb was built, and the detailed reaction mechanism (GRI-Mech 3.0) was adopted. The mole fraction profiles of the reactants, selected free radicals, and catastrophic gases in the process of gas explosion were analyzed with this model. Furthermore, through sensitivity analysis of the reaction mechanism of gas explosion, the dominant reactions that affect gas explosion and the formation of catastrophic gases were identified. At the same time, the mechanisms by which water inhibits gas explosion and the formation of catastrophic gases were analyzed. The results show that the induced explosion time is prolonged, and the mole fractions of reactant species such as CH(4) and O(2) and catastrophic gases such as CO, CO(2) and NO are decreased, as water is added to the gas mixture. With increasing water fraction in the gas mixture, the sensitivities of the dominant reactions contributing to CH(4) and CO(2) are decreased, and the sensitivity coefficients of the CH(4), CO and NO mole fractions are also decreased. The inhibition of gas explosion by water addition can be ascribed to the significant decrease of H, O and OH in the process of gas explosion due to the presence of water. PMID:19811873
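
    The sensitivity analysis referred to above is conventionally based on normalized first-order sensitivity coefficients; a standard definition (not specific to this paper) is

        S_i = \frac{k_i}{y} \frac{\partial y}{\partial k_i}

    where y is a model output (e.g., a species mole fraction) and k_i is the rate constant of the i-th reaction.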

  7. Chromosome studies in the aquatic monocots of Myanmar: A brief review with additional records

    PubMed Central

    2014-01-01

    Abstract Myanmar (Burma) constitutes a significant component of the Indo-Myanmar biodiversity hotspot, with elements of the Indian, the Indochinese, and the Sino-Japanese floristic regions, yet thus far only a few reliable sources on the country's flora have been available. As part of a contribution to the floristic inventory of Myanmar, and since it is important in a floristic survey to obtain as much information as possible, here we present, in addition to our two previous reports, three more chromosome counts in the aquatic monocots of Myanmar: Limnocharis flava with 2n = 20, Sagittaria trifolia with 2n = 22 (Alismataceae), and Potamogeton distinctus × Potamogeton nodosus with 2n = 52 (Potamogetonaceae); the third one is new to science. A brief review is given of cytological research on the floristic regions' 45 non-hybrid aquatic monocots, plus two well-investigated inter-specific hybrids, that are recorded in Myanmar, indicating that further work focusing on species in Myanmar that show infra-specific chromosome variation across the floristic regions will address the precise evolutionary history of the aquatic flora of Myanmar. PMID:24891826

  8. Exploratory studies of extended storage of apheresis platelets in a platelet additive solution (PAS).

    PubMed

    Slichter, Sherrill J; Corson, Jill; Jones, Mary Kay; Christoffel, Todd; Pellham, Esther; Bailey, S Lawrence; Bolgiano, Doug

    2014-01-01

    To evaluate the poststorage viability of apheresis platelets stored for up to 18 days in 80% platelet additive solution (PAS)/20% plasma, 117 healthy subjects donated platelets using the Haemonetics MCS+, COBE Spectra (Spectra), or Trima Accel (Trima) systems. Control platelets from the same subjects were compared with their stored test PAS platelets by radiolabeling their stored and control platelets with either (51)chromium or (111)indium. Trima platelets met Food and Drug Administration poststorage platelet viability criteria for only 7 days vs almost 13 days for Haemonetics platelets; i.e., platelet recoveries after these storage times averaged 44 ± 3% vs 49 ± 3%, and survivals were 5.4 ± 0.3 vs 4.6 ± 0.3 days, respectively. The differences in storage duration are likely related to both the collection system and the storage bag. The Spectra and Trima platelets were hyperconcentrated during collection, and PAS was added, whereas the Haemonetics platelets were elutriated with PAS, which may have resulted in less collection injury. When Spectra and Trima platelets were stored in Haemonetics' bags, poststorage viability was significantly improved. Platelet viability is better maintained in vitro than in vivo, allowing substantial increases in platelet storage times. However, implementation will require resolution of potential bacterial overgrowth during storage. PMID:24258816

  9. Further verification of the isotope dilution approach for estimating the degree of participation of (³H)thymidine in DNA synthesis in studies of aquatic bacterial production

    SciTech Connect

    Bell, R.T.

    1986-11-01

    The optimal concentration of (³H)thymidine (i.e., the maximal degree of participation in DNA synthesis) as determined by adding increasing amounts of labeled thymidine at the same specific activity was similar to the concentration of thymidine inhibiting the de novo pathway as determined by isotope dilution plots. These experiments provide further verification of the isotope dilution approach for determining the degree of participation of (³H)thymidine in DNA synthesis.

  10. Case Study: Test Results of a Tool and Method for In-Flight, Adaptive Control System Verification on a NASA F-15 Flight Research Aircraft

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen A.; Schumann, Johann; Guenther, Kurt; Bosworth, John

    2006-01-01

    Adaptive control technologies that incorporate learning algorithms have been proposed to enable autonomous flight control and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments [1-2]. At the present time, however, it is unknown how adaptive algorithms can be routinely verified, validated, and certified for use in safety-critical applications. Rigorous methods for adaptive software verification and validation must be developed to ensure that the control software functions as required and is highly safe and reliable. A large gap appears to exist between the point at which control system designers feel the verification process is complete, and when FAA certification officials agree it is complete. Certification of adaptive flight control software is complicated by the use of learning algorithms (e.g., neural networks) and degrees of system non-determinism. Of course, analytical efforts must be made in the verification process to place guarantees on learning algorithm stability, rate of convergence, and convergence accuracy. However, to satisfy FAA certification requirements, it must be demonstrated that the adaptive flight control system is also able to fail and still allow the aircraft to be flown safely or to land, while at the same time providing a means of crew notification of the (impending) failure. It was for this purpose that the NASA Ames Confidence Tool was developed [3]. This paper presents the Confidence Tool as a means of providing in-flight software assurance monitoring of an adaptive flight control system. The paper will present the data obtained from flight testing the tool on a specially modified F-15 aircraft designed to simulate loss of flight control surfaces.
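
    The abstract does not detail the internals of the NASA Ames Confidence Tool, but the kind of in-flight assurance monitoring it describes can be reduced to tracking a smoothed error signal from the adaptive element and alerting the crew before a failure threshold is crossed. The Python sketch below is a hypothetical illustration of that pattern only; the class, thresholds, and update rule are assumptions, not the tool's actual design.

        class ConfidenceMonitor:
            """Illustrative in-flight assurance monitor for an adaptive controller.

            Tracks an exponentially smoothed tracking error; a rising error
            lowers 'confidence' and triggers crew notification before outright
            failure. Thresholds and smoothing constant are placeholders.
            """

            def __init__(self, alpha=0.1, warn=0.5, fail=1.0):
                self.alpha = alpha          # smoothing constant
                self.warn = warn            # advisory threshold
                self.fail = fail            # failure threshold
                self.smoothed_error = 0.0

            def update(self, tracking_error):
                e = abs(tracking_error)
                self.smoothed_error = (1 - self.alpha) * self.smoothed_error + self.alpha * e
                if self.smoothed_error >= self.fail:
                    return "FAIL: revert to non-adaptive backup and notify crew"
                if self.smoothed_error >= self.warn:
                    return "WARN: adaptation degrading, issue crew advisory"
                return "OK"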

  11. Automating engineering verification in ALMA subsystems

    NASA Astrophysics Data System (ADS)

    Ortiz, José; Castillo, Jorge

    2014-08-01

    The Atacama Large Millimeter/submillimeter Array is an interferometer comprising 66 individual high-precision antennas located at over 5000 meters altitude in the north of Chile. Several complex electronic subsystems need to be meticulously tested at different stages of an antenna's commissioning, both independently and when integrated together. The first subsystem integration takes place at the Operations Support Facilities (OSF), at an altitude of 3000 meters. The second integration occurs at the high-altitude Array Operations Site (AOS), where combined performance with the Central Local Oscillator (CLO) and the Correlator is also assessed. In addition, there are several other events requiring complete or partial verification of compliance with instrument specifications, such as parts replacement, calibration, relocation within the AOS, preventive maintenance and troubleshooting due to poor performance in scientific observations. Restricted engineering time allocation and the constant pressure of minimizing downtime in a 24/7 astronomical observatory impose the need to complete (and report) the aforementioned verifications in the least possible time. Array-wide disturbances, such as global power interruptions and the subsequent recovery, add the challenge of executing this checkout on multiple antenna elements at once. This paper presents the outcome of automating the setup, execution, notification and reporting of engineering verification in ALMA, and how these efforts have resulted in a dramatic reduction of both the time and the operator training required. The Signal Path Connectivity (SPC) checkout is introduced as a notable case of such automation.
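
    The ALMA software itself is not described at code level in the abstract; purely as an illustration of the automation pattern (setup, parallel execution across many antennas, consolidated reporting), a checkout runner can look like the hypothetical sketch below. The test names and antenna IDs are invented.

        from concurrent.futures import ThreadPoolExecutor

        # Hypothetical per-antenna verification steps; a real system would
        # exercise subsystems such as the local oscillator or the signal path.
        def check_signal_path(antenna):
            return ("signal_path", True)

        def check_lo_lock(antenna):
            return ("lo_lock", True)

        TESTS = [check_signal_path, check_lo_lock]

        def checkout(antenna):
            # Run every verification step for one antenna and collect results.
            return antenna, {name: ok for name, ok in (t(antenna) for t in TESTS)}

        def run_checkout(antennas):
            # Execute checkouts on many antennas in parallel and build a report.
            with ThreadPoolExecutor() as pool:
                return dict(pool.map(checkout, antennas))

        print(run_checkout(["DA41", "DV02", "PM03"]))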

  12. Food additives

    MedlinePlus

    Food additives are substances that become part of a food product when they are added during the processing or making of that food. "Direct" food additives are often added during processing to: Add nutrients ...

  13. Extension and validation of an analytical model for in vivo PET verification of proton therapy--a phantom and clinical study.

    PubMed

    Attanasi, F; Knopf, A; Parodi, K; Paganetti, H; Bortfeld, T; Rosso, V; Del Guerra, A

    2011-08-21

    The interest in positron emission tomography (PET) as a tool for treatment verification in proton therapy has become widespread in recent years, and several research groups worldwide are currently investigating the clinical implementation. After the first off-line investigation with a PET/CT scanner at MGH (Boston, USA), attention is now focused on an in-room PET application immediately after treatment in order to also detect shorter-lived isotopes, such as (15)O and (13)N, minimizing isotope washout and avoiding patient repositioning errors. Clinical trials are being conducted by means of commercially available PET systems, and other tests are planned using application-dedicated tomographs. Parallel to the experimental investigation and new hardware development, great interest has been shown in the development of fast procedures to provide feedback regarding the delivered dose from reconstructed PET images. Since the thresholds of the inelastic nuclear reactions leading to tissue β+-activation fall within the energy range of 15-20 MeV, the distal activity fall-off is correlated, but not directly matched, to the distal fall-off of the dose distribution. Moreover, the physical interactions leading to β+-activation and energy deposition are of a different nature. All these facts make it essential to further develop accurate and fast methodologies capable of predicting, on the basis of the planned dose distribution, the expected PET images to be compared with actual PET measurements, thus providing clinical feedback on the correctness of the dose delivery and of the irradiation field position. The aim of this study has been to validate an analytical model and to implement and evaluate it in a fast and flexible framework able to locally predict such activity distributions, directly taking the reference planning CT and planned dose as inputs. The results achieved in this study for phantoms and clinical cases highlighted the potential of the implemented method to predict expected

  14. Voice measures of workload in the advanced flight deck: Additional studies

    NASA Technical Reports Server (NTRS)

    Schneider, Sid J.; Alpert, Murray

    1989-01-01

    These studies investigated acoustical analysis of the voice as a measure of workload in individual operators. In the first study, voice samples were recorded from a single operator during high, medium, and low workload conditions. Mean amplitude, frequency, syllable duration, and emphasis all tended to increase as workload increased. In the second study, NASA test pilots performed a laboratory task, and used a flight simulator under differing work conditions. For two of the pilots, high workload in the simulator brought about greater amplitude, peak duration, and stress. In both the laboratory and simulator tasks, high workload tended to be associated with more statistically significant drop-offs in the acoustical measures than were lower workload levels. There was a great deal of intra-subject variability in the acoustical measures. The results suggested that in individual operators, increased workload might be revealed by high initial amplitude and frequency, followed by rapid drop-offs over time.
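
    As a generic illustration of the acoustical measures mentioned above (amplitude and frequency), the sketch below computes frame-wise RMS amplitude and a zero-crossing-based frequency estimate from a mono signal. It is not the study's analysis pipeline; the frame length and the synthetic test tone are arbitrary choices.

        import numpy as np

        def frame_metrics(signal, rate, frame_len=1024):
            """Per-frame RMS amplitude and zero-crossing frequency estimate."""
            metrics = []
            for start in range(0, len(signal) - frame_len, frame_len):
                frame = signal[start:start + frame_len]
                rms = np.sqrt(np.mean(frame ** 2))
                # Zero-crossing rate gives a crude fundamental-frequency proxy.
                crossings = np.sum(np.abs(np.diff(np.sign(frame)))) / 2
                freq = crossings * rate / (2 * frame_len)
                metrics.append((rms, freq))
            return metrics

        # Synthetic 200 Hz tone as a stand-in for a voice sample.
        rate = 8000
        t = np.arange(rate) / rate
        print(frame_metrics(np.sin(2 * np.pi * 200 * t), rate)[:2])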

  15. Additional Study of Water Droplet Median Volume Diameter (MVD) Effects on Ice Shapes

    NASA Technical Reports Server (NTRS)

    Tsao, Jen-Ching; Anderson, David N.

    2005-01-01

    This paper reports the result of an experimental study in the NASA Glenn Icing Research Tunnel (IRT) to evaluate how well the MVD-independent effect identified previously might apply to SLD conditions in rime icing situations. Models were NACA 0012 wing sections with chords of 53.3 and 91.4 cm. Tests were conducted with a nominal airspeed of 77 m/s (150 kt) and a number of MVDs ranging from 15 to 100 µm with LWCs of 0.5 to 1 g/cu m. In the present study, ice shapes recorded from past studies and recent results at SLD and Appendix-C conditions are reviewed to show that droplet diameter is not important to rime ice shape for MVDs of 30 µm or larger, but for drop sizes less than 30 µm a transition of the rime ice shape from convex to wedge to spearhead type is observed.

  16. Study on Type C Coal Fly ash as an Additive to Molding Sand for Steel Casting

    NASA Astrophysics Data System (ADS)

    Palaniappan, Jayanthi

    2016-05-01

    Studies of physico-chemical properties, such as granulometric analysis, moisture content and X-ray fluorescence, were performed on Type C coal-combustion fly ash to investigate its potential as a distinct option for molding sand in foundries, thereby reducing the dependency on the latter. Studies of technological properties, such as compressive strength, tensile strength, permeability and compaction, of various fly ash molding sand compositions (10, 20 and 30 % fly ash substituted for chemically bonded sand) were performed and compared with silica molding sand. Steel castings were produced using this fly ash molding sand, and the casting surface finish and typical casting parameters were assessed. It was noted that a good-quality steel casting could be produced using Type C fly ash molding sand, which effectively replaced 20 % of the traditional molding sand and binders, thereby providing greater financial profits to the foundry and an effective means of fly ash utilization (waste management).

  17. In situ vitrification and the effects of soil additives; A mixture experiment case study

    SciTech Connect

    Piepel, G.F.; Shade, J.W.

    1992-01-01

    This paper presents a case study involving in situ vitrification (ISV), a process for immobilizing chemical or nuclear wastes in soil by melting-dissolving the contaminated soil into a glass block. One goal of the study was to investigate how viscosity and electrical conductivity were affected by mixing CaO and Na2O with soil. A three-component constrained-region mixture experiment design was generated and the viscosity and electrical conductivity data collected. Several second-order mixture models were considered, and the Box-Cox transformation technique was applied to select property transformations. The fitted models were used to produce contour and component effects plots.
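
    A second-order (Scheffé quadratic) mixture model for three components has the form y = β1x1 + β2x2 + β3x3 + β12x1x2 + β13x1x3 + β23x2x3 with x1 + x2 + x3 = 1 and no intercept. The sketch below fits such a model to invented data after a Box-Cox transformation, roughly mirroring the analysis described; the design points and responses are illustrative, not the study's data.

        import numpy as np
        from scipy.stats import boxcox

        # Illustrative three-component mixture design (fractions sum to 1).
        X = np.array([
            [0.6, 0.2, 0.2],
            [0.2, 0.6, 0.2],
            [0.2, 0.2, 0.6],
            [0.4, 0.4, 0.2],
            [0.4, 0.2, 0.4],
            [0.2, 0.4, 0.4],
            [1/3, 1/3, 1/3],
        ])
        y = np.array([12.0, 9.5, 30.0, 10.8, 19.0, 16.5, 14.0])  # e.g. viscosity

        # Box-Cox chooses a power transform that stabilizes the variance.
        y_t, lam = boxcox(y)

        # Scheffe quadratic model: linear terms plus all pairwise products
        # (no intercept, because the component fractions sum to one).
        x1, x2, x3 = X.T
        design = np.column_stack([x1, x2, x3, x1*x2, x1*x3, x2*x3])
        coef, *_ = np.linalg.lstsq(design, y_t, rcond=None)
        print("lambda =", round(lam, 3), "coefficients =", np.round(coef, 3))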

  18. A review of approaches to the study of turbulence modification by means of non-Newtonian additives

    NASA Astrophysics Data System (ADS)

    Vlassopoulos, Dimitris; Schowalter, William R.

    1987-12-01

    The addition of small amounts of polymers to Newtonian liquids under conditions of turbulent flow results in substantial reduction of skin friction. This phenomenon has been observed experimentally and can be attributed to the unusual behavior of dilute polymer solutions in turbulent flows. A condensed review of topics relevant to the theoretical study of drag reduction by non-Newtonian additives is presented. In addition, the techniques and results of experimental investigations of this phenomenon are examined. It is proposed that dilute solutions of polymers or surfactants can be rheologically characterized by measuring the secondary flow characteristics that occur in the neighborhood of an oscillating cylinder. Plans for conducting these measurements are presented.

  19. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  20. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  1. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  2. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  3. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  4. A Study of the Effect of Additional Reading Assistance on Student Achievement

    ERIC Educational Resources Information Center

    Gillan-Sanderson, Nicole A.

    2012-01-01

    This study describes a procedure one school district used to increase students' reading abilities through reviewing data and adjusting the instruction to give students intensive services, as needed. This school worked in a problem-solving team approach to develop a comprehensive team that followed the progression of student achievement.…

  5. Genome-Wide Association Study of Intelligence: Additive Effects of Novel Brain Expressed Genes

    ERIC Educational Resources Information Center

    Loo, Sandra K.; Shtir, Corina; Doyle, Alysa E.; Mick, Eric; McGough, James J.; McCracken, James; Biederman, Joseph; Smalley, Susan L.; Cantor, Rita M.; Faraone, Stephen V.; Nelson, Stanley F.

    2012-01-01

    Objective: The purpose of the present study was to identify common genetic variants that are associated with human intelligence or general cognitive ability. Method: We performed a genome-wide association analysis with a dense set of 1 million single-nucleotide polymorphisms (SNPs) and quantitative intelligence scores within an ancestrally…

  6. Nahuatl as a Classical, Foreign, and Additional Language: A Phenomenological Study

    ERIC Educational Resources Information Center

    De Felice, Dustin

    2012-01-01

    In this study, participants learning an endangered language variety shared their experiences, thoughts, and feelings about the often complex and diverse language-learning process. I used phenomenological interviews in order to learn more about these English or Spanish language speakers' journey with the Nahuatl language. From first encounter to…

  7. CNV-based genome wide association study reveals additional variants contributing to meat quality in swine

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Pork quality is important both to the meat processing industry and to consumers' purchasing attitudes. Copy number variation (CNV) is an emerging class of genetic variant that may influence meat quality. Herein, a genome-wide association study (GWAS) was performed between CNVs and meat quality traits in swine....

  8. Integrated guidance, navigation and control verification plan primary flight system. [space shuttle avionics integration

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The verification process and requirements for the ascent guidance interfaces and the ascent integrated guidance, navigation and control system for the space shuttle orbiter are defined as well as portions of supporting systems which directly interface with the system. The ascent phase of verification covers the normal and ATO ascent through the final OMS-2 circularization burn (all of OPS-1), the AOA ascent through the OMS-1 burn, and the RTLS ascent through ET separation (all of MM 601). In addition, OPS translation verification is defined. Verification trees and roadmaps are given.

  9. Simplifying EPID dosimetry for IMRT treatment verification

    SciTech Connect

    Pecharroman-Gallego, R.; Mans, Anton; Sonke, Jan-Jakob; Stroom, Joep C.; Olaciregui-Ruiz, Igor; Herk, Marcel van; Mijnheer, Ben J.

    2011-02-15

    Purpose: Electronic portal imaging devices (EPIDs) are increasingly used for IMRT dose verification, both pretreatment and in vivo. In this study, an earlier developed backprojection model has been modified to avoid the need for patient-specific transmission measurements and, consequently, leads to a faster procedure. Methods: Currently, the transmission, an essential ingredient of the backprojection model, is estimated from the ratio of EPID measurements with and without a phantom/patient in the beam. Thus, an additional irradiation to obtain "open images" under the same conditions as the actual phantom/patient irradiation is required. However, by calculating the transmission of the phantom/patient in the direction of the beam instead of using open images, this extra measurement can be avoided. This was achieved by using a model that includes the effect of beam hardening and the off-axis dependence of the EPID response on photon beam spectral changes. The parameters in the model were obtained empirically by performing EPID measurements using polystyrene slab phantoms of different thicknesses in 6, 10, and 18 MV photon beams. A theoretical analysis was performed to verify the sensitivity of the model to patient thickness changes. The new model was finally applied to the analysis of EPID dose verification measurements of step-and-shoot IMRT treatments of head and neck, lung, breast, cervix, prostate, and rectum patients. All measurements were carried out using Elekta SL20i linear accelerators equipped with a hydrogenated amorphous silicon EPID, and the IMRT plans were made using PINNACLE software (Philips Medical Systems). Results: The results showed generally good agreement with the dose determined using the old model with the measured transmission. The average differences between the EPID-based in vivo dose at the isocenter determined using the new transmission model and using the measured transmission were 2.6 ± 3.1%, 0.2 ± 3.1%, and 2.2 ± 3.9% for 47 patients
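
    A minimal sketch of the calculated-transmission idea, assuming simple exponential attenuation with an effective coefficient corrected for beam hardening (harder beam at depth, lower attenuation) and off-axis spectral softening (softer beam off axis, higher attenuation). The coefficients below are placeholders for illustration, not the fitted values of the published model.

        import numpy as np

        def transmission(thickness_cm, off_axis_cm, mu0=0.05, k_hard=4e-4, k_oa=1e-4):
            # Effective attenuation coefficient: beam hardening with depth
            # lowers mu_eff, while the softer off-axis spectrum raises it.
            # All coefficients here are illustrative placeholders.
            mu_eff = mu0 - k_hard * thickness_cm + k_oa * off_axis_cm
            return np.exp(-mu_eff * thickness_cm)

        # 20 cm of tissue-like material, central axis vs. 10 cm off axis.
        print(transmission(20.0, 0.0), transmission(20.0, 10.0))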

  10. Food additives

    PubMed Central

    Spencer, Michael

    1974-01-01

    Food additives are discussed from the food technology point of view. The reasons for their use are summarized: (1) to protect food from chemical and microbiological attack; (2) to even out seasonal supplies; (3) to improve their eating quality; (4) to improve their nutritional value. The various types of food additives are considered, e.g. colours, flavours, emulsifiers, bread and flour additives, preservatives, and nutritional additives. The paper concludes with consideration of those circumstances in which the use of additives is (a) justified and (b) unjustified. PMID:4467857

  11. Thermal analysis studies of Ge additive of Se-Te glasses

    NASA Astrophysics Data System (ADS)

    Mohamed, M.; Abdel-Rahim, M. A.

    2016-04-01

    GexSe50Te50-x (x = 5, 15, 20, 35 at.%) bulk glasses were synthesized by the melt quenching method. The amorphous nature of the investigated glasses was confirmed by X-ray diffraction. Results of differential scanning calorimetry (DSC) of the studied compositions under non-isothermal conditions are reported and discussed. The glass transition temperature (Tg), onset crystallization temperature (Tc), and crystallization peak temperature (Tp) were determined from DSC traces at different heating rates. It was found that the values of Tg, Tc, and Tp depend on both composition and heating rate. Two crystallization stages were observed in the DSC results. Various kinetic parameters, such as the glass transition activation energy (Eg), crystallization activation energy (Ec), and rate constant (Kp), were calculated. The glass-forming ability of the studied compositions is discussed as a function of the determined kinetic parameters.
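
    The abstract does not name the kinetic formalism used; one standard way to obtain a crystallization activation energy from DSC peak temperatures measured at several heating rates is Kissinger's method, ln(β/Tp²) = −Ec/(R·Tp) + const. The sketch below applies it to invented numbers, purely as an illustration of the calculation.

        import numpy as np

        R = 8.314  # gas constant, J mol^-1 K^-1

        # Heating rates (K/min) and corresponding DSC peak temperatures (K);
        # purely illustrative values, not the paper's measurements.
        beta = np.array([5.0, 10.0, 20.0, 40.0])
        Tp = np.array([480.0, 487.0, 494.0, 502.0])

        # Kissinger: ln(beta / Tp^2) is linear in 1/Tp with slope -Ec/R.
        slope, _ = np.polyfit(1.0 / Tp, np.log(beta / Tp**2), 1)
        Ec = -slope * R
        print(f"crystallization activation energy ~ {Ec/1000:.0f} kJ/mol")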

  12. Comparative study of glycine single crystals with additive of potassium nitrate in different concentration ratios

    NASA Astrophysics Data System (ADS)

    Gujarati, Vivek P.; Deshpande, M. P.; Patel, Kamakshi R.; Chaki, S. H.

    2016-05-01

    Semi-organic crystals of glycine potassium nitrate (GPN) with potential applications in nonlinear optics (NLO) were grown using the slow evaporation technique. Glycine and potassium nitrate were taken in three different concentration ratios of 3:1, 2:1 and 1:1, respectively. We checked the solubility of the material in distilled water at different temperatures and could observe the growth of crystals within 7 weeks. The purity of the grown crystals was confirmed by energy dispersive X-ray analysis (EDAX) and CHN analysis. The powder X-ray diffraction pattern of GPN was recorded to confirm the crystalline nature. To assess the applications of the grown crystals in the opto-electronics field, a UV-Vis-NIR study was carried out. Dielectric properties of the samples were studied over the frequency range 1 Hz to 100 kHz.

  13. Brief reconnaissance study for the addition of hydropower for Riegel Dam, Trion, Georgia

    SciTech Connect

    Gebhard, T.G. Jr.

    1982-05-10

    The feasibility of retrofitting the Riegel Dam near Trion, Georgia for power generation was examined. This dam has a developable head of 17 ft., was built in 1900 for supplying hydroelectric power for a textile mill, and currently provides cooling water to the mill. The study of environmental, institutional, safety and economic factors showed that hydroelectric power development at this site would require new generating equipment and that such retrofitting appears to be economically feasible. (LCL)

  14. Brief reconnaissance study for the addition of hydropower for Carr Fork Dam, Sassafras, Kentucky

    SciTech Connect

    Gebhard, T.G. Jr.

    1982-05-24

    The feasibility of retrofitting the Carr Fork Dam near Hazard, KY for power generation was examined. This dam has a developable head of 80 ft and was built in 1975 to provide flood protection. The study of environmental, institutional, safety, and economic factors showed that the total investment cost would be $909,600 and that hydroelectric power development at this site is not feasible unless a higher price could be obtained for the power sold. (LCL)

  15. A Design Verification of the Parallel Pipelined Image Processings

    NASA Astrophysics Data System (ADS)

    Wasaki, Katsumi; Harai, Toshiaki

    2008-11-01

    This paper presents a case study of the design and verification of a parallel and pipelined image processing unit based on an extended Petri net, called a Logical Colored Petri Net (LCPN), which is suitable for flexible manufacturing system (FMS) modeling and for discussing structural properties. LCPN is another family of colored place/transition net (CPN) with the addition of the following features: integer value assignment to marks, representation of firing conditions as formulae based on mark values, and coupling of output procedures with transition firing. Therefore, to study the behavior of a system modeled with this net, we provide a means of searching the reachability tree of markings.
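
    As a generic illustration of searching the reachability tree of a place/transition net (omitting the LCPN extensions such as integer-valued marks and firing-condition formulae), the sketch below enumerates the markings reachable from an initial marking by breadth-first search. The three-place net is invented for the example.

        from collections import deque

        # A tiny place/transition net: each transition consumes tokens given by
        # 'pre' and produces tokens given by 'post' (vectors over the places).
        transitions = [
            {"pre": (1, 0, 0), "post": (0, 1, 0)},
            {"pre": (0, 1, 0), "post": (0, 0, 1)},
            {"pre": (0, 0, 1), "post": (1, 0, 0)},
        ]

        def fire(marking, t):
            # Return the successor marking, or None if t is not enabled.
            if any(m < p for m, p in zip(marking, t["pre"])):
                return None
            return tuple(m - p + q for m, p, q in zip(marking, t["pre"], t["post"]))

        def reachable(initial):
            # Breadth-first enumeration of the reachability set.
            seen, queue = {initial}, deque([initial])
            while queue:
                m = queue.popleft()
                for t in transitions:
                    succ = fire(m, t)
                    if succ is not None and succ not in seen:
                        seen.add(succ)
                        queue.append(succ)
            return seen

        print(reachable((1, 0, 0)))  # {(1,0,0), (0,1,0), (0,0,1)}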

  16. Stationary spiraling eddies in presence of polar amplification of global warming as a governing factor of ecology of Greenland seals White Sea population: results of verification study

    NASA Astrophysics Data System (ADS)

    Melentyev, K.; Chernook, V.; Melentyev, V.

    2003-04-01

    Ice-associated marine mammals occupy a high level of the ocean's food chains, and censusing population numbers for different groups, as well as assessing their ecology and welfare, are important tasks for marine biology, ecology, fisheries and other applications. Global warming and anthropogenic impacts on marine and coastal ecosystems create many problems. In order to investigate the ice-covered Arctic Ocean and chart the number of seals, annual inspections were performed onboard the research aircraft PINRO "Arktika". Multi-spectral airborne and satellite observations were carried out regularly from the Barents and White Seas to the Bering and Okhotsk Seas (1996-2002). The contemporary status of different groups of sea mammals was evaluated, with the numbers of adults and pups checked separately. In situ observations were made using a helicopter and an icebreaker to gather water samples and ice cores (with subsequent biochemical and toxicological analysis). The prevailing part of the life cycle of Greenland (harp) seals depends strongly on winter hydrology (water masses, stable currents, meandering fronts, stationary eddies) and is closely connected with the type of ice (pack, fast ice) and other ice parameters (age, origin, salinity, ice edge). First-year ice floes, which have specific properties and distinctive features, are used by harp seals for pupping, lactation, molting, pairing and resting. Ringed seals, conversely, use only fast ice for the corresponding purposes. Different aspects of the ecology and migration features of harp seals were analyzed in the frame of a verification study. The influence of winter severity and wind regime was revealed, but the stationary eddies in the White Sea are the most effective governing factor (a novel finding). The following relationships between eddies and the ecology of the Greenland seal White Sea population will be discussed: A) regularities of eddy formation and their spatial arrangement, temporal (seasonal and annual

  17. A clinical comparative study of Cadiax Compact II and intraoral records using wax and addition silicone.

    PubMed

    Torabi, Kianoosh; Pour, Sasan Rasaei; Ahangari, Ahmad Hassan; Ghodsi, Safoura

    2014-01-01

    Evaluation of mandibular movements is necessary to form the occlusal anatomical contour, analyze the temporomandibular joint status, and evaluate the patient's occlusion. This clinical study was conducted to compare the mandibular recording device Cadiax Compact II with routine intraoral records, using wax and addition silicone, for measuring condylar inclinations. The results showed that the differences between Cadiax and intraoral records were statistically significant for all measurements. Cadiax measurements had a stronger correlation with the silicone records. The recorded Bennett angles were lower, and the sagittal condylar inclination values higher, with Cadiax than with routine intraoral records. PMID:25390868

  18. Effect of object identification algorithms on feature based verification scores

    NASA Astrophysics Data System (ADS)

    Weniger, Michael; Friederichs, Petra

    2015-04-01

    Many modern spatial verification techniques rely on feature identification algorithms. We study the importance of the choice of algorithm and its parameters for the resulting scores. SAL is used as an example to show that these choices have a statistically significant impact on the distributions of object-dependent scores. Non-continuous operators used for feature identification are identified as the underlying reason for the observed stability issues, with implications for many feature-based verification techniques.
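
    The non-continuity referred to above arises because object identification typically thresholds a field and labels connected components: an arbitrarily small change in the threshold (or in the field) can merge or split objects, changing object-dependent scores discontinuously. The sketch below demonstrates this with an invented two-peak field; thresholds straddle the saddle value between the peaks.

        import numpy as np
        from scipy import ndimage

        # A precipitation-like field with two maxima joined by a saddle.
        x = np.linspace(-3, 3, 200)
        X, Y = np.meshgrid(x, x)
        field = np.exp(-((X - 1) ** 2 + Y ** 2)) + np.exp(-((X + 1) ** 2 + Y ** 2))

        # Labeling connected regions above a threshold is non-continuous:
        # a tiny threshold change can split one object into two.
        for threshold in (0.72, 0.75):
            _, n_objects = ndimage.label(field > threshold)
            print(f"threshold {threshold}: {n_objects} object(s)")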

  19. [Chewing gum as an additional agent in maintaining oral hygiene versus smoking status--preliminary study].

    PubMed

    Nakonieczna-Rudnicka, Marta; Strycharz-Dudziak, Małgorzata; Bachanek, Teresa

    2012-01-01

    Chewing gum is nowadays widely used in different age groups, so complying with the proper duration and frequency of chewing is an important factor influencing the state of the masticatory system. The study involved 112 dental students of the Medical University of Lublin. Everyday use of chewing gum was declared in 47.32% of cases. A chewing time of up to 10 minutes was stated by 23.08% of respondents, and 11-20 minutes by 40.38%. Among the examined students, 17.3% smoked cigarettes. In the smokers group, 83.33% chewed gum every day, compared with 43.37% of non-smokers. A chewing time shorter than 10 minutes was declared by 22.22% of smokers and 23.26% of non-smokers, while a chewing time of 11-20 minutes was declared by 27.78% and 44.35% of smokers and non-smokers, respectively. The obtained results indicate the need for further studies on the influence of nicotine on saliva parameters with respect to the development of diseases of hard tooth tissues. PMID:23421028

  20. Study on the interaction of the toxic food additive carmoisine with serum albumins: a microcalorimetric investigation.

    PubMed

    Basu, Anirban; Kumar, Gopinatha Suresh

    2014-05-30

    The interaction of the synthetic azo dye and food colorant carmoisine with human and bovine serum albumins was studied by microcalorimetric techniques. A complete thermodynamic profile of the interaction was obtained from isothermal titration calorimetry studies. The equilibrium constant of the complexation process was of the order of 10(6) M(-1), and the binding stoichiometry was found to be 1:1 with both serum albumins. The binding was driven by negative standard molar enthalpy and positive standard molar entropy contributions. The binding affinity was lower at higher salt concentrations in both cases, but was dominated by mostly non-electrostatic forces at all salt concentrations. The polyelectrolytic forces contributed only 5-8% of the total standard molar Gibbs energy change. The standard molar enthalpy change increased, whereas the standard molar entropic contribution decreased, with rising temperature, but they compensated each other to keep the standard molar Gibbs energy change almost invariant. The negative standard molar heat capacity values suggested the involvement of a significant hydrophobic contribution in the complexation process. Besides, an enthalpy-entropy compensation phenomenon was observed in both systems. The thermal stability of the serum proteins was found to be remarkably enhanced on binding to carmoisine. PMID:24742664

  1. Combined ab initio molecular dynamics and experimental studies of carbon atom addition to benzene.

    PubMed

    McKee, Michael L; Reisenauer, Hans Peter; Schreiner, Peter R

    2014-04-17

    Car-Parrinello molecular dynamics was used to explore the reactions of triplet and singlet carbon atoms with benzene. The computations reveal that the singlet C atom reaction is very exothermic and that nearly every collision yields a product determined by the initial encounter geometry. The singlet C atom reaction does not follow the minimum energy path because the bimolecular reaction is controlled by dynamics (i.e., the initial orientation of the encounter). On the other hand, in a 10 K solid Ar matrix, ground state C((3)P) atoms do tend to follow RRKM kinetics. Thus, the ab initio molecular dynamics (AIMD) results indicate that a significant fraction of C-H insertion occurs to form phenylcarbene whereas, in marked contrast to previous theoretical and experimental conclusions, the Ar matrix isolation studies indicate a large fraction of direct cycloheptatetraene formation, without the intermediacy of phenylcarbene. The AIMD calculations are more consistent with vaporized carbon atom experiments, where labeling studies indicate the initial formation of phenylcarbene. This underlines that the availability of thermodynamic sinks can completely alter the observed reaction dynamics. PMID:24661002

  2. Biometric verification with correlation filters.

    PubMed

    Vijaya Kumar, B V K; Savvides, Marios; Xie, Chunyan; Venkataramani, Krithika; Thornton, Jason; Mahalanobis, Abhijit

    2004-01-10

    Using biometrics for subject verification can significantly improve security over that of approaches based on passwords and personal identification numbers, both of which people tend to lose or forget. In biometric verification the system tries to match an input biometric (such as a fingerprint, face image, or iris image) to a stored biometric template. Thus correlation filter techniques are attractive candidates for the matching precision needed in biometric verification. In particular, advanced correlation filters, such as synthetic discriminant function filters, can offer very good matching performance in the presence of variability in these biometric images (e.g., facial expressions, illumination changes, etc.). We investigate the performance of advanced correlation filters for face, fingerprint, and iris biometric verification. PMID:14735958
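
    A common decision statistic in correlation-filter verification is the peak-to-sidelobe ratio (PSR) of the correlation plane: a genuine match produces a sharp, isolated peak, while an impostor does not. The sketch below computes a frequency-domain cross-correlation and its PSR as a generic illustration; it uses a plain matched template rather than a synthetic discriminant function filter, and the images are random stand-ins for biometric data.

        import numpy as np

        def psr(template, probe):
            """Peak-to-sidelobe ratio of the cross-correlation plane."""
            # Circular cross-correlation via FFT.
            corr = np.real(np.fft.ifft2(np.fft.fft2(template).conj() * np.fft.fft2(probe)))
            r, c = np.unravel_index(np.argmax(corr), corr.shape)
            peak = corr[r, c]
            # Exclude a small region around the peak; the rest is the sidelobe.
            mask = np.ones_like(corr, dtype=bool)
            mask[max(r - 2, 0):r + 3, max(c - 2, 0):c + 3] = False
            sidelobe = corr[mask]
            return (peak - sidelobe.mean()) / sidelobe.std()

        rng = np.random.default_rng(0)
        enrolled = rng.standard_normal((64, 64))
        genuine = np.roll(enrolled, (3, 5), axis=(0, 1))  # shifted same image
        impostor = rng.standard_normal((64, 64))
        print(psr(enrolled, genuine), psr(enrolled, impostor))  # high vs low PSR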

  3. Biometric verification with correlation filters

    NASA Astrophysics Data System (ADS)

    Vijaya Kumar, B. V. K.; Savvides, Marios; Xie, Chunyan; Venkataramani, Krithika; Thornton, Jason; Mahalanobis, Abhijit

    2004-01-01

    Using biometrics for subject verification can significantly improve security over that of approaches based on passwords and personal identification numbers, both of which people tend to lose or forget. In biometric verification the system tries to match an input biometric (such as a fingerprint, face image, or iris image) to a stored biometric template. Thus correlation filter techniques are attractive candidates for the matching precision needed in biometric verification. In particular, advanced correlation filters, such as synthetic discriminant function filters, can offer very good matching performance in the presence of variability in these biometric images (e.g., facial expressions, illumination changes, etc.). We investigate the performance of advanced correlation filters for face, fingerprint, and iris biometric verification.

  4. TPS verification with UUT simulation

    NASA Astrophysics Data System (ADS)

    Wang, Guohua; Meng, Xiaofeng; Zhao, Ruixian

    2006-11-01

    TPS (Test Program Set) verification, or first-article acceptance testing, commonly depends on fault-insertion experiments on the UUT (Unit Under Test). However, the failure modes that can be injected on a UUT are limited, and injection is almost infeasible when the UUT is in development or in a distributed state. To resolve this problem, a TPS verification method based on UUT interface signal simulation is put forward. Interoperability between the ATS (automatic test system) and the UUT simulation platform is very important for realizing automatic TPS verification. After analyzing the ATS software architecture, the approach to realizing interoperability between the ATS software and the UUT simulation platform is proposed. The UUT simulation platform software architecture is then proposed based on the ATS software architecture. The hardware composition and software architecture of the UUT simulation are described in detail. The UUT simulation platform has been applied in avionics equipment TPS development, debugging and verification.

  5. Using Small-Step Refinement for Algorithm Verification in Computer Science Education

    ERIC Educational Resources Information Center

    Simic, Danijela

    2015-01-01

    Stepwise program refinement techniques can be used to simplify program verification. Programs are better understood since their main properties are clearly stated, and verification of rather complex algorithms is reduced to proving simple statements connecting successive program specifications. Additionally, it is easy to analyse similar…

  6. Shelf Life and Quality Study of Minced Tilapia with Nori and Hijiki Seaweeds as Natural Additives

    PubMed Central

    Ribeiro, Ingridy Simone; Shirahigue, Ligianne Din; Ferraz de Arruda Sucasas, Lia; Anbe, Lika; da Cruz, Pedro Gomes; Gallo, Cláudio Rosa; Carpes, Solange Teresinha; Marques, Marcos José; Oetterer, Marília

    2014-01-01

    The extraction of mechanically separated meat has emerged as an attractive process. However, it increases the incorporation of oxygen and, consequently, the development of off-flavors due to rancidity. Thus, preservatives must be added. The objective of this study was to evaluate the shelf life of minced tilapia in which synthetic preservatives were replaced with Hijiki and Nori seaweed extracts. The application of the extracts had no effect on the chemical composition of the minced tilapia. The seaweed extracts had an inhibitory effect on total volatile base nitrogen. The minced tilapia complied with the microbiological standard set by Brazilian law. The panelists detected no differences in the rancid aroma, and only minor differences were detected in the color of the products. It can be concluded that the minced tilapia with added seaweed extracts was within quality standards during frozen storage. PMID:25478593

  7. Shelf life and quality study of minced tilapia with Nori and Hijiki seaweeds as natural additives.

    PubMed

    Ribeiro, Ingridy Simone; Shirahigue, Ligianne Din; Ferraz de Arruda Sucasas, Lia; Anbe, Lika; da Cruz, Pedro Gomes; Gallo, Cláudio Rosa; Carpes, Solange Teresinha; Marques, Marcos José; Oetterer, Marília

    2014-01-01

    The extraction of mechanically separated meat has emerged as an attractive process. However, it increases the incorporation of oxygen and, consequently, the development of off-flavors due to rancidity. Thus, preservatives must be added. The objective of this study was to evaluate the shelf life of minced tilapia in which synthetic preservatives were replaced with Hijiki and Nori seaweed extracts. The application of the extracts had no effect on the chemical composition of the minced tilapia. The seaweed extracts had an inhibitory effect on total volatile base nitrogen. The minced tilapia complied with the microbiological standard set by Brazilian law. The panelists detected no differences in the rancid aroma, and only minor differences were detected in the color of the products. It can be concluded that the minced tilapia with added seaweed extracts was within quality standards during frozen storage. PMID:25478593

  8. l-carnitine as a Potential Additive in Blood Storage Solutions: A Study on Erythrocytes.

    PubMed

    Soumya, R; Carl, H; Vani, R

    2016-09-01

    Erythrocytes undergo various changes during storage (the storage lesion) that in turn reduce their functioning and survival. Oxidative stress plays a major role in the storage lesion, and antioxidants can be used to combat this stress. This study elucidates the effects of l-carnitine (LC) on erythrocytes of stored blood. Blood was obtained from male Wistar rats and stored (4 °C) for 20 days in CPDA-1 (citrate phosphate dextrose adenine) solution. Samples were divided into (i) controls, (ii) LC 10 (l-carnitine at a concentration of 10 mM), (iii) LC 30 (l-carnitine at 30 mM) and (iv) LC 60 (l-carnitine at 60 mM). Every fifth day, the biomarkers (hemoglobin, hemolysis, antioxidant enzymes, lipid peroxidation and protein oxidation products) were analysed in erythrocytes. Changes in hemoglobin and protein sulfhydryls were insignificant during storage, indicating their maintenance in all groups. Superoxide dismutase and malondialdehyde levels increased initially and decreased towards the end of storage. The levels of catalase and glutathione peroxidase were lower in the experimental groups than in controls during storage; l-carnitine assisted the enzymes by scavenging the reactive oxygen species produced. Hemolysis increased in all groups with storage, showing that l-carnitine could not completely protect lipids and proteins from oxidative stress. Hence, this study opens up new avenues for using l-carnitine as a component of storage solutions, in combination with other antioxidants, in order to maintain the efficacy of erythrocytes. PMID:27429526

  9. Biological effect of food additive titanium dioxide nanoparticles on intestine: an in vitro study.

    PubMed

    Song, Zheng-Mei; Chen, Ni; Liu, Jia-Hui; Tang, Huan; Deng, Xiaoyong; Xi, Wen-Song; Han, Kai; Cao, Aoneng; Liu, Yuanfang; Wang, Haifang

    2015-10-01

    Titanium dioxide nanoparticles (TiO2 NPs) are widely found in food-related consumer products. Understanding the effect of TiO2 NPs on the intestinal barrier and absorption is vital for the safety assessment of orally administered TiO2 NPs. In this study, the cytotoxicity and translocation of two native TiO2 NPs, and of the same TiO2 NPs pretreated with digestion simulation fluid or bovine serum albumin, were investigated in undifferentiated Caco-2 cells, differentiated Caco-2 cells and the Caco-2 monolayer. TiO2 NPs at concentrations below 200 µg ml(-1) did not induce any toxicity in differentiated cells or the Caco-2 monolayer after 24 h exposure. However, TiO2 NPs pretreated with digestion simulation fluids inhibited the growth of undifferentiated Caco-2 cells at 200 µg ml(-1). Undifferentiated Caco-2 cells took up native TiO2 NPs easily, but not the pretreated NPs, implying that the protein coating on the NPs impeded cellular uptake. Compared with undifferentiated cells, differentiated ones possessed a much lower uptake ability for these TiO2 NPs. Similarly, the traversal of TiO2 NPs through the Caco-2 monolayer was negligible. Therefore, we infer that the possibility of TiO2 NPs traversing the intestine of animals or humans after oral intake is quite low. This study provides valuable information for the risk assessment of TiO2 NPs in food. PMID:26106068

  10. Mechanistic study of secondary organic aerosol components formed from nucleophilic addition reactions of methacrylic acid epoxide

    NASA Astrophysics Data System (ADS)

    Birdsall, A. W.; Miner, C. R.; Mael, L. E.; Elrod, M. J.

    2014-08-01

    Recently, methacrylic acid epoxide (MAE) has been proposed as a precursor to an important class of isoprene-derived compounds found in secondary organic aerosol (SOA): 2-methylglyceric acid (2-MG) and a set of oligomers, nitric acid esters and sulfuric acid esters related to 2-MG. However, the specific chemical mechanisms by which MAE could form these compounds have not been previously studied. In order to determine the relevance of these processes to atmospheric aerosol, MAE and 2-MG have been synthesized and a series of bulk solution-phase experiments aimed at studying the reactivity of MAE using nuclear magnetic resonance (NMR) spectroscopy have been performed. The present results indicate that the acid-catalyzed MAE reaction is more than 600 times slower than a similar reaction of an important isoprene-derived epoxide, but is still expected to be kinetically feasible in the atmosphere on more acidic SOA. The specific mechanism by which MAE leads to oligomers was identified, and the reactions of MAE with a number of atmospherically relevant nucleophiles were also investigated. Because the nucleophilic strengths of water, sulfate, alcohols (including 2-MG), and acids (including MAE and 2-MG) in their reactions with MAE were found to be of a similar magnitude, it is expected that a diverse variety of MAE + nucleophile product species may be formed on ambient SOA. Thus, the results indicate that epoxide chain reaction oligomerization will be limited by the presence of high concentrations of non-epoxide nucleophiles (such as water); this finding is consistent with previous environmental chamber investigations of the relative humidity-dependence of 2-MG-derived oligomerization processes and suggests that extensive oligomerization may not be likely on ambient SOA because of other competitive MAE reaction mechanisms.
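
    The competition described above, in which epoxide chain oligomerization is suppressed when abundant non-epoxide nucleophiles are present, reduces to pseudo-first-order branching arithmetic: each nucleophile captures MAE in proportion to k_i[Nu_i]. The rate constants and concentrations in the sketch below are invented for illustration; the study's point is that, with nucleophilic strengths of similar magnitude, concentration dominates the outcome.

        # Pseudo-first-order branching among nucleophiles attacking activated MAE:
        #   fraction_i = k_i * [Nu_i] / sum_j k_j * [Nu_j]
        # Values are illustrative only; with similar rate constants, the very
        # abundant nucleophile (water) captures most of the epoxide.
        nucleophiles = {
            "water":   {"k": 1.0, "conc": 30.0},   # mol/L, abundant in wet SOA
            "sulfate": {"k": 2.0, "conc": 1.0},
            "2-MG":    {"k": 1.5, "conc": 0.05},   # oligomer-propagating channel
        }

        total = sum(v["k"] * v["conc"] for v in nucleophiles.values())
        for name, v in nucleophiles.items():
            print(f"{name:8s} {v['k'] * v['conc'] / total:6.1%}")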

  11. Mechanistic study of secondary organic aerosol components formed from nucleophilic addition reactions of methacrylic acid epoxide

    NASA Astrophysics Data System (ADS)

    Birdsall, A. W.; Miner, C. R.; Mael, L. E.; Elrod, M. J.

    2014-12-01

    Recently, methacrylic acid epoxide (MAE) has been proposed as a precursor to an important class of isoprene-derived compounds found in secondary organic aerosol (SOA): 2-methylglyceric acid (2-MG) and a set of oligomers, nitric acid esters, and sulfuric acid esters related to 2-MG. However, the specific chemical mechanisms by which MAE could form these compounds have not been previously studied with experimental methods. In order to determine the relevance of these processes to atmospheric aerosol, MAE and 2-MG have been synthesized and a series of bulk solution-phase experiments aimed at studying the reactivity of MAE using nuclear magnetic resonance (NMR) spectroscopy have been performed. The present results indicate that the acid-catalyzed MAE reaction is more than 600 times slower than a similar reaction of an important isoprene-derived epoxide, but is still expected to be kinetically feasible in the atmosphere on more acidic SOA. The specific mechanism by which MAE leads to oligomers was identified, and the reactions of MAE with a number of atmospherically relevant nucleophiles were also investigated. Because the nucleophilic strengths of water, sulfate, alcohols (including 2-MG), and acids (including MAE and 2-MG) in their reactions with MAE were found to be of similar magnitudes, it is expected that a diverse variety of MAE + nucleophile product species may be formed on ambient SOA. Thus, the results indicate that epoxide chain reaction oligomerization will be limited by the presence of high concentrations of non-epoxide nucleophiles (such as water); this finding is consistent with previous environmental chamber investigations of the relative humidity dependence of 2-MG-derived oligomerization processes and suggests that extensive oligomerization may not be likely on ambient SOA because of other competitive MAE reaction mechanisms.

  12. DFM flow by using combination between design based metrology system and model based verification at sub-50nm memory device

    NASA Astrophysics Data System (ADS)

    Kim, Cheol-kyun; Kim, Jungchan; Choi, Jaeseung; Yang, Hyunjo; Yim, Donggyu; Kim, Jinwoong

    2007-03-01

    As the minimum transistor length gets smaller, the variation and uniformity of transistor length seriously affect device performance. So the importance of optical proximity correction (OPC) and resolution enhancement technology (RET) cannot be overemphasized. However, the OPC process is regarded by some as a necessary evil for device performance. In fact, every group, including process and design, is interested in the whole-chip CD variation trend and CD uniformity, which represent the real wafer. Recently, design-based metrology systems have become capable of detecting differences between the design database and the wafer SEM image. Design-based metrology systems are able to extract whole-chip CD variation information. According to the results, OPC abnormalities were identified and design feedback items were also disclosed. Other approaches have been pursued by EDA companies, such as model-based OPC verification. Model-based verification is done for the full chip area using a well-calibrated model. The object of model-based verification is the prediction of potential weak points on the wafer and fast feedback to OPC and design before reticle fabrication. In order to achieve robust design and sufficient device margin, an appropriate combination of a design-based metrology system and model-based verification tools is very important. Therefore, we evaluated a design-based metrology system and matched it with a model-based verification system for an optimum combination of the two. In our study, a huge amount of data from wafer results is classified and analyzed by statistical methods and sorted into OPC feedback and design feedback items. Additionally, a novel DFM flow is proposed using the combination of design-based metrology and model-based verification tools.

  13. Verification system for postoperative autologous blood retransfusion.

    PubMed

    Yoshikawa, Takeki; Kimura, Eizen; Kobayashi, Shinji; Ishihara, Ken

    2013-01-01

    Medical staff members should match blood products with patients using a barcode authentication system for blood transfusion to prevent medical accidents. However, our hospital verifies only the blood products of the Japanese Red Cross Society and preserved autologous blood, not the autologous blood salvaged during an operation or from the oxygenator. In this study, we developed a barcode medication administration system and a mobile device for verification. This system will prevent blood transfusion errors in the ward setting. PMID:23920751

  14. A digital process for additive manufacturing of occlusal splints: a clinical pilot study.

    PubMed

    Salmi, Mika; Paloheimo, Kaija-Stiina; Tuomi, Jukka; Ingman, Tuula; Mäkitie, Antti

    2013-07-01

    The aim of this study was to develop and evaluate a digital process for manufacturing of occlusal splints. An alginate impression was taken from the upper and lower jaws of a patient with temporomandibular disorder owing to cross bite and wear of the teeth, and then digitized using a table laser scanner. The scanned model was repaired using the 3Data Expert software, and a splint was designed with the Viscam RP software. A splint was manufactured from a biocompatible liquid photopolymer by stereolithography. The system employed in the process was SLA 350. The splint was worn nightly for six months. The patient adapted to the splint well and found it comfortable to use. The splint relieved tension in the patient's bite muscles. No sign of tooth wear or significant splint wear was detected after six months of testing. Modern digital technology enables us to manufacture clinically functional occlusal splints, which might reduce costs, dental technician working time and chair-side time. Maximum dimensional errors of approximately 1 mm were found at thin walls and sharp corners of the splint when compared with the digital model. PMID:23614943

  15. A digital process for additive manufacturing of occlusal splints: a clinical pilot study

    PubMed Central

    Salmi, Mika; Paloheimo, Kaija-Stiina; Tuomi, Jukka; Ingman, Tuula; Mäkitie, Antti

    2013-01-01

    The aim of this study was to develop and evaluate a digital process for manufacturing of occlusal splints. An alginate impression was taken from the upper and lower jaws of a patient with temporomandibular disorder owing to cross bite and wear of the teeth, and then digitized using a table laser scanner. The scanned model was repaired using the 3Data Expert software, and a splint was designed with the Viscam RP software. A splint was manufactured from a biocompatible liquid photopolymer by stereolithography. The system employed in the process was SLA 350. The splint was worn nightly for six months. The patient adapted to the splint well and found it comfortable to use. The splint relieved tension in the patient's bite muscles. No sign of tooth wear or significant splint wear was detected after six months of testing. Modern digital technology enables us to manufacture clinically functional occlusal splints, which might reduce costs, dental technician working time and chair-side time. Maximum dimensional errors of approximately 1 mm were found at thin walls and sharp corners of the splint when compared with the digital model. PMID:23614943

  16. Value addition of Palmyra palm and studies on the storage life.

    PubMed

    Chaurasiya, A K; Chakraborty, I; Saha, J

    2014-04-01

    Palmyra palm (Borassus flabellifer L.), belonging to the family Palmae, is referred to as a tree of life, with several uses including food, beverages, fibre, medicine and timber. Unfortunately, the nutritionally enriched pulp of the ripened palm has limited commercial use. Extraction of the pulp was accomplished using water and heat to ensure maximum pulp recovery. Different recipes were tried for the preparation of two uncommon value-added products, palm spread and palm toffee. On the basis of biochemical composition, organoleptic scores, microbial estimation and storage studies under both ambient and refrigerated conditions, the suitable recipe was selected with the maximum acceptability. A gradual increase in total soluble solids (TSS), total sugar and reducing sugar, and a decrease in the ascorbic acid, pH, β-carotene and protein content of the processed products, were observed irrespective of storage condition. The results obtained from sensory evaluation and microbial status revealed that the palm spread and toffee remained acceptable for up to 9 months and 8 months, respectively, at ambient temperature. The income per rupee of investment for these two products was found to be remunerative. PMID:24741173

  17. Mass analysis addition to the Differential Ion Flux Probe (DIFP) study

    NASA Technical Reports Server (NTRS)

    Wright, K. H., Jr.; Jolley, Richard

    1994-01-01

    The objective of this study is to develop a technique to measure the characteristics of space plasmas under highly disturbed conditions; e.g., non-Maxwellian plasmas with strong drifting populations and plasmas contaminated by spacecraft outgassing. The approach, conducted in conjunction with current MSFC activities, is to extend the capabilities of the Differential Ion Flux Probe (DIFP) to include a high throughput mass measurement that does not require either high voltage or contamination sensitive devices such as channeltron electron multipliers or microchannel plates. This will significantly reduce the complexity and expense of instrument fabrication, testing, and integration of flight hardware compared to classical mass analyzers. The feasibility of the enhanced DIFP has been verified by using breadboard test models in a controlled plasma environment. The ability to manipulate particles through the instrument regardless of incident angle, energy, or ionic component has been amply demonstrated. The energy analysis mode is differential and leads directly to a time-of-flight mass measurement. With the new design, the DIFP will separate multiple ion streams and analyze each stream independently for ion flux intensity, velocity (including direction of motion), mass, and temperature (or energy distribution). In particular, such an instrument will be invaluable on follow-on electrodynamic TSS missions and, possibly, for environmental monitoring on the space station.

  18. Development of advanced seal verification

    NASA Technical Reports Server (NTRS)

    Workman, Gary L.; Kosten, Susan E.; Abushagur, Mustafa A.

    1992-01-01

    The purpose of this research is to develop a technique to monitor and ensure seal integrity with a sensor that has no active elements to burn out during a long-duration activity, such as a leakage test or especially during a mission in space. The original concept proposed is that, by implementing fiber optic sensors, changes in the integrity of a seal can be monitored in real time, and at no time should the optical fiber sensor fail. The electrical components that provide optical excitation and detection through the fiber are not part of the seal; hence, if these electrical components fail, they can be easily changed without breaking the seal. The optical connections required for the concept to work do present a functional problem to work out. The utility of the optical fiber sensor for seal monitoring should be general enough that the degradation of a seal can be determined before catastrophic failure occurs and appropriate action taken. Two parallel efforts were performed in determining the feasibility of using optical fiber sensors for seal verification. One study investigated interferometric measurements of the mechanical response of the optical fiber sensors to seal integrity. In a second study, the optical fiber was implemented in a typical vacuum chamber and feasibility studies on microbend experiments in the chamber were performed. Also, an attempt was made to quantify the amount of pressure actually being applied to the optical fiber using finite element analysis software by Algor.

  19. Additional weight load increases freezing of gait episodes in Parkinson's disease; an experimental study.

    PubMed

    Mensink, Senja H G; Nonnekes, Jorik; van Bon, Geert; Snijders, Anke H; Duysens, Jacques; Weerdesteyn, Vivian; Bloem, Bastiaan R; Oude Nijhuis, Lars B

    2014-05-01

    Freezing of gait is an episodic gait disorder, characterized by the inability to generate effective forward stepping movements. The pathophysiology underlying freezing of gait remains insufficiently understood, and this hampers the development of better treatment strategies. Preliminary evidence suggests that impaired force control during walking may contribute to freezing episodes, with difficulty unloading the swing leg and initiating the swing phase. Here, we used external loading to manipulate force control and to investigate its influence on freezing of gait. Twelve Parkinson's disease patients with freezing of gait performed three contrasting tasks: (1) loaded gait while wearing a belt fortified with lead weights; (2) weight-supported gait using a parachute harness connected to a rigid metal cable running above the gait trajectory; and (3) normal gait. Gait tasks, including rapid 360° turns, were used to provoke freezing episodes. Freezing episodes were quantified using blinded, videotaped clinical assessment. Furthermore, ground reaction forces and body kinematics were recorded. Loading significantly increased the mean number of freezing episodes per trial compared to the normal gait condition (P<0.05), but the effect of weight support was not consistent. Loading particularly increased the number of freezing episodes during rapid short steps. Step length was significantly smaller during loaded gait compared to normal gait (P<0.05), but changes in anticipatory postural adjustments did not differ. Our results may point to impaired force control playing a key role in freezing of gait. Future studies should further investigate the mechanism, i.e., the contribution of deficient load feedback, and evaluate which forms of weight support might offer treatment opportunities. PMID:24658705

  20. Neurobehavioral deficits in Persian Gulf veterans: additional evidence from a population-based study.

    PubMed

    Storzbach, D; Rohlman, D S; Anger, W K; Binder, L M; Campbell, K A

    2001-01-01

    Reports of low-concentration nerve gas exposures during the Gulf War (GW) have spurred concern about possible health consequences and symptoms reported by many returning veterans. The Portland Environmental Hazards Research Center is studying veterans from the northwest United States who report persistent, unexplained "Gulf War" symptoms (cases) and those who do not report such symptoms (controls). An epidemiological survey focused on exposures and symptoms was mailed to a random sample of GW veterans from Oregon and southwestern Washington. Volunteers recruited from survey respondents agreed to undergo a thorough medical examination and psychological and neurobehavioral assessment. Persistent symptoms with no medical explanation associated with Persian Gulf service (e.g., fatigue, muscle pain, memory deficits) beginning during or after the war qualified respondents as cases. The 239 cases with unexplained symptoms and the 112 controls without symptoms were administered a computerized assessment battery of 12 psychosocial and 6 neurobehavioral tests. Replicating and extending previous interim findings, a subgroup of veterans emerged from the initial analysis as extreme outliers, producing a visually and quantitatively obvious bimodal distribution. This led, as it had previously, to analyses of the outliers as a separate group (labeled "slow ODTP"), which confirmed the initial findings of neurobehavioral differences between the outliers and the other cases and controls, and provided more convincing evidence that the majority of cases who report neurobehavioral symptoms have no objective evidence of neurobehavioral deficits. However, the larger group of symptomatic veterans does show highly significant and compelling evidence of psychological distress based on scores from 11 separate psychological tests. Whereas the cases differed from the controls by poorer neurobehavioral test performance, extraction of the slow ODTP participants (almost all cases

  1. Industrial methodology for process verification in research (IMPROVER): toward systems biology verification

    PubMed Central

    Meyer, Pablo; Hoeng, Julia; Rice, J. Jeremy; Norel, Raquel; Sprengel, Jörg; Stolle, Katrin; Bonk, Thomas; Corthesy, Stephanie; Royyuru, Ajay; Peitsch, Manuel C.; Stolovitzky, Gustavo

    2012-01-01

    Motivation: Analyses and algorithmic predictions based on high-throughput data are essential for the success of systems biology in academic and industrial settings. Organizations, such as companies and academic consortia, conduct large multi-year scientific studies that entail the collection and analysis of thousands of individual experiments, often over many physical sites and with internal and outsourced components. To extract maximum value, the interested parties need to verify the accuracy and reproducibility of data and methods before the initiation of such large multi-year studies. However, systematic and well-established verification procedures do not exist for automated collection and analysis workflows in systems biology, which could lead to inaccurate conclusions. Results: We present here a review of the current state of systems biology verification and a detailed methodology to address its shortcomings. This methodology, named ‘Industrial Methodology for Process Verification in Research’ (IMPROVER), consists of evaluating a research program by dividing a workflow into smaller building blocks that are individually verified. The verification of each building block can be done internally by members of the research program or externally by ‘crowd-sourcing’ to an interested community. www.sbvimprover.com Implementation: This methodology could become the preferred choice to verify systems biology research workflows that are becoming increasingly complex and sophisticated in industrial and academic settings. Contact: gustavo@us.ibm.com PMID:22423044

  2. On code verification of RANS solvers

    NASA Astrophysics Data System (ADS)

    Eça, L.; Klaij, C. M.; Vaz, G.; Hoekstra, M.; Pereira, F. S.

    2016-04-01

    This article discusses Code Verification of Reynolds-Averaged Navier-Stokes (RANS) solvers that rely on face-based finite volume discretizations for volumes of arbitrary shape. The study includes test cases with known analytical solutions (generated with the method of manufactured solutions) corresponding to laminar and turbulent flow, with the latter using eddy-viscosity turbulence models. The procedure to perform Code Verification based on grid refinement studies is discussed, and the requirements for its correct application are illustrated in a simple one-dimensional problem. It is shown that geometrically similar grids are recommended for proper Code Verification, so that the data do not exhibit scatter, making least-squares fits unnecessary. Results show that it may be advantageous to determine the extrapolated error at cell size/time step zero instead of assuming that it is zero, especially when it is hard to determine the asymptotic order of grid convergence. In the RANS examples, several features of the ReFRESCO solver are checked, including the effects of the available turbulence models on the convergence properties of the code. It is shown that non-orthogonality effects must be accounted for in the discretization of the diffusion terms, and that the turbulence quantity transport equations can deteriorate the order of grid convergence of mean flow quantities.
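
    The grid-refinement procedure the abstract describes can be made concrete with a short sketch. Assuming three geometrically similar grids with a constant refinement ratio, the observed order of grid convergence and the extrapolated cell-size-zero value follow from standard Richardson extrapolation; the solution values below are hypothetical, not taken from the paper.

      import math

      def observed_order(f1, f2, f3, r=2.0):
          """Observed order of grid convergence from solutions on three
          geometrically similar grids (f1 finest), refinement ratio r."""
          return math.log((f3 - f2) / (f2 - f1)) / math.log(r)

      def extrapolate(f1, f2, r, p):
          """Richardson extrapolation to cell size zero."""
          return f1 + (f1 - f2) / (r**p - 1.0)

      # Hypothetical force coefficients, coarse (f3) to fine (f1):
      f3, f2, f1 = 0.02860, 0.02690, 0.02645
      p = observed_order(f1, f2, f3)       # ~1.9, near the nominal order 2
      print(p, extrapolate(f1, f2, 2.0, p))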

  3. A Pilot Study to Examine the Effect of Additional Structured Outdoor Playtime on Preschoolers' Physical Activity Levels

    ERIC Educational Resources Information Center

    Alhassan, Sofiya; Nwaokelemeh, Ogechi; Lyden, Kate; Goldsby, TaShauna; Mendoza, Albert

    2013-01-01

    The impact of additional structured outdoor playtime on preschoolers' physical activity (PA) level is unclear. The purpose of this pilot study was to explore the effects of increasing structured outdoor playtime on preschoolers' PA levels. Eight full-day classrooms (n = 134 children) from two preschool programmes were randomised into a treatment…

  4. STUDY OF THE EFFECT OF CHLORINE ADDITION ON MERCURY OXIDATION BY SCR CATALYST UNDER SIMULATED SUBBITUMINOUS COAL FLUE GAS

    EPA Science Inventory

    An entrained flow reactor is used to study the effect of addition of chlorine-containing species on the oxidation of elemental mercury (Hg⁰) by a selective catalytic reduction (SCR) catalyst in simulated subbituminous coal combustion flue gas. The combustion flue gas was doped wit...

  5. Requirements Verification Report AN Farm to 200E Waste Transfer System for Project W-314 Tank Farm Restoration and Safe Operations

    SciTech Connect

    MCGREW, D.L.

    1999-09-28

    This Requirements Verification Report (RVR) for the Project W-314 ''AN Farm to 200E Waste Transfer System'' package provides documented verification of design compliance with all applicable Project Development Specification (PDS) requirements. Additional PDS requirements verification will be performed during the project's procurement, construction, and testing phases, and the RVR will be updated to reflect this information as appropriate.

  6. FINAL REPORT – INDEPENDENT VERIFICATION SURVEY SUMMARY AND RESULTS FOR THE ARGONNE NATIONAL LABORATORY BUILDING 330 PROJECT FOOTPRINT, ARGONNE, ILLINOIS

    SciTech Connect

    ERIKA N. BAILEY

    2012-02-29

    ORISE conducted onsite verification activities of the Building 330 project footprint during the period of June 6 through June 7, 2011. The verification activities included technical reviews of project documents, visual inspections, radiation surface scans, and sampling and analysis. The draft verification report was issued in July 2011 with findings and recommendations. The contractor performed additional evaluations and remediation.

  7. Uncertainty Estimation in Intensity-Modulated Radiotherapy Absolute Dosimetry Verification

    SciTech Connect

    Sanchez-Doblado, Francisco, E-mail: paco@us.es; Hartmann, Guenther H.; Pena, Javier; Capote, Roberto; Paiusco, Marta; Rhein, Bernhard; Leal, Antonio; Lagares, Juan Ignacio

    2007-05-01

    Purpose: Intensity-modulated radiotherapy (IMRT) represents an important method for improving RT. The IMRT relative dosimetry checks are well established; however, open questions remain in reference dosimetry with ionization chambers (ICs). The main problem is the departure of the measurement conditions from the reference ones; thus, additional uncertainty is introduced into the dose determination. The goal of this study was to assess this effect systematically. Methods and Materials: Monte Carlo calculations and dosimetric measurements with five different detectors were performed for a number of representative IMRT cases, covering both step-and-shoot and dynamic delivery. Results: Using ICs with volumes of about 0.125 cm³ or less, good agreement was observed among the detectors in most of the situations studied. These results also agreed well with the Monte Carlo-calculated nonreference correction factors (c factors). Additionally, we found a general correlation between the IC position relative to a segment and the derived correction factor c, which can be used to estimate the expected overall uncertainty of the treatment. Conclusion: The increase of the reference dose relative standard uncertainty measured with ICs introduced by nonreference conditions when verifying an entire IMRT plan is about 1-1.5%, provided that appropriate small-volume chambers are used. The overall standard uncertainty of the measured IMRT dose amounts to about 2.3%, including the 0.5% of reproducibility and 1.5% of uncertainty associated with the beam calibration factor. Solid state detectors and large-volume chambers are not well suited to IMRT verification dosimetry because of the greater uncertainties. An action level of 5% is appropriate for IMRT verification. Greater discrepancies should lead to a review of the dosimetric procedure, including visual inspection of treatment segments and energy fluence.

  8. Survey of Product-line Verification and Validation Techniques

    NASA Technical Reports Server (NTRS)

    Lutz, Robyn

    2007-01-01

    This report presents the results from the first task of the SARP Center Initiative, 'Product Line Verification of Safety-Critical Software.' Task 1 is a literature survey of available techniques for product line verification and validation. Section 1 of the report provides an introduction to product lines and motivates the survey of verification techniques. It describes what is reused in product-line engineering and explains the goal of verifiable conformance of the developed system to its product-line specifications. Section 2 of the report describes six lifecycle steps in product-line verification and validation. This description is based on, and refers to, the best practices extracted from the readings. It ends with a list of verification challenges for NASA product lines (2.7) and verification enablers for NASA product lines (2.8) derived from the survey. Section 3 provides resource lists of related conferences, workshops, industrial and defense industry experiences and case studies of product lines, and academic/industrial consortiums. Section 4 is a bibliography of papers and tutorials with annotated entries for relevant papers not previously discussed in sections 2 or 3.

  9. Trajectory Based Behavior Analysis for User Verification

    NASA Astrophysics Data System (ADS)

    Pao, Hsing-Kuo; Lin, Hong-Yi; Chen, Kuan-Ta; Fadlil, Junaidillah

    Many of our activities on computers need a verification step for authorized access. The goal of verification is to tell apart the true account owner from intruders. We propose a general approach for user verification based on user trajectory inputs. The approach is labor-free for users and is likely to resist copying or simulation by unauthorized users or even automatic programs like bots. Our study focuses on finding the hidden patterns embedded in the trajectories produced by account users. We employ a Markov chain model with Gaussian distributions in its transitions to describe the behavior in the trajectory. To distinguish between two trajectories, we propose a novel dissimilarity measure combined with manifold-learned tuning to capture the pairwise relationship. Based on the pairwise relationship, we plug in any effective classification or clustering method for the detection of unauthorized access. The method can also be applied to the task of recognition, predicting the trajectory type without a pre-defined identity. Given a trajectory input, the results show that the proposed method can accurately verify the user identity, or suggest who owns the trajectory if the input identity is not provided.
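
    A heavily simplified sketch of this verification idea follows. The paper uses a Markov chain with Gaussian transitions plus a manifold-tuned dissimilarity measure; here a single Gaussian over successive displacements stands in for the transition model, and a plain log-likelihood threshold stands in for the classifier. All trajectories and thresholds are synthetic.

      import numpy as np
      from scipy.stats import multivariate_normal

      def fit_step_model(traj):
          """Fit a Gaussian over successive displacements of an enrollment
          trajectory (a one-state stand-in for Gaussian transitions)."""
          steps = np.diff(traj, axis=0)
          return steps.mean(axis=0), np.cov(steps.T)

      def verify(traj, mean, cov, threshold=-6.0):
          """Accept if the mean per-step log-likelihood under the account
          owner's model exceeds a tuned threshold."""
          steps = np.diff(traj, axis=0)
          score = multivariate_normal.logpdf(steps, mean, cov).mean()
          return score > threshold, score

      rng = np.random.default_rng(0)
      owner = np.cumsum(rng.normal([1.0, 0.2], 0.3, size=(200, 2)), axis=0)
      intruder = np.cumsum(rng.normal([0.0, 1.5], 1.0, size=(200, 2)), axis=0)
      mean, cov = fit_step_model(owner)
      print(verify(owner, mean, cov))      # high score -> accept
      print(verify(intruder, mean, cov))   # low score  -> reject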

  10. Verification in Referral-Based Crowdsourcing

    PubMed Central

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530
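
    The winning Red Balloon strategy mentioned above can be illustrated by its recursive incentive mechanism: the finder receives a fixed reward, and each person up the referral chain receives half of what the person they invited receives, so the total paid along any chain is bounded by twice the finder's reward. The $2000 figure follows the MIT team's published scheme; the code is only an illustrative sketch, not the paper's formal compensation model.

      def referral_payouts(chain, finder_reward=2000.0):
          """Halving payouts up the referral chain: the finder first,
          then each ancestor who forwarded the invitation."""
          payouts, reward = {}, finder_reward
          for person in chain:          # chain runs finder -> root recruiter
              payouts[person] = reward
              reward /= 2.0
          return payouts

      p = referral_payouts(["finder", "inviter", "grand-inviter", "root"])
      print(p)                # {'finder': 2000.0, 'inviter': 1000.0, ...}
      print(sum(p.values()))  # 3750.0 -- always below 2 * finder_reward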

  11. Verification in referral-based crowdsourcing.

    PubMed

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through "referral-based crowdsourcing": the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530

  12. Hard and Soft Safety Verifications

    NASA Technical Reports Server (NTRS)

    Wetherholt, Jon; Anderson, Brenda

    2012-01-01

    The purpose of this paper is to examine the differences between, and the effects of, hard and soft safety verifications. Initially, the terminology should be defined and clarified. A hard safety verification is a datum which demonstrates how a safety control is enacted. An example of this is relief valve testing. A soft safety verification is something usually described as nice to have, but not necessary to prove safe operation. An example of a soft verification is the loss of the Solid Rocket Booster (SRB) casings from Shuttle flight STS-4. When the main parachutes failed, the casings impacted the water and sank. In the nose cap of the SRBs, video cameras recorded the release of the parachutes to determine safe operation and to provide information for potential anomaly resolution. Generally, examination of the casings and nozzles contributed to understanding of the newly developed boosters and their operation. Safety verification of SRB operation was demonstrated by examination of the casings and nozzle for erosion or wear. Loss of the SRBs and associated data did not delay the launch of the next Shuttle flight.

  13. A study of internal structure in components made by additive manufacturing process using 3D X-ray tomography

    NASA Astrophysics Data System (ADS)

    Raguvarun, K.; Balasubramaniam, Krishnan; Rajagopal, Prabhu; Palanisamy, Suresh; Nagarajah, Romesh; Hoye, Nicholas; Curiri, Dominic; Kapoor, Ajay

    2015-03-01

    Additive manufacturing methods are gaining increasing popularity for rapidly and efficiently manufacturing parts and components in the industrial context, as well as for domestic applications. However, except when used for prototyping or rapid visualization of components, industries are concerned with the load carrying capacity and strength achievable by additive manufactured parts. In this paper, the wire-arc additive manufacturing (AM) process based on gas tungsten arc welding (GTAW) has been examined for the internal structure and constitution of components generated by the process. High-resolution 3D X-ray tomography is used to gain cut-views through wedge-shaped parts created using this GTAW additive manufacturing process with titanium alloy materials. In this work, two different control conditions for the GTAW process are considered. The studies reveal clusters of porosities, located in periodic spatial intervals along the sample cross-section. Such internal defects can have a detrimental effect on the strength of the resulting AM components, as shown in destructive testing studies. Closer examination of this phenomenon shows that defect clusters are preferentially located at GTAW traversal path intervals. These results highlight the strong need for enhanced control of process parameters in ensuring components with minimal defects and higher strength.
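
    Once a reconstructed CT volume is available, the porosity analysis described above can be approximated in a few lines: segment pores by intensity thresholding and label connected clusters. The sketch below runs on a synthetic volume with an invented threshold; real tomography data would first need noise and beam-hardening corrections.

      import numpy as np
      from scipy import ndimage

      # Hypothetical reconstructed CT volume: voxel intensity ~ density.
      rng = np.random.default_rng(1)
      volume = rng.normal(1.0, 0.05, size=(64, 64, 64))
      volume[30:33, 10:40, 20:23] = 0.1        # synthetic pore cluster

      pores = volume < 0.5                     # threshold well below alloy
      porosity = pores.mean()                  # void volume fraction

      # Label connected pore clusters, mirroring the study's visual survey.
      labels, n_clusters = ndimage.label(pores)
      sizes = ndimage.sum(pores, labels, range(1, n_clusters + 1))
      print(f"porosity {porosity:.4%}, {n_clusters} cluster(s), "
            f"largest {int(sizes.max())} voxels")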

  14. A study of internal structure in components made by additive manufacturing process using 3D X-ray tomography

    SciTech Connect

    Raguvarun, K.; Balasubramaniam, Krishnan; Rajagopal, Prabhu; Palanisamy, Suresh; Nagarajah, Romesh; Kapoor, Ajay; Hoye, Nicholas; Curiri, Dominic

    2015-03-31

    Additive manufacturing methods are gaining increasing popularity for rapidly and efficiently manufacturing parts and components in the industrial context, as well as for domestic applications. However, except when used for prototyping or rapid visualization of components, industries are concerned with the load carrying capacity and strength achievable by additive manufactured parts. In this paper, the wire-arc additive manufacturing (AM) process based on gas tungsten arc welding (GTAW) has been examined for the internal structure and constitution of components generated by the process. High-resolution 3D X-ray tomography is used to gain cut-views through wedge-shaped parts created using this GTAW additive manufacturing process with titanium alloy materials. In this work, two different control conditions for the GTAW process are considered. The studies reveal clusters of porosities, located in periodic spatial intervals along the sample cross-section. Such internal defects can have a detrimental effect on the strength of the resulting AM components, as shown in destructive testing studies. Closer examination of this phenomenon shows that defect clusters are preferentially located at GTAW traversal path intervals. These results highlight the strong need for enhanced control of process parameters in ensuring components with minimal defects and higher strength.

  15. Studies on the effect of plasticiser and addition of toluene diisocyanate at different temperatures in composite propellant formulations.

    PubMed

    Jawalkar, S N; Mehilal; Ramesh, K; Radhakrishnan, K K; Bhattacharya, B

    2009-05-30

    Different composite propellant mixtures were prepared using ammonium perchlorate, aluminium powder, and hydroxyl-terminated polybutadiene, varying the plasticiser percentage and the temperature at which toluene diisocyanate (TDI) was added, and their properties, such as viscosity build-up, mechanical and ballistic properties, and sensitivity, were studied. The data at different plasticiser levels indicate that decreasing the plasticiser content significantly increases end-of-mix viscosity, tensile strength, and modulus, while elongation decreases drastically. The sensitivity data for the studied mixtures reveal that sensitivity increases accordingly as the plasticiser percentage decreases. Further, the data on the effect of TDI addition at different temperatures (35-60 degrees C) indicate that increasing the addition temperature of TDI decreases the end-of-mix viscosity, from 800 Pa·s at 35 degrees C to 448 Pa·s at 60 degrees C. Moreover, no effect of higher-temperature TDI addition on mechanical or ballistic properties was observed. PMID:18835097

  16. The new geospatial tools: global transparency enhancing safeguards verification

    SciTech Connect

    Pabian, Frank Vincent

    2010-09-16

    This paper focuses on the importance and potential role of the new, freely available geospatial tools for enhancing IAEA safeguards and how, together with commercial satellite imagery, they can be used to promote 'all-source synergy'. As additional 'open sources', these new geospatial tools have heralded a new era of 'global transparency', and they can be used to substantially augment existing information-driven safeguards gathering techniques, procedures, and analyses in the remote detection of undeclared facilities, as well as to support ongoing monitoring and verification of various treaty-relevant (e.g., NPT, FMCT) activities and programs. As an illustration of how these new geospatial tools may be applied, an original exemplar case study shows how value-added follow-up information can be derived from recent public media reporting of a former clandestine underground plutonium production complex (now being converted to a 'tourist attraction', the site having been abandoned by China in the early 1980s). That open source media reporting, when combined with subsequent commentary found in various Internet-based blogs and wikis, led to independent verification of the reporting, with additional ground truth obtained via 'crowdsourcing' (tourist photos as found on 'social networking' venues like Google Earth's Panoramio layer and Twitter). Confirmation of the precise geospatial location of the site (along with a more complete facility characterization incorporating 3-D modeling and visualization) was only made possible following the acquisition of higher-resolution commercial satellite imagery that could be correlated with the reporting, ground photos, and an interior diagram, through original analysis of the overhead imagery.

  17. Optical security verification for blurred fingerprints

    NASA Astrophysics Data System (ADS)

    Soon, Boon Y.; Karim, Mohammad A.; Alam, Mohammad S.

    1998-12-01

    Optical fingerprint security verification is gaining popularity, as it has the potential to perform correlation at the speed of light. With advancement in optical security verification techniques, the authentication process can be made almost foolproof and reliable for financial transactions, banking, etc. In law enforcement, a fingerprint obtained from a crime scene may be blurred and thus an unhealthy candidate for correlation purposes. Therefore, the blurred fingerprint needs to be clarified before it is used in the correlation process. There are several different types of blur, such as linear motion blur and defocus blur induced by aberration of the imaging system. In addition, we may or may not know the blur function. In this paper, we propose non-singular inverse filtering in the frequency/power domain for deblurring known motion-induced blur in fingerprints. This filtering process is incorporated with the power spectrum subtraction technique, a uniqueness comparison scheme, and the separated target and reference planes method in the joint transform correlator. The proposed hardware implementation is a hybrid electronic-optical correlator system. The performance of the proposed system is verified with computer simulation for both cases: with and without additive random noise corruption.
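
    A minimal sketch of frequency-domain inverse filtering for a known linear-motion blur follows. A simple magnitude threshold on the blur transfer function stands in for the paper's non-singularity safeguard: frequency bins where the transfer function is near zero are discarded rather than amplified. The power-spectrum subtraction and joint-transform-correlator stages are not modeled, and a random test image stands in for a fingerprint.

      import numpy as np

      def motion_blur_kernel(shape, length=9):
          """Horizontal linear-motion PSF embedded in a full-size array."""
          h = np.zeros(shape)
          h[0, :length] = 1.0 / length
          return h

      def nonsingular_inverse_filter(blurred, psf, eps=1e-2):
          """Inverse filter in the frequency domain, zeroing bins where
          the blur transfer function is near-singular."""
          H, G = np.fft.fft2(psf), np.fft.fft2(blurred)
          keep = np.abs(H) > eps
          return np.real(np.fft.ifft2(
              np.where(keep, G / np.where(keep, H, 1.0), 0.0)))

      rng = np.random.default_rng(2)
      img = rng.random((64, 64))                 # stand-in for a fingerprint
      psf = motion_blur_kernel(img.shape)
      blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf)))
      print(np.abs(nonsingular_inverse_filter(blurred, psf) - img).max())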

  18. Practical aspects of dynamic verification of extensometers; Part 1 -- The concepts

    SciTech Connect

    Albright, F.J.; Annala, J.

    1994-01-01

    Material property studies frequently require the measurement of load and strain. Accurate measurement of both parameters is essential. Methods for accurate static calibration and verification of load transducers and extensometers are well established. More recently, standard practices have been developed for the dynamic calibration of load transducers. Still in its infancy is a standard method for dynamic verification of extensometers. Dynamic verification introduces a wide range of new issues. These encompass not only the transducer but also the conditioning electronics and actual test machine. Static calibration permits the elimination of nearly all dynamics, whereas dynamic verification must be done in the presence of these dynamic effects. This paper outlines the various concepts that need to be understood when performing the dynamic verification of an extensometer. Problems related to computer aided verification are emphasized, issues of aliasing and resolution in particular.
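
    Of the issues named above, aliasing is the easiest to demonstrate. In the sketch below, a 95 Hz tone sampled at 100 Hz is indistinguishable from a 5 Hz tone, which is exactly the trap a computer-aided dynamic verification can fall into when the sampling rate is not chosen against the transducer's dynamics. The numbers are illustrative only.

      import numpy as np

      fs, f_true = 100.0, 95.0          # sample rate and true tone, Hz
      t = np.arange(0, 1, 1 / fs)
      x = np.sin(2 * np.pi * f_true * t)

      # The spectral peak of the sampled record sits at the alias frequency.
      spectrum = np.abs(np.fft.rfft(x))
      f_apparent = np.fft.rfftfreq(x.size, 1 / fs)[spectrum.argmax()]
      print(f_apparent)                 # 5.0 Hz, not 95.0 Hz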

  19. Using a generalized additive model with autoregressive terms to study the effects of daily temperature on mortality

    PubMed Central

    2012-01-01

    Background Generalized Additive Model (GAM) provides a flexible and effective technique for modelling nonlinear time-series in studies of the health effects of environmental factors. However, GAM assumes that errors are mutually independent, while time series can be correlated in adjacent time points. Here, a GAM with Autoregressive terms (GAMAR) is introduced to fill this gap. Methods Parameters in GAMAR are estimated by maximum partial likelihood using modified Newton’s method, and the difference between GAM and GAMAR is demonstrated using two simulation studies and a real data example. GAMM is also compared to GAMAR in simulation study 1. Results In the simulation studies, the bias of the mean estimates from GAM and GAMAR are similar but GAMAR has better coverage and smaller relative error. While the results from GAMM are similar to GAMAR, the estimation procedure of GAMM is much slower than GAMAR. In the case study, the Pearson residuals from the GAM are correlated, while those from GAMAR are quite close to white noise. In addition, the estimates of the temperature effects are different between GAM and GAMAR. Conclusions GAMAR incorporates both explanatory variables and AR terms so it can quantify the nonlinear impact of environmental factors on health outcome as well as the serial correlation between the observations. It can be a useful tool in environmental epidemiological studies. PMID:23110601
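
    The failure mode GAMAR addresses is easy to reproduce. Below, data with a smooth trend plus AR(1) errors are fitted by a smooth-only model (a polynomial stands in for a spline basis); the residuals retain roughly the lag-1 autocorrelation of the noise, violating the independence assumption of a plain GAM, which is what the added AR terms are meant to absorb. All values are synthetic.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 1000
      x = np.linspace(0, 6, n)
      ar = np.zeros(n)
      for i in range(1, n):             # AR(1) errors with rho = 0.7
          ar[i] = 0.7 * ar[i - 1] + rng.normal(0, 0.5)
      y = np.sin(x) + 0.05 * x + ar     # smooth signal + correlated noise

      coef = np.polyfit(x, y, deg=7)    # smooth-only fit, no AR terms
      resid = y - np.polyval(coef, x)

      # Residual lag-1 autocorrelation stays near 0.7 instead of 0.
      r1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]
      print(round(r1, 2))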

  20. Microstructural Development and Technical Challenges in Laser Additive Manufacturing: Case Study with a 316L Industrial Part

    NASA Astrophysics Data System (ADS)

    Marya, Manuel; Singh, Virendra; Marya, Surendar; Hascoet, Jean Yves

    2015-08-01

    Additive manufacturing (AM) brings disruptive changes to the ways parts and products are designed, fabricated, tested, qualified, inspected, marketed, and sold. These changes introduce novel technical challenges and concerns arising from the maturity and diversity of today's AM processes, feedstock materials, and process parameter interactions. AM bears a resemblance to laser and electron beam welding in the so-called conduction mode, which involves a multitude of dynamic physical events between the projected feedstock and a moving heat source that eventually influence AM part properties. For this paper, an air vent was selected for its thin-walled, hollow, and variable cross section and its limited size. The studied air vents, randomly selected from a qualification batch, were fabricated out of 316L stainless steel using a 4 kW fiber laser powder-fed AM system, referred to as Construction Laser Additive Direct (CLAD). These were systematically characterized by microhardness indentation, visual examination, optical and scanning electron microscopy, and electron backscatter diffraction in order to determine AM part suitability for service and also to broadly discuss metallurgical phenomena. The paper then briefly expands the discussion to include additional engineering alloys and further analyze relationships between AM process parameters and AM part properties, consistently utilizing past experience with the same powder-fed CLAD 3D printer, the well-established science and technology of welding and joining, and recent publications on additive manufacturing.

  1. Space Telescope performance and verification

    NASA Technical Reports Server (NTRS)

    Wright, W. F.

    1980-01-01

    The verification philosophy for the Space Telescope (ST) has evolved from years of experience with multispacecraft programs modified by the new factors introduced by the Space Transportation System. At the systems level of test, the ST will undergo joint qualification/acceptance tests with environment simulation using Lockheed's large spacecraft test facilities. These tests continue the process of detecting workmanship defects and module interface incompatibilities. The test program culminates in an 'all up' ST environmental test verification program resulting in a 'ready to launch' ST.

  2. Formal verification of mathematical software

    NASA Technical Reports Server (NTRS)

    Sutherland, D.

    1984-01-01

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  3. Design, analysis, and test verification of advanced encapsulation systems. [low cost solar arrays

    NASA Technical Reports Server (NTRS)

    Garcia, A.; Minning, C.

    1981-01-01

    The construction of optical and electrical verification test coupons is detailed. Testing of these coupons was completed and the results are presented. Additionally, a thermal simulation of roof mounted array conditions was done and the results documented.

  4. Potlining Additives

    SciTech Connect

    Rudolf Keller

    2004-08-10

    In this project, a concept to improve the performance of aluminum production cells by introducing potlining additives was examined and tested. Boron oxide was added to cathode blocks, and titanium was dissolved in the metal pool; this resulted in the formation of titanium diboride and caused the molten aluminum to wet the carbonaceous cathode surface. Such wetting reportedly leads to operational improvements and extended cell life. In addition, boron oxide suppresses cyanide formation. This final report presents and discusses the results of this project. Substantial economic benefits for the practical implementation of the technology are projected, especially for modern cells with graphitized blocks. For example, with an energy savings of about 5% and an increase in pot life from 1500 to 2500 days, a cost savings of $ 0.023 per pound of aluminum produced is projected for a 200 kA pot.

  5. Phosphazene additives

    SciTech Connect

    Harrup, Mason K; Rollins, Harry W

    2013-11-26

    An additive comprising a phosphazene compound that has at least two reactive functional groups and at least one capping functional group bonded to phosphorus atoms of the phosphazene compound. One of the at least two reactive functional groups is configured to react with cellulose and the other is configured to react with a resin, such as an amine resin or a polycarboxylic acid resin. The at least one capping functional group is selected from the group consisting of a short chain ether group, an alkoxy group, or an aryloxy group. Also disclosed are an additive-resin admixture, a method of treating a wood product, and a wood product.

  6. Effect of One Percent Chlorhexidine Addition on the Antibacterial Activity and Mechanical Properties of Sealants: An in vitro Study

    PubMed Central

    Asokan, Sharath; John, J Baby; Priya, PR Geetha; Devi, Jagadeesan Gnana

    2015-01-01

    ABSTRACT Aim: The aim of the study was to evaluate the effect of addition of 1% chlorhexidine digluconate solution on the antibacterial activity and mechanical properties of glass ionomer and resin based sealant. Materials and methods: Conventional glass ionomer sealant (GIS) (Fuji VII, Japan) and resin sealant (Clinpro 3M ESPE, USA) were used in this study. Chlorhexidine digluconate (CHX) (20%) liquid was added to both the sealants, and the concentration of chlorhexidine in sealants was adjusted to 1%. The sealants were divided into four groups as: group A (GIS), group B (GIS + 1% CHX), group C (resin sealant), group D (resin sealant + 1% CHX). Five cylindrical specimens were prepared in each group. Their antibacterial activity against Streptococcus mutans and Lactobacillus acidophilus, and their mechanical properties (compressive strength and diametrical tensile strength) were assessed. Mann-Whitney and Wilcoxon signed rank test were used appropriately for statistical analysis (SPSS version 19). Result: Addition of one percent chlorhexidine significantly increased the antibacterial activity of both the sealants. There was a significant difference between groups A and B (p < 0.009), and groups C and D (p < 0.008). There was no significant difference in the mechanical properties of the sealants. Conclusion: Addition of one percent chlorhexidine to the glass ionomer and resin based sealants provided sufficient antibacterial activity, without significantly affecting the mechanical property of the sealants. How to cite this article: Shanmugaavel AK, Asokan S, John JB, Geetha Priya PR, Gnana Devi J. Effect of one percent Chlorhexidine Addition on the Antibacterial Activity and Mechanical Properties of Sealants: An in vitro Study. Int J Clin Pediatr Dent 2015;8(3):196-201. PMID:26628854

  7. A study on the effect of halloysite nanoparticle addition on the strength of glass fiber reinforced plastic

    NASA Astrophysics Data System (ADS)

    Kim, Yun-Hae; Park, Soo-Jeong; Lee, Jin-Woo; Moon, Kyung-Man

    2015-03-01

    Halloysite nanotubes (HNTs), which have been used in polymers, have been spotlighted as useful functional materials for improving mechanical properties. In the current study, we established an optimal nanoparticle dispersion; composites reinforced by HNTs were synthesized by dispersing HNTs in unsaturated polyester resin (UPR), and their mechanical characteristics, especially tensile strength and interlaminar shear strength, were analyzed. Additionally, the reinforcement effect and its variation according to the amount of HNTs were also studied.

  8. Delivery verification and dose reconstruction in tomotherapy

    NASA Astrophysics Data System (ADS)

    Kapatoes, Jeffrey Michael

    2000-11-01

    It has long been a desire in photon-beam radiation therapy to make use of the significant fraction of the beam exiting the patient to infer how much of the beam energy was actually deposited in the patient. With a linear accelerator and corresponding exit detector mounted on the same ring gantry, tomotherapy provides a unique opportunity to accomplish this. Dose reconstruction describes the process in which the full three-dimensional dose actually deposited in a patient is computed. Dose reconstruction requires two inputs: an image of the patient at the time of treatment and the actual energy fluence delivered. Dose is reconstructed by computing the dose in the CT with the verified energy fluence using any model-based algorithm such as convolution/superposition or Monte Carlo. In tomotherapy, the CT at the time of treatment is obtained by megavoltage CT, the merits of which have been studied and proven. The actual energy fluence delivered to the patient is computed in a process called delivery verification. Methods for delivery verification and dose reconstruction in tomotherapy were investigated in this work. It is shown that delivery verification can be realized by a linear model of the tomotherapy system. However, due to the measurements required with this initial approach, clinical implementation would be difficult. Therefore, a clinically viable method for delivery verification was established, the details of which are discussed. With the verified energy fluence from delivery verification, an assessment of the accuracy and usefulness of dose reconstruction is performed. The latter two topics are presented in the context of a generalized dose comparison tool developed for intensity modulated radiation therapy. Finally, the importance of having a CT from the time of treatment for reconstructing the dose is shown. This is currently a point of contention in modern clinical radiotherapy and it is proven that using the incorrect CT for dose reconstruction can lead
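
    The 'linear model of the tomotherapy system' mentioned above can be illustrated in miniature: if the exit-detector signal is approximately a linear map of the delivered energy fluence, delivery verification reduces to inverting that map. The sketch below substitutes a random, well-conditioned toy matrix for a measured system model; the verified fluence would then drive a convolution/superposition dose computation on the megavoltage CT.

      import numpy as np

      rng = np.random.default_rng(4)
      n_detectors, n_leaves = 40, 16
      A = rng.random((n_detectors, n_leaves))    # toy detector-response model
      w_true = rng.random(n_leaves)              # actual delivered fluence
      d = A @ w_true + rng.normal(0, 1e-3, n_detectors)  # noisy exit signal

      # Delivery verification: least-squares inversion of the linear model.
      w_hat, *_ = np.linalg.lstsq(A, d, rcond=None)
      print(np.abs(w_hat - w_true).max())        # fluence recovered to ~1e-3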

  9. A Technique for Verification of Isocenter Position in Tangential Field Breast Irradiation

    SciTech Connect

    Prabhakar, Ramachandran; Pande, Manish; Harsh, Kumar; Julka, Pramod K.; Ganesh, Tharmar; Rath, Goura K.

    2009-04-01

    Treatment verification and reproducibility of the breast treatment portals play a very important role in breast radiotherapy. We propose a simple technique to verify the planned isocenter position during treatment using an electronic portal imaging device. Ten patients were recruited in this study and computed tomography (CT)-based planning was performed with a conventional tangential field technique. For verification purposes, in addition to the standard medial (F1) and lateral (F2) tangential fields, a field (F3) perpendicular to the medial field was used for verification of the treatment portals. Lead markers were placed along the central axes of the two defined fields (F1 and F3) and the separation between the markers was measured on the portal images and verified against the marker separation on the digitally reconstructed radiographs (DRRs). Any deviation identifies a shift in the planned isocenter position during treatment. The average deviation observed between the markers measured from the DRR and portal image was 1.6 and 2.1 mm, with a standard deviation of 0.4 and 0.9 mm for fields F1 and F3, respectively. The maximum deviation observed was 3.0 mm for field F3. This technique will be very useful in patient setup for tangential breast radiotherapy.

  10. Aerospace Nickel-cadmium Cell Verification

    NASA Technical Reports Server (NTRS)

    Manzo, Michelle A.; Strawn, D. Michael; Hall, Stephen W.

    2001-01-01

    During the early years of satellites, NASA successfully flew "NASA-standard" nickel-cadmium (Ni-Cd) cells manufactured by GE/Gates/SAFT on a variety of spacecraft. In 1992 a NASA Battery Review Board determined that the strategy of a NASA Standard Cell and Battery Specification and the accompanying NASA control of a standard manufacturing control document (MCD) for Ni-Cd cells and batteries was unwarranted. As a result of that determination, standards were abandoned and the use of cells other than the NASA standard was required. In order to gain insight into the performance and characteristics of the various aerospace Ni-Cd products available, tasks were initiated within the NASA Aerospace Flight Battery Systems Program that involved the procurement and testing of representative aerospace Ni-Cd cell designs. A standard set of test conditions was established in order to provide similar information about the products from various vendors. The objective of this testing was to provide independent verification of representative commercial flight cells available in the marketplace today. This paper will provide a summary of the verification tests run on cells from various manufacturers: Sanyo 35 ampere-hour (Ah) standard and 35 Ah advanced Ni-Cd cells, SAFT 50 Ah Ni-Cd cells, and Eagle-Picher 21 Ah Magnum and 21 Ah Super Ni-Cd™ cells were put through a full evaluation. A limited number of 18 and 55 Ah cells from Acme Electric were also tested to provide an initial evaluation of the Acme aerospace cell designs. Additionally, 35 Ah aerospace-design Ni-MH cells from Sanyo were evaluated under the standard conditions established for this program. The test program is essentially complete. The cell design parameters, the verification test plan, and the details of the test results will be discussed.

  11. A verification system of RMAP protocol controller

    NASA Astrophysics Data System (ADS)

    Khanov, V. Kh; Shakhmatov, A. V.; Chekmarev, S. A.

    2015-01-01

    The functional verification problem for IP blocks of an RMAP protocol controller is considered. The application of a verification method using fully functional models of the processor and the internal bus of a system-on-chip is justified. Principles for constructing a verification system based on this approach are proposed. Practical results from creating a verification system for the IP block of the RMAP protocol controller are presented.

  12. A fundamental study of the oxidation behavior of SI primary reference fuels with propionaldehyde and DTBP as an additive

    NASA Astrophysics Data System (ADS)

    Johnson, Rodney

    In an effort to combine the benefits of SI and CI engines, Homogeneous Charge Compression Ignition (HCCI) engines are being developed. HCCI combustion is achieved by controlling the temperature, pressure, and composition of the fuel and air mixture so that autoignition occurs in proper phasing with the piston motion. This control system is fundamentally more challenging than using a spark plug or fuel injector to determine ignition timing as in SI and CI engines, respectively. As a result, this is a technical barrier that must be overcome to make HCCI engines applicable to a wide range of vehicles and viable for high volume production. One way to tailor the autoignition timing is to use small amounts of ignition enhancing additives. In this study, the effect of the addition of DTBP and propionaldehyde on the autoignition behavior of SI primary reference fuels was investigated. The present work was conducted in a new research facility built around a single cylinder Cooperative Fuels Research (CFR) octane rating engine but modified to run in HCCI mode. It focused on the effect of select oxygenated hydrocarbons on hydrocarbon fuel oxidation, specifically, the primary reference fuels n-heptane and iso-octane. This work was conducted under HCCI operating conditions. Previously, the operating parameters for this engine were validated for stable combustion under a wide range of operating parameters such as engine speeds, equivalence ratios, compression ratios and inlet manifold temperature. The stable operating range under these conditions was recorded and used for the present study. The major focus of this study was to examine the effect of the addition of DTBP or propionaldehyde on the oxidation behavior of SI primary reference fuels. Under every test condition the addition of the additives DTBP and propionaldehyde caused a change in fuel oxidation. DTBP always promoted fuel oxidation while propionaldehyde promoted oxidation for lower octane number fuels and delayed

  13. A numerical study of the influence of ammonia addition on the auto-ignition limits of methane/air mixtures.

    PubMed

    Van den Schoor, F; Norman, F; Vandebroek, L; Verplaetsen, F; Berghmans, J

    2009-05-30

    In this study the auto-ignition limit of ammonia/methane/air mixtures is calculated based upon a perfectly stirred reactor model with convective heat transfer. The results of four different reaction mechanisms are compared with existing experimental data at an initial temperature of 723 K with ammonia concentrations of 0-20 mol.% and methane concentrations of 2.5-10 mol.%. It is found that the calculation of the auto-ignition limit pressure at constant temperature leads to larger relative deviations between calculated and experimental results than the calculation of the auto-ignition temperature at constant pressure. In addition to the calculations, a reaction path analysis is performed to explain the observed lowering of the auto-ignition limit of methane/air mixtures by ammonia addition. It is found that this decrease is caused by the formation of NO and NO₂, which enhance the oxidation of methane at low temperatures. PMID:18926632
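
    The calculation workflow can be sketched with a reacting-mixture model. The code below assumes the Cantera toolkit with GRI-Mech 3.0 (whose NOx chemistry includes NH3) and computes an adiabatic constant-volume ignition delay for methane/air with ammonia added. It is only an idealized stand-in: the paper's auto-ignition limits come from a perfectly stirred reactor with convective heat transfer at 723 K, whereas a higher initial temperature is used here so that the demonstration ignites in a short simulated time.

      import cantera as ct

      def ignition_delay(x_nh3, T0=1000.0, p0=10.0 * ct.one_atm):
          """Adiabatic constant-volume ignition delay for CH4/air with
          NH3 added (Cantera renormalizes the mole fractions)."""
          gas = ct.Solution('gri30.yaml')
          gas.TPX = T0, p0, f'CH4:0.05, NH3:{x_nh3}, O2:0.21, N2:0.79'
          reactor = ct.IdealGasReactor(gas)
          net = ct.ReactorNet([reactor])
          while net.time < 10.0:
              net.step()
              if reactor.T > T0 + 400.0:   # temperature-rise criterion
                  return net.time
          return float('inf')              # no ignition within the window

      for x in (0.0, 0.02, 0.05):          # increasing ammonia fraction
          print(x, ignition_delay(x))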

  14. The influence of deposit control additives on nitrogen oxides emissions from spark ignition engines (case study: Tehran).

    PubMed

    Bidhendi, Gholamreza Nabi; Zand, Ali Daryabeigi; Tabrizi, Alireza Mikaeili; Pezeshk, Hamid; Baghvand, Akbar

    2007-04-15

    In the present research, the influence of a deposit control additive on NOx emissions from two types of gasoline engine vehicles, i.e., the Peykan (based on the Hillman) and the Pride (South Korea's Kia Motors), was studied. Exhaust NOx emissions were measured in two stages, before the decarbonization process and after it. Statistical analysis was conducted on the measurement results. Results showed that NOx emissions from Peykans increased 0.28% and NOx emissions from Pride automobiles decreased 6.18% on average, due to the elimination of engine deposits. The observed variations were not statistically or practically significant. The results indicated that making use of detergent additives is not an effective way to reduce exhaust NOx emissions from gasoline engine vehicles. PMID:19069943

  15. 78 FR 58492 - Generator Verification Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

    ... Energy Regulatory Commission 18 CFR Part 40 Generator Verification Reliability Standards AGENCY: Federal... Organization: MOD-025-2 (Verification and Data Reporting of Generator Real and Reactive Power Capability and Synchronous Condenser Reactive Power Capability), MOD-026-1 (Verification of Models and Data for...

  16. A mechanistic study of the addition of alcohol to a five-membered ring silene via a photochemical reaction.

    PubMed

    Su, Ming-Der

    2016-03-21

    The mechanism for the photochemical rearrangement of a cyclic divinyldisilane (1-Si) in its first excited state ((1)π → (1)π*) is determined using the CAS/6-311G(d) and MP2-CAS/6-311++G(3df,3pd) levels of theory. The photoproduct, a cyclic silene, reacts with various alcohols to yield a mixture of cis- and trans-adducts. The two reaction pathways are denoted the cis-addition path (path A) and the trans-addition path (path B). These model studies demonstrate that conical intersections play a crucial role in the photo-rearrangements of cyclic divinyldisilanes. The theoretical evidence also demonstrates that the addition of alcohol to a cyclic divinyldisilane follows the reaction path: cyclic divinyldisilane → Franck-Condon region → conical intersection → photoproduct (cyclic silene) → local intermediate (with alcohol) → transition state → cis- or trans-adduct. The theoretical studies demonstrate that steric effects as well as the concentration of CH3OH play a dominant role in determining the stereochemical yields of the final adducts. The same mechanism for the carbon derivative (1-C) is also considered in this work. However, the theoretical results indicate that 1-C does not undergo a methanol addition reaction via the photochemical reaction pathway, since the energy of its conical intersection (S1/S0-CI-C) is higher than that of its Franck-Condon point (FC-C). The reason for these phenomena could be that the atomic radius of carbon is much smaller than that of silicon (77 and 117 pm, respectively). As a result, the conformation of 1-C is more sterically congested than that of 1-Si along the 1,3-silyl-migration pathway. PMID:26928893

  17. Ozone Monitoring Instrument geolocation verification

    NASA Astrophysics Data System (ADS)

    Kroon, M.; Dobber, M. R.; Dirksen, R.; Veefkind, J. P.; van den Oord, G. H. J.; Levelt, P. F.

    2008-08-01

    Verification of the geolocation assigned to individual ground pixels as measured by the Ozone Monitoring Instrument (OMI) aboard the NASA EOS-Aura satellite was performed by comparing geophysical Earth surface details as observed in OMI false color images with the high-resolution continental outline vector map as provided by the Interactive Data Language (IDL) software tool from ITT Visual Information Solutions. The OMI false color images are generated from the OMI visible channel by integration over 20-nm-wide spectral bands of the Earth radiance intensity around 484 nm, 420 nm, and 360 nm wavelength per ground pixel. Proportional to the integrated intensity, we assign color values composed of CRT standard red, green, and blue to the OMI ground pixels. Earth surface details studied are mostly high-contrast coast lines where arid land or desert meets deep blue ocean. The IDL high-resolution vector map is based on the 1993 CIA World Database II Map with a 1-km accuracy. Our results indicate that the average OMI geolocation offset over the years 2005-2006 is 0.79 km in latitude and 0.29 km in longitude, with a standard deviation of 1.64 km in latitude and 2.04 km in longitude, respectively. Relative to the OMI nadir pixel size, one obtains mean displacements of ˜6.1% in latitude and ˜1.2% in longitude, with standard deviations of 12.6% and 7.9%, respectively. We conclude that the geolocation assigned to individual OMI ground pixels is sufficiently accurate to support scientific studies of atmospheric features as observed in OMI level 2 satellite data products, such as air quality issues on urban scales or volcanic eruptions and its plumes, that occur on spatial scales comparable to or smaller than OMI nadir pixels.
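
    The relative displacements quoted above can be reproduced arithmetically, assuming the commonly quoted 13 km x 24 km OMI nadir footprint (along-track by across-track); the exact pixel dimensions the authors used may differ slightly, which would account for the small mismatch in one value.

      # Offsets as fractions of an assumed 13 km x 24 km nadir pixel.
      ALONG_KM, ACROSS_KM = 13.0, 24.0

      for label, lat_km, lon_km in [("mean", 0.79, 0.29), ("std", 1.64, 2.04)]:
          print(label, f"{lat_km / ALONG_KM:.1%}", f"{lon_km / ACROSS_KM:.1%}")
      # mean 6.1% 1.2%  (matches the quoted ~6.1% and ~1.2%)
      # std 12.6% 8.5%  (close to the quoted 12.6% and 7.9%)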

  18. Towards the formal verification of the requirements and design of a processor interface unit: HOL listings

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.

    1993-01-01

    This technical report contains the Higher-Order Logic (HOL) listings of the partial verification of the requirements and design for a commercially developed processor interface unit (PIU). The PIU is an interface chip performing memory interface, bus interface, and additional support services for a commercial microprocessor within a fault tolerant computer system. This system, the Fault Tolerant Embedded Processor (FTEP), is targeted towards applications in avionics and space requiring extremely high levels of mission reliability, extended maintenance-free operation, or both. This report contains the actual HOL listings of the PIU verification as it currently exists. Section two of this report contains general-purpose HOL theories and definitions that support the PIU verification. These include arithmetic theories dealing with inequalities and associativity, and a collection of tactics used in the PIU proofs. Section three contains the HOL listings for the completed PIU design verification. Section 4 contains the HOL listings for the partial requirements verification of the P-Port.

  19. Mapping ¹⁵O Production Rate for Proton Therapy Verification

    SciTech Connect

    Grogg, Kira; Alpert, Nathaniel M.; Zhu, Xuping; Min, Chul Hee; Testa, Mauro; Winey, Brian; Normandin, Marc D.; Shih, Helen A.; Paganetti, Harald; Bortfeld, Thomas; El Fakhri, Georges

    2015-06-01

    Purpose: This work was a proof-of-principle study for the evaluation of oxygen-15 (¹⁵O) production as an imaging target through the use of positron emission tomography (PET), to improve verification of proton treatment plans and to study the effects of perfusion. Methods and Materials: Dynamic PET measurements of irradiation-produced isotopes were made for a phantom and rabbit thigh muscles. The rabbit muscle was irradiated and imaged under both live and dead conditions. A differential equation was fitted to phantom and in vivo data, yielding estimates of ¹⁵O production and clearance rates, which were compared to live versus dead rates for the rabbit and to Monte Carlo predictions. Results: PET clearance rates agreed with decay constants of the dominant radionuclide species in 3 different phantom materials. In 2 oxygen-rich materials, the ratio of ¹⁵O production rates agreed with the expected ratio. In the dead rabbit thighs, the dynamic PET concentration histories were accurately described using the ¹⁵O decay constant, whereas the live thigh activity decayed faster. Most importantly, the ¹⁵O production rates agreed within 2% (P>.5) between conditions. Conclusions: We developed a new method for quantitative measurement of ¹⁵O production and clearance rates in the period immediately following proton therapy. Measurements in the phantom and rabbits were well described in terms of ¹⁵O production and clearance rates, plus a correction for other isotopes. These proof-of-principle results support the feasibility of detailed verification of proton therapy treatment delivery. In addition, ¹⁵O clearance rates may be useful in monitoring permeability changes due to therapy.
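
    The fitting step described above can be sketched as a one-compartment model: production at a constant rate while the beam is on, followed by physical decay plus biological clearance. The model form, beam time, and time-activity points below are assumptions standing in for the paper's fitted differential equation and measured data.

      import numpy as np
      from scipy.optimize import curve_fit

      LAMBDA_O15 = np.log(2) / 122.24   # ¹⁵O physical decay constant, 1/s
      T_BEAM = 60.0                     # hypothetical irradiation time, s

      def activity(t, production, k_clear):
          """Solution of dC/dt = production - (lambda + k_clear) * C during
          the beam, free decay plus clearance afterwards."""
          lam = LAMBDA_O15 + k_clear
          rise = (production / lam) * (1 - np.exp(-lam * np.minimum(t, T_BEAM)))
          fall = np.exp(-lam * np.clip(t - T_BEAM, 0.0, None))
          return rise * fall

      t = np.linspace(0, 600, 61)
      rng = np.random.default_rng(5)
      data = activity(t, 50.0, 0.004) * rng.normal(1.0, 0.02, t.size)

      (p_hat, k_hat), _ = curve_fit(activity, t, data, p0=(10.0, 0.001))
      print(p_hat, k_hat)               # production and clearance recovered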

  20. Mapping 15O production rate for proton therapy verification

    PubMed Central

    Grogg, Kira; Alpert, Nathaniel M.; Zhu, Xuping; Min, Chul Hee; Testa, Mauro; Winey, Brian; Normandin, Marc D.; Shih, Helen A.; Paganetti, Harald; Bortfeld, Thomas; El Fakhri, Georges

    2015-01-01

    Purpose This is a proof-of-principle study for the evaluation of 15O production as an imaging target, through the use of positron emission tomography (PET), to improve verification of proton treatment plans and study the effects of perfusion. Methods and Materials Dynamic PET measurements of irradiation-produced isotopes were taken for a phantom and rabbit thigh muscles. The rabbit muscle was irradiated and imaged in both live and dead conditions. A differential equation was fitted to the phantom and the in vivo data, yielding estimates of the 15O production and clearance rates, which were compared for live versus dead conditions for the rabbit, and to Monte Carlo (MC) predictions. Results PET clearance rates agreed with the decay constants of the dominant radionuclide species in three different phantom materials. In two oxygen-rich materials, the ratio of 15O production rates agreed with the MC prediction. In the dead rabbit thighs, the dynamic PET concentration histories were accurately described using the 15O decay constant, while the live thigh activity decayed faster. Most importantly, the 15O production rates agreed within 2% (p > 0.5) between conditions. Conclusion We developed a new method for quantitative measurement of 15O production and clearance rates in the period immediately following proton therapy. Measurements in the phantom and rabbits were well described in terms of 15O production and clearance rates, plus a correction for other isotopes. These proof-of-principle results support the feasibility of detailed verification of proton therapy treatment delivery. In addition, 15O clearance rates may be useful in monitoring permeability changes due to therapy. PMID:25817530

  1. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-07-16

    This paper will explore the difficulties of deep reductions by examining the technical verification challenges. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking beyond New START, the next step will likely include warhead limits in the neighborhood of 1000 (Pifer 2010). Further reductions will include stepping stones at 100's of warheads and then 10's of warheads before final elimination of the last few remaining warheads and weapons could be considered. This paper will focus on three threshold reduction levels: 1000, 100's, and 10's of warheads. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain-of-custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national lab complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  2. Visual Attention During Sentence Verification.

    ERIC Educational Resources Information Center

    Lucas, Peter A.

    Eye movement data were collected for 28 college students reading 32 sentences with sentence verification questions. The factors observed were target sentence voice (active/passive), probe voice, and correct response (true/false). Pairs of subjects received the same set of stimuli, but with agents and objects in the sentences reversed. As expected,…

  3. Improved method for coliform verification.

    PubMed

    Diehl, J D

    1991-02-01

    Modification of a method for coliform verification presented in Standard Methods for the Examination of Water and Wastewater is described. Modification of the method, which is based on beta-galactosidase production, involves incorporation of a lactose operon inducer in medium upon which presumptive coliform isolates are cultured prior to beta-galactosidase assay. PMID:1901712

  5. A scheme for symmetrization verification

    NASA Astrophysics Data System (ADS)

    Sancho, Pedro

    2011-08-01

    We propose a scheme for symmetrization verification in two-particle systems, based on one-particle detection and state determination. In contrast to previous proposals, it does not follow a Hong-Ou-Mandel-type approach. Moreover, the technique can be used to generate superposition states of single particles.

  6. VERIFICATION OF WATER QUALITY MODELS

    EPA Science Inventory

    The basic concepts of water quality models are reviewed and the need to recognize calibration and verification of models with observed data is stressed. Post auditing of models after environmental control procedures are implemented is necessary to determine true model prediction ...

  7. Dual task performance with LPC (Linear Predictive Coding) degraded speech in a sentence verification task

    NASA Astrophysics Data System (ADS)

    Schmidt-Nielsen, Astrid; Kallman, Howard J.; Meijer, Corinne

    1989-10-01

    The results of a preliminary study on the effects of reduced speech intelligibility on dual task performance are reported. The speech task was a sentence verification task, and the speech degradation was accomplished using a narrowband digital voice transmission system operating with and without random bit errors. The second task was a visual picture sorting task. There was a dual task decrement on the sorting task, and in addition, there was a further decrease in sorts per minute as the speech was increasingly degraded. Reaction time for the speech task increased with the concurrent sorting task, but the dual task condition did not affect speech task error rates.

  8. Numerical study on the influence of hydrogen addition on soot formation in a laminar ethylene-air diffusion flame

    SciTech Connect

    Guo, Hongsheng; Liu, Fengshan; Smallwood, Gregory J.; Guelder, OEmer L.

    2006-04-15

    The influence of hydrogen addition to the fuel of an atmospheric-pressure coflow laminar ethylene-air diffusion flame on soot formation was studied by numerical simulation. A detailed gas-phase reaction mechanism, which includes aromatic chemistry up to four rings, and complex thermal and transport properties were used. The fully coupled elliptic governing equations were solved. The interactions between soot and gas-phase chemistry were taken into account. Radiation heat transfer from CO2, CO, H2O, and soot was calculated using the discrete-ordinates method coupled to a statistical narrow-band correlated-K based wide-band model. The predicted results were compared with the available experimental data and analyzed. The results indicate that the addition of hydrogen to the fuel in an ethylene-air diffusion flame suppresses soot formation through the effects of dilution and chemistry, in agreement with available experiments. The simulations further suggest that the chemically inhibiting effect of hydrogen addition on soot formation is due to the decrease of hydrogen atom concentration in soot surface growth regions and the higher concentration of molecular hydrogen in the lower flame region.

  9. Effects of the colour additive caramel colour III on the immune system: a study with human volunteers.

    PubMed

    Houben, G F; Abma, P M; van den Berg, H; van Dokkum, W; van Loveren, H; Penninks, A H; Seinen, W; Spanhaak, S; Vos, J G; Ockhuizen, T

    1992-09-01

    Administration of the colour additive Caramel Colour III to rats has been associated with decreased numbers of lymphocytes and several other changes in the immune system, as well as in immune function parameters, specifically in animals fed a diet with a relatively low vitamin B6 content. The effects are caused by the imidazole derivative 2-acetyl-4(5)-tetrahydroxybutylimidazole (THI). Caramel Colour III is commonly used in food products such as bakery products, soya-bean sauces, brown sauces, gravies, soup aromas, brown (dehydrated) soups, brown malt caramel blend for various applications, vinegars and beers, and effects in humans on dietary intake cannot be excluded. Elderly male volunteers with a marginal deficit in vitamin B6 were considered a relevant and potentially sensitive group to study possible effects of Caramel Colour III on blood lymphocyte numbers (total and within subsets) or on proliferative responses of lymphocytes to mitogenic stimulation. In addition, several other haematological parameters, as well as serum immunoglobulin levels and immunoglobulin production in vitro by pokeweed mitogen-stimulated mononuclear blood cells were studied. The results of this double-blind intervention study demonstrated that in a selected test group of apparently healthy elderly male volunteers with a biochemically marginally deficient vitamin B6 status, Caramel Colour III containing 23 (commercial sample) or 143 (research sample) ppm THI and administered at the level of the current acceptable daily intake of 200 mg/kg body weight/day for 7 days did not affect any of the factors investigated. PMID:1427513

  10. A study on the effect of the polymeric additive HPMC on morphology and polymorphism of ortho-aminobenzoic acid crystals

    NASA Astrophysics Data System (ADS)

    Simone, E.; Cenzato, M. V.; Nagy, Z. K.

    2016-07-01

    In the present study, the effect of hydroxypropyl methyl cellulose (HPMC) on the crystallization of ortho-aminobenzoic acid (OABA) was investigated by seeded and unseeded cooling crystallization experiments. The influence of HPMC on the induction time, the crystal shapes of Forms I and II of OABA, and the polymorphic transformation time was studied. Furthermore, the capability of HPMC to inhibit growth of Form I was evaluated quantitatively and modeled using population balance equations (PBE) solved with the method of moments. The additive was found to strongly inhibit nucleation and growth of Form I as well as to increase the time for the polymorphic transformation from Form II to Form I. The solvent was also found to influence the shape of Form I crystals at equal concentrations of HPMC. In situ process analytical technology (PAT) tools, including Raman spectroscopy, focused beam reflectance measurement (FBRM) and attenuated total reflectance (ATR) UV-vis spectroscopy, were used in combination with off-line techniques, such as optical microscopy, scanning electron microscopy (SEM), Raman spectroscopy, Malvern Mastersizer and differential scanning calorimetry (DSC), to study the crystals produced. The results illustrate how the shape, size and stability of the two polymorphs of OABA can be controlled and tailored using a polymeric additive.
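
    As background for the modeling approach named above, the textbook moment transformation of a one-dimensional population balance with size-independent growth rate G and nucleation rate B at negligible nucleus size reads as follows (our sketch of the standard method; the paper's actual kinetic expressions are not reproduced in this record):

    ```latex
    \frac{\partial n}{\partial t} + G\,\frac{\partial n}{\partial L} = B\,\delta(L),
    \qquad
    \mu_j(t) = \int_0^\infty L^j\, n(L,t)\,\mathrm{d}L,
    \qquad
    \frac{\mathrm{d}\mu_0}{\mathrm{d}t} = B,
    \qquad
    \frac{\mathrm{d}\mu_j}{\mathrm{d}t} = j\,G\,\mu_{j-1} \quad (j \ge 1).
    ```

    Growth inhibition by HPMC would then enter as a dependence of G on additive concentration.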

  11. Three new double-headed nucleotides with additional nucleobases connected to C-5 of pyrimidines; synthesis, duplex and triplex studies.

    PubMed

    Kumar, Pawan; Sharma, Pawan K; Hansen, Jonas; Jedinak, Lukas; Reslow-Jacobsen, Charlotte; Hornum, Mick; Nielsen, Poul

    2016-02-15

    In the search for double-coding DNA-systems, three new pyrimidine nucleosides, each coded with an additional nucleobase anchored to the major groove face, are synthesized. Two of these building blocks carry a thymine at the 5-position of 2'-deoxyuridine through a methylene linker and a triazolomethylene linker, respectively. The third building block carries an adenine at the 6-position of pyrrolo-2'-deoxycytidine through a methylene linker. These double-headed nucleosides are introduced into oligonucleotides and their effects on the thermal stabilities of duplexes are studied. All studied double-headed nucleotide monomers reduce the thermal stability of the modified duplexes, which is partially compensated by using consecutive incorporations of the modified monomers or by flanking the new double-headed analogs with members of our former series containing propyne linkers. Also their potential in triplex-forming oligonucleotides is studied for two of the new double-headed nucleotides as well as the series of analogs with propyne linkers. The most stable triplexes are obtained with single incorporations of additional pyrimidine nucleobases connected via the propyne linker. PMID:26778611

  12. A SEARCH FOR ADDITIONAL PLANETS IN FIVE OF THE EXOPLANETARY SYSTEMS STUDIED BY THE NASA EPOXI MISSION

    SciTech Connect

    Ballard, Sarah; Charbonneau, David; Holman, Matthew J.; Christiansen, Jessie L.; Deming, Drake; Barry, Richard K.; Kuchner, Marc J.; Livengood, Timothy A.; Hewagama, Tilak; Hampton, Don L.; Lisse, Carey M.; Seager, Sara; Veverka, Joseph F.

    2011-05-01

    We present time series photometry and constraints on additional planets in five of the exoplanetary systems studied by the EPOCh (Extrasolar Planet Observation and Characterization) component of the NASA EPOXI mission: HAT-P-4, TrES-3, TrES-2, WASP-3, and HAT-P-7. We conduct a search of the high-precision time series for photometric transits of additional planets. We find no candidate transits with significance higher than our detection limit. From Monte Carlo tests of the time series using putative periods from 0.5 days to 7 days, we demonstrate the sensitivity to detect Neptune-sized companions around TrES-2, sub-Saturn-sized companions in the HAT-P-4, TrES-3, and WASP-3 systems, and Saturn-sized companions around HAT-P-7. We investigate in particular our sensitivity to additional transits in the dynamically favorable 3:2 and 2:1 exterior resonances with the known exoplanets: if we assume coplanar orbits with the known planets, then companions in these resonances with HAT-P-4b, WASP-3b, and HAT-P-7b would be expected to transit, and we can set lower limits on the radii of companions in these systems. In the nearly grazing exoplanetary systems TrES-3 and TrES-2, additional coplanar planets in these resonances are not expected to transit. However, we place lower limits on the radii of companions that would transit if the orbits were misaligned by 2.0° and 1.4° for TrES-3 and TrES-2, respectively.

  13. CTBT integrated verification system evaluation model supplement

    SciTech Connect

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
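
    The abstract does not give IVSEM's actual rules for integrating the technology subsystems. As a heavily simplified illustration only, a common first-order model treats the subsystems as independent detectors, so the network detects an event if at least one technology does; the function name and probabilities below are hypothetical.

    ```python
    # Simplified illustration (ours), not IVSEM's actual combination logic.
    def combined_detection_probability(p_subsystems):
        """P(detect) for independent subsystems with probabilities p_i:
        the network misses only if every subsystem misses."""
        p_miss = 1.0
        for p in p_subsystems:
            p_miss *= 1.0 - p
        return 1.0 - p_miss

    # Hypothetical per-technology probabilities for one event scenario
    # (seismic, infrasound, radionuclide, hydroacoustic):
    print(combined_detection_probability([0.80, 0.30, 0.50, 0.10]))  # ~0.937
    ```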

  14. Verification and validation of control system software

    SciTech Connect

    Munro, J.K. Jr.; Kisner, R.A.; Bhadtt, S.C.

    1991-01-01

    The following guidelines are proposed for verification and validation (V&V) of nuclear power plant control system software: (a) use risk management to decide what and how much V&V is needed; (b) classify each software application using a scheme that reflects what type and how much V&V is needed; (c) maintain a set of reference documents with current information about each application; (d) use Program Inspection as the initial basic verification method; and (e) establish a deficiencies log for each software application. The following additional practices are strongly recommended: (a) use a computer-based configuration management system to track all aspects of development and maintenance; (b) establish reference baselines of the software, associated reference documents, and development tools at regular intervals during development; (c) use object-oriented design and programming to promote greater software reliability and reuse; (d) provide a copy of the software development environment as part of the package of deliverables; and (e) initiate an effort to use formal methods for preparation of Technical Specifications. The paper provides background information and reasons for the guidelines and recommendations. 3 figs., 3 tabs.

  15. Advanced verification methods for OVI security ink

    NASA Astrophysics Data System (ADS)

    Coombs, Paul G.; McCaffery, Shaun F.; Markantes, Tom

    2006-02-01

    OVI security ink, incorporating OVP security pigment microflakes, enjoys a history of effective document protection. This security feature provides not only first-line recognition by the person on the street, but also facilitates machine-readability. This paper explores the evolution of OVI reader technology from proof-of-concept to miniaturization. Three different instruments have been built to advance the technology of OVI machine verification. A bench-top unit has been constructed which allows users to automatically verify a multitude of different banknotes and OVI images. In addition, high-speed modules were fabricated and tested in a state-of-the-art banknote sorting machine. Both units demonstrate the ability of modern optical components to illuminate and collect light reflected from the interference platelets within OVI ink. Electronic hardware and software convert and process the optical information in milliseconds to accurately determine the authenticity of the security feature. Most recently, OVI ink verification hardware has been miniaturized and simplified, providing yet another platform for counterfeit protection. These latest devices provide a tool for store clerks and bank tellers to unambiguously determine the validity of banknotes in the time it takes the cash drawer to be opened.

  16. MACCS2 development and verification efforts

    SciTech Connect

    Young, M.; Chanin, D.

    1997-03-01

    MACCS2 represents a major enhancement of the capabilities of its predecessor MACCS, the MELCOR Accident Consequence Code System. MACCS, released in 1987, was developed to estimate the potential impacts to the surrounding public of severe accidents at nuclear power plants. The principal phenomena considered in MACCS/MACCS2 are atmospheric transport and deposition under time-variant meteorology, short-term and long-term mitigative actions and exposure pathways, deterministic and stochastic health effects, and economic costs. MACCS2 was developed as a general-purpose analytical tool applicable to diverse reactor and nonreactor facilities. The MACCS2 package includes three primary enhancements: (1) a more flexible emergency response model, (2) an expanded library of radionuclides, and (3) a semidynamic food-chain model. In addition, errors that had been identified in MACCS version 1.5.11.1 were corrected, including an error that prevented the code from providing intermediate-phase results. MACCS2 version 1.10 beta test was released to the beta-test group in May 1995. In addition, the University of New Mexico (UNM) has completed an independent verification study of the code package. Since the beta-test release of MACCS2 version 1.10, a number of minor errors have been identified and corrected, and a number of enhancements have been added to the code package. The code enhancements added since the beta-test release of version 1.10 include: (1) an option to allow the user to input the σy and σz plume expansion parameters in a table-lookup form for incremental downwind distances, (2) an option to define different initial dimensions for up to four segments of a release, (3) an enhancement to the COMIDA2 food-chain model preprocessor to allow the user to supply externally calculated tables of tritium food-chain dose per unit deposition on farmland to support analyses of tritium releases, and (4) the capability to calculate direction-dependent doses.
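
    A small sketch of how a table-lookup dispersion option like enhancement (1) can be used in practice (our illustration, not MACCS2 source code): linearly interpolate user-supplied σy/σz tables over downwind distance and evaluate a standard ground-level centerline Gaussian plume factor with ground reflection, chi/Q = exp(-H^2/(2 σz^2)) / (π u σy σz). All table values and parameters below are hypothetical.

    ```python
    # Illustrative sketch only; not MACCS2 code. Hypothetical dispersion tables.
    import numpy as np

    x_tab  = np.array([100.0, 500.0, 1000.0, 5000.0])  # downwind distance, m
    sy_tab = np.array([  8.0,  36.0,   68.0,  290.0])  # sigma_y, m
    sz_tab = np.array([  5.0,  19.0,   32.0,  110.0])  # sigma_z, m

    def chi_over_q(x, u=3.0, release_height=30.0):
        """Ground-level centerline chi/Q (s/m^3) at downwind distance x (m)."""
        sy = np.interp(x, x_tab, sy_tab)  # linear table lookup over
        sz = np.interp(x, x_tab, sz_tab)  # incremental downwind distances
        return np.exp(-release_height**2 / (2.0 * sz**2)) / (np.pi * u * sy * sz)

    print(chi_over_q(1000.0))  # ~3e-5 s/m^3 for these hypothetical inputs
    ```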

  17. A fast and simple procedure for determination of perfluoroalkyl substances in food and feed: a method verification by an interlaboratory study.

    PubMed

    Hrádková, P; Poustka, J; Pulkrabová, J; Hlousková, V; Kocourek, V; Llorca, M; Farré, M; Barceló, D; Hajslová, J

    2013-09-01

    In this study, a simple, fast, and cheap sample preparation procedure for the analysis of three well-known representatives of perfluoroalkyl substances (perfluorooctane sulfonate, perfluorooctanoic acid, and perfluorooctane sulfonamide) was validated in accordance with Commission Decision 2002/657/EC. The method was based on extraction with methanol followed by a dispersive solid phase extraction cleanup step by addition of activated charcoal for fish tissue, fish feed, and milk samples. The novel analytical approach combined with liquid chromatography-tandem mass spectrometry makes it possible to achieve limits of quantification below 1 μg/kg (defined by Commission Recommendation 2010/161/EU). This method provides a high laboratory sample throughput: ten samples in 60 min. The validated procedure was successfully verified in an interlaboratory study. PMID:23609786

  18. Can additional urban development have major impacts on streamflow of a peri-urban catchment? A case study from Portugal

    NASA Astrophysics Data System (ADS)

    Ferreira, Carla; Walsh, Rory; Nunes, João; Steenhuis, Tammo; de Lima, João; Coelho, Celeste; Ferreira, António

    2016-04-01

    It is well known that urban development brings about changes in hydrological response. Relatively little, however, is known about impacts on streamflow during urban development in the Mediterranean climate. This paper examines changes in streamflow resulting from the construction of an enterprise park, a major road and apartment blocks in a small partially urbanized peri-urban catchment (6.2 km2) in central Portugal. These developments led to an increase in urban area from 32% to 40% over a five-year period (hydrological years 2008/09-2012/13). In the initial two-year period, minor land-use changes increased impervious surfaces from 12.8% to 13.2%. The subsequent three-year period led to a further 17.2% increase in impervious area. Streamflow was recorded by a V-notch weir at the catchment outlet. Rainfall was recorded at a weather station 0.5 km north of the catchment, and by five tipping-bucket raingauges installed in January 2011 within the study catchment. Annual runoff and storm runoff coefficients ranged from 14% to 21% and 9% to 14%, respectively, recorded in 2011/12 and 2012/13. Although these differences in runoff were caused in part by variation in rainfall, the comparison between 2009/10 (pre-) and 2012/13 (post-additional urban development), with broadly similar rainfall (887 mm vs 947 mm, respectively) and evapotranspiration (740 mm vs 746 mm), showed a 43% increase in storm runoff (from 90 mm to 129 mm), resulting from additional overland flow generated largely by the 4.4% increase in impervious surfaces. The additional urban development also led to changes in hydrograph parameters. The increase in storm runoff was not progressive over the study period, but regression lines of storm runoff against rainstorm parameters exhibited higher vertical positions in 2012/13 than 2008/09. Increasing peak flows, however, were more progressive over the study period, with annual regression lines displaying higher vertical positions, but with a clear distance between pre

  19. A comparison of software verification techniques

    NASA Technical Reports Server (NTRS)

    1985-01-01

    A controlled experiment performed by the Software Engineering Laboratory (SEL) to compare the effectiveness of code reading, functional testing, and structural testing as software verification techniques is described. The experiment results indicate that code reading provides the greatest error detection capability at the lowest cost, whereas structural testing is the least effective technique. The experiment plan is explained, the experiment results are described, related results from other studies are discussed. The application of these results to the development of software in the flight dynamics environment is considered. Appendices summarize the experiment data and list the test programs.

  20. On-machine dimensional verification. Final report

    SciTech Connect

    Rendulic, W.

    1993-08-01

    General technology for automating in-process verification of machined products has been studied and implemented on a variety of machines and products at AlliedSignal Inc., Kansas City Division (KCD). Tests have been performed to establish system accuracy and probe reliability on two numerically controlled machining centers. Commercial software has been revised, and new cycles such as skew check and skew machining, have been developed to enhance and expand probing capabilities. Probe benefits have been demonstrated in the area of setup, cycle time, part quality, tooling cost, and product sampling.

  1. Imaging of prompt gamma rays emitted during delivery of clinical proton beams with a Compton camera: feasibility studies for range verification

    NASA Astrophysics Data System (ADS)

    Polf, Jerimy C.; Avery, Stephen; Mackin, Dennis S.; Beddar, Sam

    2015-09-01

    The purpose of this paper is to evaluate the ability of a prototype Compton camera (CC) to measure prompt gamma rays (PG) emitted during delivery of clinical proton pencil beams for prompt gamma imaging (PGI), as a means of providing in vivo verification of the delivered proton radiotherapy beams. A water phantom was irradiated with clinical 114 MeV and 150 MeV proton pencil beams. Up to 500 cGy of dose was delivered per irradiation using clinical beam currents. The prototype CC was placed 15 cm from the beam central axis, and PGs from 0.2 MeV up to 6.5 MeV were measured during irradiation. From the measured data, two-dimensional (2D) images of the PG emission were reconstructed, and one-dimensional (1D) profiles were extracted from the PG images and compared to measured depth-dose curves of the delivered proton pencil beams. The CC was able to measure PG emission during delivery of both 114 MeV and 150 MeV proton beams at clinical beam currents. 2D images of the PG emission were reconstructed for single 150 MeV proton pencil beams as well as for a 5 × 5 cm mono-energetic layer of 114 MeV pencil beams. Shifts in the Bragg peak (BP) range were detectable on the 2D images. 1D profiles extracted from the PG images show that the distal falloff of the PG emission profile lined up well with the distal BP falloff. Shifts as small as 3 mm in the beam range could be detected from the 1D PG profiles with an accuracy of 1.5 mm or better. However, with the current CC prototype, a dose of 400 cGy was required to acquire adequate PG signal for 2D PG image reconstruction. It was possible to measure PG interactions with our prototype CC during delivery of proton pencil beams at clinical dose rates. Images of the PG emission could be reconstructed, and shifts in the BP range were detectable. Therefore, PGI with a CC for in vivo range verification during proton treatment delivery is feasible. However, improvements in the prototype CC detection efficiency and reconstruction algorithms are necessary.
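
    The record does not state how the range shifts were extracted from the 1D profiles. One plausible sketch (ours, not the authors' stated method) is to locate the depth at which the distal edge of a profile falls to 50% of its maximum and difference that position between a reference and a measured delivery.

    ```python
    # Illustrative sketch (ours) of distal-falloff range-shift extraction.
    import numpy as np

    def distal_50_position(depth_mm, profile):
        """Depth (mm) where the profile first drops below half-maximum,
        searching on the distal side of the profile maximum."""
        profile = np.asarray(profile, dtype=float)
        i_max = int(np.argmax(profile))
        half = 0.5 * profile[i_max]
        for i in range(i_max, len(profile) - 1):
            if profile[i] >= half > profile[i + 1]:
                # linear interpolation between the bracketing samples
                frac = (profile[i] - half) / (profile[i] - profile[i + 1])
                return depth_mm[i] + frac * (depth_mm[i + 1] - depth_mm[i])
        return float("nan")

    # Range shift (mm) for hypothetical profiles on a common depth grid z:
    # shift = distal_50_position(z, measured) - distal_50_position(z, reference)
    ```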

  2. Study on the Reutilization of Clear Fracturing Flowback Fluids in Surfactant Flooding with Additives for Enhanced Oil Recovery (EOR)

    PubMed Central

    Dai, Caili; Wang, Kai; Liu, Yifei; Fang, Jichao; Zhao, Mingwei

    2014-01-01

    An investigation was conducted to study the reutilization of clear fracturing flowback fluids composed of viscoelastic surfactants (VES) with additives in surfactant flooding, making the process more efficient and cost-effective. The clear fracturing flowback fluids were used as a surfactant flooding system with the addition of α-olefin sulfonate (AOS) for enhanced oil recovery (EOR). The interfacial activity, emulsification activity and oil recovery capability of the recycling system were studied. The interfacial tension (IFT) between the recycling system and oil can be reduced by 2 orders of magnitude, to 10−3 mN/m, which satisfies the basic demand of surfactant flooding. The oil can be emulsified and dispersed more easily due to the synergetic effect of VES and AOS. The oil-wet surface of quartz can be easily converted to water-wet through adsorption of the surfactants (VES/AOS) on the surface. Thirteen core plug flooding tests were conducted to investigate the effects of AOS concentrations, slug sizes and slug types of the recycling system on the incremental oil recovery. The investigations show that reclaiming clear fracturing flowback fluids after the fracturing operation and reusing them in surfactant flooding may have less environmental impact and be more economical. PMID:25409507

  3. A DFT study of addition reaction between fragment ion (CH₂) units and fullerene (C₆₀) molecule.

    PubMed

    Zaragoza, Irineo Pedro; Vergara, Jaime; Pérez-Manríquez, Liliana; Salcedo, Roberto

    2011-05-01

    The theoretical study of the interaction between CH(2) and fullerene (C(60)) suggests the existence of an addition reaction mechanism; this feature is studied by applying an analysis of electronic properties. Several different effects are evident in this interaction as a consequence of the particular electronic transfer which occurs during the procedure. The addition or insertion of the methylene group results in a process where the inclusion of CH(2) into a fullerene bond produces several geometric deformations. A simulation of these procedures was carried out, taking advantage of the dynamic semi-classical Born-Oppenheimer approximation. Dynamic aspects were analyzed at different speeds for the interaction between the CH(2) group and the two bonds, CC (6,6) and CC (6,5), on the fullerene (C(60)) rings. All calculations involving electrons employed DFT with an exchange-correlation functional. The results indicate a tendency for the CH(2) fragment to attack the CC (6,5) bond. PMID:20658255

  4. Experimental study on the characteristics of ventilated cavitation around an underwater navigating body influenced by turbulent drag-reducing additives

    NASA Astrophysics Data System (ADS)

    Jiang, ChenXing; Li, FengChen

    2015-09-01

    In this study, a new control strategy for turbulent drag reduction involving ventilated cavitation is proposed. The configurational and hydrodynamic characteristics of ventilated cavities influenced by turbulent drag-reducing additives were experimentally studied in a water tunnel. The test model was fixed in the water tunnel by a strut at its aft part. Aqueous solutions of CTAC/NaSal (cetyltrimethyl ammonium chloride/sodium salicylate) with weight concentrations of 100, 200, 400 and 600 ppm (parts per million), respectively, were injected into the ventilated air cavity from the edge of the cavitator, with accurate control by an injection pump. The cavity configurations were recorded by a high-speed CCD camera. The hydrodynamic characteristics of the test model were measured by a six-component balance. Experimental results show that, within the presently tested cases, the lengths of the cavity influenced by the drag-reducing solution are smaller than under the normal condition (ventilated cavity in water), but the asymmetry of the cavity is reduced. The drag on the test model is reduced dramatically (the maximum drag reduction can reach 80%), and the re-entrant jet is more complex after the CTAC solution is injected into the cavity. Turbulent drag-reducing additives thus have potential for improving supercavity symmetry and achieving further drag reduction.

  5. [Study on the spectra of Au/CeO2 catalysts modified by La2O3 additive].

    PubMed

    Zhang, Qing; He, Zhen-Liang; Li, Jin-Wei; Zhan, Ying-Ying; Lin, Xing-Yi; Zheng, Qi

    2008-04-01

    For Au-ceria catalysts prepared by the deposition-precipitation method, the catalytic performance in the water gas shift (WGS) reaction was studied for different La loadings. Over the complete doping range, ceria retains its cubic fluorite structure. XRD, HRTEM and UV-Vis-DRS studies showed that La doping can improve the activity of the Au-ceria catalyst by stabilizing ceria and modifying its morphology. In addition, catalyst stability evaluation also proved that better stability of the Au-ceria catalyst can be realized by appropriate La doping. The Au/CL5.0 sample with 5 at.% La doping showed the best performance in the WGS reaction. PMID:18619331

  6. Feasibility Study on 3-D Printing of Metallic Structural Materials with Robotized Laser-Based Metal Additive Manufacturing

    NASA Astrophysics Data System (ADS)

    Ding, Yaoyu; Kovacevic, Radovan

    2016-05-01

    Metallic structural materials continue to open new avenues in achieving exotic mechanical properties that are naturally unavailable. They hold great potential in developing novel products in diverse industries such as the automotive, aerospace, biomedical, oil and gas, and defense. Currently, the use of metallic structural materials in industry is still limited because of difficulties in their manufacturing. This article studied the feasibility of printing metallic structural materials with robotized laser-based metal additive manufacturing (RLMAM). In this study, two metallic structural materials characterized by an enlarged positive Poisson's ratio and a negative Poisson's ratio were designed and simulated, respectively. An RLMAM system developed at the Research Center for Advanced Manufacturing of Southern Methodist University was used to print them. The results of the tensile tests indicated that the printed samples successfully achieved the corresponding mechanical properties.

  8. Hazard and risk assessment of a nanoparticulate cerium oxide-based diesel fuel additive - a case study.

    PubMed

    Park, Barry; Donaldson, Kenneth; Duffin, Rodger; Tran, Lang; Kelly, Frank; Mudway, Ian; Morin, Jean-Paul; Guest, Robert; Jenkinson, Peter; Samaras, Zissis; Giannouli, Myrsini; Kouridis, Haris; Martin, Patricia

    2008-04-01

    Envirox is a scientifically and commercially proven diesel fuel combustion catalyst based on nanoparticulate cerium oxide and has been demonstrated to reduce fuel consumption, greenhouse gas emissions (CO(2)), and particulate emissions when added to diesel at levels of 5 mg/L. Studies have confirmed the adverse effects of particulates on respiratory and cardiac health, and while the use of Envirox contributes to a reduction in the particulate content in the air, it is necessary to demonstrate that the addition of Envirox does not alter the intrinsic toxicity of particles emitted in the exhaust. The purpose of this study was to evaluate the safety in use of Envirox by addressing the classical risk paradigm. Hazard assessment has been addressed by examining a range of in vitro cell and cell-free endpoints to assess the toxicity of cerium oxide nanoparticles as well as particulates emitted from engines using Envirox. Exposure assessment has taken data from modeling studies and from airborne monitoring sites in London and Newcastle adjacent to routes where vehicles using Envirox passed. Data have demonstrated that for the exposure levels measured, the estimated internal dose for a referential human in a chronic exposure situation is much lower than the no-observed-effect level (NOEL) in the in vitro toxicity studies. Exposure to nano-size cerium oxide as a result of the addition of Envirox to diesel fuel at the current levels of exposure in ambient air is therefore unlikely to lead to pulmonary oxidative stress and inflammation, which are the precursors for respiratory and cardiac health problems. PMID:18444008

  9. In-situ study of the influence of additives on the growth behavior of copper electrodeposits on copper single crystal

    NASA Astrophysics Data System (ADS)

    Wu, Aiwen

    Trace organic additives are known to be essential in obtaining desired metal electrodeposits in the microelectronic industry; however, fundamental design principles for their use and a scientific understanding of their interactions during electrodeposition are lacking. In the present study we investigated electrodeposition of copper on the Cu(100) surface in air-saturated or deaerated acid-sulfate plating solutions containing several combinations of chloride and the additives benzotriazole (BTA) and 3-mercaptopropane sulfonic acid (MPSA) under galvanostatic pulse-current conditions. The electrodeposition process was followed using in-situ atomic force microscopy (AFM). AFM images were quantitatively analyzed by pattern-recognition and scaling procedures. In the absence of additives, copper deposits grew in a layer-by-layer mode from the earliest stage of deposition. The surface consisted of smooth terraces separated by steps. The scaling analysis result was consistent with a process dominated by surface diffusion and step growth. In chloride-containing solutions, square-pyramidal mounds were initiated and grew to cover the surface. Mound slope increased with deposition time, with no indication of reaching a steady-state value. This growth mode was consistent with a surface diffusion mechanism. The scaling result was similar to the additive-free system, but indicated that surface diffusion was more dominant in the presence of chloride. BTA passivated the surface and produced nucleation-limited growth at hemispheroidal centers whose height-to-base-radius aspect ratio increased linearly with deposition time. Nucleation and growth of three-dimensional nodules started randomly across the entire surface. The nodules were smaller in size than the mounds observed without BTA, and their number density was much higher than that of the mounds. The deposit growth was dominated by a roughening mechanism that can be described by the random roughening term of a stochastic model

  10. Verification of computational models of cardiac electro-physiology.

    PubMed

    Pathmanathan, Pras; Gray, Richard A

    2014-05-01

    For computational models of cardiac activity to be used in safety-critical clinical decision-making, thorough and rigorous testing of the accuracy of predictions is required. The field of 'verification, validation and uncertainty quantification' has been developed to evaluate the credibility of computational predictions. The first stage, verification, is the evaluation of how well computational software correctly solves the underlying mathematical equations. The aim of this paper is to introduce novel methods for verifying multi-cellular electro-physiological solvers, a crucial first stage for solvers to be used with confidence in clinical applications. We define 1D-3D model problems with exact solutions for each of the monodomain, bidomain, and bidomain-with-perfusing-bath formulations of cardiac electro-physiology, which allow for the first time the testing of cardiac solvers against exact errors on fully coupled problems in all dimensions. These problems are carefully constructed so that they can be easily run using a general solver and can be used to greatly increase confidence that an implementation is correct, which we illustrate by testing one major solver, 'Chaste', on the problems. We then perform case studies on calculation verification (also known as solution verification) for two specific applications. We conclude by making several recommendations regarding verification in cardiac modelling. PMID:24259465
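
    For orientation, the monodomain formulation named above has the standard textbook form (not quoted from the paper):

    ```latex
    \chi \left( C_m \frac{\partial V}{\partial t} + I_{\mathrm{ion}}(V,\mathbf{u}) \right)
      = \nabla \cdot \left( \boldsymbol{\sigma}\,\nabla V \right) + I_{\mathrm{stim}}
    ```

    Here χ is the surface-to-volume ratio, C_m the membrane capacitance, σ the effective conductivity tensor, and u the cell-model state variables. Code verification of the kind described then measures, for example, the relative L2 error between the numerical solution and a constructed exact V.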

  11. Fusion strategies for boosting cancelable online signature verification

    NASA Astrophysics Data System (ADS)

    Muramatsu, Daigo; Inuma, Manabu; Shikata, Junji; Otsuka, Akira

    2010-04-01

    Cancelable approaches for biometric person authentication have been studied to protect enrolled biometric data, and several algorithms have been proposed. One drawback of cancelable approaches is that the performance is inferior to that of non-cancelable approaches. As one solution, we proposed a scheme to enhance the performance of a cancelable approach for online signature verification by combining scores calculated from two transformed datasets generated using two keys. Generally, the same verification algorithm is used for transformed data as for raw (non-transformed) data in cancelable approaches, and, in our previous work, a verification system developed for a non-transformed dataset was used to calculate the scores from transformed data. In this paper, we modify the verification system by using transformed data for training. Several experiments were performed by using public databases, and the experimental results show that the modification of the verification system improved the performances. Our cancelable system combines two scores to make a decision. Several fusion strategies are also considered, and the experimental results are reported here.
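
    A minimal sketch of the score-level fusion idea described above (ours; the paper's exact fusion rules, weights, and thresholds are not given in this record): each key-transformed dataset yields a matcher score, and a weighted sum is one of the simplest fusion strategies.

    ```python
    # Illustrative sketch only; names, weights, and threshold are hypothetical.
    def fused_decision(score_key1, score_key2, w=0.5, threshold=0.7):
        """Accept the signature if the fused score reaches the threshold.
        Scores are assumed normalized to [0, 1]; w weights the first matcher."""
        fused = w * score_key1 + (1.0 - w) * score_key2
        return fused >= threshold

    print(fused_decision(0.82, 0.66))  # hypothetical scores -> True (0.74)
    ```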

  12. Evaluation and Verification of Decadal Predictions using the MiKlip Central Evaluation System - a Case Study using the MiKlip Prototype Model Data

    NASA Astrophysics Data System (ADS)

    Illing, Sebastian; Schuster, Mareike; Kadow, Christopher; Kröner, Igor; Richling, Andy; Grieger, Jens; Kruschke, Tim; Lang, Benjamin; Redl, Robert; Schartner, Thomas; Cubasch, Ulrich

    2016-04-01

    MiKlip is a project for medium-term climate prediction funded by the German Federal Ministry of Education and Research (BMBF); it aims to create a model system that can provide reliable decadal climate forecasts. During the first project phase of MiKlip, the sub-project INTEGRATION, located at Freie Universität Berlin, developed a framework for scientific infrastructures (FREVA). More information about FREVA can be found in EGU2016-13060. An instance of this framework is used as the Central Evaluation System (CES) during the MiKlip project. Throughout the first project phase, various sub-projects developed over 25 analysis tools - so-called plugins - for the CES. The main focus of these plugins is on the evaluation and verification of decadal climate prediction data, but most plugins are not limited to this scope and target a wide range of scientific questions. They range from preprocessing tools like the "LeadtimeSelector", which creates lead-time-dependent time series from decadal hindcast sets, through tracking tools like the "Zykpak" plugin, which can objectively locate and track mid-latitude cyclones, to plugins like "MurCSS" or "SPECS", which calculate deterministic and probabilistic skill metrics. We also integrated some analyses from the Model Evaluation Tools (MET) package, which was developed at NCAR. We will show the theoretical background, technical implementation strategies, and some interesting results of the evaluation of the MiKlip Prototype decadal prediction system for a selected set of these tools.
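
    As one concrete example of the deterministic skill metrics such plugins report, the mean squared error skill score is standard in decadal forecast verification (the formula is standard; the implementation and numbers below are ours, not taken from the MurCSS plugin itself).

    ```python
    # MSESS = 1 - MSE(hindcast, obs) / MSE(reference, obs); > 0 means skill
    # beyond the reference (often climatology or an uninitialized run).
    import numpy as np

    def msess(hindcast, reference, obs):
        mse_h = np.mean((np.asarray(hindcast) - np.asarray(obs)) ** 2)
        mse_r = np.mean((np.asarray(reference) - np.asarray(obs)) ** 2)
        return 1.0 - mse_h / mse_r

    # Hypothetical decadal-mean anomalies (hindcast, climatology, observations):
    print(msess([0.1, 0.3, 0.2], [0.0, 0.0, 0.0], [0.15, 0.25, 0.3]))  # ~0.91
    ```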

  13. Effectiveness of a pressurized stormwater filtration system in Green Bay, Wisconsin: a study for the environmental technology verification program of the U.S. Environmental Protection Agency

    USGS Publications Warehouse

    Horwatich, J.A.; Corsi, Steven R.; Bannerman, Roger T.

    2004-01-01

    A pressurized stormwater filtration system was installed in 1998 as a stormwater-treatment practice to treat runoff from a hospital rooftop and parking lot in Green Bay, Wisconsin. This type of filtration system has been installed in Florida citrus groves and sewage treatment plants around the United States; however, this installation is the first of its kind to be used to treat urban runoff and the first to be tested in Wisconsin. The U.S. Geological Survey (USGS) monitored the system between November 2000 and September 2002 to evaluate it as part of the U.S. Environmental Protection Agency's Environmental Technology Verification Program. Fifteen runoff events were monitored for flow and water quality at the inlet and outlet of the system, and comparison of the event mean concentrations and constituent loads was used to evaluate its effectiveness. Loads were decreased in all particulate-associated constituents monitored, including suspended solids (83 percent), suspended sediment (81 percent), total Kjeldahl nitrogen (26 percent), total phosphorus (54 percent), and total recoverable zinc (62 percent). Total dissolved solids, dissolved phosphorus, and nitrate plus nitrite loads remained similar or increased through the system. The increase in some constituents was most likely due to a ground-water contribution between runoff events. Sand/silt split analysis resulted in the median silt content of 78 percent at the inlet, 87 percent at the outlet, and 3 percent at the flow splitter.
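
    The percentage figures above are simple inlet/outlet load ratios; as a worked sketch (our arithmetic, with hypothetical loads):

    ```python
    # Percent load reduction through the system = (1 - load_out / load_in) * 100.
    def percent_reduction(load_in, load_out):
        return (1.0 - load_out / load_in) * 100.0

    # e.g., hypothetical suspended-solids loads (kg) summed over an event:
    print(f"{percent_reduction(100.0, 17.0):.0f}%")  # 83%, the reported figure
    ```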

  14. [Study on chromosomes aberration in wheat-rye disomics addition lines induced by the gametocidal chromosome 2C].

    PubMed

    Sun, Zhong-Ping; Wang, Zhan-Bin; Xu, Xiang-Ling; Li, Ji-Lin

    2004-11-01

    In the present study, Chinese Spring-Imperial (1R-7R) wheat-rye disomic addition lines were hybridized with Chinese Spring-2C (derived from Aegilops cylindrica) disomic addition lines. The F1 hybrids were examined by mitotic and meiotic analysis, and abnormal chromosome configurations were observed. A total of 430 F2 plants were obtained by self-pollination. Chromosome aberrations such as translocations, deletions, isobrachial and dicentric chromosomes were identified in individual F2 plants by C-banding combined with fluorescence in situ hybridization (FISH). Additionally, spontaneous chromosome substitutions, such as 2C substituting for wheat chromosomes 2A, 2B and 2D, were also observed. The pattern and frequency of chromosome aberration in F2 were as follows: 22 out of 430 F2 plants (5.11%) were found to involve aberrant rye chromosomes. Among them, 10 plants (2.3%) were identified as wheat-rye chromosome translocation lines, 12 (2.79%) carried rye chromosome deletions, and 3 (about 0.7%) carried isobrachial aberrations. Most of the translocation lines have a wheat centromere; only one has a rye centromere. Rye chromosome aberrations occurred unevenly among homoeologous groups: 5 in 1R, 3 in 2R, 1 in 3R, 3 in 4R, 6 in 5R and 4 in 6R. The majority of the translocation lines are terminal translocations. 54 out of the total 430 progenies are wheat deletions, distributed as 27 in the A genome, 20 in the B genome and 7 in the D genome. Finally, we discuss the possible cause of the uneven chromosome aberration among homoeologous groups in wheat and rye, as well as the effect characteristics of 2C on wheat and rye chromosomes. PMID:15651680

  15. Study of sorption of two sulfonylurea type of herbicides and their additives on soils and soil components.

    PubMed

    Földényi, Rita; Tóth, Zoltán; Samu, Gyöngyi; Érsek, Csaba

    2013-01-01

    The sorption of two sulfonylurea-type herbicides (chlorsulfuron: 1-(2-chlorophenylsulfonyl)-3-(4-methoxy-6-methyl-1,3,5-triazin-2-yl)urea; tribenuron methyl: methyl 2-[N-(4-methoxy-6-methyl-1,3,5-triazin-2-yl)-3-(methyl-ureido)-sulfonyl]benzoate) was studied on sand and chernozem soil adsorbents. Experimental results for solutions prepared from the pure active ingredients were compared to those for solutions prepared from the corresponding formulated commercial products. At small concentrations, the extent of adsorption of the active ingredient was higher from the pure-ingredient solutions than from the formulation-containing solutions. The environmental fate and effects of the forming agents are less investigated because they rarely have concentration limits recommended by authorities. Therefore, in addition to the adsorption of the active ingredients, the sorption behavior of a widely used additive, Supragil WP (sodium diisopropyl naphthalene sulphonate), was also studied. This dispersant is an anionic forming agent applied in many pesticide formulations. Using three different soils (sand, brown forest, chernozem) as adsorbents, two-step isotherms were obtained. The role of the soil organic matter (OM) was significant in the adsorption mechanism, because the adsorbed amounts of the dispersant correlated with the specific surface area as well as with the total organic carbon (TOC) content of the soils. The sorption behavior indicates the operation of a hydrophobic interaction mechanism between the soil OM and the dispersant. These results are supported by our further sorption experiments on clays, too. Zeta potential measurements seem to be promising for the interpretation of multi-step isotherms. The application of this technique proved that higher concentrations of the anionic forming agent assisted the peptization of soil organic matter (SOM), resulting in a stable colloidal solution dominated by negative charges. Since the pesticides investigated are also anionic at the studied pH (7 and 8.3) the dissolved organics lead to the

  16. Dosimetric verification of IMAT delivery with a conventional EPID system and a commercial portal dose image prediction tool

    SciTech Connect

    Iori, Mauro; Cagni, Elisabetta; Paiusco, Marta; Munro, Peter; Nahum, Alan E.

    2010-01-15

    Purpose: The electronic portal imaging device (EPID) is a system for checking the patient setup; as a result of its integration with the linear accelerator and software customized for dosimetry, it is increasingly used for verification of the delivery of fixed-field intensity-modulated radiation therapy (IMRT). In order to extend such an approach to intensity-modulated arc therapy (IMAT), the combined use of an EPID system and a portal dose image prediction (PDIP) tool has been investigated. Methods: The dosimetric behavior of an EPID system, mechanically reinforced to maintain its positional stability during the accelerator gantry rotation, has been studied to assess its ability to measure portal dose distributions for IMAT treatment beams. In addition, the PDIP tool of a commercial treatment planning system, commonly used for static IMRT dosimetry, has been validated for simulating the PDIs of IMAT treatment fields. The method has been applied to the delivery verification of 23 treatment fields that were measured in their dual mode of IMRT and IMAT modalities. Results: The EPID system has proved to be appropriate for measuring the PDIs of IMAT fields; additionally the PDIP tool was able to simulate these accurately. The results are quite similar to those obtained for static IMRT treatment verification, although it was necessary to investigate the dependence of the EPID signal and of the accelerator monitor chamber response on variable dose rate. Conclusions: Our initial tests indicate that the EPID system, together with the PDIP tool, is a suitable device for the verification of IMAT plan delivery; however, additional tests are necessary to confirm these results.

  17. Appendix: Conjectures concerning proof, design, and verification.

    SciTech Connect

    Wos, L.

    2000-05-31

    This article focuses on an esoteric but practical use of automated reasoning that may indeed be new to many, especially those concerned primarily with verification of both hardware and software. Specifically, featured are a discussion and some methodology for taking an existing design -- of a circuit, a chip, a program, or the like -- and refining and improving it in various ways. Although the methodology is general and does not require the use of a specific program, McCune's program OTTER does offer what is needed. OTTER has played and continues to play the key role in this research, and an interested person can gain access to the program in various ways, not the least of which is through the CD-ROM included in [3]. When success occurs, the result is a new design that may require fewer components, avoid the use of certain costly components, offer more reliability and ease of verification, and, perhaps most important, be more efficient in the contexts of speed and heat generation. Although the author has minimal experience in circuit design, circuit validation, program synthesis, program verification, and similar concerns, he presents (at the encouragement of colleagues, based on successes to be cited) materials that might indeed be of substantial interest to manufacturers and programmers. He writes this article prompted in part by the recent activities of chip designers, including Intel and AMD, that heavily emphasize the proving of theorems. As for research that appears relevant, the author has made an intense and most profitable study of finding proofs that are shorter [2,3], some that avoid the use of various types of term, some that are far less complex than previously known, and the like. Those results suggest a strong possible connection between more appealing proofs (in mathematics and in logic) and enhanced and improved design of both hardware and software. Here the author explores diverse conjectures that elucidate some of the

  18. Prefrontal cortex activity during motor tasks with additional mental load requiring attentional demand: a near-infrared spectroscopy study.

    PubMed

    Mandrick, Kevin; Derosiere, Gérard; Dray, Gérard; Coulon, Denis; Micallef, Jean-Paul; Perrey, Stéphane

    2013-07-01

    Functional near-infrared spectroscopy (fNIRS) is suitable for investigating cerebral oxygenation changes during motor and/or mental tasks. In the present study, we investigated how an additional mental load during a motor task at two submaximal loadings affects the fNIRS-measured brain activation over the right prefrontal cortex (PFC). Fifteen healthy males performed isometric grasping contractions at 15% and 30% of the maximal voluntary contraction (MVC) with or without an additional mental (i.e., arithmetic) task. Mental performance, force variability, fNIRS and subjective perception responses were measured in each condition. The performance of the mental task decreased significantly while the force variability increased significantly at 30% MVC as compared to 15% MVC, suggesting that performance of the dual task required more attentional resources. PFC activity increased significantly as the effort increased from 15% to 30% MVC (p<.001). Although a larger change in deoxyhemoglobin was observed in the dual-task conditions (p=.051), PFC activity did not change significantly compared to the motor tasks alone. In summary, participants were unable to invest additional attention and effort at the more difficult task levels in order to maintain adequate mental performance. PMID:23665138

  19. A Thrust and Impulse Study of Guanidinium Azo-Tetrazolate as an Additive for Hybrid Rocket Fuel

    NASA Astrophysics Data System (ADS)

    Patton, J.; Wright, A. M.; Dunn, L.; Alford, B.

    2000-03-01

    A thrust and impulse study of the hybrid rocket fuel additive Guanidinium Azo-Tetrazolate (GAT) was conducted at the University of Arkansas at Little Rock (UALR) Hybrid Rocket Facility. GAT is an organic salt with a high percentage of nitrogen. GAT was mixed with the standard hybrid rocket fuel, Hydroxyl-Terminated Polybutadiene (HTPB), at a concentration of 15% by mass. The fuel grains with the GAT additive were fired in 4-second runs at oxygen flows of 0.05, 0.07, 0.09, and 0.12 lbm/sec. For each run, the average thrust, total impulse, and specific impulse were measured. Average thrust, specific impulse, and total impulse vs. oxygen flow were plotted. Similar data were collected for plain HTPB/PAPI fuels for comparison. GAT was found to increase the thrust output when added to the standard hybrid rocket fuel, HTPB, and it also increased the total impulse during the run. The thrust and total impulse were increased at all flows, but especially at the lower oxygen flow rates. Specific impulse increased only for the lower oxygen flow runs, and decreased slightly for the higher oxygen flow runs.
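
    A sketch of the impulse quantities measured in such runs, using the standard definitions (the facility's actual data reduction is not shown in this record; all numbers below are hypothetical): total impulse is the integral of thrust over the burn, and specific impulse is Isp = I_total / (g0 × propellant mass consumed).

    ```python
    # Illustrative sketch only; thrust trace and propellant mass are hypothetical.
    import numpy as np

    G0 = 9.80665  # standard gravity, m/s^2

    def total_and_specific_impulse(t_s, thrust_N, propellant_kg):
        # trapezoidal integration of the thrust trace
        I_total = float(np.sum(0.5 * (thrust_N[1:] + thrust_N[:-1]) * np.diff(t_s)))
        return I_total, I_total / (G0 * propellant_kg)

    # Hypothetical 4-second run sampled at 10 Hz with roughly constant thrust:
    t = np.linspace(0.0, 4.0, 41)
    F = np.full_like(t, 220.0)  # N
    print(total_and_specific_impulse(t, F, 0.40))  # ~ (880 N*s, ~224 s)
    ```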

  20. Influence of CuO and ZnO addition on the multicomponent phosphate glasses: Spectroscopic studies

    NASA Astrophysics Data System (ADS)

    Szumera, Magdalena; Wacławska, Irena; Sułowska, Justyna

    2016-06-01

    The spectra of phosphate-silicate glasses from the P2O5-SiO2-K2O-MgO-CaO system modified with the addition of CuO or ZnO have been studied by means of FTIR, Raman and 31P MAS NMR spectroscopy. All glasses were synthesized by the conventional melt-quenching technique and their homogeneous chemical composition was controlled and confirmed. Using these techniques, structural units with various degrees of polymerization (Q3, Q2, Q1 and Q0) were shown to be present in the structure of the analyzed phosphate-silicate glasses. It was found that increasing contents of CuO or ZnO, introduced at the expense of decreasing amounts of CaO and MgO, have different influences on the phospho-oxygen network: copper ions cause its gradual polymerization, while zinc ions cause its depolymerization. At the same time, polymerization of the silico-oxygen subnetwork was found. Additionally, in the case of glasses containing increasing amounts of ZnO, a change in the role of zinc ions in the vitreous matrix was confirmed (from a modifier to a structure-forming component).

  1. Measurements for liquid rocket engine performance code verification

    NASA Technical Reports Server (NTRS)

    Praharaj, Sarat C.; Palko, Richard L.

    1986-01-01

    The goal of the rocket engine performance code verification tests is to obtain Isp with an uncertainty of 0.25% or less. This needs to be done during the sequence of four related tests (two reactive and two hot-gas simulation) to best utilize the loss-separation technique recommended in this study. In addition to Isp, measurements of the input and output parameters for the codes are needed. This study has shown two things regarding obtaining the Isp uncertainty within the 0.25% target. First, this target is generally not being realized at the present time; second, the instrumentation and testing technology does exist to meet the 0.25% uncertainty goal. Achieving this goal, however, will require carefully planned, designed, and conducted testing. In addition, the test-stand (or system) dynamics must be evaluated in the pre-test and post-test phases of the design of the experiment and the data analysis, respectively, always keeping in mind that a 0.25% overall uncertainty in Isp is targeted. A table gives the maximum allowable uncertainty required for obtaining Isp with 0.25% uncertainty, the currently quoted instrument specification, and the present test uncertainty for each parameter. In general, measurement of the mass flow parameter within the required uncertainty appears to be the most difficult.
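
    The 0.25% target is, in effect, a root-sum-square budget over the contributing measurements. Since Isp = F / (m_dot * g0) is a pure quotient, the relative uncertainties of thrust and mass flow add in quadrature. The sketch below uses assumed instrument uncertainties, not the values from the study's table, to show how quickly the mass-flow term dominates the budget.

        # Root-sum-square propagation of relative uncertainties into Isp.
        import math

        u_thrust = 0.0010    # 0.10% relative uncertainty in thrust (assumed)
        u_mdot   = 0.0020    # 0.20% relative uncertainty in mass flow (assumed)

        u_isp = math.hypot(u_thrust, u_mdot)   # quadrature sum for F / m_dot
        print(f"Isp uncertainty: {100 * u_isp:.3f}%  (target: 0.250%)")
        # -> 0.224%, just inside the target; at u_mdot = 0.25% the budget is
        #    already blown, which is why mass flow is flagged as the hardest
        #    measurement.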

  2. Software verification plan for GCS. [guidance and control software

    NASA Technical Reports Server (NTRS)

    Dent, Leslie A.; Shagnea, Anita M.; Hayhurst, Kelly J.

    1990-01-01

    This verification plan is written as part of an experiment designed to study the fundamental characteristics of the software failure process. The experiment will be conducted using several implementations of software that were produced according to industry-standard guidelines, namely the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, 'Software Considerations in Airborne Systems and Equipment Certification,' for the development of flight software. This plan fulfills the DO-178A requirements for providing instructions on the testing of each implementation of software. The plan details the verification activities to be performed at each phase in the development process, contains a step-by-step description of the testing procedures, and discusses all of the tools used throughout the verification process.

  3. Correction, improvement and model verification of CARE 3, version 3

    NASA Technical Reports Server (NTRS)

    Rose, D. M.; Manke, J. W.; Altschul, R. E.; Nelson, D. L.

    1987-01-01

    An independent verification of the CARE 3 mathematical model and computer code was conducted and reported in NASA Contractor Report 166096, 'Review and Verification of CARE 3 Mathematical Model and Code: Interim Report.' The study uncovered some implementation errors that were corrected and are reported in this document. The corrected CARE 3 program is called version 4. The document 'Correction, Improvement, and Model Verification of CARE 3, Version 3' was written in April 1984. It is being published now because it has been determined to contain a more accurate representation of CARE 3 than the preceding document of April 1983. This edition supersedes NASA-CR-166122, 'Correction and Improvement of CARE 3, Version 3,' April 1983.

  4. Linearity and additivity in cluster-induced sputtering: A molecular-dynamics study of van der Waals bonded systems

    SciTech Connect

    Anders, Christian; Urbassek, Herbert M.; Johnson, Robert E.

    2004-10-15

    Using molecular-dynamics simulation, we study sputtering of a condensed-gas solid induced by the impact of atomic clusters with sizes 1 <= n <= 10^4. Above a nonlinear onset regime, we find a linear increase of the sputter yield Y with the total energy E of the bombarding cluster. The fitting coefficients in the linear regime depend only on the cluster size n, such that for fixed bombardment energy, sputtering decreases with increasing cluster size n. We find that to a good approximation the sputter yield in this regime obeys an additivity rule in cluster size n, such that doubling the cluster size at the same cluster velocity amounts to doubling the sputter yield. The sputter-limiting energy ε_s is introduced, which separates erosion (ε > ε_s) from growth (ε < ε_s) under cluster impact.
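
    The additivity rule is easy to see under the linear law the authors report. In the sketch below the coefficient and limiting energy are invented, not fitted to the paper's data; doubling n and E together (i.e., the same velocity, twice the cluster size) doubles the yield.

        # Toy check of the additivity rule under an assumed linear yield law
        # Y(n, E) = a * (E - n * eps_s) above onset (coefficients invented).
        a, eps_s = 0.8, 10.0          # yield per eV and limiting energy per atom

        def Y(n, E):
            return max(0.0, a * (E - n * eps_s))

        n, E_per_atom = 100, 50.0     # 100-atom cluster at 50 eV/atom
        y1 = Y(n, n * E_per_atom)
        y2 = Y(2 * n, 2 * n * E_per_atom)   # same velocity, twice the size
        print(y1, y2, y2 / y1)        # -> 3200.0 6400.0 2.0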

  5. Study on the performance of polycarboxylate-based superplasticizers synthesized by reversible addition-fragmentation chain transfer (RAFT) polymerization

    NASA Astrophysics Data System (ADS)

    Yu, Binbin; Zeng, Zhong; Ren, Qinyu; Chen, Yang; Liang, Mei; Zou, Huawei

    2016-09-01

    A series of block-type polycarboxylate-based superplasticizers (PCs) with different molecular architectures were synthesized from the macromonomer butenyl alkylene polyoxyethylene-polyoxypropylene ether (BAPP) and acrylic acid (AA) by reversible addition-fragmentation chain transfer (RAFT) polymerization. Fourier-transform infrared (FTIR) spectroscopy and dynamic light scattering (DLS) were applied to investigate the PCs' molecular structure. The dispersion capacity of the PCs in cement was also measured, and the results showed that the polycarboxylic dispersing agents prepared by this method are suitable for Portland cement. It was found that the PCs affect the hydration process by retarding the formation of ettringite in the hydrated product. X-ray diffraction (XRD), scanning electron microscopy (SEM) and compressive strength measurements of the hydration products all supported this conclusion.

  6. Study on Antiwear and Repairing Performances about Mass of Nano-copper Lubricating Additives to 45 Steel

    NASA Astrophysics Data System (ADS)

    Wang, X. L.; Yin, Y. L.; Zhang, G. N.; Wang, W. Y.; Zhao, K. K.

    Nano-copper commonly serves as a lubricating additive in the tribology field. The antiwear and friction-reducing performance of a base lubricating oil, with and without nano-copper at various mass fractions, was tested on a friction and wear test machine. The morphologies and the main elements of the worn surfaces were analyzed by SEM. The results indicated that nano-copper can improve the tribological performance of the base oil: compared with the base oil, at a nano-copper mass fraction of 0.15% the friction coefficient and the wear-track width were reduced by 34% and 32%, respectively. Nano-copper forms a self-repairing film in the oil which effectively separates the friction surfaces during sliding. Nano-copper therefore offers excellent antiwear, friction-reducing and self-repairing performance. The mechanism of action of the Cu nanoparticles is also studied in the paper.

  7. NEXT Thruster Component Verification Testing

    NASA Technical Reports Server (NTRS)

    Pinero, Luis R.; Sovey, James S.

    2007-01-01

    Component testing is a critical part of thruster life validation activities under NASA's Evolutionary Xenon Thruster (NEXT) project. The high-voltage propellant isolators were selected for design verification testing. Even though they are based on a heritage design, design changes were made because the isolators will be operated under different environmental conditions, including temperature, voltage, and pressure. The life test of two NEXT isolators was therefore initiated and has accumulated more than 10,000 hr of operation. Measurements to date indicate only a negligibly small increase in leakage current. The cathode heaters were also selected for verification testing. The technology to fabricate these heaters, developed for the International Space Station plasma contactor hollow cathode assembly, was transferred to Aerojet for the fabrication of the NEXT prototype model ion thrusters. Testing the contractor-fabricated heaters is necessary to validate fabrication processes for high-reliability heaters. This paper documents the status of the propellant isolator and cathode heater tests.

  8. MOV reliability evaluation and periodic verification scheduling

    SciTech Connect

    Bunte, B.D.

    1996-12-01

    The purpose of this paper is to establish a periodic verification testing schedule based on the expected long-term reliability of gate or globe motor-operated valves (MOVs). The methodology in this position paper determines the nominal (best estimate) design margin for any MOV based on the best available information pertaining to the MOV's design requirements, design parameters, existing hardware design, and present setup. The uncertainty in this margin is then determined using statistical means. By comparing the nominal margin to its uncertainty, the reliability of the MOV is estimated. The methodology is appropriate for evaluating the reliability of MOVs in the GL 89-10 program. It may be used following periodic testing to evaluate and trend MOV performance and reliability. It may also be used to evaluate the impact of proposed modifications and maintenance activities such as packing adjustments. In addition, it may be used to assess the impact of new information of a generic nature that affects safety-related MOVs.
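
    One simple reading of the margin-versus-uncertainty comparison, assuming the margin is treated as normally distributed (the paper's actual statistical treatment may differ), is sketched below: reliability is estimated as the probability that the margin remains positive.

        # Margin-vs-uncertainty sketch: if the design margin M is modeled as
        # normal with best-estimate mean mu and combined uncertainty sigma,
        # the reliability estimate is P(M > 0). Numbers are illustrative only.
        from statistics import NormalDist

        mu_margin = 0.15   # nominal margin, as a fraction of required thrust
        sigma     = 0.06   # combined (statistical) uncertainty of the margin

        reliability = 1.0 - NormalDist(mu_margin, sigma).cdf(0.0)
        print(f"P(margin > 0) = {reliability:.4f}")   # -> 0.9938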

  9. Understanding correlation coefficients in treaty verification

    SciTech Connect

    DeVolpi, A.

    1991-11-01

    When a pair of images is compared on a point-by-point basis, the linear-correlation coefficient is usually used as a measure of similarity or dissimilarity. This paper evaluates the theoretical underpinnings and limitations of the linear-correlation coefficient, as well as other related statistics, particularly for cases where inherent white noise is present. Because of the limitations of linear correlation, an additional step, local-sum clustering, has been derived to improve recognition of small dissimilarities in a pair of images. The results support a three-stage procedure: first establishing congruence of the two images, then using the linear-correlation coefficient as a test for true negatives, and finally qualifying true positives by the cluster (local-sum) method. These algorithmic stages would be especially useful in arms control treaty verification.
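
    A minimal sketch of the two statistics follows. The Pearson coefficient is standard; the box-sum implementation of local-sum clustering is an assumption made for illustration, not necessarily DeVolpi's published algorithm. Note how a small localized change barely moves the global coefficient but stands out in the local sums.

        # Point-by-point linear correlation plus a local-sum step on the
        # difference image (box-sum reading of "local-sum clustering").
        import numpy as np

        def linear_correlation(a, b):
            """Pearson correlation of two equally shaped images."""
            return np.corrcoef(a.ravel(), b.ravel())[0, 1]

        def local_sums(diff, k=5):
            """Sum |difference| over k x k neighborhoods; large values flag
            small localized dissimilarities the global coefficient hides."""
            h, w = diff.shape
            d = np.abs(diff)
            out = np.zeros((h - k + 1, w - k + 1))
            for i in range(out.shape[0]):
                for j in range(out.shape[1]):
                    out[i, j] = d[i:i + k, j:j + k].sum()
            return out

        rng = np.random.default_rng(0)
        img1 = rng.normal(size=(64, 64))
        img2 = img1 + rng.normal(scale=0.1, size=img1.shape)  # white noise
        img2[30:34, 30:34] += 2.0                             # small local change

        print(f"r = {linear_correlation(img1, img2):.3f}")    # still near 1
        print(f"max local sum = {local_sums(img2 - img1).max():.1f}")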

  10. Multimodal Speaker Verification Based on Electroglottograph Signal and Glottal Activity Detection

    NASA Astrophysics Data System (ADS)

    Ćirović, Zoran; Milosavljević, Milan; Banjac, Zoran

    2010-12-01

    To achieve robust speaker verification, we propose a multimodal method that includes additional non-audio features and a glottal activity detector. An electroglottograph (EGG) is used as the non-audio sensor, and parameters of the EGG signal are used to augment the conventional audio feature vector. The EGG parameterization algorithm is based on the shape of the idealized waveform and on the glottal activity detector. We compare our algorithm with a conventional one in terms of verification accuracy in high-noise environments. All experiments are performed using a Gaussian mixture model recognition system. The results show a significant improvement in text-independent speaker verification in high-noise environments and suggest opportunities for further improvement in this area.
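
    The scoring stage of such a system can be sketched with scikit-learn Gaussian mixtures. Feature extraction (audio features plus EGG shape parameters) is omitted, and synthetic vectors stand in for real data, so this illustrates the verification decision only, not the paper's full pipeline.

        # GMM speaker-verification scoring sketch: the claimed speaker's model
        # is compared against a universal background model (UBM) via an
        # average log-likelihood ratio on the test utterance's feature vectors.
        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(1)
        dim = 16                                  # e.g. audio + EGG parameters

        ubm = GaussianMixture(n_components=8, random_state=0)
        ubm.fit(rng.normal(size=(2000, dim)))     # pooled background data

        speaker = GaussianMixture(n_components=8, random_state=0)
        speaker.fit(rng.normal(loc=0.5, size=(400, dim)))   # enrollment data

        test = rng.normal(loc=0.5, size=(100, dim))         # utterance to verify
        score = speaker.score(test) - ubm.score(test)       # mean log-likelihood ratio
        print("accept" if score > 0.0 else "reject", f"(score = {score:.2f})")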

  11. Evidence of Rapidly Warming Rivers in the UK from an Extensive Additive Modelling Study at the National Scale Using R

    NASA Astrophysics Data System (ADS)

    Simpson, G. L.

    2011-12-01

    River water temperature data exhibit non-linear behaviour over the past 50 or so years. Standard techniques for identifying and quantifying trends have centred on linear regression and the Mann-Kendall and Theil-Sen procedures. Observational data from UK rivers suggest that temperatures are far more variable than assumed under these statistical models. In a national-scale assessment of the response of riverine systems to global climatic change, an additive model framework was employed to model patterns in water temperatures from a large database of temporal observational data. Models were developed using R, which allowed the deployment of cutting-edge additive modelling techniques to describe trends at 2,773 sites across England and Wales, UK. At a subset of sites, additive models were used to model long-term trends, trends within seasons, and the long-term variation in the seasonal pattern of water temperatures. Changes in water temperature have important consequences for aquatic ecology, with some species being particularly sensitive even to small shifts in temperature during some or all of their lifecycle. While there are many studies reporting increasing regional and global air temperatures, evidence for changes in river water temperature has thus far been site-specific and/or from sites heavily influenced by human activities that could themselves lead to warming. Here I present selected results from a national-scale assessment of changing river water temperatures, covering the whole of England and Wales and comprising data from 2,773 locations. Positive trends in water temperature were observed at 86% of sites. At a subset of sites, seasonal trend models were developed, which showed that 90% of locations demonstrated statistically significant increases in water temperature during the autumn and winter periods. Multivariate smoothers, which allow for within-year and longer-term trend interactions in time, suggest that periods of warmer waters now extend
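
    The study's additive models were fitted in R; the same idea can be sketched in Python with statsmodels' GAM support (assuming its GLMGam/BSplines interface), with synthetic data standing in for the rivers database: a smooth long-term trend plus a smooth seasonal term.

        # Additive model sketch: temperature = s(year) + s(day of year) + noise.
        import numpy as np
        from statsmodels.gam.api import GLMGam, BSplines

        rng = np.random.default_rng(2)
        n = 500
        year = rng.uniform(1960, 2010, n)              # decimal year
        doy = rng.uniform(0, 365, n)                   # day of year
        temp = (10.0 + 0.02 * (year - 1960)            # slow warming trend
                + 4.0 * np.sin(2 * np.pi * doy / 365)  # seasonal cycle
                + rng.normal(scale=0.8, size=n))

        x = np.column_stack([year, doy])
        smoother = BSplines(x, df=[8, 8], degree=[3, 3])
        intercept = np.ones((n, 1))
        res = GLMGam(temp, intercept, smoother=smoother, alpha=[1.0, 1.0]).fit()
        print(res.summary())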

  12. Formal verification of AI software

    NASA Technical Reports Server (NTRS)

    Rushby, John; Whitehurst, R. Alan

    1989-01-01

    The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.

  13. Verification and transparency in future arms control

    SciTech Connect

    Pilat, J.F.

    1996-09-01

    Verification's importance has changed dramatically over time, although it has always been at the forefront of arms control. The goals and measures of verification and the criteria for success have changed with the times as well, reflecting such factors as the centrality of the prospective agreement to East-West relations during the Cold War, the state of relations between the United States and the Soviet Union, and the technologies available for monitoring. Verification's role may be declining in the post-Cold War period. The prospects for such a development will depend, first and foremost, on the high costs of traditional arms control, especially those associated with requirements for verification. Moreover, the growing interest in informal, or non-negotiated, arms control does not allow for verification provisions, by the very nature of these arrangements. Multilateral agreements are also becoming more prominent and argue against highly effective verification measures, in part because of fears of promoting proliferation by opening sensitive facilities to inspectors from potential proliferant states. As a result, it is likely that transparency and confidence-building measures will achieve greater prominence, both as supplements to and substitutes for traditional verification. Such measures are not panaceas and do not offer all that we came to expect from verification during the Cold War. But they may be the best possible means to deal with current problems of arms reductions and restraints at acceptable levels of expenditure.

  14. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-06-01

    Many papers have dealt with the political difficulties and ramifications of deep nuclear arms reductions and the issues of “Going to Zero”. Political issues include extended deterrence, conventional weapons, ballistic missile defense, and regional and geo-political security issues. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking past New START, the next step will likely include warhead limits in the neighborhood of 1000. Further reductions will include stepping stones at 1000 warheads, hundreds of warheads, and then tens of warheads, before final elimination of the last few remaining warheads and weapons could be considered. This paper focuses on these three threshold reduction levels: 1000, 100's, and 10's. For each, the issues and challenges are discussed, potential solutions are identified, and the verification technologies and chain-of-custody measures that address these solutions are surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper explores new or novel technologies that could be applied, drawing from research and development ongoing throughout the national laboratory complex and looking at technologies used in other areas of industry for their application to arms control verification.

  15. Regression Verification Using Impact Summaries

    NASA Technical Reports Server (NTRS)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

    Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve the scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce the analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related program
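
    The static-analysis half of the idea can be caricatured in a few lines: diff the ASTs of two program versions and flag syntactically changed functions as potentially impacted. The toy below omits everything that makes the technique interesting, namely the symbolic-execution summaries and the decision-procedure equivalence check.

        # Toy static step only: flag functions whose ASTs differ between two
        # versions as potentially impacted (the paper's technique goes much
        # further, symbolically executing and checking the impacted behaviors).
        import ast

        def function_dumps(src):
            """Map each top-level function name to a canonical AST dump."""
            return {node.name: ast.dump(node)
                    for node in ast.parse(src).body
                    if isinstance(node, ast.FunctionDef)}

        v1 = "def f(x):\n    return x + 1\n\ndef g(x):\n    return 2 * x\n"
        v2 = "def f(x):\n    return x + 2\n\ndef g(x):\n    return 2 * x\n"

        d1, d2 = function_dumps(v1), function_dumps(v2)
        impacted = [n for n in d1 if d2.get(n) != d1[n]] + \
                   [n for n in d2 if n not in d1]
        print("potentially impacted:", impacted)    # -> ['f']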

  16. Verification and validation for induction heating

    SciTech Connect

    Lam, Kin; Tippetts, Trevor B; Allen, David W

    2008-01-01

    Truchas is a software package being developed at LANL within the Telluride project for predicting the complex physical processes in metal alloy casting. The software was designed to be massively parallel, multi-material, multi-physics, and to run on 3D, fully unstructured meshes. This work describes a Verification and Validation assessment of Truchas for simulating the induction heating phase of a casting process. We used existing data from a simple experiment involving the induction heating of a graphite cylinder, as graphite is a common material used for mold assemblies. Because we do not have complete knowledge of all the conditions and properties in this experiment (as is the case in many other experiments), we performed a parameter sensitivity study, modeled the uncertainties of the most sensitive parameters, and quantified how these uncertainties propagate to the Truchas output response. A verification analysis produced estimates of the numerical error of the Truchas solution to our computational model. The outputs from Truchas runs with randomly sampled parameter values were used for the validation study.
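
    The forward-propagation step of such a study has a simple shape: sample the most sensitive parameters from their uncertainty distributions, run the model on each sample, and examine the spread of the response. In the sketch below a trivial lumped heating model stands in for a Truchas run, and the parameter distributions are assumed for illustration.

        # Monte Carlo propagation of parameter uncertainty to a model response.
        import numpy as np

        rng = np.random.default_rng(3)
        n = 1000

        # Sample the sensitive inputs (means/spreads assumed for illustration)
        heat_capacity = rng.normal(710.0, 40.0, n)   # graphite cp, J/(kg K)
        power = rng.normal(5.0e3, 2.5e2, n)          # absorbed induction power, W

        def model(cp, p, mass=2.0, t=60.0):
            """Toy lumped response: temperature rise after t seconds of heating."""
            return p * t / (mass * cp)

        dT = model(heat_capacity, power)
        print(f"mean dT = {dT.mean():.1f} K, 95% interval = "
              f"[{np.percentile(dT, 2.5):.1f}, {np.percentile(dT, 97.5):.1f}] K")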

  17. Verification of Medium Range Probabilistic Rainfall Forecasts Over India

    NASA Astrophysics Data System (ADS)

    Dube, Anumeha; Ashrit, Raghavendra; Singh, Harvir; Iyengar, Gopal; Rajagopal, E. N.

    2016-03-01

    Forecasting rainfall in the tropics is a challenging task further hampered by the uncertainty in the numerical weather prediction models. Ensemble prediction systems (EPSs) provide an efficient way of handling the inherent uncertainty of these models. Verification of forecasts obtained from an EPS is a necessity, to build confidence in using these forecasts. This study deals with the verification of the probabilistic rainfall forecast obtained from the National Centre for Medium Range Weather Forecasting (NCMRWF) Global Ensemble Forecast system (NGEFS) for three monsoon seasons, i.e., JJAS 2012, 2013 and 2014. Verification is based on the Brier Score (BS) and its components (reliability, resolution and uncertainty), Brier Skill Score (BSS), reliability diagram, relative operating characteristic (ROC) curve and area under the ROC (AROC) curve. Three observation data sets are used (namely, NMSG, CPC-RFE2.0 and TRMM) for verification of forecasts, and the statistics are compared. BS values for verification of NGEFS forecasts using NMSG data are the lowest, indicating that the forecasts have a better match with these observations as compared to both TRMM and CPC-RFE2.0. This is further strengthened by lower reliability, higher resolution and BSS values for verification against this data set. The ROC curve shows that lower rainfall amounts have a higher hit rate, which implies that the model has better skill in predicting these rainfall amounts. The reliability plots show that events with lower probabilities were underforecast and those with higher probabilities were overforecast. From the current study it can be concluded that even though NGEFS is a coarse-resolution EPS, the probabilistic forecast has good skill. This in turn leads to increased confidence in issuing operational probabilistic forecasts based on NGEFS.
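
    The scores used here are straightforward to compute. The sketch below evaluates the Brier score, its Murphy decomposition into reliability, resolution, and uncertainty, and the skill score against climatology, on synthetic forecasts standing in for the NGEFS data.

        # Brier score, Murphy decomposition (BS = REL - RES + UNC), and BSS.
        import numpy as np

        rng = np.random.default_rng(4)
        p = rng.choice(np.linspace(0, 1, 11), size=5000)       # forecast probs
        o = (rng.random(5000) < 0.7 * p + 0.1).astype(float)   # binary outcomes

        bs = np.mean((p - o) ** 2)
        obar = o.mean()

        rel = res = 0.0
        for pk in np.unique(p):                  # bin by forecast probability
            sel = p == pk
            ok = o[sel].mean()
            rel += sel.mean() * (pk - ok) ** 2   # reliability (smaller = better)
            res += sel.mean() * (ok - obar) ** 2 # resolution (larger = better)
        unc = obar * (1 - obar)

        bss = 1.0 - bs / unc                     # skill vs climatology
        print(f"BS={bs:.4f} REL={rel:.4f} RES={res:.4f} UNC={unc:.4f} BSS={bss:.3f}")
        print(f"check: REL - RES + UNC = {rel - res + unc:.4f}")  # equals BS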

  19. Design verification of SIFT

    NASA Technical Reports Server (NTRS)

    Moser, Louise; Melliar-Smith, Michael; Schwartz, Richard

    1987-01-01

    The SIFT reliable aircraft control computer system, designed to meet the ultrahigh reliability required for safety-critical flight control applications through processor replication and voting, was constructed by SRI and delivered to NASA Langley for evaluation in the AIRLAB. To increase confidence in the reliability projections for SIFT, produced by a Markov reliability model, SRI constructed a formal specification defining the meaning of reliability in the context of flight control. A further series of specifications defined, in increasing detail, the design of SIFT down to pre- and post-conditions on Pascal code procedures. Mechanically checked mathematical proofs were constructed to demonstrate that the more detailed design specifications for SIFT do indeed imply the formal reliability requirement. An additional specification defined some of the assumptions made about SIFT by the Markov model, and further proofs were constructed to show that these assumptions, as expressed by that specification, do indeed follow from the more detailed design specifications for SIFT. This report provides an outline of the methodology used for this hierarchical specification and proof, and describes the various specifications and proofs performed.
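
    The flavor of such a Markov reliability model can be shown with a toy: a three-processor voting triad that fails once a majority of processors has failed. The failure rate below is assumed for illustration, and the model is far simpler than SIFT's, which also treats fault latency and reconfiguration.

        # Toy continuous-time Markov reliability model for a voting triad.
        # States: 0 failed, 1 failed, >=2 failed (absorbing system-failure state).
        import numpy as np
        from scipy.linalg import expm

        lam = 1e-4                     # per-hour processor failure rate (assumed)
        Q = np.array([[-3 * lam,  3 * lam,     0.0],
                      [     0.0, -2 * lam, 2 * lam],
                      [     0.0,      0.0,     0.0]])   # generator matrix

        t = 10.0                       # 10-hour mission
        p = np.array([1.0, 0.0, 0.0]) @ expm(Q * t)     # p(t) = p(0) exp(Qt)
        print(f"P(system failure) = {p[2]:.3e}")        # ~ 3 * (lam * t)**2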

  20. Effect of addition of magnesium to local anesthetics for peribulbar block: A prospective randomized double-blind study

    PubMed Central

    Sinha, R; Sharma, A; Ray, BR; Chandiran, R; Chandralekha, C; Sinha, R

    2016-01-01

    Background: Magnesium sulfate has been used along with local anesthetics in different regional blocks and found to be effective in decreasing the time to onset and increasing the duration of the block. Objective: To evaluate the effect of adding magnesium sulfate to a standard local anesthetic mixture on the time to onset of globe and lid akinesia in peribulbar block for ophthalmic surgeries. Materials and Methods: Sixty patients with American Society of Anesthesiologists status I to III undergoing ophthalmic surgery under peribulbar block were included in this study and randomized into two groups. Both groups received 4.5 ml of 2% lidocaine and 4.5 ml of 0.5% bupivacaine with 150 IU hyaluronidase. Group NS received 1 ml of normal saline in the peribulbar block, and Group MS received magnesium sulfate 50 mg in 1 ml of normal saline. The onset of akinesia, satisfactory block and complications were observed by an independent observer. Results: Demographic data were statistically similar. In Group NS, at 3, 5, 10 and 15 min after the block, complete akinesia was seen in 0, 2, 11 and 28 patients, respectively. In Group MS, at 3, 5, 10 and 15 min after the block, complete akinesia was seen in 13, 23, 27 and 28 patients, respectively. Patients who received magnesium sulfate showed a statistically significantly more rapid onset of lid and globe akinesia than the control group up to 10 min (P < 0.001). No patient needed a supplementary block or had complications during surgery. Conclusion: Addition of 50 mg of magnesium sulfate to the lidocaine-bupivacaine mixture for peribulbar block hastens the onset of akinesia without any obvious side effects. PMID:26955313
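
    As a quick plausibility check on the reported counts (and assuming the 60 patients were split evenly, 30 per group), Fisher's exact test on the 5-minute akinesia data gives a p-value far below 0.001.

        # Fisher's exact test on the 5-min akinesia counts from the abstract
        # (23/30 in Group MS vs. 2/30 in Group NS, even split assumed).
        from scipy.stats import fisher_exact

        table = [[23, 30 - 23],    # Group MS: akinesia / no akinesia at 5 min
                 [ 2, 30 -  2]]    # Group NS
        odds, p = fisher_exact(table)
        print(f"odds ratio = {odds:.1f}, p = {p:.2e}")   # p far below 0.001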