Science.gov

Sample records for addition verification studies

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: ENVIROFUELS DIESEL FUEL CATALYZER FUEL ADDITIVE

    EPA Science Inventory

    EPA's Environmental Technology Verification Program has tested EnviroFuels diesel fuel additive, called the Diesel Fuel Catalyzer. EnviroFuels has stated that heavy-duty on and off road diesel engines are the intended market for the catalyzer. Preliminary tests conducted indicate...

  2. Structural dynamics verification facility study

    NASA Technical Reports Server (NTRS)

    Kiraly, L. J.; Hirchbein, M. S.; Mcaleese, J. M.; Fleming, D. P.

    1981-01-01

    The need for a structural dynamics verification facility to support structures programs was studied. Most industry-operated facilities are used for highly focused research, component development, and problem solving, and are not used to develop a generic understanding of the coupled dynamic response of major engine subsystems. Capabilities for the proposed facility include: the ability to both excite and measure coupled structural dynamic response of elastic blades on elastic shafting, the mechanical simulation of various dynamical loadings representative of those seen in operating engines, and the measurement of engine dynamic deflections and interface forces caused by alternative engine mounting configurations and compliances.

  3. Environmental Technology Verification Report: Taconic Energy, Inc. TEA Fuel Additive

    EPA Science Inventory

    The Greenhouse Gas Technology Center (GHG Center) is one of six verification organizations operating under EPA’s ETV program. One sector of significant interest to GHG Center stakeholders is transportation - particularly technologies that result in fuel economy improvements. Taco...

  4. Expert system verification and validation study

    NASA Technical Reports Server (NTRS)

    French, Scott W.; Hamilton, David

    1992-01-01

    Five workshops on verification and validation (V&V) of expert systems (ES) were taught during this recent period of performance. Two key activities previously performed under this contract supported these workshops: (1) a survey of the state of the practice of V&V of ES, and (2) development of the workshop material and the first class. The first activity involved an extensive survey of ES developers to answer several questions regarding the state of the practice in V&V of ES, including the amount and type of V&V done and how successful it was. The second key activity involved developing an intensive hands-on workshop in V&V of ES. This involved surveying a large number of V&V techniques, conventional as well as ES-specific ones. In addition to explaining the techniques, we showed how each technique could be applied to a sample problem. References were included in the workshop material and cross-referenced to techniques, so that students would know where to find additional information about each technique. In addition to teaching specific techniques, we included an extensive amount of material on V&V concepts and on how to develop a V&V plan for an ES project. We felt this material was necessary so that developers would be prepared to take an orderly and structured approach to V&V; that is, they would have a process that supported the use of the specific techniques. Finally, to provide hands-on experience, we developed a set of case study exercises. These exercises gave the students an opportunity to apply all the material (concepts, techniques, and planning material) to a realistic problem.

  5. Spacecraft attitude calibration/verification baseline study

    NASA Technical Reports Server (NTRS)

    Chen, L. C.

    1981-01-01

    A baseline study for a generalized spacecraft attitude calibration/verification system is presented. It can be used to define software specifications for three major functions required by a mission: the pre-launch parameter observability and data collection strategy study; the in-flight sensor calibration; and the post-calibration attitude accuracy verification. Analytical considerations are given for both single-axis and three-axis spacecraft. The three-axis attitudes considered include the inertial-pointing attitudes, the reference-pointing attitudes, and attitudes undergoing specific maneuvers. The attitude sensors and hardware considered include the Earth horizon sensors, the plane-field Sun sensors, the coarse and fine two-axis digital Sun sensors, the three-axis magnetometers, the fixed-head star trackers, and the inertial reference gyros.

  6. Shuttle payload interface verification equipment study. Volume 2: Technical document, part 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The technical analysis performed during the shuttle payload interface verification equipment study is reported. It describes: (1) the background and intent of the study; (2) the study approach and philosophy covering all facets of shuttle payload/cargo integration; (3) shuttle payload integration requirements; (4) preliminary design of the horizontal IVE; (5) the vertical IVE concept; and (6) IVE program development plans, schedule, and cost. Also included is a payload integration analysis task to identify potential uses in addition to payload interface verification.

  7. Learner Verification: A Publisher's Case Study.

    ERIC Educational Resources Information Center

    Wilson, George

    Learner verification, a process by which publishers monitor the effectiveness of their products and strive to improve their services to schools, is a practice that most companies take seriously. The quality of educational materials may be ensured in many ways: by analysis of sales, through firsthand investigation, and by employing a system of…

  8. Hybrid Enrichment Verification Array: Module Characterization Studies

    SciTech Connect

    Zalavadia, Mital A.; Smith, Leon E.; McDonald, Benjamin S.; Kulisek, Jonathan A.; Mace, Emily K.; Deshmukh, Nikhil S.

    2016-03-01

    The work presented in this report is focused on the characterization and refinement of the Hybrid Enrichment Verification Array (HEVA) approach, which combines the traditional 186-keV 235U signature with high-energy prompt gamma rays from neutron capture in the detector and surrounding collimator material, to determine the relative enrichment and 235U mass of the cylinder. The design of the HEVA modules (hardware and software) deployed in the current field trial builds on over seven years of study and evolution by PNNL, and consists of a ø3''×3'' NaI(Tl) scintillator coupled to an Osprey digital multi-channel analyzer tube base from Canberra. The core of the HEVA methodology, the high-energy prompt gamma-ray signature, serves as an indirect method for the measurement of total neutron emission from the cylinder. A method for measuring the intrinsic efficiency of this “non-traditional” neutron signature and the results from a benchmark experiment are presented. Also discussed are potential perturbing effects on the non-traditional signature, including short-lived activation of materials in the HEVA module. Modeling and empirical results are presented to demonstrate that such effects are expected to be negligible for the envisioned implementation scenario. In comparison to previous versions, the new design boosts the high-energy prompt gamma-ray signature, provides more flexible and effective collimation, and improves count-rate management via commercially available pulse-processing electronics with a special modification prompted by PNNL.

  9. Verification study of an emerging fire suppression system

    DOE PAGES

    Cournoyer, Michael E.; Waked, R. Ryan; Granzow, Howard N.; ...

    2016-01-01

    Self-contained fire extinguishers are a robust, reliable and minimally invasive means of fire suppression for gloveboxes. Moreover, plutonium gloveboxes present harsh environmental conditions for polymer materials; these include radiation damage and chemical exposure, both of which tend to degrade the lifetime of engineered polymer components. Several studies have been conducted to determine the robustness of self-contained fire extinguishers in plutonium gloveboxes; before deployment in a nuclear facility, verification tests must be performed. These tests include activation and mass loss calorimeter tests. In addition, compatibility issues with chemical components of the self-contained fire extinguishers need to be addressed. Our study presents activation and mass loss calorimeter test results. After extensive studies, no critical areas of concern have been identified for the plutonium glovebox application of Fire Foe™, except for glovebox operations that use large quantities of bulk plutonium or uranium metal, such as metal casting and pyro-chemistry operations.

  10. Verification study of an emerging fire suppression system

    SciTech Connect

    Cournoyer, Michael E.; Waked, R. Ryan; Granzow, Howard N.; Gubernatis, David C.

    2016-01-01

    Self-contained fire extinguishers are a robust, reliable and minimally invasive means of fire suppression for gloveboxes. Moreover, plutonium gloveboxes present harsh environmental conditions for polymer materials; these include radiation damage and chemical exposure, both of which tend to degrade the lifetime of engineered polymer components. Several studies have been conducted to determine the robustness of self-contained fire extinguishers in plutonium gloveboxes; before deployment in a nuclear facility, verification tests must be performed. These tests include activation and mass loss calorimeter tests. In addition, compatibility issues with chemical components of the self-contained fire extinguishers need to be addressed. Our study presents activation and mass loss calorimeter test results. After extensive studies, no critical areas of concern have been identified for the plutonium glovebox application of Fire Foe™, except for glovebox operations that use large quantities of bulk plutonium or uranium metal, such as metal casting and pyro-chemistry operations.

  11. Systematic study of source mask optimization and verification flows

    NASA Astrophysics Data System (ADS)

    Ben, Yu; Latypov, Azat; Chua, Gek Soon; Zou, Yi

    2012-06-01

    Source mask optimization (SMO) has emerged as a powerful resolution enhancement technique (RET) for advanced technology nodes. However, there is a plethora of flows and verification metrics in the field, confounding the end user of the technique, and a systematic study of the different flows and their possible unification has been missing. This contribution is intended to reveal the pros and cons of different SMO approaches and verification metrics, understand their commonalities and differences, and provide a generic guideline for RET selection via SMO. The paper discusses three types of variation that commonly arise in SMO, namely pattern preparation and selection, availability of a relevant OPC recipe for a freeform source, and the metrics used in source verification. Several pattern selection algorithms are compared and the advantages of systematic pattern selection algorithms are discussed. In the absence of a full resist model for SMO, an alternative SMO flow without a full resist model is reviewed. A preferred verification flow with the quality metrics of depth of focus (DOF) and mask error enhancement factor (MEEF) is examined.

  12. Organics Verification Study for Sinclair and Dyes Inlets, Washington

    SciTech Connect

    Kohn, Nancy P.; Brandenberger, Jill M.; Niewolny, Laurie A.; Johnston, Robert K.

    2006-09-28

    Sinclair and Dyes Inlets near Bremerton, Washington, are on the State of Washington 1998 303(d) list of impaired waters because of fecal coliform contamination in marine water, metals in sediment and fish tissue, and organics in sediment and fish tissue. Because significant cleanup and source control activities have been conducted in the inlets since the data supporting the 1998 303(d) listings were collected, two verification studies were performed to address the 303(d) segments that were listed for metal and organic contaminants in marine sediment. The Metals Verification Study (MVS) was conducted in 2003; the final report, Metals Verification Study for Sinclair and Dyes Inlets, Washington, was published in March 2004 (Kohn et al. 2004). This report describes the Organics Verification Study that was conducted in 2005. The study approach was similar to the MVS in that many surface sediment samples were screened for the major classes of organic contaminants, and then the screening results and other available data were used to select a subset of samples for quantitative chemical analysis. Because the MVS was designed to obtain representative data on concentrations of contaminants in surface sediment throughout Sinclair Inlet, Dyes Inlet, Port Orchard Passage, and Rich Passage, aliquots of the 160 MVS sediment samples were used in the analysis for the Organics Verification Study. However, unlike metals screening methods, organics screening methods are not specific to individual organic compounds, and are not available for some target organics. Therefore, only the quantitative analytical results were used in the organics verification evaluation. The results of the Organics Verification Study showed that sediment quality outside of Sinclair Inlet is unlikely to be impaired because of organic contaminants. Similar to the results for metals, in Sinclair Inlet the distribution of residual organic contaminants is generally limited to nearshore areas already within the actively managed Puget Sound Naval Shipyard Superfund Site.

  13. Mental Arithmetic in Children with Mathematics Learning Disabilities: The Adaptive Use of Approximate Calculation in an Addition Verification Task

    ERIC Educational Resources Information Center

    Rousselle, Laurence; Noel, Marie-Pascale

    2008-01-01

    The adaptive use of approximate calculation was examined using a verification task with 18 third graders with mathematics learning disabilities, 22 typically achieving third graders, and 21 typically achieving second graders. Participants were asked to make true-false decisions on simple and complex addition problems while the distance between the…

  14. Metals Verification Study for Sinclair and Dyes Inlets, Washington

    SciTech Connect

    Kohn, Nancy P.; Miller, Martin C.; Brandenberger, Jill M.; Johnston, Robert K.

    2004-09-29

    Sinclair and Dyes Inlets near Bremerton, Washington, are on the State of Washington's 1998 303(d) list of impaired waters because of fecal coliform contamination in marine water, metals in sediment and fish tissue, and organics in sediment and fish tissue. This Metals Verification Study was conducted to address the 303(d) segments that are listed for metal contaminants in marine sediment, because significant cleanup and source control activities have been conducted in the Inlets since the data supporting the 1998 303(d) listings were collected. The study was designed to obtain present-day sediment metals concentrations throughout Sinclair Inlet, Dyes Inlet, Port Orchard Passage, and Rich Passage, with stations spatially distributed to support 303(d) listing updates and also watershed-level water quality and contaminant transport modeling efforts. A total of 160 surface sediment samples from Sinclair Inlet, Dyes Inlet, Port Orchard Passage, and Rich Passage were screened for copper, lead, and zinc using X-Ray Fluorescence (XRF). Forty samples (25%) were selected for confirmatory metals analysis by ICP-MS for cadmium, silver, and arsenic in addition to copper, lead, and zinc. Regression relationships between the ICP-MS and XRF datasets were developed to estimate copper, lead, and zinc concentrations in all samples. The XRF results for copper, lead, and zinc correlated well with ICP-MS results, and predicted concentrations were calculated for all samples. The results of the Metals Verification Study show that sediment quality in Sinclair Inlet has improved markedly since implementation of cleanup and source control actions, and that the distribution of residual contaminants is limited to nearshore areas already within the actively managed Puget Sound Naval Shipyard Superfund Site where further source control actions and monitoring are under way. Outside of Sinclair Inlet, the target metals met state sediment quality standards.
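
    The XRF-to-ICP-MS calibration described above is, in form, a least-squares regression used to convert screening values into laboratory-equivalent concentrations. The sketch below illustrates that pattern only; the sample values, and the use of a plain linear fit without the report's QA checks, are assumptions.

        # Illustrative sketch: calibrate XRF screening results against paired
        # ICP-MS confirmatory results, then predict concentrations for the
        # screening-only samples. All numbers are hypothetical.
        import numpy as np

        xrf_subset = np.array([45.0, 80.0, 120.0, 200.0, 310.0])    # Cu by XRF, mg/kg
        icpms_subset = np.array([40.0, 77.0, 115.0, 195.0, 305.0])  # paired Cu by ICP-MS, mg/kg

        slope, intercept = np.polyfit(xrf_subset, icpms_subset, 1)  # least-squares line
        r2 = np.corrcoef(xrf_subset, icpms_subset)[0, 1] ** 2

        xrf_all = np.array([55.0, 150.0, 260.0])       # screening-only samples
        predicted = slope * xrf_all + intercept        # ICP-MS-equivalent estimates
        print(f"slope={slope:.3f} intercept={intercept:.2f} r^2={r2:.4f}", predicted)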

  15. Verification, Validation and Sensitivity Studies in Computational Biomechanics

    PubMed Central

    Anderson, Andrew E.; Ellis, Benjamin J.; Weiss, Jeffrey A.

    2012-01-01

    Computational techniques and software for the analysis of problems in mechanics have naturally moved from their origins in the traditional engineering disciplines to the study of cell, tissue and organ biomechanics. Increasingly complex models have been developed to describe and predict the mechanical behavior of such biological systems. While the availability of advanced computational tools has led to exciting research advances in the field, the utility of these models is often the subject of criticism due to inadequate model verification and validation. The objective of this review is to present the concepts of verification, validation and sensitivity studies with regard to the construction, analysis and interpretation of models in computational biomechanics. Specific examples from the field are discussed. It is hoped that this review will serve as a guide to the use of verification and validation principles in the field of computational biomechanics, thereby improving the peer acceptance of studies that use computational modeling techniques. PMID:17558646

  16. Verification, validation and sensitivity studies in computational biomechanics.

    PubMed

    Anderson, Andrew E; Ellis, Benjamin J; Weiss, Jeffrey A

    2007-06-01

    Computational techniques and software for the analysis of problems in mechanics have naturally moved from their origins in the traditional engineering disciplines to the study of cell, tissue and organ biomechanics. Increasingly complex models have been developed to describe and predict the mechanical behavior of such biological systems. While the availability of advanced computational tools has led to exciting research advances in the field, the utility of these models is often the subject of criticism due to inadequate model verification and validation (V&V). The objective of this review is to present the concepts of verification, validation and sensitivity studies with regard to the construction, analysis and interpretation of models in computational biomechanics. Specific examples from the field are discussed. It is hoped that this review will serve as a guide to the use of V&V principles in the field of computational biomechanics, thereby improving the peer acceptance of studies that use computational modeling techniques.

  17. Advanced NSTS propulsion system verification study

    NASA Technical Reports Server (NTRS)

    Wood, Charles

    1989-01-01

    The merits of propulsion system development testing are discussed. The existing data base of technical reports and specialists is utilized in this investigation. The study encompassed a review of all available test reports of propulsion system development testing for the Saturn stages, the Titan stages, and the Space Shuttle main propulsion system. The knowledge on propulsion system development and system testing available from specialists and managers was also 'tapped' for inclusion.

  18. Mental arithmetic in children with mathematics learning disabilities: the adaptive use of approximate calculation in an addition verification task.

    PubMed

    Rousselle, Laurence; Noël, Marie-Pascale

    2008-01-01

    The adaptive use of approximate calculation was examined using a verification task with 18 third graders with mathematics learning disabilities, 22 typically achieving third graders, and 21 typically achieving second graders. Participants were asked to make true-false decisions on simple and complex addition problems while the distance between the proposed and the correct answer was manipulated. Both typically achieving groups were sensitive to answer plausibility on simple problems, were faster at rejecting extremely incorrect results than at accepting correct answers on complex addition problems, and showed a reduction of the complexity effect on implausible problems, attesting to the use of approximate calculation. Conversely, children with mathematics disabilities were unaffected by answer plausibility on simple addition problems, processed implausible and correct sums with equal speed on complex problems, and exhibited a smaller reduction of the complexity effect on implausible problems. They also made more errors on implausible problems. Different hypotheses are discussed to account for these results.

  19. Study of techniques for redundancy verification without disrupting systems, phases 1-3

    NASA Technical Reports Server (NTRS)

    1970-01-01

    The problem of verifying the operational integrity of redundant equipment and the impact of a requirement for verification on such equipment are considered. Redundant circuits are examined and the characteristics which determine adaptability to verification are identified. Mutually exclusive and exhaustive categories for verification approaches are established. The range of applicability of these techniques is defined in terms of signal characteristics and redundancy features. Verification approaches are discussed and a methodology for the design of redundancy verification is developed. A case study is presented which involves the design of a verification system for a hypothetical communications system. Design criteria for redundant equipment are presented. Recommendations for the development of technological areas pertinent to the goal of increased verification capabilities are given.

  20. Metrology test object for dimensional verification in additive manufacturing of metals for biomedical applications.

    PubMed

    Teeter, Matthew G; Kopacz, Alexander J; Nikolov, Hristo N; Holdsworth, David W

    2015-01-01

    Additive manufacturing continues to increase in popularity and is being used in applications such as biomaterial ingrowth that requires sub-millimeter dimensional accuracy. The purpose of this study was to design a metrology test object for determining the capabilities of additive manufacturing systems to produce common objects, with a focus on those relevant to medical applications. The test object was designed with a variety of features of varying dimensions, including holes, cylinders, rectangles, gaps, and lattices. The object was built using selective laser melting, and the produced dimensions were compared to the target dimensions. Location of the test objects on the build plate did not affect dimensions. Features with dimensions less than 0.300 mm did not build or were overbuilt to a minimum of 0.300 mm. The mean difference between target and measured dimensions was less than 0.100 mm in all cases. The test object is applicable to multiple systems and materials, tests the effect of location on the build, uses a minimum of material, and can be measured with a variety of efficient metrology tools (including measuring microscopes and micro-CT). Investigators can use this test object to determine the limits of systems and adjust build parameters to achieve maximum accuracy.
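
    The pass/fail logic implied by this dimensional comparison reduces to differencing measured and target dimensions and summarizing the error. A minimal sketch with invented dimensions follows; the 0.300 mm build floor and the <0.100 mm mean difference are the abstract's findings, not outputs of this code.

        # Hypothetical target vs. as-built feature dimensions (mm) for a
        # metrology test object; values are invented for illustration.
        import numpy as np

        target   = np.array([0.250, 0.500, 1.000, 2.000])
        measured = np.array([0.300, 0.515, 1.004, 1.991])  # sub-0.300 mm feature overbuilt

        error = measured - target
        print(f"mean difference = {np.abs(error).mean():.3f} mm")
        print("all within 0.100 mm:", bool(np.all(np.abs(error) <= 0.100)))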

  1. Study of space shuttle orbiter system management computer function. Volume 2: Automated performance verification concepts

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The findings of investigations on concepts and techniques in automated performance verification are presented. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.

  2. Verification and Validation Studies for the LAVA CFD Solver

    NASA Technical Reports Server (NTRS)

    Moini-Yekta, Shayan; Barad, Michael F.; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.

    2013-01-01

    The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described, incorporating verification tests, validation benchmarks, continuous integration, and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of the 2D Euler equations, the 3D Navier-Stokes equations, and turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and an ONERA M6 wing are validated against experimental and numerical data.
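
    For readers unfamiliar with MMS: one picks an exact solution, derives the source term that forces the governing equation to admit it, and then checks that the solver's discretization error converges at the scheme's formal order. The sketch below applies the idea to a generic 1D advection-diffusion equation; the equation and manufactured solution are illustrative assumptions, not part of the LAVA test suite.

        # Method of Manufactured Solutions, minimal example: for the steady
        # 1D advection-diffusion equation u*dphi/dx - nu*d2phi/dx2 = S(x),
        # choose phi(x) = sin(pi*x) and derive the source S that makes it exact.
        import sympy as sp

        x, u, nu = sp.symbols("x u nu", real=True)
        phi = sp.sin(sp.pi * x)                            # manufactured solution
        S = u * sp.diff(phi, x) - nu * sp.diff(phi, x, 2)  # forcing term
        print(sp.simplify(S))
        # Add S to the discretized solver, run on successively refined grids,
        # and confirm the error against phi drops at the scheme's formal order.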

  3. Additional field verification of convective scaling for the lateral dispersion parameter

    SciTech Connect

    Sakiyama, S.K.; Davis, P.A.

    1988-07-01

    The results of a series of diffusion trials over the heterogeneous surface of the Canadian Precambrian Shield provide additional support for the convective scaling of the lateral dispersion parameter. The data indicate that under convective conditions, the lateral dispersion parameter can be scaled with the convective velocity scale and the mixing depth. 10 references.

  4. Additional Sawmill Electrical Energy Study.

    SciTech Connect

    Carroll, Hatch & Associates.

    1987-02-01

    This study was undertaken to investigate the potential for reducing use of electrical energy at lumber dry kilns by reducing fan speeds part way through the lumber drying process. It included three tasks: to quantify energy savings at a typical mill through field tests; to investigate the level of electric energy use at a representative sample of other mills and thereby to estimate the transferability of the conservation measure to the region; and to prepare a guidebook to present the technology to mill operators and to allow them to estimate the economic value of adopting the technique at their facilities. This document reports on the first two tasks.

  5. SU-E-T-49: A Multi-Institutional Study of Independent Dose Verification for IMRT

    SciTech Connect

    Baba, H; Tachibana, H; Kamima, T; Takahashi, R; Kawai, D; Sugawara, Y; Yamamoto, T; Sato, A; Yamashita, M

    2015-06-15

    Purpose: AAPM TG-114 does not cover independent verification for IMRT. We conducted a study of independent dose verification for IMRT in seven institutes to show its feasibility. Methods: 384 IMRT plans for prostate and head-and-neck (HN) sites were collected from the institutes, where planning had been performed using Eclipse and Pinnacle3 with the two techniques of step and shoot (S&S) and sliding window (SW). All of the institutes used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, JP), which is Clarkson-based and uses CT images to compute the radiological path length. An ion-chamber measurement in a water-equivalent slab phantom was performed to compare the doses computed using the TPS and the independent dose verification program. Additionally, the agreement between the doses computed in patient CT images using the TPS and using the SMU was assessed. The dose of the composite beams in each plan was evaluated. Results: The agreement between the measurement and the SMU was −2.3±1.9% and −5.6±3.6% for the prostate and HN sites, respectively. The agreement between the TPSs and the SMU was −2.1±1.9% and −3.0±3.7% for the prostate and HN sites, respectively. There was a negative systematic difference with similar standard deviation, and the difference was larger for the HN site. The S&S technique showed a statistically significant difference from the SW technique because the Clarkson-based method in the independent program cannot account for, and therefore underestimates, the dose under the MLC. Conclusion: The accuracy would be improved if the Clarkson-based algorithm were modified for IMRT; a tolerance level within 5% would then be achievable.
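
    The agreement figures above are per-plan percent differences between the independent calculation and a reference (measurement or TPS), summarized as mean ± 1 SD. A minimal sketch of that bookkeeping, with hypothetical dose values rather than the study's data:

        # Percent difference of an independent dose calculation against a
        # reference, summarized as mean +/- 1 SD. Dose values are hypothetical.
        import numpy as np

        def agreement(d_indep, d_ref):
            d_indep, d_ref = np.asarray(d_indep), np.asarray(d_ref)
            diff = 100.0 * (d_indep - d_ref) / d_ref
            return diff.mean(), diff.std(ddof=1)

        mean, sd = agreement([1.96, 2.02, 1.98], [2.00, 2.05, 2.04])  # Gy per plan
        print(f"{mean:+.1f}% +/- {sd:.1f}%")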

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM CASE STUDIES: DEMONSTRATING PROGRAM OUTCOMES

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This bookle...

  7. Environmental Technology Verification Program - ETV - Case Studies: Demonstrating Program Outcomes

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This cd con...

  8. Shuttle payload interface verification equipment study. Volume 2: Technical document. Part 2: Appendices

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Appendices to the shuttle payload integration study provide: (1) the interface verification equipment hardware utilization list; (2) the horizontal IVE in-field assembly procedure; and (3) payload integration baseline functional flow block diagrams and options.

  9. How to test maximal oxygen uptake: a study on timing and testing procedure of a supramaximal verification test.

    PubMed

    Scharhag-Rosenberger, Friederike; Carlsohn, Anja; Cassel, Michael; Mayer, Frank; Scharhag, Jürgen

    2011-02-01

    Verification tests are becoming increasingly common for confirming maximal oxygen uptake (VO2 max) attainment. Yet, timing and testing procedures vary between working groups. The aims of this study were to investigate whether verification tests can be performed after an incremental test or should be performed on a separate day, and whether VO2 max can still be determined within the first testing session in subjects not satisfying the verification criterion. Forty subjects (age, 24 ± 4 years; VO2 max, 50 ± 7 mL·min⁻¹·kg⁻¹) performed a maximal incremental treadmill test and, 10 min afterwards, a verification test (VerifDay1) at 110% of maximal velocity (vmax). The verification criterion was a VerifDay1 peak oxygen uptake (VO2 peak) ≤5.5% higher than the incremental test value. Subjects not achieving the verification criterion performed another verification test at 115% vmax (VerifDay1') 10 min later, trying to confirm VerifDay1 VO2 peak as VO2 max. All other subjects exclusively repeated VerifDay1 on a separate day (VerifDay2). Of the 40 subjects, 6 did not satisfy the verification criterion. In 4 of them, attainment of VO2 max was confirmed by VerifDay1'. VO2 peak was equivalent between VerifDay1 and VerifDay2 (3722 ± 991 mL·min⁻¹ vs. 3752 ± 995 mL·min⁻¹, p = 0.56), whereas time to exhaustion was significantly longer in VerifDay2 (2:06 ± 0:22 min:s vs. 2:42 ± 0:38 min:s, p < 0.001, n = 34). The verification test VO2 peak does not seem to be affected by a preceding maximal incremental test. Incremental and verification tests can therefore be performed within the same testing session. In individuals not achieving the verification criterion, VO2 max can be determined by means of a subsequent, more intense verification test in most but not all cases.
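
    The verification criterion itself is a single percentage comparison; the sketch below just encodes the ≤5.5% rule from the abstract (the input values are hypothetical):

        # VO2max verification criterion: the verification-bout VO2 peak may
        # exceed the incremental-test value by at most 5.5%. Inputs in mL/min.
        def vo2max_confirmed(vo2_incremental, vo2_verification, tolerance_pct=5.5):
            excess_pct = 100.0 * (vo2_verification - vo2_incremental) / vo2_incremental
            return excess_pct <= tolerance_pct

        print(vo2max_confirmed(3722.0, 3752.0))  # True: +0.8% is within 5.5%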

  10. Verification Study - Wah Wah Valley, Utah. Volume I. Synthesis.

    DTIC Science & Technology

    1981-03-24

    characteristics (e.g., igneous, metamorphic, sedimentary). ROTARY WASH DRILLING - A boring technique in which advancement of the hole through overburden is... Valley, Utah for siting the MX Land Mobile Advanced ICBM system and presents the geological, geophysical, and soils engineering data upon which the... Missile-X (MX), an advanced intercontinental ballistic missile system. The phases are called Screening, Characterization, Ranking, and Verification.

  11. Reactive system verification case study: Fault-tolerant transputer communication

    NASA Technical Reports Server (NTRS)

    Crane, D. Francis; Hamory, Philip J.

    1993-01-01

    A reactive program is one which engages in an ongoing interaction with its environment. A system which is controlled by an embedded reactive program is called a reactive system. Examples of reactive systems are aircraft flight management systems, bank automatic teller machine (ATM) networks, airline reservation systems, and computer operating systems. Reactive systems are often naturally modeled (for logical design purposes) as a composition of autonomous processes which progress concurrently and which communicate to share information and/or to coordinate activities. Formal (i.e., mathematical) frameworks for system verification are tools used to increase the users' confidence that a system design satisfies its specification. A framework for reactive system verification includes formal languages for system modeling and for behavior specification and decision procedures and/or proof-systems for verifying that the system model satisfies the system specifications. Using the Ostroff framework for reactive system verification, an approach to achieving fault-tolerant communication between transputers was shown to be effective. The key components of the design, the decoupler processes, may be viewed as discrete-event-controllers introduced to constrain system behavior such that system specifications are satisfied. The Ostroff framework was also effective. The expressiveness of the modeling language permitted construction of a faithful model of the transputer network. The relevant specifications were readily expressed in the specification language. The set of decision procedures provided was adequate to verify the specifications of interest. The need for improved support for system behavior visualization is emphasized.

  12. Comparison of mouse and human genomes followed by experimental verification yields an estimated 1,019 additional genes.

    PubMed

    Guigo, Roderic; Dermitzakis, Emmanouil T; Agarwal, Pankaj; Ponting, Chris P; Parra, Genis; Reymond, Alexandre; Abril, Josep F; Keibler, Evan; Lyle, Robert; Ucla, Catherine; Antonarakis, Stylianos E; Brent, Michael R

    2003-02-04

    A primary motivation for sequencing the mouse genome was to accelerate the discovery of mammalian genes by using sequence conservation between mouse and human to identify coding exons. Achieving this goal proved challenging because of the large proportion of the mouse and human genomes that is apparently conserved but apparently does not code for protein. We developed a two-stage procedure that exploits the mouse and human genome sequences to produce a set of genes with a much higher rate of experimental verification than previously reported prediction methods. RT-PCR amplification and direct sequencing applied to an initial sample of mouse predictions that do not overlap previously known genes verified the regions flanking one intron in 139 predictions, with verification rates reaching 76%. On average, the confirmed predictions show more restricted expression patterns than the mouse orthologs of known human genes, and two-thirds lack homologs in fish genomes, demonstrating the sensitivity of this dual-genome approach to hard-to-find genes. We verified 112 previously unknown homologs of known proteins, including two homeobox proteins relevant to developmental biology, an aquaporin, and a homolog of dystrophin. We estimate that transcription and splicing can be verified for >1,000 gene predictions identified by this method that do not overlap known genes. This is likely to constitute a significant fraction of the previously unknown, multiexon mammalian genes.

  13. SU-E-T-32: A Feasibility Study of Independent Dose Verification for IMAT

    SciTech Connect

    Kamima, T; Takahashi, R; Sato, Y; Baba, H; Tachibana, H; Yamashita, M; Sugawara, Y

    2015-06-15

    Purpose: To assess the feasibility of independent dose verification (Indp) for intensity modulated arc therapy (IMAT). Methods: An independent dose calculation software program (Simple MU Analysis, Triangle Products, JP) was used in this study, which can compute the radiological path length from the surface to the reference point for each control point using the patient's CT image dataset; the MLC aperture shape was simultaneously modeled from the MLC information in the DICOM-RT plan. Dose calculation was performed using a modified Clarkson method considering MLC transmission and the dosimetric leaf gap. A retrospective analysis was conducted in which IMAT plans from 120 patients for two sites (prostate / head and neck) from four institutes were analyzed to compare the Indp to the TPS using patient CT images. In addition, an ion-chamber measurement was performed to verify the accuracy of the TPS and the Indp in a water-equivalent phantom. Results: The agreement between the Indp and the TPS (mean±1SD) was −0.8±2.4% and −1.3±3.8% for the prostate and head-and-neck regions, respectively. The measurement comparison showed similar results (−0.8±1.6% and 0.1±4.6% for prostate and head and neck). The variation was larger for the head and neck because a larger number of segments placed the reference point under the MLC, where the modified Clarkson method cannot model the smooth falloff of the leaf penumbra. Conclusion: The independent verification program would be practical and effective as a secondary check for IMAT, with sufficient accuracy in both the measurement and the CT-based calculation. The accuracy would be improved by taking the falloff of the leaf penumbra into account.
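
    The radiological path length named in these abstracts is the water-equivalent depth: the line integral of relative electron density along the ray from the surface to the calculation point. A minimal voxel-stepping sketch follows (step size and density values are invented; this is not the SMU implementation):

        # Water-equivalent (radiological) depth: geometric step length along
        # the ray weighted by relative electron density in each voxel.
        # Densities and step size below are hypothetical.
        import numpy as np

        def radiological_path_length(rel_electron_density, step_mm):
            return float(np.sum(np.asarray(rel_electron_density) * step_mm))

        densities = [0.3, 0.3, 1.0, 1.0, 1.05, 1.5]  # e.g., lung, soft tissue, bone
        depth = radiological_path_length(densities, step_mm=2.0)
        print(f"{depth:.1f} mm water-equivalent depth")  # 10.3 mm here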

  14. VAMOS: The verification and monitoring options study: Current research options for in-situ monitoring and verification of contaminant remediation and containment within the vadose zone

    SciTech Connect

    Betsill, J.D.; Gruebel, R.D.

    1995-09-01

    The Verification and Monitoring Options Study Project (VAMOS) was established to identify high-priority options for future vadose-zone environmental research in the areas of in-situ remediation monitoring, post-closure monitoring, and containment emplacement and verification monitoring. VAMOS examined projected needs not currently being met with applied technology in order to develop viable monitoring and verification research options. The study emphasized a compatible systems approach to reinforce the need for utilizing compatible components to provide user-friendly site monitoring systems. To identify the needs and research options related to vadose-zone environmental monitoring and verification, a literature search and expert panel forums were conducted. The search included present drivers for environmental monitoring technology, technology applications, and research efforts. The forums included scientific, academic, industry, and regulatory environmental professionals as well as end users of environmental technology. The experts evaluated current and future monitoring and verification needs, methods for meeting these needs, and viable research options and directions. A variety of high-priority technology development, user facility, and technology guidance research options were developed and presented as an outcome of the literature search and expert panel forums.

  15. Modality Switching in a Property Verification Task: An ERP Study of What Happens When Candles Flicker after High Heels Click.

    PubMed

    Collins, Jennifer; Pecher, Diane; Zeelenberg, René; Coulson, Seana

    2011-01-01

    The perceptual modalities associated with property words, such as flicker or click, have previously been demonstrated to affect subsequent property verification judgments (Pecher et al., 2003). Known as the conceptual modality switch effect, this finding supports the claim that brain systems for perception and action help subserve the representation of concepts. The present study addressed the cognitive and neural substrate of this effect by recording event-related potentials (ERPs) as participants performed a property verification task with visual or auditory properties in key trials. We found that for visual property verifications, modality switching was associated with an increased amplitude N400. For auditory verifications, switching led to a larger late positive complex. Observed ERP effects of modality switching suggest property words access perceptual brain systems. Moreover, the timing and pattern of the effects suggest perceptual systems impact the decision-making stage in the verification of auditory properties, and the semantic stage in the verification of visual properties.

  16. ENVIRONMENTAL TECHNOLOGY VERIFICATION--TEST REPORT OF MOBILE SOURCE EMISSION CONTROL DEVICES, FLINT HILLS RESOURCES, LP, CCD15010 DIESEL FUEL FORMULATION WITH HITEC4121 ADDITIVE

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  17. Hands-on Verification of Mechanics Training: A Cost-Effectiveness Study of Videodisc Simulation.

    ERIC Educational Resources Information Center

    Maher, Thomas G.

    This document reports the results of a study on the feasibility of training smog check mechanics in California via hands-on verification of mechanics' ability to inspect and repair vehicles. The reviews of the research literature that compare the learning effectiveness of different delivery media tend to support the position that in learning, the…

  18. Additives

    NASA Technical Reports Server (NTRS)

    Smalheer, C. V.

    1973-01-01

    The chemistry of lubricant additives is discussed to show what the additives are chemically and what functions they perform in the lubrication of various kinds of equipment. Current theories regarding the mode of action of lubricant additives are presented. The additive groups discussed include the following: (1) detergents and dispersants, (2) corrosion inhibitors, (3) antioxidants, (4) viscosity index improvers, (5) pour point depressants, and (6) antifouling agents.

  19. Passive Tomography for Spent Fuel Verification: Analysis Framework and Instrument Design Study

    SciTech Connect

    White, Timothy A.; Svard, Staffan J.; Smith, Leon E.; Mozin, Vladimir V.; Jansson, Peter; Davour, Anna; Grape, Sophie; Trellue, H.; Deshmukh, Nikhil S.; Wittman, Richard S.; Honkamaa, Tapani; Vaccaro, Stefano; Ely, James

    2015-05-18

    The potential for gamma emission tomography (GET) to detect partial defects within a spent nuclear fuel assembly is being assessed through a collaboration of Support Programs to the International Atomic Energy Agency (IAEA). In the first phase of this study, two safeguards verification objectives have been identified. The first is the independent determination of the number of active pins that are present in the assembly, in the absence of a priori information. The second objective is to provide quantitative measures of pin-by-pin properties, e.g. activity of key isotopes or pin attributes such as cooling time and relative burnup, for the detection of anomalies and/or verification of operator-declared data. The efficacy of GET to meet these two verification objectives will be evaluated across a range of fuel types, burnups, and cooling times, and with a target interrogation time of less than 60 minutes. The evaluation of GET viability for safeguards applications is founded on a modelling and analysis framework applied to existing and emerging GET instrument designs. Monte Carlo models of different fuel types are used to produce simulated tomographer responses to large populations of “virtual” fuel assemblies. Instrument response data are processed by a variety of tomographic-reconstruction and image-processing methods, and scoring metrics specific to each of the verification objectives are defined and used to evaluate the performance of the methods. This paper describes the analysis framework and evaluation metrics, presents example performance-prediction results, and describes the design of a “universal” GET instrument intended to support the full range of verification scenarios envisioned by the IAEA.

  20. Additional EIPC Study Analysis. Final Report

    SciTech Connect

    Hadley, Stanton W; Gotham, Douglas J.; Luciani, Ralph L.

    2014-12-01

    Between 2010 and 2012 the Eastern Interconnection Planning Collaborative (EIPC) conducted a major long-term resource and transmission study of the Eastern Interconnection (EI). With guidance from a Stakeholder Steering Committee (SSC) that included representatives from the Eastern Interconnection States Planning Council (EISPC) among others, the project was conducted in two phases. Phase 1 involved a long-term capacity expansion analysis that involved creation of eight major futures plus 72 sensitivities. Three scenarios were selected for more extensive transmission- focused evaluation in Phase 2. Five power flow analyses, nine production cost model runs (including six sensitivities), and three capital cost estimations were developed during this second phase. The results from Phase 1 and 2 provided a wealth of data that could be examined further to address energy-related questions. A list of 14 topics was developed for further analysis. This paper brings together the earlier interim reports of the first 13 topics plus one additional topic into a single final report.

  1. Verification of Commercial Decontamination Technologies in Bench-Scale Studies Using Bacillus anthracis Spores

    DTIC Science & Technology

    2004-11-17

    Biological indicators used: Bacillus atrophaeus (ATCC 9372) Biological Indicator Spore Strips. Verification of Commercial Decontamination Technologies in Bench-Scale Studies Using Bacillus anthracis Spores, M.L. Taylor, J.V. ...

  2. Verification and Improvement of ERS-1/2 Altimeter Geophysical Data Records for Global Change Studies

    NASA Technical Reports Server (NTRS)

    Shum, C. K.

    2000-01-01

    This Final Technical Report summarizes the research work conducted under NASA's Physical Oceanography Program entitled, Verification and Improvement of ERS-1/2 Altimeter Geophysical Data Records for Global Change Studies, for the time period from January 1, 2000 through June 30, 2000. This report also provides a summary of the investigation from July 1, 1997 - June 30, 2000. The primary objectives of this investigation include verification and improvement of the ERS-1 and ERS-2 radar altimeter geophysical data records for distribution of the data to the ESA-approved U.S. ERS-1/-2 investigators for global climate change studies. Specifically, the investigation is to verify and improve the ERS geophysical data record products by calibrating the instrument and assessing accuracy for the ERS-1/-2 orbital, geophysical, media, and instrument corrections. The purpose is to ensure the consistency of constants, standards, and algorithms with the TOPEX/POSEIDON radar altimeter for global climate change studies, such as the monitoring and interpretation of long-term sea level change. This investigation has provided the current best precise orbits, with the radial orbit accuracy for ERS-1 (Phases C-G) and ERS-2 estimated at the 3-5 cm rms level, a 30-fold improvement compared to the 1993 accuracy. We have finalized the production and verification of the value-added data products for the ERS-1 mission (Phases A, B, C, D, E, F, and G), in collaboration with JPL PODAAC and the University of Texas. Orbit and data verification and improvement of algorithms led to the best data product available to date. ERS-2 altimeter data have been improved, and we have been active in the Envisat (2001 launch) GDR algorithm review and improvement. The data improvement of ERS-1 and ERS-2 led to improvement in the global mean sea surface, marine gravity anomaly, and bathymetry models, and a study of Antarctica mass balance, which was published in Science in 1998.

  3. Aircraft surface coatings study: Verification of selected materials

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Three liquid coatings and four films that might improve and/or maintain the smoothness of transport aircraft surfaces are considered. Laboratory tests were performed on the liquid coatings (elastomeric polyurethanes) exposed to synthetic-type hydraulic fluid, with and without a protective topcoat. Results were analyzed from a 14-month flight service evaluation of coatings applied to the leading edges of an airline 727. Two additional airline service evaluations were initiated. Laboratory tests were conducted on the films, bonded to aluminum substrate with various adhesives, to determine the best film/adhesive combinations. A cost/benefits analysis was performed and recommendations made for future work toward the application of this technology to commercial transports.

  4. Formal Verification of Curved Flight Collision Avoidance Maneuvers: A Case Study

    NASA Astrophysics Data System (ADS)

    Platzer, André; Clarke, Edmund M.

    Aircraft collision avoidance maneuvers are important and complex applications. Curved flight exhibits nontrivial continuous behavior. In combination with the control choices during air traffic maneuvers, this yields hybrid systems with challenging interactions of discrete and continuous dynamics. As a case study illustrating the use of a new proof assistant for a logic for nonlinear hybrid systems, we analyze collision freedom of roundabout maneuvers in air traffic control, where appropriate curved flight, good timing, and compatible maneuvering are crucial for guaranteeing safe spatial separation of aircraft throughout their flight. We show that formal verification of hybrid systems can scale to curved flight maneuvers required in aircraft control applications. We introduce a fully flyable variant of the roundabout collision avoidance maneuver and verify safety properties by compositional verification.

  5. The Cooking and Pneumonia Study (CAPS) in Malawi: Implementation of Remote Source Data Verification

    PubMed Central

    Weston, William; Smedley, James; Bennett, Andrew; Mortimer, Kevin

    2016-01-01

    Background: Source data verification (SDV) is a data monitoring procedure which compares the original records with the Case Report Form (CRF). Traditionally, on-site SDV relies on monitors making multiple visits to study sites, requiring extensive resources. The Cooking And Pneumonia Study (CAPS) is a 24-month village-level cluster randomized controlled trial assessing the effectiveness of an advanced cook-stove intervention in preventing pneumonia in children under five in rural Malawi (www.capstudy.org). CAPS used smartphones to capture digital images of the original records on an electronic CRF (eCRF). In the present study, descriptive statistics are used to report the experience of electronic data capture with remote SDV in a challenging research setting in rural Malawi. Methods: At three-monthly intervals, fieldworkers employed by CAPS captured pneumonia data from the original records onto the eCRF. Fieldworkers also captured digital images of the original records. Once Internet connectivity was available, the data captured on the eCRF and the digital images of the original records were uploaded to a web-based SDV application. This enabled SDV to be conducted remotely from the UK. We conducted SDV of the pneumonia data (occurrence, severity, and clinical indicators) recorded in the eCRF against the data in the digital images of the original records. Results: 664 episodes of pneumonia were recorded after 6 months of follow-up. Of these 664 episodes, 611 (92%) had a finding of pneumonia in the original records. All digital images of the original records were clear and legible. Conclusion: Electronic data capture using eCRFs on mobile technology is feasible in rural Malawi. Capturing digital images of the original records in the field allows remote SDV to be conducted efficiently and securely without requiring additional field visits. We recommend these approaches in similar settings, especially those with health endpoints. PMID:27355447

  6. Feasibility Study on Applying Radiophotoluminescent Glass Dosimeters for CyberKnife SRS Dose Verification

    PubMed Central

    Hsu, Shih-Ming; Hung, Chao-Hsiung; Liao, Yi-Jen; Fu, Hsiao-Mei; Tsai, Jo-Ting

    2017-01-01

    CyberKnife is one of multiple modalities for stereotactic radiosurgery (SRS). Due to the nature of CyberKnife and the characteristics of SRS, dose evaluation of the CyberKnife procedure is critical. A radiophotoluminescent glass dosimeter was used to verify the dose accuracy for the CyberKnife procedure and validate a viable dose verification system for CyberKnife treatment. A radiophotoluminescent glass dosimeter, thermoluminescent dosimeter, and Kodak EDR2 film were used to measure the lateral dose profile and percent depth dose of CyberKnife. A Monte Carlo simulation for dose verification was performed using BEAMnrc to verify the measured results. This study also used a radiophotoluminescent glass dosimeter coupled with an anthropomorphic phantom to evaluate the accuracy of the dose given by CyberKnife. Measurements from the radiophotoluminescent glass dosimeter were compared with the results of a thermoluminescent dosimeter and EDR2 film, and the differences found were less than 5%. The radiophotoluminescent glass dosimeter has some advantages in terms of dose measurements over CyberKnife, such as repeatability, stability, and small effective size. These advantages make radiophotoluminescent glass dosimeters a potential candidate dosimeter for the CyberKnife procedure. This study concludes that radiophotoluminescent glass dosimeters are a promising and reliable dosimeter for CyberKnife dose verification with clinically acceptable accuracy within 5%. PMID:28046056

  7. Feasibility Study on Applying Radiophotoluminescent Glass Dosimeters for CyberKnife SRS Dose Verification.

    PubMed

    Hsu, Shih-Ming; Hung, Chao-Hsiung; Liao, Yi-Jen; Fu, Hsiao-Mei; Tsai, Jo-Ting; Huang, Yung-Hui; Huang, David Y C

    2017-01-01

    CyberKnife is one of multiple modalities for stereotactic radiosurgery (SRS). Due to the nature of CyberKnife and the characteristics of SRS, dose evaluation of the CyberKnife procedure is critical. A radiophotoluminescent glass dosimeter was used to verify the dose accuracy for the CyberKnife procedure and validate a viable dose verification system for CyberKnife treatment. A radiophotoluminescent glass dosimeter, thermoluminescent dosimeter, and Kodak EDR2 film were used to measure the lateral dose profile and percent depth dose of CyberKnife. A Monte Carlo simulation for dose verification was performed using BEAMnrc to verify the measured results. This study also used a radiophotoluminescent glass dosimeter coupled with an anthropomorphic phantom to evaluate the accuracy of the dose given by CyberKnife. Measurements from the radiophotoluminescent glass dosimeter were compared with the results of a thermoluminescent dosimeter and EDR2 film, and the differences found were less than 5%. The radiophotoluminescent glass dosimeter has some advantages in terms of dose measurements over CyberKnife, such as repeatability, stability, and small effective size. These advantages make radiophotoluminescent glass dosimeters a potential candidate dosimeter for the CyberKnife procedure. This study concludes that radiophotoluminescent glass dosimeters are a promising and reliable dosimeter for CyberKnife dose verification with clinically acceptable accuracy within 5%.

  8. Verification of a Quality Management Theory: Using a Delphi Study

    PubMed Central

    Mosadeghrad, Ali Mohammad

    2013-01-01

    Background: A model of quality management called the Strategic Collaborative Quality Management (SCQM) model was developed based on a review of the quality management literature, the findings of a survey on quality management assessment in healthcare organisations, semi-structured interviews with healthcare stakeholders, and a Delphi study of healthcare quality management experts. The purpose of this study was to verify the SCQM model. Methods: The proposed model was further developed using feedback from thirty quality management experts obtained through a Delphi method. Further, a guidebook for its implementation was prepared, including a road map and performance measures. Results: The research led to the development of a context-specific model of quality management for healthcare organisations and a series of guidelines for its implementation. Conclusion: A proper model of quality management should be developed and implemented properly in healthcare organisations to achieve business excellence. PMID:24596883

  9. Verifying an interactive consistency circuit: A case study in the reuse of a verification technology

    NASA Technical Reports Server (NTRS)

    Bickford, Mark; Srivas, Mandayam

    1990-01-01

    The work done at ORA for NASA-LRC in the design and formal verification of a hardware implementation of a scheme for attaining interactive consistency (byzantine agreement) among four microprocessors is presented in view graph form. The microprocessors used in the design are an updated version of a formally verified 32-bit, instruction-pipelined, RISC processor, MiniCayuga. The 4-processor system, which is designed under the assumption that the clocks of all the processors are synchronized, provides software control over the interactive consistency operation. Interactive consistency computation is supported as an explicit instruction on each of the microprocessors. An identical user program executing on each of the processors decides when and on what data interactive consistency must be performed. This exercise also served as a case study to investigate the effectiveness of reusing the technology which was developed during the MiniCayuga effort for verifying synchronous hardware designs. MiniCayuga was verified using the verification system Clio which was also developed at ORA. To assist in reusing this technology, a computer-aided specification and verification tool was developed. This tool specializes Clio to synchronous hardware designs and significantly reduces the tedium involved in verifying such designs. The tool is presented and how it was used to specify and verify the interactive consistency circuit is described.
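
    The abstract does not spell out the interactive consistency algorithm itself; a classic scheme for four synchronous processors tolerating one fault is a single round of relayed broadcasts followed by a per-source majority vote. The sketch below illustrates only that general structure and is not the verified circuit's design (the lying-relay model and the values are assumptions):

        from collections import Counter

        def interactive_consistency(values, faulty=None):
            """One illustrative exchange among len(values) synchronous nodes:
            each node's value is broadcast, relayed by the other nodes, and
            every receiver decides each source's value by majority vote.
            A node listed as `faulty` corrupts everything it relays."""
            n = len(values)
            decided = []
            for receiver in range(n):
                vector = []
                for source in range(n):
                    reports = [values[source]]          # direct message
                    for relay in range(n):
                        if relay in (receiver, source):
                            continue
                        v = values[source]
                        if relay == faulty:             # a faulty relay lies
                            v = -v
                        reports.append(v)
                    vector.append(Counter(reports).most_common(1)[0][0])
                decided.append(vector)
            return decided

        # All loyal nodes compute the same vector despite node 3 relaying lies.
        print(interactive_consistency([1, 1, -1, 1], faulty=3))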

  10. Viability Study for an Unattended UF6 Cylinder Verification Station: Phase I Final Report

    SciTech Connect

    Smith, Leon E.; Miller, Karen A.; Garner, James R.; Branney, Sean; McDonald, Benjamin S.; Webster, Jennifer B.; Zalavadia, Mital A.; Todd, Lindsay C.; Kulisek, Jonathan A.; Nordquist, Heather; Deshmukh, Nikhil S.; Stewart, Scott

    2016-05-31

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass and identification for all declared UF6 cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The US Support Program team consisted of Pacific Northwest National Laboratory (PNNL, lead), Los Alamos National Laboratory (LANL), Oak Ridge National Laboratory (ORNL) and Savannah River National Laboratory (SRNL). At the core of the viability study is a long-term field trial of a prototype UCVS system at a Westinghouse fuel fabrication facility. A key outcome of the study is a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This report provides context for the UCVS concept and the field trial: potential UCVS implementation concepts at an enrichment facility; an overview of UCVS prototype design; field trial objectives and activities. Field trial results and interpretation are presented, with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that the contents of a given cylinder are consistent with previous scans. A modeling study, combined with field-measured instrument

  11. MX Siting Investigation Geotechnical Evaluation. Verification Study - Dry Lake Valley, Nevada. Volume I. Synthesis.

    DTIC Science & Technology

    1980-03-14

    This report is one of several being prepared to describe Verification studies in the Nevada-Utah region. The Verification studies are the final...issued for each valley studied. They will be numbered in the series, FN-TR-27-xx-I, or II, for the Nevada-Utah region. The "xx" will be replaced by an...faults are based on air photo interpretation during a regional fault and earthquake study and have not been field verified (FNI, 1980b). They occur in

  12. Structure Property Studies for Additively Manufactured Parts

    SciTech Connect

    Milenski, Helen M; Schmalzer, Andrew Michael; Kelly, Daniel

    2015-08-17

    Since the invention of modern Additive Manufacturing (AM) processes, engineers and designers have worked hard to capitalize on the unique building capabilities that AM allows. By being able to customize the interior fill of parts, it is now possible to design components with a controlled density and customized internal structure. The creation of new polymers and polymer composites allows for even greater control over the mechanical properties of AM parts. One of the key reasons to explore AM is to bring about a new paradigm in part design, where materials can be strategically optimized in a way that conventional subtractive methods cannot achieve. The two processes investigated in my research were the Fused Deposition Modeling (FDM) process and the Direct Ink Write (DIW) process. The objectives of the research were to determine the impact of in-fill density and morphology on the mechanical properties of FDM parts, and to determine if DIW printed samples could be produced where the filament diameter was varied while the overall density remained constant.

  13. PET/CT imaging for treatment verification after proton therapy: A study with plastic phantoms and metallic implants

    SciTech Connect

    Parodi, Katia; Paganetti, Harald; Cascio, Ethan; Flanz, Jacob B.; Bonab, Ali A.; Alpert, Nathaniel M.; Lohmann, Kevin; Bortfeld, Thomas

    2007-02-15

    The feasibility of off-line positron emission tomography/computed tomography (PET/CT) for routine three-dimensional in-vivo treatment verification of proton radiation therapy is currently under investigation at Massachusetts General Hospital in Boston. In preparation for clinical trials, phantom experiments were carried out to investigate the sensitivity and accuracy of the method depending on irradiation and imaging parameters. Furthermore, they addressed the feasibility of PET/CT as a robust verification tool in the presence of metallic implants. These produce x-ray CT artifacts and fluence perturbations which may compromise the accuracy of treatment planning algorithms. Spread-out Bragg peak proton fields were delivered to different phantoms consisting of polymethylmethacrylate (PMMA), PMMA stacked with lung and bone equivalent materials, and PMMA with titanium rods to mimic implants in patients. PET data were acquired in list mode starting within 20 min after irradiation at a commercial lutetium oxyorthosilicate (LSO)-based PET/CT scanner. The amount and spatial distribution of the measured activity could be well reproduced by calculations based on the GEANT4 and FLUKA Monte Carlo codes. This phantom study supports the potential of millimeter accuracy for range monitoring and lateral field position verification even after low therapeutic dose exposures of 2 Gy, despite the delay between irradiation and imaging. It also indicates the value of PET for treatment verification in the presence of metallic implants, demonstrating a higher sensitivity to fluence perturbations in comparison to a commercial analytical treatment planning system. Finally, it addresses the suitability of LSO-based PET detectors for hadron therapy monitoring. This unconventional application of PET involves count rates which are orders of magnitude lower than in diagnostic tracer imaging, i.e., the signal of interest is comparable to the noise originating from the intrinsic radioactivity of

  14. DNA binding studies of tartrazine food additive.

    PubMed

    Kashanian, Soheila; Zeidali, Sahar Heidary

    2011-07-01

    The interaction of native calf thymus DNA with tartrazine in 10 mM Tris-HCl aqueous solution at neutral pH 7.4 was investigated. Tartrazine is a nitrous derivative and may cause allergic reactions, with a potential of toxicological risk. Tartrazine also induces oxidative stress and DNA damage. Its DNA binding properties were studied by UV-vis and circular dichroism spectra, competitive binding with Hoechst 33258, and viscosity measurements. Tartrazine molecules bind to DNA via a groove mode, as illustrated by hyperchromism in the UV absorption band of tartrazine, a decrease in Hoechst-DNA solution fluorescence, the unchanged viscosity of DNA, and conformational changes such as conversion from a B-like to a C-like form in the circular dichroism spectra of DNA. The binding constants (K(b)) of DNA with tartrazine were calculated at different temperatures. The enthalpy and entropy changes were calculated, according to the Van't Hoff equation, to be +37 kJ mol(-1) and +213 J mol(-1) K(-1), respectively, which indicated that the reaction is predominantly entropically driven. Also, tartrazine does not cleave plasmid DNA. Tartrazine interacts with calf thymus DNA via a groove interaction mode with an intrinsic binding constant of 3.75 × 10(4) M(-1).
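
    The thermodynamic analysis mentioned here follows the Van't Hoff relation ln K(b) = -ΔH/(RT) + ΔS/R, so ΔH and ΔS fall out of a linear fit of ln K(b) against 1/T. A minimal sketch of that fit (the temperatures and binding constants below are illustrative stand-ins, not the paper's data):

        import numpy as np

        R = 8.314  # gas constant, J mol^-1 K^-1

        # Hypothetical binding constants K_b (M^-1) at several temperatures (K).
        T = np.array([288.0, 298.0, 310.0])
        Kb = np.array([2.1e4, 3.75e4, 7.9e4])

        # Van't Hoff: ln K = -dH/R * (1/T) + dS/R, i.e. linear in 1/T.
        slope, intercept = np.polyfit(1.0 / T, np.log(Kb), 1)
        dH = -slope * R         # enthalpy change, J mol^-1
        dS = intercept * R      # entropy change, J mol^-1 K^-1

        # Positive dH and dS indicate an entropically driven association.
        print(f"dH = {dH / 1e3:+.1f} kJ/mol, dS = {dS:+.1f} J/(mol K)")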

  15. Verification of the Calore thermal analysis code.

    SciTech Connect

    Dowding, Kevin J.; Blackwell, Bennie Francis

    2004-07-01

    Calore is the ASC code developed to model steady and transient thermal diffusion with chemistry and dynamic enclosure radiation. An integral part of the software development process is code verification, which addresses the question, 'Are we correctly solving the model equations?' This process aids the developers in that it identifies potential software bugs, and it gives the thermal analyst confidence that a properly prepared input will produce satisfactory output. Grid refinement studies have been performed on problems for which we have analytical solutions. In this talk, the code verification process is overviewed and recent results are presented. Recent verification studies have focused on transient nonlinear heat conduction and on verifying algorithms associated with (tied) contact and adaptive mesh refinement. In addition, an approach to measure the coverage of the verification test suite relative to intended code applications is discussed.
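
    Grid refinement studies of the kind described here typically report an observed order of convergence: against an exact analytical solution, the discretization error on successively refined grids should shrink at the scheme's theoretical rate. A generic sketch of that check (not Calore's actual test harness; the error values are made up):

        import math

        def observed_order(err_coarse, err_fine, refinement_ratio=2.0):
            """Observed order of accuracy p from errors on two grids, e ~ C*h^p."""
            return math.log(err_coarse / err_fine) / math.log(refinement_ratio)

        # Hypothetical L2 errors vs. an analytical conduction solution on grids h, h/2, h/4.
        errors = [4.0e-3, 1.02e-3, 2.6e-4]
        for e_coarse, e_fine in zip(errors, errors[1:]):
            print(f"observed order p = {observed_order(e_coarse, e_fine):.2f}")
        # Values near 2 would confirm a second-order-accurate diffusion discretization.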

  16. Field test for treatment verification of an in-situ enhanced bioremediation study

    SciTech Connect

    Taur, C.K.; Chang, S.C.

    1995-09-01

    Due to a leakage from a 12-inch pressurized diesel steel pipe four years ago, an area of approximately 30,000 square meters was contaminated. A pilot study applying the technology of in-situ enhanced bioremediation was conducted. In the study, a field test kit and on-site monitoring equipment were applied for site characterization and treatment verification. Physically, the enhanced bioremediation study consisted of an air extraction and air supply system, and a nutrient supply network. A consistent sampling methodology was employed. Progress was verified by daily monitoring and monthly verification. The objective of this study was to evaluate the capability of indigenous microorganisms to biodegrade the petroleum hydrocarbons when provided with oxygen and nutrients. Nine extraction wells and eight air sparging wells were installed. The air sparging wells injected air into the geological formation, and the extraction wells provided the underground air circulation. Soil samples were obtained monthly for treatment verification by a Minuteman drilling machine with 2.5-foot-long hollow-stem augers. The samples were analyzed on site for TPH-diesel concentration by a field test kit manufactured by HNU-Hanby, Houston, Texas. The analytical results from the field test kit were compared with the results from an environmental laboratory. The TVPH concentrations of the air extracted from the vadose zone by a vacuum blower and the extraction wells were routinely monitored by a Foxboro FID and a Cosmos XP-311A combustible air detector. The daily monitoring of TVPH concentrations provided reliable data for assessing the remedial progress.

  17. 4D offline PET-based treatment verification in scanned ion beam therapy: a phantom study

    NASA Astrophysics Data System (ADS)

    Kurz, Christopher; Bauer, Julia; Unholtz, Daniel; Richter, Daniel; Stützer, Kristin; Bert, Christoph; Parodi, Katia

    2015-08-01

    At the Heidelberg Ion-Beam Therapy Center, patient irradiation with scanned proton and carbon ion beams is verified by offline positron emission tomography (PET) imaging: the β+-activity measured within the patient is compared to a prediction calculated on the basis of the treatment planning data in order to identify potential delivery errors. Currently, this monitoring technique is limited to the treatment of static target structures. However, intra-fractional organ motion imposes considerable additional challenges to scanned ion beam radiotherapy. In this work, the feasibility and potential of time-resolved (4D) offline PET-based treatment verification with a commercial full-ring PET/CT (x-ray computed tomography) device are investigated for the first time, based on an experimental campaign with moving phantoms. Motion was monitored during the gated beam delivery as well as the subsequent PET acquisition and was taken into account in the corresponding 4D Monte-Carlo simulations and data evaluation. Under the given experimental conditions, millimeter agreement between the prediction and measurement was found. Dosimetric consequences due to the phantom motion could be reliably identified. The agreement between PET measurement and prediction in the presence of motion was found to be similar to that in static reference measurements, thus demonstrating the potential of 4D PET-based treatment verification for future clinical applications.

  18. Decoloration of Amaranth by the white-rot fungus Trametes versicolor. Part II. Verification study.

    PubMed

    Gavril, Mihaela; Hodson, Peter V

    2007-02-01

    The involvement of lignin peroxidase (LiP) in the decoloration of the mono-azo substituted naphthalenic dye Amaranth was investigated with pure enzymes and whole cultures of Trametes versicolor. The verification study confirmed that LiP has a direct influence on the initial decoloration rate and showed that another enzyme, which does not need hydrogen peroxide to function and is not a laccase, also plays a role during decoloration. These results confirm the results of a previous statistical analysis. Furthermore, the fungal mycelium affects the performance of the decoloration process.

  19. Open-source MFIX-DEM software for gas-solids flows: Part I verification studies

    SciTech Connect

    Garg, Rahul; Galvin, Janine; Li, Tingwen; Pannala, Sreekanth

    2012-01-01

    With rapid advancements in computer hardware, it is now possible to perform large simulations of granular flows using the Discrete Element Method (DEM). As a result, solids are increasingly treated in a discrete Lagrangian fashion in the gas solids flow community. In this paper, the open-source MFIX-DEM software is described that can be used for simulating the gas solids flow using an Eulerian reference frame for the continuum fluid and a Lagrangian discrete framework (Discrete Element Method) for the particles. This method is referred to as the continuum discrete method (CDM) to clearly make a distinction between the ambiguity of using a Lagrangian or Eulerian reference for either continuum or discrete formulations. This freely available CDM code for gas solids flows can accelerate the research in computational gas solids flows and establish a baseline that can lead to better closures for the continuum modeling (or traditionally referred to as two fluid model) of gas solids flows. In this paper, a series of verification cases is employed which tests the different aspects of the code in a systematic fashion by exploring specific physics in gas solids flows before exercising the fully coupled solution on simple canonical problems. It is critical to have an extensively verified code as the physics is complex with highly-nonlinear coupling, and it is difficult to ascertain the accuracy of the results without rigorous verification. These series of verification tests set the stage not only for rigorous validation studies (performed in part II of this paper) but also serve as a procedure for testing any new developments that couple continuum and discrete formulations for gas solids flows.
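
    The flavor of the simple canonical cases mentioned here can be illustrated with a single DEM particle in free fall checked against the analytical solution, which isolates the time integrator before any fluid coupling is exercised. This sketch is an illustrative stand-in, not a case taken from the MFIX-DEM test suite:

        g, dt, t_end = 9.81, 1.0e-4, 0.5   # gravity (m/s^2), time step (s), duration (s)

        # Semi-implicit Euler integration of a single particle falling from rest.
        y, v = 0.0, 0.0
        for _ in range(int(t_end / dt)):
            v -= g * dt
            y += v * dt

        y_exact = -0.5 * g * t_end**2      # analytical free-fall displacement
        rel_err = abs(y - y_exact) / abs(y_exact)
        print(f"numerical y = {y:.6f} m, exact y = {y_exact:.6f} m, rel. error = {rel_err:.2e}")
        # Halving dt should roughly halve the error for this first-order integrator.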

  20. Open-source MFIX-DEM software for gas-solids flows: Part 1 - Verification studies

    SciTech Connect

    Garg, Rahul; Galvin, Janine; Li, Tingwen; Pannala, Sreekanth

    2012-04-01

    With rapid advancements in computer hardware, it is now possible to perform large simulations of granular flows using the Discrete Element Method (DEM). As a result, solids are increasingly treated in a discrete Lagrangian fashion in the gas–solids flow community. In this paper, the open-source MFIX-DEM software is described that can be used for simulating the gas–solids flow using an Eulerian reference frame for the continuum fluid and a Lagrangian discrete framework (Discrete Element Method) for the particles. This method is referred to as the continuum discrete method (CDM) to clearly make a distinction between the ambiguity of using a Lagrangian or Eulerian reference for either continuum or discrete formulations. This freely available CDM code for gas–solids flows can accelerate the research in computational gas–solids flows and establish a baseline that can lead to better closures for the continuum modeling (or traditionally referred to as two fluid model) of gas–solids flows. In this paper, a series of verification cases is employed which tests the different aspects of the code in a systematic fashion by exploring specific physics in gas–solids flows before exercising the fully coupled solution on simple canonical problems. It is critical to have an extensively verified code as the physics is complex with highly-nonlinear coupling, and it is difficult to ascertain the accuracy of the results without rigorous verification. These series of verification tests set the stage not only for rigorous validation studies (performed in part II of this paper) but also serve as a procedure for testing any new developments that couple continuum and discrete formulations for gas–solids flows.

  2. Computer Simulations to Study Diffraction Effects of Stacking Faults in Beta-SiC: II. Experimental Verification

    NASA Technical Reports Server (NTRS)

    Pujar, Vijay V.; Cawley, James D.; Levine, S. (Technical Monitor)

    2000-01-01

    Earlier results from computer simulation studies suggest a correlation between the spatial distribution of stacking errors in the Beta-SiC structure and features observed in X-ray diffraction patterns of the material. Reported here are experimental results obtained from two types of nominally Beta-SiC specimens, which yield distinct XRD data. These samples were analyzed using high resolution transmission electron microscopy (HRTEM) and the stacking error distribution was directly determined. The HRTEM results compare well to those deduced by matching the XRD data with simulated spectra, confirming the hypothesis that the XRD data are indicative not only of the presence and density of stacking errors, but also that they can yield information regarding their distribution. In addition, the stacking error population in both specimens is related to their synthesis conditions, and it appears to follow the relation developed by others to explain the formation of the corresponding polytypes.

  3. Verification of Geometric Model-Based Plant Phenotyping Methods for Studies of Xerophytic Plants.

    PubMed

    Drapikowski, Paweł; Kazimierczak-Grygiel, Ewa; Korecki, Dominik; Wiland-Szymańska, Justyna

    2016-06-27

    This paper presents the results of verification of certain non-contact measurement methods of plant scanning to estimate morphological parameters such as length, width, area, volume of leaves and/or stems on the basis of computer models. The best results in reproducing the shape of scanned objects up to 50 cm in height were obtained with the structured-light DAVID Laserscanner. The optimal triangle mesh resolution for scanned surfaces was determined with the measurement error taken into account. The research suggests that measuring morphological parameters from computer models can supplement or even replace phenotyping with classic methods. Calculating precise values of area and volume makes determination of the S/V (surface/volume) ratio for cacti and other succulents possible, whereas for classic methods the result is an approximation only. In addition, the possibility of scanning and measuring plant species which differ in morphology was investigated.
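
    The S/V computation described here follows directly from the triangle mesh: surface area is the sum of triangle areas, and volume follows from the divergence theorem applied to the signed tetrahedra formed with the origin (valid for a closed, consistently oriented mesh). A small sketch of that calculation, using a unit cube as a toy stand-in for a scanned plant:

        import numpy as np

        def surface_area_and_volume(vertices, faces):
            """Area and volume of a closed, consistently oriented triangle mesh.
            Volume uses the divergence theorem: V = sum(v0 . (v1 x v2)) / 6."""
            v0, v1, v2 = (vertices[faces[:, i]] for i in range(3))
            area = 0.5 * np.linalg.norm(np.cross(v1 - v0, v2 - v0), axis=1).sum()
            volume = np.einsum('ij,ij->i', v0, np.cross(v1, v2)).sum() / 6.0
            return area, abs(volume)

        # Unit cube: 8 vertices (index = 4x + 2y + z), 12 outward-oriented triangles.
        verts = np.array([[x, y, z] for x in (0, 1) for y in (0, 1) for z in (0, 1)], float)
        tris = np.array([[0, 1, 3], [0, 3, 2], [4, 6, 7], [4, 7, 5], [0, 4, 5], [0, 5, 1],
                         [2, 3, 7], [2, 7, 6], [0, 2, 6], [0, 6, 4], [1, 5, 7], [1, 7, 3]])
        area, vol = surface_area_and_volume(verts, tris)
        print(f"S = {area:.2f}, V = {vol:.2f}, S/V = {area / vol:.2f}")  # cube: 6, 1, 6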

  4. Verification of Geometric Model-Based Plant Phenotyping Methods for Studies of Xerophytic Plants

    PubMed Central

    Drapikowski, Paweł; Kazimierczak-Grygiel, Ewa; Korecki, Dominik; Wiland-Szymańska, Justyna

    2016-01-01

    This paper presents the results of verification of certain non-contact measurement methods of plant scanning to estimate morphological parameters such as length, width, area, volume of leaves and/or stems on the basis of computer models. The best results in reproducing the shape of scanned objects up to 50 cm in height were obtained with the structured-light DAVID Laserscanner. The optimal triangle mesh resolution for scanned surfaces was determined with the measurement error taken into account. The research suggests that measuring morphological parameters from computer models can supplement or even replace phenotyping with classic methods. Calculating precise values of area and volume makes determination of the S/V (surface/volume) ratio for cacti and other succulents possible, whereas for classic methods the result is an approximation only. In addition, the possibility of scanning and measuring plant species which differ in morphology was investigated. PMID:27355949

  5. ICAN/PART: Particulate composite analyzer, user's manual and verification studies

    NASA Technical Reports Server (NTRS)

    Goldberg, Robert K.; Murthy, Pappu L. N.; Mital, Subodh K.

    1996-01-01

    A methodology for predicting the equivalent properties and constituent microstresses for particulate matrix composites, based on the micromechanics approach, is developed. These equations are integrated into a computer code, developed to predict the equivalent properties and microstresses of fiber-reinforced polymer matrix composites, to form a new computer code, ICAN/PART. Details of the flowchart, input, and output for ICAN/PART are described, along with examples of the input and output. Only the differences between ICAN/PART and the original ICAN code are described in detail, and the user is assumed to be familiar with the structure and usage of the original ICAN code. Detailed verification studies, utilizing three-dimensional finite element and boundary element analyses, are conducted in order to verify that the micromechanics methodology accurately models the mechanics of particulate matrix composites. The equivalent properties computed by ICAN/PART fall within bounds established by the finite element and boundary element results. Furthermore, constituent microstresses computed by ICAN/PART agree in an average sense with results computed using the finite element method. The verification studies indicate that the micromechanics programmed into ICAN/PART do indeed accurately model the mechanics of particulate matrix composites.
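
    The abstract does not reproduce ICAN/PART's micromechanics equations, but the bounds check it describes can be pictured with the classical Voigt (iso-strain) and Reuss (iso-stress) limits on the equivalent modulus of a two-phase particulate composite; a valid micromechanics estimate should fall between them. A sketch with hypothetical constituent properties:

        def voigt(e_matrix, e_particle, vf):
            """Iso-strain (rule-of-mixtures) upper bound on the equivalent modulus."""
            return (1.0 - vf) * e_matrix + vf * e_particle

        def reuss(e_matrix, e_particle, vf):
            """Iso-stress (inverse rule-of-mixtures) lower bound."""
            return 1.0 / ((1.0 - vf) / e_matrix + vf / e_particle)

        # Hypothetical constituents: polymer matrix (3.5 GPa), glass particles (70 GPa).
        for vf in (0.1, 0.3, 0.5):
            print(f"vf = {vf:.1f}: {reuss(3.5, 70.0, vf):5.2f} GPa <= E_eq "
                  f"<= {voigt(3.5, 70.0, vf):5.2f} GPa")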

  6. SU-E-T-50: A Multi-Institutional Study of Independent Dose Verification Software Program for Lung SBRT

    SciTech Connect

    Kawai, D; Takahashi, R; Kamima, T; Baba, H; Yamamoto, T; Kubo, Y; Ishibashi, S; Higuchi, Y; Takahashi, H; Tachibana, H

    2015-06-15

    Purpose: The accuracy of the dose distribution depends on the treatment planning system, especially in regions of heterogeneity. The tolerance level (TL) of the secondary check using independent dose verification may be variable in lung SBRT plans. We conducted a multi-institutional study to evaluate the tolerance level of lung SBRT plans described in AAPM TG-114. Methods: Five institutes in Japan participated in this study. All of the institutes used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, JP), which is Clarkson-based and uses CT images to compute the radiological path length. Analytical Anisotropic Algorithm (AAA), Pencil Beam Convolution with the modified Batho method (PBC-B) and Adaptive Convolve (AC) were used for lung SBRT planning. A measurement using an ion chamber was performed in a heterogeneous phantom to compare doses from the three different algorithms and the SMU to the measured dose. In addition, a retrospective analysis using clinical lung SBRT plans (547 beams from 77 patients) was conducted to evaluate the confidence limit (CL, average ± 2SD) in dose between the three algorithms and the SMU. Results: Compared to the measurement, the AAA showed a larger systematic dose error (2.9±3.2%) than PBC-B and AC. The Clarkson-based SMU showed a larger error of 5.8±3.8%. The CLs for clinical plans were 7.7±6.0% (AAA), 5.3±3.3% (AC), and 5.7±3.4% (PBC-B), respectively. Conclusion: The TLs were evaluated from the CLs. A Clarkson-based system shows a large systematic variation because of its inhomogeneity correction, and the AAA showed a significant variation. Thus, the difference in inhomogeneity correction as well as the dependence on the dose calculation engine must be considered.
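
    The confidence limit used here is simply the mean dose deviation plus or minus twice its standard deviation over a set of beams, which then informs the tolerance level for the secondary check. A minimal sketch of the statistic (the per-beam deviations are made up):

        import statistics

        def confidence_limit(deviations_pct):
            """TG-114-style confidence limit: average ± 2SD of dose deviations (%)."""
            return statistics.mean(deviations_pct), 2.0 * statistics.stdev(deviations_pct)

        # Hypothetical per-beam deviations (%) between planning system and independent check.
        deviations = [4.1, 6.3, 2.8, 5.5, 7.0, 3.9, 5.1]
        mean, two_sd = confidence_limit(deviations)
        print(f"CL = {mean:.1f} ± {two_sd:.1f} %")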

  7. GFO-1 Geophysical Data Record and Orbit Verifications for Global Change Studies

    NASA Technical Reports Server (NTRS)

    Shum, C. K.

    2000-01-01

    This final report summarizes the research work conducted under NASA's Physical Oceanography Program, entitled GFO-1 Geophysical Data Record and Orbit Verifications for Global Change Studies, for the investigation time period from December 1, 1997 through November 30, 2000. The primary objectives of the investigation include providing verification and improvement of the precise orbit, media, geophysical, and instrument corrections needed to accurately reduce the U.S. Navy's Geosat Follow-On-1 (GFO-1) mission radar altimeter data to sea level measurements. The status of the GFO satellite (instrument and spacecraft operations, orbital tracking and altimeter) is summarized. The GFO spacecraft was accepted by the Navy from Ball Aerospace and has been declared operational since November 2000. We participated in four official GFO calibration/validation periods (Cal/Val I-IV), spanning June 1999 through October 2000. Results of the verification of the GFO orbit and geophysical data record measurements, both from NOAA (IGDR) and from the Navy (NGDR), are reported. Our preliminary results indicate that: (1) the precise orbit (GSFC and OSU) can be determined to approx. 5 - 6 cm rms radially using SLR and altimeter crossovers; (2) estimated GFO MOE (GSFC or NRL) radial orbit accuracy is approx. 7 - 30 cm and Operational Doppler orbit accuracy is approx. 60 - 350 cm. After bias and tilt adjustment (1000 km arc), estimated Doppler orbit accuracy is approx. 1.2 - 6.5 cm rms and the MOE accuracy is approx. 1.0 - 2.3 cm; (3) the geophysical and media corrections have been validated against in situ measurements and measurements from other operating altimeters (T/P and ERS-2). Altimeter time bias is insignificant at 0-2 ms. Sea state bias is approx. 3 - 4.5% of SWH. The wet troposphere correction has approx. 1 cm bias and approx. 3 cm rms when compared with ERS-2 data. Use of GIM and IRI95 provides ionosphere corrections accurate to 2-3 cm rms during medium to high solar activities; (4

  8. CDRD and PNPR passive microwave precipitation retrieval algorithms: verification study over Africa and Southern Atlantic

    NASA Astrophysics Data System (ADS)

    Panegrossi, Giulia; Casella, Daniele; Cinzia Marra, Anna; Petracca, Marco; Sanò, Paolo; Dietrich, Stefano

    2015-04-01

    to all available PMW radiometers in the GPM constellation of satellites (including NPP Suomi ATMS, and GMI). Three years of SSMIS and AMSU/MHS data have been considered to carry out a verification study over Africa of the retrievals from the CDRD and PNPR algorithms. The precipitation products from the TRMM Precipitation Radar (PR) (TRMM products 2A25 and 2A23) have been used as ground truth. The results of this study, aimed at assessing the accuracy of the precipitation retrievals in different climatic regions and precipitation regimes, will be presented. Particular emphasis will be given to the analysis of the level of coherence of the precipitation estimates and patterns between the two algorithms exploiting different radiometers. Recent developments aimed at the full exploitation of the GPM constellation of satellites for optimal precipitation/drought monitoring will also be presented.

  9. A hybrid Bayesian hierarchical model combining cohort and case-control studies for meta-analysis of diagnostic tests: Accounting for partial verification bias.

    PubMed

    Ma, Xiaoye; Chen, Yong; Cole, Stephen R; Chu, Haitao

    2016-12-01

    To account for between-study heterogeneity in meta-analysis of diagnostic accuracy studies, bivariate random effects models have been recommended to jointly model the sensitivities and specificities. As study design and population vary, the definition of disease status or severity could differ across studies. Consequently, sensitivity and specificity may be correlated with disease prevalence. To account for this dependence, a trivariate random effects model has been proposed. However, that approach can only include cohort studies with information for estimating study-specific disease prevalence. In addition, some diagnostic accuracy studies only select a subset of samples to be verified by the reference test. It is known that ignoring unverified subjects may lead to partial verification bias in the estimation of prevalence, sensitivities, and specificities in a single study. However, the impact of this bias on a meta-analysis has not been investigated. In this paper, we propose a novel hybrid Bayesian hierarchical model combining cohort and case-control studies and correcting partial verification bias at the same time. We investigate the performance of the proposed methods through a set of simulation studies. Two case studies on assessing the diagnostic accuracy of gadolinium-enhanced magnetic resonance imaging in detecting lymph node metastases and of adrenal fluorine-18 fluorodeoxyglucose positron emission tomography in characterizing adrenal masses are presented.
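
    Partial verification bias, as described here, arises when only a subset of subjects (often those testing positive) receives the reference standard; naively computing accuracy from verified subjects alone then inflates sensitivity and deflates specificity. A small simulation of that effect (all rates are made-up parameters; this is not the paper's hierarchical model):

        import random

        random.seed(0)
        SE, SP, PREV = 0.80, 0.90, 0.30          # true sensitivity, specificity, prevalence
        P_VERIFY_POS, P_VERIFY_NEG = 0.95, 0.30  # verification depends on the index test

        tp = fp = fn = tn = 0
        for _ in range(100_000):
            diseased = random.random() < PREV
            test_pos = random.random() < (SE if diseased else 1.0 - SP)
            verified = random.random() < (P_VERIFY_POS if test_pos else P_VERIFY_NEG)
            if not verified:
                continue                          # naive analysis drops unverified subjects
            if test_pos and diseased: tp += 1
            elif test_pos:            fp += 1
            elif diseased:            fn += 1
            else:                     tn += 1

        print(f"naive sensitivity = {tp / (tp + fn):.3f} (true {SE})")
        print(f"naive specificity = {tn / (tn + fp):.3f} (true {SP})")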

  10. A study on the factors that affect the advanced mask defect verification

    NASA Astrophysics Data System (ADS)

    Woo, Sungha; Jang, Heeyeon; Lee, Youngmo; Kim, Sangpyo; Yim, Donggyu

    2015-10-01

    Defect verification has become significantly more difficult at advanced technology nodes. The traditional method of defect (including repair point) control consists of inspection, AIMS, and repair steps. Among them, the AIMS process needs various wafer lithography conditions, such as NA, inner/outer sigma, illumination shape, and so on. Its ability to analyze every layer accurately is limited because the AIMS tool uses a physical aperture system. It also requires meticulous management of exposure conditions and CD target values, which change frequently in advanced masks. We report on the influence of several AIMS parameters on defect analysis, including repair points. Under various illumination conditions with different patterns, the analysis showed significant correlation in the defect results. Defects can be analyzed within a certain error budget based on the management specification required for each layer. In addition, this provided us with one of the clues in the analysis of wafer repeating defects. Finally, we present an 'optimal specification' for defect management with a common AIMS recipe and suggest an advanced mask process flow.

  11. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM CASE STUDIES: DEMONSTRATING PROGRAM OUTCOMES, VOLUME II

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This bookle...

  12. Block 2 SRM conceptual design studies. Volume 1, Book 2: Preliminary development and verification plan

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Activities that will be conducted in support of the development and verification of the Block 2 Solid Rocket Motor (SRM) are described. Development includes design, fabrication, processing, and testing activities in which the results are fed back into the project. Verification includes analytical and test activities which demonstrate SRM component/subassembly/assembly capability to perform its intended function. The management organization responsible for formulating and implementing the verification program is introduced. It also identifies the controls which will monitor and track the verification program. Integral with the design and certification of the SRM are other pieces of equipment used in transportation, handling, and testing which influence the reliability and maintainability of the SRM configuration. The certification of this equipment is also discussed.

  13. Can self-verification strivings fully transcend the self-other barrier? Seeking verification of ingroup identities.

    PubMed

    Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B

    2009-12-01

    Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1-3), the findings also show that people strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.

  14. Study on thermal effects & sulfurized additives, in lubricating greases

    NASA Astrophysics Data System (ADS)

    Shah, Ami Atul

    Lithium-base grease constitutes about 50% of the market. Greases are developed to work under multiple operating conditions and to have a long working life. Greases with extreme-pressure additives and anti-wear additives have been developed as a solution for many of these applications. These greases are tested under ASTM D2266 conditions to meet the requirements. The actual working conditions, however, differ from the test conditions. The loading, speed, and temperature conditions can be harsher, or fluctuating in nature. The cyclic nature of these parameters cannot be directly related to the test performance. For this purpose, studies on performance under spectrum loading, variable speed, and fluctuating temperature must be performed. This study includes tests to understand the effect of thermal variation on some of the most commonly used grease additives that perform well under ASTM D2266 testing conditions. The studied additives include the most widely used industrial extreme-pressure additive, MoS2. The performance of ZDDP, which is being considered as a replacement for MoS2 in industrial applications, has also been studied. The tests cover extreme-pressure, anti-wear, and friction-modifier additives to give a general idea of the effects of thermal variation in these three areas. Sulphur is the most common extreme-pressure additive, and sulphur-based MoS2 is an extensively used grease additive. The tribological performance of this additive was studied through wear testing and SEM/EDX analysis. This performance was also studied for other metallic sulfides, such as WS2, and a sulphur-based organic compound. The aim is to study how the type of bond that sulphur shares in the additive's structure affects its performance. MoS2 film formation is found to be based on FeS formation on the substrate and protection through sacrificial monolayer deposition of the sheared MoS2 structure. The free Mo then tends to oxidise. An attempt to

  15. WTS-4 system verification unit for wind/hydroelectric integration study

    NASA Astrophysics Data System (ADS)

    Watts, A. W.

    The Bureau of Reclamation (Reclamation) initiated a study to investigate the concept of integrating 100 MW of wind energy from megawatt-size wind turbines with the Federal hydroelectric system. As a part of the study, one large wind turbine was purchased through the competitive bid process and is now being installed to serve as a system verification unit (SVU). Reclamation negotiated an agreement with NASA to provide technical management of the project for the design, fabrication, installation, testing, and initial operation. Hamilton Standard was awarded a contract to furnish and install its WTS-4 wind turbine rated at 4 MW at a site near Medicine Bow, Wyoming. The purposes for installing the SVU are to fully evaluate the wind/hydro integration concept, make technical evaluation of the hardware design, train personnel in the technology, evaluate operation and maintenance aspects, and evaluate associated environmental impacts. The SVU will be operational in June 1982. Data from the WTS-4 and from a second SVU, Boeing's MOD-2, will be used to prepare a final design for a 100-MW farm if Congress authorizes the project.

  16. Formal Methods for Verification and Validation of Partial Specifications: A Case Study

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Callahan, John

    1997-01-01

    This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors, without the burden of full proofs of correctness. We describe a case study of the use of partial formal models for V&V of the requirements for Fault Detection Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and it is the process of formalization, rather than the end product that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.

  17. WTS-4 system verification unit for wind/hydroelectric integration study

    NASA Technical Reports Server (NTRS)

    Watts, A. W.

    1982-01-01

    The Bureau of Reclamation (Reclamation) initiated a study to investigate the concept of integrating 100 MW of wind energy from megawatt-size wind turbines with the Federal hydroelectric system. As a part of the study, one large wind turbine was purchased through the competitive bid process and is now being installed to serve as a system verification unit (SVU). Reclamation negotiated an agreement with NASA to provide technical management of the project for the design, fabrication, installation, testing, and initial operation. Hamilton Standard was awarded a contract to furnish and install its WTS-4 wind turbine rated at 4 MW at a site near Medicine Bow, Wyoming. The purposes for installing the SVU are to fully evaluate the wind/hydro integration concept, make technical evaluation of the hardware design, train personnel in the technology, evaluate operation and maintenance aspects, and evaluate associated environmental impacts. The SVU will be operational in June 1982. Data from the WTS-4 and from a second SVU, Boeing's MOD-2, will be used to prepare a final design for a 100-MW farm if Congress authorizes the project.

  18. Transition Metal Complexes of Naproxen: Synthesis, Characterization, Forced Degradation Studies, and Analytical Method Verification

    PubMed Central

    Hasan, Md. Sharif; Kayesh, Ruhul; Begum, Farida; Rahman, S. M. Abdur

    2016-01-01

    The aim of our current research was to synthesize some transition metal complexes of Naproxen, determine their physical properties, and examine their relative stability under various conditions. Characterization of these complexes was done by 1H-NMR, Differential Scanning Calorimetry (DSC), FT-IR, HPLC, and scanning electron microscopy (SEM). The complexes were subjected to acidic, basic, and aqueous hydrolysis as well as oxidation, reduction, and thermal degradation. In addition, the reversed-phase high-performance liquid chromatography (RP-HPLC) method for Naproxen outlined in the USP was verified for the Naproxen-metal complexes with respect to accuracy, precision, solution stability, robustness, and system suitability. The melting points of the complexes were higher than that of the parent drug molecule, suggesting their thermal stability. In the forced degradation study, the complexes were found to be more stable than Naproxen itself under all conditions: acidic, basic, oxidative, and reductive media. All the HPLC verification parameters were found to be within acceptable values. Therefore, it can be concluded from the study that the metal complexes of Naproxen can be a more stable drug entity and offer better efficacy and longer shelf life than the parent Naproxen. PMID:27034891

  19. A Modeling and Verification Study of Summer Precipitation Systems Using NASA Surface Initialization Datasets

    NASA Technical Reports Server (NTRS)

    Case, Jonathan L.; Kumar, Sujay V.; Srikishen, Jayanthi; Jedlovec, Gary J.

    2010-01-01

    One of the most challenging weather forecast problems in the southeastern U.S. is daily summertime pulse-type convection. During the summer, atmospheric flow and forcing are generally weak in this region; thus, convection typically initiates in response to local forcing along sea/lake breezes, and other discontinuities often related to horizontal gradients in surface heating rates. Numerical simulations of pulse convection usually have low skill, even in local predictions at high resolution, due to the inherent chaotic nature of these precipitation systems. Forecast errors can arise from assumptions within parameterization schemes, model resolution limitations, and uncertainties in both the initial state of the atmosphere and land surface variables such as soil moisture and temperature. For this study, it is hypothesized that high-resolution, consistent representations of surface properties such as soil moisture, soil temperature, and sea surface temperature (SST) are necessary to better simulate the interactions between the surface and atmosphere, and ultimately improve predictions of summertime pulse convection. This paper describes a sensitivity experiment using the Weather Research and Forecasting (WRF) model. Interpolated land and ocean surface fields from a large-scale model are replaced with high-resolution datasets provided by unique NASA assets in an experimental simulation: the Land Information System (LIS) and Moderate Resolution Imaging Spectroradiometer (MODIS) SSTs. The LIS is run in an offline mode for several years at the same grid resolution as the WRF model to provide compatible land surface initial conditions in an equilibrium state. The MODIS SSTs provide detailed analyses of SSTs over the oceans and large lakes compared to current operational products. The WRF model runs initialized with the LIS+MODIS datasets result in a reduction in the overprediction of rainfall areas; however, the skill is almost equally as low in both experiments using

  20. Studies of a Proton Bunch Phase Monitor for Range Verification in Proton Therapy

    SciTech Connect

    Werner, T.; Golnik, C.; Enghardt, W.; Petzoldt, J.; Kormoll, T.; Pausch, G.; Straessner, A.; Roemer, K.; Dreyer, A.; Hueso-Gonzalez, F.; Enghardt, W.

    2015-07-01

    A primary subject of present research in particle therapy is to ensure the precise irradiation of the target volume. The prompt gamma timing (PGT) method provides one possibility for in vivo range verification during the irradiation of patients. Prompt gamma rays with high energies are emitted promptly due to nuclear reactions of protons with tissue. The arrival time of these gammas at the detector reflects the stopping process of the primary protons in tissue and is directly correlated to the range. Due to the time resolution of the detector and the proton bunch time spread, as well as drifts of the bunch phase with respect to the accelerator frequency, timing spectra are smeared out and compromise the accuracy of range information intended for future clinical applications. Nevertheless, to counteract this limitation and recover range information from the measured PGT spectra, corrections using a bunch phase monitor can be performed. A first prototype of the bunch phase monitor was tested at GSI Darmstadt, where measurements of the energy correlation profile of the ion bunches were performed. At the ELBE accelerator at Helmholtz-Zentrum Dresden-Rossendorf (HZDR), set up to provide bremsstrahlung photons in very short pulses, a constant fraction algorithm for the incoming digital signals was evaluated, which is used for optimizing the time resolution. Scattering experiments with different thin targets and detector positions were accomplished at OncoRay Dresden, where a clinical proton beam is available. These experiments allow a basic characterization of the proton bunch structure and the detection yield. (authors)

  1. Studies of a proton phase beam monitor for range verification in proton therapy

    SciTech Connect

    Werner, T.; Golnik, C.; Enghardt, W.; Petzoldt, J.; Kormoll, T.; Pausch, G.; Straessner, A.; Roemer, K.; Dreyer, A.; Hueso-Gonzalez, F.; Enghardt, W.

    2015-07-01

    A primary subject of present research in particle therapy is to ensure the precise irradiation of the target volume. The prompt gamma timing (PGT) method provides one possibility for in vivo range verification during the irradiation of patients. Prompt gamma rays with high energies are emitted promptly due to nuclear reactions of protons with tissue. The arrival time of these gammas at the detector reflects the stopping process of the primary protons in tissue and is directly correlated to the range. Due to the time resolution of the detector and the proton bunch time spread, as well as drifts of the bunch phase with respect to the accelerator frequency, timing spectra are smeared out and compromise the accuracy of range information intended for future clinical applications. Nevertheless, to counteract this limitation and recover range information from the measured PGT spectra, corrections using a phase beam monitor can be performed. A first prototype of the phase beam monitor was tested at GSI Darmstadt, where measurements of the energy profile of the ion bunches were performed. At the ELBE accelerator at Helmholtz-Zentrum Dresden-Rossendorf (HZDR), set up to provide bremsstrahlung photons in very short pulses, a constant fraction algorithm for the incoming digital signals was evaluated, which is used for optimizing the time resolution. Scattering experiments with different thin targets and detector positions were accomplished at OncoRay Dresden, where a clinical proton beam is available. These experiments allow a basic characterization of the proton bunch structure and the detection yield. (authors)
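
    The correction alluded to in both records can be pictured as shifting each timing spectrum by the bunch phase drift reported by the monitor before accumulating statistics; without the shift, drifts smear out the summed spectrum. A schematic sketch with toy numbers (this is not the authors' processing chain):

        import numpy as np

        rng = np.random.default_rng(1)
        bins = np.linspace(-2.0, 2.0, 161)                 # timing axis (ns)

        smeared = np.zeros(len(bins) - 1)
        corrected = np.zeros(len(bins) - 1)
        for _ in range(50):                                # 50 short acquisitions
            drift = rng.normal(0.0, 0.4)                   # bunch phase drift (ns), assumed measured
            times = rng.normal(drift, 0.25, 2000)          # PGT arrival times around the drift
            smeared += np.histogram(times, bins)[0]
            corrected += np.histogram(times - drift, bins)[0]  # monitor-based correction

        def width(hist):
            """Standard deviation of a binned distribution as a width measure."""
            centers = 0.5 * (bins[:-1] + bins[1:])
            mean = np.average(centers, weights=hist)
            return np.sqrt(np.average((centers - mean) ** 2, weights=hist))

        print(f"width without correction: {width(smeared):.3f} ns")
        print(f"width with correction:    {width(corrected):.3f} ns")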

  2. Preventing illegal tobacco and alcohol sales to minors through electronic age-verification devices: a field effectiveness study.

    PubMed

    Krevor, Brad; Capitman, John A; Oblak, Leslie; Cannon, Joanna B; Ruwe, Mathilda

    2003-01-01

    Efforts to prohibit the sales of tobacco and alcohol products to minors are widespread. Electronic Age Verification (EAV) devices are one possible means to improve compliance with sales-to-minors laws. The purpose of this study was to evaluate the implementation and effectiveness of EAV devices in terms of the frequency and accuracy of age verification, as well as to examine the impact of EAVs on the retail environment. Two study locations were selected: Tallahassee, Florida and Iowa City, Iowa. Retail stores were invited to participate in the study, producing a self-selected experimental group. Stores that did not elect to test the EAVs comprised the comparison group. The data sources included: 1) mystery shopper inspections: two pre- and five post-EAV installation mystery shopper inspections of tobacco and alcohol retailers; 2) retail clerk and manager interviews; and 3) customer interviews. The study found that installing EAV devices with minimal training and encouragement did not increase age verification and underage sales refusal. Surveyed clerks reported positive experiences using the electronic ID readers, and customers reported almost no discomfort about being asked to swipe their IDs. Observations from this study support the need for a more comprehensive system for responsible retailing.

  3. Verification of the FtCayuga fault-tolerant microprocessor system. Volume 1: A case study in theorem prover-based verification

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam; Bickford, Mark

    1991-01-01

    The design and formal verification of a hardware system for a task that is an important component of a fault tolerant computer architecture for flight control systems is presented. The hardware system implements an algorithm for obtaining interactive consistency (byzantine agreement) among four microprocessors as a special instruction on the processors. The property verified ensures that an execution of the special instruction by the processors correctly accomplishes interactive consistency, provided certain preconditions hold. An assumption is made that the processors execute synchronously. For verification, the authors used a computer-aided hardware design verification tool, Spectool, and the theorem prover, Clio. A major contribution of the work is the demonstration of a significant fault tolerant hardware design that is mechanically verified by a theorem prover.

  4. MX Siting Investigation. Geotechnical Evaluation. Volume III. Nevada - Utah Verification Studies, FY 79. Geotechnical Data Snake East CDP, Utah.

    DTIC Science & Technology

    1979-08-24

    MX SITING INVESTIGATION, GEOTECHNICAL EVALUATION, VOLUME III: NEVADA-UTAH VERIFICATION STUDIES, FY 79, GEOTECHNICAL DATA, SNAKE EAST CDP, UTAH. Prepared for: U. S. Department...Beach Boulevard, Long Beach, California 90807. 24 August 1979. FN-TR-27-III, Volume III, Geotechnical Data, Snake East CDP. TABLE...Railroad and Big Smoky CDP's. Geotechnical Data Volumes: Volume II - Whirlwind CDP; Volume III - Snake East CDP; Volume IV - Hamlin CDP; Volume V - White River

  5. Teaching and Learning Components for Induction and Renewable Teacher Certification: A Statewide Verification Study.

    ERIC Educational Resources Information Center

    Logan, Connie S.; And Others

    The purpose of this paper is to report the results of a statewide content verification survey (CVS) of teachers to verify an initial set of System for Teaching and Learning Assessment and Review (STAR) teaching and learning components as being reasonable expectations for beginning and/or experienced teachers in Louisiana. The draft STAR was…

  6. TEST/QA PLAN FOR THE VERIFICATION TESTING OF ALTERNATIVES OR REFORMULATED LIQUID FUELS, FUEL ADDITIVES, FUEL EMULSONS, AND LUBRICANTS FOR HIGHWAY AND NONROAD USE HEAVY DUTY DIESEL ENGINES AND LIGHT DUTY GASOLINE ENGINES AND VEHICLES

    EPA Science Inventory

    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  7. Feasibility study of patient positioning verification in electron beam radiotherapy with an electronic portal imaging device (EPID).

    PubMed

    Ramm, U; Köhn, J; Rodriguez Dominguez, R; Licher, J; Koch, N; Kara, E; Scherf, C; Rödel, C; Weiß, C

    2014-03-01

    The purpose of this study is to demonstrate the feasibility of verification and documentation in electron beam radiotherapy using the photon contamination detected with an electronic portal imaging device. To investigate electron beam verification with an EPID, portal images were acquired by irradiating two different tissue-equivalent phantoms at different electron energies. Measurements were performed on an Elekta SL 25 linear accelerator with an amorphous-Si electronic portal imaging device (EPID: iViewGT, Elekta Oncology Systems, Crawley, UK). As measures of EPID image quality, contrast (CR) and signal-to-noise ratio (SNR) were determined. For characterisation of the EPID imaging, RW3 slabs and a Gammex 467 phantom with different material inserts were used. With increasing electron energy, the intensity of the photon contamination increases, yielding an increasing signal-to-noise ratio, but the images show decreasing contrast. Because the signal-to-noise ratio saturates with increasing dose, a minimum of 50 MUs is recommended. Even though image quality depends on electron energy and patient diameter, the acquired results are mostly sufficient to assess the accuracy of beam positioning. In general, online EPID acquisition has been demonstrated to be an effective electron beam verification and documentation method. The results show that this procedure can be routinely and reliably applied in patient treatment with electron beams.
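
    The image-quality figures used here, contrast and signal-to-noise ratio, reduce to simple statistics over regions of interest in the portal image. A minimal sketch with a synthetic image (the CR and SNR definitions below are common choices; the abstract does not state its exact formulas):

        import numpy as np

        rng = np.random.default_rng(42)
        # Synthetic portal image: a brighter insert on a noisy background.
        image = rng.normal(100.0, 5.0, (200, 200))
        image[80:120, 80:120] += 30.0

        insert = image[80:120, 80:120]
        background = image[0:40, 0:40]

        snr = insert.mean() / insert.std()   # signal-to-noise ratio of the insert ROI
        contrast = (insert.mean() - background.mean()) / (insert.mean() + background.mean())

        print(f"SNR = {snr:.1f}, contrast = {contrast:.3f}")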

  8. Nonlinear 3D MHD verification study: SpeCyl and PIXIE3D codes for RFP and Tokamak plasmas

    NASA Astrophysics Data System (ADS)

    Bonfiglio, D.; Cappello, S.; Chacon, L.

    2010-11-01

    A strong emphasis is presently placed in the fusion community on reaching predictive capability of computational models. An essential requirement of such an endeavor is the process of assessing the mathematical correctness of computational tools, termed verification [1]. We present here a successful nonlinear cross-benchmark verification study between the 3D nonlinear MHD codes SpeCyl [2] and PIXIE3D [3]. Excellent quantitative agreement is obtained in both 2D and 3D nonlinear visco-resistive dynamics for reversed-field pinch (RFP) and tokamak configurations [4]. RFP dynamics, in particular, lends itself as an ideal non-trivial test-bed for 3D nonlinear verification. Perspectives for future application of the fully-implicit parallel code PIXIE3D to RFP physics, in particular to address open issues on RFP helical self-organization, will be provided. [1] M. Greenwald, Phys. Plasmas 17, 058101 (2010) [2] S. Cappello and D. Biskamp, Nucl. Fusion 36, 571 (1996) [3] L. Chacón, Phys. Plasmas 15, 056103 (2008) [4] D. Bonfiglio, L. Chacón and S. Cappello, Phys. Plasmas 17 (2010)

  9. Verification and validation benchmarks.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
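
    The "manufactured solutions" approach recommended above can be made concrete with a short sketch. The following Python example, an illustration rather than anything from the paper, verifies a second-order finite-difference Poisson solver by choosing an exact solution, deriving the matching source term, and checking that the error falls at the expected rate under grid refinement.

        import numpy as np

        def solve_poisson(n):
            # Second-order finite differences for -u'' = f on [0,1], u(0)=u(1)=0.
            h = 1.0 / (n + 1)
            x = np.linspace(h, 1.0 - h, n)
            # Manufactured solution u(x) = sin(pi x) implies f(x) = pi^2 sin(pi x).
            f = np.pi**2 * np.sin(np.pi * x)
            A = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
                 - np.diag(np.ones(n - 1), -1)) / h**2
            u = np.linalg.solve(A, f)
            return np.max(np.abs(u - np.sin(np.pi * x)))  # discretization error

        e1, e2 = solve_poisson(32), solve_poisson(64)
        print(f"observed order ~ {np.log2(e1 / e2):.2f}")  # expect ~2 for this scheme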

  10. Remote Source Document Verification in Two National Clinical Trials Networks: A Pilot Study

    PubMed Central

    Mealer, Meredith; Kittelson, John; Thompson, B. Taylor; Wheeler, Arthur P.; Magee, John C.; Sokol, Ronald J.; Moss, Marc; Kahn, Michael G.

    2013-01-01

    Objective: Barriers to executing large-scale randomized controlled trials include costs, complexity, and regulatory requirements. We hypothesized that source document verification (SDV) via remote electronic monitoring is feasible. Methods: Five hospitals from two NIH sponsored networks provided remote electronic access to study monitors. We evaluated pre-visit remote SDV compared to traditional on-site SDV using a randomized convenience sample of all study subjects due for a monitoring visit. The number of data values verified and the time to perform remote and on-site SDV were collected. Results: Thirty-two study subjects were randomized to either remote SDV (N=16) or traditional on-site SDV (N=16). Technical capabilities, remote access policies and regulatory requirements varied widely across sites. In the adult network, only 14 of 2965 data values (0.47%) could not be located remotely. In the traditional on-site SDV arm, 3 of 2608 data values (0.12%) required coordinator help. In the pediatric network, all 198 data values in the remote SDV arm and all 183 data values in the on-site SDV arm were located. Although not statistically significant, there was a consistent trend for more time consumed per data value (minutes +/- SD): Adult 0.50 +/- 0.17 min vs. 0.39 +/- 0.10 min (two-tailed t-test p=0.11); Pediatric 0.99 +/- 1.07 min vs. 0.56 +/- 0.61 min (p=0.37) and time per case report form: Adult: 4.60 +/- 1.42 min vs. 3.60 +/- 0.96 min (p=0.10); Pediatric: 11.64 +/- 7.54 min vs. 6.07 +/- 3.18 min (p=0.10) using remote SDV. Conclusions: Because each site had different policies, requirements, and technologies, a common approach to assimilating monitors into the access management system could not be implemented. Despite substantial technology differences, more than 99% of data values were successfully monitored remotely. This pilot study demonstrates the feasibility of remote monitoring and the need to develop consistent access policies for remote study monitoring.

  11. Feasibility study of a dual detector configuration concept for simultaneous megavoltage imaging and dose verification in radiotherapy

    SciTech Connect

    Deshpande, Shrikant; McNamara, Aimee L.; Holloway, Lois; Metcalfe, Peter; Vial, Philip

    2015-04-15

    Purpose: To test the feasibility of a dual detector concept for comprehensive verification of external beam radiotherapy. Specifically, the authors test the hypothesis that a portal imaging device coupled to a 2D dosimeter provides a system capable of simultaneous imaging and dose verification, and that the presence of each device does not significantly detract from the performance of the other. Methods: The dual detector configuration comprised a standard radiotherapy electronic portal imaging device (EPID) positioned directly on top of an ionization-chamber array (ICA), with 2 cm of solid water buildup material (between EPID and ICA) and 5 cm of solid backscatter material. The dose response characteristics of the ICA and the imaging performance of the EPID in the dual detector configuration were compared to the performance in their respective reference clinical configurations. The reference clinical configurations were 6 cm of solid water buildup material, an ICA, and 5 cm of solid water backscatter material as the reference dosimetry configuration, and an EPID with no additional buildup or solid backscatter material as the reference imaging configuration. The dose response of the ICA was evaluated by measuring the detector's response with respect to off-axis position, field size, and transit object thickness. Clinical dosimetry performance was evaluated by measuring a range of clinical intensity-modulated radiation therapy (IMRT) beams in transit and nontransit geometries. The imaging performance of the EPID was evaluated quantitatively by measuring the contrast-to-noise ratio (CNR) and spatial resolution. Images of an anthropomorphic phantom were also used for qualitative assessment. Results: The measured off-axis and field size response with the ICA in both transit and nontransit geometries for both the dual detector configuration and the reference dosimetry configuration agreed to within 1%. Transit dose response as a function of object thickness agreed to within 0.5%. All

  12. Expert system verification and validation study. Phase 2: Requirements Identification. Delivery 2: Current requirements applicability

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The second phase of a task is described whose ultimate purpose is to ensure that adequate Expert System (ES) Verification and Validation (V and V) tools and techniques are available for Space Station Freedom Program Knowledge Based Systems development. The purpose of this phase is to recommend modifications to current software V and V requirements which will extend the applicability of the requirements to NASA ESs.

  13. Simulation verification techniques study. Task report 4: Simulation module performance parameters and performance standards

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Shuttle simulation software modules in the environment, crew station, vehicle configuration and vehicle dynamics categories are discussed. For each software module covered, a description of the module functions and operational modes, its interfaces with other modules, its stored data, inputs, performance parameters and critical performance parameters is given. Reference data sources which provide standards of performance are identified for each module. Performance verification methods are also discussed briefly.

  14. Benchmark Study of Industrial Needs for Additive Manufacturing in Finland

    NASA Astrophysics Data System (ADS)

    Lindqvist, Markku; Piili, Heidi; Salminen, Antti

    Additive manufacturing (AM) is a modern way to produce parts for industrial use. Even though the technical knowledge and research of AM processes are strong in Finland, there are only a few industrial applications. The aim of this study is to collect practical knowledge from companies that are interested in the industrial use of AM, especially in South-Eastern Finland. A further goal of this study is to investigate the demands and requirements of applications for the industrial use of AM in this area of Finland. It was concluded that two of the reasons preventing wider industrial use of AM in Finland are misplaced expectations of this technology and a lack of basic knowledge of its possibilities. In particular, it was noticed that the strong 3D-printing hype is even causing misunderstandings. Nevertheless, the high-level industrial know-how in the area, built around the Finnish lumber industry, is a strong foundation for additive manufacturing technology.

  15. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…

  16. Electrostatic Levitation for Studies of Additive Manufactured Materials

    NASA Technical Reports Server (NTRS)

    SanSoucie, Michael P.; Rogers, Jan R.; Tramel, Terri

    2014-01-01

    The electrostatic levitation (ESL) laboratory at NASA's Marshall Space Flight Center is a unique facility for investigators studying high temperature materials. The laboratory boasts two levitators in which samples can be levitated, heated, melted, undercooled, and resolidified. Electrostatic levitation minimizes gravitational effects and allows materials to be studied without contact with a container or instrumentation. The lab also has a high temperature emissivity measurement system, which provides normal spectral and normal total emissivity measurements at use temperature. The ESL lab has been instrumental in many pioneering materials investigations of thermophysical properties, e.g., creep measurements, solidification, triggered nucleation, and emissivity at high temperatures. Research in the ESL lab has already led to the development of advanced high temperature materials for aerospace applications, coatings for rocket nozzles, improved medical and industrial optics, metallic glasses, ablatives for reentry vehicles, and materials with memory. Modeling of additive manufacturing materials processing is necessary for the study of the resulting materials properties. In addition, modeling of the selective laser melting process and prediction of its materials properties are also underway. Unfortunately, there is very little data on the properties of these materials, especially in the liquid state. Some method to measure the thermophysical properties of additive manufacturing materials is necessary. The ESL lab is ideal for these studies. The lab can provide surface tension and viscosity of molten materials, density measurements, emissivity measurements, and even creep strength measurements. The ESL lab can also determine melting temperature, surface temperatures, and phase transition temperatures of additive manufactured materials. This presentation will provide background on the ESL lab and its capabilities, provide an approach to using the ESL

  17. A feasibility study of treatment verification using EPID cine images for hypofractionated lung radiotherapy

    NASA Astrophysics Data System (ADS)

    Tang, Xiaoli; Lin, Tong; Jiang, Steve

    2009-09-01

    We propose a novel approach for potential online treatment verification using cine EPID (electronic portal imaging device) images for hypofractionated lung radiotherapy based on a machine learning algorithm. Hypofractionated radiotherapy requires high precision. It is essential to effectively monitor the target to ensure that the tumor is within the beam aperture. We modeled the treatment verification problem as a two-class classification problem and applied an artificial neural network (ANN) to classify the cine EPID images acquired during the treatment into corresponding classes—with the tumor inside or outside of the beam aperture. Training samples were generated for the ANN using digitally reconstructed radiographs (DRRs) with artificially added shifts in the tumor location—to simulate cine EPID images with different tumor locations. Principal component analysis (PCA) was used to reduce the dimensionality of the training samples and cine EPID images acquired during the treatment. The proposed treatment verification algorithm was tested on five hypofractionated lung patients in a retrospective fashion. On average, our proposed algorithm achieved a 98.0% classification accuracy, a 97.6% recall rate and a 99.7% precision rate. This work was first presented at the Seventh International Conference on Machine Learning and Applications, San Diego, CA, USA, 11-13 December 2008.
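
    A schematic of the classification pipeline described in this record (PCA for dimensionality reduction followed by a two-class neural-network classifier, scored by accuracy, recall and precision) is sketched below. The random arrays stand in for DRR training images and cine EPID frames; all sizes and parameters are illustrative assumptions.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.neural_network import MLPClassifier
        from sklearn.metrics import accuracy_score, precision_score, recall_score

        rng = np.random.default_rng(42)
        # Stand-in training data: flattened DRR-like images, label 1 = tumor in aperture.
        X_train = rng.normal(size=(200, 64 * 64))
        y_train = rng.integers(0, 2, size=200)
        X_test = rng.normal(size=(50, 64 * 64))
        y_test = rng.integers(0, 2, size=50)

        # Reduce dimensionality before classification, as in the described approach.
        pca = PCA(n_components=20).fit(X_train)
        clf = MLPClassifier(hidden_layer_sizes=(30,), max_iter=500, random_state=0)
        clf.fit(pca.transform(X_train), y_train)

        y_pred = clf.predict(pca.transform(X_test))
        print("accuracy :", accuracy_score(y_test, y_pred))
        print("precision:", precision_score(y_test, y_pred, zero_division=0))
        print("recall   :", recall_score(y_test, y_pred, zero_division=0))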

  18. Recommended Protocol for Round Robin Studies in Additive Manufacturing.

    PubMed

    Moylan, Shawn; Brown, Christopher U; Slotwinski, John

    2016-03-01

    One way to improve confidence and encourage proliferation of additive manufacturing (AM) technologies and parts is by generating more high quality data describing the performance of AM processes and parts. Many in the AM community see round robin studies as a way to generate large data sets while distributing the cost among the participants, thereby reducing the cost to individual users. The National Institute of Standards and Technology (NIST) has conducted and participated in several of these AM round robin studies. While the results of these studies are interesting and informative, many of the lessons learned in conducting these studies concern the logistics and methods of the study and unique issues presented by AM. Existing standards for conducting interlaboratory studies of measurement methods, along with NIST's experience, form the basis for recommended protocols for conducting AM round robin studies. The role of round robin studies in AM qualification, some of the limitations of round robin studies, and the potential benefit of less formal collaborative experiments where multiple factors, AM machine being only one, are varied simultaneously are also discussed.

  19. Recommended Protocol for Round Robin Studies in Additive Manufacturing

    PubMed Central

    Moylan, Shawn; Brown, Christopher U.; Slotwinski, John

    2016-01-01

    One way to improve confidence and encourage proliferation of additive manufacturing (AM) technologies and parts is by generating more high quality data describing the performance of AM processes and parts. Many in the AM community see round robin studies as a way to generate large data sets while distributing the cost among the participants, thereby reducing the cost to individual users. The National Institute of Standards and Technology (NIST) has conducted and participated in several of these AM round robin studies. While the results of these studies are interesting and informative, many of the lessons learned in conducting these studies concern the logistics and methods of the study and unique issues presented by AM. Existing standards for conducting interlaboratory studies of measurement methods, along with NIST’s experience, form the basis for recommended protocols for conducting AM round robin studies. The role of round robin studies in AM qualification, some of the limitations of round robin studies, and the potential benefit of less formal collaborative experiments where multiple factors, AM machine being only one, are varied simultaneously are also discussed. PMID:27274602

  20. Pilot field-verification studies of the sodium sulfide/ferrous sulfate treatment process. Final report, September 1987-May 1988

    SciTech Connect

    Wiloff, P.M.; Suciu, D.F.; Prescott, D.S.; Schober, R.K.; Loyd, F.S.

    1988-09-01

    In a previous project, jar and dynamic testing showed that the sodium sulfide/ferrous sulfate process was a viable method for reducing hexavalent chromium and removing heavy metals from the Tinker AFB industrial wastewater, with a significant decrease in sludge production and treatment costs. In this phase, pilot-plant field verification studies were conducted to evaluate the chemical and physical parameters of the chromium reduction process, the precipitation and clarification process, and the activated-sludge system. Sludge production was evaluated and compared to the sulfuric acid/sulfur dioxide/lime process.

  1. Genotoxicity studies of the food additive ester gum.

    PubMed

    Mukherjee, A; Agarwal, K; Chakrabarti, J

    1992-07-01

    Ester gum (EG) is used in citrus oil-based beverage flavourings as a weighting or colouring agent. In the present study, concentrations of 50, 100 and 150 mg/kg body weight were administered orally to male Swiss albino mice, and sister chromatid exchange and chromosomal aberration were used as the cytogenetic endpoints to determine the genotoxic and clastogenic potential of the food additive. Although EG was weakly clastogenic and could induce a marginal increase in sister chromatid exchange frequencies, it was not a potential health hazard at the doses tested.

  2. SU-E-T-48: A Multi-Institutional Study of Independent Dose Verification for Conventional, SRS and SBRT

    SciTech Connect

    Takahashi, R; Kamima, T; Tachibana, H; Baba, H; Itano, M; Yamazaki, T; Ishibashi, S; Higuchi, Y; Shimizu, H; Yamamoto, T; Yamashita, M; Sugawara, Y; Sato, A; Nishiyama, S; Kawai, D; Miyaoka, S

    2015-06-15

    Purpose: To show the results of a multi-institutional study of independent dose verification for conventional, stereotactic radiosurgery (SRS) and stereotactic body radiotherapy (SBRT) plans, based on the action level of AAPM TG-114. Methods: This study was performed at 12 institutions in Japan. To eliminate the bias of the independent dose verification program (Indp), all of the institutions used the same CT-based independent dose verification software (Simple MU Analysis, Triangle Products, JP) with the Clarkson-based algorithm. Eclipse (AAA, PBC), Pinnacle³ (Adaptive Convolve) and Xio (Superposition) were used as treatment planning systems (TPS). The confidence limits (CL, Mean±2SD) for 18 sites (head, breast, lung, pelvis, etc.) were evaluated by comparing the dose between the TPS and the Indp. Results: A retrospective analysis of 6352 treatment fields was conducted. The CLs for conventional, SRS and SBRT were 1.0±3.7 %, 2.0±2.5 % and 6.2±4.4 %, respectively. In conventional plans, most of the sites were within the 5 % TG-114 action level. However, there were systematic differences (4.0±4.0 % and 2.5±5.8 % for breast and lung, respectively). In SRS plans, our results showed good agreement with the action level. In SBRT plans, the discrepancy between the TPS and the Indp varied depending on the dose calculation algorithm of the TPS. Conclusion: The choice of dose calculation algorithms for the TPS and the Indp affects the action level. It is effective to set site-specific tolerances, especially for sites where inhomogeneity correction can affect the dose distribution strongly.
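
    The confidence limit statistic used in this record, mean ± 2SD of the percentage dose differences between the TPS and the independent check, is straightforward to reproduce. The sketch below uses made-up per-field differences; only the formula follows the abstract.

        import numpy as np

        def confidence_limit(percent_diffs):
            # TG-114-style confidence limit: mean +/- 2*SD of dose differences.
            d = np.asarray(percent_diffs, dtype=float)
            return d.mean(), 2.0 * d.std(ddof=1)

        # Hypothetical per-field differences (%) between TPS and independent check.
        diffs = [0.8, -1.2, 2.1, 0.4, -0.6, 1.5, 3.0, -0.2]
        mean, two_sd = confidence_limit(diffs)
        print(f"CL = {mean:.1f} +/- {two_sd:.1f} %")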

  3. Making intelligent systems team players: Additional case studies

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Schreckenghost, Debra L.; Rhoads, Ron W.

    1993-01-01

    Observations from a case study of intelligent systems are reported as part of a multi-year interdisciplinary effort to provide guidance and assistance for designers of intelligent systems and their user interfaces. A series of studies were conducted to investigate issues in designing intelligent fault management systems in aerospace applications for effective human-computer interaction. The results of the initial study are documented in two NASA technical memoranda: TM 104738, Making Intelligent Systems Team Players: Case Studies and Design Issues, Volumes 1 and 2; and TM 104751, Making Intelligent Systems Team Players: Overview for Designers. The objective of this additional study was to broaden the investigation of human-computer interaction design issues beyond the focus on monitoring and fault detection in the initial study. The results of this second study, documented here, are intended as a supplement to the original design guidance documents. These results should be of interest to designers of intelligent systems for use in real-time operations, and to researchers in the areas of human-computer interaction and artificial intelligence.

  4. Verification of Multiphysics software: Space and time convergence studies for nonlinearly coupled applications

    SciTech Connect

    Jean C. Ragusa; Vijay Mahadevan; Vincent A. Mousseau

    2009-05-01

    High-fidelity modeling of nuclear reactors requires the solution of a nonlinear coupled multi-physics stiff problem with widely varying time and length scales that need to be resolved correctly. A numerical method that converges the implicit nonlinear terms to a small tolerance is often referred to as nonlinearly consistent (or tightly coupled). This nonlinear consistency is still lacking in the vast majority of coupling techniques today. We present a tightly coupled multiphysics framework that tackles this issue and report code-verification and convergence analyses in space and time for several models of nonlinear coupled physics.
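
    Convergence analyses of the kind reported here typically reduce to computing an observed order of accuracy from errors on successively refined meshes or time steps. A minimal sketch, with hypothetical error values rather than the paper's data:

        import numpy as np

        def observed_order(errors, ratio):
            # Observed order of accuracy p from errors on successively refined
            # grids/time steps, where each refinement shrinks h (or dt) by `ratio`.
            e = np.asarray(errors, dtype=float)
            return np.log(e[:-1] / e[1:]) / np.log(ratio)

        # Hypothetical errors from a time-step refinement study, dt halved each time.
        errors = [4.0e-3, 1.0e-3, 2.5e-4]
        print(observed_order(errors, 2.0))  # ~[2, 2] for a second-order integrator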

  5. Design of the software development and verification system (SWDVS) for shuttle NASA study task 35

    NASA Technical Reports Server (NTRS)

    Drane, L. W.; Mccoy, B. J.; Silver, L. W.

    1973-01-01

    An overview of the Software Development and Verification System (SWDVS) for the space shuttle is presented. The design considerations, goals, assumptions, and major features of the design are examined. A scenario that shows three persons involved in flight software development using the SWDVS in response to a program change request is developed. The SWDVS is described from the standpoint of different groups of people with different responsibilities in the shuttle program to show the functional requirements that influenced the SWDVS design. The software elements of the SWDVS that satisfy the requirements of the different groups are identified.

  6. Expert system verification and validation study. Phase 2: Requirements identification. Delivery 1: Updated survey report

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The purpose is to report the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues is being addressed and to what extent they have impacted the development of expert systems.

  7. Additive Manufacturing Materials Study for Gaseous Radiation Detection

    SciTech Connect

    Steer, C.A.; Durose, A.; Boakes, J.

    2015-07-01

    Additive manufacturing (AM) techniques may lead to improvements in many areas of radiation detector construction; notably, the rapid manufacturing time allows for a reduced time between prototype iterations. The additive nature of the technique results in a granular microstructure which may be permeable to ingress by atmospheric gases, making it unsuitable for gaseous radiation detector development. In this study we consider the application of AM to the construction of enclosures and frames for wire-based gaseous radiation tracking detectors. We have focussed on oxygen impurity ingress as a measure of the permeability of the enclosure, and the gas charging and discharging curves of several simple enclosure shapes are reported. A prototype wire-frame is also presented to examine the structural strength and positional accuracy of an AM-produced frame. We lastly discuss the implications of this study for AM-based radiation detection technology as a diagnostic tool for incident response scenarios, such as the interrogation of a suspect radiation-emitting package. (authors)

  8. RAMSEYS DRAFT WILDERNESS STUDY AREA AND ADDITION, VIRGINIA.

    USGS Publications Warehouse

    Lesure, Frank G.; Mory, Peter C.

    1984-01-01

    Mineral-resource surveys of the Ramseys Draft Wilderness Study Area and the adjoining roadless area addition in George Washington National Forest, in the western Valley and Ridge province, Augusta and Highland Counties, Virginia, were conducted. The surveys outlined three small areas containing anomalous amounts of copper, lead, and zinc related to stratabound red-bed copper mineralization, but these occurrences are not large and are not considered to have mineral-resource potential. The area contains abundant sandstone suitable for construction materials and shale suitable for making brick, tile, and other low-grade ceramic products, but these commodities occur in abundance outside the wilderness study area. Structural conditions are probably favorable for the accumulation of natural gas, but exploratory drilling has not been done sufficiently near the area to evaluate the gas potential.

  9. Verification Games: Crowd-Sourced Formal Verification

    DTIC Science & Technology

    2016-03-01

    The verification tools developed by the Programming Languages and Software Engineering group were improved. A series of games...were developed by the Center for Game Science: Pipe Jam, Traffic Jam, Flow Jam and Paradox. Verification tools and games were integrated to verify...

  10. Satellite Power Systems (SPS) concept definition study. Volume 6: SPS technology requirements and verification

    NASA Technical Reports Server (NTRS)

    Hanley, G.

    1978-01-01

    Volume 6 of the SPS Concept Definition Study is presented and also incorporates results of NASA/MSFC in-house effort. This volume includes a supporting research and technology summary. Other volumes of the final report that provide additional detail are as follows: (1) Executive Summary; (2) SPS System Requirements; (3) SPS Concept Evolution; (4) SPS Point Design Definition; (5) Transportation and Operations Analysis; and Volume 7, SPS Program Plan and Economic Analysis.

  11. Experimental Study of Additives on the Viscosity of Biodiesel at Low Temperature

    NASA Astrophysics Data System (ADS)

    Fajar, Berkah; Sukarno

    2015-09-01

    An experimental investigation was performed to determine the viscosity of additive and biodiesel fuel mixtures in the temperature range from 283 K to 318 K. One solution for reducing the viscosity of biodiesel is to blend it with an additive. The viscosity was measured using a Brookfield Rheometer DV-II. The additives were a generic additive (diethyl ether, DEE) and the commercial additive Viscoplex 10-330 CFI. Each biodiesel blend had an additive concentration of 0.0, 0.25, 0.5, 0.75, 1.0, or 1.25 vol%. The temperature of the biodiesel was controlled from 40°C to 0°C. The viscosity of the biodiesel and additive mixture at a constant temperature can be approximated by a polynomial equation, and at a constant concentration by an exponential equation. The optimum mixture is at 0.75% for diethyl ether and 0.5% for Viscoplex.
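
    The two fitted forms mentioned in this record, a polynomial in additive concentration at constant temperature and an exponential in temperature at constant concentration, can be reproduced as follows. The viscosity values below are invented for illustration; only the functional forms follow the abstract.

        import numpy as np

        # Hypothetical viscosity (mPa*s) of a biodiesel/additive blend.
        conc = np.array([0.0, 0.25, 0.5, 0.75, 1.0, 1.25])      # additive, vol%
        visc_at_fixed_T = np.array([6.1, 5.8, 5.4, 5.0, 5.2, 5.5])

        T = np.array([283.0, 293.0, 303.0, 313.0, 318.0])        # K
        visc_at_fixed_c = np.array([9.8, 7.1, 5.3, 4.1, 3.7])

        # Constant temperature: viscosity vs. concentration as a polynomial.
        poly = np.polyfit(conc, visc_at_fixed_T, deg=2)

        # Constant concentration: exponential (Arrhenius-like) fit via log-linear
        # regression, mu(T) = a * exp(b / T).
        b, log_a = np.polyfit(1.0 / T, np.log(visc_at_fixed_c), deg=1)
        print("polynomial coeffs:", poly)
        print(f"exponential fit: mu(T) = {np.exp(log_a):.3g} * exp({b:.3g}/T)")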

  12. Expert system verification and validation study. ES V/V guidelines/workshop conference summary

    NASA Technical Reports Server (NTRS)

    French, Scott W.; Hamilton, David

    1992-01-01

    The intent of the workshop was to start moving research on the verification and validation (V&V) of knowledge based systems (KBSs) in the direction of providing tangible 'products' that a KBS developer could use. In the near term research will focus on identifying the kinds of experiences encountered during KBS development of 'real' KBSs. These will be stored in a repository and will serve as the foundation for the rest of the activities described here. One specific approach to be pursued is 'benchmarking'. With this approach, a KBS developer can use either 'canned' KBSs with seeded errors or existing KBSs with known errors to evaluate a given tool's ability to satisfactorily identify errors.

  13. Expert system verification and validation study: ES V/V Workshop

    NASA Technical Reports Server (NTRS)

    French, Scott; Hamilton, David

    1992-01-01

    The primary purpose of this document is to build a foundation for applying principles of verification and validation (V&V) of expert systems. To achieve this, some background on V&V as applied to conventionally implemented software is required. Part one will discuss the background of V&V from the perspective of (1) what V&V of software is and (2) V&V's role in developing software. Part one will also overview some common analysis techniques that are applied when performing V&V of software. All of these materials will be presented based on the assumption that the reader has little or no background in V&V or in developing procedural software. The primary purpose of part two is to explain the major techniques that have been developed for V&V of expert systems.

  14. Soil moisture verification study of the ESTAR microwave radiometer - Walnut Gulch, AZ 1991

    NASA Technical Reports Server (NTRS)

    Jackson, T. J.; Le Vine, D. M.; Griffis, A.; Goodrich, D. C.; Schmugge, T. J.; Swift, C. T.; O'Neill, P. E.; Roberts, R. R.; Parry, R.

    1992-01-01

    The application of an electronically steered thinned array L-band radiometer (ESTAR) for soil moisture mapping is investigated over the arid rangeland Walnut Gulch Watershed. Antecedent rainfall and evaporation for the flights are very different and result in a wide range of soil moisture conditions. The high spatial variability of rainfall events within this region results in moisture conditions with dramatic spatial patterns. Sensor performance is verified using two approaches. Microwave data are used in conjunction with a microwave emission model to predict soil moisture. These predictions are compared to ground observations of soil moisture. A second verification is possible using an extensive data set. Both tests showed that the ESTAR is capable of providing soil moisture with the same level of accuracy as existing systems.

  15. Study for verification testing of the helmet-mounted display in the Japanese Experimental Module.

    PubMed

    Nakajima, I; Yamamoto, I; Kato, H; Inokuchi, S; Nemoto, M

    2000-02-01

    Our purpose is to propose a research and development project in the field of telemedicine. The proposed Multimedia Telemedicine Experiment for Extra-Vehicular Activity will entail experiments designed to support astronaut health management during Extra-Vehicular Activity (EVA). Experiments will have relevant applications to the Japanese Experimental Module (JEM) operated by National Space Development Agency of Japan (NASDA) for the International Space Station (ISS). In essence, this is a proposal for verification testing of the Helmet-Mounted Display (HMD), which enables astronauts to verify their own blood pressures and electrocardiograms, and to view a display of instructions from the ground station and listings of work procedures. Specifically, HMD is a device designed to project images and data inside the astronaut's helmet. We consider this R&D proposal to be one of the most suitable projects under consideration in response to NASDA's open invitation calling for medical experiments to be conducted on JEM.

  16. Simulation verification techniques study: Simulation self test hardware design and techniques report

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The final results of the hardware verification task are presented. The basic objectives of the various subtasks are reviewed, along with the ground rules under which the overall task was conducted and which impacted the approach taken in deriving techniques for hardware self test. The results of the first subtask and the definition of simulation hardware are presented. The hardware definition is based primarily on a brief review of the simulator configurations anticipated for the shuttle training program. The results of the survey of current self test techniques are presented. The data sources that were considered in the search for current techniques are reviewed, and results of the survey are presented in terms of the specific types of tests that are of interest for training simulator applications. Specifically, these types of tests are readiness tests, fault isolation tests and incipient fault detection techniques. The most applicable techniques were structured into software flows that are then referenced in discussions of techniques for specific subsystems.

  17. Swarm Verification

    NASA Technical Reports Server (NTRS)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "search the needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.

  18. Additive Manufacturing in Production: A Study Case Applying Technical Requirements

    NASA Astrophysics Data System (ADS)

    Ituarte, Iñigo Flores; Coatanea, Eric; Salmi, Mika; Tuomi, Jukka; Partanen, Jouni

    Additive manufacturing (AM) is expanding manufacturing capabilities. However, the quality of AM-produced parts depends on a number of machine, geometry and process parameters. The variability of these parameters affects manufacturing drastically, and therefore standardized processes and harmonized methodologies need to be developed to characterize the technology for end-use applications and enable the technology for manufacturing. This research proposes a composite methodology integrating Taguchi Design of Experiments, multi-objective optimization and statistical process control to optimize the manufacturing process and fulfil multiple requirements imposed on an arbitrary geometry. The proposed methodology aims to characterize AM technology depending upon manufacturing process variables as well as to perform a comparative assessment of three AM technologies (Selective Laser Sintering, Laser Stereolithography and Polyjet). Results indicate that only one machine, the laser-based Stereolithography system, could simultaneously fulfil the macro- and micro-level geometrical requirements, but its mechanical properties were not at the required level. Future research will study a single AM system at a time to characterize AM machine technical capabilities and stimulate pre-normative initiatives of the technology for end-use applications.
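
    Runs in a Taguchi design such as the one described are conventionally scored with signal-to-noise ratios before the optimization step. A minimal sketch of the two standard S/N formulas, with hypothetical replicate data (the formulas are standard Taguchi practice, not taken from this paper):

        import numpy as np

        def sn_smaller_is_better(y):
            # Taguchi S/N ratio for responses to be minimized (e.g. dimensional error).
            y = np.asarray(y, dtype=float)
            return -10.0 * np.log10(np.mean(y**2))

        def sn_larger_is_better(y):
            # Taguchi S/N ratio for responses to be maximized (e.g. tensile strength).
            y = np.asarray(y, dtype=float)
            return -10.0 * np.log10(np.mean(1.0 / y**2))

        # Hypothetical replicate measurements for one experimental run of each kind.
        print(sn_smaller_is_better([0.12, 0.15, 0.11]))  # geometric deviation, mm
        print(sn_larger_is_better([48.0, 51.0, 50.0]))   # strength, MPa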

  19. Supporting the design of translational clinical studies through the generation and verification of conceptual knowledge-anchored hypotheses.

    PubMed

    Payne, Philip R O; Borlawsky, Tara B; Kwok, Alan; Greaves, Andrew W

    2008-11-06

    The ability to generate hypotheses based upon the contents of large-scale, heterogeneous data sets is critical to the design of translational clinical studies. In previous reports, we have described the application of a conceptual knowledge engineering technique, known as constructive induction (CI), in order to satisfy such needs. However, one of the major limitations of this method is the need to engage multiple subject matter experts to verify potential hypotheses generated using CI. In this manuscript, we describe an alternative verification technique that leverages published biomedical literature abstracts. Our report will be framed in the context of an ongoing project to generate hypotheses related to the contents of a translational research data repository maintained by the CLL Research Consortium. Such hypotheses are intended to inform the design of prospective clinical studies that can elucidate the relationships that may exist between biomarkers and patient phenotypes.

  20. 24 CFR 4001.112 - Income verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 5 2010-04-01 2010-04-01 false Income verification. 4001.112... Requirements and Underwriting Procedures § 4001.112 Income verification. The mortgagee shall use FHA's procedures to verify the mortgagor's income and shall comply with the following additional requirements:...

  1. 24 CFR 4001.112 - Income verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 5 2011-04-01 2011-04-01 false Income verification. 4001.112... Requirements and Underwriting Procedures § 4001.112 Income verification. The mortgagee shall use FHA's procedures to verify the mortgagor's income and shall comply with the following additional requirements:...

  2. Inverse dynamics of underactuated mechanical systems: A simple case study and experimental verification

    NASA Astrophysics Data System (ADS)

    Blajer, W.; Dziewiecki, K.; Kołodziejczyk, K.; Mazur, Z.

    2011-05-01

    Underactuated systems feature fewer control inputs than degrees of freedom, m < n. The determination of an input control strategy that forces such a system to complete a set of m specified motion tasks is challenging, and the existence of an explicit solution is conditioned on differential flatness of the problem. The flatness-based solution denotes that all the 2n states and m control inputs can be algebraically expressed in terms of the m specified outputs and their time derivatives up to a certain order, which is in practice attainable only for simple systems. In this contribution the problem is posed in a more practical way as a set of index-three differential-algebraic equations, and the solution is obtained numerically. The formulation is then illustrated by a two-degree-of-freedom underactuated system composed of two rotating discs connected by a torsional spring, in which the pre-specified motion of one of the discs is actuated by the torque applied to the other disc, n = 2 and m = 1. Experimental verification of the inverse simulation control methodology is reported.
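
    For orientation, the index-three differential-algebraic formulation referred to above can be written schematically as follows, using generic multibody notation that is an assumption here rather than the paper's exact symbols:

        M(q)\,\ddot{q} = f(q,\dot{q}) + B(q)\,u, \qquad c(q,t) = 0,

    where M(q) is the mass matrix, u collects the m control inputs distributed by B(q), and the m servo constraints c(q,t) = 0 force the chosen outputs to follow their prescribed trajectories. The problem is of (at least) index three because u enters the dynamics rather than the constraints, so c must be differentiated twice before the inputs appear.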

  3. GENERIC VERIFICATION PROTOCOL FOR DETERMINATION OF EMISSIONS REDUCTIONS OBTAINED BY USE OF ALTERNATIVE OR REFORMULATED LIQUID FUELS, FUEL ADDITIVES, FUEL EMULSIONS AND LUBRICANTS FOR HIGHWAY AND NONROAD USE DIESEL ENGINES AND LIGHT DUTY GASOLINE ENGINES AND VEHICLES

    EPA Science Inventory

    This report sets standards by which the emissions reductions provided by fuel and lubricant technologies can be tested and compared in a consistent way. It is a generic protocol under the Environmental Technology Verification program.

  4. Exploring the Possible Use of Information Barriers for future Biological Weapons Verification Regimes

    SciTech Connect

    Luke, S J

    2011-12-20

    This report describes a path forward for implementing information barriers in a future generic biological arms-control verification regime. Information barriers have become a staple of discussion in the area of arms control verification approaches for nuclear weapons and components. Information barriers, when used with a measurement system, allow for the determination that an item has sensitive characteristics without releasing any of the sensitive information. Over the last 15 years the United States (with the Russian Federation) has taken the lead on the development of information barriers in the area of the verification of nuclear weapons and nuclear components. The work of the US and the Russian Federation has prompted other states (e.g., UK and Norway) to consider the merits of information barriers for possible verification regimes. In the context of a biological weapons control verification regime, the dual-use nature of the biotechnology will require protection of sensitive information while allowing for the verification of treaty commitments. A major question that has arisen is whether - in a biological weapons verification regime - the presence or absence of a weapon pathogen can be determined without revealing any information about possible sensitive or proprietary information contained in the genetic materials being declared under a verification regime. This study indicates that a verification regime could be constructed using a small number of pathogens that span the range of known biological weapons agents. Since the number of possible pathogens is small, it is possible and prudent to treat these pathogens as analogies to attributes in a nuclear verification regime. This study has determined that there may be some information that needs to be protected in a biological weapons control verification regime. To protect this information, the study concludes that the Lawrence Livermore Microbial Detection Array may be a suitable technology for the detection of the

  5. Psychiatric Residents' Attitudes toward and Experiences with the Clinical-Skills Verification Process: A Pilot Study on U.S. and International Medical Graduates

    ERIC Educational Resources Information Center

    Rao, Nyapati R.; Kodali, Rahul; Mian, Ayesha; Ramtekkar, Ujjwal; Kamarajan, Chella; Jibson, Michael D.

    2012-01-01

    Objective: The authors report on a pilot study of the experiences and perceptions of foreign international medical graduate (F-IMG), United States international medical graduate (US-IMG), and United States medical graduate (USMG) psychiatric residents with the newly mandated Clinical Skills Verification (CSV) process. The goal was to identify and…

  6. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  7. A Preliminary Study of In-House Monte Carlo Simulations: An Integrated Monte Carlo Verification System

    SciTech Connect

    Mukumoto, Nobutaka; Tsujii, Katsutomo; Saito, Susumu; Yasunaga, Masayoshi; Takegawa, Hidek; Yamamoto, Tokihiro; Numasaki, Hodaka; Teshima, Teruki

    2009-10-01

    Purpose: To develop an infrastructure for the integrated Monte Carlo verification system (MCVS) to verify the accuracy of conventional dose calculations, which often fail to accurately predict dose distributions, mainly due to inhomogeneities in the patient's anatomy, for example, in lung and bone. Methods and Materials: The MCVS consists of the graphical user interface (GUI) based on a computational environment for radiotherapy research (CERR) with MATLAB language. The MCVS GUI acts as an interface between the MCVS and a commercial treatment planning system to import the treatment plan, create MC input files, and analyze MC output dose files. The MCVS consists of the EGSnrc MC codes, which include EGSnrc/BEAMnrc to simulate the treatment head and EGSnrc/DOSXYZnrc to calculate the dose distributions in the patient/phantom. In order to improve computation time without approximations, an in-house cluster system was constructed. Results: The phase-space data of a 6-MV photon beam from a Varian Clinac unit was developed and used to establish several benchmarks under homogeneous conditions. The MC results agreed with the ionization chamber measurements to within 1%. The MCVS GUI could import and display the radiotherapy treatment plan created by the MC method and various treatment planning systems, such as RTOG and DICOM-RT formats. Dose distributions could be analyzed by using dose profiles and dose volume histograms and compared on the same platform. With the cluster system, calculation time was improved in line with the increase in the number of central processing units (CPUs) at a computation efficiency of more than 98%. Conclusions: Development of the MCVS was successful for performing MC simulations and analyzing dose distributions.

  8. Feasibility study on the verification of fresh fuel assemblies in shipping containers

    SciTech Connect

    Swinth, K.L.; Tanner, J.E.

    1990-09-01

    The purpose of this study was to examine the feasibility of using various nondestructive measurement techniques to determine the presence of fuel assemblies inside shipping containers and to examine the feasibility of measuring the fissile content of the containers. Passive and active techniques based on both gamma and neutron assay were examined. In addition, some experiments and calculations were performed to evaluate neutron techniques. Passive counting of the 186 keV gamma from 235U is recommended for use as an attributes measurement technique. Experiments and studies indicated that a bismuth germanate (BGO) scintillator is the preferred detector. A properly designed system based on this detector will provide a compact detector that can selectively verify fuel assemblies within a shipping container while the container is in a stack of similarly loaded containers. Missing fuel assemblies will be readily detected, but gamma counting of assemblies cannot detect changes in the fissile content of the inner rods in an assembly. If a variables technique is required, it is recommended that more extensive calculations be performed and that removal of the outer shipping container be considered. Marking (sealing) of the assemblies with a uniquely identifiable transponder was also considered. This would require the development of procedures that would assure proper application and removal of the seal. When a change to a metal outer container occurs, the technique will no longer be useful unless a radiolucent window is included in the container. 20 refs., 7 figs., 2 tabs.

  9. Additive empirical parametrization and microscopic study of deuteron breakup

    NASA Astrophysics Data System (ADS)

    Avrigeanu, M.; Avrigeanu, V.

    2017-02-01

    A comparative assessment of the total breakup proton-emission cross sections measured for 56 MeV deuteron interactions with target nuclei from 12C to 209Bi, against an empirical parametrization and recently calculated microscopic neutron-removal cross sections, was carried out together with similar data measured at 15, 25.5, 70, and 80 MeV. Comparable mass dependencies of the elastic-breakup (EB) cross sections provided by the empirical parametrization and the microscopic results were also found at a deuteron energy of 56 MeV, while an assessment of the variance in absolute values, up to a factor of two, was not possible because of the lack of EB measurements at energies higher than 25.5 MeV. While the similarities represent an additional validation of the microscopic calculations, the cross-section difference should be considered within the objectives of further measurements.

  10. Structural verification for GAS experiments

    NASA Technical Reports Server (NTRS)

    Peden, Mark Daniel

    1992-01-01

    The purpose of this paper is to assist the Get Away Special (GAS) experimenter in conducting a thorough structural verification of its experiment structural configuration, thus expediting the structural review/approval process and the safety process in general. Material selection for structural subsystems will be covered with an emphasis on fasteners (GSFC fastener integrity requirements) and primary support structures (Stress Corrosion Cracking requirements and National Space Transportation System (NSTS) requirements). Different approaches to structural verifications (tests and analyses) will be outlined especially those stemming from lessons learned on load and fundamental frequency verification. In addition, fracture control will be covered for those payloads that utilize a door assembly or modify the containment provided by the standard GAS Experiment Mounting Plate (EMP). Structural hazard assessment and the preparation of structural hazard reports will be reviewed to form a summation of structural safety issues for inclusion in the safety data package.

  11. NMR relaxometry study of plaster mortar with polymer additives

    SciTech Connect

    Jumate, E.; Manea, D.; Moldovan, D.; Fechete, R.

    2013-11-13

    Cement mixed with water forms a plastic paste or slurry which stiffens in time and finally hardens into a resistant stone. The addition of sand aggregates, polymers (Walocel) and/or calcium carbonate modifies the final mechanical and thermal properties of the mortar dramatically. The hydration processes can be observed using 1D NMR measurements of the transverse T2 relaxation time distributions, analysed by a Laplace inversion algorithm. These distributions were obtained for mortar paste measured at 2 hours after preparation and then at 3, 7 and 28 days after preparation. Multiple components are identified in the T2 distributions. These can be associated with protons bound chemically or physically to the mortar minerals, characterized by a short T2 relaxation time, and with water protons in pores of three different pore sizes, as observed from SEM images. The evaporation process is fastest in the first hours after preparation, while mortar hydration (the bonding of water molecules to mortar minerals) can still be observed days or months after preparation. Finally, the mechanical resistance was correlated with the transverse T2 relaxation rates corresponding to the bound water.
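
    The Laplace inversion step described above, turning a measured magnetization decay into a T2 distribution, can be sketched numerically as a non-negative least-squares fit over a log-spaced T2 grid. The decay data below are synthetic; grid sizes and the thresholding are illustrative choices, not the authors' algorithm.

        import numpy as np
        from scipy.optimize import nnls

        # Synthetic CPMG-like decay: two T2 components (10 ms and 100 ms).
        t = np.linspace(0.001, 0.5, 200)                      # s
        signal = 0.7 * np.exp(-t / 0.010) + 0.3 * np.exp(-t / 0.100)
        signal += np.random.default_rng(1).normal(0, 1e-3, t.size)

        # Exponential kernel over a log-spaced T2 grid; the amplitude >= 0
        # constraint of NNLS acts as a crude regularizer for this ill-posed inversion.
        T2 = np.logspace(-3, 0, 100)                          # s
        K = np.exp(-t[:, None] / T2[None, :])
        amplitudes, _ = nnls(K, signal)
        print("recovered peaks near:", T2[amplitudes > 0.05])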

  12. Monte Carlo verification of polymer gel dosimetry applied to radionuclide therapy: a phantom study

    NASA Astrophysics Data System (ADS)

    Gear, J. I.; Charles-Edwards, E.; Partridge, M.; Flux, G. D.

    2011-11-01

    This study evaluates the dosimetric performance of the polymer gel dosimeter 'Methacrylic and Ascorbic acid in Gelatin, initiated by Copper' and its suitability for quality assurance and analysis of I-131-targeted radionuclide therapy dosimetry. Four batches of gel were manufactured in-house and sets of calibration vials and phantoms were created containing different concentrations of I-131-doped gel. Multiple dose measurements were made up to 700 h post preparation and compared to equivalent Monte Carlo simulations. In addition to uniformly filled phantoms the cross-dose distribution from a hot insert to a surrounding phantom was measured. In this example comparisons were made with both Monte Carlo and a clinical scintigraphic dosimetry method. Dose-response curves generated from the calibration data followed a sigmoid function. The gels appeared to be stable over many weeks of internal irradiation with a delay in gel response observed at 29 h post preparation. This was attributed to chemical inhibitors and slow reaction rates of long-chain radical species. For this reason, phantom measurements were only made after 190 h of irradiation. For uniformly filled phantoms of I-131 the accuracy of dose measurements agreed to within 10% when compared to Monte Carlo simulations. A radial cross-dose distribution measured using the gel dosimeter compared well to that calculated with Monte Carlo. Small inhomogeneities were observed in the dosimeter attributed to non-uniform mixing of monomer during preparation. However, they were not detrimental to this study where the quantitative accuracy and spatial resolution of polymer gel dosimetry were far superior to that calculated using scintigraphy. The difference between Monte Carlo and gel measurements was of the order of a few cGy, whilst with the scintigraphic method differences of up to 8 Gy were observed. A manipulation technique is also presented which allows 3D scintigraphic dosimetry measurements to be compared to polymer
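
    The sigmoid dose-response calibration mentioned in this record can be fitted, for example, with a four-parameter logistic function. The calibration points below are invented for illustration; only the sigmoid functional form follows the abstract.

        import numpy as np
        from scipy.optimize import curve_fit

        def sigmoid(dose, bottom, top, d50, slope):
            # Four-parameter logistic dose-response curve.
            return bottom + (top - bottom) / (1.0 + (d50 / dose) ** slope)

        # Hypothetical calibration points: dose (Gy) vs. gel response (e.g. R2, 1/s).
        dose = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0, 32.0])
        resp = np.array([1.05, 1.12, 1.30, 1.72, 2.35, 2.80, 2.95])

        params, _ = curve_fit(sigmoid, dose, resp, p0=[1.0, 3.0, 4.0, 1.5])
        print("bottom, top, D50, slope =", params)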

  13. Monte Carlo verification of polymer gel dosimetry applied to radionuclide therapy: a phantom study.

    PubMed

    Gear, J I; Charles-Edwards, E; Partridge, M; Flux, G D

    2011-11-21

    This study evaluates the dosimetric performance of the polymer gel dosimeter 'Methacrylic and Ascorbic acid in Gelatin, initiated by Copper' and its suitability for quality assurance and analysis of I-131-targeted radionuclide therapy dosimetry. Four batches of gel were manufactured in-house and sets of calibration vials and phantoms were created containing different concentrations of I-131-doped gel. Multiple dose measurements were made up to 700 h post preparation and compared to equivalent Monte Carlo simulations. In addition to uniformly filled phantoms the cross-dose distribution from a hot insert to a surrounding phantom was measured. In this example comparisons were made with both Monte Carlo and a clinical scintigraphic dosimetry method. Dose-response curves generated from the calibration data followed a sigmoid function. The gels appeared to be stable over many weeks of internal irradiation with a delay in gel response observed at 29 h post preparation. This was attributed to chemical inhibitors and slow reaction rates of long-chain radical species. For this reason, phantom measurements were only made after 190 h of irradiation. For uniformly filled phantoms of I-131 the accuracy of dose measurements agreed to within 10% when compared to Monte Carlo simulations. A radial cross-dose distribution measured using the gel dosimeter compared well to that calculated with Monte Carlo. Small inhomogeneities were observed in the dosimeter attributed to non-uniform mixing of monomer during preparation. However, they were not detrimental to this study where the quantitative accuracy and spatial resolution of polymer gel dosimetry were far superior to that calculated using scintigraphy. The difference between Monte Carlo and gel measurements was of the order of a few cGy, whilst with the scintigraphic method differences of up to 8 Gy were observed. A manipulation technique is also presented which allows 3D scintigraphic dosimetry measurements to be compared to polymer

  14. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, is one tool for increasing computer dependability in the face of an exponentially increasing testing effort.

  15. Columbus pressurized module verification

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Comandatore, Emanuele

    1986-01-01

    The baseline verification approach of the COLUMBUS Pressurized Module was defined during the A and B1 project phases. Peculiarities of the verification program are the testing requirements derived from the permanent manned presence in space. The model philosophy and the test program have been developed in line with the overall verification concept. Such critical areas as meteoroid protections, heat pipe radiators and module seals are identified and tested. Verification problem areas are identified and recommendations for the next development are proposed.

  16. Simulated sudden increase in geomagnetic activity and its effect on heart rate variability: Experimental verification of correlation studies

    NASA Astrophysics Data System (ADS)

    Caswell, Joseph M.; Singh, Manraj; Persinger, Michael A.

    2016-08-01

    Previous research investigating the potential influence of geomagnetic factors on human cardiovascular state has tended to converge upon similar inferences although the results remain relatively controversial. Furthermore, previous findings have remained essentially correlational without accompanying experimental verification. An exception to this was noted for human brain activity in a previous study employing experimental simulation of sudden geomagnetic impulses in order to assess correlational results that had demonstrated a relationship between geomagnetic perturbations and neuroelectrical parameters. The present study employed the same equipment in a similar procedure in order to validate previous findings of a geomagnetic-cardiovascular dynamic with electrocardiography and heart rate variability measures. Results indicated that potential magnetic field effects on frequency components of heart rate variability tended to overlap with previous correlational studies where low frequency power and the ratio between low and high frequency components of heart rate variability appeared affected. In the present study, a significant increase in these particular parameters was noted during geomagnetic simulation compared to baseline recordings.
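
    The low-frequency and high-frequency components discussed above are conventionally obtained from the power spectral density of the RR-interval series. A minimal Python sketch, assuming an evenly resampled tachogram and the usual HRV band edges (LF 0.04-0.15 Hz, HF 0.15-0.4 Hz); details such as the resampling rate are illustrative choices:

        import numpy as np
        from scipy.signal import welch

        def lf_hf_ratio(rr_ms, fs=4.0):
            # Beat times from cumulative RR intervals, resampled to an even grid.
            rr_ms = np.asarray(rr_ms, dtype=float)
            t = np.cumsum(rr_ms) / 1000.0
            grid = np.arange(t[0], t[-1], 1.0 / fs)
            rr_even = np.interp(grid, t, rr_ms)
            # Welch PSD of the detrended tachogram.
            f, psd = welch(rr_even - rr_even.mean(), fs=fs, nperseg=256)
            lf_band = (f >= 0.04) & (f < 0.15)
            hf_band = (f >= 0.15) & (f < 0.40)
            lf = np.trapz(psd[lf_band], f[lf_band])
            hf = np.trapz(psd[hf_band], f[hf_band])
            return lf / hf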

  17. Additional EIPC Study Analysis: Interim Report on High Priority Topics

    SciTech Connect

    Hadley, Stanton W

    2013-11-01

    Between 2010 and 2012 the Eastern Interconnection Planning Collaborative (EIPC) conducted a major long-term resource and transmission study of the Eastern Interconnection (EI). With guidance from a Stakeholder Steering Committee (SSC) that included representatives from the Eastern Interconnection States Planning Council (EISPC) among others, the project was conducted in two phases. Phase 1 involved a long-term capacity expansion analysis that involved creation of eight major futures plus 72 sensitivities. Three scenarios were selected for more extensive transmission-focused evaluation in Phase 2. Five power flow analyses, nine production cost model runs (including six sensitivities), and three capital cost estimations were developed during this second phase. The results from Phase 1 and 2 provided a wealth of data that could be examined further to address energy-related questions. A list of 13 topics was developed for further analysis; this paper discusses the first five.

  18. Additional studies for the spectrophotometric measurement of iodine in water

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Previous work in iodine spectroscopy is briefly reviewed. Continued studies of the direct spectrophotometric determination of aqueous iodine complexed with potassium iodide show that free iodine is optimally determined at the isosbestic point for these solutions. The effects of turbidity and chemical substances (in trace amounts) on iodine determinations are discussed and illustrated. At the levels tested, iodine measurements are not significantly altered by such substances. A preliminary design for an on-line, automated iodine monitor, with eventual capability of operating also as a controller, was analyzed and developed in detail with respect to a single-beam colorimeter operating at two wavelengths (using a rotating filter wheel). A flow-through sample cell allows the instrument to operate continuously, except for momentary stop flow when measurements are made. The timed automatic cycling of the system may be interrupted whenever desired for manual operation. An analog output signal permits controlling an iodine generator.
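
    Determination at the isosbestic point reduces to a Beer-Lambert calculation, with the second wavelength available to compensate for turbidity. A minimal Python sketch; the molar absorptivity, path length, and the simple subtraction scheme are placeholder assumptions, not the instrument's actual calibration:

        import numpy as np

        EPSILON_ISO = 2.6e4   # L mol^-1 cm^-1 at the isosbestic point (placeholder)
        PATH_CM = 1.0         # flow-cell path length in cm (placeholder)

        def absorbance(i_sample, i_reference):
            # Beer-Lambert absorbance from transmitted intensities.
            return -np.log10(i_sample / i_reference)

        def iodine_molarity(i_iso, i0_iso, i_ref, i0_ref):
            # Two-wavelength scheme: the reference wavelength tracks turbidity,
            # so its absorbance is subtracted before applying Beer-Lambert.
            a_corrected = absorbance(i_iso, i0_iso) - absorbance(i_ref, i0_ref)
            return a_corrected / (EPSILON_ISO * PATH_CM)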

  19. NOTE: Hybrid plan verification for intensity-modulated radiation therapy (IMRT) using the 2D ionization chamber array I'mRT MatriXX—a feasibility study

    NASA Astrophysics Data System (ADS)

    Dobler, Barbara; Streck, Natalia; Klein, Elisabeth; Loeschel, Rainer; Haertl, Petra; Koelbl, Oliver

    2010-01-01

    The 2D ionization chamber array I'mRT MatriXX (IBA, Schwarzenbruck, Germany) has been developed for absolute 2D dosimetry and verification of intensity-modulated radiation therapy (IMRT) for perpendicular beam incidence. The aim of this study is to evaluate the applicability of I'mRT MatriXX for oblique beam incidence and hybrid plan verification of IMRT with original gantry angles. For the assessment of angular dependence, open fields with gantry angles in steps of 10° were calculated on a CT scan of I'mRT MatriXX. For hybrid plan verification, 17 clinical IMRT plans and one rotational plan were used. Calculations were performed with pencil beam (PB), collapsed cone (CC) and Monte Carlo (MC) methods, which had been previously validated. Measurements were conducted on an Elekta SynergyS linear accelerator. To assess the potential and limitations of the system, gamma evaluation was performed with different dose tolerances and distances to agreement. Hybrid plan verification passed the gamma test with 4% dose tolerance and 3 mm distance to agreement in all cases, in 82-88% of the cases for tolerances of 3%/3 mm, and in 59-76% of the cases if 3%/2 mm were used. Separate evaluation of the low dose and high dose regions showed that I'mRT MatriXX can be used for hybrid plan verification of IMRT plans within 3% dose tolerance and 3 mm distance to agreement with a relaxed dose tolerance of 4% in the low dose region outside the multileaf collimator (MLC).
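
    The gamma criterion used above combines a dose tolerance with a distance to agreement. A brute-force Python sketch of a global 2D gamma pass rate for equally sampled dose planes; the array shapes, normalisation to the reference maximum, and the search radius follow the usual conventions, assumed here rather than taken from the paper:

        import numpy as np

        def gamma_pass_rate(ref, meas, pixel_mm=1.0, dose_tol=0.03, dta_mm=3.0):
            # For each reference point, minimise the combined dose/distance
            # metric over measured points within the DTA search radius.
            norm = ref.max()
            search = int(np.ceil(dta_mm / pixel_mm))
            ny, nx = ref.shape
            passed = 0
            for iy in range(ny):
                for ix in range(nx):
                    best = np.inf
                    for dy in range(-search, search + 1):
                        for dx in range(-search, search + 1):
                            jy, jx = iy + dy, ix + dx
                            if not (0 <= jy < ny and 0 <= jx < nx):
                                continue
                            dist2 = (dy * dy + dx * dx) * pixel_mm ** 2
                            if dist2 > dta_mm ** 2:
                                continue
                            ddose = (meas[jy, jx] - ref[iy, ix]) / norm
                            g2 = ddose ** 2 / dose_tol ** 2 + dist2 / dta_mm ** 2
                            best = min(best, g2)
                    passed += best <= 1.0
            return passed / (ny * nx)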

  20. Implementation of the Short-Term Ensemble Prediction System (STEPS) in Belgium and verification of case studies

    NASA Astrophysics Data System (ADS)

    Foresti, Loris; Reyniers, Maarten; Delobbe, Laurent

    2014-05-01

    The Short-Term Ensemble Prediction System (STEPS) is a probabilistic precipitation nowcasting scheme developed at the Australian Bureau of Meteorology in collaboration with the UK Met Office. In order to account for the multiscaling nature of rainfall structures, the radar field is decomposed into an eight-level multiplicative cascade using a Fast Fourier Transform. The cascade is advected using the velocity field estimated with optical flow and evolves stochastically according to a hierarchy of auto-regressive processes. This reproduces the empirical observation that the small scales evolve in time faster than the large scales. The uncertainty in radar rainfall measurement and the unknown future development of the velocity field are also considered by stochastic modelling in order to reflect their typical spatial and temporal variability. Recently, a four-year national research program was initiated by the University of Leuven, the Royal Meteorological Institute (RMI) of Belgium and three other partners: PLURISK ("forecasting and management of extreme rainfall induced risks in the urban environment"). The project deals with the nowcasting of rainfall and subsequent urban inundations, as well as socio-economic risk quantification, communication, warning and prevention. At the urban scale it is widely recognized that the uncertainty of hydrological and hydraulic models is largely driven by the uncertainty of the input rainfall estimates and forecasts. In support of the PLURISK project the RMI aims at integrating STEPS into the current operational deterministic precipitation nowcasting system INCA-BE (Integrated Nowcasting through Comprehensive Analysis). This contribution will illustrate examples of STEPS ensemble and probabilistic nowcasts for a few selected case studies of stratiform and convective rain in Belgium. The paper focuses on the development of STEPS products for potential hydrological users and a preliminary verification of the nowcasts.
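
    The cascade decomposition and scale-dependent autoregressive evolution described above can be sketched compactly. The following Python fragment decomposes a 2D field into octave-spaced spectral levels with FFT band-pass masks and advances one level with an AR(2) update; it is a schematic illustration of the idea, not the STEPS implementation:

        import numpy as np

        def bandpass_levels(field, n_levels=8):
            # Octave-spaced radial band-pass decomposition in Fourier space.
            F = np.fft.fftshift(np.fft.fft2(field))
            ny, nx = field.shape
            ky, kx = np.meshgrid(np.arange(ny) - ny // 2,
                                 np.arange(nx) - nx // 2, indexing="ij")
            r = np.hypot(ky, kx)
            r_max = r.max()
            levels = []
            for k in range(n_levels):
                hi = r_max * 2.0 ** -k
                lo = r_max * 2.0 ** -(k + 1)
                mask = (r >= lo) & (r < hi) if k < n_levels - 1 else (r < hi)
                levels.append(np.real(np.fft.ifft2(np.fft.ifftshift(F * mask))))
            return levels

        def ar2_step(x_now, x_prev, phi1, phi2, noise_std, rng):
            # AR(2) update of one cascade level; smaller scales would use
            # coefficients giving faster decorrelation than larger scales.
            return phi1 * x_now + phi2 * x_prev + rng.normal(0.0, noise_std, x_now.shape)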

  1. Food additives and Hymenolepis nana infection: an experimental study.

    PubMed

    El-Nouby, Kholoud A; Hamouda, Hala E; Abd El Azeem, Mona A; El-Ebiary, Ahmad A

    2009-12-01

    The effect of sodium benzoate (SB) on the pathogenesis of Hymenolepis nana (H. nana) and its neurological manifestations was studied in the present work. One hundred and thirty-five mice were classified into three groups: GI received SB alone; GII received SB before and after infection with H. nana; and GIII was infected with H. nana. All groups were subjected to parasitological, histopathological, immunohistochemical and biochemical assays. The results revealed a significant decrease in serum IL-4 level with a significant increase in gamma-aminobutyric acid (GABA) and a decrease in zinc brain levels in GI, while GII showed a nonsignificant increase in IL-4 level that resulted in a highly significant increase in the mean number of cysticercoids and adult worms with delayed expulsion as compared to GIII. This was reflected in histopathological and immunohistochemical changes in the brain. Also, there was a highly significant increase in GABA and a decrease in zinc brain levels in GII, to a degree that induced behavioral changes. This emphasizes the possible synergistic effect of SB on the neurological manifestations of H. nana and could, in part, explain the increased incidence of behavioral changes in children exposed to high doses of SB who also have H. nana infection.

  2. Verification and analysis of TOVS satellite data

    NASA Technical Reports Server (NTRS)

    Gelman, M. E.; Miller, A. J.; Schmidlin, F. J.

    1984-01-01

    A study which involves the verification of TOVS (TIROS Operational Vertical Soundings) stratospheric temperature information and derived geopotential heights, as well as the higher order derivations of wind, heat flux, and momentum flux, is discussed. Rocketsondes were launched daily at the times of NOAA-7 overpasses of Wallops Island during February 23-March 24, 1982, and resources made available by NASA were used to provide in-situ field measurements and additional analysis. The TOVS satellite products from hemispheric analyses are compared with unsmoothed rocketsonde data. Some of the benefits provided by the comparisons are considered.

  3. Test/QA plan for the verification testing of alternative or reformulated liquid fuels, fuel additives, fuel emulsions, and lubricants for highway and nonroad use heavy-duty diesel engines

    EPA Science Inventory

    This Environmental Technology Verification Program test/QA plan for heavy-duty diesel engine testing at the Southwest Research Institute’s Department of Emissions Research describes how the Federal Test Procedure (FTP), as listed in 40 CFR Part 86 for highway engines and 40 CFR P...

  4. Identity Verification, Control, and Aggression in Marriage

    ERIC Educational Resources Information Center

    Stets, Jan E.; Burke, Peter J.

    2005-01-01

    In this research we study the identity verification process and its effects in marriage. Drawing on identity control theory, we hypothesize that a lack of verification in the spouse identity (1) threatens stable self-meanings and interaction patterns between spouses, and (2) challenges a (nonverified) spouse's perception of control over the…

  5. Theoretical study of closed-loop recycling liquid-liquid chromatography and experimental verification of the theory.

    PubMed

    Kostanyan, Artak E; Erastov, Andrey A

    2016-09-02

    The non-ideal recycling equilibrium-cell model, including the effects of extra-column dispersion, is used to simulate and analyze closed-loop recycling counter-current chromatography (CLR CCC). Previously, the operating scheme with the detector located before the column was considered. In this study, analysis of the process is carried out for a more realistic and practical scheme with the detector located immediately after the column. A peak equation for individual cycles and equations describing the transport of single peaks and complex chromatograms inside the recycling closed-loop are presented, as well as equations for the resolution between single solute peaks of neighboring cycles, for the resolution of peaks in the recycling chromatogram, and for the resolution between the chromatograms of neighboring cycles. It is shown that, unlike conventional chromatography, increasing the extra-column volume (the recycling line length) may allow a better separation of the components in CLR chromatography. For the experimental verification of the theory, aspirin, caffeine, coumarin and the solvent system hexane/ethyl acetate/ethanol/water (1:1:1:1) were used. Comparison of experimental and simulated processes of recycling and distribution of the solutes in the closed-loop demonstrated good agreement between theory and experiment.

  6. Human Verification Using a Combination of Static and Dynamic Characteristics in Foot Pressure Images

    PubMed Central

    Zare, Fereshteh E.; Maghooli, Keivan

    2016-01-01

    Since gait is a mixture of many complex movements, each individual can be characterized by a unique foot pressure image, which can be used as a reliable biometric for human verification. Foot pressure color images from the Center for Biometrics and Security Research (CBSR) dataset, comprising 45 men and 5 women, were used in this study. Owing to the properties of this dataset, an index of foot pressure, together with external shape features and contourlet coefficients of the images, was extracted. A multilayer perceptron (MLP) was utilized for verification of subjects. To validate the algorithm performance, results were obtained using a 5-fold cross validation approach. The results indicated an accuracy of 99.14±0.65 and an equal error rate (EER) of 0.02. These results demonstrate the reliability of the proposed neural network in human verification applications. Hence, it can be utilized in other verification systems. PMID:28028498
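
    The verification pipeline described above (per-subject features, an MLP classifier, and 5-fold cross validation) maps directly onto standard tooling. A minimal Python sketch with scikit-learn; the feature matrix here is random placeholder data standing in for the pressure-index, shape, and contourlet features:

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 64))        # placeholder feature vectors
        y = np.repeat(np.arange(50), 10)      # 50 subjects, 10 samples each

        clf = make_pipeline(StandardScaler(),
                            MLPClassifier(hidden_layer_sizes=(100,),
                                          max_iter=1000, random_state=0))
        scores = cross_val_score(clf, X, y, cv=5)   # 5-fold cross validation
        print("accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))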

  7. Environmental Technology Verification (ETV) Program Case Studies: Demonstrating Program Outcomes, Volume III

    EPA Science Inventory

    This booklet, ETV Program Case Studies: Demonstrating Program Outcomes, Volume III contains two case studies, addressing verified environmental technologies for decentalized wastewater treatment and converting animal waste to energy. Each case study contains a brief description ...

  8. Sterilization of compounded parenteral preparations: verification of autoclaves.

    PubMed

    Rahe, Hank

    2013-01-01

    This article discusses the basic principles for verification of a sterilization process and provides a recommended approach to assure that autoclaves deliver the sterility-assured levels required for patient safety. Included is a summary of the protocol and verification (validation) results of a previously published case study involving autoclaves. To assure the sterility of compounded preparations, a verification procedure must be in place.

  9. A Teacher-Verification Study of Speaking and Writing Prototype Tasks for a New TOEFL

    ERIC Educational Resources Information Center

    Cumming, A.; Grant, L.; Mulcahy-Ernt, P.; Powers, D.E.

    2004-01-01

    This study was undertaken, in conjunction with other studies field-testing prototype tasks for a new TOEFL, to evaluate the content validity, perceived authenticity and educational appropriateness of these prototype tasks. We interviewed seven highly experienced instructors of English as a Second Language (ESL) at three universities, asking them…

  10. A Study of Occupational Opportunities for Chicago Youth. DACUM Task Analysis and Survey Verification. Locally Verified Task Lists.

    ERIC Educational Resources Information Center

    National Alliance of Business, Inc., Washington, DC.

    This document contains the new and verified task lists for 10 occupations: shipping and receiving clerk; general office clerk; billing, cost, and rate clerk; order clerk; accounting clerk; typist/word processor; drafter; cook; automobile mechanic; and carpenter. The DACUM (Developing a Curriculum) process task analysis and survey verification and…

  11. Space Station automated systems testing/verification and the Galileo Orbiter fault protection design/verification

    NASA Technical Reports Server (NTRS)

    Landano, M. R.; Easter, R. W.

    1984-01-01

    Aspects of Space Station automated systems testing and verification are discussed, taking into account several program requirements. It is found that these requirements lead to a number of issues and uncertainties which require study and resolution during the Space Station definition phase. Most, if not all, of the considered uncertainties have implications for the overall testing and verification strategy adopted by the Space Station Program. A description is given of the Galileo Orbiter fault protection design/verification approach. Attention is given to a mission description, an Orbiter description, the design approach and process, the fault protection design verification approach/process, and problems of 'stress' testing.

  12. Software verification and testing

    NASA Technical Reports Server (NTRS)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.

  13. RESULTS OF A METHOD VERIFICATION STUDY FOR ANALYSES OF PCP IN SOIL

    EPA Science Inventory

    As a prelude to a field demonstration of the fungal treatment technology by the SITE Program, a field treatability study was performed to select optimal fungal species and loading rates using the site-specific soil matrix contaminated with wood-preserving wastes: PCP and PAHs. ...

  14. Debris flow run off simulation and verification -- case study of Chen-You-Lan Watershed, Taiwan

    NASA Astrophysics Data System (ADS)

    Lin, M.-L.; Wang, K.-L.; Huang, J.-J.

    2005-06-01

    In 1996 typhoon Herb struck the central Taiwan area, causing severe debris flows in many subwatersheds of the Chen-You-Lan river watershed. More severe cases of debris flow occurred following the Chi-Chi earthquake of 1999. In order to identify the potentially affected area and its severity, the ability to simulate the flow route of debris is desirable. In this research, numerical simulation of the debris flow deposition process was carried out using FLO-2D, adopting the Chui-Sue river watershed as the study area. A sensitivity study of the parameters used in the numerical model was conducted and adjustments were made empirically. The micro-geomorphic database of the Chui-Sue river watershed was generated and analyzed to understand the terrain variations caused by the debris flow. Based on the micro-geomorphic analysis, the position and volume of debris deposition in the downstream area of the Chui-Sue river watershed were determined. The simulated results agreed fairly well with the micro-geomorphic study of the area when not affected by other inflow rivers, and the trends of debris distribution in the study area were fairly consistent.

  15. Time-of-flight neutron rejection to improve prompt gamma imaging for proton range verification: a simulation study

    NASA Astrophysics Data System (ADS)

    Biegun, Aleksandra K.; Seravalli, Enrica; Cambraia Lopes, Patrícia; Rinaldi, Ilaria; Pinto, Marco; Oxley, David C.; Dendooven, Peter; Verhaegen, Frank; Parodi, Katia; Crespo, Paulo; Schaart, Dennis R.

    2012-10-01

    Therapeutic proton and heavier ion beams generate prompt gamma photons that may escape from the patient. In principle, this allows for real-time, in situ monitoring of the treatment delivery, in particular, the hadron range within the patient, by imaging the emitted prompt gamma rays. Unfortunately, the neutrons simultaneously created with the prompt photons create a background that may obscure the prompt gamma signal. To enhance the accuracy of proton dose verification by prompt gamma imaging, we therefore propose a time-of-flight (TOF) technique to reject this neutron background, involving a shifting time window to account for the propagation of the protons through the patient. Time-resolved Monte Carlo simulations of the generation and transport of prompt gamma photons and neutrons upon irradiation of a PMMA phantom with 100, 150 and 200 MeV protons were performed using Geant4 (version 9.2.p02) and MCNPX (version 2.7.D). The influence of angular collimation and TOF selection on the prompt gamma and neutron longitudinal profiles is studied. Furthermore, the implications of the proton beam microstructure (characterized by the proton bunch width and repetition period) are investigated. The application of a shifting TOF window having a width of ΔTOFz = 1.0 ns appears to reduce the neutron background by more than 99%. Subsequent application of an energy threshold does not appear to sharpen the distal falloff of the prompt gamma profile but reduces the tail that is observed beyond the proton range. Investigations of the influence of the beam time structure show that TOF rejection of the neutron background is expected to be effective for typical therapeutic proton cyclotrons.
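
    The shifting TOF window amounts to accepting only events whose arrival time matches a prompt gamma emitted at the proton's instantaneous depth. A schematic Python sketch; the constant proton speed and the simple geometry are simplifying assumptions for illustration, not the paper's transport model:

        import numpy as np

        C_MM_PER_NS = 299.79      # speed of light
        BETA_PROTON = 0.43        # mean proton speed / c (placeholder value)

        def tof_accept(t_event_ns, depth_mm, detector_dist_mm, window_ns=1.0):
            # Expected arrival time = proton transit time to the emission
            # depth plus the photon flight time to the detector.
            t_proton = depth_mm / (BETA_PROTON * C_MM_PER_NS)
            t_gamma = detector_dist_mm / C_MM_PER_NS
            t_expected = t_proton + t_gamma
            # Keep events inside the shifting acceptance window.
            return np.abs(t_event_ns - t_expected) <= window_ns / 2.0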

  16. Time-of-flight neutron rejection to improve prompt gamma imaging for proton range verification: a simulation study.

    PubMed

    Biegun, Aleksandra K; Seravalli, Enrica; Lopes, Patrícia Cambraia; Rinaldi, Ilaria; Pinto, Marco; Oxley, David C; Dendooven, Peter; Verhaegen, Frank; Parodi, Katia; Crespo, Paulo; Schaart, Dennis R

    2012-10-21

    Therapeutic proton and heavier ion beams generate prompt gamma photons that may escape from the patient. In principle, this allows for real-time, in situ monitoring of the treatment delivery, in particular, the hadron range within the patient, by imaging the emitted prompt gamma rays. Unfortunately, the neutrons simultaneously created with the prompt photons create a background that may obscure the prompt gamma signal. To enhance the accuracy of proton dose verification by prompt gamma imaging, we therefore propose a time-of-flight (TOF) technique to reject this neutron background, involving a shifting time window to account for the propagation of the protons through the patient. Time-resolved Monte Carlo simulations of the generation and transport of prompt gamma photons and neutrons upon irradiation of a PMMA phantom with 100, 150 and 200 MeV protons were performed using Geant4 (version 9.2.p02) and MCNPX (version 2.7.D). The influence of angular collimation and TOF selection on the prompt gamma and neutron longitudinal profiles is studied. Furthermore, the implications of the proton beam microstructure (characterized by the proton bunch width and repetition period) are investigated. The application of a shifting TOF window having a width of ΔTOF(z) = 1.0 ns appears to reduce the neutron background by more than 99%. Subsequent application of an energy threshold does not appear to sharpen the distal falloff of the prompt gamma profile but reduces the tail that is observed beyond the proton range. Investigations of the influence of the beam time structure show that TOF rejection of the neutron background is expected to be effective for typical therapeutic proton cyclotrons.

  17. A simulation study of a C-shaped in-beam PET system for dose verification in carbon ion therapy

    NASA Astrophysics Data System (ADS)

    Jung An, Su; Beak, Cheol-Ha; Lee, Kisung; Hyun Chung, Yong

    2013-01-01

    The application of hadrons such as carbon ions is being developed for the treatment of cancer. The effectiveness of such a technique is due to the ability of charged particles to deliver most of their energy near the end of their range, at the so-called Bragg peak. However, accurate verification of dose delivery is required, since misalignment of the hadron beam can cause serious damage to normal tissue. PET scanners can be utilized to track the carbon beam to the tumor by imaging the trail of the hadron-induced positron emitters in the irradiated volume. In this study, we designed and evaluated (through Monte Carlo simulations) an in-beam PET scanner for monitoring patient dose in carbon beam therapy. A C-shaped PET and a partial-ring PET were designed to avoid interference between the PET detectors and the therapeutic carbon beam delivery. Their performance was compared with that of a full-ring PET scanner. The C-shaped, partial-ring, and full-ring scanners consisted of 14, 12, and 16 detector modules, respectively, with a 30.2 cm inner diameter for brain imaging. Each detector module was composed of a 13×13 array of 4.0 mm×4.0 mm×20.0 mm LYSO crystals and four round 25.4 mm diameter PMTs. To estimate the production yield of positron emitters such as 10C, 11C, and 15O, a cylindrical PMMA phantom (diameter, 20 cm; thickness, 20 cm) was irradiated with 170, 290, and 350 AMeV 12C beams using the GATE code. Phantom images of the three types of scanner were evaluated by comparing the longitudinal profiles of the positron emitters, measured along the carbon beam as it passed through a simulated positron emitter distribution. The results demonstrated that the development of a C-shaped PET scanner to characterize carbon dose distribution for therapy planning is feasible.

  18. VEG-01: Veggie Hardware Verification Testing

    NASA Technical Reports Server (NTRS)

    Massa, Gioia; Newsham, Gary; Hummerick, Mary; Morrow, Robert; Wheeler, Raymond

    2013-01-01

    The Veggie plant/vegetable production system is scheduled to fly on ISS at the end of 2013. Since much of the technology associated with Veggie has not been previously tested in microgravity, a hardware validation flight was initiated. This test will allow data to be collected about Veggie hardware functionality on ISS, allow crew interactions to be vetted for future improvements, validate the ability of the hardware to grow and sustain plants, and collect data that will be helpful to future Veggie investigators as they develop their payloads. Additionally, food safety data on the lettuce plants grown will be collected to help support the development of a pathway for the crew to safely consume produce grown on orbit. Significant background research has been performed on the Veggie plant growth system, with early tests focusing on the development of the rooting pillow concept, and the selection of fertilizer, rooting medium and plant species. More recent testing has been conducted to integrate the pillow concept into the Veggie hardware and to ensure that adequate water is provided throughout the growth cycle. Seed sanitation protocols have been established for flight, and hardware sanitation between experiments has been studied. Methods for shipping and storage of rooting pillows and the development of crew procedures and crew training videos for plant activities on-orbit have been established. Science verification testing was conducted and lettuce plants were successfully grown in prototype Veggie hardware; microbial samples were taken, and plants were harvested, frozen, stored and later analyzed for microbial growth, nutrients, and ATP levels. An additional verification test, prior to the final payload verification testing, is desired to demonstrate similar growth in the flight hardware and also to test a second set of pillows containing zinnia seeds. Issues with root mat water supply are being resolved, with final testing and flight scheduled for later in 2013.

  19. Microbial ureolysis in the seawater-catalysed urine phosphorus recovery system: Kinetic study and reactor verification.

    PubMed

    Tang, Wen-Tao; Dai, Ji; Liu, Rulong; Chen, Guang-Hao

    2015-12-15

    Our previous study confirmed the feasibility of using seawater as an economical precipitant for urine phosphorus (P) precipitation. However, we still understand very little about ureolysis in the Seawater-based Urine Phosphorus Recovery (SUPR) system, despite its being a crucial step for urine P recovery. In this study, batch experiments were conducted to investigate the kinetics of microbial ureolysis in the seawater-urine system. Indigenous bacteria from urine and seawater exhibited relatively low ureolytic activity, but they adapted quickly to the urine-seawater mixture during batch cultivation. During cultivation, both the abundance and the specific ureolysis rate of the indigenous bacteria were greatly enhanced, as confirmed by a biomass-dependent Michaelis-Menten model. The period for full ureolysis decreased from 180 h to 2.5 h after four cycles of cultivation. Based on the successful cultivation, a lab-scale SUPR reactor was set up to verify the fast ureolysis and efficient P recovery in the SUPR system. Nearly complete urine P removal was achieved in the reactor in 6 h without adding any chemicals. Terminal Restriction Fragment Length Polymorphism (TRFLP) analysis revealed that the predominant groups of bacteria in the SUPR reactor likely originated from seawater rather than urine. Moreover, batch tests confirmed the high ureolysis rates and high phosphorus removal efficiency induced by cultivated bacteria in the SUPR reactor under seawater-to-urine mixing ratios ranging from 1:1 to 9:1. This study proves that the enrichment of indigenous bacteria in the SUPR system can provide sufficient ureolytic activity for phosphate precipitation, thus offering an efficient and economical method for urine P recovery.
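
    A biomass-dependent Michaelis-Menten model of the kind invoked above couples urea hydrolysis to bacterial growth. A minimal Python sketch integrating the two ODEs; all rate constants and initial values are placeholders, not the fitted values from the study:

        import numpy as np
        from scipy.integrate import solve_ivp

        V_MAX = 2.0e-12   # mol urea per cell per hour (placeholder)
        K_M = 5.0e-3      # half-saturation constant, mol/L (placeholder)
        MU = 0.05         # bacterial growth rate, 1/h (placeholder)

        def supr_odes(t, y):
            # d[urea]/dt = -V_MAX * X * [urea] / (K_M + [urea]);  dX/dt = MU * X
            urea, biomass = y
            rate = V_MAX * biomass * urea / (K_M + urea)
            return [-rate, MU * biomass]

        # Initial urea concentration (mol/L) and cell density (cells/L).
        sol = solve_ivp(supr_odes, (0.0, 180.0), [0.25, 1.0e9],
                        method="LSODA", rtol=1e-8)
        print("urea remaining after 180 h: %.3g mol/L" % sol.y[0, -1])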

  20. An Interoperability Platform Enabling Reuse of Electronic Health Records for Signal Verification Studies

    PubMed Central

    Yuksel, Mustafa; Gonul, Suat; Laleci Erturkmen, Gokce Banu; Sinaci, Ali Anil; Invernizzi, Paolo; Facchinetti, Sara; Migliavacca, Andrea; Bergvall, Tomas; Depraetere, Kristof; De Roo, Jos

    2016-01-01

    Depending mostly on voluntarily sent spontaneous reports, pharmacovigilance studies are hampered by the low quantity and quality of patient data. Our objective is to improve postmarket safety studies by enabling safety analysts to seamlessly access a wide range of EHR sources for collecting deidentified medical data sets of selected patient populations and tracing the reported incidents back to original EHRs. We have developed an ontological framework where EHR sources and target clinical research systems can continue using their own local data models, interfaces, and terminology systems, while structural and semantic interoperability are handled through rule-based reasoning on formal representations of the different models and terminology systems maintained in the SALUS Semantic Resource Set. The SALUS Common Information Model at the core of this set acts as the common mediator. We demonstrate the capabilities of our framework through one of the SALUS safety analysis tools, namely the Case Series Characterization Tool, which has been deployed on top of the regional EHR Data Warehouse of the Lombardy Region, containing about 1 billion records from 16 million patients, and validated by several pharmacovigilance researchers with real-life cases. The results confirm significant improvements in signal detection and evaluation compared to traditional methods with missing background information. PMID:27123451

  1. A kinetic study of lipase-catalyzed reversible kinetic resolution involving verification at miniplant-scale.

    PubMed

    Berendsen, W R; Gendrot, G; Freund, A; Reuss, M

    2006-12-05

    Lipase-catalyzed kinetic resolution of racemates is a popular method for the synthesis of chiral synthons. Most of these resolutions are reversible, equilibrium-limited reactions. For the first time, an extensive kinetic model is proposed for kinetic resolution reactions which takes into account the full reversibility of the reaction, substrate inhibition by an acyl donor and an acyl acceptor, as well as alternative substrate inhibition by each enantiomer. For this purpose, the reversible enantioselective transesterification of (R/S)-1-methoxy-2-propanol with ethyl acetate catalyzed by Candida antarctica lipase B (CAL-B) is investigated. The detailed model presented here is valid for a wide range of substrate and product concentrations. Following model discrimination and the application of Haldane equations to reduce the degrees of freedom in parameter estimation, the 11 free parameters are successfully identified. All parameters are fitted to the complete data set simultaneously. Six types of independent initial rate studies provide a solid data basis for the model. The effect of changes in substrate and product concentration on the reaction kinetics is discussed. The developed model is used for simulations to study the behavior of the reaction kinetics in a fixed bed reactor. The typical plot of enantiomeric excess versus conversion of substrate and product is evaluated at various initial substrate mixtures. The model is validated by comparison with experimental results obtained with a fixed bed reactor, which is part of a fully automated state-of-the-art miniplant.

  2. Laboratory verification of submicron magnetite production in pseudotachylytes: relevance for paleointensity studies

    NASA Astrophysics Data System (ADS)

    Nakamura, Norihiro; Hirose, Takehiro; Borradaile, Graham J.

    2002-07-01

    Pseudotachylytes generally possess stable remanent magnetizations but the processes by which pseudotachylytes are magnetized remain poorly understood. Magnetic hysteresis and scanning electron microscope studies reveal that experimental frictional melting of granites produces dispersed submicron inclusions of weakly interacting pseudo-single-domain (PSD) magnetite, in artificial pseudotachylyte. The magnetite inclusions are absent in the undeformed granite protolith and result from oxidation of Fe in melt-susceptible mafic minerals during the melt-quenched event. The pseudotachylytes acquired a stable thermal remanence in fine-grained PSD magnetites during the rapid cooling of the melt, implying that fine-grained magnetite has the potential for paleointensity determinations of contemporaneous magnetic fields with co-seismic faulting in granitoids.

  3. Health Checkup and Telemedical Intervention Program for Preventive Medicine in Developing Countries: Verification Study

    PubMed Central

    Kai, Eiko; Ghosh, Partha Pratim; Islam, Rafiqul; Ahmed, Ashir; Kuroda, Masahiro; Inoue, Sozo; Hiramatsu, Tatsuo; Kimura, Michio; Shimizu, Shuji; Kobayashi, Kunihisa; Baba, Yukino; Kashima, Hisashi; Tsuda, Koji; Sugiyama, Masashi; Blondel, Mathieu; Ueda, Naonori; Kitsuregawa, Masaru; Nakashima, Naoki

    2015-01-01

    Background: The prevalence of non-communicable diseases is increasing throughout the world, including developing countries. Objective: The intent was to conduct a study of a preventive medical service in a developing country, combining eHealth checkups and teleconsultation, as well as to assess stratification rules and the short-term effects of intervention. Methods: We developed an eHealth system that comprises a set of sensor devices in an attaché case, a data transmission system linked to a mobile network, and a data management application. We provided eHealth checkups for the populations of five villages and the employees of five factories/offices in Bangladesh. Individual health condition was automatically categorized into four grades based on international diagnostic standards: green (healthy), yellow (caution), orange (affected), and red (emergent). We provided teleconsultation for orange- and red-grade subjects and provided teleprescription for these subjects as required. Results: The first checkup was provided to 16,741 subjects. After one year, 2361 subjects participated in the second checkup, and the systolic blood pressure of these subjects had significantly decreased from an average of 121 mmHg to an average of 116 mmHg (P<.001). Based on these results, we propose a cost-effective method using a machine learning technique (the random forest method) with the medical interview, subject profiles, and checkup results as predictors, to avoid costly measurements of blood sugar and to ensure sustainability of the program in developing countries. Conclusions: The results of this study demonstrate the benefits of an eHealth checkup and teleconsultation program as an effective health care system in developing countries. PMID:25630348
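
    The proposed stratification is a standard supervised-learning setup: predict the four-colour triage grade from interview, profile, and inexpensive checkup features so that costly blood-sugar measurements can be skipped. A minimal Python sketch with scikit-learn on placeholder data; the feature set and sizes are illustrative:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)
        X = rng.normal(size=(2000, 20))    # interview/profile/cheap-checkup features
        y = rng.integers(0, 4, size=2000)  # grades: 0 green .. 3 red

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                                  random_state=0)
        model = RandomForestClassifier(n_estimators=300, random_state=0)
        model.fit(X_tr, y_tr)
        print("held-out accuracy:", model.score(X_te, y_te))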

  4. Verification of equations for incipient motion studies for a rigid rectangular channel.

    PubMed

    Bong, Charles Hin Joo; Lau, Tze Liang; Ghani, Aminuddin Ab

    2013-01-01

    The current study aims to verify the existing equations for incipient motion in a rigid rectangular channel. Data from experimental work on incipient motion in a rectangular flume with two different widths, namely 0.3 and 0.6 m, were compared with the critical velocity values predicted by the equations of Novak & Nalluri and El-Zaemey. The equation by El-Zaemey performed better, with an average discrepancy ratio of 1.06, compared with the equation by Novak & Nalluri with an average discrepancy ratio of 0.87. However, as the sediment deposit thickness increased, the equation by El-Zaemey became less accurate. A plot of the experimental data on the Shields Diagram showed the significant effect of the sediment deposit thickness: as the deposit became thicker, the dimensionless shear stress θ also increased. A new equation incorporating the sediment deposit thickness was therefore proposed; it gave improved predictions, with an average discrepancy ratio of 1.02.
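
    The discrepancy ratio used above is simply the mean of predicted over observed critical velocities, with 1.0 indicating an unbiased equation. A minimal Python sketch with made-up velocity pairs for illustration:

        import numpy as np

        def discrepancy_ratio(v_predicted, v_observed):
            # Mean of predicted/observed; >1 over-predicts, <1 under-predicts.
            return np.mean(np.asarray(v_predicted) / np.asarray(v_observed))

        # Hypothetical critical-velocity pairs (m/s), for illustration only.
        v_obs = [0.31, 0.28, 0.40, 0.35]
        v_pred = [0.33, 0.30, 0.41, 0.38]
        print("average discrepancy ratio: %.2f" % discrepancy_ratio(v_pred, v_obs))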

  5. Preliminary studies of PQS PET detector module for dose verification of carbon beam therapy

    NASA Astrophysics Data System (ADS)

    Kim, H.-I.; An, S. Jung; Lee, C. Y.; Jo, W. J.; Min, E.; Lee, K.; Kim, Y.; Joung, J.; Chung, Y. H.

    2014-05-01

    PET imaging can be used to verify the dose distributions of therapeutic particle beams such as carbon ion beams. The purpose of this study was to develop a PET detector module designed for an in-beam PET scanner geometry integrated into a carbon beam therapy system, and to evaluate its feasibility as a monitoring system for patient dose distribution. A C-shaped PET geometry was proposed to avoid blockage of the carbon beam by the detector modules. The proposed PET system consists of 14 detector modules forming a bore with a 30.2 cm inner diameter for brain imaging. Each detector module is composed of a 9 × 9 array of 4.0 mm × 4.0 mm × 20.0 mm LYSO crystals optically coupled to four 29 mm diameter PMTs using the photomultiplier-quadrant-sharing (PQS) technique. Because each crystal pixel is identified from the distribution of scintillation light among the four PMTs, the design of the reflectors between crystal elements must be carefully optimized. The optical design of the reflectors was optimized using DETECT2000, a Monte Carlo code for light photon transport. A laser-cut reflector set was fabricated using Enhanced Specular Reflector (ESR, 3M Co.) mirror film, with a high reflectance of 98% and a thickness of 0.064 mm. All 81 crystal elements of the detector module were identified. The C-shaped PET system is under development, and we present the first reconstructed image.
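
    Crystal identification in a quadrant-sharing design rests on Anger-type ratios of the four PMT signals; the flood-histogram peaks are then mapped to crystal elements with a lookup table. A minimal Python sketch of the position estimate (the PMT layout is an assumed convention, not taken from the paper):

        import numpy as np

        def pqs_position(p_a, p_b, p_c, p_d):
            # Assumed layout:  A B    x grows toward B/D,
            #                  C D    y grows toward A/B.
            total = p_a + p_b + p_c + p_d
            x = ((p_b + p_d) - (p_a + p_c)) / total
            y = ((p_a + p_b) - (p_c + p_d)) / total
            return x, y   # normalised coordinates in roughly [-1, 1]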

  6. Theory for noise of propellers in angular inflow with parametric studies and experimental verification

    NASA Technical Reports Server (NTRS)

    Hanson, Donald B.; Parzych, David J.

    1993-01-01

    This report presents the derivation of a frequency domain theory and working equations for the radiation of propeller harmonic noise in the presence of angular inflow. In applying the acoustic analogy, integration over the tangential coordinate of the source region is performed numerically, permitting the equations to be solved without approximation for any degree of angular inflow. Inflow angle is specified in terms of the yaw, pitch, and roll angles of the aircraft. Since these can be arbitrarily large, the analysis applies with equal accuracy to propellers and helicopter rotors. For thickness and loading, the derivation is given in complete detail with working equations for the near and far field. However, the quadrupole derivation has been carried only far enough to show the feasibility of the numerical approach. Explicit formulas are presented for the computation of source elements, evaluation of Green's functions, and location of observer points in various visual and retarded coordinate systems. The resulting computer program, called WOBBLE, has been written in FORTRAN and follows the notation of this report very closely. The new theory is explored to establish the effects of varying inflow angle on axial and circumferential directivity. Also, parametric studies were performed to evaluate various phenomena outside the capabilities of earlier theories, such as an unsteady thickness effect. The validity of the theory was established by comparison with test data from conventional propellers and Prop Fans in flight and in wind tunnels under a variety of operating conditions and inflow angles.

  7. Daily precipitation estimation through different microwave sensors: Verification study over Italy

    NASA Astrophysics Data System (ADS)

    Ciabatta, Luca; Marra, Anna Cinzia; Panegrossi, Giulia; Casella, Daniele; Sanò, Paolo; Dietrich, Stefano; Massari, Christian; Brocca, Luca

    2017-02-01

    The accurate estimation of rainfall from remote sensing is of paramount importance for many applications, for instance the mitigation of natural hazards such as floods, droughts, and landslides. Traditionally, microwave observations at frequencies between 10 and 183 GHz are used for estimating rainfall, based on the direct interaction of radiation with the hydrometeors within precipitating clouds, in a so-called top-down approach. Recently, a bottom-up approach was proposed that uses satellite soil moisture products derived from microwave observations (<10 GHz) for the estimation of accumulated rainfall amounts. The integration of the bottom-up and top-down approaches has large potential for providing highly accurate rainfall estimates by exploiting their different and complementary nature. In this study, we perform a long-term (3 year) assessment of different satellite rainfall products exploiting the full range of microwave frequencies over Italy. Specifically, the integration of two top-down algorithms (CDRD, Cloud Dynamics and Radiation Database, and PNPR, Passive microwave Neural network Precipitation Retrieval) for estimating rainfall from conically and cross-track scanning radiometers, and one bottom-up algorithm (SM2RAIN) applied to the Advanced SCATterometer soil moisture product, is carried out. The performances of the products, individually and merged together, are assessed at the daily time scale. The integration of the top-down and bottom-up approaches provides the highest performance in terms of both continuous and categorical scores (i.e., median correlation coefficient and root mean square error values equal to 0.71 and 6.62 mm, respectively). In such a combination the limitations of the two approaches are compensated, allowing a better estimation of accumulated rainfall through SM2RAIN while overcoming the limitations of rainfall estimation for intense events during wet conditions through the CDRD-PNPR product. The accuracy and the reliability of the
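
    The continuous and categorical scores quoted above can be computed directly from matched satellite/reference series. A minimal Python sketch returning correlation, RMSE, probability of detection, and false alarm ratio; the rain/no-rain threshold is an assumed parameter:

        import numpy as np

        def verification_scores(sat, ref, rain_thresh=0.5):
            sat, ref = np.asarray(sat, float), np.asarray(ref, float)
            # Continuous scores.
            r = np.corrcoef(sat, ref)[0, 1]
            rmse = np.sqrt(np.mean((sat - ref) ** 2))
            # Categorical scores from the rain/no-rain contingency table.
            hits = np.sum((sat >= rain_thresh) & (ref >= rain_thresh))
            misses = np.sum((sat < rain_thresh) & (ref >= rain_thresh))
            false_alarms = np.sum((sat >= rain_thresh) & (ref < rain_thresh))
            pod = hits / (hits + misses)
            far = false_alarms / (hits + false_alarms)
            return r, rmse, pod, far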

  8. First in situ TOF-PET study using digital photon counters for proton range verification

    NASA Astrophysics Data System (ADS)

    Cambraia Lopes, P.; Bauer, J.; Salomon, A.; Rinaldi, I.; Tabacchini, V.; Tessonnier, T.; Crespo, P.; Parodi, K.; Schaart, D. R.

    2016-08-01

    Positron emission tomography (PET) is the imaging modality most extensively tested for treatment monitoring in particle therapy. Optimal use of PET in proton therapy requires in situ acquisition of the relatively strong 15O signal due to its relatively short half-life (~2 min) and high oxygen content in biological tissues, enabling shorter scans that are less sensitive to biological washout. This paper presents the first performance tests of a scaled-down in situ time-of-flight (TOF) PET system based on digital photon counters (DPCs) coupled to Cerium-doped Lutetium Yttrium Silicate (LYSO:Ce) crystals, providing quantitative results representative of a dual-head tomograph that complies with spatial constraints typically encountered in clinical practice (2  ×  50°, of 360°, transaxial angular acceptance). The proton-induced activity inside polymethylmethacrylate (PMMA) and polyethylene (PE) phantoms was acquired within beam pauses (in-beam) and immediately after irradiation by an actively-delivered synchrotron pencil-beam, with clinically relevant 125.67 MeV/u, 4.6  ×  108 protons s-1, and 1010 total protons. 3D activity maps reconstructed with and without TOF information are compared to FLUKA simulations, demonstrating the benefit of TOF-PET to reduce limited-angle artefacts using a 382 ps full width at half maximum coincidence resolving time. The time-dependent contributions from different radionuclides to the total count-rate are investigated. We furthermore study the impact of the acquisition time window on the laterally integrated activity depth-profiles, with emphasis on 2 min acquisitions starting at different time points. The results depend on phantom composition and reflect the differences in relative contributions from the radionuclides originating from carbon and oxygen. We observe very good agreement between the shapes of the simulated and measured activity depth-profiles for post-beam protocols. However, our results also

  9. First in situ TOF-PET study using digital photon counters for proton range verification.

    PubMed

    Cambraia Lopes, P; Bauer, J; Salomon, A; Rinaldi, I; Tabacchini, V; Tessonnier, T; Crespo, P; Parodi, K; Schaart, D R

    2016-08-21

    Positron emission tomography (PET) is the imaging modality most extensively tested for treatment monitoring in particle therapy. Optimal use of PET in proton therapy requires in situ acquisition of the relatively strong (15)O signal due to its relatively short half-life (~2 min) and high oxygen content in biological tissues, enabling shorter scans that are less sensitive to biological washout. This paper presents the first performance tests of a scaled-down in situ time-of-flight (TOF) PET system based on digital photon counters (DPCs) coupled to Cerium-doped Lutetium Yttrium Silicate (LYSO:Ce) crystals, providing quantitative results representative of a dual-head tomograph that complies with spatial constraints typically encountered in clinical practice (2  ×  50°, of 360°, transaxial angular acceptance). The proton-induced activity inside polymethylmethacrylate (PMMA) and polyethylene (PE) phantoms was acquired within beam pauses (in-beam) and immediately after irradiation by an actively-delivered synchrotron pencil-beam, with clinically relevant 125.67 MeV/u, 4.6  ×  10(8) protons s(-1), and 10(10) total protons. 3D activity maps reconstructed with and without TOF information are compared to FLUKA simulations, demonstrating the benefit of TOF-PET to reduce limited-angle artefacts using a 382 ps full width at half maximum coincidence resolving time. The time-dependent contributions from different radionuclides to the total count-rate are investigated. We furthermore study the impact of the acquisition time window on the laterally integrated activity depth-profiles, with emphasis on 2 min acquisitions starting at different time points. The results depend on phantom composition and reflect the differences in relative contributions from the radionuclides originating from carbon and oxygen. We observe very good agreement between the shapes of the simulated and measured activity depth-profiles for post-beam protocols. However, our results

  10. Practical mask inspection system with printability and pattern priority verification

    NASA Astrophysics Data System (ADS)

    Tsuchiya, Hideo; Ozaki, Fumio; Takahara, Kenichi; Inoue, Takafumi; Kikuiri, Nobutaka

    2011-05-01

    Through four years of study in the Association of Super-Advanced Electronics Technologies (ASET) on reducing mask manufacturing Turn Around Time (TAT) and cost, we have established a technology to improve the efficiency of the review process by applying a printability verification function that utilizes computational lithography simulations to analyze defects detected by a high-resolution mask inspection system. With the advent of Source-Mask Optimization (SMO) and other technologies that extend the life of existing optical lithography, it is becoming extremely difficult to judge a defect only by the shape of a mask pattern while avoiding pseudo-defects. Thus, printability verification is indispensable for filtering out nuisance defects from high-resolution mask inspection results. When using computational lithography simulations to verify printability with high precision, the image captured by the inspection system must be prepared with extensive care; for practical applications, however, this preparation process needs to be simplified. In addition, utilizing Mask Data Rank (MDR) to vary the defect detection sensitivity according to the patterns is also useful for simultaneously inspecting minute patterns and avoiding pseudo-defects. Combining these two technologies, we believe practical mask inspection for next generation lithography is achievable. We have been improving the estimation accuracy of the printability verification function through discussions with several customers and evaluations of their masks. In this report, we describe the progress of these practical mask verification functions developed through customers' evaluations.

  11. The PASCAL-HDM Verification System

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The PASCAL-HDM verification system is described. This system supports the mechanical generation of verification conditions from PASCAL programs and HDM-SPECIAL specifications using the Floyd-Hoare axiomatic method. Tools are provided to parse programs and specifications, check their static semantics, generate verification conditions from Hoare rules, and translate the verification conditions appropriately for proof using the Shostak Theorem Prover. The differences between standard PASCAL and the language handled by this system are explained. These consist mostly of restrictions to the standard language definition, the only extensions or modifications being the addition of specifications to the code and the requirement that references to a function of no arguments carry empty parentheses.
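
    The core of Floyd-Hoare verification-condition generation is the assignment axiom: the weakest precondition of "var := expr" with respect to a postcondition is that postcondition with var replaced by expr. A minimal Python sketch using sympy for the substitution; this illustrates the rule itself, not the PASCAL-HDM tooling:

        import sympy as sp

        def wp_assign(postcondition, var, expr):
            # Floyd-Hoare assignment axiom: wp(var := expr, Q) = Q[var := expr]
            return postcondition.subs(var, expr)

        x, y = sp.symbols("x y")
        post = sp.Ge(x, 0)                 # postcondition: x >= 0
        vc = wp_assign(post, x, x + y)     # for the assignment x := x + y
        print(vc)                          # prints: x + y >= 0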

  12. SU-E-T-524: In-Vivo Diode Dosimetry Proton Therapy Range Verification Validation Study for Pediatric CSI

    SciTech Connect

    Toltz, A; Seuntjens, J; Hoesl, M; Schuemann, J; Lu, H; Paganetti, H

    2015-06-15

    Purpose: With the aim of reducing acute esophageal radiation toxicity in pediatric patients receiving craniospinal irradiation (CSI), we investigated the implementation of an in-vivo, adaptive proton therapy range verification methodology. Simulation experiments and in-phantom measurements were conducted to validate the range verification technique for this clinical application. Methods: A silicon diode array system has been developed and experimentally tested in phantom for passively scattered proton beam range verification for a prostate treatment case by correlating properties of the detector signal to the water equivalent path length (WEPL). We propose to extend the methodology to verify range distal to the vertebral body for pediatric CSI cases by placing this small volume dosimeter in the esophagus of the anesthetized patient immediately prior to treatment. A set of calibration measurements was performed to establish a time signal to WEPL fit for a “scout” beam in a solid water phantom. Measurements are compared against Monte Carlo simulation in GEANT4 using the Tool for Particle Simulation (TOPAS). Results: Measurements with the diode array in a spread out Bragg peak of 14 cm modulation width and 15 cm range (177 MeV passively scattered beam) in solid water were successfully validated against proton fluence rate simulations in TOPAS. The resulting calibration curve allows for a sensitivity analysis of detector system response with dose rate in simulation and with individual diode position through simulation on patient CT data. Conclusion: Feasibility has been shown for the application of this range verification methodology to pediatric CSI. An in-vivo measurement to determine the WEPL to the inner surface of the esophagus will allow for personalized adjustment of the treatment plan to ensure sparing of the esophagus while confirming target coverage. A Toltz acknowledges partial support by the CREATE Medical Physics Research Training Network grant of the

  13. Using Statechart Assertion for the Formal Validation and Verification of a Real-Time Software System: A Case Study

    DTIC Science & Technology

    2011-03-01

    sophisticated surveillance system, but might not have a HARM missile on board. F-16 has HARM launching capabilities but does not have a sophisticated... requirement. R1: Missile launch must not occur when the system is in jamming radiate mode. Figure 14: Statechart assertion diagram for the R1 requirement... Orion Pad Abort software,” Technical Report, Naval Postgraduate School, May 2010. [5] D. R. Wallace and R. U. Fujii, “Software Verification and

  14. Studies in Seismic Verification

    DTIC Science & Technology

    1992-05-01

    features in the Earth. G(ω) includes source region effects such as free surface reflections, geometrical spreading which may be frequency dependent... pressure function at the elastic radius. They used a pressure function based on free-field observations of several underground nuclear explosions... show an increase in 10 and 30 Hz spectral amplitude by a factor of about 5 above the free surface effect. Therefore we expect the Anza spectral

  15. MX Siting Investigation Geotechnical Evaluation. Volume VI. Nevada-Utah Verification Studies, FY 79. Geotechnical Data, Garden-Coal CDP, Nevada.

    DTIC Science & Technology

    1979-08-24

    GEOTECHNICAL EVALUATION, AD-A113328, Volume VI: Nevada-Utah Verification Studies, FY 79. Geotechnical Data, Garden-Coal CDP, Nevada. Prepared for: U.S. Department of the Air Force, Space and Missile... FN-TR-27-VI, Volume VI: Geotechnical Data, Garden-Coal CDP. Table of Contents...

  16. Online fingerprint verification.

    PubMed

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints. Local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results show that this system can be used effectively for secure online verification applications.
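
    A filter-bank representation of the kind implemented here convolves the fingerprint with oriented Gabor filters and summarises each response over a grid of cells. A minimal Python sketch; the kernel size, wavelength, and the use of standard deviation as the cell statistic are illustrative choices, not the study's exact parameters:

        import numpy as np
        from scipy.signal import fftconvolve

        def gabor_kernel(size, wavelength, theta, sigma):
            # Real oriented Gabor kernel tuned to the ridge frequency.
            half = size // 2
            y, x = np.mgrid[-half:half + 1, -half:half + 1]
            xr = x * np.cos(theta) + y * np.sin(theta)
            yr = -x * np.sin(theta) + y * np.cos(theta)
            gauss = np.exp(-(xr ** 2 + yr ** 2) / (2.0 * sigma ** 2))
            return gauss * np.cos(2.0 * np.pi * xr / wavelength)

        def filterbank_features(image, n_orient=8, cell=16):
            # One statistic per (orientation, cell): a fixed-length vector.
            feats = []
            for k in range(n_orient):
                kern = gabor_kernel(33, 10.0, k * np.pi / n_orient, 4.0)
                resp = fftconvolve(image, kern, mode="same")
                for iy in range(0, image.shape[0] - cell + 1, cell):
                    for ix in range(0, image.shape[1] - cell + 1, cell):
                        feats.append(resp[iy:iy + cell, ix:ix + cell].std())
            return np.asarray(feats)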

  17. Enhanced Cancelable Biometrics for Online Signature Verification

    NASA Astrophysics Data System (ADS)

    Muramatsu, Daigo; Inuma, Manabu; Shikata, Junji; Otsuka, Akira

    Cancelable approaches for biometric person authentication have been studied to protect enrolled biometric data, and several algorithms have been proposed. One drawback of cancelable approaches is that their performance is inferior to that of non-cancelable approaches. In this paper, we propose a scheme to improve the performance of a cancelable approach for online signature verification. Our scheme generates two cancelable datasets from one raw dataset and uses both for verification. Preliminary experiments were performed using a distance-based online signature verification algorithm. The experimental results show that our proposed scheme is promising.
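
    A hedged illustration of the idea of verifying against two cancelable templates derived from one raw sample. The paper's actual transform is not reproduced here; a keyed random projection stands in as a common cancelable transform, and all names and thresholds are assumptions.

```python
# Sketch: two revocable templates from one raw feature vector, with the
# two distances fused at verification time. Illustrative only.
import numpy as np

def cancelable_template(features: np.ndarray, key: int, dim: int = 32) -> np.ndarray:
    rng = np.random.default_rng(key)          # the key makes the template revocable
    projection = rng.standard_normal((dim, features.size))
    return projection @ features

def verify(enrolled, query, key_a=1, key_b=2, threshold=5.0) -> bool:
    # Transform both samples under both keys and average the two distances.
    d = 0.0
    for key in (key_a, key_b):
        d += np.linalg.norm(cancelable_template(enrolled, key)
                            - cancelable_template(query, key))
    return d / 2 < threshold
```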

  18. Verification and Validation (V&V) Methodologies for Multiphase Turbulent and Explosive Flows. V&V Case Studies of Computer Simulations from Los Alamos National Laboratory GMFIX codes

    NASA Astrophysics Data System (ADS)

    Dartevelle, S.

    2006-12-01

    Quality Assurance, verification, and validation requirements from the Office of Civilian Radioactive Waste Management of the US Dept of Energy. GMFIX solves Navier-Stokes and energy partial differential equations for each phase with appropriate turbulence and interfacial coupling between phases. For momentum-driven single- to multi-phase underexpanded jets, the position of the first Mach disk is known empirically as a function of both the pressure ratio, K, and the particle mass fraction, Phi, at the nozzle: the higher K, the further downstream the first Mach disk; the higher Phi, the further upstream the first Mach disk. We show that GMFIX captures these two essential features. In addition, GMFIX displays all the properties found in these jets, such as expansion fans, incident and reflected shocks, and subsequent downstream Mach disks, which makes this code ideal for further investigations of equivalent volcanological phenomena. One of the other most challenging aspects of volcanic phenomenology is the multiphase nature of turbulence. We also validated GMFIX by comparing the velocity profiles and turbulence quantities against well-constrained analog experiments. The velocity profiles agree with the analog ones, as do those for the production of turbulent quantities. Overall, the verification and validation experiments, although inherently challenging, suggest GMFIX captures the most essential dynamical properties of multiphase and supersonic flows and jets.
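
    For context, the classical single-phase correlation for the first Mach disk location (Ashkenas and Sherman) expresses the downstream trend with K that the abstract describes; it is quoted here as background, not as GMFIX output.

```latex
% Classical single-phase correlation for the first Mach disk position,
% in nozzle diameters D, versus the pressure ratio K:
\frac{x_M}{D} \;\approx\; 0.67\,\sqrt{K}, \qquad K = \frac{p_0}{p_\infty}
% The multiphase trend reported above: increasing particle mass fraction
% Phi moves x_M upstream of this single-phase prediction.
```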

  19. The 2014 Sandia Verification and Validation Challenge: Problem statement

    SciTech Connect

    Hu, Kenneth; Orient, George

    2016-01-18

    This paper presents a case study in utilizing information from experiments, models, and verification and validation (V&V) to support a decision. It consists of a simple system with data and models provided, plus a safety requirement to assess. The goal is to pose a problem that is flexible enough to allow challengers to demonstrate a variety of approaches, but constrained enough to focus attention on a theme. This was accomplished by providing a good deal of background information in addition to the data, models, and code, but directing the participants' activities with specific deliverables. In this challenge, the theme is how to gather and present evidence about the quality of model predictions, in order to support a decision. This case study formed the basis of the 2014 Sandia V&V Challenge Workshop and this resulting special edition of the ASME Journal of Verification, Validation, and Uncertainty Quantification.

  20. The 2014 Sandia Verification and Validation Challenge: Problem statement

    DOE PAGES

    Hu, Kenneth; Orient, George

    2016-01-18

    This paper presents a case study in utilizing information from experiments, models, and verification and validation (V&V) to support a decision. It consists of a simple system with data and models provided, plus a safety requirement to assess. The goal is to pose a problem that is flexible enough to allow challengers to demonstrate a variety of approaches, but constrained enough to focus attention on a theme. This was accomplished by providing a good deal of background information in addition to the data, models, and code, but directing the participants' activities with specific deliverables. In this challenge, the theme is how to gather and present evidence about the quality of model predictions, in order to support a decision. This case study formed the basis of the 2014 Sandia V&V Challenge Workshop and this resulting special edition of the ASME Journal of Verification, Validation, and Uncertainty Quantification.

  1. Study of wood plastic composite in the presence of nitrogen containing additives

    NASA Astrophysics Data System (ADS)

    Ali, K. M. Idriss; Khan, Mubarak A.; Husain, M. M.

    1994-10-01

    Effect of nitrogen-containing additives in the study of wood plastic composites of MMA with simul and mango wood of Bangladesh has been investigated. Nine different additives were used and the additives containing carboamide group induce the highest tensile strength to the composite.

  2. National Energy Efficiency Evaluation, Measurement and Verification (EM&V) Standard: Scoping Study of Issues and Implementation Requirements

    SciTech Connect

    Schiller Consulting, Inc.; Schiller, Steven R.; Goldman, Charles A.; Galawish, Elsia

    2011-02-04

    This report is a scoping study that identifies issues associated with developing a national evaluation, measurement and verification (EM&V) standard for end-use, non-transportation, energy efficiency activities. The objectives of this study are to identify the scope of such a standard and define EM&V requirements and issues that will need to be addressed in a standard. To explore these issues, we provide and discuss: (1) a set of definitions applicable to an EM&V standard; (2) a literature review of existing guidelines, standards, and 'initiatives' relating to EM&V standards as well as a review of 'bottom-up' versus 'top-down' evaluation approaches; (3) a summary of EM&V-related provisions of two recent federal legislative proposals (Congressman Waxman's and Markey's American Clean Energy and Security Act of 2009 and Senator Bingaman's American Clean Energy Leadership Act of 2009) that include national efficiency resource requirements; (4) an annotated list of issues that are likely to be central to, and need to be considered when, developing a national EM&V standard; and (5) a discussion of the implications of such issues. There are three primary reasons for developing a national efficiency EM&V standard. First, some policy makers, regulators, and practitioners believe that a national standard would streamline EM&V implementation, reduce costs and complexity, and improve comparability of results across jurisdictions, although there are benefits associated with each jurisdiction setting its own EM&V requirements based on its specific portfolio, evaluation budgets, and objectives. Second, if energy efficiency is determined by the US Environmental Protection Agency to be a Best Available Control Technology (BACT) for avoiding criteria pollutant and/or greenhouse gas emissions, then a standard can be required for documenting the emission reductions resulting from efficiency actions. The third reason for a national EM&V standard is that such a standard is

  3. MX Siting Investigation. Geotechnical Evaluation. Volume VII. Nevada - Utah Verification Studies, FY 79. Geotechnical Data Reveille - Railroad CDP, Nevada.

    DTIC Science & Technology

    1979-08-24

    SITING INVESTIGATION, DEPARTMENT OF THE AIR FORCE - SAMSO. Soil descriptions and logs of surficial soil samples for the verification site, Reveille-Railroad CDP, Nevada. [Remainder of the scanned excerpt is illegible.]

  4. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC robot arm and CASE backhoe validation and a comparative study of DADS, DISCOS, and CONTOPS, which are existing public-domain and commercial multibody dynamics simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.
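
    A minimal sketch of the cross-checking such a library automates, assuming a shared time grid and a simple mixed absolute/relative tolerance; the function name and tolerances are illustrative, not the library's actual interface.

```python
# Hedged sketch: compare two codes' time histories for the same benchmark.
import numpy as np

def cross_check(result_a, result_b, rel_tol=1e-3, abs_tol=1e-6):
    """Return (passed, max_abs_error) for two trajectories on a shared time grid."""
    a = np.asarray(result_a, dtype=float)
    b = np.asarray(result_b, dtype=float)
    err = np.abs(a - b)
    bound = abs_tol + rel_tol * np.abs(a)
    return bool(np.all(err <= bound)), float(err.max())

# e.g. cross_check(dads_tip_deflection, discos_tip_deflection)
# (variable names are hypothetical placeholders for two codes' outputs)
```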

  5. Cold fusion verification

    NASA Astrophysics Data System (ADS)

    North, M. H.; Mastny, G. F.; Wesley, E. J.

    1991-03-01

    The objective of this work was to verify and reproduce experimental observations of Cold Nuclear Fusion (CNF), as originally reported in 1989. The method was to start with the original report and add such additional information as became available to build a set of operational electrolytic CNF cells. Verification was to be achieved by first observing cells for neutron production; for those cells that demonstrated a nuclear effect, careful calorimetric measurements were planned. The authors concluded, after laboratory experience, reading published work, talking with others in the field, and attending conferences, that CNF probably is a chimera and will go the way of N-rays and polywater. The neutron detector used for these tests was a completely packaged unit built into a metal suitcase that afforded electrostatic shielding for the detectors and self-contained electronics. It was battery-powered, although it was on charge for most of the long tests. The sensor element consists of 3He detectors arranged in three independent layers in a solid moderating block. The count from each of the three layers, as well as the sum of all the detectors, was brought out and recorded separately. The neutron measurements were made with both the neutron detector and the sample tested in a cave made of thick moderating material that surrounded the two units on the sides and bottom.

  6. Voltage verification unit

    DOEpatents

    Martin, Edward J.

    2008-01-15

    A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol that involves a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out; once the Lock Out/Tag Out is completed, testing or "trying" can be accomplished by simply reading a display on the voltage verification unit, without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  7. Design, analysis, and test verification of advanced encapsulation system

    NASA Technical Reports Server (NTRS)

    Garcia, A.; Minning, C.

    1981-01-01

    Procurement of 4 in x 4 in polycrystalline solar cells proceeded with some delays. A total of 1200 cells were procured for use in both the verification testing and qualification testing. Additional thermal structural analyses were run and the data are presented. An outline of the verification testing is included, with information on test specimen construction.

  8. Study on the Tritium Behaviors in the VHTR System. Part 1: Development of Tritium Analysis Code for VHTR and Verification

    SciTech Connect

    Eung Soo Kim; Chang Ho Oh; Mike Patterson

    2010-07-01

    A tritium permeation analysis code (TPAC) has been developed at Idaho National Laboratory (INL) using the MATLAB SIMULINK package for analysis of tritium behaviors in VHTRs integrated with hydrogen production and process heat application systems. The modeling is based on the mass balance of tritium-containing species and hydrogen (i.e., HT, H2, HTO, HTSO4, and TI) coupled with a variety of tritium source, sink, and permeation models. The code includes (1) tritium sources from ternary fission and neutron reactions with 6Li, 7Li, 10B, and 3He, (2) a tritium purification system, (3) leakage of tritium with coolant, (4) permeation through pipes, vessels, and heat exchangers, (5) an electrolyzer for high-temperature steam electrolysis (HTSE), and (6) isotope exchange for the SI process. Verification of the code has been performed by comparisons with analytical solutions, experimental data, and benchmark code results based on the Peach Bottom reactor design. The results showed that all the governing equations are well implemented into the code and correctly solved. This paper summarizes the background, theory, code structure, and some verification results related to the TPAC code development at Idaho National Laboratory (INL).
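
    A schematic of the kind of lumped mass balance TPAC implements in SIMULINK, sketched here as a single-node ODE with hypothetical rate constants; the source, purification, and permeation coefficients are illustrative, not TPAC values.

```python
# Hedged single-node sketch: tritium birth, decay, purification removal,
# and permeation loss lumped into one ODE. Rate constants are made up.
import numpy as np
from scipy.integrate import solve_ivp

LAMBDA_T = np.log(2) / (12.32 * 365.25 * 24 * 3600)  # tritium decay constant, 1/s

def dN_dt(t, N, source=1.0e12, k_purif=1.0e-5, k_perm=2.0e-6):
    # N: tritium atoms in the primary coolant (single lumped node).
    return source - (LAMBDA_T + k_purif + k_perm) * N

sol = solve_ivp(dN_dt, (0.0, 3.0e7), [0.0], rtol=1e-8)
print(sol.y[0, -1])  # approaches source / (LAMBDA_T + k_purif + k_perm)
```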

  9. Cognitive Bias in the Verification and Validation of Space Flight Systems

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Cognitive bias is generally recognized as playing a significant role in virtually all domains of human decision making. Insight into this role is informally built into many of the system engineering practices employed in the aerospace industry. The review process, for example, typically has features that help to counteract the effect of bias. This paper presents a discussion of how commonly recognized biases may affect the verification and validation process. Verifying and validating a system is arguably more challenging than development, both technically and cognitively. Whereas there may be a relatively limited number of options available for the design of a particular aspect of a system, there is a virtually unlimited number of potential verification scenarios that may be explored. The probability of any particular scenario occurring in operations is typically very difficult to estimate, which increases reliance on judgment that may be affected by bias. Implementing a verification activity often presents technical challenges that, if they can be overcome at all, often result in a departure from actual flight conditions (e.g., 1-g testing, simulation, time compression, artificial fault injection) that may raise additional questions about the meaningfulness of the results, and create opportunities for the introduction of additional biases. In addition to mitigating the biases it can introduce directly, the verification and validation process must also overcome the cumulative effect of biases introduced during all previous stages of development. A variety of cognitive biases will be described, with research results for illustration. A handful of case studies will be presented that show how cognitive bias may have affected the verification and validation process on recent JPL flight projects, identify areas of strength and weakness, and identify potential changes or additions to commonly used techniques that could provide a more robust verification and validation of

  10. Feasibility and Credibility of Bilingual Instructional Features Study Plans: Field Verification. Planning Paper 5. Bilingual Instructional Features Planning Study.

    ERIC Educational Resources Information Center

    Nieves-Squires, Sarah; And Others

    This report is one of five submitted as products of a "Study Designed to Assist Planning of Research on Significant Instructional Features in Bilingual Education Programs." The reports are intended to assist the National Institute of Education in its plans for a major new research study in bilingual education. The present report…

  11. Optimal Imaging for Treaty Verification

    SciTech Connect

    Brubaker, Erik; Hilton, Nathan R.; Johnson, William; Marleau, Peter; Kupinski, Matthew; MacGahan, Christopher Jonathan

    2014-09-01

    Future arms control treaty verification regimes may use radiation imaging measurements to confirm and track nuclear warheads or other treaty accountable items (TAIs). This project leverages advanced inference methods developed for medical and adaptive imaging to improve task performance in arms control applications. Additionally, we seek a method to acquire and analyze imaging data of declared TAIs without creating an image of those objects or otherwise storing or revealing any classified information. Such a method would avoid the use of classified-information barriers (IB).

  12. An Efficient Location Verification Scheme for Static Wireless Sensor Networks

    PubMed Central

    Kim, In-hwan; Kim, Bo-sung; Song, JooSeok

    2017-01-01

    In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they are not enough to eliminate wrong location estimations in some situations. Location verification can be the solution to these situations, or serve as a second line of defense. The problem with most location verification schemes is the explicit involvement of many sensors in the verification process and requirements such as special hardware, a dedicated verifier, and a trusted third party, which cause more communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between location claimant and verifier for the location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, when compared with other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect the malicious sensors over 90% of the time when sensors in the network have five or more neighbors. PMID:28125007

  13. An Efficient Location Verification Scheme for Static Wireless Sensor Networks.

    PubMed

    Kim, In-Hwan; Kim, Bo-Sung; Song, JooSeok

    2017-01-24

    In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they are not enough to eliminate wrong location estimations in some situations. Location verification can be the solution to these situations, or serve as a second line of defense. The problem with most location verification schemes is the explicit involvement of many sensors in the verification process and requirements such as special hardware, a dedicated verifier, and a trusted third party, which cause more communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between location claimant and verifier for the location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, when compared with other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect the malicious sensors over 90% of the time when sensors in the network have five or more neighbors.
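
    A hedged sketch of the geometric core of a mutually-shared-region check, under the assumption that the region is the intersection of the claimant's and verifier's communication disks; MSRLV's actual protocol messages and filtering steps are not reproduced here.

```python
# Sketch: accept a location claim only if the claimed point lies inside
# both nodes' communication disks (the "mutually-shared region").
import math

def in_shared_region(claimed, verifier_pos, claimant_pos, radio_range):
    """claimed/verifier_pos/claimant_pos are (x, y) tuples; radio_range in meters."""
    return (math.dist(claimed, verifier_pos) <= radio_range
            and math.dist(claimed, claimant_pos) <= radio_range)
```

    A full scheme would also need the claim exchange and neighbor consistency checks described in the paper; this shows only the region test itself.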

  14. Nuclear Data Verification and Standardization

    SciTech Connect

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for U.S. Department of Energy Division of Nuclear Physics programs, which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear-based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards, including international coordination. Testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  15. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 11 2010-01-01 2010-01-01 false Engineering and cost studies-addition of generation... TO ELECTRIC LOANS AND GUARANTEES Construction Work Plans and Related Studies § 1710.253 Engineering... engineering and cost studies as specified by RUS. The studies shall cover a period from the beginning of...

  16. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 11 2011-01-01 2011-01-01 false Engineering and cost studies-addition of generation... TO ELECTRIC LOANS AND GUARANTEES Construction Work Plans and Related Studies § 1710.253 Engineering... engineering and cost studies as specified by RUS. The studies shall cover a period from the beginning of...

  17. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 11 2013-01-01 2013-01-01 false Engineering and cost studies-addition of generation... TO ELECTRIC LOANS AND GUARANTEES Construction Work Plans and Related Studies § 1710.253 Engineering... engineering and cost studies as specified by RUS. The studies shall cover a period from the beginning of...

  18. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 11 2014-01-01 2014-01-01 false Engineering and cost studies-addition of generation... TO ELECTRIC LOANS AND GUARANTEES Construction Work Plans and Related Studies § 1710.253 Engineering... engineering and cost studies as specified by RUS. The studies shall cover a period from the beginning of...

  19. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 11 2012-01-01 2012-01-01 false Engineering and cost studies-addition of generation... TO ELECTRIC LOANS AND GUARANTEES Construction Work Plans and Related Studies § 1710.253 Engineering... engineering and cost studies as specified by RUS. The studies shall cover a period from the beginning of...

  20. Part-whole bias in intertemporal choice: An empirical study of additive assumption

    NASA Astrophysics Data System (ADS)

    Lu, Yang; Wu, Dongmei; Zhuang, Xintian

    2016-12-01

    The additive assumption holds that the overall value of multiple-dated outcomes is a simple aggregation of the values of each individual outcome. This assumption is generally accepted in the field of intertemporal choice. However, recent studies show the additive assumption is questionable. In this paper, we experimentally tested the additive property of multiple-dated monetary rewards. Our results show: (1) the additive assumption does not hold regardless of gain or loss; (2) the sum of the subjective values of individual rewards is consistently larger than the valuation placed on the same rewards as a whole. This finding suggests that part-whole bias exists in the context of the valuation of intertemporal monetary rewards.
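
    Stated compactly, the assumption under test and the reported violation are:

```latex
% The additive assumption: the value of a bundle of dated outcomes
% (x_i, t_i) equals the sum of its separately valued parts.
V(x_1,t_1;\, x_2,t_2;\, \dots;\, x_n,t_n) \;=\; \sum_{i=1}^{n} v(x_i, t_i)
% The reported part-whole bias is the systematic violation
% \sum_i v(x_i, t_i) \;>\; V(\text{whole}).
```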

  1. Verification of heterogeneous multi-agent system using MCMAS

    NASA Astrophysics Data System (ADS)

    Choi, Jiyoung; Kim, Seungkeun; Tsourdos, Antonios

    2015-03-01

    The focus of the paper is how to model the autonomous behaviours of heterogeneous multi-agent systems such that it can be verified that they will always operate within predefined mission requirements and constraints. This is done using formal methods: an abstraction for modelling the behaviours, and model checking for their verification. Three case studies are presented to verify the decision-making behaviours of a heterogeneous multi-agent system using a convoy mission scenario. The multi-agent system in the case studies is extended by gradually increasing the number of agents and the function complexity. For automatic verification, the model checker for multi-agent systems (MCMAS) is adopted due to its novel capability to accommodate multi-agent systems; it successfully verifies the targeted behaviours of the team-level autonomous systems. The verification results retrospectively help improve the design of the decision-making algorithms by considering additional agents and behaviours during the three steps of scenario modification. Consequently, the last scenario deals with a system composed of a ground control system, two unmanned aerial vehicles, and four unmanned ground vehicles with fault-tolerant and communication relay capabilities.

  2. WE-EF-303-06: Feasibility of PET Image-Based On-Line Proton Beam-Range Verification with Simulated Uniform Phantom and Human Brain Studies

    SciTech Connect

    Lou, K; Sun, X; Zhu, X; Grosshans, D; Clark, J; Shao, Y

    2015-06-15

    Purpose: To study the feasibility of clinical on-line proton beam range verification with PET imaging. Methods: We simulated a 179.2-MeV proton beam with 5-mm diameter irradiating a PMMA phantom of human brain size, which was then imaged by a brain PET with 300 x 300 x 100-mm³ FOV and different system sensitivities and spatial resolutions. We calculated the mean and standard deviation of the positron activity range (AR) from reconstructed PET images, with respect to different data acquisition times (from 5 sec to 300 sec in 5-sec steps). We also developed a technique, "Smoothed Maximum Value (SMV)", to improve AR measurement under a given dose. Furthermore, we simulated a human brain irradiated by a 110-MeV proton beam of 50-mm diameter with 0.3-Gy dose at the Bragg peak and imaged by the above PET system with 40% system sensitivity at the center of the FOV and 1.7-mm spatial resolution. Results: MC simulations on the PMMA phantom showed that, regardless of PET system sensitivities and spatial resolutions, the accuracy and precision of AR were proportional to the reciprocal of the square root of the image count if image smoothing was not applied. With image smoothing or the SMV method, the accuracy and precision could be substantially improved. For a cylindrical PMMA phantom (200 mm diameter and 290 mm long), the accuracy and precision of AR measurement could reach 1.0 and 1.7 mm, with 100 sec of data acquired by the brain PET. The study with a human brain showed it was feasible to achieve sub-millimeter accuracy and precision of AR measurement with an acquisition time within 60 sec. Conclusion: This study established the relationship between count statistics and the accuracy and precision of activity-range verification. It showed the feasibility of clinical on-line BR verification with high-performance PET systems and improved AR measurement techniques. Cancer Prevention and Research Institute of Texas grant RP120326, NIH grant R21CA187717, The Cancer Center Support (Core) Grant CA

  3. SU-E-J-82: Intra-Fraction Proton Beam-Range Verification with PET Imaging: Feasibility Studies with Monte Carlo Simulations and Statistical Modeling

    SciTech Connect

    Lou, K; Mirkovic, D; Sun, X; Zhu, X; Poenisch, F; Grosshans, D; Shao, Y; Clark, J

    2014-06-01

    Purpose: To study the feasibility of intra-fraction proton beam-range verification with PET imaging. Methods: Two homogeneous cylindrical PMMA phantoms (290 mm axial length; 38 mm and 200 mm diameter, respectively) were studied using PET imaging: the small phantom with a mouse-sized PET scanner (61 mm diameter field of view (FOV)) and the larger phantom with a human-brain-sized PET scanner (300 mm FOV). Monte Carlo (MC) simulations (MCNPX and GATE) were used to simulate 179.2 MeV proton pencil beams irradiating the two phantoms and being imaged by the two PET systems. A total of 50 simulations were conducted to generate 50 positron activity distributions and, correspondingly, 50 measured activity ranges. The accuracy and precision of these activity ranges were calculated under different conditions (including count statistics and other factors, such as crystal cross-section). Separate from the MC simulations, an activity distribution measured from a simulated PET image was modeled as a noiseless positron activity distribution corrupted by Poisson counting noise. The results from these two approaches were compared to assess the impact of count statistics on the accuracy and precision of activity-range calculations. Results: MC simulations show that the accuracy and precision of an activity range are dominated by the number (N) of coincidence events in the reconstructed image; both improve with N, with the error scaling as 1/sqrt(N), which can be understood from the statistical modeling. MC simulations also indicate that the coincidence events acquired within the first 60 seconds with 10⁹ protons (small phantom) and 10¹⁰ protons (large phantom) are sufficient to achieve both sub-millimeter accuracy and precision. Conclusion: Under the current MC simulation conditions, the initial study indicates that the accuracy and precision of beam-range verification are dominated by count statistics, and intra-fraction PET image-based beam-range verification is
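
    The Poisson-statistics reasoning behind the 1/sqrt(N) scaling in both of the preceding abstracts can be stated compactly (sigma_1, the single-event range uncertainty, is an assumed parameter, not a value from either study):

```latex
% If sigma_1 denotes the activity-range uncertainty implied by a single
% coincidence event, Poisson counting statistics give
\sigma_{AR}(N) \;\approx\; \frac{\sigma_1}{\sqrt{N}}
% so halving the range uncertainty requires roughly four times the counts.
```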

  4. Neuropsychological tests for the diagnosis of Alzheimer’s disease dementia and other dementias: a generic protocol for cross-sectional and delayed-verification studies

    PubMed Central

    Davis, Daniel HJ; Creavin, Sam T; Noel-Storr, Anna; Quinn, Terry J; Smailagic, Nadja; Hyde, Chris; Brayne, Carol; McShane, Rupert; Cullum, Sarah

    2014-01-01

    This is the protocol for a review and there is no abstract. The objectives are as follows: To determine the cross-sectional diagnostic accuracy of [index test] at various thresholds for ADD and other dementias [target condition] in [target population].ORTo determine the accuracy of [index test] at various thresholds for diagnosing ADD and other dementias [target condition] in [target population] after a follow-up period (delayed-verification studies).To investigate the heterogeneity of test accuracy in the included studies.To highlight the quality and quantity of research evidence available on the effectiveness of the index test in the target population.To identify gaps in the evidence and determine where further research is required. PMID:25177209

  5. Explaining Verification Conditions

    NASA Technical Reports Server (NTRS)

    Deney, Ewen; Fischer, Bernd

    2006-01-01

    The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support to trace, analyze, and understand the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.
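
    For illustration (an example constructed here, not taken from the paper), the assignment axiom {P[e/x]} x := e {P} turns a small Hoare triple into a VC:

```latex
% Example triple and the verification condition it generates:
\{\, x \ge 0 \,\}\quad x := x + 1 \quad \{\, x \ge 1 \,\}
\quad\leadsto\quad
\text{VC: } x \ge 0 \;\Rightarrow\; x + 1 \ge 1
% A label attached to the assignment rule at this step is what would let
% the rendered explanation report which statement and rule produced the VC.
```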

  6. Nuclear disarmament verification

    SciTech Connect

    DeVolpi, A.

    1993-12-31

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  7. Voice verification upgrade

    NASA Astrophysics Data System (ADS)

    Davis, R. L.; Sinnamon, J. T.; Cox, D. L.

    1982-06-01

    This contractor had two major objectives. The first was to build, test, and deliver to the government an entry control system using speaker verification (voice authentication) as the mechanism for verifying the user's claimed identity. This system included a physical mantrap, with an integral weight scale to prevent more than one user from gaining access with one verification (tailgating). The speaker verification part of the entry control system contained all the updates and embellishments to the algorithm that was developed earlier for the BISS (Base and Installation Security System) system under contract with the Electronic Systems Division of the USAF. These updates were tested prior to and during the contract on an operational system used at Texas Instruments in Dallas, Texas, for controlling entry to the Corporate Information Center (CIC).

  8. Monte Carlo patient study on the comparison of prompt gamma and PET imaging for range verification in proton therapy

    NASA Astrophysics Data System (ADS)

    Moteabbed, M.; España, S.; Paganetti, H.

    2011-02-01

    The purpose of this work was to compare the clinical adaptation of prompt gamma (PG) imaging and positron emission tomography (PET) as independent tools for non-invasive proton beam range verification and treatment validation. The PG range correlation and its differences with PET have been modeled for the first time in a highly heterogeneous tissue environment, using different field sizes and configurations. Four patients with different tumor locations (head and neck, prostate, spine and abdomen) were chosen to compare the site-specific behaviors of the PG and PET images, using both passively scattered and pencil beam fields. Accurate reconstruction of dose, PG and PET distributions was achieved by using the planning computed tomography (CT) image in a validated GEANT4-based Monte Carlo code capable of modeling the treatment nozzle and patient anatomy in detail. The physical and biological washout phenomenon and decay half-lives for PET activity for the most abundant isotopes such as 11C, 15O, 13N, 30P and 38K were taken into account in the data analysis. The attenuation of the gamma signal after traversing the patient geometry and respective detection efficiencies were estimated for both methods to ensure proper comparison. The projected dose, PG and PET profiles along many lines in the beam direction were analyzed to investigate the correlation consistency across the beam width. For all subjects, the PG method showed on average approximately 10 times higher gamma production rates than the PET method before, and 60 to 80 times higher production after including the washout correction and acquisition time delay. This rate strongly depended on tissue density and elemental composition. For broad passively scattered fields, it was demonstrated that large differences exist between PG and PET signal falloff positions and the correlation with the dose distribution for different lines in the beam direction. These variations also depended on the treatment site and the

  9. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  10. [PIV: a computer-aided portal image verification system].

    PubMed

    Fu, Weihua; Zhang, Hongzhi; Wu, Jing

    2002-12-01

    Portal image verification (PIV) is one of the key actions in the QA procedure for sophisticated, accurate radiotherapy. The purpose of this study was to develop PIV software as a tool for improving the accuracy and visualization of portal field verification and for computing field placement errors. PIV was developed in the Visual C++ integrated environment under the Windows 95 operating system. It improves visualization by providing tools for image processing and multimode image display. Semi-automatic registration methods make verification more accurate than the view-box method. It can provide useful quantitative errors for regular fields. PIV is flexible and accurate, and is an effective tool for portal field verification.

  11. General Environmental Verification Specification

    NASA Technical Reports Server (NTRS)

    Milne, J. Scott, Jr.; Kaufman, Daniel S.

    2003-01-01

    The NASA Goddard Space Flight Center's General Environmental Verification Specification (GEVS) for STS and ELV Payloads, Subsystems, and Components is currently being revised based on lessons learned from GSFC engineering and flight assurance. The GEVS has been used by Goddard flight projects for the past 17 years as a baseline from which to tailor their environmental test programs. A summary of the requirements and updates is presented, along with the rationale behind the changes. The major test areas covered by the GEVS include mechanical, thermal, and EMC, as well as more general requirements for planning and tracking of the verification programs.

  12. Voice Verification Upgrade.

    DTIC Science & Technology

    1982-06-01

    to develop speaker verification techniques for use over degraded communication channels -- specifically telephone lines. A test of BISS type speaker...verification technology was performed on a degraded channel and compensation techniques were then developed. The fifth program (Total Voice SV...). R. L. Davis, J. T. Sinnamon, and D. L. Cox, Rome Air Development Center. [Remainder of the scanned excerpt is illegible.]

  13. Cleaning and Cleanliness Verification Techniques for Mars Returned Sample Handling

    NASA Technical Reports Server (NTRS)

    Mickelson, E. T.; Lindstrom, D. J.; Allton, J. H.; Hittle, J. D.

    2002-01-01

    Precision cleaning and cleanliness verification techniques are examined as a subset of a comprehensive contamination control strategy for a Mars sample return mission. Additional information is contained in the original extended abstract.

  14. Patient Study of In Vivo Verification of Beam Delivery and Range, Using Positron Emission Tomography and Computed Tomography Imaging After Proton Therapy

    SciTech Connect

    Parodi, Katia . E-mail: Katia.Parodi@med.uni-heidelberg.de; Paganetti, Harald; Shih, Helen A.; Michaud, Susan; Loeffler, Jay S.; DeLaney, Thomas F.; Liebsch, Norbert J.; Munzenrider, John E.; Fischman, Alan J.; Knopf, Antje; Bortfeld, Thomas

    2007-07-01

    Purpose: To investigate the feasibility and value of positron emission tomography and computed tomography (PET/CT) for treatment verification after proton radiotherapy. Methods and Materials: This study included 9 patients with tumors in the cranial base, spine, orbit, and eye. Total doses of 1.8-3 GyE and 10 GyE (for an ocular melanoma) per fraction were delivered in 1 or 2 fields. Imaging was performed with a commercial PET/CT scanner for 30 min, starting within 20 min after treatment. The same treatment immobilization device was used during imaging for all but 2 patients. Measured PET/CT images were coregistered to the planning CT and compared with the corresponding PET expectation, obtained from CT-based Monte Carlo calculations complemented by functional information. For the ocular case, the treatment position was approximately replicated, and spatial correlation was deduced from reference clips visible in both the planning radiographs and the imaging CT. Here, the expected PET image was obtained from an analytical model. Results: Good spatial correlation and quantitative agreement within 30% were found between the measured and expected activity. For head-and-neck patients, the beam range could be verified with an accuracy of 1-2 mm in well-coregistered bony structures. The low spine and eye sites indicated the need for better fixation and coregistration methods. An analysis of activity decay revealed tissue-effective half-lives of 800-1,150 s. Conclusions: This study demonstrates the feasibility of postradiation PET/CT for in vivo treatment verification. It also indicates some technological and methodological improvements needed for optimal clinical application.

  15. Verification and Validation of RADTRAN 5.5.

    SciTech Connect

    Osborn, Douglas.; Weiner, Ruth F.; Mills, George Scott; Hamp, Steve C.

    2005-02-01

    This document contains a description of the verification and validation process used for the RADTRAN 5.5 code. The verification and validation process ensured that the proper calculational models and mathematical and numerical methods were used in the RADTRAN 5.5 code for the determination of risk and consequence assessments. The differences between RADTRAN 5 and RADTRAN 5.5 are the addition of tables, an expanded isotope library, and an additional user-defined meteorological option for accident dispersion.

  16. Evaluating Drugs and Food Additives for Public Use: A Case Studies Approach.

    ERIC Educational Resources Information Center

    Merritt, Sheridan V.

    1980-01-01

    Described is a case study used in an introductory college biology course that provides a basis for generating debate on an issue concerning the regulation of controversial food additives and prescription drugs. The case study contained within this article deals with drug screening, specifically with information related to thalidomide. (CS)

  17. Code Verification of the HIGRAD Computational Fluid Dynamics Solver

    SciTech Connect

    Van Buren, Kendra L.; Canfield, Jesse M.; Hemez, Francois M.; Sauer, Jeremy A.

    2012-05-04

    The purpose of this report is to outline code and solution verification activities applied to HIGRAD, a Computational Fluid Dynamics (CFD) solver of the compressible Navier-Stokes equations developed at the Los Alamos National Laboratory, and used to simulate various phenomena such as the propagation of wildfires and atmospheric hydrodynamics. Code verification efforts, as described in this report, are an important first step to establish the credibility of numerical simulations. They provide evidence that the mathematical formulation is properly implemented without significant mistakes that would adversely impact the application of interest. Highly accurate analytical solutions are derived for four code verification test problems that exercise different aspects of the code. These test problems are referred to as: (i) the quiet start, (ii) the passive advection, (iii) the passive diffusion, and (iv) the piston-like problem. These problems are simulated using HIGRAD with different levels of mesh discretization and the numerical solutions are compared to their analytical counterparts. In addition, the rates of convergence are estimated to verify the numerical performance of the solver. The first three test problems produce numerical approximations as expected. The fourth test problem (piston-like) indicates the extent to which the code is able to simulate a 'mild' discontinuity, which is a condition that would typically be better handled by a Lagrangian formulation. The current investigation concludes that the numerical implementation of the solver performs as expected. The quality of solutions is sufficient to provide credible simulations of fluid flows around wind turbines. The main caveat associated with these findings is the low coverage provided by these four problems, and the somewhat limited verification activities. A more comprehensive evaluation of HIGRAD may be beneficial for future studies.
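
    The convergence-rate estimate mentioned above has a standard form; a small sketch follows (the error values are illustrative, not HIGRAD results):

```python
# Observed order of accuracy from errors on two successively refined
# meshes, as used in code verification against an exact solution.
import math

def observed_order(err_coarse, err_fine, refinement_ratio=2.0):
    """Observed order p, given errors on two meshes refined by a constant ratio."""
    return math.log(err_coarse / err_fine) / math.log(refinement_ratio)

# e.g. halving h should cut the error ~4x for a second-order scheme:
print(observed_order(1.0e-3, 2.5e-4))  # ~2.0
```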

  18. Computer Graphics Verification

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Video processing creates technical animation sequences using studio-quality equipment to realistically represent fluid flow over space shuttle surfaces, helicopter rotors, and turbine blades. Computer Systems Co-op Tim Weatherford is shown performing computer graphics verification, as part of a Co-op brochure.

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    This presentation will be given at the EPA Science Forum 2005 in Washington, DC. The Environmental Technology Verification Program (ETV) was initiated in 1995 to speed implementation of new and innovative commercial-ready environmental technologies by providing objective, 3rd pa...

  20. FPGA Verification Accelerator (FVAX)

    NASA Technical Reports Server (NTRS)

    Oh, Jane; Burke, Gary

    2008-01-01

    Is Verification Acceleration Possible? - Increasing the visibility of the internal nodes of the FPGA results in much faster debug time - Forcing internal signals directly allows a problem condition to be set up very quickly. Is this all? - No, this is part of a comprehensive effort to improve the JPL FPGA design and V&V process.

  1. Exomars Mission Verification Approach

    NASA Astrophysics Data System (ADS)

    Cassi, Carlo; Gilardi, Franco; Bethge, Boris

    According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions: A first mission will be launched in 2016 under ESA lead, with the objectives to demonstrate the European capability to safely land a surface package on Mars, to perform Mars atmosphere investigation, and to provide communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an "Entry Descent & Landing Demonstrator Module (EDM)" and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the orbiter for Mars atmosphere characterisation. A second mission, with its launch foreseen in 2018, is led by NASA, who provides the spacecraft and launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 meters, collecting samples and investigating them for signs of past and present life with exobiological experiments, and investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration, and verification of the ESA ExoMars modules, i.e.: the spacecraft composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission, and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process of the above products is quite complex and will include some peculiarities with limited or no heritage in Europe. Furthermore, the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints. The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the verification activities flow and the sharing of tests

  2. [TG-FTIR study on pyrolysis of wheat-straw with abundant CaO additives].

    PubMed

    Han, Long; Wang, Qin-Hui; Yang, Yu-Kun; Yu, Chun-Jiang; Fang, Meng-Xiang; Luo, Zhong-Yang

    2011-04-01

    Biomass pyrolysis in the presence of abundant CaO additives is a fundamental process prior to CaO sorption-enhanced gasification in a biomass-based zero-emission system. In the present study, thermogravimetric Fourier transform infrared (TG-FTIR) analysis was adopted to examine the effects of CaO additives on the mass loss process and volatiles evolution of wheat-straw pyrolysis. Observations from the TG and FTIR analyses simultaneously demonstrated a two-stage process for CaO-catalyzed wheat-straw pyrolysis, different from the single-stage process for pure wheat-straw pyrolysis. CaO additives could not only absorb the released CO2 but also reduce the yields of tar species such as toluene, phenol, and formic acid in the first stage, resulting in decreased mass loss and maximum mass loss rate in this stage with increasing CaO addition. The second stage was attributed to CaCO3 decomposition, and its mass loss and maximum mass loss rate increased with an increasing amount of CaO additives. The results of the present study demonstrated the great potential of CaO additives to capture CO2 and reduce tar yields in a biomass-based zero-emission system. The gasification temperature in the system should be kept low enough to avoid CaCO3 decomposition.
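
    The chemistry behind the two stages is the standard CaO carbonation/calcination pair, stated here for clarity rather than as a new result:

```latex
% First stage: in-situ CO2 capture by the sorbent during pyrolysis.
\mathrm{CaO} + \mathrm{CO_2} \;\longrightarrow\; \mathrm{CaCO_3}
% Second stage: calcination (carbonate decomposition) at higher temperature.
\mathrm{CaCO_3} \;\longrightarrow\; \mathrm{CaO} + \mathrm{CO_2}
```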

  3. Modular verification of concurrent systems

    SciTech Connect

    Sobel, A.E.K.

    1986-01-01

    During the last ten years, a number of authors have proposed verification techniques that allow one to prove properties of individual processes by using global assumptions about the behavior of the remaining processes in the distributed program. As a result, one must justify these global assumptions before drawing any conclusions regarding the correctness of the entire program. This justification is often the most difficult part of the proof and presents a serious obstacle to hierarchical program development. This thesis develops a new approach to the verification of concurrent systems. The approach is modular and supports compositional development of programs, since the proofs of each individual process of a program are completely isolated from all others. The generality of this approach is illustrated by applying it to a representative set of contemporary concurrent programming languages, namely CSP, ADA, Distributed Processes, and a shared-variable language. In addition, it is also shown how the approach may be used to deal with a number of other constructs that have been proposed for inclusion in concurrent languages: FORK and JOIN primitives, nested monitor calls, path expressions, atomic transactions, and asynchronous message passing. These results support the argument that the approach is universal and can be used to design proof systems for any concurrent language.

  4. Study raises questions about measurement of 'additionality,'or maintaining domestic health spending amid foreign donations.

    PubMed

    Garg, Charu C; Evans, David B; Dmytraczenko, Tania; Izazola-Licea, José-Antonio; Tangcharoensathien, Viroj; Ejeder, Tessa Tan-Torres

    2012-02-01

    Donor nations and philanthropic organizations increasingly require that funds provided for a specific health priority such as HIV should supplement domestic spending on that priority-a concept known as "additionality." We investigated the "additionality" concept using data from Honduras, Rwanda, and Thailand, and we found that the three countries increased funding for HIV in response to increased donor funding. In contrast, the study revealed that donors, faced with increased Global Fund resources for HIV in certain countries, tended to decrease their funding for HIV or shift funds for use in non-HIV health areas. More broadly, we found many problems in the measurement and interpretation of additionality. These findings suggest that it would be preferable for donors and countries to agree on how best to use available domestic and external funds to improve population health, and to develop better means of tracking outcomes, than to try to develop more sophisticated methods to track additionality.

  5. Improved Verification for Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Powell, Mark A.

    2008-01-01

    Aerospace systems are subject to many stringent performance requirements to be verified with low risk. This report investigates verification planning using conditional approaches vice the standard classical statistical methods, and usage of historical surrogate data for requirement validation and in verification planning. The example used in this report to illustrate the results of these investigations is a proposed mission assurance requirement with the concomitant maximum acceptable verification risk for the NASA Constellation Program Orion Launch Abort System (LAS). This report demonstrates the following improvements: 1) verification planning using conditional approaches vice classical statistical methods results in plans that are more achievable and feasible; 2) historical surrogate data can be used to bound validation of performance requirements; and, 3) incorporation of historical surrogate data in verification planning using conditional approaches produces even less costly and more reasonable verification plans. The procedures presented in this report may produce similar improvements and cost savings in verification for any stringent performance requirement for an aerospace system.
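
    For context, a hedged sketch of the classical baseline such planning improves upon (a standard success-run calculation, not taken from the report): with n consecutive failure-free verification trials, demonstrated reliability R at confidence C satisfies:

```latex
% Zero-failure (success-run) demonstration formula.
C \;=\; 1 - R^{\,n}
\qquad\Longrightarrow\qquad
n \;=\; \frac{\ln(1 - C)}{\ln R}
% e.g. R = 0.999 at C = 0.9 requires n \approx 2302 failure-free trials,
% which illustrates why conditional approaches can yield more feasible plans.
```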

  6. [Bootstrap method-based estimation on the confidence interval for additive interaction in cohort studies].

    PubMed

    Pan, Jin-ren; Chen, Kun

    2010-07-01

    Interaction assessment is an important step in epidemiological analysis. When an etiological study is carried out, logarithmic models such as the logistic model or the Cox proportional hazards model are commonly used to estimate the independent effects of risk factors. However, estimating the interaction between risk factors from the regression coefficient of the product term is done on a multiplicative scale, whereas for public-health purposes it should be done on an additive scale, i.e., as departure from additivity. This paper illustrates, with an example from a cohort study, fitting a Cox proportional hazards model to estimate the three measures of additive interaction presented by Rothman. Using the S-Plus application with its built-in bootstrap function, it is convenient to estimate the confidence interval for additive interaction. Furthermore, this method can avoid the exaggerated estimates obtained by using ORs in a cohort study, and so gains better precision.
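
    A hedged sketch of the computation the abstract describes, using Rothman's three additive-interaction measures and a percentile bootstrap; the paper used S-Plus and a Cox model, whereas this illustration uses simple risk ratios in Python with an assumed data layout.

```python
# Rothman's additive-interaction measures with a percentile bootstrap CI.
# Data layout, sizes, and the use of risk ratios are illustrative assumptions.
import numpy as np

def interaction_measures(rr10, rr01, rr11):
    reri = rr11 - rr10 - rr01 + 1                  # relative excess risk due to interaction
    ap = reri / rr11                               # attributable proportion
    s = (rr11 - 1) / ((rr10 - 1) + (rr01 - 1))     # synergy index
    return reri, ap, s

def bootstrap_reri(events, n_boot=2000, seed=0):
    """events: dict stratum -> 0/1 outcome array, strata '00', '10', '01', '11'.

    Assumes a non-zero baseline risk in each resample (a sketch, not
    production code). Returns the 95% percentile CI for RERI.
    """
    rng = np.random.default_rng(seed)
    out = []
    for _ in range(n_boot):
        risk = {k: rng.choice(v, size=v.size, replace=True).mean()
                for k, v in events.items()}
        rr10, rr01, rr11 = (risk[k] / risk["00"] for k in ("10", "01", "11"))
        out.append(interaction_measures(rr10, rr01, rr11)[0])
    return np.percentile(out, [2.5, 97.5])
```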

  7. Comparative study of electrolyte additives using electrochemical impedance spectroscopy on symmetric cells

    NASA Astrophysics Data System (ADS)

    Petibon, R.; Sinha, N. N.; Burns, J. C.; Aiken, C. P.; Ye, Hui; VanElzen, Collette M.; Jain, Gaurav; Trussler, S.; Dahn, J. R.

    2014-04-01

    The effect of various electrolyte additives and additive combinations, added to a 1 M LiPF6 EC:EMC electrolyte, on the positive and negative electrode surfaces of one-year-old wound LiCoO2/graphite and Li[Ni0.4Mn0.4Co0.2]O2/graphite cells was studied using electrochemical impedance spectroscopy (EIS) on symmetric cells. The additives tested were vinylene carbonate (VC), trimethoxyboroxine (TMOBX), fluoroethylene carbonate (FEC), lithium bis(trifluoromethanesulfonyl)imide (LiTFSI), and H2O, alone or in combination. In general, compared to the control electrolyte, the additives tested reduced the impedance of the positive electrode and increased the impedance of the negative electrode, with the exception of LiTFSI in Li[Ni0.4Mn0.4Co0.2]O2/graphite wound cells. A higher charge voltage led to higher positive electrode impedance, with the exceptions of 2% VC + 2% FEC and 2% LiTFSI. In some cases, an additive that controlled SEI formation at one electrode when mixed with one co-additive instead shared SEI formation at that electrode when mixed with a different co-additive.

  8. Generating Scenarios of Addition and Subtraction: A Study of Japanese University Students

    ERIC Educational Resources Information Center

    Kinda, Shigehiro

    2013-01-01

    Students are presented with problems involving three scenario types of addition and subtraction in elementary mathematics: one dynamic ("Change") and two static ("Combine, Compare"). Previous studies have indicated that the dynamic type is easier for school children, whereas the static types are more difficult and comprehended only gradually…

  9. Performance on Addition and Subtraction Problems: Results from Individual Interviews - Sandy Bay Study.

    ERIC Educational Resources Information Center

    Romberg, Thomas A.; And Others

    The purpose of this study was to relate children's cognitive processing capabilities and their grade level to their performance and to the strategies they used when working addition and subtraction problems. From two sets of data which assessed memory capacity and cognitive processing capacities, six groups of children with different cognitive…

  10. Using E-Learning to Enhance the Learning of Additional Languages--A Pilot Comparative Study

    ERIC Educational Resources Information Center

    Hilton, Gillian L. S.

    2013-01-01

    This paper is concerned with a small pilot study to ascertain the use of, and changes in the use of e-learning to promote the learning of foreign and additional languages in a variety of countries in Europe. It was undertaken by individual researchers in an attempt to examine how the drive towards the teaching of new languages, encouraged by the…

  11. Industry research on the use and effects of levulinic acid: a case study in cigarette additives.

    PubMed

    Keithly, Lois; Ferris Wayne, Geoffrey; Cullen, Doris M; Connolly, Gregory N

    2005-10-01

    Public health officials and tobacco researchers have raised concerns about the possible contributions of additives to the toxicity of cigarettes. However, little attention has been given to the process whereby additives promote initiation and addiction. Levulinic acid is a known cigarette additive. Review of internal tobacco industry documents indicates that levulinic acid was used to increase nicotine yields while enhancing perceptions of smoothness and mildness. Levulinic acid reduces the pH of cigarette smoke and desensitizes the upper respiratory tract, increasing the potential for cigarette smoke to be inhaled deeper into the lungs. Levulinic acid also may enhance the binding of nicotine to neurons that ordinarily would be unresponsive to nicotine. These findings held particular interest in the internal development of ultralight and so-called reduced-exposure cigarette prototypes. Industry studies found significantly increased peak plasma nicotine levels in smokers of ultralight cigarettes following addition of levulinic acid. Further, internal studies observed changes in mainstream and sidestream smoke composition that may present increased health risks. The use of levulinic acid illustrates the need for regulatory authority over tobacco products as well as better understanding of the role of additives in cigarettes and other tobacco products.

  12. Anatomically ordered tapping interferes more with one-digit addition than two-digit addition: a dual-task fMRI study.

    PubMed

    Soylu, Firat; Newman, Sharlene D

    2016-02-01

    Fingers are used as canonical representations for numbers across cultures. In previous imaging studies, it was shown that arithmetic processing activates neural resources that are known to participate in finger movements. Additionally, in one dual-task study, it was shown that anatomically ordered finger tapping disrupts addition and subtraction more than multiplication, possibly due to a long-lasting effect of early finger-counting experiences on the neural correlates and organization of addition and subtraction processes. How arithmetic task difficulty and tapping complexity affect concurrent performance is still unclear. If early finger-counting experiences have bearing on the neural correlates of arithmetic in adults, then one would expect anatomically and non-anatomically ordered tapping to have different interference effects, given that finger counting is usually anatomically ordered. To unravel these issues, we studied how (1) arithmetic task difficulty and (2) the complexity of the finger-tapping sequence (anatomical vs. non-anatomical ordering) affect concurrent performance and the use of key neural circuits, using a mixed block/event-related dual-task fMRI design with adult participants. The results suggest that the complexity of the tapping sequence modulates interference on addition, and that one-digit addition (fact retrieval), compared to two-digit addition (calculation), is more affected by anatomically ordered tapping. The region-of-interest analysis showed higher left angular gyrus BOLD response for one-digit than for two-digit addition, and in no-tapping than in dual-tapping conditions. The results support a specific association between addition fact retrieval and anatomically ordered finger movements in adults, possibly due to finger-counting strategies that deploy anatomically ordered finger movements early in development.

  13. PBL Verification with Radiosonde and Aircraft Data

    NASA Astrophysics Data System (ADS)

    Tsidulko, M.; McQueen, J.; Dimego, G.; Ek, M.

    2008-12-01

    Boundary layer depth is an important characteristic in weather forecasting, and it is a key parameter in air quality modeling, determining the extent of turbulence and dispersion of pollutants. Real-time PBL depths from the NAM (WRF/NMM) model are verified against different types of observations. PBL depth verification is incorporated into the NCEP verification system, including the ability to provide a range of statistical characteristics for boundary layer heights. For the model, several boundary layer definitions are used: PBL height from the TKE scheme, the critical Ri number approach, and mixed layer depth are compared with observations. Observed PBL depths are determined by applying the Ri number approach to radiosonde profiles. A preliminary study of using ACARS data for PBL verification is also presented.
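
    As a concrete illustration of the observational side of this verification, the sketch below diagnoses PBL depth from a synthetic radiosonde profile with the bulk Richardson number method; the critical value Ri_c = 0.25 and the toy profile are our assumptions, not the paper's configuration.

```python
# A hedged sketch of the bulk Richardson number approach to PBL depth.
import numpy as np

G = 9.81  # m s^-2

def pbl_height_bulk_ri(z, theta_v, u, v, ri_crit=0.25):
    """Return the lowest height where the bulk Richardson number, computed
    relative to the surface, first exceeds ri_crit."""
    dtheta = theta_v - theta_v[0]
    wind2 = u**2 + v**2
    with np.errstate(divide="ignore", invalid="ignore"):
        ri_b = (G / theta_v[0]) * dtheta * (z - z[0]) / wind2
    above = np.where(ri_b > ri_crit)[0]
    if above.size == 0:
        return np.nan
    k = above[0]
    # linear interpolation between the bracketing levels
    f = (ri_crit - ri_b[k - 1]) / (ri_b[k] - ri_b[k - 1])
    return z[k - 1] + f * (z[k] - z[k - 1])

# toy sounding: well-mixed layer to ~800 m capped by an inversion
z = np.arange(0.0, 3000.0, 50.0)
theta_v = 300.0 + np.where(z < 800.0, 0.0, 0.006 * (z - 800.0))
u = 2.0 + 0.003 * z
v = np.zeros_like(z)
print(f"diagnosed PBL height: {pbl_height_bulk_ri(z, theta_v, u, v):.0f} m")
```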

  14. Oxidative addition of methane and benzene C-H bonds to a rhodium center: A DFT study

    NASA Astrophysics Data System (ADS)

    Bi, Siwei; Zhang, Zhenwei; Zhu, Shufen

    2006-11-01

    A density functional theory study of the mechanisms of the oxidative addition of methane and benzene C-H bonds to a rhodium center containing Cp and PMe3 ligands has been performed. Our calculated results confirm that the C-H bond cleavage from a sigma complex to a hydride alkyl complex is the rate-determining step. Compared with the methane C-H bond, the oxidative addition of the benzene C-H bond is more favorable both kinetically and thermodynamically. Stronger back-donation from the metal center into the σ* antibonding orbital of the benzene C-H bond is responsible for these observations.

  15. Implications of Non-Systematic Observations for Verification of Forecasts of Aviation Weather Variables

    NASA Astrophysics Data System (ADS)

    Brown, B. G.; Young, G. S.; Fowler, T. L.

    2001-12-01

    Over the last several years, efforts have been undertaken to develop improved automated forecasts of weather phenomena that have large impacts on aviation, including turbulence and in-flight icing conditions. Verification of these forecasts - which has played a major role in their development - is difficult due to the nature of the limited observations available for these evaluations, in particular voice reports by pilots (PIREPs). These reports, which are provided inconsistently by pilots, currently are the best observations of turbulence and in-flight icing conditions available. However, their sampling characteristics make PIREPs a difficult dataset to use for these evaluations. In particular, PIREPs have temporal and spatial biases (e.g., they are more frequent during daylight hours, and they occur most frequently along flight routes and in the vicinity of major airports, where aircraft are concentrated), and they are subjective. Most importantly, the observations are non-systematic; that is, observations are not consistently reported at the same location and time. This characteristic of the reports has numerous implications for the verification of forecasts of these phenomena. In particular, it is inappropriate to estimate certain common verification statistics that normally are of interest in forecast evaluations. For example, estimates of the false alarm ratio and critical success index are incorrect, due to the unrepresentativeness of the observations. Analytical explanations for this result have been developed, and the magnitudes of the errors associated with estimating these statistics have been estimated through Monte Carlo simulations. In addition, several approaches have been developed to compensate for these characteristics of PIREPs in verification studies, including methods for estimating confidence intervals for the verification statistics, which take into account their sampling variability. These approaches also have implications for verification…
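
    A tiny Monte Carlo sketch of the core problem (our toy construction, not the authors' simulations): when "null" conditions are rarely reported, the false alarm ratio computed from the reports alone is badly biased.

```python
# Why non-systematic, positively biased reports corrupt the estimated false
# alarm ratio: null events are reported far less often than actual events.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

event = rng.random(n) < 0.05                    # true occurrence (e.g., icing)
# forecast with known quality: 80% hit rate, 10% false-alarm rate over nulls
forecast = np.where(event, rng.random(n) < 0.8, rng.random(n) < 0.1)

def far(fcst, obs):
    """false alarm ratio = false alarms / total 'yes' forecasts"""
    return (fcst & ~obs).sum() / fcst.sum()

# observations exist only where a report happens to be filed: pilots report
# 'yes' conditions much more readily than 'null' conditions
p_report = np.where(event, 0.5, 0.02)
reported = rng.random(n) < p_report

print(f"FAR over all points:   {far(forecast, event):.3f}")
print(f"FAR over reports only: {far(forecast[reported], event[reported]):.3f}"
      "  (biased low)")
```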

  16. A synchrotron study of microstructure gradient in laser additively formed epitaxial Ni-based superalloy

    PubMed Central

    Xue, Jiawei; Zhang, Anfeng; Li, Yao; Qian, Dan; Wan, Jingchun; Qi, Baolu; Tamura, Nobumichi; Song, Zhongxiao; Chen, Kai

    2015-01-01

    Laser additive forming is considered to be one of the promising techniques for repairing single crystal Ni-based superalloy parts to extend their life and reduce costs. Preservation of the single crystalline nature and prevention of thermal mechanical failure are two of the most essential issues for the application of this technique. Here we employ synchrotron X-ray microdiffraction to evaluate the quality, in terms of crystal orientation and defect distribution, of the Ni-based superalloy DZ125L directly formed by a laser additive process rooted in a single crystalline substrate of the same material. We show that a disorientation gradient, caused by a high density of geometrically necessary dislocations and the resultant subgrains, exists in the interfacial region between the epitaxial and stray grains. This suggests a link between stray grain formation and defect accumulation. The observation offers new directions for the study of performance control and reliability of laser additive manufactured superalloys. PMID:26446425

  17. A synchrotron study of microstructure gradient in laser additively formed epitaxial Ni-based superalloy

    SciTech Connect

    Xue, Jiawei; Zhang, Anfeng; Li, Yao; Qian, Dan; Wan, Jingchun; Qi, Baolu; Tamura, Nobumichi; Song, Zhongxiao; Chen, Kai

    2015-10-08

    Laser additive forming is considered to be one of the promising techniques for repairing single crystal Ni-based superalloy parts to extend their life and reduce costs. Preservation of the single crystalline nature and prevention of thermal mechanical failure are two of the most essential issues for the application of this technique. Here we employ synchrotron X-ray microdiffraction to evaluate the quality, in terms of crystal orientation and defect distribution, of the Ni-based superalloy DZ125L directly formed by a laser additive process rooted in a single crystalline substrate of the same material. We show that a disorientation gradient, caused by a high density of geometrically necessary dislocations and the resultant subgrains, exists in the interfacial region between the epitaxial and stray grains. This suggests a link between stray grain formation and defect accumulation. In conclusion, the observation offers new directions for the study of performance control and reliability of laser additive manufactured superalloys.

  18. The formal verification of generic interpreters

    NASA Technical Reports Server (NTRS)

    Windley, P.; Levitt, K.; Cohen, G. C.

    1991-01-01

    Task assignment 3 of the design and validation of digital flight control systems suitable for fly-by-wire applications is studied. Task 3 is concerned with the formal verification of embedded systems. In particular, results are presented that provide a methodological approach to microprocessor verification, along with a hierarchical decomposition strategy for specifying microprocessors. A theory of generic interpreters is presented that can be used to model microprocessor behavior. The generic interpreter theory abstracts away the details of instruction functionality, leaving a general model of what an interpreter does.
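
    To make the abstraction concrete, here is a hypothetical sketch of a generic interpreter: a fixed step/run skeleton parameterized by an instruction-semantics table, so the same model can be instantiated at different abstraction levels. The names and the toy machine are ours, not the report's HOL definitions.

```python
# A generic interpreter: instruction functionality is abstracted into a table,
# leaving a general model of what an interpreter does.
from dataclasses import dataclass
from typing import Callable, Dict, List

State = Dict[str, int]          # named registers, incl. program counter "pc"

@dataclass
class GenericInterpreter:
    semantics: Dict[str, Callable[[State, int], State]]  # abstracted semantics
    program: List[tuple]        # (opcode, operand)

    def step(self, s: State) -> State:
        op, arg = self.program[s["pc"]]
        s2 = self.semantics[op](dict(s), arg)   # opcode-specific effect
        s2["pc"] = s2.get("pc", s["pc"]) + 1    # default pc increment
        return s2

    def run(self, s: State, n: int) -> State:
        for _ in range(n):
            s = self.step(s)
        return s

# one concrete instantiation: a tiny accumulator machine
semantics = {
    "LOADI": lambda s, k: {**s, "acc": k},
    "ADD":   lambda s, k: {**s, "acc": s["acc"] + s[f"r{k}"]},
    "STORE": lambda s, k: {**s, f"r{k}": s["acc"]},
}
m = GenericInterpreter(semantics, [("LOADI", 2), ("STORE", 1), ("ADD", 1)])
print(m.run({"pc": 0, "acc": 0, "r1": 0}, 3))   # acc ends at 4
```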

  19. Studies of levels of biogenic amines in meat samples in relation to the content of additives.

    PubMed

    Jastrzębska, Aneta; Kowalska, Sylwia; Szłyk, Edward

    2016-01-01

    The impact of meat additives on the concentration of biogenic amines and the quality of meat was studied. Fresh white and red meat samples were fortified with the following food additives: citric and lactic acids, disodium diphosphate, sodium nitrite, sodium metabisulphite, potassium sorbate, sodium chloride, ascorbic acid, α-tocopherol, propyl 3,4,5-trihydroxybenzoate (propyl gallate) and butylated hydroxyanisole. The content of spermine, spermidine, putrescine, cadaverine, histamine, tyramine, tryptamine and 2-phenylethylamine was determined by capillary isotachophoretic methods in meat samples (fresh and fortified) during four days of storage at 4°C. The results were used to estimate the impact of the tested additives on the formation of biogenic amines in white and red meat. For all tested meats, sodium nitrite, sodium chloride and disodium diphosphate showed the best inhibition. Across all the additives, however, cadaverine and putrescine showed the largest changes in concentration during storage. Based on the presented data on the content of biogenic amines in meat samples, analysed as a function of storage time and additives, we suggest that cadaverine and putrescine have a significant impact on meat quality.

  20. Hierarchical Representation Learning for Kinship Verification.

    PubMed

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications, such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that provide kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effects of participant gender and age and of the kin-relation pair of the stimulus are analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is used to learn the representation of different face regions in an unsupervised manner. We propose a novel approach to feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model, and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields the state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.
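
    The product-of-likelihood-ratio fusion mentioned at the end can be sketched compactly. The snippet below is a hedged toy version with assumed Gaussian score models, not the paper's trained system: each score is mapped to a likelihood ratio and the ratios are multiplied before thresholding.

```python
# Toy product-of-likelihood-ratio fusion of a face score and a kinship score.
from scipy.stats import norm

# assumed class-conditional score models (fit on held-out data in practice)
face_genuine, face_impostor = norm(0.7, 0.15), norm(0.3, 0.15)
kin_genuine,  kin_impostor  = norm(0.6, 0.20), norm(0.4, 0.20)

def fused_lr(face_score, kin_score):
    lr_face = face_genuine.pdf(face_score) / face_impostor.pdf(face_score)
    lr_kin = kin_genuine.pdf(kin_score) / kin_impostor.pdf(kin_score)
    return lr_face * lr_kin          # product fusion

# accept when the fused likelihood ratio exceeds a threshold
lr = fused_lr(0.65, 0.55)
print(f"fused LR = {lr:.2f} ->", "match" if lr > 1.0 else "non-match")
```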

  1. Hierarchical Representation Learning for Kinship Verification.

    PubMed

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2016-09-14

    Kinship verification has a number of applications, such as organizing large collections of images and recognizing resemblances among humans. In this research, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that provide kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effects of participant gender and age and of the kin-relation pair of the stimulus are analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical Kinship Verification via Representation Learning (KVRL) framework is used to learn the representation of different face regions in an unsupervised manner. We propose a novel approach to feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model, and a multi-layer neural network is utilized to verify the kin accurately. A new WVU Kinship Database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU Kinship database and on four existing benchmark datasets. Further, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  2. Microcode Verification Project.

    DTIC Science & Technology

    1980-05-01

    MICROCODE VERIFICATION PROJECT. University of Southern California, Stephen D. ... in the production, testing, and maintenance of Air Force software. This effort was undertaken in response to that goal. The objective of the effort was ... rather than hard wiring, is a recent development in computer technology. Hardware diagnostics do not fulfill testing requirements for these computers.

  3. Robust verification analysis

    NASA Astrophysics Data System (ADS)

    Rider, William; Witkowski, Walt; Kamm, James R.; Wildey, Tim

    2016-02-01

    We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences of simulations incorporates expert judgment into the process directly via a flexible optimization framework, and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis, and together with the robust statistics guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology is based on utilizing multiple constrained optimization problems to solve the verification model in a manner that varies the analysis' underlying assumptions. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach then produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data- and expert-informed error estimation, including uncertainties for both the solution itself and the order of convergence. Our method produces high quality results for the well-behaved cases relatively consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.
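
    The core computation can be illustrated briefly. The sketch below is our own reading of the approach, not the authors' implementation: observed convergence orders are extracted from several grid triplets (each triplet embodying a different assumption about which grids to trust) and then summarized with median/MAD statistics so that an anomalous triplet cannot dominate.

```python
# Robust (median-based) summary of convergence orders from grid triplets,
# on synthetic verification data. Details are our assumptions.
import numpy as np

rng = np.random.default_rng(2)

# synthetic data: f(h) = f* + C h^p plus noise, refinement ratio r = 2
f_exact, C, p_true = 1.0, 0.35, 2.0
h = 0.1 / 2.0 ** np.arange(6)                     # 6 grid levels
f = f_exact + C * h**p_true + rng.normal(0, 2e-6, h.size)

def order_from_triplet(f1, f2, f3, r=2.0):
    """Observed order from three successive solutions (coarse -> fine)."""
    return np.log(abs((f1 - f2) / (f2 - f3))) / np.log(r)

# vary the underlying assumption: which triplet of grids is trusted
orders = np.array([order_from_triplet(f[i], f[i + 1], f[i + 2])
                   for i in range(len(f) - 2)])
med = np.median(orders)
mad = np.median(np.abs(orders - med))
print(f"per-triplet orders: {np.round(orders, 2)}")
print(f"robust estimate: p = {med:.2f} +/- {1.4826 * mad:.2f} (median, MAD)")
```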

  4. Robust verification analysis

    SciTech Connect

    Rider, William; Witkowski, Walt; Kamm, James R.; Wildey, Tim

    2016-02-15

    We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences of simulations incorporates expert judgment into the process directly via a flexible optimization framework, and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis, and together with the robust statistics guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology is based on utilizing multiple constrained optimization problems to solve the verification model in a manner that varies the analysis' underlying assumptions. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach then produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data- and expert-informed error estimation, including uncertainties for both the solution itself and the order of convergence. Our method produces high quality results for the well-behaved cases relatively consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.

  5. RESRAD-BUILD verification.

    SciTech Connect

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-31

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides (H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238) were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft® Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified.

  6. Summary of Previous Chamber or Controlled Anthrax Studies and Recommendations for Possible Additional Studies

    SciTech Connect

    Piepel, Gregory F.; Amidan, Brett G.; Morrow, Jayne B.

    2010-12-29

    This report and an associated Excel file(a) summarize the investigations and results of previous chamber and controlled studies(b) that characterized the performance of methods for collecting, storing and/or transporting, extracting, and analyzing samples from surfaces contaminated by Bacillus anthracis (BA) or related simulants. The report and the Excel file are the joint work of the Pacific Northwest National Laboratory (PNNL) and the National Institute of Standards and Technology (NIST) for the Department of Homeland Security, Science and Technology Directorate. The report was originally released as PNNL-SA-69338, Rev. 0 in November 2009 with limited distribution, but was subsequently cleared for release with unlimited distribution in this Rev. 1. Only minor changes were made to Rev. 0 to yield Rev. 1. A more substantial update (including summaries of data from other studies and more condensed summary tables of data) is underway.

  7. Self-verification motives at the collective level of self-definition.

    PubMed

    Chen, Serena; Chen, Karen Y; Shaw, Lindsay

    2004-01-01

    Three studies examined self-verification motives in relation to collective aspects of the self. Several moderators of collective self-verification were also examined, namely the certainty with which collective self-views are held, the nature of one's ties to a source of self-verification, the salience of the collective self, and the importance of group identification. Evidence for collective self-verification emerged across all studies, particularly when collective self-views were held with high certainty (Studies 1 and 2), perceivers were somehow tied to the source of self-verification (Study 1), the collective self was salient (Study 2), and group identification was important (Study 3). To the authors' knowledge, these studies are the first to examine self-verification at the collective level of self-definition. The parallel and distinct ways in which self-verification processes may operate at different levels of self-definition are discussed.

  8. Verification of a Viscous Computational Aeroacoustics Code Using External Verification Analysis

    NASA Technical Reports Server (NTRS)

    Ingraham, Daniel; Hixon, Ray

    2015-01-01

    The External Verification Analysis approach to code verification is extended to solve the three-dimensional Navier-Stokes equations with constant properties, and is used to verify a high-order computational aeroacoustics (CAA) code. After a brief review of the relevant literature, the details of the EVA approach are presented and compared to the similar Method of Manufactured Solutions (MMS). Pseudocode representations of EVA's algorithms are included, along with the recurrence relations needed to construct the EVA solution. The code verification results show that EVA was able to convincingly verify a high-order, viscous CAA code without the addition of MMS-style source terms, or any other modifications to the code.
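
    Since the abstract contrasts EVA with the Method of Manufactured Solutions, a compact MMS illustration may help. This is a generic textbook-style sketch for a 1D heat equation, with a manufactured solution of our choosing; it is unrelated to the specific CAA code verified in the paper.

```python
# Method of Manufactured Solutions for u_t - k*u_xx = S(x, t).
import sympy as sp

x, t, k = sp.symbols("x t k", positive=True)

# 1) pick a smooth manufactured solution
u_m = sp.exp(-t) * sp.sin(sp.pi * x)

# 2) push it through the governing operator to obtain the source term
S = sp.simplify(sp.diff(u_m, t) - k * sp.diff(u_m, x, 2))
print("source term S(x,t) =", S)

# 3) a code solving u_t - k*u_xx = S with matching IC/BCs should reproduce
#    u_m exactly; the observed order of accuracy is measured against it.
```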

  9. Verification of a Viscous Computational Aeroacoustics Code using External Verification Analysis

    NASA Technical Reports Server (NTRS)

    Ingraham, Daniel; Hixon, Ray

    2015-01-01

    The External Verification Analysis approach to code verification is extended to solve the three-dimensional Navier-Stokes equations with constant properties, and is used to verify a high-order computational aeroacoustics (CAA) code. After a brief review of the relevant literature, the details of the EVA approach are presented and compared to the similar Method of Manufactured Solutions (MMS). Pseudocode representations of EVA's algorithms are included, along with the recurrence relations needed to construct the EVA solution. The code verification results show that EVA was able to convincingly verify a high-order, viscous CAA code without the addition of MMS-style source terms, or any other modifications to the code.

  10. Using epidemiology to regulate food additives: saccharin case-control studies.

    PubMed

    Cordle, F; Miller, S A

    1984-01-01

    The increasing use of nonnutritive sweeteners and the widely publicized 1969 ban on cyclamate led to additional investigations in rodents of the carcinogenic potential of saccharin. Preliminary results of a long-term feeding study indicated formation of bladder tumors in rodents, and collective experimental evidence has demonstrated that high doses of the synthetic sweetener saccharin can cause bladder cancer in rodents. Based on the results of that and other rodent studies indicating an increased risk of bladder cancer associated with saccharin, the Commissioner of the Food and Drug Administration announced the agency's intention to propose a ban on saccharin. This intention was made known in April 1977 under the Delaney Clause of the Food, Drug, and Cosmetic Act. The clause essentially states that no additive shall be deemed safe if it is found to induce cancer in man or animals, or if it is found, after tests appropriate for the evaluation of the safety of food additives, to induce cancer in man or animals. Also in 1977, a group of epidemiologists began to assess the available epidemiologic information to determine the potential human risk. This report describes the assessment of several human epidemiologic studies available then and the results of more recent epidemiologic studies.

  11. A kinetic study of struvite precipitation recycling technology with NaOH/Mg(OH)2 addition.

    PubMed

    Yu, Rongtai; Ren, Hongqiang; Wang, Yanru; Ding, Lili; Geng, Jingji; Xu, Ke; Zhang, Yan

    2013-09-01

    Struvite precipitation recycling technology has received wide attention for removing ammonium and phosphate from wastewater. Past studies, however, focused on process efficiency rather than kinetics, and a kinetic study is essential for the design and optimization of struvite precipitation recycling technology in practice. The kinetics of struvite with NaOH/Mg(OH)2 addition were studied by thermogravimetric analysis at three heating rates (5, 10, 20 °C/min), using the Friedman method and the Ozawa-Flynn-Wall method, respectively. The degradation of struvite with NaOH/Mg(OH)2 addition proceeded in three steps, with the stripping of ammonia from struvite occurring mainly in the first step. In the first step, the activation energy was about 70 kJ/mol and declined gradually as the reaction progressed. Model-fitting studies revealed the proper mechanism function for the struvite decomposition process with NaOH/Mg(OH)2 addition: a Prout-Tompkins nth-order (Bna) model, f(α) = α^a (1 − α)^n.
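
    The Ozawa-Flynn-Wall step can be shown in a few lines: at a fixed conversion α, ln β is approximately linear in 1/T_α with slope −1.052·Ea/R. The sketch below runs on synthetic temperatures generated from an assumed Ea, purely to illustrate the regression; it does not use the paper's data.

```python
# Ozawa-Flynn-Wall isoconversional estimate of the activation energy.
import numpy as np

R = 8.314  # J mol^-1 K^-1

beta = np.array([5.0, 10.0, 20.0])        # heating rates, K/min
# temperature (K) at which conversion alpha = 0.5 is reached for each rate,
# synthesized here from an assumed Ea = 70 kJ/mol for the demonstration
Ea_true = 70e3
T_alpha = 1.0 / (1.0 / 380.0 - R * np.log(beta / 5.0) / (1.052 * Ea_true))

# ln(beta) vs 1/T regression; slope = -1.052 * Ea / R (Doyle approximation)
slope, intercept = np.polyfit(1.0 / T_alpha, np.log(beta), 1)
Ea = -slope * R / 1.052
print(f"estimated Ea at alpha = 0.5: {Ea / 1e3:.1f} kJ/mol")
```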

  12. A study on the relationship between the protein supplements intake satisfaction level and repurchase intention: Verification of mediation effects of word-of-mouth intention.

    PubMed

    Kim, Ill-Gwang

    2016-05-18

    The purpose of this study is to examine the relationship between the protein supplement intake satisfaction level and the repurchase intention of university students majoring in physical education, and to verify the mediation effects of word-of-mouth intention. To achieve this purpose, 700 university students majoring in physical education at 10 universities in Korea were selected from October 2013 to December 2013 through cluster random sampling, and the data of the 228 students who had experience taking protein supplements were analyzed. The composite reliability of each factor was between 0.869 and 0.958, and convergent validity and discriminant validity were verified. SPSS 18.0 and Amos 22.0 were utilized for data processing, and the verification of the significance of the mediation effects and indirect effects of word-of-mouth intention was carried out using frequency analysis, correlation analysis, CFA, SEM, and Amos bootstrapping. The results are as follows. The protein supplement intake satisfaction level had a positive effect on word-of-mouth intention, and word-of-mouth intention had a positive effect on repurchase intention. Also, it was shown that word-of-mouth intention played a full mediation role between the intake satisfaction level and the repurchase intention.
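
    The bootstrap test of a mediated (indirect) effect used in such analyses can be sketched compactly. The code below is a generic illustration on synthetic data with hypothetical path coefficients, not the study's Amos model: the indirect effect a·b is resampled and judged by whether its percentile confidence interval excludes zero.

```python
# Percentile-bootstrap test of an indirect effect a*b:
# satisfaction -> word-of-mouth -> repurchase.
import numpy as np

rng = np.random.default_rng(3)
n = 228
satisfaction = rng.normal(0, 1, n)
wom = 0.5 * satisfaction + rng.normal(0, 1, n)          # mediator
repurchase = 0.4 * wom + 0.1 * satisfaction + rng.normal(0, 1, n)

def indirect(x, m, y):
    a = np.polyfit(x, m, 1)[0]                 # x -> m path
    design = np.column_stack([m, x, np.ones(len(y))])
    b = np.linalg.lstsq(design, y, rcond=None)[0][0]  # m -> y, controlling x
    return a * b

boot = np.array([indirect(satisfaction[i], wom[i], repurchase[i])
                 for i in (rng.integers(0, n, n) for _ in range(5000))])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect a*b = {indirect(satisfaction, wom, repurchase):.3f}, "
      f"95% bootstrap CI ({lo:.3f}, {hi:.3f})")
# mediation is supported when the CI excludes zero
```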

  13. SU-E-T-600: Patient Specific IMRT Verification Using a Phosphor-Screen Based Geometric QA System: A Preliminary Study

    SciTech Connect

    Lee, M; Hu, E; Yi, B

    2015-06-15

    Purpose: Raven QA (JPLC, MD) is a unified and comprehensive quality assurance system for the QA tasks of TG-142, which uses a phosphor screen, a mirror system, and a camera. This study tests whether the device can be used for IMRT QA dosimetry. Methods: A lung IMRT case is used to deliver dose to the Raven QA. The accuracy of the dose distribution in a 5 cm slab phantom calculated with the Eclipse planning system (Varian) had been confirmed both by Monte Carlo simulation and by MapCheck (SunNuclear) measurement. Geometric distortion and variation of spatial dose response are corrected after background subtraction. A pin-hole grid plate is designed and used to determine the light scatter in the Raven QA box and the spatial dose response. An optic scatter model was not applied in this preliminary study. Dose is normalized to the response of the 10x10 field, and the TMR at 5 cm depth was taken into account. Results: Setting up the device for IMRT QA takes less than 5 minutes, as with other commercially available devices. The system shows excellent dose linearity and dose-rate independence (deviations of less than 1%). The background signal, however, changes with field size, which is believed to be due to inaccurate correction of optic scatter. The absolute gamma (5%, 5 mm) passing rate was higher than 95%. Conclusion: This study shows that the Raven QA can be used for patient-specific IMRT verification. Part of this study is supported by the Maryland Industrial Partnership Grant.
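
    The "absolute gamma (5%, 5 mm)" figure of merit refers to the gamma index. Below is a simplified brute-force 2D sketch of that computation on synthetic dose grids, for illustration only; clinical implementations use interpolation and optimized search.

```python
# Brute-force 2D gamma-index pass rate on synthetic dose grids.
import numpy as np

def gamma_pass_rate(ref, meas, pixel_mm, dose_tol=0.05, dist_mm=5.0,
                    threshold=0.10):
    """Global gamma: dose_tol is a fraction of the reference maximum."""
    dd = dose_tol * ref.max()
    ys, xs = np.indices(ref.shape)
    passed, total = 0, 0
    for (i, j), d_m in np.ndenumerate(meas):
        if ref[i, j] < threshold * ref.max():
            continue                         # skip the low-dose region
        dist2 = ((ys - i) ** 2 + (xs - j) ** 2) * pixel_mm ** 2
        gamma2 = dist2 / dist_mm ** 2 + (ref - d_m) ** 2 / dd ** 2
        passed += gamma2.min() <= 1.0        # best match over all ref points
        total += 1
    return passed / total

# synthetic reference and a slightly shifted/scaled "measurement"
y, x = np.mgrid[-40:40, -40:40]
ref = 100.0 * np.exp(-(x**2 + y**2) / (2 * 15.0**2))
meas = 1.02 * np.roll(ref, 2, axis=1)        # 2 px shift, 2% scaling
print(f"gamma (5%/5mm) pass rate: {gamma_pass_rate(ref, meas, 1.0):.1%}")
```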

  14. Study of asphalt/asphaltene precipitation during addition of solvents to West Sak crude

    SciTech Connect

    Jiang, J.C.; Patil, S.L.; Kamath, V.A. )

    1990-07-01

    In this study, experimental data on the amount of asphalt and asphaltene precipitation due to the addition of solvents to West Sak crude were gathered. The first set of tests was conducted on two types of West Sak stock tank oils. The solvents used included ethane, carbon dioxide, propane, n-butane, n-pentane, n-heptane, Prudhoe Bay natural gas (PBG) and natural gas liquids (NGL). The effect of the solvent-to-oil dilution ratio on the amount of precipitation was studied. Alteration of the crude oil composition due to asphalt precipitation was measured using gas-liquid chromatography. A second set of experiments was conducted to measure asphaltene precipitation due to the addition of CO2 to live (recombined) West Sak crude.

  15. Cleaning Verification Monitor Technique Based on Infrared Optical Methods

    DTIC Science & Technology

    2004-10-01

    “Cleaning Verification Techniques.” Real-time methods to provide both qualitative and quantitative assessments of surface cleanliness are needed ... the detection VCPI method offers a wide range of complementary capabilities in real-time surface cleanliness verification. ... also has great potential to reduce or eliminate premature failures of surface coatings caused by a lack of surface cleanliness.

  16. Additive effects of neurofeedback on the treatment of ADHD: A randomized controlled study.

    PubMed

    Lee, Eun-Jeong; Jung, Chul-Ho

    2017-02-01

    Neurofeedback (NF) has been identified as a "possibly efficacious" treatment in current evidence-based reviews; therefore, more research is needed to determine its effects. The current study examined the potential additive effect of NF for children diagnosed with ADHD who were beginning a medication trial. Thirty-six children (6-12 years) with a DSM-IV-TR diagnosis of ADHD were randomly assigned to NF with medication (NF condition) or to a medication-only condition. Children in the NF group attended 20 twice-weekly sessions. Outcome measures included individual cognitive performance scores (ADS, K-WISC-III), ADHD rating scores completed by parents (ARS, CRS), and brainwave indices of the left and right hemispheres before and after NF treatment. A significant additive treatment effect on the symptom variables was found, and a reduction of theta waves in both the right and left hemispheres was recorded in NF-condition participants. However, our randomized controlled study could not demonstrate superior effects of combined NF on intellectual functioning compared with medication treatment alone. This study suggests possible positive and additive treatment effects of NF on brainwaves and ADHD symptomatology.

  17. Cognitive Bias in Systems Verification

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is studied in diverse fields: economics, politics, intelligence, and marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct; there may be only a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: overconfidence -> questionable decisions to deploy; availability -> inability to conceive critical tests; representativeness -> overinterpretation of results; positive test strategies -> confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated, and is worth considering at key points in the process.

  18. Quantum money with classical verification

    SciTech Connect

    Gavinsky, Dmitry

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, but none of the earlier quantum money constructions is known to possess it.

  19. Quantum money with classical verification

    NASA Astrophysics Data System (ADS)

    Gavinsky, Dmitry

    2014-12-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, but none of the earlier quantum money constructions is known to possess it.

  20. TEM and HRTEM studies of ball milled 6061 aluminium alloy powder with Zr addition.

    PubMed

    Lityńska-Dobrzyńska, L; Dutkiewicz, J; Maziarz, W; Rogal, Ł

    2010-03-01

    The effect of mechanical alloying on the microstructure of atomized 6061 aluminium alloy powder, with and without a zirconium addition, was studied in this work. The atomized 6061 aluminium alloy powder and 6061 powder with an addition of 2 wt.% Zr were milled in a planetary ball mill and investigated using X-ray diffraction measurements, conventional and high-resolution electron microscopy (TEM/HRTEM), and high-angle annular dark field scanning transmission electron microscopy combined with energy-dispersive X-ray microanalysis. An increase in stresses was observed in milled powders after refinement of the crystallites below 100 nm. In the powder with the zirconium addition, part of the Zr diffused into the aluminium, forming a solid solution containing up to 0.5 wt.% Zr. The remainder formed Zr-rich particles containing up to 88 wt.% Zr, identified as a face-centred cubic (fcc) phase with lattice constant a = 0.48 nm; this fcc phase partially transformed into the L12 ordered phase. Eighty hours of milling increased the microhardness (measured with the Vickers method) from about 50 HV (168 MPa) for the initial 6061 powder to about 170 HV (552 MPa). The addition of zirconium had no influence on the microhardness.

  1. Hallux valgus: comparative study between two surgical techniques of proximal addition osteotomy

    PubMed Central

    Lara, Luiz Carlos Ribeiro; de Araujo, Bruno Vierno; Franco, Nelson; Hita, Roberto Minoru

    2012-01-01

    OBJECTIVE: To compare, clinically and radiographically, the results of treatment of hallux valgus by two addition osteotomy techniques: one using the resected exostosis, and the other using plate fixation for the addition wedge. METHODS: We evaluated 24 feet of 19 patients, mean age 51.3 years, affected by hallux valgus, with a mean follow-up of 50.1 months. Thirteen feet underwent addition osteotomy with resected exostosis (AORE) and 11 feet (11 patients) underwent addition osteotomy with plate (AOP). The AOFAS score, the first-second intermetatarsal angle (IMA), and the hallux valgus angle (HVA) were evaluated before and after surgery. RESULTS: With the AORE technique, the mean AOFAS improved from 46.6 preoperatively (IMA 14°, HVA 32°) to 81.3 postoperatively (IMA 9°, HVA 25°), with 92.3% satisfactory results. With the AOP technique, the mean AOFAS improved from 42.1 preoperatively (IMA 15°, HVA 29°) to 77.4 postoperatively (IMA 11°, HVA 23°), with 81.8% satisfactory results. CONCLUSIONS: Both techniques proved effective in the treatment of hallux valgus, both clinically and radiographically, with no statistical difference between them. Level of evidence III, retrospective comparative study. PMID:24453631

  2. Cone beam CT imaging with limited angle of projections and prior knowledge for volumetric verification of non-coplanar beam radiation therapy: a proof of concept study

    NASA Astrophysics Data System (ADS)

    Meng, Bowen; Xing, Lei; Han, Bin; Koong, Albert; Chang, Daniel; Cheng, Jason; Li, Ruijiang

    2013-11-01

    Non-coplanar beams are important for treatment of both cranial and noncranial tumors. Treatment verification of such beams with couch rotation/kicks, however, is challenging, particularly for the application of cone beam CT (CBCT). In this situation, only limited and unconventional imaging angles are feasible to avoid collision between the gantry, couch, patient, and on-board imaging system. The purpose of this work is to develop a CBCT verification strategy for patients undergoing non-coplanar radiation therapy. We propose an image reconstruction scheme that integrates a prior image constrained compressed sensing (PICCS) technique with image registration. Planning CT or CBCT acquired at the neutral position is rotated and translated according to the nominal couch rotation/translation to serve as the initial prior image. Here, the nominal couch movement is chosen to have a rotational error of 5° and translational error of 8 mm from the ground truth in one or more axes or directions. The proposed reconstruction scheme alternates between two major steps. First, an image is reconstructed using the PICCS technique implemented with total-variation minimization and simultaneous algebraic reconstruction. Second, the rotational/translational setup errors are corrected and the prior image is updated by applying rigid image registration between the reconstructed image and the previous prior image. The PICCS algorithm and rigid image registration are alternated iteratively until the registration results fall below a predetermined threshold. The proposed reconstruction algorithm is evaluated with an anthropomorphic digital phantom and physical head phantom. The proposed algorithm provides useful volumetric images for patient setup using projections with an angular range as small as 60°. It reduced the translational setup errors from 8 mm to generally <1 mm and the rotational setup errors from 5° to <1°. Compared with the PICCS algorithm alone, the integration of rigid…

  3. Effect of Exogenous Phytase Addition on Soil Phosphatase Activities: a Fluorescence Spectroscopy Study.

    PubMed

    Yang, Xiao-zhu; Chen, Zhen-hua; Zhang, Yu-lan; Chen, Li-jun

    2015-05-01

    The utilization of organic phosphorus (P) improves, directly or indirectly, after exogenous phytase is added to soil. However, the mechanism by which exogenous phytase affects soil phosphatase (phosphomonoesterase and phosphodiesterase) activities was not clear. The present work studied how acid and alkaline phosphomonoesterase (AcP and AlP) and phosphodiesterase (PD) activities in red soil, brown soil, and cinnamon soil respond to the addition of exogenous phytase (1 g phytase per 50 g air-dry soil sample), based on measurements performed via a fluorescence detection method combined with 96-well microplates using a TECAN Infinite 200 Multi-Mode Microplate Reader. The results indicated that acid phosphomonoesterase activity was significantly enhanced in red soil (p ≤ 0.01) but significantly reduced in cinnamon soil; alkaline phosphomonoesterase activity was significantly enhanced in cinnamon soil (p ≤ 0.01) but significantly reduced in red soil; and phosphodiesterase activity increased in all three soils, most significantly in brown soil (p ≤ 0.01), after the addition of exogenous phytase. The activities remained strong after eight days in the different soils, which indicates that exogenous phytase addition can effectively enhance soil phosphatase activities. This effect was related not only to soil properties, such as pH and phosphorus forms, but possibly also to the amount of enzyme excreted by the stimulated microorganisms. To our knowledge, this is the first study to use fluorescence spectroscopy to examine the influence of exogenous phytase addition on soil phosphatase activities. Compared with the conventional spectrophotometric method, the fluorescence microplate method is an accurate, fast, and simple way to determine the relationships among soil phosphatase activities.

  4. Verification of LHS distributions.

    SciTech Connect

    Swiler, Laura Painton

    2006-04-01

    This document provides verification test results for the normal, lognormal, and uniform distributions that are used in Sandia's Latin Hypercube Sampling (LHS) software. The purpose of this testing is to verify that the sample values being generated in LHS are distributed according to the desired distribution types. The testing of distribution correctness is done by examining summary statistics, graphical comparisons using quantile-quantile plots, and formal statistical tests such as the Chi-square test, the Kolmogorov-Smirnov test, and the Anderson-Darling test. The overall results from the testing indicate that the generation of normal, lognormal, and uniform distributions in LHS is acceptable.
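
    The same style of check is easy to reproduce with off-the-shelf tools. The sketch below uses SciPy's Latin hypercube engine rather than Sandia's LHS code (an assumption for illustration): stratified uniform samples are mapped through the normal inverse CDF and then checked with summary statistics and a Kolmogorov-Smirnov test.

```python
# Verify that Latin hypercube samples, transformed to a normal distribution,
# actually follow that distribution.
import numpy as np
from scipy import stats
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=1, seed=42)
u = sampler.random(n=1000).ravel()          # stratified uniforms on (0, 1)

mu, sigma = 10.0, 2.0
x = stats.norm.ppf(u, loc=mu, scale=sigma)  # inverse-CDF transform

# summary statistics and a formal goodness-of-fit test
print(f"mean {x.mean():.3f} (target {mu}), "
      f"std {x.std(ddof=1):.3f} (target {sigma})")
ks = stats.kstest(x, "norm", args=(mu, sigma))
print(f"KS statistic {ks.statistic:.4f}, p-value {ks.pvalue:.3f}")
# a large p-value means no evidence against the intended distribution
```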

  5. Production readiness verification testing

    NASA Technical Reports Server (NTRS)

    James, A. M.; Bohon, H. L.

    1980-01-01

    A Production Readiness Verification Testing (PRVT) program has been established to determine whether structures fabricated from advanced composites can be committed on a production basis to commercial airline service. The program utilizes subcomponents which reflect the variabilities in structure that can realistically be expected from current production and quality control technology to estimate the production qualities, variation in static strength, and durability of advanced composite structures. The results of the static tests and a durability assessment after one year of continuous load/environment testing of twenty-two duplicates of each of two structural components (a segment of the front spar and cover of a vertical stabilizer box structure) are discussed.

  6. A synchrotron study of microstructure gradient in laser additively formed epitaxial Ni-based superalloy

    DOE PAGES

    Xue, Jiawei; Zhang, Anfeng; Li, Yao; ...

    2015-10-08

    Laser additive forming is considered to be one of the promising techniques for repairing single crystal Ni-based superalloy parts to extend their life and reduce costs. Preservation of the single crystalline nature and prevention of thermal mechanical failure are two of the most essential issues for the application of this technique. Here we employ synchrotron X-ray microdiffraction to evaluate the quality, in terms of crystal orientation and defect distribution, of the Ni-based superalloy DZ125L directly formed by a laser additive process rooted in a single crystalline substrate of the same material. We show that a disorientation gradient caused by a high density of geometrically necessary dislocations and the resultant subgrains exists in the interfacial region between the epitaxial and stray grains. This suggests a link between stray grain formation and defect accumulation. In conclusion, the observation offers new directions for the study of performance control and reliability of laser additive manufactured superalloys.

  7. Couples Counseling in Alzheimer’s Disease: Additional Clinical Findings from a Novel Intervention Study

    PubMed Central

    AUCLAIR, URSULA; EPSTEIN, CYNTHIA; MITTELMAN, MARY

    2009-01-01

    This article describes the clinical findings of a study designed to assess the benefit of counseling for couples, one of whom is in the early stage of Alzheimer’s disease (AD). We previously reported our findings based on the first 12 couples that enrolled in the study. Based on the treatment of 30 additional couples, we have refined our treatment strategy to include concepts of Gestalt Therapy and Transactional Analysis and identified prevalent issues of concern to this cohort. The study design has remained as described in the earlier article (Epstein et al., 2006), and has proven to be appropriate to meet the goals of this intervention as indicated by our clinical experience and feedback from the participating couples. Case vignettes demonstrate how to conduct the sessions so that the experience of each member of the dyad is validated, while acknowledging the differential impact of the disease on them. PMID:19865591

  8. Verification of NASA Emergent Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy K. C. S.; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. This mission, the prospective ANTS (Autonomous Nano Technology Swarm) mission, will comprise 1,000 autonomous robotic agents designed to cooperate in asteroid exploration. The emergent properties of swarm-type missions make them powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of future swarm-based missions. The advantage of using formal methods is their ability to mathematically assure the behavior of a swarm, emergent or otherwise. The ANTS mission is being used as an example and case study of swarm-based missions on which to experiment with and test current formal methods for intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior.

  9. Software Verification and Validation Procedure

    SciTech Connect

    Olund, Thomas S.

    2008-09-15

    This Software Verification and Validation procedure provides the action steps for the Tank Waste Information Network System (TWINS) testing process. The primary objective of the testing process is to provide assurance that the software functions as intended, and meets the requirements specified by the client. Verification and validation establish the primary basis for TWINS software product acceptance.

  10. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  11. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the openSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.
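
    Self-composition reduces an information-flow property to a property of one program composed with a renamed copy of itself: if two runs agree on public inputs, they must agree on public outputs. In deductive verification this is proved rather than tested; the toy sketch below (our construction, not the RC4 case study) merely demonstrates the composed property as a randomized check.

```python
# Self-composition for non-interference: equal public inputs must yield
# equal public outputs, regardless of the secret input.
import random

def cipher_step(secret_key: int, public_counter: int) -> int:
    """Public output under test; must not leak secret_key."""
    _ = secret_key ^ public_counter       # used internally only
    return (public_counter * 1103515245 + 12345) % 2**31

def self_composition_check(f, trials=10_000):
    rng = random.Random(0)
    for _ in range(trials):
        pub = rng.getrandbits(31)
        s1, s2 = rng.getrandbits(64), rng.getrandbits(64)   # two secrets
        # the self-composed property: equal public in => equal public out
        assert f(s1, pub) == f(s2, pub), "public output depends on the secret"
    print(f"no leakage observed in {trials} trials")

self_composition_check(cipher_step)
```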

  12. A water soluble additive to suppress respirable dust from concrete-cutting chainsaws: a case study.

    PubMed

    Summers, Michael P; Parmigiani, John P

    2015-01-01

    Respirable dust is of particular concern in the construction industry because it contains crystalline silica. Respirable forms of silica are a severe health threat because they heighten the risk of numerous respiratory diseases. Concrete cutting, a common work practice in the construction industry, is a major contributor to dust generation. No studies have been found that focus on the dust suppression of concrete-cutting chainsaws, presumably because, during normal operation, water is supplied continuously and copiously to the dust generation points. However, there is a desire to better understand dust creation at low water flow rates. In this case study, a water-soluble surfactant additive was used in the chainsaw's water supply. Cutting was performed on a free-standing concrete wall in a covered outdoor lab with a hand-held, gas-powered, concrete-cutting chainsaw. Air was sampled at the operator's lapel and around the concrete wall to simulate nearby personnel. Two additive concentrations were tested (2.0% and 0.2%) across a range of fluid flow rates (0.38-3.8 Lpm [0.1-1.0 gpm], at 0.38 Lpm [0.1 gpm] increments). The results indicate that when a lower concentration of additive is used, exposure levels increase. However, all exposure levels, once adjusted for 3 hours of continuous cutting in an 8-hour work shift, are below the Occupational Safety and Health Administration (OSHA) permissible exposure limit (PEL) of 5 mg/m³. Trend lines were used to predict the fluid flow rates at which respirable dust exposure would exceed both the OSHA PEL and the American Conference of Governmental Industrial Hygienists (ACGIH®) threshold limit value (TLV).
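
    The shift adjustment mentioned above is a standard time-weighted average. A short worked example with assumed numbers (not the study's measurements):

```python
# 8-hour time-weighted average (TWA) for 3 hours of cutting and 5 hours
# unexposed, compared against the OSHA PEL.
def twa_8h(concentrations_mg_m3, hours):
    """8-hour TWA = sum(C_i * t_i) / 8."""
    return sum(c * t for c, t in zip(concentrations_mg_m3, hours)) / 8.0

PEL = 5.0                                   # mg/m^3 (respirable dust)
exposure = twa_8h([9.0, 0.0], [3.0, 5.0])   # 9 mg/m^3 while cutting (assumed)
print(f"8-h TWA = {exposure:.2f} mg/m^3 ->",
      "below PEL" if exposure < PEL else "exceeds PEL")
```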

  13. TFE Verification Program

    SciTech Connect

    Not Available

    1993-05-01

    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE (thermionic fuel element) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and early 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern.

  14. Prazosin addition to fluvoxamine: A preclinical study and open clinical trial in OCD.

    PubMed

    Feenstra, Matthijs G P; Klompmakers, André; Figee, Martijn; Fluitman, Sjoerd; Vulink, Nienke; Westenberg, Herman G M; Denys, Damiaan

    2016-02-01

    The efficacy of selective serotonin reuptake inhibitors (SRIs) in psychiatric disorders may be "augmented" through the addition of atypical antipsychotic drugs. A synergistic increase in dopamine (DA) release in the prefrontal cortex has been suggested to underlie this augmentation effect, though the mechanism of action is not yet clear. We used in vivo microdialysis in rats to study DA release following the administration of combinations of fluvoxamine (10 mg/kg) and quetiapine (10 mg/kg) with various monoamine-related drugs. The results confirmed that the selective 5-HT1A antagonist WAY-100635 (0.05 mg/kg) partially blocked the fluvoxamine-quetiapine synergistic effect (maximum DA increase dropped from 325% to 214%). A novel finding is that the α1-adrenergic blocker prazosin (1 mg/kg), combined with fluvoxamine, partially mimicked the effect of augmentation (maximum DA increase 205%; area under the curve 163%). As this suggested that prazosin augmentation might be tested in a clinical study, we performed an open clinical trial of prazosin 20 mg added to an SRI in therapy-resistant patients with obsessive-compulsive disorder applying for neurosurgery. A small, non-significant reduction in Yale-Brown Obsessive Compulsive Scale (Y-BOCS) scores was observed in 10 patients, and one patient was classified as a responder with a reduction in Y-BOCS scores of more than 25%. We suggest that future clinical studies augmenting SRIs with an α1-adrenergic blocker in less treatment-resistant cases should be considered. The clinical trial "Prazosin in combination with a serotonin reuptake inhibitor for patients with Obsessive Compulsive disorder: an open label study" was registered on 24/05/2011 under trial number ISRCTN61562706: http://www.controlled-trials.com/ISRCTN61562706.

  15. Collaborative Localization and Location Verification in WSNs

    PubMed Central

    Miao, Chunyu; Dai, Guoyong; Ying, Kezhen; Chen, Qingzhang

    2015-01-01

    Localization is one of the most important technologies in wireless sensor networks (WSNs). A lightweight distributed node localization scheme is proposed in view of the limited computational capacity of WSNs. The proposed scheme introduces the virtual force model to determine locations by incremental refinement. Aiming at solving the drifting problem and the malicious anchor problem, a location verification algorithm based on the virtual force model is presented. In addition, an anchor promotion algorithm using the localization reliability model is proposed to re-locate drifted nodes. Extensive simulation experiments indicate that the localization algorithm has relatively high precision and the location verification algorithm has relatively high accuracy. The communication overhead of these algorithms is relatively low, and the whole set of reliable localization methods is practical as well as comprehensive. PMID:25954948
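
    The virtual force idea is easy to sketch: each anchor pulls or pushes the position estimate along their connecting line in proportion to the mismatch between the measured range and the currently estimated distance, and the estimate is refined incrementally until the net force vanishes. The code below is a generic illustration of that scheme under invented anchor positions and ranges, not the authors' algorithm.

```python
import math

def virtual_force_localize(anchors, ranges, steps=200, gain=0.1):
    """Incrementally refine a position estimate: every anchor exerts
    a spring-like 'virtual force' proportional to the residual
    between its measured range and the current estimated distance."""
    x, y = 0.0, 0.0  # initial guess
    for _ in range(steps):
        fx = fy = 0.0
        for (ax, ay), r in zip(anchors, ranges):
            dx, dy = x - ax, y - ay
            d = math.hypot(dx, dy) or 1e-9  # avoid division by zero
            residual = d - r  # positive: too far, pull toward anchor
            fx -= gain * residual * dx / d
            fy -= gain * residual * dy / d
        x, y = x + fx, y + fy
    return x, y

anchors = [(0, 0), (10, 0), (0, 10)]
true_pos = (4.0, 3.0)
ranges = [math.dist(true_pos, a) for a in anchors]
print(virtual_force_localize(anchors, ranges))  # converges near (4, 3)
```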

  16. Validation and verification of expert systems

    NASA Technical Reports Server (NTRS)

    Gilstrap, Lewey

    1991-01-01

    Validation and verification (V&V) are procedures used to evaluate system structure or behavior with respect to a set of requirements. Although expert systems are often developed as a series of prototypes without requirements, it is not possible to perform V&V on any system for which requirements have not been prepared. In addition, there are special problems associated with the evaluation of expert systems that do not arise in the evaluation of conventional systems, such as verification of the completeness and accuracy of the knowledge base. The criticality of most NASA missions makes it important to be able to certify the performance of the expert systems used to support these missions. Recommendations for the most appropriate method for integrating V&V into the Expert System Development Methodology (ESDM) and suggestions for the most suitable approaches for each stage of ESDM development are presented.

  17. A combined toxicity study of zinc oxide nanoparticles and vitamin C in food additives

    NASA Astrophysics Data System (ADS)

    Wang, Yanli; Yuan, Lulu; Yao, Chenjie; Ding, Lin; Li, Chenchen; Fang, Jie; Sui, Keke; Liu, Yuanfang; Wu, Minghong

    2014-11-01

    At present, safety evaluation standards for nanofood additives are based on the toxic effects of a single additive. Since size, surface properties and chemical nature influence the toxicity of nanomaterials, the toxicity may change dramatically when nanomaterials are used as food additives in a complex system. Herein, we investigated the combined toxicity of zinc oxide nanoparticles (ZnO NPs) and vitamin C (Vc, ascorbic acid). The results showed that Vc increased the cytotoxicity significantly compared with that of the ZnO NPs alone. When the cells were exposed to ZnO NPs at a concentration below 15 mg L⁻¹, or to Vc at a concentration below 300 mg L⁻¹, there was no significant cytotoxicity, in the case of both the gastric epithelial cell line (GES-1) and neural stem cells (NSCs). However, when 15 mg L⁻¹ of ZnO NPs and 300 mg L⁻¹ of Vc were introduced to cells together, the cell viability decreased sharply, indicating significant cytotoxicity. Moreover, a significant increase in toxicity was also shown in the in vivo experiments. The doses of ZnO NPs and Vc used in the in vivo study were calculated according to the state standard for food and nutrition enhancers. After repeated oral exposure to ZnO NPs plus Vc, injury to the liver and kidneys in mice was indicated by changes in these indices. These findings demonstrate that accounting for the synergistic toxicity presented in a complex system is essential for the toxicological evaluation and safety assessment of nanofood.

  18. EXAMINING THE ROLE AND RESEARCH CHALLENGES OF SOCIAL MEDIA AS A TOOL FOR NONPROLIFERATION AND ARMS CONTROL TREATY VERIFICATION

    SciTech Connect

    Henry, Michael J.; Cramer, Nicholas O.; Benz, Jacob M.; Gastelum, Zoe N.; Kreyling, Sean J.; West, Curtis L.

    2014-05-13

    Traditional arms control treaty verification activities typically involve a combination of technical measurements via physical and chemical sensors, state declarations, political agreements, and on-site inspections involving international subject matter experts. However, the ubiquity of the internet, and the electronic sharing of data that it enables, has made available a wealth of open source information with the potential to benefit verification efforts. Open source information is already being used by organizations such as the International Atomic Energy Agency to support the verification of state-declared information, prepare inspectors for in-field activities, and maintain situational awareness. The recent explosion in social media use has opened new doors to exploring the attitudes, moods, and activities around a given topic. Social media platforms, such as Twitter, Facebook, and YouTube, offer an opportunity for individuals, as well as institutions, to participate in a global conversation at minimal cost. Social media data can also provide a more data-rich environment, with text data being augmented with images, videos, and location data. The research described in this paper investigates the utility of applying social media signatures as potential arms control and nonproliferation treaty verification tools and technologies, as determined through a series of case studies. The treaty-relevant events that these case studies touch upon include detection of undeclared facilities or activities, determination of unknown events recorded by the International Monitoring System (IMS), and the global media response to the occurrence of an Indian missile launch. The case studies examine how social media can be used to fill an information gap and provide additional confidence to a verification activity. The case studies represent, either directly or through a proxy, instances where social media information may be available that could potentially augment the evaluation

  19. Rate Constants of Hydroperoxyl Radical Addition to Cyclic Nitrones: A DFT Study

    PubMed Central

    Villamena, Frederick A.; Merle, John K.; Hadad, Christopher M.; Zweier, Jay L.

    2008-01-01

    Nitrones are potential synthetic antioxidants for the reduction of radical-mediated oxidative damage in cells, and serve as analytical reagents for the identification of HO₂• and other such transient species. In this work, the PCM/B3LYP/6-31+G(d,p)//B3LYP/6-31G(d) and PCM/mPW1K/6-31+G(d,p) density functional theory (DFT) methods were employed to predict the reactivity of HO₂• with various functionalized nitrones as spin traps. The calculated second-order rate constants and free energies of reaction at both levels of theory were in the ranges of 10⁰-10³ M⁻¹ s⁻¹ and +1 to −12 kcal mol⁻¹, respectively, and the rate constants for some nitrones are on the same order of magnitude as those observed experimentally. The trend in HO₂• reactivity toward nitrones could not be explained solely on the basis of the relationship of the theoretical positive charge densities on the nitronyl-C with their respective ionization potentials, electron affinities, rate constants, or free energies of reaction. However, various modes of intramolecular H-bonding interaction were observed in the transition state (TS) structures of HO₂• addition to nitrones. The presence of intramolecular H-bonding interactions in the transition states was predicted and may play a significant role in facilitating the addition of HO₂• to nitrones. In general, HO₂• addition to ethoxycarbonyl- and spirolactam-substituted nitrones, as well as to nitrones without electron-withdrawing substituents, such as 5,5-dimethyl-pyrroline N-oxide (DMPO) and 5-spirocyclopentyl-pyrroline N-oxide (CPPO), is most favored compared with the methylcarbamoyl-substituted nitrones. This study suggests that the use of specific spin traps for efficient trapping of HO₂• could pave the way toward improved radical detection and antioxidant protection. PMID:17845014
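
    The connection between the computed free-energy barriers and the reported second-order rate constants is conventional transition-state theory. As a quick numerical illustration (not the paper's exact computation, which involves further standard-state details), the Eyring equation converts a barrier ΔG‡ into a rate constant k:

```python
import math

KB = 1.380649e-23    # Boltzmann constant, J/K
H = 6.62607015e-34   # Planck constant, J*s
R = 8.314462618      # gas constant, J/(mol*K)

def eyring_rate(dG_kcal: float, T: float = 298.15) -> float:
    """Eyring equation: k = (kB*T/h) * exp(-dG‡/(R*T)). With a 1 M
    standard state, a bimolecular k carries units of M^-1 s^-1."""
    dG = dG_kcal * 4184.0  # kcal/mol -> J/mol
    return (KB * T / H) * math.exp(-dG / (R * T))

# A barrier near 14 kcal/mol gives k on the order of 10^2 M^-1 s^-1,
# within the 10^0-10^3 range quoted above.
print(f"{eyring_rate(14.0):.2e}")
```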

  20. A combined toxicity study of zinc oxide nanoparticles and vitamin C in food additives.

    PubMed

    Wang, Yanli; Yuan, Lulu; Yao, Chenjie; Ding, Lin; Li, Chenchen; Fang, Jie; Sui, Keke; Liu, Yuanfang; Wu, Minghong

    2014-12-21

    At present, safety evaluation standards for nanofood additives are based on the toxic effects of a single additive. Since size, surface properties and chemical nature influence the toxicity of nanomaterials, the toxicity may change dramatically when nanomaterials are used as food additives in a complex system. Herein, we investigated the combined toxicity of zinc oxide nanoparticles (ZnO NPs) and vitamin C (Vc, ascorbic acid). The results showed that Vc increased the cytotoxicity significantly compared with that of the ZnO NPs alone. When the cells were exposed to ZnO NPs at a concentration below 15 mg L⁻¹, or to Vc at a concentration below 300 mg L⁻¹, there was no significant cytotoxicity, in the case of both the gastric epithelial cell line (GES-1) and neural stem cells (NSCs). However, when 15 mg L⁻¹ of ZnO NPs and 300 mg L⁻¹ of Vc were introduced to cells together, the cell viability decreased sharply, indicating significant cytotoxicity. Moreover, a significant increase in toxicity was also shown in the in vivo experiments. The doses of ZnO NPs and Vc used in the in vivo study were calculated according to the state standard for food and nutrition enhancers. After repeated oral exposure to ZnO NPs plus Vc, injury to the liver and kidneys in mice was indicated by changes in these indices. These findings demonstrate that accounting for the synergistic toxicity presented in a complex system is essential for the toxicological evaluation and safety assessment of nanofood.

  1. A study of pyrazines in cigarettes and how additives might be used to enhance tobacco addiction

    PubMed Central

    Alpert, Hillel R; Agaku, Israel T; Connolly, Gregory N

    2016-01-01

    Background Nicotine is known as the drug that is responsible for the addicted behaviour of tobacco users, but it has poor reinforcing effects when administered alone. Tobacco product design features enhance abuse liability by (A) optimising the dynamic delivery of nicotine to central nervous system receptors, and affecting smokers’ withdrawal symptoms, mood and behaviour; and (B) effecting conditioned learning, through sensory cues, including aroma, touch and visual stimulation, to create perceptions of pending nicotine reward. This study examines the use of additives called ‘pyrazines’, which may enhance abuse potential, their introduction in ‘lights’ and subsequently in the highly successful Marlboro Lights (Gold) cigarettes, and eventually in many major brands. Methods We searched internal tobacco industry documents in online databases, in conjunction with published scientific literature, using an iterative feedback process. Results Tobacco manufacturers developed the use of a range of compounds, including pyrazines, in order to enhance ‘light’ cigarette products’ acceptance and sales. Pyrazines with chemosensory and pharmacological effects were incorporated in the first ‘full-flavour, low-tar’ product to achieve high market success. Such additives may enhance dependence by helping to optimise nicotine delivery and dosing, and through cueing and learned behaviour. Conclusions Cigarette additives and ingredients with chemosensory effects that promote addiction by acting synergistically with nicotine, increasing product appeal, easing smoking initiation, discouraging cessation or promoting relapse should be regulated by the US Food and Drug Administration. Current models of tobacco abuse liability could be revised to include more explicit roles for non-nicotine constituents that enhance abuse potential. PMID:26063608

  2. Impact of contacting study authors to obtain additional data for systematic reviews: diagnostic accuracy studies for hepatic fibrosis

    PubMed Central

    2014-01-01

    Background Seventeen of 172 included studies in a recent systematic review of blood tests for hepatic fibrosis or cirrhosis reported diagnostic accuracy results discordant with their 2 × 2 tables, and 60 studies reported inadequate data to construct 2 × 2 tables. This study explores the yield of contacting authors of diagnostic accuracy studies and the impact on the systematic review findings. Methods Sixty-six corresponding authors were sent letters requesting additional information or clarification of data from 77 studies. Data received from the authors were synthesized with data included in the previous review, and diagnostic accuracy sensitivities, specificities, and positive and negative likelihood ratios were recalculated. Results Of the 66 authors, 68% were successfully contacted and 42% provided additional data for 29 of 77 studies (38%). All authors who provided data did so by the third emailed request (ten authors provided data after one request). Authors of more recent studies were more likely to be located and to provide data than authors of older studies. The effects of requests for additional data on the conclusions regarding the utility of blood tests to identify patients with clinically significant fibrosis or cirrhosis were generally small for ten of 12 tests. Additional data resulted in reclassification (using median likelihood ratio estimates) from less useful to moderately useful or vice versa for the remaining two blood tests, and enabled the calculation of an estimate for a third blood test for which the data had previously been insufficient. We did not identify a clear pattern in the directional impact of additional data on estimates of diagnostic accuracy. Conclusions We successfully contacted and received results from 42% of authors, who provided data for 38% of included studies. Contacting authors of studies evaluating the diagnostic accuracy of serum biomarkers for hepatic fibrosis and cirrhosis in hepatitis C patients

  3. A Mechanistic Study of Halogen Addition and Photoelimination from π-Conjugated Tellurophenes.

    PubMed

    Carrera, Elisa I; Lanterna, Anabel E; Lough, Alan J; Scaiano, Juan C; Seferos, Dwight S

    2016-03-02

    The ability to drive reactivity using visible light is of importance for many disciplines of chemistry and has significant implications for sustainable chemistry. Identifying photochemically active compounds and understanding photochemical mechanisms is important for the development of useful materials for synthesis and catalysis. Here we report a series of photoactive diphenyltellurophene compounds bearing electron-withdrawing and electron-donating substituents synthesized by alkyne coupling/ring closing or palladium-catalyzed ipso-arylation chemistry. The redox chemistry of these compounds was studied with respect to oxidative addition and photoelimination of bromine, which is of importance for energy storage reactions involving X₂. The oxidative addition reaction mechanism was studied using density functional theory, the results of which support a three-step mechanism involving the formation of an initial η¹ association complex, a monobrominated intermediate, and finally the dibrominated product. All of the tellurophene derivatives undergo photoreduction using 430, 447, or 617 nm light depending on the absorption properties of the compound. Compounds bearing electron-withdrawing substituents have the highest photochemical quantum efficiencies in the presence of an alkene trap, with efficiencies of up to 42.4% for a pentafluorophenyl-functionalized tellurophene. The photoelimination reaction was studied in detail through bromine trapping experiments and laser flash photolysis, and a mechanism is proposed. The photoreaction, which occurs by release of bromine radicals, is competitive with intersystem crossing to the triplet state of the brominated species, as evidenced by the formation of singlet oxygen. These findings should be useful for the design of new photochemically active compounds supported by main-group elements.

  4. Hydrogen atom addition to the surface of graphene nanoflakes: A density functional theory study

    NASA Astrophysics Data System (ADS)

    Tachikawa, Hiroto

    2017-02-01

    Polycyclic aromatic hydrocarbons (PAHs) provide a 2-dimensional (2D) reaction surface in 3-dimensional (3D) interstellar space and have been utilized as a model of graphene surfaces. In the present study, the reaction of PAHs with atomic hydrogen was investigated by means of density functional theory (DFT) to systematically elucidate the binding nature of atomic hydrogen on graphene nanoflakes. PAHs with n = 4-37 were chosen, where n indicates the number of benzene rings. Activation energies of hydrogen addition to the graphene surface were calculated to be 5.2-7.0 kcal/mol at the CAM-B3LYP/6-311G(d,p) level, almost constant across all PAHs. The binding energies of the hydrogen atom were slightly dependent on the size (n): 14.8-28.5 kcal/mol. The absorption spectra showed that a long tail appears in the low-energy region after hydrogen addition to the graphene surface. The electronic states of hydrogenated graphenes were discussed on the basis of the theoretical results.

  5. Resources allocation in healthcare for cancer: a case study using generalised additive mixed models.

    PubMed

    Musio, Monica; Sauleau, Erik A; Augustin, Nicole H

    2012-11-01

    Our aim is to develop a method to help re-allocate healthcare resources linked to cancer, in order to replan the allocation of providers. Ageing of the population has a considerable impact on the use of health resources, because aged people require more specialised medical care, notably due to cancer. We propose a method for monitoring changes in cancer incidence in space and time, taking into account two age categories that reflect the general organisation of healthcare. We use generalised additive mixed models with a Poisson response, following the methodology presented in Wood, Generalised additive models: an introduction with R, Chapman and Hall/CRC, 2006. Besides one-dimensional smooth functions accounting for non-linear effects of covariates, the space-time interaction can be modelled using scale-invariant smoothers. Incidence data collected by a general cancer registry between 1992 and 2007 in a specific area of France are studied. Our best model exhibits a strong increase in the incidence of cancer over time and a clear spatial pattern for people aged over 70 years, with a higher incidence in the central band of the region. This is a strong argument for re-allocating resources for the cancer care of older people in this sub-region.
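
    A hedged sketch of the modelling idea in Python: the authors fit generalised additive mixed models (in the style of R's mgcv); the version below approximates the smooth terms with regression splines in statsmodels, omits the random effects and the scale-invariant space-time smoother, and uses an entirely hypothetical data frame.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical registry data: case counts by area, year and age
# class, with the population at risk as an exposure offset.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "cases": rng.poisson(8, n),
    "year": rng.integers(1992, 2008, n),
    "x": rng.uniform(0, 100, n),        # area centroid coordinates
    "y": rng.uniform(0, 100, n),
    "age70": rng.integers(0, 2, n),     # 1 = aged over 70
    "pop": rng.integers(500, 5000, n),  # population at risk
})

# Poisson model with spline smooths of time and space and a
# log-population offset: a spline-based stand-in for the GAMM.
model = smf.glm(
    "cases ~ bs(year, df=4) + bs(x, df=4) + bs(y, df=4) + age70",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["pop"]),
).fit()
print(model.summary())
```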

  6. Covalent binding of aniline to humic substances. 2. ¹⁵N NMR studies of nucleophilic addition reactions

    USGS Publications Warehouse

    Thorn, K.A.; Pettigrew, P.J.; Goldenberg, W.S.; Weber, E.J.

    1996-01-01

    Aromatic amines are known to undergo covalent binding with humic substances in the environment. Although previous studies have examined reaction conditions and proposed mechanisms, there has been no direct spectroscopic evidence for the covalent binding of the amines to the functional groups in humic substances. In order to further elucidate the reaction mechanisms, the Suwannee River and IHSS soil fulvic and humic acids were reacted with ¹⁵N-labeled aniline at pH 6 and analyzed using ¹⁵N NMR spectrometry. Aniline underwent nucleophilic addition reactions with the quinone and other carbonyl groups in the samples and became incorporated in the form of anilinohydroquinone, anilinoquinone, anilide, imine, and heterocyclic nitrogen, the latter comprising 50% or more of the bound amine. The anilide and anilinohydroquinone nitrogens were determined to be susceptible to chemical exchange by ammonia. In the case of Suwannee River fulvic acid, reaction under anoxic conditions and pretreatment with sodium borohydride or hydroxylamine prior to reaction under oxic conditions resulted in a decrease in the proportion of anilinohydroquinone nitrogen incorporated. The relative decrease in the incorporation of anilinohydroquinone nitrogen with respect to anilinoquinone nitrogen under anoxic conditions suggested that inter- or intramolecular redox reactions accompanied the nucleophilic addition reactions.

  7. Toxicogenomics concepts and applications to study hepatic effects of food additives and chemicals

    SciTech Connect

    Stierum, Rob (E-mail: stierum@voeding.tno.nl); Heijne, Wilbert; Kienhuis, Anne; Ommen, Ben van; Groten, John

    2005-09-01

    Transcriptomics, proteomics and metabolomics are genomics technologies with great potential in toxicological sciences. Toxicogenomics involves the integration of conventional toxicological examinations with gene, protein or metabolite expression profiles. An overview together with selected examples of the possibilities of genomics in toxicology is given. The expectations raised by toxicogenomics are earlier and more sensitive detection of toxicity. Furthermore, toxicogenomics will provide a better understanding of the mechanism of toxicity and may facilitate the prediction of toxicity of unknown compounds. Mechanism-based markers of toxicity can be discovered and improved interspecies and in vitro-in vivo extrapolations will drive model developments in toxicology. Toxicological assessment of chemical mixtures will benefit from the new molecular biological tools. In our laboratory, toxicogenomics is predominantly applied for elucidation of mechanisms of action and discovery of novel pathway-supported mechanism-based markers of liver toxicity. In addition, we aim to integrate transcriptome, proteome and metabolome data, supported by bioinformatics to develop a systems biology approach for toxicology. Transcriptomics and proteomics studies on bromobenzene-mediated hepatotoxicity in the rat are discussed. Finally, an example is shown in which gene expression profiling together with conventional biochemistry led to the discovery of novel markers for the hepatic effects of the food additives butylated hydroxytoluene, curcumin, propyl gallate and thiabendazole.

  8. Study on Friction and Wear Properties of Silver Matrix Brush Material with Different Additives

    NASA Astrophysics Data System (ADS)

    Chen, Xiaoli; Wang, Wenfang; Hong, Yu; Wu, Yucheng

    2013-07-01

    The friction and wear processes of AgCuX composite (X = graphite (G), carbon fiber (CF) or AlN)-CuAgV alloy friction pairs, and the effects of different additive contents in the silver-based composites on friction and wear behavior, are studied in this paper. The microstructure of the brush wear surface is observed by SEM. The results show that when the graphite content reaches 9 wt.%, the Ag-Cu-CF-G composite exhibits the best wear properties; when the aluminum nitride content reaches 0.5 wt.%, the Ag-Cu-AlN-G composite has the best overall performance. The wear loss of both composites rises with increasing pressure and speed, but when the speed reaches a critical value, the growth in wear loss levels off.

  9. Ribotyping as an additional molecular marker for studying Neisseria meningitidis serogroup B epidemic strains.

    PubMed Central

    Tondella, M L; Sacchi, C T; Neves, B C

    1994-01-01

    The molecular method of ribotyping was used as an additional epidemiological marker to study the epidemic strains of Neisseria meningitidis serogroup B, referred to as the ET-5 complex, responsible for the epidemic which occurred in greater São Paulo, Brazil. Ribotyping analysis of these strains showed only a single rRNA gene restriction pattern (Rb1), obtained with ClaI restriction enzyme. This method, as well as multilocus enzyme electrophoresis, provided useful information about the clonal characteristics of the N. meningitidis serogroup B strains isolated during this epidemic. The N. meningitidis serogroup B isolates obtained from epidemics which occurred in Norway, Chile, and Cuba also demonstrated the same pattern (Rb1). Ribotyping was a procedure which could be applied to a large number of isolates and was felt to be appropriate for routine use in laboratories, especially because of the convenience of using nonradioactive probes. PMID:7852566

  10. Spectroscopic studies of nucleic acid additions during seed-mediated growth of gold nanoparticles

    PubMed Central

    Tapp, Maeling; Sullivan, Rick; Dennis, Patrick; Naik, Rajesh R.

    2015-01-01

    The effect of adding nucleic acids to gold seeds during the growth stage of either nanospheres or nanorods was investigated using UV-Vis spectroscopy to reveal any oligonucleotide base- or structure-specific effects on nanoparticle growth kinetics or plasmonic signatures. Spectral data indicate that the presence of DNA duplexes during seed ageing drastically accelerated nanosphere growth, while the addition of single-stranded polyadenine at any point during seed ageing induces nanosphere aggregation. For seeds added to a gold nanorod growth solution, single-stranded polythymine induces a modest blue-shift in the longitudinal peak wavelength. Moreover, a particular sequence comprised of 50% thymine bases was found to induce a faster, more dramatic blue-shift in the longitudinal peak wavelength than any of the homopolymer incubation cases. Monomeric forms of the nucleic acids, however, do not yield discernible spectral differences in any of the gold suspensions studied. PMID:25960601

  11. Modular system for studying tonal sound excitation in resonators with heat addition and mean flow.

    PubMed

    Matveev, Konstantin I; Hernandez, Rafael

    2012-03-01

    An educational experimental system has been developed for studying tonal sound generation in acoustic resonators. Tones are excited by either heat addition or vortex shedding in the presence of mean flow. The system construction is straightforward and inexpensive. Several test arrangements and experimental data are described in this paper. The experimental setups include a modified Rijke tube, a standing-wave thermoacoustic engine, a baffled tube with mean flow, and an acoustic energy harvester with a piezoelement. Simplified mathematical models for interpreting data are discussed, and references are provided to literature with more advanced analyses. The developed system can assist both graduate and undergraduate students in understanding acoustic instabilities via conducting and analyzing interesting experiments.

  12. Genetic Assessment of Additional Endophenotypes from the Consortium on the Genetics of Schizophrenia Family Study

    PubMed Central

    Greenwood, Tiffany A.; Lazzeroni, Laura C.; Calkins, Monica E.; Freedman, Robert; Green, Michael F.; Gur, Raquel E.; Gur, Ruben C.; Light, Gregory A.; Nuechterlein, Keith H.; Olincy, Ann; Radant, Allen D.; Seidman, Larry J.; Siever, Larry J.; Silverman, Jeremy M.; Stone, William S.; Sugar, Catherine A.; Swerdlow, Neal R.; Tsuang, Debby W.; Tsuang, Ming T.; Turetsky, Bruce I.; Braff, David L.

    2015-01-01

    The Consortium on the Genetics of Schizophrenia Family Study (COGS-1) has previously reported our efforts to characterize the genetic architecture of 12 primary endophenotypes for schizophrenia. We now report the characterization of 13 additional measures derived from the same endophenotype test paradigms in the COGS-1 families. Nine of the measures were found to discriminate between schizophrenia patients and controls, were significantly heritable (31 to 62%), and were sufficiently independent of previously assessed endophenotypes, demonstrating utility as additional endophenotypes. Genotyping via a custom array of 1536 SNPs from 94 candidate genes identified associations for CTNNA2, ERBB4, GRID1, GRID2, GRIK3, GRIK4, GRIN2B, NOS1AP, NRG1, and RELN across multiple endophenotypes. An experiment-wide p value of 0.003 suggested that the associations across all SNPs and endophenotypes collectively exceeded chance. Linkage analyses performed using a genome-wide SNP array further identified significant or suggestive linkage for six of the candidate endophenotypes, with several genes of interest located beneath the linkage peaks (e.g., CSMD1, DISC1, DLGAP2, GRIK2, GRIN3A, and SLC6A3). While the partial convergence of the association and linkage likely reflects differences in density of gene coverage provided by the distinct genotyping platforms, it is also likely an indication of the differential contribution of rare and common variants for some genes and methodological differences in detection ability. Still, many of the genes implicated by COGS through endophenotypes have been identified by independent studies of common, rare, and de novo variation in schizophrenia, all converging on a functional genetic network related to glutamatergic neurotransmission that warrants further investigation. PMID:26597662

  13. Mössbauer spectroscopy: an excellent additional tool for the study of magnetic soils and sediments

    NASA Astrophysics Data System (ADS)

    Vandenberghe, R. E.; Hus, J. J.; de Grave, E.

    2009-04-01

    Since the discovery of resonant gamma absorption, known as the Mössbauer effect, half a century ago, the spectroscopic method derived from it (MS) has proven to be a very suitable tool for the characterization of soil and rock minerals. From the conventional absorption spectra of iron-containing compounds, so-called hyperfine parameters are derived which are more or less characteristic of each kind of mineral. MS therefore has a certain analytical power for the characterization of iron-bearing minerals. This is especially true for magnetic minerals, for which the spectrum contains an additional hyperfine parameter. Moreover, MS also allows information to be retrieved about the magnetic structure and behavior. Because the relative areas of the spectra are to some extent proportional to the number of iron atoms in each environment, MS yields quantitative information not only about the various minerals present but also about the iron in the different crystallographic sites. The power of MS as an excellent additional tool for the study of magnetic soils and sediments was well demonstrated in the joint research with Jozef Hus (CPG-IRM, Dourbes). In our joint work, the emphasis was mainly on the study of Chinese loess and soils. Using MS on magnetically separated samples, the various magnetic species in a loess and its associated soil were for the first time discerned in a direct way. Further, magnetically enriched samples of four different loess/paleosol couplets from a loess sequence in Huangling were systematically investigated by MS. From the qualitative and quantitative information obtained, the neoformation of magnetite/maghemite in the soils, responsible for the observed increase in remanence and susceptibility, could be demonstrated.

  14. Spin-probe ESR and molecular modeling studies on calcium carbonate dispersions in overbased detergent additives.

    PubMed

    Montanari, Luciano; Frigerio, Francesco

    2010-08-15

    Oil-soluble calcium carbonate colloids are used as detergent additives in lubricating oils. They are colloidal dispersions of calcium carbonate particles stabilized by different surfactants; in this study, alkyl-aryl-sulfonates and sulfurized alkyl-phenates, widely used in the synthesis of these additives, are considered. The physical properties of the surfactant layers surrounding the surfaces of calcium carbonate particles were analyzed by using nitroxide spin-probes (stable free radicals) and observing the corresponding ESR spectra. The spin-probe molecules contain polar groups which tend to tether them to the polar surface of the carbonate particles. They can reach these surfaces only if the surfactant layers are not very compact; hence the relative amount of spin-probe molecules accessing carbonate surfaces is an index of the compactness of the surfactant core. The ESR signals of spin-probe molecules dissolved in oil or "locked" near the carbonate surfaces are different because of the different molecular mobility. Through deconvolution of the ESR spectra, the fraction of spin-probes penetrating the surfactant shells was calculated, and differences were observed according to the surfactant molecular structures. Moreover, by using specially labeled spin-probes based on stearic acids, functionalized at different separations from the carboxylic acid group, it was possible to interrogate the physical behavior of the surfactant shells at different distances from the carbonate surfaces. Molecular modeling was applied to generate three-dimensional micellar models of the stabilized carbonate particles with different molecular structures of the surfactant. The diffusion of spin-probe molecules into the surfactant shells was studied by applying a starting force to push the molecules towards the carbonate surfaces and then observing the ensuing behavior. The simulations are in accordance with the ESR data and show that the geometrical

  15. Standardized verification of fuel cycle modeling

    SciTech Connect

    Feng, B.; Dixon, B.; Sunny, E.; Cuadra, A.; Jacobson, J.; Brown, N. R.; Powers, J.; Worrall, A.; Passerini, S.; Gregg, R.

    2016-04-05

    A nuclear fuel cycle systems modeling and code-to-code comparison effort was coordinated across multiple national laboratories to verify the tools needed to perform fuel cycle analyses of the transition from a once-through nuclear fuel cycle to a sustainable potential future fuel cycle. For this verification study, a simplified example transition scenario was developed to serve as a test case for the four systems codes involved (DYMOND, VISION, ORION, and MARKAL), each used by a different laboratory participant. In addition, all participants produced spreadsheet solutions for the test case to check all the mass flows and reactor/facility profiles on a year-by-year basis throughout the simulation period. The test case specifications describe a transition from the current US fleet of light water reactors to a future fleet of sodium-cooled fast reactors that continuously recycle transuranic elements as fuel. After several initial coordinated modeling and calculation attempts, it was revealed that most of the differences in code results were not due to different code algorithms or calculation approaches, but due to different interpretations of the input specifications among the analysts. Therefore, the specifications for the test case itself were iteratively updated to remove ambiguity and to help calibrate interpretations. In addition, a few corrections and modifications were made to the codes as well, which led to excellent agreement between all codes and spreadsheets for this test case. Although no fuel cycle transition analysis codes matched the spreadsheet results exactly, all remaining differences in the results were due to fundamental differences in code structure and/or were thoroughly explained. As a result, the specifications and example results are provided so that they can be used to verify additional codes in the future for such fuel cycle transition scenarios.
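
    The spreadsheet cross-check described here is essentially a year-by-year comparison of each code's mass flows against a reference solution within an agreed tolerance. A minimal sketch of such a check, with invented flow vectors rather than the benchmark's actual numbers:

```python
import numpy as np

def compare_mass_flows(reference, code_results, rel_tol=0.01):
    """Flag years where a code's annual mass flow deviates from the
    reference spreadsheet solution by more than rel_tol."""
    years = np.arange(2015, 2015 + len(reference))
    for name, flows in code_results.items():
        rel_err = np.abs(flows - reference) / np.maximum(reference, 1e-12)
        bad = years[rel_err > rel_tol]
        verdict = "OK" if bad.size == 0 else f"check years {bad.tolist()}"
        print(f"{name}: max relative error {rel_err.max():.3%} -> {verdict}")

# Invented example: reference solution vs. two codes' outputs
# (e.g., tonnes of used fuel sent to recycling per year).
ref = np.array([100.0, 105.0, 110.0, 116.0, 122.0])
compare_mass_flows(ref, {
    "CodeA": np.array([100.0, 105.2, 110.1, 116.1, 122.3]),
    "CodeB": np.array([100.0, 104.0, 112.5, 116.0, 121.0]),
})
```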

  16. Standardized verification of fuel cycle modeling

    DOE PAGES

    Feng, B.; Dixon, B.; Sunny, E.; ...

    2016-04-05

    A nuclear fuel cycle systems modeling and code-to-code comparison effort was coordinated across multiple national laboratories to verify the tools needed to perform fuel cycle analyses of the transition from a once-through nuclear fuel cycle to a sustainable potential future fuel cycle. For this verification study, a simplified example transition scenario was developed to serve as a test case for the four systems codes involved (DYMOND, VISION, ORION, and MARKAL), each used by a different laboratory participant. In addition, all participants produced spreadsheet solutions for the test case to check all the mass flows and reactor/facility profiles on a year-by-year basis throughout the simulation period. The test case specifications describe a transition from the current US fleet of light water reactors to a future fleet of sodium-cooled fast reactors that continuously recycle transuranic elements as fuel. After several initial coordinated modeling and calculation attempts, it was revealed that most of the differences in code results were not due to different code algorithms or calculation approaches, but due to different interpretations of the input specifications among the analysts. Therefore, the specifications for the test case itself were iteratively updated to remove ambiguity and to help calibrate interpretations. In addition, a few corrections and modifications were made to the codes as well, which led to excellent agreement between all codes and spreadsheets for this test case. Although no fuel cycle transition analysis codes matched the spreadsheet results exactly, all remaining differences in the results were due to fundamental differences in code structure and/or were thoroughly explained. As a result, the specifications and example results are provided so that they can be used to verify additional codes in the future for such fuel cycle transition scenarios.

  17. Verification of Legal Knowledge-base with Conflictive Concept

    NASA Astrophysics Data System (ADS)

    Hagiwara, Shingo; Tojo, Satoshi

    In this paper, we propose a verification methodology for large-scale legal knowledge. When a legal code is revised, other affected code must also be revised to keep the law consistent. Thus, our task is to revise the affected area properly and to investigate its adequacy. In this study, we extend the notion of inconsistency beyond ordinary logical inconsistency to include conceptual conflicts. We obtain these conflicts from taxonomy data, and thus can avoid tedious manual declaration of opposing words. In the verification process, we adopt extended disjunctive logic programming (EDLP) to tolerate multiple consequences for a given set of antecedents. In addition, we employ abductive logic programming (ALP), regarding the situations to which the rules are applied as premises. Also, we restrict the legal knowledge-base to an acyclic program to avoid circular definitions and to justify the relevance of verdicts; detecting cyclic parts of the legal knowledge is therefore one of our objectives. The system is composed of two subsystems: a preprocessor implemented in Ruby, to facilitate string manipulation, and a verifier in Prolog, to exert the logical inference. We also employ the XML format in the system to retain readability. In this study, we verify actual ordinance code of Toyama prefecture and present the experimental results.

  18. PET-based dose delivery verification in proton therapy: a GATE based simulation study of five PET system designs in clinical conditions.

    PubMed

    Robert, Charlotte; Fourrier, Nicolas; Sarrut, David; Stute, Simon; Gueth, Pierre; Grevillot, Loïc; Buvat, Irène

    2013-10-07

    PET is a promising technique for in vivo treatment verification in hadrontherapy. Three main PET geometries dedicated to in-beam treatment monitoring have been proposed in the literature: the dual-head PET geometry, the OpenPET geometry and the slanted-closed ring geometry. The aim of this work is to characterize the performance of two of these dedicated PET detectors in realistic clinical conditions. Several configurations of the dual-head PET and OpenPET systems were simulated using GATE v6.2. For the dual-head configuration, two aperture angles (15° and 45°) were studied. For the OpenPET system, two gaps between rings were investigated (110 and 160 mm). A full-ring PET system was also simulated as a reference. After preliminary evaluation of the sensitivity and spatial resolution using a Derenzo phantom, a real small-field head and neck treatment plan was simulated, with and without introducing patient displacements. No wash-out was taken into account. 3D maps of the annihilation photon locations were deduced from the PET data acquired right after the treatment session (5 min acquisition) using a dedicated OS-EM reconstruction algorithm. Detection sensitivity at the center of the field-of-view (FOV) varied from 5.2% (45° dual-head system) to 7.0% (full-ring PET). The dual-head systems had a more uniform efficiency within the FOV than the OpenPET systems. The spatial resolution strongly depended on the location within the FOV for the ϕ = 45° dual-head system and for the two OpenPET systems. All investigated architectures identified the magnitude of the mispositionings introduced in the simulations to within 1.5 mm. The variability in the estimated mispositionings was less than 2 mm for all PET systems.

  19. PET-based dose delivery verification in proton therapy: a GATE based simulation study of five PET system designs in clinical conditions

    NASA Astrophysics Data System (ADS)

    Robert, Charlotte; Fourrier, Nicolas; Sarrut, David; Stute, Simon; Gueth, Pierre; Grevillot, Loïc; Buvat, Irène

    2013-10-01

    PET is a promising technique for in vivo treatment verification in hadrontherapy. Three main PET geometries dedicated to in-beam treatment monitoring have been proposed in the literature: the dual-head PET geometry, the OpenPET geometry and the slanted-closed ring geometry. The aim of this work is to characterize the performance of two of these dedicated PET detectors in realistic clinical conditions. Several configurations of the dual-head PET and OpenPET systems were simulated using GATE v6.2. For the dual-head configuration, two aperture angles (15° and 45°) were studied. For the OpenPET system, two gaps between rings were investigated (110 and 160 mm). A full-ring PET system was also simulated as a reference. After preliminary evaluation of the sensitivity and spatial resolution using a Derenzo phantom, a real small-field head and neck treatment plan was simulated, with and without introducing patient displacements. No wash-out was taken into account. 3D maps of the annihilation photon locations were deduced from the PET data acquired right after the treatment session (5 min acquisition) using a dedicated OS-EM reconstruction algorithm. Detection sensitivity at the center of the field-of-view (FOV) varied from 5.2% (45° dual-head system) to 7.0% (full-ring PET). The dual-head systems had a more uniform efficiency within the FOV than the OpenPET systems. The spatial resolution strongly depended on the location within the FOV for the ϕ = 45° dual-head system and for the two OpenPET systems. All investigated architectures identified the magnitude of the mispositionings introduced in the simulations to within 1.5 mm. The variability in the estimated mispositionings was less than 2 mm for all PET systems.

  20. Verification of VENTSAR

    SciTech Connect

    Simpkins, A.A.

    1995-01-01

    The VENTSAR code is an upgraded and improved version of the VENTX code, which estimates concentrations on or near a building from a release at a nearby location. The code calculates the concentrations either for a given meteorological exceedance probability or for a given stability and wind speed combination. A single building lying in the path of the plume can be modeled, and a penthouse can be added to the top of the building. Plume rise may also be considered. Release types can be either chemical or radioactive. Downwind concentrations are determined at user-specified incremental distances. This verification report was prepared to demonstrate that VENTSAR properly executes all algorithms and transfers data. Hand calculations were also performed to ensure proper application of the methodologies.
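
    As a rough illustration of the kind of hand calculation used to verify such a code, the textbook Gaussian plume formula gives the ground-level centerline concentration downwind of a ground-level release. The power-law dispersion coefficients below are simplified stand-ins for a chosen stability class, and none of this is VENTSAR's actual building-wake model.

```python
import math

def plume_concentration(Q, u, x, y=0.0):
    """Ground-level Gaussian plume: chi = Q / (pi * sy * sz * u)
    * exp(-y^2 / (2 * sy^2)), for release rate Q (g/s), wind speed
    u (m/s), downwind distance x (m) and crosswind offset y (m)."""
    sy = 0.08 * x ** 0.9   # horizontal dispersion, m (illustrative)
    sz = 0.06 * x ** 0.85  # vertical dispersion, m (illustrative)
    return Q / (math.pi * sy * sz * u) * math.exp(-y**2 / (2 * sy**2))

# 1 g/s release in a 2 m/s wind: centerline concentrations downwind.
for x in (50, 100, 200, 400):
    print(f"x = {x:4d} m: {plume_concentration(1.0, 2.0, x):.2e} g/m^3")
```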

  1. Shift Verification and Validation

    SciTech Connect

    Pandya, Tara M.; Evans, Thomas M.; Davidson, Gregory G; Johnson, Seth R.; Godfrey, Andrew T.

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and to results from other Monte Carlo radiation transport codes, and found very good agreement across a variety of comparison measures. These include prediction of the critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation, we are confident that Shift can provide reference results for CASL benchmarking.

  2. Food additive carrageenan: Part II: A critical review of carrageenan in vivo safety studies.

    PubMed

    Weiner, Myra L

    2014-03-01

    Carrageenan (CGN) is a seaweed-derived high molecular weight (Mw) hydrocolloid, primarily used as a stabilizer and thickener in food. The safety of CGN regarding its use in food is reviewed. Based on experimental studies in animals, ingested CGN is excreted quantitatively in the feces. Studies have shown that CGN is not significantly degraded by low gastric pH or microflora in the gastrointestinal (GI) tract. Due to its Mw, structure and its stability when bound to protein, CGN is not significantly absorbed or metabolized. CGN also does not significantly affect the absorption of nutrients. Subchronic and chronic feeding studies in rodents indicate that CGN at doses up to 5% in the diet does not induce any toxicological effects other than soft stools or diarrhea, which is a common effect of non-digestible high molecular weight compounds. Review of several studies across numerous species indicates that food-grade CGN does not produce intestinal ulceration at doses up to 5% in the diet. Effects of CGN on the immune system following parenteral administration are well known, but not relevant to food additive uses. The majority of the studies evaluating immunotoxicity potential were conducted with CGN administered in drinking water or by oral gavage, where CGN exists in a random, open-structured molecular conformation, particularly the lambda form; hence, it has more exposure to the intestinal mucosa than when bound to protein in food. Based on the many animal subchronic and chronic toxicity studies, CGN has not been found to affect the immune system, as judged by the lack of effects on organ histopathology, clinical chemistry, hematology and general health, and by the lack of target organ toxicities. In these studies, animals consumed CGN at levels orders of magnitude above those in the human diet: ≥1000 mg/kg/d in animals compared to an estimated 18-40 mg/kg/d in humans. Dietary CGN has been shown to lack carcinogenic, tumor promoter, genotoxic, developmental, and

  3. Analytical and experimental studies of ventilation systems subjected to simulated tornado conditions: Verification of the TVENT computer code

    SciTech Connect

    Martin, R.A.; Gregory, W.S.; Ricketts, C.I.; Smith, P.R.; Littleton, P.E.; Talbott, D.V.

    1988-04-01

    Analytical and experimental studies of ventilation systems have been conducted to verify the Los Alamos National Laboratory TVENT accident analysis computer code for simulated tornado conditions. This code was developed to be a user-friendly analysis tool for designers and regulatory personnel and was designed to predict pressure and flow transients in arbitrary ventilation systems. The experimental studies used two relatively simple, yet sensitive, physical systems designed using similitude analysis. These physical models were instrumented end-to-end for pressure and volumetric flow rate and then subjected to the worst credible tornado conditions using a special blowdown apparatus. We verified TVENT by showing that it successfully predicted our experimental results. By comparing experimental results from both physical models with TVENT results, we showed that we have derived the proper similitude relations (governed by compressibility effects) for all sizes of ventilation systems. As a by-product of our studies, we determined the need for fan speed variation modeling in TVENT. This modification was made and resulted in a significant improvement in our comparisons of analytical and experimental results.

  4. Dosimetric Study and Verification of Total Body Irradiation Using Helical Tomotherapy and its Comparison to Extended SSD Technique

    SciTech Connect

    Zhuang, Audrey H.; Liu, An; Schultheiss, Timothy E.; Wong, Jeffrey Y.C.

    2010-01-01

    The American College of Radiology practice guideline for total body irradiation (TBI) requires a back-up treatment delivery system. This study investigates the development of helical tomotherapy (HT) for delivering TBI and compares it with the conventional extended source-to-surface distance (X-SSD) technique. Four patients' head-to-thigh computed tomographic images were used in this study, with the target defined as the body volume excluding the left and right lungs. HT treatment plans with the standard TBI prescription (1.2 Gy/fraction, 10 fractions) were generated and verified on phantoms. To compare HT plans with X-SSD treatment, the dose distribution of the X-SSD technique was simulated using the Eclipse software. The average dose received by 90% of the target volume was 12.3 Gy (range, 12.2-12.4 Gy) for HT plans and 10.3 Gy (range, 10.08-10.58 Gy) for X-SSD plans (p < 0.001). The left and right lung median doses were 5.44 Gy and 5.40 Gy, respectively, for HT plans and 8.34 Gy and 8.95 Gy, respectively, for X-SSD treatment. The treatment planning time was comparable between the two methods. The beam delivery time of HT treatment was longer than that of X-SSD treatment. In conclusion, HT-based TBI plans have better dose coverage of the target and better dose sparing of the lungs compared with the X-SSD technique, which relies on dose compensators, lung blocks, and electron boosts. This study demonstrates that delivering TBI with HT is possible. Clinical validation of the feasibility of this approach would be of interest in the future.

  5. Simulation environment based on the Universal Verification Methodology

    NASA Astrophysics Data System (ADS)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The CDV flow differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan that sets out the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to the device under test (DUT). Progress is measured by coverage monitors added to the simulation environment, so that non-exercised functionality can be identified. Moreover, additional scoreboards flag undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which successfully implemented the new workflow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.
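
    The CDV loop itself is language-agnostic: constrained-random stimulus is generated, a self-checking scoreboard compares the DUT against a reference model, and coverage bins record which scenarios have actually been exercised. The sketch below is a plain-Python analogue of that loop, with a trivial invented DUT (an 8-bit saturating adder); it illustrates the CDV idea, not UVM or SystemVerilog itself.

```python
import random

def dut_saturating_add(a, b, width=8):
    """Stand-in 'device under test': width-bit saturating adder."""
    return min(a + b, (1 << width) - 1)

def reference_model(a, b, width=8):
    """Golden model used by the self-checking scoreboard."""
    return min(a + b, (1 << width) - 1)

coverage_bins = {"no_overflow": 0, "overflow": 0, "max_operand": 0}
random.seed(1)

for _ in range(1000):
    # Constrained-random stimulus: legal 8-bit operands.
    a, b = random.randrange(256), random.randrange(256)

    # Scoreboard: compare DUT output against the reference model.
    assert dut_saturating_add(a, b) == reference_model(a, b)

    # Coverage monitors: record which scenarios were exercised.
    coverage_bins["overflow" if a + b > 255 else "no_overflow"] += 1
    if 255 in (a, b):
        coverage_bins["max_operand"] += 1

# Empty bins would reveal non-exercised functionality.
print(coverage_bins)
```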

  6. The impact of bismuth addition to sequential treatment on Helicobacter pylori eradication: A pilot study.

    PubMed

    Basyigit, Sebahat; Kefeli, Ayse; Sapmaz, Ferdane; Yeniova, Abdullah Ozgür; Asilturk, Zeliha; Hokkaomeroglu, Murat; Uzman, Metin; Nazligul, Yasar

    2015-10-25

    The success of current anti-Helicobacter pylori (H. pylori) treatment protocols is reported to be decreasing over the years, and research is needed to strengthen H. pylori eradication treatment. Sequential treatment (ST), one of the treatment modalities for H. pylori eradication, comprises amoxicillin 1 g b.i.d. and a proton pump inhibitor b.i.d. for the first 5 days, followed by clarithromycin 500 mg b.i.d., metronidazole 500 mg b.i.d. and a proton pump inhibitor b.i.d. for the remaining 5 days. In this study, we investigated the efficacy and tolerability of adding bismuth to ST. We included patients who underwent upper gastrointestinal endoscopy in which H. pylori infection was diagnosed by histological examination of antral and corporal gastric mucosa biopsies. Participants were randomly administered ST or bismuth-containing ST (BST) protocols as first-line H. pylori eradication therapy. Participants were tested for eradication success by urea breath test 6 weeks after the completion of treatment. One hundred and fifty patients (93 female, 57 male) were enrolled. There were no significant differences in eradication rates, either in the intention-to-treat population (70.2%, 95% confidence interval [CI]: 66.3-74.1% vs. 71.8%, 95% CI: 61.8-81.7%, for ST and BST, respectively, p>0.05) or in the per-protocol population (74.6%, 95% CI: 63.2-85.8% vs. 73.7%, 95% CI: 63.9-83.5% for ST and BST, respectively, p>0.05). Despite the undeniable effect of bismuth, there may be several possible reasons for the unsatisfactory eradication success. Drug administration time, coadministration of other drugs, and possible H. pylori resistance to bismuth may all affect eradication success. The addition of bismuth subcitrate to the ST regimen does not provide a significant increase in eradication rates.

  7. Experimental Study of Disruption of Columnar Grains During Rapid Solidification in Additive Manufacturing

    NASA Astrophysics Data System (ADS)

    Manogharan, Guha; Yelamanchi, Bharat; Aman, Ronald; Mahbooba, Zaynab

    2016-03-01

    Over the years, many studies have analyzed the grain structures of metal alloys produced by additive manufacturing, with the aim of improving mechanical properties. In particular, columnar grains are observed predominantly during rapid solidification of molten metal. This leads to lower mechanical properties and requires expensive secondary heat-treatment processes. This study aims to disrupt the formation of columnar grains during rapid solidification using ultrasonic vibration, and analyzes the effects on grain structure and mechanical properties. A gas-metal arc welder mounted on a RepRap-based low-cost metal 3D printer was used to deposit ER70S-6 mild steel layers on a plate. A contact-type ultrasonic transducer with a control system to vary the frequency and power of the vibration was used. The effects of ultrasonic vibration were determined from statistical analysis of microstructure and micro-indentation measurements on the deposited layer and heat-affected zone. It was found that both frequency and the interaction between frequency and power had a significant impact, refining the average grain size by up to 10.64% and increasing the number of grains by approximately 41.78%. Analysis of micro-indentation tests showed an increase of approximately 14.30% in micro-hardness due to the applied frequency during rapid solidification. A pole diagram shows that application of vibration randomizes grain orientation. Along with the results from this study, further efforts in modeling and experimentation with multi-directional vibrations would lead to a better understanding of disrupting columnar grains in applications that use mechanical vibrations, such as welding, directed energy deposition, brazing, etc.

  8. Percutaneous Dorsal Instrumentation of Vertebral Burst Fractures: Value of Additional Percutaneous Intravertebral Reposition—Cadaver Study

    PubMed Central

    Krüger, Antonio; Schmuck, Maya; Noriega, David C.; Ruchholtz, Steffen; Baroud, Gamal; Oberkircher, Ludwig

    2015-01-01

    Purpose. The treatment of vertebral burst fractures is still controversial. The aim of this study is to evaluate the value of additional percutaneous intravertebral reduction when combined with dorsal instrumentation. Methods. In this biomechanical cadaver study, twenty-eight spine segments (T11-L3) were used (male donors, mean age 64.9 ± 6.5 years). Burst fractures of L1 were generated using a standardised protocol. After fracture, all spines were allocated to four similar groups and randomised according to surgical technique (posterior instrumentation; posterior instrumentation + intravertebral reduction device + cement augmentation; posterior instrumentation + intravertebral reduction device without cement; and intravertebral reduction device + cement augmentation). After treatment, 100,000 cycles (100–600 N, 3 Hz) were applied using a servohydraulic loading frame. Results. Overall anatomical restoration was better in all groups where the intravertebral reduction device was used (p < 0.05); in particular, it was possible to restore the central endplates (p > 0.05). All techniques decreased narrowing of the spinal canal. After loading, clearance was maintained in all groups fitted with the intravertebral reduction device, whereas narrowing increased in the group treated with dorsal instrumentation alone. Conclusions. For height and anatomical restoration, the combination of an intravertebral reduction device with dorsal instrumentation showed significantly better results than dorsal instrumentation alone. PMID:26137481

  9. Evaluating the addition of positive reinforcement for learning a frightening task: a pilot study with horses.

    PubMed

    Heleski, Camie; Bauson, Laura; Bello, Nora

    2008-01-01

    Horse training often relies upon negative reinforcement (NR). This study tested the hypothesis that adding positive reinforcement (PR) to NR would enhance learning in horses (n = 34) being taught to walk over a tarp (a novel, typically frightening task). Subjects were Arabians, and the same person handled all of them. Half were handled "traditionally" (NR only): the halter/lead was pulled; when the horse stepped forward, the pressure was released; the process was repeated until the criterion was met (the horse crossed the tarp with little or no obvious anxiety). The other half were handled in the same way, but with the addition of PR, food plus verbal praise (NR + PR). Subjects "failed" the task if they refused to walk onto the tarp after 10 min. Nine horses failed; 6 of the 9 failures were from the NR-only group, with no significant difference detected (p = .41). The study detected no difference in time to first crossing of the tarp (p = .30) or total time to achieve the calmness criterion (p = .67). Overall, adding PR did not significantly enhance learning of this task. However, there were practical implications: adding PR made the task safer and less fatiguing for the handler.

  10. Synthesis, Characterization, Molecular Modeling, and DNA Interaction Studies of Copper Complex Containing Food Additive Carmoisine Dye.

    PubMed

    Shahabadi, Nahid; Akbari, Alireza; Jamshidbeigi, Mina; Khodarahmi, Reza

    2016-06-02

    A copper complex of carmoisine dye, [Cu(carmoisine)2(H2O)2], was synthesized and characterized by physico-chemical and spectroscopic methods. The binding of this complex with calf thymus (ct) DNA was investigated by circular dichroism, absorption studies, emission spectroscopy, and viscosity measurements. UV-vis results confirmed that the Cu complex interacted with DNA to form a ground-state complex, and the observed binding constant (2 × 10^4 M^-1) is more in keeping with groove binding to DNA. Furthermore, the viscosity measurements showed that addition of the complex causes no significant change in DNA viscosity, indicating that an intercalation mode is ruled out. The thermodynamic parameters were calculated by the van't Hoff equation and demonstrated that hydrogen bonds and van der Waals interactions played major roles in the reaction. The circular dichroism (CD) results suggested that the complex can change the conformation of DNA from the B-like form toward an A-like conformation. Cytotoxicity studies of the carmoisine dye and its copper complex indicated that both had anticancer effects on the HT-29 (colon cancer) cell line and may be new candidates for the treatment of colon cancer.
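
    The thermodynamic step in such studies is a two-point van't Hoff analysis, sketched below; negative ΔH and ΔS are the classic signature of hydrogen bonding and van der Waals contacts. The abstract reports only the binding constant at one temperature (about 2 × 10^4 M^-1), so the second constant below is a hypothetical input for illustration.

        import math

        R = 8.314  # J mol^-1 K^-1

        def vant_hoff(K1, T1, K2, T2):
            """Estimate ΔH and ΔS from binding constants at two temperatures,
            assuming ΔH is constant over the interval (two-point van't Hoff):
            ln K = -ΔH/(R*T) + ΔS/R."""
            dH = -R * math.log(K2 / K1) / (1 / T2 - 1 / T1)
            dS = R * math.log(K1) + dH / T1
            return dH, dS

        # K1 from the abstract; K2 is hypothetical (a weaker constant at higher T).
        dH, dS = vant_hoff(K1=2.0e4, T1=298.15, K2=1.5e4, T2=310.15)
        dG = dH - 298.15 * dS
        print(f"ΔH = {dH/1000:.1f} kJ/mol, ΔS = {dS:.1f} J/(mol·K), "
              f"ΔG(298 K) = {dG/1000:.1f} kJ/mol")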

  11. Density functional theory study of the effects of alloying additions on sulfur adsorption on nickel surfaces

    NASA Astrophysics Data System (ADS)

    Malyi, Oleksandr I.; Chen, Zhong; Kulish, Vadym V.; Bai, Kewu; Wu, Ping

    2013-01-01

    Reactions of hydrogen sulfide (H2S) with nickel/yttria-doped zirconia (Ni/YDZ) anode materials may degrade the performance of solid oxide fuel cells when S-containing fuels are used. In this paper, we employ density functional theory to investigate S adsorption on metal (M)-doped and undoped Ni(0 0 1) and Ni(1 1 1) surfaces. Based on the calculations, we analyze the effects of 12 alloying additions (Ag, Au, Al, Bi, Cd, Co, Cu, Fe, Sn, Sb, V, and Zn) on the temperature of transition between clean (S atoms do not adsorb on the surfaces) and contaminated (S atoms adsorb on the surfaces spontaneously) M-doped Ni surfaces for different concentrations of H2S in the fuel. The predicted results are consistent with many experimental studies of S poisoning of both Ni/YDZ and M-doped Ni/YDZ anode materials. This study is important for understanding S poisoning phenomena and for developing new S-tolerant anode materials.

  12. An fMRI study of magnitude comparison and exact addition in children.

    PubMed

    Meintjes, Ernesta M; Jacobson, Sandra W; Molteno, Christopher D; Gatenby, J Christopher; Warton, Christopher; Cannistraci, Christopher J; Gore, John C; Jacobson, Joseph L

    2010-04-01

    In contrast to the adult literature, in which a consistent parietofrontal network for number processing has been identified, the data from studies of number processing in children have been less consistent, probably due to differences in study design and control conditions. Number processing was examined using functional magnetic resonance imaging in 18 right-handed children (8-12 years) from the Cape Coloured community in Cape Town, South Africa, using Proximity Judgment and Exact Addition (EA) tasks. The findings were consistent with the hypothesis that, as in adults, the anterior horizontal intraparietal sulcus (HIPS) plays a major role in the representation and manipulation of quantity in children. The posterior medial frontal cortex, believed to be involved in performance monitoring in more complex arithmetic manipulations in adults, was extensively activated even for relatively simple symbolic number processing in the children. Other areas activated to a greater degree in the children included the left precentral sulcus, which may mediate number knowledge and, for EA, the head of the caudate nucleus, which is part of a fronto-subcortical circuit involved in the behavioral execution of sequences. Two regions that have been linked to number processing in adults - the angular gyrus and posterior superior parietal lobule - were not activated in the children. The data are consistent with the inference that although the functional specialization of the anterior HIPS may increase as symbolic number processing becomes increasingly automatic, this region and other elements of the parietofrontal network identified in adults are already reliably and robustly activated by middle childhood.

  13. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    Title 40 (Protection of Environment), Vol. 34, revised as of 2012-07-01. § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  14. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    Title 40 (Protection of Environment), Vol. 33, revised as of 2011-07-01. § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  15. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Title 40 (Protection of Environment), Vol. 32, revised as of 2010-07-01. § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  16. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    Title 40 (Protection of Environment), Vol. 33, revised as of 2014-07-01. § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  17. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    Title 40 (Protection of Environment), Vol. 34, revised as of 2013-07-01. § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  18. Improved fluid dynamics similarity, analysis and verification. Part 5: Analytical and experimental studies of thermal stratification phenomena

    NASA Technical Reports Server (NTRS)

    Winter, E. R. F.; Schoenhals, R. J.; Haug, R. I.; Libby, T. L.; Nelson, R. N.; Stevenson, W. H.

    1968-01-01

    The stratification behavior of a contained fluid subjected to transient free convection heat transfer was studied. A rectangular vessel was employed with heat transfer from two opposite walls of the vessel to the fluid. The wall temperature was increased suddenly to initiate the process and was then maintained constant throughout the transient stratification period. Thermocouples were positioned on a post at the center of the vessel. They were adjusted so that temperatures could be measured at the fluid surface and at specific depths beneath the surface. The predicted values of the surface temperature and the stratified layer thickness were found to agree reasonably well with the experimental measurements. The experiments also provided information on the transient centerline temperature distribution and the transient flow distribution.

  19. Integrated guidance, navigation and control verification plan primary flight system. [space shuttle avionics integration

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The verification process and requirements for the ascent guidance interfaces and the ascent integrated guidance, navigation and control system for the space shuttle orbiter are defined as well as portions of supporting systems which directly interface with the system. The ascent phase of verification covers the normal and ATO ascent through the final OMS-2 circularization burn (all of OPS-1), the AOA ascent through the OMS-1 burn, and the RTLS ascent through ET separation (all of MM 601). In addition, OPS translation verification is defined. Verification trees and roadmaps are given.

  20. Automating engineering verification in ALMA subsystems

    NASA Astrophysics Data System (ADS)

    Ortiz, José; Castillo, Jorge

    2014-08-01

    The Atacama Large Millimeter/submillimeter Array is an interferometer comprising 66 individual high-precision antennas located at over 5000 meters altitude in the north of Chile. Several complex electronic subsystems need to be meticulously tested at different stages of an antenna's commissioning, both independently and when integrated together. The first subsystem integration takes place at the Operations Support Facilities (OSF), at an altitude of 3000 meters. The second integration occurs at the high-altitude Array Operations Site (AOS), where combined performance with the Central Local Oscillator (CLO) and Correlator is also assessed. In addition, several other events require complete or partial verification of compliance with instrument specifications, such as parts replacement, calibration, relocation within the AOS, preventive maintenance, and troubleshooting due to poor performance in scientific observations. Restricted engineering time allocation and the constant pressure to minimize downtime in a 24/7 astronomical observatory impose the need to complete (and report) the aforementioned verifications in the least possible time. Array-wide disturbances, such as global power interruptions and the subsequent recovery, add the challenge of executing this checkout on multiple antenna elements at once. This paper presents the outcome of automating the setup, execution, notification, and reporting of engineering verification in ALMA, and shows how these efforts have resulted in a dramatic reduction of both the time and the operator training required. The Signal Path Connectivity (SPC) checkout is introduced as a notable case of such automation.

  1. Theoretical study of ignition reactions of linear symmetrical monoethers as potential diesel fuel additives: DFT calculations

    NASA Astrophysics Data System (ADS)

    Marrouni, Karim El; Abou-Rachid, Hakima; Kaliaguine, Serge

    This work investigates the chemical reactivity of four linear symmetrical monoethers with molecular oxygen. Such oxygenated compounds may be considered potential diesel fuel additives for reducing the ignition delay in diesel engines. For this purpose, a kinetic study is proposed to clarify the relation between the molecular structure of the fuel molecule and its ignition properties. To this end, DFT calculations were performed for these reactions using B3LYP/6-311G(d,p) and BH&HLYP/6-311G(d,p) to determine the structures, energies, and vibrational frequencies of the stationary points, as well as the activated complexes involved in each gas-phase combustion initiation reaction of the monoethers CH3OCH3, C2H5OC2H5, C3H7OC3H7, and C4H9OC4H9 with molecular oxygen. This theoretical kinetic study was carried out using the electronic structure results and transition state theory to assess the rate constants for all studied combustion reactions. As shown in our previous work [Abou-Rachid et al., J Mol Struct (Theochem) 2003, 621, 293], the cetane number (CN) of a pure organic molecule depends on the initiation rate of its homogeneous gas-phase reaction with molecular oxygen. Indeed, the calculated initiation rate constants of the H-abstraction process of linear monoethers with O2 show a very good correlation with the experimental CN data of these pure compounds at T = 1000 K. This temperature is representative of the operating conditions of a diesel engine.
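
    Rate constants in transition state theory come from the Eyring equation, k(T) = (kB·T/h)·exp(-ΔG‡/RT). The sketch below evaluates it at the quoted engine-relevant temperature of 1000 K; the barrier values are hypothetical inputs, not the values computed in the paper.

        import math

        kB, h, R = 1.380649e-23, 6.62607015e-34, 8.314  # SI units

        def eyring_rate(dG_ddag_kcal, T):
            """Transition-state-theory rate constant k(T) = (kB*T/h) * exp(-ΔG‡/(R*T))."""
            dG = dG_ddag_kcal * 4184.0  # kcal/mol -> J/mol
            return (kB * T / h) * math.exp(-dG / (R * T))

        # Hypothetical free-energy barriers for an H-abstraction by O2 at 1000 K.
        for dG in (35.0, 40.0, 45.0):
            print(f"ΔG‡ = {dG} kcal/mol -> k(1000 K) = {eyring_rate(dG, 1000.0):.3e} s^-1")

    The exponential dependence on the barrier is why modest structural changes in the ether can shift the initiation rate, and hence the cetane number correlation, by orders of magnitude.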

  2. Experimental study of enhanced heat transfer by addition of CuO nanoparticle

    NASA Astrophysics Data System (ADS)

    Jesumathy, Stella; Udayakumar, M.; Suresh, S.

    2012-06-01

    An energy storage system was designed to study the thermal characteristics of paraffin wax with embedded nano-sized copper oxide (CuO) particles. This paper presents studies of the phase transition times, heat fraction, and heat transfer characteristics of paraffin wax as a phase change material (PCM) embedded with CuO nanoparticles. CuO particles of 40 nm mean size, at 2, 5, and 10% by weight, were dispersed in the PCM for this study. Experiments were performed on a heat exchanger with 1.5-10 l/min of heat transfer fluid (HTF) flow. Time-based variations of the temperature distribution were obtained from the melting and solidification curves. The results strongly suggest that the thermal conductivity is enhanced by 6, 6.7, and 7.8% in the liquid state, and the dynamic viscosity by 5, 14, and 30%, with increasing mass fraction of the CuO nanoparticles. The thermal conductivity ratio of the composites can be augmented by a factor of up to 1.3. The heat transfer coefficient during solidification increased by about 78% at the maximum flow rate. The analysis of the experimental results reveals that the addition of copper oxide nanoparticles to paraffin wax very effectively enhances both conduction and natural convection in the composites and in the paraffin wax. Paraffin-wax-based composites with an optimal fraction of copper oxide nanoparticles have great potential for energy storage applications such as industrial waste heat recovery, solar thermal applications, and solar-based dynamic space power generation.

  3. Real time bolt preload monitoring using piezoceramic transducers and time reversal technique—a numerical study with experimental verification

    NASA Astrophysics Data System (ADS)

    Parvasi, Seyed Mohammad; Ho, Siu Chun Michael; Kong, Qingzhao; Mousavi, Reza; Song, Gangbing

    2016-08-01

    Bolted joints are ubiquitous structural elements and form critical connections in mechanical and civil structures. Loosened bolted joints may lead to catastrophic failures of these structures, which has inspired growing interest in monitoring bolted joints. A novel energy-based wave method is proposed in this study to monitor the axial load of bolted joint connections. In this method, the time reversal technique is used to focus the energy of a piezoelectric (PZT)-generated ultrasound wave from one side of the interface, to be measured as a signal peak by another PZT transducer on the other side of the interface. A tightness index (TI) is defined and used to correlate the peak amplitude to the bolt axial load. The TI bypasses the need for the more complex signal processing required by other energy-based methods. A coupled electro-mechanical analysis with an elasto-plastic finite element method was used to simulate and analyze PZT-based ultrasonic wave propagation through the interface of two steel plates connected by a single nut-and-bolt connection. Numerical results, backed by experimental results from testing a bolted connection between two steel plates, revealed that the peak amplitude of the focused signal increases as the bolt preload (torque level) increases, owing to the enlarging true contact area of the steel plates. The amplitude of the focused peak saturates, and the TI reaches unity, as the bolt axial load reaches a threshold value.
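
    The essence of the method is that re-emitting a time-reversed impulse response back through the same channel produces the channel's autocorrelation, whose focused peak grows with the transmitted energy (here, with the true contact area). A toy numerical sketch follows; the channel model and the contact-area scaling are illustrative assumptions, not the paper's finite element model.

        import numpy as np

        rng = np.random.default_rng(1)
        # Fixed multipath profile standing in for the bolted-interface channel.
        taps = rng.standard_normal(64) * np.exp(-np.arange(64) / 10.0)

        def focused_peak(contact_fraction):
            """Peak of the time-reversal focused signal for a given contact area."""
            h = contact_fraction * taps        # more preload -> stronger transmission
            focused = np.convolve(h[::-1], h)  # re-emission of the reversed response
            return np.max(np.abs(focused))

        p_sat = focused_peak(1.0)              # saturated (fully tightened) reference
        for c in (0.2, 0.5, 0.8, 1.0):
            print(f"contact fraction {c:.1f} -> tightness index {focused_peak(c) / p_sat:.2f}")

    Normalizing the focused peak by its saturated value gives an index in [0, 1], mirroring the paper's TI reaching unity once the axial load passes the threshold.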

  4. A feasibility study of a molecular-based patient setup verification method using a parallel-plane PET system

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Satoshi; Ishikawa, Masayori; Bengua, Gerard; Sutherland, Kenneth; Nishio, Teiji; Tanabe, Satoshi; Miyamoto, Naoki; Suzuki, Ryusuke; Shirato, Hiroki

    2011-02-01

    A feasibility study of a novel PET-based molecular image guided radiation therapy (m-IGRT) system was conducted by comparing PET-based digitally reconstructed planar image (PDRI) registration with radiographic registration. We selected a pair of opposing parallel-plane PET systems for the practical implementation of this system. Planar images along the in-plane and cross-plane directions were reconstructed from the parallel-plane PET data. The in-plane and cross-plane FWHM of the profile of 2 mm diameter sources was approximately 1.8 and 8.1 mm, respectively. Therefore, only the reconstructed in-plane image from the parallel-plane PET data was used in the PDRI registration. In the image registration, five different sizes of 18F cylindrical sources (diameter: 8, 12, 16, 24, 32 mm) were used to determine setup errors. The data acquisition times were 1, 3 and 5 min. Image registration was performed by five observers to determine the setup errors from PDRI registration and radiographic registration. The majority of the mean registration errors obtained from the PDRI registration were not significantly different from those obtained from the radiographic registration. Acquisition time did not appear to result in significant differences in the mean registration error. The mean registration error for the PDRI registration was found to be 0.93 ± 0.33 mm. This is not statistically different from the radiographic registration which had a mean registration error of 0.92 ± 0.27 mm. Our results suggest that m-IGRT image registration using PET-based reconstructed planar images along the in-plane direction is feasible for clinical use if PDRI registration is performed at two orthogonal gantry angles.

  5. A simulation study of a dual-plate in-room PET system for dose verification in carbon ion therapy

    NASA Astrophysics Data System (ADS)

    Chen, Ze; Hu, Zheng-Guo; Chen, Jin-Da; Zhang, Xiu-Ling; Guo, Zhong-Yan; Xiao, Guo-Qing; Sun, Zhi-Yu; Huang, Wen-Xue; Wang, Jian-Song

    2014-08-01

    During carbon ion therapy, many positron emitters such as 11C, 15O, and 10C are generated in the irradiated tissues by nuclear reactions and can be used to track the carbon beam in the tissue with a positron emission tomography (PET) scanner. In this study, a dual-plate in-room PET scanner was designed and evaluated on the GATE simulation platform for monitoring patient dose in carbon ion therapy. The dual-plate PET is designed to avoid interference with the carbon beamline and with patient positioning. Its performance was compared with that of four-head and full-ring PET scanners. The dual-plate, four-head, and full-ring PET scanners consisted of 30, 60, and 60 detector modules, respectively, with a 36 cm distance between directly opposing detector modules for dose deposition measurements. Each detector module consisted of a 24×24 array of 2 mm×2 mm×18 mm LYSO pixels coupled to a Hamamatsu H8500 PMT. To estimate the production yield of positron emitters, a 10 cm×15 cm×15 cm cuboid PMMA phantom was irradiated with 172, 200, and 250 MeV/u 12C beams. 3D images of the activity distribution measured by the three types of scanner were produced by an iterative reconstruction algorithm. Comparison of the longitudinal profiles of positron emitters along the carbon beam path indicates that the dual-plate PET scanner is feasible for monitoring the dose distribution in carbon ion therapy.

  6. Simplifying EPID dosimetry for IMRT treatment verification

    SciTech Connect

    Pecharroman-Gallego, R.; Mans, Anton; Sonke, Jan-Jakob; Stroom, Joep C.; Olaciregui-Ruiz, Igor; Herk, Marcel van; Mijnheer, Ben J.

    2011-02-15

    Purpose: Electronic portal imaging devices (EPIDs) are increasingly used for IMRT dose verification, both pretreatment and in vivo. In this study, an earlier developed backprojection model has been modified to avoid the need for patient-specific transmission measurements and, consequently, leads to a faster procedure. Methods: Currently, the transmission, an essential ingredient of the backprojection model, is estimated from the ratio of EPID measurements with and without a phantom/patient in the beam. Thus, an additional irradiation to obtain "open images" under the same conditions as the actual phantom/patient irradiation is required. However, by calculating the transmission of the phantom/patient in the direction of the beam instead of using open images, this extra measurement can be avoided. This was achieved by using a model that includes the effect of beam hardening and the off-axis dependence of the EPID response on photon beam spectral changes. The parameters of the model were obtained empirically from EPID measurements using polystyrene slab phantoms of different thickness in 6, 10, and 18 MV photon beams. A theoretical analysis was performed to verify the sensitivity of the model to patient thickness changes. The new model was then applied to the analysis of EPID dose verification measurements of step-and-shoot IMRT treatments of head and neck, lung, breast, cervix, prostate, and rectum patients. All measurements were carried out using Elekta SL20i linear accelerators equipped with a hydrogenated amorphous silicon EPID, and the IMRT plans were made using PINNACLE software (Philips Medical Systems). Results: The results showed generally good agreement with the dose determined using the old model with the measured transmission. The average differences between EPID-based in vivo dose at the isocenter determined using the new transmission model and using the measured transmission were 2.6 ± 3.1%, 0.2 ± 3.1%, and 2.2 ± 3.9% for 47 patients…
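
    A minimal sketch of the idea of replacing a measured transmission with a calculated one: attenuate the primary beam along the ray through the patient, and let the effective attenuation coefficient drift with depth to mimic beam hardening. The coefficients below are illustrative placeholders, not the empirically fitted parameters of the paper's model (which also includes off-axis response effects).

        import numpy as np

        def transmission(thickness_cm, mu_eff=0.05, hardening=-0.005):
            """Primary transmission through a water-equivalent thickness (cm).

            A naive model is T = exp(-mu_eff * t); the linear depth term crudely
            mimics beam hardening (the beam penetrates more easily with depth).
            """
            t = np.asarray(thickness_cm, dtype=float)
            mu = mu_eff * (1.0 + hardening * t)  # depth-dependent effective mu
            return np.exp(-mu * t)

        for t in (5, 10, 20, 30):
            print(f"t = {t:2d} cm -> T = {transmission(t):.3f}")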

  7. Beyond the Call of Duty: A Qualitative Study of Teachers' Additional Responsibilities Related to Sexuality Education

    ERIC Educational Resources Information Center

    Eisenberg, Marla E.; Madsen, Nikki; Oliphant, Jennifer A.; Resnick, Michael

    2011-01-01

    Seven focus groups were conducted with sexuality educators in Minnesota to explore ways that teaching sexuality education differs from teaching other health education content and to determine if additional supports or resources are needed for sexuality educators. Teachers described many specific additional responsibilities or concerns related to…

  8. Additional Language Teaching within the International Baccalaureate Primary Years Programme: A Comparative Study

    ERIC Educational Resources Information Center

    Lebreton, Marlène

    2014-01-01

    The International Baccalaureate Primary Years Programme supports the learning of languages and cultures, but the role of the additional language within this programme is often unclear. There remains a great variability in schools regarding the frequency of lessons and the way that the additional language is taught within the Primary Years…

  9. Measurement and modeling of exposure to selected air toxics for health effects studies and verification by biomarkers.

    PubMed

    Harrison, Roy M; Delgado-Saborit, Juana Maria; Baker, Stephen J; Aquilina, Noel; Meddings, Claire; Harrad, Stuart; Matthews, Ian; Vardoulakis, Sotiris; Anderson, H Ross

    2009-06-01

    …with the adsorbent resins Tenax GR and Carbotrap, and separate tubes for the collection of 1,3-butadiene were packed with Carbopack B and Carbosieve S-III. After sampling, the tubes were analyzed by means of a thermal desorber interfaced with a gas chromatograph-mass spectrometer (GC-MS). Particle-phase PAHs collected onto a quartz-fiber filter were extracted with solvent, purified, and concentrated before being analyzed with a GC-MS. Urinary biomarkers were analyzed by liquid chromatography-tandem mass spectrometry (LC-MS-MS). Both the environmental concentrations and personal exposure concentrations measured in this study are lower than those in the majority of earlier published work, which is consistent with the reported application of abatement measures to the control of air toxics emissions. The environmental concentration data clearly demonstrate the influence of traffic sources and meteorologic conditions leading to higher air toxics concentrations in the winter and during peak-traffic hours. The seasonal effect was also observed in indoor environments, where indoor sources add to the effects of the previously identified outdoor sources. The variability of personal exposure concentrations of VOCs and PAHs mainly reflects the range of activities the subjects engaged in during the five-day period of sampling. A number of generic factors have been identified that influence personal exposure to VOCs, such as the presence of an integral garage (attached to the home), exposure to environmental tobacco smoke (ETS), use of solvents, and commuting. In the case of the medium- and high-molecular-weight PAHs, traffic and ETS are important contributors to personal exposure. Personal exposure concentrations generally exceed home indoor concentrations, which in turn exceed outdoor concentrations. The home microenvironment is the dominant individual contributor to personal exposure. However, for those subjects with particularly high personal exposures, activities…

  10. Meiofaunal and bacterial community response to diesel additions in a microcosm study.

    PubMed

    Lindgren, J Fredrik; Hassellöv, Ida-Maja; Dahllöf, Ingela

    2012-03-01

    Effects of a low-PAH diesel were studied in a 60-day microcosm experiment at PAH concentrations of 130, 1,300 and 13,000 μg/kg sediment. Nutrient fluxes, potential nitrification, and meiofaunal community composition were analysed at three time points. Changed ∑NOx fluxes indicated reduced sediment nitrification in the Medium and High treatments over time, in agreement with lowered potential nitrification rates in all treatments. The reduction in silicate and phosphate fluxes over time suggested severe effects on the activity of the meiofauna. Reduced activity increased the anoxic sediment layer, which could have contributed to the changed ∑NOx fluxes. There were significant differences in meiofaunal community composition after 30 and 60 days in the Medium and High treatments. The changes were due to increasing numbers of harpacticoids and the foraminiferan group Rotaliina, as well as decreasing numbers of nematodes and the foraminiferan group Reophax. In spite of the low PAH level, small additions of this diesel can still have pronounced effects on meiofaunal and bacterial communities.

  11. Experimental study of combustion characteristics of nanoscale metal and metal oxide additives in biofuel (ethanol)

    PubMed Central

    2011-01-01

    An experimental investigation of the combustion behavior of nano-aluminum (n-Al) and nano-aluminum oxide (n-Al2O3) particles stably suspended in biofuel (ethanol) as a secondary energy carrier was conducted. The heat of combustion (HoC) was studied using a modified static bomb calorimeter system. Combustion element composition and surface morphology were evaluated using a SEM/EDS system. N-Al and n-Al2O3 particles of 50 and 36 nm diameter, respectively, were utilized in this investigation. Combustion experiments were performed with volume fractions of 1, 3, 5, 7, and 10% for n-Al, and 0.5, 1, 3, and 5% for n-Al2O3. The results indicate that the amount of heat released from ethanol combustion increases almost linearly with n-Al concentration. N-Al volume fractions of 1 and 3% did not show enhancement of the average volumetric HoC, but the higher volume fractions of 5, 7, and 10% increased the volumetric HoC by 5.82, 8.65, and 15.31%, respectively. N-Al2O3 and heavily passivated n-Al additives did not participate reactively in combustion, and there was no contribution from Al2O3 to the HoC in the tests. A combustion model using the Chemical Equilibrium with Applications (CEA) code was also run and shown to be in good agreement with the experimental results. PMID:21711760
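
    As an order-of-magnitude check on why the volumetric heat of combustion rises with n-Al loading, a volume-weighted ideal-mixing estimate is sketched below using approximate textbook property values. It assumes complete aluminum combustion and therefore overestimates the measured enhancement, consistent with the paper's observation that passivated n-Al participates only partially.

        # Approximate textbook values (assumptions, not data from the study):
        RHO_ETH, HOC_ETH = 789.0, 26.8e6   # kg/m^3, J/kg (ethanol, lower heating value)
        RHO_AL,  HOC_AL  = 2700.0, 31.0e6  # kg/m^3, J/kg (aluminum)

        def blend_hoc_vol(phi_al):
            """Ideal volumetric heat of combustion (J/m^3) for Al volume fraction phi_al."""
            return phi_al * RHO_AL * HOC_AL + (1 - phi_al) * RHO_ETH * HOC_ETH

        base = blend_hoc_vol(0.0)
        for phi in (0.01, 0.03, 0.05, 0.10):
            gain = 100 * (blend_hoc_vol(phi) / base - 1)
            print(f"phi_Al = {phi:.2f}: +{gain:.1f}% volumetric HoC (ideal upper bound)")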

  12. A theoretical study of wave dispersion and thermal conduction for HMX/additive interfaces

    NASA Astrophysics Data System (ADS)

    Long, Yao; Chen, Jun

    2014-04-01

    The wave dispersion rule for non-uniform materials is useful for ultrasonic inspection and engine life prediction, and is also key to understanding the energy dissipation and thermal conduction properties of solid materials. On the basis of linear response theory and molecular dynamics, we derive a set of formulas for calculating the wave dispersion rate of interface systems and study four kinds of interfaces inside plastic bonded explosives: HMX/{HMX, TATB, F2312, F2313} (HMX: octahydro-1,3,5,7-tetranitro-1,3,5,7-tetrazocine; TATB: 1,3,5-triamino-2,4,6-trinitrobenzene; F2312, F2313: fluoropolymers). The wave dispersion rate is obtained over a wide frequency range, from kHz to PHz. We find that at low frequency the rate is proportional to the square of the frequency, while at high frequency the rate couples with the molecular vibration modes at the interface. Using these results, the thermal conductivities of HMX/additive interfaces are derived, and a physical model is built to describe the total thermal conductivity of mixture explosives, including HMX multi-particle systems and {TATB, F2312, F2313}-coated HMX.

  13. Tackling missing data in community health studies using additive LS-SVM classifier.

    PubMed

    Wang, Guanjin; Deng, Zhaohong; Choi, Kup-Sze

    2016-12-01

    Missing data is a common issue in community health and epidemiological studies. Direct removal of samples with missing data can lead to reduced sample size and information bias, which deteriorates the significance of the results. While data imputation methods are available to deal with missing data, they are limited in performance and can introduce noise into the dataset. Instead of data imputation, a novel method based on the additive least squares support vector machine (LS-SVM) is proposed in this paper for predictive modeling when the input features of the model contain missing data. The method also simultaneously determines the influence of the features with missing values on the classification accuracy, using a fast leave-one-out cross-validation strategy. The performance of the method is evaluated by applying it to predict the quality of life (QOL) of elderly people using health data collected in the community. The dataset involves demographics, socioeconomic status, health history, and the outcomes of health assessments of 444 community-dwelling elderly people, with 5% to 60% of the data missing in some of the input features. QOL is measured using a standard questionnaire of the World Health Organization. Results show that the proposed method outperforms four conventional methods for handling missing data (case deletion, feature deletion, mean imputation, and K-nearest-neighbor imputation), with the average QOL prediction accuracy reaching 0.7418. It is potentially a promising technique for tackling missing data in community health research and other applications.
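
    For reference, a plain RBF-kernel LS-SVM classifier (not the paper's additive variant, which decomposes the kernel per feature to accommodate missing values) reduces training to a single linear solve of the KKT system. The sketch below uses the regression-form system with ±1 targets on synthetic data, since the QOL dataset is not public; all names and parameters are illustrative.

        import numpy as np

        def lssvm_train(X, y, gamma=1.0, sigma=1.0):
            """Train an LS-SVM classifier (labels in {-1, +1}) by solving the
            linear system [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
            n = X.shape[0]
            d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
            K = np.exp(-d2 / (2 * sigma**2))   # RBF kernel matrix
            A = np.zeros((n + 1, n + 1))
            A[0, 1:], A[1:, 0] = 1.0, 1.0
            A[1:, 1:] = K + np.eye(n) / gamma
            sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
            return sol[0], sol[1:]             # bias b, dual weights alpha

        def lssvm_predict(X_train, b, alpha, X_new, sigma=1.0):
            d2 = ((X_new[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
            return np.sign(np.exp(-d2 / (2 * sigma**2)) @ alpha + b)

        # Toy demo on two Gaussian blobs standing in for the health data.
        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(-1, 1, (50, 3)), rng.normal(+1, 1, (50, 3))])
        y = np.r_[-np.ones(50), np.ones(50)]
        b, alpha = lssvm_train(X, y)
        print("training accuracy:", (lssvm_predict(X, b, alpha, X) == y).mean())

    Because training is one linear solve, leave-one-out cross-validation can be computed cheaply from the inverse of the system matrix, which is the property the paper's fast feature-influence analysis exploits.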

  14. Generic interpreters and microprocessor verification

    NASA Technical Reports Server (NTRS)

    Windley, Phillip J.

    1990-01-01

    The following topics are covered in viewgraph form: (1) generic interpreters; (2) Viper microprocessors; (3) microprocessor verification; (4) determining correctness; (5) hierarchical decomposition; (6) interpreter theory; (7) AVM-1; (8) phase-level specification; and (9) future work.

  15. 76 FR 60112 - Consent Based Social Security Number Verification (CBSV) Service

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-28

    SOCIAL SECURITY ADMINISTRATION. Consent Based Social Security Number Verification (CBSV) Service. AGENCY: Social Security Administration. … In addition to the benefit of providing high-volume, centralized SSN verification services to the business community in a secure manner, CBSV provides us with cost and workload management benefits. …

  16. Using Small-Step Refinement for Algorithm Verification in Computer Science Education

    ERIC Educational Resources Information Center

    Simic, Danijela

    2015-01-01

    Stepwise program refinement techniques can be used to simplify program verification. Programs are better understood since their main properties are clearly stated, and verification of rather complex algorithms is reduced to proving simple statements connecting successive program specifications. Additionally, it is easy to analyse similar…

  17. Validation, Verification and Certification of Embedded Systems

    DTIC Science & Technology

    2005-10-01

    Walkthroughs (pluralistic, cognitive); usability tests; thinking aloud; measured performance; field usability testing; follow-up studies. … Cognitive walkthrough is a technique for evaluating user interfaces by analysing the mental processes required of users. Like heuristic evaluation, the results are based on the judgement of the cognitive walkthrough analyst.

  18. Case Study: Test Results of a Tool and Method for In-Flight, Adaptive Control System Verification on a NASA F-15 Flight Research Aircraft

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen A.; Schumann, Johann; Guenther, Kurt; Bosworth, John

    2006-01-01

    Adaptive control technologies that incorporate learning algorithms have been proposed to enable autonomous flight control and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments [1-2]. At the present time, however, it is unknown how adaptive algorithms can be routinely verified, validated, and certified for use in safety-critical applications. Rigorous methods for adaptive software verification and validation must be developed to ensure that the control software functions as required and is highly safe and reliable. A large gap appears to exist between the point at which control system designers feel the verification process is complete and the point at which FAA certification officials agree it is complete. Certification of adaptive flight control software is complicated by the use of learning algorithms (e.g., neural networks) and degrees of system non-determinism. Of course, analytical efforts must be made in the verification process to place guarantees on learning algorithm stability, rate of convergence, and convergence accuracy. However, to satisfy FAA certification requirements, it must be demonstrated that the adaptive flight control system is also able to fail and still allow the aircraft to be flown safely or to land, while at the same time providing a means of crew notification of the (impending) failure. It was for this purpose that the NASA Ames Confidence Tool was developed [3]. This paper presents the Confidence Tool as a means of providing in-flight software assurance monitoring of an adaptive flight control system. The paper presents the data obtained from flight testing the tool on a specially modified F-15 aircraft designed to simulate loss of flight control surfaces.

  19. Does Addition of Propolis to Glass Ionomer Cement Alter its Physicomechanical Properties? An In Vitro Study.

    PubMed

    Subramaniam, P; Girish Babu, K L; Neeraja, G; Pillai, S

    Propolis is a natural resinous substance produced by honey bees. The antimicrobial effects of glass ionomer cement have been shown to improve with the addition of propolis; however, its effect on the physicomechanical properties of the cement is not known.

  20. Comparative Studies of Cathodically-Promoted and Base-Catalyzed Michael Addition Reactions of Levoglucosenone.

    PubMed

    Samet, Alexander V.; Niyazymbetov, Murat E.; Semenov, Victor V.; Laikhter, Andrei L.; Evans, Dennis H.

    1996-12-13

    Regioselective Michael addition of nitro and heterocyclic compounds to levoglucosenone, 1, is effectively catalyzed by amines and also by cathodic electrolysis. In comparison to the base-catalyzed reaction, it was found that under electrochemical conditions the reaction proceeds under milder conditions and with higher yields. Cathodically-initiated Michael addition of thiols to levoglucosenone using small currents produces the previously unknown threo addition product in several instances. The normal erythro isomer, identified as the kinetic product, tends to be formed when large currents are used. In contrast, slow, low current electrolyses promote equilibration of the two forms so that erythro can be converted to threo by the retro reaction and readdition. Addition of 2-naphthalenethiol to (R)-(+)-apoverbenone is also reported.

  1. Study of metal whiskers growth and mitigation technique using additive manufacturing

    NASA Astrophysics Data System (ADS)

    Gullapalli, Vikranth

    For years, the alloy of choice for electroplating electronic components has been tin-lead (Sn-Pb). However, legislation established in Europe on July 1, 2006, required significant reductions of lead (Pb) content in electronic hardware due to its toxic nature. A popular alternative for coating electronic components is pure tin (Sn). However, pure tin has the tendency to spontaneously grow electrically conductive Sn whiskers during storage. A Sn whisker is usually pure single-crystal tin with a filament or hair-like structure grown directly from the electroplated surface. Sn whiskers are highly conductive and can cause short circuits in electronic components, a very significant reliability problem. Damage caused by Sn whisker growth has been reported in critical applications such as aircraft, spacecraft, satellites, and military weapons systems. Whiskers are also naturally very strong and are believed to grow from compressive stresses developed in the Sn coating during deposition or over time. The new directive, even though environmentally friendly, has placed all lead-free electronic devices at risk of whisker growth in pure tin. Additionally, there is interest in studying the nature of other metal whiskers, such as zinc (Zn) whiskers, and comparing their behavior to that of Sn whiskers. Zn whiskers can be found in the flooring of data centers; they can get inside electronic systems during equipment reorganization and movement and can also cause system failures. Although metal whiskers have been recognized as a reliability failure mode for several decades, to date there is no method that successfully eliminates their growth. This thesis gives further insight into the nature and behavior of Sn and Zn whisker growth, and recommends a novel manufacturing technique that has the potential to mitigate metal whisker growth and extend the life of many electronic devices.

  2. Chemostat Studies of TCE-Dehalogenating Anaerobic Consortia under Excess and Limited Electron Donor Addition

    NASA Astrophysics Data System (ADS)

    Semprini, L.; Azizian, M.; Green, J.; Mayer-Blackwell, K.; Spormann, A. M.

    2015-12-01

    Two cultures - the Victoria Strain (VS) and the Evanite Strain (EV), enriched with the organohalide-respiring bacterium Dehalococcoides mccartyi - were grown in chemostats for more than 4 years at a mean cell residence time of 50 days. The slow doubling rate represents growth likely experienced in the subsurface. The chemostats were fed formate as the electron donor and trichloroethene (TCE) as the terminal electron acceptor. Under excess formate conditions, stable operation was observed with respect to TCE transformation, steady-state hydrogen (H2) concentrations (40 nM), and the structure of the dehalogenating community. Both cultures completely transformed TCE to ethene, with minor amounts of vinyl chloride (VC) observed, along with acetate formation. When formate was limited, TCE was transformed incompletely to ethene (40-60%) and VC (60-40%), H2 concentrations ranged from 1 to 3 nM, and the acetate concentration dropped below detection. Batch kinetic studies of TCE transformation with chemostat-harvested cells found that transformation rates of c-DCE and VC were greatly reduced when the cells were grown with limited formate. Upon increasing the formate addition to the chemostats from limited to excess, essentially complete transformation of TCE to ethene was achieved. The increase in formate was associated with an increase in H2 concentration and the production of acetate. Results of batch kinetic tests showed increases in transformation rates for TCE and c-DCE by factors of 3.5 and 2.5, respectively, while VC rates increased by factors of 33 to 500, over a six-month period. Molecular analysis of chemostat samples is being performed to quantify the changes in copy numbers of reductase genes and to determine whether shifts in the strains of Dehalococcoides mccartyi were responsible for the observed rate increases. The results demonstrate the importance of the electron donor supply for successful in-situ remediation.

  3. Formal verification of human-automation interaction.

    PubMed

    Degani, Asaf; Heymann, Michael

    2002-01-01

    This paper discusses a formal and rigorous approach to the analysis of operator interaction with machines. It addresses the acute problem of detecting design errors in human-machine interaction and focuses on verifying the correctness of the interaction in complex and automated control systems. The paper describes a systematic methodology for evaluating whether the interface provides the necessary information about the machine to enable the operator to perform a specified task successfully and unambiguously. It also addresses the adequacy of information provided to the user via training material (e.g., user manual) about the machine's behavior. The essentials of the methodology, which can be automated and applied to the verification of large systems, are illustrated by several examples and through a case study of pilot interaction with an autopilot aboard a modern commercial aircraft. The expected application of this methodology is an augmentation and enhancement, by formal verification, of human-automation interfaces.

  4. Reliable Entanglement Verification

    NASA Astrophysics Data System (ADS)

    Arrazola, Juan; Gittsovich, Oleg; Donohue, John; Lavoie, Jonathan; Resch, Kevin; Lütkenhaus, Norbert

    2013-05-01

    Entanglement plays a central role in quantum protocols. It is therefore important to be able to verify the presence of entanglement in physical systems from experimental data. In the evaluation of these data, the proper treatment of statistical effects requires special attention, as one can never claim to have verified the presence of entanglement with certainty. Recently, increased attention has been paid to the development of proper frameworks for posing and answering these types of questions. In this work, we apply recent results by Christandl and Renner on reliable quantum state tomography to construct a reliable entanglement verification procedure based on the concept of confidence regions. The statements made require neither the specification of a prior distribution nor the assumption of an independent and identically distributed (i.i.d.) source of states. Moreover, we develop the efficient numerical tools that are necessary to employ this approach in practice, rendering the procedure ready to be employed in current experiments. We demonstrate this by analyzing the data of an experiment in which photonic entangled two-photon states were generated and their entanglement verified with the use of an accessible nonlinear witness.
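
    A much simpler cousin of the paper's confidence-region approach, shown only to fix ideas: if each witness measurement outcome is bounded, a one-sided Hoeffding bound turns a finite-sample estimate of the witness expectation into a confidence statement (entanglement is indicated when the expectation is negative, since Tr(Wρ) ≥ 0 for all separable states). Unlike the cited construction, this sketch does assume i.i.d. state preparation; the numbers are hypothetical.

        import math

        def certifies_entanglement(sample_mean, n, outcome_range, delta=1e-3):
            """One-sided Hoeffding bound: with probability >= 1 - delta the true
            witness expectation lies below sample_mean + t; a negative upper
            bound certifies entanglement."""
            a, b = outcome_range
            t = math.sqrt((b - a) ** 2 * math.log(1 / delta) / (2 * n))
            return sample_mean + t < 0, t

        ok, margin = certifies_entanglement(sample_mean=-0.12, n=5000, outcome_range=(-1, 1))
        print(f"certified entangled: {ok} (Hoeffding margin {margin:.3f})")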

  5. Woodward Effect Experimental Verifications

    NASA Astrophysics Data System (ADS)

    March, Paul

    2004-02-01

    The work of J. F. Woodward (1990; 1996a; 1996b; 1998; 2002a; 2002b; 2004) on the existence of "mass fluctuations" and their use in exotic propulsion schemes was examined for possible application to improving spaceflight propulsion and power generation. Woodward examined Einstein's General Relativity Theory (GRT) and assumed that if the strong Machian interpretation of GRT as well as gravitational/inertial Wheeler-Feynman radiation reaction forces hold, then when an elementary particle is accelerated through a potential gradient, its rest mass should fluctuate around its mean value during the acceleration. Woodward also used GRT to clarify the precise experimental conditions necessary for observing and exploiting these mass fluctuations, or the "Woodward effect" (W-E). Later, in collaboration with his former graduate student T. Mahood, he also pushed the experimental verification boundaries of these proposals. If these purported mass fluctuations occur as Woodward claims, and his assumption that gravity and inertia are both byproducts of the same GRT-based phenomenon per Mach's Principle is correct, then many innovative applications such as propellantless propulsion and gravitational exotic matter generators may be feasible. This paper examines the reality of mass fluctuations and the feasibility of using the W-E to design propellantless propulsion devices in the near- to mid-term future. The latest experimental results, utilizing MHD-like force rectification systems, are also presented.

  6. 78 FR 68461 - Guidance for Industry: Studies To Evaluate the Utility of Anti-Salmonella Chemical Food Additives...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-14

    FOOD AND DRUG ADMINISTRATION. Guidance for Industry: Studies To Evaluate the Utility of Anti-Salmonella Chemical Food Additives in Feeds; Request for Comments. AGENCY: Food and Drug Administration. … revising the guidance entitled "Guidance for Industry: Studies to Evaluate the Utility of Anti-Salmonella Chemical Food Additives in Feeds" …

  7. FINAL REPORT – INDEPENDENT VERIFICATION SURVEY SUMMARY AND RESULTS FOR THE ARGONNE NATIONAL LABORATORY BUILDING 330 PROJECT FOOTPRINT, ARGONNE, ILLINOIS

    SciTech Connect

    ERIKA N. BAILEY

    2012-02-29

    ORISE conducted onsite verification activities of the Building 330 project footprint during the period of June 6 through June 7, 2011. The verification activities included technical reviews of project documents, visual inspections, radiation surface scans, and sampling and analysis. The draft verification report was issued in July 2011 with findings and recommendations. The contractor performed additional evaluations and remediation.

  8. Stationary spiraling eddies in presence of polar amplification of global warming as a governing factor of ecology of Greenland seals White Sea population: results of verification study

    NASA Astrophysics Data System (ADS)

    Melentyev, K.; Chernook, V.; Melentyev, V.

    2003-04-01

    Ice-associated marine mammals represent a high level of the food chain in the ocean, and estimating population numbers for different groups, as well as assessing their ecology and welfare, are important tasks for marine biology, ecology, fisheries, and other applications. Global warming and anthropogenic impacts on marine and coastal ecosystems create many problems. To investigate the ice-covered Arctic Ocean and chart the number of seals, annual inspections were performed onboard the research aircraft PINRO "Arktika". Multi-spectral airborne and satellite observations were carried out regularly from the Barents and White Sea to the Bering and Okhotsk Sea (1996-2002). The contemporary status of different groups of sea mammals was evaluated, with the numbers of adults and pups checked separately. In situ observations were supported by helicopter and icebreaker for gathering water samples and ice cores (with subsequent biochemical and toxicological analysis). The prevailing part of the life cycle of the Greenland (harp) seal depends strongly on winter hydrology (water masses, stable currents, meandering fronts, stationary eddies) and is closely connected with the type of ice (pack ice, fast ice) and other ice parameters (age, origin, salinity, ice edge). First-year ice floes, which have specific properties and distinctive features, are used by harp seals for pupping, lactation, molting, pairing, and resting. Ringed seals, conversely, use only fast ice for the corresponding purposes. Different aspects of the ecology and migration features of harp seals were analyzed in the frame of the verification study. The study revealed the scale of influence of winter severity and wind regime, but stationary eddies in the White Sea are the most effective governing factor (a novel finding). The following relationships between eddies and the ecology of the Greenland seal White Sea population will be discussed: (A) regularities of eddy formation and their spatial arrangement, temporal (seasonal and annual

  9. Crowd-Sourced Verification of Computational Methods and Data in Systems Toxicology: A Case Study with a Heat-Not-Burn Candidate Modified Risk Tobacco Product.

    PubMed

    Poussin, Carine; Belcastro, Vincenzo; Martin, Florian; Boué, Stéphanie; Peitsch, Manuel C; Hoeng, Julia

    2017-02-09

    Systems toxicology aims to quantify the effect of toxic molecules in biological systems and unravel their mechanisms of toxicity. The development of advanced computational methods is required for analyzing and integrating the high-throughput data generated for this purpose, as well as for extrapolating predictive toxicological outcomes and risk estimates. To ensure the performance and reliability of the methods and to verify conclusions from systems toxicology data analysis, it is important to conduct unbiased evaluations by independent third parties. As a case study, we report here the results of an independent verification of methods and data in systems toxicology by crowdsourcing. The sbv IMPROVER systems toxicology computational challenge aimed to evaluate computational methods for the development of blood-based gene expression signature classification models able to predict smoking exposure status. Participants created/trained models on blood gene expression data sets including smokers/mice exposed to 3R4F (a reference cigarette) or noncurrent smokers/Sham (mice exposed to air). Participants applied their models to unseen data to predict whether subjects classify closer to the smoke-exposed or non-smoke-exposed group. The data sets also included data from subjects that had been exposed to potential modified risk tobacco products (MRTPs) or that had switched to an MRTP after exposure to conventional cigarette smoke. The scoring of anonymized participants' predictions was done using predefined metrics. The top 3 performers' methods predicted class labels with area under the precision-recall curve scores above 0.9. Furthermore, although various computational approaches were used, the crowd's results confirmed our own data analysis outcomes with regard to the classification of MRTP-related samples. Mice exposed directly to an MRTP were classified closer to the Sham group. After switching to an MRTP, the confidence that subjects belonged to the smoke-exposed group…

  10. Can ligand addition to soil enhance Cd phytoextraction? A mechanistic model study.

    PubMed

    Lin, Zhongbing; Schneider, André; Nguyen, Christophe; Sterckeman, Thibault

    2014-11-01

    Phytoextraction is a potential method for cleaning Cd-polluted soils, and ligand addition to soil is expected to enhance Cd phytoextraction. However, experimental results show that such addition has contradictory effects on plant Cd uptake. A mechanistic model simulating the reaction kinetics (adsorption on the solid phase, complexation in solution), transport (convection, diffusion), and root absorption (symplastic, apoplastic) of Cd and its complexes in soil was developed. This was used to calculate plant Cd uptake with and without ligand addition for a great number of combinations of soil, ligand, and plant characteristics, varying the parameters within defined domains. Ligand addition generally strongly reduced the hydrated Cd (Cd2+) concentration in soil solution through Cd complexation. Dissociation of the Cd-ligand complex could not compensate for this reduction, which greatly lowered Cd2+ symplastic uptake by roots, and the apoplastic uptake of the complexed Cd was not sufficient to compensate for the decrease in symplastic uptake. This explains why, in the majority of cases, ligand addition resulted in a reduction of the simulated Cd phytoextraction. A few results showed enhanced phytoextraction under very particular conditions (strong plant transpiration with high apoplastic Cd uptake capacity), but the enhancement was very limited, making chelant-enhanced phytoextraction poorly efficient for Cd.
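
    The driver of the simulated reduction is simple speciation chemistry: a 1:1 complexation equilibrium Cd + L <-> CdL with stability constant K depresses free Cd2+ roughly by the factor 1 + K[L]. A self-contained sketch with hypothetical ligand strengths (the full model adds sorption, transport, and root uptake on top of this):

        import math

        def free_cd(total_cd, total_ligand, logK):
            """Free Cd2+ (mol/L) for Cd + L <-> CdL with K = [CdL]/([Cd][L]);
            solves the resulting quadratic in [CdL]. Illustrative of why
            ligand addition depresses Cd2+ activity in solution."""
            K = 10.0 ** logK
            # K = x / ((Cd_t - x)(L_t - x)) -> K x^2 - (K(Cd_t+L_t)+1) x + K Cd_t L_t = 0
            a = K
            b = -(K * (total_cd + total_ligand) + 1.0)
            c = K * total_cd * total_ligand
            x = (-b - math.sqrt(b * b - 4 * a * c)) / (2 * a)  # physical (smaller) root
            return total_cd - x

        for logK in (4, 6, 8):  # hypothetical ligand strengths
            cd2 = free_cd(total_cd=1e-6, total_ligand=1e-4, logK=logK)
            print(f"log K = {logK}: free Cd2+ = {cd2:.2e} M")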

  11. Mechanical characterization of filler sandcretes with rice husk ash additions. Study applied to Senegal

    SciTech Connect

    Cisse, I.K.; Laquerbe, M.

    2000-01-01

    To capitalize on the local materials of Senegal (agricultural and industrial wastes, residual fines from the crushing process, dune sands, etc.), rice husk ash and residues of industrial and agricultural wastes have been used as additions in sandcretes. The mechanical resistance of sandcrete blocks obtained when unground ash (and notably ground ash) is added reveals an increase in performance over classic mortar blocks. In addition, the use of unground rice husk ash enables production of a lightweight sandcrete with insulating properties, at a reduced cost. The pozzolanic reactivity of the ash explains the high strengths obtained.

  12. Computational study of the rate constants and free energies of intramolecular radical addition to substituted anilines

    PubMed Central

    Seddiqzai, Meriam; Dahmen, Tobias; Sure, Rebecca

    2013-01-01

    The intramolecular radical addition to aniline derivatives was investigated by DFT calculations. The computational methods were benchmarked by comparing the calculated values of the rate constant for the 5-exo cyclization of the hexenyl radical with the experimental values. The dispersion-corrected PW6B95-D3 functional provided very good results, with deviations of the free activation barrier from the experimental values of only about 0.5 kcal mol−1, and was therefore employed in further calculations. Corrections for intramolecular London dispersion and solvation effects in the quantum chemical treatment are essential to obtain consistent and accurate theoretical data. For the investigated radical addition reaction it turned out that the polarity of the molecules is important and that a combination of electrophilic radicals with preferably nucleophilic arenes results in the highest rate constants. This is opposite to the Minisci reaction, where the radical acts as nucleophile and the arene as electrophile. The substitution at the N-atom of the aniline is crucial. Methyl substitution leads to slower addition than phenyl substitution. Carbamates as substituents are suitable only when the radical center is not too electrophilic. No correlations between free reaction barriers and energies (ΔG‡ and ΔGR) are found. Addition reactions leading to indanes or dihydrobenzofurans are too slow to be useful synthetically. PMID:24062821
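
    The bridge between a computed free activation barrier and a measurable rate constant is the Eyring equation, k = (kB*T/h)*exp(-ΔG‡/RT); the short sketch below shows why a ~0.5 kcal/mol benchmark deviation matters (the barrier values are illustrative only).

      # Eyring equation sketch: rate constant from a free activation barrier,
      # and the rate-constant factor implied by a 0.5 kcal/mol barrier error
      # (the 8.0 kcal/mol barrier is an illustrative placeholder).
      import math

      kB, h, R, T = 1.380649e-23, 6.62607015e-34, 8.314462618, 298.15

      def eyring_k(dG_kcal_per_mol):
          dG = dG_kcal_per_mol * 4184.0          # kcal/mol -> J/mol
          return (kB * T / h) * math.exp(-dG / (R * T))

      k_ref, k_off = eyring_k(8.0), eyring_k(8.5)
      print(f"k = {k_ref:.2e} s^-1; 0.5 kcal/mol error -> factor {k_ref / k_off:.1f}")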

  13. Studies on the Food Additive Propyl Gallate: Synthesis, Structural Characterization, and Evaluation of the Antioxidant Activity

    ERIC Educational Resources Information Center

    Garrido, Jorge; Garrido, E. Manuela; Borges, Fernanda

    2012-01-01

    Antioxidants are additives largely used in industry for delaying, retarding, or preventing the development of oxidative deterioration. Propyl gallate (E310) is a phenolic antioxidant extensively used in the food, cosmetics, and pharmaceutical industries. A series of lab experiments has been developed to teach students about the importance and…

  14. Verification in referral-based crowdsourcing.

    PubMed

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through "referral-based crowdsourcing": the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge.
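
    For context, the winning Red Balloon strategy paid a geometrically decreasing reward up the referral chain (the finder received $2,000, the finder's inviter $1,000, the next inviter $500, and so on, so the total per balloon was bounded by $4,000); the sketch below illustrates that split-contract scheme with a hypothetical chain.

      # Geometric split-contract sketch in the spirit of the winning Red
      # Balloon strategy: each ancestor in the referral chain receives half
      # of the payment below it (the chain itself is hypothetical).
      def referral_payments(chain, base=2000.0):
          """chain lists agents from the finder up toward the referral root."""
          return {agent: base / (2 ** depth) for depth, agent in enumerate(chain)}

      payments = referral_payments(["finder", "inviter", "grand_inviter"])
      print(payments)                # {'finder': 2000.0, 'inviter': 1000.0, ...}
      print(sum(payments.values()))  # total stays below 2 * base = 4000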

  15. Verification in Referral-Based Crowdsourcing

    PubMed Central

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530

  16. Trajectory Based Behavior Analysis for User Verification

    NASA Astrophysics Data System (ADS)

    Pao, Hsing-Kuo; Lin, Hong-Yi; Chen, Kuan-Ta; Fadlil, Junaidillah

    Many of our activities on computers need a verification step for authorized access. The goal of verification is to tell the true account owner apart from intruders. We propose a general approach for user verification based on user trajectory inputs. The approach is labor-free for users and is likely to avoid possible copying or simulation by other non-authorized users or even automatic programs like bots. Our study focuses on finding the hidden patterns embedded in the trajectories produced by account users. We employ a Markov chain model with Gaussian distributions in its transitions to describe the behavior in the trajectory. To distinguish between two trajectories, we propose a novel dissimilarity measure combined with a manifold-learnt tuning for capturing the pairwise relationship. Based on the pairwise relationship, we plug in any effective classification or clustering method for the detection of unauthorized access. The method can also be applied to the task of recognition, predicting the trajectory type without a pre-defined identity. Given a trajectory input, the results show that the proposed method can accurately verify the user identity, or suggest who owns the trajectory if the input identity is not provided.
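
    The scoring step of such a model is simple to sketch: a trajectory's likelihood is evaluated under a Markov model whose step transitions are Gaussian. The code below is a minimal illustration with hypothetical per-user statistics; the paper's dissimilarity measure and manifold tuning are not reproduced.

      # Log-likelihood of a 2-D trajectory under Gaussian step transitions
      # (the per-user step mean/covariance are hypothetical placeholders).
      import numpy as np
      from scipy.stats import multivariate_normal

      def trajectory_loglik(points, step_mean, step_cov):
          steps = np.diff(np.asarray(points, dtype=float), axis=0)
          return multivariate_normal(step_mean, step_cov).logpdf(steps).sum()

      user_mean, user_cov = np.array([1.0, 0.2]), np.diag([0.5, 0.5])
      trajectory = [(0.0, 0.0), (1.1, 0.3), (2.0, 0.4), (3.2, 0.5)]
      print(f"log-likelihood = {trajectory_loglik(trajectory, user_mean, user_cov):.2f}")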

  17. Uncertainty Estimation in Intensity-Modulated Radiotherapy Absolute Dosimetry Verification

    SciTech Connect

    Sanchez-Doblado, Francisco . E-mail: paco@us.es; Hartmann, Guenther H.; Pena, Javier; Capote, Roberto; Paiusco, Marta; Rhein, Bernhard; Leal, Antonio; Lagares, Juan Ignacio

    2007-05-01

    Purpose: Intensity-modulated radiotherapy (IMRT) represents an important method for improving RT. The IMRT relative dosimetry checks are well established; however, open questions remain in reference dosimetry with ionization chambers (ICs). The main problem is the departure of the measurement conditions from the reference ones; thus, additional uncertainty is introduced into the dose determination. The goal of this study was to assess this effect systematically. Methods and Materials: Monte Carlo calculations and dosimetric measurements with five different detectors were performed for a number of representative IMRT cases, covering both step-and-shoot and dynamic delivery. Results: Using ICs with volumes of about 0.125 cm³ or less, good agreement was observed among the detectors in most of the situations studied. These results also agreed well with the Monte Carlo-calculated nonreference correction factors (c factors). Additionally, we found a general correlation between the IC position relative to a segment and the derived correction factor c, which can be used to estimate the expected overall uncertainty of the treatment. Conclusion: The increase of the reference dose relative standard uncertainty measured with ICs introduced by nonreference conditions when verifying an entire IMRT plan is about 1-1.5%, provided that appropriate small-volume chambers are used. The overall standard uncertainty of the measured IMRT dose amounts to about 2.3%, including the 0.5% of reproducibility and 1.5% of uncertainty associated with the beam calibration factor. Solid state detectors and large-volume chambers are not well suited to IMRT verification dosimetry because of the greater uncertainties. An action level of 5% is appropriate for IMRT verification. Greater discrepancies should lead to a review of the dosimetric procedure, including visual inspection of treatment segments and energy fluence.
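
    The quoted overall figure is consistent with combining the independent components in quadrature; the worked check below assumes the usual root-sum-of-squares rule applied to the component values given in the abstract.

      # Root-sum-of-squares check of the overall relative standard
      # uncertainty (component values from the abstract; the quadrature
      # combination rule is the standard assumption for independent terms).
      import math

      components = {
          "nonreference conditions": 1.5,   # %
          "reproducibility": 0.5,           # %
          "beam calibration factor": 1.5,   # %
      }
      overall = math.sqrt(sum(u ** 2 for u in components.values()))
      print(f"overall ~ {overall:.1f}%")    # ~2.2%, close to the quoted 2.3%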

  18. A laboratory study of the perceived benefit of additional noise attenuation by houses

    NASA Technical Reports Server (NTRS)

    Flindell, I. H.

    1983-01-01

    Two experiments were conducted to investigate the perceived benefit of additional house attenuation against aircraft flyover noise. First, subjects made annoyance judgments in a simulated living room while an operative window with real and dummy storm windows was manipulated in full view of those subjects. Second, subjects made annoyance judgments in an anechoic audiometric test chamber of frequency-shaped noise signals having spectra closely matched to those of the aircraft flyover noises reproduced in the first experiment. These stimuli represented the aircraft flyover noises in levels and spectra but without the situational and visual cues present in the simulated living room. Perceptual constancy theory implies that annoyance tends to remain constant despite reductions in noise level caused by additional attenuation of which the subjects are fully aware. This theory was supported when account was taken of a reported annoyance overestimation for certain spectra and of an overreaction to the simulated-condition cue.

  19. A Search for Additional Planets in the Exoplanetary Systems Studied by the NASA EPOXI Mission

    NASA Astrophysics Data System (ADS)

    Ballard, Sarah; Christiansen, J. L.; Charbonneau, D.; Holman, M. J.; Deming, D.; Wellnitz, D. D.; A'Hearn, M. F.; EPOXI Team

    2010-01-01

    The EPOCh (Extrasolar Planet Observation and Characterization) component of the NASA EPOXI mission used the HRI camera aboard the Deep Impact spacecraft to observe transiting exoplanet systems from January through August 2008. The majority of these targets were each observed nearly continuously for several weeks at a time. We searched these high-precision time series for additional planets in these systems, which could be revealed either directly through their photometric transits, or indirectly through the variations these second planets induce on the times of transit of the previously known planet. We present upper limits on putative additional planets in six of the systems observed by EPOXI: HAT-P-4, TrES-3, GJ 436, TrES-2, WASP-3, and HAT-P-7.

  20. Thiopeptin, a New Feed-Additive Antibiotic: Biological Studies and Field Trials

    PubMed Central

    Mine, K.; Miyairi, N.; Takano, N.; Mori, S.; Watanabe, N.

    1972-01-01

    Thiopeptin is a new antibiotic, produced by Streptomyces tateyamensis and developed solely for animal use as a feed additive. The antibiotic content in animal tissue and feed was assayed in terms of the antimicrobial activity against Mycoplasma laidlawii A. This antibiotic was found to be relatively nontoxic in rats and mice. In chickens, this antibiotic is excreted into feces within 48 hr of administration and is not absorbed into tissue. It is well tolerated in both broilers and swine and is highly stable in animal feed. Thiopeptin-supplemented feed contributes to the improvement of weight gain and feed efficiency in chickens and swine, and of egg performance in layers. Thus thiopeptin, when used as a feed additive, is quite suitable for supplementing animal nutrition. PMID:4680812

  1. Structural changes in gluten protein structure after addition of emulsifier. A Raman spectroscopy study

    NASA Astrophysics Data System (ADS)

    Ferrer, Evelina G.; Gómez, Analía V.; Añón, María C.; Puppo, María C.

    2011-06-01

    A food protein product, gluten, was chemically modified with varying levels of sodium stearoyl lactylate (SSL), and the extent of the modifications to the secondary and tertiary structures of this protein was analyzed by using Raman spectroscopy. Analysis of the Amide I band showed an increase in its intensity, mainly after the addition of 0.25% SSL to wheat flour to produce the modified gluten protein, pointing to the formation of a more ordered structure. Side chain vibrations also confirmed the observed changes.

  2. A Delphi Study of Additive Manufacturing Applicability for United States Air Force Civil Engineer Contingency Operations

    DTIC Science & Technology

    2015-03-26

    This simple process is the basis for most consumer-grade desktop AM machines, commonly known as 3D printers (Pham & Gault, 1998:1270). Material...as a single purchase to decrease initial capital costs. Once the 3D printers are purchased and delivered, the selected bases can begin training...for several questions if you would like to explain or elaborate on your answers. Additional information about 3D printers and UTCs is provided as an

  3. Magnetic Force Microscopy Study of Zr2Co11 -Based Nanocrystalline Materials: Effect of Mo Addition

    DOE PAGES

    Yue, Lanping; Jin, Yunlong; Zhang, Wenyong; ...

    2015-01-01

    The addition of molybdenum was used to modify the nanostructure and enhance the coercivity of rare-earth-free Zr2Co11-based nanocrystalline permanent magnets. The effect of Mo addition on the magnetic domain structures of melt-spun nanocrystalline Zr16Co84-xMox (x = 0, 0.5, 1, 1.5, and 2.0) ribbons has been investigated. It was found that magnetic properties and local domain structures are strongly influenced by Mo doping. The coercivity of the samples increases with the increase in Mo content (x ≤ 1.5). The maximum energy product (BH)max increases with increasing x from 0.5 MGOe for x = 0 to a maximum value of 4.2 MGOe for x = 1.5. The smallest domain size, with a relatively short magnetic correlation length of 128 nm, and the largest root-mean-square phase shift Φrms value of 0.66° are observed for x = 1.5. The optimal Mo addition promotes magnetic domain structure refinement and thus leads to a significant increase in coercivity and energy product in this sample.

  4. Load bearing and stiffness tailored NiTi implants produced by additive manufacturing: a simulation study

    NASA Astrophysics Data System (ADS)

    Rahmanian, Rasool; Shayesteh Moghaddam, Narges; Haberland, Christoph; Dean, David; Miller, Michael; Elahinia, Mohammad

    2014-03-01

    Common metals for stable long-term implants (e.g., stainless steel, titanium, and titanium alloys) are much stiffer than spongy cancellous bone and even stiffer than cortical bone. When bone and implant are loaded, this stiffness mismatch results in stress shielding; as a consequence, degradation of the surrounding bony structure can lead to disassociation of the implant. Due to its lower stiffness and high reversible deformability, which is associated with its superelastic behavior, NiTi is an attractive biomaterial for load-bearing implants. However, the stiffness of austenitic Nitinol, while closer to that of bone, is still too high. Additive manufacturing provides, in addition to the fabrication of patient-specific implants, the ability to resolve the stiffness mismatch by adding engineered porosity to the implant. This in turn allows for the design of different stiffness profiles in one implant, tailored to the physiological load conditions. This work covers a fundamental approach to bring this vision to reality. First, modeling of the mechanical behavior of different scaffold designs is presented as a proof of concept of stiffness tailoring. Based on these results, different Nitinol scaffolds can be produced by additive manufacturing.
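
    How engineered porosity lowers effective stiffness can be estimated with the Gibson-Ashby scaling law for open-cell cellular solids, E ≈ C*Es*(ρ/ρs)^2; the sketch below applies it with rough numbers as an illustration, not as the paper's finite-element design method.

      # Gibson-Ashby estimate of effective scaffold stiffness versus relative
      # density (E_s ~ 75 GPa for austenitic NiTi is approximate; C ~ 1).
      E_DENSE_NITI = 75.0  # GPa

      def effective_modulus(relative_density, C=1.0, n=2.0):
          return C * E_DENSE_NITI * relative_density ** n

      for rd in (1.0, 0.7, 0.5, 0.3):
          print(f"relative density {rd:.1f} -> E ~ {effective_modulus(rd):5.1f} GPa")
      # cortical bone is roughly 10-20 GPa, so ~50% relative density
      # brings a NiTi scaffold close to the stiffness of bone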

  5. Hard and Soft Safety Verifications

    NASA Technical Reports Server (NTRS)

    Wetherholt, Jon; Anderson, Brenda

    2012-01-01

    The purpose of this paper is to examine the differences between and the effects of hard and soft safety verifications. Initially, the terminology should be defined and clarified. A hard safety verification is a datum which demonstrates how a safety control is enacted. An example of this is relief valve testing. A soft safety verification is something which is usually described as nice to have but not necessary to prove safe operation. An example of a soft verification is the loss of the Solid Rocket Booster (SRB) casings from Shuttle flight STS-4. When the main parachutes failed, the casings impacted the water and sank. In the nose cap of the SRBs, video cameras recorded the release of the parachutes to determine safe operation and to provide information for potential anomaly resolution. Generally, examination of the casings and nozzles contributed to understanding of the newly developed boosters and their operation. Safety verification of SRB operation was demonstrated by examination of the casings and nozzle for erosion or wear. Loss of the SRBs and associated data did not delay the launch of the next Shuttle flight.

  6. Complex VLSI Feature Comparison for Commercial Microelectronics Verification

    DTIC Science & Technology

    2014-03-27

    verification of untrusted circuits using industry-standard and custom software. The process developed under TRUST and implemented at the AFRL Mixed Signal...

  7. Study of the structure of steel 12Kh12M1BFP modified with additions of fullerenes and carbon nanotubes

    NASA Astrophysics Data System (ADS)

    Glebov, V. A.; Bakulina, A. S.; Efremov, I. V.; Shchetinin, I. V.; Yagodkin, Yu. D.; Glezer, A. M.; Rashkovskii, A. Yu.; Vainshtein, D. L.

    2010-12-01

    X-ray structural analysis, scanning electron microscopy, X-ray photoelectron spectroscopy, and atomic force microscopy are used to study the structure of compacted specimens of steel 12Kh12M1BFP modified with additions of fullerenes and carbon nanotubes. The effect of the additions on the microhardness of the compacted specimens is established.

  8. ALMA Band 5 Science Verification

    NASA Astrophysics Data System (ADS)

    Humphreys, L.; Biggs, A.; Immer, K.; Laing, R.; Liu, H. B.; Marconi, G.; Mroczkowski, T.; Testi, L.; Yagoubov, P.

    2017-03-01

    ALMA Band 5 (163–211 GHz) was recently commissioned and Science Verification (SV) observations were obtained in the latter half of 2016. A primary scientific focus of this band is the H2O line at 183.3 GHz, which can be observed around 15% of the time when the precipitable water vapour is sufficiently low (< 0.5 mm). Many more lines are covered in Band 5 and can be observed for over 70% of the time on Chajnantor, requiring similar restrictions to those for ALMA Bands 4 and 6. Examples include the H2(18)O line at 203 GHz, some of the bright (3–2) lines of singly and doubly deuterated forms of formaldehyde, and the (2–1) lines of HCO+, HCN, HNC, N2H+ and several of their isotopologues. A young star-forming region near the centre of the Milky Way, an evolved star also in our Galaxy, and a nearby ultraluminous infrared galaxy (ULIRG) were observed as part of the SV process and the data are briefly described. The reduced data, along with imaged data products, are now public and demonstrate the power of ALMA for high-resolution studies of H2O and other molecules in a variety of astronomical targets.

  9. Development of a Safeguards Verification Method and Instrument to Detect Pin Diversion from Pressurized Water Reactor (PWR) Spent Fuel Assemblies Phase I Study

    SciTech Connect

    Ham, Y S; Sitaraman, S

    2008-12-24

    A novel methodology to detect diversion of spent fuel from Pressurized Water Reactors (PWRs) has been developed in order to address a long-unsolved safeguards verification problem for the international safeguards community, such as the International Atomic Energy Agency (IAEA) or the European Atomic Energy Community (EURATOM). The concept involves inserting tiny neutron and gamma detectors into the guide tubes of a spent fuel assembly and measuring the signals. The guide tubes form a quadrant-symmetric pattern in the various PWR fuel product lines, and the neutron and gamma signals from these various locations are processed to obtain a unique signature for an undisturbed fuel assembly. Signatures based on the neutron and gamma signals individually or in combination can be developed. Removal of fuel pins from the assembly will cause the signatures to be visibly perturbed, thus enabling the detection of diversion. All of the required signal processing to obtain signatures can be performed on standard laptop computers. Monte Carlo simulation studies and a set of controlled experiments with actual commercial PWR spent fuel assemblies were performed and validated this novel methodology. Based on the simulation studies and benchmarking measurements, the methodology developed promises to be a powerful and practical way to detect partial defects that constitute 10% or more of the total active fuel pins. This far exceeds the detection threshold of 50% missing pins from a spent fuel assembly, the threshold defined by the IAEA Safeguards Criteria. The methodology does not rely on any operator-provided data like burnup or cooling time and does not require movement of the fuel assembly from the storage rack in the spent fuel pool. A concept was developed to build a practical field device, the Partial Defect Detector (PDET), which will be completely portable and will use standard radiation measuring devices already in use at the IAEA. The use of the device will not require any information provided by the operator.
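
    The detection idea, a departure of guide-tube signals from the quadrant-symmetric baseline signature, can be sketched in a few lines; the normalized signals and threshold below are hypothetical placeholders, not PDET calibration data.

      # Signature-comparison sketch: normalized detector signals at guide-tube
      # positions are compared against an undisturbed-assembly baseline
      # (all numbers and the 10% threshold are hypothetical).
      import numpy as np

      def normalize(signals):
          signals = np.asarray(signals, dtype=float)
          return signals / signals.mean()

      baseline = normalize([1.00, 0.98, 1.02, 0.99, 1.01])  # undisturbed assembly
      measured = normalize([1.00, 0.71, 1.05, 0.98, 1.02])  # dip near missing pins

      deviation = np.max(np.abs(measured - baseline))
      verdict = "possible pin diversion" if deviation > 0.10 else "consistent"
      print(f"max deviation = {deviation:.2f} -> {verdict}")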

  10. Strategic Petroleum Reserve (SPR) additional geologic site characterization studies, Bryan Mound Salt Dome, Texas

    SciTech Connect

    Neal, J.T.; Magorian, T.R.; Ahmad, S.

    1994-11-01

    This report revises the original report that was published in 1980. Some of the topics covered in the earlier report were provisional, and it is now practicable to reexamine them using new or revised geotechnical data and data obtained from SPR cavern operations, which involve 16 new caverns. Revised structure maps and sections show interpretative differences compared with the 1980 report and more definition in the dome shape and caprock structural contours, especially a major southeast-northwest trending anomalous zone. The original interpretation was of a westward tilt of the dome; this revision shows a tilt to the southeast, consistent with other gravity and seismic data. This interpretation refines the evaluation of additional cavern space by adding more salt buffer and allowing several more caverns. Additional storage space is constrained on this nearly full dome because of low-lying peripheral wetlands, but 60 MMBBL or more of additional volume could be gained in six or more new caverns. Subsidence values at Bryan Mound are among the lowest in the SPR system, averaging about 11 mm/yr (0.4 in/yr), but measurement and interpretation issues persist, as observed values are about the same as the survey measurement accuracy. Periodic flooding is a continuing threat because of the coastal proximity and because peripheral portions of the site are at elevations less than 15 ft. This threat may increase slightly as future subsidence lowers the surface, but the amount is apt to be small. Caprock integrity may be affected by structural features, especially the faulting associated with anomalous zones. Injection wells have not been used extensively at Bryan Mound, but could be a practicable solution to future brine disposal needs. Environmental issues center on the areas of low elevation that are below 15 feet above mean sea level: the coastal proximity and lowland environment combined with the potential for flooding create conditions that require continuing surveillance.

  11. Analytical verifications of NGST Advanced Mirror System Demonstrators

    NASA Astrophysics Data System (ADS)

    Cummings, Ramona O.; Peters, Bruce R.; Sutherlin, Steven; Smithers, Martin E.; Robinson, James

    2002-12-01

    Ground based testing is a critical and costly part of component, assembly, and system verifications of large space telescopes. At such tests, however, with integral teamwork by planners, analysts, and test personnel, segments can be included to validate specific analytical parameters and algorithms at relatively low additional cost. This paper presents analytical verification and validation segments currently added to ambient and vacuum cryogenic testing of Advanced Mirror System Demonstrator (AMSD) assemblies for the Next Generation Space Telescope (NGST) project. The test segments for workmanship testing, cold survivability, and cold operation optical throughput are supplemented by segments for analytical verifications of structural, thermal, and optical parameters. Utilizing integrated modeling and separate materials testing, the paper continues with analyses to be performed for AMSD testing, currently slated for calendar year 2003. These segments form a well-verified portion of the integrated modeling being conducted on AMSD for NGST performance predictions.

  12. Technical challenges for dismantlement verification

    SciTech Connect

    Olinger, C.T.; Stanbro, W.D.; Johnston, R.G.; Nakhleh, C.W.; Dreicer, J.S.

    1997-11-01

    In preparation for future nuclear arms reduction treaties, including any potential successor treaties to START I and II, the authors have been examining possible methods for bilateral warhead dismantlement verification. Warhead dismantlement verification raises significant challenges in the political, legal, and technical arenas. This discussion will focus on the technical issues raised by warhead arms control. Technical complications arise from several sources. These will be discussed under the headings of warhead authentication, chain-of-custody, dismantlement verification, non-nuclear component tracking, component monitoring, and irreversibility. The authors will discuss possible technical options to address these challenges as applied to a generic dismantlement and disposition process, in the process identifying limitations and vulnerabilities. They expect that these considerations will play a large role in any future arms reduction effort and, therefore, should be addressed in a timely fashion.

  13. Optical security verification for blurred fingerprints

    NASA Astrophysics Data System (ADS)

    Soon, Boon Y.; Karim, Mohammad A.; Alam, Mohammad S.

    1998-12-01

    Optical fingerprint security verification is gaining popularity, as it has the potential to perform correlation at the speed of light. With advancement in optical security verification techniques, the authentication process can be almost foolproof and reliable for financial transactions, banking, etc. In law enforcement, when a fingerprint is obtained from a crime scene, it may be blurred and can be an unhealthy candidate for correlation purposes. Therefore, the blurred fingerprint needs to be clarified before it is used for the correlation process. There are several different types of blur, such as linear motion blur and defocus blur induced by aberrations of the imaging system. In addition, we may or may not know the blur function. In this paper, we propose non-singularity inverse filtering in the frequency/power domain for deblurring known motion-induced blur in fingerprints. This filtering process will be incorporated with the power spectrum subtraction technique, the uniqueness comparison scheme, and the separated target and reference planes method in the joint transform correlator. The proposed hardware implementation is a hybrid electronic-optical correlator system. The performance of the proposed system is verified with computer simulation for both cases: with and without additive random noise corruption.
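
    The deblurring step amounts to frequency-domain inverse filtering in which the near-singular frequencies of the known blur function are suppressed; the numpy sketch below illustrates the idea (the threshold and blur kernel are hypothetical, not the authors' exact filter).

      # Non-singular inverse filtering sketch: divide by the blur's frequency
      # response only where it is safely above zero (eps and the motion-blur
      # kernel are illustrative; the paper pairs this with power spectrum
      # subtraction in a joint transform correlator).
      import numpy as np

      def inverse_filter(blurred, kernel, eps=1e-3):
          H = np.fft.fft2(kernel, s=blurred.shape)  # blur transfer function
          G = np.fft.fft2(blurred)
          mask = np.abs(H) > eps                    # keep non-singular bins only
          H_inv = np.zeros_like(H)
          H_inv[mask] = 1.0 / H[mask]
          return np.real(np.fft.ifft2(G * H_inv))

      rng = np.random.default_rng(0)
      image = rng.random((64, 64))
      kernel = np.zeros((64, 64))
      kernel[0, :5] = 1.0 / 5.0                     # horizontal motion blur
      blurred = np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(kernel)))
      print(np.allclose(inverse_filter(blurred, kernel), image, atol=1e-6))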

  14. The new geospatial tools: global transparency enhancing safeguards verification

    SciTech Connect

    Pabian, Frank Vincent

    2010-09-16

    This paper focuses on the importance and potential role of the new, freely available, geospatial tools for enhancing IAEA safeguards and how, together with commercial satellite imagery, they can be used to promote 'all-source synergy'. As additional 'open sources', these new geospatial tools have heralded a new era of 'global transparency' and they can be used to substantially augment existing information-driven safeguards gathering techniques, procedures, and analyses in the remote detection of undeclared facilities, as well as support ongoing monitoring and verification of various treaty (e.g., NPT, FMCT) relevant activities and programs. As an illustration of how these new geospatial tools may be applied, an original case study shows how it is possible to derive value-added follow-up information from some recent public media reporting of a former clandestine underground plutonium production complex (now being converted to a 'Tourist Attraction' given the site's abandonment by China in the early 1980s). That open-source media reporting, when combined with subsequent commentary found in various Internet-based blogs and wikis, led to independent verification of the reporting with additional ground truth via 'crowdsourcing' (tourist photos as found on 'social networking' venues like Google Earth's Panoramio layer and Twitter). Confirmation of the precise geospatial location of the site (along with a more complete facility characterization incorporating 3-D modeling and visualization) was only made possible following the acquisition of higher-resolution commercial satellite imagery that could be correlated with the reporting, ground photos, and an interior diagram, through original analysis of the overhead imagery.

  15. A Review of Approaches to the Study of Turbulence Modification by Means of Non-Newtonian Additives

    DTIC Science & Technology

    1987-12-10

    interesting class of polymers for drag reduction studies. These polymers (Xanthomonas campestris, guar gum) were found to show long term stability...and their superiority to "typical" drag reducing polymers like polyethylene oxide was evident. His degradation studies showed that the Xanthomonas ... campestris is a better and more durable additive than guar gum. One of the most important biological additives is DNA. It is actually a random

  16. Formal verification of mathematical software

    NASA Technical Reports Server (NTRS)

    Sutherland, D.

    1984-01-01

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  17. A pilot-scale study of selective desulfurization via urea addition in iron ore sintering

    NASA Astrophysics Data System (ADS)

    Long, Hong-ming; Wu, Xue-jian; Chun, Tie-jun; Di, Zhan-xia; Wang, Ping; Meng, Qing-min

    2016-11-01

    The iron ore sintering process is the main source of SO2 emissions in the iron and steel industry. In our previous research, we proposed a novel technology for reducing SO2 emissions in the flue gas in the iron ore sintering process by adding urea at a given distance from the sintering grate bar. In this paper, a pilot-scale experiment was carried out in a commercial sintering plant. The results showed that, compared to the SO2 concentration in flue gas without urea addition, the SO2 concentration decreased substantially from 694.2 to 108.0 mg/m3 when 0.10wt% urea was added. NH3 decomposed by urea reacted with SO2 to produce (NH4)2SO4, decreasing the SO2 concentration in the flue gas.

  18. THERMODYNAMIC STUDY OF THE NICKEL ADDITION IN ZINC HOT-DIP GALVANIZING BATHS

    SciTech Connect

    Pistofidis, N.; Vourlias, G.

    2010-01-21

    A usual practice during zinc hot-dip galvanizing is the addition of nickel to the liquid zinc, which is used to inhibit the Sandelin effect. Its action is due to the fact that the zeta (ζ) phase of the Fe-Zn system is replaced by the tau (τ) phase of the Fe-Zn-Ni system. In the present work an attempt is made to explain the formation of the τ phase with thermodynamics. For this reason the Gibbs free energy changes for the τ and ζ phases were calculated. The excess free energy for the system was calculated with the Redlich-Kister polynomial. From this calculation it was deduced that the Gibbs energy change for the τ phase is negative. As a result, its formation is spontaneous.
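
    The Redlich-Kister expansion referred to above writes the excess Gibbs energy of a binary solution as G_ex = x1*x2*sum_k L_k*(x1 - x2)^k; the sketch below evaluates it with hypothetical interaction parameters (the paper's Fe-Zn-Ni assessment uses fitted values).

      # Redlich-Kister excess Gibbs energy for a binary solution
      # (interaction parameters L are hypothetical placeholders, in J/mol).
      def g_excess(x1, L=(-12000.0, 3000.0, -500.0)):
          x2 = 1.0 - x1
          return x1 * x2 * sum(Lk * (x1 - x2) ** k for k, Lk in enumerate(L))

      for x1 in (0.1, 0.3, 0.5, 0.7, 0.9):
          print(f"x1 = {x1:.1f}  G_ex = {g_excess(x1):8.1f} J/mol")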

  19. Additive Manufacturing of a Microbial Fuel Cell—A detailed study

    PubMed Central

    Calignano, Flaviana; Tommasi, Tonia; Manfredi, Diego; Chiolerio, Alessandro

    2015-01-01

    In contemporary society we observe an everlasting permeation of electronic devices, smartphones, portable computing tools. The tiniest living organisms on Earth could become the key to address this challenge: energy generation by bacterial processes from renewable stocks/waste through devices such as microbial fuel cells (MFCs). However, the application of this solution was limited by a moderately low efficiency. We explored the limits, if any, of additive manufacturing (AM) technology to fabricate a fully AM-based powering device, exploiting low density, open porosities able to host the microbes, systems easy to fuel continuously and to run safely. We obtained an optimal energy recovery close to 3 kWh m−3 per day that can power sensors and low-power appliances, allowing data processing and transmission from remote/harsh environments. PMID:26611142

  20. Preliminary study of neutron absorption by concrete with boron carbide addition

    SciTech Connect

    Abdullah, Yusof; Yusof, Mohd Reusmaazran; Zali, Nurazila Mat; Ahmad, Megat Harun Al Rashid Megat; Yazid, Hafizal; Ariffin, Fatin Nabilah Tajul; Ahmad, Sahrim; Hamid, Roszilah; Mohamed, Abdul Aziz

    2014-02-12

    Concrete has become a conventional material in the construction of nuclear reactors due to properties such as safety and low cost. Boron carbide was added as an additive in the concrete construction as it has good neutron absorption properties. Concrete samples were prepared with different weight percentages of boron carbide powder. The neutron absorption rate of these samples was determined by using a fast neutron source of Americium-241/Be (Am-Be 241) and detection with a portable backscattering neutron detector. Concrete with 20 wt% of boron carbide shows the lowest count of transmitted neutrons, indicating that most of the neutrons have been absorbed by the concrete. Higher boron carbide content may affect the concrete strength and other properties.
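
    Transmission through a shield of thickness x is commonly modeled as exponential attenuation, I/I0 = exp(-Σx), with Σ the macroscopic removal cross-section; the sketch below compares hypothetical Σ values for plain and borated concrete (illustrative numbers, not the paper's measurements).

      # Exponential attenuation sketch for fast neutrons through a 30 cm slab
      # (the Sigma values for plain vs. B4C-loaded concrete are illustrative
      # placeholders, not measured data).
      import math

      def transmitted_fraction(sigma_per_cm, thickness_cm):
          return math.exp(-sigma_per_cm * thickness_cm)

      for label, sigma in (("plain concrete", 0.09), ("20 wt% B4C concrete", 0.14)):
          frac = transmitted_fraction(sigma, thickness_cm=30.0)
          print(f"{label:20s} -> {frac:.3f} of incident neutrons transmitted")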

  1. Additive Manufacturing of a Microbial Fuel Cell--A detailed study.

    PubMed

    Calignano, Flaviana; Tommasi, Tonia; Manfredi, Diego; Chiolerio, Alessandro

    2015-11-27

    In contemporary society we observe an everlasting permeation of electronic devices, smartphones, portable computing tools. The tiniest living organisms on Earth could become the key to address this challenge: energy generation by bacterial processes from renewable stocks/waste through devices such as microbial fuel cells (MFCs). However, the application of this solution was limited by a moderately low efficiency. We explored the limits, if any, of additive manufacturing (AM) technology to fabricate a fully AM-based powering device, exploiting low density, open porosities able to host the microbes, systems easy to fuel continuously and to run safely. We obtained an optimal energy recovery close to 3 kWh m(-3) per day that can power sensors and low-power appliances, allowing data processing and transmission from remote/harsh environments.

  2. Additive Manufacturing of a Microbial Fuel Cell—A detailed study

    NASA Astrophysics Data System (ADS)

    Calignano, Flaviana; Tommasi, Tonia; Manfredi, Diego; Chiolerio, Alessandro

    2015-11-01

    In contemporary society we observe an everlasting permeation of electronic devices, smartphones, portable computing tools. The tiniest living organisms on Earth could become the key to address this challenge: energy generation by bacterial processes from renewable stocks/waste through devices such as microbial fuel cells (MFCs). However, the application of this solution was limited by a moderately low efficiency. We explored the limits, if any, of additive manufacturing (AM) technology to fabricate a fully AM-based powering device, exploiting low density, open porosities able to host the microbes, systems easy to fuel continuously and to run safely. We obtained an optimal energy recovery close to 3 kWh m-3 per day that can power sensors and low-power appliances, allowing data processing and transmission from remote/harsh environments.

  3. Preliminary study of neutron absorption by concrete with boron carbide addition

    NASA Astrophysics Data System (ADS)

    Abdullah, Yusof; Ariffin, Fatin Nabilah Tajul; Hamid, Roszilah; Yusof, Mohd Reusmaazran; Zali, Nurazila Mat; Ahmad, Megat Harun Al Rashid Megat; Yazid, Hafizal; Ahmad, Sahrim; Mohamed, Abdul Aziz

    2014-02-01

    Concrete has become a conventional material in the construction of nuclear reactors due to properties such as safety and low cost. Boron carbide was added as an additive in the concrete construction as it has good neutron absorption properties. Concrete samples were prepared with different weight percentages of boron carbide powder. The neutron absorption rate of these samples was determined by using a fast neutron source of Americium-241/Be (Am-Be 241) and detection with a portable backscattering neutron detector. Concrete with 20 wt% of boron carbide shows the lowest count of transmitted neutrons, indicating that most of the neutrons have been absorbed by the concrete. Higher boron carbide content may affect the concrete strength and other properties.

  4. CHEMICAL INDUCTION MIXER VERIFICATION - ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Wet-Weather Flow Technologies Pilot of the Environmental Technology Verification (ETV) Program, which is supported by the U.S. Environmental Protection Agency and facilitated by NSF International, has recently evaluated the performance of chemical induction mixers used for di...

  5. Professional Competence Development of the Social Work Specialists in the Period of Study in the System of Additional Education

    ERIC Educational Resources Information Center

    Davletkaliev, Denis Kuanyshevich; Zueva, Natalia Konstantinovna; Lebedeva, Natalya Vasilevna; Mkrtumova, Irina Vladimirovna; Timofeeva, Olga

    2015-01-01

    The goal of this work is the study of psychological-pedagogical approaches to the understanding of the idea of professional competence of social work specialists, as well as the role of study in the system of additional education in the professional-personal development of the listeners. In the process of studying this problem we define main…

  6. Spectroscopic Evidence for Covalent Binding of Sulfadiazine to Natural Soils via 1,4-nucleophilic addition (Michael Type Addition) studied by Spin Labeling ESR

    NASA Astrophysics Data System (ADS)

    Aleksandrova, Olga

    2015-04-01

    Among different classes of veterinary pharmaceuticals, sulfadiazine (SDZ) is widely used in animal husbandry. Its residues have been detected in different environmental compartments. However, soil is a hot spot for SDZ, as it receives a large portion of the excreted compound through the application of manure during soil fertilization. Ample studies on the fate of SDZ in soils showed that a large portion forms nonextractable residues (NER) along with transformation products, with low mineralization (Mueller et al., 2013). A common observation was an initially fast formation of NER, up to 10% of the applied amount, promptly after the application of SDZ to soil, and this portion increased up to 50% within a few days (Mueller et al., 2013; Nowak et al., 2011). A common finding for SDZ, as for other sulfonamides, was biphasic kinetics of the formation of NER, which was attributed to the occurrence of two reaction processes: a rapid, often reversible process and a slower, irreversible process (Weber et al., 1996). A single-phase reaction process was also established under anaerobic treatment (Gulkowska et al., 2014). A major focus of this work is to elucidate the reaction mechanism of covalent binding of SDZ to soil, which is currently required to estimate the risk to human health of NER formed by SDZ in soils. Taking into account the key role of the amine functional groups of SDZ in its reactivity in soil, soil samples were labeled with nitroxide radicals bearing aromatic or aliphatic amines and then investigated by means of ESR spectroscopy. 2,5,5-Trimethyl-2-(3-aminophenyl)pyrrolidin-1-yloxy and 4-amino-2,2,6,6-tetramethylpiperidin-1-oxyl modeled decomposition products of SDZ with the aromatic and aliphatic amines, respectively. The application of the defined combination of both spin labels (SL) to different soils simulated well the change of the paramagnetic signal of soil organic radicals that interacted with SDZ. After their application to soil, SL were found in soil sites characterized

  7. Using the Learning and Study Strategies Inventory Scores as Additional Predictors of Student Academic Performance.

    ERIC Educational Resources Information Center

    Rugsaken, Kris T.; Robertson, Jacqueline A.; Jones, James A.

    1998-01-01

    A study investigated the usefulness of the Learning and Study Strategies Inventory in predicting college students' academic performance, focusing on whether the scores enhance the accuracy of traditional predictors such as college entrance examinations and high school rank. Results indicate the scores produce a slight but not significant increase…

  8. Te Rita Papesch: Case Study of an Exemplary Learner of Maori as an Additional Language

    ERIC Educational Resources Information Center

    Ratima, Matiu Tai; Papesch, Te Rita

    2014-01-01

    This paper presents a case study of the life experiences of one exemplar adult second language Maori learner--Te Rita Papesch. Te Rita was one of 17 participants who were interviewed as a part of the first author's PhD study which sought to answer the question: what factors lead to the development of proficiency in te reo Maori amongst adult…

  9. Spatial Evaluation and Verification of Earthquake Simulators

    NASA Astrophysics Data System (ADS)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2016-09-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against currently observed earthquake seismicity is necessary, and following past simulator and forecast model verification methods, we address the challenges of applying spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element, and a smoothing method based on the power laws of the epidemic-type aftershock (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a rate that decays with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m>6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
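
    The second method can be sketched directly: each simulated event spreads a unit of rate across the test grid with a power-law decay in epicentral distance. The exponent, distance offset, and grid below are hypothetical placeholders, not the paper's calibrated values.

      # Power-law spatial smoothing sketch (ETAS-style): each simulated
      # epicenter contributes rate ~ (r + d)**(-q) over the whole grid,
      # normalized so each event adds unit total rate.
      import numpy as np

      def smoothed_rate_map(events, grid_x, grid_y, q=1.5, d=5.0):
          gx, gy = np.meshgrid(grid_x, grid_y)
          rate = np.zeros_like(gx)
          for ex, ey in events:                  # simulated epicenters (km)
              r = np.hypot(gx - ex, gy - ey)
              kernel = (r + d) ** (-q)
              rate += kernel / kernel.sum()
          return rate

      grid = np.linspace(0.0, 100.0, 51)
      rmap = smoothed_rate_map([(30.0, 40.0), (70.0, 55.0)], grid, grid)
      print(rmap.shape, f"total rate = {rmap.sum():.2f}")  # (51, 51), 2.00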

  10. Aerospace Nickel-cadmium Cell Verification

    NASA Technical Reports Server (NTRS)

    Manzo, Michelle A.; Strawn, D. Michael; Hall, Stephen W.

    2001-01-01

    During the early years of satellites, NASA successfully flew "NASA-Standard" nickel-cadmium (Ni-Cd) cells manufactured by GE/Gates/SAFT on a variety of spacecraft. In 1992 a NASA Battery Review Board determined that the strategy of a NASA Standard Cell and Battery Specification and the accompanying NASA control of a standard manufacturing control document (MCD) for Ni-Cd cells and batteries was unwarranted. As a result of that determination, standards were abandoned and the use of cells other than the NASA Standard was required. In order to gain insight into the performance and characteristics of the various aerospace Ni-Cd products available, tasks were initiated within the NASA Aerospace Flight Battery Systems Program that involved the procurement and testing of representative aerospace Ni-Cd cell designs. A standard set of test conditions was established in order to provide similar information about the products from various vendors. The objective of this testing was to provide independent verification of representative commercial flight cells available in the marketplace today. This paper will provide a summary of the verification tests run on cells from various manufacturers: Sanyo 35 Ampere-hour (Ah) standard and 35 Ah advanced Ni-Cd cells, SAFT 50 Ah Ni-Cd cells, and Eagle-Picher 21 Ah Magnum and 21 Ah Super Ni-Cd™ cells were put through a full evaluation. A limited number of 18 and 55 Ah cells from Acme Electric were also tested to provide an initial evaluation of the Acme aerospace cell designs. Additionally, 35 Ah aerospace-design Ni-MH cells from Sanyo were evaluated under the standard conditions established for this program. The test program is essentially complete. The cell design parameters, the verification test plan and the details of the test results will be discussed.

  11. Sensitization to Food Additives in Patients with Allergy: A Study Based on Skin Test and Open Oral Challenge.

    PubMed

    Moghtaderi, Mozhgan; Hejrati, Zinatosadat; Dehghani, Zahra; Dehghani, Faranak; Kolahi, Niloofar

    2016-06-01

    There has been a great increase in the consumption of various food additives in recent years. The purpose of this study was to identify the incidence of sensitization to food additives by using skin prick tests in patients with allergy and to determine the concordance rate between positive skin tests and oral challenge in hypersensitivity to additives. This cross-sectional study included 125 patients (71 female, 54 male) aged 2-76 years with allergy and 100 healthy individuals. Skin tests were performed in both patient and control groups with 25 fresh food additives. Among patients with allergy, 22.4% showed a positive skin test to at least one of the applied materials. Skin tests were negative to all tested food additives in the control group. Oral food challenge was done in 28 patients with positive skin tests, of whom 9 showed a reaction to the culprit additive (concordance rate = 32.1%). The present study suggested that about one-third of allergic patients with a positive reaction to food additives showed a positive oral challenge; this points to the potential utility of skin testing to identify the role of food additives in patients with allergy.

  12. Exploratory studies of extended storage of apheresis platelets in a platelet additive solution (PAS).

    PubMed

    Slichter, Sherrill J; Corson, Jill; Jones, Mary Kay; Christoffel, Todd; Pellham, Esther; Bailey, S Lawrence; Bolgiano, Doug

    2014-01-09

    To evaluate the poststorage viability of apheresis platelets stored for up to 18 days in 80% platelet additive solution (PAS)/20% plasma, 117 healthy subjects donated platelets using the Haemonetics MCS+, COBE Spectra (Spectra), or Trima Accel (Trima) systems. Control platelets from the same subjects were compared with their stored test PAS platelets by radiolabeling their stored and control platelets with either (51)chromium or (111)indium. Trima platelets met Food and Drug Administration poststorage platelet viability criteria for only 7 days vs almost 13 days for Haemonetics platelets; ie, platelet recoveries after these storage times averaged 44 ± 3% vs 49 ± 3% and survivals were 5.4 ± 0.3 vs 4.6 ± 0.3 days, respectively. The differences in storage duration are likely related to both the collection system and the storage bag. The Spectra and Trima platelets were hyperconcentrated during collection, and PAS was added, whereas the Haemonetics platelets were elutriated with PAS, which may have resulted in less collection injury. When Spectra and Trima platelets were stored in Haemonetics' bags, poststorage viability was significantly improved. Platelet viability is better maintained in vitro than in vivo, allowing substantial increases in platelet storage times. However, implementation will require resolution of potential bacterial overgrowth during storage.

  13. Experimental study on agitated drying characteristics of sewage sludge under the effects of different additive agents.

    PubMed

    Deng, Wenyi; Su, Yaxin

    2014-07-01

    Drying experiments with dewatered sewage sludge (DSS) were conducted in an agitated paddle dryer, and the effects of additive agents, i.e., CaO, pulverized coal (PC), heavy oil (HO), and dried sludge ("DS", through back mixing) on the agitated drying characteristics of DSS were investigated. The results indicated that CaO can significantly increase the drying rate of DSS. The drying rate at CaO/DSS (mass ratio) = 1/100 was 135% higher than that at CaO/DSS = 0. Pulverized coal has no obvious effect on the drying rate, but increasing PC/DSS can promote the breakup of sludge lumps. Heavy oil was found to be slightly effective in improving the drying rate of DSS in the examined experimental range of HO/DSS = 0-1/20. It was also found that HO can reduce the torque on the dryer shaft, due to its lubrication effect. Back mixing of DS was found to be effective in alleviating the unfavorable effect of the lumpy phase by improving the mixing effect of the paddle dryer. There was a marked increase in drying rate with an increase of DS/DSS in the experimental range of DS/DSS = 0-1/3.

  14. Generalized linear and generalized additive models in studies of species distributions: Setting the scene

    USGS Publications Warehouse

    Guisan, A.; Edwards, T.C.; Hastie, T.

    2002-01-01

    An important statistical development of the last 30 years has been the advance in regression analysis provided by generalized linear models (GLMs) and generalized additive models (GAMs). Here we introduce a series of papers prepared within the framework of an international workshop entitled: Advances in GLMs/GAMs modeling: from species distribution to environmental management, held in Riederalp, Switzerland, 6-11 August 2001. We first discuss some general uses of statistical models in ecology, as well as provide a short review of several key examples of the use of GLMs and GAMs in ecological modeling efforts. We next present an overview of GLMs and GAMs, and discuss some of their related statistics used for predictor selection, model diagnostics, and evaluation. Included is a discussion of several new approaches applicable to GLMs and GAMs, such as ridge regression, an alternative to stepwise selection of predictors, and methods for the identification of interactions by a combined use of regression trees and several other approaches. We close with an overview of the papers and how we feel they advance our understanding of their application to ecological modeling. © 2002 Elsevier Science B.V. All rights reserved.
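
    A typical species-distribution GLM of the kind these papers build on is a binomial regression of presence/absence on environmental predictors; the sketch below (Python statsmodels, synthetic data) shows the basic fit that GAMs then generalize with smooth terms.

      # Binomial GLM sketch: presence/absence ~ elevation + rainfall
      # (synthetic data; a GAM would replace the linear predictor with
      # smooth functions of the same covariates).
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(42)
      elevation = rng.uniform(0.0, 2000.0, 300)             # m
      rainfall = rng.uniform(200.0, 1500.0, 300)            # mm/yr
      logit = -2.0 + 0.002 * elevation + 0.001 * rainfall   # "true" model
      presence = (rng.random(300) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

      X = sm.add_constant(np.column_stack([elevation, rainfall]))
      fit = sm.GLM(presence, X, family=sm.families.Binomial()).fit()
      print(fit.params)  # intercept and slopes on the logit scale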

  15. Study on additional carrier sensing for IEEE 802.15.4 wireless sensor networks.

    PubMed

    Lee, Bih-Hwang; Lai, Ruei-Lung; Wu, Huai-Kuei; Wong, Chi-Ming

    2010-01-01

    Wireless sensor networks based on the IEEE 802.15.4 standard are able to achieve low-power transmissions in low-rate, short-distance wireless personal area networks (WPANs). The slotted carrier sense multiple access with collision avoidance (CSMA/CA) is used as the contention mechanism. Sensor nodes perform a backoff process as soon as the clear channel assessment (CCA) detects a busy channel. In doing so they may neglect the implicit information of the failed CCA detection and cause redundant sensing. The blind backoff process in the slotted CSMA/CA results in lower channel utilization. This paper proposes an additional carrier sensing (ACS) algorithm based on IEEE 802.15.4 to enhance the carrier sensing mechanism of the original slotted CSMA/CA. An analytical Markov chain model is developed to evaluate the performance of the ACS algorithm. Both analytical and simulation results show that the proposed algorithm performs better than IEEE 802.15.4, significantly improving throughput, average medium access control (MAC) delay, and the power consumption of CCA detection.
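
    For reference, the blind backoff behavior being improved follows the standard slotted CSMA/CA loop of random backoff, CCA, and backoff-exponent growth; the simplified simulation below uses the 802.15.4 default constants (macMinBE=3, aMaxBE=5, macMaxCSMABackoffs=4) with a random channel standing in for real traffic, and does not reproduce the ACS refinement.

      # Simplified slotted CSMA/CA: two clear CCAs are required to transmit;
      # each failed attempt blindly backs off and grows the backoff exponent
      # (p_busy is a toy channel model, not the paper's Markov analysis).
      import random

      def slotted_csma_ca(p_busy=0.4, min_be=3, max_be=5, max_backoffs=4):
          nb, be = 0, min_be
          while nb <= max_backoffs:
              random.randrange(2 ** be)             # random backoff delay slots
              if random.random() > p_busy and random.random() > p_busy:
                  return True                       # both CCAs clear: transmit
              nb, be = nb + 1, min(be + 1, max_be)  # blind backoff on busy CCA
          return False                              # channel access failure

      trials = 10_000
      rate = sum(slotted_csma_ca() for _ in range(trials)) / trials
      print(f"channel access success rate ~ {rate:.2%}")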

  16. Towards the formal verification of the requirements and design of a processor interface unit: HOL listings

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.

    1993-01-01

    This technical report contains the Higher-Order Logic (HOL) listings of the partial verification of the requirements and design for a commercially developed processor interface unit (PIU). The PIU is an interface chip performing memory interface, bus interface, and additional support services for a commercial microprocessor within a fault tolerant computer system. This system, the Fault Tolerant Embedded Processor (FTEP), is targeted towards applications in avionics and space requiring extremely high levels of mission reliability, extended maintenance-free operation, or both. This report contains the actual HOL listings of the PIU verification as it currently exists. Section two of this report contains general-purpose HOL theories and definitions that support the PIU verification. These include arithmetic theories dealing with inequalities and associativity, and a collection of tactics used in the PIU proofs. Section three contains the HOL listings for the completed PIU design verification. Section 4 contains the HOL listings for the partial requirements verification of the P-Port.

  17. Food additives

    PubMed Central

    Spencer, Michael

    1974-01-01

    Food additives are discussed from the food technology point of view. The reasons for their use are summarized: (1) to protect food from chemical and microbiological attack; (2) to even out seasonal supplies; (3) to improve their eating quality; (4) to improve their nutritional value. The various types of food additives are considered, e.g. colours, flavours, emulsifiers, bread and flour additives, preservatives, and nutritional additives. The paper concludes with consideration of those circumstances in which the use of additives is (a) justified and (b) unjustified. PMID:4467857

  18. Lesion removal and lesion addition algorithms in lung volumetric data sets for perception studies

    NASA Astrophysics Data System (ADS)

    Madsen, Mark T.; Berbaum, Kevin S.; Ellingson, Andrew; Thompson, Brad H.; Mullan, Brian F.

    2006-03-01

    Image perception studies of medical images provide important information about how radiologists interpret images and insights for reducing reading errors. In the past, perception studies have been difficult to perform using clinical imaging studies because of the problems associated with obtaining images demonstrating proven abnormalities and appropriate normal control images. We developed and evaluated interactive software that allows the seamless removal of abnormal areas from CT lung image sets. We have also developed interactive software for capturing lung lesions in a database from which they can be added to lung CT studies. The efficacy of the software for removing abnormal areas of lung CT studies was evaluated psychophysically by having radiologists select the one altered image from a display of four. The software for adding lesions was evaluated by having radiologists classify displayed CT slices with lesions as real or artificial, scaled to 3 levels of confidence. The results of these experiments demonstrated that the radiologists had difficulty distinguishing the raw clinical images from those that had been altered. We conclude that this software can be used to create experimental normal control and "proven" lesion data sets for volumetric CT of the lung fields. We also note that this software can be easily adapted to work with other tissues besides lung and that it can be adapted to other digital imaging modalities.
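
    A hedged sketch of the kind of masked blending such lesion-insertion software might perform (the authors' actual algorithm is not described in this abstract); the array names, HU values, and feathered mask are all invented.

```python
# Illustrative lesion insertion by masked alpha-blending (not the authors'
# code). 'lesion' is a captured lesion patch, 'mask' a soft (0..1) alpha
# mask of the same shape, and (r, c) the top-left insertion position.
import numpy as np

def insert_lesion(ct_slice, lesion, mask, r, c):
    out = ct_slice.astype(np.float32).copy()
    h, w = lesion.shape
    region = out[r:r + h, c:c + w]
    # Soft blend so lesion edges merge with the surrounding lung texture.
    out[r:r + h, c:c + w] = (1.0 - mask) * region + mask * lesion
    return out

# Tiny synthetic demo: 64x64 slice, 8x8 bright lesion, feathered mask.
ct = np.full((64, 64), -800.0)                  # lung-like HU background
lesion = np.full((8, 8), 20.0)                  # soft-tissue-like HU patch
yy, xx = np.mgrid[-4:4, -4:4]
mask = np.clip(1.5 - np.hypot(yy + 0.5, xx + 0.5) / 3.0, 0.0, 1.0)
blended = insert_lesion(ct, lesion, mask, 28, 28)
print(blended[28:36, 28:36].round(1))
```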

  19. Additional Study of Water Droplet Median Volume Diameter (MVD) Effects on Ice Shapes

    NASA Technical Reports Server (NTRS)

    Tsao, Jen-Ching; Anderson, David N.

    2005-01-01

    This paper reports the result of an experimental study in the NASA Glenn Icing Research Tunnel (IRT) to evaluate how well the MVD-independent effect identified previously might apply to SLD conditions in rime icing situations. Models were NACA 0012 wing sections with chords of 53.3 and 91.4 cm. Tests were conducted at a nominal airspeed of 77 m/s (150 kt) with MVDs ranging from 15 to 100 µm and LWCs of 0.5 to 1 g/cu m. In the present study, ice shapes recorded from past studies and recent results at SLD and Appendix-C conditions are reviewed to show that droplet diameter is not important to rime ice shape for MVDs of 30 µm or larger, but for drop sizes below 30 µm a rime ice shape transition from convex to wedge to spearhead type ice shape is observed.

  20. Study on Type C Coal Fly ash as an Additive to Molding Sand for Steel Casting

    NASA Astrophysics Data System (ADS)

    Palaniappan, Jayanthi

    2017-04-01

    Physico-chemical property studies such as granulometric analysis, moisture content, and X-ray fluorescence were performed on Type C coal-combustion fly ash to investigate its potential as a distinct alternative to molding sand in the foundry, thereby reducing the dependency on the latter. Technological property studies such as compressive strength, tensile strength, permeability, and compaction of various compositions of fly ash molding sand (10, 20 and 30 % fly ash substituted for chemically bonded sand) were performed and compared with silica molding sand. Steel castings were produced using this fly ash molding sand, and the casting surface finish and typical casting parameters were assessed. It was noted that a good-quality steel casting could be produced using Type C fly ash molding sand, which effectively replaced 20 % of the traditional molding sand and binders, thereby providing greater financial profit to the foundry and an effective means of fly ash utilization (waste management).

  1. Voice measures of workload in the advanced flight deck: Additional studies

    NASA Technical Reports Server (NTRS)

    Schneider, Sid J.; Alpert, Murray

    1989-01-01

    These studies investigated acoustical analysis of the voice as a measure of workload in individual operators. In the first study, voice samples were recorded from a single operator during high, medium, and low workload conditions. Mean amplitude, frequency, syllable duration, and emphasis all tended to increase as workload increased. In the second study, NASA test pilots performed a laboratory task, and used a flight simulator under differing work conditions. For two of the pilots, high workload in the simulator brought about greater amplitude, peak duration, and stress. In both the laboratory and simulator tasks, high workload tended to be associated with more statistically significant drop-offs in the acoustical measures than were lower workload levels. There was a great deal of intra-subject variability in the acoustical measures. The results suggested that in individual operators, increased workload might be revealed by high initial amplitude and frequency, followed by rapid drop-offs over time.

  2. Study on Type C Coal Fly ash as an Additive to Molding Sand for Steel Casting

    NASA Astrophysics Data System (ADS)

    Palaniappan, Jayanthi

    2016-05-01

    Physico-chemical property studies such as granulometric analysis, moisture content, and X-ray fluorescence were performed on Type C coal-combustion fly ash to investigate its potential as a distinct alternative to molding sand in the foundry, thereby reducing the dependency on the latter. Technological property studies such as compressive strength, tensile strength, permeability, and compaction of various compositions of fly ash molding sand (10, 20 and 30 % fly ash substituted for chemically bonded sand) were performed and compared with silica molding sand. Steel castings were produced using this fly ash molding sand, and the casting surface finish and typical casting parameters were assessed. It was noted that a good-quality steel casting could be produced using Type C fly ash molding sand, which effectively replaced 20 % of the traditional molding sand and binders, thereby providing greater financial profit to the foundry and an effective means of fly ash utilization (waste management).

  3. A methodology for model-based development and automated verification of software for aerospace systems

    NASA Astrophysics Data System (ADS)

    Martin, L.; Schatalov, M.; Hagner, M.; Goltz, U.; Maibaum, O.

    Today's software for aerospace systems is typically very complex. This is due to the increasing number of features as well as the high demands on safety, reliability, and quality. This complexity also leads to significantly higher software development costs. To handle the software complexity, a structured development process is necessary. Additionally, compliance with relevant standards for quality assurance is a mandatory concern. To assure high software quality, techniques for verification are necessary. Besides traditional techniques like testing, automated verification techniques like model checking are becoming more popular. The latter examine the whole state space and, consequently, provide full test coverage. Nevertheless, despite the obvious advantages, this technique is as yet rarely used for the development of aerospace systems. In this paper, we propose a tool-supported methodology for the development and formal verification of safety-critical software in the aerospace domain. The methodology relies on the V-Model and defines a comprehensive work flow for model-based software development as well as automated verification in compliance with the European standard series ECSS-E-ST-40C. Furthermore, our methodology supports the generation and deployment of code. For tool support we use SCADE Suite (Esterel Technologies), an integrated design environment that covers all the requirements of our methodology. The SCADE Suite is well established in the avionics and defense, rail transportation, energy, and heavy equipment industries. For evaluation purposes, we apply our approach to an up-to-date case study of the TET-1 satellite bus. In particular, the attitude and orbit control software is considered. The behavioral models for the subsystem are developed, formally verified, and optimized.
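
    As a toy illustration of the exhaustive state-space exploration that distinguishes model checking from testing (this is not SCADE's verifier, just a minimal sketch), the following breadth-first search visits every reachable state of a small transition system and checks an invariant on each state.

```python
# Toy explicit-state model checker (illustrative; real tools are far more
# sophisticated). Exhaustively explores the reachable state space and
# checks an invariant on every state -- the 'full coverage' property
# mentioned above.
from collections import deque

def check_invariant(initial, successors, invariant):
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        if not invariant(state):
            return False, state                 # counterexample found
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return True, None

# Example: a mod-8 counter can never reach the value 9 (trivially true).
ok, bad = check_invariant(0, lambda s: [(s + 1) % 8], lambda s: s != 9)
print("invariant holds:", ok)
```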

  4. Automated verification system user's guide

    NASA Technical Reports Server (NTRS)

    Hoffman, R. H.

    1972-01-01

    Descriptions of the operational requirements for all of the programs of the Automated Verification System (AVS) are provided. The AVS programs are: (1) FORTRAN code analysis and instrumentation program (QAMOD); (2) Test Effectiveness Evaluation Program (QAPROC); (3) Transfer Control Variable Tracking Program (QATRAK); (4) Program Anatomy Table Generator (TABGEN); and (5) Network Path Analysis Program (RAMBLE).

  5. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-07-16

    This paper will explore the difficulties of deep reductions by examining the technical verification challenges. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking beyond New START, the next step will likely include warhead limits in the neighborhood of 1000 (Pifer 2010). Further reductions will include stepping stones at hundreds of warheads, and then tens of warheads, before final elimination of the last few remaining warheads and weapons could be considered. This paper will focus on these three threshold reduction levels: 1000, hundreds, and tens of warheads. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain-of-custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national lab complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  6. A Micro-Developmental Approach to Studying Young Children's Problem Solving Behavior in Addition

    ERIC Educational Resources Information Center

    Voutsina, Chronoula

    2012-01-01

    This paper presents a study that investigated the process of change in 5-6-year-old children's successful problem-solving approaches when tackling a multiple-step task in elementary arithmetic. Micro-developmental changes in children's successful problem-solving behavior were analyzed using Karmiloff-Smith's model of representational redescription…

  7. Genome-Wide Association Study of Intelligence: Additive Effects of Novel Brain Expressed Genes

    ERIC Educational Resources Information Center

    Loo, Sandra K.; Shtir, Corina; Doyle, Alysa E.; Mick, Eric; McGough, James J.; McCracken, James; Biederman, Joseph; Smalley, Susan L.; Cantor, Rita M.; Faraone, Stephen V.; Nelson, Stanley F.

    2012-01-01

    Objective: The purpose of the present study was to identify common genetic variants that are associated with human intelligence or general cognitive ability. Method: We performed a genome-wide association analysis with a dense set of 1 million single-nucleotide polymorphisms (SNPs) and quantitative intelligence scores within an ancestrally…

  8. A Study of the Effect of Additional Reading Assistance on Student Achievement

    ERIC Educational Resources Information Center

    Gillan-Sanderson, Nicole A.

    2012-01-01

    This study describes a procedure one school district used to increase students' reading abilities through reviewing data and adjusting the instruction to give students intensive services, as needed. This school worked in a problem-solving team approach to develop a comprehensive team that followed the progression of student achievement.…

  9. Nahuatl as a Classical, Foreign, and Additional Language: A Phenomenological Study

    ERIC Educational Resources Information Center

    De Felice, Dustin

    2012-01-01

    In this study, participants learning an endangered language variety shared their experiences, thoughts, and feelings about the often complex and diverse language-learning process. I used phenomenological interviews in order to learn more about these English or Spanish language speakers' journey with the Nahuatl language. From first encounter to…

  10. CNV-based genome wide association study reveals additional variants contributing to meat quality in swine

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Pork quality is important both to the meat processing industry and consumers’ purchasing attitudes. Copy number variation (CNV) is a burgeoning kind of variant that may influence meat quality. Herein, a genome-wide association study (GWAS) was performed between CNVs and meat quality traits in swine....

  11. Mapping {sup 15}O Production Rate for Proton Therapy Verification

    SciTech Connect

    Grogg, Kira; Alpert, Nathaniel M.; Zhu, Xuping; Min, Chul Hee; Testa, Mauro; Winey, Brian; Normandin, Marc D.; Shih, Helen A.; Paganetti, Harald; Bortfeld, Thomas; El Fakhri, Georges

    2015-06-01

    Purpose: This work was a proof-of-principle study for the evaluation of oxygen-15 ({sup 15}O) production as an imaging target through the use of positron emission tomography (PET), to improve verification of proton treatment plans and to study the effects of perfusion. Methods and Materials: Dynamic PET measurements of irradiation-produced isotopes were made for a phantom and rabbit thigh muscles. The rabbit muscle was irradiated and imaged under both live and dead conditions. A differential equation was fitted to phantom and in vivo data, yielding estimates of {sup 15}O production and clearance rates, which were compared to live versus dead rates for the rabbit and to Monte Carlo predictions. Results: PET clearance rates agreed with decay constants of the dominant radionuclide species in 3 different phantom materials. In 2 oxygen-rich materials, the ratio of {sup 15}O production rates agreed with the expected ratio. In the dead rabbit thighs, the dynamic PET concentration histories were accurately described using {sup 15}O decay constant, whereas the live thigh activity decayed faster. Most importantly, the {sup 15}O production rates agreed within 2% (P>.5) between conditions. Conclusions: We developed a new method for quantitative measurement of {sup 15}O production and clearance rates in the period immediately following proton therapy. Measurements in the phantom and rabbits were well described in terms of {sup 15}O production and clearance rates, plus a correction for other isotopes. These proof-of-principle results support the feasibility of detailed verification of proton therapy treatment delivery. In addition, {sup 15}O clearance rates may be useful in monitoring permeability changes due to therapy.
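
    The abstract does not give the fitted differential equation explicitly; a plausible minimal form for the post-irradiation phase is a single exponential whose rate is the sum of the {sup 15}O physical decay constant and a biological clearance rate. The sketch below fits that assumed model to synthetic data with scipy; all numbers are invented.

```python
# Hedged sketch: fit an effective clearance rate to post-irradiation PET
# activity, assuming A(t) = A0 * exp(-(lam + k) * t), where lam is the 15O
# physical decay constant and k a biological clearance rate. This is an
# illustration, not the paper's exact model.
import numpy as np
from scipy.optimize import curve_fit

LAMBDA_O15 = np.log(2) / 122.24          # 15O half-life ~122.24 s

def model(t, a0, k_clear):
    return a0 * np.exp(-(LAMBDA_O15 + k_clear) * t)

t = np.linspace(0, 600, 61)
true = model(t, 100.0, 0.002)            # synthetic 'live tissue' data
noisy = true + np.random.default_rng(0).normal(0, 1.0, t.size)

(a0, k), _ = curve_fit(model, t, noisy, p0=(90.0, 0.001))
print(f"fitted A0 = {a0:.1f}, clearance k = {k:.4f} /s")
```

    In dead tissue one would expect k near zero (pure physical decay), matching the abstract's observation that the live thigh activity decayed faster than the {sup 15}O decay constant alone.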

  12. Thermal analysis studies of Ge additive of Se-Te glasses

    NASA Astrophysics Data System (ADS)

    Mohamed, M.; Abdel-Rahim, M. A.

    2016-04-01

    GexSe50Te50-x (x = 5, 15, 20, 35 at.%) bulk glasses were synthesized by the melt quenching method. The amorphous nature of the investigated glasses was confirmed by X-ray diffraction. Results of differential scanning calorimetry (DSC) of the studied compositions under non-isothermal conditions are reported and discussed. The glass transition temperature (Tg), onset crystallization temperature (Tc), and crystallization peak temperature (Tp) were determined from DSC traces at different heating rates. It was found that the values of Tg, Tc, and Tp depend on both composition and heating rate. Two crystallization stages were observed in the DSC results. Various kinetics parameters such as the glass transition activation energy (Eg), crystallization activation energy (Ec), and rate constant (Kp) were calculated. The glass-forming ability of the studied compositions is discussed as a function of the determined kinetics parameters.
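
    The abstract does not state which non-isothermal method was used to extract the activation energies; a common choice for DSC data taken at several heating rates is the Kissinger relation, quoted here only as the standard form:

```latex
\ln\!\left(\frac{\beta}{T_p^{2}}\right) \;=\; -\,\frac{E_c}{R\,T_p} \;+\; \mathrm{const}
```

    where β is the heating rate, Tp the crystallization peak temperature, and R the gas constant; Ec then follows from the slope of ln(β/Tp²) plotted against 1/Tp across the heating rates.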

  13. Comparative study of glycine single crystals with additive of potassium nitrate in different concentration ratios

    NASA Astrophysics Data System (ADS)

    Gujarati, Vivek P.; Deshpande, M. P.; Patel, Kamakshi R.; Chaki, S. H.

    2016-05-01

    Semi-organic crystals of glycine potassium nitrate (GPN) with potential applications in nonlinear optics (NLO) were grown using the slow evaporation technique. Glycine and potassium nitrate were taken in three different concentration ratios of 3:1, 2:1 and 1:1, respectively. We checked the solubility of the material in distilled water at different temperatures and could observe the growth of crystals within 7 weeks. The purity of the grown crystals was confirmed by energy dispersive X-ray analysis (EDAX) and CHN analysis. The powder X-ray diffraction pattern of GPN was recorded to confirm its crystalline nature. To assess the suitability of the grown crystals for applications in the opto-electronics field, a UV-Vis-NIR study was carried out. Dielectric properties of the samples were studied over the frequency range 1 Hz to 100 kHz.

  14. Brief reconnaissance study for the addition of hydropower for Carr Fork Dam, Sassafras, Kentucky

    SciTech Connect

    Gebhard, T.G. Jr.

    1982-05-24

    The feasibility of retrofitting the Carr Fork Dam near Hazard, KY for power generation was examined. This dam has a developable head of 80 ft and was built in 1975 to provide flood protection. The study of environmental, institutional, safety, and economic factors showed that the total investment cost would be $909,600 and that hydroelectric power development at this site is not feasible unless a higher price can be obtained for the power sold. (LCL)

  15. Brief reconnaissance study for the addition of hydropower for Spray Dam, Eden, North Carolina

    SciTech Connect

    Gebhard, T.G. Jr.

    1981-01-30

    The feasibility of retrofitting the Spray Dam near Eden, North Carolina for power generation was examined. This dam, which has a developable head of 10 ft, was built in 1898 for hydroelectric power generation; one of its two installed units is currently operating. The study of environmental, institutional, safety, and economic factors showed that hydroelectric power development at this site is possible and that the economics of a retrofit will depend on whether the existing equipment can be repaired or will have to be replaced. (LCL)

  16. Pairwise Identity Verification via Linear Concentrative Metric Learning.

    PubMed

    Zheng, Lilei; Duffner, Stefan; Idrissi, Khalid; Garcia, Christophe; Baskurt, Atilla

    2016-12-16

    This paper presents a study of metric learning systems on pairwise identity verification, including pairwise face verification and pairwise speaker verification, respectively. These problems are challenging because the individuals in training and testing are mutually exclusive, and also because of the probable setting of limited training data. For such pairwise verification problems, we present a general framework of metric learning systems and employ the stochastic gradient descent algorithm as the optimization solution. We have studied both similarity metric learning and distance metric learning systems, of either a linear or a shallow nonlinear model, under both restricted and unrestricted training settings. Extensive experiments demonstrate that with limited training pairs, learning a linear system on similar pairs only is preferable due to its simplicity and superiority, i.e., it generally achieves competitive performance on both the Labeled Faces in the Wild face dataset and the NIST speaker dataset. It is also found that a pretrained deep nonlinear model helps to improve the face verification results significantly.
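
    As a rough sketch of the linear, similar-pairs-only setting the authors find preferable (this is not their system; the loss, margin, and dimensions are invented), the following learns a linear map W with SGD so that the bilinear score (Wx)·(Wy) of genuine pairs exceeds a margin.

```python
# Illustrative linear similarity metric trained by SGD on similar pairs
# only (not the authors' system). Score s(x, y) = (Wx).(Wy).
import numpy as np

rng = np.random.default_rng(0)
d, k, lr = 16, 8, 0.01
W = rng.normal(0.0, 0.1, (k, d))

def score(W, x, y):
    return float(np.dot(W @ x, W @ y))

for _ in range(2000):
    base = rng.normal(size=d)                    # one synthetic identity
    x = base + 0.1 * rng.normal(size=d)          # two noisy samples of it
    y = base + 0.1 * rng.normal(size=d)
    if score(W, x, y) < 1.0:                     # hinge: only unsatisfied pairs
        # d/dW of (Wx).(Wy) is W (x y^T + y x^T); ascend to raise the score.
        W += lr * (W @ (np.outer(x, y) + np.outer(y, x)))

sim = score(W, base, base + 0.1 * rng.normal(size=d))
diff = score(W, base, rng.normal(size=d))
print(f"similar pair: {sim:.2f}  dissimilar pair: {diff:.2f}")
```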

  17. Additional studies of sheep haemopexin: genetic control, frequencies and postnatal development.

    PubMed

    Stratil, A; Bobák, P; Margetín, M; Glasnák, V

    1989-01-01

    This study presents evidence that sheep haemopexin phenotypes are genetically controlled by three alleles, HpxA, HpxB1 and HpxB2, of a single autosomal locus. Frequencies of two alleles, HpxA and HpxB (HpxB encompasses two isoalleles, HpxB1 and HpxB2), were studied in eight sheep breeds in Czechoslovakia. The frequency of the HpxA allele was highest (ranging from 0.81 in Merino to 1.0 in East Friesian sheep). Qualitative and quantitative changes in haemopexin during postnatal development were studied by starch gel electrophoresis and rocket immunoelectrophoresis, respectively. In electrophoresis, 1- or 2-day-old lambs had two very weak zones corresponding in mobility to the two slower zones of adult animals. Later, a third, more anodic zone appeared and gradually increased in intensity. In 1-month-old lambs the patterns were practically identical to those of adult animals. By rocket immunoelectrophoresis, the level of haemopexin shortly after birth was practically zero. It rose sharply until the sixth day of life; the level then continued to rise slowly until about 1 month of age. The mean haemopexin level in adult sheep was 64.5 +/- 18.26 (SD) mg/100 ml serum, ranging from 30.5 to 116.5 mg/100 ml.

  18. A clinical comparative study of Cadiax Compact II and intraoral records using wax and addition silicone.

    PubMed

    Torabi, Kianoosh; Pour, Sasan Rasaei; Ahangari, Ahmad Hassan; Ghodsi, Safoura

    2014-01-01

    Evaluation of mandibular movements is necessary to form the occlusal anatomical contour, analyze the temporomandibular joint status, and evaluate the patient's occlusion. This clinical study was conducted to compare the mandibular recording device Cadiax Compact II with routine intraoral records for measuring condylar inclinations. The results showed that the differences between Cadiax and intraoral records were statistically significant for all measurements. Cadiax measurements had a stronger correlation with silicone records. The quantities of recorded Bennett angles were lower and the values of sagittal condylar inclination were higher with Cadiax than with routine intraoral records.

  19. [Anatomical and radiological studies on additional mandible teeth anesthesia considering innervation variability].

    PubMed

    Tarasenko, S V; Dydykin, S S; Kuzin, A V

    2013-01-01

    The paper presents studies on the nutritional foramina of the mandible. Some nutritional foramina, located in the frontal mandibular region on the lingual surface and containing significant blood vessels and nerves, are found to be more typical of the teeth-bearing mandible. In the retromolar area, where third molars are present, intraosseous canals leading to the inferior alveolar nerve canal were revealed. Intraligamental and lingual anesthesia should be considered for the extraction of lower incisors. Intraosseous anesthesia and infiltration of the retromolar area significantly increase anesthesia efficiency for third molar extraction.

  20. Verification and validation of control system software

    SciTech Connect

    Munro, J.K. Jr.; Kisner, R.A.; Bhadtt, S.C.

    1991-01-01

    The following guidelines are proposed for verification and validation (V&V) of nuclear power plant control system software: (a) use risk management to decide what and how much V&V is needed; (b) classify each software application using a scheme that reflects what type and how much V&V is needed; (c) maintain a set of reference documents with current information about each application; (d) use Program Inspection as the initial basic verification method; and (e) establish a deficiencies log for each software application. The following additional practices are strongly recommended: (a) use a computer-based configuration management system to track all aspects of development and maintenance; (b) establish reference baselines of the software, associated reference documents, and development tools at regular intervals during development; (c) use object-oriented design and programming to promote greater software reliability and reuse; (d) provide a copy of the software development environment as part of the package of deliverables; and (e) initiate an effort to use formal methods for preparation of Technical Specifications. The paper provides background information and reasons for the guidelines and recommendations. 3 figs., 3 tabs.

  1. CTBT integrated verification system evaluation model supplement

    SciTech Connect

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, ''top-level,'' modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
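
    The abstract does not describe IVSEM's integration algorithm; one textbook way to combine per-technology detection probabilities, under the simplifying assumption that the sensing technologies are independent, is sketched below. The probabilities are invented placeholders, not IVSEM outputs.

```python
# Toy fusion of per-technology detection probabilities (illustrative only;
# the real IVSEM model is more detailed). Assuming independent technologies,
# the network detects an event if any one technology does.
def combined_detection(p_by_tech):
    p_miss = 1.0
    for p in p_by_tech.values():
        p_miss *= (1.0 - p)
    return 1.0 - p_miss

probs = {"seismic": 0.90, "infrasound": 0.40,
         "radionuclide": 0.25, "hydroacoustic": 0.10}
print(f"P(detect) = {combined_detection(probs):.3f}")
```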

  2. Study on the interaction of the toxic food additive carmoisine with serum albumins: a microcalorimetric investigation.

    PubMed

    Basu, Anirban; Kumar, Gopinatha Suresh

    2014-05-30

    The interaction of the synthetic azo dye and food colorant carmoisine with human and bovine serum albumins was studied by microcalorimetric techniques. A complete thermodynamic profile of the interaction was obtained from isothermal titration calorimetry studies. The equilibrium constant of the complexation process was of the order of 10^6 M^-1 and the binding stoichiometry was found to be 1:1 with both serum albumins. The binding was driven by a negative standard molar enthalpy and a positive standard molar entropy contribution. The binding affinity was lower at higher salt concentrations in both cases, but was dominated by mostly non-electrostatic forces at all salt concentrations. The polyelectrolytic forces contributed only 5-8% of the total standard molar Gibbs energy change. The standard molar enthalpy change increased whereas the standard molar entropic contribution decreased with rising temperature, but the two compensated each other to keep the standard molar Gibbs energy change almost invariant. The negative standard molar heat capacity values suggested the involvement of a significant hydrophobic contribution in the complexation process. In addition, an enthalpy-entropy compensation phenomenon was observed in both systems. The thermal stability of the serum proteins was found to be remarkably enhanced on binding to carmoisine.
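
    The quantities reported above are connected by the standard thermodynamic identities presumably underlying the analysis (quoted here only in their textbook form):

```latex
\Delta G^{\circ} = -RT \ln K,
\qquad
\Delta G^{\circ} = \Delta H^{\circ} - T\,\Delta S^{\circ},
\qquad
\Delta C_p^{\circ} = \left(\frac{\partial \Delta H^{\circ}}{\partial T}\right)_{p}
```

    For orientation, K of order 10^6 M^-1 at 298 K corresponds to a standard molar Gibbs energy change of roughly -34 kJ/mol.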

  3. Are gastric hyperplastic polyps an additional manifestation in celiac disease?: Results from a retrospective study.

    PubMed

    Dore, Maria Pina; Pes, Giovanni Mario; Rocchi, Chiara; Loria, Maria Francesca; Soro, Sara; Bassotti, Gabrio

    2017-02-01

    Gastric polyps are frequently reported in patients undergoing upper endoscopic procedures. In this retrospective study, the association between hyperplastic polyps and celiac disease in Northern Sardinia was estimated. Age, gender, body mass index, medications taken in the 2 preceding months, including proton-pump inhibitors (PPIs) and H2 receptor blockers (anti-H2), Helicobacter pylori status, endoscopic findings, and histology from charts of patients undergoing esophago-gastro-duodenoscopy were reviewed. Polyps were classified as hyperplastic, fundic gland, inflammatory, and adenomatous. 3.7% (423/11,379) of patients had celiac disease. The prevalence of gastric polyps was 4.2% (3.8% among celiac vs 4.2% among nonceliac patients). Inflammatory polyps were the most common histotype (55.8% and 56.2%), followed by fundic gland polyps (31.4% and 43.7%), hyperplastic polyps (8.7% and 0%), and adenomas, in celiac and nonceliac patients, respectively. Fundic gland polyps were more common in PPI users (odds ratio: 4.06) than in nonusers (2.65, P = 0.001) among celiac and nonceliac patients. Age older than 50, female gender, year of esophago-gastro-duodenoscopy, and PPI use were associated with the presence of polyps, whereas active H. pylori infection was not. Gastric polyps were common in Sardinian patients undergoing esophago-gastro-duodenoscopy. However, the previously reported association between hyperplastic polyps and celiac disease was not confirmed in our study.

  4. [Chewing gum as an additional agent in maintaining oral hygiene versus smoking status--preliminary study].

    PubMed

    Nakonieczna-Rudnicka, Marta; Strycharz-Dudziak, Małgorzata; Bachanek, Teresa

    2012-01-01

    Nowadays chewing gum is widely used in all age groups, so complying with the proper duration and frequency of chewing is an important factor influencing the state of the masticatory system. The study involved 112 dental students of the Medical University of Lublin. Everyday use of chewing gum was declared by 47.32% of respondents. A chewing time of up to 10 minutes was stated by 23.08% of respondents, and of 11-20 minutes by 40.38%. Among the examined students, 17.3% smoked cigarettes. In the smokers group, 83.33% of those questioned chewed gum every day, while among non-smokers the figure was 43.37%. A chewing time shorter than 10 minutes was declared by 22.22% of smokers and 23.26% of non-smokers, while a chewing time of 11-20 minutes was declared by 27.78% and 44.35% of smokers and non-smokers, respectively. The results obtained indicate the need for further studies of the influence of nicotine on saliva parameters with respect to the development of diseases of hard tooth tissues.

  5. Shelf life and quality study of minced tilapia with Nori and Hijiki seaweeds as natural additives.

    PubMed

    Ribeiro, Ingridy Simone; Shirahigue, Ligianne Din; Ferraz de Arruda Sucasas, Lia; Anbe, Lika; da Cruz, Pedro Gomes; Gallo, Cláudio Rosa; Carpes, Solange Teresinha; Marques, Marcos José; Oetterer, Marília

    2014-01-01

    The extraction of mechanically separated meat has emerged as an attractive process. However, it increases the incorporation of oxygen and, consequently, of off-flavors due to rancidity. Thus, preservatives must be added. The objective of this study was to evaluate the shelf life of minced tilapia in which synthetic preservatives were replaced with Hijiki and Nori seaweed extracts. The application of the extracts had no effect on the chemical composition of the minced tilapia. The seaweed extracts had an inhibitory effect on total volatile base nitrogen. The minced tilapia complied with the microbiological standards set by Brazilian law. The panelists detected no differences in rancid aroma, and only minor differences were detected in the color of the products. It can be concluded that the minced tilapia with added seaweed extracts remained within quality standards during frozen storage.

  6. Additional results for 'Sequential design approaches for bioequivalence studies with crossover designs'.

    PubMed

    Montague, Timothy H; Potvin, Diane; Diliberti, Charles E; Hauck, Walter W; Parr, Alan F; Schuirmann, Donald J

    2012-01-01

    In 2008, this group published a paper on approaches for two-stage crossover bioequivalence (BE) studies that allowed for the reestimation of the second-stage sample size based on the variance estimated from the first-stage results. The sequential methods considered used an assumed GMR of 0.95 as part of the method for determining power and sample size. This note adds results for an assumed GMR = 0.90. Two of the methods recommended for GMR = 0.95 in the earlier paper have some unacceptable increases in Type I error rate when the GMR is changed to 0.90. If a sponsor wants to assume 0.90 for the GMR, Method D is recommended. Copyright © 2011 John Wiley & Sons, Ltd.

  7. Shelf Life and Quality Study of Minced Tilapia with Nori and Hijiki Seaweeds as Natural Additives

    PubMed Central

    Ribeiro, Ingridy Simone; Shirahigue, Ligianne Din; Ferraz de Arruda Sucasas, Lia; Anbe, Lika; da Cruz, Pedro Gomes; Gallo, Cláudio Rosa; Carpes, Solange Teresinha; Marques, Marcos José; Oetterer, Marília

    2014-01-01

    The extraction of mechanically separated meat has emerged as an attractive process. However, it increases the incorporation of oxygen and, consequently, of off-flavors due to rancidity. Thus, preservatives must be added. The objective of this study was to evaluate the shelf life of minced tilapia in which synthetic preservatives were replaced with Hijiki and Nori seaweed extracts. The application of the extracts had no effect on the chemical composition of the minced tilapia. The seaweed extracts had an inhibitory effect on total volatile base nitrogen. The minced tilapia complied with the microbiological standards set by Brazilian law. The panelists detected no differences in rancid aroma, and only minor differences were detected in the color of the products. It can be concluded that the minced tilapia with added seaweed extracts remained within quality standards during frozen storage. PMID:25478593

  8. Biological effect of food additive titanium dioxide nanoparticles on intestine: an in vitro study.

    PubMed

    Song, Zheng-Mei; Chen, Ni; Liu, Jia-Hui; Tang, Huan; Deng, Xiaoyong; Xi, Wen-Song; Han, Kai; Cao, Aoneng; Liu, Yuanfang; Wang, Haifang

    2015-10-01

    Titanium dioxide nanoparticles (TiO2 NPs) are widely found in food-related consumer products. Understanding the effect of TiO2 NPs on the intestinal barrier and absorption is essential and vital for the safety assessment of orally administered TiO2 NPs. In this study, the cytotoxicity and translocation of two native TiO2 NPs, and of these two TiO2 NPs pretreated with digestion simulation fluid or bovine serum albumin, were investigated in undifferentiated Caco-2 cells, differentiated Caco-2 cells, and the Caco-2 monolayer. TiO2 NPs at concentrations below 200 µg/ml did not induce any toxicity in differentiated cells or the Caco-2 monolayer after 24 h of exposure. However, TiO2 NPs pretreated with digestion simulation fluids inhibited the growth of undifferentiated Caco-2 cells at 200 µg/ml. Undifferentiated Caco-2 cells internalized native TiO2 NPs easily, but not the pretreated NPs, implying that the protein coating on the NPs impeded cellular uptake. Compared with undifferentiated cells, differentiated ones possessed a much lower uptake ability for these TiO2 NPs. Similarly, the traverse of TiO2 NPs through the Caco-2 monolayer was also negligible. Therefore, we infer that the possibility of TiO2 NPs traversing the intestine of animals or humans after oral intake is quite low. This study provides valuable information for the risk assessment of TiO2 NPs in food.

  9. MACCS2 development and verification efforts

    SciTech Connect

    Young, M.; Chanin, D.

    1997-03-01

    MACCS2 represents a major enhancement of the capabilities of its predecessor MACCS, the MELCOR Accident Consequence Code System. MACCS, released in 1987, was developed to estimate the potential impacts to the surrounding public of severe accidents at nuclear power plants. The principal phenomena considered in MACCS/MACCS2 are atmospheric transport and deposition under time-variant meteorology, short-term and long-term mitigative actions and exposure pathways, deterministic and stochastic health effects, and economic costs. MACCS2 was developed as a general-purpose analytical tool applicable to diverse reactor and nonreactor facilities. The MACCS2 package includes three primary enhancements: (1) a more flexible emergency response model, (2) an expanded library of radionuclides, and (3) a semidynamic food-chain model. In addition, errors that had been identified in MACCS version 1.5.11.1 were corrected, including an error that prevented the code from providing intermediate-phase results. MACCS2 version 1.10 beta test was released to the beta-test group in May 1995. In addition, the University of New Mexico (UNM) has completed an independent verification study of the code package. Since the beta-test release of MACCS2 version 1.10, a number of minor errors have been identified and corrected, and a number of enhancements have been added to the code package. The code enhancements added since the beta-test release of version 1.10 include: (1) an option to allow the user to input the {sigma}{sub y} and {sigma}{sub z} plume expansion parameters in a table-lookup form for incremental downwind distances, (2) an option to define different initial dimensions for up to four segments of a release, (3) an enhancement to the COMIDA2 food-chain model preprocessor to allow the user to supply externally calculated tables of tritium food-chain dose per unit deposition on farmland to support analyses of tritium releases, and (4) the capability to calculate direction-dependent doses.

  10. A digital process for additive manufacturing of occlusal splints: a clinical pilot study

    PubMed Central

    Salmi, Mika; Paloheimo, Kaija-Stiina; Tuomi, Jukka; Ingman, Tuula; Mäkitie, Antti

    2013-01-01

    The aim of this study was to develop and evaluate a digital process for manufacturing of occlusal splints. An alginate impression was taken from the upper and lower jaws of a patient with temporomandibular disorder owing to cross bite and wear of the teeth, and then digitized using a table laser scanner. The scanned model was repaired using the 3Data Expert software, and a splint was designed with the Viscam RP software. A splint was manufactured from a biocompatible liquid photopolymer by stereolithography. The system employed in the process was SLA 350. The splint was worn nightly for six months. The patient adapted to the splint well and found it comfortable to use. The splint relieved tension in the patient's bite muscles. No sign of tooth wear or significant splint wear was detected after six months of testing. Modern digital technology enables us to manufacture clinically functional occlusal splints, which might reduce costs, dental technician working time and chair-side time. Maximum-dimensional errors of approximately 1 mm were found at thin walls and sharp corners of the splint when compared with the digital model. PMID:23614943

  11. A Comparative Kirkwood-Buff Study of Aqueous Methanol Solutions Modeled by the CHARMM Additive and Drude Polarizable Force Fields

    PubMed Central

    Lin, Bin; He, Xibing; MacKerell, Alexander D.

    2013-01-01

    A comparative study on aqueous methanol solutions modeled by the CHARMM additive and Drude polarizable force fields was carried out by employing Kirkwood-Buff analysis. It was shown that both models reproduced the experimental Kirkwood-Buff integrals and excess coordination numbers adequately well over the entire concentration range. The Drude model showed significant improvement over the additive model in solution densities, partial molar volumes, excess molar volumes, concentration-dependent diffusion constants, and dielectric constants. However, the additive model performed somewhat better than the Drude model in reproducing the activity derivative, excess molar Gibbs energy and excess molar enthalpy of mixing. This is due to the additive achieving a better balance among solute-solute, solute-solvent, and solvent-solvent interactions, indicating the potential for improvements in the Drude polarizable alcohol model. PMID:23947568
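
    For reference, the Kirkwood-Buff integrals and excess coordination numbers mentioned above are conventionally defined as

```latex
G_{ij} \;=\; 4\pi \int_{0}^{\infty} \bigl[\, g_{ij}(r) - 1 \,\bigr]\, r^{2}\, \mathrm{d}r,
\qquad
N_{ij} \;=\; \rho_{j}\, G_{ij}
```

    where g_ij(r) is the pair radial distribution function and ρ_j the number density of species j (standard definitions; the paper's exact notation may differ).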

  12. Fusion strategies for boosting cancelable online signature verification

    NASA Astrophysics Data System (ADS)

    Muramatsu, Daigo; Inuma, Manabu; Shikata, Junji; Otsuka, Akira

    2010-04-01

    Cancelable approaches for biometric person authentication have been studied to protect enrolled biometric data, and several algorithms have been proposed. One drawback of cancelable approaches is that the performance is inferior to that of non-cancelable approaches. As one solution, we proposed a scheme to enhance the performance of a cancelable approach for online signature verification by combining scores calculated from two transformed datasets generated using two keys. Generally, the same verification algorithm is used for transformed data as for raw (non-transformed) data in cancelable approaches, and, in our previous work, a verification system developed for a non-transformed dataset was used to calculate the scores from transformed data. In this paper, we modify the verification system by using transformed data for training. Several experiments were performed by using public databases, and the experimental results show that the modification of the verification system improved the performances. Our cancelable system combines two scores to make a decision. Several fusion strategies are also considered, and the experimental results are reported here.
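
    A minimal sketch of the weighted-sum score fusion idea described above (the weights and threshold are invented; the paper's actual fusion strategies may differ):

```python
# Minimal score-level fusion for a two-key cancelable scheme
# (illustrative; fusion weight and threshold are invented placeholders).
def fuse_and_decide(score_key1, score_key2, w=0.5, threshold=0.6):
    """Weighted-sum fusion of two matcher scores, each assumed in [0, 1]."""
    fused = w * score_key1 + (1.0 - w) * score_key2
    return fused, fused >= threshold

fused, accept = fuse_and_decide(0.72, 0.55)
print(f"fused score {fused:.2f} -> {'accept' if accept else 'reject'}")
```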

  13. Verification of computational models of cardiac electro-physiology.

    PubMed

    Pathmanathan, Pras; Gray, Richard A

    2014-05-01

    For computational models of cardiac activity to be used in safety-critical clinical decision-making, thorough and rigorous testing of the accuracy of predictions is required. The field of 'verification, validation and uncertainty quantification' has been developed to evaluate the credibility of computational predictions. The first stage, verification, is the evaluation of how well computational software correctly solves the underlying mathematical equations. The aim of this paper is to introduce novel methods for verifying multi-cellular electro-physiological solvers, a crucial first stage for solvers to be used with confidence in clinical applications. We define 1D-3D model problems with exact solutions for each of the monodomain, bidomain, and bidomain-with-perfusing-bath formulations of cardiac electro-physiology, which allow for the first time the testing of cardiac solvers against exact errors on fully coupled problems in all dimensions. These problems are carefully constructed so that they can be easily run using a general solver and can be used to greatly increase confidence that an implementation is correct, which we illustrate by testing one major solver, 'Chaste', on the problems. We then perform case studies on calculation verification (also known as solution verification) for two specific applications. We conclude by making several recommendations regarding verification in cardiac modelling.
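
    For orientation, the monodomain formulation mentioned above is conventionally written as (standard notation; the paper's exact form may differ)

```latex
\chi \left( C_m \frac{\partial v}{\partial t} + I_{\mathrm{ion}}(v, \mathbf{s}) \right)
\;=\; \nabla \cdot \left( \boldsymbol{\sigma}\, \nabla v \right) + I_{\mathrm{stim}}
```

    where v is the transmembrane voltage, χ the surface-to-volume ratio, C_m the membrane capacitance, σ the conductivity tensor, s the cell-model state variables, and I_stim an applied stimulus current.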

  14. Blood pressure goal achievement with olmesartan medoxomil-based treatment: additional analysis of the OLMEBEST study

    PubMed Central

    Barrios, Vivencio; Escobar, Carlos; Calderon, Alberto; Böhm, Michael

    2009-01-01

    Aims Guidelines recommend that blood pressure (BP) in hypertensive patients should be <140 mmHg systolic (SBP) and <90 mmHg diastolic (DBP). This analysis assessed goal rate achievement in hypertensive patients receiving olmesartan-based treatment in the OLMEBEST study. Methods Patients with essential hypertension (DBP ≥ 90 mmHg and <110 mmHg) received open-label olmesartan medoxomil 20 mg/day (n = 2306). After 8 weeks, patients with DBP ≥ 90 mmHg (n = 627) were randomized to 4 weeks’ double-blind treatment with olmesartan 40 mg/day monotherapy or olmesartan 20 mg/day plus hydrochlorothiazide (HCTZ) 12.5 mg/day. For this analysis, the numbers and proportions of patients who achieved SBP < 140 mmHg and/or DBP < 90 mmHg at the end of the 4 weeks were calculated. Results In patients who achieved DBP normalization (<90 mmHg) at Week 8 (n = 1546) and continued open-label olmesartan 20 mg/day, 66.7% achieved SBP/DBP < 140/90 mmHg at Week 12. In patients who did not achieve DBP normalization at Week 8, 26.8% of those randomized to olmesartan 40 mg/day and 42.5% of those randomized to olmesartan 20 mg/day plus HCTZ 12.5 mg/day achieved SBP/DBP < 140/90 mmHg at Week 12. Conclusion Olmesartan 40 mg/day and olmesartan 20 mg/day plus HCTZ 12.5 mg/day allow substantial proportions of patients to achieve BP goals. PMID:19756164

  15. Additional erythrocytic and reticulocytic parameters helpful for diagnosis of hereditary spherocytosis: results of a multicentre study.

    PubMed

    Mullier, François; Lainey, Elodie; Fenneteau, Odile; Da Costa, Lydie; Schillinger, Françoise; Bailly, Nicolas; Cornet, Yvan; Chatelain, Christian; Dogne, Jean-Michel; Chatelain, Bernard

    2011-07-01

    Hereditary spherocytosis (HS) is characterised by weakened vertical linkages between the membrane skeleton and the red blood cell's lipid bilayer, leading to the release of microparticles. All the reference tests suffer from specific limitations. The aim of this study was to develop an easy-to-use diagnostic tool for the screening of hereditary spherocytosis based on routinely acquired haematological parameters such as the percentage of microcytes, the percentage of hypochromic cells, reticulocyte counts, and the percentage of immature reticulocytes. The levels of haemoglobin, mean cell volume, mean corpuscular haemoglobin concentration, reticulocytes (Ret), immature reticulocyte fraction (IRF), hypochromic erythrocytes (Hypo-He) and microcytic erythrocytes (MicroR) were determined on EDTA samples on Sysmex instruments from a cohort of 45 confirmed HS cases. The HS group was then compared with haemolytic disorders, microcytic anaemia, healthy individuals and routine samples (n = 1,488). HS is characterised by a high Ret count without an equally elevated IRF. All 45 HS cases had Ret >80,000/μl and Ret (10^9/L)/IRF (%) greater than 7.7 (rule 1). Trait and mild HS had a Ret/IRF ratio greater than 19. Moderate and severe HS had increased MicroR and MicroR/Hypo-He (rule 2). The combination of both rules gave a positive predictive value and a negative predictive value of 75% and 100%, respectively (n = 1,488), which is much greater than for single parameters or existing rules. This simple and fast diagnostic method could be used as an excellent screening tool for HS. It is also valid for mild HS, neonates and ABO incompatibilities, and overcomes the lack of sensitivity of electrophoresis in ankyrin deficiencies.
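
    The two screening rules quoted above can be expressed directly in code. A hedged sketch follows: the rule-1 cutoffs are taken from the abstract, but the abstract gives no numeric cutoffs for rule 2, so those thresholds are placeholders only.

```python
# Sketch of the two screening rules described above. Rule-1 cutoffs
# (Ret > 80,000/uL and Ret/IRF > 7.7) are quoted from the abstract; the
# rule-2 cutoffs for MicroR and MicroR/Hypo-He are NOT given there, so
# the values below are placeholders for illustration only.
def hs_screen(ret_per_ul, irf_percent, micro_r, hypo_he):
    ret_1e9_per_l = ret_per_ul / 1000.0          # convert /uL -> 10^9/L
    rule1 = ret_per_ul > 80_000 and (ret_1e9_per_l / irf_percent) > 7.7
    # Placeholder thresholds -- tune against local reference data.
    rule2 = micro_r > 3.0 and hypo_he > 0 and (micro_r / hypo_he) > 2.0
    return rule1 or rule2

print(hs_screen(ret_per_ul=250_000, irf_percent=8.0,
                micro_r=4.5, hypo_he=1.2))       # -> True (flag for testing)
```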

  16. Outside enclosure and additional enrichment for dairy goats – a preliminary study

    PubMed Central

    2012-01-01

    Background Dairy goats are commonly housed at a space allowance of 0.7 – 0.8 m2/goat in commercial Norwegian goat herds, which is very low compared to regulations and recommendations in other European countries. One easy and cheap way to increase space allowance is to allow the animals access to an outdoor area. The aim of this study was to investigate the effect of access to an outside enclosure and environmental enrichment for dairy goats kept in slatted floor pens with low space allowance on their activity pattern and social behaviour. Methods A group of 82 dairy goats on a commercial Norwegian dairy farm were kept inside during the winter period from October to April. In April the goats were given access to an outside enclosure for 8 hours per day. After having access to the enclosure for another two days, enrichment (branches) was provided, and after 19 days the enrichment was removed. The goats were observed by two trained observers for 5 hours per day on the two last days before they got access to the outside enclosure, the two days in the enclosure, the two first and the two last days with enrichment, and for the following two days without enrichment. Results When allowed access to the enclosure, the goats spent nearly 50% of their time outside; later, the time spent outside was reduced to less than 40% (P < 0.0001), but there was no clear effect of enrichment. All the goats appeared to make regular use of the enclosure. Time spent resting decreased from 59.2% to only 25.2% when the goats first got access to the enclosure, but then started to increase again (P < 0.0001). Initially, time spent exploring and chewing the branches was 20%, but this was reduced to around 12% in the last part of the ENRICH period (P < 0.0001). The number of aggressive interactions tended to increase when the goats were allowed access to the outdoor enclosure, whereas play behaviour was only observed in the outside enclosure (P < 0.05). Conclusions In conclusion

  17. Appendix: Conjectures concerning proof, design, and verification.

    SciTech Connect

    Wos, L.

    2000-05-31

    This article focuses on an esoteric but practical use of automated reasoning that may indeed be new to many, especially those concerned primarily with verification of both hardware and software. Specifically, featured are a discussion and some methodology for taking an existing design -- of a circuit, a chip, a program, or the like -- and refining and improving it in various ways. Although the methodology is general and does not require the use of a specific program, McCune's program OTTER does offer what is needed. OTTER has played and continues to play the key role in my research, and an interested person can gain access to this program in various ways, not the least of which is through the included CD-ROM in [3]. When success occurs, the result is a new design that may require fewer components, avoid the use of certain costly components, offer more reliability and ease of verification, and, perhaps most important, be more efficient in the contexts of speed and heat generation. Although the author has minimal experience in circuit design, circuit validation, program synthesis, program verification, and similar concerns, (at the encouragement of colleagues based on successes to be cited) he presents materials that might indeed be of substantial interest to manufacturers and programmers. He writes this article in part prompted by the recent activities of chip designers that include Intel and AMD, activities heavily emphasizing the proving of theorems. As for his research that appears to the author to be relevant, he has made an intense and most profitable study of finding proofs that are shorter [2,3], some that avoid the use of various types of term, some that are far less complex than previously known, and the like. Those results suggest to me a strong possible connection between more appealing proofs (in mathematics and in logic) and enhanced and improved design of both hardware and software. Here the author explores diverse conjectures that elucidate some of the

  18. Food additives

    MedlinePlus

    ... or natural. Natural food additives include: Herbs or spices to add flavor to foods Vinegar for pickling ... Certain colors improve the appearance of foods. Many spices, as well as natural and man-made flavors, ...

  19. Dosimetric verification of IMAT delivery with a conventional EPID system and a commercial portal dose image prediction tool

    SciTech Connect

    Iori, Mauro; Cagni, Elisabetta; Paiusco, Marta; Munro, Peter; Nahum, Alan E.

    2010-01-15

    Purpose: The electronic portal imaging device (EPID) is a system for checking the patient setup; as a result of its integration with the linear accelerator and software customized for dosimetry, it is increasingly used for verification of the delivery of fixed-field intensity-modulated radiation therapy (IMRT). In order to extend such an approach to intensity-modulated arc therapy (IMAT), the combined use of an EPID system and a portal dose image prediction (PDIP) tool has been investigated. Methods: The dosimetric behavior of an EPID system, mechanically reinforced to maintain its positional stability during the accelerator gantry rotation, has been studied to assess its ability to measure portal dose distributions for IMAT treatment beams. In addition, the PDIP tool of a commercial treatment planning system, commonly used for static IMRT dosimetry, has been validated for simulating the PDIs of IMAT treatment fields. The method has been applied to the delivery verification of 23 treatment fields that were measured in their dual mode of IMRT and IMAT modalities. Results: The EPID system has proved to be appropriate for measuring the PDIs of IMAT fields; additionally the PDIP tool was able to simulate these accurately. The results are quite similar to those obtained for static IMRT treatment verification, although it was necessary to investigate the dependence of the EPID signal and of the accelerator monitor chamber response on variable dose rate. Conclusions: Our initial tests indicate that the EPID system, together with the PDIP tool, is a suitable device for the verification of IMAT plan delivery; however, additional tests are necessary to confirm these results.

  20. Correction, improvement and model verification of CARE 3, version 3

    NASA Technical Reports Server (NTRS)

    Rose, D. M.; Manke, J. W.; Altschul, R. E.; Nelson, D. L.

    1987-01-01

    An independent verification of the CARE 3 mathematical model and computer code was conducted and reported in NASA Contractor Report 166096, Review and Verification of CARE 3 Mathematical Model and Code: Interim Report. The study uncovered some implementation errors that were corrected and are reported in this document. The corrected CARE 3 program is called version 4. The document 'Correction, Improvement, and Model Verification of CARE 3, Version 3' was written in April 1984. It is being published now as it has been determined to contain a more accurate representation of CARE 3 than the preceding document of April 1983. This edition supersedes NASA-CR-166122, entitled 'Correction and Improvement of CARE 3, Version 3,' April 1983.

  1. STUDY OF THE EFFECT OF CHLORINE ADDITION ON MERCURY OXIDATION BY SCR CATALYST UNDER SIMULATED SUBBITUMINOUS COAL FLUE GAS

    EPA Science Inventory

    An entrained flow reactor is used to study the effect of addition of chlorine-containing species on the oxidation of elemental mercury (Hgo)by a selective catalytic reduction (SCR) catalyst in simulated subbituminous coal combustion flue gas. The combustion flue gas was doped wit...

  2. Effect of Additives on Green Sand Molding Properties using Design of Experiments and Taguchi's Quality Loss Function - An Experimental Study

    NASA Astrophysics Data System (ADS)

    Desai, Bhagyashree; Mokashi, Pavani; Anand, R. L.; Burli, S. B.; Khandal, S. V.

    2016-09-01

    This experimental study investigates the effect of various additives on green sand molding properties, since a particular combination of additives can yield the desired sand properties. The input parameters (factors) selected were water and powder (fly ash, coconut shell and tamarind) at three levels. Experiments were planned using design of experiments (DOE). On the basis of these plans, experiments were conducted to understand the behavior of sand mould properties such as compression strength, shear strength, and permeability number with the various additives. From the experimental results it could be concluded that the factors have a significant effect on the sand properties, as the P-values were found to be less than 0.05 for all the cases studied. An optimization based on the quality loss function was also performed. The study revealed that the quality loss associated with the tamarind powder was lower than that of the other additives selected for the study. The optimization based on the quality loss function and the parametric analysis using ANOVA suggested that tamarind powder at 8 g per kg of molding sand and a moisture content of 7% yield better properties for obtaining sound castings.
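
    For reference, Taguchi's quadratic (nominal-is-best) quality loss function underlying such an analysis is

```latex
L(y) \;=\; k\,(y - m)^{2},
\qquad
\overline{L} \;=\; k\left[(\bar{y} - m)^{2} + s^{2}\right]
```

    where y is the measured property, m the target value, k a cost coefficient, and, averaged over a sample, ȳ and s² are the sample mean and variance (standard form; the paper's constants are not given in the abstract).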

  3. A Pilot Study to Examine the Effect of Additional Structured Outdoor Playtime on Preschoolers' Physical Activity Levels

    ERIC Educational Resources Information Center

    Alhassan, Sofiya; Nwaokelemeh, Ogechi; Lyden, Kate; Goldsby, TaShauna; Mendoza, Albert

    2013-01-01

    The impact of additional structured outdoor playtime on preschoolers' physical activity (PA) level is unclear. The purpose of this pilot study was to explore the effects of increasing structured outdoor playtime on preschoolers' PA levels. Eight full-day classrooms (n = 134 children) from two preschool programmes were randomised into a treatment…

  4. Imaging of prompt gamma rays emitted during delivery of clinical proton beams with a Compton camera: feasibility studies for range verification.

    PubMed

    Polf, Jerimy C; Avery, Stephen; Mackin, Dennis S; Beddar, Sam

    2015-09-21

    The purpose of this paper is to evaluate the ability of a prototype Compton camera (CC) to measure prompt gamma rays (PG) emitted during delivery of clinical proton pencil beams for prompt gamma imaging (PGI), as a means of providing in vivo verification of the delivered proton radiotherapy beams. A water phantom was irradiated with clinical 114 MeV and 150 MeV proton pencil beams. Up to 500 cGy of dose was delivered per irradiation using clinical beam currents. The prototype CC was placed 15 cm from the beam central axis, and PGs from 0.2 MeV up to 6.5 MeV were measured during irradiation. From the measured data, 2D images of the PG emission were reconstructed, and 1D profiles were extracted from the PG images and compared to measured depth dose curves of the delivered proton pencil beams. The CC was able to measure PG emission during delivery of both 114 MeV and 150 MeV proton beams at clinical beam currents. 2D images of the PG emission were reconstructed for single 150 MeV proton pencil beams as well as for a 5 × 5 cm mono-energetic layer of 114 MeV pencil beams. Shifts in the Bragg peak (BP) range were detectable on the 2D images. 1D profiles extracted from the PG images show that the distal falloff of the PG emission profile lined up well with the distal BP falloff. Shifts as small as 3 mm in the beam range could be detected from the 1D PG profiles with an accuracy of 1.5 mm or better. However, with the current CC prototype, a dose of 400 cGy was required to acquire adequate PG signal for 2D PG image reconstruction. It was possible to measure PG interactions with our prototype CC during delivery of proton pencil beams at clinical dose rates. Images of the PG emission could be reconstructed, and shifts in the BP range were detectable. Therefore, PGI with a CC for in vivo range verification during proton treatment delivery is feasible. However, improvements in the prototype CC detection efficiency and reconstruction algorithms are necessary.
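
    To illustrate how a range shift can be read off 1D PG profiles, the sketch below locates the depth at which each profile drops to 50% of its maximum on the distal side and differences the two positions. The sigmoid profiles and the 50% criterion are assumptions for illustration, not the paper's reconstruction method.

        import numpy as np

        def distal_falloff_position(depth_cm, profile, level=0.5):
            # Depth where the profile first drops through `level` of its
            # maximum on the distal side, with linear interpolation.
            i_max = int(np.argmax(profile))
            thresh = level * profile[i_max]
            for i in range(i_max, len(profile) - 1):
                if profile[i] >= thresh > profile[i + 1]:
                    f = (profile[i] - thresh) / (profile[i] - profile[i + 1])
                    return depth_cm[i] + f * (depth_cm[i + 1] - depth_cm[i])
            return depth_cm[-1]

        # Synthetic PG emission profiles (illustrative only): the second
        # profile is the first shifted 0.3 cm deeper.
        depth = np.linspace(0.0, 20.0, 401)
        ref = 1.0 / (1.0 + np.exp((depth - 15.0) / 0.3))
        shifted = 1.0 / (1.0 + np.exp((depth - 15.3) / 0.3))

        shift = distal_falloff_position(depth, shifted) - distal_falloff_position(depth, ref)
        print(f"estimated range shift: {shift:.2f} cm")  # ~0.30 cm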

  5. Wavelet Features Based Fingerprint Verification

    NASA Astrophysics Data System (ADS)

    Bagadi, Shweta U.; Thalange, Asha V.; Jain, Giridhar P.

    2010-11-01

    In this work, we present an automatic fingerprint identification system based on Level 3 features. Systems based only on minutiae features do not perform well for poor quality images. In practice, we often encounter extremely dry or wet fingerprint images with cuts, warts, etc. For such fingerprints, minutiae-based systems show poor performance in real-time authentication applications. To alleviate the problem of poor quality fingerprints, and to improve overall performance of the system, this paper proposes fingerprint verification based on wavelet statistical features and co-occurrence matrix features. The features include mean, standard deviation, energy, entropy, contrast, local homogeneity, cluster shade, cluster prominence, and information measure of correlation. In this method, matching can be done between the input image and the stored template, without exhaustive search, using the extracted features. The wavelet transform based approach performs better than the existing minutiae based method, takes less response time, and is hence suitable for on-line verification with high accuracy.
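
    As a minimal sketch of how such texture features can be extracted, the following uses PyWavelets for the wavelet statistics and scikit-image for the co-occurrence matrix features. The random patch stands in for an enhanced fingerprint image, and the distance/angle settings are assumptions; note that older scikit-image releases spell the functions greycomatrix/greycoprops.

        import numpy as np
        import pywt
        from skimage.feature import graycomatrix, graycoprops

        # Random 8-bit patch as a stand-in for an enhanced fingerprint image.
        rng = np.random.default_rng(0)
        patch = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)

        # Wavelet statistical features from a one-level 2D DWT.
        cA, (cH, cV, cD) = pywt.dwt2(patch.astype(float), "db1")
        wavelet_feats = {"mean_cA": cA.mean(), "std_cA": cA.std(),
                         "energy_cH": float((cH ** 2).sum())}

        # Co-occurrence matrix features at distance 1, angle 0 (assumed).
        glcm = graycomatrix(patch, distances=[1], angles=[0], levels=256,
                            symmetric=True, normed=True)
        glcm_feats = {p: float(graycoprops(glcm, p)[0, 0])
                      for p in ("contrast", "homogeneity", "energy", "correlation")}

        print({**wavelet_feats, **glcm_feats})

    Matching then reduces to comparing such feature vectors (e.g., by Euclidean distance) instead of exhaustively aligning minutiae.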

  6. NEXT Thruster Component Verification Testing

    NASA Technical Reports Server (NTRS)

    Pinero, Luis R.; Sovey, James S.

    2007-01-01

    Component testing is a critical part of thruster life validation activities under NASA's Evolutionary Xenon Thruster (NEXT) project. The high voltage propellant isolators were selected for design verification testing. Even though they are based on a heritage design, design changes were made because the isolators will be operated under different environmental conditions including temperature, voltage, and pressure. The life test of two NEXT isolators was therefore initiated and has accumulated more than 10,000 hr of operation. Measurements to date indicate only a negligibly small increase in leakage current. The cathode heaters were also selected for verification testing. The technology to fabricate these heaters, developed for the International Space Station plasma contactor hollow cathode assembly, was transferred to Aerojet for the fabrication of the NEXT prototype model ion thrusters. Testing the contractor-fabricated heaters is necessary to validate fabrication processes for high reliability heaters. This paper documents the status of the propellant isolator and cathode heater tests.

  7. Ontology Matching with Semantic Verification

    PubMed Central

    Jean-Mary, Yves R.; Shironoshita, E. Patrick; Kabuka, Mansur R.

    2009-01-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies. PMID:20186256

  8. Verification and transparency in future arms control

    SciTech Connect

    Pilat, J.F.

    1996-09-01

    Verification's importance has changed dramatically over time, although it always has been in the forefront of arms control. The goals and measures of verification and the criteria for success have changed with the times as well, reflecting such factors as the centrality of the prospective agreement to East-West relations during the Cold War, the state of relations between the United States and the Soviet Union, and the technologies available for monitoring. Verification's role may be declining in the post-Cold War period. The prospects for such a development will depend, first and foremost, on the high costs of traditional arms control, especially those associated with requirements for verification. Moreover, the growing interest in informal, or non-negotiated, arms control does not allow for verification provisions by the very nature of these arrangements. Multilateral agreements are also becoming more prominent and argue against highly effective verification measures, in part because of fears of promoting proliferation by opening sensitive facilities to inspectors from potential proliferant states. As a result, it is likely that transparency and confidence-building measures will achieve greater prominence, both as supplements to and substitutes for traditional verification. Such measures are not panaceas and do not offer all that we came to expect from verification during the Cold War. But they may be the best possible means to deal with current problems of arms reductions and restraints at acceptable levels of expenditure.

  9. Crowd-Sourced Program Verification

    DTIC Science & Technology

    2012-12-01

    In this investigation, the contractor constructed a prototype of a crowd-sourced verification system that takes as input a given program and produces as output a...

  10. Structural System Identification Technology Verification

    DTIC Science & Technology

    1981-11-01

    USAAVRADCOM-TR-81-D-28, Structural System Identification Technology Verification, N. Giansante, A. Berman, W. G. Flannelly, et al. Approved for public release; distribution unlimited. Prepared for the Applied Technology Laboratory, U.S. Army Research and Technology Laboratories (AVRADCOM), Fort Eustis, Va. 23604. The Applied Technology Laboratory has been involved in the development of the struc...

  11. Formal verification of AI software

    NASA Technical Reports Server (NTRS)

    Rushby, John; Whitehurst, R. Alan

    1989-01-01

    The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.

  12. U.S. verification method disputed

    NASA Astrophysics Data System (ADS)

    Maggs, William Ward

    Milo Nordyke, senior scientist at Lawrence Livermore National Laboratory in Livermore, Calif., testified October 6 at a Senate Foreign Affairs Committee hearing on Soviet test ban noncompliance and the recently concluded Joint Verification Experiment. He said that the government's method for on-site test monitoring is intrusive, expensive, and could limit some U.S. weapon design programs. In addition, Gregory Van der Vink of the congressional Office of Technology Assessment presented new evidence that White House charges that the Soviet Union has not complied with the current 150 kiloton test limit are probably without basis. Also testifying were Paul Robinson, U.S. negotiator for the Nuclear Testing Talks; Peter Sharfman, program manager for International Security and Commerce at OTA; and physicist David Hafemeister of California Polytechnic State University, San Luis Obispo.

  13. MOV reliability evaluation and periodic verification scheduling

    SciTech Connect

    Bunte, B.D.

    1996-12-01

    The purpose of this paper is to establish a periodic verification testing schedule based on the expected long-term reliability of gate or globe motor-operated valves (MOVs). The methodology in this position paper determines the nominal (best estimate) design margin for any MOV based on the best available information pertaining to the MOV's design requirements, design parameters, existing hardware design, and present setup. The uncertainty in this margin is then determined using statistical means. By comparing the nominal margin to the uncertainty, the reliability of the MOV is estimated. The methodology is appropriate for evaluating the reliability of MOVs in the GL 89-10 program. It may be used following periodic testing to evaluate and trend MOV performance and reliability. It may also be used to evaluate the impact of proposed modifications and maintenance activities such as packing adjustments. In addition, it may be used to assess the impact of new information of a generic nature which impacts safety-related MOVs.
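
    A minimal sketch of the margin-versus-uncertainty idea: if the design margin is treated as normally distributed about its nominal (best-estimate) value, reliability can be estimated as the probability that the margin remains positive. The normality assumption and the numbers are ours, for illustration only.

        from math import erf, sqrt

        def reliability_from_margin(nominal_margin, sigma):
            # P(margin > 0) for a margin that is normally distributed
            # around its nominal value with standard deviation sigma.
            z = nominal_margin / sigma
            return 0.5 * (1.0 + erf(z / sqrt(2.0)))

        # Hypothetical MOV: 20% nominal thrust margin, 8% (1-sigma) uncertainty.
        print(f"estimated reliability: {reliability_from_margin(0.20, 0.08):.4f}")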

  14. Measurements for liquid rocket engine performance code verification

    NASA Technical Reports Server (NTRS)

    Praharaj, Sarat C.; Palko, Richard L.

    1986-01-01

    The goal of the rocket engine performance code verification tests is to obtain the Isp with an accuracy of 0.25% or less. This needs to be done during the sequence of four related tests (two reactive and two hot gas simulation) to best utilize the loss separation technique recommended in this study. In addition to Isp, measurements of the input and output parameters for the codes are needed. This study has shown two things in regard to obtaining the Isp uncertainty within the 0.25% target. First, this target is generally not being realized at the present time, and second, the instrumentation and testing technology does exist to obtain this 0.25% uncertainty goal. However, achieving this goal will require carefully planned, designed, and conducted testing. In addition, the test-stand (or system) dynamics must be evaluated in the pre-test and post-test phases of the design of the experiment and the data analysis, respectively, always keeping in mind that a 0.25% overall uncertainty in Isp is targeted. A table gives the maximum allowable uncertainty required for obtaining Isp with 0.25% uncertainty, the currently-quoted instrument specification, and the present test uncertainty for each parameter. In general, it appears that measurement of the mass flow parameter within the required uncertainty may be the most difficult.
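
    Since Isp = F/(mdot * g0), independent relative measurement uncertainties combine in root-sum-square, which shows how tight each measurement must be to reach the 0.25% target. The sketch below is standard uncertainty propagation with illustrative error splits, not the study's loss-separation technique.

        from math import sqrt

        # Isp = F / (mdot * g0): for independent errors, relative
        # uncertainties combine in root-sum-square (RSS).
        def isp_rel_uncertainty(rel_u_thrust, rel_u_mdot):
            return sqrt(rel_u_thrust ** 2 + rel_u_mdot ** 2)

        # Equal split: each measurement must beat 0.25%/sqrt(2) ~ 0.177%.
        print(f"{isp_rel_uncertainty(0.00177, 0.00177) * 100:.3f}% total")
        # If mass flow is only good to 0.3%, the target is missed:
        print(f"{isp_rel_uncertainty(0.00100, 0.00300) * 100:.3f}% total")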

  15. Regression Verification Using Impact Summaries

    NASA Technical Reports Server (NTRS)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

    Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve the scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce the analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related program
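
    The final step above hands impact summaries to an off-the-shelf decision procedure. As a toy illustration of that step only (our simplification, not the authors' tool chain), the sketch below asks the Z3 SMT solver (pip install z3-solver) whether two expression-level "program versions" can ever disagree.

        from z3 import Int, Solver, sat

        x = Int("x")

        # Two syntactically different "program versions" as pure expressions.
        v1 = x * 2      # original
        v2 = x + x      # refactored

        s = Solver()
        s.add(v1 != v2)  # search for an input where the versions disagree
        if s.check() == sat:
            print("not equivalent; counterexample:", s.model())
        else:
            print("equivalent for all inputs")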

  16. Earthquake Forecasting, Validation and Verification

    NASA Astrophysics Data System (ADS)

    Rundle, J.; Holliday, J.; Turcotte, D.; Donnellan, A.; Tiampo, K.; Klein, B.

    2009-05-01

    Techniques for earthquake forecasting are in development using both seismicity data-mining methods and numerical simulations. The former rely on the development of methods to recognize patterns in data, while the latter rely on the use of dynamical models that attempt to faithfully replicate the actual fault systems. Testing such forecasts is necessary not only to determine forecast quality, but also to improve forecasts. A large number of techniques to validate and verify forecasts have been developed for weather and financial applications. Many of these have been elaborated in public locations, including, for example, the URL as listed below. Typically, the goal is to test for forecast resolution, reliability, and sharpness. A good forecast is characterized by consistency, quality, and value. Most, if not all, of these forecast verification procedures can be readily applied to earthquake forecasts as well. In this talk, we discuss both methods of forecasting, as well as validation and verification using a number of these standard methods. We show how these test methods might be useful both for fault-based forecasting, a group of forecast methods that includes the WGCEP and simulator-based renewal models, and for grid-based forecasting, which includes the Relative Intensity, Pattern Informatics, and smoothed seismicity methods. We find that applying these standard methods of forecast verification is straightforward. Judgments about the quality of a given forecast method can often depend on the test applied, as well as on the preconceptions and biases of the persons conducting the tests.
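
    One of the standard probabilistic scores alluded to here is the Brier score, whose decomposition yields the reliability and resolution terms mentioned. A minimal sketch with made-up event forecasts:

        # Brier score for probabilistic event forecasts:
        # BS = mean((p_i - o_i)^2); lower is better, 0 is perfect.
        def brier_score(probs, outcomes):
            return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

        # Illustrative forecast probabilities and observed outcomes (1 = event).
        probs = [0.9, 0.2, 0.7, 0.1, 0.6]
        outcomes = [1, 0, 1, 0, 0]
        print(f"Brier score: {brier_score(probs, outcomes):.3f}")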

  17. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-06-01

    Many papers have dealt with the political difficulties and ramifications of deep nuclear arms reductions and the issues of “Going to Zero”. Political issues include extended deterrence, conventional weapons, ballistic missile defense, and regional and geo-political security issues. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking beyond New START, the next step will likely include warhead limits in the neighborhood of 1000. Further reductions will include stepping stones at 1000 warheads, hundreds of warheads, and then tens of warheads before final elimination of the last few remaining warheads and weapons could be considered. This paper will focus on these three threshold reduction levels: 1000 warheads, hundreds of warheads, and tens of warheads. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain-of-custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national laboratory complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  18. Verification and validation for induction heating

    SciTech Connect

    Lam, Kin; Tippetts, Trevor B; Allen, David W

    2008-01-01

    Truchas is a software package being developed at LANL within the Telluride project for predicting the complex physical processes in metal alloy casting. The software was designed to be massively parallel, multi-material, multi-physics, and to run on 3D, fully unstructured meshes. This work describes a Verification and Validation assessment of Truchas for simulating the induction heating phase of a casting process. We used existing data from a simple experiment involving the induction heating of a graphite cylinder, as graphite is a common material used for mold assemblies. Because we do not have complete knowledge of all the conditions and properties in this experiment (as is the case in many other experiments), we performed a parameter sensitivity study, modeled the uncertainties of the most sensitive parameters, and quantified how these uncertainties propagate to the Truchas output response. A verification analysis produced estimates of the numerical error of the Truchas solution to our computational model. The outputs from Truchas runs with randomly sampled parameter values were used for the validation study.
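
    A minimal sketch of the sensitivity/uncertainty-propagation step: sample the uncertain inputs and push them through the model to obtain the spread of the output response. The closed-form surrogate below merely stands in for a Truchas run, and the parameter distributions are hypothetical.

        import random

        def surrogate_peak_temperature(conductivity, emissivity):
            # Stand-in for a full induction-heating simulation run
            # (hypothetical linear response surface, illustration only).
            return 1000.0 - 1.5 * conductivity - 120.0 * emissivity

        random.seed(1)
        samples = []
        for _ in range(1000):
            k = random.gauss(95.0, 5.0)   # graphite conductivity, W/(m K), assumed
            e = random.gauss(0.85, 0.05)  # surface emissivity, assumed
            samples.append(surrogate_peak_temperature(k, e))

        mean = sum(samples) / len(samples)
        std = (sum((t - mean) ** 2 for t in samples) / (len(samples) - 1)) ** 0.5
        print(f"peak temperature: {mean:.1f} +/- {std:.1f} (1-sigma)")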

  19. Multi-spectroscopic and molecular modeling studies of bovine serum albumin interaction with sodium acetate food additive.

    PubMed

    Mohammadzadeh-Aghdash, Hossein; Ezzati Nazhad Dolatabadi, Jafar; Dehghan, Parvin; Panahi-Azar, Vahid; Barzegar, Abolfazl

    2017-08-01

    Sodium acetate (SA) has been used as a highly effective protectant in the food industry, and the possible effect of this additive on binding to albumin should be taken into consideration. Therefore, for the first time, the mechanism of SA interaction with bovine serum albumin (BSA) has been investigated by multi-spectroscopic and molecular modeling methods under physiological conditions. Stern-Volmer fluorescence quenching analysis showed an increase in the fluorescence intensity of BSA upon increasing the amounts of SA. The high affinity of SA to BSA was demonstrated by a binding constant value (1.09×10³ at 310 K). The thermodynamic parameters indicated that hydrophobic binding plays a main role in the binding of SA to albumin. Furthermore, the results of UV-vis spectra confirmed the interaction of this additive with BSA. In addition, the molecular modeling study demonstrated that the A binding sites of BSA play the main role in the interaction with acetate.
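
    For reference, the standard Stern-Volmer treatment fits F0/F = 1 + Ksv[Q] against quencher concentration; the sketch below performs the linear fit on synthetic intensities. The numbers are made up and do not reproduce the study's reported constant of about 1.09×10³ at 310 K.

        import numpy as np

        # Stern-Volmer relation: F0/F = 1 + Ksv * [Q]; a linear fit of
        # F0/F against [Q] yields the quenching constant Ksv.
        conc = np.array([0.0, 0.5e-3, 1.0e-3, 1.5e-3, 2.0e-3])  # [SA], mol/L
        F = np.array([100.0, 94.8, 90.1, 85.9, 82.1])           # synthetic intensities

        slope, intercept = np.polyfit(conc, F[0] / F, 1)
        print(f"Ksv ~ {slope:.3g} L/mol (intercept {intercept:.3f})")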

  20. Effectiveness of a pressurized stormwater filtration system in Green Bay, Wisconsin: a study for the environmental technology verification program of the U.S. Environmental Protection Agency

    USGS Publications Warehouse

    Horwatich, J.A.; Corsi, Steven R.; Bannerman, Roger T.

    2004-01-01

    A pressurized stormwater filtration system was installed in 1998 as a stormwater-treatment practice to treat runoff from a hospital rooftop and parking lot in Green Bay, Wisconsin. This type of filtration system has been installed in Florida citrus groves and sewage treatment plants around the United States; however, this installation is the first of its kind to be used to treat urban runoff and the first to be tested in Wisconsin. The U.S. Geological Survey (USGS) monitored the system between November 2000 and September 2002 to evaluate it as part of the U.S. Environmental Protection Agency's Environmental Technology Verification Program. Fifteen runoff events were monitored for flow and water quality at the inlet and outlet of the system, and comparison of the event mean concentrations and constituent loads was used to evaluate its effectiveness. Loads were decreased for all particulate-associated constituents monitored, including suspended solids (83 percent), suspended sediment (81 percent), total Kjeldahl nitrogen (26 percent), total phosphorus (54 percent), and total recoverable zinc (62 percent). Total dissolved solids, dissolved phosphorus, and nitrate plus nitrite loads remained similar or increased through the system. The increase in some constituents was most likely due to a ground-water contribution between runoff events. Sand/silt split analysis resulted in a median silt content of 78 percent at the inlet, 87 percent at the outlet, and 3 percent at the flow splitter.

  1. Evaluation and Verification of Decadal Predictions using the MiKlip Central Evaluation System - a Case Study using the MiKlip Prototype Model Data

    NASA Astrophysics Data System (ADS)

    Illing, Sebastian; Schuster, Mareike; Kadow, Christopher; Kröner, Igor; Richling, Andy; Grieger, Jens; Kruschke, Tim; Lang, Benjamin; Redl, Robert; Schartner, Thomas; Cubasch, Ulrich

    2016-04-01

    MiKlip is a project for medium-term climate prediction, funded by the German Federal Ministry of Education and Research (BMBF), that aims to create a model system able to provide reliable decadal climate forecasts. During the first project phase of MiKlip, the sub-project INTEGRATION, located at Freie Universität Berlin, developed a framework for scientific infrastructures (FREVA). More information about FREVA can be found in EGU2016-13060. An instance of this framework is used as the Central Evaluation System (CES) during the MiKlip project. Throughout the first project phase, various sub-projects developed over 25 analysis tools - so-called plugins - for the CES. The main focus of these plugins is on the evaluation and verification of decadal climate prediction data, but most plugins are not limited to this scope and target a wide range of scientific questions. They range from preprocessing tools like the "LeadtimeSelector", which creates lead-time dependent time series from decadal hindcast sets, through tracking tools like the "Zykpak" plugin, which can objectively locate and track mid-latitude cyclones, to plugins like "MurCSS" or "SPECS", which calculate deterministic and probabilistic skill metrics. We also integrated some analyses from the Model Evaluation Tools (MET) developed at NCAR. We will show the theoretical background, technical implementation strategies, and some interesting results of the evaluation of the MiKlip Prototype decadal prediction system for a selected set of these tools.

  2. A mechanistic study of manganese(iii) acetate-mediated phosphonyl group additions to [60]- and [70]-fullerenes: the oxidative-ion-transfer mechanism vs. free radical addition.

    PubMed

    Tumanskii, Boris L; Sabirov, Denis S; Lyakhovetsky, Yury I

    2016-11-14

    The phosphonylation of C60 with HP(O)(OAlk)2 and Mn(OAc)3·2H2O has been considered to occur via a free radical (FR) path involving intermediate radicals ˙P(O)(OAlk)2. The present study provides evidence in support of another mechanism for these reactions, oxidative ion transfer (OIT). The mechanism involves the exchange of an acetate group in Mn(OAc)3 for the phosphonate group and oxidation of C60 by the Mn(OAc)2P(O)(OAlk)2 formed to give a pair (C60˙(+), Mn(OAc)2P(O)(OAlk)2˙(-)), followed by transfer of the phosphonate anion to give the monophosphonylfullerenyl radical, which undergoes reversible dimerization. The polyaddition occurs analogously. Moreover, the compounds Mn(OAc)2P(O)(OAlk)2 (Alk = Et and i-Pr) obtained constitute novel reagents for the phosphonylation of fullerenes working by the OIT mechanism. The reactions of C60 in benzene with equimolar amounts of Mn(OAc)2P(O)(OPr-i)2 or Hg[P(O)(OPr-i)2]2, the latter known to work by the FR mechanism since it produces the radical ˙P(O)(OPr-i)2 under UV irradiation, furnished the same radical ˙C60P(O)(OPr-i)2. However, at a 20-fold molar excess of the reagent toward C60, a single derivative C60[P(O)(OPr-i)2]4 was obtained in the former case and a mixture of derivatives bearing between two and eight phosphonyls in the latter. With C70, the change of mechanism produced a change in regioselectivity: 5 and 3 isomers of ˙C70P(O)(OPr-i)2 were obtained, respectively. DFT calculations provided the hyperfine coupling (hfc) constants of the isomers and explained the regioselectivity change.

  3. Unravelling the impact of hydrocarbon structure on the fumarate addition mechanism--a gas-phase ab initio study.

    PubMed

    Bharadwaj, Vivek S; Vyas, Shubham; Villano, Stephanie M; Maupin, C Mark; Dean, Anthony M

    2015-02-14

    The fumarate addition reaction mechanism is central to the anaerobic biodegradation pathway of various hydrocarbons, both aromatic (e.g., toluene, ethyl benzene) and aliphatic (e.g., n-hexane, dodecane). Succinate synthase enzymes, which belong to the glycyl radical enzyme family, are the main facilitators of these biochemical reactions. The overall catalytic mechanism that converts hydrocarbons to a succinate molecule involves three steps: (1) initial H-abstraction from the hydrocarbon by the radical enzyme, (2) addition of the resulting hydrocarbon radical to fumarate, and (3) hydrogen abstraction by the addition product to regenerate the radical enzyme. Since the biodegradation of hydrocarbon fuels via the fumarate addition mechanism is linked to bio-corrosion, an improved understanding of this reaction is imperative to our efforts to predict the susceptibility of proposed alternative fuels to biodegradation. An improved understanding of the fuel biodegradation process also has the potential to benefit bioremediation. In this study, we consider model aromatic (toluene) and aliphatic (butane) compounds to evaluate the impact of hydrocarbon structure on the energetics and kinetics of the fumarate addition mechanism by means of high-level ab initio gas-phase calculations. We predict that the rate of toluene degradation is ∼100 times faster than that of butane at 298 K, and that the first abstraction step is kinetically significant for both hydrocarbons, which is consistent with deuterium isotope effect studies on toluene degradation. The detailed computations also show that the predicted stereochemical preference of the succinate products for both toluene and butane is due to the differences in the radical addition rate constants for the various isomers. The computational and kinetic modeling work presented here demonstrates the importance of considering pre-reaction and product complexes in order to accurately treat gas-phase systems that involve intra and inter
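
    The reported ~100-fold rate difference at 298 K can be translated into an effective barrier difference via k ∝ exp(-Ea/RT), giving ΔEa = RT·ln(100) ≈ 11.4 kJ/mol if prefactor differences are ignored (a back-of-the-envelope check, not part of the paper's analysis):

        from math import log

        R = 8.314      # gas constant, J/(mol K)
        T = 298.0      # temperature, K
        ratio = 100.0  # toluene-to-butane rate ratio from the study

        # k ~ exp(-Ea/RT)  =>  delta_Ea = R*T*ln(k1/k2)
        delta_Ea = R * T * log(ratio)
        print(f"effective barrier difference ~ {delta_Ea / 1000:.1f} kJ/mol")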

  4. Enantioselective conjugate addition of nitro compounds to α,β-unsaturated ketones: an experimental and computational study.

    PubMed

    Manzano, Rubén; Andrés, José M; Álvarez, Rosana; Muruzábal, María D; de Lera, Ángel R; Pedrosa, Rafael

    2011-05-16

    A series of chiral thioureas derived from easily available diamines, prepared from α-amino acids, have been tested as catalysts in the enantioselective Michael additions of nitroalkanes to α,β-unsaturated ketones. The best results are obtained with the bifunctional catalyst prepared from L-valine. This thiourea promotes the reaction with high enantioselectivities and chemical yields for aryl/vinyl ketones, but the enantiomeric ratio for alkyl/vinyl derivatives is very modest. The addition of substituted nitromethanes led to the corresponding adducts with excellent enantioselectivity but very poor diastereoselectivity. Evidence for the isomerization of the addition products has been obtained from the reaction of chalcone with [D(3)]nitromethane, which shows that the final addition products epimerize under the reaction conditions. The epimerization explains the low diastereoselectivity observed in the formation of adducts with two adjacent tertiary stereocenters. Density functional studies of the transition structures corresponding to two alternative activation modes of the nitroalkanes and α,β-unsaturated ketones by the bifunctional organocatalyst have been carried out at the B3LYP/3-21G* level. The computations are consistent with a reaction model involving the Michael addition of the thiourea-activated nitronate to the ketone activated by the protonated amine of the organocatalyst. The enantioselectivities predicted by the computations are consistent with the experimental values obtained for aryl- and alkyl-substituted α,β-unsaturated ketones.

  5. A 3D QSAR study of betulinic acid derivatives as anti-tumor agents using topomer CoMFA: model building studies and experimental verification.

    PubMed

    Ding, Weimin; Sun, Miao; Luo, Shaman; Xu, Tao; Cao, Yibo; Yan, Xiufeng; Wang, Yang

    2013-08-22

    Betulinic acid (BA) is a natural product that exerts its cytotoxicity against various malignant carcinomas without side effects by triggering the mitochondrial pathway to apoptosis. Betulin (BE), the 28-hydroxyl analog of BA, is present in large amounts (up to 30% dry weight) in the outer bark of birch trees and shares the same pentacyclic triterpenoid core as BA, yet exhibits no significant cytotoxicity. In the present study, topomer CoMFA analyses were performed on 37 BA and BE derivatives and their in vitro anti-cancer activities (reported as IC₅₀ values) against HT29 human colon cancer cells. All derivatives share a common pentacyclic triterpenoid core, and the molecules were split into three pieces by cutting at the C-3 and C-28 sites with a consideration toward structural diversity. The analysis gave a leave-one-out cross-validation q² value of 0.722 and a non-cross-validated r² value of 0.974, which suggested that the model has good predictive ability (q² > 0.2). The contour maps illustrated that bulky and electron-donating groups would be favorable for activity at the C-28 site, and a moderately bulky and electron-withdrawing group near the C-3 site would improve this activity. BE derivatives were designed and synthesized according to the modeling result, whereby bulky electronegative groups (maleyl, phthalyl, and hexahydrophthalyl groups) were directly introduced at the C-28 position of BE. The in vitro cytotoxicity values of the given analogs against HT29 cells were consistent with the predicted values, showing that the present topomer CoMFA model is successful and could potentially guide the synthesis of new betulinic acid derivatives with high anti-cancer activity. The IC₅₀ values of these three new compounds were also assayed in five other tumor cell lines. 28-O-hexahydrophthalyl BE exhibited the greatest anti-cancer activities, and its IC₅₀ values were lower than those of BA in all cell lines excluding DU145 cells.
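
    For readers unfamiliar with the q² statistic: it is the leave-one-out cross-validated analogue of r², q² = 1 - PRESS/TSS. The sketch below computes it for a generic least-squares model on synthetic data; a real topomer CoMFA model would use PLS on 3D field descriptors, so only the validation arithmetic carries over.

        import numpy as np

        def loo_q2(X, y):
            # Leave-one-out q^2 = 1 - PRESS / TSS for a least-squares
            # linear model (generic stand-in for a CoMFA PLS model).
            n = len(y)
            press = 0.0
            for i in range(n):
                mask = np.arange(n) != i
                coef, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
                press += (y[i] - X[i] @ coef) ** 2
            tss = np.sum((y - y.mean()) ** 2)
            return 1.0 - press / tss

        rng = np.random.default_rng(0)
        X = np.column_stack([np.ones(20), rng.normal(size=(20, 2))])
        y = X @ np.array([1.0, 0.8, -0.5]) + rng.normal(scale=0.1, size=20)
        print(f"q2 = {loo_q2(X, y):.3f}")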

  6. A study of internal structure in components made by an additive manufacturing process using 3D X-ray tomography

    SciTech Connect

    Raguvarun, K.; Balasubramaniam, Krishnan; Rajagopal, Prabhu; Palanisamy, Suresh; Nagarajah, Romesh; Kapoor, Ajay; Hoye, Nicholas; Curiri, Dominic

    2015-03-31

    Additive manufacturing methods are gaining increasing popularity for rapidly and efficiently manufacturing parts and components in the industrial context, as well as for domestic applications. However, except when used for prototyping or rapid visualization of components, industries are concerned with the load carrying capacity and strength achievable by additive manufactured parts. In this paper, the wire-arc additive manufacturing (AM) process based on gas tungsten arc welding (GTAW) has been examined for the internal structure and constitution of components generated by the process. High-resolution 3D X-ray tomography is used to gain cut-views through wedge-shaped parts created using this GTAW additive manufacturing process with titanium alloy materials. In this work, two different control conditions for the GTAW process are considered. The studies reveal clusters of porosities, located in periodic spatial intervals along the sample cross-section. Such internal defects can have a detrimental effect on the strength of the resulting AM components, as shown in destructive testing studies. Closer examination of this phenomenon shows that defect clusters are preferentially located at GTAW traversal path intervals. These results highlight the strong need for enhanced control of process parameters in ensuring components with minimal defects and higher strength.

  7. PERFORMANCE VERIFICATION OF ANIMAL WATER TREATMENT TECHNOLOGIES THROUGH EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The U.S. Environmental Protection Agency created the Environmental Technology Verification Program (ETV) to further environmental protection by accelerating the commercialization of new and innovative technology through independent performance verification and dissemination of in...

  8. PERFORMANCE VERIFICATION OF STORMWATER TREATMENT DEVICES UNDER EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program was created to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program's goal is to further environmental protection by a...

  9. Design verification of SIFT

    NASA Technical Reports Server (NTRS)

    Moser, Louise; Melliar-Smith, Michael; Schwartz, Richard

    1987-01-01

    A SIFT reliable aircraft control computer system, designed to meet the ultrahigh reliability required for safety critical flight control applications by use of processor replications and voting, was constructed for SRI, and delivered to NASA Langley for evaluation in the AIRLAB. To increase confidence in the reliability projections for SIFT, produced by a Markov reliability model, SRI constructed a formal specification, defining the meaning of reliability in the context of flight control. A further series of specifications defined, in increasing detail, the design of SIFT down to pre- and post-conditions on Pascal code procedures. Mechanically checked mathematical proofs were constructed to demonstrate that the more detailed design specifications for SIFT do indeed imply the formal reliability requirement. An additional specification defined some of the assumptions made about SIFT by the Markov model, and further proofs were constructed to show that these assumptions, as expressed by that specification, did indeed follow from the more detailed design specifications for SIFT. This report provides an outline of the methodology used for this hierarchical specification and proof, and describes the various specifications and proofs performed.

  10. INNOVATIVE TECHNOLOGY VERIFICATION REPORT " ...

    EPA Pesticide Factsheets

    The Synchronous Scanning Luminoscope (Luminoscope) developed by the Oak Ridge National Laboratory in collaboration with Environmental Systems Corporation (ESC) was demonstrated under the U.S. Environmental Protection Agency Superfund Innovative Technology Evaluation Program in June 2000 at the Navy Base Ventura County site in Port Hueneme, California. The purpose of the demonstration was to collect reliable performance and cost data for the Luminoscope and six other field measurement devices for total petroleum hydrocarbons (TPH) in soil. In addition to assessing ease of device operation, the key objectives of the demonstration included determining the (1) method detection limit, (2) accuracy and precision, (3) effects of interferents and soil moisture content on TPH measurement, (4) sample throughput, and (5) TPH measurement costs for each device. The demonstration involved analysis of both performance evaluation samples and environmental samples collected in five areas contaminated with gasoline, diesel, lubricating oil, or other petroleum products. The performance and cost results for a given field measurement device were compared to those for an off-site laboratory reference method,

  11. INNOVATIVE TECHNOLOGY VERIFICATION REPORT " ...

    EPA Pesticide Factsheets

    The EnSys Petro Test System developed by Strategic Diagnostics Inc. (SDI), was demonstrated under the U.S. Environmental Protection Agency Superfund Innovative Technology Evaluation Program in June 2000 at the Navy Base Ventura County site in Port Hueneme, California. The purpose of the demonstration was to collect reliable performance and cost data for the EnSys Petro Test System and six other field measurement devices for total petroleum hydrocarbons (TPH) in soil. In addition to assessing ease of device operation, the key objectives of the demonstration included determining the (1) method detection limit, (2) accuracy and precision, (3) effects of interferents and soil moisture content on TPH measurement, (4) sample throughput, and (5) TPH measurement costs for each device. The demonstration involved analysis of both performance evaluation samples and environmental samples collected in four areas contaminated with gasoline, diesel, or other petroleum products. The performance and cost results for a given field measurement device were compared to those for an off-site laboratory reference method,

  12. INNOVATIVE TECHNOLOGY VERIFICATION REPORT " ...

    EPA Pesticide Factsheets

    The RemediAid™ Total Petroleum Hydrocarbon Starter Kit (RemediAid™ kit) developed by CHEMetrics, Inc. (CHEMetrics), and AZUR Environmental Ltd was demonstrated under the U.S. Environmental Protection Agency Superfund Innovative Technology Evaluation Program in June 2000 at the Navy Base Ventura County site in Port Hueneme, California. The purpose of the demonstration was to collect reliable performance and cost data for the RemediAid™ kit and six other field measurement devices for total petroleum hydrocarbons (TPH) in soil. In addition to assessing ease of device operation, the key objectives of the demonstration included determining the (1) method detection limit, (2) accuracy and precision, (3) effects of interferents and soil moisture content on TPH measurement, (4) sample throughput, and (5) TPH measurement costs for each device. The demonstration involved analysis of both performance evaluation samples and environmental samples collected in five areas contaminated with gasoline, diesel, lubricating oil, or other petroleum products. The performance and cost results for a given field measurement device were compared to those for an off-site laboratory reference method,

  13. INNOVATIVE TECHNOLOGY VERIFICATION REPORT " ...

    EPA Pesticide Factsheets

    The Infracal TOG/TPH Analyzer developed by Wilks Enterprise, Inc. (Wilks), was demonstrated under the U.S. Environmental Protection Agency Superfund Innovative Technology Evaluation Program in June 2000 at the Navy Base Ventura County site in Port Hueneme, California. The purpose of the demonstration was to collect reliable performance and cost data for the Infracal TOG/TPH Analyzer and six other field measurement devices for total petroleum hydrocarbons (TPH) in soil. In addition to assessing ease of device operation, the key objectives of the demonstration included determining the (1) method detection limit, (2) accuracy and precision, (3) effects of interferents and soil moisture content on TPH measurement, (4) sample throughput, and (5) TPH measurement costs for each device. The demonstration involved analysis of both performance evaluation samples and environmental samples collected in five areas contaminated with gasoline, diesel, lubricating oil, or other petroleum products. The performance and cost results for a given field measurement device were compared to those for an off-site laboratory reference method,

  14. INNOVATIVE TECHNOLOGY VERIFICATION REPORT " ...

    EPA Pesticide Factsheets

    The siteLAB Analytical Test Kit UVF-3100A (UVF-3100A) developed by siteLAB Corporation (siteLAB) was demonstrated under the U.S. Environmental Protection Agency Superfund Innovative Technology Evaluation Program in June 2000 at the Navy Base Ventura County site in Port Hueneme, California. The purpose of the demonstration was to collect reliable performance and cost data for the UVF-3100A and six other field measurement devices for total petroleum hydrocarbons (TPH) in soil. In addition to assessing ease of device operation, the key objectives of the demonstration included determining the (1) method detection limit, (2) accuracy and precision, (3) effects of interferents and soil moisture content on TPH measurement, (4) sample throughput, and (5) TPH measurement costs for each device. The demonstration involved analysis of both performance evaluation samples and environmental samples collected in five areas contaminated with gasoline, diesel, lubricating oil, or other petroleum products. The performance and cost results for a given field measurement device were compared to those for an off-site laboratory reference method,

  15. INNOVATIVE TECHNOLOGY VERIFICATION REPORT " ...

    EPA Pesticide Factsheets

    The PetroFLAG™ System developed by Dexsil Corporation (Dexsil) was demonstrated under the U.S. Environmental Protection Agency Superfund Innovative Technology Evaluation Program in June 2000 at the Navy Base Ventura County site in Port Hueneme, California. The purpose of the demonstration was to collect reliable performance and cost data for the PetroFLAG™ System and six other field measurement devices for total petroleum hydrocarbons (TPH) in soil. In addition to assessing ease of device operation, the key objectives of the demonstration included determining the (1) method detection limit, (2) accuracy and precision, (3) effects of interferents and soil moisture content on TPH measurement, (4) sample throughput, and (5) TPH measurement costs for each device. The demonstration involved analysis of both performance evaluation (PE) samples and environmental samples collected in four areas contaminated with gasoline, diesel, or other petroleum products. The performance and cost results for a given field measurement device were compared to those for an off-site laboratory reference method,

  16. INNOVATIVE TECHNOLOGY VERIFICATION REPORT " ...

    EPA Pesticide Factsheets

    The OCMA-350 Oil Content Analyzer (OCMA-350) developed by Horiba Instruments Incorporated (Horiba) was demonstrated under the U.S. Environmental Protection Agency Superfund Innovative Technology Evaluation Program in June 2000 at the Navy Base Ventura County site in Port Hueneme, California. The purpose of the demonstration was to collect reliable performance and cost data for the OCMA-350 and six other field measurement devices for total petroleum hydrocarbons (TPH) in soil. In addition to assessing ease of device operation, the key objectives of the demonstration included determining the (1) method detection limit, (2) accuracy and precision, (3) effects of interferents and soil moisture content on TPH measurement, (4) sample throughput, and (5) TPH measurement costs for each device. The demonstration involved analysis of both performance evaluation samples and environmental samples collected in five areas contaminated with gasoline, diesel, lubricating oil, or other petroleum products. The performance and cost results for a given field measurement device were compared to those for an off-site laboratory reference method,

  17. Mesoscale model forecast verification during monsoon 2008

    NASA Astrophysics Data System (ADS)

    Ashrit, Raghavendra; Mohandas, Saji

    2010-08-01

    There have been very few mesoscale modelling studies of the Indian monsoon with a focus on the verification and intercomparison of operational real-time forecasts. With the exception of Das et al. (2008), most of the studies in the literature are either case studies of tropical cyclones and thunderstorms or sensitivity studies involving physical parameterization or climate simulation studies. Almost all the studies are based on either National Centers for Environmental Prediction (NCEP), USA, final analysis fields (NCEP FNL) or reanalysis data used as initial and lateral boundary conditions for driving the mesoscale model. Here we present a mesoscale model forecast verification and intercomparison study over India involving three mesoscale models: (i) the Weather Research and Forecasting (WRF) model developed at the National Center for Atmospheric Research (NCAR), USA, (ii) the MM5 model developed by NCAR, and (iii) the Eta model of NCEP, USA. The analysis is carried out for the monsoon season, June to September 2008. This study is unique since it is based entirely on the real-time global model forecasts of the National Centre for Medium Range Weather Forecasting (NCMRWF) T254 global analysis and forecast system. Based on the evaluation and intercomparison of the mesoscale model forecasts, we recommend the best model for operational real-time forecasts over the Indian region. Although the forecast mean 850 hPa circulation shows realistic monsoon flow and the monsoon trough, the systematic errors over the Arabian Sea indicate an easterly bias to the north (of the mean flow) and a westerly bias to the south (of the mean flow). This suggests that the forecasts feature a southward shift in the monsoon current. The systematic error in the 850 hPa temperature indicates that the WRF model forecasts largely feature a warm bias and the MM5 model forecasts a cold bias. Features common to all three models include a warm bias over northwest India and a cold bias over
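
    A minimal sketch of the systematic-error calculation behind such an intercomparison: the bias and RMSE of a forecast field against the verifying analysis, here on synthetic 850 hPa temperature grids (the +0.6 K offset mimics a warm bias; none of this is the study's data).

        import numpy as np

        def bias_and_rmse(forecast, analysis):
            err = forecast - analysis
            return err.mean(), np.sqrt((err ** 2).mean())

        # Synthetic 850 hPa temperature fields (K) on a small grid.
        rng = np.random.default_rng(42)
        analysis = 285.0 + rng.normal(scale=1.5, size=(50, 50))
        forecast = analysis + 0.6 + rng.normal(scale=0.8, size=(50, 50))

        b, rmse = bias_and_rmse(forecast, analysis)
        print(f"bias = {b:+.2f} K, rmse = {rmse:.2f} K")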

  18. Subsurface barrier integrity verification using perfluorocarbon tracers

    SciTech Connect

    Sullivan, T.M.; Heiser, J.; Milian, L.; Senum, G.

    1996-12-01

    Subsurface barriers are an extremely promising remediation option for many waste management problems. Gas phase tracers include perfluorocarbon tracers (PFTs) and chlorofluorocarbon tracers (CFCs). Both have been applied for leak detection in subsurface systems. The focus of this report is to describe the barrier verification tests conducted using PFTs and the analysis of the data from the tests. PFT verification tests have been performed on a simulated waste pit at the Hanford Geotechnical facility and on an actual waste pit at Brookhaven National Laboratory (BNL). The objective of these tests was to demonstrate proof-of-concept that PFT technology can be used to determine whether small breaches form in the barrier and to estimate the effectiveness of the barrier in preventing migration of the gas tracer to the monitoring wells. The subsurface barrier systems created at Hanford and BNL are described. The experimental results and the analysis of the data follow. Based on the findings of this study, conclusions are offered and suggestions for future work are presented.

  19. Compendium of Arms Control Verification Proposals.

    DTIC Science & Technology

    1982-03-01

    Table of contents excerpt: zonal on-site inspection; Chapter D, control posts; Chapter E, records monitoring. Each chapter describes in general the significant features of the verification method concerned. Chapters A to C deal with verification by direct on-site inspection (i.e., inspection increasing as confidence develops), chapter D with control or observation posts, and chapter E with verification by examination of records...

  20. Magnetic cleanliness verification approach on tethered satellite

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Braghin, Massimo; Grande, Maurizio

    1990-01-01

    Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.

  1. Using tools for verification, documentation and testing

    NASA Technical Reports Server (NTRS)

    Osterweil, L. J.

    1978-01-01

    Methodologies are discussed for four of the major approaches to program upgrading -- namely, dynamic testing, symbolic execution, formal verification, and static analysis. The different patterns of strengths, weaknesses, and applications of these approaches are shown. It is demonstrated that these patterns are in many ways complementary, offering the hope that they can be coordinated and unified into a single comprehensive program testing and verification system capable of performing a diverse and useful variety of error detection, verification, and documentation functions.

  2. Implementation of Precision Verification Solvents on the External Tank

    NASA Technical Reports Server (NTRS)

    Campbell, M.

    1998-01-01

    This paper presents the Implementation of Precision Verification Solvents on the External Tank. The topics include: 1) Background; 2) Solvent Usages; 3) TCE (Trichloroethylene) Reduction; 4) Solvent Replacement Studies; 5) Implementation; 6) Problems Occurring During Implementation; and 7) Future Work. This paper is presented in viewgraph form.

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT FOR AMMONIA RECOVERY PROCESS

    EPA Science Inventory

    This Technology Verification report describes the nature and scope of an environmental evaluation of ThermoEnergy Corporation’s Ammonia Recovery Process (ARP) system. The information contained in this report represents data that were collected over a 3-month pilot study. The ti...

  4. Potlining Additives

    SciTech Connect

    Rudolf Keller

    2004-08-10

    In this project, a concept to improve the performance of aluminum production cells by introducing potlining additives was examined and tested. Boron oxide was added to cathode blocks, and titanium was dissolved in the metal pool; this resulted in the formation of titanium diboride and caused the molten aluminum to wet the carbonaceous cathode surface. Such wetting reportedly leads to operational improvements and extended cell life. In addition, boron oxide suppresses cyanide formation. This final report presents and discusses the results of this project. Substantial economic benefits for the practical implementation of the technology are projected, especially for modern cells with graphitized blocks. For example, with an energy savings of about 5% and an increase in pot life from 1500 to 2500 days, a cost savings of $0.023 per pound of aluminum produced is projected for a 200 kA pot.

  5. Phosphazene additives

    DOEpatents

    Harrup, Mason K; Rollins, Harry W

    2013-11-26

    An additive comprising a phosphazene compound that has at least two reactive functional groups and at least one capping functional group bonded to phosphorus atoms of the phosphazene compound. One of the at least two reactive functional groups is configured to react with cellulose and the other of the at least two reactive functional groups is configured to react with a resin, such as an amine resin or a polycarboxylic acid resin. The at least one capping functional group is selected from the group consisting of a short chain ether group, an alkoxy group, or an aryloxy group. Also disclosed are an additive-resin admixture, a method of treating a wood product, and a wood product.

  6. Use laser optics for on-stream alignment verification

    SciTech Connect

    Block, H.P.

    1991-01-01

    One of the many significant factors influencing machinery reliability in process plants is alignment accuracy. When driven process machines operate in a misaligned condition with respect to their drivers, equipment bearings are exposed to additional loads. Vibration severity may increase, bearings will be more highly loaded, and equipment life expectancy will diminish. This paper reports on the use of laser optics for on-stream alignment verification.

  7. Automated verification of flight software. User's manual

    NASA Technical Reports Server (NTRS)

    Saib, S. H.

    1982-01-01

    AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.

  8. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.

  9. Input apparatus for dynamic signature verification systems

    DOEpatents

    EerNisse, Errol P.; Land, Cecil E.; Snelling, Jay B.

    1978-01-01

    The disclosure relates to signature verification input apparatus comprising a writing instrument and platen containing piezoelectric transducers which generate signals in response to writing pressures.

  10. Hydrologic data-verification management program plan

    USGS Publications Warehouse

    Alexander, C.W.

    1982-01-01

    Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept describing a master data-verification program using multiple special-purpose subroutines, and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that would take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
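
    A minimal sketch of one such computerized verification subroutine: checking a retrieved record against screen-file criteria, here an allowed range and a maximum day-to-day change. The criteria and flow values are hypothetical.

        # Screen-file check: flag values outside an allowed range or
        # changing faster than a maximum rate (criteria are hypothetical).
        SCREEN = {"min": 0.0, "max": 5000.0, "max_step": 300.0}  # cfs

        def verify_record(values, screen):
            flags = []
            for i, v in enumerate(values):
                if not (screen["min"] <= v <= screen["max"]):
                    flags.append((i, v, "out of range"))
                if i > 0 and abs(v - values[i - 1]) > screen["max_step"]:
                    flags.append((i, v, "excessive rate of change"))
            return flags

        daily_flow = [120.0, 135.0, 128.0, 980.0, 140.0, -3.0]
        for i, v, reason in verify_record(daily_flow, SCREEN):
            print(f"day {i}: {v} cfs flagged ({reason})")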

  11. 40 CFR 1065.550 - Gas analyzer range verification and drift verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Cycles § 1065.550 Gas analyzer range verification and drift verification. (a) Range verification. If an... with a CLD and the removed water is corrected based on measured CO2, CO, THC, and NOX concentrations...-specific emissions over the entire duty cycle for drift. For each constituent to be verified, both sets...

  12. Microstructural Development and Technical Challenges in Laser Additive Manufacturing: Case Study with a 316L Industrial Part

    NASA Astrophysics Data System (ADS)

    Marya, Manuel; Singh, Virendra; Marya, Surendar; Hascoet, Jean Yves

    2015-08-01

    Additive manufacturing (AM) brings disruptive changes to the ways parts and products are designed, fabricated, tested, qualified, inspected, marketed, and sold. These changes introduce novel technical challenges and concerns arising from the maturity and diversity of today's AM processes, feedstock materials, and process parameter interactions. AM bears a resemblance to laser and electron beam welding in the so-called conduction mode, which involves a multitude of dynamic physical events between the projected feedstock and a moving heat source that eventually influence AM part properties. For this paper, an air vent was selected for its thin-walled, hollow, variable cross section and limited size. The studied air vents, randomly selected from a qualification batch, were fabricated out of 316L stainless steel using a 4 kW fiber laser powder-fed AM system, referred to as construction laser additive direct (CLAD). These were systematically characterized by microhardness indentation, visual examination, optical and scanning electron microscopy, and electron backscatter diffraction in order to determine AM part suitability for service and to broadly discuss metallurgical phenomena. The paper then briefly expands the discussion to include additional engineering alloys and further analyze relationships between AM process parameters and AM part properties, consistently utilizing past experience with the same powder-fed CLAD 3D printer, the well-established science and technology of welding and joining, and recent publications on additive manufacturing.

  13. St. Lawrence Seaway N.Y. Feasibility Study for Additional Locks and Other Navigation Improvements: Plan of Study. Revision.

    DTIC Science & Technology

    1979-11-01

    throughout has increased. The shift to larger vessels, laker and ocean, has been faster than the rate of growth in tonnage demand for carriage. Various studies... [The remainder of this abstract is residue from a flattened projection table: annual rates of growth by commodity group, in thousands of short tons, for the decades 1980-2030, under a navigation season of 15 April to 15 December.]

  14. A quantum chemical study of the mechanisms of olefin addition to group 9 transition metal dioxo compounds.

    PubMed

    Ahmed, Issahaku; Tia, Richard; Adei, Evans

    2016-01-01

    triplet PES than on the singlet PES for the formation of similar analogues. There are fewer competitive reaction pathways on the triplet surface than on the singlet PES. Also, cycloadditions that seem impossible on the singlet PES seem possible on the doublet and/or triplet PESs; this is typically the case for the Rh and Co complexes, illustrating the importance of multiple spin states in organometallic reactions. Graphical Abstract Synopsis: A study of the mechanism of ethylene addition to MO2(CH2)(CH3) (M = Co, Rh, Ir) shows that the reactions of the Co complex have lower activation barriers for the preferred [3+2] and [2+2] addition pathways and fewer side reactions than those of Rh and Ir. Reactions are more feasible and selective on the triplet PES than on the singlet PES. These findings illustrate the importance of multiple spin states in organometallic reactions and show that catalyst activity and selectivity decrease down the group.

  15. A fundamental study of the oxidation behavior of SI primary reference fuels with propionaldehyde and DTBP as an additive

    NASA Astrophysics Data System (ADS)

    Johnson, Rodney

    In an effort to combine the benefits of SI and CI engines, Homogeneous Charge Compression Ignition (HCCI) engines are being developed. HCCI combustion is achieved by controlling the temperature, pressure, and composition of the fuel and air mixture so that autoignition occurs in proper phasing with the piston motion. This control approach is fundamentally more challenging than using a spark plug or fuel injector to determine ignition timing as in SI and CI engines, respectively. As a result, this is a technical barrier that must be overcome to make HCCI engines applicable to a wide range of vehicles and viable for high-volume production. One way to tailor the autoignition timing is to use small amounts of ignition-enhancing additives. In this study, the effect of the addition of DTBP and propionaldehyde on the autoignition behavior of SI primary reference fuels was investigated. The present work was conducted in a new research facility built around a single-cylinder Cooperative Fuels Research (CFR) octane rating engine modified to run in HCCI mode. It focused on the effect of select oxygenated hydrocarbons on hydrocarbon fuel oxidation, specifically for the primary reference fuels n-heptane and iso-octane, under HCCI operating conditions. In previous work, the engine was validated for stable combustion over a wide range of operating parameters, such as engine speed, equivalence ratio, compression ratio, and inlet manifold temperature; the stable operating range was recorded and used for the present study. The major focus of this study was to examine the effect of the addition of DTBP or propionaldehyde on the oxidation behavior of SI primary reference fuels. Under every test condition, the addition of DTBP or propionaldehyde caused a change in fuel oxidation: DTBP always promoted fuel oxidation, while propionaldehyde promoted oxidation for lower octane number fuels and delayed

  16. Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2014-01-01

    Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important
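
    The closure problem the abstract refers to can be stated compactly. The following is the standard incompressible RANS momentum equation together with the Boussinesq eddy-viscosity hypothesis that most of the models in question employ; it is background material, not an equation taken from this report.

      % Mean-momentum equation: -\overline{u_i' u_j'} is the unclosed
      % Reynolds-stress term a turbulence model must supply.
      \frac{\partial \bar{u}_i}{\partial t}
        + \bar{u}_j \frac{\partial \bar{u}_i}{\partial x_j}
        = -\frac{1}{\rho} \frac{\partial \bar{p}}{\partial x_i}
        + \frac{\partial}{\partial x_j}\!\left(
            \nu \frac{\partial \bar{u}_i}{\partial x_j}
            - \overline{u_i' u_j'} \right)

      % Boussinesq hypothesis relating the Reynolds stresses to the mean
      % strain rate through an eddy viscosity \nu_t:
      -\overline{u_i' u_j'} = \nu_t \left(
          \frac{\partial \bar{u}_i}{\partial x_j}
        + \frac{\partial \bar{u}_j}{\partial x_i} \right)
        - \frac{2}{3} k \, \delta_{ij}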

  17. Applicability of the DPPH assay for evaluating the antioxidant capacity of food additives - inter-laboratory evaluation study -.

    PubMed

    Shimamura, Tomoko; Sumikura, Yoshihiro; Yamazaki, Takeshi; Tada, Atsuko; Kashiwagi, Takehiro; Ishikawa, Hiroya; Matsui, Toshiro; Sugimoto, Naoki; Akiyama, Hiroshi; Ukeda, Hiroyuki

    2014-01-01

    An inter-laboratory evaluation study was conducted in order to evaluate the antioxidant capacity of food additives by using a 1,1-diphenyl-2-picrylhydrazyl (DPPH) assay. Four antioxidants used as existing food additives (i.e., tea extract, grape seed extract, enju extract, and d-α-tocopherol) and 6-hydroxy-2,5,7,8-tetramethylchroman-2-carboxylic acid (Trolox) were used as analytical samples, and 14 laboratories participated in this study. The repeatability relative standard deviations (RSD(r)) of the IC50 of Trolox, of the IC50 values of the four antioxidants, and of the Trolox equivalent antioxidant capacity (TEAC) were 1.8-2.2%, 2.2-2.9%, and 2.1-2.5%, respectively. Thus, the proposed DPPH assay showed good performance within the same laboratory. The corresponding reproducibility relative standard deviations (RSD(R)) were 4.0-7.9%, 6.0-11%, and 3.7-9.3%, respectively. The RSD(R)/RSD(r) values of TEAC were lower than, or nearly equal to, those of the IC50 values of the four antioxidants, suggesting that the use of TEAC was effective for reducing the variance among the laboratories. These results showed that the proposed DPPH assay could be used as a standard method to evaluate the antioxidant capacity of food additives.
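
    The reported figures of merit follow directly from their definitions. As a hedged sketch (the IC50-ratio convention for TEAC is a common one and may differ in detail from the paper's), RSD is the standard deviation expressed as a percentage of the mean, and TEAC is IC50(Trolox)/IC50(sample):

      import statistics

      def rsd_percent(values):
          # Relative standard deviation: 100 * s / mean.
          return 100.0 * statistics.stdev(values) / statistics.mean(values)

      def teac(ic50_trolox, ic50_sample):
          # Trolox equivalent antioxidant capacity as an IC50 ratio.
          return ic50_trolox / ic50_sample

      trolox_runs = [11.8, 12.1, 12.0]   # hypothetical replicate IC50 values
      extract_runs = [7.9, 8.2, 8.0]
      print(f"RSD(r), Trolox: {rsd_percent(trolox_runs):.1f}%")
      print(f"TEAC: {teac(statistics.mean(trolox_runs), statistics.mean(extract_runs)):.2f}")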

  18. The influence of deposit control additives on nitrogen oxides emissions from spark ignition engines (case study: Tehran).

    PubMed

    Bidhendi, Gholamreza Nabi; Zand, Ali Daryabeigi; Tabrizi, Alireza Mikaeili; Pezeshk, Hamid; Baghvand, Akbar

    2007-04-15

    In the present research, the influence of a deposit control additive on NOx emissions from two types of gasoline engine vehicles, i.e., the Peykan (based on the Hillman) and the Pride (South Korean Kia Motors), was studied. Exhaust NOx emissions were measured in two stages, before and after the decarbonization process. Statistical analysis was conducted on the measurement results. Results showed that, due to the elimination of engine deposits, NOx emissions from Peykans increased 0.28% and NOx emissions from Pride automobiles decreased 6.18% on average. The observed variations were not statistically or practically significant. The results indicated that making use of detergent additives is not an effective way to reduce exhaust NOx emissions from gasoline engine vehicles.
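
    The paper does not state which significance test was applied, so the following paired t-test on per-vehicle NOx readings is an assumption about the analysis, with made-up numbers standing in for the study's data.

      from scipy import stats

      # Hypothetical per-vehicle NOx readings (ppm) before and after
      # deposit removal.
      before = [610, 655, 590, 640, 620]
      after = [605, 660, 585, 645, 625]

      t_stat, p_value = stats.ttest_rel(before, after)
      print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # p > 0.05: no significant change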

  19. Neptunium flow-sheet verification at reprocessing plants

    SciTech Connect

    Rance, P.; Chesnay, B.; Killeen, T.; Murray, M.; Nikkinen, M.; Petoe, A.; Plumb, J.; Saukkonen, H.

    2007-07-01

    Due to their fissile nature, neptunium and americium have at least a theoretical potential application as nuclear explosives, and their proliferation potential was considered by the IAEA in studies in the late 1990s. This work was motivated by an increased awareness of the proliferation potential of americium and neptunium and a number of emerging projects in peaceful nuclear programmes which could result in an increase in the available quantities of these minor actinides. The studies culminated in proposals for various voluntary measures, including the reporting of international transfers of separated americium and neptunium, declarations concerning the amount of separated neptunium and americium held by states, and the application of flow-sheet verification to ensure that facilities capable of separating americium or neptunium are operated in a manner consistent with that declared. This paper discusses the issue of neptunium flow-sheet verification in reprocessing plants. The proliferation potential of neptunium is first briefly discussed, and then the chemistry of neptunium relevant to reprocessing plants is described with a view to indicating a number of issues relevant to the verification of neptunium flow-sheets. Finally, the scope of verification activities is discussed, including analysis of process and engineering design information, plant monitoring and sampling, and the potential application of containment and surveillance measures. (authors)

  20. Why do verification and validation?

    DOE PAGES

    Hu, Kenneth T.; Paez, Thomas L.

    2016-02-19

    In this discussion paper, we explore different ways to assess the value of verification and validation (V&V) of engineering models. We first present a literature review on the value of V&V and then use value chains and decision trees to show how value can be assessed from a decision maker's perspective. In this context, the value is what the decision maker is willing to pay for the V&V analysis, with the understanding that the V&V results are uncertain. Finally, the 2014 Sandia V&V Challenge Workshop is used to illustrate these ideas.
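
    One way to read "what the decision maker is willing to pay" is as the expected value of the information V&V provides. The sketch below prices a perfect V&V verdict with a two-branch decision tree; all payoffs and probabilities are hypothetical, and the framing is ours rather than the paper's.

      p_ok = 0.8  # prior probability that the model is adequate
      payoff = {  # (decision, true state) -> payoff
          ("deploy", "ok"): 100.0, ("deploy", "bad"): -400.0,
          ("redesign", "ok"): 20.0, ("redesign", "bad"): 20.0,
      }

      def expected(decision, p):
          return p * payoff[(decision, "ok")] + (1 - p) * payoff[(decision, "bad")]

      # Without V&V: commit to the better decision under the prior alone.
      best_without = max(expected(d, p_ok) for d in ("deploy", "redesign"))

      # With a perfect V&V verdict: choose after learning the true state.
      best_with = p_ok * payoff[("deploy", "ok")] + (1 - p_ok) * payoff[("redesign", "bad")]

      print(f"value of perfect V&V information: {best_with - best_without:.1f}")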